Sample records for "developed method enables"

  1. Propulsion Diagnostic Method Evaluation Strategy (ProDiMES) User's Guide

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.

    2010-01-01

    This report is a User's Guide for the Propulsion Diagnostic Method Evaluation Strategy (ProDiMES). ProDiMES is a standard benchmarking problem and a set of evaluation metrics that enable the comparison of candidate aircraft engine gas path diagnostic methods. This MATLAB (The MathWorks, Inc.) based software tool enables users to independently develop and evaluate diagnostic methods. Additionally, a set of blind test case data is distributed as part of the software, enabling the side-by-side comparison of diagnostic approaches developed by multiple users. The User's Guide describes the various components of ProDiMES and provides instructions for the installation and operation of the tool.

  2. Development of the High-Order Decoupled Direct Method in Three Dimensions for Particulate Matter: Enabling Advanced Sensitivity Analysis in Air Quality Models

    EPA Science Inventory

    The high-order decoupled direct method in three dimensions for particulate matter (HDDM-3D/PM) has been implemented in the Community Multiscale Air Quality (CMAQ) model to enable advanced sensitivity analysis. The major effort of this work is to develop high-order DDM sensitivity...

  3. Using practice development methodology to develop children's centre teams: ideas for the future.

    PubMed

    Hemingway, Ann; Cowdell, Fiona

    2009-09-01

    The Children's Centre Programme is a recent development in the UK and brings together multi-agency teams to work with disadvantaged families. Practice development methods enable teams to work together in new ways. Although the term practice development remains relatively poorly defined, its key properties suggest that it embraces engagement, empowerment, evaluation and evolution. This paper introduces the Children's Centre Programme and practice development methods and aims to discuss the relevance of using this method to develop teams in children's centres through considering the findings from an evaluation of a two-year project to develop inter-agency public health teams. The evaluation showed that practice development methods can enable successful team development and showed that through effective facilitation, teams can change their practice to focus on areas of local need. The team came up with their own process to develop a strategy for their locality.

  4. Air Quality Management Alternatives: United States Air Force Firefighter Training Facilities

    DTIC Science & Technology

    1988-01-01

    "Pollution at LAX, JFK, and ORD," Impact of Aircraft Emissions on Air Quality in the Vicinity of Airports, Volume II, FAA-EE-80-09B, Federal... A methodology utilizing questionnaires, interviews, and site visits is developed and applied. This method enabled fire prevention and environmental management experts and professionals to provide data, opinions, and to...

  5. Technology Enabled Learning. Symposium.

    ERIC Educational Resources Information Center

    2002

    This document contains three papers on technology-enabled learning and human resource development. Among results found in "Current State of Technology-enabled Learning Programs in Select Federal Government Organizations: a Case Study of Ten Organizations" (Letitia A. Combs) are the following: the dominant delivery method is traditional…

  6. What helps or hinders the transformation from a major tertiary center to a major trauma center? Identifying barriers and enablers using the Theoretical Domains Framework.

    PubMed

    Roberts, Neil; Lorencatto, Fabiana; Manson, Joanna; Brundage, Susan I; Jansen, Jan O

    2016-03-12

    Major Trauma Centers (MTCs), as part of a trauma system, improve survival and functional outcomes from injury. Developing such centers from current teaching hospitals is likely to generate diverse beliefs amongst staff. These may act as barriers or enablers. Prior identification of these may make the service development process more efficient. The importance of applying theory to systematically identify barriers and enablers to changing clinical practice in emergency medicine has been emphasized. This study systematically explored theory-based barriers and enablers towards implementing the transformation of a tertiary hospital into an MTC. Our goal was to demonstrate the use of a replicable method to identify targets that could be addressed to achieve a successful transformation from an organization evolved to provide a particular type of clinical care into a clinical system with different demands, requirements and expectations. The Theoretical Domains Framework (TDF) is a tool designed to elicit and analyze beliefs affecting behavior. Semi-structured interviews based around the TDF were conducted in a major tertiary hospital in Scotland, due to become an MTC, with a purposive sample of major stakeholders including clinicians and nurses from specialties involved in trauma care, clinical managers and administration. Belief statements were identified through qualitative analysis and assessed for importance according to prevalence, discordance and evidence base. 1728 utterances were recorded and coded into 91 belief statements, of which 58 were classified as important barriers/enablers. There were major concerns about resource demands, with optimism conditional on these being met. Distracting priorities abound within the Emergency Department. Better communication is needed. Staff motivation is high, and staff should be engaged in skills development and in developing performance improvement processes.
This study presents a systematic and replicable method of identifying theory-based barriers and enablers towards complex service development. It identifies multiple barriers/enablers that may serve as a basis for developing an implementation intervention to enhance the development of MTCs. This method can be used to address similar challenges in developing specialist centers or implementing clinical practice change in emergency care across both developing and developed countries.

  7. Transforming paper-based assessment forms to a digital format: Exemplified by the Housing Enabler prototype app.

    PubMed

    Svarre, Tanja; Lunn, Tine Bieber Kirkegaard; Helle, Tina

    2017-11-01

    The aim of this paper is to provide the reader with an overall impression of the stepwise user-centred design approach, including the specific methods used and lessons learned when transforming paper-based assessment forms into a prototype app, taking the Housing Enabler as an example. Four design iterations were performed, building on a domain study, workshops, expert evaluation and controlled and realistic usability tests. The user-centred design process involved purposefully selected participants with different Housing Enabler knowledge and housing adaptation experience. The design iterations resulted in the development of a Housing Enabler prototype app. The prototype app has several features and options that are new compared with the original paper-based Housing Enabler assessment form. These new features include a user-friendly overview of the assessment form; easy navigation by swiping back and forth between items; onsite data analysis and ranking of the accessibility score; photo documentation; and a data export facility. Based on the presented stepwise approach, a high-fidelity Housing Enabler prototype app was successfully developed. The development process has emphasized the importance of combining design participants' knowledge and experiences, and has shown that methods should seem relevant to participants to increase their engagement.

  8. Verification and Validation Process for Progressive Damage and Failure Analysis Methods in the NASA Advanced Composites Consortium

    NASA Technical Reports Server (NTRS)

    Wanthal, Steven; Schaefer, Joseph; Justusson, Brian; Hyder, Imran; Engelstad, Stephen; Rose, Cheryl

    2017-01-01

    The Advanced Composites Consortium is a US Government/Industry partnership supporting technologies to enable timeline and cost reduction in the development of certified composite aerospace structures. A key component of the consortium's approach is the development and validation of improved progressive damage and failure analysis methods for composite structures. These methods will enable increased use of simulations in design trade studies and detailed design development, and thereby enable more targeted physical test programs to validate designs. To accomplish this goal with confidence, a rigorous verification and validation process was developed. The process was used to evaluate analysis methods and associated implementation requirements to ensure calculation accuracy and to gage predictability for composite failure modes of interest. This paper introduces the verification and validation process developed by the consortium during the Phase I effort of the Advanced Composites Project. Specific structural failure modes of interest are first identified, and a subset of standard composite test articles are proposed to interrogate a progressive damage analysis method's ability to predict each failure mode of interest. Test articles are designed to capture the underlying composite material constitutive response as well as the interaction of failure modes representing typical failure patterns observed in aerospace structures.

  9. Selective C(sp3)−H Aerobic Oxidation Enabled by Decatungstate Photocatalysis in Flow

    PubMed Central

    Laudadio, Gabriele; Govaerts, Sebastian; Wang, Ying; Ravelli, Davide; Koolman, Hannes F.; Fagnoni, Maurizio; Djuric, Stevan W.

    2018-01-01

    A mild and selective C(sp3)−H aerobic oxidation enabled by decatungstate photocatalysis has been developed. The reaction can be significantly improved in a microflow reactor enabling the safe use of oxygen and enhanced irradiation of the reaction mixture. Our method allows for the oxidation of both activated and unactivated C−H bonds (30 examples). The ability to selectively oxidize natural scaffolds, such as (−)‐ambroxide, pregnenolone acetate, (+)‐sclareolide, and artemisinin, exemplifies the utility of this new method. PMID:29451725

  10. Expanded Genetic Codes in Next Generation Sequencing Enable Decontamination and Mitochondrial Enrichment

    PubMed Central

    McKernan, Kevin J.; Spangler, Jessica; Zhang, Lei; Tadigotla, Vasisht; McLaughlin, Stephen; Warner, Jason; Zare, Amir; Boles, Richard G.

    2014-01-01

    We have developed a PCR method, coined Déjà vu PCR, that utilizes six nucleotides in PCR with two methyl specific restriction enzymes that respectively digest these additional nucleotides. Use of this enzyme-and-nucleotide combination enables what we term a “DNA diode”, where DNA can advance in a laboratory in only one direction and cannot feedback into upstream assays. Here we describe aspects of this method that enable consecutive amplification with the introduction of a 5th and 6th base while simultaneously providing methylation dependent mitochondrial DNA enrichment. These additional nucleotides enable a novel DNA decontamination technique that generates ephemeral and easy to decontaminate DNA. PMID:24788618

  11. ENFIN--A European network for integrative systems biology.

    PubMed

    Kahlem, Pascal; Clegg, Andrew; Reisinger, Florian; Xenarios, Ioannis; Hermjakob, Henning; Orengo, Christine; Birney, Ewan

    2009-11-01

    Integration of biological data of various types and the development of adapted bioinformatics tools represent critical objectives to enable research at the systems level. The European Network of Excellence ENFIN is engaged in developing an adapted infrastructure to connect databases and platforms, enabling both the generation of new bioinformatics tools and the experimental validation of computational predictions. With the aim of bridging the gap between standard wet laboratories and bioinformatics, the ENFIN Network runs integrative research projects to bring the latest computational techniques to bear directly on questions dedicated to systems biology in the wet laboratory environment. The Network maintains close internal collaboration between experimental and computational research, enabling a permanent cycle of experimental validation and improvement of computational prediction methods. The computational work includes the development of a database infrastructure (EnCORE), bioinformatics analysis methods and a novel platform for protein function analysis, FuncNet.

  12. Theory and applications of structured light single pixel imaging

    NASA Astrophysics Data System (ADS)

    Stokoe, Robert J.; Stockton, Patrick A.; Pezeshki, Ali; Bartels, Randy A.

    2018-02-01

    Many single-pixel imaging techniques have been developed in recent years. Though the methods of image acquisition vary considerably, they share unifying features that make general analysis possible. Furthermore, the methods developed thus far are based on intuitive processes that enable simple and physically motivated reconstruction algorithms; however, this approach may not leverage the full potential of single-pixel imaging. We present a general theoretical framework of single-pixel imaging based on frame theory, which enables general, mathematically rigorous analysis. We apply our theoretical framework to existing single-pixel imaging techniques, as well as provide a foundation for developing more advanced methods of image acquisition and reconstruction. The proposed frame-theoretic framework for single-pixel imaging results in improved noise robustness and decreased acquisition time, and can take advantage of special properties of the specimen under study. By building on this framework, new methods of imaging with a single-element detector can be developed to realize the full potential associated with single-pixel imaging.
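    The linear measurement model at the heart of single-pixel imaging can be sketched in a few lines. This is an illustrative toy, not the authors' frame-theoretic method: each single-pixel measurement is the inner product of one structured illumination pattern with the scene, and with a full-rank set of patterns the image is recovered by least squares (a pseudo-inverse of the measurement frame). All names and sizes below are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "scene": an 8x8 image flattened to a vector of 64 pixels.
n_pixels = 64
scene = rng.random(n_pixels)

# Structured-light patterns: each row is one illumination pattern.
# Slightly more patterns than pixels gives a full-rank measurement frame.
n_patterns = 80
patterns = rng.normal(size=(n_patterns, n_pixels))

# Single-pixel detector: each measurement is the total light collected,
# i.e. the inner product of one pattern with the scene.
measurements = patterns @ scene

# Linear reconstruction: solve the overdetermined system in the
# least-squares sense (the frame pseudo-inverse).
recovered, *_ = np.linalg.lstsq(patterns, measurements, rcond=None)
```

    In this noiseless, full-rank setting the least-squares solution recovers the scene exactly; real acquisitions trade pattern count against noise, which is where the frame-theoretic analysis of the abstract comes in.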

  13. Coater/developer based techniques to improve high-resolution EUV patterning defectivity

    NASA Astrophysics Data System (ADS)

    Hontake, Koichi; Huli, Lior; Lemley, Corey; Hetzer, Dave; Liu, Eric; Ko, Akiteru; Kawakami, Shinichiro; Shimoaoki, Takeshi; Hashimoto, Yusaku; Tanaka, Koichiro; Petrillo, Karen; Meli, Luciana; De Silva, Anuja; Xu, Yongan; Felix, Nelson; Johnson, Richard; Murray, Cody; Hubbard, Alex

    2017-10-01

    Extreme ultraviolet lithography (EUVL) technology is one of the leading candidates under consideration for enabling the next generation of devices, for 7nm node and beyond. As the focus shifts to driving down the 'effective' k1 factor and enabling the full scaling entitlement of EUV patterning, new techniques and methods must be developed to reduce the overall defectivity, mitigate pattern collapse, and eliminate film-related defects. In addition, CD uniformity and LWR/LER must be improved in terms of patterning performance. Tokyo Electron Limited (TEL™) and IBM Corporation are continuously developing manufacturing quality processes for EUV. In this paper, we review the ongoing progress in coater/developer based processes (coating, developing, baking) that are required to enable EUV patterning.

  14. Method to fabricate silicon chromatographic column comprising fluid ports

    DOEpatents

    Manginell, Ronald P.; Frye-Mason, Gregory C.; Heller, Edwin J.; Adkins, Douglas R.

    2004-03-02

    A new method for fabricating a silicon chromatographic column comprising through-substrate fluid ports has been developed. This new method enables the fabrication of multi-layer interconnected stacks of silicon chromatographic columns.

  15. Sonic Boom Modeling Technical Challenge

    NASA Technical Reports Server (NTRS)

    Sullivan, Brenda M.

    2007-01-01

    This viewgraph presentation reviews the technical challenges in modeling sonic booms. The goal of this program is to develop knowledge, capabilities and technologies to enable overland supersonic flight. The specific objectives of the modeling are: (1) Develop and validate sonic boom propagation model through realistic atmospheres, including effects of turbulence (2) Develop methods enabling prediction of response of and acoustic transmission into structures impacted by sonic booms (3) Develop and validate psychoacoustic model of human response to sonic booms under both indoor and outdoor listening conditions, using simulators.

  16. Selective C(sp3)-H Aerobic Oxidation Enabled by Decatungstate Photocatalysis in Flow.

    PubMed

    Laudadio, Gabriele; Govaerts, Sebastian; Wang, Ying; Ravelli, Davide; Koolman, Hannes F; Fagnoni, Maurizio; Djuric, Stevan W; Noël, Timothy

    2018-04-03

    A mild and selective C(sp3)-H aerobic oxidation enabled by decatungstate photocatalysis has been developed. The reaction can be significantly improved in a microflow reactor enabling the safe use of oxygen and enhanced irradiation of the reaction mixture. Our method allows for the oxidation of both activated and unactivated C-H bonds (30 examples). The ability to selectively oxidize natural scaffolds, such as (-)-ambroxide, pregnenolone acetate, (+)-sclareolide, and artemisinin, exemplifies the utility of this new method. © 2018 The Authors. Published by Wiley-VCH Verlag GmbH & Co. KGaA.

  17. Enzyme catalysis: Evolution made easy

    NASA Astrophysics Data System (ADS)

    Wee, Eugene J. H.; Trau, Matt

    2014-09-01

    Directed evolution is a powerful tool for the development of improved enzyme catalysts. Now, a method that enables an enzyme, its encoding DNA and a fluorescent reaction product to be encapsulated in a gel bead enables the application of directed evolution in an ultra-high-throughput format.

  18. Methods for transition toward computer assisted cognitive examination.

    PubMed

    Jurica, P; Valenzi, S; Struzik, Z R; Cichocki, A

    2015-01-01

    We present a software framework which enables the extension of current methods for the assessment of cognitive fitness using recent technological advances. Screening for cognitive impairment is becoming more important as the world's population grows older, and current methods could be enhanced by the use of computers. Introducing new methods to clinics requires basic tools for the collection and communication of collected data. Our aim is to develop tools that, with minimal interference, offer new opportunities for the enhancement of current interview-based cognitive examinations. We suggest methods and discuss the process by which established cognitive tests can be adapted for data collection through digitization by pen-enabled tablets. We discuss a number of methods for the evaluation of collected data, which promise to increase the resolution and objectivity of the common scoring strategy based on visual inspection. By involving computers in the roles of both instructing and scoring, we aim to increase the precision and reproducibility of cognitive examination. The tools provided in the Python framework CogExTools, available at http://bsp.brain.riken.jp/cogextools/, enable the design, application and evaluation of screening tests for the assessment of cognitive impairment. The toolbox is a research platform; it represents a foundation for further collaborative development by the wider research community and enthusiasts. It is free to download and use, and open-source. We introduce a set of open-source tools that facilitate the design and development of new cognitive tests for modern technology. We provide these tools in order to enable the adaptation of technology for cognitive examination in clinical settings. The tools provide a first step in a possible transition toward standardized mental state examination using computers.

  19. Development of Light-Activated CRISPR Using Guide RNAs with Photocleavable Protectors.

    PubMed

    Jain, Piyush K; Ramanan, Vyas; Schepers, Arnout G; Dalvie, Nisha S; Panda, Apekshya; Fleming, Heather E; Bhatia, Sangeeta N

    2016-09-26

    The ability to remotely trigger CRISPR/Cas9 activity would enable new strategies to study cellular events with greater precision and complexity. In this work, we have developed a method to photocage the activity of the guide RNA called "CRISPR-plus" (CRISPR-precise light-mediated unveiling of sgRNAs). The photoactivation capability of our CRISPR-plus method is compatible with the simultaneous targeting of multiple DNA sequences and supports numerous modifications that can enable guide RNA labeling for use in imaging and mechanistic investigations. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  20. A Muskingum-based methodology for river discharge estimation and rating curve development under significant lateral inflow conditions

    NASA Astrophysics Data System (ADS)

    Barbetta, Silvia; Moramarco, Tommaso; Perumal, Muthiah

    2017-11-01

    Quite often, the discharge at a site is estimated using the rating curve developed for that site, and developing such a curve requires river flow measurements, which are costly, tedious and dangerous during severe floods. To circumvent the conventional rating curve development approach, Perumal et al. in 2007 and 2010 applied the Variable Parameter Muskingum Stage-hydrograph (VPMS) routing method for developing stage-discharge relationships, especially at those ungauged river sites where stage measurements and details of section geometry are available but discharge measurements are not made. The VPMS method enables rating curves to be estimated at ungauged river sites with acceptable accuracy, but its application is subject to the limitation that lateral flow within the routing reach must be negligible. To overcome this limitation, this study proposes an extension of the VPMS method, referred to herein as the VPMS-Lin method, enabling streamflow assessment even when significant lateral inflow occurs along the river reach considered for routing. The lateral inflow is estimated through the continuity equation expressed in characteristic form, as advocated by Barbetta et al. in 2012. The VPMS-Lin method is tested on two rivers characterized by different geometric and hydraulic properties: 1) a 50 km reach of the Tiber River in central Italy and 2) a 73 km reach of the Godavari River in peninsular India. The study demonstrates that both the upstream and downstream discharge hydrographs are well reproduced, with a root mean square error equal on average to about 35 and 1700 m3 s-1 for the Tiber River and the Godavari River case studies, respectively.
    Moreover, simulation studies carried out on a stretch of the Tiber River using the one-dimensional hydraulic model MIKE11 and the VPMS-Lin model demonstrate the accuracy of the VPMS-Lin model, which, besides enabling the estimation of streamflow, also enables the estimation of reach-averaged optimal roughness coefficients for the considered routing events.
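    The routing scheme underlying the VPMS family of methods can be illustrated with the classical constant-parameter Muskingum recursion. This is a deliberately simplified sketch: the actual VPMS/VPMS-Lin methods re-estimate the parameters K and X at every time step from stage and section geometry, and VPMS-Lin additionally accounts for lateral inflow; the numbers below are invented for the example.

```python
def muskingum_route(inflow, K=12.0, X=0.2, dt=6.0):
    """Constant-parameter Muskingum routing:
    O[t] = C0*I[t] + C1*I[t-1] + C2*O[t-1], with C0+C1+C2 = 1."""
    D = 2.0 * K * (1.0 - X) + dt
    c0 = (dt - 2.0 * K * X) / D
    c1 = (dt + 2.0 * K * X) / D
    c2 = (2.0 * K * (1.0 - X) - dt) / D
    outflow = [inflow[0]]  # assume an initial steady state, I = O
    for i in range(1, len(inflow)):
        outflow.append(c0 * inflow[i] + c1 * inflow[i - 1] + c2 * outflow[-1])
    return outflow

# A triangular inflow hydrograph (arbitrary units) routed downstream:
q_in = [10, 20, 50, 80, 60, 40, 20, 10, 10, 10]
q_out = muskingum_route(q_in)  # attenuated, delayed outflow hydrograph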
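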

  1. Multilayer ultra thick resist development for MEMS

    NASA Astrophysics Data System (ADS)

    Washio, Yasushi; Senzaki, Takahiro; Masuda, Yasuo; Saito, Koji; Obiya, Hiroyuki

    2005-05-01

    MEMS (Micro-Electro-Mechanical Systems) is achieved through a process technology, called Micro-machining. There are two distinct methods to manufacture a MEMS-product. One method is to form permanent film through photolithography, and the other is to form a non-permanent film resist after photolithography proceeded by etch or plating process. The three-dimensional ultra-fine processing technology based on photolithography, and is assembled by processes, such as anode junction, and post lithography processes such as etching and plating. Currently ORDYL PR-100 (Dry Film Type) is used for the permanent resist process. TOK has developed TMMR S2000 (Liquid Type) and TMMF S2000 (Dry Film Type) also. TOK has developed a new process utilizing these resist. The electro-forming method by photolithography is developed as one of the methods for enabling high resolution and high aspect formation. In recent years, it has become possible to manufacture conventionally difficult multilayer through our development with material and equipment project (M&E). As for material for electro-forming, it was checked that chemically amplified resist is optimal from the reaction mechanism as it is easily removed by the clean solution. Moreover, multiple plating formations were enabled with the resist through a new process. As for the equipment, TOK developed Applicator (It can apply 500 or more μms) and Developer, which achieves high throughput and quality. The detailed plating formations, which a path differs, and air wiring are realizable through M&E. From the above results, opposed to metallic mold plating, electro-forming method by resist, enabled to form high resolution and aspect pattern, at low cost. It is thought that the infinite possibility spreads by applying this process.

  2. Anatomy Drawing Screencasts: Enabling Flexible Learning for Medical Students

    ERIC Educational Resources Information Center

    Pickering, James D.

    2015-01-01

    The traditional lecture remains an essential method of disseminating information to medical students. However, due to the constant development of the modern medical curriculum many institutions are embracing novel means for delivering the core anatomy syllabus. Using mobile media devices is one such way, enabling students to access core material…

  3. Implementing Extreme Programming in Distributed Software Project Teams: Strategies and Challenges

    NASA Astrophysics Data System (ADS)

    Maruping, Likoebe M.

    Agile software development methods and distributed forms of organizing teamwork are two team process innovations that are gaining prominence in today's demanding software development environment. Individually, each of these innovations has yielded gains in the practice of software development. Agile methods have enabled software project teams to meet the challenges of an ever turbulent business environment through enhanced flexibility and responsiveness to emergent customer needs. Distributed software project teams have enabled organizations to access highly specialized expertise across geographic locations. Although much progress has been made in understanding how to more effectively manage agile development teams and how to manage distributed software development teams, managers have little guidance on how to leverage these two potent innovations in combination. In this chapter, I outline some of the strategies and challenges associated with implementing agile methods in distributed software project teams. These are discussed in the context of a study of a large-scale software project in the United States that lasted four months.

  4. Effective Biot theory and its generalization to poroviscoelastic models

    NASA Astrophysics Data System (ADS)

    Liu, Xu; Greenhalgh, Stewart; Zhou, Bing; Greenhalgh, Mark

    2018-02-01

    A method is suggested to express the effective bulk modulus of the solid frame of a poroelastic material as a function of the saturated bulk modulus. This method enables effective Biot theory to be described through the use of seismic dispersion measurements or other models developed for the effective saturated bulk modulus. The effective Biot theory is generalized to a poroviscoelastic model of which the moduli are represented by the relaxation functions of the generalized fractional Zener model. The latter covers the general Zener and the Cole-Cole models as special cases. A global search method is described to determine the parameters of the relaxation functions, and a simple deterministic method is also developed to find the defining parameters of the single Cole-Cole model. These methods enable poroviscoelastic models to be constructed, which are based on measured seismic attenuation functions, and ensure that the model dispersion characteristics match the observations.

  5. An inverse method for determining the spatially resolved properties of viscoelastic–viscoplastic three-dimensional printed materials

    PubMed Central

    Chen, X.; Ashcroft, I. A.; Wildman, R. D.; Tuck, C. J.

    2015-01-01

    A method using experimental nanoindentation and inverse finite-element analysis (FEA) has been developed that enables the spatial variation of material constitutive properties to be accurately determined. The method was used to measure property variation in a three-dimensional printed (3DP) polymeric material. The accuracy of the method is dependent on the applicability of the constitutive model used in the inverse FEA, hence four potential material models: viscoelastic, viscoelastic–viscoplastic, nonlinear viscoelastic and nonlinear viscoelastic–viscoplastic were evaluated, with the latter enabling the best fit to experimental data. Significant changes in material properties were seen in the depth direction of the 3DP sample, which could be linked to the degree of cross-linking within the material, a feature inherent in a UV-cured layer-by-layer construction method. It is proposed that the method is a powerful tool in the analysis of manufacturing processes with potential spatial property variation that will also enable the accurate prediction of final manufactured part performance. PMID:26730216

  6. An inverse method for determining the spatially resolved properties of viscoelastic-viscoplastic three-dimensional printed materials.

    PubMed

    Chen, X; Ashcroft, I A; Wildman, R D; Tuck, C J

    2015-11-08

    A method using experimental nanoindentation and inverse finite-element analysis (FEA) has been developed that enables the spatial variation of material constitutive properties to be accurately determined. The method was used to measure property variation in a three-dimensional printed (3DP) polymeric material. The accuracy of the method is dependent on the applicability of the constitutive model used in the inverse FEA, hence four potential material models: viscoelastic, viscoelastic-viscoplastic, nonlinear viscoelastic and nonlinear viscoelastic-viscoplastic were evaluated, with the latter enabling the best fit to experimental data. Significant changes in material properties were seen in the depth direction of the 3DP sample, which could be linked to the degree of cross-linking within the material, a feature inherent in a UV-cured layer-by-layer construction method. It is proposed that the method is a powerful tool in the analysis of manufacturing processes with potential spatial property variation that will also enable the accurate prediction of final manufactured part performance.

  7. Development of Advanced Verification and Validation Procedures and Tools for the Certification of Learning Systems in Aerospace Applications

    NASA Technical Reports Server (NTRS)

    Jacklin, Stephen; Schumann, Johann; Gupta, Pramod; Richard, Michael; Guenther, Kurt; Soares, Fola

    2005-01-01

    Adaptive control technologies that incorporate learning algorithms have been proposed to enable automatic flight control and vehicle recovery, autonomous flight, and to maintain vehicle performance in the face of unknown, changing, or poorly defined operating environments. In order for adaptive control systems to be used in safety-critical aerospace applications, they must be proven to be highly safe and reliable. Rigorous methods for adaptive software verification and validation must be developed to ensure that control system software failures will not occur. Of central importance in this regard is the need to establish reliable methods that guarantee convergent learning, rapid convergence (learning) rate, and algorithm stability. This paper presents the major problems of adaptive control systems that use learning to improve performance. The paper then presents the major procedures and tools presently developed or currently being developed to enable the verification, validation, and ultimate certification of these adaptive control systems. These technologies include the application of automated program analysis methods, techniques to improve the learning process, analytical methods to verify stability, methods to automatically synthesize code, simulation and test methods, and tools to provide on-line software assurance.

  8. Microchannel gel electrophoretic separation systems and methods for preparing and using

    DOEpatents

    Herr, Amy E; Singh, Anup K; Throckmorton, Daniel J

    2015-02-24

    A micro-analytical platform for performing electrophoresis-based immunoassays was developed by integrating photopolymerized cross-linked polyacrylamide gels within a microfluidic device. The microfluidic immunoassays are performed by gel electrophoretic separation and quantifying analyte concentration based upon conventional polyacrylamide gel electrophoresis (PAGE). To retain biological activity of proteins and maintain intact immune complexes, native PAGE conditions were employed. Both direct (non-competitive) and competitive immunoassay formats are demonstrated in microchips for detecting toxins and biomarkers (cytokines, c-reactive protein) in bodily fluids (serum, saliva, oral fluids). Further, a description of gradient gels fabrication is included, in an effort to describe methods we have developed for further optimization of on-chip PAGE immunoassays. The described chip-based PAGE immunoassay method enables immunoassays that are fast (minutes) and require very small amounts of sample (less than a few microliters). Use of microfabricated chips as a platform enables integration, parallel assays, automation and development of portable devices.

  9. Microchannel gel electrophoretic separation systems and methods for preparing and using

    DOEpatents

    Herr, Amy; Singh, Anup K; Throckmorton, Daniel J

    2013-09-03

    A micro-analytical platform for performing electrophoresis-based immunoassays was developed by integrating photopolymerized cross-linked polyacrylamide gels within a microfluidic device. The microfluidic immunoassays are performed by gel electrophoretic separation and quantifying analyte concentration based upon conventional polyacrylamide gel electrophoresis (PAGE). To retain biological activity of proteins and maintain intact immune complexes, native PAGE conditions were employed. Both direct (non-competitive) and competitive immunoassay formats are demonstrated in microchips for detecting toxins and biomarkers (cytokines, c-reactive protein) in bodily fluids (serum, saliva, oral fluids). Further, a description of gradient gels fabrication is included, in an effort to describe methods we have developed for further optimization of on-chip PAGE immunoassays. The described chip-based PAGE immunoassay method enables immunoassays that are fast (minutes) and require very small amounts of sample (less than a few microliters). Use of microfabricated chips as a platform enables integration, parallel assays, automation and development of portable devices.

  10. Hydrogen Infrastructure Testing and Research Facility | Hydrogen and Fuel

    Science.gov Websites

    stations, enabling NREL to validate current industry standards and methods for hydrogen fueling, as well as enabling the HITRF to develop, quantify the performance of, and improve renewable hydrogen production methods

  11. Informatics methods to enable sharing of quantitative imaging research data.

    PubMed

    Levy, Mia A; Freymann, John B; Kirby, Justin S; Fedorov, Andriy; Fennessy, Fiona M; Eschrich, Steven A; Berglund, Anders E; Fenstermacher, David A; Tan, Yongqiang; Guo, Xiaotao; Casavant, Thomas L; Brown, Bartley J; Braun, Terry A; Dekker, Andre; Roelofs, Erik; Mountz, James M; Boada, Fernando; Laymon, Charles; Oborski, Matt; Rubin, Daniel L

    2012-11-01

    The National Cancer Institute Quantitative Imaging Network (QIN) is a collaborative research network whose goal is to share data, algorithms and research tools to accelerate quantitative imaging research. A challenge is the variability in tools and analysis platforms used in quantitative imaging. Our goal was to understand the extent of this variation and to develop an approach that enables data sharing and promotes reuse of quantitative imaging data in the community. We performed a survey of the current tools in use by the QIN member sites for representation and storage of their QIN research data, including images, image meta-data and clinical data. We identified existing systems and standards for data sharing and their gaps for the QIN use case. We then proposed a system architecture to enable data sharing and collaborative experimentation within the QIN. There are a variety of tools currently used by each QIN institution. We developed a general information system architecture to support the QIN goals. We also describe the remaining architecture gaps we are developing to enable members to share research images and image meta-data across the network. As a research network, the QIN will stimulate quantitative imaging research by pooling data, algorithms and research tools. However, there are gaps in current functional requirements that will need to be met by future informatics development. Special attention must be given to the technical requirements needed to translate these methods into the clinical research workflow to enable validation and qualification of these novel imaging biomarkers. Copyright © 2012 Elsevier Inc. All rights reserved.

  12. Enabling CoO improvement thru green initiatives

    NASA Astrophysics Data System (ADS)

    Gross, Eric; Padmabandu, G. G.; Ujazdowski, Richard; Haran, Don; Lake, Matt; Mason, Eric; Gillespie, Walter

    2015-03-01

    Chipmakers' continued pressure to drive down costs while increasing utilization requires development in all areas. Cymer's commitment to meeting customers' needs includes developing solutions that enable higher productivity as well as lower lightsource operating costs. Improvements in system power efficiency and predictability were deployed to chipmakers in 2014 with the release of our latest Master Oscillator gas chamber. In addition, Cymer has committed to reduced gas usage, completing development of methods to reduce helium gas usage while maintaining superior bandwidth and wavelength stability. The latest developments in lowering cost of operations are paired with our advanced ETC controller in Cymer's XLR 700ix product.

  13. Optimal cooperative time-fixed impulsive rendezvous

    NASA Technical Reports Server (NTRS)

    Mirfakhraie, Koorosh; Conway, Bruce A.

    1990-01-01

    New capabilities have been added to a method that had been developed for determining optimal, i.e., minimum-fuel, trajectories for the fixed-time cooperative rendezvous of two spacecraft. The method utilizes primer vector theory. The new capabilities enable the method to accommodate cases in which there are fuel constraints on the spacecraft and/or to add a mid-course impulse to one of the vehicles' trajectories. Results are presented for a large number of cases, and the effect of varying parameters, such as vehicle fuel constraints, vehicle initial masses, and time allowed for the rendezvous, is demonstrated.

  14. Efficient Construction of Discrete Adjoint Operators on Unstructured Grids by Using Complex Variables

    NASA Technical Reports Server (NTRS)

    Nielsen, Eric J.; Kleb, William L.

    2005-01-01

    A methodology is developed and implemented to mitigate the lengthy software development cycle typically associated with constructing a discrete adjoint solver for aerodynamic simulations. The approach is based on a complex-variable formulation that enables straightforward differentiation of complicated real-valued functions. An automated scripting process is used to create the complex-variable form of the set of discrete equations. An efficient method for assembling the residual and cost function linearizations is developed. The accuracy of the implementation is verified through comparisons with a discrete direct method as well as a previously developed handcoded discrete adjoint approach. Comparisons are also shown for a large-scale configuration to establish the computational efficiency of the present scheme. To ultimately demonstrate the power of the approach, the implementation is extended to high temperature gas flows in chemical nonequilibrium. Finally, several fruitful research and development avenues enabled by the current work are suggested.
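
    The complex-variable formulation referenced above is widely known as the complex-step derivative: perturbing the input along the imaginary axis yields an exact-to-machine-precision derivative with no subtractive cancellation. A minimal sketch (the test function is a standard example from the complex-step literature, not taken from this report):

    ```python
    import cmath

    def complex_step_derivative(f, x, h=1e-20):
        """Approximate f'(x) as Im(f(x + i*h)) / h.

        Unlike a finite difference, there is no subtraction of nearly
        equal terms, so h can be made extremely small and the result
        stays accurate to near machine precision.
        """
        return f(complex(x, h)).imag / h

    # Classic test function used to demonstrate the complex-step method
    def f(z):
        return cmath.exp(z) / cmath.sqrt(cmath.sin(z) ** 3 + cmath.cos(z) ** 3)

    d = complex_step_derivative(f, 1.5)  # matches the analytic derivative
    ```

    This is why the abstract can describe differentiation of "complicated real-valued functions" as straightforward: only the arithmetic type of the code changes, not its structure.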

  15. Efficient Construction of Discrete Adjoint Operators on Unstructured Grids Using Complex Variables

    NASA Technical Reports Server (NTRS)

    Nielsen, Eric J.; Kleb, William L.

    2005-01-01

    A methodology is developed and implemented to mitigate the lengthy software development cycle typically associated with constructing a discrete adjoint solver for aerodynamic simulations. The approach is based on a complex-variable formulation that enables straightforward differentiation of complicated real-valued functions. An automated scripting process is used to create the complex-variable form of the set of discrete equations. An efficient method for assembling the residual and cost function linearizations is developed. The accuracy of the implementation is verified through comparisons with a discrete direct method as well as a previously developed handcoded discrete adjoint approach. Comparisons are also shown for a large-scale configuration to establish the computational efficiency of the present scheme. To ultimately demonstrate the power of the approach, the implementation is extended to high temperature gas flows in chemical nonequilibrium. Finally, several fruitful research and development avenues enabled by the current work are suggested.

  16. Methods for Evaluating Mammography Imaging Techniques

    DTIC Science & Technology

    1999-06-01

    This Department of Defense Breast Cancer Research Program Career Development Award is enabling Dr. Rutter to develop biostatistical methods for breast cancer research. Dr. Rutter is focusing on methods for evaluating the accuracy of breast cancer screening. This four-year program includes advanced training in the epidemiology of breast cancer, training in

  17. Developing a WWW Resource Centre for Acquiring and Accessing Open Learning Materials on Research Methods (ReMOTE).

    ERIC Educational Resources Information Center

    Newton, Robert; Marcella, Rita; Middleton, Iain; McConnell, Michael

    This paper reports on ReMOTE (Research Methods Online Teaching Environment), a Robert Gordon University (Scotland) project focusing on the development of a World Wide Web (WWW) site devoted to the teaching of research methods. The aim of ReMOTE is to provide an infrastructure that allows direct links to specialist sources in order to enable the…

  18. An Information System Development Method Combining Business Process Modeling with Executable Modeling and its Evaluation by Prototyping

    NASA Astrophysics Data System (ADS)

    Okawa, Tsutomu; Kaminishi, Tsukasa; Hirabayashi, Syuichi; Suzuki, Ryo; Mitsui, Hiroyasu; Koizumi, Hisao

    Business activities in the enterprise are now so closely tied to the information system that they are difficult to carry out without it. A system design technique is therefore needed that takes the business process properly into account and enables rapid system development; at the same time, demands on development cost are more severe than before. To cope with this situation, the modeling technology named BPM (Business Process Management/Modeling) is drawing attention and becoming important as a key technology. BPM is a technology that models business activities as business processes and visualizes them to improve business efficiency. However, no general methodology exists for developing an information system from the analysis results of BPM, and only a few development cases have been reported. This paper proposes an information system development method that combines business process modeling with executable modeling. We describe a guideline that supports consistency and efficiency of development, and a framework that enables the information system to be developed from the model. We have prototyped an information system with the proposed method, and our experience has shown that the methodology is valuable.

  19. Enabling Self-Directed Computer Use for Individuals with Cerebral Palsy: A Systematic Review of Assistive Devices and Technologies

    ERIC Educational Resources Information Center

    Davies, T. Claire; Mudge, Suzie; Ameratunga, Shanthi; Stott, N. Susan

    2010-01-01

    Aim: The purpose of this study was to systematically review published evidence on the development, use, and effectiveness of devices and technologies that enable or enhance self-directed computer access by individuals with cerebral palsy (CP). Methods: Nine electronic databases were searched using keywords "computer", "software", "spastic",…

  20. A survey of enabling technologies in synthetic biology

    PubMed Central

    2013-01-01

    Background Realizing constructive applications of synthetic biology requires continued development of enabling technologies as well as policies and practices to ensure these technologies remain accessible for research. Broadly defined, enabling technologies for synthetic biology include any reagent or method that, alone or in combination with associated technologies, provides the means to generate any new research tool or application. Because applications of synthetic biology likely will embody multiple patented inventions, it will be important to create structures for managing intellectual property rights that best promote continued innovation. Monitoring the enabling technologies of synthetic biology will facilitate the systematic investigation of property rights coupled to these technologies and help shape policies and practices that impact the use, regulation, patenting, and licensing of these technologies. Results We conducted a survey among a self-identifying community of practitioners engaged in synthetic biology research to obtain their opinions and experiences with technologies that support the engineering of biological systems. Technologies widely used and considered enabling by survey participants included public and private registries of biological parts, standard methods for physical assembly of DNA constructs, genomic databases, software tools for search, alignment, analysis, and editing of DNA sequences, and commercial services for DNA synthesis and sequencing. Standards and methods supporting measurement, functional composition, and data exchange were less widely used though still considered enabling by a subset of survey participants. Conclusions The set of enabling technologies compiled from this survey provides insight into the many and varied technologies that support innovation in synthetic biology.
Many of these technologies are widely accessible for use, either by virtue of being in the public domain or through legal tools such as non-exclusive licensing. Access to some patent protected technologies is less clear and use of these technologies may be subject to restrictions imposed by material transfer agreements or other contract terms. We expect the technologies considered enabling for synthetic biology to change as the field advances. By monitoring the enabling technologies of synthetic biology and addressing the policies and practices that impact their development and use, our hope is that the field will be better able to realize its full potential. PMID:23663447

  1. Open source 3D printers: an appropriate technology for building low cost optics labs for the developing communities

    NASA Astrophysics Data System (ADS)

    Gwamuri, J.; Pearce, Joshua M.

    2017-08-01

    The recent introduction of RepRap (self-replicating rapid prototyper) 3-D printers and the resultant open-source technological improvements have made 3-D printing affordable, enabling low-cost distributed manufacturing for individuals. This development, together with others such as the rise of open-source appropriate technology (OSAT) and solar-powered 3-D printing, is moving 3-D printing from an industrial technology to one that could be used in the developing world for sustainable development. In this paper, we explore specific technological improvements and how distributed manufacturing with open-source 3-D printing can provide open-source 3-D printable optics components for developing-world communities through the ability to print less expensive and customized products. This paper presents an open-source, low-cost optical equipment library of easily adapted, customizable designs with the potential to change the way optics is taught in resource-constrained communities. The study shows that this method of scientific hardware development has the potential to enable a much broader audience to participate in optical experimentation, both as research and as teaching platforms. Conclusions on the technical viability of 3-D printing to assist in development, and recommendations on how developing communities can fully exploit this technology to improve the learning of optics through hands-on methods, are outlined.

  2. Indentation-Enabled In Situ Mechanical Characterization of Micro/Nanopillars in Electron Microscopes

    NASA Astrophysics Data System (ADS)

    Guo, Qiang; Fu, Xidan; Guo, Xiaolei; Liu, Zhiying; Shi, Yan; Zhang, Di

    2018-04-01

    Indentation-enabled micro/nanomechanical characterization of small-scale specimens provides powerful new tools for probing materials properties that were once unattainable by conventional experimental methods. Recent advancement in instrumentation further allows mechanical testing to be carried out in situ in electron microscopes, with high spatial and temporal resolution. This review discusses the recent development of nanoindentation-enabled in situ mechanical testing in electron microscopes, with an emphasis on the study of micro/nanopillars. Focus is given to novel applications beyond simple compressive and tensile testing that have been developed in the past few years, and limitations and possible future research directions in this field are proposed and discussed.

  3. Thinking Through Computational Exposure as an Evolving Paradigm Shift for Exposure Science: Development and Application of Predictive Models from Big Data

    EPA Science Inventory

    Symposium Abstract: Exposure science has evolved from a time when the primary focus was on measurements of environmental and biological media and the development of enabling field and laboratory methods. The Total Exposure Assessment Methodology (TEAM) studies of the 1980s were class...

  4. Materials Genome Initiative Element

    NASA Technical Reports Server (NTRS)

    Vickers, John

    2015-01-01

    NASA is committed to developing new materials and manufacturing methods that can enable new missions with ever increasing mission demands. Typically, the development and certification of new materials and manufacturing methods in the aerospace industry has required more than 20 years of development time with a costly testing and certification program. To reduce the cost and time to mature these emerging technologies, NASA is developing computational materials tools to improve understanding of the material and guide the certification process.

  5. Developing and implementing a service charter for an integrated regional stroke service: an exploratory case study

    PubMed Central

    2014-01-01

    Background Based on practices in commercial organizations and public services, healthcare organizations are using service charters to inform patients about the quality of service they can expect and to increase patient-centeredness. In the Netherlands, an integrated regional stroke service involving five organizations has developed and implemented a single service charter. The purpose of this study is to determine the organizational enablers for the effective development and implementation of this service charter. Methods We have conducted an exploratory qualitative study using Grounded Theory to determine the organizational enablers of charter development and implementation. Individual semi-structured interviews were held with all members of the steering committee and the taskforce responsible for the service charter. In these twelve interviews, participants were retrospectively asked for their opinions of the enablers. Interview transcripts have been analysed using Glaser’s approach of substantive coding consisting of open and selective coding in order to develop a framework of these enablers. A tabula rasa approach was used without any preconceived frameworks used in the coding process. Results We have determined seven categories of enablers formed of a total of 27 properties. The categories address a broad spectrum of enablers dealing with the basic foundations for cooperation, the way to manage the project’s organization and the way to implement the service charter. In addition to the enablers within each individual organization, enablers that reflect the whole chain seem to be important for the effective development and implementation of this service charter. Strategic alignment of goals within the chain, trust between organizations, willingness to cooperate and the extent of process integration are all important properties. 
Conclusions This first exploratory study into the enablers of the effective development and implementation was based on a single case study in the Netherlands. This is the only integrated care chain using a single service charter that we could find. Nevertheless, the results of our explorative study provide an initial framework for the development and implementation of service charters in integrated care settings. This research contributes to the literature on service charters, on patient-centeredness in integrated care and on the implementation of innovations. PMID:24678839

  6. 3DHZETRN: Inhomogeneous Geometry Issues

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Slaba, Tony C.; Badavi, Francis F.

    2017-01-01

    Historical methods for assessing radiation exposure inside complicated geometries for space applications were limited by computational constraints and lack of knowledge associated with nuclear processes occurring over a broad range of particles and energies. Various methods were developed and utilized to simplify geometric representations and enable coupling with simplified but efficient particle transport codes. Recent transport code development efforts, leading to 3DHZETRN, now enable such approximate methods to be carefully assessed to determine if past exposure analyses and validation efforts based on those approximate methods need to be revisited. In this work, historical methods of representing inhomogeneous spacecraft geometry for radiation protection analysis are first reviewed. Two inhomogeneous geometry cases, previously studied with 3DHZETRN and Monte Carlo codes, are considered with various levels of geometric approximation. Fluence, dose, and dose equivalent values are computed in all cases and compared. It is found that although these historical geometry approximations can induce large errors in neutron fluences up to 100 MeV, errors on dose and dose equivalent are modest (<10%) for the cases studied here.

  7. Leading Game-Simulation Development Teams: Enabling Collaboration with Faculty Experts

    ERIC Educational Resources Information Center

    Aleckson, Jon D.

    2010-01-01

    This study explored how educational technology development leaders can facilitate increased collaboration between the instructional design and development team and faculty member experts when developing games and simulations. A qualitative, case study method was used to analyze interviews and documents, and Web postings related specifically to…

  8. Developing Water Quality Criteria for Suspended and Bedded Sediments-Illustrative Example Application.

    EPA Science Inventory

    The U. S. EPA's Framework for Developing Suspended and Bedded Sediments (SABS) Water Quality Criteria (SABS Framework) provides a consistent process, technical methods, and supporting materials to enable resource managers to develop ambient water quality criteria for one of the m...

  9. Structures and Materials Working Group report

    NASA Technical Reports Server (NTRS)

    Torczyner, Robert; Hanks, Brantley R.

    1986-01-01

    The appropriateness of the selection of four issues (advanced materials development, analysis/design methods, tests of large flexible structures, and structural concepts) was evaluated. A cross-check of the issues and their relationship to the technology drivers is presented. Although all of the issues addressed numerous drivers, the advanced materials development issue impacts six out of the seven drivers and is considered to be the most crucial. The advanced materials technology development and the advanced design/analysis methods development were determined to be enabling technologies with the testing issues and development of structural concepts considered to be of great importance, although not enabling technologies. In addition, and of more general interest and criticality, the need for a Government/Industry commitment which does not now exist, was established. This commitment would call for the establishment of the required infrastructure to facilitate the development of the capabilities highlighted through the availability of resources and testbed facilities, including a national testbed in space to be in place in ten years.

  10. Defining and Enabling Resiliency of Electric Distribution Systems With Multiple Microgrids

    DOE PAGES

    Chanda, Sayonsom; Srivastava, Anurag K.

    2016-05-02

    This paper presents a method for quantifying and enabling the resiliency of a power distribution system (PDS) using the analytical hierarchy process and percolation theory. Using this metric, quantitative analysis can be done to analyze the impact of possible control decisions and to proactively enable the resilient operation of a distribution system with multiple microgrids and other resources. The developed resiliency metric can also be used in short-term distribution system planning. The ability to quantify resiliency can help distribution system planning engineers and operators to justify control actions, compare different reconfiguration algorithms, and develop proactive control actions to avert power system outages due to impending catastrophic weather or other adverse events. Validation of the proposed method is done using modified CERTS microgrids and a modified industrial distribution system. Furthermore, simulation results show a topological metric and a composite metric that considers power system characteristics to quantify the resiliency of a distribution system with the proposed methodology, and improvements in resiliency using a two-stage reconfiguration algorithm and multiple microgrids.
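
    One building block named above, the analytical hierarchy process (AHP), derives criterion weights from the principal eigenvector of a pairwise-comparison matrix. A minimal pure-Python sketch, with invented comparison values for three hypothetical resiliency criteria (the paper's actual criteria and values are not given here):

    ```python
    def ahp_weights(M, iters=100):
        """Criterion weights from a pairwise comparison matrix M.

        Computes the principal eigenvector by power iteration and
        normalizes it to sum to 1, as in the standard AHP procedure.
        """
        n = len(M)
        w = [1.0 / n] * n
        for _ in range(iters):
            v = [sum(M[i][j] * w[j] for j in range(n)) for i in range(n)]
            s = sum(v)
            w = [x / s for x in v]
        return w

    # Hypothetical comparisons of three criteria, e.g. redundancy vs.
    # restoration time vs. resource availability (values invented):
    # M[i][j] = how much more important criterion i is than criterion j.
    M = [
        [1.0, 3.0, 5.0],
        [1 / 3, 1.0, 2.0],
        [1 / 5, 1 / 2, 1.0],
    ]
    w = ahp_weights(M)  # weights sum to 1; the first criterion dominates
    ```

    The resulting weights would then combine the individual resiliency indicators into a single composite score.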

  11. Enabling Self-Monitoring Data Exchange in Participatory Medicine.

    PubMed

    Lopez-Campos, Guillermo; Ofoghi, Bahadorreza; Martin-Sanchez, Fernando

    2015-01-01

    The development of new methods, devices and apps for self-monitoring has enabled the extension of these approaches to consumer health and research purposes. The increase in the number and variety of devices has generated a complex scenario where reporting guidelines and data exchange formats will be needed to ensure the quality of the information and the reproducibility of experimental results. Based on the Minimal Information for Self Monitoring Experiments (MISME) reporting guideline, we have developed an XML format (MISME-ML) to facilitate data exchange for self-monitoring experiments. We have also developed a sample instance to illustrate the concept and a Java MISME-ML validation tool. The implementation and adoption of these tools should contribute to the consolidation of a set of methods that ensure the reproducibility of self-monitoring experiments for research purposes.
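
    A validator along the lines of the MISME-ML tool could check well-formedness and required elements. The sketch below uses Python's standard library rather than the authors' Java tool, and the element names are purely hypothetical assumptions, since the actual MISME-ML schema is not given here:

    ```python
    import xml.etree.ElementTree as ET

    # Hypothetical required elements -- the real MISME-ML schema may differ.
    REQUIRED = ["device", "measurement", "timestamp"]

    def validate_misme(xml_text):
        """Return a list of problems: parse failures or missing elements."""
        try:
            root = ET.fromstring(xml_text)
        except ET.ParseError as e:
            return ["not well-formed: %s" % e]
        return ["missing <%s>" % tag for tag in REQUIRED
                if root.find(".//" + tag) is None]

    doc = """<experiment>
      <device>wrist-worn accelerometer</device>
      <measurement unit="steps">8421</measurement>
    </experiment>"""

    issues = validate_misme(doc)  # flags the absent <timestamp> element
    ```

    A production validator would instead validate against a published XML Schema, but the required-elements check conveys the reporting-guideline idea.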

  12. Computational Analysis of the Caenorhabditis elegans Germline to Study the Distribution of Nuclei, Proteins, and the Cytoskeleton.

    PubMed

    Gopal, Sandeep; Pocock, Roger

    2018-04-19

    The Caenorhabditis elegans (C. elegans) germline is used to study several biologically important processes including stem cell development, apoptosis, and chromosome dynamics. While the germline is an excellent model, the analysis is often two dimensional due to the time and labor required for three-dimensional analysis. Major readouts in such studies are the number/position of nuclei and protein distribution within the germline. Here, we present a method to perform automated analysis of the germline using confocal microscopy and computational approaches to determine the number and position of nuclei in each region of the germline. Our method also analyzes germline protein distribution that enables the three-dimensional examination of protein expression in different genetic backgrounds. Further, our study shows variations in cytoskeletal architecture in distinct regions of the germline that may accommodate specific spatial developmental requirements. Finally, our method enables automated counting of the sperm in the spermatheca of each germline. Taken together, our method enables rapid and reproducible phenotypic analysis of the C. elegans germline.
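
    Counting nuclei in a segmented image, as in the automated analysis described above, ultimately reduces to labeling connected components. A minimal 2-D sketch in pure Python (the binary mask is synthetic; a real pipeline would operate on confocal z-stacks with an image-processing library):

    ```python
    def count_components(mask):
        """Count 4-connected foreground components in a binary 2-D mask,
        a toy stand-in for counting segmented nuclei."""
        rows, cols = len(mask), len(mask[0])
        seen = [[False] * cols for _ in range(rows)]
        count = 0
        for r in range(rows):
            for c in range(cols):
                if mask[r][c] and not seen[r][c]:
                    count += 1
                    stack = [(r, c)]  # flood-fill this component
                    seen[r][c] = True
                    while stack:
                        y, x = stack.pop()
                        for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                            ny, nx = y + dy, x + dx
                            if (0 <= ny < rows and 0 <= nx < cols
                                    and mask[ny][nx] and not seen[ny][nx]):
                                seen[ny][nx] = True
                                stack.append((ny, nx))
        return count

    # Two "nuclei" in a tiny synthetic thresholded image
    mask = [
        [1, 1, 0, 0, 0],
        [1, 0, 0, 1, 1],
        [0, 0, 0, 1, 0],
    ]
    n = count_components(mask)
    ```

    Recording each component's centroid alongside the count would give the nucleus positions that studies like this one report.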

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    J. Carmack; L. Braase; F. Goldner

    The mission of the Advanced Fuels Campaign (AFC) is to perform Research, Development, and Demonstration (RD&D) activities for advanced fuel forms (including cladding) to enhance the performance and safety of the nation’s current and future reactors, enhance proliferation resistance of nuclear fuel, effectively utilize nuclear energy resources, and address the longer-term waste management challenges. This includes development of a state of the art Research and Development (R&D) infrastructure to support the use of a “goal oriented science based approach.” AFC uses a “goal oriented, science based approach” aimed at a fundamental understanding of fuel and cladding fabrication methods and performance under irradiation, enabling the pursuit of multiple fuel forms for future fuel cycle options. This approach includes fundamental experiments, theory, and advanced modeling and simulation. One of the most challenging aspects of AFC is the management, integration, and coordination of major R&D activities across multiple organizations. AFC interfaces and collaborates with Fuel Cycle Technologies (FCT) campaigns, universities, industry, various DOE programs and laboratories, federal agencies (e.g., Nuclear Regulatory Commission [NRC]), and international organizations. Key challenges are the development of fuel technologies to enable major increases in fuel performance (safety, reliability, power and burnup) beyond current technologies, and development of characterization methods and predictive fuel performance models to enable more efficient development and licensing of advanced fuels. Challenged with the research and development of fuels for two different reactor technology platforms, AFC targeted transmutation fuel development and focused ceramic fuel development for Advanced LWR Fuels.

  14. Characterization of Residues from the Detonation of Insensitive Munitions

    DTIC Science & Technology

    Unfortunately, many energetic compounds are toxic or harmful to the environment and human health. The US Army Cold Regions Research and Engineering Laboratory and Defence Research and Development Canada Valcartier have developed methods through SERDP and ESTCP programs that enable a reproducible method for energetics residues characterization research. SERDP Project ER-2219 is focused on three areas: determining mass deposition and

  15. The Impact of Being Part of an Action Learning Set for New Lecturers: A Reflective Analysis

    ERIC Educational Resources Information Center

    Haith, Mark P.; Whittingham, Katrina A.

    2012-01-01

    What is an action learning set (ALS)? An ALS is a regular, action-focused peer discussion group, generally facilitated, that addresses workplace issues. Methods of undertaking an ALS are flexible within a range of approaches, according to the group's developing needs. Benefits of an ALS: it builds trust, supports professional development, enables action,…

  16. Technical Matters: Method, Knowledge and Infrastructure in Twentieth-Century Life Science

    PubMed Central

    Creager, Angela N. H.; Landecker, Hannah

    2010-01-01

    Conceptual breakthroughs in science tend to garner accolades and attention. But, as the invention of tissue culture and the development of isotopic tracers show, innovative methods open up new fields and enable the solution of longstanding problems. PMID:19953684

  17. Application of machine learning methods in bioinformatics

    NASA Astrophysics Data System (ADS)

    Yang, Haoyu; An, Zheng; Zhou, Haotian; Hou, Yawen

    2018-05-01

    Faced with the development of bioinformatics, high-throughput genomic technologies have enabled biology to enter the era of big data [1]. Bioinformatics is an interdisciplinary field that includes the acquisition, management, analysis, interpretation and application of biological information; it derives from the Human Genome Project. The field of machine learning, which aims to develop computer algorithms that improve with experience, holds promise to enable computers to assist humans in the analysis of large, complex data sets [2]. This paper analyzes and compares various machine learning algorithms and their applications in bioinformatics.

  18. Evaluation of the potential use of hybrid LC-MS/MS for active drug quantification applying the 'free analyte QC concept'.

    PubMed

    Jordan, Gregor; Onami, Ichio; Heinrich, Julia; Staack, Roland F

    2017-11-01

    Assessment of active drug exposure of biologics may be crucial for drug development. Typically, ligand-binding assay methods are used to provide free/active drug concentrations. To what extent hybrid LC-MS/MS procedures enable correct 'active' drug quantification is currently under consideration. Experimental & results: The relevance of appropriate extraction conditions was evaluated by a hybrid target capture immuno-affinity LC-MS/MS method using total and free/active quality controls (QCs). The rapid extraction (10 min) provided correct results, whereas overnight incubation resulted in significant overestimation of the free/active drug (monoclonal antibody) concentration. Conventional total QCs were inappropriate for determining optimal method conditions, in contrast to free/active QCs. The 'free/active analyte QC concept' enables development of appropriate extraction conditions for correct active drug quantification by hybrid LC-MS/MS.

  19. Direct writing of metal nanostructures: lithographic tools for nanoplasmonics research.

    PubMed

    Leggett, Graham J

    2011-03-22

    Continued progress in the fast-growing field of nanoplasmonics will require the development of new methods for the fabrication of metal nanostructures. Optical lithography provides a continually expanding tool box. Two-photon processes, as demonstrated by Shukla et al. (doi: 10.1021/nn103015g), enable the fabrication of gold nanostructures encapsulated in dielectric material in a simple, direct process and offer the prospect of three-dimensional fabrication. At higher resolution, scanning probe techniques enable nanoparticle placement by localized oxidation, and near-field sintering of nanoparticulate films enables direct writing of nanowires. Direct laser "printing" of single gold nanoparticles offers a remarkable capability for the controlled fabrication of model structures for fundamental studies, particle by particle. Optical methods continue to provide powerful support for research into metamaterials.

  20. New calibration method for I-scan sensors to enable the precise measurement of pressures delivered by 'pressure garments'.

    PubMed

    Macintyre, Lisa

    2011-11-01

    Accurate measurement of the pressure delivered by medical compression products is highly desirable both in monitoring treatment and in developing new pressure inducing garments or products. There are several complications in measuring pressure at the garment/body interface and at present no ideal pressure measurement tool exists for this purpose. This paper summarises a thorough evaluation of the accuracy and reproducibility of measurements taken following both of Tekscan Inc.'s recommended calibration procedures for I-scan sensors; and presents an improved method for calibrating and using I-scan pressure sensors. The proposed calibration method enables accurate (±2.1 mmHg) measurement of pressures delivered by pressure garments to body parts with a circumference ≥30 cm. This method is too cumbersome for routine clinical use but is very useful, accurate and reproducible for product development or clinical evaluation purposes. Copyright © 2011 Elsevier Ltd and ISBI. All rights reserved.

  1. Eye-tracking for clinical decision support: A method to capture automatically what physicians are viewing in the EMR.

    PubMed

    King, Andrew J; Hochheiser, Harry; Visweswaran, Shyam; Clermont, Gilles; Cooper, Gregory F

    2017-01-01

    Eye-tracking is a valuable research tool that is used in laboratory and limited field environments. We take steps toward developing methods that enable widespread adoption of eye-tracking and its real-time application in clinical decision support. Eye-tracking will enhance awareness and enable intelligent views, more precise alerts, and other forms of decision support in the Electronic Medical Record (EMR). We evaluated a low-cost eye-tracking device and found the device's accuracy to be non-inferior to that of a more expensive device. We also developed and evaluated an automatic method for mapping eye-tracking data to interface elements in the EMR (e.g., a displayed laboratory test value). Mapping was 88% accurate across the six participants in our experiment. Finally, we piloted the use of the low-cost device and the automatic mapping method to label training data for a Learning EMR (LEMR), a system that highlights the EMR elements a physician is predicted to use.
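    The automatic mapping step described above, matching gaze coordinates to displayed EMR elements, reduces at its core to a point-in-rectangle lookup. Below is a minimal sketch of that idea in Python; the element names and pixel coordinates are hypothetical, not taken from the authors' system.

```python
# Hypothetical EMR layout: element name -> bounding box (x0, y0, x1, y1) in pixels.
EMR_ELEMENTS = {
    "lab:creatinine":    (100, 200, 300, 230),
    "lab:potassium":     (100, 240, 300, 270),
    "vitals:heart_rate": (400, 100, 600, 140),
}

def map_gaze_to_element(x, y, elements=EMR_ELEMENTS):
    """Return the name of the element whose bounding box contains the
    gaze point (x, y), or None if the gaze falls outside all elements."""
    for name, (x0, y0, x1, y1) in elements.items():
        if x0 <= x <= x1 and y0 <= y <= y1:
            return name
    return None

hit = map_gaze_to_element(150, 215)   # inside the creatinine box
miss = map_gaze_to_element(50, 50)    # outside every box
```

A real system would also have to account for scrolling and window resizing, which move the bounding boxes between gaze samples.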

  3. Inoculum production and long-term conservation methods for cucurbits and tomato powdery mildews.

    PubMed

    Bardin, Marc; Suliman, Muna E; Sage-Palloix, Anne-Marie; Mohamed, Youssif F; Nicot, Philippe C

    2007-06-01

    The behaviour of cucurbit powdery mildews (Podosphaera xanthii and Golovinomyces cichoracearum) and tomato powdery mildew (Oidium neolycopersici) infesting detached cotyledons of Lagenaria leucantha cv. 'Minibottle' was studied in order to develop an easy culture method for pure inoculum production. High spore production was found with a combination of mannitol (0.1 M), sucrose (0.02 M) and agar (8 g l(-1)) in the cotyledon survival medium. Sporulation on cotyledons and the viability of conidia were affected by the age of the culture for the three species of powdery mildew tested. The age of the cotyledons also had an impact on spore production. This method was used to produce large amounts of inoculum for P. xanthii, G. cichoracearum and O. neolycopersici, and enabled the development of other species of powdery mildew such as Leveillula taurica. Freezing conidia in liquid nitrogen enabled the long-term conservation of P. xanthii without any loss of virulence. The same method was unsuccessful with G. cichoracearum and L. taurica, and partly successful with O. neolycopersici.

  4. Development of a water-use data system in Minnesota

    USGS Publications Warehouse

    Horn, M.A.

    1986-01-01

    The Minnesota Water-Use Data System (MWUDS) stores data on the quantity of individual annual water withdrawals and discharges in relation to the water resources affected, provides descriptors for aggregation of data and trend analysis, and enables access to additional data contained in other data bases. MWUDS is stored on a computer at the Land Management Information Center, an agency associated with the State Planning Agency. Interactive menu-driven programs simplify data entry, update, and retrieval and are easy to use. Estimates of unreported water use supplement reported water use to completely describe the stress on the hydrologic system. Links, or common elements, developed in the MWUDS enable access to data available in other State water-related data bases, forming a water-resource information system. Water-use information can be improved by developing methods for increasing the accuracy of reported water use and refining methods for estimating unreported water use.

  5. Rapid Technology Assessment via Unified Deployment of Global Optical and Virtual Diagnostics

    NASA Technical Reports Server (NTRS)

    Jordan, Jeffrey D.; Watkins, A. Neal; Fleming, Gary A.; Leighty, Bradley D.; Schwartz, Richard J.; Ingram, JoAnne L.; Grinstead, Keith D., Jr.; Oglesby, Donald M.; Tyler, Charles

    2003-01-01

    This paper discusses recent developments in rapid technology assessment resulting from an active collaboration between researchers at the Air Force Research Laboratory (AFRL) at Wright Patterson Air Force Base (WPAFB) and the NASA Langley Research Center (LaRC). This program targets the unified development and deployment of global measurement technologies coupled with a virtual diagnostic interface to enable the comparative evaluation of experimental and computational results. Continuing efforts focus on the development of seamless data translation methods to enable integration of data sets of disparate file format in a common platform. Results from a successful low-speed wind tunnel test at WPAFB in which global surface pressure distributions were acquired simultaneously with model deformation and geometry measurements are discussed and comparatively evaluated with numerical simulations. Intensity- and lifetime-based pressure-sensitive paint (PSP) and projection moire interferometry (PMI) results are presented within the context of rapid technology assessment to enable simulation-based R&D.

  6. Genome-wide Mapping of Cellular Protein–RNA Interactions Enabled by Chemical Crosslinking

    PubMed Central

    Li, Xiaoyu; Song, Jinghui; Yi, Chengqi

    2014-01-01

    RNA–protein interactions influence many biological processes. Identifying the binding sites of RNA-binding proteins (RBPs) remains one of the most fundamental and important challenges in the study of such interactions. Capturing RNA and RBPs via chemical crosslinking allows stringent purification procedures that significantly remove non-specific RNA and protein interactions. Two major types of chemical crosslinking strategies have been developed to date, i.e., UV-enabled crosslinking and enzymatic mechanism-based covalent capture. In this review, we compare these strategies and their current applications, with an emphasis on the technologies themselves rather than the biology that has been revealed. We hope such methods will benefit a broader audience, and we also urge the development of new methods to study RNA–RBP interactions. PMID:24747191

  7. NEW METHODS TO SCREEN FOR DEVELOPMENTAL NEUROTOXICITY.

    EPA Science Inventory

    The development of alternative methods for toxicity testing is driven by the need for scientifically valid data (i.e. predictive of a toxic effect) that can be obtained in a rapid and cost-efficient manner. These predictions will enable decisions to be made as to whether further ...

  8. Pick a Sample.

    ERIC Educational Resources Information Center

    Peterson, Ivars

    1991-01-01

    A method that enables people to obtain the benefits of statistics and probability theory without the shortcomings of conventional methods because it is free of mathematical formulas and is easy to understand and use is described. A resampling technique called the "bootstrap" is discussed in terms of application and development. (KR)
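    The bootstrap idea described above can be illustrated in a few lines: repeatedly resample the data with replacement and use the spread of the resampled statistics in place of a closed-form formula. A minimal sketch using NumPy (the data values are invented for illustration):

```python
import numpy as np

def bootstrap_standard_error(data, n_resamples=2000, seed=0):
    """Estimate the standard error of the mean by resampling with replacement."""
    rng = np.random.default_rng(seed)
    data = np.asarray(data, dtype=float)
    # Each resample has the same size as the original data, drawn with replacement.
    means = np.array([rng.choice(data, size=data.size, replace=True).mean()
                      for _ in range(n_resamples)])
    return means.std(ddof=1)

sample = [2.1, 3.4, 1.9, 4.2, 2.8, 3.1, 2.5, 3.9]
se = bootstrap_standard_error(sample)

# For the mean, the bootstrap estimate should land near the classical s / sqrt(n).
classical = np.std(sample, ddof=1) / np.sqrt(len(sample))
```

The appeal, as the article notes, is that the same resampling loop works unchanged for statistics (medians, ratios, correlations) that have no simple standard-error formula.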

  9. The "Movement" of Mixed Methods Research and the Role of Educators

    ERIC Educational Resources Information Center

    Creswell, John W.; Garrett, Amanda L.

    2008-01-01

    The landscape of research is continually evolving, enabling researchers to study increasingly complex phenomena. Educational researchers have propelled much of this forward progress and have developed novel methodologies to provide increasingly sound and complete evidence. Mixed methods research has emerged alongside quantitative and qualitative…

  10. Trust-based information system architecture for personal wellness.

    PubMed

    Ruotsalainen, Pekka; Nykänen, Pirkko; Seppälä, Antto; Blobel, Bernd

    2014-01-01

    Modern eHealth, ubiquitous health and personal wellness systems take place in an unsecure and ubiquitous information space where no predefined trust occurs. This paper presents a novel information model and an architecture for trust-based privacy management of personal health and wellness information in a ubiquitous environment. The architecture enables a person to calculate a dynamic and context-aware trust value for each service provider, and to use it to design personal privacy policies for trustworthy use of health and wellness services. For trust calculation a novel set of measurable, context-aware and health information-sensitive attributes is developed. The architecture enables a person to manage his or her privacy in a ubiquitous environment by formulating context-aware and service provider-specific policies. Focus groups and information modelling were used to develop a wellness information model. A system analysis method based on sequential steps, which combines the results of the analysis of privacy and trust concerns with the selection of trust and privacy services, was used to develop the information system architecture. Its services (e.g. trust calculation, decision support, policy management and policy binding services) and the developed attributes enable a person to define situation-aware policies that regulate the way his or her wellness and health information is processed.
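    The trust-calculation service described above, a dynamic value computed from measurable attributes, can be sketched generically as a weighted aggregate. The attribute names, weights and threshold below are hypothetical illustrations, not the paper's actual model:

```python
# Hypothetical context-aware attributes for a service provider, each scored in [0, 1].
attributes = {
    "data_encryption":     1.0,
    "policy_transparency": 0.8,
    "past_compliance":     0.9,
    "third_party_sharing": 0.3,  # low score = more sharing, hence riskier
}
weights = {
    "data_encryption":     0.4,
    "policy_transparency": 0.2,
    "past_compliance":     0.3,
    "third_party_sharing": 0.1,
}

def trust_value(attrs, w):
    """Weighted average of attribute scores; the result stays in [0, 1]."""
    total_w = sum(w.values())
    return sum(w[k] * attrs[k] for k in attrs) / total_w

score = trust_value(attributes, weights)
trusted = score >= 0.7   # example privacy-policy threshold
```

In the architecture described, such a score would be recomputed as the context changes, and the resulting decision would bind a provider-specific privacy policy.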

  11. How to detect and reduce movement artifacts in near-infrared imaging using moving standard deviation and spline interpolation.

    PubMed

    Scholkmann, F; Spichtig, S; Muehlemann, T; Wolf, M

    2010-05-01

    Near-infrared imaging (NIRI) is a neuroimaging technique which enables us to non-invasively measure hemodynamic changes in the human brain. Since the technique is very sensitive, movement of the subject can cause movement artifacts (MAs), which strongly affect the signal quality and results. No general method is yet available to reduce these MAs effectively. The aim was to develop a new MA reduction method. A method based on moving standard deviation and spline interpolation was developed. It enables the semi-automatic detection and reduction of MAs in the data. It was validated using simulated and real NIRI signals. The results show that a significant reduction of MAs and an increase in signal quality are achieved. The effectiveness and usability of the method are demonstrated by the improved detection of evoked hemodynamic responses. The present method can be used not only in the postprocessing of NIRI signals but also for other kinds of data containing artifacts, for example ECG or EEG signals.
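    The abstract's detect-then-interpolate scheme can be sketched roughly as follows: flag samples whose moving standard deviation exceeds a threshold, then replace the flagged span with a cubic spline fitted to the surrounding clean samples. This is a simplified illustration of the general idea, not the authors' implementation; the window size and threshold factor are arbitrary choices.

```python
import numpy as np
from scipy.interpolate import CubicSpline

def reduce_movement_artifacts(signal, window=10, thresh_factor=3.0):
    """Flag high-variance spans via a moving standard deviation, then
    replace them with a cubic-spline interpolation over clean samples."""
    signal = np.asarray(signal, dtype=float)
    n = signal.size
    # Moving standard deviation over a sliding window centred on each sample.
    mov_std = np.array([signal[max(0, i - window // 2): i + window // 2 + 1].std()
                        for i in range(n)])
    artifact = mov_std > thresh_factor * np.median(mov_std)
    clean_idx = np.flatnonzero(~artifact)
    # Spline through the clean samples, evaluated at the artifact positions.
    spline = CubicSpline(clean_idx, signal[clean_idx])
    out = signal.copy()
    out[artifact] = spline(np.flatnonzero(artifact))
    return out, artifact

# Synthetic example: a slow sine with a sharp spike standing in for a movement artifact.
t = np.linspace(0, 4 * np.pi, 400)
sig = np.sin(t)
sig[200:210] += 5.0
cleaned, mask = reduce_movement_artifacts(sig)
```

The published method additionally involves semi-automatic (user-assisted) selection of artifact segments; the fully automatic threshold here is only a stand-in.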

  12. Novel ratio difference at coabsorptive point spectrophotometric method for determination of components with wide variation in their absorptivities.

    PubMed

    Saad, Ahmed S; Abo-Talib, Nisreen F; El-Ghobashy, Mohamed R

    2016-01-05

    Different methods have been introduced to enhance the selectivity of UV-spectrophotometry, thus enabling accurate determination of co-formulated components; however, mixtures whose components exhibit wide variation in absorptivities have remained an obstacle to the application of UV-spectrophotometry. The developed ratio difference at coabsorptive point method (RDC) represents a simple, effective solution to this problem: the additive property of light absorbance enables the two components to be considered as multiples of the lower-absorptivity component at a certain wavelength (the coabsorptive point), at which their total concentration in multiples can be determined, whereas the other component is selectively determined by applying the ratio difference method in a single step. A mixture of perindopril arginine (PA) and amlodipine besylate (AM) exemplifies this problem, where the low absorptivity of PA relative to AM hinders selective spectrophotometric determination of PA. The developed method successfully determined both components in the overlapped region of their spectra with accuracies of 99.39±1.60 and 100.51±1.21 for PA and AM, respectively. The method was validated as per the USP guidelines and showed no significant difference upon statistical comparison with a reported chromatographic method. Copyright © 2015 Elsevier B.V. All rights reserved.
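    The additive-absorbance property the method relies on can be illustrated generically: at each wavelength, the mixture absorbance is the concentration-weighted sum of the pure-component absorptivities (Beer-Lambert), so the concentrations follow from a linear solve. The sketch below is standard two-component algebra with invented absorptivity values, not the RDC procedure itself:

```python
import numpy as np

# Hypothetical pure-component absorptivities at four wavelengths
# (rows: wavelengths; columns: a low-absorptivity and a high-absorptivity component).
E = np.array([[0.02, 0.80],
              [0.05, 0.60],
              [0.08, 0.30],
              [0.10, 0.10]])

true_conc = np.array([12.0, 1.5])   # invented concentrations
mixture = E @ true_conc             # additive absorbance (Beer-Lambert)

# Recover the concentrations from the mixture spectrum by least squares.
recovered, *_ = np.linalg.lstsq(E, mixture, rcond=None)
```

The point of methods like RDC is precisely to avoid this kind of multi-wavelength matrix solve when one component's absorptivity is far smaller than the other's, which makes the system poorly conditioned in practice.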

  13. Magnetic resonance microscopy of prostate tissue: How basic science can inform clinical imaging development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bourne, Roger

    2013-03-15

    This commentary outlines how magnetic resonance imaging (MRI) microscopy studies of prostate tissue samples and whole organs have shed light on a number of clinical imaging mysteries and may enable more effective development of new clinical imaging methods.

  14. Rational, computer-enabled peptide drug design: principles, methods, applications and future directions.

    PubMed

    Diller, David J; Swanson, Jon; Bayden, Alexander S; Jarosinski, Mark; Audie, Joseph

    2015-01-01

    Peptides provide promising templates for developing drugs to occupy a middle space between small molecules and antibodies and for targeting 'undruggable' intracellular protein-protein interactions. Importantly, rational or in cerebro design, especially when coupled with validated in silico tools, can be used to efficiently explore chemical space and identify islands of 'drug-like' peptides to satisfy diverse drug discovery program objectives. Here, we consider the underlying principles of and recent advances in rational, computer-enabled peptide drug design. In particular, we consider the impact of basic physicochemical properties, potency and ADME/Tox opportunities and challenges, and recently developed computational tools for enabling rational peptide drug design. Key principles and practices are spotlighted by recent case studies. We close with a hypothetical future case study.

  15. Microwave sensing technology issues related to a global change technology architecture trade study

    NASA Technical Reports Server (NTRS)

    Campbell, Thomas G.; Shiue, Jim; Connolly, Denis; Woo, Ken

    1991-01-01

    The objectives are to enable the development of lighter, less power-consuming, high-resolution microwave sensors which will operate at frequencies from 1 to 200 GHz. These systems will use large-aperture antenna systems (both reflector and phased arrays) capable of wide scan angles and high polarization purity, and will utilize sidelobe suppression techniques as required. Essentially, the success of this technology program will enable high-resolution microwave radiometers in geostationary orbit and lighter, more efficient radar systems in low Earth orbit, and will eliminate mechanical scanning methods, a main source of platform instability in large space systems, to the fullest extent possible. The Global Change Technology Initiative (GCTI) will develop technology which will enable the use of satellite systems for Earth observations on a global scale.

  16. Research Committee Issues Brief: Professional Development for Virtual Schooling and Online Learning

    ERIC Educational Resources Information Center

    Davis, Niki; Rose, Ray

    2007-01-01

    This report examines the types of professional development necessary to implement successful online learning initiatives. The potential for schools utilizing online learning is tremendous: schools can develop new distribution methods to enable equity and access for all students, they can provide high quality content for all students and they can…

  17. Developments in Cylindrical Shell Stability Analysis

    NASA Technical Reports Server (NTRS)

    Knight, Norman F., Jr.; Starnes, James H., Jr.

    1998-01-01

    Today high-performance computing systems and new analytical and numerical techniques enable engineers to explore the use of advanced materials for shell design. This paper reviews some of the historical developments of shell buckling analysis and design. The paper concludes by identifying key research directions for reliable and robust methods development in shell stability analysis and design.

  18. Next-generation genome-scale models for metabolic engineering.

    PubMed

    King, Zachary A; Lloyd, Colton J; Feist, Adam M; Palsson, Bernhard O

    2015-12-01

    Constraint-based reconstruction and analysis (COBRA) methods have become widely used tools for metabolic engineering in both academic and industrial laboratories. By employing a genome-scale in silico representation of the metabolic network of a host organism, COBRA methods can be used to predict optimal genetic modifications that improve the rate and yield of chemical production. A new generation of COBRA models and methods is now being developed, encompassing many biological processes and simulation strategies, and next-generation models enable new types of predictions. Here, three key examples of applying COBRA methods to strain optimization are presented and discussed. Then, an outlook is provided on the next generation of COBRA models and the new types of predictions they will enable for systems metabolic engineering. Copyright © 2014 Elsevier Ltd. All rights reserved.
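    At their core, constraint-based (COBRA) predictions reduce to linear programming over a stoichiometric matrix: maximize an objective flux subject to steady-state mass balance S·v = 0 and flux bounds. Below is a toy flux-balance sketch on an invented three-reaction network, using SciPy rather than any COBRA toolbox API, purely to illustrate the principle:

```python
import numpy as np
from scipy.optimize import linprog

# Toy network: uptake -> A, A -> B, B -> biomass.
# Stoichiometric matrix S (rows: metabolites A, B; columns: fluxes v1..v3).
S = np.array([[1.0, -1.0,  0.0],   # A: produced by v1, consumed by v2
              [0.0,  1.0, -1.0]])  # B: produced by v2, consumed by v3
bounds = [(0, 10), (0, 1000), (0, 1000)]  # uptake flux capped at 10 units
c = np.array([0.0, 0.0, -1.0])  # linprog minimizes, so negate the biomass flux

# Steady state: S @ v = 0 for every internal metabolite.
res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds, method="highs")
optimal_biomass_flux = -res.fun
```

Here the optimum is simply the uptake limit (10 units), since nothing else constrains the pathway; genome-scale models pose the same problem with thousands of reactions, and strain-design methods search over bound changes and gene knockouts on top of it.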

  19. Nanoscale methods for single-molecule electrochemistry.

    PubMed

    Mathwig, Klaus; Aartsma, Thijs J; Canters, Gerard W; Lemay, Serge G

    2014-01-01

    The development of experiments capable of probing individual molecules has led to major breakthroughs in fields ranging from molecular electronics to biophysics, allowing direct tests of knowledge derived from macroscopic measurements and enabling new assays that probe population heterogeneities and internal molecular dynamics. Although still somewhat in their infancy, such methods are also being developed for probing molecular systems in solution using electrochemical transduction mechanisms. Here we outline the present status of this emerging field, concentrating in particular on optical methods, metal-molecule-metal junctions, and electrochemical nanofluidic devices.

  20. Virtualising the Quantitative Research Methods Course: An Island-Based Approach

    ERIC Educational Resources Information Center

    Baglin, James; Reece, John; Baker, Jenalle

    2015-01-01

    Many recent improvements in pedagogical practice have been enabled by the rapid development of innovative technologies, particularly for teaching quantitative research methods and statistics. This study describes the design, implementation, and evaluation of a series of specialised computer laboratory sessions. The sessions combined the use of an…

  1. Completely monodisperse, highly repetitive proteins for bioconjugate capillary electrophoresis: Development and characterization

    PubMed Central

    Lin, Jennifer S.; Albrecht, Jennifer Coyne; Meagher, Robert J.; Wang, Xiaoxiao; Barron, Annelise E.

    2011-01-01

    Protein-based polymers are increasingly being used in biomaterial applications due to their ease of customization and potential monodispersity. These advantages make protein polymers excellent candidates for bioanalytical applications. Here we describe improved methods for producing drag-tags for Free-Solution Conjugate Electrophoresis (FSCE). FSCE utilizes a pure, monodisperse recombinant protein, tethered end-on to a ssDNA molecule, to enable DNA size separation in aqueous buffer. FSCE also provides a highly sensitive method to evaluate the polydispersity of a protein drag-tag and thus its suitability for bioanalytical uses. This method is able to detect slight differences in drag-tag charge or mass. We have devised an improved cloning, expression, and purification strategy that enables us to generate, for the first time, a truly monodisperse 20 kDa protein polymer and a nearly monodisperse 38 kDa protein. These newly produced proteins can be used as drag-tags to enable longer read DNA sequencing by free-solution microchannel electrophoresis. PMID:21553840

  2. Processing methods for photoacoustic Doppler flowmetry with a clinical ultrasound scanner

    NASA Astrophysics Data System (ADS)

    Bücking, Thore M.; van den Berg, Pim J.; Balabani, Stavroula; Steenbergen, Wiendelt; Beard, Paul C.; Brunker, Joanna

    2018-02-01

    Photoacoustic flowmetry (PAF) based on time-domain cross correlation of photoacoustic signals is a promising technique for deep tissue measurement of blood flow velocity. Signal processing has previously been developed for single-element transducers. Here, the processing methods for acoustic resolution PAF using a clinical ultrasound transducer array are developed and validated using a 64-element transducer array with a -6 dB detection band of 11 to 17 MHz. Measurements were performed on a flow phantom consisting of a tube (580 μm inner diameter) perfused with human blood flowing at physiological speeds ranging from 3 to 25 mm/s. The processing pipeline comprised: image reconstruction, filtering, displacement detection, and masking. High-pass filtering and background subtraction were found to be key preprocessing steps to enable accurate flow velocity estimates, which were calculated using a cross-correlation based method. In addition, the regions of interest in the calculated velocity maps were defined using a masking approach based on the amplitude of the cross-correlation functions. These developments enabled blood flow measurements using a transducer array, bringing PAF one step closer to clinical applicability.
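    The displacement-detection step of such a pipeline can be sketched generically: estimate the shift between two successive signal profiles as the lag of their cross-correlation peak, then convert it to a velocity using the known sample spacing and frame interval. This is a simplified single-pair illustration with invented parameter values, not the authors' processing code:

```python
import numpy as np

def estimate_shift(frame_a, frame_b):
    """Return the integer-sample shift of frame_b relative to frame_a,
    taken as the lag of the peak of their cross-correlation."""
    a = frame_a - frame_a.mean()
    b = frame_b - frame_b.mean()
    corr = np.correlate(b, a, mode="full")
    return int(np.argmax(corr)) - (len(a) - 1)

# Synthetic example: an absorber-like Gaussian profile shifted by 4 samples
# between two acquisitions.
x = np.arange(256)
profile = np.exp(-0.5 * ((x - 100) / 5.0) ** 2)
shifted = np.roll(profile, 4)

shift = estimate_shift(profile, shifted)
sample_spacing_mm = 0.01   # assumed spacing between samples
frame_interval_s = 0.02    # assumed time between successive acquisitions
velocity_mm_per_s = shift * sample_spacing_mm / frame_interval_s
```

Real implementations refine this with sub-sample peak interpolation and, as the abstract notes, use the correlation-peak amplitude itself to mask out unreliable regions of the velocity map.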

  3. Multidisciplinary eHealth Survey Evaluation Methods

    ERIC Educational Resources Information Center

    Karras, Bryant T.; Tufano, James T.

    2006-01-01

    This paper describes the development process of an evaluation framework for describing and comparing web survey tools. We believe that this approach will help shape the design, development, deployment, and evaluation of population-based health interventions. A conceptual framework for describing and evaluating web survey systems will enable the…

  4. Learning to Cope with Change: The Story of an OD Intervention.

    ERIC Educational Resources Information Center

    McGivern, Chris; Broomhall, Mike

    1978-01-01

    Describes a change program developed for a small manufacturing firm to enable management to cope with organizational change and development. Intervention methods included data collection, anonymous questionnaires, a confrontation meeting, and introduction of change technologies using the organizational analysis approach. Benefits and problems…

  5. Ultra-Rapid 2-D and 3-D Laser Microprinting of Proteins

    NASA Astrophysics Data System (ADS)

    Scott, Mark Andrew

    When viewed under the microscope, biological tissues reveal an exquisite microarchitecture. These complex patterns arise during development, as cells interact with a multitude of chemical and mechanical cues in the surrounding extracellular matrix. Tissue engineers have sought for decades to repair or replace damaged tissue, often relying on porous scaffolds as an artificial extracellular matrix to support cell development. However, these grafts are unable to recapitulate the complexity of the in vivo environment, limiting our ability to regenerate functional tissue. Biomedical engineers have developed several methods for printing two- and three-dimensional patterns of proteins for studying and directing cell development. Of these methods, laser microprinting of proteins has shown the most promise for printing sub-cellular resolution gradients of cues, but the photochemistry remains too slow to enable large-scale applications for screening and therapeutics. In this work, we demonstrate a novel high-speed photochemistry based on multi-photon photobleaching of fluorescein, and we build the fastest 2-D and 3-D laser microprinter for proteins to date. First, we show that multiphoton photobleaching of a deoxygenated solution of biotin-4-fluorescein onto a PEG monolayer with an acrylate end-group can enable print speeds of almost 20 million pixels per second at 600 nanometer resolution. We discovered that the mechanism of fluorescein photobleaching evolves from a 2-photon to a 3- and 4-photon regime at higher laser intensities, unlocking faster printing kinetics. Using this 2-D printing system, we develop a novel triangle-ratchet method for directing the polarization of single hippocampal neurons. This ability to determine which neurite becomes an axon, and which neurites become dendrites, is an essential step for developing defined in vitro neural networks. Next, we modify our multiphoton photobleaching system to print in three dimensions. For the first time, we demonstrate 3-D printing of full-length proteins in collagen, fibrin and gelatin methacrylate scaffolds, as well as printing in agarose and agarose methacrylate scaffolds. We also present a novel method for 3-D printing collagen scaffolds at unprecedented speeds, up to 14 layers per second, generating complex shapes in seconds with sub-micron resolution. Finally, we demonstrate that 3-D printing of scaffold architecture and protein cues inside the scaffold can be combined, for the first time enabling structures with complex sub-micron architectures and chemical cues for directing development. We believe that the ultra-rapid printing technology presented in this thesis will be a key enabler in the development of complex, artificially engineered tissues and organs. (Copies available exclusively from MIT Libraries, libraries.mit.edu/docs - docs@mit.edu)

  6. Recent developments in multi-layer flat knitting technology for waste free production of complex shaped 3D-reinforcing structures for composites

    NASA Astrophysics Data System (ADS)

    Trümper, W.; Lin, H.; Callin, T.; Bollengier, Q.; Cherif, C.; Krzywinski, S.

    2016-07-01

    Constantly increasing prices for raw materials and energy, as well as the current discourse on the reduction of CO2 emissions, place special emphasis on the advantages of lightweight constructions and their resource-conserving production methods. Fibre-reinforced composites already see a number of applications in automotive, energy and mechanical engineering. Future applications within these areas require greater material and energy efficiency, and therefore manufacturing methods for textile preforms and lightweight constructions that enable an optimal arrangement of the reinforcing fibres while at the same time limiting waste to a minimum. One manufacturing method for textile-reinforced preforms that fulfils many of these requirements is the multilayer weft knitting technology. Multilayer weft-knitted fabrics contain straight reinforcing yarns in at least two directions, and the arrangement of these yarns is fixed by the loop yarn. The yarn material used in each knitting row is adaptable, e.g. according to the load requirements or for the local integration of sensors. The draping properties of these fabrics can be varied within a great range, enabling the draping of very complex shaped 3D preforms without wrinkles from just one uncut fabric. The latest developments at ITM concentrate on a full production chain covering the 3D-CAD geometry, the load analysis and the generation of machine control programs, as well as the development of technology and machines to enable the manufacturing of innovative net-shape 3D multilayer weft-knitted fabrics such as complex shaped spacer fabrics and tubular fabrics with biaxial reinforcement.

  7. A high-level 3D visualization API for Java and ImageJ.

    PubMed

    Schmid, Benjamin; Schindelin, Johannes; Cardona, Albert; Longair, Mark; Heisenberg, Martin

    2010-05-21

    Current imaging methods such as magnetic resonance imaging (MRI), confocal microscopy, electron microscopy (EM) or selective plane illumination microscopy (SPIM) yield three-dimensional (3D) data sets in need of appropriate computational methods for their analysis. Reconstruction, segmentation and registration are best approached from the 3D representation of the data set. Here we present a platform-independent framework based on Java and Java 3D for accelerated rendering of biological images. Our framework is seamlessly integrated into ImageJ, a free image processing package with a vast collection of community-developed biological image analysis tools. It enriches the ImageJ software libraries with methods that greatly reduce the complexity of developing image analysis tools in an interactive 3D visualization environment. In particular, we provide high-level access to volume rendering, volume editing, surface extraction, and image annotation. The ability to rely on a library that hides the low-level details lets software development efforts concentrate on the algorithm implementation. Our framework enables biomedical image software to be built with 3D visualization capabilities with very little effort. We offer the source code and convenient binary packages along with extensive documentation at http://3dviewer.neurofly.de.

  8. Bio++: a set of C++ libraries for sequence analysis, phylogenetics, molecular evolution and population genetics.

    PubMed

    Dutheil, Julien; Gaillard, Sylvain; Bazin, Eric; Glémin, Sylvain; Ranwez, Vincent; Galtier, Nicolas; Belkhir, Khalid

    2006-04-04

    A large number of bioinformatics applications in the fields of bio-sequence analysis, molecular evolution and population genetics typically share input/output methods, data storage requirements and data analysis algorithms. Such common features may be conveniently bundled into re-usable libraries, which enable the rapid development of new methods and robust applications. We present Bio++, a set of Object Oriented libraries written in C++. Available components include classes for data storage and handling (nucleotide/amino-acid/codon sequences, trees, distance matrices, population genetics datasets), various input/output formats, basic sequence manipulation (concatenation, transcription, translation, etc.), phylogenetic analysis (maximum parsimony, Markov models, distance methods, likelihood computation and maximization), population genetics/genomics (diversity statistics, neutrality tests, various multi-locus analyses) and various algorithms for numerical calculus. The implementation of methods aims to be both efficient and user-friendly. Special attention was given to the library design to enable easy extension and development of new methods. We defined a general hierarchy of classes that allows developers to implement their own algorithms while remaining compatible with the rest of the libraries. Bio++ source code is distributed free of charge under the CeCILL general public licence from its website http://kimura.univ-montp2.fr/BioPP.

  9. Enabling Microfluidics: From Clean Rooms to Makerspaces

    DTIC Science & Technology

    2016-09-30

    anyone can make and rapidly scale to bulk manufacturing. To enable others to take part in this type of product design and development, we...cost molds for a fee; however, the design process is slowed down waiting for molds to be manufactured and shipped. While PDMS devices may be...finished prototype into a commercial product. An example of a rapid prototyping method amenable to scaled-up manufacturing is laser cutting.

  10. Large-field-of-view imaging by multi-pupil adaptive optics.

    PubMed

    Park, Jung-Hoon; Kong, Lingjie; Zhou, Yifeng; Cui, Meng

    2017-06-01

    Adaptive optics can correct for optical aberrations. We developed multi-pupil adaptive optics (MPAO), which enables simultaneous wavefront correction over a field of view of 450 × 450 μm² and expands the correction area to nine times that of conventional methods. MPAO's ability to perform spatially independent wavefront control further enables 3D nonplanar imaging. We applied MPAO to in vivo structural and functional imaging in the mouse brain.

  11. Developing theory-informed behaviour change interventions to implement evidence into practice: a systematic approach using the Theoretical Domains Framework

    PubMed Central

    2012-01-01

    Background There is little systematic operational guidance about how best to develop complex interventions to reduce the gap between practice and evidence. This article is one in a Series of articles documenting the development and use of the Theoretical Domains Framework (TDF) to advance the science of implementation research. Methods The intervention was developed considering three main components: theory, evidence, and practical issues. We used a four-step approach, consisting of guiding questions, to direct the choice of the most appropriate components of an implementation intervention: Who needs to do what, differently? Using a theoretical framework, which barriers and enablers need to be addressed? Which intervention components (behaviour change techniques and mode(s) of delivery) could overcome the modifiable barriers and enhance the enablers? And how can behaviour change be measured and understood? Results A complex implementation intervention was designed that aimed to improve acute low back pain management in primary care. We used the TDF to identify the barriers and enablers to the uptake of evidence into practice and to guide the choice of intervention components. These components were then combined into a cohesive intervention. The intervention was delivered via two facilitated interactive small group workshops. We also produced a DVD to distribute to all participants in the intervention group. We chose outcome measures in order to assess the mediating mechanisms of behaviour change. Conclusions We have illustrated a four-step systematic method for developing an intervention designed to change clinical practice based on a theoretical framework. The method of development provides a systematic framework that could be used by others developing complex implementation interventions. 
While this framework should be iteratively adjusted and refined to suit other contexts and settings, we believe that the four-step process should be maintained as the primary framework to guide researchers through a comprehensive intervention development process. PMID:22531013

  12. Breakthrough: Better Fiber for Better Products

    ScienceCinema

    Griffith, George; Garnier, John

    2018-01-08

    Researchers at Idaho National Laboratory have developed a cost-effective method for the continuous production of alpha silicon carbide fiber. The exceptionally strong, lightweight fiber could enable significant performance improvements in many everyday products.

  13. Self-Assembling Process for Fabricating Tailored Thin Films

    ScienceCinema

    Sandia

    2017-12-09

    A simple, economical nanotechnology coating process that enables the development of nanoparticle thin films with architectures and properties unattainable by any other processing method. 2007 R&D 100 winner (SAND2007-1878P)

  14. Designing Pedagogical Innovation for Collaborating Teacher Teams

    ERIC Educational Resources Information Center

    Weitze, Charlotte Laerke

    2017-01-01

    In this design-based research project, teachers co-created and used a new learning design model, the "IT-Pedagogical Think Tank Model for Teacher Teams." This continuous-competence-development method enabled teachers to collaborate and develop innovative-learning designs for students in a new hybrid synchronous video-mediated learning…

  15. 78 FR 48422 - Agency Information Collection Activities: Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-08

    ... quantitative data through surveys with working-age (age 18-61) and older American (age 62 and older) consumers in order to develop and refine survey instruments that will enable the CFPB to reliably and... conducting research to identify methods and strategies to educate and counsel seniors, and developing goals...

  16. Academic Civic Mindedness and Model Citizenship in the International Baccalaureate Diploma Programme

    ERIC Educational Resources Information Center

    Saavedra, Anna Rosefsky

    2016-01-01

    This study uses interview and survey methods to describe the International Baccalaureate (IB) Diploma Programme's (DP) development of students' "academic civic mindedness" and "model citizenship" at four public schools in California. Results indicate that the DP pedagogy enables students to develop many of the skills that are…

  17. An adaptive-binning method for generating constant-uncertainty/constant-significance light curves with Fermi -LAT data

    DOE PAGES

    Lott, B.; Escande, L.; Larsson, S.; ...

    2012-07-19

    Here, we present a method enabling the creation of constant-uncertainty/constant-significance light curves with the data of the Fermi Large Area Telescope (LAT). The adaptive-binning method enables more information to be encapsulated within the light curve than with the fixed-binning method. Although primarily developed for blazar studies, it can be applied to any sources. Furthermore, this method allows the starting and ending times of each interval to be calculated in a simple and quick way during a first step. The mean flux and spectral index (assuming the spectrum is a power-law distribution) reported for each interval are calculated via the standard LAT analysis during a second step. The absence of major caveats associated with this method has been established with Monte-Carlo simulations. We present the performance of this method in determining duty cycles as well as power-density spectra relative to the traditional fixed-binning method.
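    The core of the adaptive-binning idea — accumulating data until a target relative uncertainty is reached, so bins shrink when the source brightens — can be sketched with a toy Poisson-counting model. Names and parameters below are hypothetical, not the authors' code; the real method uses the likelihood-estimated flux uncertainty rather than raw counts:

```python
import numpy as np

def adaptive_bins(times, target_rel_unc=0.25):
    """Toy constant-uncertainty binning: close a bin once its Poisson
    relative uncertainty 1/sqrt(N) reaches the target, i.e. once it
    holds N >= 1/target_rel_unc**2 photons."""
    needed = (1.0 / target_rel_unc) ** 2     # photons per bin (here 16)
    edges, count = [times[0]], 0
    for t in times:
        count += 1
        if count >= needed:
            edges.append(t)
            count = 0
    if edges[-1] != times[-1]:
        edges.append(times[-1])
    return np.asarray(edges)

# Simulated photon arrival times: a faint interval followed by a bright one.
rng = np.random.default_rng(0)
faint = np.cumsum(rng.exponential(10.0, 200))               # low count rate
bright = faint[-1] + np.cumsum(rng.exponential(1.0, 200))   # high count rate
times = np.concatenate([faint, bright])

edges = adaptive_bins(times)
widths = np.diff(edges)
# Every bin holds ~16 photons, so bins are ~10x narrower in the bright state.
```

    The first pass over the time-ordered data yields the bin edges cheaply; the per-interval flux and spectral index are then refit in a second pass, as described above.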

  18. Performance Evaluation of an Improved GC-MS Method to Quantify Methylmercury in Fish.

    PubMed

    Watanabe, Takahiro; Kikuchi, Hiroyuki; Matsuda, Rieko; Hayashi, Tomoko; Akaki, Koichi; Teshima, Reiko

    2015-01-01

    Here, we set out to improve our previously developed methylmercury analytical method, involving phenyl derivatization and gas chromatography-mass spectrometry (GC-MS). In the improved method, phenylation of methylmercury with sodium tetraphenylborate was carried out in a toluene/water two-phase system, instead of in water alone. The modification enabled derivatization at optimum pH, and the formation of by-products was dramatically reduced. In addition, adsorption of methyl phenyl mercury in the GC system was suppressed by co-injection of PEG200, enabling continuous analysis without loss of sensitivity. The performance of the improved analytical method was independently evaluated by three analysts using certified reference materials and methylmercury-spiked fresh fish samples. The present analytical method was validated as suitable for determination of compliance with the provisional regulation value for methylmercury in fish, set in the Food Sanitation Law.

  19. Thiol-ene immobilisation of carbohydrates onto glass slides as a simple alternative to gold-thiol monolayers, amines or lipid binding.

    PubMed

    Biggs, Caroline I; Edmondson, Steve; Gibson, Matthew I

    2015-01-01

    Carbohydrate arrays are a vital tool in studying infection, probing the mechanisms of bacterial, viral and toxin adhesion and the development of new treatments, by mimicking the structure of the glycocalyx. Current methods rely on the formation of monolayers of carbohydrates that have been chemically modified with a linker to enable interaction with a functionalised surface; this includes amines, biotin, lipids or thiols. Thiol addition to gold to form self-assembled monolayers is perhaps the simplest method for immobilisation, as thiolated glycans are readily accessible from reducing carbohydrates in a single step, but it is limited to gold surfaces. Here we have developed a quick and versatile methodology which enables thiolated carbohydrates to be immobilised as monolayers directly onto acrylate-functional glass slides via a 'thiol-ene'/Michael-type reaction. By combining the ease of thiol chemistry with glass slides, which are compatible with microarray scanners, this offers a cost-effective and practical method to assemble arrays.

  20. Development of three-dimensional hollow elastic model for cerebral aneurysm clipping simulation enabling rapid and low cost prototyping.

    PubMed

    Mashiko, Toshihiro; Otani, Keisuke; Kawano, Ryutaro; Konno, Takehiko; Kaneko, Naoki; Ito, Yumiko; Watanabe, Eiju

    2015-03-01

    We developed a method for fabricating a three-dimensional hollow and elastic aneurysm model useful for surgical simulation and surgical training. In this article, we explain the hollow elastic model prototyping method and report on the effects of applying it to presurgical simulation and surgical training. A three-dimensional printer using acrylonitrile-butadiene-styrene as a modeling material was used to produce a vessel model. The prototype was then coated with liquid silicone. After the silicone had hardened, the acrylonitrile-butadiene-styrene was melted with xylene and removed, leaving an outer layer as a hollow elastic model. Simulations using the hollow elastic model were performed in 12 patients. In all patients, the clipping proceeded as scheduled. The surgeon's postoperative assessment was favorable in all cases. This method enables easy fabrication at low cost. Simulation using the hollow elastic model is thought to be useful for understanding of three-dimensional aneurysm structure. Copyright © 2015 Elsevier Inc. All rights reserved.

  1. Passing waves from atomistic to continuum

    NASA Astrophysics Data System (ADS)

    Chen, Xiang; Diaz, Adrian; Xiong, Liming; McDowell, David L.; Chen, Youping

    2018-02-01

    Progress in the development of coupled atomistic-continuum methods for simulations of critical dynamic material behavior has been hampered by a spurious wave reflection problem at the atomistic-continuum interface. This problem is mainly caused by the difference in material descriptions between the atomistic and continuum models, which results in a mismatch in phonon dispersion relations. In this work, we introduce a new method based on atomistic dynamics of lattice coupled with a concurrent atomistic-continuum method to enable a full phonon representation in the continuum description. This permits the passage of short-wavelength, high-frequency phonon waves from the atomistic to continuum regions. The benchmark examples presented in this work demonstrate that the new scheme enables the passage of all allowable phonons through the atomistic-continuum interface; it also preserves the wave coherency and energy conservation after phonons transport across multiple atomistic-continuum interfaces. This work is the first step towards developing a concurrent atomistic-continuum simulation tool for non-equilibrium phonon-mediated thermal transport in materials with microstructural complexity.
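    The dispersion mismatch that drives the spurious reflections can be illustrated with the textbook 1D monatomic chain versus a linear-elastic continuum. The parameters below are illustrative, not taken from the paper:

```python
import numpy as np

# 1D monatomic chain (atomistic):  omega(k) = 2*sqrt(K/m)*|sin(k*a/2)|
# Linear-elastic continuum:        omega(k) = c*|k|, with c = a*sqrt(K/m)
K, m, a = 1.0, 1.0, 1.0            # spring constant, mass, lattice spacing
c = a * np.sqrt(K / m)             # long-wavelength sound speed

k = np.linspace(1e-3, np.pi / a, 200)          # first Brillouin zone
omega_atomistic = 2.0 * np.sqrt(K / m) * np.abs(np.sin(k * a / 2.0))
omega_continuum = c * k

mismatch = (omega_continuum - omega_atomistic) / omega_atomistic
# Long waves (small k): the two dispersions agree essentially exactly.
# Zone-boundary waves (k -> pi/a): the continuum overestimates omega by
# (pi/2 - 1) ~ 57%, so short-wavelength, high-frequency phonons have no
# matching continuum mode to propagate into -- they reflect at the interface.
```

    This is why enabling a full phonon representation in the continuum description, as the paper does, is the key to passing short-wavelength waves across the interface.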

  2. High-Throughput, Data-Rich Cellular RNA Device Engineering

    PubMed Central

    Townshend, Brent; Kennedy, Andrew B.; Xiang, Joy S.; Smolke, Christina D.

    2015-01-01

    Methods for rapidly assessing sequence-structure-function landscapes and developing conditional gene-regulatory devices are critical to our ability to manipulate and interface with biology. We describe a framework for engineering RNA devices from preexisting aptamers that exhibit ligand-responsive ribozyme tertiary interactions. Our methodology utilizes cell sorting, high-throughput sequencing, and statistical data analyses to enable parallel measurements of the activities of hundreds of thousands of sequences from RNA device libraries in the absence and presence of ligands. Our tertiary interaction RNA devices exhibit improved performance in terms of gene silencing, activation ratio, and ligand sensitivity as compared to optimized RNA devices that rely on secondary structure changes. We apply our method to building biosensors for diverse ligands and determine consensus sequences that enable ligand-responsive tertiary interactions. These methods advance our ability to develop broadly applicable genetic tools and to elucidate understanding of the underlying sequence-structure-function relationships that empower rational design of complex biomolecules. PMID:26258292

  3. Social Activity Method (SAM): A Fractal Language for Mathematics

    ERIC Educational Resources Information Center

    Dowling, Paul

    2013-01-01

    In this paper I shall present and develop my organisational language, "social activity method" (SAM), and illustrate some of its applications. I shall introduce a new scheme for "modes of recontextualisation" that enables the analysis of the ways in which one activity--which might be school mathematics or social research or any…

  4. Active Subspace Methods for Data-Intensive Inverse Problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Qiqi

    2017-04-27

    The project has developed theory and computational tools to exploit active subspaces to reduce the dimension in statistical calibration problems. This dimension reduction enables MCMC methods to calibrate otherwise intractable models. The same theoretical and computational tools can also reduce the measurement dimension for calibration problems that use large stores of data.
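    The dimension-reduction idea can be sketched on a toy "ridge" function: the eigendecomposition of the gradient outer-product matrix C = E[∇f ∇fᵀ] exposes the single input direction the output actually depends on, so MCMC can run in that low-dimensional subspace. The test function and all names here are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)
d = 10
w = np.zeros(d); w[0], w[1] = 3.0, 4.0
w /= np.linalg.norm(w)                    # the (unknown) important direction

def grad_f(x):
    # f(x) = sin(w.x)  ->  grad f(x) = cos(w.x) * w  (a 1D ridge in R^10)
    return np.cos(w @ x) * w

X = rng.standard_normal((500, d))         # samples from the input measure
G = np.array([grad_f(x) for x in X])
C = G.T @ G / len(X)                      # Monte-Carlo estimate of C
eigvals, eigvecs = np.linalg.eigh(C)      # eigenvalues in ascending order

v = eigvecs[:, -1]                        # top eigenvector
alignment = abs(v @ w)                    # ~1: recovers w up to sign
# One dominant eigenvalue, the rest ~0: a 1D active subspace suffices
# for calibration instead of the full 10-dimensional parameter space.
```

    In a real calibration problem the gradients come from the forward model (e.g. via adjoints), and the retained subspace dimension is chosen from the eigenvalue gap.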

  5. An Organic Vertical Field-Effect Transistor with Underside-Doped Graphene Electrodes.

    PubMed

    Kim, Jong Su; Kim, Beom Joon; Choi, Young Jin; Lee, Moo Hyung; Kang, Moon Sung; Cho, Jeong Ho

    2016-06-01

    High-performance vertical field-effect transistors are developed, which are based on graphene electrodes doped using the underside doping method. The underside doping method enables effective tuning of the graphene work function while maintaining the surface properties of the pristine graphene. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  6. Development of a virtual speaking simulator using Image Based Rendering.

    PubMed

    Lee, J M; Kim, H; Oh, M J; Ku, J H; Jang, D P; Kim, I Y; Kim, S I

    2002-01-01

    The fear of public speaking is often cited as the world's most common social phobia. The rapid growth of computer technology has enabled the use of virtual reality (VR) for its treatment. There are two established techniques for building virtual environments for the treatment of this fear: a model-based and a movie-based method. Both have the weakness that they are unrealistic and not individually controllable. To overcome these disadvantages, this paper presents a virtual environment produced with Image Based Rendering (IBR) and a chroma-key simultaneously. IBR enables the creation of realistic virtual environments in which photos taken with a digital camera are stitched panoramically. The use of a chroma-key puts virtual audience members under individual control in the environment. In addition, a real-time capture technique is used in constructing the virtual environments, enabling spoken interaction between the subject and a therapist or another subject.
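    The chroma-key step — replacing key-coloured pixels of a captured frame with the panoramic background so that audience members can be composited individually — can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' implementation:

```python
import numpy as np

def chroma_key(foreground, background, key=(0, 255, 0), tol=60):
    """Composite: wherever `foreground` is within `tol` (Euclidean RGB
    distance) of the key colour, show `background` instead."""
    diff = foreground.astype(int) - np.array(key, dtype=int)
    mask = np.linalg.norm(diff, axis=-1) < tol      # True = key pixel
    out = foreground.copy()
    out[mask] = background[mask]
    return out

# Toy 2x2 frame: top row is green screen, bottom row is the "speaker".
fg = np.array([[[0, 255, 0], [10, 250, 5]],
               [[200, 180, 160], [190, 170, 150]]], dtype=np.uint8)
bg = np.full((2, 2, 3), 30, dtype=np.uint8)          # dark background

frame = chroma_key(fg, bg)
# Top-row pixels are replaced by the background; speaker pixels are kept.
```

    Running this per captured frame against the IBR panorama is what allows each chroma-keyed audience member to be shown, hidden, or swapped under individual control.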

  7. A Solution Framework for Environmental Characterization Problems

    EPA Science Inventory

    This paper describes experiences developing a grid-enabled framework for solving environmental inverse problems. The solution approach taken here couples environmental simulation models with global search methods and requires readily available computational resources of the grid ...

  8. Quantitative Method for Simultaneous Analysis of Acetaminophen and 6 Metabolites.

    PubMed

    Lammers, Laureen A; Achterbergh, Roos; Pistorius, Marcel C M; Romijn, Johannes A; Mathôt, Ron A A

    2017-04-01

    Hepatotoxicity after ingestion of high-dose acetaminophen [N-acetyl-para-aminophenol (APAP)] is caused by the metabolites of the drug. To gain more insight into factors influencing susceptibility to APAP hepatotoxicity, quantification of APAP and metabolites is important. A few methods have been developed to simultaneously quantify APAP and its most important metabolites. However, these methods require a comprehensive sample preparation and long run times. The aim of this study was to develop and validate a simplified, but sensitive method for the simultaneous quantification of acetaminophen, the main metabolites acetaminophen glucuronide and acetaminophen sulfate, and 4 Cytochrome P450-mediated metabolites by using liquid chromatography with mass spectrometric (LC-MS) detection. The method was developed and validated for the human plasma, and it entailed a single method for sample preparation, enabling quick processing of the samples followed by an LC-MS method with a chromatographic run time of 9 minutes. The method was validated for selectivity, linearity, accuracy, imprecision, dilution integrity, recovery, process efficiency, ionization efficiency, and carryover effect. The method showed good selectivity without matrix interferences. For all analytes, the mean process efficiency was >86%, and the mean ionization efficiency was >94%. Furthermore, the accuracy was between 90.3% and 112% for all analytes, and the within- and between-run imprecision were <20% for the lower limit of quantification and <14.3% for the middle level and upper limit of quantification. The method presented here enables the simultaneous quantification of APAP and 6 of its metabolites. It is less time consuming than previously reported methods because it requires only a single and simple method for the sample preparation followed by an LC-MS method with a short run time. Therefore, this analytical method provides a useful method for both clinical and research purposes.
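    The accuracy and imprecision figures quoted above are standard bioanalytical validation metrics. A sketch with hypothetical replicate data (not the study's measurements) shows how they are computed:

```python
import numpy as np

# Hypothetical QC level and replicate measurements, in ng/mL.
nominal = 50.0
replicates = np.array([49.1, 51.7, 48.8, 50.9, 52.3, 47.6])

# Accuracy: mean measured concentration as a percentage of nominal.
accuracy = replicates.mean() / nominal * 100.0

# Imprecision: coefficient of variation (CV%) of the replicates.
cv = replicates.std(ddof=1) / replicates.mean() * 100.0

# In the paper's terms, a mid-level QC passing validation would show
# accuracy within 90.3-112% and imprecision below 14.3%.
```

    Within-run imprecision uses replicates from one analytical run; between-run imprecision pools runs across days or analysts, as in the three-analyst evaluation described above.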

  9. Advances in dual-tone development for pitch frequency doubling

    NASA Astrophysics Data System (ADS)

    Fonseca, Carlos; Somervell, Mark; Scheer, Steven; Kuwahara, Yuhei; Nafus, Kathleen; Gronheid, Roel; Tarutani, Shinji; Enomoto, Yuuichiro

    2010-04-01

    Dual-tone development (DTD) has been previously proposed as a potential cost-effective double patterning technique1, with reports dating back to the late 1990s2. The basic principle of dual-tone imaging involves processing exposed resist latent images in both positive-tone (aqueous base) and negative-tone (organic solvent) developers. Conceptually, DTD has attractive cost benefits since it enables pitch doubling without the need for multiple etch steps of patterned resist layers. While the DTD concept is simple to understand, many challenges must be overcome and understood in order to make it a manufacturing solution. Previous work by the authors demonstrated the feasibility of DTD imaging for 50 nm half-pitch features at 0.80 NA (k1 = 0.21) and discussed the challenges lying ahead for printing sub-40 nm half-pitch features with DTD. While those experimental results suggested that clever processing on the wafer track can be used to enable DTD beyond 50 nm half-pitch, they also suggest that identifying suitable resist materials or chemistries is essential for achieving successful imaging with novel resist processing methods on the wafer track. In this work, we present recent advances in the search for resist materials that work in conjunction with novel resist processing methods on the wafer track to enable DTD. Recent experimental results with new resist chemistries, specifically designed for DTD, are presented. We also present simulation studies that help identify resist properties that could enable DTD imaging, ultimately leading to viable DTD resist materials.
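    The pitch-doubling principle behind DTD can be sketched with a simple threshold model: a sinusoidal aerial image developed with both a positive-tone threshold (bright regions dissolve) and a negative-tone threshold (dark regions dissolve) leaves resist only in the intermediate-intensity band, giving two lines per aerial-image period. All parameters here are illustrative:

```python
import numpy as np

pitch = 100.0                                  # aerial-image pitch, nm
x = np.linspace(0, 2 * pitch, 4000, endpoint=False)
intensity = 0.5 * (1 + np.cos(2 * np.pi * x / pitch))   # normalized image

# Negative-tone developer clears intensity < t_low; positive-tone clears
# intensity > t_high. Resist survives only in between.
t_low, t_high = 0.3, 0.7
resist = (intensity > t_low) & (intensity < t_high)

# Count resist lines via rising edges (no-resist -> resist transitions).
rising = np.flatnonzero(~resist[:-1] & resist[1:])
lines_per_period = len(rising) / 2.0           # simulated span = 2 periods
# Two lines per aerial period: the spatial frequency has been doubled
# without a second exposure or etch step.
```

    In practice the two develop steps interact with the resist dissolution response, which is why the search for chemistries with well-separated positive- and negative-tone thresholds, described above, is central to making DTD work.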

  10. FY2016 Ceramic Fuels Development Annual Highlights

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mcclellan, Kenneth James

    Key challenges for the Advanced Fuels Campaign are the development of fuel technologies to enable major increases in fuel performance (safety, reliability, power and burnup) beyond current technologies, and development of characterization methods and predictive fuel performance models to enable more efficient development and licensing of advanced fuels. Ceramic fuel development activities for fiscal year 2016 fell within the areas of 1) National and International Technical Integration, 2) Advanced Accident Tolerant Ceramic Fuel Development, 3) Advanced Techniques and Reference Materials Development, and 4) Fabrication of Enriched Ceramic Fuels. High uranium density fuels were the focus of the ceramic fuels efforts. Accomplishments for FY16 primarily reflect the prioritization of identification and assessment of new ceramic fuels for light water reactors which have enhanced accident tolerance while also maintaining or improving normal operation performance, and exploration of advanced post irradiation examination techniques which will support more efficient testing and qualification of new fuel systems.

  11. Protein–protein docking by fast generalized Fourier transforms on 5D rotational manifolds

    PubMed Central

    Padhorny, Dzmitry; Kazennov, Andrey; Zerbe, Brandon S.; Porter, Kathryn A.; Xia, Bing; Mottarella, Scott E.; Kholodov, Yaroslav; Ritchie, David W.; Vajda, Sandor; Kozakov, Dima

    2016-01-01

    Energy evaluation using fast Fourier transforms (FFTs) enables sampling billions of putative complex structures and hence revolutionized rigid protein–protein docking. However, in current methods, efficient acceleration is achieved only in either the translational or the rotational subspace. Developing an efficient and accurate docking method that expands FFT-based sampling to five rotational coordinates is an extensively studied but still unsolved problem. The algorithm presented here retains the accuracy of earlier methods but yields at least 10-fold speedup. The improvement is due to two innovations. First, the search space is treated as the product manifold SO(3)×(SO(3)∖S1), where SO(3) is the rotation group representing the space of the rotating ligand, and (SO(3)∖S1) is the space spanned by the two Euler angles that define the orientation of the vector from the center of the fixed receptor toward the center of the ligand. This representation enables the use of efficient FFT methods developed for SO(3). Second, we select the centers of highly populated clusters of docked structures, rather than the lowest energy conformations, as predictions of the complex, and hence there is no need for very high accuracy in energy evaluation. Therefore, it is sufficient to use a limited number of spherical basis functions in the Fourier space, which increases the efficiency of sampling while retaining the accuracy of docking results. A major advantage of the method is that, in contrast to classical approaches, increasing the number of correlation function terms is computationally inexpensive, which enables using complex energy functions for scoring. PMID:27412858
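    The speedup from FFT-based exhaustive sampling rests on the correlation theorem. A 1D translational analogue (not the authors' 5D rotational-manifold code) shows a single FFT reproducing an O(N²) direct scan over every offset:

```python
import numpy as np

rng = np.random.default_rng(2)
N = 64
receptor = rng.standard_normal(N)      # toy 1D "receptor" profile
ligand = np.zeros(N)
ligand[:8] = receptor[20:28]           # a "ligand" that fits at offset 20

# Direct O(N^2) scan: score every circular translational offset.
direct = np.array([np.dot(np.roll(ligand, s), receptor) for s in range(N)])

# Same scores in O(N log N) via the correlation theorem:
#   corr[s] = IFFT( FFT(receptor) * conj(FFT(ligand)) )[s]
fft_scores = np.real(
    np.fft.ifft(np.fft.fft(receptor) * np.conj(np.fft.fft(ligand)))
)
# The two score arrays agree for every offset; the planted fit at s = 20
# scores a sum of squares, so it is guaranteed positive.
```

    The paper's contribution is extending this trick from translations to five rotational coordinates by working on SO(3) with generalized Fourier transforms, where adding correlation-function terms stays cheap.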

  12. GLO-Roots: an imaging platform enabling multidimensional characterization of soil-grown root systems

    PubMed Central

    Rellán-Álvarez, Rubén; Lobet, Guillaume; Lindner, Heike; Pradier, Pierre-Luc; Sebastian, Jose; Yee, Muh-Ching; Geng, Yu; Trontin, Charlotte; LaRue, Therese; Schrager-Lavelle, Amanda; Haney, Cara H; Nieu, Rita; Maloof, Julin; Vogel, John P; Dinneny, José R

    2015-01-01

    Root systems develop different root types that individually sense cues from their local environment and integrate this information with systemic signals. This complex multi-dimensional amalgam of inputs enables continuous adjustment of root growth rates, direction, and metabolic activity that define a dynamic physical network. Current methods for analyzing root biology balance physiological relevance with imaging capability. To bridge this divide, we developed an integrated-imaging system called Growth and Luminescence Observatory for Roots (GLO-Roots) that uses luminescence-based reporters to enable studies of root architecture and gene expression patterns in soil-grown, light-shielded roots. We have developed image analysis algorithms that allow the spatial integration of soil properties, gene expression, and root system architecture traits. We propose GLO-Roots as a system that has great utility in presenting environmental stimuli to roots in ways that evoke natural adaptive responses and in providing tools for studying the multi-dimensional nature of such processes. DOI: http://dx.doi.org/10.7554/eLife.07597.001 PMID:26287479

  13. GLO-Roots: An imaging platform enabling multidimensional characterization of soil-grown root systems

    DOE PAGES

    Rellan-Alvarez, Ruben; Lobet, Guillaume; Lindner, Heike; ...

    2015-08-19

    Root systems develop different root types that individually sense cues from their local environment and integrate this information with systemic signals. This complex multi-dimensional amalgam of inputs enables continuous adjustment of root growth rates, direction, and metabolic activity that define a dynamic physical network. Current methods for analyzing root biology balance physiological relevance with imaging capability. To bridge this divide, we developed an integrated-imaging system called Growth and Luminescence Observatory for Roots (GLO-Roots) that uses luminescence-based reporters to enable studies of root architecture and gene expression patterns in soil-grown, light-shielded roots. We have developed image analysis algorithms that allow the spatial integration of soil properties, gene expression, and root system architecture traits. We propose GLO-Roots as a system that has great utility in presenting environmental stimuli to roots in ways that evoke natural adaptive responses and in providing tools for studying the multi-dimensional nature of such processes.

  14. The New South Wales Allied Health Workplace Learning Study: barriers and enablers to learning in the workplace

    PubMed Central

    2014-01-01

    Background Workplace learning refers to continuing professional development that is stimulated by and occurs through participation in workplace activities. Workplace learning is essential for staff development and high quality clinical care. The purpose of this study was to explore the barriers to and enablers of workplace learning for allied health professionals within NSW Health. Methods A qualitative study was conducted with a purposively selected maximum variation sample (n = 46) including 19 managers, 19 clinicians and eight educators from 10 allied health professions. Seven semi-structured interviews and nine focus groups were audio-recorded and transcribed. The ‘framework approach’ was used to guide the interviews and analysis. Textual data were coded and charted using an evolving thematic framework. Results Key enablers of workplace learning included having access to peers, expertise and ‘learning networks’, protected learning time, supportive management and positive staff attitudes. The absence of these key enablers including heavy workload and insufficient staffing were important barriers to workplace learning. Conclusion Attention to these barriers and enablers may help organisations to more effectively optimise allied health workplace learning. Ultimately better workplace learning may lead to improved patient, staff and organisational outcomes. PMID:24661614

  15. Semantics of data and service registration to advance interdisciplinary information and data access.

    NASA Astrophysics Data System (ADS)

    Fox, P. P.; McGuinness, D. L.; Raskin, R.; Sinha, A. K.

    2008-12-01

    In developing an application of semantic web methods and technologies to address the integration of heterogeneous and interdisciplinary earth-science datasets, we have developed methodologies for creating rich semantic descriptions (ontologies) of the application domains. We have leveraged and extended where possible existing ontology frameworks such as SWEET. As a result of this semantic approach, we have also utilized ontologic descriptions of key enabling elements of the application, such as the registration of datasets with ontologies at several levels of granularity. This has enabled the location and usage of the data across disciplines. We are also realizing the need to develop similar semantic registration of web service data holdings as well as those provided with community and/or standard markup languages (e.g. GeoSciML). This level of semantic enablement extending beyond domain terms and relations significantly enhances our ability to provide a coherent semantic data framework for data and information systems. Much of this work is on the frontier of technology development and we will present the current and near-future capabilities we are developing. This work arises from the Semantically-Enabled Science Data Integration (SESDI) project, a NASA/ESTO/ACCESS-funded project involving the High Altitude Observatory at the National Center for Atmospheric Research (NCAR), McGuinness Associates Consulting, NASA/JPL and Virginia Polytechnic University.

  16. CMC Technology Advancements for Gas Turbine Engine Applications

    NASA Technical Reports Server (NTRS)

    Grady, Joseph E.

    2013-01-01

    CMC research at NASA Glenn is focused on aircraft propulsion applications. The objective is to enable reduced engine emissions and fuel consumption for more environmentally friendly aircraft. Engine system studies show that incorporation of ceramic composites into turbine engines will enable significant reductions in emissions and fuel burn due to increased engine efficiency resulting from reduced cooling requirements for hot section components. This presentation will describe recent progress and challenges in developing fiber and matrix constituents for 2700 F CMC turbine applications. In addition, ongoing research in the development of durable environmental barrier coatings, ceramic joining integration technologies and life prediction methods for CMC engine components will be reviewed.

  17. The Stanford how things work project

    NASA Technical Reports Server (NTRS)

    Fikes, Richard; Gruber, Tom; Iwasaki, Yumi

    1994-01-01

    We provide an overview of the Stanford How Things Work (HTW) project, an ongoing integrated collection of research activities in the Knowledge Systems Laboratory at Stanford University. The project is developing technology for representing knowledge about engineered devices in a form that enables the knowledge to be used in multiple systems for multiple reasoning tasks, together with reasoning methods that enable the represented knowledge to be effectively applied to the core engineering task of simulating and analyzing device behavior. The central new capabilities currently being developed in the project are automated assistance with model formulation and with verification that a design for an electro-mechanical device satisfies its functional specification.

  18. DNA-encoded chemistry: enabling the deeper sampling of chemical space.

    PubMed

    Goodnow, Robert A; Dumelin, Christoph E; Keefe, Anthony D

    2017-02-01

    DNA-encoded chemical library technologies are increasingly being adopted in drug discovery for hit and lead generation. DNA-encoded chemistry enables the exploration of chemical spaces four to five orders of magnitude more deeply than is achievable by traditional high-throughput screening methods. Operation of this technology requires developing a range of capabilities including aqueous synthetic chemistry, building block acquisition, oligonucleotide conjugation, large-scale molecular biological transformations, selection methodologies, PCR, sequencing, sequence data analysis and the analysis of large chemistry spaces. This Review provides an overview of the development and applications of DNA-encoded chemistry, highlighting the challenges and future directions for the use of this technology.
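    The sequence-data-analysis capability listed above reduces, at its core, to counting barcode reads after a selection. A minimal sketch (toy barcodes and reads, hypothetical rather than any specific platform's encoding):

```python
from collections import Counter

# Toy post-selection sequencing reads: each read carries the DNA barcode
# that encodes the building blocks of one library member (hypothetical).
reads = ["AAC-TTG", "AAC-TTG", "GCA-TTG", "AAC-TTG", "GCA-CCT", "AAC-TTG"]

counts = Counter(reads)
total = sum(counts.values())
# Barcodes sampled far above background identify candidate binders.
enrichment = {bc: n / total for bc, n in counts.most_common()}
print(enrichment)
```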

  19. Rapid Development and Distribution of Mobile Media-Rich Clinical Practice Guidelines Nationwide in Colombia.

    PubMed

    Flórez-Arango, José F; Sriram Iyengar, M; Caicedo, Indira T; Escobar, German

    2017-01-01

    Development and electronic distribution of Clinical Practice Guidelines production is costly and challenging. This poster presents a rapid method to represent existing guidelines in auditable, computer executable multimedia format. We used a technology that enables a small number of clinicians to, in a short period of time, develop a substantial amount of computer executable guidelines without programming.

  20. A student-led process to enhance the learning and teaching of teamwork skills in medicine.

    PubMed

    Balasooriya, Chinthaka; Olupeliyawa, Asela; Iqbal, Maha; Lawley, Claire; Cohn, Amanda; Ma, David; Luu, Queenie

    2013-01-01

    The development of teamwork skills is a critical aspect of modern medical education. This paper reports on a project that aimed to identify student perceptions of teamwork-focused learning activities and generate student recommendations for the development of effective educational strategies. The project utilized a unique method, which drew on the skills of student research assistants (RAs) to explore the views of their peers. Using structured interview guides, the RAs interviewed their colleagues to clarify their perceptions of the effectiveness of current methods of teamwork teaching and to explore ideas for more effective methods. The RAs shared their deidentified findings with each other, identified preliminary themes, and developed a number of recommendations which were finalized through consultation with faculty. The key themes that emerged focused on the need to clarify the relevance of teamwork skills to clinical practice, reward individual contributions to group process, facilitate feedback and reflection on teamwork skills, and systematically utilize clinical experiences to support experiential learning of teamwork. Based on these findings, a number of recommendations for stage appropriate teamwork learning and assessment activities were developed. Key among these were recommendations to set up a peer-mentoring system for students, suggestions for more authentic teamwork assessment methods, and strategies to utilize the clinical learning environment in developing teamwork skills. The student-led research process enabled identification of issues that may not have been otherwise revealed by students, facilitated a better understanding of teamwork teaching and developed ownership of the curriculum among students. The project enabled the development of recommendations for designing learning, teaching, and assessment methods that were likely to be more effective from a student perspective.

  1. Methods to enable the design of bioactive small molecules targeting RNA

    PubMed Central

    Disney, Matthew D.; Yildirim, Ilyas; Childs-Disney, Jessica L.

    2014-01-01

    RNA is an immensely important target for small molecule therapeutics or chemical probes of function. However, methods that identify, annotate, and optimize RNA-small molecule interactions that could enable the design of compounds that modulate RNA function are in their infancies. This review describes recent approaches that have been developed to understand and optimize RNA motif-small molecule interactions, including Structure-Activity Relationships Through Sequencing (StARTS), quantitative structure-activity relationships (QSAR), chemical similarity searching, structure-based design and docking, and molecular dynamics (MD) simulations. Case studies described include the design of small molecules targeting RNA expansions, the bacterial A-site, viral RNAs, and telomerase RNA. These approaches can be combined to afford a synergistic method to exploit the myriad of RNA targets in the transcriptome. PMID:24357181
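    Of the approaches reviewed, chemical similarity searching is the simplest to illustrate: candidate compounds are ranked against a query by the Tanimoto coefficient of their fingerprints. A toy sketch (hypothetical bit positions and compound names, not a real fingerprint scheme):

```python
def tanimoto(fp_a, fp_b):
    """Tanimoto coefficient between two molecular fingerprints
    represented as sets of on-bit indices."""
    shared = len(fp_a & fp_b)
    return shared / (len(fp_a) + len(fp_b) - shared)

# Hypothetical fingerprints for a query compound and a small library.
query = {1, 4, 9, 17, 23}
hits = {"cmpd_A": {1, 4, 9, 17, 23},
        "cmpd_B": {1, 4, 9, 30, 31},
        "cmpd_C": {2, 5}}

# Rank library members by similarity to the query.
ranked = sorted(hits, key=lambda k: tanimoto(query, hits[k]), reverse=True)
print(ranked)
```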

  2. Methods to enable the design of bioactive small molecules targeting RNA.

    PubMed

    Disney, Matthew D; Yildirim, Ilyas; Childs-Disney, Jessica L

    2014-02-21

    RNA is an immensely important target for small molecule therapeutics or chemical probes of function. However, methods that identify, annotate, and optimize RNA-small molecule interactions that could enable the design of compounds that modulate RNA function are in their infancies. This review describes recent approaches that have been developed to understand and optimize RNA motif-small molecule interactions, including structure-activity relationships through sequencing (StARTS), quantitative structure-activity relationships (QSAR), chemical similarity searching, structure-based design and docking, and molecular dynamics (MD) simulations. Case studies described include the design of small molecules targeting RNA expansions, the bacterial A-site, viral RNAs, and telomerase RNA. These approaches can be combined to afford a synergistic method to exploit the myriad of RNA targets in the transcriptome.

  3. WEC Design Response Toolbox v. 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coe, Ryan; Michelen, Carlos; Eckert-Gallup, Aubrey

    2016-03-30

    The WEC Design Response Toolbox (WDRT) is a numerical toolbox for design-response analysis of wave energy converters (WECs). The WDRT was developed during a series of efforts to better understand WEC survival design. The WDRT has been designed as a tool for researchers and developers, enabling the straightforward application of statistical and engineering methods. The toolbox includes methods for short-term extreme response, environmental characterization, long-term extreme response and risk analysis, fatigue, and design wave composition.
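    The short-term extreme response analysis mentioned above can be sketched by fitting a distribution to response peaks and reading off a high quantile. This is a generic illustration with synthetic data, not the WDRT API; the Weibull model and all values are assumptions:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Simulated response peaks for a WEC (e.g., mooring-tension maxima, kN).
peaks = rng.weibull(2.0, size=2000) * 100.0

# Fit a Weibull distribution to the peaks (location fixed at zero).
shape, loc, scale = stats.weibull_min.fit(peaks, floc=0.0)

# Short-term extreme: the response level exceeded once per N peaks on
# average, i.e. the (1 - 1/N) quantile of the fitted peak distribution.
N = 1000
extreme = stats.weibull_min.ppf(1.0 - 1.0 / N, shape, loc=loc, scale=scale)
print(f"1-in-{N} peak response: {extreme:.1f} kN")
```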

  4. Scientific Inquiry Based Professional Development Models in Teacher Education

    ERIC Educational Resources Information Center

    Corlu, Mehmet Ali; Corlu, M. Sencer

    2012-01-01

    Scientific inquiry helps students develop critical thinking abilities and enables students to think and construct knowledge like a scientist. The study describes a method course implementation at a major public teachers college in Turkey. The main goal of the course was to improve research and teaching abilities of prospective physics teachers…

  5. Agile Methods and Request for Change (RFC): Observations from DoD Acquisition Programs

    DTIC Science & Technology

    2014-01-01

    …at the Software Development Plan, then it’s worth having a conversation with the contractor that includes answering the above questions. … [Lapham 2010] … those undertaken in more traditional waterfall-based developments. Some of the government PMO enabling…

  6. Innovative Methods for Promoting and Assessing Intercultural Competence in Higher Education

    ERIC Educational Resources Information Center

    Hiller, Gundula Gwenn

    2010-01-01

    This paper presents an intercultural training program that was developed by the Center for Intercultural Learning at the European University Viadrina in cooperation with students. A few of the student-generated activities will be described in detail. The program, aimed at enabling students to acquire intercultural competence, was developed at an…

  7. A Learning Network as a Development Method--An Example of Small Enterprises and a University Working Together.

    ERIC Educational Resources Information Center

    Tell, Joakim; Halila, Fawzi

    2001-01-01

    Small businesses implementing ISO 14001 standards worked with a university to develop a learning network. The network served as a source of inspiration and reflection as well as a sounding board. It enabled small enterprises to act collectively, compensating for individual lack of resources. (SK)

  8. Compact fusion energy based on the spherical tokamak

    NASA Astrophysics Data System (ADS)

    Sykes, A.; Costley, A. E.; Windsor, C. G.; Asunta, O.; Brittles, G.; Buxton, P.; Chuyanov, V.; Connor, J. W.; Gryaznevich, M. P.; Huang, B.; Hugill, J.; Kukushkin, A.; Kingham, D.; Langtry, A. V.; McNamara, S.; Morgan, J. G.; Noonan, P.; Ross, J. S. H.; Shevchenko, V.; Slade, R.; Smith, G.

    2018-01-01

    Tokamak Energy Ltd, UK, is developing spherical tokamaks using high temperature superconductor magnets as a possible route to fusion power using relatively small devices. We present an overview of the development programme including details of the enabling technologies, the key modelling methods and results, and the remaining challenges on the path to compact fusion.

  9. Overview of DOE-NE Proliferation and Terrorism Risk Assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sadasivan, Pratap

    2012-08-24

    Research objectives are: (1) Develop technologies and other solutions that can improve the reliability, sustain the safety, and extend the life of current reactors; (2) Develop improvements in the affordability of new reactors to enable nuclear energy; (3) Develop Sustainable Nuclear Fuel Cycles; and (4) Understand and minimize the risks of nuclear proliferation and terrorism. The goal is to enable the use of risk information to inform NE R&D program planning. The PTRA program supports DOE-NE's goal of using risk information to inform R&D program planning. The FY12 PTRA program is focused on terrorism risk. The program includes a mix of innovative methods that support the general practice of risk assessments, and selected applications.

  10. Broadband quantitative NQR for authentication of vitamins and dietary supplements

    NASA Astrophysics Data System (ADS)

    Chen, Cheng; Zhang, Fengchao; Bhunia, Swarup; Mandal, Soumyajit

    2017-05-01

    We describe hardware, pulse sequences, and algorithms for nuclear quadrupole resonance (NQR) spectroscopy of medicines and dietary supplements. Medicine and food safety is a pressing problem that has drawn increasing attention. NQR is an ideal technique for authenticating these substances because it is a non-invasive method for chemical identification. We have recently developed a broadband NQR front-end that can excite and detect 14N NQR signals over a wide frequency range; its operating frequency can be rapidly set by software, while sensitivity is comparable to conventional narrowband front-ends over the entire range. This front-end improves the accuracy of authentication by enabling multiple-frequency experiments. We have also developed calibration and signal processing techniques to convert measured NQR signal amplitudes into nuclear spin densities, thus enabling its use as a quantitative technique. Experimental results from several samples are used to illustrate the proposed methods.
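    The amplitude-to-spin-density conversion described above is, in its simplest form, a linear calibration against reference samples. A minimal sketch with hypothetical calibration values (not the authors' actual procedure or data):

```python
import numpy as np

# Hypothetical calibration set: reference samples with known nitrogen-14
# spin densities (arbitrary units) and their measured NQR amplitudes.
known_density = np.array([1.0, 2.0, 4.0, 8.0])
measured_amp = np.array([0.52, 1.01, 2.05, 3.98])

# Least-squares slope through the origin: amplitude = k * density.
k = (measured_amp @ known_density) / (known_density @ known_density)

def density_from_amplitude(amp):
    """Convert a measured signal amplitude to an estimated spin density."""
    return amp / k

print(f"slope k = {k:.3f}, density for amp 1.5 = {density_from_amplitude(1.5):.2f}")
```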

  11. Causal Structure of Brain Physiology after Brain Injury from Subarachnoid Hemorrhage.

    PubMed

    Claassen, Jan; Rahman, Shah Atiqur; Huang, Yuxiao; Frey, Hans-Peter; Schmidt, J Michael; Albers, David; Falo, Cristina Maria; Park, Soojin; Agarwal, Sachin; Connolly, E Sander; Kleinberg, Samantha

    2016-01-01

    High frequency physiologic data are routinely generated for intensive care patients. While massive amounts of data make it difficult for clinicians to extract meaningful signals, these data could provide insight into the state of critically ill patients and guide interventions. We develop uniquely customized computational methods to uncover the causal structure within systemic and brain physiologic measures recorded in a neurological intensive care unit after subarachnoid hemorrhage. While the data have many missing values, poor signal-to-noise ratio, and are composed from a heterogeneous patient population, our advanced imputation and causal inference techniques enable physiologic models to be learned for individuals. Our analyses confirm that complex physiologic relationships including demand and supply of oxygen underlie brain oxygen measurements and that mechanisms for brain swelling early after injury may differ from those that develop in a delayed fashion. These inference methods will enable wider use of ICU data to understand patient physiology.
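    The pipeline above (impute missing samples, then infer directed relationships) can be caricatured with mean imputation and a lagged-correlation check, a toy stand-in for the paper's advanced imputation and causal inference techniques; all data here are synthetic:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500
# Synthetic physiologic series: y depends on x lagged by one time step.
x = rng.normal(size=n)
y = np.empty(n)
y[0] = 0.0
for t in range(1, n):
    y[t] = 0.8 * x[t - 1] + 0.1 * rng.normal()

# Knock out ~10% of x to mimic missing ICU samples, then mean-impute.
mask = rng.random(n) < 0.1
x_obs = x.copy()
x_obs[mask] = np.nan
x_imp = np.where(np.isnan(x_obs), np.nanmean(x_obs), x_obs)

# Strong correlation at lag 1 but not lag 0 suggests x "drives" y.
lag0 = np.corrcoef(x_imp, y)[0, 1]
lag1 = np.corrcoef(x_imp[:-1], y[1:])[0, 1]
print(f"corr lag 0: {lag0:.2f}, lag 1: {lag1:.2f}")
```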

  12. A spectrally tunable LED sphere source enables accurate calibration of tristimulus colorimeters

    NASA Astrophysics Data System (ADS)

    Fryc, I.; Brown, S. W.; Ohno, Y.

    2006-02-01

    The Four-Color Matrix method (FCM) was developed to improve the accuracy of chromaticity measurements of various display colors. The method is valid for each type of display having similar spectra. To develop the Four-Color correction matrix, spectral measurements of primary red, green, blue, and white colors of a display are needed. Consequently, a calibration facility should be equipped with a number of different displays. This is very inconvenient and expensive. A spectrally tunable light source (STS) that can mimic different display spectral distributions would eliminate the need for maintaining a wide variety of displays and would enable a colorimeter to be calibrated for a number of different displays using the same setup. Simulations show that an STS that can create red, green, blue and white distributions that are close to the real spectral power distribution (SPD) of a display works well with the FCM for the calibration of colorimeters.
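    The correction matrix at the heart of the FCM can be sketched as a least-squares fit between a reference instrument's tristimulus readings of the four primaries and the colorimeter's own readings. The numbers and the simulated distortion below are hypothetical, and this is a simplified illustration rather than the published FCM procedure:

```python
import numpy as np

# Hypothetical tristimulus (X, Y, Z) readings of a display's red, green,
# blue, and white colors: rows = colors, columns = X, Y, Z.
reference = np.array([   # values from a reference spectroradiometer
    [41.2, 21.3, 1.9],    # red
    [35.8, 71.5, 11.9],   # green
    [18.1, 7.2, 95.0],    # blue
    [95.1, 100.0, 108.8], # white
])
# The colorimeter under test reads slightly different values, simulated
# here by applying a known 3x3 distortion to the reference readings.
distortion = np.array([[1.05, 0.02, 0.00],
                       [0.03, 0.98, 0.01],
                       [0.00, 0.01, 1.02]])
device = reference @ distortion.T

# Four-Color correction matrix: least-squares M with reference ≈ device @ M.
M, *_ = np.linalg.lstsq(device, reference, rcond=None)
corrected = device @ M
print(np.abs(corrected - reference).max())
```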

  13. Molecular dynamics simulations and docking enable to explore the biophysical factors controlling the yields of engineered nanobodies.

    PubMed

    Soler, Miguel A; de Marco, Ario; Fortuna, Sara

    2016-10-10

    Nanobodies (VHHs) have proved to be valuable substitutes for conventional antibodies in molecular recognition. Their small size represents a precious advantage for rational mutagenesis based on modelling. Here we address the problem of predicting how Camelidae nanobody sequences can tolerate mutations by developing a simulation protocol based on all-atom molecular dynamics and whole-molecule docking. The method was tested on two sets of nanobodies characterized experimentally for their biophysical features. One set contained point mutations introduced to humanize a wild type sequence; in the second, the CDRs were swapped between single-domain frameworks with Camelidae and human hallmarks. The method resulted in accurate scoring approaches to predict experimental yields and enabled identification of the structural modifications induced by mutations. This work is a promising tool for the in silico development of single-domain antibodies and opens the opportunity to customize single functional domains of larger macromolecules.
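    Placeholder anchor not used.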

  14. Molecular dynamics simulations and docking enable to explore the biophysical factors controlling the yields of engineered nanobodies

    NASA Astrophysics Data System (ADS)

    Soler, Miguel A.; De Marco, Ario; Fortuna, Sara

    2016-10-01

    Nanobodies (VHHs) have proved to be valuable substitutes for conventional antibodies in molecular recognition. Their small size represents a precious advantage for rational mutagenesis based on modelling. Here we address the problem of predicting how Camelidae nanobody sequences can tolerate mutations by developing a simulation protocol based on all-atom molecular dynamics and whole-molecule docking. The method was tested on two sets of nanobodies characterized experimentally for their biophysical features. One set contained point mutations introduced to humanize a wild type sequence; in the second, the CDRs were swapped between single-domain frameworks with Camelidae and human hallmarks. The method resulted in accurate scoring approaches to predict experimental yields and enabled identification of the structural modifications induced by mutations. This work is a promising tool for the in silico development of single-domain antibodies and opens the opportunity to customize single functional domains of larger macromolecules.

  15. The Virtual Learning Commons (VLC): Enabling Co-Innovation Across Disciplines

    NASA Astrophysics Data System (ADS)

    Pennington, D. D.; Gandara, A.; Del Rio, N.

    2014-12-01

    A key challenge for scientists addressing grand-challenge problems is identifying, understanding, and integrating potentially relevant methods, models and tools that are rapidly evolving in the informatics community. Such tools are essential for effectively integrating data and models in complex research projects, yet it is often difficult to know what tools are available and it is not easy to understand or evaluate how they might be used in a given research context. The goal of the National Science Foundation-funded Virtual Learning Commons (VLC) is to improve awareness and understanding of emerging methodologies and technologies, facilitate individual and group evaluation of these, and trace the impact of innovations within and across teams, disciplines, and communities. The VLC is a Web-based social bookmarking site designed specifically to support knowledge exchange in research communities. It is founded on well-developed models of technology adoption, diffusion of innovation, and experiential learning. The VLC makes use of Web 2.0 (Social Web) and Web 3.0 (Semantic Web) approaches. Semantic Web approaches enable discovery of potentially relevant methods, models, and tools, while Social Web approaches enable collaborative learning about their function. The VLC is under development and the first release is expected Fall 2014.

  16. Zephyr: Open-source Parallel Seismic Waveform Inversion in an Integrated Python-based Framework

    NASA Astrophysics Data System (ADS)

    Smithyman, B. R.; Pratt, R. G.; Hadden, S. M.

    2015-12-01

    Seismic Full-Waveform Inversion (FWI) is an advanced method to reconstruct wave properties of materials in the Earth from a series of seismic measurements. These methods have been developed by researchers since the late 1980s, and now see significant interest from the seismic exploration industry. As researchers move towards implementing advanced numerical modelling (e.g., 3D, multi-component, anisotropic and visco-elastic physics), it is desirable to make use of a modular approach, minimizing the effort of developing a new set of tools for each new numerical problem. SimPEG (http://simpeg.xyz) is an open source project aimed at constructing a general framework to enable geophysical inversion in various domains. In this abstract we describe Zephyr (https://github.com/bsmithyman/zephyr), a coupled research project focused on parallel FWI in the seismic context. The software is built on top of Python, Numpy and IPython, which enables very flexible testing and implementation of new features. Zephyr is an open source project, and is released freely to enable reproducible research. We currently implement a parallel, distributed seismic forward modelling approach that solves the 2.5D (two-and-one-half dimensional) viscoacoustic Helmholtz equation at a range of modelling frequencies, generating forward solutions for a given source behaviour and gradient solutions for a given set of observed data. Solutions are computed in a distributed manner on a set of heterogeneous workers. The researcher's frontend computer may be separated from the worker cluster by a network link to enable full support for computation on remote clusters from individual workstations or laptops. The present codebase introduces a numerical discretization equivalent to that used by FULLWV, a well-known seismic FWI research codebase, which makes it straightforward to compare results from Zephyr directly with FULLWV. The flexibility introduced by the use of a Python programming environment makes extension of the codebase with new methods much more straightforward, enabling comparison and integration of new efforts with existing results.
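    The frequency-domain forward model at the heart of such codes can be illustrated with a minimal 1D Helmholtz solve: an assembled sparse operator, a point source, and a direct solve. This is an illustrative finite-difference sketch with Dirichlet boundaries and a homogeneous velocity model, not Zephyr's 2.5D viscoacoustic solver:

```python
import numpy as np
import scipy.sparse as sp
import scipy.sparse.linalg as spla

# 1D acoustic Helmholtz equation  u'' + (w/c)^2 u = s,
# discretized with second-order finite differences.
n, dx = 200, 10.0            # grid points and spacing in metres
c = np.full(n, 1500.0)       # homogeneous velocity model, m/s
freq = 5.0                   # modelling frequency, Hz
k2 = (2.0 * np.pi * freq / c) ** 2

main = -2.0 / dx**2 + k2     # diagonal: Laplacian stencil plus k^2 term
off = np.full(n - 1, 1.0 / dx**2)
A = sp.diags([off, main, off], [-1, 0, 1], format="csc").astype(complex)

s = np.zeros(n, dtype=complex)
s[n // 2] = 1.0              # point source at the grid centre
u = spla.spsolve(A, s)       # monochromatic wavefield
print(abs(u).max())
```

    In practice one such system is solved per frequency and per source, which is what makes the problem embarrassingly parallel across workers.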

  17. Synthetic carbohydrate research based on organic electrochemistry.

    PubMed

    Nokami, Toshiki; Saito, Kodai; Yoshida, Jun-Ichi

    2012-12-01

    Development of a novel method for generating glycosyl cations or their equivalents is highly desired, because such intermediates are crucial for developing stereoselective glycosylations in oligosaccharide syntheses. In this review we focus on electrochemical methods that we have recently developed. The anodic oxidation of thioglycosides is effective for generating glycosyl triflate pools, which react with glycosyl acceptors. The reaction of glycosyl triflate pools with diorganosulfides gives glycosyl sulfonium ions, which also serve as effective glycosylation intermediates. The indirect electrochemical method is effective for the generation of glycosyl cations or their equivalents and the use of a flow-microreactor system enables glycosylation using such intermediates. Copyright © 2012 Elsevier Ltd. All rights reserved.

  18. Developing a Theory of Digitally-Enabled Trial-Based Problem Solving through Simulation Methods: The Case of Direct-Response Marketing

    ERIC Educational Resources Information Center

    Clark, Joseph Warren

    2012-01-01

    In turbulent business environments, change is rapid, continuous, and unpredictable. Turbulence undermines those adaptive problem solving methods that generate solutions by extrapolating from what worked (or did not work) in the past. To cope with this challenge, organizations utilize trial-based problem solving (TBPS) approaches in which they…

  19. Improved Cell Culture Method for Growing Contracting Skeletal Muscle Models

    NASA Technical Reports Server (NTRS)

    Marquette, Michele L.; Sognier, Marguerite A.

    2013-01-01

    An improved method for culturing immature muscle cells (myoblasts) into a mature skeletal muscle overcomes some of the notable limitations of prior culture methods. The development of the method is a major advance in tissue engineering in that, for the first time, a cell-based model spontaneously fuses and differentiates into masses of highly aligned, contracting myotubes. This method enables (1) the construction of improved two-dimensional (monolayer) skeletal muscle test beds; (2) development of contracting three-dimensional tissue models; and (3) improved transplantable tissues for biomedical and regenerative medicine applications. With adaptation, this method also offers potential application for production of other tissue types (i.e., bone and cardiac) from corresponding precursor cells.

  20. Summary of transformation equations and equations of motion used in free flight and wind tunnel data reduction and analysis

    NASA Technical Reports Server (NTRS)

    Gainer, T. G.; Hoffman, S.

    1972-01-01

    Basic formulations for developing coordinate transformations and motion equations used with free-flight and wind-tunnel data reduction are presented. The general forms presented include axes transformations that enable transfer back and forth between any of the five axes systems that are encountered in aerodynamic analysis. Equations of motion are presented that enable calculation of motions anywhere in the vicinity of the earth. A bibliography of publications on methods of analyzing flight data is included.
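    The axes transformations such reports catalog are built from direction-cosine matrices. A sketch for the standard yaw-pitch-roll Euler sequence (one common convention, not necessarily the report's exact axis definitions):

```python
import numpy as np

def body_to_earth(phi, theta, psi):
    """Direction-cosine matrix from body axes to Earth axes for the
    standard yaw-pitch-roll (psi, theta, phi) Euler sequence."""
    cph, sph = np.cos(phi), np.sin(phi)
    cth, sth = np.cos(theta), np.sin(theta)
    cps, sps = np.cos(psi), np.sin(psi)
    return np.array([
        [cth * cps, sph * sth * cps - cph * sps, cph * sth * cps + sph * sps],
        [cth * sps, sph * sth * sps + cph * cps, cph * sth * sps - sph * cps],
        [-sth,      sph * cth,                   cph * cth],
    ])

# A 90-degree yaw turns the body x-axis onto the Earth y-axis.
R = body_to_earth(0.0, 0.0, np.pi / 2)
print(np.round(R @ np.array([1.0, 0.0, 0.0]), 6))
```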

  1. A Voice Enabled Procedure Browser for the International Space Station

    NASA Technical Reports Server (NTRS)

    Rayner, Manny; Chatzichrisafis, Nikos; Hockey, Beth Ann; Farrell, Kim; Renders, Jean-Michel

    2005-01-01

    Clarissa, an experimental voice enabled procedure browser that has recently been deployed on the International Space Station (ISS), is to the best of our knowledge the first spoken dialog system in space. This paper gives background on the system and the ISS procedures, then discusses the research developed to address three key problems: grammar-based speech recognition using the Regulus toolkit; SVM based methods for open microphone speech recognition; and robust side-effect free dialogue management for handling undos, corrections and confirmations.

  2. Optical methods for enabling focus cues in head-mounted displays for virtual and augmented reality

    NASA Astrophysics Data System (ADS)

    Hua, Hong

    2017-05-01

    Developing head-mounted displays (HMDs) that offer uncompromised optical pathways to both the digital and physical worlds, without encumbrance or discomfort, confronts many grand challenges from both technological and human-factors perspectives. Among these, minimizing visual discomfort is one of the key obstacles. A key contributing factor to visual discomfort is the inability to render proper focus cues in HMDs to stimulate natural eye accommodation responses, which leads to the well-known accommodation-convergence discrepancy problem. In this paper, I will summarize the various optical approaches to enabling focus cues in HMDs for both virtual reality (VR) and augmented reality (AR).

  3. New Developments in Cathodoluminescence Spectroscopy for the Study of Luminescent Materials

    PubMed Central

    den Engelsen, Daniel; Fern, George R.; Harris, Paul G.; Ireland, Terry G.; Silver, Jack

    2017-01-01

    Herein, we describe three advanced techniques for cathodoluminescence (CL) spectroscopy that have recently been developed in our laboratories. The first is a new method to accurately determine the CL-efficiency of thin layers of phosphor powders. When a wide band phosphor with a band gap (Eg > 5 eV) is bombarded with electrons, charging of the phosphor particles will occur, which eventually leads to erroneous results in the determination of the luminous efficacy. To overcome this problem of charging, a comparison method has been developed, which enables accurate measurement of the current density of the electron beam. The study of CL from phosphor specimens in a scanning electron microscope (SEM) is the second subject to be treated. A detailed description of a measuring method to determine the overall decay time of single phosphor crystals in a SEM without beam blanking is presented. The third technique is based on the unique combination of microscopy and spectrometry in the transmission electron microscope (TEM) of Brunel University London (UK). This combination enables the recording of CL-spectra of nanometre-sized specimens and determining spatial variations in CL emission across individual particles by superimposing the scanning TEM and CL-images. PMID:28772671

  4. Ultrahigh-Speed Optical Coherence Tomography for Three-Dimensional and En Face Imaging of the Retina and Optic Nerve Head

    PubMed Central

    Srinivasan, Vivek J.; Adler, Desmond C.; Chen, Yueli; Gorczynska, Iwona; Huber, Robert; Duker, Jay S.; Schuman, Joel S.; Fujimoto, James G.

    2009-01-01

    Purpose To demonstrate ultrahigh-speed optical coherence tomography (OCT) imaging of the retina and optic nerve head at 249,000 axial scans per second and a wavelength of 1060 nm. To investigate methods for visualization of the retina, choroid, and optic nerve using high-density sampling enabled by improved imaging speed. Methods A swept-source OCT retinal imaging system operating at a speed of 249,000 axial scans per second was developed. Imaging of the retina, choroid, and optic nerve were performed. Display methods such as speckle reduction, slicing along arbitrary planes, en face visualization of reflectance from specific retinal layers, and image compounding were investigated. Results High-definition and three-dimensional (3D) imaging of the normal retina and optic nerve head were performed. Increased light penetration at 1060 nm enabled improved visualization of the choroid, lamina cribrosa, and sclera. OCT fundus images and 3D visualizations were generated with higher pixel density and less motion artifacts than standard spectral/Fourier domain OCT. En face images enabled visualization of the porous structure of the lamina cribrosa, nerve fiber layer, choroid, photoreceptors, RPE, and capillaries of the inner retina. Conclusions Ultrahigh-speed OCT imaging of the retina and optic nerve head at 249,000 axial scans per second is possible. The improvement of ∼5 to 10× in imaging speed over commercial spectral/Fourier domain OCT technology enables higher density raster scan protocols and improved performance of en face visualization methods. The combination of the longer wavelength and ultrahigh imaging speed enables excellent visualization of the choroid, sclera, and lamina cribrosa. PMID:18658089
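    The en face visualization described above amounts to projecting a depth slab of the 3D OCT volume onto a 2D image. A minimal sketch on synthetic data (the volume and the layer boundaries here are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(2)
# Synthetic OCT volume: (depth, fast axis, slow axis) reflectance values.
volume = rng.random((512, 256, 256))

def en_face(vol, z_top, z_bottom):
    """Mean-project a depth slab to an en face image, the same idea
    used to visualize individual retinal layers."""
    return vol[z_top:z_bottom].mean(axis=0)

fundus = en_face(volume, 0, 512)   # full-depth OCT fundus image
layer = en_face(volume, 40, 60)    # hypothetical single-layer slab
print(fundus.shape, layer.shape)
```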

  5. Methodology in the Assessment of Construction and Development Investment Projects, Including the Graphic Multi-Criteria Analysis - a Systemic Approach

    NASA Astrophysics Data System (ADS)

    Szafranko, Elżbieta

    2017-10-01

    Assessment of variant solutions developed for a building investment project needs to be made at the stage of planning. While considering alternative solutions, the investor defines various criteria, but a direct evaluation of the degree of their fulfilment by developed variant solutions can be very difficult. In practice, there are different methods which enable the user to include a large number of parameters into an analysis, but their implementation can be challenging. Some methods require advanced mathematical computations, preceded by complicated input data processing, and the generated results may not lend themselves easily to interpretation. Hence, during her research, the author has developed a systemic approach, which involves several methods and whose goal is to compare their outcomes. The final stage of the proposed method consists of graphic interpretation of results. The method has been tested on a variety of building and development projects.
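    The kind of multi-criteria comparison described above can be illustrated with a minimal weighted-scoring sketch. The criteria names, weights, and scores below are invented for illustration; they are not taken from the paper, whose systemic approach combines several methods beyond simple weighted sums:

```python
# Minimal weighted-scoring comparison of variant solutions against
# investor-defined criteria (all names and numbers are illustrative).

def rank_variants(scores, weights):
    """Return variants sorted by weighted total score, best first.

    scores  -- {variant: {criterion: score on a 0..10 scale}}
    weights -- {criterion: relative importance, summing to 1.0}
    """
    totals = {
        variant: sum(weights[c] * s for c, s in crit.items())
        for variant, crit in scores.items()
    }
    return sorted(totals.items(), key=lambda kv: kv[1], reverse=True)

weights = {"cost": 0.4, "function": 0.35, "environment": 0.25}
scores = {
    "variant_A": {"cost": 7, "function": 6, "environment": 8},
    "variant_B": {"cost": 5, "function": 9, "environment": 6},
}
ranking = rank_variants(scores, weights)
```

    A graphic interpretation step, as the paper proposes, would then plot each variant's per-criterion scores (e.g. as a radar chart) rather than reporting only the totals.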

  6. Enabling the participation of marginalized populations: case studies from a health service organization in Ontario, Canada.

    PubMed

    Montesanti, Stephanie R; Abelson, Julia; Lavis, John N; Dunn, James R

    2017-08-01

    We examined efforts to engage marginalized populations in Ontario Community Health Centers (CHCs), which are primary health care organizations serving 74 high-risk communities. Qualitative case studies of community participation in four Ontario CHCs were carried out through key informant interviews with CHC staff to identify: (i) the approaches, strategies and methods used in participation initiatives aimed specifically at engaging marginalized populations in the planning of and decision making for health services; and (ii) the challenges and enablers for engaging these populations. The marginalized populations involved in the community participation initiatives studied included Low-German Speaking Mennonites in a rural town, newcomer immigrants and refugees in an urban downtown city, immigrant and francophone seniors in an inner city and refugee women in an inner city. Our analysis revealed that enabling the participation of marginalized populations requires CHCs to attend to the barriers experienced by marginalized populations that constrain their participation. Key informants outlined the features of a 'community development approach' that they rely on to address the barriers to marginalized peoples' involvement by strengthening their skills, abilities and leadership in capacity-building activities. The community development approach also shaped the participation methods that were used in the engagement process of CHCs. However, key informants also described the challenges of applying this approach, influenced by the cultural values of some groups, which shaped their willingness and motivation to participate. This study provides further insight into the approach, strategies and methods used in the engagement process to enable the participation of marginalized populations, which may be transferable to other health services settings.

  7. Development of a Hybrid RANS/LES Method for Turbulent Mixing Layers

    NASA Technical Reports Server (NTRS)

    Georgiadis, Nicholas J.; Alexander, J. Iwan D.; Reshotko, Eli

    2001-01-01

    Significant research has been underway for several years in NASA Glenn Research Center's nozzle branch to develop advanced computational methods for simulating turbulent flows in exhaust nozzles. The primary efforts of this research have concentrated on improving our ability to calculate the turbulent mixing layers that dominate flows both in the exhaust systems of modern-day aircraft and in those of hypersonic vehicles under development. As part of these efforts, a hybrid numerical method was recently developed to simulate such turbulent mixing layers. The method developed here is intended for configurations in which a dominant structural feature provides an unsteady mechanism to drive the turbulent development in the mixing layer. Interest in Large Eddy Simulation (LES) methods has increased in recent years, but applying an LES method to calculate the wide range of turbulent scales from small eddies in the wall-bounded regions to large eddies in the mixing region is not yet possible with current computers. As a result, the hybrid method developed here uses a Reynolds-averaged Navier-Stokes (RANS) procedure to calculate wall-bounded regions entering a mixing section and uses an LES procedure to calculate the mixing-dominated regions. A numerical technique was developed to enable the use of the hybrid RANS-LES method on stretched, non-Cartesian grids. With this technique, closure for the RANS equations is obtained by using the Cebeci-Smith algebraic turbulence model in conjunction with the wall-function approach of Ota and Goldberg. The LES equations are closed using the Smagorinsky subgrid scale model. Although the function of the Cebeci-Smith model to replace all of the turbulent stresses is quite different from that of the Smagorinsky subgrid model, which only replaces the small subgrid turbulent stresses, both are eddy viscosity models and both are derived at least in part from mixing-length theory.
The similar formulation of these two models enables the RANS and LES equations to be solved with a single solution scheme and computational grid. The hybrid RANS-LES method has been applied to a benchmark compressible mixing layer experiment in which two isolated supersonic streams, separated by a splitter plate, provide the flows to a constant-area mixing section. Although the configuration is largely two dimensional in nature, three-dimensional calculations were found to be necessary to enable disturbances to develop in three spatial directions and to transition to turbulence. The flow in the initial part of the mixing section consists of a periodic vortex shedding downstream of the splitter plate trailing edge. This organized vortex shedding then rapidly transitions to a turbulent structure, which is very similar to the flow development observed in the experiments. Although the qualitative nature of the large-scale turbulent development in the entire mixing section is captured well by the LES part of the current hybrid method, further efforts are planned to directly calculate a greater portion of the turbulence spectrum and to limit the subgrid scale modeling to only the very small scales. This will be accomplished by the use of higher accuracy solution schemes and more powerful computers, measured both in speed and memory capabilities.
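    The Smagorinsky closure mentioned above models the subgrid stresses through an eddy viscosity ν_t = (C_s Δ)² |S|, with |S| = √(2 S_ij S_ij) the resolved strain-rate magnitude. A minimal sketch of that standard formula follows; the constant, filter width, and strain values are illustrative, not taken from this report:

```python
import math

def smagorinsky_nu_t(strain_rate_tensor, delta, c_s=0.17):
    """Subgrid eddy viscosity nu_t = (C_s * delta)**2 * |S|,
    where |S| = sqrt(2 * S_ij * S_ij) for the resolved strain rate.

    strain_rate_tensor -- 3x3 nested list S_ij (1/s)
    delta              -- filter width (m)
    c_s                -- Smagorinsky constant (illustrative value)
    """
    s_mag = math.sqrt(2.0 * sum(
        strain_rate_tensor[i][j] ** 2 for i in range(3) for j in range(3)))
    return (c_s * delta) ** 2 * s_mag

# Pure-shear example: S_12 = S_21 = 100 1/s on a 1 mm filter width.
S = [[0.0, 100.0, 0.0], [100.0, 0.0, 0.0], [0.0, 0.0, 0.0]]
nu_t = smagorinsky_nu_t(S, delta=1e-3)
```

    The RANS-side Cebeci-Smith model computes an eddy viscosity of the same dimensional form but from boundary-layer mixing-length arguments, which is why the two closures can share one solution scheme.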

  8. Summary of methods in Wide-Area Motion Imagery (WAMI)

    NASA Astrophysics Data System (ADS)

    Blasch, Erik; Seetharaman, Guna; Suddarth, Steve; Palaniappan, Kannappan; Chen, Genshe; Ling, Haibin; Basharat, Arlsan

    2014-06-01

    In the last decade, there have been numerous developments in wide-area motion imagery (WAMI) from sensor design to data exploitation. In this paper, we summarize the published literature on WAMI results in an effort to organize the techniques, discuss the developments, and determine the state-of-the-art. Using the organization of developments, we see the variations in approaches and relations to the data sets available. The literature summary provides an anthology of many of the developers in the last decade and their associated techniques. In our use case, we showcase current methods and products that enable future WAMI exploitation developments.

  9. Web-enabling Ecological Risk Assessment for Accessibility and Transparency

    EPA Science Inventory

    Ecological risk methods and tools are necessarily diverse to account for different combinations of receptors, exposure processes, effects estimation, and degree of conservatism/realism necessary to support chemical-based assessments. These tools have been continuously developed s...

  10. 3D printing via ambient reactive extrusion

    DOE PAGES

    Rios, Orlando; Carter, William G.; Post, Brian K.; ...

    2018-03-14

    Here, Additive Manufacturing (AM) has the potential to offer many benefits over traditional manufacturing methods in the fabrication of complex parts with advantages such as low weight, complex geometry, and embedded functionality. In practice, today’s AM technologies are limited by their slow speed and highly directional properties. To address both issues, we have developed a reactive mixture deposition approach that can enable 3D printing of polymer materials at over 100X the volumetric deposition rate, enabled by a greater than 10X reduction in print head mass compared to existing large-scale thermoplastic deposition methods, with material chemistries that can be tuned for specific properties. Additionally, the reaction kinetics and transient rheological properties are specifically designed for the target deposition rates, enabling the synchronized development of increasing shear modulus and extensive cross linking across the printed layers. This ambient cure eliminates the internal stresses and bulk distortions that typically hamper AM of large parts, and yields a printed part with inter-layer covalent bonds that significantly improve the strength of the part along the build direction. The fast cure kinetics combined with the fine-tuned viscoelastic properties of the mixture enable rapid vertical builds that are not possible using other approaches. Through rheological characterization of mixtures that were capable of printing in this process as well as materials that have sufficient structural integrity for layer-on-layer printing, a “printability” rheological phase diagram has been developed, and is presented here. We envision this approach implemented as a deployable manufacturing system, where manufacturing is done on-site using the efficiently-shipped polymer, locally-sourced fillers, and a small, deployable print system. 
    Unlike existing additive manufacturing approaches which require larger and slower print systems and complex thermal management strategies as scale increases, liquid reactive polymers decouple performance and print speed from the scale of the part, enabling a new class of cost-effective, fuel-efficient additive manufacturing.
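    A "printability" phase diagram of the kind described above partitions mixtures by their rheological state at deposition time. The sketch below illustrates the idea with a simple two-threshold classifier; the property names and all threshold values are invented placeholders, not values from the paper:

```python
def printable(viscosity_pa_s, gel_modulus_pa,
              visc_range=(50.0, 500.0), min_modulus=1e3):
    """Classify a reactive mixture as printable if its deposition-time
    viscosity lies in a pumpable window AND its short-time gel modulus
    is high enough to support the next layer.

    All thresholds here are invented placeholders for illustration;
    the paper derives its diagram from measured rheological data.
    """
    lo, hi = visc_range
    return lo <= viscosity_pa_s <= hi and gel_modulus_pa >= min_modulus

# Example classifications with made-up measurements:
ok = printable(120.0, 5e3)          # inside both windows
too_thick = printable(1200.0, 5e3)  # viscosity above the pumpable range
too_soft = printable(120.0, 10.0)   # modulus too low for layer stacking
```

    In the paper's terms, the two axes separate mixtures that flow through the print head from those with "sufficient structural integrity for layer-on-layer printing."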

  11. 3D printing via ambient reactive extrusion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rios, Orlando; Carter, William G.; Post, Brian K.

    Here, Additive Manufacturing (AM) has the potential to offer many benefits over traditional manufacturing methods in the fabrication of complex parts with advantages such as low weight, complex geometry, and embedded functionality. In practice, today’s AM technologies are limited by their slow speed and highly directional properties. To address both issues, we have developed a reactive mixture deposition approach that can enable 3D printing of polymer materials at over 100X the volumetric deposition rate, enabled by a greater than 10X reduction in print head mass compared to existing large-scale thermoplastic deposition methods, with material chemistries that can be tuned for specific properties. Additionally, the reaction kinetics and transient rheological properties are specifically designed for the target deposition rates, enabling the synchronized development of increasing shear modulus and extensive cross linking across the printed layers. This ambient cure eliminates the internal stresses and bulk distortions that typically hamper AM of large parts, and yields a printed part with inter-layer covalent bonds that significantly improve the strength of the part along the build direction. The fast cure kinetics combined with the fine-tuned viscoelastic properties of the mixture enable rapid vertical builds that are not possible using other approaches. Through rheological characterization of mixtures that were capable of printing in this process as well as materials that have sufficient structural integrity for layer-on-layer printing, a “printability” rheological phase diagram has been developed, and is presented here. We envision this approach implemented as a deployable manufacturing system, where manufacturing is done on-site using the efficiently-shipped polymer, locally-sourced fillers, and a small, deployable print system. 
    Unlike existing additive manufacturing approaches which require larger and slower print systems and complex thermal management strategies as scale increases, liquid reactive polymers decouple performance and print speed from the scale of the part, enabling a new class of cost-effective, fuel-efficient additive manufacturing.

  12. Three-Component Reaction Discovery Enabled by Mass Spectrometry of Self-Assembled Monolayers

    PubMed Central

    Montavon, Timothy J.; Li, Jing; Cabrera-Pardo, Jaime R.; Mrksich, Milan; Kozmin, Sergey A.

    2011-01-01

    Multi-component reactions have been extensively employed in many areas of organic chemistry. Despite significant progress, the discovery of such enabling transformations remains challenging. Here, we present the development of a parallel, label-free reaction-discovery platform, which can be used for identification of new multi-component transformations. Our approach is based on the parallel mass spectrometric screening of interfacial chemical reactions on arrays of self-assembled monolayers. This strategy enabled the identification of a simple organic phosphine that can catalyze a previously unknown condensation of siloxy alkynes, aldehydes and amines to produce 3-hydroxy amides with high efficiency and diastereoselectivity. The reaction was further optimized using solution phase methods. PMID:22169871

  13. Robust synthesis and continuous manufacturing of carbon nanotube forests and graphene films

    NASA Astrophysics Data System (ADS)

    Polsen, Erik S.

    Successful translation of the outstanding properties of carbon nanotubes (CNTs) and graphene to commercial applications requires highly consistent methods of synthesis, using scalable and cost-effective machines. This thesis presents robust process conditions and a series of process operations that will enable integrated roll-to-roll (R2R) CNT and graphene growth on flexible substrates. First, a comprehensive study was undertaken to establish the sources of variation in laboratory CVD growth of CNT forests. Statistical analysis identified factors that contribute to variation in forest height and density including ambient humidity, sample position in the reactor, and barometric pressure. Implementation of system modifications and user procedures reduced the variation in height and density by 50% and 54% respectively. With improved growth, two new methods for continuous deposition and patterning of catalyst nanoparticles for CNT forest growth were developed, enabling the diameter, density and pattern geometry to be tailored through the control of process parameters. Convective assembly of catalyst nanoparticles in solution enables growth of CNT forests with density 3-fold higher than using sputtered catalyst films with the same growth parameters. Additionally, laser printing of magnetic ink character recognition toner provides a large scale patterning method, with digital control of the pattern density and tunable CNT density via laser intensity. A concentric tube CVD reactor was conceptualized, designed and built for R2R growth of CNT forests and graphene on flexible substrates helically fed through the annular gap. The design enables downstream injection of the hydrocarbon source, and gas consumption is reduced 90% compared to a standard tube furnace. Multi-wall CNT forests are grown continuously on metallic and ceramic fiber substrates at 33 mm/min. High quality, uniform bi- and multi-layer graphene is grown on Cu and Ni foils at 25 - 495 mm/min. 
    A second machine for continuous forest growth and delamination was developed, and forest-substrate adhesion strength was controlled through CVD parameters. Taken together, these methods enable uniform R2R processing of CNT forests and graphene with engineered properties. Lastly, it is projected that foreseeable improvements in CNT forest quality and density using these methods will result in electrical and thermal properties that exceed state-of-the-art bulk materials.

  14. Temporal slow-growth formulation for direct numerical simulation of compressible wall-bounded flows

    NASA Astrophysics Data System (ADS)

    Topalian, Victor; Oliver, Todd A.; Ulerich, Rhys; Moser, Robert D.

    2017-08-01

    A slow-growth formulation for DNS of wall-bounded turbulent flow is developed and demonstrated to enable extension of slow-growth modeling concepts to wall-bounded flows with complex physics. As in previous slow-growth approaches, the formulation assumes scale separation between the fast scales of turbulence and the slow evolution of statistics such as the mean flow. This separation enables the development of approaches where the fast scales of turbulence are directly simulated while the forcing provided by the slow evolution is modeled. The resulting model admits periodic boundary conditions in the streamwise direction, which avoids the need for extremely long domains and complex inflow conditions that typically accompany spatially developing simulations. Further, it enables the use of efficient Fourier numerics. Unlike previous approaches [Guarini, Moser, Shariff, and Wray, J. Fluid Mech. 414, 1 (2000), 10.1017/S0022112000008466; Maeder, Adams, and Kleiser, J. Fluid Mech. 429, 187 (2001), 10.1017/S0022112000002718; Spalart, J. Fluid Mech. 187, 61 (1988), 10.1017/S0022112088000345], the present approach is based on a temporally evolving boundary layer and is specifically tailored to give results for calibration and validation of Reynolds-averaged Navier-Stokes (RANS) turbulence models. The use of a temporal homogenization simplifies the modeling, enabling straightforward extension to flows with complicating features, including cold and blowing walls. To generate data useful for calibration and validation of RANS models, special care is taken to ensure that the mean slow-growth forcing is closed in terms of the mean and other quantities that appear in standard RANS models, ensuring that there is no confounding between typical RANS closures and additional closures required for the slow-growth problem. 
The performance of the method is demonstrated on two problems: an essentially incompressible, zero-pressure-gradient boundary layer and a transonic boundary layer over a cooled, transpiring wall. The results show that the approach produces flows that are qualitatively similar to other slow-growth methods as well as spatially developing simulations and that the method can be a useful tool in investigating wall-bounded flows with complex physics.
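    The scale separation underlying the slow-growth formulation can be written schematically as follows. The notation here is generic and chosen for illustration; it is not the paper's exact formulation:

```latex
% Fast/slow decomposition (schematic, illustrative notation): turbulent
% fluctuations evolve on the fast time t, statistics on the slow time
% \tau = \epsilon t.
u_i(\mathbf{x}, t) = U_i(y, \tau) + u_i'(\mathbf{x}, t),
\qquad \tau = \epsilon t, \quad \epsilon \ll 1 .
```

    The fluctuations u_i' are simulated directly with streamwise-periodic boundary conditions, while the slow evolution of the statistics U_i enters the fast-scale equations only through a modeled forcing term, which the authors construct so that it is closed in terms of quantities appearing in standard RANS models.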

  15. Appreciative Inquiry for Quality Improvement in Primary Care Practices

    PubMed Central

    Ruhe, Mary C.; Bobiak, Sarah N.; Litaker, David; Carter, Caroline A.; Wu, Laura; Schroeder, Casey; Zyzanski, Stephen; Weyer, Sharon M.; Werner, James J.; Fry, Ronald E.; Stange, Kurt C.

    2014-01-01

    Purpose To test the effect of an Appreciative Inquiry (AI) quality improvement strategy on clinical quality management and practice development outcomes. AI enables discovery of shared motivations, envisioning a transformed future, and learning around implementation of a change process. Methods Thirty diverse primary care practices were randomly assigned to receive an AI-based intervention focused on a practice-chosen topic and on improving preventive service delivery (PSD) rates. Medical record review assessed change in PSD rates. Ethnographic fieldnotes and observational checklist analysis used editing and immersion/crystallization methods to identify factors affecting intervention implementation and practice development outcomes. Results PSD rates did not change. Field note analysis suggested that the intervention elicited core motivations, facilitated development of a shared vision, defined change objectives and fostered respectful interactions. Practices most likely to implement the intervention or develop new practice capacities exhibited one or more of the following: support from key leader(s), a sense of urgency for change, a mission focused on serving patients, health care system and practice flexibility, and a history of constructive practice change. Conclusions An AI approach and enabling practice conditions can lead to intervention implementation and practice development by connecting individual and practice strengths and motivations to the change objective. PMID:21192206

  16. LibKiSAO: a Java library for Querying KiSAO.

    PubMed

    Zhukova, Anna; Adams, Richard; Laibe, Camille; Le Novère, Nicolas

    2012-09-24

    The Kinetic Simulation Algorithm Ontology (KiSAO) supplies information about existing algorithms available for the simulation of Systems Biology models, their characteristics, parameters and inter-relationships. KiSAO enables the unambiguous identification of algorithms from simulation descriptions. Information about analogous methods having similar characteristics and about algorithm parameters incorporated into KiSAO is desirable for simulation tools. To retrieve this information programmatically an application programming interface (API) for KiSAO is needed. We developed libKiSAO, a Java library to enable querying of the KiSA Ontology. It implements methods to retrieve information about simulation algorithms stored in KiSAO, their characteristics and parameters, and methods to query the algorithm hierarchy and search for similar algorithms providing comparable results for the same simulation set-up. Using libKiSAO, simulation tools can make logical inferences based on this knowledge and choose the most appropriate algorithm to perform a simulation. LibKiSAO also enables simulation tools to handle a wider range of simulation descriptions by determining which of the available methods are similar and can be used instead of the one indicated in the simulation description if that one is not implemented. LibKiSAO enables Java applications to easily access information about simulation algorithms, their characteristics and parameters stored in the OWL-encoded Kinetic Simulation Algorithm Ontology. LibKiSAO can be used by simulation description editors and simulation tools to improve reproducibility of computational simulation tasks and facilitate model re-use.
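    The two central lookups the abstract describes, walking the algorithm hierarchy and finding substitutable algorithms with shared characteristics, can be sketched with a small in-memory table. The function names and data layout below are invented for illustration and are NOT the real libKiSAO Java API; the characteristics sets are simplified placeholders (KISAO:0000019 is CVODE and KISAO:0000000 is the ontology's root term, but the parent links and traits here are abridged):

```python
# Toy algorithm table standing in for the OWL-encoded ontology.
ALGORITHMS = {
    "KISAO:0000000": {"name": "modelling and simulation algorithm",
                      "parent": None, "characteristics": set()},
    "KISAO:0000019": {"name": "CVODE", "parent": "KISAO:0000000",
                      "characteristics": {"deterministic", "stiff"}},
    "KISAO:0000030": {"name": "Euler forward method",
                      "parent": "KISAO:0000000",
                      "characteristics": {"deterministic", "explicit"}},
}

def ancestors(kisao_id):
    """Walk the parent links up the algorithm hierarchy."""
    chain = []
    parent = ALGORITHMS[kisao_id]["parent"]
    while parent is not None:
        chain.append(parent)
        parent = ALGORITHMS[parent]["parent"]
    return chain

def similar(kisao_id):
    """Algorithms sharing at least one characteristic -- candidates to
    substitute when the requested method is not implemented."""
    chars = ALGORITHMS[kisao_id]["characteristics"]
    return sorted(other for other, entry in ALGORITHMS.items()
                  if other != kisao_id and chars & entry["characteristics"])
```

    A simulation tool would use queries like these to fall back from an unavailable solver to one with comparable characteristics, as the abstract describes.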

  17. Developing an expert panel process to refine health outcome definitions in observational data.

    PubMed

    Fox, Brent I; Hollingsworth, Joshua C; Gray, Michael D; Hollingsworth, Michael L; Gao, Juan; Hansen, Richard A

    2013-10-01

    Drug safety surveillance using observational data requires valid adverse event, or health outcome of interest (HOI) measurement. The objectives of this study were to develop a method to review HOI definitions in claims databases using (1) web-based digital tools to present de-identified patient data, (2) a systematic expert panel review process, and (3) a data collection process enabling analysis of concepts-of-interest that influence panelists' determination of HOI. De-identified patient data were presented via an interactive web-based dashboard to enable case review and determine if specific HOIs were present or absent. Criteria for determining HOIs and their severity were provided to each panelist. Using a modified Delphi method, six panelist pairs independently reviewed approximately 200 cases across each of three HOIs (acute liver injury, acute kidney injury, and acute myocardial infarction) such that panelist pairs independently reviewed the same cases. Panelists completed an assessment within the dashboard for each case that included their assessment of the presence or absence of the HOI, HOI severity (if present), and data contributing to their decision. Discrepancies within panelist pairs were resolved during a consensus process. Dashboard development was iterative, focusing on data presentation and recording panelists' assessments. Panelists reported quickly learning how to use the dashboard. The assessment module was used consistently. The dashboard was reliable, enabling an efficient review process for panelists. Modifications were made to the dashboard and review process when necessary to facilitate case review. Our methods should be applied to other health outcomes of interest to further refine the dashboard and case review process. The expert review process was effective and was supported by the web-based dashboard. 
    Our methods for case review and classification can be applied to future methods for case identification in observational data sources.
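    Since panelist pairs independently reviewed the same cases, within-pair agreement before the consensus step can be quantified with a chance-corrected statistic such as Cohen's kappa. The sketch below is a generic implementation added for illustration; the study itself does not specify this statistic, and the judgment data are invented:

```python
def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters' binary HOI judgments
    (True = health outcome of interest present).

    kappa = (p_o - p_e) / (1 - p_e), where p_o is observed agreement
    and p_e is agreement expected by chance from each rater's base rate.
    """
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    p_yes_a = sum(rater_a) / n
    p_yes_b = sum(rater_b) / n
    p_e = p_yes_a * p_yes_b + (1 - p_yes_a) * (1 - p_yes_b)
    return (p_o - p_e) / (1 - p_e)

# Invented judgments from one panelist pair over six cases.
a = [True, True, False, False, True, False]
b = [True, False, False, False, True, False]
kappa = cohens_kappa(a, b)
```

    Cases where the pair disagrees (kappa below some working threshold, or individual discrepant cases) are exactly those routed to the consensus process the abstract describes.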

  18. Expert2OWL: A Methodology for Pattern-Based Ontology Development.

    PubMed

    Tahar, Kais; Xu, Jie; Herre, Heinrich

    2017-01-01

    The formalization of expert knowledge enables a broad spectrum of applications employing ontologies as underlying technology. These include eLearning, Semantic Web and expert systems. However, the manual construction of such ontologies is time-consuming and thus expensive. Moreover, experts are often unfamiliar with the syntax and semantics of formal ontology languages such as OWL and usually have no experience in developing formal ontologies. To overcome these barriers, we developed a new method and tool, called Expert2OWL that provides efficient features to support the construction of OWL ontologies using GFO (General Formal Ontology) as a top-level ontology. This method allows a close and effective collaboration between ontologists and domain experts. Essentially, this tool integrates Excel spreadsheets as part of a pattern-based ontology development and refinement process. Expert2OWL enables us to expedite the development process and modularize the resulting ontologies. We applied this method in the field of Chinese Herbal Medicine (CHM) and used Expert2OWL to automatically generate an accurate Chinese Herbology ontology (CHO). The expressivity of CHO was tested and evaluated using ontology query languages SPARQL and DL. CHO shows promising results and can generate answers to important scientific questions such as which Chinese herbal formulas contain which substances, which substances treat which diseases, and which ones are the most frequently used in CHM.
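    The core idea of the spreadsheet-driven pattern, domain experts fill rows, and each row expands into formal statements, can be shown with a toy converter. The column names, herbs, and flat triple output below are invented for illustration; Expert2OWL itself emits OWL axioms grounded in GFO, not these tuples:

```python
import csv
import io

# Invented expert-authored spreadsheet: one herb per row.
SHEET = """herb,parent,treats
Ginseng,Tonic herb,Fatigue
Ephedra,Diaphoretic herb,Asthma
"""

def rows_to_statements(sheet_text):
    """Expand each spreadsheet row into simple subclass/relation
    statements, mimicking pattern-based ontology generation."""
    statements = []
    for row in csv.DictReader(io.StringIO(sheet_text)):
        statements.append((row["herb"], "subClassOf", row["parent"]))
        statements.append((row["herb"], "treats", row["treats"]))
    return statements

triples = rows_to_statements(SHEET)
```

    The pattern is what makes the collaboration work: experts only edit tabular cells, while the ontologist maintains the row-to-axiom expansion once.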

  19. Using Colaizzi's method of data analysis to explore the experiences of nurse academics teaching on satellite campuses.

    PubMed

    Wirihana, Lisa; Welch, Anthony; Williamson, Moira; Christensen, Martin; Bakon, Shannon; Craft, Judy

    2018-03-16

    Phenomenology is a useful methodological approach in qualitative nursing research. It enables researchers to put aside their perceptions of a phenomenon and give meaning to a participant's experiences. Exploring the experiences of others enables previously unavailable insights to be discovered. To delineate the implementation of Colaizzi's (1978) method of data analysis in descriptive phenomenological nursing research. The use of Colaizzi's method of data analysis enabled new knowledge to be revealed and provided insights into the experiences of nurse academics teaching on satellite campuses. Local adaptation of the nursing curriculum and additional unnoticed responsibilities had not been identified previously and warrant further research. Colaizzi's (1978) method of data analysis is rigorous and robust, and therefore a qualitative method that ensures the credibility and reliability of its results. It allows researchers to reveal emergent themes and their interwoven relationships. Researchers using a descriptive phenomenological approach should consider using this method as a clear and logical process through which the fundamental structure of an experience can be explored. Colaizzi's phenomenological methodology can be used reliably to understand people's experiences. This may prove beneficial in the development of therapeutic policy and the provision of patient-centred care.

  20. Policy Driven Development: Flexible Policy Insertion for Large Scale Systems.

    PubMed

    Demchak, Barry; Krüger, Ingolf

    2012-07-01

    The success of a software system depends critically on how well it reflects and adapts to stakeholder requirements. Traditional development methods often frustrate stakeholders by creating long latencies between requirement articulation and system deployment, especially in large scale systems. One source of latency is the maintenance of policy decisions encoded directly into system workflows at development time, including those involving access control and feature set selection. We created the Policy Driven Development (PDD) methodology to address these development latencies by enabling the flexible injection of decision points into existing workflows at runtime, thus enabling policy composition that integrates requirements furnished by multiple, oblivious stakeholder groups. Using PDD, we designed and implemented a production cyberinfrastructure that demonstrates policy and workflow injection that quickly implements stakeholder requirements, including features not contemplated in the original system design. PDD provides a path to quickly and cost effectively evolve such applications over a long lifetime.
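    The separation the abstract describes, workflows encode where a decision happens while the policy itself is injected at runtime, can be sketched with a minimal registry. All names and the registry shape are invented for illustration; the PDD paper's mechanism is richer than this:

```python
# Named decision points whose policies can be replaced after deployment.
POLICIES = {}

def set_policy(point, fn):
    """Install or replace the policy at a named decision point."""
    POLICIES[point] = fn

def decide(point, context, default=False):
    """Evaluate the currently installed policy, with a safe default."""
    return POLICIES.get(point, lambda ctx: default)(context)

def download_workflow(user):
    # The workflow fixes *where* the access decision occurs,
    # not *what* the decision is.
    if not decide("access.download", user):
        return "denied"
    return "ok"

# A stakeholder group injects its access-control requirement at runtime,
# without modifying download_workflow.
set_policy("access.download", lambda user: user.get("role") == "researcher")
```

    Because multiple stakeholder groups can each install policies at distinct decision points, requirements compose without the groups coordinating, which is the latency reduction PDD targets.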

  1. The Power of Many: Mentoring Networks for Growth and Development

    ERIC Educational Resources Information Center

    Wild, Lynn; Canale, Anne Marie; Herdklotz, Cheryl

    2017-01-01

    In higher education, as in many professions, employees new to their positions are advised to seek a mentor--an experienced individual who knows the profession and the academy and is invested in his or her mentee's success. Mentoring has long been recognized as an effective method for enabling new employees to develop the knowledge, skills,…

  2. Developing nursing ethical competences online versus in the traditional classroom.

    PubMed

    Trobec, Irena; Starcic, Andreja Istenic

    2015-05-01

    The development of society and science, especially medical science, gives rise to new moral and ethical challenges in healthcare. In order to respond to the contemporary challenges that require autonomous decision-making in different work contexts, a pedagogical experiment was conducted to identify the readiness and responsiveness of current organisation of nursing higher education in Slovenia. It compared the effectiveness of active learning methods online (experimental group) and in the traditional classroom (control group) and their impact on the ethical competences of nursing students. The hypothesis set in the experiment, hypothesis 1 (the experimental group will be successful and will have good achievements in comprehension and application of ethical principles), was confirmed based on pre-tests and post-tests. The hypothesis tested by the questionnaire, hypothesis 2 (according to the students, the active learning methods online in the experimental group have a positive impact on the development of ethical competences), was confirmed. The pedagogical experiment was supported by a multiple-case study that enabled the in-depth analysis of the students' attitudes towards the active learning methods in both settings. The study included Slovenian first-year nursing students (N = 211) of all the enrolled students (N = 225) at the University of Ljubljana and University of Primorska in the academic year 2010/2011. Before the study, ethical permission was obtained from the managements of both participating faculties. The students were given all the necessary information of the experiment before the tutorials. No significant difference was found between the two learning settings and both had a positive impact upon learning. 
The results of the content analysis show that the students' active engagement with the active learning methods in the group enables the development of ethical competences and the related communicative competences, interpersonal skills, collaboration and critical thinking. Active learning methods in both settings compared, online and the traditional classroom, enabled the development of a higher level of knowledge defined by the ability of critical thinking and reflective response, the core of ethical competences. Students develop ethical competence through active engagement in group work, role play and discussion, and there is no difference between online and traditional learning settings. In healthcare, it is crucial for providers to be capable of making autonomous decisions and managing the various communication situations and contexts in which moral attitudes and ethical sensibility are essential. © The Author(s) 2014.

  3. Modular Exposure Disaggregation Methodologies for Catastrophe Modelling using GIS and Remotely-Sensed Data

    NASA Astrophysics Data System (ADS)

    Foulser-Piggott, R.; Saito, K.; Spence, R.

    2012-04-01

    Loss estimates produced by catastrophe models are dependent on the quality of the input data, including both the hazard and exposure data. Currently, some of the exposure data input into a catastrophe model is aggregated over an area and therefore an estimate of the risk in this area may have a low level of accuracy. In order to obtain a more detailed and accurate loss estimate, it is necessary to have higher resolution exposure data. However, high resolution exposure data is not commonly available worldwide and therefore methods to infer building distribution and characteristics at higher resolution from existing information must be developed. This study is focussed on the development of disaggregation methodologies for exposure data which, if implemented in current catastrophe models, would lead to improved loss estimates. The new methodologies developed for disaggregating exposure data make use of GIS, remote sensing and statistical techniques. The main focus of this study is on earthquake risk, however the methods developed are modular so that they may be applied to different hazards. A number of different methods are proposed in order to be applicable to different regions of the world which have different amounts of data available. The new methods give estimates of both the number of buildings in a study area and a distribution of building typologies, as well as a measure of the vulnerability of the building stock to hazard. For each method, a way to assess and quantify the uncertainties in the methods and results is proposed, with particular focus on developing an index to enable input data quality to be compared. The applicability of the methods is demonstrated through testing for two study areas, one in Japan and the second in Turkey, selected because of the occurrence of recent and damaging earthquake events. 
The testing procedure is to use the proposed methods to estimate the number of buildings damaged at different levels following a scenario earthquake event. This enables the results of the models to be compared with real data and the relative performance of the different methodologies to be evaluated. A sensitivity analysis is also conducted for two main reasons. Firstly, to determine the key input variables in the methodology that have the most significant impact on the resulting loss estimate. Secondly, to enable the uncertainty in the different approaches to be quantified and therefore provide a range of uncertainty in the loss estimates.

  4. Mapping the pathways of resistance to targeted therapies

    PubMed Central

    Wood, Kris C.

    2015-01-01

    Resistance substantially limits the depth and duration of clinical responses to targeted anticancer therapies. Through the use of complementary experimental approaches, investigators have revealed that cancer cells can achieve resistance through adaptation or selection driven by specific genetic, epigenetic, or microenvironmental alterations. Ultimately, these diverse alterations often lead to the activation of signaling pathways that, when co-opted, enable cancer cells to survive drug treatments. Recently developed methods enable the direct and scalable identification of the signaling pathways capable of driving resistance in specific contexts. Using these methods, novel pathways of resistance to clinically approved drugs have been identified and validated. By combining systematic resistance pathway mapping methods with studies revealing biomarkers of specific resistance pathways and pharmacological approaches to block these pathways, it may be possible to rationally construct drug combinations that yield more penetrant and lasting responses in patients. PMID:26392071

  5. Adult stem cell lineage tracing and deep tissue imaging

    PubMed Central

    Fink, Juergen; Andersson-Rolf, Amanda; Koo, Bon-Kyoung

    2015-01-01

    Lineage tracing is a widely used method for understanding cellular dynamics in multicellular organisms during processes such as development, adult tissue maintenance, injury repair and tumorigenesis. Advances in tracing or tracking methods, from light microscopy-based live cell tracking to fluorescent label-tracing with two-photon microscopy, together with emerging tissue clearing strategies and intravital imaging approaches have enabled scientists to decipher adult stem and progenitor cell properties in various tissues and in a wide variety of biological processes. Although technical advances have enabled time-controlled genetic labeling and simultaneous live imaging, a number of obstacles still need to be overcome. In this review, we aim to provide an in-depth description of the traditional use of lineage tracing as well as current strategies and upcoming new methods of labeling and imaging. [BMB Reports 2015; 48(12): 655-667] PMID:26634741

  6. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Qin, Hong; Liu, Jian; Xiao, Jianyuan

    Particle-in-cell (PIC) simulation is the most important numerical tool in plasma physics. However, its long-term accuracy has not been established. To overcome this difficulty, we developed a canonical symplectic PIC method for the Vlasov-Maxwell system by discretising its canonical Poisson bracket. A fast local algorithm to solve the symplectic implicit time advance is discovered without root searching or global matrix inversion, enabling applications of the proposed method to very large-scale plasma simulations with many, e.g. 10⁹, degrees of freedom. The long-term accuracy and fidelity of the algorithm enables us to numerically confirm Mouhot and Villani's theory and conjecture on nonlinear Landau damping over several orders of magnitude using the PIC method, and to calculate the nonlinear evolution of the reflectivity during the mode conversion process from extraordinary waves to Bernstein waves.
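    This record attributes the method's long-term fidelity to its symplectic structure. As a minimal illustration of why symplectic time advances retain accuracy over long runs, the sketch below uses the standard leapfrog (velocity Verlet) scheme on a harmonic oscillator; this is a textbook integrator, not the canonical symplectic PIC algorithm of the record, and all names in it are illustrative.

```python
def leapfrog(x, v, force, dt, steps):
    """Symplectic leapfrog (velocity Verlet) time advance."""
    a = force(x)
    for _ in range(steps):
        v_half = v + 0.5 * dt * a  # half kick
        x = x + dt * v_half        # drift
        a = force(x)
        v = v_half + 0.5 * dt * a  # half kick
    return x, v

# Harmonic oscillator with unit frequency: force = -x.
x0, v0 = 1.0, 0.0
x, v = leapfrog(x0, v0, lambda q: -q, dt=0.05, steps=200_000)

# For a symplectic scheme the energy error stays bounded even after
# many periods, instead of drifting as for a non-symplectic scheme.
energy_error = abs(0.5 * (x * x + v * v) - 0.5 * (x0 * x0 + v0 * v0))
print(energy_error)
```

    Over 200,000 steps the energy error remains of order dt², which is the behaviour that makes symplectic discretisations attractive for very long plasma simulations.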

  7. SECIMTools: a suite of metabolomics data analysis tools.

    PubMed

    Kirpich, Alexander S; Ibarra, Miguel; Moskalenko, Oleksandr; Fear, Justin M; Gerken, Joseph; Mi, Xinlei; Ashrafi, Ali; Morse, Alison M; McIntyre, Lauren M

    2018-04-20

    Metabolomics has the promise to transform the area of personalized medicine with the rapid development of high-throughput technology for untargeted analysis of metabolites. Open-access, easy-to-use analytic tools that are broadly accessible to the biological community need to be developed. While the technology used in metabolomics varies, most metabolomics studies have a set of features identified. Galaxy is an open-access platform that enables scientists at all levels to interact with big data. Galaxy promotes reproducibility by saving histories and enabling the sharing of workflows among scientists. SECIMTools (SouthEast Center for Integrated Metabolomics) is a set of Python applications that are available both as standalone tools and wrapped for use in Galaxy. The suite includes a comprehensive set of quality control metrics (retention time window evaluation and various peak evaluation tools), visualization techniques (hierarchical cluster heatmap, principal component analysis, modular modularity clustering), basic statistical analysis methods (partial least squares discriminant analysis, analysis of variance, t-test, Kruskal-Wallis non-parametric test), advanced classification methods (random forest, support vector machines), and advanced variable selection tools (least absolute shrinkage and selection operator (LASSO) and Elastic Net). SECIMTools leverages the Galaxy platform and enables integrated workflows for metabolomics data analysis made from building blocks designed for ease of use and interpretability. Standard data formats and a set of utilities allow arbitrary linkages between tools to encourage novel workflow designs. The Galaxy framework enables future integration of metabolomics studies with other omics data.
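    Among the basic statistical methods the suite lists is the t-test, typically applied feature-by-feature to compare sample groups. As a hedged sketch (not the actual SECIMTools implementation; the data and names are invented for illustration), a Welch t statistic for a single metabolite feature could be computed like this:

```python
import math
import statistics

def welch_t(a, b):
    """Welch's t statistic for two independent samples
    (unequal variances), as used in feature-wise group comparisons."""
    ma, mb = statistics.mean(a), statistics.mean(b)
    va, vb = statistics.variance(a), statistics.variance(b)  # sample variances
    return (ma - mb) / math.sqrt(va / len(a) + vb / len(b))

# Hypothetical peak intensities for one feature in two groups.
control = [10.1, 9.8, 10.3, 10.0]
case = [12.0, 11.7, 12.4, 11.9]
t = welch_t(case, control)
print(t)  # large |t| suggests a group difference for this feature
```

    In a real untargeted study this statistic would be computed per feature and the resulting p-values corrected for multiple testing, which is the kind of step the Galaxy wrappers chain together into a workflow.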

  8. The spectral method and the central limit theorem for general Markov chains

    NASA Astrophysics Data System (ADS)

    Nagaev, S. V.

    2017-12-01

    We consider Markov chains with an arbitrary phase space and develop a modification of the spectral method that enables us to prove the central limit theorem (CLT) for non-uniformly ergodic Markov chains. The conditions imposed on the transition function are more general than those by Athreya-Ney and Nummelin. Our proof of the CLT is purely analytical.
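    For context, the CLT in question has the following standard form for a Markov chain $(X_n)$ with stationary distribution $\pi$ and a suitable function $f$ (a generic statement, not the paper's exact hypotheses):

```latex
\frac{1}{\sqrt{n}} \sum_{k=1}^{n} \bigl( f(X_k) - \pi(f) \bigr)
  \;\xrightarrow{\ d\ }\; \mathcal{N}\bigl(0, \sigma_f^2\bigr),
\qquad
\sigma_f^2 \;=\; \lim_{n \to \infty} \frac{1}{n}\,
  \operatorname{Var}\!\Bigl( \sum_{k=1}^{n} f(X_k) \Bigr).
```

    The spectral method obtains this limit by studying perturbations of the transition operator; the paper's contribution is to extend such arguments beyond uniformly ergodic chains.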

  9. Murai Reaction on Furfural Derivatives Enabled by Removable N,N'-Bidentate Directing Groups.

    PubMed

    Pezzetta, Cristofer; Veiros, Luis F; Oble, Julie; Poli, Giovanni

    2017-06-22

    Furfural and related compounds are industrially relevant building blocks obtained from lignocellulosic biomass. To enhance the added value of these renewable resources, a Ru-catalyzed hydrofurylation of alkenes, involving a directed C-H activation at C3 of the furan ring, was developed. A thorough experimental study revealed that a bidentate amino-imine directing group enabled the desired coupling. Removal of the directing group occurred during the purification step, directly releasing the C3-functionalized furfurals. Development of the reaction as well as optimization and scope of the method were described. A mechanism was proposed on the basis of DFT calculations. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. NXE pellicle: development update

    NASA Astrophysics Data System (ADS)

    Brouns, Derk; Bendiksen, Aage; Broman, Par; Casimiri, Eric; Colsters, Paul; de Graaf, Dennis; Harrold, Hilary; Hennus, Piet; Janssen, Paul; Kramer, Ronald; Kruizinga, Matthias; Kuntzel, Henk; Lafarre, Raymond; Mancuso, Andrea; Ockwell, David; Smith, Daniel; van de Weg, David; Wiley, Jim

    2016-09-01

    ASML introduced the NXE pellicle concept, a removable pellicle solution that is compatible with current and future patterned mask inspection methods. We will present results of how we have taken the idea from concept to a demonstrated solution enabling the use of EUV pellicle by the industry for high volume manufacturing. We will update on the development of the next generation of pellicle films with higher power capability. Further, we will provide an update on top level requirements for pellicles and external interface requirements needed to support NXE pellicle adoption at a mask shop. Finally, we will present ASML's pellicle handling equipment to enable pellicle use at mask shops and our NXE pellicle roadmap outlining future improvements.

  11. Using Electronic Messaging to Improve the Quality of Instruction.

    ERIC Educational Resources Information Center

    Zack, Michael H.

    1995-01-01

    Qualitative and quantitative data from business students using electronic mail and computer conferencing showed these methods enabled the instructor to be more accessible and responsive; greater class cohesion developed, and perceived quality of the course and instructor effectiveness increased. (SK)

  12. Shining Light on Higher Education's Newest Baccalaureate Degrees and the Research Needed to Understand Their Impact

    ERIC Educational Resources Information Center

    Bragg, Debra D.; Soler, Maria Claudia

    2016-01-01

    This chapter discusses methods and measures that are needed to conduct research on newly developing Applied Baccalaureate degrees that enable students to transfer applied college credits heretofore considered terminal to bachelor's degree programs.

  13. Company Profile: Selventa, Inc.

    PubMed

    Fryburg, David A; Latino, Louis J; Tagliamonte, John; Kenney, Renee D; Song, Diane H; Levine, Arnold J; de Graaf, David

    2012-08-01

    Selventa, Inc. (MA, USA) is a biomarker discovery company that enables personalized healthcare. Originally founded as Genstruct, Inc., Selventa has undergone significant evolution from a technology-based service provider to an active partner in the development of diagnostic tests, functioning as a molecular dashboard of disease activity using a unique platform. As part of that evolution, approximately 2 years ago the company was rebranded as Selventa to reflect its new identity and mission. The contributions to biomedical research by Selventa are based on in silico, reverse-engineering methods to determine biological causality. That is, given a set of in vitro or in vivo biological observations, which biological mechanisms can explain the measured results? Facilitated by a large and carefully curated knowledge base, these in silico methods generated new insights into the mechanisms driving a disease. As Selventa's methods would enable biomarker discovery and be directly applicable to generating novel diagnostics, the scientists at Selventa have focused on the development of predictive biomarkers of response in autoimmune and oncologic diseases. Selventa is presently building a portfolio of independent, as well as partnered, biomarker projects with the intention to create diagnostic tests that predict response to therapy.

  14. "Reagent-free" L-asparaginase activity assay based on CD spectroscopy and conductometry.

    PubMed

    Kudryashova, Elena V; Sukhoverkov, Kirill V

    2016-02-01

    A new method to determine the catalytic parameters of L-asparaginase using circular dichroism (CD) spectroscopy has been developed. The assay is based on the difference in CD signal between the substrate (L-asparagine) and the product (L-aspartic acid) of the enzymatic reaction. CD spectroscopy, being a direct method, enables continuous measurement; it thus avoids the multistage and laborious approach based on Nessler's method and overcomes the limitations of coupled enzymatic reaction methods. In this work, we show robust measurements of L-asparaginase activity in conjugates with PEG-chitosan copolymers, which otherwise would not have been possible. The main limitation of the CD method is that the analysis should be performed under substrate saturation conditions (Vmax regime). For KM measurement, the conductometry method is suggested, which can serve as a complementary method to CD spectroscopy. The activity assay based on CD spectroscopy and conductometry was successfully applied to examine the catalytic parameters of L-asparaginase conjugates with chitosan and its derivatives, and to optimize the molecular architecture and composition of such conjugates for improving the biocatalytic properties of the enzyme under physiological conditions. The approach developed is potentially applicable to other enzymatic reactions where the spectroscopic properties of substrate and product do not enable direct measurement by absorption or fluorescence spectroscopy. This may include a number of amino acid- or glycoside-transforming enzymes.
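    The need to work at substrate saturation follows directly from Michaelis-Menten kinetics (the standard rate law, restated here for context rather than taken from the record):

```latex
v \;=\; \frac{V_{\max}\,[S]}{K_M + [S]}
\quad\Longrightarrow\quad
v \;\approx\; V_{\max} \quad \text{when } [S] \gg K_M .
```

    At saturating substrate the measured rate is insensitive to $[S]$, so a continuous CD trace directly reports $V_{\max}$; $K_M$, by contrast, requires rate measurements at sub-saturating $[S]$, which is why a complementary technique such as conductometry is proposed.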

  15. Bio-recycling of metals: Recycling of technical products using biological applications.

    PubMed

    Pollmann, Katrin; Kutschke, Sabine; Matys, Sabine; Raff, Johannes; Hlawacek, Gregor; Lederer, Franziska L

    2018-03-16

    The increasing demand for various essential metals as a consequence of the development of new technologies, especially the so-called "low carbon technologies", requires innovative technologies that enable economic and environmentally friendly metal recovery from primary and secondary resources. There is serious concern that the demand for some critical elements might exceed the present supply within a few years, necessitating the development of novel strategies and technologies to meet the requirements of industry and society. Besides improving the exploitation and processing of ores, the more urgent issue of recycling strategic metals has to be addressed. However, current recycling rates are very low due to the increasing complexity of products and the low content of certain critical elements, which hinder economic metal recovery. On the other hand, increasing environmental consciousness as well as the limitations of classical methods call for innovative recycling methodologies that enable a circular economy. Modern biotechnologies can contribute to solving some of the problems related to metal recycling. These approaches use the natural properties of organisms, bio-compounds, and biomolecules to interact with minerals, materials, metals, or metal ions through surface attachment, mineral dissolution, transformation, and metal complexation. Further, modern genetic approaches, e.g. those realized by synthetic biology, enable the smart design of new chemicals. The article presents some recent developments in the fields of bioleaching, biosorption, bioreduction, and bioflotation, and their use for metal recovery from different waste materials. Currently only a few of these developments are commercialized. The major limitations are high costs in comparison to conventional methods and low element selectivity. The article discusses future trends to overcome these barriers. In particular, interdisciplinary approaches, the combination of different technologies, the inclusion of modern genetic methods, and the consideration of existing but as yet unexplored natural resources will drive innovation in these fields. Copyright © 2018 Elsevier Inc. All rights reserved.

  16. Method of synthesizing tungsten nanoparticles

    DOEpatents

    Thoma, Steven G; Anderson, Travis M

    2013-02-12

    A method to synthesize tungsten nanoparticles has been developed that enables the synthesis of nanometer-scale, monodisperse particles stabilized only by tetrahydrofuran. The method can be used at room temperature, is scalable, and the product can be concentrated by standard means. Since no additives or stabilizing surfactants are required, the method is particularly well suited to producing tungsten nanoparticles for dispersion in polymers. If complete dispersion is achieved, then owing to the small size of the nanoparticles the optical properties of the polymer can be largely maintained.

  17. Modelling of percolation rate of stormwater from underground infiltration systems.

    PubMed

    Burszta-Adamiak, Ewa; Lomotowski, Janusz

    2013-01-01

    Underground or surface stormwater storage tank systems that enable the infiltration of water into the ground are basic elements used in Sustainable Urban Drainage Systems (SUDS). So far, the design methods for such facilities have not taken into account the phenomenon of ground clogging during stormwater infiltration. Top layer sealing of the filter bed influences the infiltration rate of water into the ground. This study presents an original mathematical model describing changes in the infiltration rate variability in the phases of filling and emptying the storage and infiltration tank systems, which enables the determination of the degree of top ground layer clogging. The input data for modelling were obtained from studies conducted on experimental sites on objects constructed on a semi-technological scale. The experiment conducted has proven that the application of the model developed for the phase of water infiltration enables us to estimate the degree of module clogging. However, this method is more suitable for reservoirs embedded in more permeable soils than for those located in cohesive soils.

  18. Fully Synthetic Granulocyte Colony-Stimulating Factor Enabled by Isonitrile-Mediated Coupling of Large, Side-Chain-Unprotected Peptides

    PubMed Central

    Roberts, Andrew G.; Johnston, Eric V.; Shieh, Jae-Hung; Sondey, Joseph P.; Hendrickson, Ronald C.; Moore, Malcolm A. S.; Danishefsky, Samuel J.

    2015-01-01

    Human granulocyte colony-stimulating factor (G-CSF) is an endogenous glycoprotein involved in hematopoiesis. Natively glycosylated and nonglycosylated recombinant forms, lenograstim and filgrastim, respectively, are used clinically to manage neutropenia in patients undergoing chemotherapeutic treatment. Despite their comparable therapeutic potential, the purpose of O-linked glycosylation at Thr133 remains a subject of controversy. In light of this, we have developed a synthetic platform to prepare G-CSF aglycone with the goal of enabling access to native and designed glycoforms with site-selectivity and glycan homogeneity. To address the synthesis of a relatively large, aggregation-prone sequence, we advanced an isonitrile-mediated ligation method. The chemoselective activation and coupling of C-terminal peptidyl Gly thioacids with the N-terminus of an unprotected peptide provide ligated peptides directly in a manner complementary to that with conventional native chemical ligation–desulfurization strategies. Herein, we describe the details and application of this method as it enabled the convergent total synthesis of G-CSF aglycone. PMID:26401918

  19. Emerging MRI Methods in Translational Cardiovascular Research

    PubMed Central

    Vandsburger, Moriel H; Epstein, Frederick H

    2011-01-01

    Cardiac magnetic resonance imaging (CMR) has become a reference standard modality for imaging of left ventricular (LV) structure and function, and, using late gadolinium enhancement, for imaging myocardial infarction. Emerging CMR techniques enable a more comprehensive examination of the heart, making CMR an excellent tool for use in translational cardiovascular research. Specifically, emerging CMR methods have been developed to measure the extent of myocardial edema, changes in ventricular mechanics, changes in tissue composition as a result of fibrosis, and changes in myocardial perfusion as a function of both disease and infarct healing. New CMR techniques also enable the tracking of labeled cells, molecular imaging of biomarkers of disease, and changes in calcium flux in cardiomyocytes. In addition, MRI can quantify blood flow velocity and wall shear stress in large blood vessels. Almost all of these techniques can be applied in both pre-clinical and clinical settings, enabling both the techniques themselves and the knowledge gained using such techniques in pre-clinical research to be translated from the lab bench to the patient bedside. PMID:21452060

  20. 78 FR 29387 - Government-Owned Inventions, Available for Licensing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-20

    ....: MSC-24919-1: Systems and Methods for RFID-Enabled Information Collection; NASA Case No.: MSC-25632-1... Methods for RFID-Enabled Dispenser; NASA Case No.: MSC-25313-1: Hydrostatic Hyperbaric Apparatus and...; NASA Case No: MSC-25590-1: Systems and Methods for RFID-Enabled Pressure Sensing Apparatus; NASA Case...

  1. Recent trends in the determination of vitamin D.

    PubMed

    Gomes, Fabio P; Shaw, P Nicholas; Whitfield, Karen; Koorts, Pieter; Hewavitharana, Amitha K

    2013-12-01

    The occurrence of vitamin D deficiency has become an issue of serious concern in the worldwide population. As a result, numerous analytical methods have been developed, for a variety of matrices, during the last few years to measure vitamin D analogs and metabolites. This review employs a comprehensive search of all vitamin D methods developed during the last 5 years for all applications, using ISI Web of Science®, SciFinder®, ScienceDirect, Scopus and PubMed. Particular emphasis is given to sample-preparation methods and the different forms of vitamin D measured across different fields of application such as biological fluids, food and pharmaceutical preparations. This review compares and critically evaluates a wide range of approaches and methods, and hence will enable readers to access developments across a number of applications and to select or develop the optimal analytical method for vitamin D for their particular application.

  2. Development of a Hybrid RANS/LES Method for Compressible Mixing Layer Simulations

    NASA Technical Reports Server (NTRS)

    Georgiadis, Nicholas J.; Alexander, J. Iwan D.; Reshotko, Eli

    2001-01-01

    A hybrid method has been developed for simulations of compressible turbulent mixing layers. Such mixing layers dominate the flows in exhaust systems of modern-day aircraft and also those of hypersonic vehicles currently under development. The hybrid method uses a Reynolds-averaged Navier-Stokes (RANS) procedure to calculate wall-bounded regions entering a mixing section, and a Large Eddy Simulation (LES) procedure to calculate the mixing-dominated regions. A numerical technique was developed to enable the use of the hybrid RANS/LES method on stretched, non-Cartesian grids. The hybrid RANS/LES method is applied to a benchmark compressible mixing layer experiment. Preliminary two-dimensional calculations are used to investigate the effects of axial grid density and boundary conditions. Actual LES calculations, performed in three spatial directions, indicated an initial vortex shedding followed by rapid transition to turbulence, which is in agreement with experimental observations.

  3. Recent advances in neural dust: towards a neural interface platform.

    PubMed

    Neely, Ryan M; Piech, David K; Santacruz, Samantha R; Maharbiz, Michel M; Carmena, Jose M

    2018-06-01

    The neural dust platform uses ultrasonic power and communication to enable a scalable, wireless, and batteryless system for interfacing with the nervous system. Ultrasound offers several advantages over alternative wireless approaches, including a safe method for powering and communicating with sub mm-sized devices implanted deep in tissue. Early studies demonstrated that neural dust motes could wirelessly transmit high-fidelity electrophysiological data in vivo, and that theoretically, this system could be miniaturized well below the mm-scale. Future developments are focused on further minimization of the platform, better encapsulation methods as a path towards truly chronic neural interfaces, improved delivery mechanisms, stimulation capabilities, and finally refinements to enable deployment of neural dust in the central nervous system. Copyright © 2017. Published by Elsevier Ltd.

  4. Enablers and challenges to occupational therapists’ research engagement: A qualitative study

    PubMed Central

    Di Bona, Laura; Wenborn, Jennifer; Field, Becky; Hynes, Sinéad M; Ledgerd, Ritchard; Mountain, Gail; Swinson, Tom

    2017-01-01

    Introduction To develop occupational therapy’s evidence base and improve its clinical outcomes, occupational therapists must increase their research involvement. Barriers to research consumption and leadership are well documented, but those relating to delivering research interventions, less so. Yet, interventions need to be researched within practice to demonstrate their clinical effectiveness. This study aims to improve understanding of challenges and enablers experienced by occupational therapists who deliver interventions within research programmes. Method Twenty-eight occupational therapists who participated in the Valuing Active Life in Dementia (VALID) research programme reported their experiences in five focus groups. Data were analysed thematically to identify key and subthemes. Results Occupational therapists reported that overwhelming paperwork, use of videos, recruitment and introducing a new intervention challenged their research involvement, whereas support, protected time and a positive attitude enabled it. The impact of these challenges and enablers varied between therapists and organisations. Conclusion Challenges and enablers to research involvement can be identified but must be addressed within individual and organisational contexts. Multifaceted collective action to minimise challenges and maximise enablers can facilitate clinicians’ involvement in research. Using this approach should enable occupational therapists to increase their research involvement, thus demonstrating the clinical effectiveness of their interventions. PMID:29170592

  5. Development of Abrasion-Resistant Coating for Solar Reflective Films. Cooperative Research and Development Final Report, CRADA Number CRD-07-247

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gray, Matthew

    The purpose of this CRADA is to develop an abrasion-resistant coating, suitable for use on polymeric-based reflective films (e.g., the ReflecTech reflective film), that allows for improved scratch resistance and enables the use of aggressive cleaning techniques (e.g., direct contact methods like brushing) without damaging the specular reflectance properties of the reflective film.

  6. Advances in Digital Calibration Techniques Enabling Real-Time Beamforming SweepSAR Architectures

    NASA Technical Reports Server (NTRS)

    Hoffman, James P.; Perkovic, Dragana; Ghaemi, Hirad; Horst, Stephen; Shaffer, Scott; Veilleux, Louise

    2013-01-01

    Real-time digital beamforming, combined with lightweight, large-aperture reflectors, enables SweepSAR architectures, which promise significant increases in instrument capability for solid earth and biomass remote sensing. These new instrument concepts require new methods for calibrating the multiple channels, which are combined on board in real time. The benefit of this effort is that it enables a new class of lightweight radar architecture, digital beamforming with SweepSAR, providing significantly larger swath coverage than conventional SAR architectures at reduced mass and cost. This paper will review the ongoing development of the digital calibration architecture for a digital beamforming radar instrument, such as the proposed Earth Radar Mission's DESDynI (Deformation, Ecosystem Structure, and Dynamics of Ice) instrument. This proposed instrument's baseline design employs SweepSAR digital beamforming and requires digital calibration. We will review the overall concepts and status of the system architecture, algorithm development, and the digital calibration testbed currently being developed. We will present results from a preliminary hardware demonstration. We will also discuss the challenges and opportunities specific to this novel architecture.

  7. ICAROUS - Integrated Configurable Algorithms for Reliable Operations Of Unmanned Systems

    NASA Technical Reports Server (NTRS)

    Consiglio, María; Muñoz, César; Hagen, George; Narkawicz, Anthony; Balachandran, Swee

    2016-01-01

    NASA's Unmanned Aerial System (UAS) Traffic Management (UTM) project aims at enabling near-term, safe operations of small UAS vehicles in uncontrolled airspace, i.e., Class G airspace. A far-term goal of UTM research and development is to accommodate the expected rise in small UAS traffic density throughout the National Airspace System (NAS) at low altitudes for beyond visual line-of-sight operations. This paper describes a new capability referred to as ICAROUS (Integrated Configurable Algorithms for Reliable Operations of Unmanned Systems), which is being developed under the UTM project. ICAROUS is a software architecture comprised of highly assured algorithms for building safety-centric, autonomous, unmanned aircraft applications. Central to the development of the ICAROUS algorithms is the use of well-established formal methods to guarantee higher levels of safety assurance by monitoring and bounding the behavior of autonomous systems. The core autonomy-enabling capabilities in ICAROUS include constraint conformance monitoring and contingency control functions. ICAROUS also provides a highly configurable user interface that enables the modular integration of mission-specific software components.

  8. ICAROUS: Integrated Configurable Architecture for Unmanned Systems

    NASA Technical Reports Server (NTRS)

    Consiglio, Maria C.

    2016-01-01

    NASA's Unmanned Aerial System (UAS) Traffic Management (UTM) project aims at enabling near-term, safe operations of small UAS vehicles in uncontrolled airspace, i.e., Class G airspace. A far-term goal of UTM research and development is to accommodate the expected rise in small UAS traffic density throughout the National Airspace System (NAS) at low altitudes for beyond visual line-of-sight operations. This video describes a new capability referred to as ICAROUS (Integrated Configurable Algorithms for Reliable Operations of Unmanned Systems), which is being developed under the auspices of the UTM project. ICAROUS is a software architecture composed of highly assured algorithms for building safety-centric, autonomous, unmanned aircraft applications. Central to the development of the ICAROUS algorithms is the use of well-established formal methods to guarantee higher levels of safety assurance by monitoring and bounding the behavior of autonomous systems. The core autonomy-enabling capabilities in ICAROUS include constraint conformance monitoring and autonomous detect and avoid functions. ICAROUS also provides a highly configurable user interface that enables the modular integration of mission-specific software components.

  9. Enantiomeric separation of the antiuremic drug colchicine by electrokinetic chromatography. Method development and quantitative analysis.

    PubMed

    Menéndez-López, Nuria; Valimaña-Traverso, Jesús; Castro-Puyana, María; Salgado, Antonio; García, María Ángeles; Marina, María Luisa

    2017-05-10

    Two analytical methodologies were developed by CE enabling the enantiomeric separation of colchicine, an antiuremic drug commercialized as a pure enantiomer. Succinyl-γ-CD and Sulfated-γ-CD were selected as chiral selectors after a screening of different anionic CDs. Under the optimized conditions, chiral resolutions of 5.6 in 12 min and 3.2 in 8 min were obtained for colchicine with Succinyl-γ-CD and Sulfated-γ-CD, respectively. An opposite enantiomeric migration order was observed with these two CDs, S-colchicine being the first-migrating enantiomer with Succinyl-γ-CD and the second-migrating enantiomer with Sulfated-γ-CD. ¹H NMR experiments showed a 1:1 stoichiometry for the enantiomer-CD complexes in both cases. However, the apparent and averaged equilibrium constants for the enantiomer-CD complexes could be calculated only for Succinyl-γ-CD. The developed methods were applied to the analysis of pharmaceutical formulations, but only the use of Succinyl-γ-CD enabled the detection of a 0.1% enantiomeric impurity in colchicine formulations. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. Simulation of Guided Wave Interaction with In-Plane Fiber Waviness

    NASA Technical Reports Server (NTRS)

    Leckey, Cara A. C.; Juarez, Peter D.

    2016-01-01

    Reducing the timeline for certification of composite materials and enabling the expanded use of advanced composite materials for aerospace applications are two primary goals of NASA's Advanced Composites Project (ACP). A key technical challenge in accomplishing these goals is the development of rapid composite inspection methods with improved defect characterization capabilities. Ongoing work at NASA Langley is focused on expanding ultrasonic simulation capabilities for composite materials. Simulation tools can be used to guide the development of optimal inspection methods. Custom code based on the elastodynamic finite integration technique is currently being developed and implemented to study ultrasonic wave interaction with manufacturing defects, such as in-plane fiber waviness (marcelling). This paper describes details of validation comparisons performed to enable simulation of guided wave propagation in composites containing fiber waviness. Simulation results for guided wave interaction with in-plane fiber waviness are also discussed. The results show that the wavefield is affected by the presence of waviness both on the surface containing the fiber waviness and on the surface opposite the location of the waviness.
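    The elastodynamic finite integration technique itself is beyond a short sketch, but the underlying idea, time-stepping a wavefield through a medium whose local properties change at a defect, can be shown with a minimal 1-D finite-difference scheme. The grid sizes, wave speeds, and the slowed "defect" region below are invented for illustration and are not the NASA Langley code.

```python
import numpy as np

# Minimal 1-D wave equation u_tt = c(x)^2 u_xx, explicit finite differences.
# A region of reduced wave speed stands in for a local defect (e.g. waviness).
nx, nt = 200, 300
dx, dt = 1.0, 0.4            # CFL condition: c*dt/dx <= 1 for stability
c = np.full(nx, 1.0)
c[120:140] = 0.6             # "defect": locally slower medium

u_prev = np.zeros(nx)
u = np.zeros(nx)
u[50] = 1.0                  # initial displacement pulse

for _ in range(nt):
    u_next = np.zeros(nx)    # fixed (zero) ends act as reflecting boundaries
    u_next[1:-1] = (2 * u[1:-1] - u_prev[1:-1]
                    + (c[1:-1] * dt / dx) ** 2
                    * (u[2:] - 2 * u[1:-1] + u[:-2]))
    u_prev, u = u, u_next

print(float(np.max(np.abs(u))))  # field remains finite and nonzero
```

    Full guided-wave simulations extend this idea to 3-D elastic tensors and anisotropic, layered composite properties, which is what makes validation against experiments essential.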

  11. Simulation of guided wave interaction with in-plane fiber waviness

    NASA Astrophysics Data System (ADS)

    Leckey, Cara A. C.; Juarez, Peter D.

    2017-02-01

    Reducing the timeline for certification of composite materials and enabling the expanded use of advanced composite materials for aerospace applications are two primary goals of NASA's Advanced Composites Project (ACP). A key technical challenge in accomplishing these goals is the development of rapid composite inspection methods with improved defect characterization capabilities. Ongoing work at NASA Langley is focused on expanding ultrasonic simulation capabilities for composite materials. Simulation tools can be used to guide the development of optimal inspection methods. Custom code based on the elastodynamic finite integration technique is currently being developed and implemented to study ultrasonic wave interaction with manufacturing defects, such as in-plane fiber waviness (marcelling). This paper describes details of validation comparisons performed to enable simulation of guided wave propagation in composites containing fiber waviness. Simulation results for guided wave interaction with in-plane fiber waviness are also discussed. The results show that the wavefield is affected by the presence of waviness both on the surface containing the fiber waviness and on the surface opposite the location of the waviness.

  12. Simple Web-based interactive key development software (WEBiKEY) and an example key for Kuruna (Poaceae: Bambusoideae)1

    PubMed Central

    Attigala, Lakshmi; De Silva, Nuwan I.; Clark, Lynn G.

    2016-01-01

    Premise of the study: Programs that are user-friendly and freely available for developing Web-based interactive keys are scarce and most of the well-structured applications are relatively expensive. WEBiKEY was developed to enable researchers to easily develop their own Web-based interactive keys with fewer resources. Methods and Results: A Web-based multiaccess identification tool (WEBiKEY) was developed that uses freely available Microsoft ASP.NET technologies and an SQL Server database for Windows-based hosting environments. WEBiKEY was tested for its usability with a sample data set, the temperate woody bamboo genus Kuruna (Poaceae). Conclusions: WEBiKEY is freely available to the public and can be used to develop Web-based interactive keys for any group of species. The interactive key we developed for Kuruna using WEBiKEY enables users to visually inspect characteristics of Kuruna and identify an unknown specimen as one of seven possible species in the genus. PMID:27144109
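    WEBiKEY itself is built on ASP.NET and SQL Server, but the essence of a multiaccess key, filtering taxa by any combination of observed character states entered in any order, can be sketched in a few lines. The character matrix below is hypothetical and is not the Kuruna data set.

```python
# Hypothetical character matrix: taxon -> {character: state}.
matrix = {
    "Species A": {"culm height": "tall", "leaf margin": "smooth"},
    "Species B": {"culm height": "short", "leaf margin": "smooth"},
    "Species C": {"culm height": "tall", "leaf margin": "serrate"},
}

def filter_taxa(matrix, observations):
    """Keep only taxa consistent with every observed character state."""
    return sorted(
        taxon for taxon, states in matrix.items()
        if all(states.get(ch) == st for ch, st in observations.items())
    )

print(filter_taxa(matrix, {"culm height": "tall"}))
# -> ['Species A', 'Species C']
print(filter_taxa(matrix, {"culm height": "tall", "leaf margin": "serrate"}))
# -> ['Species C']
```

    Unlike a dichotomous key, this approach never forces the user through a fixed sequence of couplets: any observable character can narrow the candidate set at any step.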

  13. UKRmol: a low-energy electron- and positron-molecule scattering suite

    NASA Astrophysics Data System (ADS)

    Carr, J. M.; Galiatsatos, P. G.; Gorfinkiel, J. D.; Harvey, A. G.; Lysaght, M. A.; Madden, D.; Mašín, Z.; Plummer, M.; Tennyson, J.; Varambhia, H. N.

    2012-03-01

    We describe the UK computational implementation of the R-matrix method for the treatment of electron and positron scattering from molecules. Recent developments in the UKRmol suite are detailed together with the collision processes it is enabling us to treat.

  14. Identification of Hotspots of Genetic Risk for Type 2 Diabetes Using GIS Methods

    EPA Science Inventory

    BACKGROUND: Having the ability to scan the entire country for potential "hotspots" with increased risk of developing chronic diseases due to various environmental, demographic, and genetic susceptibility factors may inform risk management decisions and enable better env...

  15. Exclusion-Based Capture and Enumeration of CD4+ T Cells from Whole Blood for Low-Resource Settings.

    PubMed

    Howard, Alexander L; Pezzi, Hannah M; Beebe, David J; Berry, Scott M

    2014-06-01

    In developing countries, demand exists for a cost-effective method to evaluate human immunodeficiency virus patients' CD4+ T-helper cell count. The TH (CD4+) cell count is the current marker used to identify when an HIV patient has progressed to acquired immunodeficiency syndrome, which results when the immune system can no longer prevent certain opportunistic infections. A system to perform the TH count that obviates the use of costly flow cytometry will enable physicians to more closely follow patients' disease progression and response to therapy in areas where such advanced equipment is unavailable. Our system of two serially operated immiscible-phase exclusion-based cell isolations, coupled with a rapid fluorescent readout, enables exclusion-based isolation and accurate counting of T-helper cells at lower cost and from a smaller volume of blood than previous methods. TH cell isolation via immiscible filtration assisted by surface tension (IFAST) compares well against the established Dynal T4 Quant Kit and is sensitive at CD4 counts representative of immunocompromised patients (fewer than 200 TH cells per microliter of blood). Our technique retains the use of open, simple-to-operate devices that enable IFAST as a high-throughput, automatable sample preparation method, improving throughput over previous low-resource methods. © 2013 Society for Laboratory Automation and Screening.
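    The readout step reduces to converting a fluorescent cell count from a known blood volume into a concentration and comparing it with the 200 cells/µL threshold. A minimal sketch, where the counted cells, blood volume, and capture-efficiency correction are illustrative values rather than figures from the paper:

```python
def cd4_per_microliter(cells_counted, blood_volume_ul, recovery_fraction=1.0):
    """Convert a fluorescent cell count from a known blood volume into a
    CD4 concentration (cells/uL), correcting for capture efficiency."""
    if blood_volume_ul <= 0 or not 0 < recovery_fraction <= 1:
        raise ValueError("invalid volume or recovery fraction")
    return cells_counted / (blood_volume_ul * recovery_fraction)

def is_immunocompromised(cd4_count, threshold=200):
    """AIDS-defining threshold cited in the abstract: < 200 cells/uL."""
    return cd4_count < threshold

count = cd4_per_microliter(cells_counted=1425, blood_volume_ul=10,
                           recovery_fraction=0.95)
print(round(count), is_immunocompromised(count))  # -> 150 True
```

    The recovery-fraction correction matters because any isolation method that captures less than 100% of TH cells would otherwise systematically undercount near the clinical threshold.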

  16. Spatial characterization of the meltwater field from icebergs in the Weddell Sea.

    PubMed

    Helly, John J; Kaufmann, Ronald S; Vernet, Maria; Stephenson, Gordon R

    2011-04-05

    We describe the results from a spatial cyberinfrastructure developed to characterize the meltwater field around individual icebergs and integrate the results with regional- and global-scale data. During the course of the cyberinfrastructure development, it became clear that we were also building an integrated sampling planning capability across multidisciplinary teams that provided greater agility in allocating expedition resources resulting in new scientific insights. The cyberinfrastructure-enabled method is a complement to the conventional methods of hydrographic sampling in which the ship provides a static platform on a station-by-station basis. We adapted a sea-floor mapping method to more rapidly characterize the sea surface geophysically and biologically. By jointly analyzing the multisource, continuously sampled biological, chemical, and physical parameters, using Global Positioning System time as the data fusion key, this surface-mapping method enables us to examine the relationship between the meltwater field of the iceberg to the larger-scale marine ecosystem of the Southern Ocean. Through geospatial data fusion, we are able to combine very fine-scale maps of dynamic processes with more synoptic but lower-resolution data from satellite systems. Our results illustrate the importance of spatial cyberinfrastructure in the overall scientific enterprise and identify key interfaces and sources of error that require improved controls for the development of future Earth observing systems as we move into an era of peta- and exascale, data-intensive computing.
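    The data-fusion step described above, joining continuously sampled measurements to position fixes using GPS time as the key, can be sketched as a nearest-timestamp merge. The timestamps, coordinates, and sensor values below are invented for illustration, not expedition data.

```python
import bisect

def fuse_on_gps_time(track, samples, max_dt=1.0):
    """Join each sensor sample to the nearest GPS position fix by timestamp.
    track: time-sorted list of (t, lat, lon); samples: list of (t, value).
    Samples with no fix within max_dt seconds are dropped."""
    times = [t for t, _, _ in track]
    fused = []
    for t, value in samples:
        i = bisect.bisect_left(times, t)
        # candidate fixes: the one just before and the one just after t
        best = min(
            (j for j in (i - 1, i) if 0 <= j < len(times)),
            key=lambda j: abs(times[j] - t),
        )
        if abs(times[best] - t) <= max_dt:
            _, lat, lon = track[best]
            fused.append((t, lat, lon, value))
    return fused

track = [(0.0, -63.0, -52.0), (10.0, -63.0, -52.1), (20.0, -63.1, -52.2)]
samples = [(9.5, 33.8), (19.8, 34.1), (47.0, 34.4)]  # last has no nearby fix
fused = fuse_on_gps_time(track, samples)
print(fused)
```

    A tolerance like max_dt makes the sources of error explicit: samples that cannot be georeferenced within the tolerance are excluded rather than silently mislocated.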

  17. Spatial characterization of the meltwater field from icebergs in the Weddell Sea

    PubMed Central

    Helly, John J.; Kaufmann, Ronald S.; Vernet, Maria; Stephenson, Gordon R.

    2011-01-01

    We describe the results from a spatial cyberinfrastructure developed to characterize the meltwater field around individual icebergs and integrate the results with regional- and global-scale data. During the course of the cyberinfrastructure development, it became clear that we were also building an integrated sampling planning capability across multidisciplinary teams that provided greater agility in allocating expedition resources resulting in new scientific insights. The cyberinfrastructure-enabled method is a complement to the conventional methods of hydrographic sampling in which the ship provides a static platform on a station-by-station basis. We adapted a sea-floor mapping method to more rapidly characterize the sea surface geophysically and biologically. By jointly analyzing the multisource, continuously sampled biological, chemical, and physical parameters, using Global Positioning System time as the data fusion key, this surface-mapping method enables us to examine the relationship between the meltwater field of the iceberg to the larger-scale marine ecosystem of the Southern Ocean. Through geospatial data fusion, we are able to combine very fine-scale maps of dynamic processes with more synoptic but lower-resolution data from satellite systems. Our results illustrate the importance of spatial cyberinfrastructure in the overall scientific enterprise and identify key interfaces and sources of error that require improved controls for the development of future Earth observing systems as we move into an era of peta- and exascale, data-intensive computing. PMID:21444769

  18. A Hybrid Numerical Method for Turbulent Mixing Layers. Degree awarded by Case Western Reserve Univ.

    NASA Technical Reports Server (NTRS)

    Georgiadis, Nicholas J.

    2001-01-01

    A hybrid method has been developed for simulations of compressible turbulent mixing layers. Such mixing layers dominate the flows in exhaust systems of modern-day aircraft and also those of hypersonic vehicles currently under development. The method is intended for configurations in which a dominant structural feature provides an unsteady mechanism to drive the turbulent development in the mixing layer. The hybrid method uses a Reynolds-averaged Navier-Stokes (RANS) procedure to calculate wall-bounded regions entering a mixing section, and a Large Eddy Simulation (LES) procedure to calculate the mixing-dominated regions. A numerical technique was developed to enable the use of the hybrid RANS-LES method on stretched, non-Cartesian grids. Closure for the RANS equations was obtained using the Cebeci-Smith algebraic turbulence model in conjunction with the wall-function approach of Ota and Goldberg. The wall-function approach enabled a continuous computational grid from the RANS regions to the LES region. The LES equations were closed using the Smagorinsky subgrid-scale model. The hybrid RANS-LES method is applied to a benchmark compressible mixing layer experiment. Preliminary two-dimensional calculations are used to investigate the effects of axial grid density and boundary conditions. Vortex shedding from the base region of a splitter plate separating the upstream flows was observed to eventually transition to turbulence. The location of the transition, however, was much further downstream than indicated by experiments. Actual LES calculations, performed in three spatial directions, also indicated vortex shedding, but the transition to turbulence was found to occur much closer to the beginning of the mixing section, which is in agreement with experimental observations. These calculations demonstrated that LES simulations must be performed in three dimensions. Comparisons of time-averaged axial velocities and turbulence intensities indicated reasonable agreement with experimental data.
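    The Smagorinsky subgrid-scale closure used for the LES region has a compact algebraic form, nu_t = (Cs·Δ)²·|S|. A one-function sketch with illustrative numbers (the strain-rate magnitude, grid spacing, and Cs value below are not taken from the thesis):

```python
def smagorinsky_nu_t(strain_rate_mag, delta, cs=0.17):
    """Smagorinsky subgrid-scale eddy viscosity: nu_t = (Cs * delta)^2 * |S|.
    strain_rate_mag: magnitude of the resolved strain-rate tensor [1/s]
    delta: local filter width, typically tied to the grid spacing [m]
    cs: Smagorinsky constant (commonly quoted near 0.1-0.2, flow dependent)."""
    return (cs * delta) ** 2 * strain_rate_mag

# e.g. |S| = 500 1/s resolved on a 2 mm cell
nu_t = smagorinsky_nu_t(500.0, 0.002)
print(nu_t)  # eddy viscosity in m^2/s
```

    Because Δ follows the local grid spacing, the model's dissipation varies across a stretched, non-Cartesian grid, which is one reason grid design matters in the LES region.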

  19. Enabling fast charging - A battery technology gap assessment

    NASA Astrophysics Data System (ADS)

    Ahmed, Shabbir; Bloom, Ira; Jansen, Andrew N.; Tanim, Tanvir; Dufek, Eric J.; Pesaran, Ahmad; Burnham, Andrew; Carlson, Richard B.; Dias, Fernando; Hardy, Keith; Keyser, Matthew; Kreuzer, Cory; Markel, Anthony; Meintz, Andrew; Michelbacher, Christopher; Mohanpurkar, Manish; Nelson, Paul A.; Robertson, David C.; Scoffield, Don; Shirk, Matthew; Stephens, Thomas; Vijayagopal, Ram; Zhang, Jiucai

    2017-11-01

    The battery technology literature is reviewed, with an emphasis on key elements that limit extreme fast charging. Key gaps in existing elements of the technology are presented, as well as developmental needs. Among these needs are advanced models and methods to detect and prevent lithium plating; new positive-electrode materials that are less prone to stress-induced failure; better electrode designs to accommodate very rapid diffusion in and out of the electrode; methods to measure temperature distributions during fast charge to enable and validate models; and thermal management and pack designs to accommodate the higher operating voltage.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mendez Cruz, Carmen Margarita; Rochau, Gary E.; Middleton, Bobby

    Sandia National Laboratories and General Atomics are pleased to respond to the Advanced Research Projects Agency-Energy (ARPA-E)'s request for information on innovative developments that may overcome various current reactor-technology limitations. The RFI is particularly interested in innovations that enable ultra-safe and secure modular nuclear energy systems. Our response addresses the specific features for reactor designs called out in the RFI, including a brief assessment of the current state of the technologies that would enable each feature and the methods by which they could best be incorporated into a reactor design.

  1. A Descriptive Examination of the Types of Relationships Formed between Children with Developmental Disability and Their Closest Peers in Inclusive School Settings

    ERIC Educational Resources Information Center

    Webster, Amanda A.; Carter, Mark

    2013-01-01

    Background: One of the most commonly cited rationales for inclusive education is to enable the development of quality relationships with typically developing peers. Relatively few researchers have examined the features of the range of relationships that children with developmental disability form in inclusive school settings. Method: Interviews…

  2. Lake Erie...A Day in the Life of a Fish.

    ERIC Educational Resources Information Center

    Canning, Maureen; Dunlevy, Margie

    This elementary school teaching unit was developed as a part of a series of units that deal with Lake Erie. This unit was developed to enable children to: (1) examine a moving fish; (2) conduct experiments with a live fish; (3) understand the swimming habits of fish; (4) learn how fish breathe; (5) recognize different methods of fish protection…

  3. Improved Sand-Compaction Method for Lost-Foam Metal Casting

    NASA Technical Reports Server (NTRS)

    Bakhtiyarov, Sayavur I.; Overfelt, Ruel A.

    2008-01-01

    An improved method of filling a molding flask with sand and compacting the sand around a refractory-coated foam mold pattern has been developed for incorporation into the lost-foam metal-casting process. In comparison with the conventional method of sand filling and compaction, this method affords more nearly complete filling of the space around the refractory-coated foam mold pattern and more thorough compaction of the sand. In so doing, this method enables the sand to better support the refractory coat under metallostatic pressure during filling of the mold with molten metal.

  4. Optimization of end-pumped, actively Q-switched quasi-III-level lasers.

    PubMed

    Jabczynski, Jan K; Gorajek, Lukasz; Kwiatkowski, Jacek; Kaskow, Mateusz; Zendzian, Waldemar

    2011-08-15

    A new model of end-pumped quasi-three-level lasers was developed that accounts for transient pumping processes, ground-state depletion, and up-conversion effects. The model consists of two parts, a pumping stage and a Q-switched stage, which can be separated in the case of an active Q-switching regime. For the pumping stage, a semi-analytical model was developed, enabling calculation of the final population of the upper laser level for a given pump power and duration, spatial profile of the pump beam, and length and dopant level of the gain medium. For quasi-stationary inversion, an optimization procedure for the Q-switching regime based on the Lagrange multiplier technique was developed. A new approach to the optimization of the CW regime of quasi-three-level lasers was developed to optimize Q-switched lasers operating at high repetition rates. Both optimization methods enable calculation of the optimal absorbance of the gain medium and the output losses for a given pump rate. © 2011 Optical Society of America

  5. Computational Electromagnetics Application to Small Geometric Anomalies and Associated Uncertainty Evaluation

    DTIC Science & Technology

    2010-02-28

    implemented a fast method to enable the statistical characterization of electromagnetic interference and compatibility (EMI/EMC) phenomena on electrically... higher accuracy is needed, e.g., to compute higher-moment statistics. To address this problem, we have developed adaptive stochastic collocation methods... Sponsoring/Monitoring Agency: AF Office of Scientific Research, Arlington, VA.

  6. Microbiological Quality and Food Safety of Plants Grown on ISS Project

    NASA Technical Reports Server (NTRS)

    Wheeler, Raymond M. (Compiler)

    2014-01-01

    The goal of this project is to select and advance methods to enable real-time sampling, microbiological analysis, and sanitation of crops grown on the International Space Station (ISS). These methods would validate the microbiological quality of crops grown for consumption to ensure safe and palatable fresh foods. This would be achieved through the development / advancement of microbiological sample collection, rapid pathogen detection and effective sanitation methods that are compatible with a microgravity environment.

  7. Discovering charge density functionals and structure-property relationships with PROPhet: A general framework for coupling machine learning and first-principles methods

    DOE PAGES

    Kolb, Brian; Lentz, Levi C.; Kolpak, Alexie M.

    2017-04-26

    Modern ab initio methods have rapidly increased our understanding of solid state materials properties, chemical reactions, and the quantum interactions between atoms. However, poor scaling often renders direct ab initio calculations intractable for large or complex systems. There are two obvious avenues through which to remedy this problem: (i) develop new, less expensive methods to calculate system properties, or (ii) make existing methods faster. This paper describes an open source framework designed to pursue both of these avenues. PROPhet (short for PROPerty Prophet) utilizes machine learning techniques to find complex, non-linear mappings between sets of material or system properties. The result is a single code capable of learning analytical potentials, non-linear density functionals, and other structure-property or property-property relationships. These capabilities enable highly accurate mesoscopic simulations, facilitate computation of expensive properties, and enable the development of predictive models for systematic materials design and optimization. Here, this work explores the coupling of machine learning to ab initio methods through means both familiar (e.g., the creation of various potentials and energy functionals) and less familiar (e.g., the creation of density functionals for arbitrary properties), serving both to demonstrate PROPhet's ability to create exciting post-processing analysis tools and to open the door to improving ab initio methods themselves with these powerful machine learning techniques.
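    PROPhet uses neural networks trained on ab initio data; the bare idea of learning a non-linear property-property mapping can be shown with a toy fit, here a polynomial-feature least-squares regression standing in for a network. The "hidden" relationship and all numbers are synthetic.

```python
import numpy as np

# Toy property-property mapping: learn y = f(x) from sampled pairs.
x = np.linspace(-1, 1, 50)
y = 0.5 * x**3 - x + 0.2           # synthetic "hidden" relationship

# Non-linear basis + linear least squares = a minimal learned mapping.
features = np.vander(x, 5)          # columns [x^4, x^3, x^2, x, 1]
coef, *_ = np.linalg.lstsq(features, y, rcond=None)

# Predict the property at an unseen input value.
y_pred = np.vander(np.array([0.5]), 5) @ coef
print(float(y_pred[0]))             # close to 0.5*0.125 - 0.5 + 0.2 = -0.2375
```

    Once such a surrogate mapping is trained, evaluating it costs microseconds, which is what makes replacing expensive ab initio evaluations attractive for mesoscopic simulation and materials screening.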

  8. Discovering charge density functionals and structure-property relationships with PROPhet: A general framework for coupling machine learning and first-principles methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kolb, Brian; Lentz, Levi C.; Kolpak, Alexie M.

    Modern ab initio methods have rapidly increased our understanding of solid state materials properties, chemical reactions, and the quantum interactions between atoms. However, poor scaling often renders direct ab initio calculations intractable for large or complex systems. There are two obvious avenues through which to remedy this problem: (i) develop new, less expensive methods to calculate system properties, or (ii) make existing methods faster. This paper describes an open source framework designed to pursue both of these avenues. PROPhet (short for PROPerty Prophet) utilizes machine learning techniques to find complex, non-linear mappings between sets of material or system properties. The result is a single code capable of learning analytical potentials, non-linear density functionals, and other structure-property or property-property relationships. These capabilities enable highly accurate mesoscopic simulations, facilitate computation of expensive properties, and enable the development of predictive models for systematic materials design and optimization. Here, this work explores the coupling of machine learning to ab initio methods through means both familiar (e.g., the creation of various potentials and energy functionals) and less familiar (e.g., the creation of density functionals for arbitrary properties), serving both to demonstrate PROPhet's ability to create exciting post-processing analysis tools and to open the door to improving ab initio methods themselves with these powerful machine learning techniques.

  9. Assessing the Potential of Metal-Assisted Imaging Mass Spectrometry in Cancer Research.

    PubMed

    Dufresne, M; Patterson, N H; Lauzon, N; Chaurand, P

    2017-01-01

    In the last decade, imaging mass spectrometry (IMS) has been the primary tool for biomolecular imaging. While it is possible to map a wide range of biomolecules using matrix-assisted laser desorption/ionization IMS, from high-molecular-weight proteins to small metabolites, more often than not only the most abundant, easily ionisable species are detected. To better understand complex diseases such as cancer, more specific and sensitive methods need to be developed to enable the detection of lower-abundance molecules, but also of molecules that have yet to be imaged by IMS. In recent years, a big shift has occurred in the imaging community from developing wide-reaching methods to developing targeted ones, which increases sensitivity through the use of more specific sample preparations. This has been primarily marked by the advent of solvent-free matrix deposition methods for polar lipids, chemical derivatization for hormones and metabolites, and the use of alternative ionization agents for neutral lipids. In this chapter, we discuss two of the latest sample preparations, which exploit the use of alternative ionization agents to enable the detection of certain classes of neutral lipids along with free fatty acids by high-sensitivity IMS, as demonstrated within our lab. © 2017 Elsevier Inc. All rights reserved.

  10. Users perspectives on interactive distance technology enabling home-based motor training for stroke patients.

    PubMed

    Ehn, Maria; Hansson, Pär; Sjölinder, Marie; Boman, Inga-Lill; Folke, Mia; Sommerfeld, Disa; Borg, Jörgen; Palmcrantz, Susanne

    2015-01-01

    The aim of this work has been to develop technical support enabling home-based motor training after stroke. The basis for the work plan has been to develop an interactive technical solution supporting three different groups of stroke patients: (1) patients with stroke discharged from hospital with support from a neuro team; (2) patients with stroke whose support from the neuro team will be phased out; and (3) patients living with impaired motor functions long-term. The technology has been developed in close collaboration with end-users, using a method earlier evaluated and described [12]. This paper describes the main functions of the developed technology, together with results from early user tests with end-users, performed to identify improvements to be carried out during further technical development. The developed technology will be tested further in a pilot study of the safety and usefulness of the technology when applied as a support for motor training in three different phases of the post-stroke rehabilitation process.

  11. Whole embryo culture: a "New" technique that enabled decades of mechanistic discoveries.

    PubMed

    Ellis-Hutchings, Robert G; Carney, Edward W

    2010-08-01

    Denis New's development of the rodent whole embryo culture (WEC) method in the early 1960s was a groundbreaking achievement that gave embryologists and teratologists an unprecedented degree of access to the developing postimplantation rodent embryo. In the five decades since its development, WEC has enabled detailed investigations into the regulation of normal embryo development as well as a plethora of research on mechanisms of teratogenesis as induced by a wide range of agents. In addition, WEC is one of the few techniques that has been validated for use in teratogenicity screening of drugs and chemicals. In this review, we retrace the steps leading to New's development of WEC, and highlight many examples in which WEC played a crucial role leading to important discoveries in teratological research. The impact of WEC on the field of teratology has been enormous, and it is anticipated that WEC will remain a preferred tool for teratologists and embryologists seeking to interrogate embryo development for many years to come. Copyright 2010 Wiley-Liss, Inc.

  12. Learning from Collaborative New Product Development Projects

    ERIC Educational Resources Information Center

    Kleinsmann, Maaike; Valkenburg, Rianne

    2005-01-01

    Purpose--In an empirical study learning opportunities were identified. Learning opportunities are enablers or disablers for the achievement of shared understanding. Design/methodology/approach--Actors were interviewed about their communication process. The learning history method was used to analyze and structure the data. From the learning…

  13. Community Air Sensor Network (CAIRSENSE) Project: Lower Cost, Continuous Ambient Monitoring Methods

    EPA Science Inventory

    Advances in air pollution sensor technology have enabled the development of small and low cost systems to measure outdoor air pollution. The deployment of numerous sensors across a small geographic area would have potential benefits to supplement existing monitoring networks and ...

  14. Measurement in Physical Education. 5th Edition.

    ERIC Educational Resources Information Center

    Mathews, Donald K.

    Concepts of measurement in physical education are presented in this college-level text to enable the preservice physical education major to develop skills in determining pupil status, designing effective physical activity programs, and measuring student progress. Emphasis is placed upon discussion of essential statistical methods, test…

  15. A Nontraditional Education Model with Indian Indigenous Social Service Workers.

    ERIC Educational Resources Information Center

    Kelley, M. L.; Nelson, C. H.

    1986-01-01

    Describes educational processes to enable non-Indian social work educators to support development of Indian social service workers. Suggests holistic/ecological/systems perspective, facilitator/mentor role, mutuality, maximizing differences, empowerment, and structural approach. Discusses effective helping methods and roles for Indian social…

  16. Applications in Educational Assessment: Future Technologies.

    ERIC Educational Resources Information Center

    Bank Street Coll. of Education, New York, NY. Center for Children and Technology.

    The development of improved and alternative methods of educational assessment should take advantage of technologies that enable different aspects of learning, teaching, and student achievement to be part of an improved assessment system. The current understanding of knowledge assessment, new approaches to assessment, and technologies that may…

  17. FIELD AND LABORATORY PERFORMANCE CHARACTERISTICS OF A NEW SAMPLING PROTOCOL FOR RIVERINE MACROINVERTEBRATE ASSEMBLAGES

    EPA Science Inventory

    Measurement and estimation of performance characteristics (i.e., precision, bias, performance range, interferences and sensitivity) are often neglected in the development and use of new biological sampling methods. However, knowledge of this information is critical in enabling p...

  18. A stepwise, multi-objective, multi-variable parameter optimization method for the APEX model

    USDA-ARS?s Scientific Manuscript database

    Proper parameterization enables hydrological models to make reliable estimates of non-point source pollution for effective control measures. The automatic calibration of hydrologic models requires significant computational power limiting its application. The study objective was to develop and eval...

  19. Nanostructure Neutron Converter Layer Development

    NASA Technical Reports Server (NTRS)

    Park, Cheol (Inventor); Lowther, Sharon E. (Inventor); Kang, Jin Ho (Inventor); Thibeault, Sheila A. (Inventor); Sauti, Godfrey (Inventor); Bryant, Robert G. (Inventor)

    2016-01-01

    Methods for making a neutron converter layer are provided. The various embodiment methods enable the formation of a single layer neutron converter material. The single layer neutron converter material formed according to the various embodiments may have a high neutron absorption cross section, tailored resistivity providing a good electric field penetration with submicron particles, and a high secondary electron emission coefficient. In an embodiment method a neutron converter layer may be formed by sequential supercritical fluid metallization of a porous nanostructure aerogel or polyimide film. In another embodiment method a neutron converter layer may be formed by simultaneous supercritical fluid metallization of a porous nanostructure aerogel or polyimide film. In a further embodiment method a neutron converter layer may be formed by in-situ metalized aerogel nanostructure development.

  20. Enabling scientific workflows in virtual reality

    USGS Publications Warehouse

    Kreylos, O.; Bawden, G.; Bernardin, T.; Billen, M.I.; Cowgill, E.S.; Gold, R.D.; Hamann, B.; Jadamec, M.; Kellogg, L.H.; Staadt, O.G.; Sumner, D.Y.

    2006-01-01

    To advance research and improve the scientific return on data collection and interpretation efforts in the geosciences, we have developed methods of interactive visualization, with a special focus on immersive virtual reality (VR) environments. Earth sciences employ a strongly visual approach to the measurement and analysis of geologic data due to the spatial and temporal scales over which such data range. As observations and simulations increase in size and complexity, the Earth sciences are challenged to manage and interpret increasing amounts of data. Reaping the full intellectual benefits of immersive VR requires us to tailor exploratory approaches to scientific problems. These applications build on the visualization method's strengths, using both 3D perception and interaction with data and models, to take advantage of the skills and training of the geological scientists exploring their data in the VR environment. This interactive approach has enabled us to develop a suite of tools that are adaptable to a range of problems in the geosciences and beyond. Copyright © 2008 by the Association for Computing Machinery, Inc.

  1. Transient-Free Operations With Physics-Based Real-time Analysis and Control

    NASA Astrophysics Data System (ADS)

    Kolemen, Egemen; Burrell, Keith; Eggert, William; Eldon, David; Ferron, John; Glasser, Alex; Humphreys, David

    2016-10-01

    In order to understand and predict disruptions, the two most common methods currently employed in tokamak analysis are the time-consuming ``kinetic EFITs,'' which are done offline with significant human involvement, and the search for correlations with global precursors using various parameterization techniques. We are developing automated ``kinetic EFITs'' at DIII-D to enable calculation of the stability as the plasma evolves close to the disruption. This allows us to quantify the probabilistic nature of the stability calculations and provides a stability metric for all possible linear perturbations to the plasma. This study also provides insight into how the control system can avoid the unstable operating space, which is critical for high-performance operations close to stability thresholds at ITER. A novel, efficient ideal stability calculation method and new real-time CER acquisition system are being developed, and a new 77-core server has been installed on the DIII-D PCS to enable experimental use. Sponsored by US DOE under DE-SC0015878 and DE-FC02-04ER54698.

  2. State criminal justice telecommunications (STACOM). Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    Fielding, J. E.; Frewing, H. K.; Lee, J. J.; Leflang, W. G.; Reilly, N. B.

    1977-01-01

    Techniques for identifying user requirements and network designs for criminal justice networks on a state wide basis are discussed. Topics covered include: methods for determining data required; data collection and survey; data organization procedures, and methods for forecasting network traffic volumes. Developed network design techniques center around a computerized topology program which enables the user to generate least cost network topologies that satisfy network traffic requirements, response time requirements and other specified functional requirements. The developed techniques were applied in Texas and Ohio, and results of these studies are presented.
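
    The least-cost topology generation described above can be illustrated with a minimum-spanning-tree sketch. This is an assumption for illustration only: the actual STACOM topology program also enforces traffic-volume and response-time constraints, which this toy version omits.

```python
# Illustrative least-cost network topology via Prim's minimum-spanning-tree
# algorithm (not the STACOM program itself; real designs must also satisfy
# traffic and response-time requirements).
import heapq

def least_cost_topology(n_nodes, links):
    """links: iterable of (cost, a, b) tuples; returns (total_cost, chosen_edges)."""
    adj = {i: [] for i in range(n_nodes)}
    for cost, a, b in links:
        adj[a].append((cost, a, b))
        adj[b].append((cost, b, a))
    visited = {0}                      # grow the tree outward from node 0
    frontier = list(adj[0])
    heapq.heapify(frontier)
    total, edges = 0, []
    while frontier and len(visited) < n_nodes:
        cost, a, b = heapq.heappop(frontier)
        if b in visited:
            continue                   # would form a cycle; skip
        visited.add(b)
        total += cost
        edges.append((a, b))
        for e in adj[b]:
            heapq.heappush(frontier, e)
    return total, edges
```

    For a five-link, four-node example the function returns the three cheapest links that still connect every node.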

  3. Developing porous ceramics on the base of zirconia oxide with thin and permeable pores by crystallization of organic additive method

    NASA Astrophysics Data System (ADS)

    Kamyshnaya, K. S.; Khabas, T. A.

    2016-11-01

    In this paper, porous ceramics based on ZrO2 nanopowders and micropowders have been developed by the freeze-casting method. A zirconia/carbamide slurry was frozen in a mold and dehydrated in CaCl2 at room temperature. This simple process enabled the formation of porous ceramics with highly aligned pores as a replica of the carbamide crystals. The samples showed a high porosity of 47.9%. In addition, these materials could be used as membranes for air cleaning.

  4. Load Disaggregation Technologies: Real World and Laboratory Performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mayhorn, Ebony T.; Sullivan, Greg P.; Petersen, Joseph M.

    Low cost interval metering and communication technology improvements over the past ten years have enabled the maturity of load disaggregation (or non-intrusive load monitoring) technologies to better estimate and report energy consumption of individual end-use loads. With the appropriate performance characteristics, these technologies have the potential to enable many utility and customer facing applications such as billing transparency, itemized demand and energy consumption, appliance diagnostics, commissioning, energy efficiency savings verification, load shape research, and demand response measurement. However, there has been much skepticism concerning the ability of load disaggregation products to accurately identify and estimate energy consumption of end-uses, which has hindered wide-spread market adoption. A contributing factor is that common test methods and metrics are not available to evaluate performance without having to perform large scale field demonstrations and pilots, which can be costly when developing such products. Without common and cost-effective methods of evaluation, more developed disaggregation technologies will continue to be slow to market and potential users will remain uncertain about their capabilities. This paper reviews recent field studies and laboratory tests of disaggregation technologies. Several factors are identified that are important to consider in test protocols, so that the results reflect real world performance. Potential metrics are examined to highlight their effectiveness in quantifying disaggregation performance. This analysis is then used to suggest performance metrics that are meaningful and of value to potential users and that will enable researchers/developers to identify beneficial ways to improve their technologies.
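
    Two metrics of the kind such an analysis weighs, event-detection F1 and total energy correctly assigned (TECA), can be sketched as follows. These are illustrative implementations of metrics common in the disaggregation literature; the paper surveys candidates rather than prescribing these two.

```python
# Sketch of two common load-disaggregation evaluation metrics (assumed for
# illustration): event-detection F1 and total energy correctly assigned (TECA).

def f1_score(true_events, detected_events, tolerance=1):
    """Match detected event timestamps to true ones within +/- tolerance samples."""
    unmatched = list(true_events)
    tp = 0
    for d in detected_events:
        hit = next((t for t in unmatched if abs(t - d) <= tolerance), None)
        if hit is not None:
            unmatched.remove(hit)      # each true event may be matched once
            tp += 1
    fp = len(detected_events) - tp
    fn = len(true_events) - tp
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return 2 * precision * recall / (precision + recall) if precision + recall else 0.0

def teca(true_power, est_power):
    """true_power/est_power: {appliance: [per-interval kWh]}; 1.0 means perfect."""
    err = sum(abs(t - e)
              for app in true_power
              for t, e in zip(true_power[app], est_power[app]))
    total = sum(sum(v) for v in true_power.values())
    return 1.0 - err / (2.0 * total)
```

    TECA rewards correctly apportioned energy even when individual switching events are missed, so the two metrics capture complementary aspects of performance.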

  5. Microscale High-Throughput Experimentation as an Enabling Technology in Drug Discovery: Application in the Discovery of (Piperidinyl)pyridinyl-1H-benzimidazole Diacylglycerol Acyltransferase 1 Inhibitors.

    PubMed

    Cernak, Tim; Gesmundo, Nathan J; Dykstra, Kevin; Yu, Yang; Wu, Zhicai; Shi, Zhi-Cai; Vachal, Petr; Sperbeck, Donald; He, Shuwen; Murphy, Beth Ann; Sonatore, Lisa; Williams, Steven; Madeira, Maria; Verras, Andreas; Reiter, Maud; Lee, Claire Heechoon; Cuff, James; Sherer, Edward C; Kuethe, Jeffrey; Goble, Stephen; Perrotto, Nicholas; Pinto, Shirly; Shen, Dong-Ming; Nargund, Ravi; Balkovec, James; DeVita, Robert J; Dreher, Spencer D

    2017-05-11

    Miniaturization and parallel processing play an important role in the evolution of many technologies. We demonstrate the application of miniaturized high-throughput experimentation methods to resolve synthetic chemistry challenges on the frontlines of a lead optimization effort to develop diacylglycerol acyltransferase (DGAT1) inhibitors. Reactions were performed on ∼1 mg scale using glass microvials providing a miniaturized high-throughput experimentation capability that was used to study a challenging SNAr reaction. The availability of robust synthetic chemistry conditions discovered in these miniaturized investigations enabled the development of structure-activity relationships that ultimately led to the discovery of soluble, selective, and potent inhibitors of DGAT1.

  6. Developing theory-informed behaviour change interventions to implement evidence into practice: a systematic approach using the Theoretical Domains Framework.

    PubMed

    French, Simon D; Green, Sally E; O'Connor, Denise A; McKenzie, Joanne E; Francis, Jill J; Michie, Susan; Buchbinder, Rachelle; Schattner, Peter; Spike, Neil; Grimshaw, Jeremy M

    2012-04-24

    There is little systematic operational guidance about how best to develop complex interventions to reduce the gap between practice and evidence. This article is one in a Series of articles documenting the development and use of the Theoretical Domains Framework (TDF) to advance the science of implementation research. The intervention was developed considering three main components: theory, evidence, and practical issues. We used a four-step approach, consisting of guiding questions, to direct the choice of the most appropriate components of an implementation intervention: Who needs to do what, differently? Using a theoretical framework, which barriers and enablers need to be addressed? Which intervention components (behaviour change techniques and mode(s) of delivery) could overcome the modifiable barriers and enhance the enablers? And how can behaviour change be measured and understood? A complex implementation intervention was designed that aimed to improve acute low back pain management in primary care. We used the TDF to identify the barriers and enablers to the uptake of evidence into practice and to guide the choice of intervention components. These components were then combined into a cohesive intervention. The intervention was delivered via two facilitated interactive small group workshops. We also produced a DVD to distribute to all participants in the intervention group. We chose outcome measures in order to assess the mediating mechanisms of behaviour change. We have illustrated a four-step systematic method for developing an intervention designed to change clinical practice based on a theoretical framework. The method of development provides a systematic framework that could be used by others developing complex implementation interventions. 
While this framework should be iteratively adjusted and refined to suit other contexts and settings, we believe that the four-step process should be maintained as the primary framework to guide researchers through a comprehensive intervention development process.

  7. Modern Focused-Ion-Beam-Based Site-Specific Specimen Preparation for Atom Probe Tomography.

    PubMed

    Prosa, Ty J; Larson, David J

    2017-04-01

    Approximately 30 years after the first use of focused ion beam (FIB) instruments to prepare atom probe tomography specimens, this technique has grown to be used by hundreds of researchers around the world. This past decade has seen tremendous advances in atom probe applications, enabled by the continued development of FIB-based specimen preparation methodologies. In this work, we provide a short review of the origin of the FIB method and the standard methods used today for lift-out and sharpening, using the annular milling method as applied to atom probe tomography specimens. Key steps for enabling correlative analysis with transmission electron-beam backscatter diffraction, transmission electron microscopy, and atom probe tomography are presented, and strategies for preparing specimens for modern microelectronic device structures are reviewed and discussed in detail. Examples are used for discussion of the steps for each of these methods. We conclude with examples of the challenges presented by complex topologies such as nanowires, nanoparticles, and organic materials.

  8. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vasdekis, Andreas E.; Stephanopoulos, Gregory

    The sampling and manipulation of cells down to the individual have been of substantial interest since the very beginning of the Life Sciences. Herein, our objective is to highlight the most recent developments in single cell manipulation, as well as pioneering ones. First, flow-through methods will be discussed, namely methods in which the single cells flow continuously in an ordered manner during their analysis. This section will be followed by confinement techniques that enable cell isolation and confinement in one, two, or three dimensions. Flow cytometry and droplet microfluidics are the two most common methods of flow-through analysis. While both are high-throughput techniques, their difference lies in the fact that droplet-encapsulated cells experience a restricted and personal microenvironment, while in flow cytometry cells experience similar nutrient and stimuli initial concentrations. These methods are rather well established; however, they recently enabled immense strides in single cell phenotypic analysis, namely the identification and analysis of metabolically distinct individuals from an isogenic population using both droplet microfluidics and flow cytometry.

  9. Ammonia Analysis by Gas Chromatograph/Infrared Detector (GC/IRD)

    NASA Technical Reports Server (NTRS)

    Scott, Joseph P.; Whitfield, Steve W.

    2003-01-01

    Methods are being developed at Marshall Space Flight Center's Toxicity Lab on a GC/IRD system that will be used to detect ammonia at low part per million (ppm) levels. These methods will allow analysis of gas samples by syringe injections. The GC is equipped with a unique cryogenically cooled inlet system that will enable our lab to make large injections of a gas sample. Although the initial focus of the work will be the analysis of ammonia, this instrument could identify other compounds on a molecular level. If proper methods can be developed, the IRD could become a powerful addition to our offgassing capabilities.

  10. A synthesis of fluorescent starch based on carbon nanoparticles for fingerprints detection

    NASA Astrophysics Data System (ADS)

    Li, Hongren; Guo, Xingjia; Liu, Jun; Li, Feng

    2016-10-01

    A pyrolysis method for synthesizing carbon nanoparticles (CNPs) was developed using malic acid and ammonium oxalate as raw materials. The incorporation of a minor amount of carbon nanoparticles into starch powder imparts remarkable color-tunability. Based on this phenomenon, an environmentally friendly fluorescent starch powder for detecting latent fingerprints on non-porous surfaces was prepared. The fingerprints on different non-porous surfaces developed with this powder showed very good fluorescent images under ultraviolet excitation. The method using fluorescent starch powder as a fluorescent marker is simple, rapid, and green. Experimental results illustrated the effectiveness of the proposed method, enabling its practical application in forensic sciences.

  11. Enablers and barriers in delivery of a cancer exercise program: the Canadian experience

    PubMed Central

    Mina, D. Santa; Petrella, A.; Currie, K.L.; Bietola, K.; Alibhai, S.M.H.; Trachtenberg, J.; Ritvo, P.; Matthew, A.G.

    2015-01-01

    Background: Exercise is an important therapy to improve well-being after a cancer diagnosis. Accordingly, cancer-exercise programs have been developed to enhance clinical care; however, few programs exist in Canada. Expansion of cancer-exercise programming depends on an understanding of the process of program implementation, as well as of the enablers of and barriers to program success. Gaining knowledge from current professionals in cancer-exercise programs could serve to facilitate the necessary understanding.
    Methods: Key personnel from Canadian cancer-exercise programs (n = 14) participated in semistructured interviews about program development and delivery.
    Results: Content analysis revealed 13 categories and 15 subcategories, which were grouped by three organizing domains: Program Implementation, Program Enablers, and Program Barriers.
    ■ Program Implementation (5 categories, 8 subcategories) included Program Initiation (clinical care extension, research project expansion, program champion), Funding, Participant Intake (avenues of awareness, health and safety assessment), Active Programming (monitoring patient exercise progress, health care practitioner involvement, program composition), and Discharge and Follow-up Plan.
    ■ Program Enablers (4 categories, 4 subcategories) included Patient Participation (personalized care, supportive network, personal control, awareness of benefits), Partnerships, Advocacy and Support, and Program Characteristics.
    ■ Program Barriers (4 categories, 3 subcategories) included Lack of Funding, Lack of Physician Support, Deterrents to Participation (fear and shame, program location, competing interests), and Disease Progression and Treatment.
    Conclusions: Interview results provided insight into the development and delivery of cancer-exercise programs in Canada and could be used to guide future program development and expansion in Canada. PMID:26715869

  12. Participatory design of healthcare technology with children.

    PubMed

    Sims, Tara

    2018-02-12

    Purpose: There are many frameworks and methods for involving children in design research, and Human-Computer Interaction provides rich methods for involving children when designing technologies. The paper aims to discuss these issues. Design/methodology/approach: This paper examines various approaches to involving children in design, considering whether they view children as study objects or as active participants. Findings: The BRIDGE method is a sociocultural approach to product design that views children as active participants, enabling them to contribute to the design process as competent and resourceful partners. An example is provided in which BRIDGE was successfully applied to developing upper limb prostheses with children. Originality/value: Approaching design in this way can provide children with opportunities to develop social, academic, and design skills, and to develop autonomy.

  13. A Sensor-Type PC Strand with an Embedded FBG Sensor for Monitoring Prestress Forces

    PubMed Central

    Kim, Sung Tae; Park, YoungHwan; Park, Sung Yong; Cho, Keunhee; Cho, Jeong-Rae

    2015-01-01

    Prestressed concrete (PC) strands are the most widely used material for introducing prestress into a prestressed concrete (PSC) structure. However, it is difficult to evaluate on site the final prestress force of a PC strand after tensioning, or its residual prestress force once the structure is complete. This inability to assess eventual loss of prestress has resulted in a number of serious accidents and even in the collapse of several structures, underscoring the need to monitor the prestress force for the evaluation of the health of the concrete structure throughout its lifespan. Recently, several researchers have studied methods of verifying the prestress force by inserting an optical fiber sensor inside the strand, but these efforts have not yielded simple fabrication techniques that sustain measurement performance from the design prestress up to failure. Moreover, such methods still require electrical resistance strain gages, displacement sensors, and load cells installed on the outer surface of the structure for precise long-term measurement. This paper proposes a method for evaluating the prestress force of a PC strand precisely and effectively, and verifies the applicability of the proposed method on actual concrete structures. To that end, an innovative PC strand is developed by embedding a Fiber Bragg Grating (FBG) sensor in the core wire of the strand, enabling both short-term and long-term monitoring. The measurement performance of the developed strand is then evaluated experimentally and the reliability of the monitoring data is assessed. PMID:25580903
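
    The conversion from a measured Bragg-wavelength shift to strand strain and prestress force can be sketched as below. The photo-elastic coefficient, elastic modulus, and strand cross-sectional area are typical textbook values assumed for illustration, not the calibration reported in the paper.

```python
# Hedged illustration: FBG Bragg-wavelength shift -> strain -> prestress force.
# Constants are typical values (photo-elastic coefficient ~0.22, strand modulus
# ~195 GPa, 15.2 mm seven-wire strand area ~138.7 mm^2), assumed for the sketch.

def fbg_strain(lambda0_nm, lambda_nm, p_e=0.22):
    """Axial strain from the Bragg shift: d(lambda)/lambda0 = (1 - p_e) * eps."""
    return (lambda_nm - lambda0_nm) / (lambda0_nm * (1.0 - p_e))

def prestress_force_kN(strain, E_GPa=195.0, area_mm2=138.7):
    """F = E * A * eps, returned in kN."""
    return E_GPa * 1e9 * area_mm2 * 1e-6 * strain / 1e3

# Example: a 6.045 nm shift at a 1550 nm base wavelength (~5000 microstrain).
eps = fbg_strain(1550.000, 1556.045)
force = prestress_force_kN(eps)
```

    In practice the temperature cross-sensitivity of the grating must also be compensated, which this one-line relation ignores.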

  14. A sensor-type PC strand with an embedded FBG sensor for monitoring prestress forces.

    PubMed

    Kim, Sung Tae; Park, YoungHwan; Park, Sung Yong; Cho, Keunhee; Cho, Jeong-Rae

    2015-01-08

    Prestressed concrete (PC) strands are the most widely used material for introducing prestress into a prestressed concrete (PSC) structure. However, it is difficult to evaluate on site the final prestress force of a PC strand after tensioning, or its residual prestress force once the structure is complete. This inability to assess eventual loss of prestress has resulted in a number of serious accidents and even in the collapse of several structures, underscoring the need to monitor the prestress force for the evaluation of the health of the concrete structure throughout its lifespan. Recently, several researchers have studied methods of verifying the prestress force by inserting an optical fiber sensor inside the strand, but these efforts have not yielded simple fabrication techniques that sustain measurement performance from the design prestress up to failure. Moreover, such methods still require electrical resistance strain gages, displacement sensors, and load cells installed on the outer surface of the structure for precise long-term measurement. This paper proposes a method for evaluating the prestress force of a PC strand precisely and effectively, and verifies the applicability of the proposed method on actual concrete structures. To that end, an innovative PC strand is developed by embedding a Fiber Bragg Grating (FBG) sensor in the core wire of the strand, enabling both short-term and long-term monitoring. The measurement performance of the developed strand is then evaluated experimentally and the reliability of the monitoring data is assessed.

  15. Quantum dot coating of baculoviral vectors enables visualization of transduced cells and tissues

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, Ying; Lo, Seong Loong; Zheng, Yuangang

    2013-04-26

    Highlights:
    • The use of quantum dot (QD)-labeled viral vectors for in vivo imaging is not well investigated.
    • A new method to label enveloped baculovirus with glutathione-capped CdTe QDs is developed.
    • The labeling enables the identification of transduced, cultured cells based on fluorescence.
    • The labeling also allows evaluation of viral transduction in a real-time manner in living mice.
    • The method has the potential to assess viral vector-based gene therapy protocols in the future.
    Abstract: Imaging of transduced cells and tissues is valuable in developing gene transfer vectors and evaluating gene therapy efficacy. We report here a simple method to use bright and photostable quantum dots to label baculovirus, an emerging gene therapy vector. The labeling was achieved through the non-covalent interaction of glutathione-capped CdTe quantum dots with the virus envelope, without the use of chemical conjugation. The quantum dot labeling was nondestructive to viral transduction function and enabled the identification of baculoviral vector-transduced, living cells based on red fluorescence. When the labeled baculoviral vectors were injected intravenously or intraventricularly for in vivo delivery of a transgene into mice, quantum dot fluorescence signals allowed us to monitor whether or not the injected tissues were transduced. More importantly, using a dual-color whole-body imaging technology, we demonstrated that in vivo viral transduction could be evaluated in real time in living mice. Thus, our method of labeling a ready-to-use gene delivery vector with quantum dots could be useful for improving vector design and has the potential to assess baculovirus-based gene therapy protocols in the future.

  16. Methodical and technological aspects of creation of interactive computer learning systems

    NASA Astrophysics Data System (ADS)

    Vishtak, N. M.; Frolov, D. A.

    2017-01-01

    The article presents a methodology for developing an interactive computer-based training system for power plant personnel. The methods used in the work are a synthesis of the scientific and methodological literature on the use of computer-based training systems in vocational education, methods of system analysis, and methods of structural and object-oriented modeling of information systems. The relevance of developing interactive computer training systems for personnel preparation in educational and training centers is demonstrated. The development stages of such systems are identified, and the factors in their efficient use are analysed. An algorithm for the work performed at each development stage is offered that enables one to optimize the time, financial, and labor expenditure on creating the interactive computer training system.

  17. Simulating the Historical Process To Create Laboratory Exercises That Teach Research Methods.

    ERIC Educational Resources Information Center

    Alcock, James

    1994-01-01

    Explains how controlling student access to data can be used as a strategy enabling students to take the role of a research geologist. Students develop models based on limited data and conduct field tests by comparing their predictions with the additional data. (DDR)

  18. Educational Benefits of Multimedia Skills Training

    ERIC Educational Resources Information Center

    Wang, Tsung juang

    2010-01-01

    The use of multimedia technologies in education has enabled teachers to simulate final outcomes and assist students in applying knowledge learned from textbooks, thereby compensating for the deficiency of traditional teaching methods. It is important to examine how effective these technologies are in practical use. This study developed online…

  19. Physical activity problem-solving inventory for adolescents: Development and initial validation

    USDA-ARS?s Scientific Manuscript database

    Youth encounter physical activity barriers, often called problems. The purpose of problem-solving is to generate solutions to overcome the barriers. Enhancing problem-solving ability may enable youth to be more physically active. Therefore, a method for reliably assessing physical activity problem-s...

  20. 77 FR 16846 - Published Privacy Impact Assessments on the Web

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-22

    ... Security Advanced Research Projects Agency (HSARPA), S&T Directorate seeks to develop physiological and behavioral screening technologies that will enable security officials to test the effectiveness of current... FAST research is adding a new type of research, the Passive Methods for Precision Behavioral Screening...

  1. OPEN PATH TUNABLE DIODE LASER ABSORPTION SPECTROSCOPY FOR ACQUISITION OF FUGITIVE EMISSION FLUX DATA

    EPA Science Inventory

    Air pollutant emission from unconfined sources is an increasingly important environmental issue. The U.S. EPA has developed a ground-based optical remote sensing method that enables direct measurement of fugitive emission flux from large area sources. Open-path Fourier transfor...

  2. Human health risk assessment (HHRA) for environmental development and transfer of antibiotic resistance

    EPA Science Inventory

    Objective: Here we present possible approaches and identify research needs to enable human health risk assessments that focus on the role the environment plays in antibiotic treatment failure of patients. Methods: The authors participated in a workshop sub-committee to define t...

  3. Literature review : simple test method for possible use in predicting the fatigue of asphaltic concrete.

    DOT National Transportation Integrated Search

    1975-01-01

    It has been recognized for many years that fatigue is one of many mechanisms by which asphaltic concrete pavements fail. Experience and empirical design procedures such as those developed by Marshall and Hveem have enabled engineers to design-mixture...

  4. Developing Confidence Limits For Reliability Of Software

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J.

    1991-01-01

    Technique developed for estimating reliability of software by use of Moranda geometric de-eutrophication model. Pivotal method enables straightforward construction of exact bounds with associated degree of statistical confidence about reliability of software. Confidence limits thus derived provide precise means of assessing quality of software. Limits take into account number of bugs found while testing and effects of sampling variation associated with random order of discovering bugs.
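
    A minimal sketch of fitting the Moranda geometric de-eutrophication model: under the model's standard assumption, the i-th inter-failure time is exponential with rate D * k**(i-1). The closed-form conditional MLE of D plus a grid search over k is an illustrative choice, not NASA's pivotal-method implementation.

```python
# Illustrative fit of Moranda's geometric de-eutrophication model (assumed
# notation, not the NTRS code): after the (i-1)-th bug fix the failure rate
# is D * k**(i-1), so inter-failure times t_i are exponential with that rate.
import math

def fit_moranda(times):
    """times: observed inter-failure times t_1..t_n; returns (D_hat, k_hat)."""
    n = len(times)
    best = None
    for k in (x / 1000.0 for x in range(1, 1001)):       # search k in (0, 1]
        s = sum(k ** i * t for i, t in enumerate(times))
        d = n / s                                         # MLE of D given k
        loglik = sum(math.log(d) + i * math.log(k) - d * k ** i * t
                     for i, t in enumerate(times))
        if best is None or loglik > best[0]:
            best = (loglik, d, k)
    _, d_hat, k_hat = best
    return d_hat, k_hat
```

    With growing inter-failure times (reliability improving after each fix) the fitted k falls below 1; confidence bounds on reliability would then be built around these estimates.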

  5. Effect of 5E Learning Model on Academic Achievement, Attitude and Science Process Skills: Meta-Analysis Study

    ERIC Educational Resources Information Center

    Cakir, Nevin Kozcu

    2017-01-01

    Today, with the development of science and technology and its rapid progress, the importance attached to science education has increased. This increase in interest has led to the development of methods, techniques, and approaches that enable students to be active, to question, and to construct knowledge. The 5E learning model is one of them, and…

  6. Real-time, haptics-enabled simulator for probing ex vivo liver tissue.

    PubMed

    Lister, Kevin; Gao, Zhan; Desai, Jaydev P

    2009-01-01

    The advent of complex surgical procedures has driven the need for realistic surgical training simulators. Comprehensive simulators that provide realistic visual and haptic feedback during surgical tasks are required to familiarize surgeons with the procedures they are to perform. The complex organ geometry and intricate material properties inherent to biological tissues drive the need for finite element methods to assure accurate tissue displacement and force calculations. Advances in real-time finite element methods have not yet reached the state where they are applicable to soft tissue surgical simulation. Therefore, a real-time, haptics-enabled simulator for probing of soft tissue has been developed which utilizes preprocessed finite element data (derived from an accurate constitutive model of the soft tissue, obtained from carefully collected experimental data) to accurately replicate the probing task in real time.
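
    The core idea of serving haptic forces from preprocessed finite element data can be sketched as follows. This is a hypothetical one-dimensional reduction, not the authors' implementation: offline FEM solves tabulate reaction force versus probe indentation depth, and the real-time loop only interpolates that table.

```python
# Sketch: a 1 kHz haptics loop cannot run a full nonlinear FEM solve, but it
# can cheaply interpolate precomputed FEM results (depth -> reaction force).
import bisect

class PrecomputedProbeModel:
    def __init__(self, depths_mm, forces_N):
        """Tabulated offline FEM results; depths must be strictly increasing."""
        self.d, self.f = list(depths_mm), list(forces_N)

    def force(self, depth_mm):
        """Piecewise-linear interpolation, cheap enough for a real-time loop."""
        if depth_mm <= self.d[0]:
            return self.f[0]
        if depth_mm >= self.d[-1]:
            return self.f[-1]          # clamp beyond the tabulated range
        j = bisect.bisect_right(self.d, depth_mm)
        t = (depth_mm - self.d[j - 1]) / (self.d[j] - self.d[j - 1])
        return self.f[j - 1] + t * (self.f[j] - self.f[j - 1])
```

    A real simulator would tabulate over probe position on the organ surface as well as depth, but the lookup-instead-of-solve structure is the same.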

  7. Quantitation of sweet steviol glycosides by means of a HILIC-MS/MS-SIDA approach.

    PubMed

    Well, Caroline; Frank, Oliver; Hofmann, Thomas

    2013-11-27

    Meeting the rising consumer demand for natural food ingredients, steviol glycosides, the sweet principle of Stevia rebaudiana (Bertoni), have recently been approved as food additives in the European Union. As regulatory constraints require sensitive methods to analyze the sweet-tasting steviol glycosides in foods and beverages, a HILIC-MS/MS method was developed that enables the accurate and reliable quantitation of the major steviol glycosides stevioside, rebaudiosides A-F, steviolbioside, rubusoside, and dulcoside A, using the corresponding deuterated 16,17-dihydrosteviol glycosides as internal standards. This method not only enables the analysis of the individual steviol glycosides in foods and beverages but can also support the optimization of breeding and postharvest downstream processing of Stevia plants to produce preferentially sweet and least bitter tasting Stevia extracts.
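    The stable isotope dilution assay (SIDA) logic behind such a method reduces to two lines of arithmetic: a response factor is determined from a calibration mix of analyte and labeled internal standard, and sample concentrations then follow from the measured area ratio. A minimal sketch with made-up peak areas and concentrations (not values from the paper):

```python
def response_factor(area_analyte_cal, area_is_cal, conc_analyte_cal, conc_is_cal):
    """Response factor Rf from a calibration mix of known concentrations:
    Rf = (A_analyte / A_IS) / (c_analyte / c_IS)."""
    return (area_analyte_cal / area_is_cal) / (conc_analyte_cal / conc_is_cal)

def quantify(area_analyte, area_is, conc_is_spiked, rf):
    """Concentration of the analyte in the spiked sample:
    c = (A_analyte / A_IS) * c_IS / Rf."""
    return (area_analyte / area_is) * conc_is_spiked / rf

# Calibration: equal concentrations give an area ratio of 1.2 -> Rf = 1.2
rf = response_factor(1.2e6, 1.0e6, 50.0, 50.0)
# Sample spiked with 50 mg/L of the deuterated internal standard
c = quantify(2.4e6, 1.0e6, 50.0, rf)  # -> 100.0 mg/L
```

    Because analyte and labeled standard co-elute and ionize nearly identically, matrix effects cancel in the area ratio, which is what makes the quantitation "accurate and reliable" in complex food matrices.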

  8. Promising approaches to circumvent the blood-brain barrier: progress, pitfalls and clinical prospects in brain cancer

    PubMed Central

    Papademetriou, Iason T; Porter, Tyrone

    2015-01-01

    Brain drug delivery is a major challenge for therapy of central nervous system (CNS) diseases. Biochemical modifications of drugs or drug nanocarriers, methods of local delivery, and blood–brain barrier (BBB) disruption with focused ultrasound and microbubbles are promising approaches which enhance transport or bypass the BBB. These approaches are discussed in the context of brain cancer as an example in CNS drug development. Targeting to receptors enabling transport across the BBB offers noninvasive delivery of small molecule and biological cancer therapeutics. Local delivery methods enable high dose delivery while avoiding systemic exposure. BBB disruption with focused ultrasound and microbubbles offers local and noninvasive treatment. Clinical trials show the prospects of these technologies and point to challenges for the future. PMID:26488496

  9. Gutzwiller renormalization group

    DOE PAGES

    Lanatà, Nicola; Yao, Yong-Xin; Deng, Xiaoyu; ...

    2016-01-06

    We develop a variational scheme called the “Gutzwiller renormalization group” (GRG), which enables us to calculate the ground state of Anderson impurity models (AIM) with arbitrary numerical precision. Our method exploits the low-entanglement property of the ground state of local Hamiltonians in combination with the framework of the Gutzwiller wave function, and indicates that the ground state of the AIM has a very simple structure, which can be represented very accurately in terms of a surprisingly small number of variational parameters. Furthermore, we perform benchmark calculations of the single-band AIM that validate our theory and suggest that the GRG might enable us to study complex systems beyond the reach of the other methods presently available and pave the way to interesting generalizations, e.g., to nonequilibrium transport in nanostructures.

  10. Promising approaches to circumvent the blood-brain barrier: progress, pitfalls and clinical prospects in brain cancer.

    PubMed

    Papademetriou, Iason T; Porter, Tyrone

    2015-01-01

    Brain drug delivery is a major challenge for therapy of central nervous system (CNS) diseases. Biochemical modifications of drugs or drug nanocarriers, methods of local delivery, and blood-brain barrier (BBB) disruption with focused ultrasound and microbubbles are promising approaches which enhance transport or bypass the BBB. These approaches are discussed in the context of brain cancer as an example in CNS drug development. Targeting to receptors enabling transport across the BBB offers noninvasive delivery of small molecule and biological cancer therapeutics. Local delivery methods enable high dose delivery while avoiding systemic exposure. BBB disruption with focused ultrasound and microbubbles offers local and noninvasive treatment. Clinical trials show the prospects of these technologies and point to challenges for the future.

  11. Analysis of Processed Foods Containing Oils and Fats by Time of Flight Mass Spectrometry with an APCI Direct Probe.

    PubMed

    Ito, Shihomi; Chikasou, Masato; Inohana, Shuichi; Fujita, Kazuhiro

    2016-01-01

    Discriminating vegetable oils and animal and milk fats by infrared absorption spectroscopy is difficult due to similarities in their spectral patterns. Therefore, a rapid and simple method for analyzing vegetable oils, animal fats, and milk fats using TOF/MS with an APCI direct probe ion source was developed. This method enabled discrimination of these oils and fats based on mass spectra and detailed analyses of the ions derived from sterols, even in samples consisting of only a few milligrams. Analyses of the mass spectra of processed foods containing oils and milk fats, such as butter, cheese, and chocolate, enabled confirmation of the raw material origin based on specific ions derived from the oils and fats used to produce the final product.

  12. Translating building information modeling to building energy modeling using model view definition.

    PubMed

    Jeong, WoonSeong; Kim, Jong Bum; Clayton, Mark J; Haberl, Jeff S; Yan, Wei

    2014-01-01

    This paper presents a new approach to translate between Building Information Modeling (BIM) and Building Energy Modeling (BEM) that uses Modelica, an object-oriented declarative, equation-based simulation environment. The approach (BIM2BEM) has been developed using a data modeling method to enable seamless model translations of building geometry, materials, and topology. Using data modeling, we created a Model View Definition (MVD) consisting of a process model and a class diagram. The process model demonstrates object-mapping between BIM and Modelica-based BEM (ModelicaBEM) and facilitates the definition of required information during model translations. The class diagram represents the information and object relationships to produce a class package intermediate between the BIM and BEM. The implementation of the intermediate class package enables system interface (Revit2Modelica) development for automatic BIM data translation into ModelicaBEM. In order to demonstrate and validate our approach, simulation result comparisons have been conducted via three test cases using (1) the BIM-based Modelica models generated from Revit2Modelica and (2) BEM models manually created using LBNL Modelica Buildings library. Our implementation shows that BIM2BEM (1) enables BIM models to be translated into ModelicaBEM models, (2) enables system interface development based on the MVD for thermal simulation, and (3) facilitates the reuse of original BIM data into building energy simulation without an import/export process.
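    The object-mapping step of such a translation can be illustrated with a toy BIM-to-Modelica translator: each BIM-style record is mapped onto a Modelica component declaration. The class and parameter names below are hypothetical stand-ins, not the actual Revit2Modelica interface or the LBNL Modelica Buildings API.

```python
# Toy BIM -> Modelica translation.  The Modelica class name and parameters
# below are hypothetical placeholders for illustration only.
def wall_to_modelica(wall):
    return (
        f"  HypotheticalBuildings.OpaqueWall {wall['name']}("
        f"A={wall['area']}, layers=\"{wall['construction']}\");"
    )

def translate(bim_walls, model_name="FromBIM"):
    body = "\n".join(wall_to_modelica(w) for w in bim_walls)
    return f"model {model_name}\n{body}\nend {model_name};"

bim = [
    {"name": "wall_north", "area": 12.5, "construction": "brick200"},
    {"name": "wall_south", "area": 12.5, "construction": "brick200"},
]
out = translate(bim)
print(out)
```

    A real Model View Definition adds what this sketch omits: a process model that fixes which BIM attributes must be exported, and a class diagram that relates walls, materials, and topology in the intermediate package.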

  13. Translating Building Information Modeling to Building Energy Modeling Using Model View Definition

    PubMed Central

    Kim, Jong Bum; Clayton, Mark J.; Haberl, Jeff S.

    2014-01-01

    This paper presents a new approach to translate between Building Information Modeling (BIM) and Building Energy Modeling (BEM) that uses Modelica, an object-oriented declarative, equation-based simulation environment. The approach (BIM2BEM) has been developed using a data modeling method to enable seamless model translations of building geometry, materials, and topology. Using data modeling, we created a Model View Definition (MVD) consisting of a process model and a class diagram. The process model demonstrates object-mapping between BIM and Modelica-based BEM (ModelicaBEM) and facilitates the definition of required information during model translations. The class diagram represents the information and object relationships to produce a class package intermediate between the BIM and BEM. The implementation of the intermediate class package enables system interface (Revit2Modelica) development for automatic BIM data translation into ModelicaBEM. In order to demonstrate and validate our approach, simulation result comparisons have been conducted via three test cases using (1) the BIM-based Modelica models generated from Revit2Modelica and (2) BEM models manually created using LBNL Modelica Buildings library. Our implementation shows that BIM2BEM (1) enables BIM models to be translated into ModelicaBEM models, (2) enables system interface development based on the MVD for thermal simulation, and (3) facilitates the reuse of original BIM data into building energy simulation without an import/export process. PMID:25309954

  14. New method for the rapid extraction of natural products: efficient isolation of shikimic acid from star anise.

    PubMed

    Just, Jeremy; Deans, Bianca J; Olivier, Wesley J; Paull, Brett; Bissember, Alex C; Smith, Jason A

    2015-05-15

    A new, practical, rapid, and high-yielding process for the pressurized hot water extraction (PHWE) of multigram quantities of shikimic acid from star anise (Illicium verum) using an unmodified household espresso machine has been developed. This operationally simple and inexpensive method enables the efficient and straightforward isolation of shikimic acid and the facile preparation of a range of its synthetic derivatives.

  15. Method of fabricating a high aspect ratio microstructure

    DOEpatents

    Warren, John B.

    2003-05-06

    The present invention is for a method of fabricating a high aspect ratio, freestanding microstructure. The fabrication method modifies the exposure process for SU-8, a negative-acting, ultraviolet-sensitive photoresist used for microfabrication, whereby a UV-absorbent glass substrate, chosen for complete absorption of UV radiation at 380 nanometers or less, is coated with a negative photoresist, exposed, and developed according to standard practice. This UV-absorbent glass enables the fabrication of cylindrical cavities in negative photoresist microstructures that have aspect ratios of 8:1.

  16. Determination of iodine and molybdenum in milk by quadrupole ICP-MS.

    PubMed

    Reid, Helen J; Bashammakh, Abdul A; Goodall, Phillip S; Landon, Mark R; O'Connor, Ciaran; Sharp, Barry L

    2008-03-15

    A reliable method for the determination of iodine and molybdenum in milk samples, using alkaline digestion with tetramethylammonium hydroxide and hydrogen peroxide followed by quadrupole ICP-MS analysis, has been developed and tested using certified reference materials. The use of He + O2 (1.0 ml min(-1) and 0.6 ml min(-1)) in the collision-reaction cell of the mass spectrometer to remove (129)Xe+ (introduced initially to enable the determination of low levels of (129)I) also resulted in the quantitative conversion of Mo+ to MoO2+, which enabled the molybdenum in the milk to be determined at a similar mass to the iodine, with Sb used as a common internal standard. In order to separate and pre-concentrate iodine at sub-microg l(-1) concentrations, a novel method was developed using a cation-exchange column loaded with Pd2+ and Ca2+ ions to selectively retain iodide, followed by elution with a small volume of ammonium thiosulfate. This method showed excellent results for aqueous iodide solutions, although the complex milk digest matrix made the method unsuitable for such samples. An investigation of the iodine species formed during oxidation and extraction of milk sample digests was carried out with a view to controlling the iodine chemistry.

  17. Proteomics goes forensic: Detection and mapping of blood signatures in fingermarks.

    PubMed

    Deininger, Lisa; Patel, Ekta; Clench, Malcolm R; Sears, Vaughn; Sammon, Chris; Francese, Simona

    2016-06-01

    A bottom-up in situ proteomic method has been developed enabling the mapping of multiple blood signatures on the intact ridges of blood fingermarks by Matrix Assisted Laser Desorption Ionisation Mass Spectrometry Imaging (MALDI-MSI). This method, at a proof-of-concept stage, builds upon recently published work demonstrating the opportunity to profile and identify multiple blood signatures in bloodstains via a bottom-up proteomic approach. The present protocol addresses the destructiveness of the previously developed profiling method; destruction must be avoided for evidence such as blood fingermarks, where the ridge detail must be preserved in order to provide the associative link between the biometric information and the events of bloodshed. Using a blood mark reference model, trypsin concentration and spraying conditions have been optimised within the technical constraints of the depositor eventually employed; the application of MALDI-MSI and Ion Mobility MS has enabled the detection, confirmation, and visualisation of blood signatures directly on the ridge pattern. These results are to be considered a first insight into a method eventually informing investigations (and judicial debates) of violent crimes in which the reliable and non-destructive detection and mapping of blood in fingermarks is paramount to reconstructing the events of bloodshed. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. A Principled Approach to the Specification of System Architectures for Space Missions

    NASA Technical Reports Server (NTRS)

    McKelvin, Mark L. Jr.; Castillo, Robert; Bonanne, Kevin; Bonnici, Michael; Cox, Brian; Gibson, Corrina; Leon, Juan P.; Gomez-Mustafa, Jose; Jimenez, Alejandro; Madni, Azad

    2015-01-01

    Modern space systems are increasing in complexity and scale at an unprecedented pace. Consequently, innovative methods, processes, and tools are needed to cope with the increasing complexity of architecting these systems. A key systems challenge in practice is the ability to scale processes, methods, and tools used to architect complex space systems. Traditionally, the process for specifying space system architectures has largely relied on capturing the system architecture in informal descriptions that are often embedded within loosely coupled design documents and domain expertise. Such informal descriptions often lead to misunderstandings between design teams, ambiguous specifications, difficulty in maintaining consistency as the architecture evolves throughout the system development life cycle, and costly design iterations. Therefore, traditional methods are becoming increasingly inefficient to cope with ever-increasing system complexity. We apply the principles of component-based design and platform-based design to the development of the system architecture for a practical space system to demonstrate feasibility of our approach using SysML. Our results show that we are able to apply a systematic design method to manage system complexity, thus enabling effective data management, semantic coherence and traceability across different levels of abstraction in the design chain. Just as important, our approach enables interoperability among heterogeneous tools in a concurrent engineering model based design environment.

  19. Use of multilevel modeling for determining optimal parameters of heat supply systems

    NASA Astrophysics Data System (ADS)

    Stennikov, V. A.; Barakhtenko, E. A.; Sokolov, D. V.

    2017-07-01

    The problem of finding optimal parameters of a heat-supply system (HSS) is in ensuring the required throughput capacity of a heat network by determining pipeline diameters and characteristics and location of pumping stations. Effective methods for solving this problem, i.e., the method of stepwise optimization based on the concept of dynamic programming and the method of multicircuit optimization, were proposed in the context of the hydraulic circuit theory developed at Melentiev Energy Systems Institute (Siberian Branch, Russian Academy of Sciences). These methods enable us to determine optimal parameters of various types of piping systems due to flexible adaptability of the calculation procedure to intricate nonlinear mathematical models describing features of used equipment items and methods of their construction and operation. The new and most significant results achieved in developing methodological support and software for finding optimal parameters of complex heat supply systems are presented: a new procedure for solving the problem based on multilevel decomposition of a heat network model that makes it possible to proceed from the initial problem to a set of interrelated, less cumbersome subproblems with reduced dimensionality; a new algorithm implementing the method of multicircuit optimization and focused on the calculation of a hierarchical model of a heat supply system; the SOSNA software system for determining optimum parameters of intricate heat-supply systems and implementing the developed methodological foundation. The proposed procedure and algorithm enable us to solve engineering problems of finding the optimal parameters of multicircuit heat supply systems having large (real) dimensionality, and are applied in solving urgent problems related to the optimal development and reconstruction of these systems. 
The developed methodological foundation and software can be used for designing heat supply systems in the Central and the Admiralty regions in St. Petersburg, the city of Bratsk, and the Magistral'nyi settlement.
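    The stepwise (dynamic-programming) optimization underlying such methods can be illustrated on a toy serial network: choose one pipe diameter per segment so as to minimize cost while the cumulative head loss stays within a budget. The candidate costs and losses below are made-up numbers, not parameters of the SOSNA system.

```python
def cheapest_design(segments, head_budget):
    """Dynamic programming over cumulative head loss.

    segments: list of option lists; each option is (cost, head_loss)
    for one candidate pipe diameter of that segment.
    State after each segment: {cumulative head loss: minimal cost}.
    """
    states = {0.0: 0.0}
    for options in segments:
        nxt = {}
        for loss, cost in states.items():
            for opt_cost, opt_loss in options:
                new_loss = loss + opt_loss
                if new_loss > head_budget:
                    continue  # infeasible: exceeds available pumping head
                new_cost = cost + opt_cost
                if new_loss not in nxt or new_cost < nxt[new_loss]:
                    nxt[new_loss] = new_cost
        states = nxt
    return min(states.values()) if states else None

# Two segments, two candidate diameters each: (cost, head loss)
segments = [[(5, 3), (8, 1)],
            [(4, 4), (7, 2)]]
best = cheapest_design(segments, head_budget=5)
print(best)  # -> 12
```

    Real heat networks add loops, pumping stations, and nonlinear hydraulics, which is where the multilevel decomposition and multicircuit optimization described above come in; the state-space pruning idea, however, is the same.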

  20. Boulder Capture System Design Options for the Asteroid Robotic Redirect Mission Alternate Approach Trade Study

    NASA Technical Reports Server (NTRS)

    Belbin, Scott P.; Merrill, Raymond G.

    2014-01-01

    This paper presents a boulder acquisition and asteroid surface interaction electromechanical concept developed for the Asteroid Robotic Redirect Mission (ARRM) option to capture a free-standing boulder on the surface of a 100 m or larger Near Earth Asteroid (NEA). It details the down-select process and ranking of potential boulder capture methods, the evolution of a simple yet elegant articulating spaceframe, and ongoing risk reduction and concept refinement efforts. The capture system configuration leverages the spaceframe, heritage manipulators, and a new microspine technology to enable the ARRM boulder capture. While at the NEA, it enables attenuation of terminal descent velocity, ascent to escape velocity, and boulder collection and restraint. After departure from the NEA, it enables robotic inspection, sample caching, and crew Extra Vehicular Activities (EVAs).

  1. Increasing Accuracy of Tissue Shear Modulus Reconstruction Using Ultrasonic Strain Tensor Measurement

    NASA Astrophysics Data System (ADS)

    Sumi, C.

    Previously, we developed three displacement vector measurement methods: the multidimensional cross-spectrum phase gradient method (MCSPGM), the multidimensional autocorrelation method (MAM), and the multidimensional Doppler method (MDM). To increase the accuracy and stability of lateral and elevational displacement measurements, we also developed spatially variant, displacement-component-dependent regularization. In particular, regularization of only the lateral/elevational displacements is advantageous for the laterally unmodulated case. The demonstrated measurements of displacement vector distributions in experiments using an agar phantom with an inhomogeneous shear modulus confirm that displacement-component-dependent regularization enables more stable shear modulus reconstruction. In this report, we also review our lateral modulation methods, which use parabolic functions, Hanning windows, and Gaussian functions in the apodization function, as well as the optimized apodization function that realizes the designed point spread function (PSF). The modulations significantly increase the accuracy of the strain tensor measurement and shear modulus reconstruction (demonstrated using an agar phantom).

  2. JOINING DISSIMILAR MATERIALS USING FRICTION STIR SCRIBE TECHNIQUE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Upadhyay, Piyush; Hovanski, Yuri; Jana, Saumyadeep

    2016-09-01

    Development of a robust and cost-effective method of joining dissimilar materials can provide a critical pathway to enable widespread use of multi-material design and components in mainstream industrial applications. The use of multi-material components such as steel-aluminum and aluminum-polymer allows design engineers to optimize material utilization based on service requirements and often leads to weight and cost reductions. However, producing an effective joint between materials with vastly different thermal, microstructural, and deformation responses is highly problematic using conventional joining and/or fastening methods. This is especially challenging in cost-sensitive, high-volume markets that largely rely on low-cost joining solutions. Friction Stir Scribe technology was developed to meet the demands of joining materials with drastically different properties and melting regimes. The process enables joining of light metals like magnesium and aluminum to high-temperature materials like steels and titanium. Additionally, viable joints between polymer composites and metal can also be made using this method. This paper will present the state of the art, progress made, and challenges associated with this innovative derivative of friction stir welding in reference to joining dissimilar metals and polymer/metal combinations.

  3. Joining Dissimilar Materials Using Friction Stir Scribe Technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Upadhyay, Piyush; Hovanski, Yuri; Jana, Saumyadeep

    2016-10-03

    Development of a robust and cost-effective method of joining dissimilar materials could provide a critical pathway to enable widespread use of multi-material designs and components in mainstream industrial applications. The use of multi-material components such as steel-aluminum and aluminum-polymer would allow design engineers to optimize material utilization based on service requirements and could often lead to weight and cost reductions. However, producing an effective joint between materials with vastly different thermal, microstructural, and deformation responses is highly problematic using conventional joining and/or fastening methods. This is especially challenging in cost-sensitive, high-volume markets that largely rely on low-cost joining solutions. Friction stir scribe technology was developed to meet the demands of joining materials with drastically different properties and melting regimes. The process enables joining of light metals like magnesium and aluminum to high-temperature materials like steel and titanium. Viable joints between polymer composites and metal can also be made using this method. This paper will present the state of the art, progress made, and challenges associated with this innovative derivative of friction stir welding in reference to joining dissimilar metals and polymer/metal combinations.

  4. In vitro motility evaluation of aggregated cancer cells by means of automatic image processing.

    PubMed

    De Hauwer, C; Darro, F; Camby, I; Kiss, R; Van Ham, P; Decaesteker, C

    1999-05-01

    We set up an automatic image-processing-based method that enables the motility of in vitro aggregated cells to be evaluated over a number of hours. Our biological model included the PC-3 human prostate cancer cell line growing as a monolayer on the bottom of Falcon plastic dishes containing conventional culture media. Our equipment consisted of an incubator, an inverted phase-contrast microscope, a Charge Coupled Device (CCD) video camera, and a computer equipped with image-processing software developed in our laboratory. This computer-assisted microscope analysis of aggregated cells enables global cluster motility to be evaluated. The analysis also enables the trajectory of each cell within a given cluster to be isolated and parametrized, as well as the trajectories of individual cells outside a cluster. The results show that motility inside a PC-3 cluster is not restricted to slight motion due to cluster expansion, but rather consists of marked cell movement within the cluster. The proposed equipment enables in vitro aggregated cell motility to be studied. This method can, therefore, be used in pharmacological studies in order to select compounds with anti-motility activity. The compounds selected by the equipment described could then be tested in vivo as potential anti-metastatic agents.
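    Trajectory isolation of the kind described can be sketched with a greedy nearest-neighbour linker: each centroid detected in a new frame is appended to the trajectory whose last point is closest, and total path length serves as a simple motility parameter. This is a generic sketch, not the laboratory's software.

```python
import math

def link_trajectories(frames, max_jump=10.0):
    """Greedy nearest-neighbour linking of cell centroids across frames.

    frames: list of frames, each a list of (x, y) centroids.
    Returns a list of trajectories (lists of linked points).
    """
    trajectories = [[c] for c in frames[0]]
    for centroids in frames[1:]:
        unclaimed = list(centroids)
        for traj in trajectories:
            if not unclaimed:
                break
            last = traj[-1]
            nearest = min(unclaimed, key=lambda c: math.dist(last, c))
            if math.dist(last, nearest) <= max_jump:
                traj.append(nearest)
                unclaimed.remove(nearest)
        # centroids left unclaimed start new trajectories (cells entering view)
        trajectories.extend([[c] for c in unclaimed])
    return trajectories

def path_length(traj):
    """Total distance travelled: a simple motility parameter."""
    return sum(math.dist(a, b) for a, b in zip(traj, traj[1:]))

frames = [[(0, 0), (20, 20)],
          [(1, 0), (21, 20)],
          [(2, 1), (22, 20)]]
trajs = link_trajectories(frames)
```

    Production trackers handle cell division, occlusion, and detection gaps; greedy nearest-neighbour linking is only the starting point that makes "isolating and parametrizing the trajectory of each cell" concrete.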

  5. Force and Conductance Spectroscopy of Single Molecule Junctions

    NASA Astrophysics Data System (ADS)

    Frei, Michael

    Investigation of mechanical properties of single molecule junctions is crucial to develop an understanding and enable control of single molecular junctions. This work presents an experimental and analytical approach that enables the statistical evaluation of force and simultaneous conductance data of metallic atomic point contacts and molecular junctions. A conductive atomic force microscope based break junction technique is developed to form single molecular junctions and collect conductance and force data simultaneously. Improvements of the optical components have been achieved through the use of a super-luminescent diode, enabling tremendous increases in force resolution. An experimental procedure to collect data for various molecular junctions has been developed and includes deposition, calibration, and analysis methods. For the statistical analysis of force, novel approaches based on two dimensional histograms and a direct force identification method are presented. The two dimensional method allows for an unbiased evaluation of force events that are identified using corresponding conductance signatures. This is not always possible however, and in these situations, the force based identification of junction rearrangement events is an attractive alternative method. This combined experimental and analytical approach is then applied to three studies: First, the impact of molecular backbones to the mechanical behavior of single molecule junctions is investigated and it is found that junctions formed with identical linkers but different backbone structure result in junctions with varying breaking forces. All molecules used show a clear molecular signature and force data can be evaluated using the 2D method. Second, the effects of the linker group used to attach molecules to gold electrodes are investigated. 
A study of four alkane molecules with different linkers finds a drastic difference in the evolution of donor-acceptor and covalently bonded molecules respectively. In fact, the covalent bond is found to significantly distort the metal electrode rearrangement such that junction rearrangement events can no longer be identified with a clean and well defined conductance signature. For this case, the force based identification process is used. Third, results for break junction measurements with different metals are presented. It is found that silver and palladium junctions rupture with forces different from those of gold contacts. In the case of silver experiments in ambient conditions, we can also identify oxygen impurities in the silver contact formation process, leading to force and conductance measurements of silver-oxygen structures. For the future, this work provides an experimental and analytical foundation that will enable insights into single molecule systems not previously accessible.
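    The two-dimensional histogram analysis mentioned above can be sketched with NumPy: many rupture traces are overlaid in a displacement-versus-conductance histogram so that recurring junction signatures emerge statistically without selecting individual traces. Synthetic traces only; the plateau and decay parameters are arbitrary.

```python
import numpy as np

rng = np.random.default_rng(0)

def synthetic_trace(n=200, plateau_g=1.0, noise=0.05):
    """Toy rupture trace: a conductance plateau followed by exponential decay."""
    z = np.linspace(0.0, 2.0, n)  # electrode displacement (arbitrary units)
    g = np.where(z < 1.0, plateau_g, plateau_g * np.exp(-5.0 * (z - 1.0)))
    return z, g + rng.normal(0.0, noise, n)

# Overlay many traces in one 2D displacement-vs-conductance histogram
traces = [synthetic_trace() for _ in range(100)]
z_all = np.concatenate([z for z, _ in traces])
g_all = np.concatenate([np.clip(g, 1e-4, None) for _, g in traces])
hist, z_edges, g_edges = np.histogram2d(z_all, g_all, bins=(50, 50))
```

    In the experiments above, force traces are binned the same way alongside conductance, so that an unbiased, population-level breaking force can be read off the histogram rather than from hand-picked events.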

  6. SIMULTANEOUS MULTISLICE MAGNETIC RESONANCE FINGERPRINTING WITH LOW-RANK AND SUBSPACE MODELING

    PubMed Central

    Zhao, Bo; Bilgic, Berkin; Adalsteinsson, Elfar; Griswold, Mark A.; Wald, Lawrence L.; Setsompop, Kawin

    2018-01-01

    Magnetic resonance fingerprinting (MRF) is a new quantitative imaging paradigm that enables simultaneous acquisition of multiple magnetic resonance tissue parameters (e.g., T1, T2, and spin density). Recently, MRF has been integrated with simultaneous multislice (SMS) acquisitions to enable volumetric imaging with faster scan time. In this paper, we present a new image reconstruction method based on low-rank and subspace modeling for improved SMS-MRF. Here the low-rank model exploits strong spatiotemporal correlation among contrast-weighted images, while the subspace model captures the temporal evolution of magnetization dynamics. With the proposed model, the image reconstruction problem is formulated as a convex optimization problem, for which we develop an algorithm based on variable splitting and the alternating direction method of multipliers. The performance of the proposed method has been evaluated by numerical experiments, and the results demonstrate that the proposed method leads to improved accuracy over the conventional approach. Practically, the proposed method has the potential to allow for a 3× speedup with minimal reconstruction error, resulting in less than 5 sec imaging time per slice. PMID:29060594
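    The subspace-modeling idea can be illustrated on its own: simulate a dictionary of MRF-like temporal signal evolutions, extract a low-rank temporal basis by SVD, and verify that a handful of components captures the whole dictionary. This is a toy sketch with mono-exponential decays, not the authors' reconstruction code.

```python
import numpy as np

# Dictionary of temporal signal evolutions (toy: T2 decays over 30 time points)
t = np.linspace(0.0, 0.3, 30)                    # seconds
T2_values = np.linspace(0.02, 0.2, 200)          # seconds
D = np.exp(-t[None, :] / T2_values[:, None])     # (200 signals, 30 time points)

# Low-rank temporal subspace: top right singular vectors of the dictionary
rank = 6
U = np.linalg.svd(D, full_matrices=False)[2][:rank]  # (rank, 30) basis

# Project every signal onto the subspace and measure the approximation error
D_approx = D @ U.T @ U
rel_err = np.linalg.norm(D - D_approx) / np.linalg.norm(D)
```

    Because the whole dictionary lives (to high accuracy) in this small subspace, the reconstruction only has to estimate a few coefficient images per voxel instead of the full time series, which is what makes the convex low-rank formulation tractable.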

  7. Review of methods to probe single cell metabolism and bioenergetics

    DOE PAGES

    Vasdekis, Andreas E.; Stephanopoulos, Gregory

    2014-10-31

    The sampling and manipulation of cells down to the individual has been of substantial interest since the very beginning of the life sciences. Herein, our objective is to highlight the most recent developments in single cell manipulation, as well as pioneering ones. First, flow-through methods will be discussed, namely methods in which the single cells flow continuously in an ordered manner during their analysis. This section will be followed by confinement techniques that enable cell isolation and confinement in one, two, or three dimensions. Flow cytometry and droplet microfluidics are the two most common methods of flow-through analysis. While both are high-throughput techniques, their difference lies in the fact that droplet-encapsulated cells experience a restricted and personal microenvironment, while in flow cytometry cells experience similar initial nutrient and stimuli concentrations. These methods are rather well established; however, they have recently enabled immense strides in single-cell phenotypic analysis, namely the identification and analysis of metabolically distinct individuals from an isogenic population using both droplet microfluidics and flow cytometry.

  8. Simultaneous multislice magnetic resonance fingerprinting with low-rank and subspace modeling.

    PubMed

    Zhao, Bo; Bilgic, Berkin; Adalsteinsson, Elfar; Griswold, Mark A; Wald, Lawrence L; Setsompop, Kawin

    2017-07-01

    Magnetic resonance fingerprinting (MRF) is a new quantitative imaging paradigm that enables simultaneous acquisition of multiple magnetic resonance tissue parameters (e.g., T1, T2, and spin density). Recently, MRF has been integrated with simultaneous multislice (SMS) acquisitions to enable volumetric imaging with faster scan time. In this paper, we present a new image reconstruction method based on low-rank and subspace modeling for improved SMS-MRF. Here the low-rank model exploits strong spatiotemporal correlation among contrast-weighted images, while the subspace model captures the temporal evolution of magnetization dynamics. With the proposed model, the image reconstruction problem is formulated as a convex optimization problem, for which we develop an algorithm based on variable splitting and the alternating direction method of multipliers. The performance of the proposed method has been evaluated by numerical experiments, and the results demonstrate that the proposed method leads to improved accuracy over the conventional approach. Practically, the proposed method has the potential to allow for a 3× speedup with minimal reconstruction error, resulting in less than 5 sec imaging time per slice.

  9. The Design and Implementation of Instruments for Low-Frequency Electromagnetic Sounding of the Martian Subsurface

    NASA Technical Reports Server (NTRS)

    Delory, G. T.; Grimm, R. E.

    2003-01-01

    Low-frequency electromagnetic soundings of the subsurface can identify liquid water at depths ranging from hundreds of meters to approx. 10 km in an environment such as Mars. Among the tools necessary to perform these soundings are low-frequency electric and magnetic field sensors capable of being deployed from a lander or rover such that horizontal and vertical components of the fields can be measured free of structural or electrical interference. Under a NASA Planetary Instrument Definition and Development Program (PIDDP), we are currently engaged in the prototype stages of low frequency sensor implementations that will enable this technique to be performed autonomously within the constraints of a lander platform. Once developed, this technique will represent both a complementary and alternative method to orbital radar sounding investigations, as the latter may not be able to identify subsurface water without significant ambiguities. Low frequency EM methods can play a crucial role as a ground truth measurement, performing deep soundings at sites identified as high priority areas by orbital radars. Alternatively, the penetration depth and conductivity discrimination of low-frequency methods may enable detection of subsurface water in areas that render radar methods ineffective. In either case, the sensitivity and depth of penetration inherent in low frequency EM exploration makes this tool a compelling candidate method to identify subsurface liquid water from a landed platform on Mars or other targets of interest.
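    The depth range quoted above follows from the electromagnetic skin depth of a conducting half-space; a minimal sketch with illustrative resistivity values (not taken from the abstract):

```python
import math

def skin_depth_m(resistivity_ohm_m, freq_hz):
    """Plane-wave skin depth in a uniform half-space:
    delta = sqrt(2 * rho / (mu0 * omega)),
    equivalent to the common approximation delta ~ 503 * sqrt(rho / f) meters.
    """
    mu0 = 4e-7 * math.pi          # vacuum permeability, H/m
    omega = 2 * math.pi * freq_hz
    return math.sqrt(2 * resistivity_ohm_m / (mu0 * omega))

# Illustrative: a 100 ohm-m subsurface sounded at 1 Hz vs 1 kHz.
# Lower frequency penetrates far deeper, hence low-frequency EM sounding.
print(round(skin_depth_m(100.0, 1.0)))     # ~5 km at 1 Hz
print(round(skin_depth_m(100.0, 1000.0)))  # ~160 m at 1 kHz
```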

  10. New modes of electron microscopy for materials science enabled by fast direct electron detectors

    NASA Astrophysics Data System (ADS)

    Minor, Andrew

    There is an ongoing revolution in the development of electron detector technology that has enabled modes of electron microscopy imaging that had previously only been theorized. The age of electron microscopy as purely an imaging tool is quickly giving way to a new frontier of multidimensional datasets to be mined. These improvements in electron detection have enabled cryo-electron microscopy to resolve the three-dimensional structures of non-crystallized proteins, revolutionizing structural biology. In the physical sciences, direct electron detectors have enabled four-dimensional reciprocal-space maps of materials at atomic resolution, providing all the structural information about nanoscale materials in one experiment. This talk will highlight the impact of direct electron detectors for materials science, including a new method of scanning nanobeam diffraction. With faster detectors we can take a series of 2D diffraction patterns at each position in a 2D STEM raster scan, resulting in a four-dimensional data set. For thin film analysis, direct electron detectors hold the potential to enable strain, polarization, composition and electric-field mapping over relatively large fields of view, all from a single experiment.

  11. EMERALD: Coping with the Explosion of Seismic Data

    NASA Astrophysics Data System (ADS)

    West, J. D.; Fouch, M. J.; Arrowsmith, R.

    2009-12-01

    The geosciences are currently generating an unparalleled quantity of new public broadband seismic data with the establishment of large-scale seismic arrays such as the EarthScope USArray, which are enabling new and transformative scientific discoveries of the structure and dynamics of the Earth’s interior. Much of this explosion of data is a direct result of the formation of the IRIS consortium, which has enabled an unparalleled level of open exchange of seismic instrumentation, data, and methods. The production of these massive volumes of data has generated new and serious data management challenges for the seismological community. A significant challenge is the maintenance and updating of seismic metadata, which includes information such as station location, sensor orientation, instrument response, and clock timing data. This key information changes at unknown intervals, and the changes are not generally communicated to data users who have already downloaded and processed data. Another basic challenge is the ability to handle massive seismic datasets when waveform file volumes exceed the fundamental limitations of a computer’s operating system. A third, long-standing challenge is the difficulty of exchanging seismic processing codes between researchers; each scientist typically develops his or her own unique directory structure and file naming convention, requiring that codes developed by another researcher be rewritten before they can be used. To address these challenges, we are developing EMERALD (Explore, Manage, Edit, Reduce, & Analyze Large Datasets). The overarching goal of the EMERALD project is to enable more efficient and effective use of seismic datasets ranging from just a few hundred to millions of waveforms with a complete database-driven system, leading to higher quality seismic datasets for scientific analysis and enabling faster, more efficient scientific research. 
We will present a preliminary (beta) version of EMERALD, an integrated, extensible, standalone database server system based on the open-source PostgreSQL database engine. The system is designed for fast and easy processing of seismic datasets, and provides the necessary tools to manage very large datasets and all associated metadata. EMERALD provides methods for efficient preprocessing of seismic records; large record sets can be easily and quickly searched, reviewed, revised, reprocessed, and exported. EMERALD can retrieve and store station metadata and alert the user to metadata changes. The system provides many methods for visualizing data, analyzing dataset statistics, and tracking the processing history of individual datasets. EMERALD allows development and sharing of visualization and processing methods using any of 12 programming languages. EMERALD is designed to integrate existing software tools; the system provides wrapper functionality for existing widely-used programs such as GMT, SOD, and TauP. Users can interact with EMERALD via a web browser interface, or they can directly access their data from a variety of database-enabled external tools. Data can be imported and exported from the system in a variety of file formats, or can be directly requested and downloaded from the IRIS DMC from within EMERALD.
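    The metadata-epoch tracking described above can be illustrated with a toy relational sketch. The schema, column names, and station values below are hypothetical stand-ins, and SQLite is used only for portability (EMERALD itself is built on PostgreSQL):

```python
import sqlite3

# Hypothetical, simplified schema for tracking station metadata epochs.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE station_epoch (
    network   TEXT,
    station   TEXT,
    start_day TEXT,
    end_day   TEXT,   -- NULL while the epoch is current
    latitude  REAL,
    longitude REAL,
    azimuth   REAL    -- sensor orientation, degrees
);
""")
rows = [
    # Illustrative values: the same station with a re-oriented sensor
    # after a service visit (a typical silent metadata change).
    ("TA", "109C", "2004-05-04", "2010-01-15", 32.889, -117.105, 0.0),
    ("TA", "109C", "2010-01-15", None,         32.889, -117.105, 3.5),
]
conn.executemany("INSERT INTO station_epoch VALUES (?,?,?,?,?,?,?)", rows)

# Flag stations whose orientation differs across epochs, i.e. waveforms
# downloaded under the old metadata need reprocessing.
changed = conn.execute("""
    SELECT network, station, COUNT(DISTINCT azimuth) AS n_orientations
    FROM station_epoch
    GROUP BY network, station
    HAVING n_orientations > 1
""").fetchall()
print(changed)
```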

  12. Enabling the First Ever Measurement of Coherent Neutrino Scattering Through Background Neutron Measurements.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reyna, David; Betty, Rita

    Using High Performance Computing to Examine the Processes of Neurogenesis Underlying Pattern Separation/Completion of Episodic Information - Sandia researchers developed novel methods and metrics for studying the computational function of neurogenesis, thus generating substantial impact on the neuroscience and neural computing communities. This work could benefit applications in machine learning and other analysis activities. The purpose of this project was to computationally model the impact of neural population dynamics within the neurobiological memory system in order to examine how subareas in the brain enable pattern separation and completion of information in memory across time as associated experiences.

  13. Enabling fast charging – A battery technology gap assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ahmed, Shabbir; Bloom, Ira; Jansen, Andrew N.

    The battery technology literature is reviewed, with an emphasis on key elements that limit extreme fast charging. Key gaps in existing elements of the technology are presented, as well as developmental needs. Among these needs are advanced models and methods to detect and prevent lithium plating; new positive-electrode materials which are less prone to stress-induced failure; better electrode designs to accommodate very rapid diffusion into and out of the electrode; measurements of temperature distributions during fast charge to enable and validate models; and thermal management and pack designs to accommodate the higher operating voltage.

  15. X-ray Computed Microtomography technique applied for cementitious materials: A review.

    PubMed

    da Silva, Ítalo Batista

    2018-04-01

    The main objective of this article is to present a bibliographical review of the use of the X-ray microtomography method for 3D image processing of the microstructure of cementitious materials, analyzing the pore microstructure and connectivity network and enabling a relationship to be established between permeability and porosity. The use of this technique enables the understanding of the physical, chemical and mechanical properties of cementitious materials; the quality and quantity of accessible information are significant and may contribute to the study of cementitious materials development. Copyright © 2018 Elsevier Ltd. All rights reserved.

  16. Opus in the Classroom: Striking CoRDS with Content-Related Digital Storytelling

    ERIC Educational Resources Information Center

    Roby, Teshia Young

    2010-01-01

    Writing personal narratives provides students with additional techniques for making deeper connections to subject matter. Content-related narrative development offers a departure from the traditional methods of teaching and learning and enables students to construe meaning individually and make deeper connections with subject matter content. By…

  17. The Aggregate Exposure Pathway (AEP): A conceptual framework for advancing exposure science research and transforming risk assessment

    EPA Science Inventory

    Recent advances in analytical methods, biomarker discovery, cell-based assay development, computational tools, sensor/monitor, and omics technology have enabled new streams of exposure and toxicity data to be generated at higher volumes and speed. These new data offer the opport...

  18. The Teacher's Guide to Winning Grants.

    ERIC Educational Resources Information Center

    Bauer, David G.

    This step-by-step primer helps teachers develop effective grantseeking methods that will enable them to secure funds. It shows teachers how to select the right funding sources, organize proposal ideas, write a convincing and well-prepared proposal, identify who will evaluate the proposal and the scoring system they will use, and efficiently…

  19. Using Qualitative Methods for Revising Items in the Hispanic Stress Inventory

    ERIC Educational Resources Information Center

    Cervantes, Richard C.; Goldbach, Jeremy T.; Padilla, Amado M.

    2012-01-01

    Despite progress in the development of measures to assess psychosocial stress experiences in the general population, a lack of culturally informed assessment instruments exists to enable clinicians and researchers to detect and accurately diagnose mental health concerns among Hispanics. The Hispanic Stress Inventory (HSI) was developed…

  20. Improving Attachments of Non-Invasive (Type III) Electronic Data Loggers to Cetaceans

    DTIC Science & Technology

    2014-09-30

    the assessment of tag impact on animal health and well-being. Specifically, we are working to develop methods that will enable accurate estimates...currently not available for any marine mammal, about animal health and activity has the potential to revolutionize how animals are cared for in these

  1. Metal-catalyzed Decarboxylative Fluoroalkylation Reactions.

    PubMed

    Ambler, Brett R; Yang, Ming-Hsiu; Altman, Ryan A

    2016-12-01

    Metal-catalyzed decarboxylative fluoroalkylation reactions enable the conversion of simple O-based substrates into biologically relevant fluorinated analogs. Herein, we present decarboxylative methods that facilitate the synthesis of trifluoromethyl- and difluoroketone-containing products. We highlight key mechanistic aspects that are critical for efficient catalysis, and that inspired our thinking while developing the reactions.

  2. Teachers, Micro-Credentials, and the Performance Assessment Movement

    ERIC Educational Resources Information Center

    French, Dan; Berry, Barnett

    2017-01-01

    Micro-credentials, a new form of personalized professional development for teachers, offer a unique solution to the challenge of training school staff to design and implement performance assessments. In a relatively short period of time, micro-credentials have shown promise in enabling a more personalized, effective method of promoting teacher…

  3. Building Staff Competencies and Selecting Communications Methods for Waste Management Programs.

    ERIC Educational Resources Information Center

    Richardson, John G.

    The Waste Management Institute provided in-service training to interested County Extension agents in North Carolina to enable them to provide leadership in developing and delivering a comprehensive county-level waste management program. Training included technical, economic, environmental, social, and legal aspects of waste management presented in…

  4. A new ChainMail approach for real-time soft tissue simulation.

    PubMed

    Zhang, Jinao; Zhong, Yongmin; Smith, Julian; Gu, Chengfan

    2016-07-03

    This paper presents a new ChainMail method for real-time soft tissue simulation. This method enables the use of different material properties for chain elements to accommodate various materials. Based on the ChainMail bounding region, a new time-saving scheme is developed to improve computational efficiency for isotropic materials. The proposed method also conserves volume and strain energy. Experimental results demonstrate that the proposed ChainMail method can not only accommodate isotropic, anisotropic and heterogeneous materials but also model incompressibility and relaxation behaviors of soft tissues. Further, the proposed method can achieve real-time computational performance.
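    The ChainMail constraint propagation can be illustrated in one dimension; this is a minimal sketch assuming a uniform material and illustrative spacing bounds (the paper's contribution extends this to per-element material properties for heterogeneous tissue):

```python
def chainmail_1d(positions, moved_idx, new_pos, min_d=0.5, max_d=1.5):
    """1-D ChainMail: move one element, then propagate outward, adjusting
    each neighbor only as far as needed to keep inter-element spacing
    within [min_d, max_d]. Propagation stops once a constraint is satisfied.
    """
    pos = list(positions)
    pos[moved_idx] = new_pos
    # Propagate to the right.
    for i in range(moved_idx + 1, len(pos)):
        gap = pos[i] - pos[i - 1]
        if gap > max_d:
            pos[i] = pos[i - 1] + max_d   # stretched too far: pull closer
        elif gap < min_d:
            pos[i] = pos[i - 1] + min_d   # compressed too much: push away
        else:
            break                          # constraint holds; stop propagating
    # Propagate to the left, symmetrically.
    for i in range(moved_idx - 1, -1, -1):
        gap = pos[i + 1] - pos[i]
        if gap > max_d:
            pos[i] = pos[i + 1] - max_d
        elif gap < min_d:
            pos[i] = pos[i + 1] - min_d
        else:
            break
    return pos

# Pushing the leftmost element rightward drags the chain along.
print(chainmail_1d([0.0, 1.0, 2.0, 3.0], 0, 2.0))  # [2.0, 2.5, 3.0, 3.5]
```

    The early-exit once a neighbor's constraint is satisfied is what makes ChainMail fast enough for real-time use: a small displacement touches only a few elements.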

  5. Extension of the thermal porosimetry method to high gas pressure for nanoporosimetry estimation

    NASA Astrophysics Data System (ADS)

    Jannot, Y.; Degiovanni, A.; Camus, M.

    2018-04-01

    Standard pore size determination methods like mercury porosimetry, nitrogen sorption, microscopy, or X-ray tomography are not suited to highly porous, low-density, and thus very fragile materials. For this kind of material, a method based on thermal characterization has been developed in a previous study. This method has been used with air pressure varying from 10⁻¹ to 10⁵ Pa for materials having a thermal conductivity of less than 0.05 W m⁻¹ K⁻¹ at atmospheric pressure. It enables the estimation of pore size distribution between 100 nm and 1 mm. In this paper, we present a new experimental device enabling thermal conductivity measurement under gas pressure up to 10⁶ Pa, enabling the estimation of the volume fraction of pores having a 10 nm diameter. It is also demonstrated that the main thermal conductivity models (parallel, series, Maxwell, Bruggeman, self-consistent) lead to the same estimation of the pore size distribution as the extended parallel model (EPM) presented in this paper and then used to process the experimental data. Three materials with thermal conductivities at atmospheric pressure ranging from 0.014 W m⁻¹ K⁻¹ to 0.04 W m⁻¹ K⁻¹ are studied. The thermal conductivity measurement results obtained with the three materials are presented, and the corresponding pore size distributions between 10 nm and 1 mm are presented and discussed.
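    The physical basis for pressure-dependent porosimetry is Knudsen-regime gas conduction: the gas contribution to conductivity inside a pore collapses when the molecular mean free path exceeds the pore diameter, and it only recovers at pressures high enough to shrink the mean free path below that diameter. A schematic calculation (the coefficient values are illustrative assumptions, not the paper's EPM):

```python
import math

KB = 1.380649e-23  # Boltzmann constant, J/K

def gas_conductivity_in_pore(pressure_pa, pore_d_m, lam0=0.026,
                             temp_k=300.0, mol_d_m=3.7e-10, beta=1.6):
    """Knudsen-limited gas thermal conductivity inside a pore:
    lam = lam0 / (1 + 2*beta*Kn), with Kn = mean_free_path / pore_diameter.
    lam0 is the free-air conductivity (W/m/K); beta is an
    accommodation-dependent coefficient. Values are illustrative.
    """
    mean_free_path = KB * temp_k / (math.sqrt(2) * math.pi
                                    * mol_d_m ** 2 * pressure_pa)
    kn = mean_free_path / pore_d_m
    return lam0 / (1.0 + 2.0 * beta * kn)

# A 10 nm pore only regains appreciable gas conduction at very high
# pressure -- the motivation for extending the method to ~10^6 Pa.
for p in (1e3, 1e5, 1e6):
    print(p, gas_conductivity_in_pore(p, 10e-9))
```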

  6. Development of an electrothermal vaporization ICP-MS method and assessment of its applicability to studies of the homogeneity of reference materials.

    PubMed

    Friese, K C; Grobecker, K H; Wätjen, U

    2001-07-01

    A method has been developed for measurement of the homogeneity of analyte distribution in powdered materials by use of electrothermal vaporization with inductively coupled plasma mass spectrometric (ETV-ICP-MS) detection. The method enabled the simultaneous determination of As, Cd, Cu, Fe, Mn, Pb, and Zn in milligram amounts of samples of biological origin. The optimized conditions comprised a high plasma power of 1,500 W, reduced aerosol transport flow, and heating ramps below 300 °C s⁻¹. A temperature ramp to 550 °C ensured effective pyrolysis of approximately 70% of the organic compounds without losses of analyte. An additional hold stage at 700 °C led to separation of most of the analyte signals from the evaporation of carbonaceous matrix compounds. The effect of time resolution of signal acquisition on the precision of the ETV measurements was investigated. An increase in the number of masses monitored up to 20 is possible with not more than 1% additional relative standard deviation of results caused by limited temporal resolution of the transient signals. Recording of signals from the nebulization of aqueous standards in each sample run enabled correction for drift of the sensitivity of the ETV-ICP-MS instrument. The applicability of the developed method to homogeneity studies was assessed by use of four certified reference materials. According to the best repeatability observed in these sample runs, the maximum contribution of the method to the standard deviation is approximately 5% to 6% for all the elements investigated.

  7. Some connections between importance sampling and enhanced sampling methods in molecular dynamics.

    PubMed

    Lie, H C; Quer, J

    2017-11-21

    In molecular dynamics, enhanced sampling methods enable the collection of better statistics of rare events from a reference or target distribution. We show that a large class of these methods is based on the idea of importance sampling from mathematical statistics. We illustrate this connection by comparing the Hartmann-Schütte method for rare event simulation (J. Stat. Mech. Theor. Exp. 2012, P11004) and the Valsson-Parrinello method of variationally enhanced sampling [Phys. Rev. Lett. 113, 090601 (2014)]. We use this connection in order to discuss how recent results from the Monte Carlo methods literature can guide the development of enhanced sampling methods.
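    The importance-sampling idea the authors identify can be demonstrated on a toy rare event: estimating P(X > 4) for X ~ N(0, 1) by drawing from a proposal shifted into the rare region and reweighting by the likelihood ratio. This sketch is purely illustrative and is not the Hartmann-Schütte or Valsson-Parrinello scheme itself:

```python
import math
import random

def rare_event_is(n=200_000, threshold=4.0, seed=1):
    """Importance sampling for p = P(X > t), X ~ N(0, 1).

    Draw x from the proposal N(t, 1) and reweight by the likelihood ratio
    phi(x) / phi(x - t) = exp(t^2/2 - t*x), so the estimator
    mean( 1{x > t} * w(x) ) is unbiased for p but has far lower variance
    than naive sampling, which almost never hits the rare region.
    """
    rng = random.Random(seed)
    t = threshold
    total = 0.0
    for _ in range(n):
        x = rng.gauss(t, 1.0)                      # proposal centered on the rare region
        if x > t:
            total += math.exp(t * t / 2 - t * x)   # importance weight
    return total / n

est = rare_event_is()
print(est)   # close to the analytic value 1 - Phi(4) ~ 3.17e-5
```

    A naive Monte Carlo estimate with the same budget would see roughly six exceedances in 200,000 draws; the reweighted proposal concentrates every sample where the event lives.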

  8. Some connections between importance sampling and enhanced sampling methods in molecular dynamics

    NASA Astrophysics Data System (ADS)

    Lie, H. C.; Quer, J.

    2017-11-01

    In molecular dynamics, enhanced sampling methods enable the collection of better statistics of rare events from a reference or target distribution. We show that a large class of these methods is based on the idea of importance sampling from mathematical statistics. We illustrate this connection by comparing the Hartmann-Schütte method for rare event simulation (J. Stat. Mech. Theor. Exp. 2012, P11004) and the Valsson-Parrinello method of variationally enhanced sampling [Phys. Rev. Lett. 113, 090601 (2014)]. We use this connection in order to discuss how recent results from the Monte Carlo methods literature can guide the development of enhanced sampling methods.

  9. Attractive design: an elution solvent optimization platform for magnetic-bead-based fractionation using digital microfluidics and design of experiments.

    PubMed

    Lafrenière, Nelson M; Mudrik, Jared M; Ng, Alphonsus H C; Seale, Brendon; Spooner, Neil; Wheeler, Aaron R

    2015-04-07

    There is great interest in the development of integrated tools allowing for miniaturized sample processing, including solid phase extraction (SPE). We introduce a new format for microfluidic SPE relying on C18-functionalized magnetic beads that can be manipulated in droplets in a digital microfluidic platform. This format provides the opportunity to tune the amount (and potentially the type) of stationary phase on-the-fly, and allows the removal of beads after the extraction (to enable other operations in the same device-space), maintaining device reconfigurability. Using the new method, we employed a design of experiments (DOE) operation to enable automated on-chip optimization of elution solvent composition for reversed phase SPE of a model system. Further, conditions were selected to enable on-chip fractionation of multiple analytes. Finally, the method was demonstrated to be useful for online cleanup of extracts from dried blood spot (DBS) samples. We anticipate this combination of features will prove useful for separating a wide range of analytes, from small molecules to peptides, from complex matrices.
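    An automated DOE sweep like the on-chip elution optimization can be enumerated as a full-factorial design; the factor names and levels below are hypothetical, chosen only to show the shape of such a run list:

```python
from itertools import product

# Hypothetical factors for a reversed-phase elution optimization.
factors = {
    "acetonitrile_pct": (30, 50, 70),
    "formic_acid_pct": (0.0, 0.1),
    "elution_volume_uL": (5, 10),
}

def full_factorial(factors):
    """Every combination of factor levels, as a list of run dictionaries,
    ready to be dispatched one droplet at a time."""
    names = list(factors)
    return [dict(zip(names, combo)) for combo in product(*factors.values())]

runs = full_factorial(factors)
print(len(runs))  # 3 * 2 * 2 = 12 runs
print(runs[0])
```

    On a digital microfluidic platform each dictionary would parameterize one droplet's mixing and elution step; the analyst then fits a response model over the 12 measured recoveries.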

  10. A Parallel Distributed-Memory Particle Method Enables Acquisition-Rate Segmentation of Large Fluorescence Microscopy Images

    PubMed Central

    Afshar, Yaser; Sbalzarini, Ivo F.

    2016-01-01

    Modern fluorescence microscopy modalities, such as light-sheet microscopy, are capable of acquiring large three-dimensional images at high data rate. This creates a bottleneck in computational processing and analysis of the acquired images, as the rate of acquisition outpaces the speed of processing. Moreover, images can be so large that they do not fit the main memory of a single computer. We address both issues by developing a distributed parallel algorithm for segmentation of large fluorescence microscopy images. The method is based on the versatile Discrete Region Competition algorithm, which has previously proven useful in microscopy image segmentation. The present distributed implementation decomposes the input image into smaller sub-images that are distributed across multiple computers. Using network communication, the computers orchestrate the collective solving of the global segmentation problem. This not only enables segmentation of large images (we test images of up to 10¹⁰ pixels), but also accelerates segmentation to match the time scale of image acquisition. Such acquisition-rate image segmentation is a prerequisite for the smart microscopes of the future and enables online data compression and interactive experiments. PMID:27046144
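    The decomposition step can be sketched as a block split with halo (ghost) regions so that neighboring workers share boundary pixels; the function name and the simple per-axis scheme are illustrative assumptions, not the actual implementation:

```python
def tile_1d(length, n_tiles, halo):
    """Split [0, length) into n_tiles contiguous blocks, each padded by
    `halo` pixels on either side (clamped to the image), so adjacent
    workers overlap and can exchange boundary information."""
    base = length // n_tiles
    tiles = []
    for i in range(n_tiles):
        start = i * base
        stop = length if i == n_tiles - 1 else (i + 1) * base
        tiles.append((max(0, start - halo), min(length, stop + halo)))
    return tiles

# A 1000-pixel axis split across 4 workers with an 8-pixel halo;
# applying this per axis tiles a 2-D or 3-D volume.
print(tile_1d(1000, 4, 8))
```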

  11. Modification of measurement methods for evaluation of tissue-engineered cartilage function and biochemical properties using nanosecond pulsed laser

    NASA Astrophysics Data System (ADS)

    Ishihara, Miya; Sato, Masato; Kutsuna, Toshiharu; Ishihara, Masayuki; Mochida, Joji; Kikuchi, Makoto

    2008-02-01

    There is a demand in the field of regenerative medicine for measurement technology that enables determination of the functions and components of engineered tissue. To meet this demand, we developed a method for extracellular matrix characterization using time-resolved autofluorescence spectroscopy, which enabled simultaneous measurement of mechanical properties using the relaxation of a laser-induced stress wave. In this study, in addition to time-resolved fluorescence spectroscopy, a hyperspectral sensor, which enables capture of both spectral and spatial information, was used for evaluation of the biochemical characteristics of tissue-engineered cartilage. The hyperspectral imaging system provides a spectral resolution of 1.2 nm and an image rate of 100 images/sec. The imaging system consisted of the hyperspectral sensor, a scanner for x-y plane imaging, magnifying optics, and a xenon lamp for transmissive lighting. Cellular imaging using the hyperspectral imaging system has been achieved by improving the spatial resolution to 9 micrometers. Spectroscopic cellular imaging was demonstrated using cultured chondrocytes as samples. At an early stage of culture, hyperspectral imaging offered information about cellular function associated with endogenous fluorescent biomolecules.

  12. A Parallel Distributed-Memory Particle Method Enables Acquisition-Rate Segmentation of Large Fluorescence Microscopy Images.

    PubMed

    Afshar, Yaser; Sbalzarini, Ivo F

    2016-01-01

    Modern fluorescence microscopy modalities, such as light-sheet microscopy, are capable of acquiring large three-dimensional images at high data rate. This creates a bottleneck in computational processing and analysis of the acquired images, as the rate of acquisition outpaces the speed of processing. Moreover, images can be so large that they do not fit the main memory of a single computer. We address both issues by developing a distributed parallel algorithm for segmentation of large fluorescence microscopy images. The method is based on the versatile Discrete Region Competition algorithm, which has previously proven useful in microscopy image segmentation. The present distributed implementation decomposes the input image into smaller sub-images that are distributed across multiple computers. Using network communication, the computers orchestrate the collective solving of the global segmentation problem. This not only enables segmentation of large images (we test images of up to 10¹⁰ pixels), but also accelerates segmentation to match the time scale of image acquisition. Such acquisition-rate image segmentation is a prerequisite for the smart microscopes of the future and enables online data compression and interactive experiments.

  13. Launch Vehicle Design and Optimization Methods and Priority for the Advanced Engineering Environment

    NASA Technical Reports Server (NTRS)

    Rowell, Lawrence F.; Korte, John J.

    2003-01-01

    NASA's Advanced Engineering Environment (AEE) is a research and development program that will improve collaboration among design engineers for launch vehicle conceptual design and provide the infrastructure (methods and framework) necessary to enable that environment. In this paper, three major technical challenges facing the AEE program are identified, and three specific design problems are selected to demonstrate how advanced methods can improve current design activities. References are made to studies that demonstrate these design problems and methods, and these studies will provide the detailed information and check cases to support incorporation of these methods into the AEE. This paper provides background and terminology for discussing the launch vehicle conceptual design problem so that the diverse AEE user community can participate in prioritizing the AEE development effort.

  14. Fast dose kernel interpolation using Fourier transform with application to permanent prostate brachytherapy dosimetry.

    PubMed

    Liu, Derek; Sloboda, Ron S

    2014-05-01

    Boyer and Mok proposed a fast calculation method employing the Fourier transform (FT), for which calculation time is independent of the number of seeds but seed placement is restricted to calculation grid points. Here an interpolation method is described enabling unrestricted seed placement while preserving the computational efficiency of the original method. The Iodine-125 seed dose kernel was sampled and selected values were modified to optimize interpolation accuracy for clinically relevant doses. For each seed, the kernel was shifted to the nearest grid point via convolution with a unit impulse, implemented in the Fourier domain. The remaining fractional shift was performed using a piecewise third-order Lagrange filter. Implementation of the interpolation method greatly improved FT-based dose calculation accuracy. The dose distribution was accurate to within 2% beyond 3 mm from each seed. Isodose contours were indistinguishable from explicit TG-43 calculation. Dose-volume metric errors were negligible. Computation time for the FT interpolation method was essentially the same as Boyer's method. A FT interpolation method for permanent prostate brachytherapy TG-43 dose calculation was developed which expands upon Boyer's original method and enables unrestricted seed placement. The proposed method substantially improves the clinically relevant dose accuracy with negligible additional computation cost, preserving the efficiency of the original method.
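    The whole-pixel part of the seed shift, convolution with a unit impulse implemented in the Fourier domain, is equivalent to a circular shift of the kernel, which a direct 1-D circular convolution makes explicit (toy kernel values assumed; the fractional shift via the piecewise third-order Lagrange filter is omitted):

```python
def circular_convolve(a, b):
    """Direct circular convolution -- the spatial-domain equivalent of
    multiplying the DFTs of a and b, as in the FT-based dose calculation."""
    n = len(a)
    return [sum(a[j] * b[(i - j) % n] for j in range(n)) for i in range(n)]

# Toy 1-D dose kernel, peaked at index 0 and falling off with distance.
kernel = [9.0, 4.0, 1.0, 0.5, 0.25, 0.5, 1.0, 4.0]
impulse = [0.0] * 8
impulse[3] = 1.0   # a "seed" placed at grid index 3

shifted = circular_convolve(kernel, impulse)
print(shifted)     # the kernel, circularly rotated so its peak sits at index 3
```

    Summing one such shifted (and fractionally filtered) kernel per seed in the Fourier domain is what makes the calculation time independent of the number of seeds.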

  15. Thin and open vessel windows for intra-vital fluorescence imaging of murine cochlear blood flow

    PubMed Central

    Shi, Xiaorui; Zhang, Fei; Urdang, Zachary; Dai, Min; Neng, Lingling; Zhang, Jinhui; Chen, Songlin; Ramamoorthy, Sripriya; Nuttall, Alfred L.

    2014-01-01

    Normal microvessel structure and function in the cochlea is essential for maintaining the ionic and metabolic homeostasis required for hearing function. Abnormal cochlear microcirculation has long been considered an etiologic factor in hearing disorders. A better understanding of cochlear blood flow (CoBF) will enable more effective amelioration of hearing disorders that result from aberrant blood flow. However, establishing the direct relationship between CoBF and other cellular events in the lateral wall and response to physio-pathological stress remains a challenge due to the lack of feasible interrogation methods and difficulty in accessing the inner ear. Here we report on new methods for studying CoBF in a mouse model using a thin or open vessel-window in combination with fluorescence intra-vital microscopy (IVM). An open vessel-window enables investigation of vascular cell biology and blood flow permeability, including pericyte (PC) contractility, bone marrow cell migration, and endothelial barrier leakage, in wild type and fluorescent protein-labeled transgenic mouse models with high spatial and temporal resolution. Alternatively, the thin vessel-window method minimizes disruption of the homeostatic balance in the lateral wall and enables study of CoBF under relatively intact physiological conditions. A thin vessel-window method can also be used for time-based studies of physiological and pathological processes. Although the small size of the mouse cochlea makes surgery difficult, the methods are sufficiently developed for studying the structural and functional changes in CoBF under normal and pathological conditions. PMID:24780131

  16. Brain vascular image segmentation based on fuzzy local information C-means clustering

    NASA Astrophysics Data System (ADS)

    Hu, Chaoen; Liu, Xia; Liang, Xiao; Hui, Hui; Yang, Xin; Tian, Jie

    2017-02-01

    Light sheet fluorescence microscopy (LSFM) is a powerful optical-resolution fluorescence microscopy technique which enables observation of the mouse brain vascular network at cellular resolution. However, micro-vessel structures show intensity inhomogeneity in LSFM images, which makes extracting line structures difficult. In this work, we developed a vascular image segmentation method that enhances vessel details, which should be useful for estimating statistics like micro-vessel density. Since the eigenvalues of the Hessian matrix and their signs describe different geometric structures in images, enabling construction of a vascular similarity function and enhancement of line signals, the main idea of our method is to cluster the pixel values of the enhanced image. Our method contained three steps: 1) calculate the multiscale gradients and the differences between eigenvalues of the Hessian matrix; 2) to generate the enhanced micro-vessel structures, train a feed-forward neural network on 2.26 million pixels to model the correlations between multiscale gradients and the differences between eigenvalues; 3) use fuzzy local information c-means clustering (FLICM) to cluster the pixel values of the enhanced image. To verify the feasibility and effectiveness of this method, mouse brain vascular images were acquired with a commercial light-sheet microscope in our lab. The experiment showed that the Dice similarity coefficient can reach up to 85%. The results illustrate that our approach to extracting the line structures of blood vessels dramatically improves the vascular image and enables accurate extraction of blood vessels in LSFM images.
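    The eigenvalues of the 2x2 image Hessian used in such vessel enhancement have a closed form; a minimal sketch (the Frangi-style interpretation in the comments is a common illustration, not the paper's trained-network approach):

```python
import math

def hessian_eigenvalues_2d(hxx, hxy, hyy):
    """Eigenvalues of the symmetric 2x2 Hessian [[hxx, hxy], [hxy, hyy]],
    returned ordered by absolute value (|l1| <= |l2|), as vesselness-style
    filters expect."""
    mean = 0.5 * (hxx + hyy)
    dev = math.sqrt((0.5 * (hxx - hyy)) ** 2 + hxy ** 2)
    return sorted((mean - dev, mean + dev), key=abs)

# A bright tubular (vessel-like) pixel: near-zero curvature along the
# vessel, strong negative curvature across it -- so |l1| << |l2|, l2 < 0.
l1, l2 = hessian_eigenvalues_2d(-0.05, 0.0, -4.0)
print(l1, l2)             # small-magnitude vs large-magnitude eigenvalue
print(abs(l1) / abs(l2))  # small ratio indicates a line-like structure
```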

  17. Advancing working and learning through critical action research: creativity and constraints.

    PubMed

    Bellman, Loretta; Bywood, Catherine; Dale, Susan

    2003-12-01

    Continuous professional development is an essential component within many health care 'Learning Organisations'. The paper describes the first phase of an initiative to develop a professional practice development framework for nurses in an NHS general hospital. The project was undertaken within a critical action research methodology. A tripartite arrangement between the hospital, a university and a professional nursing organisation enabled clinical, educational and research support for the nurses (co-researchers) engaged in the project. Initial challenges came from some managers, educationalists and the ethics committee, who did not appear to understand the action research process. A multi-method approach to data collection was undertaken to capture the change process from different stakeholders' perceptions, and triangulation of the data was undertaken. Despite organisational constraints, transformational leadership and peer support enabled the co-researchers to identify and initiate three patient-focused initiatives. The change process for the co-researchers included: an enlightening personal journey, exploring the research-practice gap, enhancing personal and professional knowledge, evolving cultural change and collaborative working, and empowering and disempowering messages. A hospital merger and corporate staff changes directly impacted on the project. A more flexible time-scale and longer-term funding are required to enable continuity for trust-wide projects undertaken in dynamic clinical settings.

  18. Neurobehavioural methods, effects and prevention: workers' human rights are why the field matters for developing countries.

    PubMed

    London, L

    2009-11-01

    Little research into neurobehavioural methods and effects occurs in developing countries, where established neurotoxic chemicals continue to pose significant occupational and environmental burdens, and where agents newly identified as neurotoxic are also widespread. Much of the morbidity and mortality associated with neurotoxic agents remains hidden in developing countries as a result of poor case detection, lack of skilled personnel, facilities and equipment for diagnosis, inadequate information systems, limited resources for research and significant competing causes of ill-health, such as HIV/AIDS and malaria. Placing the problem in a human rights context enables researchers and scientists in developing countries to make a strong case for why the field of neurobehavioural methods and effects matters because there are numerous international human rights commitments that make occupational and environmental health and safety a human rights obligation.

  19. Invited review: A position on the Global Livestock Environmental Assessment Model (GLEAM).

    PubMed

    MacLeod, M J; Vellinga, T; Opio, C; Falcucci, A; Tempio, G; Henderson, B; Makkar, H; Mottet, A; Robinson, T; Steinfeld, H; Gerber, P J

    2018-02-01

    The livestock sector is one of the fastest growing subsectors of the agricultural economy and, while it makes a major contribution to global food supply and economic development, it also consumes significant amounts of natural resources and alters the environment. In order to improve our understanding of the global environmental impact of livestock supply chains, the Food and Agriculture Organization of the United Nations has developed the Global Livestock Environmental Assessment Model (GLEAM). The purpose of this paper is to provide a review of GLEAM. Specifically, it explains the model architecture, methods and functionality, that is, the types of analysis that the model can perform. The model focuses primarily on the quantification of greenhouse gas emissions arising from the production of the 11 main livestock commodities. The model inputs and outputs are managed and produced as raster data sets, with a spatial resolution of 0.05 decimal degrees. The Global Livestock Environmental Assessment Model v1.0 consists of five distinct modules: (a) the Herd Module; (b) the Manure Module; (c) the Feed Module; (d) the System Module; (e) the Allocation Module. In terms of the modelling approach, GLEAM has several advantages. For example, spatial information on livestock distributions and crop yields enables rations to be derived that reflect the local availability of feed resources in developing countries. The Global Livestock Environmental Assessment Model also contains a herd model that enables livestock statistics to be disaggregated and variation in livestock performance and management to be captured. Priorities for future development of GLEAM include: improving data quality and the methods used to perform emissions calculations; extending the scope of the model to include selected additional environmental impacts and to enable predictive modelling; and improving the utility of GLEAM output.

  20. Coupled-cluster based R-matrix codes (CCRM): Recent developments

    NASA Astrophysics Data System (ADS)

    Sur, Chiranjib; Pradhan, Anil K.

    2008-05-01

    We report the ongoing development of the new coupled-cluster R-matrix codes (CCRM) for treating electron-ion scattering and radiative processes within the framework of the relativistic coupled-cluster method (RCC), interfaced with the standard R-matrix methodology. The RCC method is size-consistent and, in principle, equivalent to an all-order many-body perturbation theory. The RCC method is one of the most accurate many-body theories and has been applied to several systems. This project should enable the study of electron interactions with heavy atoms/ions, utilizing not only high-speed computing platforms but also an improved theoretical description of the relativistic and correlation effects for the target atoms/ions as treated extensively within the RCC method. Here we present a comprehensive outline of the newly developed theoretical method and a schematic representation of the new suite of CCRM codes. We begin with the flowchart and description of the various stages involved in this development. We retain notations and nomenclature for the different stages analogous to those of the standard R-matrix codes.

  1. A fibrin-supported myocardial organ culture for isolation of cardiac stem cells via the recapitulation of cardiac homeostasis.

    PubMed

    Kim, Jong-Tae; Chung, Hye Jin; Seo, Ji-Yeon; Yang, Young-Il; Choi, Min-Young; Kim, Hyeong-In; Yang, Tae-Hyun; Lee, Won-Jin; Youn, Young Chul; Kim, Hye Jung; Kim, Yeon Mee; Lee, Hyukjin; Jang, Yang-Soo; Lee, Seung-Jin

    2015-04-01

    There is great interest in the development of cardiac stem cell (CSC)-based therapeutics; clinical translation thus requires an efficient method for attaining therapeutic quantities of these cells. Furthermore, an in vitro model to investigate the mechanisms regulating cardiac homeostasis is crucial. We sought to develop a simple myocardial culture method enabling both the recapitulation of myocardial homeostasis and the simultaneous isolation of CSCs. Intact myocardial fragments were encapsulated three-dimensionally in fibrin and cultured under dynamic conditions. The fibrin provided secure physical support and a substratum for the myocardium, supporting integrin-mediated cell signaling that allowed in situ renewal, outgrowth and cardiomyogenic differentiation of CSCs, mimicking myocardial homeostasis. Since our culture maintained the myocardial CSC niches, it was possible to define the identity of in vitro renewed CSCs situated in the interstitium between cardiomyocytes and microvessels. Lastly, the use of matrix-restricted fibrinolysis enabled the selective isolation of outgrown CSCs that retained clonogenicity, long-term growth competency and cardiovascular commitment potential. Collectively, this myocardial culture might be used as an alternative tool for studying cardiac biology and developing cell-based therapeutics. Copyright © 2015 Elsevier Ltd. All rights reserved.

  2. Holoentropy enabled-decision tree for automatic classification of diabetic retinopathy using retinal fundus images.

    PubMed

    Mane, Vijay Mahadeo; Jadhav, D V

    2017-05-24

    Diabetic retinopathy (DR) is the most common diabetic eye disease. Doctors use various test methods to detect DR, but the limited availability of these methods and the need for domain experts pose a challenge for automatic DR detection. To meet this challenge, a variety of algorithms has been developed in the literature. In this paper, we propose a system consisting of a novel sparking process and a holoentropy-based decision tree for the automatic classification of DR images, to further improve effectiveness. The sparking process algorithm is developed for automatic segmentation of blood vessels through the estimation of an optimal threshold. The holoentropy-enabled decision tree is newly developed for the automatic classification of retinal images as normal or abnormal using hybrid features, which preserve disease-level patterns beyond signal-level features. The effectiveness of the proposed system is analyzed on the standard fundus image databases DIARETDB0 and DIARETDB1 in terms of sensitivity, specificity and accuracy, yielding values of 96.72%, 97.01% and 96.45%, respectively. The experimental results reveal that the proposed technique outperforms existing algorithms.

  3. Measurement Frontiers in Molecular Biology

    NASA Astrophysics Data System (ADS)

    Laderman, Stephen

    2009-03-01

    Developments of molecular measurements and manipulations have long enabled forefront research in evolution, genetics, biological development and its dysfunction, and the impact of external factors on the behavior of cells. Measurement remains at the heart of exciting and challenging basic and applied problems in molecular and cell biology. Methods to precisely determine the identity and abundance of particular molecules amongst a complex mixture of similar and dissimilar types require the successful design and integration of multiple steps involving biochemical manipulations, separations, physical probing, and data processing. Accordingly, today's most powerful methods for characterizing life at the molecular level depend on coordinated advances in applied physics, biochemistry, chemistry, computer science, and engineering. This is well illustrated by recent approaches to the measurement of DNA, RNA, proteins, and intact cells. Such successes underlie well founded visions of how molecular biology can further assist in answering compelling scientific questions and in enabling the development of remarkable advances in human health. These visions, in turn, are motivating the interdisciplinary creation of even more comprehensive measurements. As a further and closely related consequence, they are motivating innovations in the conceptual and practical approaches to organizing and visualizing large, complex sets of interrelated experimental results and distilling from those data compelling, informative conclusions.

  4. Development of the Vinylogous Pictet-Spengler Cyclization and Total Synthesis of (±)-Lundurine A.

    PubMed

    Nash, Aaron; Qi, Xiangbing; Maity, Pradip; Owens, Kyle; Tambar, Uttam K

    2018-04-16

    A novel vinylogous Pictet-Spengler cyclization has been developed for the generation of indole-annulated medium-sized rings. The method enables the synthesis of tetrahydroazocinoindoles with a fully substituted carbon center, a prevalent structural motif in many biologically active alkaloids. The strategy has been applied to the total synthesis of (±)-lundurine A. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. Model-Driven Engineering: Automatic Code Generation and Beyond

    DTIC Science & Technology

    2015-03-01

    and Weblogic as well as cloud environments such as Microsoft Azure and Amazon Web Services®. Finally, while the generated code has dependencies on...code generation in the context of the full system lifecycle from development to sustainment. Acquisition programs in government or large commercial...Acquirers are concerned with the full system lifecycle, and they need confidence that the development methods will enable the system to meet the functional

  6. DA+ data acquisition and analysis software at the Swiss Light Source macromolecular crystallography beamlines.

    PubMed

    Wojdyla, Justyna Aleksandra; Kaminski, Jakub W; Panepucci, Ezequiel; Ebner, Simon; Wang, Xiaoqiang; Gabadinho, Jose; Wang, Meitian

    2018-01-01

    Data acquisition software is an essential component of modern macromolecular crystallography (MX) beamlines, enabling efficient use of beam time at synchrotron facilities. Developed at the Paul Scherrer Institute, the DA+ data acquisition software is implemented at all three Swiss Light Source (SLS) MX beamlines. DA+ consists of distributed services and components written in Python and Java, which communicate via messaging and streaming technologies. The major components of DA+ are the user interface, acquisition engine, online processing and database. Immediate data quality feedback is achieved with distributed automatic data analysis routines. The software architecture enables exploration of the full potential of the latest instrumentation at the SLS MX beamlines, such as the SmarGon goniometer and the EIGER X 16M detector, and development of new data collection methods.

  7. Human Connectome Project Informatics: quality control, database services, and data visualization

    PubMed Central

    Marcus, Daniel S.; Harms, Michael P.; Snyder, Abraham Z.; Jenkinson, Mark; Wilson, J Anthony; Glasser, Matthew F.; Barch, Deanna M.; Archie, Kevin A.; Burgess, Gregory C.; Ramaratnam, Mohana; Hodge, Michael; Horton, William; Herrick, Rick; Olsen, Timothy; McKay, Michael; House, Matthew; Hileman, Michael; Reid, Erin; Harwell, John; Coalson, Timothy; Schindler, Jon; Elam, Jennifer S.; Curtiss, Sandra W.; Van Essen, David C.

    2013-01-01

    The Human Connectome Project (HCP) has developed protocols, standard operating and quality control procedures, and a suite of informatics tools to enable high throughput data collection, data sharing, automated data processing and analysis, and data mining and visualization. Quality control procedures include methods to maintain data collection consistency over time, to measure head motion, and to establish quantitative modality-specific overall quality assessments. Database services developed as customizations of the XNAT imaging informatics platform support both internal daily operations and open access data sharing. The Connectome Workbench visualization environment enables user interaction with HCP data and is increasingly integrated with the HCP's database services. Here we describe the current state of these procedures and tools and their application in the ongoing HCP study. PMID:23707591

  8. SUPPORTING PHYSICIANS' PRACTICE-BASED LEARNING AND IMPROVEMENT (PBLI) AND QUALITY IMPROVEMENT THROUGH EXPLORATION OF POPULATION-BASED MEDICAL DATA.

    PubMed

    Baumgart, Leigh A; Bass, Ellen J; Lyman, Jason A; Springs, Sherry; Voss, John; Hayden, Gregory F; Hellems, Martha A; Hoke, Tracey R; Schlag, Katharine A; Schorling, John B

    2010-01-01

    Participating in self-assessment activities may stimulate improvement in practice behaviors. However, it is unclear how best to support the development of self-assessment skills, particularly in the health care domain. Exploration of population-based data is one method to enable health care providers to identify deficiencies in overall practice behavior that can motivate quality improvement initiatives. At the University of Virginia, we are developing a decision support tool to integrate and present population-based patient data to health care providers related to both clinical outcomes and non-clinical measures (e.g., demographic information). By enabling users to separate their direct impact on clinical outcomes from other factors out of their control, we may enhance the self-assessment process.

  9. SUPPORTING PHYSICIANS’ PRACTICE-BASED LEARNING AND IMPROVEMENT (PBLI) AND QUALITY IMPROVEMENT THROUGH EXPLORATION OF POPULATION-BASED MEDICAL DATA

    PubMed Central

    Baumgart, Leigh A.; Bass, Ellen J.; Lyman, Jason A.; Springs, Sherry; Voss, John; Hayden, Gregory F.; Hellems, Martha A.; Hoke, Tracey R.; Schlag, Katharine A.; Schorling, John B.

    2011-01-01

    Participating in self-assessment activities may stimulate improvement in practice behaviors. However, it is unclear how best to support the development of self-assessment skills, particularly in the health care domain. Exploration of population-based data is one method to enable health care providers to identify deficiencies in overall practice behavior that can motivate quality improvement initiatives. At the University of Virginia, we are developing a decision support tool to integrate and present population-based patient data to health care providers related to both clinical outcomes and non-clinical measures (e.g., demographic information). By enabling users to separate their direct impact on clinical outcomes from other factors out of their control, we may enhance the self-assessment process. PMID:21874123

  10. Enabling Data Access for Environmental Monitoring: SERVIR West Africa

    NASA Astrophysics Data System (ADS)

    Yetman, G.; de Sherbinin, A. M.

    2017-12-01

    SERVIR is a joint effort between NASA and the U.S. Agency for International Development to form regional partnerships and bring satellite-based Earth monitoring and geographic information technologies to bear on environmental issues. The recently established SERVIR node for West Africa aims to "connect space to villages" and enable response to environmental change at the national and local level through partnering with a network of organizations in the region. Comprehensive services—data streams, analysis methods and algorithms, and information products for decision making—to support environmental monitoring of five critical issues identified by West African network members are being designed and developed: ephemeral water, charcoal production, locusts, groundwater, and land use/land cover change. Additionally, climate change information is critical for planning and context in each of these issues. The selection of data and methods is a collaborative effort, with experts in the region working with experts at NASA and the scientific community to best meet information monitoring requirements. Design and delivery of these services requires capacity development in a number of areas, including best practices in data management, analysis methods for combining multiple data streams, and information technology infrastructure. Two research centers at Columbia University are implementing partners for SERVIR West Africa, acting to support capacity development in network members through a combination of workshops, training, and implementation of technologies in the region. The presentation will focus on efforts by these centers to assess current capabilities and improve capacity through gathering requirements, system design, technology selection, technology deployment, training, and workshops.

  11. The software-cycle model for re-engineering and reuse

    NASA Technical Reports Server (NTRS)

    Bailey, John W.; Basili, Victor R.

    1992-01-01

    This paper reports on the progress of a study which will contribute to our ability to perform high-level, component-based programming by describing means to obtain useful components, methods for the configuration and integration of those components, and an underlying economic model of the costs and benefits associated with this approach to reuse. One goal of the study is to develop and demonstrate methods to recover reusable components from domain-specific software through a combination of tools, to perform the identification, extraction, and re-engineering of components, and domain experts, to direct the applications of those tools. A second goal of the study is to enable the reuse of those components by identifying techniques for configuring and recombining the re-engineered software. This component-recovery or software-cycle model addresses not only the selection and re-engineering of components, but also their recombination into new programs. Once a model of reuse activities has been developed, the quantification of the costs and benefits of various reuse options will enable the development of an adaptable economic model of reuse, which is the principal goal of the overall study. This paper reports on the conception of the software-cycle model and on several supporting techniques of software recovery, measurement, and reuse which will lead to the development of the desired economic model.

  12. Flexible Method for Inter-object Communication in C++

    NASA Technical Reports Server (NTRS)

    Curlett, Brian P.; Gould, Jack J.

    1994-01-01

    A method has been developed for organizing and sharing large amounts of information between objects in C++ code. This method uses a set of object classes to define variables and group them into tables. The variable tables presented here provide a convenient way of defining and cataloging data, as well as a user-friendly input/output system, a standardized set of access functions, mechanisms for ensuring data integrity, methods for interprocessor data transfer, and an interpretive language for programming relationships between parameters. The object-oriented nature of these variable tables enables the use of multiple data types, each with unique attributes and behavior. Because each variable provides its own access methods, redundant table lookup functions can be bypassed, thus decreasing access times while maintaining data integrity. In addition, a method for automatic reference counting was developed to manage memory safely.
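
    The design described above — named, typed variables grouped into tables, with each variable providing its own access functions that enforce data integrity — can be illustrated with a brief sketch. The sketch below is in Python rather than the original C++, and all class and method names are hypothetical, not taken from the NASA code:

```python
class Variable:
    """A named, typed value that carries its own access functions and integrity check."""
    def __init__(self, name, vtype, value=None, validator=None):
        self.name = name
        self.vtype = vtype
        self._validator = validator
        self._value = None
        if value is not None:
            self.set(value)

    def set(self, value):
        # Each variable enforces its own type and integrity rules on assignment.
        if not isinstance(value, self.vtype):
            raise TypeError(f"{self.name} expects {self.vtype.__name__}")
        if self._validator is not None and not self._validator(value):
            raise ValueError(f"{self.name}: integrity check failed")
        self._value = value

    def get(self):
        return self._value


class VariableTable:
    """Catalogs variables by name, giving a standardized definition/lookup interface."""
    def __init__(self):
        self._vars = {}

    def define(self, name, vtype, value=None, validator=None):
        var = Variable(name, vtype, value, validator)
        self._vars[name] = var
        return var

    def __getitem__(self, name):
        return self._vars[name]
```

    Because each `Variable` validates on `set`, callers can bypass a central lookup-and-check routine while data integrity is still maintained, which mirrors the access-time benefit the abstract describes.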

  13. Cost-Effective, In Situ Field Measurements for Determining the Water Retention Behavior of Individual Right-of-Way Bioswales

    NASA Astrophysics Data System (ADS)

    Wang, S.; McGillis, W. R.; Hu, R.; Culligan, P. J.

    2017-12-01

    Green infrastructure (GI) interventions, such as right-of-way bioswales, are being implemented in many urban areas, including New York City, to help mitigate the negative impacts of stormwater runoff. To understand the stormwater retention capacity of bioswales, hydrological models, at scales ranging from the tributary area of a single right-of-way bioswale to an entire watershed, are often invoked. The validation and calibration of these models is, however, currently hampered by a lack of extensive field measurements that quantify bioswale stormwater retention behavior for different storm sizes and bioswale configurations. To overcome this problem, three field methods to quantify the water retention capacity of individual bioswales were developed. The methods are potentially applicable to other applications concerned with quantifying flow regimes in urban areas. Precise measurements with high time resolution and low environmental impact are desired for gauging the hydraulic performance of bioswales and similar GI configurations. To satisfy these requirements, an in-field measurement method was developed that involved the deployment of acoustic water-level sensors to measure the upstream and downstream water levels of flow into and out of a bioswale located in the Bronx area of New York City. The measurements were made during several individual storm events. To provide reference flow rates enabling accurate calibration of the acoustic water-level measurements, two other conductometry-based methods, which made use of YSI sensors and injected calcium chloride solutions, were also developed and deployed simultaneously with the water-level measurements. The suite of data gathered by these methods enabled the development of a relationship between stage-discharge and rainfall intensity, which was then used to obtain the upstream and downstream hydrographs for the individual bioswale for the different storm events.
    This presentation will describe in detail the developed field methods, and will present results arising from the deployment of the methods, including results on the stormwater infiltration quantity and infiltration rate of the studied bioswale. The field methods are easily deployed at other bioswale sites and for other similar GI configurations.
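
    A stage-discharge relationship of the kind developed here is commonly modeled as a power-law rating curve Q = a·(h − h0)^b fitted to paired stage/flow measurements. The sketch below, using illustrative synthetic data and hypothetical function names (not the study's data or code), fits such a curve by least squares in log space:

```python
import math

def fit_rating_curve(stages, discharges, h0=0.0):
    """Fit Q = a*(h - h0)**b by linear regression on log Q = log a + b*log(h - h0)."""
    xs = [math.log(h - h0) for h in stages]
    ys = [math.log(q) for q in discharges]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    # Ordinary least-squares slope and intercept in log-log coordinates.
    b = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
    a = math.exp(my - b * mx)
    return a, b

def discharge(h, a, b, h0=0.0):
    """Evaluate the fitted rating curve at stage h."""
    return a * (h - h0) ** b
```

    Once calibrated against the reference conductometry flows, such a curve converts the continuous acoustic water-level record into upstream and downstream hydrographs for each storm.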

  14. A Way to Select Electrical Sheets of the Segment Stator Core Motors.

    NASA Astrophysics Data System (ADS)

    Enomoto, Yuji; Kitamura, Masashi; Sakai, Toshihiko; Ohara, Kouichiro

    The segment stator core, high-density winding coil, and high-energy-product permanent magnet are indispensable technologies in the development of compact and highly efficient motors. The conventional design method for the segment stator core depended largely on experience in selecting a suitable electromagnetic material, and was far from an optimized design. Therefore, we have developed a novel design method for selecting a suitable electromagnetic material based on evaluating the correlation between material characteristics and motor performance. It enables the selection of a suitable electromagnetic material that will meet the motor specification.

  15. The technology and biology of single-cell RNA sequencing.

    PubMed

    Kolodziejczyk, Aleksandra A; Kim, Jong Kyoung; Svensson, Valentine; Marioni, John C; Teichmann, Sarah A

    2015-05-21

    The differences between individual cells can have profound functional consequences, in both unicellular and multicellular organisms. Recently developed single-cell mRNA-sequencing methods enable unbiased, high-throughput, and high-resolution transcriptomic analysis of individual cells. This provides an additional dimension to transcriptomic information relative to traditional methods that profile bulk populations of cells. Already, single-cell RNA-sequencing methods have revealed new biology in terms of the composition of tissues, the dynamics of transcription, and the regulatory relationships between genes. Rapid technological developments at the level of cell capture, phenotyping, molecular biology, and bioinformatics promise an exciting future with numerous biological and medical applications. Copyright © 2015 Elsevier Inc. All rights reserved.

  16. High-Throughput Thermodynamic Modeling and Uncertainty Quantification for ICME

    NASA Astrophysics Data System (ADS)

    Otis, Richard A.; Liu, Zi-Kui

    2017-05-01

    One foundational component of integrated computational materials engineering (ICME) and the Materials Genome Initiative is computational thermodynamics based on the calculation of phase diagrams (CALPHAD) method. The CALPHAD method pioneered by Kaufman has enabled the development of thermodynamic, atomic mobility, and molar volume databases of individual phases in the full space of temperature, composition, and sometimes pressure for technologically important multicomponent engineering materials, along with sophisticated computational tools for using the databases. In this article, our recent efforts are presented in terms of developing new computational tools for high-throughput modeling and uncertainty quantification based on high-throughput first-principles calculations and the CALPHAD method, along with their potential propagation to downstream ICME modeling and simulations.

  17. Organocatalyzed Photocontrolled Radical Polymerization of Semifluorinated (Meth)acrylates Driven by Visible Light.

    PubMed

    Gong, Honghong; Zhao, Yucheng; Shen, Xianwang; Lin, Jun; Chen, Mao

    2018-01-02

    Fluorinated polymers are important materials that are widely used in many areas. Herein, we report the development of a metal-free photocontrolled radical polymerization of semifluorinated (meth)acrylates with a new visible-light-absorbing organocatalyst. This method enabled the production of a variety of semifluorinated polymers with narrow molar-weight distributions from semifluorinated trithiocarbonates or perfluoroalkyl iodides. The high performance of "ON/OFF" control and chain-extension experiments further demonstrate the utility and reliability of this method. Furthermore, to streamline the preparation of semifluorinated polymers, a scalable continuous-flow approach has been developed. Given the broad interest in fluorinated materials and photopolymerization, we expect that this method will facilitate the development of advanced materials with unique properties. © 2018 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. SU-C-17A-07: The Development of An MR Accelerator-Enabled Planning-To-Delivery Technique for Stereotactic Palliative Radiotherapy Treatment of Spinal Metastases

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hoogcarspel, S J; Kontaxis, C; Velden, J M van der

    2014-06-01

    Purpose: To develop an MR accelerator-enabled online planning-to-delivery technique for stereotactic palliative radiotherapy treatment of spinal metastases. The technical challenges include: automated stereotactic treatment planning, online MR-based dose calculation, and MR guidance during treatment. Methods: Using the CT data of 20 patients previously treated at our institution, a class solution for automated treatment planning for spinal bone metastases was created. For accurate dose simulation right before treatment, we fused geometrically correct online MR data with pretreatment CT data of the target volume (TV). For target tracking during treatment, a dynamic T2-weighted TSE MR sequence was developed. An in-house developed GPU-based IMRT optimization and dose calculation algorithm was used for fast treatment planning and simulation. An automatically generated treatment plan developed with this treatment planning system was irradiated on a clinical 6 MV linear accelerator and evaluated using a Delta4 dosimeter. Results: The automated treatment planning method yielded clinically viable plans for all patients. The MR-CT fusion-based dose calculation accuracy was within 2% as compared to calculations performed with original CT data. The dynamic T2-weighted TSE MR sequence was able to provide an update of the anatomical location of the TV every 10 seconds. Dose calculation and optimization of the automatically generated treatment plans using only one GPU took on average 8 minutes. The Delta4 measurement of the irradiated plan agreed with the dose calculation with a 3%/3mm gamma pass rate of 86.4%. Conclusions: The development of an MR accelerator-enabled planning-to-delivery technique for stereotactic palliative radiotherapy treatment of spinal metastases was presented.
    Future work will involve developing an intrafraction motion adaptation strategy, MR-only dose calculation, radiotherapy quality assurance in a magnetic field, and streamlining the entire treatment process on an MR accelerator.

  19. Toxin Detection by Surface Plasmon Resonance

    PubMed Central

    Hodnik, Vesna; Anderluh, Gregor

    2009-01-01

    Significant efforts have been invested in recent years in the development of analytical methods for fast toxin detection in food and water. Immunochemical methods like ELISA, spectroscopy and chromatography are the most widely used in toxin detection. Different methods have been linked, e.g. liquid chromatography and mass spectrometry (LC-MS), in order to detect concentrations as low as possible. Surface plasmon resonance (SPR) is one of the newer biophysical methods enabling rapid toxin detection. Moreover, this method has already been included in portable sensors for on-site determinations. In this paper we describe some of the most common methods for toxin detection, with an emphasis on SPR. PMID:22573957

  20. High-yield fermentation and a novel heat-precipitation purification method for hydrophobin HGFI from Grifola frondosa in Pichia pastoris.

    PubMed

    Song, Dongmin; Gao, Zhendong; Zhao, Liqiang; Wang, Xiangxiang; Xu, Haijin; Bai, Yanling; Zhang, Xiuming; Linder, Markus B; Feng, Hui; Qiao, Mingqiang

    2016-12-01

    Hydrophobins are proteins produced by filamentous fungi that have high natural surfactant activity and can self-assemble at air-water or solid-water interfaces to form amphiphilic membranes. Here, we report a high-yield fermentation method for hydrophobin HGFI from Grifola frondosa in Pichia pastoris, attaining production of 300 mg/L by keeping the dissolved-oxygen level at 15%-25% through tuning of the methanol-feeding rate. We also developed a novel HGFI-purification method enabling large-scale purification of HGFI with >90% recovery, based on our observation that hydrophobin HGFI in fermentation broth precipitates at pH < 7.0 and temperatures >90 °C. Characterization of protein purified by this method using atomic force microscopy, circular dichroism, X-ray photoelectron spectroscopy, and water-contact-angle measurement showed it to be similar to protein purified by ultrafiltration without heat treatment, indicating that our method maintains the native HGFI structure and properties. Furthermore, the purification method presented here can be applied to large-scale purification of other type I hydrophobins. Copyright © 2016. Published by Elsevier Inc.

  1. IT investments can add business value.

    PubMed

    Williams, Terry G

    2002-05-01

    Investment in information technology (IT) is costly, but necessary to enable healthcare organizations to improve their infrastructure and achieve other improvement initiatives. Such an investment is even more costly, however, if the technology does not appropriately enable organizations to perform business processes that help them accomplish their mission of providing safe, high-quality care cost-effectively. Before committing to a costly IT investment, healthcare organizations should implement a decision-making process that can help them choose, implement, and use technology that will provide sustained business value. A seven-step decision-making process that can help healthcare organizations achieve this result involves performing a gap analysis, assessing and aligning organizational goals, establishing distributed accountability, identifying linked organizational-change initiatives, determining measurement methods, establishing appropriate teams to ensure systems are integrated with multidisciplinary improvement methods, and developing a plan to accelerate adoption of the IT product.

  2. A high performance biometric signal and image processing method to reveal blood perfusion towards 3D oxygen saturation mapping

    NASA Astrophysics Data System (ADS)

    Imms, Ryan; Hu, Sijung; Azorin-Peris, Vicente; Trico, Michaël.; Summers, Ron

    2014-03-01

    Non-contact imaging photoplethysmography (PPG) is a recent development in the field of physiological data acquisition, currently undergoing a large amount of research to characterize and define the range of its capabilities. Contact-based PPG techniques have been broadly used in clinical scenarios for a number of years to obtain direct information about the degree of oxygen saturation for patients. With the advent of imaging techniques, there is strong potential to enable access to additional information such as multi-dimensional blood perfusion and saturation mapping. The further development of effective opto-physiological monitoring techniques is dependent upon novel modelling techniques coupled with improved sensor design and effective signal processing methodologies. The biometric signal and imaging processing platform (bSIPP) provides a comprehensive set of features for extraction and analysis of recorded iPPG data, enabling direct comparison with other biomedical diagnostic tools such as ECG and EEG. Additionally, utilizing information about the nature of tissue structure has enabled the generation of an engineering model describing the behaviour of light during its travel through the biological tissue. This enables the estimation of the relative oxygen saturation and blood perfusion in different layers of the tissue to be calculated, which has the potential to be a useful diagnostic tool.

  3. Neural-Network-Development Program

    NASA Technical Reports Server (NTRS)

    Phillips, Todd A.

    1993-01-01

    NETS, software tool for development and evaluation of neural networks, provides simulation of neural-network algorithms plus computing environment for development of such algorithms. Uses back-propagation learning method for all of the networks it creates. Enables user to customize patterns of connections between layers of network. Also provides features for saving weight values during learning, giving more-precise control over learning process. Written in ANSI standard C language. Machine-independent version (MSC-21588) includes only code for command-line-interface version of NETS 3.0.
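The back-propagation learning method that NETS applies can be sketched minimally as follows. This is an illustrative one-hidden-layer sigmoid network trained on the OR function, not the NETS code itself (which is written in C):

```python
import math
import random

random.seed(0)
sig = lambda x: 1.0 / (1.0 + math.exp(-x))

def train(data, hidden=4, epochs=2000, lr=0.5):
    """One-hidden-layer sigmoid network trained by plain back-propagation
    (stochastic gradient descent on squared error)."""
    n_in = len(data[0][0])
    w1 = [[random.uniform(-1, 1) for _ in range(n_in + 1)] for _ in range(hidden)]
    w2 = [random.uniform(-1, 1) for _ in range(hidden + 1)]
    for _ in range(epochs):
        for x, t in data:
            xi = list(x) + [1.0]                              # bias input
            h = [sig(sum(w * v for w, v in zip(row, xi))) for row in w1]
            hi = h + [1.0]                                    # bias unit
            y = sig(sum(w * v for w, v in zip(w2, hi)))
            dy = (y - t) * y * (1 - y)                        # output delta
            dh = [dy * w2[j] * h[j] * (1 - h[j]) for j in range(hidden)]
            w2 = [w - lr * dy * v for w, v in zip(w2, hi)]    # weight updates
            for j in range(hidden):
                w1[j] = [w - lr * dh[j] * v for w, v in zip(w1[j], xi)]

    def predict(x):
        xi = list(x) + [1.0]
        hi = [sig(sum(w * v for w, v in zip(row, xi))) for row in w1] + [1.0]
        return sig(sum(w * v for w, v in zip(w2, hi)))
    return predict

or_data = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
net = train(or_data)
preds = [round(net(x)) for x, _ in or_data]
```

The hidden-layer size and learning rate here are arbitrary choices; NETS additionally lets the user customize which layers connect to which.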

  4. Satellite voice broadcast. Volume 2: System study

    NASA Technical Reports Server (NTRS)

    Bachtell, E. E.; Bettadapur, S. S.; Coyner, J. V.; Farrell, C. E.

    1985-01-01

    The Technical Volume of the Satellite Broadcast System Study is presented. Designs are synthesized for direct sound broadcast satellite systems for HF-, VHF-, L-, and Ku-bands. Methods are developed and used to predict satellite weight, volume, and RF performance for the various concepts considered. Cost and schedule risk assessments are performed to predict time and cost required to implement selected concepts. Technology assessments and tradeoffs are made to identify critical enabling technologies that require development to bring technical risk to acceptable levels for full scale development.

  5. Employing the arts for knowledge production and translation: Visualizing new possibilities for women speaking up about safety concerns in maternity.

    PubMed

    Mackintosh, Nicola; Sandall, Jane; Collison, Claire; Carter, Wendy; Harris, James

    2018-06-01

    This project used animated film to translate research findings into accessible health information aimed at enabling women to speak up and secure professional help for serious safety concerns during pregnancy and after birth. We tested as proof of concept our use of the arts both as product (knowledge production) and process (enabling involvement). Emergencies during pregnancy and birth, while unusual, can develop rapidly and unexpectedly, with catastrophic consequences. Women's tacit knowledge of changes in their condition is an important resource to aid early detection, but women can worry about the legitimacy of their concerns and struggle to get these taken seriously by staff. Arts-based knowledge translation. A user group of women who had experienced complications in the perinatal period (n = 34) helped us develop and pilot test the animation. Obstetricians and midwives (15), clinical leads (3) and user group representatives (8) helped with the design and testing. The consultation process, script and storyboard enabled active interaction with the evidence, meaningful engagement with stakeholders and new understandings about securing help for perinatal complications. The method enabled us to address gender stereotypes and social norms about speaking up and embed a social script for women within the animation, to help structure their help seeking. While for some women, there was an emotional burden, the majority were glad to have been part of the animation's development and felt it had enabled their voices to be heard. This project has demonstrated the benefits of arts-science collaborations for meaningful co-production and effective translation of research evidence. © 2017 The Authors. Health Expectations published by John Wiley & Sons Ltd.

  6. Playing with food. A novel approach to understanding nutritional behaviour development.

    PubMed

    Lynch, Meghan

    2010-06-01

    This study explored the use of a novel method of collecting data on nutritional behaviour development in young children: videos posted on the Internet site YouTube. YouTube videos (n=115) of children alone and interacting with parents in toy kitchen settings were analyzed using constant comparison analysis. Results revealed that in the videos of play nutritional behaviours, children showed influences of their real social environments, and that this medium enabled the observation of parent-child interactions in a more natural context without the researcher's presence. These findings encourage further research in the development and validity of alternative methods of data collection. Copyright 2010 Elsevier Ltd. All rights reserved.

  7. GEOS. User Tutorials

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fu, Pengchen; Settgast, Randolph R.; Johnson, Scott M.

    2014-12-17

    GEOS is a massively parallel, multi-physics simulation application utilizing high performance computing (HPC) to address subsurface reservoir stimulation activities with the goal of optimizing current operations and evaluating innovative stimulation methods. GEOS enables coupling of different solvers associated with the various physical processes occurring during reservoir stimulation in unique and sophisticated ways, adapted to various geologic settings, materials and stimulation methods. Developed at the Lawrence Livermore National Laboratory (LLNL) as a part of a Laboratory-Directed Research and Development (LDRD) Strategic Initiative (SI) project, GEOS represents the culmination of a multi-year ongoing code development and improvement effort that has leveraged existing code capabilities and staff expertise to design new computational geosciences software.

  8. Three dimensional fabrication at small size scales

    PubMed Central

    Leong, Timothy G.; Zarafshar, Aasiyeh M.; Gracias, David H.

    2010-01-01

    Despite the fact that we live in a three-dimensional (3D) world and macroscale engineering is 3D, conventional sub-mm scale engineering is inherently two-dimensional (2D). New fabrication and patterning strategies are needed to enable truly three-dimensionally-engineered structures at small size scales. Here, we review strategies that have been developed over the last two decades that seek to enable such millimeter to nanoscale 3D fabrication and patterning. A focus of this review is the strategy of self-assembly, specifically in a biologically inspired, more deterministic form known as self-folding. Self-folding methods can leverage the strengths of lithography to enable the construction of precisely patterned 3D structures and “smart” components. This self-assembling approach is compared with other 3D fabrication paradigms, and its advantages and disadvantages are discussed. PMID:20349446

  9. Google Earth Engine: a new cloud-computing platform for global-scale earth observation data and analysis

    NASA Astrophysics Data System (ADS)

    Moore, R. T.; Hansen, M. C.

    2011-12-01

    Google Earth Engine is a new technology platform that enables monitoring and measurement of changes in the earth's environment, at planetary scale, on a large catalog of earth observation data. The platform offers intrinsically-parallel computational access to thousands of computers in Google's data centers. Initial efforts have focused primarily on global forest monitoring and measurement, in support of REDD+ activities in the developing world. The intent is to put this platform into the hands of scientists and developing world nations, in order to advance the broader operational deployment of existing scientific methods, and strengthen the ability for public institutions and civil society to better understand, manage and report on the state of their natural resources. Earth Engine currently hosts online nearly the complete historical Landsat archive of L5 and L7 data collected over more than twenty-five years. Newly-collected Landsat imagery is downloaded from USGS EROS Center into Earth Engine on a daily basis. Earth Engine also includes a set of historical and current MODIS data products. The platform supports generation, on-demand, of spatial and temporal mosaics, "best-pixel" composites (for example to remove clouds and gaps in satellite imagery), as well as a variety of spectral indices. Supervised learning methods are available over the Landsat data catalog. The platform also includes a new application programming framework, or "API", that allows scientists access to these computational and data resources, to scale their current algorithms or develop new ones. Under the covers of the Google Earth Engine API is an intrinsically-parallel image-processing system. Several forest monitoring applications powered by this API are currently in development and expected to be operational in 2011. 
Combining science with massive data and technology resources in a cloud-computing framework can offer advantages of computational speed, ease-of-use and collaboration, as well as transparency in data and methods. Methods developed for global processing of MODIS data to map land cover are being adopted for use with Landsat data. Specifically, the MODIS Vegetation Continuous Field product methodology has been applied for mapping forest extent and change at national scales using Landsat time-series data sets. Scaling this method to continental and global scales is enabled by Google Earth Engine computing capabilities. By combining the supervised learning VCF approach with the Landsat archive and cloud computing, unprecedented monitoring of land cover dynamics is enabled.
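The "best-pixel" compositing described above (e.g. removing clouds and gaps from satellite imagery) can be sketched as a per-pixel median over a masked image stack. This toy example illustrates the idea only; it is not the Earth Engine API:

```python
from statistics import median

def best_pixel_composite(stack, mask_value=None):
    """Per-pixel median across an image time series, skipping masked
    (e.g. cloud-covered) observations -- a toy 'best-pixel' composite."""
    rows, cols = len(stack[0]), len(stack[0][0])
    out = [[mask_value] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            vals = [img[r][c] for img in stack if img[r][c] != mask_value]
            if vals:
                out[r][c] = median(vals)
    return out

# three hypothetical 2x2 scenes; None marks cloud-masked pixels
scenes = [
    [[10, None], [30, 40]],
    [[12, 22], [None, 44]],
    [[11, 20], [31, 42]],
]
composite = best_pixel_composite(scenes)
```

Taking the median rather than the mean makes the composite robust to undetected outliers (e.g. residual cloud or shadow) in any single scene.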

  10. Micromechanics-Based Structural Analysis (FEAMAC) and Multiscale Visualization within Abaqus/CAE Environment

    NASA Technical Reports Server (NTRS)

    Arnold, Steven M.; Bednarcyk, Brett A.; Hussain, Aquila; Katiyar, Vivek

    2010-01-01

    A unified framework is presented that enables coupled multiscale analysis of composite structures and associated graphical pre- and postprocessing within the Abaqus/CAE environment. The recently developed, free, Finite Element Analysis--Micromechanics Analysis Code (FEAMAC) software couples NASA's Micromechanics Analysis Code with Generalized Method of Cells (MAC/GMC) with Abaqus/Standard and Abaqus/Explicit to perform micromechanics-based FEA such that the nonlinear composite material response at each integration point is modeled at each increment by MAC/GMC. The Graphical User Interfaces (FEAMAC-Pre and FEAMAC-Post), developed through collaboration between SIMULIA Erie and the NASA Glenn Research Center, enable users to employ a new FEAMAC module within Abaqus/CAE that provides access to the composite microscale. FEAMAC-Pre is used to define and store constituent material properties, set-up and store composite repeating unit cells, and assign composite materials as sections with all data being stored within the CAE database. Likewise FEAMAC-Post enables multiscale field quantity visualization (contour plots, X-Y plots), with point and click access to the microscale (i.e., fiber and matrix fields).

  11. Bio-inspired direct patterning functional nanothin microlines: controllable liquid transfer.

    PubMed

    Wang, Qianbin; Meng, Qingan; Wang, Pengwei; Liu, Huan; Jiang, Lei

    2015-04-28

    Developing a general, low-cost strategy that enables direct patterning of microlines with nanometer thickness from versatile liquid-phase functional materials, and their precise positioning on various substrates, remains a challenge. Herein, inspired by the traditional wisdom of controlling ink transfer with Chinese brushes, we developed a facile and general writing strategy to directly pattern various functional microlines with homogeneous distribution and nanometer-scale thickness. We demonstrate that the width and thickness of the microlines can be well controlled by tuning the writing method, providing guidance for adapting this technique to various systems. We also show that various functional liquid-phase materials, such as quantum dots, small molecules, polymers, and suspensions of nanoparticles, can be written directly on substrates with their intrinsic physicochemical properties well preserved. Moreover, this technique enables direct patterning of liquid-phase materials on specific microdomains, even in a multilayered style, thus enabling microdomain-localized chemical reactions and patterned surface chemical modification. This bio-inspired direct-writing approach will shed light on the template-free printing of various functional micropatterns, as well as integrated functional microdevices.

  12. Micro-Randomized Trials: An Experimental Design for Developing Just-in-Time Adaptive Interventions

    PubMed Central

    Klasnja, Predrag; Hekler, Eric B.; Shiffman, Saul; Boruvka, Audrey; Almirall, Daniel; Tewari, Ambuj; Murphy, Susan A.

    2015-01-01

    Objective This paper presents an experimental design, the micro-randomized trial, developed to support optimization of just-in-time adaptive interventions (JITAIs). JITAIs are mHealth technologies that aim to deliver the right intervention components at the right times and locations to optimally support individuals’ health behaviors. Micro-randomized trials offer a way to optimize such interventions by enabling modeling of causal effects and time-varying effect moderation for individual intervention components within a JITAI. Methods The paper describes the micro-randomized trial design, enumerates research questions that this experimental design can help answer, and provides an overview of the data analyses that can be used to assess the causal effects of studied intervention components and investigate time-varying moderation of those effects. Results Micro-randomized trials enable causal modeling of proximal effects of the randomized intervention components and assessment of time-varying moderation of those effects. Conclusions Micro-randomized trials can help researchers understand whether their interventions are having intended effects, when and for whom they are effective, and what factors moderate the interventions’ effects, enabling creation of more effective JITAIs. PMID:26651463
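The core idea of micro-randomization can be made concrete with a toy simulation: treatment is randomized at every decision point, so the proximal effect of an intervention component can be estimated by comparing outcomes after treated versus untreated decision points. All numbers below are hypothetical and the analysis is deliberately simplistic (real micro-randomized trial analyses model time-varying effects):

```python
import random

random.seed(1)

def simulate_mrt(n_people=200, n_points=50, true_effect=0.4, prob=0.5):
    """Toy micro-randomized trial: at each decision point, each person is
    randomized to receive an intervention prompt with probability `prob`;
    the proximal effect is estimated as the difference in mean outcomes
    after treated vs. untreated decision points."""
    treated, control = [], []
    for _ in range(n_people):
        for _ in range(n_points):
            a = 1 if random.random() < prob else 0     # micro-randomization
            y = random.gauss(0.0, 1.0) + true_effect * a  # proximal outcome
            (treated if a else control).append(y)
    return sum(treated) / len(treated) - sum(control) / len(control)

estimate = simulate_mrt()
```

Because randomization is repeated at every decision point, each person contributes to both arms, which is what enables modeling of causal proximal effects and their time-varying moderation.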

  13. Robust Real-Time Musculoskeletal Modeling Driven by Electromyograms.

    PubMed

    Durandau, Guillaume; Farina, Dario; Sartori, Massimo

    2018-03-01

    Current clinical biomechanics involves lengthy data acquisition and time-consuming offline analyses with biomechanical models not operating in real-time for man-machine interfacing. We developed a method that enables online analysis of neuromusculoskeletal function in vivo in the intact human. We used electromyography (EMG)-driven musculoskeletal modeling to simulate all transformations from muscle excitation onset (EMGs) to mechanical moment production around multiple lower-limb degrees of freedom (DOFs). We developed a calibration algorithm that enables adjusting musculoskeletal model parameters specifically to an individual's anthropometry and force-generating capacity. We incorporated the modeling paradigm into a computationally efficient, generic framework that can be interfaced in real-time with any movement data collection system. The framework demonstrated the ability of computing forces in 13 lower-limb muscle-tendon units and resulting moments about three joint DOFs simultaneously in real-time. Remarkably, it was capable of extrapolating beyond calibration conditions, i.e., predicting accurate joint moments during six unseen tasks and one unseen DOF. The proposed framework can dramatically reduce evaluation latency in current clinical biomechanics and open up new avenues for establishing prompt and personalized treatments, as well as for establishing natural interfaces between patients and rehabilitation systems. The integration of EMG with numerical modeling will enable simulating realistic neuromuscular strategies in conditions including muscular/orthopedic deficit, which could not be robustly simulated via pure modeling formulations. This will enable translation to clinical settings and development of healthcare technologies including real-time bio-feedback of internal mechanical forces and direct patient-machine interfacing.
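Upstream of any EMG-driven model, raw EMG is typically reduced to a normalized excitation envelope. The sketch below shows a common rectify-smooth-normalize pipeline; it is a generic illustration of that pre-processing step, not the authors' calibration framework:

```python
def emg_envelope(raw, window=5):
    """Generic EMG pre-processing: full-wave rectification, causal
    moving-average smoothing, and normalization to the peak value."""
    rect = [abs(v) for v in raw]                 # full-wave rectification
    smooth = []
    for i in range(len(rect)):                   # causal moving average
        lo = max(0, i - window + 1)
        smooth.append(sum(rect[lo:i + 1]) / (i + 1 - lo))
    peak = max(smooth) or 1.0                    # avoid divide-by-zero
    return [v / peak for v in smooth]

# hypothetical raw EMG samples (arbitrary units)
signal = [0.0, 0.2, -0.5, 0.9, -0.7, 0.3, -0.1]
env = emg_envelope(signal, window=3)
```

The resulting envelope in [0, 1] is what a musculoskeletal model maps to muscle activation and, via muscle-tendon dynamics, to joint moments.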

  14. Complete magnesiothermic reduction reaction of vertically aligned mesoporous silica channels to form pure silicon nanoparticles

    PubMed Central

    Kim, Kyoung Hwan; Lee, Dong Jin; Cho, Kyeong Min; Kim, Seon Joon; Park, Jung-Ki; Jung, Hee-Tae

    2015-01-01

    Owing to its simplicity and low temperature conditions, magnesiothermic reduction of silica is one of the most powerful methods for producing silicon nanostructures. However, incomplete reduction takes place in this process, leaving unconverted silica under the silicon layer. This phenomenon limits the use of this method for the rational design of silicon structures. In this effort, a technique that enables complete magnesiothermic reduction of silica to form silicon has been developed. The procedure involves magnesium-promoted reduction of vertically oriented mesoporous silica channels on reduced graphene oxide (rGO) sheets. The mesopores play a significant role in effectively enabling magnesium gas to interact with silica through a large number of reaction sites. Utilizing this approach, highly uniform, ca. 10 nm sized silicon nanoparticles are generated without contamination by unreacted silica. This approach to complete magnesiothermic reduction of mesoporous silica provides a foundation for the rational design of silicon structures. PMID:25757800

  15. Development of potent in vivo mutagenesis plasmids with broad mutational spectra

    PubMed Central

    Badran, Ahmed H.; Liu, David R.

    2015-01-01

    Methods to enhance random mutagenesis in cells offer advantages over in vitro mutagenesis, but current in vivo methods suffer from a lack of control, genomic instability, low efficiency and narrow mutational spectra. Using a mechanism-driven approach, we created a potent, inducible, broad-spectrum and vector-based mutagenesis system in E. coli that enhances mutation 322,000-fold over basal levels, surpassing the mutational efficiency and spectra of widely used in vivo and in vitro methods. We demonstrate that this system can be used to evolve antibiotic resistance in wild-type E. coli in <24 h, outperforming chemical mutagens, ultraviolet light and the mutator strain XL1-Red under similar conditions. This system also enables the continuous evolution of T7 RNA polymerase variants capable of initiating transcription using the T3 promoter in <10 h. Our findings enable broad-spectrum mutagenesis of chromosomes, episomes and viruses in vivo, and are applicable to both bacterial and bacteriophage-mediated laboratory evolution platforms. PMID:26443021

  16. Development of potent in vivo mutagenesis plasmids with broad mutational spectra.

    PubMed

    Badran, Ahmed H; Liu, David R

    2015-10-07

    Methods to enhance random mutagenesis in cells offer advantages over in vitro mutagenesis, but current in vivo methods suffer from a lack of control, genomic instability, low efficiency and narrow mutational spectra. Using a mechanism-driven approach, we created a potent, inducible, broad-spectrum and vector-based mutagenesis system in E. coli that enhances mutation 322,000-fold over basal levels, surpassing the mutational efficiency and spectra of widely used in vivo and in vitro methods. We demonstrate that this system can be used to evolve antibiotic resistance in wild-type E. coli in <24 h, outperforming chemical mutagens, ultraviolet light and the mutator strain XL1-Red under similar conditions. This system also enables the continuous evolution of T7 RNA polymerase variants capable of initiating transcription using the T3 promoter in <10 h. Our findings enable broad-spectrum mutagenesis of chromosomes, episomes and viruses in vivo, and are applicable to both bacterial and bacteriophage-mediated laboratory evolution platforms.

  17. Enabling the environmentally clean air transportation of the future: a vision of computational fluid dynamics in 2030

    PubMed Central

    Slotnick, Jeffrey P.; Khodadoust, Abdollah; Alonso, Juan J.; Darmofal, David L.; Gropp, William D.; Lurie, Elizabeth A.; Mavriplis, Dimitri J.; Venkatakrishnan, Venkat

    2014-01-01

    As global air travel expands rapidly to meet demand generated by economic growth, it is essential to continue to improve the efficiency of air transportation to reduce its carbon emissions and address concerns about climate change. Future transports must be ‘cleaner’ and designed to include technologies that will continue to lower engine emissions and reduce community noise. The use of computational fluid dynamics (CFD) will be critical to enable the design of these new concepts. In general, the ability to simulate aerodynamic and reactive flows using CFD has progressed rapidly during the past several decades and has fundamentally changed the aerospace design process. Advanced simulation capabilities not only enable reductions in ground-based and flight-testing requirements, but also provide added physical insight, and enable superior designs at reduced cost and risk. In spite of considerable success, reliable use of CFD has remained confined to a small region of the operating envelope due, in part, to the inability of current methods to reliably predict turbulent, separated flows. Fortunately, the advent of much more powerful computing platforms provides an opportunity to overcome a number of these challenges. This paper summarizes the findings and recommendations from a recent NASA-funded study that provides a vision for CFD in the year 2030, including an assessment of critical technology gaps and needed development, and identifies the key CFD technology advancements that will enable the design and development of much cleaner aircraft in the future. PMID:25024413

  18. Learning through Geography. Pathways in Geography Series, Title No. 7.

    ERIC Educational Resources Information Center

    Slater, Frances

    This teacher's guide is designed to enable teachers to promote thinking through the use of geography. The book lays out the rationale in learning theory for an issues-based, question-driven inquiry method and proceeds through a simple model of progression from identifying key questions to developing generalizations. Students study issues of geographic…

  19. Cloud Computing as a Core Discipline in a Technology Entrepreneurship Program

    ERIC Educational Resources Information Center

    Lawler, James; Joseph, Anthony

    2012-01-01

    Education in entrepreneurship continues to be a developing area of curricula for computer science and information systems students. Entrepreneurship is enabled frequently by cloud computing methods that furnish benefits to especially medium and small-sized firms. Expanding upon an earlier foundation paper, the authors of this paper present an…

  20. A Road Map Towards High pH Adaptability: Phenomic and Genomic Approaches to Azalea Breeding (Rhododendron sp.)

    USDA-ARS?s Scientific Manuscript database

    A research grant from the Azalea Society of America has enabled us to collect and begin evaluating diverse Rhododendron viscosum germplasm to identify genetic and phenotypic variation for pH adaptability. During the Spring of 2014, we developed novel, in vitro screening methods for Rhododendron to ...

  1. Rentz's Student Affairs Practice in Higher Education. Fourth Edition

    ERIC Educational Resources Information Center

    Zhang, Naijian

    2011-01-01

    The mission of this new fourth edition is to provide the reader with a solid foundation in the historical and philosophical perspectives of college student affairs development; assist the reader in understanding the major concepts and purpose of student affairs' practice, methods, and program models; enable the reader to conceptualize the theme,…

  2. Access-enabling architectures : new hybrid multi-modal spatial prototypes towards resource and social sustainability : USDOT Region V Regional University Transportation Center final report.

    DOT National Transportation Integrated Search

    2016-12-19

    The efforts of this project aim to capture and engage these potentials through a design-research method that incorporates a top down, data-driven approach with bottom-up stakeholder perspectives to develop prototypical scenario-based design solutions...

  3. Improving Clinical Feedback to Anesthesia Residents by Using an Optical Scanner and a Microcomputer.

    ERIC Educational Resources Information Center

    Albanese, Mark A.; And Others

    1989-01-01

    At the University of Iowa problems associated with managing evaluations of anesthesia residents led to a major backlog of unanalyzed evaluation forms. A system developed at the University that enables ongoing feedback to residents and provides a method to assess the clinical competence of residents is described. (Author/MLW)

  4. Genetic algorithms

    NASA Technical Reports Server (NTRS)

    Wang, Lui; Bayer, Steven E.

    1991-01-01

    Genetic algorithms are mathematical, highly parallel, adaptive search procedures (i.e., problem solving methods) based loosely on the processes of natural genetics and Darwinian survival of the fittest. Basic genetic algorithm concepts and applications are introduced, and results are presented from a project to develop a software tool that will enable the widespread use of genetic algorithm technology.
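The basic concepts the abstract introduces (selection, crossover, mutation) can be sketched on the classic OneMax toy problem, where fitness is simply the number of 1-bits in a string. The parameter choices here are illustrative, not from the cited project:

```python
import random

random.seed(42)

def one_max_ga(n_bits=20, pop_size=30, generations=60,
               crossover_rate=0.9, mutation_rate=0.02):
    """Minimal genetic algorithm for OneMax: tournament selection,
    one-point crossover, and bit-flip mutation."""
    fitness = sum  # count of 1-bits
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        new_pop = []
        while len(new_pop) < pop_size:
            # tournament selection: best of 3 random individuals
            p1 = max(random.sample(pop, 3), key=fitness)
            p2 = max(random.sample(pop, 3), key=fitness)
            c1, c2 = p1[:], p2[:]
            if random.random() < crossover_rate:   # one-point crossover
                cut = random.randint(1, n_bits - 1)
                c1, c2 = p1[:cut] + p2[cut:], p2[:cut] + p1[cut:]
            for child in (c1, c2):                 # bit-flip mutation
                for i in range(n_bits):
                    if random.random() < mutation_rate:
                        child[i] ^= 1
                new_pop.append(child)
        pop = new_pop[:pop_size]
    return max(fitness(ind) for ind in pop)

best = one_max_ga()
```

Selection pressure drives the population toward the all-ones optimum, while mutation maintains the diversity needed to keep exploring.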

  5. Mathematics Lectures as Narratives: Insights from Network Graph Methodology

    ERIC Educational Resources Information Center

    Weinberg, Aaron; Wiesner, Emilie; Fukawa-Connelly, Tim

    2016-01-01

    Although lecture is the traditional method of university mathematics instruction, there has been little empirical research that describes the general structure of lectures. In this paper, we adapt ideas from narrative analysis and apply them to an upper-level mathematics lecture. We develop a framework that enables us to conceptualize the lecture…

  6. Accuracy of a Screening Tool for Early Identification of Language Impairment

    ERIC Educational Resources Information Center

    Uilenburg, Noëlle; Wiefferink, Karin; Verkerk, Paul; van Denderen, Margot; van Schie, Carla; Oudesluys-Murphy, Ann-Marie

    2018-01-01

    Purpose: A screening tool called the "VTO Language Screening Instrument" (VTO-LSI) was developed to enable more uniform and earlier detection of language impairment. This report, consisting of 2 retrospective studies, focuses on the effects of using the VTO-LSI compared to regular detection procedures. Method: Study 1 retrospectively…

  7. Microscale Gas Chemistry

    ERIC Educational Resources Information Center

    Mattson, Bruce; Anderson, Michael P.

    2011-01-01

    The development of syringes having free movement while remaining gas-tight enabled methods in chemistry to be changed. Successfully containing and measuring volumes of gas without the need to trap them using liquids made it possible to work with smaller quantities. The invention of the LuerLok syringe cap also allowed the gas to be stored for a…

  8. Development of primer sets for loop-mediated isothermal amplification that enables rapid and specific detection of Streptococcus dysgalactiae, Streptococcus uberis and Streptococcus agalactiae

    USDA-ARS?s Scientific Manuscript database

    Streptococcus dysgalactiae, Streptococcus uberis and Streptococcus agalactiae are the three main pathogens causing bovine mastitis, with great losses to the dairy industry. Rapid and specific loop-mediated isothermal amplification methods (LAMP) for identification and differentiation of these three ...

  9. Recent Enhancements to the Development of CFD-Based Aeroelastic Reduced-Order Models

    NASA Technical Reports Server (NTRS)

    Silva, Walter A.

    2007-01-01

    Recent enhancements to the development of CFD-based unsteady aerodynamic and aeroelastic reduced-order models (ROMs) are presented. These enhancements include the simultaneous application of structural modes as CFD input, static aeroelastic analysis using a ROM, and matched-point solutions using a ROM. The simultaneous application of structural modes as CFD input enables the computation of the unsteady aerodynamic state-space matrices with a single CFD execution, independent of the number of structural modes. The responses obtained from a simultaneous excitation of the CFD-based unsteady aerodynamic system are processed using system identification techniques in order to generate an unsteady aerodynamic state-space ROM. Once the unsteady aerodynamic state-space ROM is generated, a method for computing the static aeroelastic response using this unsteady aerodynamic ROM and a state-space model of the structure is presented. Finally, a method is presented that enables the computation of matched-point solutions using a single ROM that is applicable over a range of dynamic pressures and velocities for a given Mach number. These enhancements represent a significant advancement of unsteady aerodynamic and aeroelastic ROM technology.

  10. Using interactive problem-solving techniques to enhance control systems education for non English-speakers

    NASA Astrophysics Data System (ADS)

    Lamont, L. A.; Chaar, L.; Toms, C.

    2010-03-01

    Interactive learning is beneficial to students in that it allows the continual development and testing of many skills. An interactive approach enables students to improve their technical capabilities, as well as developing both verbal and written communicative ability. Problem solving and communication skills are vital for engineering students; in the workplace they will be required to communicate with people of varying technical abilities and from different linguistic and engineering backgrounds. In this paper, a case study is presented that discusses how the traditional method of teaching control systems can be improved. 'Control systems' is a complex engineering topic requiring students to process an extended amount of mathematical formulae. MATLAB software, which enables students to interactively compare a range of possible combinations and analyse the optimal solution, is used to this end. It was found that students became more enthusiastic and interested when given ownership of their learning objectives. As well as improving the students' technical knowledge, other important engineering skills are also improved by introducing an interactive method of teaching.
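    The kind of interactive comparison the abstract describes (done in MATLAB in the paper) can be sketched in Python. This is a hypothetical stand-in, not the course's actual exercise: closed-loop step responses of an assumed first-order plant G(s) = 1/(s+1) under several proportional gains, simulated by forward Euler.

```python
import numpy as np

dt, T = 0.001, 10.0
t = np.arange(0, T, dt)

def step_response(kp):
    """Unit-step response of the closed loop: plant dy/dt = -y + u, u = kp*(1 - y)."""
    y = np.zeros_like(t)
    for k in range(1, len(t)):
        u = kp * (1.0 - y[k - 1])             # proportional control, unit setpoint
        y[k] = y[k - 1] + dt * (-y[k - 1] + u)  # Euler step of the plant dynamics
    return y

# compare final values across gains, as a student would do interactively
finals = {kp: step_response(kp)[-1] for kp in (1.0, 5.0, 20.0)}
```

For this plant the closed-loop steady state is kp/(1 + kp), so raising the gain visibly shrinks the steady-state error, which is exactly the sort of trade-off the interactive exercise lets students explore.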

  11. Generation of Cardiomyocytes from Pluripotent Stem Cells.

    PubMed

    Nakahama, Hiroko; Di Pasquale, Elisa

    2016-01-01

    The advent of pluripotent stem cells (PSCs) has enabled a multitude of studies for modeling the development of diseases and testing pharmaceutical therapeutic potential in vitro. These PSCs have been differentiated into multiple cell types, including cardiomyocytes (CMs), demonstrating their pluripotent potential. However, the efficiency and efficacy of differentiation vary greatly between different cell lines and methods. Here, we describe two different methods for acquiring CMs from human pluripotent lines. One method involves the generation of embryoid bodies, which emulates the natural developmental process, while the other chemically activates the canonical Wnt signaling pathway to induce a monolayer of cardiac differentiation.

  12. Internal scanning method as unique imaging method of optical vortex scanning microscope

    NASA Astrophysics Data System (ADS)

    Popiołek-Masajada, Agnieszka; Masajada, Jan; Szatkowski, Mateusz

    2018-06-01

    The internal scanning method is specific to the optical vortex microscope. It allows the vortex point to be moved inside the focused vortex beam with nanometer resolution while the beam as a whole stays in place. Thus the sample illuminated by the focused vortex beam can be scanned by the vortex point alone. We show that this method enables high-resolution imaging. The paper presents preliminary experimental results obtained with the first basic image-recovery procedure. The prospect of developing more powerful tools for topography recovery with the optical vortex scanning microscope is briefly discussed.

  13. Sediment acoustic index method for computing continuous suspended-sediment concentrations

    USGS Publications Warehouse

    Landers, Mark N.; Straub, Timothy D.; Wood, Molly S.; Domanski, Marian M.

    2016-07-11

    Once developed, sediment acoustic index ratings must be validated with additional suspended-sediment samples, beyond the period of record used in the rating development, to verify that the regression model continues to adequately represent sediment conditions within the stream. Changes in ADVM configuration or installation, or replacement with another ADVM, may require development of a new rating. The best practices described in this report can be used to develop continuous estimates of suspended-sediment concentration and load using sediment acoustic surrogates to enable more informed and accurate responses to diverse sedimentation issues.
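    A sediment acoustic index rating of the kind described above can be sketched as a log-linear regression of concentration on backscatter. The paired values below are invented for illustration, and the real procedure includes corrections (e.g., beam attenuation and bias correction) not shown here.

```python
import numpy as np

# hypothetical paired observations: sediment-corrected backscatter (dB) from
# the ADVM and lab-analyzed suspended-sediment concentration (mg/L)
scb = np.array([55.0, 60.0, 65.0, 70.0, 75.0, 80.0])
ssc = np.array([20.0, 45.0, 100.0, 230.0, 500.0, 1100.0])

# fit the index rating in log space: log10(SSC) = b0 + b1 * SCB
b1, b0 = np.polyfit(scb, np.log10(ssc), 1)

def rate(scb_value):
    """Estimate SSC (mg/L) from a backscatter reading using the fitted rating."""
    return 10 ** (b0 + b1 * scb_value)
```

Applied to a continuous backscatter record, `rate` yields the continuous suspended-sediment time series the report describes; validation then compares its predictions against new physical samples.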

  14. DNA-based identification of spices: DNA isolation, whole genome amplification, and polymerase chain reaction.

    PubMed

    Focke, Felix; Haase, Ilka; Fischer, Markus

    2011-01-26

    Usually spices are identified morphologically using simple tools such as magnifying glasses or microscopes. Molecular biological methods such as the polymerase chain reaction (PCR), by contrast, enable accurate and specific detection even in complex matrices. Generally, the origins of spices are plants with diverse genetic backgrounds and relationships, and the processing methods used for the production of spices are complex and individual. Consequently, the development of a reliable DNA-based method for spice analysis is a challenging task. Once established, however, such a method is easily adapted to less difficult food matrices. In the current study, several alternative methods for the isolation of DNA from spices were developed and evaluated in detail with regard to (i) purity (photometric), (ii) yield (fluorimetric), and (iii) amplifiability (PCR). Whole genome amplification methods were used to preamplify isolates to improve the ratio between amplifiable DNA and inhibiting substances. Specific primer sets were designed, and the PCR conditions were optimized to detect 18 spices selectively. Assays of self-made spice mixtures were performed to prove the applicability of the developed methods.

  15. Irish set dancing classes for people with Parkinson's disease: The needs of participants and dance teachers.

    PubMed

    Shanahan, Joanne; Bhriain, Orfhlaith Ní; Morris, Meg E; Volpe, Daniele; Clifford, Amanda M

    2016-08-01

    As the number of people diagnosed with Parkinson's disease increases, there is a need to develop initiatives that promote health and wellbeing and support self-management. Additionally, as exercise may slow physical decline, there is a need to develop methods that facilitate greater engagement with community-based exercise. The aim of this study is to examine the needs of (1) people with Parkinson's disease and (2) set dancing teachers to enable the development of participant-centred community set dance classes. A mixed methods study design was used. Two consensus group discussions using nominal group technique were held to (1) identify factors pertaining to the needs of people with Parkinson's disease from a set dance class and (2) the educational needs of set dancing teachers to enable them to teach set dancing to people with Parkinson's disease. Group discussions began with silent generation of ideas. A round-robin discussion and grouping of ideas into broader topic areas followed. Finally, participants ranked, by order of priority (1-5), the topic areas developed. Final data analysis involved summation of participants' ranking scores for each topic area. Rich information on the needs of people with Parkinson's disease from a dance class and the educational guidance sought by set dancing teachers was gathered. Topic areas developed include "teaching method" for set dances and "class environment". Accessing community exercise programmes is important for this population. The results of this study will inform the development of an educational resource on Parkinson's disease for set dancing teachers. This resource may facilitate a larger number of teachers to establish sustainable community set dancing classes for people with Parkinson's disease. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. Probabilistic numerical methods for PDE-constrained Bayesian inverse problems

    NASA Astrophysics Data System (ADS)

    Cockayne, Jon; Oates, Chris; Sullivan, Tim; Girolami, Mark

    2017-06-01

    This paper develops meshless methods for probabilistically describing discretisation error in the numerical solution of partial differential equations. This construction enables the solution of Bayesian inverse problems while accounting for the impact of the discretisation of the forward problem. In particular, this drives statistical inferences to be more conservative in the presence of significant solver error. Theoretical results are presented describing rates of convergence for the posteriors in both the forward and inverse problems. This method is tested on a challenging inverse problem with a nonlinear forward model.
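    The effect described, more conservative inferences once solver error is acknowledged, can be illustrated with a toy scalar inverse problem. This is not the paper's meshless construction; it is a sketch in which an assumed forward model G(θ) = θ² is observed with noise and the discretisation error of an imperfect solver is modelled as extra Gaussian likelihood variance.

```python
import numpy as np

theta = np.linspace(0.0, 2.0, 401)               # parameter grid, flat prior
y_obs, sigma_obs, sigma_solver = 1.0, 0.1, 0.2   # assumed observation and solver-error scales

def posterior_std(extra_var):
    """Posterior std of theta when the likelihood variance is inflated by extra_var."""
    var = sigma_obs ** 2 + extra_var
    logp = -(y_obs - theta ** 2) ** 2 / (2 * var)
    p = np.exp(logp - logp.max())
    p /= p.sum()                                  # normalize on the grid
    mean = (theta * p).sum()
    return np.sqrt((((theta - mean) ** 2) * p).sum())

width_exact = posterior_std(0.0)                  # pretending the solver is exact
width_prob = posterior_std(sigma_solver ** 2)     # accounting for discretisation error
```

The probabilistic-numerics posterior (`width_prob`) is wider than the one that trusts the solver blindly, i.e., the inference is more conservative in the presence of solver error, which is the qualitative behaviour the paper establishes rigorously.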

  17. [Diagnosis of tropical malaria by express-methods].

    PubMed

    Popov, A F; Nikiforov, N D; Ivanis, V A; Barkun, S P; Sanin, B I; Fed'kina, L I

    2004-01-01

    An examination of a thick blood drop and a blood smear for the presence of plasmodia is the classic and indisputable diagnostic test for tropical malaria. However, express methods based on enzyme immunoassay have been introduced into health-care practice, primarily in developing and underdeveloped countries. Diagnosis of tropical malaria by these methods enables rapid and reliable detection of P. falciparum in blood outside laboratory settings. This is important because an untimely diagnosis of tropical malaria increases the risk of a lethal outcome.

  18. Cyber-physical geographical information service-enabled control of diverse in-situ sensors.

    PubMed

    Chen, Nengcheng; Xiao, Changjiang; Pu, Fangling; Wang, Xiaolei; Wang, Chao; Wang, Zhili; Gong, Jianya

    2015-01-23

    Realization of open online control of diverse in-situ sensors is a challenge. This paper proposes a Cyber-Physical Geographical Information Service-enabled method for controlling diverse in-situ sensors, based on location-based instant sensing of the sensors, which provides closed-loop feedback. The method adopts the concepts and technologies of newly developed cyber-physical systems (CPSs) to combine control with sensing, communication, and computation; takes advantage of geographical information services, such as those provided by Tianditu, a basic geographic information service platform in China, and of Sensor Web services to establish geo-sensor applications; and builds well-designed human-machine interfaces (HMIs) to support online and open interactions between human beings and physical sensors through cyberspace. The method was tested in experiments carried out in two geographically distributed scientific experimental fields, the Baoxie Sensor Web Experimental Field in Wuhan city and the Yemaomian Landslide Monitoring Station in the Three Gorges, with three typical sensors chosen as representatives, using the prototype system Geospatial Sensor Web Common Service Platform. The results show that the proposed method is an open, online, closed-loop means of control.

  20. Single-cell DNA methylome sequencing and bioinformatic inference of epigenomic cell-state dynamics.

    PubMed

    Farlik, Matthias; Sheffield, Nathan C; Nuzzo, Angelo; Datlinger, Paul; Schönegger, Andreas; Klughammer, Johanna; Bock, Christoph

    2015-03-03

    Methods for single-cell genome and transcriptome sequencing have contributed to our understanding of cellular heterogeneity, whereas methods for single-cell epigenomics are much less established. Here, we describe a whole-genome bisulfite sequencing (WGBS) assay that enables DNA methylation mapping in very small cell populations (μWGBS) and single cells (scWGBS). Our assay is optimized for profiling many samples at low coverage, and we describe a bioinformatic method that analyzes collections of single-cell methylomes to infer cell-state dynamics. Using these technological advances, we studied epigenomic cell-state dynamics in three in vitro models of cellular differentiation and pluripotency, where we observed characteristic patterns of epigenome remodeling and cell-to-cell heterogeneity. The described method enables single-cell analysis of DNA methylation in a broad range of biological systems, including embryonic development, stem cell differentiation, and cancer. It can also be used to establish composite methylomes that account for cell-to-cell heterogeneity in complex tissue samples. Copyright © 2015 The Authors. Published by Elsevier Inc. All rights reserved.
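    The composite-methylome idea mentioned at the end can be sketched with toy count data. The read counts below are invented, and real scWGBS analysis involves much more (alignment, quality control, inference of cell-state dynamics); the sketch only shows how per-cell methylation levels and a coverage-weighted composite are computed.

```python
import numpy as np

# hypothetical bisulfite read counts at 5 CpG sites in 2 single cells:
# meth[i, j] = methylated reads, total[i, j] = total reads for cell i, site j
meth = np.array([[3, 0, 5, 1, 2],
                 [1, 4, 0, 2, 2]])
total = np.array([[4, 2, 5, 3, 2],
                  [2, 5, 1, 4, 2]])

# per-cell methylation level at each site (noisy at low coverage)
beta = meth / np.maximum(total, 1)

# composite methylome: pool reads across cells, implicitly weighting each
# cell by its coverage, to account for cell-to-cell heterogeneity
composite = meth.sum(axis=0) / total.sum(axis=0)
```

Pooling counts rather than averaging the per-cell `beta` values keeps low-coverage cells from contributing as much as well-covered ones, which is the point of building a composite methylome.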

  1. Developments in label-free microfluidic methods for single-cell analysis and sorting.

    PubMed

    Carey, Thomas R; Cotner, Kristen L; Li, Brian; Sohn, Lydia L

    2018-04-24

    Advancements in microfluidic technologies have led to the development of many new tools for both the characterization and sorting of single cells without the need for exogenous labels. Label-free microfluidics reduce the preparation time, reagents needed, and cost of conventional methods based on fluorescent or magnetic labels. Furthermore, these devices enable analysis of cell properties such as mechanical phenotype and dielectric parameters that cannot be characterized with traditional labels. Some of the most promising technologies for current and future development toward label-free, single-cell analysis and sorting include electronic sensors such as Coulter counters and electrical impedance cytometry; deformation analysis using optical traps and deformation cytometry; hydrodynamic sorting such as deterministic lateral displacement, inertial focusing, and microvortex trapping; and acoustic sorting using traveling or standing surface acoustic waves. These label-free microfluidic methods have been used to screen, sort, and analyze cells for a wide range of biomedical and clinical applications, including cell cycle monitoring, rapid complete blood counts, cancer diagnosis, metastatic progression monitoring, HIV and parasite detection, circulating tumor cell isolation, and point-of-care diagnostics. Because of the versatility of label-free methods for characterization and sorting, the low-cost nature of microfluidics, and the rapid prototyping capabilities of modern microfabrication, we expect this class of technology to continue to be an area of high research interest going forward. New developments in this field will contribute to the ongoing paradigm shift in cell analysis and sorting technologies toward label-free microfluidic devices, enabling new capabilities in biomedical research tools as well as clinical diagnostics. This article is categorized under: Diagnostic Tools > Biosensing Diagnostic Tools > Diagnostic Nanodevices. © 2018 Wiley Periodicals, Inc.
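    One of the electronic label-free methods named above, the Coulter counter, can be illustrated with a minimal sketch: a cell transiting the sensing pore displaces electrolyte and produces a resistive pulse in the current trace. The trace below is synthetic, with assumed pulse depth and noise level, and the detection is a simple threshold rather than a production algorithm.

```python
import numpy as np

rng = np.random.default_rng(1)
trace = 1.0 + 0.005 * rng.standard_normal(5000)   # baseline current + noise (arb. units)
for pos in (500, 1500, 3200):                     # three simulated cell transits
    trace[pos:pos + 20] -= 0.1                    # pulse depth scales with cell volume

# detect transits: samples well below baseline, counted by rising edges of the mask
below = trace < 0.95
n_cells = int((np.diff(below.astype(int)) == 1).sum())
```

In a real device the pulse amplitude and duration also carry size and velocity information, which is what enables sizing and sorting without labels.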

  2. A systematic review and appraisal of methods of developing and validating lifestyle cardiovascular disease risk factors questionnaires.

    PubMed

    Nse, Odunaiya; Quinette, Louw; Okechukwu, Ogah

    2015-09-01

    Well-developed and validated lifestyle cardiovascular disease (CVD) risk factor questionnaires are key to obtaining accurate information for planning CVD prevention programs, which are a necessity in developing countries. We conducted this review to assess the methods and processes used for the development and content validation of lifestyle CVD risk factor questionnaires, and possibly to develop an evidence-based guideline for their development and content validation. Relevant databases at the Stellenbosch University library were searched for studies conducted between 2008 and 2012, in the English language and among humans, using the following databases: PubMed, CINAHL, PsycINFO and ProQuest. Search terms used were CVD risk factors, questionnaires, smoking, alcohol, physical activity and diet. Methods identified for the development of lifestyle CVD risk factor questionnaires were review of literature (either systematic or traditional), involvement of experts and/or the target population using focus group discussions or interviews, the clinical experience of authors, and the deductive reasoning of authors. For validation, the methods used were the involvement of an expert panel, the use of the target population, and factor analysis. Combining methods produces questionnaires with good content validity and other psychometric properties.

  3. Engineering Design Theory: Applying the Success of the Modern World to Campaign Creation

    DTIC Science & Technology

    2009-05-21

    and school of thought) to the simple methods of design.6 This progression is analogous to Peter Senge’s levels of learning disciplines.7 Senge...iterative learning and adaptive action that develops and employs critical and creative thinking, enabling leaders to apply the necessary logic to...overcome mental rigidity and develop group insight, the Army must learn to utilize group learning and thinking, through a fluid and creative open process

  4. Whole genome sequence analysis of unidentified genetically modified papaya for development of a specific detection method.

    PubMed

    Nakamura, Kosuke; Kondo, Kazunari; Akiyama, Hiroshi; Ishigaki, Takumi; Noguchi, Akio; Katsumata, Hiroshi; Takasaki, Kazuto; Futo, Satoshi; Sakata, Kozue; Fukuda, Nozomi; Mano, Junichi; Kitta, Kazumi; Tanaka, Hidenori; Akashi, Ryo; Nishimaki-Mogami, Tomoko

    2016-08-15

    Identification of transgenic sequences in an unknown genetically modified (GM) papaya (Carica papaya L.) by whole genome sequence analysis was demonstrated. Whole genome sequence data were generated for a GM-positive fresh papaya fruit commodity detected during monitoring using real-time polymerase chain reaction (PCR). The sequences obtained were mapped against an open database of the papaya genome sequence. Transgenic construct- and event-specific sequences were identified from a GM papaya developed to resist infection by Papaya ringspot virus. Based on the transgenic sequences, a specific real-time PCR detection method for GM papaya applicable to various food commodities was developed. Whole genome sequence analysis enabled identification of unknown transgenic construct- and event-specific sequences in GM papaya and development of a reliable method for detecting them in papaya food commodities. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. Radiographic methods of wear analysis in total hip arthroplasty.

    PubMed

    Rahman, Luthfur; Cobb, Justin; Muirhead-Allwood, Sarah

    2012-12-01

    Polyethylene wear is an important factor in failure of total hip arthroplasty (THA). With increasing numbers of THAs being performed worldwide, particularly in younger patients, the burden of failure and revision arthroplasty is increasing, as well, along with associated costs and workload. Various radiographic methods of measuring polyethylene wear have been developed to assist in deciding when to monitor patients more closely and when to consider revision surgery. Radiographic methods that have been developed to measure polyethylene wear include manual and computer-assisted plain radiography, two- and three-dimensional techniques, and radiostereometric analysis. Some of these methods are important in both clinical and research settings. CT has the potential to provide additional information on component orientation and enables assessment of periprosthetic osteolysis, which is an important consequence of polyethylene wear.
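    A minimal two-dimensional wear calculation of the kind these radiographic methods perform can be sketched as follows: linear wear is the displacement of the femoral-head centre relative to the cup centre between two radiographs. The landmark coordinates and follow-up interval below are invented for illustration; clinical methods add calibration, projection correction, and often 3-D reconstruction.

```python
import numpy as np

# hypothetical digitized landmark coordinates (mm) on two radiographs
head_initial, cup_initial = np.array([102.4, 88.1]), np.array([102.4, 88.1])
head_followup, cup_followup = np.array([101.1, 89.6]), np.array([102.5, 88.3])

# wear vector: change in head position relative to the cup between films
wear_vector = (head_followup - cup_followup) - (head_initial - cup_initial)
linear_wear = np.linalg.norm(wear_vector)   # mm of polyethylene penetration
wear_rate = linear_wear / 7.5               # mm/year over an assumed 7.5-year interval
```

Wear rates above roughly 0.1 mm/year are commonly cited as a flag for closer monitoring, which is why reproducible measurement methods matter clinically.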

  6. Nano-ceramics and method thereof

    DOEpatents

    Satcher, Jr., Joe H.; Gash, Alex [Livermore, CA]; Simpson, Randall [Livermore, CA]; Landingham, Richard [Livermore, CA]; Reibold, Robert A. [Salida, CA]

    2006-08-08

    Disclosed herein is a method to produce ceramic materials utilizing the sol-gel process. The methods enable the preparation of intimate homogeneous dispersions of materials while offering the ability to control the size of one component within another. The method also enables the preparation of materials that will densify at reduced temperature.

  7. Applying flow chemistry: methods, materials, and multistep synthesis.

    PubMed

    McQuade, D Tyler; Seeberger, Peter H

    2013-07-05

    The synthesis of complex molecules requires control over both chemical reactivity and reaction conditions. While reactivity drives the majority of chemical discovery, advances in reaction condition control have accelerated method development/discovery. Recent tools include automated synthesizers and flow reactors. In this Synopsis, we describe how flow reactors have enabled chemical advances in our groups in the areas of single-stage reactions, materials synthesis, and multistep reactions. In each section, we detail the lessons learned and propose future directions.

  8. Metabolic cartography: experimental quantification of metabolic fluxes from isotopic labelling studies.

    PubMed

    O'Grady, John; Schwender, Jörg; Shachar-Hill, Yair; Morgan, John A

    2012-03-01

    For the past decade, flux maps have provided researchers with an in-depth perspective on plant metabolism. As a rapidly developing field, significant headway has been made recently in computation, experimentation, and overall understanding of metabolic flux analysis. These advances are particularly applicable to the study of plant metabolism. New dynamic computational methods such as non-stationary metabolic flux analysis are finding their place in the toolbox of metabolic engineering, allowing more organisms to be studied and decreasing the time necessary for experimentation, thereby opening new avenues by which to explore the vast diversity of plant metabolism. Also, improved methods of metabolite detection and measurement have been developed, enabling increasingly greater resolution of flux measurements and the analysis of a greater number of the multitude of plant metabolic pathways. Methods to deconvolute organelle-specific metabolism are employed with increasing effectiveness, elucidating the compartmental specificity inherent in plant metabolism. Advances in metabolite measurements have also enabled new types of experiments, such as the calculation of metabolic fluxes based on (13)CO(2) dynamic labelling data, and will continue to direct plant metabolic engineering. Newly calculated metabolic flux maps reveal surprising and useful information about plant metabolism, guiding future genetic engineering of crops to higher yields. Due to the significant level of complexity in plants, these methods in combination with other systems biology measurements are necessary to guide plant metabolic engineering in the future.
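    At its core, a stationary flux map is obtained by combining steady-state mass balances (S·v = 0) with measured fluxes. The toy network below is invented and far simpler than real ¹³C metabolic flux analysis, but it shows the shape of the computation: a least-squares solve that propagates measurements through the balance constraints.

```python
import numpy as np

# toy network: A -> B (v1), B -> C (v2), B -> D (v3), at metabolic steady state
S = np.array([[1.0, -1.0, -1.0]])        # mass balance on internal metabolite B

# hypothetical measurements: substrate uptake v1 = 10, labelled efflux v2 = 6
A = np.vstack([S, [1, 0, 0], [0, 1, 0]])
b = np.array([0.0, 10.0, 6.0])

# least-squares flux map; the unmeasured branch flux v3 follows from the balance
v, *_ = np.linalg.lstsq(A, b, rcond=None)
```

Real flux analysis adds isotopomer balances and nonlinear fitting to labelling data, but every variant ultimately solves constrained systems of this form.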

  9. Partial Variance of Increments Method in Solar Wind Observations and Plasma Simulations

    NASA Astrophysics Data System (ADS)

    Greco, A.; Matthaeus, W. H.; Perri, S.; Osman, K. T.; Servidio, S.; Wan, M.; Dmitruk, P.

    2018-02-01

    The method called "PVI" (Partial Variance of Increments) has been increasingly used in the analysis of spacecraft and numerical simulation data since its inception in 2008. The purpose of the method is to study the kinematics and formation of coherent structures in space plasmas, a topic that has gained considerable attention, leading to the development of identification methods, observations, and associated theoretical research based on numerical simulations. This review paper summarizes the key features of the method and provides a synopsis of the main results obtained by various groups using it. This will enable new users, or those considering methods of this type, to find details and background collected in one place.
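    The PVI statistic itself is simple: PVI(t, τ) = |Δb(t, τ)| / ⟨|Δb|²⟩^(1/2), where Δb(t, τ) = b(t + τ) − b(t) is the vector increment of the magnetic field at lag τ. A sketch on synthetic random-walk data follows (real studies use solar-wind magnetic-field time series, and the threshold value is a choice of the analyst):

```python
import numpy as np

rng = np.random.default_rng(0)
b = np.cumsum(rng.standard_normal((10000, 3)), axis=0)  # synthetic 3-component field

tau = 10                                  # lag in samples
db = b[tau:] - b[:-tau]                   # vector increments Δb(t, τ)
mag = np.linalg.norm(db, axis=1)          # |Δb|
pvi = mag / np.sqrt(np.mean(mag ** 2))    # normalized partial variance of increments

# candidate coherent structures are flagged where PVI exceeds a threshold (e.g. 3)
events = np.flatnonzero(pvi > 3.0)
```

By construction ⟨PVI²⟩ = 1, so the threshold is expressed in units of the rms increment; high-PVI events correspond to sharp, intermittent changes in the field.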

  10. Methods for genetic transformation of filamentous fungi.

    PubMed

    Li, Dandan; Tang, Yu; Lin, Jun; Cai, Weiwen

    2017-10-03

    Filamentous fungi have been of great interest because of their excellent ability as cell factories to manufacture products useful to human beings. The development of genetic transformation techniques is a precondition for targeting and modifying genes efficiently and may reveal the function of target genes. The method used to deliver foreign nucleic acid into cells is the sticking point for fungal genome modification. To date, there are several general methods of genetic transformation for fungi, including protoplast-mediated transformation, Agrobacterium-mediated transformation, electroporation, the biolistic method and shock-wave-mediated transformation. This article reviews the basic protocols and principles of these transformation methods, as well as their advantages and disadvantages.

  11. High-speed bioimaging with frequency-division-multiplexed fluorescence confocal microscopy

    NASA Astrophysics Data System (ADS)

    Mikami, Hideharu; Harmon, Jeffrey; Ozeki, Yasuyuki; Goda, Keisuke

    2017-04-01

    We present methods of fluorescence confocal microscopy that enable an unprecedentedly high frame rate of >10,000 fps. The methods are based on a frequency-division multiplexing technique originally developed in the field of communication engineering. Specifically, we achieved a broad detection-signal bandwidth (~400 MHz) using a dual-AOD method and overcame the frame-rate limitation imposed by the scanning device by using a multi-line focusing method, resulting in a significant increase in frame rate. The methods have potential biomedical applications such as observation of sub-millisecond dynamics in biological tissues, in vivo three-dimensional imaging, and fluorescence imaging flow cytometry.
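    The frequency-division-multiplexing principle can be sketched at a toy scale (the actual system operates over a ~400 MHz detection bandwidth): each excitation line is tagged with its own carrier frequency, the detector sums them into one signal, and per-line intensities are recovered by demodulating at each carrier. Carrier frequencies and amplitudes below are invented.

```python
import numpy as np

fs = 100e3                          # sample rate (Hz), illustrative only
t = np.arange(0, 0.01, 1 / fs)      # 1000 samples
carriers = [10e3, 20e3, 30e3]       # one carrier frequency per excitation line
amps = [0.5, 1.0, 0.25]             # per-line fluorescence intensities (unknowns)

# multiplexed detector signal: sum of amplitude-modulated carriers
sig = sum(a * np.cos(2 * np.pi * f * t) for a, f in zip(amps, carriers))

# demodulate: the FFT magnitude at each carrier recovers that line's intensity
spec = np.fft.rfft(sig) / len(t) * 2
freqs = np.fft.rfftfreq(len(t), 1 / fs)
recovered = [abs(spec[np.argmin(abs(freqs - f))]) for f in carriers]
```

Because the carriers fall on exact FFT bins here, the recovery is essentially exact; in hardware the same separation is done with analog or digital bandpass demodulation at MHz rates.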

  12. An Electronic Measurement Instrumentation of the Impedance of a Loaded Fuel Cell or Battery

    PubMed Central

    Aglzim, El-Hassane; Rouane, Amar; El-Moznine, Reddad

    2007-01-01

    In this paper we present an inexpensive electronic measurement instrumentation, developed in our laboratory, to measure and plot the impedance of a loaded fuel cell or battery. Impedance measurements were taken using the load modulation method. The instrumentation was developed around a VXI system stand which controls electronic cards, and software was developed under Hpvee® for automatic measurement and plotting of the impedance of the fuel cell on load. The measurement environment, such as the ambient temperature, the fuel cell temperature and the level of the hydrogen, was monitored with several sensors that enable us to control the measurement. To filter out noise and the influence of the 50 Hz mains, we implemented a synchronous detection which filters in a very narrow band around the useful signal. The theoretical result obtained by a simulation under Pspice® of the method used supports the choice of this method and the possibility of obtaining correct and exploitable results. The experimental results are preliminary results on a 12 V vehicle battery with an inrush current of 330 A and a capacity of 40 Ah (impedance measurements on a fuel cell are in progress and will be the subject of a forthcoming paper). The results were plotted at various nominal voltages of the battery (12.7 V, 10 V, 8 V and 5 V) and with two imposed currents (0.6 A and 4 A). The Nyquist diagrams resulting from the experimental data show an influence of the load of the battery on its internal impedance. The similarity in graph form and in order of magnitude of the values obtained (both theoretical and practical) enables us to validate our electronic measurement instrumentation. One of the future uses for this instrumentation is to integrate it, with several control sensors, on a vehicle as an embedded system to monitor the degradation of fuel cell membranes. PMID:28903231
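    The core measurement, complex impedance extracted by synchronous detection of a modulated load, can be sketched numerically. The battery impedance `z_true` and signal parameters below are invented; the code is a numerical illustration of the lock-in principle, not the VXI instrumentation itself.

```python
import numpy as np

fs, f0 = 100_000.0, 100.0            # sample rate and modulation frequency (Hz)
t = np.arange(0, 1.0, 1 / fs)

# assumed battery impedance at f0: |Z| = 0.05 ohm, phase = -20 degrees
z_true = 0.05 * np.exp(-1j * np.deg2rad(20.0))

i_amp = 4.0                                                   # modulated load current (A)
i_t = i_amp * np.cos(2 * np.pi * f0 * t)
v_t = i_amp * abs(z_true) * np.cos(2 * np.pi * f0 * t + np.angle(z_true))

def lockin(x):
    """Synchronous detection: complex amplitude of the f0 component of x."""
    ref = np.exp(-1j * 2 * np.pi * f0 * t)
    return 2 * np.mean(x * ref)      # averaging acts as the narrow low-pass filter

z_meas = lockin(v_t) / lockin(i_t)   # one point of the Nyquist diagram
```

Sweeping f0 and repeating the measurement traces out the full Nyquist diagram; the narrow-band averaging is what rejects noise and 50 Hz interference, as in the paper's hardware synchronous detection.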

  13. Real-time myocardium segmentation for the assessment of cardiac function variation

    NASA Astrophysics Data System (ADS)

    Zoehrer, Fabian; Huellebrand, Markus; Chitiboi, Teodora; Oechtering, Thekla; Sieren, Malte; Frahm, Jens; Hahn, Horst K.; Hennemuth, Anja

    2017-03-01

    Recent developments in MRI enable the acquisition of image sequences with high spatio-temporal resolution, so that cardiac motion can be captured without gating and triggering. Image size and contrast relations differ from conventional cardiac MRI cine sequences, requiring newly adapted analysis methods. We suggest a novel segmentation approach utilizing contrast-invariant polar scanning techniques. It has been tested with 20 datasets from arrhythmia patients. Differences between automatic and manual segmentations are not significantly larger than differences between observers. This indicates that the presented solution could enable clinical applications of real-time MRI for the examination of arrhythmic cardiac motion in the future.

  14. Space Telecommunications Radio System (STRS) Architecture Standard. Release 1.02.1

    NASA Technical Reports Server (NTRS)

    Reinhart, Richard C.; Kacpura, Thomas J.; Handler, Louis M.; Hall, C. Steve; Mortensen, Dale J.; Johnson, Sandra K.; Briones, Janette C.; Nappier, Jennifer M.; Downey, Joseph A.; Lux, James P.

    2012-01-01

    This document contains the NASA architecture standard for software defined radios used in space- and ground-based platforms to enable commonality among radio developments to enhance capability and services while reducing mission and programmatic risk. Transceivers (or transponders) with functionality primarily defined in software (e.g., firmware) have the ability to change their functional behavior through software alone. This radio architecture standard offers value by employing common waveform software interfaces, method of instantiation, operation, and testing among different compliant hardware and software products. These common interfaces within the architecture abstract application software from the underlying hardware to enable technology insertion independently at either the software or hardware layer.
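    The abstraction the standard describes, application software written against common interfaces rather than against specific hardware, can be illustrated with a minimal sketch. The class and method names below are illustrative only, not the actual STRS API: the point is that any platform calling the common interface can host any compliant waveform.

```python
from abc import ABC, abstractmethod

class WaveformApp(ABC):
    """Hypothetical STRS-style waveform application interface: the radio
    platform instantiates, starts, and stops applications only through these
    methods, so application code is portable across compliant hardware."""

    @abstractmethod
    def instantiate(self, config: dict) -> None: ...

    @abstractmethod
    def start(self) -> None: ...

    @abstractmethod
    def stop(self) -> None: ...

class QPSKWaveform(WaveformApp):
    """Example compliant application (parameters are illustrative)."""

    def instantiate(self, config: dict) -> None:
        self.rate = config.get("symbol_rate", 1_000_000)
        self.running = False

    def start(self) -> None:
        self.running = True

    def stop(self) -> None:
        self.running = False
```

Testing a platform then reduces to exercising the common interface, which is how the architecture enables technology insertion independently at the software or hardware layer.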

  15. Crowd Sensing-Enabling Security Service Recommendation for Social Fog Computing Systems

    PubMed Central

    Wu, Jun; Su, Zhou; Li, Jianhua

    2017-01-01

    Fog computing, shifting intelligence and resources from the remote cloud to edge networks, has the potential to provide low-latency communication from sensing data sources to users. For objects from the Internet of Things (IoT) to the cloud, it is a new trend that the objects establish social-like relationships with each other, which efficiently brings the benefits of developed sociality to a complex environment. As fog services become more sophisticated, it will become more convenient for fog users to share their own services, resources, and data via social networks. Meanwhile, efficient social organization can enable more flexible, secure, and collaborative networking. These advantages make the social network a potential architecture for fog computing systems. In this paper, we design an architecture for social fog computing, in which fog services are provisioned based on “friend” relationships. To the best of our knowledge, this is the first attempt at a fog computing system organized on a social model. Meanwhile, social networking increases the complexity and security risks of fog computing services, making security service recommendation in social fog computing difficult. To address this, we propose a novel crowd sensing-enabling security service provisioning method to recommend security services accurately in social fog computing systems. Simulation results show the feasibility and efficiency of the crowd sensing-enabling security service recommendation method for social fog computing systems. PMID:28758943

  16. Crowd Sensing-Enabling Security Service Recommendation for Social Fog Computing Systems.

    PubMed

    Wu, Jun; Su, Zhou; Wang, Shen; Li, Jianhua

    2017-07-30

    Fog computing, shifting intelligence and resources from the remote cloud to edge networks, has the potential to provide low-latency communication from sensing data sources to users. For objects from the Internet of Things (IoT) to the cloud, it is a new trend that the objects establish social-like relationships with each other, which efficiently brings the benefits of developed sociality to a complex environment. As fog services become more sophisticated, it will become more convenient for fog users to share their own services, resources, and data via social networks. Meanwhile, efficient social organization can enable more flexible, secure, and collaborative networking. These advantages make the social network a potential architecture for fog computing systems. In this paper, we design an architecture for social fog computing, in which fog services are provisioned based on "friend" relationships. To the best of our knowledge, this is the first attempt at a fog computing system organized on a social model. Meanwhile, social networking increases the complexity and security risks of fog computing services, making security service recommendation in social fog computing difficult. To address this, we propose a novel crowd sensing-enabling security service provisioning method to recommend security services accurately in social fog computing systems. Simulation results show the feasibility and efficiency of the crowd sensing-enabling security service recommendation method for social fog computing systems.

  17. A statistically harmonized alignment-classification in image space enables accurate and robust alignment of noisy images in single particle analysis.

    PubMed

    Kawata, Masaaki; Sato, Chikara

    2007-06-01

    In determining the three-dimensional (3D) structure of macromolecular assemblies in single particle analysis, a large representative dataset of two-dimensional (2D) average images, drawn from a huge number of raw images, is key to high resolution. Because alignments prior to averaging are computationally intensive, currently available multireference alignment (MRA) software does not survey every possible alignment. This leads to misaligned images, creating blurred averages and reducing the quality of the final 3D reconstruction. We present a new method in which multireference alignment is harmonized with classification (multireference multiple alignment: MRMA). This method enables a statistical comparison of multiple alignment peaks, reflecting the similarities between each raw image and a set of reference images. Among the selected alignment candidates for each raw image, misaligned images are statistically excluded, based on the principle that correctly aligned raw images of similar projections have a dense distribution around the correct coordinates in image space. The new method was examined for accuracy and speed using model image sets with various signal-to-noise ratios, and with electron microscope images of the Transient Receptor Potential C3 channel and the sodium channel. On every dataset, the new method outperformed conventional methods in robustness against noise and in speed, creating 2D average images of higher quality. This statistically harmonized alignment-classification combination should greatly improve the quality of single particle analysis.
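The exclusion principle can be illustrated with a toy sketch: treat each raw image's best alignment coordinates as a point in image space and discard points that fall far from the dense cluster. The median/MAD rule below is an illustrative stand-in, not the authors' exact statistic:

```python
from statistics import median

def exclude_misaligned(candidates, k=3.0):
    """candidates: list of (x, y) best-alignment coordinates, one per raw image.
    Returns indices of images whose coordinates lie in the dense region
    around the robust centre; the rest are treated as misaligned."""
    xs = [c[0] for c in candidates]
    ys = [c[1] for c in candidates]
    cx, cy = median(xs), median(ys)
    # distance of each candidate to the robust centre
    dists = [((x - cx) ** 2 + (y - cy) ** 2) ** 0.5 for x, y in candidates]
    med_d = median(dists)
    # median absolute deviation as a robust spread estimate
    mad = median([abs(d - med_d) for d in dists]) or 1e-9
    return [i for i, d in enumerate(dists) if d <= med_d + k * mad]
```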

  18. The ReaxFF reactive force-field: Development, applications, and future directions

    DOE PAGES

    Senftle, Thomas; Hong, Sungwook; Islam, Md Mahbubul; ...

    2016-03-04

    The reactive force-field (ReaxFF) interatomic potential is a powerful computational tool for exploring, developing and optimizing material properties. Methods based on the principles of quantum mechanics (QM), while offering valuable theoretical guidance at the electronic level, are often too computationally intense for simulations that consider the full dynamic evolution of a system. Alternatively, empirical interatomic potentials that are based on classical principles require significantly fewer computational resources, which enables simulations to better describe dynamic processes over longer timeframes and on larger scales. Such methods, however, typically require a predefined connectivity between atoms, precluding simulations that involve reactive events. The ReaxFF method was developed to help bridge this gap. Approaching the gap from the classical side, ReaxFF casts the empirical interatomic potential within a bond-order formalism, thus implicitly describing chemical bonding without expensive QM calculations. This article provides an overview of the development, application, and future directions of the ReaxFF method.

  19. Label-free assay based on immobilized capillary enzyme reactor of Leishmania infantum nucleoside triphosphate diphosphohydrolase (LicNTPDase-2-ICER-LC/UV).

    PubMed

    Magalhães, Luana; de Oliveira, Arthur Henrique Cavalcante; de Souza Vasconcellos, Raphael; Mariotini-Moura, Christiane; de Cássia Firmino, Rafaela; Fietto, Juliana Lopes Rangel; Cardoso, Carmen Lúcia

    2016-01-01

    Nucleoside triphosphate diphosphohydrolase (NTPDase) is an enzyme belonging to the apyrase family that catalyzes the hydrolysis of nucleoside di- and triphosphates to the corresponding nucleoside monophosphate. This enzyme underlies the virulence of parasites such as Leishmania. Recently, an NTPDase from Leishmania infantum (LicNTPDase-2) was cloned and expressed, and it has been considered a new drug target for the treatment of leishmaniasis. With the intent of developing label-free online screening methodologies, LicNTPDase-2 was covalently immobilized onto a fused-silica capillary tube in the present study to create an immobilized capillary enzyme reactor (ICER) based on LicNTPDase-2 (LicNTPDase-2-ICER). To perform the activity assays, a multidimensional chromatographic method was developed employing the LicNTPDase-2-ICER in the first dimension and an analytical Ascentis C8 column in the second dimension to provide analytical separation of the substrates and products. The validated LicNTPDase-2-ICER method provided the following kinetic parameters for the immobilized enzyme: KM values of 2.2 and 1.8 mmol L(-1) for the ADP and ATP substrates, respectively. Suramin (1 mmol L(-1)) was also shown to inhibit 32.9% of the enzymatic activity. The developed method is applicable to kinetic studies and enables the recognition of ligands. Furthermore, a comparison of the values from LicNTPDase-2-ICER with those obtained with an LC method using the free enzyme in solution showed that LicNTPDase-2-ICER-LC/UV was an accurate and reproducible method that enabled automated measurements for the rapid screening of ligands. Copyright © 2015 Elsevier B.V. All rights reserved.
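KM values such as those reported here are typically obtained by fitting the Michaelis-Menten model v = Vmax·S/(KM + S) to measured rates. A minimal sketch of such a fit, using the classical Lineweaver-Burk linearisation (1/v = (KM/Vmax)·(1/S) + 1/Vmax) and synthetic data rather than the paper's measurements:

```python
def michaelis_menten_fit(S, v):
    """Estimate (KM, Vmax) from substrate concentrations S and rates v
    via ordinary least squares on the Lineweaver-Burk transform."""
    x = [1.0 / s for s in S]  # 1/[S]
    y = [1.0 / r for r in v]  # 1/v
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
             / sum((xi - mx) ** 2 for xi in x))
    intercept = my - slope * mx
    vmax = 1.0 / intercept      # intercept = 1/Vmax
    km = slope * vmax           # slope = KM/Vmax
    return km, vmax
```

With noise-free data the transform recovers the parameters exactly; with real data, direct nonlinear regression on the untransformed model is usually preferred because the reciprocal transform amplifies error at low rates.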

  20. Thin and open vessel windows for intra-vital fluorescence imaging of murine cochlear blood flow.

    PubMed

    Shi, Xiaorui; Zhang, Fei; Urdang, Zachary; Dai, Min; Neng, Lingling; Zhang, Jinhui; Chen, Songlin; Ramamoorthy, Sripriya; Nuttall, Alfred L

    2014-07-01

    Normal microvessel structure and function in the cochlea is essential for maintaining the ionic and metabolic homeostasis required for hearing. Abnormal cochlear microcirculation has long been considered an etiologic factor in hearing disorders. A better understanding of cochlear blood flow (CoBF) will enable more effective amelioration of hearing disorders that result from aberrant blood flow. However, establishing the direct relationship between CoBF and other cellular events in the lateral wall, and its response to physio-pathological stress, remains a challenge due to the lack of feasible interrogation methods and the difficulty of accessing the inner ear. Here we report new methods for studying CoBF in a mouse model using a thin or open vessel window in combination with fluorescence intra-vital microscopy (IVM). An open vessel window enables investigation of vascular cell biology and blood flow permeability, including pericyte (PC) contractility, bone marrow cell migration, and endothelial barrier leakage, in wild-type and fluorescent protein-labeled transgenic mouse models with high spatial and temporal resolution. Alternatively, the thin vessel-window method minimizes disruption of the homeostatic balance in the lateral wall and enables study of CoBF under relatively intact physiological conditions. A thin vessel window can also be used for time-based studies of physiological and pathological processes. Although the small size of the mouse cochlea makes surgery difficult, the methods are sufficiently developed for studying structural and functional changes in CoBF under normal and pathological conditions. Copyright © 2014 Elsevier B.V. All rights reserved.

  1. DA+ data acquisition and analysis software at the Swiss Light Source macromolecular crystallography beamlines

    PubMed Central

    Wojdyla, Justyna Aleksandra; Kaminski, Jakub W.; Ebner, Simon; Wang, Xiaoqiang; Gabadinho, Jose; Wang, Meitian

    2018-01-01

    Data acquisition software is an essential component of modern macromolecular crystallography (MX) beamlines, enabling efficient use of beam time at synchrotron facilities. Developed at the Paul Scherrer Institute, the DA+ data acquisition software is implemented at all three Swiss Light Source (SLS) MX beamlines. DA+ consists of distributed services and components written in Python and Java, which communicate via messaging and streaming technologies. The major components of DA+ are the user interface, acquisition engine, online processing and database. Immediate data quality feedback is achieved with distributed automatic data analysis routines. The software architecture enables exploration of the full potential of the latest instrumentation at the SLS MX beamlines, such as the SmarGon goniometer and the EIGER X 16M detector, and development of new data collection methods. PMID:29271779

  2. Process-Hardened, Multi-Analyte Sensor for Characterizing Rocket Plume Constituents

    NASA Technical Reports Server (NTRS)

    Goswami, Kisholoy

    2011-01-01

    A multi-analyte sensor was developed that enables simultaneous detection of rocket engine combustion-product molecules in a launch-vehicle ground test stand. The sensor was developed using a pin-printing method by incorporating multiple sensor elements on a single chip. It demonstrated accurate and sensitive detection of analytes such as carbon dioxide, carbon monoxide, kerosene, isopropanol, and ethylene from a single measurement. The use of pin-printing technology enables high-volume fabrication of the sensor chip, which will ultimately eliminate the need for individual sensor calibration since many identical sensors are made in one batch. Tests were performed using a single-sensor chip attached to a fiber-optic bundle. The use of a fiber bundle allows placement of the opto-electronic readout device at a place remote from the test stand. The sensors are rugged for operation in harsh environments.

  3. Indexing method of digital audiovisual medical resources with semantic Web integration.

    PubMed

    Cuggia, Marc; Mougin, Fleur; Le Beux, Pierre

    2005-03-01

    The digitization of audiovisual resources and growing network capability offer many possibilities that are the subject of intensive work in scientific and industrial sectors. Indexing such resources is a major challenge. Recently, the Motion Pictures Expert Group (MPEG) developed MPEG-7, a standard for describing multimedia content. The goal of this standard is to provide a rich set of standardized tools to enable efficient retrieval from digital archives and the filtering of audiovisual broadcasts on the Internet. How could this kind of technology be used in the medical context? In this paper, we propose a simpler indexing system, based on the Dublin Core standard and compliant with MPEG-7. We use MeSH and the UMLS to introduce conceptual navigation. We also present a video platform that enables encoding and gives access to audiovisual resources in streaming mode.

  4. Open Architecture Standard for NASA's Software-Defined Space Telecommunications Radio Systems

    NASA Technical Reports Server (NTRS)

    Reinhart, Richard C.; Johnson, Sandra K.; Kacpura, Thomas J.; Hall, Charles S.; Smith, Carl R.; Liebetreu, John

    2008-01-01

    NASA is developing an architecture standard for software-defined radios used in space- and ground-based platforms to enable commonality among radio developments to enhance capability and services while reducing mission and programmatic risk. Transceivers (or transponders) with functionality primarily defined in software (e.g., firmware) have the ability to change their functional behavior through software alone. This radio architecture standard offers value by employing common waveform software interfaces, method of instantiation, operation, and testing among different compliant hardware and software products. These common interfaces within the architecture abstract application software from the underlying hardware to enable technology insertion independently at either the software or hardware layer. This paper presents the initial Space Telecommunications Radio System (STRS) Architecture for NASA missions to provide the desired software abstraction and flexibility while minimizing the resources necessary to support the architecture.
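The hardware-abstraction idea described above can be sketched as an abstract waveform interface that compliant implementations must satisfy, so application code programs against the interface rather than a specific radio platform. The class and method names below are hypothetical illustrations, not the actual STRS API:

```python
from abc import ABC, abstractmethod

class Waveform(ABC):
    """Common waveform interface: instantiation, operation, teardown.
    Hypothetical sketch of the abstraction pattern, not the STRS standard."""
    @abstractmethod
    def instantiate(self, config: dict) -> None: ...
    @abstractmethod
    def start(self) -> None: ...
    @abstractmethod
    def stop(self) -> None: ...

class QPSKWaveform(Waveform):
    """One concrete waveform; the platform can swap it for another
    implementation without touching code written against Waveform."""
    def __init__(self):
        self.state = "idle"
    def instantiate(self, config: dict) -> None:
        self.config = config
        self.state = "ready"
    def start(self) -> None:
        self.state = "running"
    def stop(self) -> None:
        self.state = "idle"
```

Because callers hold only a `Waveform` reference, technology insertion at either the software or hardware layer reduces to supplying a different conforming implementation.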

  5. eLoom and Flatland: specification, simulation and visualization engines for the study of arbitrary hierarchical neural architectures.

    PubMed

    Caudell, Thomas P; Xiao, Yunhai; Healy, Michael J

    2003-01-01

    eLoom is an open source graph simulation software tool, developed at the University of New Mexico (UNM), that enables users to specify and simulate neural network models. Its specification language and libraries enable users to construct and simulate arbitrary, potentially hierarchical network structures on serial and parallel processing systems. In addition, eLoom is integrated with UNM's Flatland, an open source virtual environments development tool, to provide real-time visualizations of network structure and activity. Visualization is a useful method for understanding both learning and computation in artificial neural networks. Through animated 3D pictorial representations of the state and flow of information in the network, a better understanding of network functionality is achieved. ART-1, LAPART-II, MLP, and SOM neural networks are presented to illustrate eLoom's and Flatland's capabilities.

  6. Analysis of the enablers of capacities to produce primary health care-based reforms in Latin America: a multiple case study

    PubMed Central

    Báscolo, Ernesto Pablo; Yavich, Natalia; Denis, Jean-Louis

    2016-01-01

    Background: Primary health care (PHC)-based reforms have had different results in Latin America. Little attention has been paid to the enablers of the collective action capacities required to produce a comprehensive PHC approach. Objective: To analyse the enablers of collective action capacities to transform health systems towards a comprehensive PHC approach in Latin American PHC-based reforms. Methods: We conducted a longitudinal, retrospective case study of three municipal PHC-based reforms in Bolivia and Argentina. We used multiple data sources and methodologies: document review; interviews with policymakers, managers and practitioners; and household and services surveys. We used temporal bracketing to analyse how the dynamic of interaction between the institutional reform process and the collective action characteristics enabled or hindered the collective action capacities required to produce the envisioned changes. Results: The institutional structuring dynamics and collective action capacities were different in each case. In Cochabamba, an ‘interrupted’ structuring process achieved the establishment of a primary level with a selective PHC approach. In Vicente López, a ‘path-dependent’ structuring process permitted the consolidation of a ‘primary care’ approach, but with limited influence in hospitals. In Rosario, a ‘dialectic’ structuring process favoured the development of the capacities needed to consolidate a comprehensive PHC approach that permeates the entire system. Conclusion: The institutional change processes achieved the development of a primary health care level with different degrees of consolidation and system-wide influence, given how the characteristics of each collective action enabled or hindered the ‘structuring’ processes. PMID:27209640

  7. Integration of DICOM and openEHR standards

    NASA Astrophysics Data System (ADS)

    Wang, Ying; Yao, Zhihong; Liu, Lei

    2011-03-01

    The standard format for medical imaging storage and transmission is DICOM. openEHR is an open standard specification in health informatics that describes the management, storage, retrieval and exchange of health data in electronic health records. Considering that the integration of DICOM and openEHR is beneficial to information sharing, we developed, on the basis of the XML-based DICOM format, a method of creating a DICOM imaging archetype in openEHR to enable the integration of the two standards. Each DICOM file contains abundant imaging information, but because reading a DICOM file involves looking up the DICOM Data Dictionary, its readability is limited. openEHR has innovatively adopted a two-level modeling method, dividing clinical information into a lower level, the information model, and an upper level, archetypes and templates. One critical challenge posed to the development of openEHR, however, is information sharing, especially imaging information sharing; for example, some important imaging information cannot be displayed in an openEHR file. In this paper, to enhance the readability of a DICOM file and the semantic interoperability of an openEHR file, we developed a method of mapping a DICOM file to an openEHR file by adopting the archetype form defined in openEHR. Because an archetype has a tree structure, after mapping, the converted information is structured in conformance with the openEHR format. This method enables the integration of DICOM and openEHR and data exchange between the two standards without loss of imaging information.
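The mapping step can be sketched as placing flat DICOM attributes into a tree via a lookup table, mirroring the archetype's tree structure. The attribute names and archetype paths below are illustrative inventions, not the paper's actual mapping or openEHR's real archetype definitions:

```python
# Hypothetical mapping table: DICOM attribute name -> path in an archetype tree.
DICOM_TO_ARCHETYPE = {
    "PatientName": ("demographics", "name"),
    "Modality": ("imaging", "modality"),
    "StudyDate": ("imaging", "study_date"),
}

def dicom_to_archetype(dicom_attrs):
    """Convert a flat dict of DICOM attributes into a nested tree.
    Attributes absent from the mapping table are skipped in this sketch."""
    tree = {}
    for tag, value in dicom_attrs.items():
        path = DICOM_TO_ARCHETYPE.get(tag)
        if path is None:
            continue
        node = tree
        for part in path[:-1]:
            node = node.setdefault(part, {})
        node[path[-1]] = value
    return tree
```

A real implementation would emit the tree in openEHR's XML/ADL serialization and cover the full data dictionary rather than three attributes.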

  8. 3D Printed Programmable Release Capsules.

    PubMed

    Gupta, Maneesh K; Meng, Fanben; Johnson, Blake N; Kong, Yong Lin; Tian, Limei; Yeh, Yao-Wen; Masters, Nina; Singamaneni, Srikanth; McAlpine, Michael C

    2015-08-12

    The development of methods for achieving precise spatiotemporal control over chemical and biomolecular gradients could enable significant advances in areas such as synthetic tissue engineering, biotic-abiotic interfaces, and bionanotechnology. Living organisms guide tissue development through highly orchestrated gradients of biomolecules that direct cell growth, migration, and differentiation. While numerous methods have been developed to manipulate and implement biomolecular gradients, integrating gradients into multiplexed, three-dimensional (3D) matrices remains a critical challenge. Here we present a method to 3D print stimuli-responsive core/shell capsules for programmable release of multiplexed gradients within hydrogel matrices. These capsules are composed of an aqueous core, which can be formulated to maintain the activity of payload biomolecules, and a poly(lactic-co-glycolic) acid (PLGA, an FDA approved polymer) shell. Importantly, the shell can be loaded with plasmonic gold nanorods (AuNRs), which permits selective rupturing of the capsule when irradiated with a laser wavelength specifically determined by the lengths of the nanorods. This precise control over space, time, and selectivity allows for the ability to pattern 2D and 3D multiplexed arrays of enzyme-loaded capsules along with tunable laser-triggered rupture and release of active enzymes into a hydrogel ambient. The advantages of this 3D printing-based method include (1) highly monodisperse capsules, (2) efficient encapsulation of biomolecular payloads, (3) precise spatial patterning of capsule arrays, (4) "on the fly" programmable reconfiguration of gradients, and (5) versatility for incorporation in hierarchical architectures. Indeed, 3D printing of programmable release capsules may represent a powerful new tool to enable spatiotemporal control over biomolecular gradients.

  9. A fuzzy model for achieving lean attributes for competitive advantages development using AHP-QFD-PROMETHEE

    NASA Astrophysics Data System (ADS)

    Roghanian, E.; Alipour, Mohammad

    2014-06-01

    Lean production has become an integral part of the manufacturing landscape, as its link with superior performance and its ability to provide competitive advantage are well accepted among academics and practitioners. Lean production helps producers overcome the challenges organizations face through powerful tools and enablers. However, most companies have restricted resources, such as finances, human resources and time, for deploying these enablers, and are not capable of implementing all of these techniques. Therefore, identifying and selecting the most appropriate and efficient tools can be a significant challenge for many companies. Hence, this study seeks to combine competitive advantages, lean attributes, and lean enablers to determine the most appropriate enablers for the improvement of lean attributes. Quality function deployment in a fuzzy environment and the house of quality matrix are implemented. Throughout the methodology, fuzzy logic is the basis for translating the linguistic judgments required for the relationship and correlation matrices into numerical values. Moreover, for the final ranking of lean enablers, a multi-criteria decision-making method (PROMETHEE) is adopted. Finally, a case study in the automotive industry is presented to illustrate the implementation of the proposed methodology.
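The final PROMETHEE ranking step can be sketched as computing net outranking flows from pairwise comparisons of the enablers. The sketch below uses the simple "usual" preference function (1 if strictly better, else 0) and made-up scores; the paper's fuzzy inputs and chosen preference functions are not reproduced:

```python
def promethee_ii(scores, weights):
    """PROMETHEE II net flows. scores[a][c]: performance of alternative a
    on criterion c (higher is better); weights[c]: criterion weight.
    Returns one net flow per alternative; higher flow = better rank."""
    n = len(scores)
    net = [0.0] * n
    for a in range(n):
        for b in range(n):
            if a == b:
                continue
            # aggregated preference of a over b under the 'usual' function
            pi_ab = sum(w for w, sa, sb in zip(weights, scores[a], scores[b])
                        if sa > sb)
            net[a] += pi_ab
            net[b] -= pi_ab
    total_w = sum(weights)
    # normalise by total weight and number of comparisons
    return [f / (total_w * (n - 1)) for f in net]
```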

  10. Design, Development and Testing of Web Services for Multi-Sensor Snow Cover Mapping

    NASA Astrophysics Data System (ADS)

    Kadlec, Jiri

    This dissertation presents the design, development and validation of new data integration methods for mapping the extent of snow cover based on open access ground station measurements, remote sensing images, volunteer observer snow reports, and cross country ski track recordings from location-enabled mobile devices. The first step of the data integration procedure includes data discovery, data retrieval, and data quality control of snow observations at ground stations. The WaterML R package developed in this work enables hydrologists to retrieve and analyze data from multiple organizations that are listed in the Consortium of Universities for the Advancement of Hydrologic Sciences Inc (CUAHSI) Water Data Center catalog directly within the R statistical software environment. Using the WaterML R package is demonstrated by running an energy balance snowpack model in R with data inputs from CUAHSI, and by automating uploads of real time sensor observations to CUAHSI HydroServer. The second step of the procedure requires efficient access to multi-temporal remote sensing snow images. The Snow Inspector web application developed in this research enables the users to retrieve a time series of fractional snow cover from the Moderate Resolution Imaging Spectroradiometer (MODIS) for any point on Earth. The time series retrieval method is based on automated data extraction from tile images provided by a Web Map Tile Service (WMTS). The average required time for retrieving 100 days of data using this technique is 5.4 seconds, which is significantly faster than other methods that require the download of large satellite image files. The presented data extraction technique and space-time visualization user interface can be used as a model for working with other multi-temporal hydrologic or climate data WMTS services. The third, final step of the data integration procedure is generating continuous daily snow cover maps. 
A custom inverse distance weighting method has been developed to combine volunteer snow reports, cross-country ski track reports and station measurements to fill cloud gaps in the MODIS snow cover product. The method is demonstrated by producing a continuous daily time step snow presence probability map dataset for the Czech Republic region. The ability of the presented methodology to reconstruct MODIS snow cover under cloud is validated by simulating cloud cover datasets and comparing estimated snow cover to actual MODIS snow cover. The percent correctly classified indicator showed accuracy between 80 and 90% using this method. Using crowdsourcing data (volunteer snow reports and ski tracks) improves the map accuracy by 0.7--1.2%. The output snow probability map data sets are published online using web applications and web services. Keywords: crowdsourcing, image analysis, interpolation, MODIS, R statistical software, snow cover, snowpack probability, Tethys platform, time series, WaterML, web services, winter sports.
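The gap-filling idea can be sketched with a minimal inverse distance weighting interpolator: a cloud-obscured pixel's snow value is estimated from nearby observations (stations, volunteer reports, ski tracks), weighted by inverse distance. This is generic IDW, not the author's customized weighting:

```python
import math

def idw_fill(known, query, power=2.0):
    """known: list of (x, y, value) observations; query: (x, y) of a
    cloud-obscured pixel. Returns the inverse-distance-weighted estimate."""
    num = den = 0.0
    for x, y, value in known:
        d = math.hypot(query[0] - x, query[1] - y)
        if d < 1e-12:
            return value  # query coincides with an observation
        w = 1.0 / d ** power
        num += w * value
        den += w
    return num / den
```

The `power` exponent controls locality: larger values make the estimate track the nearest observation more closely.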

  11. Neural classifier in the estimation process of maturity of selected varieties of apples

    NASA Astrophysics Data System (ADS)

    Boniecki, P.; Piekarska-Boniecka, H.; Koszela, K.; Zaborowicz, M.; Przybył, K.; Wojcieszak, D.; Zbytek, Z.; Ludwiczak, A.; Przybylak, A.; Lewicki, A.

    2015-07-01

    This paper presents methods of neural image analysis aimed at estimating the maturity of selected apple varieties that are popular in Poland. The degree of maturity was identified on the basis of information encoded in graphical form in digital photos. The process applies the BBCH scale, used to determine the maturity of apples. This scale is widely used in the EU and has been developed for many species of monocotyledonous and dicotyledonous plants; it enables detailed determination of the development stage of a given plant. The purpose of this work is to identify the maturity level of selected apple varieties, supported by image analysis methods and classification techniques represented by artificial neural networks. The analysis of representative graphical features extracted by image analysis enabled the assessment of apple maturity. For practical application, the "JabVis 1.1" neural IT system was created in accordance with software engineering requirements, to support decision-making processes in the broadly understood production and processing of apples.

  12. Processing methods for differential analysis of LC/MS profile data

    PubMed Central

    Katajamaa, Mikko; Orešič, Matej

    2005-01-01

    Background Liquid chromatography coupled to mass spectrometry (LC/MS) has been widely used in proteomics and metabolomics research. In this context, the technology has been increasingly used for differential profiling, i.e. broad screening of biomolecular components across multiple samples in order to elucidate the observed phenotypes and discover biomarkers. One of the major challenges in this domain remains development of better solutions for processing of LC/MS data. Results We present a software package MZmine that enables differential LC/MS analysis of metabolomics data. This software is a toolbox containing methods for all data processing stages preceding differential analysis: spectral filtering, peak detection, alignment and normalization. Specifically, we developed and implemented a new recursive peak search algorithm and a secondary peak picking method for improving already aligned results, as well as a normalization tool that uses multiple internal standards. Visualization tools enable comparative viewing of data across multiple samples. Peak lists can be exported into other data analysis programs. The toolbox has already been utilized in a wide range of applications. We demonstrate its utility on an example of metabolic profiling of Catharanthus roseus cell cultures. Conclusion The software is freely available under the GNU General Public License and it can be obtained from the project web page at: . PMID:16026613
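The normalization stage can be illustrated with a single-standard toy version: each sample's peak areas are rescaled so its internal standard's response matches the across-sample mean response. MZmine's actual multiple-internal-standard algorithm is more involved; this sketch only conveys the idea:

```python
def normalize_by_standard(peak_areas, standard_areas):
    """peak_areas[sample][peak]: raw peak areas per sample.
    standard_areas[sample]: area of one internal standard in that sample.
    Returns rescaled areas so each sample's standard matches the mean."""
    mean_std = sum(standard_areas) / len(standard_areas)
    return [[a * mean_std / s for a in row]
            for row, s in zip(peak_areas, standard_areas)]
```

After this correction, differences between samples reflect biology rather than run-to-run instrument drift, which is the point of injecting internal standards.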

  13. Processing methods for differential analysis of LC/MS profile data.

    PubMed

    Katajamaa, Mikko; Oresic, Matej

    2005-07-18

    Liquid chromatography coupled to mass spectrometry (LC/MS) has been widely used in proteomics and metabolomics research. In this context, the technology has been increasingly used for differential profiling, i.e. broad screening of biomolecular components across multiple samples in order to elucidate the observed phenotypes and discover biomarkers. One of the major challenges in this domain remains development of better solutions for processing of LC/MS data. We present a software package MZmine that enables differential LC/MS analysis of metabolomics data. This software is a toolbox containing methods for all data processing stages preceding differential analysis: spectral filtering, peak detection, alignment and normalization. Specifically, we developed and implemented a new recursive peak search algorithm and a secondary peak picking method for improving already aligned results, as well as a normalization tool that uses multiple internal standards. Visualization tools enable comparative viewing of data across multiple samples. Peak lists can be exported into other data analysis programs. The toolbox has already been utilized in a wide range of applications. We demonstrate its utility on an example of metabolic profiling of Catharanthus roseus cell cultures. The software is freely available under the GNU General Public License and it can be obtained from the project web page at: http://mzmine.sourceforge.net/.

  14. Development of a method for the determination of Fusarium fungi on corn using mid-infrared spectroscopy with attenuated total reflection and chemometrics.

    PubMed

    Kos, Gregor; Lohninger, Hans; Krska, Rudolf

    2003-03-01

    A novel method is presented that enables the determination of fungal infection with Fusarium graminearum on corn within minutes. The ground sample was sieved, and the particle size fraction between >250 and 100 microm was used for mid-infrared/attenuated total reflection (ATR) measurements. The sample was pressed onto the ATR crystal under reproducible pressure. The recorded spectra were subjected to principal component analysis (PCA) and classified using cluster analysis. Observed changes in the spectra reflected changes in protein, carbohydrate, and lipid contents. Ergosterol (for the total fungal biomass) and the toxin deoxynivalenol (DON; a secondary metabolite of Fusarium fungi) served as reference parameters because of their relevance for the examination of corn-based food and feed. Repeatability was greatly improved by sieving before recording the spectra, resulting in better clustering in PCA score/score plots. The developed method enabled the separation of samples with a toxin content as low as 310 microg/kg from noncontaminated (blank) samples. The investigated concentration ranges were 880-3600 microg/kg for ergosterol and 310-2596 microg/kg for DON. The percentage of correctly classified samples was up to 100% for individual samples compared with a number of blank samples.
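
    The chemometric pipeline described here, PCA followed by cluster-based classification, can be sketched as follows. The toy spectra and class labels are hypothetical and do not reproduce the paper's data or parameters.

    ```python
    # Sketch of the chemometric pipeline: project spectra onto principal
    # components, then classify by nearest class centroid in score space.
    # (Illustrative data; not the paper's spectra or model.)
    import numpy as np

    def pca_scores(spectra, n_components=2):
        centered = spectra - spectra.mean(axis=0)
        # SVD-based principal component analysis
        _, _, vt = np.linalg.svd(centered, full_matrices=False)
        return centered @ vt[:n_components].T

    # toy "spectra": two contaminated-like rows, two blank-like rows
    spectra = np.array([[1.0, 0.9, 0.1],
                        [1.1, 1.0, 0.0],
                        [0.1, 0.2, 1.0],
                        [0.0, 0.1, 1.1]])
    scores = pca_scores(spectra)

    # nearest-centroid classification in PCA score space
    centroids = {"contaminated": scores[:2].mean(axis=0),
                 "blank": scores[2:].mean(axis=0)}
    label = min(centroids, key=lambda k: np.linalg.norm(scores[0] - centroids[k]))
    print(label)  # prints "contaminated"
    ```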

  15. Development of the Glenn-HT Computer Code to Enable Time-Filtered Navier-Stokes (TFNS) Simulations and Application to Film Cooling on a Flat Plate Through Long Cooling Tubes

    NASA Technical Reports Server (NTRS)

    Ameri, Ali; Shyam, Vikram; Rigby, David; Poinsatte, Philip; Thurman, Douglas; Steinthorsson, Erlendur

    2014-01-01

    Computational fluid dynamics (CFD) analysis using the Reynolds-averaged Navier-Stokes (RANS) formulation for turbomachinery-related flows has enabled improved engine component designs. RANS methodology has limitations related to its inability to accurately describe the spectrum of flow phenomena encountered in engines. Examples of flows that are difficult to compute accurately with RANS include laminar-turbulent transition, turbulent mixing of streams, and separated flows. Large eddy simulation (LES) can improve accuracy, but at considerably higher cost. In recent years, hybrid schemes that take advantage of both unsteady RANS and LES have been proposed. This study investigated an alternative scheme, the time-filtered Navier-Stokes (TFNS) method, applied to compressible flows. The method developed by Shih and Liu was implemented in the Glenn-HT code and applied to film cooling flows. In this report the method and its implementation are briefly described. Film effectiveness results are shown for film cooling from a row of 30 holes with a pitch of 3.0 diameters emitting air at a nominal density ratio of unity and four blowing ratios of 0.5, 1.0, 1.5 and 2.0. Flow features under these conditions are also described.

  16. Using Morpholinos to Probe Gene Networks in Sea Urchin.

    PubMed

    Materna, Stefan C

    2017-01-01

    The control processes that underlie the progression of development can be summarized in maps of gene regulatory networks (GRNs). A critical step in their assembly is the systematic perturbation of network candidates. In sea urchins the most important method for interfering with expression in a gene-specific way is application of morpholino antisense oligonucleotides (MOs). MOs act by binding to their sequence complement in transcripts resulting in a block in translation or a change in splicing and thus result in a loss of function. Despite the tremendous success of this technology, recent comparisons to mutants generated by genome editing have led to renewed criticism and challenged its reliability. As with all methods based on sequence recognition, MOs are prone to off-target binding that may result in phenotypes that are erroneously ascribed to the loss of the intended target. However, the slow progression of development in sea urchins has enabled extremely detailed studies of gene activity in the embryo. This wealth of knowledge paired with the simplicity of the sea urchin embryo enables careful analysis of MO phenotypes through a variety of methods that do not rely on terminal phenotypes. This article summarizes the use of MOs in probing GRNs and the steps that should be taken to assure their specificity.

  17. Burst-mode optical label processor with ultralow power consumption.

    PubMed

    Ibrahim, Salah; Nakahara, Tatsushi; Ishikawa, Hiroshi; Takahashi, Ryo

    2016-04-04

    A novel label processor subsystem for 100-Gbps (25-Gbps × 4λs) burst-mode optical packets is developed, in which a highly energy-efficient method is pursued for extracting and interfacing the ultrafast packet label to a CMOS-based processor where label recognition takes place. The method involves performing serial-to-parallel conversion of the label bits on a bit-by-bit basis, using an optoelectronic converter operated with a set of optical triggers generated in a burst-mode manner upon packet arrival. Here we present three key achievements that enabled a significant reduction in the total power consumption and latency of the whole subsystem: (1) based on a novel operation mechanism for providing amplification with bit-level selectivity, an optical trigger pulse generator that consumes power only for a very short duration upon packet arrival is proposed and experimentally demonstrated; (2) the energy of the optical triggers needed by the optoelectronic serial-to-parallel converter is reduced by utilizing a negative-polarity signal together with an enhanced conversion scheme, the discharge-or-hold scheme; (3) the necessary optical trigger energy is further halved by coupling the triggers through the chip's backside, for which a novel lens-free packaging method is developed that enables a low-cost alignment process requiring only simple visual observation.

  18. Next-generation concurrent engineering: developing models to complement point designs

    NASA Technical Reports Server (NTRS)

    Morse, Elizabeth; Leavens, Tracy; Cohanim, Babak; Harmon, Corey; Mahr, Eric; Lewis, Brian

    2006-01-01

    Concurrent Engineering Design (CED) teams have made routine the rapid development of point designs for space missions. The Jet Propulsion Laboratory's Team X is now evolving into a next-generation CED team: in addition to a point design, the team develops a model of the local trade space. The process balances the power of model-development tools against the creativity of human experts, enabling the development of a variety of trade models for any space mission. This paper reviews the modeling method and its practical implementation in the CED environment. Example results illustrate the benefits of this approach.

  19. Review of recent advances in analytical techniques for the determination of neurotransmitters

    PubMed Central

    Perry, Maura; Li, Qiang; Kennedy, Robert T.

    2009-01-01

    Methods and advances for monitoring neurotransmitters in vivo or for tissue analysis of neurotransmitters over the last five years are reviewed. The review is organized primarily by neurotransmitter type. Transmitters and related compounds may be monitored either by in vivo sampling coupled to analytical methods or by implanted sensors. Sampling is primarily performed using microdialysis, but low-flow push-pull perfusion may offer advantages in spatial resolution while minimizing the tissue disruption associated with higher flow rates. Analytical techniques coupled to these sampling methods include liquid chromatography, capillary electrophoresis, enzyme assays, sensors, and mass spectrometry. Methods for the detection of amino acid, monoamine, neuropeptide, acetylcholine, nucleoside, and soluble gas neurotransmitters have been developed and improved upon. Advances in the speed and sensitivity of these methods have enabled improvements in temporal resolution and increased the number of compounds detectable. Similar advances have enabled improved detection in tissue samples, with a substantial emphasis on single cells and other small samples. Sensors provide excellent temporal and spatial resolution for in vivo monitoring. Advances in application to catecholamines, indoleamines, and amino acids have been prominent. Improvements in the stability, sensitivity, and selectivity of the sensors have been of paramount interest. PMID:19800472

  20. Composite Bloom Filters for Secure Record Linkage.

    PubMed

    Durham, Elizabeth Ashley; Kantarcioglu, Murat; Xue, Yuan; Toth, Csaba; Kuzu, Mehmet; Malin, Bradley

    2014-12-01

    The process of record linkage seeks to integrate instances that correspond to the same entity. Record linkage has traditionally been performed through the comparison of identifying field values (e.g., Surname); however, when databases are maintained by disparate organizations, the disclosure of such information can breach the privacy of the corresponding individuals. Various private record linkage (PRL) methods have been developed to obscure such identifiers, but they vary widely in their ability to balance the competing goals of accuracy, efficiency and security. The tokenization and hashing of field values into Bloom filters (BF) enables greater linkage accuracy and efficiency than other PRL methods, but the encodings may be compromised through frequency-based cryptanalysis. Our objective is to adapt a BF encoding technique to mitigate such attacks with minimal sacrifices in accuracy and efficiency. To accomplish these goals, we introduce a statistically-informed method to generate BF encodings that integrate bits from multiple fields, the frequencies of which are provably associated with a minimum number of fields. Our method enables a user-specified tradeoff between security and accuracy. We compare our encoding method with other techniques using a public dataset of voter registration records and demonstrate that the increases in security come with only minor losses to accuracy.
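
    The field-level Bloom filter encoding that the composite method hardens can be sketched as follows: a field value is split into character bigrams, each bigram is hashed k times into an m-bit filter, and encoded records are compared with a set-similarity measure. Function names and parameter values below are illustrative, not those of the paper.

    ```python
    # Minimal sketch of field-level Bloom filter encoding for record linkage.
    # (Basic scheme only; the paper's composite multi-field encoding is not shown.)
    import hashlib

    def bloom_encode(value, m=256, k=3):
        """Hash the character bigrams of a field value into an m-bit Bloom filter."""
        bits = [0] * m
        bigrams = [value[i:i + 2] for i in range(len(value) - 1)]
        for gram in bigrams:
            for seed in range(k):  # k independent hash functions via seeding
                digest = hashlib.sha256(f"{seed}:{gram}".encode()).hexdigest()
                bits[int(digest, 16) % m] = 1
        return bits

    def dice_similarity(a, b):
        """Dice coefficient between two bit vectors, used to compare encodings."""
        overlap = sum(x & y for x, y in zip(a, b))
        return 2 * overlap / (sum(a) + sum(b))

    # similar surnames yield similar filters; dissimilar ones do not
    print(dice_similarity(bloom_encode("SMITH"), bloom_encode("SMYTH")) >
          dice_similarity(bloom_encode("SMITH"), bloom_encode("JONES")))  # True
    ```

    Because similar strings share bigrams, and hence bit positions, this encoding supports approximate matching; the frequency-analysis weakness it introduces is exactly what the composite scheme above mitigates.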

  1. Investigation of Human Cancers for Retrovirus by Low-Stringency Target Enrichment and High-Throughput Sequencing.

    PubMed

    Vinner, Lasse; Mourier, Tobias; Friis-Nielsen, Jens; Gniadecki, Robert; Dybkaer, Karen; Rosenberg, Jacob; Langhoff, Jill Levin; Cruz, David Flores Santa; Fonager, Jannik; Izarzugaza, Jose M G; Gupta, Ramneek; Sicheritz-Ponten, Thomas; Brunak, Søren; Willerslev, Eske; Nielsen, Lars Peter; Hansen, Anders Johannes

    2015-08-19

    Although nearly one fifth of all human cancers have an infectious aetiology, the causes of the majority of cancers remain unexplained. Despite the enormous data output from high-throughput shotgun sequencing, viral DNA in a clinical sample typically constitutes too small a proportion of the host DNA to be detected. Sequence variation among virus genomes complicates the application of sequence-specific, highly sensitive PCR methods. Therefore, we aimed to develop and characterize a method that permits sensitive detection of sequences despite considerable variation. We demonstrate that our low-stringency in-solution hybridization method enables detection of <100 viral copies. Furthermore, distantly related proviral sequences may be enriched by orders of magnitude, enabling discovery of hitherto unknown viral sequences by high-throughput sequencing. The sensitivity was sufficient to detect retroviral sequences in clinical samples. We used this method to conduct an investigation for novel retroviruses in samples from three cancer types. In accordance with recent studies, our investigation revealed no retroviral infections in human B-cell lymphoma cells, cutaneous T-cell lymphoma or colorectal cancer biopsies. Nonetheless, our generally applicable method makes sensitive detection possible and permits sequencing of distantly related sequences from complex material.

  2. Composite Bloom Filters for Secure Record Linkage

    PubMed Central

    Durham, Elizabeth Ashley; Kantarcioglu, Murat; Xue, Yuan; Toth, Csaba; Kuzu, Mehmet; Malin, Bradley

    2014-01-01

    The process of record linkage seeks to integrate instances that correspond to the same entity. Record linkage has traditionally been performed through the comparison of identifying field values (e.g., Surname); however, when databases are maintained by disparate organizations, the disclosure of such information can breach the privacy of the corresponding individuals. Various private record linkage (PRL) methods have been developed to obscure such identifiers, but they vary widely in their ability to balance the competing goals of accuracy, efficiency and security. The tokenization and hashing of field values into Bloom filters (BF) enables greater linkage accuracy and efficiency than other PRL methods, but the encodings may be compromised through frequency-based cryptanalysis. Our objective is to adapt a BF encoding technique to mitigate such attacks with minimal sacrifices in accuracy and efficiency. To accomplish these goals, we introduce a statistically-informed method to generate BF encodings that integrate bits from multiple fields, the frequencies of which are provably associated with a minimum number of fields. Our method enables a user-specified tradeoff between security and accuracy. We compare our encoding method with other techniques using a public dataset of voter registration records and demonstrate that the increases in security come with only minor losses to accuracy. PMID:25530689

  3. Object Recognition using Feature- and Color-Based Methods

    NASA Technical Reports Server (NTRS)

    Duong, Tuan; Duong, Vu; Stubberud, Allen

    2008-01-01

    An improved adaptive method of processing image data in an artificial neural network has been developed to enable automated, real-time recognition of possibly moving objects under changing (including suddenly changing) conditions of illumination and perspective. The method involves a combination of two prior object-recognition methods, one based on adaptive detection of shape features and one based on adaptive color segmentation, to enable recognition in situations in which either prior method by itself may be inadequate. The chosen prior feature-based method is known as adaptive principal-component analysis (APCA); the chosen prior color-based method is known as adaptive color segmentation (ACOSE). These methods are made to interact with each other in a closed-loop system to obtain an optimal solution of the object-recognition problem in a dynamic environment. One result of the interaction is to increase, beyond what would otherwise be possible, the accuracy of the determination of a region of interest (containing an object that one seeks to recognize) within an image. Another result is to provide a minimized adaptive step that can be used to update the results obtained by the two component methods when changes of color and apparent shape occur. The net effect is to enable the neural network to update its recognition output and improve its recognition capability via an adaptive learning sequence. In principle, the improved method could readily be implemented in integrated circuitry to make a compact, low-power, real-time object-recognition system. It has been proposed to demonstrate the feasibility of such a system by integrating a 256-by-256 active-pixel sensor with APCA, ACOSE, and neural processing circuitry on a single chip. It has been estimated that such a system on a chip would have a volume no larger than a few cubic centimeters, could operate at a rate as high as 1,000 frames per second, and would consume on the order of milliwatts of power.

  4. Point-of-care testing: applications of 3D printing.

    PubMed

    Chan, Ho Nam; Tan, Ming Jun Andrew; Wu, Hongkai

    2017-08-08

    Point-of-care testing (POCT) devices fulfil a critical need in the modern healthcare ecosystem, enabling the decentralized delivery of imperative clinical strategies in both the developed and developing worlds. To achieve diagnostic utility and clinical impact, POCT technologies depend heavily on effective translation from academic laboratories to real-world deployment. However, the current research and development pipeline is highly bottlenecked owing to multiple constraints in the material, cost, and complexity of conventionally available fabrication techniques. Recently, 3D printing technology has emerged as a revolutionary, industry-compatible method enabling cost-effective, facile, and rapid manufacturing of objects. This has allowed iterative design-build-test cycles of devices geared towards point-of-care applications, from microfluidic chips to smartphone interfaces. In this review, we focus on highlighting recent works that exploit 3D printing in developing POCT devices, underscoring its utility in all analytical steps. Moreover, we also discuss key advantages of adopting 3D printing in the device development pipeline and identify promising opportunities in 3D printing technology that can benefit global health applications.

  5. Methods for the Study of Gonadal Development.

    PubMed

    Piprek, Rafal P

    2016-01-01

    Current knowledge on gonadal development and sex determination is the product of many decades of research involving a variety of scientific methods from different biological disciplines such as histology, genetics, biochemistry, and molecular biology. The earliest embryological investigations, followed by the invention of microscopy and staining methods, were based on histological examinations. The most robust development of histological staining techniques occurred in the second half of the nineteenth century and resulted in structural descriptions of gonadogenesis. These first studies on gonadal development were conducted on domesticated animals; however, currently the mouse is the most extensively studied species. The next key point in the study of gonadogenesis was the advancement of methods allowing for the in vitro culture of fetal gonads. For instance, this led to the description of the origin of cell lines forming the gonads. Protein detection using antibodies and immunolabeling methods and the use of reporter genes were also invaluable for developmental studies, enabling the visualization of the formation of gonadal structure. Recently, genetic and molecular biology techniques, especially gene expression analysis, have revolutionized studies on gonadogenesis and have provided insight into the molecular mechanisms that govern this process. The successive invention of new methods is reflected in the progress of research on gonadal development.

  6. Development of an Aerosol Opacity Retrieval Algorithm for Use with Multi-Angle Land Surface Images

    NASA Technical Reports Server (NTRS)

    Diner, D.; Paradise, S.; Martonchik, J.

    1994-01-01

    In 1998, the Multi-angle Imaging SpectroRadiometer (MISR) will fly aboard the EOS-AM1 spacecraft. MISR will enable unique methods for retrieving the properties of atmospheric aerosols, by providing global imagery of the Earth at nine viewing angles in four visible and near-IR spectral bands. As part of the MISR algorithm development, theoretical methods of analyzing multi-angle, multi-spectral data are being tested using images acquired by the airborne Advanced Solid-State Array Spectroradiometer (ASAS). In this paper we derive a method to be used over land surfaces for retrieving the change in opacity between spectral bands, which can then be used in conjunction with an aerosol model to derive a bound on absolute opacity.

  7. OpenMS - A platform for reproducible analysis of mass spectrometry data.

    PubMed

    Pfeuffer, Julianus; Sachsenberg, Timo; Alka, Oliver; Walzer, Mathias; Fillbrunn, Alexander; Nilse, Lars; Schilling, Oliver; Reinert, Knut; Kohlbacher, Oliver

    2017-11-10

    In recent years, several mass spectrometry-based omics technologies have emerged to investigate qualitative and quantitative changes within thousands of biologically active components such as proteins, lipids and metabolites. The research enabled through these methods potentially contributes to the diagnosis and pathophysiology of human diseases as well as to the clarification of structures and interactions between biomolecules. Simultaneously, technological advances in the field of mass spectrometry, which lead to an ever-increasing amount of data, demand high standards of efficiency, accuracy and reproducibility from analysis software. This article presents the current state and ongoing developments in OpenMS, a versatile open-source framework aimed at enabling reproducible analyses of high-throughput mass spectrometry data. It provides implementations of frequently occurring processing operations on MS data through a clean application programming interface in C++ and Python. A collection of 185 tools and ready-made workflows for typical MS-based experiments enables convenient analyses for non-developers and facilitates reproducible research without losing flexibility. OpenMS will continue to increase its ease of use for developers as well as users with improved continuous integration/deployment strategies, regular trainings with updated training materials and multiple sources of support. The active developer community ensures the incorporation of new features to support state-of-the-art research. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  8. A metadata reporting framework for standardization and synthesis of ecohydrological field observations

    NASA Astrophysics Data System (ADS)

    Christianson, D. S.; Varadharajan, C.; Detto, M.; Faybishenko, B.; Gimenez, B.; Jardine, K.; Negron Juarez, R. I.; Pastorello, G.; Powell, T.; Warren, J.; Wolfe, B.; McDowell, N. G.; Kueppers, L. M.; Chambers, J.; Agarwal, D.

    2016-12-01

    The U.S. Department of Energy's (DOE) Next Generation Ecosystem Experiment (NGEE) Tropics project aims to develop a process-rich tropical forest ecosystem model that is parameterized and benchmarked by field observations. Thus, data synthesis, quality assurance and quality control (QA/QC), and data product generation of a diverse and complex set of ecohydrological observations, including sapflux, leaf surface temperature, soil water content, and leaf gas exchange from sites across the Tropics, are required to support model simulations. We have developed a metadata reporting framework, implemented in conjunction with the NGEE Tropics Data Archive tool, to enable cross-site and cross-method comparison, data interpretability, and QA/QC. We employed a modified User-Centered Design approach, which involved short development cycles based on user-identified needs, and iterative testing with data providers and users. The metadata reporting framework currently has been implemented for sensor-based observations and leverages several existing metadata protocols. The framework consists of templates that define a multi-scale measurement position hierarchy, descriptions of measurement settings, and details about data collection and data file organization. The framework also enables data providers to define data-access permission settings, provenance, and referencing to enable appropriate data usage, citation, and attribution. In addition to describing the metadata reporting framework, we discuss tradeoffs and impressions from both data providers and users during the development process, focusing on the scalability, usability, and efficiency of the framework.

  9. A theoretical treatment of technical risk in modern propulsion system design

    NASA Astrophysics Data System (ADS)

    Roth, Bryce Alexander

    2000-09-01

    A prevalent trend in modern aerospace systems is increasing complexity and cost, which in turn drives increased risk. Consequently, there is a clear and present need for the development of formalized methods to analyze the impact of risk on the design of aerospace vehicles. The objective of this work is to develop such a method that enables analysis of risk via a consistent, comprehensive treatment of the aerothermodynamic and mass properties aspects of vehicle design. The key elements enabling the creation of this methodology are recent developments in the analytical estimation of work potential based on the second law of thermodynamics. This dissertation develops the theoretical foundation of a vehicle analysis method based on work potential and validates it using the Northrop F-5E with GE J85-GE-21 engines as a case study. Although the method is broadly applicable, emphasis is given to aircraft propulsion applications. Three work potential figures of merit are applied using this method: exergy, available energy, and thrust work potential. It is shown that each possesses unique properties making them useful for specific vehicle analysis tasks, though the latter two are actually special cases of exergy. All three are demonstrated on the analysis of the J85-GE-21 propulsion system, resulting in a comprehensive description of propulsion system thermodynamic loss. This "loss management" method is used to analyze the aerodynamic drag loss of the F-5E and is then used in conjunction with the propulsive loss model to analyze the usage of fuel work potential throughout the F-5E design mission. The results clearly show how and where work potential is used during flight and yield considerable insight as to where the greatest opportunity for design improvement is. Next, usage of work potential is translated into fuel weight so that the aerothermodynamic performance of the F-5E can be expressed entirely in terms of vehicle gross weight. This technique is then applied as a means to quantify the impact of engine cycle technologies on the F-5E airframe. Finally, loss management methods are used in conjunction with probabilistic analysis methods to quantify the impact of risk on F-5E aerothermodynamic performance.
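
    For reference, second-law loss accounting of this kind is typically built on the standard steady-flow exergy of thermodynamics; in textbook form (not necessarily the dissertation's exact figure of merit), the flow exergy per unit mass relative to a dead state at enthalpy $h_0$, entropy $s_0$, and temperature $T_0$ is

    ```latex
    \varepsilon = (h - h_0) - T_0\,(s - s_0) + \frac{V^2}{2} + g z
    ```

    where $h$ and $s$ are the specific enthalpy and entropy of the stream, $V^2/2$ the kinetic-energy term, and $gz$ the potential-energy term; thermodynamic losses appear as destruction of $\varepsilon$ between stations.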

  10. A vacuum flash-assisted solution process for high-efficiency large-area perovskite solar cells

    NASA Astrophysics Data System (ADS)

    Li, Xiong; Bi, Dongqin; Yi, Chenyi; Décoppet, Jean-David; Luo, Jingshan; Zakeeruddin, Shaik Mohammed; Hagfeldt, Anders; Grätzel, Michael

    2016-07-01

    Metal halide perovskite solar cells (PSCs) currently attract enormous research interest because of their high solar-to-electric power conversion efficiency (PCE) and low fabrication costs, but their practical development is hampered by difficulties in achieving high performance with large-size devices. We devised a simple vacuum flash-assisted solution processing method to obtain shiny, smooth, crystalline perovskite films of high electronic quality over large areas. This enabled us to fabricate solar cells with an aperture area exceeding 1 square centimeter, a maximum efficiency of 20.5%, and a certified PCE of 19.6%. By contrast, the best certified PCE to date is 15.6% for PSCs of similar size. We demonstrate that the reproducibility of the method is excellent and that the cells show virtually no hysteresis. Our approach enables the realization of highly efficient large-area PSCs for practical deployment.

  11. Determining Regulatory Networks Governing the Differentiation of Embryonic Stem Cells to Pancreatic Lineage

    NASA Astrophysics Data System (ADS)

    Banerjee, Ipsita

    2009-03-01

    Knowledge of the pathways governing cellular differentiation to a specific phenotype will enable generation of desired cell fates through careful alteration of the governing network by adequate manipulation of the cellular environment. With this aim, we have developed a novel method to reconstruct the underlying regulatory architecture of a differentiating cell population from discrete temporal gene expression data. We utilize an inherent feature of biological networks, sparsity, in formulating the network reconstruction problem as a bi-level mixed-integer programming problem. The formulation optimizes the network topology at the upper level and the network connectivity strength at the lower level. The method is first validated on in silico data before being applied to the complex system of embryonic stem (ES) cell differentiation. This formulation enables efficient identification of the underlying network topology, which accurately predicts the steps necessary for directing differentiation to subsequent stages. Concurrent experimental verification demonstrated excellent agreement with model predictions.

  12. Gender Representation on Journal Editorial Boards in the Mathematical Sciences.

    PubMed

    Topaz, Chad M; Sen, Shilad

    2016-01-01

    We study gender representation on the editorial boards of 435 journals in the mathematical sciences. Women are known to comprise approximately 15% of tenure-stream faculty positions in doctoral-granting mathematical sciences departments in the United States. Compared to this group, we find that 8.9% of the 13067 editorships in our study are held by women. We describe group variations within the editorships by identifying specific journals, subfields, publishers, and countries that significantly exceed or fall short of this average. To enable our study, we develop a semi-automated method for inferring gender that has an estimated accuracy of 97.5%. Our findings provide the first measure of gender distribution on editorial boards in the mathematical sciences, offer insights that suggest future studies in the mathematical sciences, and introduce new methods that enable large-scale studies of gender distribution in other fields.
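
    A name-based inference step with a manual-review fallback, in the spirit of the semi-automated method described, can be sketched as follows. The reference table, threshold, and function are hypothetical; the paper's actual data sources and procedure are not reproduced here.

    ```python
    # Hypothetical sketch of semi-automated gender inference: look up the
    # first name in a reference table of name-gender frequencies and flag
    # ambiguous or unknown names for manual review by a human coder.

    NAME_FREQUENCIES = {  # toy reference table: name -> P(female)
        "maria": 0.99, "john": 0.01, "robin": 0.55,
    }

    def infer_gender(first_name, threshold=0.9):
        p_female = NAME_FREQUENCIES.get(first_name.lower())
        if p_female is None:
            return "manual-review"          # name not in reference data
        if p_female >= threshold:
            return "female"
        if p_female <= 1 - threshold:
            return "male"
        return "manual-review"              # ambiguous: send to human coder

    print([infer_gender(n) for n in ["Maria", "John", "Robin", "Xue"]])
    # ['female', 'male', 'manual-review', 'manual-review']
    ```

    The automated pass handles unambiguous names at scale, while the manual-review bucket is what keeps overall accuracy high for ambiguous or out-of-vocabulary names.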

  13. Lithium metal protection enabled by in-situ olefin polymerization for high-performance secondary lithium sulfur batteries

    NASA Astrophysics Data System (ADS)

    An, Yongling; Zhang, Zhen; Fei, Huifang; Xu, Xiaoyan; Xiong, Shenglin; Feng, Jinkui; Ci, Lijie

    2017-09-01

    Lithium metal is considered to be the optimal choice for next-generation anode materials due to its ultrahigh theoretical capacity and the lowest redox potential. However, the growth of dendritic and mossy lithium in rechargeable Li metal batteries leads to possible short circuits and, subsequently, serious safety issues during charge/discharge cycles. For the further practical application of Li anodes, we report here a facile method for fabricating a robust interfacial layer via in-situ olefin polymerization. The resulting polymer layer effectively suppresses the formation of Li dendrites and enables the long-term operation of Li metal batteries. Using Li-S cells as a test system, we also demonstrate improved capacity retention under the protection of the tetramethylethylene polymer. Our results indicate that this method could be a promising strategy for tackling the intrinsic problems of lithium metal anodes and promoting the development of Li metal batteries.

  14. Gender Representation on Journal Editorial Boards in the Mathematical Sciences

    PubMed Central

    2016-01-01

    We study gender representation on the editorial boards of 435 journals in the mathematical sciences. Women are known to comprise approximately 15% of tenure-stream faculty positions in doctoral-granting mathematical sciences departments in the United States. Compared to this group, we find that 8.9% of the 13067 editorships in our study are held by women. We describe group variations within the editorships by identifying specific journals, subfields, publishers, and countries that significantly exceed or fall short of this average. To enable our study, we develop a semi-automated method for inferring gender that has an estimated accuracy of 97.5%. Our findings provide the first measure of gender distribution on editorial boards in the mathematical sciences, offer insights that suggest future studies in the mathematical sciences, and introduce new methods that enable large-scale studies of gender distribution in other fields. PMID:27536970

  15. Cell type-specific manipulation with GFP-dependent Cre recombinase.

    PubMed

    Tang, Jonathan C Y; Rudolph, Stephanie; Dhande, Onkar S; Abraira, Victoria E; Choi, Seungwon; Lapan, Sylvain W; Drew, Iain R; Drokhlyansky, Eugene; Huberman, Andrew D; Regehr, Wade G; Cepko, Constance L

    2015-09-01

    There are many transgenic GFP reporter lines that allow the visualization of specific populations of cells. Using such lines for functional studies requires a method that transforms GFP into a molecule that enables genetic manipulation. We developed a method that exploits GFP for gene manipulation, Cre recombinase dependent on GFP (CRE-DOG), a split component system that uses GFP and its derivatives to directly induce Cre/loxP recombination. Using plasmid electroporation and AAV viral vectors, we delivered CRE-DOG to multiple GFP mouse lines, which led to effective recombination selectively in GFP-labeled cells. Furthermore, CRE-DOG enabled optogenetic control of these neurons. Beyond providing a new set of tools for manipulation of gene expression selectively in GFP(+) cells, we found that GFP can be used to reconstitute the activity of a protein not known to have a modular structure, suggesting that this strategy might be applicable to a wide range of proteins.

  16. Recombinant G protein-coupled receptor expression in Saccharomyces cerevisiae for protein characterization.

    PubMed

    Blocker, Kory M; Britton, Zachary T; Naranjo, Andrea N; McNeely, Patrick M; Young, Carissa L; Robinson, Anne S

    2015-01-01

    G protein-coupled receptors (GPCRs) are membrane proteins that mediate signaling across the cellular membrane and facilitate cellular responses to external stimuli. Due to the critical role that GPCRs play in signal transduction, therapeutics have been developed to influence GPCR function without an extensive understanding of the receptors themselves. Closing this knowledge gap is of paramount importance to improving therapeutic efficacy and specificity, where efforts to achieve this end have focused chiefly on improving our knowledge of the structure-function relationship. The purpose of this chapter is to review methods for the heterologous expression of GPCRs in Saccharomyces cerevisiae, including whole-cell assays that enable quantitation of expression, localization, and function in vivo. In addition, we describe methods for the micellar solubilization of the human adenosine A2a receptor and for reconstitution of the receptor in liposomes that have enabled its biophysical characterization. © 2015 Elsevier Inc. All rights reserved.

  17. Methods for the culture of C. elegans and S. cerevisiae in microgravity

    NASA Technical Reports Server (NTRS)

    Fahlen, Thomas; Sunga, June; Rask, Jon; Herrera, Anna; Lam, Kitty; Sing, Luke; Sato, Kevin; Ramos, Ross A.; Kirven-Brooks, Melissa; Reiss-Bubenheim, Debra

    2005-01-01

    To support the study of the effects of microgravity on biological systems, our group is developing and testing methods that allow the cultivation of C. elegans and S. cerevisiae in microgravity. Our aim is to develop the experimental means by which investigators may conduct peer reviewed biological experiments with C. elegans or S. cerevisiae in microgravity. Our protocols are aimed at enabling investigators to grow these organisms for extended periods during which samples may be sub-cultured, collected, preserved, frozen, and/or returned to earth for analysis. Data presented include characterization of the growth phenotype of these organisms in liquid medium in OptiCells(TM) (Biocrystal, LTD).

  18. Computational Design of Materials: Planetary Entry to Electric Aircraft and Beyond

    NASA Technical Reports Server (NTRS)

    Thompson, Alexander; Lawson, John W.

    2014-01-01

    NASA's projects and missions push the bounds of what is possible. To support the agency's work, materials development must stay on the cutting edge in order to keep pace. Today, researchers at NASA Ames Research Center perform multiscale modeling to aid the development of new materials and provide insight into existing ones. Multiscale modeling enables researchers to determine micro- and macroscale properties by connecting computational methods ranging from the atomic level (density functional theory, molecular dynamics) to the macroscale (finite element method). The output of one level is passed on as input to the next level, creating a powerful predictive model.

  19. A Self-Directed Method for Cell-Type Identification and Separation of Gene Expression Microarrays

    PubMed Central

    Zuckerman, Neta S.; Noam, Yair; Goldsmith, Andrea J.; Lee, Peter P.

    2013-01-01

    Gene expression analysis is generally performed on heterogeneous tissue samples consisting of multiple cell types. Current methods developed to separate heterogeneous gene expression rely on prior knowledge of the cell-type composition and/or signatures; these are not available in most public datasets. We present a novel method to identify the cell-type composition, signatures and proportions per sample without need for a-priori information. The method was successfully tested on controlled and semi-controlled datasets and performed as accurately as current methods that do require additional information. As such, this method enables the analysis of cell-type specific gene expression using existing large pools of publicly available microarray datasets. PMID:23990767
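    The kind of blind separation described above can be illustrated with non-negative matrix factorization (NMF), a standard unsupervised technique that decomposes a mixed expression matrix into cell-type signatures and per-sample proportions without prior knowledge. This is a generic stand-in for the idea, not the paper's specific algorithm; the matrix sizes and the number of cell types below are illustrative:

```python
import numpy as np

# Factor a genes x samples expression matrix X into non-negative
# signatures W (genes x cell types) and proportions H (cell types x
# samples) using Lee-Seung multiplicative updates for Frobenius loss.

def nmf(X, k, n_iter=500, eps=1e-9, seed=0):
    rng = np.random.default_rng(seed)
    genes, samples = X.shape
    W = rng.random((genes, k)) + eps
    H = rng.random((k, samples)) + eps
    for _ in range(n_iter):
        H *= (W.T @ X) / (W.T @ W @ H + eps)   # update proportions
        W *= (X @ H.T) / (W @ H @ H.T + eps)   # update signatures
    return W, H

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Simulate 2 cell types mixed across 6 samples over 50 genes.
    true_W = rng.random((50, 2))
    true_H = rng.dirichlet([1, 1], size=6).T   # proportions sum to 1
    X = true_W @ true_H
    W, H = nmf(X, k=2)
    err = np.linalg.norm(X - W @ H) / np.linalg.norm(X)
    print(f"relative reconstruction error: {err:.4f}")
```

    In practice the number of cell types `k` is itself unknown in the blind setting and must be estimated, which is part of what distinguishes the paper's self-directed method from this plain NMF sketch.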

  20. Studying learning in the healthcare setting: the potential of quantitative diary methods.

    PubMed

    Ciere, Yvette; Jaarsma, Debbie; Visser, Annemieke; Sanderman, Robbert; Snippe, Evelien; Fleer, Joke

    2015-08-01

    Quantitative diary methods are longitudinal approaches that involve the repeated measurement of aspects of people's experience of daily life. In this article, we outline the main characteristics and applications of quantitative diary methods and discuss how their use may further research in the field of medical education. Quantitative diary methods offer several methodological advantages, such as measuring aspects of learning with great detail, accuracy and authenticity. Moreover, they enable researchers to study how and under which conditions learning in the health care setting occurs and in which way learning can be promoted. Hence, quantitative diary methods may contribute to theory development and the optimization of teaching methods in medical education.

  1. An assessment of implementation of Community-Oriented Primary Care in Kenyan family medicine postgraduate medical education programmes

    PubMed Central

    Shabani, Jacob; Taché, Stephanie; Mohamoud, Gulnaz; Mahoney, Megan

    2016-01-01

    Background and objectives Family medicine postgraduate programmes in Kenya are examining the benefits of Community-Oriented Primary Care (COPC) curriculum, as a method to train residents in population-based approaches to health care delivery. Whilst COPC is an established part of family medicine training in the United States, little is known about its application in Kenya. We sought to conduct a qualitative study to explore the development and implementation of COPC curriculum in the first two family medicine postgraduate programmes in Kenya. Method Semi-structured interviews were conducted with COPC educators, practitioners and academic stakeholders, and focus groups were conducted with postgraduate students, in two family medicine postgraduate programmes in Kenya. Discussions were transcribed, inductively coded and thematically analysed. Results Two focus groups with eight family medicine postgraduate students and interviews with five faculty members at two universities were conducted. Two broad themes emerged from the analysis: expected learning outcomes and important community-based enablers. Three learning outcomes were (1) making a community diagnosis, (2) understanding social determinants of health and (3) training in participatory research. Three community-based enablers for sustainability of COPC were (1) partnerships with community health workers, (2) community empowerment and engagement and (3) institutional financial support. Conclusions Our findings illustrate the expected learning outcomes and important community-based enablers associated with the successful implementation of COPC projects in Kenya and will help to inform future curriculum development. PMID:28155322

  2. A Delphi study to determine the European core curriculum for Master programmes in genetic counselling.

    PubMed

    Skirton, Heather; Barnoy, Sivia; Ingvoldstad, Charlotta; van Kessel, Ingrid; Patch, Christine; O'Connor, Anita; Serra-Juhe, Clara; Stayner, Barbara; Voelckel, Marie-Antoinette

    2013-10-01

    Genetic counsellors have been working in some European countries for at least 30 years. Although there are great disparities between the numbers, education, practice and acceptance of these professionals across Europe, it is evident that genetic counsellors and genetic nurses in Europe are working autonomously within teams to deliver patient care. The aim of this study was to use the Delphi research method to develop a core curriculum to guide the educational preparation of these professionals in Europe. The Delphi method enables the researcher to utilise the views and opinions of a group of recognised experts in the field of study; this study consisted of four phases. Phases 1 and 4 consisted of expert workshops, whereas data were collected in phases 2 and 3 (n=35) via online surveys. All participants in the study were considered experts in the field of genetic counselling. The topics considered essential for genetic counsellor training have been organised under the following headings: (1) counselling; (2) psychological issues; (3) medical genetics; (4) human genetics; (5) ethics, law and sociology; (6) professional practice; and (7) education and research. Each topic includes the knowledge, skills and attitudes required to enable genetic counsellors to develop competence. In addition, it was considered by the experts that clinical practice should comprise 50% of the educational programme. The core Master programme curriculum will enable current courses to be assessed and inform the design of future educational programmes for European genetic counsellors.

  3. Software Reuse Methods to Improve Technological Infrastructure for e-Science

    NASA Technical Reports Server (NTRS)

    Marshall, James J.; Downs, Robert R.; Mattmann, Chris A.

    2011-01-01

    Social computing has the potential to contribute to scientific research. Ongoing developments in information and communications technology improve capabilities for enabling scientific research, including research fostered by social computing capabilities. The recent emergence of e-Science practices has demonstrated the benefits from improvements in the technological infrastructure, or cyber-infrastructure, that has been developed to support science. Cloud computing is one example of this e-Science trend. Our own work in the area of software reuse offers methods that can be used to improve new technological development, including cloud computing capabilities, to support scientific research practices. In this paper, we focus on software reuse and its potential to contribute to the development and evaluation of information systems and related services designed to support new capabilities for conducting scientific research.

  4. Controlled chain polymerisation and chemical soldering for single-molecule electronics.

    PubMed

    Okawa, Yuji; Akai-Kasaya, Megumi; Kuwahara, Yuji; Mandal, Swapan K; Aono, Masakazu

    2012-05-21

    Single functional molecules offer great potential for the development of novel nanoelectronic devices with capabilities beyond today's silicon-based devices. To realise single-molecule electronics, the development of a viable method for connecting functional molecules to each other using single conductive polymer chains is required. The method of initiating chain polymerisation using the tip of a scanning tunnelling microscope (STM) is very useful for fabricating single conductive polymer chains at designated positions and thereby wiring single molecules. In this feature article, developments in the controlled chain polymerisation of diacetylene compounds and the properties of polydiacetylene chains are summarised. Recent studies of "chemical soldering", a technique enabling the covalent connection of single polydiacetylene chains to single functional molecules, are also introduced. This represents a key step in advancing the development of single-molecule electronics.

  5. Boolean logic analysis for flow regime recognition of gas-liquid horizontal flow

    NASA Astrophysics Data System (ADS)

    Ramskill, Nicholas P.; Wang, Mi

    2011-10-01

    In order to develop a flowmeter for the accurate measurement of multiphase flows, it is of the utmost importance to correctly identify the flow regime present so that the optimal metering method can be selected. In this study, the horizontal flow of air and water in a pipeline was studied under a multitude of conditions using electrical resistance tomography, but the flow regimes presented in this paper are limited to plug and bubble air-water flows. This study proposes a novel method for recognition of the prevalent flow regime using only a fraction of the data, thus rendering the analysis more efficient. By considering the average conductivity of five zones along the central axis of the tomogram, key features can be identified, enabling the recognition of the prevalent flow regime. Boolean logic and frequency spectrum analysis have been applied for flow regime recognition. Visualization of the flow using the reconstructed images provides a qualitative comparison between different flow regimes. Application of the Boolean logic scheme enables a quantitative comparison of the flow patterns, thus reducing the subjectivity in the identification of the prevalent flow regime.
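    The zone-averaging and Boolean-logic idea described above can be sketched as follows: threshold each of the five zone-averaged conductivities into a boolean and match the resulting pattern against reference patterns for known regimes. The zone ordering, the 0.5 threshold, and the reference patterns are illustrative assumptions, not the paper's calibrated values:

```python
# Hypothetical sketch of Boolean-logic flow regime recognition:
# each zone along the tomogram's central axis is reduced to a single
# boolean (True = liquid-continuous, False = gas-rich), and the
# five-bit pattern is looked up against reference regime patterns.

def zone_booleans(zone_means, threshold=0.5):
    """Threshold the five zone-averaged normalized conductivities."""
    return tuple(m >= threshold for m in zone_means)

# Illustrative reference patterns (top zone listed first):
# bubble flow: gas concentrated only near the top of the pipe;
# plug flow: elongated gas pockets reaching further down.
REGIME_PATTERNS = {
    (False, True, True, True, True): "bubble",
    (False, False, True, True, True): "plug",
}

def classify(zone_means):
    return REGIME_PATTERNS.get(zone_booleans(zone_means), "unrecognized")

if __name__ == "__main__":
    print(classify([0.2, 0.8, 0.9, 0.9, 0.95]))  # bubble-like profile
    print(classify([0.1, 0.3, 0.7, 0.9, 0.95]))  # plug-like profile
```

    Reducing each tomogram to a five-bit pattern is what makes the scheme cheap relative to full image reconstruction: only a fraction of the tomographic data is needed per classification.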

  6. Probing the structure of heterogeneous diluted materials by diffraction tomography.

    PubMed

    Bleuet, Pierre; Welcomme, Eléonore; Dooryhée, Eric; Susini, Jean; Hodeau, Jean-Louis; Walter, Philippe

    2008-06-01

    The advent of nanosciences calls for the development of local structural probes, in particular to characterize ill-ordered or heterogeneous materials. Furthermore, because materials properties are often related to their heterogeneity and the hierarchical arrangement of their structure, different structural probes covering a wide range of scales are required. X-ray diffraction is one of the prime structural methods but suffers from a relatively poor detection limit, whereas transmission electron analysis involves destructive sample preparation. Here we show the potential of coupling pencil-beam tomography with X-ray diffraction to examine unidentified phases in nanomaterials and polycrystalline materials. The demonstration is carried out on a high-pressure pellet containing several carbon phases and on a heterogeneous powder containing chalcedony and iron pigments. The present method enables a non-invasive structural refinement with a weight sensitivity of one part per thousand. It enables the extraction of the scattering patterns of amorphous and crystalline compounds with similar atomic densities and compositions. Furthermore, such a diffraction-tomography experiment can be carried out simultaneously with X-ray fluorescence, Compton and absorption tomographies, enabling a multimodal analysis of prime importance in materials science, chemistry, geology, environmental science, medical science, palaeontology and cultural heritage.

  7. Safety Assurance Factors for Electronic Health Record Resilience (SAFER): study protocol

    PubMed Central

    2013-01-01

    Background Implementation and use of electronic health records (EHRs) could lead to potential improvements in quality of care. However, the use of EHRs also introduces unique and often unexpected patient safety risks. Proactive assessment of risks and vulnerabilities can help address potential EHR-related safety hazards before harm occurs; however, current risk assessment methods are underdeveloped. The overall objective of this project is to develop and validate proactive assessment tools to ensure that EHR-enabled clinical work systems are safe and effective. Methods/Design This work is conceptually grounded in an 8-dimension model of safe and effective health information technology use. Our first aim is to develop self-assessment guides that can be used by health care institutions to evaluate certain high-risk components of their EHR-enabled clinical work systems. We will solicit input from subject matter experts and relevant stakeholders to develop guides focused on 9 specific risk areas and will subsequently pilot test the guides with individuals representative of likely users. The second aim will be to examine the utility of the self-assessment guides by beta testing the guides at selected facilities and conducting on-site evaluations. Our multidisciplinary team will use a variety of methods to assess the content validity and perceived usefulness of the guides, including interviews, naturalistic observations, and document analysis. The anticipated output of this work will be a series of self-administered EHR safety assessment guides with clear, actionable, checklist-type items. Discussion Proactive assessment of patient safety risks increases the resiliency of health care organizations to unanticipated hazards of EHR use. 
The resulting products and lessons learned from the development of the assessment guides are expected to be helpful to organizations that are beginning the EHR selection and implementation process as well as those that have already implemented EHRs. Findings from our project, currently underway, will inform future efforts to validate and implement tools that can be used by health care organizations to improve the safety of EHR-enabled clinical work systems. PMID:23587208

  8. Enhanced Droplet Control by Transition Boiling

    PubMed Central

    Grounds, Alex; Still, Richard; Takashina, Kei

    2012-01-01

    A droplet of water on a heated surface can levitate over a film of gas produced by its own evaporation in the Leidenfrost effect. When the surface is prepared with ratchet-like saw-teeth topography, these droplets can self-propel and can even climb uphill. However, the extent to which the droplets can be controlled is limited by the physics of the Leidenfrost effect. Here, we show that transition boiling can be induced even at very high surface temperatures and provide additional control over the droplets. Ratchets with acute protrusions enable droplets to climb steeper inclines while ratchets with sub-structures enable their direction of motion to be controlled by varying the temperature of the surface. The droplets' departure from the Leidenfrost regime is assessed by analysing the sound produced by their boiling. We anticipate these techniques will enable the development of more sophisticated methods for controlling small droplets and heat transfer. PMID:23056912

  9. Enhancing understanding and improving prediction of severe weather through spatiotemporal relational learning.

    PubMed

    McGovern, Amy; Gagne, David J; Williams, John K; Brown, Rodger A; Basara, Jeffrey B

    Severe weather, including tornadoes, thunderstorms, wind, and hail, annually causes significant loss of life and property. We are developing spatiotemporal machine learning techniques that will enable meteorologists to improve the prediction of these events by improving their understanding of the fundamental causes of the phenomena and by building skillful empirical predictive models. In this paper, we present significant enhancements of our Spatiotemporal Relational Probability Trees that enable autonomous discovery of spatiotemporal relationships as well as learning with arbitrary shapes. We focus our evaluation on two real-world case studies using our technique: predicting tornadoes in Oklahoma and predicting aircraft turbulence in the United States. We also discuss how to evaluate success for a machine learning algorithm in the severe weather domain, which will enable new methods such as ours to transfer from research to operations, provide a set of lessons learned for embedded machine learning applications, and outline how our technique could be fielded.

  10. MEAs and 3D nanoelectrodes: electrodeposition as tool for a precisely controlled nanofabrication.

    PubMed

    Weidlich, Sabrina; Krause, Kay J; Schnitker, Jan; Wolfrum, Bernhard; Offenhäusser, Andreas

    2017-01-31

    Microelectrode arrays (MEAs) are gaining increasing importance for the investigation of signaling processes between electrogenic cells. However, efficient cell-chip coupling for robust and long-term electrophysiological recording and stimulation still remains a challenge. A possible approach for the improvement of the cell-electrode contact is the utilization of three-dimensional structures. In recent years, various 3D electrode geometries have been developed, but we still lack a fabrication approach that enables the formation of different 3D structures on a single chip in a controlled manner. This, however, is needed to enable a direct and reliable comparison of the recording capabilities of the different structures. Here, we present a method for precisely controlled deposition of nanoelectrodes, enabling the fabrication of multiple, well-defined types of structures on our 64-electrode MEAs as a step towards a rapid-prototyping approach to 3D electrodes.

  11. Enhanced Droplet Control by Transition Boiling

    NASA Astrophysics Data System (ADS)

    Grounds, Alex; Still, Richard; Takashina, Kei

    2012-10-01

    A droplet of water on a heated surface can levitate over a film of gas produced by its own evaporation in the Leidenfrost effect. When the surface is prepared with ratchet-like saw-teeth topography, these droplets can self-propel and can even climb uphill. However, the extent to which the droplets can be controlled is limited by the physics of the Leidenfrost effect. Here, we show that transition boiling can be induced even at very high surface temperatures and provide additional control over the droplets. Ratchets with acute protrusions enable droplets to climb steeper inclines while ratchets with sub-structures enable their direction of motion to be controlled by varying the temperature of the surface. The droplets' departure from the Leidenfrost regime is assessed by analysing the sound produced by their boiling. We anticipate these techniques will enable the development of more sophisticated methods for controlling small droplets and heat transfer.

  12. IRREVERSIBLE PROCESSES IN A PLASMA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Balescu, R.

    1959-04-01

    The characteristic divergences caused by long-range phenomena in gases can be eliminated in equilibrium situations by partial summations of terms individually divergent but whose sum converges. It is shown how the recently developed diagram technique enables treatment of non-equilibrium cases by a rigorous asymptotic method. The general ideas underlying the approach are briefly indicated. (T.R. H.)

  13. Dechorionation of Zebrafish Embryos on Day 1 Post Fertilization Alters Response to an Acute Chemical Challenge at 6 Days Post Fertilization

    EPA Science Inventory

    Dechorionation is a method used to enable image acquisition in embryonic and larval zebrafish studies. As it is assumed that dechorionation has no long-term effects on fish embryo development, it is important to determine if that assumption is correct. The present study explored ...

  14. A Task-Based Needs Analysis for Australian Aboriginal Students: Going beyond the Target Situation to Address Cultural Issues

    ERIC Educational Resources Information Center

    Oliver, Rhonda; Grote, Ellen; Rochecouste, Judith; Exell, Michael

    2013-01-01

    While needs analyses underpin the design of second language analytic syllabi, the methodologies undertaken are rarely examined. This paper explores the value of multiple data sources and collection methods for developing a needs analysis model to enable vocational education and training teachers to address the needs of Australian Aboriginal…

  15. The critical role of peptide chemistry in the life sciences.

    PubMed

    Kent, Stephen B H

    2015-03-01

    Peptide chemistry plays a key role in the synthesis and study of protein molecules and their functions. Modern ligation methods enable the total synthesis of enzymes and the systematic dissection of the chemical basis of enzyme catalysis. Predicted developments in peptide science are described. Copyright © 2015 European Peptide Society and John Wiley & Sons, Ltd.

  16. The CAST Initiative in Guam: A Model of Effective Teachers Teaching Teachers

    ERIC Educational Resources Information Center

    Zuercher, Deborah K.; Kessler, Cristy; Yoshioka, Jon

    2011-01-01

    The CAST (content area specialized training) model of professional development enables sustainable teacher leadership and is responsive to the need for culturally relevant educational practices. The purpose of this paper is to share the background, methods, findings and recommendations of a case study on the CAST initiative in Guam. The case study…

  17. Design, Development and Validation of a Model of Problem Solving for Egyptian Science Classes

    ERIC Educational Resources Information Center

    Shahat, Mohamed A.; Ohle, Annika; Treagust, David F.; Fischer, Hans E.

    2013-01-01

    Educators and policymakers envision the future of education in Egypt as enabling learners to acquire scientific inquiry and problem-solving skills. In this article, we describe the validation of a model for problem solving and the design of instruments for evaluating new teaching methods in Egyptian science classes. The instruments were based on…

  18. A Computational Method for Enabling Teaching-Learning Process in Huge Online Courses and Communities

    ERIC Educational Resources Information Center

    Mora, Higinio; Ferrández, Antonio; Gil, David; Peral, Jesús

    2017-01-01

    Massive Open Online Courses and e-learning represent the future of the teaching-learning processes through the development of Information and Communication Technologies. They are the response to the new education needs of society. However, this future also presents many challenges such as the processing of online forums when a huge number of…

  19. Realistic Real World Contexts: Model Eliciting Activities

    ERIC Educational Resources Information Center

    Doruk, Bekir Kürsat

    2016-01-01

    Researchers have proposed a variety of methods to make a connection between real life and mathematics so that it can be learned in a practical way and enable people to utilise mathematics in their daily lives. Model-eliciting activities (MEAs) were developed to fulfil this need and are very capable of serving this purpose. The reason MEAs are so…

  20. Exploring an Experiential Learning Project through Kolb's Learning Theory Using a Qualitative Research Method

    ERIC Educational Resources Information Center

    Chan, Cecilia Ka Yuk

    2012-01-01

    Experiential learning pedagogy is taking a lead in the development of graduate attributes and educational aims as these are of prime importance for society. This paper shows a community service experiential project conducted in China. The project enabled students to serve the affected community in a post-earthquake area by applying their knowledge…

  1. Primary School Science: Implementation of Domain-General Strategies into Teaching Didactics

    ERIC Educational Resources Information Center

    Dejonckheere, Peter J. N.; Van de Keere, Kristof; Tallir, Isabel; Vervaet, Stephanie

    2013-01-01

    In the present study we present a didactic method to help children aged 11 and 12 learn science in such a way as to enable a dynamic interaction between domain general strategies and the development of conceptual knowledge, whilst each type of scientific process has been considered (forming of hypotheses, experimenting and evaluating). We have…

  2. Nanodevices for Single Molecule Studies

    NASA Astrophysics Data System (ADS)

    Craighead, H. G.; Stavis, S. M.; Samiee, K. T.

    During the last two decades, biotechnology research has resulted in progress in fields as diverse as the life sciences, agriculture and healthcare. While existing technology enables the analysis of a variety of biological systems, new tools are needed for increasing the efficiency of current methods, and for developing new ones altogether. Interest has grown in single molecule analysis for these reasons.

  3. Communication Design: The Nature and Use of a Communication Model in Chemical Education.

    ERIC Educational Resources Information Center

    Barnard, William Robert

    The author suggests that there is a need to identify methods of effectively and efficiently enhancing communication in the lecture room, laboratory or in the out-of-class environment. The study reported concerns the development of suitable new systems which will enable the teacher of chemistry to (1) use projection television for brief periods…

  4. Eppur Si Muove! The 2013 Nobel Prize in Chemistry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Jeremy C.; Roux, Benoit

    2013-12-03

    The 2013 Nobel Prize in Chemistry has been awarded to Martin Karplus, Michael Levitt, and Arieh Warshel for their work on developing computational methods to study complex chemical systems. Their work has provided critical mechanistic insights into chemical systems both large and small and has enabled progress in a number of different fields, including structural biology.

  5. Cultural Practices in Making of Installation Art: A New Perspective to Preparing Future Art Teachers.

    ERIC Educational Resources Information Center

    Chen, Li-Tsu

    In Taiwan, traditional pedagogy and technique-oriented teaching methods have become too outdated to enable students to navigate a society full of complicated and confusing socio-cultural phenomena. An art education curriculum change is needed, and innovative art programs should be developed with careful consideration of the socio-cultural…

  6. Using multiple research methods to understand family forest owners

    Treesearch

    John Schelhas

    2012-01-01

    Applied research on family forest owners ensures that we understand who they are, what they do, and why they do it. This information enables us to develop policy, management, and outreach approaches that can optimize the social, economic, cultural, and environmental benefits of private forests at the landowner, community, and national levels. The three principal...

  7. Professional Development Experiences of Alternatively Certified Career and Technical Education Teachers in North Carolina

    ERIC Educational Resources Information Center

    Welfare, Rhonda Marie

    2013-01-01

    In an effort to increase the quantity and quality of available teachers, states have begun to offer alternate methods of teacher certification. This means that in addition to traditional teacher training, which involves graduation from an accredited teacher-education institution, states provide alternate routes to enable teachers to transition to…

  8. Modeling Innovations Advance Wind Energy Industry

    NASA Technical Reports Server (NTRS)

    2009-01-01

    In 1981, Glenn Research Center scientist Dr. Larry Viterna developed a model that predicted certain elements of wind turbine performance with far greater accuracy than previous methods. The model was met with derision from others in the wind energy industry, but years later, Viterna discovered it had become the most widely used method of its kind, enabling significant wind energy technologies, such as the fixed-pitch turbines produced by manufacturers like Aerostar Inc. of Westport, Massachusetts, that are providing sustainable, climate-friendly energy sources today.
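    The model referred to above is widely known as the Viterna post-stall extrapolation: given an airfoil's lift and drag coefficients at stall, it extends them to the high angles of attack that fixed-pitch turbine blades experience. The sketch below uses the commonly cited form of the equations; the aspect ratio, stall data, and sample angle are illustrative assumptions:

```python
import math

# Viterna post-stall extrapolation (common form): extend an airfoil's
# lift (CL) and drag (CD) coefficients from the stall angle up to 90
# degrees angle of attack. All numerical inputs below are illustrative.

def viterna(alpha_deg, alpha_stall_deg, cl_stall, cd_stall, aspect_ratio):
    """Lift and drag coefficients for alpha between stall and 90 deg."""
    a = math.radians(alpha_deg)
    a_s = math.radians(alpha_stall_deg)
    cd_max = 1.11 + 0.018 * aspect_ratio          # commonly used for AR <= 50
    # Constants chosen so CL and CD are continuous at the stall point.
    A2 = (cl_stall - cd_max * math.sin(a_s) * math.cos(a_s)) \
         * math.sin(a_s) / math.cos(a_s) ** 2
    B2 = (cd_stall - cd_max * math.sin(a_s) ** 2) / math.cos(a_s)
    cl = cd_max / 2 * math.sin(2 * a) + A2 * math.cos(a) ** 2 / math.sin(a)
    cd = cd_max * math.sin(a) ** 2 + B2 * math.cos(a)
    return cl, cd

if __name__ == "__main__":
    # Illustrative stall data: alpha_s = 15 deg, CL_s = 1.1, CD_s = 0.02, AR = 10
    cl, cd = viterna(45.0, 15.0, 1.1, 0.02, 10.0)
    print(f"CL(45 deg) = {cl:.3f}, CD(45 deg) = {cd:.3f}")
```

    A useful sanity check on the formulation is that the coefficients reduce to the supplied stall values at the stall angle and that drag approaches `cd_max` as the angle of attack approaches 90 degrees.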

  9. Process optimization using combinatorial design principles: parallel synthesis and design of experiment methods.

    PubMed

    Gooding, Owen W

    2004-06-01

    The use of parallel synthesis techniques with statistical design of experiment (DoE) methods is a powerful combination for the optimization of chemical processes. Advances in parallel synthesis equipment and easy-to-use software for statistical DoE have fueled a growing acceptance of these techniques in the pharmaceutical industry. As drug candidate structures become more complex at the same time that development timelines are compressed, these enabling technologies promise to become more important in the future.
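
The combination this abstract describes, enumerating factor-level combinations for a parallel reactor array, can be sketched with a two-level full-factorial design. The factor names and levels below are purely illustrative and not taken from the paper:

```python
from itertools import product

# Hypothetical process factors for a parallel-synthesis screen; the
# names and levels are invented for illustration.
factors = {
    "temperature_C": [25, 60],
    "catalyst_mol_pct": [1, 5],
    "solvent": ["THF", "DMF"],
}

# A two-level full-factorial design enumerates every combination of
# factor levels; each row corresponds to one vessel in the parallel array.
names = list(factors)
design = [dict(zip(names, levels)) for levels in product(*factors.values())]

for run, conditions in enumerate(design, start=1):
    print(run, conditions)
```

With three two-level factors this yields 2^3 = 8 runs; fractional-factorial designs would trim that count when vessel capacity is limited.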

  10. Evaluation Framework for NASA's Educational Outreach Programs

    NASA Technical Reports Server (NTRS)

    Berg, Rick; Booker, Angela; Linde, Charlotte; Preston, Connie

    1999-01-01

    The objective of the proposed work is to develop an evaluation framework for NASA's educational outreach efforts. We focus on public (rather than technical or scientific) dissemination efforts, specifically on Internet-based outreach sites for children. The outcome of this work is to propose both methods and criteria for evaluation, which would enable NASA to perform a more analytic evaluation of its outreach efforts. The proposed framework is based on IRL's ethnographic and video-based observational methods, which allow us to analyze how these sites are actually used.

  11. Laser-directed 3D assembly of carbon nanotubes using two-photon polymerization (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Liu, Ying; Xiong, Wei; Jiang, Li Jia; Zhou, Yunshen; Li, Dawei; Jiang, Lan; Silvain, Jean-Francois; Lu, Yongfeng

    2017-02-01

    Precise assembly of carbon nanotubes (CNTs) in arbitrary 3D space with proper alignment is critically important and desirable for CNT applications but still remains a long-standing challenge. Using the two-photon polymerization (TPP) technique, it is possible to fabricate 3D micro/nanoscale CNT/polymer architectures with proper CNT alignments in desired directions, which is expected to enable a broad range of applications of CNTs in functional devices. To unleash the full potential of CNTs, it is strategically important to develop TPP-compatible resins with high CNT concentrations for precise assembly of CNTs into 3D micro/nanostructures for functional device applications. We investigated a thiol grafting method for functionalizing multiwalled carbon nanotubes (MWNTs) to develop TPP-compatible MWNT-thiol-acrylate (MTA) composite resins. The composite resins developed had high MWNT concentrations, up to 0.2 wt%, over one order of magnitude higher than in previously published work. Significantly enhanced electrical and mechanical properties of the 3D micro/nanostructures were achieved. Precisely controlled MWNT assembly and strong anisotropic effects were confirmed. Microelectronic devices made of the MTA composite polymer were demonstrated. The nanofabrication method can achieve controlled assembly of MWNTs in 3D micro/nanostructures, enabling a broad range of CNT applications, including 3D electronics, integrated photonics, and micro/nanoelectromechanical systems (MEMS/NEMS).

  12. Extraction and evaluation of gas-flow-dependent features from dynamic measurements of gas sensors array

    NASA Astrophysics Data System (ADS)

    Kalinowski, Paweł; Woźniak, Łukasz; Jasiński, Grzegorz; Jasiński, Piotr

    2016-11-01

    Gas analyzers based on gas sensors are devices that enable recognition of various kinds of volatile compounds. They have been continuously developed and investigated for over three decades; however, there are still limitations that slow down the implementation of these devices in many applications. For example, the main drawbacks are the lack of selectivity, sensitivity, and long-term stability caused by the drift of the utilized sensors. This implies the necessity of investigation not only into the construction of gas sensors, but also into the development of measurement procedures and methods of analyzing sensor responses that compensate for the limitations of the sensor devices. One field of investigation covers dynamic measurements of sensor or sensor-array responses using flow modulation techniques. Different gas delivery patterns enable the extraction of unique features that improve the stability and selectivity of gas detection systems. In this article, three flow modulation techniques are presented, together with a proposed method for evaluating their usefulness and robustness in systems for detecting environmental pollutants. Results of dynamic measurements of a commercially available TGS sensor array in the presence of nitrogen dioxide and ammonia are shown.

  13. VisitSense: Sensing Place Visit Patterns from Ambient Radio on Smartphones for Targeted Mobile Ads in Shopping Malls.

    PubMed

    Kim, Byoungjip; Kang, Seungwoo; Ha, Jin-Young; Song, Junehwa

    2015-07-16

    In this paper, we introduce a novel smartphone framework called VisitSense that automatically detects and predicts a smartphone user's place visits from ambient radio to enable behavioral targeting for mobile ads in large shopping malls. VisitSense enables mobile app developers to adopt visit-pattern-aware mobile advertising for shopping mall visitors in their apps. It also benefits mobile users by allowing them to receive highly relevant mobile ads that are aware of their place visit patterns in shopping malls. To achieve this goal, VisitSense employs accurate visit detection and prediction methods. For accurate visit detection, we develop a change-based detection method that takes into consideration the stability change of ambient radio and the mobility change of users. It performs well in large shopping malls, where ambient radio is quite noisy and causes existing algorithms to fail easily. In addition, we propose a causality-based visit prediction model to capture the causality in sequential visit patterns for effective prediction. We have developed a VisitSense prototype system and a visit-pattern-aware mobile advertising application based on it. Furthermore, we deployed the system in the COEX Mall, one of the largest shopping malls in Korea, and conducted diverse experiments to show the effectiveness of VisitSense.

  14. Finger-Shaped GelForce: Sensor for Measuring Surface Traction Fields for Robotic Hand.

    PubMed

    Sato, K; Kamiyama, K; Kawakami, N; Tachi, S

    2010-01-01

    It is believed that the use of haptic sensors to measure the magnitude, direction, and distribution of a force will enable a robotic hand to perform dexterous operations. Therefore, we develop a new type of finger-shaped haptic sensor using GelForce technology. GelForce is a vision-based sensor that can be used to measure the distribution of force vectors, or surface traction fields. The simple structure of the GelForce enables us to develop a compact finger-shaped GelForce for the robotic hand. The GelForce, which is based on elasticity theory, calculates surface traction fields using a conversion equation. However, this conversion equation cannot be solved analytically when the elastic body of the sensor has a complicated shape, such as that of a finger. Therefore, we propose an observational method and construct a prototype of the finger-shaped GelForce. Using this prototype, we evaluate the basic performance of the finger-shaped GelForce. We then conduct a field test by performing grasping operations with a robotic hand. The results of this test show that, using the observational method, the finger-shaped GelForce can be successfully used in a robotic hand.

  15. Statistical analysis of road-vehicle-driver interaction as an enabler to designing behavioural models

    NASA Astrophysics Data System (ADS)

    Chakravarty, T.; Chowdhury, A.; Ghose, A.; Bhaumik, C.; Balamuralidhar, P.

    2014-03-01

    Telematics is an important technology enabler for intelligent transportation systems. By deploying on-board diagnostic devices, the signatures of vehicle vibration, along with location and time, are recorded. Detailed analyses of the collected signatures offer deep insights into the state of the objects under study. Towards that objective, we carried out experiments by deploying a telematics device in one of the office buses that ferries employees to the office and back. Data were collected from a 3-axis accelerometer and GPS, along with speed and time, for all journeys. In this paper, we present initial results of the above exercise by applying statistical methods to derive information through systematic analysis of the data collected over four months. It is demonstrated that the higher-order derivative of the measured Z-axis acceleration samples displays the properties of a Weibull distribution when the time axis is replaced by the amplitude of the processed acceleration data. This observation offers a method to predict future behaviour, where deviations from the prediction are classified as context-based aberrations or progressive degradation of the system. In addition, we capture the relationship between the speed of the vehicle and the median of the jerk energy samples using regression analysis. Such results offer an opportunity to develop a robust method to model road-vehicle interaction, thereby enabling prediction of driving behaviour and condition-based maintenance.
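
As a rough illustration of the two statistical steps this abstract reports (fitting a Weibull distribution to processed acceleration data, and regressing vehicle speed against median jerk energy), here is a sketch on synthetic data. The distribution parameters and the speed/jerk relationship are invented stand-ins, not the paper's measurements:

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Synthetic stand-in for processed Z-axis acceleration ("jerk") samples;
# the abstract reports these follow a Weibull distribution, so we draw
# illustrative samples rather than use real telematics data.
jerk = stats.weibull_min.rvs(c=1.8, scale=2.0, size=2000, random_state=rng)

# Fit a two-parameter Weibull (location fixed at 0) to the samples.
shape, loc, scale = stats.weibull_min.fit(jerk, floc=0)
print(f"shape={shape:.2f} scale={scale:.2f}")

# The paper also regresses speed against the median of jerk-energy
# samples; a simple least-squares line illustrates that step.
speed = np.linspace(10, 80, 20)                              # km/h, illustrative
median_jerk_energy = 0.05 * speed + rng.normal(0, 0.2, speed.size)
slope, intercept, r, *_ = stats.linregress(speed, median_jerk_energy)
print(f"slope={slope:.3f} r={r:.2f}")
```

Deviations of new samples from the fitted Weibull model would then flag the "context-based aberrations or progressive degradation" the abstract mentions.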

  16. WordSeeker: concurrent bioinformatics software for discovering genome-wide patterns and word-based genomic signatures

    PubMed Central

    2010-01-01

    Background An important focus of genomic science is the discovery and characterization of all functional elements within genomes. In silico methods are used in genome studies to discover putative regulatory genomic elements (called words or motifs). Although a number of methods have been developed for motif discovery, most of them lack the scalability needed to analyze large genomic data sets. Methods This manuscript presents WordSeeker, an enumerative motif discovery toolkit that utilizes multi-core and distributed computational platforms to enable scalable analysis of genomic data. A controller task coordinates activities of worker nodes, each of which (1) enumerates a subset of the DNA word space and (2) scores words with a distributed Markov chain model. Results A comprehensive suite of performance tests was conducted to demonstrate the performance, speedup and efficiency of WordSeeker. The scalability of the toolkit enabled the analysis of the entire genome of Arabidopsis thaliana; the results of the analysis were integrated into The Arabidopsis Gene Regulatory Information Server (AGRIS). A public version of WordSeeker was deployed on the Glenn cluster at the Ohio Supercomputer Center. Conclusion WordSeeker effectively utilizes concurrent computing platforms to enable the identification of putative functional elements in genomic data sets. This capability facilitates the analysis of the large quantity of sequenced genomic data. PMID:21210985
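
The core enumerative idea, counting every DNA word of length k and scoring its over-representation against a Markov background model, can be sketched serially in a few lines. WordSeeker itself distributes this work across worker nodes and uses a distributed Markov chain model; the single-process, 0th-order (mononucleotide background) version below is only a toy illustration:

```python
from collections import Counter
from itertools import product
import math

def word_scores(seq, k):
    """Score every DNA k-mer's observed count against a 0th-order
    Markov (mononucleotide frequency) background model."""
    counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
    base_freq = {b: seq.count(b) / len(seq) for b in "ACGT"}
    n_windows = len(seq) - k + 1
    scores = {}
    # Enumerate the full word space, as WordSeeker's workers do in parallel.
    for word in map("".join, product("ACGT", repeat=k)):
        expected = n_windows * math.prod(base_freq[b] for b in word)
        observed = counts.get(word, 0)
        # log-odds of observed vs expected occurrence (pseudocount 0.5)
        scores[word] = math.log2((observed + 0.5) / (expected + 0.5))
    return scores

scores = word_scores("ACGTACGTACGTAAAT", k=3)
top = max(scores, key=scores.get)
print(top, round(scores[top], 2))
```

Because the word space grows as 4^k, partitioning it across nodes (as the controller/worker design above describes) is what makes genome-scale k feasible.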

  17. Developing an occupational skills profile for the emerging profession of "big-data-enabled professional"

    NASA Astrophysics Data System (ADS)

    Kastens, K. A.; Malyn-Smith, J.; Ippolito, J.; Krumhansl, R.

    2014-12-01

    In August of 2014, the Oceans of Data Institute at Education Development Center, Inc. (EDC) is convening an expert panel to begin the process of developing an occupational skills profile for the "big-data-enabled professional." We define such a professional as an "individual who works with large complex data sets on a regular basis, asking and answering questions, analyzing trends, and finding meaningful patterns, in order to increase the efficiency of processes, make decisions and predictions, solve problems, generate hypotheses, and/or develop new understandings." The expert panel includes several geophysicists, as well as data professionals from engineering, higher education, analytical journalism, forensics, bioinformatics, and telecommunications. Working with experienced facilitators, the expert panel will create a detailed synopsis of the tasks and responsibilities characteristic of their profession, as well as the skills, knowledge and behaviors that enable them to succeed in the workplace. After the panel finishes their work, the task matrix and associated narrative will be vetted and validated by a larger group of additional professionals, and then disseminated for use by educators and employers. The process we are using is called DACUM (Developing a Curriculum), adapted by EDC and optimized for emergent professions, such as the "big-data-enabled professional." DACUM is a well-established method for analyzing jobs and occupations, commonly used in technical fields to develop curriculum and training programs that reflect authentic work tasks found in scientific and technical workplaces. 
The premises behind the DACUM approach are that: expert workers are better able to describe their own occupation than anyone else; any job can be described in terms of the tasks that successful workers in the occupation perform; all tasks have direct implications for the knowledge, skills, understandings and attitudes that must be taught and learned in preparation for the targeted career. At AGU, we will describe the process and present the finalized occupational profile.

  18. The Python Spectral Analysis Tool (PySAT): A Powerful, Flexible, Preprocessing and Machine Learning Library and Interface

    NASA Astrophysics Data System (ADS)

    Anderson, R. B.; Finch, N.; Clegg, S. M.; Graff, T. G.; Morris, R. V.; Laura, J.; Gaddis, L. R.

    2017-12-01

    Machine learning is a powerful but underutilized approach that can enable planetary scientists to derive meaningful results from the rapidly-growing quantity of available spectral data. For example, regression methods such as Partial Least Squares (PLS) and Least Absolute Shrinkage and Selection Operator (LASSO), can be used to determine chemical concentrations from ChemCam and SuperCam Laser-Induced Breakdown Spectroscopy (LIBS) data [1]. Many scientists are interested in testing different spectral data processing and machine learning methods, but few have the time or expertise to write their own software to do so. We are therefore developing a free open-source library of software called the Python Spectral Analysis Tool (PySAT) along with a flexible, user-friendly graphical interface to enable scientists to process and analyze point spectral data without requiring significant programming or machine-learning expertise. A related but separately-funded effort is working to develop a graphical interface for orbital data [2]. The PySAT point-spectra tool includes common preprocessing steps (e.g. interpolation, normalization, masking, continuum removal, dimensionality reduction), plotting capabilities, and capabilities to prepare data for machine learning such as creating stratified folds for cross validation, defining training and test sets, and applying calibration transfer so that data collected on different instruments or under different conditions can be used together. The tool leverages the scikit-learn library [3] to enable users to train and compare the results from a variety of multivariate regression methods. It also includes the ability to combine multiple "sub-models" into an overall model, a method that has been shown to improve results and is currently used for ChemCam data [4]. Although development of the PySAT point-spectra tool has focused primarily on the analysis of LIBS spectra, the relevant steps and methods are applicable to any spectral data. 
The tool is available at https://github.com/USGS-Astrogeology/PySAT_Point_Spectra_GUI. [1] Clegg, S.M., et al. (2017) Spectrochim Acta B. 129, 64-85. [2] Gaddis, L. et al. (2017) 3rd Planetary Data Workshop, #1986. [3] http://scikit-learn.org/ [4] Anderson, R.B., et al. (2017) Spectrochim. Acta B. 129, 49-57.

  19. Noncontact evaluation for interface states by photocarrier counting

    NASA Astrophysics Data System (ADS)

    Furuta, Masaaki; Shimizu, Kojiro; Maeta, Takahiro; Miyashita, Moriya; Izunome, Koji; Kubota, Hiroshi

    2018-03-01

    We have developed a noncontact measurement method that enables in-line measurement and does not require any test element group (TEG) formation. In this method, the number of photocarriers excited from the interface states is counted, which is called “photocarrier counting”, and the energy distribution of the interface state density (Dit) is then evaluated by spectral light excitation. Our previous experiment used a preliminary contact measurement method at the oxide on top of the Si wafer. In this work, we developed a Dit measurement method as a noncontact measurement with a gap between the probes and the wafer. The shallow trench isolation (STI) sidewall has more localized interface states than the region under the gate electrode. We demonstrate the noncontact measurement of trapped carriers from interface states using wafers of three different crystal plane orientations. This demonstration will pave the way for evaluating STI sidewall interface states in future studies.

  20. Aircraft Engine Gas Path Diagnostic Methods: Public Benchmarking Results

    NASA Technical Reports Server (NTRS)

    Simon, Donald L.; Borguet, Sebastien; Leonard, Olivier; Zhang, Xiaodong (Frank)

    2013-01-01

    Recent technology reviews have identified the need for objective assessments of aircraft engine health management (EHM) technologies. To help address this issue, a gas path diagnostic benchmark problem has been created and made publicly available. This software tool, referred to as the Propulsion Diagnostic Method Evaluation Strategy (ProDiMES), has been constructed based on feedback provided by the aircraft EHM community. It provides a standard benchmark problem enabling users to develop, evaluate and compare diagnostic methods. This paper will present an overview of ProDiMES along with a description of four gas path diagnostic methods developed and applied to the problem. These methods, which include analytical and empirical diagnostic techniques, will be described and associated blind-test-case metric results will be presented and compared. Lessons learned along with recommendations for improving the public benchmarking processes will also be presented and discussed.

  1. Formation Flying Design and Applications in Weak Stability Boundary Regions

    NASA Technical Reports Server (NTRS)

    Folta, David

    2003-01-01

    Weak stability regions serve as superior locations for interferometric scientific investigations. These regions are often selected to minimize environmental disturbances and maximize observing efficiency. Design of formations in these regions is becoming ever more challenging as more complex missions are envisioned. Algorithms for formation design must be further developed to incorporate a better understanding of the weak stability boundary (WSB) solution space. This development will improve the efficiency and expand the capabilities of current approaches. The Goddard Space Flight Center (GSFC) is currently supporting multiple formation missions in WSB regions. This end-to-end support consists of mission operations, trajectory design, and control. It also includes both algorithm and software development. The Constellation-X, Maxim, and Stellar Imager missions are examples of the use of improved numerical methods for attaining constrained formation geometries and controlling their dynamical evolution. This paper presents a survey of formation missions in the WSB regions and a brief description of the formation design using numerical and dynamical techniques.

  2. Formation flying design and applications in weak stability boundary regions.

    PubMed

    Folta, David

    2004-05-01

    Weak stability regions serve as superior locations for interferometric scientific investigations. These regions are often selected to minimize environmental disturbances and maximize observation efficiency. Designs of formations in these regions are becoming ever more challenging as more complex missions are envisioned. Algorithms for formation design must be further developed to incorporate a better understanding of the weak stability boundary solution space. This development will improve the efficiency and expand the capabilities of current approaches. The Goddard Space Flight Center (GSFC) is currently supporting multiple formation missions in weak stability boundary regions. This end-to-end support consists of mission operations, trajectory design, and control. It also includes both algorithm and software development. The Constellation-X, Maxim, and Stellar Imager missions are examples of the use of improved numeric methods to attain constrained formation geometries and control their dynamical evolution. This paper presents a survey of formation missions in the weak stability boundary regions and a brief description of formation design using numerical and dynamical techniques.

  3. Driving down defect density in composite EUV patterning film stacks

    NASA Astrophysics Data System (ADS)

    Meli, Luciana; Petrillo, Karen; De Silva, Anuja; Arnold, John; Felix, Nelson; Johnson, Richard; Murray, Cody; Hubbard, Alex; Durrant, Danielle; Hontake, Koichi; Huli, Lior; Lemley, Corey; Hetzer, Dave; Kawakami, Shinichiro; Matsunaga, Koichi

    2017-03-01

    Extreme ultraviolet lithography (EUVL) is one of the leading candidate technologies for enabling next-generation devices at the 7 nm node and beyond. As the technology matures, further improvement is required in the areas of blanket film defectivity, pattern defectivity, CD uniformity, and LWR/LER. As EUV pitch scaling approaches sub-20 nm, new techniques and methods must be developed to reduce overall defectivity, mitigate pattern collapse, and eliminate film-related defects. IBM Corporation and Tokyo Electron Limited (TEL) are continuously collaborating to develop manufacturing-quality processes for EUVL. In this paper, we review key defectivity learning required to enable 7 nm node and beyond technology. We describe ongoing progress in addressing these challenges through track-based processes (coating, development, baking), highlighting the limitations of common defect detection strategies and outlining methodologies necessary for accurate characterization and mitigation of blanket defectivity in EUV patterning stacks. We further discuss defects related to pattern collapse and thinning of underlayer films.

  4. Implicitly restarted Arnoldi/Lanczos methods for large scale eigenvalue calculations

    NASA Technical Reports Server (NTRS)

    Sorensen, Danny C.

    1996-01-01

    Eigenvalues and eigenfunctions of linear operators are important to many areas of applied mathematics. The ability to approximate these quantities numerically is becoming increasingly important in a wide variety of applications. This increasing demand has fueled interest in the development of new methods and software for the numerical solution of large-scale algebraic eigenvalue problems. In turn, the existence of these new methods and software, along with the dramatically increased computational capabilities now available, has enabled the solution of problems that would not even have been posed five or ten years ago. Until very recently, software for large-scale nonsymmetric problems was virtually non-existent. Fortunately, the situation is improving rapidly. The purpose of this article is to provide an overview of the numerical solution of large-scale algebraic eigenvalue problems. The focus will be on a class of methods called Krylov subspace projection methods. The well-known Lanczos method is the premier member of this class. The Arnoldi method generalizes the Lanczos method to the nonsymmetric case. A recently developed variant of the Arnoldi/Lanczos scheme called the Implicitly Restarted Arnoldi Method is presented here in some depth. This method is highlighted because of its suitability as a basis for software development.
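
The Implicitly Restarted Arnoldi Method highlighted here became the basis of ARPACK, which SciPy exposes as `scipy.sparse.linalg.eigs`. A small sketch, on an illustrative sparse nonsymmetric matrix chosen so its eigenvalues are known exactly, shows the typical usage of computing a few extremal eigenvalues of a large matrix:

```python
import numpy as np
from scipy.sparse import diags
from scipy.sparse.linalg import eigs

# SciPy's `eigs` wraps ARPACK, whose core algorithm is the Implicitly
# Restarted Arnoldi Method, so a few extremal eigenvalues of a large
# sparse nonsymmetric matrix can be found without dense factorization.
n = 1000
main = np.arange(1.0, n + 1)  # well-separated eigenvalues 1..1000 on the diagonal
# A superdiagonal makes the matrix nonsymmetric; it is upper triangular,
# so its eigenvalues are exactly the diagonal entries (easy to check).
A = diags([main, 0.1 * np.ones(n - 1)], offsets=[0, 1], format="csr")

# Ask ARPACK for the 4 eigenvalues of largest magnitude.
vals, vecs = eigs(A, k=4, which="LM")
print(np.sort(vals.real))  # approximately [997, 998, 999, 1000]
```

Only matrix-vector products with `A` are needed, which is what makes Krylov subspace projection methods practical at large scale.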

  5. Advances in thickness measurements and dynamic visualization of the tear film using non-invasive optical approaches.

    PubMed

    Bai, Yuqiang; Nichols, Jason J

    2017-05-01

    The thickness of the tear film has been investigated using both invasive and non-invasive methods. While invasive methods are largely historical, more recent non-invasive methods are generally based on optical approaches that provide accurate, precise, and rapid measurements. Optical microscopy, interferometry, and optical coherence tomography (OCT) have been developed to characterize the thickness of the tear film or of certain aspects of the tear film (e.g., the lipid layer). This review provides an in-depth overview of contemporary optical techniques used in studying the tear film, including both the advantages and limitations of these approaches. It is anticipated that further developments of high-resolution OCT and other interferometric methods will enable more accurate and precise measurement of the thickness of the tear film and its related dynamic properties. Copyright © 2017 Elsevier Ltd. All rights reserved.

  6. Zone plate method for electronic holographic display using resolution redistribution technique.

    PubMed

    Takaki, Yasuhiro; Nakamura, Junya

    2011-07-18

    The resolution redistribution (RR) technique can increase the horizontal viewing-zone angle and screen size of electronic holographic display. The present study developed a zone plate method that would reduce hologram calculation time for the RR technique. This method enables calculation of an image displayed on a spatial light modulator by performing additions of the zone plates, while the previous calculation method required performing the Fourier transform twice. The derivation and modeling of the zone plate are shown. In addition, the look-up table approach was introduced for further reduction in computation time. Experimental verification using a holographic display module based on the RR technique is presented.

  7. Analysis method for Thomson scattering diagnostics in GAMMA 10/PDX.

    PubMed

    Ohta, K; Yoshikawa, M; Yasuhara, R; Chikatsu, M; Shima, Y; Kohagura, J; Sakamoto, M; Nakasima, Y; Imai, T; Ichimura, M; Yamada, I; Funaba, H; Minami, T

    2016-11-01

    We have developed an analysis method to improve the accuracy of electron temperature measurements by employing a fitting technique for the raw Thomson scattering (TS) signals. Least-squares fitting of the raw TS signals enabled a reduction of the error in the electron temperature measurement. We applied the analysis method to a multi-pass (MP) TS system. Because the interval between the MPTS signals is very short, it is difficult to separately analyze each Thomson scattering signal intensity using the raw signals. We used the fitting method to recover the original TS scattering signals from the measured raw MPTS signals and thus obtain the electron temperatures in each pass.
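
The fitting idea, recovering the amplitudes of closely spaced pulses from a raw signal by least squares, can be illustrated with two synthetic Gaussian pulses. The pulse model, timings, and noise level below are invented and are not the GAMMA 10/PDX signal shapes:

```python
import numpy as np
from scipy.optimize import curve_fit

# Two closely spaced pulses (as in a multi-pass TS system) modeled as
# Gaussians plus a baseline; least-squares fitting separates the pulse
# amplitudes even though the raw pulses overlap.
def two_pulses(t, a1, t1, a2, t2, w, base):
    g = lambda a, t0: a * np.exp(-((t - t0) ** 2) / (2 * w ** 2))
    return g(a1, t1) + g(a2, t2) + base

t = np.linspace(0, 100, 500)                    # time axis, illustrative units
rng = np.random.default_rng(1)
truth = (5.0, 40.0, 3.0, 55.0, 6.0, 0.2)        # "unknown" pulse parameters
raw = two_pulses(t, *truth) + rng.normal(0, 0.05, t.size)

# Fit the model to the noisy raw trace from a rough initial guess.
popt, _ = curve_fit(two_pulses, t, raw, p0=(4, 38, 2, 57, 5, 0))
print(np.round(popt, 2))
```

The recovered amplitudes `a1` and `a2` play the role of the per-pass scattered intensities from which electron temperatures are derived.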

  8. 3D printing functional materials and devices (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    McAlpine, Michael C.

    2017-05-01

    The development of methods for interfacing high performance functional devices with biology could impact regenerative medicine, smart prosthetics, and human-machine interfaces. Indeed, the ability to three-dimensionally interweave biological and functional materials could enable the creation of devices possessing unique geometries, properties, and functionalities. Yet, most high quality functional materials are two dimensional, hard and brittle, and require high crystallization temperatures for maximal performance. These properties render the corresponding devices incompatible with biology, which is three-dimensional, soft, stretchable, and temperature sensitive. We overcome these dichotomies by: 1) using 3D printing and scanning for customized, interwoven, anatomically accurate device architectures; 2) employing nanotechnology as an enabling route for overcoming mechanical discrepancies while retaining high performance; and 3) 3D printing a range of soft and nanoscale materials to enable the integration of a diverse palette of high quality functional nanomaterials with biology. 3D printing is a multi-scale platform, allowing for the incorporation of functional nanoscale inks, the printing of microscale features, and ultimately the creation of macroscale devices. This three-dimensional blending of functional materials and `living' platforms may enable next-generation 3D printed devices.

  9. Enabling the environmentally clean air transportation of the future: a vision of computational fluid dynamics in 2030.

    PubMed

    Slotnick, Jeffrey P; Khodadoust, Abdollah; Alonso, Juan J; Darmofal, David L; Gropp, William D; Lurie, Elizabeth A; Mavriplis, Dimitri J; Venkatakrishnan, Venkat

    2014-08-13

    As global air travel expands rapidly to meet demand generated by economic growth, it is essential to continue to improve the efficiency of air transportation to reduce its carbon emissions and address concerns about climate change. Future transports must be 'cleaner' and designed to include technologies that will continue to lower engine emissions and reduce community noise. The use of computational fluid dynamics (CFD) will be critical to enable the design of these new concepts. In general, the ability to simulate aerodynamic and reactive flows using CFD has progressed rapidly during the past several decades and has fundamentally changed the aerospace design process. Advanced simulation capabilities not only enable reductions in ground-based and flight-testing requirements, but also provide added physical insight, and enable superior designs at reduced cost and risk. In spite of considerable success, reliable use of CFD has remained confined to a small region of the operating envelope due, in part, to the inability of current methods to reliably predict turbulent, separated flows. Fortunately, the advent of much more powerful computing platforms provides an opportunity to overcome a number of these challenges. This paper summarizes the findings and recommendations from a recent NASA-funded study that provides a vision for CFD in the year 2030, including an assessment of critical technology gaps and needed development, and identifies the key CFD technology advancements that will enable the design and development of much cleaner aircraft in the future. © 2014 The Author(s) Published by the Royal Society. All rights reserved.

  10. Comparison of methods for acid quantification: impact of resist components on acid-generating efficiency

    NASA Astrophysics Data System (ADS)

    Cameron, James F.; Fradkin, Leslie; Moore, Kathryn; Pohlers, Gerd

    2000-06-01

    Chemically amplified deep UV (CA-DUV) positive resists are the enabling materials for manufacture of devices at and below 0.18 micrometer design rules in the semiconductor industry. CA-DUV resists are typically based on a combination of an acid labile polymer and a photoacid generator (PAG). Upon UV exposure, a catalytic amount of a strong Bronsted acid is released and is subsequently used in a post-exposure bake step to deprotect the acid labile polymer. Deprotection transforms the acid labile polymer into a base soluble polymer and ultimately enables positive tone image development in dilute aqueous base. As CA-DUV resist systems continue to mature and are used in increasingly demanding situations, it is critical to develop a fundamental understanding of how robust these materials are. One of the most important factors to quantify is how much acid is photogenerated in these systems at key exposure doses. For the purpose of quantifying photoacid generation several methods have been devised. These include spectrophotometric methods, ion conductivity methods and most recently an acid-base type titration similar to the standard addition method. This paper compares many of these techniques. First, comparisons between the most commonly used acid sensitive dye, tetrabromophenol blue sodium salt (TBPB) and a less common acid sensitive dye, Rhodamine B base (RB) are made in several resist systems. Second, the novel acid-base type titration based on the standard addition method is compared to the spectrophotometric titration method. During these studies, the make up of the resist system is probed as follows: the photoacid generator and resist additives are varied to understand the impact of each of these resist components on the acid generation process.

  11. Vouchers in Fragile States: Reducing Barriers to Long-Acting Reversible Contraception in Yemen and Pakistan

    PubMed Central

    Boddam-Whetham, Luke; Gul, Xaher; Al-Kobati, Eman; Gorter, Anna C

    2016-01-01

    In conflict-affected states, vouchers have reduced barriers to reproductive health services and have enabled health programs to use targeted subsidies to increase uptake of specific health services. Vouchers can also be used to channel funds to public and private service providers and improve service quality. The Yamaan Foundation for Health and Social Development in Yemen and the Marie Stopes Society (MSS) in Pakistan—both working with Options Consultancy Services—have developed voucher programs that subsidize voluntary access to long-acting reversible contraceptives (LARCs) and permanent methods (PMs) of family planning in their respective fragile countries. The programs focus on LARCs and PMs because these methods are particularly difficult for poor women to access due to their cost and to provider biases against offering them. Using estimates of expected voluntary uptake of LARCs and PMs for 2014 based on contraceptive prevalence rates, and comparing these with uptake of LARCs and PMs through the voucher programs, we show the substantial increase in service utilization that vouchers can enable by contributing to an expanded method choice. In the governorate of Lahj, Yemen, vouchers for family planning led to an estimated 38% increase in 2014 over the expected use of LARCs and PMs (720 vs. 521 expected). We applied the same approach in 13 districts of Punjab, Khyber Pakhtunkhwa (KPK), and Sindh provinces in Pakistan. Our calculations suggest that vouchers enabled 10 times more women than expected to choose LARCs and PMs in 2014 in those areas of Pakistan (73,639 vs. 6,455 expected). Voucher programs can promote and maintain access to family planning services where existing health systems are hampered. Vouchers are a flexible financing approach that enables expansion of contraceptive choice and the inclusion of the private sector in service delivery to the poor. They can keep financial resources flowing where the public sector is prevented from offering services, and ensure that alternative sources are available for reproductive health services such as family planning. Programs should consider using vouchers in fragile states to facilitate access to family planning services and support the countries’ health systems. PMID:27540129
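
    The headline figures quoted above can be reproduced with simple arithmetic on the reported versus expected uptake numbers:

```python
# Reproduce the uptake comparisons quoted in the abstract.
observed_yemen, expected_yemen = 720, 521       # Lahj governorate, 2014
observed_pak, expected_pak = 73_639, 6_455      # 13 districts, Pakistan, 2014

pct_increase = (observed_yemen - expected_yemen) / expected_yemen * 100
ratio = observed_pak / expected_pak

print(round(pct_increase))  # → 38  (the "38% increase" in Lahj)
print(round(ratio))         # → 11  (roughly the "10 times more" in Pakistan)
```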

  12. Microwave non-contact imaging of subcutaneous human body tissues.

    PubMed

    Kletsov, Andrey; Chernokalov, Alexander; Khripkov, Alexander; Cho, Jaegeol; Druchinin, Sergey

    2015-10-01

    A small-size microwave sensor is developed for non-contact imaging of a human body structure in 2D, enabling fitness and health monitoring using mobile devices. A method for human body tissue structure imaging is developed and experimentally validated. Subcutaneous fat tissue reconstruction depth of up to 70 mm and maximum fat thickness measurement error below 2 mm are demonstrated by measurements with a human body phantom and human subjects. Electrically small antennas are developed for integration of the microwave sensor into a mobile device. Usability of the developed microwave sensor for fitness applications, healthcare, and body weight management is demonstrated.

  13. Using mixed methods to develop and evaluate complex interventions in palliative care research.

    PubMed

    Farquhar, Morag C; Ewing, Gail; Booth, Sara

    2011-12-01

    There is increasing interest in combining qualitative and quantitative research methods to provide comprehensiveness and greater knowledge yield. Mixed methods are valuable in the development and evaluation of complex interventions. They are therefore particularly valuable in palliative care research, where the majority of interventions are complex and the identification of outcomes particularly challenging. This paper aims to introduce the role of mixed methods in the development and evaluation of complex interventions in palliative care, and how they may be used in palliative care research. The paper defines mixed methods and outlines why and how mixed methods are used to develop and evaluate complex interventions, with a pragmatic focus on design and data collection issues and data analysis. Useful texts are signposted and illustrative examples provided of mixed method studies in palliative care, including a detailed worked example of the development and evaluation of a complex intervention in palliative care for breathlessness. Key challenges to conducting mixed methods in palliative care research are identified in relation to data collection, data integration in analysis, costs and dissemination, and how these might be addressed. The development and evaluation of complex interventions in palliative care benefit from the application of mixed methods. Mixed methods enable better understanding of whether and how an intervention works (or does not work) and inform the design of subsequent studies. However, they can be challenging: mixed method studies in palliative care will benefit from working with agreed protocols, multidisciplinary teams and engaging staff with appropriate skill sets.

  14. Reflection-Based Learning for Professional Ethical Formation.

    PubMed

    Branch, William T; George, Maura

    2017-04-01

    One way practitioners learn ethics is by reflecting on experience. They may reflect in the moment (reflection-in-action) or afterwards (reflection-on-action). We illustrate how a teaching clinician may transform relationships with patients and teach person-centered care through reflective learning. We discuss reflective learning pedagogies and present two case examples of our preferred method, guided group reflection using narratives. This method fosters moral development alongside professional identity formation in students and advanced learners. Our method for reflective learning addresses and enables processing of the most pressing ethical issues that learners encounter in practice. © 2017 American Medical Association. All Rights Reserved.

  15. Automatic Topography Using High Precision Digital Moire Methods

    NASA Astrophysics Data System (ADS)

    Yatagai, T.; Idesawa, M.; Saito, S.

    1983-07-01

    Three types of moire topographic methods using digital techniques are proposed. Deformed gratings obtained by projecting a reference grating onto an object under test are subjected to digital analysis. The electronic analysis procedures of deformed gratings described here enable us to distinguish between depression and elevation of the object, so that automatic measurement of 3-D shapes and automatic moire fringe interpolation are performed. Based on the digital moire methods, we have developed a practical measurement system, with a linear photodiode array on a micro-stage as a scanning image sensor. Examples of fringe analysis in medical applications are presented.
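
    In projection moiré topography of this kind, each fringe order corresponds to a fixed height increment set by the grating pitch and projection angle. The sketch below illustrates that relationship with an assumed geometry; the pitch and angle values are hypothetical, not the authors' instrument parameters.

```python
import math

# Projection-moire height from signed fringe order N:
#   z = N * p / tan(theta)
# where p is the projected grating pitch and theta the angle between
# the projection and viewing axes. Values below are illustrative only.
p = 0.5                      # grating pitch, mm
theta = math.radians(30.0)   # projection angle

def height(fringe_order):
    """Surface height (mm) at a point with the given signed fringe order."""
    return fringe_order * p / math.tan(theta)

# A signed fringe order distinguishes elevation from depression, which
# is exactly the ambiguity the digital analysis in the paper resolves.
print(round(height(3), 3))   # → 2.598  (three fringes up)
print(round(height(-2), 3))  # → -1.732 (two fringes down)
```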

  16. MPAI (mass probes aided ionization) method for total analysis of biomolecules by mass spectrometry.

    PubMed

    Honda, Aki; Hayashi, Shinichiro; Hifumi, Hiroki; Honma, Yuya; Tanji, Noriyuki; Iwasawa, Naoko; Suzuki, Yoshio; Suzuki, Koji

    2007-01-01

    We have designed and synthesized various mass probes, which enable us to effectively ionize various molecules to be detected with mass spectrometry. We call the ionization method using mass probes the "MPAI (mass probes aided ionization)" method. We aim at the sensitive detection of various biological molecules, and also the serial detection of biomolecules with a single mass spectrometer without changing the instrument settings. Here, we review mass probes for small molecules with various functional groups and mass probes for proteins. Further, we introduce newly developed mass probes for proteins for highly sensitive detection.

  17. The Evolution of Chemical High-Throughput Experimentation To Address Challenging Problems in Pharmaceutical Synthesis.

    PubMed

    Krska, Shane W; DiRocco, Daniel A; Dreher, Spencer D; Shevlin, Michael

    2017-12-19

    The structural complexity of pharmaceuticals presents a significant challenge to modern catalysis. Many published methods that work well on simple substrates often fail when attempts are made to apply them to complex drug intermediates. The use of high-throughput experimentation (HTE) techniques offers a means to overcome this fundamental challenge by facilitating the rational exploration of large arrays of catalysts and reaction conditions in a time- and material-efficient manner. Initial forays into the use of HTE in our laboratories for solving chemistry problems centered around screening of chiral precious-metal catalysts for homogeneous asymmetric hydrogenation. The success of these early efforts in developing efficient catalytic steps for late-stage development programs motivated the desire to increase the scope of this approach to encompass other high-value catalytic chemistries. Doing so, however, required significant advances in reactor and workflow design and automation to enable the effective assembly and agitation of arrays of heterogeneous reaction mixtures and retention of volatile solvents under a wide range of temperatures. Associated innovations in high-throughput analytical chemistry techniques greatly increased the efficiency and reliability of these methods. These evolved HTE techniques have been utilized extensively to develop highly innovative catalysis solutions to the most challenging problems in large-scale pharmaceutical synthesis. Starting with Pd- and Cu-catalyzed cross-coupling chemistry, subsequent efforts expanded to other valuable modern synthetic transformations such as chiral phase-transfer catalysis, photoredox catalysis, and C-H functionalization. As our experience and confidence in HTE techniques matured, we envisioned their application beyond problems in process chemistry to address the needs of medicinal chemists. 
Here the problem of reaction generality is felt most acutely, and HTE approaches should prove broadly enabling. However, the quantities of both time and starting materials available for chemistry troubleshooting in this space generally are severely limited. Adapting to these needs led us to invest in smaller predefined arrays of transformation-specific screening "kits" and push the boundaries of miniaturization in chemistry screening, culminating in the development of "nanoscale" reaction screening carried out in 1536-well plates. Grappling with the problem of generality also inspired the exploration of cheminformatics-driven HTE approaches such as the Chemistry Informer Libraries. These next-generation HTE methods promise to empower chemists to run orders of magnitude more experiments and enable "big data" informatics approaches to reaction design and troubleshooting. With these advances, HTE is poised to revolutionize how chemists across both industry and academia discover new synthetic methods, develop them into tools of broad utility, and apply them to problems of practical significance.

  18. Development of the Internet-Based Customer-Oriented Ordering System Framework for Complicated Mechanical Product

    NASA Astrophysics Data System (ADS)

    Ong, Mingwei; Watanuki, Keiichi

    Recently, as consumers increasingly prefer buying products that reflect their own personality, some consumers wish to be involved in the product design process. In parallel with the popularization of e-business, many manufacturers have utilized the Internet to promote their products, and some have even built websites that enable consumers to select their desired product specifications. Nevertheless, this method has not been applied to complicated mechanical products, because such products have a large number of specifications that are interrelated with one another. In such a case, ordinary consumers, who lack design knowledge, are not capable of determining these specifications. In this paper, a prototype framework called the Internet-based consumer-oriented product ordering system is developed that gives ordinary consumers large freedom in determining the specifications of a complicated mechanical product, while ensuring that manufacturing of the specified product is feasible.

  19. Challenges and perspectives of garnet solid electrolytes for all solid-state lithium batteries

    NASA Astrophysics Data System (ADS)

    Liu, Qi; Geng, Zhen; Han, Cuiping; Fu, Yongzhu; Li, Song; He, Yan-bing; Kang, Feiyu; Li, Baohua

    2018-06-01

    Garnet Li7La3Zr2O12 (LLZO) solid electrolytes have recently attracted tremendous interest as they have the potential to enable all-solid-state lithium batteries (ASSLBs) owing to their high ionic conductivity (10^-3 to 10^-4 S cm^-1), negligible electronic transport, wide potential window (up to 9 V), and good chemical stability. Here we present the key issues and challenges of LLZO in the aspects of ion conduction, interfacial compatibility, and stability in air. First, different preparation methods of LLZO are reviewed. Then, recent progress in improving the ionic conductivity and the interfacial properties between LLZO and electrodes is presented. Finally, we list some emerging LLZO-based solid-state batteries and provide perspectives for further research. The aim of this review is to summarize the up-to-date development of LLZO and point the direction for future work that could enable LLZO-based ASSLBs.

  20. Optical imaging of localized chemical events using programmable diamond quantum nanosensors

    NASA Astrophysics Data System (ADS)

    Rendler, Torsten; Neburkova, Jitka; Zemek, Ondrej; Kotek, Jan; Zappe, Andrea; Chu, Zhiqin; Cigler, Petr; Wrachtrup, Jörg

    2017-03-01

    Development of multifunctional nanoscale sensors working under physiological conditions enables monitoring of intracellular processes that are important for various biological and medical applications. By attaching paramagnetic gadolinium complexes to nanodiamonds (NDs) with nitrogen-vacancy (NV) centres through surface engineering, we developed a hybrid nanoscale sensor that can be adjusted to directly monitor physiological species through a proposed sensing scheme based on NV spin relaxometry. We adopt a single-step method to measure spin relaxation rates enabling time-dependent measurements on changes in pH or redox potential at a submicrometre-length scale in a microfluidic channel that mimics cellular environments. Our experimental data are reproduced by numerical simulations of the NV spin interaction with gadolinium complexes covering the NDs. Considering the versatile engineering options provided by polymer chemistry, the underlying mechanism can be expanded to detect a variety of physiologically relevant species and variables.
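
    The relaxometry readout described above amounts to extracting a spin relaxation rate 1/T1 from an exponential decay of the NV signal; nearby gadolinium complexes increase that rate. A stdlib-only sketch on synthetic data follows — the decay values and T1 are invented for illustration, and the single-step scheme of the paper is replaced here by a plain log-linear fit.

```python
import math

# Synthetic relaxation data: contrast(t) = exp(-t / T1) with T1 = 100 us.
T1_true = 100.0                              # microseconds (assumed)
times = [10.0 * k for k in range(1, 11)]     # probe delays, us
contrast = [math.exp(-t / T1_true) for t in times]

# Log-linear least squares: ln(contrast) = -(1/T1) * t
xs = times
ys = [math.log(c) for c in contrast]
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) \
        / sum((x - mx) ** 2 for x in xs)
rate = -slope                                # 1/T1 in us^-1

# Gd3+ complexes grafted on the nanodiamond surface raise this rate;
# tracking it over time yields the pH/redox readout described above.
print(round(1.0 / rate, 1))  # → 100.0
```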

  1. A Surgical Device to Study the Efficacy of Bioengineered Skin Substitutes in Mice Wound Healing Models.

    PubMed

    Jeschke, Marc G; Sadri, Ali-Reza; Belo, Cassandra; Amini-Nik, Saeid

    2017-04-01

    Due to the poor regenerative capacity of adult mammalian skin, there is a need to develop effective skin substitutes for promoting skin regeneration after a severe wound. However, the complexity of skin biology has made it difficult to enable perfect regeneration of skin. Thus, animal models are being used to test potential skin substitutes. Murine models are valuable but their healing process involves dermal contraction. We have developed a device called a dome that is able to eliminate the contraction effect of rodent skin while simultaneously housing a bioengineered skin graft. The dome comes in two models, which enable researchers to evaluate the cells from neighboring intact tissue that contribute to wound healing during skin healing/regeneration. This protocol simplifies grafting of skin substitutes, eliminates the contraction effect of surrounding skin, and summarizes a simple method for animal surgery for wound healing and skin regeneration studies.

  2. Indexing method of digital audiovisual medical resources with semantic Web integration.

    PubMed

    Cuggia, Marc; Mougin, Fleur; Le Beux, Pierre

    2003-01-01

    Digitization of audio-visual resources, combined with the performance of modern networks, offers many possibilities that are the subject of intensive work in the scientific and industrial sectors. Indexing such resources is a major challenge. Recently, the Moving Picture Experts Group (MPEG) has been developing MPEG-7, a standard for describing multimedia content. The goal of this standard is to develop a rich set of standardized tools to enable fast, efficient retrieval from digital archives or filtering of audiovisual broadcasts on the Internet. How could this kind of technology be used in the medical context? In this paper, we propose a simpler indexing system, based on the Dublin Core standard and compliant with MPEG-7. We use MeSH and UMLS to introduce conceptual navigation. We also present a video platform that enables encoding of, and access to, audio-visual resources in streaming mode.

  3. Object-based media and stream-based computing

    NASA Astrophysics Data System (ADS)

    Bove, V. Michael, Jr.

    1998-03-01

    Object-based media refers to the representation of audiovisual information as a collection of objects - the result of scene-analysis algorithms - and a script describing how they are to be rendered for display. Such multimedia presentations can adapt to viewing circumstances as well as to viewer preferences and behavior, and can provide a richer link between content creator and consumer. With faster networks and processors, such ideas become applicable to live interpersonal communications as well, creating a more natural and productive alternative to traditional videoconferencing. This paper outlines examples of object-based media algorithms and applications developed by my group, and presents new hardware architectures and software methods that we have developed to meet the computational requirements of object-based and other advanced media representations. In particular, we describe stream-based processing, which enables automatic run-time parallelization of multidimensional signal-processing tasks even given heterogeneous computational resources.

  4. Predicting scattering scanning near-field optical microscopy of mass-produced plasmonic devices

    NASA Astrophysics Data System (ADS)

    Otto, Lauren M.; Burgos, Stanley P.; Staffaroni, Matteo; Ren, Shen; Süzer, Özgün; Stipe, Barry C.; Ashby, Paul D.; Hammack, Aeron T.

    2018-05-01

    Scattering scanning near-field optical microscopy enables optical imaging and characterization of plasmonic devices with nanometer-scale resolution well below the diffraction limit. This technique enables developers to probe and understand the waveguide-coupled plasmonic antenna in as-fabricated heat-assisted magnetic recording heads. In order to validate and predict results and to extract information from experimental measurements that is physically comparable to simulations, a model was developed to translate the simulated electric field into expected near-field measurements using physical parameters specific to scattering scanning near-field optical microscopy physics. The methods used in this paper prove that scattering scanning near-field optical microscopy can be used to determine critical sub-diffraction-limited dimensions of optical field confinement, which is a crucial metrology requirement for the future of nano-optics, semiconductor photonic devices, and biological sensing where the near-field character of light is fundamental to device operation.

  5. Resolving the morphology of niobium carbonitride nano-precipitates in steel using atom probe tomography.

    PubMed

    Breen, Andrew J; Xie, Kelvin Y; Moody, Michael P; Gault, Baptiste; Yen, Hung-Wei; Wong, Christopher C; Cairney, Julie M; Ringer, Simon P

    2014-08-01

    Atom probe is a powerful technique for studying the composition of nano-precipitates, but their morphology within the reconstructed data is distorted due to the so-called local magnification effect. A new technique has been developed to mitigate this limitation by characterizing the distribution of the surrounding matrix atoms, rather than those contained within the nano-precipitates themselves. A comprehensive chemical analysis enables further information on size and chemistry to be obtained. The method enables new insight into the morphology and chemistry of niobium carbonitride nano-precipitates within ferrite for a series of Nb-microalloyed ultra-thin cast strip steels. The results are supported by complementary high-resolution transmission electron microscopy.

  6. Software Analytical Instrument for Assessment of the Process of Casting Slabs

    NASA Astrophysics Data System (ADS)

    Franěk, Zdeněk; Kavička, František; Štětina, Josef; Masarik, Miloš

    2010-06-01

    The paper describes the design and function of original software for assessing the slab casting process. The program system LITIOS was developed and implemented at EVRAZ Vitkovice Steel Ostrava on its equipment for the continuous casting of steel (hereafter ECC). This program system works on a data warehouse of technological casting parameters and slab quality parameters. It enables an ECC technologist to analyze the course of a casting melt and, using statistical methods, to determine the influence of individual technological parameters on the quality of the final slabs. The system also enables long-term monitoring and optimization of production.

  7. Design and Implementation of Scientific Software Components to Enable Multiscale Modeling: The Effective Fragment Potential (QM/EFP) Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gaenko, Alexander; Windus, Theresa L.; Sosonkina, Masha

    2012-10-19

    The design and development of scientific software components to provide an interface to the effective fragment potential (EFP) methods are reported. Multiscale modeling of physical and chemical phenomena demands the merging of software packages developed by research groups in significantly different fields. Componentization offers an efficient way to realize new high performance scientific methods by combining the best models available in different software packages without a need for package readaptation after the initial componentization is complete. The EFP method is an efficient electronic structure theory based model potential that is suitable for predictive modeling of intermolecular interactions in large molecular systems, such as liquids, proteins, atmospheric aerosols, and nanoparticles, with an accuracy that is comparable to that of correlated ab initio methods. The developed components make the EFP functionality accessible for any scientific component-aware software package. The performance of the component is demonstrated on a protein interaction model, and its accuracy is compared with results obtained with coupled cluster methods.

  8. Systems Vaccinology: Enabling rational vaccine design with systems biological approaches

    PubMed Central

    Hagan, Thomas; Nakaya, Helder I.; Subramaniam, Shankar; Pulendran, Bali

    2015-01-01

    Vaccines have drastically reduced the mortality and morbidity of many diseases. However, vaccines have historically been developed empirically, and recent development of vaccines against current pandemics such as HIV and malaria has been met with difficulty. The advent of high-throughput technologies, coupled with systems biological methods of data analysis, has enabled researchers to interrogate the entire complement of a variety of molecular components within cells, and characterize the myriad interactions among them in order to model and understand the behavior of the system as a whole. In the context of vaccinology, these tools permit exploration of the molecular mechanisms by which vaccines induce protective immune responses. Here we review the recent advances, challenges, and potential of systems biological approaches in vaccinology. If the challenges facing this developing field can be overcome, systems vaccinology promises to empower the identification of early predictive signatures of vaccine response, as well as novel and robust correlates of protection from infection. Such discoveries, along with the improved understanding of immune responses to vaccination they impart, will play an instrumental role in development of the next generation of rationally designed vaccines. PMID:25858860

  9. PDF modeling of near-wall turbulent flows

    NASA Astrophysics Data System (ADS)

    Dreeben, Thomas David

    1997-06-01

    PDF methods are extended to include modeling of wall-bounded turbulent flows. For flows in which resolution of the viscous sublayer is desired, a PDF near-wall model is developed in which the Generalized Langevin model is combined with an exact model for viscous transport. Durbin's method of elliptic relaxation is used to incorporate the wall effects into the governing equations without the use of wall functions or damping functions. Close to the wall, the Generalized Langevin model provides an analogy to the effect of the fluctuating continuity equation. This enables accurate modeling of the near-wall turbulent statistics. Demonstrated accuracy for fully-developed channel flow is achieved with a PDF/Monte Carlo simulation, and with its related Reynolds-stress closure. For flows in which the details of the viscous sublayer are not important, a PDF wall-function method is developed with the Simplified Langevin model.
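
    Langevin-type PDF models of this kind evolve a fluctuating particle velocity with a drift toward the mean plus stochastic forcing. The toy Euler-Maruyama sketch below uses the simplest such process (an Ornstein-Uhlenbeck velocity model, not the thesis's Generalized Langevin model), with assumed parameters, to show the Monte Carlo flavor of a PDF simulation:

```python
import math
import random

# Toy simplified-Langevin (Ornstein-Uhlenbeck) velocity model:
#   du = -(u / tau) dt + sqrt(2 * sigma^2 / tau) dW
# Parameters are illustrative, not taken from the thesis.
tau = 1.0          # turbulence time scale
sigma = 0.5        # stationary velocity standard deviation
dt = 0.01
rng = random.Random(42)

def step(u):
    """One Euler-Maruyama step of the velocity SDE."""
    dW = rng.gauss(0.0, math.sqrt(dt))
    return u - (u / tau) * dt + math.sqrt(2.0 * sigma**2 / tau) * dW

# Evolve an ensemble of notional fluid particles; the ensemble is a
# Monte Carlo estimate of the velocity PDF, whose variance should
# relax toward the stationary value sigma**2 = 0.25.
particles = [0.0] * 2000
for _ in range(1000):
    particles = [step(u) for u in particles]

var = sum(u * u for u in particles) / len(particles)
print(var)  # close to 0.25, up to Monte Carlo sampling error
```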

  10. A new mass screening method for methylmercury poisoning using mercury-volatilizing bacteria from Minamata Bay.

    PubMed

    Nakamura, K; Naruse, I; Takizawa, Y

    1999-09-01

    A simplified mass screening method for methylmercury exposure was developed using methylmercury-volatilizing bacteria from Minamata Bay. Some bacteria can transform methylmercury into mercury vapor. Most mercury in the hair is methylmercury, which is readily extracted with HCl solution. Black spots are formed on X-ray film due to the reduction of Ag(+) emulsion with mercury vapor produced by methylmercury-volatilizing bacteria. By exploiting these characteristics, a screening method was developed, whereby the fur of rats injected with methylmercury chloride formed clear black spots on X-ray film, whereas the fur of rats injected with saline did not. Subsequently, 50 human hair samples were examined using this mass screening method. The method identified people who had high mercury concentration, over 20 microg/g. A few thousand hair samples may be screened in a day using this method because it is rapid, simple, and economical. This method, therefore, enables screening of persons with methylmercury poisoning in mercury-polluted areas. Copyright 1999 Academic Press.

  11. Non-Adiabatic Molecular Dynamics Methods for Materials Discovery

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Furche, Filipp; Parker, Shane M.; Muuronen, Mikko J.

    2017-04-04

    The flow of radiative energy in light-driven materials such as photosensitizer dyes or photocatalysts is governed by non-adiabatic transitions between electronic states and cannot be described within the Born-Oppenheimer approximation commonly used in electronic structure theory. The non-adiabatic molecular dynamics (NAMD) methods based on Tully surface hopping and time-dependent density functional theory developed in this project have greatly extended the range of molecular materials that can be tackled by NAMD simulations. New algorithms to compute molecular excited state and response properties efficiently were developed. Fundamental limitations of common non-linear response methods were discovered and characterized. Methods for accurate computations of vibronic spectra of materials such as black absorbers were developed and applied. It was shown that open-shell TDDFT methods capture bond breaking in NAMD simulations, a longstanding challenge for single-reference molecular dynamics simulations. The methods developed in this project were applied to study the photodissociation of acetaldehyde and revealed that non-adiabatic effects are experimentally observable in fragment kinetic energy distributions. Finally, the project enabled the first detailed NAMD simulations of photocatalytic water oxidation by titania nanoclusters, uncovering the mechanism of this fundamentally important reaction for fuel generation and storage.

  12. Controlled Environments Enable Adaptive Management in Aquatic Ecosystems Under Altered Environments

    NASA Technical Reports Server (NTRS)

    Bubenheim, David L.

    2016-01-01

    Ecosystems worldwide are impacted by altered environment conditions resulting from climate, drought, and land use changes. Gaps in the science knowledge base regarding plant community response to these novel and rapid changes limit both science understanding and management of ecosystems. We describe how CE Technologies have enabled the rapid supply of gap-filling science, development of ecosystem simulation models, and remote sensing assessment tools to provide science-informed, adaptive management methods in the impacted aquatic ecosystem of the California Sacramento-San Joaquin River Delta. The Delta is the hub for California's water, supplying Southern California agriculture and urban communities as well as the San Francisco Bay area. The changes in environmental conditions including temperature, light, and water quality and associated expansion of invasive aquatic plants negatively impact water distribution and ecology of the San Francisco Bay/Delta complex. CE technologies define changes in resource use efficiencies, photosynthetic productivity, evapotranspiration, phenology, reproductive strategies, and spectral reflectance modifications in native and invasive species in response to altered conditions. We will discuss how the CE technologies play an enabling role in filling knowledge gaps regarding plant response to altered environments, parameterization and validation of ecosystem models, development of satellite-based, remote sensing tools, and operational management strategies.

  13. Cloning Nacre's 3D Interlocking Skeleton in Engineering Composites to Achieve Exceptional Mechanical Properties.

    PubMed

    Zhao, Hewei; Yue, Yonghai; Guo, Lin; Wu, Juntao; Zhang, Youwei; Li, Xiaodong; Mao, Shengcheng; Han, Xiaodong

    2016-07-01

    A ceramic/polymer composite equipped with a 3D interlocking skeleton (3D IL) is developed through a simple freeze-casting method, exhibiting exceptionally light weight, high strength, toughness, and shock resistance. Long-range crack-energy dissipation enabled by the 3D interlocking structure is considered the primary reinforcing mechanism for these superior properties. This smart composite design strategy should hold a place in developing future structural engineering materials. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  14. Apollo: a sequence annotation editor

    PubMed Central

    Lewis, SE; Searle, SMJ; Harris, N; Gibson, M; Iyer, V; Richter, J; Wiel, C; Bayraktaroglu, L; Birney, E; Crosby, MA; Kaminker, JS; Matthews, BB; Prochnik, SE; Smith, CD; Tupy, JL; Rubin, GM; Misra, S; Mungall, CJ; Clamp, ME

    2002-01-01

    The well-established inaccuracy of purely computational methods for annotating genome sequences necessitates an interactive tool to allow biological experts to refine these approximations by viewing and independently evaluating the data supporting each annotation. Apollo was developed to meet this need, enabling curators to inspect genome annotations closely and edit them. FlyBase biologists successfully used Apollo to annotate the Drosophila melanogaster genome and it is increasingly being used as a starting point for the development of customized annotation editing tools for other genome projects. PMID:12537571

  15. Chipster: user-friendly analysis software for microarray and other high-throughput data.

    PubMed

    Kallio, M Aleksi; Tuimala, Jarno T; Hupponen, Taavi; Klemelä, Petri; Gentile, Massimiliano; Scheinin, Ilari; Koski, Mikko; Käki, Janne; Korpelainen, Eija I

    2011-10-14

    The growth of high-throughput technologies such as microarrays and next generation sequencing has been accompanied by active research in data analysis methodology, producing new analysis methods at a rapid pace. While most of the newly developed methods are freely available, their use requires substantial computational skills. In order to enable non-programming biologists to benefit from the method development in a timely manner, we have created the Chipster software. Chipster (http://chipster.csc.fi/) brings a powerful collection of data analysis methods within the reach of bioscientists via its intuitive graphical user interface. Users can analyze and integrate different data types such as gene expression, miRNA and aCGH. The analysis functionality is complemented with rich interactive visualizations, allowing users to select datapoints and create new gene lists based on these selections. Importantly, users can save the performed analysis steps as reusable, automatic workflows, which can also be shared with other users. Being a versatile and easily extendable platform, Chipster can be used for microarray, proteomics and sequencing data. In this article we describe its comprehensive collection of analysis and visualization tools for microarray data using three case studies. Chipster is user-friendly analysis software for high-throughput data. Its intuitive graphical user interface enables biologists to access a powerful collection of data analysis and integration tools, and to visualize data interactively. Users can collaborate by sharing analysis sessions and workflows. Chipster is open source, and the server installation package is freely available.

  16. Effectively Identifying eQTLs from Multiple Tissues by Combining Mixed Model and Meta-analytic Approaches

    PubMed Central

    Choi, Ted; Eskin, Eleazar

    2013-01-01

    Gene expression data, in conjunction with information on genetic variants, have enabled studies to identify expression quantitative trait loci (eQTLs) or polymorphic locations in the genome that are associated with expression levels. Moreover, recent technological developments and cost decreases have further enabled studies to collect expression data in multiple tissues. One advantage of multiple tissue datasets is that studies can combine results from different tissues to identify eQTLs more accurately than examining each tissue separately. The idea of aggregating results of multiple tissues is closely related to the idea of meta-analysis which aggregates results of multiple genome-wide association studies to improve the power to detect associations. In principle, meta-analysis methods can be used to combine results from multiple tissues. However, eQTLs may have effects in only a single tissue, in all tissues, or in a subset of tissues with possibly different effect sizes. This heterogeneity in terms of effects across multiple tissues presents a key challenge to detect eQTLs. In this paper, we develop a framework that leverages two popular meta-analysis methods that address effect size heterogeneity to detect eQTLs across multiple tissues. We show by using simulations and multiple tissue data from mouse that our approach detects many eQTLs undetected by traditional eQTL methods. Additionally, our method provides an interpretation framework that accurately predicts whether an eQTL has an effect in a particular tissue. PMID:23785294
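    The fixed-effects strand of such cross-tissue aggregation can be sketched as inverse-variance weighting of per-tissue effect estimates, with Cochran's Q flagging the effect-size heterogeneity the abstract identifies as the key challenge. This is a generic illustration, not the paper's actual framework (which builds on random-effects meta-analysis methods); the tissue labels and all numbers below are hypothetical:

```python
import math

def fixed_effects_meta(betas, ses):
    """Inverse-variance-weighted fixed-effects meta-analysis.
    Combines per-tissue eQTL effect sizes (betas) and their standard
    errors (ses) into a pooled effect and an overall z-score."""
    weights = [1.0 / se ** 2 for se in ses]
    pooled = sum(w * b for w, b in zip(weights, betas)) / sum(weights)
    pooled_se = math.sqrt(1.0 / sum(weights))
    return pooled, pooled / pooled_se

def cochrans_q(betas, ses):
    """Cochran's Q statistic for effect-size heterogeneity across tissues.
    A large Q relative to len(betas) - 1 degrees of freedom suggests the
    eQTL acts differently in different tissues."""
    pooled, _ = fixed_effects_meta(betas, ses)
    return sum(((b - pooled) / se) ** 2 for b, se in zip(betas, ses))

# Hypothetical per-tissue estimates for one SNP-gene pair:
# strong effect in liver and heart, essentially none in brain.
betas = [0.80, 0.75, 0.05]
ses   = [0.20, 0.25, 0.20]
pooled, z = fixed_effects_meta(betas, ses)
q = cochrans_q(betas, ses)
```

A large Q here would push a practitioner toward a random-effects model (or a per-tissue posterior, as the paper's interpretation framework provides) rather than the single pooled estimate.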

  17. Autocalibrating motion-corrected wave-encoding for highly accelerated free-breathing abdominal MRI.

    PubMed

    Chen, Feiyu; Zhang, Tao; Cheng, Joseph Y; Shi, Xinwei; Pauly, John M; Vasanawala, Shreyas S

    2017-11-01

    To develop a motion-robust wave-encoding technique for highly accelerated free-breathing abdominal MRI. A comprehensive 3D wave-encoding-based method was developed to enable fast free-breathing abdominal imaging: (a) auto-calibration for wave-encoding was designed to avoid an extra scan for coil sensitivity measurement; (b) intrinsic butterfly navigators were used to track respiratory motion; (c) variable-density sampling was included to enable compressed sensing; (d) a golden-angle radial-Cartesian hybrid view-ordering was incorporated to improve motion robustness; and (e) localized rigid motion correction was combined with parallel imaging compressed sensing reconstruction to reconstruct the highly accelerated wave-encoded datasets. The proposed method was tested on six subjects, and image quality was compared with standard accelerated Cartesian acquisition both with and without respiratory triggering. Inverse gradient entropy and normalized gradient squared metrics were calculated, and paired t-tests were used to assess whether image quality improved. For respiratory-triggered scans, wave-encoding significantly reduced residual aliasing and blurring compared with standard Cartesian acquisition (P < 0.05 for both metrics). For non-respiratory-triggered scans, the proposed method yielded significantly better motion correction compared with standard motion-corrected Cartesian acquisition (P < 0.01 for both metrics). The proposed methods can reduce motion artifacts and improve the overall image quality of highly accelerated free-breathing abdominal MRI. Magn Reson Med 78:1757-1766, 2017. © 2016 International Society for Magnetic Resonance in Medicine.
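    The two sharpness metrics named in the abstract are typically computed from the distribution of image-gradient magnitudes: normalized gradient squared concentrates when edges are sharp, and gradient entropy rises when blurring spreads gradient energy. The sketch below uses one common formulation on a toy image; the paper's exact definitions may differ in detail:

```python
import math

def grad_mags(img):
    """Finite-difference gradient magnitudes of a 2D image (list of lists)."""
    rows, cols = len(img), len(img[0])
    g = []
    for i in range(rows - 1):
        for j in range(cols - 1):
            gx = img[i][j + 1] - img[i][j]
            gy = img[i + 1][j] - img[i][j]
            g.append(math.hypot(gx, gy))
    return g

def normalized_gradient_squared(img):
    """Sum of squared normalized gradient magnitudes; higher = sharper."""
    g = grad_mags(img)
    total = sum(g)
    return sum((v / total) ** 2 for v in g) if total else 0.0

def inverse_gradient_entropy(img):
    """Reciprocal of the entropy of the normalized gradient distribution;
    blurring spreads gradient energy over more pixels and lowers the score."""
    g = grad_mags(img)
    total = sum(g)
    probs = [v / total for v in g if v > 0]
    entropy = -sum(p * math.log(p) for p in probs)
    return 1.0 / entropy if entropy else float("inf")

# A crisp step edge versus a smoothed ramp: the sharp image should
# score higher on both metrics.
sharp = [[0.0, 0.0, 1.0, 1.0]] * 4
blurred = [[0.0, 0.33, 0.66, 1.0]] * 4
```

In the study's setting these scores would be compared between reconstructions of the same subject, which is why paired t-tests are the appropriate significance test.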

  18. Chipster: user-friendly analysis software for microarray and other high-throughput data

    PubMed Central

    2011-01-01

    Background The growth of high-throughput technologies such as microarrays and next generation sequencing has been accompanied by active research in data analysis methodology, producing new analysis methods at a rapid pace. While most of the newly developed methods are freely available, their use requires substantial computational skills. In order to enable non-programming biologists to benefit from the method development in a timely manner, we have created the Chipster software. Results Chipster (http://chipster.csc.fi/) brings a powerful collection of data analysis methods within the reach of bioscientists via its intuitive graphical user interface. Users can analyze and integrate different data types such as gene expression, miRNA and aCGH. The analysis functionality is complemented with rich interactive visualizations, allowing users to select datapoints and create new gene lists based on these selections. Importantly, users can save the performed analysis steps as reusable, automatic workflows, which can also be shared with other users. Being a versatile and easily extendable platform, Chipster can be used for microarray, proteomics and sequencing data. In this article we describe its comprehensive collection of analysis and visualization tools for microarray data using three case studies. Conclusions Chipster is user-friendly analysis software for high-throughput data. Its intuitive graphical user interface enables biologists to access a powerful collection of data analysis and integration tools, and to visualize data interactively. Users can collaborate by sharing analysis sessions and workflows. Chipster is open source, and the server installation package is freely available. PMID:21999641

  19. A photo-cross-linking approach to monitor folding and assembly of newly synthesized proteins in a living cell.

    PubMed

    Miyazaki, Ryoji; Myougo, Naomi; Mori, Hiroyuki; Akiyama, Yoshinori

    2018-01-12

    Many proteins form multimeric complexes that play crucial roles in various cellular processes. Studying how proteins are correctly folded and assembled into such complexes in a living cell is important for understanding the physiological roles and the qualitative and quantitative regulation of the complex. However, few methods are suitable for analyzing these rapidly occurring processes. Site-directed in vivo photo-cross-linking is an elegant technique that enables analysis of protein-protein interactions in living cells with high spatial resolution. However, the conventional site-directed in vivo photo-cross-linking method is unsuitable for analyzing dynamic processes. Here, by combining an improved site-directed in vivo photo-cross-linking technique with a pulse-chase approach, we developed a new method that can analyze the folding and assembly of a newly synthesized protein with high spatiotemporal resolution. We demonstrate that this method, named the pulse-chase and in vivo photo-cross-linking experiment (PiXie), enables the kinetic analysis of the formation of an Escherichia coli periplasmic (soluble) protein complex (PhoA). We also used our new technique to investigate assembly/folding processes of two membrane complexes (SecD-SecF in the inner membrane and LptD-LptE in the outer membrane), which provided new insights into the biogenesis of these complexes. Our PiXie method permits analysis of the dynamic behavior of various proteins and enables examination of protein-protein interactions at the level of individual amino acid residues. We anticipate that our new technique will be valuable for studies of protein dynamics in many organisms. © 2018 by The American Society for Biochemistry and Molecular Biology, Inc.

  20. Adjustment method for embedded metrology engine in an EM773 series microcontroller.

    PubMed

    Blazinšek, Iztok; Kotnik, Bojan; Chowdhury, Amor; Kačič, Zdravko

    2015-09-01

    This paper presents the problems of implementation and adjustment (calibration) of a metrology engine embedded in NXP's EM773 series microcontroller. The metrology engine is used in a smart metering application to collect data about energy utilization and is controlled through metrology engine adjustment (calibration) parameters. The aim of this research is to develop a method that enables operators to find and verify the optimum parameters ensuring the best possible accuracy. Properly adjusted (calibrated) metrology engines can then be used as a basis for a variety of products in smart and intelligent environments. This paper focuses on the problems encountered in the development, partial automation, implementation and verification of this method. Copyright © 2015 ISA. Published by Elsevier Ltd. All rights reserved.
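    The find-then-verify loop described above can be illustrated with a generic linear calibration fit against a reference standard. This is only a sketch of the idea: the EM773's actual metrology registers and parameter format are not modeled here, and all readings below are hypothetical:

```python
def fit_calibration(raw, reference):
    """Least-squares fit of a linear correction reference ~ gain * raw + offset.
    Stands in for the search over the metrology engine's adjustment
    parameters; real firmware would write the result to calibration registers."""
    n = len(raw)
    mean_x = sum(raw) / n
    mean_y = sum(reference) / n
    sxx = sum((x - mean_x) ** 2 for x in raw)
    sxy = sum((x - mean_x) * (y - mean_y) for x, y in zip(raw, reference))
    gain = sxy / sxx
    offset = mean_y - gain * mean_x
    return gain, offset

def verify(raw, reference, gain, offset, max_error_pct=0.5):
    """Verification step: every corrected reading must match the reference
    within a percentage tolerance (e.g. an accuracy-class limit)."""
    return all(
        abs((gain * x + offset) - y) <= max_error_pct / 100.0 * abs(y)
        for x, y in zip(raw, reference)
    )

# Hypothetical readings: raw engine counts vs. a reference meter (watts).
raw = [100.0, 200.0, 300.0, 400.0]
reference = [51.0, 101.0, 151.0, 201.0]  # consistent with gain 0.5, offset 1
gain, offset = fit_calibration(raw, reference)
ok = verify(raw, reference, gain, offset)
```

Partial automation, as mentioned in the abstract, would amount to repeating this fit-and-verify cycle over operating points (load, power factor) until the verification step passes for all of them.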
