Models Extracted from Text for System-Software Safety Analyses
NASA Technical Reports Server (NTRS)
Malin, Jane T.
2010-01-01
This presentation describes extraction and integration of requirements information and safety information in visualizations to support early review of completeness, correctness, and consistency of lengthy and diverse system safety analyses. Software tools have been developed and extended to perform the following tasks: 1) extract model parts and safety information from text in interface requirements documents, failure modes and effects analyses and hazard reports; 2) map and integrate the information to develop system architecture models and visualizations for safety analysts; and 3) provide model output to support virtual system integration testing. This presentation illustrates the methods and products with a rocket motor initiation case.
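A rough illustration of the text-to-model extraction idea in this record: the sketch below pattern-matches simple "shall send/receive" requirement sentences into component-connection triples for an architecture model. The sentences, pattern, and names are invented for the example; the actual NASA tooling uses full linguistic processing and an ontology.

```python
import re

# Toy requirement sentences of the kind found in interface requirements
# documents; the real system parses far more varied language.
REQUIREMENTS = [
    "The igniter shall receive an arm command from the flight computer.",
    "The motor controller shall send status telemetry to the ground segment.",
]

# Pattern: "The <a> shall <send|receive> <flow> <from|to> the <b>."
PATTERN = re.compile(
    r"The (?P<a>.+?) shall (?P<verb>send|receive) (?P<flow>.+?) "
    r"(?P<dir>from|to) the (?P<b>.+?)\."
)

def extract_model_parts(sentences):
    """Return (source, flow, target) triples for an architecture model."""
    triples = []
    for s in sentences:
        m = PATTERN.search(s)
        if not m:
            continue
        a, b, verb = m.group("a"), m.group("b"), m.group("verb")
        # Normalize so the triple always reads source -> target.
        src, dst = (a, b) if verb == "send" else (b, a)
        triples.append((src, m.group("flow"), dst))
    return triples

print(extract_model_parts(REQUIREMENTS))
# [('flight computer', 'an arm command', 'igniter'),
#  ('motor controller', 'status telemetry', 'ground segment')]
```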
Li, Cheng Guo; Lee, Kwang; Lee, Chang Yeol; Dangol, Manita; Jung, Hyungil
2012-08-28
A minimally invasive blood-extraction system is fabricated by the integration of an elastic self-recovery actuator and an ultrahigh-aspect-ratio microneedle. The simple elastic self-recovery actuator converts finger force to elastic energy to provide power for blood extraction and transport without requiring an external source of power. This device has potential utility in the biomedical field within the framework of complete micro-electromechanical systems.
Mathieson, William; Guljar, Nafia; Sanchez, Ignacio; Sroya, Manveer; Thomas, Gerry A
2018-05-03
DNA extracted from formalin-fixed, paraffin-embedded (FFPE) tissue blocks is amenable to analytical techniques, including sequencing. DNA extraction protocols are typically long and complex, often involving an overnight proteinase K digest. Automated platforms that shorten and simplify the process are therefore an attractive proposition for users wanting a faster turn-around or to process large numbers of biospecimens. It is, however, unclear whether automated extraction systems return poorer DNA yields or quality than manual extractions performed by experienced technicians. We extracted DNA from 42 FFPE clinical tissue biospecimens using the QiaCube (Qiagen) and ExScale (ExScale Biospecimen Solutions) automated platforms, comparing DNA yields and integrities with those from manual extractions. The QIAamp DNA FFPE Spin Column Kit was used for manual and QiaCube DNA extractions and the ExScale extractions were performed using two of the manufacturer's magnetic bead kits: one extracting DNA only and the other simultaneously extracting DNA and RNA. In all automated extraction methods, DNA yields and integrities (assayed using DNA Integrity Numbers from a 4200 TapeStation and the qPCR-based Illumina FFPE QC Assay) were poorer than in the manual method, with the QiaCube system performing better than the ExScale system. However, ExScale was fastest, offered the highest reproducibility when extracting DNA only, and required the least intervention or technician experience. Thus, the extraction methods have different strengths and weaknesses, would appeal to different users with different requirements, and therefore, we cannot recommend one method over another.
Improved system integration for integrated gasification combined cycle (IGCC) systems.
Frey, H Christopher; Zhu, Yunhua
2006-03-01
Integrated gasification combined cycle (IGCC) systems are a promising technology for power generation. They include an air separation unit (ASU), a gasification system, and a gas turbine combined cycle power block, and feature competitive efficiency and lower emissions compared to conventional power generation technology. IGCC systems are not yet in widespread commercial use and opportunities remain to improve system feasibility via improved process integration. A process simulation model was developed for IGCC systems with alternative types of ASU and gas turbine integration. The model is applied to evaluate integration schemes involving nitrogen injection, air extraction, and combinations of both, as well as different ASU pressure levels. The optimal nitrogen-injection-only case in combination with an elevated-pressure ASU had the highest efficiency and power output and nearly the lowest emissions per unit output of all cases considered, and is thus a recommended design option. The optimal combination of air extraction coupled with nitrogen injection had slightly worse efficiency, power output, and emissions than the optimal nitrogen-injection-only case. Air extraction alone typically produced lower efficiency, lower power output, and higher emissions than all other cases. The recommended nitrogen-injection-only case is estimated to provide annualized cost savings compared to a nonintegrated design. Process simulation modeling is shown to be a useful tool for evaluation and screening of technology options.
Hunter, Lawrence; Lu, Zhiyong; Firby, James; Baumgartner, William A; Johnson, Helen L; Ogren, Philip V; Cohen, K Bretonnel
2008-01-01
Background: Information extraction (IE) efforts are widely acknowledged to be important in harnessing the rapid advance of biomedical knowledge, particularly in areas where important factual information is published in a diverse literature. Here we report on the design, implementation and several evaluations of OpenDMAP, an ontology-driven, integrated concept analysis system. It significantly advances the state of the art in information extraction by leveraging knowledge in ontological resources, integrating diverse text processing applications, and using an expanded pattern language that allows the mixing of syntactic and semantic elements and variable ordering. Results: OpenDMAP information extraction systems were produced for extracting protein transport assertions (transport), protein-protein interaction assertions (interaction) and assertions that a gene is expressed in a cell type (expression). Evaluations were performed on each system, resulting in F-scores ranging from 0.26 to 0.72 (precision 0.39-0.85, recall 0.16-0.85). Additionally, each of these systems was run over all abstracts in MEDLINE, producing a total of 72,460 transport instances, 265,795 interaction instances and 176,153 expression instances. Conclusion: OpenDMAP advances the performance standards for extracting protein-protein interaction predications from the full texts of biomedical research articles. Furthermore, this level of performance appears to generalize to other information extraction tasks, including extracting information about predicates of more than two arguments. The output of the information extraction system is always constructed from elements of an ontology, ensuring that the knowledge representation is grounded with respect to a carefully constructed model of reality. The results of these efforts can be used to increase the efficiency of manual curation efforts and to provide additional features in systems that integrate multiple sources for information extraction. The open source OpenDMAP code library is freely available. PMID:18237434
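The expanded pattern language mixing literal words with ontology-grounded semantic slots can be miniaturized as below. The mini-ontology, pattern syntax, and sentence are invented; OpenDMAP's real pattern language, its parser integration, and its ontologies are far richer.

```python
# Surface term -> ontology class (illustrative entries only).
ONTOLOGY = {
    "srp": "PROTEIN", "ran": "PROTEIN", "nucleus": "CELL_COMPONENT",
}

# Pattern mixing semantic slots (UPPERCASE) and literal words.
PATTERN = ["PROTEIN", "transports", "PROTEIN", "to", "the", "CELL_COMPONENT"]

def match(tokens, pattern):
    """Match a token list against a pattern; return slot bindings or None."""
    if len(tokens) != len(pattern):
        return None
    bindings = []
    for tok, slot in zip(tokens, pattern):
        if slot.isupper():                       # semantic (ontology) slot
            if ONTOLOGY.get(tok.lower()) != slot:
                return None
            bindings.append((slot, tok))
        elif tok.lower() != slot:                # literal word
            return None
    return bindings

print(match("Ran transports SRP to the nucleus".split(), PATTERN))
# [('PROTEIN', 'Ran'), ('PROTEIN', 'SRP'), ('CELL_COMPONENT', 'nucleus')]
```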
NASA Astrophysics Data System (ADS)
Zhu, Zhen; Vana, Sudha; Bhattacharya, Sumit; Uijt de Haag, Maarten
2009-05-01
This paper discusses the integration of Forward-looking Infrared (FLIR) and traffic information from, for example, the Automatic Dependent Surveillance - Broadcast (ADS-B) or the Traffic Information Service-Broadcast (TIS-B). The goal of this integration method is to obtain an improved state estimate of a moving obstacle within the Field-of-View of the FLIR with added integrity. The focus of the paper will be on the approach phase of the flight. The paper will address methods to extract moving objects from the FLIR imagery and geo-reference these objects using outputs of both the onboard Global Positioning System (GPS) and the Inertial Navigation System (INS). The proposed extraction method uses a priori airport information and terrain databases. Furthermore, state information from the traffic information sources will be extracted and integrated with the state estimates from the FLIR. Finally, a method will be addressed that performs a consistency check between both sources of traffic information. The methods discussed in this paper will be evaluated using flight test data collected with a Gulfstream V in Reno, NV (GVSITE) and simulated ADS-B.
Wu, Lijie; Song, Ying; Hu, Mingzhu; Xu, Xu; Zhang, Hanqi; Yu, Aimin; Ma, Qiang; Wang, Ziming
2015-03-01
A simple and efficient integrated microwave processing system (IMPS) was first assembled and validated for the extraction of organophosphorus pesticides from fresh vegetables. Two processes under microwave irradiation, dynamic microwave-assisted extraction (DMAE) and microwave-accelerated solvent elution (MASE), were integrated to simplify sample pretreatment. Extraction, separation, enrichment and elution were completed in a single step. The organophosphorus pesticides were extracted from the fresh vegetables into hexane with DMAE, and the extract was then directly introduced into an enrichment column packed with activated carbon fiber (ACF). Subsequently, the organophosphorus pesticides trapped on the ACF were eluted with ethyl acetate under microwave irradiation. No further filtration or cleanup was required before analysis of the eluate by gas chromatography-mass spectrometry. Experimental parameters affecting extraction efficiency were investigated and optimized, including microwave output power, kind and volume of extraction solvent, extraction time, amount of sorbent, elution microwave power, kind and volume of elution solvent, and elution solvent flow rate. Under the optimized conditions, recoveries were in the range of 71.5-105.2%, and the relative standard deviations were lower than 11.6%. The experimental results show that the present method is a simple and effective sample preparation method for the determination of pesticides in solid samples.
Automation for System Safety Analysis
NASA Technical Reports Server (NTRS)
Malin, Jane T.; Fleming, Land; Throop, David; Thronesbery, Carroll; Flores, Joshua; Bennett, Ted; Wennberg, Paul
2009-01-01
This presentation describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis and simulation to identify and evaluate possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations and scenarios; and 4) identify resulting candidate scenarios for software integration testing. There has been significant technical progress in model extraction from Orion program text sources, architecture model derivation (components and connections) and documentation of extraction sources. Models have been derived from Internal Interface Requirements Documents (IIRDs) and FMEA documents. Linguistic text processing is used to extract model parts and relationships, and the Aerospace Ontology also aids automated model development from the extracted information. Visualizations of these models assist analysts in requirements overview and in checking consistency and completeness.
Information Extraction for System-Software Safety Analysis: Calendar Year 2008 Year-End Report
NASA Technical Reports Server (NTRS)
Malin, Jane T.
2009-01-01
This annual report describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis and simulation to identify and evaluate possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations and scenarios; and 4) identify resulting candidate scenarios for software integration testing. There has been significant technical progress in model extraction from Orion program text sources, architecture model derivation (components and connections) and documentation of extraction sources. Models have been derived from Internal Interface Requirements Documents (IIRDs) and FMEA documents. Linguistic text processing is used to extract model parts and relationships, and the Aerospace Ontology also aids automated model development from the extracted information. Visualizations of these models assist analysts in requirements overview and in checking consistency and completeness.
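A minimal sketch of the graph analysis step named in both reports above: enumerate candidate propagation paths from hazard sources to vulnerable entities over an extracted architecture graph. Component names are invented; this is not the project's actual model or tooling.

```python
import networkx as nx  # pip install networkx

# Illustrative architecture graph: nodes are components, directed edges are
# connections extracted from interface documents; hazard attributes would
# come from FMEA and hazard reports.
g = nx.DiGraph()
g.add_edges_from([
    ("battery", "power_bus"),
    ("power_bus", "flight_computer"),
    ("power_bus", "igniter"),
    ("flight_computer", "igniter"),
])
hazard_sources = {"battery"}   # e.g., stored electrical energy
vulnerable = {"igniter"}       # entities/functions that must be protected

# Candidate paths for analyst review and for deriving software
# integration test scenarios.
for src in hazard_sources:
    for dst in vulnerable:
        for path in nx.all_simple_paths(g, src, dst):
            print(" -> ".join(path))
```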
Jiang, Min; Chen, Yukun; Liu, Mei; Rosenbloom, S Trent; Mani, Subramani; Denny, Joshua C; Xu, Hua
2011-01-01
The authors' goal was to develop and evaluate machine-learning-based approaches to extracting clinical entities, including medical problems, tests, and treatments, as well as their asserted status, from hospital discharge summaries written in natural language. This project was part of the 2010 Center of Informatics for Integrating Biology and the Bedside/Veterans Affairs (VA) natural-language-processing challenge. The authors implemented a machine-learning-based named entity recognition system for clinical text and systematically evaluated the contributions of different types of features and ML algorithms, using a training corpus of 349 annotated notes. Based on the results from the training data, the authors developed a novel hybrid clinical entity extraction system, which integrated heuristic rule-based modules with the ML-based named entity recognition module. The authors applied the hybrid system to the concept extraction and assertion classification tasks in the challenge and evaluated its performance using a test data set with 477 annotated notes. Standard measures including precision, recall, and F-measure were calculated using the evaluation script provided by the Center of Informatics for Integrating Biology and the Bedside/VA challenge organizers. The overall performance for all three types of clinical entities and all six types of assertions across the 477 annotated notes was considered the primary metric in the challenge. Systematic evaluation on the training set showed that Conditional Random Fields outperformed Support Vector Machines, and that semantic information from existing natural-language-processing systems largely improved performance, although contributions from different types of features varied. The authors' hybrid entity extraction system achieved a maximum overall F-score of 0.8391 for concept extraction (ranked second) and 0.9313 for assertion classification (ranked fourth, but not statistically different from the first three systems) on the test data set in the challenge.
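A compact sketch of the CRF tagging approach described above, using the sklearn-crfsuite package, a toy feature function, and an invented training sentence; the challenge system used a much richer feature set, including semantic features from existing NLP systems.

```python
import sklearn_crfsuite  # pip install sklearn-crfsuite

def features(sent, i):
    """Simple per-token features; real systems add many more."""
    w = sent[i]
    return {
        "lower": w.lower(),
        "is_title": w.istitle(),
        "suffix3": w[-3:],
        "prev": sent[i - 1].lower() if i > 0 else "<s>",
        "next": sent[i + 1].lower() if i < len(sent) - 1 else "</s>",
    }

# One invented discharge-summary sentence with BIO tags for 'problem'.
train_sents = [["Patient", "denies", "chest", "pain", "."]]
train_tags = [["O", "O", "B-problem", "I-problem", "O"]]
X = [[features(s, i) for i in range(len(s))] for s in train_sents]

crf = sklearn_crfsuite.CRF(algorithm="lbfgs", max_iterations=50)
crf.fit(X, train_tags)

test = ["Reports", "chest", "pain", "."]
print(crf.predict([[features(test, i) for i in range(len(test))]]))
```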
Intelligent multi-sensor integrations
NASA Technical Reports Server (NTRS)
Volz, Richard A.; Jain, Ramesh; Weymouth, Terry
1989-01-01
Growth in the intelligence of space systems requires the use and integration of data from multiple sensors. Generic tools are being developed for extracting and integrating information obtained from multiple sources. The project addresses the full spectrum of issues, ranging from data acquisition, to characterization of sensor data, to adaptive systems for utilizing the data. In particular, there are three major aspects to the project: multisensor processing, an adaptive approach to object recognition, and distributed sensor system integration.
Sahore, Vishal; Sonker, Mukul; Nielsen, Anna V; Knob, Radim; Kumar, Suresh; Woolley, Adam T
2018-01-01
We have developed multichannel integrated microfluidic devices for automated preconcentration, labeling, purification, and separation of preterm birth (PTB) biomarkers. We fabricated multilayer poly(dimethylsiloxane)-cyclic olefin copolymer (PDMS-COC) devices that perform solid-phase extraction (SPE) and microchip electrophoresis (μCE) for automated PTB biomarker analysis. The PDMS control layer had a peristaltic pump and pneumatic valves for flow control, while the PDMS fluidic layer had five input reservoirs connected to microchannels and a μCE system. The COC layers had a reversed-phase octyl methacrylate porous polymer monolith for SPE and fluorescent labeling of PTB biomarkers. We determined μCE conditions for two PTB biomarkers, ferritin (Fer) and corticotropin-releasing factor (CRF). We used these integrated microfluidic devices to preconcentrate and purify off-chip-labeled Fer and CRF in an automated fashion. Finally, we performed a fully automated on-chip analysis of unlabeled PTB biomarkers, involving SPE, labeling, and μCE separation with 1 h total analysis time. These integrated systems have strong potential to be combined with upstream immunoaffinity extraction, offering a compact sample-to-answer biomarker analysis platform. Graphical abstract: Pressure-actuated integrated microfluidic devices have been developed for automated solid-phase extraction, fluorescent labeling, and microchip electrophoresis of preterm birth biomarkers.
The LifeWatch approach to the exploration of distributed species information
Fuentes, Daniel; Fiore, Nicola
2014-01-01
This paper introduces a new method of automatically extracting, integrating and presenting information regarding species from the most relevant online taxonomic resources. First, the information is extracted and joined using data wrappers and integration solutions. Then, an analytical tool is used to provide a visual representation of the data. The information is then integrated into a user friendly content management system. The proposal has been implemented using data from the Global Biodiversity Information Facility (GBIF), the Catalogue of Life (CoL), the World Register of Marine Species (WoRMS), the Integrated Taxonomic Information System (ITIS) and the Global Names Index (GNI). The approach improves data quality, avoiding taxonomic and nomenclature errors whilst increasing the availability and accessibility of the information. PMID:25589865
Microfluidic device for acoustic cell lysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Branch, Darren W.; Cooley, Erika Jane; Smith, Gennifer Tanabe
2015-08-04
A microfluidic acoustic-based cell lysing device that can be integrated with on-chip nucleic acid extraction. Using a bulk acoustic wave (BAW) transducer array, acoustic waves can be coupled into microfluidic cartridges resulting in the lysis of cells contained therein by localized acoustic pressure. Cellular materials can then be extracted from the lysed cells. For example, nucleic acids can be extracted from the lysate using silica-based sol-gel filled microchannels, nucleic acid binding magnetic beads, or Nafion-coated electrodes. Integration of cell lysis and nucleic acid extraction on-chip enables a small, portable system that allows for rapid analysis in the field.
NASA Astrophysics Data System (ADS)
Wang, Lanjing; Shao, Wenjing; Wang, Zhiyue; Fu, Wenfeng; Zhao, Wensheng
2018-02-01
Taking as an example an MEA chemical absorption carbon capture system with an 85% carbon capture rate on a 660 MW ultra-supercritical unit, this paper puts forward a new type of turbine dedicated to supplying steam to the carbon capture system. Comparison of the power plant's thermal systems under different steam supply schemes using EBSILON indicated the optimal extraction scheme for the steam extraction system in the carbon capture system. The results show that the cycle heat efficiency of a unit with the carbon capture turbine system is higher than that of the usual scheme without it. With the introduction of the carbon capture turbine, the scheme which extracts steam from the high-pressure cylinder's steam input point shows the highest cycle thermal efficiency. Its indexes are superior to those of the other schemes, and it is more suitable for existing coal-fired power plants integrated with post-combustion carbon dioxide capture systems.
Integrated printed circuit board device for cell lysis and nucleic acid extraction.
Marshall, Lewis A; Wu, Liang Li; Babikian, Sarkis; Bachman, Mark; Santiago, Juan G
2012-11-06
Preparation of raw, untreated biological samples remains a major challenge in microfluidics. We present a novel microfluidic device based on the integration of printed circuit boards and an isotachophoresis assay for sample preparation of nucleic acids from biological samples. The device has integrated resistive heaters and temperature sensors as well as a 70 μm × 300 μm × 3.7 cm microfluidic channel connecting two 15 μL reservoirs. We demonstrated this device by extracting pathogenic nucleic acids from a 1 μL dispensed volume of whole blood spiked with Plasmodium falciparum. We dispensed whole blood directly onto an on-chip reservoir, and the system's integrated heaters simultaneously lysed and mixed the sample. We then used isotachophoresis to extract the nucleic acids into a secondary buffer. We analyzed the convective mixing action with micro particle image velocimetry (micro-PIV) and verified the purity and amount of extracted nucleic acids using off-chip quantitative polymerase chain reaction (PCR). We achieved a clinically relevant limit of detection of 500 parasites per microliter. The system has no moving parts, and the process is potentially compatible with a wide range of on-chip hybridization or amplification assays.
Quiñones, Karin D; Su, Hua; Marshall, Byron; Eggers, Shauna; Chen, Hsinchun
2007-09-01
Explosive growth in biomedical research has made automated information extraction, knowledge integration, and visualization increasingly important and critically needed. The Arizona BioPathway (ABP) system extracts and displays biological regulatory pathway information from the abstracts of journal articles. This study uses relations extracted from more than 200 PubMed abstracts presented in a tabular and graphical user interface with built-in search and aggregation functionality. This paper presents a task-centered assessment of the usefulness and usability of the ABP system focusing on its relation aggregation and visualization functionalities. Results suggest that our graph-based visualization is more efficient in supporting pathway analysis tasks and is perceived as more useful and easier to use as compared to a text-based literature-viewing method. Relation aggregation significantly contributes to knowledge-acquisition efficiency. Together, the graphic and tabular views in the ABP Visualizer provide a flexible and effective interface for pathway relation browsing and analysis. Our study contributes to pathway-related research and biological information extraction by assessing the value of a multiview, relation-based interface that supports user-controlled exploration of pathway information across multiple granularities.
CMS-2 Reverse Engineering and ENCORE/MODEL Integration
1992-05-01
Automated extraction of design information from an existing software system written in CMS-2 can be used to document that system as-built. The extracted information is provided by a commercially available CASE tool. Information describing software system design is automatically extracted... the displays in Figures 1, 2, and 3.
Integrating biofiltration with SVE: A case study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lesley, M.P.; Rangan, C.R.
1996-12-01
A prototype integrated soil vacuum extraction/biofiltration system has been designed and installed at a gasoline contaminated LUST site in southern Delaware. The prototype system remediates contaminated moisture entrained in the air stream, employs automatic water level controls in the filters, and achieves maximum vapor extraction and VOC destruction efficiency with an optimum power input. In addition, the valving and piping layout allows the direction of air flow through the filters to be reversed at a given time interval, which minimizes biofouling, thereby increasing efficiency by minimizing the need for frequent cleaning. This integrated system achieves constant VOC destruction rates of 40 to 70% while maintaining optimal VOC removal rates from the subsurface. The modular design allows for easy mobilization, setup and demobilization at state-lead LUST sites throughout Delaware.
Continuous nucleus extraction by optically-induced cell lysis on a batch-type microfluidic platform.
Huang, Shih-Hsuan; Hung, Lien-Yu; Lee, Gwo-Bin
2016-04-21
The extraction of a cell's nucleus is an essential technique required for a number of procedures, such as disease diagnosis, genetic replication, and animal cloning. However, existing nucleus extraction techniques are relatively inefficient and labor-intensive. Therefore, this study presents an innovative, microfluidics-based approach featuring optically-induced cell lysis (OICL) for nucleus extraction and collection in an automatic format. In comparison to previous micro-devices designed for nucleus extraction, the new OICL device designed herein is superior in terms of flexibility, selectivity, and efficiency. To facilitate this OICL module for continuous nucleus extraction, we further integrated an optically-induced dielectrophoresis (ODEP) module with the OICL device within the microfluidic chip. This on-chip integration circumvents the need for highly trained personnel and expensive, cumbersome equipment. Specifically, this microfluidic system automates four steps by 1) automatically focusing and transporting cells, 2) releasing the nuclei on the OICL module, 3) isolating the nuclei on the ODEP module, and 4) collecting the nuclei in the outlet chamber. The efficiency of cell membrane lysis and the ODEP nucleus separation was measured to be 78.04 ± 5.70% and 80.90 ± 5.98%, respectively, leading to an overall nucleus extraction efficiency of 58.21 ± 2.21%. These results demonstrate that this microfluidics-based system can successfully perform nucleus extraction, and the integrated platform is therefore promising in cell fusion technology with the goal of achieving genetic replication, or even animal cloning, in the near future.
NASA Astrophysics Data System (ADS)
Lewis, B. E.
1982-12-01
The primary decontamination extraction section product (HAP) heat exchanger will be located between the extracting section (HA) and scrubbing section (HS) of centrifugal solvent extraction contactors in the Integrated Equipment Test (IET) facility. The heat exchanger is required to raise the temperature of the organic product stream from the HA contactor from 40 to 50°C. Tests were conducted under prototypic IET operating conditions to determine the head requirements for gravity flow and the overall heat transfer coefficient for the heat exchanger. Results from the tests indicated that the specified heat exchanger would perform satisfactorily under normal operating conditions.
Object-oriented software design in semiautomatic building extraction
NASA Astrophysics Data System (ADS)
Guelch, Eberhard; Mueller, Hardo
1997-08-01
Developing a system for semiautomatic building acquisition is a complex process that requires constant integration and updating of software modules and user interfaces. To facilitate these processes we apply an object-oriented design not only to the data but also to the software involved. We use the Unified Modeling Language (UML) to describe the object-oriented modeling of the system at different levels of detail. We can distinguish between use cases from the user's point of view, which represent a sequence of actions yielding an observable result, and use cases for programmers, who can use the system as a class library to integrate the acquisition modules into their own software. The structure of the system is based on the model-view-controller (MVC) design pattern. An example from the integration of automated texture extraction for the visualization of results demonstrates the feasibility of this approach.
Information Extraction for System-Software Safety Analysis: Calendar Year 2007 Year-End Report
NASA Technical Reports Server (NTRS)
Malin, Jane T.
2008-01-01
This annual report describes work to integrate a set of tools to support early model-based analysis of failures and hazards due to system-software interactions. The tools perform and assist analysts in the following tasks: 1) extract model parts from text for architecture and safety/hazard models; 2) combine the parts with library information to develop the models for visualization and analysis; 3) perform graph analysis on the models to identify possible paths from hazard sources to vulnerable entities and functions, in nominal and anomalous system-software configurations; 4) perform discrete-time-based simulation on the models to investigate scenarios where these paths may play a role in failures and mishaps; and 5) identify resulting candidate scenarios for software integration testing. This paper describes new challenges in a NASA abort system case, and enhancements made to develop the integrated tool set.
Sankaran, Revathy; Show, Pau Loke; Lee, Sze Ying; Yap, Yee Jiun; Ling, Tau Chuan
2018-02-01
Liquid Biphasic Flotation (LBF) is an advanced recovery method that has been effectively applied for biomolecule extraction. The objective of this investigation is to incorporate the fermentation and extraction of lipase from Burkholderia cepacia in a flotation system. An initial study was conducted to compare bacterial growth and lipase production in the flotation and shaker systems. From the results obtained, the bacteria showed quicker growth and higher lipase yield in the flotation system. The integrated process for lipase separation was investigated and showed high efficiency, reaching 92.29% with a yield of 95.73%. Upscaling of the flotation system exhibited results consistent with the lab scale: 89.53% efficiency and 93.82% yield. The combination of upstream and downstream processes in a single system enables the acceleration of product formation, improves the product yield and facilitates downstream processing. This integrated system demonstrated its potential for biomolecule fermentation and separation that could open new opportunities for industrial production.
A Database Design for a Unit Status Reporting System.
1987-03-01
definitions. (g) Extraction of data dictionary entries from existing programs. [Ref. 7: pp. 63-66] The third tool is used to define the logic of the... Automation of the Unit Status Reporting System is feasible, and would require integrated files of data and some direct data extraction from those files... an extract of AR 220-1. Relevant sections of the regulation are included to provide an easy reference for the reader. The last section of the...
Integrating Information Extraction Agents into a Tourism Recommender System
NASA Astrophysics Data System (ADS)
Esparcia, Sergio; Sánchez-Anguix, Víctor; Argente, Estefanía; García-Fornes, Ana; Julián, Vicente
Recommender systems face some problems. On the one hand, information needs to be kept up to date, which can be a costly task if not performed automatically. On the other hand, it may be interesting to include third-party services in the recommendation, since they improve its quality. In this paper, we present an add-on for the Social-Net Tourism Recommender System that uses information extraction and natural language processing techniques to automatically extract and classify information from the Web. Its goal is to keep the system updated and to obtain information about third-party services that are not offered by service providers inside the system.
Navigation integrity monitoring and obstacle detection for enhanced-vision systems
NASA Astrophysics Data System (ADS)
Korn, Bernd; Doehler, Hans-Ullrich; Hecker, Peter
2001-08-01
Typically, Enhanced Vision (EV) systems consist of two main parts, sensor vision and synthetic vision. Synthetic vision usually generates a virtual out-the-window view using databases and accurate navigation data, e.g. provided by differential GPS (DGPS). The reliability of the synthetic vision highly depends on both the accuracy of the database used and the integrity of the navigation data. But especially in GPS-based systems, the integrity of the navigation cannot be guaranteed. Furthermore, only objects that are stored in the database can be displayed to the pilot. Consequently, unexpected obstacles are invisible and this might cause severe problems. Therefore, additional information has to be extracted from sensor data to overcome these problems. In particular, the sensor data analysis has to identify obstacles and has to monitor the integrity of databases and navigation. Furthermore, if a lack of integrity arises, navigation data, e.g. the relative position of runway and aircraft, has to be extracted directly from the sensor data. The main contribution of this paper concerns the realization of these three sensor data analysis tasks within our EV system, which uses the HiVision 35 GHz MMW radar of EADS, Ulm as the primary EV sensor. For the integrity monitoring, objects extracted from radar images are registered with both database objects and objects (e.g. other aircraft) transmitted via data link. This results in a classification into known and unknown radar image objects and consequently in a validation of the integrity of database and navigation. Furthermore, special runway structures are searched for in the radar image where they should appear. The outcome of this runway check contributes to the integrity analysis, too. Concurrently with this investigation, a radar-image-based navigation is performed, without using either precision navigation or detailed database information, to determine the aircraft's position relative to the runway. The performance of our approach is demonstrated with real data acquired during extensive flight tests to several airports in Northern Germany.
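The consistency check between radar-extracted objects and data-linked traffic can be pictured as a nearest-neighbor gating test: a data-link track is confirmed only if some radar object lies within a distance gate. Coordinates, gate size, and identifiers below are invented.

```python
import math

def consistency_check(radar_objects, datalink_tracks, gate_m=150.0):
    """Label each data-link track confirmed/unconfirmed by radar evidence."""
    report = {}
    for ident, (xd, yd) in datalink_tracks.items():
        dists = [math.hypot(xr - xd, yr - yd) for xr, yr in radar_objects]
        ok = bool(dists) and min(dists) <= gate_m
        report[ident] = "confirmed" if ok else "unconfirmed"
    return report

# Local east/north positions in metres (invented).
radar = [(1020.0, 498.0), (-300.0, 2200.0)]
adsb = {"N123GV": (1000.0, 510.0), "ghost": (5000.0, 5000.0)}
print(consistency_check(radar, adsb))
# {'N123GV': 'confirmed', 'ghost': 'unconfirmed'}
```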
Human Systems Integration (HSI) Associated Development Activities in Japan
2008-06-12
machine learning and data mining methods. The continuous effort (KAIZEN) to improve the analysis phases is illustrated in Figure 14. Although there... [Figure 14: plant-operation KAIZEN cycle, covering extraction of a model, extraction of a workflow, extraction of a control rule, and variation analysis and improvement.]
Construction of Green Tide Monitoring System and Research on its Key Techniques
NASA Astrophysics Data System (ADS)
Xing, B.; Li, J.; Zhu, H.; Wei, P.; Zhao, Y.
2018-04-01
As a marine natural disaster, the Green Tide has appeared every year along the Qingdao coast since the large-scale bloom in 2008, bringing great losses to the region. It is therefore of great value to obtain real-time, dynamic information about the green tide distribution. In this study, methods of optical and microwave remote sensing are employed in green tide monitoring research. A specific remote sensing data processing flow and a green tide information extraction algorithm are designed according to the different characteristics of the optical and microwave data. For the extraction of green tide spatial distribution information, an automatic extraction algorithm for green tide distribution boundaries is designed based on the principle of mathematical morphology dilation/erosion, and key issues in information extraction, including the division of green tide regions, the obtaining of basic distributions, the limitation of distribution boundaries, and the elimination of islands, have been solved. The automatic generation of green tide distribution boundaries from the results of remote sensing information extraction is realized. Finally, a green tide monitoring system is built based on IDL/GIS secondary development in an integrated RS and GIS environment, achieving the integration of RS monitoring and information extraction.
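The dilation/erosion principle behind the boundary-extraction algorithm fits in a few lines: the boundary of a binary algae mask is its dilation minus its erosion, and islands are removed with a connected-component size filter. The mask and size threshold below are synthetic stand-ins for a classified remote-sensing image.

```python
import numpy as np
from scipy import ndimage

mask = np.zeros((9, 9), dtype=bool)
mask[2:7, 3:8] = True                 # a toy green-tide patch
mask[0, 0] = True                     # a one-pixel "island"

# Boundary = dilated mask minus eroded mask (morphological gradient).
boundary = ndimage.binary_dilation(mask) & ~ndimage.binary_erosion(mask)

# Island elimination: keep only connected components above a size threshold.
labels, n = ndimage.label(mask)
sizes = ndimage.sum(mask, labels, range(1, n + 1))
keep = np.isin(labels, 1 + np.flatnonzero(sizes >= 5))

print(boundary.astype(int))
print(keep.astype(int))               # the single-pixel island is gone
```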
ERIC Educational Resources Information Center
Hlavac, Rebecca J.; Klaus, Rachel; Betts, Kourtney; Smith, Shilo M.; Stabio, Maureen E.
2018-01-01
Medical schools in the United States continue to undergo curricular change, reorganization, and reformation as more schools transition to an integrated curriculum. Anatomy educators must find novel approaches to teach in a way that will bridge multiple disciplines. The cadaveric extraction of the central nervous system (CNS) provides an…
A fully automated liquid–liquid extraction system utilizing interface detection
Maslana, Eugene; Schmitt, Robert; Pan, Jeffrey
2000-01-01
The development of the Abbott Liquid-Liquid Extraction Station was a result of the need for an automated system to perform aqueous extraction on large sets of newly synthesized organic compounds used for drug discovery. The system utilizes a cylindrical laboratory robot to shuttle sample vials between two loading racks, two identical extraction stations, and a centrifuge. Extraction is performed by detecting the phase interface (by difference in refractive index) of the moving column of fluid drawn from the bottom of each vial containing a biphasic mixture. The integration of interface detection with fluid extraction maximizes sample throughput. Abbott-developed electronics process the detector signals. Sample mixing is performed by high-speed solvent injection. Centrifuging of the samples reduces interface emulsions. Operating software permits the user to program wash protocols with any one of six solvents per wash cycle with as many cycle repeats as necessary. Station capacity is eighty 15 ml vials. This system has proven successful with a broad spectrum of both ethyl acetate and methylene chloride based chemistries. The development and characterization of this automated extraction system will be presented. PMID:18924693
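Interface detection by refractive-index step change can be illustrated with a toy detector trace: fluid is drawn from the vial bottom and the pump stops at the first sustained jump in the signal. The values, threshold, and hold count are invented, not Abbott's detector electronics.

```python
def find_interface(signal, jump=0.5, hold=3):
    """Index of the first step change sustained for `hold` samples."""
    for i in range(1, len(signal) - hold + 1):
        if all(abs(signal[j] - signal[i - 1]) > jump for j in range(i, i + hold)):
            return i
    return None

# Simulated refractive-index readings: aqueous phase, then organic phase.
trace = [1.33, 1.34, 1.33, 1.36, 1.41, 1.42, 1.97, 1.98, 1.99, 1.98]
print(find_interface(trace))  # -> 6, where the phase interface passes
```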
Sinabro: A Smartphone-Integrated Opportunistic Electrocardiogram Monitoring System
Kwon, Sungjun; Lee, Dongseok; Kim, Jeehoon; Lee, Youngki; Kang, Seungwoo; Seo, Sangwon; Park, Kwangsuk
2016-01-01
In our preliminary study, we proposed a smartphone-integrated, unobtrusive electrocardiogram (ECG) monitoring system, Sinabro, which monitors a user’s ECG opportunistically during daily smartphone use without explicit user intervention. The proposed system also monitors ECG-derived features, such as heart rate (HR) and heart rate variability (HRV), to support the pervasive healthcare apps for smartphones based on the user’s high-level contexts, such as stress and affective state levels. In this study, we have extended the Sinabro system by: (1) upgrading the sensor device; (2) improving the feature extraction process; and (3) evaluating extensions of the system. We evaluated these extensions with a good set of algorithm parameters that were suggested based on empirical analyses. The results showed that the system could capture ECG reliably and extract highly accurate ECG-derived features with a reasonable rate of data drop during the user’s daily smartphone use. PMID:26978364
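A minimal sketch of the ECG-derived feature step: pick R peaks with a threshold plus refractory period, form RR intervals, then compute heart rate and two standard HRV statistics (SDNN, RMSSD). The signal is synthetic and the peak picker deliberately naive; the Sinabro pipeline additionally copes with noise, artefacts, and data drop.

```python
import numpy as np

FS = 250                                   # sampling rate in Hz (assumed)
t = np.arange(0, 10, 1 / FS)
ecg = np.zeros_like(t)
ecg[(np.arange(0.5, 10, 0.8) * FS).astype(int)] = 1.0   # idealized R spikes

def r_peaks(sig, fs, thresh=0.5, refractory=0.3):
    """Indices of samples above threshold, at least `refractory` s apart."""
    peaks, last = [], -np.inf
    for i, v in enumerate(sig):
        if v >= thresh and (i - last) / fs >= refractory:
            peaks.append(i)
            last = i
    return np.array(peaks)

rr = np.diff(r_peaks(ecg, FS)) / FS                 # RR intervals, seconds
hr = 60.0 / rr.mean()                               # mean heart rate, bpm
sdnn = rr.std(ddof=1) * 1000                        # HRV: SDNN, ms
rmssd = np.sqrt(np.mean(np.diff(rr) ** 2)) * 1000   # HRV: RMSSD, ms
print(f"HR {hr:.1f} bpm, SDNN {sdnn:.1f} ms, RMSSD {rmssd:.1f} ms")
```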
Integrated sample-to-detection chip for nucleic acid test assays.
Prakash, R; Pabbaraju, K; Wong, S; Tellier, R; Kaler, K V I S
2016-06-01
Nucleic acid based diagnostic techniques are routinely used for the detection of infectious agents. Most of these assays rely on nucleic acid extraction platforms for the extraction and purification of nucleic acids and a separate real-time PCR platform for quantitative nucleic acid amplification tests (NATs). Several microfluidic lab on chip (LOC) technologies have been developed, where mechanical and chemical methods are used for the extraction and purification of nucleic acids. Microfluidic technologies have also been effectively utilized for chip based real-time PCR assays. However, there are few examples of microfluidic systems which have successfully integrated these two key processes. In this study, we have implemented an electro-actuation based LOC micro-device that leverages multi-frequency actuation of samples and reagents droplets for chip based nucleic acid extraction and real-time, reverse transcription (RT) PCR (qRT-PCR) amplification from clinical samples. Our prototype micro-device combines chemical lysis with electric field assisted isolation of nucleic acid in a four channel parallel processing scheme. Furthermore, a four channel parallel qRT-PCR amplification and detection assay is integrated to deliver the sample-to-detection NAT chip. The NAT chip combines dielectrophoresis and electrostatic/electrowetting actuation methods with resistive micro-heaters and temperature sensors to perform chip based integrated NATs. The two chip modules have been validated using different panels of clinical samples and their performance compared with standard platforms. This study has established that our integrated NAT chip system has a sensitivity and specificity comparable to that of the standard platforms while providing up to 10 fold reduction in sample/reagent volumes.
Yoon, Seung-Yil; Sagi, Hemi; Goldhammer, Craig; Li, Lei
2012-01-01
Container closure integrity (CCI) is a critical factor to ensure that product sterility is maintained over its entire shelf life. Assuring the CCI during container closure (C/C) system qualification, routine manufacturing and stability is important. FDA guidance also encourages industry to develop a CCI physical testing method in lieu of sterility testing in a stability program. A mass extraction system has been developed to check CCI for a variety of container closure systems such as vials, syringes, and cartridges. Various types of defects (e.g., glass micropipette, laser drill, wire) were created and used to demonstrate a detection limit. Leakage, detected as mass flow in this study, changes as a function of defect length and diameter. Therefore, the morphology of defects has been examined in detail with fluid theories. This study demonstrated that a mass extraction system was able to distinguish between intact samples and samples with 2 μm defects reliably when the defect was exposed to air, water, placebo, or drug product (3 mg/mL concentration) solution. Also, it has been verified that the method was robust, and capable of determining the acceptance limit using 3σ for syringes and 6σ for vials. Sterile products must maintain their sterility over their entire shelf life. Container closure systems such as those found in syringes and vials provide a seal between rubber and glass containers. This seal must be ensured to maintain product sterility. A mass extraction system has been developed to check container closure integrity for a variety of container closure systems such as vials, syringes, and cartridges. In order to demonstrate the method's capability, various types of defects (e.g., glass micropipette, laser drill, wire) were created in syringes and vials and were tested. This study demonstrated that a mass extraction system was able to distinguish between intact samples and samples with 2 μm defects reliably when the defect was exposed to air, water, placebo, or drug product (3 mg/mL concentration) solution. Also, it was verified that the method showed consistent results, and was able to determine the acceptance limit using 3σ for syringes and 6σ for vials.
Automated Design Tools for Integrated Mixed-Signal Microsystems (NeoCAD)
2005-02-01
method, Model Order Reduction (MOR) tools, system-level mixed-signal circuit synthesis and optimization tools, and parasitic extraction tools. A unique... [Report contents include fast time-domain mixed-signal circuit simulation (HAARSPICE algorithms), parasitic extraction, and program milestones.]
Integrated Spacesuit Audio System Enhances Speech Quality and Reduces Noise
NASA Technical Reports Server (NTRS)
Huang, Yiteng Arden; Chen, Jingdong; Chen, Shaoyan Sharyl
2009-01-01
A new approach has been proposed for increasing astronaut comfort and speech capture. Currently, the special design of a spacesuit forms an extreme acoustic environment making it difficult to capture clear speech without compromising comfort. The proposed Integrated Spacesuit Audio (ISA) system is to incorporate the microphones into the helmet and use software to extract voice signals from background noise.
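One classic way software can pull voice out of steady background noise is spectral subtraction: estimate the noise magnitude spectrum from a speech-free lead-in, subtract it from every frame, and resynthesize. The sketch below illustrates that general idea only; it is not the ISA system's actual multi-microphone algorithm, and the signal is synthetic.

```python
import numpy as np
from scipy.signal import stft, istft

fs = 8000
rng = np.random.default_rng(0)
tt = np.arange(0, 2.0, 1 / fs)
voice = np.where(tt > 0.5, 0.5 * np.sin(2 * np.pi * 220 * tt), 0.0)
noisy = voice + 0.2 * rng.standard_normal(tt.size)   # noise-only first 0.5 s

f, frames, Z = stft(noisy, fs=fs, nperseg=256)
noise_mag = np.abs(Z[:, :20]).mean(axis=1, keepdims=True)  # speech-free frames
mag = np.maximum(np.abs(Z) - noise_mag, 0.0)               # subtract noise floor
_, enhanced = istft(mag * np.exp(1j * np.angle(Z)), fs=fs, nperseg=256)
print(noisy.shape, enhanced.shape)
```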
USDA-ARS?s Scientific Manuscript database
Silvopastoral management strategies seek to expand spatial and temporal boundaries of forage production and promote ecosystem integrity through a combination of tree thinning and understory pastures. We determined the effects of water extracts of leaf litter from yellow poplar, Liriodendron tulipife...
USDA-ARS?s Scientific Manuscript database
A simultaneous saccharification fermentation (SSF) system was studied for ethanol production in flour industrial sweetpotato (ISP) feedstocks (lines: white DM02-180 and purple NC-413) as an integrated cost saving process, and to examine the feasibility of extracting anthocyanins from flour purple IS...
Semantic World Modelling and Data Management in a 4d Forest Simulation and Information System
NASA Astrophysics Data System (ADS)
Roßmann, J.; Hoppen, M.; Bücken, A.
2013-08-01
Various types of 3D simulation applications benefit from realistic forest models. They range from flight simulators for entertainment to harvester simulators for training and tree growth simulations for research and planning. Our 4D forest simulation and information system integrates the necessary methods for data extraction, modelling and management. Using modern methods of semantic world modelling, tree data can efficiently be extracted from remote sensing data. The derived forest models contain position, height, crown volume, type and diameter of each tree. This data is modelled using GML-based data models to assure compatibility and exchangeability. A flexible approach for database synchronization is used to manage the data and provide caching, persistence, a central communication hub for change distribution, and a versioning mechanism. Combining various simulation techniques and data versioning, the 4D forest simulation and information system can provide applications with "both directions" of the fourth dimension. Our paper outlines the current state, new developments, and integration of tree extraction, data modelling, and data management. It also shows several applications realized with the system.
A Risk Assessment System with Automatic Extraction of Event Types
NASA Astrophysics Data System (ADS)
Capet, Philippe; Delavallade, Thomas; Nakamura, Takuya; Sandor, Agnes; Tarsitano, Cedric; Voyatzi, Stavroula
In this article we describe the joint effort of experts in linguistics, information extraction and risk assessment to integrate EventSpotter, an automatic event extraction engine, into ADAC, an automated early warning system. By detecting weak signals of emerging risks as early as possible, ADAC provides a dynamic synthetic picture of situations involving risk. The ADAC system calculates risk on the basis of fuzzy logic rules operated on a template graph whose leaves are event types. EventSpotter is based on a general-purpose natural language dependency parser, XIP, enhanced with domain-specific lexical resources (Lexicon-Grammar). Its role is to automatically feed the leaves with input data.
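The "fuzzy logic rules operated on a template graph whose leaves are event types" can be miniaturized: leaves carry activations in [0, 1] fed by the event extractor, and fuzzy AND/OR nodes (min/max) propagate them to a root risk score. The template and scores below are invented, not ADAC's actual rule base.

```python
# Template graph: nested (operator, children...) tuples; strings are leaves.
TEMPLATE = ("OR",
            ("AND", "equipment_purchase", "facility_construction"),
            "test_event")

def risk(node, leaf_scores):
    """Propagate leaf activations to the root with fuzzy min/max rules."""
    if isinstance(node, str):                  # a leaf event type
        return leaf_scores.get(node, 0.0)
    op, *children = node
    vals = [risk(c, leaf_scores) for c in children]
    return min(vals) if op == "AND" else max(vals)

# Activations produced upstream by the event extraction engine (invented).
scores = {"equipment_purchase": 0.7, "facility_construction": 0.4,
          "test_event": 0.2}
print(risk(TEMPLATE, scores))   # -> 0.4
```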
Integrated Heat Exchange For Recuperation In Gas Turbine Engines
2016-12-01
exchange system within the engine using existing blade surfaces to extract and insert heat. Due to the highly turbulent and transient flow, heat transfer coefficients in turbomachinery are extremely high, making this possible. Heat transfer between the turbine and compressor blade surfaces could be...
Current Nucleic Acid Extraction Methods and Their Implications to Point-of-Care Diagnostics.
Ali, Nasir; Rampazzo, Rita de Cássia Pontello; Costa, Alexandre Dias Tavares; Krieger, Marco Aurelio
2017-01-01
Nucleic acid extraction (NAE) plays a vital role in molecular biology as the primary step for many downstream applications. Many modifications have been introduced to the original 1869 method. Modern processes are categorized into chemical or mechanical, each with peculiarities that influence their use, especially in point-of-care diagnostics (POC-Dx). POC-Dx is a new approach aiming to replace sophisticated analytical machinery with microanalytical systems able to be used near the patient, at the point of care or point of need. Although notable efforts have been made, a simple and effective extraction method is still a major challenge for widespread use of POC-Dx. In this review, we dissected the working principle of each of the most common NAE methods, overviewing their advantages and disadvantages, as well as their potential for integration in POC-Dx systems. At present, it seems difficult, if not impossible, to establish a procedure which can be universally applied to POC-Dx. We also discuss the effects of the NAE chemicals upon the main plastic polymers used to mass produce POC-Dx systems. We end our review discussing the limitations and challenges that should guide the quest for an efficient extraction method that can be integrated in a POC-Dx system.
Wang, Hsiao-Ning; Liu, Tsan-Zon; Chen, Ya-Lei; Shiuan, David
2007-01-01
The protective effects of freeze-dried extracts of vegetables and fruits (BauYuan; BY) on hydroxyl radical-mediated DNA strand breakage and on the structural integrity of human red blood cells (RBCs) were investigated. First, the supercoiled plasmid (pEGFP-C1) DNA was subjected to oxidative damage by an ascorbate-fortified Fenton reaction and the protective effects were analyzed by agarose gel electrophoresis. In the absence of BY extracts, exposure to the high-throughput •OH-generating system (Fe2+ concentration >1.0 microM) caused complete fragmentation of the DNA. Supplementation of BY extract (1 mg/mL) to the plasmid DNA prior to exposure could prevent this significantly. In contrast, when the plasmid was exposed to a low-grade •OH-generating system (Fe2+<0.1 microM), the BY extract (1 mg/mL) provided almost complete protection. Next, cell deformabilities were measured to assess the protective effects of various BY extracts on human erythrocytes exposed to oxidative insults. We found that both the aqueous extract and the organic solvent-derived extracts could strongly protect human RBCs from the reactive oxygen species (ROS)-mediated decrease in deformability indices. The results indicate that the BY extracts can effectively protect cell membrane integrity by scavenging ROS, enabling RBCs to maintain a balance of water content and surface area and thus prevent the drop in cell deformability.
PKDE4J: Entity and relation extraction for public knowledge discovery.
Song, Min; Kim, Won Chul; Lee, Dahee; Heo, Go Eun; Kang, Keun Young
2015-10-01
Due to an enormous number of scientific publications that cannot be handled manually, there is a rising interest in text-mining techniques for automated information extraction, especially in the biomedical field. Such techniques provide effective means of information search, knowledge discovery, and hypothesis generation. Most previous studies have primarily focused on the design and performance improvement of either named entity recognition or relation extraction. In this paper, we present PKDE4J, a comprehensive text-mining system that integrates dictionary-based entity extraction and rule-based relation extraction in a highly flexible and extensible framework. Starting with the Stanford CoreNLP, we developed the system to cope with multiple types of entities and relations. The system also has fairly good performance in terms of accuracy as well as the ability to configure text-processing components. We demonstrate its competitive performance by evaluating it on many corpora and found that it surpasses existing systems with average F-measures of 85% for entity extraction and 81% for relation extraction.
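A stripped-down analogue of the two components PKDE4J integrates, dictionary-based entity extraction and rule-based relation extraction, is sketched below. The dictionary, trigger rule, and sentence are invented; the real system builds on Stanford CoreNLP with configurable dictionaries and rules.

```python
import re

ENTITY_DICT = {"aspirin": "DRUG", "tp53": "GENE", "inflammation": "DISEASE"}
RELATION_TRIGGERS = {"inhibits": "INHIBITION", "treats": "TREATMENT"}

def extract(sentence):
    """Dictionary lookup for entities, then a nearest-entity trigger rule."""
    tokens = re.findall(r"\w+", sentence.lower())
    entities = [(t, ENTITY_DICT[t]) for t in tokens if t in ENTITY_DICT]
    relations = []
    for i, t in enumerate(tokens):
        if t in RELATION_TRIGGERS:
            # Rule: nearest dictionary entity on each side of the trigger.
            left = next((u for u in reversed(tokens[:i]) if u in ENTITY_DICT), None)
            right = next((u for u in tokens[i + 1:] if u in ENTITY_DICT), None)
            if left and right:
                relations.append((left, RELATION_TRIGGERS[t], right))
    return entities, relations

print(extract("Aspirin treats inflammation."))
# ([('aspirin', 'DRUG'), ('inflammation', 'DISEASE')],
#  [('aspirin', 'TREATMENT', 'inflammation')])
```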
Bejan, Cosmin Adrian; Wei, Wei-Qi; Denny, Joshua C
2015-01-01
Objective: To evaluate the contribution of the MEDication Indication (MEDI) resource and SemRep for identifying treatment relations in clinical text. Materials and methods: We first processed clinical documents with SemRep to extract the Unified Medical Language System (UMLS) concepts and the treatment relations between them. Then, we incorporated MEDI into a simple algorithm that identifies treatment relations between two concepts if they match a medication-indication pair in this resource. For better coverage, we expanded MEDI using ontology relationships from RxNorm and the UMLS Metathesaurus. We also developed two ensemble methods, which combined the predictions of SemRep and the MEDI algorithm. We evaluated our selected methods on two datasets, a Vanderbilt corpus of 6864 discharge summaries and the 2010 Informatics for Integrating Biology and the Bedside (i2b2)/Veterans Affairs (VA) challenge dataset. Results: The Vanderbilt dataset included 958 manually annotated treatment relations. A double annotation was performed on 25% of relations with high agreement (Cohen's κ = 0.86). The evaluation consisted of comparing the manually annotated relations with the relations identified by SemRep, the MEDI algorithm, and the two ensemble methods. On the first dataset, the best F1-measure results achieved by the MEDI algorithm and the union of the two resources (78.7 and 80, respectively) were significantly higher than the SemRep results (72.3). On the second dataset, the MEDI algorithm achieved better precision and significantly lower recall values than the best system in the i2b2 challenge. The two systems obtained comparable F1-measure values on the subset of i2b2 relations with both arguments in MEDI. Conclusions: Both SemRep and MEDI can be used to extract treatment relations from clinical text. Knowledge-based extraction with MEDI outperformed use of SemRep alone, but superior performance was achieved by integrating both systems. The integration of knowledge-based resources such as MEDI into information extraction systems such as SemRep and the i2b2 relation extractors may improve treatment relation extraction from clinical text. PMID:25336593
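The MEDI-based rule described above reduces to a set-membership test over (medication, indication) concept pairs. The identifiers below are placeholders, not real MEDI content, and the RxNorm/Metathesaurus expansion step is omitted.

```python
# Placeholder (medication, indication) concept pairs standing in for MEDI.
MEDI_PAIRS = {
    ("C_metformin", "C_type2_diabetes"),
    ("C_lisinopril", "C_hypertension"),
}

def treatment_relations(candidate_pairs, medi=MEDI_PAIRS):
    """Keep candidate concept pairs that MEDI lists as medication-indication."""
    return [(drug, dx) for drug, dx in candidate_pairs if (drug, dx) in medi]

candidates = [("C_metformin", "C_type2_diabetes"),
              ("C_metformin", "C_hypertension")]
print(treatment_relations(candidates))
# [('C_metformin', 'C_type2_diabetes')]
```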
Nakayasu, Ernesto S.; Nicora, Carrie D.; Sims, Amy C.; Burnum-Johnson, Kristin E.; Kim, Young-Mo; Kyle, Jennifer E.; Matzke, Melissa M.; Shukla, Anil K.; Chu, Rosalie K.; Schepmoes, Athena A.; Jacobs, Jon M.; Baric, Ralph S.; Webb-Robertson, Bobbie-Jo; Smith, Richard D.
2016-01-01
ABSTRACT Integrative multi-omics analyses can empower more effective investigation and complete understanding of complex biological systems. Despite recent advances in a range of omics analyses, multi-omic measurements of the same sample are still challenging and current methods have not been well evaluated in terms of reproducibility and broad applicability. Here we adapted a solvent-based method, widely applied for extracting lipids and metabolites, to add proteomics to mass spectrometry-based multi-omics measurements. The metabolite, protein, and lipid extraction (MPLEx) protocol proved to be robust and applicable to a diverse set of sample types, including cell cultures, microbial communities, and tissues. To illustrate the utility of this protocol, an integrative multi-omics analysis was performed using a lung epithelial cell line infected with Middle East respiratory syndrome coronavirus, which showed the impact of this virus on the host glycolytic pathway and also suggested a role for lipids during infection. The MPLEx method is a simple, fast, and robust protocol that can be applied for integrative multi-omic measurements from diverse sample types (e.g., environmental, in vitro, and clinical). IMPORTANCE In systems biology studies, the integration of multiple omics measurements (i.e., genomics, transcriptomics, proteomics, metabolomics, and lipidomics) has been shown to provide a more complete and informative view of biological pathways. Thus, the prospect of extracting different types of molecules (e.g., DNAs, RNAs, proteins, and metabolites) and performing multiple omics measurements on single samples is very attractive, but such studies are challenging due to the fact that the extraction conditions differ according to the molecule type. Here, we adapted an organic solvent-based extraction method that demonstrated broad applicability and robustness, which enabled comprehensive proteomics, metabolomics, and lipidomics analyses from the same sample. Author Video: An author video summary of this article is available. PMID:27822525
Grisales Díaz, Víctor Hugo; Olivar Tost, Gerard
2017-01-01
Dual extraction, high-temperature extraction, mixture extraction, and oleyl alcohol extraction have been proposed in the literature for acetone, butanol, and ethanol (ABE) production. However, energy and economic evaluations of extraction-based separation systems under similar assumptions are necessary. Hence, direct steam distillation (DSD), the new process proposed in this work for regeneration of high-boiling extractants, was compared with several extraction-based separation systems. The evaluation was performed under similar assumptions through simulation in Aspen Plus V7.3® software. Two end-distillation systems (number of non-ideal stages between 70 and 80) were studied. Heat integration and vacuum operation of some units were proposed, reducing the energy requirements. The energy requirement of hybrid processes at a substrate concentration of 200 g/l was between 6.4 and 8.3 MJ-fuel/kg-ABE. The minimum energy requirements of extraction-based separation systems, feeding a water concentration in the substrate equivalent to the extractant selectivity, under ideal assumptions were between 2.6 and 3.5 MJ-fuel/kg-ABE. The efficiencies of recovery systems for the baseline case and the ideal evaluation were 0.53-0.57 and 0.81-0.84, respectively. The main advantages of DSD were the operation of the regeneration column at atmospheric pressure, the utilization of low-pressure steam, and the low energy requirements of preheating. The in situ recovery processes DSD and mixture extraction with conventional regeneration were the approaches with the lowest energy requirements and total annualized costs.
Extraction and Separation Modeling of Orion Test Vehicles with ADAMS Simulation
NASA Technical Reports Server (NTRS)
Fraire, Usbaldo, Jr.; Anderson, Keith; Cuthbert, Peter A.
2013-01-01
The Capsule Parachute Assembly System (CPAS) project has increased efforts to demonstrate the performance of fully integrated parachute systems at both higher dynamic pressures and in the presence of wake fields using a Parachute Compartment Drop Test Vehicle (PCDTV) and a Parachute Test Vehicle (PTV), respectively. Modeling the extraction and separation events has proven challenging, and an understanding of the physics is required to reduce the risk of separation malfunctions. The need for extraction and separation modeling is critical to a successful CPAS test campaign. Current PTV-alone simulations, such as the Decelerator System Simulation (DSS), require accurate initial conditions (ICs) drawn from a separation model. Automatic Dynamic Analysis of Mechanical Systems (ADAMS), a commercial off-the-shelf (COTS) tool, was employed to provide insight into the multi-body six-degree-of-freedom (DOF) interaction between parachute test hardware and external and internal forces. Components of the model include a composite extraction parachute, primary vehicle (PTV or PCDTV), platform cradle, a release mechanism, aircraft ramp, and a programmer parachute with attach points. Independent aerodynamic forces were applied to the mated test vehicle/platform cradle and the separated test vehicle and platform cradle. The aero coefficients were determined from real-time lookup tables as functions of both angle of attack (α) and sideslip (β). The atmospheric properties were also determined from a real-time lookup table characteristic of the Yuma Proving Ground (YPG) atmosphere relative to the planned test month. Representative geometries were constructed in ADAMS with measured mass properties generated for each independent vehicle. Derived smart separation parameters were included in ADAMS as sensors, with defined pitch and pitch-rate criteria used to refine inputs to analogous avionics systems for optimal separation conditions. Key design variables were dispersed in a Monte Carlo analysis to provide the maximum expected range of the state variables at programmer deployment to be used as ICs in DSS. Extensive comparisons were made with the Decelerator System Simulation Application (DSSA) to validate the mated portion of the ADAMS extraction trajectory. Results of the comparisons improved the fidelity of ADAMS with a ramp pitch profile update from DSSA. Post-test reconstructions resulted in improvements to extraction parachute drag area knock-down factors, extraction line modeling, and the inclusion of ball-to-socket attachments used as a release mechanism on the PTV. Modeling of two extraction parachutes was based on United States Air Force (USAF) tow test data and integrated into ADAMS for nominal and Monte Carlo trajectory assessments. Video overlay of ADAMS animations and actual C-12 chase plane test videos supported analysis and observation efforts of extraction and separation events. The COTS ADAMS simulation has been integrated with NASA-based simulations to provide complete end-to-end trajectories with a focus on the extraction, separation, and programmer deployment sequence. The flexibility of modifying ADAMS inputs has proven useful for sensitivity studies and extraction/separation modeling efforts.
Integrated Micro-Chip Amino Acid Chirality Detector for MOD
NASA Technical Reports Server (NTRS)
Glavin, D. P.; Bada, J. L.; Botta, O.; Kminek, G.; Grunthaner, F.; Mathies, R.
2001-01-01
Integration of a micro-chip capillary electrophoresis analyzer with a sublimation-based extraction technique, as used in the Mars Organic Detector (MOD), is described for the in-situ detection of amino acids and their enantiomers on solar system bodies. Additional information is contained in the original extended abstract.
ERIC Educational Resources Information Center
Higgins, Pamela J.
2005-01-01
This undergraduate laboratory experiment integrates multiple techniques ("in vitro" synthesis, enzyme assays, Western blotting) to determine the production and detection sensitivity of two common reporter proteins (beta-galactosidase and luciferase) within an "Escherichia coli" S30 transcription/translation extract. Comparison of the data suggests…
Chen, Zhongxian; Yu, Haitao; Wen, Cheng
2014-01-01
The goal of a direct-drive ocean wave energy extraction system is to convert ocean wave energy into electricity. The problem explored in this paper is the design and optimal control of such a system. An optimal control method based on internal model proportion integration differentiation (IM-PID) is proposed, although most ocean wave energy extraction systems are optimized through structure, weight, and material. With this control method, the heave speed of the outer buoy of the energy extraction system is in resonance with the incident wave, and the system efficiency is largely improved. Validity of the proposed optimal control method is verified in both regular and irregular ocean waves, and it is shown that the IM-PID control method is optimal in that it maximizes the energy conversion efficiency. In addition, the anti-interference ability of the IM-PID control method has been assessed, and the results show that it has good robustness, high precision, and strong anti-interference ability. PMID:25152913
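The abstract does not give the IM-PID equations, so the sketch below only illustrates the discrete PID loop that such a controller reduces to once the internal model has fixed the gains; the gains and the first-order buoy dynamics are hypothetical stand-ins, not the paper's model.

```python
# Minimal discrete PID sketch for buoy velocity tracking, assuming the
# IM-PID of the paper reduces to PID gains derived from an internal
# plant model; gains and plant constants here are hypothetical.
def simulate_pid(kp, ki, kd, setpoint, steps=200, dt=0.01):
    velocity, integral, prev_error = 0.0, 0.0, 0.0
    history = []
    for _ in range(steps):
        error = setpoint - velocity
        integral += error * dt
        derivative = (error - prev_error) / dt
        force = kp * error + ki * integral + kd * derivative
        # First-order stand-in for buoy dynamics: dv/dt = (force - c*v)/m
        velocity += dt * (force - 5.0 * velocity) / 2.0
        prev_error = error
        history.append(velocity)
    return history

trace = simulate_pid(kp=40.0, ki=20.0, kd=0.1, setpoint=1.0)
print(f"final velocity: {trace[-1]:.3f}")  # should settle near 1.0
```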
Liu, Xilin; Zhang, Milin; Richardson, Andrew G; Lucas, Timothy H; Van der Spiegel, Jan
2017-08-01
This paper presents a bidirectional brain machine interface (BMI) microsystem designed for closed-loop neuroscience research, especially experiments in freely behaving animals. The system-on-chip (SoC) consists of 16-channel neural recording front-ends, neural feature extraction units, 16-channel programmable neural stimulator back-ends, in-channel programmable closed-loop controllers, global analog-to-digital converters (ADCs), and peripheral circuits. The proposed neural feature extraction units include 1) an ultra-low-power neural energy extraction unit enabling 64-step natural-logarithmic-domain frequency tuning, and 2) a current-mode action potential (AP) detection unit with a time-amplitude window discriminator. A programmable proportional-integral-derivative (PID) controller has been integrated in each channel, enabling a variety of closed-loop operations. The implemented ADCs include a 10-bit voltage-mode successive approximation register (SAR) ADC for the digitization of the neural feature outputs and/or local field potential (LFP) outputs, and an 8-bit current-mode SAR ADC for the digitization of the action potential outputs. The multi-mode stimulator can be programmed to perform monopolar or bipolar, symmetrical or asymmetrical charge-balanced stimulation with a maximum current of 4 mA in an arbitrary channel configuration. The chip has been fabricated in 0.18 μm CMOS technology, occupying a silicon area of 3.7 mm². The chip dissipates 56 μW/ch on average. A general-purpose low-power microcontroller with a Bluetooth module is integrated in the system to provide a wireless link and SoC configuration. Methods, circuit techniques and system topology proposed in this work can be used in a wide range of relevant neurophysiology research, especially closed-loop BMI experiments.
Integrating the Allen Brain Institute Cell Types Database into Automated Neuroscience Workflow.
Stockton, David B; Santamaria, Fidel
2017-10-01
We developed software tools to download, extract features, and organize the Cell Types Database from the Allen Brain Institute (ABI) in order to integrate its whole cell patch clamp characterization data into the automated modeling/data analysis cycle. To expand the potential user base we employed both Python and MATLAB. The basic set of tools downloads selected raw data and extracts cell, sweep, and spike features, using ABI's feature extraction code. To facilitate data manipulation we added a tool to build a local specialized database of raw data plus extracted features. Finally, to maximize automation, we extended our NeuroManager workflow automation suite to include these tools plus a separate investigation database. The extended suite allows the user to integrate ABI experimental and modeling data into an automated workflow deployed on heterogeneous computer infrastructures, from local servers, to high performance computing environments, to the cloud. Since our approach is focused on workflow procedures our tools can be modified to interact with the increasing number of neuroscience databases being developed to cover all scales and properties of the nervous system.
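A minimal sketch of the kind of download-and-feature workflow such tools automate, assuming the allensdk package's CellTypesCache interface (the exact API may differ across allensdk versions, and network access is required on first use):

```python
# Sketch of pulling ABI Cell Types data with the allensdk package
# (the paper's tools wrap similar calls); not the authors' NeuroManager code.
from allensdk.core.cell_types_cache import CellTypesCache

ctc = CellTypesCache(manifest_file="cell_types/manifest.json")

cells = ctc.get_cells()                  # metadata for all cells
features = ctc.get_ephys_features()     # precomputed per-cell ephys features

# Index features by specimen for a local, queryable structure,
# analogous to the authors' specialized local database.
by_specimen = {f["specimen_id"]: f for f in features}
cell_id = cells[0]["id"]
if cell_id in by_specimen:
    print(cell_id, by_specimen[cell_id].get("tau"),
          by_specimen[cell_id].get("vrest"))

# Raw sweeps for one cell (downloads an NWB file on first call).
data_set = ctc.get_ephys_data(cell_id)
sweep = data_set.get_sweep(data_set.get_sweep_numbers()[0])
print(sweep["sampling_rate"], len(sweep["response"]))
```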
NASA Astrophysics Data System (ADS)
Brook, A.; Cristofani, E.; Vandewal, M.; Matheis, C.; Jonuscheit, J.; Beigang, R.
2012-05-01
The present study proposes a fully integrated, semi-automatic and near real-time, mode-operated image processing methodology developed for Frequency-Modulated Continuous-Wave (FMCW) THz images with center frequencies around 100 GHz and 300 GHz. The quality control of aeronautics composite multi-layered materials and structures using Non-Destructive Testing is the main focus of this work. Image processing is applied to the 3-D images to extract useful information. The data are processed by extracting areas of interest. The detected areas are subjected to image analysis for more detailed investigation managed by a spatial model. Finally, the post-processing stage examines and evaluates the spatial accuracy of the extracted information.
Novel method of extracting motion from natural movies.
Suzuki, Wataru; Ichinohe, Noritaka; Tani, Toshiki; Hayami, Taku; Miyakawa, Naohisa; Watanabe, Satoshi; Takeichi, Hiroshige
2017-11-01
The visual system in primates can be segregated into motion and shape pathways. Interaction occurs at multiple stages along these pathways. Processing of shape-from-motion and biological motion is considered to be a higher-order integration process involving motion and shape information. However, relatively limited types of stimuli have been used in previous studies on these integration processes. We propose a new algorithm to extract object motion information from natural movies and to move random dots in accordance with the information. The object motion information is extracted by estimating the dynamics of local normal vectors of the image intensity projected onto the x-y plane of the movie. An electrophysiological experiment on two adult common marmoset monkeys (Callithrix jacchus) showed that the natural and random dot movies generated with this new algorithm yielded comparable neural responses in the middle temporal visual area. In principle, this algorithm provided random dot motion stimuli containing shape information for arbitrary natural movies. This new method is expected to expand the neurophysiological and psychophysical experimental protocols to elucidate the integration processing of motion and shape information in biological systems. The novel algorithm proposed here was effective in extracting object motion information from natural movies and provided new motion stimuli to investigate higher-order motion information processing. Copyright © 2017 The Author(s). Published by Elsevier B.V. All rights reserved.
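A hedged reading of the algorithm is a normal-flow computation: the motion component along local intensity-gradient (normal) directions is estimated between frames and used to displace random dots. The sketch below implements that standard computation, offered as one interpretation rather than the authors' exact estimator.

```python
# Hedged sketch: estimate "normal flow" from image gradients between two
# frames and advect random dots with it. A standard normal-flow
# computation, not the authors' code.
import numpy as np

def normal_flow(frame0, frame1, eps=1e-6):
    """Motion component along the intensity gradient (normal) direction."""
    gy, gx = np.gradient(frame0.astype(float))
    gt = frame1.astype(float) - frame0.astype(float)
    mag2 = gx**2 + gy**2 + eps
    # Brightness-constancy: (u, v) . (gx, gy) = -gt along the normal.
    u = -gt * gx / mag2
    v = -gt * gy / mag2
    return u, v

def advect_dots(dots, u, v):
    """Move each (x, y) dot by the flow sampled at its nearest pixel."""
    h, w = u.shape
    out = []
    for x, y in dots:
        xi, yi = int(round(x)) % w, int(round(y)) % h
        out.append((x + u[yi, xi], y + v[yi, xi]))
    return out

rng = np.random.default_rng(0)
f0 = rng.random((64, 64))
f1 = np.roll(f0, shift=1, axis=1)          # synthetic 1-pixel rightward motion
dots = [(32.0, 32.0), (10.0, 20.0)]
u, v = normal_flow(f0, f1)
print(advect_dots(dots, u, v))
```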
MARS: bringing the automation of small-molecule bioanalytical sample preparations to a new frontier.
Li, Ming; Chou, Judy; Jing, Jing; Xu, Hui; Costa, Aldo; Caputo, Robin; Mikkilineni, Rajesh; Flannelly-King, Shane; Rohde, Ellen; Gan, Lawrence; Klunk, Lewis; Yang, Liyu
2012-06-01
In recent years, there has been a growing interest in automating small-molecule bioanalytical sample preparations specifically using the Hamilton MicroLab® STAR liquid-handling platform. In the most extensive work reported thus far, multiple small-molecule sample preparation assay types (protein precipitation extraction, SPE and liquid-liquid extraction) have been integrated into a suite that is composed of graphical user interfaces and Hamilton scripts. Using that suite, bioanalytical scientists have been able to automate various sample preparation methods to a great extent. However, there are still areas that could benefit from further automation, specifically, the full integration of analytical standard and QC sample preparation with study sample extraction in one continuous run, real-time 2D barcode scanning on the Hamilton deck and direct Laboratory Information Management System database connectivity. We developed a new small-molecule sample-preparation automation system that improves in all of the aforementioned areas. The improved system presented herein further streamlines the bioanalytical workflow, simplifies batch run design, reduces analyst intervention and eliminates sample-handling error.
García-Remesal, M; Maojo, V; Billhardt, H; Crespo, J
2010-01-01
Bringing together structured and text-based sources is an exciting challenge for biomedical informaticians, since most relevant biomedical sources belong to one of these categories. In this paper we evaluate the feasibility of integrating relational and text-based biomedical sources using: i) an original logical schema acquisition method for textual databases developed by the authors, and ii) OntoFusion, a system originally designed by the authors for the integration of relational sources. We conducted an integration experiment involving a test set of seven differently structured sources covering the domain of genetic diseases. We used our logical schema acquisition method to generate schemas for all textual sources. The sources were integrated using the methods and tools provided by OntoFusion. The integration was validated using a test set of 500 queries. A panel of experts answered a questionnaire to evaluate i) the quality of the extracted schemas, ii) the query processing performance of the integrated set of sources, and iii) the relevance of the retrieved results. The results of the survey show that our method extracts coherent and representative logical schemas. Experts' feedback on the performance of the integrated system and the relevance of the retrieved results was also positive. Regarding the validation of the integration, the system successfully provided correct results for all queries in the test set. The results of the experiment suggest that text-based sources including a logical schema can be regarded as equivalent to structured databases. Using our method, previous research and existing tools designed for the integration of structured databases can be reused - possibly subject to minor modifications - to integrate differently structured sources.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jiskoot, R.J.J.
Accurate and reliable sampling systems are imperative when confirming natural gas's commercial value. Buyers and sellers need accurate hydrocarbon-composition information to conduct fair sale transactions. Because poor sample extraction, preparation or analysis can invalidate the sale, more attention should be directed toward improving representative sampling. Consider all sampling components, i.e., gas types, line pressure and temperature, equipment maintenance and service needs, etc. The paper discusses gas sampling, design considerations (location, probe type, extraction devices, controller, and receivers), operating requirements, and system integration.
Li, Lishuang; Zhang, Panpan; Zheng, Tianfu; Zhang, Hongying; Jiang, Zhenchao; Huang, Degen
2014-01-01
Protein-Protein Interaction (PPI) extraction is an important task in biomedical information extraction. Presently, many machine learning methods for PPI extraction have achieved promising results. However, the performance is still not satisfactory. One reason is that semantic resources have largely been ignored. In this paper, we propose a multiple-kernel learning-based approach to extract PPIs, combining the feature-based kernel, tree kernel and semantic kernel. In particular, we extend the shortest path-enclosed tree kernel (SPT) by a dynamic extension strategy to retrieve richer syntactic information. Our semantic kernel calculates the protein-protein pair similarity and the context similarity based on two semantic resources: WordNet and Medical Subject Headings (MeSH). We evaluate our method with a Support Vector Machine (SVM) and achieve an F-score of 69.40% and an AUC of 92.00%, which shows that our method outperforms most of the state-of-the-art systems by integrating semantic information.
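Combining kernels linearly and feeding the result to an SVM can be sketched with scikit-learn's precomputed-kernel interface; the two toy "views" and fixed weights below stand in for the paper's feature-based, tree, and semantic kernels.

```python
# Hedged sketch of multiple-kernel learning by linear kernel combination,
# using sklearn's precomputed-kernel SVM. Fixed weights stand in for a
# learned combination; the per-view kernels are toy stand-ins.
import numpy as np
from sklearn.svm import SVC
from sklearn.metrics.pairwise import rbf_kernel, linear_kernel

rng = np.random.default_rng(0)
X_feat = rng.random((40, 10))     # feature-based representation
X_sem = rng.random((40, 5))       # semantic-similarity representation
y = rng.integers(0, 2, size=40)

# Combined kernel: weighted sum of per-view kernels (weights hypothetical).
def combined_kernel(A_feat, B_feat, A_sem, B_sem, w=(0.6, 0.4)):
    return w[0] * linear_kernel(A_feat, B_feat) + w[1] * rbf_kernel(A_sem, B_sem)

K_train = combined_kernel(X_feat, X_feat, X_sem, X_sem)
clf = SVC(kernel="precomputed").fit(K_train, y)

X_feat_test, X_sem_test = rng.random((5, 10)), rng.random((5, 5))
K_test = combined_kernel(X_feat_test, X_feat, X_sem_test, X_sem)
print(clf.predict(K_test))
```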
Strict integrity control of biomedical images
NASA Astrophysics Data System (ADS)
Coatrieux, Gouenou; Maitre, Henri; Sankur, Bulent
2001-08-01
The control of the integrity and authentication of medical images is becoming ever more important within the Medical Information Systems (MIS). The intra- and interhospital exchange of images, such as in the PACS (Picture Archiving and Communication Systems), and the ease of copying, manipulation and distribution of images have brought forth the security aspects. In this paper we focus on the role of watermarking for MIS security and address the problem of integrity control of medical images. We discuss alternative schemes to extract verification signatures and compare their tamper detection performance.
A scalable architecture for extracting, aligning, linking, and visualizing multi-Int data
NASA Astrophysics Data System (ADS)
Knoblock, Craig A.; Szekely, Pedro
2015-05-01
An analyst today has a tremendous amount of data available, but each of the various data sources typically exists in its own silo, so an analyst has limited ability to see an integrated view of the data and has little or no access to contextual information that could help in understanding the data. We have developed the Domain-Insight Graph (DIG) system, an innovative architecture for extracting, aligning, linking, and visualizing massive amounts of domain-specific content from unstructured sources. Under the DARPA Memex program we have already successfully applied this architecture to multiple application domains, including the enormous international problem of human trafficking, where we extracted, aligned and linked data from 50 million online Web pages. DIG builds on our Karma data integration toolkit, which makes it easy to rapidly integrate structured data from a variety of sources, including databases, spreadsheets, XML, JSON, and Web services. The ability to integrate Web services allows Karma to pull in live data from various social media sites, such as Twitter, Instagram, and OpenStreetMap. DIG then indexes the integrated data and provides an easy-to-use interface for query, visualization, and analysis.
PLAN2L: a web tool for integrated text mining and literature-based bioentity relation extraction.
Krallinger, Martin; Rodriguez-Penagos, Carlos; Tendulkar, Ashish; Valencia, Alfonso
2009-07-01
There is an increasing interest in using literature mining techniques to complement information extracted from annotation databases or generated by bioinformatics applications. Here we present PLAN2L, a web-based online search system that integrates text mining and information extraction techniques to access systematically information useful for analyzing genetic, cellular and molecular aspects of the plant model organism Arabidopsis thaliana. Our system facilitates a more efficient retrieval of information relevant to heterogeneous biological topics, from implications in biological relationships at the level of protein interactions and gene regulation, to sub-cellular locations of gene products and associations to cellular and developmental processes, i.e. cell cycle, flowering, root, leaf and seed development. Beyond single entities, also predefined pairs of entities can be provided as queries for which literature-derived relations together with textual evidences are returned. PLAN2L does not require registration and is freely accessible at http://zope.bioinfo.cnio.es/plan2l.
ANAEROBIC TREATMENT OF SOIL WASH FLUIDS FROM A WOOD PRESERVING SITE
An integrated system has been developed to remediate soils contaminated with pentachlorophenol (PCP) and polycyclic aromatic hydrocarbons (PAHs). This system involves the coupling of two treatment technologies, soil solvent washing and anaerobic biotreatment of the extract. Specif...
Calic, M; Jarlov, C; Gallo, P; Dwir, B; Rudra, A; Kapon, E
2017-06-22
A system of two site-controlled semiconductor quantum dots (QDs) is deterministically integrated with a photonic crystal membrane nano-cavity. The two QDs are identified via their reproducible emission spectral features, and their coupling to the fundamental cavity mode is established by emission co-polarization and cavity feeding features. A theoretical model accounting for phonon interaction and pure dephasing reproduces the observed results and permits extraction of the light-matter coupling constant for this system. The demonstrated approach offers a platform for scaling up the integration of QD systems and nano-photonic elements for integrated quantum photonics applications.
Integrated Mg/TiO2-ionic liquid system for deep desulfurization
NASA Astrophysics Data System (ADS)
Yin, Yee Cia; Kait, Chong Fai; Fatimah, Hayyiratul; Wilfred, Cecilia
2014-10-01
A series of Mg/TiO2 photocatalysts were prepared using a wet impregnation method followed by calcination at 300, 400 and 500°C for 1 h. The photocatalysts were characterized using Thermal Gravimetric Analysis, Fourier-Transform Infrared Spectroscopy, X-Ray Diffraction, and Field Emission Scanning Electron Microscopy. The performance for deep desulfurization was investigated using a model oil with 100 ppm sulfur (in the form of dibenzothiophene). The integrated system involves photocatalytic oxidation followed by ionic liquid extraction. The best performing photocatalyst was 0.25 wt% Mg loaded on titania calcined at 400°C (0.25Mg400), giving 98.5% conversion of dibenzothiophene to dibenzothiophene sulfone. The highest extraction efficiency of 97.8% was displayed by 1,2-diethylimidazolium diethylphosphate. The overall total sulfur removal was 96.3%.
Chang, Yung-Chun; Dai, Hong-Jie; Wu, Johnny Chi-Yang; Chen, Jian-Ming; Tsai, Richard Tzong-Han; Hsu, Wen-Lian
2013-12-01
Patient discharge summaries provide detailed medical information about individuals who have been hospitalized. To make a precise and legitimate assessment of the abundant data, a proper time layout of the sequence of relevant events should be compiled and used to drive a patient-specific timeline, which could further assist medical personnel in making clinical decisions. The process of identifying the chronological order of entities is called temporal relation extraction. In this paper, we propose a hybrid method to identify appropriate temporal links between a pair of entities. The method combines two approaches: one is rule-based and the other is based on the maximum entropy model. We develop an integration algorithm to fuse the results of the two approaches. All rules and the integration algorithm are formally stated so that one can easily reproduce the system and results. To optimize the system's configuration, we used the 2012 i2b2 challenge TLINK track dataset and applied threefold cross validation to the training set. Then, we evaluated its performance on the training and test datasets. The experimental results show that the proposed TEMPTING (TEMPoral relaTion extractING) system (ranked seventh) achieved an F-score of 0.563, which was at least 30% better than that of the baseline system, which randomly selects TLINK candidates from all pairs and assigns the TLINK types. The TEMPTING system using the hybrid method also outperformed the stage-based TEMPTING system. Its F-scores were 3.51% and 0.97% better than those of the stage-based system on the training set and test set, respectively. Copyright © 2013 Elsevier Inc. All rights reserved.
NASA tests composters for space
NASA Technical Reports Server (NTRS)
Atkinson, C.
1997-01-01
For long term missions, composters may be an integral part of a life support system that provides edible food crops, extracts nutrients from plant biomass and removes contaminants from the recycling stream.
VLSI Design of SVM-Based Seizure Detection System With On-Chip Learning Capability.
Feng, Lichen; Li, Zunchao; Wang, Yuanfa
2018-02-01
A portable automatic seizure detection system is very convenient for epilepsy patients to carry. In order to make the system on-chip trainable with high efficiency and attain high detection accuracy, this paper presents a very large scale integration (VLSI) design based on the nonlinear support vector machine (SVM). The proposed design mainly consists of a feature extraction (FE) module and an SVM module. The FE module performs the three-level Daubechies discrete wavelet transform to fit the physiological bands of the electroencephalogram (EEG) signal and extracts the time-frequency domain features reflecting the nonstationary signal properties. The SVM module integrates the modified sequential minimal optimization algorithm with the table-driven-based Gaussian kernel to enable efficient on-chip learning. The presented design is verified on an Altera Cyclone II field-programmable gate array and tested using two publicly available EEG datasets. Experimental results show that the designed VLSI system improves the detection accuracy and training efficiency.
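The wavelet feature-extraction step can be sketched with PyWavelets: a three-level Daubechies decomposition splits an EEG epoch into sub-bands whose energies serve as features. The wavelet order ('db4') and the specific features below are assumptions, not the paper's exact design.

```python
# Hedged sketch of the feature-extraction idea: a 3-level Daubechies DWT
# splits an EEG epoch into sub-bands, from which simple time-frequency
# features are computed. Uses PyWavelets; 'db4' order is an assumption.
import numpy as np
import pywt

def dwt_features(epoch, wavelet="db4", level=3):
    """Per-sub-band energy and absolute-mean features for one EEG epoch."""
    coeffs = pywt.wavedec(epoch, wavelet, level=level)  # [cA3, cD3, cD2, cD1]
    feats = []
    for band in coeffs:
        feats.append(np.sum(band**2))        # sub-band energy
        feats.append(np.mean(np.abs(band)))  # mean absolute amplitude
    return np.array(feats)

rng = np.random.default_rng(0)
epoch = rng.standard_normal(256)  # 1 s of EEG at 256 Hz (synthetic)
print(dwt_features(epoch).round(3))
```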
Gait recognition based on integral outline
NASA Astrophysics Data System (ADS)
Ming, Guan; Fang, Lv
2017-02-01
Biometric identification is replacing traditional security technology, and gait recognition has become a research hot spot because gait features are difficult to imitate or steal. This paper presents a gait recognition system based on the integral outline of the human body. The system has three important aspects: the preprocessing of gait images, feature extraction, and classification. Finally, a polling method is used to evaluate the performance of the system, and the problems existing in gait recognition and future directions of development are summarized.
Bras, Eduardo J S; Soares, Ruben R G; Azevedo, Ana M; Fernandes, Pedro; Arévalo-Rodríguez, Miguel; Chu, Virginia; Conde, João P; Aires-Barros, M Raquel
2017-09-15
Antibodies and other protein products such as interferons and cytokines are biopharmaceuticals of critical importance which, in order to be safely administered, have to be thoroughly purified in a cost-effective and efficient manner. The use of aqueous two-phase extraction (ATPE) is a viable option for this purification, but these systems are difficult to model and optimization procedures require lengthy and expensive screening processes. Here, a methodology for the rapid screening of antibody extraction conditions using a microfluidic channel-based toolbox is presented. A first microfluidic structure allows a simple negative-pressure driven rapid screening of up to 8 extraction conditions simultaneously, using less than 20 μL of each phase-forming solution per experiment, while a second microfluidic structure allows the integration of multi-step extraction protocols based on the results obtained with the first device. In this paper, this microfluidic toolbox was used to demonstrate the potential of LYTAG fusion proteins used as affinity tags to optimize the partitioning of antibodies in ATPE processes, where a maximum partition coefficient (K) of 9.2 in a PEG 3350/phosphate system was obtained for the antibody extraction in the presence of the LYTAG-Z dual ligand. This represents an increase of approximately 3.7-fold compared with the same conditions without the affinity molecule (K = 2.5). Overall, this miniaturized and versatile approach allowed the rapid optimization of molecule partition followed by a proof-of-concept demonstration of an integrated back extraction procedure, both of which are critical steps towards obtaining high-purity biopharmaceuticals using ATPE. Copyright © 2017 Elsevier B.V. All rights reserved.
The BioExtract Server: a web-based bioinformatic workflow platform
Lushbough, Carol M.; Jennewein, Douglas M.; Brendel, Volker P.
2011-01-01
The BioExtract Server (bioextract.org) is an open, web-based system designed to aid researchers in the analysis of genomic data by providing a platform for the creation of bioinformatic workflows. Scientific workflows are created within the system by recording tasks performed by the user. These tasks may include querying multiple, distributed data sources, saving query results as searchable data extracts, and executing local and web-accessible analytic tools. The series of recorded tasks can then be saved as a reproducible, sharable workflow available for subsequent execution with the original or modified inputs and parameter settings. Integrated data resources include interfaces to the National Center for Biotechnology Information (NCBI) nucleotide and protein databases, the European Molecular Biology Laboratory (EMBL-Bank) non-redundant nucleotide database, the Universal Protein Resource (UniProt), and the UniProt Reference Clusters (UniRef) database. The system offers access to numerous preinstalled, curated analytic tools and also provides researchers with the option of selecting computational tools from a large list of web services including the European Molecular Biology Open Software Suite (EMBOSS), BioMoby, and the Kyoto Encyclopedia of Genes and Genomes (KEGG). The system further allows users to integrate local command line tools residing on their own computers through a client-side Java applet. PMID:21546552
Hwang, Wonjun; Wang, Haitao; Kim, Hyunwoo; Kee, Seok-Cheol; Kim, Junmo
2011-04-01
The authors present a robust face recognition system for large-scale data sets taken under uncontrolled illumination variations. The proposed face recognition system consists of a novel illumination-insensitive preprocessing method, a hybrid Fourier-based facial feature extraction, and a score fusion scheme. First, in the preprocessing stage, a face image is transformed into an illumination-insensitive image, called an "integral normalized gradient image," by normalizing and integrating the smoothed gradients of a facial image. Then, for feature extraction of complementary classifiers, multiple face models based upon hybrid Fourier features are applied. The hybrid Fourier features are extracted from different Fourier domains in different frequency bandwidths, and then each feature is individually classified by linear discriminant analysis. In addition, multiple face models are generated by plural normalized face images that have different eye distances. Finally, to combine scores from multiple complementary classifiers, a log likelihood ratio-based score fusion scheme is applied. The proposed system is evaluated using the face recognition grand challenge (FRGC) experimental protocols; FRGC is a large publicly available data set. Experimental results on the FRGC version 2.0 data sets show that the proposed method achieves an average 81.49% verification rate on 2-D face images under various environmental variations such as illumination changes, expression changes, and time lapses.
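A rough sketch of the preprocessing idea as the abstract describes it (smooth, take gradients, normalize, integrate); the kernel width and the integration scheme below are guesses, not the authors' formulation.

```python
# Hedged sketch of an "integral normalized gradient image" preprocessing
# step: smooth, take gradients, normalize by local magnitude, then
# integrate (cumulative sum) to rebuild an illumination-insensitive image.
import numpy as np
from scipy.ndimage import gaussian_filter

def integral_normalized_gradient(img, sigma=1.0, eps=1e-3):
    smoothed = gaussian_filter(img.astype(float), sigma)
    gy, gx = np.gradient(smoothed)
    mag = np.sqrt(gx**2 + gy**2) + eps
    nx, ny = gx / mag, gy / mag            # illumination-insensitive directions
    # Simple reconstruction: integrate normalized gradients along each axis.
    recon = np.cumsum(nx, axis=1) + np.cumsum(ny, axis=0)
    return (recon - recon.min()) / (np.ptp(recon) + eps)

face = np.outer(np.linspace(0, 1, 64), np.linspace(1, 2, 64))  # toy image
print(integral_normalized_gradient(face).shape)  # (64, 64)
```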
The 1984 NASA/ASEE summer faculty fellowship program
NASA Technical Reports Server (NTRS)
1984-01-01
The assessment of forest productivity and associated nitrogen flux in a number of conifer ecosystems is described. A baseline study of acid precipitation in the Sierra Nevada involves the extraction and integration of a number of data planes describing the terrain, soils, lithology, vegetation cover and structure, and microclimate of the region. The development of automated techniques to extract topographic networks (stream canyons and ridge lines) for use as a landscape skeleton to organize and integrate data sets into an efficient geographical information system is examined. The software is written in both FORTRAN and C, and is portable to a number of different computer environments with minimal modification.
System identification methods for aircraft flight control development and validation
NASA Technical Reports Server (NTRS)
Tischler, Mark B.
1995-01-01
System-identification methods compose a mathematical model, or series of models, from measurements of inputs and outputs of dynamic systems. The extracted models allow the characterization of the response of the overall aircraft or component subsystem behavior, such as actuators and on-board signal processing algorithms. This paper discusses the use of frequency-domain system-identification methods for the development and integration of aircraft flight-control systems. The extraction and analysis of models of varying complexity from nonparametric frequency-responses to transfer-functions and high-order state-space representations is illustrated using the Comprehensive Identification from FrEquency Responses (CIFER) system-identification facility. Results are presented for test data of numerous flight and simulation programs at the Ames Research Center including rotorcraft, fixed-wing aircraft, advanced short takeoff and vertical landing (ASTOVL), vertical/short takeoff and landing (V/STOL), tiltrotor aircraft, and rotor experiments in the wind tunnel. Excellent system characterization and dynamic response prediction is achieved for this wide class of systems. Examples illustrate the role of system-identification technology in providing an integrated flow of dynamic response data around the entire life-cycle of aircraft development from initial specifications, through simulation and bench testing, and into flight-test optimization.
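The frequency-response extraction at the heart of such methods can be sketched with standard spectral estimates: H(f) = Pxy(f)/Pxx(f) from measured input/output records. This is the textbook estimator, not the CIFER implementation.

```python
# Hedged sketch of frequency-domain system identification: estimate a
# frequency response H(f) = Pxy(f)/Pxx(f) from input/output records,
# then read off magnitude and phase at a frequency of interest.
import numpy as np
from scipy.signal import csd, welch

fs = 100.0
t = np.arange(0, 50, 1 / fs)
u = np.sin(2 * np.pi * 1.0 * t) + 0.5 * np.sin(2 * np.pi * 5.0 * t)

# Toy first-order actuator: y' = (u - y)/tau, simulated by Euler stepping.
tau, y = 0.2, np.zeros_like(u)
for k in range(1, len(u)):
    y[k] = y[k - 1] + (u[k - 1] - y[k - 1]) / tau / fs

f, Pxy = csd(u, y, fs=fs, nperseg=1024)
_, Pxx = welch(u, fs=fs, nperseg=1024)
H = Pxy / Pxx
k1 = np.argmin(np.abs(f - 1.0))
print(f"|H(1 Hz)| = {np.abs(H[k1]):.2f}, "
      f"phase = {np.degrees(np.angle(H[k1])):.1f} deg")
# First-order model predicts |H| = 1/sqrt(1+(2*pi*f*tau)^2) ~ 0.62 at 1 Hz.
```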
Enhancing security of fingerprints through contextual biometric watermarking.
Noore, Afzel; Singh, Richa; Vatsa, Mayank; Houck, Max M
2007-07-04
This paper presents a novel digital watermarking technique using face and demographic text data as multiple watermarks for verifying the chain of custody and protecting the integrity of a fingerprint image. The watermarks are embedded in selected texture regions of a fingerprint image using discrete wavelet transform. Experimental results show that modifications in these locations are visually imperceptible and maintain the minutiae details. The integrity of the fingerprint image is verified through the high matching scores obtained from an automatic fingerprint identification system. There is also a high degree of visual correlation between the embedded images, and the extracted images from the watermarked fingerprint. The degree of similarity is computed using pixel-based metrics and human visual system metrics. The results also show that the proposed watermarked fingerprint and the extracted images are resilient to common attacks such as compression, filtering, and noise.
NASA Astrophysics Data System (ADS)
Wang, Ximing; Kim, Bokkyu; Park, Ji Hoon; Wang, Erik; Forsyth, Sydney; Lim, Cody; Ravi, Ragini; Karibyan, Sarkis; Sanchez, Alexander; Liu, Brent
2017-03-01
Quantitative imaging biomarkers are widely used in clinical trials for tracking and evaluating medical interventions. Previously, we presented a web-based informatics system utilizing quantitative imaging features for predicting outcomes in stroke rehabilitation clinical trials. The system integrates imaging feature extraction tools and a web-based statistical analysis tool. The tools include a generalized linear mixed model (GLMM) that can investigate potential significance and correlation based on features extracted from clinical data and quantitative biomarkers. The imaging feature extraction tools allow the user to collect imaging features, and the GLMM module allows the user to select clinical data and imaging features, such as stroke lesion characteristics, from the database as regressors and regressands. This paper discusses the application scenario and evaluation results of the system in a stroke rehabilitation clinical trial. The system was utilized to manage clinical data and extract imaging biomarkers including stroke lesion volume, location, and ventricle/brain ratio. The GLMM module was validated and the efficiency of data analysis was also evaluated.
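The mixed-model analysis step can be sketched with statsmodels; a linear mixed model with a per-subject random intercept stands in for the system's GLMM, and the variable names and data below are synthetic.

```python
# Hedged sketch of the statistical step: a linear mixed model relating a
# rehabilitation outcome to an imaging biomarker with per-subject random
# effects. Uses statsmodels; variable names and data are synthetic.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n_subj, n_visits = 20, 4
subj = np.repeat(np.arange(n_subj), n_visits)
lesion_volume = rng.normal(30, 10, n_subj)[subj]      # mL, synthetic
visit = np.tile(np.arange(n_visits), n_subj)
outcome = 60 - 0.5 * lesion_volume + 2.0 * visit + rng.normal(0, 3, subj.size)

df = pd.DataFrame({"subject": subj, "lesion_volume": lesion_volume,
                   "visit": visit, "outcome": outcome})

# Random intercept per subject; lesion volume and visit as fixed effects.
model = smf.mixedlm("outcome ~ lesion_volume + visit", df, groups=df["subject"])
result = model.fit()
print(result.params[["lesion_volume", "visit"]])
```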
Biotechnology for Solar System Exploration
NASA Astrophysics Data System (ADS)
Steele, A.; Maule, J.; Toporski, J.; Parro-Garcia, V.; Briones, C.; Schweitzer, M.; McKay, D.
With the advent of a new era of astrobiology missions in the exploration of the solar system and the search for evidence of life elsewhere, we present a new approach to this goal: the integration of biotechnology. We have reviewed the current list of biotechnology techniques which are applicable to miniaturization, automation and integration into a combined flight platform. Amongst the techniques reviewed are: the uses of antibodies; fluorescent detection strategies; protein and DNA chip technology; surface plasmon resonance and its relation to other techniques; micro-electronic machining (MEMS, where applicable to biological systems); nanotechnology (e.g., molecular motors); lab-on-a-chip technology (including PCR); mass spectrometry (i.e., MALDI-TOF); fluid handling and extraction technologies; chemical force microscopy (CFM); and Raman spectroscopy. We have begun to integrate this knowledge into a single flight instrument approach for the sole purpose of combining several mutually confirming tests for life, organic and/or microbial contamination, as well as prebiotic and abiotic organic chemicals. We will present several innovative designs for new instrumentation, including pro-engineering design drawings of a protein chip reader for space flight and fluid handling strategies. We will also review the use of suitable extraction methodologies for use on different solar system bodies.
Metadata-Driven SOA-Based Application for Facilitation of Real-Time Data Warehousing
NASA Astrophysics Data System (ADS)
Pintar, Damir; Vranić, Mihaela; Skočir, Zoran
Service-oriented architecture (SOA) has already been widely recognized as an effective paradigm for achieving integration of diverse information systems. SOA-based applications can cross boundaries of platforms, operating systems and proprietary data standards, commonly through the usage of Web Services technology. On the other hand, metadata is also commonly regarded as a potential integration tool, since standardized metadata objects can provide useful information about the specifics of unknown information systems with which one wishes to communicate, an approach commonly called "model-based integration". This paper presents the result of research regarding possible synergy between those two integration facilitators. This is accomplished with a vertical example of a metadata-driven SOA-based business process that provides ETL (Extraction, Transformation and Loading) and metadata services to a data warehousing system in need of real-time ETL support.
A case study of data integration for aquatic resources using semantic web technologies
Gordon, Janice M.; Chkhenkeli, Nina; Govoni, David L.; Lightsom, Frances L.; Ostroff, Andrea C.; Schweitzer, Peter N.; Thongsavanh, Phethala; Varanka, Dalia E.; Zednik, Stephan
2015-01-01
Use cases, information modeling, and linked data techniques are Semantic Web technologies used to develop a prototype system that integrates scientific observations from four independent USGS and cooperator data systems. The techniques were tested with a use case goal of creating a data set for use in exploring potential relationships among freshwater fish populations and environmental factors. The resulting prototype extracts data from the BioData Retrieval System, the Multistate Aquatic Resource Information System, the National Geochemical Survey, and the National Hydrography Dataset. A prototype user interface allows a scientist to select observations from these data systems and combine them into a single data set in RDF format that includes explicitly defined relationships and data definitions. The project was funded by the USGS Community for Data Integration and undertaken by the Community for Data Integration Semantic Web Working Group in order to demonstrate use of Semantic Web technologies by scientists. This allows scientists to simultaneously explore data that are available in multiple, disparate systems beyond those they traditionally have used.
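The linked-data step can be sketched with rdflib: observations from two "sources" are merged into one RDF graph with explicit predicates and then queried together. The namespace and predicate names are illustrative, not the project's vocabularies.

```python
# Hedged sketch of the linked-data step: merging observations from two
# sources into one RDF graph with explicit relationships, using rdflib.
from rdflib import Graph, Literal, Namespace, URIRef
from rdflib.namespace import RDF

EX = Namespace("http://example.org/aquatic/")
g = Graph()

site = URIRef(EX["site/001"])
g.add((site, RDF.type, EX.SamplingSite))
g.add((site, EX.fishCountPerHaul, Literal(42)))       # from a fish survey
g.add((site, EX.sedimentZinc_ppm, Literal(95.0)))     # from a geochemical survey

# Query the merged graph across both "sources".
query = (
    "SELECT ?fish ?zn WHERE { "
    "?s <http://example.org/aquatic/fishCountPerHaul> ?fish ; "
    "<http://example.org/aquatic/sedimentZinc_ppm> ?zn }"
)
for row in g.query(query):
    print(row.fish, row.zn)
```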
Xenopus egg extract: A powerful tool to study genome maintenance mechanisms.
Hoogenboom, Wouter S; Klein Douwel, Daisy; Knipscheer, Puck
2017-08-15
DNA repair pathways are crucial to maintain the integrity of our genome and prevent genetic diseases such as cancer. There are many different types of DNA damage and specific DNA repair mechanisms have evolved to deal with these lesions. In addition to these repair pathways there is an extensive signaling network that regulates processes important for repair, such as cell cycle control and transcription. Despite extensive research, DNA damage repair and signaling are not fully understood. In vitro systems such as the Xenopus egg extract system, have played, and still play, an important role in deciphering the molecular details of these processes. Xenopus laevis egg extracts contain all factors required to efficiently perform DNA repair outside a cell, using mechanisms conserved in humans. These extracts have been used to study several genome maintenance pathways, including mismatch repair, non-homologous end joining, ICL repair, DNA damage checkpoint activation, and replication fork stability. Here we describe how the Xenopus egg extract system, in combination with specifically designed DNA templates, contributed to our detailed understanding of these pathways. Copyright © 2017. Published by Elsevier Inc.
NASA Astrophysics Data System (ADS)
Chan, Yi-Tung; Wang, Shuenn-Jyi; Tsai, Chung-Hsien
2017-09-01
Public safety is a matter of national security and people's livelihoods. In recent years, intelligent video-surveillance systems have become important active-protection systems. A surveillance system that provides early detection and threat assessment could protect people from crowd-related disasters and ensure public safety. Image processing is commonly used to extract features, e.g., people, from a surveillance video. However, little research has been conducted on the relationship between foreground detection and feature extraction. Most current video-surveillance research has been developed for restricted environments, in which the extracted features are limited by having information from a single foreground; they do not effectively represent the diversity of crowd behavior. This paper presents a general framework based on extracting ensemble features from the foreground of a surveillance video to analyze a crowd. The proposed method can flexibly integrate different foreground-detection technologies to adapt to various monitored environments. Furthermore, the extractable representative features depend on the heterogeneous foreground data. Finally, a classification algorithm is applied to these features to automatically model crowd behavior and distinguish an abnormal event from normal patterns. The experimental results demonstrate that the proposed method's performance is both comparable to that of state-of-the-art methods and satisfies the requirements of real-time applications.
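A minimal sketch of the pluggable foreground-detection stage using OpenCV's MOG2 background subtractor, from which simple per-frame crowd features are computed; the thresholds and features below are illustrative, not the paper's configuration.

```python
# Hedged sketch of the foreground-detection stage: OpenCV's MOG2 background
# subtractor yields foreground masks from which simple crowd features
# (foreground ratio, blob count) are computed per frame.
import cv2
import numpy as np

subtractor = cv2.createBackgroundSubtractorMOG2(history=200, detectShadows=False)

def crowd_features(frame):
    mask = subtractor.apply(frame)
    fg_ratio = float(np.count_nonzero(mask)) / mask.size
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    blobs = [c for c in contours if cv2.contourArea(c) > 50.0]
    return fg_ratio, len(blobs)

# Synthetic frames: static background with one moving bright square.
for step in range(20):
    frame = np.zeros((120, 160, 3), dtype=np.uint8)
    x = 8 * step
    cv2.rectangle(frame, (x, 40), (x + 20, 60), (255, 255, 255), -1)
    ratio, n_blobs = crowd_features(frame)

print(f"foreground ratio {ratio:.3f}, blobs {n_blobs}")
```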
Knowledge Representation Of CT Scans Of The Head
NASA Astrophysics Data System (ADS)
Ackerman, Laurens V.; Burke, M. W.; Rada, Roy
1984-06-01
We have been investigating diagnostic knowledge models which assist in the automatic classification of medical images by combining information extracted from each image with knowledge specific to that class of images. In a more general sense we are trying to integrate verbal and pictorial descriptions of disease via representations of knowledge, study automatic hypothesis generation as related to clinical medicine, evolve new mathematical image measures while integrating them into the total diagnostic process, and investigate ways to augment the knowledge of the physician. Specifically, we have constructed an artificial intelligence knowledge model using the technique of a production system blending pictorial and verbal knowledge about the respective CT scan and patient history. It is an attempt to tie together different sources of knowledge representation, picture feature extraction and hypothesis generation. Our knowledge reasoning and representation system (KRRS) works with data at the conscious reasoning level of the practicing physician while at the visual perceptional level we are building another production system, the picture parameter extractor (PPE). This paper describes KRRS and its relationship to PPE.
Qi, Xiubin; Crooke, Emma; Ross, Andrew; Bastow, Trevor P; Stalvies, Charlotte
2011-09-21
This paper presents a system and method developed to identify a source oil's characteristic properties by testing the oil's dissolved components in water. Through close examination of the oil dissolution process in water, we hypothesise that when oil is in contact with water, the resulting oil-water extract, a complex hydrocarbon mixture, carries the signature property information of the parent oil. If the dominating differences in composition between such extracts of different oils can be identified, this information can guide the selection of sensors capable of capturing such chemical variations. When used as an array, such a sensor system can be used to determine parent oil information from the oil-water extract. To test this hypothesis, the water extracts of 22 oils were prepared and selected dominant hydrocarbons analyzed with Gas Chromatography-Mass Spectrometry (GC-MS); the subsequent Principal Component Analysis (PCA) indicates that the major difference between the extract solutions is the relative concentration of volatile mono-aromatics versus fluorescent polyaromatics. An integrated sensor array system composed of 3 volatile hydrocarbon sensors and 2 polyaromatic hydrocarbon sensors was built accordingly to capture the major and subtle differences of these extracts. It was tested by exposure to a total of 110 water extract solutions diluted from the 22 extracts. The sensor response data collected from the testing were processed with two multivariate analysis tools to reveal information retained in the response patterns of the arrayed sensors. By conducting PCA, we were able to qualitatively identify and distinguish different oil samples from their sensor array response patterns. When a supervised method, Linear Discriminant Analysis (LDA), was applied, quantitative classification could be achieved: the multivariate model generated from the LDA achieved 89.7% successful classification of the oil sample types. By grouping the samples based on viscosity and density, we were able to reveal the correlation between the oil extracts' sensor array responses and their parent oils' characteristic properties. The equipment and method developed in this study have promising potential to be readily applied in field studies and marine surveys for oil exploration or oil spill monitoring.
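The PCA-then-LDA analysis can be sketched with scikit-learn on synthetic stand-ins for the five-sensor responses: PCA for qualitative separation, then cross-validated LDA for quantitative classification.

```python
# Hedged sketch of the data-analysis pipeline: PCA for unsupervised
# inspection of sensor-array responses, then LDA for supervised oil-type
# classification. Data are synthetic stand-ins for the 5-sensor array.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_per_class, n_sensors = 25, 5
oils, labels = [], []
for cls, center in enumerate([0.2, 0.5, 0.8]):     # three oil types
    oils.append(center + 0.08 * rng.standard_normal((n_per_class, n_sensors)))
    labels += [cls] * n_per_class
X, y = np.vstack(oils), np.array(labels)

scores = PCA(n_components=2).fit_transform(X)      # qualitative separation
print("PC1 class means:",
      [scores[y == c, 0].mean().round(2) for c in range(3)])

lda = LinearDiscriminantAnalysis()
acc = cross_val_score(lda, X, y, cv=5).mean()      # quantitative classification
print(f"cross-validated accuracy: {acc:.2f}")
```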
Engineering the System and Technical Integration
NASA Technical Reports Server (NTRS)
Blair, J. C.; Ryan, R. S.; Schutzenhofer, L. A.
2011-01-01
Approximately 80% of the problems encountered in aerospace systems have been due to a breakdown in technical integration and/or systems engineering. One of the major challenges we face in designing, building, and operating space systems is: how is adequate integration achieved for the system's various functions, parts, and infrastructure? This Contractor Report (CR) deals with part of the problem of how we engineer the total system in order to achieve the best balanced design. We discuss a key aspect of this question, the principle of Technical Integration and its components, along with management and decision making. The CR first provides an introduction discussing the challenges in space system design and how to meet them. Next is an overview of Engineering the System, including Technical Integration. Engineering the System is expanded to include key aspects of the Design Process, Lifecycle Considerations, etc. The basic information and figures used in this CR were presented in a NASA training program for Program and Project Managers Development (PPMD) in classes at Georgia Tech and at Marshall Space Flight Center (MSFC). Many of the principles and illustrations are extracted from the courses we teach for MSFC.
NASA Astrophysics Data System (ADS)
Kang, Yun Hee; Hwang, Jae Ran; Chung, Ik Kyo; Park, Sang Rul
2013-03-01
Integrated multi-trophic aquaculture (IMTA) has been proposed as a concept that combines the cultivation of fed aquaculture species (e.g., finfish/shrimp) with extractive aquaculture species (e.g., shellfish/seaweed). In seaweed-based integrated aquaculture, seaweeds have the capacity to reduce the environmental impact of nitrogen-rich effluents on coastal ecosystems. Thus, selection of optimal species for such aquaculture is of great importance. The present study aimed to develop a species-selection index for identifying suitable seaweeds for seaweed-based integrated aquaculture systems. The index was synthesized using available literature-based information, reference data, and physiological seaweed experiments to identify and prioritize the desired species. Undaria pinnatifida, Porphyra yezoensis and Ulva compressa scored the highest according to the seaweed-based integrated aquaculture suitability index (SASI). Seaweed species with the highest scores were judged to fit integrated aquaculture systems. Although the application of this model is limited by the local aquaculture environment, it is considered a useful tool for selecting seaweed species in IMTA.
eGARD: Extracting associations between genomic anomalies and drug responses from text
Rao, Shruti; McGarvey, Peter; Wu, Cathy; Madhavan, Subha; Vijay-Shanker, K.
2017-01-01
Tumor molecular profiling plays an integral role in identifying genomic anomalies which may help in personalizing cancer treatments, improving patient outcomes and minimizing risks associated with different therapies. However, critical information regarding the evidence of clinical utility of such anomalies is largely buried in biomedical literature. It is becoming prohibitive for biocurators, clinical researchers and oncologists to keep up with the rapidly growing volume and breadth of information, especially information that describes therapeutic implications of biomarkers and is therefore relevant for treatment selection. In an effort to improve and speed up the process of manually reviewing and extracting relevant information from literature, we have developed a natural language processing (NLP)-based text mining (TM) system called eGARD (extracting Genomic Anomalies association with Response to Drugs). This system relies on the syntactic nature of sentences coupled with various textual features to extract relations between genomic anomalies and drug response from MEDLINE abstracts. Our system achieved high precision, recall and F-measure of up to 0.95, 0.86 and 0.90, respectively, on annotated evaluation datasets created in-house and obtained externally from PharmGKB. Additionally, the system extracted information that helps determine the confidence level of extraction to support prioritization of curation. Such a system will enable clinical researchers to explore the use of published markers to stratify patients upfront for ‘best-fit’ therapies and readily generate hypotheses for new clinical trials. PMID:29261751
Sample extraction and injection with a microscale preconcentrator.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Robinson, Alex Lockwood; Chan, Helena Kai Lun
2007-09-01
This report details the development of a microfabricated preconcentrator that functions as a fully integrated chemical extractor-injector for a microscale gas chromatograph (GC). The device enables parts-per-billion detection and quantitative analysis of volatile organic compounds (VOCs) in indoor air with size and power advantages over macro-scale systems. The 44 mm³ preconcentrator extracts VOCs using highly adsorptive, granular forms of graphitized carbon black and carbon molecular sieves. The micron-sized silicon cavities have integrated heating and temperature sensing allowing low power, yet rapid heating to thermally desorb the collected VOCs (GC injection). The keys to device construction are a new adsorbent-solvent filling technique and solvent-tolerant wafer-level silicon-gold eutectic bonding technology. The product is the first granular adsorbent preconcentrator integrated at the wafer level. Other advantages include exhaustive VOC extraction and injection peak widths an order of magnitude narrower than predecessor prototypes. A mass transfer model, the first for any microscale preconcentrator, is developed to describe both adsorption and desorption behaviors. The physically intuitive model uses implicit and explicit finite differences to numerically solve the required partial differential equations. The model is applied to the adsorption and desorption of decane at various concentrations to extract Langmuir adsorption isotherm parameters from effluent curve measurements where properties are unknown a priori.
Evolving Maturation of the Series-Bosch System
NASA Technical Reports Server (NTRS)
Stanley, Christine; Abney, Morgan B.; Barnett, Bill
2017-01-01
Human exploration missions to Mars and other destinations beyond low Earth orbit require highly robust, reliable, and maintainable life support systems that maximize recycling of water and oxygen. In order to meet this requirement, NASA has continued the development of a Series-Bosch System, a two-stage reactor process that reduces carbon dioxide (CO2) with hydrogen (H2) to produce water and solid carbon. Theoretically, the Bosch process can recover 100% of the oxygen (O2) from CO2 in the form of water, making it an attractive option for long duration missions. The Series Bosch system includes a reverse water gas shift (RWGS) reactor, a carbon formation reactor (CFR), an H2 extraction membrane, and a CO2 extraction membrane. In 2016, the results of integrated testing of the Series Bosch system showed great promise and resulted in design modifications to the CFR to further improve performance. This year, integrated testing was conducted with the modified reactor to evaluate its performance and compare it with the performance of the previous configuration. Additionally, a CFR with the capability to load new catalyst and remove spent catalyst in-situ was built. Flow demonstrations were performed to evaluate both the catalyst loading and removal process and the hardware performance. The results of the integrated testing with the modified CFR as well as the flow demonstrations are discussed in this paper.
Mutual information, neural networks and the renormalization group
NASA Astrophysics Data System (ADS)
Koch-Janusz, Maciej; Ringel, Zohar
2018-06-01
Physical systems differing in their microscopic details often display strikingly similar behaviour when probed at macroscopic scales. Those universal properties, largely determining their physical characteristics, are revealed by the powerful renormalization group (RG) procedure, which systematically retains `slow' degrees of freedom and integrates out the rest. However, the important degrees of freedom may be difficult to identify. Here we demonstrate a machine-learning algorithm capable of identifying the relevant degrees of freedom and executing RG steps iteratively without any prior knowledge about the system. We introduce an artificial neural network based on a model-independent, information-theoretic characterization of a real-space RG procedure, which performs this task. We apply the algorithm to classical statistical physics problems in one and two dimensions. We demonstrate RG flow and extract the Ising critical exponent. Our results demonstrate that machine-learning techniques can extract abstract physical concepts and consequently become an integral part of theory- and model-building.
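For orientation, the sketch below performs one conventional real-space RG (block-spin) step on a random 2D Ising configuration using a fixed 2x2 majority rule. The paper's point is to learn which degrees of freedom to keep via mutual information, so this hand-coded rule is only the classical baseline, not the authors' algorithm.

```python
# Hedged sketch: one real-space RG ("block-spin") step by 2x2 majority rule.
# The mutual-information network of the paper learns its coarse-graining;
# this fixed rule is only the textbook baseline it generalizes.
import numpy as np

rng = np.random.default_rng(42)
spins = rng.choice([-1, 1], size=(8, 8))   # random 8x8 Ising configuration

def block_spin(s: np.ndarray) -> np.ndarray:
    h, w = s.shape
    blocks = s.reshape(h // 2, 2, w // 2, 2).sum(axis=(1, 3))
    return np.where(blocks >= 0, 1, -1)    # majority vote (ties -> +1)

print(block_spin(spins))                   # 4x4 coarse-grained lattice
```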
Information Extraction from Unstructured Text for the Biodefense Knowledge Center
DOE Office of Scientific and Technical Information (OSTI.GOV)
Samatova, N F; Park, B; Krishnamurthy, R
2005-04-29
The Bio-Encyclopedia at the Biodefense Knowledge Center (BKC) is being constructed to allow an early detection of emerging biological threats to homeland security. It requires highly structured information extracted from a variety of data sources. However, the quantity of new and vital information available from everyday sources cannot be assimilated by hand, and therefore reliable high-throughput information extraction techniques are much anticipated. In support of the BKC, Lawrence Livermore National Laboratory and Oak Ridge National Laboratory, together with the University of Utah, are developing an information extraction system built around the bioterrorism domain. This paper reports two important pieces of our effort integrated in the system: key phrase extraction and semantic tagging. Whereas the two key phrase extraction technologies developed during the course of the project help identify relevant texts, our state-of-the-art semantic tagging system can pinpoint phrases related to emerging biological threats. Also we are enhancing and tailoring the Bio-Encyclopedia by augmenting semantic dictionaries and extracting details of important events, such as suspected disease outbreaks. Some of these technologies have already been applied to large corpora of free text sources vital to the BKC mission, including ProMED-mail, PubMed abstracts, and the DHS's Information Analysis and Infrastructure Protection (IAIP) news clippings. In order to address the challenges involved in incorporating such large amounts of unstructured text, the overall system is focused on precise extraction of the most relevant information for inclusion in the BKC.
Diagnostic/drug delivery "sense-respond" devices, systems, and uses thereof
DOE Office of Scientific and Technical Information (OSTI.GOV)
Polsky, Ronen; Miller, Philip Rocco; Edwards, Thayne L.
The present invention is directed to devices, systems, and methods for detecting and/or monitoring one or more markers in a sample. In particular, such devices integrate a plurality of hollow needles configured to extract or obtain a fluid sample from a subject, as well as transducers to detect a marker of interest.
APMS 3.0 Flight Analyst Guide: Aviation Performance Measuring System
NASA Technical Reports Server (NTRS)
Jay, Griff; Prothero, Gary; Romanowski, Timothy; Lynch, Robert; Lawrence, Robert; Rosenthal, Loren
2004-01-01
The Aviation Performance Measuring System (APMS) is a method, embodied in software, that uses mathematical algorithms and related procedures to analyze digital flight data extracted from aircraft flight data recorders. APMS consists of an integrated set of tools used to perform two primary functions: (a) flight data importation and (b) flight data analysis.
Factors Affecting the Capture Efficiency of a Fume Extraction Torch for Gas Metal Arc Welding.
Bonthoux, Francis
2016-07-01
Welding fumes are classified as Group 2B 'possibly carcinogenic', which prompts the implementation of local exhaust ventilation (LEV). The fume extraction torch, with LEV integrated into the tool, is the most attractive solution, but its capture efficiency is often disappointing in practice. This study assesses the main parameters affecting fume capture efficiency, namely the extraction flow rate, the positioning of the suction openings on the torch, the angle of inclination of the torch to the workpiece during welding, the metal transfer modes, and the welding deposition rate. The theoretical velocity induced by suction, estimated from the extraction flow rate and the position of the suction openings, is the main parameter affecting effectiveness of the device. This is the design parameter, and its value should never be <0.25 m s⁻¹. The angle of the torch relative to the workpiece also has a great deal of influence. To improve efficiency, work station layouts need to favour positions where the torch is held at angles closer to perpendicular (<15°). Welding with high deposition rates (>1.1 g s⁻¹) and spray transfer leads to low capture efficiency if induced velocities are <0.5 m s⁻¹. The results of the study can be used in the design of integrated on-torch extraction systems and provide information for fixing system objectives. © The Author 2016. Published by Oxford University Press on behalf of the British Occupational Hygiene Society.
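As a rough illustration of the design calculation implied above, estimating the suction-induced velocity from the extraction flow rate and the position of the suction openings, the sketch below uses the classical Dalla Valle approximation for an unflanged opening, V = Q/(10x² + A). The formula choice and all numbers are illustrative assumptions, not values from the paper.

```python
# Hedged sketch: capture-velocity estimate in front of an extraction opening.
# The Dalla Valle relation V = Q / (10*x**2 + A) is a standard industrial
# ventilation approximation, assumed here for illustration only.
def induced_velocity(flow_rate_m3s: float, distance_m: float, opening_area_m2: float) -> float:
    """Approximate centreline air velocity at `distance_m` from the opening."""
    return flow_rate_m3s / (10.0 * distance_m**2 + opening_area_m2)

q, a = 0.01, 1e-4                      # 10 L/s extraction, 1 cm^2 opening (made up)
for x_mm in (10, 20, 30):              # suction-to-arc distances to check
    v = induced_velocity(q, x_mm / 1000.0, a)
    print(f"x = {x_mm} mm -> v = {v:.2f} m/s, meets 0.25 m/s: {v >= 0.25}")
```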
NASA Astrophysics Data System (ADS)
Riera, Enrique; Blanco, Alfonso; García, José; Benedito, José; Mulet, Antonio; Gallego-Juárez, Juan A.; Blasco, Miguel
2010-01-01
Oil is an important component of almonds and other vegetable substrates and can influence human health. This work presents the development and validation of an innovative, robust, stable, reliable and efficient pilot-scale ultrasonic system to assist supercritical CO2 extraction of oils from different substrates. In the extraction procedure, ultrasonic energy provides an efficient way of producing deep agitation that enhances mass transfer through mechanisms such as radiation pressure, streaming, agitation and high-amplitude vibration. Earlier work had shown the feasibility of integrating an ultrasonic field inside a supercritical extractor without losing a significant volume fraction. This pioneering method accelerated mass transfer and thereby improved supercritical extraction times. To develop the new procedure commercially and fulfil industrial requirements, a new device configuration has been designed, implemented, tested and successfully validated for supercritical fluid extraction of oil from different vegetable substrates.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hughes, Richard; Tyagi, Mayank; Radonjic, Mileva
This project is intended to demonstrate the technical and economic feasibility, and environmental and social attractiveness of a novel method of heat extraction from geothermal reservoirs. The emphasis is on assessing the potential for a heat extraction method that couples forced and free convection to maximize extraction efficiency. The heat extraction concept is enhanced by considering wellbore energy conversion, which may include only a boiler for a working fluid, or perhaps a complete boiler, turbine, and condenser cycle within the wellbore. The feasibility of this system depends on maintaining mechanical and hydraulic integrity of the wellbore, so the material properties of the casing-cement system are examined both experimentally and with well design calculations. The attractiveness depends on mitigation of seismic and subsidence risks, economic performance, environmental impact, and social impact, all of which are assessed as components of this study.
DOE Office of Scientific and Technical Information (OSTI.GOV)
O'Neil, D.J.; Bery, M.K.; El-Barbary, I.A.
1979-01-01
In 1973 it was reported that the treatment of southern pine trees with the herbicide Paraquat could induce lightwood formation with very significant increases in the extractable oleoresin and turpentine fractions. The objectives of this project included the characterization of this phenomenon and the development of realistic qualitative and quantitative data on the extent of lightwood formation and the recovery of oleoresin and turpentine fractions. The principal objective was to determine whether the yields of oleoresinous products and turpentine justified a stand-alone, economic wood extraction process technology based on the utilization of whole- or complete-Paraquat-treated pine trees. The application of this technology was considered appropriate as a sub-system of an integrated chemical process system wherein ethanol, lignin (or hydrocarbon derivatives), and sugars would be manufactured as co-products. Alternatively, such extraction technology could be used as a pre-treatment operation prior to Kraft pulping. Yield results tended to be variable. Turpentine increases ranged from 2- to 4-fold on a merchantable bole basis, with increases at the site of injection as high as 12-fold. The turpentine content of Paraquat-treated trees, as well as the extractives content, decreased to normal background levels at about six feet above the wound site. Oleoresin content increases normally ranged from 2- to 3-fold, with a maximum total extractables content (or yield) of about 8% on a dry weight basis. Under current conditions, the phenomenon of lightwood formation in mature trees may best be exploited in pulp process plants.
Tag Extraction from Spatial Documents in Search Engines
NASA Astrophysics Data System (ADS)
Borhaninejad, S.; Hakimpour, F.; Hamzei, E.
2015-12-01
Nowadays, selective access to information on the Web is provided by search engines, but when the data include spatial information the search task becomes more complex and search engines require special capabilities. The purpose of this study is to extract the information that lies in spatial documents. To that end, we implement and evaluate information extraction from GML documents and a retrieval method in an integrated approach. Our proposed system consists of three components: crawler, database and user interface. In the crawler component, GML documents are discovered and their text is parsed for information extraction and storage. The database component is responsible for indexing the information collected by the crawlers. Finally, the user interface component provides the interaction between system and user. We have implemented this system as a pilot on an application server as a simulation of the Web. Our system, as a spatial search engine, provides searching capability throughout GML documents; an important step toward improving the efficiency of search engines has thus been taken.
Improving IUE High Dispersion Extraction
NASA Technical Reports Server (NTRS)
Lawton, Patricia J.; VanSteenberg, M. E.; Massa, D.
2007-01-01
We present a different method to extract high dispersion International Ultraviolet Explorer (IUE) spectra from the New Spectral Image Processing System (NEWSIPS) geometrically and photometrically corrected (SIHI) images of the echellogram. The new algorithm corrects many of the deficiencies that exist in the NEWSIPS high dispersion (SIHI) spectra. Specifically, it does a much better job of accounting for the overlap of the higher echelle orders, it eliminates a significant time dependency in the extracted spectra (which can be traced to the background model used in the NEWSIPS extractions), and it can extract spectra from echellogram images that are more highly distorted than the NEWSIPS extraction routines can handle. Together, these improvements yield a set of IUE high dispersion spectra whose scientific integrity is significantly better than the NEWSIPS products. This work has been supported by NASA ADP grants.
Gillespie, Peter J.; Gambus, Agnieszka; Blow, J. Julian
2012-01-01
The use of cell-free extracts prepared from eggs of the South African clawed toad, Xenopus laevis, has led to many important discoveries in cell cycle research. These egg extracts recapitulate the key nuclear transitions of the eukaryotic cell cycle in vitro under apparently the same controls that exist in vivo. DNA added to the extract is first assembled into a nucleus and is then efficiently replicated. Progression of the extract into mitosis then allows the separation of paired sister chromatids. The Xenopus cell-free system is therefore uniquely suited to the study of the mechanisms, dynamics and integration of cell cycle regulated processes at a biochemical level. In this article we describe methods currently in use in our laboratory for the preparation of Xenopus egg extracts and demembranated sperm nuclei for the study of DNA replication in vitro. We also detail how DNA replication can be quantified in this system. In addition, we describe methods for isolating chromatin and chromatin-bound protein complexes from egg extracts. These recently developed and revised techniques provide a practical starting point for investigating the function of proteins involved in DNA replication. PMID:22521908
Are galaxy distributions scale invariant? A perspective from dynamical systems theory
NASA Astrophysics Data System (ADS)
McCauley, J. L.
2002-06-01
Unless there is evidence for fractal scaling with a single exponent over distances 0.1 ≤ r ≤ 100 h⁻¹ Mpc, the widely accepted notion of scale invariance of the correlation integral for 0.1 ≤ r ≤ 10 h⁻¹ Mpc must be questioned. The attempt to extract a scaling exponent ν from the correlation integral n(r) by plotting log n(r) vs. log r is unreliable unless the underlying point set is approximately monofractal. The extraction of a spectrum of generalized dimensions ν_q from a plot of the correlation integral generating function G_n(q) by a similar procedure is probably an indication that G_n(q) does not scale at all. We explain these assertions after defining the term multifractal, mutually inconsistent definitions having been confused together in the cosmology literature. Part of this confusion is traced to the confusion in interpreting a measure-theoretic formula written down by Hentschel and Procaccia in the dynamical systems theory literature, while other errors follow from confusing together entirely different definitions of multifractal from two different schools of thought. Most important are serious errors in data analysis that follow from taking for granted a largest-term approximation that is inevitably advertised in the literature on both fractals and dynamical systems theory.
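For reference, the standard definitions behind the quantities discussed above can be stated as follows; the notation is the common dynamical-systems usage and is assumed rather than quoted from the paper.

```latex
% Correlation-integral scaling and generalized dimensions (standard forms).
\begin{align}
  n(r) &\sim r^{\nu}, &
  \nu &= \lim_{r \to 0} \frac{\log n(r)}{\log r}, \\
  G_n(q) &= \sum_i p_i^{\,q} \sim r^{(q-1)\nu_q}, &
  \nu_q &= \lim_{r \to 0} \frac{1}{q-1}\,\frac{\log G_n(q)}{\log r},
\end{align}
% where p_i is the measure of the i-th cell of size r; a monofractal has
% \nu_q = \nu for all q, while genuine multifractality requires G_n(q) to
% scale with a nontrivial spectrum \nu_q.
```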
NASA Technical Reports Server (NTRS)
Young, Steve; UijtdeHaag, Maarten; Campbell, Jacob
2004-01-01
To enable safe use of Synthetic Vision Systems at low altitudes, real-time range-to-terrain measurements may be required to ensure the integrity of terrain models stored in the system. This paper reviews and extends previous work describing the application of x-band radar to terrain model integrity monitoring. A method of terrain feature extraction and a transformation of the features to a common reference domain are proposed. Expected error distributions for the extracted features are required to establish appropriate thresholds whereby a consistency-checking function can trigger an alert. A calibration-based approach is presented that can be used to obtain these distributions. To verify the approach, NASA's DC-8 airborne science platform was used to collect data from two mapping sensors. An Airborne Laser Terrain Mapping (ALTM) sensor was installed in the cargo bay of the DC-8. After processing, the ALTM produced a reference terrain model with a vertical accuracy of less than one meter. Also installed was a commercial-off-the-shelf x-band radar in the nose radome of the DC-8. Although primarily designed to measure precipitation, the radar also provides estimates of terrain reflectivity at low altitudes. Using the ALTM data as the reference, errors in features extracted from the radar are estimated. A method to estimate errors in features extracted from the terrain model is also presented.
Elayavilli, Ravikumar Komandur; Liu, Hongfang
2016-01-01
Computational modeling of biological cascades is of great interest to quantitative biologists. Biomedical text has been a rich source of quantitative information. Gathering quantitative parameters and values from biomedical text is one significant challenge in the early steps of computational modeling, as it involves huge manual effort. While automatically extracting such quantitative information from biomedical text may offer some relief, the lack of an ontological representation for a subdomain impedes normalizing textual extractions to a standard representation. This may render textual extractions less meaningful to domain experts. In this work, we propose a rule-based approach to automatically extract relations involving quantitative data from biomedical text describing ion channel electrophysiology. We further translated the quantitative assertions extracted through text mining to a formal representation that may help in constructing an ontology for ion channel events, using a rule-based approach. We have developed the Ion Channel ElectroPhysiology Ontology (ICEPO) by integrating the information represented in closely related ontologies such as the Cell Physiology Ontology (CPO) and the Cardiac Electro Physiology Ontology (CPEO), and the knowledge provided by domain experts. The rule-based system achieved an overall F-measure of 68.93% in extracting quantitative data assertions on an independently annotated blind data set. We further made an initial attempt at formalizing the quantitative data assertions extracted from the biomedical text into a formal representation that offers potential to facilitate the integration of text mining into ontological workflow, a novel aspect of this study. This work is a case study in which we created a platform that provides formal interaction between ontology development and text mining. We have achieved partial success in extracting quantitative assertions from biomedical text and formalizing them in an ontological framework. The ICEPO ontology is available for download at http://openbionlp.org/mutd/supplementarydata/ICEPO/ICEPO.owl.
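A minimal sketch of this kind of rule-based quantitative extraction is given below; the single regular expression, the parameter names, and the example sentence are invented for illustration and are far simpler than the actual rule set.

```python
# Hedged sketch: rule-based extraction of quantitative electrophysiology
# assertions. One toy pattern stands in for the real linguistic rules.
import re

SENTENCE = ("The half-activation voltage of the channel was -32.5 mV "
            "and the time constant was 4.2 ms.")

# parameter name ... signed numeric value ... unit (mV, ms, pA, nM)
PATTERN = re.compile(
    r"(?P<param>half-activation voltage|time constant)\D+?"
    r"(?P<value>-?\d+(?:\.\d+)?)\s*(?P<unit>mV|ms|pA|nM)")

for m in PATTERN.finditer(SENTENCE):
    print(m.group("param"), "=", m.group("value"), m.group("unit"))
```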
Zou, Denglang; Zhu, Xuelin; Zhang, Fan; Du, Yurong; Ma, Jianbin; Jiang, Renwang
2018-01-31
This study presents an efficient strategy based on liquid-liquid extraction with a three-phase solvent system and high speed counter-current chromatography (HSCCC) for rapid enrichment and separation of epimers of minor bufadienolides from toad meat. The reflux extraction conditions were first optimized by response surface methodology, and a novel three-phase solvent system composed of n-hexane/methyl acetate/acetonitrile/water (3:6:5:5, v/v) was developed for liquid-liquid extraction of the crude extract. This integrative extraction process could efficiently enrich minor bufadienolides from the complex matrix and minimize the loss of minor targets caused by the repeated extraction with different organic solvents that occurs in classical liquid two-phase extraction. As a result, four epimers of minor bufadienolides were greatly enriched in the middle phase, and their total content was increased from 3.25% to 46.23%. The enriched four epimers were then successfully separated by HSCCC with a two-phase solvent system composed of chloroform/methanol/water (4:2:2, v/v). Furthermore, we tested the Na⁺,K⁺-ATPase (NKA) inhibitory effect of the four epimers. 3β-Isomers of bufadienolide showed stronger (>8-fold) inhibitory activity than 3α-isomers. The characterization of minor bufadienolides in toad meat and the significant difference in their inhibitory effect on NKA should promote further quantitative analysis and safety evaluation of toad meat as a food source.
Zhang, Dashan; Guo, Jie; Lei, Xiujun; Zhu, Changan
2016-04-22
The development of image sensors and optics enables the application of vision-based techniques to non-contact dynamic vibration analysis of large-scale structures. As an emerging technology, a vision-based approach allows for remote measurement and, compared with traditional contact measurements, adds no mass to the measured object. In this study, a high-speed vision-based sensor system is developed to extract structure vibration signals in real time. A fast motion extraction algorithm is required for this system because the maximum sampling frequency of the charge-coupled device (CCD) sensor can reach up to 1000 Hz. Two efficient subpixel-level motion extraction algorithms, namely the modified Taylor approximation refinement algorithm and the localization refinement algorithm, are integrated into the proposed vision sensor. Quantitative analysis shows that both modified algorithms are at least five times faster than conventional upsampled cross-correlation approaches and achieve satisfactory error performance. The practicability of the developed sensor is evaluated by an experiment in a laboratory environment and a field test. Experimental results indicate that the developed high-speed vision-based sensor system can extract accurate dynamic structure vibration signals by tracking either artificial targets or natural features.
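For context, the sketch below estimates a subpixel displacement between two frames with upsampled phase cross-correlation, the conventional baseline the abstract says its two modified algorithms outperform; it is not the authors' code, and the synthetic frames and shift are assumptions.

```python
# Hedged sketch: subpixel motion estimation via upsampled phase
# cross-correlation (the baseline approach, not the paper's algorithms).
import numpy as np
from scipy.ndimage import shift as nd_shift
from skimage.registration import phase_cross_correlation

rng = np.random.default_rng(1)
frame0 = rng.random((64, 64))
frame1 = nd_shift(frame0, (0.3, -0.7))       # synthetic 0.3 / -0.7 px motion

# upsample_factor=100 -> displacement resolved to 1/100 of a pixel
shift, error, _ = phase_cross_correlation(frame0, frame1, upsample_factor=100)
print(shift)   # approx. [-0.3, 0.7]: shift registering frame1 back to frame0
```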
Fusion of monocular cues to detect man-made structures in aerial imagery
NASA Technical Reports Server (NTRS)
Shufelt, Jefferey; Mckeown, David M.
1991-01-01
The extraction of buildings from aerial imagery is a complex problem for automated computer vision. It requires locating regions in a scene that possess properties distinguishing them as man-made objects as opposed to naturally occurring terrain features. It is reasonable to assume that no single detection method can correctly delineate or verify buildings in every scene. A cooperative-methods paradigm is useful in approaching the building extraction problem. Using this paradigm, each extraction technique provides information which can be added or assimilated into an overall interpretation of the scene. Thus, the main objective is to explore the development of a computer vision system that integrates the results of various scene analysis techniques into an accurate and robust interpretation of the underlying three-dimensional scene. The problem of building hypothesis fusion in aerial imagery is discussed. Building extraction techniques are briefly surveyed, including four building extraction, verification, and clustering systems. A method for fusing the symbolic data generated by these systems is described and applied to monocular image and stereo image data sets. Evaluation methods for the fusion results are described, and the fusion results are analyzed using these methods.
Pérez-Portela, Rocío; Riesgo, Ana
2013-09-01
Transcriptomic information provides fundamental insights into biological processes. Extraction of quality RNA is a challenging step, and preservation and extraction protocols need to be adjusted in many cases. Our objectives were to optimize preservation protocols for isolation of high-quality RNA from diverse echinoderm tissues and to compare the utility of parameters such as absorbance ratios and RIN values for assessing RNA quality. Three different tissues (gonad, oesophagus and coelomocytes) were selected from the sea urchin Arbacia lixula. Solid tissues were flash-frozen and stored at -80 °C until processed. Four preservation treatments were applied to coelomocytes: flash freezing and storage at -80 °C, RNAlater and storage at -20 °C, preservation in TRIzol reagent and storage at -80 °C, and direct extraction with TRIzol from fresh cells. Extractions of total RNA were performed with a modified TRIzol protocol for all tissues. Our results showed high RNA quantity and quality for all tissues, with nonsignificant differences among them. However, while flash freezing was effective for solid tissues, it was inadequate for coelomocytes because of the low quality of the RNA extractions. Coelomocytes preserved in RNAlater displayed large variability in RNA integrity and insufficient RNA amounts for further isolation of mRNA. TRIzol was the most efficient system for stabilizing RNA, resulting in high RNA quality and quantity. We did not detect a correlation between absorbance ratios and RNA integrity. The best strategies for assessing RNA integrity were visualization of the 18S rRNA and 28S rRNA bands in agarose gels and estimation of RIN values with Agilent Bioanalyzer chips. © 2013 John Wiley & Sons Ltd.
Managing interoperability and complexity in health systems.
Bouamrane, M-M; Tao, C; Sarkar, I N
2015-01-01
In recent years, we have witnessed substantial progress in the use of clinical informatics systems to support clinicians during episodes of care, manage specialised domain knowledge, perform complex clinical data analysis and improve the management of health organisations' resources. However, the vision of fully integrated health information eco-systems, which provide relevant information and useful knowledge at the point-of-care, remains elusive. This journal Focus Theme reviews some of the enduring challenges of interoperability and complexity in clinical informatics systems. Furthermore, a range of approaches are proposed in order to address, harness and resolve some of the many remaining issues towards a greater integration of health information systems and extraction of useful or new knowledge from heterogeneous electronic data repositories.
Kim, Yong Tae; Lee, Dohwan; Heo, Hyun Young; Sim, Jeong Eun; Woo, Kwang Man; Kim, Do Hyun; Im, Sung Gap; Seo, Tae Seok
2016-04-15
A fully integrated slidable and valveless microsystem, which performs solid phase DNA extraction (SPE), micro-polymerase chain reaction (μPCR) and micro-capillary electrophoresis (μCE) coupled with a portable genetic analyser, has been developed for forensic genotyping. The use of a slidable chip, in which a 1 μL-volume PCR chamber is patterned at the center, eliminates the need for microvalves and tubing systems for fluidic control. The functional micro-units for SPE, μPCR, and μCE were fabricated on a single glass wafer by conventional photolithography, and the integrated microdevice consists of three layers: from top to bottom, a slidable chip; a channel wafer in which an SPE chamber, a mixing microchannel, and a CE microchannel were fabricated; and a Ti/Pt resistance temperature detector (RTD) wafer. The channel glass wafer and the RTD glass wafer were thermally bonded, and the slidable chip was placed on the designated functional unit. The entire process, from DNA extraction using a whole human blood sample to identification of target Y chromosomal short tandem repeat (STR) loci, was carried out serially by simply sliding the slidable chip from one functional unit to another. Monoplex and multiplex detection of amelogenin and mini Y STR loci was successfully performed on the integrated slidable SPE-μPCR-μCE microdevice using 1 μL of whole human blood within 60 min. The proposed genetic analysis microsystem is capable of point-of-care DNA testing with sample-in-answer-out capability and, more importantly, without the use of complicated microvalves and microtubing systems for liquid transfer. Copyright © 2015 Elsevier B.V. All rights reserved.
Extractables analysis of single-use flexible plastic biocontainers.
Marghitoiu, Liliana; Liu, Jian; Lee, Hans; Perez, Lourdes; Fujimori, Kiyoshi; Ronk, Michael; Hammond, Matthew R; Nunn, Heather; Lower, Asher; Rogers, Gary; Nashed-Samuel, Yasser
2015-01-01
Studies of the extractable profiles of bioprocessing components have become an integral part of drug development efforts to minimize possible compromise in process performance, decrease in drug product quality, and potential safety risk to patients due to the possibility of small molecules leaching out from the components. In this study, an effective extraction solvent system was developed to evaluate the organic extractable profiles of single-use bioprocess equipment, which has been gaining popularity in the biopharmaceutical industry because of its many advantages over traditional stainless steel-based bioreactors and other fluid mixing and storage vessels. The chosen extraction conditions were intended to represent aggressive conditions relative to the application of single-use bags in biopharmaceutical manufacture, in which aqueous-based systems are largely utilized. Those extraction conditions, along with a non-targeted analytical strategy, allowed for the generation and identification of an array of extractable compounds; a total of 53 organic compounds were identified from four types of commercially available single-use bags, the majority of which are degradation products of polymer additives. The success of this overall extractables analysis strategy was reflected partially by the effectiveness in the extraction and identification of a compound that was later found to be highly detrimental to mammalian cell growth. The usage of single-use bioreactors has been increasing in the biopharmaceutical industry because of the appealing advantages they promise regarding cleaning, sterilization, operational flexibility, and so on, during manufacturing of biologics. However, compared with conventional counterparts based mainly on stainless steel, single-use bioreactors are more susceptible to potential problems associated with compounds leaching into the bioprocessing fluid. As a result, extractables profiling of single-use systems has become essential in qualifying such systems for use in drug manufacturing. The aim of this study is to evaluate the effectiveness of an extraction solvent system developed to study the extractables profile of single-use bioreactors in which aqueous-based systems are largely used. The results showed that, with a non-targeted analytical approach, the extraction solvent allowed the generation and identification of an array of extractable compounds from four commercially available single-use bioreactors. Most of the extractables are degradation products of polymer additives, among which was a compound later found to be highly detrimental to mammalian cell growth. © PDA, Inc. 2015.
NASA Technical Reports Server (NTRS)
Young, Steven D.; Harrah, Steven D.; deHaag, Maarten Uijt
2002-01-01
Terrain Awareness and Warning Systems (TAWS) and Synthetic Vision Systems (SVS) provide pilots with displays of stored geo-spatial data (e.g. terrain, obstacles, and/or features). As comprehensive validation is impractical, these databases typically have no quantifiable level of integrity. This lack of a quantifiable integrity level is one of the constraints that has limited certification and operational approval of TAWS/SVS to "advisory-only" systems for civil aviation. Previous work demonstrated the feasibility of using a real-time monitor to bound database integrity by using downward-looking remote sensing technology (i.e. radar altimeters). This paper describes an extension of the integrity monitor concept to include a forward-looking sensor to cover additional classes of terrain database faults and to reduce the exposure time associated with integrity threats. An operational concept is presented that combines established feature extraction techniques with a statistical assessment of similarity measures between the sensed and stored features using principles from classical detection theory. Finally, an implementation is presented that uses existing commercial-off-the-shelf weather radar sensor technology.
Integration of In-Situ Resource Utilization Into Lunar/Mars Exploration Through Field Analogs
NASA Technical Reports Server (NTRS)
Sanders, Gerald B.; Larson, William E.
2010-01-01
The NASA project to develop In-Situ Resource Utilization (ISRU) technologies, in partnership with commercial and international collaborators, has achieved full system demonstrations of oxygen production using native regolith simulants. These demonstrations included robotic extraction of material from the terrain; sealed encapsulation of material in a pressurized reactor; chemical extraction of oxygen from the material in the form of water; and the electrolysis of water into oxygen and hydrogen for storage and reuse. These successes have provided growing confidence in the prospects of ISRU oxygen production as a credible source for critical mission consumables in preparation for and during crewed missions to the moon and other destinations. Other ISRU processes, especially relevant to early lunar exploration scenarios, have also been shown to be practical, including the extraction of subsurface volatiles, especially water, and the thermal processing of surface materials for civil engineering uses and for thermal energy storage. This paper describes these recent achievements and current NASA ISRU development and demonstration activity. The ability to extract and process resources at the site of exploration into useful products such as propellants, life support and power system consumables, and radiation and rocket exhaust plume debris shielding, known as In-Situ Resource Utilization or ISRU, has the potential to significantly reduce the launch mass, risk, and cost of robotic and human exploration of space. The incorporation of ISRU into missions can also significantly influence technology selection and system development in other areas such as power, life support, and propulsion. For example, the ability to extract or produce large amounts of oxygen and/or water in-situ could minimize the need to completely close life support air and water processing cycles, change thermal and radiation protection of habitats, and influence propellant selection for ascent vehicles and surface propulsive hoppers. While concepts and even laboratory work on evaluating and developing ISRU techniques, such as oxygen extraction from lunar regolith, have been underway since before the Apollo 11 Moon landing, no ISRU system has ever flown in space, and only recently have ISRU technologies been developed at a scale and at a system level relevant to actual robotic and human mission applications. Because ISRU hardware and systems have never been demonstrated or utilized on robotic or human missions, architecture and mission planners and surface system hardware developers are hesitant to rely on ISRU products and services that are critical to mission and system implementation success. To build confidence in ISRU systems for future missions and to assess how ISRU systems can best influence and integrate with other surface system elements, NASA, with international partners, is performing analog field tests to understand how to take advantage of ISRU capabilities and benefits with the minimum of risk associated with introducing this game-changing approach to exploration. This paper will describe and review the results of four analog field tests (Moses Lake in 6/08, Mauna Kea in 11/08, Flagstaff in 9/09, and Mauna Kea in 1/10) that have begun the process of integrating ISRU into robotic and human exploration systems and missions, and propose future ISRU-related analog field test activities that can be performed in collaboration with international space agencies.
Decentralized Multisensory Information Integration in Neural Systems.
Zhang, Wen-Hao; Chen, Aihua; Rasch, Malte J; Wu, Si
2016-01-13
How multiple sensory cues are integrated in neural circuitry remains a challenge. The common hypothesis is that information integration might be accomplished in a dedicated multisensory integration area receiving feedforward inputs from the modalities. However, recent experimental evidence suggests that it is not a single multisensory brain area, but rather many multisensory brain areas that are simultaneously involved in the integration of information. Why many mutually connected areas should be needed for information integration is puzzling. Here, we investigated theoretically how information integration could be achieved in a distributed fashion within a network of interconnected multisensory areas. Using biologically realistic neural network models, we developed a decentralized information integration system that comprises multiple interconnected integration areas. Studying an example of combining visual and vestibular cues to infer heading direction, we show that such a decentralized system is in good agreement with anatomical evidence and experimental observations. In particular, we show that this decentralized system can integrate information optimally. The decentralized system predicts that optimally integrated information should emerge locally from the dynamics of the communication between brain areas and sheds new light on the interpretation of the connectivity between multisensory brain areas. To extract information reliably from ambiguous environments, the brain integrates multiple sensory cues, which provide different aspects of information about the same entity of interest. Here, we propose a decentralized architecture for multisensory integration. In such a system, no processor is in the center of the network topology and information integration is achieved in a distributed manner through reciprocally connected local processors. Through studying the inference of heading direction with visual and vestibular cues, we show that the decentralized system can integrate information optimally, with the reciprocal connections between processors determining the extent of cue integration. Our model reproduces known multisensory integration behaviors observed in experiments and sheds new light on our understanding of how information is integrated in the brain. Copyright © 2016 Zhang et al.
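A minimal sketch of the benchmark these models are compared against, reliability-weighted (statistically optimal) combination of two cues, is shown below; the numbers are invented for illustration.

```python
# Hedged sketch: optimal (reliability-weighted) integration of two cues.
# This is the standard normative benchmark, not the network model itself.
visual, vestibular = 30.0, 40.0        # single-cue heading estimates (deg)
var_vis, var_vest = 4.0, 16.0          # single-cue variances (1/reliability)

w_vis = (1 / var_vis) / (1 / var_vis + 1 / var_vest)
combined = w_vis * visual + (1 - w_vis) * vestibular
combined_var = 1 / (1 / var_vis + 1 / var_vest)
print(combined, combined_var)          # 32.0 deg; variance 3.2 < min(4, 16)
```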
A new diagnostic method of bolt loosening detection for thermal protection systems
NASA Astrophysics Data System (ADS)
Xie, Weihua; Meng, Songhe; Han, Jiecai; Du, Shanyi; Zhang, Boming; Yu, Dong
2009-07-01
Research and development efforts are underway to provide structural health monitoring systems to ensure the integrity of thermal protection systems (TPS). In this paper, an improved analytical method is proposed to assess the fastener integrity of a bolted structure. A new unsymmetrical washer was designed and fabricated, taking full advantage of piezoelectric ceramics (PZT) to act as both actuators and sensors, and using energy as the only extracted feature to identify abnormality. This diagnostic method is not restricted by the materials of the bracket, panel and base structure of the TPS whose condition is under inspection. A series of experiments on a metallic honeycomb sandwich panel demonstrated the capability of detecting bolt loosening on the TPS structure. Studies showed that this method can be used not only to rapidly identify the location of loosened bolts, but also to estimate their torque level. Since energy is the only extracted feature used to detect bolt loosening, the diagnostic process becomes simple and swift without sacrificing the accuracy of the results.
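The sketch below illustrates the energy-only feature idea in its simplest form: compute the energy of a sensor record and flag a drop relative to a baseline. The signals, the 0.7 threshold, and the loosening model are invented assumptions, not the paper's calibration.

```python
# Hedged sketch: signal energy as the single damage-sensitive feature.
# Synthetic signals stand in for PZT sensor records; the threshold is made up.
import numpy as np

def signal_energy(x: np.ndarray) -> float:
    """Energy of a sensor record: sum of squared samples."""
    return float(np.sum(np.square(x)))

t = np.linspace(0, 20 * np.pi, 2000)
baseline = np.sin(t)                   # record with the bolt tight
loosened = 0.6 * np.sin(t)             # weaker transfer after loosening

e0, e1 = signal_energy(baseline), signal_energy(loosened)
print(e0, e1, "loosening suspected" if e1 < 0.7 * e0 else "ok")
```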
Integrated system for automated financial document processing
NASA Astrophysics Data System (ADS)
Hassanein, Khaled S.; Wesolkowski, Slawo; Higgins, Ray; Crabtree, Ralph; Peng, Antai
1997-02-01
A system was developed that integrates intelligent document analysis with multiple character/numeral recognition engines in order to achieve high accuracy automated financial document processing. In this system, images are accepted in both their grayscale and binary formats. A document analysis module starts by extracting essential features from the document to help identify its type (e.g. personal check, business check, etc.). These features are also utilized to conduct a full analysis of the image to determine the location of interesting zones such as the courtesy amount and the legal amount. These fields are then made available to several recognition knowledge sources such as courtesy amount recognition engines and legal amount recognition engines through a blackboard architecture. This architecture allows all the available knowledge sources to contribute incrementally and opportunistically to the solution of the given recognition query. Performance results on a test set of machine printed business checks using the integrated system are also reported.
Electronic processing of informed consents in a global pharmaceutical company environment.
Vishnyakova, Dina; Gobeill, Julien; Oezdemir-Zaech, Fatma; Kreim, Olivier; Vachon, Therese; Clade, Thierry; Haenning, Xavier; Mikhailov, Dmitri; Ruch, Patrick
2014-01-01
We present an electronic capture tool to process informed consents, which must be recorded when running a clinical trial. This tool aims at extracting information expressing the duration of the consent given by the patient to authorize the exploitation of biomarker-related information collected during clinical trials. The system integrates a language detection module (LDM) to route a document to the appropriate information extraction module (IEM). The IEM is based on language-specific sets of linguistic rules for the identification of relevant textual facts. The achieved accuracy of both the LDM and IEM is 99%. The architecture of the system is described in detail.
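A minimal sketch of the LDM-to-IEM routing described above follows; the tiny stopword heuristic, the rule placeholder, and the example sentence are invented stand-ins for the real modules.

```python
# Hedged sketch: detect the document language (LDM), then dispatch to a
# language-specific extraction rule set (IEM). Toy heuristic, not the system.
STOPWORDS = {
    "en": {"the", "and", "consent", "of"},
    "fr": {"le", "et", "consentement", "de"},
}

def detect_language(text: str) -> str:
    words = set(text.lower().split())
    return max(STOPWORDS, key=lambda lang: len(words & STOPWORDS[lang]))

def extract_duration(text: str, lang: str) -> str:
    # placeholder for the language-specific linguistic rules (IEM)
    return f"apply {lang} rules to: {text[:40]}..."

doc = "The patient grants consent for the use of biomarker data for 15 years."
print(extract_duration(doc, detect_language(doc)))
```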
Simulating the Generalized Gibbs Ensemble (GGE): A Hilbert space Monte Carlo approach
NASA Astrophysics Data System (ADS)
Alba, Vincenzo
By combining classical Monte Carlo and Bethe ansatz techniques we devise a numerical method to construct the Truncated Generalized Gibbs Ensemble (TGGE) for the spin-1/2 isotropic Heisenberg (XXX) chain. The key idea is to sample the Hilbert space of the model with the appropriate GGE probability measure. The method can be extended to other integrable systems, such as the Lieb-Liniger model. We benchmark the approach focusing on GGE expectation values of several local observables. As finite-size effects decay exponentially with system size, moderately large chains are sufficient to extract thermodynamic quantities. The Monte Carlo results are in agreement with both the Thermodynamic Bethe Ansatz (TBA) and the Quantum Transfer Matrix approach (QTM). Remarkably, it is possible to extract in a simple way the steady-state Bethe-Gaudin-Takahashi (BGT) root distributions, which encode complete information about the GGE expectation values in the thermodynamic limit. Finally, it is straightforward to simulate extensions of the GGE in which, besides the local integrals of motion (the local charges), one includes arbitrary functions of the BGT roots. As an example, we include in the GGE the first non-trivial quasi-local integral of motion.
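As a much simpler stand-in for the Bethe-ansatz construction above, the toy below samples a GGE for a free-fermion chain, where the conserved charges are the mode occupations and the ensemble factorizes; the Lagrange multipliers are made up, and no Metropolis step is needed in this trivial case.

```python
# Hedged toy: GGE sampling for a free-fermion chain. Each mode occupation
# n_k is a conserved charge, so p(n_k = 1) = 1 / (1 + exp(lambda_k)) exactly.
# This factorized case only illustrates the idea of sampling the Hilbert
# space with a GGE measure; the XXX chain of the paper is far richer.
import numpy as np

rng = np.random.default_rng(7)
L = 1000
k = 2 * np.pi * np.arange(L) / L
lam = 2.0 * np.cos(k)                     # made-up Lagrange multipliers

p_occ = 1.0 / (1.0 + np.exp(lam))         # exact GGE occupation per mode
samples = rng.random((200, L)) < p_occ    # 200 Hilbert-space samples

# Monte Carlo estimate of total particle number vs. its exact GGE value
print(samples.mean(axis=0).sum(), p_occ.sum())
```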
An integratable microfluidic cartridge for forensic swab samples lysis.
Yang, Jianing; Brooks, Carla; Estes, Matthew D; Hurth, Cedric M; Zenhausern, Frederic
2014-01-01
Fully automated rapid forensic DNA analysis requires integrating several multistep processes onto a single microfluidic platform, including substrate lysis, extraction of DNA from the released lysate solution, multiplexed PCR amplification of STR loci, separation of PCR products by capillary electrophoresis, and analysis for allelic peak calling. Over the past several years, most rapid DNA analysis systems started with the reference swab sample lysate and involved off-chip lysis of collected substrates. As a result of advances in technology and chemistry, a microfluidic module for swab sample lysis has been added in a few rapid DNA analysis systems. However, recent reports on integrated rapid DNA analysis systems with swab-in and answer-out capability lack any quantitative and qualitative characterization of the swab-in sample lysis module, which is important for downstream forensic sample processing. Maximal collection and subsequent recovery of biological material from the crime scene is one of the first and most critical steps in forensic DNA technology. Herein we present the design, fabrication and characterization of an integratable swab lysis cartridge module and the test results obtained from different types of commonly used forensic swab samples, including buccal, saliva, and blood swab samples, demonstrating compatibility with different downstream DNA extraction chemistries. This swab lysis cartridge module is easy to operate, compatible with both forensic and microfluidic requirements, and ready to be integrated with our existing automated rapid forensic DNA analysis system. Following the characterization of the swab lysis module, an integrated run from buccal swab sample-in to microchip CE electropherogram-out was demonstrated on the integrated prototype instrument. Therefore, in this study, we demonstrate that this swab lysis cartridge module is: (1) functionally comparable with routine benchtop lysis, (2) compatible with various types of swab samples and chemistries, and (3) integratable into a micro total analysis system (μTAS) for rapid DNA analysis. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
Development of Availability and Sustainability Spares Optimization Models for Aircraft Reparables
2013-09-01
The spares optimization models interface with the integrated SAP® Enterprise Resource Planning (ERP) information system of the RSAF, and a more in-depth review of OPUS10 capabilities is provided. In particular, the propulsion sub-system was expanded to include SSRUs, and spares information was extracted from the RSAF ERP system.
Kopanitsa, Georgy
2017-05-18
The efficiency and acceptance of clinical decision support systems (CDSS) can increase if they reuse medical data captured during health care delivery. The high heterogeneity of existing legacy data formats has become the main barrier to the reuse of data. Thus, we need to apply data modeling mechanisms that provide standardization, transformation, accumulation and querying of medical data to allow its reuse. In this paper, we focus on the interoperability issues of hospital information system (HIS) and CDSS data integration. Our study is based on the approach proposed by Marcos et al., where archetypes are used as a standardized mechanism for the interaction of a CDSS with an electronic health record (EHR). We built an integration tool that enables CDSSs to collect data from various institutions without requiring modifications to the implementation. The approach implies development of a conceptual level as a set of archetypes representing the concepts required by a CDSS. Treatment case data from the Regional Clinical Hospital in Tomsk, Russia was extracted, transformed and loaded into the archetype database of a clinical decision support system. Test record normalization was performed by defining transformation and aggregation rules between the EHR data and the archetypes. These mapping rules were used to automatically generate openEHR-compliant data. After the transformation, archetype data instances were loaded into the CDSS's archetype-based data storage. Extraction times were acceptable, with a mean of 17.428 s per year of data (3436 case records). Transformation times were also acceptable, at 136.954 s per year (0.039 s per instance). The accuracy evaluation showed the correctness and applicability of the method for a wide range of HISes. These operations were performed without interrupting the HIS workflow, so that service provision to users was not disturbed. The project results have proven that archetype-based technologies are mature enough to be applied in routine operations that require extracting, transforming, loading and querying medical data from heterogeneous EHR systems. Inference models in clinical research and CDSSs can benefit from this by defining queries against a valid data set with known structure and constraints. The standards-based nature of the archetype approach allows easy integration of CDSSs with existing EHR systems.
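A minimal sketch of one extract-transform step in the spirit of the mapping rules described above is given below; the legacy field names and the simplified archetype paths are invented for illustration and are far coarser than real openEHR structures.

```python
# Hedged sketch: mapping a legacy EHR record onto an archetype-shaped
# structure. Field names and flattened paths are illustrative assumptions.
legacy_record = {"sys_bp": "142", "dia_bp": "91", "unit": "mm[Hg]"}

MAPPING_RULES = {  # legacy field -> simplified archetype path
    "sys_bp": "openEHR-EHR-OBSERVATION.blood_pressure.v2/systolic",
    "dia_bp": "openEHR-EHR-OBSERVATION.blood_pressure.v2/diastolic",
}

def transform(record: dict) -> dict:
    out = {}
    for src, path in MAPPING_RULES.items():
        out[path] = {"magnitude": float(record[src]), "units": record["unit"]}
    return out

print(transform(legacy_record))
```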
The TIGER system: a Census Bureau innovation serving data analysts.
Carbaugh, L W; Marx, R W
1990-01-01
This article describes the U.S. Census Bureau's TIGER (Topologically Integrated Geographic Encoding and Referencing) system, an automated geographic data base. The emphasis is on the availability of file extracts and their usefulness to data analysts. In addition to describing the available files, it mentions various applications for the data, explains the data limitations, and notes problems encountered to date.
An integrated telemedicine platform for the assessment of affective physiological states
Katsis, Christos D; Ganiatsas, George; Fotiadis, Dimitrios I
2006-01-01
AUBADE is an integrated platform built for the affective assessment of individuals. The system evaluates emotional state by classifying vectors of features extracted from facial electromyogram, respiration, electrodermal activity and electrocardiogram signals. The AUBADE system consists of: (a) a multisensorial wearable, (b) a data acquisition and wireless communication module, (c) a feature extraction module, (d) a 3D facial animation module used to project the obtained data onto a generic 3D face model, allowing the end-user to view the subject's facial expression in real time, (e) an intelligent emotion recognition module, and (f) the AUBADE databases, where the acquired signals and the subject's animation videos are saved. The system is designed to be applied to human subjects operating under extreme stress conditions, in particular car racing drivers, and also to patients suffering from neurological and psychological disorders. AUBADE's classification accuracy into five predefined emotional classes (high stress, low stress, disappointment, euphoria and neutral face) is 86.0%. The pilot system applications and components are being tested and evaluated on Maserati's car racing drivers. PMID:16879757
A Compact VLSI System for Bio-Inspired Visual Motion Estimation.
Shi, Cong; Luo, Gang
2018-04-01
This paper proposes a bio-inspired visual motion estimation algorithm based on motion energy, along with its compact very-large-scale integration (VLSI) architecture using low-cost embedded systems. The algorithm mimics motion perception functions of retina, V1, and MT neurons in a primate visual system. It involves operations of ternary edge extraction, spatiotemporal filtering, motion energy extraction, and velocity integration. Moreover, we propose the concept of confidence map to indicate the reliability of estimation results on each probing location. Our algorithm involves only additions and multiplications during runtime, which is suitable for low-cost hardware implementation. The proposed VLSI architecture employs multiple (frame, pixel, and operation) levels of pipeline and massively parallel processing arrays to boost the system performance. The array unit circuits are optimized to minimize hardware resource consumption. We have prototyped the proposed architecture on a low-cost field-programmable gate array platform (Zynq 7020) running at 53-MHz clock frequency. It achieved 30-frame/s real-time performance for velocity estimation on 160 × 120 probing locations. A comprehensive evaluation experiment showed that the estimated velocity by our prototype has relatively small errors (average endpoint error < 0.5 pixel and angular error < 10°) for most motion cases.
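The motion-energy stage the abstract describes follows the classic opponent-energy scheme; a toy numpy sketch (not the authors' fixed-point VLSI pipeline) is given below. The 1D-space-plus-time simplification and all filter parameters are assumptions for illustration.

```python
# Toy 1D motion-energy estimator in the spirit of the V1/MT stages
# described above: quadrature Gabor pairs oriented in space-time give
# an opponent energy whose sign indicates motion direction.
import numpy as np

def gabor_xt(x, t, fx, ft, phase):
    """Space-time Gabor: oriented sinusoid under a Gaussian envelope."""
    env = np.exp(-(x**2) / 2.0 - (t**2) / 2.0)
    return env * np.cos(2*np.pi*(fx*x + ft*t) + phase)

x = np.linspace(-2, 2, 15)
t = np.linspace(-2, 2, 15)
X, T = np.meshgrid(x, t, indexing="ij")

def motion_energy(patch, fx=0.5, ft=0.5):
    """Opponent energy: positive -> rightward, negative -> leftward."""
    out = {}
    for label, d in (("right", +1), ("left", -1)):
        # A rightward-tuned filter has phase fx*x - ft*t (drifts with +x).
        even = np.sum(patch * gabor_xt(X, -d*T, fx, ft, 0.0))
        odd  = np.sum(patch * gabor_xt(X, -d*T, fx, ft, np.pi/2))
        out[label] = even**2 + odd**2            # quadrature energy
    return out["right"] - out["left"]

# A rightward-drifting grating as x-t input yields positive energy.
patch = np.cos(2*np.pi*0.5*(X - T))
print(motion_energy(patch))   # > 0 for rightward drift
```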
NASA Astrophysics Data System (ADS)
Tsumori, K.; Takeiri, Y.; Ikeda, K.; Nakano, H.; Geng, S.; Kisaki, M.; Nagaoka, K.; Tokuzawa, T.; Wada, M.; Sasaki, K.; Nishiyama, S.; Goto, M.; Osakabe, M.
2017-08-01
A total power of 16 MW has been successfully delivered to the plasma confined in the Large Helical Device (LHD) from three Neutral Beam Injectors (NBIs) equipped with negative hydrogen (H-) ion sources. However, the detailed mechanisms from production through extraction of H- ions are yet to be clarified, and a similar-size ion source on an independent acceleration test bench, called the Research and development Negative Ion Source (RNIS), serves as the facility to study the physics of H- production and transport for further improvement of NBI. The production of negative-ion-rich plasma and the behavior of H- ions in the beam extraction region of RNIS are being investigated with an integrated diagnostic system. Flow patterns of electrons, positive ions and H- ions in the extraction region are described in a two-dimensional map. The measured flow patterns indicate the existence of a stagnation region, where the H- flow changes direction at a distance of about 20 mm from the plasma grid. The patterns also suggest that H- flow originating from the plasma grid (PG) surface turns back toward the extraction apertures. The turning region appears to be formed by a layer of combined magnetic field produced by the magnetic filter field and the Electron-Deflection Magnetic (EDM) field created by magnets installed in the extraction electrode.
Maranduba, Henrique Leonardo; Robra, Sabine; Nascimento, Iracema Andrade; da Cruz, Rosenira Serpa; Rodrigues, Luciano Brito; de Almeida Neto, José Adolfo
2015-10-01
Despite the environmental benefits of algal biofuels, the energy-intensive systems for producing microalgae feedstock may result in high GHG emissions. To reduce energy costs, this research analyzed the biodiesel production system via the dry route, based on Chlorella vulgaris cultivated in raceways, by comparing the GHG footprints of diverse microalgae-biodiesel scenarios. These involved: the single system of biomass production (C0); the application of pyrolysis to the residual microalgal biomass (cake) from the oil extraction process (C1); the same as C0, with the anaerobic cake co-digested with cattle manure (C2); and the same conditions as in C1 and C2, integrating in both cases (respectively C3 and C4) the microalgae cultivation with an autonomous ethanol distillery. The reduction of GHG emissions in the scenarios without such integration (C1 and C2), compared to C0, was insignificant (0.53% and 4.67%, respectively), whereas in the scenarios with integration with the ethanol production system, the improvements were 53.57% for C3 and 63.84% for C4. Copyright © 2015 Elsevier Ltd. All rights reserved.
Passive Sensor Integration for Vehicle Self-Localization in Urban Traffic Environment †
Gu, Yanlei; Hsu, Li-Ta; Kamijo, Shunsuke
2015-01-01
This research proposes an accurate vehicular positioning system which can achieve lane-level performance in urban canyons. Multiple passive sensors, including Global Navigation Satellite System (GNSS) receivers, onboard cameras and inertial sensors, are integrated in the proposed system. As the main source for localization, the GNSS technique suffers from Non-Line-Of-Sight (NLOS) propagation and multipath effects in urban canyons. This paper proposes to employ a novel GNSS positioning technique in the integration, which reduces the multipath and NLOS effects by using a 3D building map. In addition, the inertial sensor can describe the vehicle motion, but it has a drift problem that grows over time. This paper develops vision-based lane detection, which is first used to control the drift of the inertial sensor. Moreover, lane keeping and changing behaviors are extracted from the lane detection function and further reduce the lateral positioning error of the proposed localization system. We evaluate the integrated localization system in a challenging urban city scenario. The experiments demonstrate that the proposed method achieves sub-meter accuracy with respect to mean positioning error. PMID:26633420
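A minimal sketch of the drift-correction idea follows, assuming a scalar lateral-offset state, inertial dead-reckoning with a bias, and intermittent lane-detection fixes; the paper's full GNSS/3D-building-map integration is considerably richer.

```python
# Scalar Kalman-filter sketch of vision-aided drift correction:
# propagate lateral offset from the (biased) inertial rate, and correct
# whenever the camera reports a lane offset. Parameters are illustrative.
import numpy as np

def fuse(lateral_rates, lane_measurements, dt=0.1, q=0.05, r=0.2):
    """lateral_rates: inertial lateral velocity (m/s), possibly biased.
    lane_measurements: camera lane offset (m), or None when no lane seen."""
    x, p = 0.0, 1.0                      # state estimate and variance
    track = []
    for rate, z in zip(lateral_rates, lane_measurements):
        x, p = x + rate * dt, p + q      # predict from inertial sensor
        if z is not None:                # correct when a lane is detected
            k = p / (p + r)              # Kalman gain
            x, p = x + k * (z - x), (1 - k) * p
        track.append(x)
    return np.array(track)

true_rate = np.zeros(50)                               # vehicle holds its lane
biased = true_rate + 0.3                               # inertial drift bias
lanes = [0.0 if i % 5 == 0 else None for i in range(50)]  # sparse vision fixes
print(fuse(biased, lanes)[-1])     # stays near 0 despite the drifting sensor
```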
Zheng, Shuai; Ghasemzadeh, Nima; Hayek, Salim S; Quyyumi, Arshed A
2017-01-01
Background Extracting structured data from narrated medical reports is challenged by the complexity of heterogeneous structures and vocabularies and often requires significant manual effort. Traditional machine-based approaches lack the capability to take user feedback for improving the extraction algorithm in real time. Objective Our goal was to provide a generic information extraction framework that can support diverse clinical reports and enables a dynamic interaction between a human and a machine that produces highly accurate results. Methods A clinical information extraction system, IDEAL-X, has been built on top of online machine learning. It processes one document at a time, and user interactions are recorded as feedback to update the learning model in real time. The updated model is used to predict values for extraction in subsequent documents. Once prediction accuracy reaches a user-acceptable threshold, the remaining documents may be batch processed. A customizable controlled vocabulary may be used to support extraction. Results Three datasets were used for experiments based on report styles: 100 cardiac catheterization procedure reports, 100 coronary angiographic reports, and 100 integrated reports, each combining a history and physical report, discharge summary, outpatient clinic notes, outpatient clinic letter, and inpatient discharge medication report. Data extraction was performed by 3 methods: online machine learning, controlled vocabularies, and a combination of these. The system delivers results with F1 scores greater than 95%. Conclusions IDEAL-X adopts a unique online machine learning–based approach combined with controlled vocabularies to support data extraction for clinical reports. The system can quickly learn and improve, thus it is highly adaptable. PMID:28487265
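A minimal sketch of such an online-learning loop follows, assuming a hashing vectorizer, a linear model, and a hypothetical label set; IDEAL-X's actual model and features are not described in the abstract.

```python
# Sketch of a document-at-a-time online-learning loop: predict, take the
# user's confirmation or correction as feedback, update the model, move on.
from sklearn.feature_extraction.text import HashingVectorizer
from sklearn.linear_model import SGDClassifier

vec = HashingVectorizer(n_features=2**16)
clf = SGDClassifier(loss="log_loss")
classes = ["normal", "stenosis", "occlusion"]      # assumed label set

def process_stream(documents, get_user_feedback):
    for i, doc in enumerate(documents):
        x = vec.transform([doc])
        pred = clf.predict(x)[0] if i else None    # no model yet on first doc
        label = get_user_feedback(doc, pred)       # user confirms or corrects
        clf.partial_fit(x, [label], classes=classes)

docs = ["70% stenosis of the LAD", "coronaries appear normal", "RCA occluded"]
labels = iter(["stenosis", "normal", "occlusion"])  # stand-in for a real user
process_stream(docs, lambda doc, pred: next(labels))
```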
Gesture recognition for smart home applications using portable radar sensors.
Wan, Qian; Li, Yiran; Li, Changzhi; Pal, Ranadip
2014-01-01
In this article, we consider the design of a human gesture recognition system based on pattern recognition of signatures from a portable smart radar sensor. Powered by AAA batteries, the smart radar sensor operates in the 2.4 GHz industrial, scientific and medical (ISM) band. We analyzed the feature space using principal components and application-specific time- and frequency-domain features extracted from radar signals for two different sets of gestures. We illustrate that a nearest-neighbor-based classifier can achieve greater than 95% accuracy for multiclass classification using 10-fold cross-validation when features are extracted based on magnitude differences and Doppler shifts, as compared to features extracted through orthogonal transformations. The reported results illustrate the potential of intelligent radars integrated with a pattern recognition system for high-accuracy smart home and health monitoring purposes.
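The classification stage can be sketched as follows, with toy magnitude-difference and Doppler-shift features and synthetic I/Q frames standing in for real radar data; only the pipeline shape, not the paper's exact features, is reproduced.

```python
# Hedged sketch of the pipeline: per-frame features -> 1-nearest-neighbour
# classifier evaluated with 10-fold cross-validation.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.neighbors import KNeighborsClassifier

def extract_features(iq_frame, fs=1000.0):
    """Toy features from one complex radar frame (assumed I/Q samples)."""
    mag = np.abs(iq_frame)
    spectrum = np.abs(np.fft.fft(iq_frame))
    freqs = np.fft.fftfreq(len(iq_frame), 1/fs)
    doppler_peak = abs(freqs[np.argmax(spectrum)])   # dominant Doppler shift
    return [mag.max() - mag.min(),                   # magnitude difference
            np.mean(np.abs(np.diff(mag))),           # envelope variability
            doppler_peak]

# Random frames merely exercise the pipeline; real gestures would cluster.
rng = np.random.default_rng(1)
frames = rng.standard_normal((60, 256)) + 1j * rng.standard_normal((60, 256))
X = np.array([extract_features(f) for f in frames])
y = np.repeat([0, 1, 2], 20)                         # three gesture classes
scores = cross_val_score(KNeighborsClassifier(n_neighbors=1), X, y, cv=10)
print(scores.mean())
```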
NASA Technical Reports Server (NTRS)
Figueroa, Fernando; Morris, Jon; Turowski, Mark; Franzl, Richard; Walker, Mark; Kapadia, Ravi; Venkatesh, Meera; Schmalzel, John
2010-01-01
Severe weather events are likely occurrences on the Mississippi Gulf Coast. It is important to rapidly diagnose and mitigate the effects of storms on Stennis Space Center's rocket engine test complex to avoid delays to critical test article programs, reduce costs, and maintain safety. An Integrated Systems Health Management (ISHM) approach and technologies are employed to integrate environmental (weather) monitoring, structural modeling, and the suite of available facility instrumentation to provide information for readiness before storms, rapid initial damage assessment to guide mitigation planning, on-going assurance as repairs are effected, and finally recertification. The system is named the Katrina Storm Monitoring System (KStorMS). ISHM describes a comprehensive set of capabilities that provide insight into the behavior and health of a system. Knowing the status of a system allows decision makers to effectively plan and execute their mission. For example, early insight into component degradation and impending failures provides more time to develop workaround strategies and more effectively plan for maintenance. Failures of system elements generally occur over time. Information extracted from sensor data, combined with system-wide knowledge bases and methods for information extraction and fusion, inference, and decision making, can be used to detect incipient failures. If failures do occur, it is critical to detect and isolate them, and to suggest an appropriate course of action. ISHM enables determining the condition (health) of every element in a complex system-of-systems or SoS (detecting anomalies, diagnosing causes, predicting future anomalies), and providing data, information, and knowledge (DIaK) to control systems for safe and effective operation. ISHM capability is achieved by using a wide range of technologies that enable anomaly detection, diagnostics, prognostics, and advice for control: (1) anomaly detection algorithms and strategies, (2) fusion of DIaK for anomaly detection (model-based, numerical, statistical, empirical, expert-based, qualitative, etc.), (3) diagnostic/prognostic strategies and methods, (4) user interfaces, (5) advanced control strategies, (6) integration architectures/frameworks, and (7) embedding of intelligence. Many of these technologies are mature, and they are being used in the KStorMS. The paper describes the design, implementation, and operation of the KStorMS, and discusses further evolution to support other needs such as condition-based maintenance (CBM).
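Of the technologies listed, the statistical anomaly-detection strategy is the easiest to illustrate; the sketch below flags readings whose rolling z-score exceeds a threshold. This is illustrative only; the KStorMS algorithms themselves are not given in the abstract.

```python
# Rolling z-score anomaly detector: flag a reading that deviates from
# the recent window mean by more than `threshold` standard deviations.
import numpy as np

def rolling_zscore_anomalies(signal, window=50, threshold=4.0):
    flags = np.zeros(len(signal), dtype=bool)
    for i in range(window, len(signal)):
        ref = signal[i - window:i]
        sigma = ref.std() or 1e-9            # guard against flat windows
        flags[i] = abs(signal[i] - ref.mean()) / sigma > threshold
    return flags

rng = np.random.default_rng(7)
pressure = rng.normal(100.0, 0.5, 1000)      # healthy sensor baseline
pressure[700] += 10.0                        # injected spike (incipient fault)
print(np.nonzero(rolling_zscore_anomalies(pressure))[0])   # -> [700]
```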
NASA Astrophysics Data System (ADS)
Avitabile, Daniele; Bridges, Thomas J.
2010-06-01
Numerical integration of complex linear systems of ODEs depending analytically on an eigenvalue parameter is considered. Complex orthogonalization, which is required to stabilize the numerical integration, results in non-analytic systems. It is shown that properties of eigenvalues are still efficiently recoverable by extracting information from a non-analytic characteristic function. The orthonormal systems are constructed using the geometry of Stiefel bundles. Different forms of continuous orthogonalization in the literature are shown to correspond to different choices of connection one-form on the Stiefel bundle. For the numerical integration, Gauss-Legendre Runge-Kutta (GLRK) algorithms are the principal choice for preserving orthogonality, and performance results are shown for a range of GLRK methods. The theory and methods are tested by application to example boundary value problems, including the Orr-Sommerfeld equation in hydrodynamic stability.
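The orthogonality-preserving property of GLRK methods can be seen already in the 1-stage case (the implicit midpoint rule), which for a linear system Y' = A(x)Y reduces to a Cayley-type update that is exactly orthogonal whenever A is skew-symmetric. A toy real-valued sketch, far simpler than the paper's complex analytic systems on Stiefel bundles:

```python
# 1-stage Gauss-Legendre (implicit midpoint) step for Y' = A(x)Y:
#   Y_{n+1} = (I - h/2 A)^{-1} (I + h/2 A) Y_n
# a Cayley transform, orthogonal whenever A is skew-symmetric, so the
# orthonormality of Y is preserved to machine precision.
import numpy as np

def glrk1_step(Y, A_mid, h):
    """One implicit-midpoint step, A evaluated at the interval midpoint."""
    n = Y.shape[0]
    I = np.eye(n)
    return np.linalg.solve(I - 0.5*h*A_mid, (I + 0.5*h*A_mid) @ Y)

def A(x):                       # skew-symmetric, x-dependent coefficient
    return np.array([[0.0, x], [-x, 0.0]])

Y, h = np.eye(2), 0.01
for k in range(1000):
    x_mid = (k + 0.5) * h
    Y = glrk1_step(Y, A(x_mid), h)
print(np.linalg.norm(Y.T @ Y - np.eye(2)))   # ~1e-15: orthogonality preserved
```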
Detecting Disease Specific Pathway Substructures through an Integrated Systems Biology Approach
Alaimo, Salvatore; Marceca, Gioacchino Paolo; Ferro, Alfredo; Pulvirenti, Alfredo
2017-01-01
In the era of network medicine, pathway analysis methods play a central role in the prediction of phenotype from high throughput experiments. In this paper, we present a network-based systems biology approach capable of extracting disease-perturbed subpathways within pathway networks in connection with expression data taken from The Cancer Genome Atlas (TCGA). Our system extends pathways with missing regulatory elements, such as microRNAs, and their interactions with genes. The framework enables the extraction, visualization, and analysis of statistically significant disease-specific subpathways through an easy to use web interface. Our analysis shows that the methodology is able to fill the gap in current techniques, allowing a more comprehensive analysis of the phenomena underlying disease states. PMID:29657291
Photon extraction and conversion for scalable ion-trap quantum computing
NASA Astrophysics Data System (ADS)
Clark, Susan; Benito, Francisco; McGuinness, Hayden; Stick, Daniel
2014-03-01
Trapped ions represent one of the most mature and promising systems for quantum information processing. They have high-fidelity one- and two-qubit gates, long coherence times, and their qubit states can be reliably prepared and detected. Taking advantage of these inherent qualities in a system with many ions requires a means of entangling spatially separated ion qubits. One architecture achieves this entanglement through the use of emitted photons to distribute quantum information - a favorable strategy if photon extraction can be made efficient and reliable. Here I present results for photon extraction from an ion in a cavity formed by integrated optics on a surface trap, as well as results in frequency converting extracted photons for long distance transmission or interfering with photons from other types of optically active qubits. Sandia National Laboratories is a multi-program laboratory managed and operated by Sandia Corporation, a wholly owned subsidiary of Lockheed Martin Corporation, for the U. S. Department of Energy's National Nuclear Security Administration under contract DE-AC04-94AL85000.
Nucleic acid extraction techniques and application to the microchip.
Price, Carol W; Leslie, Daniel C; Landers, James P
2009-09-07
As recently as the early 1990s, DNA purification was time-consuming, requiring the use of toxic, hazardous reagents. The advent of solid phase extraction techniques and the availability of commercial kits for quick and reliable DNA extraction has relegated those early techniques largely to the history books. High quality DNA can now be extracted from whole blood, serum, saliva, urine, stool, cerebral spinal fluid, tissues, and cells in less time without sacrificing recovery. Having achieved such a radical change in the methodology of DNA extraction, focus has shifted to adapting these methods to a miniaturized system, or "lab-on-a-chip" (A. Manz, N. Graber and H. M. Widmer, Sens. Actuators, B, 1990, 1, 244-248). Manz et al.'s concept of a "miniaturized total chemical analysis system" (microTAS) involved a silicon chip that incorporated sample pretreatment, separation and detection. This review will focus on the first of these steps, sample pretreatment in the form of DNA purification. The intention of this review is to provide an overview of the fundamentals of nucleic acid purification and solid phase extraction (SPE) and to discuss specific microchip DNA extraction successes and challenges. In order to fully appreciate the advances in DNA purification, a brief review of the history of DNA extraction is provided so that the reader has an understanding of the impact that the development of SPE techniques have had. This review will highlight the different methods of nucleic acid extraction (Table 1), including relevant citations, but without an exhaustive summary of the literature. A recent review by Wen et al. (J. Wen, L. A. Legendre, J. M. Bienvenue and J. P. Landers, Anal. Chem., 2008, 80, 6472-6479) covers solid phase extraction methods with a greater focus on their incorporation into integrated microfluidic systems.
Thinking graphically: Connecting vision and cognition during graph comprehension.
Ratwani, Raj M; Trafton, J Gregory; Boehm-Davis, Deborah A
2008-03-01
Task analytic theories of graph comprehension account for the perceptual and conceptual processes required to extract specific information from graphs. Comparatively, the processes underlying information integration have received less attention. We propose a new framework for information integration that highlights visual integration and cognitive integration. During visual integration, pattern recognition processes are used to form visual clusters of information; these visual clusters are then used to reason about the graph during cognitive integration. In 3 experiments, the processes required to extract specific information and to integrate information were examined by collecting verbal protocol and eye movement data. Results supported the task analytic theories for specific information extraction and the processes of visual and cognitive integration for integrative questions. Further, the integrative processes scaled up as graph complexity increased, highlighting the importance of these processes for integration in more complex graphs. Finally, based on this framework, design principles to improve both visual and cognitive integration are described. PsycINFO Database Record (c) 2008 APA, all rights reserved
Installé, Arnaud Jf; Van den Bosch, Thierry; De Moor, Bart; Timmerman, Dirk
2014-10-20
Using machine-learning techniques, clinical diagnostic model research extracts diagnostic models from patient data. Traditionally, patient data are often collected using electronic Case Report Form (eCRF) systems, while mathematical software is used for analyzing these data using machine-learning techniques. Due to the lack of integration between eCRF systems and mathematical software, extracting diagnostic models is a complex, error-prone process. Moreover, due to the complexity of this process, it is usually only performed once, after a predetermined number of data points have been collected, without insight into the predictive performance of the resulting models. The objective of the Clinical Data Miner (CDM) software framework study is to offer an eCRF system with integrated data preprocessing and machine-learning libraries, improving the efficiency of the clinical diagnostic model research workflow, and to enable optimization of patient inclusion numbers through study performance monitoring. The CDM software framework was developed using a test-driven development (TDD) approach, to ensure high software quality. Architecturally, CDM's design is split over a number of modules, to ensure future extendability. The TDD approach has enabled us to deliver high software quality. CDM's eCRF Web interface is in active use by the studies of the International Endometrial Tumor Analysis consortium, with over 4000 enrolled patients, and more studies planned. Additionally, a derived user interface has been used in six separate interrater agreement studies. CDM's integrated data preprocessing and machine-learning libraries simplify some otherwise manual and error-prone steps in the clinical diagnostic model research workflow. Furthermore, CDM's libraries provide study coordinators with a method to monitor a study's predictive performance as patient inclusions increase. To our knowledge, CDM is the only eCRF system integrating data preprocessing and machine-learning libraries. This integration improves the efficiency of the clinical diagnostic model research workflow. Moreover, by simplifying the generation of learning curves, CDM enables study coordinators to assess more accurately when data collection can be terminated, resulting in better models or lower patient recruitment costs.
Weak signal amplification and detection by higher-order sensory neurons
Longtin, Andre; Maler, Leonard
2016-01-01
Sensory systems must extract behaviorally relevant information and therefore often exhibit a very high sensitivity. How the nervous system reaches such high sensitivity levels is an outstanding question in neuroscience. Weakly electric fish (Apteronotus leptorhynchus/albifrons) are an excellent model system to address this question because detailed background knowledge is available regarding their behavioral performance and its underlying neuronal substrate. Apteronotus use their electrosense to detect prey objects. Therefore, they must be able to detect electrical signals as low as 1 μV while using a sensory integration time of <200 ms. How these very weak signals are extracted and amplified by the nervous system is not yet understood. We studied the responses of cells in the early sensory processing areas, namely, the electroreceptor afferents (EAs) and pyramidal cells (PCs) of the electrosensory lobe (ELL), the first-order electrosensory processing area. In agreement with previous work we found that EAs cannot encode very weak signals with a spike count code. However, PCs can encode prey mimic signals by their firing rate, revealing a huge signal amplification between EAs and PCs and also suggesting differences in their stimulus encoding properties. Using a simple leaky integrate-and-fire (LIF) model we predict that the target neurons of PCs in the midbrain torus semicircularis (TS) are able to detect very weak signals. In particular, TS neurons could do so by assuming biologically plausible convergence rates as well as very simple decoding strategies such as temporal integration, threshold crossing, and combining the inputs of PCs. PMID:26843601
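The LIF prediction can be sketched numerically: a downstream unit pooling many PC spike trains and integrating over roughly 200 ms separates a weak rate increase from baseline. All parameters below are illustrative, not fitted to the fish data.

```python
# Leaky-integrator readout of pooled Poisson inputs: the integrated
# response to a weak prey-mimic rate increase (55 Hz vs. 50 Hz baseline)
# is reliably higher, so a simple threshold on it detects the signal.
import numpy as np

def integrated_response(rate_hz, n_inputs=100, t=0.2, dt=1e-3,
                        tau=0.02, w=0.015, seed=0):
    """Time-averaged potential of a leaky integrator driven by
    n_inputs pooled Poisson spike trains firing at rate_hz."""
    rng = np.random.default_rng(seed)
    v, vs = 0.0, []
    for _ in range(int(t / dt)):
        spikes = rng.poisson(rate_hz * dt * n_inputs)  # pooled input spikes
        v += dt * (-v / tau) + w * spikes              # leak plus drive
        vs.append(v)
    return np.mean(vs)

print(integrated_response(50.0))   # ~1.4: baseline PC input
print(integrated_response(55.0))   # reliably higher -> threshold readout
```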
Adaptable, high recall, event extraction system with minimal configuration.
Miwa, Makoto; Ananiadou, Sophia
2015-01-01
Biomedical event extraction has been a major focus of biomedical natural language processing (BioNLP) research since the first BioNLP shared task was held in 2009. Accordingly, a large number of event extraction systems have been developed. Most such systems, however, have been developed for specific tasks and/or incorporate task-specific settings, making their application to new corpora and tasks problematic without modification of the systems themselves. There is thus a need for event extraction systems that can achieve high levels of accuracy when applied to corpora in new domains, without the need for exhaustive tuning or modification, whilst retaining competitive levels of performance. We have enhanced our state-of-the-art event extraction system, EventMine, to alleviate the need for task-specific tuning. Task-specific details are specified in a configuration file, while extensive task-specific parameter tuning is avoided through the integration of a weighting method, a covariate shift method, and their combination. The task-specific configuration and weighting method have been employed within the context of two different sub-tasks of the BioNLP shared task 2013, i.e. Cancer Genetics (CG) and Pathway Curation (PC), removing the need to modify the system specifically for each task. With minimal task-specific configuration and tuning, EventMine achieved 1st place in the PC task and 2nd place in the CG task, achieving the highest recall in both tasks. The system has been further enhanced following the shared task by incorporating the covariate shift method and entity generalisations based on the task definitions, leading to further performance improvements. We have shown that it is possible to apply a state-of-the-art event extraction system to new tasks with high levels of performance, without having to modify the system internally. Both the covariate shift and weighting methods are useful in facilitating the production of high-recall systems. These methods and their combination can adapt a model to the target data with no deep tuning and little manual configuration.
An integrated system for land resources supervision based on the IoT and cloud computing
NASA Astrophysics Data System (ADS)
Fang, Shifeng; Zhu, Yunqiang; Xu, Lida; Zhang, Jinqu; Zhou, Peiji; Luo, Kan; Yang, Jie
2017-01-01
Integrated information systems are important safeguards for the utilisation and development of land resources. Information technologies, including the Internet of Things (IoT) and cloud computing, are inevitable requirements for the quality and efficiency of land resources supervision tasks. In this study, an economical and highly efficient supervision system for land resources has been established based on IoT and cloud computing technologies; a novel online and offline integrated system with synchronised internal and field data that includes the entire process of 'discovering breaches, analysing problems, verifying fieldwork and investigating cases' was constructed. The system integrates key technologies, such as the automatic extraction of high-precision information based on remote sensing, semantic ontology-based technology to excavate and discriminate public sentiment on the Internet that is related to illegal incidents, high-performance parallel computing based on MapReduce, uniform storing and compressing (bitwise) technology, global positioning system data communication and data synchronisation mode, intelligent recognition and four-level ('device, transfer, system and data') safety control technology. The integrated system based on a 'One Map' platform has been officially implemented by the Department of Land and Resources of Guizhou Province, China, and was found to significantly increase the efficiency and level of land resources supervision. The system promoted the overall development of informatisation in fields related to land resource management.
Wei, Peng-Hu; Cong, Fei; Chen, Ge; Li, Ming-Chu; Yu, Xin-Guang; Bao, Yu-Hai
2017-02-01
Diffusion tensor imaging-based navigation is unable to resolve crossing fibers or to determine with accuracy the fanning, origin, and termination of fibers. It is important to improve the accuracy of localizing white matter fibers for improved surgical approaches. We propose a solution to this problem using navigation based on track density imaging (TDI) extracted from high-definition fiber tractography (HDFT). A 28-year-old asymptomatic female patient with a left lateral-ventricle meningioma was enrolled in the present study. Language and visual tests, magnetic resonance imaging findings, both preoperative and postoperative HDFT, and the intraoperative navigation and surgery process are presented. Track density images were extracted from tracts derived using full q-space (514 directions) diffusion spectrum imaging (DSI) and integrated into a neuronavigation system. Navigation accuracy was verified via intraoperative records and postoperative DSI tractography, as well as a functional examination. DSI successfully represented the shape and range of the Meyer loop and arcuate fasciculus. The extracted track density images were successfully integrated into the navigation system. The relationship between the operation channel and surrounding tracts was consistent with the postoperative findings, and the patient was functionally intact after the surgery. DSI-based TDI navigation allows for the visualization of anatomic features such as fanning and angling and helps to identify the range of a given tract. Moreover, our results show that our HDFT navigation method is a promising technique that preserves neural function. Copyright © 2016 Elsevier Inc. All rights reserved.
Dynamic Visualization of Co-expression in Systems Genetics Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
New, Joshua Ryan; Huang, Jian; Chesler, Elissa J
2008-01-01
Biologists hope to address grand scientific challenges by exploring the abundance of data made available through modern microarray technology and other high-throughput techniques. The impact of this data, however, is limited unless researchers can effectively assimilate such complex information and integrate it into their daily research; interactive visualization tools are called for to support the effort. Specifically, typical studies of gene co-expression require novel visualization tools that enable the dynamic formulation and fine-tuning of hypotheses to aid the process of evaluating sensitivity of key parameters. These tools should allow biologists to develop an intuitive understanding of the structure of biological networks and discover genes which reside in critical positions in networks and pathways. By using a graph as a universal data representation of correlation in gene expression data, our novel visualization tool employs several techniques that, when used in an integrated manner, provide innovative analytical capabilities. Our tool for interacting with gene co-expression data integrates techniques such as: graph layout, qualitative subgraph extraction through a novel 2D user interface, quantitative subgraph extraction using graph-theoretic algorithms or by querying an optimized b-tree, dynamic level-of-detail graph abstraction, and template-based fuzzy classification using neural networks. We demonstrate our system using a real-world workflow from a large-scale, systems genetics study of mammalian gene co-expression.
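The graph representation and quantitative subgraph extraction can be sketched in a few lines, assuming a simple correlation threshold in place of the tool's interactive 2D interface and optimized b-tree queries:

```python
# Genes as nodes, expression correlation as weighted edges; extract
# candidate co-expression modules by thresholding edge weights.
import numpy as np
import networkx as nx

rng = np.random.default_rng(3)
expr = rng.standard_normal((20, 50))                 # 20 genes x 50 arrays
expr[1] = expr[0] + 0.1 * rng.standard_normal(50)    # one co-expressed pair

corr = np.corrcoef(expr)
genes = [f"gene{i}" for i in range(expr.shape[0])]
G = nx.Graph()
for i in range(len(genes)):
    for j in range(i + 1, len(genes)):
        G.add_edge(genes[i], genes[j], weight=corr[i, j])

# Quantitative subgraph extraction: keep edges with |r| above a cutoff,
# then inspect connected components as candidate co-expression modules.
strong = [(u, v) for u, v, d in G.edges(data=True) if abs(d["weight"]) > 0.8]
modules = list(nx.connected_components(nx.Graph(strong)))
print(modules)   # -> [{'gene0', 'gene1'}]
```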
Sadhukhan, Jhuma; Ng, Kok Siew; Martinez-Hernandez, Elias
2016-09-01
This paper, for the first time, reports integrated conceptual MBCT/biorefinery systems for unlocking the value of organics in municipal solid waste (MSW) through the production of levulinic acid (LA by 5wt%) that increases the economic margin by 110-150%. After mechanical separation recovering recyclables, metals (iron, aluminium, copper) and refuse derived fuel (RDF), lignocelluloses from remaining MSW are extracted by supercritical-water for chemical valorisation, comprising hydrolysis in 2wt% dilute H2SO4 catalyst producing LA, furfural, formic acid (FA), via C5/C6 sugar extraction, in plug flow (210-230°C, 25bar, 12s) and continuous stirred tank (195-215°C, 14bar, 20min) reactors; char separation and LA extraction/purification by methyl isobutyl ketone solvent; acid/solvent and by-product recovery. The by-product and pulping effluents are anaerobically digested into biogas and fertiliser. Produced biogas (6.4MWh/t), RDF (5.4MWh/t), char (4.5MWh/t) are combusted, heat recovered into steam generation in boiler (efficiency: 80%); on-site heat/steam demand is met; balance of steam is expanded into electricity in steam turbines (efficiency: 35%). Copyright © 2016 Elsevier Ltd. All rights reserved.
Rapid Automated Sample Preparation for Biological Assays
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shusteff, M
Our technology utilizes acoustic, thermal, and electric fields to separate out contaminants such as debris or pollen from environmental samples, lyse open cells, and extract the DNA from the lysate. The objective of the project is to optimize the system described for a forensic sample, and demonstrate its performance for integration with downstream assay platforms (e.g. MIT-LL's ANDE). We intend to increase the quantity of DNA recovered from the sample beyond the current approximately 80% achieved using solid phase extraction methods. Task 1: Develop and test an acoustic filter for cell extraction. Task 2: Develop and test lysis chip. Task 3: Develop and test DNA extraction chip. All chips have been fabricated based on the designs laid out in last month's report.
Integration of virtual and real scenes within an integral 3D imaging environment
NASA Astrophysics Data System (ADS)
Ren, Jinsong; Aggoun, Amar; McCormick, Malcolm
2002-11-01
The Imaging Technologies group at De Montfort University has developed an integral 3D imaging system, which is seen as the most likely vehicle for 3D television avoiding adverse psychological effects. To create truly engaging three-dimensional television programs, a virtual studio that performs the tasks of generating, editing and integrating 3D content involving virtual and real scenes is required. The paper presents, for the first time, the procedures, factors and methods of integrating computer-generated virtual scenes with real objects captured using the 3D integral imaging camera system. The method of computer generation of 3D integral images, in which the lens array is modelled instead of the physical camera, is described. In the model, each micro-lens that captures different elemental images of the virtual scene is treated as an extended pinhole camera. An integration process named integrated rendering is illustrated. Detailed discussion and investigation are focused on depth extraction from captured integral 3D images. The method of calculating depth from disparity, and the multiple-baseline method used to improve the precision of depth estimation, are also presented. The concept of colour SSD and its further improvement of precision is proposed and verified.
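Block matching by sum of squared differences (SSD) is the core of the disparity computation mentioned above; a minimal single-baseline sketch on synthetic images follows (the paper's colour SSD and multiple-baseline refinement are omitted, and depth would then follow from the assumed lens geometry).

```python
# SSD block matching: for a block in the left image, find the horizontal
# shift into the right image that minimises the sum of squared differences.
import numpy as np

def ssd_disparity(left, right, x, y, block=5, max_d=20):
    """Best horizontal disparity for the block centred at (x, y) in `left`."""
    h = block // 2
    ref = left[y-h:y+h+1, x-h:x+h+1].astype(float)
    best, best_d = np.inf, 0
    for d in range(max_d + 1):
        if x - h - d < 0:
            break                                  # block would leave image
        cand = right[y-h:y+h+1, x-h-d:x+h+1-d].astype(float)
        ssd = np.sum((ref - cand) ** 2)
        if ssd < best:
            best, best_d = ssd, d
    return best_d

# Synthetic pair: the right image is the left shifted by 7 pixels.
rng = np.random.default_rng(4)
left = rng.integers(0, 255, (64, 64))
right = np.roll(left, -7, axis=1)
print(ssd_disparity(left, right, x=32, y=32))      # -> 7
```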
ERIC Educational Resources Information Center
Scott-Bracey, Pamela
2011-01-01
The purpose of this study was to explore the alignment of soft skills sought by current business IS entry-level employers in electronic job postings, with the integration of soft skills in undergraduate business information systems (IS) syllabi of public four-year universities in Texas. One hundred fifty job postings were extracted from two major…
Choi, Tae-Young; Jun, Ji Hee; Lee, Myeong Soo
2018-03-01
Integrative medicine is claimed to improve symptoms of lupus nephritis. No systematic reviews have been performed on the application of integrative medicine for lupus nephritis in patients with systemic lupus erythematosus (SLE). Thus, this review will aim to evaluate the current evidence on the efficacy of integrative medicine for the management of lupus nephritis in patients with SLE. The following electronic databases will be searched for studies published from their dates of inception to February 2018: Medline, EMBASE and the Cochrane Central Register of Controlled Trials (CENTRAL), as well as 6 Korean medical databases (Korea Med, the Oriental Medicine Advanced Search Integrated System [OASIS], DBpia, the Korean Medical Database [KM base], the Research Information Service System [RISS], and the Korean Studies Information Services System [KISS]), and 1 Chinese medical database (the China National Knowledge Infrastructure [CNKI]). Study selection, data extraction, and assessment will be performed independently by 2 researchers. The risk of bias (ROB) will be assessed using the Cochrane ROB tool. This systematic review will be published in a peer-reviewed journal and disseminated both electronically and in print. The review will be updated to inform and guide healthcare practice and policy. PROSPERO 2018 CRD42018085205.
Automatic Authorship Detection Using Textual Patterns Extracted from Integrated Syntactic Graphs
Gómez-Adorno, Helena; Sidorov, Grigori; Pinto, David; Vilariño, Darnes; Gelbukh, Alexander
2016-01-01
We apply the integrated syntactic graph feature extraction methodology to the task of automatic authorship detection. This graph-based representation allows integrating different levels of language description into a single structure. We extract textual patterns based on features obtained from shortest path walks over integrated syntactic graphs and apply them to determine the authors of documents. On average, our method outperforms the state of the art approaches and gives consistently high results across different corpora, unlike existing methods. Our results show that our textual patterns are useful for the task of authorship attribution. PMID:27589740
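A toy rendering of the feature scheme follows, assuming a single dependency-style annotation level and invented relation labels, whereas the paper's integrated syntactic graphs combine several levels of language description into one structure.

```python
# Shortest-path walks between word pairs in a small dependency-style
# graph, encoded as relation-label strings, serve as textual patterns.
import networkx as nx
from itertools import combinations

G = nx.Graph()
edges = [("wrote", "author", "nsubj"), ("wrote", "novel", "obj"),
         ("novel", "the", "det"), ("author", "the2", "det")]
for head, dep, rel in edges:
    G.add_edge(head, dep, rel=rel)

def path_patterns(graph):
    """Shortest-path walk between every node pair, encoded as a string."""
    patterns = []
    for a, b in combinations(graph.nodes, 2):
        path = nx.shortest_path(graph, a, b)
        rels = [graph[u][v]["rel"] for u, v in zip(path, path[1:])]
        patterns.append("-".join(rels))
    return patterns

print(sorted(set(path_patterns(G))))   # feature set for one document
```

In an authorship-attribution setting, the frequencies of such patterns across a document would form the feature vector handed to a conventional classifier.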
Düsman, E; Almeida, I V; Pinto, E P; Lucchetta, L; Vicentini, V E P
2017-05-31
Integral grape juice is extracted from the grape through processes that allow the retention of its natural composition. However, due to the severity of some processes, fruit juices can undergo changes in their quality. The present study evaluated the cytotoxic and mutagenic effects of integral grape juice by a cytokinesis-blocked micronucleus assay in Rattus norvegicus hepatoma (HTC) cells in vitro. Vitis labrusca L. grapes (variety Concord) were produced organically and by a conventional system, and their juice was extracted by a hot extraction process. The organic grapes were subjected to ultraviolet type C (UV-C) radiation. Experiments were performed after production and after 6 months of storage. Physicochemical analyses revealed that UV-C irradiation of the organic grapes, the juice production process, and storage resulted in nutraceutical alterations. However, none of the juice concentrations was cytotoxic to HTC cells according to the cytokinesis-blocked proliferation index results, nor mutagenic, because the formation of micronucleated cells was not induced. In general, the juice induced cell proliferation, possibly due to the presence of vitamins and its sugar content (total soluble solids). The data improve the understanding of this food technology and confirm the quality and safety of consuming these juices.
Orechia, John; Pathak, Ameet; Shi, Yunling; Nawani, Aniket; Belozerov, Andrey; Fontes, Caitlin; Lakhiani, Camille; Jawale, Chetan; Patel, Chetansharan; Quinn, Daniel; Botvinnik, Dmitry; Mei, Eddie; Cotter, Elizabeth; Byleckie, James; Ullman-Cullere, Mollie; Chhetri, Padam; Chalasani, Poornima; Karnam, Purushotham; Beaudoin, Ronald; Sahu, Sandeep; Belozerova, Yelena; Mathew, Jomol P.
2015-01-01
We live in the genomic era of medicine, where a patient's genomic/molecular data is becoming increasingly important for disease diagnosis, identification of targeted therapy, and risk assessment for adverse reactions. However, decoding the genomic test results and integrating it with clinical data for retrospective studies and cohort identification for prospective clinical trials is still a challenging task. In order to overcome these barriers, we developed an overarching enterprise informatics framework for translational research and personalized medicine called Synergistic Patient and Research Knowledge Systems (SPARKS) and a suite of tools called Oncology Data Retrieval Systems (OncDRS). OncDRS enables seamless data integration, secure and self-navigated query and extraction of clinical and genomic data from heterogeneous sources. Within a year of release, the system has facilitated more than 1500 research queries and has delivered data for more than 50 research studies. PMID:27054074
Principles for timing at spallation neutron sources based on developments at LANSCE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nelson, R. O.; Merl, R. B.; Rose, C. R.
2001-01-01
Due to AC-power-grid frequency fluctuations, the designers of accelerator-based spallation-neutron facilities have worked to optimize the conflicting demands of accelerator and neutron chopper performance. For the first time, we are able to quantitatively assess the tradeoffs between these two constraints and design or upgrade a facility to optimize total system performance using powerful new simulation techniques. We have modeled timing systems that integrate chopper controllers and chopper hardware, and built new systems. Thus, at LANSCE, we now operate multiple chopper systems and the accelerator as simple slaves to a single master-timing-reference generator. Based on this experience, we recommend that spallation neutron sources adhere to three principles. First, timing for pulsed sources should be planned starting with extraction at a fixed phase and working backwards toward the leading edge of the beam pulse. Second, accelerator triggers and storage ring extraction commands from neutron choppers offer only marginal benefits to accelerator-based spallation sources. Third, the storage-ring RF should be phase synchronized with the neutron choppers to provide extraction without the one-orbit timing uncertainty.
Mach 6.5 air induction system design for the Beta 2 two-stage-to-orbit booster vehicle
NASA Technical Reports Server (NTRS)
Midea, Anthony C.
1991-01-01
A preliminary, two-dimensional, mixed-compression air induction system is designed for the Beta II Two-Stage-to-Orbit booster vehicle to minimize installation losses and efficiently deliver the required airflow. Design concepts, such as an external isentropic compression ramp and a bypass system, were developed and evaluated for performance benefits. The design was optimized by maximizing installed propulsion/vehicle system performance. The resulting system design, operating characteristics, and performance are presented. The air induction system design has significantly lower transonic drag than similar designs and requires only about one-third of the bleed extraction. In addition, the design efficiently provides the airflow required by the integrated system while maintaining adequate levels of total pressure recovery. The excellent performance of this highly integrated air induction system is essential for the successful completion of the Beta II booster vehicle mission.
A Standard-Compliant Virtual Meeting System with Active Video Object Tracking
NASA Astrophysics Data System (ADS)
Lin, Chia-Wen; Chang, Yao-Jen; Wang, Chih-Ming; Chen, Yung-Chang; Sun, Ming-Ting
2002-12-01
This paper presents an H.323 standard-compliant virtual video conferencing system. The proposed system not only serves as a multipoint control unit (MCU) for multipoint connection but also provides a gateway function between H.323 LAN (local-area network) and H.324 WAN (wide-area network) users. The proposed virtual video conferencing system provides user-friendly object compositing and manipulation features, including 2D video object scaling, repositioning, rotation, and dynamic bit allocation, in a 3D virtual environment. A reliable and accurate scheme based on background image mosaics is proposed for real-time extraction and tracking of foreground video objects from video captured with an active camera. Chroma-key insertion is used to facilitate video object extraction and manipulation. We have implemented a prototype of the virtual conference system with an integrated graphical user interface to demonstrate the feasibility of the proposed methods.
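The chroma-key step lends itself to a compact sketch: pixels near the key colour are classified as background. The tolerance and colour-distance metric below are assumptions; the paper's mosaic-based tracking is a separate, more involved component.

```python
# Minimal chroma-key mask: pixels whose RGB distance to the key colour
# falls below a tolerance become background; everything else is foreground.
import numpy as np

def chroma_key_mask(frame_rgb, key=(0, 255, 0), tol=80.0):
    """Boolean foreground mask for an HxWx3 uint8 frame."""
    dist = np.linalg.norm(frame_rgb.astype(float) - np.array(key), axis=-1)
    return dist > tol              # True where the pixel is NOT key-coloured

frame = np.zeros((4, 4, 3), dtype=np.uint8)
frame[:] = (0, 255, 0)             # green screen
frame[1:3, 1:3] = (200, 50, 40)    # a foreground object
print(chroma_key_mask(frame).astype(int))
```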
Enhancing acronym/abbreviation knowledge bases with semantic information.
Torii, Manabu; Liu, Hongfang
2007-10-11
In the biomedical domain, a terminology knowledge base that associates acronyms/abbreviations (denoted as SFs) with their definitions (denoted as LFs) is highly needed. Toward the construction of such a terminology knowledge base, we investigate the feasibility of building a system that automatically assigns semantic categories to LFs extracted from text. Given a collection of pairs (SF, LF) derived from text, we i) assess the coverage of LFs and pairs (SF, LF) in the UMLS and justify the need for a semantic category assignment system; and ii) automatically derive name phrases annotated with semantic categories and construct a system using machine learning. Utilizing ADAM, an existing collection of (SF, LF) pairs extracted from MEDLINE, our system achieved an f-measure of 87% when assigning eight UMLS-based semantic groups to LFs. The system has been incorporated into a web interface which integrates SF knowledge from multiple SF knowledge bases. Web site: http://gauss.dbb.georgetown.edu/liblab/SFThesurus.
NASA's online machine aided indexing system
NASA Technical Reports Server (NTRS)
Silvester, June P.; Genuardi, Michael T.; Klingbiel, Paul H.
1993-01-01
This report describes the NASA Lexical Dictionary (NLD), a machine aided indexing system used online at the National Aeronautics and Space Administration's Center for Aerospace Information (CASI). The system comprises a text processor based on the computational, non-syntactic analysis of input text, and an extensive 'knowledge base' that serves to recognize and translate text-extracted concepts. The structure and function of the various NLD system components are described in detail. Methods used for the development of the knowledge base are discussed. Particular attention is given to a statistically based text analysis program that provides the knowledge base developer with a list of concept-specific phrases extracted from large textual corpora. Production and quality benefits resulting from the integration of machine aided indexing at CASI are discussed, along with a number of secondary applications of NLD-derived systems, including online spell checking and machine aided lexicography.
Zheng, Shuai; Lu, James J; Ghasemzadeh, Nima; Hayek, Salim S; Quyyumi, Arshed A; Wang, Fusheng
2017-05-09
Extracting structured data from narrated medical reports is challenged by the complexity of heterogeneous structures and vocabularies and often requires significant manual effort. Traditional machine-based approaches lack the capability to take user feedback for improving the extraction algorithm in real time. Our goal was to provide a generic information extraction framework that can support diverse clinical reports and enables a dynamic interaction between a human and a machine that produces highly accurate results. A clinical information extraction system, IDEAL-X, has been built on top of online machine learning. It processes one document at a time, and user interactions are recorded as feedback to update the learning model in real time. The updated model is used to predict values for extraction in subsequent documents. Once prediction accuracy reaches a user-acceptable threshold, the remaining documents may be batch processed. A customizable controlled vocabulary may be used to support extraction. Three datasets were used for experiments based on report styles: 100 cardiac catheterization procedure reports, 100 coronary angiographic reports, and 100 integrated reports, each combining a history and physical report, discharge summary, outpatient clinic notes, outpatient clinic letter, and inpatient discharge medication report. Data extraction was performed by 3 methods: online machine learning, controlled vocabularies, and a combination of these. The system delivers results with F1 scores greater than 95%. IDEAL-X adopts a unique online machine learning-based approach combined with controlled vocabularies to support data extraction for clinical reports. The system can quickly learn and improve, thus it is highly adaptable. ©Shuai Zheng, James J Lu, Nima Ghasemzadeh, Salim S Hayek, Arshed A Quyyumi, Fusheng Wang. Originally published in JMIR Medical Informatics (http://medinform.jmir.org), 09.05.2017.
Yi, Ming; Mudunuri, Uma; Che, Anney; Stephens, Robert M
2009-06-29
One of the challenges in the analysis of microarray data is to integrate and compare the selected (e.g., differential) gene lists from multiple experiments for common or unique underlying biological themes. A common way to approach this problem is to extract common genes from these gene lists and then subject these genes to enrichment analysis to reveal the underlying biology. However, the capacity of this approach is largely restricted by the limited number of common genes shared by datasets from multiple experiments, which could be caused by the complexity of the biological system itself. We now introduce a new Pathway Pattern Extraction Pipeline (PPEP), which extends the existing WPS application by providing a new pathway-level comparative analysis scheme. To facilitate comparing and correlating results from different studies and sources, PPEP contains new interfaces that allow evaluation of the pathway-level enrichment patterns across multiple gene lists. As an exploratory tool, this analysis pipeline may help reveal the underlying biological themes at both the pathway and gene levels. The analysis scheme provided by PPEP begins with multiple gene lists, which may be derived from different studies in terms of the biological contexts, applied technologies, or methodologies. These lists are then subjected to pathway-level comparative analysis for extraction of pathway-level patterns. This analysis pipeline helps to explore the commonality or uniqueness of these lists at the level of pathways or biological processes from different but relevant biological systems using a combination of statistical enrichment measurements, pathway-level pattern extraction, and graphical display of the relationships of genes and their associated pathways as Gene-Term Association Networks (GTANs) within the WPS platform. As a proof of concept, we have used the new method to analyze many datasets from our collaborators as well as some public microarray datasets. This tool provides a new pathway-level analysis scheme for integrative and comparative analysis of data derived from different but relevant systems. The tool is freely available as a Pathway Pattern Extraction Pipeline implemented in our existing software package WPS, which can be obtained at http://www.abcc.ncifcrf.gov/wps/wps_index.php.
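The statistical enrichment measurement underlying this kind of pathway-level comparison is commonly a hypergeometric (Fisher-type) test; a sketch with invented counts follows. The abstract does not state which measure WPS/PPEP uses, so this is a generic stand-in.

```python
# Hypergeometric enrichment test: is the overlap between a selected gene
# list and a pathway's member set larger than chance predicts?
from scipy.stats import hypergeom

def pathway_enrichment_p(n_universe, n_pathway, n_list, n_overlap):
    """P(overlap >= n_overlap) when drawing n_list genes from a universe
    of n_universe genes that contains n_pathway pathway members."""
    return hypergeom.sf(n_overlap - 1, n_universe, n_pathway, n_list)

# 20000 genes total; the pathway has 150 members; our list of 300
# selected genes hits 12 of them (chance expectation: ~2.25).
print(pathway_enrichment_p(20000, 150, 300, 12))   # small p -> enriched
```

Running this test per pathway and per gene list, then comparing the resulting p-value profiles across lists, is one simple way to realize the pathway-level pattern comparison described above.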
Extracting semantically enriched events from biomedical literature
2012-01-01
Background Research into event-based text mining from the biomedical literature has been growing in popularity to facilitate the development of advanced biomedical text mining systems. Such technology permits advanced search, which goes beyond document or sentence-based retrieval. However, existing event-based systems typically ignore additional information within the textual context of events that can determine, amongst other things, whether an event represents a fact, hypothesis, experimental result or analysis of results, whether it describes new or previously reported knowledge, and whether it is speculated or negated. We refer to such contextual information as meta-knowledge. The automatic recognition of such information can permit the training of systems allowing finer-grained searching of events according to the meta-knowledge that is associated with them. Results Based on a corpus of 1,000 MEDLINE abstracts, fully manually annotated with both events and associated meta-knowledge, we have constructed a machine learning-based system that automatically assigns meta-knowledge information to events. This system has been integrated into EventMine, a state-of-the-art event extraction system, in order to create a more advanced system (EventMine-MK) that not only extracts events from text automatically, but also assigns five different types of meta-knowledge to these events. The meta-knowledge assignment module of EventMine-MK performs with macro-averaged F-scores in the range of 57-87% on the BioNLP’09 Shared Task corpus. EventMine-MK has been evaluated on the BioNLP’09 Shared Task subtask of detecting negated and speculated events. Our results show that EventMine-MK can outperform other state-of-the-art systems that participated in this task. Conclusions We have constructed the first practical system that extracts both events and associated, detailed meta-knowledge information from biomedical literature. The automatically assigned meta-knowledge information can be used to refine search systems, in order to provide an extra search layer beyond entities and assertions, dealing with phenomena such as rhetorical intent, speculations, contradictions and negations. This finer grained search functionality can assist in several important tasks, e.g., database curation (by locating new experimental knowledge) and pathway enrichment (by providing information for inference). To allow easy integration into text mining systems, EventMine-MK is provided as a UIMA component that can be used in the interoperable text mining infrastructure, U-Compare. PMID:22621266
Extracting semantically enriched events from biomedical literature.
Miwa, Makoto; Thompson, Paul; McNaught, John; Kell, Douglas B; Ananiadou, Sophia
2012-05-23
Research into event-based text mining from the biomedical literature has been growing in popularity to facilitate the development of advanced biomedical text mining systems. Such technology permits advanced search, which goes beyond document or sentence-based retrieval. However, existing event-based systems typically ignore additional information within the textual context of events that can determine, amongst other things, whether an event represents a fact, hypothesis, experimental result or analysis of results, whether it describes new or previously reported knowledge, and whether it is speculated or negated. We refer to such contextual information as meta-knowledge. The automatic recognition of such information can permit the training of systems allowing finer-grained searching of events according to the meta-knowledge that is associated with them. Based on a corpus of 1,000 MEDLINE abstracts, fully manually annotated with both events and associated meta-knowledge, we have constructed a machine learning-based system that automatically assigns meta-knowledge information to events. This system has been integrated into EventMine, a state-of-the-art event extraction system, in order to create a more advanced system (EventMine-MK) that not only extracts events from text automatically, but also assigns five different types of meta-knowledge to these events. The meta-knowledge assignment module of EventMine-MK performs with macro-averaged F-scores in the range of 57-87% on the BioNLP'09 Shared Task corpus. EventMine-MK has been evaluated on the BioNLP'09 Shared Task subtask of detecting negated and speculated events. Our results show that EventMine-MK can outperform other state-of-the-art systems that participated in this task. We have constructed the first practical system that extracts both events and associated, detailed meta-knowledge information from biomedical literature. The automatically assigned meta-knowledge information can be used to refine search systems, in order to provide an extra search layer beyond entities and assertions, dealing with phenomena such as rhetorical intent, speculations, contradictions and negations. This finer grained search functionality can assist in several important tasks, e.g., database curation (by locating new experimental knowledge) and pathway enrichment (by providing information for inference). To allow easy integration into text mining systems, EventMine-MK is provided as a UIMA component that can be used in the interoperable text mining infrastructure, U-Compare.
Integrating language models into classifiers for BCI communication: a review
NASA Astrophysics Data System (ADS)
Speier, W.; Arnold, C.; Pouratian, N.
2016-06-01
Objective. The present review systematically examines the integration of language models to improve classifier performance in brain-computer interface (BCI) communication systems. Approach. The domain of natural language has been studied extensively in linguistics and has been used in the natural language processing field in applications including information extraction, machine translation, and speech recognition. While these methods have been used for years in traditional augmentative and assistive communication devices, information about the output domain has largely been ignored in BCI communication systems. Over the last few years, BCI communication systems have started to leverage this information through the inclusion of language models. Main results. Although this movement began only recently, studies have already shown the potential of language integration in BCI communication and it has become a growing field in BCI research. BCI communication systems using language models in their classifiers have progressed down several parallel paths, including: word completion; signal classification; integration of process models; dynamic stopping; unsupervised learning; error correction; and evaluation. Significance. Each of these methods has shown significant progress, but they have largely been addressed separately. Combining these methods could exploit the full potential of language models, yielding further performance improvements. This integration should be a priority as the field works to create a BCI system that meets the needs of the amyotrophic lateral sclerosis population.
NASA Astrophysics Data System (ADS)
Saxena, Hemant; Singh, Alka; Rai, J. N.
2018-07-01
This article discusses the design and control of a single-phase grid-connected photovoltaic (PV) system. A 5-kW PV system is designed and integrated at the DC link of an H-bridge voltage source converter (VSC). The control of the VSC and its switching logic is modelled using a generalised integrator (GI). The GI and its variants, such as the second-order GI, have recently evolved for synchronisation and are being used as phase locked loop (PLL) circuits for grid integration. The design of PLL circuits and the use of transformations such as Park's and Clarke's are much easier in three-phase systems, but obtaining in-phase and quadrature components becomes an important and challenging issue in single-phase systems. This article addresses this issue and discusses an altogether different application of the GI: the design of a compensator based on the extraction of in-phase and quadrature components. The GI is frequently used as a PLL; however, in this article, it is not used for synchronisation purposes. A new controller has been designed for a single-phase grid-connected PV system working as a single-phase active compensator. Extensive simulation results are shown for the integrated PV system under different atmospheric and operating conditions, during daytime as well as at night. Experimental results for the proposed control approach are presented and discussed for the hardware set-up developed in the laboratory.
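A second-order generalized integrator produces exactly the in-phase and quadrature pair needed here from a single measured waveform. A minimal discrete-time sketch; the gain k = 1.41 and the forward-Euler discretization are illustrative choices, not the article's implementation:

```python
# SOGI quadrature signal generator: v tracks the input, qv lags it by 90 degrees.
import numpy as np

f, fs, k = 50.0, 10_000.0, 1.41          # grid frequency (Hz), sample rate (Hz), SOGI gain
w, dt = 2 * np.pi * f, 1.0 / fs
t = np.arange(0.0, 0.1, dt)
u = np.sin(w * t)                        # measured single-phase signal
v = qv = 0.0

for un in u:
    dv = (k * (un - v) - qv) * w         # SOGI state equations
    dqv = v * w
    v, qv = v + dv * dt, qv + dqv * dt

print(round(v, 3), round(qv, 3))         # after settling: v ~ sin(wt), qv ~ -cos(wt)
```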
Wang, Liping; Duan, Haotian; Jiang, Jiebing; Long, Jiakun; Yu, Yingjia; Chen, Guiliang; Duan, Gengli
2017-09-01
A new, simple, and fast infrared-assisted self-enzymolysis extraction (IRASEE) approach for the extraction of total flavonoid aglycones (TFA), mainly baicalein, wogonin, and oroxylin A, from Scutellariae Radix is presented to enhance extraction yield. The factors affecting the IRASEE procedure (enzymolysis temperature, enzymolysis liquid-to-solid ratio, enzymolysis pH, enzymolysis time, and infrared power) were investigated in a newly designed, temperature-controlled infrared-assisted extraction (TC-IRAE) system to determine the optimum analysis conditions. The results illustrated that IRASEE possesses great advantages in terms of efficiency and time compared with other conventional extraction techniques. Furthermore, the mechanism of IRASEE was preliminarily explored by observing microscopic changes in the samples' surface structures, studying the main chemical compositions of the samples before and after extraction, and investigating the kinetics and thermodynamics at three temperature levels during the IRASEE process. These findings revealed that IRASEE can destroy the surface microstructures to accelerate mass transfer and reduce the activation energy to intensify the chemical process. This integrative study presents a simple, rapid, efficient, and environmentally friendly IRASEE method for TFA extraction which has promising prospects for other similar herbal medicines.
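The kinetic and thermodynamic analysis at three temperature levels is typically summarized by an Arrhenius fit of the extraction rate constants; the abstract reports that IRASEE lowers the activation energy. A minimal sketch with made-up rate constants (the paper's actual values are not reproduced here):

```python
# Arrhenius fit: ln k = ln A - Ea / (R T); rate constants below are hypothetical.
import numpy as np

R = 8.314                                  # gas constant, J/(mol K)
T = np.array([303.0, 313.0, 323.0])        # three temperature levels (K), illustrative
k = np.array([0.012, 0.021, 0.035])        # extraction rate constants (1/min), illustrative

slope, _ = np.polyfit(1.0 / T, np.log(k), 1)
Ea = -slope * R
print(f"apparent activation energy ~ {Ea / 1000:.1f} kJ/mol")
```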
5-Gb/s 0.18-μm CMOS 2:1 multiplexer with integrated clock extraction
NASA Astrophysics Data System (ADS)
Changchun, Zhang; Zhigong, Wang; Si, Shi; Peng, Miao; Ling, Tian
2009-09-01
A 5-Gb/s 2:1 MUX (multiplexer) with an on-chip integrated clock extraction circuit, which possesses the function of automatic phase alignment (APA), has been designed and fabricated in SMIC's 0.18 μm CMOS technology. The chip area is 670 × 780 μm². At a single supply voltage of 1.8 V, the total power consumption is 112 mW with an input sensitivity of less than 50 mV and an output single-ended swing of above 300 mV. The measurement results show that the IC can work reliably at any input data rate between 1.8 and 2.6 Gb/s with no need for external components, reference clock, or phase alignment between data and clock. It can be used in a parallel optic-fiber data interconnecting system.
River Devices to Recover Energy with Advanced Materials (River DREAM)
DOE Office of Scientific and Technical Information (OSTI.GOV)
McMahon, Daniel P.
2013-07-03
The purpose of this project is to develop a generator called a Galloping Hydroelectric Energy Extraction Device (GHEED). It uses a galloping prism to convert water flow into linear motion. This motion is converted into electricity via a dielectric elastomer generator (DEG). The galloping mechanism and the DEG are combined to create a system to effectively generate electricity. This project has three research objectives: 1. Oscillator development and design: characterize galloping behavior, evaluate the effect of control surface shape change on oscillator performance, and demonstrate shape change with water flow change. 2. Dielectric elastomer generator (DEG) characterization and modeling: characterize and model the performance of the DEG based on the oscillator design. 3. GHEED system modeling and integration: create numerical models for construction of a system performance model and define operating capabilities for this approach. Accomplishing these three objectives will result in the creation of a model that can be used to fully define the operating parameters and performance capabilities of a generator based on the GHEED design. This information will be used in the next phase of product development, the creation of an integrated laboratory-scale generator to confirm model predictions.
Integrated Magneto-Chemical Sensor For On-Site Food Allergen Detection.
Lin, Hsing-Ying; Huang, Chen-Han; Park, Jongmin; Pathania, Divya; Castro, Cesar M; Fasano, Alessio; Weissleder, Ralph; Lee, Hakho
2017-10-24
Adverse food reactions, including food allergies, food sensitivities, and autoimmune reaction (e.g., celiac disease) affect 5-15% of the population and remain a considerable public health problem requiring stringent food avoidance and epinephrine availability for emergency events. Avoiding problematic foods is practically difficult, given current reliance on prepared foods and out-of-home meals. In response, we developed a portable, point-of-use detection technology, termed integrated exogenous antigen testing (iEAT). The system consists of a disposable antigen extraction device coupled with an electronic keychain reader for rapid sensing and communication. We optimized the prototype iEAT system to detect five major food antigens in peanuts, hazelnuts, wheat, milk, and eggs. Antigen extraction and detection with iEAT requires <10 min and achieves high-detection sensitivities (e.g., 0.1 mg/kg for gluten, lower than regulatory limits of 20 mg/kg). When testing under restaurant conditions, we were able to detect hidden food antigens such as gluten within "gluten-free" food items. The small size and rapid, simple testing of the iEAT system should help not only consumers but also other key stakeholders such as clinicians, food industries, and regulators to enhance food safety.
Virot, Matthieu; Tomao, Valérie; Ginies, Christian; Visinoni, Franco; Chemat, Farid
2008-07-04
A green and original alternative procedure for the determination of fats and oils in oleaginous seeds is described. Extractions were carried out using a by-product of the citrus industry, d-limonene, as the extraction solvent instead of hazardous petroleum solvents such as n-hexane. The method is performed in two steps using microwave energy: first, extraction using a microwave-integrated Soxhlet; second, elimination of the solvent from the medium using a microwave Clevenger distillation. Oils extracted from olive seeds were compared, in terms of qualitative and quantitative determination, with both conventional Soxhlet and microwave-integrated Soxhlet extraction procedures performed with n-hexane. No significant difference was obtained between the extracts, allowing us to conclude that the proposed method is effective and valuable.
Changes in Quality of Health Care Delivery after Vertical Integration
Carlin, Caroline S; Dowd, Bryan; Feldman, Roger
2015-01-01
Objectives To fill an empirical gap in the literature by examining changes in quality of care measures occurring when multispecialty clinic systems were acquired by hospital-owned, vertically integrated health care delivery systems in the Twin Cities area. Data Sources/Study Setting Administrative data for health plan enrollees attributed to treatment and control clinic systems, merged with U.S. Census data. Study Design We compared changes in quality measures for health plan enrollees in the acquired clinics to enrollees in nine control groups using a differences-in-differences model. Our dataset spans 2 years prior to and 4 years after the acquisitions. We estimated probit models with errors clustered within enrollees. Data Collection/Extraction Methods Data were assembled by the health plan’s informatics team. Principal Findings Vertical integration is associated with increased rates of colorectal and cervical cancer screening and more appropriate emergency department use. The probability of ambulatory care–sensitive admissions increased when the acquisition caused disruption in admitting patterns. Conclusions Moving a clinic system into a vertically integrated delivery system resulted in limited increases in quality of care indicators. Caution is warranted when the acquisition causes disruption in referral patterns. PMID:25529312
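The core estimator is a probit differences-in-differences with errors clustered by enrollee. A minimal sketch on synthetic data; the variable names are hypothetical, not the study's analysis file:

```python
# DiD probit with enrollee-clustered standard errors on synthetic data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 2000
df = pd.DataFrame({
    "enrollee_id": rng.integers(0, 500, n),   # repeated observations per enrollee
    "acquired": rng.integers(0, 2, n),        # enrollee attributed to an acquired clinic
    "post": rng.integers(0, 2, n),            # observation after the acquisition
})
latent = 0.2 * df.acquired + 0.1 * df.post + 0.15 * df.acquired * df.post
df["screened"] = (latent + rng.normal(size=n) > 0).astype(int)

result = smf.probit("screened ~ acquired * post", data=df).fit(
    cov_type="cluster", cov_kwds={"groups": df["enrollee_id"]})
print(result.params["acquired:post"])         # the DiD effect of vertical integration
```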
Mirzaeinejad, Hossein; Mirzaei, Mehdi; Rafatnia, Sadra
2018-06-11
This study deals with enhancing the directional stability of a vehicle turning at high speed on various road conditions using integrated active steering and differential braking systems. The aim is minimal use of intentional asymmetric braking force to compensate for the drawbacks of active steering control, with only a small reduction of vehicle longitudinal speed. To this end, a new optimal multivariable controller is analytically developed for the integrated steering and braking systems based on the prediction of vehicle nonlinear responses. A fuzzy programming scheme derived from nonlinear phase-plane analysis is also used for managing the two control inputs in various driving conditions. With the proposed fuzzy programming, the weight factors of the control inputs are automatically tuned and softly changed. In order to simulate a real-world control system, required information about system states and parameters which cannot be directly measured is estimated using the Unscented Kalman Filter (UKF). Finally, simulation studies are carried out using a validated vehicle model to show the effectiveness of the proposed integrated control system in the presence of model uncertainties and estimation errors. Copyright © 2018 ISA. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Jana, Suman; Biswas, Pabitra Kumar; Das, Upama
2018-04-01
The analytical and simulation-based study presented in this paper compares a two-level inverter and a five-level inverter with the integration of supercapacitive storage in a renewable energy system. Time-dependent numerical models are used to measure the voltage and current response of the two-level and five-level inverters in a MATLAB/Simulink environment. In this study, supercapacitive sources fed by solar cells are used as inputs to examine the response of a multilevel inverter with the integration of a supercapacitor as the storage device of a renewable energy system. An RL load is used to compute the time response in the MATLAB/Simulink environment. From the simulation results, a comparative study has been made of the two inverter types, which are discussed with reference to their electrical behavior. The simulations also show that a multilevel inverter can convert the energy stored in the supercapacitor, which is extracted from the renewable energy system.
Concept recognition for extracting protein interaction relations from biomedical text
Baumgartner, William A; Lu, Zhiyong; Johnson, Helen L; Caporaso, J Gregory; Paquette, Jesse; Lindemann, Anna; White, Elizabeth K; Medvedeva, Olga; Cohen, K Bretonnel; Hunter, Lawrence
2008-01-01
Background: Reliable information extraction applications have been a long-sought goal of the biomedical text mining community, a goal that, if reached, would provide valuable tools to benchside biologists in their increasingly difficult task of assimilating the knowledge contained in the biomedical literature. We present an integrated approach to concept recognition in biomedical text. Concept recognition provides key information that has been largely missing from previous biomedical information extraction efforts, namely direct links to well-defined knowledge resources that explicitly cement the concept's semantics. The BioCreative II tasks discussed in this special issue have provided a unique opportunity to demonstrate the effectiveness of concept recognition in the field of biomedical language processing. Results: Through the modular construction of a protein interaction relation extraction system, we present several use cases of concept recognition in biomedical text, and relate these use cases to potential uses by the benchside biologist. Conclusion: Current information extraction technologies are approaching performance standards at which concept recognition can begin to deliver high-quality data to the benchside biologist. Our system is available as part of the BioCreative Meta-Server project and on the internet. PMID:18834500
Ad-Hoc Queries over Document Collections - A Case Study
NASA Astrophysics Data System (ADS)
Löser, Alexander; Lutter, Steffen; Düssel, Patrick; Markl, Volker
We discuss the novel problem of supporting analytical business intelligence queries over web-based textual content, e.g., BI-style reports based on hundreds of thousands of documents from an ad-hoc web search result. Neither conventional search engines nor conventional Business Intelligence and ETL tools address this problem, which lies at the intersection of their capabilities. "Google Squared" and our system GOOLAP.info are examples of these kinds of systems. They execute information extraction methods over one or several document collections at query time and integrate extracted records into a common view or tabular structure. Frequent extraction and object-resolution failures cause incomplete records which cannot be joined into a record answering the query. Our focus is the identification of join-reordering heuristics maximizing the number of complete records answering a structured query. With respect to given costs for document extraction we propose two novel join operations: the multi-way CJ-operator joins records from multiple relationships extracted from a single document, and the two-way join operator DJ ensures data density by removing incomplete records from results. In a preliminary case study we observe that our join-reordering heuristics positively impact result size and record density and lower execution costs.
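A minimal sketch of the DJ idea, joining extracted records and keeping only rows with no missing attributes; the field names are illustrative, and the operators' real cost-aware semantics are richer:

```python
# DJ-style dense join: drop records that stay incomplete after joining.
def dense_join(left, right, key):
    index = {r[key]: r for r in right if r.get(key) is not None}
    joined = []
    for l in left:
        r = index.get(l.get(key))
        if r is None:
            continue                      # unjoinable record: discard
        row = {**l, **r}
        if all(v is not None for v in row.values()):
            joined.append(row)            # keep only fully populated records
    return joined

companies = [{"company": "Acme", "ceo": "J. Doe"}, {"company": "Initech", "ceo": None}]
revenues = [{"company": "Acme", "revenue": "1.2B"}]
print(dense_join(companies, revenues, "company"))  # only the complete Acme row survives
```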
Design of a decision-support architecture for management of remotely monitored patients.
Basilakis, Jim; Lovell, Nigel H; Redmond, Stephen J; Celler, Branko G
2010-09-01
Telehealth is the provision of health services at a distance. Typically, this occurs in unsupervised or remote environments, such as a patient's home. We describe one such telehealth system and the integration of extracted clinical measurement parameters with a decision-support system (DSS). An enterprise application-server framework, combined with a rules engine and statistical analysis tools, is used to analyze the acquired telehealth data, searching for trends and shifts in parameter values, as well as identifying individual measurements that exceed predetermined or adaptive thresholds. An overarching business process engine is used to manage the core DSS knowledge base and coordinate workflow outputs of the DSS. The primary role for such a DSS is to provide an effective means to reduce the data overload and to provide a means of health risk stratification to allow appropriate targeting of clinical resources to best manage the health of the patient. In this way, the system may ultimately influence changes in workflow by targeting scarce clinical resources to patients of most need. A single case study extracted from an initial pilot trial of the system, in patients with chronic obstructive pulmonary disease and chronic heart failure, will be reviewed to illustrate the potential benefit of integrating telehealth and decision support in the management of both acute and chronic disease.
Medical image retrieval system using multiple features from 3D ROIs
NASA Astrophysics Data System (ADS)
Lu, Hongbing; Wang, Weiwei; Liao, Qimei; Zhang, Guopeng; Zhou, Zhiming
2012-02-01
Compared to a retrieval using global image features, features extracted from regions of interest (ROIs) that reflect distribution patterns of abnormalities would benefit content-based medical image retrieval (CBMIR) systems more. Currently, most CBMIR systems have been designed for 2D ROIs, which cannot comprehensively reflect 3D anatomical features and the region distribution of lesions. To further improve the accuracy of image retrieval, we proposed a retrieval method with 3D features, including both geometric features such as Shape Index (SI) and Curvedness (CV) and texture features derived from the 3D Gray Level Co-occurrence Matrix, extracted from 3D ROIs, based on our previous 2D medical image retrieval system. The system was evaluated with 20 CT volume datasets for colon polyp detection. Preliminary experiments indicated that the integration of morphological features with texture features could improve retrieval performance greatly. The retrieval result using features extracted from 3D ROIs accorded better with the diagnosis from optical colonoscopy than that based on features from 2D ROIs. With the test database of images, the average accuracy rate for the 3D retrieval method was 76.6%, indicating its potential value in clinical application.
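In one common convention (the paper may use a rescaled variant), the shape index and curvedness are computed from the principal curvatures κ1 ≥ κ2 of the local iso-surface:

```latex
SI = \frac{2}{\pi}\arctan\!\left(\frac{\kappa_1+\kappa_2}{\kappa_1-\kappa_2}\right),
\qquad
CV = \sqrt{\frac{\kappa_1^{2}+\kappa_2^{2}}{2}}
```

SI discriminates cap-like polyp shapes from ridge- and rut-like haustral folds, while CV captures the spatial scale of the curvature.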
Geng, Ping; Fang, Yingtong; Xie, Ronglong; Hu, Weilun; Xi, Xingjun; Chu, Qiao; Dong, Genlai; Shaheen, Nusrat; Wei, Yun
2017-02-01
Sugarcane rind contains functional phenolic acids. Separating these compounds from sugarcane rind enables integrated utilization of the crop and reduces environmental pollution. In this paper, a novel protocol interfacing online solid-phase extraction with high-speed counter-current chromatography (HSCCC) was established, aiming to improve and simplify the separation of phenolic acids from sugarcane rind. The conditions of online solid-phase extraction with HSCCC (solvent system, flow rate of the mobile phase, and saturation extent of the solid-phase extraction absorption) were optimized to improve extraction efficiency and reduce separation time. The separation of phenolic acids was performed with a two-phase solvent system composed of butanol/acetic acid/water at a volume ratio of 4:1:5, and the developed online solid-phase extraction with HSCCC method was validated and successfully applied to sugarcane rind; three phenolic acids, 6.73 mg of gallic acid, 10.85 mg of p-coumaric acid, and 2.78 mg of ferulic acid, with purities of 60.2, 95.4, and 84%, respectively, were obtained from 150 mg of sugarcane rind crude extracts. In addition, three different elution methods for phenolic acid purification (HSCCC, elution-extrusion counter-current chromatography, and back-extrusion counter-current chromatography) were compared. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Algorithms and semantic infrastructure for mutation impact extraction and grounding.
Laurila, Jonas B; Naderi, Nona; Witte, René; Riazanov, Alexandre; Kouznetsov, Alexandre; Baker, Christopher J O
2010-12-02
Mutation impact extraction is a hitherto unaccomplished task in state-of-the-art mutation extraction systems. Protein mutations and their impacts on protein properties are hidden in the scientific literature, making them poorly accessible for protein engineers and inaccessible for phenotype-prediction systems that currently depend on manually curated genomic variation databases. We present the first rule-based approach for the extraction of mutation impacts on protein properties, categorizing their directionality as positive, negative or neutral. Furthermore, protein and mutation mentions are grounded to their respective UniProtKB IDs, and selected protein properties, namely protein functions, to concepts found in the Gene Ontology. The extracted entities are populated into an OWL-DL Mutation Impact ontology, facilitating complex querying for mutation impacts using SPARQL. We illustrate retrieval of proteins and mutant sequences for a given direction of impact on specific protein properties. Moreover, we provide programmatic access to the data through semantic web services using the SADI (Semantic Automated Discovery and Integration) framework. We address the problem of access to legacy mutation data in unstructured form through the creation of novel mutation impact extraction methods which are evaluated on a corpus of full-text articles on haloalkane dehalogenases, tagged by domain experts. Our approaches show state-of-the-art levels of precision and recall for Mutation Grounding and a respectable level of precision but lower recall for the task of Mutant-Impact relation extraction. The system is deployed using text mining and semantic web technologies with the goal of publishing to a broad spectrum of consumers.
Weak signal amplification and detection by higher-order sensory neurons.
Jung, Sarah N; Longtin, Andre; Maler, Leonard
2016-04-01
Sensory systems must extract behaviorally relevant information and therefore often exhibit a very high sensitivity. How the nervous system reaches such high sensitivity levels is an outstanding question in neuroscience. Weakly electric fish (Apteronotus leptorhynchus/albifrons) are an excellent model system to address this question because detailed background knowledge is available regarding their behavioral performance and its underlying neuronal substrate. Apteronotus use their electrosense to detect prey objects. Therefore, they must be able to detect electrical signals as low as 1 μV while using a sensory integration time of <200 ms. How these very weak signals are extracted and amplified by the nervous system is not yet understood. We studied the responses of cells in the early sensory processing areas, namely, the electroreceptor afferents (EAs) and pyramidal cells (PCs) of the electrosensory lobe (ELL), the first-order electrosensory processing area. In agreement with previous work we found that EAs cannot encode very weak signals with a spike count code. However, PCs can encode prey mimic signals by their firing rate, revealing a huge signal amplification between EAs and PCs and also suggesting differences in their stimulus encoding properties. Using a simple leaky integrate-and-fire (LIF) model we predict that the target neurons of PCs in the midbrain torus semicircularis (TS) are able to detect very weak signals. In particular, TS neurons could do so by assuming biologically plausible convergence rates as well as very simple decoding strategies such as temporal integration, threshold crossing, and combining the inputs of PCs. Copyright © 2016 the American Physiological Society.
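The prediction about midbrain TS neurons can be illustrated with the leaky integrate-and-fire model the authors mention: temporally integrating the summed input of many converging PCs and detecting a threshold crossing. A minimal sketch with illustrative parameters, not the paper's fitted values:

```python
# LIF readout of converging pyramidal-cell (PC) input; all parameters illustrative.
import numpy as np

def peak_voltage(signal_amp, seed=1):
    """Integrate summed Poisson PC input in a leaky unit; return the peak voltage."""
    rng = np.random.default_rng(seed)
    dt, tau, w, n_pc = 0.001, 0.05, 0.01, 50       # step (s), time constant, weight, PCs
    t = np.arange(0.0, 0.2, dt)                    # <200 ms integration window
    rate = 20.0 * (1.0 + signal_amp * np.sin(2 * np.pi * 5 * t))  # rate-modulated PCs
    spikes = rng.random((n_pc, t.size)) < rate * dt
    v, vmax = 0.0, 0.0
    for n_in in spikes.sum(axis=0):
        v += dt * (-v / tau) + w * n_in            # leaky integration of summed input
        vmax = max(vmax, v)
    return vmax

print(peak_voltage(0.0), peak_voltage(0.3))        # modulated input reaches higher peaks
```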
Ensemble Classifier Strategy Based on Transient Feature Fusion in Electronic Nose
NASA Astrophysics Data System (ADS)
Bagheri, Mohammad Ali; Montazer, Gholam Ali
2011-09-01
In this paper, we test the performance of several ensembles of classifiers in which each base learner has been trained on a different type of extracted feature. Experimental results show the potential benefits introduced by the usage of simple ensemble classification systems for the integration of different types of transient features.
A Comparison of Tuition Disparities among City, Suburban, Town, and Rural Public Community Colleges
ERIC Educational Resources Information Center
Glover, Louis Charles
2009-01-01
The purpose of this study was to examine differences in tuition rates and college affordability indexes (CAIs) between and among U.S. public community colleges formulated upon urbanization criteria extracted from the Integrated Post Secondary Data System (IPEDS) maintained by the National Center for Education Statistics (NCES), which operates…
DISEASES: text mining and data integration of disease-gene associations.
Pletscher-Frankild, Sune; Pallejà, Albert; Tsafou, Kalliopi; Binder, Janos X; Jensen, Lars Juhl
2015-03-01
Text mining is a flexible technology that can be applied to numerous different tasks in biology and medicine. We present a system for extracting disease-gene associations from biomedical abstracts. The system consists of a highly efficient dictionary-based tagger for named entity recognition of human genes and diseases, which we combine with a scoring scheme that takes into account co-occurrences both within and between sentences. We show that this approach is able to extract half of all manually curated associations with a false positive rate of only 0.16%. Nonetheless, text mining should not stand alone, but be combined with other types of evidence. For this reason, we have developed the DISEASES resource, which integrates the results from text mining with manually curated disease-gene associations, cancer mutation data, and genome-wide association studies from existing databases. The DISEASES resource is accessible through a web interface at http://diseases.jensenlab.org/, where the text-mining software and all associations are also freely available for download. Copyright © 2014 The Authors. Published by Elsevier Inc. All rights reserved.
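A minimal sketch of the scoring idea, crediting a gene-disease pair more for same-sentence co-occurrence than for merely sharing an abstract; the weights are illustrative, not the DISEASES resource's calibrated values:

```python
# Toy within- and between-sentence co-occurrence scoring for gene-disease pairs.
from itertools import product

W_SENT, W_DOC = 1.0, 0.3                 # illustrative weights

def score(abstracts):
    s = {}
    for sentences in abstracts:          # each abstract: a list of tagged sentences
        genes = {g for sent in sentences for g in sent["genes"]}
        diseases = {d for sent in sentences for d in sent["diseases"]}
        for sent in sentences:           # same-sentence credit
            for g, d in product(sent["genes"], sent["diseases"]):
                s[(g, d)] = s.get((g, d), 0.0) + W_SENT
        for g, d in product(genes, diseases):   # same-abstract credit
            s[(g, d)] = s.get((g, d), 0.0) + W_DOC
    return s

doc = [{"genes": {"BRCA1"}, "diseases": {"breast cancer"}},
       {"genes": {"TP53"}, "diseases": set()}]
print(score([doc]))   # BRCA1 earns sentence + abstract credit; TP53 only abstract credit
```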
Concept of operations for knowledge discovery from Big Data across enterprise data warehouses
NASA Astrophysics Data System (ADS)
Sukumar, Sreenivas R.; Olama, Mohammed M.; McNair, Allen W.; Nutaro, James J.
2013-05-01
The success of data-driven business in government, science, and private industry is driving the need for seamless integration of intra and inter-enterprise data sources to extract knowledge nuggets in the form of correlations, trends, patterns and behaviors previously not discovered due to physical and logical separation of datasets. Today, as volume, velocity, variety and complexity of enterprise data keeps increasing, the next generation analysts are facing several challenges in the knowledge extraction process. Towards addressing these challenges, data-driven organizations that rely on the success of their analysts have to make investment decisions for sustainable data/information systems and knowledge discovery. Options that organizations are considering are newer storage/analysis architectures, better analysis machines, redesigned analysis algorithms, collaborative knowledge management tools, and query builders amongst many others. In this paper, we present a concept of operations for enabling knowledge discovery that data-driven organizations can leverage towards making their investment decisions. We base our recommendations on the experience gained from integrating multi-agency enterprise data warehouses at the Oak Ridge National Laboratory to design the foundation of future knowledge nurturing data-system architectures.
Pollard, S J T; Farmer, J G; Knight, D M; Young, P J
2002-01-01
Commercial mono- and polyclonal enzyme-linked immunosorbent assay (ELISA) systems were applied to the on-site analysis of weathered hydrocarbon-contaminated soils at a former integrated steelworks. Comparisons were made between concentrations of solvent extractable matter (SEM) determined gravimetrically by Soxhlet (dichloromethane) extraction and those estimated immunologically by ELISA determination over a concentration range of 2000-330,000 mg SEM/kg soil dry weight. Both ELISA systems under-reported for the more weathered soil samples. Results suggest this is due to matrix effects in the sample rather than any inherent bias in the ELISA systems, and it is concluded that, for weathered hydrocarbons typical of steelworks and coke production sites, the use of ELISA requires careful consideration as a field technique.
aCGH-MAS: Analysis of aCGH by means of Multiagent System
Benito, Rocío; Bajo, Javier; Rodríguez, Ana Eugenia; Abáigar, María
2015-01-01
There are currently different techniques, such as CGH arrays, to study genetic variations in patients. CGH arrays analyze gains and losses in different regions in the chromosome. Regions with gains or losses in pathologies are important for selecting relevant genes or CNVs (copy-number variations) associated with the variations detected within chromosomes. Information corresponding to mutations, genes, proteins, variations, CNVs, and diseases can be found in different databases and it would be of interest to incorporate information of different sources to extract relevant information. This work proposes a multiagent system to manage the information of aCGH arrays, with the aim of providing an intuitive and extensible system to analyze and interpret the results. The agent roles integrate statistical techniques to select relevant variations and visualization techniques for the interpretation of the final results and to extract relevant information from different sources of information by applying a CBR system. PMID:25874203
Topp, Stephanie M; Abimbola, Seye; Joshi, Rohina; Negin, Joel
2018-03-01
Despite growing support for integration of frontline services, a lack of information about the pre-conditions necessary to integrate such services hampers the ability of policy makers and implementers to assess how feasible or worthwhile integration may be, especially in low- and middle-income countries (LMICs). We adopted a modified systematic review with aspects of realist review, including quantitative and qualitative studies that incorporated assessment of health system preparedness for and capacity to implement integrated services. We searched Medline via Ovid, Web of Science and the Cochrane library using terms adapted from Dudley and Garner's systematic review on integration in LMICs. From an initial list of 10 550 articles, 206 were selected for full-text review by two reviewers who independently reviewed articles and inductively extracted and synthesized themes related to health system preparedness. We identified five 'context' related categories and four health system 'capability' themes. The contextual enabling and constraining factors for frontline service integration were: (1) the organizational framework of frontline services, (2) health care worker preparedness, (3) community and client preparedness, (4) upstream logistics and (5) policy and governance issues. The intersecting health system capabilities identified were the need for: (1) sufficiently functional frontline health services, (2) sufficiently trained and motivated health care workers, (3) availability of technical tools and equipment suitable to facilitate integrated frontline services and (4) appropriately devolved authority and decision-making processes to enable frontline managers and staff to adapt integration to local circumstances. Moving beyond claims that integration is defined differently by different programs and thus unsuitable for comparison, this review demonstrates that synthesis is possible. It presents a common set of contextual factors and health system capabilities necessary for successful service integration which may be considered indicators of preparedness and could form the basis for an 'integration preparedness tool'. © The Author 2017. Published by Oxford University Press in association with The London School of Hygiene and Tropical Medicine.
Naidu, Gayathri; Jeong, Sanghyun; Johir, Md Abu Hasan; Fane, Anthony G; Kandasamy, Jaya; Vigneswaran, Saravanamuthu
2017-10-15
The ultimate goal of seawater reverse osmosis (SWRO) brine management is to achieve minimal liquid discharge while recovering valuable resources. The suitability of an integrated system of membrane distillation (MD) with sorption for the recovery of rubidium (Rb⁺) and simultaneous SWRO brine volume reduction has been evaluated for the first time. A polymer-encapsulated potassium copper hexacyanoferrate (KCuFC(PAN)) sorbent exhibited good selectivity for Rb⁺ sorption, with a 10-15% increment at 55 °C (Langmuir Qmax = 125.11 ± 0.20 mg/g) compared to 25 °C (Langmuir Qmax = 108.71 ± 0.20 mg/g). The integrated MD-KCuFC(PAN) system with periodic membrane cleaning enabled concentration of SWRO brine to a volume concentration factor (VCF) of 2.9 (65% water recovery). A stable MD permeate flux was achieved with good quality permeate (conductivity of 15-20 μS/cm). Repeated cycles of MD-KCuFC(PAN) sorption with SWRO brine enabled the extraction of 2.26 mg of Rb⁺ from 12 L of brine (equivalent to 1.9 kg of Rb per day, or 0.7 tonne/yr, from a plant producing 10,000 m³/day of brine). KCuFC(PAN) showed a high regeneration and reuse capacity. NH₄Cl air stripping followed by resorcinol formaldehyde (RF) resin filtration enabled recovery of Rb⁺ from the desorbed solution. Copyright © 2017 Elsevier Ltd. All rights reserved.
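The quoted Qmax values come from fitting equilibrium sorption data to the standard Langmuir isotherm,

```latex
q_e = \frac{Q_{\max}\, K_L\, C_e}{1 + K_L\, C_e}
```

where qe (mg/g) is the equilibrium Rb⁺ uptake, Ce the equilibrium Rb⁺ concentration, and KL the Langmuir affinity constant (not quoted in the abstract).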
1990-12-01
Human Systems Division (AFSC) (HSD/YAO), Brooks Air Force Base ... intra-system piping. Six months to one year would be required to complete and integrate these components. ... system capacity has been exceeded. It is a possibility that during severe storm events, the groundwater extraction wells will be shut down to avoid ...
An Integrated Framework for Analysis of Water Supply Strategies in a Developing City: Chennai, India
NASA Astrophysics Data System (ADS)
Srinivasan, V.; Gorelick, S.; Goulder, L.
2009-12-01
Indian cities are facing a severe water crisis: a rapidly growing population, low tariffs, high leakage rates, and inadequate reservoir storage are straining water supply systems, resulting in unreliable, intermittent piped supply. Conventional approaches to studying the problem of urban water supply have typically considered only centralized piped supply by the water utility. Specifically, they have tended to overlook decentralized actions by consumers, such as groundwater extraction via private wells and aquifer recharge by rainwater harvesting. We present an innovative integrative framework for analyzing urban water supply in Indian cities. The framework is used in a systems model of water supply in the city of Chennai, India, that integrates different components of the urban water system: water flows into the reservoir system, diversion and distribution by the public water utility, groundwater flow in the urban aquifer, informal water markets, and consumer behavior. Historical system behavior from 2002-2006 is used to calibrate the model. The historical system behavior highlights the buffering role of the urban aquifer, which stores water in periods of surplus for extraction by consumers via private wells. The model results show that in Chennai, distribution pipeline leaks result in the transfer of water from the inadequate reservoir system to the urban aquifer. The systems approach also makes it possible to evaluate and compare a wide range of centralized and decentralized policies. Three very different policies, Supply Augmentation (desalination), Efficiency Improvement (raising tariffs and fixing pipe leaks), and Rainwater Harvesting (recharging the urban aquifer by capturing rooftop and yard runoff), were evaluated using the model. The model results suggest that a combination of Rainwater Harvesting and Efficiency Improvement best meets our criteria of welfare maximization, equity, system reliability, and utility profitability. Importantly, the study shows that the combination policy emerges as optimal because of three conditions that are prevalent in Chennai: 1) widespread presence of private wells, 2) inadequate availability of reservoir storage to the utility, and 3) high cost of new supply sources.
Development of emergent processing loops as a system of systems concept
NASA Astrophysics Data System (ADS)
Gainey, James C., Jr.; Blasch, Erik P.
1999-03-01
This paper describes an engineering approach toward implementing the current neuroscientific understanding of how the primate brain fuses, or integrates, 'information' in the decision-making process. We describe a System of Systems (SoS) design for improving the overall performance, capabilities, operational robustness, and user confidence in Identification (ID) systems and show how it could be applied to biometrics security. We use the Physio-associative temporal sensor integration algorithm (PATSIA), which is motivated by observed functions and interactions of the thalamus, hippocampus, and cortical structures in the brain. PATSIA utilizes signal theory mathematics to model how the human efficiently perceives and uses information from the environment. The hybrid architecture implements a possible SoS-level description of the Joint Directors of US Laboratories for Fusion Working Group's functional description involving 5 levels of fusion and their associated definitions. This SoS architecture proposes dynamic sensor and knowledge-source integration by implementing multiple Emergent Processing Loops for predicting, feature extracting, matching, and searching both static and dynamic databases, like MSTAR's PEMS loops. Biologically, this effort demonstrates these objectives by modeling similar processes from the eyes, ears, and somatosensory channels, through the thalamus, and to the cortices as appropriate, while using the hippocampus for short-term memory search and storage as necessary. The particular approach demonstrated incorporates commercially available speaker verification and face recognition software and hardware to collect data and extract features for the PATSIA. The PATSIA maximizes the confidence levels for target identification or verification in dynamic situations using a belief filter. The proof of concept described here is easily adaptable and scalable to other military and nonmilitary sensor fusion applications.
Some calculable contributions to entanglement entropy.
Hertzberg, Mark P; Wilczek, Frank
2011-02-04
Entanglement entropy appears as a central property of quantum systems in broad areas of physics. However, its precise value is often sensitive to unknown microphysics, rendering it incalculable. By considering parametric dependence on correlation length, we extract finite, calculable contributions to the entanglement entropy for a scalar field between the interior and exterior of a spatial domain of arbitrary shape. The leading term is proportional to the area of the dividing boundary; we also extract finite subleading contributions for a field defined in the bulk interior of a waveguide in 3+1 dimensions, including terms proportional to the waveguide's cross-sectional geometry: its area, perimeter length, and integrated curvature. We also consider related quantities at criticality and suggest a class of systems for which these contributions might be measurable.
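Schematically, the finite waveguide contributions described above can be grouped as

```latex
S_{\text{finite}} = \gamma_A(\xi)\, A + \gamma_P(\xi)\, P + \gamma_\kappa(\xi) \oint \kappa\, ds + \dots
```

where A, P, and the closed integral of κ are the area, perimeter, and integrated curvature of the waveguide's cross-section, and the coefficients depend on the correlation length ξ. This is a schematic grouping of the terms the abstract names, not the paper's exact expression.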
Identifying interactions between chemical entities in biomedical text.
Lamurias, Andre; Ferreira, João D; Couto, Francisco M
2014-10-23
Interactions between chemical compounds described in biomedical text can be of great importance to drug discovery and design, as well as pharmacovigilance. We developed a novel system, "Identifying Interactions between Chemical Entities" (IICE), to identify chemical interactions described in text. Kernel-based Support Vector Machines first identify the interactions and then an ensemble classifier validates and classifies the type of each interaction. This relation extraction module was evaluated with the corpus released for the DDI Extraction task of SemEval 2013, obtaining results comparable to state-of-the-art methods for this type of task. We integrated this module with our chemical named entity recognition module and made the whole system available as a web tool at www.lasige.di.fc.ul.pt/webtools/iice.
Indicators and measurement tools for health system integration: a knowledge synthesis protocol.
Oelke, Nelly D; Suter, Esther; da Silva Lima, Maria Alice Dias; Van Vliet-Brown, Cheryl
2015-07-29
Health system integration is a key component of health system reform with the goal of improving outcomes for patients, providers, and the health system. Although health systems continue to strive for better integration, current delivery of health services continues to be fragmented. A key gap in the literature is the lack of information on what successful integration looks like and how to measure achievement towards an integrated system. This multi-site study protocol builds on a prior knowledge synthesis completed by two of the primary investigators which identified 10 key principles that collectively support health system integration. The aim is to answer two research questions: What are appropriate indicators for each of the 10 key integration principles developed in our previous knowledge synthesis and what measurement tools are used to measure these indicators? To enhance generalizability of the findings, a partnership between Canada and Brazil was created as health system integration is a priority in both countries and they share similar contexts. This knowledge synthesis will follow an iterative scoping review process with emerging information from knowledge-user engagement leading to the refinement of research questions and study selection. This paper describes the methods for each phase of the study. Research questions were developed with stakeholder input. Indicator identification and prioritization will utilize a modified Delphi method and patient/user focus groups. Based on priority indicators, a search of the literature will be completed and studies screened for inclusion. Quality appraisal of relevant studies will be completed prior to data extraction. Results will be used to develop recommendations and key messages to be presented through integrated and end-of-grant knowledge translation strategies with researchers and knowledge-users from the three jurisdictions. This project will directly benefit policy and decision-makers by providing an easy accessible set of indicators and tools to measure health system integration across different contexts and cultures. Being able to evaluate the success of integration strategies and initiatives will lead to better health system design and improved health outcomes for patients.
Gstruct: a system for extracting schemas from GML documents
NASA Astrophysics Data System (ADS)
Chen, Hui; Zhu, Fubao; Guan, Jihong; Zhou, Shuigeng
2008-10-01
Geography Markup Language (GML) has become the de facto standard for geographic information representation on the internet. A GML schema defines the structure, content, and semantics of GML documents; it contains useful structural information and plays an important role in storing, querying, and analyzing GML data. However, a GML schema is not mandatory, and it is common for a GML document to contain no schema. In this paper, we present Gstruct, a tool for GML schema extraction. Gstruct finds the features in the input GML documents, identifies geometry datatypes as well as simple datatypes, then integrates all these features and eliminates improper components to output the optimal schema. Experiments demonstrate that Gstruct is effective in extracting semantically meaningful schemas from GML documents.
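A minimal sketch of the schema-inference idea: scan the elements of a GML document and accumulate a name-to-type map. The tag names and the two inferred types are illustrative; real GML schema extraction must also recognize geometry datatypes and handle namespaces:

```python
# Toy element/type inference over a GML-like document.
import xml.etree.ElementTree as ET
from collections import defaultdict

def infer_types(xml_text):
    types = defaultdict(set)
    for elem in ET.fromstring(xml_text).iter():
        if elem.text and elem.text.strip():
            try:
                float(elem.text.strip())
                types[elem.tag].add("decimal")   # numeric leaf content
            except ValueError:
                types[elem.tag].add("string")    # everything else
    return dict(types)

sample = "<city><name>Shanghai</name><population>24183300</population></city>"
print(infer_types(sample))   # {'name': {'string'}, 'population': {'decimal'}}
```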
A New Optical Design for Imaging Spectroscopy
NASA Astrophysics Data System (ADS)
Thompson, K. L.
2002-05-01
We present an optical design concept for imaging spectroscopy with some advantages over current systems. The system projects monochromatic images onto the 2-D array detector(s). Faint-object and crowded-field spectroscopy data can be reduced first using image-processing techniques and then by building the spectrum, unlike integral field units, where one must first extract the spectra, build data cubes from these, and then reconstruct the target's integrated spectral flux. Like integral field units, all photons are detected simultaneously, unlike tunable filters, which must be scanned through the wavelength range of interest and therefore pay a sensitivity penalty. Several sample designs are presented, including an instrument optimized for measuring intermediate-redshift galaxy cluster velocity dispersions, one designed for near-infrared ground-based adaptive optics, and one intended for space-based rapid follow-up of transient point sources such as supernovae and gamma-ray bursts.
O'Connor, Timothy; Rawat, Siddharth; Markman, Adam; Javidi, Bahram
2018-03-01
We propose a compact imaging system that integrates an augmented reality head mounted device with digital holographic microscopy for automated cell identification and visualization. A shearing interferometer is used to produce holograms of biological cells, which are recorded using customized smart glasses containing an external camera. After image acquisition, segmentation is performed to isolate regions of interest containing biological cells in the field-of-view, followed by digital reconstruction of the cells, which is used to generate a three-dimensional (3D) pseudocolor optical path length profile. Morphological features are extracted from the cell's optical path length map, including mean optical path length, coefficient of variation, optical volume, projected area, projected area to optical volume ratio, cell skewness, and cell kurtosis. Classification is performed using the random forest classifier, support vector machines, and K-nearest neighbor, and the results are compared. Finally, the augmented reality device displays the cell's pseudocolor 3D rendering of its optical path length profile, extracted features, and the identified cell's type or class. The proposed system could allow a healthcare worker to quickly visualize cells using augmented reality smart glasses and extract the relevant information for rapid diagnosis. To the best of our knowledge, this is the first report on the integration of digital holographic microscopy with augmented reality devices for automated cell identification and visualization.
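The classification stage can be sketched with the three classifiers the abstract compares, applied to the seven listed morphological features; the data below are random stand-ins for features extracted from real optical path length maps:

```python
# Compare the three classifiers mentioned in the abstract on stand-in feature data.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# columns: mean OPL, coefficient of variation, optical volume, projected area,
# area/volume ratio, skewness, kurtosis (the features listed in the abstract)
X = rng.normal(size=(120, 7))
y = rng.integers(0, 2, 120)              # two hypothetical cell classes

for clf in (RandomForestClassifier(), SVC(), KNeighborsClassifier()):
    acc = clf.fit(X[:90], y[:90]).score(X[90:], y[90:])
    print(type(clf).__name__, round(acc, 2))
```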
McGowan, Kevin B; Sah, Robert L
2005-05-01
Integrative repair of cartilage was previously found to depend on collagen synthesis and maturation. beta-aminopropionitrile (BAPN) treatment, which irreversibly blocks lysyl oxidase, inhibited the formation of collagen crosslinks, prevented development of adhesive strength, and caused a buildup of GuHCl-extractable collagen crosslink precursors. This buildup of crosslink precursor in the tissue may be useful for enhancing integrative repair. We tested in vitro the hypothesis that pre-treatment of cartilage with BAPN, followed by washout before implantation, could be a useful therapeutic strategy to accelerate subsequent collagen maturation. In individual cartilage disks, collagen processing was reversibly blocked by BAPN treatment (0.1 mM) as indicated by a BAPN-induced increase in the total and proportion of incorporated radiolabel that was extractable by 4M guanidine-HCl, followed by a decrease, within 3-4 days of BAPN washout, in the proportion of extractable radiolabel to control levels. With a similar pattern, integration between pairs of apposed cartilage blocks was reversibly blocked by BAPN treatment, and followed by an enhancement of integration after BAPN washout. The low and high levels of integration were associated with enrichment in [(3)H]proline in a form that was susceptible and resistant, respectively, to extraction. With increasing duration up to 7 days after BAPN pre-treatment, the levels of [(3)H]proline extraction decreased, and the development of adhesive strength increased. Thus, BAPN can be used to modulate integrative cartilage repair.
Integrative medicine for managing the symptoms of lupus nephritis
Choi, Tae-Young; Jun, Ji Hee; Lee, Myeong Soo
2018-01-01
Abstract Background: Integrative medicine is claimed to improve the symptoms of lupus nephritis. No systematic reviews have been performed on the application of integrative medicine for lupus nephritis in patients with systemic lupus erythematosus (SLE). Thus, this review will aim to evaluate the current evidence on the efficacy of integrative medicine for the management of lupus nephritis in patients with SLE. Methods and analyses: The following electronic databases will be searched for studies published from their dates of inception to February 2018: Medline, EMBASE and the Cochrane Central Register of Controlled Trials (CENTRAL), as well as 6 Korean medical databases (KoreaMed, the Oriental Medicine Advanced Search Integrated System [OASIS], DBpia, the Korean Medical Database [KMbase], the Research Information Service System [RISS], and the Korean Studies Information Services System [KISS]), and 1 Chinese medical database (the China National Knowledge Infrastructure [CNKI]). Study selection, data extraction, and assessment will be performed independently by 2 researchers. The risk of bias (ROB) will be assessed using the Cochrane ROB tool. Dissemination: This systematic review will be published in a peer-reviewed journal and disseminated both electronically and in print. The review will be updated to inform and guide healthcare practice and policy. Trial registration number: PROSPERO 2018 CRD42018085205 PMID:29595669
Challenges of Implementing ESD in the Education Sector; Experiences in Norway
NASA Astrophysics Data System (ADS)
Sandås, Astrid; Benedict, Faye
This article presents and reflects on Norwegian experiences over a period of about 15 years with implementing the national strategy for education for sustainable development (ESD) in the education system, and extracts lessons about the integration of ESD into education systems. After an introduction to central ideas of sustainable development and ESD, the article discusses the need for appropriate strategies and instruments. Key factors are collaboration to allow pupils and schools to actively contribute to a positive development locally and globally, interdisciplinary approaches to complex sustainability issues, and appropriate use of ICT and other media. ESD programmes and activities should support school development and build the capacity of schools and teachers for the integration of ESD.
Plant Water Uptake in Drying Soils
Lobet, Guillaume; Couvreur, Valentin; Meunier, Félicien; Javaux, Mathieu; Draye, Xavier
2014-01-01
Over the last decade, investigations of root water uptake have evolved toward a deeper integration of the properties of the soil and root compartments, with the goal of improving our understanding of water acquisition from drying soils. This evolution parallels the increasing attention of agronomists to suboptimal crop production environments. Recent results have led to the description of root system architectures that might contribute to deep-water extraction or to water-saving strategies. In addition, the manipulation of root hydraulic properties would provide further opportunities to improve water uptake. However, modeling studies highlight the role of soil hydraulics in the control of water uptake in drying soil and call for integrative soil-plant system approaches. PMID:24515834
Adaptable, high recall, event extraction system with minimal configuration
2015-01-01
Background Biomedical event extraction has been a major focus of biomedical natural language processing (BioNLP) research since the first BioNLP shared task was held in 2009. Accordingly, a large number of event extraction systems have been developed. Most such systems, however, have been developed for specific tasks and/or incorporate task-specific settings, making their application to new corpora and tasks problematic without modification of the systems themselves. There is thus a need for event extraction systems that can achieve high levels of accuracy when applied to corpora in new domains, without the need for exhaustive tuning or modification, whilst retaining competitive levels of performance. Results We have enhanced our state-of-the-art event extraction system, EventMine, to alleviate the need for task-specific tuning. Task-specific details are specified in a configuration file, while extensive task-specific parameter tuning is avoided through the integration of a weighting method, a covariate shift method, and their combination. The task-specific configuration and weighting method have been employed within the context of two different sub-tasks of the BioNLP shared task 2013, i.e. Cancer Genetics (CG) and Pathway Curation (PC), removing the need to modify the system specifically for each task. With minimal task-specific configuration and tuning, EventMine achieved 1st place in the PC task and 2nd place in the CG task, achieving the highest recall for both tasks. The system has been further enhanced following the shared task by incorporating the covariate shift method and entity generalisations based on the task definitions, leading to further performance improvements. Conclusions We have shown that it is possible to apply a state-of-the-art event extraction system to new tasks with high levels of performance, without having to modify the system internally. Both covariate shift and weighting methods are useful in facilitating the production of high-recall systems. These methods and their combination can adapt a model to the target data with no deep tuning and little manual configuration. PMID:26201408
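One standard way to realize a covariate shift method is importance weighting: estimate the density ratio between target and source feature distributions with a domain classifier, then train with those instance weights. A minimal sketch of the general technique, not EventMine's specific implementation:

```python
# Importance weighting for covariate shift via a source-vs-target domain classifier.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
X_src = rng.normal(0.0, 1.0, (200, 5))     # labelled source-domain features
y_src = rng.integers(0, 2, 200)
X_tgt = rng.normal(0.5, 1.0, (200, 5))     # unlabelled target-domain features

# P(target | x) / P(source | x) approximates the density ratio p_tgt(x)/p_src(x).
dom = LogisticRegression().fit(np.vstack([X_src, X_tgt]),
                               np.r_[np.zeros(200), np.ones(200)])
p_tgt = dom.predict_proba(X_src)[:, 1]
weights = p_tgt / (1.0 - p_tgt)

model = LinearSVC().fit(X_src, y_src, sample_weight=weights)  # shift-adapted model
```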
Min, Yul Ha; Park, Hyeoun-Ae; Chung, Eunja; Lee, Hyunsook
2013-12-01
The purpose of this paper is to describe the components of a next-generation electronic nursing records system ensuring full semantic interoperability and integrating evidence into the nursing records system. A next-generation electronic nursing records system based on detailed clinical models and clinical practice guidelines was developed at Seoul National University Bundang Hospital in 2013. This system has two components, a terminology server and a nursing documentation system. The terminology server manages nursing narratives generated from entity-attribute-value triplets of detailed clinical models using a natural language generation system. The nursing documentation system provides nurses with a set of nursing narratives arranged around the recommendations extracted from clinical practice guidelines. An electronic nursing records system based on detailed clinical models and clinical practice guidelines was successfully implemented in a hospital in Korea. The next-generation electronic nursing records system can support nursing practice and nursing documentation, which in turn will improve data quality.
Inertial navigation sensor integrated obstacle detection system
NASA Technical Reports Server (NTRS)
Bhanu, Bir (Inventor); Roberts, Barry A. (Inventor)
1992-01-01
A system that incorporates inertial sensor information into optical flow computations to detect obstacles and to provide alternative navigational paths free from obstacles. The system is a maximally passive obstacle detection system that makes selective use of an active sensor. The active detection typically utilizes a laser. The passive sensor suite includes binocular stereo, motion stereo and variable fields-of-view. Optical flow computations involve extraction, derotation and matching of interest points from sequential frames of imagery for range interpolation of the sensed scene, which in turn provides obstacle information for purposes of safe navigation.
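The interest-point extraction and matching step can be sketched as follows with OpenCV; derotation using the inertial measurements is omitted, and the frame filenames are hypothetical.

```python
# Sketch: interest-point extraction and pyramidal Lucas-Kanade matching
# across sequential frames, the core of the optical-flow step described
# above. Requires OpenCV.
import cv2
import numpy as np

frame0 = cv2.imread("frame0.png", cv2.IMREAD_GRAYSCALE)
frame1 = cv2.imread("frame1.png", cv2.IMREAD_GRAYSCALE)

# Interest points in the first frame.
pts0 = cv2.goodFeaturesToTrack(frame0, maxCorners=200,
                               qualityLevel=0.01, minDistance=7)

# Match them in the next frame.
pts1, status, _err = cv2.calcOpticalFlowPyrLK(frame0, frame1, pts0, None)

# Flow vectors of successfully tracked points; after derotation, longer
# vectors generally correspond to nearer scene points.
flow = (pts1 - pts0).reshape(-1, 2)[status.ravel() == 1]
print("median flow magnitude:", np.median(np.linalg.norm(flow, axis=1)))
```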
NASA Technical Reports Server (NTRS)
Billingsley, F.
1982-01-01
Concerns are expressed about the data handling aspects of system design and about enabling technology for data handling and data analysis. The status, contributing factors, critical issues, and recommendations for investigations are listed for data handling, rectification and registration, and information extraction. Potential support is identified for individual principal investigators' research tasks, systematic data system design, and system operation. The need for an airborne spectrometer-class instrument for fundamental research in high spectral and spatial resolution is indicated. Geographic information system formatting and labelling techniques, very large scale integration, and methods for providing multitype data sets must also be developed.
Development of a Novel and Rapid Fully Automated Genetic Testing System.
Uehara, Masayuki
2016-01-01
We have developed a rapid genetic testing system integrating nucleic acid extraction, purification, amplification, and detection in a single cartridge. The system performs real-time polymerase chain reaction (PCR) after nucleic acid purification in a fully automated manner. RNase P, a housekeeping gene, was purified from human nasal epithelial cells using silica-coated magnetic beads and subjected to real-time PCR using a novel droplet-real-time-PCR machine. The process was completed within 13 min. This system will be widely applicable for research and diagnostic uses.
A neural network ActiveX based integrated image processing environment.
Ciuca, I; Jitaru, E; Alaicescu, M; Moisil, I
2000-01-01
The paper outlines an integrated image processing environment that uses neural network ActiveX technology for object recognition and classification. The image processing environment, which is Windows based, encapsulates a Multiple-Document Interface (MDI) and is menu driven. Object (shape) parameter extraction is focused on features that are invariant in terms of translation, rotation and scale transformations. The neural network models that can be incorporated as ActiveX components into the environment allow both clustering and classification of objects from the analysed image. Mapping neural networks perform an input sensitivity analysis on the extracted feature measurements and thus facilitate the removal of irrelevant features and improvements in the degree of generalisation. The program has been used to evaluate the dimensions of the hydrocephalus in a study calculating the Evans index and the angle of the frontal horns of the ventricular system.
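One standard realization of translation-, rotation- and scale-invariant shape parameters of the kind described is the Hu moment set; the sketch below assumes a pre-segmented binary mask and is not the paper's exact feature list.

```python
# Sketch: translation-, rotation- and scale-invariant shape features via
# Hu moments. The binary mask "shape.png" is a hypothetical segmented
# object image.
import cv2
import numpy as np

mask = cv2.imread("shape.png", cv2.IMREAD_GRAYSCALE)
_, mask = cv2.threshold(mask, 127, 255, cv2.THRESH_BINARY)

m = cv2.moments(mask, binaryImage=True)
hu = cv2.HuMoments(m).ravel()

# Log-scaling compresses the dynamic range before clustering/classification.
features = -np.sign(hu) * np.log10(np.abs(hu) + 1e-30)
print(features)
```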
Li, Xiaolu; Liang, Yu
2015-05-20
Analysis of light detection and ranging (LiDAR) intensity data to extract surface features is of great interest in remote sensing research. One potential application of LiDAR intensity data is target classification. A new bidirectional reflectance distribution function (BRDF) model is derived for target characterization of rough and smooth surfaces. Based on the geometry of our coaxial full-waveform LiDAR system, the integration method is improved through coordinate transformation to establish the relationship between the BRDF model and intensity data of LiDAR. A series of experiments using typical urban building materials are implemented to validate the proposed BRDF model and integration method. The fitting results show that three parameters extracted from the proposed BRDF model can distinguish the urban building materials from perspectives of roughness, specular reflectance, and diffuse reflectance. A comprehensive analysis of these parameters will help characterize surface features in a physically rigorous manner.
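A parameter-extraction step of this kind can be sketched as a least-squares fit of a simple diffuse-plus-specular reflectance model to intensity-versus-angle samples; the model form and data below are illustrative, not the paper's derived BRDF.

```python
# Sketch: fitting a simple diffuse + specular reflectance model to LiDAR
# intensity samples as a function of incidence angle. The three fitted
# parameters loosely mirror the roles of diffuse reflectance, specular
# reflectance and roughness; the data are synthetic.
import numpy as np
from scipy.optimize import curve_fit

def reflectance(theta, kd, ks, m):
    """Diffuse (Lambertian) term plus a Gaussian specular lobe of width m."""
    return kd * np.cos(theta) + ks * np.exp(-(theta / m) ** 2)

theta = np.radians(np.arange(0, 80, 5, dtype=float))
rng = np.random.default_rng(6)
intensity = reflectance(theta, 0.6, 0.3, 0.2) + 0.01 * rng.normal(size=theta.size)

params, _cov = curve_fit(reflectance, theta, intensity, p0=[0.5, 0.5, 0.3])
print("kd, ks, m =", params)
```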
Multi-dimensional spatial light communication made with on-chip InGaN photonic integration
NASA Astrophysics Data System (ADS)
Yang, Yongchao; Zhu, Bingcheng; Shi, Zheng; Wang, Jinyuan; Li, Xin; Gao, Xumin; Yuan, Jialei; Li, Yuanhang; Jiang, Yan; Wang, Yongjin
2017-04-01
Here, we propose, fabricate and characterize suspended photonic integration of an InGaN multiple-quantum-well light-emitting diode (MQW-LED), a waveguide and an InGaN MQW-photodetector on a single chip. The unique light emission property of the InGaN MQW-LED makes it feasible to establish multi-dimensional spatial data transmission using visible light. The in-plane light communication system comprises the InGaN MQW-LED, the waveguide and the InGaN MQW-photodetector, and the out-of-plane data transmission is realized by detecting the free-space light emission via a commercial photodiode module. Moreover, full-duplex light communication is experimentally demonstrated at a data transmission rate of 50 Mbps when both InGaN MQW-diodes operate under simultaneous light emission and detection mode. The in-plane superimposed signals can be extracted through the self-interference cancellation method, and the out-of-plane superimposed signals are in good agreement with the calculated signals according to the extracted transmitted signals. These results are promising for the development of on-chip InGaN photonic integration for diverse applications.
An integrated method for cancer classification and rule extraction from microarray data
Huang, Liang-Tsung
2009-01-01
Different microarray techniques have recently been used successfully to investigate useful information for cancer diagnosis at the gene expression level, owing to their ability to measure thousands of gene expression levels in a massively parallel way. One important issue is to improve the classification performance of microarray data. Ideally, influential genes and even interpretable rules should be explored at the same time to offer biological insight. Introducing concepts of system design from software engineering, this paper presents an integrated and effective method (named X-AI) for accurate cancer classification and the acquisition of knowledge from DNA microarray data. The method includes a feature selector to systematically extract the relatively important genes, so as to reduce the dimensionality and retain as much of the class discriminatory information as possible. Next, diagonal quadratic discriminant analysis (DQDA) was combined to classify tumors, and generalized rule induction (GRI) was integrated to establish association rules that can give an understanding of the relationships between cancer classes and related genes. Two non-redundant datasets of acute leukemia were used to validate the proposed X-AI, showing significantly high accuracy for discriminating different classes. On the other hand, I have presented the abilities of X-AI to extract relevant genes, as well as to develop interpretable rules. Further, a web server has been established for cancer classification and it is freely available. PMID:19272192
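DQDA fits one Gaussian per class with a diagonal covariance matrix, which coincides with Gaussian naive Bayes; a minimal sketch of the selection-plus-DQDA pipeline, on random stand-in data rather than the leukemia sets, might look like this.

```python
# Sketch: filter-style gene selection followed by diagonal quadratic
# discriminant analysis (DQDA), realized as Gaussian naive Bayes.
# The data are random stand-ins, not the acute leukemia datasets.
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.naive_bayes import GaussianNB
from sklearn.pipeline import make_pipeline
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.normal(size=(72, 7129))          # samples x genes (toy shape)
y = rng.integers(0, 2, size=72)          # class labels

model = make_pipeline(SelectKBest(f_classif, k=50), GaussianNB())
print(cross_val_score(model, X, y, cv=5).mean())
```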
Integrated microfluidic systems for cell lysis, mixing/pumping and DNA amplification
NASA Astrophysics Data System (ADS)
Lee, Chia-Yen; Lee, Gwo-Bin; Lin, Jr-Lung; Huang, Fu-Chun; Liao, Chia-Sheng
2005-06-01
The present paper reports a fully automated microfluidic system for the DNA amplification process by integrating an electroosmotic pump, an active micromixer and an on-chip temperature control system. In this DNA amplification process, the cell lysis is initially performed in a micro cell lysis reactor. Extracted DNA samples, primers and reagents are then driven electroosmotically into a mixing region where they are mixed by the active micromixer. The homogeneous mixture is then thermally cycled in a micro-PCR (polymerase chain reaction) chamber to perform DNA amplification. Experimental results show that the proposed device can successfully automate the sample pretreatment operation for DNA amplification, thereby delivering significant time and effort savings. The new microfluidic system, which facilitates cell lysis, sample driving/mixing and DNA amplification, could provide a significant contribution to ongoing efforts to miniaturize bio-analysis systems by utilizing a simple fabrication process and cheap materials.
Duftschmid, Georg; Chaloupka, Judith; Rinner, Christoph
2013-01-22
The dual model approach represents a promising solution for achieving semantically interoperable standardized electronic health record (EHR) exchange. Its acceptance, however, will depend on the effort required for integrating archetypes into legacy EHR systems. We propose a corresponding approach that: (a) automatically generates entry forms in legacy EHR systems from archetypes; and (b) allows the immediate export of EHR documents that are recorded via the generated forms and stored in the EHR systems' internal format as standardized and archetype-compliant EHR extracts. As a prerequisite for applying our approach, we define a set of basic requirements for the EHR systems. We tested our approach with an EHR system called ArchiMed and were able to successfully integrate 15 archetypes from a test set of 27. For 12 archetypes, the form generation failed owing to a particular type of complex structure (multiple repeating subnodes), which was prescribed by the archetypes but not supported by ArchiMed's data model. Our experiences show that archetypes should be customized based on the planned application scenario before their integration. This would allow problematic structures to be dissolved and irrelevant optional archetype nodes to be removed. For customization of archetypes, openEHR templates or specialized archetypes may be employed. Gaps in the data types or terminological features supported by an EHR system will often not preclude integration of the relevant archetypes. More work needs to be done on the usability of the generated forms.
Improved integral images compression based on multi-view extraction
NASA Astrophysics Data System (ADS)
Dricot, Antoine; Jung, Joel; Cagnazzo, Marco; Pesquet, Béatrice; Dufaux, Frédéric
2016-09-01
Integral imaging is a technology based on plenoptic photography that captures and samples the light-field of a scene through a micro-lens array. It provides views of the scene from several angles and therefore is foreseen as a key technology for future immersive video applications. However, integral images have a large resolution and a structure based on micro-images which is challenging to encode. A compression scheme for integral images based on view extraction has previously been proposed, with average BD-rate gains of 15.7% (up to 31.3%) reported over HEVC when using one single extracted view. As the efficiency of the scheme depends on a tradeoff between the bitrate required to encode the view and the quality of the image reconstructed from the view, it is proposed to increase the number of extracted views. Several configurations are tested with different positions and numbers of extracted views. Compression efficiency is increased, with average BD-rate gains of 22.2% (up to 31.1%) reported over the HEVC anchor, with a realistic runtime increase.
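The view-extraction idea underlying the scheme can be sketched in a few lines: one sub-aperture view is obtained by sampling the same offset inside every micro-image. The square, axis-aligned lenslet grid assumed here is a simplification.

```python
# Sketch: extracting one sub-aperture view from an integral (lenslet)
# image by sampling the same pixel offset in every micro-image.
import numpy as np

def extract_view(lenslet_img: np.ndarray, pitch: int, u: int, v: int):
    """Return the view at offset (u, v) within each micro-image."""
    h, w = lenslet_img.shape[:2]
    return lenslet_img[u:h - h % pitch:pitch, v:w - w % pitch:pitch]

# The central view (u = v = pitch // 2) is a natural choice for the view
# encoded as the base layer of the compression scheme.
demo = np.arange(36).reshape(6, 6)
print(extract_view(demo, pitch=3, u=1, v=1))   # 2x2 view from 4 micro-images
```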
NASA Astrophysics Data System (ADS)
Shukla, Hemant; Bonissent, Alain
2017-04-01
We present the parameterized simulation of an integral-field unit (IFU) slicer spectrograph and its applications in spectroscopic studies, namely, for probing dark energy with type Ia supernovae. The simulation suite is called the fast-slicer IFU simulator (FISim). The data flow of FISim realistically models the optics of the IFU along with the propagation effects, including cosmological, zodiacal, instrumentation and detector effects. FISim simulates the spectrum extraction by computing the error matrix on the extracted spectrum. The applications for type Ia supernova spectroscopy are used to establish the efficacy of the simulator in exploring the wider parametric space, in order to optimize the science and mission requirements. The input spectral models utilize observables such as the optical depth and velocity of the Si II absorption feature in the supernova spectrum as the measured parameters for various studies. Using FISim, we introduce a mechanism for preserving the complete state of a system, called the ∂p/∂f matrix, which allows for compression, reconstruction and spectrum extraction; we introduce a novel and efficient method for spectrum extraction, called super-optimal spectrum extraction; and we conduct various studies such as the optimal point spread function, optimal resolution and parameter estimation. We demonstrate that for space-based telescopes the optimal resolution lies in the region near R ∼ 117 for read noise of 1 e- and 7 e-, using a 400 km s⁻¹ error threshold on the Si II velocity.
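As a baseline for the spectrum-extraction step, the classical inverse-variance (Horne-style) optimal extraction can be sketched as follows; the paper's super-optimal method and its ∂p/∂f machinery are not reproduced here, and the input arrays are hypothetical.

```python
# Sketch: classical inverse-variance (Horne-style) optimal extraction.
# Inputs: a hypothetical 2-D spectrogram (wavelength x spatial), its
# per-pixel variance, and the normalized spatial profile per row.
import numpy as np

def optimal_extract(data, var, profile):
    """Inverse-variance weighted extraction along the spatial axis."""
    w = profile / var
    flux = (w * data).sum(axis=1) / (w * profile).sum(axis=1)
    flux_var = 1.0 / (profile ** 2 / var).sum(axis=1)
    return flux, flux_var
```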
A Circuit Extraction System and Graphical Display for VLSI (Very Large Scale Integrated) Design.
1989-12-01
understandable as a net-list. The file contains information on the different physical layers of a polysilicon chip, not how these layers combine to form the circuit.
Neurons with two sites of synaptic integration learn invariant representations.
Körding, K P; König, P
2001-12-01
Neurons in mammalian cerebral cortex combine specific responses with respect to some stimulus features with invariant responses to other stimulus features. For example, in primary visual cortex, complex cells code for orientation of a contour but ignore its position to a certain degree. In higher areas, such as the inferotemporal cortex, translation-invariant, rotation-invariant, and even viewpoint-invariant responses can be observed. Such properties are of obvious interest to artificial systems performing tasks like pattern recognition. It remains to be resolved how such response properties develop in biological systems. Here we present an unsupervised learning rule that addresses this problem. It is based on a neuron model with two sites of synaptic integration, allowing qualitatively different effects of input to basal and apical dendritic trees, respectively. Without supervision, the system learns to extract invariance properties using temporal or spatial continuity of stimuli. Furthermore, top-down information can be smoothly integrated in the same framework. Thus, this model lends a physiological implementation to approaches of unsupervised learning of invariant-response properties.
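A classical, simplified stand-in for such invariance learning from temporal continuity is Foldiak's trace rule, sketched below; it collapses the two-site neuron model into a single compartment and uses random stand-in stimuli.

```python
# Sketch: Foldiak's trace rule for learning invariance from temporal
# continuity. A slowly decaying activity trace gates a Hebbian update,
# so inputs that follow each other in time come to drive the same unit.
import numpy as np

rng = np.random.default_rng(2)
W = rng.normal(scale=0.1, size=64)     # afferent weights of one neuron
trace, eta, delta = 0.0, 0.01, 0.2

for t in range(10_000):
    x = rng.random(64)                 # stand-in for a stimulus sequence
    y = float(W @ x)
    trace = (1 - delta) * trace + delta * y   # slowly decaying trace
    W += eta * trace * (x - W)         # Hebbian pull toward inputs that
                                       # arrive while the trace is still high
```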
Jiang, Peng; Zhao, Shuai; Zhu, Rong
2015-01-01
This paper presents a smart sensing strip for noninvasively monitoring respiratory flow in real time. The monitoring system comprises a monolithically integrated flexible hot-film flow sensor adhered to a molded flexible silicone case, in which a miniaturized conditioning circuit with a Bluetooth 4.0 LE module is packaged, and a personal mobile device that wirelessly acquires respiratory data transmitted from the flow sensor, extracts vital signs, and performs medical diagnosis. The system serves as a wearable device to monitor comprehensive respiratory flow while avoiding the use of an uncomfortable nasal cannula. The respiratory sensor is a flexible flow sensor monolithically integrating the four elements of a Wheatstone bridge on a single chip: a hot-film resistor, a temperature-compensating resistor, and two balancing resistors. The monitor offers the merits of small size, light weight, easy operation, and low power consumption. Experiments were conducted to verify the feasibility and effectiveness of monitoring and diagnosing respiratory diseases using the proposed system. PMID:26694401
A study on spatial decision support systems for HIV/AIDS prevention based on COM GIS technology
NASA Astrophysics Data System (ADS)
Yang, Kun; Luo, Huasong; Peng, Shungyun; Xu, Quanli
2007-06-01
Based on a thorough analysis of the current status and existing problems of GIS applications in epidemiology, this paper proposes a method and process for establishing a spatial decision support system for AIDS epidemic prevention by integrating COM GIS, spatial database, GPS, remote sensing and communication technologies, as well as ASP and ActiveX software development technologies. One of the most important issues in constructing such a system is how to integrate AIDS spreading models with GIS. The paper first describes the capabilities of GIS applications in AIDS epidemic prevention, and then discusses some mature epidemic spreading models from which the computation parameters can be extracted. Furthermore, a technical schema is proposed for integrating the AIDS spreading models with GIS and relevant geospatial technologies, in which the GIS and model running platforms share a common spatial database and the computing results can be spatially visualized on desktop or Web GIS clients. Finally, a complete solution for establishing the decision support system for AIDS epidemic prevention is offered, based on the model integration methods and ESRI COM GIS software packages. The overall decision support system is composed of sub-systems for data acquisition, network communication, model integration, the AIDS epidemic information spatial database, information querying and statistical analysis, dynamic surveillance, spatial analysis and decision support, as well as information publishing based on Web GIS.
NASA Astrophysics Data System (ADS)
Nomaguchi, Yutaka; Fujita, Kikuo
This paper proposes a design support framework, named DRIFT (Design Rationale Integration Framework of Three layers), which dynamically captures and manages hypothesis and verification in the design process. The core of DRIFT is a three-layered design process model of action, model operation and argumentation. This model integrates various design support tools and captures the design operations performed on them. The action level captures the sequence of design operations. The model operation level captures the transition of design states, recording a design snapshot across design tools. The argumentation level captures the process of setting problems and alternatives. The linkage of the three levels makes it possible to automatically and efficiently capture and manage iterative hypothesis and verification processes through design operations over design tools. In DRIFT, such a linkage is extracted through templates of design operations, which are derived from the patterns embedded in design tools such as Design-For-X (DFX) approaches, and design tools are integrated through an ontology-based representation of design concepts. An argumentation model, gIBIS (graphical Issue-Based Information System), is used for representing dependencies among problems and alternatives. A mechanism of TMS (Truth Maintenance System) is used for managing multiple hypothetical design stages. This paper also demonstrates a prototype implementation of DRIFT and its application to a simple design problem. Further, it concludes with a discussion of some future issues.
Low-Power Analog Processing for Sensing Applications: Low-Frequency Harmonic Signal Classification
White, Daniel J.; William, Peter E.; Hoffman, Michael W.; Balkir, Sina
2013-01-01
A low-power analog sensor front-end is described that reduces the energy required to extract environmental sensing spectral features without using the Fast Fourier Transform (FFT) or wavelet transforms. An Analog Harmonic Transform (AHT) allows selection of only the features needed by the back-end, in contrast to the FFT, where all coefficients must be calculated simultaneously. We also show that the FFT coefficients can be easily calculated from the AHT results by a simple back-substitution. The scheme is tailored for low-power, parallel analog implementation in an integrated circuit (IC). Two different applications are tested with an ideal front-end model and compared to existing studies with the same data sets. Results from the military vehicle classification and machine-bearing fault identification applications show that the front-end suits a wide range of harmonic signal sources. Analog-related errors are modeled to evaluate the feasibility of an IC implementation and to set design parameters that maintain good system-level performance. Design of a preliminary transistor-level integrator circuit in a 0.13 μm complementary metal-oxide-semiconductor (CMOS) integrated circuit process showed the ability to use online self-calibration to reduce fabrication errors to a sufficiently low level. The estimated power dissipation is about three orders of magnitude less than that of similar vehicle classification systems that use commercially available FFT spectral extraction. PMID:23892765
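An idealized software model of the front-end can be sketched by correlating the input with sine and cosine templates at the selected harmonics only, which mirrors the AHT's selectivity; the analog circuit itself is not modeled.

```python
# Sketch: harmonic feature extraction by correlation with sin/cos templates
# at selected harmonics of a known fundamental, computing only the
# coefficients the back-end needs (unlike a full FFT).
import numpy as np

def harmonic_features(x, fs, f0, harmonics):
    """Amplitudes of selected harmonics of fundamental f0 in signal x."""
    t = np.arange(len(x)) / fs
    feats = []
    for k in harmonics:
        c = np.mean(x * np.cos(2 * np.pi * k * f0 * t))
        s = np.mean(x * np.sin(2 * np.pi * k * f0 * t))
        feats.append(2 * np.hypot(c, s))      # amplitude of harmonic k
    return np.array(feats)

fs = 1000.0
t = np.arange(0, 1, 1 / fs)
x = 1.0 * np.sin(2 * np.pi * 60 * t) + 0.3 * np.sin(2 * np.pi * 180 * t)
print(harmonic_features(x, fs, 60.0, [1, 2, 3]))  # approx [1.0, 0.0, 0.3]
```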
Crataegus special extract WS 1442: up-to-date review of experimental and clinical experiences.
Zorniak, M; Szydlo, B; Krzeminski, T F
2017-08-01
Extracts and tinctures made from Crataegus spp. (hawthorn) have been used as cardioprotective remedies since ancient times. The special extract WS 1442, manufactured by Dr. W. Schwabe Pharmaceuticals from Crataegus spp. leaves and flowers, is one of the most studied and most popular hawthorn preparations. It is the integral and most important active component of herbal drugs such as Crataegutt® novo 450 and CardioMax®. This standardized extract contains 18.75% oligomeric procyanidins (OPC), which have beneficial cardioprotective properties and act as free-radical scavengers, protecting ischemic heart tissue from the consequences of neutrophil elastase action. Moreover, WS 1442 also has proven vasorelaxant activity, mediated via eNOS, and prevents ischemic heart tissue swelling by influencing calcium signaling pathways, thus restraining endothelial hyperpermeability. The actions of the WS 1442 special extract have been investigated in in vitro as well as in vivo studies, including large clinical trials. In this review, the authors present the current state of knowledge about the possible beneficial effects of the WS 1442 special extract on the cardiovascular system.
Hong, Na; Wen, Andrew; Shen, Feichen; Sohn, Sunghwan; Liu, Sijia; Liu, Hongfang; Jiang, Guoqian
2018-01-01
Standards-based modeling of electronic health records (EHR) data holds great significance for data interoperability and large-scale usage. Integration of unstructured data into a standard data model, however, poses unique challenges partially due to heterogeneous type systems used in existing clinical NLP systems. We introduce a scalable and standards-based framework for integrating structured and unstructured EHR data leveraging the HL7 Fast Healthcare Interoperability Resources (FHIR) specification. We implemented a clinical NLP pipeline enhanced with an FHIR-based type system and performed a case study using medication data from Mayo Clinic's EHR. Two UIMA-based NLP tools known as MedXN and MedTime were integrated in the pipeline to extract FHIR MedicationStatement resources and related attributes from unstructured medication lists. We developed a rule-based approach for assigning the NLP output types to the FHIR elements represented in the type system, whereas we investigated the FHIR elements belonging to the source of the structured EMR data. We used the FHIR resource "MedicationStatement" as an example to illustrate our integration framework and methods. For evaluation, we manually annotated FHIR elements in 166 medication statements from 14 clinical notes generated by Mayo Clinic in the course of patient care, and used standard performance measures (precision, recall and f-measure). The F-scores achieved ranged from 0.73 to 0.99 for the various FHIR element representations. The results demonstrated that our framework based on the FHIR type system is feasible for normalizing and integrating both structured and unstructured EHR data.
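The mapping from NLP output to FHIR elements can be illustrated with a minimal sketch; the drug, code and dosage values below are invented, and only standard FHIR element names are used.

```python
# Sketch: assembling a FHIR MedicationStatement resource from hypothetical
# NLP output (drug mention, RxNorm code, dose, frequency).
import json

nlp_output = {"drug": "lisinopril", "rxnorm": "29046",
              "dose": "10 mg", "frequency": "once daily"}

medication_statement = {
    "resourceType": "MedicationStatement",
    "status": "active",
    "medicationCodeableConcept": {
        "coding": [{
            "system": "http://www.nlm.nih.gov/research/umls/rxnorm",
            "code": nlp_output["rxnorm"],
            "display": nlp_output["drug"],
        }]
    },
    # Free-text dosage keeps the unnormalized NLP span available.
    "dosage": [{"text": f"{nlp_output['dose']} {nlp_output['frequency']}"}],
}
print(json.dumps(medication_statement, indent=2))
```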
D'Souza, Mark; Sulakhe, Dinanath; Wang, Sheng; Xie, Bing; Hashemifar, Somaye; Taylor, Andrew; Dubchak, Inna; Conrad Gilliam, T; Maltsev, Natalia
2017-01-01
Recent technological advances in genomics allow the production of biological data at unprecedented tera- and petabyte scales. Efficient mining of these vast and complex datasets for the needs of biomedical research critically depends on a seamless integration of the clinical, genomic, and experimental information with prior knowledge about genotype-phenotype relationships. Such experimental data accumulated in publicly available databases should be accessible to a variety of algorithms and analytical pipelines that drive computational analysis and data mining. We present an integrated computational platform, Lynx (Sulakhe et al., Nucleic Acids Res 44:D882-D887, 2016) (http://lynx.cri.uchicago.edu), a web-based database and knowledge extraction engine. It provides advanced search capabilities and a variety of algorithms for enrichment analysis and network-based gene prioritization. It gives public access to the Lynx integrated knowledge base (LynxKB) and its analytical tools via user-friendly web services and interfaces. The Lynx service-oriented architecture supports annotation and analysis of high-throughput experimental data. Lynx tools assist the user in extracting meaningful knowledge from LynxKB and experimental data, and in the generation of weighted hypotheses regarding the genes and molecular mechanisms contributing to human phenotypes or conditions of interest. The goal of this integrated platform is to support the end-to-end analytical needs of various translational projects.
Attempts to utilize and integrate traditional medicine in North Korea.
Lim, Byungmook; Park, Jongbae; Han, Changyon
2009-03-01
To summarize the way North Korea attempted to modernize its system of traditional medicine and integrate it with Western biomedicine. We reviewed clinical textbooks and periodicals of traditional Korean medicine published in North Korea, research reports on North Korean health and medicine published elsewhere, and conducted interviews of defectors from North Korea who were students or clinicians of traditional medicine. Key findings of this study are: (1) North Korea has attempted several ways of integrating traditional medicine into education and clinical practices; (2) North Korea's communist government provided the main driving force for an integration policy; (3) school curricula of both Western and traditional Korean medicine incorporated knowledge of both disciplines, yet more weight was placed on traditional Korean medicine; (4) a combination of Western diagnosis and Korean therapeutics was the most frequent example of integration, while the dual system approach with reciprocal practice was also explored; (5) several forms of integrative therapeutic mixture were practiced including concurrent medication, injection on acupuncture points, and intramuscular or intravenous injection of extracts from medicinal plants; and (6) limited resources for research and the underdeveloped level of clinical research failed to secure rigorous scientific advancement. Despite the government-driven attempt to create an ideal integrative system of medicine, according to our findings, the actual introduction of an integrative system into practice was far from the North Korean government's anticipated outcome in regards to clinical practice. We hypothesize this was due to famine, economic crisis, and political isolation from the international realm. Traditional Korean medicine seems to have served the population, which is in desperate need of treatment amid difficulties in health, while North Korea's Western biomedicine-based health delivery system has been badly affected.
Using text mining techniques to extract phenotypic information from the PhenoCHF corpus.
Alnazzawi, Noha; Thompson, Paul; Batista-Navarro, Riza; Ananiadou, Sophia
2015-01-01
Phenotypic information locked away in unstructured narrative text presents significant barriers to information accessibility, both for clinical practitioners and for computerised applications used for clinical research purposes. Text mining (TM) techniques have previously been applied successfully to extract different types of information from text in the biomedical domain. They have the potential to be extended to allow the extraction of information relating to phenotypes from free text. To stimulate the development of TM systems that are able to extract phenotypic information from text, we have created a new corpus (PhenoCHF) that is annotated by domain experts with several types of phenotypic information relating to congestive heart failure. To ensure that systems developed using the corpus are robust to multiple text types, it integrates text from heterogeneous sources, i.e., electronic health records (EHRs) and scientific articles from the literature. We have developed several different phenotype extraction methods to demonstrate the utility of the corpus, and tested these methods on a further corpus, i.e., ShARe/CLEF 2013. Evaluation of our automated methods showed that PhenoCHF can facilitate the training of reliable phenotype extraction systems, which are robust to variations in text type. These results have been reinforced by evaluating our trained systems on the ShARe/CLEF corpus, which contains clinical records of various types. Like other studies within the biomedical domain, we found that solutions based on conditional random fields produced the best results, when coupled with a rich feature set. PhenoCHF is the first annotated corpus aimed at encoding detailed phenotypic information. The unique heterogeneous composition of the corpus has been shown to be advantageous in the training of systems that can accurately extract phenotypic information from a range of different text types. Although the scope of our annotation is currently limited to a single disease, the promising results achieved can stimulate further work into the extraction of phenotypic information for other diseases. The PhenoCHF annotation guidelines and annotations are publicly available at https://code.google.com/p/phenochf-corpus.
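A minimal CRF tagger of the kind evaluated above can be sketched with the sklearn-crfsuite package; the feature template and the two training sentences are invented, not the PhenoCHF feature set.

```python
# Sketch: a conditional random field for phenotype mention tagging with a
# small per-token feature set. Training data are invented examples.
import sklearn_crfsuite

def word2features(sent, i):
    w = sent[i][0]
    return {"lower": w.lower(), "isupper": w.isupper(),
            "istitle": w.istitle(), "isdigit": w.isdigit(),
            "prev": sent[i - 1][0].lower() if i > 0 else "<s>"}

train = [[("shortness", "B-SIGN"), ("of", "I-SIGN"), ("breath", "I-SIGN")],
         [("ejection", "B-TEST"), ("fraction", "I-TEST"), ("30%", "B-VALUE")]]

X = [[word2features(s, i) for i in range(len(s))] for s in train]
y = [[label for _, label in s] for s in train]

crf = sklearn_crfsuite.CRF(algorithm="lbfgs", c1=0.1, c2=0.1,
                           max_iterations=100)
crf.fit(X, y)
print(crf.predict(X)[0])
```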
TEES 2.2: Biomedical Event Extraction for Diverse Corpora.
Björne, Jari; Salakoski, Tapio
2015-01-01
The Turku Event Extraction System (TEES) is a text mining program developed for the extraction of events, complex biomedical relationships, from scientific literature. Based on a graph-generation approach, the system detects events with the use of a rich feature set built via dependency parsing. The TEES system has achieved record performance in several of the shared tasks of its domain, and continues to be used in a variety of biomedical text mining tasks. The TEES system was quickly adapted to the BioNLP'13 Shared Task in order to provide a public baseline for derived systems. An automated approach was developed for learning the underlying annotation rules of event type, allowing immediate adaptation to the various subtasks, and leading to a first place in four out of eight tasks. The system for the automated learning of annotation rules is further enhanced in this paper to the point of requiring no manual adaptation to any of the BioNLP'13 tasks. Further, the scikit-learn machine learning library is integrated into the system, bringing a wide variety of machine learning methods usable with TEES in addition to the default SVM. A scikit-learn ensemble method is also used to analyze the importances of the features in the TEES feature sets. The TEES system was introduced for the BioNLP'09 Shared Task and has since then demonstrated good performance in several other shared tasks. By applying the current TEES 2.2 system to multiple corpora from these past shared tasks an overarching analysis of the most promising methods and possible pitfalls in the evolving field of biomedical event extraction are presented.
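The ensemble-based feature-importance analysis mentioned above can be sketched with scikit-learn directly; the feature matrix here is random stand-in data rather than a TEES feature set.

```python
# Sketch: ranking features with a scikit-learn ensemble, as in the TEES 2.2
# feature-importance analysis. The data are random stand-ins in which only
# features 3 and 7 carry signal.
import numpy as np
from sklearn.ensemble import ExtraTreesClassifier

rng = np.random.default_rng(3)
X = rng.random((1000, 50))                  # examples x features
y = (X[:, 3] + 0.5 * X[:, 7] > 0.8).astype(int)

forest = ExtraTreesClassifier(n_estimators=200, random_state=0).fit(X, y)
ranked = np.argsort(forest.feature_importances_)[::-1]
print("top features:", ranked[:5])          # indices 3 and 7 should lead
```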
Basewide Engineering Evaluation-Cost Analysis for Soil Vapor Extraction. General Evaluation Document
1993-09-01
... designed, manufactured, and sold as integrated units by vendors of SVE systems. Smaller systems, such as those with total ... the 2,000 scfm and larger capacity range, can be supplied either as portable units (e.g., on multiple trailers) or manufactured on one to three skids.
Hong, Keehoon; Hong, Jisoo; Jung, Jae-Hyun; Park, Jae-Hyeung; Lee, Byoungho
2010-05-24
We propose a new method for rectifying geometrical distortion in the elemental image set and extracting accurate lens lattice lines by projective image transformation. The information about distortion in the acquired elemental image set is found by a Hough transform algorithm. With this initial information about the distortions, the acquired elemental image set is rectified automatically, without prior knowledge of the characteristics of the pickup system, by a stratified image transformation procedure. Computer-generated elemental image sets with deliberately introduced distortion are used to verify the proposed rectification method. Experimentally captured elemental image sets are optically reconstructed before and after rectification by the proposed method. The experimental results support the validity of the proposed method, with high accuracy of image rectification and lattice extraction.
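The overall flow, Hough-based lattice detection followed by projective rectification, can be sketched with OpenCV; the input filename and the four corner correspondences below are hypothetical values that in practice would be derived from the detected lines.

```python
# Sketch: lattice-line detection with a Hough transform followed by a
# projective rectification of the elemental image set.
import cv2
import numpy as np

img = cv2.imread("elemental_image.png", cv2.IMREAD_GRAYSCALE)
edges = cv2.Canny(img, 50, 150)
lines = cv2.HoughLines(edges, rho=1, theta=np.pi / 180, threshold=200)
print("lattice-line candidates:", 0 if lines is None else len(lines))

# Map distorted lattice corners onto an ideal square grid.
src = np.float32([[12, 9], [501, 14], [497, 488], [8, 494]])
dst = np.float32([[0, 0], [500, 0], [500, 500], [0, 500]])
H = cv2.getPerspectiveTransform(src, dst)
rectified = cv2.warpPerspective(img, H, (500, 500))
```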
Gai, Qingqing; Qu, Feng; Zhang, Tao; Zhang, Yukui
2011-07-15
Both magnetic particle adsorption and aqueous two-phase extraction (ATPE) are simple, fast and low-cost methods for protein separation. Selective protein adsorption by carboxyl-modified magnetic particles was investigated with respect to protein isoelectric point, solution pH and ionic strength. An aqueous two-phase system of PEG/sulphate exhibited selective separation and extraction of proteins before and after magnetic adsorption. The two combination schemes, magnetic adsorption followed by ATPE and ATPE followed by magnetic adsorption, were discussed and compared for the separation of a protein mixture of lysozyme, bovine serum albumin, trypsin, cytochrome C and myoglobin. Magnetic adsorption followed by ATPE was also applied to human serum separation. Copyright © 2011 Elsevier B.V. All rights reserved.
Low-level processing for real-time image analysis
NASA Technical Reports Server (NTRS)
Eskenazi, R.; Wilf, J. M.
1979-01-01
A system that detects object outlines in television images in real time is described. A high-speed pipeline processor transforms the raw image into an edge map and a microprocessor, which is integrated into the system, clusters the edges, and represents them as chain codes. Image statistics, useful for higher level tasks such as pattern recognition, are computed by the microprocessor. Peak intensity and peak gradient values are extracted within a programmable window and are used for iris and focus control. The algorithms implemented in hardware and the pipeline processor architecture are described. The strategy for partitioning functions in the pipeline was chosen to make the implementation modular. The microprocessor interface allows flexible and adaptive control of the feature extraction process. The software algorithms for clustering edge segments, creating chain codes, and computing image statistics are also discussed. A strategy for real time image analysis that uses this system is given.
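The chain-code representation produced by the microprocessor stage can be sketched in a few lines; the 8-direction Freeman coding below assumes an already-linked edge pixel sequence, given here as a made-up example.

```python
# Sketch: 8-direction Freeman chain coding of a linked edge segment.
# Directions: 0=E, 1=NE, 2=N, 3=NW, 4=W, 5=SW, 6=S, 7=SE (y grows upward).
DIRS = {(1, 0): 0, (1, 1): 1, (0, 1): 2, (-1, 1): 3,
        (-1, 0): 4, (-1, -1): 5, (0, -1): 6, (1, -1): 7}

def chain_code(points):
    """Freeman code for a list of 8-connected (x, y) pixels."""
    return [DIRS[(x1 - x0, y1 - y0)]
            for (x0, y0), (x1, y1) in zip(points, points[1:])]

print(chain_code([(0, 0), (1, 0), (2, 1), (2, 2), (1, 2)]))  # [0, 1, 2, 4]
```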
NASA Astrophysics Data System (ADS)
Meyer, F. J.; McAlpin, D. B.; Gong, W.; Ajadi, O.; Arko, S.; Webley, P. W.; Dehn, J.
2015-02-01
Remote sensing plays a critical role in operational volcano monitoring due to the often remote locations of volcanic systems and the large spatial extent of potential eruption pre-cursor signals. Despite the all-weather capabilities of radar remote sensing and its high performance in monitoring of change, the contribution of radar data to operational monitoring activities has been limited in the past. This is largely due to: (1) the high costs associated with radar data; (2) traditionally slow data processing and delivery procedures; and (3) the limited temporal sampling provided by spaceborne radars. With this paper, we present new data processing and data integration techniques that mitigate some of these limitations and allow for a meaningful integration of radar data into operational volcano monitoring decision support systems. Specifically, we present fast data access procedures as well as new approaches to multi-track processing that improve near real-time data access and temporal sampling of volcanic systems with SAR data. We introduce phase-based (coherent) and amplitude-based (incoherent) change detection procedures that are able to extract dense time series of hazard information from these data. For a demonstration, we present an integration of our processing system with an operational volcano monitoring system that was developed for use by the Alaska Volcano Observatory (AVO). Through an application to a historic eruption, we show that the integration of SAR into systems such as AVO can significantly improve the ability of operational systems to detect eruptive precursors. Therefore, the developed technology is expected to improve operational hazard detection, alerting, and management capabilities.
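A minimal sketch of the amplitude-based (incoherent) change detection is the classical log-ratio operator over two co-registered SAR amplitude images; the percentile threshold is a placeholder for the paper's actual decision rule.

```python
# Sketch: incoherent SAR change detection with the log-ratio operator.
import numpy as np

def log_ratio_change(amp_before, amp_after, pct=99.0):
    """Boolean change mask from two co-registered SAR amplitude arrays."""
    eps = 1e-6                          # guards against division by zero
    lr = np.abs(np.log((amp_after + eps) / (amp_before + eps)))
    return lr > np.percentile(lr, pct)
```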
Using automatically extracted information from mammography reports for decision-support
Bozkurt, Selen; Gimenez, Francisco; Burnside, Elizabeth S.; Gulkesen, Kemal H.; Rubin, Daniel L.
2016-01-01
Objective To evaluate a system we developed that connects natural language processing (NLP) for information extraction from narrative text mammography reports with a Bayesian network for decision-support about breast cancer diagnosis. The ultimate goal of this system is to provide decision support as part of the workflow of producing the radiology report. Materials and methods We built a system that uses an NLP information extraction system (which extract BI-RADS descriptors and clinical information from mammography reports) to provide the necessary inputs to a Bayesian network (BN) decision support system (DSS) that estimates lesion malignancy from BI-RADS descriptors. We used this integrated system to predict diagnosis of breast cancer from radiology text reports and evaluated it with a reference standard of 300 mammography reports. We collected two different outputs from the DSS: (1) the probability of malignancy and (2) the BI-RADS final assessment category. Since NLP may produce imperfect inputs to the DSS, we compared the difference between using perfect (“reference standard”) structured inputs to the DSS (“RS-DSS”) vs NLP-derived inputs (“NLP-DSS”) on the output of the DSS using the concordance correlation coefficient. We measured the classification accuracy of the BI-RADS final assessment category when using NLP-DSS, compared with the ground truth category established by the radiologist. Results The NLP-DSS and RS-DSS had closely matched probabilities, with a mean paired difference of 0.004 ± 0.025. The concordance correlation of these paired measures was 0.95. The accuracy of the NLP-DSS to predict the correct BI-RADS final assessment category was 97.58%. Conclusion The accuracy of the information extracted from mammography reports using the NLP system was sufficient to provide accurate DSS results. We believe our system could ultimately reduce the variation in practice in mammography related to assessment of malignant lesions and improve management decisions. PMID:27388877
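Lin's concordance correlation coefficient used for the paired comparison above is straightforward to compute; a minimal sketch:

```python
# Sketch: Lin's concordance correlation coefficient between two paired
# probability series, e.g. NLP-derived vs. reference-standard DSS outputs.
import numpy as np

def concordance_ccc(x, y):
    x, y = np.asarray(x, float), np.asarray(y, float)
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()            # population (biased) variances
    cov = ((x - mx) * (y - my)).mean()
    return 2 * cov / (vx + vy + (mx - my) ** 2)

# Identical inputs give 1.0; a constant offset or scale difference lowers it.
print(concordance_ccc([0.1, 0.4, 0.8], [0.12, 0.39, 0.83]))
```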
Optimal Design of MPPT Controllers for Grid Connected Photovoltaic Array System
NASA Astrophysics Data System (ADS)
Ebrahim, M. A.; AbdelHadi, H. A.; Mahmoud, H. M.; Saied, E. M.; Salama, M. M.
2016-10-01
Integrating photovoltaic (PV) plants into the electric power system poses challenges to power system dynamic performance. These challenges stem primarily from the natural characteristics of PV plants, which differ in some respects from those of conventional plants. The most significant challenge is how to extract and regulate the maximum power from the sun. This paper presents optimal designs for the most commonly used maximum power point tracking (MPPT) techniques, based on proportional-integral controllers tuned by particle swarm optimization (PI-PSO). The techniques considered are (1) incremental conductance, (2) perturb and observe, (3) fractional short-circuit current and (4) fractional open-circuit voltage. This work provides a comprehensive comparative study of the energy availability ratio from photovoltaic panels. The simulation results prove that the proposed controllers have an impressive tracking response. The system dynamic performance improved greatly using the proposed controllers.
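Of the four techniques, perturb and observe is the easiest to sketch; the loop below uses a toy P-V curve and omits the PI-PSO tuning layer described in the paper.

```python
# Sketch: the perturb-and-observe MPPT loop in its textbook form. The P-V
# curve is a toy stand-in with its maximum at 30 V / 240 W.
def pv_power(v: float) -> float:
    """Hypothetical photovoltaic power curve, peaking at 30 V / 240 W."""
    return max(0.0, 240.0 - 0.5 * (v - 30.0) ** 2)

v, step, p_prev = 20.0, 0.5, 0.0
for _ in range(100):
    p = pv_power(v)
    if p < p_prev:       # power fell after the last perturbation:
        step = -step     # reverse the search direction
    p_prev = p
    v += step            # perturb the operating voltage and observe again

print(f"operating point ~ {v:.1f} V, {p_prev:.1f} W")
```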
Martins, Nuno; Carreiro, Elisabete P; Locati, Abel; Ramalho, João P Prates; Cabrita, Maria João; Burke, Anthony J; Garcia, Raquel
2015-08-28
This work first addresses the design and development of molecularly imprinted systems selective for deltamethrin, aiming to provide a suitable sorbent for solid-phase extraction (SPE) that will subsequently be used to implement an analytical methodology for trace analysis of the target pesticide in spiked olive oil samples. To achieve this goal, a preliminary evaluation of the molecular recognition and selectivity of the molecularly imprinted polymers was performed. To investigate the complexity of the mechanistic basis for template-selective recognition in these polymeric matrices, a quantum chemical approach was attempted, providing new insights into the mechanisms underlying template recognition, in particular the crucial roles of the crosslinker agent and the solvent used. DFT calculations corroborate the results obtained in experimental molecular recognition assays, enabling selection of the most suitable imprinting system for the MISPE extraction technique, which comprises acrylamide as the functional monomer and ethylene glycol dimethacrylate as the crosslinker. Furthermore, an analytical methodology comprising a sample preparation step based on solid-phase extraction was implemented using this "tailor-made" imprinting system as sorbent for the selective isolation/pre-concentration of deltamethrin from olive oil samples. The molecularly imprinted solid-phase extraction (MISPE) methodology was successfully applied to the clean-up of spiked olive oil samples, with recovery rates up to 94%. Copyright © 2015 Elsevier B.V. All rights reserved.
Temporal integration at consecutive processing stages in the auditory pathway of the grasshopper.
Wirtssohn, Sarah; Ronacher, Bernhard
2015-04-01
Temporal integration in the auditory system of locusts was quantified by presenting single clicks and click pairs while performing intracellular recordings. Auditory neurons were studied at three processing stages, which form a feed-forward network in the metathoracic ganglion. Receptor neurons and most first-order interneurons ("local neurons") encode the signal envelope, while second-order interneurons ("ascending neurons") tend to extract more complex, behaviorally relevant sound features. In different neuron types of the auditory pathway we found three response types: no significant temporal integration (some ascending neurons), leaky energy integration (receptor neurons and some local neurons), and facilitatory processes (some local and ascending neurons). The receptor neurons integrated input over very short time windows (<2 ms). Temporal integration on longer time scales was found at subsequent processing stages, indicative of within-neuron computations and network activity. These different strategies, realized at separate processing stages and in parallel neuronal pathways within one processing stage, could enable the grasshopper's auditory system to evaluate longer time windows and thus to implement temporal filters, while at the same time maintaining a high temporal resolution. Copyright © 2015 the American Physiological Society.
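The leaky energy integration response type can be illustrated with a minimal leaky-integrator sketch; the time constant and the click-pair stimulus are invented values, not fits to the recordings.

```python
# Sketch: a leaky integrator driven by a click pair. The peak response to
# the second click depends on the inter-click interval relative to the
# integration time constant; all values are illustrative.
import numpy as np

fs = 10_000.0                      # samples per second
tau = 0.002                        # 2 ms integration time (illustrative)
decay = np.exp(-1.0 / (tau * fs))  # per-sample leak factor

def peak_response(interval_s: float) -> float:
    """Peak of a leaky integrator's output for a pair of unit clicks."""
    x = np.zeros(int(0.02 * fs))
    x[0] = 1.0
    x[int(interval_s * fs)] = 1.0  # second click after the given interval
    y, peak = 0.0, 0.0
    for xi in x:
        y = decay * y + xi         # leak, then add the click energy
        peak = max(peak, y)
    return peak

for ms in (0.5, 1.0, 2.0, 5.0):
    print(f"{ms} ms interval -> peak {peak_response(ms / 1000):.2f}")
```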
DOE Office of Scientific and Technical Information (OSTI.GOV)
Trease, Lynn L.; Trease, Harold E.; Fowler, John
2007-03-15
One of the critical steps toward performing computational biology simulations, using mesh based integration methods, is in using topologically faithful geometry derived from experimental digital image data as the basis for generating the computational meshes. Digital image data representations contain both the topology of the geometric features and experimental field data distributions. The geometric features that need to be captured from the digital image data are three-dimensional, therefore the process and tools we have developed work with volumetric image data represented as data-cubes. This allows us to take advantage of 2D curvature information during the segmentation and feature extraction process. The process is basically: 1) segmenting to isolate and enhance the contrast of the features that we wish to extract and reconstruct, 2) extracting the geometry of the features in an isosurfacing technique, and 3) building the computational mesh using the extracted feature geometry. "Quantitative" image reconstruction and feature extraction is done for the purpose of generating computational meshes, not just for producing graphics "screen" quality images. For example, the surface geometry that we extract must represent a closed water-tight surface.
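The segment-then-isosurface portion of the pipeline can be sketched with standard Python imaging tools; the synthetic data cube and the smoothing-based segmentation below are simplifications of the workflow described.

```python
# Sketch: segmentation followed by isosurface extraction on a volumetric
# data cube, using Gaussian smoothing as a contrast-enhancement stand-in
# and marching cubes for the feature surface. The cube is synthetic.
import numpy as np
from scipy.ndimage import gaussian_filter
from skimage import measure

# Synthetic data cube: a bright sphere in noise stands in for imaged tissue.
z, y, x = np.mgrid[:64, :64, :64]
cube = ((x - 32) ** 2 + (y - 32) ** 2 + (z - 32) ** 2 < 20 ** 2).astype(float)
cube += 0.2 * np.random.default_rng(4).normal(size=cube.shape)

smoothed = gaussian_filter(cube, sigma=1.5)
verts, faces, normals, values = measure.marching_cubes(smoothed, level=0.5)
print(len(verts), "vertices,", len(faces), "triangles")
```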
Identification of Curie temperature distributions in magnetic particulate systems
NASA Astrophysics Data System (ADS)
Waters, J.; Berger, A.; Kramer, D.; Fangohr, H.; Hovorka, O.
2017-09-01
This paper develops a methodology for extracting the Curie temperature distribution from magnetisation versus temperature measurements which are realizable by standard laboratory magnetometry. The method is integral in nature, robust against various sources of measurement noise, and can be adapted to a wide range of granular magnetic materials and magnetic particle systems. The validity and practicality of the method is demonstrated using large-scale Monte-Carlo simulations of an Ising-like model as a proof of concept, and general conclusions are drawn about its applicability to different classes of systems and experimental conditions.
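As a simple point of contrast with the paper's integral method, a differential estimate treats -dM/dT as a first approximation of the Curie temperature distribution; the M(T) curve below is synthetic.

```python
# Sketch: a differential estimate of the Curie temperature distribution,
# taking -dM/dT of an M(T) curve as a first approximation. The data are
# synthetic; the paper's integral formulation is not reproduced.
import numpy as np
from scipy.stats import norm

T = np.linspace(300, 800, 501)                # temperature grid, K
# Synthetic M(T): grains whose Tc is ~ N(600 K, 30 K) switch off smoothly.
M = 1.0 - norm.cdf(T, loc=600, scale=30)

dist = -np.gradient(M, T)                     # estimated Tc distribution
Tc_mean = np.trapz(T * dist, T) / np.trapz(dist, T)
print(f"estimated mean Curie temperature: {Tc_mean:.0f} K")
```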
Yongsu Lee; Hyeonwoo Lee; Seunghyup Yoo; Hoi-Jun Yoo
2016-08-01
A sticker-type sensor system is proposed targeting concurrent ECG/PPG monitoring for cardiovascular diseases. The stickers are of two types: a Hub sticker and a Sensor-node (SN) sticker. A low-power CMOS SoC for measuring the ECG and PPG signals is hybrid-integrated with organic light-emitting diodes (OLEDs) and an organic photodetector (OPD). The sticker weighs only 2 g and consumes only 141 μW. An optical calibration loop is adopted to maintain the SNR of the PPG signal above 30 dB. The pulse arrival time (PAT) and SpO2 value can be extracted from various body parts, verified against a reference device in in-vivo experiments with 20 people.
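Pulse arrival time extraction from simultaneous ECG and PPG can be sketched as the delay from each R-peak to the next PPG peak; both waveforms below are synthetic stand-ins for the sticker's signals.

```python
# Sketch: pulse-arrival-time (PAT) extraction as the delay from each ECG
# R-peak to the following PPG peak, on synthetic waveforms.
import numpy as np
from scipy.signal import find_peaks

fs = 250.0                                   # sampling rate, Hz
t = np.arange(0, 10, 1 / fs)
ecg = np.sin(2 * np.pi * 1.2 * t) ** 63      # sharp ~1.2 Hz "R peaks"
ppg = np.sin(2 * np.pi * 1.2 * t - 1.8)      # delayed pulse wave

r_peaks, _ = find_peaks(ecg, height=0.8, distance=fs * 0.4)
ppg_peaks, _ = find_peaks(ppg, distance=fs * 0.4)

pat = [(ppg_peaks[ppg_peaks > r][0] - r) / fs
       for r in r_peaks if (ppg_peaks > r).any()]
print(f"mean PAT: {np.mean(pat) * 1000:.0f} ms")
```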
Integrated feature extraction and selection for neuroimage classification
NASA Astrophysics Data System (ADS)
Fan, Yong; Shen, Dinggang
2009-02-01
Feature extraction and selection are of great importance in neuroimage classification for identifying informative features and reducing feature dimensionality, which are generally implemented as two separate steps. This paper presents an integrated feature extraction and selection algorithm with two iterative steps: constrained subspace learning based feature extraction and support vector machine (SVM) based feature selection. The subspace learning based feature extraction focuses on the brain regions with higher possibility of being affected by the disease under study, while the possibility of brain regions being affected by disease is estimated by the SVM based feature selection, in conjunction with SVM classification. This algorithm can not only take into account the inter-correlation among different brain regions, but also overcome the limitation of traditional subspace learning based feature extraction methods. To achieve robust performance and optimal selection of parameters involved in feature extraction, selection, and classification, a bootstrapping strategy is used to generate multiple versions of training and testing sets for parameter optimization, according to the classification performance measured by the area under the ROC (receiver operating characteristic) curve. The integrated feature extraction and selection method is applied to a structural MR image based Alzheimer's disease (AD) study with 98 non-demented and 100 demented subjects. Cross-validation results indicate that the proposed algorithm can improve performance of the traditional subspace learning based classification.
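One standard way to realize SVM-based feature selection is recursive feature elimination with a linear SVM; the sketch below uses random stand-in data in place of the regional MR features.

```python
# Sketch: SVM-based feature selection via recursive feature elimination
# (RFE), one common realization of the selection step described above.
import numpy as np
from sklearn.feature_selection import RFE
from sklearn.svm import SVC

rng = np.random.default_rng(5)
X = rng.normal(size=(198, 300))            # subjects x regional features
y = rng.integers(0, 2, size=198)           # demented vs. non-demented labels

selector = RFE(SVC(kernel="linear"), n_features_to_select=30, step=0.1)
selector.fit(X, y)
informative = np.flatnonzero(selector.support_)
print("selected feature indices:", informative[:10], "...")
```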
Database Integrity Monitoring for Synthetic Vision Systems Using Machine Vision and SHADE
NASA Technical Reports Server (NTRS)
Cooper, Eric G.; Young, Steven D.
2005-01-01
In an effort to increase situational awareness, the aviation industry is investigating technologies that allow pilots to visualize what is outside of the aircraft during periods of low-visibility. One of these technologies, referred to as Synthetic Vision Systems (SVS), provides the pilot with real-time computer-generated images of obstacles, terrain features, runways, and other aircraft regardless of weather conditions. To help ensure the integrity of such systems, methods of verifying the accuracy of synthetically-derived display elements using onboard remote sensing technologies are under investigation. One such method is based on a shadow detection and extraction (SHADE) algorithm that transforms computer-generated digital elevation data into a reference domain that enables direct comparison with radar measurements. This paper describes machine vision techniques for making this comparison and discusses preliminary results from application to actual flight data.
IDEF5 Ontology Description Capture Method: Concept Paper
NASA Technical Reports Server (NTRS)
Menzel, Christopher P.; Mayer, Richard J.
1990-01-01
The results of research towards an ontology capture method referred to as IDEF5 are presented. Viewed simply as the study of what exists in a domain, ontology is an activity that can be understood to be at work across the full range of human inquiry, prompted by the persistent effort to understand the world in which humanity has found itself - and which it has helped to shape. In the context of information management, ontology is the task of extracting the structure of a given engineering, manufacturing, business, or logistical domain and storing it in a usable representational medium. A key to effective integration is a system ontology that can be accessed and modified across domains and which captures common features of the overall system relevant to the goals of the disparate domains. If the focus is on information integration, then the strongest motivation for ontology comes from the need to support data sharing and function interoperability. In the correct architecture, an enterprise ontology base would allow the construction of an integrated environment in which legacy systems appear to be open architecture integrated resources. If the focus is on system/software development, then support for the rapid acquisition of reliable systems is perhaps the strongest motivation for ontology. Finally, ontological analysis was demonstrated to be an effective first step in the construction of robust knowledge based systems.
Applications of SPICE for modeling miniaturized biomedical sensor systems
NASA Technical Reports Server (NTRS)
Mundt, C. W.; Nagle, H. T.
2000-01-01
This paper proposes a model for a miniaturized signal conditioning system for biopotential and ion-selective electrode arrays. The system consists of three main components: sensors, interconnections, and signal conditioning chip. The model for this system is based on SPICE. Transmission-line based equivalent circuits are used to represent the sensors, lumped resistance-capacitance circuits describe the interconnections, and a model for the signal conditioning chip is extracted from its layout. A system for measurements of biopotentials and ionic activities can be miniaturized and optimized for cardiovascular applications based on the development of an integrated SPICE system model of its electrochemical, interconnection, and electronic components.
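For a sense of what the lumped resistance-capacitance interconnection model implies, the following sketch (with assumed, illustrative component values, not the paper's extracted parameters) evaluates the magnitude response of a single RC section:

```python
# Illustrative numbers only: frequency response of a lumped RC interconnection
# model of the kind used for the sensor-to-chip wiring described above.
import numpy as np

R = 1e3          # assumed interconnect resistance (ohms)
C = 100e-12      # assumed interconnect capacitance (farads)
f = np.logspace(0, 6, 61)                               # 1 Hz .. 1 MHz
H = 1.0 / np.sqrt(1.0 + (2 * np.pi * f * R * C) ** 2)   # |Vout/Vin| of an RC low-pass
f3db = 1.0 / (2 * np.pi * R * C)                        # -3 dB corner frequency
print(f"corner frequency: {f3db / 1e3:.1f} kHz")
```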
Analysis of tracheid development in suppressed-growth Ponderosa Pine using the FPL ring profiler
C. Tim Scott; David W. Vahey
2012-01-01
The Ring Profiler was developed to examine the cross-sectional morphology of wood tracheids in a 12.5-mm core sample. The instrument integrates a specially designed staging apparatus with an optical imaging system to obtain high-contrast, high-resolution images containing about 200-500 tracheids. These images are further enhanced and analyzed to extract tracheid cross-...
Advanced Optical Fiber Communication Systems.
1993-02-28
...a distributed feedback (DFB) laser and a fiber Fabry-Perot (FFP) interferometer for optical frequency discrimination. After the photodetector and amplification, a ...filter, an envelope detector, and an integrator; these three components function in tandem as a phase demodulator. We have analyzed the nonlinearities... down-converter and FSK demodulator extract the desired video signals. The measured carrier-to-noise ratio (CNR) at the photodiode must be approximately
Automatic Feature Extraction System.
1982-12-01
exploitation. It was used for processing of black and white and multispectral reconnaissance photography, side-looking synthetic aperture radar imagery... the image data and different software modules for image queuing and formatting; the result of the input process will be images in standard AFES file... timely manner. The FFS configuration provides the environment necessary for integrated testing of image processing functions and design and
An integrated gateway for various PHDs in U-healthcare environments.
Park, KeeHyun; Pak, JuGeon
2012-01-01
We propose an integrated gateway for various personal health devices (PHDs). This gateway receives measurements from various PHDs and conveys them to a remote monitoring server (MS). It provides two kinds of transmission modes: immediate transmission and integrated transmission. The former mode operates if a measurement exceeds a predetermined threshold or in the case of an emergency. In the latter mode, the gateway retains the measurements instead of forwarding them. When the reporting time comes, the gateway extracts all the stored measurements, integrates them into one message, and transmits the integrated message to the MS. Through this mechanism, the transmission overhead can be reduced. On the basis of the proposed gateway, we construct a u-healthcare system comprising an activity monitor, a medication dispenser, and a pulse oximeter. The evaluation results show that the size of separate messages from various PHDs is reduced through the integration process, and the process does not require much time; the integration time is negligible.
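A minimal sketch of the two transmission modes, assuming a generic threshold rule and JSON messages (neither is specified at this level of detail in the abstract):

```python
# Sketch (not the authors' code): immediate send on threshold/emergency,
# otherwise buffer measurements and send one integrated message at report time.
import json
import time

class PHDGateway:
    def __init__(self, send, thresholds):
        self.send = send                # function that delivers a message to the MS
        self.thresholds = thresholds    # e.g. {"activity_monitor": 180} (assumed units)
        self.buffer = []

    def on_measurement(self, device, value, emergency=False):
        if emergency or value >= self.thresholds.get(device, float("inf")):
            # Immediate transmission mode: forward the single measurement now.
            self.send(json.dumps({"mode": "immediate", "device": device, "value": value}))
        else:
            self.buffer.append({"device": device, "value": value, "t": time.time()})

    def on_report_time(self):
        if self.buffer:
            # Integrated transmission mode: one message instead of many.
            self.send(json.dumps({"mode": "integrated", "measurements": self.buffer}))
            self.buffer = []
```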
Data System for HS3 Airborne Field Campaign
NASA Astrophysics Data System (ADS)
Maskey, M.; Mceniry, M.; Berendes, T.; Bugbee, K.; Conover, H.; Ramachandran, R.
2014-12-01
Hurricane and Severe Storm Sentinel (HS3) is a NASA airborne field campaign aimed at better understanding the physical processes that control hurricane intensity change. HS3 will help answer questions related to the roles of environmental conditions and internal storm structures in storm intensification. Because of the nature of the questions the HS3 mission is addressing, it involves a variety of in-situ and satellite observations, airborne data, meteorological analyses, and simulation data. This variety of datasets presents numerous data management challenges for HS3. The methods used for airborne data management differ greatly from the methods used for space-borne data. In particular, metadata extraction, spatial and temporal indexing, and the large number of instruments and subsequent variables are a few of the data management challenges unique to airborne missions. A robust data system is required to help HS3 scientists achieve their mission goals. Furthermore, the data system also needs to provide data management that assists in broader use of HS3 data to enable future research activities. The Global Hydrology Resource Center (GHRC) is considering all these needs and designing a data system for HS3. Experience with past airborne field campaigns puts GHRC in a good position to address HS3 needs. However, the scale of this mission, along with its science requirements, separates HS3 from previous field campaigns. The HS3 data system will include automated services for geo-location, metadata extraction, discovery, and distribution of all HS3 data. To answer the science questions, the data system will include a visual data exploration tool that is fully integrated into the data catalog. The tool will allow visually augmenting airborne data with analyses and simulations. Satellite data will provide contextual information during such data explorations. All HS3 tools will be supported by an enterprise service architecture that will allow scaling, easy integration of new tools and existing services, and integration of new ESDIS metadata and security guidelines.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Olama, Mohammed M; Nutaro, James J; Sukumar, Sreenivas R
2013-01-01
The success of data-driven business in government, science, and private industry is driving the need for seamless integration of intra- and inter-enterprise data sources to extract knowledge nuggets in the form of correlations, trends, patterns and behaviors previously not discovered due to physical and logical separation of datasets. Today, as the volume, velocity, variety and complexity of enterprise data keep increasing, the next generation of analysts face several challenges in the knowledge extraction process. Towards addressing these challenges, data-driven organizations that rely on the success of their analysts have to make investment decisions for sustainable data/information systems and knowledge discovery. Options that organizations are considering include newer storage/analysis architectures, better analysis machines, redesigned analysis algorithms, collaborative knowledge management tools, and query builders, amongst many others. In this paper, we present a concept of operations for enabling knowledge discovery that data-driven organizations can leverage towards making their investment decisions. We base our recommendations on the experience gained from integrating multi-agency enterprise data warehouses at the Oak Ridge National Laboratory to design the foundation of future knowledge nurturing data-system architectures.
Integrated Computational System for Aerodynamic Steering and Visualization
NASA Technical Reports Server (NTRS)
Hesselink, Lambertus
1999-01-01
In February of 1994, an effort involving the Fluid Dynamics and Information Sciences Divisions at NASA Ames Research Center, McDonnell Douglas Aerospace Company, and Stanford University was initiated to develop, demonstrate, validate and disseminate automated software for numerical aerodynamic simulation. The goal of the initiative was to develop a tri-discipline approach encompassing CFD, Intelligent Systems, and Automated Flow Feature Recognition to improve the utility of CFD in the design cycle. This approach would then be represented through an intelligent computational system which could accept an engineer's definition of a problem and construct an optimal and reliable CFD solution. Stanford University's role focused on developing technologies that advance visualization capabilities for analysis of CFD data, extract specific flow features useful for the design process, and compare CFD data with experimental data. During the years 1995-1997, Stanford University focused on developing techniques in the area of tensor visualization and flow feature extraction. Software libraries were created enabling feature extraction and exploration of tensor fields. As a proof of concept, a prototype system called the Integrated Computational System (ICS) was developed to demonstrate the CFD design cycle. The current research effort focuses on finding a quantitative comparison of general vector fields based on topological features. Since the method relies on topological information, grid matching and vector alignment are not needed in the comparison; this is often a problem with many data comparison techniques. In addition, since only topology-based information is stored and compared for each field, there is a significant compression of information that enables large databases to be quickly searched. This report will (1) briefly review the technologies developed during 1995-1997, (2) describe current technologies in the area of comparison techniques, (3) describe the theory of our new method researched during the grant year, (4) summarize a few of the results, and finally (5) discuss work within the last 6 months that is a direct extension of the grant.
Building integration of photovoltaic systems in cold climates
NASA Astrophysics Data System (ADS)
Athienitis, Andreas K.; Candanedo, José A.
2010-06-01
This paper presents some of the research activities on building-integrated photovoltaic (BIPV) systems developed by the Solar and Daylighting Laboratory at Concordia University. BIPV systems offer considerable advantages as compared to stand-alone PV installations. For example, BIPV systems can play a role as essential components of the building envelope. BIPV systems operate as distributed power generators using the most widely available renewable source. Since BIPV systems do not require additional space, they are especially appropriate for urban environments. BIPV/Thermal (BIPV/T) systems may use exterior air to extract useful heat from the PV panels, cooling them and thereby improving their electric performance. The recovered thermal energy can then be used for space heating and domestic hot water (DHW) heating, supporting the utilization of BIPV/T as an appropriate technology for cold climates. BIPV and BIPV/T systems are the subject of several ongoing research and demonstration projects (in both residential and commercial buildings) led by Concordia University. The concept of integrated building design and operation is at the centre of these efforts: BIPV and BIPV/T systems must be treated as part of a comprehensive strategy taking into account energy conservation measures, passive solar design, efficient lighting and HVAC systems, and integration of other renewable energy systems (solar thermal, heat pumps, etc.). Concordia Solar Laboratory performs fundamental research on heat transfer and modeling of BIPV/T systems, numerical and experimental investigations on BIPV and BIPV/T in building energy systems and non-conventional applications (building-attached greenhouses), and the design and optimization of buildings and communities.
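As a back-of-envelope illustration of BIPV/T heat recovery, the useful thermal power carried by the air stream follows the standard sensible-heat relation Q = m_dot x cp x dT; all numbers below are assumptions, not measurements from the paper:

```python
# Back-of-envelope sketch of BIPV/T heat recovery: thermal power carried away
# by exterior air flowing behind the PV panels (all values are assumptions).
rho_air = 1.2             # air density, kg/m^3
cp_air = 1005.0           # specific heat of air, J/(kg K)
flow = 0.05               # volumetric flow behind the panels, m^3/s (assumed)
t_in, t_out = 5.0, 25.0   # inlet/outlet air temperatures, deg C (assumed)

q_useful = rho_air * flow * cp_air * (t_out - t_in)   # W, Q = m_dot * cp * dT
print(f"recovered thermal power: {q_useful:.0f} W")
```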
NASA Technical Reports Server (NTRS)
Brice, R.; Mosley, J.; Willis, D.; Coleman, K.; Martin, C.; Shelby, L.; Kelley, U.; Renfro, E.; Griffith, G.; Warsame, A.
1989-01-01
In a continued effort to design a surface-based factory on Mars for the production of oxygen and water, the Design Group at Prairie View A&M University made a preliminary study of the surface and atmospheric composition of Mars and determined the mass densities of the various gases in the martian atmosphere. Based on the initial studies, the design group determined oxygen and water to be the two products that could be produced economically under martian conditions. Studies were also made of present production techniques for obtaining water and oxygen. Analyses were made to evaluate which current methods of production were adaptable to martian conditions. The detailed report was contained in an Interim Report submitted to NASA/USRA in Aug. of 1986. Even though the initial effort was the production of oxygen and water, we found it necessary to produce some diluent gases that can be mixed with oxygen to constitute 'breathable' air. In Phase 2--Task 1A, the Prairie View A&M University team completed the conceptual design of a breathable-air manufacturing system, a means of drilling for underground water, and storage of water for future use. The design objective of the team for the 1987-1988 academic year was the conceptual design of an integrated system for the supply of quality water for biological consumption, farming, and residential and industrial use. That design has also been completed. Phase 2--Task 1C is the present task for the Prairie View Design Team. It continues the previous task by investigating the extraction of water from beneath the surface and, if accessible, an alternative method of extraction from ice formations on the surface of Mars. In addition to the investigation of water extraction, a system for computer control of extraction and treatment was developed, with emphasis on fully automated control with robotic repair and maintenance. It is expected that oxygen- and water-producing plants on Mars will be limited in the amount of human control available to operate large and/or isolated plants. Therefore, it is imperative that computers be integrated into plant operation with the capability to maintain life support systems and analyze and replace defective parts or systems with no human intervention.
The atmosphere of Mars - Resources for the exploration and settlement of Mars
NASA Technical Reports Server (NTRS)
Meyer, T. R.; Mckay, C. P.
1984-01-01
This paper describes methods of processing the Mars atmosphere to supply water, oxygen and buffer gas for a Mars base. Existing life support system technology is combined with innovative methods of water extraction, and buffer gas processing. The design may also be extended to incorporate an integrated greenhouse to supply food, oxygen and water recycling. It is found that the work required to supply one kilogram of an argon/nitrogen buffer gas is 9.4 kW-hr. To extract water from the dry Martian atmosphere can require up to 102.8 kW-hr per kilogram of water depending on the relative humidity of the air.
NASA Astrophysics Data System (ADS)
Semenov, K. N.; Charykov, N. A.; Postnov, V. N.; Sharoyko, V. V.; Murin, I. V.
2016-01-01
This review is the first attempt to integrate the available data on all types of phase equilibria (solubility, extraction and sorption) in systems containing light fullerenes (C60 and C70). In the case of solubility diagrams, the following types of phase equilibria are considered: individual fullerene (C60 or C70)-solvent under polythermal and polybaric conditions; C60-C70-solvent, individual fullerene-solvent(1)-solvent(2), as well as multicomponent systems comprising a single fullerene or an industrial mixture of fullerenes and vegetable oils, animal fats or essential oils under polythermal conditions. All published experimental data on the extraction equilibria in C60-C70-liquid phase(1)-liquid phase(2) systems are described systematically and the sorption characteristics of various materials towards light fullerenes are estimated. The possibility of application of these experimental data for development of pre-chromatographic and chromatographic methods for separation of fullerene mixtures and application of fullerenes as nanomodifiers are described. The bibliography includes 87 references.
New developments of a knowledge based system (VEG) for inferring vegetation characteristics
NASA Technical Reports Server (NTRS)
Kimes, D. S.; Harrison, P. A.; Harrison, P. R.
1992-01-01
An extraction technique for inferring physical and biological surface properties of vegetation using nadir and/or directional reflectance data as input has been developed. A knowledge-based system (VEG) accepts spectral data of an unknown target as input, determines the best strategy for inferring the desired vegetation characteristic, applies the strategy to the target data, and provides a rigorous estimate of the accuracy of the inference. Progress in developing the system is presented. VEG combines methods from remote sensing and artificial intelligence, and integrates input spectral measurements with diverse knowledge bases. VEG has been developed to (1) infer spectral hemispherical reflectance from any combination of nadir and/or off-nadir view angles; (2) test and develop new extraction techniques on an internal spectral database; (3) browse, plot, or analyze directional reflectance data in the system's spectral database; (4) discriminate between user-defined vegetation classes using spectral and directional reflectance relationships; and (5) infer unknown view angles from known view angles (known as view angle extension).
Leveraging Terminology Services for Extract-Transform-Load Processes: A User-Centered Approach
Peterson, Kevin J.; Jiang, Guoqian; Brue, Scott M.; Liu, Hongfang
2016-01-01
Terminology services serve an important role supporting clinical and research applications, and underpin a diverse set of processes and use cases. Through standardization efforts, terminology service-to-system interactions can leverage well-defined interfaces and predictable integration patterns. Often, however, users interact more directly with terminologies, and no such blueprints are available for describing terminology service-to-user interactions. In this work, we explore the main architecture principles necessary to build a user-centered terminology system, using an Extract-Transform-Load process as our primary usage scenario. To analyze our architecture, we present a prototype implementation based on the Common Terminology Services 2 (CTS2) standard using the Patient-Centered Network of Learning Health Systems (LHSNet) project as a concrete use case. We perform a preliminary evaluation of our prototype architecture using three architectural quality attributes: interoperability, adaptability and usability. We find that a design-time focus on user needs, cognitive models, and existing patterns is essential to maximize system utility. PMID:28269898
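A hedged sketch of how a terminology-service call might sit inside the transform step of an ETL pipeline; the endpoint, parameters, and response shape are hypothetical stand-ins, not the documented CTS2 or LHSNet interface:

```python
# Hypothetical sketch of a terminology lookup inside an ETL "transform" step.
# The base URL and JSON shape below are invented placeholders.
import requests

TERM_SERVICE = "https://terminology.example.org/cts2"   # hypothetical base URL

def normalize_code(source_code, source_system):
    """Map a local code to a standard concept during the transform phase."""
    resp = requests.get(
        f"{TERM_SERVICE}/map",
        params={"code": source_code, "system": source_system},
        timeout=10,
    )
    resp.raise_for_status()
    concept = resp.json()          # assumed shape: {"code": ..., "display": ...}
    return concept["code"], concept["display"]

# extract -> transform: record["code"], record["display"] = normalize_code(...)
# -> load into the target store
```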
A summary and integration of research concerning single pilot IFR operational problems
NASA Technical Reports Server (NTRS)
Chapman, G. C.
1983-01-01
A review of seven research studies pertaining to Single Pilot IFR (SPIFR) operations was performed. Two studies were based on questionnaire surveys, two on National Transportation Safety Board (NTSB) reports, two on Aviation Safety Reporting System (ASRS) incident reports, and one used event analysis and statistics to forecast problems. The results obtained in each study were extracted and integrated, the findings synthesized, and key issues pertaining to SPIFR operational problems identified. The research recommended by the studies that addresses each key issue is catalogued.
Tsai, Huey-Pin; Tsai, You-Yuan; Lin, I-Ting; Kuo, Pin-Hwa; Chen, Tsai-Yun; Chang, Kung-Chao; Wang, Jen-Ren
2016-01-01
Quantitation of cytomegalovirus (CMV) viral load in transplant patients has become standard practice for monitoring the response to antiviral therapy. The cut-off values of CMV viral load assays for preemptive therapy differ because of the various assay designs employed. To establish a sensitive and reliable diagnostic assay for preemptive therapy of CMV infection, two commercial automated platforms, the Abbott m2000sp extraction system integrated with the Abbott RealTime PCR system (m2000rt) and the Roche COBAS AmpliPrep extraction system integrated with the COBAS TaqMan (CAP/CTM), were evaluated using WHO international CMV standards and 110 plasma specimens from transplant patients. The performance characteristics, correlation, and workflow of the two platforms were investigated. The Abbott RealTime assay correlated well with the Roche CAP/CTM assay (R2 = 0.9379, P<0.01). The Abbott RealTime assay exhibited higher sensitivity for the detection of CMV viral load, and viral load values measured with the Abbott RealTime assay were on average 0.76 log10 IU/mL higher than those measured with the Roche CAP/CTM assay (P<0.0001). In a workflow analysis of a small batch processed at one time, the Roche CAP/CTM platform had a shorter hands-on time than the Abbott RealTime platform. In conclusion, these two assays can provide reliable data for different purposes in a clinical virology laboratory setting. PMID:27494707
Lintelmann, Jutta; Wu, Xiao; Kuhn, Evelyn; Ritter, Sebastian; Schmidt, Claudia; Zimmermann, Ralf
2018-05-01
A high-performance liquid chromatographic (HPLC) method with integrated solid-phase extraction for the determination of 1-hydroxypyrene and 1-, 2-, 3-, 4- and 9-hydroxyphenanthrene in urine was developed and validated. After enzymatic treatment and centrifugation of 500 μL urine, 100 μL of the sample was directly injected into the HPLC system. Integrated solid-phase extraction was performed on a selective, copper phthalocyanine modified packing material. Subsequent chromatographic separation was achieved on a pentafluorophenyl core-shell column using a methanol gradient. For quantification, time-programmed fluorescence detection was used. Matrix-dependent recoveries were between 94.8 and 102.4%, repeatability and reproducibility ranged from 2.2 to 17.9% and detection limits lay between 2.6 and 13.6 ng/L urine. A set of 16 samples from normally exposed adults was analyzed using this HPLC-fluorescence detection method. Results were comparable with those reported in other studies. The chromatographic separation of the method was transferred to an ultra-high-performance liquid chromatography pentafluorophenyl core-shell column and coupled to a high-resolution time-of-flight mass spectrometer (HR-TOF-MS). The resulting method was used to demonstrate the applicability of LC-HR-TOF-MS for simultaneous target and suspect screening of monohydroxylated polycyclic aromatic hydrocarbons in extracts of urine and particulate matter. Copyright © 2018 John Wiley & Sons, Ltd.
Fabiano-Tixier, Anne-Sylvie; Elomri, Abdelhakim; Blanckaert, Axelle; Seguin, Elisabeth; Petitcolas, Emmanuel; Chemat, Farid
2011-01-01
Quinas contain several compounds, such as quinoline alkaloids, principally quinine, quinidine, cinchonine and cinchonidine. Identified from the bark of Cinchona, quinine is still commonly used to treat human malaria. Microwave-Integrated Extraction and Leaching (MIEL) is proposed for the extraction of quinoline alkaloids from the bark of Cinchona succirubra. The process is performed in four steps, which ensures complete, rapid and accurate extraction of the samples. Optimal conditions for extraction were obtained using a response surface methodology based on a central composite design. MIEL extraction was compared with a conventional technique, Soxhlet extraction. The extracts of quinoline alkaloids from C. succirubra obtained by the two methods were compared by HPLC. The extracts obtained by MIEL in 32 min were quantitatively (yield) and qualitatively (quinine, quinidine, cinchonine, cinchonidine) similar to those obtained by conventional Soxhlet extraction in 3 hours. MIEL is a green technology that serves as a good alternative for the extraction of Cinchona alkaloids.
Watt, Nicola; Sigfrid, Louise; Legido-Quigley, Helena; Hogarth, Sue; Maimaris, Will; Otero-García, Laura; Perel, Pablo; Buse, Kent; McKee, Martin; Piot, Peter; Balabanova, Dina
2017-11-01
Integration of services for patients with more than one diagnosed condition has intuitive appeal, but it has been argued that the empirical evidence to support it is limited. We report the findings of a systematic review that sought to identify health system factors, extrinsic to the integration process, which either facilitated or hindered the integration of services for two common disorders, HIV and chronic non-communicable diseases. Findings were initially extracted and organized around a health system framework, followed by a thematic cross-cutting analysis and validation steps. Of the 150 articles included, 67% (n = 102) were from high-income countries. The articles explored integration with services for one or several chronic disorders, the most studied being alcohol or substance use disorders (47.7%) and mental health issues (29.5%). Four cross-cutting themes related to the health system were identified. The first and most common theme was the requirement for effective collaboration and coordination: formal and informal productive relationships throughout the system, between providers and within teams, and between staff and patients. The second was the need for adequate, appropriately skilled and incentivized health workers, with the right expertise, training and operational support for the programme. The third was the need for supportive institutional structures and dedicated resources. The fourth was leadership in terms of political will, effective managerial oversight and organizational culture, indicating that actual implementation is as important as programme design. A fifth theme, outside the health system but underpinning all aspects of its operation, was placing the patient at the centre of service delivery and responding holistically to their diverse needs; this was an important facilitator of integration. These findings confirm that integration processes in service delivery depend substantially for their success on characteristics of the health systems in which they are embedded. © The Author 2017. Published by Oxford University Press in association with The London School of Hygiene and Tropical Medicine.
Manufacturing Process for OLED Integrated Substrate
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hung, Cheng-Hung; McCamy, James; Ashtosh, Ganjoo
2017-01-27
The primary objective of this project is to demonstrate manufacturing processes for technologies that will enable commercialization of a large-area and low-cost “integrated substrate” product for rigid OLED SSL lighting. The integrated substrate product will consist of a low cost, float glass substrate combined with a transparent conductive anode film layer, and light out-coupling (internal and external extraction layers) structures. In combination, these design elements will enable an integrated substrate meeting or exceeding 2015 performance targets for cost ($60/m2), extraction efficiency (50%) and sheet resistance (<10 ohm/sq).
Virot, Matthieu; Tomao, Valérie; Colnagui, Giulio; Visinoni, Franco; Chemat, Farid
2007-12-07
A new process of Soxhlet extraction assisted by microwave was designed and developed. The process is performed in four steps, which ensures complete, rapid and accurate extraction of the samples. A second-order central composite design (CCD) was used to investigate the performance of the new device. The results, provided by analysis of variance and a Pareto chart, indicated that extraction time was the most important factor, followed by leaching time. The response surface methodology allowed us to determine optimal conditions for olive oil extraction: 13 min of extraction time, 17 min of leaching time, and 720 W of irradiation power. The proposed process is suitable for lipid determination in food. Microwave-integrated Soxhlet (MIS) extraction was compared with the conventional technique, Soxhlet extraction, for the extraction of oil from olives (Aglandau, Vaucluse, France). The oils extracted by MIS for 32 min were quantitatively (yield) and qualitatively (fatty acid composition) similar to those obtained by conventional Soxhlet extraction for 8 h. MIS is a green technology and appears to be a good alternative for the extraction of fats and oils from food products.
Thermal imaging as a biometrics approach to facial signature authentication.
Guzman, A M; Goryawala, M; Wang, Jin; Barreto, A; Andrian, J; Rishe, N; Adjouadi, M
2013-01-01
A new thermal imaging framework with unique feature extraction and similarity measurements for face recognition is presented. The research premise is to design specialized algorithms that extract vasculature information, create a thermal facial signature and identify the individual. The proposed algorithm is fully integrated and consolidates the critical steps of feature extraction through the use of morphological operators, registration using the Linear Image Registration Tool, and matching through unique similarity measures designed for this task. The novel approach of developing a thermal signature template from four images taken at various instants of time ensured that unforeseen changes in the vasculature over time did not affect the biometric matching process, as the authentication process relied only on consistent thermal features. Thirteen subjects were used to test the developed technique on an in-house thermal imaging system. Matching using the similarity measures showed an average accuracy of 88.46% for skeletonized signatures and 90.39% for anisotropically diffused signatures. The highly accurate matching results demonstrate the potential of the thermal infrared system to extend to other thermal-imaging-based applications; empirical results from applying the approach to an existing database of thermal images support this assertion.
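One classic way to obtain a skeletonized signature from a thresholded vasculature map is morphological skeletonization; the sketch below shows that standard algorithm with OpenCV, not the authors' pipeline:

```python
# Minimal sketch (not the authors' implementation): reduce a thresholded
# thermal vasculature map to a one-pixel-wide skeleton using the classic
# morphological skeletonization algorithm.
import cv2
import numpy as np

def morphological_skeleton(binary_img):
    """binary_img: uint8 image with vessels as 255. Returns the skeleton."""
    skeleton = np.zeros_like(binary_img)
    kernel = cv2.getStructuringElement(cv2.MORPH_CROSS, (3, 3))
    img = binary_img.copy()
    while cv2.countNonZero(img) > 0:
        eroded = cv2.erode(img, kernel)
        opened = cv2.dilate(eroded, kernel)   # opening = erode then dilate
        # Pixels removed by opening at this scale belong to the skeleton.
        skeleton = cv2.bitwise_or(skeleton, cv2.subtract(img, opened))
        img = eroded
    return skeleton
```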
DOE Office of Scientific and Technical Information (OSTI.GOV)
Steele, V.E.; Lange, C.S.
1976-07-01
The planarian owes its extensive powers of regeneration to the possession of a totipotential stem cell system. The survival of the animal after irradiation depends mainly upon this system. In this respect the planarian is analogous to mammalian organ systems such as bone marrow or gut epithelium. The differentiated cells control the course of stem-cell-mediated tissue renewal by the secretion of differentiator and/or inhibitor substances. One such inhibitor substance, present in extracts prepared from homogenized whole planarians, specifically inhibits brain formation. This substance is organ specific, but not species specific. The differentiative integrity of the stem cells after irradiation is measured by comparing the regenerated brain volumes resulting from the presence or absence of the brain inhibitory extract during the regeneration period. Our data suggest that increasing doses of x irradiation decrease the ability of the stem cells to respond to differentiative substances. The data presented also explore the possibility of altering the postirradiation recovery pattern by shifting the differentiative demands placed on the stem cells. The final proportions of animals (one-half regenerated with, and one-half without, the extract) surviving after 60 days were not significantly different.
NASA Astrophysics Data System (ADS)
Fabbrini, L.; Messina, M.; Greco, M.; Pinelli, G.
2011-10-01
In the context of augmented-integrity Inertial Navigation Systems (INS), recent technological developments have focused on landmark extraction from high-resolution synthetic aperture radar (SAR) images in order to retrieve aircraft position and attitude. The article puts forward a processing chain that can automatically detect linear landmarks in high-resolution SAR images and can also be successfully exploited in the context of augmented-integrity INS. The processing chain uses constant false alarm rate (CFAR) edge detectors as the first step of the whole processing procedure. Our studies confirm that the ratio of averages (RoA) edge detector detects object boundaries more effectively than the Student t-test and the Wilcoxon-Mann-Whitney (WMW) test. Nevertheless, all these statistical edge detectors are sensitive to violations of the assumptions that underlie their theory. In addition to presenting a solution to this problem, we put forward a new post-processing algorithm useful for removing the main false alarms, selecting the most probable edge position, reconstructing broken edges and finally vectorizing them. SAR images from the "MSTAR clutter" dataset were used to prove the effectiveness of the proposed algorithms.
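A minimal sketch of a ratio-of-averages (RoA) edge response for one direction; the window size is illustrative and the CFAR thresholding step used in the paper is omitted:

```python
# Sketch of a ratio-of-averages (RoA) edge response for SAR amplitude images:
# at each pixel, compare mean intensity in two windows on opposite sides of
# the pixel (horizontal direction shown); edge strength = 1 - min(r, 1/r).
import numpy as np
from scipy.ndimage import uniform_filter

def roa_horizontal(img, half=3, eps=1e-9):
    """img: 2-D float SAR amplitude image; returns edge strength in [0, 1)."""
    # Mean over a (2*half+1) x half window centered at each pixel.
    local_mean = uniform_filter(img, size=(2 * half + 1, half))
    shift = half // 2 + 1
    left = np.roll(local_mean, shift, axis=1)      # mean of window left of pixel
    right = np.roll(local_mean, -shift, axis=1)    # mean of window right of pixel
    r = (left + eps) / (right + eps)
    # r near 1 in homogeneous areas (output near 0); far from 1 at edges.
    return 1.0 - np.minimum(r, 1.0 / r)
```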
Extraction of Molecular Features through Exome to Transcriptome Alignment
Mudvari, Prakriti; Kowsari, Kamran; Cole, Charles; Mazumder, Raja; Horvath, Anelia
2014-01-01
Integrative Next Generation Sequencing (NGS) DNA and RNA analyses have very recently become feasible, and the studies published to date have discovered critical disease-implicated pathways and diagnostic and therapeutic targets. A growing number of exomes, genomes and transcriptomes from the same individual are quickly accumulating, providing unique avenues for mechanistic and regulatory feature analysis and, at the same time, requiring new exploration strategies. In this study, we have integrated variation and expression information from four NGS datasets from the same individual: normal and tumor breast exomes and transcriptomes. Focusing on SNP-centered variant allelic prevalence, we illustrate analytical algorithms that can be applied to extract or validate potential regulatory elements, such as expression or growth advantage, imprinting, loss of heterozygosity (LOH), somatic changes, and RNA editing. In addition, we point to some critical elements that might bias the output and recommend alternative measures to maximize the confidence of findings. The need for such strategies is especially recognized within the growing appreciation of the concept of systems biology: integrative exploration of genome and transcriptome features reveals mechanistic and regulatory insights that reach far beyond a linear addition of the individual datasets. PMID:24791251
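To make the idea of SNP-centered variant allelic prevalence concrete, the toy function below compares a site's allele fraction in DNA and RNA read counts; the thresholds are arbitrary examples, not the paper's criteria:

```python
# Toy illustration of comparing DNA vs RNA allele fractions at a SNP to flag
# candidate regulatory events; all thresholds are made-up examples.
def classify_site(dna_ref, dna_alt, rna_ref, rna_alt, min_depth=20):
    if dna_ref + dna_alt < min_depth or rna_ref + rna_alt < min_depth:
        return "insufficient coverage"
    dna_af = dna_alt / (dna_ref + dna_alt)
    rna_af = rna_alt / (rna_ref + rna_alt)
    if dna_af < 0.05 and rna_af > 0.10:
        return "possible RNA editing"        # variant seen in RNA but not DNA
    if 0.3 < dna_af < 0.7 and (rna_af < 0.1 or rna_af > 0.9):
        return "allele-specific expression / LOH candidate"
    return "balanced"

print(classify_site(dna_ref=50, dna_alt=48, rna_ref=95, rna_alt=3))
```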
The effect of integration masking on visual processing in perceptual categorization.
Hélie, Sébastien
2017-08-01
Learning to recognize and categorize objects is an essential cognitive skill allowing animals to function in the world. However, animals rarely have access to a canonical view of an object in an uncluttered environment. Hence, it is essential to study categorization under noisy, degraded conditions. In this article, we explore how the brain processes categorization stimuli in low signal-to-noise conditions using multivariate pattern analysis. We used an integration masking paradigm with mask opacity of 50%, 60%, and 70% inside a magnetic resonance imaging scanner. The results show that mask opacity affects blood-oxygen-level dependent (BOLD) signal in visual processing areas (V1, V2, V3, and V4) but does not affect the BOLD signal in brain areas traditionally associated with categorization (prefrontal cortex, striatum, hippocampus). This suggests that when a stimulus is difficult to extract from its background (e.g., low signal-to-noise ratio), the visual system extracts the stimulus and that activity in areas typically associated with categorization are not affected by the difficulty level of the visual conditions. We conclude with implications of this result for research on visual attention, categorization, and the integration of these fields. Copyright © 2017 Elsevier Inc. All rights reserved.
A factorial design experiment as a pilot study for noninvasive genetic sampling.
Renan, Sharon; Speyer, Edith; Shahar, Naama; Gueta, Tomer; Templeton, Alan R; Bar-David, Shirli
2012-11-01
Noninvasive genetic sampling has increasingly been used in ecological and conservation studies during the last decade. A major part of the noninvasive genetic literature is dedicated to the search for optimal protocols, by comparing different methods of collection, preservation and extraction of DNA from noninvasive materials. However, the lack of quantitative comparisons among these studies and the possibility that different methods are optimal for different systems make it difficult to decide which protocol to use. Moreover, most studies that have compared different methods focused on a single factor - collection, preservation or extraction - while there could be interactions between these factors. We designed a factorial experiment, as a pilot study, aimed at exploring the effect of several collection, preservation and extraction methods, and the interactions between them, on the quality and amplification success of DNA obtained from Asiatic wild ass (Equus hemionus) faeces in Israel. The amplification success rates of one mitochondrial DNA and four microsatellite markers differed substantially as a function of collection, preservation and extraction methods and their interactions. The most efficient combination for our system integrated the use of swabs as a collection method with preservation at -20 °C and with the Qiagen DNA Stool Kit with modifications as the DNA extraction method. The significant interaction found between the collection, preservation methods and the extraction methods reinforces the importance of conducting a factorial design experiment, rather than examining each factor separately, as a pilot study before initiating a full-scale noninvasive research project. © 2012 Blackwell Publishing Ltd.
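For readers wanting to reproduce this kind of analysis, a three-factor factorial design with all interactions can be fit as follows; the column names and input file are hypothetical, not the authors' data:

```python
# Sketch of analyzing a three-factor factorial design like the one described
# above (hypothetical column names and file; not the authors' code or data).
import pandas as pd
from statsmodels.formula.api import ols
from statsmodels.stats.anova import anova_lm

# Expected columns: collection, preservation, extraction, success (0/1 amplification).
df = pd.read_csv("pilot_design.csv")      # hypothetical input file
model = ols("success ~ C(collection) * C(preservation) * C(extraction)", data=df).fit()
print(anova_lm(model, typ=2))             # main effects and all interaction terms
```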
Angelis, Apostolis; Hamzaoui, Mahmoud; Aligiannis, Nektarios; Nikou, Theodora; Michailidis, Dimitris; Gerolimatos, Panagiotis; Termentzi, Aikaterini; Hubert, Jane; Halabalaki, Maria; Renault, Jean-Hugues; Skaltsounis, Alexios-Léandros
2017-03-31
An integrated extraction and purification process for the direct recovery of high added-value compounds from extra virgin olive oil (EVOO) is proposed, using solid-support-free liquid-liquid extraction and chromatography techniques. Two different extraction methods were developed on a laboratory-scale Centrifugal Partition Extractor (CPE): a sequential strategy consisting of several "extraction-recovery" cycles and a continuous strategy based on stationary phase co-current elution. In both cases, EVOO was used as the mobile phase diluted in food-grade n-hexane (feed mobile phase), and the required biphasic system was obtained by adding ethanol and water as polar solvents. For the sequential process, 17.5 L of feed EVOO-containing organic phase (i.e., 7 L of EVOO treated) were extracted, yielding 9.5 g of total phenolic fraction, corresponding to a productivity of 5.8 g/h/L of CPE column. For the second approach, the co-current process, 2 L of the feed oil phase (corresponding to 0.8 L of EVOO) were treated at 100 mL/min, yielding 1.03 g of total phenolic fraction, corresponding to a productivity of 8.9 g/h/L of CPE column. The total phenolic fraction was then fractionated by stepwise gradient elution Centrifugal Partition Chromatography (CPC). The biphasic solvent systems were composed of n-hexane, ethyl acetate, ethanol and water in different proportions (X/Y/2/3, v/v). In a single 4-h run on a column with a capacity of 1 L, 910 mg of oleocanthal, 882 mg of oleacein and 104 mg of hydroxytyrosol were successfully recovered from 5 g of phenolic extract, with purities of 85%, 92% and 90%, respectively. CPC fractions were then submitted to orthogonal chromatographic steps (adsorption on silica gel or size exclusion chromatography), leading to the isolation of eleven additional compounds belonging to the triterpenes, phenolic compounds and secoiridoids. Among them, elenolic acid ethyl ester was found to be a new compound. Thin Layer Chromatography (TLC), Nuclear Magnetic Resonance (NMR) and High Performance Liquid Chromatography - Diode Array Detector (HPLC-DAD) were used for monitoring and evaluation purposes throughout the entire procedure. Copyright © 2017 Elsevier B.V. All rights reserved.
Incremental Ontology-Based Extraction and Alignment in Semi-structured Documents
NASA Astrophysics Data System (ADS)
Thiam, Mouhamadou; Bennacer, Nacéra; Pernelle, Nathalie; Lô, Moussa
SHIRI is an ontology-based system for the integration of semi-structured documents related to a specific domain. The system's purpose is to allow users to access relevant parts of documents as answers to their queries. SHIRI uses RDF/OWL for the representation of resources and SPARQL for their querying. It relies on an automatic, unsupervised and ontology-driven approach for extraction, alignment and semantic annotation of tagged elements of documents. In this paper, we focus on the Extract-Align algorithm, which exploits a set of named-entity and term patterns to extract term candidates to be aligned with the ontology. It proceeds in an incremental manner in order to populate the ontology with terms describing instances of the domain and to reduce access to external resources such as the Web. We evaluated it on an HTML corpus related to calls for papers in computer science, and the results we obtained are very promising. These results show how the incremental behaviour of the Extract-Align algorithm enriches the ontology and increases the number of terms (or named entities) aligned directly with the ontology.
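A hedged sketch of querying such annotations with SPARQL from Python; the prefixes, properties, and class names are invented placeholders, not SHIRI's actual schema:

```python
# Hypothetical sketch of querying SHIRI-style RDF annotations with rdflib.
# All ontology terms below are invented placeholders.
import rdflib

g = rdflib.Graph()
g.parse("annotations.rdf")                # hypothetical RDF output of Extract-Align

query = """
PREFIX ex: <http://example.org/shiri#>
SELECT ?doc ?term WHERE {
    ?doc ex:containsTerm ?term .          # placeholder property
    ?term ex:alignedWith ex:Conference .  # placeholder ontology class
}
"""
for doc, term in g.query(query):
    print(doc, term)
```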
Zhang, Chen; Sanders, Johan P M; Xiao, Ting T; Bruins, Marieke E
2015-01-01
Leaf protein can be obtained cost-efficiently by alkaline extraction, but overuse of chemicals and the low quality of (denatured) protein limit its application. The research objective was to investigate how alkali aids protein extraction from green tea leaf residue, and to use these results for further improvements in alkaline protein biorefinery. Protein extraction yield was studied in relation to the morphology of the leaf tissue structure, protein solubility and degree of hydrolysis, and the yields of non-protein components obtained at various conditions. Alkaline protein extraction was not facilitated by increased solubility or hydrolysis of protein, but was positively correlated with leaf tissue disruption. HG pectin, RGII pectin, and organic acids were extracted before protein extraction, which was followed by the extraction of cellulose and hemicellulose. RGI pectin and lignin yields were both linear in protein yield; their yields were 80% and 25%, respectively, when 95% of the protein was extracted, which indicates that RGI pectin is more likely to be the key limitation to leaf protein extraction. An integrated biorefinery was designed based on these results.
NASA Technical Reports Server (NTRS)
Majumdar, Alok; Valenzuela, Juan; LeClair, Andre; Moder, Jeff
2015-01-01
This paper presents a numerical model of a system-level test bed, the multipurpose hydrogen test bed (MHTB), using the Generalized Fluid System Simulation Program (GFSSP). MHTB is representative in size and shape of a fully integrated space transportation vehicle liquid hydrogen (LH2) propellant tank and was tested at Marshall Space Flight Center (MSFC) to generate data for cryogenic storage. GFSSP is a finite-volume-based network flow analysis software developed at MSFC and used for thermo-fluid analysis of propulsion systems. GFSSP has been used to model self-pressurization and ullage pressure control by a Thermodynamic Vent System (TVS). A TVS typically includes a Joule-Thompson (J-T) expansion device, a two-phase heat exchanger, and a mixing pump and spray to extract thermal energy from the tank without significant loss of liquid propellant. Two GFSSP models (self-pressurization and TVS) were separately developed and tested, and then integrated to simulate the entire system. The self-pressurization model consists of multiple ullage nodes, a propellant node, and solid nodes; it computes the heat transfer through the multi-layer insulation blankets and calculates heat and mass transfer between the ullage and liquid propellant and between the ullage and tank wall. The TVS model calculates the flow through the J-T valve, heat exchanger, and spray and vent systems. The two models are integrated by exchanging data through the User Subroutines of both models. The integrated model's results have been compared with MHTB test data at a 50% fill level. Satisfactory agreement was observed between the test data and numerical predictions.
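Conceptually, the coupling works like a co-simulation loop in which the two submodels exchange boundary data at each time step. The sketch below only illustrates that data flow; GFSSP itself performs the exchange through Fortran User Subroutines, and the method names here are invented:

```python
# Conceptual sketch of the two-model coupling (illustrative only): the TVS
# model sees the current tank state, and the self-pressurization model then
# advances with the TVS heat/mass sink terms applied.
def run_coupled(self_press, tvs, t_end, dt):
    """self_press and tvs are assumed submodel objects with step() methods."""
    t = 0.0
    ullage_pressure = self_press.initial_pressure()
    while t < t_end:
        vent_heat, vent_mass = tvs.step(ullage_pressure, dt)
        ullage_pressure = self_press.step(vent_heat, vent_mass, dt)
        t += dt
    return ullage_pressure
```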
NASA Astrophysics Data System (ADS)
Ranamukhaarachchi, Sahan A.; Padeste, Celestino; Dübner, Matthias; Häfeli, Urs O.; Stoeber, Boris; Cadarso, Victor J.
2016-07-01
Therapeutic drug monitoring (TDM) typically requires painful blood draws from patients. We propose a painless and minimally-invasive alternative for TDM using hollow microneedles suitable for extracting extremely small volumes (<1 nL) of interstitial fluid to measure drug concentrations. The inner lumen of a microneedle is functionalized to be used as a micro-reactor during sample collection, trapping and binding target drug candidates during extraction without requiring sample transfer. An optofluidic device is integrated with this microneedle to rapidly quantify drug analytes with high sensitivity using a straightforward absorbance scheme. Vancomycin is currently detected using sample volumes of 50-100 μL with a limit of detection (LoD) of 1.35 μM. The proposed microneedle-optofluidic biosensor can detect vancomycin with a sample volume of 0.6 nL and a LoD of <100 nM, validating this painless point-of-care system with significant potential to reduce healthcare costs and patient suffering.
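The absorbance scheme reduces to a Beer-Lambert inversion, A = epsilon x l x c, so concentration follows directly from a measured absorbance; the epsilon and path length below are made-up example values, not the device's calibration:

```python
# Beer-Lambert inversion sketch: recover concentration from absorbance.
# epsilon and path length are illustrative assumptions only.
def concentration_from_absorbance(A, epsilon=5000.0, path_cm=0.01):
    """A: absorbance (unitless); epsilon: M^-1 cm^-1; returns molarity."""
    return A / (epsilon * path_cm)        # c = A / (epsilon * l)

print(f"{concentration_from_absorbance(0.02) * 1e6:.0f} uM")   # example reading
```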
Multiscale tensor anisotropic filtering of fluorescence microscopy for denoising microvasculature.
Prasath, V B S; Pelapur, R; Glinskii, O V; Glinsky, V V; Huxley, V H; Palaniappan, K
2015-04-01
Fluorescence microscopy images are contaminated by noise, and improving image quality by filtering without blurring vascular structures is an important step in automatic image analysis. The application of interest here is to automatically and accurately extract the structural components of the microvascular system from images acquired by fluorescence microscopy. A robust denoising process is necessary in order to extract accurate vascular morphology information. For this purpose, we propose a multiscale tensor anisotropic diffusion model which progressively and adaptively updates the amount of smoothing while preserving vessel boundaries accurately. Based on a coherency-enhancing flow with a planar confidence measure and fused 3D structure information, our method integrates multiple scales for microvasculature preservation and noise removal in membrane structures. Experimental results on simulated synthetic images and epifluorescence images show the advantage of our improvement over other related diffusion filters. We further show that the proposed multiscale integration approach improves the denoising accuracy of different tensor diffusion methods, yielding better microvasculature segmentation.
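For orientation, the sketch below implements a single-scale, scalar Perona-Malik diffusion; the paper's method is a multiscale tensor (coherence-enhancing) flow, which this simplified version only approximates:

```python
# Single-scale Perona-Malik-style anisotropic diffusion sketch. The paper's
# multiscale tensor flow is more elaborate; this is only a baseline variant.
import numpy as np

def anisotropic_diffusion(img, n_iter=20, kappa=0.1, lam=0.2):
    """kappa: edge threshold relative to intensity scale (image assumed ~[0, 1]);
    lam <= 0.25 for numerical stability of the explicit update."""
    u = img.astype(float).copy()
    for _ in range(n_iter):
        # Finite-difference gradients toward the four neighbors.
        dn = np.roll(u, -1, axis=0) - u
        ds = np.roll(u, 1, axis=0) - u
        de = np.roll(u, -1, axis=1) - u
        dw = np.roll(u, 1, axis=1) - u
        # Edge-stopping conductance: small where gradients are large.
        c = lambda d: np.exp(-(d / kappa) ** 2)
        u += lam * (c(dn) * dn + c(ds) * ds + c(de) * de + c(dw) * dw)
    return u
```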
NASA Astrophysics Data System (ADS)
Jung, Chinte; Sun, Chih-Hong
2006-10-01
Motivated by the increasing accessibility of technology, more and more spatial data are being made digitally available. How to extract valuable knowledge from these large (spatial) databases is becoming increasingly important to businesses as well. It is essential to be able to analyze and utilize these large datasets, convert them into useful knowledge, and transmit them through GIS-enabled instruments and the Internet, conveying the key information to business decision-makers effectively and benefiting business entities. In this research, we combine the techniques of GIS, spatial decision support systems (SDSS), spatial data mining (SDM), and ArcGIS Server to achieve the following goals: (1) integrate databases from spatial and non-spatial datasets about the locations of businesses in Taipei, Taiwan; (2) use association rules, one of the SDM methods, to extract knowledge from the integrated databases; and (3) develop a Web-based SDSS GIService as a location-selection tool for businesses.
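To illustrate the association-rule step, the toy function below mines pairwise "A => B" rules by support and confidence; the study's actual SDM tooling and data schema are not specified at this level of detail:

```python
# Toy association-rule miner (itemsets of size two only), as an illustration
# of the SDM step described above; not the study's actual implementation.
from itertools import combinations

def pair_rules(transactions, min_support=0.3, min_confidence=0.6):
    n = len(transactions)
    support = {}
    for t in transactions:
        for item in set(t):
            support[frozenset([item])] = support.get(frozenset([item]), 0) + 1
        for pair in combinations(sorted(set(t)), 2):
            support[frozenset(pair)] = support.get(frozenset(pair), 0) + 1
    rules = []
    for itemset, cnt in support.items():
        if len(itemset) == 2 and cnt / n >= min_support:
            a, b = sorted(itemset)
            for x, y in ((a, b), (b, a)):          # try both rule directions
                conf = cnt / support[frozenset([x])]
                if conf >= min_confidence:
                    rules.append((x, y, cnt / n, conf))
    return rules   # list of (antecedent, consequent, support, confidence)

print(pair_rules([["cafe", "bank"], ["cafe", "bank"], ["cafe"], ["bank", "gym"]]))
```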
D'Antonio, Matteo; Masseroli, Marco
2009-01-01
Background Alternative splicing has been demonstrated to affect most of human genes; different isoforms from the same gene encode for proteins which differ for a limited number of residues, thus yielding similar structures. This suggests possible correlations between alternative splicing and protein structure. In order to support the investigation of such relationships, we have developed the Alternative Splicing and Protein Structure Scrutinizer (PASS), a Web application to automatically extract, integrate and analyze human alternative splicing and protein structure data sparsely available in the Alternative Splicing Database, Ensembl databank and Protein Data Bank. Primary data from these databases have been integrated and analyzed using the Protein Identifier Cross-Reference, BLAST, CLUSTALW and FeatureMap3D software tools. Results A database has been developed to store the considered primary data and the results from their analysis; a system of Perl scripts has been implemented to automatically create and update the database and analyze the integrated data; a Web interface has been implemented to make the analyses easily accessible; a database has been created to manage user accesses to the PASS Web application and store user's data and searches. Conclusion PASS automatically integrates data from the Alternative Splicing Database with protein structure data from the Protein Data Bank. Additionally, it comprehensively analyzes the integrated data with publicly available well-known bioinformatics tools in order to generate structural information of isoform pairs. Further analysis of such valuable information might reveal interesting relationships between alternative splicing and protein structure differences, which may be significantly associated with different functions. PMID:19828075
Wang, Huili; Gao, Ming; Wang, Mei; Zhang, Rongbo; Wang, Wenwei; Dahlgren, Randy A; Wang, Xuedong
2015-03-15
Herein, we developed a novel integrated device to perform phase separation based on ultrasound-assisted salt-induced liquid-liquid microextraction for determination of five fluoroquinolones (FQs) in human body fluids. The integrated device consisted of three simple HDPE components used to separate the extraction solvent from the aqueous phase prior to retrieving the extractant. A series of extraction parameters were optimized using the response surface method based on a central composite design. Optimal conditions consisted of 945 μL acetone extraction solvent, pH 2.1, 4.1 min stir time, 5.9 g Na2SO4, and 4.0 min centrifugation. Under optimized conditions, the limits of detection (at S/N = 3) were 0.12-0.66 μg L(-1), the linear range was 0.5-500 μg L(-1), and recoveries were 92.6-110.9% for the five FQs extracted from plasma and urine. The proposed method has several advantages, including easy construction from inexpensive materials, high extraction efficiency, short extraction time, and compatibility with HPLC analysis. Thus, this method shows excellent prospects for sample pretreatment and analysis of FQs in human body fluids. Copyright © 2015 Elsevier B.V. All rights reserved.
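As a sketch of how a central-composite-design optimization is commonly analyzed, the following fits a second-order response-surface model by least squares; the two factors, levels and responses shown are illustrative assumptions, not the study's five-factor data:

```python
import numpy as np

# Illustrative two-factor subset (solvent volume in uL, salt mass in g) with a
# made-up response (recovery %); the paper's full CCD data are not reproduced here.
X = np.array([[800, 4.0], [800, 7.0], [1000, 4.0], [1000, 7.0],
              [900, 5.5], [760, 5.5], [1040, 5.5], [900, 3.4], [900, 7.6]])
y = np.array([88.0, 90.5, 91.0, 89.5, 95.0, 87.0, 90.0, 86.5, 91.5])

x1, x2 = X[:, 0], X[:, 1]
# Second-order response-surface model:
# y = b0 + b1*x1 + b2*x2 + b12*x1*x2 + b11*x1^2 + b22*x2^2
A = np.column_stack([np.ones_like(x1), x1, x2, x1 * x2, x1**2, x2**2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# The stationary point of the fitted surface approximates the optimum sought by the CCD
print("coefficients:", np.round(coef, 6))
```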
Liu, Li; Chen, Weiping; Nie, Min; Zhang, Fengjuan; Wang, Yu; He, Ailing; Wang, Xiaonan; Yan, Gen
2016-11-01
To handle the emergence of the regional healthcare ecosystem, physicians and surgeons in various departments and healthcare institutions must process medical images securely, conveniently, and efficiently, and must integrate them with electronic medical records (EMRs). In this manuscript, we propose a software as a service (SaaS) cloud called the iMAGE cloud. A three-layer hybrid cloud was created to provide medical image processing services in the smart city of Wuxi, China, in April 2015. In the first step, medical images and EMR data were received and integrated via the hybrid regional healthcare network. Then, traditional and advanced image processing functions were proposed and computed in a unified manner in the high-performance cloud units. Finally, the image processing results were delivered to regional users using the virtual desktop infrastructure (VDI) technology. Security infrastructure was also taken into consideration. Integrated information query and many advanced medical image processing functions-such as coronary extraction, pulmonary reconstruction, vascular extraction, intelligent detection of pulmonary nodules, image fusion, and 3D printing-were available to local physicians and surgeons in various departments and healthcare institutions. Implementation results indicate that the iMAGE cloud can provide convenient, efficient, compatible, and secure medical image processing services in regional healthcare networks. The iMAGE cloud has been proven to be valuable in applications in the regional healthcare system, and it could have a promising future in the healthcare system worldwide.
Hu, Shan-Wen; Xu, Bi-Yi; Qiao, Shu; Zhao, Ge; Xu, Jing-Juan; Chen, Hong-Yuan; Xie, Fu-Wei
2016-04-01
In this work, we report a novel microfluidic gas-collecting platform aimed at simultaneous sample extraction and multiplex mass spectrometry (MS) analysis. An alveolar-mimicking elastic polydimethylsiloxane (PDMS) structure was designed to move dynamically, driven by external pressure. The movement was tuned in both amplitude and rhythm to follow the natural process of human respiration. By integrating the alveolar units into arrays and assembling them with gas channels, a cyclic contraction/expansion system for gas inhalation and exhalation was successfully constructed. By equipping this system with a droplet array on the alveolar array surface, we were able to obtain information on inhaled smoke using a new strategy. Here, with cigarette smoke as an example, the accumulation of target molecules during passive smoking is analyzed. Relationships between breathing times, distance from smokers, and the inhaled content of nicotine are clarified. Further, by applying different types of extraction solvent droplets at different locations of the droplet array, simultaneous extraction of nicotine, formaldehyde and caproic acid in sidestream smoke (SS) is realized. Since the extract droplets are spatially separated, they can be directly analyzed by MS, which is fast and avoids complex sample separation and purification steps. Combining all these merits, this small, cheap and portable platform might find wide application in inhaled air pollutant analysis both indoors and outdoors. Copyright © 2015 Elsevier B.V. All rights reserved.
Extractive Fermentation of Sugarcane Juice to Produce High Yield and Productivity of Bioethanol
NASA Astrophysics Data System (ADS)
Rofiqah, U.; Widjaja, T.; Altway, A.; Bramantyo, A.
2017-04-01
Ethanol production by batch fermentation requires a simple process and is widely used. However, batch fermentation produces ethanol with low yield and productivity because the ethanol accumulating in the fermenter poisons the microorganisms. The extractive fermentation technique is applied to solve this product-inhibition problem and can produce ethanol with high yield and productivity. In this process, the raffinate still contains much sugar because conversion in the fermentation process is incomplete. Thus, to enhance ethanol yield and productivity, a recycle system is applied by returning the raffinate from the extraction process to the fermentation process. This raffinate also contains ethanol, which would inhibit the performance of the microorganisms during fermentation. Therefore, this study aims to find the optimum solvent-to-broth ratio (S:B) and recycle-to-fresh-feed ratio (R:F) entering the fermenter to produce high yield and productivity. The research was carried out experimentally: sugarcane juice was fermented using a Zymomonas mobilis mutant, the fermentation broth was extracted using amyl alcohol, and the process was integrated with the recycle system by varying the recycle ratio. The highest yield and productivity were 22.3901% and 103.115 g/L·h, respectively, obtained in a process with a recycle-to-fresh-feed ratio (R:F) of 50:50 and a solvent-to-broth ratio of 1.
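A minimal sketch of the steady-state mixing balance implied by the recycle configuration, assuming hypothetical stream values; it only illustrates how the R:F ratio sets the fermenter inlet composition, and is not the authors' model:

```python
# All stream values are hypothetical, chosen only to illustrate the R:F ratio effect.
def mixed_feed(fresh_flow, fresh_sugar, recycle_ratio, recycle_sugar, recycle_ethanol):
    """Return total flow, sugar and ethanol concentrations entering the fermenter
    when raffinate is recycled at a given recycle:fresh-feed ratio (R:F)."""
    recycle_flow = recycle_ratio * fresh_flow
    total = fresh_flow + recycle_flow
    sugar = (fresh_flow * fresh_sugar + recycle_flow * recycle_sugar) / total
    ethanol = (recycle_flow * recycle_ethanol) / total  # fresh juice carries no ethanol
    return total, sugar, ethanol

# R:F = 50:50 corresponds to recycle_ratio = 1.0
print(mixed_feed(fresh_flow=1.0, fresh_sugar=150.0, recycle_ratio=1.0,
                 recycle_sugar=40.0, recycle_ethanol=5.0))
```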
Note: Fully integrated 3.2 Gbps quantum random number generator with real-time extraction
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhang, Xiao-Guang; Nie, You-Qi; Liang, Hao
2016-07-15
We present a real-time and fully integrated quantum random number generator (QRNG) by measuring laser phase fluctuations. The QRNG scheme based on laser phase fluctuations is featured for its capability of generating ultra-high-speed random numbers. However, the speed bottleneck of a practical QRNG lies in the limited speed of randomness extraction. To close the gap between the fast randomness generation and the slow post-processing, we propose a pipeline extraction algorithm based on Toeplitz matrix hashing and implement it in a high-speed field-programmable gate array. Further, all the QRNG components are integrated into a module, including a compact and actively stabilized interferometer, high-speed data acquisition, and real-time data post-processing and transmission. The final generation rate of the QRNG module with real-time extraction can reach 3.2 Gbps.
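A minimal sketch of Toeplitz-hashing randomness extraction (a software analogue of the FPGA pipeline described above), assuming synthetic raw bits; the seed handling and output length are illustrative choices:

```python
import numpy as np
from scipy.linalg import toeplitz

def toeplitz_extract(raw_bits, n_out, seed_bits):
    """Toeplitz-hashing extraction over GF(2): out = T @ raw mod 2.
    An n_out x n_in Toeplitz matrix needs n_out + n_in - 1 seed bits."""
    n_in = len(raw_bits)
    assert len(seed_bits) == n_out + n_in - 1
    # First column from the first n_out seed bits, first row from the rest
    T = toeplitz(seed_bits[:n_out], seed_bits[n_out - 1:])
    return T.dot(raw_bits) % 2

rng = np.random.default_rng(0)
raw = rng.integers(0, 2, size=1024)            # stand-in for digitized phase noise
seed = rng.integers(0, 2, size=512 + 1024 - 1) # uniform seed for the hash matrix
out = toeplitz_extract(raw, 512, seed)
print(out[:16])
```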
NASA Astrophysics Data System (ADS)
Dogon-yaro, M. A.; Kumar, P.; Rahman, A. Abdul; Buyuksalih, G.
2016-10-01
Timely and accurate acquisition of information on the condition and structural changes of urban trees serves as a tool for decision makers to better appreciate urban ecosystems and their numerous values, which are critical to building strategies for sustainable development. The conventional techniques used for extracting tree features include ground surveying and interpretation of aerial photography. However, these techniques are associated with constraints such as labour-intensive field work, high financial requirements, and the influence of weather conditions and topographic cover, which can be overcome by means of integrated airborne LiDAR and very-high-resolution digital image datasets. This study presents a semi-automated approach for extracting urban trees from integrated airborne LiDAR and multispectral digital image datasets over Istanbul, Turkey. The scheme includes detection and extraction of shadow-free vegetation features based on the spectral properties of the digital images, using shadow-index and NDVI techniques, and automated extraction of 3D information about vegetation features from the integrated processing of the shadow-free vegetation image and the LiDAR point cloud. The performance of the developed algorithms shows promising results as an automated and cost-effective approach to estimating and delineating 3D information of urban trees. The research also proved that integrated datasets are a suitable technology and a viable source of information for city managers to use in urban tree management.
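A minimal sketch of the shadow-free vegetation masking step, assuming synthetic co-registered bands and placeholder thresholds (the study's actual shadow index and threshold values are not reproduced here):

```python
import numpy as np

# Synthetic stand-ins for co-registered multispectral bands
red = np.random.rand(100, 100).astype(np.float32)
nir = np.random.rand(100, 100).astype(np.float32)
brightness = np.random.rand(100, 100).astype(np.float32)

ndvi = (nir - red) / (nir + red + 1e-6)   # vegetation strength
shadow = brightness < 0.2                 # simple shadow-index threshold (assumed)
vegetation = (ndvi > 0.3) & ~shadow       # shadow-free vegetation mask

# This mask would then be intersected with the LiDAR point cloud for 3D extraction
print("vegetation pixels:", int(vegetation.sum()))
```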
The development of contour processing: evidence from physiology and psychophysics
Taylor, Gemma; Hipp, Daniel; Moser, Alecia; Dickerson, Kelly; Gerhardstein, Peter
2014-01-01
Object perception and pattern vision depend fundamentally upon the extraction of contours from the visual environment. In adulthood, contour or edge-level processing is supported by the Gestalt heuristics of proximity, collinearity, and closure. Less is known, however, about the developmental trajectory of contour detection and contour integration. Within the physiology of the visual system, long-range horizontal connections in V1 and V2 are the likely candidates for implementing these heuristics. While post-mortem anatomical studies of human infants suggest that horizontal interconnections reach maturity by the second year of life, psychophysical research with infants and children suggests a considerably more protracted development. In the present review, data from infancy to adulthood will be discussed in order to track the development of contour detection and integration. The goal of this review is thus to integrate the development of contour detection and integration with research regarding the development of underlying neural circuitry. We conclude that the ontogeny of this system is best characterized as a developmentally extended period of associative acquisition whereby horizontal connectivity becomes functional over longer and longer distances, thus becoming able to effectively integrate over greater spans of visual space. PMID:25071681
Le, Hoang-Quynh; Tran, Mai-Vu; Dang, Thanh Hai; Ha, Quang-Thuy; Collier, Nigel
2016-07-01
The BioCreative V chemical-disease relation (CDR) track was proposed to accelerate the progress of text mining in facilitating integrative understanding of chemicals, diseases and their relations. In this article, we describe an extension of our system (namely UET-CAM) that participated in the BioCreative V CDR. The original UET-CAM system's performance was ranked fourth among 18 participating systems by the BioCreative CDR track committee. In the Disease Named Entity Recognition and Normalization (DNER) phase, our system employed joint inference (decoding) with a perceptron-based named entity recognizer (NER) and a back-off model with Semantic Supervised Indexing and Skip-gram for named entity normalization. In the chemical-induced disease (CID) relation extraction phase, we proposed a pipeline that includes a coreference resolution module and a Support Vector Machine relation extraction model. The former module utilized a multi-pass sieve to extend entity recall. In this article, the UET-CAM system was improved by adding a 'silver' CID corpus to train the prediction model. This silver-standard corpus of more than 50 thousand sentences was automatically built from the Comparative Toxicogenomics Database (CTD). We evaluated our method on the CDR test set. Results showed that our system reaches state-of-the-art performance, with an F1 of 82.44 for the DNER task and 58.90 for the CID task. Analysis demonstrated substantial benefits of both the multi-pass sieve coreference resolution method (F1 +4.13%) and the silver CID corpus (F1 +7.3%). Database URL: SilverCID, the silver-standard corpus for CID relation extraction, is freely available online at: https://zenodo.org/record/34530 (doi:10.5281/zenodo.34530). © The Author(s) 2016. Published by Oxford University Press.
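As a toy illustration of a sentence-level SVM relation extractor of the kind used in the CID phase, the following sklearn sketch classifies masked-entity sentences; the features here are plain TF-IDF n-grams, far simpler than the UET-CAM feature set, and the data are invented:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC
from sklearn.pipeline import make_pipeline

# Toy sentences with chemical/disease mentions masked; label 1 means the sentence
# asserts a chemical-induced disease (CID) relation.
sentences = [
    "CHEMICAL induced severe DISEASE in treated patients",
    "CHEMICAL was administered and DISEASE was later observed",
    "DISEASE patients received CHEMICAL as standard therapy",
    "CHEMICAL is a first-line treatment for DISEASE",
]
labels = [1, 1, 0, 0]

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LinearSVC())
model.fit(sentences, labels)
print(model.predict(["DISEASE developed after exposure to CHEMICAL"]))
```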
Integration of aerial remote sensing imaging data in a 3D-GIS environment
NASA Astrophysics Data System (ADS)
Moeller, Matthias S.
2003-03-01
For some years, sensor systems have been available that provide digital images of a new quality. In particular, aerial stereo scanners acquire digital multispectral images with an extremely high ground resolution of about 0.10-0.15 m and additionally provide a Digital Surface Model (DSM). Both imaging products can be used for detailed monitoring at scales up to 1:500. The processed georeferenced multispectral orthoimages can be readily integrated into GIS, making them useful for a number of applications. The DSM, derived from the forward- and backward-facing sensors of an aerial imaging system, provides a ground resolution of 0.5 m and can be used for 3D visualization purposes. In some cases it is essential to store the ground elevation as a Digital Terrain Model (DTM) and the heights of 3-dimensional objects in a separate database. Existing automated algorithms do not extract DTMs precisely from aerial scanner DSMs. This paper presents a new approach that combines the visible image data and the DSM data to generate DTMs with reliable geometric accuracy. Existing cadastral data can be used as a knowledge base for the extraction of building heights in cities. These elevation data are the essential source for a GIS-based urban information system with a 3D visualization component.
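A minimal sketch of the DSM/DTM separation idea underlying building-height extraction, assuming synthetic grids; with a cadastral footprint mask, the normalized DSM (nDSM = DSM - DTM) yields per-building heights:

```python
import numpy as np

# Synthetic stand-ins for the 0.5 m aerial-scanner grids described above
dsm = np.random.rand(200, 200) * 30.0   # surface model incl. buildings and trees
dtm = np.random.rand(200, 200) * 2.0    # bare-earth terrain model
ndsm = dsm - dtm                        # normalized DSM: height above ground

# With a cadastral building-footprint mask, per-building heights can be extracted
footprint = np.zeros((200, 200), dtype=bool)
footprint[50:80, 60:90] = True          # hypothetical building polygon, rasterized
print("building height ~", float(np.median(ndsm[footprint])), "m")
```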
Petty, J.D.; Jones, S.B.; Huckins, J.N.; Cranor, W.L.; Parris, J.T.; McTague, T.B.; Boyle, T.P.
2000-01-01
As an integral part of our continued development of water quality assessment approaches, we combined integrative sampling, instrumental analysis of widely occurring anthropogenic contaminants, and the application of a suite of bioindicator tests as a specific part of a broader survey of ecological conditions, species diversity, and habitat quality in the Santa Cruz River in Arizona, USA. Lipid-containing semipermeable membrane devices (SPMDs) were employed to sequester waterborne hydrophobic chemicals. Instrumental analysis and a suite of bioindicator tests were used to determine the presence and potential toxicological relevance of mixtures of bioavailable chemicals in two major water sources of the Santa Cruz River. The SPMDs were deployed at two sites: the effluent weir of the International Wastewater Treatment Plant (IWWTP) and the Nogales Wash. Both of these systems empty into the Santa Cruz River, and the IWWTP effluent is a potential source of water for a constructed wetland complex. Analysis of the SPMD sample extracts revealed the presence of organochlorine pesticides (OCs), polychlorinated biphenyls (PCBs), and polycyclic aromatic hydrocarbons (PAHs). The bioindicator tests demonstrated increased liver enzyme activity, perturbation of neurotransmitter systems and potential endocrine-disrupting effects (vitellogenin induction) in fish exposed to the extracts. With increasing global demands on limited water resources, the approach described herein provides an assessment paradigm applicable to determining the quality of water in a broad range of aquatic systems.
Hernando, M Elena; Pascual, Mario; Salvador, Carlos H; García-Sáez, Gema; Rodríguez-Herrero, Agustín; Martínez-Sarriegui, Iñaki; Gómez, Enrique J
2008-09-01
The growing availability of continuous data from medical devices in diabetes management makes it crucial to define novel information technology architectures for efficient data storage, data transmission, and data visualization. The new paradigm of care demands the sharing of information in interoperable systems as the only way to support patient care in a continuum of care scenario. The technological platforms should support all the services required by the actors involved in the care process, located in different scenarios and managing diverse information for different purposes. This article presents basic criteria for defining flexible and adaptive architectures that are capable of interoperating with external systems, and integrating medical devices and decision support tools to extract all the relevant knowledge to support diabetes care.
Database assessment of CMIP5 and hydrological models to determine flood risk areas
NASA Astrophysics Data System (ADS)
Limlahapun, Ponthip; Fukui, Hiromichi
2016-11-01
Water-related disasters may not be solved with a single scientific method. Based on this premise, we combined logical design, sequential linking of results among models, and database applications in an attempt to analyse historical and future flooding scenarios. The three main models used in this study are (1) the fifth phase of the Coupled Model Intercomparison Project (CMIP5), to derive precipitation; (2) the Integrated Flood Analysis System (IFAS), to extract the amount of discharge; and (3) the Hydrologic Engineering Center (HEC) model, to generate inundated areas. This research notably focused on integrating data regardless of system-design complexity; database approaches are flexible, manageable, and well-supported for system data transfer, which makes them suitable for monitoring a flood. The resulting flood map, together with real-time stream data, can help local communities identify areas at risk of flooding in advance.
Grounding Robot Autonomy in Emotion and Self-awareness
NASA Astrophysics Data System (ADS)
Sanz, Ricardo; Hernández, Carlos; Hernando, Adolfo; Gómez, Jaime; Bermejo, Julita
Much is being done in an attempt to transfer emotional mechanisms from reverse-engineered biology into social robots. There are two basic approaches: the imitative display of emotion, e.g. to make robots more human-like, and the provision of architectures with intrinsic emotion, in the hope of enhancing behavioral aspects. This paper focuses on the second approach, describing a core vision regarding the integration of cognitive, emotional and autonomic aspects in social robot systems. This vision has evolved as a result of efforts to consolidate the models extracted from rat emotion research and their implementation in technical use cases, based on a general systemic analysis in the framework of the ICEA and C3 projects. The generality of the approach is intended to yield universal theories of integrated autonomic, emotional and cognitive behavior. The proposed conceptualizations and architectural principles are then captured in a theoretical framework: ASys, the Autonomous Systems Framework.
[Extraction and recognition of attractors in three-dimensional Lorenz plot].
Hu, Min; Jang, Chengfan; Wang, Suxia
2018-02-01
The Lorenz plot (LP) method, which gives a global view of long-term electrocardiogram signals, is an efficient and simple visualization tool for analyzing cardiac arrhythmias, and the morphologies and positions of the extracted attractors may reveal the underlying mechanisms of the onset and termination of arrhythmias. However, automatic diagnosis has so far been impossible because no method for extracting the attractors has been available. We present here a methodology for attractor extraction and recognition based upon the homogeneous statistical properties of the location parameters of scatter points in the three-dimensional LP (3DLP), which is constructed from three successive RR intervals as the X, Y and Z axes in a Cartesian coordinate system. Validation experiments were performed on a group of RR-interval time series and tag data with frequent unifocal premature complexes exported from a 24-hour Holter system. The results showed that this method is highly effective not only for the extraction of attractors, but also for their automatic recognition by location parameters such as the azimuth of the points' peak frequency (APF) of eccentric attractors after stereographic projection of the 3DLP along the space diagonal. Moreover, the APF is a powerful index for the differential diagnosis of atrial and ventricular extrasystoles. Additional experiments proved that this method is also applicable to several other arrhythmias. Furthermore, there are extremely close relationships between the 3DLP and two-dimensional LPs, which indicates that conventional achievements with LPs could be transplanted into the 3DLP. Integrating this method into conventional long-term electrocardiogram monitoring and analysis systems would have broad application prospects.
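A minimal sketch of how a 3DLP is constructed from an RR-interval series, with a simplified azimuth computation after projecting along the space diagonal; the synthetic beat values and the exact APF definition here are assumptions for illustration:

```python
import numpy as np

def lorenz_plot_3d(rr_intervals):
    """Build 3DLP points (RR_n, RR_n+1, RR_n+2) from an RR-interval series."""
    rr = np.asarray(rr_intervals, dtype=float)
    return np.column_stack([rr[:-2], rr[1:-1], rr[2:]])

# Synthetic RR series (ms) with one premature beat for illustration
rr = np.full(200, 800.0)
rr[100], rr[101] = 520.0, 1080.0   # short coupling interval + compensatory pause

pts = lorenz_plot_3d(rr)
# Azimuth of each point in the plane orthogonal to the space diagonal (1,1,1),
# a simplified stand-in for the APF index described above.
u = pts @ np.array([1.0, -1.0, 0.0]) / np.sqrt(2)
v = pts @ np.array([1.0, 1.0, -2.0]) / np.sqrt(6)
azimuth = np.degrees(np.arctan2(v, u))
print(pts.shape, azimuth[[98, 99, 100]])
```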
Umakoshi, H; Yano, K; Kuboi, R; Komasawa, I
1996-01-01
The extractive cultivation of recombinant Escherichia coli cells to produce, release, and separate heat shock proteins (HSPs; GroEL and GroES) using poly(ethylene glycol) (PEG)/dextran (Dex) aqueous two-phase systems was developed. The growth rate of E. coli OW10/pND5 cells in the PEG/Dex two-phase media was almost the same as that in the control media. The addition of 0.1 M potassium phosphate salts (KPi) increased the productivity of HSPs while keeping the growth rate of E. coli cells relatively high. The partition coefficients of the HSPs improved when phosphate salts were added at concentrations of more than 0.1 M. As a result, PEG/Dex systems supplemented with 0.1 M KPi were found to be the optimal two-phase systems for the extractive cultivation of E. coli cells. In these systems, the HSPs were selectively partitioned to the top phase, while cells occupied the bottom phase and the interface between the two phases. This integrated process was extended to a semicontinuous operating mode, in which the top phase containing the HSPs was recovered following intermittent heating and ultrasonic irradiation. The bottom phase containing cells and cell debris was recycled together with new top-phase solution to repeat production and recovery of HSPs.
Flow-based analysis using microfluidics-chemiluminescence systems.
Al Lawati, Haider A J
2013-01-01
This review discusses various approaches and techniques in which analysis using microfluidics-chemiluminescence (MF-CL) systems has been reported. A variety of applications is examined, including environmental, pharmaceutical, biological, food and herbal analysis. Reported uses of CL reagents, sample introduction techniques, sample pretreatment methods, CL signal enhancement and detection systems are discussed. Hydrodynamic pumping is predominantly used for these applications; however, several reports are available in which electro-osmotic (EO) pumping has been implemented. Various sample pretreatment methods have been used, including liquid-liquid extraction, solid-phase extraction and molecularly imprinted polymers. A wide range of innovative techniques has been reported for CL signal enhancement. Most of these techniques are based on enhancement of the mixing process in the microfluidic channels, which leads to enhancement of the CL signal. However, other techniques are also reported, such as the mirror reaction, liquid-core waveguides, on-line pre-derivatization and the use of an opaque white chip with a thin transparent seal. Photodetectors are the most commonly used detectors; however, other detection systems have also been used, including integrated electrochemiluminescence (ECL) and organic photodiodes (OPDs). Copyright © 2012 John Wiley & Sons, Ltd.
NASA Technical Reports Server (NTRS)
Powell, Bradley W.; Burroughs, Ivan A.
1994-01-01
Through the two phases of this contract, sensors for welding applications and parameter extraction algorithms have been developed. These sensors form the foundation of a weld control system that can provide active weld control through monitoring of the weld pool and keyhole in a VPPA welding process. Systems of this type offer the potential of quality enhancement and cost reduction (minimization of rework on faulty welds) for high-integrity welding applications. Sensors for preweld and postweld inspection, weld pool monitoring, keyhole/weld wire entry monitoring, and seam tracking were developed. Algorithms for signal extraction were also developed and analyzed to determine their application to an adaptive weld control system. The following sections discuss findings for each of the three sensors developed under this contract: (1) the weld profiling sensor; (2) the weld pool sensor; and (3) the stereo seam tracker/keyhole imaging sensor. Hardened versions of these sensors were designed and built under this contract. A control system, described later, was developed on a multiprocessing/multitasking operating system for maximum power and flexibility. Documentation of the sensors' mechanical and electrical design is included as appendices to this report.
Constructing local integrals of motion in the many-body localized phase
NASA Astrophysics Data System (ADS)
Chandran, Anushya; Kim, Isaac H.; Vidal, Guifre; Abanin, Dmitry A.
2015-02-01
Many-body localization provides a generic mechanism of ergodicity breaking in quantum systems. In contrast to conventional ergodic systems, many-body-localized (MBL) systems are characterized by extensively many local integrals of motion (LIOM), which underlie the absence of transport and thermalization in these systems. Here we report a physically motivated construction of local integrals of motion in the MBL phase. We show that any local operator (e.g., a local particle number or a spin-flip operator), evolved with the system's Hamiltonian and averaged over time, becomes a LIOM in the MBL phase. Such operators have a clear physical meaning, describing the response of the MBL system to a local perturbation. In particular, when a local operator represents a density of some globally conserved quantity, the corresponding LIOM describes how this conserved quantity propagates through the MBL phase. Being uniquely defined and experimentally measurable, these LIOMs provide a natural tool for characterizing the properties of the MBL phase, in both experiments and numerical simulations. We demonstrate the latter by numerically constructing an extensive set of LIOMs in the MBL phase of a disordered spin-chain model. We show that the resulting LIOMs are quasilocal and use their decay to extract the localization length and establish the location of the transition between the MBL and ergodic phases.
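A minimal numerical sketch of the construction described above: the infinite-time average of a local operator equals its dephased form in the Hamiltonian eigenbasis. The toy disordered spin chain below is an assumption for illustration, not the paper's model:

```python
import numpy as np

# Pauli matrices
sx = np.array([[0, 1], [1, 0]], dtype=float)
sz = np.array([[1, 0], [0, -1]], dtype=float)

def op_at(site, op, L):
    """Embed a single-site operator at `site` in an L-site spin chain."""
    mats = [np.eye(2)] * L
    mats[site] = op
    out = mats[0]
    for m in mats[1:]:
        out = np.kron(out, m)
    return out

L = 6
rng = np.random.default_rng(1)
h = rng.uniform(-5, 5, size=L)   # strong random fields -> localization-like regime
H = sum(op_at(i, sx, L) @ op_at(i + 1, sx, L) for i in range(L - 1))
H = H + sum(h[i] * op_at(i, sz, L) for i in range(L))

E, V = np.linalg.eigh(H)
O = op_at(L // 2, sz, L)                 # local spin operator in the middle
O_eig = V.T @ O @ V
# Time average = dephasing: keep only the diagonal in the eigenbasis
O_bar = V @ np.diag(np.diag(O_eig)) @ V.T

print("||O_bar|| / ||O|| =", np.linalg.norm(O_bar) / np.linalg.norm(O))
```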
NASA Astrophysics Data System (ADS)
Fajer, V.; Rodríguez, C.; Naranjo, S.; Mesa, G.; Mora, W.; Arista, E.; Cepero, T.; Fernández, H.
2006-02-01
The combination of molecular exclusion chromatography and laser polarimetric detection has produced a carbohydrate separation and quantification system for plant fluids of industrial value, making possible the evaluation of the quality of sugarcane juices, agave juices and many other plant extracts. Previous papers described a system in which liquid chromatography separation and polarimetric detection, using a LASERPOL 101M polarimeter with a He-Ne light source, allowed the collection and quantification of discrete samples for analytical purposes. In this paper, the authors introduce a new, improved system that performs polarimetric measurements in continuous flow. Chromatograms of several standard carbohydrate solutions were obtained as useful references to study the juice quality of several sugarcane varieties under different physiological conditions. Results from the discrete and continuous-flow systems were compared in order to validate the new system. An application of the system to the diagnosis of leaf scald is described. A computer program that displays the chromatograms on line, with digital storage, maxima detection, zone integration, and other capabilities, makes this system very competitive.
Optimizing graph-based patterns to extract biomedical events from the literature
2015-01-01
In BioNLP-ST 2013, we participated in the BioNLP 2013 shared tasks on event extraction. Our extraction method is based on the search for an approximate subgraph isomorphism between key context dependencies of events and graphs of input sentences. Our system was able to address both the GENIA (GE) task, focusing on 13 molecular-biology-related event types, and the Cancer Genetics (CG) task, targeting a challenging group of 40 cancer-biology-related event types with varying arguments concerning 18 kinds of biological entities. In addition to adapting our system to the two tasks, we also attempted to integrate semantics into the graph-matching scheme using a distributional similarity model to capture more events, and evaluated the impact on event extraction of using paths of all possible lengths as key context dependencies, beyond using only the shortest paths. We achieved a 46.38% F-score in the CG task (ranking 3rd) and a 48.93% F-score in the GE task (ranking 4th). After BioNLP-ST 2013, we explored three ways to further extend our event extraction system in our previously published work: (1) we allow non-essential nodes to be skipped, and incorporated a node-skipping penalty into the subgraph distance function of our approximate subgraph matching algorithm; (2) instead of assigning a unified subgraph distance threshold to all patterns of an event type, we learned a customized threshold for each pattern; (3) we implemented the well-known Empirical Risk Minimization (ERM) principle to optimize the event pattern set by balancing prediction errors on training data against regularization. When evaluated on the official GE task test data, these extensions helped to improve extraction precision from 62% to 65%. However, the overall F-score stayed equivalent to the previous performance due to a 1% drop in recall. PMID:26551594
Miniature wireless recording and stimulation system for rodent behavioural testing
NASA Astrophysics Data System (ADS)
Pinnell, R. C.; Dempster, J.; Pratt, J.
2015-12-01
Objective. Elucidation of the neural activity underpinning rodent behaviour has traditionally been hampered by the use of tethered systems and human involvement. Furthermore, the combination of deep-brain stimulation (DBS) and various neural recording modalities can lead to complex and time-consuming laboratory setups. For studies of this type, novel tools are required to drive the research forward. Approach. A miniature wireless system weighing 8.5 g (including battery) was developed for rodent use that combines multichannel DBS and local field potential (LFP) recording. Its performance was verified in a working memory task that involved 4-channel fronto-hippocampal LFP recording and bilateral constant-current fimbria-fornix DBS. The system was synchronised with video tracking for extraction of LFP at discrete task phases, and DBS was activated intermittently at discrete phases of the task. Main results. In addition to having a fast set-up time, the system could reliably transmit continuous LFP for over 8 hours across 3-5 m distances. During the working memory task, LFP pertaining to discrete task phases was extracted and compared with well-known neural correlates of active exploratory behaviour in rodents. DBS could be wirelessly activated/deactivated at any point in the experiment during EEG recording and transmission, allowing seamless integration of this modality. Significance. The wireless system combines a small size with a level of robustness and versatility that can greatly simplify rodent behavioural experiments involving EEG recording and DBS. Designed for versatility and simplicity, the small size and low cost of the system and its receiver allow enhanced portability and fast experimental setup times, and pave the way for integration with more complex behaviour.
NASA Astrophysics Data System (ADS)
Fallatah, O.; Ahmed, M.; Akanda, A. S.; Boving, T.; Cardace, D.
2017-12-01
The Saq aquifer system represents one of the most significant transboundary aquifers in the Arabian Peninsula, extending between the northern parts of Saudi Arabia, Iraq and Jordan. Recent studies show that the Saq aquifer system is witnessing rapid groundwater depletion of -6.52 ± 0.29 mm/year (-3.49 ± 0.15 km3/year), highly correlated with increasing groundwater extraction for irrigation and with observed water-level declines in regional supply wells. In addition, the region has received record low amounts of precipitation in recent years. Thus, quantifying the groundwater recharge rate of the Saq is essential to sustainable present and future utilization of the groundwater resources in that system. In this study, we develop and apply an integrated geophysical, geochemical, and remote-sensing-based approach to quantify the recharge rates of the Saq aquifer system, given the areal distribution of the Saq transboundary aquifer system, the interaction between the Saq aquifer and the overlying aquifers, and the very limited rates of recharge through precipitation. Specifically, we set out to accomplish the following: (1) delineate and examine the areal extent of the Saq aquifer recharge domains using geologic, climatic, and remote sensing data; (2) investigate the origin of, and recent contributions to, the groundwater in the Saq aquifer system by examining the isotopic compositions of groundwater samples collected from the Saq aquifer; and (3) estimate, to first order, the magnitude of modern recharge utilizing Gravity Recovery and Climate Experiment (GRACE) data and rainfall time series of the region. Results from this work will help identify suitable drilling locations and determine the best extraction scenarios.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sun, Xiaolei; Rink, Nancy T
2011-04-29
This report presents an integrated energy system that combines the production of substitute natural gas through coal hydrogasification with an algae process for beneficial carbon dioxide (CO2) use and biofuel production (funded under Department of Energy (DOE) contract DE-FE0001099). The project planned to develop, test, operate and evaluate a 2 ton-per-day coal hydrogasification plant and 25-acre algae farm at the Arizona Public Service (APS) 1000 Megawatt (MW) Cholla coal-fired power plant in Joseph City, Arizona. Conceptual design of the integrated system was undertaken with APS partners Air Liquide (AL) and Parsons. The process engineering was separated into five major areas: flue gas preparation and CO2 delivery, algae farming, water management, hydrogasification, and biofuel production. The process flow diagrams, energy and material balances, and preliminary major equipment needs for each major area were prepared to reflect integrated process considerations and the site infrastructure design basis. The total project also included research and development on a bench-scale hydrogasifier, one-dimensional (1-D) kinetic-model simulation, extensive algae stressing, oil extraction, lipid analysis and a half-acre algae farm demonstration at APS's Redhawk testing facility. During the project, a two-acre algae testing facility with a half-acre algae cultivation area was built at the APS Redhawk 1000 MW natural gas combined cycle power plant located 55 miles west of Phoenix. The test site integrated flue gas delivery, CO2 capture and distribution, algae cultivation, an algae nursery, algae harvesting, dewatering and onsite storage, as well as water treatment. The site environmental, engineering, and biological parameters for the cultivators were monitored remotely. Direct biodiesel production from biomass through an acid-catalyzed transesterification reaction and a supercritical methanol transesterification reaction was evaluated. The highest oil-to-biodiesel conversion of 79.9% was achieved with a stressed algae sample containing 40% algae oil. The effort concluded that producing biodiesel directly from the algae biomass could be an efficient, cost-effective and readily scalable way to produce biodiesel by eliminating the oil extraction process.
Goldman, Mindy; Núria, Núria; Castilho, Lilian M
2015-01-01
Automated testing platforms facilitate the introduction of red cell genotyping of patients and blood donors. Fluidic microarray systems, such as Luminex XMAP (Austin, TX), are used in many clinical applications, including HLA and HPA typing. The Progenika ID CORE XT (Progenika Biopharma-Grifols, Bizkaia, Spain) uses this platform to analyze 29 polymorphisms determining 37 antigens in 10 blood group systems. Once DNA has been extracted, processing time is approximately 4 hours. The system is highly automated and includes integrated analysis software that produces a file and a report with genotype and predicted phenotype results.
Danuso, Francesco
2017-12-22
A major bottleneck in improving the governance of complex systems lies in our ability to integrate different forms of knowledge into a decision support system (DSS). Preliminary aspects are the classification of different types of knowledge (a priori or general, a posteriori or specific, with uncertainty, numerical, textual, algorithmic, complete/incomplete, etc.), the definition of ontologies for knowledge management, and the availability of proper tools such as continuous simulation models, event-driven models, statistical approaches, computational methods (neural networks, evolutionary optimization, rule-based systems, etc.) and procedures for textual documentation. Following these views, a computer language for knowledge integration (SEMoLa, Simple, Easy Modelling Language) has been developed at the University of Udine. SEMoLa can handle models, data, metadata and textual knowledge; it implements and extends the system dynamics ontology (Forrester, 1968; Jørgensen, 1994), in which systems are modelled by the concepts of material, group, state, rate, parameter, internal and external events, and driving variables. As an example, a SEMoLa model to improve the management and sustainability (economic, energetic, environmental) of agricultural farms is presented. The model (X-Farm) simulates a farm in which cereal and forage yield, oil seeds, milk, calves and wastes can be sold or reused. X-Farm is composed of integrated modules describing fields (crop and soil), feed and material storage, machinery management, manpower management, animal husbandry, economic and energetic balances, seed-oil extraction, manure and waste management, and biogas production from animal wastes and biomasses.
Recent advances in enzyme extraction strategies: A comprehensive review.
Nadar, Shamraja S; Pawar, Rohini G; Rathod, Virendra K
2017-08-01
The increasing interest in industrial enzymes demands the development of new downstream strategies for maximizing enzyme recovery. Significant efforts have been focused on the development of newly adapted technologies to purify enzymes in catalytically active form. Recently, aqueous two-phase systems (ATPS) have emerged as powerful tools for efficient extraction and purification of enzymes due to their versatility, lower cost, process integration capability and easy scale-up. The present review gives an overview of the effects of parameters such as tie-line length, pH, neutral salts, and the properties of the polymer and salt involved in traditional polymer/polymer and polymer/salt ATPS for enzyme recovery. Further, advanced ATPS based on alcohols, surfactants and micellar compounds have been developed to avoid tedious recovery steps in obtaining the desired enzyme. In order to improve the selectivity and efficiency of ATPS, recent approaches combining conventional ATPS with different techniques, such as affinity ligands, ionic liquids, thermoseparating polymers and microfluidic-device-based ATPS, are reviewed. Moreover, three-phase partitioning for enzyme enrichment is also highlighted as a blooming, efficiently integrated bioseparation technique. At the end, the review includes an overview of CLEA technology and organic-inorganic nanoflower preparation as novel strategies for simultaneous extraction, purification and immobilization of enzymes. Copyright © 2017 Elsevier B.V. All rights reserved.
Xu, Haiyang; Wang, Ping
2016-01-01
In order to verify the real-time reliability of an unmanned aerial vehicle (UAV) flight control system and comply with the airworthiness certification standard, we propose a model-based integration framework for the modeling and verification of time properties. Drawing on the advantages of MARTE, this framework uses class diagrams to create the static model of the software system, and state charts to create the dynamic model. Using the defined transformation rules, the MARTE model can be transformed into a formal integrated model, and the different parts of the model can be verified by existing formal tools. For the real-time specifications of the software system, we also propose an algorithm for generating temporal logic formulas, which automatically extracts real-time properties from time-sensitive live sequence charts (TLSC). Finally, we modeled the simplified flight control system of a UAV to check its real-time properties. The results show that the framework can be used to create the system model, as well as to precisely analyze and verify the real-time reliability of the UAV flight control system. PMID:27918594
Social network extraction based on Web: 3. the integrated superficial method
NASA Astrophysics Data System (ADS)
Nasution, M. K. M.; Sitompul, O. S.; Noah, S. A.
2018-03-01
The Web as a source of information has become part of social behavior information. Although it involves only the limited information disclosed by search engines, in the form of hit counts, snippets, and URL addresses of web pages, the integrated extraction method produces a social network that is not only trusted but also enriched. Unintegrated extraction methods may produce social networks without explanation, resulting in poor supplemental information, or social networks laden with surmise and consequently unrepresentative social structures. The integrated superficial method, in addition to generating the core social network, also generates an expanded network so as to reach the scope of relation clues, with a number of edges computationally approaching n(n - 1)/2 for n social actors.
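As a hedged stand-in for the hit-count step of such superficial methods, the following computes a Normalized Google Distance style relatedness between two actors from search-engine counts; the formula choice and all counts are assumptions, since the paper's exact measure is not reproduced here:

```python
import math

def ngd(fx, fy, fxy, n_index):
    """Normalized Google Distance from search-engine hit counts; smaller means
    more related. fx, fy: hit counts for each actor name; fxy: co-occurrence
    query count; n_index: (assumed) total number of indexed pages."""
    lx, ly, lxy = math.log(fx), math.log(fy), math.log(fxy)
    return (max(lx, ly) - lxy) / (math.log(n_index) - min(lx, ly))

# Hypothetical hit counts for two actor names and their joint query
print(ngd(fx=120_000, fy=80_000, fxy=5_000, n_index=50_000_000_000))
```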
NASA Astrophysics Data System (ADS)
Zhou, Y.; Zhao, H.; Hao, H.; Wang, C.
2018-05-01
Accurate remote sensing water extraction is one of the primary tasks of watershed ecological environment studies. The Yanhe water system has the typical characteristics of a small water volume and narrow river channels, which make conventional water extraction methods such as the Normalized Difference Water Index (NDWI) difficult to apply. A new Multi-Spectral Threshold segmentation of the NDWI (MST-NDWI) water extraction method is proposed to achieve accurate water extraction in the Yanhe watershed. In the MST-NDWI method, the spectral characteristics of water bodies and typical backgrounds in the Landsat/TM images of the Yanhe watershed were evaluated. Multi-spectral thresholds (TM1, TM4, TM5) based on maximum likelihood were utilized before NDWI water extraction to realize a segmentation that separates out built-up lands and small linear rivers. With the proposed method, a water map was extracted from 2010 Landsat/TM images of the watershed in China. An accuracy assessment was conducted to compare the proposed method with conventional water indexes such as NDWI, the Modified NDWI (MNDWI), the Enhanced Water Index (EWI), and the Automated Water Extraction Index (AWEI). The results show that the MST-NDWI method achieves better water extraction accuracy in the Yanhe watershed and can effectively diminish confusing background objects compared to the conventional water indexes. The MST-NDWI method integrates NDWI and multi-spectral threshold segmentation algorithms, yielding richer valuable information and remarkable results for accurate water extraction in the Yanhe watershed.
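A minimal sketch of the two-stage MST-NDWI idea, assuming synthetic band arrays and placeholder thresholds (the maximum-likelihood thresholds derived in the study are not given here):

```python
import numpy as np

# Synthetic stand-ins for Landsat/TM reflectance bands
tm1 = np.random.rand(100, 100)   # blue
tm2 = np.random.rand(100, 100)   # green
tm4 = np.random.rand(100, 100)   # near infrared
tm5 = np.random.rand(100, 100)   # shortwave infrared

# Stage 1: multi-spectral threshold pre-segmentation screening out built-up land
candidate = (tm1 > 0.1) & (tm4 < 0.3) & (tm5 < 0.2)   # placeholder thresholds

# Stage 2: NDWI on the remaining candidate pixels
ndwi = (tm2 - tm4) / (tm2 + tm4 + 1e-6)
water = candidate & (ndwi > 0.0)

print("water pixels:", int(water.sum()))
```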
NASA Astrophysics Data System (ADS)
Mochalskyy, Serhiy; Fantz, Ursel; Wünderlich, Dirk; Minea, Tiberiu
2016-10-01
The development of negative ion (NI) sources for the ITER neutral beam injector is strongly accompanied by modelling activities. The ONIX (Orsay Negative Ion eXtraction) code simulates the formation and extraction of negative hydrogen ions and co-extracted electrons produced in caesiated sources. In this paper, the 3D geometry of the BATMAN extraction system and the source characteristics, such as the extraction and bias potentials and the 3D magnetic field, were integrated into the model. Calculations were performed using plasma parameters obtained experimentally on BATMAN. The comparison of the ONIX-calculated extracted NI density with the experimental results suggests that predictive calculations of NI extraction are possible. The results show that for an ideal state of Cs conditioning, the extracted hydrogen NI current density could reach ~30 mA cm-2 at 10 kV and ~20 mA cm-2 at 5 kV extraction potential, with an electron/NI current density ratio of about 1, as measured in the experiments under the same plasma and source conditions. The dependency of the extracted NI current on the NI density in the bulk plasma region was investigated in both the modelling and the experiment. The separate distributions composing the NI beam, originating from the plasma bulk region and from the plasma grid (PG) surface, are presented for different NI plasma volume densities and NI emission rates from the PG wall, respectively. The extracted current from NIs produced at the Cs-covered PG surface, which initially move towards the bulk plasma and are then bent back towards the extraction surfaces, is lower than the current from directly extracted surface-produced ions.
The Unified Medical Language System (UMLS): integrating biomedical terminology
Bodenreider, Olivier
2004-01-01
The Unified Medical Language System (http://umlsks.nlm.nih.gov) is a repository of biomedical vocabularies developed by the US National Library of Medicine. The UMLS integrates over 2 million names for some 900 000 concepts from more than 60 families of biomedical vocabularies, as well as 12 million relations among these concepts. Vocabularies integrated in the UMLS Metathesaurus include the NCBI taxonomy, Gene Ontology, the Medical Subject Headings (MeSH), OMIM and the Digital Anatomist Symbolic Knowledge Base. UMLS concepts are not only inter-related, but may also be linked to external resources such as GenBank. In addition to data, the UMLS includes tools for customizing the Metathesaurus (MetamorphoSys), for generating lexical variants of concept names (lvg) and for extracting UMLS concepts from text (MetaMap). The UMLS knowledge sources are updated quarterly. All vocabularies are available at no fee for research purposes within an institution, but UMLS users are required to sign a license agreement. The UMLS knowledge sources are distributed on CD-ROM and by FTP. PMID:14681409
Design of Integrated Database on Mobile Information System: A Study of Yogyakarta Smart City App
NASA Astrophysics Data System (ADS)
Nurnawati, E. K.; Ermawati, E.
2018-02-01
An integration database is a database that acts as the data store for multiple applications, and thus integrates data across these applications (in contrast to an application database). An integration database needs a schema that takes all its client applications into account. The benefit of this schema is that sharing data among applications does not require an extra layer of integration services on the applications. Any changes to data made in a single application are made available to all applications at the time of database commit, thus keeping the applications' data use better synchronized. This study aims to design and build an integrated database that can be used by various applications on a mobile-device-based platform for a smart city system. The database can be used by various applications, whether together or separately. The design and development of the database emphasize flexibility, security, and completeness of the attributes to be shared by the various applications to be built. The method used in this study is to choose the appropriate logical database structure (patterns of data) and to build the relational database model (database design). The resulting design was tested with prototype apps and system performance was analyzed with test data. The integrated database can be utilized by both the admin and the user in an integral and comprehensive platform. This system can help the admin, manager, and operator manage the application easily and efficiently. This Android-based app is built on a dynamic client-server architecture in which data are extracted from an external MySQL database, so if data change in the database, the data in the Android application also change. The app assists users in searching for information related to Yogyakarta (as a smart city), especially culture, government, hotels, and transportation.
Wiegers, Thomas C.; Davis, Allan Peter; Mattingly, Carolyn J.
2014-01-01
The Critical Assessment of Information Extraction systems in Biology (BioCreAtIvE) challenge evaluation tasks collectively represent a community-wide effort to evaluate a variety of text-mining and information extraction systems applied to the biological domain. The BioCreative IV Workshop included five independent subject areas, including Track 3, which focused on named-entity recognition (NER) for the Comparative Toxicogenomics Database (CTD; http://ctdbase.org). Previously, CTD had organized document ranking and NER-related tasks for the BioCreative Workshop 2012; a key finding of that effort was that interoperability and integration complexity were major impediments to the direct application of the systems to CTD's text-mining pipeline. This underscored a prevailing problem with software integration efforts. Major interoperability-related issues included lack of process modularity, operating system incompatibility, tool configuration complexity and lack of standardization of high-level inter-process communications. One approach to potentially mitigate interoperability and general integration issues is the use of Web services to abstract implementation details; rather than integrating NER tools directly, HTTP-based calls from CTD's asynchronous, batch-oriented text-mining pipeline could be made to remote NER Web services for recognition of specific biological terms using BioC (an emerging family of XML formats) for inter-process communications. To test this concept, participating groups developed Representational State Transfer /BioC-compliant Web services tailored to CTD's NER requirements. Participants were provided with a comprehensive set of training materials. CTD evaluated results obtained from the remote Web service-based URLs against a test data set of 510 manually curated scientific articles. Twelve groups participated in the challenge. Recall, precision, balanced F-scores and response times were calculated. Top balanced F-scores for gene, chemical and disease NER were 61, 74 and 51%, respectively. Response times ranged from fractions-of-a-second to over a minute per article. We present a description of the challenge and summary of results, demonstrating how curation groups can effectively use interoperable NER technologies to simplify text-mining pipeline implementation. Database URL: http://ctdbase.org/ PMID:24919658
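As a sketch of the interaction pattern evaluated in Track 3, the following posts a BioC-encoded document to a (hypothetical) participant NER web service using the requests library; the endpoint URL and the truncated BioC payload are assumptions for illustration:

```python
import requests

# Hedged sketch: POST a BioC XML document to a remote NER web service and read
# back annotated BioC. The endpoint is hypothetical, not a real challenge URL.
bioc_doc = """<?xml version='1.0' encoding='UTF-8'?>
<collection><source>CTD</source><document><id>PMID-12345</id>
<passage><offset>0</offset><text>Cisplatin caused renal failure.</text></passage>
</document></collection>"""

resp = requests.post(
    "https://example.org/ner/chemical",   # hypothetical participant service
    data=bioc_doc.encode("utf-8"),
    headers={"Content-Type": "application/xml"},
    timeout=60,
)
print(resp.status_code, resp.text[:200])
```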
Phosphorus and water recovery by a novel osmotic membrane bioreactor-reverse osmosis system.
Luo, Wenhai; Hai, Faisal I; Price, William E; Guo, Wenshan; Ngo, Hao H; Yamamoto, Kazuo; Nghiem, Long D
2016-01-01
An osmotic membrane bioreactor-reverse osmosis (OMBR-RO) hybrid system integrated with periodic microfiltration (MF) extraction was evaluated for simultaneous phosphorus and clean water recovery from raw sewage. In this hybrid system, the forward osmosis membrane effectively retained inorganic salts and phosphate in the bioreactor, while the MF membrane periodically bled them out for phosphorus recovery with pH adjustment. The RO process was used for draw solute recovery and clean water production. Results show that phosphorus recovery from the MF permeate was most effective when the solution pH was adjusted to 10, whereby the recovered precipitate contained 15-20% (wt/wt) phosphorus. Periodic MF extraction also limited salinity build-up in the bioreactor, resulting in stable biological performance and an increase in water flux during OMBR operation. Despite the build-up of organic matter and ammonia in the draw solution, OMBR-RO allowed for the recovery of high quality reused water. Crown Copyright © 2015. Published by Elsevier Ltd. All rights reserved.
A multiscale Markov random field model in wavelet domain for image segmentation
NASA Astrophysics Data System (ADS)
Dai, Peng; Cheng, Yu; Wang, Shengchun; Du, Xinyu; Wu, Dan
2017-07-01
The human vision system performs feature detection, learning and selective attention, with hierarchical organization and bidirectional connections realized by populations of neurons. In this paper, a multiscale Markov random field (MRF) model in the wavelet domain is proposed by mimicking some image processing functions of the vision system. For an input scene, our model provides sparse representations using wavelet transforms and extracts its topological organization using the MRF. In addition, the hierarchy property of the vision system is simulated using a pyramid framework in our model. There are two information flows in our model, i.e., a bottom-up procedure to extract input features and a top-down procedure to provide feedback controls. The two procedures are controlled simply by two pyramidal parameters, and some Gestalt laws are also integrated implicitly. Equipped with such biologically inspired properties, our model can be used to accomplish different image segmentation tasks, such as edge detection and region segmentation.
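For readers unfamiliar with the wavelet side of the model, here is a minimal sketch of a multiscale 2-D wavelet pyramid using PyWavelets; the input image, wavelet choice (db2) and number of levels are assumptions for illustration, not parameters from the paper.

```python
import numpy as np
import pywt

image = np.random.rand(256, 256)            # stand-in for an input scene
coeffs = pywt.wavedec2(image, "db2", level=3)

approx = coeffs[0]                          # coarsest approximation (top of pyramid)
print("coarsest approximation:", approx.shape)
for scale, (cH, cV, cD) in enumerate(coeffs[1:], start=1):
    # cH/cV/cD hold horizontal/vertical/diagonal detail at each scale
    # (coarse -> fine); an MRF would be defined over these sparse fields.
    print(f"scale {scale}: detail shape {cH.shape}")
```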
Integrated oil production and upgrading using molten alkali metal
Gordon, John Howard
2016-10-04
A method that combines the oil retorting process (or other process needed to obtain/extract heavy oil or bitumen) with the process for upgrading these materials using sodium or other alkali metals. Specifically, the shale gas or other gases that are obtained from the retorting/extraction process may be introduced into the upgrading reactor and used to upgrade the oil feedstock. Also, the solid materials obtained from the reactor may be used as a fuel source, thereby providing the heat necessary for the retorting/extraction process. Other forms of integration are also disclosed.
Martin, Daniel B; Holzman, Ted; May, Damon; Peterson, Amelia; Eastham, Ashley; Eng, Jimmy; McIntosh, Martin
2008-11-01
Multiple reaction monitoring (MRM) mass spectrometry identifies and quantifies specific peptides in a complex mixture with very high sensitivity and speed and thus has promise for the high throughput screening of clinical samples for candidate biomarkers. We have developed an interactive software platform, called MRMer, for managing highly complex MRM-MS experiments, including quantitative analyses using heavy/light isotopic peptide pairs. MRMer parses and extracts information from MS files encoded in the platform-independent mzXML data format. It extracts and infers precursor-product ion transition pairings, computes integrated ion intensities, and permits rapid visual curation for analyses exceeding 1000 precursor-product pairs. Results can be easily output for quantitative comparison of consecutive runs. Additionally, MRMer incorporates features that permit quantitative analysis of experiments including heavy and light isotopic peptide pairs. MRMer is open source and provided under the Apache 2.0 license.
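To make the heavy/light quantification concrete, a toy calculation of a light-to-heavy ratio from integrated transition intensities follows; the peptide, data layout and numbers are invented and do not reflect MRMer's internal format.

```python
# (peptide, label) -> integrated intensities of its monitored transitions
transitions = {
    ("ELVISLIVESK", "light"): [1.2e6, 9.8e5, 7.5e5],
    ("ELVISLIVESK", "heavy"): [6.1e5, 5.0e5, 3.9e5],
}

def light_heavy_ratio(peptide: str) -> float:
    # Sum the integrated ion intensities across transitions, then take the ratio
    light = sum(transitions[(peptide, "light")])
    heavy = sum(transitions[(peptide, "heavy")])
    return light / heavy

print(f"light/heavy = {light_heavy_ratio('ELVISLIVESK'):.2f}")   # ~1.95
```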
Peng, Shao-Hu; Kim, Deok-Hwan; Lee, Seok-Lyong; Lim, Myung-Kwan
2010-01-01
Texture features are among the most important feature analysis methods in computer-aided diagnosis (CAD) systems for disease diagnosis. In this paper, we propose a Uniformity Estimation Method (UEM) for local brightness and structure to detect pathological changes in chest CT images. Based on the characteristics of chest CT images, we extract texture features by proposing an extension of rotation invariant LBP (ELBP(riu4)) and the gradient orientation difference so as to represent a uniform pattern of the brightness and structure in the image. The utilization of ELBP(riu4) and the gradient orientation difference allows us to extract rotation invariant texture features in multiple directions. Beyond this, we propose to employ the integral image technique to speed up the texture feature computation of the spatial gray level dependent method (SGLDM). Copyright © 2010 Elsevier Ltd. All rights reserved.
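The integral-image technique invoked above is easy to demonstrate: after a single cumulative pass, any rectangular region sum costs four lookups. A minimal NumPy sketch, with a placeholder image, follows.

```python
import numpy as np

def integral_image(img: np.ndarray) -> np.ndarray:
    # Pad with a zero row/column so boundary queries need no special cases
    ii = np.zeros((img.shape[0] + 1, img.shape[1] + 1), dtype=np.float64)
    ii[1:, 1:] = img.cumsum(axis=0).cumsum(axis=1)
    return ii

def region_sum(ii: np.ndarray, r0: int, c0: int, r1: int, c1: int) -> float:
    """Sum of img[r0:r1, c0:c1] in O(1) via four lookups."""
    return ii[r1, c1] - ii[r0, c1] - ii[r1, c0] + ii[r0, c0]

img = np.arange(16, dtype=float).reshape(4, 4)
ii = integral_image(img)
assert region_sum(ii, 1, 1, 3, 3) == img[1:3, 1:3].sum()
```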
Event-Based Computation of Motion Flow on a Neuromorphic Analog Neural Platform
Giulioni, Massimiliano; Lagorce, Xavier; Galluppi, Francesco; Benosman, Ryad B.
2016-01-01
Estimating the speed and direction of moving objects is a crucial component of agents behaving in a dynamic world. Biological organisms perform this task by means of the neural connections originating from their retinal ganglion cells. In artificial systems the optic flow is usually extracted by comparing activity of two or more frames captured with a vision sensor. Designing artificial motion flow detectors which are as fast, robust, and efficient as the ones found in biological systems is however a challenging task. Inspired by the architecture proposed by Barlow and Levick in 1965 to explain the spiking activity of the direction-selective ganglion cells in the rabbit's retina, we introduce an architecture for robust optical flow extraction with an analog neuromorphic multi-chip system. The task is performed by a feed-forward network of analog integrate-and-fire neurons whose inputs are provided by contrast-sensitive photoreceptors. Computation is supported by the precise time of spike emission, and the extraction of the optical flow is based on time lag in the activation of nearby retinal neurons. Mimicking ganglion cells our neuromorphic detectors encode the amplitude and the direction of the apparent visual motion in their output spiking pattern. Hereby we describe the architectural aspects, discuss its latency, scalability, and robustness properties and demonstrate that a network of mismatched delicate analog elements can reliably extract the optical flow from a simple visual scene. This work shows how precise time of spike emission used as a computational basis, biological inspiration, and neuromorphic systems can be used together for solving specific tasks. PMID:26909015
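A toy rendering of the Barlow-Levick time-lag principle these detectors build on: apparent speed follows from the delay between activations of neighboring units. The pixel pitch and spike times below are invented for illustration.

```python
pixel_pitch_deg = 0.5          # angular spacing of adjacent photoreceptors (assumed)

def motion_estimate(t_a: float, t_b: float) -> tuple[float, int]:
    """Return (speed in deg/s, direction) from two spike times.

    Direction +1 means unit A fired before unit B (motion A -> B)."""
    dt = t_b - t_a
    direction = 1 if dt > 0 else -1
    speed = pixel_pitch_deg / abs(dt)
    return speed, direction

speed, direction = motion_estimate(t_a=0.010, t_b=0.014)   # seconds
print(f"{speed:.0f} deg/s, direction {direction:+d}")      # 125 deg/s, +1
```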
NASA Technical Reports Server (NTRS)
Estes, J. E.; Smith, T.; Star, J. L.
1986-01-01
Research continues to focus on improving the type, quantity, and quality of information which can be derived from remotely sensed data. The focus is on remote sensing and applications for the Earth Observing System (Eos) and Space Station, including associated polar and co-orbiting platforms. The remote sensing research activities are being expanded, integrated, and extended into the areas of global science, georeferenced information systems, machine-assisted information extraction from image data, and artificial intelligence. The accomplishments in these areas are examined.
Integration of Artificial Market Simulation and Text Mining for Market Analysis
NASA Astrophysics Data System (ADS)
Izumi, Kiyoshi; Matsui, Hiroki; Matsuo, Yutaka
We constructed a system for evaluating self-impact in a financial market using an artificial market and text-mining technology. Economic trends were first extracted from text data circulating in the real world. Then, the trends were input into the market simulation. Our simulation revealed that an intervention operation could have reduced rate fluctuation in 1995 by over 70%. Based on the simulation results, the system was able to help its user find an exchange policy that can stabilize the yen-dollar rate.
A tutorial on information retrieval: basic terms and concepts
Zhou, Wei; Smalheiser, Neil R; Yu, Clement
2006-01-01
This informal tutorial is intended for investigators and students who would like to understand the workings of information retrieval systems, including the most frequently used search engines: PubMed and Google. Having a basic knowledge of the terms and concepts of information retrieval should improve the efficiency and productivity of searches. As well, this knowledge is needed in order to follow current research efforts in biomedical information retrieval and text mining that are developing new systems not only for finding documents on a given topic, but extracting and integrating knowledge across documents. PMID:16722601
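Underlying all such search engines is the inverted index; the deliberately tiny sketch below, with invented documents and a whitespace tokenizer, shows the idea behind Boolean retrieval.

```python
from collections import defaultdict

docs = {
    1: "information retrieval systems for biomedical text",
    2: "text mining extracts knowledge across documents",
    3: "search engines rank documents for a query",
}

# term -> set of documents containing that term
index = defaultdict(set)
for doc_id, text in docs.items():
    for token in text.lower().split():
        index[token].add(doc_id)

def search_and(*terms: str) -> set:
    """Boolean AND query: documents containing every term."""
    sets = [index[t] for t in terms]
    return set.intersection(*sets) if sets else set()

print(search_and("documents", "text"))   # {2}
```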
Li, Yi; Zhu, Hong; Zhang, Huajun; Chen, Zhangran; Tian, Yun; Xu, Hong; Zheng, Tianling; Zheng, Wei
2014-08-15
Toxicity of algicidal extracts from Mangrovimonas yunxiaonensis strain LY01 on Alexandrium tamarense was measured by studying the algicidal process, nuclear damage and the transcription of related genes. Medium components were optimized to improve algicidal activity, and the characteristics of the algicidal extracts were determined. Transmission electron microscope analysis revealed that the cell structure was broken. Destruction of cell membrane integrity and degradation of nuclear structure were monitored using confocal laser scanning microscopy, and the expression of the rbcS, hsp and proliferating cell nuclear antigen (PCNA) genes was studied. Results showed that 1.0% tryptone, 0.4% glucose and 0.8% MgCl2 were the optimal nutrient sources. The algicidal extracts were heat and pH stable, non-proteinaceous and less than 1 kDa. Cell membrane and nuclear structure integrity were lost, transcription of the rbcS and PCNA genes was significantly inhibited, and hsp gene expression was up-regulated during exposure. The algicidal extracts destroyed cell membrane and nuclear structure integrity, inhibited related gene expression and eventually led to the inhibition of algal growth. These results provide a first account of the cell death process and nuclear damage induced by algicidal extracts in A. tamarense, and suggest that the algicidal extracts could potentially be used for bacterial control of harmful algal blooms (HABs) in the future. Copyright © 2014 Elsevier B.V. All rights reserved.
Zhou, Li; Friedman, Carol; Parsons, Simon; Hripcsak, George
2005-01-01
Exploring temporal information in narrative Electronic Medical Records (EMRs) is essential and challenging. We propose an architecture for an integrated approach to process temporal information in clinical narrative reports. The goal is to initiate and build a foundation that supports applications which assist healthcare practice and research by including the ability to determine the time of clinical events (e.g., past vs. present). Key components include: (1) a temporal constraint structure for temporal expressions and the development of an associated tagger; (2) a Natural Language Processing (NLP) system for encoding and extracting medical events and associating them with formalized temporal data; (3) a post-processor, with a knowledge-based subsystem to help discover implicit information, that resolves temporal expressions and deals with issues such as granularity and vagueness; and (4) a reasoning mechanism which models clinical reports as Simple Temporal Problems (STPs). PMID:16779164
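To illustrate the Simple Temporal Problem formalism named in component (4), the sketch below encodes interval constraints between three invented clinical events as a distance graph and checks consistency with Floyd-Warshall; the event names and bounds are assumptions.

```python
import math

INF = math.inf
# events: 0 = admission, 1 = fever onset, 2 = antibiotics started (hours)
n = 3
d = [[0 if i == j else INF for j in range(n)] for i in range(n)]

def constrain(i: int, j: int, lo: float, hi: float) -> None:
    """Require lo <= t_j - t_i <= hi in the STP distance graph."""
    d[i][j] = min(d[i][j], hi)
    d[j][i] = min(d[j][i], -lo)

constrain(0, 1, 2, 10)    # fever 2-10 h after admission
constrain(1, 2, 1, 4)     # antibiotics 1-4 h after fever onset

# Floyd-Warshall: a negative diagonal entry afterwards means inconsistency
for k in range(n):
    for i in range(n):
        for j in range(n):
            d[i][j] = min(d[i][j], d[i][k] + d[k][j])

consistent = all(d[i][i] >= 0 for i in range(n))
print("consistent:", consistent)   # True; d now holds the tightest bounds
```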
Basic level scene understanding: categories, attributes and structures
Xiao, Jianxiong; Hays, James; Russell, Bryan C.; Patterson, Genevieve; Ehinger, Krista A.; Torralba, Antonio; Oliva, Aude
2013-01-01
A longstanding goal of computer vision is to build a system that can automatically understand a 3D scene from a single image. This requires extracting semantic concepts and 3D information from 2D images which can depict an enormous variety of environments that comprise our visual world. This paper summarizes our recent efforts toward these goals. First, we describe the richly annotated SUN database which is a collection of annotated images spanning 908 different scene categories with object, attribute, and geometric labels for many scenes. This database allows us to systematically study the space of scenes and to establish a benchmark for scene and object recognition. We augment the categorical SUN database with 102 scene attributes for every image and explore attribute recognition. Finally, we present an integrated system to extract the 3D structure of the scene and objects depicted in an image. PMID:24009590
Wrapping SRS with CORBA: from textual data to distributed objects.
Coupaye, T
1999-04-01
Biological data come in very different shapes. Databanks are maintained and used by distinct organizations. Text is the de facto standard exchange format. The SRS system can integrate heterogeneous textual databanks but it was lacking a way to structure the extracted data. This paper presents a CORBA interface to the SRS system which manages databanks in a flat file format. SRS Object Servers are CORBA wrappers for SRS. They allow client applications (visualisation tools, data mining tools, etc.) to access and query SRS servers remotely through an Object Request Broker (ORB). They provide loader objects that contain the information extracted from the databanks by SRS. Loader objects are not hard-coded but generated in a flexible way by using loader specifications which allow SRS administrators to package data coming from distinct databanks. The prototype may be available for beta-testing. Please contact the SRS group (http://srs.ebi.ac.uk).
Logvinenko, I I; Voevoda, M I; Samadova, D T; Kulinich, V N; Kopylova, O S
2011-01-01
The authors analyzed working conditions and the health of workers in the oil-extracting industry of the Novosibirsk region. The findings show that a work safety system based on workplace certification of working conditions and on certification of occupational safety activities is the most important component of primary prevention of occupational hazards to workers' life and health.
Fully integrated lab-on-a-disc for nucleic acid analysis of food-borne pathogens.
Kim, Tae-Hyeong; Park, Juhee; Kim, Chi-Ju; Cho, Yoon-Kyoung
2014-04-15
This paper describes a micro total analysis system for molecular analysis of Salmonella, a major food-borne pathogen. We developed a centrifugal microfluidic device, which integrated the three main steps of pathogen detection (DNA extraction, isothermal recombinase polymerase amplification (RPA), and detection) onto a single disc. A single laser diode was utilized for wireless control of valve actuation, cell lysis, and noncontact heating in the isothermal amplification step, thereby yielding a compact and miniaturized system. To achieve high detection sensitivity, rare cells in large volumes of phosphate-buffered saline (PBS) and milk samples were enriched before loading onto the disc by using antibody-coated magnetic beads. The entire procedure, from DNA extraction through to detection, was completed within 30 min in a fully automated fashion. The final detection was carried out using lateral flow strips by direct visual observation; the detection limits were 10 cfu/mL and 10² cfu/mL in PBS and milk, respectively. Our device allows rapid molecular diagnostic analysis and does not require specially trained personnel or expensive equipment. Thus, we expect that it would have an array of potential applications, including in the detection of food-borne pathogens, environmental monitoring, and molecular diagnostics in resource-limited settings.
Mars Colony in situ resource utilization: An integrated architecture and economics model
NASA Astrophysics Data System (ADS)
Shishko, Robert; Fradet, René; Do, Sydney; Saydam, Serkan; Tapia-Cortez, Carlos; Dempster, Andrew G.; Coulton, Jeff
2017-09-01
This paper reports on our effort to develop an ensemble of specialized models to explore the commercial potential of mining water/ice on Mars in support of a Mars Colony. This ensemble starts with a formal systems architecting framework to describe a Mars Colony and capture its artifacts' parameters and technical attributes. The resulting database is then linked to a variety of "downstream" analytic models. In particular, we integrated an extraction process (i.e., "mining") model, a simulation of the colony's environmental control and life support infrastructure known as HabNet, and a risk-based economics model. The mining model focuses on the technologies associated with in situ resource extraction, processing, storage and handling, and delivery. This model computes the production rate as a function of the systems' technical parameters and the local Mars environment. HabNet simulates the fundamental sustainability relationships associated with establishing and maintaining the colony's population. The economics model brings together market information, investment and operating costs, along with measures of market uncertainty and Monte Carlo techniques, with the objective of determining the profitability of commercial water/ice in situ mining operations. All told, over 50 market and technical parameters can be varied in order to address "what-if" questions, including colony location.
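As a stripped-down stand-in for the risk-based economics step, the sketch below Monte Carlo-samples an uncertain water price into a net-present-value calculation; every figure (investment, production, price distribution, discount rate, horizon) is invented rather than taken from the paper's model.

```python
import random

def npv_sample(rate: float = 0.08, years: int = 20) -> float:
    capex = 10e6                              # up-front investment, $ (assumed)
    production = 1_000_000.0                  # kg of water sold per year (assumed)
    opex = 1.5e6                              # operating cost, $/year (assumed)
    price = random.lognormvariate(1.0, 0.5)   # $/kg, uncertain, fixed per scenario
    cash = production * price - opex          # constant annual cash flow
    return -capex + sum(cash / (1 + rate) ** t for t in range(1, years + 1))

random.seed(0)
samples = [npv_sample() for _ in range(10_000)]
print(f"P(profitable) ~ {sum(s > 0 for s in samples) / len(samples):.2f}")
```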
NASA Astrophysics Data System (ADS)
Wei, Min; Kan, RuiFeng; Chen, Bing; Xu, ZhenYu; Yang, ChenGuang; Chen, Xiang; Xia, HuiHui; Hu, Mai; He, Yabai; Liu, JianGuo; Fan, XueLi; Wang, Wei
2017-05-01
We report the development of an accurate calibration-free wavelength-scanned wavelength modulation spectroscopy system based on the temporal wavelength response of a current-modulated quantum cascade laser (QCL) for gas concentration detection. Accurate measurements and determination of the QCL output intensity and wavelength response to current modulation enabled calculations of the 1f-normalized 2f signal to obtain spectroscopic information with and without gas absorption in the beam path. The gas concentration was retrieved by fitting a simulation spectrum based on spectral line parameters to the background-subtracted 1f-normalized 2f signal based on measurements. In this paper, we demonstrate the performance of the developed system for CH4 detection by applying an infrared QCL (at 7.84 µm or 1275 cm-1) to probe its two infrared transition lines at 1275.042 cm-1 and 1275.387 cm-1. The experimental results indicated very good agreement between measurements and modeling, for integrated absorbance ranging from 0.0057 cm-1 to 0.11 cm-1 (or absorbance ranging from 0.029 to 0.57). The extracted integrated absorbance was highly linear (R = 0.99996) with the gas sample concentration. Deviations between the nominal sample gas concentrations and the extracted gas concentrations calculated based on HITRAN spectroscopic parameters were within 3.5%.
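The final step, relating integrated absorbance linearly to concentration, can be mirrored with an ordinary least-squares fit; the calibration points below are invented, not the paper's measurements.

```python
import numpy as np

absorbance = np.array([0.0057, 0.012, 0.025, 0.051, 0.110])  # integrated, cm-1
conc_ppm   = np.array([10.0,   21.0,  44.0,  90.0,  194.0])  # nominal (assumed)

# Least-squares line and correlation coefficient, mirroring the reported linearity
slope, intercept = np.polyfit(absorbance, conc_ppm, deg=1)
r = np.corrcoef(absorbance, conc_ppm)[0, 1]
print(f"conc = {slope:.1f} * A + {intercept:.2f}, R = {r:.5f}")

unknown_A = 0.033   # integrated absorbance of an unknown sample
print(f"retrieved concentration: {slope * unknown_A + intercept:.1f} ppm")
```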
High yield cell-free production of integral membrane proteins without refolding or detergents.
Wuu, Jessica J; Swartz, James R
2008-05-01
Integral membrane proteins act as critical cellular components and are important drug targets. However, difficulties in producing membrane proteins have hampered investigations of structure and function. In vivo production systems are often limited by cell toxicity, and previous in vitro approaches have required unnatural folding pathways using detergents or lipid solutions. To overcome these limitations, we present an improved cell-free expression system which produces high yields of integral membrane proteins without the use of detergents or refolding steps. Our cell-free reaction activates an Escherichia coli-derived cell extract for transcription and translation. Purified E. coli inner membrane vesicles supply membrane-bound components and the lipid environment required for insertion and folding. Using this system, we demonstrated successful synthesis of two complex integral membrane transporters, the tetracycline pump (TetA) and mannitol permease (MtlA), in yields of 570 ± 50 µg/mL and 130 ± 30 µg/mL of vesicle-associated protein, respectively. These yields are up to 400 times typical in vivo concentrations. Insertion and folding of these proteins are verified by sucrose flotation, protease digestion, and activity assays. Whereas TetA incorporates efficiently into vesicle membranes with over two-thirds of the synthesized protein being inserted, MtlA yields appear to be limited by insufficient concentrations of a membrane-associated chaperone.
Purchasing population health: aligning financial incentives to improve health outcomes.
Kindig, D A
1999-01-01
To review the concept of population health, including its definition, measurement, and determinants, and to suggest an approach for aligning financial incentives toward this goal. DATA SOURCE, STUDY DESIGN, DATA EXTRACTION: Literature review, policy analysis. The article presents the argument that a major reason for our slow progress toward health outcome improvement is that there is no operational definition of population health and that financial incentives are not aligned to this goal. Current attempts at process measures as indicators of quality or outcome are not adequate for the task. It is suggested that some measure of health-adjusted life expectancy be adopted for this purpose, and that integrated delivery systems and other agents responsible for nonmedical determinants be rewarded for improvement in this measure. This will require the development of an investment portfolio across the determinants of health based on relative marginal return to health, with horizontal integration strategies across sectoral boundaries. A 20-year three-phase development strategy is proposed, including components of research and acceptance, integrated health system implementation, and cross-sectoral integration. The U.S. health care system is a $1 trillion industry without a definition of its product. Until population outcome measures are developed and rewarded for, we will not solve the twenty-first century challenge of maximizing health outcome improvement for the resources available.
Abugessaisa, Imad; Saevarsdottir, Saedis; Tsipras, Giorgos; Lindblad, Staffan; Sandin, Charlotta; Nikamo, Pernilla; Ståhle, Mona; Malmström, Vivianne; Klareskog, Lars; Tegnér, Jesper
2014-01-01
Translational medicine is becoming increasingly dependent upon data generated from health care, clinical research, and molecular investigations. This increasing rate of production and diversity in data has brought about several challenges, including the need to integrate fragmented databases, enable secondary use of patient clinical data from health care in clinical research, and to create information systems that clinicians and biomedical researchers can readily use. Our case study effectively integrates requirements from the clinical and biomedical researcher perspectives in a translational medicine setting. Our three principal achievements are (a) a design of a user-friendly web-based system for management and integration of clinical and molecular databases, while adhering to proper de-identification and security measures; (b) providing a real-world test of the system functionalities using clinical cohorts; and (c) system integration with a clinical decision support system to demonstrate system interoperability. We engaged two active clinical cohorts, 747 psoriasis patients and 2001 rheumatoid arthritis patients, to demonstrate efficient query possibilities across the data sources, enable cohort stratification, extract variation in antibody patterns, study biomarker predictors of treatment response in RA patients, and to explore metabolic profiles of psoriasis patients. Finally, we demonstrated system interoperability by enabling integration with an established clinical decision support system in health care. To assure the usefulness and usability of the system, we followed two approaches. First, we created a graphical user interface supporting all user interactions. Secondly we carried out a system performance evaluation study where we measured the average response time in seconds for active users, http errors, and kilobits per second received and sent. The maximum response time was found to be 0.12 seconds; no server or client errors of any kind were detected. In conclusion, the system can readily be used by clinicians and biomedical researchers in a translational medicine setting. PMID:25203647
NASA Astrophysics Data System (ADS)
Othman, Abdullah; Sultan, Mohamed; Becker, Richard; Alsefry, Saleh; Alharbi, Talal; Gebremichael, Esayas; Alharbi, Hassan; Abdelmohsen, Karem
2018-01-01
An integrated approach [field, Interferometric Synthetic Aperture Radar (InSAR), hydrogeology, geodesy, and spatial analysis] was adopted to identify the nature, intensity, and spatial distribution of deformational features (sinkholes, fissures, differential settling) reported over fossil aquifers in arid lands, their controlling factors, and possible remedies. The Lower Mega Aquifer System (area 2 × 10⁶ km²) in central and northern Arabia was used as a test site. Findings suggest that excessive groundwater extraction from the fossil aquifer is the main cause of deformation: (1) deformational features correlated spatially and/or temporally with increased agricultural development and groundwater extraction, and with a decline in water levels and groundwater storage (-3.7 ± 0.6 km³/year); (2) earthquake events (years 1985-2016; magnitude 1-5) are largely (65% of reported earthquakes) shallow (1-5 km) and increased from 1 event/year in the early 1980s (extraction 1 km³/year), up to 13 events/year in the 1990s (average annual extraction > 6.4 km³). Results indicate that faults played a role in localizing deformation given that deformational sites and InSAR-based high subsidence rates (-4 to -15 mm/year) were largely found within, but not outside of, NW-SE-trending grabens bound by the Kahf fault system. Findings from the analysis of Gravity Recovery and Climate Experiment solutions indicate that sustainable extraction could be attained if groundwater extraction was reduced by 3.5-4 km³/year. This study provides replicable and cost-effective methodologies for optimum utilization of fossil aquifers and for minimizing deformation associated with their use.
Jang, Jae-Wook; Yun, Jaesung; Mohaisen, Aziz; Woo, Jiyoung; Kim, Huy Kang
2016-01-01
Mass-market mobile security threats have increased recently due to the growth of mobile technologies and the popularity of mobile devices. Accordingly, techniques have been introduced for identifying, classifying, and defending against mobile threats utilizing static, dynamic, on-device, and off-device techniques. Static techniques are easy to evade, while dynamic techniques are expensive; on-device techniques are prone to evasion, while off-device techniques require an always-online connection. To address some of those shortcomings, we introduce Andro-profiler, a hybrid behavior-based analysis and classification system for mobile malware. Andro-profiler's main goals are efficiency, scalability, and accuracy. To that end, Andro-profiler classifies malware by exploiting behavior profiles extracted from integrated system logs, including system calls. Andro-profiler executes a malicious application on an emulator in order to generate the integrated system logs, and creates human-readable behavior profiles by analyzing them. By comparing the behavior profile of a malicious application with a representative behavior profile for each malware family using a weighted similarity matching technique, Andro-profiler detects and classifies it into malware families. The experiment results demonstrate that Andro-profiler is scalable, performs well in detecting and classifying malware with accuracy greater than 98%, outperforms the existing state-of-the-art work, and is capable of identifying 0-day mobile malware samples.
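A toy rendering of the weighted similarity matching described above, with behavior profiles as bags of API-call counts compared against one representative profile per family; the weights, profiles and family names are invented, and the weighted-Jaccard measure is chosen for illustration rather than being Andro-profiler's exact formula.

```python
def weighted_similarity(profile: dict, representative: dict, weights: dict) -> float:
    # Weighted Jaccard over call counts: shared activity over total activity
    keys = set(profile) | set(representative)
    num = sum(weights.get(k, 1.0) * min(profile.get(k, 0), representative.get(k, 0))
              for k in keys)
    den = sum(weights.get(k, 1.0) * max(profile.get(k, 0), representative.get(k, 0))
              for k in keys)
    return num / den if den else 0.0

families = {
    "sms_trojan": {"sendTextMessage": 40, "open": 10},
    "spyware":    {"getDeviceId": 25, "connect": 30, "open": 12},
}
weights = {"sendTextMessage": 5.0, "getDeviceId": 3.0}   # sensitive APIs count more
sample = {"sendTextMessage": 35, "open": 8}

best = max(families, key=lambda f: weighted_similarity(sample, families[f], weights))
print("classified as:", best)   # sms_trojan
```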
Fusion of imaging and nonimaging data for surveillance aircraft
NASA Astrophysics Data System (ADS)
Shahbazian, Elisa; Gagnon, Langis; Duquet, Jean Remi; Macieszczak, Maciej; Valin, Pierre
1997-06-01
This paper describes a phased incremental integration approach for applying image analysis and data fusion technologies to provide automated intelligent target tracking and identification for airborne surveillance on board an Aurora Maritime Patrol Aircraft. The sensor suite of the Aurora consists of a radar, an identification friend or foe (IFF) system, an electronic support measures (ESM) system, a spotlight synthetic aperture radar (SSAR), a forward looking infra-red (FLIR) sensor and a Link-11 tactical datalink system. Lockheed Martin Canada (LMCan) is developing a testbed, which will be used to analyze and evaluate approaches for combining the data provided by the existing sensors, which were initially not designed to feed a fusion system. Three concurrent proof-of-concept research activities feed techniques, algorithms and methodology into three sequential phases of integration of this testbed. These activities are: (1) analysis of the fusion architecture (track/contact/hybrid) most appropriate for the type of data available, (2) extraction and fusion of simple features from the imaging data into the fusion system performing automatic target identification, and (3) development of a unique software architecture which will permit integration and independent evolution, enhancement and optimization of various decision aid capabilities, such as multi-sensor data fusion (MSDF), situation and threat assessment (STA) and resource management (RM).
ERIC Educational Resources Information Center
Huang, Jian
2010-01-01
With the increasing wealth of information on the Web, information integration is ubiquitous as the same real-world entity may appear in a variety of forms extracted from different sources. This dissertation proposes supervised and unsupervised algorithms that are naturally integrated in a scalable framework to solve the entity resolution problem,…
Ceballos, Melisa Rodas; García-Tenorio, Rafael; Estela, José Manuel; Cerdà, Víctor; Ferrer, Laura
2017-12-01
Leached fractions of U and Th from different environmental solid matrices were evaluated by an automatic system enabling the on-line lixiviation and extraction/pre-concentration of these two elements prior to ICP-MS detection. UTEVA resin was used as the selective extraction material. Ten leached fractions, using artificial rainwater (pH 5.4) as the leaching agent, and a residual fraction were analyzed for each sample, allowing the study of the behavior of U and Th under dynamic lixiviation conditions. Multivariate techniques have been employed for the efficient optimization of the independent variables that affect the lixiviation process. The system reached LODs of 0.1 and 0.7 ng kg⁻¹ of U and Th, respectively. The method was satisfactorily validated for three solid matrices, by the analysis of a soil reference material (IAEA-375), a certified sediment reference material (BCR-320R) and a phosphogypsum reference material (MatControl CSN-CIEMAT 2008). Besides, environmental samples were analyzed, showing a similar behavior, i.e. the content of radionuclides decreases with the successive extractions. In all cases, the accumulative leached fractions of U and Th for the different solid matrices studied (soil, sediment and phosphogypsum) were extremely low, up to 0.05% and 0.005% for U and Th, respectively. However, a great variability was observed in terms of mass concentration released, e.g. between 44 and 13,967 ng U kg⁻¹. Copyright © 2017 Elsevier B.V. All rights reserved.
Systems biology of meridians, acupoints, and chinese herbs in disease.
Lin, Li-Ling; Wang, Ya-Hui; Lai, Chi-Yu; Chau, Chan-Lao; Su, Guan-Chin; Yang, Chun-Yi; Lou, Shu-Ying; Chen, Szu-Kai; Hsu, Kuan-Hao; Lai, Yen-Ling; Wu, Wei-Ming; Huang, Jian-Long; Liao, Chih-Hsin; Juan, Hsueh-Fen
2012-01-01
Meridians, acupoints, and Chinese herbs are important components of traditional Chinese medicine (TCM). They have been used for disease treatment and prevention and as alternative and complementary therapies. Systems biology integrates omics data, such as transcriptional, proteomic, and metabolomics data, in order to obtain a more global and complete picture of biological activity. To further understand the existence and functions of the three components above, we reviewed relevant research in the systems biology literature and found many recent studies that indicate the value of acupuncture and Chinese herbs. Acupuncture is useful in pain moderation and relieves various symptoms arising from acute spinal cord injury and acute ischemic stroke. Moreover, Chinese herbal extracts have been linked to wound repair, the alleviation of postmenopausal osteoporosis severity, and anti-tumor effects, among others. Different acupoints, variations in treatment duration, and herbal extracts can be used to alleviate various symptoms and conditions and to regulate biological pathways by altering gene and protein expression. Our paper demonstrates how systems biology has helped to establish a platform for investigating the efficacy of TCM in treating different diseases and improving treatment strategies.
Aniba, Mohamed Radhouene; Siguenza, Sophie; Friedrich, Anne; Plewniak, Frédéric; Poch, Olivier; Marchler-Bauer, Aron; Thompson, Julie Dawn
2009-01-01
The traditional approach to bioinformatics analyses relies on independent task-specific services and applications, using different input and output formats, often idiosyncratic, and frequently not designed to inter-operate. In general, such analyses were performed by experts who manually verified the results obtained at each step in the process. Today, the amount of bioinformatics information continuously being produced means that handling the various applications used to study this information presents a major data management and analysis challenge to researchers. It is now impossible to manually analyse all this information and new approaches are needed that are capable of processing the large-scale heterogeneous data in order to extract the pertinent information. We review the recent use of integrated expert systems aimed at providing more efficient knowledge extraction for bioinformatics research. A general methodology for building knowledge-based expert systems is described, focusing on the unstructured information management architecture, UIMA, which provides facilities for both data and process management. A case study involving a multiple alignment expert system prototype called AlexSys is also presented.
NASA Astrophysics Data System (ADS)
Bonifazi, Giuseppe; Serranti, Silvia
2007-09-01
Mining activities, especially those operated in open air (open pit), have a deep impact on their surroundings. Such an impact, and the related problems, are directly related to the correct operation of the activities, and usually strongly interact with the environment. The impact is mainly related to the following issues: i) high volumes of handled material, ii) generation of dust, noise and vibrations, iii) water pollution, iv) visual impact and, finally, v) mining area recovery at the end of exploitation activities. All these aspects are very important and must be properly evaluated and monitored. Environmental impact control is usually carried out during and after the end of mining activities, adopting methods based on the detection, collection and analysis of specific environmental indicators and on their comparison with reference threshold values stated by official regulations. The aim of the study was to investigate, and critically evaluate, the problems related to the development of an integrated set of procedures based on the collection and analysis of remotely sensed data in order to evaluate the effect of rehabilitation of land contaminated by extractive industry activities. Starting from the results of these analyses, monitoring and registration of the environmental impact of such operations was performed by applying and integrating modern information technologies, namely Earth Observation (EO) and Geographic Information Systems (GIS). The study was developed with reference to different dismissed mine sites in India, Thailand and China. The results of the study have been utilized as input for the construction of a knowledge-based decision support system intended to help identify appropriate rehabilitation technologies for dismissed areas previously used by extractive industry activities. The work was financially supported within the framework of the Project ASIA IT&C - CN/ASIA IT&C/006 (89870) Extract-It "Application of Information Technologies for the Sustainable Management of Extractive Industry Activities" of the European Union.
PANTHER. Pattern ANalytics To support High-performance Exploitation and Reasoning.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Czuchlewski, Kristina Rodriguez; Hart, William E.
Sandia has approached the analysis of big datasets with an integrated methodology that uses computer science, image processing, and human factors to exploit critical patterns and relationships in large datasets despite the variety and rapidity of information. The work is part of a three-year LDRD Grand Challenge called PANTHER (Pattern ANalytics To support High-performance Exploitation and Reasoning). To maximize data analysis capability, Sandia pursued scientific advances across three key technical domains: (1) geospatial-temporal feature extraction via image segmentation and classification; (2) geospatial-temporal analysis capabilities tailored to identify and process new signatures more efficiently; and (3) domain-relevant models of human perception and cognition informing the design of analytic systems. Our integrated results include advances in geographical information systems (GIS) in which we discover activity patterns in noisy, spatial-temporal datasets using geospatial-temporal semantic graphs. We employed computational geometry and machine learning to allow us to extract and predict spatial-temporal patterns and outliers from large aircraft and maritime trajectory datasets. We automatically extracted static and ephemeral features from real, noisy synthetic aperture radar imagery for ingestion into a geospatial-temporal semantic graph. We worked with analysts and investigated analytic workflows to (1) determine how experiential knowledge evolves and is deployed in high-demand, high-throughput visual search workflows, and (2) better understand visual search performance and attention. Through PANTHER, Sandia's fundamental rethinking of key aspects of geospatial data analysis permits the extraction of much richer information from large amounts of data. The project results enable analysts to examine mountains of historical and current data that would otherwise go untouched, while also gaining meaningful, measurable, and defensible insights into overlooked relationships and patterns. The capability is directly relevant to the nation's nonproliferation remote-sensing activities and has broad national security applications for military and intelligence-gathering organizations.
Enhanced fluorescence detection using liquid-liquid extraction in a microfluidic droplet system.
Chen, Yan-Yu; Chen, Zhao-Ming; Wang, Hsiang-Yu
2012-11-07
Reducing the fluorescence background in microfluidic assays is important in obtaining accurate outcomes and enhancing the quality of detections. This study demonstrates an integrated process including cell labelling, fluorescence background reduction, and biomolecule detection using liquid-liquid extraction in a microfluidic droplet system. The cellular lipids in Chlorella vulgaris and NIH/3T3 cells were labelled with a hydrophobic dye, Nile red, to investigate the performance of the proposed method. The fluorescence background of the lipid detection can be reduced by 85% and the removal efficiency increased with the volume of continuous phase surrounding a droplet. The removal rate of the fluorescence background increased as the surface-area-to-volume ratio of a droplet increased. Before Nile red was removed from the droplet, the signal-to-noise ratio was as low as 1.30 and it was difficult to distinguish cells from the background. Removing Nile red increased the signal-to-noise ratio to 22 and 34 for Chlorella vulgaris and NIH/3T3, respectively; these values are 17-fold and 10-fold those before extraction. The proposed method successfully demonstrates the enhancement of fluorescence detection of cellular lipids and has great potential in improving other fluorescence-based detections in microfluidic systems.
Zou, Xue; Kang, Meng; Li, Aiyue; Shen, Chengyin; Chu, Yannan
2016-03-15
Rapid and sensitive monitoring of benzene in water is very important to the health of people and for environmental protection. A novel and online detection method of spray inlet proton transfer reaction mass spectrometry (SI-PTR-MS) was introduced for rapid and sensitive monitoring of trace benzene in water. A spraying extraction system was coupled with the self-developed PTR-MS. The benzene was extracted from the water sample in the spraying extraction system and continuously detected with PTR-MS. The flow of carrier gas and salt concentration in water were optimized to be 50 sccm and 20% (w/v), respectively. The response time and the limit of detection of the SI-PTR-MS for detection of benzene in water were 55 s and 0.14 μg/L at 10 s integration time, respectively. The repeatability of the SI-PTR-MS was evaluated, and the relative standard deviation of five replicate determinations was 4.3%. The SI-PTR-MS system was employed for monitoring benzene in different water matrices, such as tap water, lake water, and wastewater. The results indicated that the online SI-PTR-MS can be used for rapid and sensitive monitoring of trace benzene in water.
Clinical data integration of distributed data sources using Health Level Seven (HL7) v3-RIM mapping
2011-01-01
Background: Health information exchange and health information integration have become top priorities for healthcare systems across institutions and hospitals. Most organizations and establishments implement health information exchange and integration in order to support meaningful information retrieval among their disparate healthcare systems. The challenges that prevent efficient health information integration for heterogeneous data sources are the lack of a common standard to support mapping across distributed data sources and the numerous and diverse healthcare domains. Health Level Seven (HL7) is a standards development organization; among the standards its technical committees create is the Reference Information Model (RIM), a standardized abstract representation of HL7 data across all the domains of health care. In this article, we aim to present a design and a prototype implementation of HL7 v3-RIM mapping for information integration of distributed clinical data sources. The implementation enables the user to retrieve and search information that has been integrated using HL7 v3-RIM technology from disparate health care systems. Method and results: We designed and developed a prototype implementation of an HL7 v3-RIM mapping function to integrate distributed clinical data sources, using R-MIM classes from HL7 v3-RIM as a global view along with a collaborative centralized web-based mapping tool to tackle the evolution of both global and local schemas. Our prototype was implemented and integrated with a Clinical Database Management System (CDMS) as a plug-in module. We tested the prototype system with some use case scenarios for distributed clinical data sources across several legacy CDMSs. The results have been effective in improving information delivery, completing tasks that would have been otherwise difficult to accomplish, and reducing the time required to finish tasks used in collaborative information retrieval and sharing with other systems. Conclusions: We created a prototype implementation of HL7 v3-RIM mapping for information integration between distributed clinical data sources to promote collaborative healthcare and translational research. The prototype has effectively and efficiently ensured the accuracy of the information and knowledge extractions for systems that have been integrated. PMID:22104558
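To give a flavor of what mapping a local record onto an RIM-style global view involves, here is a highly simplified sketch; the class, its fields and the sample row are loose illustrative assumptions, not the actual HL7 v3 class model.

```python
from dataclasses import dataclass

@dataclass
class Observation:
    """A loose stand-in for an RIM Act specialized as an observation."""
    code: str          # what was observed, e.g. a LOINC code
    value: float
    unit: str
    subject_id: str    # participation: the patient the act applies to

def map_local_row(row: dict) -> Observation:
    # Translate one row of a local CDMS table into the global RIM-style view
    return Observation(code=row["loinc"], value=row["result"],
                       unit=row["units"], subject_id=row["patient_id"])

row = {"patient_id": "P-0042", "loinc": "718-7", "result": 13.2, "units": "g/dL"}
print(map_local_row(row))   # a hemoglobin observation in the integrated schema
```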
Family medicine in Iran: facing the health system challenges.
Esmaeili, Reza; Hadian, Mohammad; Rashidian, Arash; Shariati, Mohammad; Ghaderi, Hossien
2014-11-30
In response to the current fragmented context of health systems, it is essential to support the revitalization of primary health care in order to provide a stronger sense of direction and integrity. Around the world, family medicine is recognized as a core discipline for strengthening the primary health care setting. This study aimed to understand the perspectives of policy makers and decision makers in Iran's health system on the implementation of family medicine in Iran's urban areas. This is a qualitative study with framework analysis. Purposive semi-structured interviews were conducted with policy and decision makers in the five main organizations of the Iranian health care system. The codes were extracted using inductive and deductive methods. From the 27 semi-structured interviews conducted with policy and decision makers, three main themes and 8 subthemes were extracted: the development of the referral system, better access to health care, and the management of chronic diseases. Family medicine is a viable means for a series of crucial reforms in the face of the current challenges of the health system. Implementation of family medicine can strengthen the PHC model in Iran's urban areas. Attempting to create a general consensus among various stakeholders is essential for effective implementation of the project.
An Integrated Children Disease Prediction Tool within a Special Social Network.
Apostolova Trpkovska, Marika; Yildirim Yayilgan, Sule; Besimi, Adrian
2016-01-01
This paper proposes a social network with an integrated children's disease prediction system developed using the specially designed Children General Disease Ontology (CGDO). This ontology consists of children's diseases and their relationships with symptoms, together with Semantic Web Rule Language (SWRL) rules specially designed for predicting diseases. The prediction process starts with the user entering data about the observed signs and symptoms, which are then mapped to the CGDO ontology. Once the data are mapped, the prediction phase executes the rules, which extract the predicted disease details based on the SWRL rules specified, and the prediction results are presented. The motivation behind the development of this system is to spread knowledge about children's diseases and their symptoms in a very simple way using the specialized social networking website www.emama.mk.
HOWDY: an integrated database system for human genome research
Hirakawa, Mika
2002-01-01
HOWDY is an integrated database system for accessing and analyzing human genomic information (http://www-alis.tokyo.jst.go.jp/HOWDY/). HOWDY stores information about relationships between genetic objects and the data extracted from a number of databases. HOWDY consists of an Internet accessible user interface that allows thorough searching of the human genomic databases using the gene symbols and their aliases. It also permits flexible editing of the sequence data. The database can be searched using simple words and the search can be restricted to a specific cytogenetic location. Linear maps displaying markers and genes on contig sequences are available, from which an object can be chosen. Any search starting point identifies all the information matching the query. HOWDY provides a convenient search environment of human genomic data for scientists unsure which database is most appropriate for their search. PMID:11752279
NASA Astrophysics Data System (ADS)
Jiang, Wen-Hao; Liu, Jian-Hong; Liu, Yin; Jin, Ge; Zhang, Jun; Pan, Jian-Wei
2017-12-01
InGaAs/InP single-photon detectors (SPDs) are the key devices for applications requiring near-infrared single-photon detection. Gating mode is an effective approach to synchronous single-photon detection. Increasing the gating frequency and reducing the module size are important challenges in the design of such detector systems. Here we present for the first time an InGaAs/InP SPD with 1.25 GHz sine wave gating using a monolithically integrated readout circuit (MIRC). The MIRC has a size of 15 mm × 15 mm and implements the miniaturization of avalanche extraction for high-frequency sine wave gating. In the MIRC, low-pass filters and a low-noise radio frequency amplifier are integrated based on the technique of low temperature co-fired ceramic, which can effectively reduce the parasitic capacitance and extract weak avalanche signals. We then characterize the InGaAs/InP SPD to verify the functionality and reliability of the MIRC, and the SPD exhibits excellent performance with 27.5% photon detection efficiency, 1.2 kcps dark count rate, and 9.1% afterpulse probability at 223 K and 100 ns hold-off time. With this MIRC, one can further design miniaturized high-frequency SPD modules that are highly required for practical applications.
Mehta, Mohit J; Kumar, Arvind
2017-12-14
There is significant interest in the development of a sustainable and integrated process for the extraction of essential oils and separation of biopolymers by using novel and efficient solvent systems. Herein, cassia essential oil enriched in coumarin is extracted from Cinnamomum cassia bark by using a protic ionic liquid (IL), ethylammonium nitrate (EAN), through dissolution and the creation of a biphasic system with the help of diethyl ether. The process was optimized, in terms of higher biomass dissolution ability and essential oil yield, through the addition of aprotic ILs (based on the 1-butyl-3-methylimidazolium ([C4mim]) cation and chloride or acetate anions) to EAN. After extraction of the oil, cellulose-rich material and free lignin were regenerated from the biomass-IL solutions by using a 1:1 mixture of acetone-water. The purity of the extracted essential oil and biopolymers was ascertained by means of FTIR spectroscopy, NMR spectroscopy, and GC-MS techniques. Because lignin contains UV-blocking chromophores, the oil-free residual lignocellulosic material was directly utilized to construct UV-light-resistant composite materials in conjunction with the biopolymer chitosan. The composite material thus obtained was processed into biodegradable films, which were characterized for mechanical and optical properties. The films showed excellent UV-light resistance and mechanical properties, making the material suitable for packaging and light-sensitive applications. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Automated extraction and semantic analysis of mutation impacts from the biomedical literature
2012-01-01
Background Mutations as sources of evolution have long been the focus of attention in the biomedical literature. Accessing the mutational information and their impacts on protein properties facilitates research in various domains, such as enzymology and pharmacology. However, manually curating the rich and fast growing repository of biomedical literature is expensive and time-consuming. As a solution, text mining approaches have increasingly been deployed in the biomedical domain. While the detection of single-point mutations is well covered by existing systems, challenges still exist in grounding impacts to their respective mutations and recognizing the affected protein properties, in particular kinetic and stability properties together with physical quantities. Results We present an ontology model for mutation impacts, together with a comprehensive text mining system for extracting and analysing mutation impact information from full-text articles. Organisms, as sources of proteins, are extracted to help disambiguation of genes and proteins. Our system then detects mutation series to correctly ground detected impacts using novel heuristics. It also extracts the affected protein properties, in particular kinetic and stability properties, as well as the magnitude of the effects and validates these relations against the domain ontology. The output of our system can be provided in various formats, in particular by populating an OWL-DL ontology, which can then be queried to provide structured information. The performance of the system is evaluated on our manually annotated corpora. In the impact detection task, our system achieves a precision of 70.4%-71.1%, a recall of 71.3%-71.5%, and grounds the detected impacts with an accuracy of 76.5%-77%. The developed system, including resources, evaluation data and end-user and developer documentation is freely available under an open source license at http://www.semanticsoftware.info/open-mutation-miner. Conclusion We present Open Mutation Miner (OMM), the first comprehensive, fully open-source approach to automatically extract impacts and related relevant information from the biomedical literature. We assessed the performance of our work on manually annotated corpora and the results show the reliability of our approach. The representation of the extracted information into a structured format facilitates knowledge management and aids in database curation and correction. Furthermore, access to the analysis results is provided through multiple interfaces, including web services for automated data integration and desktop-based solutions for end user interactions. PMID:22759648
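As a hedged illustration of one small piece of such a pipeline, single-point mutation mentions in the common one-letter wNm notation (e.g., A123V) can be located with a regular expression. The sketch below is not OMM's actual extraction logic, which additionally handles mutation series, grounding, and impact analysis.

```python
import re

# One-letter amino-acid codes. Matches point-mutation mentions such as
# "A123V"; three-letter forms ("Ala123Val") are omitted for brevity.
AA = "ACDEFGHIKLMNPQRSTVWY"
POINT_MUTATION = re.compile(rf"\b([{AA}])(\d+)([{AA}])\b")

def find_mutations(text):
    """Yield (wild-type, position, mutant) triples found in free text."""
    for wt, pos, mut in POINT_MUTATION.findall(text):
        if wt != mut:                      # discard silent "A123A" hits
            yield wt, int(pos), mut

sample = "The T315I substitution abolishes imatinib binding."
print(list(find_mutations(sample)))        # [('T', 315, 'I')]
```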
Xie, Fagen; Lee, Janet; Munoz-Plaza, Corrine E; Hahn, Erin E; Chen, Wansu
2017-01-01
Surgical pathology reports (SPR) contain rich clinical diagnosis information. The text information extraction system (TIES) is an end-to-end application leveraging natural language processing technologies and focused on the processing of pathology and/or radiology reports. We deployed the TIES system and integrated SPRs into it on a daily basis at Kaiser Permanente Southern California. The breast cancer cases diagnosed in December 2013 from the Cancer Registry (CANREG) were used to validate the performance of the TIES system. The National Cancer Institute Metathesaurus (NCIM) concept terms and codes describing breast cancer were identified through the Unified Medical Language System Terminology Service (UTS) application. The identified NCIM codes were used to search for the coded SPRs in the back-end datastore directly, and the identified cases were then compared with the breast cancer patients pulled from CANREG. A total of 437 breast cancer concept terms and 14 combinations of "breast" and "cancer" terms were identified from the UTS application. A total of 249 breast cancer cases diagnosed in December 2013 were pulled from CANREG. Out of these 249 cases, 241 were successfully identified by the TIES system from a total of 457 reports. The TIES system also identified an additional 277 cases that were not part of the validation sample. Of these 277 cases, 11% were determined after manual examination to be highly likely cases, and 86% were in CANREG but diagnosed in months other than December 2013. The study demonstrated that the TIES system can effectively identify potential breast cancer cases in our care setting. Identified potential cases can easily be confirmed by reviewing the corresponding annotated reports through the front-end visualization interface. The TIES system is a useful tool for identifying various potential cancer cases in a timely manner and on a regular basis in support of clinical research studies.
Liu, Jian; Cheng, Yuhu; Wang, Xuesong; Zhang, Lin; Liu, Hui
2017-08-17
Early diagnosis of colorectal cancer is urgent. Some feature genes that are important to colorectal cancer development have been identified. However, for the early stage of colorectal cancer, less is known about the identity of specific cancer genes associated with advanced clinical stage. In this paper, we developed a feature extraction method named Optimal Mean based Block Robust Feature Extraction (OMBRFE) to identify feature genes associated with advanced colorectal cancer in clinical stage by using integrated colorectal cancer data. Firstly, based on the optimal mean and the L2,1-norm, a novel feature extraction method called Optimal Mean based Robust Feature Extraction (OMRFE) is proposed to identify feature genes. The OMBRFE method, which introduces the block ideology into OMRFE, is then put forward to process the integrated colorectal cancer data, which comprise multiple genomic data types: copy number alterations, somatic mutations, methylation expression alterations, and gene expression changes. Experimental results demonstrate that OMBRFE is more effective than previous methods in identifying feature genes. Moreover, genes identified by OMBRFE are verified to be closely associated with advanced colorectal cancer in clinical stage.
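The abstract does not define the L2,1-norm; for reference, the standard definition for a matrix M with n rows and d columns is

```latex
\|M\|_{2,1} \;=\; \sum_{i=1}^{n} \sqrt{\sum_{j=1}^{d} M_{ij}^{2}} \;=\; \sum_{i=1}^{n} \left\| m_{i} \right\|_{2},
```

where m_i is the i-th row of M. Penalizing all entries of a row jointly drives whole rows to zero, which is why L2,1-regularized methods are favoured when selecting feature genes across samples.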
Instantaneous Coastline Extraction from LIDAR Point Cloud and High Resolution Remote Sensing Imagery
NASA Astrophysics Data System (ADS)
Li, Y.; Zhoing, L.; Lai, Z.; Gan, Z.
2018-04-01
A new method for instantaneous waterline extraction is proposed in this paper, combining point cloud geometry features and image spectral characteristics of the coastal zone. The proposed method consists of the following steps: the Mean Shift algorithm is used to segment the coastal zone of high-resolution remote sensing images into small regions containing semantic information; region features are extracted by integrating the LiDAR data and the image surface area; initial waterlines are extracted by the α-shape algorithm; a region-growing algorithm refines the coastline, with a growth rule integrating the intensity and topography of the LiDAR data; finally, the coastline is smoothed. Experiments are conducted to demonstrate the efficiency of the proposed method.
NASA Astrophysics Data System (ADS)
Zhan, Jinliang; Lu, Pei
2006-11-01
Since the quality of traditional Chinese medicine products is affected by raw materials, processing, and many other factors, it is difficult for the production process, especially the extraction process, to ensure steady and homogeneous quality. At the same time, there exist quality-control blind spots due to the lack of on-line quality detection. If infrared spectrum analysis were used in the production process, on the basis of off-line analysis, to detect the quality of semi-manufactured goods in real time, assisted by advanced automatic control techniques, steady and homogeneous quality could be obtained. On-line detection of the extraction process therefore plays an important role in the development of the Chinese patent medicine industry. In this paper, the design and implementation of a monitoring experiment system for the traditional Chinese medicine extraction process, based on PROFIBUS-DP field bus, OPC, and Internet technology, is introduced. The system integrates intelligent data-acquisition nodes with a supervisory subsystem providing graphical configuration and remote supervision; during production it monitors temperature, pressure, quality, and other parameters, and it can be controlled by remote nodes in a VPN (Virtual Private Network). Experiments and applications have proved that the system fully reaches the anticipated effect, with the merits of operational stability, real-time behaviour, reliability, and convenient, simple manipulation.
Automatic identification of species with neural networks.
Hernández-Serna, Andrés; Jiménez-Segura, Luz Fernanda
2014-01-01
A new automatic identification system using photographic images has been designed to recognize fish, plant, and butterfly species from Europe and South America. The automatic classification system integrates multiple image processing tools to extract the geometry, morphology, and texture of the images. Artificial neural networks (ANNs) were used as the pattern recognition method. We tested a data set that included 740 species and 11,198 individuals. Our results show that the system performed with high accuracy, reaching 91.65% true-positive identifications for fish, 92.87% for plants, and 93.25% for butterflies. Our results highlight how neural networks can complement traditional approaches to species identification.
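As a minimal sketch of the classification stage only, assuming precomputed feature vectors (the paper's geometry, morphology, and texture descriptors and its network architecture are not reproduced here), an MLP classifier could be trained as follows; the data are random placeholders.

```python
import numpy as np
from sklearn.neural_network import MLPClassifier
from sklearn.model_selection import train_test_split

# Placeholder data standing in for geometry/morphology/texture features.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 32))          # 1000 individuals, 32 features
y = rng.integers(0, 5, size=1000)        # 5 species labels

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
clf = MLPClassifier(hidden_layer_sizes=(64,), max_iter=500, random_state=0)
clf.fit(X_tr, y_tr)
print(f"held-out accuracy: {clf.score(X_te, y_te):.2%}")
```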
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stillwell, B.; Billett, B.; Brajuskovic, B.
2017-06-20
Recent work on the design of the storage ring vacuum system for the Advanced Photon Source Upgrade project (APS-U) includes: revising the vacuum system design to accommodate a new lattice with reverse bend magnets, modifying the designs of vacuum chambers in the FODO sections for more intense incident synchrotron radiation power, modifying the design of rf-shielding bellows liners for better performance and reliability, modifying photon absorber designs to make better use of available space, and integrated planning of components needed in the injection, extraction and rf cavity straight sections. An overview of progress in these areas is presented.
RASSOR Demonstration in Regolith Bin
2016-09-29
An integrated test of the MARCO POLO/Mars Pathfinder in-situ resource utilization, or ISRU, system takes place at NASA's Kennedy Space Center in Florida. A mockup of MARCO POLO, an ISRU propellant production technology demonstration simulated mission, is tested in a regolith bin with RASSOR 2.0, the Regolith Advanced Surface Systems Operations Robot. On the surface of Mars, mining robots like RASSOR will dig down into the regolith and take the material to a processing plant where usable resources such as hydrogen, oxygen and water can be extracted for life support systems. Regolith also shows promise both for construction and for producing constituents of rocket fuel.
Patel, Darshan C; Lyu, Yaqi Fara; Gandarilla, Jorge; Doherty, Steve
2018-04-03
In-process sampling and analysis is an important aspect of monitoring kinetic profiles and impurity formation or rejection, both in development and during commercial manufacturing. In pharmaceutical process development, the technology of choice for a substantial portion of this analysis is high-performance liquid chromatography (HPLC). Traditionally, sample extraction and preparation for reaction characterization have been performed manually. This can be time consuming, laborious, and impractical for long processes. Depending on the complexity of the sample preparation, variability can be introduced by different analysts, and in some cases the integrity of the sample can be compromised during handling. While commercial instruments are available for on-line monitoring with HPLC, they lack capabilities in many key areas: some do not integrate sampling and analysis, while others afford limited flexibility in sample preparation, a limited number of unit operations for sample processing, and no option for workflow customization. This work describes the development of a microfluidic automated program (MAP) that fully automates sample extraction, manipulation, and on-line LC analysis. The flexible system is controlled through an intuitive Microsoft Excel based user interface. The autonomous system is capable of unattended reaction monitoring and allows flexible unit operations and workflow customization to enable complex operations and on-line sample preparation. The automated system is shown to offer advantages over manual approaches in key areas while providing consistent and reproducible in-process data. Copyright © 2017 Elsevier B.V. All rights reserved.
TRENCADIS--a WSRF grid MiddleWare for managing DICOM structured reporting objects.
Blanquer, Ignacio; Hernandez, Vicente; Segrelles, Damià
2006-01-01
The adoption of digital processing of medical data, especially in radiology, has led to the availability of millions of records (images and reports). However, this information is mainly used at the patient level and organised according to administrative criteria, which makes the extraction of knowledge difficult. Moreover, legal constraints make the direct integration of information systems complex or even impossible. On the other hand, the widespread adoption of the DICOM format has led to the inclusion of information beyond radiological images. The possibility of coding radiology reports in a structured form, adding semantic information about the data contained in the DICOM objects, eases the process of structuring images according to content. DICOM Structured Reporting (DICOM-SR) is a specification of tags and sections to code and integrate radiology reports, with seamless references to findings and regions of interest in the associated images, movies, waveforms, signals, etc. The work presented in this paper aims at developing a framework to efficiently and securely share medical images and radiology reports, as well as to provide high-throughput processing services. This system is based on an architecture previously developed in the framework of the TRENCADIS project, and uses other components, such as the security system and the Grid processing service, developed in previous activities. The work presented here introduces a semantic structuring and ontology framework to organise medical images according to standard terminology and disease coding formats (SNOMED, ICD-9, LOINC, etc.).
Event-driven processing for hardware-efficient neural spike sorting
NASA Astrophysics Data System (ADS)
Liu, Yan; Pereira, João L.; Constandinou, Timothy G.
2018-02-01
Objective. The prospect of real-time and on-node spike sorting provides a genuine opportunity to push the envelope of large-scale integrated neural recording systems. In such systems the hardware resources, power requirements and data bandwidth increase linearly with channel count. Event-based (or data-driven) processing can provide an efficient new means for hardware implementation that is completely activity dependent. In this work, we investigate using continuous-time level-crossing sampling for efficient data representation and subsequent spike processing. Approach. (1) We first compare signals (synthetic neural datasets) encoded with this technique against conventional sampling. (2) We then show how such a representation can be directly exploited by extracting simple time-domain features from the bitstream to perform neural spike sorting. (3) The proposed method is implemented on a low-power FPGA platform to demonstrate its hardware viability. Main results. It is observed that considerably lower data rates are achievable when using 7 bits or fewer to represent the signals, whilst maintaining signal fidelity. Results obtained using both MATLAB and reconfigurable logic hardware (FPGA) indicate that feature extraction and spike sorting can be achieved with comparable or better accuracy than reference methods whilst requiring relatively low hardware resources. Significance. By effectively exploiting continuous-time data representation, neural signal processing can be achieved in a completely event-driven manner, reducing both the required resources (memory, complexity) and computations (operations). This will see future large-scale neural systems integrating on-node processing in real-time hardware.
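A minimal sketch of the level-crossing encoding idea, assuming a uniformly sampled input for simplicity (a hardware implementation operates in continuous time): an event is emitted each time the signal crosses the next quantization level up or down.

```python
import numpy as np

def level_crossing_events(signal, delta):
    """Encode a sampled signal as (index, +1/-1) level-crossing events.

    An event is emitted each time the signal moves one quantization
    step `delta` above or below the last crossed level. Sketch only.
    """
    events, level = [], signal[0]
    for i, s in enumerate(signal):
        while s >= level + delta:
            level += delta
            events.append((i, +1))
        while s <= level - delta:
            level -= delta
            events.append((i, -1))
    return events

t = np.linspace(0, 1, 1000)
spike = np.exp(-((t - 0.5) ** 2) / 2e-4)        # a crude spike waveform
print(len(level_crossing_events(spike, 0.1)))   # event count << sample count
```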
Advances in the Control System for a High Precision Dissolved Organic Carbon Analyzer
NASA Astrophysics Data System (ADS)
Liao, M.; Stubbins, A.; Haidekker, M.
2017-12-01
Dissolved organic carbon (DOC) is a master variable in aquatic ecosystems. DOC in the ocean is one of the largest carbon stores on earth. Studies of the dynamics of DOC in the ocean and other low DOC systems (e.g. groundwater) are hindered by the lack of high precision (sub-micromolar) analytical techniques. Results are presented from efforts to construct and optimize a flow-through, wet chemical DOC analyzer. This study focused on the design, integration and optimization of high precision components and control systems required for such a system (mass flow controller, syringe pumps, gas extraction, reactor chamber with controlled UV and temperature). Results of the approaches developed are presented.
Remote Video Monitor of Vehicles in Cooperative Information Platform
NASA Astrophysics Data System (ADS)
Qin, Guofeng; Wang, Xiaoguo; Wang, Li; Li, Yang; Li, Qiyan
Detection of vehicles plays an important role in modern intelligent traffic management, and pattern recognition is a hot issue in computer vision. An auto-recognition system in a cooperative information platform is studied. In the cooperative platform, 3G wireless networks, including GPS, GPRS (CDMA), Internet (Intranet), remote video monitoring and M-DMB networks, are integrated. Remote video information is captured at the terminals and sent to the cooperative platform, where it is processed by the auto-recognition system. The images are pretreated and segmented, followed by feature extraction, template matching and pattern recognition. The system identifies different vehicle models and gathers vehicular traffic statistics. Finally, the implementation of the system is introduced.
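As a hedged sketch of the template-matching step only (the file names are hypothetical and the platform's real pipeline is not reproduced here), OpenCV's matchTemplate can score a vehicle template against a frame:

```python
import cv2

# Illustrative template-matching step; file names are hypothetical.
frame = cv2.imread("frame.png", cv2.IMREAD_GRAYSCALE)
template = cv2.imread("vehicle_template.png", cv2.IMREAD_GRAYSCALE)

scores = cv2.matchTemplate(frame, template, cv2.TM_CCOEFF_NORMED)
_, best_score, _, best_xy = cv2.minMaxLoc(scores)
if best_score > 0.8:                      # empirical acceptance threshold
    print("vehicle candidate at", best_xy, "score", round(best_score, 3))
```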
NASA Astrophysics Data System (ADS)
Jain, A. K.; Dorai, C.
Computer vision has emerged as a challenging and important area of research, both as an engineering and a scientific discipline. The growing importance of computer vision is evident from the fact that it was identified as one of the "Grand Challenges" and also from its prominent role in the National Information Infrastructure. While the design of a general-purpose vision system continues to be elusive, machine vision systems are being used successfully in specific application domains. Building a practical vision system requires a careful selection of appropriate sensors, extraction and integration of information from available cues in the sensed data, and evaluation of system robustness and performance. The authors discuss and demonstrate advantages of (1) multi-sensor fusion, (2) combination of features and classifiers, (3) integration of visual modules, and (4) admissibility and goal-directed evaluation of vision algorithms. The requirements of several prominent real-world applications, such as biometry, document image analysis, image and video database retrieval, and automatic object model construction, offer exciting problems and new opportunities to design and evaluate vision algorithms.
Dynamic Policy Evaluation for Containing Network Attacks (DEFCN)
2005-03-01
The GAA-API reads policy information from the target user's ".ssh" directory and applies those policies to determine whether remote login is allowed to a... Types of events that can be controlled by the threshold detectors and reported by the GAA-API include the number of failed login attempts within a given... other uses of the system. The Emerald architecture [2] includes a data-collection module integrated with the Apache Web server. The module extracts the request
Systems microscopy: an emerging strategy for the life sciences.
Lock, John G; Strömblad, Staffan
2010-05-01
Dynamic cellular processes occurring in time and space are fundamental to all physiology and disease. To understand complex and dynamic cellular processes therefore demands the capacity to record and integrate quantitative multiparametric data from the four spatiotemporal dimensions within which living cells self-organize, and to subsequently use these data for the mathematical modeling of cellular systems. To this end, a raft of complementary developments in automated fluorescence microscopy, cell microarray platforms, quantitative image analysis and data mining, combined with multivariate statistics and computational modeling, now coalesce to produce a new research strategy, "systems microscopy", which facilitates systems biology analyses of living cells. Systems microscopy provides the crucial capacities to simultaneously extract and interrogate multiparametric quantitative data at resolution levels ranging from the molecular to the cellular, thereby elucidating a more comprehensive and richly integrated understanding of complex and dynamic cellular systems. The unique capacities of systems microscopy suggest that it will become a vital cornerstone of systems biology, and here we describe the current status and future prospects of this emerging field, as well as outlining some of the key challenges that remain to be overcome. Copyright 2010 Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Young, Steve; UijtdeHaag, Maarten; Sayre, Jonathon
2003-01-01
Synthetic Vision Systems (SVS) provide pilots with displays of stored geo-spatial data representing terrain, obstacles, and cultural features. As comprehensive validation is impractical, these databases typically have no quantifiable level of integrity. Further, updates to the databases may not be provided as changes occur. These issues limit the certification level and constrain the operational context of SVS for civil aviation. Previous work demonstrated the feasibility of using a realtime monitor to bound the integrity of Digital Elevation Models (DEMs) by using radar altimeter measurements during flight. This paper describes an extension of this concept to include X-band Weather Radar (WxR) measurements. This enables the monitor to detect additional classes of DEM errors and to reduce the exposure time associated with integrity threats. Feature extraction techniques are used along with a statistical assessment of similarity measures between the sensed and stored features that are detected. Recent flight-testing in the area around the Juneau, Alaska Airport (JNU) has resulted in a comprehensive set of sensor data that is being used to assess the feasibility of the proposed monitor technology. Initial results of this assessment are presented.
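One plausible form of such a similarity measure, shown here only as an illustration and not as the paper's actual statistic, is the normalized cross-correlation between the sensed terrain profile and the corresponding DEM profile along the same track:

```python
import numpy as np

def profile_similarity(sensed, stored):
    """Normalized cross-correlation between a sensed terrain profile and
    the stored DEM profile along the same track. One of many possible
    similarity measures; the paper's statistic is not reproduced."""
    s = (sensed - sensed.mean()) / sensed.std()
    d = (stored - stored.mean()) / stored.std()
    return float(np.mean(s * d))          # 1.0 = identical shape

track = np.linspace(0, 10, 200)
dem = np.sin(track) + 0.1 * track         # stored elevation profile
sensed = dem + np.random.default_rng(1).normal(0, 0.05, dem.size)
print(round(profile_similarity(sensed, dem), 3))  # close to 1.0
```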
An integrated framework for detecting suspicious behaviors in video surveillance
NASA Astrophysics Data System (ADS)
Zin, Thi Thi; Tin, Pyke; Hama, Hiromitsu; Toriu, Takashi
2014-03-01
In this paper, we propose an integrated framework for detecting suspicious behaviors in video surveillance systems established in public places such as railway stations, airports and shopping malls. In particular, suspicious loitering, unattended objects left behind and the exchange of suspicious objects between persons are common security concerns in airports and other transit scenarios. Detecting them involves understanding the scene and events, analyzing human movements, recognizing controllable objects, and observing the effect of human movement on those objects. In the proposed framework, a multiple-background modeling technique, a high-level motion feature extraction method and embedded Markov chain models are integrated for detecting suspicious behaviors in real-time video surveillance systems. Specifically, the proposed framework employs a probability-based multiple-background modeling technique to detect moving objects. Velocity and distance measures are then computed as the high-level motion features of interest. By integrating the computed features with the first-passage-time probabilities of the embedded Markov chain, suspicious behaviors in video surveillance are analyzed for detecting loitering persons, objects left behind and human interactions such as fighting. The proposed framework has been tested using standard public datasets and our own video surveillance scenarios.
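First-passage-time probabilities for a discrete Markov chain can be computed by making the target state absorbing and tracking the probability mass that newly enters it at each step. The sketch below uses a hypothetical three-state behaviour model; the paper's actual chain and features are not reproduced.

```python
import numpy as np

def first_passage_probs(P, start, target, n_max):
    """probs[n] = probability the chain first reaches `target` at step n+1.

    Computed by making `target` absorbing and recording the probability
    mass freshly absorbed at each step."""
    Q = P.copy()
    Q[target] = 0.0
    Q[target, target] = 1.0              # absorb at the target state
    dist = np.zeros(len(P)); dist[start] = 1.0
    probs = []
    for _ in range(n_max):
        new = dist @ Q
        probs.append(new[target] - dist[target])  # fresh mass absorbed
        dist = new
    return np.array(probs)

# Hypothetical 3-state behaviour model: 0=walking, 1=standing, 2=loitering
P = np.array([[0.7, 0.2, 0.1],
              [0.3, 0.5, 0.2],
              [0.1, 0.2, 0.7]])
print(first_passage_probs(P, start=0, target=2, n_max=5))
```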
Automatic Extraction of Road Markings from Mobile Laser-Point Cloud Using Intensity Data
NASA Astrophysics Data System (ADS)
Yao, L.; Chen, Q.; Qin, C.; Wu, H.; Zhang, S.
2018-04-01
With the development of intelligent transportation, high-precision road information has been widely applied in many fields. This paper proposes a concise and practical way to extract road marking information from point cloud data collected by a mobile mapping system (MMS). The method contains three steps. Firstly, the road surface is segmented through edge detection on scan lines. Then an intensity image is generated by inverse distance weighted (IDW) interpolation, and the road markings are extracted using adaptive threshold segmentation based on an integral image, without intensity calibration; noise is reduced by removing small patches of pixels from the binary image. Finally, the point cloud mapped from the binary image is clustered into marking objects according to Euclidean distance, and a series of algorithms, including template matching and feature attribute filtering, is applied for the classification of linear markings, arrow markings and guidelines. Processing the point cloud data collected by a RIEGL VUX-1 in the case area shows that the F-score of marking extraction is 0.83 and the average classification rate is 0.9.
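The integral-image trick that makes the adaptive thresholding cheap is worth spelling out: the local mean in any window is obtained in constant time from four lookups in a cumulative-sum image. A minimal sketch, with an illustrative window size and bias rather than the paper's calibrated values:

```python
import numpy as np

def adaptive_threshold(img, win=15, bias=0.9):
    """Binarize `img` by comparing each pixel with its local mean,
    computed in O(1) per pixel from an integral image."""
    h, w = img.shape
    ii = np.pad(img, ((1, 0), (1, 0)), mode="constant").cumsum(0).cumsum(1)
    r = win // 2
    out = np.zeros_like(img, dtype=bool)
    for y in range(h):
        y0, y1 = max(0, y - r), min(h, y + r + 1)
        for x in range(w):
            x0, x1 = max(0, x - r), min(w, x + r + 1)
            area = (y1 - y0) * (x1 - x0)
            local_sum = ii[y1, x1] - ii[y0, x1] - ii[y1, x0] + ii[y0, x0]
            out[y, x] = img[y, x] * area > local_sum * bias
    return out

marks = adaptive_threshold(np.random.rand(64, 64))
print(marks.mean())   # fraction of pixels classified as marking
```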
Paraskevopoulou, Sivylla E; Barsakcioglu, Deren Y; Saberi, Mohammed R; Eftekhar, Amir; Constandinou, Timothy G
2013-04-30
Next generation neural interfaces aspire to achieve real-time multi-channel systems by integrating spike sorting on chip to overcome limitations in communication channel capacity. The feasibility of this approach relies on developing highly efficient algorithms for feature extraction and clustering with the potential of low-power hardware implementation. We are proposing a feature extraction method, not requiring any calibration, based on first and second derivative features of the spike waveform. The accuracy and computational complexity of the proposed method are quantified and compared against commonly used feature extraction methods, through simulation across four datasets (with different single units) at multiple noise levels (ranging from 5 to 20% of the signal amplitude). The average classification error is shown to be below 7% with a computational complexity of 2N-3, where N is the number of sample points of each spike. Overall, this method presents a good trade-off between accuracy and computational complexity and is thus particularly well-suited for hardware-efficient implementation. Copyright © 2013 Elsevier B.V. All rights reserved.
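Forming both difference traces takes (N-1) + (N-2) = 2N-3 subtractions, which matches the complexity quoted above. A minimal sketch of derivative-based features follows; the exact summary features used in the paper (here, the extrema of each derivative) are an assumption.

```python
import numpy as np

def derivative_features(spike):
    """Discrete first/second derivatives of a spike waveform and simple
    summary features (their extrema). Illustrative feature choice only."""
    d1 = np.diff(spike, n=1)            # first derivative (N-1 points)
    d2 = np.diff(spike, n=2)            # second derivative (N-2 points)
    return np.array([d1.max(), d1.min(), d2.max(), d2.min()])

t = np.linspace(-1, 1, 64)
spike = np.exp(-t**2 / 0.05) - 0.3 * np.exp(-(t - 0.4)**2 / 0.1)
print(derivative_features(spike))
```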
NASA Astrophysics Data System (ADS)
Mooser, Matthias; Burri, Christian; Stoller, Markus; Luggen, David; Peyer, Michael; Arnold, Patrik; Meier, Christoph; Považay, Boris
2017-07-01
Ocular optical coherence tomography at the wavelength ranges of 850 and 1060 nm has been integrated with a confocal scanning laser ophthalmoscope eye-tracker as a clinical commercial-class system. Collinear optics enables an exact overlap of the different channels to produce precisely overlapping depth-scans for evaluating the similarities and differences between the wavelengths and extracting additional physiologic information. A reliable segmentation algorithm utilizing graph cuts has been implemented and applied to automatically extract retinal and choroidal shape in cross-sections and volumes. The device has been tested in normal subjects and in pathologies, including a cross-sectional and longitudinal study of myopia progression and control with a duplicate instrument in Asian children.
An Automated High-Throughput System to Fractionate Plant Natural Products for Drug Discovery
Tu, Ying; Jeffries, Cynthia; Ruan, Hong; Nelson, Cynthia; Smithson, David; Shelat, Anang A.; Brown, Kristin M.; Li, Xing-Cong; Hester, John P.; Smillie, Troy; Khan, Ikhlas A.; Walker, Larry; Guy, Kip; Yan, Bing
2010-01-01
The development of an automated, high-throughput fractionation procedure to prepare and analyze natural product libraries for drug discovery screening is described. Natural products obtained from plant materials worldwide were extracted and first prefractionated on polyamide solid-phase extraction cartridges to remove polyphenols, followed by high-throughput automated fractionation, drying, weighing, and reformatting for screening and storage. The analysis of fractions with UPLC coupled with MS, PDA and ELSD detectors provides information that facilitates characterization of compounds in active fractions. Screening of a portion of fractions yielded multiple assay-specific hits in several high-throughput cellular screening assays. This procedure modernizes the traditional natural product fractionation paradigm by seamlessly integrating automation, informatics, and multimodal analytical interrogation capabilities. PMID:20232897
Discriminative Features Mining for Offline Handwritten Signature Verification
NASA Astrophysics Data System (ADS)
Neamah, Karrar; Mohamad, Dzulkifli; Saba, Tanzila; Rehman, Amjad
2014-03-01
Signature verification is an active research area in the field of pattern recognition. It is employed to identify a particular person with the help of the signature's characteristics, such as pen pressure, loop shape, writing speed and the up-and-down motion of the pen. In the entire process, the feature extraction and selection stage is of prime importance, since different signatures can share similar strokes, characteristics and sizes. Accordingly, this paper presents a combination of skeleton orientation and the gravity centre point to extract accurate pattern features of signature data in an offline signature verification system. Promising results have proved the success of the integration of the two methods.
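The gravity-centre component is straightforward to illustrate: it is the centroid of the ink pixels of the binarized signature image. A minimal sketch (the skeleton-orientation features are not reproduced here):

```python
import numpy as np

def gravity_centre(binary_img):
    """Centroid (row, col) of ink pixels in a binarized signature image."""
    ys, xs = np.nonzero(binary_img)
    return ys.mean(), xs.mean()

img = np.zeros((50, 120), dtype=bool)
img[20:25, 10:100] = True                # a crude horizontal stroke
print(gravity_centre(img))               # (22.0, 54.5)
```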
Business Intelligence Applied to the ALMA Software Integration Process
NASA Astrophysics Data System (ADS)
Zambrano, M.; Recabarren, C.; González, V.; Hoffstadt, A.; Soto, R.; Shen, T.-C.
2012-09-01
Software quality assurance and planning of an astronomy project is a complex task, especially for a distributed collaborative project such as ALMA, where the development centers are spread across the globe. When you execute a software project, there is much valuable information about the process itself that you may be able to collect. One way to receive this input is via an issue tracking system that gathers the problem reports relative to software bugs captured during testing of the software, during integration of the different components or, even worse, problems occurring in production. Usually little time is spent analyzing them, but with some multidimensional processing you can extract valuable information that may help with long-term planning and resource allocation. We present an analysis of the information collected at ALMA from a collection of key unbiased indicators. We describe the extraction, transformation and load process and how the data were processed. The main goal is to assess a software process and get insights from this information.
Kavakiotis, Ioannis; Xochelli, Aliki; Agathangelidis, Andreas; Tsoumakas, Grigorios; Maglaveras, Nicos; Stamatopoulos, Kostas; Hadzidimitriou, Anastasia; Vlahavas, Ioannis; Chouvarda, Ioanna
2016-06-06
Somatic Hypermutation (SHM) refers to the introduction of mutations within rearranged V(D)J genes, a process that increases the diversity of Immunoglobulins (IGs). The analysis of SHM has offered critical insight into the physiology and pathology of B cells, leading to strong prognostic markers for clinical outcome in chronic lymphocytic leukaemia (CLL), the most frequent adult B-cell malignancy. In this paper we present a methodology for integrating multiple immunogenetic and clinicobiological data sources in order to extract features and create high-quality datasets for SHM analysis in IG receptors of CLL patients. This dataset is used as the basis for a higher-level integration procedure, inspired by social choice theory. This is applied in the Towards Analysis, our attempt to investigate the potential ontogenetic transformation of genes belonging to specific stereotyped CLL subsets towards other genes or gene families, through SHM. The data integration process, followed by feature extraction, resulted in the generation of a dataset containing information about mutations occurring through SHM. The Towards Analysis performed on the integrated dataset, applying voting techniques, revealed the distinct behaviour of subset #201 compared to other subsets as regards SHM-related movements among gene clans, both in allele-conserved and non-conserved gene areas. With respect to movement between genes, a high percentage of movement towards pseudogenes was found in all CLL subsets. This data integration and feature extraction process can set the basis for exploratory analysis or a fully automated computational data mining approach to many as yet unanswered, clinically relevant biological questions.
Patel, Minal R; Vichich, Jennifer; Lang, Ian; Lin, Jessica; Zheng, Kai
2017-04-01
The introduction of health information technology systems, electronic health records in particular, is changing the nature of how clinicians interact with patients. Knowledge is still lacking on how best to integrate such systems into the exam room. The purpose of this systematic review was to (1) distill the "best" behavioral and communication practices recommended in the literature for clinicians interacting with patients in the presence of computerized systems during a clinical encounter, (2) weigh the evidence for each recommendation, and (3) rank evidence-based recommendations for electronic health record communication training initiatives for clinicians. We conducted a literature search of 6 databases, resulting in 52 articles included in the analysis. We extracted information such as study setting, research design, sample, findings, and implications. Recommendations were distilled based on consistent support for behavioral and communication practices across studies. Eight behavioral and communication practices received strong support in the literature, including specific ways of using computerized systems to facilitate conversation and transparency in the exam room, such as spatial (re)organization of the exam room, maintaining nonverbal communication, and specific techniques that integrate the computerized system into the visit and engage the patient. Four practices, although patient-centered, have received insufficient evidence to date. We developed an evidence base of best practices for clinicians to maintain patient-centered communication in the presence of computerized systems in the exam room. Further work includes the development and empirical evaluation of evidence-based guidelines to better integrate computerized systems into clinical care. © The Author 2016. Published by Oxford University Press on behalf of the American Medical Informatics Association. All rights reserved. For Permissions, please email: journals.permissions@oup.com
Integration of RAMS in LCC analysis for linear transport infrastructures. A case study for railways.
NASA Astrophysics Data System (ADS)
Calle-Cordón, Álvaro; Jiménez-Redondo, Noemi; Morales-Gámiz, F. J.; García-Villena, F. A.; Garmabaki, Amir H. S.; Odelius, Johan
2017-09-01
Life-cycle cost (LCC) analysis is an economic technique used to assess the total costs associated with the lifetime of a system in order to support decision making in long term strategic planning. For complex systems, such as railway and road infrastructures, the cost of maintenance plays an important role in the LCC analysis. Costs associated with maintenance interventions can be more reliably estimated by integrating the probabilistic nature of the failures associated to these interventions in the LCC models. Reliability, Maintainability, Availability and Safety (RAMS) parameters describe the maintenance needs of an asset in a quantitative way by using probabilistic information extracted from registered maintenance activities. Therefore, the integration of RAMS in the LCC analysis allows obtaining reliable predictions of system maintenance costs and the dependencies of these costs with specific cost drivers through sensitivity analyses. This paper presents an innovative approach for a combined RAMS & LCC methodology for railway and road transport infrastructures being developed under the on-going H2020 project INFRALERT. Such RAMS & LCC analysis provides relevant probabilistic information to be used for condition and risk-based planning of maintenance activities as well as for decision support in long term strategic investment planning.
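A generic discounted-cost form of such a RAMS-informed LCC model (a sketch of the idea, not INFRALERT's actual formulation) is

```latex
\mathrm{LCC} \;=\; C_{0} \;+\; \sum_{t=1}^{T} \frac{\lambda(t)\,C_{m} + C_{o}(t)}{(1+r)^{t}},
```

where C_0 is the initial investment, λ(t) the expected intervention rate derived from the RAMS parameters, C_m the mean cost per maintenance intervention, C_o(t) other operating costs in year t, r the discount rate, and T the planning horizon. The RAMS integration enters through λ(t), which is estimated from the registered maintenance data rather than assumed.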
Chun, Jaesung; Choi, Okkyoung; Sang, Byoung-In
2018-01-01
Extractive fermentation with removal of carboxylic acids requires low-pH conditions, because acids partition better into the solvent phase at low pH values. However, this requirement conflicts with the near-neutral pH conditions optimal for microbial growth. CO2 pressurization was used, instead of the addition of chemicals, to decrease the pH for the extraction of butyric acid, a fermentation product of Clostridium tyrobutyricum; butyl butyrate was selected as the extractant. CO2 pressurization (50 bar) improved the extraction efficiency of butyric acid from a solution at pH 6, yielding a distribution coefficient (D) of 0.42. In situ removal of butyric acid during fermentation increased the production of butyric acid to up to 4.10 g/(L·h), an almost twofold increase over the control without an extraction process. In situ extraction of butyric acid using temporal CO2 pressurization may be applied to an integrated downstream catalytic process for upgrading butyric acid to value-added chemicals in an organic solvent.
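The distribution coefficient reported above is, in the usual liquid-liquid extraction convention (the abstract does not define it, so this is stated as the standard definition),

```latex
D \;=\; \frac{[\mathrm{HA}]_{\mathrm{org}}}{[\mathrm{HA}]_{\mathrm{aq}}},
```

the ratio of the total butyric acid concentration in the organic (butyl butyrate) phase to that in the aqueous phase at equilibrium. Lowering the pH shifts the acid towards its protonated form, which partitions preferentially into the solvent, which is why CO2 pressurization improves D.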
High-order Path Integral Monte Carlo methods for solving strongly correlated fermion problems
NASA Astrophysics Data System (ADS)
Chin, Siu A.
2015-03-01
In solving for the ground state of a strongly correlated many-fermion system, the conventional second-order Path Integral Monte Carlo method is plagued by the sign problem. This is due to the large number of anti-symmetric free-fermion propagators needed to extract the square of the ground state wave function at large imaginary time. In this work, I show that optimized fourth-order Path Integral Monte Carlo methods, which use no more than 5 free-fermion propagators, in conjunction with the Hamiltonian energy estimator, can yield accurate ground state energies for quantum dots with up to 20 polarized electrons. The correlations are built in directly and no explicit wave functions are needed. This work is supported by the Qatar National Research Fund NPRP GRANT #5-674-1-114.
Knowledge Modeling in Prior Art Search
NASA Astrophysics Data System (ADS)
Graf, Erik; Frommholz, Ingo; Lalmas, Mounia; van Rijsbergen, Keith
This study explores the benefits of integrating knowledge representations in prior art patent retrieval. Key to the introduced approach is the utilization of human judgment available in the form of classifications assigned to patent documents. The paper first outlines in detail how a methodology for the extraction of knowledge from such a hierarchical classification system can be established. It then investigates potential ways of integrating this knowledge with existing Information Retrieval paradigms in a scalable and flexible manner. Finally, based on these integration strategies, the effectiveness in terms of recall and precision is evaluated in the context of a prior art search task for European patents. This evaluation establishes that, in general, the proposed knowledge expansion techniques are particularly beneficial to recall and, with respect to optimizing field retrieval settings, further result in significant precision gains.
Uboh, Friday E; Okon, Iniobong E; Ekong, Moses B
2010-02-01
Serum alanine aminotransferase (ALT), aspartate aminotransferase (AST), alkaline phosphatase (ALP), albumin and total protein levels, as well as tissue histological assays, are known to be useful in assessing the functional integrity of the liver. Likewise, assessment of red and white blood cell counts, hematocrit and hemoglobin concentrations is useful in determining the effect of chemical substances on the hematopoietic system. In recent times, reports from medicinal plant research indicate that extracts from some plants are both hepatotoxic and hematotoxic, while others are reported to be hepatoprotective and hematopoietic in action. This study considers the effects of aqueous extract of Psidium guajava (P. guajava) leaves on the histology and biochemical indices of liver function, as well as on hematological indices, in rats. Phytochemical screening of the aqueous extract of P. guajava leaves was carried out, and male and female rats were administered oral daily doses of 200 mg/kg body weight of the extract for a period of 30 days. At the end of the administration period, the rats were anaesthetized with chloroform vapour and dissected for the collection of blood and liver tissues, which were used for the hematopoietic and liver function investigations. Preliminary phytochemical analysis of the plant leaves showed the presence of alkaloids, flavonoids, glycosides, polyphenols, reducing compounds, saponins and tannins. Liver function tests revealed that serum ALT, AST and ALP, as well as the concentrations of total protein and albumin, in male and female rats were not significantly (P > 0.05) affected by oral administration of the extract. Histopathological study also did not show any adverse alteration in the morphological architecture of the liver tissues in either sex of the animal model. However, red blood cell counts, hematocrit and hemoglobin concentrations increased significantly (P < 0.05) on administration of the extract in both male and female rats, and the effect of the extract on male rats was not significantly different (P > 0.05) from that obtained for the female rats. The results of this study suggest that aqueous extract of Psidium guajava leaves may be hepatoprotective, rather than hepatotoxic, with hematopoietic potential in both male and female rats. These findings are of clinical importance given the various reported medicinal potentials of the plant.
Autonomous characterization of plastic-bonded explosives
NASA Astrophysics Data System (ADS)
Linder, Kim Dalton; DeRego, Paul; Gomez, Antonio; Baumgart, Chris
2006-08-01
Plastic-Bonded Explosives (PBXs) are a newer generation of explosive compositions developed at Los Alamos National Laboratory (LANL). Understanding the micromechanical behavior of these materials is critical: the size of the crystal particles and the porosity within the PBX influence its shock sensitivity. Current methods to characterize the prominent structural characteristics include manual examination by scientists and attempts to use commercially available image processing packages; both are time consuming and tedious. LANL personnel, recognizing this as a manually intensive process, have worked with the Kansas City Plant / Kirtland Operations to develop a system which utilizes image processing and pattern recognition techniques to characterize PBX material. System hardware consists of a CCD camera, zoom lens, two-dimensional motorized stage, and coaxial, cross-polarized light. System integration of this hardware with the custom software is at the core of the machine vision system. Fundamental processing steps involve capturing images from the PBX specimen and extracting void, crystal, and binder regions. For crystal extraction, a quadtree decomposition segmentation technique is employed. Benefits of this system include: (1) reduction of the overall characterization time; (2) a process which is quantifiable and repeatable; (3) utilization of personnel for intelligent review rather than manual processing; and (4) significantly enhanced characterization accuracy.
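Quadtree decomposition itself is simple to sketch: a block is split into four quadrants whenever it is not homogeneous, and homogeneous blocks become leaves. The homogeneity test and threshold below are illustrative choices, not the system's calibrated ones.

```python
import numpy as np

def quadtree(img, x, y, size, thresh, leaves):
    """Recursively split blocks whose intensity range exceeds `thresh`;
    collect homogeneous leaf blocks as (x, y, size) tuples."""
    block = img[y:y + size, x:x + size]
    if size == 1 or block.max() - block.min() <= thresh:
        leaves.append((x, y, size))
        return
    half = size // 2
    for dx, dy in ((0, 0), (half, 0), (0, half), (half, half)):
        quadtree(img, x + dx, y + dy, half, thresh, leaves)

img = np.random.rand(64, 64)
img[16:48, 16:48] = 0.5                  # one homogeneous region
leaves = []
quadtree(img, 0, 0, 64, thresh=0.2, leaves=leaves)
print(len(leaves), "leaf blocks")
```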
Winkelmann, Tim; Cee, Rainer; Haberer, Thomas; Naas, Bernd; Peters, Andreas; Schreiner, Jochen
2014-02-01
The clinical operation at the Heidelberg Ion Beam Therapy Center (HIT) started in November 2009; since then more than 1600 patients have been treated. In a 24/7 operation scheme, two 14.5 GHz electron cyclotron resonance ion sources are routinely used to produce protons and carbon ions. The modification of the low energy beam transport line and the integration of a third ion source into the therapy facility are described. In the last year we implemented a new extraction system at all three sources to enhance the lifetime of extraction parts and reduce preventive and corrective maintenance. The new four-electrode design provides electron suppression as well as lower beam emittance. Unwanted beam sputtering effects, which typically lead to contamination of the insulator ceramics and subsequent high-voltage breakdowns, are minimized by the beam guidance of the new extraction system, so the service interval can be increased significantly. As a side effect, the beam emittance can be reduced, allowing a less challenging working point for the ion sources without reducing the effective beam performance. This paper also gives an outlook on further enhancements at the HIT ion source testbench.
Wang, Yonghua; Zheng, Chunli; Huang, Chao; Li, Yan; Chen, Xuetong; Wu, Ziyin; Wang, Zhenzhong; Xiao, Wei; Zhang, Boli
2015-01-01
Holistic medicine is an interdisciplinary field of study that integrates all types of biological information (protein, small molecules, tissues, organs, external environmental signals, etc.) to produce predictive and actionable models for health care and disease treatment. Despite the global and integrative character of this discipline, a comprehensive picture of holistic medicine for the treatment of complex diseases is still lacking. In this study, we develop a novel systems pharmacology approach to dissect holistic medicine in treating cardiocerebrovascular diseases (CCDs) by TCM (traditional Chinese medicine). Firstly, applying the TCM active ingredients screened out by a systems-ADME process, we explored and experimentally examined the signed drug-target interactions, revealing the pharmacological actions of drugs at the molecular level. Then, at the tissue/organ level, the drug therapeutic mechanisms were further investigated by a target-organ location method. Finally, a translational integrating pathway approach was applied to extract the disease-therapeutic modules for understanding the complex disease and its therapy at the systems level. For the first time, the drug-target-pathway-organ cooperation underlying the treatment of multiple-organ diseases in holistic medicine was revealed, facilitating the development of novel treatment paradigms for complex diseases in the future.
Electromagnetic mixed waste processing system for asbestos decontamination
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kasevich, R.S.; Vaux, W.; Ulerich, N.
The overall objective of this three-phase program is to develop an integrated process for treating asbestos-containing material that is contaminated with radioactive and hazardous constituents; the integrated process will attempt to minimize processing and disposal costs. The objectives of Phase 1 were to establish the technical feasibility of asbestos decomposition, inorganic radionuclide and heavy metal removal, and organic volatilization. Phase 1 resulted in the successful bench-scale demonstration of the elements required to develop a mixed waste treatment process for asbestos-containing material (ACM) contaminated with radioactive metals, heavy metals, and organics. Using the Phase 1 data, a conceptual process was developed. The Phase 2 program, currently in progress, is developing an integrated system design for ACM waste processing. The Phase 3 program will target demonstration of the mixed waste processing system at a DOE facility. The electromagnetic mixed waste processing system employs patented technologies to convert DOE asbestos to a non-hazardous, radionuclide-free, stable waste. The dry, contaminated asbestos is first heated with radiofrequency energy to remove organic volatiles. Second, the radionuclides are removed by solvent extraction coupled with ion-exchange solution treatment. Third, the ABCOV method converts the asbestos to an amorphous silica suspension at low temperature (100°C). Finally, the amorphous silica is solidified for disposal.
Shaban-Nejad, Arash; Lavigne, Maxime; Okhmatovskaia, Anya; Buckeridge, David L
2017-01-01
Population health decision makers must consider complex relationships between multiple concepts measured with differential accuracy from heterogeneous data sources. Population health information systems are currently limited in their ability to integrate data and present a coherent portrait of population health. Consequently, these systems can provide only basic support for decision makers. The Population Health Record (PopHR) is a semantic web application that automates the integration and extraction of massive amounts of heterogeneous data from multiple distributed sources (e.g., administrative data, clinical records, and survey responses) to support the measurement and monitoring of population health and health system performance for a defined population. The design of the PopHR draws on the theories of the determinants of health and evidence-based public health to harmonize and explicitly link information about a population with evidence about the epidemiology and control of chronic diseases. Organizing information in this manner and linking it explicitly to evidence is expected to improve decision making related to the planning, implementation, and evaluation of population health and health system interventions. In this paper, we describe the PopHR platform and discuss its architecture, design, key modules, implementation, and use. © 2016 New York Academy of Sciences.
The integrated motion measurement simulation for SOFIA
NASA Astrophysics Data System (ADS)
Kaswekar, Prashant; Greiner, Benjamin; Wagner, Jörg
2014-07-01
The Stratospheric Observatory for Infrared Astronomy SOFIA consists of a B747-SP aircraft which carries aloft a 2.7-meter reflecting telescope. The image stability goal for SOFIA is 0.2 arc-seconds rms. The performance of the telescope structure is affected by elastic vibrations induced by aeroacoustic and suspension disturbances. Active compensation of such disturbances requires a fast way of estimating the structural motion. Integrated navigation systems are examples of such estimation systems; however, they employ a rigid-body assumption. A possible extension of these systems to an elastic structure has been shown by different authors for one-dimensional beam structures, taking into account the eigenmodes of the structural system. The rigid-body motion as well as the flexible modes of the telescope assembly, however, are coupled among the three axes. Extending a special mathematical approach to three-dimensional structures, a modal observer based on integrated motion measurement is simulated for SOFIA. The approach is in essence a fusion of different measurement methods, exploiting their benefits and blinding out their disadvantages. No mass and stiffness properties are needed directly in this approach, but knowledge of the modal properties of the structure is necessary for its implementation. A finite-element model is chosen as the basis from which to extract the modal properties of the structure.
Myklebust, Lars Henrik; Sørgaard, Knut; Wynn, Rolf
2015-01-01
In the last few decades, there has been a restructuring of the psychiatric services in many countries. The complexity of these systems may represent a challenge to patients that suffer from serious psychiatric disorders. We examined whether local integration of inpatient and outpatient services in contrast to centralized institutions strengthened continuity of care. Two different service-systems were compared. Service-utilization over a 4-year period for 690 inpatients was extracted from the patient registries. The results were controlled for demographic variables, model of service-system, central inpatient admission or local inpatient admission, diagnoses, and duration of inpatient stays. The majority of inpatients in the area with local integration of inpatient and outpatient services used both types of care. In the area that did not have beds locally, many patients that had been hospitalized did not receive outpatient follow-up. Predictors of inpatients' use of outpatient psychiatric care were: Model of service-system (centralized vs decentralized), a diagnosis of affective disorder, central inpatient admission only, and duration of inpatient stays. Psychiatric centers with local inpatient units may positively affect continuity of care for patients with severe psychiatric disorders, probably because of a high functional integration of inpatient and outpatient care.
Myklebust, Lars Henrik; Sørgaard, Knut; Wynn, Rolf
2015-01-01
Objectives In the last few decades, there has been a restructuring of the psychiatric services in many countries. The complexity of these systems may represent a challenge to patients that suffer from serious psychiatric disorders. We examined whether local integration of inpatient and outpatient services in contrast to centralized institutions strengthened continuity of care. Methods Two different service-systems were compared. Service-utilization over a 4-year period for 690 inpatients was extracted from the patient registries. The results were controlled for demographic variables, model of service-system, central inpatient admission or local inpatient admission, diagnoses, and duration of inpatient stays. Results The majority of inpatients in the area with local integration of inpatient and outpatient services used both types of care. In the area that did not have beds locally, many patients that had been hospitalized did not receive outpatient follow-up. Predictors of inpatients’ use of outpatient psychiatric care were: Model of service-system (centralized vs decentralized), a diagnosis of affective disorder, central inpatient admission only, and duration of inpatient stays. Conclusion Psychiatric centers with local inpatient units may positively affect continuity of care for patients with severe psychiatric disorders, probably because of a high functional integration of inpatient and outpatient care. PMID:26604843
The extraction and integration framework: a two-process account of statistical learning.
Thiessen, Erik D; Kronstein, Alexandra T; Hufnagle, Daniel G
2013-07-01
The term statistical learning in infancy research originally referred to sensitivity to transitional probabilities. Subsequent research has demonstrated that statistical learning contributes to infant development in a wide array of domains. The range of statistical learning phenomena necessitates a broader view of the processes underlying statistical learning. Learners are sensitive to a much wider range of statistical information than the conditional relations indexed by transitional probabilities, including distributional and cue-based statistics. We propose a novel framework that unifies learning about all of these kinds of statistical structure. From our perspective, learning about conditional relations outputs discrete representations (such as words). Integration across these discrete representations yields sensitivity to cues and distributional information. To achieve sensitivity to all of these kinds of statistical structure, our framework combines processes that extract segments of the input with processes that compare across these extracted items. In this framework, the items extracted from the input serve as exemplars in long-term memory. The similarity structure of those exemplars in long-term memory leads to the discovery of cues and categorical structure, which guides subsequent extraction. The extraction and integration framework provides a way to explain sensitivity to both conditional statistical structure (such as transitional probabilities) and distributional statistical structure (such as item frequency and variability), and also a framework for thinking about how these different aspects of statistical learning influence each other. 2013 APA, all rights reserved
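As a concrete illustration of the conditional statistics discussed above, here is a minimal sketch computing forward transitional probabilities TP(y|x) = count(xy) / count(x) over a syllable stream; the toy stream is invented:

```python
from collections import Counter

def transitional_probabilities(stream):
    """Forward transitional probability TP(y|x) = count(xy) / count(x)."""
    unigrams = Counter(stream[:-1])
    bigrams = Counter(zip(stream, stream[1:]))
    return {(x, y): n / unigrams[x] for (x, y), n in bigrams.items()}

# Toy syllable stream: TPs are high within "words", low across word boundaries.
stream = "pa bi ku go la tu pa bi ku da ro pi go la tu".split()
for pair, tp in sorted(transitional_probabilities(stream).items()):
    print(pair, round(tp, 2))
```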
Saha, Dipankar; Dhar, Y R; Vittala, S S
2010-06-01
A part of the Gangetic Alluvial Plain covering 2,228 km², in the state of Bihar, is studied for demarcating groundwater development potential zones. The area is mainly agrarian and is experiencing intensive groundwater draft, to the tune of 0.12 million cubic metres per square kilometre per year, from the Quaternary marginal alluvial deposits, which unconformably overlie the northerly sloping Precambrian bedrock. Multiparametric data on groundwater comprising water level, hydraulic gradient (pre- and post-monsoon), aquifer thickness, permeability, suitability of groundwater for drinking and irrigation, and groundwater resources vs. draft are spatially analysed and integrated on a Geographical Information System platform to generate thematic layers. By integrating these layers, three zones have been delineated based on groundwater development potential. It is inferred that about 48% of the area, covering the northern part, has high development potential, while the medium and low development potential categories cover 41% of the area. Further increase in groundwater extraction is not recommended for an area of 173 km² affected by over-exploitation. The replenishable groundwater resource available for further extraction has been estimated. The development potential increases towards the north with increasing thickness of sediments. Local deviations are due to variation of: (1) cumulative thickness of aquifers, (2) deeper water levels resulting from localised heavy groundwater extraction and (3) aquifer permeability.
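The layer-integration step lends itself to a simple weighted-overlay sketch; the rasters, weights and class breaks below are placeholders, not the study's actual parameters:

```python
import numpy as np

# Thematic layers rescaled to 0..1 (placeholder 3x3 rasters).
aquifer_thickness = np.array([[0.9, 0.8, 0.7], [0.6, 0.5, 0.4], [0.3, 0.2, 0.1]])
water_level       = np.array([[0.8, 0.7, 0.7], [0.5, 0.5, 0.4], [0.3, 0.3, 0.2]])
permeability      = np.array([[0.7, 0.8, 0.6], [0.6, 0.4, 0.5], [0.2, 0.3, 0.1]])

weights = {"thickness": 0.4, "level": 0.3, "permeability": 0.3}  # assumed weights
score = (weights["thickness"] * aquifer_thickness
         + weights["level"] * water_level
         + weights["permeability"] * permeability)

# Classify into low / medium / high development potential zones.
zones = np.digitize(score, bins=[0.35, 0.65])   # 0=low, 1=medium, 2=high
print(zones)
```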
Image preprocessing study on KPCA-based face recognition
NASA Astrophysics Data System (ADS)
Li, Xuan; Li, Dehua
2015-12-01
Face recognition, as an important biometric identification method with friendly, natural and convenient advantages, has attracted increasing attention. This paper investigates a face recognition system comprising face detection, feature extraction and recognition, focusing on the key preprocessing technologies in the face detection process and on how different preprocessing methods affect recognition results when the KPCA method is used. We choose the YCbCr color space for skin segmentation and integral projection for face location. Face images are preprocessed with erosion and dilation (the opening and closing operations) and with an illumination compensation method, and are then analyzed with a recognition method based on kernel principal component analysis; experiments were carried out on a typical face database, with the algorithms implemented on the MATLAB platform. Experimental results show that, under certain conditions, the kernel-based extension of the PCA algorithm extracts nonlinear features that represent the original image information better, and thus yields a higher recognition rate. In the image preprocessing stage, we found that different operations on the images can produce different results, and hence different recognition rates in the recognition stage. At the same time, in kernel principal component analysis, the power of the polynomial kernel function affects the recognition result.
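A minimal sketch of the KPCA-based recognition step using scikit-learn, with a polynomial kernel whose degree is the "power" the abstract refers to; all names and parameter values are illustrative:

```python
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.neighbors import KNeighborsClassifier

def kpca_face_recognition(X_train, y_train, X_test, degree=3, n_components=50):
    """KPCA feature extraction with a polynomial kernel, then 1-NN matching.

    X_* hold row-vectors of preprocessed (cropped, illumination-compensated)
    face images; degree and n_components are illustrative choices."""
    kpca = KernelPCA(n_components=n_components, kernel="poly", degree=degree)
    Z_train = kpca.fit_transform(X_train)
    Z_test = kpca.transform(X_test)
    clf = KNeighborsClassifier(n_neighbors=1).fit(Z_train, y_train)
    return clf.predict(Z_test)
```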
NASA Astrophysics Data System (ADS)
Oschepkova, Elena; Vasinskaya, Irina; Sockoluck, Irina
2017-11-01
In view of the changing educational paradigm (the adoption of a two-tier system of higher education: undergraduate and graduate programs), there arises a need to use modern learning and information and communication technologies, putting learner-centered approaches into practice in the training of highly qualified specialists for enterprises that extract and process solid commercial minerals. Facing an unstable market demand situation and a changeable institutional environment on one side, and the necessity of balancing work, supply conditions and product quality as mining-and-geological parameters change on the other, mining enterprises have to introduce and develop integrated management of product, information and logistic flows under a unified management system. One of the main limitations holding back this development at Russian mining enterprises is staff incompetence at all levels of logistics management. Under present-day conditions, enterprises extracting and processing solid commercial minerals need highly qualified specialists who can conduct self-directed research and develop new, or improve existing, technologies for arranging, planning and managing the technical operation and commercial exploitation of transport, transportation and processing facilities based on logistics. The learner-centered approach and the individualization of the learning process necessitate the design of an individual learning route (ILR), which can help students realize their professional capabilities according to the requirements for specialists at such enterprises.
Aquifer Thermal Energy Storage for Seasonal Thermal Energy Balance
NASA Astrophysics Data System (ADS)
Rostampour, Vahab; Bloemendal, Martin; Keviczky, Tamas
2017-04-01
Aquifer Thermal Energy Storage (ATES) systems allow storing large quantities of thermal energy in subsurface aquifers, enabling significant energy savings and greenhouse gas reductions. This is achieved by simultaneous injection and extraction of water into and from saturated underground aquifers. An ATES system consists of two wells and operates in a seasonal mode. One well is used for the storage of cold water, the other for the storage of heat. In warm seasons, cold water is extracted from the cold well to provide cooling to a building. The temperature of the extracted cold water increases as it passes through the building climate control systems, and the water is then injected back into the warm well. In cold seasons the flow direction is reversed, such that the warmer water is extracted from the warm well to provide heating to a building. From the perspective of building climate comfort systems, an ATES system is a seasonal storage system that can act as a heat source or sink, or as a store of thermal energy. This leads to an interesting and challenging optimal control problem for the building climate comfort system that can be used to develop a seasonal energy management strategy. In [1] we developed a control-oriented model to predict the thermal energy balance in a building climate control system integrated with ATES. Such a model, however, cannot cope with off-nominal but realistic situations, such as completely depleted wells or the start-up phase of newly installed wells, which lead to direct usage of the aquifer ambient temperature. Building upon our previous work in [1], we here extend the mathematical model of the ATES system to handle the more realistic situations mentioned above. Using our improved models, one can more precisely predict system behavior and apply optimal control strategies to manage building climate comfort along with energy savings and greenhouse gas reductions. [1] V. Rostampour and T. Keviczky, "Probabilistic Energy Management for Building Climate Comfort in Smart Thermal Grids with Seasonal Storage Systems," arXiv [math.OC], 10-Nov-2016.
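A lumped, control-oriented sketch of the seasonal energy balance described above, with a fallback to the aquifer ambient temperature when a well is depleted; this is our illustration, not the model of [1]:

```python
def ates_step(stored_energy, q_extract, q_inject, dt, t_well, t_ambient):
    """One time step of a lumped ATES energy balance (illustrative).

    stored_energy: thermal energy in the warm (or cold) well [J]
    q_extract, q_inject: extraction/injection thermal power [W]
    Returns the updated energy and the supply temperature seen by the building."""
    new_energy = stored_energy + (q_inject - q_extract) * dt
    if new_energy <= 0.0:            # well depleted (or start-up phase):
        new_energy = 0.0             # fall back to aquifer ambient temperature
        supply_temperature = t_ambient
    else:
        supply_temperature = t_well
    return new_energy, supply_temperature
```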
Molino, João Vitor Dutra; Lopes, André Moreni; Viana Marques, Daniela de Araújo; Mazzola, Priscila Gava; da Silva, Joas Lucas; Hirata, Mario Hiroyuki; Hirata, Rosário Dominguez Crespo; Gatti, Maria Silvia Viccari; Pessoa, Adalberto
2017-12-04
Viral vectors are important in medical approaches, such as disease prevention and gene therapy, and their production depends on efficient prepurification steps. In the present study, an aqueous two-phase micellar system (ATPMS) was evaluated to extract human adenovirus type 5 particles from a cell lysate. Adenovirus was cultured in human embryonic kidney 293 (HEK-293) cells to a concentration of 1.4 × 10¹⁰ particles/mL. Cells were lysed, and the system was formed by direct addition of Triton X-114 in a 2³ full factorial design with center points. The systems were formed with Triton X-114 at a final concentration of 1.0, 6.0, and 11.0% (w/w), a cell lysate pH of 6.0, 6.5, and 7.0, and incubation temperatures of 33, 35, and 37 °C. Adenovirus particles recovered from the partition phases were measured by qPCR. The best system condition was 11.0% (w/w) Triton X-114, a cell lysate pH of 7.0, and an incubation temperature of 33 °C, yielding 3.51 × 10¹⁰ adenovirus particles/mL, which increased the initial adenovirus particle concentration by 2.3-fold, purified it by 2.2-fold from the cell lysate, and removed cell debris. In conclusion, these results demonstrated that the use of an aqueous two-phase micellar system in the early steps of downstream processing could improve viral particle extraction from cultured cells while integrating clarification, concentration, and prepurification steps. © 2017 International Union of Biochemistry and Molecular Biology, Inc.
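The 2³ full factorial design with center points used above can be enumerated mechanically; a short sketch with the reported factor levels:

```python
from itertools import product

# Factor levels from the study: Triton X-114 % (w/w), lysate pH, temperature (C).
low    = {"triton": 1.0,  "pH": 6.0, "temp": 33}
high   = {"triton": 11.0, "pH": 7.0, "temp": 37}
center = {"triton": 6.0,  "pH": 6.5, "temp": 35}

# 2^3 = 8 corner runs plus the center point.
runs = [dict(zip(low, combo)) for combo in product(*zip(low.values(), high.values()))]
runs.append(center)
for run in runs:
    print(run)
```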
Determination of diagnostic standards on saturated soil extracts for cut roses grown in greenhouses.
Franco-Hermida, John Jairo; Quintero, María Fernanda; Cabrera, Raúl Iskander; Guzman, José Miguel
2017-01-01
This work comprises the theoretical determination and validation of diagnostic standards for the analysis of saturated soil extracts for cut rose flower crops (Rosa spp.) growing on the Bogota Plateau, Colombia. The data included 684 plant tissue analyses and 684 corresponding analyses of saturated soil extracts, all collected between January 2009 and June 2013. The tissue and soil samples were selected from 13 rose farms and from cultivars grafted on the 'Natal Briar' rootstock. These concurrent samples of soil and plant tissues represented 251 production units (locations) of approximately 10,000 m² distributed across the study area. The standards were conceived as a tool to improve the nutritional balance in the leaf tissue of rose plants and thereby define the norms for expressing optimum productive potential relative to nutritional conditions in the soil. To this end, previously determined diagnostic standards for rose leaf tissues were employed to obtain rates of foliar nutritional balance at each analyzed location and as criteria for determining the diagnostic norms for saturated soil extracts. Applying this methodology to the foliar analyses showed higher, significant correlations for the diagnostic indices. Similar behavior was observed for the saturated soil extract analyses, making them a powerful tool for integrated nutritional diagnosis. Leaf analyses determine the nutrients most limiting for high yield, and analyses of saturated soil extracts make it possible to correct the fertigation formulations applied to soils or substrates. Recommendations are proposed to improve the balance in the soil-plant system so that yield increases become more probable. The main recommendations to increase and improve rose crop flower yields are: continuously check the pH values of the saturated soil extracts, reduce the amounts of P, Fe, Zn and Cu in fertigation solutions, and carefully analyze the situation of Mn in the soil-plant system.
An advanced concept secondary power systems study for an advanced transport technology aircraft
NASA Technical Reports Server (NTRS)
1972-01-01
The application of advanced technology to the design of an integrated secondary power system for future near-sonic long-range transports was investigated. The study showed that the highest payoff is achieved by utilizing secondary power equipment that contributes to minimum cruise drag. This is best accomplished by the use of the dedicated auxiliary power unit concept (inflight APU) as the prime power source for an airplane with a body-mounted engine or by the use of the internal engine generator concept (electrical power extraction from the propulsion engine) for an airplane with a wing-pod-mounted engine.
Nano-Enriched and Autonomous Sensing Framework for Dissolved Oxygen.
Shehata, Nader; Azab, Mohammed; Kandas, Ishac; Meehan, Kathleen
2015-08-14
This paper investigates a nano-enhanced wireless sensing framework for dissolved oxygen (DO). The system integrates a nanosensor that employs cerium oxide (ceria) nanoparticles to monitor the concentration of DO in aqueous media via optical fluorescence quenching. We propose a comprehensive sensing framework with the nanosensor equipped with a digital interface where the sensor output is digitized and dispatched wirelessly to a trustworthy data collection and analysis framework for consolidation and information extraction. The proposed system collects and processes the sensor readings to provide clear indications about the current or the anticipated dissolved oxygen levels in the aqueous media.
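Fluorescence quenching is commonly inverted through a Stern-Volmer relation, I0/I = 1 + Ksv[O2]; the abstract does not state the calibration model, so the sketch below, including the constant, is an assumption:

```python
def dissolved_oxygen_mg_per_l(i0, i, k_sv=0.12):
    """Invert a Stern-Volmer quenching model (assumed, illustrative k_sv).

    i0: fluorescence intensity at zero dissolved oxygen
    i:  measured intensity
    Returns the dissolved-oxygen concentration [mg/L]."""
    return (i0 / i - 1.0) / k_sv

print(dissolved_oxygen_mg_per_l(i0=1000.0, i=700.0))  # ~3.57 mg/L
```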
NASA Astrophysics Data System (ADS)
Torjesen, Alyssa; Istfan, Raeef; Roblyer, Darren
2017-03-01
Frequency-domain diffuse optical spectroscopy (FD-DOS) utilizes intensity-modulated light to characterize optical scattering and absorption in thick tissue. Previous FD-DOS systems have been limited by large device footprints, complex electronics, high costs, and limited acquisition speeds, all of which complicate access to patients in the clinical setting. We have developed a new digital DOS (dDOS) system, which is relatively compact and inexpensive, allowing for simplified clinical use, while providing unprecedented measurement speeds. The dDOS system utilizes hardware-integrated custom board-level direct digital synthesizers and an analog-to-digital converter to generate frequency sweeps and directly measure signals utilizing undersampling at six wavelengths modulated at discrete frequencies from 50 to 400 MHz. Wavelength multiplexing is utilized to achieve broadband frequency sweep measurements acquired at over 97 Hz. When compared to a gold-standard DOS system, the accuracy of optical properties recovered with the dDOS system was within 5.3% and 5.5% for absorption and reduced scattering coefficient extractions, respectively. When tested in vivo, the dDOS system was able to detect physiological changes throughout the cardiac cycle. The new FD-dDOS system is fast, inexpensive, and compact without compromising measurement quality.
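Undersampling maps each RF modulation frequency to a low apparent frequency; a short sketch of the alias calculation, with an assumed ADC rate since the abstract does not give one:

```python
def alias_frequency(f_signal, f_sample):
    """Apparent frequency of f_signal when sampled below the Nyquist rate."""
    f_folded = f_signal % f_sample
    return min(f_folded, f_sample - f_folded)

f_adc = 61.44e6                    # assumed ADC sample rate, for illustration only
for f_mod in (50e6, 150e6, 250e6, 400e6):   # modulation frequencies from the paper
    print(f"{f_mod/1e6:5.0f} MHz -> {alias_frequency(f_mod, f_adc)/1e6:6.2f} MHz")
```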
Park, Wonse; Choi, Ji-Wook; Kim, Jae-Young; Kim, Bong-Chul; Kim, Hyung Jun; Lee, Sang-Hwy
2010-03-01
Paresthesia is a well-known complication of extraction of mandibular third molars (MTMs). The authors evaluated the relationship between paresthesia after MTM extraction and the cortical integrity of the inferior alveolar canal (IAC) by using computed tomography (CT). The authors designed a retrospective cohort study involving participants considered, on the basis of panoramic imaging, to be at high risk of experiencing injury of the inferior alveolar nerve who subsequently underwent CT imaging and extraction of the MTMs. The primary predictor variable was the contact relationship between the IAC and the MTM as viewed on a CT image, classified into three groups: group 1, no contact; group 2, contact between the MTM and the intact IAC cortex; group 3, contact between the MTM and the interrupted IAC cortex. The secondary predictor variable was the number of CT image slices showing the cortical interruption around the MTM. The outcome variable was the presence or absence of postoperative paresthesia after MTM extraction. The study sample comprised 179 participants who underwent MTM extraction (a total of 259 MTMs). Their mean age was 23.6 years, and 85 (47.5 percent) were male. The overall prevalence of paresthesia was 4.2 percent (11 of 259 teeth). The prevalence of paresthesia in group 3 (involving an interrupted IAC cortex) was 11.8 percent (10 of 85 cases), while for group 2 (involving an intact IAC cortex) and group 1 (involving no contact) it was 1.0 percent (1 of 98 cases) and 0.0 percent (no cases), respectively. The frequency of nerve damage increased with the number of CT image slices showing loss of cortical integrity (P=.043). The results of this study indicate that loss of IAC cortical integrity is associated with an increased risk of experiencing paresthesia after MTM extraction.
Integrating visual learning within a model-based ATR system
NASA Astrophysics Data System (ADS)
Carlotto, Mark; Nebrich, Mark
2017-05-01
Automatic target recognition (ATR) systems, like human photo-interpreters, rely on a variety of visual information for detecting, classifying, and identifying manmade objects in aerial imagery. We describe the integration of a visual learning component into the Image Data Conditioner (IDC) for target/clutter and other visual classification tasks. The component is based on an implementation of a model of the visual cortex developed by Serre, Wolf, and Poggio. Visual learning in an ATR context requires the ability to recognize objects independent of location, scale, and rotation. Our method uses IDC to extract, rotate, and scale image chips at candidate target locations. A bootstrap learning method effectively extends the operation of the classifier beyond the training set and provides a measure of confidence. We show how the classifier can be used to learn other features that are difficult to compute from imagery such as target direction, and to assess the performance of the visual learning process itself.
Face biometrics with renewable templates
NASA Astrophysics Data System (ADS)
van der Veen, Michiel; Kevenaar, Tom; Schrijen, Geert-Jan; Akkermans, Ton H.; Zuo, Fei
2006-02-01
In recent literature, privacy protection technologies for biometric templates were proposed. Among these is the so-called helper-data system (HDS) based on reliable component selection. In this paper we integrate this approach with face biometrics such that we achieve a system in which the templates are privacy protected, and multiple templates can be derived from the same facial image for the purpose of template renewability. Extracting binary feature vectors forms an essential step in this process. Using the FERET and Caltech databases, we show that this quantization step does not significantly degrade the classification performance compared to, for example, traditional correlation-based classifiers. The binary feature vectors are integrated in the HDS leading to a privacy protected facial recognition algorithm with acceptable FAR and FRR, provided that the intra-class variation is sufficiently small. This suggests that a controlled enrollment procedure with a sufficient number of enrollment measurements is required.
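The binary-extraction step can be sketched as thresholding each feature at its population mean and keeping only the components that lie far from the threshold, in the spirit of reliable component selection; the details are illustrative, not the exact HDS construction:

```python
import numpy as np

def binarize_reliable(features, population_mean, n_reliable=64):
    """Binary feature vector via thresholding at the population mean.

    Components whose enrollment value lies farthest from the threshold are the
    most robust to intra-class variation; their indices form the (non-secret)
    helper data. Illustrative sketch only."""
    margins = np.abs(features - population_mean)
    reliable_idx = np.argsort(margins)[-n_reliable:]         # helper data
    bits = (features[reliable_idx] > population_mean[reliable_idx]).astype(int)
    return bits, reliable_idx
```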
Surveillance of occupational noise exposures using OSHA's Integrated Management Information System.
Middendorf, Paul J
2004-11-01
Exposure to noise has long been known to cause hearing loss and is a ubiquitous problem in workplaces. Occupational noise exposures for industries stored in the Occupational Safety and Health Administration's (OSHA) Integrated Management Information System (IMIS) can be used to identify temporal and industrial trends of noise exposure and to anticipate changes in rates of hearing loss. The noise records in OSHA's IMIS database for 1979-1999 were extracted by major industry division and measurement criteria. The noise exposures were summarized by year, industry, and employment size. The majority of records are from Manufacturing and Services. Exposures in Manufacturing and Services decreased during the period, except that PEL exposures measured by federal enforcement increased from 1995 to 1999. Noise exposures in manufacturing have been reduced since the late 1970s, except those documented by federal enforcement. Noise exposure data outside manufacturing are not well represented in IMIS. Copyright 2004 Wiley-Liss, Inc.
Non-invasive lightweight integration engine for building EHR from autonomous distributed systems.
Angulo, Carlos; Crespo, Pere; Maldonado, José A; Moner, David; Pérez, Daniel; Abad, Irene; Mandingorra, Jesús; Robles, Montserrat
2007-12-01
In this paper we describe Pangea-LE, a message-oriented lightweight data integration engine that allows homogeneous and concurrent access to clinical information from disperse and heterogeneous data sources. The engine extracts the information and passes it to the requesting client applications in a flexible XML format. The XML response message can be formatted on demand by appropriate Extensible Stylesheet Language (XSL) transformations in order to meet the needs of client applications. We also present a real deployment in a hospital where Pangea-LE collects and generates an XML view of all the available patient clinical information. The information is presented to healthcare professionals in an Electronic Health Record (EHR) viewer Web application with patient search and EHR browsing capabilities. Deployment in a real setting has been a success due to the non-invasive nature of Pangea-LE, which respects the existing information systems.
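The on-demand XSL formatting step looks roughly like this with the lxml library; the file names are placeholders:

```python
from lxml import etree

# Placeholder file names: the XML view produced by the integration engine and a
# client-specific stylesheet.
doc = etree.parse("patient_record.xml")
transform = etree.XSLT(etree.parse("ehr_viewer.xsl"))
result = transform(doc)            # XML reshaped to fit the client application
print(str(result))
```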
Non-invasive light-weight integration engine for building EHR from autonomous distributed systems.
Crespo Molina, Pere; Angulo Fernández, Carlos; Maldonado Segura, José A; Moner Cano, David; Robles Viejo, Montserrat
2006-01-01
Pangea-LE is a message-oriented lightweight integration engine allowing concurrent access to clinical information from disperse and heterogeneous data sources. The engine extracts the information and serves it to the requesting client applications in a flexible XML format. This XML response message can be formatted on demand by an appropriate XSL (Extensible Stylesheet Language) transformation in order to fit client application needs. In this article we present a real use case in which Pangea-LE collects and generates, on the fly, a structured view of all the patient clinical information available in a healthcare organisation. This information is presented to healthcare professionals in an EHR (Electronic Health Record) viewer Web application with patient search and EHR browsing capabilities. Deployment in a real environment has been a notable success due to the non-invasive method, which fully respects the existing information systems.
Model of experts for decision support in the diagnosis of leukemia patients.
Corchado, Juan M; De Paz, Juan F; Rodríguez, Sara; Bajo, Javier
2009-07-01
Recent advances in the field of biomedicine, specifically in the field of genomics, have led to an increase in the information available for conducting expression analysis. Expression analysis is a technique used in transcriptomics, a branch of genomics that deals with the study of messenger ribonucleic acid (mRNA) and the extraction of information contained in the genes. This increase in information is reflected in the exon arrays, which require the use of new techniques in order to extract the information. The purpose of this study is to provide a tool, based on a mixture of experts model, that allows the analysis of the information contained in the exon arrays, from which automatic classifications for decision support in diagnoses of leukemia patients can be made. The proposed model integrates several cooperative algorithms characterized by their efficiency in data processing, filtering, classification and knowledge extraction. The Cancer Institute of the University of Salamanca is making an effort to develop tools to automate the evaluation of data and to facilitate the analysis of information. This proposal is a step forward in this direction and the first step toward the development of a mixture of experts tool that integrates different cognitive and statistical approaches to deal with the analysis of exon arrays. The mixture of experts model presented within this work provides great capacities for learning and adaptation to the characteristics of the problem in consideration, using novel algorithms in each of the stages of the analysis process that can be easily configured and combined, and provides results that notably improve those provided by the existing methods for exon array analysis. The material used consists of data from exon arrays provided by the Cancer Institute that contain samples from leukemia patients. The methodology used consists of a system based on a mixture of experts, each of which incorporates novel artificial intelligence techniques that improve the process of carrying out tasks such as pre-processing, filtering, classification and extraction of knowledge. This article details the manner in which the individual experts are combined so that together they generate a system capable of extracting knowledge, thus permitting patients to be classified in an automatic and efficient manner that is also comprehensible for medical personnel. The system has been tested in a real setting and has been used for classifying patients who suffer from different forms of leukemia at various stages. Personnel from the Cancer Institute supervised and participated throughout the testing period. Preliminary results are promising, notably improving the results obtained with previously used tools. The medical staff from the Cancer Institute consider the tools that have been developed to be positive and very useful in a supporting capacity for carrying out their daily tasks. Additionally, the mixture of experts supplies a tool for extracting the information necessary to explain the associations that have been made in simple terms. That is, it permits the extraction of knowledge for each classification made, which can be generalized for use in subsequent classifications. This allows for a large amount of learning and adaptation within the proposed system.
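A minimal sketch of the mixture-of-experts idea, combining heterogeneous classifiers after a shared filtering step and weighting their votes; the experts and weights are illustrative stand-ins for the cooperative algorithms described above:

```python
import numpy as np
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.neighbors import KNeighborsClassifier
from sklearn.pipeline import make_pipeline
from sklearn.svm import SVC
from sklearn.tree import DecisionTreeClassifier

def mixture_of_experts(X_train, y_train, X_test, weights=(0.4, 0.3, 0.3)):
    """Weighted vote over heterogeneous experts (illustrative configuration)."""
    experts = [
        make_pipeline(SelectKBest(f_classif, k=100), SVC(probability=True)),
        make_pipeline(SelectKBest(f_classif, k=100), KNeighborsClassifier()),
        make_pipeline(SelectKBest(f_classif, k=100), DecisionTreeClassifier()),
    ]
    votes = sum(w * e.fit(X_train, y_train).predict_proba(X_test)
                for w, e in zip(weights, experts))
    return np.argmax(votes, axis=1)   # indices into the shared sorted class labels
```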
A new paradigm for reproducing and analyzing N-body simulations of planetary systems
NASA Astrophysics Data System (ADS)
Rein, Hanno; Tamayo, Daniel
2017-05-01
The reproducibility of experiments is one of the main principles of the scientific method. However, numerical N-body experiments, especially those of planetary systems, are currently not reproducible. In the most optimistic scenario, they can only be replicated in an approximate or statistical sense. Even if authors share their full source code and initial conditions, differences in compilers, libraries, operating systems or hardware often lead to qualitatively different results. We provide a new set of easy-to-use, open-source tools that address the above issues, allowing for exact (bit-by-bit) reproducibility of N-body experiments. In addition to generating completely reproducible integrations, we show that our framework also offers novel and innovative ways to analyse these simulations. As an example, we present a high-accuracy integration of the Solar system spanning 10 Gyr, requiring several weeks to run on a modern CPU. In our framework, we can not only easily access simulation data at predefined intervals for which we save snapshots, but at any time during the integration. We achieve this by integrating an on-demand reconstructed simulation forward in time from the nearest snapshot. This allows us to extract arbitrary quantities at any point in the saved simulation exactly (bit-by-bit), and within seconds rather than weeks. We believe that the tools we present in this paper offer a new paradigm for how N-body simulations are run, analysed and shared across the community.
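The on-demand reconstruction idea can be sketched independently of any particular N-body package: keep snapshots at coarse intervals, then re-integrate deterministically from the nearest earlier snapshot; integrate_fixed_step stands in for a bit-reproducible fixed-step integrator:

```python
def get_state(t, snapshots, integrate_fixed_step, dt):
    """Reconstruct the simulation state at an arbitrary time t.

    snapshots: {time: state} saved at coarse intervals during the long run
    integrate_fixed_step: deterministic one-step integrator; replaying the
    exact same steps from the same snapshot reproduces the state bit-by-bit."""
    t0 = max(s for s in snapshots if s <= t)
    state = snapshots[t0]
    n_steps = round((t - t0) / dt)      # replay the original fixed step count
    for _ in range(n_steps):
        state = integrate_fixed_step(state, dt)
    return state
```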
About Distributed Simulation-based Optimization of Forming Processes using a Grid Architecture
NASA Astrophysics Data System (ADS)
Grauer, Manfred; Barth, Thomas
2004-06-01
Continually increasing complexity of products and their manufacturing processes, combined with a shorter "time-to-market", leads to ever greater use of simulation and optimization software systems in product design. Finding a "good" design of a product implies the solution of computationally expensive optimization problems based on the results of simulation. Due to the computational load caused by the solution of these problems, the requirements on the Information & Telecommunication (IT) infrastructure of an enterprise or research facility are shifting from stand-alone resources towards the integration of software and hardware resources in a distributed environment for high-performance computing. Resources can comprise software systems, hardware systems, or communication networks. An appropriate IT infrastructure must provide the means to integrate all these resources and enable their use across a network to cope with requirements from geographically distributed scenarios, e.g. in computational engineering and/or collaborative engineering. Integrating experts' knowledge into the optimization process is inevitable in order to reduce the complexity caused by the number of design variables and the high dimensionality of the design space. Hence, the utilization of knowledge-based systems must be supported by providing data management facilities as a basis for knowledge extraction from product data. In this paper, the focus is put on a distributed problem solving environment (PSE) capable of providing access to a variety of necessary resources and services. A distributed approach integrating simulation and optimization on a network of workstations and cluster systems is presented. For geometry generation the CAD system CATIA is used, coupled with the FEM simulation system INDEED for the simulation of sheet-metal forming processes and with the problem solving environment OpTiX for distributed optimization.
Damiani, Natalia; Fernández, Natalia J; Porrini, Martín P; Gende, Liesel B; Álvarez, Estefanía; Buffa, Franco; Brasesco, Constanza; Maggi, Matías D; Marcangeli, Jorge A; Eguaras, Martín J
2014-02-01
A diverse set of parasites and pathogens affects the productivity and survival of Apis mellifera honeybees. In beekeeping, traditional control by antibiotics and synthetic molecules has caused problems with contamination and resistant pathogens. In this research, different Laurus nobilis extracts are tested against the main honeybee pests from an integrated point of view. In vivo effects on bee survival are also evaluated. The ethanol extract showed minimal inhibitory concentration (MIC) values of 208 to 416 μg/mL, having the best antimicrobial effect on Paenibacillus larvae among all substances tested. Similarly, this leaf extract showed significant antiparasitic activity on Varroa destructor, killing 50% of mites 24 h after a 30-s exposure, and on Nosema ceranae, inhibiting spore development in the midgut of adult bees ingesting 1 × 10⁴ μg/mL of extract solution. Neither the ethanol extract nor the volatile extracts (essential oil, hydrolate, and its main component) caused lethal effects on adult honeybees. Thus, the absence of topical and oral toxicity of the ethanol extract on bees and the strong antimicrobial, microsporicidal, and miticidal effects recorded in this study make this laurel extract a promising component of integrated treatment of bee diseases and stimulate the search for other bioactive phytochemicals from plants.
30 CFR 700.11 - Applicability.
Code of Federal Regulations, 2011 CFR
2011-07-01
... all coal exploration and surface coal mining and reclamation operations, except: (1) The extraction of ... Noncommercial use does not include the extraction of coal by one unit of an integrated company or other business or nonprofit entity which uses the coal in its own manufacturing or power plants; (2) The extraction ...
30 CFR 700.11 - Applicability.
Code of Federal Regulations, 2010 CFR
2010-07-01
... all coal exploration and surface coal mining and reclamation operations, except: (1) The extraction of ... Noncommercial use does not include the extraction of coal by one unit of an integrated company or other business or nonprofit entity which uses the coal in its own manufacturing or power plants; (2) The extraction ...
Yuan, Naiming; Fu, Zuntao; Liu, Shida
2014-01-01
Long-term memory (LTM) in climate variability is studied by means of fractional integral techniques. Using a recently developed model, the Fractional Integral Statistical Model (FISM), we propose in this report a new method with which one can quantitatively estimate the long-lasting influence of historical climate states on the present, and further extract this influence as climate memory signals. To show the usability of this method, two examples, the Northern Hemisphere monthly Temperature Anomalies (NHTA) and the Pacific Decadal Oscillation index (PDO), are analyzed in this study. We find that the climate memory signals can indeed be extracted and the whole variation can be decomposed into two parts: the cumulative climate memory (CCM) and the weather-scale excitation (WSE). The stronger the LTM, the larger the proportion of the whole variation that the climate memory signals account for. With the climate memory signals extracted, one can at least determine on what basis the considered time series will continue to change. Therefore, this report provides a new perspective on climate prediction. PMID:25300777
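A sketch of the fractional-integration idea behind such a decomposition: past noise is accumulated with weights Γ(k+q)/(Γ(q)Γ(k+1)) for a memory parameter q, splitting the series into a cumulative memory part and a same-step excitation; the parameter value is illustrative and this is our reading of the approach, not the exact FISM algorithm:

```python
import numpy as np
from scipy.special import gammaln

def fractional_integral_series(noise, q=0.3):
    """x_t = sum_k w_k * noise_{t-k} with w_k = Gamma(k+q)/(Gamma(q)Gamma(k+1)).

    The k=0 term is the weather-scale excitation (WSE); the k>0 terms form the
    cumulative climate memory (CCM). q in (0, 0.5) gives long-term memory."""
    n = len(noise)
    k = np.arange(n)
    w = np.exp(gammaln(k + q) - gammaln(q) - gammaln(k + 1))   # w[0] == 1
    x = np.array([np.dot(w[:t + 1][::-1], noise[:t + 1]) for t in range(n)])
    ccm = x - w[0] * noise              # memory signal carried from the past
    return x, ccm

x, ccm = fractional_integral_series(np.random.default_rng(0).normal(size=500))
```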
A new approach to the extraction of single exponential diode model parameters
NASA Astrophysics Data System (ADS)
Ortiz-Conde, Adelmo; García-Sánchez, Francisco J.
2018-06-01
A new integration method is presented for the extraction of the parameters of a single exponential diode model with series resistance from the measured forward I-V characteristics. The extraction is performed using auxiliary functions, based on the integration of the data, which allow the effects of each of the model parameters to be isolated. A differentiation method is also presented for data with a low level of experimental noise. Measured and simulated data are used to verify the applicability of both proposed methods. Physical insight into the validity of the model is also obtained by using the proposed graphical determinations of the parameters.
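For comparison, a generic least-squares sketch of extracting the three parameters (n, Is, Rs) of the single exponential model V = n·Vt·ln(I/Is) + I·Rs from forward I-V data; this is not the paper's integration-based auxiliary-function method:

```python
import numpy as np
from scipy.optimize import curve_fit

VT = 0.02585  # thermal voltage at room temperature [V]

def diode_voltage(i, n, i_s, r_s):
    """Single exponential diode model with series resistance, V as f(I)."""
    return n * VT * np.log(i / i_s) + i * r_s

def extract_parameters(i_data, v_data):
    """Fit n, Is, Rs to measured forward I-V data (amps, volts)."""
    p0 = (1.5, 1e-12, 1.0)              # initial guess: n, Is [A], Rs [ohm]
    (n, i_s, r_s), _ = curve_fit(diode_voltage, i_data, v_data, p0=p0)
    return n, i_s, r_s
```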
Reverse engineering biomolecular systems using -omic data: challenges, progress and opportunities.
Quo, Chang F; Kaddi, Chanchala; Phan, John H; Zollanvari, Amin; Xu, Mingqing; Wang, May D; Alterovitz, Gil
2012-07-01
Recent advances in high-throughput biotechnologies have led to rapidly growing research interest in reverse engineering of biomolecular systems (REBMS). 'Data-driven' approaches, i.e. data mining, can be used to extract patterns from large volumes of biochemical data at molecular-level resolution, while 'design-driven' approaches, i.e. systems modeling, can be used to simulate emergent system properties. Consequently, both data- and design-driven approaches applied to -omic data may lead to novel insights in reverse engineering biological systems that could not be expected before using low-throughput platforms. However, several challenges exist in this fast-growing field of reverse engineering biomolecular systems: (i) to integrate heterogeneous biochemical data for data mining, (ii) to combine top-down and bottom-up approaches for systems modeling and (iii) to validate system models experimentally. In addition to reviewing progress made by the community and opportunities encountered in addressing these challenges, we explore the emerging field of synthetic biology, which is an exciting approach to validate and analyze theoretical system models directly through experimental synthesis, i.e. analysis-by-synthesis. The ultimate goal is to address the present and future challenges in reverse engineering biomolecular systems (REBMS) using an integrated workflow of data mining, systems modeling and synthetic biology.
NASA Astrophysics Data System (ADS)
Guo, Lijuan; Yan, Haijun; Hao, Yongqi; Chen, Yun
2018-01-01
As urban power grids develop toward high-reliability power supply, it is necessary to adopt appropriate methods for the comprehensive evaluation of existing equipment. Given the breadth and multi-dimensionality of power system data, big data mining is used to explore the potential laws and value hidden in power system equipment data. Based on main transformer monitoring data and records of defects and faults, this paper integrates data on the power grid equipment environment. Apriori is used as the association identification algorithm to extract the frequent correlation factors of the main transformer, and the potential dependencies in the big data are analyzed through support and confidence. The integrated data are then analyzed by PCA, and an integrated quantitative scoring model is constructed. The evaluation algorithm and scheme are shown to be effective on a validation test set. This paper provides a new idea for data fusion in the smart grid and a reference for further evaluation of power grid equipment big data.
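A compact sketch of the association-identification step: frequent itemset counting over event "transactions" (e.g. co-occurring monitoring alarms, defect records and environmental conditions), with support and confidence thresholds; the data and thresholds are invented:

```python
from itertools import combinations

def apriori_rules(transactions, min_support=0.3, min_confidence=0.7):
    """Mine pairwise association rules with support and confidence."""
    n = len(transactions)
    items = {i for t in transactions for i in t}
    support = {}
    for size in (1, 2):
        for itemset in combinations(sorted(items), size):
            count = sum(1 for t in transactions if set(itemset) <= t)
            if count / n >= min_support:
                support[itemset] = count / n
    rules = []
    for (a, b), s in [kv for kv in support.items() if len(kv[0]) == 2]:
        for x, y in ((a, b), (b, a)):
            conf = s / support[(x,)]
            if conf >= min_confidence:
                rules.append((x, y, s, conf))
    return rules

# Toy transactions: each is a set of co-occurring conditions/defect codes.
tx = [{"high_temp", "oil_gas"}, {"high_temp", "oil_gas", "humid"}, {"humid"},
      {"high_temp", "oil_gas"}, {"oil_gas"}]
print(apriori_rules(tx))   # e.g. oil_gas -> high_temp with support 0.6, conf 0.75
```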
Conceptual Design of a 100kW Energy Integrated Type Bi-Directional Tidal Current Turbine
NASA Astrophysics Data System (ADS)
Kim, Ki Pyoung; Ahmed, M. Rafiuddin; Lee, Young Ho
2010-06-01
The development of a tidal current turbine that can extract maximum energy from the tidal current would be extremely beneficial for supplying continuous electric power. The present paper presents a conceptual design of a 100 kW energy integrated type tidal current turbine for tidal power generation. The instantaneous power density of a flowing fluid incident on an underwater turbine is proportional to the cube of the current velocity, which is approximately 2.5 m/s here. A cross-flow turbine, provided with a nozzle and a diffuser, is designed and analyzed. The potential advantages of ducted and diffuser-augmented turbines were taken into consideration in order to achieve higher output at a relatively low speed. This study looks at a cross-flow turbine system that is placed in an augmentation channel to generate electricity bi-directionally. The compatibility of this turbine system is verified using a commercial CFD code, ANSYS CFX. This paper presents the results of the numerical analysis in terms of pressure, streaklines, velocity vectors and performance curves for the energy integrated type bi-directional tidal current turbine (BDT) with augmentation.
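The cubic dependence makes the available power easy to estimate; a worked sketch with seawater density and an assumed power coefficient, since the abstract gives only the design velocity and the 100 kW rating:

```python
RHO_SEAWATER = 1025.0   # kg/m^3

def tidal_power_w(area_m2, velocity_ms, cp=0.35):
    """Extractable power: P = Cp * 0.5 * rho * A * v^3 (Cp is assumed)."""
    return cp * 0.5 * RHO_SEAWATER * area_m2 * velocity_ms**3

v = 2.5                                            # design current velocity [m/s]
print(0.5 * RHO_SEAWATER * v**3)                   # ~8.0 kW/m^2 incident power density
print(100e3 / (0.35 * 0.5 * RHO_SEAWATER * v**3))  # ~36 m^2 capture area for 100 kW
print(tidal_power_w(36.0, v))                      # ~100 kW sanity check
```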
NASA Astrophysics Data System (ADS)
Thomopoulos, Stelios C. A.; Kyriazanos, Dimitris M.; Astyakopoulos, Alkiviadis; Dimitros, Kostantinos; Margonis, Christos; Thanos, Giorgos Konstantinos; Skroumpelou, Katerina
2016-05-01
AF3 (Advanced Forest Fire Fighting) is a European FP7 research project that intends to improve the efficiency of current fire-fighting operations and the protection of human lives, the environment and property by developing innovative technologies to ensure the integration between existing and new systems. To reach this objective, the AF3 project focuses on innovative active and passive countermeasures, early detection and monitoring, integrated crisis management and advanced public information channels. OCULUS Fire is the innovative command and control system developed within AF3 as a monitoring, GIS and knowledge extraction system and visualization tool. OCULUS Fire includes (a) an interface for real-time updating and reconstruction of maps to enable rerouting based on estimated hazards and risks, (b) processing for dynamic GIS re-construction and mission re-routing, based on the fusion of airborne, satellite, ground and ancillary geolocation data, (c) visualization components for the C2 monitoring system, displaying and managing information arriving from a variety of sources, and (d) a mission and situational awareness module for the OCULUS Fire ground monitoring system, which is part of an Integrated Crisis Management Information System for ground and ancillary sensors. OCULUS Fire will also process and visualise information from public information channels, social media and mobile applications used by helpful citizens and volunteers. Social networking, community building and crowdsourcing features will enable higher reliability and lower false alarm rates when using such data in the context of safety and security applications.
NASA Astrophysics Data System (ADS)
Zaid, Hayyiratul Fatimah Mohd; Kait, Chong Fai; Mutalib, Mohamed Ibrahim Abdul
2014-10-01
Photocatalysts of TiO2 doped with Cu, Fe and Cu-Fe metals at different calcination temperatures and durations were successfully prepared and characterized. Photocatalytic oxidative desulfurization of a model oil containing dibenzothiophene as the sulfur compound (100 ppm) using the prepared photocatalysts was investigated. The photocatalyst calcined at 500°C for 1 h showed the best performance.
Zhu, Hua-Xu; Duan, Jin-Ao; Guo, Li-Wei; Li, Bo; Lu, Jin; Tang, Yu-Ping; Pan, Lin-Mei
2014-05-01
Turning traditional Chinese medicine residue into a resource is an inevitable choice for forming new industries in the Chinese medicine sector that are modern, environmentally friendly and intensive. Based on an analysis of the sources and main chemical composition of herb residues, and given the advantages of membrane science and technology in the pharmaceutical industry, especially the track record of membrane separation in improving traditional extraction and separation processes, it is proposed that membrane science and technology is one of the most important choices in the technological design of traditional Chinese medicine resource industrialization. Traditional Chinese medicine residue is a very complex material system in composition and character, and a scientific and effective "separation" process is the key technology for re-using it. An integrated process can improve the productivity of the target product, enhance the purity of the product in the separation process, and accomplish many tasks that conventional separation can hardly achieve. As integrated separation technology has the advantages of a simplified process and reduced consumption, in line with the trends of the modern pharmaceutical industry, membrane separation technology can provide a broad platform for integrated processes, and membrane separation together with its integrated technologies has broad application prospects in the resource utilization and industrialization of traditional Chinese medicine residue. In this paper we discuss the principles, methods and application practice of recovering effective component resources from herb residue using membrane separation and integrated technology, describe the application of membrane technology in the extraction, separation, concentration and purification of traditional Chinese medicine residue, and systematically discuss the suitability and feasibility of membrane technology in the industrialization of traditional Chinese medicine resources.
Liao, Maoliang; Shang, Haihua; Li, Yazhuo; Li, Tian; Wang, Miao; Zheng, Yanan; Hou, Wenbin; Liu, Changxiao
2018-06-01
Quality control of traditional Chinese medicines is currently a great concern, because the correlation between quality control indicators and clinical effect is often questionable. Given the "multi-components and multi-targets" property of TCMs, a new quality and bioactivity evaluation system is urgently needed. The present study adopted an integrated approach to provide new insights for uncovering the quality markers underlying the effects of Alisma orientale (AO) on lipid metabolism. In this paper, guided by the concept of the quality marker (Q-marker), an integrated "effect-compound-target-fingerprint" strategy was established to discover and screen the potential quality markers of AO based on network pharmacology and chemical analysis. Firstly, a bioactivity evaluation was performed to screen the main active fractions. Then the chemical compositions were rapidly identified by chemical analysis. Next, networks were constructed to illuminate the interactions between these components and their targets for lipid metabolism, and the potential Q-markers of AO were initially screened. Finally, the activity of the Q-markers was validated in vitro. The 50% ethanol extract fraction was found to have the strongest lipid-lowering activity. Then, network pharmacology was used to clarify the unique relationship between the Q-markers and their integral pharmacological action. Combining the results obtained, five active ingredients in the 50% ethanol extract fraction were given special consideration as representative Q-markers: Alisol A, Alisol B, Alisol A 23-acetate, Alisol B 23-acetate and Alisol A 24-acetate. Chromatographic fingerprints based on the Q-markers were established. The integrated Q-marker screen may offer an alternative quality assessment of herbal medicines. Copyright © 2018. Published by Elsevier GmbH.
A Machine Reading System for Assembling Synthetic Paleontological Databases
Peters, Shanan E.; Zhang, Ce; Livny, Miron; Ré, Christopher
2014-01-01
Many aspects of macroevolutionary theory and our understanding of biotic responses to global environmental change derive from literature-based compilations of paleontological data. Existing manually assembled databases are, however, incomplete and difficult to assess and enhance with new data types. Here, we develop and validate the quality of a machine reading system, PaleoDeepDive, that automatically locates and extracts data from heterogeneous text, tables, and figures in publications. PaleoDeepDive performs comparably to humans in several complex data extraction and inference tasks and generates congruent synthetic results that describe the geological history of taxonomic diversity and genus-level rates of origination and extinction. Unlike traditional databases, PaleoDeepDive produces a probabilistic database that systematically improves as information is added. We show that the system can readily accommodate sophisticated data types, such as morphological data in biological illustrations and associated textual descriptions. Our machine reading approach to scientific data integration and synthesis brings within reach many questions that are currently underdetermined and does so in ways that may stimulate entirely new modes of inquiry. PMID:25436610
Aqueous Rechargeable Alkaline CoxNi2-xS2/TiO2 Battery.
Liu, Jilei; Wang, Jin; Ku, Zhiliang; Wang, Huanhuan; Chen, Shi; Zhang, Lili; Lin, Jianyi; Shen, Ze Xiang
2016-01-26
An electrochemical energy storage system with high energy density, stringent safety, and reliability is highly desirable for next-generation energy storage devices. Here an aqueous rechargeable alkaline CoxNi2-xS2 // TiO2 battery system is designed by integrating two reversible electrode processes associated with OH⁻ insertion/extraction in the cathode part and Li ion insertion/extraction in the anode part, respectively. The prototype CoxNi2-xS2 // TiO2 battery is able to deliver high energy/power densities of 83.7 Wh/kg at 609 W/kg (based on the total mass of active materials) and good cycling stability (capacity retention of 75.2% after 1000 charge/discharge cycles). A maximum volumetric energy density of 21 Wh/L (based on the whole packaged cell) has been achieved, which is comparable to that of a thin-film battery and better than that of typical commercial supercapacitors, benefiting from the unique battery and hierarchical electrode design. This hybrid system would enrich the existing aqueous rechargeable LIB chemistry and be a promising battery technology for large-scale energy storage.
Ontology-based data integration between clinical and research systems.
Mate, Sebastian; Köpcke, Felix; Toddenroth, Dennis; Martin, Marcus; Prokosch, Hans-Ulrich; Bürkle, Thomas; Ganslandt, Thomas
2015-01-01
Data from the electronic medical record comprise numerous structured but uncoded elements, which are not linked to standard terminologies. Reuse of such data for secondary research purposes has gained in importance recently. However, the identification of relevant data elements and the creation of database jobs for extraction, transformation and loading (ETL) are challenging: With current methods such as data warehousing, it is not feasible to efficiently maintain and reuse semantically complex data extraction and transformation routines. We present an ontology-supported approach to overcome this challenge by making use of abstraction: Instead of defining ETL procedures at the database level, we use ontologies to organize and describe the medical concepts of both the source system and the target system. Instead of using unique, specifically developed SQL statements or ETL jobs, we define declarative transformation rules within ontologies and illustrate how these constructs can then be used to automatically generate SQL code to perform the desired ETL procedures. This demonstrates how a suitable level of abstraction may not only aid the interpretation of clinical data, but can also foster the reutilization of methods for unlocking it.
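A toy sketch of the declarative-rule idea: a transformation rule held as data (a dict standing in for the ontology construct) is compiled into the ETL SQL, so the mapping is maintained at the concept level rather than as hand-written statements; all table and column names are invented:

```python
def rule_to_sql(rule):
    """Compile a declarative mapping rule into an INSERT ... SELECT statement."""
    cols = ", ".join(rule["mappings"])                 # target columns
    exprs = ", ".join(rule["mappings"].values())       # source expressions
    sql = (f"INSERT INTO {rule['target']} ({cols}) "
           f"SELECT {exprs} FROM {rule['source']}")
    if rule.get("condition"):
        sql += f" WHERE {rule['condition']}"
    return sql

# Invented example: map an uncoded EMR lab table into a research schema.
rule = {
    "source": "emr.lab_results",
    "target": "research.observation",
    "mappings": {"patient_id": "pid", "loinc_code": "local_code_to_loinc(code)",
                 "value_num": "CAST(result AS DECIMAL)"},
    "condition": "result IS NOT NULL",
}
print(rule_to_sql(rule))
```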
1987-11-01
Design guidelines have been developed that can be used by circuit engineers to extract the maximum performance from the devices on various board technologies, including multilayer ceramic. Topics covered include attenuation and dispersion effects and the skin effect.
An integrated process for the extraction of fuel and chemicals from marine macroalgal biomass
NASA Astrophysics Data System (ADS)
Trivedi, Nitin; Baghel, Ravi S.; Bothwell, John; Gupta, Vishal; Reddy, C. R. K.; Lali, Arvind M.; Jha, Bhavanath
2016-07-01
We describe an integrated process that can be applied to biomass of the green seaweed, Ulva fasciata, to allow the sequential recovery of four economically important fractions; mineral rich liquid extract (MRLE), lipid, ulvan, and cellulose. The main benefits of our process are: a) its simplicity and b) the consistent yields obtained from the residual biomass after each successive extraction step. For example, dry Ulva biomass yields ~26% of its starting mass as MRLE, ~3% as lipid, ~25% as ulvan, and ~11% as cellulose, with the enzymatic hydrolysis and fermentation of the final cellulose fraction under optimized conditions producing ethanol at a competitive 0.45 g/g reducing sugar. These yields are comparable to those obtained by direct processing of the individual components from primary biomass. We propose that this integration of ethanol production and chemical feedstock recovery from macroalgal biomass could substantially enhance the sustainability of marine biomass use.
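The reported yields compose into a simple mass balance; a sketch per kilogram of dry Ulva biomass, simplistically treating the cellulose mass as the reducing-sugar mass and applying the reported 0.45 g ethanol per g of sugar:

```python
DRY_BIOMASS_G = 1000.0
yields = {"MRLE": 0.26, "lipid": 0.03, "ulvan": 0.25, "cellulose": 0.11}

fractions_g = {k: v * DRY_BIOMASS_G for k, v in yields.items()}
# Simplifying assumption: cellulose hydrolyses fully and its mass is taken as
# the reducing-sugar mass, fermented at the reported 0.45 g ethanol / g sugar.
ethanol_g = fractions_g["cellulose"] * 0.45
print(fractions_g, f"ethanol ~{ethanol_g:.0f} g")   # ~50 g ethanol per kg biomass
```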
Model-Based Testability Assessment and Directed Troubleshooting of Shuttle Wiring Systems
NASA Technical Reports Server (NTRS)
Deb, Somnath; Domagala, Chuck; Shrestha, Roshan; Malepati, Venkatesh; Cavanaugh, Kevin; Patterson-Hine, Ann; Sanderfer, Dwight; Cockrell, Jim; Norvig, Peter (Technical Monitor)
2000-01-01
We have recently completed a pilot study on the Space Shuttle wiring system commissioned by the Wiring Integrity Research (WIRe) team at NASA Ames Research Center. As the Space Shuttle ages, it is experiencing wiring degradation problems including arcing, chafing, insulation breakdown and broken conductors. A systematic and comprehensive test process is required to thoroughly test and quality assure (QA) the wiring systems. The NASA WIRe team recognized the value of a formal model-based analysis for risk assessment and fault coverage analysis. However, wiring systems are complex and involve over 50,000 wire segments. Therefore, NASA commissioned this pilot study with Qualtech Systems, Inc. (QSI) to explore means of automatically extracting high-fidelity multi-signal models from the wiring information database for use with QSI's Testability Engineering and Maintenance System (TEAMS) tool.
Characterizing DebriSat Fragments: So Many Fragments, So Much Data, and So Little Time
NASA Technical Reports Server (NTRS)
Shiotani, B.; Rivero, M.; Carrasquilla, M.; Allen, S.; Fitz-Coy, N.; Liou, J.-C.; Huynh, T.; Sorge, M.; Cowardin, H.; Opiela, J.;
2017-01-01
To improve prediction accuracy, the DebriSat project was conceived by NASA and DoD to update the existing standard break-up models. Updating standard break-up models requires detailed fragment characteristics such as physical size, material properties, bulk density, and ballistic coefficient. For the DebriSat project, a representative modern LEO spacecraft was developed and subjected to a laboratory hypervelocity impact test, and all generated fragments with at least one dimension greater than 2 mm are collected, characterized and archived. Since the beginning of the characterization phase of the DebriSat project, over 130,000 fragments have been collected, and approximately 250,000 fragments are expected to be collected in total, a three-fold increase over the 85,000 fragments predicted by the current break-up model. The challenge throughout the project has been to ensure the integrity and accuracy of the characteristics of each fragment. To this end, the post-impact test activities, which include fragment collection, extraction, and characterization, have been designed to minimize handling of the fragments. The procedures for fragment collection, extraction, and characterization were painstakingly designed and implemented to maintain the post-impact state of the fragments, thus ensuring the integrity and accuracy of the characterization data. Each process is designed to expedite the accumulation of data; however, the need for speed is restrained by the need to protect the fragments. Methods to expedite the process, such as parallel processing, have been explored and implemented while continuing to maintain the highest integrity and value of the data. To minimize fragment handling, automated systems have been developed and implemented; errors due to human inputs are also minimized by the use of these automated systems. This paper discusses the processes and challenges involved in the collection, extraction, and characterization of the fragments, as well as the time required to complete the processes. The objective is to provide the orbital debris community an understanding of the scale of the effort required to generate and archive high-quality data and metadata for each debris fragment 2 mm or larger generated by the DebriSat project.
Integral lipids of mammalian hair.
Wertz, P W; Downing, D T
1989-01-01
1. It has been demonstrated that hair contains lipids which cannot be removed by extensive extraction with chloroform-methanol mixtures. These integral lipids can be extracted only after the hair has been subjected to alkaline hydrolysis. 2. Integral hair lipids include cholesterol sulfate (0.7-2.9 mg/g hair), ceramides (0.6-1.4 mg/g), cholesterol (0.3-1.4 mg/g), fatty alcohols (trace-0.2 mg/g) and fatty acids (2.3-4.0 mg/g). 3. One of the major integral hair lipids, representing 38.4-47.6% of the total fatty acids, is the anteisobranched 18-methyleicosanoic acid. 4. The species examined included human (Homo sapiens), pig (Sus scrofa), dog (Canis familiaris), sheep (Ovis ammon aries) and cow (Bos taurus).
Burton, G Allen; Rosen, Gunther; Chadwick, D Bart; Greenberg, Marc S; Taulbee, W Keith; Lotufo, Guilherme R; Reible, Danny D
2012-03-01
In situ-based testing using aquatic organisms has been widely reported, but is often limited in scope and practical usefulness for making decisions on ecological risk and remediation. To provide this capability, an integrated deployment system, the Sediment Ecotoxicity Assessment (SEA) Ring, was developed, which incorporates rapid in situ hydrological, chemical, bioaccumulation, and toxicological Lines-of-Evidence (LoE) for assessing sediment and overlying-water contamination. The SEA Ring system allows for diver-assisted, or diverless, deployment of multiple species of ecologically relevant and indigenous organisms in three different exposures (overlying water, sediment-water interface, and bulk sediment) for periods ranging from two days to three weeks, in a range of water systems. Measured endpoints were both sublethal and lethal effects as well as bioaccumulation. In addition, integrated passive sampling devices for detecting nonpolar organics (solid-phase micro-extraction fibers) and metals (diffusive gradients in thin films) provided gradient measures in overlying waters and surficial sediments. Copyright © 2011 Elsevier Ltd. All rights reserved.
Hand Grasping Synergies As Biometrics.
Patel, Vrajeshri; Thukral, Poojita; Burns, Martin K; Florescu, Ionut; Chandramouli, Rajarathnam; Vinjamuri, Ramana
2017-01-01
Recently, the need for more secure identity verification systems has driven researchers to explore other sources of biometrics. These include iris patterns, palm print, hand geometry, facial recognition, and movement patterns (hand motion, gait, and eye movements). Identity verification systems may benefit from the complexity of human movement, which integrates multiple levels of control (neural, muscular, and kinematic). Using principal component analysis, we extracted spatiotemporal hand synergies (movement synergies) from an object grasping dataset to explore their use as a potential biometric. These movement synergies take the form of joint angular velocity profiles of 10 joints. We explored the effect of joint type, digit, number of objects, and grasp type. In its best configuration, movement synergies achieved an equal error rate of 8.19%. While movement synergies can be integrated into an identity verification system with motion capture ability, we also explored a camera-ready version of hand synergies: postural synergies. In this proof-of-concept system, postural synergies performed well, but only when specific postures were chosen. Based on these results, hand synergies show promise as a potential biometric that can be combined with other hand-based biometrics for improved security.
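A minimal sketch of the synergy-extraction idea described above, assuming synthetic joint-velocity data and a cosine-similarity matching score; the dataset shapes, component count, and scoring rule are illustrative assumptions rather than the authors' exact pipeline.

```python
# Sketch: PCA-based movement-synergy extraction for biometric verification.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)

# Hypothetical dataset: trials x (10 joints * T time samples) angular velocities.
n_trials, n_joints, T = 60, 10, 100
X = rng.normal(size=(n_trials, n_joints * T))

# Spatiotemporal synergies are taken as the leading principal components.
pca = PCA(n_components=5).fit(X)
synergies = pca.components_                 # (5, n_joints*T) velocity profiles

def weights(trial):
    """Project a trial onto the synergy space to get its weight vector."""
    return pca.transform(trial.reshape(1, -1))[0]

def score(trial_a, trial_b):
    """Cosine similarity between synergy weights (higher = likelier same user)."""
    wa, wb = weights(trial_a), weights(trial_b)
    return wa @ wb / (np.linalg.norm(wa) * np.linalg.norm(wb))

enrolled, probe = X[0], X[1]
print(f"match score: {score(enrolled, probe):+.3f}")
```

Verification then reduces to thresholding the score between an enrolled trial and a probe; sweeping that threshold over genuine and impostor pairs yields the equal error rate.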
Computer vision for driver assistance systems
NASA Astrophysics Data System (ADS)
Handmann, Uwe; Kalinke, Thomas; Tzomakas, Christos; Werner, Martin; von Seelen, Werner
1998-07-01
Systems for automated image analysis are useful for a variety of tasks, and their importance is still increasing due to technological advances and growing social acceptance. Especially in the field of driver assistance systems, scientific progress has reached a level of high performance. Fully or partly autonomously guided vehicles, particularly for road-based traffic, pose high demands on the development of reliable algorithms due to the conditions imposed by natural environments. At the Institut für Neuroinformatik, methods for analyzing driving-relevant scenes by computer vision are developed in cooperation with several partners from the automobile industry. We introduce a system which extracts the important information from an image taken by a CCD camera installed at the rear-view mirror in a car. The approach combines sequential and parallel sensor and information processing. Three main tasks, namely initial segmentation (object detection), object tracking and object classification, are realized by integration in the sequential branch and by fusion in the parallel branch. The main gain of this approach is the integrative coupling of different algorithms providing partly redundant information.
Zhuang, Yan; Xie, Bangtie; Weng, Shengxin; Xie, Yanming
2011-10-01
A real-world integrated data warehouse was constructed for the re-evaluation of post-marketing traditional Chinese medicine, to support research on key techniques of clinical re-evaluation, mainly covering indications, dosage and usage, course of treatment, unit medication, combined diseases and adverse reactions. It provides data for retrospective research on safety, efficacy and economy, and a foundation for prospective research. The integrated data warehouse extracts and integrates data from hospital information systems (HIS) through an information collection system and data warehouse techniques, forming standardized structures and data on which further research is based. A main data warehouse and several sub-warehouses were built, focusing on patients' medical records, doctors' orders, disease diagnoses, laboratory results and economic indicators in hospital. These data warehouses provide research data for the re-evaluation of post-marketing traditional Chinese medicine and have clinical value; they also point the direction for further research.
Tang, Ruihua; Yang, Hui; Gong, Yan; You, MinLi; Liu, Zhi; Choi, Jane Ru; Wen, Ting; Qu, Zhiguo; Mei, Qibing; Xu, Feng
2017-03-29
Nucleic acid testing (NAT) has been widely used for disease diagnosis, food safety control and environmental monitoring. At present, NAT mainly involves nucleic acid extraction, amplification and detection steps that heavily rely on large equipment and skilled workers, making the test expensive, time-consuming, and thus less suitable for point-of-care (POC) applications. With advances in paper-based microfluidic technologies, various integrated paper-based devices have recently been developed for NAT, which however require off-chip reagent storage, complex operation steps and equipment-dependent nucleic acid amplification, restricting their use for POC testing. To overcome these challenges, we demonstrate a fully disposable and integrated paper-based sample-in-answer-out device for NAT by integrating nucleic acid extraction, helicase-dependent isothermal amplification and lateral flow assay detection into one paper device. This simple device allows on-chip dried reagent storage and equipment-free nucleic acid amplification with simple operation steps, which could be performed by untrained users in remote settings. The proposed device consists of a sponge-based reservoir and a paper-based valve for nucleic acid extraction, an integrated battery, a PTC ultrathin heater, temperature control switch and on-chip dried enzyme mix storage for isothermal amplification, and a lateral flow test strip for naked-eye detection. It can sensitively detect Salmonella typhimurium, as a model target, with a detection limit of as low as 10² CFU ml⁻¹ in wastewater and egg, and 10³ CFU ml⁻¹ in milk and juice in about an hour. This fully disposable and integrated paper-based device has great potential for future POC applications in resource-limited settings.
Atun, Rifat; de Jongh, Thyra E; Secci, Federica V; Ohiri, Kelechi; Adeyi, Olusoji; Car, Josip
2011-10-10
The objective of the study was to assess the effects of strategies to integrate targeted priority population, health and nutrition interventions into health systems on patient health outcomes and health system effectiveness, and thus to compare integrated and non-integrated health programmes. Systematic review using Cochrane methodology of analysing randomised trials, controlled before-and-after and interrupted time series studies. We defined specific strategies to search PubMed, CENTRAL and the Cochrane Effective Practice and Organisation of Care Group register, considered studies published from January 1998 until September 2008, and tracked references and citations. Two reviewers independently agreed on eligibility, with an additional arbiter as needed, and extracted information on outcomes: primary (improved health, financial protection, and user satisfaction) and secondary (improved population coverage, access to health services, efficiency, and quality) using standardised, pre-piloted forms. Two reviewers in the final stage of selection jointly assessed the quality of all selected studies using the GRADE criteria. Of 8,274 citations identified, 12 studies met the inclusion criteria. Four studies compared the benefits of Integrated Management of Childhood Illnesses in Tanzania and Bangladesh, showing improved care management and higher utilisation of health facilities at no additional cost. Eight studies focused on integrated delivery of mental health and substance abuse services in the United Kingdom and United States of America. Integrated service delivery resulted in better clinical outcomes and greater reduction of substance abuse in specific sub-groups of patients, with no significant difference found overall. Quality of care, patient satisfaction, and treatment engagement were higher in integrated delivery models. The targeted priority population health interventions we identified led to improved health outcomes, quality of care, patient satisfaction and access to care. Limited evidence with inconsistent findings across varied interventions in different settings means no general conclusions can be drawn on the benefits or disadvantages of integrated service delivery.
A Review of Diagnostic Techniques for ISHM Applications
NASA Technical Reports Server (NTRS)
Patterson-Hine, Ann; Biswas, Gautam; Aaseng, Gordon; Narasimhan, Sriram; Pattipati, Krishna
2005-01-01
System diagnosis is an integral part of any Integrated System Health Management application. Diagnostic applications make use of system information from the design phase, such as safety and mission assurance analysis, failure modes and effects analysis, hazards analysis, functional models, fault propagation models, and testability analysis. In modern process control and equipment monitoring systems, topological and analytic models of the nominal system, derived from design documents, are also employed for fault isolation and identification. Depending on the complexity of the monitored signals from the physical system, diagnostic applications may involve straightforward trending and feature extraction techniques to retrieve the parameters of importance from the sensor streams. They may also involve very complex analysis routines, such as signal processing, learning or classification methods, to derive the parameters of importance to diagnosis. The process that is used to diagnose anomalous conditions from monitored system signals varies widely across the different approaches to system diagnosis. Rule-based expert systems, case-based reasoning systems, model-based reasoning systems, learning systems, and probabilistic reasoning systems are examples of the many diverse approaches to diagnostic reasoning. Many engineering disciplines have specific approaches to modeling, monitoring and diagnosing anomalous conditions. Therefore, there is no "one-size-fits-all" approach to building diagnostic and health monitoring capabilities for a system. For instance, the conventional approaches to diagnosing failures in rotorcraft applications are very different from those used in communications systems. Further, online and offline automated diagnostic applications are integrated into an operations framework with flight crews, flight controllers and maintenance teams. While the emphasis of this paper is automation of health management functions, striking the correct balance between automated and human-performed tasks is a vital concern.
Ultrasonic real-time in-die monitoring of the tablet compaction process-a proof of concept study.
Stephens, James D; Kowalczyk, Brian R; Hancock, Bruno C; Kaul, Goldi; Cetinkaya, Cetin
2013-02-14
The mechanical properties of a drug tablet can affect its performance (e.g., its dissolution profile) and its physical robustness. An ultrasonic system for real-time in-die tablet mechanical property monitoring during compaction has been demonstrated. The reported set-up is a proof-of-concept compaction monitoring system which includes an ultrasonic transducer mounted inside the upper punch of the compaction apparatus. This upper punch is utilized to acquire ultrasonic pressure wave waveforms and extract the time-of-flight (ToF), and hence phase velocity, of pressure waves travelling within the compact at a number of compaction force levels during compaction. The reflection coefficients for the waves reflecting from the punch tip-powder bed interface are extracted from the acquired waveforms. The reflection coefficient decreases with an increase in compaction force, indicating solidification. The data acquisition methods give average apparent Young's moduli in the range of 8-20 GPa, extracted during the compaction and release/decompression phases in real time. A monitoring system employing such methods is capable of determining material properties and the integrity of the tablet during compaction. As compared to the millisecond time-scale dwell time of a typical commercial compaction press, the microsecond pulse duration and ToF of an acoustic pulse are sufficiently fast for real-time monitoring. Copyright © 2012 Elsevier B.V. All rights reserved.
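As a numeric illustration of the relations this monitoring approach rests on (pulse-echo phase velocity from ToF, apparent modulus from density and velocity, and an impedance-mismatch reflection coefficient), a short worked example follows; all input values are assumed for illustration, not measurements from the study.

```python
# Worked sketch of the acoustic relations; numbers are assumed, not measured.
rho = 1200.0          # compact density, kg/m^3 (assumed)
thickness = 3.0e-3    # in-die compact thickness, m (assumed)
tof = 2.0e-6          # round-trip time-of-flight of the pressure pulse, s

v = 2 * thickness / tof          # phase velocity for a pulse-echo path
E_apparent = rho * v**2          # apparent Young's modulus ~ rho * v^2

# Reflection coefficient at the punch-tip/powder interface from acoustic
# impedances Z = rho * v; its magnitude falls as the compact solidifies.
Z_punch = 46e6                   # steel punch impedance, Pa*s/m (typical value)
Z_compact = rho * v
R = (Z_compact - Z_punch) / (Z_compact + Z_punch)
print(f"v = {v:.0f} m/s, E ~ {E_apparent / 1e9:.1f} GPa, R = {R:.2f}")
```

With these assumed inputs the apparent modulus works out to about 10.8 GPa, inside the 8-20 GPa range the abstract reports.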
Ontology-Based Search of Genomic Metadata.
Fernandez, Javier D; Lenzerini, Maurizio; Masseroli, Marco; Venco, Francesco; Ceri, Stefano
2016-01-01
The Encyclopedia of DNA Elements (ENCODE) is a huge and still expanding public repository of more than 4,000 experiments and 25,000 data files, assembled by a large international consortium since 2007; unknown biological knowledge can be extracted from these huge and largely unexplored data, leading to data-driven genomic, transcriptomic, and epigenomic discoveries. Yet, the search for relevant datasets for knowledge discovery is only minimally supported: the metadata describing ENCODE datasets are quite simple and incomplete, and not described by a coherent underlying ontology. Here, we show how to overcome this limitation by adopting an ENCODE metadata searching approach which uses high-quality ontological knowledge and state-of-the-art indexing technologies. Specifically, we developed S.O.S. GeM (http://www.bioinformatics.deib.polimi.it/SOSGeM/), a system supporting effective semantic search and retrieval of ENCODE datasets. First, we constructed a Semantic Knowledge Base by starting with concepts extracted from ENCODE metadata, matched to and expanded on biomedical ontologies integrated in the well-established Unified Medical Language System. We prove that this inference method is sound and complete. Then, we leveraged the Semantic Knowledge Base to semantically search ENCODE data from arbitrary biologists' queries. This allows correctly finding more datasets than those extracted by a purely syntactic search, as supported by the other available systems. We empirically show the relevance of the found datasets to the biologists' queries.
Few, Sheridan; Gambhir, Ajay; Napp, Tamaryn; ...
2017-01-27
There exists considerable uncertainty over both shale and conventional gas resource availability and extraction costs, as well as the fugitive methane emissions associated with shale gas extraction and its possible role in mitigating climate change. This study uses a multi-region energy system model, TIAM (TIMES integrated assessment model), to consider the impact of a range of conventional and shale gas cost and availability assessments on mitigation scenarios aimed at achieving a limit to global warming of below 2 °C in 2100, with a 50% likelihood. When adding shale gas to the global energy mix, the reduction to the global energy system cost is relatively small (up to 0.4%), and the mitigation cost increases by 1%–3% under all cost assumptions. The impact of a "dash for shale gas", of unavailability of carbon capture and storage, of increased barriers to investment in low carbon technologies, and of higher than expected leakage rates, are also considered; and are each found to have the potential to increase the cost and reduce feasibility of meeting global temperature goals. Finally, we conclude that the extraction of shale gas is not likely to significantly reduce the effort required to mitigate climate change under globally coordinated action, but could increase required mitigation effort if not handled sufficiently carefully.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Islam, Md. Shafiqul, E-mail: shafique@eng.ukm.my; Hannan, M.A., E-mail: hannan@eng.ukm.my; Basri, Hassan
Highlights: • Solid waste bin level detection using Dynamic Time Warping (DTW). • A Gabor wavelet filter is used to extract the solid waste image features. • A Multi-Layer Perceptron classifier network is used for bin image classification. • The classification performance is evaluated by ROC curve analysis. - Abstract: The increasing requirement for Solid Waste Management (SWM) has become a significant challenge for municipal authorities. A number of integrated systems and methods have been introduced to overcome this challenge. Many researchers have aimed to develop an ideal SWM system, including approaches involving software-based routing, Geographic Information Systems (GIS), Radio-frequency Identification (RFID), or sensor-intelligent bins. Image processing solutions for Solid Waste (SW) collection have also been developed; however, when capturing the bin image it is challenging to position the camera so that the bin area is centred in the image. As yet, there is no ideal system which can correctly estimate the amount of SW. This paper briefly discusses an efficient image processing solution to overcome these problems. Dynamic Time Warping (DTW) was used for detecting and cropping the bin area and a Gabor wavelet (GW) filter was introduced for feature extraction of the waste bin image. Image features were used to train the classifier. A Multi-Layer Perceptron (MLP) classifier was used to classify the waste bin level and estimate the amount of waste inside the bin. The area under the Receiver Operating Characteristic (ROC) curve was used to statistically evaluate classifier performance. The results of this developed system are comparable to previous image-processing-based systems. The system demonstration using DTW with GW for feature extraction and an MLP classifier led to promising results with respect to the accuracy of waste level estimation (98.50%). The application can be used to optimize the routing of waste collection based on the estimated bin level.
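A minimal sketch of the Gabor-feature and MLP classification stage described above, assuming synthetic grayscale bin images and an illustrative filter bank; image sizes, frequencies, orientations, and the four-level labeling are assumptions, and the DTW cropping step is omitted.

```python
# Sketch: Gabor wavelet features feeding an MLP bin-level classifier.
import numpy as np
from skimage.filters import gabor
from sklearn.neural_network import MLPClassifier

def gabor_features(img, freqs=(0.1, 0.2, 0.3), thetas=(0, np.pi/4, np.pi/2)):
    """Mean/std of Gabor filter responses as a compact feature vector."""
    feats = []
    for f in freqs:
        for t in thetas:
            real, _ = gabor(img, frequency=f, theta=t)
            feats += [real.mean(), real.std()]
    return np.array(feats)

rng = np.random.default_rng(1)
imgs = rng.random((40, 32, 32))          # stand-in cropped bin images
levels = rng.integers(0, 4, size=40)     # e.g. empty / low / high / full

X = np.array([gabor_features(im) for im in imgs])
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500,
                    random_state=0).fit(X, levels)
print(clf.predict(X[:5]))
```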
An Efficient VLSI Architecture for Multi-Channel Spike Sorting Using a Generalized Hebbian Algorithm
Chen, Ying-Lun; Hwang, Wen-Jyi; Ke, Chi-En
2015-01-01
A novel VLSI architecture for multi-channel online spike sorting is presented in this paper. In the architecture, the spike detection is based on nonlinear energy operator (NEO), and the feature extraction is carried out by the generalized Hebbian algorithm (GHA). To lower the power consumption and area costs of the circuits, all of the channels share the same core for spike detection and feature extraction operations. Each channel has dedicated buffers for storing the detected spikes and the principal components of that channel. The proposed circuit also contains a clock gating system supplying the clock to only the buffers of channels currently using the computation core to further reduce the power consumption. The architecture has been implemented by an application-specific integrated circuit (ASIC) with 90-nm technology. Comparisons to the existing works show that the proposed architecture has lower power consumption and hardware area costs for real-time multi-channel spike detection and feature extraction. PMID:26287193
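The two named algorithms are compact enough to sketch directly: the nonlinear energy operator ψ[n] = x[n]² − x[n−1]·x[n+1] for spike detection, and Sanger's generalized Hebbian rule for on-line principal-component extraction. Dimensions, learning rate, and the detection threshold below are illustrative assumptions; this is a software sketch, not the ASIC's fixed-point implementation.

```python
# Sketch: NEO spike detection and GHA (Sanger's rule) feature extraction.
import numpy as np

def neo(x):
    """psi[n] = x[n]^2 - x[n-1]*x[n+1]; emphasizes high-frequency energy."""
    psi = np.zeros_like(x)
    psi[1:-1] = x[1:-1]**2 - x[:-2] * x[2:]
    return psi

def gha_update(W, x, lr=0.01):
    """One Sanger's-rule step: dW = lr * (y x^T - tril(y y^T) W), y = W x."""
    y = W @ x
    W += lr * (np.outer(y, x) - np.tril(np.outer(y, y)) @ W)
    return W

rng = np.random.default_rng(2)
signal = rng.normal(size=4096)
psi = neo(signal)
spikes = np.flatnonzero(psi > 3 * psi.std())     # assumed threshold rule
print(f"{spikes.size} candidate spike samples")

W = rng.normal(scale=0.1, size=(3, 64))          # 3 components, 64-sample spikes
mix = rng.normal(size=(64, 8))                   # low-rank mixing -> correlated data
for _ in range(500):
    x = mix @ rng.normal(size=8)                 # stand-in aligned spike snippet
    W = gha_update(W, x)                         # rows converge toward leading PCs
```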
Precise on-machine extraction of the surface normal vector using an eddy current sensor array
NASA Astrophysics Data System (ADS)
Wang, Yongqing; Lian, Meng; Liu, Haibo; Ying, Yangwei; Sheng, Xianjun
2016-11-01
To satisfy the requirements of on-machine measurement of the surface normal during complex surface manufacturing, a highly robust normal vector extraction method using an eddy current (EC) displacement sensor array is developed, the output of which is almost unaffected by surface brightness, machining coolant and environmental noise. A precise normal vector extraction model based on a triangular-distributed EC sensor array is first established. The model involves calibration of the effects of object surface inclination and coupling interference on the measurement results, as well as of the relative positions of the EC sensors. A novel apparatus employing three EC sensors and a force transducer was designed, which can be easily integrated into a computer numerical control (CNC) machine tool spindle and/or a robot end effector. Finally, to test the validity and practicability of the proposed method, typical experiments were conducted on specified test pieces, such as an inclined plane and cylindrical and spherical surfaces, using the developed approach and system.
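The geometric core of a triangular displacement-sensor array can be sketched compactly: three non-collinear standoff readings define a local plane whose unit normal approximates the surface normal. The sensor layout and readings below are assumed values for illustration; the paper's calibration of inclination and coupling effects is not modeled.

```python
# Sketch: surface normal from three standoff distances of a triangular array.
import numpy as np

# Assumed sensor positions in the tool frame: an equilateral triangle of
# radius 20 mm in the plane z = 0, probing along -z.
r = 0.020
angles = (0.0, 2 * np.pi / 3, 4 * np.pi / 3)
sensors = np.array([[r * np.cos(a), r * np.sin(a), 0.0] for a in angles])

# Assumed eddy-current standoff readings (metres) for a slightly tilted surface.
d = np.array([0.00105, 0.00098, 0.00101])

# Surface point under each sensor, then the plane normal via a cross product.
pts = sensors.copy()
pts[:, 2] = -d
n = np.cross(pts[1] - pts[0], pts[2] - pts[0])
n /= np.linalg.norm(n)
if n[2] < 0:                     # orient the normal toward the tool
    n = -n
print("unit surface normal:", np.round(n, 4))
```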
Moore, G.K.; Baten, L.G.; Allord, G.J.; Robinove, C.J.
1983-01-01
The Fox-Wolf River basin in east-central Wisconsin was selected to test concepts for a water-resources information system using digital mapping technology. This basin of 16,800 sq km is typical of many areas in the country. Fifty digital data sets were included in the Fox-Wolf information system. Many data sets were digitized from 1:500,000 scale maps and overlays. Some thematic data were acquired from WATSTORE and other digital data files. All data were geometrically transformed into a Lambert Conformal Conic map projection and converted to a raster format with a 1-km resolution. The result of this preliminary processing was a group of spatially registered, digital data sets in map form. Parameter evaluation, areal stratification, data merging, and data integration were used to achieve the processing objectives and to obtain analysis results for the Fox-Wolf basin. Parameter evaluation includes the visual interpretation of single data sets and digital processing to obtain new derived data sets. In the areal stratification stage, masks were used to extract from one data set all features that are within a selected area on another data set. Most processing results were obtained by data merging. Merging is the combination of two or more data sets into a composite product, in which the contribution of each original data set is apparent and can be extracted from the composite. One processing result was also obtained by data integration. Integration is the combination of two or more data sets into a single new product, from which the original data cannot be separated or calculated. (USGS)
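A toy illustration of the processing-stage definitions given above (areal stratification as masking one data set by a region from another, and merging as a composite from which each source layer can still be extracted), using small synthetic rasters; the class codes are arbitrary.

```python
# Sketch: raster stratification and recoverable merging on synthetic grids.
import numpy as np

land_use = np.array([[1, 1, 2], [2, 3, 3], [3, 3, 1]])   # coded classes
soils    = np.array([[4, 5, 5], [4, 4, 5], [5, 4, 4]])

# Stratification: keep soil codes only where land use is class 3 (a mask).
soils_in_class3 = np.where(land_use == 3, soils, 0)

# Merging: pack both layers into one composite; each decimal digit still
# carries one source layer, so the inputs can be extracted again. Data
# *integration*, by contrast, would make the sources unrecoverable.
composite = land_use * 10 + soils
recovered_land_use, recovered_soils = composite // 10, composite % 10
assert (recovered_land_use == land_use).all()
assert (recovered_soils == soils).all()
print(composite)
```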
Xiang, Kun; Li, Yinglei; Ford, William; Land, Walker; Schaffer, J David; Congdon, Robert; Zhang, Jing; Sadik, Omowunmi
2016-02-21
We hereby report the design and implementation of an Autonomous Microbial Cell Culture and Classification (AMC³) system for rapid detection of food pathogens. Traditional food testing methods require multistep procedures and long incubation periods, and are thus prone to human error. AMC³ introduces a "one click approach" to the detection and classification of pathogenic bacteria. Once the cultured materials are prepared, all operations are automatic. AMC³ is an integrated sensor array platform in a microbial fuel cell system composed of a multi-potentiostat, an automated data collection system (Python program, Yocto Maxi-coupler electromechanical relay module) and a powerful classification program. The classification scheme consists of a Probabilistic Neural Network (PNN), Support Vector Machines (SVM) and a General Regression Neural Network (GRNN) oracle-based system. Differential Pulse Voltammetry (DPV) is performed on standard samples or unknown samples. Then, using preset feature extractions and quality control, accepted data are analyzed by the intelligent classification system. In a typical use, thirty-two extracted features were analyzed to correctly classify the following pathogens: Escherichia coli ATCC#25922, Escherichia coli ATCC#11775, and Staphylococcus epidermidis ATCC#12228. An accuracy of 85.4% was recorded for unknown samples, within a shorter time period than the industry standard of 24 hours.
NASA Astrophysics Data System (ADS)
Hu, Chen; Chen, Mian-zhou; Li, Hong-bin; Zhang, Zhu; Jiao, Yang; Shao, Haiming
2018-05-01
Ordinarily, electronic voltage transformers (EVTs) are calibrated off-line, and the calibration procedure requires complex switching operations, which influence the reliability of the power grid and induce large economic losses. To overcome this problem, this paper investigates a 110 kV on-site calibration system for EVTs, including a standard channel, a calibrated channel and a PC equipped with the LabView environment. The standard channel employs a standard capacitor and an analogue integrating circuit to reconstruct the primary voltage signal. Moreover, an adaptive full-phase discrete Fourier transform (DFT) algorithm is proposed to extract electrical parameters. The algorithm involves extracting the grid frequency, adjusting the operation points, and calculating the results using the DFT. In addition, an insulated automatic lifting device, driven by a wireless remote controller, is designed to realize live connection of the standard capacitor. A performance test of the capacitor verifies the accuracy of the standard capacitor. A system calibration test shows that the system ratio error is less than 0.04% and the phase error is below 2′, which meets the requirement of the 0.2 accuracy class. Finally, the developed calibration system was used in a substation, and the field test data validate the availability of the system.
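A brief sketch of the calibration arithmetic implied above: single-bin DFT extraction of the fundamental phasors of the standard and calibrated channels over an integer number of cycles, followed by ratio-error and phase-error computation. The signal parameters and error magnitudes are assumed for illustration, and the adaptive frequency-tracking step is reduced to a known f0.

```python
# Sketch: fundamental-phasor extraction and EVT error computation.
import numpy as np

fs, f0, cycles = 10_000.0, 50.0, 10        # sample rate, grid frequency, window
n = int(fs / f0 * cycles)                  # integer number of cycles in window
t = np.arange(n) / fs

std = 1.0000 * np.sin(2 * np.pi * f0 * t)           # standard channel (p.u.)
dut = 0.9996 * np.sin(2 * np.pi * f0 * t + 3e-4)    # EVT under test, small errors

def fundamental(x):
    """Complex phasor of the f0 component via a single-bin DFT."""
    k = np.exp(-2j * np.pi * f0 * np.arange(len(x)) / fs)
    return 2 * (x @ k) / len(x)

ps, pd = fundamental(std), fundamental(dut)
ratio_error_pct = (abs(pd) - abs(ps)) / abs(ps) * 100
phase_error_arcmin = np.degrees(np.angle(pd / ps)) * 60
print(f"ratio error {ratio_error_pct:.4f}%, phase error {phase_error_arcmin:.2f}'")
```

With the assumed inputs this prints roughly −0.04% and 1′, i.e. just inside the 0.2-class limits quoted in the abstract.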
Road and Roadside Feature Extraction Using Imagery and LIDAR Data for Transportation Operation
NASA Astrophysics Data System (ADS)
Ural, S.; Shan, J.; Romero, M. A.; Tarko, A.
2015-03-01
Transportation agencies require up-to-date, reliable, and feasibly acquired information on road geometry and features within proximity to the roads as input for evaluating and prioritizing new or improvement road projects. The information needed for a robust evaluation of road projects includes road centerline, width, and extent together with the average grade, cross-sections, and obstructions near the travelled way. Remote sensing is equipped with a large collection of data and well-established tools for acquiring the information and extracting the aforementioned road features at various levels and scopes. Even with many remote sensing data and methods available for road extraction, transportation operation requires more than the centerlines. Acquiring information that is spatially coherent at the operational level for the entire road system is challenging and needs multiple data sources to be integrated. In the presented study, we established a framework that used data from multiple sources, including one-foot resolution color infrared orthophotos, airborne LiDAR point clouds, and existing spatially non-accurate ancillary road networks. We were able to extract 90.25% of a total of 23.6 miles of road networks together with estimated road width, average grade along the road, and cross sections at specified intervals. We also extracted buildings and vegetation within a predetermined proximity to the extracted road extent; 90.6% of 107 existing buildings were correctly identified, with a 31% false detection rate.
Carpinteiro, I; Abuín, B; Rodríguez, I; Ramil, M; Cela, R
2010-06-11
A novel and sensitive method for the determination of five benzotriazole compounds (commonly used as light stabilizers) in indoor dust is presented. Pressurized liquid extraction (PLE) and gas chromatography followed by tandem in time mass spectrometry (GC-MS/MS) were used as sample preparation and determination techniques, respectively. Extraction and clean-up were integrated on-line and, after an evaporative concentration step, the extract provided by the PLE instrument was injected directly in the GC-MS/MS system. Parameters affecting the performance of the sample preparation process were evaluated using experimental factorial designs. Under optimized conditions, analytes were recovered from 0.5 g samples in 3 static extraction cycles of 10 min, using a hexane:dichloromethane (7:3) mixture, at 90 °C. Silica (1 g) was placed in the bottom of the extraction cells as clean-up sorbent. The recoveries of the method varied from 82 to 122%, with standard deviations below 13. The inter-day precision ranged from 9 to 12%, and the limits of quantification (LOQs) remained below 10 ng g⁻¹ for all species. For the first time, four of the five investigated species were found in dust from indoor environments. Their mean concentrations ranged from 71 to 780 ng g⁻¹. Copyright 2010 Elsevier B.V. All rights reserved.
2010-01-01
Background The modular approach to analysis of genetically modified organisms (GMOs) relies on the independence of the modules combined (i.e. DNA extraction and GM quantification). The validity of this assumption has to be proved on the basis of specific performance criteria. Results An experiment was conducted using, as a reference, the validated quantitative real-time polymerase chain reaction (PCR) module for detection of glyphosate-tolerant Roundup Ready® GM soybean (RRS). Different DNA extraction modules (CTAB, Wizard and Dellaporta), were used to extract DNA from different food/feed matrices (feed, biscuit and certified reference material [CRM 1%]) containing the target of the real-time PCR module used for validation. Purity and structural integrity (absence of inhibition) were used as basic criteria that a DNA extraction module must satisfy in order to provide suitable template DNA for quantitative real-time (RT) PCR-based GMO analysis. When performance criteria were applied (removal of non-compliant DNA extracts), the independence of GMO quantification from the extraction method and matrix was statistically proved, except in the case of Wizard applied to biscuit. A fuzzy logic-based procedure also confirmed the relatively poor performance of the Wizard/biscuit combination. Conclusions For RRS, this study recognises that modularity can be generally accepted, with the limitation of avoiding combining highly processed material (i.e. biscuit) with a magnetic-beads system (i.e. Wizard). PMID:20687918
Jiang, Wen-Hao; Liu, Jian-Hong; Liu, Yin; Jin, Ge; Zhang, Jun; Pan, Jian-Wei
2017-12-15
InGaAs/InP single-photon detectors (SPDs) are the key devices for applications requiring near-infrared single-photon detection. The gating mode is an effective approach to synchronous single-photon detection. Increasing gating frequency and reducing the module size are important challenges for the design of such a detector system. Here we present for the first time, to the best of our knowledge, an InGaAs/InP SPD with 1.25 GHz sine wave gating (SWG) using a monolithically integrated readout circuit (MIRC). The MIRC has a size of 15 mm×15 mm and implements the miniaturization of avalanche extraction for high-frequency SWG. In the MIRC, low-pass filters and a low-noise radio frequency amplifier are integrated based on the technique of low temperature co-fired ceramic, which can effectively reduce the parasitic capacitance and extract weak avalanche signals. We then characterize the InGaAs/InP SPD to verify the functionality and reliability of the MIRC, and the SPD exhibits excellent performance with 27.5% photon detection efficiency, a 1.2 kcps dark count rate, and 9.1% afterpulse probability at 223 K and 100 ns hold-off time. With this MIRC, one can further design miniaturized high-frequency SPD modules that are highly required for practical applications.
A 64-channel ultra-low power system-on-chip for local field and action potentials recording
NASA Astrophysics Data System (ADS)
Rodríguez-Pérez, Alberto; Delgado-Restituto, Manuel; Darie, Angela; Soto-Sánchez, Cristina; Fernández-Jover, Eduardo; Rodríguez-Vázquez, Ángel
2015-06-01
This paper reports an integrated 64-channel neural recording sensor. Neural signals are acquired, filtered, digitized and compressed in the channels. Additionally, each channel implements an auto-calibration mechanism which configures the transfer characteristics of the recording site. The system has two transmission modes; in one case the information captured by the channels is sent as uncompressed raw data; in the other, feature vectors extracted from the detected neural spikes are released. Data streams coming from the channels are serialized by an embedded digital processor. Experimental results, including in vivo measurements, show that the power consumption of the complete system is lower than 330 μW.
Li, Caixia; Chen, Qiyu; Zhang, Xiaoyan; Snyder, Shane A; Gong, Zhiyuan; Lam, Siew Hong
2017-12-11
Comprehensive monitoring of water pollution is challenging. With the increasing amount and types of anthropogenic compounds being released into water, there are rising concerns of undetected toxicity. This is especially true for municipal wastewater effluents that are discharged to surface waters. This study was designed to integrate zebrafish toxicogenomics, targeted gene expression, and morphological analyses, for toxicity evaluation of effluent discharged from two previously characterized wastewater treatment plants (WWTPs) in Pima County, Arizona, and their receiving surface water. Zebrafish embryos were exposed to organic extracts from the WWTP1 effluent that were reconstituted to represent 1× and 0.5× of the original concentration. Microarray analyses identified deregulated gene probes that mapped to 1666, 779, and 631 unique human homologs in the 1×, 0.5×, and the intersection of both groups, respectively. These were associated with 18 cellular and molecular functions ranging from cell cycle to metabolism and are involved in the development and function of 10 organ systems including nervous, cardiovascular, haematological, reproductive, and hepatic systems. The superpathway of cholesterol biosynthesis, retinoic acid receptor activation, glucocorticoid receptor and prolactin signaling were among the top 11 perturbed canonical pathways. Real-time quantitative PCR validated the expression changes of 12 selected genes. These genes were then tested on zebrafish embryos exposed to the reconstituted extract of water sampled downstream of WWTP1 and another nearby WWTP2. The expression of several targeted genes was significantly affected by the WWTP effluents and some of the downstream receiving waters. Morphological analyses using four transgenic zebrafish lines revealed potential toxicity associated with nervous, hepatic, endothelial-vascular and myeloid systems. This study demonstrated how information can be obtained using an adverse outcome pathway framework to derive biological effect-based monitoring tools. This integrated approach using zebrafish can supplement analytical chemistry to provide more comprehensive monitoring of discharged effluents and their receiving waters. Copyright © 2017 Elsevier Ltd. All rights reserved.
Xie, Zhi-Peng; Liu, Xue-Song; Chen, Yong; Cai, Ming; Qu, Hai-Bin; Cheng, Yi-Yu
2007-05-01
Multi-stage countercurrent extraction, which integrates solvent extraction and repercolation with dynamic countercurrent extraction, is a novel extraction technology for traditional Chinese medicine. This solvent-saving, energy-saving and highly efficient technology maximizes the diffusion of active compounds from the herbal materials into the solvent, stage by stage, by maintaining concentration differences between the herbal materials and the solvents. This paper reviews the basic principle, the influencing factors, and the research progress and trends of the equipment and applications of multi-stage countercurrent extraction.
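A toy stage-by-stage mass balance conveying the principle described above, in which fresh solvent always meets the most-depleted material so a concentration difference persists at every stage. The partition ratio, stage count, and the simplification that the solid bed stays in place between solvent passes are all assumptions for illustration.

```python
# Sketch: repeated countercurrent solvent passes over a train of herbal beds.
import numpy as np

K, stages, passes = 2.0, 4, 10   # assumed solvent/solid partition ratio, train size
solid = np.full(stages, 1.0)     # solute units remaining in each herbal bed
extracted = 0.0

for _ in range(passes):
    carry = 0.0                          # fresh solvent enters at the spent end...
    for i in reversed(range(stages)):    # ...and flows toward the freshest bed
        total = solid[i] + carry
        carry = total * K / (1 + K)      # equilibrium split: solute taken by solvent
        solid[i] = total - carry
    extracted += carry                   # loaded extract leaves past the fresh bed

print(f"recovered {extracted:.3f} of {stages * 1.0:.1f} solute units")
```

Because the solvent leaving the train has just contacted the freshest material, it exits highly loaded while each upstream bed still sees a solvent leaner than itself, which is the driving-force argument the abstract makes.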
Rasmussen, Luke V; Peissig, Peggy L; McCarty, Catherine A; Starren, Justin
2012-06-01
Although the penetration of electronic health records is increasing rapidly, much of the historical medical record is only available in handwritten notes and forms, which require labor-intensive, human chart abstraction for some clinical research. The few previous studies on automated extraction of data from these handwritten notes have focused on monolithic, custom-developed recognition systems or third-party systems that require proprietary forms. We present an optical character recognition processing pipeline, which leverages the capabilities of existing third-party optical character recognition engines, and provides the flexibility offered by a modular custom-developed system. The system was configured and run on a selected set of form fields extracted from a corpus of handwritten ophthalmology forms. The processing pipeline allowed multiple configurations to be run, with the optimal configuration consisting of the Nuance and LEADTOOLS engines running in parallel with a positive predictive value of 94.6% and a sensitivity of 13.5%. While limitations exist, preliminary experience from this project yielded insights on the generalizability and applicability of integrating multiple, inexpensive general-purpose third-party optical character recognition engines in a modular pipeline.
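A structural sketch of such a modular pipeline, with interchangeable engines run in parallel and a simple agreement-then-confidence combiner; the engine classes are hypothetical stand-ins (the commercial Nuance and LEADTOOLS SDKs are not reproduced here), and the acceptance rule is an assumption mirroring the precision-over-recall configuration reported.

```python
# Sketch: modular OCR pipeline with pluggable engines and a combiner.
from concurrent.futures import ThreadPoolExecutor
from dataclasses import dataclass

@dataclass
class OcrResult:
    text: str
    confidence: float          # engine-reported score in [0, 1]

class StubEngine:
    """Hypothetical engine adapter; a real one would wrap a vendor SDK."""
    def __init__(self, name, text, confidence):
        self.name, self._text, self._conf = name, text, confidence
    def recognize(self, field_image) -> OcrResult:
        return OcrResult(self._text, self._conf)

def run_pipeline(field_image, engines, min_conf=0.95):
    # Run all engines in parallel on the same form-field image.
    with ThreadPoolExecutor() as pool:
        results = list(pool.map(lambda e: e.recognize(field_image), engines))
    # Accept on agreement; otherwise accept only a very confident engine.
    # This trades sensitivity for a high positive predictive value.
    if len({r.text for r in results}) == 1:
        return results[0].text
    best = max(results, key=lambda r: r.confidence)
    return best.text if best.confidence >= min_conf else None  # None -> human review

engines = [StubEngine("A", "20/40", 0.92), StubEngine("B", "20/40", 0.88)]
print(run_pipeline(b"...field image bytes...", engines))
```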
NASA Technical Reports Server (NTRS)
Bower, Hannah; Cryderman, Kate; Captain, Janine
2016-01-01
The Resource Prospector (RP) mission with the Regolith and Environment Science and Oxygen and Lunar Volatile Extraction (RESOLVE) payload will prospect for water within the lunar regolith and provide a proof of concept for In-Situ Resource Utilization (ISRU) techniques, which could be used on future lunar and Martian missions. One system within the RESOLVE payload is the Lunar Advanced Volatiles Analysis (LAVA) subsystem, which consists of a Fluid Sub System (FSS) that transports volatiles to the Gas Chromatograph-Mass Spectrometer (GC-MS) instrument. In order for the FSS to transport precise and accurate amounts of volatiles to the GC-MS instrumentation, high performance valves are used within the system. The focus of this investigation is to evaluate the redesigned Lee valve. Initial data show that the valve could meet our requirements; however, further work is required to raise the TRL to an acceptable level for the valve to be included in the flight design of the system. At this time the risk is too high to change our baseline design to include these non-latching Lee solenoid valves.
NASA Technical Reports Server (NTRS)
Lempriere, B. M.
1987-01-01
The procedures and results of a study of a conceptual system for measuring the debris environment on the space station is discussed. The study was conducted in two phases: the first consisted of experiments aimed at evaluating location of impact through panel response data collected from acoustic emission sensors; the second analyzed the available statistical description of the environment to determine the probability of the measurement system producing useful data, and analyzed the results of the previous tests to evaluate the accuracy of location and the feasibility of extracting impactor characteristics from the panel response. The conclusions were that for one panel the system would not be exposed to any event, but that the entire Logistics Module would provide a modest amount of data. The use of sensors with higher sensitivity than those used in the tests could be advantageous. The impact location could be found with sufficient accuracy from panel response data. The waveforms of the response were shown to contain information on the impact characteristics, but the data set did not span a sufficient range of the variables necessary to evaluate the feasibility of extracting the information.
Multi-Filter String Matching and Human-Centric Entity Matching for Information Extraction
ERIC Educational Resources Information Center
Sun, Chong
2012-01-01
More and more information is being generated in text documents, such as Web pages, emails and blogs. To effectively manage this unstructured information, one broadly used approach includes locating relevant content in documents, extracting structured information and integrating the extracted information for querying, mining or further analysis. In…
Alexandrou, Lydon D; Spencer, Michelle J S; Morrison, Paul D; Meehan, Barry J; Jones, Oliver A H
2015-04-15
Solid phase extraction is one of the most commonly used pre-concentration and cleanup steps in environmental science. However, traditional methods need electrically powered pumps, can use large volumes of solvent (if multiple samples are run), and require several hours to filter a sample. Additionally, if the cartridge is open to the air volatile compounds may be lost and sample integrity compromised. In contrast, micro cartridge based solid phase extraction can be completed in less than 2 min by hand, uses only microlitres of solvent and provides comparable concentration factors to established methods. It is also an enclosed system so volatile components are not lost. The sample can also be eluted directly into a detector (e.g. a mass spectrometer) if required. However, the technology is new and has not been much used for environmental analysis. In this study we compare traditional (macro) and the new micro solid phase extraction for the analysis of four common volatile trihalomethanes (trichloromethane, bromodichloromethane, dibromochloromethane and tribromomethane). The results demonstrate that micro solid phase extraction is faster and cheaper than traditional methods with similar recovery rates for the target compounds. This method shows potential for further development in a range of applications. Copyright © 2015 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Maboudi, Mehdi; Amini, Jalal; Malihi, Shirin; Hahn, Michael
2018-04-01
Updated road networks, as a crucial part of the transportation database, play an important role in various applications. Thus, increasing the automation of road extraction approaches from remote sensing images has been the subject of extensive research. In this paper, we propose an object-based road extraction approach from very high resolution satellite images. Based on object-based image analysis, our approach incorporates various spatial, spectral, and textural object descriptors, the capabilities of a fuzzy logic system for handling the uncertainties in road modelling, and the effectiveness and suitability of the ant colony algorithm for optimization of network-related problems. Four VHR optical satellite images acquired by the WorldView-2 and IKONOS satellites are used to evaluate the proposed approach. Evaluation of the extracted road networks shows that the average completeness, correctness, and quality of the results can reach 89%, 93% and 83% respectively, indicating that the proposed approach is applicable for urban road extraction. We also analyzed the sensitivity of our algorithm to different ant colony optimization parameter values. Comparison of the achieved results with those of four state-of-the-art algorithms and quantification of the robustness of the fuzzy rule set demonstrate that the proposed approach is both efficient and transferable to other comparable images.
Broschard, Thomas H; Glowienke, Susanne; Bruen, Uma S; Nagao, Lee M; Teasdale, Andrew; Stults, Cheryl L M; Li, Kim L; Iciek, Laurie A; Erexson, Greg; Martin, Elizabeth A; Ball, Douglas J
2016-11-01
Leachables from pharmaceutical container closure systems can present potential safety risks to patients. Extractables studies may be performed as a risk mitigation activity to identify potential leachables for dosage forms with a high degree of concern associated with the route of administration. To address safety concerns, approaches to the toxicological safety evaluation of extractables and leachables have been developed and applied by pharmaceutical and biologics manufacturers. Details of these approaches may differ depending on the nature of the final drug product, including its application, formulation, route of administration and length of use. Current regulatory guidelines and industry standards provide general guidance on compound-specific safety assessments but do not provide a comprehensive approach to safety evaluations of leachables and/or extractables. This paper provides a perspective on approaches to safety evaluations by reviewing and applying general concepts and integrating key steps in the toxicological evaluation of individual extractables or leachables. These include application of structure-activity relationship studies, development of permitted daily exposure (PDE) values, and use of safety threshold concepts. Case studies are provided. The concepts presented seek to encourage discussion in the scientific community, and are not intended to represent a final opinion or "guidelines." Copyright © 2016 Elsevier Inc. All rights reserved.
A Method for Extracting Road Boundary Information from Crowdsourcing Vehicle GPS Trajectories.
Yang, Wei; Ai, Tinghua; Lu, Wei
2018-04-19
Crowdsourcing trajectory data is an important approach for accessing and updating road information. In this paper, we present a novel approach for extracting road boundary information from crowdsourced vehicle traces based on Delaunay triangulation (DT). First, an optimization and interpolation method is proposed to filter abnormal trace segments from raw global positioning system (GPS) traces and to interpolate the optimized segments adaptively, ensuring there are enough tracking points. Second, the DT and the Voronoi diagram are constructed within the interpolated tracking lines to calculate road boundary descriptors from the areas of the Voronoi cells and the lengths of the triangle edges. The road boundary detection model is then established by integrating the boundary descriptors and trajectory movement features (e.g., direction) through the DT. Third, the boundary detection model is used to detect the road boundary from the DT constructed from the trajectory lines, and a region-growing method based on seed polygons is proposed to extract the road boundary. Experiments were conducted using the GPS traces of taxis in Beijing, China, and the results show that the proposed method is suitable for extracting the road boundary from low-frequency GPS traces, multiple road structures, and different time intervals. Compared with two existing methods, the automatically extracted boundary information proved to be of higher quality.
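A minimal sketch of the triangulation step at the heart of this method: build a DT over tracking points and use triangle edge length as a simple boundary descriptor, since long edges tend to bridge the gap between opposite roadsides. The synthetic two-band point set and the mean-plus-one-sigma cut-off are illustrative assumptions; the paper's Voronoi-cell descriptor and seed-polygon region growing are not reproduced.

```python
# Sketch: Delaunay triangulation over GPS points with an edge-length descriptor.
import numpy as np
from scipy.spatial import Delaunay

rng = np.random.default_rng(3)
xs = np.linspace(0, 100, 50)
# Synthetic traces: two parallel "roadsides" of jittered tracking points.
pts = np.vstack([np.column_stack([xs, rng.normal(0.0, 0.5, 50)]),
                 np.column_stack([xs, rng.normal(8.0, 0.5, 50)])])

tri = Delaunay(pts)
edges = sorted({tuple(sorted((s[a], s[b])))
                for s in tri.simplices for a, b in ((0, 1), (1, 2), (0, 2))})
lengths = np.array([np.linalg.norm(pts[i] - pts[j]) for i, j in edges])

# Long edges mostly span the empty corridor between the two point bands.
threshold = lengths.mean() + lengths.std()          # assumed simple cut-off
crossing = [e for e, L in zip(edges, lengths) if L > threshold]
print(f"{len(crossing)} of {len(edges)} edges flagged as boundary-spanning")
```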
Gulliksen, Anja; Keegan, Helen; Martin, Cara; O'Leary, John; Solli, Lars A.; Falang, Inger Marie; Grønn, Petter; Karlgård, Aina; Mielnik, Michal M.; Johansen, Ib-Rune; Tofteberg, Terje R.; Baier, Tobias; Gransee, Rainer; Drese, Klaus; Hansen-Hagge, Thomas; Riegger, Lutz; Koltay, Peter; Zengerle, Roland; Karlsen, Frank; Ausen, Dag; Furuberg, Liv
2012-01-01
The paper presents the development of a “proof-of-principle” hands-free and self-contained diagnostic platform for detection of human papillomavirus (HPV) E6/E7 mRNA in clinical specimens. The automated platform performs chip-based sample preconcentration, nucleic acid extraction, amplification, and real-time fluorescent detection with minimal user interfacing. It consists of two modular prototypes, one for sample preparation and one for amplification and detection; however, a common interface is available to facilitate later integration into one single module. Nucleic acid extracts (n = 28) from cervical cytology specimens extracted on the sample preparation chip were tested using the PreTect HPV-Proofer and achieved an overall detection rate for HPV across all dilutions of 50%–85.7%. A subset of 6 clinical samples extracted on the sample preparation chip module was chosen for complete validation on the NASBA chip module. For 4 of the samples, a 100% amplification for HPV 16 or 33 was obtained at the 1 : 10 dilution for microfluidic channels that filled correctly. The modules of a “sample-in, answer-out” diagnostic platform have been demonstrated from clinical sample input through sample preparation, amplification and final detection. PMID:22235204
Integration of internet of things to reduce various losses of jatropha seed supply chain
NASA Astrophysics Data System (ADS)
Srinivasan, S. P.; Anitha, J.; Vijayakumar, R.
2017-06-01
The evolution of the biofuel supply chain has transformed organizations by restructuring traditional management practices, and a flexible distribution system is becoming a need of our society. The main focus of this paper is to integrate IoT technologies into the cultivation, extraction and management of Jatropha seed. Major setbacks for farmers have been traced to poor supply chain integration; the associated losses, such as missing information about Jatropha seed availability, the locations of esterification plants, and distribution details, can be identified through IoT. This enables farmers to reorganize land resources, yield estimation and distribution functions, and smart phone technologies can help tackle wastage and the scarcity of energy. This paper proposes a conceptual framework for the various losses involved in the supply chain of Jatropha seed.
Knowledge Acquisition of Generic Queries for Information Retrieval
Seol, Yoon-Ho; Johnson, Stephen B.; Cimino, James J.
2002-01-01
Several studies have identified clinical questions posed by health care professionals to understand the nature of information needs during clinical practice. To support access to digital information sources, it is necessary to integrate the information needs with a computer system. We have developed a conceptual guidance approach in information retrieval, based on a knowledge base that contains the patterns of information needs. The knowledge base uses a formal representation of clinical questions based on the UMLS knowledge sources, called the Generic Query model. To improve the coverage of the knowledge base, we investigated a method for extracting plausible clinical questions from the medical literature. This poster presents the Generic Query model, shows how it is used to represent the patterns of clinical questions, and describes the framework used to extract knowledge from the medical literature.
Miyoshi, Newton Shydeo Brandão; Pinheiro, Daniel Guariz; Silva, Wilson Araújo; Felipe, Joaquim Cezar
2013-06-06
The use of the knowledge produced by sciences to promote human health is the main goal of translational medicine. To make this feasible, we need computational methods to handle the large amount of information that arises from bench to bedside and to deal with its heterogeneity. A computational challenge that must be faced is to promote the integration of clinical, socio-demographic and biological data. In this effort, ontologies play an essential role as a powerful artifact for knowledge representation. Chado is a modular ontology-oriented database model that gained popularity due to its robustness and flexibility as a generic platform to store biological data; however, it lacks support for representing clinical and socio-demographic information. We have implemented an extension of Chado - the Clinical Module - to allow the representation of this kind of information. Our approach consists of a framework for data integration through the use of a common reference ontology. The design of this framework has four levels: a data level, to store the data; a semantic level, to integrate and standardize the data by the use of ontologies; an application level, to manage clinical databases, ontologies and the data integration process; and a web interface level, to allow interaction between the user and the system. The Clinical Module was built based on the Entity-Attribute-Value (EAV) model. We also proposed a methodology to migrate data from legacy clinical databases to the integrative framework. A Chado instance was initialized using a relational database management system. The Clinical Module was implemented and the framework was loaded using data from a factual clinical research database. Clinical and demographic data as well as biomaterial data were obtained from patients with tumors of the head and neck. We implemented the IPTrans tool, which is a complete environment for data migration, comprising: the construction of a model to describe the legacy clinical data, based on an ontology; the Extraction, Transformation and Load (ETL) process to extract the data from the source clinical database and load it into the Clinical Module of Chado; and the development of a web tool and a Bridge Layer to adapt the web tool to Chado, as well as other applications. Open-source computational solutions currently available for translational science do not have a model to represent biomolecular information and are not integrated with existing bioinformatics tools. On the other hand, existing genomic data models do not represent clinical patient data. A framework was developed to support translational research by integrating biomolecular information coming from different "omics" technologies with patients' clinical and socio-demographic data. This framework should present some features: flexibility, compression and robustness. The experiments accomplished on a use case demonstrated that the proposed system meets the requirements of flexibility and robustness, leading to the desired integration. The Clinical Module can be accessed at http://dcm.ffclrp.usp.br/caib/pg=iptrans.
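The Entity-Attribute-Value pattern at the heart of the Clinical Module can be illustrated compactly. The following is a minimal Python sketch of generic EAV storage, not the actual Chado/IPTrans schema; all names (EAVRow, attributes_of) are hypothetical.

```python
from dataclasses import dataclass

# Minimal sketch of the Entity-Attribute-Value (EAV) pattern: each clinical
# fact is one (entity, attribute, value) row, so adding a new attribute needs
# no schema change. Names are illustrative, not the Chado/IPTrans schema.
@dataclass
class EAVRow:
    entity_id: int      # e.g. a patient or biomaterial sample
    attribute: str      # ideally a term from the reference ontology
    value: str

store: list[EAVRow] = []
store.append(EAVRow(1, "tumor_site", "larynx"))
store.append(EAVRow(1, "smoking_status", "former"))

def attributes_of(entity_id: int) -> dict[str, str]:
    """Pivot the EAV rows for one entity back into a flat record."""
    return {r.attribute: r.value for r in store if r.entity_id == entity_id}

print(attributes_of(1))  # {'tumor_site': 'larynx', 'smoking_status': 'former'}
```

The trade-off is the usual one for EAV designs: maximal flexibility for heterogeneous clinical attributes at the cost of pivot queries like the one above.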
Tarozzi, A; Hrelia, S; Angeloni, C; Morroni, F; Biagi, P; Guardigli, M; Cantelli-Forti, G; Hrelia, P
2006-03-01
Consumers consider plant food products of organic origin healthier than the corresponding conventional plant foods, but clear experimental evidence supporting this assumption is still lacking. The aim was to determine whether organic red oranges have a higher phytochemical content (i.e., phenolics, anthocyanins and ascorbic acid), total antioxidant activity and in vitro bioactivity, in terms of protective effect against oxidative damage at the cellular level, than non-organic red oranges. Total phenolics were measured using the Folin-Ciocalteu assay, while total anthocyanins and ascorbic acid levels were determined by spectrophotometric and HPLC analysis, respectively. In addition, the total antioxidant activity of red orange extracts was measured by the ABTS(*+) test. The ability of red orange extracts to counteract the production of conjugated diene-containing lipids and free radicals in cultured rat cardiomyocytes and differentiated Caco-2 cells, respectively, was assessed. Organic oranges had significantly higher total phenolics, total anthocyanins and ascorbic acid levels than the corresponding non-organic oranges (all p < 0.05). Moreover, the organic orange extracts had a higher total antioxidant activity than non-organic orange extracts (p < 0.05). In addition, our results indicate that red oranges have a strong capacity to inhibit the production of conjugated diene-containing lipids and free radicals in rat cardiomyocytes and differentiated Caco-2 cells, respectively. Statistically higher levels of antioxidant activity in both cell models were found in organically grown oranges as compared with those produced by integrated agriculture practice. Our results clearly show that organic red oranges have a higher phytochemical content (i.e., phenolics, anthocyanins and ascorbic acid), total antioxidant activity and bioactivity than integrated red oranges. Further studies are needed to confirm whether organic agriculture practice is likely to increase the antioxidant activity of other varieties of fruits and vegetables.
Dong, Tao; Fei, Qiang; Genelot, Marie; ...
2017-03-08
In light of the availability of low-cost methane (CH4) derived from natural gas and biogas, along with increasing concerns about greenhouse gas emissions, the production of alternative liquid biofuels directly from CH4 is a promising approach to capturing wasted energy. A novel biorefinery concept integrating biological conversion of CH4 to microbial lipids together with lipid extraction and generation of hydrocarbon fuels is demonstrated in this study for the first time. An aerobic methanotrophic bacterium, Methylomicrobium buryatense, capable of using CH4 as the sole carbon source, was selected on the basis of genetic tractability, cultivation robustness, and ability to accumulate phospholipids in membranes. A maximum fatty acid content of 10% of dry cell weight was obtained in batch cultures grown in a continuous gas sparging fermentation system. Although phospholipids are not typically considered a good feedstock for upgrading to hydrocarbon fuels, we set out to demonstrate that, by combining a novel lipid extraction methodology with advanced catalyst design, we could prove the feasibility of this approach. Up to 95% of the total fatty acids from membrane-bound phospholipids were recovered by a two-stage pretreatment method followed by hexane extraction of the aqueous hydrolysate. The upgrading of extracted lipids was then demonstrated in a hydrodeoxygenation process using palladium on silica as a catalyst. Lipid conversion in excess of 99% was achieved, with full selectivity to hydrocarbons. Lastly, the final hydrocarbon mixture is dominated by pentadecane (C15H32, 88%), produced by decarbonylation/decarboxylation and hydrogenation of C16 fatty acids, indicating that a biological gas-to-liquid fuel (Bio-GTL) process is technically feasible.
NASA Astrophysics Data System (ADS)
Huong, Do Thi Viet; Nagasawa, Ryota
2014-01-01
The potential flood hazard was assessed for the Hoa Chau commune in central Vietnam in order to identify the high flood hazard zones for the decision makers who will execute future rural planning. A new approach for deriving the potential flood hazard based on integration of inundation and flow direction maps is described. Areas inundated in the historical flood event of 2007 were extracted from Advanced Land Observing Satellite (ALOS) phased array L-band synthetic aperture radar (PALSAR) images, while flow direction characteristics were derived from the ASTER GDEM to extract the depressed surfaces. Past flood experience and the flow direction were then integrated to analyze and rank the potential flood hazard zones. The land use/cover map extracted from LANDSAT TM and flood depth point records from field surveys were utilized to check the possibility of susceptible inundated areas, extracting data from ALOS PALSAR and ranking the potential flood hazard. The estimation of potential flood hazard areas revealed that 17.43% and 17.36% of Hoa Chau had high and medium potential flood hazards, respectively. The flow direction and ALOS PALSAR data were effectively integrated for determining the potential flood hazard when hydrological and meteorological data were inadequate and remote sensing images taken during flood times were not available or were insufficient.
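The flow-direction step described above is conventionally computed with the D8 algorithm, which assigns each DEM cell to its steepest downslope neighbor; cells with no downslope neighbor mark depressions. The sketch below is a minimal Python illustration of D8 on a NumPy elevation grid, assuming a DEM such as a resampled ASTER GDEM tile; it is not the authors' implementation, and the function names are illustrative.

```python
import numpy as np

# D8 flow direction: each interior cell points to its steepest downslope
# neighbor. `dem` is a 2-D elevation array; border cells are left unassigned.
D8_OFFSETS = [(-1, -1), (-1, 0), (-1, 1),
              ( 0, -1),          ( 0, 1),
              ( 1, -1), ( 1, 0), ( 1, 1)]

def d8_flow_direction(dem: np.ndarray) -> np.ndarray:
    rows, cols = dem.shape
    direction = np.full((rows, cols), -1, dtype=int)  # -1 = no outflow (sink)
    for r in range(1, rows - 1):
        for c in range(1, cols - 1):
            drops = []
            for dr, dc in D8_OFFSETS:
                dist = np.hypot(dr, dc)  # diagonal neighbors are farther away
                drops.append((dem[r, c] - dem[r + dr, c + dc]) / dist)
            k_max = int(np.argmax(drops))
            if drops[k_max] > 0:         # a strictly downslope neighbor exists
                direction[r, c] = k_max
    return direction

# Cells left at -1 have no downslope neighbor; these depressed surfaces are
# the candidate flood-accumulation zones integrated with the PALSAR map.
dem = np.random.default_rng(0).random((50, 50)).cumsum(axis=0)
print((d8_flow_direction(dem) == -1).sum(), "sink cells")
```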
Sapphire Energy - Integrated Algal Biorefinery
DOE Office of Scientific and Technical Information (OSTI.GOV)
White, Rebecca L.; Tyler, Mike
2015-07-22
Sapphire Energy, Inc. (SEI) is a leader in large-scale photosynthetic algal biomass production, with a strongly cohesive research, development, and operations program. SEI takes a multidiscipline approach to integrate lab-based strain selection, cultivation and harvest at production scale, and extraction for the production of Green Crude oil, a drop-in replacement for traditional crude oil. SEI's technical accomplishments since 2007 have produced a multifunctional platform that can address needs for fuel, feed, and other higher-value products. Figure 1 outlines SEI's commercialization process, including Green Crude production and refinement to drop-in fuel replacements. The large-scale algal biomass production facility, the SEI Integrated Algal Biorefinery (IABR), was built in Luna County near Columbus, New Mexico (see Figure 2). The extraction unit was located at the existing SEI facility in Las Cruces, New Mexico, approximately 95 miles from the IABR. The IABR facility was constructed on time and on budget, and the extraction unit expansion to accommodate the biomass output from the IABR was completed in October 2012. The IABR facility uses open-pond cultivation with a proprietary harvesting method to produce algal biomass; this biomass is then shipped to the extraction facility for conversion to Green Crude. The operation of the IABR and the extraction facilities has demonstrated the critical integration of traditional agricultural techniques with algae cultivation knowledge for algal biomass production, and the successful conversion of the biomass to Green Crude. All primary unit operations are de-risked, and at a scale suitable for process demonstration. The results are stable, reliable, long-term cultivation of strains for year-round algal biomass production. From June 2012 to November 2014, the IABR and extraction facilities produced 524 metric tons (MT) of biomass (on a dry weight basis) and 2,587 gallons of Green Crude. Additionally, the IABR demonstrated significant year-over-year yield improvements (2013 to 2014) and a reduction in the cost of biomass production. The IABR therefore fulfills a number of critical functions in SEI's integrated development pipeline. These functions are critical in general for the commercialization of algal biomass production and the production of biofuels from algal biomass.
Somme, Dominique; Trouvé, Hélène; Perisset, Catherine; Corvol, Aline; Ankri, Joël; Saint-Jean, Olivier; de Stampa, Matthieu
2014-01-01
Introduction: Many countries face ageing-related demographic and epidemiological challenges, notably neurodegenerative disorders, due to the multiple care services they require, thereby pleading for a more integrated system of care. The integrated Quebecois method issued from the Programme of Research to Integrate Services for the Maintenance of Autonomy inspired a French pilot experiment and the National Alzheimer Plan 2008–2012. Programme of Research to Integrate Services for the Maintenance of Autonomy method implementation was rated with an evaluation grid adapted to assess its successive degrees of completion. Discussion: The approaching end of the president's term led to the method's institutionalization (2011–2012), before the implementation study ended. When the government changed, the study was interrupted. The results extracted from that ‘lost’ study (presented herein) have, nonetheless, ‘found’ some key lessons. Key lessons/conclusion: It was possible to implement a Quebecois integrated-care method in France. We describe the lessons and pitfalls encountered in adapting this evaluation tool. This process is necessarily multidisciplinary and requires a test phase. A simple tool for quantitative assessment of integration was obtained. The first assessment of the tool was unsatisfactory but requires further studies. In the meantime, we recommend using mixed methodologies to assess the services integration level. PMID:24959112
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sola, M.; Haakon Nordby, L.; Dailey, D.V.
High resolution 3-D visualization of horizon interpretation and seismic attributes from large 3-D seismic surveys in deepwater Nigeria has greatly enhanced the exploration team's ability to quickly recognize prospective segments of subregional and prospect-specific scale areas. Integrated workstation-generated structure, isopach and extracted horizon-consistent, interval and windowed attributes are particularly useful in illustrating the complex structural and stratigraphic prospectivity of deepwater Nigeria. Large 3-D seismic volumes acquired over 750 square kilometers can be manipulated within the visualization system with attribute-tracking capability that allows for real-time data interrogation and interpretation. As in classical seismic stratigraphic studies, pattern recognition is fundamental to effective depositional facies interpretation and reservoir model construction. The 3-D perspective enhances the data interpretation through clear representation of relative scale, spatial distribution and magnitude of attributes. In deepwater Nigeria, many prospective traps rely on an interplay between syndepositional structure and slope turbidite depositional systems. Reservoir systems in many prospects appear to be dominated by unconfined to moderately focused slope feeder channel facies. These units have spatially complex facies architecture, with feeder channel axes separated by extensive interchannel areas. Structural culminations generally have a history of initial compressional folding with late extensional collapse and accommodation faulting. The resulting complex trap configurations often have stacked reservoirs over intervals as thick as 1500 meters. Exploration, appraisal and development scenarios in these settings can be optimized by taking full advantage of integrating high resolution 3-D visualization and seismic workstation interpretation.
Moon, Myungjin; Nakai, Kenta
2018-04-01
Currently, cancer biomarker discovery is one of the important research topics worldwide. In particular, detecting significant genes related to cancer is an important task for early diagnosis and treatment of cancer. Conventional studies mostly focus on genes that are differentially expressed in different states of cancer; however, noise in gene expression datasets and insufficient information in limited datasets impede precise analysis of novel candidate biomarkers. In this study, we propose an integrative analysis of gene expression and DNA methylation using normalization and unsupervised feature extractions to identify candidate biomarkers of cancer using renal cell carcinoma RNA-seq datasets. Gene expression and DNA methylation datasets are normalized by Box-Cox transformation and integrated into a one-dimensional dataset that retains the major characteristics of the original datasets by unsupervised feature extraction methods, and differentially expressed genes are selected from the integrated dataset. Use of the integrated dataset demonstrated improved performance as compared with conventional approaches that utilize gene expression or DNA methylation datasets alone. Validation based on the literature showed that a considerable number of top-ranked genes from the integrated dataset have known relationships with cancer, implying that novel candidate biomarkers can also be acquired from the proposed analysis method. Furthermore, we expect that the proposed method can be expanded for applications involving various types of multi-omics datasets.
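As a rough illustration of the normalization-and-integration idea, the sketch below Box-Cox-transforms two toy omics matrices and compresses them into a one-dimensional per-gene score, with PCA standing in for the paper's unsupervised feature extraction; the data are synthetic and the authors' actual pipeline may differ in detail.

```python
import numpy as np
from scipy.stats import boxcox
from sklearn.decomposition import PCA

# Toy stand-ins for the two omics layers; shapes are genes x samples.
rng = np.random.default_rng(0)
expr = rng.lognormal(mean=2.0, sigma=1.0, size=(500, 20))  # gene expression
meth = rng.beta(2.0, 5.0, size=(500, 20))                  # DNA methylation

def boxcox_normalize(mat: np.ndarray) -> np.ndarray:
    """Box-Cox each sample column (requires positive input), then standardize."""
    out = np.empty_like(mat)
    for j in range(mat.shape[1]):
        out[:, j], _ = boxcox(mat[:, j] + 1e-9)
    return (out - out.mean()) / out.std()

# Integrate the two normalized layers and compress each gene to one score.
combined = np.hstack([boxcox_normalize(expr), boxcox_normalize(meth)])
score = PCA(n_components=1).fit_transform(combined).ravel()

# Genes with extreme integrated scores are the candidate biomarkers to test.
top_genes = np.argsort(np.abs(score))[::-1][:20]
print(top_genes)
```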
Sonker, Mukul; Knob, Radim; Sahore, Vishal; Woolley, Adam T
2017-07-01
Integration in microfluidics is important for achieving automation. Sample preconcentration integrated with separation in a microfluidic setup can have a substantial impact on rapid analysis of low-abundance disease biomarkers. Here, we have developed a microfluidic device that uses pH-mediated solid-phase extraction (SPE) for the enrichment and elution of preterm birth (PTB) biomarkers. Furthermore, this SPE module was integrated with microchip electrophoresis for combined enrichment and separation of multiple analytes, including a PTB peptide biomarker (P1). A reversed-phase octyl methacrylate monolith was polymerized as the SPE medium in polyethylene glycol diacrylate modified cyclic olefin copolymer microfluidic channels. Eluent for pH-mediated SPE of PTB biomarkers on the monolith was optimized using different pH values and ionic concentrations. Nearly 50-fold enrichment was observed in single channel SPE devices for a low nanomolar solution of P1, with great elution time reproducibility (<7% RSD). The monolith binding capacity was determined to be 400 pg (0.2 pmol). A mixture of a model peptide (FA) and a PTB biomarker (P1) was extracted, eluted, injected, and then separated by microchip electrophoresis in our integrated device with ∼15-fold enrichment. This device shows important progress towards an integrated electrokinetically operated platform for preconcentration and separation of biomarkers. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
An integrated paper-based sample-to-answer biosensor for nucleic acid testing at the point of care.
Choi, Jane Ru; Hu, Jie; Tang, Ruihua; Gong, Yan; Feng, Shangsheng; Ren, Hui; Wen, Ting; Li, XiuJun; Wan Abas, Wan Abu Bakar; Pingguan-Murphy, Belinda; Xu, Feng
2016-02-07
With advances in point-of-care testing (POCT), lateral flow assays (LFAs) have been explored for nucleic acid detection. However, biological samples generally contain complex compositions and low amounts of target nucleic acids, and currently require laborious off-chip nucleic acid extraction and amplification processes (e.g., tube-based extraction and polymerase chain reaction (PCR)) prior to detection. To the best of our knowledge, even though the integration of DNA extraction and amplification into a paper-based biosensor has been reported, a combination of LFA with the aforementioned steps for simple colorimetric readout has not yet been demonstrated. Here, we demonstrate for the first time an integrated paper-based biosensor incorporating nucleic acid extraction, amplification and visual detection or quantification using a smartphone. A handheld battery-powered heating device was specially developed for nucleic acid amplification in POC settings, which is coupled with this simple assay for rapid target detection. The biosensor can successfully detect Escherichia coli (as a model analyte) in spiked drinking water, milk, blood, and spinach with a detection limit as low as 10–1000 CFU mL−1, and Streptococcus pneumoniae in clinical blood samples, highlighting its potential use in medical diagnostics, food safety analysis and environmental monitoring. Compared with the lengthy conventional assay, which requires more than 5 hours for the entire sample-to-answer process, our integrated biosensor takes about 1 hour. The integrated biosensor holds great potential for detection of various target analytes for wide applications in the near future.
NASA Astrophysics Data System (ADS)
Bangs, Corey F.; Kruse, Fred A.; Olsen, Chris R.
2013-05-01
Hyperspectral data were assessed to determine the effect of integrating spectral data and extracted texture feature data on classification accuracy. Four separate spectral ranges (hundreds of spectral bands total) were used from the Visible and Near Infrared (VNIR) and Shortwave Infrared (SWIR) portions of the electromagnetic spectrum. Haralick texture features (contrast, entropy, and correlation) were extracted from the average gray-level image for each of the four spectral ranges studied. A maximum likelihood classifier was trained using a set of ground truth regions of interest (ROIs) and applied separately to the spectral data, texture data, and a fused dataset containing both. Classification accuracy was measured by comparison of results to a separate verification set of test ROIs. Analysis indicates that the spectral range (source of the gray-level image) used to extract the texture feature data has a significant effect on the classification accuracy. This result applies to texture-only classifications as well as the classification of integrated spectral data and texture feature data sets. Overall classification improvement for the integrated data sets was near 1%. Individual improvement for integrated spectral and texture classification of the "Urban" class showed approximately 9% accuracy increase over spectral-only classification. Texture-only classification accuracy was highest for the "Dirt Path" class at approximately 92% for the spectral range from 947 to 1343 nm. This research demonstrates the effectiveness of texture feature data for more accurate analysis of hyperspectral data and the importance of selecting the correct spectral range to be used for the gray-level image source to extract these features.
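A minimal sketch of the Haralick feature-extraction step, assuming scikit-image and a toy gray-level patch in place of the band-averaged hyperspectral image; entropy is computed by hand because it is not a built-in graycoprops property.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

# `gray` stands in for the average gray-level image of one spectral range
# (e.g. 947 to 1343 nm); here it is a random 8-bit toy patch.
rng = np.random.default_rng(1)
gray = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)

# Gray-level co-occurrence matrix for a 1-pixel horizontal offset,
# quantized to 32 gray levels to keep the matrix well populated.
glcm = graycomatrix(gray // 8, distances=[1], angles=[0],
                    levels=32, symmetric=True, normed=True)

contrast = graycoprops(glcm, "contrast")[0, 0]
correlation = graycoprops(glcm, "correlation")[0, 0]

# Entropy of the normalized co-occurrence probabilities.
p = glcm[:, :, 0, 0]
entropy = -np.sum(p[p > 0] * np.log2(p[p > 0]))

print(contrast, correlation, entropy)
# Computed per window, these features would be stacked with the spectral
# bands before training the maximum likelihood classifier.
```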
Verma, Arjun; Fratto, Brian E.; Privman, Vladimir; Katz, Evgeny
2016-01-01
We consider flow systems that have been utilized for small-scale biomolecular computing and digital signal processing in binary-operating biosensors. Signal measurement is optimized by designing a flow-reversal cuvette and analyzing the experimental data to theoretically extract the pulse shape, as well as reveal the level of noise it possesses. Noise reduction is then carried out numerically. We conclude that this can be accomplished physically via the addition of properly designed well-mixing flow-reversal cell(s) as an integral part of the flow system. This approach should enable improved networking capabilities and potentially not only digital but also analog signal processing in such systems. Possible applications in complex biocomputing networks and various sense-and-act systems are discussed. PMID:27399702
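The numerical noise-reduction step can be illustrated with generic smoothing; the sketch below applies a Savitzky-Golay filter to a synthetic noisy pulse, which is only a stand-in for the paper's pulse-shape extraction from flow-reversal data, not the authors' procedure.

```python
import numpy as np
from scipy.signal import savgol_filter

# Synthetic pulse: a Gaussian shape plus measurement noise, standing in for
# a binary signal pulse measured in the flow-reversal cuvette.
t = np.linspace(0, 10, 500)
pulse = np.exp(-((t - 5.0) / 1.2) ** 2)                  # idealized pulse shape
noisy = pulse + np.random.default_rng(3).normal(0, 0.05, t.size)

# Savitzky-Golay smoothing: local polynomial fits preserve the pulse shape
# better than a plain moving average of comparable width.
smoothed = savgol_filter(noisy, window_length=51, polyorder=3)
print(np.max(np.abs(smoothed - pulse)))                  # residual after smoothing
```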
An integrated software suite for surface-based analyses of cerebral cortex.
Van Essen, D C; Drury, H A; Dickson, J; Harwell, J; Hanlon, D; Anderson, C H
2001-01-01
The authors describe and illustrate an integrated trio of software programs for carrying out surface-based analyses of cerebral cortex. The first component of this trio, SureFit (Surface Reconstruction by Filtering and Intensity Transformations), is used primarily for cortical segmentation, volume visualization, surface generation, and the mapping of functional neuroimaging data onto surfaces. The second component, Caret (Computerized Anatomical Reconstruction and Editing Tool Kit), provides a wide range of surface visualization and analysis options as well as capabilities for surface flattening, surface-based deformation, and other surface manipulations. The third component, SuMS (Surface Management System), is a database and associated user interface for surface-related data. It provides for efficient insertion, searching, and extraction of surface and volume data from the database.
Application of Fourier transforms for microwave radiometric inversions
NASA Technical Reports Server (NTRS)
Holmes, J. J.; Balanis, C. A.; Truman, W. M.
1975-01-01
Existing microwave radiometer technology now provides a suitable method for remote determination of the ocean surface's absolute brightness temperature. To extract the brightness temperature of the water from the antenna temperature, an unstable Fredholm integral equation of the first kind is solved. Fourier transform techniques are used to invert the integral after it is placed into a cross correlation form. Application and verification of the methods to a two-dimensional modeling of a laboratory wave tank system are included. The instability of the ill-posed Fredholm equation is examined and a restoration procedure is included which smooths the resulting oscillations. With the recent availability and advances of fast Fourier transform (FFT) techniques, the method presented becomes very attractive in the evaluation of large quantities of data.
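The inversion idea can be sketched in one dimension: if the antenna temperature is modeled as a convolution of the brightness temperature with the antenna gain pattern, division in the Fourier domain inverts it, and a small regularization term plays the role of the restoration step that smooths the oscillations of the ill-posed problem. The Python code below is a toy illustration under those assumptions, not the authors' wave-tank formulation.

```python
import numpy as np

# Toy 1-D model: antenna temperature T_A = g * T_B (circular convolution of
# the brightness profile with a normalized antenna gain pattern).
n = 256
x = np.linspace(-np.pi, np.pi, n)
t_bright = 100 + 20 * np.cos(x)              # "true" brightness temperature
gain = np.exp(-(x / 0.3) ** 2)
gain /= gain.sum()                           # normalized antenna pattern

G = np.fft.fft(np.fft.ifftshift(gain))       # gain centered at index 0
t_antenna = np.real(np.fft.ifft(np.fft.fft(t_bright) * G))
t_antenna += np.random.default_rng(2).normal(0, 0.1, n)   # measurement noise

# Naive division T_A_hat / G_hat amplifies noise where G is small; a
# Tikhonov-style term (eps) mimics the smoothing restoration step.
eps = 1e-3
t_recovered = np.real(np.fft.ifft(np.fft.fft(t_antenna) * np.conj(G) /
                                  (np.abs(G) ** 2 + eps)))
print(np.max(np.abs(t_recovered - t_bright)))   # small residual if eps is tuned
```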
Extracts of Edible and Medicinal Plants Damage Membranes of Vibrio cholerae
Sánchez, Eduardo; García, Santos; Heredia, Norma
2010-01-01
The use of natural compounds from plants can provide an alternative approach against food-borne pathogens. The mechanisms of action of most plant extracts with antimicrobial activity have been poorly studied. In this work, changes in membrane integrity, membrane potential, internal pH (pHin), and ATP synthesis were measured in Vibrio cholerae cells after exposure to extracts of edible and medicinal plants. A preliminary screen of methanolic, ethanolic, and aqueous extracts of medicinal and edible plants was performed. Minimal bactericidal concentrations (MBCs) were measured for extracts showing high antimicrobial activity. Our results indicate that methanolic extracts of basil (Ocimum basilicum L.), nopal cactus (Opuntia ficus-indica var. Villanueva L.), sweet acacia (Acacia farnesiana L.), and white sagebrush (Artemisia ludoviciana Nutt.) are the most active against V. cholerae, with MBCs ranging from 0.5 to 3.0 mg/ml. Using four fluorogenic techniques, we studied the membrane integrity of V. cholerae cells after exposure to these four extracts. Extracts from these plants were able to disrupt the cell membranes of V. cholerae cells, causing increased membrane permeability, a clear decrease in cytoplasmic pH, cell membrane hyperpolarization, and a decrease in cellular ATP concentration in all strains tested. These four plant extracts could be studied as future alternatives to control V. cholerae contamination in foods and the diseases associated with this microorganism. PMID:20802077
Medical image computing for computer-supported diagnostics and therapy. Advances and perspectives.
Handels, H; Ehrhardt, J
2009-01-01
Medical image computing has become one of the most challenging fields in medical informatics. In image-based diagnostics of the future, software assistance will become more and more important, and image analysis systems integrating advanced image computing methods are needed to extract quantitative image parameters that characterize the state and changes of image structures of interest (e.g. tumors, organs, vessels, bones etc.) in a reproducible and objective way. Furthermore, in the field of software-assisted and navigated surgery, medical image computing methods play a key role and have opened up new perspectives for patient treatment. However, further developments are needed to increase the degree of automation, accuracy, reproducibility and robustness. Moreover, the systems developed have to be integrated into the clinical workflow. For the development of advanced image computing systems, methods from different scientific fields have to be adapted and used in combination. The principal methodologies in medical image computing are the following: image segmentation, image registration, image analysis for quantification and computer-assisted image interpretation, modeling and simulation, as well as visualization and virtual reality. In particular, model-based image computing techniques open up new perspectives for the prediction of organ changes and patient risk analysis, and will gain importance in the diagnostics and therapy of the future. From a methodological point of view, the authors identify the following future trends and perspectives in medical image computing: development of optimized application-specific systems and integration into the clinical workflow, enhanced computational models for image analysis and virtual reality training systems, integration of different image computing methods, further integration of multimodal image data and biosignals, and advanced methods for 4D medical image computing. The development of image analysis systems for diagnostic support or operation planning is a complex interdisciplinary process. Image computing methods enable new insights into the patient's image data and have the future potential to improve medical diagnostics and patient treatment.
Mumtaz, Sidra; Khan, Laiq
2017-01-01
The hybrid power system (HPS) is an emerging power generation scheme due to the plentiful availability of renewable energy sources. Renewable energy sources are characterized as highly intermittent in nature due to meteorological conditions, while the domestic load also behaves in a quite uncertain manner. In this scenario, to maintain the balance between generation and load, the development of an intelligent and adaptive control algorithm has preoccupied power engineers and researchers. This paper proposes a Hermite wavelet embedded NeuroFuzzy indirect adaptive MPPT (maximum power point tracking) control of photovoltaic (PV) systems to extract maximum power and a Hermite wavelet incorporated NeuroFuzzy indirect adaptive control of Solid Oxide Fuel Cells (SOFC) to obtain a swift response in a grid-connected hybrid power system. A comprehensive simulation testbed for a grid-connected hybrid power system (wind turbine, PV cells, SOFC, electrolyzer, battery storage system, supercapacitor (SC), micro-turbine (MT) and domestic load) is developed in Matlab/Simulink. The robustness and superiority of the proposed indirect adaptive control paradigm are evaluated through simulation results in a grid-connected hybrid power system testbed by comparison with a conventional PI (proportional and integral) control system. The simulation results verify the effectiveness of the proposed control paradigm. PMID:28329015
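The paper's Hermite wavelet NeuroFuzzy controller is too involved for a short example, but the classical perturb-and-observe MPPT loop that adaptive schemes are usually benchmarked against can be sketched in a few lines; the names and the PV power curve below are hypothetical stand-ins, not the paper's model.

```python
# Classical perturb-and-observe (P&O) MPPT baseline: nudge the operating
# voltage, and reverse direction whenever the measured power drops.
def perturb_and_observe(measure_pv, v_init=30.0, step=0.25, iters=200):
    v, prev_p = v_init, 0.0
    direction = 1.0
    for _ in range(iters):
        p = measure_pv(v)          # power delivered at the current voltage
        if p < prev_p:
            direction = -direction  # power fell, so perturb the other way
        prev_p = p
        v += direction * step
    return v

# Toy PV curve with a single maximum near 35 V (hypothetical panel model).
pv_power = lambda v: max(0.0, -0.5 * (v - 35.0) ** 2 + 200.0)
print(perturb_and_observe(pv_power))   # oscillates around, and returns near, 35 V
```

P&O's fixed step forces a trade-off between tracking speed and steady-state oscillation, which is precisely the weakness adaptive NeuroFuzzy schemes of the kind proposed here aim to remove.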