Integrative Data Analysis of Multi-Platform Cancer Data with a Multimodal Deep Learning Approach.
Liang, Muxuan; Li, Zhizhong; Chen, Ting; Zeng, Jianyang
2015-01-01
Identification of cancer subtypes plays an important role in revealing useful insights into disease pathogenesis and advancing personalized therapy. The recent development of high-throughput sequencing technologies has enabled the rapid collection of multi-platform genomic data (e.g., gene expression, miRNA expression, and DNA methylation) for the same set of tumor samples. Although numerous integrative clustering approaches have been developed to analyze cancer data, few of them are particularly designed to exploit both deep intrinsic statistical properties of each input modality and complex cross-modality correlations among multi-platform input data. In this paper, we propose a new machine learning model, called multimodal deep belief network (DBN), to cluster cancer patients from multi-platform observation data. In our integrative clustering framework, relationships among inherent features of each single modality are first encoded into multiple layers of hidden variables, and then a joint latent model is employed to fuse common features derived from multiple input modalities. A practical learning algorithm, called contrastive divergence (CD), is applied to infer the parameters of our multimodal DBN model in an unsupervised manner. Tests on two available cancer datasets show that our integrative data analysis approach can effectively extract a unified representation of latent features to capture both intra- and cross-modality correlations, and identify meaningful disease subtypes from multi-platform cancer data. In addition, our approach can identify key genes and miRNAs that may play distinct roles in the pathogenesis of different cancer subtypes. Among those key miRNAs, we found that the expression level of miR-29a is highly correlated with survival time in ovarian cancer patients. These results indicate that our multimodal DBN based data analysis approach may have practical applications in cancer pathogenesis studies and provide useful guidelines for personalized cancer therapy.
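For readers unfamiliar with how a DBN layer is trained, the following minimal sketch shows one contrastive-divergence (CD-1) update for a single binary restricted Boltzmann machine, the building block that is stacked to form a DBN. The array sizes, learning rate and toy data are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def cd1_update(V, W, b, c, lr=0.05):
    """One contrastive-divergence (CD-1) step for a binary RBM.

    V : (n_samples, n_visible) batch of one modality (e.g. binarized expression)
    W : (n_visible, n_hidden) weights; b, c : visible/hidden biases.
    """
    # Positive phase: sample hidden units given the data.
    ph = sigmoid(V @ W + c)
    h = (rng.random(ph.shape) < ph).astype(float)
    # Negative phase: one Gibbs step back to the visible layer and up again.
    pv = sigmoid(h @ W.T + b)
    ph_neg = sigmoid(pv @ W + c)
    # Gradient approximation: data statistics minus model statistics.
    dW = (V.T @ ph - pv.T @ ph_neg) / V.shape[0]
    W += lr * dW
    b += lr * (V - pv).mean(axis=0)
    c += lr * (ph - ph_neg).mean(axis=0)
    return W, b, c

# Toy usage: 100 samples, 20 visible units, 8 hidden units.
V = (rng.random((100, 20)) < 0.3).astype(float)
W = 0.01 * rng.standard_normal((20, 8))
b, c = np.zeros(20), np.zeros(8)
for _ in range(50):
    W, b, c = cd1_update(V, W, b, c)
```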
NASA Astrophysics Data System (ADS)
LIU, Yiping; XU, Qing; ZHANG, Heng; LV, Liang; LU, Wanjie; WANG, Dandi
2016-11-01
The purpose of this paper is to address the problems of traditional single-purpose interpretation and draughting systems, such as inconsistent standards, limited functionality, dependence on plug-ins, closed architectures and a low level of integration. Based on a comprehensive analysis of the composition of target elements, map representation and the features of similar systems, a 3D interpretation and draughting integrated service platform for multi-source, multi-scale and multi-resolution geospatial objects is established using HTML5 and WebGL. The platform not only integrates object recognition, access, retrieval, three-dimensional display and test evaluation, but also supports the collection, transfer, storage, refreshing and maintenance of geospatial object data, showing clear prospects and potential for growth.
NASA Astrophysics Data System (ADS)
Guldner, Ian H.; Yang, Lin; Cowdrick, Kyle R.; Wang, Qingfei; Alvarez Barrios, Wendy V.; Zellmer, Victoria R.; Zhang, Yizhe; Host, Misha; Liu, Fang; Chen, Danny Z.; Zhang, Siyuan
2016-04-01
Metastatic microenvironments are spatially and compositionally heterogeneous. This seemingly stochastic heterogeneity provides researchers great challenges in elucidating factors that determine metastatic outgrowth. Herein, we develop and implement an integrative platform that will enable researchers to obtain novel insights from intricate metastatic landscapes. Our two-segment platform begins with whole tissue clearing, staining, and imaging to globally delineate metastatic landscape heterogeneity with spatial and molecular resolution. The second segment of our platform applies our custom-developed SMART 3D (Spatial filtering-based background removal and Multi-chAnnel forest classifiers-based 3D ReconsTruction), a multi-faceted image analysis pipeline, permitting quantitative interrogation of functional implications of heterogeneous metastatic landscape constituents, from subcellular features to multicellular structures, within our large three-dimensional (3D) image datasets. Coupling whole tissue imaging of brain metastasis animal models with SMART 3D, we demonstrate the capability of our integrative pipeline to reveal and quantify volumetric and spatial aspects of brain metastasis landscapes, including diverse tumor morphology, heterogeneous proliferative indices, metastasis-associated astrogliosis, and vasculature spatial distribution. Collectively, our study demonstrates the utility of our novel integrative platform to reveal and quantify the global spatial and volumetric characteristics of the 3D metastatic landscape with unparalleled accuracy, opening new opportunities for unbiased investigation of novel biological phenomena in situ.
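SMART 3D itself is custom software; the sketch below only illustrates, on assumed toy data, the two ingredients named in the abstract, spatial-filtering-based background removal and forest-classifier-based voxel labeling, using generic scipy/scikit-learn calls rather than the authors' pipeline.

```python
import numpy as np
from scipy import ndimage
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
# Assumed toy data: a small 3D, 2-channel image stack (z, y, x, channel).
img = rng.random((16, 64, 64, 2))

# Spatial-filtering-based background removal: subtract a smoothed background
# estimate per channel (a stand-in for the pipeline's filtering step).
background = np.stack([ndimage.gaussian_filter(img[..., ch], sigma=8)
                       for ch in range(img.shape[-1])], axis=-1)
foreground = np.clip(img - background, 0, None)

# Multi-channel forest classifier: label each voxel from its channel features.
X = foreground.reshape(-1, foreground.shape[-1])
idx = rng.choice(X.shape[0], size=500, replace=False)   # pretend sparse annotations
y = (X[idx, 0] > X[idx, 1]).astype(int)                  # hypothetical training labels
clf = RandomForestClassifier(n_estimators=50, random_state=0)
clf.fit(X[idx], y)
labels = clf.predict(X).reshape(foreground.shape[:-1])   # 3D label volume
```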
Integrated Multi-process Microfluidic Systems for Automating Analysis
Yang, Weichun; Woolley, Adam T.
2010-01-01
Microfluidic technologies have been applied extensively in rapid sample analysis. Some current challenges for standard microfluidic systems are relatively high detection limits, and reduced resolving power and peak capacity compared to conventional approaches. The integration of multiple functions and components onto a single platform can overcome these separation and detection limitations of microfluidics. Multiplexed systems can greatly increase peak capacity in multidimensional separations and can increase sample throughput by analyzing many samples simultaneously. On-chip sample preparation, including labeling, preconcentration, cleanup and amplification, can all serve to speed up and automate processes in integrated microfluidic systems. This paper summarizes advances in integrated multi-process microfluidic systems for automated analysis, their benefits and areas for needed improvement. PMID:20514343
Li, Sen; Tang, Shi-Huan; Liu, Jin-Ling; Su, Jin; He, Fu-Yuan
2018-04-01
The ancient materia medica classics, including the Compendium of Materia Medica, record that the main effect of ginseng is tonifying qi, and its principal active constituents are reported to be ginsenosides. Modern studies have found that ginseng monomer saponins are effective against cardiovascular diseases. This paper gives a preliminary clarification of the link between the traditional qi-tonifying efficacy of ginseng and cardiovascular disease by combining the traditional Chinese medicine (TCM) inheritance auxiliary platform with an integrative pharmacology platform. Analysis of the Chinese medicine database within the TCM inheritance auxiliary platform established the prescription rules for ginseng in treating modern diseases, associating its traditional effects with modern medicine and pharmacology; enrichment analysis of drug targets, gene functions and metabolic pathways on the integrative pharmacology platform was then used to explore the molecular mechanism of ginseng in the treatment of coronary heart disease. Among 307 Chinese medicine prescriptions containing ginseng and 87 kinds of disease indications, coronary heart disease was the main Western-medicine disease treated with ginseng. The integrative pharmacology analysis indicated that ginsenosides (Ra₁, Ra₂, Rb₁, Rb₂, Rg₁, Ro) bind targets including PRKAA1, PRKAA2, NDUFA4, COX5B and UQCRC1, affecting chemokine, non-alcoholic fatty liver, gonadotropin, carbon metabolism, glucose metabolism and other pathways to treat coronary heart disease indirectly. The multi-component, multi-target and synergistic molecular mechanism of Panax ginseng is thus preliminarily elucidated in this paper. Copyright © by the Chinese Pharmaceutical Association.
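The target and pathway enrichment step mentioned above is typically a hypergeometric over-representation test; a minimal sketch follows, with all gene counts invented for illustration rather than taken from the study.

```python
from scipy.stats import hypergeom

# Hypergeometric over-representation test for one pathway.
# M: background genes, n: genes in the pathway, N: ginsenoside target genes,
# k: targets that fall inside the pathway (all values illustrative).
M, n, N, k = 20000, 150, 60, 8
p_value = hypergeom.sf(k - 1, M, n, N)   # P(X >= k)
print(f"enrichment p-value: {p_value:.3e}")
```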
Kusters, Koen; Buck, Louise; de Graaf, Maartje; Minang, Peter; van Oosten, Cora; Zagt, Roderick
2018-07-01
Integrated landscape initiatives typically aim to strengthen landscape governance by developing and facilitating multi-stakeholder platforms. These are institutional coordination mechanisms that enable discussions, negotiations, and joint planning between stakeholders from various sectors in a given landscape. Multi-stakeholder platforms tend to involve complex processes with diverse actors, whose objectives and focus may be subjected to periodic re-evaluation, revision or reform. In this article we propose a participatory method to aid planning, monitoring, and evaluation of such platforms, and we report on experiences from piloting the method in Ghana and Indonesia. The method is comprised of three components. The first can be used to look ahead, identifying priorities for future multi-stakeholder collaboration in the landscape. It is based on the identification of four aspirations that are common across multi-stakeholder platforms in integrated landscape initiatives. The second can be used to look inward. It focuses on the processes within an existing multi-stakeholder platform in order to identify areas for possible improvement. The third can be used to look back, identifying the main outcomes of an existing platform and comparing them to the original objectives. The three components can be implemented together or separately. They can be used to inform planning and adaptive management of the platform, as well as to demonstrate performance and inform the design of new interventions.
Reconfigurable microfluidic hanging drop network for multi-tissue interaction and analysis.
Frey, Olivier; Misun, Patrick M; Fluri, David A; Hengstler, Jan G; Hierlemann, Andreas
2014-06-30
Integration of multiple three-dimensional microtissues into microfluidic networks enables new insights in how different organs or tissues of an organism interact. Here, we present a platform that extends the hanging-drop technology, used for multi-cellular spheroid formation, to multifunctional complex microfluidic networks. Engineered as completely open, 'hanging' microfluidic system at the bottom of a substrate, the platform features high flexibility in microtissue arrangements and interconnections, while fabrication is simple and operation robust. Multiple spheroids of different cell types are formed in parallel on the same platform; the different tissues are then connected in physiological order for multi-tissue experiments through reconfiguration of the fluidic network. Liquid flow is precisely controlled through the hanging drops, which enable nutrient supply, substance dosage and inter-organ metabolic communication. The possibility to perform parallelized microtissue formation on the same chip that is subsequently used for complex multi-tissue experiments renders the developed platform a promising technology for 'body-on-a-chip'-related research.
An integrated GIS-based, multi-attribute decision model deployed in a web-based platform is presented enabling an iterative, spatially explicit and collaborative analysis of relevant and available information for repurposing vacant land. The process incorporated traditional and ...
Cloud Based Earth Observation Data Exploitation Platforms
NASA Astrophysics Data System (ADS)
Romeo, A.; Pinto, S.; Loekken, S.; Marin, A.
2017-12-01
In the last few years, the data produced daily by several private and public Earth Observation (EO) satellites have reached the order of tens of Terabytes, representing for scientists and commercial application developers both a big opportunity for their exploitation and a challenge for their management. New IT technologies, such as Big Data and cloud computing, enable the creation of web-accessible data exploitation platforms, which offer scientists and application developers the means to access and use EO data in a quick and cost-effective way. RHEA Group is particularly active in this sector, supporting the European Space Agency (ESA) in the Exploitation Platforms (EP) initiative, developing technology to build multi-cloud platforms for the processing and analysis of Earth Observation data, and collaborating with larger European initiatives such as the European Plate Observing System (EPOS) and the European Open Science Cloud (EOSC). An EP is a virtual workspace, providing a user community with access to (i) large volumes of data, (ii) an algorithm development and integration environment, (iii) processing software and services (e.g. toolboxes, visualization routines), (iv) computing resources, and (v) collaboration tools (e.g. forums, wikis, etc.). When an EP is dedicated to a specific theme, it becomes a Thematic Exploitation Platform (TEP). Currently, ESA has seven TEPs in a pre-operational phase dedicated to geo-hazards monitoring and prevention, coastal zones, forestry areas, hydrology, polar regions, urban areas and food security. On the technology development side, solutions like the multi-cloud EO data processing platform provide the technology to integrate ICT resources and EO data from different vendors in a single platform. In particular, it offers (i) multi-cloud data discovery, (ii) multi-cloud data management and access, and (iii) multi-cloud application deployment. This platform has been demonstrated with the EGI Federated Cloud, Innovation Platform Testbed Poland and the Amazon Web Services cloud. This work will present an overview of the TEPs and the multi-cloud EO data processing platform, and discuss their main achievements and their impacts in the context of distributed Research Infrastructures such as EPOS and EOSC.
Latent feature decompositions for integrative analysis of multi-platform genomic data
Gregory, Karl B.; Momin, Amin A.; Coombes, Kevin R.; Baladandayuthapani, Veerabhadran
2015-01-01
Increased availability of multi-platform genomics data on matched samples has sparked research efforts to discover how diverse molecular features interact both within and between platforms. In addition, simultaneous measurements of genetic and epigenetic characteristics illuminate the roles their complex relationships play in disease progression and outcomes. However, integrative methods for diverse genomics data are faced with the challenges of ultra-high dimensionality and the existence of complex interactions both within and between platforms. We propose a novel modeling framework for integrative analysis based on decompositions of the large number of platform-specific features into a smaller number of latent features. Subsequently we build a predictive model for clinical outcomes accounting for both within- and between-platform interactions based on Bayesian model averaging procedures. Principal components, partial least squares and non-negative matrix factorization as well as sparse counterparts of each are used to define the latent features, and the performance of these decompositions is compared both on real and simulated data. The latent feature interactions are shown to preserve interactions between the original features and not only aid prediction but also allow explicit selection of outcome-related features. The methods are motivated by, and applied to, a glioblastoma multiforme dataset from The Cancer Genome Atlas to predict patient survival times integrating gene expression, microRNA, copy number and methylation data. For the glioblastoma data, we find a high concordance between our selected prognostic genes and genes with known associations with glioblastoma. In addition, our model discovers several relevant cross-platform interactions such as copy number variation associated gene dosing and epigenetic regulation through promoter methylation. On simulated data, we show that our proposed method successfully incorporates interactions within and between genomic platforms to aid accurate prediction and variable selection. Our methods perform best when principal components are used to define the latent features. PMID:26146492
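A rough sketch of the workflow described above, using PCA as the per-platform decomposition and a single Bayesian ridge regression as a stand-in for the paper's Bayesian model averaging; the matrices and the outcome are simulated placeholders.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import BayesianRidge

rng = np.random.default_rng(2)
n = 120
# Assumed toy platforms on matched samples: expression, methylation, copy number.
platforms = {"expr": rng.standard_normal((n, 500)),
             "meth": rng.standard_normal((n, 300)),
             "cnv":  rng.standard_normal((n, 200))}
survival = rng.standard_normal(n)                     # placeholder outcome

# Step 1: decompose each platform into a few latent features (here: PCA).
latent = {name: PCA(n_components=3).fit_transform(X) for name, X in platforms.items()}

# Step 2: build within- and between-platform interaction features.
Z = np.hstack(list(latent.values()))
pairs = [Z[:, i] * Z[:, j] for i in range(Z.shape[1]) for j in range(i + 1, Z.shape[1])]
design = np.hstack([Z, np.column_stack(pairs)])

# Step 3: fit a single Bayesian regression as a stand-in for model averaging.
model = BayesianRidge().fit(design, survival)
print(model.coef_[:5])
```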
Multi-platform 'Omics Analysis of Human Ebola Virus Disease Pathogenesis.
Eisfeld, Amie J; Halfmann, Peter J; Wendler, Jason P; Kyle, Jennifer E; Burnum-Johnson, Kristin E; Peralta, Zuleyma; Maemura, Tadashi; Walters, Kevin B; Watanabe, Tokiko; Fukuyama, Satoshi; Yamashita, Makoto; Jacobs, Jon M; Kim, Young-Mo; Casey, Cameron P; Stratton, Kelly G; Webb-Robertson, Bobbie-Jo M; Gritsenko, Marina A; Monroe, Matthew E; Weitz, Karl K; Shukla, Anil K; Tian, Mingyuan; Neumann, Gabriele; Reed, Jennifer L; van Bakel, Harm; Metz, Thomas O; Smith, Richard D; Waters, Katrina M; N'jai, Alhaji; Sahr, Foday; Kawaoka, Yoshihiro
2017-12-13
The pathogenesis of human Ebola virus disease (EVD) is complex. EVD is characterized by high levels of virus replication and dissemination, dysregulated immune responses, extensive virus- and host-mediated tissue damage, and disordered coagulation. To clarify how host responses contribute to EVD pathophysiology, we performed multi-platform 'omics analysis of peripheral blood mononuclear cells and plasma from EVD patients. Our results indicate that EVD molecular signatures overlap with those of sepsis, imply that pancreatic enzymes contribute to tissue damage in fatal EVD, and suggest that Ebola virus infection may induce aberrant neutrophils whose activity could explain hallmarks of fatal EVD. Moreover, integrated biomarker prediction identified putative biomarkers from different data platforms that differentiated survivors and fatalities early after infection. This work reveals insight into EVD pathogenesis, suggests an effective approach for biomarker identification, and provides an important community resource for further analysis of human EVD severity. Copyright © 2017 Elsevier Inc. All rights reserved.
Network-based drug discovery by integrating systems biology and computational technologies
Leung, Elaine L.; Cao, Zhi-Wei; Jiang, Zhi-Hong; Zhou, Hua
2013-01-01
Network-based intervention has become a trend in treating systemic diseases, but it relies on regimen optimization and valid multi-target actions of the drugs. The complex multi-component nature of medicinal herbs may make them valuable resources for network-based multi-target drug discovery, owing to their potential synergistic treatment effects. Recently, robust systems biology platforms have proven powerful in uncovering molecular mechanisms and connections between drugs and the dynamic networks they target. However, methods for optimizing drug combinations remain insufficient, owing to the lack of tighter integration across multiple '-omics' databases. Newly developed algorithm- or network-based computational models can tightly integrate '-omics' databases and optimize combination regimens in drug development, encouraging the development of medicinal herbs into a new wave of network-based multi-target drugs. However, further integration of medicinal-herb databases with multiple systems biology platforms for multi-target drug optimization is still challenged by the uncertain reliability of individual data sets and by the limited breadth, depth and degree of standardization of herbal medicine data. Standardization of the methodology and terminology used across systems biology platforms and herbal databases would facilitate this integration, as would enhancing publicly accessible databases and increasing the number of studies applying systems biology platforms to herbal medicine. Further integration across various '-omics' platforms and computational tools will accelerate the development of network-based drug discovery and network medicine. PMID:22877768
Design and testing of a multi-sensor pedestrian location and navigation platform.
Morrison, Aiden; Renaudin, Valérie; Bancroft, Jared B; Lachapelle, Gérard
2012-01-01
Navigation and location technologies are continually advancing, allowing ever higher accuracies and operation under ever more challenging conditions. The development of such technologies requires the rapid evaluation of a large number of sensors and related utilization strategies. The integration of Global Navigation Satellite Systems (GNSSs) such as the Global Positioning System (GPS) with accelerometers, gyros, barometers, magnetometers and other sensors is allowing for novel applications, but is hindered by the difficulty of testing and comparing integrated solutions using multiple sensor sets. In order to achieve compatibility and flexibility in terms of multiple sensors, an advanced adaptable platform is required. This paper describes the design and testing of the NavCube, a multi-sensor navigation, location and timing platform. The system provides a research tool for pedestrian navigation, location and body motion analysis in an unobtrusive form factor that enables in situ data collections with minimal gait and posture impact. Testing and examples of applications of the NavCube are provided.
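The NavCube firmware itself is not described in this abstract; as a generic illustration of multi-sensor integration of the kind discussed, the sketch below fuses a noisy altitude reading with integrated vertical acceleration using a one-dimensional complementary filter (all values invented).

```python
def complementary_altitude(baro_alt, accel_up, dt, alpha=0.98):
    """Fuse barometric altitude (drift-free but noisy) with vertical
    acceleration (smooth but drifting) using a simple complementary filter."""
    alt, vel = baro_alt[0], 0.0
    fused = []
    for z, a in zip(baro_alt, accel_up):
        vel += a * dt                              # integrate acceleration
        alt_pred = alt + vel * dt                  # predict altitude inertially
        alt = alpha * alt_pred + (1 - alpha) * z   # correct with the baro reading
        fused.append(alt)
    return fused

# Toy usage with synthetic readings (illustrative values only).
baro = [0.0, 0.1, 0.3, 0.2, 0.4, 0.5]
acc = [0.0, 0.5, 0.4, -0.2, 0.1, 0.0]
print(complementary_altitude(baro, acc, dt=0.1))
```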
REopt: A Platform for Energy System Integration and Optimization: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Simpkins, T.; Cutler, D.; Anderson, K.
2014-08-01
REopt is NREL's energy planning platform offering concurrent, multi-technology integration and optimization capabilities to help clients meet their cost savings and energy performance goals. The REopt platform provides techno-economic decision-support analysis throughout the energy planning process, from agency-level screening and macro planning to project development to energy asset operation. REopt employs an integrated approach to optimizing a site's energy costs by considering electricity and thermal consumption, resource availability, complex tariff structures including time-of-use, demand and sell-back rates, incentives, net-metering, and interconnection limits. Formulated as a mixed integer linear program, REopt recommends an optimally-sized mix of conventional and renewable energy, and energy storage technologies; estimates the net present value associated with implementing those technologies; and provides the cost-optimal dispatch strategy for operating them at maximum economic efficiency. The REopt platform can be customized to address a variety of energy optimization scenarios including policy, microgrid, and operational energy applications. This paper presents the REopt techno-economic model along with two examples of recently completed analysis projects.
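REopt's actual formulation is far richer, but the toy mixed-integer linear program below, written with the PuLP package, illustrates the basic pattern of selecting and sizing technologies against a cost objective. Every number and the technology list are invented for the example.

```python
from pulp import LpProblem, LpMinimize, LpVariable, lpSum, LpBinary

# Illustrative data: candidate technologies with capital cost per kW and an
# avoided-energy-cost credit per kW installed (all numbers invented).
techs = {"pv": (1500, 300), "battery": (800, 120), "chp": (2000, 450)}
load_kw = 100

prob = LpProblem("toy_tech_selection", LpMinimize)
size = {t: LpVariable(f"size_{t}", lowBound=0) for t in techs}
build = {t: LpVariable(f"build_{t}", cat=LpBinary) for t in techs}

# Objective: capital cost minus avoided energy cost over the analysis period.
prob += lpSum(techs[t][0] * size[t] - techs[t][1] * size[t] for t in techs)

for t in techs:
    prob += size[t] <= 200 * build[t]               # only size a built technology
prob += lpSum(size[t] for t in techs) >= load_kw     # meet the site load

prob.solve()
print({t: size[t].value() for t in techs})
```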
MONGKIE: an integrated tool for network analysis and visualization for multi-omics data.
Jang, Yeongjun; Yu, Namhee; Seo, Jihae; Kim, Sun; Lee, Sanghyuk
2016-03-18
Network-based integrative analysis is a powerful technique for extracting biological insights from multilayered omics data such as somatic mutations, copy number variations, and gene expression data. However, integrated analysis of multi-omics data is quite complicated and can hardly be done in an automated way. Thus, a powerful interactive visual mining tool supporting diverse analysis algorithms for identification of driver genes and regulatory modules is much needed. Here, we present a software platform that integrates network visualization with omics data analysis tools seamlessly. The visualization unit supports various options for displaying multi-omics data as well as unique network models for describing sophisticated biological networks such as complex biomolecular reactions. In addition, we implemented diverse in-house algorithms for network analysis including network clustering and over-representation analysis. Novel functions include facile definition and optimized visualization of subgroups, comparison of a series of data sets in an identical network by data-to-visual mapping and subsequent overlaying function, and management of custom interaction networks. Utility of MONGKIE for network-based visual data mining of multi-omics data was demonstrated by analysis of the TCGA glioblastoma data. MONGKIE was developed in Java based on the NetBeans plugin architecture, thus being OS-independent with intrinsic support of module extension by third-party developers. We believe that MONGKIE would be a valuable addition to network analysis software by supporting many unique features and visualization options, especially for analysing multi-omics data sets in cancer and other diseases.
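The network clustering in MONGKIE uses its own in-house algorithms; as a generic stand-in, the sketch below runs modularity-based community detection with networkx on a toy interaction graph.

```python
import networkx as nx
from networkx.algorithms.community import greedy_modularity_communities

# Toy interaction network standing in for a biomolecular network.
G = nx.Graph([("EGFR", "PTEN"), ("EGFR", "PIK3CA"), ("PTEN", "PIK3CA"),
              ("TP53", "MDM2"), ("MDM2", "CDKN2A"), ("TP53", "CDKN2A"),
              ("PIK3CA", "TP53")])

# Modularity-based clustering as a generic stand-in for module detection.
modules = greedy_modularity_communities(G)
for i, module in enumerate(modules):
    print(f"module {i}: {sorted(module)}")
```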
A Versatile Integrated Ambient Ionization Source Platform.
Ai, Wanpeng; Nie, Honggang; Song, Shiyao; Liu, Xiaoyun; Bai, Yu; Liu, Huwei
2018-04-30
The pursuit of high-throughput sample analysis from complex matrices demands the development of multiple ionization techniques with complementary specialties. A versatile integrated ambient ionization source (iAmIS) platform is proposed in this work, based on the idea of integrating multiple functions, enhancing the efficiency of current ionization techniques, extending the applications, and decreasing the cost of the instrument. The design of the iAmIS platform combines flowing atmospheric pressure afterglow (FAPA) source/direct analysis in real time (DART), dielectric barrier discharge ionization (DBDI)/low-temperature plasma (LTP), desorption electrospray ionization (DESI), and laser desorption (LD) techniques. All individual and combined ionization modes can be easily attained by modulating parameters. In particular, the FAPA/DART&DESI mode can realize the detection of polar and nonpolar compounds at the same time with two different ionization mechanisms: proton transfer and charge transfer. The introduction of LD contributes to mass spectrometry imaging and surface-assisted laser desorption (SALDI) under ambient conditions. Compared with other individual or multi-mode ion sources, the iAmIS platform provides the flexibility of choosing different ionization modes, broadens the scope of analyte detection, and facilitates the analysis of complex samples.
A Versatile Integrated Ambient Ionization Source Platform
NASA Astrophysics Data System (ADS)
Ai, Wanpeng; Nie, Honggang; Song, Shiyao; Liu, Xiaoyun; Bai, Yu; Liu, Huwei
2018-04-01
The pursuit of high-throughput sample analysis from complex matrices demands the development of multiple ionization techniques with complementary specialties. A versatile integrated ambient ionization source (iAmIS) platform is proposed in this work, based on the idea of integrating multiple functions, enhancing the efficiency of current ionization techniques, extending the applications, and decreasing the cost of the instrument. The design of the iAmIS platform combines flowing atmospheric pressure afterglow (FAPA) source/direct analysis in real time (DART), dielectric barrier discharge ionization (DBDI)/low-temperature plasma (LTP), desorption electrospray ionization (DESI), and laser desorption (LD) techniques. All individual and combined ionization modes can be easily attained by modulating parameters. In particular, the FAPA/DART&DESI mode can realize the detection of polar and nonpolar compounds at the same time with two different ionization mechanisms: proton transfer and charge transfer. The introduction of LD contributes to mass spectrometry imaging and surface-assisted laser desorption (SALDI) under ambient conditions. Compared with other individual or multi-mode ion sources, the iAmIS platform provides the flexibility of choosing different ionization modes, broadens the scope of analyte detection, and facilitates the analysis of complex samples.
Lynx web services for annotations and systems analysis of multi-gene disorders.
Sulakhe, Dinanath; Taylor, Andrew; Balasubramanian, Sandhya; Feng, Bo; Xie, Bingqing; Börnigen, Daniela; Dave, Utpal J; Foster, Ian T; Gilliam, T Conrad; Maltsev, Natalia
2014-07-01
Lynx is a web-based integrated systems biology platform that supports annotation and analysis of experimental data and generation of weighted hypotheses on molecular mechanisms contributing to human phenotypes and disorders of interest. Lynx has integrated multiple classes of biomedical data (genomic, proteomic, pathways, phenotypic, toxicogenomic, contextual and others) from various public databases as well as manually curated data from our group and collaborators (LynxKB). Lynx provides tools for gene list enrichment analysis using multiple functional annotations and network-based gene prioritization. Lynx provides access to the integrated database and the analytical tools via REST based Web Services (http://lynx.ci.uchicago.edu/webservices.html). This comprises data retrieval services for specific functional annotations, services to search across the complete LynxKB (powered by Lucene), and services to access the analytical tools built within the Lynx platform. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.
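A minimal calling pattern for REST services of this kind is sketched below; the endpoint path and query parameters are placeholders, not Lynx's actual routes, which are documented at the URL given above.

```python
import requests

# Placeholder endpoint and parameters -- consult the Lynx web-services
# documentation for the real routes; this only shows the calling pattern.
BASE_URL = "http://lynx.ci.uchicago.edu/webservices"        # assumed base path
params = {"genes": "BRCA1,TP53", "annotation": "pathway"}    # hypothetical query

response = requests.get(f"{BASE_URL}/example-endpoint", params=params, timeout=30)
response.raise_for_status()
annotations = response.json()   # REST services of this kind typically return JSON
print(annotations)
```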
SOCRAT Platform Design: A Web Architecture for Interactive Visual Analytics Applications
Kalinin, Alexandr A.; Palanimalai, Selvam; Dinov, Ivo D.
2018-01-01
The modern web is a successful platform for large scale interactive web applications, including visualizations. However, there are no established design principles for building complex visual analytics (VA) web applications that could efficiently integrate visualizations with data management, computational transformation, hypothesis testing, and knowledge discovery. This imposes a time-consuming design and development process on many researchers and developers. To address these challenges, we consider the design requirements for the development of a module-based VA system architecture, adopting existing practices of large scale web application development. We present the preliminary design and implementation of an open-source platform for Statistics Online Computational Resource Analytical Toolbox (SOCRAT). This platform defines: (1) a specification for an architecture for building VA applications with multi-level modularity, and (2) methods for optimizing module interaction, re-usage, and extension. To demonstrate how this platform can be used to integrate a number of data management, interactive visualization, and analysis tools, we implement an example application for simple VA tasks including raw data input and representation, interactive visualization and analysis. PMID:29630069
SOCRAT Platform Design: A Web Architecture for Interactive Visual Analytics Applications.
Kalinin, Alexandr A; Palanimalai, Selvam; Dinov, Ivo D
2017-04-01
The modern web is a successful platform for large scale interactive web applications, including visualizations. However, there are no established design principles for building complex visual analytics (VA) web applications that could efficiently integrate visualizations with data management, computational transformation, hypothesis testing, and knowledge discovery. This imposes a time-consuming design and development process on many researchers and developers. To address these challenges, we consider the design requirements for the development of a module-based VA system architecture, adopting existing practices of large scale web application development. We present the preliminary design and implementation of an open-source platform for Statistics Online Computational Resource Analytical Toolbox (SOCRAT). This platform defines: (1) a specification for an architecture for building VA applications with multi-level modularity, and (2) methods for optimizing module interaction, re-usage, and extension. To demonstrate how this platform can be used to integrate a number of data management, interactive visualization, and analysis tools, we implement an example application for simple VA tasks including raw data input and representation, interactive visualization and analysis.
Rossi, Elena; Rosa, Manuela; Rossi, Lorenzo; Priori, Alberto; Marceglia, Sara
2014-12-01
The web-based systems available for multi-centre clinical trials do not combine clinical data collection (Electronic Health Records, EHRs) with signal processing storage and analysis tools. However, in pathophysiological research, the correlation between clinical data and signals is crucial for uncovering the underlying neurophysiological mechanisms. A specific example is the investigation of the mechanisms of action for Deep Brain Stimulation (DBS) used for Parkinson's Disease (PD); the neurosignals recorded from the DBS target structure and clinical data must be investigated. The aim of this study is the development and testing of a new system dedicated to a multi-centre study of Parkinson's Disease that integrates biosignal analysis tools and data collection in a shared and secure environment. We designed a web-based platform (WebBioBank) for managing the clinical data and biosignals of PD patients treated with DBS in different clinical research centres. Homogeneous data collection was ensured in the different centres (Operative Units, OUs). The anonymity of the data was preserved using unique identifiers associated with patients (ID BAC). The patients' personal details and their equivalent ID BACs were archived inside the corresponding OU and were not uploaded on the web-based platform; data sharing occurred using the ID BACs. The system allowed researchers to upload different signal processing functions (in a .dll extension) onto the web-based platform and to combine them to define dedicated algorithms. Four clinical research centres used WebBioBank for 1 year. The clinical data from 58 patients treated using DBS were managed, and 186 biosignals were uploaded and classified into 4 categories based on the treatment (pharmacological and/or electrical). The users' mean satisfaction score exceeded the satisfaction threshold. WebBioBank enabled anonymous data sharing for a clinical study conducted at multiple centres and demonstrated the capabilities of the signal processing chain configuration as well as its effectiveness and efficiency for integrating the neurophysiological results with clinical data in multi-centre studies, which will allow the future collection of homogeneous data in large cohorts of patients. Copyright © 2014 Elsevier Inc. All rights reserved.
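The ID BAC scheme described above can be illustrated with a minimal pseudonymization routine in which the identifier-to-identity mapping never leaves the local Operative Unit; the function and field names are hypothetical, not WebBioBank's.

```python
import secrets

local_registry = {}   # stays inside the Operative Unit, never uploaded

def assign_id_bac(patient_record):
    """Assign a random, non-reversible identifier (ID BAC) to a patient and
    keep the identifier-to-identity mapping only in the local registry."""
    id_bac = "BAC-" + secrets.token_hex(8)
    local_registry[id_bac] = patient_record          # personal details stay local
    return id_bac

def export_for_platform(id_bac, clinical_data):
    """Build the record that is actually shared on the web platform."""
    return {"id_bac": id_bac, **clinical_data}       # no identifying fields

pid = assign_id_bac({"name": "Jane Doe", "birth_date": "1950-02-01"})
shared = export_for_platform(pid, {"updrs_iii": 34, "stimulation": "STN-DBS on"})
print(shared)
```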
Tele-Supervised Adaptive Ocean Sensor Fleet
NASA Technical Reports Server (NTRS)
Lefes, Alberto; Podnar, Gregg W.; Dolan, John M.; Hosler, Jeffrey C.; Ames, Troy J.
2009-01-01
The Tele-supervised Adaptive Ocean Sensor Fleet (TAOSF) is a multi-robot science exploration architecture and system that uses a group of robotic boats (the Ocean-Atmosphere Sensor Integration System, or OASIS) to enable in-situ study of ocean surface and subsurface characteristics and the dynamics of such ocean phenomena as coastal pollutants, oil spills, hurricanes, or harmful algal blooms (HABs). The OASIS boats are extended-deployment, autonomous ocean surface vehicles. The TAOSF architecture provides an integrated approach to multi-vehicle coordination and sliding human-vehicle autonomy. One feature of TAOSF is the adaptive re-planning of the activities of the OASIS vessels based on sensor input (smart sensing) and sensorial coordination among multiple assets. The architecture also incorporates Web-based communications that permit control of the assets over long distances and the sharing of data with remote experts. Autonomous hazard and assistance detection allows the automatic identification of hazards that require human intervention to ensure the safety and integrity of the robotic vehicles, or of science data that require human interpretation and response. Also, the architecture is designed for science analysis of acquired data in order to perform an initial onboard assessment of the presence of specific science signatures of immediate interest. TAOSF integrates and extends five subsystems developed by the participating institutions: Emergent Space Technologies, Wallops Flight Facility, NASA's Goddard Space Flight Center (GSFC), Carnegie Mellon University, and Jet Propulsion Laboratory (JPL). The OASIS Autonomous Surface Vehicle (ASV) system, which includes the vessels as well as the land-based control and communications infrastructure developed for them, controls the hardware of each platform (sensors, actuators, etc.), and also provides a low-level waypoint navigation capability. The Multi-Platform Simulation Environment from GSFC is a surrogate for the OASIS ASV system and allows for independent development and testing of higher-level software components. The Platform Communicator acts as a proxy for both actual and simulated platforms. It translates platform-independent messages from the higher control systems to the device-dependent communication protocols. This enables the higher-level control systems to interact identically with heterogeneous actual or simulated platforms.
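The Platform Communicator's proxy role, translating platform-independent commands into device-dependent protocols for real or simulated vessels, follows the classic adapter pattern; the sketch below is a generic illustration under assumed message formats, not the TAOSF code.

```python
from abc import ABC, abstractmethod

class PlatformAdapter(ABC):
    """Device-dependent side of a platform-communicator-style proxy."""
    @abstractmethod
    def send_waypoint(self, lat: float, lon: float) -> None: ...

class SimulatedVessel(PlatformAdapter):
    def send_waypoint(self, lat, lon):
        print(f"[sim] steering to ({lat:.4f}, {lon:.4f})")

class SerialVessel(PlatformAdapter):
    def send_waypoint(self, lat, lon):
        # A real adapter would encode the vendor-specific message here.
        packet = f"$WPT,{lat:.6f},{lon:.6f}\n".encode()
        print(f"[serial] would transmit {packet!r}")

def dispatch(platforms, lat, lon):
    """Higher-level control issues one platform-independent command."""
    for p in platforms:
        p.send_waypoint(lat, lon)

dispatch([SimulatedVessel(), SerialVessel()], 37.95, -75.47)
```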
Multi-octave spectral beam combiner on ultra-broadband photonic integrated circuit platform.
Stanton, Eric J; Heck, Martijn J R; Bovington, Jock; Spott, Alexander; Bowers, John E
2015-05-04
We present the design of a novel platform that is able to combine optical frequency bands spanning 4.2 octaves from ultraviolet to mid-wave infrared into a single, low M2 output waveguide. We present the design and realization of a key component in this platform that combines the wavelength bands of 350 nm - 1500 nm and 1500 nm - 6500 nm with demonstrated efficiency greater than 90% in near-infrared and mid-wave infrared. The multi-octave spectral beam combiner concept is realized using an integrated platform with silicon nitride waveguides and silicon waveguides. Simulated bandwidth is shown to be over four octaves, and measured bandwidth is shown over two octaves, limited by the availability of sources.
Design of Smart Multi-Functional Integrated Aviation Photoelectric Payload
NASA Astrophysics Data System (ADS)
Zhang, X.
2018-04-01
To support small UAVs on reconnaissance missions, we have developed a smart multi-functional integrated aviation photoelectric payload. The payload weighs only 1 kg and consists of a two-axis stabilized platform carrying a visible-light task payload, an infrared task payload, laser pointers and a video tracker. The photoelectric payload can complete reconnaissance tasks over the target area in both the visible and infrared bands. Because of its light weight, small size, full functionality and high level of integration, the constraints on the UAV platform carrying the payload are greatly reduced, making the payload suitable for a wider range of applications. Users of this smart multi-functional integrated aviation photoelectric payload can thus better accomplish tasks such as pinpointing ground targets, artillery calibration, strike damage assessment and customs surveillance.
Educational process in modern climatology within the web-GIS platform "Climate"
NASA Astrophysics Data System (ADS)
Gordova, Yulia; Gorbatenko, Valentina; Gordov, Evgeny; Martynova, Yulia; Okladnikov, Igor; Titov, Alexander; Shulgina, Tamara
2013-04-01
These days, the problem of training scientists, common to all scientific fields, is exacerbated in the environmental sciences by the need to develop new computational and information technology skills in distributed multi-disciplinary teams. To address this and other pressing problems of the Earth system sciences, a software infrastructure for the information support of integrated research in the geosciences was created based on modern information and computational technologies, and the software and hardware platform "Climate" (http://climate.scert.ru/) was developed. In addition to the direct analysis of geophysical data archives, the platform is aimed at teaching the basics of studying regional climate change. The educational component of the platform includes a series of lectures on climate, environmental and meteorological modeling, and cycles of laboratory work on the basics of analysing current and potential future regional climate change, using the territory of Siberia as an example. The educational process within the platform is implemented using the distance learning system Moodle (www.moodle.org). This work is partially supported by the Ministry of Education and Science of the Russian Federation (contract #8345), SB RAS project VIII.80.2.1, RFBR grant #11-05-01190a, and integrated project SB RAS #131.
Irigoyen, Antonio; Jimenez-Luna, Cristina; Benavides, Manuel; Caba, Octavio; Gallego, Javier; Ortuño, Francisco Manuel; Guillen-Ponce, Carmen; Rojas, Ignacio; Aranda, Enrique; Torres, Carolina; Prados, Jose
2018-01-01
Applying differentially expressed genes (DEGs) to identify feasible biomarkers in diseases can be a hard task when working with heterogeneous datasets. Expression data are strongly influenced by technology, sample preparation processes, and/or labeling methods. The proliferation of different microarray platforms for measuring gene expression increases the need to develop models able to compare their results, especially when different technologies can lead to signal values that vary greatly. Integrative meta-analysis can significantly improve the reliability and robustness of DEG detection. The objective of this work was to develop an integrative approach for identifying potential cancer biomarkers by integrating gene expression data from two different platforms. Pancreatic ductal adenocarcinoma (PDAC), where there is an urgent need to find new biomarkers due to its late diagnosis, is an ideal candidate for testing this technology. Expression data from two different datasets, namely Affymetrix and Illumina (18 and 36 PDAC patients, respectively), as well as from 18 healthy controls, were used for this study. A meta-analysis based on an empirical Bayesian methodology (ComBat) was then proposed to integrate these datasets. DEGs were finally identified from the integrated data by using the statistical programming language R. After our integrative meta-analysis, 5 genes were commonly identified within the individual analyses of the independent datasets. In addition, 28 novel genes that were not reported by the individual analyses ('gained' genes) were discovered. Several of these gained genes have already been related to other gastroenterological tumors. The proposed integrative meta-analysis has revealed novel DEGs that may play an important role in PDAC and could be potential biomarkers for diagnosing the disease.
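As a hedged illustration of the integration step, the sketch below removes per-platform location and scale differences gene by gene, a simplification of the empirical Bayes ComBat adjustment actually used (which additionally shrinks batch parameters across genes), and then tests for differential expression with a t-test and Benjamini-Hochberg correction on simulated data.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)
# Toy expression matrices (samples x genes) from two platforms with different
# location and scale -- stand-ins, not the study's Affymetrix/Illumina data.
platform_a = rng.normal(5.0, 1.0, (18, 1000))
platform_b = rng.normal(7.0, 2.0, (36, 1000))

def center_scale(X):
    """Crude per-gene location/scale adjustment (a simplification of ComBat,
    without its empirical Bayes shrinkage across genes)."""
    return (X - X.mean(axis=0)) / X.std(axis=0, ddof=1)

merged = np.vstack([center_scale(platform_a), center_scale(platform_b)])
labels = rng.integers(0, 2, size=merged.shape[0])   # placeholder tumour/control labels

# Per-gene differential expression test between the two groups.
t, p = stats.ttest_ind(merged[labels == 0], merged[labels == 1], axis=0)

# Benjamini-Hochberg (step-up) adjustment of the p-values.
m = len(p)
order = np.argsort(p)
adj = p[order] * m / np.arange(1, m + 1)
adj = np.minimum.accumulate(adj[::-1])[::-1]        # enforce monotonicity
p_adj = np.empty(m)
p_adj[order] = np.clip(adj, 0, 1)
print(f"{(p_adj < 0.05).sum()} genes pass FDR < 0.05 (none expected on random data)")
```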
Sittig, Dean F.; Hazlehurst, Brian L.; Brown, Jeffrey; Murphy, Shawn; Rosenman, Marc; Tarczy-Hornoch, Peter; Wilcox, Adam B.
2012-01-01
Comparative Effectiveness Research (CER) has the potential to transform the current healthcare delivery system by identifying the most effective medical and surgical treatments, diagnostic tests, disease prevention methods and ways to deliver care for specific clinical conditions. To be successful, such research requires the identification, capture, aggregation, integration, and analysis of disparate data sources held by different institutions with diverse representations of the relevant clinical events. In an effort to address these diverse demands, there have been multiple new designs and implementations of informatics platforms that provide access to electronic clinical data and the governance infrastructure required for inter-institutional CER. The goal of this manuscript is to help investigators understand why these informatics platforms are required and to compare and contrast six, large-scale, recently funded, CER-focused informatics platform development efforts. We utilized an 8-dimension, socio-technical model of health information technology use to help guide our work. We identified six generic steps that are necessary in any distributed, multi-institutional CER project: data identification, extraction, modeling, aggregation, analysis, and dissemination. We expect that over the next several years these projects will provide answers to many important, and heretofore unanswerable, clinical research questions. PMID:22692259
PADF RF localization experiments with multi-agent caged-MAV platforms
NASA Astrophysics Data System (ADS)
Barber, Christopher; Gates, Miguel; Selmic, Rastko; Al-Issa, Huthaifa; Ordonez, Raul; Mitra, Atindra
2011-06-01
This paper provides a summary of preliminary RF direction finding results generated within an AFOSR-funded testbed facility recently developed at Louisiana Tech University. This facility, denoted as the Louisiana Tech University Micro-Aerial Vehicle/Wireless Sensor Network (MAVSeN) Laboratory, has recently acquired a number of state-of-the-art MAV platforms that enable us to analyze, design, and test some of our recent results in the area of multi-platform position-adaptive direction finding (PADF) [1] [2] for localization of RF emitters in challenging embedded multipath environments. Discussions within the segmented sections of this paper include a description of the MAVSeN Laboratory and the preliminary results from the implementation of mobile platforms with the PADF algorithm. This novel approach to multi-platform RF direction finding is based on the investigation of iterative path-loss-based (i.e. path loss exponent) metric estimates that are measured across multiple platforms in order to develop a control law that robotically/intelligently positionally adapts (i.e. self-adjusts) the location of each distributed/cooperative platform. The body of this paper provides a summary of our recent results on PADF and includes a discussion on state-of-the-art sensor mote technologies as applied towards the development of a sensor-integrated caged-MAV platform for PADF applications. Also included is a discussion of recent experimental results that incorporate sample approaches to real-time single-platform data pruning, as part of a discussion on potential approaches to refining the basic PADF technique in order to integrate and perform distributed self-sensitivity and self-consistency analysis with distributed robotic/intelligent features. These techniques are extracted in analytical form from a parallel study denoted as "PADF RF Localization Criteria for Multi-Model Scattering Environments". The focus here is on developing and reporting specific approaches to self-sensitivity and self-consistency within this experimental PADF framework via the exploitation of specific single-agent caged-MAV trajectories that are unique to this experiment set.
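The path-loss exponent metric underlying PADF comes from the log-distance path-loss model; the sketch below estimates the exponent from received-signal-strength samples by least squares, using synthetic numbers rather than the testbed's measurements.

```python
import numpy as np

# Log-distance path-loss model: RSS(d) = RSS(d0) - 10 * n * log10(d / d0).
d0 = 1.0                     # reference distance in metres
rss_d0 = -40.0               # RSS at the reference distance (dBm), assumed
distances = np.array([2.0, 4.0, 8.0, 16.0])        # platform-to-emitter ranges
rss = np.array([-47.1, -52.9, -59.2, -66.0])       # synthetic measurements (dBm)

# Least-squares fit of the exponent n (regression through the origin).
x = -10.0 * np.log10(distances / d0)
n_hat = np.sum(x * (rss - rss_d0)) / np.sum(x ** 2)
print(f"estimated path-loss exponent: {n_hat:.2f}")
```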
A multi-tissue type genome-scale metabolic network for analysis of whole-body systems physiology
2011-01-01
Background Genome-scale metabolic reconstructions provide a biologically meaningful mechanistic basis for the genotype-phenotype relationship. The global human metabolic network, termed Recon 1, has recently been reconstructed allowing the systems analysis of human metabolic physiology and pathology. Utilizing high-throughput data, Recon 1 has recently been tailored to different cells and tissues, including the liver, kidney, brain, and alveolar macrophage. These models have shown utility in the study of systems medicine. However, no integrated analysis between human tissues has been done. Results To describe tissue-specific functions, Recon 1 was tailored to describe metabolism in three human cells: adipocytes, hepatocytes, and myocytes. These cell-specific networks were manually curated and validated based on known cellular metabolic functions. To study intercellular interactions, a novel multi-tissue type modeling approach was developed to integrate the metabolic functions for the three cell types, and subsequently used to simulate known integrated metabolic cycles. In addition, the multi-tissue model was used to study diabetes: a pathology with systemic properties. High-throughput data was integrated with the network to determine differential metabolic activity between obese and type II diabetic obese gastric bypass patients in a whole-body context. Conclusion The multi-tissue type modeling approach presented provides a platform to study integrated metabolic states. As more cell and tissue-specific models are released, it is critical to develop a framework in which to study their interdependencies. PMID:22041191
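Models of this kind are typically interrogated with flux balance analysis; the toy linear program below shows the underlying computation, maximizing one flux subject to the steady-state mass balance S·v = 0, on an invented three-reaction network rather than Recon 1 or the multi-tissue model.

```python
import numpy as np
from scipy.optimize import linprog

# Toy stoichiometric matrix S (metabolites x reactions) for an invented network:
# R1: -> A, R2: A -> B, R3: B ->  (an uptake, a conversion, and a sink).
S = np.array([[ 1, -1,  0],    # metabolite A
              [ 0,  1, -1]])   # metabolite B

bounds = [(0, 10), (0, 1000), (0, 1000)]   # flux bounds; uptake capped at 10
c = np.array([0, 0, -1.0])                 # maximize v3 => minimize -v3

res = linprog(c, A_eq=S, b_eq=np.zeros(S.shape[0]), bounds=bounds, method="highs")
print("optimal fluxes:", res.x)            # expect all three fluxes at 10
```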
Goscinski, Wojtek J.; McIntosh, Paul; Felzmann, Ulrich; Maksimenko, Anton; Hall, Christopher J.; Gureyev, Timur; Thompson, Darren; Janke, Andrew; Galloway, Graham; Killeen, Neil E. B.; Raniga, Parnesh; Kaluza, Owen; Ng, Amanda; Poudel, Govinda; Barnes, David G.; Nguyen, Toan; Bonnington, Paul; Egan, Gary F.
2014-01-01
The Multi-modal Australian ScienceS Imaging and Visualization Environment (MASSIVE) is a national imaging and visualization facility established by Monash University, the Australian Synchrotron, the Commonwealth Scientific Industrial Research Organization (CSIRO), and the Victorian Partnership for Advanced Computing (VPAC), with funding from the National Computational Infrastructure and the Victorian Government. The MASSIVE facility provides hardware, software, and expertise to drive research in the biomedical sciences, particularly advanced brain imaging research using synchrotron x-ray and infrared imaging, functional and structural magnetic resonance imaging (MRI), x-ray computer tomography (CT), electron microscopy and optical microscopy. The development of MASSIVE has been based on best practice in system integration methodologies, frameworks, and architectures. The facility has: (i) integrated multiple different neuroimaging analysis software components, (ii) enabled cross-platform and cross-modality integration of neuroinformatics tools, and (iii) brought together neuroimaging databases and analysis workflows. MASSIVE is now operational as a nationally distributed and integrated facility for neuroinfomatics and brain imaging research. PMID:24734019
PaintOmics 3: a web resource for the pathway analysis and visualization of multi-omics data.
Hernández-de-Diego, Rafael; Tarazona, Sonia; Martínez-Mira, Carlos; Balzano-Nogueira, Leandro; Furió-Tarí, Pedro; Pappas, Georgios J; Conesa, Ana
2018-05-25
The increasing availability of multi-omic platforms poses new challenges to data analysis. Joint visualization of multi-omics data is instrumental in better understanding interconnections across molecular layers and in fully utilizing the multi-omic resources available to make biological discoveries. We present here PaintOmics 3, a web-based resource for the integrated visualization of multiple omic data types onto KEGG pathway diagrams. PaintOmics 3 combines server-end capabilities for data analysis with the potential of modern web resources for data visualization, providing researchers with a powerful framework for interactive exploration of their multi-omics information. Unlike other visualization tools, PaintOmics 3 covers a comprehensive pathway analysis workflow, including automatic feature name/identifier conversion, multi-layered feature matching, pathway enrichment, network analysis, interactive heatmaps, trend charts, and more. It accepts a wide variety of omic types, including transcriptomics, proteomics and metabolomics, as well as region-based approaches such as ATAC-seq or ChIP-seq data. The tool is freely available at www.paintomics.org.
Wang, Yeqiao; Nemani, Ramakrishna; Dieffenbach, Fred; Stolte, Kenneth; Holcomb, Glenn B.; Robinson, Matt; Reese, Casey C.; McNiff, Marcia; Duhaime, Roland; Tierney, Geri; Mitchell, Brian; August, Peter; Paton, Peter; LaBash, Charles
2010-01-01
This paper introduces a collaborative multi-agency effort to develop an Appalachian Trail (A.T.) MEGA-Transect Decision Support System (DSS) for monitoring, reporting and forecasting ecological conditions of the A.T. and the surrounding lands. The project is to improve decision-making on management of the A.T. by providing a coherent framework for data integration, status reporting and trend analysis. The A.T. MEGA-Transect DSS is to integrate NASA multi-platform sensor data and modeling through the Terrestrial Observation and Prediction System (TOPS) and in situ measurements from A.T. MEGA-Transect partners to address identified natural resource priorities and improve resource management decisions.
Data processing, multi-omic pathway mapping, and metabolite activity analysis using XCMS Online
Forsberg, Erica M; Huan, Tao; Rinehart, Duane; Benton, H Paul; Warth, Benedikt; Hilmers, Brian; Siuzdak, Gary
2018-01-01
Systems biology is the study of complex living organisms, and as such, analysis on a systems-wide scale involves the collection of information-dense data sets that are representative of an entire phenotype. To uncover dynamic biological mechanisms, bioinformatics tools have become essential to facilitating data interpretation in large-scale analyses. Global metabolomics is one such method for performing systems biology, as metabolites represent the downstream functional products of ongoing biological processes. We have developed XCMS Online, a platform that enables online metabolomics data processing and interpretation. A systems biology workflow recently implemented within XCMS Online enables rapid metabolic pathway mapping using raw metabolomics data for investigating dysregulated metabolic processes. In addition, this platform supports integration of multi-omic (such as genomic and proteomic) data to garner further systems-wide mechanistic insight. Here, we provide an in-depth procedure showing how to effectively navigate and use the systems biology workflow within XCMS Online without a priori knowledge of the platform, including uploading liquid chromatography (LC)–mass spectrometry (MS) data from metabolite-extracted biological samples, defining the job parameters to identify features, correcting for retention time deviations, conducting statistical analysis of features between sample classes and performing predictive metabolic pathway analysis. Additional multi-omics data can be uploaded and overlaid with previously identified pathways to enhance systems-wide analysis of the observed dysregulations. We also describe unique visualization tools to assist in elucidation of statistically significant dysregulated metabolic pathways. Parameter input takes 5–10 min, depending on user experience; data processing typically takes 1–3 h, and data analysis takes ~30 min. PMID:29494574
Cross-platform validation and analysis environment for particle physics
NASA Astrophysics Data System (ADS)
Chekanov, S. V.; Pogrebnyak, I.; Wilbern, D.
2017-11-01
A multi-platform validation and analysis framework for public Monte Carlo simulation for high-energy particle collisions is discussed. The front-end of this framework uses the Python programming language, while the back-end is written in Java, which provides a multi-platform environment that can be run from a web browser and can easily be deployed at the grid sites. The analysis package includes all major software tools used in high-energy physics, such as Lorentz vectors, jet algorithms, histogram packages, graphic canvases, and tools for providing data access. This multi-platform software suite, designed to minimize OS-specific maintenance and deployment time, is used for online validation of Monte Carlo event samples through a web interface.
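The framework's own four-vector and histogram classes are written in Java; as a language-neutral illustration of one listed ingredient, the short sketch below builds two Lorentz vectors and computes their invariant mass (values in GeV, invented for the example).

```python
import math

class LorentzVector:
    """Minimal four-vector (E, px, py, pz) in natural units."""
    def __init__(self, e, px, py, pz):
        self.e, self.px, self.py, self.pz = e, px, py, pz

    def __add__(self, other):
        return LorentzVector(self.e + other.e, self.px + other.px,
                             self.py + other.py, self.pz + other.pz)

    def mass(self):
        m2 = self.e ** 2 - (self.px ** 2 + self.py ** 2 + self.pz ** 2)
        return math.sqrt(max(m2, 0.0))

# Two lepton candidates (GeV); their combination lands near the Z mass.
l1 = LorentzVector(45.2, 30.1, 20.5, 26.8)
l2 = LorentzVector(46.0, -29.8, -21.0, -27.5)
print(f"dilepton invariant mass: {(l1 + l2).mass():.1f} GeV")
```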
The Development of GIS Educational Resources Sharing among Central Taiwan Universities
NASA Astrophysics Data System (ADS)
Chou, T.-Y.; Yeh, M.-L.; Lai, Y.-C.
2011-09-01
Using GIS in the classroom enhances students' computer skills and broadens the range of knowledge they can explore. This paper highlights GIS integration on an e-learning platform and introduces a variety of rich educational resources. The research project demonstrates tools for an e-learning environment and delivers case studies on learning interaction from the Central Taiwan universities. Feng Chia University (FCU) obtained a remarkable academic project subsidized by the Ministry of Education and developed an e-learning platform for excellence in teaching and learning programs among central Taiwan's universities. The aim of the project is to integrate the educational resources of 13 universities in central Taiwan, with FCU serving as the hub university. To overcome the problem of distance, e-platforms have been established to create experiences of collaboration-enhanced learning. The e-platforms coordinate web service access within the educational community and deliver GIS educational resources. Most GIS-related courses cover the development of GIS, principles of cartography, spatial data analysis and overlay, terrain analysis, buffer analysis, 3D GIS applications, remote sensing, GPS technology, WebGIS, mobile GIS and ArcGIS operation. In each GIS case study, students are taught to understand geographic meaning, collect spatial data, and then use ArcGIS software to analyze the data. One of the e-learning platforms provides lesson plans and presentation slides, so students can learn ArcGIS online. As they analyze spatial data, they can connect to the GIS hub to obtain the data they need, including satellite images, aerial photos and vector data. Moreover, the e-learning platforms provide solutions and resources, and different levels of image scale have been integrated into the systems. Multi-scale spatial development and analyses in central Taiwan integrate academic research resources among the CTTLRC partners, establishing a decision-making support mechanism in teaching and learning and accelerating communication, cooperation and sharing among academic units.
Modeling and analysis of a flywheel microvibration isolation system for spacecrafts
NASA Astrophysics Data System (ADS)
Wei, Zhanji; Li, Dongxu; Luo, Qing; Jiang, Jianping
2015-01-01
The microvibrations generated by flywheels running at full speed onboard high-precision spacecraft will affect the stability of the spacecraft bus and further degrade the pointing accuracy of the payload. A passive vibration isolation platform comprised of multi-segment zig-zag beams is proposed to isolate disturbances of the flywheel. By considering the flywheel and the platform as an integral system with gyroscopic effects, an equivalent dynamic model is developed and verified through eigenvalue and frequency response analysis. The critical speeds of the system are deduced and expressed as functions of system parameters. The vibration isolation performance of the platform under synchronous and high-order harmonic disturbances caused by the flywheel is investigated. It is found that the speed range within which the passive platform is effective and the disturbance decay rate of the system are greatly influenced by the locations of the critical speeds. Structure optimization of the platform is carried out to enhance its performance. Simulation results show that a properly designed vibration isolation platform can effectively reduce disturbances emitted by the flywheel operating above the critical speeds of the system.
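The rotordynamic reasoning behind the critical speeds can be illustrated with a minimal sketch: a 2-DOF rotor with gyroscopic coupling whose whirl frequencies are swept over spin speed until a branch crosses the synchronous (1x) line. This is not the authors' model; the inertia and stiffness values are placeholders chosen only to produce a crossing.

# Hedged sketch (not the paper's model): whirl frequencies of a minimal
# gyroscopic rotor, swept over spin speed to locate a synchronous critical speed.
import numpy as np

It, Ip, k = 0.05, 0.08, 2.0e3        # transverse/polar inertia [kg m^2], stiffness [N m/rad], illustrative
M = It * np.eye(2)
K = k * np.eye(2)
G = Ip * np.array([[0.0, 1.0], [-1.0, 0.0]])   # skew-symmetric gyroscopic matrix

def whirl_frequencies(omega):
    """Sorted magnitudes of the imaginary eigenvalue parts at spin speed omega."""
    A = np.block([[np.zeros((2, 2)), np.eye(2)],
                  [-np.linalg.solve(M, K), -np.linalg.solve(M, omega * G)]])
    return np.sort(np.abs(np.linalg.eigvals(A).imag))

for w in np.linspace(1.0, 600.0, 600):           # spin speed sweep [rad/s]
    if whirl_frequencies(w)[0] <= w:             # backward-whirl branch crosses the 1x line
        print(f"approximate synchronous critical speed: {w:.0f} rad/s")
        break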
Cross-platform validation and analysis environment for particle physics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chekanov, S. V.; Pogrebnyak, I.; Wilbern, D.
A multi-platform validation and analysis framework for public Monte Carlo simulation for high-energy particle collisions is discussed. The front-end of this framework uses the Python programming language, while the back-end is written in Java, which provides a multi-platform environment that can be run from a web browser and can easily be deployed at the grid sites. The analysis package includes all major software tools used in high-energy physics, such as Lorentz vectors, jet algorithms, histogram packages, graphic canvases, and tools for providing data access. This multi-platform software suite, designed to minimize OS-specific maintenance and deployment time, is used for online validation of Monte Carlo event samples through a web interface.
NASA Astrophysics Data System (ADS)
Bao, Cheng; Cai, Ningsheng; Croiset, Eric
2011-10-01
Following our integrated hierarchical modeling framework of natural gas internal reforming solid oxide fuel cell (IRSOFC), this paper firstly introduces the model libraries of the main balancing units, including some state-of-the-art achievements and our specific work. Based on gPROMS programming code, flexible configuration and modular design are fully realized by specifying graphically all unit models in each level. Via comparison with the steady-state experimental data of the Siemens-Westinghouse demonstration system, the in-house multi-level SOFC-gas turbine (GT) simulation platform is validated to be more accurate than the advanced power system analysis tool (APSAT). Moreover, some units of the demonstration system are reverse-designed for the analysis of a typical part-load transient process. The framework of distributed and dynamic modeling in most of the units is significant for the development of control strategies in the future.
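As a minimal example of the kind of relation such balancing-unit libraries contain, the sketch below evaluates the Nernst potential of the hydrogen oxidation reaction at SOFC operating temperature. It is not taken from the paper's gPROMS models; the standard potential and gas partial pressures are illustrative assumptions.

# Minimal sketch (not the paper's gPROMS code): Nernst potential of
# H2 + 1/2 O2 -> H2O(g) at a representative SOFC temperature.
import math

R, F = 8.314, 96485.0                  # J/(mol K), C/mol
T = 1073.0                             # cell temperature [K]
E0 = 0.97                              # approx. standard potential at ~1073 K [V], assumed
p_H2, p_O2, p_H2O = 0.45, 0.21, 0.35   # partial pressures [bar], illustrative

E_nernst = E0 + (R * T / (2.0 * F)) * math.log(p_H2 * math.sqrt(p_O2) / p_H2O)
print(f"Nernst potential: {E_nernst:.3f} V")   # roughly 0.95 V open-circuit estimate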
DataViewer3D: An Open-Source, Cross-Platform Multi-Modal Neuroimaging Data Visualization Tool
Gouws, André; Woods, Will; Millman, Rebecca; Morland, Antony; Green, Gary
2008-01-01
Integration and display of results from multiple neuroimaging modalities [e.g. magnetic resonance imaging (MRI), magnetoencephalography, EEG] relies on display of a diverse range of data within a common, defined coordinate frame. DataViewer3D (DV3D) is a multi-modal imaging data visualization tool offering a cross-platform, open-source solution to simultaneous data overlay visualization requirements of imaging studies. While DV3D is primarily a visualization tool, the package allows an analysis approach where results from one imaging modality can guide comparative analysis of another modality in a single coordinate space. DV3D is built on Python, a dynamic object-oriented programming language with support for integration of modular toolkits, and development of cross-platform software for neuroimaging. DV3D harnesses the power of the Visualization Toolkit (VTK) for two-dimensional (2D) and 3D rendering, calling VTK's low level C++ functions from Python. Users interact with data via an intuitive interface that uses Python to bind wxWidgets, which in turn calls the user's operating system dialogs and graphical user interface tools. DV3D currently supports NIfTI-1, ANALYZE™ and DICOM formats for MRI data display (including statistical data overlay). Formats for other data types are supported. The modularity of DV3D and ease of use of Python allows rapid integration of additional format support and user development. DV3D has been tested on Mac OSX, RedHat Linux and Microsoft Windows XP. DV3D is offered for free download with an extensive set of tutorial resources and example data. PMID:19352444
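A minimal VTK-Python sketch of the source-mapper-actor-renderer pipeline that DV3D builds on is shown below; it is not DV3D code and replaces the neuroimaging readers and the wxWidgets GUI binding with a trivial geometric source.

# Minimal VTK rendering pipeline in Python (illustrative, not DV3D itself).
import vtk

source = vtk.vtkSphereSource()                 # stand-in for an imaging dataset
mapper = vtk.vtkPolyDataMapper()
mapper.SetInputConnection(source.GetOutputPort())

actor = vtk.vtkActor()
actor.SetMapper(mapper)

renderer = vtk.vtkRenderer()
renderer.AddActor(actor)

window = vtk.vtkRenderWindow()
window.AddRenderer(renderer)

interactor = vtk.vtkRenderWindowInteractor()
interactor.SetRenderWindow(window)

window.Render()
interactor.Start()                             # opens an interactive 3D view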
Integration of multi-interface conversion channel using FPGA for modular photonic network
NASA Astrophysics Data System (ADS)
Janicki, Tomasz; Pozniak, Krzysztof T.; Romaniuk, Ryszard S.
2010-09-01
The article discusses the integration of different types of interfaces with FPGA circuits using a reconfigurable communication platform. The solution has been implemented in practice in a single node of a distributed measurement system. The construction of the communication platform is presented together with selected hardware modules, described in VHDL and implemented in FPGA circuits. The graphical user interface (GUI) that allows a user to control the operation of the system is also described. In the final part of the article selected practical solutions are introduced. The whole measurement system resides on a multi-gigabit optical network. The optical network construction is highly modular, reconfigurable and scalable.
A Geospatial Information Grid Framework for Geological Survey.
Wu, Liang; Xue, Lei; Li, Chaoling; Lv, Xia; Chen, Zhanlong; Guo, Mingqiang; Xie, Zhong
2015-01-01
The use of digital information in geological fields is becoming very important. Thus, informatization in geological surveys should not stagnate as a result of the level of data accumulation. The integration and sharing of distributed, multi-source, heterogeneous geological information is an open problem in geological domains. Applications and services use geological spatial data with many features, including being cross-region and cross-domain and requiring real-time updating. As a result of these features, desktop and web-based geographic information systems (GISs) experience difficulties in meeting the demand for geological spatial information. To facilitate the real-time sharing of data and services in distributed environments, a GIS platform that is open, integrative, reconfigurable, reusable and elastic would represent an indispensable tool. The purpose of this paper is to develop a geological cloud-computing platform for integrating and sharing geological information based on a cloud architecture. Thus, the geological cloud-computing platform defines geological ontology semantics; designs a standard geological information framework and a standard resource integration model; builds a peer-to-peer node management mechanism; achieves the description, organization, discovery, computing and integration of the distributed resources; and provides the distributed spatial meta service, the spatial information catalog service, the multi-mode geological data service and the spatial data interoperation service. The geological survey information cloud-computing platform has been implemented, and based on the platform, some geological data services and geological processing services were developed. Furthermore, an iron mine resource forecast and an evaluation service is introduced in this paper.
A Geospatial Information Grid Framework for Geological Survey
Wu, Liang; Xue, Lei; Li, Chaoling; Lv, Xia; Chen, Zhanlong; Guo, Mingqiang; Xie, Zhong
2015-01-01
The use of digital information in geological fields is becoming very important. Thus, informatization in geological surveys should not stagnate as a result of the level of data accumulation. The integration and sharing of distributed, multi-source, heterogeneous geological information is an open problem in geological domains. Applications and services use geological spatial data with many features, including being cross-region and cross-domain and requiring real-time updating. As a result of these features, desktop and web-based geographic information systems (GISs) experience difficulties in meeting the demand for geological spatial information. To facilitate the real-time sharing of data and services in distributed environments, a GIS platform that is open, integrative, reconfigurable, reusable and elastic would represent an indispensable tool. The purpose of this paper is to develop a geological cloud-computing platform for integrating and sharing geological information based on a cloud architecture. Thus, the geological cloud-computing platform defines geological ontology semantics; designs a standard geological information framework and a standard resource integration model; builds a peer-to-peer node management mechanism; achieves the description, organization, discovery, computing and integration of the distributed resources; and provides the distributed spatial meta service, the spatial information catalog service, the multi-mode geological data service and the spatial data interoperation service. The geological survey information cloud-computing platform has been implemented, and based on the platform, some geological data services and geological processing services were developed. Furthermore, an iron mine resource forecast and an evaluation service is introduced in this paper. PMID:26710255
IGMS: An Integrated ISO-to-Appliance Scale Grid Modeling System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Palmintier, Bryan; Hale, Elaine; Hansen, Timothy M.
This paper describes the Integrated Grid Modeling System (IGMS), a novel electric power system modeling platform for integrated transmission-distribution analysis that co-simulates off-the-shelf tools on high performance computing (HPC) platforms to offer unprecedented resolution from ISO markets down to appliances and other end uses. Specifically, the system simultaneously models hundreds or thousands of distribution systems in co-simulation with detailed Independent System Operator (ISO) markets and AGC-level reserve deployment. IGMS uses a new MPI-based hierarchical co-simulation framework to connect existing sub-domain models. Our initial efforts integrate open-source tools for wholesale markets (FESTIV), bulk AC power flow (MATPOWER), and full-featured distribution systems including physics-based end-use and distributed generation models (many instances of GridLAB-D[TM]). The modular IGMS framework enables tool substitution and additions for multi-domain analyses. This paper describes the IGMS tool, characterizes its performance, and demonstrates the impacts of the coupled simulations for analyzing high-penetration solar PV and price responsive load scenarios.
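The hierarchical co-simulation idea can be sketched with mpi4py: one rank stands in for the bulk/transmission solver, the remaining ranks each run a distribution feeder, and boundary variables (bus voltage down, feeder load up) are exchanged every step. This skeleton is not IGMS source code, and the toy voltage/load relations are assumptions for illustration; run it with, e.g., mpiexec -n 4 python cosim.py.

# Hedged co-simulation skeleton (not IGMS): rank 0 = transmission, others = feeders.
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

n_steps = 10
feeder_load = 1.0                       # per-feeder load [MW], illustrative

for step in range(n_steps):
    if rank == 0:
        # transmission side: aggregate feeder loads, then broadcast a bus voltage
        loads = comm.gather(None, root=0)
        total_load = sum(l for l in loads if l is not None)
        voltage = 1.05 - 0.001 * total_load        # toy voltage/load relation
    else:
        comm.gather(feeder_load, root=0)
        voltage = None
    voltage = comm.bcast(voltage if rank == 0 else None, root=0)
    if rank != 0:
        # distribution side: re-solve the feeder at the new boundary voltage
        feeder_load = 1.0 + 0.2 * (1.0 - voltage)  # toy voltage-dependent load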
Crops In Silico: Generating Virtual Crops Using an Integrative and Multi-scale Modeling Platform.
Marshall-Colon, Amy; Long, Stephen P; Allen, Douglas K; Allen, Gabrielle; Beard, Daniel A; Benes, Bedrich; von Caemmerer, Susanne; Christensen, A J; Cox, Donna J; Hart, John C; Hirst, Peter M; Kannan, Kavya; Katz, Daniel S; Lynch, Jonathan P; Millar, Andrew J; Panneerselvam, Balaji; Price, Nathan D; Prusinkiewicz, Przemyslaw; Raila, David; Shekar, Rachel G; Shrivastava, Stuti; Shukla, Diwakar; Srinivasan, Venkatraman; Stitt, Mark; Turk, Matthew J; Voit, Eberhard O; Wang, Yu; Yin, Xinyou; Zhu, Xin-Guang
2017-01-01
Multi-scale models can facilitate whole plant simulations by linking gene networks, protein synthesis, metabolic pathways, physiology, and growth. Whole plant models can be further integrated with ecosystem, weather, and climate models to predict how various interactions respond to environmental perturbations. These models have the potential to fill in missing mechanistic details and generate new hypotheses to prioritize directed engineering efforts. Outcomes will potentially accelerate improvement of crop yield, sustainability, and increase future food security. It is time for a paradigm shift in plant modeling, from largely isolated efforts to a connected community that takes advantage of advances in high performance computing and mechanistic understanding of plant processes. Tools for guiding future crop breeding and engineering, understanding the implications of discoveries at the molecular level for whole plant behavior, and improved prediction of plant and ecosystem responses to the environment are urgently needed. The purpose of this perspective is to introduce Crops in silico (cropsinsilico.org), an integrative and multi-scale modeling platform, as one solution that combines isolated modeling efforts toward the generation of virtual crops, which is open and accessible to the entire plant biology community. The major challenges involved both in the development and deployment of a shared, multi-scale modeling platform, which are summarized in this prospectus, were recently identified during the first Crops in silico Symposium and Workshop.
Crops In Silico: Generating Virtual Crops Using an Integrative and Multi-scale Modeling Platform
Marshall-Colon, Amy; Long, Stephen P.; Allen, Douglas K.; Allen, Gabrielle; Beard, Daniel A.; Benes, Bedrich; von Caemmerer, Susanne; Christensen, A. J.; Cox, Donna J.; Hart, John C.; Hirst, Peter M.; Kannan, Kavya; Katz, Daniel S.; Lynch, Jonathan P.; Millar, Andrew J.; Panneerselvam, Balaji; Price, Nathan D.; Prusinkiewicz, Przemyslaw; Raila, David; Shekar, Rachel G.; Shrivastava, Stuti; Shukla, Diwakar; Srinivasan, Venkatraman; Stitt, Mark; Turk, Matthew J.; Voit, Eberhard O.; Wang, Yu; Yin, Xinyou; Zhu, Xin-Guang
2017-01-01
Multi-scale models can facilitate whole plant simulations by linking gene networks, protein synthesis, metabolic pathways, physiology, and growth. Whole plant models can be further integrated with ecosystem, weather, and climate models to predict how various interactions respond to environmental perturbations. These models have the potential to fill in missing mechanistic details and generate new hypotheses to prioritize directed engineering efforts. Outcomes will potentially accelerate improvement of crop yield, sustainability, and increase future food security. It is time for a paradigm shift in plant modeling, from largely isolated efforts to a connected community that takes advantage of advances in high performance computing and mechanistic understanding of plant processes. Tools for guiding future crop breeding and engineering, understanding the implications of discoveries at the molecular level for whole plant behavior, and improved prediction of plant and ecosystem responses to the environment are urgently needed. The purpose of this perspective is to introduce Crops in silico (cropsinsilico.org), an integrative and multi-scale modeling platform, as one solution that combines isolated modeling efforts toward the generation of virtual crops, which is open and accessible to the entire plant biology community. The major challenges involved both in the development and deployment of a shared, multi-scale modeling platform, which are summarized in this prospectus, were recently identified during the first Crops in silico Symposium and Workshop. PMID:28555150
A Multi-Level Parallelization Concept for High-Fidelity Multi-Block Solvers
NASA Technical Reports Server (NTRS)
Hatay, Ferhat F.; Jespersen, Dennis C.; Guruswamy, Guru P.; Rizk, Yehia M.; Byun, Chansup; Gee, Ken; VanDalsem, William R. (Technical Monitor)
1997-01-01
The integration of high-fidelity Computational Fluid Dynamics (CFD) analysis tools with the industrial design process benefits greatly from robust implementations that are transportable across a wide range of computer architectures. In the present work, a hybrid domain-decomposition and parallelization concept was developed and implemented into the widely used NASA multi-block CFD packages ENSAERO and OVERFLOW. The new parallel solver concept, PENS (Parallel Euler Navier-Stokes Solver), employs both fine and coarse granularity in data partitioning as well as data coalescing to obtain the desired load-balance characteristics on the available computer platforms. This multi-level parallelism implementation itself introduces no changes to the numerical results, hence the original fidelity of the packages is identically preserved. The present implementation uses the Message Passing Interface (MPI) library for interprocessor message passing and memory accessing. By choosing an appropriate combination of the available partitioning and coalescing capabilities only during the execution stage, the PENS solver becomes adaptable to different computer architectures, from shared-memory to distributed-memory platforms with varying degrees of parallelism. The PENS implementation on the IBM SP2 distributed-memory environment at the NASA Ames Research Center obtains 85 percent scalable parallel performance using fine-grain partitioning of single-block CFD domains on up to 128 wide computational nodes. Multi-block CFD simulations of complete aircraft configurations achieve 75 percent of perfectly load-balanced execution using data coalescing and the two levels of parallelism. SGI PowerChallenge, SGI Origin 2000, and a cluster of workstations are the other platforms on which the robustness of the implementation is tested. The performance behavior on the other computer platforms with a variety of realistic problems will be included as this ongoing study progresses.
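The partitioning-plus-coalescing load-balancing idea can be illustrated independently of MPI: split any block larger than the per-rank target into pieces, then greedily pack the pieces onto the least-loaded rank. The sketch below is not PENS itself, and the block sizes are invented.

# Illustrative sketch (not PENS): fine-grain partitioning of large blocks and
# greedy coalescing of pieces onto ranks to balance the cell count per rank.
import heapq

block_cells = [4_000_000, 900_000, 850_000, 120_000, 80_000, 3_200_000]  # toy block sizes
n_ranks = 4
target = sum(block_cells) / n_ranks

# fine-grain partitioning: split any block larger than the per-rank target
pieces = []
for cells in block_cells:
    n_split = max(1, round(cells / target))
    pieces += [cells / n_split] * n_split

# coalescing: assign each piece to the currently least-loaded rank
ranks = [(0.0, r) for r in range(n_ranks)]
heapq.heapify(ranks)
assignment = {r: [] for r in range(n_ranks)}
for piece in sorted(pieces, reverse=True):
    load, r = heapq.heappop(ranks)
    assignment[r].append(piece)
    heapq.heappush(ranks, (load + piece, r))

for r, on_rank in assignment.items():
    print(f"rank {r}: {sum(on_rank)/1e6:.2f} M cells in {len(on_rank)} pieces")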
NASA Astrophysics Data System (ADS)
Berres, A.; Karthik, R.; Nugent, P.; Sorokine, A.; Myers, A.; Pang, H.
2017-12-01
Building an integrated data infrastructure that can meet the needs of sustainable energy-water resource management requires a robust data management and geovisual analytics platform, capable of cross-domain scientific discovery and knowledge generation. Such a platform can facilitate the investigation of diverse complex research and policy questions for emerging priorities in Energy-Water Nexus (EWN) science areas. Using advanced data analytics, machine learning techniques, multi-dimensional statistical tools, and interactive geovisualization components, such a multi-layered federated platform is being developed: the Energy-Water Nexus Knowledge Discovery Framework (EWN-KDF). This platform utilizes several enterprise-grade software design concepts and standards, such as extensible service-oriented architecture, open standard protocols, an event-driven programming model, an enterprise service bus, and adaptive user interfaces, to provide strategic value to the integrative computational and data infrastructure. EWN-KDF is built on the Compute and Data Environment for Science (CADES) at Oak Ridge National Laboratory (ORNL).
2016-02-16
into areas where there is no access to maritime platforms. Sea-based interceptor platforms have the ability to intercept targets at each stage of the... argues that the most efficient concept for integrating active defense weapon systems is a multi-layered architecture with redundant intercept... faster data transfer and will prevent data loss. The need for almost 100% interception successes is increasing as the threat becomes more
NASA Astrophysics Data System (ADS)
Wu, Mingching; Fang, Weileun
2006-02-01
This work attempts to integrate poly-Si thin film and single-crystal-silicon (SCS) structures in a monolithic process. The process integrated multi-depth DRIE (deep reactive ion etching), trench-refilled molding, a two poly-Si MUMPs process and (1 1 1) Si bulk micromachining to accomplish multi-thickness and multi-depth structures for superior micro-optical devices. In application, a SCS scanning mirror driven by self-aligned vertical comb-drive actuators was demonstrated. The stiffness of the mirror was significantly increased by thick SCS structures. The thin poly-Si film served as flexible torsional springs and electrical routings. The depth difference of the vertical comb electrodes was tuned by DRIE to increase the devices' stroke. Finally, a large moving space was available after the bulk Si etching. In summary, the present fabrication process, named (1 1 1) MOSBE (molded surface-micromachining and bulk etching release on (1 1 1) Si substrate), can further integrate with the MUMPs devices to establish a more powerful platform.
Study of multi-LLID technology to support multi-services carrying in EPONs
NASA Astrophysics Data System (ADS)
Li, Wang; Yi, Benshun; Cheng, Chuanqing
2006-09-01
The Ethernet Passive Optical Network (EPON) has recently attracted more and more research attention since it could be a perfect candidate for next-generation access networks. EPON uses a PON structure to carry Ethernet data, combining the advantages of PON and Ethernet devices. Traditionally, EPON has been considered only an Ethernet service access platform and weak in supporting multiple services, especially real-time services. It is obvious that if EPON is designed only to carry data services, it is difficult for EPON devices to fulfill service providers' demand to use EPON as an integrated service access platform. Discussing the multi-service carrying technology in EPONs is therefore a significant task. This paper presents a novel multi-LLID method to support multi-service carrying in EPONs.
A mechanical cell disruption microfluidic platform based on an on-chip micropump.
Cheng, Yinuo; Wang, Yue; Wang, Zhiyuan; Huang, Liang; Bi, Mingzhao; Xu, Wenxiao; Wang, Wenhui; Ye, Xiongying
2017-03-01
Cell disruption plays a vital role in detection of intracellular components which contain information about genetic and disease characteristics. In this paper, we demonstrate a novel microfluidic platform based on an on-chip micropump for mechanical cell disruption and sample transport. A 50 μl cell sample can be effectively lysed through on-chip multi-disruption in 36 s without introducing any chemical agent and suffering from clogging by cellular debris. After 30 cycles of circulating disruption, 80.6% and 90.5% cell disruption rates were achieved for the HEK293 cell sample and human natural killer cell sample, respectively. Profiting from the feature of pump-on-chip, the highly integrated platform enables more convenient and cost-effective cell disruption for the analysis of intracellular components.
A mechanical cell disruption microfluidic platform based on an on-chip micropump
Cheng, Yinuo; Wang, Yue; Wang, Zhiyuan; Bi, Mingzhao; Xu, Wenxiao; Ye, Xiongying
2017-01-01
Cell disruption plays a vital role in detection of intracellular components which contain information about genetic and disease characteristics. In this paper, we demonstrate a novel microfluidic platform based on an on-chip micropump for mechanical cell disruption and sample transport. A 50 μl cell sample can be effectively lysed through on-chip multi-disruption in 36 s without introducing any chemical agent and suffering from clogging by cellular debris. After 30 cycles of circulating disruption, 80.6% and 90.5% cell disruption rates were achieved for the HEK293 cell sample and human natural killer cell sample, respectively. Profiting from the feature of pump-on-chip, the highly integrated platform enables more convenient and cost-effective cell disruption for the analysis of intracellular components. PMID:28798848
Intensive time series data exploitation: the Multi-sensor Evolution Analysis (MEA) platform
NASA Astrophysics Data System (ADS)
Mantovani, Simone; Natali, Stefano; Folegani, Marco; Scremin, Alessandro
2014-05-01
The monitoring of the temporal evolution of natural phenomena must be performed in order to ensure their correct description and to allow improvements in modelling and forecast capabilities. This assumption, which is obvious for ground-based measurements, has not always held for data collected from space-based platforms: geostationary satellites and sensors allow very effective monitoring of phenomena with geometric scales from regional to global, but smaller phenomena (with characteristic dimensions below a few kilometres) have been monitored with instruments that could collect data only at intervals of several days; for years, bi-temporal techniques have been the most widely used to characterise temporal changes and to try to identify specific phenomena. As the number of flying sensors has grown and their performance improved, so has their capability to monitor natural phenomena at smaller geographic scales: we can now count on tens of years of remotely sensed data, collected by hundreds of sensors and accessible to a wide user community, and data processing techniques have to be adapted to move toward data-intensive exploitation. Starting in 2008, the European Space Agency initiated the development of the Multi-sensor Evolution Analysis (MEA) platform (https://mea.eo.esa.int), whose first aim was to permit the access and exploitation of long-term remotely sensed satellite data from different platforms: 15 years of global (A)ATSR data together with 5 years of regional AVNIR-2 data were loaded into the system and were used, through a web-based graphical user interface, for land cover change analysis. The data available in MEA have grown over the years to integrate multi-disciplinary data featuring spatial and temporal dimensions: so far, tens of terabytes of data in the land and atmosphere domains are available and can be visualized and exploited, keeping the time dimension as the most relevant one (https://mea.eo.esa.int/data_availability.html). MEA is also used as a Climate Data gateway in the framework of the FP7 EarthServer project. In the present work, the principles of the MEA platform are presented, emphasizing the general concept and the methods implemented for data access (including OGC standard data access) and exploitation. In order to show its effectiveness, use cases focused on multi-field and multi-temporal data analysis are shown.
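Standard OGC data access of the kind the platform exposes can be scripted in a few lines; in the sketch below the endpoint URL and layer name are hypothetical placeholders, not actual MEA addresses, and only the WMS GetMap parameters follow the OGC convention.

# Sketch of an OGC WMS GetMap request (endpoint and layer name are hypothetical).
import requests

WMS_ENDPOINT = "https://example.org/ows"        # hypothetical service address
params = {
    "service": "WMS",
    "request": "GetMap",
    "version": "1.3.0",
    "layers": "land_cover",                     # hypothetical layer name
    "crs": "EPSG:4326",
    "bbox": "35.0,5.0,48.0,20.0",               # minLat,minLon,maxLat,maxLon for EPSG:4326 in WMS 1.3.0
    "width": "800",
    "height": "600",
    "format": "image/png",
    "time": "2010-07-01",                       # time dimension of the dataset
}

response = requests.get(WMS_ENDPOINT, params=params, timeout=30)
response.raise_for_status()
with open("land_cover_20100701.png", "wb") as out:
    out.write(response.content)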
Design and control of multifunctional sorting and training platform based on PLC control
NASA Astrophysics Data System (ADS)
Wan, Hongqiang; Ge, Shuai; Han, Peiying; Li, Fancong; Zhang, Simiao
2018-05-01
Electromechanical integration, as a multi-disciplinary subject, has received much attention from universities and is widely used in the automated production of enterprises. Aiming at the problems of the lack of control in enterprises and the lack of training in colleges and universities, this paper presents the design of a multifunctional sorting and training platform based on PLC control. Firstly, the structure of the platform is determined and three-dimensional modeling is carried out. Then the platform's pneumatic control and electrical control are designed. Finally, the sorting function of the platform is realized through PLC programming and configuration software development. The training platform can be used to design practical training experiments, and has strong relevance and forward-looking value in electromechanical integration teaching. At the same time, the platform makes full use of modular thinking to make the sorting modules more flexible. Compared with the traditional training platform, its teaching effect is more significant.
Creating an Effective Multi-Domain Wide-Area Surveillance Platform to Enhance Border Security
2008-03-01
... weaknesses and opportunities; SWOT analysis was also used to build pros and cons for the platform. All the interviewees liked unmanned platforms ... because of the reduced night-hour operations ...
Cultivating engineering innovation ability based on optoelectronic experimental platform
NASA Astrophysics Data System (ADS)
Li, Dangjuan; Wu, Shenjiang
2017-08-01
As the supporting experimental platform of the Xi'an Technological University education reform experimental class, the "optical technological innovation experimental platform" integrates the design-oriented and comprehensive experiments of multiple optics courses. On the basis of summing up the past two years of teaching experience, the platform's pilot projects were improved. By adopting an open teaching model, the platform has played a good role in cultivating the engineering innovation spirit and scientific thinking of the students.
Zheng, Song; Zhang, Qi; Zheng, Rong; Huang, Bi-Qin; Song, Yi-Lin; Chen, Xin-Chu
2017-01-01
In recent years, the smart home field has gained wide attention for its broad application prospects. However, families using smart home systems must usually adopt various heterogeneous smart devices, including sensors and devices, which makes it more difficult to manage and control their home system. How to design a unified control platform to deal with the collaborative control problem of heterogeneous smart devices is one of the greatest challenges in the current smart home field. The main contribution of this paper is to propose a universal smart home control platform architecture (IAPhome) based on a multi-agent system and communication middleware, which shows significant adaptability and advantages in many aspects, including heterogeneous device connectivity, collaborative control, human-computer interaction and user self-management. The communication middleware is an important foundation for designing and implementing this architecture, which makes it possible to integrate heterogeneous smart devices in a flexible way. A concrete method of applying the multi-agent software technique to solve the integrated control problem of the smart home system is also presented. The proposed platform architecture has been tested in a real smart home environment, and the results indicate the effectiveness of our approach in solving the collaborative control problem of different smart devices. PMID:28926957
Zheng, Song; Zhang, Qi; Zheng, Rong; Huang, Bi-Qin; Song, Yi-Lin; Chen, Xin-Chu
2017-09-16
In recent years, the smart home field has gained wide attention for its broad application prospects. However, families using smart home systems must usually adopt various heterogeneous smart devices, including sensors and devices, which makes it more difficult to manage and control their home system. How to design a unified control platform to deal with the collaborative control problem of heterogeneous smart devices is one of the greatest challenges in the current smart home field. The main contribution of this paper is to propose a universal smart home control platform architecture (IAPhome) based on a multi-agent system and communication middleware, which shows significant adaptability and advantages in many aspects, including heterogeneous device connectivity, collaborative control, human-computer interaction and user self-management. The communication middleware is an important foundation for designing and implementing this architecture, which makes it possible to integrate heterogeneous smart devices in a flexible way. A concrete method of applying the multi-agent software technique to solve the integrated control problem of the smart home system is also presented. The proposed platform architecture has been tested in a real smart home environment, and the results indicate the effectiveness of our approach in solving the collaborative control problem of different smart devices.
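The coordination pattern described above can be sketched as a topic-based message bus with device agents subscribing and publishing; the code below is an in-process stand-in for the communication middleware, not the IAPhome implementation, and the topic names and control rule are invented.

# Hedged sketch of agents coordinating through a pub/sub middleware bus.
from collections import defaultdict

class MessageBus:
    """Minimal in-process stand-in for the communication middleware."""
    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, payload):
        for handler in self._subscribers[topic]:
            handler(topic, payload)

class ThermostatAgent:
    def __init__(self, bus):
        bus.subscribe("home/livingroom/temperature", self.on_temperature)
        self.bus = bus

    def on_temperature(self, topic, value):
        # simple rule: switch the heater according to the reported temperature
        command = "on" if value < 20.0 else "off"
        self.bus.publish("home/livingroom/heater", command)

bus = MessageBus()
ThermostatAgent(bus)
bus.subscribe("home/livingroom/heater", lambda topic, cmd: print(f"heater -> {cmd}"))
bus.publish("home/livingroom/temperature", 18.5)   # a sensor agent reporting a reading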
Crop 3D-a LiDAR based platform for 3D high-throughput crop phenotyping.
Guo, Qinghua; Wu, Fangfang; Pang, Shuxin; Zhao, Xiaoqian; Chen, Linhai; Liu, Jin; Xue, Baolin; Xu, Guangcai; Li, Le; Jing, Haichun; Chu, Chengcai
2018-03-01
With a growing population and shrinking arable land, breeding has been considered an effective way to solve the food crisis. As an important part of breeding, high-throughput phenotyping can accelerate the breeding process effectively. Light detection and ranging (LiDAR) is an active remote sensing technology that is capable of acquiring three-dimensional (3D) data accurately, and has a great potential in crop phenotyping. Given that crop phenotyping based on LiDAR technology is not common in China, we developed a high-throughput crop phenotyping platform, named Crop 3D, which integrated a LiDAR sensor, a high-resolution camera, a thermal camera and a hyperspectral imager. Compared with traditional crop phenotyping techniques, Crop 3D can acquire multi-source phenotypic data over the whole crop growing period and extract plant height, plant width, leaf length, leaf width, leaf area, leaf inclination angle and other parameters for plant biology and genomics analysis. In this paper, we described the designs, functions and testing results of the Crop 3D platform, and briefly discussed the potential applications and future development of the platform in phenotyping. We concluded that platforms integrating LiDAR and traditional remote sensing techniques might be the future trend of crop high-throughput phenotyping.
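One of the simplest trait extractions such a platform performs, plant height from a LiDAR point cloud, can be sketched as the difference between a robust canopy-top percentile and a ground-level percentile; the synthetic point cloud and percentile choices below are assumptions, not the Crop 3D algorithm.

# Illustrative sketch (not the Crop 3D software): plant height from a point cloud.
import numpy as np

rng = np.random.default_rng(2)
# toy point cloud: x, y [m] and z [m]; ground near 0 m, canopy near 0.9 m
ground = np.column_stack([rng.uniform(0, 2, 4000), rng.uniform(0, 2, 4000),
                          rng.normal(0.0, 0.02, 4000)])
canopy = np.column_stack([rng.uniform(0, 2, 6000), rng.uniform(0, 2, 6000),
                          rng.normal(0.9, 0.05, 6000)])
points = np.vstack([ground, canopy])

z = points[:, 2]
ground_level = np.percentile(z, 1)          # robust ground estimate
canopy_top = np.percentile(z, 99)           # robust canopy-top estimate
plant_height = canopy_top - ground_level
print(f"estimated plant height: {plant_height:.2f} m")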
Development of a Crosslink Channel Simulator for Simulation of Formation Flying Satellite Systems
NASA Technical Reports Server (NTRS)
Hart, Roger; Hunt, Chris; Burns, Rich D.
2003-01-01
Multi-vehicle missions are an integral part of NASA's and other space agencies' current and future business. These multi-vehicle missions generally involve collectively utilizing the array of instrumentation dispersed throughout the system of space vehicles, and communicating via crosslinks to achieve mission goals such as formation flying, autonomous operation, and collective data gathering. NASA's Goddard Space Flight Center (GSFC) is developing the Formation Flying Test Bed (FFTB) to provide hardware-in-the-loop simulation of these crosslink-based systems. The goal of the FFTB is to reduce mission risk, assist in mission planning and analysis, and provide a technology development platform that allows algorithms to be developed for mission functions such as precision formation flying, synchronization, and inter-vehicle data synthesis. The FFTB will provide a medium in which the various crosslink transponders being used in multi-vehicle missions can be plugged in for development and test. An integral part of the FFTB is the Crosslink Channel Simulator (CCS), which is placed into the communications channel between the crosslinks under test, and is used to simulate on-orbit effects on the communications channel due to relative vehicle motion or antenna misalignment. The CCS is based on the Starlight software-programmable platform developed at General Dynamics Decision Systems, which provides the CCS with the ability to be modified on the fly to adapt to new crosslink formats or mission parameters.
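The on-orbit channel effects such a simulator applies can be estimated from relative geometry alone: propagation delay, Doppler shift and free-space path loss as functions of range and range rate. The sketch below is a back-of-the-envelope illustration, not the CCS/Starlight implementation; the carrier frequency, range and range rate are placeholders.

# Back-of-the-envelope channel effects from relative range and range rate.
import math

c = 299_792_458.0                  # speed of light [m/s]
f_carrier = 2.2e9                  # S-band carrier [Hz], illustrative
range_m = 45_000.0                 # inter-vehicle range [m], illustrative
range_rate = -12.0                 # range rate [m/s], negative = closing, illustrative

delay_s = range_m / c
doppler_hz = -(range_rate / c) * f_carrier
fspl_db = 20 * math.log10(range_m) + 20 * math.log10(f_carrier) + 20 * math.log10(4 * math.pi / c)

print(f"propagation delay: {delay_s * 1e6:.1f} us")
print(f"Doppler shift:     {doppler_hz:+.1f} Hz")
print(f"free-space loss:   {fspl_db:.1f} dB")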
Study on the E-commerce platform based on the agent
NASA Astrophysics Data System (ADS)
Fu, Ruixue; Qin, Lishuan; Gao, Yinmin
2011-10-01
To solve the problem of dynamic integration in e-commerce, a multi-agent architecture for an electronic commerce platform system based on agents and ontology is introduced, which includes three major types of agents, an ontology and a rule collection. In this architecture, service agents and rules are used to realize business process reengineering, the reuse of software components, and the agility of the electronic commerce platform. To illustrate the architecture, a simulation has been performed, and the results imply that the architecture provides a very efficient way to design and implement a flexible, distributed, open and intelligent electronic commerce platform system that solves the problem of dynamic integration in e-commerce. The objective of this paper is to illustrate the architecture of the electronic commerce platform system and how agents and ontology support it.
Multi-Mission System Analysis for Planetary Entry (M-SAPE) Version 1
NASA Technical Reports Server (NTRS)
Samareh, Jamshid; Glaab, Louis; Winski, Richard G.; Maddock, Robert W.; Emmett, Anjie L.; Munk, Michelle M.; Agrawal, Parul; Sepka, Steve; Aliaga, Jose; Zarchi, Kerry;
2014-01-01
This report describes an integrated system for Multi-mission System Analysis for Planetary Entry (M-SAPE). The system in its current form is capable of performing system analysis and design for an Earth entry vehicle suitable for sample return missions. The system includes geometry, mass sizing, impact analysis, structural analysis, flight mechanics, TPS, and a web portal for user access. The report includes details of the M-SAPE modules and provides sample results. The current M-SAPE vehicle design concept is based on the Mars sample return (MSR) Earth entry vehicle design, which is driven by minimizing the risk associated with sample containment (no parachute and passive aerodynamic stability). Because M-SAPE exploits a common design concept, any sample return mission, particularly MSR, can benefit from significant risk and development cost reductions. The design provides a platform by which technologies and design elements can be evaluated rapidly prior to any costly investment commitment.
Multi-hazard risk analysis using the FP7 RASOR Platform
NASA Astrophysics Data System (ADS)
Koudogbo, Fifamè N.; Duro, Javier; Rossi, Lauro; Rudari, Roberto; Eddy, Andrew
2014-10-01
Climate change challenges our understanding of risk by modifying hazards and their interactions. Sudden increases in population and rapid urbanization are changing exposure to risk around the globe, making impacts harder to predict. Despite the availability of operational mapping products, there is no single tool to integrate diverse data and products across hazards, update exposure data quickly and make scenario-based predictions to support both short and long-term risk-related decisions. RASOR (Rapid Analysis and Spatialization Of Risk) will develop a platform to perform multi-hazard risk analysis for the full cycle of disaster management, including targeted support to critical infrastructure monitoring and climate change impact assessment. A scenario-driven query system simulates future scenarios based on existing or assumed conditions and compares them with historical scenarios. RASOR will thus offer a single work environment that generates new risk information across hazards, across data types (satellite EO, in-situ), across user communities (global, local, climate, civil protection, insurance, etc.) and across the world. Five case study areas are considered within the project, located in Haiti, Indonesia, Netherlands, Italy and Greece. Initially available over those demonstration areas, RASOR will ultimately offer global services to support in-depth risk assessment and full-cycle risk management.
INTEGRATING MESO-AND MICRO-SIMULATION MODELS TO EVALUATE TRAFFIC MANAGEMENT STRATEGIES, YEAR 2
DOT National Transportation Integrated Search
2017-07-04
In the Year 1 Report, the Arizona State University (ASU) Project Team described the development of a hierarchical multi-resolution simulation platform to test proactive traffic management strategies. The scope was to integrate an easily available mic...
Multi-threaded integration of HTC-Vive and MeVisLab
NASA Astrophysics Data System (ADS)
Gunacker, Simon; Gall, Markus; Schmalstieg, Dieter; Egger, Jan
2018-03-01
This work presents how Virtual Reality (VR) can easily be integrated into medical applications via a plugin for a medical image processing framework called MeVisLab. A multi-threaded plugin has been developed using OpenVR, a VR library that can be used for developing vendor and platform independent VR applications. The plugin is tested using the HTC Vive, a head-mounted display developed by HTC and Valve Corporation.
Multi-service terminal adapter based on IP technology applications in rural area
NASA Astrophysics Data System (ADS)
Gao, Li; Li, Xiaobo; Yan, Juntao; Ren, Xupeng
Taking advantage of the ample existing modern telecom network resources in rural areas can gradually bring about their information society. This includes the establishment of an integrated rural information service platform, a modern remote education center and an electronic administration management platform for rural areas. The geographical and economic constraints must be overcome to build the rural service support system, in order to provide technical support, information products and information services to the modern rural information service system. It is important to develop an access platform based on IP technology that supports multi-service access, in order to implement adapter access for a variety of types of mobile terminal equipment and to reduce the restrictions on mobile terminal equipment.
NASA Astrophysics Data System (ADS)
Wingo, S. M.; Petersen, W. A.; Gatlin, P. N.; Marks, D. A.; Wolff, D. B.; Pabla, C. S.
2017-12-01
The versatile SIMBA (System for Integrating Multi-platform data to Build the Atmospheric column) precipitation data-fusion framework produces an atmospheric column data product with multi-platform observations set into a common 3-D grid, affording an efficient starting point for multi-sensor comparisons and analysis that can be applied to any region. Supported data sources include: ground-based scanning and profiling radars (S-, X-, Ku-, K-, and Ka-band), multiple types of disdrometers and rain gauges, the GPM Core Observatory's Microwave Imager (GMI, 10-183 GHz) and Dual-frequency Precipitation Radar (DPR, Ka/Ku-band), as well as thermodynamic soundings and the Multi-Radar/Multi-Sensor QPE product. SIMBA column data files provide a unique way to evaluate the complete vertical profile of precipitation. Two post-launch (GPM Core in orbit) field campaigns focused on different facets of the GPM mission: the Olympic Mountains Experiment (OLYMPEX) was geared toward winter season (November-February) precipitation in Pacific frontal systems and their transition from the coastal to mountainous terrain of northwest Washington, while the Integrated Precipitation and Hydrology Experiment (IPHEx) sampled warm season (April-June) precipitation and supported hydrologic applications in the southern Appalachians and eastern North Carolina. Both campaigns included multiple orographic precipitation enhancement episodes. SIMBA column products generated for select OLYMPEX and IPHEx events will be used to evaluate spatial variability and vertical profiles of precipitation and drop size distribution parameters derived and/or observed by space- and ground-based sensors. Results will provide a cursory view of how well the space-based measurements represent what is observed from the ground below and an indication to how the terrain in both regions impacts the characteristics of precipitation within the column and reaching the ground.
NASA Astrophysics Data System (ADS)
Wingo, S. M.; Petersen, W. A.; Gatlin, P. N.; Marks, D. A.; Wolff, D. B.; Pabla, C. S.
2016-12-01
The versatile SIMBA (System for Integrating Multi-platform data to Build the Atmospheric column) precipitation data-fusion framework produces an atmospheric column data product with multi-platform observations set into a common 3-D grid, affording an efficient starting point for multi-sensor comparisons and analysis that can be applied to any region. Supported data sources include: ground-based scanning and profiling radars (S-, X-, Ku-, K-, and Ka-band), multiple types of disdrometers and rain gauges, the GPM Core Observatory's Microwave Imager (GMI, 10-183 GHz) and Dual-frequency Precipitation Radar (DPR, Ka/Ku-band), as well as thermodynamic soundings and the Multi-Radar/Multi-Sensor QPE product. SIMBA column data files provide a unique way to evaluate the complete vertical profile of precipitation. Two post-launch (GPM Core in orbit) field campaigns focused on different facets of the GPM mission: the Olympic Mountains Experiment (OLYMPEX) was geared toward winter season (November-February) precipitation in Pacific frontal systems and their transition from the coastal to mountainous terrain of northwest Washington, while the Integrated Precipitation and Hydrology Experiment (IPHEx) sampled warm season (April-June) precipitation and supported hydrologic applications in the southern Appalachians and eastern North Carolina. Both campaigns included multiple orographic precipitation enhancement episodes. SIMBA column products generated for select OLYMPEX and IPHEx events will be used to evaluate spatial variability and vertical profiles of precipitation and drop size distribution parameters derived and/or observed by space- and ground-based sensors. Results will provide a cursory view of how well the space-based measurements represent what is observed from the ground below and an indication to how the terrain in both regions impacts the characteristics of precipitation within the column and reaching the ground.
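The column-building step can be sketched as binning multi-platform observations into one common 3-D grid and averaging within each cell; the code below is an illustration of that idea, not the SIMBA implementation, and the synthetic observations and grid spacing are assumptions.

# Illustrative sketch (not SIMBA): average multi-platform observations onto a common 3-D grid.
import numpy as np

rng = np.random.default_rng(3)
# toy observations: x, y [km], z [km], reflectivity [dBZ] from two "platforms"
obs = np.vstack([
    np.column_stack([rng.uniform(0, 10, 500), rng.uniform(0, 10, 500),
                     rng.uniform(0, 8, 500), rng.normal(30, 5, 500)]),
    np.column_stack([rng.uniform(0, 10, 300), rng.uniform(0, 10, 300),
                     rng.uniform(0, 8, 300), rng.normal(28, 5, 300)]),
])

edges = [np.arange(0, 10.5, 0.5), np.arange(0, 10.5, 0.5), np.arange(0, 8.5, 0.5)]
sums, _ = np.histogramdd(obs[:, :3], bins=edges, weights=obs[:, 3])
counts, _ = np.histogramdd(obs[:, :3], bins=edges)
column = np.divide(sums, counts, out=np.full(sums.shape, np.nan), where=counts > 0)
print("grid shape:", column.shape, " filled bins:", np.count_nonzero(~np.isnan(column)))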
Jayapandian, Catherine P; Chen, Chien-Hung; Bozorgi, Alireza; Lhatoo, Samden D; Zhang, Guo-Qiang; Sahoo, Satya S
2013-01-01
Epilepsy is the most common serious neurological disorder affecting 50-60 million persons worldwide. Electrophysiological data recordings, such as electroencephalogram (EEG), are the gold standard for diagnosis and pre-surgical evaluation in epilepsy patients. The increasing trend towards multi-center clinical studies require signal visualization and analysis tools to support real time interaction with signal data in a collaborative environment, which cannot be supported by traditional desktop-based standalone applications. As part of the Prevention and Risk Identification of SUDEP Mortality (PRISM) project, we have developed a Web-based electrophysiology data visualization and analysis platform called Cloudwave using highly scalable open source cloud computing infrastructure. Cloudwave is integrated with the PRISM patient cohort identification tool called MEDCIS (Multi-modality Epilepsy Data Capture and Integration System). The Epilepsy and Seizure Ontology (EpSO) underpins both Cloudwave and MEDCIS to support query composition and result retrieval. Cloudwave is being used by clinicians and research staff at the University Hospital - Case Medical Center (UH-CMC) Epilepsy Monitoring Unit (EMU) and will be progressively deployed at four EMUs in the United States and the United Kingdomas part of the PRISM project.
DEEP SPACE: High Resolution VR Platform for Multi-user Interactive Narratives
NASA Astrophysics Data System (ADS)
Kuka, Daniela; Elias, Oliver; Martins, Ronald; Lindinger, Christopher; Pramböck, Andreas; Jalsovec, Andreas; Maresch, Pascal; Hörtner, Horst; Brandl, Peter
DEEP SPACE is a large-scale platform for interactive, stereoscopic and high resolution content. The spatial and the system design of DEEP SPACE are facing constraints of CAVETM-like systems in respect to multi-user interactive storytelling. To be used as research platform and as public exhibition space for many people, DEEP SPACE is capable to process interactive, stereoscopic applications on two projection walls with a size of 16 by 9 meters and a resolution of four times 1080p (4K) each. The processed applications are ranging from Virtual Reality (VR)-environments to 3D-movies to computationally intensive 2D-productions. In this paper, we are describing DEEP SPACE as an experimental VR platform for multi-user interactive storytelling. We are focusing on the system design relevant for the platform, including the integration of the Apple iPod Touch technology as VR control, and a special case study that is demonstrating the research efforts in the field of multi-user interactive storytelling. The described case study, entitled "Papyrate's Island", provides a prototypical scenario of how physical drawings may impact on digital narratives. In this special case, DEEP SPACE helps us to explore the hypothesis that drawing, a primordial human creative skill, gives us access to entirely new creative possibilities in the domain of interactive storytelling.
Shao, Chenzhong; Tanaka, Shuji; Nakayama, Takahiro; Hata, Yoshiyuki; Bartley, Travis; Muroyama, Masanori
2017-01-01
Robot tactile sensation can enhance human–robot communication in terms of safety, reliability and accuracy. The final goal of our project is to widely cover a robot body with a large number of tactile sensors, which has significant advantages such as accurate object recognition, high sensitivity and high redundancy. In this study, we developed a multi-sensor system with dedicated Complementary Metal-Oxide-Semiconductor (CMOS) Large-Scale Integration (LSI) circuit chips (referred to as “sensor platform LSI”) as a framework of a serial bus-based tactile sensor network system. The sensor platform LSI supports three types of sensors: an on-chip temperature sensor, off-chip capacitive and resistive tactile sensors, and communicates with a relay node via a bus line. The multi-sensor system was first constructed on a printed circuit board to evaluate basic functions of the sensor platform LSI, such as capacitance-to-digital and resistance-to-digital conversion. Then, two kinds of external sensors, nine sensors in total, were connected to two sensor platform LSIs, and temperature, capacitive and resistive sensing data were acquired simultaneously. Moreover, we fabricated flexible printed circuit cables to demonstrate the multi-sensor system with 15 sensor platform LSIs operating simultaneously, which showed a more realistic implementation in robots. In conclusion, the multi-sensor system with up to 15 sensor platform LSIs on a bus line supporting temperature, capacitive and resistive sensing was successfully demonstrated. PMID:29061954
Shao, Chenzhong; Tanaka, Shuji; Nakayama, Takahiro; Hata, Yoshiyuki; Bartley, Travis; Nonomura, Yutaka; Muroyama, Masanori
2017-08-28
Robot tactile sensation can enhance human-robot communication in terms of safety, reliability and accuracy. The final goal of our project is to widely cover a robot body with a large number of tactile sensors, which has significant advantages such as accurate object recognition, high sensitivity and high redundancy. In this study, we developed a multi-sensor system with dedicated Complementary Metal-Oxide-Semiconductor (CMOS) Large-Scale Integration (LSI) circuit chips (referred to as "sensor platform LSI") as a framework of a serial bus-based tactile sensor network system. The sensor platform LSI supports three types of sensors: an on-chip temperature sensor, off-chip capacitive and resistive tactile sensors, and communicates with a relay node via a bus line. The multi-sensor system was first constructed on a printed circuit board to evaluate basic functions of the sensor platform LSI, such as capacitance-to-digital and resistance-to-digital conversion. Then, two kinds of external sensors, nine sensors in total, were connected to two sensor platform LSIs, and temperature, capacitive and resistive sensing data were acquired simultaneously. Moreover, we fabricated flexible printed circuit cables to demonstrate the multi-sensor system with 15 sensor platform LSIs operating simultaneously, which showed a more realistic implementation in robots. In conclusion, the multi-sensor system with up to 15 sensor platform LSIs on a bus line supporting temperature, capacitive and resistive sensing was successfully demonstrated.
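The bus-based read-out concept can be sketched as a relay node polling addressed sensor nodes and converting raw counts to physical units; the sketch below does not reproduce the real frame format or calibration of the sensor platform LSI, and the node class, channel names and scale factors are hypothetical.

# Hedged sketch of a relay node polling 15 addressed sensor nodes on one bus line.
class SensorPlatformNode:
    """Stand-in for one sensor platform LSI with three sensor channels."""
    def __init__(self, address):
        self.address = address

    def read(self, channel):
        # hypothetical raw counts, varied per node only for illustration
        raw = {"temperature": 2960, "capacitive": 512, "resistive": 1880}[channel]
        return raw + self.address

def to_physical(channel, raw):
    scale = {"temperature": 0.01, "capacitive": 0.002, "resistive": 0.5}   # hypothetical scale factors
    return raw * scale[channel]

bus = [SensorPlatformNode(addr) for addr in range(1, 16)]   # 15 nodes on one bus line
for node in bus:
    readings = {ch: to_physical(ch, node.read(ch))
                for ch in ("temperature", "capacitive", "resistive")}
    print(f"node {node.address:2d}: {readings}")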
GLO-Roots: an imaging platform enabling multidimensional characterization of soil-grown root systems
Rellán-Álvarez, Rubén; Lobet, Guillaume; Lindner, Heike; Pradier, Pierre-Luc; Sebastian, Jose; Yee, Muh-Ching; Geng, Yu; Trontin, Charlotte; LaRue, Therese; Schrager-Lavelle, Amanda; Haney, Cara H; Nieu, Rita; Maloof, Julin; Vogel, John P; Dinneny, José R
2015-01-01
Root systems develop different root types that individually sense cues from their local environment and integrate this information with systemic signals. This complex multi-dimensional amalgam of inputs enables continuous adjustment of root growth rates, direction, and metabolic activity that define a dynamic physical network. Current methods for analyzing root biology balance physiological relevance with imaging capability. To bridge this divide, we developed an integrated-imaging system called Growth and Luminescence Observatory for Roots (GLO-Roots) that uses luminescence-based reporters to enable studies of root architecture and gene expression patterns in soil-grown, light-shielded roots. We have developed image analysis algorithms that allow the spatial integration of soil properties, gene expression, and root system architecture traits. We propose GLO-Roots as a system that has great utility in presenting environmental stimuli to roots in ways that evoke natural adaptive responses and in providing tools for studying the multi-dimensional nature of such processes. DOI: http://dx.doi.org/10.7554/eLife.07597.001 PMID:26287479
GLO-Roots: An imaging platform enabling multidimensional characterization of soil-grown root systems
Rellan-Alvarez, Ruben; Lobet, Guillaume; Lindner, Heike; ...
2015-08-19
Root systems develop different root types that individually sense cues from their local environment and integrate this information with systemic signals. This complex multi-dimensional amalgam of inputs enables continuous adjustment of root growth rates, direction, and metabolic activity that define a dynamic physical network. Current methods for analyzing root biology balance physiological relevance with imaging capability. To bridge this divide, we developed an integrated-imaging system called Growth and Luminescence Observatory for Roots (GLO-Roots) that uses luminescence-based reporters to enable studies of root architecture and gene expression patterns in soil-grown, light-shielded roots. We have developed image analysis algorithms that allow the spatial integration of soil properties, gene expression, and root system architecture traits. We propose GLO-Roots as a system that has great utility in presenting environmental stimuli to roots in ways that evoke natural adaptive responses and in providing tools for studying the multi-dimensional nature of such processes.
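A minimal version of the image-analysis step, thresholding a luminescence image to segment root pixels and deriving simple traits, is sketched below; it is not the GLO-Roots pipeline, and the synthetic image, threshold rule and pixel calibration are assumptions.

# Illustrative sketch (not GLO-Roots): segment root pixels and derive simple traits.
import numpy as np

rng = np.random.default_rng(4)
image = rng.normal(100, 10, size=(400, 300))          # toy background counts
image[50:350, 148:152] += 400                          # toy "root" luminescence signal

threshold = image.mean() + 3 * image.std()             # simple global threshold
root_mask = image > threshold

pixel_size_mm = 0.2                                    # hypothetical calibration
root_area_mm2 = root_mask.sum() * pixel_size_mm ** 2
rows_with_root = np.where(root_mask.any(axis=1))[0]
root_depth_mm = (rows_with_root.max() - rows_with_root.min()) * pixel_size_mm

print(f"root area: {root_area_mm2:.1f} mm^2, rooting depth: {root_depth_mm:.1f} mm")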
Single-mode glass waveguide technology for optical interchip communication on board level
NASA Astrophysics Data System (ADS)
Brusberg, Lars; Neitz, Marcel; Schröder, Henning
2012-01-01
The large bandwidth demand in long-distance telecom networks leads to single-mode fiber interconnects as a result of low dispersion, low loss and dense wavelength multiplexing possibilities. In contrast, multi-mode interconnects are suitable for much shorter lengths, up to 300 meters, and are promising for optical links between racks and on board level. Active optical cables based on multi-mode fiber links are on the market, and research in multi-mode waveguide integration on board level is still going on. Compared with multi-mode, a single-mode waveguide has much more integration potential because its core diameter is around 20% of that of a multi-mode waveguide while offering a much larger bandwidth. However, light coupling into single-mode waveguides is much more challenging because of tighter coupling tolerances. Together with silicon photonics technology, a single-mode waveguide technology at board level will be the straightforward development goal for chip-to-chip optical interconnect integration. Such a hybrid packaging platform providing 3D optical single-mode links bridges the gap between novel photonic integrated circuits and the glass-fiber-based long-distance telecom networks. In the following we introduce our 3D photonic packaging approach based on thin glass substrates with planar integrated optical single-mode waveguides for fiber-to-chip and chip-to-chip interconnects. This novel packaging approach merges micro-system packaging and glass integrated optics. It consists of a thin glass substrate with planar integrated single-mode waveguide circuits, optical mirrors and lenses, providing an integration platform for photonic IC assembly and optical fiber interconnects. Thin glass is commercially available in panel and wafer formats and exhibits excellent optical and high-frequency properties, which makes it well suited for microsystem packaging. The paper presents recent results in single-mode waveguide technology on wafer level and waveguide characterization. Furthermore, integration in a hybrid packaging process and design issues are discussed.
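Why single-mode coupling tolerances are tight can be seen from the standard Gaussian mode-overlap approximation, eta ≈ exp(-(d/w)^2) for two identical mode-field radii w and lateral offset d; the sketch below evaluates the resulting loss for a representative single-mode mode-field radius and, as a rough analogue, a much larger core. The numbers are textbook-style estimates, not results from this paper.

# Coupling loss versus lateral offset from the Gaussian mode-overlap approximation.
import math

def coupling_loss_db(offset_um, mode_field_radius_um):
    eta = math.exp(-(offset_um / mode_field_radius_um) ** 2)   # identical mode-field radii assumed
    return -10 * math.log10(eta)

for d in (0.5, 1.0, 2.0):                     # lateral misalignment [um]
    loss_sm = coupling_loss_db(d, 5.2)        # ~10.4 um mode-field diameter, single-mode at 1550 nm
    loss_mm = coupling_loss_db(d, 25.0)       # ~50 um core as a rough multi-mode analogue
    print(f"offset {d:.1f} um: single-mode {loss_sm:.2f} dB, large-core {loss_mm:.3f} dB")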
Integration of Advanced Probabilistic Analysis Techniques with Multi-Physics Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Cetiner, Mustafa Sacit; none,; Flanagan, George F.
2014-07-30
An integrated simulation platform that couples probabilistic analysis-based tools with model-based simulation tools can provide valuable insights for reactive and proactive responses to plant operating conditions. The objective of this work is to demonstrate the benefits of a partial implementation of the Small Modular Reactor (SMR) Probabilistic Risk Assessment (PRA) Detailed Framework Specification through the coupling of advanced PRA capabilities and accurate multi-physics plant models. Coupling a probabilistic model with a multi-physics model will aid in design, operations, and safety by providing a more accurate understanding of plant behavior. This represents the first attempt at actually integrating these two types of analyses for a control system used for operations, on a faster than real-time basis. This report documents the development of the basic communication capability to exchange data with the probabilistic model using Reliability Workbench (RWB) and the multi-physics model using Dymola. The communication pathways from injecting a fault (i.e., failing a component) to the probabilistic and multi-physics models were successfully completed. This first version was tested with prototypic models represented in both RWB and Modelica. First, a simple event tree/fault tree (ET/FT) model was created to develop the software code to implement the communication capabilities between the dynamic-link library (dll) and RWB. A program, written in C#, successfully communicates faults to the probabilistic model through the dll. A systems model of the Advanced Liquid-Metal Reactor–Power Reactor Inherently Safe Module (ALMR-PRISM) design developed under another DOE project was upgraded using Dymola to include proper interfaces to allow data exchange with the control application (ConApp). A program, written in C+, successfully communicates faults to the multi-physics model. The results of the example simulation were successfully plotted.
NASA Astrophysics Data System (ADS)
Wang, Meihua; Li, Rongshuai; Zhang, Wenze
2017-11-01
Multi-function construction platforms (MCPs) are an "old construction technology, new application" form of building facade construction equipment, with significant effects in reducing labour intensity, improving labour productivity, ensuring construction safety, and shortening construction duration. In this study, a functional analysis of MCPs is carried out for the construction of prefabricated (assembly) buildings. Based on the general-purpose finite element software ANSYS, static calculations and dynamic characteristic analyses of the MCP structure are performed: a simplified finite element model is constructed, and element selection and the treatment and solution of boundary conditions are discussed. The maximum deformation, the maximum stress, and the structural dynamic characteristics are obtained, and the critical parts of the platform structure are identified. Multiple types of MCPs under engineering construction conditions are analysed in order to put forward recommendations for their engineering application.
Burger, R; Kurzbuch, D; Gorkin, R; Kijanka, G; Glynn, M; McDonagh, C; Ducrée, J
2015-01-21
In this work we present a centrifugal microfluidic system enabling highly efficient collective trapping and alignment of particles such as microbeads and cells, their multi-colour fluorescent detection and subsequent manipulation by optical tweezers. We demonstrate array-based capture and imaging followed by "cherry-picking" of individual particles, first for fluorescently labelled polystyrene (PS) beads and then for cells. Different cell lines are discriminated based on intracellular as well as surface-based markers.
Hu, Hai; Brzeski, Henry; Hutchins, Joe; Ramaraj, Mohan; Qu, Long; Xiong, Richard; Kalathil, Surendran; Kato, Rand; Tenkillaya, Santhosh; Carney, Jerry; Redd, Rosann; Arkalgudvenkata, Sheshkumar; Shahzad, Kashif; Scott, Richard; Cheng, Hui; Meadow, Stephen; McMichael, John; Sheu, Shwu-Lin; Rosendale, David; Kvecher, Leonid; Ahern, Stephen; Yang, Song; Zhang, Yonghong; Jordan, Rick; Somiari, Stella B; Hooke, Jeffrey; Shriver, Craig D; Somiari, Richard I; Liebman, Michael N
2004-10-01
The Windber Research Institute is an integrated high-throughput research center employing clinical, genomic and proteomic platforms to produce terabyte levels of data. We use biomedical informatics technologies to integrate all of these operations. This report includes information on a multi-year, multi-phase hybrid data warehouse project currently under development in the Institute. The purpose of the warehouse is to host the terabyte-level of internal experimentally generated data as well as data from public sources. We have previously reported on the phase I development, which integrated limited internal data sources and selected public databases. Currently, we are completing phase II development, which integrates our internal automated data sources and develops visualization tools to query across these data types. This paper summarizes our clinical and experimental operations, the data warehouse development, and the challenges we have faced. In phase III we plan to federate additional manual internal and public data sources and then to develop and adapt more data analysis and mining tools. We expect that the final implementation of the data warehouse will greatly facilitate biomedical informatics research.
Radiology and Enterprise Medical Imaging Extensions (REMIX).
Erdal, Barbaros S; Prevedello, Luciano M; Qian, Songyue; Demirer, Mutlu; Little, Kevin; Ryu, John; O'Donnell, Thomas; White, Richard D
2018-02-01
Radiology and Enterprise Medical Imaging Extensions (REMIX) is a platform originally designed to support both the clinical and the clinical-research medical-imaging operational needs of the Department of Radiology of The Ohio State University Wexner Medical Center. REMIX accommodates the storage and handling of "big imaging data," as needed for large multi-disciplinary cancer-focused programs. The evolving REMIX platform contains an array of integrated tools/software packages for the following: (1) server and storage management; (2) image reconstruction; (3) digital pathology; (4) de-identification; (5) business intelligence; (6) texture analysis; and (7) artificial intelligence. These capabilities, along with documentation and guidance explaining how to interact with commercial systems (e.g., PACS, EHR, commercial databases) that currently exist in clinical environments, are to be made freely available.
Development of fast wireless detection system for fixed offshore platform
NASA Astrophysics Data System (ADS)
Li, Zhigang; Yu, Yan; Jiao, Dong; Wang, Jie; Li, Zhirui; Ou, Jinping
2011-04-01
Offshore platform safety has been a concern since the 1950s and 1960s, and in the early 1980s important specifications and standards were established, providing the technical basis for fixed platform design, construction, installation, and evaluation. With more and more platforms in service beyond their design life, research on evaluation and detection technology for offshore platforms has become a hotspot, especially underwater detection and assessment methods based on finite element calculation. For fixed platform structure detection, conventional NDT methods such as eddy current, magnetic particle, penetrant, X-ray, and ultrasonic testing are generally used. These techniques are mature and intuitive, but underwater detection requires underwater robots, supporting auxiliary equipment, and trained professional teams, so the resources and cost involved are considerable and installation of the test equipment is time-consuming. This project presents a new fast wireless detection and damage diagnosis system for fixed offshore platforms using wireless sensor networks: wireless sensor nodes can be deployed quickly on the platform, detect its global structural state via wireless communication, and then support diagnosis. The system is simple to operate and suitable for rapid assessment of offshore platform integrity. The designed system consists of an intelligent acquisition unit and 8 wireless collection nodes; the whole system has 64 collection channels, i.e., every wireless collection node has eight 16-bit A/D channels. Each wireless collection node integrates a vibration sensing unit, an embedded low-power micro-processing unit, a wireless transceiver unit, a large-capacity power unit, and a GPS time-synchronization unit, and performs vibration data collection, initial analysis, data storage, and wireless data transmission. The intelligent acquisition unit integrates a high-performance computation unit, a wireless transceiver unit, a mobile power unit, and embedded data analysis software, and controls the wireless collection nodes, receives and analyzes data, and performs parameter identification. Data are transmitted over a 2.4 GHz wireless channel; each sensing data channel responsible for data transmission occupies a stable frequency band, while a control channel responsible for power-parameter control occupies a public frequency band. Initial tests of the designed system show that it has good application prospects and practical value, offering fast deployment, high sampling rate, high resolution, and low-frequency detection capability.
Bosse, Stefan
2015-01-01
Multi-agent systems (MAS) can be used for decentralized and self-organizing data processing in a distributed system, like a resource-constrained sensor network, enabling distributed information extraction, for example, based on pattern recognition and self-organization, by decomposing complex tasks into simpler cooperative agents. Reliable MAS-based data processing approaches can aid the material integration of structural-monitoring applications, with agent processing platforms scaled to the microchip level. The agent behavior, based on a dynamic activity-transition graph (ATG) model, is implemented with program code storing the control and the data state of an agent, which is novel. The program code can be modified by the agent itself using code morphing techniques and is capable of migrating in the network between nodes. The program code is a self-contained unit (a container) and embeds the agent data, the initialization instructions and the ATG behavior implementation. The microchip agent processing platform used for the execution of the agent code is a standalone multi-core stack machine with a zero-operand instruction format, leading to small agent program code, low system complexity and high system performance. The agent processing is token-queue-based, similar to Petri nets. The agent platform can also be implemented in software, offering compatibility at the operational and code level and supporting agent processing in strongly heterogeneous networks. In this work, the agent platform embedded in a large-scale distributed sensor network is simulated at the architectural level by using agent-based simulation techniques. PMID:25690550
Scalable Algorithms for Clustering Large Geospatiotemporal Data Sets on Manycore Architectures
NASA Astrophysics Data System (ADS)
Mills, R. T.; Hoffman, F. M.; Kumar, J.; Sreepathi, S.; Sripathi, V.
2016-12-01
The increasing availability of high-resolution geospatiotemporal data sets from sources such as observatory networks, remote sensing platforms, and computational Earth system models has opened new possibilities for knowledge discovery using data sets fused from disparate sources. Traditional algorithms and computing platforms are impractical for the analysis and synthesis of data sets of this size; however, new algorithmic approaches that can effectively utilize the complex memory hierarchies and the extremely high levels of available parallelism in state-of-the-art high-performance computing platforms can enable such analysis. We describe a massively parallel implementation of accelerated k-means clustering and some optimizations to boost computational intensity and utilization of wide SIMD lanes on state-of-the-art multi- and manycore processors, including the second-generation Intel Xeon Phi ("Knights Landing") processor based on the Intel Many Integrated Core (MIC) architecture, which offers several new features including an on-package high-bandwidth memory. We also analyze the code in the context of a few practical applications to the analysis of climatic and remotely-sensed vegetation phenology data sets, and speculate on some of the new applications that such scalable analysis methods may enable.
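As a point of reference for the algorithm underlying this work, the following is a minimal single-node k-means sketch in Python/NumPy; it is not the authors' accelerated implementation, which layers distance-calculation acceleration and wide-SIMD/multi-node parallelism on top of this basic loop.

```python
import numpy as np

def kmeans(points, k, iterations=50, seed=0):
    """Minimal Lloyd's k-means: assign each point to its nearest centroid,
    then recompute centroids, for a fixed number of iterations."""
    rng = np.random.default_rng(seed)
    centroids = points[rng.choice(len(points), size=k, replace=False)].astype(float)
    for _ in range(iterations):
        # squared distances, shape (n_points, k); memory grows with n*k
        d2 = ((points[:, None, :] - centroids[None, :, :]) ** 2).sum(axis=2)
        labels = d2.argmin(axis=1)
        for j in range(k):
            members = points[labels == j]
            if len(members):
                centroids[j] = members.mean(axis=0)
    return labels, centroids
```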
NASA Astrophysics Data System (ADS)
Tsiokos, Dimitris M.; Dabos, George; Ketzaki, Dimitra; Weeber, Jean-Claude; Markey, Laurent; Dereux, Alain; Giesecke, Anna Lena; Porschatis, Caroline; Chmielak, Bartos; Wahlbrink, Thorsten; Rochracher, Karl; Pleros, Nikos
2017-05-01
Silicon photonics meets most fabrication requirements of standard CMOS process lines, encompassing the photonics-electronics consolidation vision. Despite this remarkable progress, further miniaturization of PICs for common integration with electronics and for increasing PIC functional density is bounded by the inherent diffraction limit of light imposed by optical waveguides. Instead, Surface Plasmon Polariton (SPP) waveguides can guide light at sub-wavelength scales at a metal surface, providing unique light-matter interaction properties while exploiting their metallic nature to naturally integrate with electronics in high-performance ASPICs. In this article, we present the main goals of the recently introduced H2020 project PlasmoFab, which addresses the ever-increasing need for low-energy, small-size and high-performance mass-manufactured PICs by developing a revolutionary yet CMOS-compatible fabrication platform for the seamless co-integration of plasmonics with photonics and supporting electronics. We report recent advances on the hosting SiN photonic platform, including low-loss passive SiN waveguide and grating coupler circuits for both the TM and TE polarization states. We also present experimental results for plasmonic gold thin-film and hybrid slot waveguide configurations that can allow for high-sensitivity sensing, and outline ongoing activities towards replacing gold with Cu, Al or TiN in order to yield the same functionality with a CMOS-compatible metallic structure. Finally, the first experimental results on the co-integrated SiN+plasmonic platform are demonstrated, concluding with an initial theoretical performance analysis of the CMOS plasmo-photonic biosensor, which has the potential to allow for sensitivities beyond 150000 nm/RIU.
ERIC Educational Resources Information Center
Klein, Amarolinda Zanela; da Silva Freitas, José Carlos, Jr.; da Silva, Juliana Vitória Vieira Mattiello Mattiello; Barbosa, Jorge Luis Victória; Baldasso, Lucas
2018-01-01
The popularity of Mobile Instant Messaging (MIM) has prompted educators to integrate it in teaching and learning in higher education. WhatsApp® is a multi-platform instant messaging application widely used worldwide, however, there is still little applied research on its use as a platform for educational activities in management higher education.…
Hołowko, Elwira; Januszkiewicz, Kamil; Bolewicki, Paweł; Sitnik, Robert; Michoński, Jakub
2016-10-01
In forensic documentation with bloodstain pattern analysis (BPA), it is highly desirable to non-invasively obtain overall documentation of a crime scene while also registering single evidence objects, such as bloodstains, in high resolution. In this study, we propose a hierarchical 3D scanning platform designed according to the top-down approach known from traditional forensic photography. The overall 3D model of a scene is obtained via integration of laser scans registered from different positions. Parts of a scene of particular interest are documented using a midrange scanner, and the smallest details are added in the highest resolution as close-up scans. The scanning devices are controlled using developed software equipped with advanced algorithms for point cloud processing. To verify the feasibility and effectiveness of multi-resolution 3D scanning in crime scene documentation, our platform was applied to document a murder scene simulated by BPA experts from the Central Forensic Laboratory of the Police R&D, Warsaw, Poland. Applying the 3D scanning platform proved beneficial in the documentation of a crime scene combined with BPA. The multi-resolution 3D model enables virtual exploration of a scene in a three-dimensional environment and distance measurement, and gives a more realistic preservation of the evidence together with its surroundings. Moreover, high-resolution close-up scans aligned in a 3D model can be used to analyze bloodstains revealed at the crime scene. BPA results, such as trajectories and the area of origin, are visualized and analyzed in an accurate model of the scene. At this stage, a simplified approach treating the trajectory of a blood drop as a straight line is applied. Although the 3D scanning platform offers a new quality of crime scene documentation with BPA, some limitations of the technique are also discussed. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
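The final sentences note that blood-drop trajectories are treated as straight lines. As an illustration of that simplification (not the software used in the study), the sketch below computes the classic BPA impact angle from stain dimensions and estimates a rough area of origin from the closest approach of back-projected straight-line trajectories; the function names and the averaging strategy are assumptions.

```python
import numpy as np

def impact_angle(width, length):
    """Classic BPA approximation: impact angle from the stain width/length ratio."""
    return np.degrees(np.arcsin(width / length))

def area_of_origin(stains):
    """Estimate a rough area of origin from straight-line trajectories.
    stains: list of (position, unit_direction) pairs as NumPy arrays, with the
    direction pointing back along the assumed flight path."""
    points = []
    for (p1, d1), (p2, d2) in zip(stains[:-1], stains[1:]):
        # midpoint of the closest approach between two skew lines p + t*d
        w0 = p1 - p2
        a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
        d, e = d1 @ w0, d2 @ w0
        denom = a * c - b * b
        if abs(denom) < 1e-9:
            continue  # near-parallel trajectories give no stable intersection
        t1 = (b * e - c * d) / denom
        t2 = (a * e - b * d) / denom
        points.append(((p1 + t1 * d1) + (p2 + t2 * d2)) / 2.0)
    return np.mean(points, axis=0) if points else None
```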
Nanochannel Electroporation as a Platform for Living Cell Interrogation in Acute Myeloid Leukemia.
Zhao, Xi; Huang, Xiaomeng; Wang, Xinmei; Wu, Yun; Eisfeld, Ann-Kathrin; Schwind, Sebastian; Gallego-Perez, Daniel; Boukany, Pouyan E; Marcucci, Guido I; Lee, Ly James
2015-12-01
A living cell interrogation platform based on nanochannel electroporation is demonstrated with analysis of RNAs in single cells. This minimally invasive process operates on individual cells and allows both multi-target analysis and stimulus-response analysis via sequential deliveries. The unique platform possesses great potential for comprehensive, lysis-free nucleic acid analysis of rare or hard-to-transfect cells.
A platform for rapid prototyping of synthetic gene networks in mammalian cells
Duportet, Xavier; Wroblewska, Liliana; Guye, Patrick; Li, Yinqing; Eyquem, Justin; Rieders, Julianne; Rimchala, Tharathorn; Batt, Gregory; Weiss, Ron
2014-01-01
Mammalian synthetic biology may provide novel therapeutic strategies, help decipher new paths for drug discovery and facilitate synthesis of valuable molecules. Yet, our capacity to genetically program cells is currently hampered by the lack of efficient approaches to streamline the design, construction and screening of synthetic gene networks. To address this problem, here we present a framework for modular and combinatorial assembly of functional (multi)gene expression vectors and their efficient and specific targeted integration into a well-defined chromosomal context in mammalian cells. We demonstrate the potential of this framework by assembling and integrating different functional mammalian regulatory networks including the largest gene circuit built and chromosomally integrated to date (6 transcription units, 27kb) encoding an inducible memory device. Using a library of 18 different circuits as a proof of concept, we also demonstrate that our method enables one-pot/single-flask chromosomal integration and screening of circuit libraries. This rapid and powerful prototyping platform is well suited for comparative studies of genetic regulatory elements, genes and multi-gene circuits as well as facile development of libraries of isogenic engineered cell lines. PMID:25378321
Xu, Hai-Yu; Liu, Zhen-Ming; Fu, Yan; Zhang, Yan-Qiong; Yu, Jian-Jun; Guo, Fei-Fei; Tang, Shi-Huan; Lv, Chuan-Yu; Su, Jin; Cui, Ru-Yi; Yang, Hong-Jun
2017-09-01
Recently, integrative pharmacology (IP) has become a pivotal paradigm for the modernization of traditional Chinese medicine (TCM) and for combinatorial drug discovery. IP is an interdisciplinary science that establishes the in vitro and in vivo correlation between the absorption, distribution, metabolism, and excretion/pharmacokinetic (ADME/PK) profiles of TCM and the molecular networks of disease by integrating multi-disciplinary, multi-stage knowledge. In the present study, an internet-based computation platform for IP of TCM (TCM-IP, www.tcmip.cn) is established to promote the development of this emerging discipline. TCM big data are an important resource for TCM-IP, including a Chinese Medicine Formula Database, a Chinese Medical Herbs Database, a Chemical Database of Chinese Medicine, and a Target Database for Diseases and Symptoms. Data mining and bioinformatics approaches are the critical technologies of TCM-IP, including identification of TCM constituents, ADME prediction, target prediction for TCM constituents, and network construction and analysis. Furthermore, network beautification and individualized design are employed to meet users' requirements. We believe that TCM-IP is a very useful tool for identifying the active constituents of TCM and their potential molecular mechanisms of therapeutic action, and that it can be widely applied in quality evaluation, clinical repositioning, scientific discovery based on original thinking, prescription compatibility, and new TCM drug development. Copyright© by the Chinese Pharmaceutical Association.
This project aims to develop, deploy, and disseminate a suite of open-source tools and an integrated informatics platform that will facilitate multi-scale, correlative analyses of high-resolution whole-slide tissue image data and spatially mapped genetic and molecular data for cancer research. This platform will play an essential role in supporting studies of tumor initiation, development, heterogeneity, invasion, and metastasis.
Cloud Based Web 3d GIS Taiwan Platform
NASA Astrophysics Data System (ADS)
Tsai, W.-F.; Chang, J.-Y.; Yan, S. Y.; Chen, B.
2011-09-01
This article presents the status of the web 3D GIS platform developed at the National Applied Research Laboratories. The purpose is to develop a global earth observation 3D GIS platform for applications in disaster monitoring and assessment in Taiwan. For quick response with preliminary and detailed assessment after a natural disaster occurs, the web 3D GIS platform is useful for accessing, transferring, integrating, displaying and analyzing huge multi-scale data following the international OGC standards. The framework of the cloud service for data warehouse management and efficiency enhancement using VMware is illustrated in this article.
NASA Technical Reports Server (NTRS)
Katz, Daniel S.; Cwik, Tom; Fu, Chuigang; Imbriale, William A.; Jamnejad, Vahraz; Springer, Paul L.; Borgioli, Andrea
2000-01-01
The process of designing and analyzing a multiple-reflector system has traditionally been time-intensive, requiring large amounts of both computational and human time. At many frequencies, a discrete approximation of the radiation integral may be used to model the system. The code which implements this physical optics (PO) algorithm was developed at the Jet Propulsion Laboratory. It analyzes systems of antennas in pairs, and for each pair, the analysis can be computationally time-consuming. Additionally, the antennas must be described using a local coordinate system for each antenna, which makes it difficult to integrate the design into a multi-disciplinary framework in which there is traditionally one global coordinate system, even before considering deforming the antenna as prescribed by external structural and/or thermal factors. Finally, setting up the code to correctly analyze all the antenna pairs in the system can take a fair amount of time, and introduces possible human error. The use of parallel computing to reduce the computational time required for the analysis of a given pair of antennas has been previously discussed. This paper focuses on the other problems mentioned above. It will present a methodology and examples of use of an automated tool that performs the analysis of a complete multiple-reflector system in an integrated multi-disciplinary environment (including CAD modeling, and structural and thermal analysis) at the click of a button. This tool, named MOD Tool (Millimeter-wave Optics Design Tool), has been designed and implemented as a distributed tool, with a client that runs almost identically on Unix, Mac, and Windows platforms, and a server that runs primarily on a Unix workstation and can interact with parallel supercomputers with simple instruction from the user interacting with the client.
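For readers unfamiliar with the discrete approximation of the radiation integral mentioned above, the following is a heavily simplified scalar sketch of the idea (summing phase-weighted contributions from discretized surface patches to form a far-field pattern). It is a toy analogue under stated assumptions, not the JPL PO code, and it ignores polarization, obliquity factors, and the full vector current formulation.

```python
import numpy as np

def far_field_pattern(patch_centers, patch_amplitudes, patch_areas, directions, wavelength):
    """Toy scalar discretized radiation integral:
    F(r_hat) ~ sum_n a_n * exp(j * k * r_hat . r_n) * dS_n (far-field phase term only).
    patch_centers: N x 3 patch positions; directions: M x 3 unit observation directions."""
    k = 2.0 * np.pi / wavelength
    phase = k * directions @ patch_centers.T           # (M, N) phase terms
    weights = patch_amplitudes * patch_areas           # complex amplitude per patch
    return (np.exp(1j * phase) * weights).sum(axis=1)  # one complex value per direction
```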
Design of an imaging spectrometer for earth observation using freeform mirrors
NASA Astrophysics Data System (ADS)
Peschel, T.; Damm, C.; Beier, M.; Gebhardt, A.; Risse, S.; Walter, I.; Sebastian, I.; Krutz, D.
2017-09-01
In 2017 the new hyperspectral DLR Earth Sensing Imaging Spectrometer (DESIS) will be integrated into the Multi-User-System for Earth Sensing (MUSES) platform [1] installed on the International Space Station (ISS).
Airborne net-centric multi-INT sensor control, display, fusion, and exploitation systems
NASA Astrophysics Data System (ADS)
Linne von Berg, Dale C.; Lee, John N.; Kruer, Melvin R.; Duncan, Michael D.; Olchowski, Fred M.; Allman, Eric; Howard, Grant
2004-08-01
The NRL Optical Sciences Division has initiated a multi-year effort to develop and demonstrate an airborne net-centric suite of multi-intelligence (multi-INT) sensors and exploitation systems for real-time target detection and targeting product dissemination. The goal of this Net-centric Multi-Intelligence Fusion Targeting Initiative (NCMIFTI) is to develop an airborne real-time intelligence gathering and targeting system that can be used to detect concealed, camouflaged, and mobile targets. The multi-INT sensor suite will include high-resolution visible/infrared (EO/IR) dual-band cameras, hyperspectral imaging (HSI) sensors in the visible-to-near infrared, short-wave and long-wave infrared (VNIR/SWIR/LWIR) bands, Synthetic Aperture Radar (SAR), electronics intelligence sensors (ELINT), and off-board networked sensors. Other sensors are also being considered for inclusion in the suite to address unique target detection needs. Integrating a suite of multi-INT sensors on a single platform should optimize real-time fusion of the on-board sensor streams, thereby improving the detection probability and reducing the false alarms that occur in reconnaissance systems that use single-sensor types on separate platforms, or that use independent target detection algorithms on multiple sensors. In addition to the integration and fusion of the multi-INT sensors, the effort is establishing an open-systems net-centric architecture that will provide a modular "plug and play" capability for additional sensors and system components and provide distributed connectivity to multiple sites for remote system control and exploitation.
Ontology driven integration platform for clinical and translational research
Mirhaji, Parsa; Zhu, Min; Vagnoni, Mattew; Bernstam, Elmer V; Zhang, Jiajie; Smith, Jack W
2009-01-01
Semantic Web technologies offer a promising framework for the integration of disparate biomedical data. In this paper we present the semantic information integration platform under development at the Center for Clinical and Translational Sciences (CCTS) at the University of Texas Health Science Center at Houston (UTHSC-H) as part of our Clinical and Translational Science Award (CTSA) program. We utilize Semantic Web technologies not only for the integration, repurposing and classification of multi-source clinical data, but also to construct a distributed environment for online information sharing and collaboration. Service Oriented Architecture (SOA) is used to modularize and distribute reusable services in a dynamic and distributed environment. Components of the semantic solution and its overall architecture are described. PMID:19208190
Gene ARMADA: an integrated multi-analysis platform for microarray data implemented in MATLAB.
Chatziioannou, Aristotelis; Moulos, Panagiotis; Kolisis, Fragiskos N
2009-10-27
The microarray data analysis realm is ever growing through the development of various tools, open source and commercial. However, there is an absence of predefined, rational algorithmic analysis workflows or standardized batch processing that incorporates all steps, from raw data import up to the derivation of significantly differentially expressed gene lists. This absence obfuscates the analytical procedure and obstructs the massive comparative processing of genomic microarray datasets. Moreover, the solutions provided heavily depend on the programming skills of the user, whereas GUI-embedded solutions do not provide direct support for various raw image analysis formats or a versatile and flexible combination of signal processing methods. We describe here Gene ARMADA (Automated Robust MicroArray Data Analysis), a MATLAB implemented platform with a Graphical User Interface. This suite integrates all steps of microarray data analysis including automated data import, noise correction and filtering, normalization, statistical selection of differentially expressed genes, clustering, classification and annotation. In its current version, Gene ARMADA fully supports two-colour cDNA and Affymetrix oligonucleotide arrays, plus custom arrays for which experimental details are given in tabular form (Excel spreadsheet, comma separated values, tab-delimited text formats). It also supports the analysis of already processed results through its versatile import editor. Besides being fully automated, Gene ARMADA incorporates numerous functionalities of the Statistics and Bioinformatics Toolboxes of MATLAB. In addition, it provides numerous visualization and exploration tools plus customizable export data formats for seamless integration with other analysis tools or MATLAB for further processing. Gene ARMADA requires MATLAB 7.4 (R2007a) or higher and is also distributed as a stand-alone application with MATLAB Component Runtime. Gene ARMADA provides a highly adaptable, integrative, yet flexible tool which can be used for automated quality control, analysis, annotation and visualization of microarray data, constituting a starting point for further data interpretation and integration with numerous other tools.
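Gene ARMADA itself is a MATLAB GUI suite; purely as an illustration of the kind of pipeline it automates (normalization, statistical selection of differentially expressed genes, multiple-testing control), here is a minimal Python sketch under assumed inputs. It is not the Gene ARMADA code and omits background correction, filtering, clustering and annotation.

```python
import numpy as np
from scipy import stats

def select_differential_genes(expr, group_a_mask, alpha=0.05):
    """Minimal differential-expression selection: log2 transform, crude per-array
    median centering, per-gene t-test, Benjamini-Hochberg FDR control.
    expr: genes x samples intensity matrix; group_a_mask: boolean mask of condition A."""
    log_expr = np.log2(expr + 1.0)
    log_expr -= np.median(log_expr, axis=0, keepdims=True)   # per-array normalization
    a, b = log_expr[:, group_a_mask], log_expr[:, ~group_a_mask]
    _, pvals = stats.ttest_ind(a, b, axis=1)
    # Benjamini-Hochberg step-up procedure
    order = np.argsort(pvals)
    adjusted = pvals[order] * len(pvals) / (np.arange(len(pvals)) + 1)
    passed = np.zeros(len(pvals), dtype=bool)
    passed[order] = np.minimum.accumulate(adjusted[::-1])[::-1] <= alpha
    return np.flatnonzero(passed)   # indices of significantly differential genes
```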
Selvaraju, Subhashini; Rassi, Ziad El
2013-01-01
A fully integrated platform was developed for capturing/fractionating the human fucome from disease-free and breast cancer sera. It comprised multiple columns operated by HPLC pumps and switching valves for the simultaneous depletion of high-abundance proteins via affinity-based subtraction and the capture of fucosylated glycoproteins via lectin affinity chromatography, followed by fractionation of the captured glycoproteins by reversed phase chromatography (RPC). Two lectin columns specific to fucose, namely Aleuria aurantia lectin (AAL) and Lotus tetragonolobus agglutinin (LTA), were utilized. The platform allowed the "cascading" of the serum sample from column to column in the liquid phase with no sample manipulation between the various steps. This guaranteed no sample loss and no propagation of experimental biases between the various columns. Finally, the fucome was fractionated by RPC, yielding desalted fractions in a volatile acetonitrile-rich mobile phase, which after vacuum evaporation were subjected to trypsinolysis for LC-MS/MS analysis. This permitted the identification of the differentially expressed proteins (DEP) in breast cancer serum, yielding a broad panel of 35 DEP from the combined LTA and AAL captured proteins and a narrower panel of 8 DEP that were commonly differentially expressed in both LTA and AAL fractions, which are considered more representative of the cancer-altered fucome. PMID:23533108
BreedVision--a multi-sensor platform for non-destructive field-based phenotyping in plant breeding.
Busemeyer, Lucas; Mentrup, Daniel; Möller, Kim; Wunder, Erik; Alheit, Katharina; Hahn, Volker; Maurer, Hans Peter; Reif, Jochen C; Würschum, Tobias; Müller, Joachim; Rahe, Florian; Ruckelshausen, Arno
2013-02-27
Achieving the food and energy security of an increasing world population likely to exceed nine billion by 2050 represents a major challenge for plant breeding. Our ability to measure traits under field conditions has improved little over the last decades and currently constitutes a major bottleneck in crop improvement. This work describes the development of a tractor-pulled multi-sensor phenotyping platform for small grain cereals, with a focus on the technological development of the system. Various optical sensors such as light curtain imaging, 3D Time-of-Flight cameras, laser distance sensors, hyperspectral imaging and color imaging are integrated into the system to collect spectral and morphological information on the plants. The study specifies: the mechanical design, the system architecture for data collection and data processing, the phenotyping procedure of the integrated system, results from field trials for data quality evaluation, and calibration results for plant height determination as a quantified example of a platform application. Repeated measurements were taken at three developmental stages of the plants in the years 2011 and 2012, employing triticale (×Triticosecale Wittmack L.) as a model species. The technical repeatability of the measurement results was high for nearly all sensor types, confirming the high suitability of the platform under field conditions. The developed platform constitutes a robust basis for the development and calibration of further sensor and multi-sensor fusion models to measure various agronomic traits such as plant moisture content, lodging, tiller density or biomass yield, and thus represents a major step towards widening the bottleneck of non-destructive phenotyping for crop improvement and plant genetic studies. PMID:23447014
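As a rough illustration of the plant-height calibration mentioned above (not the BreedVision calibration procedure itself), the sketch below fits a linear fusion model that maps several height-related sensor readings to manual reference measurements; the choice of sensors and the ordinary-least-squares model are assumptions.

```python
import numpy as np

def calibrate_plant_height(sensor_features, reference_heights):
    """Fit a linear fusion model: manual height ~ intercept + weighted sensor features.
    sensor_features: plots x sensors matrix (e.g. light-curtain, ToF and laser heights)."""
    X = np.column_stack([np.ones(len(sensor_features)), sensor_features])
    coeffs, *_ = np.linalg.lstsq(X, reference_heights, rcond=None)
    predictions = X @ coeffs
    ss_res = np.sum((reference_heights - predictions) ** 2)
    ss_tot = np.sum((reference_heights - reference_heights.mean()) ** 2)
    return coeffs, 1.0 - ss_res / ss_tot   # model weights and calibration R^2
```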
NASA Astrophysics Data System (ADS)
Chirvi, Sajal
Biomolecular interaction analysis (BIA) plays a vital role in a wide variety of fields, including biomedical research, the pharmaceutical industry, medical diagnostics, and the biotechnology industry. The study and quantification of interactions between natural biomolecules (proteins, enzymes, DNA) and artificially synthesized molecules (drugs) is routinely done using various labeled and label-free BIA techniques. Labeled BIA techniques (chemiluminescence, fluorescence, radioactive labeling) suffer from steric hindrance of labels at the interaction site, the difficulty of attaching labels to molecules, and higher assay development cost and time. Label-free techniques with real-time detection capabilities have demonstrated advantages over traditional labeled techniques. The gold standard for label-free BIA is surface plasmon resonance (SPR), which detects and quantifies changes in the refractive index of the ligand-analyte complex with high sensitivity. Although SPR is a highly sensitive BIA technique, it requires custom-made sensor chips and is not well suited for the highly multiplexed BIA required in high-throughput applications. Moreover, implementation of SPR on various biosensing platforms is limited. In this research work, spectral domain phase sensitive interferometry (SD-PSI) is developed for label-free BIA and biosensing applications to address the limitations of SPR and other label-free techniques. One distinct advantage of SD-PSI compared to other label-free techniques is that it does not require the use of custom-fabricated biosensor substrates; laboratory-grade, off-the-shelf glass or plastic substrates of suitable thickness with proper surface functionalization are used as biosensor chips. SD-PSI is tested on four separate BIA and biosensing platforms: a multi-well plate, a flow cell, a fiber probe with integrated optics, and a fiber-tip biosensor. A sensitivity of 33 ng/ml for anti-IgG is achieved using the multi-well platform. The principle of coherence multiplexing for multi-channel label-free biosensing applications is introduced: multiple biosensors can be interrogated simultaneously with a single spectral domain phase sensitive interferometer by coding the individual sensorgrams in coherence-multiplexed channels. Experimental results demonstrating multiplexed quantitative biomolecular interaction analysis of antibodies binding to antigen-coated, functionalized biosensor chip surfaces on the different platforms are presented.
Integrative Exploratory Analysis of Two or More Genomic Datasets.
Meng, Chen; Culhane, Aedin
2016-01-01
Exploratory analysis is an essential step in the analysis of high-throughput data. Multivariate approaches such as correspondence analysis (CA), principal component analysis, and multidimensional scaling are widely used in the exploratory analysis of a single dataset. Modern biological studies often assay multiple types of biological molecules (e.g., mRNA, protein, phosphoproteins) on the same set of biological samples, thereby creating multiple different types of omics or multiassay data. Integrative exploratory analysis of these multiple omics data is required to leverage the potential of multiple omics studies. In this chapter, we describe the application of co-inertia analysis (CIA; for analyzing two datasets) and multiple co-inertia analysis (MCIA; for three or more datasets) to address this problem. These methods are powerful yet simple multivariate approaches that represent samples using a lower number of variables, allowing easier identification of the correlated structure within and between multiple high-dimensional datasets. Graphical representations can be employed for this purpose. In addition, the methods simultaneously project samples and variables (genes, proteins) onto the same lower-dimensional space, so the most variant variables from each dataset can be selected and associated with samples, which can further facilitate biological interpretation and pathway analysis. We applied CIA to explore the concordance between mRNA and protein expression in a panel of 60 tumor cell lines from the National Cancer Institute. In the same 60 cell lines, we used MCIA to perform a cross-platform comparison of mRNA gene expression profiles obtained on four different microarray platforms. Lastly, as an example of integrative analysis of multiassay or multi-omics data, we analyzed transcriptomic, proteomic, and phosphoproteomic data from induced pluripotent stem (iPS) and embryonic stem (ES) cell lines.
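Conceptually, co-inertia analysis finds pairs of axes that maximize the covariance between the sample scores of two datasets. A minimal NumPy sketch of that core step, via SVD of the cross-covariance of column-centered data, is shown below; it omits the row/column weighting and transform options (e.g., CA-style preprocessing) that full CIA/MCIA implementations apply, so it is an illustration of the idea rather than a drop-in replacement.

```python
import numpy as np

def simple_coinertia(X, Y, n_axes=2):
    """Sketch of co-inertia analysis: SVD of the cross-covariance between two
    column-centered data sets measured on the same samples.
    X: samples x p features (e.g. mRNA); Y: samples x q features (e.g. protein)."""
    Xc = X - X.mean(axis=0)
    Yc = Y - Y.mean(axis=0)
    cross_cov = Xc.T @ Yc / (len(X) - 1)           # p x q cross-covariance
    U, s, Vt = np.linalg.svd(cross_cov, full_matrices=False)
    x_scores = Xc @ U[:, :n_axes]                   # sample scores in X-space
    y_scores = Yc @ Vt[:n_axes].T                   # sample scores in Y-space
    return x_scores, y_scores, s[:n_axes]           # paired scores and co-inertia per axis
```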
Mashup Scheme Design of Map Tiles Using Lightweight Open Source Webgis Platform
NASA Astrophysics Data System (ADS)
Hu, T.; Fan, J.; He, H.; Qin, L.; Li, G.
2018-04-01
To address the difficulties involved in using existing commercial Geographic Information System platforms to fuse multi-source image data, this research proposes loading multi-source local tile data based on CesiumJS and examines the tile data organization mechanisms and spatial-reference differences of the CesiumJS platform and of various tile data sources, such as Google Maps, Map World, and Bing Maps. Two types of tile loading schemes were designed for the mashup of tiles: a single-data-source scheme and a multi-data-source scheme. The multi-source digital map tiles used in this paper cover two mainstream spatial references, the WGS84 coordinate system and the Web Mercator coordinate system. According to the experimental results, both the single-data-source scheme and the multi-data-source scheme with a common spatial coordinate system showed favorable visualization effects; however, the multi-data-source scheme was prone to tile image deformation when loading multi-source tile data with different spatial references. The resulting method provides a low-cost and highly flexible solution for small and medium-scale GIS programs and has potential for practical application. The deformation that occurs during the transition between different spatial references is an important topic for further research.
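The deformation issue arises because WGS84 (geographic) and Web Mercator tiling schemes index the same point differently. As a small, provider-agnostic illustration (the exact grid conventions of particular services are assumptions and do vary), the sketch below converts a longitude/latitude into tile indices under both schemes:

```python
import math

def web_mercator_tile(lon_deg, lat_deg, zoom):
    """Standard XYZ/Web Mercator tiling: 2^z x 2^z tiles covering about +-85.05 deg latitude."""
    n = 2 ** zoom
    x = int((lon_deg + 180.0) / 360.0 * n)
    lat = math.radians(lat_deg)
    y = int((1.0 - math.asinh(math.tan(lat)) / math.pi) / 2.0 * n)
    return x, y

def wgs84_tile(lon_deg, lat_deg, zoom):
    """Simple geographic (equirectangular) tiling: 2*2^z x 2^z tiles covering +-90 deg latitude."""
    n = 2 ** zoom
    x = int((lon_deg + 180.0) / 360.0 * (2 * n))
    y = int((90.0 - lat_deg) / 180.0 * n)
    return x, y

# The same point falls into different tiles under the two schemes, which is why
# mixing providers with different spatial references distorts a mashup.
print(web_mercator_tile(121.5, 25.0, 8), wgs84_tile(121.5, 25.0, 8))
```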
LinkedOmics: analyzing multi-omics data within and across 32 cancer types.
Vasaikar, Suhas V; Straub, Peter; Wang, Jing; Zhang, Bing
2018-01-04
The LinkedOmics database contains multi-omics data and clinical data for 32 cancer types and a total of 11 158 patients from The Cancer Genome Atlas (TCGA) project. It is also the first multi-omics database that integrates mass spectrometry (MS)-based global proteomics data generated by the Clinical Proteomic Tumor Analysis Consortium (CPTAC) on selected TCGA tumor samples. In total, LinkedOmics has more than a billion data points. To allow comprehensive analysis of these data, we developed three analysis modules in the LinkedOmics web application. The LinkFinder module allows flexible exploration of associations between a molecular or clinical attribute of interest and all other attributes, providing the opportunity to analyze and visualize associations between billions of attribute pairs for each cancer cohort. The LinkCompare module enables easy comparison of the associations identified by LinkFinder, which is particularly useful in multi-omics and pan-cancer analyses. The LinkInterpreter module transforms identified associations into biological understanding through pathway and network analysis. Using five case studies, we demonstrate that LinkedOmics provides a unique platform for biologists and clinicians to access, analyze and compare cancer multi-omics data within and across tumor types. LinkedOmics is freely available at http://www.linkedomics.org. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
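As an illustration of the association-scan pattern that LinkFinder implements (not the LinkedOmics code or API), a minimal sketch correlating one attribute of interest against all other attributes follows; the use of Spearman correlation and pairwise-complete handling of missing values are assumptions.

```python
import numpy as np
from scipy import stats

def association_scan(target, attribute_matrix, attribute_names):
    """Correlate one clinical or molecular attribute against all others.
    target: length-n vector; attribute_matrix: n samples x m attributes."""
    results = []
    for name, column in zip(attribute_names, attribute_matrix.T):
        mask = ~(np.isnan(target) | np.isnan(column))   # pairwise-complete samples
        if mask.sum() < 3:
            continue
        rho, p = stats.spearmanr(target[mask], column[mask])
        results.append((name, rho, p))
    return sorted(results, key=lambda r: r[2])          # strongest associations first
```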
Registration and Fusion of Multiple Source Remotely Sensed Image Data
NASA Technical Reports Server (NTRS)
LeMoigne, Jacqueline
2004-01-01
Earth and Space Science often involve the comparison, fusion, and integration of multiple types of remotely sensed data at various temporal, radiometric, and spatial resolutions. Results of this integration may be utilized for global change analysis, global coverage of an area at multiple resolutions, map updating or validation of new instruments, as well as integration of data provided by multiple instruments carried on multiple platforms, e.g. in spacecraft constellations or fleets of planetary rovers. Our focus is on developing methods to perform fast, accurate and automatic image registration and fusion. General methods for automatic image registration are being reviewed and evaluated. Various choices for feature extraction, feature matching and similarity measurements are being compared, including wavelet-based algorithms, mutual information and statistically robust techniques. Our work also involves studies related to image fusion and investigates dimension reduction and co-kriging for application-dependent fusion. All methods are being tested using several multi-sensor datasets, acquired at EOS Core Sites, and including multiple sensors such as IKONOS, Landsat-7/ETM+, EO1/ALI and Hyperion, MODIS, and SeaWIFS instruments. Issues related to the coregistration of data from the same platform (i.e., AIRS and MODIS from Aqua) or from several platforms of the A-train (i.e., MLS, HIRDLS, OMI from Aura with AIRS and MODIS from Terra and Aqua) will also be considered.
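Among the similarity measures listed, mutual information can be computed from the joint intensity histogram of two co-registered images. A compact NumPy sketch of this metric follows (the bin count and histogram-based estimator are assumptions); during registration it would be maximized over candidate transform parameters.

```python
import numpy as np

def mutual_information(image_a, image_b, bins=64):
    """Mutual information between two images, estimated from their joint histogram."""
    joint, _, _ = np.histogram2d(image_a.ravel(), image_b.ravel(), bins=bins)
    pxy = joint / joint.sum()                      # joint intensity distribution
    px = pxy.sum(axis=1, keepdims=True)            # marginal of image_a
    py = pxy.sum(axis=0, keepdims=True)            # marginal of image_b
    nonzero = pxy > 0
    return float(np.sum(pxy[nonzero] * np.log(pxy[nonzero] / (px @ py)[nonzero])))
```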
Optofluidic devices for biomolecule sensing and multiplexing
NASA Astrophysics Data System (ADS)
Ozcelik, Damla
Optofluidics, which integrates photonics and microfluidics, has led to highly compact, sensitive and adaptable biomedical sensors. Optofluidic biosensors based on liquid-core anti-resonant reflecting optical waveguides (LC-ARROWs) have proven to be a highly sensitive, portable, and reconfigurable platform for fluorescence spectroscopy and detection of single biomolecules such as proteins, nucleic acids, and virus particles. However, continued improvements in sensitivity remain a major goal as we approach the ultimate limit of detecting individual bio-particles labeled by single or few fluorophores. Additionally, the ability to simultaneously detect and identify multiple biological particles or biomarkers is one of the key requirements for molecular diagnostic tests. The compactness and adaptability of these platforms can further be advanced by introducing tunability, integrating off-chip components, and designing reconfigurable and customizable devices, which makes these platforms very good candidates for many different applications. The goal of this thesis was to introduce new elements into these LC-ARROW optofluidic platforms that provide major enhancements in their functionality, making them more sensitive, compact, customizable and multiplexed. First, a novel integrated tunable spectral filter that achieves effective elimination of background noise on the ARROW platform was demonstrated. A unique dual liquid-core design enabled the independent multi-wavelength tuning of the spectral filter by adjusting the refractive index and chemical properties of the liquid. In order to enhance the detection sensitivity of the platform, Y-splitter waveguides were integrated to create multiple excitation spots for each target molecule. A powerful signal processing algorithm was used to analyze the data to improve the signal-to-noise ratio (SNR) of the collected data. Next, the design, optimization and characterization of the Y-splitter waveguides are presented, and single influenza virus detection with an improved SNR was demonstrated using this platform. Finally, multiplexing capacity is introduced to the ARROW detection platform by integrating multi-mode interference (MMI) waveguides. MMI waveguides create wavelength-dependent multiple excitation spots at the excitation region, allowing spectrally multiplexed detection of multiple different target molecules based on the excitation pattern, without the need for additional spectral filters. Successful spectrally multiplexed detection of three different types of influenza viruses is achieved by using separate wavelengths and combinations of wavelengths. This multiplexing capacity is further enhanced by taking advantage of the spatial properties of the MMI pattern, designing triple liquid-core waveguides that intersect the MMI waveguide in different locations. Furthermore, the spectral and spatial multiplexing capacities are combined in these triple liquid-core MMI platforms, allowing these devices to distinguish multiple different targets and samples simultaneously.
An integrated compact airborne multispectral imaging system using embedded computer
NASA Astrophysics Data System (ADS)
Zhang, Yuedong; Wang, Li; Zhang, Xuguo
2015-08-01
An integrated compact airborne multispectral imaging system using an embedded-computer-based control system was developed for small-aircraft multispectral imaging applications. The multispectral imaging system integrates a CMOS camera, a filter wheel with eight filters, a two-axis stabilized platform, a miniature POS (position and orientation system) and an embedded computer. The embedded computer has excellent universality and expansibility, and has advantages in volume and weight for an airborne platform, so it can meet the control-system requirements of the integrated airborne multispectral imaging system. The embedded computer controls camera parameter setting, filter wheel and stabilized platform operation, and image and POS data acquisition, and stores the images and data. Peripheral devices can be connected via the ports of the embedded computer, which simplifies system operation and management of the stored image data. This airborne multispectral imaging system offers small volume, multiple functions, and good expansibility. The imaging experiment results show that this system has potential for multispectral remote sensing in applications such as resource investigation and environmental monitoring.
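As a sketch of the control flow described (the embedded computer setting camera parameters, stepping the filter wheel, and tagging each frame with POS data), a hedged Python outline follows; every class and method name here (the camera, filter wheel, POS and storage objects and their move_to, grab_frame, latest, and save calls) is a hypothetical stand-in rather than the system's actual interface.

```python
import time

class MultispectralAcquisition:
    """Hypothetical acquisition loop: one frame per filter position, tagged with POS data."""
    def __init__(self, camera, filter_wheel, pos_receiver, storage, n_filters=8):
        self.camera, self.wheel, self.pos, self.storage = camera, filter_wheel, pos_receiver, storage
        self.n_filters = n_filters

    def acquire_cycle(self, exposure_ms):
        self.camera.set_exposure(exposure_ms)          # camera parameter setting
        for band in range(self.n_filters):
            self.wheel.move_to(band)                   # step filter wheel to the next band
            time.sleep(0.05)                           # allow the wheel to settle (assumed)
            frame = self.camera.grab_frame()
            fix = self.pos.latest()                    # position/attitude at capture time
            self.storage.save(band=band, image=frame, position=fix)
```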
Integrated multi-sensor package (IMSP) for unmanned vehicle operations
NASA Astrophysics Data System (ADS)
Crow, Eddie C.; Reichard, Karl; Rogan, Chris; Callen, Jeff; Seifert, Elwood
2007-10-01
This paper describes recent efforts to develop integrated multi-sensor payloads for small robotic platforms for improved operator situational awareness and ultimately for greater robot autonomy. The focus is on enhancements to perception through integration of electro-optic, acoustic, and other sensors for navigation and inspection. The goals are to provide easier control and operation of the robot through fusion of multiple sensor outputs, to improve interoperability of the sensor payload package across multiple platforms through the use of open standards and architectures, and to reduce integration costs by embedded sensor data processing and fusion within the sensor payload package. The solutions investigated in this project to be discussed include: improved capture, processing and display of sensor data from multiple, non-commensurate sensors; an extensible architecture to support plug and play of integrated sensor packages; built-in health, power and system status monitoring using embedded diagnostics/prognostics; sensor payload integration into standard product forms for optimized size, weight and power; and the use of the open Joint Architecture for Unmanned Systems (JAUS)/ Society of Automotive Engineers (SAE) AS-4 interoperability standard. This project is in its first of three years. This paper will discuss the applicability of each of the solutions in terms of its projected impact to reducing operational time for the robot and teleoperator.
Introducing WISDEM:An Integrated System Modeling for Wind Turbines and Plant (Presentation)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Dykes, K.; Graf, P.; Scott, G.
2015-01-01
The National Wind Technology Center wind energy systems engineering initiative has developed an analysis platform to leverage its research capabilities toward integrating wind energy engineering and cost models across wind plants. This Wind-Plant Integrated System Design & Engineering Model (WISDEM) platform captures the important interactions between various subsystems to achieve a better understanding of how to improve system-level performance and achieve system-level cost reductions. This work illustrates a few case studies with WISDEM that focus on the design and analysis of wind turbines and plants at different system levels.
Multi-threaded ATLAS simulation on Intel Knights Landing processors
NASA Astrophysics Data System (ADS)
Farrell, Steven; Calafiura, Paolo; Leggett, Charles; Tsulaia, Vakhtang; Dotti, Andrea; ATLAS Collaboration
2017-10-01
The Knights Landing (KNL) release of the Intel Many Integrated Core (MIC) Xeon Phi line of processors is a potential game changer for HEP computing. With 72 cores and deep vector registers, the KNL cards promise significant performance benefits for highly-parallel, compute-heavy applications. Cori, the newest supercomputer at the National Energy Research Scientific Computing Center (NERSC), was delivered to its users in two phases with the first phase online at the end of 2015 and the second phase now online at the end of 2016. Cori Phase 2 is based on the KNL architecture and contains over 9000 compute nodes with 96GB DDR4 memory. ATLAS simulation with the multithreaded Athena Framework (AthenaMT) is a good potential use-case for the KNL architecture and supercomputers like Cori. ATLAS simulation jobs have a high ratio of CPU computation to disk I/O and have been shown to scale well in multi-threading and across many nodes. In this paper we will give an overview of the ATLAS simulation application with details on its multi-threaded design. Then, we will present a performance analysis of the application on KNL devices and compare it to a traditional x86 platform to demonstrate the capabilities of the architecture and evaluate the benefits of utilizing KNL platforms like Cori for ATLAS production.
G-DOC Plus - an integrative bioinformatics platform for precision medicine.
Bhuvaneshwar, Krithika; Belouali, Anas; Singh, Varun; Johnson, Robert M; Song, Lei; Alaoui, Adil; Harris, Michael A; Clarke, Robert; Weiner, Louis M; Gusev, Yuriy; Madhavan, Subha
2016-04-30
G-DOC Plus is a data integration and bioinformatics platform that uses cloud computing and other advanced computational tools to handle a variety of biomedical BIG DATA including gene expression arrays, NGS and medical images so that they can be analyzed in the full context of other omics and clinical information. G-DOC Plus currently holds data from over 10,000 patients selected from private and public resources including Gene Expression Omnibus (GEO), The Cancer Genome Atlas (TCGA) and the recently added datasets from REpository for Molecular BRAin Neoplasia DaTa (REMBRANDT), caArray studies of lung and colon cancer, ImmPort and the 1000 genomes data sets. The system allows researchers to explore clinical-omic data one sample at a time, as a cohort of samples; or at the level of population, providing the user with a comprehensive view of the data. G-DOC Plus tools have been leveraged in cancer and non-cancer studies for hypothesis generation and validation; biomarker discovery and multi-omics analysis, to explore somatic mutations and cancer MRI images; as well as for training and graduate education in bioinformatics, data and computational sciences. Several of these use cases are described in this paper to demonstrate its multifaceted usability. G-DOC Plus can be used to support a variety of user groups in multiple domains to enable hypothesis generation for precision medicine research. The long-term vision of G-DOC Plus is to extend this translational bioinformatics platform to stay current with emerging omics technologies and analysis methods to continue supporting novel hypothesis generation, analysis and validation for integrative biomedical research. By integrating several aspects of the disease and exposing various data elements, such as outpatient lab workup, pathology, radiology, current treatments, molecular signatures and expected outcomes over a web interface, G-DOC Plus will continue to strengthen precision medicine research. G-DOC Plus is available at: https://gdoc.georgetown.edu .
Churilov, Leonid; Liu, Daniel; Ma, Henry; Christensen, Soren; Nagakane, Yoshinari; Campbell, Bruce; Parsons, Mark W; Levi, Christopher R; Davis, Stephen M; Donnan, Geoffrey A
2013-04-01
The appropriateness of a software platform for rapid MRI assessment of the amount of salvageable brain tissue after stroke is critical for both the validity of the Extending the Time for Thrombolysis in Emergency Neurological Deficits (EXTEND) Clinical Trial of stroke thrombolysis beyond 4.5 hours and for stroke patient care outcomes. The objective of this research is to develop and implement a methodology for selecting the acute stroke imaging software platform most appropriate for the setting of a multi-centre clinical trial. A multi-disciplinary decision making panel formulated the set of preferentially independent evaluation attributes. Alternative Multi-Attribute Value Measurement methods were used to identify the best imaging software platform followed by sensitivity analysis to ensure the validity and robustness of the proposed solution. Four alternative imaging software platforms were identified. RApid processing of PerfusIon and Diffusion (RAPID) software was selected as the most appropriate for the needs of the EXTEND trial. A theoretically grounded generic multi-attribute selection methodology for imaging software was developed and implemented. The developed methodology assured both a high quality decision outcome and a rational and transparent decision process. This development contributes to stroke literature in the area of comprehensive evaluation of MRI clinical software. At the time of evaluation, RAPID software presented the most appropriate imaging software platform for use in the EXTEND clinical trial. The proposed multi-attribute imaging software evaluation methodology is based on sound theoretical foundations of multiple criteria decision analysis and can be successfully used for choosing the most appropriate imaging software while ensuring both robust decision process and outcomes. © 2012 The Authors. International Journal of Stroke © 2012 World Stroke Organization.
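The core of a Multi-Attribute Value Measurement evaluation is a weighted additive aggregation of normalized attribute scores. The sketch below illustrates only that pattern; the attribute names, weights and scores are invented for illustration and are not those used by the EXTEND decision panel.

```python
# Minimal weighted-additive multi-attribute value sketch (illustrative only).
# Attribute names, weights and scores are hypothetical, not the EXTEND panel's.

def additive_value(scores, weights):
    """Aggregate normalized attribute scores (0-1) with weights that sum to 1."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9
    return sum(weights[a] * scores[a] for a in weights)

weights = {"processing_speed": 0.4, "usability": 0.3, "validation_evidence": 0.3}

platforms = {
    "Software A": {"processing_speed": 0.9, "usability": 0.7, "validation_evidence": 0.8},
    "Software B": {"processing_speed": 0.6, "usability": 0.9, "validation_evidence": 0.5},
}

ranking = sorted(platforms, key=lambda p: additive_value(platforms[p], weights), reverse=True)
for name in ranking:
    print(name, round(additive_value(platforms[name], weights), 3))
```

A sensitivity analysis of the kind described above would simply repeat the ranking while perturbing the weights and check whether the preferred alternative changes.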
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mellor-Crummey, John
The PIPER project set out to develop methodologies and software for measurement, analysis, attribution, and presentation of performance data for extreme-scale systems. Goals of the project were to support analysis of massive multi-scale parallelism, heterogeneous architectures, multi-faceted performance concerns, and to support both post-mortem performance analysis to identify program features that contribute to problematic performance and on-line performance analysis to drive adaptation. This final report summarizes the research and development activity at Rice University as part of the PIPER project. Producing a complete suite of performance tools for exascale platforms during the course of this project was impossible since both hardware and software for exascale systems are still a moving target. For that reason, the project focused broadly on the development of new techniques for measurement and analysis of performance on modern parallel architectures, enhancements to HPCToolkit's software infrastructure to support our research goals or use on sophisticated applications, engaging developers of multithreaded runtimes to explore how support for tools should be integrated into their designs, engaging operating system developers with feature requests for enhanced monitoring support, engaging vendors with requests that they add hardware measurement capabilities and software interfaces needed by tools as they design new components of HPC platforms including processors, accelerators and networks, and finally collaborations with partners interested in using HPCToolkit to analyze and tune scalable parallel applications.
A compact multi-trap optical tweezer system based on CD-ROM technologies
NASA Astrophysics Data System (ADS)
McMenamin, T.; Lee, W. M.
2017-08-01
We implemented an integrated time-sharing multiple optical trapping system through the synchronisation of a high-speed voice-coil scanning lens and laser pulsing. The integration is achieved by using a commonly available optical pickup unit (OPU) of the kind found inside optical drives. Scanning frequencies of up to 2 kHz were shown to achieve arbitrary distributions of optical traps within the one-dimensional scan range of the voice coil motor. The functions of the system were demonstrated by the imaging and trapping of 1 μm particles and giant unilamellar vesicles (GUVs). The new device circumvents existing bulky laser scanning systems (4f lens systems) with an integrated laser and lens steering platform that can be mounted on a variety of microscopy platforms (confocal, lightsheet, darkfield).
Luo, Jake; Apperson-Hansen, Carolyn; Pelfrey, Clara M; Zhang, Guo-Qiang
2014-11-30
Cross-institutional cross-disciplinary collaboration has become a trend as researchers move toward building more productive and innovative teams for scientific research. Research collaboration is significantly changing the organizational structure and strategies used in the clinical and translational science domain. However, due to the obstacles of diverse administrative structures, differences in area of expertise, and communication barriers, establishing and managing a cross-institutional research project is still a challenging task. We address these challenges by creating an integrated informatics platform to reduce the barriers to biomedical research collaboration. The Request Management System (RMS) is an informatics infrastructure designed to transform a patchwork of expertise and resources into an integrated support network. The RMS facilitates investigators' initiation of new collaborative projects and supports the management of the collaboration process. In RMS, experts and their knowledge areas are categorized and managed structurally to provide consistent service. A role-based collaborative workflow is tightly integrated with domain experts and services to streamline and monitor the life-cycle of a research project. The RMS has so far tracked over 1,500 investigators with over 4,800 tasks. The research network based on the data collected in RMS illustrated that the investigators' collaborative projects increased close to 3 times from 2009 to 2012. Our experience with RMS indicates that the platform reduces barriers for cross-institutional collaboration of biomedical research projects. Building a new generation of infrastructure to enhance cross-disciplinary and multi-institutional collaboration has become an important yet challenging task. In this paper, we share the experience of developing and utilizing a collaborative project management system. The results of this study demonstrate that a web-based integrated informatics platform can facilitate and increase research interactions among investigators.
2013-02-28
[Fragmentary abstract recovered from presentation material; the legible content lists: capabilities needed to detect and isolate a compromised component; preventing a cyber attack exploit from reading enough information to form a coherent data set; signal copy and analysis of selected sub-bands; a gimbaled, stabilized EO/IR camera ball; high-precision GPS and INS (with eventual swarm-capable inter-UAV coherence); LIDAR, HSI and chem-bio sensors; multi-platform distributed sensor experiments (e.g., MIMO); and autonomous and collaborative multi-platform control.]
Gil, Yeongjoon; Wu, Wanqing; Lee, Jungtae
2012-01-01
Background: Human life can be further improved if diseases and disorders can be predicted before they become dangerous, by correctly recognizing signals from the human body; to make disease detection more precise, various body signals need to be measured simultaneously in a synchronized manner. Objective: This research aims at developing an integrated system for measuring four signals (EEG, ECG, respiration, and PPG) and simultaneously producing synchronous signals on a Wireless Body Sensor Network. Design: We designed and implemented a platform for multiple bio-signals using Bluetooth communication. Results: First, we developed a prototype board and verified the signals from the sensor platform using frequency responses and quantities. Next, we designed and implemented a lightweight, ultra-compact, low-cost, low-power-consumption printed circuit board. Conclusion: A synchronous multi-body-sensor platform is expected to be very useful in telemedicine and emergency rescue scenarios. Furthermore, this system is expected to be able to analyze the mutual effects among body signals. PMID:23112605
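To make the synchronization requirement concrete, the following sketch shows one simple way to place bio-signal streams sampled at different rates onto a common time base by interpolation. It is an illustration only, not the platform's firmware; the sampling rates and the placeholder sine signals are assumptions.

```python
# Hypothetical sketch: aligning bio-signal streams sampled at different rates
# onto one common time base so EEG/ECG/respiration/PPG samples are synchronous.
import numpy as np

def make_stream(rate_hz, duration_s):
    t = np.arange(0, duration_s, 1.0 / rate_hz)
    return t, np.sin(2 * np.pi * 1.0 * t)          # placeholder 1 Hz signal

duration = 10.0
streams = {"ECG": make_stream(250, duration), "EEG": make_stream(128, duration),
           "RESP": make_stream(25, duration), "PPG": make_stream(100, duration)}

common_t = np.arange(0, duration, 1.0 / 250)        # resample everything to 250 Hz
aligned = {name: np.interp(common_t, t, x) for name, (t, x) in streams.items()}
print({name: values.shape for name, values in aligned.items()})
```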
NASA Astrophysics Data System (ADS)
Huang, W.; Jiang, J.; Zha, Z.; Zhang, H.; Wang, C.; Zhang, J.
2014-04-01
Geospatial data resources are the foundation of the construction of a geo portal designed to provide online geoinformation services for government, enterprise and the public. It is vital to keep geospatial data fresh, accurate and comprehensive in order to satisfy the requirements of applications such as geographic location, route navigation and geo search. One of the major problems we face is data acquisition; for us, integrating multi-source geospatial data is the main means of data acquisition. This paper introduces a practical approach for integrating multi-source geospatial data with different data models, structures and formats, which provided the construction of the National Geospatial Information Service Platform of China (NGISP) with effective technical support. NGISP is China's official geo portal, providing online geoinformation services based on the internet, the e-government network and the classified network. Within the NGISP architecture there are three kinds of nodes: national, provincial and municipal. The geospatial data therefore comes from these nodes, and the different datasets are heterogeneous. Based on the analysis of the heterogeneous datasets, we first define the basic principles of data fusion, covering the following aspects: 1. location precision; 2. geometric representation; 3. up-to-date state; 4. attribute values; and 5. spatial relationships. The technical procedure is then developed, and a method for processing different categories of features, such as roads, railways, boundaries, rivers, settlements and buildings, is proposed based on these principles. A case study in Jiangsu province demonstrated the applicability of the principles, procedure and method of multi-source geospatial data integration.
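As an illustration of how such fusion principles can be operationalized, the sketch below applies one simple rule, prefer the most recent capture and, on ties, the best stated location precision, to copies of the same feature reported by different nodes. The field names and values are hypothetical and do not reflect NGISP's actual schema or rules.

```python
# Hypothetical fusion rule: when the same feature arrives from several nodes,
# prefer the more recent capture date and, on ties, the better (smaller)
# stated location precision. Field names are illustrative only.
from datetime import date

features = [
    {"id": "road-101", "node": "national",   "captured": date(2012, 6, 1), "precision_m": 5.0},
    {"id": "road-101", "node": "provincial", "captured": date(2013, 9, 1), "precision_m": 2.5},
    {"id": "road-101", "node": "municipal",  "captured": date(2013, 9, 1), "precision_m": 1.0},
]

def fuse(candidates):
    # newest first; among equally recent candidates, the smallest precision value wins
    return min(candidates, key=lambda f: (-f["captured"].toordinal(), f["precision_m"]))

print(fuse(features)["node"])   # -> 'municipal'
```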
Cortés, Ulises; Annicchiarico, Roberta; Campana, Fabio; Vázquez-Salceda, Javier; Urdiales, Cristina; Canãmero, Lola; López, Maite; Sánchez-Marrè, Miquel; Di Vincenzo, Sarah; Caltagirone, Carlo
2004-04-01
A project based on the integration of new technologies and artificial intelligence to develop a device (e-tool) for disabled patients and elderly people is presented. A mobile platform in intelligent environments (skilled-care facilities and home care), controlled and managed by a multi-level architecture, is proposed to support patients and caregivers in increasing self-dependency in activities of daily living.
NASA Astrophysics Data System (ADS)
Anders, Niels; Suomalainen, Juha; Seeger, Manuel; Keesstra, Saskia; Bartholomeus, Harm; Paron, Paolo
2014-05-01
The recent increase in the performance and endurance of electronically controlled flying platforms, such as multi-copters and fixed-wing airplanes, together with the decreasing size and weight of sensors and batteries, has led to the increasing popularity of Unmanned Aerial Systems (UAS) for scientific purposes. Modern workflows that implement UAS include guided flight plan generation, 3D GPS navigation for fully automated piloting, and automated processing with new techniques such as "Structure from Motion" photogrammetry. UAS are often equipped with normal RGB cameras, multi- and hyperspectral sensors, radar, or other sensors, and provide a cheap and flexible solution for creating multi-temporal data sets. UAS have revolutionized multi-temporal research, allowing new applications related to change analysis and process monitoring. The EGU General Assembly 2014 is hosting a session on platforms, sensors and applications of UAS in soil science and geomorphology. This presentation briefly summarizes the outcome of this session, addressing the current state and future challenges of small-platform data acquisition in soil science and geomorphology.
A droplet-to-digital (D2D) microfluidic device for single cell assays.
Shih, Steve C C; Gach, Philip C; Sustarich, Jess; Simmons, Blake A; Adams, Paul D; Singh, Seema; Singh, Anup K
2015-01-07
We have developed a new hybrid droplet-to-digital microfluidic platform (D2D) that integrates droplet-in-channel microfluidics with digital microfluidics (DMF) for performing multi-step assays. This D2D platform combines the strengths of the two formats-droplets-in-channel for facile generation of droplets containing single cells, and DMF for on-demand manipulation of droplets including control of different droplet volumes (pL-μL), creation of a dilution series of ionic liquid (IL), and parallel single cell culturing and analysis for IL toxicity screening. This D2D device also allows for automated analysis that includes a feedback-controlled system for merging and splitting of droplets to add reagents, an integrated Peltier element for parallel cell culture at optimum temperature, and an impedance sensing mechanism to control the flow rate for droplet generation and preventing droplet evaporation. Droplet-in-channel is well-suited for encapsulation of single cells as it allows the careful manipulation of flow rates of aqueous phase containing cells and oil to optimize encapsulation. Once single cell containing droplets are generated, they are transferred to a DMF chip via a capillary where they are merged with droplets containing IL and cultured at 30 °C. The DMF chip, in addition to permitting cell culture and reagent (ionic liquid/salt) addition, also allows recovery of individual droplets for off-chip analysis such as further culturing and measurement of ethanol production. The D2D chip was used to evaluate the effect of IL/salt type (four types: NaOAc, NaCl, [C2mim] [OAc], [C2mim] [Cl]) and concentration (four concentrations: 0, 37.5, 75, 150 mM) on the growth kinetics and ethanol production of yeast and as expected, increasing IL concentration led to lower biomass and ethanol production. Specifically, [C2mim] [OAc] had inhibitory effects on yeast growth at concentrations 75 and 150 mM and significantly reduced their ethanol production compared to cells grown in other ILs/salts. The growth curve trends obtained by D2D matched conventional yeast culturing in microtiter wells, validating the D2D platform. We believe that our approach represents a generic platform for multi-step biochemical assays such as drug screening, digital PCR, enzyme assays, immunoassays and cell-based assays.
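The growth-kinetics comparison described above amounts to fitting growth curves to per-droplet readings over time. The sketch below shows a generic logistic fit of that kind with SciPy; the data are synthetic and the model choice is an assumption, not the authors' exact analysis.

```python
# Illustrative only: fitting a logistic growth model to optical-density-like
# readings from droplet cultures at one ionic-liquid concentration.
# The data below are synthetic, not measurements from the D2D device.
import numpy as np
from scipy.optimize import curve_fit

def logistic(t, K, r, t0):
    return K / (1.0 + np.exp(-r * (t - t0)))

rng = np.random.default_rng(0)
t = np.linspace(0, 24, 25)                                   # hours
od = logistic(t, K=1.2, r=0.5, t0=10) + rng.normal(0, 0.02, t.size)

(K, r, t0), _ = curve_fit(logistic, t, od, p0=[1.0, 0.3, 8.0])
print(f"carrying capacity {K:.2f}, growth rate {r:.2f} /h, midpoint {t0:.1f} h")
```

Repeating such a fit for each IL/salt type and concentration would reproduce the kind of growth-curve comparison reported above.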
A Roadmap to Continuous Integration for ATLAS Software Development
NASA Astrophysics Data System (ADS)
Elmsheuser, J.; Krasznahorkay, A.; Obreshkov, E.; Undrus, A.; ATLAS Collaboration
2017-10-01
The ATLAS software infrastructure facilitates the efforts of more than 1000 developers working on a code base of 2200 packages with 4 million lines of C++ and 1.4 million lines of Python code. The ATLAS offline code management system is a powerful, flexible framework for processing new package version requests, probing code changes in the Nightly Build System, migrating to new platforms and compilers, deploying production releases for worldwide access and supporting physicists with tools and interfaces for efficient software use. It maintains a multi-stream, parallel development environment with about 70 multi-platform branches of nightly releases and provides vast opportunities for testing new packages, verifying patches to existing software and migrating to new platforms and compilers. The system's evolution is currently aimed at the adoption of modern continuous integration (CI) practices focused on building nightly releases early and often, with rigorous unit and integration testing. This paper describes the CI incorporation program for the ATLAS software infrastructure. It brings modern open source tools such as Jenkins and GitLab into the ATLAS Nightly System, rationalizes hardware resource allocation and administrative operations, and provides improved feedback and means for developers to fix broken builds promptly. Once adopted, ATLAS CI practices will improve and accelerate innovation cycles and result in increased confidence in new software deployments. The paper reports the status of Jenkins integration with the ATLAS Nightly System as well as short- and long-term plans for the incorporation of CI practices.
Towards a social and context-aware multi-sensor fall detection and risk assessment platform.
De Backere, F; Ongenae, F; Van den Abeele, F; Nelis, J; Bonte, P; Clement, E; Philpott, M; Hoebeke, J; Verstichel, S; Ackaert, A; De Turck, F
2015-09-01
For elderly people, fall incidents are life-changing events that lead to degradation or even loss of autonomy. Current fall detection systems are not integrated and are often associated with undetected falls and/or false alarms. In this paper, a social- and context-aware multi-sensor platform is presented, which integrates information gathered by a plethora of fall detection systems and sensors at the home of the elderly person, using a cloud-based solution built on an ontology. Within the ontology, both static and dynamic information is captured to model the situation of a specific patient and his/her (in)formal caregivers. This integrated contextual information makes it possible to automatically and continuously assess the fall risk of the elderly, to more accurately detect falls and identify false alarms, and to automatically notify the appropriate caregiver, e.g., based on location or their current task. The main advantage of the proposed platform is that multiple fall detection systems and sensors can be integrated, as they can be easily plugged in based on the specific needs of the patient. The combination of several systems and sensors leads to a more reliable system with better accuracy. The proof of concept was tested using the visualizer, which enables better analysis of the data flow within the back-end, and using the portable testbed, which is equipped with several different sensors. Copyright © 2014 Elsevier Ltd. All rights reserved.
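As a rough illustration of how static profile information and dynamic context could be combined into a continuously updated fall-risk estimate and caregiver notification, the sketch below uses a hand-made scoring rule. All factor names, weights and thresholds are invented; the actual platform reasons over an ontology rather than hard-coded rules.

```python
# Hypothetical sketch of combining static profile data and dynamic context into
# a coarse fall-risk score and a caregiver notification; weights and thresholds
# are invented for illustration and are not the platform's rules.
def fall_risk_score(profile, context):
    score = 0.0
    score += 2.0 if profile.get("previous_falls", 0) > 0 else 0.0
    score += 1.5 if profile.get("uses_walking_aid") else 0.0
    score += 1.0 if context.get("night_time") else 0.0
    score += 2.5 if context.get("recent_inactivity_minutes", 0) > 120 else 0.0
    return score

def notify(score, caregivers):
    # pick the available caregiver closest to the patient
    target = min((c for c in caregivers if c["available"]), key=lambda c: c["distance_m"])
    level = "alarm" if score >= 4 else "warning" if score >= 2 else "info"
    return level, target["name"]

profile = {"previous_falls": 1, "uses_walking_aid": True}
context = {"night_time": True, "recent_inactivity_minutes": 30}
caregivers = [{"name": "nurse_A", "available": True, "distance_m": 80},
              {"name": "informal_B", "available": False, "distance_m": 20}]
print(notify(fall_risk_score(profile, context), caregivers))
```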
NASA Astrophysics Data System (ADS)
Baccichet, Nicola; Caillat, Amandine; Rakotonimbahy, Eddy; Dohlen, Kjetil; Savini, Giorgio; Marcos, Michel
2016-08-01
In the framework of the European FP7-FISICA (Far Infrared Space Interferometer Critical Assessment) program, we developed a miniaturized version of the hyper-telescope to demonstrate multi-aperture interferometry on the ground. This setup would ultimately be integrated into a CubeSat platform, thereby providing the first real demonstrator of a multi-aperture Fizeau interferometer in space. In this paper, we describe the optical design of the ground testbed and the data processing pipeline implemented to reconstruct the object image from interferometric data. As a scientific application, we measured the Sun's diameter by fitting a limb-darkening model to our data. Finally, we present the design of a CubeSat platform carrying this miniature Fizeau interferometer, which could be used to monitor the Sun's diameter over a long in-orbit period.
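The Sun-diameter measurement described above reduces to fitting a limb-darkening profile, with the disk radius as a free parameter, to an intensity cut across the reconstructed image. The sketch below fits a simple linear limb-darkening law to synthetic data; the functional form and the numbers are illustrative assumptions, not the FISICA pipeline.

```python
# Illustrative fit of a linear limb-darkening profile I(r) = I0 * (1 - u*(1 - mu)),
# with mu = cos(theta) over the disk and the disk radius R as a free parameter.
# The "measurements" here are synthetic.
import numpy as np
from scipy.optimize import curve_fit

def limb_darkening(r, I0, u, R):
    mu = np.sqrt(np.clip(1.0 - (r / R) ** 2, 0.0, 1.0))
    return np.where(r <= R, I0 * (1.0 - u * (1.0 - mu)), 0.0)

rng = np.random.default_rng(0)
r = np.linspace(0, 1.2, 200)                                   # detector radius, arbitrary units
data = limb_darkening(r, 1.0, 0.6, 1.0) + rng.normal(0, 0.01, r.size)

(I0, u, R), _ = curve_fit(limb_darkening, r, data, p0=[0.9, 0.5, 0.9])
print(f"fitted limb-darkening coefficient u = {u:.2f}, disk radius R = {R:.3f}")
```

The fitted radius, converted through the instrument's plate scale, would give the angular diameter estimate.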
NASA Astrophysics Data System (ADS)
Ding, Kai; Jiang, Ping-Yu
2017-09-01
Currently, little work has been devoted to the mediators and tools for multi-role production interactions in the mass individualization environment. This paper proposes a kind of hardware-software-integrated mediator called social sensors (S2ensors) to facilitate production interactions among customers, manufacturers, and other stakeholders in social manufacturing systems (SMS). The concept, classification, operational logic, and formalization of S2ensors are clarified. S2ensors collect objective data from physical sensors and subjective data from user input in mobile apps, merge them into meaningful information for decision-making, and finally feed the decisions back for reaction and execution. Then, an S2ensors-Cloud platform is discussed to integrate different S2ensors to work for SMSs in an autonomous way. A demonstrative case is studied by developing a prototype system, and the results show that S2ensors and the S2ensors-Cloud platform can help multi-role stakeholders interact and collaborate on production tasks. This reveals the mediator-enabled mechanisms and methods for production interactions among stakeholders in SMS.
Development of a Crosslink Channel Simulator
NASA Technical Reports Server (NTRS)
Hunt, Chris; Smith, Carl; Burns, Rich
2004-01-01
Distributed spacecraft missions are an integral part of current and future plans for NASA and other space agencies. Many of these multi-vehicle missions involve utilizing the array of spacecraft as a single instrument, requiring communication via crosslinks to achieve mission goals. NASA's Goddard Space Flight Center (GSFC) is developing the Formation Flying Test Bed (FFTB) to provide a hardware-in-the-loop simulation environment to support mission concept development and system trades, with a primary focus on Guidance, Navigation, and Control (GN&C) challenges associated with spacecraft formation flying. The goal of the FFTB is to reduce mission risk by assisting in mission planning and analysis and by providing a technology development platform on which algorithms can be developed for mission functions such as precision formation navigation, control, and time synchronization. The FFTB will provide a medium in which the various crosslink transponders used in multi-vehicle missions can be integrated for development and test; an integral part of the FFTB is the Crosslink Channel Simulator (CCS). The CCS is placed into the communications channel between the crosslinks under test and is used to simulate on-mission effects on the communications channel, such as vehicle maneuvers, relative vehicle motion, or antenna misalignment. The CCS is based on the Starlight software-programmable platform developed at General Dynamics Decision Systems, which gives the CCS the ability to be modified on the fly to adapt to new crosslink formats or mission parameters. This paper briefly describes the Formation Flying Test Bed and its potential uses. It then provides details on the current and future development of the Crosslink Channel Simulator and its capabilities.
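Two of the on-mission effects mentioned, relative vehicle motion and the resulting propagation changes, can be emulated by applying a range-dependent delay and a Doppler shift to the signal passing through the simulator. The sketch below shows that basic operation on a baseband tone; the carrier frequency, sample rate and geometry are illustrative, and this is not the Starlight-based CCS implementation.

```python
# Hedged sketch (not the GSFC CCS): apply the propagation delay and Doppler
# shift implied by inter-vehicle range and range rate to a baseband tone.
import numpy as np

C = 299_792_458.0            # speed of light, m/s
fc = 2.2e9                   # illustrative S-band carrier frequency, Hz
fs = 1.0e5                   # baseband sample rate, Hz

def apply_channel(signal, t, range_m, range_rate_mps):
    delay = range_m / C                              # one-way propagation delay
    doppler = -fc * range_rate_mps / C               # a receding link (positive rate) lowers frequency
    delayed = np.interp(t - delay, t, signal, left=0.0)
    return delayed * np.exp(2j * np.pi * doppler * t)

t = np.arange(0, 0.01, 1 / fs)
tone = np.cos(2 * np.pi * 1e3 * t)                   # 1 kHz test tone
rx = apply_channel(tone, t, range_m=50e3, range_rate_mps=12.0)
print(np.abs(rx[:3]))
```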
DOT National Transportation Integrated Search
2009-12-22
This document presents the University of Michigan Transportation Research Institute's plan to perform analysis of data collected from the light vehicle platform field operational test of the Integrated Vehicle-Based Safety Systems (IVBSS) progr...
Integrated Proteomic Approaches for Understanding Toxicity of Environmental Chemicals
To apply quantitative proteomic analysis to the evaluation of toxicity of environmental chemicals, we have developed an integrated proteomic technology platform. This platform has been applied to the analysis of the toxic effects and pathways of many important environmental chemi...
DOT National Transportation Integrated Search
2009-11-23
This document presents the University of Michigan Transportation Research Institute's plan to perform analysis of data collected from the heavy truck platform field operational test of the Integrated Vehicle-Based Safety Systems (IVBSS) progra...
Hazlehurst, Brian L; Kurtz, Stephen E; Masica, Andrew; Stevens, Victor J; McBurnie, Mary Ann; Puro, Jon E; Vijayadeva, Vinutha; Au, David H; Brannon, Elissa D; Sittig, Dean F
2015-10-01
Comparative effectiveness research (CER) requires the capture and analysis of data from disparate sources, often from a variety of institutions with diverse electronic health record (EHR) implementations. In this paper we describe the CER Hub, a web-based informatics platform for developing and conducting research studies that combine comprehensive electronic clinical data from multiple health care organizations. The CER Hub platform implements a data processing pipeline that employs informatics standards for data representation and web-based tools for developing study-specific data processing applications, providing standardized access to the patient-centric electronic health record (EHR) across organizations. The CER Hub is being used to conduct two CER studies utilizing data from six geographically distributed and demographically diverse health systems. These foundational studies address the effectiveness of medications for controlling asthma and the effectiveness of smoking cessation services delivered in primary care. The CER Hub includes four key capabilities: the ability to process and analyze both free-text and coded clinical data in the EHR; a data processing environment supported by distributed data and study governance processes; a clinical data-interchange format for facilitating standardized extraction of clinical data from EHRs; and a library of shareable clinical data processing applications. CER requires coordinated and scalable methods for extracting, aggregating, and analyzing complex, multi-institutional clinical data. By offering a range of informatics tools integrated into a framework for conducting studies using EHR data, the CER Hub provides a solution to the challenges of multi-institutional research using electronic medical record data. Copyright © 2015. Published by Elsevier Ireland Ltd.
Tunable quantum interference in a 3D integrated circuit.
Chaboyer, Zachary; Meany, Thomas; Helt, L G; Withford, Michael J; Steel, M J
2015-04-27
Integrated photonics promises solutions to questions of stability, complexity, and size in quantum optics. Advances in tunable and non-planar integrated platforms, such as laser-inscribed photonics, continue to bring the realisation of quantum advantages in computation and metrology ever closer, perhaps most easily seen in multi-path interferometry. Here we demonstrate control of two-photon interference in a chip-scale 3D multi-path interferometer, showing a reduced periodicity and enhanced visibility compared to single photon measurements. Observed non-classical visibilities are widely tunable, and explained well by theoretical predictions based on classical measurements. With these predictions we extract Fisher information approaching a theoretical maximum. Our results open a path to quantum enhanced phase measurements.
Multifunctional millimeter-wave radar system for helicopter safety
NASA Astrophysics Data System (ADS)
Goshi, Darren S.; Case, Timothy J.; McKitterick, John B.; Bui, Long Q.
2012-06-01
A multi-featured sensor solution has been developed that enhances the operational safety and functionality of small airborne platforms, representing an invaluable stride toward enabling higher-risk, tactical missions. This paper demonstrates results from a recently developed multi-functional sensor system that integrates a high performance millimeter-wave radar front end, an evidence grid-based integration processing scheme, and the incorporation into a 3D Synthetic Vision System (SVS) display. The front end architecture consists of a w-band real-beam scanning radar that generates a high resolution real-time radar map and operates with an adaptable antenna architecture currently configured with an interferometric capability for target height estimation. The raw sensor data is further processed within an evidence grid-based integration functionality that results in high-resolution maps in the region surrounding the platform. Lastly, the accumulated radar results are displayed in a fully rendered 3D SVS environment integrated with local database information to provide the best representation of the surrounding environment. The integrated system concept will be discussed and initial results from an experimental flight test of this developmental system will be presented. Specifically, the forward-looking operation of the system demonstrates the system's ability to produce high precision terrain mapping with obstacle detection and avoidance capability, showcasing the system's versatility in a true operational environment.
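Evidence-grid integration typically accumulates detections per map cell, often in a log-odds form so that repeated returns strengthen, and misses weaken, the occupancy estimate. The sketch below shows that accumulation pattern in the abstract; the grid size, increments and geometry are invented and do not reflect the radar processing chain described in the paper.

```python
# Illustrative evidence-grid accumulation in the log-odds domain: each radar
# return nudges a cell's evidence up, each clear line of sight nudges it down.
# Grid geometry and increments are invented, not the paper's parameters.
import numpy as np

GRID = np.zeros((200, 200))          # log-odds of "occupied" per 1 m cell
L_HIT, L_MISS = 0.85, -0.4

def update_cell(grid, row, col, hit):
    grid[row, col] += L_HIT if hit else L_MISS

def occupancy_probability(grid):
    return 1.0 / (1.0 + np.exp(-grid))

# three detections and one miss on the same cell accumulate evidence
for observed in (True, True, False, True):
    update_cell(GRID, 120, 80, observed)
print(round(occupancy_probability(GRID)[120, 80], 3))
```

In a full system the same update would be applied along each radar beam, and the resulting probability map would feed the obstacle detection and 3D SVS display described above.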
A multi-landing pad DNA integration platform for mammalian cell engineering
Gaidukov, Leonid; Wroblewska, Liliana; Teague, Brian; Nelson, Tom; Zhang, Xin; Liu, Yan; Jagtap, Kalpana; Mamo, Selamawit; Tseng, Wen Allen; Lowe, Alexis; Das, Jishnu; Bandara, Kalpanie; Baijuraj, Swetha; Summers, Nevin M; Zhang, Lin; Weiss, Ron
2018-01-01
Abstract Engineering mammalian cell lines that stably express many transgenes requires the precise insertion of large amounts of heterologous DNA into well-characterized genomic loci, but current methods are limited. To facilitate reliable large-scale engineering of CHO cells, we identified 21 novel genomic sites that supported stable long-term expression of transgenes, and then constructed cell lines containing one, two or three ‘landing pad’ recombination sites at selected loci. By using a highly efficient BxB1 recombinase along with different selection markers at each site, we directed recombinase-mediated insertion of heterologous DNA to selected sites, including targeting all three with a single transfection. We used this method to controllably integrate up to nine copies of a monoclonal antibody, representing about 100 kb of heterologous DNA in 21 transcriptional units. Because the integration was targeted to pre-validated loci, recombinant protein expression remained stable for weeks and additional copies of the antibody cassette in the integrated payload resulted in a linear increase in antibody expression. Overall, this multi-copy site-specific integration platform allows for controllable and reproducible insertion of large amounts of DNA into stable genomic sites, which has broad applications for mammalian synthetic biology, recombinant protein production and biomanufacturing. PMID:29617873
Liabsuetrakul, Tippawan; Prappre, Tagoon; Pairot, Pakamas; Oumudee, Nurlisa; Islam, Monir
2017-06-01
Surveillance systems are yet to be integrated with health information systems for improving the health of pregnant mothers and their newborns, particularly in developing countries. This study aimed to develop a web-based epidemiological surveillance system for maternal and newborn health with integration of action-oriented responses and automatic data analysis with results presentations and to assess the system acceptance by nurses and doctors involved in various hospitals in southern Thailand. Freeware software and scripting languages were used. The system can be run on different platforms, and it is accessible via various electronic devices. Automatic data analysis with results presentations in the forms of graphs, tables and maps was part of the system. A multi-level security system was incorporated into the program. Most doctors and nurses involved in the study felt the system was easy to use and useful. This system can be integrated into country routine reporting system for monitoring maternal and newborn health and survival.
MeDICi Software Superglue for Data Analysis Pipelines
Ian Gorton
2017-12-09
The Middleware for Data-Intensive Computing (MeDICi) Integration Framework is an integrated middleware platform developed to solve the data analysis and processing needs of scientists across many domains. MeDICi is scalable, easily modified, supports multiple languages, protocols, and hardware platforms, and is in use today by PNNL scientists for bioinformatics, power grid failure analysis, and text analysis.
Cloud-based image sharing network for collaborative imaging diagnosis and consultation
NASA Astrophysics Data System (ADS)
Yang, Yuanyuan; Gu, Yiping; Wang, Mingqing; Sun, Jianyong; Li, Ming; Zhang, Weiqiang; Zhang, Jianguo
2018-03-01
In this presentation, we present a new approach to designing a cloud-based image sharing network for collaborative imaging diagnosis and consultation over the Internet, which enables radiologists, specialists and physicians located at different sites to collaboratively and interactively perform imaging diagnosis or consultation for difficult or emergency cases. The designed network combines a regional RIS, grid-based image distribution management, an integrated video conferencing system and multi-platform interactive image display devices, together with secured messaging and data communication. There are three kinds of components in the network: edge servers, a grid-based imaging document registry and repository, and multi-platform display devices. This network has been deployed on a public cloud platform of Alibaba through the Internet since March 2017 and used for small lung nodule and early-stage lung cancer diagnosis services between the radiology departments of Huadong Hospital in Shanghai and the First Hospital of Jiaxing in Zhejiang Province.
Biomechanical ToolKit: Open-source framework to visualize and process biomechanical data.
Barre, Arnaud; Armand, Stéphane
2014-04-01
The C3D file format is widely used in the biomechanical field by companies and laboratories to store data from motion capture systems. However, few software packages can visualize and modify the entirety of the data in a C3D file. Our objective was to develop an open-source and multi-platform framework to read, write, modify and visualize data from any motion analysis system using the standard (C3D) and proprietary file formats (used by many companies producing motion capture systems). The Biomechanical ToolKit (BTK) was developed to provide cost-effective and efficient tools for the biomechanical community to easily deal with motion analysis data. A large panel of operations is available to read, modify and process data through a C++ API, bindings for high-level languages (Matlab, Octave, and Python), and a standalone application (Mokka). All these tools are open-source and cross-platform and run on all major operating systems (Windows, Linux, MacOS X). Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
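A typical first interaction with BTK from Python is to load a C3D acquisition and pull out a marker trajectory. The sketch below follows the reader/acquisition pattern of the Python binding; the file name and the marker label "LASI" are placeholders, and the exact call names should be checked against the BTK documentation.

```python
# Hedged sketch of reading marker data from a C3D file with BTK's Python
# bindings; file name and marker label are placeholders.
import btk

reader = btk.btkAcquisitionFileReader()
reader.SetFilename("walk01.c3d")               # any C3D exported by a motion-capture system
reader.Update()
acq = reader.GetOutput()

print("point rate (Hz):", acq.GetPointFrequency())
left_asis = acq.GetPoint("LASI").GetValues()   # (n_frames, 3) array of X, Y, Z coordinates
print("frames:", left_asis.shape[0])
```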
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lundstrom, Blake R.; Palmintier, Bryan S.; Rowe, Daniel
Electric system operators are increasingly concerned with the potential system-wide impacts of the large-scale integration of distributed energy resources (DERs) including voltage control, protection coordination, and equipment wear. This prompts a need for new simulation techniques that can simultaneously capture all the components of these large integrated smart grid systems. This paper describes a novel platform that combines three emerging research areas: power systems co-simulation, power hardware in the loop (PHIL) simulation, and lab-lab links. The platform is distributed, real-time capable, allows for easy internet-based connection from geographically-dispersed participants, and is software platform agnostic. We demonstrate its utility by studying real-time PHIL co-simulation of coordinated solar PV firming control of two inverters connected in multiple electric distribution network models, prototypical of U.S. and Australian systems. Here, the novel trans-pacific closed-loop system simulation was conducted in real-time using a power network simulator and physical PV/battery inverter at power at the National Renewable Energy Laboratory in Golden, CO, USA and a physical PV inverter at power at the Commonwealth Scientific and Industrial Research Organisation's Energy Centre in Newcastle, NSW, Australia. This capability enables smart grid researchers throughout the world to leverage their unique simulation capabilities for multi-site collaborations that can effectively simulate and validate emerging smart grid technology solutions.
NASA Astrophysics Data System (ADS)
Murukeshan, Vadakke M.; Hoong Ta, Lim
2014-11-01
Medical diagnostics has in recent years seen a trend toward dual- and multi-modality imaging for implementing better diagnostic procedures. The changes in tissues in the early disease stages are often subtle and can occur beneath the tissue surface. In most of these cases, conventional optical medical imaging may not be able to detect these changes easily because of its penetration depth on the order of 1 mm. Each imaging modality has its own advantages and limitations, and a single modality is not suitable for every diagnostic application. Therefore the need for multi- or hybrid-modality imaging arises. Combining more than one imaging modality overcomes the limitations of the individual methods and integrates their respective advantages into a single setting. In this context, this paper focuses on the research and development of two multi-modality imaging platforms. The first platform combines ultrasound and photoacoustic imaging for diagnostic applications in the eye. The second platform consists of optical hyperspectral and photoacoustic imaging for diagnostic applications in the colon. Photoacoustic imaging is used as one of the modalities in both platforms as it can offer deeper penetration than optical imaging. The optical engineering and research challenges in developing the dual/multi-modality platforms are discussed, followed by initial results validating the proposed scheme. The proposed schemes offer high spatial and spectral resolution imaging and sensing, and are expected to offer potential biomedical imaging solutions in the near future.
A practical data processing workflow for multi-OMICS projects.
Kohl, Michael; Megger, Dominik A; Trippler, Martin; Meckel, Hagen; Ahrens, Maike; Bracht, Thilo; Weber, Frank; Hoffmann, Andreas-Claudius; Baba, Hideo A; Sitek, Barbara; Schlaak, Jörg F; Meyer, Helmut E; Stephan, Christian; Eisenacher, Martin
2014-01-01
Multi-OMICS approaches aim at the integration of quantitative data obtained for different biological molecules in order to understand their interrelation and the functioning of larger systems. This paper deals with several data integration and data processing issues that frequently occur within this context. To this end, the data processing workflow within the PROFILE project is presented, a multi-OMICS project that aims at the identification of novel biomarkers and the development of new therapeutic targets for seven important liver diseases. Furthermore, a software tool called CrossPlatformCommander is sketched, which facilitates several steps of the proposed workflow in a semi-automatic manner. Application of the software is presented for the detection of novel biomarkers, their ranking and annotation with existing knowledge using the example of corresponding Transcriptomics and Proteomics data sets obtained from patients suffering from hepatocellular carcinoma. Additionally, a linear regression analysis of Transcriptomics vs. Proteomics data is presented and its performance assessed. It was shown that, for capturing profound relations between Transcriptomics and Proteomics data, a simple linear regression analysis is not sufficient, and implementation and evaluation of alternative statistical approaches are needed. Additionally, the integration of multivariate variable selection and classification approaches is intended for further development of the software. Although this paper focuses only on the combination of data obtained from quantitative Proteomics and Transcriptomics experiments, several approaches and data integration steps are also applicable to other OMICS technologies. Keeping specific restrictions in mind, the suggested workflow (or at least parts of it) may be used as a template for similar projects that make use of different high-throughput techniques. This article is part of a Special Issue entitled: Computational Proteomics in the Post-Identification Era. Guest Editors: Martin Eisenacher and Christian Stephan. Copyright © 2013 Elsevier B.V. All rights reserved.
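The Transcriptomics-vs-Proteomics regression mentioned above is, at its simplest, an ordinary least-squares fit between matched abundance vectors. The sketch below shows such a fit on synthetic log-scale data with SciPy; it illustrates the analysis type only, not the PROFILE project's actual data or preprocessing.

```python
# Minimal sketch of a per-gene transcript-vs-protein linear regression on
# synthetic, log-scale abundance values (illustration only).
import numpy as np
from scipy.stats import linregress

rng = np.random.default_rng(0)
log_mrna = rng.normal(8.0, 1.5, 500)                       # log2 transcript abundance
log_protein = 0.4 * log_mrna + rng.normal(0, 1.2, 500)     # weak linear coupling plus noise

fit = linregress(log_mrna, log_protein)
print(f"slope = {fit.slope:.2f}, r^2 = {fit.rvalue ** 2:.2f}")
```

A low r^2 from such a fit is exactly the situation described above, where a simple linear model fails to capture the transcript-protein relationship and alternative statistical approaches are needed.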
An open source platform for multi-scale spatially distributed simulations of microbial ecosystems
DOE Office of Scientific and Technical Information (OSTI.GOV)
Segre, Daniel
2014-08-14
The goal of this project was to develop a tool for facilitating simulation, validation and discovery of multiscale dynamical processes in microbial ecosystems. This led to the development of an open-source software platform for Computation Of Microbial Ecosystems in Time and Space (COMETS). COMETS performs spatially distributed time-dependent flux balance based simulations of microbial metabolism. Our plan involved building the software platform itself, calibrating and testing it through comparison with experimental data, and integrating simulations and experiments to address important open questions on the evolution and dynamics of cross-feeding interactions between microbial species.
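COMETS couples a flux-balance (linear programming) growth calculation in every grid cell with diffusion of biomass and metabolites. The toy sketch below reproduces only the outer simulation pattern, a grid stepped through time with local growth and diffusion, and substitutes a Monod term for the flux-balance step, so it is a simplification for illustration rather than COMETS itself.

```python
# Toy spatially distributed growth/diffusion loop; a Monod term stands in for
# the per-cell flux-balance LP that COMETS actually solves. Numbers are invented.
import numpy as np

N, DT, D = 50, 0.1, 0.05
biomass = np.zeros((N, N)); biomass[N // 2, N // 2] = 1e-3
nutrient = np.full((N, N), 5.0)

def diffuse(field, d):
    # simple 4-neighbour diffusion step with periodic wrap-around
    return field + d * (np.roll(field, 1, 0) + np.roll(field, -1, 0)
                        + np.roll(field, 1, 1) + np.roll(field, -1, 1) - 4 * field)

for _ in range(200):
    growth = 0.5 * nutrient / (nutrient + 1.0) * biomass * DT   # Monod stand-in for FBA
    biomass += growth
    nutrient = np.maximum(nutrient - growth / 0.3, 0.0)         # assumed yield of 0.3
    biomass, nutrient = diffuse(biomass, D), diffuse(nutrient, D)

print("total biomass:", float(biomass.sum()))
```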
Efficient Parallel Engineering Computing on Linux Workstations
NASA Technical Reports Server (NTRS)
Lou, John Z.
2010-01-01
A C software module has been developed that creates lightweight processes (LWPs) dynamically to achieve parallel computing performance in a variety of engineering simulation and analysis applications to support NASA and DoD project tasks. The required interface between the module and the application it supports is simple, minimal and almost completely transparent to the user applications, and it can achieve nearly ideal computing speed-up on multi-CPU engineering workstations of all operating system platforms. The module can be integrated into an existing application (C, C++, Fortran and others) either as part of a compiled module or as a dynamically linked library (DLL).
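The module above is written in C, but the fork-work-join pattern it provides can be illustrated with Python's standard multiprocessing module, as in the sketch below. The chunked workload is a placeholder; this is an analogy, not the NASA module's interface.

```python
# Analogous illustration in Python (the module described above is C): spawn
# lightweight workers, hand each a slice of the work, and join the results.
from multiprocessing import Pool
import os

def simulate_chunk(case_id):
    # stand-in for one slice of an engineering simulation
    return case_id, sum(i * i for i in range(100_000))

if __name__ == "__main__":
    with Pool(processes=os.cpu_count()) as pool:
        results = dict(pool.map(simulate_chunk, range(8)))
    print(len(results), "chunks completed")
```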
Multi-task learning for cross-platform siRNA efficacy prediction: an in-silico study.
Liu, Qi; Xu, Qian; Zheng, Vincent W; Xue, Hong; Cao, Zhiwei; Yang, Qiang
2010-04-10
Gene silencing using exogenous small interfering RNAs (siRNAs) is now a widespread molecular tool for gene functional studies and new-drug target identification. The key step in this technique is to design efficient siRNAs that are incorporated into the RNA-induced silencing complexes (RISC) to bind and interact with the mRNA targets and repress their translation into proteins. Although considerable progress has been made in the computational analysis of siRNA binding efficacy, little joint analysis of different RNAi experiments conducted under different experimental scenarios has been done so far, although such joint analysis is an important issue in cross-platform siRNA efficacy prediction. A collective analysis of RNAi mechanisms for different datasets and experimental conditions can often provide new clues for the design of potent siRNAs. An elegant multi-task learning paradigm for cross-platform siRNA efficacy prediction is proposed. Experimental studies were performed on a large dataset of siRNA sequences which encompasses several RNAi experiments recently conducted by different research groups. By using our multi-task learning method, the synergy among different experiments is exploited and an efficient multi-task predictor for siRNA efficacy prediction is obtained. The 19 most popular biological features for siRNAs were ranked according to their joint importance in multi-task learning. Furthermore, we validated the hypothesis that siRNA binding efficacies on different messenger RNAs (mRNAs) have different conditional distributions, so that multi-task learning can be conducted by viewing tasks at an "mRNA" level rather than at the "experiment" level. This distribution diversity, derived from siRNAs bound to different mRNAs, indicates that the properties of the target mRNA have important implications for siRNA binding efficacy. The knowledge gained from our study provides useful insights on how to analyze various cross-platform RNAi data to uncover their complex mechanisms.
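One common way to realize the multi-task idea, not necessarily the authors' exact formulation, is to fit a model shared across all experiments plus small task-specific corrections, so that tasks with little data borrow strength from the others. The sketch below does this with ridge regression on synthetic feature matrices; the feature count of 19 echoes the ranked feature set mentioned above, but everything else is invented.

```python
# Illustrative shared-plus-task-specific multi-task regression on synthetic
# siRNA-like feature data; not the paper's algorithm.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(1)
w_true = rng.normal(0, 1, 19)                      # 19 sequence/thermodynamic features
tasks = {}
for name in ("experiment_A", "experiment_B", "experiment_C"):
    X = rng.normal(0, 1, (120, 19))
    y = X @ (w_true + rng.normal(0, 0.2, 19)) + rng.normal(0, 0.5, 120)
    tasks[name] = (X, y)

# shared model pools all experiments; per-task models fit only small corrections
shared = Ridge(alpha=1.0).fit(np.vstack([X for X, _ in tasks.values()]),
                              np.concatenate([y for _, y in tasks.values()]))
specific = {name: Ridge(alpha=10.0).fit(X, y - shared.predict(X))
            for name, (X, y) in tasks.items()}

X_a, y_a = tasks["experiment_A"]
pred = shared.predict(X_a) + specific["experiment_A"].predict(X_a)
print("per-task correlation:", round(float(np.corrcoef(pred, y_a)[0, 1]), 2))
```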
NASA Astrophysics Data System (ADS)
Rossetto, Rudy; De Filippis, Giovanna; Borsi, Iacopo; Foglia, Laura; Toegl, Anja; Cannata, Massimiliano; Neumann, Jakob; Vazquez-Sune, Enric; Criollo, Rotman
2017-04-01
In order to achieve sustainable and participatory groundwater management, innovative software built on the integration of numerical models within GIS software is a natural candidate to provide a full characterization of the quantitative and qualitative aspects of ground- and surface-water resources while preserving the temporal and spatial dimensions. The EU H2020 FREEWAT project (FREE and open source software tools for WATer resource management; Rossetto et al., 2015) aims at simplifying the application of EU water-related Directives through an open-source and public-domain, GIS-integrated simulation platform for planning and management of ground- and surface-water resources. The FREEWAT platform makes it possible to simulate the whole hydrological cycle, coupling the power of GIS geo-processing and post-processing tools in spatial data analysis with that of process-based simulation models. This results in a modeling environment where large spatial datasets can be stored, managed and visualized and where several simulation codes (mainly belonging to the USGS MODFLOW family) are integrated to simulate multiple hydrological, hydrochemical or economic processes. So far, the FREEWAT platform is a large plugin for the QGIS desktop GIS software and integrates the following capabilities: • the AkvaGIS module produces plots and statistics for the analysis and interpretation of hydrochemical and hydrogeological data; • the Observation Analysis Tool facilitates the import, analysis and visualization of time-series data and the use of these data to support model construction and calibration; • groundwater flow in the saturated and unsaturated zones can be simulated using MODFLOW-2005 (Harbaugh, 2005); • multi-species advective-dispersive transport in the saturated zone can be simulated using MT3DMS (Zheng & Wang, 1999), and viscosity- and density-dependent flows can further be simulated through SEAWAT (Langevin et al., 2007); • sustainable management of the combined use of ground- and surface-water resources in rural environments is accomplished by the Farm Process module embedded in MODFLOW-OWHM (Hanson et al., 2014), which dynamically integrates crop water demand with supply from ground- and surface-water; • UCODE_2014 (Poeter et al., 2014) is implemented to perform sensitivity analysis and parameter estimation to improve the model fit through an inverse regression method based on the evaluation of an objective function. By creating a common environment among water researchers, professionals, policy makers and implementers, FREEWAT aims at enhancing science-based, participatory and evidence-based decision making in water resource management, hence producing relevant outcomes for policy implementation. Acknowledgements: This paper is presented within the framework of the project FREEWAT, which has received funding from the European Union's HORIZON 2020 research and innovation programme under Grant Agreement n. 642224. References: Hanson R.T., Boyce S.E., Schmid W., Hughes J.D., Mehl S.M., Leake S.A., Maddock T. & Niswonger R.G. (2014) - One-Water Hydrologic Flow Model (MODFLOW-OWHM). U.S. Geological Survey Techniques and Methods 6-A51, 134 p. Harbaugh A.W. (2005) - MODFLOW-2005, the U.S. Geological Survey Modular Ground-Water Model - the Ground-Water Flow Process. U.S. Geological Survey Techniques and Methods 6-A16, 253 p. Langevin C.D., Thorne D.T. Jr., Dausman A.M., Sukop M.C. & Guo W. (2007) - SEAWAT Version 4: A Computer Program for Simulation of Multi-Species Solute and Heat Transport. U.S. Geological Survey Techniques and Methods 6-A22, 39 p. Poeter E.P., Hill M.C., Lu D., Tiedeman C.R. & Mehl S. (2014) - UCODE_2014, with new capabilities to define parameters unique to predictions, calculate weights using simulated values, estimate parameters with SVD, evaluate uncertainty with MCMC, and more. Integrated Groundwater Modeling Center Report GWMI 2014-02. Rossetto R., Borsi I. & Foglia L. (2015) - FREEWAT: FREE and open source software tools for WATer resource management. Rendiconti Online Società Geologica Italiana, 35, 252-255. Zheng C. & Wang P.P. (1999) - MT3DMS, A modular three-dimensional multi-species transport model for simulation of advection, dispersion and chemical reactions of contaminants in groundwater systems. U.S. Army Engineer Research and Development Center Contract Report SERDP-99-1, Vicksburg, MS, 202 pp.
Object-Oriented Multi-Disciplinary Design, Analysis, and Optimization Tool
NASA Technical Reports Server (NTRS)
Pak, Chan-gi
2011-01-01
An Object-Oriented Optimization (O3) tool was developed that leverages existing tools and practices, and allows the easy integration and adoption of new state-of-the-art software. At the heart of the O3 tool is the Central Executive Module (CEM), which can integrate disparate software packages in a cross platform network environment so as to quickly perform optimization and design tasks in a cohesive, streamlined manner. This object-oriented framework can integrate the analysis codes for multiple disciplines instead of relying on one code to perform the analysis for all disciplines. The CEM was written in FORTRAN and the script commands for each performance index were submitted through the use of the FORTRAN Call System command. In this CEM, the user chooses an optimization methodology, defines objective and constraint functions from performance indices, and provides starting and side constraints for continuous as well as discrete design variables. The structural analysis modules such as computations of the structural weight, stress, deflection, buckling, and flutter and divergence speeds have been developed and incorporated into the O3 tool to build an object-oriented Multidisciplinary Design, Analysis, and Optimization (MDAO) tool.
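The pattern described, a central executive that calls separate discipline analyses and feeds an objective and constraints to an optimizer, can be illustrated compactly in Python even though the O3 tool's CEM is written in FORTRAN. The placeholder weight and stress models below are invented; only the orchestration pattern is the point.

```python
# Hedged sketch of the MDAO pattern (not the O3 tool itself): a small central
# executive calls discipline analyses and hands an objective plus a constraint
# to a gradient-based optimizer. Models and numbers are placeholders.
import numpy as np
from scipy.optimize import minimize

def structural_weight(x):        # discipline module 1 (placeholder model)
    thickness, span = x
    return 120.0 * thickness * span

def max_stress(x):               # discipline module 2 (placeholder model)
    thickness, span = x
    return 480.0 * span / max(thickness, 1e-6)

ALLOWABLE_STRESS = 5.0e4

result = minimize(
    structural_weight,
    x0=np.array([0.05, 12.0]),
    bounds=[(0.01, 0.2), (8.0, 20.0)],
    constraints=[{"type": "ineq", "fun": lambda x: ALLOWABLE_STRESS - max_stress(x)}],
    method="SLSQP",
)
print("optimal thickness, span:", result.x, "weight:", round(result.fun, 1))
```

In the actual O3 tool the same roles are filled by the CEM, the performance-index scripts invoked through the FORTRAN Call System command, and the user-selected optimization methodology.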
A Mobile Multi-Agent Information System for Ubiquitous Fetal Monitoring
Su, Chuan-Jun; Chu, Ta-Wei
2014-01-01
Electronic fetal monitoring (EFM) systems integrate many previously separate clinical activities related to fetal monitoring. Promoting the use of ubiquitous fetal monitoring services with real-time status assessments requires a robust information platform equipped with an automatic diagnosis engine. This paper presents the design and development of a mobile multi-agent platform-based open information system (IMAIS) with an automated diagnosis engine to support intensive and distributed ubiquitous fetal monitoring. The automatic diagnosis engine that we developed is capable of analyzing data in both traditional paper-based and digital formats. Issues related to interoperability, scalability, and openness in heterogeneous e-health environments are addressed through the adoption of a FIPA2000 standard compliant agent development platform, the Java Agent Development Environment (JADE). Integrating the IMAIS with light-weight, portable fetal monitor devices allows for continuous long-term monitoring without interfering with a patient's everyday activities and without restricting her mobility. The system architecture can also be applied to broader monitoring scenarios such as elder care and vital sign monitoring. PMID:24452256
Rzeczycki, Phillip; Yoon, Gi Sang; Keswani, Rahul K.; Sud, Sudha; Stringer, Kathleen A.; Rosania, Gus R.
2017-01-01
Following prolonged administration, certain orally bioavailable but poorly soluble small molecule drugs are prone to precipitate out and form crystal-like drug inclusions (CLDIs) within the cells of living organisms. In this research, we present a quantitative multi-parameter imaging platform for measuring the fluorescence and polarization diattenuation signals of cells harboring intracellular CLDIs. To validate the imaging system, the FDA-approved drug clofazimine (CFZ) was used as a model compound. Our results demonstrated that a quantitative multi-parameter microscopy image analysis platform can be used to study drug sequestering macrophages, and to detect the formation of ordered molecular aggregates formed by poorly soluble small molecule drugs in animals. PMID:28270989
NASA Astrophysics Data System (ADS)
Braun, A.; Walter, C. A.; Parvar, K.
2016-12-01
Current platforms for collecting magnetic data include traditional airborne surveys, which offer dense coverage but low resolution, and terrestrial surveys, which offer high resolution but low coverage. Both platforms leave a critical observation gap between the ground surface and approximately 100 m above ground elevation, which can be navigated efficiently by new technologies such as Unmanned Aerial Vehicles (UAVs). Specifically, multi-rotor UAV platforms provide the ability to sense the magnetic field as a full 3-D tensor, which increases the quality of data collected over other current platform types. Payload requirements and target requirements must be balanced to fully exploit the 3-D magnetic tensor. This study outlines the integration of a GEM Systems Cesium Vapour UAV Magnetometer, a Lightware SF-11 Laser Altimeter and a uBlox EVK-7P GPS module with a DJI s900 multi-rotor UAV. The cesium magnetometer is suspended beneath the UAV platform by a cable of varying length. A set of surveys was carried out to optimize the sensor orientation, sensor cable length beneath the UAV and data collection methods of the magnetometer when mounted on the DJI s900. The target for these surveys is a 12-inch steam pipeline located approximately 2 feet below the ground surface. A systematic variation of cable length, sensor orientation and inclination was conducted. The data collected from the UAV magnetometer were compared to a terrestrial survey conducted with the GEM GST-19 proton precession magnetometer at the same elevation, which also served as a reference station. This allowed for a direct comparison between the UAV system and a proven industry standard for magnetic field data collection. The surveys optimized the above parameters by minimizing instrument error and ensuring reliable data acquisition. The results demonstrate that an optimized UAV magnetometer survey can yield industry-standard measurements.
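One routine post-processing step in a comparison of this kind is to remove diurnal variation using the base-station record and then difference the corrected UAV readings against the terrestrial profile. The numpy sketch below illustrates that step only; the field names, sampling and synthetic anomaly are illustrative assumptions, not data from the study.

    import numpy as np

    def diurnal_correct(uav_t, uav_b, base_t, base_b):
        """Remove diurnal drift from UAV readings (nT) using the base-station record."""
        base_at_uav = np.interp(uav_t, base_t, base_b)
        return uav_b - (base_at_uav - base_b[0])   # tie corrections to the first base reading

    def profile_residuals(x_uav, b_uav, x_ground, b_ground):
        """Difference the UAV profile against a terrestrial profile along distance x (m)."""
        return b_uav - np.interp(x_uav, x_ground, b_ground)

    if __name__ == "__main__":
        t = np.linspace(0, 600, 601)                            # survey time, s
        anomaly = 54000 + 50 * np.exp(-((t - 300) / 40) ** 2)   # synthetic pipeline anomaly
        drift = 2 * np.sin(t / 200)                              # synthetic diurnal drift
        corrected = diurnal_correct(t, anomaly + drift, t, 54000 + drift)
        print(np.abs(corrected - anomaly).max())                 # drift removed (~0 nT)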
Ni, Haochen; Rui, Yikang; Wang, Jiechen; Cheng, Liang
2014-01-01
The chemical industry poses a potential security risk to factory personnel and neighboring residents. In order to mitigate prospective damage, a synthetic method must be developed for emergency response. With the development of environmental numerical simulation models, model integration methods, and modern information technology, many Decision Support Systems (DSSs) have been established. However, existing systems still have limitations in terms of synthetic simulation and network interoperation. In order to resolve these limitations, mature simulation models for chemical accidents were integrated into a WEB Geographic Information System (WEBGIS) platform. The complete workflow of the emergency response was achieved, including raw data (meteorology information and accident information) management, numerical simulation of different kinds of accidents, environmental impact assessments, and representation of the simulation results. This allowed comprehensive and real-time simulation of acute accidents in the chemical industry. The main contributions of this paper are an organizational mechanism for the model set based on accident type and pollutant substance; a scheduling mechanism for the parallel processing of multiple accident types, accident substances, and simulation models; and a presentation method for scalar and vector data in the web browser on the integrated WEBGIS platform. The outcomes demonstrated that this method can provide effective support for emergency response decisions in acute chemical accidents. PMID:25198686
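The scheduling mechanism described above dispatches several accident-type/substance/model combinations in parallel. The sketch below illustrates that dispatch pattern with Python's process pool and a simple Gaussian-plume stand-in for a simulation model; the model, scenarios and parameter names are hypothetical and are not taken from the platform.

    from concurrent.futures import ProcessPoolExecutor
    import math

    def gaussian_plume(q, u, x, y):
        """Hypothetical stand-in for one simulation model: ground-level concentration
        (g/m^3) at downwind distance x and crosswind offset y for a release q (g/s)
        in wind speed u (m/s), using rough neutral-stability dispersion coefficients."""
        sigma_y, sigma_z = 0.08 * x, 0.06 * x
        return (q / (math.pi * u * sigma_y * sigma_z)) * math.exp(-y ** 2 / (2 * sigma_y ** 2))

    SCENARIOS = [
        {"accident": "leak", "substance": "chlorine", "q": 50.0, "u": 3.0},
        {"accident": "rupture", "substance": "ammonia", "q": 200.0, "u": 5.0},
    ]

    def run_scenario(s):
        # one (accident type, substance, model) combination
        return {**s, "conc_at_500m": gaussian_plume(s["q"], s["u"], x=500.0, y=0.0)}

    if __name__ == "__main__":
        with ProcessPoolExecutor() as pool:             # parallel scheduling of scenarios
            for result in pool.map(run_scenario, SCENARIOS):
                print(result)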
Cross-scale phenological data integration to benefit resource management and monitoring
Richardson, Andrew D.; Weltzin, Jake F.; Morisette, Jeffrey T.
2017-01-01
Climate change is presenting new challenges for natural resource managers charged with maintaining sustainable ecosystems and landscapes. Phenology, a branch of science dealing with seasonal natural phenomena (bird migration or plant flowering in response to weather changes, for example), bridges the gap between the biosphere and the climate system. Phenological processes operate across scales that span orders of magnitude—from leaf to globe and from days to seasons—making phenology ideally suited to multi-scale, multi-platform data integration and delivery of information at spatial and temporal scales suitable to inform resource management decisions. This workshop report summarizes a workshop held in June 2016 to investigate opportunities and challenges facing multi-scale, multi-platform integration of phenological data to support natural resource management decision-making.
TRIAD: The Translational Research Informatics and Data Management Grid
Payne, P.; Ervin, D.; Dhaval, R.; Borlawsky, T.; Lai, A.
2011-01-01
Objective: Multi-disciplinary and multi-site biomedical research programs frequently require infrastructures capable of enabling the collection, management, analysis, and dissemination of heterogeneous, multi-dimensional, and distributed data and knowledge collections spanning organizational boundaries. We report on the design and initial deployment of an extensible biomedical informatics platform that is intended to address such requirements. Methods: A common approach to distributed data, information, and knowledge management needs in the healthcare and life science settings is the deployment and use of a service-oriented architecture (SOA). Such SOA technologies provide for strongly-typed, semantically annotated, and stateful data and analytical services that can be combined into data and knowledge integration and analysis “pipelines.” Using this overall design pattern, we have implemented and evaluated an extensible SOA platform for clinical and translational science applications known as the Translational Research Informatics and Data-management grid (TRIAD). TRIAD is a derivative and extension of the caGrid middleware and has an emphasis on supporting agile “working interoperability” between data, information, and knowledge resources. Results: Based upon initial verification and validation studies conducted in the context of a collection of driving clinical and translational research problems, we have been able to demonstrate that TRIAD achieves agile “working interoperability” between distributed data and knowledge sources. Conclusion: Informed by our initial verification and validation studies, we believe TRIAD provides an example instance of a lightweight and readily adoptable approach to the use of SOA technologies in the clinical and translational research setting. Furthermore, our initial use cases illustrate the importance and efficacy of enabling “working interoperability” in heterogeneous biomedical environments. PMID:23616879
A Video Game Platform for Exploring Satellite and In-Situ Data Streams
NASA Astrophysics Data System (ADS)
Cai, Y.
2014-12-01
Exploring spatiotemporal patterns of moving objects is essential to Earth Observation missions, such as tracking, modeling and predicting the movement of clouds, dust, plumes and harmful algal blooms. Those missions involve high-volume, multi-source, and multi-modal imagery data analysis. Analytical models aim to reveal the inner structure, dynamics, and relationships of things; however, they are not necessarily intuitive to humans. Conventional scientific visualization methods are intuitive but limited by manual operations, such as area marking, measurement and alignment of multi-source data, which are expensive and time-consuming. A new video analytics platform has been under development that integrates a video game engine with satellite and in-situ data streams. The system converts Earth Observation data into articulated objects that are mapped from a high-dimensional space to a 3D space. Object tracking and augmented reality algorithms highlight the objects' features in colors, shapes and trajectories, creating visual cues for observing dynamic patterns. A head and gesture tracker enables users to navigate the data space interactively. To validate our design, we have used NASA SeaWiFS satellite images of oceanographic remote sensing data and NOAA's in-situ cell count data. Our study demonstrates that the video game system can reduce the size and cost of traditional CAVE systems by two to three orders of magnitude. This system can also be used for satellite mission planning and public outreach.
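Converting gridded observation data into trackable objects commonly starts by thresholding a field, labelling connected features, and following their centroids across frames. The short scipy/numpy sketch below illustrates that generic idea; it is not the platform's game-engine pipeline, and the threshold and synthetic frames are assumptions.

    import numpy as np
    from scipy import ndimage

    def extract_objects(field, threshold):
        """Label connected regions above threshold and return their centroids."""
        mask = field > threshold
        labels, n = ndimage.label(mask)
        return ndimage.center_of_mass(field, labels, range(1, n + 1))

    def nearest_track(prev_centroids, new_centroids):
        """Greedy frame-to-frame association by nearest centroid."""
        pairs = []
        for p in prev_centroids:
            d = [np.hypot(p[0] - c[0], p[1] - c[1]) for c in new_centroids]
            pairs.append((p, new_centroids[int(np.argmin(d))]))
        return pairs

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        frame1 = rng.random((100, 100)); frame1[40:50, 40:50] += 2.0   # synthetic bloom
        frame2 = rng.random((100, 100)); frame2[42:52, 45:55] += 2.0   # bloom drifted
        c1, c2 = extract_objects(frame1, 1.5), extract_objects(frame2, 1.5)
        print(nearest_track(c1, c2))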
High density electronic circuit and process for making
Morgan, William P.
1999-01-01
High density circuits with posts that protrude beyond one surface of a substrate to provide easy mounting of devices such as integrated circuits. The posts also provide stress relief to accommodate differential thermal expansion. The process allows high interconnect density with fewer alignment restrictions and less wasted circuit area than previous processes. The resulting substrates can be test platforms for die testing and for multi-chip module substrate testing. The test platform can contain active components and emulate realistic operational conditions, replacing shorts/opens net testing.
NASA Astrophysics Data System (ADS)
Riahi, Reza; Shaegh, Seyed Ali Mousavi; Ghaderi, Masoumeh; Zhang, Yu Shrike; Shin, Su Ryon; Aleman, Julio; Massa, Solange; Kim, Duckjin; Dokmeci, Mehmet Remzi; Khademhosseini, Ali
2016-04-01
There is an increasing interest in developing microfluidic bioreactors and organs-on-a-chip platforms combined with sensing capabilities for continual monitoring of cell-secreted biomarkers. Conventional approaches such as ELISA and mass spectroscopy cannot satisfy the needs of continual monitoring as they are labor-intensive and not easily integrable with low-volume bioreactors. This paper reports on the development of an automated microfluidic bead-based electrochemical immunosensor for in-line measurement of cell-secreted biomarkers. For the operation of the multi-use immunosensor, disposable magnetic microbeads were used to immobilize biomarker-recognition molecules. Microvalves were further integrated in the microfluidic immunosensor chip to achieve programmable operations of the immunoassay including bead loading and unloading, binding, washing, and electrochemical sensing. The platform allowed convenient integration of the immunosensor with liver-on-chips to carry out continual quantification of biomarkers secreted from hepatocytes. Transferrin and albumin productions were monitored during a 5-day hepatotoxicity assessment in which human primary hepatocytes cultured in the bioreactor were treated with acetaminophen. Taken together, our unique microfluidic immunosensor provides a new platform for in-line detection of biomarkers in low volumes and long-term in vitro assessments of cellular functions in microfluidic bioreactors and organs-on-chips.
NASA Technical Reports Server (NTRS)
Farah, Jeffrey J.
1992-01-01
Developing a robust, task-level error recovery and on-line planning architecture is an open research area. There is previously published work on both error recovery and on-line planning; however, none incorporates error recovery and on-line planning into one integrated platform. The integration of these two functionalities requires an architecture that possesses the following characteristics. The architecture must provide for the inclusion of new information without the destruction of existing information. The architecture must provide for relating pieces of information, old and new, to one another in a non-trivial rather than trivial manner (e.g., object one is related to object two under the following constraints, versus simply: yes, they are related; no, they are not related). Finally, the architecture must be not only a stand-alone architecture but also one that can be easily integrated as a supplement to an existing architecture. This thesis proposal addresses architectural development. Its intent is to integrate error recovery and on-line planning onto a single, integrated, multi-processor platform. This intelligent x-autonomous platform, called the Planning Coordinator, will initially be used to supplement existing x-autonomous systems and eventually replace them.
Grid-wide neuroimaging data federation in the context of the NeuroLOG project
Michel, Franck; Gaignard, Alban; Ahmad, Farooq; Barillot, Christian; Batrancourt, Bénédicte; Dojat, Michel; Gibaud, Bernard; Girard, Pascal; Godard, David; Kassel, Gilles; Lingrand, Diane; Malandain, Grégoire; Montagnat, Johan; Pélégrini-Issac, Mélanie; Pennec, Xavier; Rojas Balderrama, Javier; Wali, Bacem
2010-01-01
Grid technologies are appealing for dealing with the challenges raised by computational neuroscience and for supporting multi-centric brain studies. However, core grid middleware hardly copes with the complexity of neuroimaging data representation and multi-layer data federation needs. Moreover, legacy neuroscience environments need to be preserved and cannot simply be superseded by grid services. This paper describes the NeuroLOG platform design and implementation, shedding light on its Data Management Layer. It addresses the integration of brain image files, associated relational metadata and neuroscience semantic data in a heterogeneous distributed environment, integrating legacy data managers through a mediation layer. PMID:20543431
Multi-platform ’Omics Analysis of Human Ebola Virus Disease Pathogenesis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Eisfeld, Amie J.; Halfmann, Peter J.; Wendler, Jason P.
The pathogenesis of human Ebola virus disease (EVD) is complex. EVD is characterized by high levels of virus replication and dissemination, dysregulated immune responses, extensive virus- and host-mediated tissue damage, and disordered coagulation. To clarify how host responses contribute to EVD pathophysiology, we performed multi-platform ’omics analysis of peripheral blood mononuclear cells and plasma from EVD patients. Our results indicate that EVD molecular signatures overlap with those of sepsis, imply that pancreatic enzymes contribute to tissue damage in fatal EVD, and suggest that Ebola virus infection may induce aberrant neutrophils whose activity could explain hallmarks of fatal EVD. Moreover, integrated biomarker prediction identified putative biomarkers from different data platforms that differentiated survivors and fatalities early after infection. This work reveals insight into EVD pathogenesis, suggests an effective approach for biomarker identification, and provides an important community resource for further analysis of human EVD severity.
Weckwerth, Wolfram; Wienkoop, Stefanie; Hoehenwarter, Wolfgang; Egelhofer, Volker; Sun, Xiaoliang
2014-01-01
Genome sequencing and systems biology are revolutionizing life sciences. Proteomics emerged as a fundamental technique of this novel research area as it is the basis for gene function analysis and modeling of dynamic protein networks. Here a complete proteomics platform suited for functional genomics and systems biology is presented. The strategy includes MAPA (mass accuracy precursor alignment; http://www.univie.ac.at/mosys/software.html ) as a rapid exploratory analysis step; MASS WESTERN for targeted proteomics; COVAIN ( http://www.univie.ac.at/mosys/software.html ) for multivariate statistical analysis, data integration, and data mining; and PROMEX ( http://www.univie.ac.at/mosys/databases.html ) as a database module for proteogenomics and proteotypic peptides for targeted analysis. Moreover, the presented platform can also be utilized to integrate metabolomics and transcriptomics data for the analysis of metabolite-protein-transcript correlations and time course analysis using COVAIN. Examples for the integration of MAPA and MASS WESTERN data, proteogenomic and metabolic modeling approaches for functional genomics, phosphoproteomics by integration of MOAC (metal-oxide affinity chromatography) with MAPA, and the integration of metabolomics, transcriptomics, proteomics, and physiological data using this platform are presented. All software and step-by-step tutorials for data processing and data mining can be downloaded from http://www.univie.ac.at/mosys/software.html.
Rast, Georg; Weber, Jürgen; Disch, Christoph; Schuck, Elmar; Ittrich, Carina; Guth, Brian D
2015-01-01
Human induced pluripotent stem cell-derived cardiomyocytes are available from various sources and they are being evaluated for safety testing. Several platforms are available offering different assay principles and read-out parameters: patch-clamp and field potential recording, imaging or photometry, impedance measurement, and recording of contractile force. Routine use will establish which assay principle and which parameters best serve the intended purpose. We introduce a combination of field potential recording and calcium ratiometry from spontaneously beating cardiomyocytes as a novel assay providing a complementary read-out parameter set. Field potential recording is performed using a commercial multi-well multi-electrode array platform. Calcium ratiometry is performed using a fiber optic illumination and silicon avalanche photodetectors. Data condensation and statistical analysis are designed to enable statistical inference of differences and equivalence with regard to a solvent control. Simultaneous recording of field potentials and calcium transients from spontaneously beating monolayers was done in a nine-well format. Calcium channel blockers (e.g. nifedipine) and a blocker of calcium store release (ryanodine) can be recognized and discriminated based on the calcium transient signal. An agonist of L-type calcium channels, FPL 64176, increased and prolonged the calcium transient, whereas BAY K 8644, another L-type calcium channel agonist, had no effect. Both FPL 64176 and various calcium channel antagonists have chronotropic effects, which can be discriminated from typical "chronotropic" compounds, like (±)isoprenaline (positive) and arecaidine propargyl ester (negative), based on their effects on the calcium transient. Despite technical limitations in temporal resolution and exact matching of composite calcium transient with the field potential of a subset of cells, the combined recording platform enables a refined interpretation of the field potential recording and a more reliable identification of drug effects on calcium handling. Copyright © 2015 Elsevier Inc. All rights reserved.
Integrated microfluidic platforms for investigating neuronal networks
NASA Astrophysics Data System (ADS)
Kim, Hyung Joon
This dissertation describes the development and application of integrated microfluidics-based assay platforms to study neuronal activities in the nervous system in-vitro. The assay platforms were fabricated using soft lithography and micro/nano fabrication, including microfluidics, surface patterning, and nanomaterial synthesis. The use of integrated microfluidics-based assay platforms allows culturing and manipulating many types of neuronal tissues in precisely controlled microenvironments. Furthermore, they provide organized multi-cellular in-vitro models, long-term monitoring with live cell imaging, and compatibility with molecular biology techniques and electrophysiology experiments. In this dissertation, integrated microfluidics-based assay platforms are developed for the investigation of neuronal activities such as local protein synthesis, impairment of axonal transport by chemical/physical variants, growth cone path finding under chemical/physical cues, and synaptic transmission in neuronal circuits. Chapter 1 describes the motivation, objectives, and scope for developing in-vitro platforms to study various neuronal activities. Chapter 2 introduces a microfluidic culture platform for biochemical assays with large-scale neuronal tissues that are utilized as model systems in neuroscience research. Chapter 3 focuses on the investigation of axonal transport impaired by beta-Amyloid and oxidative stress. The platform allows control of neuronal processes and quantification of mitochondrial movement in various regions of axons away from applied drugs. Chapter 4 describes the development of a microfluidics-based growth cone turning assay to elucidate the mechanisms underlying axon guidance under soluble factors and shear flow. Using this platform, the behaviors of growth cones of mammalian neurons are examined under gradients of inhibitory molecules and under shear flow in a well-controlled manner. In Chapter 5, I combine an in-vitro multicellular model with a microfabricated MEA (multielectrode array) or nanowire electrode array to study electrophysiology in neuronal networks. "Diode-like" microgrooves to control the number of neuronal processes are also embedded in this platform. Chapter 6 concludes with possible future directions of this work. Interfacing micro/nanotechnology with primary neuron culture would open many doors in fundamental neuroscience research and biomedical innovation.
The transformative potential of an integrative approach to pregnancy.
Eidem, Haley R; McGary, Kriston L; Capra, John A; Abbot, Patrick; Rokas, Antonis
2017-09-01
Complex traits typically involve diverse biological pathways and are shaped by numerous genetic and environmental factors. Pregnancy-associated traits and pathologies are further complicated by extensive communication across multiple tissues in two individuals, interactions between two genomes-maternal and fetal-that obscure causal variants and lead to genetic conflict, and rapid evolution of pregnancy-associated traits across mammals and in the human lineage. Given the multi-faceted complexity of human pregnancy, integrative approaches that synthesize diverse data types and analyses harbor tremendous promise to identify the genetic architecture and environmental influences underlying pregnancy-associated traits and pathologies. We review current research that addresses the extreme complexities of traits and pathologies associated with human pregnancy. We find that successful efforts to address the many complexities of pregnancy-associated traits and pathologies often harness the power of many and diverse types of data, including genome-wide association studies, evolutionary analyses, multi-tissue transcriptomic profiles, and environmental conditions. We propose that understanding of pregnancy and its pathologies will be accelerated by computational platforms that provide easy access to integrated data and analyses. By simplifying the integration of diverse data, such platforms will provide a comprehensive synthesis that transcends many of the inherent challenges present in studies of pregnancy. Copyright © 2017 Elsevier Ltd. All rights reserved.
Courtney, Jane; Woods, Elena; Scholz, Dimitri; Hall, William W.; Gautier, Virginie W.
2015-01-01
We introduce here MATtrack, an open source MATLAB-based computational platform developed to process multi-Tiff files produced by a photo-conversion time lapse protocol for live cell fluorescent microscopy. MATtrack automatically performs a series of steps required for image processing, including extraction and import of numerical values from Multi-Tiff files, red/green image classification using gating parameters, noise filtering, background extraction, contrast stretching and temporal smoothing. MATtrack also integrates a series of algorithms for quantitative image analysis enabling the construction of mean and standard deviation images, clustering and classification of subcellular regions and injection point approximation. In addition, MATtrack features a simple user interface, which enables monitoring of Fluorescent Signal Intensity in multiple Regions of Interest, over time. The latter encapsulates a region growing method to automatically delineate the contours of Regions of Interest selected by the user, and performs background and regional Average Fluorescence Tracking, and automatic plotting. Finally, MATtrack computes convenient visualization and exploration tools including a migration map, which provides an overview of the protein intracellular trajectories and accumulation areas. In conclusion, MATtrack is an open source MATLAB-based software package tailored to facilitate the analysis and visualization of large data files derived from real-time live cell fluorescent microscopy using photoconvertible proteins. It is flexible, user friendly, compatible with Windows, Mac, and Linux, and a wide range of data acquisition software. MATtrack is freely available for download at eleceng.dit.ie/courtney/MATtrack.zip. PMID:26485569
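The core measurement such a tool automates, a background-corrected mean fluorescence trace for a region of interest followed over time, can be illustrated in a few lines of numpy. The sketch below is a generic illustration under assumed synthetic data, not MATtrack's MATLAB implementation.

    import numpy as np

    def roi_mean_trace(frames, roi_mask, bg_mask):
        """Background-subtracted mean ROI intensity for each frame in an image stack."""
        trace = []
        for frame in frames:
            bg = frame[bg_mask].mean()
            trace.append(frame[roi_mask].mean() - bg)
        return np.array(trace)

    if __name__ == "__main__":
        rng = np.random.default_rng(1)
        stack = rng.normal(100, 5, size=(20, 64, 64))                      # 20 synthetic frames
        stack[:, 20:30, 20:30] += np.linspace(0, 50, 20)[:, None, None]    # rising signal in the ROI
        roi = np.zeros((64, 64), bool); roi[20:30, 20:30] = True
        bg = np.zeros((64, 64), bool);  bg[:5, :5] = True
        print(roi_mean_trace(stack, roi, bg).round(1))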
Raamanathan, Archana; Simmons, Glennon W.; Christodoulides, Nicolaos; Floriano, Pierre N.; Furmaga, Wieslaw B.; Redding, Spencer W.; Lu, Karen H.; Bast, Robert C.; McDevitt, John T.
2013-01-01
Point-of-care (POC) implementation of early detection and screening methodologies for ovarian cancer may enable improved survival rates through early intervention. Current laboratory-confined immunoanalyzers have long turnaround times and are often incompatible with multiplexing and POC implementation. Rapid, sensitive and multiplexable POC diagnostic platforms compatible with promising early detection approaches for ovarian cancer are needed. To this end, we report the adaptation of the programmable bio-nano-chip (p-BNC), an integrated, microfluidic, modular (Programmable) platform for CA125 serum quantitation, a biomarker prominently implicated in multi-modal and multi-marker screening approaches. In the p-BNC, CA125 from diseased sera (Bio) is sequestered and assessed with a fluorescence-based sandwich immunoassay, completed in the nano-nets (Nano) of sensitized agarose microbeads localized in individually addressable wells (Chip), housed in a microfluidic module, capable of integrating multiple sample, reagent and biowaste processing and handling steps. Antibody pairs that bind to distinct epitopes on CA125 were screened. To permit efficient biomarker sequestration in a 3-D microfluidic environment, the p-BNC operating variables (incubation times, flow rates and reagent concentrations) were tuned to deliver optimal analytical performance under 45 minutes. With short analysis times, competitive analytical performance (Inter- and intra-assay precision of 1.2% and 1.9% and LODs of 1.0 U/mL) was achieved on this mini-sensor ensemble. Further validation with sera of ovarian cancer patients (n=20) demonstrated excellent correlation (R2 = 0.97) with gold-standard ELISA. Building on the integration capabilities of novel microfluidic systems programmed for ovarian cancer, the rapid, precise and sensitive miniaturized p-BNC system shows strong promise for ovarian cancer diagnostics. PMID:22490510
Histogram analysis for smartphone-based rapid hematocrit determination
Jalal, Uddin M.; Kim, Sang C.; Shim, Joon S.
2017-01-01
A novel and rapid histogram-based analysis technique has been proposed for the colorimetric quantification of blood hematocrit. A smartphone-based “Histogram” app for hematocrit detection has been developed, integrating the smartphone's embedded camera with a microfluidic chip via a custom-made optical platform. The developed histogram analysis is effective for automatic detection of the sample channel, including auto-calibration, and can analyze single-channel as well as multi-channel images. Furthermore, the analysis method is advantageous for quantifying blood hematocrit under both uniform and varying optical conditions. Rapid determination of blood hematocrit carries substantial information regarding physiological disorders, and the use of such reproducible, cost-effective, and standardized techniques may effectively help with the diagnosis and prevention of a number of human diseases. PMID:28717569
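Since hematocrit is essentially the packed-cell fraction of the blood column, a histogram-based reading can be sketched as: build an intensity profile along the channel, split it into packed-cell and plasma modes, and report the dark fraction. The example below is a generic illustration of that idea with synthetic data, not the published app's algorithm.

    import numpy as np

    def hematocrit_from_column(gray_column, cell_thresh=None):
        """Estimate hematocrit (%) from a grayscale image of a capillary column.

        gray_column: 2-D array, rows along the channel axis; packed cells are darker.
        """
        profile = gray_column.mean(axis=1)              # mean intensity per row
        if cell_thresh is None:
            # crude two-mode split: midpoint between the lower and upper histogram modes
            hist, edges = np.histogram(profile, bins=64)
            cell_thresh = 0.5 * (edges[np.argmax(hist[:32])] + edges[32 + np.argmax(hist[32:])])
        packed = (profile < cell_thresh).sum()
        return 100.0 * packed / profile.size

    if __name__ == "__main__":
        rng = np.random.default_rng(2)
        plasma = rng.normal(200, 5, size=(55, 20))      # bright plasma region
        cells  = rng.normal( 80, 5, size=(45, 20))      # dark packed-cell region
        column = np.vstack([plasma, cells])
        print(round(hematocrit_from_column(column), 1))  # ~45.0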
A Privacy-Preserving Platform for User-Centric Quantitative Benchmarking
NASA Astrophysics Data System (ADS)
Herrmann, Dominik; Scheuer, Florian; Feustel, Philipp; Nowey, Thomas; Federrath, Hannes
We propose a centralised platform for quantitative benchmarking of key performance indicators (KPI) among mutually distrustful organisations. Our platform offers users the opportunity to request an ad-hoc benchmarking for a specific KPI within a peer group of their choice. Architecture and protocol are designed to provide anonymity to its users and to hide the sensitive KPI values from other clients and the central server. To this end, we integrate user-centric peer group formation, exchangeable secure multi-party computation protocols, short-lived ephemeral key pairs as pseudonyms, and attribute certificates. We show by empirical evaluation of a prototype that the performance is acceptable for reasonably sized peer groups.
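The sensitive step in such benchmarking is computing an aggregate, for example the peer-group mean of a KPI, without any single party or the server seeing individual values. Additive secret sharing is one exchangeable building block for this; the toy sketch below shows the principle only and omits the pseudonym, certificate and peer-group machinery described in the paper.

    import secrets

    P = 2 ** 61 - 1   # prime modulus; KPI values are assumed scaled to integers

    def share(value, n_parties):
        """Split an integer KPI value into n additive shares modulo P."""
        shares = [secrets.randbelow(P) for _ in range(n_parties - 1)]
        shares.append((value - sum(shares)) % P)
        return shares

    def benchmark_mean(kpis):
        """Each party splits its KPI; only per-party share sums are ever revealed."""
        n = len(kpis)
        all_shares = [share(int(v), n) for v in kpis]            # party i -> shares for all parties
        mixed = [sum(all_shares[i][j] for i in range(n)) % P     # party j sums what it received
                 for j in range(n)]
        total = sum(mixed) % P                                   # aggregation at the server
        return total / n

    if __name__ == "__main__":
        print(benchmark_mean([120, 95, 143, 101]))   # 114.75, without revealing any single input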
GISpark: A Geospatial Distributed Computing Platform for Spatiotemporal Big Data
NASA Astrophysics Data System (ADS)
Wang, S.; Zhong, E.; Wang, E.; Zhong, Y.; Cai, W.; Li, S.; Gao, S.
2016-12-01
Geospatial data are growing exponentially because of the proliferation of cost-effective and ubiquitous positioning technologies such as global remote-sensing satellites and location-based devices. Analyzing large amounts of geospatial data can provide great value for both industrial and scientific applications. The data- and compute-intensive characteristics inherent in geospatial big data increasingly pose great challenges to data storage, computing and analysis technologies. Such challenges require a scalable and efficient architecture that can store, query, analyze, and visualize large-scale spatiotemporal data. Therefore, we developed GISpark, a geospatial distributed computing platform for processing large-scale vector, raster and stream data. GISpark is built on the latest virtualized computing infrastructures and distributed computing architectures. OpenStack and Docker are used to build a multi-user hosted cloud computing infrastructure for GISpark. Virtual storage systems such as HDFS, Ceph and MongoDB are combined and adopted for spatiotemporal data storage management. A Spark-based algorithm framework is developed for efficient parallel computing. Within this framework, SuperMap GIScript and various open-source GIS libraries can be integrated into GISpark. GISpark can also be integrated with scientific computing environments (e.g., Anaconda), interactive computing web applications (e.g., Jupyter notebook), and machine learning tools (e.g., TensorFlow/Orange). The associated geospatial facilities of GISpark, in conjunction with the scientific computing environment, exploratory spatial data analysis tools, and temporal data management and analysis systems, make up a powerful geospatial computing tool. GISpark not only provides spatiotemporal big data processing capacity in the geospatial field, but also provides spatiotemporal computational models and advanced geospatial visualization tools that deal with other domains having a spatial dimension. We tested the performance of the platform with a taxi trajectory analysis. The results suggest that GISpark achieves excellent runtime performance in spatiotemporal big data applications.
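A minimal PySpark sketch of the kind of spatiotemporal aggregation used in such a taxi-trajectory test, counting GPS points per grid cell, is given below; the input path and CSV layout are assumptions for illustration and are not GISpark code.

    from pyspark.sql import SparkSession

    CELL = 0.01   # grid cell size in degrees (roughly 1 km)

    def cell_of(lon, lat):
        """Map a GPS point to a grid-cell key."""
        return (int(lon // CELL), int(lat // CELL))

    if __name__ == "__main__":
        spark = SparkSession.builder.appName("trajectory-grid").getOrCreate()
        # Assumed CSV layout: taxi_id, timestamp, lon, lat
        points = (spark.sparkContext.textFile("hdfs:///data/taxi_tracks.csv")
                  .map(lambda line: line.split(","))
                  .filter(lambda f: len(f) == 4)
                  .map(lambda f: (cell_of(float(f[2]), float(f[3])), 1)))
        densities = points.reduceByKey(lambda a, b: a + b)        # visits per cell
        for cell, count in densities.top(10, key=lambda kv: kv[1]):
            print(cell, count)
        spark.stop()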
XML-based scripting of multimodality image presentations in multidisciplinary clinical conferences
NASA Astrophysics Data System (ADS)
Ratib, Osman M.; Allada, Vivekanand; Dahlbom, Magdalena; Marcus, Phillip; Fine, Ian; Lapstra, Lorelle
2002-05-01
We developed multi-modality image presentation software for display and analysis of images and related data from different imaging modalities. The software is part of a cardiac image review and presentation platform that supports integration of digital images and data from digital and analog media such as videotapes, analog x-ray films and 35 mm cine films. The software supports standard DICOM image files as well as AVI and PDF data formats. The system is integrated into a digital conferencing room that includes projections of digital and analog sources, remote videoconferencing capabilities, and an electronic whiteboard. The goals of this pilot project are to: 1) develop a new paradigm for image and data management for presentation in a clinically meaningful sequence adapted to case-specific scenarios, 2) design and implement a multi-modality review and conferencing workstation using component technology and a customizable 'plug-in' architecture to support complex review and diagnostic tasks applicable to all cardiac imaging modalities, and 3) develop an XML-based scripting model of image and data presentation for clinical review and decision making during routine clinical tasks and multidisciplinary clinical conferences.
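A presentation script of the kind described can be pictured as a small XML document listing ordered presentation steps with their modality and source. The snippet below is a hypothetical illustration parsed with Python's ElementTree; the element and attribute names are invented for the example and are not the authors' schema.

    import xml.etree.ElementTree as ET

    SCRIPT = """
    <presentation case="cardiac-001">
      <step order="1" modality="echo"  source="echo_4ch.dcm"  note="baseline 4-chamber view"/>
      <step order="2" modality="cine"  source="cath_lao.avi"  note="LAO cine loop"/>
      <step order="3" modality="spect" source="perfusion.dcm" note="stress perfusion"/>
    </presentation>
    """

    def load_steps(xml_text):
        """Return (modality, source, note) tuples in the scripted presentation order."""
        root = ET.fromstring(xml_text)
        steps = sorted(root.findall("step"), key=lambda s: int(s.get("order")))
        return [(s.get("modality"), s.get("source"), s.get("note")) for s in steps]

    if __name__ == "__main__":
        for modality, source, note in load_steps(SCRIPT):
            print(f"{modality:6s} {source:15s} {note}")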
Yoshida, Catherine E; Kruczkiewicz, Peter; Laing, Chad R; Lingohr, Erika J; Gannon, Victor P J; Nash, John H E; Taboada, Eduardo N
2016-01-01
For nearly 100 years serotyping has been the gold standard for the identification of Salmonella serovars. Despite the increasing adoption of DNA-based subtyping approaches, serotype information remains a cornerstone in food safety and public health activities aimed at reducing the burden of salmonellosis. At the same time, recent advances in whole-genome sequencing (WGS) promise to revolutionize our ability to perform advanced pathogen characterization in support of improved source attribution and outbreak analysis. We present the Salmonella In Silico Typing Resource (SISTR), a bioinformatics platform for rapidly performing simultaneous in silico analyses for several leading subtyping methods on draft Salmonella genome assemblies. In addition to performing serovar prediction by genoserotyping, this resource integrates sequence-based typing analyses for: Multi-Locus Sequence Typing (MLST), ribosomal MLST (rMLST), and core genome MLST (cgMLST). We show how phylogenetic context from cgMLST analysis can supplement the genoserotyping analysis and increase the accuracy of in silico serovar prediction to over 94.6% on a dataset comprised of 4,188 finished genomes and WGS draft assemblies. In addition to allowing analysis of user-uploaded whole-genome assemblies, the SISTR platform incorporates a database comprising over 4,000 publicly available genomes, allowing users to place their isolates in a broader phylogenetic and epidemiological context. The resource incorporates several metadata driven visualizations to examine the phylogenetic, geospatial and temporal distribution of genome-sequenced isolates. As sequencing of Salmonella isolates at public health laboratories around the world becomes increasingly common, rapid in silico analysis of minimally processed draft genome assemblies provides a powerful approach for molecular epidemiology in support of public health investigations. Moreover, this type of integrated analysis using multiple sequence-based methods of sub-typing allows for continuity with historical serotyping data as we transition towards the increasing adoption of genomic analyses in epidemiology. The SISTR platform is freely available on the web at https://lfz.corefacility.ca/sistr-app/.
Registration of Laser Scanning Point Clouds: A Review.
Cheng, Liang; Chen, Song; Liu, Xiaoqiang; Xu, Hao; Wu, Yang; Li, Manchun; Chen, Yanming
2018-05-21
The integration of multi-platform, multi-angle, and multi-temporal LiDAR data has become important for geospatial data applications. This paper presents a comprehensive review of LiDAR data registration in the fields of photogrammetry and remote sensing. At present, a coarse-to-fine registration strategy is commonly used for LiDAR point clouds registration. The coarse registration method is first used to achieve a good initial position, based on which registration is then refined utilizing the fine registration method. According to the coarse-to-fine framework, this paper reviews current registration methods and their methodologies, and identifies important differences between them. The lack of standard data and unified evaluation systems is identified as a factor limiting objective comparison of different methods. The paper also describes the most commonly-used point cloud registration error analysis methods. Finally, avenues for future work on LiDAR data registration in terms of applications, data, and technology are discussed. In particular, there is a need to address registration of multi-angle and multi-scale data from various newly available types of LiDAR hardware, which will play an important role in diverse applications such as forest resource surveys, urban energy use, cultural heritage protection, and unmanned vehicles.
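In its simplest form, the coarse-to-fine strategy reviewed here reduces to a coarse centroid alignment followed by iterative closest point refinement, in which nearest-neighbour correspondences feed a closed-form SVD estimate of rotation and translation. The numpy sketch below illustrates that generic loop and is not any specific method from the review.

    import numpy as np

    def best_rigid(src, dst):
        """Closed-form (Kabsch) rotation R and translation t mapping src onto dst."""
        cs, cd = src.mean(0), dst.mean(0)
        H = (src - cs).T @ (dst - cd)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:              # avoid reflections
            Vt[-1] *= -1
            R = Vt.T @ U.T
        return R, cd - R @ cs

    def icp(src, dst, iters=20):
        """Coarse centroid alignment followed by fine ICP refinement."""
        cur = src + (dst.mean(0) - src.mean(0))          # coarse step
        for _ in range(iters):
            d = np.linalg.norm(cur[:, None, :] - dst[None, :, :], axis=2)
            matched = dst[d.argmin(axis=1)]              # nearest-neighbour correspondences
            R, t = best_rigid(cur, matched)
            cur = cur @ R.T + t
        return cur

    if __name__ == "__main__":
        rng = np.random.default_rng(3)
        dst = rng.random((200, 3))
        theta = 0.3
        Rz = np.array([[np.cos(theta), -np.sin(theta), 0],
                       [np.sin(theta),  np.cos(theta), 0],
                       [0, 0, 1]])
        src = dst @ Rz.T + np.array([0.5, -0.2, 0.1])    # rotated and shifted copy
        aligned = icp(src, dst)
        print(np.abs(aligned - dst).max())               # residual after refinement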
Hall, Jacqueline A; Brown, Robert
2013-09-27
The integration of molecular information in clinical decision making is becoming a reality. These changes are shaping the way clinical research is conducted, and as reality sets in, the challenges in conducting, managing and organising multi-disciplinary research become apparent. Clinical trials provide a platform to conduct translational research (TR) within the context of high quality clinical data accrual. Integrating TR objectives in trials allows the execution of pivotal studies that provide clinical evidence for biomarker-driven treatment strategies, targeting early drug development trials to a homogeneous and well defined patient population, supports the development of companion diagnostics and provides an opportunity for deepening our understanding of cancer biology and mechanisms of drug action. To achieve these goals within a clinical trial, developing translational research infrastructure and capabilities (TRIC) plays a critical catalytic role for translating preclinical data into successful clinical research and development. TRIC represents a technical platform, dedicated resources and access to expertise promoting high quality standards, logistical and operational support and unified streamlined procedures under an appropriate governance framework. TRIC promotes integration of multiple disciplines including biobanking, laboratory analysis, molecular data, informatics, statistical analysis and dissemination of results which are all required for successful TR projects and scientific progress. Such a supporting infrastructure is absolutely essential in order to promote high quality robust research, avoid duplication and coordinate resources. Lack of such infrastructure, we would argue, is one reason for the limited effect of TR in clinical practice beyond clinical trials.
Space Science Cloud: a Virtual Space Science Research Platform Based on Cloud Model
NASA Astrophysics Data System (ADS)
Hu, Xiaoyan; Tong, Jizhou; Zou, Ziming
Through independent and cooperative science missions, the Strategic Pioneer Program (SPP) on Space Science, the new space science initiative in China approved by CAS and implemented by the National Space Science Center (NSSC), is dedicated to seeking new discoveries and breakthroughs in space science and thus deepening the understanding of the universe and planet Earth. In the framework of this program, in order to support the operations of space science missions and satisfy the e-Science demands of related research activities, NSSC is developing a virtual space science research platform based on a cloud model, namely the Space Science Cloud (SSC). To support mission demonstration, SSC integrates an interactive satellite orbit design tool, a satellite structure and payload layout design tool, a payload observation coverage analysis tool, and other tools to help scientists analyze and verify space science mission designs. Another important function of SSC is supporting mission operations, which run through the space satellite data pipelines. Mission operators can acquire and process observation data, then distribute the data products to other systems or publish the data and archives with the services of SSC. In addition, SSC provides useful data, tools and models for space researchers. Several databases in the field of space science are integrated, and an efficient retrieval system is under development. Common tools for data visualization, deep processing (e.g., smoothing and filtering tools), analysis (e.g., an FFT analysis tool and a minimum variance analysis tool) and mining (e.g., a proton event correlation analysis tool) are also integrated to help researchers better utilize the data. The space weather models on SSC include a magnetic storm forecast model, a multi-station middle and upper atmospheric climate model, a solar energetic particle propagation model and so on. All the above-mentioned services are based on the e-Science infrastructures of CAS, e.g., cloud storage and cloud computing, and SSC provides its users with self-service storage and computing resources. At present, the prototyping of SSC is underway and the platform is expected to be put into trial operation in August 2014. We hope that as SSC develops, our vision of Digital Space may come true someday.
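Of the analysis tools listed, minimum variance analysis has a compact standard formulation: diagonalize the covariance matrix of the measured magnetic field vectors, and take the eigenvector of the smallest eigenvalue as the estimated normal of the boundary crossed. The numpy sketch below shows that computation generically; it is not SSC's implementation and the synthetic field is an assumption.

    import numpy as np

    def minimum_variance(b):
        """Minimum variance analysis of a magnetic field time series.

        b: (N, 3) array of field vectors. Returns eigenvalues (ascending) and
        eigenvectors as columns; the first column is the minimum-variance
        direction, used as the estimated boundary normal.
        """
        m = np.einsum("ni,nj->ij", b, b) / len(b) - np.outer(b.mean(0), b.mean(0))
        vals, vecs = np.linalg.eigh(m)        # eigh returns eigenvalues in ascending order
        return vals, vecs

    if __name__ == "__main__":
        rng = np.random.default_rng(4)
        t = np.linspace(0, np.pi, 400)
        b = np.column_stack([5 * np.sin(t),                # rotating tangential components
                             5 * np.cos(t),
                             rng.normal(0, 0.1, 400)])      # small variance along the normal
        vals, vecs = minimum_variance(b)
        print(vals.round(3), vecs[:, 0].round(3))           # minimum-variance direction ~ (0, 0, ±1)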
The Generation Challenge Programme Platform: Semantic Standards and Workbench for Crop Science
Bruskiewich, Richard; Senger, Martin; Davenport, Guy; Ruiz, Manuel; Rouard, Mathieu; Hazekamp, Tom; Takeya, Masaru; Doi, Koji; Satoh, Kouji; Costa, Marcos; Simon, Reinhard; Balaji, Jayashree; Akintunde, Akinnola; Mauleon, Ramil; Wanchana, Samart; Shah, Trushar; Anacleto, Mylah; Portugal, Arllet; Ulat, Victor Jun; Thongjuea, Supat; Braak, Kyle; Ritter, Sebastian; Dereeper, Alexis; Skofic, Milko; Rojas, Edwin; Martins, Natalia; Pappas, Georgios; Alamban, Ryan; Almodiel, Roque; Barboza, Lord Hendrix; Detras, Jeffrey; Manansala, Kevin; Mendoza, Michael Jonathan; Morales, Jeffrey; Peralta, Barry; Valerio, Rowena; Zhang, Yi; Gregorio, Sergio; Hermocilla, Joseph; Echavez, Michael; Yap, Jan Michael; Farmer, Andrew; Schiltz, Gary; Lee, Jennifer; Casstevens, Terry; Jaiswal, Pankaj; Meintjes, Ayton; Wilkinson, Mark; Good, Benjamin; Wagner, James; Morris, Jane; Marshall, David; Collins, Anthony; Kikuchi, Shoshi; Metz, Thomas; McLaren, Graham; van Hintum, Theo
2008-01-01
The Generation Challenge programme (GCP) is a global crop research consortium directed toward crop improvement through the application of comparative biology and genetic resources characterization to plant breeding. A key consortium research activity is the development of a GCP crop bioinformatics platform to support GCP research. This platform includes the following: (i) shared, public platform-independent domain models, ontology, and data formats to enable interoperability of data and analysis flows within the platform; (ii) web service and registry technologies to identify, share, and integrate information across diverse, globally dispersed data sources, as well as to access high-performance computational (HPC) facilities for computationally intensive, high-throughput analyses of project data; (iii) platform-specific middleware reference implementations of the domain model integrating a suite of public (largely open-access/-source) databases and software tools into a workbench to facilitate biodiversity analysis, comparative analysis of crop genomic data, and plant breeding decision making. PMID:18483570
PEEX Modelling Platform for Seamless Environmental Prediction
NASA Astrophysics Data System (ADS)
Baklanov, Alexander; Mahura, Alexander; Arnold, Stephen; Makkonen, Risto; Petäjä, Tuukka; Kerminen, Veli-Matti; Lappalainen, Hanna K.; Ezau, Igor; Nuterman, Roman; Zhang, Wen; Penenko, Alexey; Gordov, Evgeny; Zilitinkevich, Sergej; Kulmala, Markku
2017-04-01
The Pan-Eurasian EXperiment (PEEX) is a multidisciplinary, multi-scale research programme started in 2012 and aimed at resolving the major uncertainties in Earth System Science and global sustainability issues concerning the Arctic and boreal Northern Eurasian regions and China. Such challenges include climate change, air quality, biodiversity loss, chemicalization, food supply, and the use of natural resources by mining, industry, energy production and transport. The research infrastructure introduces the current state-of-the-art modeling platform and observation systems in the Pan-Eurasian region and presents the future baselines for coherent and coordinated research infrastructures in the PEEX domain. The PEEX Modelling Platform is characterized by a complex, seamlessly integrated Earth System Modeling (ESM) approach, in combination with specific models of different processes and elements of the system, acting on different temporal and spatial scales. An ensemble approach is taken to the integration of modeling results from different models, participants and countries. PEEX utilizes the full potential of a hierarchy of models: scenario analysis, inverse modeling, and modeling based on measurement needs and processes. The models are validated and constrained by available in-situ and remote sensing data of various spatial and temporal scales using data assimilation and top-down modeling. The analyses of the anticipated large volumes of data produced by available models and sensors will be supported by a dedicated virtual research environment developed for these purposes.
High-power lightweight external-cavity quantum cascade lasers
NASA Astrophysics Data System (ADS)
Day, Timothy; Takeuchi, Eric B.; Weida, Miles; Arnone, David; Pushkarsky, Michael; Boyden, David; Caffey, David
2009-05-01
Commercially available quantum cascade gain media has been integrated with advanced coating and die attach technologies, mid-IR micro-optics and telecom-style assembly and packaging to yield cutting edge performance. When combined into Daylight's external-cavity quantum cascade laser (ECqcL) platform, multi-Watt output power has been obtained. Daylight will describe their most recent results obtained from this platform, including high cw power from compact hermetically sealed packages and narrow spectral linewidth devices. Fiber-coupling and direct amplitude modulation from such multi-Watt lasers will also be described. In addition, Daylight will present the most recent results from their compact, portable, battery-operated "thermal laser pointers" that are being used for illumination and aiming applications. When combined with thermal imaging technology, such devices provide significant benefits in contrast and identification.
High-Brightness Lasers with Spectral Beam Combining on Silicon
NASA Astrophysics Data System (ADS)
Stanton, Eric John
Modern implementations of absorption spectroscopy and infrared countermeasures demand advanced performance and integration of high-brightness lasers, especially in the molecular fingerprint spectral region. These applications, along with others in communication, remote sensing, and medicine, benefit from the light source comprising a multitude of frequencies. To realize this technology, a single multi-spectral optical beam of near-diffraction-limited divergence is created by combining the outputs from an array of laser sources. Full integration of such a laser is possible with direct bonding of several epitaxially grown chips to a single silicon (Si) substrate. In this platform, an array of lasers is defined with each gain material, creating a densely spaced set of wavelengths similar to the wavelength division multiplexing used in communications. Scaling the brightness of a laser typically involves increasing the active volume to produce more output power. In the direction transverse to the light propagation, larger geometries compromise the beam quality. Lengthening the cavity provides only limited scaling of the output power due to the internal losses. Individual integrated lasers have low brightness due to a combination of thermal effects and high optical intensities. With heterogeneous integration, many lasers can be spectrally combined on a single integrated chip to scale brightness in a compact platform. Recent demonstrations of 2.0-μm diode and 4.8-μm quantum cascade lasers on Si have extended this heterogeneous platform beyond the telecommunications band to the mid-infrared. In this work, low-loss beam combining elements spanning the visible to the mid-infrared are developed and a high-brightness multi-spectral laser is demonstrated in the range of 4.6-4.7 μm wavelengths. An architecture is presented where light is combined in multiple stages: first within the gain bandwidth of each laser material and then coarsely between each spectral band to a single output waveguide. All components are demonstrated on a common material platform with a Si substrate, which lends feasibility to the complete system integration. Particular attention is focused on improving the efficiency of arrayed waveguide gratings (AWGs), used in the dense wavelength combining stage. This requires development of a refined characterization technique involving AWGs in a ring-resonator configuration to reduce measurement uncertainty. New levels of low loss are achieved for visible, near-infrared, and mid-infrared multiplexing devices. Also, a multi-spectral laser in the mid-infrared is demonstrated by integrating an array of quantum cascade lasers and an AWG with Si waveguides. The output power and spectra are measured, demonstrating efficient beam combining and power scaling. Thus, a bright laser source in the mid-infrared has been demonstrated, along with an architecture and the components for incorporating visible and near-infrared optical bands.
Cabana Multi-User Spaceport Tour of KSC
2017-02-17
Nancy Bray, director of Spaceport Integration and Services at NASA's Kennedy Space Center, speaks to members of the news media on the balcony of Operations Support Building II describing the site's transition from a primarily government-only facility to a premier, multi-user spaceport. In the background is the Vehicle Assembly Building (VAB). Modifications were recently completed in the VAB where new work platforms were installed to support processing of NASA's Space Launch System rocket designed to send the Orion spacecraft on missions beyond low-Earth orbit.
Wan, Cuihong; Liu, Jian; Fong, Vincent; Lugowski, Andrew; Stoilova, Snejana; Bethune-Waddell, Dylan; Borgeson, Blake; Havugimana, Pierre C; Marcotte, Edward M; Emili, Andrew
2013-04-09
The experimental isolation and characterization of stable multi-protein complexes are essential to understanding the molecular systems biology of a cell. To this end, we have developed a high-throughput proteomic platform for the systematic identification of native protein complexes based on extensive fractionation of soluble protein extracts by multi-bed ion exchange high performance liquid chromatography (IEX-HPLC) combined with exhaustive label-free LC/MS/MS shotgun profiling. To support these studies, we have built a companion data analysis software pipeline, termed ComplexQuant. Proteins present in the hundreds of fractions typically collected per experiment are first identified by exhaustively interrogating MS/MS spectra using multiple database search engines within an integrative probabilistic framework, while accounting for possible post-translational modifications. Protein abundance is then measured across the fractions based on normalized total spectral counts and precursor ion intensities using a dedicated tool, PepQuant. This analysis allows co-complex membership to be inferred based on the similarity of extracted protein co-elution profiles. Each computational step has been optimized for processing large-scale biochemical fractionation datasets, and the reliability of the integrated pipeline has been benchmarked extensively. This article is part of a Special Issue entitled: From protein structures to clinical applications. Copyright © 2012 Elsevier B.V. All rights reserved.
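A minimal sketch of the co-elution idea described above: proteins whose abundance profiles across the collected fractions are highly correlated are candidate members of the same complex. The profiles and the Pearson-correlation scoring below are illustrative placeholders, not the actual ComplexQuant/PepQuant implementation.

```python
# Infer co-complex candidates from co-elution profiles (illustrative only).
import numpy as np
from itertools import combinations

# Hypothetical normalized spectral counts across eight HPLC fractions.
profiles = {
    "ProteinA": np.array([0, 2, 8, 15, 9, 3, 0, 0]),
    "ProteinB": np.array([0, 1, 7, 14, 10, 2, 0, 0]),
    "ProteinC": np.array([5, 9, 3, 0, 0, 1, 6, 11]),
}

def coelution_score(x, y):
    """Pearson correlation of two fraction profiles; higher suggests co-complex membership."""
    return float(np.corrcoef(x, y)[0, 1])

for a, b in combinations(profiles, 2):
    print(f"{a} vs {b}: r = {coelution_score(profiles[a], profiles[b]):.2f}")
```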
DOE Office of Scientific and Technical Information (OSTI.GOV)
Im, Piljae; Cho, Heejin; Kim, Dongsu
2016-08-01
This report provides second-year project simulation results for the multi-year project titled “Evaluation of Variable Refrigerant Flow (VRF) system on Oak Ridge National Laboratory (ORNL)’s Flexible Research Platform (FRP).”
Development of a Web-Enabled Informatics Platform for Manipulation of Gene Expression Data
2004-12-01
genomic platforms such as metabolomics and proteomics, and to federated databases for knowledge management. A successful SBIR Phase I completed...measurements that require sophisticated bioinformatic platforms for data archival, management, integration, and analysis if researchers are to derive...web-enabled bioinformatic platform consisting of a Laboratory Information Management System (LIMS), an Analysis Information Management System (AIMS
High density electronic circuit and process for making
Morgan, W.P.
1999-06-29
High density circuits with posts that protrude beyond one surface of a substrate to provide easy mounting of devices such as integrated circuits are disclosed. The posts also provide stress relief to accommodate differential thermal expansion. The process allows high interconnect density with fewer alignment restrictions and less wasted circuit area than previous processes. The resulting substrates can be test platforms for die testing and for multi-chip module substrate testing. The test platform can contain active components and emulate realistic operational conditions, replacing shorts/opens net testing. 8 figs.
NASA Astrophysics Data System (ADS)
Wu, Mingching; Fang, Weileun
2005-03-01
This work integrates multi-depth DRIE etching, trench-refilled molding, a two-poly-Si-layer MUMPs process and bulk releasing to improve the variety and performance of MEMS devices. In summary, the present fabrication process, named MOSBE II, has three merits. First, this process can monolithically fabricate and integrate poly-Si thin-film structures with different thicknesses and stiffnesses, such as the flexible spring and the stiff mirror plate. Second, multi-depth structures, such as vertical comb electrodes, are available from the DRIE processes. Third, a cavity under the micromachined device is provided by the bulk silicon etching process, so that a large out-of-plane motion is allowed. In application, an optical scanner driven by the self-aligned vertical comb actuator was demonstrated. The poly-Si micromachined components fabricated by MOSBE II can further integrate with the MUMPs devices to establish a more powerful MOEMS platform.
Integrated long-range UAV/UGV collaborative target tracking
NASA Astrophysics Data System (ADS)
Moseley, Mark B.; Grocholsky, Benjamin P.; Cheung, Carol; Singh, Sanjiv
2009-05-01
Coordinated operations between unmanned air and ground assets allow leveraging of multi-domain sensing and increase opportunities for improving line of sight communications. While numerous military missions would benefit from coordinated UAV-UGV operations, foundational capabilities that integrate stove-piped tactical systems and share available sensor data are required but not yet available. iRobot, AeroVironment, and Carnegie Mellon University are working together, partially SBIR-funded through ARDEC's small unit network lethality initiative, to develop collaborative capabilities for surveillance, targeting, and improved communications based on PackBot UGV and Raven UAV platforms. We integrate newly available technologies into computational, vision, and communications payloads and develop sensing algorithms to support vision-based target tracking. We first simulated and then applied onto real tactical platforms an implementation of Decentralized Data Fusion, a novel technique for fusing track estimates from PackBot and Raven platforms for a moving target in an open environment. In addition, system integration with AeroVironment's Digital Data Link onto both air and ground platforms has extended the communications range at which we can operate the PackBot as well as increased video and data throughput. The system is brought together through a unified Operator Control Unit (OCU) for the PackBot and Raven that provides simultaneous waypoint navigation and traditional teleoperation. We also present several recent capability accomplishments toward PackBot-Raven coordinated operations, including single OCU display design and operation, early target track results, and Digital Data Link integration efforts, as well as our near-term capability goals.
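One common building block for fusing track estimates from two platforms without knowing their cross-correlation is covariance intersection; the sketch below shows that calculation for two hypothetical UGV- and UAV-derived target position estimates. It is offered only as a generic illustration of decentralized track fusion, not as the paper's specific Decentralized Data Fusion formulation.

```python
# Covariance intersection fusion of two track estimates (illustrative only).
import numpy as np

def covariance_intersection(x1, P1, x2, P2, omega=0.5):
    """Fuse (x1, P1) and (x2, P2) without assuming independent errors."""
    P1i, P2i = np.linalg.inv(P1), np.linalg.inv(P2)
    Pf = np.linalg.inv(omega * P1i + (1.0 - omega) * P2i)
    xf = Pf @ (omega * P1i @ x1 + (1.0 - omega) * P2i @ x2)
    return xf, Pf

x_ugv = np.array([102.0, 45.0])   # target position from ground platform (m), hypothetical
P_ugv = np.diag([4.0, 9.0])
x_uav = np.array([100.5, 47.0])   # target position from air platform (m), hypothetical
P_uav = np.diag([9.0, 4.0])

x_fused, P_fused = covariance_intersection(x_ugv, P_ugv, x_uav, P_uav)
print("fused estimate:", x_fused)
print("fused covariance diagonal:", np.diag(P_fused))
```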
Artificial Neuron Based on Integrated Semiconductor Quantum Dot Mode-Locked Lasers
NASA Astrophysics Data System (ADS)
Mesaritakis, Charis; Kapsalis, Alexandros; Bogris, Adonis; Syvridis, Dimitris
2016-12-01
Neuro-inspired implementations have attracted strong interest as a power efficient and robust alternative to the digital model of computation with a broad range of applications. Especially, neuro-mimetic systems able to produce and process spike-encoding schemes can offer merits like high noise-resiliency and increased computational efficiency. Towards this direction, integrated photonics can be an auspicious platform due to its multi-GHz bandwidth, its high wall-plug efficiency and the strong similarity of its dynamics under excitation with biological spiking neurons. Here, we propose an integrated all-optical neuron based on an InAs/InGaAs semiconductor quantum-dot passively mode-locked laser. The multi-band emission capabilities of these lasers allow, through waveband switching, the emulation of the excitation and inhibition modes of operation. Frequency-response effects, similar to biological neural circuits, are observed just as in a typical two-section excitable laser. The demonstrated optical building block can pave the way for high-speed photonic integrated systems able to address tasks ranging from pattern recognition to cognitive spectrum management and multi-sensory data processing.
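For readers unfamiliar with the spiking-neuron behavior the laser is compared to, the following is a generic leaky integrate-and-fire sketch: the output only spikes once the integrated drive crosses a threshold, so weak and strong inputs produce very different spike counts. It is a textbook model with arbitrary parameters, not a model of the quantum-dot mode-locked laser itself.

```python
# Generic leaky integrate-and-fire (LIF) neuron, for illustration only.
import numpy as np

def lif_spike_train(input_current, dt=1e-4, tau=20e-3, v_thresh=1.0, v_reset=0.0):
    """Integrate the drive; emit a spike (1) when the state crosses threshold, then reset."""
    v, spikes = 0.0, []
    for i_t in input_current:
        v += dt * (-v / tau + i_t)
        if v >= v_thresh:
            spikes.append(1)
            v = v_reset
        else:
            spikes.append(0)
    return np.array(spikes)

stimulus = np.concatenate([np.full(500, 30.0), np.full(500, 80.0)])  # weak then strong drive
train = lif_spike_train(stimulus)
print("spikes during weak drive:", train[:500].sum())    # stays below threshold
print("spikes during strong drive:", train[500:].sum())  # repetitive firing
```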
USDA-ARS?s Scientific Manuscript database
Continuous monitoring of daily evapotranspiration (ET) at field scale can be achieved by combining thermal infrared remote sensing information from multiple satellite platforms. Here, an integrated approach to field scale ET mapping is described, combining multi-scale surface energy balance e...
Crops in silico: Generating virtual crops using an integrative and multi-scale modeling platform
USDA-ARS?s Scientific Manuscript database
There are currently 795 million hungry people in the world and 98 percent of them are in developing countries. Food demand is expected to increase by 70% by 2050. With a reduction in arable land, decreases in water availability, and an increasing impact of climate change, innovative technologies are...
A multilevel Lab on chip platform for DNA analysis.
Marasso, Simone Luigi; Giuri, Eros; Canavese, Giancarlo; Castagna, Riccardo; Quaglio, Marzia; Ferrante, Ivan; Perrone, Denis; Cocuzza, Matteo
2011-02-01
Lab-on-chips (LOCs) are critical systems that have been introduced to speed up and reduce the cost of traditional, laborious and extensive analyses in biological and biomedical fields. These ambitious and challenging issues call for multi-disciplinary competences that range from engineering to biology. Starting from the aim to integrate microarray technology and microfluidic devices, a complex multilevel analysis platform has been designed, fabricated and tested (All rights reserved-IT Patent number TO2009A000915). This LOC successfully manages to interface microfluidic channels with standard DNA microarray glass slides, in order to implement a complete biological protocol. Typical Micro Electro Mechanical Systems (MEMS) materials and process technologies were employed. A silicon/glass microfluidic chip and a Polydimethylsiloxane (PDMS) reaction chamber were fabricated and interfaced with a standard microarray glass slide. In order to have a highly disposable system, all micro-elements were passive and an external apparatus provided fluidic driving and thermal control. The major microfluidic and handling problems were investigated and innovative solutions were found. Finally, an entirely automated DNA hybridization protocol was successfully tested with a significant reduction in analysis time and reagent consumption with respect to a conventional protocol.
SNPConvert: SNP Array Standardization and Integration in Livestock Species.
Nicolazzi, Ezequiel Luis; Marras, Gabriele; Stella, Alessandra
2016-06-09
One of the main advantages of single nucleotide polymorphism (SNP) array technology is providing genotype calls for a specific number of SNP markers at a relatively low cost. Since its first application in animal genetics, the number of available SNP arrays for each species has been constantly increasing. However, in contrast to what is observed in whole genome sequence data analysis, SNP array data does not have a common set of file formats or coding conventions for allele calling. Therefore, the standardization and integration of SNP array data from multiple sources have become an obstacle, especially for users with basic or no programming skills. Here, we describe the difficulties related to handling SNP array data, focusing on file formats, SNP allele coding, and mapping. We also present the SNPConvert suite, a multi-platform, open-source, and user-friendly set of tools to overcome these issues. This tool, which can be integrated with open-source and open-access tools already available, is a first step towards an integrated system to standardize and integrate any type of raw SNP array data. The tool is available at: https://github.com/nicolazzie/SNPConvert.git.
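The allele-coding problem mentioned above can be pictured as a per-marker lookup from one coding convention to another. The sketch below maps "AB" genotype calls to forward-strand alleles using a small conversion table; the marker names, alleles, and file layout are hypothetical and do not reproduce SNPConvert's actual formats or code.

```python
# Illustrative AB-to-forward-strand genotype conversion (hypothetical table).
ab_to_forward = {  # marker -> (allele coded A, allele coded B) on the forward strand
    "ARS-BFGL-NGS-16466": ("A", "G"),
    "Hapmap43437-BTA-101873": ("C", "T"),
}

def convert_genotype(marker, ab_call):
    """Translate an AB-coded genotype (e.g. 'AB') into forward-strand alleles (e.g. 'AG')."""
    a, b = ab_to_forward[marker]
    return "".join(a if c == "A" else b if c == "B" else "-" for c in ab_call)

print(convert_genotype("ARS-BFGL-NGS-16466", "AB"))        # -> AG
print(convert_genotype("Hapmap43437-BTA-101873", "BB"))    # -> TT
```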
GRAPE: a graphical pipeline environment for image analysis in adaptive magnetic resonance imaging.
Gabr, Refaat E; Tefera, Getaneh B; Allen, William J; Pednekar, Amol S; Narayana, Ponnada A
2017-03-01
We present a platform, GRAphical Pipeline Environment (GRAPE), to facilitate the development of patient-adaptive magnetic resonance imaging (MRI) protocols. GRAPE is an open-source project implemented in the Qt C++ framework to enable graphical creation, execution, and debugging of real-time image analysis algorithms integrated with the MRI scanner. The platform provides the tools and infrastructure to design new algorithms, build and execute an array of image analysis routines, and include existing analysis libraries, all within a graphical environment. The application of GRAPE is demonstrated in multiple MRI applications, and the software is described in detail for both the user and the developer. GRAPE was successfully used to implement and execute three applications in MRI of the brain, performed on a 3.0-T MRI scanner: (i) a multi-parametric pipeline for segmenting the brain tissue and detecting lesions in multiple sclerosis (MS), (ii) patient-specific optimization of the 3D fluid-attenuated inversion recovery MRI scan parameters to enhance the contrast of brain lesions in MS, and (iii) an algebraic image method for combining two MR images for improved lesion contrast. GRAPE allows graphical development and execution of image analysis algorithms for inline, real-time, and adaptive MRI applications.
Offshore Energy Mapping for Northeast Atlantic and Mediterranean: MARINA PLATFORM project
NASA Astrophysics Data System (ADS)
Kallos, G.; Galanis, G.; Spyrou, C.; Kalogeri, C.; Adam, A.; Athanasiadis, P.
2012-04-01
Deep offshore ocean energy mapping requires detailed modeling of the wind, wave, tidal and ocean circulation estimations. It also requires detailed mapping of the associated extremes. An important issue in such work is the co-generation of energy (generation of wind, wave, tides, currents) in order to design platforms in an efficient way. For example wind and wave fields exhibit significant phase differences and therefore the produced energy from both sources together requires special analysis. The other two sources, namely tides and currents, have different temporal scales from the previous two. Another important issue is related to the estimation of the environmental frequencies in order to avoid structural problems. These are issues studied within the framework of the FP7 project MARINA PLATFORM. The main objective of the project is to develop deep water structures that can exploit the energy from wind, wave, tidal and ocean current energy sources. In particular, a primary goal will be the establishment of a set of equitable and transparent criteria for the evaluation of multi-purpose platforms for marine renewable energy. Using these criteria, a novel system set of design and optimisation tools will be produced addressing new platform design, component engineering, risk assessment, spatial planning, platform-related grid connection concepts, all focussed on system integration and reducing costs. The University of Athens group is in charge of estimating and mapping wind, wave, tidal and ocean current resources, estimating the available energy potential, mapping extreme event characteristics and providing any additional environmental parameters required.
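The wind/wave phase-difference point can be illustrated with standard resource formulas: wind power density of 0.5 ρ v³ and the deep-water wave energy flux of roughly 0.49 Hs² Te kW per metre of crest. The hourly series below are synthetic placeholders (wave height deliberately lagged behind wind), not MARINA PLATFORM data.

```python
# Synthetic look at wind/wave co-generation phase differences.
import numpy as np

hours = np.arange(48)
wind_speed = 9 + 3 * np.sin(2 * np.pi * hours / 24)            # m/s, synthetic
hs = 2.0 + 0.8 * np.sin(2 * np.pi * (hours - 6) / 24)          # significant wave height (m), lagged 6 h
te = 8.0                                                       # wave energy period (s)

wind_power = 0.5 * 1.225 * wind_speed ** 3 / 1e3               # kW per m^2 of rotor area
wave_power = 0.49 * hs ** 2 * te                               # kW per m of wave crest (deep water)

corr_0 = np.corrcoef(wind_power, wave_power)[0, 1]
corr_lag6 = np.corrcoef(wind_power[:-6], wave_power[6:])[0, 1]
print(f"zero-lag correlation: {corr_0:.2f}; correlation allowing a 6 h lag: {corr_lag6:.2f}")
```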
McGonigle, John; Murphy, Anna; Paterson, Louise M; Reed, Laurence J; Nestor, Liam; Nash, Jonathan; Elliott, Rebecca; Ersche, Karen D; Flechais, Remy SA; Newbould, Rexford; Orban, Csaba; Smith, Dana G; Taylor, Eleanor M; Waldman, Adam D; Robbins, Trevor W; Deakin, JF William; Nutt, David J; Lingford-Hughes, Anne R; Suckling, John
2016-01-01
Objectives: We aimed to set up a robust multi-centre clinical fMRI and neuropsychological platform to investigate the neuropharmacology of brain processes relevant to addiction – reward, impulsivity and emotional reactivity. Here we provide an overview of the fMRI battery, carried out across three centres, characterizing neuronal response to the tasks, along with exploring inter-centre differences in healthy participants. Experimental design: Three fMRI tasks were used: monetary incentive delay to probe reward sensitivity, go/no-go to probe impulsivity and an evocative images task to probe emotional reactivity. A coordinate-based activation likelihood estimation (ALE) meta-analysis was carried out for the reward and impulsivity tasks to help establish region of interest (ROI) placement. A group of healthy participants was recruited from across three centres (total n=43) to investigate inter-centre differences. Principal observations: The pattern of response observed for each of the three tasks was consistent with previous studies using similar paradigms. At the whole brain level, significant differences were not observed between centres for any task. Conclusions: In developing this platform we successfully integrated neuroimaging data from three centres, adapted validated tasks and applied whole brain and ROI approaches to explore and demonstrate their consistency across centres. PMID:27703042
McGonigle, John; Murphy, Anna; Paterson, Louise M; Reed, Laurence J; Nestor, Liam; Nash, Jonathan; Elliott, Rebecca; Ersche, Karen D; Flechais, Remy Sa; Newbould, Rexford; Orban, Csaba; Smith, Dana G; Taylor, Eleanor M; Waldman, Adam D; Robbins, Trevor W; Deakin, Jf William; Nutt, David J; Lingford-Hughes, Anne R; Suckling, John
2017-01-01
We aimed to set up a robust multi-centre clinical fMRI and neuropsychological platform to investigate the neuropharmacology of brain processes relevant to addiction - reward, impulsivity and emotional reactivity. Here we provide an overview of the fMRI battery, carried out across three centres, characterizing neuronal response to the tasks, along with exploring inter-centre differences in healthy participants. Three fMRI tasks were used: monetary incentive delay to probe reward sensitivity, go/no-go to probe impulsivity and an evocative images task to probe emotional reactivity. A coordinate-based activation likelihood estimation (ALE) meta-analysis was carried out for the reward and impulsivity tasks to help establish region of interest (ROI) placement. A group of healthy participants was recruited from across three centres (total n=43) to investigate inter-centre differences. Principal observations: The pattern of response observed for each of the three tasks was consistent with previous studies using similar paradigms. At the whole brain level, significant differences were not observed between centres for any task. In developing this platform we successfully integrated neuroimaging data from three centres, adapted validated tasks and applied whole brain and ROI approaches to explore and demonstrate their consistency across centres.
Towards the Ultimate Multi-Junction Solar Cell using Transfer Printing
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lumb, Matthew P.; Meitl, Matt; Schmieder, Kenneth J.
2016-11-21
Transfer printing is a uniquely enabling technology for the heterogeneous integration of III-V materials grown on dissimilar substrates. In this paper, we present experimental results for a mechanically stacked tandem cell using GaAs and GaSb-based materials capable of harvesting the entire solar spectrum with 44.5% efficiency. We also present the latest results toward developing an ultra-high performance heterogeneous cell, integrating materials grown on GaAs, InP and GaSb platforms.
Targeted exploration and analysis of large cross-platform human transcriptomic compendia
Zhu, Qian; Wong, Aaron K; Krishnan, Arjun; Aure, Miriam R; Tadych, Alicja; Zhang, Ran; Corney, David C; Greene, Casey S; Bongo, Lars A; Kristensen, Vessela N; Charikar, Moses; Li, Kai; Troyanskaya, Olga G.
2016-01-01
We present SEEK (http://seek.princeton.edu), a query-based search engine across very large transcriptomic data collections, including thousands of human data sets from almost 50 microarray and next-generation sequencing platforms. SEEK uses a novel query-level cross-validation-based algorithm to automatically prioritize data sets relevant to the query and a robust search approach to identify query-coregulated genes, pathways, and processes. SEEK provides cross-platform handling, multi-gene query search, iterative metadata-based search refinement, and extensive visualization-based analysis options. PMID:25581801
Farshidfar, Farshad; Zheng, Siyuan; Gingras, Marie-Claude; Newton, Yulia; Shih, Juliann; Robertson, A Gordon; Hinoue, Toshinori; Hoadley, Katherine A; Gibb, Ewan A; Roszik, Jason; Covington, Kyle R; Wu, Chia-Chin; Shinbrot, Eve; Stransky, Nicolas; Hegde, Apurva; Yang, Ju Dong; Reznik, Ed; Sadeghi, Sara; Pedamallu, Chandra Sekhar; Ojesina, Akinyemi I; Hess, Julian M; Auman, J Todd; Rhie, Suhn K; Bowlby, Reanne; Borad, Mitesh J; Zhu, Andrew X; Stuart, Josh M; Sander, Chris; Akbani, Rehan; Cherniack, Andrew D; Deshpande, Vikram; Mounajjed, Taofic; Foo, Wai Chin; Torbenson, Michael S; Kleiner, David E; Laird, Peter W; Wheeler, David A; McRee, Autumn J; Bathe, Oliver F; Andersen, Jesper B; Bardeesy, Nabeel; Roberts, Lewis R; Kwong, Lawrence N
2017-03-14
Cholangiocarcinoma (CCA) is an aggressive malignancy of the bile ducts, with poor prognosis and limited treatment options. Here, we describe the integrated analysis of somatic mutations, RNA expression, copy number, and DNA methylation by The Cancer Genome Atlas of a set of predominantly intrahepatic CCA cases and propose a molecular classification scheme. We identified an IDH mutant-enriched subtype with distinct molecular features including low expression of chromatin modifiers, elevated expression of mitochondrial genes, and increased mitochondrial DNA copy number. Leveraging the multi-platform data, we observed that ARID1A exhibited DNA hypermethylation and decreased expression in the IDH mutant subtype. More broadly, we found that IDH mutations are associated with an expanded histological spectrum of liver tumors with molecular features that stratify with CCA. Our studies reveal insights into the molecular pathogenesis and heterogeneity of cholangiocarcinoma and provide classification information of potential therapeutic significance. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
Gupta, Amit Kumar; Kaur, Karambir; Rajput, Akanksha; Dhanda, Sandeep Kumar; Sehgal, Manika; Khan, Md. Shoaib; Monga, Isha; Dar, Showkat Ahmad; Singh, Sandeep; Nagpal, Gandharva; Usmani, Salman Sadullah; Thakur, Anamika; Kaur, Gazaldeep; Sharma, Shivangi; Bhardwaj, Aman; Qureshi, Abid; Raghava, Gajendra Pal Singh; Kumar, Manoj
2016-01-01
The current Zika virus (ZIKV) outbreaks spreading in several areas of Africa, Southeast Asia, and the Pacific islands have been declared a global health emergency by the World Health Organization (WHO). The virus causes Zika fever and illness ranging from severe autoimmune to neurological complications in humans. To facilitate research on this virus, we have developed an integrative multi-omics platform, ZikaVR (http://bioinfo.imtech.res.in/manojk/zikavr/), dedicated to the ZIKV genomic, proteomic and therapeutic knowledge. It comprises whole genome sequences, their respective functional information regarding proteins, genes, and structural content. Additionally, it delivers sophisticated analysis such as whole-genome alignments, conservation and variation, CpG islands, codon context, usage bias and phylogenetic inferences at the whole genome and proteome level within a user-friendly visual environment. Further, glycosylation sites and molecular diagnostic primers were also analyzed. Most importantly, we also proposed potential therapeutically imperative constituents, namely vaccine epitopes, siRNAs, miRNAs, sgRNAs and drug repurposing candidates. PMID:27633273
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kleese van Dam, Kerstin; Lansing, Carina S.; Elsethagen, Todd O.
2014-01-28
Modern workflow systems enable scientists to run ensemble simulations at unprecedented scales and levels of complexity, allowing them to study system sizes previously impossible to achieve, due to the inherent resource requirements needed for the modeling work. However as a result of these new capabilities the science teams suddenly also face unprecedented data volumes that they are unable to analyze with their existing tools and methodologies in a timely fashion. In this paper we will describe the ongoing development work to create an integrated data intensive scientific workflow and analysis environment that offers researchers the ability to easily create and execute complex simulation studies and provides them with different scalable methods to analyze the resulting data volumes. The integration of simulation and analysis environments is hereby not only a question of ease of use, but supports fundamental functions in the correlated analysis of simulation input, execution details and derived results for multi-variant, complex studies. To this end the team extended and integrated the existing capabilities of the Velo data management and analysis infrastructure, the MeDICi data intensive workflow system and RHIPE, the R for Hadoop version of the well-known statistics package, as well as developing a new visual analytics interface for the result exploitation by multi-domain users. The capabilities of the new environment are demonstrated on a use case that focuses on the Pacific Northwest National Laboratory (PNNL) building energy team, showing how they were able to take their previously local-scale simulations to a nationwide level by utilizing data intensive computing techniques not only for their modeling work, but also for the subsequent analysis of their modeling results. As part of the PNNL research initiative PRIMA (Platform for Regional Integrated Modeling and Analysis) the team performed an initial 3-year study of building energy demands for the US Eastern Interconnect domain, which they are now planning to extend to predict the demand for the complete century. The initial study raised their data demands from a few GB to 400 GB for the 3-year study, with tens of TB expected for the full century.
Development of deployable structures for large space platform systems, volume 1
NASA Technical Reports Server (NTRS)
1982-01-01
Generic deployable spacecraft configurations and deployable platform systems concepts were identified. Sizing, building block concepts, orbiter packaging, thermal analysis, cost analysis, and mass properties analysis as related to platform systems integration are considered. Technology needs are examined and the major criteria used in concept selection are delineated. Requirements for deployable habitat modules, tunnels, and OTV hangars are considered.
ROBOCAL: Gamma-ray isotopic hardware/software interface
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hurd, J.R.; Bonner, C.A.; Ostenak, C.A.
1989-01-01
ROBOCAL, presently being developed at the Los Alamos National Laboratory, is a full-scale prototypical robotic system for remotely performing calorimetric and gamma-ray isotopics measurements of nuclear materials. It features a fully automated vertical stacker-retriever for storing and retrieving packaged nuclear materials from a multi-drawer system, and a fully automated, uniquely integrated gantry robot for programmable selection and transfer of nuclear materials to calorimetric and gamma-ray isotopic measurement stations. Since ROBOCAL is intended to require almost no operator intervention, a mechanical control system is required in addition to a totally automated assay system. The assay system must be a completely integrated data acquisition and isotopic analysis package fully capable of performing state-of-the-art homogeneous and heterogeneous analyses on many varied matrices. The TRIFID assay system being discussed at this conference by J. G. Fleissner of the Rocky Flats Plant has been adopted because of its many automated features. These include: MCA/ADC setup and acquisition; spectral storage and analysis utilizing an expert system formalism; report generation with internal measurement control printout; user friendly screens and menus. The mechanical control portion consists primarily of two detector platforms and a sample platform, each with independent movement. Some minor modifications and additions are needed with TRIFID to interface the assay and mechanical portions with the CimRoc 4000 software controlling the robot. 6 refs., 5 figs., 3 tabs.
NASA Astrophysics Data System (ADS)
Whaley, Gregory J.; Karnopp, Roger J.
2010-04-01
The goal of the Air Force Highly Integrated Photonics (HIP) program is to develop and demonstrate single photonic chip components which support a single mode fiber network architecture for use on mobile military platforms. We propose an optically transparent, broadcast and select fiber optic network as the next generation interconnect on avionics platforms. In support of this network, we have developed three principal, single-chip photonic components: a tunable laser transmitter, a 32x32 port star coupler, and a 32 port multi-channel receiver which are all compatible with demanding avionics environmental and size requirements. The performance of the developed components will be presented as well as the results of a demonstration system which integrates the components into a functional network representative of the form factor used in advanced avionics computing and signal processing applications.
Coordinating teams of autonomous vehicles: an architectural perspective
NASA Astrophysics Data System (ADS)
Czichon, Cary; Peterson, Robert W.; Mettala, Erik G.; Vondrak, Ivo
2005-05-01
In defense-related robotics research, a mission level integration gap exists between mission tasks (tactical) performed by ground, sea, or air applications and elementary behaviors enacted by processing, communications, sensors, and weaponry resources (platform specific). The gap spans ensemble (heterogeneous team) behaviors, automatic MOE/MOP tracking, and tactical task modeling/simulation for virtual and mixed teams comprised of robotic and human combatants. This study surveys robotic system architectures, compares approaches for navigating problem/state spaces by autonomous systems, describes an architecture for an integrated, repository-based modeling, simulation, and execution environment, and outlines a multi-tiered scheme for robotic behavior components that is agent-based, platform-independent, and extendable via plug-ins. Tools for this integrated environment, along with a distributed agent framework for collaborative task performance are being developed by a U.S. Army funded SBIR project (RDECOM Contract N61339-04-C-0005).
The depiction of Alboran Sea Gyre during Donde Va? using remote sensing and conventional data
NASA Technical Reports Server (NTRS)
Laviolette, P. E.
1984-01-01
Experienced oceanographic investigators have come to realize that remote sensing techniques are most successful when applied as part of programs of integrated measurements aimed at solving specific oceanographic problems. A good example of such integration occurred during the multi-platform international experiment, Donde Va? in the Alboran Sea during the period June through October, 1982. The objective of Donde Va? was to derive the interrelationship of the Atlantic waters entering the Mediterranean Sea and the Alboran Sea Gyre. The experimental plan conceived solely with this objective in mind consisted of a variety of remote sensing and conventional platforms: three ships, three aircraft, five current moorings, two satellites and a specialized beach radar (CODAR). Integrated analyses of these multiple-data sets are still being conducted. However, the initial results show detailed structure of the incoming Atlantic jet and Alboran Sea Gyre that would not have been possible by conventional means.
Barbesi, Donato; Vicente Vilas, Víctor; Millet, Sylvain; Sandow, Miguel; Colle, Jean-Yves; Aldave de Las Heras, Laura
2017-01-01
LabVIEW®-based software for the control of the fully automated multi-sequential flow injection analysis Lab-on-Valve (MSFIA-LOV) platform AutoRAD, which performs radiochemical analysis, is described. The analytical platform interfaces with an Arduino®-based device triggering multiple detectors, providing a flexible and fit-for-purpose choice of detection systems. The different analytical devices are interfaced to the PC running the LabVIEW® VI software using USB and RS232 interfaces, both for sending commands and receiving confirmation or error responses. The AutoRAD platform has been successfully applied for the chemical separation and determination of Sr, an important fission product pertinent to nuclear waste.
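The command-and-acknowledge exchange over RS232 described above can be sketched, for illustration, with pyserial rather than LabVIEW. The serial port name, baud rate, and command vocabulary below are hypothetical and are not the AutoRAD protocol.

```python
# Illustrative send-command / await-confirmation pattern over a serial link.
import serial  # pyserial

def send_command(port, command, timeout=2.0):
    """Send an ASCII command over RS232 and return the instrument's reply line."""
    with serial.Serial(port, baudrate=9600, timeout=timeout) as link:
        link.write((command + "\r\n").encode("ascii"))
        reply = link.readline().decode("ascii", errors="replace").strip()
    if not reply:
        raise TimeoutError(f"no response to {command!r}")
    if reply.startswith("ERR"):
        raise RuntimeError(f"instrument error: {reply}")
    return reply

# Example usage (hypothetical port and valve-positioning command):
# print(send_command("/dev/ttyUSB0", "VALVE 3"))
```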
Kinahan, David J; Kearney, Sinéad M; Dimov, Nikolay; Glynn, Macdara T; Ducrée, Jens
2014-07-07
The centrifugal "lab-on-a-disc" concept has proven to have great potential for process integration of bioanalytical assays, in particular where ease-of-use, ruggedness, portability, fast turn-around time and cost efficiency are of paramount importance. Yet, as all liquids residing on the disc are exposed to the same centrifugal field, an inherent challenge of these systems remains the automation of multi-step, multi-liquid sample processing and subsequent detection. In order to orchestrate the underlying bioanalytical protocols, an ample palette of rotationally and externally actuated valving schemes has been developed. While excelling with the level of flow control, externally actuated valves require interaction with peripheral instrumentation, thus compromising the conceptual simplicity of the centrifugal platform. In turn, for rotationally controlled schemes, such as common capillary burst valves, typical manufacturing tolerances tend to limit the number of consecutive laboratory unit operations (LUOs) that can be automated on a single disc. In this paper, a major advancement on recently established dissolvable film (DF) valving is presented; for the very first time, a liquid handling sequence can be controlled in response to completion of preceding liquid transfer event, i.e. completely independent of external stimulus or changes in speed of disc rotation. The basic, event-triggered valve configuration is further adapted to leverage conditional, large-scale process integration. First, we demonstrate a fluidic network on a disc encompassing 10 discrete valving steps including logical relationships such as an AND-conditional as well as serial and parallel flow control. Then we present a disc which is capable of implementing common laboratory unit operations such as metering and selective routing of flows. Finally, as a pilot study, these functions are integrated on a single disc to automate a common, multi-step lab protocol for the extraction of total RNA from mammalian cell homogenate.
NASA Astrophysics Data System (ADS)
Mazzega, Pierre; Therond, Olivier; Debril, Thomas; March, Hug; Sibertin-Blanc, Christophe; Lardy, Romain; Sant'ana, Daniel
2014-11-01
This paper presents the experience gained in the development of an integrated simulation model of water policy. Within this context, we analyze particular difficulties raised by the inclusion of multi-level governance that assigns the responsibility of individual or collective decision-making to a variety of actors, regarding measures whose implementation has significant effects on the sustainability of socio-hydrosystems. Multi-level governance procedures are compared with the potential of model-based impact assessment. Our discussion is illustrated on the basis of the exploitation of the multi-agent platform MAELIA dedicated to the simulation of social, economic and environmental impacts of low-water management in a context of climate and regulatory changes. We focus on three major decision-making processes occurring in the Adour-Garonne basin, France: (i) the participatory development of the Master Scheme for Water Planning and Management (SDAGE) under the auspices of the Water Agency; (ii) the publication of water use restrictions in situations of water scarcity; and (iii) the determination of the abstraction volumes for irrigation and their allocation. The MAELIA platform explicitly takes into account the mode of decision-making when it is framed by a procedure set beforehand, focusing on the actors' participation and on the nature and parameters of the measures to be implemented. It is observed that in some water organizations decision-making follows patterns that can be represented as rule-based actions triggered by thresholds of resource states. When decisions result from individual choice, endowing virtual agents with bounded rationality allows us to reproduce (in silico) their behavior and decisions in a reliable way. However, the negotiation processes taking place during the period of time simulated by the models in arenas of collective choices are not all reproducible. Outcomes of some collective decisions are barely predictable or not predictable at all. The development and simulation of a priori policy scenarios capturing the most plausible or interesting outcomes of such collective decisions on measures for low-water management allow these difficulties to be overcome. The building of these kinds of scenarios requires close collaboration between researchers and stakeholders involved in arenas of collective choice, and implies the integration of model production and scenario analysis as one component of the polycentric political process of water management.
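The "rule-based actions triggered by thresholds of resource states" mentioned above can be sketched as a simple lookup of restriction levels against a low-flow target. The thresholds, level names, and numbers below are illustrative placeholders, not the Adour-Garonne regulations or MAELIA's rule base.

```python
# Threshold-triggered water-use restriction rules (illustrative only).
RESTRICTION_LEVELS = [  # (flow falls below this fraction of the target, action triggered)
    (0.80, "alert: voluntary reductions"),
    (0.60, "reinforced alert: irrigation days reduced"),
    (0.40, "crisis: irrigation withdrawals suspended"),
]

def restriction_for(observed_flow_m3s, target_flow_m3s):
    """Return the most severe action whose threshold the current flow falls below."""
    ratio = observed_flow_m3s / target_flow_m3s
    action = "no restriction"
    for threshold, level in RESTRICTION_LEVELS:
        if ratio < threshold:
            action = level
    return action

print(restriction_for(observed_flow_m3s=5.2, target_flow_m3s=10.0))
# -> reinforced alert: irrigation days reduced
```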
NASA Astrophysics Data System (ADS)
Gorelick, Noel
2013-04-01
The Google Earth Engine platform is a system designed to enable petabyte-scale, scientific analysis and visualization of geospatial datasets. Earth Engine provides a consolidated environment including a massive data catalog co-located with thousands of computers for analysis. The user-friendly front-end provides a workbench environment to allow interactive data and algorithm development and exploration and provides a convenient mechanism for scientists to share data, visualizations and analytic algorithms via URLs. The Earth Engine data catalog contains a wide variety of popular, curated datasets, including the world's largest online collection of Landsat scenes (> 2.0M), numerous MODIS collections, and many vector-based data sets. The platform provides a uniform access mechanism to a variety of data types, independent of their bands, projection, bit-depth, resolution, etc., facilitating easy multi-sensor analysis. Additionally, a user is able to add and curate their own data and collections. Using a just-in-time, distributed computation model, Earth Engine can rapidly process enormous quantities of geo-spatial data. All computation is performed lazily; nothing is computed until it's required either for output or as input to another step. This model allows real-time feedback and preview during algorithm development, supporting a rapid algorithm development, test, and improvement cycle that scales seamlessly to large-scale production data processing. Through integration with a variety of other services, Earth Engine is able to bring to bear considerable analytic and technical firepower in a transparent fashion, including: AI-based classification via integration with Google's machine learning infrastructure, publishing and distribution at Google scale through integration with the Google Maps API, Maps Engine and Google Earth, and support for in-the-field activities such as validation, ground-truthing, crowd-sourcing and citizen science through the Android Open Data Kit.
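A minimal Earth Engine Python API sketch of the lazy, server-side computation model described above: filtering and compositing only build a description of a computation, and nothing runs until a value is explicitly requested. It assumes an authenticated Earth Engine account; the collection ID, region, and dates are just example choices.

```python
# Lazy, server-side computation with the Earth Engine Python client.
import ee

ee.Initialize()  # assumes prior authentication

region = ee.Geometry.Rectangle([-122.6, 37.6, -122.3, 37.9])
collection = (ee.ImageCollection("LANDSAT/LC08/C02/T1_L2")
              .filterBounds(region)
              .filterDate("2021-06-01", "2021-09-01"))

composite = collection.median()        # still only a graph node; nothing computed yet
count = collection.size().getInfo()    # getInfo() forces evaluation on the server
print("scenes contributing to the composite:", count)
```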
NASA Astrophysics Data System (ADS)
Shute, J.; Carriere, L.; Duffy, D.; Hoy, E.; Peters, J.; Shen, Y.; Kirschbaum, D.
2017-12-01
The NASA Center for Climate Simulation (NCCS) at the Goddard Space Flight Center is building and maintaining an Enterprise GIS capability for its stakeholders, to include NASA scientists, industry partners, and the public. This platform is powered by three GIS subsystems operating in a highly-available, virtualized environment: 1) the Spatial Analytics Platform is the primary NCCS GIS and provides users discoverability of the vast DigitalGlobe/NGA raster assets within the NCCS environment; 2) the Disaster Mapping Platform provides mapping and analytics services to NASA's Disaster Response Group; and 3) the internal (Advanced Data Analytics Platform/ADAPT) enterprise GIS provides users with the full suite of Esri and open source GIS software applications and services. All systems benefit from NCCS's cutting edge infrastructure, to include an InfiniBand network for high speed data transfers; a mixed/heterogeneous environment featuring seamless sharing of information between Linux and Windows subsystems; and in-depth system monitoring and warning systems. Due to its co-location with the NCCS Discover High Performance Computing (HPC) environment and the Advanced Data Analytics Platform (ADAPT), the GIS platform has direct access to several large NCCS datasets including DigitalGlobe/NGA, Landsat, MERRA, and MERRA2. Additionally, the NCCS ArcGIS Desktop Windows virtual machines utilize existing NetCDF and OPeNDAP assets for visualization, modelling, and analysis - thus eliminating the need for data duplication. With the advent of this platform, Earth scientists have full access to vast data repositories and the industry-leading tools required for successful management and analysis of these multi-petabyte, global datasets. The full system architecture and integration with scientific datasets will be presented. Additionally, key applications and scientific analyses will be explained, to include the NASA Global Landslide Catalog (GLC) Reporter crowdsourcing application, the NASA GLC Viewer discovery and analysis tool, the DigitalGlobe/NGA Data Discovery Tool, the NASA Disaster Response Group Mapping Platform (https://maps.disasters.nasa.gov), and support for NASA's Arctic - Boreal Vulnerability Experiment (ABoVE).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lereu, Aude L.; Zerrad, M.; Passian, Ali
In photonics, the field concentration and enhancement have been major objectives for achieving size reduction and device integration. Plasmonics offers resonant field confinement and enhancement, but ultra-sharp optical resonances in all-dielectric multi-layer thin films are emerging as a powerful contestant. Thus, applications capitalizing upon stronger and sharper optical resonances and larger field enhancements could be faced with a choice for the superior platform. In this paper, we present a comparison between plasmonic and dielectric multi-layer thin films for their resonance merits. We show that the remarkable characteristics of the resonance behavior of optimized dielectric multi-layers can outweigh those of their metallic counterpart.
Interfaces and Integration of Medical Image Analysis Frameworks: Challenges and Opportunities.
Covington, Kelsie; McCreedy, Evan S; Chen, Min; Carass, Aaron; Aucoin, Nicole; Landman, Bennett A
2010-05-25
Clinical research with medical imaging typically involves large-scale data analysis with interdependent software toolsets tied together in a processing workflow. Numerous, complementary platforms are available, but these are not readily compatible in terms of workflows or data formats. Both image scientists and clinical investigators could benefit from using the framework that is the most natural fit for the specific problem at hand, but pragmatic choices often dictate that a compromise platform is used for collaboration. Manual merging of platforms through carefully tuned scripts has been effective, but is exceptionally time consuming and not feasible for large-scale integration efforts. Hence, the benefits of innovation are constrained by platform dependence. Removing this constraint via integration of algorithms from one framework into another is the focus of this work. We propose and demonstrate a light-weight interface system to expose parameters across platforms and provide seamless integration. In this initial effort, we focus on four platforms: Medical Image Analysis and Visualization (MIPAV), Java Image Science Toolkit (JIST), command line tools, and 3D Slicer. We explore three case studies: (1) providing a system for MIPAV to expose internal algorithms and utilize these algorithms within JIST, (2) exposing JIST modules through a self-documenting command line interface for inclusion in scripting environments, and (3) detecting and using JIST modules in 3D Slicer. We review the challenges and opportunities for light-weight software integration both within a development language (e.g., Java in MIPAV and JIST) and across languages (e.g., C/C++ in 3D Slicer and shell in command line tools).
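As a generic illustration of the "self-documenting command line interface" idea, the sketch below exposes a module's parameters through argparse so a scripting environment or another framework can discover and drive them. The module name and its parameters are hypothetical; this is not JIST's or MIPAV's actual interface.

```python
# Hypothetical image-analysis module exposed via a self-documenting CLI.
import argparse

def build_parser():
    parser = argparse.ArgumentParser(
        prog="fuzzy-segment",
        description="Hypothetical segmentation module exposed for cross-platform scripting.")
    parser.add_argument("--input", required=True, help="input image volume")
    parser.add_argument("--output", required=True, help="output label volume")
    parser.add_argument("--classes", type=int, default=3, help="number of tissue classes")
    parser.add_argument("--smoothing", type=float, default=0.1, help="regularization weight")
    return parser

if __name__ == "__main__":
    args = build_parser().parse_args()
    # A real module would run the algorithm here; this sketch only echoes the call.
    print(f"would segment {args.input} into {args.classes} classes -> {args.output}")
```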
NASA Astrophysics Data System (ADS)
Quach, N.; Huang, T.; Boening, C.; Gill, K. M.
2016-12-01
Research related to sea level rise crosses multiple disciplines from sea ice to land hydrology. The NASA Sea Level Change Portal (SLCP) is a one-stop source for current sea level change information and data, including interactive tools for accessing and viewing regional data, a virtual dashboard of sea level indicators, and ongoing updates through a suite of editorial products that include content articles, graphics, videos, and animations. The architecture behind the SLCP makes it possible to integrate web content and data relevant to sea level change that are archived across various data centers as well as new data generated by sea level change principal investigators. The Extensible Data Gateway Environment (EDGE) is incorporated into the SLCP architecture to provide a unified platform for web content and science data discovery. EDGE is a data integration platform designed to facilitate high-performance geospatial data discovery and access with the ability to support multi-metadata standard specifications. EDGE has the capability to retrieve data from one or more sources and package the resulting sets into a single response to the requestor. With this unified endpoint, the Data Analysis Tool that is available on the SLCP can retrieve dataset and granule level metadata as well as perform geospatial search on the data. This talk focuses on the architecture that makes it possible to seamlessly integrate and enable discovery of disparate data relevant to sea level rise.
Single cell multiplexed assay for proteolytic activity using droplet microfluidics.
Ng, Ee Xien; Miller, Miles A; Jing, Tengyang; Chen, Chia-Hung
2016-07-15
Cellular enzymes interact in a post-translationally regulated fashion to govern individual cell behaviors, yet current platform technologies are limited in their ability to measure multiple enzyme activities simultaneously in single cells. Here, we developed multi-color Förster resonance energy transfer (FRET)-based enzymatic substrates and use them in a microfluidics platform to simultaneously measure multiple specific protease activities from water-in-oil droplets that contain single cells. By integrating the microfluidic platform with a computational analytical method, Proteolytic Activity Matrix Analysis (PrAMA), we are able to infer six different protease activity signals from individual cells in a high throughput manner (~100 cells/experimental run). We characterized protease activity profiles at single cell resolution for several cancer cell lines including breast cancer cell line MDA-MB-231, lung cancer cell line PC-9, and leukemia cell line K-562 using both live-cell and in-situ cell lysis assay formats, with special focus on metalloproteinases important in metastasis. The ability to measure multiple proteases secreted from or expressed in individual cells allows us to characterize cell heterogeneity and has potential applications including systems biology, pharmacology, cancer diagnosis and stem cell biology. Copyright © 2016 Elsevier B.V. All rights reserved.
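The PrAMA-style inference step described above can be pictured as a small linear inverse problem: observed cleavage rates of several FRET substrates are explained by a substrate-by-protease specificity matrix times the unknown non-negative protease activities. The specificity values and rates below are hypothetical placeholders, not the calibrated panel used in the study.

```python
# Non-negative least squares recovery of protease activities (illustrative only).
import numpy as np
from scipy.optimize import nnls

# Rows: FRET substrates S1..S4; columns: proteases (hypothetical specificity values).
specificity = np.array([
    [0.9, 0.1, 0.0],
    [0.2, 0.8, 0.1],
    [0.0, 0.3, 0.7],
    [0.4, 0.4, 0.2],
])
observed_rates = np.array([0.50, 0.45, 0.30, 0.38])   # per-droplet cleavage rates (hypothetical)

activities, residual = nnls(specificity, observed_rates)
for name, value in zip(["MMP-2", "MMP-9", "ADAM-10"], activities):
    print(f"{name}: inferred activity {value:.2f}")
print(f"fit residual norm: {residual:.3f}")
```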
Revel8or: Model Driven Capacity Planning Tool Suite
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhu, Liming; Liu, Yan; Bui, Ngoc B.
2007-05-31
Designing complex multi-tier applications that must meet strict performance requirements is a challenging software engineering problem. Ideally, the application architect could derive accurate performance predictions early in the project life-cycle, leveraging initial application design-level models and a description of the target software and hardware platforms. To this end, we have developed a capacity planning tool suite for component-based applications, called Revel8tor. The tool adheres to the model driven development paradigm and supports benchmarking and performance prediction for J2EE, .Net and Web services platforms. The suite is composed of three different tools: MDAPerf, MDABench and DSLBench. MDAPerf allows annotation of design diagrams and derives performance analysis models. MDABench allows a customized benchmark application to be modeled in the UML 2.0 Testing Profile and automatically generates a deployable application, with measurement automatically conducted. DSLBench allows the same benchmark modeling and generation to be conducted using a simple performance engineering Domain Specific Language (DSL) in Microsoft Visual Studio. DSLBench integrates with Visual Studio and reuses its load testing infrastructure. Together, the tool suite can assist capacity planning across platforms in an automated fashion.
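For readers unfamiliar with the arithmetic such capacity-planning tools automate, the sketch below applies the utilization law and an M/M/1-style response-time approximation to a measured per-request service demand. The numbers are hypothetical and this is a generic textbook calculation, not Revel8tor's prediction model.

```python
# Generic capacity-planning estimate: utilization and M/M/1 response time.
def predict(arrival_rate_rps, service_demand_s):
    """Return (utilization, predicted response time) for a single-queue approximation."""
    utilization = arrival_rate_rps * service_demand_s
    if utilization >= 1.0:
        return utilization, float("inf")     # saturated: queue grows without bound
    response_time = service_demand_s / (1.0 - utilization)
    return utilization, response_time

for load in (50, 100, 150):                  # requests per second (hypothetical)
    u, r = predict(load, service_demand_s=0.006)
    print(f"{load:>3} req/s -> utilization {u:.0%}, predicted response {r * 1e3:.1f} ms")
```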
The Method of Multiple Spatial Planning Basic Map
NASA Astrophysics Data System (ADS)
Zhang, C.; Fang, C.
2018-04-01
The "Provincial Space Plan Pilot Program" issued in December 2016 called for the existing spatial management and control information platforms of the various departments to be integrated, and for a spatial planning information management platform to be established that integrates basic data, target indicators, spatial coordinates, and technical specifications. Such a platform is intended to provide decision support for plan preparation, digital monitoring and evaluation of plan implementation, parallel review and approval of investment projects and of military construction projects by the departments responsible for spatial management and control, and improved efficiency of administrative approval. The spatial planning system should delimit control boundaries for the development of production, living and ecological space, and implement control of use. On the one hand, it is necessary to clarify the functional orientation of the various kinds of planning space; on the other hand, it is necessary to achieve "multi-compliance" across the various spatial plans. The integration of multiple spatial plans needs a unified and standardized basic map (geographic database and technical specification) to divide space into the three categories of urban, agricultural and ecological space and to provide technical support for refining the spatial control zoning of the relevant plans. This article analyses the main spatial datum, the land use classification standards, base map compilation, and the main technical problems of the basic planning platform, drawing on geographic conditions census results for the preparation of the spatial planning base map and on pilot applications in Heilongjiang and Hainan that combine multiple plans.
Web-based visual analysis for high-throughput genomics
2013-01-01
Background: Visualization plays an essential role in genomics research by making it possible to observe correlations and trends in large datasets as well as communicate findings to others. Visual analysis, which combines visualization with analysis tools to enable seamless use of both approaches for scientific investigation, offers a powerful method for performing complex genomic analyses. However, there are numerous challenges that arise when creating rich, interactive Web-based visualizations/visual analysis applications for high-throughput genomics. These challenges include managing data flow from Web server to Web browser, integrating analysis tools and visualizations, and sharing visualizations with colleagues. Results: We have created a platform that simplifies the creation of Web-based visualization/visual analysis applications for high-throughput genomics. This platform provides components that make it simple to efficiently query very large datasets, draw common representations of genomic data, integrate with analysis tools, and share or publish fully interactive visualizations. Using this platform, we have created a Circos-style genome-wide viewer, a generic scatter plot for correlation analysis, an interactive phylogenetic tree, a scalable genome browser for next-generation sequencing data, and an application for systematically exploring tool parameter spaces to find good parameter values. All visualizations are interactive and fully customizable. The platform is integrated with the Galaxy (http://galaxyproject.org) genomics workbench, making it easy to integrate new visual applications into Galaxy. Conclusions: Visualization and visual analysis play an important role in high-throughput genomics experiments, and approaches are needed to make it easier to create applications for these activities. Our framework provides a foundation for creating Web-based visualizations and integrating them into Galaxy. Finally, the visualizations we have created using the framework are useful tools for high-throughput genomics experiments. PMID:23758618
Li, Wen-Jie; Zhang, Shi-Huang; Wang, Hui-Min
2011-12-01
Ecosystem services evaluation is a hot topic in current ecosystem management, and has a close link with human well-being. This paper summarized the research progress on the evaluation of ecosystem services based on geographic information system (GIS) and remote sensing (RS) technology, which can be reduced to the following three characteristics: ecological economics theory is widely applied as a key method in quantifying ecosystem services; GIS and RS technology play a key role in multi-source data acquisition, spatiotemporal analysis, and platform integration; and ecosystem mechanism models have become a powerful tool for understanding the relationships between natural phenomena and human activities. Considering the present research status and its inadequacies, this paper put forward an "Assembly Line" framework, a distributed framework with scalable characteristics, and discussed the future development trend of integration research on ecosystem services evaluation based on GIS and RS technologies.
Integrated sample-to-detection chip for nucleic acid test assays.
Prakash, R; Pabbaraju, K; Wong, S; Tellier, R; Kaler, K V I S
2016-06-01
Nucleic acid based diagnostic techniques are routinely used for the detection of infectious agents. Most of these assays rely on nucleic acid extraction platforms for the extraction and purification of nucleic acids and a separate real-time PCR platform for quantitative nucleic acid amplification tests (NATs). Several microfluidic lab on chip (LOC) technologies have been developed, where mechanical and chemical methods are used for the extraction and purification of nucleic acids. Microfluidic technologies have also been effectively utilized for chip based real-time PCR assays. However, there are few examples of microfluidic systems which have successfully integrated these two key processes. In this study, we have implemented an electro-actuation based LOC micro-device that leverages multi-frequency actuation of sample and reagent droplets for chip based nucleic acid extraction and real-time, reverse transcription (RT) PCR (qRT-PCR) amplification from clinical samples. Our prototype micro-device combines chemical lysis with electric field assisted isolation of nucleic acid in a four channel parallel processing scheme. Furthermore, a four channel parallel qRT-PCR amplification and detection assay is integrated to deliver the sample-to-detection NAT chip. The NAT chip combines dielectrophoresis and electrostatic/electrowetting actuation methods with resistive micro-heaters and temperature sensors to perform chip based integrated NATs. The two chip modules have been validated using different panels of clinical samples and their performance compared with standard platforms. This study has established that our integrated NAT chip system has a sensitivity and specificity comparable to that of the standard platforms while providing up to 10-fold reduction in sample/reagent volumes.
A two-channel, spectrally degenerate polarization entangled source on chip
NASA Astrophysics Data System (ADS)
Sansoni, Linda; Luo, Kai Hong; Eigner, Christof; Ricken, Raimund; Quiring, Viktor; Herrmann, Harald; Silberhorn, Christine
2017-12-01
Integrated optics provides the platform for the experimental implementation of highly complex and compact circuits for quantum information applications. In this context integrated waveguide sources represent a powerful resource for the generation of quantum states of light due to their high brightness and stability. However, the confinement of the light in a single spatial mode limits the realization of multi-channel sources. Due to this challenge one of the most adopted sources in quantum information processes, i.e. a source which generates spectrally indistinguishable polarization entangled photons in two different spatial modes, has not yet been realized in a fully integrated platform. Here we overcome this limitation by suitably engineering two periodically poled waveguides and an integrated polarization splitter in lithium niobate. This source produces polarization entangled states with a fidelity of F = 0.973 ± 0.003, and a test of Bell's inequality results in a violation larger than 14 standard deviations. It can work in both pulsed and continuous-wave regimes. This device represents a new step toward the implementation of fully integrated circuits for quantum information applications.
A wireless modular multi-modal multi-node patch platform for robust biosignal monitoring.
Pantelopoulos, Alexandros; Saldivar, Enrique; Roham, Masoud
2011-01-01
In this paper a wireless modular, multi-modal, multi-node patch platform is described. The platform comprises a low-cost, semi-disposable patch design aimed at unobtrusive ambulatory monitoring of multiple physiological parameters. Owing to its modular design it can be interfaced with various low-power RF communication and data storage technologies, while the data fusion of multi-modal and multi-node features facilitates measurement of several biosignals from multiple on-body locations for robust feature extraction. Preliminary results of the patch platform are presented which illustrate its capability to extract respiration rate from three independent metrics which, combined, can give a more robust estimate of the actual respiratory rate.
James, Joseph; Murukeshan, Vadakke Matham; Woh, Lye Sun
2014-07-01
The structural and molecular heterogeneities of biological tissues demand interrogation of the samples with multiple energy sources and visualization capabilities at varying spatial resolution and depth scales to obtain complementary diagnostic information. A novel multi-modal imaging approach that uses optical and acoustic energies to perform photoacoustic, ultrasound and fluorescence imaging at multiple resolution scales from the tissue surface and depth is proposed in this paper. The system comprises two distinct forms of hardware-level integration so as to provide an integrated imaging system under a single instrumentation set-up. The experimental studies show that the system is capable of mapping high resolution fluorescence signatures from the surface, and optical absorption and acoustic heterogeneities along the depth (>2 cm) of the tissue at multi-scale resolution (<1 µm to <0.5 mm).
Argumentation Based Joint Learning: A Novel Ensemble Learning Approach
Xu, Junyi; Yao, Li; Li, Le
2015-01-01
Recently, ensemble learning methods have been widely used to improve classification performance in machine learning. In this paper, we present a novel ensemble learning method: argumentation based multi-agent joint learning (AMAJL), which integrates ideas from multi-agent argumentation, ensemble learning, and association rule mining. In AMAJL, argumentation technology is introduced as an ensemble strategy to integrate multiple base classifiers and generate a high-performance ensemble classifier. We design an argumentation framework named Arena as a communication platform for knowledge integration. Through argumentation based joint learning, high quality individual knowledge can be extracted, and thus a refined global knowledge base can be generated and used independently for classification. We perform numerous experiments on multiple public datasets using AMAJL and other benchmark methods. The results demonstrate that our method can effectively extract high quality knowledge for the ensemble classifier and improve the performance of classification. PMID:25966359
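As a rough illustration of the joint-learning idea, the sketch below trains heterogeneous base classifiers (the "agents") and resolves their disagreements with a simple majority vote, a stand-in for the Arena argumentation dialogue; the dataset, classifier choices, and voting rule are assumptions made for illustration, not the AMAJL implementation.

```python
# Illustrative ensemble sketch: scikit-learn classifiers act as "agents" and a
# majority vote stands in for the argumentation-based knowledge integration.
from collections import Counter

from sklearn.datasets import load_iris
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Heterogeneous base classifiers ("agents").
agents = [
    DecisionTreeClassifier(random_state=0).fit(X_train, y_train),
    LogisticRegression(max_iter=1000).fit(X_train, y_train),
    RandomForestClassifier(n_estimators=50, random_state=0).fit(X_train, y_train),
]

def joint_predict(x):
    """Resolve disagreement among agents by majority vote (a simplification of
    the argumentation dialogue described in the abstract)."""
    votes = Counter(int(a.predict([x])[0]) for a in agents)
    return votes.most_common(1)[0][0]

accuracy = sum(joint_predict(x) == t for x, t in zip(X_test, y_test)) / len(y_test)
print(f"joint-learning accuracy: {accuracy:.3f}")
```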
Iconic Gestures for Robot Avatars, Recognition and Integration with Speech.
Bremner, Paul; Leonards, Ute
2016-01-01
Co-verbal gestures are an important part of human communication, improving its efficiency and efficacy for information conveyance. One possible means by which such multi-modal communication might be realized remotely is through the use of a tele-operated humanoid robot avatar. Such avatars have been previously shown to enhance social presence and operator salience. We present a motion tracking based tele-operation system for the NAO robot platform that allows direct transmission of speech and gestures produced by the operator. To assess the capabilities of this system for transmitting multi-modal communication, we conducted a user study that investigated whether robot-produced iconic gestures are comprehensible and are integrated with speech. Outcomes for robot-performed gestures were compared directly with those for gestures produced by a human actor, using a within-participant experimental design. We show that iconic gestures produced by a tele-operated robot are understood by participants when presented alone, almost as well as when produced by a human. More importantly, we show that gestures are integrated with speech equally well for human and robot performances when presented as part of multi-modal communication.
Single Cell Multi-Omics Technology: Methodology and Application.
Hu, Youjin; An, Qin; Sheu, Katherine; Trejo, Brandon; Fan, Shuxin; Guo, Ying
2018-01-01
In the era of precision medicine, multi-omics approaches enable the integration of data from diverse omics platforms, providing multi-faceted insight into the interrelation of these omics layers on disease processes. Single cell sequencing technology can dissect the genotypic and phenotypic heterogeneity of bulk tissue and promises to deepen our understanding of the underlying mechanisms governing both health and disease. Through modification and combination of single cell assays available for transcriptome, genome, epigenome, and proteome profiling, single cell multi-omics approaches have been developed to simultaneously and comprehensively study not only the unique genotypic and phenotypic characteristics of single cells, but also the combined regulatory mechanisms evident only at single cell resolution. In this review, we summarize the state-of-the-art single cell multi-omics methods and discuss their applications, challenges, and future directions.
Surface plasmons and Bloch surface waves: Towards optimized ultra-sensitive optical sensors
Lereu, Aude L.; Zerrad, M.; Passian, Ali; ...
2017-07-07
In photonics, field concentration and enhancement have been major objectives for achieving size reduction and device integration. Plasmonics offers resonant field confinement and enhancement, but ultra-sharp optical resonances in all-dielectric multi-layer thin films are emerging as a powerful contestant. Thus, applications capitalizing upon stronger and sharper optical resonances and larger field enhancements could be faced with a choice for the superior platform. In this paper, we present a comparison between plasmonic and dielectric multi-layer thin films in terms of their resonance merits. We show that the remarkable characteristics of the resonance behavior of optimized dielectric multi-layers can outweigh those of their metallic counterpart.
Multi-agent integrated password management (MIPM) application secured with encryption
NASA Astrophysics Data System (ADS)
Awang, Norkhushaini; Zukri, Nurul Hidayah Ahmad; Rashid, Nor Aimuni Md; Zulkifli, Zuhri Arafah; Nazri, Nor Afifah Mohd
2017-10-01
Users use weak passwords and reuse them on different websites and applications. Password managers are a solution to store login information for websites and help users log in automatically. This project developed a system that acts as an agent managing passwords. Multi-Agent Integrated Password Management (MIPM) is an application using encryption that provides users with secure storage of their login account information such as their usernames, emails and passwords. The project was developed on the Android platform with an encryption agent built using the Java Agent Development Environment (JADE). The purpose of the embedded agents is to act as third-party software that eases the encryption process, and in the future the developed encryption agents can form part of the security system. The application can be used by both computer and mobile users. Users currently log into many applications, which requires unique passwords to prevent password leakage. The crypto agent handles the encryption process using the Advanced Encryption Standard (AES) 128-bit encryption algorithm. As a whole, MIPM is developed as an Android application to provide a secure platform for storing passwords and has high potential to be commercialised for public use.
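As a rough sketch of the encryption step only, the example below protects a credential record with the Python cryptography package, whose Fernet recipe uses AES-128 in CBC mode with an HMAC integrity check. It is not the MIPM/JADE agent code, and the record fields and key handling are invented for illustration.

```python
# Minimal sketch of AES-based credential storage using the "cryptography" package.
# In MIPM the key would be managed by the crypto agent; here it is simply generated
# in place for demonstration.
import json
from cryptography.fernet import Fernet

key = Fernet.generate_key()
crypto_agent = Fernet(key)  # Fernet = AES-128-CBC + HMAC under the hood

record = {"site": "example.com", "username": "alice", "password": "s3cret"}
token = crypto_agent.encrypt(json.dumps(record).encode("utf-8"))

# Later, the password manager decrypts the record to log the user in automatically.
restored = json.loads(crypto_agent.decrypt(token))
assert restored["password"] == "s3cret"
print("encrypted record:", token[:32], b"...")
```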
Recent progress in InP/polymer-based devices for telecom and data center applications
NASA Astrophysics Data System (ADS)
Kleinert, Moritz; Zhang, Ziyang; de Felipe, David; Zawadzki, Crispin; Maese Novo, Alejandro; Brinker, Walter; Möhrle, Martin; Keil, Norbert
2015-02-01
Recent progress on polymer-based photonic devices and hybrid photonic integration technology using InP-based active components is presented. High performance thermo-optic components, including compact polymer variable optical attenuators and switches are powerful tools to regulate and control the light flow in the optical backbone. Polymer arrayed waveguide gratings integrated with InP laser and detector arrays function as low-cost optical line terminals (OLTs) in the WDM-PON network. External cavity tunable lasers combined with C/L band thin-film filter, on-chip U-groove and 45° mirrors construct a compact, bi-directional and color-less optical network unit (ONU). A tunable laser integrated with VOAs, TFEs and two 90° hybrids builds the optical front-end of a colorless, dual-polarization coherent receiver. Multicore polymer waveguides and multi-step 45° mirrors are demonstrated as bridging devices between the spatial-division-multiplexing transmission technology using multi-core fibers and the conventional PLC-based photonic platforms, appealing to the fast development of dense 3D photonic integration.
Balkányi, László
2002-01-01
To develop information systems (IS) in the changing environment of the health sector, a simple but thorough model, avoiding the techno-jargon of informatics, might be useful for the top management. A platform-neutral, extensible, transparent conceptual model should be established. Limitations of current methods lead to a simple but comprehensive mapping in the form of a three-dimensional cube. The three 'orthogonal' views are (a) organization functionality, (b) organizational structures and (c) information technology. Each of the cube sides is described according to its nature. This approach makes it possible to define any kind of IS component as a certain point/layer/domain of the cube and also enables the management to label all IS components independently from any supplier(s) and/or any specific platform. The model handles changes in organization structure, business functionality and the serving info-system independently from each other. Practical application extends to (a) planning complex new ISs, (b) guiding the development of multi-vendor, multi-site ISs, (c) supporting large-scale public procurement procedures and the contracting and implementation phases by establishing a platform-neutral reference, and (d) keeping an exhaustive inventory of an existing large-scale system that handles non-tangible aspects of the IS.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kraucunas, Ian P.; Clarke, Leon E.; Dirks, James A.
2015-04-01
The Platform for Regional Integrated Modeling and Analysis (PRIMA) is an innovative modeling system developed at Pacific Northwest National Laboratory (PNNL) to simulate interactions among natural and human systems at scales relevant to regional decision making. PRIMA brings together state-of-the-art models of regional climate, hydrology, agriculture, socioeconomics, and energy systems using a flexible coupling approach. The platform can be customized to inform a variety of complex questions and decisions, such as the integrated evaluation of mitigation and adaptation options across a range of sectors. Research into stakeholder decision support needs underpins the platform's application to regional issues, including uncertainty characterization. Ongoing numerical experiments are yielding new insights into the interactions among human and natural systems on regional scales with an initial focus on the energy-land-water nexus in the upper U.S. Midwest. This paper focuses on PRIMA's functional capabilities and describes some lessons learned to date about integrated regional modeling.
McIDAS-V: Data Analysis and Visualization for NPOESS and GOES-R
NASA Astrophysics Data System (ADS)
Rink, T.; Achtor, T. H.
2009-12-01
McIDAS-V, the next-generation McIDAS, is being built on top of a modern, cross-platform software framework which supports development of 4-D, interactive displays and integration of a wide array of geophysical data. As the replacement for McIDAS, the development emphasis is on future satellite observation platforms such as NPOESS and GOES-R. Data interrogation, analysis and visualization capabilities have been developed for multi- and hyper-spectral instruments like MODIS, AIRS and IASI, and are being extended for application to VIIRS and CrIS. Compatibility with GOES-R ABI level 1 and level 2 product storage formats has been demonstrated. The abstract data model, which can internalize almost any geophysical data, opens up new possibilities for data fusion techniques, for example polar and geostationary (LEO/GEO) synergy for research and validation. McIDAS-V follows an object-oriented design model, using the Java programming language, allowing specialized extensions for new sources of data and for novel displays and interactive behavior. The reference application, what the user sees on startup, can be customized, and the system has a persistence mechanism allowing sharing of the application state across the internet. McIDAS-V is open-source and free to the public.
Aslam, Luqman; Beal, Kathryn; Ann Blomberg, Le; Bouffard, Pascal; Burt, David W.; Crasta, Oswald; Crooijmans, Richard P. M. A.; Cooper, Kristal; Coulombe, Roger A.; De, Supriyo; Delany, Mary E.; Dodgson, Jerry B.; Dong, Jennifer J.; Evans, Clive; Frederickson, Karin M.; Flicek, Paul; Florea, Liliana; Folkerts, Otto; Groenen, Martien A. M.; Harkins, Tim T.; Herrero, Javier; Hoffmann, Steve; Megens, Hendrik-Jan; Jiang, Andrew; de Jong, Pieter; Kaiser, Pete; Kim, Heebal; Kim, Kyu-Won; Kim, Sungwon; Langenberger, David; Lee, Mi-Kyung; Lee, Taeheon; Mane, Shrinivasrao; Marcais, Guillaume; Marz, Manja; McElroy, Audrey P.; Modise, Thero; Nefedov, Mikhail; Notredame, Cédric; Paton, Ian R.; Payne, William S.; Pertea, Geo; Prickett, Dennis; Puiu, Daniela; Qioa, Dan; Raineri, Emanuele; Ruffier, Magali; Salzberg, Steven L.; Schatz, Michael C.; Scheuring, Chantel; Schmidt, Carl J.; Schroeder, Steven; Searle, Stephen M. J.; Smith, Edward J.; Smith, Jacqueline; Sonstegard, Tad S.; Stadler, Peter F.; Tafer, Hakim; Tu, Zhijian (Jake); Van Tassell, Curtis P.; Vilella, Albert J.; Williams, Kelly P.; Yorke, James A.; Zhang, Liqing; Zhang, Hong-Bin; Zhang, Xiaojun; Zhang, Yang; Reed, Kent M.
2010-01-01
A synergistic combination of two next-generation sequencing platforms with a detailed comparative BAC physical contig map provided a cost-effective assembly of the genome sequence of the domestic turkey (Meleagris gallopavo). Heterozygosity of the sequenced source genome allowed discovery of more than 600,000 high quality single nucleotide variants. Despite this heterozygosity, the current genome assembly (∼1.1 Gb) includes 917 Mb of sequence assigned to specific turkey chromosomes. Annotation identified nearly 16,000 genes, with 15,093 recognized as protein coding and 611 as non-coding RNA genes. Comparative analysis of the turkey, chicken, and zebra finch genomes, and comparing avian to mammalian species, supports the characteristic stability of avian genomes and identifies genes unique to the avian lineage. Clear differences are seen in number and variety of genes of the avian immune system where expansions and novel genes are less frequent than examples of gene loss. The turkey genome sequence provides resources to further understand the evolution of vertebrate genomes and genetic variation underlying economically important quantitative traits in poultry. This integrated approach may be a model for providing both gene and chromosome level assemblies of other species with agricultural, ecological, and evolutionary interest. PMID:20838655
Soares, Patrícia; Alves, Renato J; Abecasis, Ana B; Penha-Gonçalves, Carlos; Gomes, M Gabriela M; Pereira-Leal, José B
2013-08-30
Tuberculosis is currently the second highest cause of death from infectious diseases worldwide. The emergence of multi and extensive drug resistance is threatening to make tuberculosis incurable. There is growing evidence that the genetic diversity of Mycobacterium tuberculosis may have important clinical consequences. Therefore, combining genetic, clinical and socio-demographic data is critical to understand the epidemiology of this infectious disease, and how virulence and other phenotypic traits evolve over time. This requires dedicated bioinformatics platforms, capable of integrating and enabling analyses of this heterogeneous data. We developed inTB, a web-based system for integrated warehousing and analysis of clinical, socio-demographic and molecular data for Mycobacterium sp. isolates. As a database it can organize and display data from any of the standard genotyping methods (SNP, MIRU-VNTR, RFLP and spoligotype), as well as an extensive array of clinical and socio-demographic variables that are used in multiple countries to characterize the disease. Through the inTB interface it is possible to insert and download data, browse the database and search specific parameters. New isolates are automatically classified into strains according to an internal reference, and data uploaded or typed in is checked for internal consistency. As an analysis framework, the system provides simple, point and click analysis tools that allow multiple types of data plotting, as well as simple ways to download data for external analysis. Individual trees for each genotyping method are available, as well as a super tree combining all of them. The integrative nature of inTB grants the user the ability to generate trees for filtered subsets of data crossing molecular and clinical/socio-demographic information. inTB is built on open source software, can be easily installed locally and easily adapted to other diseases. Its design allows for use by research laboratories, hospitals or public health authorities. The full source code as well as ready to use packages is available at http://www.evocell.org/inTB. To the best of our knowledge, this is the only system capable of integrating different types of molecular data with clinical and socio-demographic data, empowering researchers and clinicians with easy to use analysis tools that were not possible before.
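As an illustration of the automatic strain classification step, the sketch below matches a new isolate's MIRU-VNTR profile against an internal reference table and applies a simple consistency check; the reference profiles, locus count, and mismatch threshold are invented placeholders, not inTB's internal reference.

```python
# Hypothetical strain assignment by nearest MIRU-VNTR profile (illustrative only).
REFERENCE_STRAINS = {
    "Beijing-like": (2, 2, 4, 3, 3, 5, 3, 2, 3, 4, 3, 3),
    "LAM-like":     (2, 1, 5, 3, 2, 5, 3, 3, 3, 2, 2, 3),
    "Haarlem-like": (2, 3, 4, 2, 3, 5, 2, 2, 3, 4, 2, 2),
}

def classify(profile, max_mismatches=1):
    """Assign an isolate to the closest reference strain, or flag it as unassigned."""
    if len(profile) != 12:
        raise ValueError("expected a 12-locus MIRU-VNTR profile")  # consistency check

    def distance(strain):
        return sum(a != b for a, b in zip(REFERENCE_STRAINS[strain], profile))

    best = min(REFERENCE_STRAINS, key=distance)
    return best if distance(best) <= max_mismatches else "unassigned (possible new strain)"

print(classify((2, 2, 4, 3, 3, 5, 3, 2, 3, 4, 3, 2)))  # one locus from Beijing-like
print(classify((1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1, 1)))  # far from every reference
```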
Effective use of metadata in the integration and analysis of multi-dimensional optical data
NASA Astrophysics Data System (ADS)
Pastorello, G. Z.; Gamon, J. A.
2012-12-01
Data discovery and integration relies on adequate metadata. However, creating and maintaining metadata is time consuming and often poorly addressed or avoided altogether, leading to problems in later data analysis and exchange. This is particularly true for research fields in which metadata standards do not yet exist or are under development, or within smaller research groups without enough resources. Vegetation monitoring using in-situ and remote optical sensing is an example of such a domain. In this area, data are inherently multi-dimensional, with spatial, temporal and spectral dimensions usually being well characterized. Other equally important aspects, however, might be inadequately translated into metadata. Examples include equipment specifications and calibrations, field/lab notes and field/lab protocols (e.g., sampling regimen, spectral calibration, atmospheric correction, sensor view angle, illumination angle), data processing choices (e.g., methods for gap filling, filtering and aggregation of data), quality assurance, and documentation of data sources, ownership and licensing. Each of these aspects can be important as metadata for search and discovery, but they can also be used as key data fields in their own right. If each of these aspects is also understood as an "extra dimension," it is possible to take advantage of them to simplify the data acquisition, integration, analysis, visualization and exchange cycle. Simple examples include selecting data sets of interest early in the integration process (e.g., only data collected according to a specific field sampling protocol) or applying appropriate data processing operations to different parts of a data set (e.g., adaptive processing for data collected under different sky conditions). More interesting scenarios involve guided navigation and visualization of data sets based on these extra dimensions, as well as partitioning data sets to highlight relevant subsets to be made available for exchange. The DAX (Data Acquisition to eXchange) Web-based tool uses a flexible metadata representation model and takes advantage of multi-dimensional data structures to translate metadata types into data dimensions, effectively reshaping data sets according to available metadata. With that, metadata is tightly integrated into the acquisition-to-exchange cycle, allowing for more focused exploration of data sets while also increasing the value of, and incentives for, keeping good metadata. The tool is being developed and tested with optical data collected in different settings, including laboratory, field, airborne, and satellite platforms.
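The "extra dimension" idea can be sketched with a small, invented table in which metadata fields such as sampling protocol and sky condition sit alongside the measurements, so subsets can be selected early and processed differently; this illustrates the concept only and is not the DAX tool itself.

```python
# Illustrative sketch: metadata fields become columns ("dimensions") of the dataset,
# so they can drive selection and adaptive processing. All values are invented.
import pandas as pd

measurements = pd.DataFrame({
    "wavelength_nm": [450, 550, 650, 450, 550, 650],
    "reflectance":   [0.04, 0.09, 0.31, 0.05, 0.11, 0.35],
    "protocol":      ["transect-A"] * 3 + ["transect-B"] * 3,
    "sky_condition": ["clear"] * 3 + ["overcast"] * 3,
})

# Select data early along a metadata dimension, e.g. only clear-sky acquisitions.
clear_sky = measurements[measurements["sky_condition"] == "clear"]

# Apply different processing per metadata value, e.g. a placeholder correction
# factor for overcast acquisitions (adaptive processing).
corrected = measurements.copy()
corrected.loc[corrected["sky_condition"] == "overcast", "reflectance"] *= 0.97

print(clear_sky)
print(corrected.groupby(["protocol", "sky_condition"])["reflectance"].mean())
```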
Practice innovation: the need for nimble data platforms to implement precision oncology care.
Elfiky, Aymen; Zhang, Dongyang; Krishnan Nair, Hari K
2015-01-01
Given the drive toward personalized, value-based, and coordinated cancer care delivery, modern knowledge-based practice is being shaped within the context of an increasingly technology-driven healthcare landscape. The ultimate promise of 'precision medicine' is predicated on taking advantage of the range of new capabilities for integrating disease- and individual-specific data to define new taxonomies as part of a systems-based knowledge network. Specifically, with cancer being a constantly evolving complex disease process, proper care of an individual will require the ability to seamlessly integrate multi-dimensional 'omic' and clinical data. Importantly, however, the challenges of curating knowledge from multiple dynamic data sources and translating to practice at the point-of-care highlight parallel needs. As patients, caregivers, and their environments become more proactive in clinical care and management, practical success of precision medicine is equally dependent on the development of proper infrastructures for evolving data integration, platforms for knowledge representation in a clinically-relevant context, and implementation within a provider's work-life and workflow.
Automatic Generation of Cycle-Approximate TLMs with Timed RTOS Model Support
NASA Astrophysics Data System (ADS)
Hwang, Yonghyun; Schirner, Gunar; Abdi, Samar
This paper presents a technique for automatically generating cycle-approximate transaction level models (TLMs) for multi-process applications mapped to embedded platforms. It incorporates three key features: (a) basic block level timing annotation, (b) RTOS model integration, and (c) RTOS overhead delay modeling. The inputs to TLM generation are application C processes and their mapping to processors in the platform. A processor data model, including pipelined datapath, memory hierarchy and branch delay model is used to estimate basic block execution delays. The delays are annotated to the C code, which is then integrated with a generated SystemC RTOS model. Our abstract RTOS provides dynamic scheduling and inter-process communication (IPC) with processor- and RTOS-specific pre-characterized timing. Our experiments using an MP3 decoder and a JPEG encoder show that timed TLMs, with integrated RTOS models, can be automatically generated in less than a minute. Our generated TLMs simulated three times faster than real-time and showed less than 10% timing error compared to board measurements.
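As a toy illustration of basic-block timing annotation, the sketch below sums invented per-operation cycle costs for one block and produces the wait statement that would be spliced into the generated model; the actual tool works from a detailed processor data model and emits SystemC, so everything here is a placeholder.

```python
# Toy basic-block delay estimation (illustrative only; cycle costs are invented,
# not the processor data model used by the TLM generator).
CYCLES = {"load": 3, "store": 3, "alu": 1, "branch": 2, "mul": 4}

def annotate_basic_block(ops, clock_ns=10):
    """Estimate a block's delay and return the annotation to splice into the TLM."""
    cycles = sum(CYCLES[op] for op in ops)
    return cycles, f"wait({cycles * clock_ns}, SC_NS);  // auto-annotated block delay"

block = ["load", "alu", "alu", "mul", "store", "branch"]
cycles, annotation = annotate_basic_block(block)
print(f"{len(block)} operations -> {cycles} cycles -> {annotation}")
```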
pyPaSWAS: Python-based multi-core CPU and GPU sequence alignment.
Warris, Sven; Timal, N Roshan N; Kempenaar, Marcel; Poortinga, Arne M; van de Geest, Henri; Varbanescu, Ana L; Nap, Jan-Peter
2018-01-01
Our previously published CUDA-only application PaSWAS for Smith-Waterman (SW) sequence alignment of any type of sequence on NVIDIA-based GPUs is platform-specific and therefore adopted less widely than it could be. The OpenCL language is supported more widely and allows use on a variety of hardware platforms. Moreover, there is a need to promote the adoption of parallel computing in bioinformatics by making its use and extension simpler through more and better application of high-level languages commonly used in bioinformatics, such as Python. The novel application pyPaSWAS presents the parallel SW sequence alignment code fully packed in Python. It is a generic SW implementation running on several hardware platforms with multi-core systems and/or GPUs that provides accurate sequence alignments which can also be inspected for alignment details. Additionally, pyPaSWAS supports the affine gap penalty. Python libraries are used for automated system configuration, I/O and logging. This way, the Python environment will stimulate further extension and use of pyPaSWAS. pyPaSWAS presents an easy Python-based environment for accurate and retrievable parallel SW sequence alignments on GPUs and multi-core systems. The strategy of integrating Python with high-performance parallel compute languages to create a developer- and user-friendly environment should be considered for other computationally intensive bioinformatics algorithms.
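For readers unfamiliar with the underlying recurrence, a toy pure-Python Smith-Waterman scorer is sketched below. pyPaSWAS evaluates the same dynamic program in parallel on GPUs and multi-core CPUs and supports affine gap penalties; this simplified version uses a linear gap penalty and omits traceback.

```python
# Minimal Smith-Waterman local alignment score (linear gap penalty, no traceback).
# Scoring parameters are illustrative defaults, not pyPaSWAS settings.
def smith_waterman(a, b, match=2, mismatch=-1, gap=-2):
    rows, cols = len(a) + 1, len(b) + 1
    H = [[0] * cols for _ in range(rows)]
    best, best_pos = 0, (0, 0)
    for i in range(1, rows):
        for j in range(1, cols):
            diag = H[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            H[i][j] = max(0, diag, H[i - 1][j] + gap, H[i][j - 1] + gap)
            if H[i][j] > best:
                best, best_pos = H[i][j], (i, j)
    return best, best_pos

score, end = smith_waterman("ACACACTA", "AGCACACA")
print(f"best local alignment score {score}, ending at cell {end}")
```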
A hybrid approach to device integration on a genetic analysis platform
NASA Astrophysics Data System (ADS)
Brennan, Des; Jary, Dorothee; Kurg, Ants; Berik, Evgeny; Justice, John; Aherne, Margaret; Macek, Milan; Galvin, Paul
2012-10-01
Point-of-care (POC) systems require significant component integration to implement biochemical protocols associated with molecular diagnostic assays. Hybrid platforms where discrete components are combined in a single platform are a suitable approach to integration, where combining multiple device fabrication steps on a single substrate is not possible due to incompatible or costly fabrication steps. We integrate three devices each with a specific system functionality: (i) a silicon electro-wetting-on-dielectric (EWOD) device to move and mix sample and reagent droplets in an oil phase, (ii) a polymer microfluidic chip containing channels and reservoirs and (iii) an aqueous phase glass microarray for fluorescence microarray hybridization detection. The EWOD device offers the possibility of fully integrating on-chip sample preparation using nanolitre sample and reagent volumes. A key challenge is sample transfer from the oil phase EWOD device to the aqueous phase microarray for hybridization detection. The EWOD device, waveguide performance and functionality are maintained during the integration process. An on-chip biochemical protocol for arrayed primer extension (APEX) was implemented for single nucleotide polymorphism (SNiP) analysis. The prepared sample is aspirated from the EWOD oil phase to the aqueous phase microarray for hybridization. A bench-top instrumentation system was also developed around the integrated platform to drive the EWOD electrodes, implement APEX sample heating and image the microarray after hybridization.
Brower, Kevin P; Ryakala, Venkat K; Bird, Ryan; Godawat, Rahul; Riske, Frank J; Konstantinov, Konstantin; Warikoo, Veena; Gamble, Jean
2014-01-01
Downstream sample purification for quality attribute analysis is a significant bottleneck in process development for non-antibody biologics. Multi-step chromatography process train purifications are typically required prior to many critical analytical tests. This prerequisite leads to limited throughput, long lead times to obtain purified product, and significant resource requirements. In this work, immunoaffinity purification technology has been leveraged to achieve single-step affinity purification of two different enzyme biotherapeutics (Fabrazyme® [agalsidase beta] and Enzyme 2) with polyclonal and monoclonal antibodies, respectively, as ligands. Target molecules were rapidly isolated from cell culture harvest in sufficient purity to enable analysis of critical quality attributes (CQAs). Most importantly, this is the first study that demonstrates the application of predictive analytics techniques to predict critical quality attributes of a commercial biologic. The data obtained using the affinity columns were used to generate appropriate models to predict quality attributes that would be obtained after traditional multi-step purification trains. These models empower process development decision-making with drug substance-equivalent product quality information without generation of actual drug substance. Optimization was performed to ensure maximum target recovery and minimal target protein degradation. The methodologies developed for Fabrazyme were successfully reapplied for Enzyme 2, indicating platform opportunities. The impact of the technology is significant, including reductions in time and personnel requirements, rapid product purification, and substantially increased throughput. Applications are discussed, including upstream and downstream process development support to achieve the principles of Quality by Design (QbD) as well as integration with bioprocesses as a process analytical technology (PAT). © 2014 American Institute of Chemical Engineers.
A pivotal-based approach for enterprise business process and IS integration
NASA Astrophysics Data System (ADS)
Ulmer, Jean-Stéphane; Belaud, Jean-Pierre; Le Lann, Jean-Marc
2013-02-01
A company must be able to describe and react to any endogenous or exogenous event. Such flexibility can be achieved through business process management (BPM). Nevertheless, a BPM approach highlights complex relations between the business and IT domains. A misalignment between heterogeneous models is exposed: this is the 'business-IT gap' described in the literature. Using concepts from business engineering and from information systems driven by models and IT, we define a generic approach ensuring multi-view consistency. Its role is to maintain and provide all information related to the structure and semantics of the models. By allowing the full return of a transformed model in the sense of reverse engineering, our platform enables synchronisation between the analysis model and the implementation model.
The key technique study of a kind of personal navigation oriented LBS system
NASA Astrophysics Data System (ADS)
Yan, Lei; Zheng, Jianghua; Zhang, Xin; Peng, Chunhua; He, Lina
2005-11-01
With the integration of GIS, IT and wireless communication techniques, LBS is developing rapidly and has attracted wide attention. Personal navigation is a critical application of LBS, with higher requirements on data quality, positioning accuracy and multi-modal services. This study discusses the key techniques of a personal-navigation-oriented LBS system. Taking NAVISTAR, a service platform for China Unicom, as an example, it especially emphasizes the importance of spatial data organization. Based on the CDMA 1X network, it adopts the gpsOne\MS-Assisted dynamic positioning technique and puts forward a data organization solution to realize multi-scale representation.
NASA Astrophysics Data System (ADS)
Kollet, S. J.; Goergen, K.; Gasper, F.; Shresta, P.; Sulis, M.; Rihani, J.; Simmer, C.; Vereecken, H.
2013-12-01
In studies of the terrestrial hydrologic, energy and biogeochemical cycles, integrated multi-physics simulation platforms take a central role in characterizing non-linear interactions, variances and uncertainties of system states and fluxes in reciprocity with observations. Recently developed integrated simulation platforms attempt to honor the complexity of the terrestrial system across multiple time and space scales from the deeper subsurface including groundwater dynamics into the atmosphere. Technically, this requires the coupling of atmospheric, land surface, and subsurface-surface flow models in supercomputing environments, while ensuring a high-degree of efficiency in the utilization of e.g., standard Linux clusters and massively parallel resources. A systematic performance analysis including profiling and tracing in such an application is crucial in the understanding of the runtime behavior, to identify optimum model settings, and is an efficient way to distinguish potential parallel deficiencies. On sophisticated leadership-class supercomputers, such as the 28-rack 5.9 petaFLOP IBM Blue Gene/Q 'JUQUEEN' of the Jülich Supercomputing Centre (JSC), this is a challenging task, but even more so important, when complex coupled component models are to be analysed. Here we want to present our experience from coupling, application tuning (e.g. 5-times speedup through compiler optimizations), parallel scaling and performance monitoring of the parallel Terrestrial Systems Modeling Platform TerrSysMP. The modeling platform consists of the weather prediction system COSMO of the German Weather Service; the Community Land Model, CLM of NCAR; and the variably saturated surface-subsurface flow code ParFlow. The model system relies on the Multiple Program Multiple Data (MPMD) execution model where the external Ocean-Atmosphere-Sea-Ice-Soil coupler (OASIS3) links the component models. TerrSysMP has been instrumented with the performance analysis tool Scalasca and analyzed on JUQUEEN with processor counts on the order of 10,000. The instrumentation is used in weak and strong scaling studies with real data cases and hypothetical idealized numerical experiments for detailed profiling and tracing analysis. The profiling is not only useful in identifying wait states that are due to the MPMD execution model, but also in fine-tuning resource allocation to the component models in search of the most suitable load balancing. This is especially necessary, as with numerical experiments that cover multiple (high resolution) spatial scales, the time stepping, coupling frequencies, and communication overheads are constantly shifting, which makes it necessary to re-determine the model setup with each new experimental design.
The ARGO Project: assessing NA-TECH risks on off-shore oil platforms
NASA Astrophysics Data System (ADS)
Capuano, Paolo; Basco, Anna; Di Ruocco, Angela; Esposito, Simona; Fusco, Giannetta; Garcia-Aristizabal, Alexander; Mercogliano, Paola; Salzano, Ernesto; Solaro, Giuseppe; Teofilo, Gianvito; Scandone, Paolo; Gasparini, Paolo
2017-04-01
ARGO (Analysis of natural and anthropogenic risks on off-shore oil platforms) is a 2-year project funded by the DGS-UNMIG (Directorate General for Safety of Mining and Energy Activities - National Mining Office for Hydrocarbons and Georesources) of the Italian Ministry of Economic Development. The project, coordinated by AMRA (Center for the Analysis and Monitoring of Environmental Risk), aims at providing technical support for the analysis of natural and anthropogenic risks on offshore oil platforms. In order to achieve this challenging objective, ARGO brings together climate experts, risk management experts, seismologists, geologists, chemical engineers, and earth and coastal observation experts. ARGO has developed methodologies for the probabilistic analysis of industrial accidents triggered by natural events (NA-TECH) on offshore oil platforms in the Italian seas, including extreme events related to climate change. Furthermore, the environmental effects of offshore activities have been investigated, including changes in seismicity and in the evolution of coastal areas close to offshore platforms. A probabilistic multi-risk framework has then been developed for the analysis of NA-TECH events on offshore installations for hydrocarbon extraction.
Deeply etched MMI-based components on 4 μm thick SOI for SOA-based optical RAM cell circuits
NASA Astrophysics Data System (ADS)
Cherchi, Matteo; Ylinen, Sami; Harjanne, Mikko; Kapulainen, Markku; Aalto, Timo; Kanellos, George T.; Fitsios, Dimitrios; Pleros, Nikos
2013-02-01
We present novel deeply etched functional components, fabricated by multi-step patterning in the frame of our 4 μm thick Silicon on Insulator (SOI) platform based on single-mode rib waveguides and on the previously developed rib-to-strip converter. These novel components include Multi-Mode Interference (MMI) splitters with any desired splitting ratio, wavelength sensitive 50/50 splitters with pre-filtering capability, multi-stage Mach-Zehnder Interferometer (MZI) filters for suppression of Amplified Spontaneous Emission (ASE), and MMI resonator filters. These novel building blocks enable functionalities otherwise not achievable on our SOI platform, and make it possible to integrate optical RAM cell layouts by resorting to our technology for hybrid integration of Semiconductor Optical Amplifiers (SOAs). Typical SOA-based RAM cell layouts require generic splitting ratios, which are not readily achievable by a single MMI splitter. We present here a novel solution to this problem, which is very compact and versatile and perfectly suits our technology. Another useful functional element when using SOAs is the pass-band filter to suppress ASE. We pursued two complementary approaches: a suitable interleaved cascaded MZI filter, based on a novel suitably designed MMI coupler with pre-filtering capabilities, and a completely novel MMI resonator concept, to achieve larger free spectral ranges and narrower pass-band response. Simulation and design principles are presented and compared to preliminary experimental functional results, together with scaling rules and predictions of achievable RAM cell densities. When combined with our newly developed ultra-small light-turning concept, these new components are expected to pave the way for high integration density of RAM cells.
HERA: A New Platform for Embedding Agents in Heterogeneous Wireless Sensor Networks
NASA Astrophysics Data System (ADS)
Alonso, Ricardo S.; de Paz, Juan F.; García, Óscar; Gil, Óscar; González, Angélica
Ambient Intelligence (AmI) based systems require the development of innovative solutions that integrate distributed intelligent systems with context-aware technologies. In this sense, Multi-Agent Systems (MAS) and Wireless Sensor Networks (WSN) are two key technologies for developing distributed systems based on AmI scenarios. This paper presents the new HERA (Hardware-Embedded Reactive Agents) platform, which allows the use of dynamic and self-adaptable heterogeneous WSNs on which agents are directly embedded on the wireless nodes. This approach facilitates the inclusion of context-aware capabilities in AmI systems to gather data from their surrounding environments, achieving a higher level of ubiquitous and pervasive computing.
Montague, Elizabeth; Stanberry, Larissa; Higdon, Roger; Janko, Imre; Lee, Elaine; Anderson, Nathaniel; Choiniere, John; Stewart, Elizabeth; Yandl, Gregory; Broomall, William; Kolker, Natali
2014-01-01
Multi-omics data-driven scientific discovery crucially rests on high-throughput technologies and data sharing. Currently, data are scattered across single omics repositories, stored in varying raw and processed formats, and are often accompanied by limited or no metadata. The Multi-Omics Profiling Expression Database (MOPED, http://moped.proteinspire.org) version 2.5 is a freely accessible multi-omics expression database. Continual improvement and expansion of MOPED is driven by feedback from the Life Sciences Community. In order to meet the emergent need for an integrated multi-omics data resource, MOPED 2.5 now includes gene relative expression data in addition to protein absolute and relative expression data from over 250 large-scale experiments. To facilitate accurate integration of experiments and increase reproducibility, MOPED provides extensive metadata through the Data-Enabled Life Sciences Alliance (DELSA Global, http://delsaglobal.org) metadata checklist. MOPED 2.5 has greatly increased the number of proteomics absolute and relative expression records to over 500,000, in addition to adding more than four million transcriptomics relative expression records. MOPED has an intuitive user interface with tabs for querying different types of omics expression data and new tools for data visualization. Summary information including expression data, pathway mappings, and direct connection between proteins and genes can be viewed on Protein and Gene Details pages. These connections in MOPED provide a context for multi-omics expression data exploration. Researchers are encouraged to submit omics data which will be consistently processed into expression summaries. MOPED as a multi-omics data resource is a pivotal public database, interdisciplinary knowledge resource, and platform for multi-omics understanding. PMID:24910945
Duncan, R G; Saperia, D; Dulbandzhyan, R; Shabot, M M; Polaschek, J X; Jones, D T
2001-01-01
The advent of the World-Wide-Web protocols and client-server technology has made it easy to build low-cost, user-friendly, platform-independent graphical user interfaces to health information systems and to integrate the presentation of data from multiple systems. The authors describe a Web interface for a clinical data repository (CDR) that was moved from concept to production status in less than six months using a rapid prototyping approach, multi-disciplinary development team, and off-the-shelf hardware and software. The system has since been expanded to provide an integrated display of clinical data from nearly 20 disparate information systems.
Sung, Wen-Tsai; Chiang, Yen-Chun
2012-12-01
This study examines a wireless sensor network with real-time remote identification using the Android study of things (HCIOT) platform in community healthcare. An improved particle swarm optimization (PSO) method is proposed to efficiently enhance the measurement precision of physiological multi-sensor data fusion in the Internet of Things (IOT) system. The improved PSO (IPSO) includes inertia weight factor design and shrinkage factor adjustment to improve the data fusion performance of the PSO algorithm. The Android platform is employed to build multi-physiological signal processing and timely medical care analysis. Wireless sensor network signal transmission and Internet links allow community or family members to access timely medical care network services.
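The two ingredients the abstract highlights, an inertia weight and a shrinkage (constriction) factor, can be sketched as follows; the objective function, bounds and parameter values below are placeholders for illustration, not the authors' sensor-fusion error model.

```python
# Hedged PSO sketch with inertia weight w and Clerc's constriction ("shrinkage")
# factor; the fusion objective and sensor readings are invented.
import random

def pso(objective, dim, bounds=(-5.0, 5.0), n_particles=20, iters=200,
        w=0.9, c1=2.05, c2=2.05):
    # Constriction factor, defined for phi = c1 + c2 > 4.
    phi = c1 + c2
    chi = 2.0 / abs(2.0 - phi - (phi * phi - 4.0 * phi) ** 0.5) if phi > 4 else 1.0

    lo, hi = bounds
    pos = [[random.uniform(lo, hi) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    pbest_val = [objective(p) for p in pos]
    g_idx = min(range(n_particles), key=lambda i: pbest_val[i])
    gbest, gbest_val = pbest[g_idx][:], pbest_val[g_idx]

    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = random.random(), random.random()
                # Inertia-weighted velocity update, damped by the constriction factor.
                vel[i][d] = chi * (w * vel[i][d]
                                   + c1 * r1 * (pbest[i][d] - pos[i][d])
                                   + c2 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            val = objective(pos[i])
            if val < pbest_val[i]:
                pbest[i], pbest_val[i] = pos[i][:], val
                if val < gbest_val:
                    gbest, gbest_val = pos[i][:], val
    return gbest, gbest_val

# Toy fusion task: find the single estimate that minimises squared disagreement
# with three invented body-temperature readings; the optimum is their mean.
readings = [36.4, 36.7, 36.5]
best, err = pso(lambda x: sum((x[0] - r) ** 2 for r in readings),
                dim=1, bounds=(30.0, 45.0))
print(f"fused estimate: {best[0]:.3f}  residual error: {err:.5f}")
```

With c1 + c2 > 4 the constriction factor damps the velocities and keeps the swarm from diverging, while the inertia weight controls how much of the previous velocity is retained.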
Yuan, Huiming; Zhou, Yuan; Zhang, Lihua; Liang, Zhen; Zhang, Yukui
2009-10-30
An integrated platform combining protein- and peptide-level separation was established around an on-line protein digestion unit, in which proteins were first separated by column-switch recycling size exclusion chromatography (csrSEC), digested on-line by an immobilized trypsin microreactor, trapped and desalted by two parallel C8 precolumns, separated by microRPLC with a linear gradient of organic modifier concentration, and identified by ESI-MS/MS. A 6-protein mixture, with Mr ranging from 10 kDa to 80 kDa, was used to evaluate the performance of the integrated platform, and all proteins were identified with sequence coverage over 5.67%. Our experimental results demonstrate that such an integrated platform offers advantages such as good time compatibility, high peak capacity, and facile automation, which make it a promising approach for proteome studies.
Lin, Lihua; Liu, Shengquan; Nie, Zhou; Chen, Yingzhuang; Lei, Chunyang; Wang, Zhen; Yin, Chao; Hu, Huiping; Huang, Yan; Yao, Shouzhuo
2015-04-21
Nowadays, large-scale screening for enzyme discovery, engineering, and drug discovery processes requires simple, fast, and sensitive enzyme activity assay platforms with high integration and potential for high-throughput detection. Herein, a novel automatic and integrated micro-enzyme assay (AIμEA) platform was proposed based on a unique microreaction system fabricated from an engineered green fluorescent protein (GFP)-functionalized monolithic capillary column, with thrombin as an example. The recombinant GFP probe was rationally engineered to possess a His-tag and a substrate sequence of thrombin, which enable it to be immobilized on the monolith via metal affinity binding and to be released after thrombin digestion. Combined with capillary electrophoresis-laser-induced fluorescence (CE-LIF), all the procedures, including thrombin injection, online enzymatic digestion in the microreaction system, and label-free detection of the released GFP, were integrated in a single electrophoretic process. By taking advantage of the ultrahigh loading capacity of the AIμEA platform and the automatic programming setup of CE, one microreaction column was sufficient for many digestions without replacement. The novel microreaction system showed significantly enhanced catalytic efficiency, about 30-fold higher than that of the equivalent bulk reaction. Accordingly, the AIμEA platform was highly sensitive, with a limit of detection down to 1 pM of thrombin. Moreover, the AIμEA platform was robust and reliable for detecting thrombin in human serum samples and its inhibition by hirudin. Hence, this AIμEA platform exhibits great potential for high-throughput analysis in future biological applications, disease diagnostics, and drug screening.
In situ 3D nanoprinting of free-form coupling elements for hybrid photonic integration
NASA Astrophysics Data System (ADS)
Dietrich, P.-I.; Blaicher, M.; Reuter, I.; Billah, M.; Hoose, T.; Hofmann, A.; Caer, C.; Dangel, R.; Offrein, B.; Troppenz, U.; Moehrle, M.; Freude, W.; Koos, C.
2018-04-01
Hybrid photonic integration combines complementary advantages of different material platforms, offering superior performance and flexibility compared with monolithic approaches. This applies in particular to multi-chip concepts, where components can be individually optimized and tested. The assembly of such systems, however, requires expensive high-precision alignment and adaptation of optical mode profiles. We show that these challenges can be overcome by in situ printing of facet-attached beam-shaping elements. Our approach allows precise adaptation of vastly dissimilar mode profiles and permits alignment tolerances compatible with cost-efficient passive assembly techniques. We demonstrate a selection of beam-shaping elements at chip and fibre facets, achieving coupling efficiencies of up to 88% between edge-emitting lasers and single-mode fibres. We also realize printed free-form mirrors that simultaneously adapt beam shape and propagation direction, and we explore multi-lens systems for beam expansion. The concept paves the way to automated assembly of photonic multi-chip systems with unprecedented performance and versatility.
Progressive simplification and transmission of building polygons based on triangle meshes
NASA Astrophysics Data System (ADS)
Li, Hongsheng; Wang, Yingjie; Guo, Qingsheng; Han, Jiafu
2010-11-01
Digital earth is a virtual representation of our planet and a data integration platform which aims at harnessing multi-source, multi-resolution, multi-format spatial data. This paper introduces a research framework integrating progressive cartographic generalization and transmission of vector data. Progressive cartographic generalization provides multi-resolution data from coarse to fine, as key scales and increments between them, which is not available in traditional generalization frameworks. Based on the progressive simplification algorithm, the building polygons are triangulated into meshes and encoded according to the simplification sequence of two basic operations, edge collapse and vertex split. The map data at key scales and the encoded increments between them are stored in a multi-resolution file. As the client submits requests to the server, the coarsest map is transmitted first and then the increments. After data decoding and mesh refinement, the building polygons are visualized with progressively more detail. Progressive generalization and transmission of building polygons is demonstrated in the paper.
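A simplified sketch of the progressive encode/decode idea: a polygon outline is coarsened by repeatedly removing the least important vertex, each removal is recorded as an increment, and the client replays the increments in reverse to refine the coarse shape. The area criterion stands in for the paper's triangle-mesh edge collapse and the vertex re-insertion for the vertex split; the coordinates are invented.

```python
# Illustrative progressive simplification and refinement (not the authors' encoder).
def triangle_area(a, b, c):
    return abs((b[0] - a[0]) * (c[1] - a[1]) - (c[0] - a[0]) * (b[1] - a[1])) / 2.0

def simplify(polygon, keep=4):
    """Return the coarse polygon plus the list of (index, vertex) increments removed."""
    pts = list(polygon)
    increments = []
    while len(pts) > keep:
        areas = [triangle_area(pts[i - 1], pts[i], pts[(i + 1) % len(pts)])
                 for i in range(len(pts))]
        i = min(range(len(pts)), key=lambda k: areas[k])
        increments.append((i, pts.pop(i)))          # "edge collapse" record
    return pts, increments

def refine(coarse, increments):
    """Replay increments in reverse order ("vertex split") to restore detail."""
    pts = list(coarse)
    for i, v in reversed(increments):
        pts.insert(i, v)
    return pts

building = [(0, 0), (4, 0), (4, 1), (5, 1), (5, 3), (4, 3), (4, 4), (0, 4)]
coarse, incs = simplify(building, keep=4)
assert refine(coarse, incs) == building            # lossless progressive refinement
print("coarse outline:", coarse, "| increments transmitted later:", len(incs))
```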
Wu, Chun-Chang; Chuang, Wen-Yu; Wu, Ching-Da; Su, Yu-Cheng; Huang, Yung-Yang; Huang, Yang-Jing; Peng, Sheng-Yu; Yu, Shih-An; Lin, Chih-Ting; Lu, Shey-Shi
2017-01-01
A self-sustained multi-sensor platform for indoor environmental monitoring is proposed in this paper. To reduce the cost and power consumption of the sensing platform, in the developed platform, organic materials of PEDOT:PSS and PEDOT:PSS/EB-PANI are used as the sensing films for humidity and CO2 detection, respectively. Different from traditional gas sensors, these organic sensing films can operate at room temperature without heating processes or infrared transceivers so that the power consumption of the developed humidity and the CO2 sensors can be as low as 10 μW and 5 μW, respectively. To cooperate with these low-power sensors, a Complementary Metal-Oxide-Semiconductor (CMOS) system-on-chip (SoC) is designed to amplify and to read out multiple sensor signals with low power consumption. The developed SoC includes an analog-front-end interface circuit (AFE), an analog-to-digital convertor (ADC), a digital controller and a power management unit (PMU). Scheduled by the digital controller, the sensing circuits are power gated with a small duty-cycle to reduce the average power consumption to 3.2 μW. The designed PMU converts the power scavenged from a dye sensitized solar cell (DSSC) module into required supply voltages for SoC circuits operation under typical indoor illuminance conditions. To our knowledge, this is the first multiple environmental parameters (Temperature/CO2/Humidity) sensing platform that demonstrates a true self-powering functionality for long-term operations. PMID:28353680
Systematic Analysis of Rocky Shore Morphology along 700km of Coastline Using LiDAR-derived DEMs
NASA Astrophysics Data System (ADS)
Matsumoto, H.; Dickson, M. E.; Masselink, G.
2016-12-01
Rock shore platforms occur along much of the world's coast and have a long history of study; however, uncertainty remains concerning the relative importance of various formative controls in different settings (e.g. wave erosion, weathering, tidal range, rock resistance, inheritance). Ambiguity is often attributed to intrinsic natural variability and the lack of preserved evidence on eroding rocky shores, but it could also be argued that previous studies are limited in scale, focusing on a small number of local sites, which restricts the potential for insights from broad, regional analyses. Here we describe a method, using LiDAR-derived digital elevation models (DEMs), for analysing shore platform morphology over an unprecedentedly wide area in which there are large variations in environmental conditions. The new method semi-automatically extracts shore platform profiles and systematically conducts morphometric analysis. We apply the method to 700 km of coast in the SW UK that is exposed to (i) wave climates ranging from highly energetic swell to local wind waves, (ii) macro- to mega-tidal ranges, and (iii) lithologies ranging from highly resistant igneous rocks to moderately hard sedimentary rocks. Computer programs are developed to estimate mean sea level, mean spring tidal range, wave height, and rock strength along the coastline. Filtering routines automatically select and remove profiles that are unsuitable for analysis. The large dataset of remaining profiles supports broad and systematic investigation of possible controls on platform morphology. Results, as expected, show wide scatter, because many formative controls are in play, but several trends exist that are generally consistent with relationships inferred from local site studies. This paper describes correlation analysis of platform morphology in relation to environmental conditions and also presents a multi-variable empirical model derived from multiple linear regression analysis. Interesting matches exist between platform gradients obtained from the field and empirical model predictions, particularly when the morphological variability found in the LiDAR-based shore platform analysis is considered. These findings frame a discussion on formative controls of rocky shore morphology.
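A minimal sketch of the multiple-linear-regression step described above, relating a platform morphometric (here, gradient) to environmental predictors with ordinary least squares. The predictor names and the synthetic data are assumptions for illustration; the real input would be the filtered LiDAR-derived profiles.

```python
# Sketch of the multiple-linear-regression step: relate a platform morphometric
# (gradient) to environmental predictors. Column names and the synthetic data
# are assumptions for illustration.
import numpy as np

def fit_platform_model(gradient, wave_height, tidal_range, rock_strength):
    """Least-squares fit: gradient ~ b0 + b1*Hs + b2*TR + b3*UCS."""
    X = np.column_stack([np.ones_like(wave_height), wave_height,
                         tidal_range, rock_strength])
    coeffs, *_ = np.linalg.lstsq(X, gradient, rcond=None)
    predictions = X @ coeffs
    return coeffs, predictions

# Tiny synthetic example standing in for the filtered LiDAR profile dataset
rng = np.random.default_rng(0)
hs, tr, ucs = rng.uniform(1, 5, 50), rng.uniform(4, 12, 50), rng.uniform(20, 150, 50)
grad = 0.5 + 0.3 * hs - 0.05 * tr - 0.002 * ucs + rng.normal(0, 0.1, 50)
betas, _ = fit_platform_model(grad, hs, tr, ucs)
print(betas)  # fitted intercept and coefficients
```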
Methods for multi-material stereolithography
Wicker, Ryan [El Paso, TX]; Medina, Francisco [El Paso, TX]; Elkins, Christopher [Redwood City, CA]
2011-06-14
Methods and systems of stereolithography for building cost-efficient and time-saving multi-material, multi-functional and multi-colored prototypes, models and devices configured for intermediate washing and curing/drying are disclosed, including: laser(s), liquid and/or platform level sensing system(s), controllable optical system(s), moveable platform(s), elevator platform(s), recoating system(s) and at least one polymer retaining receptacle. Multiple polymer retaining receptacles may be arranged in a moveable apparatus, wherein each receptacle is adapted to actively/passively maintain a uniform, desired level of polymer by including a recoating device and a material fill/remove system. The platform is movably accessible to the polymer retaining receptacle(s), elevator mechanism(s) and washing and curing/drying area(s), which may be housed in shielded enclosure(s). The elevator mechanism is configured to vertically traverse and rotate the platform, thus providing angled building, washing and curing/drying capabilities. A horizontal traversing mechanism may be included to facilitate manufacturing between components of SL cabinet(s) and/or alternative manufacturing technologies.
2010-01-01
Background: An important focus of genomic science is the discovery and characterization of all functional elements within genomes. In silico methods are used in genome studies to discover putative regulatory genomic elements (called words or motifs). Although a number of methods have been developed for motif discovery, most of them lack the scalability needed to analyze large genomic data sets. Methods: This manuscript presents WordSeeker, an enumerative motif discovery toolkit that utilizes multi-core and distributed computational platforms to enable scalable analysis of genomic data. A controller task coordinates activities of worker nodes, each of which (1) enumerates a subset of the DNA word space and (2) scores words with a distributed Markov chain model. Results: A comprehensive suite of performance tests was conducted to demonstrate the performance, speedup and efficiency of WordSeeker. The scalability of the toolkit enabled the analysis of the entire genome of Arabidopsis thaliana; the results of the analysis were integrated into The Arabidopsis Gene Regulatory Information Server (AGRIS). A public version of WordSeeker was deployed on the Glenn cluster at the Ohio Supercomputer Center. Conclusion: WordSeeker effectively utilizes concurrent computing platforms to enable the identification of putative functional elements in genomic data sets. This capability facilitates the analysis of the large quantity of sequenced genomic data. PMID:21210985
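A single-process sketch of the enumerate-and-score idea described above: count k-mer occurrences and score each word against a first-order Markov background model. The distributed controller/worker machinery of WordSeeker is omitted, and the function names are illustrative.

```python
# Single-process sketch of enumerative motif discovery with a Markov-chain
# background model. The distributed controller/worker layer is omitted.
from collections import Counter
from itertools import product
import math

def kmer_counts(sequence, k):
    return Counter(sequence[i:i + k] for i in range(len(sequence) - k + 1))

def markov1_log_prob(word, base_freq, transition):
    """Log-probability of a word under a first-order Markov background."""
    logp = math.log(base_freq[word[0]])
    for a, b in zip(word, word[1:]):
        logp += math.log(transition[a][b])
    return logp

def score_words(sequence, k, base_freq, transition):
    """Score each k-mer by observed count vs. expected count under background."""
    counts = kmer_counts(sequence, k)
    n_positions = len(sequence) - k + 1
    scores = {}
    for word in ("".join(p) for p in product("ACGT", repeat=k)):
        expected = n_positions * math.exp(markov1_log_prob(word, base_freq, transition))
        scores[word] = (counts.get(word, 0) - expected) / math.sqrt(expected)
    return scores

uniform = {b: 0.25 for b in "ACGT"}                 # toy background model
trans = {a: uniform for a in "ACGT"}
print(sorted(score_words("ACGTACGTACGTTTTT", 3, uniform, trans).items(),
             key=lambda kv: -kv[1])[:3])            # top over-represented words
```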
Multi terabits/s optical access transport technologies
NASA Astrophysics Data System (ADS)
Binh, Le Nguyen; Wang Tao, Thomas; Livshits, Daniil; Gubenko, Alexey; Karinou, Fotini; Liu Ning, Gordon; Shkolnik, Alexey
2016-02-01
Tremendous efforts have been devoted to multi-Tbps transmission over ultra-long-haul, metro and access optical networks. With the exponentially increasing demand for data transmission, storage and serving, especially in 5G wireless access scenarios, optical Internet networking has evolved towards data-center-based optical networks, putting pressure on novel and economical access transmission systems. This paper reports (1) experimental platforms and transmission techniques employing band-limited 10G-class optical components for 100G transmission at 28 Gbaud. Advanced modulation formats such as PAM-4, DMT and duobinary are reported, and their advantages and disadvantages are analyzed, with the aim of achieving multi-Tbps optical transmission systems for access and inter- and intra-data-center networks; (2) integrated multi-Tbps transmitters combining comb laser sources and micro-ring modulators that meet the performance required for access systems. Ten-sub-carrier quantum-dot comb lasers are employed together with wideband optical intensity modulators to demonstrate the feasibility of such sources, with integrated micro-ring modulators acting as a combined demultiplexing/multiplexing and modulation function, hence offering compactness and economies of scale. Using multi-level modulation and direct detection at 56 GBd, an aggregate of more than 2 Tbps, and even 3 Tbps, can be achieved by interleaving two comb lasers of 16 sub-carrier lines each; (3) finally, fundamental designs of ultra-compact flexible filters and switching components integrated in silicon photonics for multi-Tbps active interconnection are presented. Experimental results on multi-channel transmission and on the performance of optical switching matrices and their effects on data channels are presented.
Radiation and scattering from printed antennas on cylindrically conformal platforms
NASA Technical Reports Server (NTRS)
Kempel, Leo C.; Volakis, John L.; Bindiganavale, Sunil
1994-01-01
The goal was to develop suitable methods and software for the analysis of antennas on cylindrical coated and uncoated platforms. Specifically, the finite element boundary integral and finite element ABC methods were employed successfully and associated software were developed for the analysis and design of wraparound and discrete cavity-backed arrays situated on cylindrical platforms. This work led to the successful implementation of analysis software for such antennas. Developments which played a role in this respect are the efficient implementation of the 3D Green's function for a metallic cylinder, the incorporation of the fast Fourier transform in computing the matrix-vector products executed in the solver of the finite element-boundary integral system, and the development of a new absorbing boundary condition for terminating the finite element mesh on cylindrical surfaces.
Multi-function microfluidic platform for sensor integration.
Fernandes, Ana C; Semenova, Daria; Panjan, Peter; Sesay, Adama M; Gernaey, Krist V; Krühne, Ulrich
2018-03-06
The limited availability of metabolite-specific sensors for continuous sampling and monitoring is one of the main bottlenecks contributing to failures in bioprocess development. Furthermore, only a limited number of approaches exist to connect currently available measurement systems with high throughput reactor units. This is especially relevant in the biocatalyst screening and characterization stage of process development. In this work, a strategy for sensor integration in microfluidic platforms is demonstrated, to address the need for rapid, cost-effective and high-throughput screening in bioprocesses. This platform is compatible with different sensor formats by enabling their replacement and was built in order to be highly flexible and thus suitable for a wide range of applications. Moreover, this re-usable platform can easily be connected to analytical equipment, such as HPLC, laboratory scale reactors or other microfluidic chips through the use of standardized fittings. In addition, the developed platform includes a two-sensor system interspersed with a mixing channel, which allows the detection of samples that might be outside the first sensor's range of detection, through dilution of the sample solution up to 10 times. In order to highlight the features of the proposed platform, inline monitoring of glucose levels is presented and discussed. Glucose was chosen due to its importance in biotechnology as a relevant substrate. The platform demonstrated continuous measurement of substrate solutions for up to 12 h. Furthermore, the influence of the fluid velocity on substrate diffusion was observed, indicating the need for in-flow calibration to achieve a good quantitative output. Copyright © 2018 Elsevier B.V. All rights reserved.
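A small sketch of the two-sensor range-extension logic described above: if the first (undiluted) sensor is saturated, fall back to the second sensor's reading and rescale by the known dilution factor. The threshold and the 10× factor are assumptions for illustration.

```python
# Sketch of the two-sensor range-extension logic: when the first (undiluted)
# sensor saturates, use the second sensor's reading and rescale by the dilution
# factor. Threshold and dilution factor are assumed values.

def effective_concentration(sensor1_reading, sensor2_reading,
                            sensor1_max=20.0, dilution_factor=10.0):
    """Return the concentration estimate, using the diluted channel if the
    first sensor is at or beyond its upper detection limit."""
    if sensor1_reading < sensor1_max:
        return sensor1_reading
    return sensor2_reading * dilution_factor

print(effective_concentration(12.0, 1.3))   # within range: use first sensor
print(effective_concentration(20.0, 4.5))   # saturated: 4.5 * 10 = 45.0
```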
CSDC: a nationwide screening platform for stroke control and prevention in China.
Jinghui Yu; Huajian Mao; Mei Li; Dan Ye; Dongsheng Zhao
2016-08-01
As a leading cause of severe disability and death, stroke places an enormous burden on Chinese society. A nationwide stroke screening platform called CSDC (China Stroke Data Center) has been built to support the national stroke prevention program and stroke clinical research since 2011. This platform is composed of a data integration system and a big data analysis system. The data integration system is used to collect information on risk factors, diagnosis history, treatment, and sociodemographic characteristics, as well as stroke patients' EMRs. The big data analysis system supports decision-making for stroke control and prevention, clinical evaluation and research. In this paper, the design and implementation of CSDC are illustrated, and some application results are presented. The platform is expected to provide rich data and powerful tool support for stroke control and prevention in China.
Clinical validation of the CHRONIOUS wearable system in patients with chronic disease.
Bellos, Christos; Papadopoulos, Athanassios; Rosso, Roberto; Fotiadis, Dimitrios I
2013-01-01
The CHRONIOUS system defines a powerful and easy-to-use framework designed to provide services to clinicians and to their patients suffering from chronic diseases. The system is composed of a wearable shirt that integrates several body sensors, a portable smart device and a central sub-system responsible for the long-term storage of the collected patient data. A multi-parametric expert system is developed for the analysis of the collected data using intelligent algorithms and complex techniques. Apart from the vital signals, dietary habits, drug intake, activity data, and environmental and biochemical parameters are recorded. The CHRONIOUS platform is validated through clinical trials in several medical centers and patients' home environments, recruiting patients suffering from Chronic Obstructive Pulmonary Disease (COPD) and Chronic Kidney Disease (CKD). The clinical trials contributed to improving the system's accuracy, while pulmonology and nephrology experts used the CHRONIOUS platform to evaluate its efficiency and performance. The results of the system's use were very encouraging. The CHRONIOUS system has proven to be a well-validated real-time patient monitoring and supervision platform, providing a useful tool for the clinician and the patient that contributes to more effective management of chronic diseases.
The Real-Time Monitoring Service Platform for Land Supervision Based on Cloud Integration
NASA Astrophysics Data System (ADS)
Sun, J.; Mao, M.; Xiang, H.; Wang, G.; Liang, Y.
2018-04-01
Remote sensing monitoring has become an important means for land and resources departments to strengthen supervision. To address the problems of low monitoring frequency and poor data currency in current remote sensing monitoring, this paper develops a cloud-integrated real-time monitoring service platform for land supervision, which increases the monitoring frequency by comprehensively acquiring domestic satellite image data and accelerates remote sensing image processing by exploiting intelligent dynamic processing technology for multi-source images. A pilot application in the Jinan Bureau of State Land Supervision has shown that this real-time monitoring method for land supervision is feasible. In addition, real-time monitoring and early-warning functions are provided for illegal land use, permanent basic farmland protection and urban development boundary breaches. The application has achieved remarkable results.
Cartwright, Joseph F; Anderson, Karin; Longworth, Joseph; Lobb, Philip; James, David C
2018-06-01
High-fidelity replication of biologic-encoding recombinant DNA sequences by engineered mammalian cell cultures is an essential pre-requisite for the development of stable cell lines for the production of biotherapeutics. However, immortalized mammalian cells characteristically exhibit an increased point mutation frequency compared to mammalian cells in vivo, both across their genomes and at specific loci (hotspots). Thus, unforeseen mutations in recombinant DNA sequences can arise and be maintained within producer cell populations. These may both affect the stability of recombinant gene expression and give rise to protein sequence variants with variable bioactivity and immunogenicity. Rigorous quantitative assessment of recombinant DNA integrity should therefore form part of the cell line development process and be an essential quality assurance metric for instances where synthetic/multi-component assemblies are utilized to engineer mammalian cells, such as the assessment of recombinant DNA fidelity or the mutability of single-site integration target loci. Based on Pacific Biosciences (Menlo Park, CA) single molecule real-time (SMRT™) circular consensus sequencing (CCS) technology, we developed an rDNA sequence analysis tool to process the multi-parallel sequencing of ∼40,000 single recombinant DNA molecules. After statistical filtering of raw sequencing data, we show that this analytical method is capable of detecting single point mutations in rDNA down to a minimum single mutation frequency of 0.0042% (<1/24,000 bases). Using a stable CHO transfectant pool harboring a randomly integrated 5 kb plasmid construct encoding GFP, we found that 28% of recombinant plasmid copies contained at least one low-frequency (<0.3%) point mutation. These mutations were predominantly found in GC base pairs (85%), and there was no positional bias in mutation across the plasmid sequence. There was no discernible difference between the mutation frequencies of coding and non-coding DNA. The putative ratio of non-synonymous to synonymous changes within the open reading frames (ORFs) in the plasmid sequence indicates that natural selection does not impact the prevalence of these mutations. Here we have demonstrated the abundance of mutations that fall outside the reported range of detection of next generation sequencing (NGS) and second generation sequencing (SGS) platforms, providing a methodology that can be utilized in cell line development platforms to verify the fidelity of recombinant genes throughout the production process. © 2018 Wiley Periodicals, Inc.
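A simplified sketch of the per-base mutation-frequency tally implied by the abstract: compare consensus read sequences to the reference plasmid and count mismatches per position. Real pipelines operate on aligned, statistically filtered SMRT CCS reads; this toy version assumes equal-length, pre-aligned reads.

```python
# Simplified per-base mutation-frequency tally: compare consensus reads to the
# reference and count mismatches per position. Assumes pre-aligned reads of the
# same length as the reference, which real CCS pipelines do not.

def mutation_profile(reference, reads):
    """Return (overall mutation frequency, per-position mismatch counts)."""
    per_position = [0] * len(reference)
    total_bases = 0
    for read in reads:
        for i, (ref_base, read_base) in enumerate(zip(reference, read)):
            total_bases += 1
            if read_base != ref_base:
                per_position[i] += 1
    overall = sum(per_position) / total_bases if total_bases else 0.0
    return overall, per_position

ref = "ATGGCGTACGTT"
reads = ["ATGGCGTACGTT", "ATGGCGTACGAT", "ATGGCGTACGTT"]
freq, profile = mutation_profile(ref, reads)
print(f"mutation frequency = {freq:.4%}", profile)
```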
Mudumbai, Seshadri; Ayer, Ferenc; Stefanko, Jerry
2017-08-01
Health care facilities are implementing analytics platforms as a way to document quality of care. However, few gap analyses exist on platforms specifically designed for patients treated in the Operating Room, Post-Anesthesia Care Unit, and Intensive Care Unit (ICU). As part of a quality improvement effort, we undertook a gap analysis of an existing analytics platform within the Veterans Healthcare Administration. The objectives were to identify themes associated with 1) current clinical use cases and stakeholder needs; 2) information flow and pain points; and 3) recommendations for future analytics development. Methods consisted of semi-structured interviews in 2 phases with a diverse set (n = 9) of support personnel and end users from five facilities across a Veterans Integrated Service Network. Phase 1 identified underlying needs and previous experiences with the analytics platform across various roles and operational responsibilities. Phase 2 validated preliminary feedback, lessons learned, and recommendations for improvement. Emerging themes suggested that the existing system met a small pool of national reporting requirements. However, pain points were identified with accessing data in several information system silos and performing multiple manual validation steps of data content. Notable recommendations included enhancing systems integration to create "one-stop shopping" for data, and developing a capability to perform trends analysis. Our gap analysis suggests that analytics platforms designed for surgical and ICU patients should employ approaches similar to those being used for primary care patients.
NASA Astrophysics Data System (ADS)
Vivoni, E.; Mascaro, G.; Shupe, J. W.; Hiatt, C.; Potter, C. S.; Miller, R. L.; Stanley, J.; Abraham, T.; Castilla-Rubio, J.
2012-12-01
Droughts and their hydrological consequences are a major threat to food security throughout the world. In arid and semiarid regions dependent on irrigated agriculture, prolonged droughts lead to significant and recurring economic and social losses. In this contribution, we present preliminary results on integrating a set of multi-resolution drought indices into a cloud computing-based visualization platform. We focused our initial efforts on Brazil due to a severe, on-going drought in a large agricultural area in the northeastern part of the country. The online platform includes drought products developed from: (1) a MODIS-based water stress index (WSI) based on inferences from normalized difference vegetation index and land surface temperature fields, (2) a volumetric water content (VWC) index obtained from application of the NASA CASA model, and (3) a set of AVHRR-based vegetation health indices obtained from NOAA/NESDIS. The drought indices are also presented in terms of anomalies with respect to a baseline period. Since our main objective is to engage stakeholders and decision-makers in Brazil, we incorporated other relevant geospatial data into the platform, including irrigation areas, dams and reservoirs, administrative units and annual climate information. We will also present a set of use cases developed to help stakeholders explore, query and provide feedback that allowed fine-tuning of the drought product delivery, presentation and analysis tools. Finally, we discuss potential next steps in development of the online platform, including applications at finer resolutions in specific basins and at a coarser global scale.
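A minimal sketch of the baseline-anomaly computation mentioned above: express a drought index (e.g. the MODIS-based WSI) as a standardized anomaly relative to a baseline period. Array shapes and values are illustrative assumptions.

```python
# Sketch of expressing a drought index as a standardized anomaly with respect
# to a baseline climatology. Array shapes and values are illustrative only.
import numpy as np

def standardized_anomaly(index_stack, baseline_slice):
    """index_stack: (time, lat, lon) array; baseline_slice: time indices defining
    the baseline climatology. Returns per-pixel z-score anomalies."""
    baseline = index_stack[baseline_slice]
    mean = baseline.mean(axis=0)
    std = baseline.std(axis=0)
    std = np.where(std == 0, np.nan, std)   # avoid division by zero
    return (index_stack - mean) / std

wsi = np.random.default_rng(1).random((120, 4, 4))   # 10 years of monthly WSI values
anomalies = standardized_anomaly(wsi, slice(0, 60))  # first 5 years as baseline
print(anomalies[-1].round(2))                        # latest month's anomaly map
```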
Blue guardian: an open architecture for rapid ISR demonstration
NASA Astrophysics Data System (ADS)
Barrett, Donald A.; Borntrager, Luke A.; Green, David M.
2016-05-01
Throughout the Department of Defense (DoD), acquisition, platform integration, and life cycle costs for weapons systems have continued to rise. Although Open Architecture (OA) interface standards are one of the primary methods being used to reduce these costs, the Air Force Rapid Capabilities Office (AFRCO) has extended the OA concept and chartered the Open Mission System (OMS) initiative with industry to develop and demonstrate a consensus-based, non-proprietary, OA standard for integrating subsystems and services into airborne platforms. The new OMS standard provides the capability to decouple vendor-specific sensors, payloads, and service implementations from platform-specific architectures and is still in the early stages of maturation and demonstration. The Air Force Research Laboratory (AFRL) - Sensors Directorate has developed the Blue Guardian program to demonstrate advanced sensing technology utilizing open architectures in operationally relevant environments. Over the past year, Blue Guardian has developed a platform architecture using the Air Force's OMS reference architecture and conducted a ground and flight test program of multiple payload combinations. Systems tested included a vendor-unique variety of Full Motion Video (FMV) systems, a Wide Area Motion Imagery (WAMI) system, a multi-mode radar system, processing and database functions, multiple decompression algorithms, multiple communications systems, and a suite of software tools. Initial results of the Blue Guardian program show the promise of OA to DoD acquisitions, especially for Intelligence, Surveillance and Reconnaissance (ISR) payload applications. Specifically, the OMS reference architecture was extremely useful in reducing the cost and time required for integrating new systems.
NASA Astrophysics Data System (ADS)
Di Stefano, M.; Fox, P. A.; Beaulieu, S. E.; Maffei, A. R.; West, P.; Hare, J. A.
2012-12-01
Integrated assessments of large marine ecosystems require the understanding of interactions between environmental, ecological, and socio-economic factors that affect production and utilization of marine natural resources. Assessing the functioning of complex coupled natural-human systems calls for collaboration between natural and social scientists across disciplinary and national boundaries. We are developing a platform to implement and sustain informatics solutions for these applications, providing interoperability among very diverse and heterogeneous data and information sources, as well as multi-disciplinary organizations and people. We have partnered with NOAA NMFS scientists to facilitate the deployment of an integrated ecosystem approach to management in the Northeast U.S. (NES) and California Current Large Marine Ecosystems (LMEs). Our platform will facilitate the collaboration and knowledge sharing among NMFS natural and social scientists, promoting community participation in integrating data, models, and knowledge. Here, we present collaborative software tools developed to aid the production of the Ecosystem Status Report (ESR) for the NES LME. The ESR addresses the D-P-S portion of the DPSIR (Driver-Pressure-State-Impact-Response) management framework: reporting data, indicators, and information products for climate drivers, physical and human (fisheries) pressures, and ecosystem state (primary and secondary production and higher trophic levels). We are developing our tools in open-source software, with the main tool based on a web application capable of working with multiple data types from a variety of sources, providing an effective way to share the source code used to generate data products and associated metadata, as well as to track workflow provenance, enabling the reproducibility of a data product. Our platform retrieves data, conducts standard analyses, reports data quality and other standardized metadata, provides iterative and interactive visualization, and enables the download of data plotted in the ESR. Data, indicators, and information products include time series, geographic maps, and uni-variate and multi-variate analyses. Also central to the success of this initiative is the commitment to accommodate and train scientists of multiple disciplines who will learn to interact effectively with this new integrated and interoperable ecosystem assessment capability. Traceability, repeatability, explanation, verification, and validation of data, indicators, and information products are important for cross-disciplinary understanding and sharing with managers, policymakers, and the public. We are also developing an ontology to support the implementation of the DPSIR framework. These new capabilities will serve as the essential foundation for the formal synthesis and quantitative analysis of information on relevant natural and socio-economic factors in relation to specified ecosystem management goals, and can be applied in other LMEs.
Crowdsourcing engagement and applications for communities within crisis events
NASA Astrophysics Data System (ADS)
Frigerio, Simone; Schenato, Luca; Bossi, Giulia; Mantovani, Matteo; Crema, Stefano; Cavalli, Marco; Marcato, Gianluca; Pasuto, Alessandro
2017-04-01
Civil protection practice is changing in the context of natural hazards, with responsibilities shifting from central government to local authorities. The competence of volunteers and the awareness and involvement of local inhabitants are key points for prevention and preparedness. Citizens and volunteers become the first actors of civil protection, through context-specific strategies of surveillance and territorial surveys. The crowd-mapping technology includes a mobile solution tested within trained communities as a form of participation in disaster response. The platform also includes a user-friendly dashboard for data gathering and analysis in multi-hazard settings, tested with pilot case studies. Usability and gradual innovation of the platform are continuously ensured by a cloud dataset and bug-fixing controls. The first module focuses on flood processes, gathering data from the local, trained population for awareness and long-term preparedness. The second module integrates field surveys by volunteers within rescue squads, combining geolocations and comparing datasets collected in pre-emergency phases in urban case studies. The results include an easy-to-use data interface for crisis management and a support tool tested during crises in combination with personal awareness, continuously updated and customized. The development provides a version for Android 4.0 onward; the web application combines a cloud architecture with a relational database and web services, integrated with SDK cloud notifications. The wireframes define two access paths, a Citizens Kit and a Volunteers Kit, synchronized with a common dashboard. The follow-up includes the integration of the mobile solutions with sensors for dynamic updates and data export for GIS analysis. The location-based services use location data to monitor parameters and control features related to natural hazards. The aim is a human sensor network, integrating sensor measurements with external observations as the baseline of future modelling. Point data such as humidity, temperature and pressure are geolocated in real time. Human sensors represent a massive crowdsourcing approach, and user-friendly dashboards provide solid control of data management to support resilience and the quality of risk assessment.
Li, Ying-Jun; Yang, Cong; Wang, Gui-Cong; Zhang, Hui; Cui, Huan-Yong; Zhang, Yong-Liang
2017-09-01
This paper presents a novel integrated piezoelectric six-dimensional force sensor which can realize dynamic measurement of multi-dimensional spatial loads. Firstly, the composition of the sensor, the spatial layout of the force-sensitive components, and the measurement principle are analyzed and designed. Theoretical analysis indicates that the piezoelectric six-dimensional force sensor is free of inter-dimensional interference. Based on the principle of actual work and deformation compatibility, the parallel load-sharing principle of the piezoelectric six-dimensional force sensor is derived, and the main factors affecting the load-sharing ratio are obtained. A finite element model of the piezoelectric six-dimensional force sensor is established. To verify the load-sharing principle of the sensor, a load-sharing test device for the piezoelectric force sensor is designed and fabricated, and a load-sharing experimental platform is set up. The experimental results agree with the theoretical analysis and simulation results. The experiments show that multi-dimensional, heavy-load force measurement can be realized by the parallel arrangement of the load-sharing ring and the force-sensitive element in the novel integrated piezoelectric six-dimensional force sensor, and that an ideal load-sharing effect can be achieved with appropriate size parameters. This work provides important guidance for the design of force-measuring devices based on the load-sharing mode. Copyright © 2017 ISA. Published by Elsevier Ltd. All rights reserved.
Iconic Gestures for Robot Avatars, Recognition and Integration with Speech
Bremner, Paul; Leonards, Ute
2016-01-01
Co-verbal gestures are an important part of human communication, improving its efficiency and efficacy for information conveyance. One possible means by which such multi-modal communication might be realized remotely is through the use of a tele-operated humanoid robot avatar. Such avatars have been previously shown to enhance social presence and operator salience. We present a motion-tracking-based tele-operation system for the NAO robot platform that allows direct transmission of speech and gestures produced by the operator. To assess the capabilities of this system for transmitting multi-modal communication, we have conducted a user study that investigated whether robot-produced iconic gestures are comprehensible and are integrated with speech. Robot-performed gesture outcomes were compared directly to those for gestures produced by a human actor, using a within-participant experimental design. We show that iconic gestures produced by a tele-operated robot are understood by participants when presented alone, almost as well as when produced by a human. More importantly, we show that gestures are integrated with speech when presented as part of a multi-modal communication equally well for human and robot performances. PMID:26925010
NASA Astrophysics Data System (ADS)
Huang, M.
2016-12-01
Earth System models (ESMs) are effective tools for investigating the water-energy-food system interactions under climate change. In this presentation, I will introduce research efforts at the Pacific Northwest National Laboratory towards quantifying impacts of LULCC on the water-energy-food nexus in a changing climate using an integrated regional Earth system modeling framework: the Platform for Regional Integrated Modeling and Analysis (PRIMA). Two studies will be discussed to showcase the capability of PRIMA: (1) quantifying changes in terrestrial hydrology over the Conterminous US (CONUS) from 2005 to 2095 using the Community Land Model (CLM) driven by high-resolution downscaled climate and land cover products from PRIMA, which was designed for assessing the impacts of and potential responses to climate and anthropogenic changes at regional scales; (2) applying CLM over the CONUS to provide the first county-scale model validation in simulating crop yields and assessing associated impacts on the water and energy budgets using CLM. The studies demonstrate the benefits of incorporating and coupling human activities into complex ESMs, and critical needs to account for the biogeophysical and biogeochemical effects of LULCC in climate impacts studies, and in designing mitigation and adaptation strategies at a scale meaningful for decision-making. Future directions in quantifying LULCC impacts on the water-energy-food nexus under a changing climate, as well as feedbacks among climate, energy production and consumption, and natural/managed ecosystems using an Integrated Multi-scale, Multi-sector Modeling framework will also be discussed.
Battle Lab Simulation Collaboration Environment (BLSCE): Multipurpose Platform for Simulation C2
2006-06-01
encryption, low-probability of intercept and detection communications, and specialized intelligent agents will provide the brick and mortar for our...echelons. It allows multi-celled experimentations among several locations that cover all of the United States. It has become a gateway for Joint...of exercises from remote locations, including live-force play. • Integration of combined arms experimentation in support of Army Transformation
Advanced Wireless Integrated Navy Network - AWINN
2005-09-30
progress report No. 3 on AWINN hardware and software configurations of smart, wideband, multi-function antennas, secure configurable platform, close-in...results to the host PC via a UART soft core. The UART core used is a proprietary Xilinx core which incorporates features described in National...current software uses wheel odometry and visual landmarks to create a map and estimate position on an internal x, y grid. The wheel odometry provides a
Integration of hybrid silicon lasers and electroabsorption modulators.
Sysak, Matthew N; Anthes, Joel O; Bowers, John E; Raday, Omri; Jones, Richard
2008-08-18
We present an integration platform based on quantum well intermixing for multi-section hybrid silicon lasers and electroabsorption modulators. As a demonstration of the technology, we have fabricated discrete sampled grating DBR lasers and sampled grating DBR lasers integrated with InGaAsP/InP electroabsorption modulators. The integrated sampled grating DBR laser-modulators use the as-grown III-V bandgap for optical gain, a 50 nm blue-shifted bandgap for the electroabsorption modulators, and an 80 nm blue-shifted bandgap for low-loss mirrors. Continuous-wave laser operation up to 45 °C is achieved with output power >1.0 mW and threshold current of <50 mA. The modulator bandwidth is >2 GHz with 5 dB DC extinction.
Roever, Stefan
2012-01-01
A massively parallel, low cost molecular analysis platform will dramatically change the nature of protein, molecular and genomics research, DNA sequencing, and ultimately, molecular diagnostics. An integrated circuit (IC) with 264 sensors was fabricated using standard CMOS semiconductor processing technology. Each of these sensors is individually controlled with precision analog circuitry and is capable of single molecule measurements. Under electronic and software control, the IC was used to demonstrate the feasibility of creating and detecting lipid bilayers and biological nanopores using wild type α-hemolysin. The ability to dynamically create bilayers over each of the sensors will greatly accelerate pore development and pore mutation analysis. In addition, the noise performance of the IC was measured to be 30fA(rms). With this noise performance, single base detection of DNA was demonstrated using α-hemolysin. The data shows that a single molecule, electrical detection platform using biological nanopores can be operationalized and can ultimately scale to millions of sensors. Such a massively parallel platform will revolutionize molecular analysis and will completely change the field of molecular diagnostics in the future.
Exogenous Molecular Probes for Targeted Imaging in Cancer: Focus on Multi-modal Imaging
Joshi, Bishnu P.; Wang, Thomas D.
2010-01-01
Cancer is one of the major causes of mortality and morbidity in our healthcare system. Molecular imaging is an emerging methodology for the early detection of cancer, guidance of therapy, and monitoring of response. The development of new instruments and exogenous molecular probes that can be labeled for multi-modality imaging is critical to this process. Today, molecular imaging is at a crossroad, and new targeted imaging agents are expected to broadly expand our ability to detect and manage cancer. This integrated imaging strategy will permit clinicians to not only localize lesions within the body but also to manage their therapy by visualizing the expression and activity of specific molecules. This information is expected to have a major impact on drug development and understanding of basic cancer biology. At this time, a number of molecular probes have been developed by conjugating various labels to affinity ligands for targeting in different imaging modalities. This review will describe the current status of exogenous molecular probes for optical, scintigraphic, MRI and ultrasound imaging platforms. Furthermore, we will also shed light on how these techniques can be used synergistically in multi-modal platforms and how these techniques are being employed in current research. PMID:22180839
NASA Astrophysics Data System (ADS)
Yang, Gilmo; Kang, Sukwon; Lee, Kangjin; Kim, Giyoung; Son, Jaeryong; Mo, Changyeun
2010-04-01
The identification of pesticide and the 6-benzylaminopurine (6-BAP) plant growth regulator was carried out using a label-free opto-fluidic ring resonator (OFRR) biosensor. The OFRR sensing platform is a recent advancement in opto-fluidic technology that integrates photonic sensing with microfluidics. It features quick detection times, small sample volumes, and accurate quantitative and kinetic results. The most prominent advantage of the OFRR integrated with microfluidics is that it can potentially realize a multi-channel, portable biosensor that detects numerous analytes simultaneously. Antisera for the immunoassay were raised in rabbits against the 6-BAP-BSA conjugate. Using this immunization protocol, and with an unknown cytokinin reacting with the same antibody, comparable sensitivity and specificity were obtained. The 6-BAP antibody was routinely used for cytokinin analysis. A sensitive and simple OFRR method with a good linear relationship was developed for the determination of 6-BAP, and the detection limit was also examined. The biosensor demonstrated excellent reproducibility when periodically exposed to 6-BAP.
A 1 GHz integrated circuit with carbon nanotube interconnects and silicon transistors.
Close, Gael F; Yasuda, Shinichi; Paul, Bipul; Fujita, Shinobu; Wong, H-S Philip
2008-02-01
Due to their excellent electrical properties, metallic carbon nanotubes are promising materials for interconnect wires in future integrated circuits. Simulations have shown that the use of metallic carbon nanotube interconnects could yield more energy efficient and faster integrated circuits. The next step is to build an experimental prototype integrated circuit using carbon nanotube interconnects operating at high speed. Here, we report the fabrication of the first stand-alone integrated circuit combining silicon transistors and individual carbon nanotube interconnect wires on the same chip operating above 1 GHz. In addition to setting a milestone by operating above 1 GHz, this prototype is also a tool to investigate carbon nanotubes on a silicon-based platform at high frequencies, paving the way for future multi-GHz nanoelectronics.
Ingham, Richard J; Battilocchio, Claudio; Fitzpatrick, Daniel E; Sliwinski, Eric; Hawkins, Joel M; Ley, Steven V
2015-01-01
Performing reactions in flow can offer major advantages over batch methods. However, laboratory flow chemistry processes are currently often limited to single steps or short sequences due to the complexity involved with operating a multi-step process. Using new modular components for downstream processing, coupled with control technologies, more advanced multi-step flow sequences can be realized. These tools are applied to the synthesis of 2-aminoadamantane-2-carboxylic acid. A system comprising three chemistry steps and three workup steps was developed, having sufficient autonomy and self-regulation to be managed by a single operator. PMID:25377747
NASA Astrophysics Data System (ADS)
Fink, Wolfgang; George, Thomas; Tarbell, Mark A.
2007-04-01
Robotic reconnaissance operations are called for in extreme environments, not only those such as space, including planetary atmospheres, surfaces, and subsurfaces, but also in potentially hazardous or inaccessible operational areas on Earth, such as mine fields, battlefield environments, enemy occupied territories, terrorist infiltrated environments, or areas that have been exposed to biochemical agents or radiation. Real time reconnaissance enables the identification and characterization of transient events. A fundamentally new mission concept for tier-scalable reconnaissance of operational areas, originated by Fink et al., is aimed at replacing the engineering and safety constrained mission designs of the past. The tier-scalable paradigm integrates multi-tier (orbit, atmosphere, surface/subsurface) and multi-agent (satellite, UAV/blimp, surface/subsurface sensing platforms) hierarchical mission architectures, introducing not only mission redundancy and safety, but also enabling and optimizing intelligent, less constrained, and distributed reconnaissance in real time. Given the mass, size, and power constraints faced by such a multi-platform approach, this is an ideal application scenario for a diverse set of MEMS sensors. To support such mission architectures, a high degree of operational autonomy is required. Essential elements of such operational autonomy are: (1) automatic mapping of an operational area from different vantage points (including vehicle health monitoring); (2) automatic feature extraction and target/region-of-interest identification within the mapped operational area; and (3) automatic target prioritization for close-up examination. These requirements imply the optimal deployment of MEMS sensors and sensor platforms, sensor fusion, and sensor interoperability.
Paleomagnetism.org: An online multi-platform open source environment for paleomagnetic data analysis
NASA Astrophysics Data System (ADS)
Koymans, Mathijs R.; Langereis, Cor G.; Pastor-Galán, Daniel; van Hinsbergen, Douwe J. J.
2016-08-01
This contribution provides an overview of Paleomagnetism.org, an open-source, multi-platform online environment for paleomagnetic data analysis. Paleomagnetism.org provides an interactive environment where paleomagnetic data can be interpreted, evaluated, visualized, and exported. The Paleomagnetism.org application is split into an interpretation portal, a statistics portal, and a portal for miscellaneous paleomagnetic tools. In the interpretation portal, principal component analysis can be performed on visualized demagnetization diagrams. Interpreted directions and great circles can be combined to find great circle solutions. These directions can be used in the statistics portal, or exported as data and figures. The tools in the statistics portal cover standard Fisher statistics for directions and VGPs, including other statistical parameters used as reliability criteria. Other available tools include an eigenvector-approach foldtest, two reversal tests (including a Monte Carlo simulation on mean directions), and a coordinate bootstrap on the original data. An implementation is included for the detection and correction of inclination shallowing in sediments following TK03.GAD. Finally, we provide a module to visualize VGPs and expected paleolatitudes, declinations, and inclinations relative to widely used global apparent polar wander path models in coordinates of major continent-bearing plates. The tools in the miscellaneous portal include a net tectonic rotation (NTR) analysis to restore a body to its paleo-vertical and a bootstrapped oroclinal test using linear regression techniques, including a modified foldtest around a vertical axis. Paleomagnetism.org provides an integrated approach for researchers to work with visualized (e.g. hemisphere projections, Zijderveld diagrams) paleomagnetic data. The application constructs a custom exportable file that can be shared freely and included in public databases. This exported file contains all data and can later be imported into the application by other researchers. The accessibility and simplicity with which paleomagnetic data can be interpreted, analyzed, visualized, and shared makes Paleomagnetism.org of interest to the community.
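As an illustration of the standard Fisher statistics computed in the statistics portal, the sketch below derives the mean declination/inclination, the precision parameter k and α95 from a set of directions. This is the textbook formulation, not Paleomagnetism.org's own source code.

```python
# Textbook Fisher (1953) statistics for a set of paleomagnetic directions:
# mean declination/inclination, precision parameter k and alpha95.
import numpy as np

def fisher_mean(decs_deg, incs_deg):
    d = np.radians(np.asarray(decs_deg))
    i = np.radians(np.asarray(incs_deg))
    # direction cosines of each direction
    x = np.cos(i) * np.cos(d)
    y = np.cos(i) * np.sin(d)
    z = np.sin(i)
    R = np.sqrt(x.sum() ** 2 + y.sum() ** 2 + z.sum() ** 2)  # resultant length
    n = len(d)
    mean_dec = np.degrees(np.arctan2(y.sum(), x.sum())) % 360
    mean_inc = np.degrees(np.arcsin(z.sum() / R))
    k = (n - 1) / (n - R)                                     # precision parameter
    a95 = np.degrees(np.arccos(1 - (n - R) / R * (20 ** (1 / (n - 1)) - 1)))
    return mean_dec, mean_inc, k, a95

print(fisher_mean([350, 5, 12, 358, 3], [40, 45, 38, 42, 44]))
```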
Multi-disciplinary coupling effects for integrated design of propulsion systems
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Singhal, S. N.
1993-01-01
Effective computational simulation procedures are described for modeling the inherent multi-disciplinary interactions which govern the accurate response of propulsion systems. Results are presented for propulsion system responses including multi-disciplinary coupling effects using coupled multi-discipline thermal, structural, and acoustic tailoring; an integrated system of multi-disciplinary simulators; coupled material behavior/fabrication process tailoring; sensitivities using a probabilistic simulator; and coupled materials, structures, fracture, and probabilistic behavior simulator. The results demonstrate that superior designs can be achieved if the analysis/tailoring methods account for the multi-disciplinary coupling effects. The coupling across disciplines can be used to develop an integrated coupled multi-discipline numerical propulsion system simulator.
Design of a New Ultracompact Resonant Plasmonic Multi-Analyte Label-Free Biosensing Platform
De Palo, Maripina; Ciminelli, Caterina
2017-01-01
In this paper, we report on the design of a bio-multisensing platform for the selective label-free detection of protein biomarkers, carried out through a 3D numerical algorithm. The platform includes a number of biosensors, each of which is based on a plasmonic nanocavity consisting of a periodic metal structure to be deposited on a silicon oxide substrate. Light is strongly confined in a region of extremely small size (= 1.57 μm²), to enhance the light-matter interaction. A surface sensitivity Ss = 1.8 nm/nm has been calculated, together with a detection limit of 128 pg/mm². Such performance, together with the extremely small footprint, allows the integration of several devices on a single chip to realize extremely compact lab-on-chip microsystems. In addition, each sensing element of the platform has good chemical stability, guaranteed by the selection of gold for its fabrication. PMID:28783075
Bieri, Michael; d'Auvergne, Edward J; Gooley, Paul R
2011-06-01
Investigation of protein dynamics on the ps-ns and μs-ms timeframes provides detailed insight into the mechanisms of enzymes and the binding properties of proteins. Nuclear magnetic resonance (NMR) is an excellent tool for studying protein dynamics at atomic resolution. Analysis of relaxation data using model-free analysis can be a tedious and time consuming process, which requires good knowledge of scripting procedures. The software relaxGUI was developed for fast and simple model-free analysis and is fully integrated into the software package relax. It is written in Python and uses wxPython to build the graphical user interface (GUI) for maximum performance and multi-platform use. This software allows the analysis of NMR relaxation data with ease and the generation of publication quality graphs as well as color coded images of molecular structures. The interface is designed for simple data analysis and management. The software was tested and validated against the command line version of relax.
NASA Astrophysics Data System (ADS)
Dabos, G.; Pitris, S.; Mitsolidou, C.; Alexoudi, T.; Fitsios, D.; Cherchi, M.; Harjanne, M.; Aalto, T.; Kanellos, G. T.; Pleros, N.
2017-02-01
As data centers constantly expand, electronic switches are facing the challenge of enhanced scalability and the request for increased pin count and bandwidth. Photonic technology and wavelength division multiplexing have always been a strong alternative for efficient routing, and their potential has already been proven in telecom networks. CWDM transceivers have emerged at the board-to-board interconnection level, revealing the potential for wavelength routing to be applied in the datacom space, and an AWGR-based approach has recently been proposed towards building an optical multi-socket interconnection offering any-to-any connectivity with high aggregated throughput and reduced power consumption. Echelle gratings have long been recognized as the multiplexing block with the smallest footprint and robust behaviour in a wide range of applications compared to alternatives such as the Arrayed Waveguide Grating. Such filtering devices can also perform in a similar way to a cyclic AWGR and serve as mid-board routing platforms in multi-socket environments. In this communication, we present such a 3×3 Echelle grating integrated on a thick-SOI platform with aluminum-coated facets that is shown to perform successful wavelength-routing functionality at 10 Gb/s. The device exhibits a footprint of 60×270 μm², while static characterization showed a 3 dB on-chip loss for the best channel. The 3 dB bandwidth of the channels was 4.5 nm and the free spectral range was 90 nm. The Echelle grating was evaluated in a 2×2 wavelength-routing topology, exhibiting a power penalty below 0.4 dB at 10⁻⁹ BER for the C-band. Further experimental evaluations of the platform involve commercially available CWDM datacenter transceivers, towards emulating the traffic scenario of an optically interconnected multi-socket environment.
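A small sketch of the cyclic any-to-any routing property exploited above: in an idealized N×N cyclic wavelength router (cyclic AWGR or Echelle grating), the output port is a modular function of input port and wavelength channel. This is purely illustrative and not derived from the measured device.

```python
# Sketch of the cyclic any-to-any routing rule of an ideal N x N wavelength
# router: each (input port, wavelength channel) pair maps to one output port.

def output_port(input_port, channel, n_ports):
    """Cyclic routing rule for an idealized N x N wavelength router."""
    return (input_port + channel) % n_ports

N = 3  # 3x3 device, as in the Echelle grating described above
table = [[output_port(i, ch, N) for ch in range(N)] for i in range(N)]
for i, row in enumerate(table):
    print(f"input {i}: channels 0..{N - 1} -> outputs {row}")
# Every input reaches every output on some wavelength, which is what enables
# any-to-any connectivity in a multi-socket interconnect.
```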
NASA Technical Reports Server (NTRS)
Hoff, Claus; Cady, Eric; Chainyk, Mike; Kissil, Andrew; Levine, Marie; Moore, Greg
2011-01-01
The efficient simulation of multidisciplinary thermo-opto-mechanical effects in precision deployable systems has for years been limited by numerical toolsets that do not necessarily share the same finite element basis, level of mesh discretization, data formats, or compute platforms. Cielo, a general purpose integrated modeling tool funded by the Jet Propulsion Laboratory and the Exoplanet Exploration Program, addresses shortcomings in the current state of the art via features that enable the use of a single, common model for thermal, structural and optical aberration analysis, producing results of greater accuracy, without the need for results interpolation or mapping. This paper will highlight some of these advances, and will demonstrate them within the context of detailed external occulter analyses, focusing on in-plane deformations of the petal edges for both steady-state and transient conditions, with subsequent optical performance metrics including intensity distributions at the pupil and image plane.
Integrated, Step-Wise, Mass-Isotopomeric Flux Analysis of the TCA Cycle.
Alves, Tiago C; Pongratz, Rebecca L; Zhao, Xiaojian; Yarborough, Orlando; Sereda, Sam; Shirihai, Orian; Cline, Gary W; Mason, Graeme; Kibbey, Richard G
2015-11-03
Mass isotopomer multi-ordinate spectral analysis (MIMOSA) is a step-wise flux analysis platform to measure discrete glycolytic and mitochondrial metabolic rates. Importantly, direct citrate synthesis rates were obtained by deconvolving the mass spectra generated from [U-(13)C6]-D-glucose labeling for position-specific enrichments of mitochondrial acetyl-CoA, oxaloacetate, and citrate. Comprehensive steady-state and dynamic analyses of key metabolic rates (pyruvate dehydrogenase, β-oxidation, pyruvate carboxylase, isocitrate dehydrogenase, and PEP/pyruvate cycling) were calculated from the position-specific transfer of (13)C from sequential precursors to their products. Important limitations of previous techniques were identified. In INS-1 cells, citrate synthase rates correlated with both insulin secretion and oxygen consumption. Pyruvate carboxylase rates were substantially lower than previously reported but showed the highest fold change in response to glucose stimulation. In conclusion, MIMOSA measures key metabolic rates from the precursor/product position-specific transfer of (13)C-label between metabolites and has broad applicability to any glucose-oxidizing cell. Copyright © 2015 Elsevier Inc. All rights reserved.
Cytobank: providing an analytics platform for community cytometry data analysis and collaboration.
Chen, Tiffany J; Kotecha, Nikesh
2014-01-01
Cytometry is used extensively in clinical and laboratory settings to diagnose and track cell subsets in blood and tissue. High-throughput, single-cell approaches leveraging cytometry are developed and applied in the computational and systems biology communities by researchers, who seek to improve the diagnosis of human diseases, map the structures of cell signaling networks, and identify new cell types. Data analysis and management present a bottleneck in the flow of knowledge from bench to clinic. Multi-parameter flow and mass cytometry enable identification of signaling profiles of patient cell samples. Currently, this process is manual, requiring hours of work to summarize multi-dimensional data and translate these data for input into other analysis programs. In addition, the increase in the number and size of collaborative cytometry studies as well as the computational complexity of analytical tools require the ability to assemble sufficient and appropriately configured computing capacity on demand. There is a critical need for platforms that can be used by both clinical and basic researchers who routinely rely on cytometry. Recent advances provide a unique opportunity to facilitate collaboration and analysis and management of cytometry data. Specifically, advances in cloud computing and virtualization are enabling efficient use of large computing resources for analysis and backup. An example is Cytobank, a platform that allows researchers to annotate, analyze, and share results along with the underlying single-cell data.
Zhou, Zhenyu; Xu, Linru; Wu, Suozhu; Su, Bin
2014-10-07
Electrochemiluminescence (ECL) imaging provides a superior approach to array detection because of its ability for ultrasensitive multiplex analysis. In this paper, we report a novel ECL imaging biosensor array modified with an enzyme/carbon nanotubes/chitosan composite film for the determination of glucose, choline and lactate. The biosensor array was constructed by integrating a patterned indium tin oxide (ITO) glass plate with six perforated poly(dimethylsiloxane) (PDMS) covers. ECL is generated by the electrochemical reaction between luminol and hydrogen peroxide, the latter produced by the enzyme-catalysed oxidation of the different substrates with molecular oxygen, and ECL images were captured by a charge-coupled device (CCD) camera. The separated electrochemical micro-cells enabled simultaneous assay of six samples at different concentrations. From the established calibration curves, the detection limits were 14 μM for glucose, 40 μM for lactate and 97 μM for choline. Moreover, multicomponent assays and cross-reactivity were also studied, and both were satisfactory for the analysis. This biosensing platform based on ECL imaging shows many distinct advantages, including miniaturization, low cost, and multi-functionalization. We believe that this novel ECL imaging biosensor platform will have potential applications in clinical diagnostics, medicine and food inspection.
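A brief sketch of the calibration-curve/detection-limit workflow behind the quoted LODs: fit ECL intensity against concentration and take LOD = 3σ_blank/slope. The synthetic intensities below are illustrative, not the paper's data.

```python
# Calibration-curve / detection-limit sketch: linear fit of ECL intensity vs.
# concentration, with LOD = 3 * sigma_blank / slope. Synthetic data only.
import numpy as np

def detection_limit(concentrations_uM, intensities, blank_replicates):
    slope, intercept = np.polyfit(concentrations_uM, intensities, 1)
    sigma_blank = np.std(blank_replicates, ddof=1)
    return 3.0 * sigma_blank / slope, slope, intercept

conc = np.array([0, 50, 100, 200, 400, 800])          # uM standards (illustrative)
ecl = 5.0 + 0.8 * conc + np.random.default_rng(2).normal(0, 3, conc.size)
blanks = np.random.default_rng(3).normal(5.0, 3.0, 10)
lod, slope, intercept = detection_limit(conc, ecl, blanks)
print(f"LOD ~ {lod:.1f} uM, slope = {slope:.2f}")
```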
Poritz, Mark A.; Blaschke, Anne J.; Byington, Carrie L.; Meyers, Lindsay; Nilsson, Kody; Jones, David E.; Thatcher, Stephanie A.; Robbins, Thomas; Lingenfelter, Beth; Amiott, Elizabeth; Herbener, Amy; Daly, Judy; Dobrowolski, Steven F.; Teng, David H. -F.; Ririe, Kirk M.
2011-01-01
The ideal clinical diagnostic system should deliver rapid, sensitive, specific and reproducible results while minimizing the requirements for specialized laboratory facilities and skilled technicians. We describe an integrated diagnostic platform, the “FilmArray”, which fully automates the detection and identification of multiple organisms from a single sample in about one hour. An unprocessed biologic/clinical sample is subjected to nucleic acid purification, reverse transcription, a high-order nested multiplex polymerase chain reaction and amplicon melt curve analysis. Biochemical reactions are enclosed in a disposable pouch, minimizing the PCR contamination risk. FilmArray has the potential to detect greater than 100 different nucleic acid targets at one time. These features make the system well-suited for molecular detection of infectious agents. Validation of the FilmArray technology was achieved through development of a panel of assays capable of identifying 21 common viral and bacterial respiratory pathogens. Initial testing of the system using both cultured organisms and clinical nasal aspirates obtained from children demonstrated an analytical and clinical sensitivity and specificity comparable to existing diagnostic platforms. We demonstrate that automated identification of pathogens from their corresponding target amplicon(s) can be accomplished by analysis of the DNA melting curve of the amplicon. PMID:22039434
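A minimal sketch of the melt-curve step described above: the melting temperature Tm of an amplicon is taken as the peak of −dF/dT, and identification then reduces to matching Tm against the expected value for each target. The curve below is synthetic.

```python
import numpy as np

# Toy melt curve: fluorescence F(T) drops sharply around the amplicon melting
# temperature Tm; Tm is taken as the peak of the negative derivative -dF/dT.
T = np.linspace(60.0, 95.0, 351)                        # °C
Tm_true = 82.5
F = 1.0 / (1.0 + np.exp((T - Tm_true) / 0.8))           # synthetic sigmoid melt curve
F += np.random.default_rng(0).normal(0, 0.002, T.size)  # a little measurement noise

dF_dT = np.gradient(F, T)
Tm_est = T[np.argmin(dF_dT)]                            # peak of -dF/dT = minimum of dF/dT
print(f"estimated Tm = {Tm_est:.1f} °C")
# Identification then reduces to matching Tm (per nested-PCR well) against the
# expected melting temperature of each target amplicon.
```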
NASA Astrophysics Data System (ADS)
Mosconi, A.; Pozzoli, A.; Meroni, A.; Gagliano, S.
2015-10-01
This paper presents an integrated approach to land subsidence monitoring that uses measurements from different sensors. Eni S.p.A., the main Italian oil and gas company, routinely surveys its fields with state-of-the-art and innovative techniques, and a method able to integrate the results is therefore an important and timely topic. Monitoring today relies on multiple sensors, and integrating their measurements is essential; the different data sources should be combined in a way that exploits the best performance of each technique. An integrated analysis allows the interpretation of simultaneous time series of data coming from different sources and attempts to separate the individual contributions to subsidence. With this purpose Exelis VIS, in collaboration with Eni S.p.A., customized PISAV (Permanent Interferometric Scatterometer Analysis and Visualization), an ENVI extension able to capitalize on and combine all the different data collected in the surveys. This article presents some significant examples showing the potential of this tool for oil and gas activity: a hydrocarbon storage field, where the comparison between SAR measurements and production volumes highlights a correlation between the two in a few steps; and a hydrocarbon production field with the Satellite Survey Unit (S.S.U.), where SAR, CGPS, piezometers and settlement gauges (assestimeters) measure in the same area at the same time, giving the opportunity to analyse the data jointly. In the integrated analysis performed with PISAV a mathematically rigorous study is not always possible, and a semi-quantitative approach is then the only viable method for interpreting the results. In the first test case, a strong correlation between injected hydrocarbon volume and vertical displacement was highlighted; in the second, the integrated analysis offered several advantages for monitoring land subsidence: it permits a first qualitative separation of the natural and anthropogenic components of subsidence, and it gives each measurement greater reliability and coverage by exploiting the strong points of each technique.
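A minimal sketch of the kind of semi-quantitative comparison described for the storage field, assuming two co-registered monthly series (hypothetical values) and a simple Pearson correlation; PISAV's actual processing is not reproduced here.

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical monthly series for a storage field: injected gas volume and the
# vertical displacement of a SAR permanent scatterer over the reservoir.
injected_volume = np.array([0.0, 1.2, 2.5, 3.1, 2.0, 0.8, 0.1, 1.5, 2.8, 3.3])   # 10^6 m^3
vertical_disp   = np.array([0.0, 0.9, 2.1, 2.6, 1.8, 0.7, 0.2, 1.2, 2.4, 2.9])   # mm

r, p = pearsonr(injected_volume, vertical_disp)
print(f"Pearson r = {r:.2f} (p = {p:.3g})")
# In practice the series are first co-registered in time and possibly lagged;
# a semi-quantitative reading of r is often all the data quality supports.
```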
RadMAP: The Radiological Multi-sensor Analysis Platform
NASA Astrophysics Data System (ADS)
Bandstra, Mark S.; Aucott, Timothy J.; Brubaker, Erik; Chivers, Daniel H.; Cooper, Reynold J.; Curtis, Joseph C.; Davis, John R.; Joshi, Tenzing H.; Kua, John; Meyer, Ross; Negut, Victor; Quinlan, Michael; Quiter, Brian J.; Srinivasan, Shreyas; Zakhor, Avideh; Zhang, Richard; Vetter, Kai
2016-12-01
The variability of gamma-ray and neutron background during the operation of a mobile detector system greatly limits the ability of the system to detect weak radiological and nuclear threats. The natural radiation background measured by a mobile detector system is the result of many factors, including the radioactivity of nearby materials, the geometric configuration of those materials and the system, the presence of absorbing materials, and atmospheric conditions. Background variations tend to be highly non-Poissonian, making it difficult to set robust detection thresholds using knowledge of the mean background rate alone. The Radiological Multi-sensor Analysis Platform (RadMAP) system is designed to allow the systematic study of natural radiological background variations and to serve as a development platform for emerging concepts in mobile radiation detection and imaging. To do this, RadMAP has been used to acquire extensive, systematic background measurements and correlated contextual data that can be used to test algorithms and detector modalities at low false alarm rates. By combining gamma-ray and neutron detector systems with data from contextual sensors, the system enables the fusion of data from multiple sensors into novel data products. The data are curated in a common format that allows for rapid querying across all sensors, creating detailed multi-sensor datasets that are used to study correlations between radiological and contextual data, and develop and test novel techniques in mobile detection and imaging. In this paper we will describe the instruments that comprise the RadMAP system, the effort to curate and provide access to multi-sensor data, and some initial results on the fusion of contextual and radiological data.
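A small sketch of why non-Poissonian background matters for threshold setting, using synthetic counts: an empirical percentile of the measured background gives the intended false-alarm rate where a mean-plus-n-sigma Poisson threshold does not. This illustrates the problem the RadMAP datasets are meant to address, not an algorithm from the paper.

```python
import numpy as np

# Because mobile gamma-ray background is highly non-Poissonian, a fixed
# mean + n*sqrt(mean) threshold under-estimates the false-alarm rate.
# A simple alternative is an empirical percentile of the measured background.
rng = np.random.default_rng(1)
# Hypothetical 1-s gross counts: Poisson core plus slow environment-driven drift.
background = rng.poisson(lam=400 + 80 * np.sin(np.linspace(0, 20, 20000)))

target_far = 1e-3                                   # desired false alarms per sample
poisson_threshold = background.mean() + 3.1 * np.sqrt(background.mean())
empirical_threshold = np.quantile(background, 1.0 - target_far)

print(f"Poisson-style threshold:   {poisson_threshold:.0f} counts/s")
print(f"Empirical 99.9% threshold: {empirical_threshold:.0f} counts/s")
print(f"Actual FAR at Poisson threshold: {(background > poisson_threshold).mean():.4f}")
```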
antiSMASH 2.0--a versatile platform for genome mining of secondary metabolite producers.
Blin, Kai; Medema, Marnix H; Kazempour, Daniyal; Fischbach, Michael A; Breitling, Rainer; Takano, Eriko; Weber, Tilmann
2013-07-01
Microbial secondary metabolites are a potent source of antibiotics and other pharmaceuticals. Genome mining of their biosynthetic gene clusters has become a key method to accelerate their identification and characterization. In 2011, we developed antiSMASH, a web-based analysis platform that automates this process. Here, we present the highly improved antiSMASH 2.0 release, available at http://antismash.secondarymetabolites.org/. For the new version, antiSMASH was entirely re-designed using a plug-and-play concept that allows easy integration of novel predictor or output modules. antiSMASH 2.0 now supports input of multiple related sequences simultaneously (multi-FASTA/GenBank/EMBL), which allows the analysis of draft genomes comprising multiple contigs. Moreover, direct analysis of protein sequences is now possible. antiSMASH 2.0 has also been equipped with the capacity to detect additional classes of secondary metabolites, including oligosaccharide antibiotics, phenazines, thiopeptides, homo-serine lactones, phosphonates and furans. The algorithm for predicting the core structure of the cluster end product is now also covering lantipeptides, in addition to polyketides and non-ribosomal peptides. The antiSMASH ClusterBlast functionality has been extended to identify sub-clusters involved in the biosynthesis of specific chemical building blocks. The new features currently make antiSMASH 2.0 the most comprehensive resource for identifying and analyzing novel secondary metabolite biosynthetic pathways in microorganisms.
NASA Astrophysics Data System (ADS)
Ormerod, R.; Scholl, M.
2017-12-01
Rapid evolution is occurring in the monitoring and assessment of air emissions and their impacts. The development of next-generation, lower-cost sensor technologies creates the potential for much more intensive and far-reaching monitoring networks that provide spatially rich data. While much attention at present is being directed at the types and performance characteristics of sensor technologies, it is important also that the full potential of rich data sources be realized. Parallel to sensor developments, software platforms to display and manage data in real time are increasingly common adjuncts to sensor networks. However, the full value of data can be realized by extending platform capabilities to include complex scientific functions that are integrated into an action-oriented management framework. Depending on the purpose and nature of a monitoring network, there will be a variety of potential uses of the data or its derivatives, for example: statistical analysis for policy development, event analysis, real-time issue management including emergency response and complaints, and predictive management. Moving these functions into an on-demand, optionally mobile, environment greatly increases the value and accessibility of the data. Increased interplay between monitoring data and decision-making in an operational environment is optimised by a system that is designed with equal weight on technical robustness and user experience. A system now being used by several regulatory agencies and a larger number of industries in the US, Latin America, Europe, Australia and Asia has been developed to provide a wide range of on-demand decision support in addition to the basic data collection, display and management that most platforms offer. With stable multi-year operation, the platform, known as Envirosuite, is assisting organisations to both reduce operating costs and improve environmental performance. Some current examples of its application for regulatory and industry organisations are described and demonstrated.
Magnetic Nanoparticles for Multi-Imaging and Drug Delivery
Lee, Jae-Hyun; Kim, Ji-wook; Cheon, Jinwoo
2013-01-01
Various bio-medical applications of magnetic nanoparticles have been explored during the past few decades. As tools that hold great potential for advancing biological sciences, magnetic nanoparticles have been used as platform materials for enhanced magnetic resonance imaging (MRI) agents, biological separation and magnetic drug delivery systems, and magnetic hyperthermia treatment. Furthermore, approaches that integrate various imaging and bioactive moieties have been used in the design of multi-modality systems, which possess synergistically enhanced properties such as better imaging resolution and sensitivity, molecular recognition capabilities, stimulus responsive drug delivery with on-demand control, and spatio-temporally controlled cell signal activation. Below, recent studies that focus on the design and synthesis of multi-mode magnetic nanoparticles will be briefly reviewed and their potential applications in the imaging and therapy areas will be also discussed. PMID:23579479
Oller, Joaquim; Demirkol, Ilker; Casademont, Jordi; Paradells, Josep; Gamm, Gerd Ulrich; Reindl, Leonhard
2014-01-01
Energy-efficient communication is one of the main concerns of wireless sensor networks nowadays. A commonly employed approach for achieving energy efficiency has been the use of duty-cycled operation of the radio, where the node's transceiver is turned off and on regularly, listening to the radio channel for possible incoming communication during its on-state. Nonetheless, such a paradigm performs poorly for scenarios of low or bursty traffic because of unnecessary activations of the radio transceiver. As an alternative technology, Wake-up Radio (WuR) systems present a promising energy-efficient network operation, where target devices are only activated in an on-demand fashion by means of a special radio signal and a WuR receiver. In this paper, we analyze a novel wake-up radio approach that integrates both data communication and wake-up functionalities into one platform, providing a reconfigurable radio operation. Through physical experiments, we characterize the delay, current consumption and overall operational range performance of this approach under different transmit power levels. We also present an actual single-hop WuR application scenario, as well as demonstrate the first true multi-hop capabilities of a WuR platform and simulate its performance in a multi-hop scenario. Finally, by thorough qualitative comparisons to the most relevant WuR proposals in the literature, we state that the proposed WuR system stands out as a strong candidate for any application requiring energy-efficient wireless sensor node communications. PMID:24451452
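A back-of-the-envelope energy comparison of the two paradigms discussed above (duty-cycled listening vs. an always-on wake-up receiver); all current, voltage and timing figures are hypothetical placeholders rather than measurements from the paper.

```python
# Back-of-the-envelope energy comparison between duty-cycled listening and a
# wake-up radio (WuR) receiver. All current and timing figures are hypothetical
# placeholders, not measurements from the paper.
V = 3.0                      # supply voltage (V)
T = 3600.0                   # observation window (s)

# Duty-cycled main radio: periodic channel sampling even when no traffic arrives.
i_rx, i_sleep = 20e-3, 2e-6  # A
duty_cycle = 0.01            # 1 % listening
e_duty = V * T * (duty_cycle * i_rx + (1 - duty_cycle) * i_sleep)

# WuR: main radio stays off; an always-on ultra-low-power wake-up receiver
# activates it only on demand (here: 4 wake-ups, 50 ms of RX each).
i_wur = 10e-6                # A, always-on wake-up receiver
wakeups, t_rx = 4, 0.05
e_wur = V * (T * i_wur + wakeups * t_rx * i_rx)

print(f"duty-cycled: {e_duty:.3f} J/h,  WuR: {e_wur:.3f} J/h")
```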
Costa, Raquel L; Gadelha, Luiz; Ribeiro-Alves, Marcelo; Porto, Fábio
2017-01-01
There are many steps in analyzing transcriptome data, from the acquisition of raw data to the selection of a subset of representative genes that explain a scientific hypothesis. The data produced can be represented as networks of interactions among genes and these may additionally be integrated with other biological databases, such as Protein-Protein Interactions, transcription factors and gene annotation. However, the results of these analyses remain fragmented, imposing difficulties, either for posterior inspection of results, or for meta-analysis by the incorporation of new related data. Integrating databases and tools into scientific workflows, orchestrating their execution, and managing the resulting data and its respective metadata are challenging tasks. Additionally, a great amount of effort is equally required to run in-silico experiments to structure and compose the information as needed for analysis. Different programs may need to be applied and different files are produced during the experiment cycle. In this context, the availability of a platform supporting experiment execution is paramount. We present GeNNet, an integrated transcriptome analysis platform that unifies scientific workflows with graph databases for selecting relevant genes according to the evaluated biological systems. It includes GeNNet-Wf, a scientific workflow that pre-loads biological data, pre-processes raw microarray data and conducts a series of analyses including normalization, differential expression inference, clusterization and gene set enrichment analysis. A user-friendly web interface, GeNNet-Web, allows for setting parameters, executing, and visualizing the results of GeNNet-Wf executions. To demonstrate the features of GeNNet, we performed case studies with data retrieved from GEO, particularly using a single-factor experiment in different analysis scenarios. As a result, we obtained differentially expressed genes for which biological functions were analyzed. The results are integrated into GeNNet-DB, a database about genes, clusters, experiments and their properties and relationships. The resulting graph database is explored with queries that demonstrate the expressiveness of this data model for reasoning about gene interaction networks. GeNNet is the first platform to integrate the analytical process of transcriptome data with graph databases. It provides a comprehensive set of tools that would otherwise be challenging for non-expert users to install and use. Developers can add new functionality to components of GeNNet. The derived data allows for testing previous hypotheses about an experiment and exploring new ones through the interactive graph database environment. It enables the analysis of different data on humans, rhesus, mice and rat coming from Affymetrix platforms. GeNNet is available as an open source platform at https://github.com/raquele/GeNNet and can be retrieved as a software container with the command docker pull quelopes/gennet. PMID:28695067
GPS/Optical/Inertial Integration for 3D Navigation Using Multi-Copter Platforms
NASA Technical Reports Server (NTRS)
Dill, Evan T.; Young, Steven D.; Uijt De Haag, Maarten
2017-01-01
In concert with the continued advancement of a UAS traffic management system (UTM), the proposed uses of autonomous unmanned aerial systems (UAS) have become more prevalent in both the public and private sectors. To facilitate this anticipated growth, a reliable three-dimensional (3D) positioning, navigation, and mapping (PNM) capability will be required to enable operation of these platforms in challenging environments where global navigation satellite systems (GNSS) may not be available continuously. This is especially true when the platform's mission requires maneuvering through different and difficult environments, such as outdoor open-sky, outdoor under foliage, outdoor urban, and indoor, and may include transitions between these environments; there may not be a single method that solves the PNM problem for all environments. The research presented in this paper is a subset of a broader research effort, described in [1]. The research is focused on combining data from dissimilar sensor technologies to create an integrated navigation and mapping method that can enable reliable operation in both an outdoor and a structured indoor environment. The integrated navigation and mapping design utilizes a Global Positioning System (GPS) receiver, an Inertial Measurement Unit (IMU), a monocular digital camera, and three short- to medium-range laser scanners. This paper describes specifically the techniques necessary to effectively integrate the monocular camera data within the established mechanization. To evaluate the developed algorithms, a hexacopter was built, equipped with the discussed sensors, and both hand-carried and flown through representative environments. This paper highlights the effect that the monocular camera has on the aforementioned sensor integration scheme's reliability, accuracy and availability.
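As a minimal illustration of the fusion principle (not the paper's full GPS/IMU/camera/laser mechanization), a one-dimensional Kalman filter can propagate position from noisy inertial velocity and correct it with intermittent GPS fixes; all noise figures below are assumed.

```python
import numpy as np

# Minimal 1-D illustration of the GNSS/inertial fusion idea: propagate position
# with IMU-derived velocity, then correct with a GPS fix when one is available.
dt, sigma_imu, sigma_gps = 0.1, 0.5, 2.0
x, P = 0.0, 10.0                                   # state (position, m) and its variance

rng = np.random.default_rng(2)
truth, estimates = 0.0, []
for k in range(200):
    truth += 1.0 * dt                              # true motion: 1 m/s
    # Prediction from the (noisy) inertial velocity.
    v_meas = 1.0 + rng.normal(0, sigma_imu)
    x, P = x + v_meas * dt, P + (sigma_imu * dt) ** 2
    # GPS update at 1 Hz only (and only where GNSS is available).
    if k % 10 == 0:
        z = truth + rng.normal(0, sigma_gps)
        K = P / (P + sigma_gps ** 2)               # Kalman gain
        x, P = x + K * (z - x), (1 - K) * P
    estimates.append(x)

print(f"final error: {estimates[-1] - truth:.2f} m")
```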
Laurinaviciene, Aida; Plancoulaine, Benoit; Baltrusaityte, Indra; Meskauskas, Raimundas; Besusparis, Justinas; Lesciute-Krilaviciene, Daiva; Raudeliunas, Darius; Iqbal, Yasir; Herlin, Paulette; Laurinavicius, Arvydas
2014-01-01
Digital immunohistochemistry (IHC) is one of the most promising applications brought by new generation image analysis (IA). While conventional IHC staining quality is monitored by semi-quantitative visual evaluation of tissue controls, IA may require more sensitive measurement. We designed an automated system to digitally monitor IHC multi-tissue controls, based on SQL-level integration of the laboratory information system with image and statistical analysis tools. Consecutive sections of a TMA containing 10 cores of breast cancer tissue were used as tissue controls in routine Ki67 IHC testing. The Ventana slide label barcode ID was sent to the LIS to register the serial section sequence. The slides were stained and scanned (Aperio ScanScope XT), and IA was performed by the Aperio/Leica Colocalization and Genie Classifier/Nuclear algorithms. SQL-based integration ensured automated statistical analysis of the IA data by the SAS Enterprise Guide project. Factor analysis and plot visualizations were performed to explore slide-to-slide variation of the Ki67 IHC staining results in the control tissue. Slide-to-slide intra-core IHC staining analysis revealed rather significant variation of the variables reflecting the sample size, while Brown and Blue Intensity were relatively stable. To further investigate this variation, the IA results from the 10 cores were aggregated to minimize tissue-related variance. Factor analysis revealed an association between the variables reflecting the sample size detected by IA and Blue Intensity. Since the main feature to be extracted from the tissue controls was staining intensity, we further explored the variation of the intensity variables in the individual cores. MeanBrownBlue Intensity ((Brown+Blue)/2) and DiffBrownBlue Intensity (Brown-Blue) were introduced to better contrast the absolute intensity and the colour balance variation in each core; relevant factor scores were extracted. Finally, tissue-related factors of IHC staining variance were explored in the individual tissue cores. Our solution enabled monitoring of IHC multi-tissue control staining by means of IA, followed by automated statistical analysis, integrated into the laboratory workflow. We found that, even in consecutive serial tissue sections, tissue-related factors affected the IHC IA results; meanwhile, a less intense blue counterstain was associated with a smaller amount of tissue detected by the IA tools.
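The two derived intensity variables named above are simple per-core combinations of the mean Brown and Blue intensities; a short sketch with hypothetical values:

```python
import numpy as np

# The two derived variables named in the abstract, computed per TMA core from
# mean DAB (brown) and haematoxylin (blue) intensities returned by the image
# analysis algorithms. Input values here are hypothetical.
brown = np.array([132.0, 141.0, 128.0, 150.0, 137.0])   # mean Brown intensity per core
blue  = np.array([118.0, 122.0, 115.0, 119.0, 121.0])   # mean Blue intensity per core

mean_brown_blue = (brown + blue) / 2.0    # absolute intensity level
diff_brown_blue = brown - blue            # colour balance (stain vs. counterstain)

for i, (m, d) in enumerate(zip(mean_brown_blue, diff_brown_blue), start=1):
    print(f"core {i}: MeanBrownBlue={m:.1f}  DiffBrownBlue={d:.1f}")
```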
Geospatial Information from Satellite Imagery for Geovisualisation of Smart Cities in India
NASA Astrophysics Data System (ADS)
Mohan, M.
2016-06-01
In the recent past there has been a strong emphasis on the extraction of geospatial information from satellite imagery. This information is processed with geospatial technologies, which play an important role in the development of smart cities, particularly in developing countries such as India. The study is based on the latest multi-date, multi-stage, multi-sensor, and multi-resolution geospatial satellite imagery. In addition, the latest geospatial technologies have been used for digital image processing of remote sensing satellite imagery, together with current geographic information systems for 3-D geovisualisation, geospatial digital mapping and geospatial analysis in support of smart city development in India. The geospatial information obtained from remote sensing and GPS systems has a complex structure involving space, time and presentation. Such information supports 3-dimensional digital modelling of smart cities, which integrates spatial and non-spatial information for geographic visualisation of smart cities in the context of the real world. In other words, the geospatial database provides the platform for information visualisation, also known as geovisualisation. As a result, increasing research interest is being directed at geospatial analysis, digital mapping, geovisualisation, and the monitoring and development of smart cities using geospatial technologies. The present research attempts to support city development in a real-world scenario, in particular to help local, regional and state-level planners and policy makers better understand and address urban issues, using geospatial information from satellite imagery for the geovisualisation of smart cities in an emerging and developing country, India.
NASA Astrophysics Data System (ADS)
German, Kristine A.; Kubby, Joel; Chen, Jingkuang; Diehl, James; Feinberg, Kathleen; Gulvin, Peter; Herko, Larry; Jia, Nancy; Lin, Pinyen; Liu, Xueyuan; Ma, Jun; Meyers, John; Nystrom, Peter; Wang, Yao Rong
2004-07-01
Xerox Corporation has developed a technology platform for on-chip integration of latching MEMS optical waveguide switches and Planar Light Circuit (PLC) components using a Silicon On Insulator (SOI) based process. To illustrate the current state of this new technology platform, working prototypes of a Reconfigurable Optical Add/Drop Multiplexer (ROADM) and a λ-router are presented along with details of the integrated latching MEMS optical switches. On-chip integration of optical switches and PLCs can greatly reduce the size, manufacturing cost and operating cost of multi-component optical equipment. It is anticipated that low-cost, low-overhead optical network products will accelerate the migration of functions and services from high-cost long-haul markets to price-sensitive markets, including networks for metropolitan areas and fiber to the home. Compared to the more common silica-on-silicon PLC technology, the high index of refraction of silicon waveguides created in the SOI device layer enables miniaturization of optical components, thereby increasing yield and decreasing cost projections. The latching SOI MEMS switches feature moving waveguides and offer advantages across multiple attributes relative to alternative switching technologies, such as thermal optical switches and polymer switches. The SOI process employed was jointly developed under the auspices of the NIST APT program in partnership with Coventor, Corning IntelliSense Corp., and MicroScan Systems to enable fabrication of a broad range of free-space and guided-wave MicroOptoElectroMechanical Systems (MOEMS).
NextGen Technologies on the FAA's Standard Terminal Automation Replacement System
NASA Technical Reports Server (NTRS)
Witzberger, Kevin; Swenson, Harry; Martin, Lynne; Lin, Melody; Cheng, Jinn-Hwei
2014-01-01
This paper describes the integration, evaluation, and results from a high-fidelity human-in-the-loop (HITL) simulation of key NASA Air Traffic Management Technology Demonstration - 1 (ATD- 1) technologies implemented in an enhanced version of the FAA's Standard Terminal Automation Replacement System (STARS) platform. These ATD-1 technologies include: (1) a NASA enhanced version of the FAA's Time-Based Flow Management, (2) a NASA ground-based automation technology known as controller-managed spacing (CMS), and (3) a NASA advanced avionics airborne technology known as flight-deck interval management (FIM). These ATD-1 technologies have been extensively tested in large-scale HITL simulations using general-purpose workstations to study air transportation technologies. These general purpose workstations perform multiple functions and are collectively referred to as the Multi-Aircraft Control System (MACS). Researchers at NASA Ames Research Center and Raytheon collaborated to augment the STARS platform by including CMS and FIM advisory tools to validate the feasibility of integrating these automation enhancements into the current FAA automation infrastructure. NASA Ames acquired three STARS terminal controller workstations, and then integrated the ATD-1 technologies. HITL simulations were conducted to evaluate the ATD-1 technologies when using the STARS platform. These results were compared with the results obtained when the ATD-1 technologies were tested in the MACS environment. Results collected from the numerical data show acceptably minor differences, and, together with the subjective controller questionnaires showing a trend towards preferring STARS, validate the ATD-1/STARS integration.
NASA Astrophysics Data System (ADS)
Costanzo, Antonio; Montuori, Antonio; Silva, Juan Pablo; Silvestri, Malvina; Musacchio, Massimo; Buongiorno, Maria Fabrizia; Stramondo, Salvatore
2016-08-01
In this work, a web-GIS procedure to map the risk of road blockage in urban environments through the combined use of space-borne and airborne remote sensing sensors is presented. The methodology concerns (1) the provision of a geo-database through the integration of space-borne multispectral images and airborne LiDAR data products; (2) the modeling of building vulnerability, based on the corresponding 3D geometry and construction time information; and (3) the GIS-based mapping of road closure due to seismic-related building collapses, based on the characteristic building height and the width of the road. Experimental results, gathered for the Cosenza urban area, demonstrate the benefits of both the proposed approach and the GIS-based integration of multi-platform remote sensing sensors and techniques for seismic road assessment purposes.
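A sketch of the road-closure rule implied above, assuming debris from a collapsed building reaches a distance proportional to its height and a road is flagged as blocked when that reach covers its width; the debris factor and input values are illustrative assumptions, not the paper's calibration.

```python
# Sketch of the road-closure rule: a collapsed building is assumed to spread
# debris over a distance proportional to its height, and a road segment is
# flagged as blocked when that debris reach covers its width. The debris factor
# and all input values are hypothetical assumptions, not the paper's calibration.
DEBRIS_FACTOR = 0.5   # debris reach as a fraction of building height (assumed)

buildings = [
    {"id": "B1", "height_m": 24.0, "vulnerable": True,  "road_width_m": 8.0},
    {"id": "B2", "height_m": 10.0, "vulnerable": True,  "road_width_m": 12.0},
    {"id": "B3", "height_m": 30.0, "vulnerable": False, "road_width_m": 6.0},
]

for b in buildings:
    debris_reach = DEBRIS_FACTOR * b["height_m"]
    blocked = b["vulnerable"] and debris_reach >= b["road_width_m"]
    print(f'{b["id"]}: debris reach {debris_reach:.1f} m -> road blocked: {blocked}')
```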
Jayapandian, Catherine P; Chen, Chien-Hung; Bozorgi, Alireza; Lhatoo, Samden D; Zhang, Guo-Qiang; Sahoo, Satya S
2013-01-01
Epilepsy is the most common serious neurological disorder affecting 50-60 million persons worldwide. Multi-modal electrophysiological data, such as electroencephalography (EEG) and electrocardiography (EKG), are central to effective patient care and clinical research in epilepsy. Electrophysiological data is an example of clinical "big data" consisting of more than 100 multi-channel signals with recordings from each patient generating 5-10GB of data. Current approaches to store and analyze signal data using standalone tools, such as Nihon Kohden neurology software, are inadequate to meet the growing volume of data and the need for supporting multi-center collaborative studies with real time and interactive access. We introduce the Cloudwave platform in this paper that features a Web-based intuitive signal analysis interface integrated with a Hadoop-based data processing module implemented on clinical data stored in a "private cloud". Cloudwave has been developed as part of the National Institute of Neurological Disorders and Stroke (NINDS) funded multi-center Prevention and Risk Identification of SUDEP Mortality (PRISM) project. The Cloudwave visualization interface provides real-time rendering of multi-modal signals with "montages" for EEG feature characterization over 2TB of patient data generated at the Case University Hospital Epilepsy Monitoring Unit. Results from performance evaluation of the Cloudwave Hadoop data processing module demonstrate one order of magnitude improvement in performance over 77GB of patient data. (Cloudwave project: http://prism.case.edu/prism/index.php/Cloudwave).
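As an illustration of the montage concept mentioned above (independent of Cloudwave's Hadoop implementation), a bipolar montage is simply the difference between neighbouring referential channels; the signals below are synthetic.

```python
import numpy as np

# A "montage" re-references raw EEG channels, e.g. a bipolar montage takes the
# difference between neighbouring electrodes. This is a generic illustration of
# the signal-level operation, not Cloudwave's Hadoop implementation.
fs = 256                                   # sampling rate (Hz), hypothetical
t = np.arange(0, 2.0, 1.0 / fs)
channels = {                               # toy referential recordings
    "Fp1": np.sin(2 * np.pi * 10 * t),
    "F3":  np.sin(2 * np.pi * 10 * t + 0.3),
    "C3":  np.sin(2 * np.pi * 10 * t + 0.7),
}

bipolar_pairs = [("Fp1", "F3"), ("F3", "C3")]
montage = {f"{a}-{b}": channels[a] - channels[b] for a, b in bipolar_pairs}

for name, sig in montage.items():
    print(f"{name}: peak-to-peak {np.ptp(sig):.3f}")
```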
First experience with particle-in-cell plasma physics code on ARM-based HPC systems
NASA Astrophysics Data System (ADS)
Sáez, Xavier; Soba, Alejandro; Sánchez, Edilberto; Mantsinen, Mervi; Mateo, Sergi; Cela, José M.; Castejón, Francisco
2015-09-01
In this work, we explore the feasibility of porting a particle-in-cell code (EUTERPE) to an ARM multi-core platform from the Mont-Blanc project. The prototype used is based on a Samsung Exynos 5 system-on-chip with an integrated GPU. It is the first such prototype that could be used for High-Performance Computing (HPC), since it supports double precision and parallel programming languages.
NASA Astrophysics Data System (ADS)
Popovici, Ioana; Goloub, Philippe; Podvin, Thierry; Blarel, Luc; Loisil, Rodrigue; Mortier, Augustin; Deroo, Christine; Ducos, Fabrice; Victori, Stéphane; Torres, Benjamin
2018-04-01
The mobile system described in this paper integrates a commercial eye-safe lidar (CIMEL), a sunphotometer and in situ instruments. The system is distinguished from other transportable platforms by its capability to perform on-road measurements. The potential of a commercial lidar to provide reliable information on aerosol properties is investigated through comparison with a multi-wavelength Raman lidar. First results from observation campaigns in northern France are presented.
The curving calculation of a mechanical device attached to a multi-storey car park
NASA Astrophysics Data System (ADS)
Muscalagiu, C. G.; Muscalagiu, I.; Muscalagiu, D. M.
2017-01-01
The study of stacked (bunk) storage systems for motor vehicles has developed considerably in recent years owing to the high demand for parking in congested city centres. In this paper we study the drive mechanism of stacked parking platforms under dynamic loading. The aim is to improve understanding of the platform's response behaviour during operation of the system and to identify hot spots. We analyse, in a dynamic way, the deformations of the superposed platform at the points of application of the exterior forces produced by the weight of the vehicle, and we automate the computations required for this deformation analysis using the NetLogo language.
jsc2018m000314_Spinning_Science_Multi-use_Variable-g_Platform_Arrives_at_the_Space_Station-MP4
2018-05-09
Spinning Science: Multi-use Variable-g Platform Arrives at the Space Station --- The Multi-use Variable-gravity Platform (MVP) Validation mission will install and test the MVP, a new hardware platform developed and owned by Techshot Inc., on the International Space Station (ISS). Though the MVP is designed for research with many different kinds of organisms and cell types, this validation mission will focus on Drosophila melanogaster, more commonly known as the fruit fly. This platform will be especially important for fruit fly research, as it will allow researchers to study larger sample sizes of Drosophila melanogaster than in other previous hardware utilizing centrifuges and it will be able to support fly colonies for multiple generations.
Deep space environments for human exploration
NASA Technical Reports Server (NTRS)
Wilson, J. W.; Clowdsley, M. S.; Cucinotta, F. A.; Tripathi, R. K.; Nealy, J. E.; De Angelis, G.
2004-01-01
Mission scenarios outside the Earth's protective magnetic shield are being studied. Included are high usage assets in the near-Earth environment for casual trips, for research, and for commercial/operational platforms, in which career exposures will be multi-mission determined over the astronaut's lifetime. The operational platforms will serve as launching points for deep space exploration missions, characterized by a single long-duration mission during the astronaut's career. The exploration beyond these operational platforms will include missions to planets, asteroids, and planetary satellites. The interplanetary environment is evaluated using convective diffusion theory. Local environments for each celestial body are modeled by using results from the most recent targeted spacecraft, and integrated into the design environments. Design scenarios are then evaluated for these missions. The underlying assumptions in arriving at the model environments and their impact on mission exposures within various shield materials will be discussed. Published by Elsevier Ltd on behalf of COSPAR.
Thermal and Power Challenges in High Performance Computing Systems
NASA Astrophysics Data System (ADS)
Natarajan, Venkat; Deshpande, Anand; Solanki, Sudarshan; Chandrasekhar, Arun
2009-05-01
This paper provides an overview of the thermal and power challenges in emerging high performance computing platforms. The advent of new sophisticated applications in highly diverse areas such as health, education, finance, entertainment, etc. is driving the platform and device requirements for future systems. The key ingredients of future platforms are vertically integrated (3D) die-stacked devices which provide the required performance characteristics with the associated form factor advantages. Two of the major challenges to the design of through silicon via (TSV) based 3D stacked technologies are (i) effective thermal management and (ii) efficient power delivery mechanisms. Some of the key challenges that are articulated in this paper include hot-spot superposition and intensification in a 3D stack, design/optimization of thermal through silicon vias (TTSVs), non-uniform power loading of multi-die stacks, efficient on-chip power delivery, minimization of electrical hotspots etc.
Ewald, Melanie; Fechner, Peter; Gauglitz, Günter
2015-05-01
For the first time, a multi-analyte biosensor platform has been developed using the label-free 1-lambda-reflectometry technique. It is the first such platform that performs multi-analyte measurements without relying on imaging techniques. It is designed to be portable and cost-effective and therefore allows for point-of-need testing or on-site field-testing, with possible applications in diagnostics. This work highlights the application possibilities of this platform in the field of animal testing, but is also relevant and transferable to human diagnostics. The performance of the platform has been evaluated using relevant reference systems, namely a biomarker (C-reactive protein) and serology (anti-Salmonella antibodies), as well as a panel of real samples (animal sera). A comparison of the working range and limit of detection shows no loss of performance when transferring the separate assays to the multi-analyte setup. Moreover, the new multi-analyte platform allows for discrimination between sera of animals infected with different Salmonella subtypes.
NASA Astrophysics Data System (ADS)
Baker, B.; Lee, T.; Buban, M.; Dumas, E. J.
2017-12-01
Evaluation of Unmanned Aircraft Systems (UAS) for Weather and Climate using the Multi-testbed Approach --- The development of small Unmanned Aerial System (sUAS) testbeds that can be used to validate, integrate, calibrate and evaluate new technology and sensors for routine boundary layer research, validation of operational weather models, improvement of model parameterizations, and recording of observations within high-impact storms is important for understanding the impact of using sUASs routinely as a new observing platform. The goal of the multi-testbed approach is to build a robust set of protocols to assess the cost and operational feasibility of unmanned observations for routine applications using various combinations of sUAS aircraft and sensors in different locations and field experiments. All of these observational testbeds serve different community needs, but they also use a diverse suite of methodologies for the calibration and evaluation of different sensors and platforms for severe weather and boundary layer research. The primary focus will be to evaluate meteorological sensor payloads to measure thermodynamic parameters and to define surface characteristics with visible, IR, and multi-spectral cameras. This evaluation will lead to recommendations for sensor payloads for VTOL and fixed-wing sUAS.
Analysis and design of a high power, digitally-controlled spacecraft power system
NASA Technical Reports Server (NTRS)
Lee, F. C.; Cho, B. H.
1990-01-01
The progress to date on the analysis and design of a high power, digitally controlled spacecraft power system is described. Several battery discharger topologies were compared for use in the space platform application. Updated information has been provided on the battery voltage specification. Initially it was thought to be in the 30 to 40 V range. It is now specified to be 53 V to 84 V. This eliminated the tapped-boost and the current-fed auto-transformer converters from consideration. After consultations with NASA, it was decided to trade-off the following topologies: (1) boost converter; (2) multi-module, multi-phase boost converter; and (3) voltage-fed push-pull with auto-transformer. A non-linear design optimization software tool was employed to facilitate an objective comparison. Non-linear design optimization insures that the best design of each topology is compared. The results indicate that a four-module, boost converter with each module operating 90 degrees out of phase is the optimum converter for the space platform. Large-signal and small-signal models were generated for the shunt, charger, discharger, battery, and the mode controller. The models were first tested individually according to the space platform power system specifications supplied by NASA. The effect of battery voltage imbalance on parallel dischargers was investigated with respect to dc and small-signal responses. Similarly, the effects of paralleling dischargers and chargers were also investigated. A solar array and shunt model was included in these simulations. A model for the bus mode controller (power control unit) was also developed to interface the Orbital replacement Unit (ORU) model to the platform power system. Small signal models were used to generate the bus impedance plots in the various operating modes. The large signal models were integrated into a system model, and time domain simulations were performed to verify bus regulation during mode transitions. Some changes have subsequently been incorporated into the models. The changes include the use of a four module boost discharger, and a new model for the mode controller, which includes the effects of saturation. The new simulations for the boost discharger show the improvement in bus ripple that can be achieved by phase-shifted operation of each of the boost modules.
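As a quick check of the operating range implied by the chosen boost topology, the ideal continuous-conduction relation Vout = Vin/(1 − D) gives the required duty cycle across the 53–84 V battery range; the 120 V regulated bus voltage used below is an assumption for illustration only.

```python
# Quick check of the boost-converter operating range implied above, assuming an
# ideal converter in continuous conduction and a regulated 120 V platform bus
# (the bus voltage is an assumption for illustration; the 53-84 V battery range
# is from the text). Ideal CCM relation: Vout = Vin / (1 - D)  =>  D = 1 - Vin/Vout.
V_BUS = 120.0                                # assumed regulated bus voltage (V)

for v_batt in (53.0, 84.0):
    duty = 1.0 - v_batt / V_BUS
    print(f"Vin = {v_batt:5.1f} V -> required duty cycle D = {duty:.2f}")
# In the four-module interleaved design, each module runs at this duty cycle but
# is phase-shifted by 90 degrees, which cancels much of the input/output ripple.
```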
Self-assembly micro optical filter
NASA Astrophysics Data System (ADS)
Zhang, Ping (Cerina); Le, Kevin; Malalur-Nagaraja-Rao, Smitha; Hsu, Lun-Chen; Chiao, J.-C.
2006-01-01
The optical communication and sensor industries face critical challenges in manufacturing for system integration. Owing to assembly complexity and the variety of integration platforms, micro optical components require costly alignment and assembly procedures, many of which involve manual effort. Consequently, self-assembly device architectures have attracted great interest and could provide major advantages over conventional optical devices. In this paper, we discuss a self-assembly integration platform for micro optical components. To demonstrate the adaptability and flexibility of the proposed optical device architectures, we chose a commercially available MEMS fabrication foundry service, MUMPs (Multi-User MEMS Process). In this work, the polysilicon layers of MUMPs are used as the 3-D structural material for construction of the micro component framework and actuators. However, because polysilicon has high absorption in the visible and near-infrared wavelength ranges, it is not suitable for optical interaction. To achieve the required optical performance, hybrid integration of materials was proposed and implemented. Organic compound materials were applied on the silicon-based framework to form the required optical interfaces. Organic compounds provide good optical transparency, flexibility to form filters or lenses, and inexpensive manufacturing procedures. In this paper, we demonstrate a micro optical filter integrated with self-assembly structures, and we discuss the self-assembly mechanism, optical filter designs, fabrication issues and results.
Cheung, Kit; Schultz, Simon R; Luk, Wayne
2015-01-01
NeuroFlow is a scalable spiking neural network simulation platform for off-the-shelf high performance computing systems using customizable hardware processors such as Field-Programmable Gate Arrays (FPGAs). Unlike multi-core processors and application-specific integrated circuits, the processor architecture of NeuroFlow can be redesigned and reconfigured to suit a particular simulation to deliver optimized performance, such as the degree of parallelism to employ. The compilation process supports using PyNN, a simulator-independent neural network description language, to configure the processor. NeuroFlow supports a number of commonly used current or conductance based neuronal models such as integrate-and-fire and Izhikevich models, and the spike-timing-dependent plasticity (STDP) rule for learning. A 6-FPGA system can simulate a network of up to ~600,000 neurons and can achieve a real-time performance of 400,000 neurons. Using one FPGA, NeuroFlow delivers a speedup of up to 33.6 times the speed of an 8-core processor, or 2.83 times the speed of GPU-based platforms. With high flexibility and throughput, NeuroFlow provides a viable environment for large-scale neural network simulation.
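A minimal PyNN network description of the kind the NeuroFlow compilation flow can consume; the NEST backend module is used here purely as a stand-in, since the abstract does not name NeuroFlow's own PyNN backend module.

```python
# Minimal PyNN network description of the kind NeuroFlow's compilation flow can
# consume. The backend here is the standard NEST simulator module purely for
# illustration; NeuroFlow provides its own PyNN-compatible backend, whose module
# name is not given in the abstract.
import pyNN.nest as sim

sim.setup(timestep=0.1)

exc = sim.Population(800, sim.Izhikevich())          # Izhikevich neurons (supported model)
inh = sim.Population(200, sim.IF_cond_exp())         # integrate-and-fire neurons

proj = sim.Projection(exc, inh,
                      sim.FixedProbabilityConnector(p_connect=0.1),
                      synapse_type=sim.StaticSynapse(weight=0.01, delay=1.0))

exc.record("spikes")
sim.run(1000.0)                                      # simulate 1 s of biological time
spikes = exc.get_data("spikes")
sim.end()
```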
Neuroimaging Data Sharing on the Neuroinformatics Database Platform
Book, Gregory A; Stevens, Michael; Assaf, Michal; Glahn, David; Pearlson, Godfrey D
2015-01-01
We describe the Neuroinformatics Database (NiDB), an open-source database platform for archiving, analysis, and sharing of neuroimaging data. Data from the multi-site projects Autism Brain Imaging Data Exchange (ABIDE), Bipolar-Schizophrenia Network on Intermediate Phenotypes parts one and two (B-SNIP1, B-SNIP2), and Monetary Incentive Delay task (MID) are available for download from the public instance of NiDB, with more projects sharing data as it becomes available. As demonstrated by making several large datasets available, NiDB is an extensible platform appropriately suited to archive and distribute shared neuroimaging data. PMID:25888923
Multi-disciplinary coupling for integrated design of propulsion systems
NASA Technical Reports Server (NTRS)
Chamis, C. C.; Singhal, S. N.
1993-01-01
Effective computational simulation procedures are described for modeling the inherent multi-disciplinary interactions for determining the true response of propulsion systems. Results are presented for propulsion system responses including multi-discipline coupling effects via (1) coupled multi-discipline tailoring, (2) an integrated system of multidisciplinary simulators, (3) coupled material-behavior/fabrication-process tailoring, (4) sensitivities using a probabilistic simulator, and (5) coupled materials/structures/fracture/probabilistic behavior simulator. The results show that the best designs can be determined if the analysis/tailoring methods account for the multi-disciplinary coupling effects. The coupling across disciplines can be used to develop an integrated interactive multi-discipline numerical propulsion system simulator.
a Web-Based Interactive Platform for Co-Clustering Spatio-Temporal Data
NASA Astrophysics Data System (ADS)
Wu, X.; Poorthuis, A.; Zurita-Milla, R.; Kraak, M.-J.
2017-09-01
Since current studies on clustering analysis mainly focus on exploring spatial or temporal patterns separately, a co-clustering algorithm is utilized in this study to enable the concurrent analysis of spatio-temporal patterns. To allow users to adopt and adapt the algorithm for their own analysis, it is integrated within the server side of an interactive web-based platform. The client side of the platform, running within any modern browser, is a graphical user interface (GUI) with multiple linked visualizations that facilitates the understanding, exploration and interpretation of the raw dataset and co-clustering results. Users can also upload their own datasets and adjust clustering parameters within the platform. To illustrate the use of this platform, an annual temperature dataset from 28 weather stations over 20 years in the Netherlands is used. After the dataset is loaded, it is visualized in a set of linked visualizations: a geographical map, a timeline and a heatmap. This aids the user in understanding the nature of their dataset and the appropriate selection of co-clustering parameters. Once the dataset is processed by the co-clustering algorithm, the results are visualized in the small multiples, a heatmap and a timeline to provide various views for better understanding and also further interpretation. Since the visualization and analysis are integrated in a seamless platform, the user can explore different sets of co-clustering parameters and instantly view the results in order to do iterative, exploratory data analysis. As such, this interactive web-based platform allows users to analyze spatio-temporal data using the co-clustering method and also helps the understanding of the results using multiple linked visualizations.
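As an illustrative stand-in for the platform's co-clustering step (the platform's own algorithm may differ), scikit-learn's spectral co-clustering can group stations and years of a temperature matrix simultaneously; the 28 × 20 matrix below is synthetic.

```python
import numpy as np
from sklearn.cluster import SpectralCoclustering

# Stand-in for the platform's co-clustering step: spectral co-clustering of a
# stations x years temperature matrix, grouping weather stations and years
# simultaneously. The platform's own algorithm may differ; data are synthetic
# (28 stations x 20 years, matching the illustrative dataset's dimensions).
rng = np.random.default_rng(3)
data = 9.5 + rng.normal(0, 0.5, size=(28, 20))
data[:10, 10:] += 1.0          # a warmer block of stations/years to discover

model = SpectralCoclustering(n_clusters=3, random_state=0)
model.fit(data)

print("station cluster labels:", model.row_labels_)
print("year cluster labels:   ", model.column_labels_)
```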
Skounakis, Emmanouil; Farmaki, Christina; Sakkalis, Vangelis; Roniotis, Alexandros; Banitsas, Konstantinos; Graf, Norbert; Marias, Konstantinos
2010-01-01
This paper presents a novel, open access interactive platform for 3D medical image analysis, simulation and visualization, focusing on oncology images. The platform was developed through constant interaction and feedback from expert clinicians, integrating a thorough analysis of their requirements, with the ultimate goal of assisting in accurately delineating tumors. It allows clinicians not only to work with a large number of 3D tomographic datasets but also to efficiently annotate multiple regions of interest in the same session. Manual and semi-automatic segmentation techniques combined with integrated correction tools assist in the quick and refined delineation of tumors, while different users can add different components related to oncology such as tumor growth and simulation algorithms for improving therapy planning. The platform has been tested by different users and over a large number of heterogeneous tomographic datasets to ensure stability, usability, extensibility and robustness, with promising results. The platform, a manual, and tutorial videos are available at http://biomodeling.ics.forth.gr. It is free to use under the GNU General Public License.
Mapping snow cover using multi-source satellite data on big data platforms
NASA Astrophysics Data System (ADS)
Lhermitte, Stef
2017-04-01
Snowmelt is an important and dynamically changing water resource in mountainous regions around the world. In this framework, remote sensing of snow cover provides an essential input for hydrological models to model the water contribution from remote mountain areas and to understand how this water resource might change as a result of climate change. Traditionally, however, many of these remote sensing products show a trade-off between spatial and temporal resolution (e.g., 16-day Landsat at 30m vs. daily MODIS at 500m resolution). With the advent of Sentinel-1 and 2 and the PROBA-V 100m products this trade-off can partially be tackled by having data that correspond more closely to the spatial and temporal variations in snow cover typically observed over complex mountain areas. This study first provides a quantitative analysis of the trade-offs between the state-of-the-art snow cover mapping methodologies for Landsat, MODIS, PROBA-V, Sentinel-1 and 2 and applies them on big data platforms such as Google Earth Engine (GEE), the RSS (ESA Research Service & Support) CloudToolbox, and the PROBA-V Mission Exploitation Platform (MEP). Second, it combines the different sensor data-cubes in one multi-sensor classification approach using newly developed spatio-temporal probability classifiers within the big data platform environments. Analysis of the spatio-temporal differences in derived snow cover areas from the different sensors reveals the importance of understanding the spatial and temporal scales at which variations occur. Moreover, it shows the importance of i) temporal resolution when monitoring highly dynamical properties such as snow cover and of ii) differences in satellite viewing angles over complex mountain areas. Finally, it highlights the potential and drawbacks of big data platforms for combining multi-source satellite data for monitoring dynamical processes such as snow cover.
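As an illustration of the kind of per-sensor mapping step run on such platforms, the sketch below computes an NDSI-threshold snow mask from a Sentinel-2 composite with the Google Earth Engine Python API. The region, date range, band pair, and 0.4 threshold are illustrative assumptions, not the settings used in the study.

```python
# Illustrative NDSI-based snow mask from Sentinel-2 in Google Earth Engine.
# Region, dates, and the 0.4 threshold are placeholders, not the study's settings.
import ee

ee.Initialize()

region = ee.Geometry.Rectangle([10.0, 46.0, 11.0, 47.0])  # hypothetical Alpine tile
composite = (ee.ImageCollection('COPERNICUS/S2_SR')
             .filterBounds(region)
             .filterDate('2021-01-01', '2021-03-31')
             .median())

ndsi = composite.normalizedDifference(['B3', 'B11']).rename('NDSI')  # green vs. SWIR
snow = ndsi.gt(0.4)  # commonly used NDSI threshold for snow

fraction = snow.reduceRegion(reducer=ee.Reducer.mean(), geometry=region, scale=20)
print('snow-covered fraction:', fraction.getInfo())
```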
Sutherland, J David; Tu, Noah P; Nemcek, Thomas A; Searle, Philip A; Hochlowski, Jill E; Djuric, Stevan W; Pan, Jeffrey Y
2014-04-01
A flexible and integrated flow-chemistry-synthesis-purification compound-generation and sample-management platform has been developed to accelerate the production of small-molecule organic-compound drug candidates in pharmaceutical research. Central to the integrated system is a Mitsubishi robot, which hands off samples throughout the process to the next station, including synthesis and purification, sample dispensing for purity and quantification analysis, dry-down, and aliquot generation.
Multi-source Geospatial Data Analysis with Google Earth Engine
NASA Astrophysics Data System (ADS)
Erickson, T.
2014-12-01
The Google Earth Engine platform is a cloud computing environment for data analysis that combines a public data catalog with a large-scale computational facility optimized for parallel processing of geospatial data. The data catalog is a multi-petabyte archive of georeferenced datasets that include images from Earth observing satellite and airborne sensors (examples: USGS Landsat, NASA MODIS, USDA NAIP), weather and climate datasets, and digital elevation models. Earth Engine supports both a just-in-time computation model that enables real-time preview and debugging during algorithm development for open-ended data exploration, and a batch computation mode for applying algorithms over large spatial and temporal extents. The platform automatically handles many traditionally-onerous data management tasks, such as data format conversion, reprojection, and resampling, which facilitates writing algorithms that combine data from multiple sensors and/or models. Although the primary use of Earth Engine, to date, has been the analysis of large Earth observing satellite datasets, the computational platform is generally applicable to a wide variety of use cases that require large-scale geospatial data analyses. This presentation will focus on how Earth Engine facilitates the analysis of geospatial data streams that originate from multiple separate sources (and often communities) and how it enables collaboration during algorithm development and data exploration. The talk will highlight current projects/analyses that are enabled by this functionality. https://earthengine.google.org
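A minimal sketch of the multi-source style of analysis described above, combining an SRTM elevation mask with a Landsat 8 composite in the Earth Engine Python API, where reprojection and resampling are handled automatically when the datasets are combined. The region, dates, and 500 m elevation cut-off are illustrative assumptions, not an analysis from the presentation.

```python
# Illustrative multi-source Earth Engine analysis: Landsat 8 NDVI over high terrain.
# Dataset choices, region, and the 500 m threshold are placeholders.
import ee

ee.Initialize()

region = ee.Geometry.Rectangle([-122.6, 37.0, -121.8, 37.8])  # hypothetical study area
dem = ee.Image('USGS/SRTMGL1_003')

summer = (ee.ImageCollection('LANDSAT/LC08/C02/T1_L2')
          .filterBounds(region)
          .filterDate('2020-06-01', '2020-09-01')
          .median())

ndvi = summer.normalizedDifference(['SR_B5', 'SR_B4']).rename('NDVI')
highlands = dem.gt(500)  # keep only pixels above 500 m elevation

mean_ndvi = (ndvi.updateMask(highlands)
             .reduceRegion(reducer=ee.Reducer.mean(), geometry=region, scale=90))
print('mean NDVI above 500 m:', mean_ndvi.getInfo())
```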
Analysis and Modeling of DIII-D Experiments With OMFIT and Neural Networks
NASA Astrophysics Data System (ADS)
Meneghini, O.; Luna, C.; Smith, S. P.; Lao, L. L.; GA Theory Team
2013-10-01
The OMFIT integrated modeling framework is designed to facilitate experimental data analysis and enable integrated simulations. This talk introduces this framework and presents a selection of its applications to the DIII-D experiment. Examples include kinetic equilibrium reconstruction analysis; evaluation of MHD stability in the core and in the edge; and self-consistent predictive steady-state transport modeling. The OMFIT framework also provides the platform for an innovative approach based on neural networks to predict electron and ion energy fluxes. In our study a multi-layer feed-forward back-propagation neural network is built and trained over a database of DIII-D data. It is found that, given the same input parameters used by the highest-fidelity models, the neural network model is able to predict to a large degree the heat transport profiles observed in the DIII-D experiments. Once the network is built, the numerical cost of evaluating the transport coefficients is virtually nonexistent, thus making the neural network model particularly well suited for plasma control and quick exploration of operational scenarios. The implementation of the neural network model and benchmarks against experimental results and gyro-kinetic models will be discussed. Work supported in part by the US DOE under DE-FG02-95ER54309.
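The sketch below trains a small feed-forward surrogate of the kind described, mapping local plasma parameters to a heat flux and then evaluating it cheaply. It uses scikit-learn's MLPRegressor on a synthetic stand-in dataset, so the features, targets, and network size are assumptions rather than the OMFIT network or DIII-D data.

```python
# Illustrative feed-forward surrogate: local plasma parameters -> heat flux.
# Synthetic data and network size are assumptions, not the OMFIT/DIII-D model.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(1)
X = rng.uniform(size=(5000, 4))   # e.g. normalized gradients, Te/Ti, q, collisionality
y = 2.0 * X[:, 0] + np.sin(3.0 * X[:, 1]) + 0.5 * X[:, 2] * X[:, 3]  # stand-in flux

net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
net.fit(X[:4000], y[:4000])

print('validation R^2:', net.score(X[4000:], y[4000:]))
print('predicted flux for one sample:', net.predict(X[:1]))
# Once trained, evaluation is essentially free compared with first-principles models.
```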
3D Printed Bionic Nanodevices.
Kong, Yong Lin; Gupta, Maneesh K; Johnson, Blake N; McAlpine, Michael C
2016-06-01
The ability to three-dimensionally interweave biological and functional materials could enable the creation of bionic devices possessing unique and compelling geometries, properties, and functionalities. Indeed, interfacing high performance active devices with biology could impact a variety of fields, including regenerative bioelectronic medicines, smart prosthetics, medical robotics, and human-machine interfaces. Biology, from the molecular scale of DNA and proteins, to the macroscopic scale of tissues and organs, is three-dimensional, often soft and stretchable, and temperature sensitive. This renders most biological platforms incompatible with the fabrication and materials processing methods that have been developed and optimized for functional electronics, which are typically planar, rigid and brittle. A number of strategies have been developed to overcome these dichotomies. One particularly novel approach is the use of extrusion-based multi-material 3D printing, which is an additive manufacturing technology that offers a freeform fabrication strategy. This approach addresses the dichotomies presented above by (1) using 3D printing and imaging for customized, hierarchical, and interwoven device architectures; (2) employing nanotechnology as an enabling route for introducing high performance materials, with the potential for exhibiting properties not found in the bulk; and (3) 3D printing a range of soft and nanoscale materials to enable the integration of a diverse palette of high quality functional nanomaterials with biology. Further, 3D printing is a multi-scale platform, allowing for the incorporation of functional nanoscale inks, the printing of microscale features, and ultimately the creation of macroscale devices. This blending of 3D printing, novel nanomaterial properties, and 'living' platforms may enable next-generation bionic systems. In this review, we highlight this synergistic integration of the unique properties of nanomaterials with the versatility of extrusion-based 3D printing technologies to interweave nanomaterials and fabricate novel bionic devices.
Data-intensive computing on numerically-insensitive supercomputers
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ahrens, James P; Fasel, Patricia K; Habib, Salman
2010-12-03
With the advent of the era of petascale supercomputing, via the delivery of the Roadrunner supercomputing platform at Los Alamos National Laboratory, there is a pressing need to address the problem of visualizing massive petascale-sized results. In this presentation, I discuss progress on a number of approaches including in-situ analysis, multi-resolution out-of-core streaming and interactive rendering on the supercomputing platform. These approaches are placed in context by the emerging area of data-intensive supercomputing.
Design and Development of a Low-Cost Aerial Mobile Mapping System for Multi-Purpose Applications
NASA Astrophysics Data System (ADS)
Acevedo Pardo, C.; Farjas Abadía, M.; Sternberg, H.
2015-08-01
The research project with the working title "Design and development of a low-cost modular Aerial Mobile Mapping System" was formed during the last year as the result of numerous discussions and considerations with colleagues from the HafenCity University Hamburg, Department Geomatics. The aim of the project is to design a sensor platform which can preferentially be mounted on a UAV, but can also be integrated on any adaptable vehicle. The system should perform a direct scanning of surfaces with a laser scanner, supported by sensors for determining the position and attitude of the platform. The modular design allows its extension with other sensors such as multispectral cameras, digital cameras or multi-camera systems.
Searching Lost People with Uavs: the System and Results of the Close-Search Project
NASA Astrophysics Data System (ADS)
Molina, P.; Colomina, I.; Vitoria, T.; Silva, P. F.; Skaloud, J.; Kornus, W.; Prades, R.; Aguilera, C.
2012-07-01
This paper will introduce the goals, concept and results of the project named CLOSE-SEARCH, which stands for 'Accurate and safe EGNOS-SoL Navigation for UAV-based low-cost Search-And-Rescue (SAR) operations'. The main goal is to integrate a medium-size, helicopter-type Unmanned Aerial Vehicle (UAV), a thermal imaging sensor and an EGNOS-based multi-sensor navigation system, including an Autonomous Integrity Monitoring (AIM) capability, to support search operations in difficult-to-access areas and/or night operations. The focus of the paper is three-fold. Firstly, the operational and technical challenges of the proposed approach are discussed, such as the ultra-safe multi-sensor navigation system, the use of combined thermal and optical vision (infrared plus visible) for person recognition, and Beyond-Line-Of-Sight communications, among others. Secondly, the implementation of the integrity concept for UAV platforms is discussed herein through the AIM approach. Based on the potential of the geodetic quality analysis and on the use of the European EGNOS system as a navigation performance starting point, AIM approaches integrity from the precision standpoint; that is, the derivation of Horizontal and Vertical Protection Levels (HPLs, VPLs) from a realistic precision estimation of the position parameters is performed and compared to predefined Alert Limits (ALs). Finally, some results from the project test campaigns are described to report on particular project achievements. Together with actual Search-and-Rescue teams, the system was operated in realistic, user-chosen test scenarios. In this context, and especially focusing on the EGNOS-based UAV navigation, the AIM capability and also the RGB/thermal imaging subsystem, a summary of the results is presented.
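A minimal sketch of the precision-based protection-level comparison described for AIM, under the assumption of a Gaussian position-error model in which protection levels are scaled standard deviations of the estimated position; the covariance values, scale factors, and alert limits are illustrative placeholders, not the project's algorithm or figures.

```python
# Illustrative precision-based integrity check: derive HPL/VPL from a position
# covariance estimate and compare to alert limits. All numbers are placeholders.
import numpy as np

P = np.array([[1.2, 0.3, 0.0],
              [0.3, 0.9, 0.0],
              [0.0, 0.0, 4.0]])   # hypothetical ENU position covariance [m^2]

k_h, k_v = 6.0, 5.3              # assumed scale factors for the target risk level
sigma_h = np.sqrt(np.max(np.linalg.eigvalsh(P[:2, :2])))  # worst-case horizontal sigma
sigma_v = np.sqrt(P[2, 2])

HPL, VPL = k_h * sigma_h, k_v * sigma_v
HAL, VAL = 10.0, 15.0            # hypothetical alert limits [m]
print('HPL=%.1f m, VPL=%.1f m, available=%s' % (HPL, VPL, HPL < HAL and VPL < VAL))
```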
NASA Astrophysics Data System (ADS)
Torres-Martínez, J. A.; Seddaiu, M.; Rodríguez-Gonzálvez, P.; Hernández-López, D.; González-Aguilera, D.
2015-02-01
The complexity of archaeological sites makes it difficult to obtain an integral model using current Geomatic techniques (i.e. aerial and close-range photogrammetry and terrestrial laser scanning) individually, so a multi-sensor approach is proposed as the best solution to provide a 3D reconstruction and visualization of these complex sites. Sensor registration represents a riveting milestone when automation is required and when aerial and terrestrial datasets must be integrated. To this end, several problems must be solved: coordinate system definition, geo-referencing, co-registration of point clouds, geometric and radiometric homogeneity, etc. Last but not least, safeguarding of tangible archaeological heritage and its associated intangible expressions entails a multi-source data approach in which heterogeneous material (historical documents, drawings, archaeological techniques, habits of living, etc.) should be collected and combined with the resulting hybrid 3D models. The proposed multi-data source and multi-sensor approach is applied to the study case of the "Tolmo de Minateda" archaeological site. A total extension of 9 ha is reconstructed, with an adapted level of detail, by an ultralight aerial platform (paratrike), an unmanned aerial vehicle, a terrestrial laser scanner and terrestrial photogrammetry. In addition, the defensive nature of the site (i.e. with the presence of three different defensive walls) together with the considerable stratification of the archaeological site (i.e. with different archaeological surfaces and constructive typologies) requires that tangible and intangible archaeological heritage expressions be integrated with the hybrid 3D models obtained, to analyse, understand and exploit the archaeological site by different experts and heritage stakeholders.
Thermo-optic devices on polymer platform
NASA Astrophysics Data System (ADS)
Zhang, Ziyang; Keil, Norbert
2016-03-01
Optical polymers possess in general relatively high thermo-optic coefficients and at the same time low thermal conductivity, both of which make them attractive material candidates for realizing highly efficient thermally tunable devices. Over the years, various thermo-optic components have been demonstrated on polymer platform, covering (1) tunable reflectors and filters as part of a laser cavity, (2) variable optical attenuators (VOAs) as light amplitude regulators in e.g. a coherent receiver, and (3) thermo-optic switches (TOSs) allowing multi-flow control in the photonic integrated circuits (PICs). This work attempts to review the recent progress on the above mentioned three component branches, including linearly and differentially tunable filters, VOAs based on 1×1 multimode interference structure (MMI) and Mach-Zehnder interferometer (MZI), and 1×2 TOS based on waveguide Y-branch, driven by a pair of sidelong placed heater electrodes. These thermo-optic components can well be integrated into larger PICs: the dual-polarization switchable tunable laser and the colorless optical 90° hybrid are presented in the end as examples.
Lo, Mu-Chieh; Guzmán, Robinson; Gordón, Carlos; Carpintero, Guillermo
2017-04-15
This Letter presents a photonics-based millimeter wave and terahertz frequency synthesizer using a monolithic InP photonic integrated circuit composed of a mode-locked laser (MLL) and two pulse interleaver stages to multiply the repetition rate frequency. The MLL is a multiple colliding pulse MLL producing an 80 GHz repetition rate pulse train. Through two consecutive monolithic pulse interleaver structures, each doubling the repetition rate, we demonstrate the achievement of 160 and 320 GHz. The fabrication was done on a multi-project wafer run of a generic InP photonic technology platform.
The Third Phase of AQMEII: Evaluation Strategy and Multi-Model Performance Analysis
AQMEII (Air Quality Model Evaluation International Initiative) is an extraordinary effort promoting policy-relevant research on regional air quality model evaluation across the European and North American atmospheric modelling communities, providing the ideal platform for advanci...
Virtual health platform for medical tourism purposes.
Martinez, Debora; Ferriol, Pedro; Tous, Xisco; Cabrer, Miguel; Prats, Mercedes
2008-01-01
This paper introduces an overview of the Virtual Health Platform (VHP), an alternative approach to create a functional PHR system in a medical tourism environment. The proposed platform has been designed to be integrated with EHR infrastructures and in this way it is expected to be useful and more advantageous to the patient or tourist. Use cases of the VHP and its potential benefits summarize the analysis.
EPOS Thematic Core Service ANTHROPOGENIC HAZARDS (TCS AH) - development of e-research platform
NASA Astrophysics Data System (ADS)
Orlecka-Sikora, Beata
2017-04-01
TCS AH is based on the IS-EPOS Platform. The Platform facilitates research on anthropogenic hazards and is available online, free of charge, at https://tcs.ah-epos.eu/. The Platform is the final product of the IS-EPOS project, funded by the national programme POIG, which was implemented in 2013-2015 (POIG.02.03.00-14-090/13-00). The platform is the result of joint work by the scientific community and industrial partners. Currently, the development of TCS AH is carried out under the EPOS IP project (H2020-INFRADEV-1-2015-1, INFRADEV-3-2015). The Platform is an open virtual access point for researchers and Ph.D. students interested in anthropogenic seismicity and related hazards. This environment is designed to give a researcher the maximum possible liberty for experimentation by providing a virtual laboratory in which the researcher can design their own processing streams and process the data integrated on the platform. TCS AH integrates data and specific high-level services. Data are gathered in so-called "episodes", comprehensively describing a geophysical process, induced or triggered by human technological activity, which, under certain circumstances, can become hazardous for people, infrastructure and the environment. Seven sets of seismic, geological and technological data have been made available on the Platform. The data come from Poland, Germany, the UK and Vietnam, and refer to underground mining, reservoir impoundment, shale gas exploitation and geothermal energy production. At least 19 further episodes related to conventional hydrocarbon extraction, reservoir treatment, underground mining and geothermal energy production are being integrated within the framework of the EPOS IP project. The heterogeneous multi-disciplinary data (seismic, displacement, geomechanical data, production data, etc.) are transformed to unified structures to form integrated and validated datasets. To deal with these various data, problem-oriented services were designed and implemented. Particular attention in service preparation was devoted to methods analyzing correlations between technology, geophysical response and the resulting hazard. TCS AH contains a number of computing and data visualization services, which give the opportunity to make graphical presentations of the available data. Further development of the Platform, besides the integration of new episodes covering all types of anthropogenic hazards, will gradually cover the implementation of new services. The TCS AH platform is open to the whole research community. The platform is also designed to be used in research projects, e.g. it serves the "Shale gas exploration and exploitation induced risks (SHEER)" project (Horizon 2020, call LCE 16-2014). In addition, it is also meant to serve the public sector with expert knowledge and background information. In order to fulfill this aim, services for outreach, dissemination and communication will be implemented. TCS AH was used as a teaching tool in Ph.D. education within the IG PAS seismology course for Ph.D. candidates and Interdisciplinary Polar Studies, as well as in several workshops for Polish and international students. Additionally, the platform is also used within the educational project ERIS (Exploitation of Research results In School practice), aimed at junior high and high schools and funded with support from the European Commission within the ERASMUS+ Programme.
A platform for real-time online health analytics during spaceflight
NASA Astrophysics Data System (ADS)
McGregor, Carolyn
Monitoring the health and wellbeing of astronauts during spaceflight is an important aspect of any manned mission. To date the monitoring has been based on a sequential set of discontinuous samplings of physiological data to support initial studies on aspects such as weightlessness and its impact on the cardiovascular system, and to perform proactive monitoring of health status. The research performed and the real-time monitoring have been hampered by the lack of a platform to enable a more continuous approach to real-time monitoring. While any spaceflight is monitored heavily by Mission Control, an important requirement within the context of any spaceflight setting, and in particular where there are extended periods with a lack of communication with Mission Control, is the ability for the mission to operate in an autonomous manner. This paper presents a platform to enable real-time astronaut monitoring for prognostics and health management within space medicine using online health analytics. The platform is based on extending previous online health analytics research known as the Artemis and Artemis Cloud platforms, which have demonstrated their relevance for multi-patient, multi-diagnosis and multi-stream temporal analysis in real-time for clinical management and research within Neonatal Intensive Care. Artemis and Artemis Cloud source data from a range of medical devices capable of transmission of the signal via wired or wireless connectivity and hence are well suited to process real-time data acquired from astronauts. A key benefit of this platform is its ability to monitor their health and wellbeing onboard the mission as well as enabling the astronaut's physiological data, and other clinical data, to be sent to the platform components at Mission Control at each stage when that communication is available. As a result, researchers at Mission Control would be able to simulate, deploy and tailor predictive analytics and diagnostics during the same spaceflight for greater medical support.
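The sketch below illustrates the kind of continuous, window-based analytic such a platform would run over a real-time physiological stream, flagging a sustained out-of-range heart-rate episode; the window length, thresholds, and synthetic stream are assumptions and do not represent Artemis functionality.

```python
# Illustrative sliding-window analytic over a real-time heart-rate stream.
# The 60-sample window and threshold rule are placeholders, not Artemis logic.
from collections import deque

WINDOW = 60  # e.g. one minute of 1 Hz samples

def monitor(sample_stream, low=40.0, high=120.0):
    window = deque(maxlen=WINDOW)
    for t, hr in sample_stream:                 # (timestamp, heart rate) pairs
        window.append(hr)
        if len(window) == WINDOW:
            mean_hr = sum(window) / WINDOW
            if not (low <= mean_hr <= high):
                yield (t, mean_hr)              # flag a sustained out-of-range episode

# Usage with a synthetic stream containing an elevated episode:
stream = ((t, 70.0 + (65.0 if 200 <= t < 300 else 0.0)) for t in range(600))
for alert in monitor(stream):
    print('alert at t=%d s, mean HR=%.1f bpm' % alert)
```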
NASA Technical Reports Server (NTRS)
Chamis, Christos C.; Abumeri, Galib H.
2000-01-01
Aircraft engines are assemblies of dynamically interacting components. Engine updates to keep present aircraft flying safely and engines for new aircraft are progressively required to operate under more demanding technological and environmental requirements. Designs to effectively meet those requirements are necessarily collections of multi-scale, multi-level, multi-disciplinary analysis and optimization methods, and probabilistic methods are necessary to quantify the respective uncertainties. These types of methods are the only ones that can formally evaluate advanced composite designs which satisfy those progressively demanding requirements while assuring minimum cost, maximum reliability and maximum durability. Recent research activities at NASA Glenn Research Center have focused on developing multi-scale, multi-level, multidisciplinary analysis and optimization methods. Multi-scale refers to formal methods which describe complex material behavior, metal or composite; multi-level refers to integration of participating disciplines to describe a structural response at the scale of interest; multidisciplinary refers to an open-ended collection of existing and yet-to-be-developed discipline constructs required to formally predict/describe a structural response in engine operating environments. For example, these include but are not limited to: multi-factor models for material behavior, multi-scale composite mechanics, general purpose structural analysis, progressive structural fracture for evaluating durability and integrity, noise and acoustic fatigue, emission requirements, hot fluid mechanics, heat-transfer and probabilistic simulations. Many of these, as well as others, are encompassed in an integrated computer code identified as Engine Structures Technology Benefits Estimator (EST/BEST) or Multi-faceted/Engine Structures Optimization (MP/ESTOP). The discipline modules integrated in MP/ESTOP include: engine cycle (thermodynamics), engine weights, internal fluid mechanics, cost, mission and coupled structural/thermal, various composite property simulators and probabilistic methods to evaluate uncertainty effects (scatter ranges) in all the design parameters. The objective of the proposed paper is to briefly describe a multi-faceted design analysis and optimization capability for coupled multi-discipline engine structures optimization. Results are presented for engine and aircraft type metrics to illustrate the versatility of that capability. Results are also presented for reliability, noise and fatigue to illustrate its inclusiveness. For example, replacing metal rotors with composites reduces engine weight by 20 percent, reduces noise by 15 percent, and improves reliability by an order of magnitude. Composite designs exist to increase fatigue life by at least two orders of magnitude compared to state-of-the-art metals.
SHARP pre-release v1.0 - Current Status and Documentation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mahadevan, Vijay S.; Rahaman, Ronald O.
The NEAMS Reactor Product Line effort aims to develop an integrated multiphysics simulation capability for the design and analysis of future generations of nuclear power plants. The Reactor Product Line code suite's multi-resolution hierarchy is being designed to ultimately span the full range of length and time scales present in relevant reactor design and safety analyses, as well as scale from desktop to petaflop computing platforms. In this report, building on a previous report issued in September 2014, we describe our continued efforts to integrate thermal/hydraulics, neutronics, and structural mechanics modeling codes to perform coupled analysis of a representative fast sodium-cooled reactor core in preparation for a unified release of the toolkit. The work reported in the current document covers the software engineering aspects of managing the entire stack of components in the SHARP toolkit and the continuous integration efforts ongoing to prepare a release candidate for interested reactor analysis users. Here we report on the continued integration effort of PROTEUS/Nek5000 and Diablo into the NEAMS framework and the software processes that enable users to utilize the capabilities without losing scientific productivity. Due to the complexity of the individual modules and their necessary/optional dependency library chain, we focus on the configuration and build aspects for the SHARP toolkit, which includes the capability to auto-download dependencies and configure/install with optimal flags in an architecture-aware fashion. Such complexity is untenable without strong software engineering processes such as source management, source control, change reviews, unit tests, integration tests and continuous test suites. Details on these processes are provided in the report as a building step for a SHARP user guide that will accompany the first release, expected by March 2016.
Capture and 3D culture of colonic crypts and colonoids in a microarray platform.
Wang, Yuli; Ahmad, Asad A; Shah, Pavak K; Sims, Christopher E; Magness, Scott T; Allbritton, Nancy L
2013-12-07
Crypts are the basic structural and functional units of colonic epithelium and can be isolated from the colon and cultured in vitro into multi-cell spheroids termed "colonoids". Both crypts and colonoids are ideal building blocks for construction of an in vitro tissue model of the colon. Here we proposed and tested a microengineered platform for capture and in vitro 3D culture of colonic crypts and colonoids. An integrated platform was fabricated from polydimethylsiloxane which contained two fluidic layers separated by an array of cylindrical microwells (150 μm diameter, 150 μm depth) with perforated bottoms (30 μm opening, 10 μm depth) termed "microstrainers". As fluid moved through the array, crypts or colonoids were retained in the microstrainers with a >90% array-filling efficiency. Matrigel as an extracellular matrix was then applied to the microstrainers to generate isolated Matrigel pockets encapsulating the crypts or colonoids. After supplying the essential growth factors, epidermal growth factor, Wnt-3A, R-spondin 2 and noggin, 63 ± 13% of the crypts and 77 ± 8% of the colonoids cultured in the microstrainers over a 48-72 h period formed viable 3D colonoids. Thus colonoid growth on the array was similar to that under standard culture conditions (78 ± 5%). Additionally the colonoids displayed the same morphology and similar numbers of stem and progenitor cells as those under standard culture conditions. Immunofluorescence staining confirmed that the differentiated cell types of the colon, goblet cells, enteroendocrine cells and absorptive enterocytes, formed on the array. To demonstrate the utility of the array in tracking colonoid fate, quantitative fluorescence analysis was performed on the arrayed colonoids exposed to reagents such as Wnt-3A and the γ-secretase inhibitor LY-411575. The successful formation of viable, multi-cell type colonic tissue on the microengineered platform represents a first step in the building of a "colon-on-a-chip" with the goal of producing the physiologic structure and organ-level function of the colon for controlled experiments.
Linking earth science informatics resources into uninterrupted digital value chains
NASA Astrophysics Data System (ADS)
Woodcock, Robert; Angreani, Rini; Cox, Simon; Fraser, Ryan; Golodoniuc, Pavel; Klump, Jens; Rankine, Terry; Robertson, Jess; Vote, Josh
2015-04-01
The CSIRO Mineral Resources Flagship was established to tackle medium- to long-term challenges facing the Australian mineral industry across the value chain from exploration and mining through mineral processing within the framework of an economically, environmentally and socially sustainable minerals industry. This broad portfolio demands collaboration and data exchange with a broad range of participants and data providers across government, research and industry. It is an ideal environment to link geoscience informatics platforms to application across the resource extraction industry and to unlock the value of data integration between traditionally discrete parts of the minerals digital value chain. Despite the potential benefits, data integration remains an elusive goal within research and industry. Many projects use only a subset of available data types in an integrated manner, often maintaining the traditional discipline-based data 'silos'. Integrating data across the entire minerals digital value chain is an expensive proposition involving multiple disciplines and, significantly, multiple data sources both internal and external to any single organisation. Differing vocabularies and data formats, along with access regimes to appropriate analysis software and equipment all hamper the sharing and exchange of information. AuScope has addressed the challenge of data exchange across organisations nationally, and established a national geosciences information infrastructure using open standards-based web services. Federated across a wide variety of organisations, the resulting infrastructure contains a wide variety of live and updated data types. The community data standards and infrastructure platforms that underpin AuScope provide important new datasets and multi-agency links independent of software and hardware differences. AuScope has thus created an infrastructure, a platform of technologies and the opportunity for new ways of working with and integrating disparate data at much lower cost. An early example of this approach is the value generated by combining geological and metallurgical data sets as part of the rapidly growing field of geometallurgy. This not only provides a far better understanding of the impact of geological variability on ore processing but also leads to new thinking on the types and characteristics of data sets collected at various stages of the exploration and mining process. The Minerals Resources Flagship is linking its research activities to the AuScope infrastructure, exploiting the technology internally to create a platform for integrated research across the minerals value chain and improved interaction with industry. Referred to as the 'Early Access Virtual Lab', the system will be fully interoperable with AuScope and international infrastructures using open standards like GeosciML. Secured access is provided to allow confidential collaboration with industry when required. This presentation will discuss how the CSIRO Mineral Resources Flagship is building on the AuScope infrastructure to transform the way that data and data products are identified, shared, integrated, and reused, to unlock the benefits of true integration of research efforts across the minerals digital value chain.
S-Genius, a universal software platform with versatile inverse problem resolution for scatterometry
NASA Astrophysics Data System (ADS)
Fuard, David; Troscompt, Nicolas; El Kalyoubi, Ismael; Soulan, Sébastien; Besacier, Maxime
2013-05-01
S-Genius is a new universal scatterometry platform, which gathers all the LTM-CNRS know-how regarding rigorous electromagnetic computation and several inverse problem solver solutions. This software platform is built to be a user-friendly, light, swift, accurate, user-oriented scatterometry tool, compatible with any ellipsometric measurements to fit and any type of pattern. It aims to combine a set of inverse problem solver capabilities — via adapted Levenberg-Marquardt optimization, kriging, and neural network solutions — that greatly improve the reliability and the speed of the solution determination. Furthermore, as the model solution is mainly sensitive to material optical properties, S-Genius may be coupled with an innovative determination of material refractive indices. This paper focuses in more detail on the modified Levenberg-Marquardt optimization, one of the inverse problem solvers built up in parallel with the overall S-Genius software. This modified Levenberg-Marquardt optimization corresponds to a Newton algorithm with a damping parameter adapted to the definition domains of the optimized parameters. Currently, S-Genius is technically ready for scientific collaboration, python-powered, multi-platform (Windows/Linux/macOS), multi-core, ready for 2D (infinite features along the direction perpendicular to the incident plane), conical, and 3D-feature computation, compatible with all kinds of input data from any possible ellipsometers (angle or wavelength resolved) or reflectometers, and widely used in our laboratory for resist trimming studies, etching feature characterization (such as complex stacks) or nano-imprint lithography measurements, for instance. The work on the kriging solver, neural network solver and material refractive index determination is done (or nearly done) by other LTM members and is about to be integrated into the S-Genius platform.
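A minimal sketch of a damped Levenberg-Marquardt iteration in which the step is kept inside the parameters' definition domains by clipping, illustrating the adapted-damping idea described above; the toy model, bounds, and acceptance rule are assumptions and not the S-Genius solver.

```python
# Illustrative Levenberg-Marquardt iteration with parameter bounds (clipping).
# The model and bounds are placeholders; the S-Genius solver differs in detail.
import numpy as np

def model(p, x):                       # toy "signature" model: damped cosine
    return p[0] * np.exp(-p[1] * x) * np.cos(p[2] * x)

def jacobian(p, x, eps=1e-6):          # forward-difference Jacobian
    J = np.empty((x.size, p.size))
    f0 = model(p, x)
    for j in range(p.size):
        dp = p.copy(); dp[j] += eps
        J[:, j] = (model(dp, x) - f0) / eps
    return J

def lm_fit(p, x, y, bounds, lam=1e-2, iters=50):
    for _ in range(iters):
        r = y - model(p, x)
        J = jacobian(p, x)
        A = J.T @ J
        step = np.linalg.solve(A + lam * np.diag(np.diag(A)), J.T @ r)
        p_new = np.clip(p + step, bounds[0], bounds[1])   # respect definition domains
        if np.sum((y - model(p_new, x))**2) < np.sum(r**2):
            p, lam = p_new, lam * 0.5                     # accept step, relax damping
        else:
            lam *= 2.0                                    # reject step, raise damping
    return p

x = np.linspace(0.0, 5.0, 200)
true_p = np.array([1.5, 0.4, 3.0])
y = model(true_p, x) + 0.01 * np.random.default_rng(0).normal(size=x.size)
print(lm_fit(np.array([1.0, 0.1, 2.5]), x, y,
             (np.array([0.0, 0.0, 0.0]), np.array([5.0, 2.0, 10.0]))))
```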
Hostetter, Jason; Khanna, Nishanth; Mandell, Jacob C
2018-06-01
The purpose of this study was to integrate web-based forms with a zero-footprint cloud-based Picture Archiving and Communication System (PACS) to create a tool of potential benefit to radiology research and education. Web-based forms were created with a front-end and back-end architecture utilizing common programming languages including Vue.js, Node.js and MongoDB, and integrated into an existing zero-footprint cloud-based PACS. The web-based forms application can be accessed in any modern internet browser on desktop or mobile devices and allows the creation of customizable forms consisting of a variety of question types. Each form can be linked to an individual DICOM examination or a collection of DICOM examinations. Several uses are demonstrated through a series of case studies, including implementation of a research platform for multi-reader multi-case (MRMC) studies and other imaging research, and creation of an online Objective Structured Clinical Examination (OSCE) and an educational case file. Copyright © 2018 The Association of University Radiologists. Published by Elsevier Inc. All rights reserved.
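To illustrate the data linkage the authors describe, the sketch below stores a form definition keyed to a DICOM Study Instance UID and a reader's response in MongoDB via pymongo; the collection names, fields, and placeholder UID are assumptions, not the application's actual schema (which uses a Node.js back end).

```python
# Illustrative data model: a form linked to a DICOM examination by its
# Study Instance UID. Collection names and fields are assumed, not the paper's schema.
from pymongo import MongoClient

client = MongoClient('mongodb://localhost:27017')
db = client['forms_demo']

form = {
    'title': 'MRMC reader study - case 12',
    'questions': [
        {'type': 'choice', 'text': 'BI-RADS category', 'options': ['1', '2', '3', '4', '5']},
        {'type': 'text', 'text': 'Free-text impression'},
    ],
    'study_instance_uid': '1.2.840.xxxxx',   # placeholder UID of the linked exam
}
form_id = db.forms.insert_one(form).inserted_id

response = {'form_id': form_id, 'reader': 'reader_01',
            'answers': {'BI-RADS category': '3', 'Free-text impression': 'probably benign'}}
db.responses.insert_one(response)

# All responses for a given DICOM examination:
for doc in db.responses.find({'form_id': form_id}):
    print(doc['reader'], doc['answers'])
```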
NASA Astrophysics Data System (ADS)
Ding, Yichen; Yu, Jing; Abiri, Arash; Abiri, Parinaz; Lee, Juhyun; Chang, Chih-Chiang; Baek, Kyung In; Sevag Packard, René R.; Hsiai, Tzung K.
2018-02-01
There currently is a limited ability to interactively study developmental cardiac mechanics and physiology. We therefore combined light-sheet fluorescence microscopy (LSFM) with virtual reality (VR) to provide a hybrid platform for 3-dimensional (3-D) architecture and time-dependent cardiac contractile function characterization. By taking advantage of the rapid acquisition, high axial resolution, low phototoxicity, and high fidelity in 3-D and 4-D (3-D spatial + 1-D time or spectra), this VR-LSFM hybrid methodology enables interactive visualization and quantification otherwise not available by conventional methods such as routine optical microscopes. We hereby demonstrate multi-scale applicability of VR-LSFM to 1) interrogate skin fibroblasts interacting with a hyaluronic acid-based hydrogel, 2) navigate through the endocardial trabecular network during zebrafish development, and 3) localize gene therapy-mediated potassium channel expression in adult murine hearts. We further combined our batch intensity normalized segmentation (BINS) algorithm with deformable image registration (DIR) to interface a VR environment for the analysis of cardiac contraction. Thus, the VR-LSFM hybrid platform demonstrates an efficient and robust framework for creating a user-directed microenvironment in which we uncovered developmental cardiac mechanics and physiology with high spatiotemporal resolution.
Xi-cam: a versatile interface for data visualization and analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pandolfi, Ronald J.; Allan, Daniel B.; Arenholz, Elke
Xi-cam is an extensible platform for data management, analysis and visualization. Xi-cam aims to provide a flexible and extensible approach to synchrotron data treatment as a solution to rising demands for high-volume/high-throughput processing pipelines. The core of Xi-cam is an extensible plugin-based graphical user interface platform which provides users with an interactive interface to processing algorithms. Plugins are available for SAXS/WAXS/GISAXS/GIWAXS, tomography and NEXAFS data. With Xi-cam's 'advanced' mode, data processing steps are designed as a graph-based workflow, which can be executed live, locally or remotely. Remote execution utilizes high-performance computing or de-localized resources, allowing for the effective reduction of high-throughput data. Xi-cam's plugin-based architecture targets cross-facility and cross-technique collaborative development, in support of multi-modal analysis. Xi-cam is open-source and cross-platform, and available for download on GitHub.
NASA Astrophysics Data System (ADS)
Verma, Sneha K.; Liu, Brent J.; Gridley, Daila S.; Mao, Xiao W.; Kotha, Nikhil
2015-03-01
In previous years we demonstrated an imaging informatics system designed to support multi-institutional research focused on the utilization of proton radiation for treating spinal cord injury (SCI)-related pain. This year we will demonstrate an update on the system with new modules added to perform image processing on evaluation data using immunohistochemistry methods to observe effects of proton therapy. The overarching goal of the research is to determine the effectiveness of using the proton beam for treating SCI-related neuropathic pain as an alternative to invasive surgical lesioning. The research is a joint collaboration between three major institutes: University of Southern California (data collection/integration and image analysis), Spinal Cord Institute VA Healthcare System, Long Beach (patient subject recruitment), and Loma Linda University and Medical Center (human and preclinical animal studies). The system that we are presenting is one of a kind, capable of integrating a large range of data types, including text data, imaging data, DICOM objects from proton therapy treatment and pathological data. For multi-institutional studies, keeping data secure and integrated is crucial. Different kinds of data are generated at different stages of the study workflow and by different groups of people, who process and analyze them in order to reveal hidden patterns within healthcare data from a broader perspective. The uniqueness of our system lies in the fact that it is platform-independent and web-based, which makes it very useful in such a large-scale study.
NASA Technical Reports Server (NTRS)
Newsom, Jerry R.
1991-01-01
Control-Structures Interaction (CSI) technology embraces the understanding of the interaction between the spacecraft structure and the control system, and the creation and validation of concepts, techniques, and tools, for enabling the interdisciplinary design of an integrated structure and control system, rather than the integration of a structural design and a control system design. The goal of this program is to develop validated CSI technology for integrated design/analysis and qualification of large flexible space systems and precision space structures. A description of the CSI technology program is presented.
Vaccarino, Anthony L; Dharsee, Moyez; Strother, Stephen; Aldridge, Don; Arnott, Stephen R; Behan, Brendan; Dafnas, Costas; Dong, Fan; Edgecombe, Kenneth; El-Badrawi, Rachad; El-Emam, Khaled; Gee, Tom; Evans, Susan G; Javadi, Mojib; Jeanson, Francis; Lefaivre, Shannon; Lutz, Kristen; MacPhee, F Chris; Mikkelsen, Jordan; Mikkelsen, Tom; Mirotchnick, Nicholas; Schmah, Tanya; Studzinski, Christa M; Stuss, Donald T; Theriault, Elizabeth; Evans, Kenneth R
2018-01-01
Historically, research databases have existed in isolation with no practical avenue for sharing or pooling medical data into high dimensional datasets that can be efficiently compared across databases. To address this challenge, the Ontario Brain Institute's "Brain-CODE" is a large-scale neuroinformatics platform designed to support the collection, storage, federation, sharing and analysis of different data types across several brain disorders, as a means to understand common underlying causes of brain dysfunction and develop novel approaches to treatment. By providing researchers access to aggregated datasets that they otherwise could not obtain independently, Brain-CODE incentivizes data sharing and collaboration and facilitates analyses both within and across disorders and across a wide array of data types, including clinical, neuroimaging and molecular. The Brain-CODE system architecture provides the technical capabilities to support (1) consolidated data management to securely capture, monitor and curate data, (2) privacy and security best-practices, and (3) interoperable and extensible systems that support harmonization, integration, and query across diverse data modalities and linkages to external data sources. Brain-CODE currently supports collaborative research networks focused on various brain conditions, including neurodevelopmental disorders, cerebral palsy, neurodegenerative diseases, epilepsy and mood disorders. These programs are generating large volumes of data that are integrated within Brain-CODE to support scientific inquiry and analytics across multiple brain disorders and modalities. By providing access to very large datasets on patients with different brain disorders and enabling linkages to provincial, national and international databases, Brain-CODE will help to generate new hypotheses about the biological bases of brain disorders, and ultimately promote new discoveries to improve patient care.
TriPleX: a versatile dielectric photonic platform
NASA Astrophysics Data System (ADS)
Wörhoff, Kerstin; Heideman, René G.; Leinse, Arne; Hoekman, Marcel
2015-04-01
Photonic applications based on planar waveguide technology impose stringent requirements on properties such as optical propagation losses, light coupling to optical fibers, integration density, as well as on reliability and reproducibility. The latter is correlated to a high level of control of the refractive index and waveguide geometry. In this paper, we review a versatile dielectric waveguide platform, called TriPleX, which is based on alternating silicon nitride and silicon dioxide films. Fabrication with CMOS-compatible equipment based on low-pressure chemical vapor deposition enables the realization of stable material compositions being a prerequisite to the control of waveguide properties and modal shape. The transparency window of both materials allows for the realization of low-loss waveguides over a wide wavelength range (400 nm-2.35 μm). Propagation losses as low as 5×10-4 dB/cm are reported. Three basic geometries (box shell, double stripe, and filled box) can be distinguished. A specific tapering technology is developed for on-chip, low-loss (<0.1 dB) spotsize convertors, allowing for combining efficient fiber to chip coupling with high-contrast waveguides required for increased functional complexity as well as for hybrid integration with other photonic platforms such as InP and SOI. The functionality of the TriPleX platform is captured by verified basic building blocks. The corresponding library and associated design kit is available for multi-project wafer (MPW) runs. Several applications of this platform technology in communications, biomedicine, sensing, as well as a few special fields of photonics are treated in more detail.
CHRONIOUS: a wearable platform for monitoring and management of patients with chronic disease.
Bellos, Christos; Papadopoulos, Athanassios; Rosso, Roberto; Fotiadis, Dimitrios I
2011-01-01
The CHRONIOUS system has been developed based on an open architecture design that consists of a set of subsystems which interact in order to provide all the needed services to the chronic disease patients. An advanced multi-parametric expert system is being implemented that fuses information effectively from various sources using intelligent techniques. Data are collected by sensors of a body network controlling vital signals while additional tools record dietary habits and plans, drug intake, environmental and biochemical parameters and activity data. The CHRONIOUS platform provides guidelines and standards for the future generations of "chronic disease management systems" and facilitates sophisticated monitoring tools. In addition, an ontological information retrieval system is being delivered satisfying the necessities for up-to-date clinical information of Chronic Obstructive pulmonary disease (COPD) and Chronic Kidney Disease (CKD). Moreover, support tools are being embedded in the system, such as the Mental Tools for the monitoring of patient mental health status. The integrated platform provides real-time patient monitoring and supervision, both indoors and outdoors and represents a generic platform for the management of various chronic diseases.
Moon-based Earth Observation for Large Scale Geoscience Phenomena
NASA Astrophysics Data System (ADS)
Guo, Huadong; Liu, Guang; Ding, Yixing
2016-07-01
The capability of Earth observation for large-global-scale natural phenomena needs to be improved and new observing platforms are needed. We have studied the concept of the Moon as an Earth observation platform in recent years. Compared with man-made satellite platforms, Moon-based Earth observation can obtain multi-spherical, full-band, active and passive information, which has the following advantages: large observation range, variable view angle, long-term continuous observation and an extra-long life cycle, with the characteristics of longevity, consistency, integrity, stability and uniqueness. Moon-based Earth observation is suitable for monitoring large scale geoscience phenomena including large scale atmosphere change, large scale ocean change, large scale land surface dynamic change, solid earth dynamic change, etc. For the purpose of establishing a Moon-based Earth observation platform, we already have a plan to study the following five aspects: mechanisms and models of Moon-based observation of macroscopic Earth science phenomena; optimization of sensor parameters and methods for Moon-based Earth observation; site selection and environment of Moon-based Earth observation; the Moon-based Earth observation platform; and the fundamental scientific framework of Moon-based Earth observation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lingerfelt, Eric J; Messer, II, Otis E
2017-01-02
The Bellerophon software system supports CHIMERA, a production-level HPC application that simulates the evolution of core-collapse supernovae. Bellerophon enables CHIMERA's geographically dispersed team of collaborators to perform job monitoring and real-time data analysis from multiple supercomputing resources, including platforms at OLCF, NERSC, and NICS. Its multi-tier architecture provides an encapsulated, end-to-end software solution that enables the CHIMERA team to quickly and easily access highly customizable animated and static views of results from anywhere in the world via a cross-platform desktop application.
NASA Astrophysics Data System (ADS)
Wang, Jian
2017-01-01
In order to change the traditional PE teaching mode and realize the interconnection, interworking and sharing of PE teaching resources, a distance PE teaching platform based on broadband network is designed and a PE teaching information resource database is set up. The design of the PE teaching information resource database takes Windows NT 4/2000 Server as the operating system platform and Microsoft SQL Server 7.0 as the RDBMS, and uses NAS technology for data storage and streaming technology for video services. The analysis of the system design and implementation shows that the dynamic PE teaching information resource sharing platform based on Web Services can realize loose-coupling collaboration, dynamic integration and active integration, and has good integration, openness and encapsulation. The distance PE teaching platform based on Web Services and the design scheme of the PE teaching information resource database can effectively realize the interconnection, interworking and sharing of PE teaching resources and adapt to the demands of the informatization of PE teaching.
A generic, cost-effective, and scalable cell lineage analysis platform
Biezuner, Tamir; Spiro, Adam; Raz, Ofir; Amir, Shiran; Milo, Lilach; Adar, Rivka; Chapal-Ilani, Noa; Berman, Veronika; Fried, Yael; Ainbinder, Elena; Cohen, Galit; Barr, Haim M.; Halaban, Ruth; Shapiro, Ehud
2016-01-01
Advances in single-cell genomics enable commensurate improvements in methods for uncovering lineage relations among individual cells. Current sequencing-based methods for cell lineage analysis depend on low-resolution bulk analysis or rely on extensive single-cell sequencing, which is not scalable and could be biased by functional dependencies. Here we show an integrated biochemical-computational platform for generic single-cell lineage analysis that is retrospective, cost-effective, and scalable. It consists of a biochemical-computational pipeline that inputs individual cells, produces targeted single-cell sequencing data, and uses it to generate a lineage tree of the input cells. We validated the platform by applying it to cells sampled from an ex vivo grown tree and analyzed its feasibility landscape by computer simulations. We conclude that the platform may serve as a generic tool for lineage analysis and thus pave the way toward large-scale human cell lineage discovery. PMID:27558250
An integrated multi-sensors approach for volcanic cloud retrievals and source characterization
NASA Astrophysics Data System (ADS)
Corradini, Stefano; Merucci, Luca
2017-04-01
Volcanic eruptions are one of the most important sources of natural pollution. In particular, volcanic clouds represent a severe threat to aviation safety. Worldwide, volcanic activity is monitored using satellite and ground-based instruments working in different spectral ranges, with different spatial resolutions and sensitivities. Here the complementarity between geostationary and polar satellites and ground-based measurements is exploited in order to significantly improve volcanic cloud detection and retrievals and to fully characterize the eruption source. The integration procedure, named MACE (Multi-platform volcanic Ash Cloud Estimation), has been developed within the EU-FP7 APhoRISM project, which aims to develop innovative products to support the management and mitigation of volcanic and seismic crises. The proposed method integrates in a novel manner the volcanic ash retrievals at the space-time scale of typical geostationary observations with both polar satellite estimations and in-situ measurements. In MACE, the typical volcanic cloud retrievals in the thermal infrared are complemented by a wider spectral range from the visible to the microwave. Moreover, the volcanic cloud detection is extended to cases of cloudy atmosphere or steam plumes. As an example, the integrated approach is tested on recent eruptions that occurred at Etna (Italy) in 2013 and 2015 and at Calbuco (Chile) in 2015.
Foldable and Cytocompatible Sol-gel TiO2 Photonics
NASA Astrophysics Data System (ADS)
Li, Lan; Zhang, Ping; Wang, Wei-Ming; Lin, Hongtao; Zerdoum, Aidan B.; Geiger, Sarah J.; Liu, Yangchen; Xiao, Nicholas; Zou, Yi; Ogbuu, Okechukwu; Du, Qingyang; Jia, Xinqiao; Li, Jingjing; Hu, Juejun
2015-09-01
Integrated photonics provides a miniaturized and potentially implantable platform to manipulate and enhance the interactions between light and biological molecules or tissues in in-vitro and in-vivo settings, and is thus being increasingly adopted in a wide cross-section of biomedical applications ranging from disease diagnosis to optogenetic neuromodulation. However, the mechanical rigidity of substrates traditionally used for photonic integration is fundamentally incompatible with soft biological tissues. Cytotoxicity of materials and chemicals used in photonic device processing imposes another constraint towards these biophotonic applications. Here we present thin film TiO2 as a viable material for biocompatible and flexible integrated photonics. Amorphous TiO2 films were deposited using a low temperature (<250 °C) sol-gel process fully compatible with monolithic integration on plastic substrates. High-index-contrast flexible optical waveguides and resonators were fabricated using the sol-gel TiO2 material, and resonator quality factors up to 20,000 were measured. Following a multi-neutral-axis mechanical design, these devices exhibit remarkable mechanical flexibility, and can sustain repeated folding without compromising their optical performance. Finally, we validated the low cytotoxicity of the sol-gel TiO2 devices through in-vitro cell culture tests. These results demonstrate the potential of sol-gel TiO2 as a promising material platform for novel biophotonic devices.
A WebGIS platform for the monitoring of Farm Animal Genetic Resources (GENMON).
Duruz, Solange; Flury, Christine; Matasci, Giona; Joerin, Florent; Widmer, Ivo; Joost, Stéphane
2017-01-01
In 2007, the Food and Agriculture Organization of the United Nations (FAO) initiated the Global plan of action for Farm Animal Genetic Resources (FAnGR). The main goal of this plan is to reduce further loss of genetic diversity in farm animals, so as to protect and promote the diversity of farm animal resources. An important step to reach this goal is to monitor and prioritize endangered breeds in the context of conservation programs. The GENMON WebGIS platform is able to monitor FAnGR and to evaluate the degree of endangerment of livestock breeds. The system takes into account pedigree and introgression information, the geographical concentration of animals, the cryo-conservation plan and the sustainability of breeding activities based on socio-economic data as well as present and future land use conditions. A multi-criteria decision tool supports the aggregation of the multi-thematic indices mentioned above using the MACBETH method, which is based on a weighted average using satisfaction thresholds. GENMON is a monitoring tool to reach subjective decisions made by a government agency. It relies on open source software and is available at http://lasigsrv2.epfl.ch/genmon-ch. GENMON allows users to upload pedigree-information (animal ID, parents, birthdate, sex, location and introgression) from a specific livestock breed and to define species and/or region-specific weighting parameters and thresholds. The program then completes a pedigree analysis and derives several indices that are used to calculate an integrated score of conservation prioritization for the breeds under investigation. The score can be visualized on a geographic map and allows a fast, intuitive and regional identification of breeds in danger. Appropriate conservation actions and breeding programs can thus be undertaken in order to promote the recovery of the genetic diversity in livestock breeds in need. The use of the platform is illustrated by means of an example based on three local livestock breeds from different species in Switzerland.
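To make the aggregation step concrete, the following minimal sketch combines per-theme indices into a single prioritization score via satisfaction thresholds and a weighted average; the index names, threshold values and weights are illustrative assumptions, not GENMON's configuration, and the MACBETH scales themselves are not reproduced here.

    # Illustrative only: index names, thresholds and weights are assumptions,
    # not GENMON's actual configuration.

    def satisfaction(value, low, high):
        """Map a raw index onto a 0-1 satisfaction score using two thresholds."""
        if value <= low:
            return 0.0
        if value >= high:
            return 1.0
        return (value - low) / (high - low)

    def breed_priority_score(indices, thresholds, weights):
        """Weighted average of per-theme satisfaction scores (higher = less endangered)."""
        total_w = sum(weights.values())
        score = 0.0
        for name, value in indices.items():
            low, high = thresholds[name]
            score += weights[name] * satisfaction(value, low, high)
        return score / total_w

    # Hypothetical breed with pedigree, geographic-concentration and
    # socio-economic indices already scaled to 0-100.
    indices = {"pedigree": 62.0, "geography": 35.0, "socio_economic": 80.0}
    thresholds = {"pedigree": (20, 90), "geography": (10, 70), "socio_economic": (30, 95)}
    weights = {"pedigree": 0.5, "geography": 0.3, "socio_economic": 0.2}
    print(round(breed_priority_score(indices, thresholds, weights), 3))

Breeds with low scores would then be flagged on the map as candidates for conservation action.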
MPHASYS: a mouse phenotype analysis system
Calder, R Brent; Beems, Rudolf B; van Steeg, Harry; Mian, I Saira; Lohman, Paul HM; Vijg, Jan
2007-01-01
Background: Systematic, high-throughput studies of mouse phenotypes have been hampered by the inability to analyze individual animal data from a multitude of sources in an integrated manner. Studies generally make comparisons at the level of genotype or treatment thereby excluding associations that may be subtle or involve compound phenotypes. Additionally, the lack of integrated, standardized ontologies and methodologies for data exchange has inhibited scientific collaboration and discovery. Results: Here we introduce a Mouse Phenotype Analysis System (MPHASYS), a platform for integrating data generated by studies of mouse models of human biology and disease such as aging and cancer. This computational platform is designed to provide a standardized methodology for working with animal data; a framework for data entry, analysis and sharing; and ontologies and methodologies for ensuring accurate data capture. We describe the tools that currently comprise MPHASYS, primarily ones related to mouse pathology, and outline its use in a study of individual animal-specific patterns of multiple pathology in mice harboring a specific germline mutation in the DNA repair and transcription-specific gene Xpd. Conclusion: MPHASYS is a system for analyzing multiple data types from individual animals. It provides a framework for developing data analysis applications, and tools for collecting and distributing high-quality data. The software is platform independent and freely available under an open-source license [1]. PMID:17553167
STINGRAY: system for integrated genomic resources and analysis.
Wagner, Glauber; Jardim, Rodrigo; Tschoeke, Diogo A; Loureiro, Daniel R; Ocaña, Kary A C S; Ribeiro, Antonio C B; Emmel, Vanessa E; Probst, Christian M; Pitaluga, André N; Grisard, Edmundo C; Cavalcanti, Maria C; Campos, Maria L M; Mattoso, Marta; Dávila, Alberto M R
2014-03-07
The STINGRAY system has been conceived to ease the tasks of integrating, analyzing, annotating and presenting genomic and expression data from Sanger and Next Generation Sequencing (NGS) platforms. STINGRAY includes: (a) a complete and integrated workflow (more than 20 bioinformatics tools) ranging from functional annotation to phylogeny; (b) a MySQL database schema, suitable for data integration and user access control; and (c) a user-friendly graphical web-based interface that makes the system intuitive, facilitating the tasks of data analysis and annotation. STINGRAY proved to be an easy-to-use and complete system for analyzing sequencing data. While both Sanger and NGS platforms are supported, the system could be faster using Sanger data, since the large NGS datasets could potentially slow down the MySQL database usage. STINGRAY is available at http://stingray.biowebdb.org and the open source code at http://sourceforge.net/projects/stingray-biowebdb/.
Kaigala, Govind V; Hoang, Viet N; Backhouse, Christopher J
2008-07-01
Microvalves are key in realizing portable miniaturized diagnostic platforms. We present a scalable microvalve that integrates well with standard lab on a chip (LOC) implementations, yet which requires essentially no external infrastructure for its operation. This electrically controlled, phase-change microvalve is used to integrate genetic amplification and analysis via capillary electrophoresis--the basis of many diagnostics. The microvalve is actuated using a polymer (polyethylene glycol, PEG) that exhibits a large volumetric change between its solid and liquid phases. Both the phase change of the PEG and the genetic amplification via polymerase chain reaction (PCR) are thermally controlled using thin film resistive elements that are patterned using standard microfabrication methods. By contrast with many other valve technologies, these microvalves and their control interface scale down in size readily. The novelty here lies in the use of fully integrated microvalves that require only electrical connections to realize a portable and inexpensive genetic analysis platform.
Jordan Water Project: an interdisciplinary evaluation of freshwater vulnerability and security
NASA Astrophysics Data System (ADS)
Gorelick, S.; Yoon, J.; Rajsekhar, D.; Muller, M. F.; Zhang, H.; Gawel, E.; Klauer, B.; Klassert, C. J. A.; Sigel, K.; Thilmant, A.; Avisse, N.; Lachaut, T.; Harou, J. J.; Knox, S.; Selby, P. D.; Mustafa, D.; Talozi, S.; Haddad, Y.; Shamekh, M.
2016-12-01
The Jordan Water Project, part of the Belmont Forum projects, is an interdisciplinary, international research effort focused on the evaluation of freshwater security in Jordan, one of the most water-vulnerable countries in the world. The team covers hydrology, water resources systems analysis, economics, policy evaluation, geography, risk and remote sensing analyses, and model platform development. The entire project team communally engaged in the construction of an integrated hydroeconomic model for water supply policy evaluation. To represent water demand and allocation behavior at multiple levels of decision making, the model integrates biophysical modules that simulate natural and engineered hydrologic phenomena with human behavioral modules. Hydrologic modules include spatially distributed groundwater and surface-water models for the major aquifers and watersheds throughout Jordan. For the human modules, we adopt a multi-agent modeling approach to represent decision-making processes. The integrated model was developed in Pynsim, a new open-source, object-oriented platform in Python for network-based water resource systems. We continue to explore the impacts of future scenarios and interventions. This project had tremendous encouragement and data support from Jordan's Ministry of Water and Irrigation. Modeling technology is being transferred through a companion NSF/USAID PEER project awarded to Jordan University of Science and Technology. Individual teams have also conducted a range of studies aimed at evaluating Jordanian and transboundary surface water and groundwater systems. Surveys, interviews, and econometric analyses enabled us to better understand the behavior of urban households, farmers, private water resellers, water use patterns of the commercial sector, and irrigation water user associations. We analyzed nationwide spatial and temporal statistical trends in rainfall, developed urban and national comparative metrics to quantify water supply vulnerability, improved remote sensing methods to estimate crop water use, and evaluated the impacts of climate change on future drought severity.
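The multi-agent structure can be pictured with a small stand-alone sketch in plain Python (this is not Pynsim's actual API; the class names, allocation rule and numbers are assumptions): supply and demand agents exchange water on a network, and each agent applies its own rule at every time step.

    # Stand-alone illustration of a multi-agent water-allocation loop; the class
    # and rule names are assumptions and do not reproduce the project's model.

    class Reservoir:
        def __init__(self, storage, inflow):
            self.storage = storage      # million cubic metres (MCM)
            self.inflow = inflow        # MCM per time step

        def allocate(self, requests):
            """Release water proportionally when total demand exceeds storage."""
            self.storage += self.inflow
            total = sum(requests.values())
            factor = min(1.0, self.storage / total) if total > 0 else 0.0
            deliveries = {agent: req * factor for agent, req in requests.items()}
            self.storage -= sum(deliveries.values())
            return deliveries

    class Household:
        def __init__(self, name, demand):
            self.name, self.demand, self.shortage = name, demand, 0.0

        def request(self):
            return self.demand

        def receive(self, delivered):
            self.shortage += max(0.0, self.demand - delivered)

    reservoir = Reservoir(storage=5.0, inflow=2.0)
    agents = [Household("urban", 3.0), Household("farm", 4.0)]
    for step in range(3):
        requests = {a.name: a.request() for a in agents}
        deliveries = reservoir.allocate(requests)
        for a in agents:
            a.receive(deliveries[a.name])
    for a in agents:
        print(a.name, round(a.shortage, 2))

In the full model, agents of this kind would represent households, farmers and institutions, with hydrologic modules supplying the inflows.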
Multi-Residential Activity Labelling in Smart Homes with Wearable Tags Using BLE Technology
Mokhtari, Ghassem; Zhang, Qing; Karunanithi, Mohanraj
2018-01-01
Smart home platforms show promising outcomes to provide a better quality of life for residents in their homes. One of the main challenges that exists with these platforms in multi-residential houses is activity labeling. As most of the activity sensors do not provide any information regarding the identity of the person who triggers them, it is difficult to label the sensor events in multi-residential smart homes. To deal with this challenge, individual localization in different areas can be a promising solution. The localization information can be used to automatically label the activity sensor data to individuals. Bluetooth low energy (BLE) is a promising technology for this application due to how easy it is to implement and its low energy footprint. In this approach, individuals wear a tag that broadcasts its unique identity (ID) in certain time intervals, while fixed scanners listen to the broadcasting packet to localize the tag and the individual. However, the localization accuracy of this method depends greatly on different settings of broadcasting signal strength, and the time interval of BLE tags. To achieve the best localization accuracy, this paper studies the impacts of different advertising time intervals and power levels, and proposes an efficient and applicable algorithm to select optimal value settings of BLE sensors. Moreover, it proposes an automatic activity labeling method, through integrating BLE localization information and ambient sensor data. The applicability and effectiveness of the proposed structure is also demonstrated in a real multi-resident smart home scenario. PMID:29562666
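A minimal sketch of the labeling idea follows, under the assumption that each fixed scanner reports the received signal strength (RSSI) of every tag advertisement: a sensor event is attributed to the resident whose tag was most recently seen strongest by the scanner nearest that sensor. The function and field names are illustrative, not the paper's implementation.

    # Illustrative labeling of ambient sensor events with BLE-derived identities.
    # Scanner placement, RSSI fields and the time window are assumptions.

    RECENT_WINDOW = 5.0  # seconds a BLE observation is considered current

    def locate_residents(ble_observations, event_time):
        """Return {tag_id: scanner_id} using the strongest recent RSSI per tag."""
        best = {}
        for obs in ble_observations:  # obs: dict with tag, scanner, rssi, time
            if event_time - obs["time"] > RECENT_WINDOW:
                continue
            tag = obs["tag"]
            if tag not in best or obs["rssi"] > best[tag]["rssi"]:
                best[tag] = obs
        return {tag: obs["scanner"] for tag, obs in best.items()}

    def label_event(sensor_event, ble_observations, sensor_to_scanner):
        """Attach the identity of the resident co-located with the triggered sensor."""
        locations = locate_residents(ble_observations, sensor_event["time"])
        scanner = sensor_to_scanner[sensor_event["sensor"]]
        candidates = [tag for tag, sc in locations.items() if sc == scanner]
        return candidates[0] if len(candidates) == 1 else None  # ambiguous -> unlabeled

    ble = [{"tag": "A", "scanner": "kitchen", "rssi": -58, "time": 10.2},
           {"tag": "B", "scanner": "lounge",  "rssi": -71, "time": 10.9}]
    event = {"sensor": "fridge_door", "time": 11.0}
    print(label_event(event, ble, {"fridge_door": "kitchen"}))  # -> 'A'

The advertising interval and transmit power studied in the paper directly affect how fresh and how reliable these RSSI observations are, which is why their settings matter for labeling accuracy.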
Recent Advances in Geospatial Visualization with the New Google Earth
NASA Astrophysics Data System (ADS)
Anderson, J. C.; Poyart, E.; Yan, S.; Sargent, R.
2017-12-01
Google Earth's detailed, world-wide imagery and terrain data provide a rich backdrop for geospatial visualization at multiple scales, from global to local. The Keyhole Markup Language (KML) is an open standard that has been the primary way for users to author and share data visualizations in Google Earth. Despite its ease of use and flexibility for relatively small amounts of data, users can quickly run into difficulties and limitations working with large-scale or time-varying datasets using KML in Google Earth. Recognizing these challenges, we present our recent work toward extending Google Earth to be a more powerful data visualization platform. We describe a new KML extension to simplify the display of multi-resolution map tile pyramids - which can be created by analysis platforms like Google Earth Engine, or by a variety of other map tile production pipelines. We also describe how this implementation can pave the way to creating novel data visualizations by leveraging custom graphics shaders. Finally, we present our investigations into native support in Google Earth for data storage and transport formats that are well-suited for big raster and vector data visualization. Taken together, these capabilities make it easier to create and share new scientific data visualization experiences using Google Earth, and simplify the integration of Google Earth with existing map data products, services, and analysis pipelines.
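The tile pyramids mentioned above follow the usual web-map convention: at zoom level z the world is divided into 2^z by 2^z tiles in a Web Mercator projection. A short sketch of the standard index calculation (generic slippy-map math, not Google Earth's internal code) shows how a longitude/latitude pair maps to a tile address:

    # Standard Web Mercator (slippy-map) tile indexing; illustrative, not
    # Google Earth's implementation.
    import math

    def lonlat_to_tile(lon_deg, lat_deg, zoom):
        """Return (x, y) indices of the tile containing the point at the given zoom level."""
        n = 2 ** zoom
        x = int((lon_deg + 180.0) / 360.0 * n)
        lat = math.radians(lat_deg)
        y = int((1.0 - math.asinh(math.tan(lat)) / math.pi) / 2.0 * n)
        return x, y

    # One tile at zoom 10 covering part of the San Francisco Bay Area.
    print(lonlat_to_tile(-122.3, 37.6, 10))

A KML extension for tile pyramids essentially lets the client request only the tiles whose (x, y, zoom) addresses intersect the current view, instead of loading the whole dataset.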
NASA Technical Reports Server (NTRS)
Baker, G. R.; Fethe, T. P.
1975-01-01
Research in the application of remotely sensed data from LANDSAT or other airborne platforms to the efficient management of a large timber based forest industry was divided into three phases: (1) establishment of a photo/ground sample correlation, (2) investigation of techniques for multi-spectral digital analysis, and (3) development of a semi-automated multi-level sampling system. To properly verify results, three distinct test areas were selected: (1) Jacksonville Mill Region, Lower Coastal Plain, Flatwoods, (2) Pensacola Mill Region, Middle Coastal Plain, and (3) Mississippi Mill Region, Middle Coastal Plain. The following conclusions were reached: (1) the probability of establishing an information base suitable for management requirements through a photo/ground double sampling procedure, alleviating the ground sampling effort, is encouraging, (2) known classification techniques must be investigated to ascertain the level of precision possible in separating the many densities involved, and (3) the multi-level approach must be related to an information system that is executable and feasible.
GIS-based Landing-Site Analysis and Passive Decision Support
NASA Astrophysics Data System (ADS)
van Gasselt, Stephan; Nass, Andrea
2016-04-01
The increase of surface coverage and the availability and accessibility of planetary data allow researchers and engineers to remotely perform detailed studies on surface processes and properties, in particular on objects such as Mars and the Moon, for which Terabytes of multi-temporal data at multiple spatial resolution levels have become available during the last 15 years. Orbiters, rovers and landers have been returning information and insights into the surface evolution of the terrestrial planets in unprecedented detail. While rover- and lander-based analyses aimed at obtaining ground truth are one major research goal, resource exploration and even the potential establishment of bases using autonomous platforms are others; all of them require detailed investigation of surface settings in order to identify spots that are suitable for spacecraft to land and operate safely over a long period of time. What was done with hardcopy material in the past is today carried out using either in-house developments or off-the-shelf spatial information system technology, which makes it possible to manage, integrate and analyse data as well as to visualize results and create user-defined reports for assessments. Usually, such analyses can be broken down (manually) by considering scientific wishes, engineering boundary conditions, potential hazards and various tertiary constraints. We here (1) review standard tasks of landing site analyses, (2) discuss issues inherently related to the analysis using integrated spatial analysis systems and (3) demonstrate a modular analysis framework for integration of data and for the evaluation of results from individual tasks in order to support decisions for landing-site selection.
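One way to picture the task breakdown is a simple constraint-plus-weighted-overlay computation on co-registered raster layers; the layer names, weights and the 15-degree slope limit below are placeholders for the example, not values from the study.

    # Illustrative landing-site suitability overlay; layers, weights and the
    # slope limit are assumed for the example.
    import numpy as np

    slope   = np.array([[ 2.0,  8.0], [20.0,  5.0]])   # degrees
    rocks   = np.array([[ 0.1,  0.4], [ 0.2,  0.05]])  # rock abundance, 0-1
    science = np.array([[ 0.9,  0.3], [ 0.8,  0.6]])   # science interest, 0-1

    # Hard engineering constraint: exclude cells steeper than 15 degrees.
    feasible = slope < 15.0

    # Normalized factor scores (higher = better).
    slope_score = 1.0 - np.clip(slope / 15.0, 0.0, 1.0)
    rock_score = 1.0 - rocks

    # Weighted overlay of engineering and science factors.
    suitability = (0.4 * slope_score + 0.3 * rock_score + 0.3 * science) * feasible
    print(np.round(suitability, 2))

In a GIS-based framework the same operation runs on full-resolution terrain, roughness and illumination layers, with the weighting exposed to the analyst rather than hard-coded.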
Biorefineries--multi product processes.
Kamm, B; Kamm, M
2007-01-01
The development of biorefineries represents the key for access to an integrated production of food, feed, chemicals, materials, goods, and fuels of the future [1]. Biorefineries combine the necessary technologies of the biogenic raw materials with those of intermediates and final products. The main focus is directed at the precursors carbohydrates, lignin, oils, and proteins and the combination between biotechnological and chemical conversion of substances. Currently the lignocellulosic feedstock biorefinery, green biorefinery, whole corn biorefinery, and the so-called two-platform concept are favored in research, development, and industrial implementation.
Multiscale agent-based cancer modeling.
Zhang, Le; Wang, Zhihui; Sagotsky, Jonathan A; Deisboeck, Thomas S
2009-04-01
Agent-based modeling (ABM) is an in silico technique that is being used in a variety of research areas such as in social sciences, economics and increasingly in biomedicine as an interdisciplinary tool to study the dynamics of complex systems. Here, we describe its applicability to integrative tumor biology research by introducing a multi-scale tumor modeling platform that understands brain cancer as a complex dynamic biosystem. We summarize significant findings of this work, and discuss both challenges and future directions for ABM in the field of cancer research.
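As a flavor of the technique (a toy model, not the platform described above), an agent-based simulation can be written in a few lines: each cell agent on a lattice divides, dies or rests according to simple probabilistic rules, and population-level dynamics emerge from these local decisions. The grid size and rates below are arbitrary.

    # Toy agent-based tumor growth model; rules and rates are illustrative only.
    import random

    random.seed(1)
    GRID = 21
    occupied = {(GRID // 2, GRID // 2)}  # start from a single cell in the centre

    def free_neighbors(cell):
        x, y = cell
        moves = [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]
        return [(i, j) for i, j in moves
                if 0 <= i < GRID and 0 <= j < GRID and (i, j) not in occupied]

    for step in range(30):
        for cell in list(occupied):
            r = random.random()
            neighbors = free_neighbors(cell)
            if r < 0.3 and neighbors:            # divide into a free neighbor site
                occupied.add(random.choice(neighbors))
            elif r > 0.95:                       # spontaneous death
                occupied.discard(cell)

    print("cells after 30 steps:", len(occupied))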
Taylor, David; Valenza, John A; Spence, James M; Baber, Randolph H
2007-10-11
Simulation has been used for many years in dental education, but the educational context is typically a laboratory divorced from the clinical setting, which impairs the transfer of learning. Here we report on a true simulation clinic with multimedia communication from a central teaching station. Each of the 43 fully-functioning student operatories includes a thin-client networked computer with access to an Electronic Patient Record (EPR).
Integrated Component-based Data Acquisition Systems for Aerospace Test Facilities
NASA Technical Reports Server (NTRS)
Ross, Richard W.
2001-01-01
The Multi-Instrument Integrated Data Acquisition System (MIIDAS), developed by the NASA Langley Research Center, uses commercial off-the-shelf (COTS) products, integrated with custom software, to provide a broad range of capabilities at a low cost throughout the system's entire life cycle. MIIDAS combines data acquisition capabilities with online and post-test data reduction computations. COTS products lower purchase and maintenance costs by reducing the level of effort required to meet system requirements. Object-oriented methods are used to enhance modularity, encourage reusability, and to promote adaptability, reducing software development costs. Using only COTS products and custom software supported on multiple platforms reduces the cost of porting the system to other platforms. The post-test data reduction capabilities of MIIDAS have been installed at four aerospace testing facilities at NASA Langley Research Center. The systems installed at these facilities provide a common user interface, reducing the training time required for personnel that work across multiple facilities. The techniques employed by MIIDAS enable NASA to build a system with a lower initial purchase price and reduced sustaining maintenance costs. With MIIDAS, NASA has built a highly flexible next generation data acquisition and reduction system for aerospace test facilities that meets customer expectations.
A Comprehensive Validation Approach Using The RAVEN Code
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alfonsi, Andrea; Rabiti, Cristian; Cogliati, Joshua J
2015-06-01
The RAVEN computer code, developed at the Idaho National Laboratory, is a generic software framework to perform parametric and probabilistic analysis based on the response of complex system codes. RAVEN is a multi-purpose probabilistic and uncertainty quantification platform, capable of communicating with any system code. A natural extension of the RAVEN capabilities is the implementation of an integrated validation methodology, involving several different metrics, that represents an evolution of the methods currently used in the field. State-of-the-art validation approaches use neither exploration of the input space through sampling strategies nor a comprehensive variety of metrics needed to interpret the code responses with respect to experimental data. The RAVEN code allows both of these gaps to be addressed. In the following sections, the employed methodology and its application to the newly developed thermal-hydraulic code RELAP-7 are reported. The validation approach has been applied to an integral effect experiment representing natural circulation, based on the activities performed by EG&G Idaho. Four different experiment configurations have been considered and nodalized.
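A schematic of the kind of figure such a validation step might compute, assuming the system code has been sampled over its uncertain inputs and an experimental trace with a stated uncertainty is available (the metric below is a generic normalized discrepancy, not one of RAVEN's built-in metrics):

    # Generic code-vs-experiment discrepancy metric; not RAVEN's actual metrics.
    import numpy as np

    def normalized_discrepancy(code_samples, experiment, exp_sigma):
        """Mean absolute deviation between the sampled code responses and the
        experimental trace, expressed in units of experimental uncertainty."""
        code_mean = code_samples.mean(axis=0)          # average over input samples
        return float(np.mean(np.abs(code_mean - experiment) / exp_sigma))

    # Hypothetical figure of merit: 20 perturbed code runs of a 5-point transient.
    rng = np.random.default_rng(0)
    experiment = np.array([1.00, 0.95, 0.90, 0.88, 0.85])
    code_samples = experiment + rng.normal(0.0, 0.02, size=(20, 5))
    print(round(normalized_discrepancy(code_samples, experiment, exp_sigma=0.03), 3))

Sampling the inputs rather than running a single nominal case is what distinguishes this kind of validation from a simple code-to-data comparison.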
Rosen, Gunther; Chadwick, D Bart; Burton, G Allen; Taulbee, W Keith; Greenberg, Marc S; Lotufo, Guilherme R; Reible, Danny D
2012-03-01
A comprehensive, weight-of-evidence based ecological risk assessment approach integrating laboratory and in situ bioaccumulation and toxicity testing, passive sampler devices, hydrological characterization tools, continuous water quality sensing, and multi-phase chemical analyses was evaluated. The test site used to demonstrate the approach was a shallow estuarine wetland where groundwater seepage and elevated organic and inorganic contaminants were of potential concern. Although groundwater was discharging into the surficial sediments, little to no chemical contamination was associated with the infiltrating groundwater. Results from bulk chemistry analysis, toxicity testing, and bioaccumulation, however, suggested possible PAH toxicity at one station, which might have been enhanced by UV photoactivation, explaining the differences between in situ and laboratory amphipod survival. Concurrently deployed PAH bioaccumulation on solid-phase micro-extraction fibers positively correlated (r(2) ≥ 0.977) with in situ PAH bioaccumulation in amphipods, attesting to their utility as biomimetics, and contributing to the overall improved linkage between exposure and effects demonstrated by this approach. Published by Elsevier Ltd.
Multi-Threaded DNA Tag/Anti-Tag Library Generator for Multi-Core Platforms
2009-05-01
The report (AFRL-RI-RS-TR-2009-131, Final Technical Report, May 2009, covering June 2008 to February 2009) describes a multi-threaded generator of DNA tag/anti-tag libraries for multi-core platforms: libraries of Watson-Crick strand pairs that bind perfectly within pairs but poorly across pairs, and it references a variety of DNA strand hybridization metrics.
MeV+R: using MeV as a graphical user interface for Bioconductor applications in microarray analysis
Chu, Vu T; Gottardo, Raphael; Raftery, Adrian E; Bumgarner, Roger E; Yeung, Ka Yee
2008-01-01
We present MeV+R, an integration of the JAVA MultiExperiment Viewer program with Bioconductor packages. This integration of MultiExperiment Viewer and R is easily extensible to other R packages and provides users with point and click access to traditionally command line driven tools written in R. We demonstrate the ability to use MultiExperiment Viewer as a graphical user interface for Bioconductor applications in microarray data analysis by incorporating three Bioconductor packages, RAMA, BRIDGE and iterativeBMA. PMID:18652698
C 3, A Command-line Catalog Cross-match Tool for Large Astrophysical Catalogs
NASA Astrophysics Data System (ADS)
Riccio, Giuseppe; Brescia, Massimo; Cavuoti, Stefano; Mercurio, Amata; di Giorgio, Anna Maria; Molinari, Sergio
2017-02-01
Modern Astrophysics is based on multi-wavelength data organized into large and heterogeneous catalogs. Hence, the need for efficient, reliable and scalable catalog cross-matching methods plays a crucial role in the era of the petabyte scale. Furthermore, multi-band data often have very different angular resolutions, requiring the highest generality of cross-matching features, mainly in terms of region shape and resolution. In this work we present C 3 (Command-line Catalog Cross-match), a multi-platform application designed to efficiently cross-match massive catalogs. It is based on a multi-core parallel processing paradigm and conceived to be executed as a stand-alone command-line process or integrated within any generic data reduction/analysis pipeline, providing the maximum flexibility to the end-user, in terms of portability, parameter configuration, catalog formats, angular resolution, region shapes, coordinate units and cross-matching types. Using real data, extracted from public surveys, we discuss the cross-matching capabilities and computing time efficiency, also through a direct comparison with some publicly available tools, chosen among the most used within the community, and representative of different interface paradigms. We verified that the C 3 tool has excellent capabilities to perform an efficient and reliable cross-matching between large data sets. Although the elliptical cross-match and the parametric handling of angular orientation and offset are known concepts in the astrophysical context, their availability in the presented command-line tool makes C 3 competitive in the context of public astronomical tools.
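A reduced example of the underlying operation, a plain nearest-neighbour sky match within a circular radius (without C 3's elliptical regions, parallelism or catalog-format handling), can be written as:

    # Minimal circular cross-match between two small catalogs; C 3 itself supports
    # elliptical regions, multi-core execution and many catalog formats.
    import math

    def angular_sep_deg(ra1, dec1, ra2, dec2):
        """Great-circle separation in degrees (haversine formula)."""
        ra1, dec1, ra2, dec2 = map(math.radians, (ra1, dec1, ra2, dec2))
        a = (math.sin((dec2 - dec1) / 2) ** 2
             + math.cos(dec1) * math.cos(dec2) * math.sin((ra2 - ra1) / 2) ** 2)
        return math.degrees(2 * math.asin(math.sqrt(a)))

    def cross_match(cat_a, cat_b, radius_arcsec):
        """Return (i, j, sep_arcsec) for the closest B source within radius of each A source."""
        matches = []
        for i, (ra_a, dec_a) in enumerate(cat_a):
            best = None
            for j, (ra_b, dec_b) in enumerate(cat_b):
                sep = angular_sep_deg(ra_a, dec_a, ra_b, dec_b) * 3600.0
                if sep <= radius_arcsec and (best is None or sep < best[1]):
                    best = (j, sep)
            if best:
                matches.append((i, best[0], round(best[1], 2)))
        return matches

    cat_a = [(150.1000, 2.2000), (150.5000, 2.4000)]
    cat_b = [(150.1002, 2.2001), (151.0000, 2.9000)]
    print(cross_match(cat_a, cat_b, radius_arcsec=2.0))

Production tools replace the brute-force double loop with sky indexing and partitioning, which is where the multi-core design pays off on massive catalogs.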
Unified Framework for Development, Deployment and Robust Testing of Neuroimaging Algorithms
Joshi, Alark; Scheinost, Dustin; Okuda, Hirohito; Belhachemi, Dominique; Murphy, Isabella; Staib, Lawrence H.; Papademetris, Xenophon
2011-01-01
Developing both graphical and command-line user interfaces for neuroimaging algorithms requires considerable effort. Neuroimaging algorithms can meet their potential only if they can be easily and frequently used by their intended users. Deployment of a large suite of such algorithms on multiple platforms requires consistency of user interface controls, consistent results across various platforms and thorough testing. We present the design and implementation of a novel object-oriented framework that allows for rapid development of complex image analysis algorithms with many reusable components and the ability to easily add graphical user interface controls. Our framework also allows for simplified yet robust nightly testing of the algorithms to ensure stability and cross platform interoperability. All of the functionality is encapsulated into a software object requiring no separate source code for user interfaces, testing or deployment. This formulation makes our framework ideal for developing novel, stable and easy-to-use algorithms for medical image analysis and computer assisted interventions. The framework has been both deployed at Yale and released for public use in the open source multi-platform image analysis software—BioImage Suite (bioimagesuite.org). PMID:21249532
PROS: An IRAF based system for analysis of x ray data
NASA Technical Reports Server (NTRS)
Conroy, M. A.; Deponte, J.; Moran, J. F.; Orszak, J. S.; Roberts, W. P.; Schmidt, D.
1992-01-01
PROS is an IRAF based software package for the reduction and analysis of x-ray data. The use of a standard, portable, integrated environment provides for both multi-frequency and multi-mission analysis. The analysis of x-ray data differs from optical analysis due to the nature of the x-ray data and its acquisition during constantly varying conditions. The scarcity of data, the low signal-to-noise ratio and the large gaps in exposure time make data screening and masking an important part of the analysis. PROS was developed to support the analysis of data from the ROSAT and Einstein missions but many of the tasks have been used on data from other missions. IRAF/PROS provides a complete end-to-end system for x-ray data analysis: (1) a set of tools for importing and exporting data via FITS format -- in particular, IRAF provides a specialized event-list format, QPOE, that is compatible with its IMAGE (2-D array) format; (2) a powerful set of IRAF system capabilities for both temporal and spatial event filtering; (3) full set of imaging and graphics tasks; (4) specialized packages for scientific analysis such as spatial, spectral and timing analysis -- these consist of both general and mission specific tasks; and (5) complete system support including ftp and magnetic tape releases, electronic and conventional mail hotline support, electronic mail distribution of solutions to frequently asked questions and current known bugs. We will discuss the philosophy, architecture and development environment used by PROS to generate a portable, multimission software environment. PROS is available on all platforms that support IRAF, including Sun/Unix, VAX/VMS, HP, and Decstations. It is available on request at no charge.
NASA Astrophysics Data System (ADS)
Wałach, Daniel; Sagan, Joanna; Gicala, Magdalena
2017-10-01
The paper presents an environmental and economic analysis of material solutions for a multi-level garage. The design approach considered a reinforced concrete structure built with either ordinary concrete or high-performance concrete (HPC). The use of HPC allowed a significant reduction of reinforcing steel, mainly in the compression elements (columns) of the structure. The analysis includes elements of the methodology of integrated life cycle design (ILCD). Through a multi-criteria analysis based on established weights for the economic and environmental parameters, three solutions were evaluated and compared within the material production phase (information modules A1-A3).
Gas diffusion as a new fluidic unit operation for centrifugal microfluidic platforms.
Ymbern, Oriol; Sández, Natàlia; Calvo-López, Antonio; Puyol, Mar; Alonso-Chamarro, Julian
2014-03-07
A centrifugal microfluidic platform prototype with an integrated membrane for gas diffusion is presented for the first time. The centrifugal platform allows multiple and parallel analysis on a single disk and integrates at least ten independent microfluidic subunits, which allow both calibration and sample determination. It is constructed with a polymeric substrate material and it is designed to perform colorimetric determinations by the use of a simple miniaturized optical detection system. The determination of three different analytes, sulfur dioxide, nitrite and carbon dioxide, is carried out as a proof of concept of a versatile microfluidic system for the determination of analytes which involve a gas diffusion separation step during the analytical procedure.
Kearse, Matthew; Moir, Richard; Wilson, Amy; Stones-Havas, Steven; Cheung, Matthew; Sturrock, Shane; Buxton, Simon; Cooper, Alex; Markowitz, Sidney; Duran, Chris; Thierer, Tobias; Ashton, Bruce; Meintjes, Peter; Drummond, Alexei
2012-01-01
Summary: The two main functions of bioinformatics are the organization and analysis of biological data using computational resources. Geneious Basic has been designed to be an easy-to-use and flexible desktop software application framework for the organization and analysis of biological data, with a focus on molecular sequences and related data types. It integrates numerous industry-standard discovery analysis tools, with interactive visualizations to generate publication-ready images. One key contribution to researchers in the life sciences is the Geneious public application programming interface (API) that affords the ability to leverage the existing framework of the Geneious Basic software platform for virtually unlimited extension and customization. The result is an increase in the speed and quality of development of computation tools for the life sciences, due to the functionality and graphical user interface available to the developer through the public API. Geneious Basic represents an ideal platform for the bioinformatics community to leverage existing components and to integrate their own specific requirements for the discovery, analysis and visualization of biological data. Availability and implementation: Binaries and public API freely available for download at http://www.geneious.com/basic, implemented in Java and supported on Linux, Apple OSX and MS Windows. The software is also available from the Bio-Linux package repository at http://nebc.nerc.ac.uk/news/geneiousonbl. Contact: peter@biomatters.com PMID:22543367
A Passive Wireless Multi-Sensor SAW Technology Device and System Perspectives
Malocha, Donald C.; Gallagher, Mark; Fisher, Brian; Humphries, James; Gallagher, Daniel; Kozlovski, Nikolai
2013-01-01
This paper will discuss a SAW passive, wireless multi-sensor system under development by our group for the past several years. The device focus is on orthogonal frequency coded (OFC) SAW sensors, which use both frequency diversity and pulse position reflectors to encode the device ID and will be briefly contrasted to other embodiments. A synchronous correlator transceiver is used for the hardware and post processing and correlation techniques of the received signal to extract the sensor information will be presented. Critical device and system parameters addressed include encoding, operational range, SAW device parameters, post-processing, and antenna-SAW device integration. A fully developed 915 MHz OFC SAW multi-sensor system is used to show experimental results. The system is based on a software radio approach that provides great flexibility for future enhancements and diverse sensor applications. Several different sensor types using the OFC SAW platform are shown. PMID:23666124
Parametric Modelling of As-Built Beam Framed Structure in Bim Environment
NASA Astrophysics Data System (ADS)
Yang, X.; Koehl, M.; Grussenmeyer, P.
2017-02-01
A complete documentation and conservation of a historic timber roof requires the integration of geometry modelling, attribute and dynamic information management, and the results of structural analysis. The recently developed as-built Building Information Modelling (BIM) technique has the potential to provide a uniform platform that integrates traditional geometry modelling, parametric element management and structural analysis. The main objective of the project presented in this paper is to develop a parametric modelling tool for a timber roof structure whose elements form a leaning and crossing beam frame. Since Autodesk Revit, as typical BIM software, provides the platform for parametric modelling and information management, an API plugin was developed that automatically creates the parametric beam elements and links them together with strict relationships. The plugin under development is introduced in the paper; it can derive the parametric beam model via the Autodesk Revit API from total station points and terrestrial laser scanning data. The results show the potential of automating parametric modelling through interactive API development in a BIM environment. It also integrates the separate data processing steps and different platforms into the uniform Revit software.
Sahoo, Satya S; Ramesh, Priya; Welter, Elisabeth; Bukach, Ashley; Valdez, Joshua; Tatsuoka, Curtis; Bamps, Yvan; Stoll, Shelley; Jobst, Barbara C; Sajatovic, Martha
2016-10-01
We present Insight as an integrated database and analysis platform for epilepsy self-management research as part of the national Managing Epilepsy Well Network. Insight is the only available informatics platform for accessing and analyzing integrated data from multiple epilepsy self-management research studies with several new data management features and user-friendly functionalities. The features of Insight include, (1) use of Common Data Elements defined by members of the research community and an epilepsy domain ontology for data integration and querying, (2) visualization tools to support real time exploration of data distribution across research studies, and (3) an interactive visual query interface for provenance-enabled research cohort identification. The Insight platform contains data from five completed epilepsy self-management research studies covering various categories of data, including depression, quality of life, seizure frequency, and socioeconomic information. The data represents over 400 participants with 7552 data points. The Insight data exploration and cohort identification query interface has been developed using Ruby on Rails Web technology and open source Web Ontology Language Application Programming Interface to support ontology-based reasoning. We have developed an efficient ontology management module that automatically updates the ontology mappings each time a new version of the Epilepsy and Seizure Ontology is released. The Insight platform features a Role-based Access Control module to authenticate and effectively manage user access to different research studies. User access to Insight is managed by the Managing Epilepsy Well Network database steering committee consisting of representatives of all current collaborating centers of the Managing Epilepsy Well Network. New research studies are being continuously added to the Insight database and the size as well as the unique coverage of the dataset allows investigators to conduct aggregate data analysis that will inform the next generation of epilepsy self-management studies. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Integration services to enable regional shared electronic health records.
Oliveira, Ilídio C; Cunha, João P S
2011-01-01
eHealth is expected to integrate a comprehensive set of patient data sources into a coherent continuum, but implementations vary and Portugal still lags behind in electronic patient data sharing. In this work, we present a clinical information hub to aggregate multi-institution patient data and bridge the information silos. This integration platform enables a coherent object model, services-oriented applications development and a trust framework. It has been instantiated in the Rede Telemática de Saúde (www.RTSaude.org) to support a regional Electronic Health Record approach, fed dynamically from production systems at eight partner institutions, providing access to more than 11,000,000 care episodes, relating to over 350,000 citizens. The network has obtained the necessary clearance from the Portuguese data protection agency.
Study on Vortex-Induced Motions of A New Type of Deep Draft Multi-Columns FDPSO
NASA Astrophysics Data System (ADS)
Gu, Jia-yang; Xie, Yu-lin; Zhao, Yuan; Li, Wen-juan; Tao, Yan-wu; Huang, Xiang-hong
2018-03-01
A numerical simulation and an experimental study on the vortex-induced motion (VIM) of a new type of deep-draft multi-column floating drilling, production, storage and offloading unit (FDPSO) are presented in this paper. The main dimensions, the special variable cross-section columns and the cabin arrangement of the octagonal pontoon are introduced. The numerical simulation is used to study the effects of current incidence angles and reduced velocities on the platform's sway motion response. A mooring system truncated to the equivalent of 300 m water depth is adopted for the model tests, which are carried out to check the reliability of the numerical simulation. The results comprise surge, sway and yaw motions, as well as motion trajectories, and the maximum sway amplitudes of different types of offshore platform are also compared. The main results show that the peak frequencies of sway motion under different current incidence angles and reduced velocities vary around the natural frequency. The flow-field analysis indicates that the change in the vertical distribution of vortices has a significant influence on the VIM of the platform. The trend of the sway amplitude ratio curve of this new type of FDPSO differs from those of other platform types. Under a 45° current incidence angle, the sway amplitude of this new type of FDPSO is much smaller than those of other types of offshore platform at 4.4 ≤ Vr ≤ 8.9. The typical figure-eight trajectory does not appear in the platform's motion trajectories.
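For readers checking the numbers, the reduced velocity quoted above is the standard non-dimensional current speed used in VIM studies; with the symbols assumed here (U the incident current speed, f_n the natural frequency of the sway motion, D a characteristic column width), it reads

    V_r = \frac{U}{f_n \, D}

so a range of 4.4 to 8.9 corresponds to the flow travelling roughly 4.4 to 8.9 column widths per natural period.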
Thermal Design and Analysis of an ISS Science Payload - SAGE III on ISS
NASA Technical Reports Server (NTRS)
Liles, Kaitlin, A. K.; Amundsen, Ruth M.; Davis, Warren T.; Carrillo, Laurie Y.
2017-01-01
The Stratospheric Aerosol and Gas Experiment III (SAGE III) instrument is the fifth in a series of instruments developed for monitoring aerosols and gaseous constituents in the stratosphere and troposphere. SAGE III will be launched in the SpaceX Dragon vehicle in 2017 and mounted to an external stowage platform on the International Space Station (ISS) to begin its three-year mission. The SAGE III thermal team at NASA Langley Research Center (LaRC) worked with ISS thermal engineers to ensure that SAGE III, as an ISS payload, would meet requirements specific to ISS and the Dragon vehicle. This document presents an overview of the SAGE III thermal design and analysis efforts, focusing on aspects that are relevant for future ISS payload developers. This includes development of detailed and reduced Thermal Desktop (TD) models integrated with the ISS and launch vehicle models, definition of analysis cases necessary to verify thermal requirements considering all mission phases from launch through installation and operation on-orbit, and challenges associated with thermal hardware selection including heaters, multi-layer insulation (MLI) blankets, and thermal tapes.
[Applications of meta-analysis in multi-omics].
Han, Mingfei; Zhu, Yunping
2014-07-01
As a statistical method for integrating multiple features and multiple datasets, meta-analysis was introduced to the field of life science in the 1990s. With the rapid advances in high-throughput technologies, life omics, the core of which are genomics, transcriptomics and proteomics, is becoming the new hot spot of life science. Although the fast output of massive data has promoted the development of omics studies, it results in excessive data that are difficult to integrate systematically. In this case, meta-analysis is frequently applied to analyze different types of data and is being improved continuously. Here, we first summarize the representative meta-analysis methods systematically, then review the current applications of meta-analysis in various omics fields, and finally discuss the remaining problems and the future development of meta-analysis.
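As a concrete instance of the kind of method such a survey covers, the classical fixed-effect (inverse-variance) combination of k study-level effect estimates can be written as follows; the notation (\hat{\theta}_i for the estimate from study i, v_i for its variance) is chosen here for illustration:

    w_i = \frac{1}{v_i}, \qquad
    \hat{\theta} = \frac{\sum_{i=1}^{k} w_i \,\hat{\theta}_i}{\sum_{i=1}^{k} w_i}, \qquad
    \operatorname{Var}\!\left(\hat{\theta}\right) = \frac{1}{\sum_{i=1}^{k} w_i}

Random-effects variants add a between-study variance term to each w_i; heterogeneity across omics platforms is one reason such extensions are often needed.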
Future Directions for Astronomical Image Display
NASA Technical Reports Server (NTRS)
Mandel, Eric
2000-01-01
In the "Future Directions for Astronomical Image Displav" project, the Smithsonian Astrophysical Observatory (SAO) and the National Optical Astronomy Observatories (NOAO) evolved our existing image display program into fully extensible. cross-platform image display software. We also devised messaging software to support integration of image display into astronomical analysis systems. Finally, we migrated our software from reliance on Unix and the X Window System to a platform-independent architecture that utilizes the cross-platform Tcl/Tk technology.
A Hierarchical Visualization Analysis Model of Power Big Data
NASA Astrophysics Data System (ADS)
Li, Yongjie; Wang, Zheng; Hao, Yang
2018-01-01
Based on the concept of integrating VR scenes with power big data analysis, a hierarchical visualization analysis model of power big data is proposed, in which separate levels are designed for different abstraction modules such as transaction, engine, computation, control and storage. The normally separate modules of power data storage, data mining and analysis, and data visualization are integrated into one platform by this model. It provides a visual analysis solution for power big data.
Proactive Time-Rearrangement Scheme for Multi-Radio Collocated Platform
NASA Astrophysics Data System (ADS)
Kim, Chul; Shin, Sang-Heon; Park, Sang Kyu
We present a simple proactive time-rearrangement scheme (PATRA) that reduces interference among the multiple radio devices collocated on one platform and guarantees user-perceived QoS. Simulation results show that interference among multiple radios on one platform causes severe performance degradation, so that the user-requested QoS cannot be guaranteed. However, PATRA can dramatically improve not only the user-perceived QoS but also the overall network throughput.
The MiPACQ Clinical Question Answering System
Cairns, Brian L.; Nielsen, Rodney D.; Masanz, James J.; Martin, James H.; Palmer, Martha S.; Ward, Wayne H.; Savova, Guergana K.
2011-01-01
The Multi-source Integrated Platform for Answering Clinical Questions (MiPACQ) is a QA pipeline that integrates a variety of information retrieval and natural language processing systems into an extensible question answering system. We present the system’s architecture and an evaluation of MiPACQ on a human-annotated evaluation dataset based on the Medpedia health and medical encyclopedia. Compared with our baseline information retrieval system, the MiPACQ rule-based system demonstrates 84% improvement in Precision at One and the MiPACQ machine-learning-based system demonstrates 134% improvement. Other performance metrics including mean reciprocal rank and area under the precision/recall curves also showed significant improvement, validating the effectiveness of the MiPACQ design and implementation. PMID:22195068
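For reference, the two headline metrics quoted above, Precision at One and mean reciprocal rank (MRR), are standard ranked-retrieval measures; a minimal, generic Python sketch of how they are computed (independent of the MiPACQ implementation, with invented answer identifiers) is:

def precision_at_one(rankings, gold):
    # rankings: ranked answer-id lists per question; gold: sets of relevant ids per question.
    return sum(1 for r, g in zip(rankings, gold) if r and r[0] in g) / len(rankings)

def mean_reciprocal_rank(rankings, gold):
    total = 0.0
    for r, g in zip(rankings, gold):
        for rank, ans in enumerate(r, start=1):
            if ans in g:
                total += 1.0 / rank
                break
    return total / len(rankings)

# Toy example with three questions.
rankings = [["a2", "a7"], ["a1"], ["a9", "a3", "a5"]]
gold = [{"a7"}, {"a1"}, {"a5"}]
print(precision_at_one(rankings, gold), mean_reciprocal_rank(rankings, gold))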
Strategic Industrial Alliances in Paper Industry: XML- vs Ontology-Based Integration Platforms
ERIC Educational Resources Information Center
Naumenko, Anton; Nikitin, Sergiy; Terziyan, Vagan; Zharko, Andriy
2005-01-01
Purpose: To identify cases related to design of ICT platforms for industrial alliances, where the use of Ontology-driven architectures based on Semantic web standards is more advantageous than application of conventional modeling together with XML standards. Design/methodology/approach: A comparative analysis of the two latest and the most obvious…
VStar: Variable star data visualization and analysis tool
NASA Astrophysics Data System (ADS)
VStar Team
2014-07-01
VStar is a multi-platform, easy-to-use variable star data visualization and analysis tool. Data for a star can be read from the AAVSO (American Association of Variable Star Observers) database or from CSV and TSV files. VStar displays light curves and phase plots, can produce a mean curve, and performs time-frequency analysis with the Weighted Wavelet Z-Transform. It also offers tools for period analysis, filtering, and other functions.
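A phase plot of the kind VStar produces simply folds the observation times on a trial period; the following standalone Python sketch (VStar itself is not Python, and the data here are invented) shows the underlying arithmetic:

import numpy as np

def phase_fold(times, period, epoch=0.0):
    # Return phases in [0, 1) for observation times folded on a trial period.
    return ((np.asarray(times) - epoch) / period) % 1.0

# Toy example: fold a few magnitudes on a 0.52-day trial period.
t = np.array([2455000.10, 2455000.70, 2455001.30, 2455002.00])
mag = np.array([11.2, 11.6, 11.1, 11.5])
phase = phase_fold(t, period=0.52)
for p, m in sorted(zip(phase, mag)):
    print(f"phase={p:.3f}  mag={m:.2f}")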
PNNL Data-Intensive Computing for a Smarter Energy Grid
Carol Imhoff; Zhenyu (Henry) Huang; Daniel Chavarria
2017-12-09
The Middleware for Data-Intensive Computing (MeDICi) Integration Framework, an integrated platform for data analysis and processing needs, supports PNNL research on the U.S. electric power grid. MeDICi is enabling development of visualizations of grid operations and vulnerabilities, with the goal of near real-time analysis to aid operators in preventing and mitigating grid failures.
Chae, Heejoon; Lee, Sangseon; Seo, Seokjun; Jung, Daekyoung; Chang, Hyeonsook; Nephew, Kenneth P; Kim, Sun
2016-12-01
Measuring gene expression, DNA sequence variation, and DNA methylation status is routinely done using high-throughput sequencing technologies. To analyze such multi-omics data and explore relationships, reliable bioinformatics systems are much needed. Existing systems are either for exploring curated data or for processing omics data in the form of a library such as R. Thus, scientists have difficulty investigating relationships among gene expression, DNA sequence variation, and DNA methylation using multi-omics data. In this study, we report a system called BioVLAB-mCpG-SNP-EXPRESS for the integrated analysis of DNA methylation, sequence variation (SNPs), and gene expression for distinguishing cellular phenotypes at the pairwise and multiple-phenotype levels. The system can be deployed on either the Amazon cloud or a publicly available high-performance computing node, and the data analysis and exploration of the analysis results can be conveniently done using a web-based interface. To reduce analysis complexity, all processes are fully automated, and a graphical workflow system is integrated to show analysis progress in real time. The BioVLAB-mCpG-SNP-EXPRESS system works in three stages. First, it processes and analyzes multi-omics data provided as raw input, i.e., FastQ files. Second, various integrated analyses such as methylation vs. gene expression and mutation vs. methylation are performed. Finally, the analysis results can be explored in a number of ways through a web interface for multi-level, multi-perspective exploration. Results can be interpreted at the gene, gene set, pathway, or network level, and explored from the perspective of gene expression, DNA methylation, sequence variation, or their relationships. The utility of the system is demonstrated by analyzing a data set of 30 phenotypically distinct breast cancer cell lines. BioVLAB-mCpG-SNP-EXPRESS is available at http://biohealth.snu.ac.kr/software/biovlab_mcpg_snp_express/. Copyright © 2016 Elsevier Inc. All rights reserved.
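At its core, one of the integrated analyses mentioned above (methylation vs. gene expression) is a per-gene association test across matched samples; the sketch below illustrates that idea with a simple Pearson correlation and is not the actual BioVLAB-mCpG-SNP-EXPRESS code (the data are invented):

import numpy as np
from scipy import stats

def methylation_expression_correlation(meth, expr):
    # meth, expr: arrays of shape (n_genes, n_samples) with matched sample columns.
    return [stats.pearsonr(m_row, e_row) for m_row, e_row in zip(meth, expr)]

# Toy example: 2 genes x 5 samples.
meth = np.array([[0.9, 0.8, 0.7, 0.2, 0.1],
                 [0.3, 0.4, 0.2, 0.5, 0.3]])
expr = np.array([[1.0, 2.0, 3.0, 8.0, 9.0],
                 [5.0, 4.5, 5.2, 4.8, 5.1]])
for gene, (r, p) in enumerate(methylation_expression_correlation(meth, expr)):
    print(f"gene {gene}: r = {r:.2f}, p = {p:.3f}")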
Liang, Li; Oline, Stefan N; Kirk, Justin C; Schmitt, Lukas Ian; Komorowski, Robert W; Remondes, Miguel; Halassa, Michael M
2017-01-01
Independently adjustable multielectrode arrays are routinely used to interrogate neuronal circuit function, enabling chronic in vivo monitoring of neuronal ensembles in freely behaving animals at single-cell, single-spike resolution. Despite the importance of this approach, its widespread use is limited by highly specialized design and fabrication methods. To address this, we have developed a Scalable, Lightweight, Integrated and Quick-to-assemble multielectrode array platform. This platform additionally integrates optical fibers with independently adjustable electrodes to allow simultaneous single-unit recordings and circuit-specific optogenetic targeting and/or manipulation. In current designs, the fully assembled platforms scale from 2 to 32 microdrives yet weigh only 1-3 g, light enough for small animals. Here, we describe the design process, from design intent in computer-aided design, through parameter testing by finite element analysis and experiments, to implementation in various applications in mice and rats. Combined, our methods may expand the utility of multielectrode recordings and their continued integration with other tools enabling functional dissection of intact neural circuits.
Software for the Integration of Multiomics Experiments in Bioconductor.
Ramos, Marcel; Schiffer, Lucas; Re, Angela; Azhar, Rimsha; Basunia, Azfar; Rodriguez, Carmen; Chan, Tiffany; Chapman, Phil; Davis, Sean R; Gomez-Cabrero, David; Culhane, Aedin C; Haibe-Kains, Benjamin; Hansen, Kasper D; Kodali, Hanish; Louis, Marie S; Mer, Arvind S; Riester, Markus; Morgan, Martin; Carey, Vince; Waldron, Levi
2017-11-01
Multiomics experiments are increasingly commonplace in biomedical research and add layers of complexity to experimental design, data integration, and analysis. R and Bioconductor provide a generic framework for statistical analysis and visualization, as well as specialized data classes for a variety of high-throughput data types, but methods are lacking for integrative analysis of multiomics experiments. The MultiAssayExperiment software package, implemented in R and leveraging Bioconductor software and design principles, provides for the coordinated representation of, storage of, and operation on multiple diverse genomics data. We provide the unrestricted multiple 'omics data for each cancer tissue in The Cancer Genome Atlas as ready-to-analyze MultiAssayExperiment objects and demonstrate in these and other datasets how the software simplifies data representation, statistical analysis, and visualization. The MultiAssayExperiment Bioconductor package reduces major obstacles to efficient, scalable, and reproducible statistical analysis of multiomics data and enhances data science applications of multiple omics datasets. Cancer Res; 77(21); e39-42. ©2017 American Association for Cancer Research.
JaSTA-2: Second version of the Java Superposition T-matrix Application
NASA Astrophysics Data System (ADS)
Halder, Prithish; Das, Himadri Sekhar
2017-12-01
In this article, we announce the development of a new version of the Java Superposition T-matrix App (JaSTA-2) to study the light scattering properties of porous aggregate particles. It has been developed using NetBeans 7.1.2, a Java integrated development environment (IDE). JaSTA uses the double-precision superposition T-matrix codes for multi-sphere clusters in random orientation developed by Mackowski and Mishchenko (1996). The new version offers two input options: (i) single wavelength and (ii) multiple wavelengths. The first option (which retains the functionality of the older version of JaSTA) calculates the light scattering properties of aggregates of spheres for a single wavelength at a time, whereas the second option executes the code for multiple wavelengths in a single run. JaSTA-2 provides convenient and quicker data analysis and can be used in diverse fields such as planetary science, atmospheric physics, and nanoscience. This version of the software is developed for the Linux platform only, and it can be operated over all the cores of a processor using the multi-threading option.
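The multiple-wavelength option amounts to mapping the single-wavelength calculation over a list of wavelengths, which is also why it can exploit all processor cores; a schematic Python analogue is shown below (JaSTA-2 itself is Java wrapping Fortran T-matrix codes, and compute_scattering here is a hypothetical stand-in for one such run):

from multiprocessing import Pool

def compute_scattering(wavelength_um):
    # Hypothetical stand-in for one superposition T-matrix run at a single wavelength;
    # a real run would invoke the external T-matrix code and parse its output.
    return {"wavelength_um": wavelength_um, "result": None}

def run_over_wavelengths(wavelengths_um, n_workers=4):
    # Execute the single-wavelength calculation for many wavelengths in parallel.
    with Pool(processes=n_workers) as pool:
        return pool.map(compute_scattering, wavelengths_um)

if __name__ == "__main__":
    for res in run_over_wavelengths([0.45, 0.55, 0.65, 0.85]):
        print(res)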
e-Science platform for translational biomedical imaging research: running, statistics, and analysis
NASA Astrophysics Data System (ADS)
Wang, Tusheng; Yang, Yuanyuan; Zhang, Kai; Wang, Mingqing; Zhao, Jun; Xu, Lisa; Zhang, Jianguo
2015-03-01
In order to enable multiple disciplines of medical researchers, clinical physicians and biomedical engineers to work together in a secure, efficient, and transparent cooperative environment, we designed an e-Science platform for biomedical imaging research and application across multiple academic institutions and hospitals in Shanghai, and presented this work at the SPIE Medical Imaging conference held in San Diego in 2012. In the past two years, we implemented a biomedical image chain including communication, storage, cooperation and computing based on this e-Science platform. Here, we present the operating status of this system in supporting biomedical imaging research, and analyze and discuss its results in supporting multi-disciplinary collaboration across multiple institutions.
Design of a dynamic test platform for autonomous robot vision systems
NASA Technical Reports Server (NTRS)
Rich, G. C.
1980-01-01
The concept and design of a dynamic test platform for development and evaluation of a robot vision system are discussed. The platform is to serve as a diagnostic and developmental tool for future work with the RPI Mars Rover's multi-laser/multi-detector vision system. The platform allows testing of the vision system while its attitude is varied, statically or periodically. The vision system is mounted on the test platform, where it can be subjected to a wide variety of simulated motions and thus examined in a controlled, quantitative fashion. Defining and modeling Rover motions and designing the platform to emulate these motions are also discussed. Individual aspects of the design process are treated separately, such as the structure, driving linkages, and motors and transmissions.
Tolstikhin, Valery; Saeidi, Shayan; Dolgaleva, Ksenia
2018-05-01
We report on the design optimization and tolerance analysis of a multistep lateral-taper spot-size converter based on indium phosphide (InP), performed using the Monte Carlo method. Being a natural fit to (and a key building block of) the regrowth-free taper-assisted vertical integration platform, such a spot-size converter enables efficient and displacement-tolerant fiber coupling to InP-based photonic integrated circuits at a wavelength of 1.31 μm. An exemplary four-step lateral-taper design is demonstrated, featuring 0.35 dB coupling loss at optimal alignment to a standard single-mode fiber, a 1 dB displacement tolerance of ≥7 μm in any direction in the facet plane, and great stability against manufacturing variances.
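Monte Carlo tolerance analysis of this kind draws fabrication and alignment parameters from assumed process distributions and reports what fraction of sampled devices still meets the coupling-loss specification; the Python sketch below illustrates the procedure with a purely hypothetical loss model (the real device response would come from optical simulation of the multistep taper):

import numpy as np

rng = np.random.default_rng(0)

def coupling_loss_db(width_err_um, thickness_err_um, misalign_um):
    # Hypothetical quadratic penalty around a 0.35 dB nominal loss; a stand-in for
    # the simulated response of the spot-size converter, not the authors' model.
    return 0.35 + 0.5 * width_err_um**2 + 2.0 * thickness_err_um**2 + 0.01 * misalign_um**2

def monte_carlo_yield(n_samples=100000, spec_db=1.0):
    width = rng.normal(0.0, 0.10, n_samples)   # assumed 0.10 um (1-sigma) width variation
    thick = rng.normal(0.0, 0.05, n_samples)   # assumed 0.05 um (1-sigma) thickness variation
    align = rng.normal(0.0, 2.0, n_samples)    # assumed 2 um (1-sigma) fiber misalignment
    loss = coupling_loss_db(width, thick, align)
    return float(np.mean(loss <= spec_db))

print(f"fraction of sampled devices within a 1 dB spec: {monte_carlo_yield():.3f}")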
PRIMA Platform capability for satellite missions in LEO and MEO (SAR, Optical, GNSS, TLC, etc.)
NASA Astrophysics Data System (ADS)
Logue, T.; L'Abbate, M.
2016-12-01
PRIMA (Piattaforma Riconfigurabile Italiana Multi Applicativa) is a multi-mission 3-axis stabilized platform developed by Thales Alenia Space Italia under ASI contract. PRIMA is designed to operate for a wide variety of applications from LEO and MEO up to GEO and for different classes of satellites within the platform family. It has extensive flight heritage (LEO and MEO satellites already fully operational) in which it has successfully demonstrated flexibility of use, low management costs and the ability to adapt to changing operational conditions. The flexibility and modularity of PRIMA provide a unique capability to satisfy different payload designs and mission requirements, thanks to the use of recurrent adaptable modules (Service Module-SVM, Propulsion Module-PPM, Payload Module-PLM) to obtain mission-dependent configurations. PRIMA product line development is continuously progressing and is based on state-of-the-art technology, a modular architecture and integrated avionics. The aim is to maintain and extend multi-mission capabilities to operate in different environments (LEO to GEO) with different payloads (SAR, Optical, GNSS, TLC, etc.). The design is compatible with a wide range of European and US equipment suppliers, thus maximising cooperation opportunities. Evolution activities are mainly focused on the following areas. Structure: to enable spacecraft configurations for multiple launch. Thermal control: to guarantee thermal limits for new missions, more demanding in terms of environment and payload. Electrical: to cope with higher power demand (e.g., electrical propulsion, a wide range of payloads) considering the orbital environment (e.g., lighting conditions). Avionics: AOCS solutions optimized per mission (LEO observation driven by agility and pointing; agility not a driver for GEO), use of sensors and actuators tailored to the specific mission and related environments, and optimised propulsion control. Data handling, software and FDIR mission customization ensure robust storage and downlink capability, long-lasting autonomy and flexible operations in all mission phases, in nominal and non-nominal conditions. Starting from PRIMA flight achievements, this paper outlines the PRIMA family's multi-purpose features addressed to meet multi-mission requirements.
2013-12-18
include interactive gene and methylation profiles, interactive heatmaps, Cytoscape network views, Integrative Genomics Viewer (IGV), and protein-protein...single chart. The website also provides an option to include multiple genes. Integrative Genomics Viewer (IGV) is a high-performance desktop tool for
NASA Astrophysics Data System (ADS)
Ghaebi, Hadi; Abbaspour, Ghader
2018-05-01
In this research, a thermoeconomic analysis of a multi-effect desalination thermal vapor compression (MED-TVC) system integrated with a trigeneration system with a gas turbine prime mover is carried out. The integrated system comprises a compressor, a combustion chamber, a gas turbine, a triple-pressure (low, medium and high pressure) heat recovery steam generator (HRSG), an absorption chiller cycle (ACC), and a multi-effect desalination (MED) system. Low-pressure steam produced in the HRSG is used to drive the absorption chiller cycle, medium-pressure steam is used in the desalination system, and high-pressure superheated steam is used for heating purposes. For the thermodynamic and thermoeconomic analysis of the proposed integrated system, Engineering Equation Solver (EES) is used, employing mass, energy, exergy, and cost balance equations for each component of the system. The modeling results showed that, with the new design, the exergy efficiency of the base design increases to 57.5%. In addition, the thermoeconomic analysis revealed that the net power, heating, fresh water and cooling have the highest production costs, in that order.
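The component-level balances mentioned above follow the standard exergoeconomic (SPECO-style) formulation; in generic form, not the authors' specific equations, the exergy balance and cost balance for a component k and the overall exergy efficiency can be written as

\dot{E}x_{F,k} = \dot{E}x_{P,k} + \dot{E}x_{D,k}, \qquad \sum \dot{C}_{out,k} + \dot{C}_{w,k} = \sum \dot{C}_{in,k} + \dot{C}_{q,k} + \dot{Z}_k, \qquad \eta_{ex} = \frac{\dot{E}x_{P,tot}}{\dot{E}x_{F,tot}},

where Ex_F, Ex_P and Ex_D denote the fuel, product and destroyed exergy rates, C denotes the cost rates associated with exergy streams (work w, heat q), and Z_k is the capital and operating cost rate of the component.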
Crescentini, Marco; Thei, Frederico; Bennati, Marco; Saha, Shimul; de Planque, Maurits R R; Morgan, Hywel; Tartagni, Marco
2015-06-01
Lipid bilayer membrane (BLM) arrays are required for high throughput analysis, for example drug screening or advanced DNA sequencing. Complex microfluidic devices are being developed but these are restricted in terms of array size and structure or have integrated electronic sensing with limited noise performance. We present a compact and scalable multichannel electrophysiology platform based on a hybrid approach that combines integrated state-of-the-art microelectronics with low-cost disposable fluidics providing a platform for high-quality parallel single ion channel recording. Specifically, we have developed a new integrated circuit amplifier based on a novel noise cancellation scheme that eliminates flicker noise derived from devices under test and amplifiers. The system is demonstrated through the simultaneous recording of ion channel activity from eight bilayer membranes. The platform is scalable and could be extended to much larger array sizes, limited only by electronic data decimation and communication capabilities.
Multimedia-modeling integration development environment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pelton, Mitchell A.; Hoopes, Bonnie L.
2002-09-02
There are many framework systems available; however, the purpose of the framework presented here is to capitalize on the successes of the Framework for Risk Analysis in Multimedia Environmental Systems (FRAMES) and Multi-media Multi-pathway Multi-receptor Risk Assessment (3MRA) methodology as applied to the Hazardous Waste Identification Rule (HWIR), while focusing on the development of software tools to simplify the module developer's effort of integrating a module into the framework.
Bridging ultrahigh-Q devices and photonic circuits
NASA Astrophysics Data System (ADS)
Yang, Ki Youl; Oh, Dong Yoon; Lee, Seung Hoon; Yang, Qi-Fan; Yi, Xu; Shen, Boqiang; Wang, Heming; Vahala, Kerry
2018-05-01
Optical microresonators are essential to a broad range of technologies and scientific disciplines. However, many of their applications rely on discrete devices to attain challenging combinations of ultra-low-loss performance (ultrahigh Q) and resonator design requirements. This prevents access to scalable fabrication methods for photonic integration and lithographic feature control. Indeed, finding a microfabrication bridge that connects ultrahigh-Q device functions with photonic circuits is a priority of the microcavity field. Here, an integrated resonator having a record Q factor over 200 million is presented. Its ultra-low-loss and flexible cavity design brings performance to integrated systems that has been the exclusive domain of discrete silica and crystalline microcavity devices. Two distinctly different devices are demonstrated: soliton sources with electronic repetition rates and high-coherence/low-threshold Brillouin lasers. This multi-device capability and performance from a single integrated cavity platform represents a critical advance for future photonic circuits and systems.
[Analysis of how elderly internet users react to unexpected situations].
Haesner, Marten; Steinert, Anika; O'Sullivan, Julie Lorraine; Steinhagen-Thiessen, Elisabeth
2015-12-01
Although internet usage among older adults is steadily increasing, there is still a digital divide between generations. Younger internet users seem to be more open towards new media. Recent studies showed the negative influence of computer anxiety on internet usage. It is not known how older adults deal with computer and internet issues in their home environment and which problem-solving strategies they apply. The behavior of elderly people in unexpected situations when using an internet portal was analyzed to establish whether older users can solve the problems without assistance and what individual reactions (e.g. facial expressions and gesticulations) they show during the interaction. In a clinical trial with 50 older adults aged 60 years and older various typical problems which may occur while using web platforms were simulated and user behavior was analyzed using logging data, videography and with questionnaires to measure the subjective opinion of the study participants. The study participants had severe problems in solving the tasks on their own and many of them could not find a suitable solution at all. Overall, the videography data indicated an increased concentration of the participants during the whole session, which is in contrast to the low levels of perceived mental workload reported by the participants. Regarding task completion, no differences were found between seniors with and without cognitive impairment. The results showed the serious difficulties of older adults when dealing with unexpected events while using a web platform. For developers of internet platforms for inexperienced seniors, it seems to be crucial to incorporate a simple integration of all available features within the platform, without including features requiring high multi-tasking skills.
Stone, John E; Hallock, Michael J; Phillips, James C; Peterson, Joseph R; Luthey-Schulten, Zaida; Schulten, Klaus
2016-05-01
Many of the continuing scientific advances achieved through computational biology are predicated on the availability of ongoing increases in computational power required for detailed simulation and analysis of cellular processes on biologically-relevant timescales. A critical challenge facing the development of future exascale supercomputer systems is the development of new computing hardware and associated scientific applications that dramatically improve upon the energy efficiency of existing solutions, while providing increased simulation, analysis, and visualization performance. Mobile computing platforms have recently become powerful enough to support interactive molecular visualization tasks that were previously only possible on laptops and workstations, creating future opportunities for their convenient use for meetings, remote collaboration, and as head mounted displays for immersive stereoscopic viewing. We describe early experiences adapting several biomolecular simulation and analysis applications for emerging heterogeneous computing platforms that combine power-efficient system-on-chip multi-core CPUs with high-performance massively parallel GPUs. We present low-cost power monitoring instrumentation that provides sufficient temporal resolution to evaluate the power consumption of individual CPU algorithms and GPU kernels. We compare the performance and energy efficiency of scientific applications running on emerging platforms with results obtained on traditional platforms, identify hardware and algorithmic performance bottlenecks that affect the usability of these platforms, and describe avenues for improving both the hardware and applications in pursuit of the needs of molecular modeling tasks on mobile devices and future exascale computers.
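Energy-efficiency comparisons of the kind described above require integrating the sampled power trace over each kernel's or algorithm's execution window; a minimal, generic sketch of that post-processing step (not the authors' instrumentation code, with invented sample values) is:

import numpy as np

def energy_joules(timestamps_s, power_watts, t_start, t_end):
    # Integrate a sampled power trace over [t_start, t_end] with the trapezoid rule.
    t = np.asarray(timestamps_s)
    p = np.asarray(power_watts)
    mask = (t >= t_start) & (t <= t_end)
    t_sel, p_sel = t[mask], p[mask]
    return float(np.sum(0.5 * (p_sel[:-1] + p_sel[1:]) * np.diff(t_sel)))

# Toy example: 1 kHz power samples around a kernel running from t = 0.10 s to t = 0.35 s.
t = np.linspace(0.0, 0.5, 501)
p = np.where((t >= 0.10) & (t <= 0.35), 45.0, 12.0)   # watts: busy vs. idle
print(f"kernel energy ~ {energy_joules(t, p, 0.10, 0.35):.2f} J")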
Modular Countermine Payload for Small Robots
DOE Office of Scientific and Technical Information (OSTI.GOV)
Herman Herman; Doug Few; Roelof Versteeg
2010-04-01
Payloads for small robotic platforms have historically been designed and implemented as platform- and task-specific solutions. A consequence of this approach is that payloads cannot be deployed on different robotic platforms without substantial re-engineering efforts. To address this issue, we developed a modular countermine payload that is designed from the ground up to be platform agnostic. The payload consists of the multi-mission payload controller unit (PCU) coupled with configurable mission-specific threat detection, navigation and marking payloads. The multi-mission PCU has all the common electronics to control and interface to all the payloads. It also contains the embedded processor that can be used to run the navigational and control software. The PCU has a very flexible robot interface which can be configured to interface to various robot platforms. The threat detection payload consists of a two-axis sweeping arm and the detector. The navigation payload consists of several perception sensors that are used for terrain mapping, obstacle detection and navigation. Finally, the marking payload consists of a dual-color paint marking system. Through the multi-mission PCU, all these payloads are packaged in a platform-agnostic way to allow deployment on multiple robotic platforms, including Talon and PackBot.
Lab-on-CMOS Integration of Microfluidics and Electrochemical Sensors
Huang, Yue; Mason, Andrew J.
2013-01-01
This paper introduces a CMOS-microfluidics integration scheme for electrochemical microsystems. A CMOS chip was embedded into a micro-machined silicon carrier. By leveling the CMOS chip and carrier surface to within 100 nm, an expanded obstacle-free surface suitable for photolithography was achieved. Thin film metal planar interconnects were microfabricated to bridge CMOS pads to the perimeter of the carrier, leaving a flat and smooth surface for integrating microfluidic structures. A model device containing SU-8 microfluidic mixers and detection channels crossing over microelectrodes on a CMOS integrated circuit was constructed using the chip-carrier assembly scheme. Functional integrity of microfluidic structures and on-CMOS electrodes was verified by a simultaneous sample dilution and electrochemical detection experiment within multi-channel microfluidics. This lab-on-CMOS integration process is capable of high packing density, is suitable for wafer-level batch production, and opens new opportunities to combine the performance benefits of on-CMOS sensors with lab-on-chip platforms. PMID:23939616
Huang, Hu; Zhao, Hongwei; Yang, Zhaojun; Fan, Zunqiang; Wan, Shunguang; Shi, Chengli; Ma, Zhichao
2012-01-01
Miniaturized precision positioning platforms are needed for in situ nanomechanical test applications. This paper proposes a compact precision positioning platform integrating strain gauges and a piezoelectric actuator. Effects of the geometric parameters of two parallel plates on the Von Mises stress distribution as well as on the static and dynamic characteristics of the platform were studied by the finite element method. Results of the calibration experiment indicate that the strain gauge sensor has good linearity and its sensitivity is about 0.0468 mV/μm. A closed-loop control system was established to address the nonlinearity of the platform. Experimental results demonstrate that, for the displacement control process, both the increasing and decreasing displacement portions have good linearity, verifying that the control system works. The developed platform has a compact structure but can realize displacement measurement with the embedded strain gauges, which is useful for closed-loop control and structural miniaturization of piezo devices. It has potential applications in nanoindentation and nanoscratch tests, especially in the field of in situ nanomechanical testing which requires compact structures. PMID:23012566
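The quoted sensitivity (about 0.0468 mV/μm) is what turns the strain-gauge voltage into a displacement reading for the closed-loop controller; a schematic proportional-control step using that calibration is sketched below (the interface values and gain are hypothetical, not the authors' controller):

SENSITIVITY_MV_PER_UM = 0.0468   # from the calibration experiment

def measured_displacement_um(gauge_voltage_mv):
    # Convert strain-gauge output voltage to platform displacement.
    return gauge_voltage_mv / SENSITIVITY_MV_PER_UM

def proportional_step(target_um, gauge_voltage_mv, drive_voltage_v, kp=0.05):
    # One iteration of a simple closed-loop correction of the piezo drive voltage.
    error_um = target_um - measured_displacement_um(gauge_voltage_mv)
    return drive_voltage_v + kp * error_um   # updated piezo command (volts)

# Toy example: platform currently at ~8.5 um (0.398 mV reading), target 10 um.
print(f"new drive voltage: {proportional_step(10.0, 0.398, 20.0):.3f} V")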
KDE Bioscience: platform for bioinformatics analysis workflows.
Lu, Qiang; Hao, Pei; Curcin, Vasa; He, Weizhong; Li, Yuan-Yuan; Luo, Qing-Ming; Guo, Yi-Ke; Li, Yi-Xue
2006-08-01
Bioinformatics is a dynamic research area in which a large number of algorithms and programs have been developed rapidly and independently without much consideration so far of the need for standardization. The lack of such common standards, combined with unfriendly interfaces, makes it difficult for biologists to learn how to use these tools and to translate data formats from one to another. Consequently, the construction of an integrative bioinformatics platform to facilitate biologists' research is an urgent and challenging task. KDE Bioscience is a Java-based software platform that collects a variety of bioinformatics tools and provides a workflow mechanism to integrate them. Nucleotide and protein sequences from local flat files, web sites, and relational databases can be entered, annotated, and aligned. Several home-made or third-party viewers are built in to provide visualization of annotations or alignments. KDE Bioscience can also be deployed in client-server mode, where simultaneous execution of the same workflow is supported for multiple users. Moreover, workflows can be published as web pages that can be executed from a web browser. The power of KDE Bioscience comes from its integrated algorithms and data sources. With its generic workflow mechanism, other novel calculations and simulations can be integrated to augment the current sequence analysis functions. Because of this flexible and extensible architecture, KDE Bioscience makes an ideal integrated informatics environment for future bioinformatics or systems biology research.
USDA-ARS?s Scientific Manuscript database
Next-generation sequencing technologies were used to rapidly and efficiently sequence the genome of the domestic turkey (Meleagris gallopavo). The current genome assembly (~1.1 Gb) includes 917 Mb of sequence assigned to chromosomes. Innate heterozygosity of the sequenced bird allowed discovery of...
Multi-locus mixed model analysis of stem rust resistance in a worldwide collection of winter wheat
USDA-ARS?s Scientific Manuscript database
Genome-wide association mapping is a powerful tool for dissecting the relationship between phenotypes and genetic variants in diverse populations. With improved cost efficiency of high-throughput genotyping platforms, association mapping is a desirable method to mine populations for favorable allele...
Integrated Microfluidic Lectin Barcode Platform for High-Performance Focused Glycomic Profiling
NASA Astrophysics Data System (ADS)
Shang, Yuqin; Zeng, Yun; Zeng, Yong
2016-02-01
Protein glycosylation is one of the key processes that play essential roles in biological functions and dysfunctions. However, progress in glycomics has considerably lagged behind genomics and proteomics, due in part to the enormous challenges in analysis of glycans. Here we present a new integrated and automated microfluidic lectin barcode platform to substantially improve the performance of lectin array for focused glycomic profiling. The chip design and flow control were optimized to promote the lectin-glycan binding kinetics and speed of lectin microarray. Moreover, we established an on-chip lectin assay which employs a very simple blocking method to effectively suppress the undesired background due to lectin binding of antibodies. Using this technology, we demonstrated focused differential profiling of tissue-specific glycosylation changes of a biomarker, CA125 protein purified from ovarian cancer cell line and different tissues from ovarian cancer patients in a fast, reproducible, and high-throughput fashion. Highly sensitive CA125 detection was also demonstrated with a detection limit much lower than the clinical cutoff value for cancer diagnosis. This microfluidic platform holds the potential to integrate with sample preparation functions to construct a fully integrated “sample-to-answer” microsystem for focused differential glycomic analysis. Thus, our technology should present a powerful tool in support of rapid advance in glycobiology and glyco-biomarker development.
Extending XNAT Platform with an Incremental Semantic Framework
Timón, Santiago; Rincón, Mariano; Martínez-Tomás, Rafael
2017-01-01
Informatics increases the yield from neuroscience due to improved data. Data sharing and accessibility enable joint efforts between different research groups, as well as replication studies, pivotal for progress in the field. Research data archiving solutions are evolving rapidly to address these necessities, however, distributed data integration is still difficult because of the need of explicit agreements for disparate data models. To address these problems, ontologies are widely used in biomedical research to obtain common vocabularies and logical descriptions, but its application may suffer from scalability issues, domain bias, and loss of low-level data access. With the aim of improving the application of semantic models in biobanking systems, an incremental semantic framework that takes advantage of the latest advances in biomedical ontologies and the XNAT platform is designed and implemented. We follow a layered architecture that allows the alignment of multi-domain biomedical ontologies to manage data at different levels of abstraction. To illustrate this approach, the development is integrated in the JPND (EU Joint Program for Neurodegenerative Disease) APGeM project, focused on finding early biomarkers for Alzheimer's and other dementia related diseases. PMID:28912709
The CALIPSO Integrated Thermal Control Subsystem
NASA Technical Reports Server (NTRS)
Gasbarre, Joseph F.; Ousley, Wes; Valentini, Marc; Thomas, Jason; Dejoie, Joel
2007-01-01
The Cloud-Aerosol Lidar and Infrared Pathfinder Satellite Observation (CALIPSO) is a joint NASA-CNES mission to study the Earth's cloud and aerosol layers. The satellite is composed of a primary payload (built by Ball Aerospace) and a spacecraft platform bus (PROTEUS, built by Alcatel Alenia Space). The thermal control subsystem (TCS) for the CALIPSO satellite is a passive design utilizing radiators, multi-layer insulation (MLI) blankets, and both operational and survival surface heaters. The most temperature-sensitive component within the satellite is the laser system. During thermal vacuum testing of the integrated satellite, the laser system's operational heaters were found to be inadequate in maintaining the laser's required set point. In response, a solution utilizing the laser system's survival heaters to augment the operational heaters was developed in collaboration between NASA, CNES, Ball Aerospace, and Alcatel Alenia. The CALIPSO satellite launched from Vandenberg Air Force Base in California on April 26th, 2006. Evaluation of both the platform and payload thermal control systems shows they are performing as expected and maintaining the critical elements of the satellite within acceptable limits.
Photonanomedicine: a convergence of photodynamic therapy and nanotechnology
NASA Astrophysics Data System (ADS)
Obaid, Girgis; Broekgaarden, Mans; Bulin, Anne-Laure; Huang, Huang-Chiao; Kuriakose, Jerrin; Liu, Joyce; Hasan, Tayyaba
2016-06-01
As clinical nanomedicine has emerged over the past two decades, phototherapeutic advancements using nanotechnology have also evolved and impacted disease management. Because of unique features attributable to the light activation process of molecules, photonanomedicine (PNM) holds significant promise as a personalized, image-guided therapeutic approach for cancer and non-cancer pathologies. The convergence of advanced photochemical therapies such as photodynamic therapy (PDT) and imaging modalities with sophisticated nanotechnologies is enabling the ongoing evolution of fundamental PNM formulations, such as Visudyne®, into progressive forward-looking platforms that integrate theranostics (therapeutics and diagnostics), molecular selectivity, the spatiotemporally controlled release of synergistic therapeutics, along with regulated, sustained drug dosing. Considering that the envisioned goal of these integrated platforms is proving to be realistic, this review will discuss how PNM has evolved over the years as a preclinical and clinical amalgamation of nanotechnology with PDT. The encouraging investigations that emphasize the potent synergy between photochemistry and nanotherapeutics, in addition to the growing realization of the value of these multi-faceted theranostic nanoplatforms, will assist in driving PNM formulations into mainstream oncological clinical practice as a necessary tool in the medical armamentarium.
Wade, James H; Jones, Joshua D; Lenov, Ivan L; Riordan, Colleen M; Sligar, Stephen G; Bailey, Ryan C
2017-08-22
The characterization of integral membrane proteins presents numerous analytical challenges on account of their poor activity under non-native conditions, limited solubility in aqueous solutions, and low expression in most cell culture systems. Nanodiscs are synthetic model membrane constructs that offer many advantages for studying membrane protein function by offering a native-like phospholipid bilayer environment. The successful incorporation of membrane proteins within Nanodiscs requires experimental optimization of conditions. Standard protocols for Nanodisc formation can require large amounts of time and input material, limiting the facile screening of formation conditions. Capitalizing on the miniaturization and efficient mass transport inherent to microfluidics, we have developed a microfluidic platform for efficient Nanodisc assembly and purification, and demonstrated the ability to incorporate functional membrane proteins into the resulting Nanodiscs. In addition to working with reduced sample volumes, this platform simplifies membrane protein incorporation from a multi-stage protocol requiring several hours or days into a single platform that outputs purified Nanodiscs in less than one hour. To demonstrate the utility of this platform, we incorporated Cytochrome P450 into Nanodiscs of variable size and lipid composition, and present spectroscopic evidence for the functional active site of the membrane protein. This platform is a promising new tool for membrane protein biology and biochemistry that enables tremendous versatility for optimizing the incorporation of membrane proteins using microfluidic gradients to screen across diverse formation conditions.
Reis, Yara; Wolf, Thomas; Brors, Benedikt; Hamacher-Brady, Anne; Eils, Roland; Brady, Nathan R.
2012-01-01
Mitochondria exist as a network of interconnected organelles undergoing constant fission and fusion. Current approaches to study mitochondrial morphology are limited by low data sampling coupled with manual identification and classification of complex morphological phenotypes. Here we propose an integrated mechanistic and data-driven modeling approach to analyze heterogeneous, quantified datasets and infer relations between mitochondrial morphology and apoptotic events. We initially performed high-content, multi-parametric measurements of mitochondrial morphological, apoptotic, and energetic states by high-resolution imaging of human breast carcinoma MCF-7 cells. Subsequently, decision tree-based analysis was used to automatically classify networked, fragmented, and swollen mitochondrial subpopulations, at the single-cell level and within cell populations. Our results revealed subtle but significant differences in morphology class distributions in response to various apoptotic stimuli. Furthermore, key mitochondrial functional parameters including mitochondrial membrane potential and Bax activation, were measured under matched conditions. Data-driven fuzzy logic modeling was used to explore the non-linear relationships between mitochondrial morphology and apoptotic signaling, combining morphological and functional data as a single model. Modeling results are in accordance with previous studies, where Bax regulates mitochondrial fragmentation, and mitochondrial morphology influences mitochondrial membrane potential. In summary, we established and validated a platform for mitochondrial morphological and functional analysis that can be readily extended with additional datasets. We further discuss the benefits of a flexible systematic approach for elucidating specific and general relationships between mitochondrial morphology and apoptosis. PMID:22272225
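The decision-tree step described above assigns each segmented mitochondrial object to a morphology class (networked, fragmented or swollen) from quantified shape features; a generic scikit-learn sketch of that classification step, with invented feature names and toy values rather than the authors' measurements, is:

import numpy as np
from sklearn.tree import DecisionTreeClassifier

# Toy feature table, one row per segmented object:
# columns are volume (um^3), elongation (major/minor axis ratio), branch count.
X = np.array([[0.2, 1.1, 0],
              [0.9, 4.5, 6],
              [1.5, 1.2, 0],
              [0.3, 1.3, 1],
              [1.1, 5.0, 8],
              [1.8, 1.1, 0]])
y = ["fragmented", "networked", "swollen", "fragmented", "networked", "swollen"]

clf = DecisionTreeClassifier(max_depth=3, random_state=0).fit(X, y)
print(clf.predict([[1.0, 4.0, 5]]))   # likely ['networked'] for this toy tree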
CAS-viewer: web-based tool for splicing-guided integrative analysis of multi-omics cancer data.
Han, Seonggyun; Kim, Dongwook; Kim, Youngjun; Choi, Kanghoon; Miller, Jason E; Kim, Dokyoon; Lee, Younghee
2018-04-20
The Cancer Genome Atlas (TCGA) project is a public resource that provides transcriptomic, DNA sequence, methylation, and clinical data for 33 cancer types. Transforming the large size and high complexity of TCGA cancer genome data into integrated knowledge can be useful to promote cancer research. Alternative splicing (AS) is a key regulatory mechanism of genes in human cancer development and in the interaction with epigenetic factors. Therefore, AS-guided integration of existing TCGA data sets will make it easier to gain insight into the genetic architecture of cancer risk and related outcomes. There are already existing tools analyzing and visualizing alternative mRNA splicing patterns for large-scale RNA-seq experiments. However, these existing web-based tools are limited to the analysis of individual TCGA data sets at a time, such as only transcriptomic information. We implemented CAS-viewer (integrative analysis of Cancer genome data based on Alternative Splicing), a web-based tool leveraging multi-cancer omics data from TCGA. It illustrates alternative mRNA splicing patterns along with methylation, miRNAs, and SNPs, and then provides an analysis tool to link differential transcript expression ratio to methylation, miRNA, and splicing regulatory elements for 33 cancer types. Moreover, one can analyze AS patterns with clinical data to identify potential transcripts associated with different survival outcome for each cancer. CAS-viewer is a web-based application for transcript isoform-driven integration of multi-omics data in multiple cancer types and will aid in the visualization and possible discovery of biomarkers for cancer by integrating multi-omics data from TCGA.
MultiElec: A MATLAB Based Application for MEA Data Analysis.
Georgiadis, Vassilis; Stephanou, Anastasis; Townsend, Paul A; Jackson, Thomas R
2015-01-01
We present MultiElec, an open source MATLAB based application for data analysis of microelectrode array (MEA) recordings. MultiElec displays an extremely user-friendly graphic user interface (GUI) that allows the simultaneous display and analysis of voltage traces for 60 electrodes and includes functions for activation-time determination, the production of activation-time heat maps with activation time and isoline display. Furthermore, local conduction velocities are semi-automatically calculated along with their corresponding vector plots. MultiElec allows ad hoc signal suppression, enabling the user to easily and efficiently handle signal artefacts and for incomplete data sets to be analysed. Voltage traces and heat maps can be simply exported for figure production and presentation. In addition, our platform is able to produce 3D videos of signal progression over all 60 electrodes. Functions are controlled entirely by a single GUI with no need for command line input or any understanding of MATLAB code. MultiElec is open source under the terms of the GNU General Public License as published by the Free Software Foundation, version 3. Both the program and source code are available to download from http://www.cancer.manchester.ac.uk/MultiElec/.
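Local conduction velocity on a multielectrode array is derived from the spread of activation times across neighbouring electrodes; the simplest two-electrode estimate is illustrated below in Python (MultiElec itself is MATLAB and fits a full velocity vector field, so this is only the underlying idea, with invented numbers):

import numpy as np

def conduction_velocity_mm_per_ms(pos_a_mm, pos_b_mm, t_a_ms, t_b_ms):
    # Speed of an activation wavefront travelling between two electrodes.
    distance_mm = np.linalg.norm(np.asarray(pos_b_mm) - np.asarray(pos_a_mm))
    return distance_mm / (t_b_ms - t_a_ms)

# Toy example: electrodes 0.7 mm apart, activation times 1.0 ms apart.
v = conduction_velocity_mm_per_ms((0.0, 0.0), (0.7, 0.0), t_a_ms=12.0, t_b_ms=13.0)
print(f"conduction velocity ~ {v:.2f} mm/ms")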
Dynamic analysis of apoptosis using cyanine SYTO probes: From classical to microfluidic cytometry
Wlodkowic, Donald; Skommer, Joanna; Faley, Shannon; Darzynkiewicz, Zbigniew; Cooper, Jonathan M.
2013-01-01
Cell death is a stochastic process, often initiated and/or executed in a multi-pathway/multi-organelle fashion. Therefore, high-throughput single-cell analysis platforms are required to provide detailed characterization of kinetics and mechanisms of cell death in heterogeneous cell populations. However, there is still a largely unmet need for inert fluorescent probes, suitable for prolonged kinetic studies. Here, we compare the use of innovative adaptation of unsymmetrical SYTO dyes for dynamic real-time analysis of apoptosis in conventional as well as microfluidic chip-based systems. We show that cyanine SYTO probes allow non-invasive tracking of intracellular events over extended time. Easy handling and “stain–no wash” protocols open up new opportunities for high-throughput analysis and live-cell sorting. Furthermore, SYTO probes are easily adaptable for detection of cell death using automated microfluidic chip-based cytometry. Overall, the combined use of SYTO probes and state-of-the-art Lab-on-a-Chip platform emerges as a cost effective solution for automated drug screening compared to conventional Annexin V or TUNEL assays. In particular, it should allow for dynamic analysis of samples where low cell number has so far been an obstacle, e.g. primary cancer stems cells or circulating minimal residual tumors. PMID:19298813
Networking and transmission based on the principle of laser multi-point communication
NASA Astrophysics Data System (ADS)
Fu, Qiang; Liu, Xianzhu; Jiang, Huilin; Hu, Yuan; Jiang, Lun
2014-11-01
Space laser communication is an ideal choice for the future integrated space-ground information backbone network. This paper introduces the structure of such a network: a large-capacity, high-speed broadband information network in which a variety of communication platforms, such as land, sea, air and space users or aircraft, are densely interconnected, and intelligent high-speed processing, switching and routing technologies are adopted. Following the principle of maximizing the effective use of information resources, information is acquired accurately, processed quickly and transmitted efficiently through inter-satellite, satellite-to-ground, sky, ground-station and other links, forming an integrated space-based, air-based and ground-based information network. Starting from trends in laser communication, the current status of laser multi-point communication is reviewed, and transmission schemes for dynamic multi-point wireless laser communication networks are studied in detail. The characteristics and scope of several network transmission schemes are described: an optical multiplexer based on a multi-port configuration, applied to relay backbone links; an optical multiplexer based on segmentation of the receiver field of view, applied to small-angle links; an optical multiplexer based on a three-concentric-sphere structure, applied to short-distance, highly maneuverable scenarios; and a multi-point stitching structure based on a paraboloid of revolution, applied to inter-satellite communication. The multi-point laser communication terminal consists of transmitting and receiving antennas, a relay optical system, a spectroscopic system, and communication transmitter and receiver systems. For optical multiplexers serving four or more targets, the ratio of received power to volume and weight offers obvious advantages, and multiple moving targets can be tracked flexibly. This work provides a reference for the construction of integrated space-ground information networks.
Stakeholders' Opinions on Multi-Use Deep Water Offshore Platform in Hsiao-Liu-Chiu, Taiwan
Sie, Ya-Tsune; Chang, Yang-Chi; Lu, Shiau-Yun
2018-01-01
This paper describes a group model building activity designed to elicit the potential effects a projected multi-use deep water offshore platform may have on its local environment, including ecological and socio-economic issues. As such a platform is proposed for construction around the island of Hsiao-Liu-Chiu, Taiwan, we organized several meetings with the local stakeholders and structured the debates using group modeling methods to promote consensus. During the process, the participants iteratively built and revised a causal-loop diagram that summarizes their opinions. Overall, local stakeholders concluded that a multi-use deep water offshore marine platform might have beneficial effects for Hsiao-Liu-Chiu because more tourists and fish could be attracted by the structure, but they also raised some potential problems regarding the law in Taiwan and the design of the offshore platform, especially its resistance to extreme weather. We report the method used and the main results and insights gained during the process. PMID:29415521
Huygens' inspired multi-pendulum setups: Experiments and stability analysis
NASA Astrophysics Data System (ADS)
Hoogeboom, F. N.; Pogromsky, A. Y.; Nijmeijer, H.
2016-11-01
This paper examines synchronization of a set of metronomes placed on a lightweight foam platform. Two configurations of the set of metronomes are considered: a row setup containing one-dimensional coupling and a cross setup containing two-dimensional coupling. Depending on the configuration and coupling between the metronomes, i.e., the platform parameters, in- and/or anti-phase synchronized behavior is observed in the experiments. To explain this behavior, mathematical models of a metronome and experimental setups have been derived and used in a local stability analysis. It is numerically and experimentally demonstrated that varying the coupling parameters for both configurations has a significant influence on the stability of the synchronized solutions.
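A common minimal model behind such experiments treats each metronome as a self-sustained (van der Pol type) oscillator coupled through the motion of the compliant platform; the Python sketch below integrates one such toy model to show how the platform coupling pulls the two metronomes toward a phase-locked state. All parameter values are assumptions for illustration, and the equations are a simplified stand-in rather than the model derived in the paper.

import numpy as np
from scipy.integrate import solve_ivp

def rhs(t, s, omega=2*np.pi*1.6, eps=0.5, mu=0.02, Omega=2*np.pi*2.5, zeta=0.05):
    # s = [theta1, dtheta1, theta2, dtheta2, x, dx]; x is the platform displacement.
    th1, dth1, th2, dth2, x, dx = s
    react = -mu * omega**2 * (th1 + th2)             # approximate pendulum reaction force
    ddx = -2*zeta*Omega*dx - Omega**2*x + react      # damped platform dynamics
    esc = lambda th, dth: eps * (1.0 - (th / 0.4)**2) * dth   # van der Pol style escapement
    ddth1 = -omega**2*th1 + esc(th1, dth1) - ddx     # platform acceleration couples the two
    ddth2 = -omega**2*th2 + esc(th2, dth2) - ddx
    return [dth1, ddth1, dth2, ddth2, dx, ddx]

sol = solve_ivp(rhs, (0.0, 60.0), [0.30, 0.0, -0.25, 0.0, 0.0, 0.0], max_step=0.01)
print(f"final angle difference: {sol.y[0, -1] - sol.y[2, -1]:.3f} rad")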
NASA Astrophysics Data System (ADS)
Schröder, Henning; Brusberg, Lars; Pitwon, Richard; Whalley, Simon; Wang, Kai; Miller, Allen; Herbst, Christian; Weber, Daniel; Lang, Klaus-Dieter
2015-03-01
Optical interconnects for data transmission at board level offer increased energy efficiency, system density, and bandwidth scalability compared to purely copper driven systems. We present recent results on manufacturing of electrooptical printed circuit board (PCB) with integrated planar glass waveguides. The graded index multi-mode waveguides are patterned inside commercially available thin-glass panels by performing a specific ion-exchange process. The glass waveguide panel is embedded within the layer stack-up of a PCB using proven industrial processes. This paper describes the design, manufacture, assembly and characterization of the first electro-optical backplane demonstrator based on integrated planar glass waveguides. The electro-optical backplane in question is created by laminating the glass waveguide panel into a conventional multi-layer electronic printed circuit board stack-up. High precision ferrule mounts are automatically assembled, which will enable MT compliant connectors to be plugged accurately to the embedded waveguide interfaces on the glass panel edges. The demonstration platform comprises a standardized sub-rack chassis and five pluggable test cards each housing optical engines and pluggable optical connectors. The test cards support a variety of different data interfaces and can support data rates of up to 32 Gb/s per channel.
NASA Astrophysics Data System (ADS)
Strangio, S.; Palestri, P.; Lanuzza, M.; Esseni, D.; Crupi, F.; Selmi, L.
2017-02-01
In this work, a benchmark for low-power digital applications of a III-V TFET technology platform against a conventional CMOS FinFET technology node is proposed. The analysis focuses on full-adder circuits, which are commonly identified as representative of the digital logic environment. 28T and 24T topologies, implemented in complementary-logic and transmission-gate logic, respectively, are investigated. Transient simulations are performed with a purpose-built test-bench on each single-bit full adder solution. The extracted delays and energy characteristics are post-processed and translated into figures-of-merit for multi-bit ripple-carry-adders. Trends related to the different full-adder implementations (for the same device technology platform) and to the different technology platforms (for the same full-adder topology) are presented and discussed.
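The post-processing step, in which single-bit full-adder delay and energy are translated into multi-bit ripple-carry-adder figures of merit, can be sketched with the usual first-order composition rules; these rules and the numbers below are textbook approximations and placeholders, not the paper's extracted TFET or FinFET values:

# First-order figures of merit for an N-bit ripple-carry adder, derived from
# single-bit full-adder characteristics. The composition rules below are the
# usual textbook approximations, not necessarily the paper's exact procedure.
def rca_figures_of_merit(n_bits, t_carry, t_sum, e_fa):
    """t_carry/t_sum: carry-out and sum delays of one FA [s]; e_fa: energy per FA operation [J]."""
    delay = (n_bits - 1) * t_carry + t_sum   # the carry ripples through n-1 stages
    energy = n_bits * e_fa                   # every FA switches once per addition
    return {"delay": delay, "energy": energy, "EDP": delay * energy}

# Hypothetical numbers for illustration only (not measured device values).
print(rca_figures_of_merit(n_bits=32, t_carry=20e-12, t_sum=35e-12, e_fa=0.5e-15))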
Husen, Peter; Tarasov, Kirill; Katafiasz, Maciej; Sokol, Elena; Vogt, Johannes; Baumgart, Jan; Nitsch, Robert; Ekroos, Kim; Ejsing, Christer S
2013-01-01
Global lipidomics analysis across large sample sizes produces high-content datasets that require dedicated software tools supporting lipid identification and quantification, efficient data management and lipidome visualization. Here we present a novel software-based platform for streamlined data processing, management and visualization of shotgun lipidomics data acquired using high-resolution Orbitrap mass spectrometry. The platform features the ALEX framework designed for automated identification and export of lipid species intensity directly from proprietary mass spectral data files, and an auxiliary workflow using database exploration tools for integration of sample information, computation of lipid abundance and lipidome visualization. A key feature of the platform is the organization of lipidomics data in "database table format" which provides the user with an unsurpassed flexibility for rapid lipidome navigation using selected features within the dataset. To demonstrate the efficacy of the platform, we present a comparative neurolipidomics study of cerebellum, hippocampus and somatosensory barrel cortex (S1BF) from wild-type and knockout mice devoid of the putative lipid phosphate phosphatase PRG-1 (plasticity related gene-1). The presented framework is generic, extendable to processing and integration of other lipidomic data structures, can be interfaced with post-processing protocols supporting statistical testing and multivariate analysis, and can serve as an avenue for disseminating lipidomics data within the scientific community. The ALEX software is available at www.msLipidomics.info.
Wan, Yuhang; Carlson, John A; Kesler, Benjamin A; Peng, Wang; Su, Patrick; Al-Mulla, Saoud A; Lim, Sung Jun; Smith, Andrew M; Dallesasse, John M; Cunningham, Brian T
2016-07-08
A compact analysis platform for detecting liquid absorption and emission spectra using a set of optical linear variable filters atop a CMOS image sensor is presented. The working spectral range of the analysis platform can be extended without a reduction in spectral resolution by utilizing multiple linear variable filters with different wavelength ranges on the same CMOS sensor. With optical setup reconfiguration, its capability to measure both absorption and fluorescence emission is demonstrated. Quantitative detection of fluorescence emission down to 0.28 nM for quantum dot dispersions and 32 ng/mL for near-infrared dyes has been demonstrated on a single platform over a wide spectral range, as well as an absorption-based water quality test, showing the versatility of the system across liquid solutions for different emission and absorption bands. Comparison with a commercially available portable spectrometer and an optical spectrum analyzer shows our system has an improved signal-to-noise ratio and acceptable spectral resolution for discrimination of emission spectra and characterization of the absorption characteristics of colored liquids generated by common biomolecular assays. This simple, compact, and versatile analysis platform demonstrates a path towards an integrated optical device that can be utilized for a wide variety of applications in point-of-use testing and point-of-care diagnostics.
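The read-out principle, in which each pixel column under a linear variable filter samples a narrow band whose centre wavelength shifts roughly linearly with position, can be sketched as a simple pixel-to-wavelength calibration followed by column-wise averaging; the calibration endpoints and synthetic frame below are assumptions for illustration only:

# Minimal sketch of spectrum read-out from an image sensor under a linear
# variable filter: wavelength is assumed to vary linearly along the pixel
# columns (calibration endpoints are illustrative assumptions).
import numpy as np

def extract_spectrum(frame, lam_start_nm, lam_end_nm, dark=None):
    """frame: 2-D sensor image (rows x cols); returns (wavelengths, intensity)."""
    img = frame.astype(float)
    if dark is not None:
        img = img - dark                      # optional dark-frame subtraction
    intensity = img.mean(axis=0)              # average down each filter stripe
    wavelengths = np.linspace(lam_start_nm, lam_end_nm, img.shape[1])
    return wavelengths, intensity

# Example with synthetic data: a Gaussian emission line on a flat background.
cols = 640
lam = np.linspace(550.0, 750.0, cols)
frame = 50.0 + 800.0 * np.exp(-((lam - 620.0) / 8.0) ** 2) * np.ones((480, 1))
wl, spec = extract_spectrum(frame, 550.0, 750.0)
print("peak at ~%.1f nm" % wl[np.argmax(spec)])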
GenePattern | Informatics Technology for Cancer Research (ITCR)
GenePattern is a genomic analysis platform that provides access to hundreds of tools for the analysis and visualization of multiple data types. A web-based interface provides easy access to these tools and allows the creation of multi-step analysis pipelines that enable reproducible in silico research. A new GenePattern Notebook environment allows users to combine GenePattern analyses with text, graphics, and code to create complete reproducible research narratives.
MultiMetEval: Comparative and Multi-Objective Analysis of Genome-Scale Metabolic Models
Gevorgyan, Albert; Kierzek, Andrzej M.; Breitling, Rainer; Takano, Eriko
2012-01-01
Comparative metabolic modelling is emerging as a novel field, supported by the development of reliable and standardized approaches for constructing genome-scale metabolic models in high throughput. New software solutions are needed to allow efficient comparative analysis of multiple models in the context of multiple cellular objectives. Here, we present the user-friendly software framework Multi-Metabolic Evaluator (MultiMetEval), built upon SurreyFBA, which allows the user to compose collections of metabolic models that together can be subjected to flux balance analysis. Additionally, MultiMetEval implements functionalities for multi-objective analysis by calculating the Pareto front between two cellular objectives. Using a previously generated dataset of 38 actinobacterial genome-scale metabolic models, we show how these approaches can lead to exciting novel insights. Firstly, after incorporating several pathways for the biosynthesis of natural products into each of these models, comparative flux balance analysis predicted that species like Streptomyces that harbour the highest diversity of secondary metabolite biosynthetic gene clusters in their genomes do not necessarily have the metabolic network topology most suitable for compound overproduction. Secondly, multi-objective analysis of biomass production and natural product biosynthesis in these actinobacteria shows that the well-studied occurrence of discrete metabolic switches during the change of cellular objectives is inherent to their metabolic network architecture. Comparative and multi-objective modelling can lead to insights that could not be obtained by normal flux balance analyses. MultiMetEval provides a powerful platform that makes these analyses straightforward for biologists. Sources and binaries of MultiMetEval are freely available from https://github.com/PiotrZakrzewski/MetEval/downloads. PMID:23272111
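The Pareto-front calculation between two cellular objectives can be sketched with a standard epsilon-constraint sweep in flux balance analysis: fix one objective at a fraction of its optimum and maximize the other. The sketch below uses the open-source cobrapy toolbox as a stand-in for MultiMetEval/SurreyFBA, and the model file and reaction identifiers are placeholders, not resources from the paper:

# Sketch of a two-objective Pareto front with flux balance analysis, using the
# cobrapy toolbox as a stand-in for MultiMetEval/SurreyFBA. The SBML file and
# reaction IDs ("BIOMASS_rxn", "PRODUCT_rxn") are hypothetical placeholders.
import numpy as np
import cobra

model = cobra.io.read_sbml_model("actinobacterium_model.xml")   # hypothetical file
biomass = model.reactions.get_by_id("BIOMASS_rxn")
product = model.reactions.get_by_id("PRODUCT_rxn")

model.objective = biomass
max_growth = model.optimize().objective_value

pareto = []
for frac in np.linspace(0.0, 1.0, 11):
    with model:                                   # changes are reverted on exit
        biomass.lower_bound = frac * max_growth   # constrain growth to a fraction of its optimum
        model.objective = product
        pareto.append((frac * max_growth, model.optimize().objective_value))

for growth, prod in pareto:
    print(f"growth {growth:.3f} -> max product flux {prod:.3f}")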
The Image Data Resource: A Bioimage Data Integration and Publication Platform.
Williams, Eleanor; Moore, Josh; Li, Simon W; Rustici, Gabriella; Tarkowska, Aleksandra; Chessel, Anatole; Leo, Simone; Antal, Bálint; Ferguson, Richard K; Sarkans, Ugis; Brazma, Alvis; Salas, Rafael E Carazo; Swedlow, Jason R
2017-08-01
Access to primary research data is vital for the advancement of science. To extend the data types supported by community repositories, we built a prototype Image Data Resource (IDR) that collects and integrates imaging data acquired across many different imaging modalities. IDR links data from several imaging modalities, including high-content screening, super-resolution and time-lapse microscopy, digital pathology, public genetic or chemical databases, and cell and tissue phenotypes expressed using controlled ontologies. Using this integration, IDR facilitates the analysis of gene networks and reveals functional interactions that are inaccessible to individual studies. To enable re-analysis, we also established a computational resource based on Jupyter notebooks that allows remote access to the entire IDR. IDR is also an open source platform that others can use to publish their own image data. Thus IDR provides both a novel on-line resource and a software infrastructure that promotes and extends publication and re-analysis of scientific image data.
Point Analysis in Java applied to histological images of the perforant pathway: a user's account.
Scorcioni, Ruggero; Wright, Susan N; Patrick Card, J; Ascoli, Giorgio A; Barrionuevo, Germán
2008-01-01
The freeware Java tool Point Analysis in Java (PAJ), created to perform 3D point analysis, was tested in an independent laboratory setting. The input data consisted of images of the hippocampal perforant pathway from serial immunocytochemical localizations of the rat brain in multiple views at different resolutions. The low magnification set (x2 objective) comprised the entire perforant pathway, while the high magnification set (x100 objective) allowed the identification of individual fibers. A preliminary stereological study revealed a striking linear relationship between the fiber count at high magnification and the optical density at low magnification. PAJ enabled fast analysis for down-sampled data sets and a friendly interface with automated plot drawings. Noted strengths included the multi-platform support as well as the free availability of the source code, conducive to a broad user base and maximum flexibility for ad hoc requirements. PAJ has great potential to extend its usability by (a) improving its graphical user interface, (b) increasing its input size limit, (c) improving response time for large data sets, and (d) potentially being integrated with other Java graphical tools such as ImageJ.
Highly Integrated THz Receiver Systems for Small Satellite Remote Sensing Applications
NASA Technical Reports Server (NTRS)
Groppi, Christopher; Hunter, Roger C.; Baker, Christopher
2017-01-01
We are developing miniaturized, highly integrated Schottky receiver systems suitable for use in CubeSats or other small spacecraft platforms, where state-of-the-art performance and ultra-low mass, power, and volume are required. Current traditional Schottky receivers are too large to employ on a CubeSat. We will develop highly integrated receivers operating from 520-600 GHz and 1040-1200 GHz that are based on state-of-the-art receivers already developed at Jet Propulsion Laboratory (JPL) by using novel 3D multi-layer packaging. This process will reduce both mass and volume by more than an order of magnitude, while preserving state-of-the-art noise performance. The resulting receiver systems will have a volume of approximately 25 x 25 x 40 millimeters (mm), a mass of 250 grams (g), and power consumption on the order of 7 watts (W). Using these techniques, we will also integrate both receivers into a single frame, further reducing mass and volume for applications where dual band operation is advantageous. Additionally, as Schottky receivers offer significant gains in noise performance when cooled to 100 K, we will investigate the improvement gained by passively cooling these receivers. Work by Sierra Lobo Inc., with their Cryo Cube technology development program, offers the possibility of passive cooling to 100 K on CubeSat platforms for 1-unit (1U) sized instruments.
NASA Astrophysics Data System (ADS)
Gordov, Evgeny; Lykosov, Vasily; Krupchatnikov, Vladimir; Okladnikov, Igor; Titov, Alexander; Shulgina, Tamara
2013-04-01
Analysis of the growing volume of climate-change-related data from sensors and model outputs requires collaborative multidisciplinary efforts of researchers. To do this in a timely and reliable way, a modern information-computational infrastructure supporting integrated studies in the environmental sciences is needed. The recently developed experimental software and hardware platform Climate (http://climate.scert.ru/) provides the required environment for investigations of regional climate change. The platform combines a modern Web 2.0 approach, GIS functionality, and capabilities to run climate and meteorological models, process large geophysical datasets and support relevant analysis. It also supports joint software development by distributed research groups and the organization of thematic education for students and post-graduate students. In particular, the platform software includes dedicated modules for numerical processing of regional and global modeling results for subsequent analysis and visualization. Runs of the WRF and «Planet Simulator» models integrated into the platform, together with preprocessing and visualization of the modeling results, are also provided. All functions of the platform are accessible through a web portal using a common graphical web browser, in the form of an interactive graphical user interface that provides, in particular, selection of the geographical region of interest (pan and zoom), manipulation of data layers (order, enable/disable, feature extraction) and visualization of results. The platform provides users with capabilities for heterogeneous geophysical data analysis, including high-resolution data, and for discovering tendencies in climatic and ecosystem changes within different multidisciplinary research projects. Using it, even an unskilled user without specific knowledge can perform reliable computational processing and visualization of large meteorological, climatic and satellite monitoring datasets through a unified graphical web interface. Partial support of RF Ministry of Education and Science grant 8345, SB RAS Program VIII.80.2 and Projects 69, 131, 140 and the APN CBA2012-16NSY project is acknowledged.
Wafer integrated micro-scale concentrating photovoltaics
NASA Astrophysics Data System (ADS)
Gu, Tian; Li, Duanhui; Li, Lan; Jared, Bradley; Keeler, Gordon; Miller, Bill; Sweatt, William; Paap, Scott; Saavedra, Michael; Das, Ujjwal; Hegedus, Steve; Tauke-Pedretti, Anna; Hu, Juejun
2017-09-01
Recent development of a novel micro-scale PV/CPV technology is presented. The Wafer Integrated Micro-scale PV approach (WPV) seamlessly integrates multijunction micro-cells with a multi-functional silicon platform that provides optical micro-concentration, hybrid photovoltaics, and mechanical micro-assembly. The wafer-embedded micro-concentrating elements are shown to considerably improve the concentration-acceptance-angle product, potentially leading to dramatically reduced module materials and fabrication costs, sufficient angular tolerance for low-cost trackers, and an ultra-compact optical architecture, which makes the WPV module compatible with commercial flat panel infrastructures. The PV/CPV hybrid architecture further allows the collection of both direct and diffuse sunlight, thus extending the geographic and market domains for cost-effective PV system deployment. The WPV approach can potentially benefit from both the high performance of multijunction cells and the low cost of flat plate Si PV systems.
NASA Astrophysics Data System (ADS)
McGillivary, P. A.; Borges de Sousa, J.; Martins, R.; Rajan, K.
2012-12-01
Autonomous platforms are increasingly used as components of Integrated Ocean Observing Systems and oceanographic research cruises. Systems deployed can include gliders or propeller-driven autonomous underwater vessels (AUVs), autonomous surface vessels (ASVs), and unmanned aircraft systems (UAS). Prior field campaigns have demonstrated successful communication, sensor data fusion and visualization for studies using gliders and AUVs. However, additional requirements exist for incorporating ASVs and UASs into ship operations. Optimally integrating these systems into research vessel data management and operational planning systems involves addressing three key issues: real-time field data availability, platform coordination, and data archiving for later analysis. A fleet of AUVs, ASVs and UAS deployed from a research vessel is best operated as a system integrated with the ship, provided communications among them can be sustained. For this purpose, Disruption Tolerant Networking (DTN) software protocols for operation in communication-challenged environments help ensure reliable high-bandwidth communications. Additionally, system components need to have considerable onboard autonomy, namely adaptive sampling capabilities using their own onboard sensor data stream analysis. We discuss Oceanographic Decision Support System (ODSS) software currently used for situational awareness and planning onshore; in the near future, event detection and response will be coordinated among multiple vehicles. Results from recent field studies from oceanographic research vessels using AUVs, ASVs and UAS, including the Rapid Environmental Picture (REP-12) cruise, are presented describing methods and results for use of multi-vehicle communication and deliberative control networks, adaptive sampling with single and multiple platforms, issues relating to data management and archiving, and finally challenges that remain in addressing these technological issues. Significantly, the use of UAS on oceanographic research vessels is just beginning. We report on several initial field efforts which demonstrated that UAS improve spatial and temporal mapping of ocean features, as well as monitoring of marine mammal populations, ocean color, sea ice, wave fields and air-sea gas exchange. These studies, however, also confirm the challenges for shipboard computer systems in ingesting and archiving UAS high-resolution video, SAR and lidar data. We describe the successful inclusion of DTN communications for: 1) passing video data between two UAS or between a UAS and the ship; 2) including ASVs as communication nodes for AUVs; and 3) extending adaptive sampling software from AUVs and ASVs to UAS. In conclusion, we describe how autonomous sampling systems may be best integrated into shipboard oceanographic vessel research to provide new and more comprehensive time-space ocean and atmospheric data collection that is important not only for scientific study, but also for sustainable ocean management, including emergency response capabilities. The recent examples of such integrated studies highlighted here confirm that ocean and atmospheric studies can be pursued more cost-effectively, and in some cases can only be accomplished, by combining underwater, surface and aircraft autonomous systems with research vessel operations.
Oseev, Aleksandr; Lucklum, Ralf; Zubtsov, Mikhail; Schmidt, Marc-Peter; Mukhin, Nikolay V; Hirsch, Soeren
2017-09-23
The current work demonstrates a novel surface acoustic wave (SAW) based phononic crystal sensor approach that allows the integration of a velocimetry-based sensor concept into single chip integrated solutions, such as Lab-on-a-Chip devices. The introduced sensor platform merges advantages of ultrasonic velocimetry analytic systems and a microacoustic sensor approach. It is based on the analysis of structural resonances in a periodic composite arrangement of microfluidic channels confined within a liquid analyte. Completed theoretical and experimental investigations show the ability to utilize periodic structure localized modes for the detection of volumetric properties of liquids and prove the efficacy of the proposed sensor concept.
Huang, Zhenzhen; Duan, Huilong; Li, Haomin
2015-01-01
Large-scale human cancer genomics projects, such as TCGA, have generated large volumes of genomic data for further study. Exploring and mining these data to obtain meaningful analysis results can help researchers find potential genomic alterations that intervene in the development and metastasis of tumors. We developed a web-based gene analysis platform, named TCGA4U, which uses statistical methods and models to help translational investigators explore, mine and visualize human cancer genomic characteristics from the TCGA datasets. Furthermore, through Gene Ontology (GO) annotation and clinical data integration, the genomic data were transformed into biological process, molecular function and cellular component annotations and survival curves to help researchers identify potential driver genes. Clinical researchers without expertise in data analysis will benefit from such a user-friendly genomic analysis platform.
Kriechbaumer, Thomas; Blackburn, Kim; Breckon, Toby P.; Hamilton, Oliver; Rivas Casado, Monica
2015-01-01
Autonomous survey vessels can increase the efficiency and availability of wide-area river environment surveying as a tool for environment protection and conservation. A key challenge is the accurate localisation of the vessel, where bank-side vegetation or urban settlement preclude the conventional use of line-of-sight global navigation satellite systems (GNSS). In this paper, we evaluate unaided visual odometry, via an on-board stereo camera rig attached to the survey vessel, as a novel, low-cost localisation strategy. Feature-based and appearance-based visual odometry algorithms are implemented on a six degrees of freedom platform operating under guided motion, but stochastic variation in yaw, pitch and roll. Evaluation is based on a 663 m-long trajectory (>15,000 image frames) and statistical error analysis against ground truth position from a target tracking tachymeter integrating electronic distance and angular measurements. The position error of the feature-based technique (mean of ±0.067 m) is three times smaller than that of the appearance-based algorithm. From multi-variable statistical regression, we are able to attribute this error to the depth of tracked features from the camera in the scene and variations in platform yaw. Our findings inform effective strategies to enhance stereo visual localisation for the specific application of river monitoring. PMID:26694411
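A minimal feature-based, frame-to-frame visual odometry step can be sketched with OpenCV as follows; this monocular sketch recovers the relative pose only up to scale (the paper's stereo rig additionally provides metric scale), and the camera intrinsics and image file names are illustrative assumptions:

# Minimal feature-based visual odometry between two consecutive frames using
# OpenCV (monocular, pose recovered only up to scale). Intrinsics K and the
# frame file names are illustrative assumptions, not the paper's setup.
import cv2
import numpy as np

def relative_pose(img_prev, img_curr, K):
    orb = cv2.ORB_create(2000)
    kp1, des1 = orb.detectAndCompute(img_prev, None)
    kp2, des2 = orb.detectAndCompute(img_curr, None)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)[:500]
    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])
    E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC, threshold=1.0)
    _, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
    return R, t            # rotation and unit-norm translation direction

K = np.array([[700.0, 0.0, 320.0], [0.0, 700.0, 240.0], [0.0, 0.0, 1.0]])  # assumed intrinsics
prev = cv2.imread("frame_000.png", cv2.IMREAD_GRAYSCALE)   # hypothetical frames
curr = cv2.imread("frame_001.png", cv2.IMREAD_GRAYSCALE)
R, t = relative_pose(prev, curr, K)
print("yaw change (deg):", np.degrees(np.arctan2(R[1, 0], R[0, 0])))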
A generic flexible and robust approach for intelligent real-time video-surveillance systems
NASA Astrophysics Data System (ADS)
Desurmont, Xavier; Delaigle, Jean-Francois; Bastide, Arnaud; Macq, Benoit
2004-05-01
In this article we present a generic, flexible and robust approach for an intelligent real-time video-surveillance system. A previous version of the system was presented in [1]. The goal of these advanced tools is to provide help to operators by detecting events of interest in visual scenes, highlighting alarms and computing statistics. The proposed system is a multi-camera platform able to handle different standards of video inputs (composite, IP, IEEE1394) and which can basically compress (MPEG4), store and display them. This platform also integrates advanced video analysis tools, such as motion detection, segmentation, tracking and interpretation. The design of the architecture is optimised for playback, display, and processing of video flows in an efficient way for video-surveillance applications. The implementation is distributed on a scalable computer cluster based on Linux and an IP network. It relies on POSIX threads for multitasking scheduling. Data flows are transmitted between the different modules using multicast technology and under control of a TCP-based command network (e.g. for bandwidth occupation control). We report here some results and we show the potential use of such a flexible system in third-generation video surveillance systems. We illustrate the interest of the system in a real case study, namely indoor surveillance.
MultiSpec: A Desktop and Online Geospatial Image Data Processing Tool
NASA Astrophysics Data System (ADS)
Biehl, L. L.; Hsu, W. K.; Maud, A. R. M.; Yeh, T. T.
2017-12-01
MultiSpec is an easy to learn and use, freeware image processing tool for interactively analyzing a broad spectrum of geospatial image data, with capabilities such as image display, unsupervised and supervised classification, feature extraction, feature enhancement, and several other functions. Originally developed for Macintosh and Windows desktop computers, it has a community of several thousand users worldwide, including researchers and educators, as a practical and robust solution for analyzing multispectral and hyperspectral remote sensing data in several different file formats. More recently MultiSpec was adapted to run in the HUBzero collaboration platform so that it can be used within a web browser, allowing new user communities to be engaged through science gateways. MultiSpec Online has also been extended to interoperate with other components (e.g., data management) in HUBzero through integration with the geospatial data building blocks (GABBs) project. This integration enables a user to directly launch MultiSpec Online from data that is stored and/or shared in a HUBzero gateway and to save output data from MultiSpec Online to hub storage, allowing data sharing and multi-step workflows without having to move data between different systems. MultiSpec has also been used in K-12 classes for which one example is the GLOBE program (www.globe.gov) and in outreach material such as that provided by the USGS (eros.usgs.gov/educational-activities). MultiSpec Online now provides teachers with another way to use MultiSpec without having to install the desktop tool. Recently MultiSpec Online was used in a geospatial data session with 30-35 middle school students at the Turned Onto Technology and Leadership (TOTAL) Camp in the summers of 2016 and 2017 at Purdue University. The students worked on a flood mapping exercise using Landsat 5 data to learn about land remote sensing using supervised classification techniques. Online documentation is available for MultiSpec (engineering.purdue.edu/biehl/MultiSpec/) including a reference manual and several tutorials allowing young high-school students through research faculty to learn the basic functions in MultiSpec. Some of the tutorials have been translated to other languages by MultiSpec users.
The NIH Common Fund Human Biomolecular Atlas Program (HuBMAP) aims to develop a framework for functionally mapping the human body at cellular resolution to enhance our understanding of the relationship between cellular organization and function. HuBMAP will accelerate the development of the next generation of tools and techniques to generate 3D tissue maps using validated high-content, high-throughput imaging and omics assays, and establish an open data platform for integrating and visualizing data to build multi-dimensional maps.
2001-04-19
KENNEDY SPACE CENTER, FLA. -- Spring leaves frame the launch of Space Shuttle Endeavour on mission STS-100, the ninth flight to the International Space Station. Liftoff occurred at 2:40:42 p.m. EDT. The 11-day mission will deliver and integrate the Spacelab Logistics Pallet/Launch Deployment Assembly, which includes the Space Station Remote Manipulator System and the UHF Antenna. The mission includes two planned spacewalks for installation of the SSRMS on the Station. Also onboard is the Multi-Purpose Logistics Module Raffaello, carrying resupply stowage racks and resupply/return stowage platform
2001-04-19
KENNEDY SPACE CENTER, FLA. -- Spring leaves frame Space Shuttle Endeavour as the water captures the launch of mission STS-100. Liftoff of Endeavour on the ninth flight to the International Space Station occurred at 2:40:42 p.m. EDT. The 11-day mission will deliver and integrate the Spacelab Logistics Pallet/Launch Deployment Assembly, which includes the Space Station Remote Manipulator System and the UHF Antenna. The mission includes two planned spacewalks for installation of the SSRMS on the Station. Also onboard is the Multi-Purpose Logistics Module Raffaello, carrying resupply stowage racks and resupply/return stowage platforms
NASA Technical Reports Server (NTRS)
Pitts, Robert Lee
2012-01-01
Goals of this activity: Test the Huntsville Operations Support Center (HOSC) Delay/Disruption Tolerant Networking (DTN) Gateway for operational use. Current activity includes: (1) Test the implementation of a new DTN2 gateway at the HOSC; (2) Confirm integration of DTN nodes into the S-band uplink and Ku-band downlink of the ISS for limited use; (3) Implement Aggregate Custody Signal to ISS platforms; (4) Verify operational support for Colorado University (CU) onboard components; (5) Verify ability to support Multi-Purpose End-To-End Robotic Operation Network (METERON) OpsCon-2.
University of Virginia suborbital infrared sensing experiment
NASA Astrophysics Data System (ADS)
Holland, Stephen; Nunnally, Clayton; Armstrong, Sarah; Laufer, Gabriel
2002-03-01
An Orion sounding rocket launched from Wallops Flight Facility carried a University of Virginia payload to an altitude of 47 km and returned infrared measurements of the Earth's upper atmosphere and video images of the ocean. The payload launch was the result of a three-year undergraduate design project by a multi-disciplinary student group from the University of Virginia and James Madison University. As part of a new multi-year design course, undergraduate students designed, built, tested, and participated in the launch of a suborbital platform from which atmospheric remote sensors and other scientific experiments could operate. The first launch included a simplified atmospheric measurement system intended to demonstrate full system operation and remote sensing capabilities during suborbital flight. A thermoelectrically cooled HgCdTe infrared detector, with peak sensitivity at 10 micrometers, measured upwelling radiation, and a small camera and VCR system, aligned with the infrared sensor, provided a ground reference. Additionally, a simple orientation sensor, consisting of three photodiodes equipped with red, green, and blue dichroic filters, was tested. Temperature measurements of the upper atmosphere were successfully obtained during the flight. Video images were successfully recorded on-board the payload and proved a valuable tool in the data analysis process. The photodiode system, intended as a replacement for the camera and VCR system, functioned well, despite low signal amplification. This fully integrated and flight-tested payload will serve as a platform for future atmospheric sensing experiments. It is currently being modified for a second suborbital flight that will incorporate a gas filter correlation radiometry (GFCR) instrument to measure the distribution of stratospheric methane and imaging capabilities to record the chlorophyll distribution in the Metompkin Bay as an indicator of pollution runoff.
ENFIN--A European network for integrative systems biology.
Kahlem, Pascal; Clegg, Andrew; Reisinger, Florian; Xenarios, Ioannis; Hermjakob, Henning; Orengo, Christine; Birney, Ewan
2009-11-01
Integration of biological data of various types and the development of adapted bioinformatics tools represent critical objectives to enable research at the systems level. The European Network of Excellence ENFIN is engaged in developing an adapted infrastructure to connect databases and platforms to enable both the generation of new bioinformatics tools and the experimental validation of computational predictions. With the aim of bridging the gap existing between standard wet laboratories and bioinformatics, the ENFIN Network runs integrative research projects to bring the latest computational techniques to bear directly on questions dedicated to systems biology in the wet laboratory environment. The Network maintains close internal collaboration between experimental and computational research, enabling a permanent cycling of experimental validation and improvement of computational prediction methods. The computational work includes the development of a database infrastructure (EnCORE), bioinformatics analysis methods and a novel platform for protein function analysis, FuncNet.
A Framework for Daylighting Optimization in Whole Buildings with OpenStudio
DOE Office of Scientific and Technical Information (OSTI.GOV)
None
2016-08-12
We present a toolkit and workflow for leveraging the OpenStudio (Guglielmetti et al. 2010) platform to perform daylighting analysis and optimization in a whole building energy modeling (BEM) context. We have re-implemented OpenStudio's integrated Radiance and EnergyPlus functionality as an OpenStudio Measure. The OpenStudio Radiance Measure works within the OpenStudio Application and Parametric Analysis Tool, as well as the OpenStudio Server large scale analysis framework, allowing a rigorous daylighting simulation to be performed on a single building model or potentially an entire population of programmatically generated models. The Radiance simulation results can automatically inform the broader building energy model, and provide dynamic daylight metrics as a basis for decision. Through introduction and example, this paper illustrates the utility of the OpenStudio building energy modeling platform to leverage existing simulation tools for integrated building energy performance simulation, daylighting analysis, and reportage.
Médigue, Claudine; Calteau, Alexandra; Cruveiller, Stéphane; Gachet, Mathieu; Gautreau, Guillaume; Josso, Adrien; Lajus, Aurélie; Langlois, Jordan; Pereira, Hugo; Planel, Rémi; Roche, David; Rollin, Johan; Rouy, Zoe; Vallenet, David
2017-09-12
The overwhelming list of new bacterial genomes becoming available on a daily basis makes accurate genome annotation an essential step that ultimately determines the relevance of thousands of genomes stored in public databanks. The MicroScope platform (http://www.genoscope.cns.fr/agc/microscope) is an integrative resource that supports systematic and efficient revision of microbial genome annotation, data management and comparative analysis. Starting from the results of our syntactic, functional and relational annotation pipelines, MicroScope provides an integrated environment for the expert annotation and comparative analysis of prokaryotic genomes. It combines tools and graphical interfaces to analyze genomes and to perform the manual curation of gene function in a comparative genomics and metabolic context. In this article, we describe the free-of-charge MicroScope services for the annotation and analysis of microbial (meta)genomes, transcriptomic and re-sequencing data. Then, the functionalities of the platform are presented in a way providing practical guidance and help to the nonspecialists in bioinformatics. Newly integrated analysis tools (i.e. prediction of virulence and resistance genes in bacterial genomes) and original method recently developed (the pan-genome graph representation) are also described. Integrated environments such as MicroScope clearly contribute, through the user community, to help maintaining accurate resources. © The Author 2017. Published by Oxford University Press.
NASA Tech Briefs, November 2012
NASA Technical Reports Server (NTRS)
2012-01-01
The topics include: Visual System for Browsing, Analysis, and Retrieval of Data (ViSBARD); Time-Domain Terahertz Computed Axial Tomography NDE System; Adaptive Sampling of Time Series During Remote Exploration; A Tracking Sun Photometer Without Moving Parts; Surface Temperature Data Analysis; Modular, Autonomous Command and Data Handling Software with Built-In Simulation and Test; In-Situ Wire Damage Detection System; Amplifier Module for 260-GHz Band Using Quartz Waveguide Transitions; Wideband Agile Digital Microwave Radiometer; Buckyball Nucleation of HiPco Tubes; FACT, Mega-ROSA, SOLAROSA; An Integrated, Layered-Spinel Composite Cathode for Energy Storage Applications; Engineered Multifunctional Surfaces for Fluid Handling; Polyolefin-Based Aerogels; Adjusting Permittivity by Blending Varying Ratios of SWNTs; Gravity-Assist Mechanical Simulator for Outreach; Concept for Hydrogen-Impregnated Nanofiber/Photovoltaic Cargo Stowage System; DROP: Durable Reconnaissance and Observation Platform; Developing Physiologic Models for Emergency Medical Procedures Under Microgravity; Spectroscopic Chemical Analysis Methods and Apparatus; Low Average Sidelobe Slot Array Antennas for Radiometer Applications; Motion-Corrected 3D Sonic Anemometer for Tethersondes and Other Moving Platforms; Water Treatment Systems for Long Spaceflights; Microchip Non-Aqueous Capillary Electrophoresis (MicronNACE) Method to Analyze Long-Chain Primary Amines; Low-Cost Phased Array Antenna for Sounding Rockets, Missiles, and Expendable Launch Vehicles; Mars Science Laboratory Engineering Cameras; Seismic Imager Space Telescope; Estimating Sea Surface Salinity and Wind Using Combined Passive and Active L-Band Microwave Observations; A Posteriori Study of a DNS Database Describing Super critical Binary-Species Mixing; Scalable SCPPM Decoder; QuakeSim 2.0; HURON (HUman and Robotic Optimization Network) Multi-Agent Temporal Activity Planner/Scheduler; MPST Software: MoonKommand
GARNET--gene set analysis with exploration of annotation relations.
Rho, Kyoohyoung; Kim, Bumjin; Jang, Youngjun; Lee, Sanghyun; Bae, Taejeong; Seo, Jihae; Seo, Chaehwa; Lee, Jihyun; Kang, Hyunjung; Yu, Ungsik; Kim, Sunghoon; Lee, Sanghyuk; Kim, Wan Kyu
2011-02-15
Gene set analysis is a powerful method of deducing biological meaning for an a priori defined set of genes. Numerous tools have been developed to test statistical enrichment or depletion in specific pathways or gene ontology (GO) terms. Major difficulties towards biological interpretation are integrating diverse types of annotation categories and exploring the relationships between annotation terms of similar information. GARNET (Gene Annotation Relationship NEtwork Tools) is an integrative platform for gene set analysis with many novel features. It includes tools for retrieval of genes from annotation database, statistical analysis & visualization of annotation relationships, and managing gene sets. In an effort to allow access to a full spectrum of amassed biological knowledge, we have integrated a variety of annotation data that include the GO, domain, disease, drug, chromosomal location, and custom-defined annotations. Diverse types of molecular networks (pathways, transcription and microRNA regulations, protein-protein interaction) are also included. The pair-wise relationship between annotation gene sets was calculated using kappa statistics. GARNET consists of three modules--gene set manager, gene set analysis and gene set retrieval, which are tightly integrated to provide virtually automatic analysis for gene sets. A dedicated viewer for annotation network has been developed to facilitate exploration of the related annotations. GARNET (gene annotation relationship network tools) is an integrative platform for diverse types of gene set analysis, where complex relationships among gene annotations can be easily explored with an intuitive network visualization tool (http://garnet.isysbio.org/ or http://ercsb.ewha.ac.kr/garnet/).
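The pair-wise kappa statistic between two annotation gene sets can be illustrated directly: treat membership in each set as a binary rating over a common gene universe and compute Cohen's kappa. The gene sets and universe below are synthetic placeholders, not GARNET's actual annotation categories:

# Sketch of the kappa statistic between two annotation gene sets, computed over
# a common gene universe (set names and genes are illustrative placeholders).
def cohens_kappa(set_a, set_b, universe):
    genes = sorted(universe)
    a = [g in set_a for g in genes]
    b = [g in set_b for g in genes]
    n = len(genes)
    observed = sum(x == y for x, y in zip(a, b)) / n          # observed agreement
    p_a, p_b = sum(a) / n, sum(b) / n
    expected = p_a * p_b + (1 - p_a) * (1 - p_b)              # chance agreement
    return (observed - expected) / (1 - expected)

universe = {f"gene{i}" for i in range(1000)}
go_term  = {f"gene{i}" for i in range(0, 120)}     # e.g. genes annotated to a GO term
pathway  = {f"gene{i}" for i in range(60, 200)}    # e.g. genes in a pathway
print("kappa =", round(cohens_kappa(go_term, pathway, universe), 3))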
Integrated generation of complex optical quantum states and their coherent control
NASA Astrophysics Data System (ADS)
Roztocki, Piotr; Kues, Michael; Reimer, Christian; Romero Cortés, Luis; Sciara, Stefania; Wetzel, Benjamin; Zhang, Yanbing; Cino, Alfonso; Chu, Sai T.; Little, Brent E.; Moss, David J.; Caspani, Lucia; Azaña, José; Morandotti, Roberto
2018-01-01
Complex optical quantum states based on entangled photons are essential for investigations of fundamental physics and are at the heart of applications in quantum information science. Recently, integrated photonics has become a leading platform for the compact, cost-efficient, and stable generation and processing of optical quantum states. However, on-chip sources are currently limited to basic two-dimensional (qubit) two-photon states, whereas scaling the state complexity requires access to states composed of several (>2) photons and/or exhibiting high photon dimensionality. Here we show that the use of integrated frequency combs (on-chip light sources with a broad spectrum of evenly-spaced frequency modes) based on high-Q nonlinear microring resonators can provide solutions for such scalable complex quantum state sources. In particular, by using spontaneous four-wave mixing within the resonators, we demonstrate the generation of bi- and multi-photon entangled qubit states over a broad comb of channels spanning the S, C, and L telecommunications bands, and control these states coherently to perform quantum interference measurements and state tomography. Furthermore, we demonstrate the on-chip generation of entangled high-dimensional (quDit) states, where the photons are created in a coherent superposition of multiple pure frequency modes. Specifically, we confirm the realization of a quantum system with at least one hundred dimensions. Moreover, using off-the-shelf telecommunications components, we introduce a platform for the coherent manipulation and control of frequency-entangled quDit states. Our results suggest that microcavity-based entangled photon state generation and the coherent control of states using accessible telecommunications infrastructure introduce a powerful and scalable platform for quantum information science.
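For orientation, the high-dimensional frequency-bin entangled two-photon (quDit) states referred to above can be written schematically as an equal-amplitude superposition over d comb modes; this idealized form is a sketch for illustration, not the exact state reported:

% Schematic form of a d-dimensional frequency-bin entangled two-photon state
% (equal amplitudes assumed for illustration; the d comb lines are indexed by k):
\[
  |\Psi\rangle \;=\; \frac{1}{\sqrt{d}} \sum_{k=1}^{d}
  |\omega_k\rangle_{\mathrm{signal}} \otimes |\omega_k\rangle_{\mathrm{idler}}
\]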
Arrhythmia Classification Based on Multi-Domain Feature Extraction for an ECG Recognition System
Li, Hongqiang; Yuan, Danyang; Wang, Youxi; Cui, Dianyin; Cao, Lu
2016-01-01
Automatic recognition of arrhythmias is particularly important in the diagnosis of heart diseases. This study presents an electrocardiogram (ECG) recognition system based on multi-domain feature extraction to classify ECG beats. An improved wavelet threshold method for ECG signal pre-processing is applied to remove noise interference. A novel multi-domain feature extraction method is proposed; this method employs kernel-independent component analysis in nonlinear feature extraction and uses discrete wavelet transform to extract frequency domain features. The proposed system utilises a support vector machine classifier optimized with a genetic algorithm to recognize different types of heartbeats. An ECG acquisition experimental platform, in which ECG beats are collected as ECG data for classification, is constructed to demonstrate the effectiveness of the system in ECG beat classification. The presented system, when applied to the MIT-BIH arrhythmia database, achieves a high classification accuracy of 98.8%. Experimental results based on the ECG acquisition experimental platform show that the system obtains a satisfactory classification accuracy of 97.3% and is able to classify ECG beats efficiently for the automatic identification of cardiac arrhythmias. PMID:27775596
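The frequency-domain part of the pipeline, discrete-wavelet features per beat feeding a support vector machine, can be sketched as below; the kernel-independent component analysis step and the genetic-algorithm parameter search are omitted and replaced here by a plain grid search, so this is a simplified stand-in rather than the authors' full system, and segmented beats with labels are assumed to be available:

# Sketch of the frequency-domain part of the pipeline: discrete wavelet features
# per beat plus an SVM. The kernel-ICA nonlinear features and the genetic-
# algorithm search described above are replaced by a plain grid search for brevity.
import numpy as np
import pywt
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

def dwt_features(beat, wavelet="db4", level=4):
    coeffs = pywt.wavedec(beat, wavelet, level=level)
    # Energy of each sub-band as a compact frequency-domain descriptor.
    return np.array([np.sum(c ** 2) for c in coeffs])

def train(beats, labels):
    """beats: list of 1-D arrays (one segmented heartbeat each); labels: class ids."""
    X = np.vstack([dwt_features(b) for b in beats])
    clf = make_pipeline(StandardScaler(), SVC())
    grid = GridSearchCV(clf, {"svc__C": [1, 10, 100], "svc__gamma": ["scale", 0.01]}, cv=5)
    grid.fit(X, labels)
    return grid.best_estimator_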
expVIP: a Customizable RNA-seq Data Analysis and Visualization Platform
2016-01-01
The majority of transcriptome sequencing (RNA-seq) expression studies in plants remain underutilized and inaccessible due to the use of disparate transcriptome references and the lack of skills and resources to analyze and visualize these data. We have developed expVIP, an expression visualization and integration platform, which allows easy analysis of RNA-seq data combined with an intuitive and interactive interface. Users can analyze public and user-specified data sets with minimal bioinformatics knowledge using the expVIP virtual machine. This generates a custom Web browser to visualize, sort, and filter the RNA-seq data and provides outputs for differential gene expression analysis. We demonstrate expVIP’s suitability for polyploid crops and evaluate its performance across a range of biologically relevant scenarios. To exemplify its use in crop research, we developed a flexible wheat (Triticum aestivum) expression browser (www.wheat-expression.com) that can be expanded with user-generated data in a local virtual machine environment. The open-access expVIP platform will facilitate the analysis of gene expression data from a wide variety of species by enabling the easy integration, visualization, and comparison of RNA-seq data across experiments. PMID:26869702
De Diego, Nuria; Fürst, Tomáš; Humplík, Jan F; Ugena, Lydia; Podlešáková, Kateřina; Spíchal, Lukáš
2017-01-01
High-throughput plant phenotyping platforms provide new possibilities for automated, fast scoring of several plant growth and development traits, followed over time using non-invasive sensors. Using Arabidopsis as a model offers important advantages for high-throughput screening with the opportunity to extrapolate the results obtained to other crops of commercial interest. In this study we describe the development of a highly reproducible high-throughput Arabidopsis in vitro bioassay established using our OloPhen platform, suitable for analysis of rosette growth in multi-well plates. This method was successfully validated on the example of a multivariate analysis of Arabidopsis rosette growth at different salt concentrations and its interaction with varying nutritional composition of the growth medium. Several traits such as changes in the rosette area, relative growth rate, survival rate and homogeneity of the population are scored using fully automated RGB imaging and subsequent image analysis. The assay can be used for fast screening of the biological activity of chemical libraries, phenotypes of transgenic or recombinant inbred lines, or to search for potential quantitative trait loci. It is especially valuable for selecting genotypes or growth conditions that improve plant stress tolerance.
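The imaging-derived traits, rosette area and relative growth rate, can be sketched with a crude green-pixel segmentation; the threshold rule and pixel scale below are assumptions for illustration and not the OloPhen pipeline's actual image analysis:

# Illustrative green-pixel segmentation for rosette area and relative growth
# rate from top-view RGB images; thresholds and scale are assumptions only.
import numpy as np

def rosette_area_mm2(rgb, mm2_per_pixel=0.01):
    r = rgb[..., 0].astype(int)
    g = rgb[..., 1].astype(int)
    b = rgb[..., 2].astype(int)
    mask = (g > r) & (g > b) & (g > 40)     # crude "greener than red/blue" rule
    return mask.sum() * mm2_per_pixel

def relative_growth_rate(area_t0, area_t1, dt_days):
    # Classical RGR definition: (ln A1 - ln A0) / dt
    return (np.log(area_t1) - np.log(area_t0)) / dt_days

# Example: rosette areas (mm^2) measured 3 days apart.
print(round(relative_growth_rate(55.0, 140.0, 3.0), 3), "per day")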
A Control System and Streaming DAQ Platform with Image-Based Trigger for X-ray Imaging
NASA Astrophysics Data System (ADS)
Stevanovic, Uros; Caselle, Michele; Cecilia, Angelica; Chilingaryan, Suren; Farago, Tomas; Gasilov, Sergey; Herth, Armin; Kopmann, Andreas; Vogelgesang, Matthias; Balzer, Matthias; Baumbach, Tilo; Weber, Marc
2015-06-01
High-speed X-ray imaging applications play a crucial role for non-destructive investigations of the dynamics in material science and biology. On-line data analysis is necessary for quality assurance and data-driven feedback, leading to a more efficient use of a beam time and increased data quality. In this article we present a smart camera platform with embedded Field Programmable Gate Array (FPGA) processing that is able to stream and process data continuously in real-time. The setup consists of a Complementary Metal-Oxide-Semiconductor (CMOS) sensor, an FPGA readout card, and a readout computer. It is seamlessly integrated in a new custom experiment control system called Concert that provides a more efficient way of operating a beamline by integrating device control, experiment process control, and data analysis. The potential of the embedded processing is demonstrated by implementing an image-based trigger. It records the temporal evolution of physical events with increased speed while maintaining the full field of view. The complete data acquisition system, with Concert and the smart camera platform was successfully integrated and used for fast X-ray imaging experiments at KIT's synchrotron radiation facility ANKA.
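The image-based trigger can be sketched in software as frame differencing against a slowly updated reference, firing when the fraction of changed pixels exceeds a threshold; this host-side Python sketch is a stand-in for the embedded FPGA logic, and all thresholds are assumptions:

# Host-side sketch of an image-based trigger: compare each streamed frame to a
# rolling reference and fire when enough pixels change. Software stand-in for
# the FPGA logic described above; threshold values are assumptions.
import numpy as np

def make_trigger(diff_threshold=25.0, pixel_fraction=0.02, alpha=0.05):
    state = {"ref": None}
    def process(frame):
        f = frame.astype(np.float32)
        if state["ref"] is None:
            state["ref"] = f
            return False
        changed = np.abs(f - state["ref"]) > diff_threshold
        state["ref"] = (1 - alpha) * state["ref"] + alpha * f   # slow background update
        return changed.mean() > pixel_fraction                  # True -> record the event
    return process

trigger = make_trigger()
# for frame in camera_stream():            # hypothetical acquisition loop
#     if trigger(frame): save_event(frame)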
CISUS: an integrated 3D ultrasound system for IGT using a modular tracking API
NASA Astrophysics Data System (ADS)
Boctor, Emad M.; Viswanathan, Anand; Pieper, Steve; Choti, Michael A.; Taylor, Russell H.; Kikinis, Ron; Fichtinger, Gabor
2004-05-01
Ultrasound has become popular in clinical/surgical applications, both as the primary image guidance modality and also in conjunction with other modalities like CT or MRI. Three dimensional ultrasound (3DUS) systems have also demonstrated usefulness in image-guided therapy (IGT). At the same time, however, the current lack of open-source and open-architecture multi-modal medical visualization systems prevents 3DUS from fulfilling its potential. Several stand-alone 3DUS systems, like Stradx or In-Vivo, exist today. Although these systems have been found to be useful in real clinical settings, it is difficult to augment their functionality and integrate them in versatile IGT systems. To address these limitations, a robotic/freehand 3DUS open environment (CISUS) is being integrated into the 3D Slicer, an open-source research tool developed for medical image analysis and surgical planning. In addition, the system capitalizes on generic application programming interfaces (APIs) for tracking devices and robotic control. The resulting platform-independent open-source system may serve as a valuable tool to the image guided surgery community. Other researchers could straightforwardly integrate the generic CISUS system along with other functionalities (e.g. dual-view visualization, registration, real-time tracking, segmentation, etc.) to rapidly create their medical/surgical applications. Our current driving clinical application is robotically assisted and freehand 3DUS-guided liver ablation, which is being fully integrated under the CISUS-3D Slicer. Initial functionality and pre-clinical feasibility are demonstrated on phantom and ex-vivo animal models.
Modular Multi-Function Multi-Band Airborne Radio System (MFBARS). Volume II. Detailed Report.
1981-06-01
[Front-matter extract, garbled in conversion: list-of-figures entries include "Three Platforms in a Field of Hyperbolic LOP's" (p. 187), "Comparison, MFBARS Versus Baseline" (p. 190) and "Program Flow Chart"; surviving text fragments describe the ability to configure, from a set of common modules, a given total CNI capability on specific platforms for a given mission, and to take advantage of systems such as GPS (L-band, spread-spectrum navigation), SEEK TALK (UHF spread-spectrum communications), SINCGARS (VHF frequency-hopping communications, some platforms) and AFSATCOM (UHF).]
A self optimizing synthetic organic reactor system using real-time in-line NMR spectroscopy.
Sans, Victor; Porwol, Luzian; Dragone, Vincenza; Cronin, Leroy
2015-02-01
A configurable platform for synthetic chemistry incorporating an in-line benchtop NMR that is capable of monitoring and controlling organic reactions in real-time is presented. The platform is controlled via a modular LabView software control system for the hardware, NMR, data analysis and feedback optimization. Using this platform we report the real-time advanced structural characterization of reaction mixtures, including 19F, 13C, DEPT, 2D NMR spectroscopy (COSY, HSQC and 19F-COSY) for the first time. Finally, the potential of this technique is demonstrated through the optimization of a catalytic organic reaction in real-time, showing its applicability to self-optimizing systems using criteria such as stereoselectivity, multi-nuclear measurements or 2D correlations.
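The closed-loop optimization can be sketched as a derivative-free optimizer wrapped around a function that applies reactor set-points and scores the outcome from an in-line measurement. In the sketch below the experiment is replaced by a simulated yield surface, and Nelder-Mead stands in for whatever algorithm the LabView control system actually uses; on the real platform the simulated call would be replaced by hardware set-point and NMR integration routines:

# Sketch of a feedback-optimisation loop: a derivative-free optimiser adjusts
# reactor set-points and scores each "experiment". Here the experiment is a toy
# simulated yield surface; Nelder-Mead is a stand-in algorithm, not the paper's.
import numpy as np
from scipy.optimize import minimize

def simulated_yield(flow_rate, temperature):
    # Toy response surface with an optimum near 0.8 mL/min and 75 C (assumed).
    return 0.9 * np.exp(-((flow_rate - 0.8) / 0.4) ** 2
                        - ((temperature - 75.0) / 20.0) ** 2)

def objective(x):
    flow_rate, temperature = x
    return -simulated_yield(flow_rate, temperature)   # minimise negative yield

result = minimize(objective, x0=np.array([0.4, 55.0]), method="Nelder-Mead",
                  options={"xatol": 0.01, "fatol": 1e-3, "maxiter": 100})
print("best conditions (flow, T):", result.x, "estimated yield:", -result.fun)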
Device Data Ingestion for Industrial Big Data Platforms with a Case Study †
Ji, Cun; Shao, Qingshi; Sun, Jiao; Liu, Shijun; Pan, Li; Wu, Lei; Yang, Chenglei
2016-01-01
Despite having played a significant role in the Industry 4.0 era, the Internet of Things is currently faced with the challenge of how to ingest large-scale heterogeneous and multi-type device data. In response to this problem we present a heterogeneous device data ingestion model for an industrial big data platform. The model includes device templates and four strategies for data synchronization, data slicing, data splitting and data indexing, respectively. We can ingest device data from multiple sources with this heterogeneous device data ingestion model, which has been verified on our industrial big data platform. In addition, we present a case study on device data-based scenario analysis of industrial big data. PMID:26927121
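The device-template idea can be sketched as a small mapping layer that normalizes each device type's raw payload to a common record and assigns a time-slice key for indexing; the template names, fields and slicing granularity below are illustrative assumptions, not the paper's schema:

# Sketch of template-based ingestion for heterogeneous device data: each template
# maps a device's raw payload onto a common record, and records receive a time
# bucket key for slicing and indexing. Field names are illustrative only.
from datetime import datetime, timezone

TEMPLATES = {
    "temp_sensor_v1": lambda raw: {"value": float(raw["t"]), "unit": "C"},
    "vibration_v2":   lambda raw: {"value": float(raw["rms_mm_s"]), "unit": "mm/s"},
}

def ingest(device_id, device_type, raw_payload, ts=None):
    ts = ts or datetime.now(timezone.utc)
    record = {"device": device_id, "type": device_type, "time": ts.isoformat()}
    record.update(TEMPLATES[device_type](raw_payload))      # template-driven mapping
    record["slice"] = ts.strftime("%Y-%m-%dT%H")             # hourly slice key for indexing
    return record

print(ingest("press-07", "temp_sensor_v1", {"t": "81.4"}))
print(ingest("press-07", "vibration_v2", {"rms_mm_s": 2.3}))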
Lee, SangWook; Lee, Jong Hyun; Kwon, Hyuck Gi; Laurell, Thomas; Jeong, Ok Chan; Kim, Soyoun
2018-01-01
Here, we report a sol-gel integrated affinity microarray for on-chip matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS) that enables capture and identification of prostate-specific antigen (PSA) in samples. An anti-PSA antibody (H117) was mixed with a sol-gel, and the mixture was spotted onto a porous silicon (pSi) surface without additional surface modifications. The antibody easily penetrates the sol-gel macropore fluidic network structure, making possible high affinities. To assess the capture affinity of the platform, we performed a direct assay using fluorescein isothiocyanate-labeled PSA. Pure PSA was subjected to on-chip MALDI-TOF-MS analysis, yielding three clear mass peptide peaks (m/z = 1272, 1407, and 1872). The sol-gel microarray platform enables dual readout of PSA by both fluorometric and MALDI-TOF-MS analysis in biological samples. Here we report a useful means for the discovery of biomarkers in complex body fluids.
ERIC Educational Resources Information Center
Naidoo, Devika
2010-01-01
This paper provides an analysis of the extent of integration at a historically advantaged school. A qualitative multi-method case study allowed for in-depth analysis of integration in the school. Bernstein's theory of code, classification, boundary and power framed the study. Data analysis showed that: racial desegregation was achieved at student…
From Synergy to Complexity: The Trend Toward Integrated Value Chain and Landscape Governance.
Ros-Tonen, Mirjam A F; Reed, James; Sunderland, Terry
2018-07-01
This Editorial introduces a special issue that illustrates a trend toward integrated landscape approaches. Whereas two papers echo older "win-win" strategies based on the trade of non-timber forest products, ten papers reflect a shift from a product to landscape perspective. However, they differ from integrated landscape approaches in that they emanate from sectorial approaches driven primarily by aims such as forest restoration, sustainable commodity sourcing, natural resource management, or carbon emission reduction. The potential of such initiatives for integrated landscape governance and achieving landscape-level outcomes has hitherto been largely unaddressed in the literature on integrated landscape approaches. This special issue addresses this gap, with a focus on actor constellations and institutional arrangements emerging in the transition from sectorial to integrated approaches. This editorial discusses the trends arising from the papers, including the need for a commonly shared concern and sense of urgency; inclusive stakeholder engagement; accommodating and coordinating polycentric governance in landscapes beset with institutional fragmentation and jurisdictional mismatches; alignment with locally embedded initiatives and governance structures; and a framework to assess and monitor the performance of integrated multi-stakeholder approaches. We conclude that, despite a growing tendency toward integrated approaches at the landscape level, inherent landscape complexity renders persistent and significant challenges such as balancing multiple objectives, equitable inclusion of all relevant stakeholders, dealing with power and gender asymmetries, adaptive management based on participatory outcome monitoring, and moving beyond existing administrative, jurisdictional, and sectorial silos. Multi-stakeholder platforms and bridging organizations and individuals are seen as key in overcoming such challenges.
Wang, Sibo; Ren, Zheng; Guo, Yanbing; ...
2016-03-21
We report that the scalable three-dimensional (3-D) integration of functional nanostructures into applicable platforms represents a promising technology to meet the ever-increasing demands of fabricating high-performance devices featuring cost-effectiveness, structural sophistication and multi-functionality. Such an integration process generally involves a diverse array of nanostructural entities (nano-entities) consisting of dissimilar nanoscale building blocks such as nanoparticles, nanowires, and nanofilms made of metals, ceramics, or polymers. Various synthetic strategies and integration methods have enabled the successful assembly of both structurally and functionally tailored nano-arrays into a unique class of monolithic devices. The performance of nano-array based monolithic devices is dictated by a few important factors such as materials substrate selection, nanostructure composition and nano-architecture geometry. Therefore, the rational material selection and nano-entity manipulation during the nano-array integration process, aiming to exploit the advantageous characteristics of nanostructures and their ensembles, are critical steps towards bridging the design of nanostructure integrated monolithic devices with various practical applications. In this article, we highlight the latest research progress of the two-dimensional (2-D) and 3-D metal and metal oxide based nanostructural integrations into prototype devices with ultrahigh efficiency, good robustness and improved functionality. Lastly, selective examples of nano-array integration, scalable nanomanufacturing and representative monolithic devices such as catalytic converters, sensors and batteries will be utilized as the connecting dots to display a roadmap from hierarchical nanostructural assembly to practical nanotechnology implications ranging from energy, environmental, to chemical and biotechnology areas.
xQTL workbench: a scalable web environment for multi-level QTL analysis.
Arends, Danny; van der Velde, K Joeri; Prins, Pjotr; Broman, Karl W; Möller, Steffen; Jansen, Ritsert C; Swertz, Morris A
2012-04-01
xQTL workbench is a scalable web platform for the mapping of quantitative trait loci (QTLs) at multiple levels: for example, gene expression (eQTL), protein abundance (pQTL), metabolite abundance (mQTL) and phenotype (phQTL) data. Popular QTL mapping methods for model organism and human populations are accessible via the web user interface. Large calculations scale easily onto multi-core computers, clusters and the Cloud. All data involved can be uploaded and queried online: markers, genotypes, microarrays, NGS, LC-MS, GC-MS, NMR, etc. When new data types become available, xQTL workbench is quickly customized using the Molgenis software generator. xQTL workbench runs on all common platforms, including Linux, Mac OS X and Windows. An online demo system, installation guide, tutorials, software and source code are available under the LGPL3 license from http://www.xqtl.org. Contact: m.a.swertz@rug.nl.
xQTL workbench: a scalable web environment for multi-level QTL analysis
Arends, Danny; van der Velde, K. Joeri; Prins, Pjotr; Broman, Karl W.; Möller, Steffen; Jansen, Ritsert C.; Swertz, Morris A.
2012-01-01
Summary: xQTL workbench is a scalable web platform for the mapping of quantitative trait loci (QTLs) at multiple levels: for example, gene expression (eQTL), protein abundance (pQTL), metabolite abundance (mQTL) and phenotype (phQTL) data. Popular QTL mapping methods for model organism and human populations are accessible via the web user interface. Large calculations scale easily onto multi-core computers, clusters and the Cloud. All data involved can be uploaded and queried online: markers, genotypes, microarrays, NGS, LC-MS, GC-MS, NMR, etc. When new data types become available, xQTL workbench is quickly customized using the Molgenis software generator. Availability: xQTL workbench runs on all common platforms, including Linux, Mac OS X and Windows. An online demo system, installation guide, tutorials, software and source code are available under the LGPL3 license from http://www.xqtl.org. Contact: m.a.swertz@rug.nl PMID:22308096
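As a point of reference for the kind of mapping the workbench exposes, the sketch below shows what a basic single-marker QTL scan computes: a trait (e.g., one gene's expression, for eQTL) is regressed on the genotype at each marker and a LOD score is reported per marker. This is a generic illustration under simplifying assumptions, not the xQTL workbench implementation or its underlying mapping engines.

# Minimal single-marker QTL scan (generic illustration, not xQTL workbench code).
import numpy as np

def single_marker_scan(genotypes, trait):
    """genotypes: (n_individuals, n_markers) coded 0/1/2; trait: (n_individuals,)."""
    n, n_markers = genotypes.shape
    rss0 = np.sum((trait - trait.mean()) ** 2)       # null model: intercept only
    lod = np.empty(n_markers)
    for m in range(n_markers):
        X = np.column_stack([np.ones(n), genotypes[:, m]])
        beta, _, _, _ = np.linalg.lstsq(X, trait, rcond=None)
        rss1 = np.sum((trait - X @ beta) ** 2)
        lod[m] = (n / 2) * np.log10(rss0 / rss1)     # LOD score for the linear model
    return lod

# Toy example with simulated data: a QTL is planted at marker index 2.
rng = np.random.default_rng(0)
geno = rng.integers(0, 3, size=(100, 5)).astype(float)
trait = 0.8 * geno[:, 2] + rng.normal(size=100)
print(single_marker_scan(geno, trait).round(2))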
Yang, Hye Jung; Kang, Jae-Heon; Kim, Ok Hyun; Choi, Mona; Oh, Myungju; Nam, Jihyun; Sung, Eunju
2017-01-01
Background: Childhood obesity is a critical health issue, both currently and for the foreseeable future. To prevent obesity, behavior changes are essential. Smartphones can be a good tool, as the number of child smartphone users is rapidly increasing. We have developed a mobile platform system named “HAPPY ME,” which is a smartphone application coupled with a wearable device, designed to improve healthy behaviors to prevent childhood obesity. This study aimed to evaluate the effectiveness of obesity prevention among children 10–12 years of age using HAPPY ME. Methods: A total of 1000 participants, all fifth and sixth graders from four schools, were assigned to either control or intervention groups by school. Students in the intervention group used HAPPY ME. The study comprises a safety test, a 12-week efficacy test, and a six-month follow-up test to determine the long-term effects of preventive intervention via the integrated service platform. The integrated service platform aims to facilitate child-parent-school participation, involving the child-parent mobile application, a child-teacher mobile web, and a school website. Primary outcome measures are behavioral changes, including healthy eating, increased physical activity, and fitness. Secondary outcome measures are changes in anthropometric parameters (body weight, height, body mass index z-score, and waist circumference), body mass index (BMI) percentiles (obesity rate), and psychological perceptions among participants. Conclusions: The results of this study will offer evidence of the effectiveness of a mobile platform service with a multi-component intervention program based on a comprehensive approach. PMID:28208839
Yang, Hye Jung; Kang, Jae-Heon; Kim, Ok Hyun; Choi, Mona; Oh, Myungju; Nam, Jihyun; Sung, Eunju
2017-02-13
Childhood obesity is a critical health issue, both currently and for the foreseeable future. To prevent obesity, behavior changes are essential. Smartphones can be a good tool, as the number of child smartphone users is rapidly increasing. We have developed a mobile platform system named "HAPPY ME," which is a smartphone application coupled with a wearable device, designed to improve healthy behaviors to prevent childhood obesity. This study aimed to evaluate the effectiveness of obesity prevention among children 10-12 years of age using HAPPY ME. A total of 1000 participants, all fifth and sixth graders from four schools, were assigned to either control or intervention groups by school. Students in the intervention group used HAPPY ME. The study comprises a safety test, a 12-week efficacy test, and a six-month follow-up test to determine the long-term effects of preventive intervention via the integrated service platform. The integrated service platform aims to facilitate child-parent-school participation, involving the child-parent mobile application, a child-teacher mobile web, and a school website. Primary outcome measures are behavioral changes, including healthy eating, increased physical activity, and fitness. Secondary outcome measures are changes in anthropometric parameters (body weight, height, body mass index z-score, and waist circumference), body mass index (BMI) percentiles (obesity rate), and psychological perceptions among participants. The results of this study will offer evidence of the effectiveness of a mobile platform service with a multi-component intervention program based on a comprehensive approach.
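Since the study's outcomes include changes in BMI z-score, the following minimal Python sketch shows how a BMI-for-age z-score can be computed with Cole's LMS method and compared between baseline and follow-up. The L, M, and S reference values and the anthropometric inputs below are hypothetical placeholders; a real analysis would look up the reference parameters from CDC or WHO growth tables by age and sex.

# Minimal sketch of BMI-for-age z-score change via the LMS method.
# L_REF, M_REF, S_REF are hypothetical placeholders, not CDC/WHO values.
import math

def bmi(weight_kg, height_cm):
    h = height_cm / 100.0
    return weight_kg / (h * h)

def lms_zscore(value, L, M, S):
    """Cole's LMS transformation: z = ((value/M)**L - 1) / (L*S); ln-based when L == 0."""
    if L == 0:
        return math.log(value / M) / S
    return ((value / M) ** L - 1.0) / (L * S)

L_REF, M_REF, S_REF = -2.0, 17.5, 0.12   # placeholder reference for an 11-year-old

baseline = lms_zscore(bmi(45.0, 148.0), L_REF, M_REF, S_REF)
followup = lms_zscore(bmi(46.0, 151.0), L_REF, M_REF, S_REF)
print(f"BMI z-score change over follow-up: {followup - baseline:+.2f}")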
BluePen Biomarkers LLC: integrated biomarker solutions
Blair, Ian A; Mesaros, Clementina; Lilley, Patrick; Nunez, Matthew
2016-01-01
BluePen Biomarkers provides a unique comprehensive multi-omics biomarker discovery and validation platform. We can quantify, integrate and analyze genomics, proteomics, metabolomics and lipidomics biomarkers, alongside clinical data, demographics and other phenotypic data. A unique bio-inspired signal processing analytic approach is used that has the proven ability to identify biomarkers in a wide variety of diseases. The resulting biomarkers can be used for diagnosis, prognosis, mechanistic studies and predicting treatment response, in contexts from core research through clinical trials. BluePen Biomarkers also pursues an additional, groundbreaking research goal: identifying surrogate biomarkers across different modalities. This not only provides new biological insights but also enables the least-invasive, lowest-cost tests that meet or exceed the predictive quality of current tests. PMID:28031971
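The abstract above describes integrating features from several omics layers before biomarker selection. The sketch below is a generic stand-in for that integration step, concatenating layers and ranking features by a simple two-group separation score; it is explicitly not BluePen's proprietary bio-inspired signal-processing method, and the layer names and data are hypothetical.

# Generic multi-omics feature concatenation and ranking (illustration only).
import numpy as np

def rank_features(layers, labels):
    """layers: dict name -> (n_samples, n_features) array; labels: 0/1 array."""
    names, blocks = [], []
    for layer, X in layers.items():
        blocks.append(np.asarray(X, dtype=float))
        names += [f"{layer}:{j}" for j in range(X.shape[1])]
    X = np.hstack(blocks)
    g0, g1 = X[labels == 0], X[labels == 1]
    # Higher score = larger mean difference relative to within-group spread.
    score = np.abs(g1.mean(axis=0) - g0.mean(axis=0)) / (g0.std(axis=0) + g1.std(axis=0) + 1e-9)
    order = np.argsort(score)[::-1]
    return [(names[i], float(score[i])) for i in order]

# Toy example: one informative feature planted in the "proteomics" layer.
rng = np.random.default_rng(1)
labels = np.repeat([0, 1], 20)
genomics = rng.normal(size=(40, 3))
proteomics = rng.normal(size=(40, 2))
proteomics[labels == 1, 0] += 2.0
print(rank_features({"genomics": genomics, "proteomics": proteomics}, labels)[:3])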
Lee, Wonhoon; Park, Jongsun; Choi, Jaeyoung; Jung, Kyongyong; Park, Bongsoo; Kim, Donghan; Lee, Jaeyoung; Ahn, Kyohun; Song, Wonho; Kang, Seogchan; Lee, Yong-Hwan; Lee, Seunghwan
2009-01-01
Background: Sequences and organization of the mitochondrial genome have been used as markers to investigate evolutionary history and relationships in many taxonomic groups. The rapidly increasing mitochondrial genome sequences from diverse insects provide ample opportunities to explore various global evolutionary questions in the superclass Hexapoda. To adequately support such questions, it is imperative to establish an informatics platform that facilitates the retrieval and utilization of available mitochondrial genome sequence data. Results: The Insect Mitochondrial Genome Database (IMGD) is a new integrated platform that archives the mitochondrial genome sequences from 25,747 hexapod species, including 112 completely sequenced and 20 nearly completed genomes and 113,985 partially sequenced mitochondrial genomes. The Species-driven User Interface (SUI) of IMGD supports data retrieval and diverse analyses at multi-taxon levels. The Phyloviewer implemented in IMGD provides three methods for drawing phylogenetic trees and displays the resulting trees on the web. The SNP database incorporated into IMGD presents the distribution of SNPs and INDELs in the mitochondrial genomes of multiple isolates within eight species. A newly developed comparative SNU Genome Browser supports graphical presentation of, and an interactive interface for, the identified SNPs/INDELs. Conclusion: The IMGD provides a solid foundation for the comparative mitochondrial genomics and phylogenetics of insects. All data and functions described here are available at the web site. PMID:19351385
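To make concrete what the SNP database above catalogues, the following minimal Python sketch lists substitution and gap positions between two pre-aligned mitochondrial sequences of equal length. This is an illustration only, not the IMGD variant-calling pipeline, and the sequence fragments are hypothetical.

# Illustrative SNP/INDEL listing between two pre-aligned sequences (not IMGD code).
def compare_aligned(ref, alt):
    snps, indels = [], []
    for pos, (a, b) in enumerate(zip(ref.upper(), alt.upper()), start=1):
        if a == b:
            continue
        if a == "-" or b == "-":
            indels.append((pos, a, b))   # gap in one sequence -> part of an INDEL
        else:
            snps.append((pos, a, b))     # simple substitution (SNP-like)
    return snps, indels

# Toy aligned fragments (hypothetical, not real mitochondrial data).
ref = "ATGACC-TTAGGC"
alt = "ATGACCATTCGGC"
snps, indels = compare_aligned(ref, alt)
print("SNPs:", snps)       # [(10, 'A', 'C')]
print("INDELs:", indels)   # [(7, '-', 'A')]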