Sample records for operation cross-platform validation

  1. Validation of MCNP6 Version 1.0 with the ENDF/B-VII.1 Cross Section Library for Plutonium Metals, Oxides, and Solutions on the High Performance Computing Platform Moonlight

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chapman, Bryan Scott; Gough, Sean T.

    This report documents a validation of the MCNP6 Version 1.0 computer code on the high performance computing platform Moonlight, for operations at Los Alamos National Laboratory (LANL) that involve plutonium metals, oxides, and solutions. The validation is conducted using the ENDF/B-VII.1 continuous energy group cross section library at room temperature. The results are for use by nuclear criticality safety personnel in performing analysis and evaluation of various facility activities involving plutonium materials.

  2. HipMatch: an object-oriented cross-platform program for accurate determination of cup orientation using 2D-3D registration of single standard X-ray radiograph and a CT volume.

    PubMed

    Zheng, Guoyan; Zhang, Xuan; Steppacher, Simon D; Murphy, Stephen B; Siebenrock, Klaus A; Tannast, Moritz

    2009-09-01

    The widely used procedure of evaluating cup orientation following total hip arthroplasty from a single standard anteroposterior (AP) radiograph is known to be inaccurate, largely due to the wide variability in individual pelvic orientation relative to the X-ray plate. 2D-3D image registration methods have been introduced for accurate determination of the post-operative cup alignment with respect to an anatomical reference extracted from the CT data. Although encouraging results have been reported, their widespread use in clinical routine is still limited. This may be explained by their requirement for a CAD model of the prosthesis, which is often difficult to obtain from the manufacturer due to proprietary issues, and by their requirement for either multiple radiographs or a radiograph-specific calibration, neither of which is available for most retrospective studies. To address these issues, we developed and validated an object-oriented cross-platform program called "HipMatch", in which a hybrid 2D-3D registration scheme combining an iterative landmark-to-ray registration with a 2D-3D intensity-based registration was implemented to estimate a rigid transformation between a pre-operative CT volume and the post-operative X-ray radiograph for a precise estimation of cup alignment. No CAD model of the prosthesis is required. Quantitative and qualitative results evaluated on cadaveric and clinical datasets are given, which indicate the robustness and accuracy of the program. HipMatch is written in the object-oriented programming language C++ using the cross-platform toolkits Qt (TrollTech, Oslo, Norway), VTK, and Coin3D, and is portable to any platform.
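
    The landmark-to-ray step can be sketched as a small least-squares problem. The Python fragment below is illustrative only (hypothetical landmark data; not HipMatch's actual C++ implementation): a rigid transform, parameterized as a rotation vector plus translation, is fit so that CT landmarks mapped into X-ray space fall onto the rays cast from the X-ray source through their detected 2D projections.

```python
# Illustrative landmark-to-ray rigid registration (not HipMatch's code).
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation

def point_to_ray_distance(points, source, directions):
    # Distance from each 3D point to the line through `source` along unit `directions`
    # (rays treated as infinite lines here for simplicity).
    v = points - source
    t = np.sum(v * directions, axis=1)          # projection length onto each ray
    closest = source + t[:, None] * directions  # closest point on each ray
    return np.linalg.norm(points - closest, axis=1)

def residuals(params, ct_landmarks, source, ray_dirs):
    # params = rotation vector (3) + translation (3) of the rigid transform.
    rot, trans = Rotation.from_rotvec(params[:3]), params[3:]
    return point_to_ray_distance(rot.apply(ct_landmarks) + trans, source, ray_dirs)

# Hypothetical data: 4 CT landmarks and the rays from the X-ray source (origin)
# through their 2D projections.
ct_landmarks = np.array([[10., 0., 50.], [0., 10., 50.], [-10., 0., 50.], [0., -10., 50.]])
rays = ct_landmarks / np.linalg.norm(ct_landmarks, axis=1, keepdims=True)
fit = least_squares(residuals, x0=np.zeros(6), args=(ct_landmarks, np.zeros(3), rays))
print("residual point-to-ray distances:", fit.fun.round(3))
```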

  2. UTM Technical Capability Level 2 (TCL2) Test at Reno-Stead Airport.

    NASA Image and Video Library

    2016-10-06

    Test of Unmanned Aircraft Systems Traffic Management (UTM) technical capability Level 2 (TCL2) at Reno-Stead Airport, Nevada. During the test, five drones simultaneously crossed paths, separated by altitude. Two drones flew beyond visual line-of-sight and three flew within line-of-sight of their operators. Engineer Joey Mercer uses the UTM coordinator app of the UAS traffic management research platform to verify and validate flight paths.

  4. PR-PR: Cross-Platform Laboratory Automation System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Linshiz, G; Stawski, N; Goyal, G

    To enable protocol standardization, sharing, and efficient implementation across laboratory automation platforms, we have further developed the PR-PR open-source high-level biology-friendly robot programming language as a cross-platform laboratory automation system. Beyond liquid-handling robotics, PR-PR now supports microfluidic and microscopy platforms, as well as protocol translation into human languages, such as English. While the same set of basic PR-PR commands and features are available for each supported platform, the underlying optimization and translation modules vary from platform to platform. Here, we describe these further developments to PR-PR, and demonstrate the experimental implementation and validation of PR-PR protocols for combinatorial modified Golden Gate DNA assembly across liquid-handling robotic, microfluidic, and manual platforms. To further test PR-PR cross-platform performance, we then implement and assess PR-PR protocols for Kunkel DNA mutagenesis and hierarchical Gibson DNA assembly for microfluidic and manual platforms.

  5. PR-PR: cross-platform laboratory automation system.

    PubMed

    Linshiz, Gregory; Stawski, Nina; Goyal, Garima; Bi, Changhao; Poust, Sean; Sharma, Monica; Mutalik, Vivek; Keasling, Jay D; Hillson, Nathan J

    2014-08-15

    To enable protocol standardization, sharing, and efficient implementation across laboratory automation platforms, we have further developed the PR-PR open-source high-level biology-friendly robot programming language as a cross-platform laboratory automation system. Beyond liquid-handling robotics, PR-PR now supports microfluidic and microscopy platforms, as well as protocol translation into human languages, such as English. While the same set of basic PR-PR commands and features are available for each supported platform, the underlying optimization and translation modules vary from platform to platform. Here, we describe these further developments to PR-PR, and demonstrate the experimental implementation and validation of PR-PR protocols for combinatorial modified Golden Gate DNA assembly across liquid-handling robotic, microfluidic, and manual platforms. To further test PR-PR cross-platform performance, we then implement and assess PR-PR protocols for Kunkel DNA mutagenesis and hierarchical Gibson DNA assembly for microfluidic and manual platforms.
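
    The cross-platform design described here, one high-level command set with per-platform translation back-ends, can be illustrated with a minimal sketch. All class and method names below are hypothetical; this is not the actual PR-PR language or compiler.

```python
# Illustrative sketch of one command compiled by different back-ends
# (hypothetical API, not PR-PR itself).
class LiquidHandlerBackend:
    def transfer(self, src, dst, vol_ul):
        return f"ASPIRATE {vol_ul} uL FROM {src}; DISPENSE TO {dst}"

class ManualBackend:
    def transfer(self, src, dst, vol_ul):
        return f"Pipette {vol_ul} uL from well {src} into well {dst}."

def compile_protocol(steps, backend):
    """Translate (src, dst, volume) tuples with the chosen back-end."""
    return [backend.transfer(*step) for step in steps]

protocol = [("A1", "B1", 50), ("A2", "B1", 25)]
for line in compile_protocol(protocol, ManualBackend()):
    print(line)  # the same `protocol` also compiles for the robot back-end
```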

  6. Targeted exploration and analysis of large cross-platform human transcriptomic compendia

    PubMed Central

    Zhu, Qian; Wong, Aaron K; Krishnan, Arjun; Aure, Miriam R; Tadych, Alicja; Zhang, Ran; Corney, David C; Greene, Casey S; Bongo, Lars A; Kristensen, Vessela N; Charikar, Moses; Li, Kai; Troyanskaya, Olga G.

    2016-01-01

    We present SEEK (http://seek.princeton.edu), a query-based search engine across very large transcriptomic data collections, including thousands of human data sets from almost 50 microarray and next-generation sequencing platforms. SEEK uses a novel query-level cross-validation-based algorithm to automatically prioritize data sets relevant to the query and a robust search approach to identify query-coregulated genes, pathways, and processes. SEEK provides cross-platform handling, multi-gene query search, iterative metadata-based search refinement, and extensive visualization-based analysis options. PMID:25581801

  7. Cross-platform validation and analysis environment for particle physics

    NASA Astrophysics Data System (ADS)

    Chekanov, S. V.; Pogrebnyak, I.; Wilbern, D.

    2017-11-01

    A multi-platform validation and analysis framework for public Monte Carlo simulation for high-energy particle collisions is discussed. The front-end of this framework uses the Python programming language, while the back-end is written in Java, which provides a multi-platform environment that can be run from a web browser and can easily be deployed at the grid sites. The analysis package includes all major software tools used in high-energy physics, such as Lorentz vectors, jet algorithms, histogram packages, graphic canvases, and tools for providing data access. This multi-platform software suite, designed to minimize OS-specific maintenance and deployment time, is used for online validation of Monte Carlo event samples through a web interface.
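
    As a flavor of the analysis tools listed, the sketch below implements a minimal Lorentz vector and computes a dimuon invariant mass. This is illustrative Python only; the framework's actual back-end is Java and its API is not reproduced here.

```python
# Minimal Lorentz-vector sketch (illustrative; not the framework's API).
import math

class LorentzVector:
    def __init__(self, px, py, pz, e):
        self.px, self.py, self.pz, self.e = px, py, pz, e

    def __add__(self, other):
        return LorentzVector(self.px + other.px, self.py + other.py,
                             self.pz + other.pz, self.e + other.e)

    def mass(self):
        # Invariant mass m^2 = E^2 - |p|^2 (natural units).
        m2 = self.e**2 - (self.px**2 + self.py**2 + self.pz**2)
        return math.sqrt(max(m2, 0.0))

# Hypothetical back-to-back muon pair (GeV): pair mass lands near the Z peak.
mu1 = LorentzVector(45.0, 0.0, 0.0, 45.0)
mu2 = LorentzVector(-45.0, 0.0, 0.0, 45.0)
print(f"m(mu mu) = {(mu1 + mu2).mass():.1f} GeV")
```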

  8. Cross-platform validation and analysis environment for particle physics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chekanov, S. V.; Pogrebnyak, I.; Wilbern, D.

    A multi-platform validation and analysis framework for public Monte Carlo simulation for high-energy particle collisions is discussed. The front-end of this framework uses the Python programming language, while the back-end is written in Java, which provides a multi-platform environment that can be run from a web browser and can easily be deployed at the grid sites. The analysis package includes all major software tools used in high-energy physics, such as Lorentz vectors, jet algorithms, histogram packages, graphic canvases, and tools for providing data access. This multi-platform software suite, designed to minimize OS-specific maintenance and deployment time, is used for online validation of Monte Carlo event samples through a web interface.

  9. PaR-PaR Laboratory Automation Platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Linshiz, G; Stawski, N; Poust, S

    2013-05-01

    Labor-intensive multistep biological tasks, such as the construction and cloning of DNA molecules, are prime candidates for laboratory automation. Flexible and biology-friendly operation of robotic equipment is key to its successful integration in biological laboratories, and the efforts required to operate a robot must be much smaller than the alternative manual lab work. To achieve these goals, a simple high-level biology-friendly robot programming language is needed. We have developed and experimentally validated such a language: Programming a Robot (PaR-PaR). The syntax and compiler for the language are based on computer science principles and a deep understanding of biological workflows. PaR-PaR allows researchers to use liquid-handling robots effectively, enabling experiments that would not have been considered previously. After minimal training, a biologist can independently write complicated protocols for a robot within an hour. Adoption of PaR-PaR as a standard cross-platform language would enable hand-written or software-generated robotic protocols to be shared across laboratories.

  10. PaR-PaR laboratory automation platform.

    PubMed

    Linshiz, Gregory; Stawski, Nina; Poust, Sean; Bi, Changhao; Keasling, Jay D; Hillson, Nathan J

    2013-05-17

    Labor-intensive multistep biological tasks, such as the construction and cloning of DNA molecules, are prime candidates for laboratory automation. Flexible and biology-friendly operation of robotic equipment is key to its successful integration in biological laboratories, and the efforts required to operate a robot must be much smaller than the alternative manual lab work. To achieve these goals, a simple high-level biology-friendly robot programming language is needed. We have developed and experimentally validated such a language: Programming a Robot (PaR-PaR). The syntax and compiler for the language are based on computer science principles and a deep understanding of biological workflows. PaR-PaR allows researchers to use liquid-handling robots effectively, enabling experiments that would not have been considered previously. After minimal training, a biologist can independently write complicated protocols for a robot within an hour. Adoption of PaR-PaR as a standard cross-platform language would enable hand-written or software-generated robotic protocols to be shared across laboratories.

  11. Assembly Platform For Use In Outer Space

    NASA Technical Reports Server (NTRS)

    Rao, Niranjan S.; Buddington, Patricia A.

    1995-01-01

    Report describes conceptual platform or framework for use in assembling other structures and spacecraft in outer space. Consists of three fixed structural beams comprising central beam and two cross beams. Robotic manipulators spaced apart on platform to provide telerobotic operation of platform by either space-station or ground crews. Platform and attached vehicles function synergistically to achieve maximum performance for intended purposes.

  12. openECA Detailed Design Document

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Robertson, Russell

    This document describes the functional and non-functional requirements for the openECA platform and for the included analytic systems that will (1) validate the operational readiness and performance of the openECA platform and (2) provide out-of-box value, through an initial collection of analytics, to those who implement the openECA platform.

  13. Validation of MCNP6 Version 1.0 with the ENDF/B-VII.1 Cross Section Library for Uranium Metal, Oxide, and Solution Systems on the High Performance Computing Platform Moonlight

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chapman, Bryan Scott; MacQuigg, Michael Robert; Wysong, Andrew Russell

    In this document, the code MCNP is validated with ENDF/B-VII.1 cross section data under the purview of ANSI/ANS-8.24-2007, for use with uranium systems. MCNP is a computer code based on Monte Carlo transport methods. While MCNP has wide-ranging capability in nuclear transport simulation, this validation is limited to the functionality related to neutron transport and the calculation of criticality parameters such as k_eff.
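
    For context, criticality validation suites of this kind compare calculated k_eff values against critical benchmarks (k_eff = 1 by construction) to estimate a code bias and its uncertainty. The sketch below uses invented numbers, not results from this report.

```python
# Illustrative bias statistic for a criticality validation suite
# (hypothetical benchmark results; not values from this report).
import statistics

calculated_keff = [0.9981, 1.0004, 0.9969, 0.9992, 1.0011]

mean_keff = statistics.mean(calculated_keff)
bias = mean_keff - 1.0                       # deviation from the critical benchmarks
bias_sigma = statistics.stdev(calculated_keff)

print(f"mean k_eff = {mean_keff:.4f}, bias = {bias:+.4f}, sigma = {bias_sigma:.4f}")
# An upper subcritical limit would then subtract the bias, its uncertainty, and
# an administrative margin from 1.0, per ANSI/ANS-8.24 practice.
```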

  14. Development and Validation of a Novel Robotic Procedure Specific Simulation Platform: Partial Nephrectomy.

    PubMed

    Hung, Andrew J; Shah, Swar H; Dalag, Leonard; Shin, Daniel; Gill, Inderbir S

    2015-08-01

    We developed a novel procedure specific simulation platform for robotic partial nephrectomy. In this study we prospectively evaluate its face, content, construct, and concurrent validity. This hybrid platform features augmented reality and virtual reality. Augmented reality involves 3-dimensional robotic partial nephrectomy surgical videos overlaid with virtual instruments to teach surgical anatomy, technical skills, and operative steps. Advanced technical skills are assessed with an embedded full virtual reality renorrhaphy task. Participants were classified as novice (no surgical training, 15), intermediate (less than 100 robotic cases, 13) or expert (100 or more robotic cases, 14) and prospectively assessed. Cohort performance was compared with the Kruskal-Wallis test (construct validity). A post-study questionnaire was used to assess the realism of the simulation (face validity) and its usefulness for training (content validity). Concurrent validity evaluated the correlation between the virtual reality renorrhaphy task and live porcine robotic partial nephrectomy performance (Spearman's analysis). Experts rated the augmented reality content as realistic (median 8/10) and helpful for resident/fellow training (8.0-8.2/10). Experts rated the platform highly for teaching anatomy (9/10) and operative steps (8.5/10) but moderately for technical skills (7.5/10). Experts and intermediates outperformed novices (construct validity) in efficiency (p=0.0002) and accuracy (p=0.002). For virtual reality renorrhaphy, experts outperformed intermediates on GEARS metrics (p=0.002). Virtual reality renorrhaphy and in vivo porcine robotic partial nephrectomy performance correlated significantly (r=0.8, p <0.0001) (concurrent validity). This augmented reality simulation platform displayed face, content, and construct validity. Performance in the procedure specific virtual reality task correlated highly with a porcine model (concurrent validity). Future efforts will integrate procedure specific virtual reality tasks and their global assessment. Copyright © 2015 American Urological Association Education and Research, Inc. Published by Elsevier Inc. All rights reserved.
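
    The statistical recipe here, Kruskal-Wallis across experience cohorts for construct validity and Spearman correlation for concurrent validity, can be reproduced on synthetic scores. The numbers below are invented, not the study's data.

```python
# Construct validity (Kruskal-Wallis) and concurrent validity (Spearman)
# on invented performance scores.
from scipy.stats import kruskal, spearmanr

novice       = [42, 50, 47, 55, 44]
intermediate = [61, 66, 58, 70, 64]
expert       = [78, 83, 75, 88, 81]

h, p_construct = kruskal(novice, intermediate, expert)
print(f"Kruskal-Wallis H = {h:.2f}, p = {p_construct:.4f}")

# Does simulator performance track live (porcine) performance?
sim_scores  = [60, 72, 68, 85, 77, 90]
live_scores = [55, 70, 66, 80, 79, 92]
rho, p_conc = spearmanr(sim_scores, live_scores)
print(f"Spearman rho = {rho:.2f}, p = {p_conc:.4f}")
```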

  15. Cross Validation of Selection of Variables in Multiple Regression.

    DTIC Science & Technology

    1979-12-01

    (Abstract not recoverable: the source record contains only OCR fragments of regression coefficient tables and F-111D equipment listings related to long-term DoD planning.)

  16. Behavioral Indicators on a Mobile Sensing Platform Predict Clinically Validated Psychiatric Symptoms of Mood and Anxiety Disorders

    PubMed Central

    Place, Skyler; Rubin, Channah; Gorrostieta, Cristina; Mead, Caroline; Kane, John; Marx, Brian P; Feast, Joshua; Deckersbach, Thilo; Pentland, Alex “Sandy”; Nierenberg, Andrew; Azarbayejani, Ali

    2017-01-01

    Background: There is a critical need for real-time tracking of behavioral indicators of mental disorders. Mobile sensing platforms that objectively and noninvasively collect, store, and analyze behavioral indicators have not yet been clinically validated or scalable. Objective: The aim of our study was to report on models of clinical symptoms for post-traumatic stress disorder (PTSD) and depression derived from a scalable mobile sensing platform. Methods: A total of 73 participants (67% [49/73] male, 48% [35/73] non-Hispanic white, 33% [24/73] veteran status) who reported at least one symptom of PTSD or depression completed a 12-week field trial. Behavioral indicators were collected through the noninvasive mobile sensing platform on participants’ mobile phones. Clinical symptoms were measured through validated clinical interviews with a licensed clinical social worker. A combination hypothesis and data-driven approach was used to derive key features for modeling symptoms, including the sum of outgoing calls, count of unique numbers texted, absolute distance traveled, dynamic variation of the voice, speaking rate, and voice quality. Participants also reported ease of use and data sharing concerns. Results: Behavioral indicators predicted clinically assessed symptoms of depression and PTSD (cross-validated area under the curve [AUC] for depressed mood=.74, fatigue=.56, interest in activities=.75, and social connectedness=.83). Participants reported comfort sharing individual data with physicians (Mean 3.08, SD 1.22), mental health providers (Mean 3.25, SD 1.39), and medical researchers (Mean 3.03, SD 1.36). Conclusions: Behavioral indicators passively collected through a mobile sensing platform predicted symptoms of depression and PTSD. The use of mobile sensing platforms can provide clinically validated behavioral indicators in real time; however, further validation of these models and this platform in large clinical samples is needed. PMID:28302595
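
    The headline metric, cross-validated AUC for symptom prediction, can be sketched with scikit-learn on synthetic stand-ins for the listed features. The data and model below are invented; this is not the study's pipeline.

```python
# Cross-validated AUC on synthetic behavioral features (invented data).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 73
X = np.column_stack([
    rng.poisson(5, n),        # outgoing calls per day
    rng.poisson(3, n),        # unique numbers texted
    rng.exponential(10, n),   # km traveled
])
# Synthetic label: symptom present when social activity is low (plus noise).
y = (X[:, 0] + rng.normal(0, 2, n) < 5).astype(int)

auc = cross_val_score(LogisticRegression(), X, y, cv=5, scoring="roc_auc")
print(f"cross-validated AUC = {auc.mean():.2f} +/- {auc.std():.2f}")
```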

  17. Experimental validation of docking and capture using space robotics testbeds

    NASA Technical Reports Server (NTRS)

    Spofford, John

    1991-01-01

    Docking concepts include capture, berthing, and docking. The definitions of these terms, consistent with AIAA, are as follows: (1) capture (grasping)--the use of a manipulator to make initial contact and attachment between transfer vehicle and a platform; (2) berthing--positioning of a transfer vehicle or payload into platform restraints using a manipulator; and (3) docking--propulsive mechanical connection between vehicle and platform. The combination of the capture and berthing operations is effectively the same as docking; i.e., capture (grasping) + berthing = docking. These concepts are discussed in terms of Martin Marietta's ability to develop validation methods using robotics testbeds.

  18. SURA-IOOS Coastal Inundation Testbed Inter-Model Evaluation of Tides, Waves, and Hurricane Surge in the Gulf of Mexico

    NASA Astrophysics Data System (ADS)

    Kerr, P. C.; Donahue, A.; Westerink, J. J.; Luettich, R.; Zheng, L.; Weisberg, R. H.; Wang, H. V.; Slinn, D. N.; Davis, J. R.; Huang, Y.; Teng, Y.; Forrest, D.; Haase, A.; Kramer, A.; Rhome, J.; Feyen, J. C.; Signell, R. P.; Hanson, J. L.; Taylor, A.; Hope, M.; Kennedy, A. B.; Smith, J. M.; Powell, M. D.; Cardone, V. J.; Cox, A. T.

    2012-12-01

    The Southeastern Universities Research Association (SURA), in collaboration with the NOAA Integrated Ocean Observing System program and other federal partners, developed a testbed to help accelerate progress in both research and the transition to operational use of models for both coastal and estuarine prediction. This testbed facilitates cyber-based sharing of data and tools, archival of observation data, and the development of cross-platform tools to efficiently access, visualize, skill assess, and evaluate model results. In addition, this testbed enables the modeling community to quantitatively assess the behavior (e.g., skill, robustness, execution speed) and implementation requirements (e.g. resolution, parameterization, computer capacity) that characterize the suitability and performance of selected models from both operational and fundamental science perspectives. This presentation focuses on the tropical coastal inundation component of the testbed and compares a variety of model platforms as well as grids in simulating tides, and the wave and surge environments for two extremely well documented historical hurricanes, Hurricanes Rita (2005) and Ike (2008). Model platforms included are ADCIRC, FVCOM, SELFE, SLOSH, SWAN, and WWMII. Model validation assessments were performed on simulation results using numerous station observation data in the form of decomposed harmonic constituents, water level high water marks and hydrographs of water level and wave data. In addition, execution speed, inundation extents defined by differences in wetting/drying schemes, resolution and parameterization sensitivities are also explored.

  19. Validation of a multiplex electrochemiluminescent immunoassay platform in human and mouse samples

    PubMed Central

    Bastarache, J.A.; Koyama, T.; Wickersham, N.E.; Ware, L.B.

    2014-01-01

    Despite the widespread use of multiplex immunoassays, there are very few scientific reports that test the accuracy and reliability of a platform prior to publication of experimental data. Our laboratory has previously demonstrated the need for new assay platform validation prior to use of biologic samples from large studies in order to optimize sample handling and assay performance. In this study, our goal was to test the accuracy and reproducibility of an electrochemiluminescent multiplex immunoassay platform (Meso Scale Discovery, MSD®) and compare this platform to validated, singleplex immunoassays (R&D Systems®) using actual study subject samples (human plasma and mouse bronchoalveolar lavage fluid (BALF) and plasma). We found that the MSD platform performed well on intra- and inter-assay comparisons, spike and recovery, and cross-platform comparisons. The mean intra-assay CV% and range for MSD was 3.49 (0.0-10.4) for IL-6 and 2.04 (0.1-7.9) for IL-8. The correlation between values for identical samples measured on both MSD and R&D was R=0.97 for both analytes. The mouse MSD assay had a broader range of CV%, with means ranging from 9.5-28.5 depending on the analyte. The range of mean CV% was similar for singleplex ELISAs, at 4.3-23.7 depending on the analyte. Regardless of species or sample type, CV% was more variable at lower protein concentrations. In conclusion, we validated a multiplex electrochemiluminescent assay system and found that it has superior test characteristics in human plasma compared to mouse BALF and plasma. Both human and mouse MSD assays compared favorably to well-validated singleplex ELISAs. PMID:24768796
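
    Two of the metrics used, intra-assay CV% from duplicate wells and cross-platform correlation for identical samples, reduce to a few lines of arithmetic. The numbers below are invented, not the study data.

```python
# Intra-assay CV% and cross-platform agreement on invented measurements.
import numpy as np
from scipy.stats import pearsonr

# Duplicate wells for each sample (pg/mL), hypothetical.
duplicates = np.array([[12.1, 12.9], [45.0, 43.2], [150.3, 148.8], [3.2, 3.6]])
cv_percent = duplicates.std(axis=1, ddof=1) / duplicates.mean(axis=1) * 100
print("intra-assay CV% per sample:", cv_percent.round(1))

# Same samples measured on both platforms, hypothetical.
msd = np.array([12.5, 44.1, 149.5, 3.4, 78.0])
rnd = np.array([13.0, 42.8, 151.2, 3.1, 80.5])
r, p = pearsonr(msd, rnd)
print(f"cross-platform r = {r:.2f} (p = {p:.3g})")
```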

  20. Reliability, robustness, and reproducibility in mouse behavioral phenotyping: a cross-laboratory study

    PubMed Central

    Mandillo, Silvia; Tucci, Valter; Hölter, Sabine M.; Meziane, Hamid; Banchaabouchi, Mumna Al; Kallnik, Magdalena; Lad, Heena V.; Nolan, Patrick M.; Ouagazzal, Abdel-Mouttalib; Coghill, Emma L.; Gale, Karin; Golini, Elisabetta; Jacquot, Sylvie; Krezel, Wojtek; Parker, Andy; Riet, Fabrice; Schneider, Ilka; Marazziti, Daniela; Auwerx, Johan; Brown, Steve D. M.; Chambon, Pierre; Rosenthal, Nadia; Tocchini-Valentini, Glauco; Wurst, Wolfgang

    2008-01-01

    Establishing standard operating procedures (SOPs) as tools for the analysis of behavioral phenotypes is fundamental to mouse functional genomics. It is essential that the tests designed provide reliable measures of the process under investigation but most importantly that these are reproducible across both time and laboratories. For this reason, we devised and tested a set of SOPs to investigate mouse behavior. Five research centers were involved across France, Germany, Italy, and the UK in this study, as part of the EUMORPHIA program. All the procedures underwent a cross-validation experimental study to investigate the robustness of the designed protocols. Four inbred reference strains (C57BL/6J, C3HeB/FeJ, BALB/cByJ, 129S2/SvPas), reflecting their use as common background strains in mutagenesis programs, were analyzed to validate these tests. We demonstrate that the operating procedures employed, which includes open field, SHIRPA, grip-strength, rotarod, Y-maze, prepulse inhibition of acoustic startle response, and tail flick tests, generated reproducible results between laboratories for a number of the test output parameters. However, we also identified several uncontrolled variables that constitute confounding factors in behavioral phenotyping. The EUMORPHIA SOPs described here are an important start-point for the ongoing development of increasingly robust phenotyping platforms and their application in large-scale, multicentre mouse phenotyping programs. PMID:18505770

  21. Criterion and Construct Validity of an Isometric Midthigh-Pull Dynamometer for Assessing Whole-Body Strength in Professional Rugby League Players.

    PubMed

    Dobbin, Nick; Hunwicks, Richard; Jones, Ben; Till, Kevin; Highton, Jamie; Twist, Craig

    2018-02-01

    To examine the criterion and construct validity of an isometric midthigh-pull dynamometer to assess whole-body strength in professional rugby league players. Fifty-six male rugby league players (33 senior and 23 youth players) performed 4 isometric midthigh-pull efforts (ie, 2 on the dynamometer and 2 on the force platform) in a randomized and counterbalanced order. Isometric peak force was underestimated (P < .05) using the dynamometer compared with the force platform (95% LoA: -213.5 ± 342.6 N). Linear regression showed that peak force derived from the dynamometer explained 85% (adjusted R2 = .85, SEE = 173 N) of the variance in the dependent variable, with the following prediction equation derived: predicted peak force = [1.046 × dynamometer peak force] + 117.594. Cross-validation revealed a nonsignificant bias (P > .05) between the predicted and peak force from the force platform and an adjusted R2 (79.6%) that represented shrinkage of 0.4% relative to the cross-validation model (80%). Peak force was greater for the senior than the youth professionals using the dynamometer (2261.2 ± 222.0 vs 1725.1 ± 298.0 N, respectively; P < .05). The isometric midthigh pull assessed using a dynamometer underestimates criterion peak force but is capable of distinguishing muscle-function characteristics between professional rugby league players of different standards.
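
    The agreement analysis can be sketched as follows. The force readings are invented, but the correction equation is the one quoted in the abstract.

```python
# Bland-Altman limits of agreement plus the published correction equation
# (invented force values).
import numpy as np

dyno  = np.array([1850., 2100., 1620., 2300., 1980.])  # dynamometer peak force (N)
plate = np.array([2080., 2310., 1790., 2520., 2210.])  # force platform peak force (N)

# 95% limits of agreement: mean difference +/- 1.96 SD of the differences.
diff = dyno - plate
print(f"bias = {diff.mean():.1f} N, 95% LoA = +/- {1.96 * diff.std(ddof=1):.1f} N")

# Published correction: predicted peak force = 1.046 * dynamometer force + 117.594.
predicted = 1.046 * dyno + 117.594
print("predicted platform force (N):", predicted.round(0))
```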

  22. Excavator Design Validation

    NASA Technical Reports Server (NTRS)

    Pholsiri, Chalongrath; English, James; Seberino, Charles; Lim, Yi-Je

    2010-01-01

    The Excavator Design Validation tool verifies excavator designs by automatically generating control systems and modeling their performance in an accurate simulation of their expected environment. Part of this software design includes interfacing with human operations that can be included in simulation-based studies and validation. This is essential for assessing productivity, versatility, and reliability. This software combines automatic control system generation from CAD (computer-aided design) models, rapid validation of complex mechanism designs, and detailed models of the environment including soil, dust, temperature, remote supervision, and communication latency to create a system of high value. Unique algorithms have been created for controlling and simulating complex robotic mechanisms automatically from just a CAD description. These algorithms are implemented as a commercial cross-platform C++ software toolkit that is configurable using the Extensible Markup Language (XML). The algorithms work with virtually any mobile robotic mechanisms using module descriptions that adhere to the XML standard. In addition, high-fidelity, real-time physics-based simulation algorithms have also been developed that include models of internal forces and the forces produced when a mechanism interacts with the outside world. This capability is combined with an innovative organization for simulation algorithms, new regolith simulation methods, and a unique control and study architecture to make powerful tools with the potential to transform the way NASA verifies and compares excavator designs. Energid's Actin software has been leveraged for this design validation. The architecture includes parametric and Monte Carlo studies tailored for validation of excavator designs and their control by remote human operators. It also includes the ability to interface with third-party software and human-input devices. Two types of simulation models have been adapted: high-fidelity discrete element models and fast analytical models. By using the first to establish parameters for the second, a system has been created that can be executed in real time, or faster than real time, on a desktop PC. This allows Monte Carlo simulations to be performed on a computer platform available to all researchers, and it allows human interaction to be included in a real-time simulation process. Metrics on excavator performance are established that work with the simulation architecture. Both static and dynamic metrics are included.

  23. Statistical validation of normal tissue complication probability models.

    PubMed

    Xu, Cheng-Jian; van der Schaaf, Arjen; Van't Veld, Aart A; Langendijk, Johannes A; Schilstra, Cornelis

    2012-09-01

    To investigate the applicability and value of double cross-validation and permutation tests as established statistical approaches in the validation of normal tissue complication probability (NTCP) models. A penalized regression method, LASSO (least absolute shrinkage and selection operator), was used to build NTCP models for xerostomia after radiation therapy treatment of head-and-neck cancer. Model assessment was based on the likelihood function and the area under the receiver operating characteristic curve. Repeated double cross-validation showed the uncertainty and instability of the NTCP models and indicated that the statistical significance of model performance can be obtained by permutation testing. Repeated double cross-validation and permutation tests are recommended to validate NTCP models before clinical use. Copyright © 2012 Elsevier Inc. All rights reserved.
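
    A simplified version of this validation machinery, an L1-penalized (LASSO) logistic model scored by cross-validated AUC plus a label-permutation test, is sketched below on synthetic data. This uses single-level rather than double cross-validation, and not the clinical dataset.

```python
# LASSO logistic NTCP-style model with a permutation test (synthetic data).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.normal(size=(120, 10))   # dose/clinical features (synthetic)
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 1, 120) > 0).astype(int)

model = LogisticRegression(penalty="l1", solver="liblinear", C=0.5)
auc_true = cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()

# Permutation test: re-score on shuffled labels to build a null distribution.
null_aucs = [cross_val_score(model, X, rng.permutation(y), cv=5,
                             scoring="roc_auc").mean() for _ in range(100)]
p_value = np.mean([a >= auc_true for a in null_aucs])
print(f"AUC = {auc_true:.2f}, permutation p = {p_value:.2f}")
```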

  24. Robotic vehicle with multiple tracked mobility platforms

    DOEpatents

    Salton, Jonathan R [Albuquerque, NM; Buttz, James H [Albuquerque, NM; Garretson, Justin [Albuquerque, NM; Hayward, David R [Wetmore, CO; Hobart, Clinton G [Albuquerque, NM; Deuel, Jr., Jamieson K.

    2012-07-24

    A robotic vehicle having two or more tracked mobility platforms that are mechanically linked together with a two-dimensional coupling, thereby forming a composite vehicle of increased mobility. The robotic vehicle is operative in hazardous environments and can be capable of semi-submersible operation. The robotic vehicle is capable of remote controlled operation via radio frequency and/or fiber optic communication link to a remote operator control unit. The tracks have a plurality of track-edge scallop cut-outs that allow the tracks to easily grab onto and roll across railroad tracks, especially when crossing the railroad tracks at an oblique angle.

  25. Development of a cross-platform biomarker signature to detect renal transplant tolerance in humans

    PubMed Central

    Sagoo, Pervinder; Perucha, Esperanza; Sawitzki, Birgit; Tomiuk, Stefan; Stephens, David A.; Miqueu, Patrick; Chapman, Stephanie; Craciun, Ligia; Sergeant, Ruhena; Brouard, Sophie; Rovis, Flavia; Jimenez, Elvira; Ballow, Amany; Giral, Magali; Rebollo-Mesa, Irene; Le Moine, Alain; Braudeau, Cecile; Hilton, Rachel; Gerstmayer, Bernhard; Bourcier, Katarzyna; Sharif, Adnan; Krajewska, Magdalena; Lord, Graham M.; Roberts, Ian; Goldman, Michel; Wood, Kathryn J.; Newell, Kenneth; Seyfert-Margolis, Vicki; Warrens, Anthony N.; Janssen, Uwe; Volk, Hans-Dieter; Soulillou, Jean-Paul; Hernandez-Fuentes, Maria P.; Lechler, Robert I.

    2010-01-01

    Identifying transplant recipients in whom immunological tolerance is established or is developing would allow an individually tailored approach to their posttransplantation management. In this study, we aimed to develop reliable and reproducible in vitro assays capable of detecting tolerance in renal transplant recipients. Several biomarkers and bioassays were screened on a training set that included 11 operationally tolerant renal transplant recipients, recipient groups following different immunosuppressive regimes, recipients undergoing chronic rejection, and healthy controls. Highly predictive assays were repeated on an independent test set that included 24 tolerant renal transplant recipients. Tolerant patients displayed an expansion of peripheral blood B and NK lymphocytes, fewer activated CD4+ T cells, a lack of donor-specific antibodies, donor-specific hyporesponsiveness of CD4+ T cells, and a high ratio of forkhead box P3 to α-1,2-mannosidase gene expression. Microarray analysis further revealed in tolerant recipients a bias toward differential expression of B cell–related genes and their associated molecular pathways. By combining these indices of tolerance as a cross-platform biomarker signature, we were able to identify tolerant recipients in both the training set and the test set. This study provides an immunological profile of the tolerant state that, with further validation, should inform and shape drug-weaning protocols in renal transplant recipients. PMID:20501943

  26. Platform for a Hydrocarbon Exhaust Gas Sensor Utilizing a Pumping Cell and a Conductometric Sensor

    PubMed Central

    Biskupski, Diana; Geupel, Andrea; Wiesner, Kerstin; Fleischer, Maximilian; Moos, Ralf

    2009-01-01

    Very often, high-temperature operated gas sensors are cross-sensitive to oxygen and/or they cannot be operated in oxygen-deficient (rich) atmospheres. For instance, some metal oxides like Ga2O3 or doped SrTiO3 are excellent materials for conductometric hydrocarbon detection in the rough atmosphere of automotive exhausts, but have to be operated preferably at a constant oxygen concentration. We propose a modular sensor platform that combines a conductometric two-sensor-setup with an electrochemical pumping cell made of YSZ to establish a constant oxygen concentration in the ambient of the conductometric sensor film. In this paper, the platform is introduced, the two-sensor-setup is integrated into this new design, and sensing performance is characterized. Such a platform can be used for other sensor principles as well. PMID:22423212

  27. Infrared Spectral Radiance Intercomparisons With Satellite and Aircraft Sensors

    NASA Technical Reports Server (NTRS)

    Larar, Allen M.; Zhou, Daniel K.; Liu, Xu; Smith, William L.

    2014-01-01

    Measurement system validation is critical for advanced satellite sounders to reach their full potential of improving observations of the Earth's atmosphere, clouds, and surface for enabling enhancements in weather prediction, climate monitoring capability, and environmental change detection. Experimental field campaigns, focusing on satellite under-flights with well-calibrated FTS sensors aboard high-altitude aircraft, are an essential part of the validation task. Airborne FTS systems can enable an independent, SI-traceable measurement system validation by directly measuring the same level-1 parameters spatially and temporally coincident with the satellite sensor of interest. Continuation of aircraft under-flights for multiple satellites during multiple field campaigns enables long-term monitoring of system performance and inter-satellite cross-validation. The NASA / NPOESS Airborne Sounder Testbed - Interferometer (NAST-I) has been a significant contributor in this area by providing coincident high spectral/spatial resolution observations of infrared spectral radiances along with independently-retrieved geophysical products for comparison with like products from satellite sensors being validated. This presentation gives an overview of benefits achieved using airborne sensors such as NAST-I utilizing examples from recent field campaigns. The methodology implemented is not only beneficial to new sensors such as the Cross-track Infrared Sounder (CrIS) flying aboard the Suomi NPP and future JPSS satellites but also of significant benefit to sensors of longer flight heritage such as the Atmospheric InfraRed Sounder (AIRS) and the Infrared Atmospheric Sounding Interferometer (IASI) on the AQUA and METOP-A platforms, respectively, to ensure data quality continuity important for climate and other applications. Infrared spectral radiance inter-comparisons are discussed with a particular focus on usage of NAST-I data for enabling inter-platform cross-validation.

  28. An open source platform for multi-scale spatially distributed simulations of microbial ecosystems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Segre, Daniel

    2014-08-14

    The goal of this project was to develop a tool for facilitating simulation, validation and discovery of multiscale dynamical processes in microbial ecosystems. This led to the development of an open-source software platform for Computation Of Microbial Ecosystems in Time and Space (COMETS). COMETS performs spatially distributed time-dependent flux balance based simulations of microbial metabolism. Our plan involved building the software platform itself, calibrating and testing it through comparison with experimental data, and integrating simulations and experiments to address important open questions on the evolution and dynamics of cross-feeding interactions between microbial species.
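
    The flux-balance step at the core of such simulations is a linear program: maximize a biomass flux subject to the steady-state mass balance S v = 0 and flux bounds. A toy one-metabolite network is sketched below; this is illustrative, not the COMETS code.

```python
# Flux balance analysis as a linear program on a toy network (not COMETS).
import numpy as np
from scipy.optimize import linprog

# Toy network: uptake -> metabolite A -> biomass, plus a leak reaction.
# Rows = metabolites, columns = reactions [uptake, biomass, leak].
S = np.array([[1.0, -1.0, -1.0]])          # d[A]/dt = uptake - biomass - leak = 0
bounds = [(0, 10), (0, None), (0, None)]   # uptake capped at 10 units
c = [0, -1, 0]                             # maximize biomass (linprog minimizes)

res = linprog(c, A_eq=S, b_eq=[0.0], bounds=bounds)
print("optimal fluxes [uptake, biomass, leak]:", res.x.round(2))
```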

  29. An intelligent monitoring and management system for cross-enterprise biomedical data sharing platform

    NASA Astrophysics Data System (ADS)

    Wang, Tusheng; Yang, Yuanyuan; Zhang, Jianguo

    2013-03-01

    In order to enable multiple disciplines of medical researchers, clinical physicians, and biomedical engineers to work together in a secured, efficient, and transparent cooperative environment, we had designed an e-Science platform for biomedical imaging research and applications across multiple academic institutions and hospitals in Shanghai, using grid-based or cloud-based distributed architecture, and presented this work at the SPIE Medical Imaging conference held in San Diego in 2012. However, as the platform integrates more and more nodes over different networks, the first challenge is how to monitor and maintain all the hosts and services operating across multiple academic institutions and hospitals in the e-Science platform, such as DICOM and Web based image communication services, messaging services, and XDS ITI transaction services. In this presentation, we present the design and implementation of an intelligent monitoring and management system that collects the resource status of every node in real time, alerts when a node or service failure occurs, and thereby improves the robustness, reliability, and service continuity of this e-Science platform.

  30. e-Science platform for translational biomedical imaging research: running, statistics, and analysis

    NASA Astrophysics Data System (ADS)

    Wang, Tusheng; Yang, Yuanyuan; Zhang, Kai; Wang, Mingqing; Zhao, Jun; Xu, Lisa; Zhang, Jianguo

    2015-03-01

    In order to enable multiple disciplines of medical researchers, clinical physicians, and biomedical engineers to work together in a secured, efficient, and transparent cooperative environment, we had designed an e-Science platform for biomedical imaging research and applications across multiple academic institutions and hospitals in Shanghai, and presented this work at the SPIE Medical Imaging conference held in San Diego in 2012. In the past two years, we implemented a biomedical image chain including communication, storage, cooperation, and computing based on this e-Science platform. In this presentation, we present the operating status of this system in supporting biomedical imaging research, and analyze and discuss its results in supporting multi-disciplinary collaboration across multiple institutions.

  31. The Chemical Validation and Standardization Platform (CVSP): large-scale automated validation of chemical structure datasets.

    PubMed

    Karapetyan, Karen; Batchelor, Colin; Sharpe, David; Tkachenko, Valery; Williams, Antony J

    2015-01-01

    There are presently hundreds of online databases hosting millions of chemical compounds and associated data. As a result of the number of cheminformatics software tools that can be used to produce the data, subtle differences between the various cheminformatics platforms, as well as the naivety of the software users, there are a myriad of issues that can exist with chemical structure representations online. In order to help facilitate validation and standardization of chemical structure datasets from various sources, we have delivered a freely available internet-based platform to the community for the processing of chemical compound datasets. The Chemical Validation and Standardization Platform (CVSP) both validates and standardizes chemical structure representations according to sets of systematic rules. The chemical validation algorithms detect issues with submitted molecular representations using pre-defined or user-defined dictionary-based molecular patterns that are chemically suspicious or potentially require manual review. Each identified issue is assigned one of three levels of severity - Information, Warning, and Error - in order to conveniently inform the user of the need to browse and review subsets of their data. The validation process includes validation of atoms and bonds (e.g., flagging query atoms and bonds), valences, and stereochemistry. The standard form of submission of collections of data, the SDF file, allows the user to map the data fields to predefined CVSP fields for the purpose of cross-validating the associated SMILES and InChIs with the connection tables contained within the SDF file. This platform has been applied to the analysis of a large number of datasets prepared for deposition to our ChemSpider database and in preparation of data for the Open PHACTS project. In this work we review the results of the automated validation of the DrugBank dataset, a popular drug and drug target database utilized by the community, and the ChEMBL 17 dataset. The CVSP web site is located at http://cvsp.chemspider.com/. A platform for the validation and standardization of chemical structure representations of various formats has been developed and made available to the community to assist and encourage the processing of chemical structure files to produce more homogeneous compound representations for exchange and interchange between online databases. While the CVSP platform is designed with flexibility inherent to the rules that can be used for processing the data, we have produced a recommended rule set based on our own experience with large datasets such as DrugBank, ChEMBL, and datasets from ChemSpider.
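
    One of the checks described, cross-validating a record's SMILES against its declared InChI and assigning a severity level, can be sketched with RDKit (assumed installed). This illustrates the idea only; it is not CVSP's actual rule engine.

```python
# SMILES/InChI cross-validation sketch with RDKit (not CVSP's code).
from rdkit import Chem

def check_record(smiles, declared_inchi):
    """Flag a record whose SMILES does not regenerate its declared InChI."""
    mol = Chem.MolFromSmiles(smiles)
    if mol is None:
        return "Error: SMILES failed to parse"
    if Chem.MolToInchi(mol) != declared_inchi:
        return "Warning: SMILES and InChI disagree - manual review"
    return "OK"

# Hypothetical record: ethanol with a matching standard InChI.
print(check_record("CCO", "InChI=1S/C2H6O/c1-2-3/h3H,1-2H3"))
```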

  32. Behavioral Indicators on a Mobile Sensing Platform Predict Clinically Validated Psychiatric Symptoms of Mood and Anxiety Disorders.

    PubMed

    Place, Skyler; Blanch-Hartigan, Danielle; Rubin, Channah; Gorrostieta, Cristina; Mead, Caroline; Kane, John; Marx, Brian P; Feast, Joshua; Deckersbach, Thilo; Pentland, Alex Sandy; Nierenberg, Andrew; Azarbayejani, Ali

    2017-03-16

    There is a critical need for real-time tracking of behavioral indicators of mental disorders. Mobile sensing platforms that objectively and noninvasively collect, store, and analyze behavioral indicators have not yet been clinically validated or scalable. The aim of our study was to report on models of clinical symptoms for post-traumatic stress disorder (PTSD) and depression derived from a scalable mobile sensing platform. A total of 73 participants (67% [49/73] male, 48% [35/73] non-Hispanic white, 33% [24/73] veteran status) who reported at least one symptom of PTSD or depression completed a 12-week field trial. Behavioral indicators were collected through the noninvasive mobile sensing platform on participants' mobile phones. Clinical symptoms were measured through validated clinical interviews with a licensed clinical social worker. A combination hypothesis and data-driven approach was used to derive key features for modeling symptoms, including the sum of outgoing calls, count of unique numbers texted, absolute distance traveled, dynamic variation of the voice, speaking rate, and voice quality. Participants also reported ease of use and data sharing concerns. Behavioral indicators predicted clinically assessed symptoms of depression and PTSD (cross-validated area under the curve [AUC] for depressed mood=.74, fatigue=.56, interest in activities=.75, and social connectedness=.83). Participants reported comfort sharing individual data with physicians (Mean 3.08, SD 1.22), mental health providers (Mean 3.25, SD 1.39), and medical researchers (Mean 3.03, SD 1.36). Behavioral indicators passively collected through a mobile sensing platform predicted symptoms of depression and PTSD. The use of mobile sensing platforms can provide clinically validated behavioral indicators in real time; however, further validation of these models and this platform in large clinical samples is needed. ©Skyler Place, Danielle Blanch-Hartigan, Channah Rubin, Cristina Gorrostieta, Caroline Mead, John Kane, Brian P Marx, Joshua Feast, Thilo Deckersbach, Alex “Sandy” Pentland, Andrew Nierenberg, Ali Azarbayejani. Originally published in the Journal of Medical Internet Research (http://www.jmir.org), 16.03.2017.

  33. Simultaneous overpass off nadir (SOON): a method for unified calibration/validation across IEOS and GEOSS system of systems

    NASA Astrophysics Data System (ADS)

    Ardanuy, Philip; Bergen, Bill; Huang, Allen; Kratz, Gene; Puschell, Jeff; Schueler, Carl; Walker, Joe

    2006-08-01

    The US operates a diverse, evolving constellation of research and operational environmental satellites, principally in polar and geosynchronous orbits. Our current and enhanced future domestic remote sensing capability is complemented by the significant capabilities of our current and potential future international partners. In this analysis, we define "success" through the data customers' "eyes": participating in the sufficient and continuously improving satisfaction of their mission responsibilities. Successfully fusing observations from multiple simultaneous platforms and sensors into a common, self-consistent operational environment requires a unified calibration and validation approach. Here, we develop a concept for an integrating framework for absolute accuracy; long-term stability; self-consistency among sensors, platforms, techniques, and observing systems; and validation and characterization of performance. Across all systems, this is a non-trivial problem. Simultaneous Nadir Overpasses, or SNOs, provide a proven intercomparison technique: simultaneous, collocated, co-angular measurements. Many systems have off-nadir elements, or effects, that must be calibrated. For these systems, the nadir technique constrains the process. We define the term "SOON," for simultaneous overpass off nadir. We present a target architecture and sensitivity analysis for the affordable, sustainable implementation of a global SOON calibration/validation network that can deliver the much-needed comprehensive, common, self-consistent operational picture in near-real time, at an affordable cost.

  34. The R package "sperrorest": Parallelized spatial error estimation and variable importance assessment for geospatial machine learning

    NASA Astrophysics Data System (ADS)

    Schratz, Patrick; Herrmann, Tobias; Brenning, Alexander

    2017-04-01

    Computational and statistical prediction methods such as the support vector machine have gained popularity in remote-sensing applications in recent years and are often compared to more traditional approaches like maximum-likelihood classification. However, the accuracy assessment of such predictive models in a spatial context needs to account for the presence of spatial autocorrelation in geospatial data by using spatial cross-validation and bootstrap strategies instead of their now more widely used non-spatial equivalents. The R package sperrorest by A. Brenning [IEEE International Geoscience and Remote Sensing Symposium, 1, 374 (2012)] provides a generic interface for performing (spatial) cross-validation of any statistical or machine-learning technique available in R. Since spatial statistical models as well as flexible machine-learning algorithms can be computationally expensive, parallel computing strategies are required to perform cross-validation efficiently. The most recent major release of sperrorest therefore comes with two new features (aside from improved documentation). The first is parsperrorest(), a parallelized version of sperrorest(). This function features two parallel modes that greatly speed up cross-validation runs; both are platform independent and provide progress information. par.mode = 1 relies on the pbapply package and calls parallel::mclapply() or parallel::parApply() in the background, depending on the platform: forking is used on Unix systems, while Windows systems use a cluster approach for parallel execution. par.mode = 2 uses the foreach package, which parallelizes across a cluster in a different way than the parallel package does. In summary, the robustness of parsperrorest() is increased by the implementation of two independent parallel modes. A new way of partitioning the data is provided by partition.factor.cv(). This function gives the user the possibility to perform cross-validation at the level of some grouping structure. As an example, in remote sensing of agricultural land uses, pixels from the same field contain nearly identical information and will thus be jointly placed in either the test set or the training set. Other spatial resampling strategies are already available and can be extended by the user.
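
    The package itself is R, but the grouping idea behind partition.factor.cv() carries over directly. The Python analogue below uses scikit-learn's GroupKFold, named as a swapped-in stand-in, so that all pixels from the same (synthetic) field fall jointly into either the training or the test fold.

```python
# Grouped (field-level) cross-validation, a Python analogue of
# partition.factor.cv() using sklearn's GroupKFold (synthetic data).
import numpy as np
from sklearn.model_selection import GroupKFold

rng = np.random.default_rng(2)
X = rng.normal(size=(100, 4))       # spectral features per pixel
y = rng.integers(0, 2, 100)         # land-use class
fields = rng.integers(0, 10, 100)   # field ID = grouping factor

for train_idx, test_idx in GroupKFold(n_splits=5).split(X, y, groups=fields):
    overlap = set(fields[train_idx]) & set(fields[test_idx])
    print(f"test fields: {sorted(set(fields[test_idx]))}, overlap with train: {overlap}")
```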

  35. International Space Station Medical Project

    NASA Technical Reports Server (NTRS)

    Starkey, Blythe A.

    2008-01-01

    The goals and objectives of the ISS Medical Project (ISSMP) are to: 1) Maximize the utilization of the ISS and other spaceflight platforms to assess the effects of long-duration spaceflight on human systems; 2) Devise and verify strategies to ensure optimal crew performance; 3) Enable development and validation of a suite of integrated physical (e.g., exercise), pharmacologic and/or nutritional countermeasures against deleterious effects of space flight that may impact mission success or crew health. 1) The ISSMP provides planning, integration, and implementation services for Human Research Program research tasks and evaluation activities requiring access to space or related flight resources on the ISS, Shuttle, Soyuz, Progress, or other spaceflight vehicles and platforms, including pre- and postflight activities; 2) ISSMP services include operations and sustaining engineering for HRP flight hardware; experiment integration and operation, including individual research tasks and on-orbit validation of next-generation on-orbit equipment; medical operations; procedures development and validation; and crew training tools and processes, as well as operation and sustaining engineering for the Telescience Support Center; and 3) the ISSMP integrates the HRP-approved flight activity complement and interfaces with external implementing organizations, such as the ISS Payloads Office and International Partners, to accomplish the HRP's objectives. This effort is led by JSC with Baseline Data Collection support from KSC.

  36. Multi-platform operational validation of the Western Mediterranean SOCIB forecasting system

    NASA Astrophysics Data System (ADS)

    Juza, Mélanie; Mourre, Baptiste; Renault, Lionel; Tintoré, Joaquin

    2014-05-01

    The development of science-based ocean forecasting systems at global, regional, and local scales can support a better management of the marine environment (maritime security, environmental and resources protection, maritime and commercial operations, tourism, ...). In this context, SOCIB (the Balearic Islands Coastal Observing and Forecasting System, www.socib.es) has developed an operational ocean forecasting system in the Western Mediterranean Sea (WMOP). WMOP uses a regional configuration of the Regional Ocean Modelling System (ROMS, Shchepetkin and McWilliams, 2005) nested in the larger scale Mediterranean Forecasting System (MFS) with a spatial resolution of 1.5-2km. WMOP aims at reproducing both the basin-scale ocean circulation and the mesoscale variability which is known to play a crucial role due to its strong interaction with the large scale circulation in this region. An operational validation system has been developed to systematically assess the model outputs at daily, monthly and seasonal time scales. Multi-platform observations are used for this validation, including satellite products (Sea Surface Temperature, Sea Level Anomaly), in situ measurements (from gliders, Argo floats, drifters and fixed moorings) and High-Frequency radar data. The validation procedures allow to monitor and certify the general realism of the daily production of the ocean forecasting system before its distribution to users. Additionally, different indicators (Sea Surface Temperature and Salinity, Eddy Kinetic Energy, Mixed Layer Depth, Heat Content, transports in key sections) are computed every day both at the basin-scale and in several sub-regions (Alboran Sea, Balearic Sea, Gulf of Lion). The daily forecasts, validation diagnostics and indicators from the operational model over the last months are available at www.socib.es.
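
    A daily validation diagnostic of the kind described reduces, at matched model/observation points, to simple skill metrics such as bias and RMSE. The values below are invented, not WMOP output.

```python
# Model-vs-satellite SST skill metrics at matched points (invented values).
import numpy as np

model_sst = np.array([19.8, 20.1, 21.3, 22.0, 20.7])  # degC at matched points
sat_sst   = np.array([20.0, 20.4, 21.0, 22.3, 20.9])

err = model_sst - sat_sst
print(f"bias = {err.mean():+.2f} degC, RMSE = {np.sqrt((err**2).mean()):.2f} degC")
```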

  37. Strength Performance Assessment in a Simulated Men’s Gymnastics Still Rings Cross

    PubMed Central

    Dunlavy, Jennifer K.; Sands, William A.; McNeal, Jeni R.; Stone, Michael H.; Smith, Sarah L.; Jemni, Monem; Haff, G. Gregory

    2007-01-01

    Athletes in sports such as gymnastics who perform the still rings cross position are disadvantaged by a lack of objective and convenient measurement methods. The gymnastics “cross” is a held isometric strength position considered fundamental for all still rings athletes. The purpose of this investigation was to determine if two small force platforms (FPs) placed on supports to simulate a cross position could demonstrate the fidelity necessary to differentiate athletes who could perform a cross from those who could not. Ten gymnasts (5 USA Gymnastics Senior National Team, and 5 Age Group Level gymnasts) agreed to participate. The five Senior National Team athletes were grouped as cross Performers; the Age Group gymnasts could not successfully perform the cross position and were grouped as cross Non-Performers. The two small FPs were first tested for reliability and validity and were then used to obtain a force-time record of a simulated cross position. The simulated cross test consisted of standing between two small force platforms placed on top of large solid gymnastics spotting blocks. The gymnasts attempted to perform a cross position by placing their hands at the center of the FPs and pressing downward with sufficient force that they could remove the support of their feet from the floor. Force-time curves (100 Hz) were obtained and analyzed for the sum of peak and mean arm ground reaction forces. The summed arm forces, mean and peak, were compared to body weight to determine how close the gymnasts came to achieving forces equal to body weight and thus the ability to perform the cross. The mean and peak summed arm forces statistically differentiated athletes who could perform the cross from those who could not (p < 0.05). The force-time curves and small FPs showed sufficient fidelity to differentiate between Performer and Non-Performer groups. This experiment showed that small and inexpensive force platforms may serve as useful adjuncts to athlete performance measurement such as the gymnastics still rings cross. Key points: Strength-related skills are difficult to assess in some sports and thus require special means. Small force platforms have sufficient fidelity to assess the differences between gymnasts who can perform a still rings cross and those who cannot. Strength assessment via small force platforms may serve as a means of assessing skill readiness, strength symmetry, and progress in learning a still rings cross. PMID:24149230
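
    The decision rule implied by the test is simple: the cross is supportable when the summed arm ground-reaction forces reach body weight. A sketch with invented numbers:

```python
# Summed arm force vs body weight decision rule (invented values).
left_arm_force_n  = 360.0   # mean GRF on left force platform (N)
right_arm_force_n = 345.0   # mean GRF on right force platform (N)
body_weight_n     = 687.0   # gymnast body weight (N)

summed = left_arm_force_n + right_arm_force_n
print(f"summed/BW = {summed / body_weight_n:.2f} ->",
      "can support the cross" if summed >= body_weight_n else "cannot support the cross")
```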

  18. Monitoring Snow Using Geostationary Satellite Retrievals During the SAAWSO Project

    NASA Astrophysics Data System (ADS)

    Rabin, Robert M.; Gultepe, Ismail; Kuligowski, Robert J.; Heidinger, Andrew K.

    2016-09-01

    The SAAWSO (Satellite Applications for Arctic Weather and SAR (Search And Rescue) Operations) field programs were conducted by Environment Canada near St. John's, NL and Goose Bay, NL in the winters of 2012-13 and 2013-14, respectively. The goals of these programs were to validate satellite-based nowcasting products, including snow amount, wind intensity, and cloud physical parameters (e.g., cloud cover), over northern latitudes, with potential applications to Search And Rescue (SAR) operations. Ground-based in situ sensors and remote sensing platforms were used to measure microphysical properties of precipitation, clouds and fog, radiation, temperature, moisture and wind profiles. Multi-spectral infrared observations obtained from the Geostationary Operational Environmental Satellite (GOES)-13 provided estimates of cloud top temperature and height, phase (water, ice), hydrometeor size, extinction, optical depth, and horizontal wind patterns at 15 min intervals. In this work, a technique developed for identifying clouds capable of producing high snowfall rates, incorporating wind information from the satellite observations, is described. The cloud top physical properties retrieved from operational satellite observations are validated using measurements obtained from the ground-based in situ and remote sensing platforms collected during two precipitation events: a blizzard with heavy snow and a moderate snow event. The retrieved snow precipitation rates are found to be comparable to those of ground-based platform measurements in the heavy snow event.

  19. Virtual reality simulation training in Otolaryngology.

    PubMed

    Arora, Asit; Lau, Loretta Y M; Awad, Zaid; Darzi, Ara; Singh, Arvind; Tolley, Neil

    2014-01-01

    To conduct a systematic review of the validity data for the virtual reality surgical simulator platforms available in Otolaryngology. The Ovid and Embase databases were searched on July 13, 2013. Four hundred and nine abstracts were independently reviewed by 2 authors. Thirty-six articles that fulfilled the search criteria were retrieved and reviewed in full text. These articles were assessed for quantitative data on at least one aspect of face, content, construct or predictive validity. Papers were stratified by simulator and sub-specialty, and further classified by the validation method used. There were 21 articles reporting applications for temporal bone surgery (n = 12), endoscopic sinus surgery (n = 6) and myringotomy (n = 3). Four different simulator platforms were validated for temporal bone surgery and two for each of the other surgical applications. Face/content validation represented the most frequent study type (9/21). Construct validation studies performed on temporal bone and endoscopic sinus surgery simulators showed that performance measures reliably discriminated between different experience levels. Simulation training improved cadaver temporal bone dissection skills and operating room performance in sinus surgery. Several simulator platforms, particularly in temporal bone surgery and endoscopic sinus surgery, are worthy of incorporation into training programmes. Standardised metrics are necessary to guide curriculum development in Otolaryngology. Copyright © 2013 Surgical Associates Ltd. Published by Elsevier Ltd. All rights reserved.

  20. MEGA X: Molecular Evolutionary Genetics Analysis across Computing Platforms.

    PubMed

    Kumar, Sudhir; Stecher, Glen; Li, Michael; Knyaz, Christina; Tamura, Koichiro

    2018-06-01

    The Molecular Evolutionary Genetics Analysis (Mega) software implements many analytical methods and tools for phylogenomics and phylomedicine. Here, we report a transformation of Mega to enable cross-platform use on Microsoft Windows and Linux operating systems. Mega X does not require virtualization or emulation software and provides a uniform user experience across platforms. Mega X has additionally been upgraded to use multiple computing cores for many molecular evolutionary analyses. Mega X is available in two interfaces (graphical and command line) and can be downloaded from www.megasoftware.net free of charge.

  1. Hyperspectral Observations of Land Surfaces Using Ground-based, Airborne, and Satellite Sensors

    NASA Astrophysics Data System (ADS)

    Knuteson, R. O.; Best, F. A.; Revercomb, H. E.; Tobin, D. C.

    2006-12-01

    The University of Wisconsin-Madison Space Science and Engineering Center (UW-SSEC) has helped pioneer the use of high spectral resolution infrared spectrometers for application to atmospheric and surface remote sensing. This paper is focused on observations of land surface infrared emission from high spectral resolution measurements collected over the past 15 years using airborne, ground-based, and satellite platforms. The earliest data were collected by the High-resolution Interferometer Sounder (HIS), an instrument designed in the 1980s for operation on the NASA ER-2 high altitude aircraft. The HIS was replaced in the late 1990s by the Scanning-HIS instrument, which has flown on the NASA ER-2, WB-57, DC-8, and Scaled Composites Proteus aircraft and continues to support field campaigns, such as those for EOS Terra, Aqua, and Aura validation. Since 1995 the UW-SSEC has fielded a ground-based Atmospheric Emitted Radiance Interferometer (AERI) in a research vehicle (the AERIBAGO), which has allowed direct field measurements of land surface emission from a height of about 16 ft above the ground. Several ground-based and aircraft campaigns were conducted to survey the region surrounding the ARM Southern Great Plains site in north central Oklahoma. The ground-based AERIBAGO has also participated in surface emissivity campaigns in the Western U.S. Since 2002, the NASA Atmospheric InfraRed Sounder (AIRS) has provided similar measurements from the Aqua platform in an afternoon sun-synchronous polar orbit. Ground-based and airborne observations are being used to validate the land surface products derived from the AIRS observations. These cal/val activities are in preparation for similar measurements anticipated from the operational Cross-track InfraRed Sounder (CrIS) on the NPOESS Preparatory Project (NPP), expected to be launched in 2008. Moreover, high spectral infrared observations will soon be made by the Infrared Atmospheric Sounding Interferometer (IASI) on the European MetOp platform as well as a planned series of Chinese polar orbiting satellites. A detailed understanding of land surface infrared emission is a crucial step in the effective utilization of these advanced sounder instruments for the extraction of atmospheric composition information (esp. the water vapor vertical profile) over land, which is a key goal for numerical weather prediction data assimilation.

  2. Guidelines To Validate Control of Cross-Contamination during Washing of Fresh-Cut Leafy Vegetables.

    PubMed

    Gombas, D; Luo, Y; Brennan, J; Shergill, G; Petran, R; Walsh, R; Hau, H; Khurana, K; Zomorodi, B; Rosen, J; Varley, R; Deng, K

    2017-02-01

    The U.S. Food and Drug Administration requires food processors to implement and validate processes that will result in significantly minimizing or preventing the occurrence of hazards that are reasonably foreseeable in food production. During production of fresh-cut leafy vegetables, microbial contamination that may be present on the product can spread throughout the production batch when the product is washed, thus increasing the risk of illnesses. The use of antimicrobials in the wash water is a critical step in preventing such water-mediated cross-contamination; however, many factors can affect antimicrobial efficacy in the production of fresh-cut leafy vegetables, and the procedures for validating this key preventive control have not been articulated. Producers may consider three options for validating antimicrobial washing as a preventive control for cross-contamination. Option 1 involves the use of a surrogate for the microbial hazard and the demonstration that cross-contamination is prevented by the antimicrobial wash. Option 2 involves the use of antimicrobial sensors and the demonstration that a critical antimicrobial level is maintained during worst-case operating conditions. Option 3 validates the placement of the sensors in the processing equipment with the demonstration that a critical antimicrobial level is maintained at all locations, regardless of operating conditions. These validation options developed for fresh-cut leafy vegetables may serve as examples for validating processes that prevent cross-contamination during washing of other fresh produce commodities.

  3. Cross-validation pitfalls when selecting and assessing regression and classification models.

    PubMed

    Krstajic, Damjan; Buturovic, Ljubomir J; Leahy, David E; Thomas, Simon

    2014-03-29

    We address the problem of selecting and assessing classification and regression models using cross-validation. Current state-of-the-art methods can yield models with high variance, rendering them unsuitable for a number of practical applications, including QSAR. In this paper we describe and evaluate best practices which improve reliability and increase confidence in selected models. A key operational component of the proposed methods is cloud computing, which enables routine use of previously infeasible approaches. We describe in detail an algorithm for repeated grid-search V-fold cross-validation for parameter tuning in classification and regression, and we define a repeated nested cross-validation algorithm for model assessment. For variable selection and parameter tuning we define two algorithms (repeated grid-search cross-validation and double cross-validation), and provide arguments for using repeated grid-search in the general case. We show results of our algorithms on seven QSAR datasets. The variation in prediction performance that results from choosing different splits of the dataset in V-fold cross-validation needs to be taken into account when selecting and assessing classification and regression models. We demonstrate the importance of repeating cross-validation when selecting an optimal model, as well as the importance of repeating nested cross-validation when assessing a prediction error.
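
    A minimal sketch of the two procedures in scikit-learn (an illustrative assumption; the paper's own implementation ran on a cloud platform): repeated grid-search cross-validation tunes parameters over many redrawn V-fold splits, and wrapping the tuned search inside an outer repeated split gives repeated nested cross-validation for assessing the prediction error of the whole selection procedure.

        from sklearn.datasets import make_classification
        from sklearn.model_selection import (GridSearchCV,
                                             RepeatedStratifiedKFold,
                                             cross_val_score)
        from sklearn.svm import SVC

        X, y = make_classification(n_samples=200, n_features=20, random_state=0)

        # Repeated grid-search V-fold CV: the 5-fold split is redrawn 10 times,
        # so the chosen parameters do not hinge on one arbitrary partition.
        inner = RepeatedStratifiedKFold(n_splits=5, n_repeats=10, random_state=1)
        search = GridSearchCV(SVC(), {"C": [0.1, 1, 10]}, cv=inner)

        # Repeated nested CV: tuning happens inside each outer training fold,
        # so the outer score estimates the error of the full procedure.
        outer = RepeatedStratifiedKFold(n_splits=5, n_repeats=5, random_state=2)
        scores = cross_val_score(search, X, y, cv=outer)
        print(f"{scores.mean():.3f} +/- {scores.std():.3f}")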

  4. Land Ice Verification and Validation Kit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-07-15

    To address a pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice-sheet models is underway. The associated verification and validation process for these models is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVV). This release provides robust, automated verification and performance evaluation on LCF platforms. The performance V&V involves a comprehensive comparison of model performance relative to expected behavior on a given computing platform. LIVV operates on a set of benchmark and test data, and provides comparisons for a suite of community-prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of tests where differences occur.
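
    For readers unfamiliar with the term, a bit-for-bit evaluation simply asks whether two model outputs are byte-identical. The sketch below is not LIVV's implementation, just an assumed minimal version of the check using streamed SHA-256 digests:

        import hashlib

        def bit_for_bit(file_a, file_b, chunk=1 << 20):
            # True when the two output files are byte-identical; streaming
            # the digests avoids loading large model outputs into memory.
            def digest(path):
                h = hashlib.sha256()
                with open(path, "rb") as f:
                    for block in iter(lambda: f.read(chunk), b""):
                        h.update(block)
                return h.hexdigest()
            return digest(file_a) == digest(file_b)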

  5. Automating an integrated spatial data-mining model for landfill site selection

    NASA Astrophysics Data System (ADS)

    Abujayyab, Sohaib K. M.; Ahamad, Mohd Sanusi S.; Yahya, Ahmad Shukri; Ahmad, Siti Zubaidah; Aziz, Hamidi Abdul

    2017-10-01

    An integrated programming environment represents a robust approach to building a valid model for landfill site selection. One of the main challenges in an integrated model is the complicated processing and modelling caused by the programming stages and several limitations. An automation process helps avoid these limitations and improves the interoperability between integrated programming environments. This work targets the automation of a spatial data-mining model for landfill site selection by integrating a spatial programming environment (Python-ArcGIS) with a non-spatial environment (MATLAB). The model was constructed using neural networks and is divided into nine stages distributed between MATLAB and Python-ArcGIS. A case study was taken from the northern part of Peninsular Malaysia. Twenty-two criteria were selected as input data and used to build the training and testing datasets. The outcomes show a high accuracy of 98.2% on the testing dataset using 10-fold cross-validation. The automated spatial data-mining model provides a solid platform for decision makers to perform landfill site selection and planning operations on a regional scale.
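
    One way to automate such a Python-MATLAB handoff is MATLAB's Engine API for Python, sketched below under the assumption that the API is installed and that the neural-network stages live in a MATLAB function; the path, file and function names are hypothetical:

        import matlab.engine  # MATLAB Engine API for Python

        eng = matlab.engine.start_matlab()
        eng.addpath(r"C:\models\landfill")  # folder with the .m files (hypothetical)

        # The ArcGIS/Python stages would export training/testing tables first;
        # this hypothetical MATLAB function trains the ANN and returns the
        # 10-fold cross-validation accuracy on the testing dataset.
        accuracy = eng.train_landfill_ann("training.csv", "testing.csv", 10.0,
                                          nargout=1)
        eng.quit()
        print(f"10-fold CV accuracy: {accuracy:.1f}%")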

  6. Perioperative and ICU Healthcare Analytics within a Veterans Integrated System Network: a Qualitative Gap Analysis.

    PubMed

    Mudumbai, Seshadri; Ayer, Ferenc; Stefanko, Jerry

    2017-08-01

    Health care facilities are implementing analytics platforms as a way to document quality of care. However, few gap analyses exist on platforms specifically designed for patients treated in the Operating Room, Post-Anesthesia Care Unit, and Intensive Care Unit (ICU). As part of a quality improvement effort, we undertook a gap analysis of an existing analytics platform within the Veterans Healthcare Administration. The objectives were to identify themes associated with 1) current clinical use cases and stakeholder needs; 2) information flow and pain points; and 3) recommendations for future analytics development. Methods consisted of semi-structured interviews in 2 phases with a diverse set (n = 9) of support personnel and end users from five facilities across a Veterans Integrated Service Network. Phase 1 identified underlying needs and previous experiences with the analytics platform across various roles and operational responsibilities. Phase 2 validated preliminary feedback, lessons learned, and recommendations for improvement. Emerging themes suggested that the existing system met a small pool of national reporting requirements. However, pain points were identified with accessing data in several information system silos and performing multiple manual validation steps of data content. Notable recommendations included enhancing systems integration to create "one-stop shopping" for data, and developing a capability to perform trends analysis. Our gap analysis suggests that analytics platforms designed for surgical and ICU patients should employ approaches similar to those being used for primary care patients.

  7. The SCEC Broadband Platform: A Collaborative Open-Source Software Package for Strong Ground Motion Simulation and Validation

    NASA Astrophysics Data System (ADS)

    Silva, F.; Maechling, P. J.; Goulet, C. A.; Somerville, P.; Jordan, T. H.

    2014-12-01

    The Southern California Earthquake Center (SCEC) Broadband Platform is a collaborative software development project involving geoscientists, earthquake engineers, graduate students, and the SCEC Community Modeling Environment. The SCEC Broadband Platform (BBP) is open-source scientific software that can generate broadband (0-100Hz) ground motions for earthquakes, integrating complex scientific modules that implement rupture generation, low and high-frequency seismogram synthesis, non-linear site effects calculation, and visualization into a software system that supports easy on-demand computation of seismograms. The Broadband Platform operates in two primary modes: validation simulations and scenario simulations. In validation mode, the Platform runs earthquake rupture and wave propagation modeling software to calculate seismograms for a well-observed historical earthquake. Then, the BBP calculates a number of goodness of fit measurements that quantify how well the model-based broadband seismograms match the observed seismograms for a certain event. Based on these results, the Platform can be used to tune and validate different numerical modeling techniques. In scenario mode, the Broadband Platform can run simulations for hypothetical (scenario) earthquakes. In this mode, users input an earthquake description, a list of station names and locations, and a 1D velocity model for their region of interest, and the Broadband Platform software then calculates ground motions for the specified stations. Working in close collaboration with scientists and research engineers, the SCEC software development group continues to add new capabilities to the Broadband Platform and to release new versions as open-source scientific software distributions that can be compiled and run on many Linux computer systems. Our latest release includes 5 simulation methods, 7 simulation regions covering California, Japan, and Eastern North America, the ability to compare simulation results against GMPEs, and several new data products, such as map and distance-based goodness of fit plots. As the number and complexity of scenarios simulated using the Broadband Platform increases, we have added batching utilities to substantially improve support for running large-scale simulations on computing clusters.

  8. Remote observations of reentering spacecraft including the space shuttle orbiter

    NASA Astrophysics Data System (ADS)

    Horvath, Thomas J.; Cagle, Melinda F.; Grinstead, Jay H.; Gibson, David M.

    Flight measurement is a critical phase in development, validation and certification processes of technologies destined for future civilian and military operational capabilities. This paper focuses on several recent NASA-sponsored remote observations that have provided unique engineering and scientific insights of reentry vehicle flight phenomenology and performance that could not necessarily be obtained with more traditional instrumentation methods such as onboard discrete surface sensors. The missions highlighted include multiple spatially-resolved infrared observations of the NASA Space Shuttle Orbiter during hypersonic reentry from 2009 to 2011, and emission spectroscopy of comparatively small-sized sample return capsules returning from exploration missions. Emphasis has been placed upon identifying the challenges associated with these remote sensing missions with focus on end-to-end aspects that include the initial science objective, selection of the appropriate imaging platform and instrumentation suite, target flight path analysis and acquisition strategy, pre-mission simulations to optimize sensor configuration, logistics and communications during the actual observation. Explored are collaborative opportunities and technology investments required to develop a next-generation quantitative imaging system (i.e., an intelligent sensor and platform) with greater capability, which could more affordably support cross cutting civilian and military flight test needs.

  9. Remote Observations of Reentering Spacecraft Including the Space Shuttle Orbiter

    NASA Technical Reports Server (NTRS)

    Horvath, Thomas J.; Cagle, Melinda F.; Grinstead, Jay H.; Gibson, David

    2013-01-01

    Flight measurement is a critical phase in development, validation and certification processes of technologies destined for future civilian and military operational capabilities. This paper focuses on several recent NASA-sponsored remote observations that have provided unique engineering and scientific insights of reentry vehicle flight phenomenology and performance that could not necessarily be obtained with more traditional instrumentation methods such as onboard discrete surface sensors. The missions highlighted include multiple spatially-resolved infrared observations of the NASA Space Shuttle Orbiter during hypersonic reentry from 2009 to 2011, and emission spectroscopy of comparatively small-sized sample return capsules returning from exploration missions. Emphasis has been placed upon identifying the challenges associated with these remote sensing missions with focus on end-to-end aspects that include the initial science objective, selection of the appropriate imaging platform and instrumentation suite, target flight path analysis and acquisition strategy, pre-mission simulations to optimize sensor configuration, logistics and communications during the actual observation. Explored are collaborative opportunities and technology investments required to develop a next-generation quantitative imaging system (i.e., an intelligent sensor and platform) with greater capability, which could more affordably support cross cutting civilian and military flight test needs.

  10. The SCEC Broadband Platform: A Collaborative Open-Source Software Package for Strong Ground Motion Simulation and Validation

    NASA Astrophysics Data System (ADS)

    Silva, F.; Maechling, P. J.; Goulet, C.; Somerville, P.; Jordan, T. H.

    2013-12-01

    The Southern California Earthquake Center (SCEC) Broadband Platform is a collaborative software development project involving SCEC researchers, graduate students, and the SCEC Community Modeling Environment. The SCEC Broadband Platform is open-source scientific software that can generate broadband (0-100Hz) ground motions for earthquakes, integrating complex scientific modules that implement rupture generation, low and high-frequency seismogram synthesis, non-linear site effects calculation, and visualization into a software system that supports easy on-demand computation of seismograms. The Broadband Platform operates in two primary modes: validation simulations and scenario simulations. In validation mode, the Broadband Platform runs earthquake rupture and wave propagation modeling software to calculate seismograms of a historical earthquake for which observed strong ground motion data is available. Also in validation mode, the Broadband Platform calculates a number of goodness of fit measurements that quantify how well the model-based broadband seismograms match the observed seismograms for a certain event. Based on these results, the Platform can be used to tune and validate different numerical modeling techniques. During the past year, we have modified the software to enable the addition of a large number of historical events, and we are now adding validation simulation inputs and observational data for 23 historical events covering the Eastern and Western United States, Japan, Taiwan, Turkey, and Italy. In scenario mode, the Broadband Platform can run simulations for hypothetical (scenario) earthquakes. In this mode, users input an earthquake description, a list of station names and locations, and a 1D velocity model for their region of interest, and the Broadband Platform software then calculates ground motions for the specified stations. By establishing an interface between scientific modules with a common set of input and output files, the Broadband Platform facilitates the addition of new scientific methods, which are written by earth scientists in a number of languages such as C, C++, Fortran, and Python. The Broadband Platform's modular design also supports the reuse of existing software modules as building blocks to create new scientific methods. Additionally, the Platform implements a wrapper around each scientific module, converting input and output files to and from the specific formats required (or produced) by individual scientific codes. Working in close collaboration with scientists and research engineers, the SCEC software development group continues to add new capabilities to the Broadband Platform and to release new versions as open-source scientific software distributions that can be compiled and run on many Linux computer systems. Our latest release includes the addition of 3 new simulation methods and several new data products, such as map and distance-based goodness of fit plots. Finally, as the number and complexity of scenarios simulated using the Broadband Platform increase, we have added batching utilities to substantially improve support for running large-scale simulations on computing clusters.
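
    The wrapper idea described above can be pictured with a short sketch. This is not the Broadband Platform's code, only an assumed minimal shape: convert the platform's common input format to whatever the scientific module expects, run the module, and convert its native output back; the file names and converter functions are hypothetical.

        import subprocess

        class ModuleWrapper:
            def __init__(self, executable, to_native, from_native):
                self.executable = executable    # path to the scientific code
                self.to_native = to_native      # common format -> module format
                self.from_native = from_native  # module format -> common format

            def run(self, platform_input):
                native_in = self.to_native(platform_input)  # e.g. station list
                subprocess.run([self.executable, native_in], check=True)
                # The module writes a native output file (name hypothetical);
                # convert it back to the platform's common seismogram format.
                return self.from_native("module_output.dat")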

  11. The need for a European data platform for hydrological observatories

    NASA Astrophysics Data System (ADS)

    Blöschl, Günter; Bogena, Heye; Jensen, Karsten; Zacharias, Steffen; Kunstmann, Harald; Heinrich, Ingo; Kunkel, Ralf; Vereecken, Harry

    2017-04-01

    Experimental research in hydrology is remarkably fragmented and dispersed. Typically, individual research groups establish and operate their own hydrological test sites and observatories with dedicated funding and specific research questions in mind. Once funding ceases, provisions for archiving and exchanging the data also soon run out, and often the data are lost or are no longer accessible to the research community. This has resulted not only in missed opportunities for exploring and mining hydrological data but also in a general difficulty in synthesizing research findings from different locations around the world. Many reasons for this fragmentation can be put forward, including the site-specific nature of hydrological processes, the particular types of research funding, and professional education in diverse departments. However, opportunities exist for making hydrological data more accessible and valuable to the research community, for example for designing cross-catchment experiments that build on a common database and for the development and validation of hydrological models. A number of abundantly instrumented hydrological observatories, including the TERENO catchments in Germany, the HOBE catchment in Denmark and the HOAL catchment in Austria, have, in a first step, started to join forces to serve as a community-driven nucleus for a European data platform of hydrological observatories. The common data platform aims at making data of existing hydrological observatories accessible and available to the research community, thereby providing new opportunities for the design of cross-catchment experiments and model validation efforts. Tangible instruments for implementing this platform include a common data portal, for which the TEODOOR portal (http://www.tereno.net/) is currently used. Intangible instruments include a strong motivational basis. As with any community initiative, it is important to align expectations and to provide incentives to all involved. It is argued that the main incentives lie in the shared learning from contrasting environments, which is at the heart of obtaining hydrological research findings that are generalizable beyond individual locations. From a more practical perspective, experience can be shared with testing measurement technologies and experimental design. Benefits to the wider community include a more coherent research thrust brought about by a common, accessible data set, a more long-term vision of experimental research, as well as greater visibility of experimental research. The common data platform is a first step towards a larger network of hydrological observatories. The larger network could involve a more aligned research collaboration including exchange of models, exchange of students, a joint research agenda and joint long-term projects. Ultimately, the aim is to align experimental research in hydrology to strengthen the discipline of hydrology as a whole.

  12. Validity and reliability of rectus femoris ultrasound measurements: Comparison of curved-array and linear-array transducers.

    PubMed

    Hammond, Kendra; Mampilly, Jobby; Laghi, Franco A; Goyal, Amit; Collins, Eileen G; McBurney, Conor; Jubran, Amal; Tobin, Martin J

    2014-01-01

    Muscle-mass loss augurs increased morbidity and mortality in critically ill patients. Muscle-mass loss can be assessed by wide linear-array ultrasound transducers connected to cumbersome, expensive console units. Whether cheaper, hand-carried units equipped with curved-array transducers can be used as alternatives is unknown. Accordingly, our primary aim was to investigate in 15 nondisabled subjects the validity of measurements of rectus femoris cross-sectional area obtained with a curved-array transducer against those from a linear-array transducer, the reference-standard technique. In these subjects, we also determined the reliability of measurements obtained by a novice operator versus measurements obtained by an experienced operator. Lastly, the relationship between quadriceps strength and rectus area recorded by two experienced operators with a curved-array transducer was assessed in 17 patients with chronic obstructive pulmonary disease (COPD). In nondisabled subjects, the rectus cross-sectional area measured with the curved-array transducer by the novice and experienced operators was valid (intraclass correlation coefficient [ICC]: 0.98, typical percentage error [%TE]: 3.7%) and reliable (ICC: 0.79, %TE: 9.7%). In the subjects with COPD, both reliability (ICC: 0.99) and repeatability (%TE: 7.6% and 9.8%) were high. Rectus area was related to quadriceps strength in COPD for both experienced operators (coefficient of determination: 0.67 and 0.70). In conclusion, measurements of rectus femoris cross-sectional area recorded with a curved-array transducer connected to a hand-carried unit are valid, reliable, and reproducible, leading us to contend that this technique is suitable for cross-sectional and longitudinal studies.
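
    The typical percentage error reported above has a standard definition (Hopkins): the standard deviation of the test-retest differences divided by √2, expressed as a percentage of the grand mean. A minimal sketch, assuming the paired measurements sit in two numpy arrays:

        import numpy as np

        def typical_percentage_error(trial1, trial2):
            # trial1, trial2: paired measurements (e.g., novice vs experienced
            # operator) of rectus femoris cross-sectional area.
            trial1, trial2 = np.asarray(trial1, float), np.asarray(trial2, float)
            te = np.std(trial1 - trial2, ddof=1) / np.sqrt(2)  # typical error
            grand_mean = np.concatenate([trial1, trial2]).mean()
            return 100 * te / grand_mean                       # %TE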

  13. Validity evidence for Surgical Improvement of Clinical Knowledge Ops: a novel gaming platform to assess surgical decision making.

    PubMed

    Lin, Dana T; Park, Julia; Liebert, Cara A; Lau, James N

    2015-01-01

    Current surgical education curricula focus mainly on the acquisition of technical skill rather than clinical and operative judgment. SICKO (Surgical Improvement of Clinical Knowledge Ops) is a novel gaming platform developed to address this critical need. A pilot study was performed to collect validity evidence for SICKO as an assessment for surgical decision making. Forty-nine subjects stratified into 4 levels of expertise were recruited to play SICKO. Later, players were surveyed regarding the realism of the gaming platform as well as the clinical competencies required of them while playing SICKO. Each group of increasing expertise outperformed the less experienced groups. Mean total game scores for the novice, junior resident, senior resident, and expert groups were 5,461, 8,519, 11,404, and 13,913, respectively (P = .001). Survey results revealed high scores for realism and content. SICKO holds the potential to be not only an engaging and immersive educational tool, but also a valid assessment in the armamentarium of surgical educators. Published by Elsevier Inc.

  14. OCULUS SeaTM: integrated maritime surveillance platform

    NASA Astrophysics Data System (ADS)

    Kanellopoulos, Sotirios A.; Katsoulis, Stavros; Motos, Dionysis; Lampropoulos, Vassilis; Margonis, Chris; Dimitros, Kostantinos; Thomopoulos, Stelios C. A.

    2015-05-01

    OCULUS Sea™ is a C2 platform for Integrated Maritime Surveillance. The platform consists of "loosely coupled" National/Regional and Local C2 Centers which are "centrally governed". "Loosely coupled" because the C2 Centers are located separately and share their Situational Pictures via a Message-Oriented Middleware while preserving their administrative and operational autonomy. "Centrally governed" because a central governance mechanism at the NCC registers, authenticates and authorizes Regional and Local C2 Centers into the OCULUS Sea network. From an operational point of view, OCULUS Sea has been tested under realistic conditions during the PERSEUS [3] Eastern Campaign and has been positively evaluated by Coast Guard officers from Spain and Greece. From a research and development point of view, OCULUS Sea can act as a test bed for validating any technology development in this domain in the near future.

  15. Evaluation of gene expression classification studies: factors associated with classification performance.

    PubMed

    Novianti, Putri W; Roes, Kit C B; Eijkemans, Marinus J C

    2014-01-01

    Classification methods used in microarray studies for gene expression are diverse in the way they deal with the underlying complexity of the data, as well as in the technique used to build the classification model. The MAQC II study on cancer classification problems found that performance was affected by factors such as the classification algorithm, cross-validation method, number of genes, and gene selection method. In this paper, we study the hypothesis that the disease under study significantly determines which method is optimal, and that sample size, class imbalance, type of medical question (diagnostic, prognostic or treatment response), and microarray platform are additionally influential. A systematic literature review was used to extract the information from 48 published articles on non-cancer microarray classification studies. The impact of the various factors on the reported classification accuracy was analyzed through random-intercept logistic regression. The type of medical question and method of cross-validation dominated the explained variation in accuracy among studies, followed by disease category and microarray platform. In total, 42% of the between-study variation was explained by all the study-specific and problem-specific factors that we studied together.

  16. Validation and cross-cultural pilot testing of compliance with standard precautions scale: self-administered instrument for clinical nurses.

    PubMed

    Lam, Simon C

    2014-05-01

    To perform detailed psychometric testing of the compliance with standard precautions scale (CSPS) in measuring clinical nurses' compliance with standard precautions, and to conduct cross-cultural pilot testing and assess the relevance of the CSPS on an international platform. A cross-sectional and correlational design with repeated measures. Nursing students from a local registered nurse training university, nurses from different hospitals in Hong Kong, and experts at an international conference. The psychometric properties of the CSPS were evaluated via internal consistency, 2-week and 3-month test-retest reliability, concurrent validation, and construct validation. The cross-cultural pilot testing and relevance check were conducted by experts on infection control from various developed and developing regions. Among 453 participants, 193 were nursing students, 165 were enrolled nurses, and 95 were registered nurses. The results showed that the CSPS had satisfactory reliability (Cronbach α = 0.73; intraclass correlation coefficient, 0.79 for 2-week test-retest and 0.74 for 3-month test-retest) and validity (optimum correlation with criterion measure; r = 0.76, P < .001; satisfactory results on the known-group method and hypothesis testing). A total of 19 experts from 16 countries confirmed that most of the CSPS items were relevant and globally applicable. The CSPS demonstrated satisfactory results on the basis of standard international criteria for psychometric testing, which ascertains the reliability and validity of this instrument in measuring clinical nurses' compliance with standard precautions. The cross-cultural pilot testing further reinforced the instrument's relevance and applicability in most developed and developing regions.
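
    The internal-consistency figure quoted above is Cronbach's α, which follows directly from the item and total-score variances: α = k/(k-1) · (1 - Σs²ᵢ/s²ₜ). A minimal sketch, assuming a respondents-by-items score matrix:

        import numpy as np

        def cronbach_alpha(scores):
            # scores: 2-D array, rows = respondents, columns = scale items.
            X = np.asarray(scores, float)
            k = X.shape[1]                          # number of items
            item_var = X.var(axis=0, ddof=1).sum()  # sum of item variances
            total_var = X.sum(axis=1).var(ddof=1)   # variance of total score
            return k / (k - 1) * (1 - item_var / total_var)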

  17. Project FLOSSIE: Marine Data Stewardship at the Waterline

    NASA Astrophysics Data System (ADS)

    Bouchard, R. H.; Jensen, R. E.; Riley, R. E.

    2016-02-01

    There are more than 10 million wave records from platforms of the National Data Buoy Center (NDBC) that are archived by the National Oceanic and Atmospheric Administration (NOAA). A considerable number of these were measured from the 61 NOMAD (Navy Oceanographic Meteorological Automatic Device) hulls that NDBC has used to make wave measurements since October 1979. Many of these measurements were made before the era of modern marine data stewardship. These long records lend themselves to investigations of climate trends and variability, either directly through the measurements themselves or indirectly by validating long-term numerical wave models or remote sensing applications. However, studies (e.g., Gemmrich et al. 2011) indicate that discontinuities and increased variability in the measurements can arise from changing wave systems and platforms. The value of these records is undermined by the lack of understanding or documentation of technology changes - a critical component of data stewardship. To support its mission of long-term understanding of coastal waves and wave models, the U.S. Army Corps of Engineers, Coastal Hydraulics Laboratory (CHL) sponsored the FLOSSIE Project to gauge the effects of technology changes on the long-term wave measurements from NOMAD hulls. On behalf of CHL, NDBC engineering and operations integrated old, new, and leading-edge technologies on one NOMAD hull. The hull was successfully deployed in July 2015 at the Wave Evaluation and Testing area off Monterey Bay, CA. The area hosts an NDBC 3-m hull with cross-generational technologies and a reference standard in a Datawell Waverider buoy. Thus cross-generational and cross-platform inter-comparisons can be performed simultaneously against an accepted standard. The analysis goes beyond the bulk wave parameters and will examine the energy and directional distributions over the frequency range of wind-generated waves. The project is named in honor of the pioneering World War II Naval meteorologist, Commander Florence (Flossie) Van Straten (1913-1992), USNR, who coined the acronym NOMAD. This paper will discuss the goals of the project, present preliminary data results and their application to the long-term measurements, and outline plans incorporating Best Practices of Marine Data Stewardship for the resulting datasets.

  18. 3D Data Acquisition Platform for Human Activity Understanding

    DTIC Science & Technology

    2016-03-02

    In this project, we incorporated motion capture devices, 3D vision sensors, and EMG sensors to cross validate multimodality data acquisition, and to address fundamental research problems of representation and invariant description of 3D data, human motion modeling and ... The support for the acquisition of such research instrumentation has significantly facilitated our current and future research and education ...

  19. Creating, generating and comparing random network models with NetworkRandomizer.

    PubMed

    Tosadori, Gabriele; Bestvina, Ivan; Spoto, Fausto; Laudanna, Carlo; Scardoni, Giovanni

    2016-01-01

    Biological networks are becoming a fundamental tool for the investigation of high-throughput data in several fields of biology and biotechnology. With the increasing amount of information, network-based models are gaining more and more interest, and new techniques are required to mine the information and to validate the results. To fill this validation gap we present an app for the Cytoscape platform that creates randomised networks and randomises existing, real networks. Since there is a lack of tools for performing such operations, our app enables researchers to exploit different, well-known random network models that can be used as benchmarks for validating real, biological datasets. We also propose a novel methodology for creating random weighted networks, i.e. the multiplication algorithm, starting from real, quantitative data. Finally, the app provides a statistical tool that compares real versus randomly computed attributes, in order to validate the numerical findings. In summary, our app aims at creating a standardised methodology for the validation of results in the context of the Cytoscape platform.
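
    The benchmarking idea can be illustrated with a short sketch. This is a stand-in using networkx degree-preserving rewiring, not NetworkRandomizer's own models or its multiplication algorithm: compare an observed network attribute against its distribution over randomised copies and report a z-score.

        import networkx as nx
        import numpy as np

        def randomization_benchmark(G, metric=nx.average_clustering, n_random=100):
            observed = metric(G)
            null = []
            for seed in range(n_random):
                R = G.copy()
                # Degree-preserving randomisation via double-edge swaps.
                nx.double_edge_swap(R, nswap=2 * R.number_of_edges(),
                                    max_tries=20 * R.number_of_edges(), seed=seed)
                null.append(metric(R))
            null = np.asarray(null)
            z = (observed - null.mean()) / null.std(ddof=1)
            return observed, z  # large |z| => attribute unlikely under the null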

  20. Data to Decisions: Creating a Culture of Model-Driven Drug Discovery.

    PubMed

    Brown, Frank K; Kopti, Farida; Chang, Charlie Zhenyu; Johnson, Scott A; Glick, Meir; Waller, Chris L

    2017-09-01

    Merck & Co., Inc., Kenilworth, NJ, USA, is undergoing a transformation in the way that it prosecutes R&D programs. Through the adoption of a "model-driven" culture, enhanced R&D productivity is anticipated, both in the form of decreased attrition at each stage of the process and by providing a rational framework for understanding and learning from the data generated along the way. This new approach focuses on the concept of a "Design Cycle" that makes use of all the data possible, internally and externally, to drive decision-making. These data can take the form of bioactivity, 3D structures, genomics, pathway, PK/PD, safety data, etc. Synthesis of high-quality data into models utilizing both well-established and cutting-edge methods has been shown to yield high confidence predictions to prioritize decision-making and efficiently reposition resources within R&D. The goal is to design an adaptive research operating plan that uses both modeled data and experiments, rather than just testing, to drive project decision-making. To support this emerging culture, an ambitious information management (IT) program has been initiated to implement a harmonized platform to facilitate the construction of cross-domain workflows to enable data-driven decision-making and the construction and validation of predictive models. These goals are achieved through depositing model-ready data, agile persona-driven access to data, a unified cross-domain predictive model lifecycle management platform, and support for flexible scientist-developed workflows that simplify data manipulation and consume model services. The end-to-end nature of the platform, in turn, not only supports but also drives the culture change by enabling scientists to apply predictive sciences throughout their work and over the lifetime of a project. This shift in mindset for both scientists and IT was driven by an early impactful demonstration of the potential benefits of the platform, in which expert-level early discovery predictive models were made available from familiar desktop tools, such as ChemDraw. This was built using a workflow-driven service-oriented architecture (SOA) on top of the rigorous registration of all underlying model entities.

  1. Development of Camera Model and Geometric Calibration/validation of Xsat IRIS Imagery

    NASA Astrophysics Data System (ADS)

    Kwoh, L. K.; Huang, X.; Tan, W. J.

    2012-07-01

    XSAT, launched on 20 April 2011, is the first micro-satellite designed and built in Singapore. It orbits the Earth at an altitude of 822 km in a sun-synchronous orbit. The satellite carries a multispectral camera, IRIS, with three spectral bands - 0.52~0.60 µm for Green, 0.63~0.69 µm for Red and 0.76~0.89 µm for NIR - at 12 m resolution. In the design of the IRIS camera, the three bands were acquired by three lines of CCDs (NIR, Red and Green). These CCDs were physically separated in the focal plane and their first pixels were not absolutely aligned. The micro-satellite platform was also not stable enough to allow co-registration of the 3 bands with a simple linear transformation. In the camera model developed, this platform instability was compensated for with 3rd- to 4th-order polynomials for the satellite's roll, pitch and yaw attitude angles. With the camera model, camera parameters such as the band-to-band separations, the alignment of the CCDs relative to each other, and the focal length of the camera can be validated or calibrated. The results of calibration with more than 20 images showed that the band-to-band along-track separation agreed well with the pre-flight values provided by the vendor (0.093° and 0.046° for the NIR vs Red and Green vs Red CCDs, respectively). The cross-track alignments were 0.05 pixel and 5.9 pixel for the NIR vs Red and Green vs Red CCDs, respectively. The focal length was found to be shorter by about 0.8%. This was attributed to the lower temperature at which XSAT is currently operating. With the calibrated parameters and the camera model, a geometric level 1 multispectral image with RPCs can be generated and, if required, orthorectified imagery can also be produced.
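
    The attitude-compensation step can be sketched with numpy: fit a low-order polynomial in time to the residual attitude estimated along the strip, then evaluate it at any scan line inside the camera model. The input values below are invented for illustration:

        import numpy as np

        # Hypothetical residual roll angles (deg) estimated from ground control
        # points at known scan-line times (s) along one image strip.
        t = np.linspace(0.0, 8.0, 25)
        roll_residual = 0.01 * np.sin(0.5 * t) + 0.002  # illustrative jitter

        # 3rd-order fit (4th-order for a less stable axis), as in the model.
        roll_poly = np.poly1d(np.polyfit(t, roll_residual, deg=3))
        print(roll_poly(4.0))  # compensated roll at a mid-strip scan line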

  2. Spotlight-8 Image Analysis Software

    NASA Technical Reports Server (NTRS)

    Klimek, Robert; Wright, Ted

    2006-01-01

    Spotlight is a cross-platform GUI-based software package designed to perform image analysis on sequences of images generated by combustion and fluid physics experiments run in a microgravity environment. Spotlight can perform analysis on a single image in an interactive mode or perform analysis on a sequence of images in an automated fashion. Image processing operations can be employed to enhance the image before various statistics and measurement operations are performed. An arbitrarily large number of objects can be analyzed simultaneously with independent areas of interest. Spotlight saves results in a text file that can be imported into other programs for graphing or further analysis. Spotlight can be run on Microsoft Windows, Linux, and Apple OS X platforms.

  3. Using smartphone technology to deliver a virtual pedestrian environment: usability and validation.

    PubMed

    Schwebel, David C; Severson, Joan; He, Yefei

    2017-09-01

    Various programs effectively teach children to cross streets more safely, but all are labor- and cost-intensive. Recent developments in mobile phone technology offer opportunity to deliver virtual reality pedestrian environments to mobile smartphone platforms. Such an environment may offer a cost- and labor-effective strategy to teach children to cross streets safely. This study evaluated usability, feasibility, and validity of a smartphone-based virtual pedestrian environment. A total of 68 adults completed 12 virtual crossings within each of two virtual pedestrian environments, one delivered by smartphone and the other a semi-immersive kiosk virtual environment. Participants completed self-report measures of perceived realism and simulator sickness experienced in each virtual environment, plus self-reported demographic and personality characteristics. All participants followed system instructions and used the smartphone-based virtual environment without difficulty. No significant simulator sickness was reported or observed. Users rated the smartphone virtual environment as highly realistic. Convergent validity was detected, with many aspects of pedestrian behavior in the smartphone-based virtual environment matching behavior in the kiosk virtual environment. Anticipated correlations between personality and kiosk virtual reality pedestrian behavior emerged for the smartphone-based system. A smartphone-based virtual environment can be usable and valid. Future research should develop and evaluate such a training system.

  4. Correlating Resolving Power, Resolution, and Collision Cross Section: Unifying Cross-Platform Assessment of Separation Efficiency in Ion Mobility Spectrometry.

    PubMed

    Dodds, James N; May, Jody C; McLean, John A

    2017-11-21

    Here we examine the relationship among resolving power (Rp), resolution (Rpp), and collision cross section (CCS) for compounds analyzed in previous ion mobility (IM) experiments representing a wide variety of instrument platforms and IM techniques. Our previous work indicated these three variables effectively describe and predict separation efficiency for drift tube ion mobility spectrometry experiments. In this work, we seek to determine whether our previous findings are a general reflection of IM behavior that can be applied to various instrument platforms and mobility techniques. Results suggest IM distributions are well characterized by a Gaussian model, and separation efficiency can be predicted on the basis of the empirical difference in the gas-phase CCS and a CCS-based resolving power definition (CCS/ΔCCS). Notably, traveling wave (TWIMS) was found to operate at resolutions substantially higher than its single-peak resolving power suggested. When a CCS-based Rp definition was utilized, TWIMS was found to operate at a resolving power between 40 and 50, confirming previous observations by Giles and co-workers. After the separation axis (and corresponding resolving power) is converted to cross section space, it is possible to effectively predict separation behavior for all mobility techniques evaluated (i.e., uniform field, trapped ion mobility, traveling wave, cyclic, and overtone instruments) using the equations described in this work. Finally, we are able to establish for the first time that current state-of-the-art ion mobility separations benchmark at a CCS-based resolving power of >300, which is sufficient to differentiate analyte ions with CCS differences as small as 0.5%.
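
    Under the Gaussian-peak assumption above, the link between the two quantities is direct: a CCS-based resolving power Rp = CCS/FWHM implies a peak width for each ion, and the standard half-height resolution formula Rpp = 1.18·ΔCCS/(FWHM_a + FWHM_b) then predicts whether two ions separate. A minimal sketch (the 1.18 factor is the textbook chromatographic convention, not a value quoted from the abstract):

        def two_peak_resolution(ccs_a, ccs_b, rp):
            # Rp = CCS / FWHM, so each peak's width follows from its CCS.
            fwhm_a, fwhm_b = ccs_a / rp, ccs_b / rp
            return 1.18 * abs(ccs_b - ccs_a) / (fwhm_a + fwhm_b)

        # A 0.5% CCS difference at the benchmark Rp > 300 quoted above:
        print(two_peak_resolution(200.0, 201.0, 300))  # ~0.88, resolvable maxima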

  5. 1:50 Scale Testing of Three Floating Wind Turbines at MARIN and Numerical Model Validation Against Test Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dagher, Habib; Viselli, Anthony; Goupee, Andrew

    The primary goal of the basin model test program discussed herein is to properly scale and accurately capture physical data of the rigid body motions, accelerations and loads for different floating wind turbine platform technologies. The intended use for this data is for performing comparisons with predictions from various aero-hydro-servo-elastic floating wind turbine simulators for calibration and validation. Of particular interest is validating the floating offshore wind turbine simulation capabilities of NREL’s FAST open-source simulation tool. Once the validation process is complete, coupled simulators such as FAST can be used with a much greater degree of confidence in design processes for commercial development of floating offshore wind turbines. The test program subsequently described in this report was performed at MARIN (Maritime Research Institute Netherlands) in Wageningen, the Netherlands. The models considered consisted of the horizontal axis, NREL 5 MW Reference Wind Turbine (Jonkman et al., 2009) with a flexible tower affixed atop three distinct platforms: a tension leg platform (TLP), a spar-buoy modeled after the OC3 Hywind (Jonkman, 2010) and a semi-submersible. The three generic platform designs were intended to cover the spectrum of currently investigated concepts, each based on proven floating offshore structure technology. The models were tested under Froude scale wind and wave loads. The high-quality wind environments, unique to these tests, were realized in the offshore basin via a novel wind machine which exhibits negligible swirl and low turbulence intensity in the flow field. Recorded data from the floating wind turbine models included rotor torque and position, tower top and base forces and moments, mooring line tensions, six-axis platform motions and accelerations at key locations on the nacelle, tower, and platform. A large number of tests were performed ranging from simple free-decay tests to complex operating conditions with irregular sea states and dynamic winds.
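
    Froude scaling fixes how each measured quantity converts between the 1:50 model and full scale. As a reminder, the factors below follow the textbook similitude relations (assuming the same fluid density); they are standard results, not values quoted from the report:

        LAMBDA = 50.0  # geometric scale factor (full scale : model)

        froude_factors = {
            "length":   LAMBDA,         # 50x
            "time":     LAMBDA ** 0.5,  # ~7.07x (wave periods)
            "velocity": LAMBDA ** 0.5,  # ~7.07x (wind, current)
            "mass":     LAMBDA ** 3,    # 125,000x
            "force":    LAMBDA ** 3,    # 125,000x (same water density)
            "moment":   LAMBDA ** 4,    # 6,250,000x
        }

        # A 10 s full-scale wave period is reproduced in the basin at:
        print(10.0 / froude_factors["time"])  # ~1.41 s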

  6. Improving diagnostic recognition of primary hyperparathyroidism with machine learning.

    PubMed

    Somnay, Yash R; Craven, Mark; McCoy, Kelly L; Carty, Sally E; Wang, Tracy S; Greenberg, Caprice C; Schneider, David F

    2017-04-01

    Parathyroidectomy offers the only cure for primary hyperparathyroidism, but today only 50% of primary hyperparathyroidism patients are referred for operation, in large part because the condition is widely under-recognized. The diagnosis of primary hyperparathyroidism can be especially challenging with mild biochemical indices. Machine learning is a collection of methods in which computers build predictive algorithms based on labeled examples. With the aim of facilitating diagnosis, we tested the ability of machine learning to distinguish primary hyperparathyroidism from normal physiology using clinical and laboratory data. This retrospective cohort study used a labeled training set and 10-fold cross-validation to evaluate accuracy of the algorithm. Measures of accuracy included area under the receiver operating characteristic curve, precision (sensitivity), and positive and negative predictive value. Several different algorithms and ensembles of algorithms were tested using the Weka platform. Among 11,830 patients managed operatively at 3 high-volume endocrine surgery programs from March 2001 to August 2013, 6,777 underwent parathyroidectomy for confirmed primary hyperparathyroidism, and 5,053 control patients without primary hyperparathyroidism underwent thyroidectomy. Test-set accuracies for machine learning models were determined using 10-fold cross-validation. Age, sex, and serum levels of preoperative calcium, phosphate, parathyroid hormone, vitamin D, and creatinine were defined as potential predictors of primary hyperparathyroidism. Mild primary hyperparathyroidism was defined as primary hyperparathyroidism with normal preoperative calcium or parathyroid hormone levels. After testing a variety of machine learning algorithms, Bayesian network models proved most accurate, correctly classifying 95.2% of all primary hyperparathyroidism patients (area under receiver operating characteristic = 0.989). Omitting parathyroid hormone from the model did not decrease the accuracy significantly (area under receiver operating characteristic = 0.985). In mild disease cases, however, the Bayesian network model correctly classified 71.1% of patients with normal calcium and 92.1% with normal parathyroid hormone levels preoperatively. Combining Bayesian networks with AdaBoost improved accuracy to 97.2% of all cases (area under receiver operating characteristic = 0.994) and to 91.9% of primary hyperparathyroidism patients with mild disease. This was significantly improved relative to Bayesian networking alone (P < .0001). Machine learning can accurately diagnose primary hyperparathyroidism without human input, even in mild disease. Incorporation of this tool into electronic medical record systems may aid in recognition of this under-diagnosed disorder. Copyright © 2016 Elsevier Inc. All rights reserved.
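
    The study's pipeline was built in Weka with Bayesian networks; scikit-learn has no Bayesian network classifier, so the sketch below is an illustrative stand-in that boosts a naive Bayes learner with AdaBoost and scores it by 10-fold cross-validated AUC. The feature list follows the abstract; the data here are random placeholders.

        import numpy as np
        from sklearn.ensemble import AdaBoostClassifier
        from sklearn.model_selection import StratifiedKFold, cross_val_score
        from sklearn.naive_bayes import GaussianNB

        # Placeholder cohort: age, sex, calcium, phosphate, PTH, vitamin D,
        # creatinine (7 features per the abstract); 1 = pHPT, 0 = control.
        rng = np.random.default_rng(0)
        X = rng.normal(size=(500, 7))
        y = rng.integers(0, 2, size=500)

        clf = AdaBoostClassifier(GaussianNB(), n_estimators=50, random_state=0)
        cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
        auc = cross_val_score(clf, X, y, cv=cv, scoring="roc_auc")
        print(f"10-fold CV AUC: {auc.mean():.3f} +/- {auc.std():.3f}")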

  7. Measuring intra-operative decision-making during laparoscopic cholecystectomy: validity evidence for a novel interactive Web-based assessment tool.

    PubMed

    Madani, Amin; Watanabe, Yusuke; Bilgic, Elif; Pucher, Philip H; Vassiliou, Melina C; Aggarwal, Rajesh; Fried, Gerald M; Mitmaker, Elliot J; Feldman, Liane S

    2017-03-01

    Errors in judgment during laparoscopic cholecystectomy can lead to bile duct injuries and other complications. Despite correlations between outcomes, expertise and advanced cognitive skills, current methods to evaluate these skills remain subjective, rater- and situation-dependent, and non-systematic. The purpose of this study was to develop objective metrics using a Web-based platform and to obtain validity evidence for their assessment of decision-making during laparoscopic cholecystectomy. An interactive online learning platform was developed (www.thinklikeasurgeon.com). Trainees and surgeons from six institutions completed a 12-item assessment, developed based on a cognitive task analysis. Five items required subjects to draw their answer on the surgical field, and accuracy scores were calculated based on an algorithm derived from experts' responses ("visual concordance test", VCT). Test-retest reliability, internal consistency, and correlation with self-reported experience, Global Operative Assessment of Laparoscopic Skills (GOALS) score and Objective Performance Rating Scale (OPRS) score were calculated. Questionnaires were administered to evaluate the platform's usability, feasibility and educational value. Thirty-nine subjects (17 surgeons, 22 trainees) participated. There was high test-retest reliability (intraclass correlation coefficient = 0.95; n = 10) and internal consistency (Cronbach's α = 0.87). The assessment demonstrated significant differences between novices, intermediates and experts in total score (p < 0.01) and VCT score (p < 0.01). There was a high correlation between total case number and total score (ρ = 0.83, p < 0.01) and between total case number and VCT (ρ = 0.82, p < 0.01), and moderate to high correlations between total score and GOALS (ρ = 0.66, p = 0.05), VCT and GOALS (ρ = 0.83, p < 0.01), total score and OPRS (ρ = 0.67, p = 0.04), and VCT and OPRS (ρ = 0.78, p = 0.01). Most subjects agreed or strongly agreed that the platform and assessment were easy to use [n = 29 (78%)], facilitate learning intra-operative decision-making [n = 28 (81%)], and should be integrated into surgical training [n = 28 (76%)]. This study provides preliminary validity evidence for a novel interactive platform to objectively assess decision-making during laparoscopic cholecystectomy.

  8. Validation of tablet-based evaluation of color fundus images

    PubMed Central

    Christopher, Mark; Moga, Daniela C.; Russell, Stephen R.; Folk, James C.; Scheetz, Todd; Abràmoff, Michael D.

    2012-01-01

    Purpose To compare diabetic retinopathy (DR) referral recommendations made by viewing fundus images using a tablet computer to recommendations made using a standard desktop display. Methods A tablet computer (iPad) and a desktop PC with a high-definition color display were compared. For each platform, two retinal specialists independently rated 1200 color fundus images from patients at risk for DR using an annotation program, Truthseeker. The specialists determined whether each image had referable DR, and also how urgently each patient should be referred for medical examination. Graders viewed and rated the randomly presented images independently and were masked to their ratings on the alternative platform. Tablet- and desktop display-based referral ratings were compared using cross-platform, intra-observer kappa as the primary outcome measure. Additionally, inter-observer kappa, sensitivity, specificity, and area under the ROC curve (AUC) were determined. Results A high level of cross-platform, intra-observer agreement was found for the DR referral ratings between the platforms (κ=0.778) and for the two graders (κ=0.812). Inter-observer agreement was similar for the two platforms (κ=0.544 and κ=0.625 for tablet and desktop, respectively). The tablet-based ratings achieved a sensitivity of 0.848, a specificity of 0.987, and an AUC of 0.950 compared to desktop display-based ratings. Conclusions In this pilot study, tablet-based rating of color fundus images for subjects at risk for DR was consistent with desktop display-based rating. These results indicate that tablet computers can be reliably used for clinical evaluation of fundus images for DR. PMID:22495326
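
    The primary outcome above is a cross-platform, intra-observer Cohen's kappa; here is a minimal sketch with simulated ratings, assuming scikit-learn is available.

        # Hedged sketch: Cohen's kappa between one grader's tablet and desktop ratings.
        import numpy as np
        from sklearn.metrics import cohen_kappa_score

        rng = np.random.default_rng(2)
        desktop = rng.integers(0, 2, 1200)       # 1 = referable DR on the desktop display
        agree = rng.random(1200) < 0.93          # ~93% raw cross-platform agreement
        tablet = np.where(agree, desktop, 1 - desktop)

        print("cross-platform intra-observer kappa:",
              round(cohen_kappa_score(desktop, tablet), 3))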

  9. Test strategies for industrial testers for converter controls equipment

    NASA Astrophysics Data System (ADS)

    Oleniuk, P.; Di Cosmo, M.; Kasampalis, V.; Nisbet, D.; Todd, B.; Uznański, S.

    2017-04-01

    Power converters and their controls electronics are key elements for the operation of the CERN accelerator complex, having a direct impact on its availability. To prevent early-life failures and provide means to verify electronics, a set of industrial testers is used throughout the converter controls electronics' life cycle. The roles of the testers are to validate mass production during the manufacturing phase and to provide means to diagnose and repair failed modules that are brought back from operation. In the converter controls electronics section of the power converters group in the technology department of CERN (TE/EPC/CCE), two main test platforms have been adopted: a PXI platform for mixed analogue-digital functional tests and a JTAG boundary-scan platform for digital interconnection and functional tests. Depending on the functionality of the device under test, the appropriate test platform is chosen. This paper is a follow-up to results presented at the TWEPP 2015 conference, adding the boundary-scan test platform and the first results from exploitation of the test system. It reports on the test software, hardware design, and test strategy applied to a number of devices, an approach that has maximized test coverage while minimizing test design effort.

  10. Scutellarin protects against Aβ-induced learning and memory deficits in rats: involvement of nicotinic acetylcholine receptors and cholinesterase

    PubMed Central

    Guo, Li-li; Guan, Zhi-zhong; Wang, Yong-lin

    2011-01-01

    Aim: To examine the protective effects of scutellarin (Scu) on rats with learning and memory deficits induced by β-amyloid peptide (Aβ). Methods: Fifty male Wistar rats were randomly divided into 5 groups: control, sham operation, Aβ, Aβ+Scu, and Aβ+piracetam groups. Aβ25–35 was injected into the lateral ventricles (10 μg each side). Scu (10 mg/2 mL) or piracetam (10 mg/2 mL) was intragastrically administered daily for 20 consecutive days following Aβ treatment. Learning and memory were assessed with the Morris water maze test. The protein and mRNA levels of nicotinic acetylcholine receptor (nAChR) α4, α7, and β2 subunits in the brain were examined using Western blotting and real-time PCR, respectively. The activities of acetylcholinesterase (AChE) and butyrylcholinesterase (BuChE) in the brain and plasma were measured using Ellman's colorimetric method. Results: In the Aβ group, the escape latency and the time to first platform crossing were significantly increased, and the total number of platform crossings was significantly decreased, as compared with the control and sham operation groups. Both Scu and piracetam treatment significantly reduced the escape latency and the time to first platform crossing and increased the number of platform crossings, with no significant differences between the Aβ+Scu and Aβ+piracetam groups. In the Aβ group, the protein levels of nAChR α4 and α7 subunits in the cerebral cortex were significantly decreased by 42%–47% and 58%–61%, respectively, as compared to the control and sham operation groups. Scu treatment caused upregulation of α4 and α7 subunit proteins by around 24% and 30%, respectively, as compared to the Aβ group, again with no significant differences between the Aβ+Scu and Aβ+piracetam groups. The protein level of the nAChR β2 subunit showed no significant difference among the groups. The mRNA levels of nAChR α4, α7, and β2 subunits were not significantly changed. In the Aβ group, the activities of AChE and BuChE were significantly increased in the brain but significantly decreased in the plasma, as compared to the control and sham operation groups. Scu or piracetam treatment restored these activities in brain and plasma nearly to control levels. Conclusion: The results suggest that Scu may rescue some of the deleterious effects of Aβ, possibly by stimulating nAChR protein translation and regulating cholinesterase activity. PMID:21986571

  11. Architecture Design and Experimental Platform Demonstration of Optical Network based on OpenFlow Protocol

    NASA Astrophysics Data System (ADS)

    Xing, Fangyuan; Wang, Honghuan; Yin, Hongxi; Li, Ming; Luo, Shenzi; Wu, Chenguang

    2016-02-01

    With the extensive application of cloud computing and data centres, as well as constantly emerging services, bursty big data has brought huge challenges to optical networks. Consequently, the software defined optical network (SDON), which combines optical networks with software defined networking (SDN), has attracted much attention. In this paper, an OpenFlow-enabled optical node, employed in optical cross-connects (OXC) and reconfigurable optical add/drop multiplexers (ROADM), is proposed. An open-source OpenFlow controller is extended with routing strategies. In addition, an experiment platform based on the OpenFlow protocol for software defined optical networks is designed. The feasibility and availability of the OpenFlow-enabled optical nodes and the extended OpenFlow controller are validated by connectivity tests, protection switching, and load balancing experiments on this test platform.

  12. Analysis of Aurora's Performance Simulation Engine for Three Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Freeman, Janine; Simon, Joseph

    2015-07-07

    Aurora Solar Inc. is building a cloud-based optimization platform to automate the design, engineering, and permit generation process of solar photovoltaic (PV) installations. They requested that the National Renewable Energy Laboratory (NREL) validate the performance of the PV system performance simulation engine of Aurora Solar's solar design platform, Aurora. In previous work, NREL performed a validation of multiple other PV modeling tools [1], so this study builds upon that work by examining all of the same fixed-tilt systems with available module datasheets that NREL selected and used in the aforementioned study. Aurora Solar set up these three operating PV systems in their modeling platform using NREL-provided system specifications and concurrent weather data. NREL then verified the setup of these systems, ran the simulations, and compared the Aurora-predicted performance data to measured performance data for those three systems, as well as to performance data predicted by other PV modeling tools.
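
    A minimal sketch of the final comparison step, assuming hourly measured and predicted AC energy series; the arrays and the choice of annual-difference and nRMSE metrics are illustrative assumptions, not NREL's exact protocol.

        # Hedged sketch: compare predicted vs. measured hourly energy for one system.
        import numpy as np

        rng = np.random.default_rng(3)
        measured = np.clip(rng.normal(2.0, 1.5, 8760), 0, None)   # kWh per hour, synthetic
        predicted = measured * 1.02 + rng.normal(0, 0.2, 8760)    # a slightly biased model

        annual_err = (predicted.sum() - measured.sum()) / measured.sum() * 100
        nrmse = np.sqrt(np.mean((predicted - measured) ** 2)) / measured.mean() * 100
        print(f"annual energy difference: {annual_err:+.1f}%")
        print(f"hourly nRMSE: {nrmse:.1f}%")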

  13. Validity of a Jump Mat for assessing Countermovement Jump Performance in Elite Rugby Players.

    PubMed

    Dobbin, Nick; Hunwicks, Richard; Highton, Jamie; Twist, Craig

    2017-02-01

    This study determined the validity of the Just Jump System® (JJS) for measuring flight time, jump height, and peak power output (PPO) in elite rugby league players. Thirty-seven elite rugby league players performed 6 countermovement jumps (CMJ; 3 with and 3 without arms) on a jump mat and force platform. A sub-sample (n=28) was used to cross-validate the equations for flight time, jump height, and PPO. The JJS systematically overestimated flight time and jump height compared to the force platform (P<0.05), but demonstrated strong associations for flight time (with R²=0.938; without R²=0.972) and jump height (with R²=0.945; without R²=0.987). Our equations revealed no systematic difference between corrected and force platform scores and improved agreement for flight time (ratio limits of agreement: with 1.00 vs. 1.36; without 1.00 vs. 1.16) and jump height (with 1.01 vs. 1.34; without 1.01 vs. 1.15), meaning that our equations can be used to correct JJS scores for elite rugby players. While our equation improved the estimation of PPO (with 1.02; without 1.01) compared to existing equations (Harman: 1.20; Sayers: 1.04), it accounted for only 64% and 69% of PPO. © Georg Thieme Verlag KG Stuttgart · New York.
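
    Below is a minimal sketch of how such a correction equation might be derived and checked, assuming simulated flight times and log-scale ratio limits of agreement; the study's actual equations and statistics are not reproduced here.

        # Hedged sketch: fit a linear correction for jump-mat flight time, then
        # compute ratio limits of agreement on a hold-out sub-sample.
        import numpy as np

        rng = np.random.default_rng(4)
        force = rng.normal(0.55, 0.05, 37)                   # force-platform flight time (s)
        mat = 1.15 * force + 0.01 + rng.normal(0, 0.01, 37)  # mat systematically overestimates

        train, test = slice(0, 9), slice(9, 37)              # 28 players held out
        b, a = np.polyfit(mat[train], force[train], 1)       # corrected = a + b * mat
        corrected = a + b * mat[test]

        ratio = corrected / force[test]
        log_ratio = np.log(ratio)
        loa = np.exp(log_ratio.mean() + 1.96 * log_ratio.std(ddof=1))
        print(f"mean ratio = {ratio.mean():.3f}, ratio limits of agreement factor = {loa:.3f}")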

  14. A Cross-Platform Infrastructure for Scalable Runtime Application Performance Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jack Dongarra; Shirley Moore; Bart Miller, Jeffrey Hollingsworth

    2005-03-15

    The purpose of this project was to build an extensible cross-platform infrastructure to facilitate the development of accurate and portable performance analysis tools for current and future high performance computing (HPC) architectures. Major accomplishments include tools and techniques for multidimensional performance analysis, as well as improved support for dynamic performance monitoring of multithreaded and multiprocess applications. Previous performance tool development has been limited by the burden of having to re-write a platform-dependent low-level substrate for each architecture/operating system pair in order to obtain the necessary performance data from the system. Manual interpretation of performance data is not scalable for large-scale, long-running applications. The infrastructure developed by this project provides a foundation for building portable and scalable performance analysis tools, with the end goal of providing application developers with the information they need to analyze, understand, and tune the performance of terascale applications on HPC architectures. The backend portion of the infrastructure provides runtime instrumentation capability and access to hardware performance counters, with thread-safety for shared memory environments and a communication substrate to support instrumentation of multiprocess and distributed programs. Front-end interfaces provide tool developers with a well-defined, platform-independent set of calls for requesting performance data. End-user tools have been developed that demonstrate runtime data collection, on-line and off-line analysis of performance data, and multidimensional performance analysis. The infrastructure is based on two underlying performance instrumentation technologies: the PAPI cross-platform library interface to hardware performance counters and the cross-platform Dyninst library interface for runtime modification of executable images. The Paradyn and KOJAK projects have made use of this infrastructure to build performance measurement and analysis tools that scale to long-running programs on large parallel and distributed systems and that automate much of the search for performance bottlenecks.

  15. Single-Port Surgery: Laboratory Experience with the daVinci Single-Site Platform

    PubMed Central

    Haber, Georges-Pascal; Kaouk, Jihad; Kroh, Matthew; Chalikonda, Sricharan; Falcone, Tommaso

    2011-01-01

    Background and Objectives: The purpose of this study was to evaluate the feasibility and validity of a dedicated da Vinci single-port platform in the porcine model in the performance of gynecologic surgery. Methods: This pilot study was conducted in 4 female pigs. All pigs underwent general anesthesia and were placed in the supine and flank positions. A 2-cm umbilical incision was made, through which a robotic single-port device was placed and pneumoperitoneum obtained. A data set was collected for each procedure and included port placement time, docking time, operative time, blood loss, and complications. Operative times were compared between cases and procedures by use of the Student t test. Results: A total of 28 surgical procedures (8 oophorectomies, 4 hysterectomies, 8 pelvic lymph node dissections, 4 aorto-caval nodal dissections, 2 bladder repairs, 1 uterine horn anastomosis, and 1 radical cystectomy) were performed. There was no statistically significant difference in operating times for symmetrical procedures among animals (P=0.3215). Conclusions: This animal study demonstrates that single-port robotic surgery using a dedicated single-site platform allows technically challenging procedures to be performed within acceptable operative times and without complications or insertion of additional trocars. PMID:21902962

  16. Microfluidic platform combining droplets and magnetic tweezers: application to HER2 expression in cancer diagnosis.

    PubMed

    Ferraro, Davide; Champ, Jérôme; Teste, Bruno; Serra, Marco; Malaquin, Laurent; Viovy, Jean-Louis; de Cremoux, Patricia; Descroix, Stephanie

    2016-05-09

    The development of precision medicine, together with the multiplication of targeted therapies and associated molecular biomarkers, calls for major progress in genetic analysis methods, allowing increased multiplexing and the implementation of more complex decision trees, without cost increase or loss of robustness. We present a platform combining droplet microfluidics and magnetic tweezers, performing RNA purification, reverse transcription, and amplification in a fully automated and programmable way, in droplets of 250 nL directly sampled from a microtiter plate. This platform decreases sample consumption about 100-fold compared with current robotized platforms, and it reduces human manipulations and contamination risk. The platform's performance was first evaluated on cell lines, showing robust operation on RNA quantities corresponding to less than one cell, and then clinically validated with a cohort of 21 breast cancer samples, for the determination of their HER2 expression status, in a blind comparison with an established routine clinical analysis.

  17. A cross-platform GUI to control instruments compliant with SCPI through VISA

    NASA Astrophysics Data System (ADS)

    Roach, Eric; Liu, Jing

    2015-10-01

    In nuclear physics experiments, it is necessary and important to control instruments from a PC, which automates many tasks that would otherwise require human operation. Not only does this make long-term measurements possible, but it also makes repetitive operations less error-prone. We created a graphical user interface (GUI) to control instruments connected to a PC through RS232, USB, LAN, etc. The GUI is developed using Qt Creator, a cross-platform integrated development environment, which makes it portable to various operating systems, including those commonly used in mobile devices. The NI-VISA library is used in the back end so that the GUI can control instruments connected through various I/O interfaces without any modification. Commonly used SCPI commands can be sent to different instruments using buttons, sliders, knobs, and other widgets provided by Qt Creator. As an example, we demonstrate how to set and fetch parameters and how to retrieve and display data from an Agilent Digital Storage Oscilloscope X3034A with the GUI. Our GUI can be easily used for other instruments compliant with SCPI and VISA with little or no modification.
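
    For readers who prefer Python over the Qt/C++ stack described above, the same SCPI-over-VISA flow can be sketched with the PyVISA library; the VISA address and the timebase command below are placeholders typical of Agilent/Keysight scopes and must be adapted to the actual instrument.

        # Hedged sketch: query and set SCPI parameters over VISA with PyVISA.
        import pyvisa

        rm = pyvisa.ResourceManager()
        scope = rm.open_resource("TCPIP0::192.168.1.50::INSTR")  # hypothetical address

        print(scope.query("*IDN?"))             # identify the instrument
        scope.write(":TIMEBASE:SCALE 1e-3")     # set a parameter (1 ms/div)
        print(scope.query(":TIMEBASE:SCALE?"))  # fetch it back
        scope.close()
        rm.close()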

  18. Offshore platform sourced pollution monitoring using space-borne fully polarimetric C and X band synthetic aperture radar.

    PubMed

    Singha, Suman; Ressel, Rudolf

    2016-11-15

    Use of polarimetric SAR data for offshore pollution monitoring is relatively new and shows great potential for operational offshore platform monitoring. This paper describes the development of an automated oil spill detection chain for operational purposes based on C-band (RADARSAT-2) and X-band (TerraSAR-X) fully polarimetric images, wherein we use polarimetric features to characterize oil spills and look-alikes. A number of near-coincident TerraSAR-X and RADARSAT-2 images were acquired over offshore platforms. Ten polarimetric feature parameters were extracted from different types of oil and 'look-alike' spots and divided into training and validation datasets. The extracted features were then used to develop a pixel-based Artificial Neural Network classifier. Mutual information content among the extracted features was assessed, and the feature parameters were ranked according to their ability to discriminate between oil spill and look-alike spots. Polarimetric features such as Scattering Diversity, Surface Scattering Fraction, and Span proved most suitable for operational services. Copyright © 2016 Elsevier Ltd. All rights reserved.
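
    A minimal sketch of the classification stage, assuming scikit-learn, synthetic stand-ins for the ten polarimetric features, and a small multilayer perceptron in place of the authors' exact network.

        # Hedged sketch: rank features by mutual information, then train a pixel classifier.
        import numpy as np
        from sklearn.feature_selection import mutual_info_classif
        from sklearn.model_selection import train_test_split
        from sklearn.neural_network import MLPClassifier
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(5)
        X = rng.normal(size=(5000, 10))          # 10 polarimetric features per pixel
        y = (X[:, 0] + 0.8 * X[:, 1] + rng.normal(0, 1, 5000) > 0).astype(int)  # 1 = oil

        mi = mutual_info_classif(X, y, random_state=0)
        print("feature ranking (best first):", np.argsort(mi)[::-1])

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
        clf = make_pipeline(StandardScaler(),
                            MLPClassifier(hidden_layer_sizes=(20,), max_iter=500,
                                          random_state=0))
        clf.fit(X_tr, y_tr)
        print("validation accuracy:", round(clf.score(X_te, y_te), 3))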

  19. Painless File Extraction: The A(rc)--Z(oo) of Internet Archive Formats.

    ERIC Educational Resources Information Center

    Simmonds, Curtis

    1993-01-01

    Discusses extraction programs needed to postprocess software downloaded from the Internet that has been archived and compressed for the purposes of storage and file transfer. Archiving formats for DOS, Macintosh, and UNIX operating systems are described; and cross-platform compression utilities are explained. (LRW)

  20. Basic Simulation Environment for Highly Customized Connected and Autonomous Vehicle Kinematic Scenarios.

    PubMed

    Chai, Linguo; Cai, Baigen; ShangGuan, Wei; Wang, Jian; Wang, Huashen

    2017-08-23

    To enhance the realism of Connected and Autonomous Vehicle (CAV) kinematic simulation scenarios and to guarantee the accuracy and reliability of the verification, a four-layer CAV kinematic simulation framework, composed of a road network layer, a vehicle operating layer, an uncertainties modelling layer, and a demonstrating layer, is proposed in this paper. Properties of the intersections are defined to describe the road network. A target-position-based vehicle position updating method is designed to simulate vehicle behaviors such as lane changing and turning. Vehicle kinematic models are implemented to maintain the status of the vehicles while they are moving toward the target position. Priorities for individual vehicle control are authorized for different layers. Operating mechanisms for CAV uncertainties, defined in this paper as position error and communication delay, are implemented in the simulation to enhance its realism. A simulation platform is developed based on the proposed methodology. A comparison of simulated and theoretical vehicle delay has been analyzed to prove the validity and credibility of the platform. The scenario of rear-end collision avoidance is conducted to verify the uncertainty operating mechanisms, and a slot-based intersections (SIs) control strategy is realized and verified in the simulation platform to show the platform's support for CAV kinematic simulation and verification.
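
    A minimal sketch of the vehicle-operating-layer idea, assuming a simple constant-speed kinematic model with illustrative values for the position-error and communication-delay uncertainties.

        # Hedged sketch: target-position-based updating with injected uncertainties.
        import random

        def step_toward(pos, target, speed, dt, pos_sigma=0.1):
            """Advance one simulation step toward the target position."""
            dx, dy = target[0] - pos[0], target[1] - pos[1]
            dist = (dx * dx + dy * dy) ** 0.5
            if dist < speed * dt:
                new = target
            else:
                new = (pos[0] + speed * dt * dx / dist, pos[1] + speed * dt * dy / dist)
            # position-error uncertainty: what other vehicles observe is perturbed
            observed = (new[0] + random.gauss(0, pos_sigma),
                        new[1] + random.gauss(0, pos_sigma))
            return new, observed

        pos, target = (0.0, 0.0), (100.0, 40.0)
        delay_buffer = []   # communication delay: deliver observations 2 steps late
        for t in range(50):
            pos, observed = step_toward(pos, target, speed=15.0, dt=0.2)
            delay_buffer.append(observed)
            broadcast = delay_buffer.pop(0) if len(delay_buffer) > 2 else None
            if broadcast is not None:
                pass        # other CAVs would react to 'broadcast' here
        print("final position:", pos)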

  1. Basic Simulation Environment for Highly Customized Connected and Autonomous Vehicle Kinematic Scenarios

    PubMed Central

    Chai, Linguo; Cai, Baigen; ShangGuan, Wei; Wang, Jian; Wang, Huashen

    2017-01-01

    To enhance the realism of Connected and Autonomous Vehicle (CAV) kinematic simulation scenarios and to guarantee the accuracy and reliability of the verification, a four-layer CAV kinematic simulation framework, composed of a road network layer, a vehicle operating layer, an uncertainties modelling layer, and a demonstrating layer, is proposed in this paper. Properties of the intersections are defined to describe the road network. A target-position-based vehicle position updating method is designed to simulate vehicle behaviors such as lane changing and turning. Vehicle kinematic models are implemented to maintain the status of the vehicles while they are moving toward the target position. Priorities for individual vehicle control are authorized for different layers. Operating mechanisms for CAV uncertainties, defined in this paper as position error and communication delay, are implemented in the simulation to enhance its realism. A simulation platform is developed based on the proposed methodology. A comparison of simulated and theoretical vehicle delay has been analyzed to prove the validity and credibility of the platform. The scenario of rear-end collision avoidance is conducted to verify the uncertainty operating mechanisms, and a slot-based intersections (SIs) control strategy is realized and verified in the simulation platform to show the platform's support for CAV kinematic simulation and verification. PMID:28832518

  2. Visualization of Vgi Data Through the New NASA Web World Wind Virtual Globe

    NASA Astrophysics Data System (ADS)

    Brovelli, M. A.; Kilsedar, C. E.; Zamboni, G.

    2016-06-01

    GeoWeb 2.0, laying the foundations of Volunteered Geographic Information (VGI) systems, has led to platforms where users can contribute to geographic knowledge that is open to access. Moreover, as a result of advancements in 3D visualization, virtual globes able to visualize geographic data even in browsers have emerged. However, the integration of VGI systems and virtual globes has not been fully realized. The study presented aims to visualize volunteered data in 3D, considering also ease of use for the general public, using Free and Open Source Software (FOSS). NASA's new Application Programming Interface (API), Web World Wind, written in JavaScript and based on the Web Graphics Library (WebGL), is cross-platform and cross-browser, so a virtual globe created with this API is accessible through any WebGL-supported browser on different operating systems and devices. As a result, no installation or configuration is required on the client side, making the collected data more usable; this is not the case with World Wind for Java, which requires installation and configuration of the Java Virtual Machine (JVM). Furthermore, the data collected through various VGI platforms might be in different formats, stored in a traditional relational database or in a NoSQL database. The project developed aims to visualize and query data collected through the Open Data Kit (ODK) platform and a cross-platform application, where data are stored in a relational PostgreSQL database and a NoSQL CouchDB database, respectively.

  3. Cross-Platform Toxicogenomics for the Prediction of Non-Genotoxic Hepatocarcinogenesis in Rat

    PubMed Central

    Metzger, Ute; Templin, Markus F.; Plummer, Simon; Ellinger-Ziegelbauer, Heidrun; Zell, Andreas

    2014-01-01

    In the area of omics profiling in toxicology, i.e. toxicogenomics, characteristic molecular profiles have previously been incorporated into prediction models for early assessment of a carcinogenic potential and mechanism-based classification of compounds. Traditionally, the biomarker signatures used for model construction were derived from individual high-throughput techniques, such as microarrays designed for monitoring global mRNA expression. In this study, we built predictive models by integrating omics data across complementary microarray platforms and introduced new concepts for modeling of pathway alterations and molecular interactions between multiple biological layers. We trained and evaluated diverse machine learning-based models, differing in the incorporated features and learning algorithms on a cross-omics dataset encompassing mRNA, miRNA, and protein expression profiles obtained from rat liver samples treated with a heterogeneous set of substances. Most of these compounds could be unambiguously classified as genotoxic carcinogens, non-genotoxic carcinogens, or non-hepatocarcinogens based on evidence from published studies. Since mixed characteristics were reported for the compounds Cyproterone acetate, Thioacetamide, and Wy-14643, we reclassified these compounds as either genotoxic or non-genotoxic carcinogens based on their molecular profiles. Evaluating our toxicogenomics models in a repeated external cross-validation procedure, we demonstrated that the prediction accuracy of our models could be increased by joining the biomarker signatures across multiple biological layers and by adding complex features derived from cross-platform integration of the omics data. Furthermore, we found that adding these features resulted in a better separation of the compound classes and a more confident reclassification of the three undefined compounds as non-genotoxic carcinogens. PMID:24830643
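
    A minimal sketch of the cross-platform integration idea described above, assuming synthetic mRNA/miRNA/protein blocks, a random-forest classifier, and repeated cross-validation; the block sizes and the learner are illustrative, not the authors' models.

        # Hedged sketch: concatenate omics blocks and evaluate by repeated CV.
        import numpy as np
        from sklearn.ensemble import RandomForestClassifier
        from sklearn.model_selection import RepeatedStratifiedKFold, cross_val_score

        rng = np.random.default_rng(11)
        n = 120
        mrna = rng.normal(size=(n, 300))       # mRNA expression features
        mirna = rng.normal(size=(n, 60))       # miRNA expression features
        protein = rng.normal(size=(n, 40))     # protein expression features
        y = rng.integers(0, 3, n)              # genotoxic / non-genotoxic / non-carcinogen

        X = np.hstack([mrna, mirna, protein])  # cross-platform feature integration
        cv = RepeatedStratifiedKFold(n_splits=5, n_repeats=10, random_state=0)
        acc = cross_val_score(RandomForestClassifier(random_state=0), X, y, cv=cv)
        print(f"repeated CV accuracy = {acc.mean():.2f} +/- {acc.std():.2f}")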

  4. Current status of validation for robotic surgery simulators - a systematic review.

    PubMed

    Abboudi, Hamid; Khan, Mohammed S; Aboumarzouk, Omar; Guru, Khurshid A; Challacombe, Ben; Dasgupta, Prokar; Ahmed, Kamran

    2013-02-01

    To analyse studies validating the effectiveness of robotic surgery simulators. The MEDLINE®, EMBASE®, and PsycINFO® databases were systematically searched until September 2011. References from retrieved articles were reviewed to broaden the search. The simulator name, training tasks, participant level, training duration, and evaluation scoring were extracted from each study. We also extracted data on feasibility, validity, cost-effectiveness, reliability, and educational impact. We identified 19 studies investigating simulation options in robotic surgery. There are five different robotic surgery simulation platforms available on the market. In all, 11 studies sought opinion and compared performance between two different groups: 'expert' and 'novice'. Experts ranged in experience from 21 to 2,200 robotic cases. The novice groups consisted of participants with no prior experience on a robotic platform and were often medical students or junior doctors. The Mimic dV-Trainer®, ProMIS®, SimSurgery Educational Platform® (SEP), and Intuitive systems have shown face, content, and construct validity. The Robotic Surgical Simulator™ system has only been face and content validated. All of the simulators except SEP have shown educational impact. Feasibility and cost-effectiveness of simulation systems were not evaluated in any trial. Virtual reality simulators were shown to be effective training tools for junior trainees. Simulation training holds the greatest potential to be used as an adjunct to traditional training methods to equip the next generation of robotic surgeons with the skills required to operate safely. However, current simulation models have only been validated in small studies. There is no evidence to suggest one type of simulator provides more effective training than any other. More research is needed to validate simulated environments further and to investigate the effectiveness of animal and cadaveric training in robotic surgery. © 2012 BJU International.

  5. The NPOESS Crosstrack Infrared Sounder (CrIS) and Advanced Technology Microwave Sounder (ATMS) as a Companion to the New Generation AIRS/AMSU and IASI/AMSU Sounder Suites

    NASA Astrophysics Data System (ADS)

    Bingham, G. E.; Pougatchev, N. S.; Zavyalov, V.; Esplin, M.; Blackwell, W. J.; Barnet, C.

    2009-12-01

    The NPOESS Preparatory Project is serving the operations and research community as the bridge mission between the Earth Observing System and the National Polar-orbiting Operational Environmental Satellite System. The Cross-track Infrared Sounder (CrIS), combined with the Advanced Technology Microwave Sounder (ATMS), forms the core instrument suite providing the key performance temperature and humidity profiles (along with some other atmospheric constituent information). Both the high-spectral-resolution CrIS and the upgraded microwave sounder (ATMS) will work in parallel with the already-orbiting Atmospheric Infrared Sounder (AIRS/AMSU) on the EOS AQUA platform and the Infrared Atmospheric Sounding Interferometer (IASI/AMSU) on the METOP-A satellite. This presentation will review the CrIS/ATMS capabilities in the context of continuity with the excellent performance records established by AIRS and IASI. The CrIS sensor is undergoing its final calibration and characterization testing, and the results and the Sensor Data Record process are being validated against this excellent dataset. The comparison between CrIS, AIRS, and IASI will include spectral, spatial, and radiometric performance and sounding capability comparisons.

  6. A novel low-cost open-hardware platform for monitoring soil water content and multiple soil-air-vegetation parameters.

    PubMed

    Bitella, Giovanni; Rossi, Roberta; Bochicchio, Rocco; Perniola, Michele; Amato, Mariana

    2014-10-21

    Monitoring soil water content at high spatio-temporal resolution, coupled with other sensor data, is crucial for applications oriented towards water sustainability in agriculture, such as precision irrigation or phenotyping root traits for drought tolerance. The cost of instrumentation, however, limits measurement frequency and the number of sensors. The objective of this work was to design a low-cost "open hardware" platform for multi-sensor measurements, including water content at different depths and air and soil temperatures. The system is based on an open-source ARDUINO microcontroller board, programmed in a simple integrated development environment (IDE). Low-cost high-frequency dielectric probes were used in the platform and lab-tested on three non-saline soils (EC 1:2.5 < 0.1 mS/cm). Empirical calibration curves were subjected to cross-validation (leave-one-out method), and the normalized root mean square errors (NRMSE) were 0.09 for the overall model, 0.09 for the sandy soil, 0.07 for the clay loam, and 0.08 for the sandy loam. The overall model (pooled soil data) fitted the data very well (R² = 0.89) and showed high stability, generating very similar RMSEs during training and validation (RMSE(training) = 2.63; RMSE(validation) = 2.61). Data recorded on the card were automatically sent to a remote server, allowing repeated field-data quality checks. This work provides a framework for the replication and upgrading of a customized low-cost platform, consistent with the open-source approach whereby sharing information on equipment design and software facilitates the adoption and continuous improvement of existing technologies.
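
    A minimal sketch of the calibration-validation step, assuming a quadratic calibration curve fit in scikit-learn and synthetic probe readings; the leave-one-out predictions give RMSE/NRMSE figures analogous to (not equal to) those reported above.

        # Hedged sketch: leave-one-out validation of an empirical calibration curve.
        import numpy as np
        from sklearn.linear_model import LinearRegression
        from sklearn.model_selection import LeaveOneOut, cross_val_predict
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import PolynomialFeatures

        rng = np.random.default_rng(6)
        raw = rng.uniform(0.2, 0.9, 30)[:, None]                 # normalized probe reading
        vwc = 5 + 40 * raw[:, 0] ** 1.5 + rng.normal(0, 2, 30)   # water content (% vol)

        model = make_pipeline(PolynomialFeatures(2), LinearRegression())
        pred = cross_val_predict(model, raw, vwc, cv=LeaveOneOut())

        rmse = np.sqrt(np.mean((pred - vwc) ** 2))
        nrmse = rmse / (vwc.max() - vwc.min())
        print(f"LOO RMSE = {rmse:.2f}, NRMSE = {nrmse:.2f}")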

  7. Reconfigurable, Intelligently-Adaptive, Communication System, an SDR Platform

    NASA Technical Reports Server (NTRS)

    Roche, Rigoberto

    2016-01-01

    The Space Telecommunications Radio System (STRS) provides a common, consistent framework to abstract the application software from the radio platform hardware. STRS aims to reduce the cost and risk of using complex, configurable, and reprogrammable radio systems across NASA missions. The Glenn Research Center (GRC) team made a software-defined radio (SDR) platform STRS compliant by adding an STRS operating environment and a field programmable gate array (FPGA) wrapper capable of implementing each of the platform's interfaces, as well as a test waveform to exercise those interfaces. This effort provides a framework for waveform development on an STRS-compliant platform to support future space communication systems for advanced exploration missions. Validated STRS-compliant applications provide tested code with extensive documentation to potentially reduce the risk, cost, and effort of developing space-deployable SDRs. This paper discusses the advantages of STRS, the integration of STRS onto the Reconfigurable, Intelligently-Adaptive Communication System (RIACS) SDR platform, the sample waveform, and the wrapper development efforts. The paper emphasizes the infusion of the STRS architecture onto the RIACS platform for potential use in next-generation SDRs for advanced exploration missions.

  8. A Hardware-in-the-Loop Simulation Platform for the Verification and Validation of Safety Control Systems

    NASA Astrophysics Data System (ADS)

    Rankin, Drew J.; Jiang, Jin

    2011-04-01

    Verification and validation (V&V) of safety control system quality and performance is required prior to installing control system hardware within nuclear power plants (NPPs). Thus, the objective of the hardware-in-the-loop (HIL) platform introduced in this paper is to verify the functionality of these safety control systems. The developed platform provides a flexible simulated testing environment which enables synchronized coupling between the real and simulated worlds. Within the platform, National Instruments (NI) data acquisition (DAQ) hardware provides an interface between a programmable electronic system under test (SUT) and a simulation computer. Further, NI LabVIEW resides on this remote DAQ workstation for signal conversion and routing between Ethernet and standard industrial signals, as well as for the user interface. The platform is applied to the testing of a simplified implementation of Canadian Deuterium Uranium (CANDU) shutdown system no. 1 (SDS1), which monitors only the steam generator level of the simulated NPP. CANDU NPP simulation is performed on a Darlington NPP desktop training simulator provided by Ontario Power Generation (OPG). Simplified SDS1 logic is implemented on an Invensys Tricon v9 programmable logic controller (PLC) to test the performance of both the safety controller and the implemented logic. Prior to HIL simulation, platform availability of over 95% is achieved for the configuration used during the V&V of the PLC. Comparison of HIL simulation results to benchmark simulations shows good operational performance of the PLC following a postulated initiating event (PIE).

  9. A versatile retarding potential analyzer for nano-satellite platforms.

    PubMed

    Fanelli, L; Noel, S; Earle, G D; Fish, C; Davidson, R L; Robertson, R V; Marquis, P; Garg, V; Somasundaram, N; Kordella, L; Kennedy, P

    2015-12-01

    The design of the first retarding potential analyzer (RPA) built specifically for use on resource-limited cubesat platforms is described. The size, mass, and power consumption are consistent with the limitations of a nano-satellite, but the performance specifications are commensurate with those of RPAs flown on much larger platforms. The instrument is capable of measuring the ion density, temperature, and the ram component of the ion velocity in the spacecraft reference frame, while also providing estimates of the ion composition. The mechanical and electrical designs are described, as are the operating modes, command and data structure, and timing scheme. Test data obtained using an ion source inside a laboratory vacuum chamber are presented to validate the performance of the new design.

  10. Multi-task learning for cross-platform siRNA efficacy prediction: an in-silico study

    PubMed Central

    2010-01-01

    Background Gene silencing using exogenous small interfering RNAs (siRNAs) is now a widespread molecular tool for gene functional study and new-drug target identification. The key mechanism in this technique is to design efficient siRNAs that are incorporated into the RNA-induced silencing complexes (RISC) to bind and interact with the mRNA targets and repress their translation into proteins. Although considerable progress has been made in the computational analysis of siRNA binding efficacy, little joint analysis of different RNAi experiments conducted under different experimental scenarios has been done so far, even though joint analysis is an important issue in cross-platform siRNA efficacy prediction. A collective analysis of RNAi mechanisms for different datasets and experimental conditions can often provide new clues on the design of potent siRNAs. Results An elegant multi-task learning paradigm for cross-platform siRNA efficacy prediction is proposed. Experimental studies were performed on a large dataset of siRNA sequences which encompasses several RNAi experiments recently conducted by different research groups. By using our multi-task learning method, the synergy among different experiments is exploited and an efficient multi-task predictor for siRNA efficacy prediction is obtained. The 19 most popular biological features for siRNA were ranked according to their joint importance in multi-task learning. Furthermore, the hypothesis is validated that the siRNA binding efficacy on different messenger RNAs (mRNAs) has different conditional distributions, so that multi-task learning can be conducted by viewing tasks at an "mRNA" level rather than at the "experiment" level. Such distribution diversity derived from siRNAs bound to different mRNAs helps indicate that the properties of the target mRNA have important implications for siRNA binding efficacy. Conclusions The knowledge gained from our study provides useful insights on how to analyze various cross-platform RNAi data for uncovering their complex mechanism. PMID:20380733
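
    One simple way to make the multi-task idea concrete is mean-regularized ridge regression, where each task's weight vector is shrunk toward a shared mean so that tasks borrow strength from one another; the sketch below is an illustration of the paradigm on synthetic data, not the authors' exact algorithm.

        # Hedged sketch: mean-regularized multi-task ridge regression.
        import numpy as np

        rng = np.random.default_rng(7)
        d, tasks = 19, 5                          # 19 siRNA features, 5 tasks
        w_true = rng.normal(size=d)
        data = []
        for _ in range(tasks):
            X = rng.normal(size=(40, d))
            y = X @ (w_true + rng.normal(0, 0.3, d)) + rng.normal(0, 0.1, 40)
            data.append((X, y))

        lam, mu = 1.0, 5.0                        # per-task and coupling penalties
        W = np.zeros((tasks, d))
        for _ in range(50):                       # alternating minimization
            w_bar = W.mean(axis=0)                # shared mean across tasks
            for t, (X, y) in enumerate(data):
                # minimize ||Xw - y||^2 + lam||w||^2 + mu||w - w_bar||^2 over w
                A = X.T @ X + (lam + mu) * np.eye(d)
                b = X.T @ y + mu * w_bar
                W[t] = np.linalg.solve(A, b)

        print("correlation of learned mean with true weights:",
              round(np.corrcoef(W.mean(axis=0), w_true)[0, 1], 3))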

  11. Multi-task learning for cross-platform siRNA efficacy prediction: an in-silico study.

    PubMed

    Liu, Qi; Xu, Qian; Zheng, Vincent W; Xue, Hong; Cao, Zhiwei; Yang, Qiang

    2010-04-10

    Gene silencing using exogenous small interfering RNAs (siRNAs) is now a widespread molecular tool for gene functional study and new-drug target identification. The key mechanism in this technique is to design efficient siRNAs that are incorporated into the RNA-induced silencing complexes (RISC) to bind and interact with the mRNA targets and repress their translation into proteins. Although considerable progress has been made in the computational analysis of siRNA binding efficacy, little joint analysis of different RNAi experiments conducted under different experimental scenarios has been done so far, even though joint analysis is an important issue in cross-platform siRNA efficacy prediction. A collective analysis of RNAi mechanisms for different datasets and experimental conditions can often provide new clues on the design of potent siRNAs. An elegant multi-task learning paradigm for cross-platform siRNA efficacy prediction is proposed. Experimental studies were performed on a large dataset of siRNA sequences which encompasses several RNAi experiments recently conducted by different research groups. By using our multi-task learning method, the synergy among different experiments is exploited and an efficient multi-task predictor for siRNA efficacy prediction is obtained. The 19 most popular biological features for siRNA were ranked according to their joint importance in multi-task learning. Furthermore, the hypothesis is validated that the siRNA binding efficacy on different messenger RNAs (mRNAs) has different conditional distributions, so that multi-task learning can be conducted by viewing tasks at an "mRNA" level rather than at the "experiment" level. Such distribution diversity derived from siRNAs bound to different mRNAs helps indicate that the properties of the target mRNA have important implications for siRNA binding efficacy. The knowledge gained from our study provides useful insights on how to analyze various cross-platform RNAi data for uncovering their complex mechanism.

  12. Testing single point incremental forming molds for thermoforming operations

    NASA Astrophysics Data System (ADS)

    Afonso, Daniel; de Sousa, Ricardo Alves; Torcato, Ricardo

    2016-10-01

    Low-pressure polymer processing operations such as thermoforming or rotational molding use much simpler molds than high-pressure processes like injection molding. However, despite the low forces involved, manufacturing molds for these operations is still a very material-, energy-, and time-consuming activity. The goal of the research is to develop and validate a method for manufacturing plastically formed sheet metal molds by single point incremental forming (SPIF) for thermoforming operations. Stewart-platform-based SPIF machines allow the forming of thick metal sheets, granting the required structural stiffness for the mold surface while keeping the short manufacturing lead time and low thermal inertia.

  13. BEARS: a multi-mission anomaly response system

    NASA Astrophysics Data System (ADS)

    Roberts, Bryce A.

    2009-05-01

    The Mission Operations Group at UC Berkeley's Space Sciences Laboratory operates a highly automated ground station and presently a fleet of seven satellites, each with its own associated command and control console. However, the requirement for prompt anomaly detection and resolution is shared between the ground segment and all spacecraft. The efficient, low-cost operation and "lights-out" staffing of the Mission Operations Group require that controllers and engineers be notified of spacecraft and ground system problems around the clock. The Berkeley Emergency Anomaly and Response System (BEARS) is an in-house developed web- and paging-based software system that meets this need. BEARS was developed as a replacement for an existing emergency reporting software system that was too closed-source, platform-specific, expensive, and antiquated to expand or maintain. To avoid these limitations, the new system design leverages cross-platform, open-source software products such as MySQL, PHP, and Qt. Anomaly notifications and responses make use of the two-way paging capabilities of modern smart phones.

  14. SU-F-R-22: Malignancy Classification for Small Pulmonary Nodules with Radiomics and Logistic Regression

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, W; Tu, S

    Purpose: We conducted a retrospective Radiomics study on classifying the malignancy of small pulmonary nodules. A machine learning algorithm, logistic regression, and the open research platform for Radiomics, IBEX (Imaging Biomarker Explorer), were used to evaluate classification accuracy. Methods: The training set included 100 CT image series from cancer patients with small pulmonary nodules (average diameter 1.10 cm). These patients registered at Chang Gung Memorial Hospital and received a CT-guided lung cancer lobectomy. The specimens were classified by experienced pathologists as benign (B) or malignant (M). CT images with a slice thickness of 0.625 mm were acquired from a GE BrightSpeed 16 scanner. The study was formally approved by our institutional review board. Nodules were delineated and 374 feature parameters were extracted with IBEX. We first used t-tests and a p-value criterion to study which features could differentiate between groups B and M. We then implemented a logistic regression algorithm to perform nodule malignancy classification. 10-fold cross-validation and the receiver operating characteristic (ROC) curve were used to evaluate classification accuracy. Finally, hierarchical clustering analysis, the Spearman rank correlation coefficient, and a clustering heat map were used to further study correlation characteristics among different features. Results: 238 features were found to differentiate between groups B and M based on statistical p-values less than 0.05. A forward search algorithm was used to select the optimal combination of features for the best classification, and 9 features were identified. Our study found the best accuracy of classifying malignancy was 0.79±0.01 with 10-fold cross-validation. The area under the ROC curve was 0.81±0.02. Conclusion: Benign nodules may be treated as malignant tumors in low-dose CT, and patients may undergo unnecessary surgeries or treatments. Our study may help radiologists differentiate nodule malignancy in low-dose CT.
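
    A minimal sketch of the feature-filtering and classification pipeline, assuming synthetic stand-ins for the 374 IBEX features; note that, unlike a rigorous pipeline, filtering on the full dataset before cross-validation optimistically biases the AUC.

        # Hedged sketch: t-test feature filter, then 10-fold CV logistic regression.
        import numpy as np
        from scipy.stats import ttest_ind
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        rng = np.random.default_rng(8)
        X = rng.normal(size=(100, 374))        # 374 radiomic features per nodule
        y = rng.integers(0, 2, 100)            # 1 = malignant
        X[y == 1, :9] += 0.8                   # make 9 features informative

        _, p = ttest_ind(X[y == 0], X[y == 1], axis=0)
        keep = p < 0.05                        # note: pre-CV selection inflates the AUC
        print("features passing t-test filter:", keep.sum())

        clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
        auc = cross_val_score(clf, X[:, keep], y, cv=10, scoring="roc_auc")
        print(f"10-fold CV AUC = {auc.mean():.2f} +/- {auc.std():.2f}")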

  15. CROPPER: a metagene creator resource for cross-platform and cross-species compendium studies.

    PubMed

    Paananen, Jussi; Storvik, Markus; Wong, Garry

    2006-09-22

    Current genomic research methods provide researchers with enormous amounts of data. Combining data from different high-throughput research technologies commonly available in biological databases can lead to novel findings and increase research efficiency. However, combining data from different heterogeneous sources is often a very arduous task. These sources can be different microarray technology platforms, genomic databases, or experiments performed on various species. Our aim was to develop a software program that could facilitate the combining of data from heterogeneous sources, and thus allow researchers to perform genomic cross-platform/cross-species studies and to use existing experimental data for compendium studies. We have developed a web-based software resource, called CROPPER that uses the latest genomic information concerning different data identifiers and orthologous genes from the Ensembl database. CROPPER can be used to combine genomic data from different heterogeneous sources, allowing researchers to perform cross-platform/cross-species compendium studies without the need for complex computational tools or the requirement of setting up one's own in-house database. We also present an example of a simple cross-platform/cross-species compendium study based on publicly available Parkinson's disease data derived from different sources. CROPPER is a user-friendly and freely available web-based software resource that can be successfully used for cross-species/cross-platform compendium studies.

  16. EOS Terra Validation Program

    NASA Technical Reports Server (NTRS)

    Starr, David

    1999-01-01

    The EOS Terra mission will be launched in July 1999. This mission has great relevance to the atmospheric radiation community and global change issues. Terra instruments include ASTER, CERES, MISR, MODIS and MOPITT. In addition to the fundamental radiance data sets, numerous global science data products will be generated, including various Earth radiation budget, cloud and aerosol parameters, as well as land surface, terrestrial ecology, ocean color, and atmospheric chemistry parameters. Significant investments have been made in on-board calibration to ensure the quality of the radiance observations. A key component of the Terra mission is the validation of the science data products. This is essential for a mission focused on global change issues and the underlying processes. The Terra algorithms have been subject to extensive pre-launch testing with field data whenever possible. Intensive efforts will be made to validate the Terra data products after launch. These include vicarious calibration experiments to validate instrument calibration, instrument and cross-platform comparisons, routine collection of high quality correlative data from ground-based networks, such as AERONET, and intensive sites, such as the SGP ARM site, as well as a variety of field experiments, cruises, etc. Airborne simulator instruments have been developed for the field experiment and underflight activities, including the MODIS Airborne Simulator (MAS), AirMISR, MASTER (MODIS-ASTER), and MOPITT-A. All are integrated on the NASA ER-2, though low-altitude platforms are more typically used for MASTER. MATR is an additional sensor used for MOPITT algorithm development and validation. The intensive validation activities planned for the first year of the Terra mission will be described, with emphasis on derived geophysical parameters of most relevance to the atmospheric radiation community. Detailed information about the EOS Terra Validation Program can be found on the EOS Validation Program homepage (http://ospso.gsfc.nasa.gov/validation/valpage.html).

  17. From Fulcher to PLEVALEX: Issues in Interface Design, Validity and Reliability in Internet Based Language Testing

    ERIC Educational Resources Information Center

    Garcia Laborda, Jesus

    2007-01-01

    Interface design and ergonomics, while already studied in much of educational theory, have not until recently been considered in language testing (Fulcher, 2003). In this paper, we revise the design principles of PLEVALEX, a fully operational prototype Internet based language testing platform. Our focus here is to show PLEVALEX's interfaces and…

  18. Auto-Generated Semantic Processing Services

    NASA Technical Reports Server (NTRS)

    Davis, Rodney; Hupf, Greg

    2009-01-01

    Auto-Generated Semantic Processing (AGSP) Services is a suite of software tools for automated generation of other computer programs, denoted cross-platform semantic adapters, that support interoperability of computer-based communication systems that utilize a variety of both new and legacy communication software running in a variety of operating-system/computer-hardware combinations. AGSP has numerous potential uses in military, space-exploration, and other government applications as well as in commercial telecommunications. The cross-platform semantic adapters take advantage of common features of computer-based communication systems to enforce semantics, messaging protocols, and standards of processing of streams of binary data to ensure integrity of data and consistency of meaning among interoperating systems. The auto-generation aspect of AGSP Services reduces development time and effort by emphasizing specification and minimizing implementation: in effect, the design, building, and debugging of software for effecting conversions among complex communication protocols, custom device mappings, and unique data-manipulation algorithms are replaced with metadata specifications that map to an abstract platform-independent communications model. AGSP Services is modular and has been shown to be easily integrable into new and legacy NASA flight and ground communication systems.

  19. Independent Auditors Report on the Air Force General Fund FY 2015 and FY 2014 Basic Financial Statements for United States Air Force Agency Financial Report 2015

    DTIC Science & Technology

    2015-11-09

    and intelligence warfighting support. AFSPC operates sensors that provide direct attack warning and assessment to U.S. Strategic Command and North...combinations. AFRL conducted low-speed wind tunnel tests of 9%-scale model completed at NASA Langley Research Center (LaRC); data validated analytical...by $2M across JTAC platforms and expanding mobile device operation usage by 95 hours. The BATMAN-II team also delivered a new wireless mobile

  20. Raman spectroscopy-based screening of IgM positive and negative sera for dengue virus infection

    NASA Astrophysics Data System (ADS)

    Bilal, M.; Saleem, M.; Bilal, Maria; Ijaz, T.; Khan, Saranjam; Ullah, Rahat; Raza, A.; Khurram, M.; Akram, W.; Ahmed, M.

    2016-11-01

    A statistical method based on Raman spectroscopy for the screening of immunoglobulin M (IgM) in dengue virus (DENV) infected human sera is presented. In total, 108 serum samples were collected and their antibody indexes (AI) for IgM were determined through enzyme-linked immunosorbent assay (ELISA). Raman spectra of these samples were acquired using a 785 nm excitation laser. Seventy-eight Raman spectra were selected randomly for the development of a statistical model using partial least squares (PLS) regression, while the remaining 30 were used for testing the developed model. An R-squared (r²) value of 0.929 was determined using the leave-one-sample-out (LOO) cross-validation method, showing the validity of this model. The model considers all molecular changes related to IgM concentration and describes their role in infection. A graphical user interface (GUI) platform has been developed to run the multivariate model for the prediction of the AI of IgM for blindly tested samples, and excellent agreement has been found between model-predicted and clinically determined values. Parameters like sensitivity, specificity, accuracy, and area under the receiver operating characteristic (ROC) curve for these tested samples are also reported to assess model performance.
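
    A minimal sketch of the PLS-plus-leave-one-out workflow, assuming scikit-learn and synthetic spectra in place of the measured Raman data.

        # Hedged sketch: PLS regression from spectra to antibody index, LOO-validated.
        import numpy as np
        from sklearn.cross_decomposition import PLSRegression
        from sklearn.model_selection import LeaveOneOut, cross_val_predict

        rng = np.random.default_rng(9)
        spectra = rng.normal(size=(108, 600))       # 108 sera x 600 Raman channels
        w = rng.normal(size=600) / 25
        ai = spectra @ w + rng.normal(0, 0.3, 108)  # ELISA antibody index

        pls = PLSRegression(n_components=5)
        pred = cross_val_predict(pls, spectra, ai, cv=LeaveOneOut()).ravel()

        ss_res = np.sum((ai - pred) ** 2)
        ss_tot = np.sum((ai - ai.mean()) ** 2)
        print("LOO cross-validated R^2:", round(1 - ss_res / ss_tot, 3))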

  1. Smart Sensor Systems for Aerospace Applications: From Sensor Development to Application Testing

    NASA Technical Reports Server (NTRS)

    Hunter, G. W.; Xu, J. C.; Dungan, L. K.; Ward, B. J.; Rowe, S.; Williams, J.; Makel, D. B.; Liu, C. C.; Chang, C. W.

    2008-01-01

    The application of Smart Sensor Systems for aerospace applications is a multidisciplinary process consisting of sensor element development, element integration into Smart Sensor hardware, and testing of the resulting sensor systems in application environments. This paper provides a cross-section of these activities for multiple aerospace applications, illustrating the technology challenges involved. The development and application testing topics discussed are: 1) The broadening of sensitivity and operational range of silicon carbide (SiC) Schottky gas sensor elements; 2) Integration of fire detection sensor technology into a "Lick and Stick" Smart Sensor hardware platform for Crew Exploration Vehicle applications; 3) Extended testing of zirconia-based oxygen sensors in the basic "Lick and Stick" platform for environmental monitoring applications. It is concluded that both core sensor platform technology and a basic hardware platform can enhance the viability of implementing smart sensor systems in aerospace applications.

  2. Successful MPPF Pneumatics Verification and Validation Testing

    NASA Image and Video Library

    2017-03-28

    Engineers and technicians completed verification and validation testing of several pneumatic systems inside and outside the Multi-Payload Processing Facility (MPPF) at NASA's Kennedy Space Center in Florida. In view is the service platform for Orion spacecraft processing. The MPPF will be used for offline processing and fueling of the Orion spacecraft and service module stack before launch. Orion also will be de-serviced in the MPPF after a mission. The Ground Systems Development and Operations Program (GSDO) is overseeing upgrades to the facility. The Engineering Directorate led the recent pneumatic tests.

  3. Successful MPPF Pneumatics Verification and Validation Testing

    NASA Image and Video Library

    2017-03-28

    Engineers and technicians completed verification and validation testing of several pneumatic systems inside and outside the Multi-Payload Processing Facility (MPPF) at NASA's Kennedy Space Center in Florida. In view is the top level of the service platform for Orion spacecraft processing. The MPPF will be used for offline processing and fueling of the Orion spacecraft and service module stack before launch. Orion also will be de-serviced in the MPPF after a mission. The Ground Systems Development and Operations Program (GSDO) is overseeing upgrades to the facility. The Engineering Directorate led the recent pneumatic tests.

  4. UTM Technical Capabilities Level 2 (TLC2) Test at Reno-Stead Airport.

    NASA Image and Video Library

    2016-10-06

    Test of Unmanned Aircraft Systems Traffic Management (UTM) technical capability Level 2 (TCL2) at Reno-Stead Airport, Nevada. During the test, five drones simultaneously crossed paths, separated by altitude. Two drones flew beyond visual line-of-sight and three flew within line-of-sight of their operators. Engineers Priya Venkatesan and Joey Mercer review flight paths using the UAS traffic management research platform at flight operations mission control at NASA’s UTM TCL2 test.

  5. Pitfalls in Prediction Modeling for Normal Tissue Toxicity in Radiation Therapy: An Illustration With the Individual Radiation Sensitivity and Mammary Carcinoma Risk Factor Investigation Cohorts

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mbah, Chamberlain, E-mail: chamberlain.mbah@ugent.be; Department of Mathematical Modeling, Statistics, and Bioinformatics, Faculty of Bioscience Engineering, Ghent University, Ghent; Thierens, Hubert

    Purpose: To identify the main causes underlying the failure of prediction models for radiation therapy toxicity to replicate. Methods and Materials: Data were used from two German cohorts, Individual Radiation Sensitivity (ISE) (n=418) and Mammary Carcinoma Risk Factor Investigation (MARIE) (n=409), of breast cancer patients with similar characteristics and radiation therapy treatments. The toxicity endpoint chosen was telangiectasia. The LASSO (least absolute shrinkage and selection operator) logistic regression method was used to build a predictive model for a dichotomized endpoint (Radiation Therapy Oncology Group/European Organization for the Research and Treatment of Cancer score 0, 1, or ≥2). Internal areas under the receiver operating characteristic curve (inAUCs) were calculated by a naïve approach whereby the training data (ISE) were also used for calculating the AUC. Cross-validation was also applied to calculate the AUC within the same cohort, a second type of inAUC. Internal AUCs from cross-validation were calculated within ISE and MARIE separately. Models trained on one dataset (ISE) were applied to a test dataset (MARIE) and AUCs calculated (exAUCs). Results: Internal AUCs from the naïve approach were generally larger than inAUCs from cross-validation owing to overfitting the training data. Internal AUCs from cross-validation were also generally larger than the exAUCs, reflecting heterogeneity in the predictors between cohorts. The best models with largest inAUCs from cross-validation within both cohorts had a number of common predictors: hypertension, normalized total boost, and presence of estrogen receptors. Surprisingly, the effect (coefficient in the prediction model) of hypertension on telangiectasia incidence was positive in ISE and negative in MARIE. Other predictors were also not common between the 2 cohorts, illustrating that overcoming overfitting does not solve the problem of replication failure of prediction models completely. Conclusions: Overfitting and cohort heterogeneity are the 2 main causes of replication failure of prediction models across cohorts. Cross-validation and similar techniques (eg, bootstrapping) cope with overfitting, but the development of validated predictive models for radiation therapy toxicity requires strategies that deal with cohort heterogeneity.
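
    The three AUC flavours compared in this abstract are easy to conflate. The sketch below, a minimal scikit-learn illustration using random stand-in data (only the cohort sizes are taken from the abstract; the ISE and MARIE data are not public), contrasts the naive inAUC, the cross-validated inAUC, and the exAUC:

```python
# Hedged illustration of naive inAUC vs cross-validated inAUC vs exAUC,
# with random stand-in data in place of the ISE and MARIE cohorts.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)
X_ise, y_ise = rng.normal(size=(418, 20)), rng.integers(0, 2, 418)      # stand-in "ISE"
X_marie, y_marie = rng.normal(size=(409, 20)), rng.integers(0, 2, 409)  # stand-in "MARIE"

# L1-penalised ("LASSO") logistic regression
model = LogisticRegression(penalty="l1", solver="liblinear")

# Naive inAUC: trained and evaluated on the same data (optimistically biased)
naive_in_auc = roc_auc_score(y_ise, model.fit(X_ise, y_ise).predict_proba(X_ise)[:, 1])

# Cross-validated inAUC: out-of-fold predictions within the training cohort
oof = cross_val_predict(model, X_ise, y_ise, cv=10, method="predict_proba")[:, 1]
cv_in_auc = roc_auc_score(y_ise, oof)

# exAUC: model trained on one cohort, applied unchanged to the other
ex_auc = roc_auc_score(y_marie, model.fit(X_ise, y_ise).predict_proba(X_marie)[:, 1])

print(naive_in_auc, cv_in_auc, ex_auc)
```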

  6. Novel Robotic Platforms for the Accurate Sampling and Monitoring of Water Columns

    PubMed Central

    Fernández, Roemi; Apalkov, Andrey; Armada, Manuel

    2016-01-01

    The hydrosphere contains large amounts of suspended particulate material, including living and non-living material that can be found in different compositions and concentrations, and that can be composed of particles of different sizes. The study of this particulate material along water columns plays a key role in understanding a great variety of biological, chemical, and physical processes. This paper presents the conceptual design of two patented robotic platforms that have been conceived for carrying out studies of water properties at desired depths with very high accuracy in vertical positioning. One platform has been specially designed for operating near a reservoir bottom, while the other is intended to be used near the surface. Several experimental tests have been conducted in order to validate the proposed approaches. PMID:27589745

  7. Chronic Antibody-Mediated Rejection in Nonhuman Primate Renal Allografts: Validation of Human Histological and Molecular Phenotypes.

    PubMed

    Adam, B A; Smith, R N; Rosales, I A; Matsunami, M; Afzali, B; Oura, T; Cosimi, A B; Kawai, T; Colvin, R B; Mengel, M

    2017-11-01

    Molecular testing represents a promising adjunct for the diagnosis of antibody-mediated rejection (AMR). Here, we apply a novel gene expression platform in sequential formalin-fixed paraffin-embedded samples from nonhuman primate (NHP) renal transplants. We analyzed 34 previously described gene transcripts related to AMR in humans in 197 archival NHP samples, including 102 from recipients that developed chronic AMR, 80 from recipients without AMR, and 15 normal native nephrectomies. Three endothelial genes (VWF, DARC, and CAV1), derived from 10-fold cross-validation receiver operating characteristic curve analysis, demonstrated excellent discrimination between AMR and non-AMR samples (area under the curve = 0.92). This three-gene set correlated with classic features of AMR, including glomerulitis, capillaritis, glomerulopathy, C4d deposition, and DSAs (r = 0.39-0.63, p < 0.001). Principal component analysis confirmed the association between three-gene set expression and AMR and highlighted the ambiguity of v lesions and ptc lesions between AMR and T cell-mediated rejection (TCMR). Elevated three-gene set expression corresponded with the development of immunopathological evidence of rejection and often preceded it. Many recipients demonstrated mixed AMR and TCMR, suggesting that this represents the natural pattern of rejection. These data provide NHP animal model validation of recent updates to the Banff classification including the assessment of molecular markers for diagnosing AMR. © 2017 The American Society of Transplantation and the American Society of Transplant Surgeons.
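
    As a rough illustration of the 10-fold cross-validated ROC analysis described above, the sketch below scores a fixed three-gene expression set with out-of-fold predictions; the data are randomly generated stand-ins, not the study's 197 NHP samples:

```python
# Illustrative 10-fold cross-validated AUC for a three-gene classifier;
# expression values and labels are invented stand-ins.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(1)
expr = rng.normal(size=(182, 3))    # stand-in expression of VWF, DARC, CAV1
is_amr = rng.integers(0, 2, 182)    # 1 = chronic-AMR sample, 0 = non-AMR

oof = cross_val_predict(LogisticRegression(), expr, is_amr,
                        cv=10, method="predict_proba")[:, 1]
print("10-fold CV AUC:", roc_auc_score(is_amr, oof))
```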

  8. Photonic-crystal diplexers for terahertz-wave applications.

    PubMed

    Yata, Masahiro; Fujita, Masayuki; Nagatsuma, Tadao

    2016-04-04

    A compact diplexer is designed using a silicon photonic-crystal directional coupler of length comparable to the incident wavelength. The diplexer theoretically and experimentally exhibits a cross-state bandwidth as broad as 2% of the operation frequency, with over 40-dB isolation between the cross and bar ports. We also demonstrate 1.5-Gbit/s frequency-division communication in the 0.32- and 0.33-THz bands using a single-wavelength-sized diplexer, and discuss the transmission bandwidth. Our study demonstrates the potential for application of photonic crystals as terahertz-wave integration platforms.
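
    For a sense of scale, the quoted figures translate as follows (a back-of-envelope check using the 0.33-THz band; no values beyond those quoted above are assumed):

```python
# Back-of-envelope translation of the quoted diplexer figures.
f0 = 0.33e12                                 # operation frequency, Hz
bandwidth_hz = 0.02 * f0                     # cross-state bandwidth ~2% of f0
isolation_db = 40.0
leakage_ratio = 10 ** (-isolation_db / 10)   # 40 dB -> 1e-4 of the power leaks

print(f"bandwidth ~ {bandwidth_hz / 1e9:.1f} GHz")
print(f"bar-port leakage ~ {leakage_ratio:.0e} of cross-port power")
```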

  9. Reconfigurable, Intelligently-Adaptive, Communication System, an SDR Platform

    NASA Technical Reports Server (NTRS)

    Roche, Rigoberto J.; Shalkhauser, Mary Jo; Hickey, Joseph P.; Briones, Janette C.

    2016-01-01

    The Space Telecommunications Radio System (STRS) provides a common, consistent framework to abstract the application software from the radio platform hardware. STRS aims to reduce the cost and risk of using complex, configurable and reprogrammable radio systems across NASA missions. The NASA Glenn Research Center (GRC) team made a software defined radio (SDR) platform STRS compliant by adding an STRS operating environment and a field programmable gate array (FPGA) wrapper, capable of implementing each of the platform's interfaces, as well as a test waveform to exercise those interfaces. This effort serves to provide a framework toward waveform development onto an STRS compliant platform to support future space communication systems for advanced exploration missions. The use of validated STRS compliant applications provides tested code with extensive documentation to potentially reduce risk, cost and effort in development of space-deployable SDRs. This paper discusses the advantages of STRS, the integration of STRS onto a Reconfigurable, Intelligently-Adaptive, Communication System (RIACS) SDR platform, and the test waveform and wrapper development efforts. The paper emphasizes the infusion of the STRS Architecture onto the RIACS platform for potential use in next generation flight system SDRs for advanced exploration missions.

  10. Simulation-To-Flight (STF-1): A Mission to Enable CubeSat Software-Based Validation and Verification

    NASA Technical Reports Server (NTRS)

    Morris, Justin; Zemerick, Scott; Grubb, Matt; Lucas, John; Jaridi, Majid; Gross, Jason N.; Ohi, Nicholas; Christian, John A.; Vassiliadis, Dimitris; Kadiyala, Anand; hide

    2016-01-01

    The Simulation-to-Flight 1 (STF-1) CubeSat mission aims to demonstrate how legacy simulation technologies may be adapted for flexible and effective use on missions using the CubeSat platform. These technologies, named NASA Operational Simulator (NOS), have demonstrated significant value on several missions such as James Webb Space Telescope, Global Precipitation Measurement, Juno, and Deep Space Climate Observatory in the areas of software development, mission operations/training, verification and validation (V&V), test procedure development and software systems check-out. STF-1 will demonstrate a highly portable simulation and test platform that allows seamless transition of mission development artifacts to flight products. This environment will decrease development time of future CubeSat missions by lessening the dependency on hardware resources. In addition, through a partnership between NASA GSFC, the West Virginia Space Grant Consortium and West Virginia University, the STF-1 CubeSat will host payloads for three secondary objectives that aim to advance engineering and physical-science research in the areas of navigation systems of small satellites, provide useful data for understanding magnetosphere-ionosphere coupling and space weather, and verify the performance and durability of III-V Nitride-based materials.

  11. Towards a New Architecture for Autonomous Data Collection

    NASA Astrophysics Data System (ADS)

    Tanzi, T. J.; Roudier, Y.; Apvrille, L.

    2015-08-01

    A new generation of UAVs is coming that will help improve the situational awareness and assessment necessary to ensure quality data collection, especially in difficult conditions like natural disasters. Operators should be relieved from time-consuming data collection tasks as much as possible and at the same time, UAVs should assist data collection operations through a more insightful and automated guidance thanks to advanced sensing capabilities. In order to achieve this vision, two challenges must be addressed though. The first one is to achieve a sufficient autonomy, both in terms of navigation and of interpretation of the data sensed. The second one relates to the reliability of the UAV with respect to accidental (safety) or even malicious (security) risks. This however requires the design and development of new embedded architectures for drones to be more autonomous, while mitigating the harm they may potentially cause. We claim that the increased complexity and flexibility of such platforms requires resorting to modelling, simulation, or formal verification techniques in order to validate such critical aspects of the platform. This paper first discusses the potential and challenges faced by autonomous UAVs for data acquisition. The design of a flexible and adaptable embedded UAV architecture is then addressed. Finally, the need for validating the properties of the platform is discussed. Our approach is sketched and illustrated with the example of a lightweight drone performing 3D reconstructions out of the combination of 2D image acquisition and a specific motion control.

  12. Small unmanned aircraft system for remote contour mapping of a nuclear radiation field

    NASA Astrophysics Data System (ADS)

    Guss, Paul; McCall, Karen; Malchow, Russell; Fischer, Rick; Lukens, Michael; Adan, Mark; Park, Ki; Abbott, Roy; Howard, Michael; Wagner, Eric; Trainham, Clifford P.; Luke, Tanushree; Mukhopadhyay, Sanjoy; Oh, Paul; Brahmbhatt, Pareshkumar; Henderson, Eric; Han, Jinlu; Huang, Justin; Huang, Casey; Daniels, Jon

    2017-09-01

    For nuclear disasters involving radioactive contamination, small unmanned aircraft systems (sUASs) equipped with nuclear radiation detection and monitoring capability can be very important tools. Among the advantages of a sUAS are quick deployment, low-altitude flying that enhances sensitivity, wide area coverage, no radiation exposure health safety restriction, and the ability to access highly hazardous or radioactive areas. Additionally, the sUAS can be configured with the nuclear detecting sensor optimized to measure the radiation associated with the event. In this investigation, sUAS platforms were obtained for the installation of sensor payloads for radiation detection and electro-optical systems that were specifically developed for sUAS research, development, and operational testing. The sensor payloads were optimized for the contour mapping of a nuclear radiation field, which will result in a formula for low-cost sUAS platform operations with built-in formation flight control. Additional emphases of the investigation were to develop the relevant contouring algorithms; initiate the sUAS comprehensive testing using the Unmanned Systems, Inc. (USI) Sandstorm platforms and other acquired platforms; and both acquire and optimize the sensors for detection and localization. We demonstrated contour mapping through simulation and validated waypoint detection. We mounted a detector on a sUAS and operated it initially in the counts per second (cps) mode to perform field and flight tests to demonstrate that the equipment was functioning as designed. We performed ground truth measurements to determine the response of the detector as a function of source-to-detector distance. Operation of the radiation detector was tested using different unshielded sources.
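
    The ground-truth step described above lends itself to a simple model: count rate falling off as the inverse square of source-to-detector distance plus a constant background. A minimal sketch with invented numbers (not the investigation's data):

```python
# Fit detector count rate vs source-to-detector distance to A/r^2 + background.
# All data values below are invented for illustration.
import numpy as np
from scipy.optimize import curve_fit

def response(r, amplitude, background):
    return amplitude / r**2 + background   # counts per second

r_m = np.array([1.0, 2.0, 3.0, 5.0, 8.0])             # distance, metres
cps = np.array([4100.0, 1060.0, 480.0, 185.0, 92.0])  # measured count rate

(amplitude, background), _ = curve_fit(response, r_m, cps, p0=(4000.0, 20.0))
print(f"A = {amplitude:.0f} cps*m^2, background = {background:.0f} cps")
```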

  13. A three-dimensional parabolic equation model of sound propagation using higher-order operator splitting and Padé approximants.

    PubMed

    Lin, Ying-Tsong; Collis, Jon M; Duda, Timothy F

    2012-11-01

    An alternating direction implicit (ADI) three-dimensional fluid parabolic equation solution method with enhanced accuracy is presented. The method uses a square-root Helmholtz operator splitting algorithm that retains cross-multiplied operator terms that have been previously neglected. With these higher-order cross terms, the valid angular range of the parabolic equation solution is improved. The method is tested for accuracy against an image solution in an idealized wedge problem. Computational efficiency improvements resulting from the ADI discretization are also discussed.
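
    The role of the Padé approximants can be seen in the scalar analogue of the square-root Helmholtz operator: the [1/1] Padé form of sqrt(1+x) stays accurate far beyond the narrow-angle Taylor expansion, which is what widens the valid angular range. A small numerical check (an illustration, not the paper's solver):

```python
# Compare the first-order Taylor and [1/1] Pade approximations of sqrt(1+x),
# the scalar stand-in for the square-root Helmholtz operator.
import numpy as np

x = np.linspace(-0.5, 2.0, 6)
exact = np.sqrt(1 + x)
taylor = 1 + x / 2                          # narrow-angle approximation
pade11 = (1 + 0.75 * x) / (1 + 0.25 * x)    # [1/1] Pade approximant

for xi, e, t, p in zip(x, exact, taylor, pade11):
    print(f"x={xi:+.2f}  Taylor err={abs(t - e):.4f}  Pade err={abs(p - e):.4f}")
```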

  14. Holistic approach to design and implementation of a medical teleconsultation workspace.

    PubMed

    Czekierda, Łukasz; Malawski, Filip; Wyszkowski, Przemysław

    2015-10-01

    While there are many state-of-the-art approaches to introducing telemedical services in the area of medical imaging, it is hard to point to studies which would address all relevant aspects in a complete and comprehensive manner. In this paper we describe our approach to design and implementation of a universal platform for imaging medicine which is based on our longstanding experience in this area. We claim it is holistic, because, contrary to most of the available studies, it addresses all aspects related to creation and utilization of a medical teleconsultation workspace. We present an extensive analysis of requirements, including possible usage scenarios, user needs, organizational and security issues and infrastructure components. We enumerate and analyze multiple usage scenarios related to medical imaging data in treatment, research and educational applications - with typical teleconsultations treated as just one of many possible options. Certain phases common to all these scenarios have been identified, with the resulting classification distinguishing several modes of operation (local vs. remote, collaborative vs. non-interactive etc.). On this basis we propose a system architecture which addresses all of the identified requirements, applying two key concepts: Service Oriented Architecture (SOA) and Virtual Organizations (VO). The SOA paradigm allows us to decompose the functionality of the system into several distinct building blocks, ensuring flexibility and reliability. The VO paradigm defines the cooperation model for all participating healthcare institutions. Our approach is validated by an ICT platform called TeleDICOM II which implements the proposed architecture. All of its main elements are described in detail and cross-checked against the listed requirements. A case study presents the role and usage of the platform in a specific scenario. Finally, our platform is compared with similar systems described in studies to date and available on the market. Copyright © 2015. Published by Elsevier Inc.

  15. A standalone perfusion platform for drug testing and target validation in micro-vessel networks

    PubMed Central

    Zhang, Boyang; Peticone, Carlotta; Murthy, Shashi K.; Radisic, Milica

    2013-01-01

    Studying the effects of pharmacological agents on human endothelium includes the routine use of cell monolayers cultivated in multi-well plates. This configuration fails to recapitulate the complex architecture of vascular networks in vivo and does not capture the relationship between shear stress (i.e. flow) experienced by the cells and dose of the applied pharmacological agents. Microfluidic platforms have been applied extensively to create vascular systems in vitro; however, they rely on bulky external hardware to operate, which hinders the wide application of microfluidic chips by non-microfluidic experts. Here, we have developed a standalone perfusion platform where multiple devices were perfused at a time with a single miniaturized peristaltic pump. Using the platform, multiple micro-vessel networks, that contained three levels of branching structures, were created by culturing endothelial cells within circular micro-channel networks mimicking the geometrical configuration of natural blood vessels. To demonstrate the feasibility of our platform for drug testing and validation assays, a drug induced nitric oxide assay was performed on the engineered micro-vessel network using a panel of vaso-active drugs (acetylcholine, phenylephrine, atorvastatin, and sildenafil), showing both flow and drug dose dependent responses. The interactive effects between flow and drug dose for sildenafil could not be captured by a simple straight rectangular channel coated with endothelial cells, but it was captured in a more physiological branching circular network. A monocyte adhesion assay was also demonstrated with and without stimulation by an inflammatory cytokine, tumor necrosis factor-α. PMID:24404058

  16. GIFT-Cloud: A data sharing and collaboration platform for medical imaging research.

    PubMed

    Doel, Tom; Shakir, Dzhoshkun I; Pratt, Rosalind; Aertsen, Michael; Moggridge, James; Bellon, Erwin; David, Anna L; Deprest, Jan; Vercauteren, Tom; Ourselin, Sébastien

    2017-02-01

    Clinical imaging data are essential for developing research software for computer-aided diagnosis, treatment planning and image-guided surgery, yet existing systems are poorly suited for data sharing between healthcare and academia: research systems rarely provide an integrated approach for data exchange with clinicians; hospital systems are focused towards clinical patient care with limited access for external researchers; and safe haven environments are not well suited to algorithm development. We have established GIFT-Cloud, a data and medical image sharing platform, to meet the needs of GIFT-Surg, an international research collaboration that is developing novel imaging methods for fetal surgery. GIFT-Cloud also has general applicability to other areas of imaging research. GIFT-Cloud builds upon well-established cross-platform technologies. The Server provides secure anonymised data storage, direct web-based data access and a REST API for integrating external software. The Uploader provides automated on-site anonymisation, encryption and data upload. Gateways provide a seamless process for uploading medical data from clinical systems to the research server. GIFT-Cloud has been implemented in a multi-centre study for fetal medicine research. We present a case study of placental segmentation for pre-operative surgical planning, showing how GIFT-Cloud underpins the research and integrates with the clinical workflow. GIFT-Cloud simplifies the transfer of imaging data from clinical to research institutions, facilitating the development and validation of medical research software and the sharing of results back to the clinical partners. GIFT-Cloud supports collaboration between multiple healthcare and research institutions while satisfying the demands of patient confidentiality, data security and data ownership. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
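
    The abstract mentions a REST API for integrating external software; below is a hedged sketch of what a client-side upload might look like. The server URL, endpoint path, field names, and token header are all invented for illustration and are not the documented GIFT-Cloud interface.

```python
# Hypothetical client for a GIFT-Cloud-style REST upload endpoint.
# Endpoint, fields, and auth scheme are assumptions, not the real API.
import requests

SERVER = "https://giftcloud.example.org"  # placeholder server URL

with open("anonymised_scan.dcm", "rb") as f:
    resp = requests.post(
        f"{SERVER}/api/projects/fetal-mri/upload",    # invented endpoint
        files={"file": ("anonymised_scan.dcm", f)},
        headers={"Authorization": "Bearer <token>"},  # placeholder credential
        timeout=30,
    )
resp.raise_for_status()
print("server response:", resp.json())
```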

  17. Bridging EO Research, Operations and Collaborative Learning

    NASA Astrophysics Data System (ADS)

    Scarth, Peter

    2016-04-01

    Building flexible and responsive processing and delivery systems is key to getting EO information used by researchers, policy agents and the public. There are typically three distinct processes we tackle to get product uptake: undertake research, operationalise the validated research, and deliver information and garner feedback in an appropriate way. In many cases, however, the gaps between these process elements are large and lead to poor outcomes. Good research may be "lost" and not adopted, there may be resistance by government or NGOs to the uptake of significantly better operational products based on EO data, and lack of accessibility means that interactive science outputs go unused for improving cross-disciplinary science or starting a dialogue with citizens. So one of the most important tasks, if we wish to have broad uptake of EO information and accelerate further research, is to link these processes together in a formal but flexible way. One of the ways to operationalise research output is by building a platform that can take research code and scale it across much larger areas. In remote sensing, this is typically a system that has access to current and historical corrected imagery with a processing pipeline built over the top. To reduce the demand on high-level scientific programmers and allow cross-disciplinary researchers to hack, play and refine, this pipeline needs to be easy to use, collaborative, and linked to existing tools to encourage code experimentation and reuse. It is also critical to have efficient, tight integration with information delivery and extension components so that the science relevant to your user is available quickly and efficiently. The rapid expansion of open data licensing has helped this process, but building top-down web portals and tools without flexibility and regard for end-user needs has limited the use of EO information in many areas. This research reports on the operationalisation of a scale-independent time-series query API that allows interrogation of the entire currently processed Australian Landsat archive in web time. The system containerises data interrogation and time-series tasks to allow easy scaling and expansion and is currently in operational use by several land management portals across the country to deliver EO land information products to government agents, NGOs and individual farmers. Plans to ingest and process the Sentinel 2 archive are well underway, and the logistics of scaling this globally using an open source project based on the Earth Engine Platform will be discussed.
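
    To make the "time-series query API" concrete, here is a hedged sketch of a client call; the URL, parameter names, and response schema are hypothetical stand-ins, not the actual service interface described in the abstract.

```python
# Hypothetical client for a scale-independent time-series query API.
# URL, parameters, and response schema are invented for illustration.
import requests

resp = requests.get(
    "https://example.org/api/timeseries",   # placeholder service URL
    params={
        "lat": -27.5, "lon": 151.9,         # query point
        "product": "fractional_cover",      # requested product
        "start": "1987-01-01", "end": "2015-12-31",
    },
    timeout=60,
)
resp.raise_for_status()
for obs in resp.json()["observations"]:     # assumed response schema
    print(obs["date"], obs["value"])
```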

  18. Portability and Cross-Platform Performance of an MPI-Based Parallel Polygon Renderer

    NASA Technical Reports Server (NTRS)

    Crockett, Thomas W.

    1999-01-01

    Visualizing the results of computations performed on large-scale parallel computers is a challenging problem, due to the size of the datasets involved. One approach is to perform the visualization and graphics operations in place, exploiting the available parallelism to obtain the necessary rendering performance. Over the past several years, we have been developing algorithms and software to support visualization applications on NASA's parallel supercomputers. Our results have been incorporated into a parallel polygon rendering system called PGL. PGL was initially developed on tightly-coupled distributed-memory message-passing systems, including Intel's iPSC/860 and Paragon, and IBM's SP2. Over the past year, we have ported it to a variety of additional platforms, including the HP Exemplar, SGI Origin2000, Cray T3E, and clusters of Sun workstations. In implementing PGL, we have had two primary goals: cross-platform portability and high performance. Portability is important because (1) our manpower resources are limited, making it difficult to develop and maintain multiple versions of the code, and (2) NASA's complement of parallel computing platforms is diverse and subject to frequent change. Performance is important in delivering adequate rendering rates for complex scenes and ensuring that parallel computing resources are used effectively. Unfortunately, these two goals are often at odds. In this paper we report on our experiences with portability and performance of the PGL polygon renderer across a range of parallel computing platforms.

  19. Real-time modeling and simulation of distribution feeder and distributed resources

    NASA Astrophysics Data System (ADS)

    Singh, Pawan

    The analysis of the electrical system dates back to the days when analog network analyzers were used. With the advent of digital computers, many programs were written for power-flow and short-circuit analysis to improve the electrical system. Real-time computer simulations can answer many what-if scenarios in an existing or proposed power system. In this thesis, the standard IEEE 13-node distribution feeder is developed and validated on the real-time platform OPAL-RT. The concept and the challenges of real-time simulation are studied and addressed. Distributed energy resources, including commonly used distributed generation and storage devices such as a diesel engine, a solar photovoltaic array, and a battery storage system, are modeled and simulated on the real-time platform. A microgrid encompasses a portion of an electric power distribution system located downstream of the distribution substation. Normally, the microgrid operates in parallel with the grid; however, scheduled or forced isolation can take place. In such conditions, the microgrid must have the ability to operate stably and autonomously. The microgrid can operate in grid-connected and islanded modes; both operating modes are studied in the last chapter. Towards the end, a simple microgrid controller for energy management and protection of the microgrid is modeled and simulated on the real-time platform.

  20. Multi-photon vertical cross-sectional imaging with a dynamically-balanced thin-film PZT z-axis microactuator.

    PubMed

    Choi, Jongsoo; Duan, Xiyu; Li, Haijun; Wang, Thomas D; Oldham, Kenn R

    2017-10-01

    Use of a thin-film piezoelectric microactuator for axial scanning during multi-photon vertical cross-sectional imaging is described. The actuator uses thin-film lead-zirconate-titanate (PZT) to generate upward displacement of a central mirror platform, micro-machined from a silicon-on-insulator (SOI) wafer to dimensions compatible with endoscopic imaging instruments. Device modeling in this paper focuses on the existence of frequencies near device resonance producing vertical motion with minimal off-axis tilt, even in the presence of multiple vibration modes and non-uniformity in fabrication outcomes. Operation near resonance permits large stroke lengths at low voltages relative to other vertical microactuators. Highly uniform vertical motion of the mirror platform is a key requirement for vertical cross-sectional imaging in the remote-scan architecture being used for multi-photon instrument prototyping. The stage is installed in a benchtop testbed in combination with an electrostatic mirror that performs in-plane scanning. Vertical sectional images are acquired from 15 μm diameter beads and excised mouse colon tissue.

  1. Developing a New Wireless Sensor Network Platform and Its Application in Precision Agriculture

    PubMed Central

    Aquino-Santos, Raúl; González-Potes, Apolinar; Edwards-Block, Arthur; Virgen-Ortiz, Raúl Alejandro

    2011-01-01

    Wireless sensor networks are gaining greater attention from the research community and industrial professionals because these small pieces of “smart dust” offer great advantages due to their small size, low power consumption, easy integration and support for “green” applications. Green applications are considered a hot topic in intelligent environments, ubiquitous and pervasive computing. This work evaluates a new wireless sensor network platform and its application in precision agriculture, including its embedded operating system and its routing algorithm. To validate the technological platform and the embedded operating system, two different routing strategies were compared: hierarchical and flat. Both of these routing algorithms were tested in a small-scale network applied to a watermelon field. However, we strongly believe that this technological platform can be also applied to precision agriculture because it incorporates a modified version of LORA-CBF, a wireless location-based routing algorithm that uses cluster-based flooding. Cluster-based flooding addresses the scalability concerns of wireless sensor networks, while the modified LORA-CBF routing algorithm includes a metric to monitor residual battery energy. Furthermore, results show that the modified version of LORA-CBF functions well with both the flat and hierarchical algorithms, although it functions better with the flat algorithm in a small-scale agricultural network. PMID:22346622

  2. Developing a new wireless sensor network platform and its application in precision agriculture.

    PubMed

    Aquino-Santos, Raúl; González-Potes, Apolinar; Edwards-Block, Arthur; Virgen-Ortiz, Raúl Alejandro

    2011-01-01

    Wireless sensor networks are gaining greater attention from the research community and industrial professionals because these small pieces of "smart dust" offer great advantages due to their small size, low power consumption, easy integration and support for "green" applications. Green applications are considered a hot topic in intelligent environments, ubiquitous and pervasive computing. This work evaluates a new wireless sensor network platform and its application in precision agriculture, including its embedded operating system and its routing algorithm. To validate the technological platform and the embedded operating system, two different routing strategies were compared: hierarchical and flat. Both of these routing algorithms were tested in a small-scale network applied to a watermelon field. However, we strongly believe that this technological platform can be also applied to precision agriculture because it incorporates a modified version of LORA-CBF, a wireless location-based routing algorithm that uses cluster-based flooding. Cluster-based flooding addresses the scalability concerns of wireless sensor networks, while the modified LORA-CBF routing algorithm includes a metric to monitor residual battery energy. Furthermore, results show that the modified version of LORA-CBF functions well with both the flat and hierarchical algorithms, although it functions better with the flat algorithm in a small-scale agricultural network.
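
    Both records above describe a modified LORA-CBF metric that folds residual battery energy into the next-hop choice. The toy sketch below shows the idea; the weighting, node fields, and values are invented for illustration and are not the authors' implementation.

```python
# Toy next-hop selection mixing geographic progress with residual energy.
import math

def next_hop(neighbours, dest):
    """Pick the neighbour minimising a cost that penalises both distance
    to the destination and low residual battery energy (weights invented)."""
    def cost(n):
        dist = math.dist((n["x"], n["y"]), dest)       # geographic component
        return 0.7 * dist - 0.3 * 100.0 * n["energy"]  # energy as 0..1 fraction
    return min(neighbours, key=cost)

neighbours = [
    {"id": "n1", "x": 10.0, "y": 4.0, "energy": 0.9},
    {"id": "n2", "x": 12.0, "y": 5.0, "energy": 0.2},
]
# Prefers the farther but well-charged node n1 over the nearly drained n2.
print(next_hop(neighbours, dest=(40.0, 20.0))["id"])
```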

  3. A Vehicle Management End-to-End Testing and Analysis Platform for Validation of Mission and Fault Management Algorithms to Reduce Risk for NASA's Space Launch System

    NASA Technical Reports Server (NTRS)

    Trevino, Luis; Patterson, Jonathan; Teare, David; Johnson, Stephen

    2015-01-01

    The engineering development of the new Space Launch System (SLS) launch vehicle requires cross discipline teams with extensive knowledge of launch vehicle subsystems, information theory, and autonomous algorithms dealing with all operations from pre-launch through on orbit operations. The characteristics of these spacecraft systems must be matched with the autonomous algorithm monitoring and mitigation capabilities for accurate control and response to abnormal conditions throughout all vehicle mission flight phases, including precipitating safing actions and crew aborts. This presents a large and complex system engineering challenge, which is being addressed in part by focusing on the specific subsystems involved in the handling of off-nominal mission and fault tolerance with response management. Using traditional model based system and software engineering design principles from the Unified Modeling Language (UML) and Systems Modeling Language (SysML), the Mission and Fault Management (M&FM) algorithms for the vehicle are crafted and vetted in specialized Integrated Development Teams (IDTs) composed of multiple development disciplines such as Systems Engineering (SE), Flight Software (FSW), Safety and Mission Assurance (S&MA) and the major subsystems and vehicle elements such as Main Propulsion Systems (MPS), boosters, avionics, Guidance, Navigation, and Control (GNC), Thrust Vector Control (TVC), and liquid engines. These model based algorithms and their development lifecycle from inception through Flight Software certification are an important focus of this development effort to further ensure reliable detection and response to off-nominal vehicle states during all phases of vehicle operation from pre-launch through end of flight. NASA formed a dedicated M&FM team for addressing fault management early in the development lifecycle for the SLS initiative. As part of the development of the M&FM capabilities, this team has developed a dedicated testbed that integrates specific M&FM algorithms, specialized nominal and off-nominal test cases, and vendor-supplied physics-based launch vehicle subsystem models. Additionally, the team has developed processes for implementing and validating these algorithms for concept validation and risk reduction for the SLS program. The flexibility of the Vehicle Management End-to-end Testbed (VMET) enables thorough testing of the M&FM algorithms by providing configurable suites of both nominal and off-nominal test cases to validate the developed algorithms utilizing actual subsystem models such as MPS. The intent of VMET is to validate the M&FM algorithms and substantiate them with performance baselines for each of the target vehicle subsystems in an independent platform exterior to the flight software development infrastructure and its related testing entities. In any software development process there is inherent risk in the interpretation and implementation of concepts into software through requirements and test cases into flight software compounded with potential human errors throughout the development lifecycle. Risk reduction is addressed by the M&FM analysis group working with other organizations such as S&MA, Structures and Environments, GNC, Orion, the Crew Office, Flight Operations, and Ground Operations by assessing performance of the M&FM algorithms in terms of their ability to reduce Loss of Mission and Loss of Crew probabilities.
In addition, through state machine and diagnostic modeling, analysis efforts investigate a broader suite of failure effects and associated detection and responses that can be tested in VMET to ensure that failures can be detected, and confirm that responses do not create additional risks or cause undesired states through interactive dynamic effects with other algorithms and systems. VMET further contributes to risk reduction by prototyping and exercising the M&FM algorithms early in their implementation and without any inherent hindrances such as meeting FSW processor scheduling constraints due to their target platform - ARINC 653 partitioned OS, resource limitations, and other factors related to integration with other subsystems not directly involved with M&FM such as telemetry packing and processing. The baseline plan for use of VMET encompasses testing the original M&FM algorithms coded in the same C++ language and state machine architectural concepts as that used by Flight Software. This enables the development of performance standards and test cases to characterize the M&FM algorithms and sets a benchmark from which to measure the effectiveness of M&FM algorithms performance in the FSW development and test processes.
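
    The state-machine flavour of the M&FM algorithms exercised in VMET can be suggested with a toy example; the states, redline trigger, and confirmation step below are invented placeholders, not SLS flight logic.

```python
# Toy fault-management state machine: detect a redline violation, then
# either confirm (safe the system) or clear. Placeholder logic only.
NOMINAL, FAULT_DETECTED, SAFING = "NOMINAL", "FAULT_DETECTED", "SAFING"

def step(state, reading):
    """Advance the state machine by one monitoring cycle."""
    if state == NOMINAL and reading["pressure"] > reading["redline"]:
        return FAULT_DETECTED      # detection: redline exceeded
    if state == FAULT_DETECTED:
        return SAFING if reading["confirmed"] else NOMINAL
    return state                   # SAFING is terminal in this toy

state = NOMINAL
for reading in [{"pressure": 180, "redline": 200, "confirmed": False},
                {"pressure": 215, "redline": 200, "confirmed": False},
                {"pressure": 218, "redline": 200, "confirmed": True}]:
    state = step(state, reading)
    print(state)   # NOMINAL -> FAULT_DETECTED -> SAFING
```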

  4. Cooperative Multi-Agent Mobile Sensor Platforms for Jet Engine Inspection: Concept and Implementation

    NASA Technical Reports Server (NTRS)

    Litt, Jonathan S.; Wong, Edmond; Krasowski, Michael J.; Greer, Lawrence C.

    2003-01-01

    Cooperative behavior algorithms utilizing swarm intelligence are being developed for mobile sensor platforms to inspect jet engines on-wing. Experiments are planned in which several relatively simple autonomous platforms will work together in a coordinated fashion to carry out complex maintenance-type tasks within the constrained working environment modeled on the interior of a turbofan engine. The algorithms will emphasize distribution of the tasks among multiple units; they will be scalable and flexible so that units may be added in the future; and will be designed to operate on an individual unit level to produce the desired global effect. This proof of concept demonstration will validate the algorithms and provide justification for further miniaturization and specialization of the hardware toward the true application of on-wing in situ turbine engine maintenance.

  5. Innovative Extension Models and Smallholders: How ICT platforms can Deliver Timely Information to Farmers in India.

    NASA Astrophysics Data System (ADS)

    Nagothu, U. S.

    2016-12-01

    Agricultural extension services, among others, contribute to improving rural livelihoods and enhancing economic development. Knowledge development and transfer, from the cognitive science point of view, is about how farmers use and apply their experiential knowledge, as well as newly acquired knowledge, to solve new problems. This depends on the models adopted and the way knowledge is generated and delivered. New extension models based on ICT platforms and smart phones are promising. Results from a 5-year project (www.climaadapt.org) in India show that farmer-led on-farm validations of technologies and knowledge exchange through ICT-based platforms outperformed state-operated linear extension programs. Innovation here depends on the connectivity and networking between the stakeholders involved in generating, transferring and using the knowledge. Key words: Smallholders, Knowledge, Extension, Innovation, India

  6. Benchmarking gate-based quantum computers

    NASA Astrophysics Data System (ADS)

    Michielsen, Kristel; Nocon, Madita; Willsch, Dennis; Jin, Fengping; Lippert, Thomas; De Raedt, Hans

    2017-11-01

    With the advent of public access to small gate-based quantum processors, it becomes necessary to develop a benchmarking methodology such that independent researchers can validate the operation of these processors. We explore the usefulness of a number of simple quantum circuits as benchmarks for gate-based quantum computing devices and show that circuits performing identity operations are very simple, scalable and sensitive to gate errors and are therefore very well suited for this task. We illustrate the procedure by presenting benchmark results for the IBM Quantum Experience, a cloud-based platform for gate-based quantum computing.
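
    The identity-circuit idea is easy to sketch. The example below (assuming the qiskit and qiskit-aer packages) runs repeated gate pairs that compose to the identity on a noiseless simulator, as a stand-in for a real device; on hardware, the fraction of shots deviating from '00' would grow with circuit depth and expose gate errors.

```python
# Identity-circuit benchmark sketch: every gate pair composes to the identity,
# so an ideal device returns '00' on every shot.
from qiskit import QuantumCircuit
from qiskit_aer import AerSimulator

qc = QuantumCircuit(2, 2)
for _ in range(16):        # scale the depth to probe accumulated gate error
    qc.x(0)
    qc.x(0)                # X . X = I
    qc.cx(0, 1)
    qc.cx(0, 1)            # CNOT . CNOT = I
qc.measure([0, 1], [0, 1])

counts = AerSimulator().run(qc, shots=4096).result().get_counts()
print("fraction of ideal '00' outcomes:", counts.get("00", 0) / 4096)
```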

  7. Cross-platform learning: on the nature of children's learning from multiple media platforms.

    PubMed

    Fisch, Shalom M

    2013-01-01

    It is increasingly common for an educational media project to span several media platforms (e.g., TV, Web, hands-on materials), assuming that the benefits of learning from multiple media extend beyond those gained from one medium alone. Yet research typically has investigated learning from a single medium in isolation. This paper reviews several recent studies to explore cross-platform learning (i.e., learning from combined use of multiple media platforms) and how such learning compares to learning from one medium. The paper discusses unique benefits of cross-platform learning, a theoretical mechanism to explain how these benefits might arise, and questions for future research in this emerging field. Copyright © 2013 Wiley Periodicals, Inc., A Wiley Company.

  8. STORMVEX: The Storm Peak Lab Cloud Property Validation Experiment Science and Operations Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mace, J; Matrosov, S; Shupe, M

    2010-09-29

    During the Storm Peak Lab Cloud Property Validation Experiment (STORMVEX), a substantial correlative data set of remote sensing observations and direct in situ measurements from fixed and airborne platforms will be created in a winter season, mountainous environment. This will be accomplished by combining mountaintop observations at Storm Peak Laboratory and the airborne National Science Foundation-supported Colorado Airborne Multi-Phase Cloud Study campaign with collocated measurements from the second ARM Mobile Facility (AMF2). We describe in this document the operational plans and motivating science for this experiment, which includes deployment of AMF2 to Steamboat Springs, Colorado. The intensive STORMVEX field phase will begin nominally on 1 November 2010 and extend to approximately early April 2011.

  9. A Serological Point-of-Care Test for the Detection of IgG Antibodies against Ebola Virus in Human Survivors.

    PubMed

    Brangel, Polina; Sobarzo, Ariel; Parolo, Claudio; Miller, Benjamin S; Howes, Philip D; Gelkop, Sigal; Lutwama, Julius J; Dye, John M; McKendry, Rachel A; Lobel, Leslie; Stevens, Molly M

    2018-01-23

    Ebola virus disease causes widespread and highly fatal epidemics in human populations. Today, there is still great need for point-of-care tests for diagnosis, patient management and surveillance, both during and post outbreaks. We present a point-of-care test comprising an immunochromatographic strip and a smartphone reader, which detects and semiquantifies Ebola-specific antibodies in human survivors. We developed a Sudan virus glycoprotein monoplex platform and validated it using sera from 90 human survivors and 31 local noninfected controls. The performance of the glycoprotein monoplex was 100% sensitivity and 98% specificity compared to standard whole antigen enzyme-linked immunosorbent assay (ELISA), and it was validated with freshly collected patient samples in Uganda. Moreover, we constructed a multiplex test for simultaneous detection of antibodies against three recombinant Sudan virus proteins. A pilot study comprising 15 survivors and 5 noninfected controls demonstrated sensitivity and specificity of 100% compared to standard ELISA. Finally, we developed a second multiplex subtype assay for the identification of exposure to three related EVD species: Sudan virus, Bundibugyo virus and Ebola virus (formerly Zaire) using recombinant viral glycoprotein. This multiplex test could distinguish between the host's immunity to specific viral species and identify cross-reactive immunity. These developed serological platforms consisted of capture ligands with high specificity and sensitivity, in-house developed strips and a compatible smartphone application. These platforms enabled rapid and portable testing, data storage and sharing as well as geographical tagging of the tested individuals in Uganda. This platform holds great potential as a field tool for diagnosis, vaccine development, and therapeutic evaluation.

  10. Successful MPPF Pneumatics Verification and Validation Testing

    NASA Image and Video Library

    2017-03-28

    Engineers and technicians completed verification and validation testing of several pneumatic systems inside and outside the Multi-Payload Processing Facility (MPPF) at NASA's Kennedy Space Center in Florida. In view is the service platform for Orion spacecraft processing. To the left are several pneumatic panels. The MPPF will be used for offline processing and fueling of the Orion spacecraft and service module stack before launch. Orion also will be de-serviced in the MPPF after a mission. The Ground Systems Development and Operations Program (GSDO) is overseeing upgrades to the facility. The Engineering Directorate led the recent pneumatic tests.

  11. A Generic Approach for Inversion of Surface Reflectance over Land: Overview, Application and Validation Using MODIS and LANDSAT8 Data

    NASA Technical Reports Server (NTRS)

    Vermote, E.; Roger, J. C.; Justice, C. O.; Franch, B.; Claverie, M.

    2016-01-01

    This paper presents a generic approach developed to derive surface reflectance over land from a variety of sensors. This technique builds on the extensive dataset acquired by the Terra platform by combining MODIS and MISR to derive an explicit and dynamic map of band ratios between blue and red channels, and is a refinement of the operational approach used for MODIS and LANDSAT over the past 15 years. We will present the generic approach and the application to MODIS and LANDSAT data and its validation using the AERONET data.

  12. Learning by Doing: How to Develop a Cross-Platform Web App

    ERIC Educational Resources Information Center

    Huynh, Minh; Ghimire, Prashant

    2015-01-01

    As mobile devices become prevalent, there is always a need for apps. How hard is it to develop an app, especially a cross-platform app? The paper shares an experience in a project that involved the development of a student services web app that can be run on cross-platform mobile devices. The paper first describes the background of the project,…

  13. Using Kokkos for Performant Cross-Platform Acceleration of Liquid Rocket Simulations

    DTIC Science & Technology

    2017-05-08

    Briefing charts, 05 April 2017 - 08 May 2017, ERC Incorporated / AFRL-West (RQRC). The charts cover the use of Kokkos for performant cross-platform acceleration of liquid rocket combustion simulations, including a SPACE simulation of a rotating detonation engine (courtesy of Dr. Christopher Lietz). DISTRIBUTION A: Approved for public release.

  14. ScaMo: Realisation of an OO-functional DSL for cross platform mobile applications development

    NASA Astrophysics Data System (ADS)

    Macos, Dragan; Solymosi, Andreas

    2013-10-01

    The software market is changing dynamically: the Internet is going mobile, and software applications are shifting from desktop hardware onto mobile devices. The largest markets are mobile applications for iOS, Android and Windows Phone, for which the typical programming languages are Objective-C, Java and C#. The realization of native applications implies the integration of the developed software into the environments of the mentioned mobile operating systems to enable access to different hardware components of the devices: GPS module, display, GSM module, etc. This paper deals with the definition and possible implementation of an environment for automatic application generation for multiple mobile platforms. It is based on a DSL for mobile application development, which comprises the programming language Scala and a DSL defined in Scala. As part of a multi-stage cross-compiling algorithm, this language is translated into the language of the affected mobile platform. The advantage of our method lies in the expressiveness of the defined language and the transparent source code translation between different languages, which implies, for example, advantages in debugging and in development of the generated code.

  15. Microfluidic platform combining droplets and magnetic tweezers: application to HER2 expression in cancer diagnosis

    PubMed Central

    Ferraro, Davide; Champ, Jérôme; Teste, Bruno; Serra, Marco; Malaquin, Laurent; Viovy, Jean-Louis; de Cremoux, Patricia; Descroix, Stephanie

    2016-01-01

    The development of precision medicine, together with the multiplication of targeted therapies and associated molecular biomarkers, calls for major progress in genetic analysis methods, allowing increased multiplexing and the implementation of more complex decision trees, without cost increase or loss of robustness. We present a platform combining droplet microfluidics and magnetic tweezers, performing RNA purification, reverse transcription and amplification in a fully automated and programmable way, in droplets of 250 nL directly sampled from a microtiter plate. This platform decreases sample consumption about 100-fold as compared to current robotized platforms and reduces human manipulations and contamination risk. The platform’s performance was first evaluated on cell lines, showing robust operation on RNA quantities corresponding to less than one cell, and then clinically validated with a cohort of 21 breast cancer samples, for the determination of their HER2 expression status, in a blind comparison with an established routine clinical analysis. PMID:27157697

  16. Microfluidic platform combining droplets and magnetic tweezers: application to HER2 expression in cancer diagnosis

    NASA Astrophysics Data System (ADS)

    Ferraro, Davide; Champ, Jérôme; Teste, Bruno; Serra, Marco; Malaquin, Laurent; Viovy, Jean-Louis; de Cremoux, Patricia; Descroix, Stephanie

    2016-05-01

    The development of precision medicine, together with the multiplication of targeted therapies and associated molecular biomarkers, calls for major progress in genetic analysis methods, allowing increased multiplexing and the implementation of more complex decision trees, without cost increase or loss of robustness. We present a platform combining droplet microfluidics and magnetic tweezers, performing RNA purification, reverse transcription and amplification in a fully automated and programmable way, in droplets of 250 nL directly sampled from a microtiter plate. This platform decreases sample consumption about 100-fold as compared to current robotized platforms and reduces human manipulations and contamination risk. The platform’s performance was first evaluated on cell lines, showing robust operation on RNA quantities corresponding to less than one cell, and then clinically validated with a cohort of 21 breast cancer samples, for the determination of their HER2 expression status, in a blind comparison with an established routine clinical analysis.

  17. Calibration and Data Efforts of the National Ecological Observatory Network (NEON) Airborne Observation Platform during its Engineering Development Phase

    NASA Astrophysics Data System (ADS)

    Adler, J.; Goulden, T.; Kampe, T. U.; Leisso, N.; Musinsky, J.

    2014-12-01

    The National Ecological Observatory Network (NEON) has collected airborne photographic, lidar, and imaging spectrometer data in 5 of 20 unique ecological climate regions (domains) within the United States. As part of its mission to detect and forecast ecological change at continental scales over multiple decades, the NEON Airborne Observation Platform (AOP) will aerially survey the entire network of 60 core and re-locatable terrestrial sites annually, each of which is a minimum of 10 km by 10 km in extent. The current effort encompasses three years of AOP engineering test flights; in 2017 NEON will transition to full operational status in all 20 domains. To date the total airborne data collected spans 34 terabytes, and L1 data from three of the five sampled domains are publicly available upon request. The large volume of current data, and the expected data collection over the remaining 15 domains, is challenging NEON's data distribution plans, backup capability, and data discovery processes. To provide the public with the highest quality data, calibration and validation efforts on the camera, lidar, and spectrometer L0 data are implemented to produce L1 datasets. Where available, the collected airborne measurements are validated against ground reference points and surfaces and adjusted for instrumentation and atmospheric effects. The imaging spectrometer data is spectrally and radiometrically corrected using NIST-traceable procedures. This presentation highlights three years of flight operation experiences, including: 1) lessons learned on payload re-configuration, data extraction, data distribution, permitting requirements, flight planning, and operational procedures; 2) lidar validation through comparisons with control data collected at the Boulder Municipal Airport (KBDU), the site of NEON's new hangar facility; and 3) spectrometer calibration efforts, including both laboratory and ground observations.

  18. NPOESS Preparatory Project Validation Program for the Cross-track Infrared Sounder

    NASA Astrophysics Data System (ADS)

    Barnet, C.; Gu, D.; Nalli, N. R.

    2009-12-01

    The National Polar-orbiting Operational Environmental Satellite System (NPOESS) Program, in partnership with the National Aeronautics and Space Administration (NASA), will launch the NPOESS Preparatory Project (NPP), a risk reduction and data continuity mission, prior to the first operational NPOESS launch. The NPOESS Program, in partnership with Northrop Grumman Aerospace Systems, will execute the NPP Calibration and Validation (Cal/Val) program to ensure the data products comply with the requirements of the sponsoring agencies. The Cross-track Infrared Sounder (CrIS) and the Advanced Technology Microwave Sounder (ATMS) are two of the instruments that make up the suite of sensors on NPP. Together, CrIS and ATMS will produce three Environmental Data Records (EDRs) including the Atmospheric Vertical Temperature Profile (AVTP), Atmospheric Vertical Moisture Profile (AVMP), and the Atmospheric Vertical Pressure Profile (AVPP). The AVTP and the AVMP are both NPOESS Key Performance Parameters (KPPs). The validation plans establish science and user community leadership and participation, and demonstrated, cost-effective Cal/Val approaches. This presentation will provide an overview of the collaborative data, techniques, and schedule for the validation of the NPP CrIS and ATMS environmental data products.

  19. A Vehicle Management End-to-End Testing and Analysis Platform for Validation of Mission and Fault Management Algorithms to Reduce Risk for NASA's Space Launch System

    NASA Technical Reports Server (NTRS)

    Trevino, Luis; Johnson, Stephen B.; Patterson, Jonathan; Teare, David

    2015-01-01

    The development of the Space Launch System (SLS) launch vehicle requires cross discipline teams with extensive knowledge of launch vehicle subsystems, information theory, and autonomous algorithms dealing with all operations from pre-launch through on orbit operations. The characteristics of these systems must be matched with the autonomous algorithm monitoring and mitigation capabilities for accurate control and response to abnormal conditions throughout all vehicle mission flight phases, including precipitating safing actions and crew aborts. This presents a large complex systems engineering challenge being addressed in part by focusing on the specific subsystems' handling of off-nominal mission and fault tolerance. Using traditional model based system and software engineering design principles from the Unified Modeling Language (UML), the Mission and Fault Management (M&FM) algorithms are crafted and vetted in specialized Integrated Development Teams composed of multiple development disciplines. NASA also has formed an M&FM team for addressing fault management early in the development lifecycle. This team has developed a dedicated Vehicle Management End-to-End Testbed (VMET) that integrates specific M&FM algorithms, specialized nominal and off-nominal test cases, and vendor-supplied physics-based launch vehicle subsystem models. The flexibility of VMET enables thorough testing of the M&FM algorithms by providing configurable suites of both nominal and off-nominal test cases to validate the algorithms utilizing actual subsystem models. The intent is to validate the algorithms and substantiate them with performance baselines for each of the vehicle subsystems in an independent platform exterior to flight software test processes. In any software development process there is inherent risk in the interpretation and implementation of concepts into software through requirements and test processes. Risk reduction is addressed by working with other organizations such as S&MA, Structures and Environments, GNC, Orion, the Crew Office, Flight Operations, and Ground Operations by assessing performance of the M&FM algorithms in terms of their ability to reduce Loss of Mission and Loss of Crew probabilities. In addition, through state machine and diagnostic modeling, analysis efforts investigate a broader suite of failure effects and detection and responses that can be tested in VMET and confirm that responses do not create additional risks or cause undesired states through interactive dynamic effects with other algorithms and systems. VMET further contributes to risk reduction by prototyping and exercising the M&FM algorithms early in their implementation and without any inherent hindrances such as meeting FSW processor scheduling constraints due to their target platform - ARINC 653 partitioned OS, resource limitations, and other factors related to integration with other subsystems not directly involved with M&FM. The plan for VMET encompasses testing the original M&FM algorithms coded in the same C++ language and state machine architectural concepts as that used by Flight Software. This enables the development of performance standards and test cases to characterize the M&FM algorithms and sets a benchmark from which to measure the effectiveness of M&FM algorithms performance in the FSW development and test processes. This paper is outlined in a systematic fashion analogous to a lifecycle process flow for engineering development of algorithms into software and testing.
Section I describes the NASA SLS M&FM context, presenting the current infrastructure, leading principles, methods, and participants. Section II defines the testing philosophy of the M&FM algorithms as related to VMET followed by section III, which presents the modeling methods of the algorithms to be tested and validated in VMET. Its details are then further presented in section IV followed by Section V presenting integration, test status, and state analysis. Finally, section VI addresses the summary and forward directions followed by the appendices presenting relevant information on terminology and documentation.
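
    The record above centers on state-machine-based monitoring that escalates from nominal operation through fault detection and safing to abort. As a toy illustration only, not the actual M&FM algorithms (which are not reproduced here), a minimal Python state machine with an invented redline threshold and persistence check:

    ```python
    from enum import Enum, auto

    class Mode(Enum):
        NOMINAL = auto()
        FAULT_DETECTED = auto()
        SAFING = auto()
        ABORT = auto()

    class FaultManager:
        """Toy mission-and-fault-management state machine (illustrative only)."""
        def __init__(self, redline: float):
            self.redline = redline      # hypothetical sensor limit
            self.mode = Mode.NOMINAL
            self.strikes = 0

        def step(self, sensor_value: float) -> Mode:
            exceeded = sensor_value > self.redline
            if self.mode is Mode.NOMINAL and exceeded:
                self.mode = Mode.FAULT_DETECTED
            elif self.mode is Mode.FAULT_DETECTED:
                # Persistence check: require 3 consecutive exceedances
                self.strikes = self.strikes + 1 if exceeded else 0
                if self.strikes >= 3:
                    self.mode = Mode.SAFING      # precipitate safing action
                elif not exceeded:
                    self.mode = Mode.NOMINAL
            elif self.mode is Mode.SAFING and exceeded:
                self.mode = Mode.ABORT           # escalate to crew abort
            return self.mode

    fm = FaultManager(redline=100.0)
    for v in [90, 105, 106, 107, 108, 120]:
        print(fm.step(v))   # NOMINAL -> FAULT_DETECTED -> ... -> SAFING -> ABORT
    ```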

  20. Towards an Open, Distributed Software Architecture for UxS Operations

    NASA Technical Reports Server (NTRS)

    Cross, Charles D.; Motter, Mark A.; Neilan, James H.; Qualls, Garry D.; Rothhaar, Paul M.; Tran, Loc; Trujillo, Anna C.; Allen, B. Danette

    2015-01-01

    To address the growing need to evaluate, test, and certify an ever expanding ecosystem of UxS platforms in preparation for cultural integration, NASA Langley Research Center's Autonomy Incubator (AI) has taken on the challenge of developing a software framework in which UxS platforms developed by third parties can be integrated into a single system that provides evaluation and testing, mission planning and operation, and out-of-the-box autonomy and data fusion capabilities. This software framework, named AEON (Autonomous Entity Operations Network), has two main goals. The first goal is the development of a cross-platform, extensible, onboard software system that provides autonomy at the mission execution and course-planning level, a highly configurable data fusion framework sensitive to the platform's available sensor hardware, and plug-and-play compatibility with a wide array of computer systems, sensors, software, and controls hardware. The second goal is the development of a ground control system that acts as a test-bed for integration of the proposed heterogeneous fleet, and allows for complex mission planning, tracking, and debugging capabilities. The ground control system should also be highly extensible and allow plug-and-play interoperability with third party software systems. In order to achieve these goals, this paper proposes an open, distributed software architecture which utilizes at its core the Data Distribution Service (DDS) standards, established by the Object Management Group (OMG), for inter-process communication and data flow. The design decisions proposed herein leverage the advantages of existing robotics software architectures and the DDS standards to develop software that is scalable, high-performance, fault tolerant, modular, and readily interoperable with external platforms and software.
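
    The DDS standard at the core of AEON decouples data producers and consumers through named topics. The following minimal Python sketch imitates that publish/subscribe pattern with an in-process bus; it is a stand-in for illustration, not the OMG DDS API, and the topic name and payload are invented:

    ```python
    from collections import defaultdict
    from typing import Any, Callable

    class TopicBus:
        """Minimal publish/subscribe bus illustrating the topic-based
        decoupling that DDS provides (not the DDS API itself)."""
        def __init__(self):
            self._subscribers = defaultdict(list)

        def subscribe(self, topic: str, callback: Callable[[Any], None]) -> None:
            self._subscribers[topic].append(callback)

        def publish(self, topic: str, sample: Any) -> None:
            # Every subscriber on the topic receives the sample; publisher
            # knows nothing about who is listening.
            for callback in self._subscribers[topic]:
                callback(sample)

    bus = TopicBus()
    # A ground-station component and an onboard logger both receive pose updates
    bus.subscribe("vehicle/pose", lambda s: print("ground station got", s))
    bus.subscribe("vehicle/pose", lambda s: print("onboard logger got", s))
    bus.publish("vehicle/pose", {"x": 1.0, "y": 2.0, "z": 10.0})
    ```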

  1. BiodMHC: an online server for the prediction of MHC class II-peptide binding affinity.

    PubMed

    Wang, Lian; Pan, Danling; Hu, Xihao; Xiao, Jinyu; Gao, Yangyang; Zhang, Huifang; Zhang, Yan; Liu, Juan; Zhu, Shanfeng

    2009-05-01

    Effective identification of major histocompatibility complex (MHC)-restricted peptides is a critical step in discovering immune epitopes. Although many online servers have been built to predict class II MHC-peptide binding affinity, they have been trained on different datasets and thus fail to provide a unified comparison of the various methods. In this paper, we present our implementation of seven popular predictive methods, namely SMM-align, ARB, SVR-pairwise, Gibbs sampler, ProPred, LP-top2, and MHCPred, on a single web server named BiodMHC (http://biod.whu.edu.cn/BiodMHC/index.html; the software is available upon request). Using the standard measure of AUC (Area Under the receiver operating characteristic Curve), we compare these methods by means of not only cross-validation but also prediction on independent test datasets. We find that SMM-align, ProPred, SVR-pairwise, ARB, and Gibbs sampler are the five best-performing methods. For class II MHC-peptide binding affinity prediction, BiodMHC provides a convenient online platform for researchers to obtain binding information simultaneously using various methods.
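
    As a sketch of the evaluation protocol described in this record (cross-validated AUC for a predictive method), the following Python example uses scikit-learn on synthetic stand-in data; the model and dataset are placeholders, not the BiodMHC predictors:

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    # Stand-in data: 200 peptides x 20 features, binary binder/non-binder labels
    X = rng.normal(size=(200, 20))
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=200) > 0).astype(int)

    # 5-fold cross-validated AUC for one predictor; repeating this per method
    # gives the kind of unified comparison the server reports.
    aucs = cross_val_score(LogisticRegression(max_iter=1000), X, y,
                           cv=5, scoring="roc_auc")
    print("mean AUC:", aucs.mean().round(3))
    ```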

  2. LIVVkit: An extensible, python-based, land ice verification and validation toolkit for ice sheet models

    NASA Astrophysics Data System (ADS)

    Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.; Price, Stephen; Hoffman, Matthew; Lipscomb, William H.; Fyke, Jeremy; Vargo, Lauren; Boghozian, Adrianna; Norman, Matthew; Worley, Patrick H.

    2017-06-01

    To address the pressing need to better understand the behavior and complex interactions of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent with the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based, extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community-prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observational data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model-specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Ultimately, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.
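
    One of the checks a LIVVkit-style workflow relies on is bit-for-bit comparison between a regression run and reference output. A minimal Python sketch of such a check (not LIVVkit's own code; the fields are synthetic):

    ```python
    import numpy as np

    def bit_for_bit(test: np.ndarray, ref: np.ndarray):
        """Compare a regression-test field against a reference field.

        Returns (identical, max_abs_diff); a verification harness would
        report both and plot where the differences occur.
        """
        if test.shape != ref.shape:
            return False, float("inf")
        identical = np.array_equal(test, ref)
        return identical, float(np.max(np.abs(test - ref)))

    ref = np.linspace(0.0, 1.0, 1_000_000).reshape(1000, 1000)  # reference field
    test = ref.copy()
    test[500, 500] += 1e-12                                     # perturb one cell
    print(bit_for_bit(test, ref))   # (False, ~1e-12): not bit-for-bit, but tiny
    ```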

  3. Validation of an automated system for aliquoting of HIV-1 Env-pseudotyped virus stocks.

    PubMed

    Schultz, Anke; Germann, Anja; Fuss, Martina; Sarzotti-Kelsoe, Marcella; Ozaki, Daniel A; Montefiori, David C; Zimmermann, Heiko; von Briesen, Hagen

    2018-01-01

    The standardized assessment of HIV-specific immune responses is of central interest in the preclinical and clinical stages of HIV-1 vaccine development. In this regard, HIV-1 Env-pseudotyped viruses play a central role in the evaluation of neutralizing antibody profiles and are produced according to Good Clinical Laboratory Practice- (GCLP-) compliant manual and automated procedures. To further improve and complete the automated production cycle, an automated system for aliquoting HIV-1 pseudovirus stocks has been implemented. The automation platform consists of a modified Tecan-based system including a robot platform for handling racks containing 48 cryovials, a decapper, a tubing pump, and a safety device consisting of ultrasound sensors for online liquid-level detection of each individual cryovial. With the aim of aliquoting the HIV-1 pseudoviruses in an automated manner under GCLP-compliant conditions, a validation plan was developed in which the acceptance criteria (accuracy, precision, specificity, and robustness) were defined and summarized. By passing the validation experiments described in this article, the automated system for aliquoting has been successfully validated. This allows the standardized and operator-independent distribution of small-scale and bulk amounts of HIV-1 pseudovirus stocks with a precise and reproducible outcome to support upcoming clinical vaccine trials.
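
    The validation plan's accuracy and precision criteria can be pictured as a simple gate over measured aliquot volumes. A toy Python sketch (the 5% limits and the volumes are invented placeholders, not the study's thresholds):

    ```python
    import statistics

    def acceptance_check(volumes_ul, target_ul, max_bias_pct=5.0, max_cv_pct=5.0):
        """Toy accuracy/precision gate for one aliquoting run.

        volumes_ul: measured aliquot volumes; the 5% limits are placeholders.
        """
        mean_v = statistics.mean(volumes_ul)
        bias_pct = 100.0 * (mean_v - target_ul) / target_ul       # accuracy
        cv_pct = 100.0 * statistics.stdev(volumes_ul) / mean_v    # precision
        passed = abs(bias_pct) <= max_bias_pct and cv_pct <= max_cv_pct
        return passed, bias_pct, cv_pct

    ok, bias, cv = acceptance_check([498, 503, 501, 497, 502], target_ul=500)
    print(ok, round(bias, 2), round(cv, 2))   # True 0.04 0.52
    ```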

  4. Validation of an automated system for aliquoting of HIV-1 Env-pseudotyped virus stocks

    PubMed Central

    Schultz, Anke; Germann, Anja; Fuss, Martina; Sarzotti-Kelsoe, Marcella; Ozaki, Daniel A.; Montefiori, David C.; Zimmermann, Heiko

    2018-01-01

    The standardized assessment of HIV-specific immune responses is of central interest in the preclinical and clinical stages of HIV-1 vaccine development. In this regard, HIV-1 Env-pseudotyped viruses play a central role in the evaluation of neutralizing antibody profiles and are produced according to Good Clinical Laboratory Practice- (GCLP-) compliant manual and automated procedures. To further improve and complete the automated production cycle, an automated system for aliquoting HIV-1 pseudovirus stocks has been implemented. The automation platform consists of a modified Tecan-based system including a robot platform for handling racks containing 48 cryovials, a decapper, a tubing pump, and a safety device consisting of ultrasound sensors for online liquid-level detection of each individual cryovial. With the aim of aliquoting the HIV-1 pseudoviruses in an automated manner under GCLP-compliant conditions, a validation plan was developed in which the acceptance criteria (accuracy, precision, specificity, and robustness) were defined and summarized. By passing the validation experiments described in this article, the automated system for aliquoting has been successfully validated. This allows the standardized and operator-independent distribution of small-scale and bulk amounts of HIV-1 pseudovirus stocks with a precise and reproducible outcome to support upcoming clinical vaccine trials. PMID:29300769

  5. Ozone Observations by the Gas and Aerosol Measurement Sensor during SOLVE II

    NASA Technical Reports Server (NTRS)

    Pitts, M. C.; Thomason, L. W.; Zawodny, J. M.; Wenny, B. N.; Livingston, J. M.; Russell, P. B.; Yee, J.-H.; Swartz, W. H.; Shetter, R. E.

    2006-01-01

    The Gas and Aerosol Measurement Sensor (GAMS) was deployed aboard the NASA DC-8 aircraft during the second SAGE III Ozone Loss and Validation Experiment (SOLVE II). GAMS acquired line-of-sight (LOS) direct solar irradiance spectra during the sunlit portions of ten science flights of the DC-8 between 12 January and 4 February 2003. Differential line-of-sight (DLOS) optical depth spectra are produced from the GAMS raw solar irradiance spectra. DLOS ozone number densities are then retrieved from the GAMS spectra using a multiple linear regression spectral fitting technique. Both the DLOS optical depth spectra and the retrieved ozone data are compared with coincident measurements from two other solar instruments aboard the DC-8 platform to demonstrate the robustness and stability of the GAMS data. The GAMS ozone measurements are then used to evaluate the quality of the Wulf-band ozone cross sections, a critical component of the SAGE III aerosol, water vapor, and temperature/pressure retrievals. Results suggest that the ozone cross section compilation of Shettle and Anderson, currently used operationally in SAGE III data processing, may be in error by as much as 10-20% in the Wulf bands, and that their lack of reported temperature dependence is a significant deficiency. A second, more recent cross section database, compiled for the SCIAMACHY satellite mission, appears to be of much better quality in the Wulf bands, but may still have errors as large as 5% near the Wulf-band absorption peaks, slightly larger than its stated uncertainty. Additional laboratory measurements of the Wulf-band cross sections should be pursued to further reduce their uncertainty and better quantify their temperature dependence.
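
    The multiple linear regression retrieval described here amounts to fitting the measured DLOS optical depth spectrum as a linear combination of species cross sections. A minimal Python sketch with synthetic spectra (all values are invented; the real retrieval includes more species and instrument terms):

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_wavelengths, n_species = 300, 3

    # Stand-in absorption cross sections sigma_i(lambda), cm^2 per molecule
    sigma = rng.random((n_wavelengths, n_species)) * 1e-12
    n_true = np.array([2.0e12, 5.0e11, 1.0e12])   # true DLOS column densities

    # Measured DLOS optical depth: tau(lambda) = sum_i sigma_i(lambda)*N_i + noise
    tau = sigma @ n_true + rng.normal(scale=1e-3, size=n_wavelengths)

    # Multiple linear regression: least-squares fit for the column densities
    n_fit, *_ = np.linalg.lstsq(sigma, tau, rcond=None)
    print(n_fit)   # close to n_true
    ```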

  6. EOS Terra Validation Program

    NASA Technical Reports Server (NTRS)

    Starr, David

    2000-01-01

    The EOS Terra mission will be launched in July 1999. This mission has great relevance to the atmospheric radiation community and global change issues. Terra instruments include the Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER), Clouds and the Earth's Radiant Energy System (CERES), Multi-Angle Imaging Spectroradiometer (MISR), Moderate Resolution Imaging Spectroradiometer (MODIS), and Measurements of Pollution in the Troposphere (MOPITT). In addition to the fundamental radiance data sets, numerous global science data products will be generated, including various Earth radiation budget, cloud, and aerosol parameters, as well as land surface, terrestrial ecology, ocean color, and atmospheric chemistry parameters. Significant investments have been made in on-board calibration to ensure the quality of the radiance observations. A key component of the Terra mission is the validation of the science data products. This is essential for a mission focused on global change issues and the underlying processes. The Terra algorithms have been subject to extensive pre-launch testing with field data whenever possible. Intensive efforts will be made to validate the Terra data products after launch. These include vicarious calibration experiments to validate instrument calibration, instrument and cross-platform comparisons, routine collection of high-quality correlative data from ground-based networks, such as AERONET, and intensive sites, such as the SGP ARM site, as well as a variety of field experiments, cruises, etc. Airborne simulator instruments have been developed for the field experiment and underflight activities, including the MODIS Airborne Simulator (MAS), AirMISR, MASTER (MODIS-ASTER), and MOPITT-A. All are integrated on the NASA ER-2, though low-altitude platforms are more typically used for MASTER. MATR is an additional sensor used for MOPITT algorithm development and validation. The intensive validation activities planned for the first year of the Terra mission will be described, with emphasis on derived geophysical parameters of most relevance to the atmospheric radiation community.

  7. Development of a PEGylated-Based Platform for Efficient Delivery of Dietary Antioxidants Across the Blood-Brain Barrier.

    PubMed

    Fernandes, Carlos; Pinto, Miguel; Martins, Cláudia; Gomes, Maria João; Sarmento, Bruno; Oliveira, Paulo J; Remião, Fernando; Borges, Fernanda

    2018-05-16

    The uptake and transport of dietary antioxidants remains the most important setback for their application in therapy. To overcome these limitations, a PEGylated-based platform was developed to improve the delivery properties of two dietary hydroxycinnamic acid (HCA) antioxidants, caffeic and ferulic acids. The antioxidant properties of the new polymer-antioxidant conjugates (PEGAntiOxs), prepared by linking poly(ethylene glycol) (PEG) to the cinnamic acids via a one-step Knoevenagel condensation reaction, were evaluated. PEGAntiOxs present a higher lipophilicity than the parent compounds (caffeic and ferulic acids) and similar, or higher, antioxidant properties. PEGAntiOxs were not cytotoxic at the tested concentrations in SH-SY5Y, Caco-2, and hCMEC/D3 cells. By contrast, cytotoxic effects in hCMEC/D3 and SH-SY5Y cells were observed, at 50 and 100 μM, for caffeic and ferulic acids. PEGAntiOxs operate as antioxidants against several cellular oxidative stress inducers in a neuronal cell-based model, and were able to inhibit P-glycoprotein in Caco-2 cells. PEGAntiOxs can cross hCMEC/D3 cell monolayers, a model of the blood-brain barrier (BBB) endothelial membrane. In summary, PEGAntiOxs are valid antioxidant prototypes that can uphold the antioxidant properties of HCAs, reduce their cytotoxicity, and improve their BBB permeability. PEGAntiOxs can be used in the near future as drug candidates to prevent or slow the oxidative stress associated with neurodegenerative diseases.

  8. Radiated EMC immunity investigation of common recognition identification platform for medical applications

    NASA Astrophysics Data System (ADS)

    Miranda, Jorge; Cabral, Jorge; Ravelo, Blaise; Wagner, Stefan; Pedersen, Christian F.; Memon, Mukhtiar; Mathiesen, Morten

    2015-01-01

    An innovative e-healthcare platform named the Common Recognition and Identification Platform (CRIP) was developed and tested as part of the CareStore project. CareStore and CRIP aim to deliver accurate and safe disease management by minimising human operator errors in hospitals and care facilities. To support this, the CRIP platform features fingerprint biometrics and near-field communication (NFC) for user identification, and Bluetooth communication support for a range of telemedicine medical devices adhering to the IEEE 11073 standard. The aim of this study was to evaluate the electromagnetic compatibility (EMC) immunity of the CRIP platform in order to validate it for medical application use. The first prototype of the CRIP was demonstrated to operate as expected, showing the feasibility of the user identification function, both via NFC and biometrics, and detecting Bluetooth devices via radio frequency (RF) scanning. The NFC module works in the 13.56 MHz band and the Bluetooth module works in the 2.4 GHz band, according to the IEEE 802.15.1 standard. The standard test qualification of the CRIP was performed based on radiated EMC immunity with respect to the EN 61000-4-3 standard. The immunity tests were conducted under industrial EMC compliance, with electric field levels up to 10 V/m in both horizontal and vertical polarisations and with the test antenna and the CRIP placed at a distance of 3 m. It was found that the CRIP device complies with the European electromagnetic (EM) radiation immunity requirements.

  9. Development and Validation of a Portable Platform for Deploying Decision-Support Algorithms in Prehospital Settings

    PubMed Central

    Reisner, A. T.; Khitrov, M. Y.; Chen, L.; Blood, A.; Wilkins, K.; Doyle, W.; Wilcox, S.; Denison, T.; Reifman, J.

    2013-01-01

    Background: Advanced decision-support capabilities for prehospital trauma care may prove effective at improving patient care. Such functionality would be possible if an analysis platform were connected to a transport vital-signs monitor. In practice, there are technical challenges to implementing such a system. Not only must each individual component be reliable, but, in addition, the connectivity between components must be reliable. Objective: We describe the development, validation, and deployment of the Automated Processing of Physiologic Registry for Assessment of Injury Severity (APPRAISE) platform, intended to serve as a test bed to help evaluate the performance of decision-support algorithms in a prehospital environment. Methods: We describe the hardware selected and the software implemented, and the procedures used for laboratory and field testing. Results: The APPRAISE platform met performance goals in both laboratory testing (using a vital-sign data simulator) and initial field testing. After its field testing, the platform has been in use on Boston MedFlight air ambulances since February of 2010. Conclusion: These experiences may prove informative to other technology developers and to healthcare stakeholders seeking to invest in connected electronic systems for prehospital as well as in-hospital use. Our experiences illustrate two sets of important questions: are the individual components reliable (e.g., physical integrity, power, core functionality, and end-user interaction) and is the connectivity between components reliable (e.g., communication protocols and the metadata necessary for data interpretation)? While all potential operational issues cannot be fully anticipated and eliminated during development, thoughtful design and phased testing steps can reduce, if not eliminate, technical surprises. PMID:24155791

  10. Joint chemical agent detector (JCAD): the future of chemical agent detection

    NASA Astrophysics Data System (ADS)

    Laljer, Charles E.

    2003-08-01

    The Joint Chemical Agent Detector (JCAD) has continued development through 2002. The JCAD has completed Contractor Validation Testing (CVT) that included chemical warfare agent testing, environmental testing, electromagnetic interference testing, and platform integration validation. The JCAD provides state-of-the-art chemical warfare agent detection capability to military and homeland security operators. Intelligence sources estimate that over twenty countries have active chemical weapons programs. The spread of weapons of mass destruction (and the industrial capability to manufacture these weapons) to third-world nations and terrorist organizations has greatly increased the chemical agent threat to U.S. interests. This, coupled with the potential for U.S. involvement in localized conflicts in an operational or support capacity, increases the probability that the military Joint Services may encounter chemical agents anywhere in the world. The JCAD is a small (45 in³), lightweight (2 lb) chemical agent detector for vehicle interiors, aircraft, individual personnel, shipboard, and fixed-site locations. The system provides a common detection component across multi-service platforms. This common detector system will allow the Joint Services to use the same operational and support concept for more efficient utilization of resources. The JCAD detects, identifies, quantifies, and warns of the presence of chemical agents prior to the onset of miosis. Upon detection of chemical agents, the detector provides local and remote audible and visual alarms to the operators. Advance warning will provide the vehicle crew and other personnel in the local area with the time necessary to protect themselves from the lethal effects of chemical agents. The JCAD is capable of being upgraded to protect against future chemical agent threats. The JCAD provides the operator with the warning necessary to survive and fight in a chemical warfare agent threat environment.

  11. L-band Soil Moisture Mapping using Small UnManned Aerial Systems

    NASA Astrophysics Data System (ADS)

    Dai, E.

    2015-12-01

    Soil moisture is of fundamental importance to many hydrological, biological, and biogeochemical processes, plays an important role in the development and evolution of convective weather and precipitation, and impacts water resource management, agriculture, and flood runoff prediction. The launch of NASA's Soil Moisture Active/Passive (SMAP) mission in 2015 promises to provide global measurements of soil moisture and surface freeze/thaw state at fixed crossing times and spatial resolutions as low as 5 km for some products. However, there exists a need for measurements of soil moisture on smaller spatial scales and at arbitrary diurnal times for SMAP validation, precision agriculture, and evaporation and transpiration studies of boundary layer heat transport. The Lobe Differencing Correlation Radiometer (LDCR) provides a means of mapping soil moisture on spatial scales as small as several meters (i.e., the height of the platform). Compared with various other proposed methods of validation based on either in situ measurements [1,2] or existing airborne sensors suitable for manned aircraft deployment [3], the integrated design of the LDCR on a lightweight small UAS (sUAS) is capable of providing sub-watershed (~km scale) coverage at very high spatial resolution (~15 m) suitable for scaling studies, and at comparatively low operator cost. The LDCR on the Tempest unit can map soil moisture at different resolutions, of the order of the Tempest flight altitude.

  12. Wide Angle Imaging Lidar (WAIL): Theory of Operation and Results from Cross-Platform Validation at the ARM Southern Great Plains Site

    NASA Astrophysics Data System (ADS)

    Polonsky, I. N.; Davis, A. B.; Love, S. P.

    2004-05-01

    WAIL was designed to determine the physical and geometrical characteristics of optically thick clouds using the off-beam component of the lidar return, which can be accurately modeled within the 3D photon diffusion approximation. The theory shows that the WAIL signal depends not only on the cloud's optical characteristics (phase function, extinction and scattering coefficients) but also on the outer thickness of the cloud layer. This makes it possible to estimate the mean optical and geometrical thicknesses of the cloud. Comparison with Monte Carlo simulation demonstrates the high accuracy of the diffusion approximation for moderately to very dense clouds. During operation, WAIL is able to collect a complete data set from a cloud every few minutes, averaging over a horizontal scale of a kilometer or so. In order to validate WAIL's ability to deliver cloud properties, the LANL instrument was deployed as part of the THickness from Off-beam Returns (THOR) validation IOP. The goal was to probe clouds above the SGP CART site at night in March 2002, from below (with WAIL and ARM instruments) and from NASA's P3 aircraft (carrying THOR, the GSFC counterpart of WAIL) flying above the clouds. The permanent cloud instruments used for comparison with the results obtained from WAIL were ARM's laser ceilometer, micro-pulse lidar (MPL), millimeter-wavelength cloud radar (MMCR), and microwave radiometer (MWR). The comparison shows that, in spite of an unusually low cloud ceiling, an unfavorable observation condition for WAIL's present configuration, the cloud properties obtained from the new instrument are in good agreement with their counterparts obtained by other instruments. WAIL can therefore duplicate, at least for single-layer clouds, the cloud products of the MWR and MMCR together, but it does so with green laser light, which is far more representative than microwaves of the photon transport processes at work in the climate system.

  13. Transcultural adaptation into Spanish of the Induction Compliance Checklist for assessing children's behaviour during induction of anaesthesia.

    PubMed

    Jerez-Molina, Carmen; Lázaro-Alcay, Juan J; Ullán-de la Fuente, Ana M

    2017-10-17

    The aim of this study was the cross-cultural adaptation into Spanish of the Induction Compliance Checklist (ICC) for assessing children's behaviour during induction of anaesthesia. A descriptive cross-sectional observational study was conducted on a sample of 81 children aged 2 to 12 years operated on in an ambulatory surgery unit of a paediatric hospital in Barcelona. The tool was adapted by translation and back-translation, and the scale's validity and reliability were analysed. Face validity of the tool was ensured through a discussion group, and inter-observer reliability was evaluated, obtaining an intraclass correlation coefficient of r = 0.956. The ICC scale validated for the Spanish population can be an effective tool for the presurgical evaluation of activities carried out to minimise children's anxiety. The ICC is an easy-to-use scale, completed by operating room staff in one minute, that would provide important information about children's behaviour, specifically during induction. Copyright © 2017 Elsevier España, S.L.U. All rights reserved.

  14. Phase I Development of Neutral Beam Injector Solid-State Power System

    NASA Astrophysics Data System (ADS)

    Prager, James; Ziemba, Timothy; Miller, Kenneth E.; Slobodov, Ilia; Anderson, Seth

    2017-10-01

    Neutral beam injection (NBI) is an important tool for plasma heating and current drive, and serves as a diagnostic, at fusion science experiments around the United States, including tokamaks, validation platform experiments, and privately funded fusion concepts. Currently, there are no vendors in the United States for NBI power systems. Eagle Harbor Technologies (EHT), Inc. is developing a new power system for NBI that takes advantage of the latest developments in solid-state switching. EHT has developed a resonant converter that can be scaled to the power levels required for NBI at small-scale validation platform experiments like the Lithium Tokamak Experiment. This power system can be used to modulate the NBI voltages over the course of a plasma shot, which can lead to improved control over the plasma. EHT will present the initial modeling used to design this system as well as experimental data showing operation at 15 kV and 40 A for 10 ms into a test load. This work is supported by the DOE SBIR program.

  15. Development of a QDots 800 based fluorescent solid phantom for validation of NIRF imaging platforms

    NASA Astrophysics Data System (ADS)

    Zhu, Banghe; Sevick-Muraca, Eva M.

    2013-02-01

    Over the past decade, we developed near-infrared fluorescence (NIRF) devices for non-invasive lymphatic imaging using microdosages of ICG in humans and for detection of lymph node metastasis in animal models mimicking metastatic human prostate cancer. To validate imaging, a NIST-traceable phantom is needed so that developed "first-in-humans" drugs may be used with different fluorescent imaging platforms. In this work, we developed a QDots 800 based fluorescent solid phantom for installation and operational qualification of clinical and preclinical NIRF imaging devices. Due to its optical clarity, polyurethane was chosen as the base material. Titanium dioxide was used as the scattering agent because of its miscibility in polyurethane. QDots 800 was chosen owing to its stability and NIR emission spectra. A first phantom was constructed for evaluation of the noise floor arising from excitation light leakage, a phenomenon that can be minimized during the engineering and design of fluorescent imaging systems. A second set of phantoms was constructed to enable quantification of the device sensitivity associated with our preclinical and clinical devices. The phantoms have been successfully applied for installation and operational qualification of our preclinical and clinical devices. Assessment of excitation light leakage provides a figure of merit for the "noise floor", and imaging sensitivity can be used to benchmark devices for specific imaging agents.

  16. KOLAM: a cross-platform architecture for scalable visualization and tracking in wide-area imagery

    NASA Astrophysics Data System (ADS)

    Fraser, Joshua; Haridas, Anoop; Seetharaman, Guna; Rao, Raghuveer M.; Palaniappan, Kannappan

    2013-05-01

    KOLAM is an open, cross-platform, interoperable, scalable, and extensible framework supporting a novel multi-scale spatiotemporal dual-cache data structure for big data visualization and visual analytics. This paper focuses on the use of KOLAM for target tracking in high-resolution, high-throughput wide-format video, also known as wide-area motion imagery (WAMI). It was originally developed for the interactive visualization of extremely large geospatial imagery of high spatial and spectral resolution. KOLAM is platform, operating system, and (graphics) hardware independent, and supports embedded datasets scalable from hundreds of gigabytes to feasibly petabytes in size on clusters, workstations, desktops, and mobile computers. In addition to rapid roam, zoom, and hyper-jump spatial operations, a large number of simultaneously viewable embedded pyramid layers (also referred to as multiscale or sparse imagery), interactive colormap and histogram enhancement, spherical projection, and terrain maps are supported. The KOLAM software architecture was extended to support airborne wide-area motion imagery by organizing spatiotemporal tiles in very large format video frames using a temporal cache of tiled pyramid cached data structures. The current version supports WAMI animation, fast intelligent inspection, trajectory visualization, and target tracking (digital tagging); the latter by interfacing with external automatic tracking software. One of the critical needs for working with WAMI is a supervised tracking and visualization tool that allows analysts to digitally tag multiple targets, quickly review and correct tracking results, and apply geospatial visual analytic tools to the generated trajectories. One-click manual tracking combined with multiple automated tracking algorithms is available to assist the analyst and increase human effectiveness.
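
    The temporal cache of tiled pyramids described above can be pictured as an LRU cache keyed by (frame, level, row, column). A simplified Python sketch of that idea (not KOLAM's implementation; the loader and capacity are invented):

    ```python
    from collections import OrderedDict

    class TileCache:
        """LRU cache over spatiotemporal tile keys (frame, level, row, col),
        a simplified take on a dual-cache pyramid structure."""
        def __init__(self, capacity: int, loader):
            self.capacity = capacity
            self.loader = loader          # fetches a tile on a cache miss
            self._cache = OrderedDict()

        def get(self, frame: int, level: int, row: int, col: int):
            key = (frame, level, row, col)
            if key in self._cache:
                self._cache.move_to_end(key)      # mark as most recently used
                return self._cache[key]
            tile = self.loader(*key)              # miss: decode from disk/network
            self._cache[key] = tile
            if len(self._cache) > self.capacity:
                self._cache.popitem(last=False)   # evict least recently used
            return tile

    cache = TileCache(capacity=4, loader=lambda f, l, r, c: f"tile{(f, l, r, c)}")
    print(cache.get(0, 2, 10, 7))   # loads once; later identical calls hit the cache
    ```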

  17. Time and Space Partitioning the EagleEye Reference Mission

    NASA Astrophysics Data System (ADS)

    Bos, Victor; Mendham, Peter; Kauppinen, Panu; Holsti, Niklas; Crespo, Alfons; Masmano, Miguel; de la Puente, Juan A.; Zamorano, Juan

    2013-08-01

    We discuss experiences gained by porting a Software Validation Facility (SVF) and a satellite Central Software (CSW) to a platform with support for Time and Space Partitioning (TSP). The SVF and CSW are part of the EagleEye Reference Mission of the European Space Agency (ESA). As a reference mission, EagleEye is a perfect candidate for evaluating practical aspects of developing satellite CSW for, and on, TSP platforms. The specific TSP platform we used consists of a simulated LEON3 CPU controlled by the XtratuM separation micro-kernel. On top of this, we run five separate partitions. Each partition runs its own real-time operating system or Ada run-time kernel, which in turn runs the application software of the CSW. We describe issues related to partitioning; inter-partition communication; scheduling; I/O; and fault detection, isolation, and recovery (FDIR).
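
    Time partitioning in TSP kernels such as XtratuM follows a static cyclic schedule: a repeating major frame divided into fixed slots, one per partition. A toy Python sketch with invented partition names and durations:

    ```python
    # Static cyclic schedule: the major frame repeats a fixed sequence of
    # (partition, slot duration) pairs; the millisecond values are invented.
    MAJOR_FRAME = [("CSW_core", 20), ("payload", 10), ("FDIR", 5),
                   ("comms", 10), ("spare", 5)]

    def partition_at(t_ms: float) -> str:
        """Return which partition owns the CPU at time t_ms."""
        frame_len = sum(d for _, d in MAJOR_FRAME)
        t = t_ms % frame_len                 # position within the major frame
        for name, dur in MAJOR_FRAME:
            if t < dur:
                return name
            t -= dur
        return MAJOR_FRAME[-1][0]            # unreachable after the modulo

    print(partition_at(0), partition_at(27), partition_at(49))  # CSW_core payload spare
    ```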

  18. REVEAL: Software Documentation and Platform Migration

    NASA Technical Reports Server (NTRS)

    Wilson, Michael A.; Veibell, Victoir T.; Freudinger, Lawrence C.

    2008-01-01

    The Research Environment for Vehicle Embedded Analysis on Linux (REVEAL) is reconfigurable data acquisition software designed for network-distributed test and measurement applications. In development since 2001, it has been successfully demonstrated in support of a number of actual missions within NASA's Suborbital Science Program. Improvements to software configuration control were needed to properly support both an ongoing transition to operational status and the continued evolution of REVEAL capabilities. For this reason, the project described in this report targets REVEAL software source documentation and deployment of the software on a small set of hardware platforms different from those currently used in the baseline system implementation. This report specifically describes the actions taken over a ten-week period by two undergraduate student interns and serves as a final report for that internship. The topics discussed include the documentation of REVEAL source code; the migration of REVEAL to other platforms; and an end-to-end field test that successfully validated the efforts.

  19. Validation of the three web quality dimensions of a minimally invasive surgery e-learning platform.

    PubMed

    Ortega-Morán, Juan Francisco; Pagador, J Blas; Sánchez-Peralta, Luisa Fernanda; Sánchez-González, Patricia; Noguera, José; Burgos, Daniel; Gómez, Enrique J; Sánchez-Margallo, Francisco M

    2017-11-01

    E-learning web environments, including the new TELMA platform, are increasingly being used to provide cognitive training in minimally invasive surgery (MIS) to surgeons. A complete validation of this MIS e-learning platform has been performed to determine whether it complies with the three web quality dimensions: usability, content, and functionality. Twenty-one surgeons participated in the validation trials. They performed a set of tasks on the TELMA platform, where an e-MIS validity approach was followed. Subjective (questionnaires and checklists) and objective (web analytics) metrics were analysed to achieve the complete validation of usability, content, and functionality. The TELMA platform allowed access to didactic content with easy and intuitive navigation. Surgeons performed all tasks with a close-to-ideal number of clicks and amount of time. They considered the design of the website to be consistent (95.24%), organised (90.48%), and attractive (85.71%). Moreover, they gave the content a high score (4.06 out of 5) and considered it adequate for teaching purposes. The surgeons scored the professional language and content (4.35), logo (4.24), and recommendations (4.20) the highest. Regarding functionality, the TELMA platform received an acceptance of 95.24% for navigation and 90.48% for interactivity. According to the study, TELMA had an attractive design, innovative content, and interactive navigation, which are three key features of an e-learning platform. TELMA successfully met the three criteria necessary for consideration as a quality website by achieving more than 70% agreement on all usability, content, and functionality items validated; this constitutes a preliminary requirement for an effective e-learning platform. However, the content completeness, authoring tool, and registration process require improvement. Finally, the e-MIS validity methodology used to measure the three dimensions of web quality in this work can be applied to other clinical areas or training fields. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Control of a Vanadium Redox Battery and supercapacitor using a Three-Level Neutral Point Clamped converter

    NASA Astrophysics Data System (ADS)

    Etxeberria, A.; Vechiu, I.; Baudoin, S.; Camblong, H.; Kreckelbergh, S.

    2014-02-01

    The increasing use of distributed generators, which are mainly based on renewable sources, can create several issues in the operation of the electric grid. The microgrid is being analysed as a solution for integrating renewable sources into the grid at a high penetration level in a controlled way. Storage systems play a vital role in keeping the energy and power balance of the microgrid. Due to the technical limitations of currently available storage systems, it is necessary to use more than one storage technology to satisfy the requirements of the microgrid application. This work validates, in simulation and experimentally, the use of a Three-Level Neutral Point Clamped converter to control the power flow of a hybrid storage system formed by a supercapacitor and a Vanadium Redox Battery. The operation of the system is validated in two case studies on the experimental platform installed at ESTIA. The experimental results prove the validity of the proposed system as well as the designed control algorithm. The good agreement between experimental and simulation results also validates the simulation model, which can therefore be used to analyse the operation of the system in different case studies.

  1. Autonomous low-power magnetic data collection platform to enable remote high latitude array deployment.

    PubMed

    Musko, Stephen B; Clauer, C Robert; Ridley, Aaron J; Arnett, Kenneth L

    2009-04-01

    A major driver in the advancement of the geophysical sciences is improvement in the quality and resolution of data for use in scientific analysis, discovery, and for assimilation into or validation of empirical and physical models. The need for more and better measurements, together with improvements in technical capabilities, is driving the ambition to deploy arrays of autonomous geophysical instrument platforms in remote regions. This is particularly true in the southern polar regions, where measurements are presently sparse due to the remoteness, lack of infrastructure, and harshness of the environment. The need for the acquisition of continuous long-term data from remote polar locations exists across geophysical disciplines and is a generic infrastructure problem. The infrastructure to support autonomous instrument platforms in polar environments, however, is still in the early stages of development. We report here the development of an autonomous low-power magnetic variation data collection system. Following 2 years of field testing at South Pole Station, the system is being reproduced to establish a dense chain of stations on the Antarctic plateau along the 40 degrees magnetic meridian. The system is designed to operate unattended for at least 5 years and to provide data access via satellite communication. The system will store 1 s measurements of the magnetic field variation (<0.2 nT resolution) in three vector components, plus a variety of engineering status and environment parameters. We believe that the data collection platform can be utilized by a variety of low-power instruments designed for low-temperature operation. The design, technical characteristics, and operational results are presented here.

  2. Mission demonstration concept for the long-duration storage and transfer of cryogenic propellants

    NASA Astrophysics Data System (ADS)

    McLean, C.; Deininger, W.; Ingram, K.; Schweickart, R.; Unruh, B.

    This paper describes an experimental platform that will demonstrate the major technologies required for the handling and storage of cryogenic propellants in a low-to-zero-g environment. In order to develop a cost-effective, high value-added demonstration mission, a review of the complete mission concept of operations (CONOPS) was performed. The overall cost of such a mission is driven not only by the spacecraft platform and on-orbit experiments themselves, but also by the complexities of handling cryogenic propellants during ground-processing operations. On-orbit storage methodologies were looked at for both passive and active systems. Passive systems rely purely on isolation of the stored propellant from environmental thermal loads, while active cooling employs cryocooler technologies. The benefit trade between active and passive systems is mission-dependent due to the mass, power, and system-level penalties associated with active cooling systems. The experimental platform described in this paper is capable of demonstrating multiple advanced micro-g cryogenic propellant management technologies. In addition to the requirements of demonstrating these technologies, the methodology of propellant transfer must be evaluated. The handling of multiphase liquids in micro-g is discussed using flight-heritage micro-g propellant management device technologies as well as accelerated tank stratification for access to vapor-free or liquid-free propellants. The mission concept presented shows the extensibility of the experimental platform to demonstrate advanced cryogenic components and technologies, propellant transfer methodologies, as well as the validation of thermal and fluidic models, from subscale tankage to an operational architecture.

  3. Mechanical evaluation of articulating instruments and cross-handed manipulation in laparoendoscopic single-site surgery.

    PubMed

    Xu, An An; Zhu, Jiang Fan; Xie, Xiaofeng; Su, Yuantao

    2014-08-01

    Laparoendoscopic single-site surgery (LESS) is limited by loss of triangulation and internal instrument conflict. To overcome these difficulties, some concepts have been introduced, namely articulating instruments and cross-handed manipulation, in which the right hand controls the left instrument tip and vice versa. The aim of this study was to compare task performance with different approaches based on a mechanical evaluation platform. A LESS mechanical evaluation platform was set up to investigate the performance of 2 tasks (suture pass-through-rings and clip-cut) with 3 different settings: uncrossed manipulation with straight instruments (group A, the control group), uncrossed manipulation with articulating instruments (group B), and cross-handed manipulation with articulating instruments (group C). The operation time and average load required for accomplishment of the standard tasks were measured. Group A presented significantly better time scores than group B, and group C consumed the longest time to accomplish the 2 tasks (P < .05). The average load required to perform the suture pass-through-rings task differed significantly between the dominant and nondominant hands in all groups (P < .01) and was lower in groups A and B than in group C for the dominant hand (P < .01), while it was almost the same across all groups for the nondominant hand. The average load required to accomplish the clip-cut task was almost equal not only between groups A and B but also between the dominant and nondominant hands, while the increase reached statistical significance when comparing group C with the other groups (P < .05). Compared with conventional devices and maneuvering techniques, articulating instruments and cross-handed manipulation are associated with longer operation times and higher workloads. Instruments with better maneuverability should be developed in the future for LESS. © The Author(s) 2013.

  4. A comparative analysis of high-throughput platforms for validation of a circulating microRNA signature in diabetic retinopathy.

    PubMed

    Farr, Ryan J; Januszewski, Andrzej S; Joglekar, Mugdha V; Liang, Helena; McAulley, Annie K; Hewitt, Alex W; Thomas, Helen E; Loudovaris, Tom; Kay, Thomas W H; Jenkins, Alicia; Hardikar, Anandwardhan A

    2015-06-02

    MicroRNAs are now increasingly recognized as biomarkers of disease progression. Several quantitative real-time PCR (qPCR) platforms have been developed to determine the relative levels of microRNAs in biological fluids. We systematically compared the detection of cellular and circulating microRNA using a standard 96-well platform, a high-content microfluidics platform, and two ultra-high-content platforms. We used extensive analytical tools to compute inter- and intra-run variability and concordance, measured using fidelity scoring, coefficient of variation, and cluster analysis. We carried out unbiased next-generation sequencing to identify a microRNA signature for Diabetic Retinopathy (DR) and systematically assessed the validation of this signature on clinical samples using each of the above four qPCR platforms. The results indicate that the sensitivity to measure low-copy-number microRNAs is inversely related to qPCR reaction volume, and that the choice of platform for microRNA biomarker validation should be made based on the abundance of the miRNAs of interest.

  5. Hierarchical Engine for Large-scale Infrastructure Co-Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2017-04-24

    HELICS is designed to support very-large-scale (100,000+ federates) co-simulations with off-the-shelf power-system, communication, market, and end-use tools. Other key features include cross-platform operating system support, the integration of both event-driven (e.g., packetized communication) and time-series (e.g., power flow) simulations, and the ability to co-iterate among federates to ensure physical model convergence at each time step.
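
    The co-iteration feature mentioned above can be pictured as a fixed-point loop at each time step: each federate recomputes its outputs from the latest shared state until the values stop changing. A minimal Python sketch of that loop (a generic stand-in, not the HELICS API; the two toy federates are invented):

    ```python
    def co_iterate(federates, max_iters=50, tol=1e-6):
        """Fixed-point co-iteration at one time step over a dict of
        name -> function(state) federate models."""
        state = {name: 0.0 for name in federates}
        for _ in range(max_iters):
            new_state = {name: f(state) for name, f in federates.items()}
            converged = all(abs(new_state[k] - state[k]) < tol for k in state)
            state = new_state
            if converged:
                return state
        raise RuntimeError("co-iteration failed to converge")

    # Two toy federates coupled through each other's outputs
    feds = {
        "power_flow": lambda s: 1.0 + 0.5 * s["market"],
        "market":     lambda s: 0.2 * s["power_flow"],
    }
    print(co_iterate(feds))   # converges to power_flow=10/9, market=2/9
    ```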

  6. CAGE IIIA Distributed Simulation Design Methodology

    DTIC Science & Technology

    2014-05-01

    [Acronym-list fragment from the source document: VHF, Very High Frequency; VLC, Video LAN Codec, an open-source cross-platform multimedia player and framework; VM, Virtual Machine; VOIP, Voice Over IP] ...Implementing Defence Experimentation (GUIDEx). The key challenges for this methodology are understanding how to: design it; define the ...operation and to be available in the other nation's simulations. The challenge for the CAGE campaign of experiments is to continue to build upon this

  7. Digest of celestial X-ray missions and experiments

    NASA Technical Reports Server (NTRS)

    Locke, M. C.

    1982-01-01

    Information on instruments, the platforms that carried them, and the data they gathered is presented. Instrument selection was confined to detectors operating in the 0.20 to 300 keV range. Included are brief descriptions of the spacecraft, experiment packages and missions. Cross-referenced indexes are provided for types of instruments, energy ranges, time spans covered, positional catalogs and observational catalogs. Data sets from these experiments (NSSDC) are described.

  8. NPP ATMS Prelaunch Performance Assessment and Sensor Data Record Validation

    DTIC Science & Technology

    2011-04-29

    ...ATMS to sense scattering of cold cosmic background radiance from the tops of precipitating clouds allows the retrieval of precipitation intensities... operational and research missions over the last 40 years. The Cross-track Infrared and Microwave Sounding Suite (CrIMSS), consisting of the Cross-track... Infrared Sounder (CrIS) and the first space-based, Nyquist-sampled cross-track microwave sounder, the Advanced Technology Microwave Sounder (ATMS), will

  9. PIV Measurements of the CEV Hot Abort Motor Plume for CFD Validation

    NASA Technical Reports Server (NTRS)

    Wernet, Mark; Wolter, John D.; Locke, Randy; Wroblewski, Adam; Childs, Robert; Nelson, Andrea

    2010-01-01

    NASA's next manned launch platforms for missions to the Moon and Mars are the Orion and Ares systems. Many critical aspects of the launch system performance are being verified using computational fluid dynamics (CFD) predictions. The Orion Launch Abort Vehicle (LAV) consists of a tower-mounted tractor rocket tasked with carrying the Crew Module (CM) safely away from the launch vehicle in the event of a catastrophic failure during the vehicle's ascent. Some of the predictions involving the launch abort system flow fields produced conflicting results, which required further investigation through ground test experiments. Ground tests were performed to acquire data from a hot supersonic jet in cross-flow for the purpose of validating CFD turbulence modeling relevant to the Orion LAV. Both 2-component axial-plane Particle Image Velocimetry (PIV) and 3-component cross-stream Stereo Particle Image Velocimetry (SPIV) measurements were obtained on a model of an Abort Motor (AM). Actual flight conditions could not be simulated on the ground, so the highest temperature and pressure conditions that could be safely used in the test facility (a nozzle pressure ratio of 28.5 and a nozzle temperature ratio of 3) were used for the validation tests. These conditions are significantly different from those of the flight vehicle, but were sufficiently high to begin addressing the turbulence modeling issues that prompted the need for the validation tests.

  10. Leveraging Existing Mission Tools in a Re-Usable, Component-Based Software Environment

    NASA Technical Reports Server (NTRS)

    Greene, Kevin; Grenander, Sven; Kurien, James; O'Reilly, Taifun

    2006-01-01

    Emerging methods in component-based software development offer significant advantages but may seem incompatible with existing mission operations applications. In this paper we relate our positive experiences integrating existing mission applications into component-based tools we are delivering to three missions. In most operations environments, a number of software applications have been integrated together to form the mission operations software. In contrast, with component-based software development, chunks of related functionality and data structures, referred to as components, can be individually delivered, integrated, and re-used. With the advent of powerful tools for managing component-based development, complex software systems can potentially see significant benefits in ease of integration, testability, and reusability from these techniques. These benefits motivate us to ask how component-based development techniques can be relevant in a mission operations environment, where there is significant investment in software tools that are not component-based and may not be written in languages for which component-based tools even exist. Trusted and complex software tools for sequencing, validation, navigation, and other vital functions cannot simply be re-written or abandoned in order to gain the advantages offered by emerging component-based software techniques. Thus some middle ground must be found. We have faced exactly this issue, and have found several solutions. Ensemble is an open platform for development, integration, and deployment of mission operations software that we are developing. Ensemble itself is an extension of an open-source, component-based software development platform called Eclipse. Due to the advantages of component-based development, we have been able to very rapidly develop mission operations tools for three surface missions by mixing and matching from a common set of mission operations components. We have also had to determine how to integrate existing mission applications for sequence development, sequence validation, high-level activity planning, and other functions into a component-based environment. For each of these, we used a somewhat different technique based upon the structure and usage of the existing application.

  11. Modelling and attenuation feasibility of the aeroelastic response of active helicopter rotor systems during the engagement/disengagement phase of maritime operation

    NASA Astrophysics Data System (ADS)

    Khouli, F.

    An aeroelastic phenomenon, known as blade sailing, encountered during maritime operation of helicopters is identified as being a factor that limits the tactical flexibility of helicopter operation in some sea conditions. The hazards associated with this phenomenon and its complexity, owing to the number of factors contributing to its occurrence, led previous investigators to conclude that advanced and validated simulation tools are best suited to investigate it. A research gap is identified in terms of scaled experimental investigation of this phenomenon and practical engineering solutions to alleviate its negative impact on maritime helicopter operation. The feasibility of a proposed strategy to alleviate it required addressing a gap in modelling thin-walled composite active beams/rotor blades. The modelling is performed by extending a mathematically-consistent and asymptotic reduction strategy of the 3-D elastic problem to account for embedded active materials. The derived active cross-sectional theory is validated using 2-D finite element results for closed and open cross-sections. The geometrically-exact intrinsic formulation of active maritime rotor systems is demonstrated to yield compact and symbolic governing equations. The intrinsic feature is shown to allow a classical and proven solution scheme to be successfully applied to obtain time history solutions. A Froude-scaled experimental rotor was designed, built, and tested in a scaled ship airwake environment and representative ship motion. Based on experimental and simulations data, conclusions are drawn regarding the influence of the maritime operation environment and the rotor operation parameters on the blade sailing phenomenon. The experimental data is also used to successfully validate the developed simulation tools. The feasibility of an open-loop control strategy based on the integral active twist concept to counter blade sailing is established in a Mach-scaled maritime operation environment. Recommendations are proposed to improve the strategy and further establish its validity in a full-scale maritime operation environment.

  12. A Novel Low-Cost Open-Hardware Platform for Monitoring Soil Water Content and Multiple Soil-Air-Vegetation Parameters

    PubMed Central

    Bitella, Giovanni; Rossi, Roberta; Bochicchio, Rocco; Perniola, Michele; Amato, Mariana

    2014-01-01

    Monitoring soil water content at high spatio-temporal resolution, coupled to other sensor data, is crucial for applications oriented towards water sustainability in agriculture, such as precision irrigation or phenotyping root traits for drought tolerance. The cost of instrumentation, however, limits measurement frequency and the number of sensors. The objective of this work was to design a low-cost "open hardware" platform for multi-sensor measurements including water content at different depths and air and soil temperatures. The system is based on an open-source ARDUINO microcontroller board, programmed in a simple integrated development environment (IDE). Low-cost high-frequency dielectric probes were used in the platform and lab-tested on three non-saline soils (EC 1:2.5 < 0.1 mS/cm). Empirical calibration curves were subjected to cross-validation (leave-one-out method); the normalized root mean square errors (NRMSE) were 0.09 for the overall model, 0.09 for the sandy soil, 0.07 for the clay loam, and 0.08 for the sandy loam. The overall model (pooled soil data) fitted the data very well (R2 = 0.89) and showed high stability, generating very similar RMSEs during training and validation (RMSE training = 2.63; RMSE validation = 2.61). Data recorded on the card were automatically sent to a remote server, allowing repeated field-data quality checks. This work provides a framework for the replication and upgrading of a customized low-cost platform, consistent with the open-source approach whereby sharing information on equipment design and software facilitates the adoption and continuous improvement of existing technologies. PMID:25337742
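
    The leave-one-out cross-validation of a calibration curve reported here is straightforward to reproduce in outline. A Python sketch with synthetic probe data (the counts, coefficients, and noise level are invented; NRMSE is normalized by the observed range, one common convention):

    ```python
    import numpy as np
    from sklearn.linear_model import LinearRegression
    from sklearn.model_selection import LeaveOneOut, cross_val_predict

    rng = np.random.default_rng(2)
    # Stand-in calibration data: probe reading (counts) vs volumetric water content (%)
    counts = rng.uniform(200, 800, size=40).reshape(-1, 1)
    vwc = 0.05 * counts.ravel() + rng.normal(scale=2.0, size=40)

    # Leave-one-out cross-validation of the calibration curve
    pred = cross_val_predict(LinearRegression(), counts, vwc, cv=LeaveOneOut())
    rmse = np.sqrt(np.mean((pred - vwc) ** 2))
    nrmse = rmse / (vwc.max() - vwc.min())   # range normalization (a common choice)
    print(round(rmse, 2), round(nrmse, 3))
    ```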

  13. Comparison of statistical methods for detection of serum lipid biomarkers for mesothelioma and asbestos exposure.

    PubMed

    Xu, Rengyi; Mesaros, Clementina; Weng, Liwei; Snyder, Nathaniel W; Vachani, Anil; Blair, Ian A; Hwang, Wei-Ting

    2017-07-01

    We compared three statistical methods for selecting a panel of serum lipid biomarkers for mesothelioma and asbestos exposure. Serum samples from mesothelioma patients, asbestos-exposed subjects and controls (40 per group) were analyzed. Three variable selection methods were considered: top-ranked predictors from univariate models, stepwise selection, and the least absolute shrinkage and selection operator (LASSO). Cross-validated area under the receiver operating characteristic curve was used to compare prediction performance. Lipids with high cross-validated area under the curve were identified. A lipid with a mass-to-charge ratio of 372.31 was selected by all three methods when comparing mesothelioma versus control. Lipids with mass-to-charge ratios of 1464.80 and 329.21 were selected by two models for asbestos exposure versus control. The different methods selected a similar set of serum lipids. Combining candidate biomarkers can improve prediction.
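
    As a rough illustration of the third selection method, the sketch below runs an L1-penalised (LASSO-style) logistic regression on a synthetic stand-in for the 40-versus-40 lipid matrix and scores it by cross-validated AUC. The data, penalty strength, and fold count are assumptions, not the paper's settings.

    ```python
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    # Synthetic stand-in: 80 samples (cases vs controls), 200 lipid features
    X, y = make_classification(n_samples=80, n_features=200, n_informative=5,
                               random_state=0)

    # The L1 penalty drives most lipid coefficients to exactly zero (selection)
    lasso = LogisticRegression(penalty="l1", solver="liblinear", C=0.1)
    auc = cross_val_score(lasso, X, y, cv=5, scoring="roc_auc").mean()
    print(f"cross-validated AUC = {auc:.2f}")

    selected = np.flatnonzero(lasso.fit(X, y).coef_.ravel())  # retained lipid indices
    print("selected features:", selected)
    ```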

  14. Low-cost, efficient wireless intelligent sensors (LEWIS) measuring real-time reference-free dynamic displacements

    NASA Astrophysics Data System (ADS)

    Ozdagli, A. I.; Liu, B.; Moreu, F.

    2018-07-01

    According to railroad managers, the displacement of railroad bridges under service loads is an important parameter in condition assessment and performance evaluation. However, measuring bridge responses in the field is often costly and labor-intensive. This paper proposes a low-cost, efficient wireless intelligent sensor (LEWIS) platform that can compute, in real time, the dynamic transverse displacements of railroad bridges under service loads. The sensing platform is built on the open-source Arduino ecosystem and combines low-cost microcontrollers with affordable accelerometers and wireless transmission modules. The proposed LEWIS system is designed to reconstruct dynamic displacements from acceleration measurements onboard, eliminating the need for offline post-processing, and to transmit the data in real time to a base station, where an inspector at the bridge can see the displacements while the train is crossing, or over the internet to a remote office if desired. Researchers validated the effectiveness of the new LEWIS by conducting a series of laboratory experiments. A shake table setup simulated transverse bridge displacements measured in the field and excited the proposed platform, an expensive commercially available wired accelerometer, and a reference LVDT displacement sensor. The responses obtained from the wireless system were compared to the displacements reconstructed from the commercial accelerometer readings and to the reference LVDT. The results of the laboratory experiments demonstrate that the proposed system is capable of accurately reconstructing transverse displacements of railroad bridges under revenue service traffic and transmitting the data wirelessly in real time. In conclusion, the platform presented in this paper can be used for cost-effective and accurate performance assessment of railroad bridge networks. Future work includes collecting real-time reference-free displacements of a railroad bridge in Colorado under train crossings to further prove LEWIS' suitability for engineering applications.
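
    One common reference-free way to reconstruct displacement from acceleration, sketched below on a synthetic 2 Hz, 5 mm motion, is to high-pass filter and numerically integrate twice. The filter order, cutoff, and sampling rate are assumptions for illustration, not necessarily the algorithm running onboard LEWIS.

    ```python
    import numpy as np
    from scipy.signal import butter, filtfilt
    from scipy.integrate import cumulative_trapezoid

    fs = 100.0                                    # sampling rate (Hz), assumed
    t = np.arange(0, 10, 1 / fs)
    f0, amp = 2.0, 0.005                          # 2 Hz bridge sway, 5 mm amplitude
    acc = -(2 * np.pi * f0) ** 2 * amp * np.sin(2 * np.pi * f0 * t)

    def highpass(x, fc=0.5):
        """Zero-phase high-pass to suppress the drift that integration amplifies."""
        b, a = butter(4, fc / (fs / 2), btype="high")
        return filtfilt(b, a, x)

    vel = cumulative_trapezoid(highpass(acc), t, initial=0.0)
    disp = highpass(cumulative_trapezoid(highpass(vel), t, initial=0.0))
    print(f"peak displacement = {1000 * np.max(np.abs(disp)):.1f} mm (true 5.0 mm)")
    ```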

  15. Examining System-Wide Impacts of Solar PV Control Systems with a Power Hardware-in-the-Loop Platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Tess L.; Fuller, Jason C.; Schneider, Kevin P.

    2014-06-08

    High penetration levels of distributed solar PV power generation can lead to adverse power quality impacts, such as excessive voltage rise, voltage flicker, and reactive power values that result in unacceptable voltage levels. Advanced inverter control schemes have been developed that have the potential to mitigate many power quality concerns. However, local closed-loop control may lead to unintended behavior in deployed systems as complex interactions can occur between numerous operating devices. To enable the study of the performance of advanced control schemes in a detailed distribution system environment, a test platform has been developed that integrates Power Hardware-in-the-Loop (PHIL) with concurrent time-series electric distribution system simulation. In the test platform, GridLAB-D, a distribution system simulation tool, runs a detailed simulation of a distribution feeder in real-time mode at the Pacific Northwest National Laboratory (PNNL) and supplies power system parameters at a point of common coupling. At the National Renewable Energy Laboratory (NREL), a hardware inverter interacts with grid and PV simulators emulating an operational distribution system. Power output from the inverters is measured and sent to PNNL to update the real-time distribution system simulation. The platform is described and initial test cases are presented. The platform is used to study the system-wide impacts and the interactions of inverter control modes—constant power factor and active Volt/VAr control—when integrated into a simulated IEEE 8500-node test feeder. We demonstrate that this platform is well-suited to the study of advanced inverter controls and their impacts on the power quality of a distribution feeder. Additionally, results are used to validate GridLAB-D simulations of advanced inverter controls.
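
    The exchange pattern described above (the simulation supplies the voltage at the point of common coupling, the hardware returns its measured power, and the simulation is updated) can be caricatured in a few lines. Everything below is a hypothetical toy model, not the GridLAB-D/NREL implementation; the first-order smoothing stands in for the inverter's response time, and removing it makes this toy loop hunt between states, exactly the kind of control interaction such a platform is built to expose.

    ```python
    def feeder_voltage(p_inj, q_inj, v_nom=1.0, r=0.02, x=0.05):
        """Mock feeder: linearised PCC voltage rise from injected P/Q (per unit)."""
        return v_nom + r * p_inj + x * q_inj

    def volt_var(v, q_max=0.44, droop=10.0):
        """Mock volt/VAr control: absorb VArs when voltage is high, inject when low."""
        return max(-q_max, min(q_max, -droop * (v - 1.0)))

    p, q = 0.8, 0.0                        # PV active power held constant (pu)
    for step in range(10):                 # simulator <-> hardware exchange loop
        v = feeder_voltage(p, q)           # "simulation" -> "hardware": PCC voltage
        q = 0.7 * q + 0.3 * volt_var(v)    # "hardware" -> "simulation": measured Q
        print(f"step {step}: V = {v:.4f} pu, Q = {q:+.4f} pu")
    ```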

  16. In-Flight Thermal Performance of the Lidar In-Space Technology Experiment

    NASA Technical Reports Server (NTRS)

    Roettker, William

    1995-01-01

    The Lidar In-Space Technology Experiment (LITE) was developed at NASA's Langley Research Center to explore the applications of lidar operated from an orbital platform. As a technology demonstration experiment, LITE was developed to gain experience designing and building future operational orbiting lidar systems. Since LITE was the first lidar system to be flown in space, an important objective was to validate instrument design principles in such areas as thermal control, laser performance, instrument alignment and control, and autonomous operations. Thermal and structural analysis models of the instrument were developed during the design process to predict the behavior of the instrument during its mission. In order to validate those mathematical models, extensive engineering data was recorded during all phases of LITE's mission. This in-flight engineering data was compared with preflight predictions and, when required, adjustments to the thermal and structural models were made to more accurately match the instrument's actual behavior. The results of this process for the thermal analysis and design of LITE are presented in this paper.

  17. Body Dysmorphic Symptoms Scale for patients seeking esthetic surgery: cross-cultural validation study.

    PubMed

    Ramos, Tatiana Dalpasquale; Brito, Maria José Azevedo de; Piccolo, Mônica Sarto; Rosella, Maria Fernanda Normanha da Silva Martins; Sabino, Miguel; Ferreira, Lydia Masako

    2016-07-21

    Rhinoplasty is one of the most sought-after esthetic operations among individuals with body dysmorphic disorder. The aim of this study was to cross-culturally adapt and validate the Body Dysmorphic Symptoms Scale. Cross-cultural validation study conducted in a plastic surgery outpatient clinic of a public university hospital. Between February 2014 and March 2015, 80 consecutive patients of both sexes seeking rhinoplasty were selected. Thirty of them participated in the phase of cultural adaptation of the instrument. Reproducibility was tested on 20 patients and construct validity was assessed on 50 patients, with correlation against the Yale-Brown Obsessive Compulsive Scale for Body Dysmorphic Disorder. The Brazilian version of the instrument showed Cronbach's alpha of 0.805 and excellent inter-rater reproducibility (intraclass correlation coefficient, ICC = 0.873; P < 0.001) and intra-rater reproducibility (ICC = 0.939; P < 0.001). Significant differences in total scores were found between patients with and without symptoms (P < 0.001). A strong correlation (r = 0.841; P < 0.001) was observed between the Yale-Brown Obsessive Compulsive Scale for Body Dysmorphic Disorder and the Body Dysmorphic Symptoms Scale. The area under the receiver operating characteristic curve was 0.981, thus showing good accuracy for discriminating between presence and absence of symptoms of body dysmorphic disorder. Forty-six percent of the patients had body dysmorphic symptoms and 54% had moderate to severe appearance-related obsessive-compulsive symptoms. The Brazilian version of the Body Dysmorphic Symptoms Scale is a reproducible instrument that presents face, content and construct validity.

  19. SERS quantitative urine creatinine measurement of human subject

    NASA Astrophysics Data System (ADS)

    Wang, Tsuei Lian; Chiang, Hui-hua K.; Lu, Hui-hsin; Hung, Yung-da

    2005-03-01

    SERS methods for biomolecular analysis have several potential advantages over traditional biochemical approaches, including minimal specimen contact, non-destructive analysis of the specimen, and multi-component analysis. Urine is an easily available body fluid for monitoring the metabolites and renal function of the human body. We developed a surface-enhanced Raman scattering (SERS) technique using 50 nm gold colloidal particles for quantitative human urine creatinine measurements. This paper shows that the SERS shifts of creatinine (104 mg/dl) in artificial urine lie between 1400 cm-1 and 1500 cm-1; this region was analyzed for quantitative creatinine measurement. Ten human urine samples were obtained from ten healthy persons and analyzed by the SERS technique. The partial least squares cross-validation (PLSCV) method was utilized to obtain the estimated creatinine concentration in the clinically relevant concentration range (55.9 mg/dl to 208 mg/dl). The root-mean-square error of cross-validation (RMSECV) is 26.1 mg/dl. This research demonstrates the feasibility of using SERS for human urine creatinine detection and establishes the SERS platform technique for bodily fluid measurement.
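
    The RMSECV quoted above comes from leave-one-out partial least squares regression. A minimal sketch of that procedure on synthetic spectra follows; the Gaussian peak shape, noise level, and component count are assumptions, not the study's data.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import LeaveOneOut, cross_val_predict

    rng = np.random.default_rng(1)
    conc = rng.uniform(55.9, 208.0, 10)               # creatinine, mg/dl (10 subjects)
    shifts = np.linspace(1400, 1500, 50)              # Raman shift axis, cm^-1
    peak = np.exp(-0.5 * ((shifts - 1450) / 12) ** 2)
    spectra = conc[:, None] * peak + rng.normal(0, 2.0, (10, shifts.size))

    # Leave-one-out PLS: predict each subject from a model fit on the other nine
    pred = cross_val_predict(PLSRegression(n_components=2), spectra, conc,
                             cv=LeaveOneOut())
    rmsecv = np.sqrt(np.mean((pred.ravel() - conc) ** 2))
    print(f"RMSECV = {rmsecv:.1f} mg/dl")
    ```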

  20. A novel tablet computer platform for advanced language mapping during awake craniotomy procedures.

    PubMed

    Morrison, Melanie A; Tam, Fred; Garavaglia, Marco M; Golestanirad, Laleh; Hare, Gregory M T; Cusimano, Michael D; Schweizer, Tom A; Das, Sunit; Graham, Simon J

    2016-04-01

    A computerized platform has been developed to enhance behavioral testing during intraoperative language mapping in awake craniotomy procedures. The system is uniquely compatible with the environmental demands of both the operating room and preoperative functional MRI (fMRI), thus providing standardized testing toward improving spatial agreement between the 2 brain mapping techniques. Details of the platform architecture, its advantages over traditional testing methods, and its use for language mapping are described. Four illustrative cases demonstrate the efficacy of using the testing platform to administer sophisticated language paradigms, and the spatial agreement between intraoperative mapping and preoperative fMRI results. The testing platform substantially improved the ability of the surgeon to detect and characterize language deficits. Use of a written word generation task to assess language production helped confirm areas of speech apraxia and speech arrest that were inadequately characterized or missed with the use of traditional paradigms, respectively. Preoperative fMRI of the analogous writing task was also assistive, displaying excellent spatial agreement with intraoperative mapping in all 4 cases. Sole use of traditional testing paradigms can be limiting during awake craniotomy procedures. Comprehensive assessment of language function will require additional use of more sophisticated and ecologically valid testing paradigms. The platform presented here provides a means to do so.

  1. Automated Cough Assessment on a Mobile Platform

    PubMed Central

    2014-01-01

    The development of an Automated System for Asthma Monitoring (ADAM) is described. This consists of a consumer electronics mobile platform running a custom application. The application acquires an audio signal from an external user-worn microphone connected to the device analog-to-digital converter (microphone input). This signal is processed to determine the presence or absence of cough sounds. Symptom tallies and raw audio waveforms are recorded and made easily accessible for later review by a healthcare provider. The symptom detection algorithm is based upon standard speech recognition and machine learning paradigms and consists of an audio feature extraction step followed by a Hidden Markov Model based Viterbi decoder that has been trained on a large database of audio examples from a variety of subjects. Multiple Hidden Markov Model topologies and orders are studied. Performance of the recognizer is presented in terms of the sensitivity and the rate of false alarm as determined in a cross-validation test. PMID:25506590
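
    A much-reduced sketch of the recognition paradigm described above: one Gaussian HMM per class is trained on labelled feature frames, and an unknown segment is assigned to the class whose model gives the higher log-likelihood. The hmmlearn models and the synthetic 13-dimensional (MFCC-like) features are stand-ins; the paper's actual system runs a trained Viterbi decoder over a large annotated audio database.

    ```python
    import numpy as np
    from hmmlearn import hmm

    rng = np.random.default_rng(2)
    # Hypothetical pre-extracted feature frames (rows) for each training class
    cough_frames = rng.normal(2.0, 1.0, (500, 13))
    other_frames = rng.normal(0.0, 1.0, (500, 13))

    cough_model = hmm.GaussianHMM(n_components=3, covariance_type="diag",
                                  random_state=0).fit(cough_frames)
    other_model = hmm.GaussianHMM(n_components=3, covariance_type="diag",
                                  random_state=0).fit(other_frames)

    segment = rng.normal(2.0, 1.0, (40, 13))          # unknown audio segment
    scores = {"cough": cough_model.score(segment),    # forward log-likelihoods
              "other": other_model.score(segment)}
    print("detected:", max(scores, key=scores.get))
    ```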

  2. Cross-platform single cell analysis of kidney development shows stromal cells express Gdnf.

    PubMed

    Magella, Bliss; Adam, Mike; Potter, Andrew S; Venkatasubramanian, Meenakshi; Chetal, Kashish; Hay, Stuart B; Salomonis, Nathan; Potter, S Steven

    2018-02-01

    The developing kidney provides a useful model for study of the principles of organogenesis. In this report we use three independent platforms, Drop-Seq, Chromium 10x Genomics and Fluidigm C1, to carry out single cell RNA-Seq (scRNA-Seq) analysis of the E14.5 mouse kidney. Using the software AltAnalyze, in conjunction with the unsupervised approach ICGS, we were able to identify and confirm the presence of 16 distinct cell populations during this stage of active nephrogenesis. Using a novel integrative supervised computational strategy, we were able to successfully harmonize and compare the cell profiles across all three technological platforms. Analysis of possible cross-compartment receptor/ligand interactions identified the nephrogenic zone stroma as a source of GDNF. This was unexpected because the cap mesenchyme nephron progenitors had been thought to be the sole source of GDNF, which is a key driver of branching morphogenesis of the collecting duct system. The expression of Gdnf by stromal cells was validated in several ways, including Gdnf in situ hybridization combined with immunohistochemistry for SIX2, a marker of nephron progenitors, and MEIS1, a marker of stromal cells. Finally, the single cell gene expression profiles generated in this study confirmed and extended previous work showing the presence of multilineage priming during kidney development. Nephron progenitors showed stochastic expression of genes associated with multiple potential differentiation lineages. Copyright © 2017 Elsevier Inc. All rights reserved.

  4. LIVVkit: An extensible, python-based, land ice verification and validation toolkit for ice sheet models

    DOE PAGES

    Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.; ...

    2017-03-23

    To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Furthermore, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.
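
    Among the checks listed above, bit-for-bit evaluation is the strictest: two model outputs must match exactly, value for value. A minimal sketch of such a check follows, with hypothetical arrays standing in for model output variables; LIVVkit's real implementation and configuration handling are more involved.

    ```python
    import numpy as np

    def bit_for_bit(test, ref):
        """True only if shapes and every value match exactly (NaNs aligned)."""
        return test.shape == ref.shape and np.array_equal(test, ref, equal_nan=True)

    ref = np.array([273.15, 260.0, np.nan])    # toy reference ice-sheet output
    test = ref.copy()
    test[1] += 1e-12                           # a last-bit perturbation fails the check
    print(bit_for_bit(ref, ref), bit_for_bit(test, ref))   # True False
    ```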

  5. [Design of visualized medical images network and web platform based on MeVisLab].

    PubMed

    Xiang, Jun; Ye, Qing; Yuan, Xun

    2017-04-01

    With the trend of "Internet+" development, further requirements for the mobility of medical images have arisen in the medical field. In view of this demand, this paper presents a web-based visual medical imaging platform. First, the feasibility and key technical points of web-based medical image visualization are analyzed. CT (computed tomography) or MRI (magnetic resonance imaging) images are reconstructed three-dimensionally by MeVisLab and packaged as X3D (Extensible 3D Graphics) files, as shown in the present paper. Then, a B/S (Browser/Server) system specially designed for 3D images is built using HTML5 and a WebGL rendering engine library; the system parses and renders the X3D image files. The results of this study showed that the platform is suitable for multiple operating systems, achieving cross-platform access and mobility of medical image data. Future development directions of the medical imaging platform are also discussed: web application technology will not only promote the sharing of medical image data, but also facilitate image-based remote medical consultations and distance learning.

  6. 3D printed rapid disaster response

    NASA Astrophysics Data System (ADS)

    Lacaze, Alberto; Murphy, Karl; Mottern, Edward; Corley, Katrina; Chu, Kai-Dee

    2014-05-01

    Under the Department of Homeland Security-sponsored Sensor-smart Affordable Autonomous Robotic Platforms (SAARP) project, Robotic Research, LLC is developing an affordable and adaptable method to provide disaster response robots developed with 3D printer technology. The SAARP Store contains a library of robots, a developer storefront, and a user storefront. The SAARP Store allows the user to select, print, assemble, and operate the robot. In addition to the SAARP Store, two platforms are currently being developed. They use a set of common non-printed components that will allow the later design of other platforms that share non-printed components. During disasters, new challenges are faced that require customized tools or platforms. Instead of prebuilt and prepositioned supplies, a library of validated robots will be catalogued to satisfy various challenges at the scene. 3D printing components will allow these customized tools to be deployed in a fraction of the time that would normally be required. While the current system is focused on supporting disaster response personnel, this system will be expandable to a range of customers, including domestic law enforcement, the armed services, universities, and research facilities.

  7. An overview of sensor calibration inter-comparison and applications

    USGS Publications Warehouse

    Xiong, Xiaoxiong; Cao, Changyong; Chander, Gyanesh

    2010-01-01

    Long-term climate data records (CDR) are often constructed using observations made by multiple Earth observing sensors over a broad range of spectra and a large scale in both time and space. These sensors can be of the same or different types operated on the same or different platforms. They can be developed and built with different technologies and are likely operated over different time spans. It has been known that the uncertainty of climate models and data records depends not only on the calibration quality (accuracy and stability) of individual sensors, but also on their calibration consistency across instruments and platforms. Therefore, sensor calibration inter-comparison and validation have become increasingly demanding and will continue to play an important role for a better understanding of the science product quality. This paper provides an overview of different methodologies, which have been successfully applied for sensor calibration inter-comparison. Specific examples using different sensors, including MODIS, AVHRR, and ETM+, are presented to illustrate the implementation of these methodologies.

  8. Increased anatomic severity predicts outcomes: validation of the American Association for the Surgery of Trauma's emergency general surgery score in appendicitis

    PubMed Central

    Hernandez, Matthew; Aho, Johnathan M.; Habermann, Elizabeth B.; Choudhry, Asad; Morris, David; Zielinski, Martin

    2016-01-01

    Background: Determination and reporting of disease severity in emergency general surgery (EGS) lacks standardization. Recently, the American Association for the Surgery of Trauma (AAST) proposed an anatomic severity grading system. We aimed to validate this system in patients with appendicitis and to determine whether cross-sectional imaging correlates with disease severity at operation. Methods: Patients 18 years or older undergoing treatment for acute appendicitis between 2013 and 2015 were identified. Baseline demographics and procedure types were recorded, and AAST grades were assigned based on intraoperative and radiologic findings. Outcomes included length of stay, 30-day mortality, and complications based on Clavien-Dindo categories and National Surgical Quality Improvement Program variables. Univariate summary statistics and nominal logistic and standard least-squares analyses were performed comparing AAST grade with key outcomes. Bland-Altman analysis compared operative findings with preoperative cross-sectional imaging for assigning grades. Results: 334 patients with a mean (±SD) age of 39.3 years (±16.5) were included (53% male), and all patients had cross-sectional imaging. 299 underwent appendectomy, of which 85% were completed laparoscopically. The 30-day mortality rate was 0.9% and the complication rate 21%. A higher median [IQR] AAST grade was recorded in patients with complications (2 [1-4]) compared to those without (1 [1-1]), p=0.001. For operative management, median [IQR] AAST grades were significantly associated with procedure type: laparoscopic 1 [1-1], open 4 [2-5], conversion to open 3 [1-4], p=0.001. In non-operative management, higher median [IQR] AAST grades were also significantly associated with complications: patients with a complication had a higher median AAST grade (4 [3-5]) than those without (3 [2-3]), p=0.001. Bland-Altman analysis comparing AAST grades from operative findings and cross-sectional imaging demonstrated no difference (bias −0.02 ± 0.02, p = 0.2; coefficient of repeatability 0.9). Conclusions: The AAST grading system is valid in our population. Increased AAST grade is associated with open procedures, complications, and length of stay. AAST EGS grade determined by preoperative imaging strongly correlated with operative findings. PMID:27805996
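
    The agreement statistic reported above is easy to compute. The sketch below generates hypothetical paired grades (imaging versus operative) and derives the Bland-Altman bias, 95% limits of agreement, and a coefficient of repeatability taken here as 1.96 x SD of the paired differences (definitions of the coefficient vary between authors).

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    imaging = rng.integers(1, 6, 50).astype(float)     # hypothetical CT grades (1-5)
    operative = imaging + rng.normal(0, 0.45, 50)      # hypothetical operative grades

    diff = imaging - operative
    bias = diff.mean()
    sd = diff.std(ddof=1)
    lo, hi = bias - 1.96 * sd, bias + 1.96 * sd        # 95% limits of agreement
    cr = 1.96 * sd                                     # coefficient of repeatability
    print(f"bias = {bias:+.2f}, LoA = [{lo:.2f}, {hi:.2f}], CR = {cr:.2f}")
    ```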

  9. 360° Operative Videos: A Randomised Cross-Over Study Evaluating Attentiveness and Information Retention.

    PubMed

    Harrington, Cuan M; Kavanagh, Dara O; Wright Ballester, Gemma; Wright Ballester, Athena; Dicker, Patrick; Traynor, Oscar; Hill, Arnold; Tierney, Sean

    2017-11-06

    Although two-dimensional (2D) and three-dimensional videos have traditionally provided the foundation for reviewing operative procedures, the recent 360° format may add new dimensions to surgical education. This study sought to describe the production of a high-quality 360° video of an index operation (augmented with educational material), while evaluating variances in attentiveness, information retention, and appraisal compared to 2D. A six-camera synchronised array (GoPro Omni, California, United States) was suspended inverted and recorded an elective laparoscopic cholecystectomy in 2016. A single-blinded randomised cross-over study was performed to evaluate this video in 360° versus 2D format. Group A experienced the 360° video using Samsung (Suwon, South Korea) GearVR virtual-reality headsets, followed by the 2D experience on a 75-inch television; Group B experienced the formats in reverse order. Each video was probed at designated time points for engagement levels and task-unrelated images or thoughts. Alternating question banks were administered following each video experience, and feedback was obtained via a short survey at study completion. The study took place in the New Academic and Education Building (NAEB), Royal College of Surgeons in Ireland, Dublin, in July 2017; participants were preclinical undergraduate students from a medical university in Ireland. Forty students participated, with a mean age of 23.2 ± 4.5 years and equal sex distribution. The 360° video demonstrated significantly higher engagement (p < 0.01) throughout the experience and fewer task-unrelated images or thoughts (p < 0.01). No significant difference in information retention was found between the two groups (p = 0.143), but most participants (65%) reported the 360° video as their learning platform of choice. Mean appraisal of the 360° platform was positive, with mean responses above 8/10 for learning, immersion, and entertainment. This study describes the successful development and evaluation of a 360° operative video. The new video format demonstrated significant engagement and attentiveness benefits compared to traditional 2D formats, and warrants further evaluation in the field of technology-enhanced learning. Copyright © 2017 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.

  10. We-Measure: Toward a low-cost portable posturography for patients with multiple sclerosis using the commercial Wii balance board.

    PubMed

    Castelli, Letizia; Stocchi, Luca; Patrignani, Maurizio; Sellitto, Giovanni; Giuliani, Manuela; Prosperini, Luca

    2015-12-15

    This study investigated whether postural sway measures derived from a standard force platform were similar to those generated by custom-written software ("We-Measure") acquiring and processing data from a commercial Nintendo Wii balance board (BB). For this purpose, 90 patients with multiple sclerosis (MS) and 50 healthy controls (HC) were tested in a single-day session with a reference standard force platform and the BB-based system. Despite acceptable between-device agreement (assessed by visual evaluation of Bland-Altman plots), the low-cost BB-based system tended to overestimate postural sway when compared to the reference standard force platform in both the MS and HC groups (on average +30% and +54%, respectively). Between-device reliability was just adequate (MS: 66%, HC: 47%), while test-retest reliability was excellent (MS: 84%, HC: 88%). Concurrent validity evaluation showed similar performance between the reference standard force platform and the BB-based system in discriminating fallers from non-fallers among patients with MS. These findings may encourage the use of this new balance-board-based device in longitudinal studies rather than cross-sectional designs, providing a potentially useful tool for multicenter settings. Copyright © 2015 Elsevier B.V. All rights reserved.

  11. A new tool for converting food frequency questionnaire data into nutrient and food group values: FETA research methods and availability

    PubMed Central

    Mulligan, Angela A; Luben, Robert N; Bhaniani, Amit; Parry-Smith, David J; O'Connor, Laura; Khawaja, Anthony P; Forouhi, Nita G; Khaw, Kay-Tee

    2014-01-01

    Objectives: To describe the research methods for the development of a new open-source, cross-platform tool which processes data from the European Prospective Investigation into Cancer and Nutrition Norfolk Food Frequency Questionnaire (EPIC-Norfolk FFQ). A further aim was to compare nutrient and food group values derived from the current tool (FETA, FFQ EPIC Tool for Analysis) with the previously validated but less accessible tool, CAFÉ (Compositional Analyses from Frequency Estimates). The effect of text matching on intake data was also investigated. Design: Cross-sectional analysis of a prospective cohort study (EPIC-Norfolk). Setting: East England population (the city of Norwich and its surrounding small towns and rural areas). Participants: Complete FFQ data from 11 250 men and 13 602 women (mean age 59 years; range 40–79 years). Outcome measures: Nutrient and food group intakes derived from FETA and CAFÉ analyses of EPIC-Norfolk FFQ data. Results: Nutrient outputs from FETA and CAFÉ were similar; mean (SD) energy intake from FETA was 9222 kJ (2633) in men and 8113 kJ (2296) in women, compared with CAFÉ intakes of 9175 kJ (2630) in men and 8091 kJ (2298) in women. The majority of differences resulted in one quintile change or less (98.7%). Mean daily fruit and vegetable food group intakes were the only ones higher in women than in men (278 vs 212 g and 284 vs 255 g, respectively). Quintile changes were evident for all nutrients, with the exception of alcohol, when text matching was not executed; however, only the cereals food group was affected. Conclusions: FETA produces similar nutrient and food group values to the previously validated CAFÉ, but has the advantages of being open source and cross-platform, and comes complete with a data-entry form directly compatible with the software. The tool will facilitate research using the EPIC-Norfolk FFQ and can be customised for different study populations. PMID:24674997
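
    At its core, an FFQ-to-nutrient conversion of the kind FETA performs multiplies a frequency weight by a portion size and a food-composition value, then sums over foods. The toy below uses entirely hypothetical frequency codes, portions, and composition figures to show the shape of the computation, not FETA's actual tables.

    ```python
    # Hypothetical frequency-response weights (occasions per day)
    FREQ_PER_DAY = {"never": 0.0, "1-3/month": 0.07, "once/week": 0.14,
                    "2-4/week": 0.43, "once/day": 1.0}

    # food -> (portion g, kJ per g, fibre g per g); illustrative values only
    COMPOSITION = {"wholemeal bread": (36.0, 9.2, 0.07),
                   "apples": (112.0, 2.0, 0.012)}

    answers = {"wholemeal bread": "2-4/week", "apples": "once/day"}

    energy = fibre = 0.0
    for food, response in answers.items():
        portion, kj_per_g, fibre_per_g = COMPOSITION[food]
        grams = FREQ_PER_DAY[response] * portion      # mean grams consumed per day
        energy += grams * kj_per_g
        fibre += grams * fibre_per_g
    print(f"energy = {energy:.0f} kJ/day, fibre = {fibre:.1f} g/day")
    ```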

  12. A Semantic Cooperation and Interoperability Platform for the European Chambers of Commerce

    NASA Astrophysics Data System (ADS)

    Missikoff, Michele; Taglino, Francesco

    The LD-CAST project aims at developing a semantic cooperation and interoperability platform for the European Chambers of Commerce. Some of the key issues that this platform addresses are: the variety and number of different kinds of resources (i.e., business processes, concrete services) that concur to achieve a business service; the diversity of cultural and procedural models emerging when composing articulated cross-country services; and the limited possibility of reusing similar services in different contexts (for instance, when supporting the same service between different countries, an Italian-Romanian cooperation is different from an Italian-Polish one). The objective of the LD-CAST platform, and in particular of the semantic services provided therein, is to address the above problems with flexible solutions. We aim at introducing high levels of flexibility at the time of development of business processes and concrete services (i.e., operational services offered by service providers), with the possibility of dynamically binding concrete services to the selected business process according to user needs. To this end, an approach based on semantic services and a reference ontology has been proposed.

  13. Earth resources instrumentation for the Space Station Polar Platform

    NASA Technical Reports Server (NTRS)

    Donohoe, Martin J.; Vane, Deborah

    1986-01-01

    The spacecraft and payloads of the Space Station Polar Platform program are described in a brief overview. Present plans call for one platform in a descending morning-equator-crossing orbit at 824 km and two or three platforms in ascending afternoon-crossing orbits at 542-824 km. The components of the NASA Earth Observing System (EOS) and NOAA payloads are listed in tables and briefly characterized, and data-distribution requirements and the mission development schedule are discussed. A drawing of the platform, a graph showing the spectral coverage of the EOS instruments, and a glossary of acronyms are provided.

  14. A Design of a Novel Airborne Aerosol Spectrometer for Remote Sensing Validation

    NASA Astrophysics Data System (ADS)

    Adler, G. A.; Brock, C. A.; Dube, W. P.; Erdesz, F.; Gordon, T.; Law, D. C.; Manfred, K.; Mason, B. J.; McLaughlin, R. J.; Richardson, M.; Wagner, N. L.; Washenfelder, R. A.; Murphy, D. M.

    2016-12-01

    Aerosols and their effect on the radiative properties of clouds contribute one of the largest sources of uncertainty in the Earth's energy budget. Many current global assessments of atmospheric aerosol radiative forcing rely heavily on remote sensing observations; therefore, in situ aircraft and ground-based measurements are essential for validation of remote sensing measurements. Cavity ringdown spectrometers (CRDs) measure aerosol extinction and are commonly used to validate remote sensing observations. These instruments have been deployed on aircraft platforms over the years, providing the opportunity to measure these properties over large areas in various conditions. However, deployment of a CRD on an aircraft platform has drawbacks. Typically, aircraft-based CRDs draw the sampled aerosol into a cabin-based instrument through long lengths of tubing. This limits the ability of the instrument to measure: 1) coarse-mode aerosols (e.g. dust), and 2) aerosols at high relative humidity (above 90%). Here we describe the design of a novel aircraft-based open-path CRD. The open-path CRD is intended to be mounted external to the cabin and has no sample tubing for aerosol delivery, thus measuring the optical properties of all aerosol at ambient conditions. However, the design of an open-path CRD for operation on a wing-mounted aircraft platform involves certain complexities. The instrument's special design features include two CRD channels, two airfoils around the open-path cavity, and a configuration that is both rigid and easily aligned. This novel implementation of cavity ringdown spectroscopy will provide a better assessment of the accuracy of remote sensing satellite measurements.

  15. PREPARE: innovative integrated tools and platforms for radiological emergency preparedness and post-accident response in Europe.

    PubMed

    Raskob, Wolfgang; Schneider, Thierry; Gering, Florian; Charron, Sylvie; Zhelezniak, Mark; Andronopoulos, Spyros; Heriard-Dubreuil, Gilles; Camps, Johan

    2015-04-01

    The PREPARE project, which started in February 2013 and will end at the beginning of 2016, aims to close gaps that have been identified in nuclear and radiological preparedness in Europe following the first evaluation of the Fukushima disaster. Among others, the project will address the review of existing operational procedures for dealing with long-lasting releases and cross-border problems in radiation monitoring and food safety, and will further develop missing functionalities in decision support systems (DSS), ranging from improved source-term estimation and dispersion modelling to the inclusion of hydrological pathways for European water bodies. In addition, a so-called Analytical Platform will be developed, exploring the scientific and operational means to improve information collection, information exchange and the evaluation of such types of disasters. The tools developed within the project will be partly integrated into the two DSS ARGOS and RODOS. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  16. Biomechanical ToolKit: Open-source framework to visualize and process biomechanical data.

    PubMed

    Barre, Arnaud; Armand, Stéphane

    2014-04-01

    The C3D file format is widely used in the biomechanical field by companies and laboratories to store motion capture system data. However, few software packages can visualize and modify the entirety of the data in a C3D file. Our objective was to develop an open-source and multi-platform framework to read, write, modify and visualize data from any motion analysis system using the standard file format (C3D) and proprietary file formats (used by many companies producing motion capture systems). The Biomechanical ToolKit (BTK) was developed to provide cost-effective and efficient tools for the biomechanical community to easily deal with motion analysis data. A large panel of operations is available to read, modify and process data through a C++ API, bindings for high-level languages (Matlab, Octave, and Python), and a standalone application (Mokka). All these tools are open-source and cross-platform and run on all major operating systems (Windows, Linux, MacOS X). Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
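
    Reading a C3D acquisition through BTK's Python bindings follows the reader/update pattern of its documented API; the file name below is a placeholder, and treating the first point as a marker trajectory is an assumption for illustration.

    ```python
    import btk  # BTK Python bindings

    reader = btk.btkAcquisitionFileReader()
    reader.SetFilename("walk.c3d")         # placeholder path to a C3D file
    reader.Update()
    acq = reader.GetOutput()

    print("points:", acq.GetPointNumber(), "@", acq.GetPointFrequency(), "Hz")
    point = acq.GetPoint(0)                # first trajectory in the acquisition
    print(point.GetLabel(), point.GetValues().shape)   # values: (n_frames, 3)
    ```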

  17. Reducing mechanical cross-coupling in phased array transducers using stop band material as backing

    NASA Astrophysics Data System (ADS)

    Henneberg, J.; Gerlach, A.; Storck, H.; Cebulla, H.; Marburg, S.

    2018-06-01

    Phased array transducers are widely used for acoustic imaging and surround sensing applications. A major design challenge is achieving low mechanical cross-coupling between the single transducer elements, since cross-coupling induces a loss of imaging resolution. In this work, the mechanical cross-coupling between acoustic transducers is investigated for a generic model. The model contains a common backing with two bending elements bonded on top. The dimensions of the backing are small; thus, wave reflections at the backing edges have to be considered, which distinguishes this work from previous studies. The operating frequency in the generic model is set to a low kHz range, as is typical for surround sensing applications. The influence of the backing on cross-coupling is investigated numerically. In order to reduce mechanical cross-coupling, a stop band material is designed. It is shown numerically that a reduction in mechanical cross-coupling can be achieved by using the stop band material as backing. The effect is validated by experimental testing.

  18. Cross-Cultural Applicability of the Montreal Cognitive Assessment (MoCA): A Systematic Review.

    PubMed

    O'Driscoll, Ciarán; Shaikh, Madiha

    2017-01-01

    The Montreal Cognitive Assessment (MoCA) is widely used to screen for mild cognitive impairment (MCI). While there are many available versions, the cross-cultural validity of the assessment has not been explored sufficiently. We aimed to interrogate the validity of the MoCA in a cross-cultural context: in differentiating MCI from normal controls (NC), and in identifying cut-offs and adjustments for age and education where possible. This review sourced a wide range of studies, including case-control studies. In addition, we report findings for differentiating dementias from NC and MCI from dementias; however, these were not considered an appropriate use of the MoCA. The subject of the review assumes heterogeneity, and therefore meta-analysis was not conducted. Quality ratings, forest plots of validated studies (sensitivity and specificity) with covariates (suggested cut-offs, age, education and country), and a summary receiver operating characteristic curve are presented. The results showed a wide range of suggested cut-offs for MCI cross-culturally, with sensitivity and specificity ranging from low to high. Poor methodological rigour appears to have affected the reported accuracy and validity of the MoCA. The review highlights the necessity of cross-cultural considerations when using the MoCA, and of recognizing it as a screen and not a diagnostic tool. Appropriate cut-offs and point adjustments for education are suggested.

  19. A review of simulation platforms in surgery of the temporal bone.

    PubMed

    Bhutta, M F

    2016-10-01

    Surgery of the temporal bone is a high-risk activity in an anatomically complex area. Simulation enables rehearsal of such surgery. The traditional simulation platform is the cadaveric temporal bone, but in recent years other simulation platforms have been created, including plastic and virtual reality platforms. To undertake a review of simulation platforms for temporal bone surgery, specifically assessing their educational value in terms of validity and in enabling transition to surgery. Systematic qualitative review. Search of the Pubmed, CINAHL, BEI and ERIC databases. Assessment of reported outcomes in terms of educational value. A total of 49 articles were included, covering cadaveric, animal, plastic and virtual simulation platforms. Cadaveric simulation is highly rated as an educational tool, but there may be a ceiling effect on educational outcomes after drilling 8-10 temporal bones. Animal models show significant anatomical variation from man. Plastic temporal bone models offer much potential, but at present lack sufficient anatomical or haptic validity. Similarly, virtual reality platforms lack sufficient anatomical or haptic validity, but with technological improvements they are advancing rapidly. At present, cadaveric simulation remains the best platform for training in temporal bone surgery. Technological advances enabling improved materials or modelling mean that in the future plastic or virtual platforms may become comparable to cadaveric platforms, and also offer additional functionality including patient-specific simulation from CT data. © 2015 John Wiley & Sons Ltd.

  20. Rapid sample classification using an open port sampling interface coupled with liquid introduction atmospheric pressure ionization mass spectrometry

    DOE PAGES

    Van Berkel, Gary J.; Kertesz, Vilmos

    2016-11-15

    An “Open Access”-like mass spectrometric platform to fully utilize the simplicity of the manual open port sampling interface for rapid characterization of unprocessed samples by liquid introduction atmospheric pressure ionization mass spectrometry has been lacking. The in-house developed integrated software with a simple, small and relatively low-cost mass spectrometry system introduced here fills this void. Software was developed to operate the mass spectrometer, to collect and process mass spectrometric data files, to build a database and to classify samples using such a database. These tasks were accomplished via the vendor-provided software libraries. Sample classification based on spectral comparison utilized the spectral contrast angle method. Using the developed software platform, near real-time sample classification is exemplified with a series of commercially available blue ink rollerball pens and vegetable oils. In the case of the inks, full scan positive and negative ion ESI mass spectra were both used for database generation and sample classification. For the vegetable oils, full scan positive ion mode APCI mass spectra were recorded. The overall accuracy of the employed spectral contrast angle statistical model was 95.3% and 98% in the case of the inks and oils, respectively, using leave-one-out cross-validation. In conclusion, this work illustrates that an open port sampling interface/mass spectrometer combination, with appropriate instrument control and data processing software, is a viable direct liquid extraction sampling and analysis system suitable for the non-expert user and near real-time sample classification via database matching.
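
    The spectral contrast angle used for the database matching above is simply the angle between two spectra treated as intensity vectors; a compact sketch follows, with toy spectra and a toy library rather than the paper's data.

    ```python
    import numpy as np

    def spectral_contrast_angle(a, b):
        """Angle (radians) between two spectra viewed as intensity vectors:
        0 means identical relative intensities, pi/2 means no shared signal."""
        cos_theta = np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b))
        return float(np.arccos(np.clip(cos_theta, -1.0, 1.0)))

    unknown = np.array([0.0, 5.0, 80.0, 10.0, 5.0])           # toy spectrum
    library = {"ink A": np.array([0.0, 6.0, 78.0, 11.0, 5.0]),
               "ink B": np.array([40.0, 30.0, 5.0, 20.0, 5.0])}

    best = min(library, key=lambda k: spectral_contrast_angle(unknown, library[k]))
    print("best match:", best)   # smallest angle wins -> "ink A"
    ```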

  2. Evaluation of Unmanned Aircraft Systems (UAS) for Weather and Climate using the Multi-testbed approach

    NASA Astrophysics Data System (ADS)

    Baker, B.; Lee, T.; Buban, M.; Dumas, E. J.

    2017-12-01

    The development of small Unmanned Aerial System (sUAS) testbeds that can be used to validate, integrate, calibrate and evaluate new technology and sensors for routine boundary layer research, validation of operational weather models, improvement of model parameterizations, and recording observations within high-impact storms is important for understanding the importance and impact of using sUASs routinely as a new observing platform. The goal of the multi-testbed approach is to build a robust set of protocols to assess the cost and operational feasibility of unmanned observations for routine applications, using various combinations of sUAS aircraft and sensors in different locations and field experiments. All of these observational testbeds serve different community needs, but they also use a diverse suite of methodologies for the calibration and evaluation of different sensors and platforms for severe weather and boundary layer research. The primary focus will be to evaluate meteorological sensor payloads that measure thermodynamic parameters and define surface characteristics with visible, IR, and multi-spectral cameras. This evaluation will lead to recommendations for sensor payloads for VTOL and fixed-wing sUAS.

  3. Statistical analysis of an RNA titration series evaluates microarray precision and sensitivity on a whole-array basis

    PubMed Central

    Holloway, Andrew J; Oshlack, Alicia; Diyagama, Dileepa S; Bowtell, David DL; Smyth, Gordon K

    2006-01-01

    Background: Concerns are often raised about the accuracy of microarray technologies and the degree of cross-platform agreement, but there are as yet no methods which can unambiguously evaluate precision and sensitivity for these technologies on a whole-array basis. Results: A methodology is described for evaluating the precision and sensitivity of whole-genome gene expression technologies such as microarrays. The method consists of an easy-to-construct titration series of RNA samples and an associated statistical analysis using non-linear regression. The method evaluates the precision and responsiveness of each microarray platform on a whole-array basis, i.e., using all the probes, without the need to match probes across platforms. An experiment is conducted to assess and compare four widely used microarray platforms. All four platforms are shown to have satisfactory precision, but the commercial platforms are superior at resolving differential expression for genes at lower expression levels. The effective precision of the two-color platforms is improved by allowing for probe-specific dye effects in the statistical model. The methodology is used to compare three data extraction algorithms for the Affymetrix platforms, demonstrating poor performance for the commonly used proprietary algorithm relative to the other algorithms. For probes which can be matched across platforms, the cross-platform variability is decomposed into within-platform and between-platform components, showing that platform disagreement is almost entirely systematic rather than due to measurement variability. Conclusion: The results demonstrate good precision and sensitivity for all the platforms, but highlight the need for improved probe annotation. They quantify the extent to which cross-platform measures can be expected to be less accurate than within-platform comparisons for predicting disease progression or outcome. PMID:17118209
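
    The titration idea is that a mixture made of fraction p of sample A and (1-p) of sample B should produce probe signal y = pA + (1-p)B, so each probe's fitted response and residual scatter gauge responsiveness and precision. The sketch below fits one probe on synthetic data with scipy; the paper's actual statistical model, and its handling of non-linearity across probes, is more elaborate.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    rng = np.random.default_rng(4)
    p = np.array([0.0, 0.25, 0.5, 0.75, 1.0])      # titration mixing fractions

    def mixture(p, a, b):
        """Expected probe signal for a mixture of fraction p of A and (1-p) of B."""
        return p * a + (1.0 - p) * b

    a_true, b_true = 950.0, 120.0                  # probe intensities in pure A and B
    y = mixture(p, a_true, b_true) + rng.normal(0, 25.0, p.size)

    (a_hat, b_hat), _ = curve_fit(mixture, p, y)
    resid = y - mixture(p, a_hat, b_hat)
    print(f"A = {a_hat:.0f}, B = {b_hat:.0f}, residual SD = {resid.std(ddof=2):.1f}")
    ```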

  4. A Vehicle Management End-to-End Testing and Analysis Platform for Validation of Mission and Fault Management Algorithms to Reduce Risk for NASAs Space Launch System

    NASA Technical Reports Server (NTRS)

    Trevino, Luis; Johnson, Stephen B.; Patterson, Jonathan; Teare, David

    2015-01-01

    The engineering development of the National Aeronautics and Space Administration's (NASA) new Space Launch System (SLS) requires cross discipline teams with extensive knowledge of launch vehicle subsystems, information theory, and autonomous algorithms dealing with all operations from pre-launch through on orbit operations. The nominal and off-nominal characteristics of SLS's elements and subsystems must be understood and matched with the autonomous algorithm monitoring and mitigation capabilities for accurate control and response to abnormal conditions throughout all vehicle mission flight phases, including precipitating safing actions and crew aborts. This presents a large and complex systems engineering challenge, which is being addressed in part by focusing on the specific subsystems involved in the handling of off-nominal mission and fault tolerance with response management. Using traditional model-based system and software engineering design principles from the Unified Modeling Language (UML) and Systems Modeling Language (SysML), the Mission and Fault Management (M&FM) algorithms for the vehicle are crafted and vetted in Integrated Development Teams (IDTs) composed of multiple development disciplines such as Systems Engineering (SE), Flight Software (FSW), Safety and Mission Assurance (S&MA) and the major subsystems and vehicle elements such as Main Propulsion Systems (MPS), boosters, avionics, Guidance, Navigation, and Control (GNC), Thrust Vector Control (TVC), and liquid engines. These model-based algorithms and their development lifecycle from inception through FSW certification are an important focus of SLS's development effort to further ensure reliable detection and response to off-nominal vehicle states during all phases of vehicle operation from pre-launch through end of flight. To test and validate these M&FM algorithms a dedicated test-bed was developed for full Vehicle Management End-to-End Testing (VMET). For addressing fault management (FM) early in the development lifecycle for the SLS program, NASA formed the M&FM team as part of the Integrated Systems Health Management and Automation Branch under the Spacecraft Vehicle Systems Department at the Marshall Space Flight Center (MSFC). To support the development of the FM algorithms, the VMET developed by the M&FM team provides the ability to integrate the algorithms, perform test cases, and integrate vendor-supplied physics-based launch vehicle (LV) subsystem models. Additionally, the team has developed processes for implementing and validating the M&FM algorithms for concept validation and risk reduction. The flexibility of the VMET capabilities enables thorough testing of the M&FM algorithms by providing configurable suites of both nominal and off-nominal test cases to validate the developed algorithms utilizing actual subsystem models such as MPS, GNC, and others. One of the principal functions of VMET is to validate the M&FM algorithms and substantiate them with performance baselines for each of the target vehicle subsystems in an independent platform exterior to the flight software test and validation processes. In any software development process there is inherent risk in the interpretation and implementation of concepts from requirements and test cases into flight software compounded with potential human errors throughout the development and regression testing lifecycle. 
Risk reduction is addressed by the M&FM group, and in particular by the Analysis Team working with other organizations such as S&MA, Structures and Environments, GNC, Orion, the Crew Office, Flight Operations, and Ground Operations, by assessing the performance of the M&FM algorithms in terms of their ability to reduce Loss of Mission (LOM) and Loss of Crew (LOC) probabilities. In addition, through state machine and diagnostic modeling, analysis efforts investigate a broader suite of failure effects and associated detections and responses to be tested in VMET, to ensure reliable failure detection and to confirm that responses do not create additional risks or cause undesired states through interactive dynamic effects with other algorithms and systems. VMET further contributes to risk reduction by prototyping and exercising the M&FM algorithms early in their implementation, free of inherent hindrances such as meeting FSW processor scheduling constraints imposed by the target platform (the ARINC 653 partitioned operating system), resource limitations, and other factors related to integration with subsystems not directly involved with M&FM, such as telemetry packing and processing. The baseline plan for use of VMET encompasses testing the original M&FM algorithms coded in the same C++ language and state machine architectural concepts as those used by the FSW. This enables the development of performance standards and test cases to characterize the M&FM algorithms and sets a benchmark from which to measure their effectiveness and performance in the exterior FSW development and test processes. This paper is outlined in a systematic fashion analogous to a lifecycle process flow for the engineering development of algorithms into software and testing. Section I describes the NASA SLS M&FM context, presenting the current infrastructure, leading principles, methods, and participants. Section II defines the testing philosophy of the M&FM algorithms as related to VMET, followed by Section III, which presents the modeling methods of the algorithms to be tested and validated in VMET. Its details are then further presented in Section IV, followed by Section V presenting integration, test status, and state analysis. Finally, Section VI addresses the summary and forward directions, followed by the appendices presenting relevant information on terminology and documentation.

  5. Browser App Approach: Can It Be an Answer to the Challenges in Cross-Platform App Development?

    ERIC Educational Resources Information Center

    Huynh, Minh; Ghimire, Prashant

    2017-01-01

    Aim/Purpose: As smartphones proliferate, many different platforms begin to emerge. The challenge for developers as well as IS [Information Systems] educators and students is how to learn the skills needed to design and develop apps that run across platforms. Background: For developers, the purpose of this paper is to describe an alternative to the complex…

  6. Standard Specimen Reference Set: Pancreatic — EDRN Public Portal

    Cancer.gov

    The primary objective of the EDRN Pancreatic Cancer Working Group Proposal is to create a reference set consisting of well-characterized serum/plasma specimens to use as a resource for the development of biomarkers for the early detection of pancreatic adenocarcinoma. The testing of biomarkers on the same sample set permits direct comparison among them, thereby allowing the development of a biomarker panel that can be evaluated in a future validation study. Additionally, the establishment of an infrastructure with core data elements and standardized operating procedures for specimen collection, processing, and storage will provide the necessary preparatory platform for larger validation studies when the appropriate marker/panel for pancreatic adenocarcinoma has been identified.

  7. Analysis of Human Plasma Metabolites across Different Liquid Chromatography - Mass Spectrometry Platforms: Cross-platform Transferable Chemical Signatures

    PubMed Central

    Telu, Kelly H.; Yan, Xinjian; Wallace, William E.; Stein, Stephen E.; Simón-Manso, Yamil

    2016-01-01

    RATIONALE The metabolite profiling of a NIST plasma Standard Reference Material (SRM 1950) on different LC-MS platforms showed significant differences. Although these findings suggest caution when interpreting metabolomics results, the degree of overlap of both profiles allowed us to use tandem mass spectral libraries of recurrent spectra to evaluate to what extent these results are transferable across platforms and to develop cross-platform chemical signatures. METHODS Non-targeted global metabolite profiles of SRM 1950 were obtained on different LC-MS platforms using reversed phase chromatography and different chromatographic scales (nano, conventional and UHPLC). The data processing and the metabolite differential analysis were carried out using publicly available (XCMS), proprietary (Mass Profiler Professional) and in-house software (NIST pipeline). RESULTS Repeatability and intermediate precision showed that the non-targeted SRM 1950 profiling was highly reproducible when working on the same platform (RSD < 2%); however, substantial differences were found in the LC-MS patterns originating on different platforms or even using different chromatographic scales (conventional HPLC, UHPLC and nanoLC) on the same platform. A substantial degree of overlap (common molecular features) was also found. A procedure to generate consistent chemical signatures using tandem mass spectral libraries of recurrent spectra is proposed. CONCLUSIONS Different platforms rendered significantly different metabolite profiles, but the results were highly reproducible when working within one platform. Tandem mass spectral libraries of recurrent spectra are proposed to evaluate the degree of transferability of chemical signatures generated on different platforms. Chemical signatures based on our procedure are most likely cross-platform transferable. PMID:26842580
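
    To make the quantities above concrete, the Python sketch below illustrates (with invented numbers, not the NIST pipeline) how within-platform repeatability can be expressed as a relative standard deviation, and how common molecular features might be matched across two platforms by m/z and retention time:

      # Illustrative sketch only: per-feature RSD within one platform, and naive
      # m/z + retention-time matching to count "common molecular features"
      # across two platforms. Tolerances and feature lists are assumptions.
      import numpy as np

      def rsd(intensities):
          """Relative standard deviation (%) across replicate injections."""
          x = np.asarray(intensities, dtype=float)
          return 100.0 * x.std(ddof=1) / x.mean()

      def common_features(a, b, mz_tol=0.01, rt_tol=0.5):
          """Features are (mz, rt) tuples; returns pairs matched within tolerance."""
          return [(fa, fb) for fa in a for fb in b
                  if abs(fa[0] - fb[0]) <= mz_tol and abs(fa[1] - fb[1]) <= rt_tol]

      replicates = [1.02e6, 1.00e6, 0.99e6, 1.01e6]        # one feature, 4 injections
      print(f"within-platform RSD: {rsd(replicates):.2f}%")  # reproducible if < 2%

      hplc  = [(180.063, 2.10), (204.123, 5.40), (132.077, 1.30)]
      uhplc = [(180.064, 2.30), (204.124, 5.10), (365.105, 7.90)]
      print("overlap:", common_features(hplc, uhplc))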

  8. Open source posturography.

    PubMed

    Rey-Martinez, Jorge; Pérez-Fernández, Nicolás

    2016-12-01

    The proposed validation goal of 0.9 for the intra-class correlation coefficient was reached with the results of this study. With the obtained results we consider the developed software (RombergLab) to be validated balance assessment software. The reliability of this software depends on the technical specifications of the force platform used. The aim was to develop and validate posturography software and share its source code in open source terms. Prospective non-randomized validation study: 20 consecutive adults underwent two balance assessment tests; six-condition posturography was performed using clinically approved software and a force platform, and the same conditions were measured using the newly developed open source software and a low cost force platform. The intra-class correlation index of the sway area, obtained from the center of pressure variations in both devices for the six conditions, was the main variable used for validation. Excellent concordance between RombergLab and the clinically approved force platform was obtained (intra-class correlation coefficient = 0.94). A Bland and Altman graphic concordance plot was also obtained. The source code used to develop RombergLab was published in open source terms.
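
    For readers unfamiliar with the sway-area metric, a minimal Python sketch follows; it computes the sway area as the convex hull of a synthetic center-of-pressure trajectory, one common definition, though not necessarily RombergLab's exact implementation:

      # Sway area as the convex hull of the center-of-pressure (COP) trajectory.
      # Synthetic COP data stands in for real force-platform recordings.
      import numpy as np
      from scipy.spatial import ConvexHull

      rng = np.random.default_rng(0)
      cop_xy = rng.normal(scale=5.0, size=(3000, 2))  # mm, ~30 s at 100 Hz

      hull = ConvexHull(cop_xy)
      print(f"sway area: {hull.volume:.1f} mm^2")  # in 2-D, .volume is the area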

  9. A High-Speed, Real-Time Visualization and State Estimation Platform for Monitoring and Control of Electric Distribution Systems: Implementation and Field Results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lundstrom, Blake; Gotseff, Peter; Giraldez, Julieta

    Continued deployment of renewable and distributed energy resources is fundamentally changing the way that electric distribution systems are controlled and operated; more sophisticated active system control and greater situational awareness are needed. Real-time measurements and distribution system state estimation (DSSE) techniques enable more sophisticated system control and, when combined with visualization applications, greater situational awareness. This paper presents a novel demonstration of a high-speed, real-time DSSE platform and related control and visualization functionalities, implemented using existing open-source software and distribution system monitoring hardware. Live scrolling strip charts of meter data and intuitive annotated map visualizations of the entire state (obtained via DSSE) of a real-world distribution circuit are shown. The DSSE implementation is validated to demonstrate provision of accurate voltage data. This platform allows for enhanced control and situational awareness using only a minimum quantity of distribution system measurement units and modest data and software infrastructure.
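
    The core computation of a DSSE is conveniently shown with a textbook weighted-least-squares estimator; the sketch below (a hypothetical two-state network, not the platform's code) solves z = Hx + e with per-meter weights:

      # Textbook weighted-least-squares (WLS) state estimation, the core of most
      # DSSE formulations; this linearized sketch is illustrative only.
      import numpy as np

      H = np.array([[1.0,  0.0],     # two states (e.g. voltage angles), three meters
                    [1.0, -1.0],
                    [0.0,  1.0]])
      sigma = np.array([0.01, 0.02, 0.01])   # per-meter standard deviations
      z = np.array([0.105, 0.052, 0.049])    # noisy measurements

      W = np.diag(1.0 / sigma**2)
      x_hat = np.linalg.solve(H.T @ W @ H, H.T @ W @ z)   # gain-matrix solve
      print("estimated state:", x_hat)
      print("measurement residuals:", z - H @ x_hat)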

  10. A New Instrument for Measurement of the Solar Aureole Radiance Distribution from Unstable Platforms

    NASA Technical Reports Server (NTRS)

    Ritter, Joseph M.; Voss, Kenneth J.

    1999-01-01

    A novel imaging solar aureole radiometer, which can obtain absolute radiometric measurements of the solar aureole when operated on an unstable platform, is described. A CCD array is used to image the aureole, while a neutral density occulter on a long pole blocks the direct solar radiation. This ensures accurate direction registration, as the sun appears in acquired images, and the total circumsolar region is measured simultaneously. The imaging nature of this instrument, along with a special triggering device, permits acquisition of the circumsolar sky radiance within 7.5 degrees of the center of the solar disk, and within 1 degree of the edge of the solar disk. This innovation makes possible, for the first time, reliable and accurate radiometric measurements of the solar aureole from unstable mobile platforms such as ships, allowing determination of small-angle atmospheric scattering. The instrument has been used in field studies of atmospheric aerosols and will be used in satellite validation and calibration campaigns.

  11. H∞ memory feedback control with input limitation minimization for offshore jacket platform stabilization

    NASA Astrophysics Data System (ADS)

    Yang, Jia Sheng

    2018-06-01

    In this paper, we investigate an H∞ memory controller with input limitation minimization (HMCIM) for offshore jacket platform stabilization. The main objective of this study is to reduce control consumption as well as protect the actuator while satisfying the requirements on system performance. First, we introduce a dynamic model of the offshore platform with low order main modes, based on the mode reduction method of numerical analysis. Then, based on H∞ control theory and matrix inequality techniques, we develop a novel H∞ memory controller with input limitation. Furthermore, a non-convex optimization model to minimize input energy consumption is proposed. Since it is difficult to solve this non-convex optimization model directly, we use a relaxation method with matrix operations to transform it into a convex optimization model, which can then be solved by a standard convex optimization solver in MATLAB or CPLEX. Finally, several numerical examples are given to validate the proposed models and methods.
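
    The final step described above, handing a matrix-inequality problem to a standard convex solver, can be sketched in Python with CVXPY. The example below solves a simple Lyapunov LMI feasibility problem as a stand-in for the paper's (more elaborate) HMCIM conditions; the system matrix is hypothetical and unrelated to the offshore platform model:

      # Stand-in flavor of the convexified matrix-inequality step: a Lyapunov
      # LMI feasibility problem solved with CVXPY. All values are hypothetical.
      import cvxpy as cp
      import numpy as np

      A = np.array([[-1.0,  2.0],
                    [ 0.0, -3.0]])          # a stable test system

      P = cp.Variable((2, 2), symmetric=True)
      constraints = [P >> np.eye(2),                         # P positive definite
                     A.T @ P + P @ A << -1e-6 * np.eye(2)]   # Lyapunov inequality
      prob = cp.Problem(cp.Minimize(cp.trace(P)), constraints)
      prob.solve()

      print("status:", prob.status)   # 'optimal' certifies feasibility
      print("P =\n", P.value)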

  12. Acceleration of Radiance for Lighting Simulation by Using Parallel Computing with OpenCL

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zuo, Wangda; McNeil, Andrew; Wetter, Michael

    2011-09-06

    We report on the acceleration of annual daylighting simulations for fenestration systems in the Radiance ray-tracing program. The algorithm was optimized to reduce both the redundant data input/output operations and the floating-point operations. To further accelerate the simulation speed, the calculation for matrix multiplications was implemented using parallel computing on a graphics processing unit. We used OpenCL, which is a cross-platform parallel programming language. Numerical experiments show that the combination of the above measures can speed up the annual daylighting simulations 101.7 times or 28.6 times when the sky vector has 146 or 2306 elements, respectively.
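
    The heart of the speedup is recasting repeated per-timestep products as one large matrix multiplication, which parallelizes naturally on a GPU. A NumPy sketch of that reformulation, with hypothetical matrix sizes, follows:

      # The annual simulation reduces to illuminance = DC @ sky, evaluated for
      # all hours at once; one batched multiply replaces 8760 separate
      # matrix-vector products. Matrix sizes here are hypothetical.
      import numpy as np

      n_sensors, n_patches, n_hours = 500, 146, 8760
      rng = np.random.default_rng(1)

      DC = rng.random((n_sensors, n_patches))   # daylight-coefficient matrix
      sky = rng.random((n_patches, n_hours))    # one sky vector per hour

      illuminance = DC @ sky                    # (n_sensors, n_hours)
      print(illuminance.shape)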

  13. Comments on airborne ISR radar utilization

    NASA Astrophysics Data System (ADS)

    Doerry, A. W.

    2016-05-01

    A sensor/payload operator for modern multi-sensor, multi-mode Intelligence, Surveillance, and Reconnaissance (ISR) platforms is often confronted with a plethora of options in sensors and sensor modes. This often leads an over-worked operator to down-select to favorite sensors and modes; for example, a justifiably favorite Full Motion Video (FMV) sensor at the expense of radar modes, even if radar modes can offer unique and advantageous information. At best, sensors might be used in a serial monogamous fashion with some cross-cueing. The challenge is then to increase the utilization of the radar modes in a manner attractive to the sensor/payload operator. We propose that this is best accomplished by combining sensor modes and displays into 'super-modes'.

  14. A software platform for phase contrast x-ray breast imaging research.

    PubMed

    Bliznakova, K; Russo, P; Mettivier, G; Requardt, H; Popov, P; Bravin, A; Buliev, I

    2015-06-01

    To present and validate a computer-based simulation platform dedicated to phase contrast x-ray breast imaging research. The software platform, developed at the Technical University of Varna on the basis of a previously validated x-ray imaging software simulator, comprises modules for object creation and for x-ray image formation. These modules were updated to take into account the refractive index for phase contrast imaging, as well as to implement the Fresnel-Kirchhoff diffraction theory for the propagating x-ray waves. Projection images are generated in an in-line acquisition geometry. To test and validate the platform, several phantoms differing in their complexity were constructed and imaged at 25 keV and 60 keV at the beamline ID17 of the European Synchrotron Radiation Facility. The software platform was used to design computational phantoms that mimic those used in the experimental study and to generate x-ray images in absorption and phase contrast modes. The visual and quantitative results of the validation process showed an overall good correlation between simulated and experimental images and demonstrate the potential of this platform for research in phase contrast x-ray imaging of the breast. The application of the platform is demonstrated in a feasibility study for phase contrast images of complex inhomogeneous and anthropomorphic breast phantoms, compared to x-ray images generated in absorption mode. The improved visibility of mammographic structures suggests further investigation and optimisation of phase contrast x-ray breast imaging, especially when abnormalities are present. The software platform can also be exploited for educational purposes. Copyright © 2015 Elsevier Ltd. All rights reserved.
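
    In-line phase contrast formation rests on free-space propagation of the exit wave. The one-dimensional sketch below (illustrative parameters, not the Varna platform's code) propagates a phase step with a Fresnel transfer function applied in the Fourier domain:

      # Free-space Fresnel propagation, the mechanism behind in-line phase
      # contrast. 1-D sketch with illustrative parameters only.
      import numpy as np

      wavelength = 5e-11          # ~25 keV x-rays, m
      z = 1.0                     # propagation distance, m
      n, dx = 4096, 1e-6          # samples, pixel pitch (m)

      x = (np.arange(n) - n / 2) * dx
      phase = np.where(x > 0, -0.05, 0.0)   # phase step from a weak object edge
      u0 = np.exp(1j * phase)               # unit-amplitude exit wave

      fx = np.fft.fftfreq(n, d=dx)
      H = np.exp(-1j * np.pi * wavelength * z * fx**2)  # Fresnel transfer function
      u_z = np.fft.ifft(np.fft.fft(u0) * H)

      intensity = np.abs(u_z)**2   # shows edge-enhancement fringes that are
      print(intensity.max())       # absent in pure absorption imaging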

  15. Partial least squares analysis of rocket propulsion fuel data using diaphragm valve-based comprehensive two-dimensional gas chromatography coupled with flame ionization detection.

    PubMed

    Freye, Chris E; Fitz, Brian D; Billingsley, Matthew C; Synovec, Robert E

    2016-06-01

    The chemical composition and several physical properties of RP-1 fuels were studied using comprehensive two-dimensional (2D) gas chromatography (GC×GC) coupled with flame ionization detection (FID). A "reversed column" GC×GC configuration was implemented with an RTX-wax column as the first dimension ((1)D) and an RTX-1 column as the second dimension ((2)D). Modulation was achieved using a high temperature diaphragm valve mounted directly in the oven. Using leave-one-out cross-validation (LOOCV), the summed GC×GC-FID signal of three compound-class selective 2D regions (alkanes, cycloalkanes, and aromatics) was regressed against previously measured ASTM derived values for these compound classes, yielding root mean square errors of cross validation (RMSECV) of 0.855, 0.734, and 0.530 mass%, respectively. For comparison, using partial least squares (PLS) analysis with LOOCV, the GC×GC-FID signal of the entire 2D separations was regressed against the same ASTM values; a linear trend was observed for the three compound classes (alkanes, cycloalkanes, and aromatics), with RMSECV values of 1.52, 2.76, and 0.945 mass%, respectively. Additionally, a more detailed PLS analysis was undertaken of the compound classes (n-alkanes, iso-alkanes, mono-, di-, and tri-cycloalkanes, and aromatics) and of physical properties previously determined by ASTM methods (such as net heat of combustion, hydrogen content, density, kinematic viscosity, sustained boiling temperature and vapor rise temperature). Results from these PLS studies using the relatively simple to use and inexpensive GC×GC-FID instrumental platform are compared to previously reported results using the GC×GC-TOFMS instrumental platform. Copyright © 2016 Elsevier B.V. All rights reserved.
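
    RMSECV under LOOCV, as used throughout the study, can be reproduced in a few lines; the sketch below uses scikit-learn's PLSRegression on synthetic data standing in for the GC×GC-FID signals and ASTM reference values:

      # RMSECV via leave-one-out cross-validation of a PLS regression.
      # Synthetic data stands in for GCxGC-FID signals and ASTM values.
      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import LeaveOneOut

      rng = np.random.default_rng(2)
      X = rng.random((24, 500))                 # 24 fuels x 500 signal channels
      y = X[:, :50].sum(axis=1) + 0.1 * rng.standard_normal(24)  # e.g. mass%

      errors = []
      for train, test in LeaveOneOut().split(X):
          pls = PLSRegression(n_components=5).fit(X[train], y[train])
          errors.append(y[test][0] - pls.predict(X[test]).ravel()[0])

      rmsecv = np.sqrt(np.mean(np.square(errors)))
      print(f"RMSECV: {rmsecv:.3f}")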

  16. Real-time FPGA-based radar imaging for smart mobility systems

    NASA Astrophysics Data System (ADS)

    Saponara, Sergio; Neri, Bruno

    2016-04-01

    The paper presents an X-band FMCW (Frequency Modulated Continuous Wave) Radar Imaging system, called X-FRI, for surveillance in smart mobility applications. X-FRI allows for detecting the presence of targets (e.g. obstacles in a railway crossing or urban road crossing, or ships in a small harbor), as well as their speed and position. With respect to alternative solutions based on LIDAR or camera systems, X-FRI operates in real time even in bad lighting and weather conditions, night and day. The radio-frequency transceiver is realized through COTS (Commercial Off The Shelf) components on a single board. An FPGA-based baseband platform allows for real-time radar image processing.
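
    Range recovery in an FMCW radar such as X-FRI reduces to locating the beat frequency between transmitted and received chirps, since R = c·f_b·T/(2B). A Python sketch with illustrative parameters (not X-FRI's actual settings):

      # FMCW range estimation: find the beat frequency via FFT, convert to
      # range with R = c * f_b * T / (2 * B). Parameters are illustrative.
      import numpy as np

      c = 3e8
      B, T = 150e6, 1e-3                  # sweep bandwidth (Hz), sweep time (s)
      fs, R_true = 2e6, 450.0             # sample rate (Hz), target range (m)

      t = np.arange(0, T, 1 / fs)
      f_beat = 2 * R_true * B / (c * T)   # beat frequency for a static target
      sig = np.cos(2 * np.pi * f_beat * t)

      spec = np.abs(np.fft.rfft(sig))
      freqs = np.fft.rfftfreq(len(sig), d=1 / fs)
      f_hat = freqs[spec.argmax()]
      print(f"estimated range: {c * f_hat * T / (2 * B):.1f} m")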

  17. New Mobile Atmospheric Lidar Systems for Spaceborne Instrument Validation

    NASA Astrophysics Data System (ADS)

    Chazette, P.; Raut, J.-C.; Sanak, J.; Berthier, S.; Dulac, F.; Kim, S. W.; Royer, P.

    2009-04-01

    We present an overview of our different approaches using lidar systems as a tool to validate and develop the new generation of spaceborne missions. We have developed several mini-lidars in order to study the vertical structure, the clouds and the particulate composition of the atmosphere from mobile platforms. Here we focus on three mobile instrumental platforms including a backscatter lidar instrument developed for validation of the Cloud-Aerosol LIdar with Orthogonal Polarization (CALIOP) onboard CALIPSO and of the Interféromètre Atmosphérique de Sondage Infrarouge (IASI) onboard METOP. The first system is operated onboard an ultra-light aircraft (ULA) (Chazette et al., Environ. Sci. Technol., 2007). The second one is operated onboard a stratospheric balloon to study the interest of measurement synergy with the Infrared Atmospheric Sounding Interferometer (IASI). The third one is part of a truck/car mobile station to be positioned close to the satellite ground-track (e.g. CALIPSO) or inside the area delimited by the instrumental swath (e.g. IASI). CALIPSO was inserted in the A-Train constellation behind Aqua on 28 April, 2006 (http://www-calipso.larc.nasa.gov/about/atrain.php). One of the main objectives of the scientific mission is the study of atmospheric aerosols. Before the CALIOP lidar profiles could be used in an operational way, it was necessary to validate both the raw and geophysical data of the instrument. For this purpose, we carried out an experiment in south-eastern France in summer 2007 to validate the aerosol product of CALIOP by operating both the ground-based and the airborne mobile lidars in coincidence with CALIOP. The synergy between the new generation of spaceborne passive and active instruments is promising for assessing the concentrations of main pollutants such as aerosols, O3 and CO, and greenhouse gases such as CO2 and CH4, within the planetary boundary layer (PBL), and for increasing the accuracy of the vertical temperature profile. IASI is a key payload element of the METOP series of European meteorological polar-orbit satellites. The MetOp-A satellite was successfully launched from the Baikonur Cosmodrome, Kazakhstan on 19 October 2006 (http://www.eumetsat.int/). IASI is a Fourier transform spectrometer dedicated to operational meteorology and the chemistry of the troposphere. The technological approach for the stratospheric lidar system was tested in south-western France in spring 2007. Acknowledgements: our lidar systems have been developed by CEA and CNRS with the support of CNES. We acknowledge the ULA pilot Franck Toussaint and the Air Creation ULA Company for logistical help during the ULA campaign.

  18. Examining System-Wide Impacts of Solar PV Control Systems with a Power Hardware-in-the-Loop Platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Williams, Tess L.; Fuller, Jason C.; Schneider, Kevin P.

    2014-10-11

    High penetration levels of distributed solar PV power generation can lead to adverse power quality impacts such as excessive voltage rise, voltage flicker, and reactive power values that result in unacceptable voltage levels. Advanced inverter control schemes have been proposed that have the potential to mitigate many power quality concerns. However, closed-loop control may lead to unintended behavior in deployed systems, as complex interactions can occur between numerous operating devices. In order to enable the study of the performance of advanced control schemes in a detailed distribution system environment, a Hardware-in-the-Loop (HIL) platform has been developed. In the HIL system, GridLAB-D, a distribution system simulation tool, runs in real-time mode at the Pacific Northwest National Laboratory (PNNL) and supplies power system parameters at a point of common coupling to hardware located at the National Renewable Energy Laboratory (NREL). Hardware inverters interact with grid and PV simulators emulating an operational distribution system, and power output from the inverters is measured and sent to PNNL to update the real-time distribution system simulation. The platform is described and initial test cases are presented. The platform is used to study the system-wide impacts and the interactions of controls applied to inverters that are integrated into a simulation of the IEEE 8500-node test feeder, with inverters in either constant power factor control or active volt/VAR control. We demonstrate that this HIL platform is well-suited to the study of advanced inverter controls and their impacts on the power quality of a distribution feeder. Additionally, the results from HIL are used to validate GridLAB-D simulations of advanced inverter controls.
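
    Of the two inverter controls tested, volt/VAR control is easily illustrated: reactive power is commanded as a piecewise-linear function of terminal voltage. The sketch below uses hypothetical curve breakpoints, not the IEEE 8500-node study's settings:

      # Minimal volt/VAR droop: reactive power command as a piecewise-linear
      # function of terminal voltage. Breakpoints are hypothetical.
      import numpy as np

      def volt_var(v_pu, q_max=0.44):
          """Return reactive power command (pu of rated VA) for voltage v_pu.
          Absorb vars when voltage is high, inject when low, deadband near 1.0."""
          v_pts = [0.95, 0.98, 1.02, 1.05]        # breakpoints (pu)
          q_pts = [q_max, 0.0, 0.0, -q_max]       # positive = injection
          return float(np.interp(v_pu, v_pts, q_pts))

      for v in [0.94, 0.97, 1.00, 1.03, 1.06]:
          print(f"V = {v:.2f} pu -> Q = {volt_var(v):+.3f} pu")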

  19. Discovery and analysis of time delay sources in the USGS personal computer data collection platform (PCDCP) system

    USGS Publications Warehouse

    White, Timothy C.; Sauter, Edward A.; Stewart, Duff C.

    2014-01-01

    Intermagnet is an international oversight group which exists to establish a global network of geomagnetic observatories. This group establishes data standards and standard operating procedures for members and prospective members. Intermagnet has proposed a new One-Second Data Standard for that emerging geomagnetic product. The standard specifies that all data collected must have a time stamp accuracy within ±10 milliseconds of the top of the second in Coordinated Universal Time. Therefore, the U.S. Geological Survey Geomagnetism Program has designed and executed several tests on its current data collection system, the Personal Computer Data Collection Platform. Tests are designed to measure the time shifts introduced by individual components within the data collection system, as well as to measure the time shift introduced by the entire Personal Computer Data Collection Platform. Additional testing designed for Intermagnet will be used to further validate such measurements. Current results of the measurements showed a 5.0–19.9 millisecond lag for the vertical channel (Z) of the Personal Computer Data Collection Platform and a 13.0–25.8 millisecond lag for the horizontal channels (H and D) of the collection system. These measurements represent a dynamically changing delay introduced within the U.S. Geological Survey Personal Computer Data Collection Platform.
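
    One common way to measure such channel lags is to inject a known reference signal and read the delay off the cross-correlation peak; the Python sketch below demonstrates the idea on synthetic data (the 18 ms delay is invented, not a PCDCP result):

      # Measuring a channel's time shift by cross-correlating a reference
      # signal with the recorded channel. Signal and delay are synthetic.
      import numpy as np

      fs = 1000                              # Hz, test sampling rate
      t = np.arange(0, 2.0, 1 / fs)
      ref = np.sin(2 * np.pi * 3.0 * t)      # injected reference signal
      delay_samples = 18                     # simulate an 18 ms system lag
      rec = np.roll(ref, delay_samples)

      corr = np.correlate(rec, ref, mode="full")
      lag = corr.argmax() - (len(ref) - 1)
      print(f"measured lag: {1000 * lag / fs:.1f} ms")   # -> 18.0 ms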

  20. Development and Validation of Sandwich ELISA Microarrays with Minimal Assay Interference

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gonzalez, Rachel M.; Servoss, Shannon; Crowley, Sheila A.

    Sandwich enzyme-linked immunosorbent assay (ELISA) microarrays are emerging as a strong candidate platform for multiplex biomarker analysis because of the ELISA’s ability to quantitatively measure rare proteins in complex biological fluids. Advantages of this platform are high-throughput potential, assay sensitivity and stringency, and the similarity to the standard ELISA test, which facilitates assay transfer from a research setting to a clinical laboratory. However, a major concern with the multiplexing of ELISAs is maintaining high assay specificity. In this study, we systematically determine the amount of assay interference and noise contributed by individual components of the multiplexed 24-assay system. We find that non-specific reagent cross-reactivity problems are relatively rare. We did, however, identify the presence of contaminant antigens in a “purified antigen”. We tested the validated ELISA microarray chip using paired serum samples that had been collected from four women at a 6-month interval. This analysis demonstrated that protein levels typically vary much more between individuals than within an individual over time, a result which suggests that longitudinal studies may be useful in controlling for biomarker variability across a population. Overall, this research demonstrates the importance of a stringent screening protocol and the value of optimizing the antibody and antigen concentrations when designing chips for ELISA microarrays.

  1. GiNA, an Efficient and High-Throughput Software for Horticultural Phenotyping

    PubMed Central

    Diaz-Garcia, Luis; Covarrubias-Pazaran, Giovanny; Schlautman, Brandon; Zalapa, Juan

    2016-01-01

    Traditional methods for trait phenotyping have been a bottleneck for research in many crop species due to their intensive labor, high cost, complex implementation, lack of reproducibility, and propensity to subjective bias. Recently, multiple high-throughput phenotyping platforms have been developed, but most of them are expensive, species-dependent, complex to use, and available only for major crops. To overcome such limitations, we present the open-source software GiNA, which is a simple and free tool for measuring horticultural traits such as shape- and color-related parameters of fruits, vegetables, and seeds. GiNA is multiplatform software available in both the R and MATLAB® programming languages and uses conventional images from digital cameras with minimal requirements. It can process up to 11 different horticultural morphological traits such as length, width, two-dimensional area, volume, projected skin, surface area, and RGB color, among other parameters. Different validation tests produced highly consistent results under different lighting conditions and camera setups, making GiNA a very reliable platform for high-throughput phenotyping. In addition, five-fold cross-validation correlations between manually generated and GiNA measurements for length and width in cranberry fruits were 0.97 and 0.92, respectively. Furthermore, the same strategy yielded prediction accuracies above 0.83 for color estimates produced from images of cranberries analyzed with GiNA compared to the total anthocyanin content (TAcy) of the same fruits measured with the standard methodology of the industry. Our platform provides a scalable, easy-to-use and affordable tool for massive acquisition of phenotypic data of fruits, seeds, and vegetables. PMID:27529547
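
    In miniature, the kind of measurement GiNA automates looks like the following Python sketch, which derives length, width, and area of a synthetic binary "fruit" mask from bounding-box extents; GiNA's actual pipeline is considerably richer:

      # Toy trait extraction: threshold to a binary mask, then measure
      # per-fruit length/width/area. The ellipse stands in for a real image.
      import numpy as np

      h, w = 200, 300
      yy, xx = np.mgrid[0:h, 0:w]
      mask = (((xx - 150) / 80.0) ** 2 + ((yy - 100) / 40.0) ** 2) <= 1.0

      ys, xs = np.nonzero(mask)
      length = xs.max() - xs.min() + 1     # pixels along the major axis
      width = ys.max() - ys.min() + 1      # pixels along the minor axis
      area = int(mask.sum())               # two-dimensional area in pixels
      print(length, width, area)           # convert with a mm-per-pixel scale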

  2. GiNA, an Efficient and High-Throughput Software for Horticultural Phenotyping.

    PubMed

    Diaz-Garcia, Luis; Covarrubias-Pazaran, Giovanny; Schlautman, Brandon; Zalapa, Juan

    2016-01-01

    Traditional methods for trait phenotyping have been a bottleneck for research in many crop species due to their intensive labor, high cost, complex implementation, lack of reproducibility, and propensity to subjective bias. Recently, multiple high-throughput phenotyping platforms have been developed, but most of them are expensive, species-dependent, complex to use, and available only for major crops. To overcome such limitations, we present the open-source software GiNA, which is a simple and free tool for measuring horticultural traits such as shape- and color-related parameters of fruits, vegetables, and seeds. GiNA is multiplatform software available in both the R and MATLAB® programming languages and uses conventional images from digital cameras with minimal requirements. It can process up to 11 different horticultural morphological traits such as length, width, two-dimensional area, volume, projected skin, surface area, and RGB color, among other parameters. Different validation tests produced highly consistent results under different lighting conditions and camera setups, making GiNA a very reliable platform for high-throughput phenotyping. In addition, five-fold cross-validation correlations between manually generated and GiNA measurements for length and width in cranberry fruits were 0.97 and 0.92, respectively. Furthermore, the same strategy yielded prediction accuracies above 0.83 for color estimates produced from images of cranberries analyzed with GiNA compared to the total anthocyanin content (TAcy) of the same fruits measured with the standard methodology of the industry. Our platform provides a scalable, easy-to-use and affordable tool for massive acquisition of phenotypic data of fruits, seeds, and vegetables.

  3. Reliability and validity of a smartphone pulse rate application for the assessment of resting and elevated pulse rate.

    PubMed

    Mitchell, Katy; Graff, Megan; Hedt, Corbin; Simmons, James

    2016-08-01

    Purpose/hypothesis: This study was designed to investigate the test-retest reliability, concurrent validity, and standard error of measurement (SEm) of a pulse rate assessment application (Azumio®'s Instant Heart Rate) on both Android® and iOS® (iPhone operating system) smartphones, as compared to a FT7 Polar® Heart Rate monitor. Number of subjects: 111. Resting (sitting) pulse rate was assessed twice; the participants were then asked to complete a 1-min standing step test, after which pulse rate was immediately re-assessed. The smartphone assessors were blinded to their measurements. Test-retest reliability (intraclass correlation coefficient [ICC 2,1] and 95% confidence interval) for the three tools at rest (time 1/time 2): iOS® (0.76 [0.67-0.83]); Polar® (0.84 [0.78-0.89]); and Android® (0.82 [0.75-0.88]). Concurrent validity at rest time 2 (ICC 2,1) with the Polar® device: iOS® (0.92 [0.88-0.94]) and Android® (0.95 [0.92-0.96]). Concurrent validity post-exercise (time 3) (ICC) with the Polar® device: iOS® (0.90 [0.86-0.93]) and Android® (0.94 [0.91-0.96]). The SEm values for the three devices at rest: iOS® (5.77 beats per minute [BPM]), Polar® (4.56 BPM) and Android® (4.96 BPM). The Android®, iOS®, and Polar® devices showed acceptable test-retest reliability at rest and post-exercise. Both smartphone platforms demonstrated concurrent validity with the Polar® device at rest and post-exercise. The Azumio® Instant Heart Rate application, on either platform, appears to be a reliable and valid tool to assess pulse rate in healthy individuals.
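
    The ICC(2,1) figures reported above come from the two-way random-effects, single-measures formula, which the sketch below computes directly from its ANOVA mean squares on synthetic paired readings (not the study's data):

      # ICC(2,1) from its ANOVA mean squares (Shrout & Fleiss two-way random
      # effects, single measures). Synthetic device-vs-device pulse rates.
      import numpy as np

      def icc_2_1(x):
          """x: (n_subjects, k_raters) matrix of scores."""
          n, k = x.shape
          grand = x.mean()
          ms_r = k * ((x.mean(axis=1) - grand) ** 2).sum() / (n - 1)   # subjects
          ms_c = n * ((x.mean(axis=0) - grand) ** 2).sum() / (k - 1)   # raters
          sse = ((x - x.mean(axis=1, keepdims=True)
                    - x.mean(axis=0, keepdims=True) + grand) ** 2).sum()
          ms_e = sse / ((n - 1) * (k - 1))
          return (ms_r - ms_e) / (ms_r + (k - 1) * ms_e + k * (ms_c - ms_e) / n)

      rng = np.random.default_rng(3)
      true_hr = rng.normal(70, 8, size=50)                    # 50 subjects
      readings = np.column_stack([true_hr + rng.normal(0, 3, 50),   # device A
                                  true_hr + rng.normal(0, 3, 50)])  # device B
      print(f"ICC(2,1) = {icc_2_1(readings):.2f}")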

  4. Future Combat System Spinout 1 Technical Field Test - Establishing and Implementing Models and Simulations System of Systems Verification, Validation and Accreditation Practices, Methodologies and Procedures

    DTIC Science & Technology

    2009-11-24

    assisted by the Brigade Combat Team (BCT) Modernization effort, the use of Models and Simulations (M&S) becomes more crucial in supporting major...in 2008 via a slice of the Current Force (CF) BCT structure. To ensure realistic operational context, an M&S System-of-Systems (SoS) level...messages, and constructive representation of platforms, vehicles, and terrain. The M&S federation also provided test control, data collection, and live

  5. Solar Occultation Retrieval Algorithm Development

    NASA Technical Reports Server (NTRS)

    Lumpe, Jerry D.

    2004-01-01

    This effort addresses the comparison and validation of currently operational solar occultation retrieval algorithms, and the development of generalized algorithms for future application to multiple platforms, beginning with the initial development of generalized forward model algorithms capable of simulating transmission data from the POAM II/III and SAGE II/III instruments. Work in the 2nd quarter will focus on: completion of the forward model algorithms, including accurate spectral characteristics for all instruments, and comparison of simulated transmission data with actual level 1 instrument data for specific occultation events.
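
    At its core, an occultation forward model applies Beer-Lambert attenuation along the line of sight, T(lambda) = exp(-sum_i sigma_i(lambda) * N_i), with N_i the species slant-column densities. A toy Python illustration with invented cross sections and a single absorber:

      # Beer-Lambert transmission along a slant path; values are illustrative
      # placeholders, not POAM/SAGE spectroscopy.
      import numpy as np

      wavelengths = np.array([600.0, 602.0, 604.0])        # nm
      sigma_o3 = np.array([5.2e-21, 5.0e-21, 4.8e-21])     # cm^2, toy cross sections
      slant_column_o3 = 2.0e19                             # molecules / cm^2

      transmission = np.exp(-sigma_o3 * slant_column_o3)
      print(transmission)   # compare against level-1 transmission data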

  6. An overview on development and application of an experimental platform for quantitative cardiac imaging research in rabbit models of myocardial infarction

    PubMed Central

    Feng, Yuanbo; Bogaert, Jan; Oyen, Raymond

    2014-01-01

    To exploit the advantages of using rabbits for cardiac imaging research and to tackle the technical obstacles, efforts have been made under the framework of a doctoral research program. In this overview article, by cross-referencing the current literature, we summarize how we have developed a preclinical cardiac research platform based on modified models of reperfused myocardial infarction (MI) in rabbits; how the in vivo manifestations of cardiac imaging could be closely matched with those ex vivo macro- and microscopic findings; how these imaging outcomes could be quantitatively analyzed, validated and demonstrated; and how we could apply this cardiac imaging platform to provide possible solutions to certain lingering diagnostic and therapeutic problems in experimental cardiology. In particular, tissue components in acute cardiac ischemia have been stratified and characterized, post-infarct lipomatous metaplasia (LM), a common but rarely illuminated clinical pathology, has been identified in rabbit models, and a necrosis avid tracer as well as an anti-ischemic drug have been successfully assessed for their potential utilities in clinical cardiology. These outcomes may interest researchers in related fields and help strengthen translational research in cardiovascular diseases. PMID:25392822

  7. An overview on development and application of an experimental platform for quantitative cardiac imaging research in rabbit models of myocardial infarction.

    PubMed

    Feng, Yuanbo; Bogaert, Jan; Oyen, Raymond; Ni, Yicheng

    2014-10-01

    To exploit the advantages of using rabbits for cardiac imaging research and to tackle the technical obstacles, efforts have been made under the framework of a doctoral research program. In this overview article, by cross-referencing the current literature, we summarize how we have developed a preclinical cardiac research platform based on modified models of reperfused myocardial infarction (MI) in rabbits; how the in vivo manifestations of cardiac imaging could be closely matched with those ex vivo macro- and microscopic findings; how these imaging outcomes could be quantitatively analyzed, validated and demonstrated; and how we could apply this cardiac imaging platform to provide possible solutions to certain lingering diagnostic and therapeutic problems in experimental cardiology. In particular, tissue components in acute cardiac ischemia have been stratified and characterized, post-infarct lipomatous metaplasia (LM), a common but rarely illuminated clinical pathology, has been identified in rabbit models, and a necrosis avid tracer as well as an anti-ischemic drug have been successfully assessed for their potential utilities in clinical cardiology. These outcomes may interest researchers in related fields and help strengthen translational research in cardiovascular diseases.

  8. Geostationary platform systems concepts definition study. Volume 2: Technical, book 1

    NASA Technical Reports Server (NTRS)

    1980-01-01

    The initial selection and definition of operational geostationary platform concepts is discussed. Candidate geostationary platform missions and payloads were identified from COMSAT, Aerospace, and NASA studies. These missions and payloads were cataloged; classified with respect to communications, military, or scientific uses; screened for application and compatibility with geostationary platforms; and analyzed to identify platform requirements. Two platform locations were then selected (Western Hemisphere - 110 deg W, and Atlantic - 15 deg W), and payloads allocated based on nominal and high traffic models. Trade studies were performed leading to recommendation of selected concepts. Of 30 Orbit Transfer Vehicle (OTV) configuration and operating mode options identified, 18 viable candidates compatible with the operational geostationary platform missions were selected for analysis. Each was considered using four platform operational modes - 8 or 16 year life, and serviced or nonserviced - providing a total of 72 OTV/platform-mode options. For final trade study concept selection, a cost program was developed considering payload and platform costs and weight; transportation unit and total costs for the shuttle and OTV; and operational costs such as assembly or construction time, mating time, and loiter time. Servicing costs were added for the final analysis and recommended selection.

  9. How close are we to customizing chemotherapy in early non-small cell lung cancer?

    PubMed Central

    Ioannidis, Georgios; Georgoulias, Vassilis; Souglakos, John

    2011-01-01

    Although surgery is the only potentially curative treatment for early-stage non-small cell lung cancer (NSCLC), 5-year survival rates range from 77% for stage IA tumors to 23% in stage IIIA disease. Adjuvant chemotherapy has recently been established as a standard of care for resected stage II-III NSCLC, on the basis of large-scale clinical trials employing third-generation platinum-based regimens. As the overall absolute 5-year survival benefit from this approach does not exceed 5% and potential long-term complications are an issue of concern, the aim of customized adjuvant systemic treatment is to optimize the toxicity/benefit ratio, so that low-risk individuals are spared from unnecessary intervention, while avoiding undertreatment of high-risk patients, including those with stage I disease. Therefore, the application of reliable prognostic and predictive biomarkers would enable identification of appropriate patients for the most effective treatment. This is an overview of the data available on the most promising clinicopathological and molecular biomarkers that could affect adjuvant and neoadjuvant chemotherapy decisions for operable NSCLC in routine practice. Among the numerous candidate molecular biomarkers, only a few gene-expression profiling signatures provide clinically relevant information warranting further validation. On the other hand, a real-time quantitative polymerase-chain-reaction strategy involving a relatively small number of genes offers a practical alternative, with high cross-platform performance. Although data extrapolation from the metastatic setting should be cautious, the concept of personalized, pharmacogenomics-guided chemotherapy for early NSCLC seems feasible, and is currently being evaluated in randomized phase 2 and 3 trials. The mRNA and/or protein expression levels of excision repair cross-complementation group 1, ribonucleotide reductase M1, and breast cancer susceptibility gene 1 are among the most promising biomarkers for early disease, with stage-independent prognostic and predictive values, the clinical utility of which is being validated prospectively. Inter-assay discordance in determining biomarker status and association with clinical outcomes is noteworthy. PMID:21904580

  10. Novel Directional Protection Scheme for the FREEDM Smart Grid System

    NASA Astrophysics Data System (ADS)

    Sharma, Nitish

    This research primarily deals with the design and validation of the protection system for a large scale meshed distribution system. The large scale system simulation (LSSS) is a system-level PSCAD model which is used to validate component models for different time-scale platforms, to provide a virtual testing platform for the Future Renewable Electric Energy Delivery and Management (FREEDM) system. It is also used to validate the cases of power system protection, renewable energy integration and storage, and load profiles. The protection of the FREEDM system against any abnormal condition is one of the important tasks. The addition of distributed generation and the power electronic based solid state transformer adds to the complexity of the protection. The FREEDM loop system has a fault current limiter and, in addition, the Solid State Transformer (SST) limits the fault current at 2.0 per unit. Former students at ASU developed a protection scheme using fiber-optic cable; however, during the NSF-FREEDM site visit, the National Science Foundation (NSF) team regarded the system as unsuitable for long distances. Hence, a new protection scheme based on wireless communication is presented in this thesis. The use of wireless communication is extended to protect the large scale meshed distributed generation from any fault. The trip signal generated by the pilot protection system is used to trigger the FID (fault isolation device), an electronic circuit breaker, to operate (switching off/opening the FIDs). The trip signal must also be received and accepted by the SST, which must block the SST operation immediately. A comprehensive protection system for the large scale meshed distribution system has been developed in PSCAD with the ability to quickly detect faults. The validation of the protection system is performed by building a hardware model using commercial relays at the ASU power laboratory.
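
    The pilot protection decision itself can be sketched as percentage-differential logic: each end shares its current phasor over the communication channel and trips when the difference current exceeds a restrained threshold. The thresholds below are hypothetical, not the thesis settings:

      # Percentage-differential pilot trip logic; pickup/slope are hypothetical.
      import cmath

      def pilot_trip(i_local, i_remote, pickup=0.2, slope=0.3):
          """Trip when the difference current exceeds a pickup plus a
          fraction of the restraint (through) current."""
          i_diff = abs(i_local + i_remote)          # currents referenced into zone
          i_rest = (abs(i_local) + abs(i_remote)) / 2
          return i_diff > pickup + slope * i_rest

      # Load current flows through the zone (no trip)...
      print(pilot_trip(cmath.rect(1.0, 0.0), cmath.rect(1.0, cmath.pi)))
      # ...an internal fault feeds in from both ends (trip -> signal SST/FIDs).
      print(pilot_trip(cmath.rect(1.8, 0.1), cmath.rect(1.6, 0.2)))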

  11. Developing a model of competence in the operating theatre: psychometric validation of the perceived perioperative competence scale-revised.

    PubMed

    Gillespie, Brigid M; Polit, Denise F; Hamlin, Lois; Chaboyer, Wendy

    2012-01-01

    This paper describes the development and validation of the Perceived Perioperative Competence Scale - Revised (PPCS-R). There is a lack of psychometrically sound self-assessment tools to measure nurses' perceived competence in the operating room. Content validity was established by a panel of international experts, and the original 98-item scale was pilot tested with 345 nurses in Queensland, Australia. Following the removal of several items, a national sample that included all 3209 nurses who were members of the Australian College of Operating Room Nurses was surveyed using the 94-item version. Psychometric testing assessed content validity using exploratory factor analysis, internal consistency using Cronbach's alpha, and construct validity using the "known groups" technique. During item reduction, several preliminary factor analyses were performed on two random halves of the sample (n=550). Usable data for psychometric assessment were obtained from 1122 nurses. The original 94-item scale was reduced to 40 items. The final factor analysis using the entire sample resulted in a 40-item, six-factor solution. Cronbach's alpha for the 40-item scale was .96. Construct validation demonstrated significant differences (p<.0001) in perceived competence scores relative to years of operating room experience and receipt of specialty education. On the basis of these results, the psychometric properties of the PPCS-R were considered encouraging. Further testing of the tool in different samples of operating room nurses is necessary to enable cross-cultural comparisons. Copyright © 2011 Elsevier Ltd. All rights reserved.
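
    Cronbach's alpha, reported above as .96, follows directly from its definition, alpha = k/(k-1) * (1 - sum(item variances) / variance(total score)); the sketch below computes it on synthetic Likert-style responses standing in for the PPCS-R data:

      # Cronbach's alpha from its definition; responses are synthetic.
      import numpy as np

      def cronbach_alpha(items):
          """items: (n_respondents, k_items) score matrix."""
          k = items.shape[1]
          item_vars = items.var(axis=0, ddof=1).sum()
          total_var = items.sum(axis=1).var(ddof=1)
          return k / (k - 1) * (1 - item_vars / total_var)

      rng = np.random.default_rng(4)
      latent = rng.normal(0, 1, size=(300, 1))              # shared trait
      items = np.rint(3 + latent + rng.normal(0, 0.8, (300, 40))).clip(1, 5)
      print(f"alpha = {cronbach_alpha(items):.2f}")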

  12. Decentralized State Estimation and Remedial Control Action for Minimum Wind Curtailment Using Distributed Computing Platform

    DOE PAGES

    Liu, Ren; Srivastava, Anurag K.; Bakken, David E.; ...

    2017-08-17

    Intermittency of wind energy poses a great challenge for power system operation and control. Wind curtailment might be necessary at certain operating conditions to keep line flows within limits. A Remedial Action Scheme (RAS) offers a quick control action mechanism to maintain the reliability and security of power system operation with high wind energy integration. In this paper, a new RAS is developed to maximize wind energy integration without compromising the security and reliability of the power system, based on specific utility requirements. A new Distributed Linear State Estimation (DLSE) is also developed to provide fast and accurate input data for the proposed RAS. A distributed computational architecture is designed to guarantee the robustness of the cyber system supporting RAS and DLSE implementation. The proposed RAS and DLSE are validated using the modified IEEE-118 bus system. Simulation results demonstrate the satisfactory performance of the DLSE and the effectiveness of the RAS. A real-time cyber-physical testbed has been utilized to validate the cyber-resiliency of the developed RAS against computational node failure.

  13. Recent developments of DMI's operational system: Coupled Ecosystem-Circulation-and SPM model.

    NASA Astrophysics Data System (ADS)

    Murawski, Jens; Tian, Tian; Dobrynin, Mikhail

    2010-05-01

    ECOOP is a pan-European project with 72 partners from 29 countries around the Baltic Sea, the North Sea, the Iberia-Biscay-Ireland region, the Mediterranean Sea and the Black Sea. The project aims at the development and integration of the different coastal and regional observation and forecasting systems. The Danish Meteorological Institute (DMI) coordinates the project and is responsible for the Baltic Sea regional forecasting system. Over the project period, the Baltic Sea system was developed from a purely hydrodynamic model (version V1), running operationally since summer 2009, to a coupled model platform (version V2), including model components for the simulation of suspended particles, data assimilation and ecosystem variables. The ECOOP V2 model is currently being tested and validated, and will replace the V1 version soon. The coupled biogeochemical and circulation model has run operationally since November 2009. The daily forecasts are presented at DMI's homepage http://ocean.dmi.dk. The presentation includes a short description of the ECOOP forecasting system, discusses the model results and shows the outcome of the model validation.

  14. Decentralized State Estimation and Remedial Control Action for Minimum Wind Curtailment Using Distributed Computing Platform

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Ren; Srivastava, Anurag K.; Bakken, David E.

    Intermittency of wind energy poses a great challenge for power system operation and control. Wind curtailment might be necessary at certain operating conditions to keep line flows within limits. A Remedial Action Scheme (RAS) offers a quick control action mechanism to maintain the reliability and security of power system operation with high wind energy integration. In this paper, a new RAS is developed to maximize wind energy integration without compromising the security and reliability of the power system, based on specific utility requirements. A new Distributed Linear State Estimation (DLSE) is also developed to provide fast and accurate input data for the proposed RAS. A distributed computational architecture is designed to guarantee the robustness of the cyber system supporting RAS and DLSE implementation. The proposed RAS and DLSE are validated using the modified IEEE-118 bus system. Simulation results demonstrate the satisfactory performance of the DLSE and the effectiveness of the RAS. A real-time cyber-physical testbed has been utilized to validate the cyber-resiliency of the developed RAS against computational node failure.

  15. The moderate resolution imaging spectrometer (MODIS) science and data system requirements

    NASA Technical Reports Server (NTRS)

    Ardanuy, Philip E.; Han, Daesoo; Salomonson, Vincent V.

    1991-01-01

    The Moderate Resolution Imaging Spectrometer (MODIS) has been designated as a facility instrument on the first NASA polar orbiting platform as part of the Earth Observing System (EOS) and is scheduled for launch in the late 1990s. The near-global daily coverage of MODIS, combined with its continuous operation, broad spectral coverage, and relatively high spatial resolution, makes it central to the objectives of EOS. The development, implementation, production, and validation of the core MODIS data products define a set of functional, performance, and operational requirements on the data system that operate between the sensor measurements and the data products supplied to the user community. The science requirements guiding the processing of MODIS data are reviewed, and the aspects of an operations concept for the production of data products from MODIS for use by the scientific community are discussed.

  16. Analysis of human plasma metabolites across different liquid chromatography/mass spectrometry platforms: Cross-platform transferable chemical signatures.

    PubMed

    Telu, Kelly H; Yan, Xinjian; Wallace, William E; Stein, Stephen E; Simón-Manso, Yamil

    2016-03-15

    The metabolite profiling of a NIST plasma Standard Reference Material (SRM 1950) on different liquid chromatography/mass spectrometry (LC/MS) platforms showed significant differences. Although these findings suggest caution when interpreting metabolomics results, the degree of overlap of both profiles allowed us to use tandem mass spectral libraries of recurrent spectra to evaluate to what extent these results are transferable across platforms and to develop cross-platform chemical signatures. Non-targeted global metabolite profiles of SRM 1950 were obtained on different LC/MS platforms using reversed-phase chromatography and different chromatographic scales (conventional HPLC, UHPLC and nanoLC). The data processing and the metabolite differential analysis were carried out using publicly available (XCMS), proprietary (Mass Profiler Professional) and in-house software (NIST pipeline). Repeatability and intermediate precision showed that the non-targeted SRM 1950 profiling was highly reproducible when working on the same platform (relative standard deviation (RSD) <2%); however, substantial differences were found in the LC/MS patterns originating on different platforms or even using different chromatographic scales (conventional HPLC, UHPLC and nanoLC) on the same platform. A substantial degree of overlap (common molecular features) was also found. A procedure to generate consistent chemical signatures using tandem mass spectral libraries of recurrent spectra is proposed. Different platforms rendered significantly different metabolite profiles, but the results were highly reproducible when working within one platform. Tandem mass spectral libraries of recurrent spectra are proposed to evaluate the degree of transferability of chemical signatures generated on different platforms. Chemical signatures based on our procedure are most likely cross-platform transferable. Published in 2016. This article is a U.S. Government work and is in the public domain in the USA.

  17. Future Directions for Astronomical Image Display

    NASA Technical Reports Server (NTRS)

    Mandel, Eric

    2000-01-01

    In the "Future Directions for Astronomical Image Displav" project, the Smithsonian Astrophysical Observatory (SAO) and the National Optical Astronomy Observatories (NOAO) evolved our existing image display program into fully extensible. cross-platform image display software. We also devised messaging software to support integration of image display into astronomical analysis systems. Finally, we migrated our software from reliance on Unix and the X Window System to a platform-independent architecture that utilizes the cross-platform Tcl/Tk technology.

  18. Increased anatomic severity predicts outcomes: Validation of the American Association for the Surgery of Trauma's Emergency General Surgery score in appendicitis.

    PubMed

    Hernandez, Matthew C; Aho, Johnathon M; Habermann, Elizabeth B; Choudhry, Asad J; Morris, David S; Zielinski, Martin D

    2017-01-01

    Determination and reporting of disease severity in emergency general surgery lacks standardization. Recently, the American Association for the Surgery of Trauma (AAST) proposed an anatomic severity grading system. We aimed to validate this system in patients with appendicitis and determine if cross-sectional imaging correlates with disease severity at operation. Patients 18 years or older undergoing treatment for acute appendicitis between 2013 and 2015 were identified. Baseline demographics and procedure types were recorded, and AAST grades were assigned based on intraoperative and radiologic findings. Outcomes included length of stay, 30-day mortality, and complications based on Clavien-Dindo categories and National Surgical Quality Improvement Program variables. Summary statistical univariate, nominal logistic, and standard least squares analyses were performed comparing AAST grade with key outcomes. Bland-Altman analysis compared operative findings with preoperative cross-sectional imaging for assigning grades. Three hundred thirty-four patients with a mean (±SD) age of 39.3 years (±16.5) were included (53% men), and all patients had cross-sectional imaging. Two hundred ninety-nine underwent appendectomy, 85% completed laparoscopically. The thirty-day mortality rate was 0.9%; the complication rate was 21%. Increased (median [interquartile range, IQR]) AAST grade was recorded in patients with complications (2 [1-4]) compared with those without (1 [1-1], p = 0.001). For operative management, (median [IQR]) AAST grades were significantly associated with procedure type: laparoscopic (1 [1-1]), open (4 [2-5]), conversion to open (3 [1-4], p = 0.001). Increased (median [IQR]) AAST grades were also significantly associated with complications in nonoperative management: patients having a complication had a higher median AAST grade (4 [3-5]) compared with those without (3 [2-3], p = 0.001). Bland-Altman analysis comparing AAST grade and cross-sectional imaging demonstrated no difference (-0.02 ± 0.02; p = 0.2; coefficient of repeatability 0.9). The AAST grading system is valid in our population. Increased AAST grade is associated with open procedures, complications, and length of stay. The AAST emergency general surgery grade determined by preoperative imaging strongly correlated with operative findings. Epidemiologic/prognostic study, level III.
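
    Bland-Altman agreement, as applied above, reduces to the mean difference (bias) and its 95% limits of agreement; a short Python illustration on synthetic grade pairs (not the study's data):

      # Bland-Altman bias and 95% limits of agreement on synthetic grades.
      import numpy as np

      rng = np.random.default_rng(5)
      operative = rng.integers(1, 6, size=100).astype(float)   # grades 1-5
      imaging = operative + rng.normal(0, 0.3, size=100)       # close agreement

      diff = imaging - operative
      bias = diff.mean()
      loa = 1.96 * diff.std(ddof=1)
      print(f"bias = {bias:+.2f}, 95% limits of agreement = "
            f"[{bias - loa:.2f}, {bias + loa:.2f}]")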

  19. MONO FOR CROSS-PLATFORM CONTROL SYSTEM ENVIRONMENT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nishimura, Hiroshi; Timossi, Chris

    2006-10-19

    Mono is an independent implementation of the .NET Framework by Novell that runs on multiple operating systems (including Windows, Linux and Macintosh) and allows any .NET compatible application to run unmodified. For instance, Mono can run programs with graphical user interfaces (GUI) developed with the C# language on Windows with Visual Studio (a full port of WinForm for Mono is in progress). We present the results of tests we performed to evaluate the portability of our controls system .NET applications from MS Windows to Linux.

  20. Translation and validation of the new version of the Knee Society Score - The 2011 KS Score - into Brazilian Portuguese.

    PubMed

    Silva, Adriana Lucia Pastore E; Croci, Alberto Tesconi; Gobbi, Riccardo Gomes; Hinckel, Betina Bremer; Pecora, José Ricardo; Demange, Marco Kawamura

    2017-01-01

    Translation, cultural adaptation, and validation of the new version of the Knee Society Score - The 2011 KS Score - into Brazilian Portuguese, and verification of its measurement properties, reproducibility, and validity. In 2012, the new version of the Knee Society Score was developed and validated. This scale comprises four separate subscales: (a) objective knee score (seven items: 100 points); (b) patient satisfaction score (five items: 40 points); (c) patient expectations score (three items: 15 points); and (d) functional activity score (19 items: 100 points). A total of 90 patients aged 55-85 years were evaluated in a clinical cross-sectional study. The pre-operative translated version was applied to patients with a TKA referral, and the post-operative translated version was applied to patients who had undergone TKA. Each patient answered the same questionnaire twice and was evaluated by two experts in orthopedic knee surgery. Evaluations were performed pre-operatively and three, six, or 12 months post-operatively. The reliability of the questionnaire was evaluated using the intraclass correlation coefficient (ICC) between the two applications. Internal consistency was evaluated using Cronbach's alpha. The ICC showed no difference between the means of the pre-operative, three-month, and six-month post-operative evaluations across sub-scale items. The Brazilian Portuguese version of The 2011 KS Score is a valid and reliable instrument for objective and subjective evaluation of the functionality of Brazilian patients who undergo TKA and revision TKA.

  1. OMPS SDR Calibration and Validation

    NASA Astrophysics Data System (ADS)

    Sen, B.; Done, J.; Buss, R.; Jaross, G. R.; Kelly, T. J.

    2009-12-01

    The Ozone Mapper and Profiler Suite (OMPS) is scheduled to be launched on the NPOESS Preparatory Project (NPP) platform in early 2011. The OMPS will continue monitoring ozone from space, using three instruments, namely the Total Column Mapper (heritage: TOMS), the Nadir Profiler (heritage: SBUV) and the Limb Profiler (heritage: SOLSE/LORE). The Total Column Mapper (TC) sensor images the Earth through a slit, with nadir cells horizontally spaced at 49.5 km cross-track and an along-track reporting interval of 50 km. The total field of view (FOV) cross-track is 110 degrees to provide daily global coverage. The TC sensor, a grating spectrometer, provides 0.45 nm spectral sampling across the wavelength range of 300-380 nm. The calibration stability, which is essential to enable long-term ozone monitoring, is maintained by periodic observations of the Sun, using a diffuser to redirect the solar irradiance into the sensor. We describe the plans to calibrate the TC sensor and validate the radiance data (TC Sensor Data Record or TC SDR) after launch. We discuss the measurements planned during the Intensive Cal/Val (ICV) phase of the NPP mission, the data analysis methodology, and results from the analysis of OMPS calibration measurements.

  2. Direct target NOTES: prospective applications for next generation robotic platforms.

    PubMed

    Atallah, S; Hodges, A; Larach, S W

    2018-05-01

    A new era in surgical robotics has centered on alternative access to anatomic targets, and next generation designs include flexible, single-port systems which follow circuitous rather than straight pathways. Such systems maintain a small footprint and could be utilized for specialized operations based on direct organ target natural orifice transluminal endoscopic surgery (NOTES), of which transanal total mesorectal excision (taTME) is an important derivative. During two sessions, four direct target NOTES operations were conducted on a cadaveric model using a flexible robotic system to demonstrate proof-of-concept of the application of a next generation robotic system to specific types of NOTES operations, all of which required removal of a direct target organ through natural orifice access. These four operations were (a) robotic taTME, (b) robotic transvaginal hysterectomy in conjunction with (c) robotic transvaginal salpingo-oophorectomy, and, in an ex vivo model, (d) trans-cecal appendectomy. Feasibility was demonstrated in all cases using the Flex® Robotic System with Colorectal Drive. During taTME, the platform excursion was 17 cm along a non-linear path; operative time was 57 min for the transanal portion of the dissection. Robotic transvaginal hysterectomy was successfully completed in 78 min with transvaginal extraction of the uterus, although laparoscopic assistance was required. Robotic transvaginal unilateral salpingo-oophorectomy with transvaginal extraction of the ovary and fallopian tube was performed without laparoscopic assistance in 13.5 min. In an ex vivo model, a robotic trans-cecal appendectomy was also successfully performed for the purpose of demonstrating proof-of-concept only; this was completed in 24 min. A flexible robotic system has the potential to access anatomy along circuitous paths, making it a suitable platform for direct target NOTES. The conceptual operations posed could be considered suitable for next generation robotics once the technology is optimized, and after further preclinical validation.

  3. Polyethylene glycol modified, cross-linked starch-coated iron oxide nanoparticles for enhanced magnetic tumor targeting.

    PubMed

    Cole, Adam J; David, Allan E; Wang, Jianxin; Galbán, Craig J; Hill, Hannah L; Yang, Victor C

    2011-03-01

    While successful magnetic tumor targeting of iron oxide nanoparticles has been achieved in a number of models, the rapid blood clearance of magnetically suitable particles by the reticuloendothelial system (RES) limits their availability for targeting. This work aimed to develop a long-circulating magnetic iron oxide nanoparticle (MNP) platform capable of sustained tumor exposure via the circulation and, thus, potentially enhanced magnetic tumor targeting. Aminated, cross-linked starch (DN) and aminosilane (A) coated MNPs were successfully modified with 5 kDa (A5, D5) or 20 kDa (A20, D20) polyethylene glycol (PEG) chains using simple N-hydroxysuccinimide (NHS) chemistry and characterized. Identical PEG-weight analogues between platforms (A5 & D5, A20 & D20) were similar in size (140-190 nm) and relative PEG labeling (1.5% of surface amines for A5/D5, 0.4% for A20/D20), with all PEG-MNPs possessing magnetization properties suitable for magnetic targeting. Candidate PEG-MNPs were studied in RES simulations in vitro to predict long-circulating character. D5 and D20 performed best, showing sustained size stability in cell culture medium at 37 °C and 7-fold (D20) to 10-fold (D5) lower uptake in RAW264.7 macrophages when compared to previously targeted, unmodified starch MNPs (D). Observations in vitro were validated in vivo, with D5 (7.29 h) and D20 (11.75 h) showing much longer half-lives than D (0.12 h). Improved plasma stability enhanced tumor MNP exposure 100-fold (D5) to 150-fold (D20) as measured by plasma AUC(0-∞). Sustained tumor exposure over 24 h was visually confirmed in a 9L-glioma rat model (12 mg Fe/kg) using magnetic resonance imaging (MRI). Findings indicate that a polyethylene glycol modified, cross-linked starch-coated MNP is a promising platform for enhanced magnetic tumor targeting, warranting further study in tumor models. Copyright © 2010 Elsevier Ltd. All rights reserved.

  4. Current Status and Future Plans of the NEON Airborne Observation Platform (AOP): Data Products, Observatory Requirements and Opportunities for the Community

    NASA Astrophysics Data System (ADS)

    Petroy, S. B.; Leisso, N.; Goulden, T.; Gulbransen, T.

    2016-12-01

    The National Ecological Observatory Network (NEON) is a continental-scale ecological observation platform designed to collect and disseminate data that contributes to understanding and forecasting the impacts of climate change, land use change, and invasive species on ecology. NEON will collect in-situ and airborne data over 81 sites across the US, including Alaska, Hawaii, and Puerto Rico. The Airborne Observation Platform (AOP) group within the NEON project operates a payload suite that includes a waveform LiDAR, imaging spectrometer (NIS) and high resolution RGB camera. Data from this sensor suite will be collected annually over each site and processed into a set of standard data products, generally following the processing levels used by NASA (Level 1 through Level 3). We will present a summary of the first operational flight campaign (2016), where AOP flew 42 of the 81 planned NEON sites, our operational plans for 2017, and how we will ramp up to full operations by 2018. We will also describe the final set of AOP data products to be delivered as part of NEON construction and those field (observational) data products collected concurrently on the ground, that may be used to support validation efforts of algorithms for deriving vegetation characteristics from airborne data (e.g. Plant foliar physical/chemical properties, Digital Hemispherical Photos, Plant Diversity, etc.). Opportunities for future enhancements to data products or algorithms will be facilitated via NEON's cyberinfrastructure, which is designed to support wrapping/integration of externally-developed code. And finally, we will present NEON's plans for the third AOP Sensor Suite as an assignable asset and the intent of NSF to provide research opportunities to the community for developing higher level AOP data products that were removed from the NEON project in 2015.

  5. Evaluation of Smartphone Inertial Sensor Performance for Cross-Platform Mobile Applications

    PubMed Central

    Kos, Anton; Tomažič, Sašo; Umek, Anton

    2016-01-01

    Smartphone sensors are being increasingly used in mobile applications. The performance of sensors varies considerably among different smartphone models, and the development of a cross-platform mobile application can be a very complex and demanding task. A publicly accessible resource containing real-life-situation smartphone sensor parameters could be of great help to cross-platform developers. To address this issue we have designed and implemented a pilot participatory sensing application for measuring, gathering, and analyzing smartphone sensor parameters. We start with smartphone accelerometer and gyroscope bias and noise parameters. The application database presently includes sensor parameters for more than 60 different smartphone models across platforms. It is a modest but important start, offering information on several statistical parameters of the measured smartphone sensors and insights into their performance. The next step, a large-scale cloud-based version of the application, is already planned. The large database of smartphone sensor parameters may prove particularly useful for cross-platform developers. It may also be interesting for individual participants, who would be able to check and compare their smartphone sensors against a large number of similar or identical models. PMID:27049391
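
    As a rough illustration of the kind of parameters being gathered, accelerometer bias and noise can be estimated from a short stationary recording. The sketch below is hypothetical, not the paper's application code; the flat, z-up device orientation and the 9.81 m/s² gravity reference are assumptions.

    ```python
    import numpy as np

    GRAVITY = 9.81  # assumed gravity magnitude, m/s^2

    def bias_and_noise(samples):
        """Per-axis bias and noise (std) from an (N x 3) stationary
        accelerometer recording taken with the device flat, z-axis up."""
        expected = np.array([0.0, 0.0, GRAVITY])  # ideal stationary reading
        bias = samples.mean(axis=0) - expected    # constant offset per axis
        noise = samples.std(axis=0, ddof=1)       # white-noise level per axis
        return bias, noise

    # Hypothetical stationary recording: 2000 noisy samples with a small bias.
    rng = np.random.default_rng(42)
    rec = np.array([0.02, -0.01, GRAVITY + 0.05]) + 0.03 * rng.standard_normal((2000, 3))
    bias, noise = bias_and_noise(rec)
    print("bias:", bias, "noise:", noise)
    ```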

  6. Continuous measurement of breast tumor hormone receptor expression: a comparison of two computational pathology platforms

    PubMed Central

    Ahern, Thomas P.; Beck, Andrew H.; Rosner, Bernard A.; Glass, Ben; Frieling, Gretchen; Collins, Laura C.; Tamimi, Rulla M.

    2017-01-01

    Background Computational pathology platforms incorporate digital microscopy with sophisticated image analysis to permit rapid, continuous measurement of protein expression. We compared two computational pathology platforms on their measurement of breast tumor estrogen receptor (ER) and progesterone receptor (PR) expression. Methods Breast tumor microarrays from the Nurses' Health Study were stained for ER (n=592) and PR (n=187). One expert pathologist scored cases as positive if ≥1% of tumor nuclei exhibited stain. ER and PR were then measured with the Definiens Tissue Studio (automated) and Aperio Digital Pathology (user-supervised) platforms. Platform-specific measurements were compared using boxplots, scatter plots, and correlation statistics. Classification of ER and PR positivity by platform-specific measurements was evaluated with areas under receiver operating characteristic curves (AUC) from univariable logistic regression models, using expert pathologist classification as the standard. Results Both platforms showed considerable overlap in continuous measurements of ER and PR between positive and negative groups classified by expert pathologist. Platform-specific measurements were strongly and positively correlated with one another (rho ≥ 0.77). The user-supervised Aperio workflow performed slightly better than the automated Definiens workflow at classifying ER positivity (AUC_Aperio = 0.97; AUC_Definiens = 0.90; difference = 0.07, 95% CI: 0.05, 0.09) and PR positivity (AUC_Aperio = 0.94; AUC_Definiens = 0.87; difference = 0.07, 95% CI: 0.03, 0.12). Conclusion Paired hormone receptor expression measurements from two different computational pathology platforms agreed well with one another. The user-supervised workflow yielded better classification accuracy than the automated workflow. Appropriately validated computational pathology algorithms enrich molecular epidemiology studies with continuous protein expression data and may accelerate tumor biomarker discovery. PMID:27729430
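
    The AUC comparison used here can be mimicked with standard tooling. A minimal sketch, assuming scikit-learn is available and using fabricated scores and labels (nothing below is the study's data):

    ```python
    import numpy as np
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(1)
    expert_positive = rng.integers(0, 2, size=200)          # pathologist's call (0/1)
    # Hypothetical continuous stain measurements from two platforms:
    score_a = expert_positive * 1.5 + rng.normal(size=200)  # better-separated platform
    score_b = expert_positive * 0.9 + rng.normal(size=200)  # noisier platform

    print("AUC platform A:", round(roc_auc_score(expert_positive, score_a), 2))
    print("AUC platform B:", round(roc_auc_score(expert_positive, score_b), 2))
    ```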

  7. Finite difference time domain electromagnetic scattering from frequency-dependent lossy materials

    NASA Technical Reports Server (NTRS)

    Luebbers, Raymond J.; Beggs, John H.

    1991-01-01

    Four different FDTD computer codes and companion Radar Cross Section (RCS) conversion codes on magnetic media are submitted. A single three-dimensional dispersive FDTD code for both dispersive dielectric and magnetic materials was developed, along with a user's manual. FDTD was extended to more complicated materials. The code is efficient and is capable of modeling interesting radar targets using a modest computer workstation platform. RCS results for two different plate geometries are reported. The FDTD method was also extended to computing far-zone time domain results in two dimensions. The capability to model nonlinear materials was also incorporated into FDTD and validated.
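
    For orientation, the core of any FDTD code is a leapfrog update of interleaved electric and magnetic fields. A minimal one-dimensional free-space sketch follows; it deliberately omits the dispersive-material and far-zone machinery the report describes:

    ```python
    import numpy as np

    NZ, NT = 200, 500          # grid cells, time steps
    ez = np.zeros(NZ)          # electric field (normalized units)
    hy = np.zeros(NZ)          # magnetic field (normalized units)
    for t in range(NT):
        # Magnetic-field half-step (Courant number 0.5)
        hy[:-1] += 0.5 * (ez[1:] - ez[:-1])
        # Electric-field half-step
        ez[1:] += 0.5 * (hy[1:] - hy[:-1])
        ez[100] += np.exp(-((t - 30) ** 2) / 100.0)  # soft Gaussian source
    print("peak |Ez| after", NT, "steps:", np.abs(ez).max())
    ```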

  8. Open Targets: a platform for therapeutic target identification and validation

    PubMed Central

    Koscielny, Gautier; An, Peter; Carvalho-Silva, Denise; Cham, Jennifer A.; Fumis, Luca; Gasparyan, Rippa; Hasan, Samiul; Karamanis, Nikiforos; Maguire, Michael; Papa, Eliseo; Pierleoni, Andrea; Pignatelli, Miguel; Platt, Theo; Rowland, Francis; Wankar, Priyanka; Bento, A. Patrícia; Burdett, Tony; Fabregat, Antonio; Forbes, Simon; Gaulton, Anna; Gonzalez, Cristina Yenyxe; Hermjakob, Henning; Hersey, Anne; Jupe, Steven; Kafkas, Şenay; Keays, Maria; Leroy, Catherine; Lopez, Francisco-Javier; Magarinos, Maria Paula; Malone, James; McEntyre, Johanna; Munoz-Pomer Fuentes, Alfonso; O'Donovan, Claire; Papatheodorou, Irene; Parkinson, Helen; Palka, Barbara; Paschall, Justin; Petryszak, Robert; Pratanwanich, Naruemon; Sarntivijal, Sirarat; Saunders, Gary; Sidiropoulos, Konstantinos; Smith, Thomas; Sondka, Zbyslaw; Stegle, Oliver; Tang, Y. Amy; Turner, Edward; Vaughan, Brendan; Vrousgou, Olga; Watkins, Xavier; Martin, Maria-Jesus; Sanseau, Philippe; Vamathevan, Jessica; Birney, Ewan; Barrett, Jeffrey; Dunham, Ian

    2017-01-01

    We have designed and developed a data integration and visualization platform that provides evidence about the association of known and potential drug targets with diseases. The platform is designed to support identification and prioritization of biological targets for follow-up. Each drug target is linked to a disease using integrated genome-wide data from a broad range of data sources. The platform provides either a target-centric workflow to identify diseases that may be associated with a specific target, or a disease-centric workflow to identify targets that may be associated with a specific disease. Users can easily transition between these target- and disease-centric workflows. The Open Targets Validation Platform is accessible at https://www.targetvalidation.org. PMID:27899665

  9. jsc2018m000314_Spinning_Science_Multi-use_Variable-g_Platform_Arrives_at_the_Space_Station-MP4

    NASA Image and Video Library

    2018-05-09

    Spinning Science: Multi-use Variable-g Platform Arrives at the Space Station --- The Multi-use Variable-gravity Platform (MVP) Validation mission will install and test the MVP, a new hardware platform developed and owned by Techshot Inc., on the International Space Station (ISS). Though the MVP is designed for research with many different kinds of organisms and cell types, this validation mission will focus on Drosophila melanogaster, more commonly known as the fruit fly. The platform will be especially important for fruit fly research, as it will allow researchers to study larger sample sizes of Drosophila melanogaster than previous centrifuge-equipped hardware allowed, and it will be able to support fly colonies for multiple generations.

  10. Toward an E-Government Semantic Platform

    NASA Astrophysics Data System (ADS)

    Sbodio, Marco Luca; Moulin, Claude; Benamou, Norbert; Barthès, Jean-Paul

    This chapter describes the major aspects of an e-government platform in which semantics underpins more traditional technologies in order to enable new capabilities and to overcome technical and cultural challenges. The design and development of this e-government Semantic Platform has been conducted with the financial support of the European Commission through the Terregov research project: "Impact of e-government on Territorial Government Services" (Terregov 2008). The goal of the platform is to let local governments and government agencies offer online access to their services in an interoperable way, and to allow them to participate in orchestrated processes involving services provided by multiple agencies. Implementing a business process through an electronic procedure is indeed a core goal in any networked organization. However, the field of e-government brings specific constraints to the operations allowed in procedures, especially concerning the flow of private citizens' data: for legal reasons, in most countries such data are allowed to circulate only from agency to agency directly. In order to promote transparency and responsibility in e-government while respecting these constraints on data flows, Terregov supports the creation of centrally controlled orchestrated processes: the orchestration itself is centrally managed, while the data flow directly across agencies.

  11. DataViewer3D: An Open-Source, Cross-Platform Multi-Modal Neuroimaging Data Visualization Tool

    PubMed Central

    Gouws, André; Woods, Will; Millman, Rebecca; Morland, Antony; Green, Gary

    2008-01-01

    Integration and display of results from multiple neuroimaging modalities [e.g. magnetic resonance imaging (MRI), magnetoencephalography, EEG] relies on display of a diverse range of data within a common, defined coordinate frame. DataViewer3D (DV3D) is a multi-modal imaging data visualization tool offering a cross-platform, open-source solution to simultaneous data overlay visualization requirements of imaging studies. While DV3D is primarily a visualization tool, the package allows an analysis approach where results from one imaging modality can guide comparative analysis of another modality in a single coordinate space. DV3D is built on Python, a dynamic object-oriented programming language with support for integration of modular toolkits, and development of cross-platform software for neuroimaging. DV3D harnesses the power of the Visualization Toolkit (VTK) for two-dimensional (2D) and 3D rendering, calling VTK's low level C++ functions from Python. Users interact with data via an intuitive interface that uses Python to bind wxWidgets, which in turn calls the user's operating system dialogs and graphical user interface tools. DV3D currently supports NIfTI-1, ANALYZE™ and DICOM formats for MRI data display (including statistical data overlay). Formats for other data types are supported. The modularity of DV3D and ease of use of Python allows rapid integration of additional format support and user development. DV3D has been tested on Mac OSX, RedHat Linux and Microsoft Windows XP. DV3D is offered for free download with an extensive set of tutorial resources and example data. PMID:19352444
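
    DV3D's rendering layer is ordinary VTK driven from Python. The generic sketch below shows that pattern (it is not DV3D's actual code) and should run with any recent vtk Python package:

    ```python
    import vtk

    # Build a trivial scene: a cone rendered through the standard VTK pipeline.
    cone = vtk.vtkConeSource()
    mapper = vtk.vtkPolyDataMapper()
    mapper.SetInputConnection(cone.GetOutputPort())
    actor = vtk.vtkActor()
    actor.SetMapper(mapper)

    renderer = vtk.vtkRenderer()
    renderer.AddActor(actor)
    window = vtk.vtkRenderWindow()
    window.AddRenderer(renderer)
    interactor = vtk.vtkRenderWindowInteractor()
    interactor.SetRenderWindow(window)

    window.Render()
    interactor.Start()  # hand control to the OS event loop
    ```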

  12. Cross talk between ceramide and redox signaling: implications for endothelial dysfunction and renal disease.

    PubMed

    Li, Pin-Lan; Zhang, Yang

    2013-01-01

    Recent studies have demonstrated that cross talk between ceramide and redox signaling modulates various cell activities and functions and contributes to the development of cardiovascular diseases and renal dysfunctions. Ceramide triggers the generation of reactive oxygen species (ROS) and increases oxidative stress in many mammalian cells and animal models. On the other hand, inhibition of ROS-generating enzymes or treatment with antioxidants impairs sphingomyelinase activation and ceramide production. As a mechanism, ceramide-enriched signaling platforms, specialized cell membrane rafts (MR; formerly lipid rafts), provide an important microenvironment to mediate the cross talk of ceramide and redox signaling to exert a corresponding regulatory role on cell and organ functions. In this regard, activation of acid sphingomyelinase and generation of ceramide mediate the formation of ceramide-enriched membrane platforms, where transmembrane signals are transmitted or amplified through recruitment, clustering, assembling, or integration of various signaling molecules. A typical such signaling platform is the MR redox signaling platform, which is centered on ceramide production and aggregation leading to recruitment and assembling of NADPH oxidase to form an active complex in the cell plasma membrane. This redox signaling platform not only conducts redox signaling or regulation but also facilitates a feedforward amplification of both ceramide and redox signaling. In addition to this membrane MR redox signaling platform, the cross talk between ceramide and redox signaling may occur in other cell compartments. This book chapter focuses on the molecular mechanisms, spatial-temporal regulations, and implications of this cross talk between ceramide and redox signaling, which may provide novel insights into the understanding of both ceramide and redox signaling pathways.

  13. Regenerator cross arm seal assembly

    DOEpatents

    Jackman, Anthony V.

    1988-01-01

    A seal assembly for disposition between a cross arm on a gas turbine engine block and a regenerator disc, the seal assembly including a platform coextensive with the cross arm, a seal and wear layer sealingly and slidingly engaging the regenerator disc, a porous and compliant support layer between the platform and the seal and wear layer porous enough to permit flow of cooling air therethrough and compliant to accommodate relative thermal growth and distortion, a dike between the seal and wear layer and the platform for preventing cross flow through the support layer between engine exhaust and pressurized air passages, and air diversion passages for directing unregenerated pressurized air through the support layer to cool the seal and wear layer and then back into the flow of regenerated pressurized air.

  14. NARMER-1: a photon point-kernel code with build-up factors

    NASA Astrophysics Data System (ADS)

    Visonneau, Thierry; Pangault, Laurence; Malouch, Fadhel; Malvagi, Fausto; Dolci, Florence

    2017-09-01

    This paper presents an overview of NARMER-1, the new generation of photon point-kernel code developed by the Reactor Studies and Applied Mathematics Unit (SERMA) at the CEA Saclay Center. After a short introduction giving some historical background and the current development context of the code, the paper sets out the principles implemented in the calculation and the physical quantities computed, and surveys the generic features: programming language, computer platforms, geometry package, sources description, etc. Moreover, specific and recent features are also detailed: exclusion sphere, tetrahedral meshes, parallel operations. Some points about verification and validation are then presented. Finally, we present some tools that can help the user with operations like visualization and pre-treatment.
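
    The quantity at the heart of any point-kernel code is the build-up-corrected uncollided flux from a point source. A minimal sketch of that standard formula, with made-up attenuation and build-up values rather than NARMER-1's data libraries:

    ```python
    import math

    def point_kernel_flux(source_strength, mu, r, buildup):
        """Photon flux at distance r (cm) from an isotropic point source.

        source_strength : photons/s
        mu              : linear attenuation coefficient (1/cm)
        buildup         : build-up factor B(mu*r) accounting for scatter
        """
        return source_strength * buildup * math.exp(-mu * r) / (4.0 * math.pi * r**2)

    # Hypothetical numbers: 1e10 photons/s, mu = 0.06/cm, 100 cm from the source.
    flux = point_kernel_flux(1e10, 0.06, 100.0, buildup=2.5)
    print(f"flux = {flux:.3e} photons/cm^2/s")
    ```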

  15. The Perseus computational platform for comprehensive analysis of (prote)omics data.

    PubMed

    Tyanova, Stefka; Temu, Tikira; Sinitcyn, Pavel; Carlson, Arthur; Hein, Marco Y; Geiger, Tamar; Mann, Matthias; Cox, Jürgen

    2016-09-01

    A main bottleneck in proteomics is the downstream biological analysis of highly multivariate quantitative protein abundance data generated using mass-spectrometry-based analysis. We developed the Perseus software platform (http://www.perseus-framework.org) to support biological and biomedical researchers in interpreting protein quantification, interaction and post-translational modification data. Perseus contains a comprehensive portfolio of statistical tools for high-dimensional omics data analysis covering normalization, pattern recognition, time-series analysis, cross-omics comparisons and multiple-hypothesis testing. A machine learning module supports the classification and validation of patient groups for diagnosis and prognosis, and it also detects predictive protein signatures. Central to Perseus is a user-friendly, interactive workflow environment that provides complete documentation of computational methods used in a publication. All activities in Perseus are realized as plugins, and users can extend the software by programming their own, which can be shared through a plugin store. We anticipate that Perseus's arsenal of algorithms and its intuitive usability will empower interdisciplinary analysis of complex large data sets.

  16. Experimental results in autonomous landing approaches by dynamic machine vision

    NASA Astrophysics Data System (ADS)

    Dickmanns, Ernst D.; Werner, Stefan; Kraus, S.; Schell, R.

    1994-07-01

    The 4-D approach to dynamic machine vision, exploiting full spatio-temporal models of the process to be controlled, has been applied to on-board autonomous landing approaches of aircraft. Aside from image sequence processing, for which it was developed initially, it is also used for data fusion from a range of sensors. By prediction-error feedback, an internal representation of the aircraft state relative to the runway in 3-D space and time is servo-maintained in the interpretation process, from which the required control actions are derived. The validity and efficiency of the approach have been proven both in hardware-in-the-loop simulations and in flight experiments with a twin-turboprop aircraft Do128 under perturbations from cross winds and wind gusts. The software package has been ported to 'C' and onto a new transputer image processing platform; the system has been expanded for bifocal vision with two cameras of different focal length mounted fixed relative to each other on a two-axis platform for viewing direction control.
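
    Prediction-error feedback of this kind is, at its core, a recursive predict/correct loop. The toy scalar sketch below (made-up gains and dynamics, not the 4-D system's actual filter) shows the cycle the abstract describes:

    ```python
    # Toy scalar predict/correct loop in the spirit of prediction-error feedback:
    # an internal model predicts the next measurement, and the prediction error
    # corrects the internal state estimate. All numbers are fabricated.
    state = 0.0          # estimated lateral offset from runway centerline (m)
    velocity = 1.0       # assumed constant drift model (m/s)
    dt, gain = 0.1, 0.3  # time step and correction gain (hypothetical)

    measurements = [0.2, 0.35, 0.42, 0.55, 0.61]  # fabricated sensor readings
    for z in measurements:
        prediction = state + velocity * dt   # model-based prediction
        error = z - prediction               # prediction error from the sensor
        state = prediction + gain * error    # servo-maintained internal state
        print(f"predicted {prediction:.3f}, measured {z:.3f}, updated {state:.3f}")
    ```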

  17. Cross-platform normalization of microarray and RNA-seq data for machine learning applications

    PubMed Central

    Thompson, Jeffrey A.; Tan, Jie

    2016-01-01

    Large, publicly available gene expression datasets are often analyzed with the aid of machine learning algorithms. Although RNA-seq is increasingly the technology of choice, a wealth of expression data already exist in the form of microarray data. If machine learning models built from legacy data can be applied to RNA-seq data, larger, more diverse training datasets can be created and validation can be performed on newly generated data. We developed Training Distribution Matching (TDM), which transforms RNA-seq data for use with models constructed from legacy platforms. We evaluated TDM, as well as quantile normalization, nonparanormal transformation, and a simple log2 transformation, on both simulated and biological datasets of gene expression. Our evaluation included both supervised and unsupervised machine learning approaches. We found that TDM exhibited consistently strong performance across settings and that quantile normalization also performed well in many circumstances. We also provide a TDM package for the R programming language. PMID:26844019
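
    Among the baselines evaluated, quantile normalization is compact enough to sketch: rank each sample's values, then replace them with the mean across samples at each rank. An illustrative numpy version (not the TDM R package; ties are handled naively here):

    ```python
    import numpy as np

    def quantile_normalize(matrix):
        """Quantile-normalize a (genes x samples) expression matrix so that
        every sample shares the same empirical distribution."""
        ranks = matrix.argsort(axis=0).argsort(axis=0)  # rank of each value per column
        ref = np.sort(matrix, axis=0).mean(axis=1)      # mean distribution across samples
        return ref[ranks]                               # map ranks onto the reference

    expr = np.array([[5.0, 4.0, 3.0],
                     [2.0, 1.0, 4.0],
                     [3.0, 4.0, 6.0],
                     [4.0, 2.0, 8.0]])
    print(quantile_normalize(expr))
    ```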

  18. Clustering and Network Analysis of Reverse Phase Protein Array Data.

    PubMed

    Byron, Adam

    2017-01-01

    Molecular profiling of proteins and phosphoproteins using a reverse phase protein array (RPPA) platform, with a panel of target-specific antibodies, enables the parallel, quantitative proteomic analysis of many biological samples in a microarray format. Hence, RPPA analysis can generate a high volume of multidimensional data that must be effectively interrogated and interpreted. A range of computational techniques for data mining can be applied to detect and explore data structure and to form functional predictions from large datasets. Here, two approaches for the computational analysis of RPPA data are detailed: the identification of similar patterns of protein expression by hierarchical cluster analysis and the modeling of protein interactions and signaling relationships by network analysis. The protocols use freely available, cross-platform software, are easy to implement, and do not require any programming expertise. Serving as data-driven starting points for further in-depth analysis, validation, and biological experimentation, these and related bioinformatic approaches can accelerate the functional interpretation of RPPA data.
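
    As a flavor of the first approach, hierarchical clustering of RPPA-like profiles takes only a few lines in a scripting environment. A sketch with fabricated data (the protocols in the chapter use cross-platform GUI software, so this is an analogue, not the chapter's workflow):

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster

    rng = np.random.default_rng(7)
    # Fabricated RPPA-like matrix: 12 samples x 8 protein endpoints,
    # drawn from two shifted groups so there is structure to find.
    samples = np.vstack([rng.normal(0.0, 1.0, (6, 8)),
                         rng.normal(2.0, 1.0, (6, 8))])

    tree = linkage(samples, method="average", metric="euclidean")
    labels = fcluster(tree, t=2, criterion="maxclust")  # cut tree into 2 clusters
    print(labels)
    ```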

  19. Recent advances in nanoplasmonic biosensors: applications and lab-on-a-chip integration

    NASA Astrophysics Data System (ADS)

    Lopez, Gerardo A.; Estevez, M.-Carmen; Soler, Maria; Lechuga, Laura M.

    2017-01-01

    Motivated by recent progress in the nanofabrication field and the increasing demand for cost-effective, portable, and easy-to-use point-of-care platforms, localized surface plasmon resonance (LSPR) biosensors have attracted great scientific interest in the last few years. The progress observed in research on this nanoplasmonic technology is remarkable, not only from a nanostructure fabrication point of view but also in the complete development and integration of operative devices and their application. The potential benefits that LSPR biosensors can offer, such as sensor miniaturization, multiplexing opportunities, and enhanced performance, have quickly positioned them as interesting candidates in the design of lab-on-a-chip (LOC) optical biosensor platforms. This review specifically covers the most significant achievements of recent years towards the integration of this technology into compact devices, with a view to obtaining LOC devices. We also discuss the most relevant examples of the use of nanoplasmonic biosensors for real bioanalytical and clinical applications, from assay development and validation to the identification of the implications, requirements, and challenges to be overcome to achieve fully operative devices.

  20. [Cross-cultural adaptation and validation of the Health and Taste Attitude Scale (HTAS) in Portuguese].

    PubMed

    Koritar, Priscila; Philippi, Sonia Tucunduva; Alvarenga, Marle dos Santos; Santos, Bernardo dos

    2014-08-01

    The scope of this study was to present the cross-cultural adaptation and validation of the Health and Taste Attitude Scale (HTAS) in Portuguese. The methodology included translation of the scale; evaluation of conceptual, operational and item-based equivalence by 14 experts and 51 female undergraduates; assessment of semantic and measurement equivalence by 12 bilingual women using the paired t-test, the Pearson correlation coefficient and the intraclass correlation coefficient; evaluation of internal consistency and test-retest reliability by Cronbach's alpha and the intraclass correlation coefficient, respectively, after application to 216 female undergraduates; and assessment of discriminant and concurrent validity via the t-test and Spearman's correlation coefficient, respectively, in addition to Confirmatory and Exploratory Factor Analysis. The scale was considered adequate and easily understood by the experts and university students and presented good internal consistency and reliability (α = 0.86, ICC = 0.84). The results show that the scale is valid and can be used in studies with women to better understand attitudes related to taste.
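
    For the test-retest step, a one-way random-effects ICC(1,1) is one simple formulation (the study may have used a different ICC form). An illustrative sketch with fabricated scores:

    ```python
    import numpy as np

    def icc_1_1(ratings):
        """One-way random-effects ICC(1,1) for an (n subjects x k trials) matrix."""
        n, k = ratings.shape
        grand = ratings.mean()
        subj_means = ratings.mean(axis=1)
        ms_between = k * ((subj_means - grand) ** 2).sum() / (n - 1)   # between-subject MS
        ms_within = ((ratings - subj_means[:, None]) ** 2).sum() / (n * (k - 1))  # within MS
        return (ms_between - ms_within) / (ms_between + (k - 1) * ms_within)

    # Hypothetical test-retest scores for 6 respondents:
    scores = np.array([[10, 11], [14, 13], [8, 9], [12, 12], [15, 14], [9, 10]])
    print(f"ICC(1,1) = {icc_1_1(scores):.2f}")
    ```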

  1. Shotgun Canceling.

    ERIC Educational Resources Information Center

    Szymanski, Theodore

    1999-01-01

    Discusses a common misunderstanding demonstrated by many students in basic mathematics courses: not knowing how to properly "cancel" factors in simplifying mathematical equations. Asserts that "crossing-out" or "canceling" is not a valid mathematical operation, and that instructors should be wary about using these terms because of the ease with…
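
    The misconception is easy to pin down with one worked example (added for illustration; not part of the abstract): canceling is valid only when it is shorthand for dividing out a common factor.

    ```latex
    % Valid: the shared factor 3 divides out because 3/3 = 1.
    \frac{3 \times 5}{3 \times 7} = \frac{3}{3} \cdot \frac{5}{7} = \frac{5}{7}
    % Invalid "shotgun canceling": crossing out terms of a sum changes the value.
    \frac{3 + 5}{3 + 7} = \frac{8}{10} \neq \frac{5}{7}
    ```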

  2. Blood-based protein biomarkers for diagnosis of Alzheimer disease.

    PubMed

    Doecke, James D; Laws, Simon M; Faux, Noel G; Wilson, William; Burnham, Samantha C; Lam, Chiou-Peng; Mondal, Alinda; Bedo, Justin; Bush, Ashley I; Brown, Belinda; De Ruyck, Karl; Ellis, Kathryn A; Fowler, Christopher; Gupta, Veer B; Head, Richard; Macaulay, S Lance; Pertile, Kelly; Rowe, Christopher C; Rembach, Alan; Rodrigues, Mark; Rumble, Rebecca; Szoeke, Cassandra; Taddei, Kevin; Taddei, Tania; Trounson, Brett; Ames, David; Masters, Colin L; Martins, Ralph N

    2012-10-01

    To identify plasma biomarkers for the diagnosis of Alzheimer disease (AD). Baseline plasma screening of 151 multiplexed analytes combined with targeted biomarker and clinical pathology data. General community-based, prospective, longitudinal study of aging. A total of 754 healthy individuals serving as controls and 207 participants with AD from the Australian Imaging Biomarker and Lifestyle study (AIBL) cohort; the identified biomarkers were validated in 58 healthy controls and 112 individuals with AD from the Alzheimer Disease Neuroimaging Initiative (ADNI) cohort. A biomarker panel was identified that included markers significantly increased (cortisol, pancreatic polypeptide, insulin-like growth factor binding protein 2, β(2) microglobulin, vascular cell adhesion molecule 1, carcinoembryonic antigen, matrix metalloproteinase 2, CD40, macrophage inflammatory protein 1α, superoxide dismutase, and homocysteine) and decreased (apolipoprotein E, epidermal growth factor receptor, hemoglobin, calcium, zinc, interleukin 17, and albumin) in AD. Cross-validated accuracy measures from the AIBL cohort reached a mean (SD) of 85% (3.0%) for sensitivity and specificity and 93% (3.0%) for the area under the receiver operating characteristic curve. A second validation using the ADNI cohort attained accuracy measures of 80% (3.0%) for sensitivity and specificity and 85% (3.0%) for the area under the receiver operating characteristic curve. This study identified a panel of plasma biomarkers that distinguish individuals with AD from cognitively healthy control subjects with high sensitivity and specificity. Cross-validation within the AIBL cohort and further validation within the ADNI cohort provides strong evidence that the identified biomarkers are important for AD diagnosis.
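
    The cross-validated accuracy measures reported are straightforward to emulate with standard tooling. A minimal sketch, assuming scikit-learn is available; the data are simulated stand-ins for the plasma analytes, not AIBL or ADNI data.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(3)
    n, p = 300, 18                       # subjects x analytes (hypothetical panel)
    y = rng.integers(0, 2, size=n)       # 0 = control, 1 = AD (fabricated labels)
    X = rng.normal(size=(n, p)) + 0.8 * y[:, None] * (rng.random(p) > 0.5)

    aucs = cross_val_score(LogisticRegression(max_iter=1000), X, y,
                           cv=5, scoring="roc_auc")
    print(f"cross-validated AUC = {aucs.mean():.2f} +/- {aucs.std():.2f}")
    ```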

  3. Cross platform development using Delphi and Kylix

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McDonald, J.L.; Nishimura, H.; Timossi, C.

    2002-10-08

    A cross-platform component for EPICS Simple Channel Access (SCA) has been developed for use with Delphi on Windows and Kylix on Linux. An EPICS controls GUI application developed on Windows runs on Linux simply by rebuilding it, and vice versa. This paper describes the technical details of the component.

  4. Using Small UAS for Mission Simulation, Science Validation, and Definition

    NASA Astrophysics Data System (ADS)

    Abakians, H.; Donnellan, A.; Chapman, B. D.; Williford, K. H.; Francis, R.; Ehlmann, B. L.; Smith, A. T.

    2017-12-01

    Small Unmanned Aerial Systems (sUAS) are increasingly being used across JPL and NASA for science data collection, mission simulation, and mission validation. They can also serve as proof of concept for the development of autonomous capabilities for Earth and planetary exploration. sUAS are useful for reconstruction of topography and imagery for a variety of applications, ranging from fault zone morphology, Mars analog studies, geologic mapping, and photometry to estimation of vegetation structure. Imagery, particularly multispectral imagery, can be used for identifying materials such as fault lithology or vegetation type. Reflectance maps can be produced for wetland or other studies. Topography and imagery observations are useful in radar studies, such as those with UAVSAR or the future NISAR mission, to validate 3D motions and to provide imagery in areas of disruption where the radar measurements decorrelate. Small UAS are inexpensive to operate, reconfigurable, and agile, making them a powerful platform for validating mission science measurements and for providing surrogate data for existing or future missions.

  5. MODIS Validation, Data Merger and Other Activities Accomplished by the SIMBIOS Project: 2002-2003

    NASA Technical Reports Server (NTRS)

    Fargion, Giulietta S.; McClain, Charles R.

    2003-01-01

    The purpose of this technical report is to provide current documentation of the Sensor Intercomparison and Merger for Biological and Interdisciplinary Oceanic Studies (SIMBIOS) Project activities, satellite data processing, and data product validation. This documentation is necessary to ensure that critical information is relayed to the scientific community and NASA management. This critical information includes the technical difficulties and challenges of validating and combining ocean color data from an array of independent satellite systems to form consistent and accurate global bio-optical time series products. This technical report focuses on the SIMBIOS Project's efforts in support of the Moderate-Resolution Imaging Spectroradiometer (MODIS) on the Earth Observing System (EOS) Terra platform (similar evaluations of MODIS/Aqua are underway). This technical report is not meant as a substitute for scientific literature. Instead, it provides a ready and responsive vehicle for the multitude of technical reports issued by an operational project.

  6. NOAA/DOE CWP structural analysis package. [CWPFLY, CWPEXT, COTEC, and XOTEC codes]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pompa, J.A.; Lunz, D.F.

    1979-09-01

    The theoretical development and computer code user's manual for analysis of the Ocean Thermal Energy Conversion (OTEC) plant cold water pipe (CWP) are presented. The analysis of the CWP includes coupled platform/CWP loadings and dynamic responses. This report, with the exception of the Introduction and Appendix F, was originally published as Hydronautics, Inc., Technical Report No. 7825-2 (by Barr, Chang, and Thasanatorn) in November 1978. A detailed theoretical development of the equations describing the coupled platform/CWP system and preliminary validation efforts are described. The appendices encompass a complete user's manual, describing the inputs, outputs and operation of the four component programs, and detail changes and updates implemented since the original release of the code by Hydronautics. The code itself is available through NOAA's Office of Ocean Technology and Engineering Services.

  7. ATLAS software stack on ARM64

    NASA Astrophysics Data System (ADS)

    Smith, Joshua Wyatt; Stewart, Graeme A.; Seuster, Rolf; Quadt, Arnulf; ATLAS Collaboration

    2017-10-01

    This paper reports on the port of the ATLAS software stack onto new prototype ARM64 servers. This included building the “external” packages that the ATLAS software relies on. Patches were needed to introduce this new architecture into the build as well as patches that correct for platform specific code that caused failures on non-x86 architectures. These patches were applied such that porting to further platforms will need no or only very little adjustments. A few additional modifications were needed to account for the different operating system, Ubuntu instead of Scientific Linux 6 / CentOS7. Selected results from the validation of the physics outputs on these ARM 64-bit servers will be shown. CPU, memory and IO intensive benchmarks using ATLAS specific environment and infrastructure have been performed, with a particular emphasis on the performance vs. energy consumption.

  8. SMAP RADAR Calibration and Validation

    NASA Astrophysics Data System (ADS)

    West, R. D.; Jaruwatanadilok, S.; Chaubel, M. J.; Spencer, M.; Chan, S. F.; Chen, C. W.; Fore, A.

    2015-12-01

    The Soil Moisture Active Passive (SMAP) mission launched on Jan 31, 2015. The mission employs L-band radar and radiometer measurements to estimate soil moisture with 4% volumetric accuracy at a resolution of 10 km, and freeze-thaw state at a resolution of 1-3 km. Immediately following launch, there was a three month instrument checkout period, followed by six months of level 1 (L1) calibration and validation. In this presentation, we will discuss the calibration and validation activities and results for the L1 radar data. Early SMAP radar data were used to check commanded timing parameters, and to work out issues in the low- and high-resolution radar processors. From April 3-13 the radar collected receive only mode data to conduct a survey of RFI sources. Analysis of the RFI environment led to a preferred operating frequency. The RFI survey data were also used to validate noise subtraction and scaling operations in the radar processors. Normal radar operations resumed on April 13. All radar data were examined closely for image quality and calibration issues which led to improvements in the radar data products for the beta release at the end of July. Radar data were used to determine and correct for small biases in the reported spacecraft attitude. Geo-location was validated against coastline positions and the known positions of corner reflectors. Residual errors at the time of the beta release are about 350 m. Intra-swath biases in the high-resolution backscatter images are reduced to less than 0.3 dB for all polarizations. Radiometric cross-calibration with Aquarius was performed using areas of the Amazon rain forest. Cross-calibration was also examined using ocean data from the low-resolution processor and comparing with the Aquarius wind model function. Using all a-priori calibration constants provided good results with co-polarized measurements matching to better than 1 dB, and cross-polarized measurements matching to about 1 dB in the beta release. During the second half of the L1 cal/val period, the RFI removal algorithm will be tuned for optimal performance, and the Faraday rotation corrections used in radar processing will be further developed and validated. This work is supported by the SMAP project at the Jet Propulsion Laboratory, California Institute of Technology.

  9. Low-Power Architecture for an Optical Life Gas Analyzer

    NASA Technical Reports Server (NTRS)

    Pilgrim, Jeffrey; Vakhtin, Andrei

    2012-01-01

    Analog and digital electronic control architecture has been combined with an operating methodology for an optical trace gas sensor platform that allows very low power consumption while providing four independent gas measurements in essentially real time, as well as a user interface and digital data storage and output. The implemented design eliminates the cross-talk between the measurement channels while maximizing the sensitivity, selectivity, and dynamic range for each measured gas. The combination provides for battery operation on a simple camcorder battery for as long as eight hours. The custom, compact, rugged, self-contained design specifically targets applications of optical major constituent and trace gas detection for multiple gases using multiple lasers and photodetectors in an integrated package.

  10. Department of Defense Joint Technical Architecture Version 2.0

    DTIC Science & Technology

    1998-05-26

    [Table-of-contents fragment recovered from the extraction:] Mandates; 2.1.4.1 Year 2000 (Y2K) Compliance; 2.1.4.2 Defense Information Infrastructure Common Operating Environment (DII COE); 2.2.2.2.1.6 Communications Services; 2.2.2.2.1.7 Operating System Services; 2.2.2.2.2 Application Platform Cross-Area Services; 2.2.2.2.2.1 Internationalization Services; 2.2.2.2.2.2 Security Services; 2.2.2.2.2.3 System Management Services.

  11. Literature evidence in open targets - a target validation platform.

    PubMed

    Kafkas, Şenay; Dunham, Ian; McEntyre, Johanna

    2017-06-06

    We present the Europe PMC literature component of Open Targets - a target validation platform that integrates various evidence to aid drug target identification and validation. The component identifies target-disease associations in documents from the Europe PMC literature database and ranks the documents by confidence, using rules that utilize expert-provided heuristic information. The confidence score of a given document represents how valuable the document is in the scope of target validation for a given target-disease association, taking into account the credibility of the association based on the properties of the text. The component has served the platform regularly with up-to-date data since December 2015. Currently, there are 1,168,365 distinct target-disease associations text mined from >26 million PubMed abstracts and >1.2 million Open Access full text articles. Our comparative analyses of the evidence currently available in the platform revealed that 850,179 of these associations are identified exclusively by literature mining. This component helps the platform's users by providing the most relevant literature hits for a given target and disease. The text mining evidence, along with the other types of evidence, can be explored visually through https://www.targetvalidation.org, and all the evidence data are available for download in JSON format from https://www.targetvalidation.org/downloads/data .

  12. Clinical validation of the 50 gene AmpliSeq Cancer Panel V2 for use on a next generation sequencing platform using formalin fixed, paraffin embedded and fine needle aspiration tumour specimens.

    PubMed

    Rathi, Vivek; Wright, Gavin; Constantin, Diana; Chang, Siok; Pham, Huong; Jones, Kerryn; Palios, Atha; Mclachlan, Sue-Anne; Conron, Matthew; McKelvie, Penny; Williams, Richard

    2017-01-01

    The advent of massively parallel sequencing has caused a paradigm shift in the way cancer is treated, as personalised therapy becomes a reality. More and more laboratories are looking to introduce next generation sequencing (NGS) as a tool for mutational analysis, as this technology has many advantages compared to conventional platforms like Sanger sequencing. In Australia, all massively parallel sequencing platforms are still considered in-house in vitro diagnostic tools by the National Association of Testing Authorities (NATA), and a comprehensive analytical validation of all assays, not mere verification, is a strict requirement before accreditation can be granted for clinical testing on these platforms. Analytical validation of assays on NGS platforms can prove extremely challenging for pathology laboratories. Although there are many affordable and easily accessible NGS instruments available, there are as yet no standardised guidelines for clinical validation of NGS assays. We present an accreditation development procedure that is both comprehensive and applicable in a hospital laboratory setting for NGS services. This approach may also be applied to other NGS applications in service laboratories. Copyright © 2016 Royal College of Pathologists of Australasia. Published by Elsevier B.V. All rights reserved.

  13. Corsica: A Multi-Mission Absolute Calibration Site

    NASA Astrophysics Data System (ADS)

    Bonnefond, P.; Exertier, P.; Laurain, O.; Guinle, T.; Femenias, P.

    2013-09-01

    In collaboration with the CNES and NASA oceanographic projects (TOPEX/Poseidon and Jason), the OCA (Observatoire de la Côte d'Azur) has developed a verification site in Corsica since 1996, operational since 1998. CALibration/VALidation embraces a wide variety of activities, ranging from the interpretation of information from internal-calibration modes of the sensors to validation of the fully corrected estimates of the reflector heights using in situ data. Corsica is now, like the Harvest platform (NASA side) [14], an operating calibration site able to support continuous monitoring with a high level of accuracy: a 'point calibration' which yields instantaneous bias estimates with a 10-day repeatability of 30 mm (standard deviation) and mean errors of 4 mm (standard error). For a 35-day repeatability (ERS, Envisat), the standard error is about double (~7 mm) owing to the smaller time series. In this paper, we present updated results of the absolute Sea Surface Height (SSH) biases for TOPEX/Poseidon (T/P), Jason-1, Jason-2, ERS-2 and Envisat.

  14. Validation of Ionospheric Measurements from the International Space Station (ISS)

    NASA Technical Reports Server (NTRS)

    Coffey, Victoria; Minow, Joseph; Wright, Kenneth

    2009-01-01

    The International Space Station orbit provides an ideal platform for in-situ studies of space weather effects on the mid- and low-latitude F2-region ionosphere. The Floating Potential Measurement Unit (FPMU), operating on the ISS since August 2006, is a suite of plasma instruments: a Floating Potential Probe (FPP), a Plasma Impedance Probe (PIP), a Wide-sweep Langmuir Probe (WLP), and a Narrow-sweep Langmuir Probe. This instrument package provides a new opportunity for collaborative multi-instrument studies of the F-region ionosphere during both quiet and disturbed periods. This presentation first describes the operational parameters for each of the FPMU probes and shows examples of an intra-instrument validation. We then show comparisons with plasma density and temperature measurements derived from the TIMED GUVI ultraviolet imager, the Millstone Hill ground-based incoherent scatter radar, and DIAS digisondes. Finally, we show one of several observations of night-time equatorial density holes, demonstrating the capabilities of the probes for monitoring mid- and low-latitude plasma processes.

  15. Development and systematic validation of qPCR assays for rapid and reliable differentiation of Xylella fastidiosa strains causing citrus variegated chlorosis.

    PubMed

    Li, Wenbin; Teixeira, Diva C; Hartung, John S; Huang, Qi; Duan, Yongping; Zhou, Lijuan; Chen, Jianchi; Lin, Hong; Lopes, Silvio; Ayres, A Juliano; Levy, Laurene

    2013-01-01

    The xylem-limited, Gram-negative, fastidious plant bacterium Xylella fastidiosa is the causal agent of citrus variegated chlorosis (CVC), a destructive disease affecting approximately half of the citrus plantations in the State of São Paulo, Brazil. The disease was recently found in Central America and is threatening the multi-billion-dollar U.S. citrus industry. Many strains of X. fastidiosa are pathogens or endophytes in various plants growing in the U.S., and some strains cross-infect several host plants. In this study, a TaqMan-based assay targeting the 16S rDNA signature region was developed for the identification of X. fastidiosa at the species level. Another TaqMan-based assay was developed for the specific identification of the CVC strains. Both new assays were systematically validated in comparison with the primer/probe sets from four previously published assays on one platform and under similar PCR conditions, and shown to be superior. The species-specific assay detected all X. fastidiosa strains and did not amplify any other citrus pathogen or endophyte tested. The CVC-specific assay detected all CVC strains but did not amplify any non-CVC X. fastidiosa nor any other citrus pathogen or endophyte evaluated. Both sets were multiplexed with a reliable internal control assay targeting host plant DNA, and their diagnostic specificity and sensitivity remained unchanged. This internal control provides quality assurance for DNA extraction and for the performance of PCR reagents, platforms and operators. The limit of detection for both assays was equivalent to 2 to 10 cells of X. fastidiosa per reaction for field citrus samples. Petioles and midribs of symptomatic leaves of sweet orange harbored the highest populations of X. fastidiosa, providing the best materials for detection of the pathogen. The new species-specific assay will be invaluable for molecular identification of X. fastidiosa at the species level, and the CVC-specific assay will be very powerful for the specific identification of X. fastidiosa strains that cause citrus variegated chlorosis. Published by Elsevier B.V.

  16. Rapid sample classification using an open port sampling interface coupled with liquid introduction atmospheric pressure ionization mass spectrometry.

    PubMed

    Van Berkel, Gary J; Kertesz, Vilmos

    2017-02-15

    An "Open Access"-like mass spectrometric platform to fully utilize the simplicity of the manual open port sampling interface for rapid characterization of unprocessed samples by liquid introduction atmospheric pressure ionization mass spectrometry has been lacking. The in-house developed integrated software with a simple, small and relatively low-cost mass spectrometry system introduced here fills this void. Software was developed to operate the mass spectrometer, to collect and process mass spectrometric data files, to build a database and to classify samples using such a database. These tasks were accomplished via the vendor-provided software libraries. Sample classification based on spectral comparison utilized the spectral contrast angle method. Using the developed software platform near real-time sample classification is exemplified using a series of commercially available blue ink rollerball pens and vegetable oils. In the case of the inks, full scan positive and negative ion ESI mass spectra were both used for database generation and sample classification. For the vegetable oils, full scan positive ion mode APCI mass spectra were recorded. The overall accuracy of the employed spectral contrast angle statistical model was 95.3% and 98% in case of the inks and oils, respectively, using leave-one-out cross-validation. This work illustrates that an open port sampling interface/mass spectrometer combination, with appropriate instrument control and data processing software, is a viable direct liquid extraction sampling and analysis system suitable for the non-expert user and near real-time sample classification via database matching. Published in 2016. This article is a U.S. Government work and is in the public domain in the USA. Published in 2016. This article is a U.S. Government work and is in the public domain in the USA.

  17. Arsenic removal from contaminated groundwater by membrane-integrated hybrid plant: optimization and control using Visual Basic platform.

    PubMed

    Chakrabortty, S; Sen, M; Pal, P

    2014-03-01

    Simulation software (ARRPA) has been developed on the Microsoft Visual Basic platform for optimization and control of a novel membrane-integrated arsenic separation plant, in the absence of any such software. The user-friendly, menu-driven software is based on a dynamic linearized mathematical model developed for the hybrid treatment scheme. The model captures the chemical kinetics in the pre-treating chemical reactor and the separation and transport phenomena involved in nanofiltration. The software has been validated through extensive experimental investigations. The agreement between the outputs of the computer simulation program and the experimental findings is excellent and consistent under varying operating conditions, reflecting the high degree of accuracy and reliability of the software. High values of the overall correlation coefficient (R² = 0.989) and Willmott d-index (0.989) are indicators of the capability of the software in analyzing the performance of the plant. The software permits pre-analysis and manipulation of input data, helps in optimization, and exhibits the performance of the integrated plant visually on a graphical platform. Performance analysis of the whole system as well as of the individual units is possible using the tool. The software, the first of its kind in its domain and in the well-known Microsoft Excel environment, is likely to be very useful in the successful design, optimization and operation of an advanced hybrid treatment plant for removal of arsenic from contaminated groundwater.
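
    Both agreement measures quoted above are easy to compute directly. A sketch with fabricated observed versus simulated concentrations (illustrative only, not the plant's data):

    ```python
    import numpy as np

    def willmott_d(obs, pred):
        """Willmott's index of agreement d (1 = perfect agreement)."""
        o_bar = obs.mean()
        num = ((pred - obs) ** 2).sum()
        den = ((np.abs(pred - o_bar) + np.abs(obs - o_bar)) ** 2).sum()
        return 1.0 - num / den

    # Hypothetical observed vs. simulated arsenic concentrations (ug/L):
    observed = np.array([12.0, 15.5, 9.8, 20.1, 17.3])
    predicted = np.array([11.6, 15.9, 10.2, 19.5, 17.8])
    r = np.corrcoef(observed, predicted)[0, 1]
    print(f"R^2 = {r**2:.3f}, d = {willmott_d(observed, predicted):.3f}")
    ```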

  18. Engineer Company Force Structure Force Modularization in Support of Decisive Action. Does the Corps of Engineers Need to Re-Structure Engineer Construction Companies Again in order to Support Decisive Actions?

    DTIC Science & Technology

    2012-05-16

    [Fragment recovered from the extraction - acronym list and text excerpts:] Regional Command; RCP, Route Clearance Platoon; RSOI, Reception, Staging, Onward Movement, Integration; SBCT, Stryker Brigade Combat Team; TOE, Table of... Excerpts: "...Points (ASPs), and field hospital platforms; prepare river crossing sites; and support port repair due to Hydraulic Excavator (HYEX), provides force..." "...platforms, FARPS, supply routes, roads, control points, fire bases, tank ditches, ASPs, and field hospital platforms; prepare river crossing sites; and"

  19. Effects of Data Quality on the Characterization of Aerosol Properties from Multiple Sensors

    NASA Technical Reports Server (NTRS)

    Petrenko, Maksym; Ichoku, Charles; Leptoukh, Gregory

    2011-01-01

    Cross-comparison of aerosol properties between ground-based and spaceborne measurements is an important validation technique that helps to investigate the uncertainties of aerosol products acquired using spaceborne sensors. However, it has been shown that even minor differences in the cross-characterization procedure may significantly impact the results of such validation. Of particular consideration is the quality assurance / quality control (QA/QC) information - auxiliary data indicating a "confidence" level (e.g., Bad, Fair, Good, Excellent) conferred by the retrieval algorithms on the produced data. Depending on the treatment of available QA/QC information, a cross-characterization procedure has the potential of filtering out invalid data points, such as uncertain or erroneous retrievals, which tend to reduce the credibility of such comparisons. However, under certain circumstances, even high QA/QC values may not fully guarantee the quality of the data. For example, retrievals in the proximity of a cloud might be particularly perplexing for an aerosol retrieval algorithm, resulting in invalid data that, nonetheless, could be assigned a high QA/QC confidence. In this presentation, we will study the effects of several QA/QC parameters on the cross-characterization of aerosol properties between data acquired by multiple spaceborne sensors. We will utilize the Multi-sensor Aerosol Products Sampling System (MAPSS), which provides a consistent platform for multi-sensor comparison, including collocation with measurements acquired by the ground-based Aerosol Robotic Network (AERONET). The multi-sensor spaceborne data analyzed include those acquired by the Terra-MODIS, Aqua-MODIS, Terra-MISR, Aura-OMI, Parasol-POLDER, and Calipso-CALIOP satellite instruments.

  20. A holistic approach to SIM platform and its application to early-warning satellite system

    NASA Astrophysics Data System (ADS)

    Sun, Fuyu; Zhou, Jianping; Xu, Zheyao

    2018-01-01

    This study proposes a new simulation platform named Simulation Integrated Management (SIM) for the analysis of parallel and distributed systems. The platform eases the process of designing and testing both applications and architectures. The main characteristics of SIM are flexibility, scalability, and expandability. To improve the efficiency of project development, new models of an early-warning satellite system were designed based on the SIM platform. Finally, through a series of experiments, the correctness of the SIM platform and the aforementioned early-warning satellite models was validated, and systematic analyses of the orbit determination precision for a ballistic missile during its entire flight, as well as of the deviation of the launch/landing point, are presented. Furthermore, the causes of the deviation and methods for its prevention are fully explained. The simulation platform and the models will lay the foundations for further validations of autonomy technology in space attack-defense architecture research.

  1. Testing single point incremental forming moulds for rotomoulding operations

    NASA Astrophysics Data System (ADS)

    Afonso, Daniel; de Sousa, Ricardo Alves; Torcato, Ricardo

    2017-10-01

    Low-pressure polymer processes such as thermoforming or rotational moulding use much simpler moulds than high-pressure processes like injection moulding. However, despite the low forces involved in the process, mould manufacturing for these applications is still a very material-, energy- and time-consuming operation. In rotational moulding in particular, there is no standard for mould manufacture, and very different techniques are applicable. The goal of this research is to develop and validate a method for manufacturing plastically formed sheet metal moulds by single point incremental forming (SPIF) for rotomoulding and rotocasting operations. A Stewart-platform-based SPIF machine allows the forming of thick metal sheets, granting the required structural stiffness for the mould surface while keeping a short manufacturing lead time and low thermal inertia. The experimental work involves the proposal of a hollow part, the design and fabrication of a sheet metal mould using dieless incremental forming techniques, and testing its operation in the production of prototype parts.

  2. ROS-IGTL-Bridge: an open network interface for image-guided therapy using the ROS environment.

    PubMed

    Frank, Tobias; Krieger, Axel; Leonard, Simon; Patel, Niravkumar A; Tokuda, Junichi

    2017-08-01

    With the growing interest in advanced image guidance for surgical robot systems, rapid integration and testing of robotic devices and medical image computing software are becoming essential in research and development. Maximizing the use of existing engineering resources built on widely accepted platforms in different fields, such as the Robot Operating System (ROS) in robotics and 3D Slicer in medical image computing, could simplify these tasks. We propose a new open network bridge interface integrated in ROS to ensure seamless cross-platform data sharing. A ROS node named ROS-IGTL-Bridge was implemented. It establishes a TCP/IP network connection between the ROS environment and external medical image computing software using the OpenIGTLink protocol. The node exports ROS messages to the external software over the network and vice versa simultaneously, allowing seamless and transparent data sharing between the ROS-based devices and the medical image computing platforms. Performance tests demonstrated that the bridge could stream transforms, strings, points, and images at 30 fps in both directions successfully. The data transfer latency was <1.2 ms for transforms, strings and points, and 25.2 ms for color VGA images. A separate test also demonstrated that the bridge could achieve 900 fps for transforms. Additionally, the bridge was demonstrated in two representative systems: a mock image-guided surgical robot setup consisting of 3D Slicer and Lego Mindstorms with ROS, as a prototyping and educational platform for IGT research; and the smart tissue autonomous robot surgical setup with 3D Slicer. The study demonstrated that the bridge enabled cross-platform data sharing between ROS and medical image computing software. This will allow rapid and seamless integration of the advanced image-based planning/navigation offered by medical image computing software such as 3D Slicer into ROS-based surgical robot systems.
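
    To make the bridging idea concrete, the sketch below hand-packs a single OpenIGTLink-style STRING message and sends it over TCP to the protocol's default port (18944), for example to a 3D Slicer instance listening via its OpenIGTLink interface. It is illustrative only: the CRC64 field is left at zero although conformant implementations must compute it, and real systems should use the ROS-IGTL-Bridge node or the official OpenIGTLink libraries rather than raw sockets.

    ```python
    import socket
    import struct
    import time

    def igtl_string_packet(device_name: str, text: str) -> bytes:
        """Pack a minimal OpenIGTLink-style STRING message (v1 header layout:
        version, type name, device name, timestamp, body size, CRC64)."""
        # STRING body: encoding (uint16, 3 = US-ASCII), length (uint16), characters
        body = struct.pack(">HH", 3, len(text)) + text.encode("ascii")
        header = struct.pack(
            ">H12s20sQQQ",
            1,                            # protocol version
            b"STRING",                    # message type, NUL-padded to 12 bytes
            device_name.encode("ascii"),  # device name, NUL-padded to 20 bytes
            int(time.time()) << 32,       # timestamp: seconds in the upper 32 bits
            len(body),                    # body size in bytes
            0,                            # CRC64 placeholder (not computed here)
        )
        return header + body

    with socket.create_connection(("localhost", 18944)) as s:
        s.sendall(igtl_string_packet("ROS-IGTL-Bridge", "hello from ROS"))
    ```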

  3. Description of the CERES Ocean Validation Experiment (COVE), A Dedicated EOS Validation Test Site

    NASA Astrophysics Data System (ADS)

    Rutledge, K.; Charlock, T.; Smith, B.; Jin, Z.; Rose, F.; Denn, F.; Rutan, D.; Haeffelin, M.; Su, W.; Xhang, T.; Jay, M.

    2001-12-01

    A unique test site located in mid-Atlantic coastal marine waters has been used by several EOS projects for validation measurements. A common theme across these projects is the need for a stable measurement site within the marine environment for long-term, high-quality radiation measurements. The site was initiated by NASA's Clouds and the Earth's Radiant Energy System (CERES) project. One of CERES's challenging goals is to provide upwelling and downwelling shortwave fluxes at several pressure altitudes within the atmosphere and at the surface. Operationally, the radiative transfer model of Fu and Liou (1996, 1998), the CERES instrument-measured radiances, and various other EOS platform data are being used to accomplish this goal. We present here a component of the CERES/EOS validation effort that is focused on verifying and optimizing the prediction algorithms for radiation parameters associated with the marine coastal and oceanic surface types of the planet. For this validation work, the CERES Ocean Validation Experiment (COVE) was developed to provide detailed high-frequency and long-duration measurements of radiation and its associated dependent variables. The CERES validations also include analytical efforts which will not be described here (but see Charlock et al., Su et al., and Smith et al., Fall 2001 AGU Meeting). The COVE activity is based on a rigid ocean platform located approximately twenty kilometers off the coast of Virginia Beach, Virginia. The once-manned US Coast Guard facility rises 35 meters from the ocean surface, allowing the radiation instruments to be well above the splash zone. The depth of the sea is eleven meters at the site. A power and communications system has been installed for present and future requirements. Scientific measurements at the site have primarily been developed within the framework of established national and international monitoring programs. These include the Baseline Surface Radiation Network of the World Meteorological Organization, NASA's robotic aerosol measurement program AERONET, NOAA's GPS Water Vapor Demonstration Network, NOAA's National Buoy Data Center, and GEWEX's Global Aerosol Climate Program. Other EOS projects have utilized the COVE platform for validation measurements (short term: MODIS, MISR; intermediate term: SeaWiFS). A longer-term measurement program for the AIRS instrument to be deployed on the AQUA satellite is underway. The poster will detail the unique measurement and infrastructure assets of the COVE site and present an example 1.5-year time series of the major radiometric parameters. Lastly, the near-term measurement augmentations that are anticipated at COVE will be discussed.

  4. XPAT: a toolkit to conduct cross-platform association studies with heterogeneous sequencing datasets.

    PubMed

    Yu, Yao; Hu, Hao; Bohlender, Ryan J; Hu, Fulan; Chen, Jiun-Sheng; Holt, Carson; Fowler, Jerry; Guthery, Stephen L; Scheet, Paul; Hildebrandt, Michelle A T; Yandell, Mark; Huff, Chad D

    2018-04-06

    High-throughput sequencing data are increasingly being made available to the research community for secondary analyses, providing new opportunities for large-scale association studies. However, heterogeneity in target capture and sequencing technologies often introduces strong technological stratification biases that overwhelm subtle signals of association in studies of complex traits. Here, we introduce the Cross-Platform Association Toolkit, XPAT, which provides a suite of tools designed to support and conduct large-scale association studies with heterogeneous sequencing datasets. XPAT includes tools to support cross-platform-aware variant calling, quality control filtering, gene-based association testing and rare variant effect size estimation. To evaluate the performance of XPAT, we conducted case-control association studies for three diseases, including 783 breast cancer cases, 272 ovarian cancer cases, 205 Crohn disease cases and 3507 shared controls (including 1722 females), using sequencing data from multiple sources. XPAT greatly reduced Type I error inflation in the case-control analyses, while replicating many previously identified disease-gene associations. We also show that association tests conducted with XPAT using cross-platform data have comparable performance to tests using matched-platform data. XPAT enables new association studies that combine existing sequencing datasets to identify genetic loci associated with common diseases and other complex traits.
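
    The abstract does not spell out XPAT's test statistics, so as a generic illustration of a gene-based association test on carrier counts, here is a sketch using the quoted cohort sizes with hypothetical carrier numbers.

    ```python
    from scipy.stats import fisher_exact

    # hypothetical gene-level counts after cross-platform-aware calling and QC:
    # subjects carrying at least one rare variant in the gene of interest
    case_carriers, case_total = 18, 783     # breast cancer cases (quoted size)
    ctrl_carriers, ctrl_total = 12, 3507    # shared controls (quoted size)

    table = [[case_carriers, case_total - case_carriers],
             [ctrl_carriers, ctrl_total - ctrl_carriers]]
    odds_ratio, p_value = fisher_exact(table)
    print(f"OR = {odds_ratio:.1f}, p = {p_value:.2e}")
    ```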

  5. Design and Operational Characteristics of the Shuttle Coherent Wind Lidar

    NASA Technical Reports Server (NTRS)

    Amzajerdian, Farzin; Spiers, Gary D.; Peters, Bruce R.; Li, Ye; Blackwell, Timothy S.; Geary, Joseph M.

    1998-01-01

    NOAA has identified the measurement of atmospheric wind velocities as one of the key unmet data sets for its next generation of sensing platforms. The merits of coherent lidars for the measurement of atmospheric winds from space platforms have been widely recognized; however, it is only recently that several key technologies have advanced to a point where a compact, high-fidelity system could be created. Advances have been made in the areas of diode-pumped, eye-safe, solid-state lasers and room-temperature, wide-bandwidth semiconductor detectors operating in the near-infrared region. These new lasers can be integrated into efficient and compact optical systems, creating new possibilities for the development of low-cost, reliable, and compact coherent lidar systems for wind measurements. Over the past five years, the University of Alabama in Huntsville (UAH) has been working toward further advancing solid-state coherent lidar technology for the measurement of atmospheric winds from space. As part of this effort, UAH has established the design characteristics and defined the expected performance for three different proposed space-based instruments: a technology demonstrator, an operational prototype, and a 7-year-lifetime operational instrument. SPARCLE is an ambitious project that is intended to evaluate the suitability of coherent lidar for wind measurements, demonstrate the maturity of the technology for space application, and provide a useable data set for model development and validation. This paper describes the SPARCLE instrument's major physical and environmental design constraints, optical and mechanical designs, and its operational characteristics.

  6. Media-Education Convergence: Applying Transmedia Storytelling Edutainment in E-Learning Environments

    ERIC Educational Resources Information Center

    Kalogeras, Stavroula

    2013-01-01

    In the era of media convergence, transmedia (cross-media/cross-platform/multi-platform) narratives are catering to users who are willing to immerse themselves in their favorite entertainment content. The inherent interactivity of the Internet and the emotional engagement of story can lead to innovative pedagogies in media-rich environments. This…

  7. Verification and Validation for Flight-Critical Systems (VVFCS)

    NASA Technical Reports Server (NTRS)

    Graves, Sharon S.; Jacobsen, Robert A.

    2010-01-01

    On March 31, 2009 a Request for Information (RFI) was issued by NASA's Aviation Safety Program to gather input on the subject of Verification and Validation (V&V) of Flight-Critical Systems. The responses were provided to NASA on or before April 24, 2009. The RFI asked for comments in three topic areas: Modeling and Validation of New Concepts for Vehicles and Operations; Verification of Complex Integrated and Distributed Systems; and Software Safety Assurance. There were a total of 34 responses to the RFI, representing a cross-section of academia (26%), small and large industry (47%) and government agencies (27%).

  8. Cross Talk Between Ceramide and Redox Signaling: Implications for Endothelial Dysfunction and Renal Disease

    PubMed Central

    Li, Pin-Lan; Zhang, Yang

    2013-01-01

    Recent studies have demonstrated that cross talk between ceramide and redox signaling modulates various cell activities and functions and contributes to the development of cardiovascular diseases and renal dysfunctions. Ceramide triggers the generation of reactive oxygen species (ROS) and increases oxidative stress in many mammalian cells and animal models. On the other hand, inhibition of ROS-generating enzymes or treatment with antioxidants impairs sphingomyelinase activation and ceramide production. As a mechanism, ceramide-enriched signaling platforms, special cell membrane rafts (MR) (formerly lipid rafts), provide an important microenvironment to mediate the cross talk of ceramide and redox signaling to exert a corresponding regulatory role on cell and organ functions. In this regard, activation of acid sphingomyelinase and generation of ceramide mediate the formation of ceramide-enriched membrane platforms, where trans-membrane signals are transmitted or amplified through recruitment, clustering, assembling, or integration of various signaling molecules. A typical example of such a signaling platform is the MR redox signaling platform, which is centered on ceramide production and aggregation, leading to the recruitment and assembly of NADPH oxidase to form an active complex in the cell plasma membrane. This redox signaling platform not only conducts redox signaling or regulation but also facilitates a feedforward amplification of both ceramide and redox signaling. In addition to this membrane MR redox signaling platform, the cross talk between ceramide and redox signaling may occur in other cell compartments. This book chapter focuses on the molecular mechanisms, spatial-temporal regulations, and implications of this cross talk between ceramide and redox signaling, which may provide novel insights into the understanding of both ceramide and redox signaling pathways. PMID:23563657

  9. Reliability and validity of the Wii Balance Board for assessment of standing balance: A systematic review.

    PubMed

    Clark, Ross A; Mentiplay, Benjamin F; Pua, Yong-Hao; Bower, Kelly J

    2018-03-01

    The use of force platform technologies to assess standing balance is common across a range of clinical areas. Numerous researchers have evaluated the low-cost Wii Balance Board (WBB) for its utility in assessing balance, with variable findings. This review aimed to systematically evaluate the reliability and concurrent validity of the WBB for assessment of static standing balance. Articles were retrieved from six databases (Medline, SCOPUS, EMBASE, CINAHL, Web of Science, Inspec) from 2007 to 2017. After independent screening by two reviewers, 25 articles were included. Two reviewers performed the data extraction and quality assessment. Test-retest reliability was investigated in 12 studies, with intraclass correlation coefficients or Pearson's correlation values showing a range from poor to excellent reliability (range: 0.27 to 0.99). Concurrent validity (i.e. comparison with another force platform) was examined in 21 studies, and was generally found to be excellent in studies examining the association between the same outcome measures collected on both devices. For studies reporting predominantly poor to moderate validity, potentially influential factors included the choice of 1) criterion reference (e.g. not a common force platform), 2) test duration (e.g. <30 s for double leg), 3) outcome measure (e.g. comparing a centre of pressure variable from the WBB with a summary score from the force platform), 4) data acquisition platform (studies using Apple iOS reported predominantly moderate validity), and 5) low sample size. In conclusion, evidence suggests that the WBB can be used as a reliable and valid tool for assessing standing balance. Protocol registration number: PROSPERO 2017: CRD42017058122. Copyright © 2018 Elsevier B.V. All rights reserved.
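
    The test-retest figures above are intraclass correlation coefficients. As a reference for how such values are computed, here is a sketch of ICC(2,1) (two-way random effects, absolute agreement, single measures) following Shrout and Fleiss; the balance scores are hypothetical.

    ```python
    import numpy as np

    def icc_2_1(x):
        """ICC(2,1) for an (n subjects) x (k devices/sessions) score matrix."""
        x = np.asarray(x, float)
        n, k = x.shape
        grand = x.mean()
        ss_rows = k * np.sum((x.mean(axis=1) - grand) ** 2)   # between subjects
        ss_cols = n * np.sum((x.mean(axis=0) - grand) ** 2)   # between devices
        ss_err = np.sum((x - grand) ** 2) - ss_rows - ss_cols
        msr = ss_rows / (n - 1)
        msc = ss_cols / (k - 1)
        mse = ss_err / ((n - 1) * (k - 1))
        return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

    # hypothetical centre-of-pressure path lengths (cm): WBB vs. force platform
    scores = [[58.1, 60.3], [72.4, 70.9], [65.0, 66.2], [80.3, 78.8], [55.7, 57.0]]
    print(f"ICC(2,1) = {icc_2_1(scores):.2f}")
    ```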

  10. MACSAT - A Near Equatorial Earth Observation Mission

    NASA Astrophysics Data System (ADS)

    Kim, B. J.; Park, S.; Kim, E.-E.; Park, W.; Chang, H.; Seon, J.

    The MACSAT mission was initiated by Malaysia to launch a high-resolution remote sensing satellite into Near Equatorial Orbit (NEO). Due to its geographical location, Malaysia can derive large benefits from NEO satellite operation. From the baseline circular orbit of 685 km altitude with 7 degrees of inclination, the neighboring regions around Malaysian territory can be frequently monitored. The equatorial environment around the globe can also be regularly observed with unique revisit characteristics. The primary mission objective of the MACSAT program is to develop and validate technologies for a near equatorial orbit remote sensing satellite system. MACSAT is optimally designed to accommodate an electro-optic Earth observation payload, the Medium-sized Aperture Camera (MAC). Joint Malaysian and Korean engineering teams were formed for the effective implementation of the satellite system. An integrated team approach was adopted for the joint development of MACSAT. MAC is a pushbroom-type camera with 2.5 m Ground Sampling Distance (GSD) in the panchromatic band and 5 m GSD in four multi-spectral bands. The satellite platform is a mini-class satellite. Including the MAC payload, the satellite weighs under 200 kg. The spacecraft bus is designed to optimally support payload operations during 3 years of mission life. The payload has a 20 km swath width with ±30° of tilting capability. A 32-Gbit solid-state recorder is implemented as the mass image storage. The ground element is an integrated ground station for mission control and payload operation. It is equipped with an S-band up/down link for commanding and telemetry reception, as well as a 30 Mbps class X-band downlink for image reception and processing. The MACSAT system is capable of generating 1:25,000-scale image maps. It is also anticipated to have the capability for cross-track stereo imaging for Digital Elevation Model (DEM) generation.

  11. New Parameterization of Neutron Absorption Cross Sections

    NASA Technical Reports Server (NTRS)

    Tripathi, Ram K.; Wilson, John W.; Cucinotta, Francis A.

    1997-01-01

    A recent parameterization of absorption cross sections for any system of charged-ion collisions, including proton-nucleus collisions, is extended to neutron-nucleus collisions, valid from approximately 1 MeV to a few GeV, thus providing a comprehensive picture of absorption cross sections for any system of collision pairs (charged or uncharged). The parameters are associated with the physics of the problem. At lower energies, the optical potential at the surface is important, and the Pauli operator plays an increasingly important role at intermediate energies. The agreement between the calculated and experimental data is better than in earlier published results.

  12. Can smartwatches replace smartphones for posture tracking?

    PubMed

    Mortazavi, Bobak; Nemati, Ebrahim; VanderWall, Kristina; Flores-Rodriguez, Hector G; Cai, Jun Yu Jacinta; Lucier, Jessica; Naeim, Arash; Sarrafzadeh, Majid

    2015-10-22

    This paper introduces a human posture tracking platform to identify the human postures of sitting, standing or lying down, based on a smartwatch. This work develops such a system as a proof-of-concept study to investigate a smartwatch's ability to be used in future remote health monitoring systems and applications. This work validates the smartwatch's ability to track the posture of users accurately in a laboratory setting while reducing the sampling rate to potentially improve battery life, the first step in verifying that such a system would work in future clinical settings. The algorithm developed classifies the transitions between the three posture states of sitting, standing and lying down by identifying these transition movements, as well as other movements that might be mistaken for these transitions. The system is trained and developed on a Samsung Galaxy Gear smartwatch, and the algorithm was validated through a leave-one-subject-out cross-validation of 20 subjects. The system can identify the appropriate transitions at only 10 Hz with an F-score of 0.930, indicating its ability to effectively replace smartphones, if needed.
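
    Leave-one-subject-out cross-validation, as used above, holds out every window from one subject per fold, so the classifier is never tested on a person it was trained on. A sketch using scikit-learn's LeaveOneGroupOut; the features, labels and random-forest classifier are stand-ins, since the paper's actual feature set and classifier are not detailed here.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import LeaveOneGroupOut, cross_val_score

    rng = np.random.default_rng(0)
    # stand-ins: 200 windows of wrist-accelerometer features, 3 posture classes
    X = rng.normal(size=(200, 12))
    y = rng.integers(0, 3, size=200)
    subjects = np.repeat(np.arange(20), 10)   # 10 windows per subject, 20 subjects

    # each fold holds out all windows from one subject, as in the paper's protocol
    scores = cross_val_score(RandomForestClassifier(random_state=0), X, y,
                             groups=subjects, cv=LeaveOneGroupOut(),
                             scoring="f1_macro")
    print(f"mean macro-F1 across held-out subjects: {scores.mean():.3f}")
    ```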

  13. Development and validation of a flax (Linum usitatissimum L.) gene expression oligo microarray

    PubMed Central

    2010-01-01

    Background Flax (Linum usitatissimum L.) has been cultivated for around 9,000 years and is therefore one of the oldest cultivated species. Today, flax is still grown for its oil (oil-flax or linseed cultivars) and its cellulose-rich fibres (fibre-flax cultivars) used for high-value linen garments and composite materials. Despite the wide industrial use of flax-derived products and our current understanding of the regulation of both wood fibre production and oil biosynthesis, more information must be acquired in both domains. Recent advances in genomics are now providing opportunities to improve our fundamental knowledge of these complex processes. In this paper we report the development and validation of a high-density oligo microarray platform dedicated to gene expression analyses in flax. Results Nine different RNA samples obtained from flax inner- and outer-stems, seeds, leaves and roots were used to generate a collection of 1,066,481 ESTs by massively parallel pyrosequencing. Sequences were assembled into 59,626 unigenes, and 48,021 sequences were selected for oligo design and high-density microarray (NimbleGen 385K) fabrication with eight non-overlapping 25-mer oligos per unigene. Eighteen independent experiments were used to evaluate the hybridization quality, precision, specificity and accuracy, and all results confirmed the high technical quality of our microarray platform. Cross-validation of microarray data was carried out using quantitative RT-PCR (qRT-PCR). Nine target genes were selected on the basis of microarray results and reflected the whole range of fold changes (both up-regulated and down-regulated genes in different samples). A statistically significant positive correlation was obtained comparing expression levels for each target gene across all biological replicates, both in qRT-PCR and microarray results. Further experiments illustrated the capacity of our arrays to detect differential gene expression in a variety of flax tissues as well as between two contrasting flax varieties. Conclusion All results suggest that our high-density flax oligo-microarray platform can be used as a very sensitive tool for analyzing gene expression in a large variety of tissues as well as in different cultivars. Moreover, this highly reliable platform can also be used for the quantification of mRNA transcriptional profiling in different flax tissues. PMID:20964859

  14. Development and validation of a flax (Linum usitatissimum L.) gene expression oligo microarray.

    PubMed

    Fenart, Stéphane; Ndong, Yves-Placide Assoumou; Duarte, Jorge; Rivière, Nathalie; Wilmer, Jeroen; van Wuytswinkel, Olivier; Lucau, Anca; Cariou, Emmanuelle; Neutelings, Godfrey; Gutierrez, Laurent; Chabbert, Brigitte; Guillot, Xavier; Tavernier, Reynald; Hawkins, Simon; Thomasset, Brigitte

    2010-10-21

    Flax (Linum usitatissimum L.) has been cultivated for around 9,000 years and is therefore one of the oldest cultivated species. Today, flax is still grown for its oil (oil-flax or linseed cultivars) and its cellulose-rich fibres (fibre-flax cultivars) used for high-value linen garments and composite materials. Despite the wide industrial use of flax-derived products and our current understanding of the regulation of both wood fibre production and oil biosynthesis, more information must be acquired in both domains. Recent advances in genomics are now providing opportunities to improve our fundamental knowledge of these complex processes. In this paper we report the development and validation of a high-density oligo microarray platform dedicated to gene expression analyses in flax. Nine different RNA samples obtained from flax inner- and outer-stems, seeds, leaves and roots were used to generate a collection of 1,066,481 ESTs by massively parallel pyrosequencing. Sequences were assembled into 59,626 unigenes, and 48,021 sequences were selected for oligo design and high-density microarray (NimbleGen 385K) fabrication with eight non-overlapping 25-mer oligos per unigene. Eighteen independent experiments were used to evaluate the hybridization quality, precision, specificity and accuracy, and all results confirmed the high technical quality of our microarray platform. Cross-validation of microarray data was carried out using quantitative RT-PCR (qRT-PCR). Nine target genes were selected on the basis of microarray results and reflected the whole range of fold changes (both up-regulated and down-regulated genes in different samples). A statistically significant positive correlation was obtained comparing expression levels for each target gene across all biological replicates, both in qRT-PCR and microarray results. Further experiments illustrated the capacity of our arrays to detect differential gene expression in a variety of flax tissues as well as between two contrasting flax varieties. All results suggest that our high-density flax oligo-microarray platform can be used as a very sensitive tool for analyzing gene expression in a large variety of tissues as well as in different cultivars. Moreover, this highly reliable platform can also be used for the quantification of mRNA transcriptional profiling in different flax tissues.

  15. Livingstone Model-Based Diagnosis of Earth Observing One Infusion Experiment

    NASA Technical Reports Server (NTRS)

    Hayden, Sandra C.; Sweet, Adam J.; Christa, Scott E.

    2004-01-01

    The Earth Observing One satellite, launched in November 2000, is an active earth science observation platform. This paper reports on the progress of an infusion experiment in which the Livingstone 2 Model-Based Diagnostic engine is deployed on Earth Observing One, demonstrating the capability to monitor the nominal operation of the spacecraft under command of an on-board planner, and demonstrating on-board diagnosis of spacecraft failures. Design and development of the experiment, specification and validation of diagnostic scenarios, and characterization of performance results and benefits of the model-based approach are presented.

  16. STR-validator: an open source platform for validation and process control.

    PubMed

    Hansson, Oskar; Gill, Peter; Egeland, Thore

    2014-11-01

    This paper addresses two problems faced when short tandem repeat (STR) systems are validated for forensic purposes: (1) validation is extremely time-consuming and expensive, and (2) there is strong consensus about what to validate but not how. The first problem is solved by powerful data processing functions to automate calculations. Utilising an easy-to-use graphical user interface, strvalidator (hereafter referred to as STR-validator) can greatly increase the speed of validation. The second problem is exemplified by a series of analyses, and subsequent comparison with published material, highlighting the need for a common validation platform. If adopted by the forensic community, STR-validator has the potential to standardise the analysis of validation data. This would not only facilitate information exchange but also increase the pace at which laboratories are able to switch to new technology. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  17. Virtual network computing: cross-platform remote display and collaboration software.

    PubMed

    Konerding, D E

    1999-04-01

    VNC (Virtual Network Computing) is a computer program written to address the problem of cross-platform remote desktop/application display. VNC uses a client/server model in which an image of the desktop of the server is transmitted to the client and displayed. The client collects mouse and keyboard input from the user and transmits it back to the server. The VNC client and server can run on Windows 95/98/NT, MacOS, and Unix (including Linux) operating systems. VNC is multi-user on Unix machines (any number of servers can be run, and they are unrelated to the primary display of the computer), while it is effectively single-user on Macintosh and Windows machines (only one server can be run, displaying the contents of the primary display of the server). The VNC servers can be configured to allow more than one client to connect at one time, effectively allowing collaboration through the shared desktop. I describe the function of VNC, provide details of installation, describe how it achieves its goal, and evaluate the use of VNC for molecular modelling. VNC is an extremely useful tool for collaboration, instruction, software development, and debugging of graphical programs with remote users.

  18. SimHap GUI: an intuitive graphical user interface for genetic association analysis.

    PubMed

    Carter, Kim W; McCaskie, Pamela A; Palmer, Lyle J

    2008-12-25

    Researchers wishing to conduct genetic association analysis involving single nucleotide polymorphisms (SNPs) or haplotypes are often confronted with a lack of user-friendly graphical analysis tools, so that sophisticated statistical and informatics expertise is required to perform relatively straightforward tasks. Tools such as the SimHap package for the R statistics language provide the necessary statistical operations to conduct sophisticated genetic analysis but lack a graphical user interface that would allow anyone but a professional statistician to use the tool effectively. We have developed SimHap GUI, a cross-platform integrated graphical analysis tool for conducting epidemiological, single-SNP and haplotype-based association analysis. SimHap GUI features a novel workflow interface that guides the user through each logical step of the analysis process, making it accessible to both novice and advanced users. This tool provides a seamless interface to the SimHap R package, while providing enhanced functionality such as sophisticated data checking, automated data conversion, and real-time estimation of haplotype simulation progress. SimHap GUI provides a novel, easy-to-use, cross-platform solution for conducting a range of genetic and non-genetic association analyses. This provides a free alternative to commercial statistics packages that is specifically designed for genetic association analysis.

  19. Implementation of remote monitoring and managing switches

    NASA Astrophysics Data System (ADS)

    Leng, Junmin; Fu, Guo

    2010-12-01

    To strengthen the safety performance of the network and provide greater convenience and efficiency for operators and managers, a system for remotely monitoring and managing switches has been designed and implemented using advanced network technology and existing network resources. A fast Internet Protocol camera (FS IP camera) is selected, which has a 32-bit RISC embedded processor and can support a number of protocols. An optimized image compression algorithm, Motion-JPEG, is adopted so that high-resolution images can be transmitted over narrow network bandwidth. The architecture of the whole monitoring and managing system is designed and implemented according to the current infrastructure of the network and switches. The control and administration software has been designed accordingly. The dynamic-webpage Java Server Pages (JSP) development platform is utilized in the system, and a SQL (Structured Query Language) Server database is applied to save and access image information, network messages and user data. The reliability and security of the system are further strengthened by access control. The software in the system is made cross-platform so that multiple operating systems (UNIX, Linux and Windows) are supported. Application of the system can greatly reduce manpower costs and allows problems to be found and solved quickly.

  20. Genome Wide Identification of SARS-CoV Susceptibility Loci Using the Collaborative Cross

    PubMed Central

    Gralinski, Lisa E.; Ferris, Martin T.; Aylor, David L.; Whitmore, Alan C.; Green, Richard; Frieman, Matthew B.; Deming, Damon; Menachery, Vineet D.; Miller, Darla R.; Buus, Ryan J.; Bell, Timothy A.; Churchill, Gary A.; Threadgill, David W.; Katze, Michael G.; McMillan, Leonard; Valdar, William; Heise, Mark T.; Pardo-Manuel de Villena, Fernando; Baric, Ralph S.

    2015-01-01

    New systems genetics approaches are needed to rapidly identify host genes and genetic networks that regulate complex disease outcomes. Using genetically diverse animals from incipient lines of the Collaborative Cross mouse panel, we demonstrate a greatly expanded range of phenotypes relative to classical mouse models of SARS-CoV infection, including lung pathology, weight loss and viral titer. Genetic mapping revealed several loci contributing to differential disease responses, including an 8.5-Mb locus associated with vascular cuffing on chromosome 3 that contained 23 genes and 13 noncoding RNAs. Integrating phenotypic and genetic data narrowed this region to a single gene, Trim55, an E3 ubiquitin ligase with a role in muscle fiber maintenance. Lung pathology and transcriptomic data from mice genetically deficient in Trim55 were used to validate its role in SARS-CoV-induced vascular cuffing and inflammation. These data establish the Collaborative Cross platform as a powerful genetic resource for uncovering genetic contributions to complex traits in microbial disease severity, inflammation and virus replication in models of outbred populations. PMID:26452100

  1. HIVprotI: an integrated web based platform for prediction and design of HIV proteins inhibitors.

    PubMed

    Qureshi, Abid; Rajput, Akanksha; Kaur, Gazaldeep; Kumar, Manoj

    2018-03-09

    A number of anti-retroviral drugs are being used for treating Human Immunodeficiency Virus (HIV) infection. Due to the emergence of drug-resistant strains, there is a constant quest to discover more effective anti-HIV compounds. In this endeavor, computational tools have proven useful in accelerating drug discovery. Although methods have been published to design a class of compounds against a specific HIV protein, an integrated web server for the same has been lacking. Therefore, we have developed support vector machine (SVM)-based regression models using experimentally validated data from the ChEMBL repository. Quantitative structure-activity relationship (QSAR) based features were selected for predicting the inhibition activity of a compound against the HIV proteins protease (PR), reverse transcriptase (RT) and integrase (IN). The models presented maximum Pearson correlation coefficients of 0.78, 0.76, 0.74 and 0.76, 0.68, 0.72 during tenfold cross-validation on the IC50 and percent inhibition datasets of PR, RT and IN, respectively. These models performed equally well on the independent datasets. Chemical space mapping, applicability domain analyses and other statistical tests further support the robustness of the predictive models. In addition, we have identified a number of chemical descriptors that are important in predicting the compound inhibition potential. The HIVprotI platform ( http://bioinfo.imtech.res.in/manojk/hivproti ) would be useful in the virtual screening of inhibitors as well as the design of new molecules against the important HIV proteins for therapeutics development.
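
    The reported metric, a Pearson correlation from tenfold cross-validation of an SVM regressor on QSAR descriptors, can be reproduced structurally as below. The descriptors and activity target are synthetic; the actual HIVprotI features and hyperparameters are not given in the abstract.

    ```python
    import numpy as np
    from scipy.stats import pearsonr
    from sklearn.model_selection import KFold
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVR

    rng = np.random.default_rng(1)
    X = rng.normal(size=(300, 50))                # synthetic QSAR descriptors
    y = 2 * X[:, 0] - X[:, 1] + rng.normal(scale=0.5, size=300)  # synthetic activity

    model = make_pipeline(StandardScaler(), SVR(kernel="rbf", C=10.0))
    preds = np.empty_like(y)
    for train, test in KFold(n_splits=10, shuffle=True, random_state=1).split(X):
        model.fit(X[train], y[train])
        preds[test] = model.predict(X[test])      # out-of-fold predictions

    r, _ = pearsonr(y, preds)
    print(f"tenfold cross-validated Pearson r = {r:.2f}")
    ```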

  2. Multivariate Models of Men's and Women's Partner Aggression

    ERIC Educational Resources Information Center

    O'Leary, K. Daniel; Smith Slep, Amy M.; O'Leary, Susan G.

    2007-01-01

    This exploratory study was designed to address how multiple factors drawn from varying focal models and ecological levels of influence might operate relative to each other to predict partner aggression, using data from 453 representatively sampled couples. The resulting cross-validated models predicted approximately 50% of the variance in men's…

  3. An Efficient Method for Classifying Perfectionists

    ERIC Educational Resources Information Center

    Rice, Kenneth G.; Ashby, Jeffrey S.

    2007-01-01

    Multiple samples of university students (N = 1,537) completed the Almost Perfect Scale-Revised (APS-R; R. B. Slaney, M. Mobley, J. Trippi, J. Ashby, & D. G. Johnson, 1996). Cluster analyses, cross-validated discriminant function analyses, and receiver operating characteristic curves for sensitivity and specificity of APS-R scores were used to…

  4. Integrated Software Health Management for Aircraft GN and C

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Mengshoel, Ole

    2011-01-01

    Modern aircraft rely heavily on dependable operation of many safety-critical software components. Despite careful design, verification and validation (V&V), on-board software can fail with disastrous consequences if it encounters problematic software/hardware interaction or must operate in an unexpected environment. We are using a Bayesian approach to monitor the software and its behavior during operation and provide up-to-date information about the health of the software and its components. The powerful reasoning mechanism provided by our model-based Bayesian approach makes reliable diagnosis of the root causes possible and minimizes the number of false alarms. Compilation of the Bayesian model into compact arithmetic circuits makes SWHM feasible even on platforms with limited CPU power. We show initial results of SWHM on a small simulator of an embedded aircraft software system, where software and sensor faults can be injected.
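
    The SWHM approach compiles Bayesian models into compact arithmetic circuits for cheap on-board inference. As a minimal single-variable analogue, the sketch below performs the same kind of posterior update over a two-state health variable from a stream of discretized observations; all probabilities here are assumed for illustration.

    ```python
    # two-state software-health monitor: posterior over {OK, FAULT}
    prior = {"OK": 0.99, "FAULT": 0.01}
    # assumed observation model: P(reading | state)
    likelihood = {
        "OK":    {"nominal": 0.95, "anomalous": 0.05},
        "FAULT": {"nominal": 0.20, "anomalous": 0.80},
    }

    def update(belief, obs):
        """One Bayesian filtering step: belief(s) proportional to P(obs|s)*belief(s)."""
        unnorm = {s: likelihood[s][obs] * p for s, p in belief.items()}
        z = sum(unnorm.values())
        return {s: p / z for s, p in unnorm.items()}

    belief = prior
    for obs in ["nominal", "anomalous", "anomalous"]:
        belief = update(belief, obs)
        print(obs, {s: round(p, 3) for s, p in belief.items()})
    ```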

  5. NASA Stennis Space Center integrated system health management test bed and development capabilities

    NASA Astrophysics Data System (ADS)

    Figueroa, Fernando; Holland, Randy; Coote, David

    2006-05-01

    Integrated System Health Management (ISHM) capability for rocket propulsion testing is rapidly evolving and promises substantial reductions in the time and cost of propulsion system development, with substantially reduced operational costs and evolutionary improvements in launch system operational robustness. NASA Stennis Space Center (SSC), along with partners that include NASA, contractors, and academia, is investigating and developing technologies to enable ISHM capability in SSC's rocket engine test stands (RETS). This will enable validation and experience capture over a broad range of rocket propulsion systems of varying complexity. This paper describes the key components that constitute the necessary ingredients for implementing credible ISHM capability in RETS, other NASA ground test and operations facilities, and ultimately spacecraft and space platforms and systems: (1) core technologies for ISHM, (2) RETS as ISHM testbeds, and (3) RETS systems models.

  6. KNIME for reproducible cross-domain analysis of life science data.

    PubMed

    Fillbrunn, Alexander; Dietz, Christian; Pfeuffer, Julianus; Rahn, René; Landrum, Gregory A; Berthold, Michael R

    2017-11-10

    Experiments in the life sciences often involve tools from a variety of domains such as mass spectrometry, next-generation sequencing, or image processing. Passing the data between those tools often involves complex scripts for controlling data flow, data transformation, and statistical analysis. Such scripts are not only prone to being platform-dependent, they also tend to grow as the experiment progresses and are seldom well documented, a fact that hinders the reproducibility of the experiment. Workflow systems such as KNIME Analytics Platform aim to solve these problems by providing a platform for connecting tools graphically and guaranteeing the same results on different operating systems. As open source software, KNIME allows scientists and programmers to provide their own extensions to the scientific community. In this review paper we present selected extensions from the life sciences that simplify data exploration, analysis, and visualization and are interoperable due to KNIME's unified data model. Additionally, we name other workflow systems that are commonly used in the life sciences and highlight their similarities to and differences from KNIME. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  7. Targeted proteomics guided by label-free global proteome analysis in saliva reveal transition signatures from health to periodontal disease.

    PubMed

    Bostanci, Nagihan; Selevsek, Nathalie; Wolski, Witold; Grossmann, Jonas; Bao, Kai; Wahlander, Asa; Trachsel, Christian; Schlapbach, Ralph; Özturk, Veli Özgen; Afacan, Beral; Emingil, Gulnur; Belibasakis, Georgios N

    2018-04-02

    Periodontal diseases are among the most prevalent, but largely silent, chronic diseases worldwide. They affect the tooth-supporting tissues with multiple ramifications for quality of life. Their early diagnosis is still challenging, due to the lack of appropriate molecular diagnostic methods. Saliva offers a non-invasively collectable reservoir of clinically relevant biomarkers, which, if utilized efficiently, could facilitate early diagnosis and monitoring of ongoing disease. Despite several novel protein markers being recently enlisted by discovery proteomics, their routine diagnostic application is hampered by the lack of validation platforms that allow for rapid, accurate and simultaneous quantification of multiple proteins in large cohorts. We carried out a pipeline of two proteomic platforms: first, we applied open-ended label-free quantitative (LFQ) proteomics for discovery in saliva (n=67; health, gingivitis, and periodontitis), followed by selected reaction monitoring (SRM) targeted proteomics for validation in an independent cohort (n=82). The LFQ platform led to the discovery of 119 proteins with at least a two-fold significant difference between health and disease. The 65 proteins chosen for the subsequent SRM platform included 50 related proteins derived from the significantly enriched processes of the LFQ data, 11 from literature mining, and four housekeeping ones. Among those, 60 were reproducibly quantifiable proteins (92% success rate), represented by a total of 143 peptides. Machine-learning modeling led to a narrowed-down panel of five proteins of high predictive value for periodontal diseases (higher in disease: Matrix metalloproteinase-9, Ras-related protein-1, Actin-related protein 2/3 complex subunit 5; lower in disease: Clusterin, Deleted in Malignant Brain Tumors 1), with a maximum area under the receiver operating characteristic curve >0.97. This panel enriches the pool of credible clinical biomarker candidates for diagnostic assay development. The major advance brought to periodontal diagnostics by this study, however, lies in the introduction of a well-established discovery-through-verification pipeline for periodontal biomarker discovery and validation in further periodontal patient cohorts. Published under license by The American Society for Biochemistry and Molecular Biology, Inc.
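
    As an illustration of the final modeling step, scoring a five-protein panel by area under the ROC curve, here is a sketch with synthetic SRM intensities and a logistic-regression classifier; the study's actual machine-learning model is not specified in the abstract.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import StratifiedKFold, cross_val_predict
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(2)
    # synthetic intensities for the five panel proteins (first three assumed
    # higher in disease, last two assumed lower, mirroring the abstract)
    healthy = rng.normal([5, 5, 5, 8, 8], 1.0, size=(40, 5))
    disease = rng.normal([7, 6, 6, 6, 6], 1.0, size=(42, 5))
    X = np.vstack([healthy, disease])
    y = np.array([0] * 40 + [1] * 42)

    clf = make_pipeline(StandardScaler(), LogisticRegression())
    probs = cross_val_predict(clf, X, y, method="predict_proba",
                              cv=StratifiedKFold(5, shuffle=True, random_state=2))[:, 1]
    print(f"cross-validated AUC = {roc_auc_score(y, probs):.2f}")
    ```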

  8. Microfluidic Synthesis of Composite Cross-Gradient Materials for Investigating Cell–Biomaterial Interactions

    PubMed Central

    He, Jiankang; Du, Yanan; Guo, Yuqi; Hancock, Matthew J.; Wang, Ben; Shin, Hyeongho; Wu, Jinhui; Li, Dichen; Khademhosseini, Ali

    2010-01-01

    Combinatorial material synthesis is a powerful approach for creating composite material libraries for the high-throughput screening of cell–material interactions. Although current combinatorial screening platforms have been tremendously successful in identifying target (termed “hit”) materials from composite material libraries, new material synthesis approaches are needed to further optimize the concentrations and blending ratios of the component materials. Here we employed a microfluidic platform to rapidly synthesize composite materials containing cross-gradients of gelatin and chitosan for investigating cell–biomaterial interactions. The microfluidic synthesis of the cross-gradient was optimized experimentally and theoretically to produce quantitatively controllable variations in the concentrations and blending ratios of the two components. The anisotropic chemical compositions of the gelatin/chitosan cross-gradients were characterized by Fourier transform infrared spectrometry and X-ray photoelectron spectrometry. The three-dimensional (3D) porous gelatin/chitosan cross-gradient materials were shown to regulate the cellular morphology and proliferation of smooth muscle cells (SMCs) in a gradient-dependent manner. We envision that our microfluidic cross-gradient platform may accelerate the material development processes involved in a wide range of biomedical applications. PMID:20721897

  9. Energy Efficient, Cross-Layer Enabled, Dynamic Aggregation Networks for Next Generation Internet

    NASA Astrophysics Data System (ADS)

    Wang, Michael S.

    Today, Internet traffic is growing at a near-exponential rate, driven predominantly by data center-based applications and Internet-of-Things services. This fast-paced growth in Internet traffic calls into question the ability of the existing optical network infrastructure to support this continued growth. The overall optical networking equipment efficiency has not been able to keep up with the traffic growth, creating an energy gap that makes energy and cost expenditures scale linearly with the traffic growth. The implication of this energy gap is that it is infeasible to continue using existing networking equipment to meet the growing bandwidth demand. A redesign of the optical networking platform is needed. The focus of this dissertation is on the design and implementation of energy-efficient, cross-layer enabled, dynamic optical networking platforms, which are a promising approach to addressing the exponentially growing Internet bandwidth demand. Chapter 1 explains the motivation for this work by detailing the huge Internet traffic growth and the unsustainable energy growth of today's networking equipment. Chapter 2 describes the challenges and objectives of enabling agile, dynamic optical networking platforms and the vision of the Center for Integrated Access Networks (CIAN) to realize these objectives; the research objectives of this dissertation and the large body of related work in this field are also summarized. Chapter 3 details the design and implementation of dynamic networking platforms that support wavelength switching granularity. The main contribution of this work involves the experimental validation of deep cross-layer communication across the optical performance monitoring (OPM), data, and control planes. The first experiment shows QoS-aware video streaming over a metro-scale test-bed through optical power monitoring of the transmission wavelength and cross-layer feedback control of the power level. The second experiment extends the performance monitoring capabilities to include real-time monitoring of OSNR and polarization mode dispersion (PMD) to enable dynamic wavelength switching and selective restoration. Chapter 4 explains the author's contributions in designing dynamic networking at the sub-wavelength switching granularity, which can provide greater network efficiency due to its finer granularity. To support dynamic switching, regeneration, adding/dropping, and control decisions on each individual packet, the cross-layer enabled node architecture is enhanced with an FPGA controller that brings much more precise timing and control to the switching, OPM, and control planes. Furthermore, QoS-aware packet protection and dynamic switching, dropping, and regeneration functionalities were experimentally demonstrated in a multi-node network. Chapter 5 describes a technique to perform optical grooming, a process of optically combining multiple incoming data streams into a single data stream, which can simultaneously achieve greater bandwidth utilization and increased spectral efficiency. In addition, an experimental demonstration highlighting a fully functioning multi-node, agile optical networking platform is detailed. Finally, a summary and discussion of future work is provided in Chapter 6. The future of the Internet is very exciting, filled with not-yet-invented applications and services driven by cloud computing and Internet-of-Things. The author is cautiously optimistic that agile, dynamically reconfigurable optical networking is the solution to realizing this future.

  10. CrossTalk: The Journal of Defense Software Engineering. Volume 26, Number 6, November/December 2013

    DTIC Science & Technology

    2013-12-01

    Indexing fragments only: "...requirements during sprint planning. Automated scanning, which includes automated code-review tools, allows the expert to monitor the system... sprint. This enables the validator to leverage the test results for formal validation and verification, and perform a shortened 'hybrid' style of IV&V..." Table fragment: per sprint (1-4 weeks) / 1 week / 1 month / up to four months; deliverable product to user; security posture assessed; accredited to field/operate.

  11. Voroprot: an interactive tool for the analysis and visualization of complex geometric features of protein structure.

    PubMed

    Olechnovic, Kliment; Margelevicius, Mindaugas; Venclovas, Ceslovas

    2011-03-01

    We present Voroprot, an interactive cross-platform software tool that provides a unique set of capabilities for exploring geometric features of protein structure. Voroprot allows the construction and visualization of the Apollonius diagram (also known as the additively weighted Voronoi diagram), the Apollonius graph, protein alpha shapes, interatomic contact surfaces, solvent accessible surfaces, pockets and cavities inside protein structure. Voroprot is available for Windows, Linux and Mac OS X operating systems and can be downloaded from http://www.ibt.lt/bioinformatics/voroprot/.
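
    For readers wanting to experiment with the underlying geometry, scipy exposes the ordinary (unweighted) Voronoi diagram, which can serve as a simplified stand-in for the additively weighted (Apollonius) diagram that Voroprot computes; the weighted version additionally accounts for atomic radii. The coordinates below are hypothetical.

    ```python
    import numpy as np
    from scipy.spatial import Voronoi

    # hypothetical C-alpha coordinates (angstroms); a real run would parse a PDB file
    atoms = np.array([
        [0.0, 0.0, 0.0], [3.8, 0.2, 0.1], [7.5, 1.0, -0.3],
        [10.9, 2.8, 0.5], [13.2, 5.9, 0.2], [14.0, 9.6, -0.4],
    ])

    vor = Voronoi(atoms)   # ordinary Voronoi diagram of the atom centres
    for i, j in vor.ridge_points:
        # atoms sharing a Voronoi facet are geometric neighbours: contact candidates
        d = np.linalg.norm(atoms[i] - atoms[j])
        print(f"neighbours {i}-{j}: {d:.1f} A apart")
    ```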

  12. Software Tools for Development on the Peregrine System | High-Performance Computing | NREL

    Science.gov Websites

    The "Cross-Platform Make" (CMake) package from Kitware and SCons, a modern software build tool based on Python, are available on the Peregrine system to build and manage software at the source-code level.

  13. Flow optimization study of a batch microfluidics PET tracer synthesizing device

    PubMed Central

    Elizarov, Arkadij M.; Meinhart, Carl; van Dam, R. Michael; Huang, Jiang; Daridon, Antoine; Heath, James R.; Kolb, Hartmuth C.

    2010-01-01

    We present numerical modeling and experimental studies of flow optimization inside a batch microfluidic micro-reactor used for the synthesis of human-scale doses of Positron Emission Tomography (PET) tracers. Novel techniques are used for mixing within, and eluting liquid out of, the coin-shaped reaction chamber. Numerical solutions of the general incompressible Navier-Stokes equations, along with the time-dependent elution scalar-field equation for the three-dimensional coin-shaped geometry, were obtained and validated using fluorescence imaging analysis techniques. Utilizing the approach presented in this work, we were able to identify optimized geometrical and operational conditions for the micro-reactor in the absence of the radioactive material commonly used in PET-related tracer production platforms, as well as to evaluate the designed and fabricated micro-reactor using numerical and experimental validations. PMID:21072595

  14. Mach-Zehnder Interferometer Biochemical Sensor Based on Silicon-on-Insulator Rib Waveguide with Large Cross Section

    PubMed Central

    Yuan, Dengpeng; Dong, Ying; Liu, Yujin; Li, Tianjian

    2015-01-01

    A high-sensitivity Mach-Zehnder interferometer (MZI) biochemical sensing platform based on a silicon-on-insulator (SOI) rib waveguide with a large cross section is proposed in this paper. Based on analyses of the evanescent field intensity, the mode polarization and cross-section dimensions of the SOI rib waveguide are optimized through finite difference method (FDM) simulation. To realize a high-resolution MZI read-out configuration based on the SOI rib waveguide, medium-filled trenches are employed and their performance is simulated using the two-dimensional finite-difference time-domain (2D-FDTD) method. With the fundamental EH-polarized mode of the SOI rib waveguide with a total rib height of 10 μm, an outside rib height of 5 μm and a rib width of 2.5 μm at the operating wavelength of 1550 nm, and with a 10 mm long sensitive window in the MZI configuration, a homogeneous sensitivity of 7296.6%/refractive index unit (RIU) is obtained. Supposing the resolutions of the photoelectric detectors connected to the output ports are 0.2%, the MZI sensor can achieve a detection limit of 2.74 × 10⁻⁶ RIU. Due to the high coupling efficiency of the large-cross-section SOI rib waveguide with standard single-mode glass optical fiber, the proposed MZI sensing platform can be conveniently integrated with optical fiber communication systems and (opto-)electronic systems, and therefore has the potential to realize remote sensing, in situ real-time detection, and possible applications in the Internet of Things. PMID:26343678

  15. Nonalcoholic fatty liver disease is associated with dysbiosis independent of body mass index and insulin resistance.

    PubMed

    Da Silva, Hannah E; Teterina, Anastasia; Comelli, Elena M; Taibi, Amel; Arendt, Bianca M; Fischer, Sandra E; Lou, Wendy; Allard, Johane P

    2018-01-23

    This study aimed to determine whether there is an association between dysbiosis and nonalcoholic fatty liver disease (NAFLD) independent of obesity and insulin resistance (IR). This is a prospective cross-sectional study assessing the intestinal microbiome (IM) of 39 adults with biopsy-proven NAFLD (15 simple steatosis [SS]; 24 nonalcoholic steatohepatitis [NASH]) and 28 healthy controls (HC). Differences in IM composition (Illumina MiSeq platform) between NAFLD patients and HC were identified by two statistical methods (Metastats, Wilcoxon). Selected taxa were validated using quantitative PCR (qPCR). Metabolites in feces and serum were also analyzed. In NAFLD, 8 operational taxonomic units, 6 genera, 6 families and 2 phyla (Bacteroidetes, Firmicutes) were less abundant, and 1 genus (Lactobacillus) and 1 family (Lactobacillaceae) were more abundant compared to HC. The lower abundance in both NASH and SS patients compared to HC was confirmed by qPCR for Ruminococcus, Faecalibacterium prausnitzii and Coprococcus. No difference was found between NASH and SS. This lower abundance in NAFLD (NASH+SS) was independent of BMI and IR. NAFLD patients had higher concentrations of fecal propionate and isobutyric acid and of serum 2-hydroxybutyrate and L-lactic acid. These findings suggest a potential role for a specific IM community and functional profile in the pathogenesis of NAFLD.

  16. Rodent motor and neuropsychological behaviour measured in home cages using the integrated modular platform SmartCage™

    PubMed Central

    Khroyan, Taline V; Zhang, Jingxi; Yang, Liya; Zou, Bende; Xie, James; Pascual, Conrado; Malik, Adam; Xie, Julian; Zaveri, Nurulain T; Vazquez, Jacqueline; Polgar, Willma; Toll, Lawrence; Fang, Jidong; Xie, Xinmin

    2017-01-01

    SUMMARY To facilitate the investigation of diverse rodent behaviours in rodents' home cages, we have developed an integrated modular platform, the SmartCage™ system (AfaSci, Inc., Burlingame, CA, USA), which enables automated neurobehavioural phenotypic analysis and in vivo drug screening in a relatively higher-throughput and more objective manner. The individual platform consists of an infrared array, a vibration floor sensor and a variety of modular devices. One computer can simultaneously operate up to 16 platforms via USB cables. The SmartCage™ detects drug-induced increases and decreases in activity levels, as well as changes in movement patterns. The wake and sleep states of mice can be detected using the vibration floor sensor. The arousal state classification achieved up to 98% accuracy compared with results obtained by electroencephalography and electromyography. More complex behaviours, including motor coordination, anxiety-related behaviours and social approach behaviour, can be assessed using appropriate modular devices, and the results obtained are comparable with results obtained using conventional methods. In conclusion, the SmartCage™ system provides an automated and accurate tool to quantify various rodent behaviours in a 'stress-free' environment. This system, combined with the validated testing protocols, offers a powerful toolkit for transgenic phenotyping and in vivo drug screening. PMID:22540540

  17. PARAMO: A Parallel Predictive Modeling Platform for Healthcare Analytic Research using Electronic Health Records

    PubMed Central

    Ng, Kenney; Ghoting, Amol; Steinhubl, Steven R.; Stewart, Walter F.; Malin, Bradley; Sun, Jimeng

    2014-01-01

    Objective Healthcare analytics research increasingly involves the construction of predictive models for disease targets across varying patient cohorts using electronic health records (EHRs). To facilitate this process, it is critical to support a pipeline of tasks: 1) cohort construction, 2) feature construction, 3) cross-validation, 4) feature selection, and 5) classification. To develop an appropriate model, it is necessary to compare and refine models derived from a diversity of cohorts, patient-specific features, and statistical frameworks. The goal of this work is to develop and evaluate a predictive modeling platform that can be used to simplify and expedite this process for health data. Methods To support this goal, we developed a PARAllel predictive MOdeling (PARAMO) platform which 1) constructs a dependency graph of tasks from specifications of predictive modeling pipelines, 2) schedules the tasks in a topological ordering of the graph, and 3) executes those tasks in parallel. We implemented this platform using Map-Reduce to enable independent tasks to run in parallel in a cluster computing environment. Different task scheduling preferences are also supported. Results We assess the performance of PARAMO on various workloads using three datasets derived from the EHR systems in place at Geisinger Health System and Vanderbilt University Medical Center and an anonymous longitudinal claims database. We demonstrate significant gains in computational efficiency against a standard approach. In particular, PARAMO can build 800 different models on a 300,000 patient data set in 3 hours in parallel compared to 9 days if running sequentially. Conclusion This work demonstrates that an efficient parallel predictive modeling platform can be developed for EHR data. This platform can facilitate large-scale modeling endeavors and speed up the research workflow and reuse of health information. This platform is only a first step and provides the foundation for our ultimate goal of building analytic pipelines that are specialized for health data researchers. PMID:24370496
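
    The pipeline mechanics described above (build a dependency graph of tasks, order it topologically, execute independent tasks in parallel) can be sketched on a single machine. PARAMO itself runs on Map-Reduce in a cluster; the thread-pool version below, with hypothetical task names, only illustrates the scheduling idea.

```python
# Hedged single-machine sketch of PARAMO-style scheduling: topological
# ordering over a task DAG with parallel execution of ready tasks.
# Task names are hypothetical; PARAMO itself runs on Map-Reduce.
from concurrent.futures import ThreadPoolExecutor
from graphlib import TopologicalSorter

def run_task(name):
    print(f"running {name}")  # placeholder for real pipeline work

# task -> set of prerequisite tasks for one modeling pipeline
pipeline = {
    "cohort_construction": set(),
    "feature_construction": {"cohort_construction"},
    "cross_validation": {"feature_construction"},
    "feature_selection": {"cross_validation"},
    "classification": {"feature_selection"},
}

sorter = TopologicalSorter(pipeline)
sorter.prepare()
with ThreadPoolExecutor(max_workers=4) as pool:
    while sorter.is_active():
        ready = sorter.get_ready()                        # deps all satisfied
        futures = [(pool.submit(run_task, t), t) for t in ready]
        for future, task in futures:                      # ready tasks run concurrently
            future.result()
            sorter.done(task)                             # unlock dependent tasks
```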

  18. PARAMO: a PARAllel predictive MOdeling platform for healthcare analytic research using electronic health records.

    PubMed

    Ng, Kenney; Ghoting, Amol; Steinhubl, Steven R; Stewart, Walter F; Malin, Bradley; Sun, Jimeng

    2014-04-01

    Healthcare analytics research increasingly involves the construction of predictive models for disease targets across varying patient cohorts using electronic health records (EHRs). To facilitate this process, it is critical to support a pipeline of tasks: (1) cohort construction, (2) feature construction, (3) cross-validation, (4) feature selection, and (5) classification. To develop an appropriate model, it is necessary to compare and refine models derived from a diversity of cohorts, patient-specific features, and statistical frameworks. The goal of this work is to develop and evaluate a predictive modeling platform that can be used to simplify and expedite this process for health data. To support this goal, we developed a PARAllel predictive MOdeling (PARAMO) platform which (1) constructs a dependency graph of tasks from specifications of predictive modeling pipelines, (2) schedules the tasks in a topological ordering of the graph, and (3) executes those tasks in parallel. We implemented this platform using Map-Reduce to enable independent tasks to run in parallel in a cluster computing environment. Different task scheduling preferences are also supported. We assess the performance of PARAMO on various workloads using three datasets derived from the EHR systems in place at Geisinger Health System and Vanderbilt University Medical Center and an anonymous longitudinal claims database. We demonstrate significant gains in computational efficiency against a standard approach. In particular, PARAMO can build 800 different models on a 300,000 patient data set in 3 hours in parallel compared to 9 days if running sequentially. This work demonstrates that an efficient parallel predictive modeling platform can be developed for EHR data. This platform can facilitate large-scale modeling endeavors and speed up the research workflow and reuse of health information. This platform is only a first step and provides the foundation for our ultimate goal of building analytic pipelines that are specialized for health data researchers. Copyright © 2013 Elsevier Inc. All rights reserved.

  19. ESTEST: An Open Science Platform for Electronic Structure Research

    ERIC Educational Resources Information Center

    Yuan, Gary

    2012-01-01

    Open science platforms in support of data generation, analysis, and dissemination are becoming indispensible tools for conducting research. These platforms use informatics and information technologies to address significant problems in open science data interoperability, verification & validation, comparison, analysis, post-processing,…

  20. Mission Level Autonomy for USSV

    NASA Technical Reports Server (NTRS)

    Huntsberger, Terry; Stirb, Robert C.; Brizzolara, Robert

    2011-01-01

    On-water demonstrations of a wide range of mission-proven, advanced technologies at TRL 5+ provided a total integrated, modular approach that effectively addresses the majority of the key needs for full mission-level autonomous, cross-platform control of USVs. A wide-baseline stereo system mounted on the ONR USSV was shown to be an effective sensing modality for tracking of dynamic contacts, as a first step toward automated retrieval operations. The CASPER onboard planner/replanner successfully demonstrated real-time, on-water resource-based analysis for mission-level goal achievement and on-the-fly opportunistic replanning. Full mixed-mode autonomy was demonstrated on-water with a seamless transition between operator over-ride and return to the current mission plan. Autonomous cooperative operations for fixed asset protection and High Value Unit escort using two USVs (AMN1 and a 14 m RHIB) were demonstrated during Trident Warrior 2010 in June 2010.

  1. The Turkish Version of Web-Based Learning Platform Evaluation Scale: Reliability and Validity Study

    ERIC Educational Resources Information Center

    Dag, Funda

    2016-01-01

    The purpose of this study is to determine the language equivalence and the validity and reliability of the Turkish version of the "Web-Based Learning Platform Evaluation Scale" ("Web Tabanli Ögrenme Ortami Degerlendirme Ölçegi" [WTÖODÖ]) used in the selection and evaluation of web-based learning environments. Within this scope,…

  2. Design of a Parachute Canopy Instrumentation Platform

    NASA Technical Reports Server (NTRS)

    Alshahin, Wahab M.; Daum, Jared S.; Holley, James J.; Litteken, Douglas A.; Vandewalle, Michael T.

    2015-01-01

    This paper discusses the current technology available to design and develop a reliable and compact instrumentation platform for parachute system data collection and command actuation. Wireless communication with a parachute canopy will be an advancement to the state of the art of parachute design, development, and testing. Embedded instrumentation of the parachute canopy will provide reefing line tension, skirt position data, parachute health monitoring, and other telemetry, further validating computer models and giving engineering insight into parachute dynamics for both Earth and Mars entry that is currently unavailable. This will allow for more robust designs that are better optimized in terms of structural loading and less susceptible to adverse dynamics, and may eventually pave the way to currently unattainable advanced concepts of operations. The development of this technology has dual-use potential for a variety of other applications including inflatable habitats, aerodynamic decelerators, heat shields, and other high-stress environments.

  3. Analysis of the economic and ecological performances in the transient regimes of the European driving cycle for a midsize SUV equipped with a DHEP, using the simulation platforms

    NASA Astrophysics Data System (ADS)

    Bancă, Gheorghe; Ivan, Florian; Iozsa, Daniel; Nisulescu, Valentin

    2017-10-01

    Currently, car manufacturers tend to continue expanding the global production of SUVs (Sport Utility Vehicles) while observing the requirements imposed by the new pollution standards, by developing new technologies like the DHEP (Diesel Hybrid Electric Powertrain). Experience has shown that transient regimes are the most difficult to control from an economic and ecological perspective. As a result, this paper highlights the behaviour of such an engine fitted to a mid-size SUV operating in these regimes. We selected the transient regimes characteristic of the NMVEG (New Motor Vehicle Emissions Group) cycle. The investigations using the modelling platform AMESim allowed rigorous interpretation of the 16 acceleration and 18 deceleration phases. The results obtained from the simulation will be validated by experiments.

  4. Development of a software for quantitative evaluation radiotherapy target and organ-at-risk segmentation comparison.

    PubMed

    Kalpathy-Cramer, Jayashree; Awan, Musaddiq; Bedrick, Steven; Rasch, Coen R N; Rosenthal, David I; Fuller, Clifton D

    2014-02-01

    Modern radiotherapy requires accurate region of interest (ROI) inputs for plan optimization and delivery. Target delineation, however, remains operator-dependent and potentially serves as a major source of treatment delivery error. In order to optimize this critical, yet observer-driven process, a flexible web-based platform for individual and cooperative target delineation analysis and instruction was developed in order to meet the following unmet needs: (1) an open-source/open-access platform for automated/semiautomated quantitative interobserver and intraobserver ROI analysis and comparison, (2) a real-time interface for radiation oncology trainee online self-education in ROI definition, and (3) a source for pilot data to develop and validate quality metrics for institutional and cooperative group quality assurance efforts. The resultant software, Target Contour Testing/Instructional Computer Software (TaCTICS), developed using Ruby on Rails, has since been implemented and proven flexible, feasible, and useful in several distinct analytical and research applications.
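
    The abstract does not state which agreement metrics TaCTICS computes; as an illustration of automated interobserver ROI comparison, the sketch below uses one common contour-agreement measure, the Dice similarity coefficient, on hypothetical binary masks.

```python
# Hedged sketch: Dice similarity coefficient between two observers' binary
# ROI masks, a common (but here assumed, not confirmed) comparison metric.
import numpy as np

def dice(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
    """Dice = 2|A intersect B| / (|A| + |B|); 1.0 means identical contours."""
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

observer_1 = np.zeros((64, 64), dtype=bool); observer_1[20:40, 20:40] = True
observer_2 = np.zeros((64, 64), dtype=bool); observer_2[22:42, 22:42] = True
print(f"Dice = {dice(observer_1, observer_2):.3f}")
```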

  5. Routing architecture and security for airborne networks

    NASA Astrophysics Data System (ADS)

    Deng, Hongmei; Xie, Peng; Li, Jason; Xu, Roger; Levy, Renato

    2009-05-01

    Airborne networks are envisioned to provide interconnectivity for terrestrial and space networks by interconnecting highly mobile airborne platforms. A number of military applications are expected to run over such networks, and all of them require proper routing security support to establish correct routes between communicating platforms in a timely manner. Because airborne networks differ somewhat from traditional wired and wireless networks (e.g., Internet, LAN, WLAN, MANET), security mechanisms valid in those networks are not fully applicable to airborne networks, and designing an efficient security scheme to protect them is confronted with new requirements. In this paper, we first identify a candidate routing architecture, which serves as an underlying structure for our proposed security scheme. We then investigate the vulnerabilities and attack models against routing protocols in airborne networks. Based on these studies, we propose an integrated security solution to address routing security issues in airborne networks.

  6. Wireless inertial measurement unit with GPS (WIMU-GPS)--wearable monitoring platform for ecological assessment of lifespace and mobility in aging and disease.

    PubMed

    Boissy, Patrick; Brière, Simon; Hamel, Mathieu; Jog, Mandar; Speechley, Mark; Karelis, Antony; Frank, James; Vincent, Claude; Edwards, Rodrick; Duval, Christian

    2011-01-01

    This paper proposes an innovative ambulatory mobility and activity monitoring approach based on a wearable datalogging platform that combines inertial sensing with GPS tracking to assess the lifespace and mobility profile of individuals in their home and community environments. The components, I/O architecture, sensors and functions of the WIMU-GPS are presented. Outcome variables that can be measured with it are described and illustrated. Data on the power usage and operating autonomy of the WIMU-GPS, as well as its GPS tracking performance and time to first fix, are presented. The study of lifespace and mobility with the WIMU-GPS can potentially provide unique insights into intrapersonal and environmental factors contributing to mobility restriction. On-going studies are underway to establish the validity and reliability of the WIMU-GPS in characterizing the lifespace and mobility profile of older adults.

  7. Visualizing In Situ Microstructure Dependent Crack Tip Stress Distribution in IN-617 Using Nano-mechanical Raman Spectroscopy

    NASA Astrophysics Data System (ADS)

    Zhang, Yang; Mohanty, Debapriya P.; Tomar, Vikas

    2016-11-01

    Inconel 617 (IN-617) is a solid solution alloy widely used in applications that require high-temperature component operation, owing to its high-temperature stability and strength as well as strong resistance to oxidation and carburization. The current work focuses on in situ measurements of stress distribution under 3-point bending at elevated temperature in IN-617. A nanomechanical Raman spectroscopy (NMRS) measurement platform was designed and built by combining a customized open Raman spectroscopy system, incorporating a motorized scanning and imaging system, with a nanomechanical loading platform. Based on NMRS scans of the crack tip notch area, the stress distribution under applied load is obtained with micron-scale resolution for the analyzed microstructures. A finite element method-based formulation to predict crack tip stresses is presented and validated using the presented experimental data.

  8. SPIDER ® sleeve gastrectomy--a new concept in single-trocar bariatric surgery: initial experience and technical details.

    PubMed

    Noel, P; Nedelcu, M; Gagner, M

    2014-04-01

    The single port instrument delivery extended reach (SPIDER®) surgical system is a revolutionary surgical platform that allows triangulation of the surgical instruments while eliminating the crossing of instruments, a problematic characteristic of single-access laparoscopic surgery. The purpose of this study was to analyze our initial experience with SPIDER® sleeve gastrectomy and to present the technical details of this new minimally invasive approach, performed in ten patients at the La Casamance Private Hospital between November 2012 and April 2013. All patients were reviewed at scheduled post-operative consultations at 1, 3 and 6 months. In addition to clinical examination, the post-operative consultation at one month also included a satisfaction survey using the Moorehead-Ardelt questionnaire. An initial series of ten sleeve gastrectomies was performed in female patients with a mean age of 41.5 years (range: 2-52). The mean BMI was 40.11 (range: 37.25-44.3). The intervention was performed through a single trocar in all patients with no "conversion" to classic laparoscopy or open surgery. The mean operative time was 61 ± 15.22 minutes (SD = standard deviation) (range: 43-96 min). The mean BMI at one month was 35.5 (SD: ± 3.58, SEM: ± 1.13) (SEM = standard error of the mean) with an average percentage of excess weight loss (%EWL) of 32.9% (SD: ± 8.56%, SEM: ± 2.71%). The mean BMI at three months was 32.4 (SD: ± 2.78, SEM: ± 0.88) with an average %EWL of 52.7% (SD: ± 8.64%, SEM: ± 2.73%). The mean BMI at six months was 29.9 (SD: ± 2.60, SEM: ± 0.98) with a mean %EWL of 68.8% (SD: ± 8.38%, SEM: ± 3.17%). Complete remission of co-morbid conditions was observed in four patients, improvement in three others, and no change in a single patient. The mean duration of hospitalization was 3.1 days. The mean follow-up period was 161 days (SD: ± 57.4 days, range: 90-243 days). There was no mortality, and no intra-operative or post-operative complications were noted. The SPIDER® surgical platform seems to be a usable and effective method for performing minimally invasive single-access sleeve gastrectomy, offering an easy and efficient operative procedure compared to other single-port systems. Prospective long-term studies are recommended before this approach can be validated as comparable in efficiency to conventional multi-port laparoscopic surgery. Copyright © 2014 Elsevier Masson SAS. All rights reserved.
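
    For reference, the percentage of excess weight loss reported above is conventionally defined as follows; taking the ideal weight as the weight at a BMI of 25 kg/m² is a common convention, not a detail stated in the abstract.

```latex
% Conventional definition of %EWL; W_ideal at BMI = 25 kg/m^2 is an assumption.
\%\mathrm{EWL} = \frac{W_{\mathrm{initial}} - W_{\mathrm{current}}}
                      {W_{\mathrm{initial}} - W_{\mathrm{ideal}}} \times 100
```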

  9. Predicting the Operational Acceptability of Route Advisories

    NASA Technical Reports Server (NTRS)

    Evans, Antony; Lee, Paul

    2017-01-01

    NASA envisions a future Air Traffic Management system that allows safe, efficient growth in global operations, enabled by increasing levels of automation and autonomy. In a safety-critical system, the introduction of increasing automation and autonomy has to be done in stages, making human-system integrated concepts critical in the foreseeable future. One example where this is relevant is for tools that generate more efficient flight routings or reroute advisories. If these routes are not operationally acceptable, they will be rejected by human operators, and the associated benefits will not be realized. Operational acceptance is therefore required to enable the increased efficiency and reduced workload benefits associated with these tools. In this paper, the authors develop a predictor of operational acceptability for reroute advisories. Such a capability has applications in tools that identify more efficient routings around weather and congestion and that better meet airline preferences. The capability is based on applying data mining techniques to flight plan amendment data reported by the Federal Aviation Administration and data on requested reroutes collected from a field trial of the NASA-developed Dynamic Weather Routes tool, which advised efficient route changes to American Airlines dispatchers in 2014. 10-fold cross-validation was used for feature, model and parameter selection, while nested cross-validation was used to validate the model. The model performed well in predicting controller acceptance or rejection of a route change, as indicated by the chosen performance metrics. Features identified as relevant to controller acceptance included the historical usage of the advised route, the location of the maneuver start point relative to the boundaries of the airspace sector containing the maneuver start (the maneuver start sector), the reroute deviation from the original flight plan, and the demand level in the maneuver start sector. A random forest with forty trees was the best performing of the five models evaluated in this paper.
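
    A minimal sketch of the validation scheme described above (10-fold cross-validation for parameter selection nested inside an outer validation loop, with a 40-tree random forest) is given below. The data and feature names are synthetic stand-ins, not the FAA or Dynamic Weather Routes datasets.

```python
# Hedged sketch: nested cross-validation around a 40-tree random forest,
# mirroring the scheme described in the abstract. Data are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV, cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 4))  # stand-ins: route usage, deviation, demand, ...
y = (X[:, 0] + X[:, 3] + rng.normal(size=500)) > 0  # accept/reject label

inner = GridSearchCV(                     # 10-fold CV for parameter selection
    RandomForestClassifier(n_estimators=40, random_state=0),
    param_grid={"max_depth": [3, 5, None]},
    cv=10, scoring="roc_auc",
)
outer = cross_val_score(inner, X, y, cv=5, scoring="roc_auc")  # outer loop
print(f"nested-CV AUC: {outer.mean():.3f} ± {outer.std():.3f}")
```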

  10. Smart repeater system for communications interoperability during multiagency law enforcement operations

    NASA Astrophysics Data System (ADS)

    Crutcher, Richard I.; Jones, R. W.; Moore, Michael R.; Smith, S. F.; Tolley, Alan L.; Rochelle, Robert W.

    1997-02-01

    A prototype 'smart' repeater that provides interoperability capabilities for radio communication systems in multi-agency and multi-user scenarios is being developed by the Oak Ridge National Laboratory. The smart repeater functions as a deployable communications platform that can be dynamically reconfigured to cross-link the radios of participating federal, state, and local government agencies. This interconnection capability improves the coordination and execution of multi-agency operations, including coordinated law enforcement activities and general emergency or disaster response scenarios. The repeater provides multiple channels of operation in the 30-50, 118-136, 138-174, and 403-512 MHz land mobile communications and aircraft bands while providing the ability to cross-connect among multiple frequencies, bands, modulation types, and encryption formats. Additionally, two telephone interconnects provide links to the fixed and cellular telephone networks. The 800- and 900-MHz bands are not supported by the prototype, but the modular design of the system accommodates future retrofits to extend frequency capabilities with minimal impact to the system. Configuration of the repeater is through a portable personal computer with a Windows-based graphical interface control screen that provides dynamic reconfiguration of network interconnections and formats.

  11. Radar cross calibration investigation TAMU radar polarimeter calibration measurements

    NASA Technical Reports Server (NTRS)

    Blanchard, A. J.; Newton, R. W.; Bong, S.; Kronke, C.; Warren, G. L.; Carey, D.

    1982-01-01

    A short-pulse, 20 MHz bandwidth, three-frequency radar polarimeter system (RPS) operates at center frequencies of 10.003 GHz, 4.75 GHz, and 1.6 GHz and utilizes dual-polarized transmit and receive antennas for each frequency. The basic layout of the RPS differs from other truck-mounted systems in that it uses a pulse compression IF section common to all three RF heads. Separate transmit and receive antennas are used to improve the cross-polarization isolation at each particular frequency. The receiver is a digitally controlled, gain-modulated subsystem interfaced directly with a microprocessor computer for control and data manipulation. Antenna focusing distance, the focusing of each antenna pair, RF head stability, and the polarization characteristics of the RPS antennas are discussed. Platform and data acquisition procedures are described.

  12. Study on key technologies of vehicle networking system platform for electric automobiles based on micro-service

    NASA Astrophysics Data System (ADS)

    Ye, Fei

    2018-04-01

    With the rapid increase in electric automobiles and charging piles, elastic scaling and rapid online upgrades are required of the vehicle networking system platform (system platform for short). At present, the traditional monolithic architecture used by the system platform makes it difficult to meet operational needs. This paper studied a system platform technology architecture based on "cloud platform + micro-service" to obtain a new generation of vehicle networking system platform that combines elastic scaling with application services, thus significantly improving the service and operation capabilities of the system.

  13. Validation of Land Surface Temperature from Sentinel-3

    NASA Astrophysics Data System (ADS)

    Ghent, D.

    2017-12-01

    One of the main objectives of the Sentinel-3 mission is to measure sea- and land-surface temperature with high-end accuracy and reliability in support of environmental and climate monitoring in an operational context. Calibration and validation are thus key criteria for operationalization within the framework of the Sentinel-3 Mission Performance Centre (S3MPC). Land surface temperature (LST) has a long heritage of satellite observations which have facilitated our understanding of land surface and climate change processes, such as desertification, urbanization, deforestation and land/atmosphere coupling. These observations have been acquired from a variety of satellite instruments on platforms in both low-Earth orbit and geostationary orbit. Retrieval accuracy can be a challenge though; surface emissivities can be highly variable owing to the heterogeneity of the land, and atmospheric effects caused by the presence of aerosols and by water vapour absorption can give a bias to the underlying LST. As such, a rigorous validation is critical in order to assess the quality of the data and the associated uncertainties. Validation of the level-2 SL_2_LST product, which became freely available on an operational basis from 5th July 2017, builds on an established validation protocol for satellite-based LST. This set of guidelines provides a standardized framework for structuring LST validation activities. The protocol introduces a four-pronged approach which can be summarised thus: i) in situ validation where ground-based observations are available; ii) radiance-based validation over sites that are homogeneous in emissivity; iii) intercomparison with retrievals from other satellite sensors; iv) time-series analysis to identify artefacts on an interannual time-scale. This multi-dimensional approach is a necessary requirement for assessing the performance of the LST algorithm for the Sea and Land Surface Temperature Radiometer (SLSTR), which is designed around biome-based coefficients, thus emphasizing the importance of non-traditional forms of validation such as radiance-based techniques. Here we present examples of the ongoing routine application of the protocol to operational Sentinel-3 LST data.

  14. Criterion and Concurrent Validity of the activPAL™ Professional Physical Activity Monitor in Adolescent Females

    PubMed Central

    Dowd, Kieran P.; Harrington, Deirdre M.; Donnelly, Alan E.

    2012-01-01

    Background The activPAL has been identified as an accurate and reliable measure of sedentary behaviour. However, only limited information is available on the accuracy of the activPAL activity count function as a measure of physical activity, while no unit calibration of the activPAL has been completed to date. This study aimed to investigate the criterion validity of the activPAL, examine the concurrent validity of the activPAL, and perform and validate a value calibration of the activPAL in an adolescent female population. The performance of the activPAL in estimating posture was also compared with sedentary thresholds used with the ActiGraph accelerometer. Methodologies Thirty adolescent females (15 developmental; 15 cross-validation) aged 15–18 years performed 5 activities while wearing the activPAL, ActiGraph GT3X, and the Cosmed K4B2. A random coefficient statistics model examined the relationship between metabolic equivalent (MET) values and activPAL counts. Receiver operating characteristic analysis was used to determine activity thresholds and for cross-validation. The random coefficient statistics model showed a concordance correlation coefficient of 0.93 (standard error of the estimate = 1.13). An optimal moderate threshold of 2997 was determined using mixed regression, while an optimal vigorous threshold of 8229 was determined using receiver operating statistics. The activPAL count function demonstrated very high concurrent validity (r = 0.96, p<0.01) with the ActiGraph count function. Levels of agreement for sitting, standing, and stepping between direct observation and the activPAL and ActiGraph were 100%, 98.1%, 99.2% and 100%, 0%, 100%, respectively. Conclusions These findings suggest that the activPAL is a valid, objective measurement tool that can be used for both the measurement of physical activity and sedentary behaviours in an adolescent female population. PMID:23094069
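
    One common way to derive an activity-count cut-point from receiver operating characteristic analysis, as done above for the vigorous threshold, is to maximize Youden's J; whether the study used this exact criterion is not stated, and the counts below are synthetic.

```python
# Hedged sketch: ROC-based selection of an activity-count threshold by
# maximizing Youden's J (sensitivity + specificity - 1). Synthetic data.
import numpy as np
from sklearn.metrics import roc_curve

rng = np.random.default_rng(1)
counts = np.concatenate([rng.normal(2000, 800, 200),    # non-vigorous epochs
                         rng.normal(9000, 2500, 200)])  # vigorous epochs
is_vigorous = np.r_[np.zeros(200), np.ones(200)]

fpr, tpr, thresholds = roc_curve(is_vigorous, counts)
best = np.argmax(tpr - fpr)  # index maximizing Youden's J
print(f"optimal count threshold ≈ {thresholds[best]:.0f}")
```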

  15. Monte Carlo-based validation of neutronic methodology for EBR-II analyses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liaw, J.R.; Finck, P.J.

    1993-01-01

    The continuous-energy Monte Carlo code VIM (Ref. 1) has been validated extensively over the years against fast critical experiments and other neutronic analysis codes. A high degree of confidence in VIM for predicting reactor physics parameters has been firmly established. This paper presents a numerical validation of two conventional multigroup neutronic analysis codes, DIF3D (Ref. 4) and VARIANT (Ref. 5), against VIM for two Experimental Breeder Reactor II (EBR-II) core loadings in detailed three-dimensional hexagonal-z geometry. The DIF3D code is based on nodal diffusion theory, and it is used in calculations for day-to-day reactor operations, whereas the VARIANT code is based on nodal transport theory and is used with increasing frequency for specific applications. Both DIF3D and VARIANT rely on multigroup cross sections generated from ENDF/B-V by the ETOE-2/MC²-II/SDX (Ref. 6) code package. Hence, this study also validates the multigroup cross-section processing methodology against the continuous-energy approach used in VIM.

  16. Dielectrophoretic lab-on-CMOS platform for trapping and manipulation of cells.

    PubMed

    Park, Kyoungchul; Kabiri, Shideh; Sonkusale, Sameer

    2016-02-01

    Trapping and manipulation of cells are essential operations in numerous studies in biology and life sciences. We discuss the realization of a Lab-on-a-Chip platform for dielectrophoretic trapping and repositioning of cells and microorganisms on a complementary metal oxide semiconductor (CMOS) technology, which we define here as Lab-on-CMOS (LoC). The LoC platform is based on dielectrophoresis (DEP), the force experienced by any dielectric particle, including biological entities, in a non-uniform AC electric field. The DEP force depends on the permittivity of the cells, their size and shape, and also on the permittivity of the medium; it therefore enables selective targeting of cells based on their phenotype. In this paper, we address an important matter, that of electrode design for DEP, for which we propose a three-dimensional (3D) octapole geometry to create highly confined electric fields for trapping and manipulation of cells. Conventional DEP-based platforms are implemented stand-alone on glass, silicon or polymers connected to external infrastructure for electronics and optics, making them bulky and expensive. In this paper, the use of CMOS as a platform provides a pathway to a truly miniaturized lab-on-CMOS or LoC platform, where DEP electrodes are designed using the built-in multiple metal layers of the CMOS process for effective trapping of cells, with built-in electronics for in-situ impedance monitoring of the cell position. We present electromagnetic simulation results of the DEP force for this unique 3D octapole geometry on CMOS. Experimental results with yeast cells validate the design. These preliminary results indicate the promise of using CMOS technology for a truly compact miniaturized lab-on-chip platform for cell biotechnology applications.
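
    For reference, the dependence of the DEP force on particle size and on the particle and medium permittivities described above is captured by the standard textbook expression for a spherical particle; this is general theory, not an equation quoted from the paper.

```latex
% Time-averaged DEP force on a spherical particle of radius r in a medium of
% permittivity eps_m; K(omega) is the Clausius-Mossotti factor built from the
% complex permittivities of particle (p) and medium (m).
\mathbf{F}_{\mathrm{DEP}} = 2\pi \varepsilon_m r^{3}\,
  \mathrm{Re}\!\left[K(\omega)\right] \nabla \left|\mathbf{E}_{\mathrm{rms}}\right|^{2},
\qquad
K(\omega) = \frac{\varepsilon_p^{*} - \varepsilon_m^{*}}{\varepsilon_p^{*} + 2\varepsilon_m^{*}}
```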

  17. Mini All-purpose Satellite Control Center (MASCC)

    NASA Technical Reports Server (NTRS)

    Zaouche, Gerard

    1994-01-01

    A new generation of Mini All-purpose Satellite Control Centers (MASCC) has been developed by CNES (France). They turn out to be easily adaptable to different kinds of satellites, both low-Earth-orbit and geostationary. The features of MASCC allow both standard satellite control activities and the checking of passenger experiments hosted on a space platform. In the different environments in which it may be used, MASCC provides standard broadcasting of telemetry parameters on animated synoptics (curves, bar graphs, alphanumeric displays, ...), which turns out to be a very useful and ergonomic medium for operational teams or satellite specialists. Special care has been taken during MASCC development on two points: automation of all routine tasks, allowing automated operation and limiting human commitment to system supervision and decision making; and software adaptability. To reach these two main objectives, the MASCC design provides: (1) a simple, robust and flexible hardware architecture, based on powerful distributed workstations; and (2) a table-driven software architecture, easily adapted to various operational needs. Satellite characteristics are described in a central database. Hence, the processing of telemetry and commands is largely independent of the satellite itself. In order to validate these capabilities, the MASCC has been customized to several types of satellites and orbital platforms: (1) SPOT4, the French new generation of remote sensing satellites; (2) TELECOM2, the French geostationary TV and telecommunication satellite; and (3) MIR, the Russian orbital platform. MASCC development was completed by the third quarter of 1993. This paper first provides a description of the MASCC basic functions and of its hardware and software design. It then details the increased automation capability, along with the easy adaptation of the MASCC to new satellites with minimal software modifications.

  18. Intercomparison and validation of operational coastal-scale models, the experience of the project MOMAR.

    NASA Astrophysics Data System (ADS)

    Brandini, C.; Coudray, S.; Taddei, S.; Fattorini, M.; Costanza, L.; Lapucci, C.; Poulain, P.; Gerin, R.; Ortolani, A.; Gozzini, B.

    2012-04-01

    The need for regional governments to implement operational systems for the sustainable management of coastal waters, in order to meet the requirements imposed by legislation (e.g. EU directives such as the WFD, MSFD and BD, and relevant national legislation), often leads to the implementation of coastal measurement networks and to the construction of computational models that surround and describe parts of regional seas without falling within the classic definition of regional/coastal models. Although these operational models may be structured to cover parts of different oceanographic basins, they can have considerable advantages and highlight relevant issues, such as the role of narrow channels, straits and islands in coastal circulation, in both physical and biogeochemical processes, such as the exchange of water masses among basins. Two models of this type were built in the context of the cross-border European project MOMAR: an operational model of the Tuscan Archipelago sea and one of the Corsica coastal waters, both located between the Tyrrhenian and the Algerian-Ligurian-Provençal basins. Although these two models were based on different computer codes (MARS3D and ROMS), they have several elements in common, such as a 400 m resolution, boundary conditions from the same "father" model, and an important area of overlap, the Corsica channel, which has a key role in the exchange of water masses between the two oceanographic basins. In this work we present the results of the comparison of these two ocean forecasting systems in response to different weather and oceanographic forcing. In particular, we discuss aspects related to the validation of the two systems and a systematic comparison of the forecasts/hindcasts produced by these hydrodynamic models against both larger-scale operational models and in situ measurements made by fixed or mobile platforms. In this context we also present the results of two oceanographic cruises in the marine area between Tuscany and Corsica, named MELBA (May 2011) and MILONGA (October 2011). In both campaigns, in addition to standard oceanographic measurements (profiles, samples), current data were collected along tracks using vessel-mounted ADCPs, which allowed us to identify some of the most interesting hydrodynamic features of the area. During MELBA, such current measurements were also carried out through the use of an Autonomous Underwater Vehicle (AUV), while during MILONGA a large survey of the area and a mapping of currents and water masses were carried out by a large number of Lagrangian instruments (drifters and floats). First results allow a hydrodynamic characterization of the Corsica channel, highlighting the three-dimensional structure of the currents along the channel and characterizing the current reversals (from north to south and vice versa) under different oceanographic and weather conditions. The collected data provide a basis for a first validation of these operational models and allow the evaluation of their relative reliability under different conditions.

  19. Hip2Norm: an object-oriented cross-platform program for 3D analysis of hip joint morphology using 2D pelvic radiographs.

    PubMed

    Zheng, G; Tannast, M; Anderegg, C; Siebenrock, K A; Langlotz, F

    2007-07-01

    We developed an object-oriented cross-platform program to perform three-dimensional (3D) analysis of hip joint morphology using two-dimensional (2D) anteroposterior (AP) pelvic radiographs. Landmarks extracted from 2D AP pelvic radiographs, and optionally an additional lateral pelvic X-ray, were combined with a cone beam projection model to reconstruct 3D hip joints. Since individual pelvic orientation can vary considerably, a method for standardizing pelvic orientation was implemented to determine the absolute tilt/rotation. The evaluation of anatomically morphologic differences was achieved by reconstructing the projected acetabular rim and the measured hip parameters as if obtained in a standardized neutral orientation. The program has been successfully used to interactively objectify acetabular version in hips with femoro-acetabular impingement or developmental dysplasia. Hip2Norm is written in the object-oriented programming language C++ using the cross-platform software Qt (TrollTech, Oslo, Norway) for the graphical user interface (GUI) and is transportable to any platform.

  20. Cross border semantic interoperability for clinical research: the EHR4CR semantic resources and services

    PubMed Central

    Daniel, Christel; Ouagne, David; Sadou, Eric; Forsberg, Kerstin; Gilchrist, Mark Mc; Zapletal, Eric; Paris, Nicolas; Hussain, Sajjad; Jaulent, Marie-Christine; Kalra, Dipka

    2016-01-01

    With the development of platforms enabling the use of routinely collected clinical data in the context of international clinical research, scalable solutions for cross border semantic interoperability need to be developed. Within the context of the IMI EHR4CR project, we first defined the requirements and evaluation criteria of the EHR4CR semantic interoperability platform and then developed the semantic resources and supportive services and tooling to assist hospital sites in standardizing their data to allow the execution of the project use cases. The experience gained from the evaluation of the EHR4CR platform accessing semantically equivalent data elements across 11 participating European EHR systems from 5 countries demonstrated how far the mediation model and mapping efforts met the expected requirements of the project. Developers of semantic interoperability platforms are beginning to address a core set of requirements in order to reach the goal of developing cross border semantic integration of data. PMID:27570649

  1. The National Cohort of Dairy Farms--a data collection platform for mastitis research in Canada.

    PubMed

    Reyher, K K; Dufour, S; Barkema, H W; Des Côteaux, L; Devries, T J; Dohoo, I R; Keefe, G P; Roy, J-P; Scholl, D T

    2011-03-01

    Costs and feasibility of extensive sample collection and processing are major obstacles to mastitis epidemiology research. Studies are often consequentially limited, and fundamental mastitis researchers rarely have the opportunity to conduct their work in epidemiologically valid populations. To mitigate these limitations, the Canadian Bovine Mastitis Research Network has optimized research funds by creating a data collection platform to provide epidemiologically meaningful data for several simultaneous research endeavors. This platform consists of a National Cohort of Dairy Farms (NCDF), Mastitis Laboratory Network, and Mastitis Pathogen Culture Collection. This paper describes the implementation and operation of the NCDF, explains its sampling protocols and data collection, and documents characteristics, strengths and limitations of these data for current and potential users. The NCDF comprises 91 commercial dairy farms in 6 provinces sampled over a 2-yr period. Primarily Holstein-Friesian herds participating in Dairy Herd Improvement milk recording were selected in order to achieve a uniform distribution among 3 strata of bulk tank somatic cell counts and to reflect regional proportions of freestall housing systems. Standardized protocols were implemented for repeated milk samplings on clinical mastitis cases, fresh and randomly selected lactating cows, and cows at dry-off and after calving. Just fewer than 133,000 milk samples were collected. Demographic and production data were recorded at individual cow and farm levels. Health management data are documented and extensive questionnaire data detailing farm management and cleanliness information are also captured. The Laboratory Network represents coordinated regional mastitis bacteriology laboratories using standardized procedures. The Culture Collection archives isolates recovered from intramammary infections of cows in the NCDF and contains over 16,500 isolates, all epidemiologically cross-referenced between linked databases. The NCDF is similar to Canadian dairies in relation to mean herd size, average production, and freestall percentages. Pathogen recovery was greater than anticipated, particularly for coagulase-negative staphylococci and Corynebacterium spp. International scientists are encouraged to use this extensive archive of data and material to enhance their own mastitis research. Copyright © 2011 American Dairy Science Association. Published by Elsevier Inc. All rights reserved.

  2. Smoothing Data Friction through building Service Oriented Data Platforms

    NASA Astrophysics Data System (ADS)

    Wyborn, L. A.; Richards, C. J.; Evans, B. J. K.; Wang, J.; Druken, K. A.

    2017-12-01

    Data Friction has been commonly defined as the costs in time, energy and attention required to simply collect, check, store, move, receive, and access data. On average, researchers spend a significant fraction of their time finding the data for their research project and then reformatting it so that it can be used by the software application of their choice. There is an increasing role for both data repositories and software to be modernised to help reduce data friction in ways that support the better use of the data. Many generic data repositories simply accept data in the format as supplied: the key check is that the data have sufficient metadata to enable discovery and download. Few generic repositories have both the expertise and infrastructure to support the multiple domain-specific requirements that facilitate the increasing need for integration and reusability. In contrast, major science domain-focused repositories are increasingly able to implement and enforce community-endorsed best practices and guidelines that ensure reusability and harmonization of data for use within the community, by offering semi-automated QC workflows to improve the quality of submitted data. The most advanced of these science repositories now operate as service-oriented data platforms that extend the use of data across domain silos and increasingly provide server-side, programmatically enabled access to data via network protocols and community-standard APIs. To provide this, more rigorous QA/QC procedures are needed to validate data against standards and community software and tools. This ensures that the data can be accessed in expected ways and also demonstrates that the data work across different (non-domain-specific) packages, tools and programming languages deployed by the various user communities. In Australia, the National Computational Infrastructure (NCI) has created such a service-oriented data platform, which is demonstrating how this approach can reduce data friction, serving individual domains as well as facilitating cross-domain collaboration. The approach has required increased effort from the repository to provide the additional expertise, but it enables a more capable and efficient system that ultimately saves time for the individual researcher.

  3. A Set of Free Cross-Platform Authoring Programs for Flexible Web-Based CALL Exercises

    ERIC Educational Resources Information Center

    O'Brien, Myles

    2012-01-01

    The Mango Suite is a set of three freely downloadable cross-platform authoring programs for flexible network-based CALL exercises. They are Adobe Air applications, so they can be used on Windows, Macintosh, or Linux computers, provided the freely-available Adobe Air has been installed on the computer. The exercises which the programs generate are…

  4. Optimal Combinations of Diagnostic Tests Based on AUC.

    PubMed

    Huang, Xin; Qin, Gengsheng; Fang, Yixin

    2011-06-01

    When several diagnostic tests are available, one can combine them to achieve better diagnostic accuracy. This article considers the optimal linear combination that maximizes the area under the receiver operating characteristic curve (AUC); the estimates of the combination's coefficients can be obtained via a nonparametric procedure. However, for estimating the AUC associated with the estimated coefficients, the apparent estimation by re-substitution is too optimistic. To adjust for the upward bias, several methods are proposed. Among them the cross-validation approach is especially advocated, and an approximated cross-validation is developed to reduce the computational cost. Furthermore, these proposed methods can be applied for variable selection to select important diagnostic tests. The proposed methods are examined through simulation studies and applications to three real examples. © 2010, The International Biometric Society.
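
    The optimism of re-substitution that the abstract corrects for can be made concrete with a small sketch: fit a linear combination of two tests, then compare the apparent (re-substitution) AUC with a cross-validated AUC. Logistic regression stands in here for the paper's nonparametric coefficient estimator, and the data are synthetic.

```python
# Hedged sketch: apparent (re-substitution) AUC vs cross-validated AUC for a
# fitted linear combination of two diagnostic tests. Synthetic data; logistic
# regression is a stand-in for the paper's nonparametric procedure.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(2)
n = 120
y = rng.integers(0, 2, n)                    # disease status
X = np.c_[y + rng.normal(0, 1.5, n),         # diagnostic test 1
          y + rng.normal(0, 2.0, n)]         # diagnostic test 2

apparent = roc_auc_score(y, LogisticRegression().fit(X, y).decision_function(X))
cv_scores = cross_val_predict(LogisticRegression(), X, y,
                              cv=10, method="decision_function")
print(f"apparent AUC {apparent:.3f} vs cross-validated AUC "
      f"{roc_auc_score(y, cv_scores):.3f}")
```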

  5. Assessment of the cPAS-based BGISEQ-500 platform for metagenomic sequencing.

    PubMed

    Fang, Chao; Zhong, Huanzi; Lin, Yuxiang; Chen, Bing; Han, Mo; Ren, Huahui; Lu, Haorong; Luber, Jacob M; Xia, Min; Li, Wangsheng; Stein, Shayna; Xu, Xun; Zhang, Wenwei; Drmanac, Radoje; Wang, Jian; Yang, Huanming; Hammarström, Lennart; Kostic, Aleksandar D; Kristiansen, Karsten; Li, Junhua

    2018-03-01

    More extensive use of metagenomic shotgun sequencing in microbiome research relies on the development of high-throughput, cost-effective sequencing. Here we present a comprehensive evaluation of the performance of the new high-throughput sequencing platform BGISEQ-500 for metagenomic shotgun sequencing and compare its performance with that of 2 Illumina platforms. Using fecal samples from 20 healthy individuals, we evaluated the intra-platform reproducibility for metagenomic sequencing on the BGISEQ-500 platform in a setup comprising 8 library replicates and 8 sequencing replicates. Cross-platform consistency was evaluated by comparing 20 pairwise replicates on the BGISEQ-500 platform vs the Illumina HiSeq 2000 platform and the Illumina HiSeq 4000 platform. In addition, we compared the performance of the 2 Illumina platforms against each other. Using a newly developed overall-accuracy quality-control method, an average of 82.45 million high-quality reads per sample (96.06% of raw reads), with 90.56% of bases scoring Q30 or above, was obtained on the BGISEQ-500 platform. Quantitative analyses revealed extremely high reproducibility between BGISEQ-500 intra-platform replicates. Cross-platform replicates differed slightly more than intra-platform replicates, yet a high consistency was observed. Only a low percentage (2.02%-3.25%) of genes exhibited significant differences in relative abundance comparing the BGISEQ-500 and HiSeq platforms, with a bias toward genes with higher GC content being enriched on the HiSeq platforms. Our study provides the first set of performance metrics for human gut metagenomic sequencing data using BGISEQ-500. The high accuracy and technical reproducibility confirm the applicability of the new platform for metagenomic studies, though caution is still warranted when combining metagenomic data from different platforms.

  6. A GPU-accelerated Monte Carlo dose calculation platform and its application toward validating an MRI-guided radiation therapy beam model

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Yuhe; Mazur, Thomas R.; Green, Olga

    Purpose: The clinical commissioning of IMRT subject to a magnetic field is challenging. The purpose of this work is to develop a GPU-accelerated Monte Carlo dose calculation platform based on PENELOPE and then use the platform to validate a vendor-provided MRIdian head model toward quality assurance of clinical IMRT treatment plans subject to a 0.35 T magnetic field. Methods: PENELOPE was first translated from FORTRAN to C++ and the result was confirmed to produce equivalent results to the original code. The C++ code was then adapted to CUDA in a workflow optimized for GPU architecture. The original code was expanded to include voxelized transport with Woodcock tracking, faster electron/positron propagation in a magnetic field, and several features that make gPENELOPE highly user-friendly. Moreover, the vendor-provided MRIdian head model was incorporated into the code in an effort to apply gPENELOPE as both an accurate and rapid dose validation system. A set of experimental measurements were performed on the MRIdian system to examine the accuracy of both the head model and gPENELOPE. Ultimately, gPENELOPE was applied toward independent validation of patient doses calculated by MRIdian's KMC. Results: An acceleration factor of 152 was achieved in comparison to the original single-thread FORTRAN implementation with the original accuracy being preserved. For 16 treatment plans including stomach (4), lung (2), liver (3), adrenal gland (2), pancreas (2), spleen (1), mediastinum (1), and breast (1), the MRIdian dose calculation engine agrees with gPENELOPE with a mean gamma passing rate of 99.1% ± 0.6% (2%/2 mm). Conclusions: A Monte Carlo simulation platform was developed based on a GPU-accelerated version of PENELOPE. This platform was used to validate that both the vendor-provided head model and fast Monte Carlo engine used by the MRIdian system are accurate in modeling radiation transport in a patient using 2%/2 mm gamma criteria. Future applications of this platform will include dose validation and accumulation, IMRT optimization, and dosimetry system modeling for next generation MR-IGRT systems.
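
    The 2%/2 mm gamma criterion quoted in these records combines a dose-difference tolerance with a distance-to-agreement tolerance. The sketch below shows the metric in simplified 1-D, globally normalized form on synthetic dose profiles; clinical tools evaluate it on 3-D dose grids.

```python
# Hedged sketch: simplified 1-D global gamma analysis (2%/2 mm). A reference
# point passes if some evaluated point is simultaneously close in dose
# (fraction dd of the max dose) and in position (dta, in mm). Synthetic data.
import numpy as np

def gamma_pass_rate(dose_ref, dose_eval, x, dd=0.02, dta=2.0):
    d_max = dose_ref.max()
    passed = 0
    for i, d in enumerate(dose_ref):
        gamma_sq = ((dose_eval - d) / (dd * d_max)) ** 2 + ((x - x[i]) / dta) ** 2
        passed += gamma_sq.min() <= 1.0
    return 100.0 * passed / dose_ref.size

x = np.arange(0.0, 100.0, 1.0)                        # positions in mm
reference = np.exp(-((x - 50.0) / 20.0) ** 2)         # reference dose profile
evaluated = 1.01 * np.exp(-((x - 50.5) / 20.0) ** 2)  # shifted, rescaled copy
print(f"gamma pass rate: {gamma_pass_rate(reference, evaluated, x):.1f}%")
```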

  7. A GPU-accelerated Monte Carlo dose calculation platform and its application toward validating an MRI-guided radiation therapy beam model

    PubMed Central

    Wang, Yuhe; Mazur, Thomas R.; Green, Olga; Hu, Yanle; Li, Hua; Rodriguez, Vivian; Wooten, H. Omar; Yang, Deshan; Zhao, Tianyu; Mutic, Sasa; Li, H. Harold

    2016-01-01

    Purpose: The clinical commissioning of IMRT subject to a magnetic field is challenging. The purpose of this work is to develop a GPU-accelerated Monte Carlo dose calculation platform based on PENELOPE and then use the platform to validate a vendor-provided MRIdian head model toward quality assurance of clinical IMRT treatment plans subject to a 0.35 T magnetic field. Methods: PENELOPE was first translated from FORTRAN to C++ and the result was confirmed to produce equivalent results to the original code. The C++ code was then adapted to CUDA in a workflow optimized for GPU architecture. The original code was expanded to include voxelized transport with Woodcock tracking, faster electron/positron propagation in a magnetic field, and several features that make gPENELOPE highly user-friendly. Moreover, the vendor-provided MRIdian head model was incorporated into the code in an effort to apply gPENELOPE as both an accurate and rapid dose validation system. A set of experimental measurements were performed on the MRIdian system to examine the accuracy of both the head model and gPENELOPE. Ultimately, gPENELOPE was applied toward independent validation of patient doses calculated by MRIdian's KMC. Results: An acceleration factor of 152 was achieved in comparison to the original single-thread FORTRAN implementation with the original accuracy being preserved. For 16 treatment plans including stomach (4), lung (2), liver (3), adrenal gland (2), pancreas (2), spleen (1), mediastinum (1), and breast (1), the MRIdian dose calculation engine agrees with gPENELOPE with a mean gamma passing rate of 99.1% ± 0.6% (2%/2 mm). Conclusions: A Monte Carlo simulation platform was developed based on a GPU-accelerated version of PENELOPE. This platform was used to validate that both the vendor-provided head model and fast Monte Carlo engine used by the MRIdian system are accurate in modeling radiation transport in a patient using 2%/2 mm gamma criteria. Future applications of this platform will include dose validation and accumulation, IMRT optimization, and dosimetry system modeling for next generation MR-IGRT systems. PMID:27370123

  8. A GPU-accelerated Monte Carlo dose calculation platform and its application toward validating an MRI-guided radiation therapy beam model.

    PubMed

    Wang, Yuhe; Mazur, Thomas R; Green, Olga; Hu, Yanle; Li, Hua; Rodriguez, Vivian; Wooten, H Omar; Yang, Deshan; Zhao, Tianyu; Mutic, Sasa; Li, H Harold

    2016-07-01

    The clinical commissioning of IMRT subject to a magnetic field is challenging. The purpose of this work is to develop a GPU-accelerated Monte Carlo dose calculation platform based on PENELOPE and then use the platform to validate a vendor-provided MRIdian head model toward quality assurance of clinical IMRT treatment plans subject to a 0.35 T magnetic field. PENELOPE was first translated from FORTRAN to C++ and the result was confirmed to produce equivalent results to the original code. The C++ code was then adapted to CUDA in a workflow optimized for GPU architecture. The original code was expanded to include voxelized transport with Woodcock tracking, faster electron/positron propagation in a magnetic field, and several features that make gPENELOPE highly user-friendly. Moreover, the vendor-provided MRIdian head model was incorporated into the code in an effort to apply gPENELOPE as both an accurate and rapid dose validation system. A set of experimental measurements were performed on the MRIdian system to examine the accuracy of both the head model and gPENELOPE. Ultimately, gPENELOPE was applied toward independent validation of patient doses calculated by MRIdian's KMC. An acceleration factor of 152 was achieved in comparison to the original single-thread FORTRAN implementation with the original accuracy being preserved. For 16 treatment plans including stomach (4), lung (2), liver (3), adrenal gland (2), pancreas (2), spleen (1), mediastinum (1), and breast (1), the MRIdian dose calculation engine agrees with gPENELOPE with a mean gamma passing rate of 99.1% ± 0.6% (2%/2 mm). A Monte Carlo simulation platform was developed based on a GPU-accelerated version of PENELOPE. This platform was used to validate that both the vendor-provided head model and fast Monte Carlo engine used by the MRIdian system are accurate in modeling radiation transport in a patient using 2%/2 mm gamma criteria. Future applications of this platform will include dose validation and accumulation, IMRT optimization, and dosimetry system modeling for next generation MR-IGRT systems.

  9. High throughput field plant phenotyping facility at University of Nebraska-Lincoln and the first year experience

    NASA Astrophysics Data System (ADS)

    Ge, Y.; Bai, G.; Irmak, S.; Awada, T.; Stoerger, V.; Graef, G.; Scoby, D.; Schnable, J.

    2017-12-01

    The University of Nebraska-Lincoln's high-throughput field plant phenotyping facility is a cable-robot-based system built on a 1-acre field. The sensor platform is tethered with eight cables via four poles at the corners of the field for precise control and positioning. The sensor modules on the platform include a 4-band RGB-NIR camera, a thermal infrared camera, a 3D LiDAR, VNIR spectrometers, and environmental sensors. These sensors are used to collect multifaceted physiological, structural and chemical properties of plants from the field plots. A subsurface drip irrigation system is established in this field, which allows a controlled amount of water and fertilizer to be delivered to individual plots. An extensive soil moisture sensor network is also established to monitor soil water status and serve as a feedback loop for irrigation scheduling. In the first year of operation, the field was planted with maize and soybean. Weekly ground truth data were collected from the plots to validate image and sensor data from the phenotyping system. This presentation will provide an overview of this state-of-the-art field plant phenotyping facility and present preliminary data from the first year of operation of the system.

  10. Aircraft/island/ship/satellite intercomparison: Preliminary results from July 16, 1987

    NASA Technical Reports Server (NTRS)

    Hanson, Howard P.; Davidson, Ken; Gerber, Herman; Khalsa, Siri Jodha Singh; Kloesel, Kevin A.; Schwiesow, Ronald; Snider, Jack B.; Wielicki, Bruce M.; Wylie, Donald P.

    1990-01-01

    The First ISCCP Regional Experiment (FIRE) objective of validating and improving satellite algorithms for inferring cloud properties from satellite radiances was one of the central motivating factors in the design of the specific field experimental strategies used in the July, 1987 marine stratocumulus intensive field observations (IFO). The in situ measuring platforms were deployed to take maximum advantage of redundant measurements (for intercomparison of the in situ sensors) and to provide optimal coverage within satellite images. One of the most ambitious of these strategies was the attempt to coordinate measurements from San Nicolas Island (SNI), the R/V Pt. Sur, the meteorological aircraft, and the satellites. For the most part, this attempt was frustrated by flight restrictions in the vicinity of SNI. The exception was the mission of July 16, 1987, which achieved remarkable success in the coordination of the platforms. This presentation concerns operations conducted by the National Center for Atmospheric Research (NCAR) Electra and how data from the Electra can be integrated with and compared to data from the Pt. Sur, SNI, and the satellites. The focus is on the large-scale, integrated picture of the conditions on July 16 from the perspective of the Electra's flight operations.

  11. NASA Operational Simulator for Small Satellites (NOS3)

    NASA Technical Reports Server (NTRS)

    Zemerick, Scott

    2015-01-01

    The Simulation-to-Flight 1 (STF-1) CubeSat mission aims to demonstrate how legacy simulation technologies may be adapted for flexible and effective use on missions using the CubeSat platform. These technologies, named NASA Operational Simulator (NOS), have demonstrated significant value on several missions such as James Webb Space Telescope, Global Precipitation Measurement, Juno, and Deep Space Climate Observatory in the areas of software development, mission operations training, verification and validation (V&V), test procedure development, and software systems check-out. STF-1 will demonstrate a highly portable simulation and test platform that allows seamless transition of mission development artifacts to flight products. This environment will decrease development time of future CubeSat missions by lessening the dependency on hardware resources. In addition, through a partnership between NASA GSFC, the West Virginia Space Grant Consortium and West Virginia University, the STF-1 CubeSat will host payloads for three secondary objectives: advancing engineering research on navigation systems for small satellites, providing useful data for understanding magnetosphere-ionosphere coupling and space weather, and verifying the performance and durability of III-V nitride-based materials.

  12. Continuous measurement of breast tumour hormone receptor expression: a comparison of two computational pathology platforms.

    PubMed

    Ahern, Thomas P; Beck, Andrew H; Rosner, Bernard A; Glass, Ben; Frieling, Gretchen; Collins, Laura C; Tamimi, Rulla M

    2017-05-01

    Computational pathology platforms incorporate digital microscopy with sophisticated image analysis to permit rapid, continuous measurement of protein expression. We compared two computational pathology platforms on their measurement of breast tumour oestrogen receptor (ER) and progesterone receptor (PR) expression. Breast tumour microarrays from the Nurses' Health Study were stained for ER (n=592) and PR (n=187). One expert pathologist scored cases as positive if ≥1% of tumour nuclei exhibited stain. ER and PR were then measured with the Definiens Tissue Studio (automated) and Aperio Digital Pathology (user-supervised) platforms. Platform-specific measurements were compared using boxplots, scatter plots and correlation statistics. Classification of ER and PR positivity by platform-specific measurements was evaluated with areas under receiver operating characteristic curves (AUC) from univariable logistic regression models, using expert pathologist classification as the standard. Both platforms showed considerable overlap in continuous measurements of ER and PR between positive and negative groups classified by expert pathologist. Platform-specific measurements were strongly and positively correlated with one another (r ≥ 0.77). The user-supervised Aperio workflow performed slightly better than the automated Definiens workflow at classifying ER positivity (AUC_Aperio = 0.97; AUC_Definiens = 0.90; difference = 0.07, 95% CI 0.05 to 0.09) and PR positivity (AUC_Aperio = 0.94; AUC_Definiens = 0.87; difference = 0.07, 95% CI 0.03 to 0.12). Paired hormone receptor expression measurements from two different computational pathology platforms agreed well with one another. The user-supervised workflow yielded better classification accuracy than the automated workflow. Appropriately validated computational pathology algorithms enrich molecular epidemiology studies with continuous protein expression data and may accelerate tumour biomarker discovery.

  13. Validation of GPU-accelerated superposition-convolution dose computations for the Small Animal Radiation Research Platform.

    PubMed

    Cho, Nathan; Tsiamas, Panagiotis; Velarde, Esteban; Tryggestad, Erik; Jacques, Robert; Berbeco, Ross; McNutt, Todd; Kazanzides, Peter; Wong, John

    2018-05-01

    The Small Animal Radiation Research Platform (SARRP) has been developed for conformal microirradiation with on-board cone beam CT (CBCT) guidance. The graphics processing unit (GPU)-accelerated Superposition-Convolution (SC) method for dose computation has been integrated into the treatment planning system (TPS) for SARRP. This paper describes the validation of the SC method for the kilovoltage energy range by comparison with EBT2 film measurements and Monte Carlo (MC) simulations. MC data were simulated with the EGSnrc code using 3 × 10^8 to 1.5 × 10^9 histories, while 21 photon energy bins were used to model the 220 kVp x-rays in the SC method. Various types of phantoms including plastic water, cork, graphite, and aluminum were used to encompass the range of densities of mouse organs. For the comparison, percentage depth dose (PDD) curves from SC, MC, and film measurements were analyzed. Cross-beam (x, y) dosimetric profiles of SC and film measurements are also presented. Correction factors (CFz) to convert SC to MC dose-to-medium are derived from the SC and MC simulations in homogeneous phantoms of aluminum and graphite to improve the estimation. The SC method produces dose values that are within 5% of film measurements and MC simulations in the flat regions of the profile. The dose is less accurate at the edges, due to factors such as geometric uncertainties of film placement and differences in dose calculation grids. The GPU-accelerated Superposition-Convolution dose computation method was successfully validated with EBT2 film measurements and MC calculations. The SC method offers much faster computation speed than MC and provides calculations of both dose-to-water in medium and dose-to-medium in medium.
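
    The CFz correction described above can be thought of as a depth-dependent multiplicative lookup derived from matched SC and MC depth-dose curves. A minimal sketch under that assumption, with invented illustrative values; the paper's actual derivation should be consulted for details:

    ```python
    import numpy as np

    # Assumed inputs: matched percentage-depth-dose curves from the MC and SC
    # engines in a homogeneous phantom (e.g., graphite), sampled at depths_mm.
    # All numbers below are illustrative, not measured values.
    depths_mm = np.array([0., 5., 10., 20., 40.])
    pdd_mc = np.array([100., 71., 50., 25., 6.3])
    pdd_sc = np.array([100., 73., 52., 27., 7.0])

    cfz = pdd_mc / pdd_sc  # depth-dependent SC -> MC correction factor

    def correct_sc_dose(dose_sc, depth_mm):
        """Convert an SC dose at a given depth to an MC-equivalent estimate."""
        return dose_sc * np.interp(depth_mm, depths_mm, cfz)

    print(correct_sc_dose(10.0, 15.0))  # SC dose of 10 Gy at 15 mm depth
    ```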

  14. miR-16-5p Is a Stably-Expressed Housekeeping MicroRNA in Breast Cancer Tissues from Primary Tumors and from Metastatic Sites

    PubMed Central

    Rinnerthaler, Gabriel; Hackl, Hubert; Gampenrieder, Simon Peter; Hamacher, Frank; Hufnagl, Clemens; Hauser-Kronberger, Cornelia; Zehentmayr, Franz; Fastner, Gerd; Sedlmayer, Felix; Mlineritsch, Brigitte; Greil, Richard

    2016-01-01

    For quantitative microRNA analyses in formalin-fixed paraffin-embedded (FFPE) tissue, expression levels have to be normalized to endogenous controls. To investigate the most stably-expressed microRNAs in breast cancer and its surrounding tissue, we used tumor samples from primary tumors and from metastatic sites. MiRNA profiling using TaqMan® Array Human MicroRNA Cards, enabling quantification of 754 unique human miRNAs, was performed in FFPE specimens from 58 patients with metastatic breast cancer. Forty-two (72%) samples were collected from primary tumors and 16 (28%) from metastases. In a cross-platform analysis, genome-wide microRNA expression was profiled in a validation cohort of 32 FFPE samples from patients with early breast cancer using SurePrint G3 miRNA (8 × 60 K)® microarrays from Agilent®. Eleven microRNAs could be detected in all samples analyzed. Based on NormFinder and geNorm stability values and the high correlation (rho ≥ 0.8) with the median of all measured microRNAs, miR-16-5p, miR-29a-3p, miR-126-3p, and miR-222-3p are suitable single-gene housekeeper candidates. In the cross-platform validation, 29 human microRNAs were strongly expressed (mean log2-intensity > 10) and 21 of these microRNAs, including miR-16-5p and miR-29a-3p, were also stably expressed (CV < 5%). Thus, miR-16-5p and miR-29a-3p are both strong housekeeper candidates. Their NormFinder stability values, calculated across the primary tumor and metastasis subgroups, indicate that miR-29a-3p can be considered the strongest housekeeper in a cohort consisting mainly of primary tumor samples, whereas miR-16-5p might perform better in a cohort enriched with metastatic samples. PMID:26821018
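
    The screening criteria quoted here (CV < 5% on log2 intensities and rho ≥ 0.8 against the per-sample median) are simple to apply to an expression matrix. A sketch assuming a miRNA-by-sample array of log2 intensities; NormFinder and geNorm stability values would still come from their own tools:

    ```python
    import numpy as np
    from scipy.stats import spearmanr

    def housekeeper_candidates(log2_expr, names, cv_max=0.05, rho_min=0.8):
        """Screen housekeeper candidates.

        log2_expr : (n_mirnas, n_samples) array of log2 intensities (assumed
                    positive); names : list of miRNA identifiers.
        """
        median_profile = np.median(log2_expr, axis=0)  # per-sample median
        keep = []
        for i, name in enumerate(names):
            x = log2_expr[i]
            cv = x.std(ddof=1) / x.mean()          # coefficient of variation
            rho, _ = spearmanr(x, median_profile)  # tracks the global profile?
            if cv < cv_max and rho >= rho_min:
                keep.append(name)
        return keep
    ```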

  15. miR-16-5p Is a Stably-Expressed Housekeeping MicroRNA in Breast Cancer Tissues from Primary Tumors and from Metastatic Sites.

    PubMed

    Rinnerthaler, Gabriel; Hackl, Hubert; Gampenrieder, Simon Peter; Hamacher, Frank; Hufnagl, Clemens; Hauser-Kronberger, Cornelia; Zehentmayr, Franz; Fastner, Gerd; Sedlmayer, Felix; Mlineritsch, Brigitte; Greil, Richard

    2016-01-26

    For quantitative microRNA analyses in formalin-fixed paraffin-embedded (FFPE) tissue, expression levels have to be normalized to endogenous controls. To investigate the most stably-expressed microRNAs in breast cancer and its surrounding tissue, we used tumor samples from primary tumors and from metastatic sites. MiRNA profiling using TaqMan(®) Array Human MicroRNA Cards, enabling quantification of 754 unique human miRNAs, was performed in FFPE specimens from 58 patients with metastatic breast cancer. Forty-two (72%) samples were collected from primary tumors and 16 (28%) from metastases. In a cross-platform analysis, genome-wide microRNA expression was profiled in a validation cohort of 32 FFPE samples from patients with early breast cancer using SurePrint G3 miRNA (8 × 60 K)(®) microarrays from Agilent(®). Eleven microRNAs could be detected in all samples analyzed. Based on NormFinder and geNorm stability values and the high correlation (rho ≥ 0.8) with the median of all measured microRNAs, miR-16-5p, miR-29a-3p, miR-126-3p, and miR-222-3p are suitable single-gene housekeeper candidates. In the cross-platform validation, 29 human microRNAs were strongly expressed (mean log2-intensity > 10) and 21 of these microRNAs, including miR-16-5p and miR-29a-3p, were also stably expressed (CV < 5%). Thus, miR-16-5p and miR-29a-3p are both strong housekeeper candidates. Their NormFinder stability values, calculated across the primary tumor and metastasis subgroups, indicate that miR-29a-3p can be considered the strongest housekeeper in a cohort consisting mainly of primary tumor samples, whereas miR-16-5p might perform better in a cohort enriched with metastatic samples.

  16. Cloud computing and validation of expandable in silico livers.

    PubMed

    Ropella, Glen E P; Hunt, C Anthony

    2010-12-03

    In Silico Livers (ISLs) are works in progress. They are used to challenge multilevel, multi-attribute, mechanistic hypotheses about the hepatic disposition of xenobiotics coupled with hepatic responses. To enhance ISL-to-liver mappings, we added discrete time metabolism, biliary elimination, and bolus dosing features to a previously validated ISL and initiated re-validation experiments that required scaling to more simulated lobules than previously, more than could be achieved using the local cluster technology. Rather than dramatically increasing the size of our local cluster, we undertook the re-validation experiments using the Amazon EC2 cloud platform. Doing so required demonstrating the efficacy of scaling a simulation to use more cluster nodes and assessing the scientific equivalence of local cluster validation experiments with those executed using the cloud platform. The local cluster technology was duplicated in the Amazon EC2 cloud platform. Synthetic modeling protocols were followed to identify a successful parameterization. Experiment sample sizes (number of simulated lobules) on both platforms were 49, 70, 84, and 152 (cloud only). Experimental indistinguishability was demonstrated for ISL outflow profiles of diltiazem using both platforms for experiments consisting of 84 or more samples. The process was analogous to demonstration of results equivalency from two different wet-labs. The results provide additional evidence that disposition simulations using ISLs can cover the behavior space of liver experiments in distinct experimental contexts (there is in silico-to-wet-lab phenotype similarity). The scientific value of experimenting with multiscale biomedical models has been limited to research groups with access to computer clusters. The availability of cloud technology coupled with the evidence of scientific equivalency has lowered the barrier and will greatly facilitate model sharing as well as provide straightforward tools for scaling simulations to encompass greater detail with no extra investment in hardware.

  17. A promising method for identifying cross-cultural differences in patient perspective: the use of Internet-based focus groups for content validation of new Patient Reported Outcome assessments

    PubMed Central

    Atkinson, Mark J; Lohs, Jan; Kuhagen, Ilka; Kaufman, Julie; Bhaidani, Shamsu

    2006-01-01

    Objectives This proof of concept (POC) study was designed to evaluate the use of an Internet-based bulletin board technology to aid parallel cross-cultural development of thematic content for a new set of patient-reported outcome measures (PROs). Methods The POC study, conducted in Germany and the United States, utilized Internet Focus Groups (IFGs) to assure the validity of new PRO items across the two cultures – all items were designed to assess the impact of excess facial oil on individuals' lives. The on-line IFG activities were modeled after traditional face-to-face focus groups and organized by a common 'Topic' Guide designed with input from thought leaders in dermatology and health outcomes research. The two sets of IFGs were professionally moderated in the native language of each country. IFG moderators coded the thematic content of transcripts, and a frequency analysis of code endorsement was used to identify areas of content similarity and difference between the two countries. Based on this information, draft PRO items were designed and a majority (80%) of the original participants returned to rate the relative importance of the newly designed questions. Findings The use of parallel cross-cultural content analysis of IFG transcripts permitted identification of the major content themes in each country as well as exploration of the possible reasons for any observed differences between the countries. Results from coded frequency counts and transcript reviews informed the design and wording of the test questions for the future PRO instrument(s). Subsequent ratings of item importance also deepened our understanding of potential areas of cross-cultural difference, differences that would be explored over the course of future validation studies involving these PROs. Conclusion The use of IFGs for cross-cultural content development received positive reviews from participants and was found to be both cost and time effective. The novel thematic coding methodology provided an empirical platform on which to develop culturally sensitive questionnaire content using the natural language of participants. Overall, the IFG responses and thematic analyses provided a thorough evaluation of similarities and differences in cross-cultural themes, which in turn acted as a sound base for the development of new PRO questionnaires. PMID:16995935

  18. A promising method for identifying cross-cultural differences in patient perspective: the use of Internet-based focus groups for content validation of new patient reported outcome assessments.

    PubMed

    Atkinson, Mark J; Lohs, Jan; Kuhagen, Ilka; Kaufman, Julie; Bhaidani, Shamsu

    2006-09-22

    This proof of concept (POC) study was designed to evaluate the use of an Internet-based bulletin board technology to aid parallel cross-cultural development of thematic content for a new set of patient-reported outcome measures (PROs). The POC study, conducted in Germany and the United States, utilized Internet Focus Groups (IFGs) to assure the validity of new PRO items across the two cultures--all items were designed to assess the impact of excess facial oil on individuals' lives. The on-line IFG activities were modeled after traditional face-to-face focus groups and organized by a common 'Topic' Guide designed with input from thought leaders in dermatology and health outcomes research. The two sets of IFGs were professionally moderated in the native language of each country. IFG moderators coded the thematic content of transcripts, and a frequency analysis of code endorsement was used to identify areas of content similarity and difference between the two countries. Based on this information, draft PRO items were designed and a majority (80%) of the original participants returned to rate the relative importance of the newly designed questions. The use of parallel cross-cultural content analysis of IFG transcripts permitted identification of the major content themes in each country as well as exploration of the possible reasons for any observed differences between the countries. Results from coded frequency counts and transcript reviews informed the design and wording of the test questions for the future PRO instrument(s). Subsequent ratings of item importance also deepened our understanding of potential areas of cross-cultural difference, differences that would be explored over the course of future validation studies involving these PROs. The use of IFGs for cross-cultural content development received positive reviews from participants and was found to be both cost and time effective. The novel thematic coding methodology provided an empirical platform on which to develop culturally sensitive questionnaire content using the natural language of participants. Overall, the IFG responses and thematic analyses provided a thorough evaluation of similarities and differences in cross-cultural themes, which in turn acted as a sound base for the development of new PRO questionnaires.

  19. Soft tissue deformation for surgical simulation: a position-based dynamics approach.

    PubMed

    Camara, Mafalda; Mayer, Erik; Darzi, Ara; Pratt, Philip

    2016-06-01

    To assist the rehearsal and planning of robot-assisted partial nephrectomy, a real-time simulation platform is presented that allows surgeons to visualise and interact with rapidly constructed patient-specific biomechanical models of the anatomical regions of interest. Coupled to a framework for volumetric deformation, the platform furthermore simulates intracorporeal 2D ultrasound image acquisition, using preoperative imaging as the data source. This not only facilitates the planning of optimal transducer trajectories and viewpoints, but can also act as a validation context for manually operated freehand 3D acquisitions and reconstructions. The simulation platform was implemented within the GPU-accelerated NVIDIA FleX position-based dynamics framework. In order to validate the model and determine material properties and other simulation parameter values, a porcine kidney with embedded fiducial beads was CT-scanned and segmented. Acquisitions for the rest position and three different levels of probe-induced deformation were collected. Optimal values of the cluster stiffness coefficients were determined for a range of different particle radii, where the objective function comprised the mean distance error between real and simulated fiducial positions over the sequence of deformations. The mean fiducial error at each deformation stage was found to be compatible with the level of ultrasound probe calibration error typically observed in clinical practice. Furthermore, the simulation exhibited unconditional stability on account of its use of clustered shape-matching constraints. A novel position-based dynamics implementation of soft tissue deformation has been shown to facilitate several desirable simulation characteristics: real-time performance, unconditional stability, rapid model construction enabling patient-specific behaviour and accuracy with respect to reference CT images.

  20. Measuring and Validating Neutron Capture Cross Sections Using a Lead Slowing-Down Spectrometer

    NASA Astrophysics Data System (ADS)

    Thompson, Nicholas

    Accurate nuclear data is essential for the modeling, design, and operation of nuclear systems. In this work, the Rensselaer Polytechnic Institute (RPI) Lead Slowing-Down Spectrometer (LSDS) at the Gaerttner Linear Accelerator Center (LINAC) was used to measure neutron capture cross sections and validate capture cross sections in cross section libraries. The RPI LINAC was used to create a fast burst of neutrons in the center of the LSDS, a large cube of high purity lead. A sample and YAP:Ce scintillator were placed in the LSDS, and as neutrons lost energy through scattering interactions with the lead, the scintillator detected capture gammas resulting from neutron capture events in the sample. Samples of silver, gold, cobalt, iron, indium, molybdenum, niobium, nickel, tin, tantalum, and zirconium were measured. Data was collected as a function of time after neutron pulse, or slowing-down time, which is correlated to average neutron energy. An analog and a digital data acquisition system collected data simultaneously, allowing for collection of pulse shape information as well as timing. Collection of digital data allowed for pulse shape analysis after the experiment. This data was then analyzed and compared to Monte Carlo simulations to validate the accuracy of neutron capture cross section libraries. These measurements represent the first time that neutron capture cross sections have been measured using an LSDS in the United States, and the first time tools such as coincidence measurements and pulse height weighting have been applied to measurements of neutron capture cross sections using an LSDS. Significant differences between measurement results and simulation results were found in multiple materials, and some errors in nuclear data libraries have already been identified due to these measurements.
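
    The correlation between slowing-down time and average neutron energy in an LSDS is conventionally written as E = K/(t + t0)^2. A sketch with placeholder constants (K and t0 are spectrometer-specific calibration values, not the RPI numbers):

    ```python
    # Standard LSDS time-energy relation: E = K / (t + t0)^2.
    # K and t0 below are illustrative placeholders of a typical order of
    # magnitude for lead, not the facility's calibration values.
    K_KEV_US2 = 165.0   # keV * us^2
    T0_US = 0.3         # us, offset for the finite source burst

    def mean_energy_keV(slowing_down_time_us):
        """Average neutron energy at a given slowing-down time."""
        return K_KEV_US2 / (slowing_down_time_us + T0_US) ** 2

    for t in (1.0, 10.0, 100.0):
        print(f"t = {t:6.1f} us -> E ~ {mean_energy_keV(t):8.3f} keV")
    ```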

  1. Development and validation of a gene expression oligo microarray for the gilthead sea bream (Sparus aurata).

    PubMed

    Ferraresso, Serena; Vitulo, Nicola; Mininni, Alba N; Romualdi, Chiara; Cardazzo, Barbara; Negrisolo, Enrico; Reinhardt, Richard; Canario, Adelino V M; Patarnello, Tomaso; Bargelloni, Luca

    2008-12-03

    Aquaculture represents the most sustainable alternative source of seafood to substitute for declining marine fisheries, but severe production bottlenecks remain to be solved. The application of genomic technologies offers much promise to rapidly increase our knowledge on biological processes in farmed species and overcome such bottlenecks. Here we present an integrated platform for mRNA expression profiling in the gilthead sea bream (Sparus aurata), a marine teleost of great importance for aquaculture. A public database was constructed, consisting of 19,734 unique clusters (3,563 contigs and 16,171 singletons). Functional annotation was obtained for 8,021 clusters. Over 4,000 sequences were also associated with a GO entry. Two 60mer probes were designed for each gene and synthesized in situ on glass slides using Agilent SurePrint technology. Platform reproducibility and accuracy were assessed on two early stages of sea bream development (one-day and four-day-old larvae). Correlation between technical replicates was always > 0.99, with strong positive correlation between paired probes. A two-class SAM test identified 1,050 differentially expressed genes between the two developmental stages. Functional analysis suggested that down-regulated transcripts (407) in older larvae are mostly essential/housekeeping genes, whereas tissue-specific genes are up-regulated in parallel with the formation of key organs (eye, digestive system). Cross-validation of microarray data was carried out using qRT-PCR on 11 target genes, selected to reflect the whole range of fold-change and both up-regulated and down-regulated genes. A statistically significant positive correlation was obtained comparing expression levels for each target gene across all biological replicates. Good concordance between qRT-PCR and microarray data was observed between 2- and 7-fold change, while fold-change compression in the microarray was present for differences greater than 10-fold in the qRT-PCR. A highly reliable oligo-microarray platform was developed and validated for the gilthead sea bream despite the presently limited knowledge of the species transcriptome. Because of its flexible design, this array will be able to accommodate additional probes as soon as novel unique transcripts are available.

  2. OTEC Cold Water Pipe-Platform Subsystem Dynamic Interaction Validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Varley, Robert; Halkyard, John; Johnson, Peter

    A commercial floating 100-megawatt (MW) ocean thermal energy conversion (OTEC) power plant will require a cold water pipe (CWP) with a diameter of 10 meters (m) and a length of up to 1,000 m. The mass of the cold water pipe, including entrained water, can exceed the mass of the platform supporting it. The offshore industry uses software-modeling tools to develop platform and riser (pipe) designs to survive the offshore environment. These tools are typically validated by scale model tests in facilities able to replicate real at-sea meteorological and ocean (metocean) conditions to provide the understanding and confidence to proceed to final design and full-scale fabrication. However, today's offshore platforms (similar to and usually larger than those needed for OTEC applications) incorporate risers (or pipes) with diameters well under one meter. Secondly, the preferred construction method for large diameter OTEC CWPs is the use of composite materials, primarily a form of fiber-reinforced plastic (FRP). The use of these materials results in relatively low pipe stiffness and large strains compared to steel construction. These factors suggest the need for further validation of offshore industry software tools. The purpose of this project was to validate the ability to model numerically the dynamic interaction between a large cold water-filled fiberglass pipe and a floating OTEC platform excited by metocean weather conditions, using measurements from a scale model tested in an ocean basin test facility.

  3. Multidisciplinary Tool for Systems Analysis of Planetary Entry, Descent, and Landing

    NASA Technical Reports Server (NTRS)

    Samareh, Jamshid A.

    2011-01-01

    Systems analysis of planetary entry, descent, and landing (EDL) is multidisciplinary in nature. SAPE, a systems analysis tool for planetary EDL, improves the performance of the systems analysis team by automating and streamlining the process, and this improvement can reduce the errors that stem from manual data transfer among discipline experts. SAPE is a multidisciplinary tool for systems analysis of planetary EDL for Venus, Earth, Mars, Jupiter, Saturn, Uranus, Neptune, and Titan. It performs EDL systems analysis for any planet, operates cross-platform (i.e., Windows, Mac, and Linux operating systems), uses existing software components and open-source software to avoid software licensing issues, performs low-fidelity systems analysis in one hour on a computer that is comparable to an average laptop, and keeps discipline experts in the analysis loop. SAPE uses Python, a platform-independent, open-source language, for integration and for the user interface. Development has relied heavily on the object-oriented programming capabilities that are available in Python. Modules are provided to interface with commercial and government off-the-shelf software components (e.g., thermal protection systems and finite-element analysis). SAPE currently includes the following analysis modules: geometry, trajectory, aerodynamics, aerothermal, thermal protection system, and an interface for structural sizing.

  4. An open-source software platform for data management, visualisation, model building and model sharing in water, energy and other resource modelling domains.

    NASA Astrophysics Data System (ADS)

    Knox, S.; Meier, P.; Mohammed, K.; Korteling, B.; Matrosov, E. S.; Hurford, A.; Huskova, I.; Harou, J. J.; Rosenberg, D. E.; Thilmant, A.; Medellin-Azuara, J.; Wicks, J.

    2015-12-01

    Capacity expansion on resource networks is essential to adapting to economic and population growth and pressures such as climate change. Engineered infrastructure systems such as water, energy, or transport networks require sophisticated and bespoke models to refine management and investment strategies. Successful modeling of such complex systems relies on good data management and advanced methods to visualize and share data. Engineered infrastructure systems are often represented as networks of nodes and links with operating rules describing their interactions. Infrastructure system management and planning can be abstracted to simulating or optimizing new operations and extensions of the network. By separating the data storage of abstract networks from manipulation and modeling we have created a system where infrastructure modeling across various domains is facilitated. We introduce Hydra Platform, free and open-source software designed for analysts and modelers to store, manage and share network topology and data. Hydra Platform is a Python library with a web service layer for remote applications, called Apps, to connect. Apps serve various functions including network or results visualization, data export (e.g. into a proprietary format) or model execution. This client-server architecture allows users to manipulate and share centrally stored data. XML templates allow a standardised description of the data structure required for storing network data such that it is compatible with specific models. Hydra Platform represents networks in an abstract way and is therefore not bound to a single modeling domain. It is the Apps that create domain-specific functionality. Using Apps, researchers from different domains can incorporate different models within the same network, enabling cross-disciplinary modeling while minimizing errors and streamlining data sharing. Separating the Python library from the web layer allows developers to natively expand the software or build web-based apps in other languages for remote functionality. Partner CH2M is developing a commercial user-interface for Hydra Platform; however, custom interfaces and visualization tools can be built. Hydra Platform is available on GitHub while Apps will be shared on a central repository.
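
    The central design idea, a domain-neutral node-link store with templates declaring what attributes a given model expects, can be illustrated with a generic data model. This is a sketch of the architecture only, not Hydra Platform's actual API; all names are hypothetical:

    ```python
    # Illustrative data model for a domain-neutral network store. A template
    # declares the attributes a specific model requires; a domain App checks
    # the network against it before running the model.
    from dataclasses import dataclass, field

    @dataclass
    class Node:
        name: str
        attributes: dict = field(default_factory=dict)  # e.g. {"capacity": 5e6}

    @dataclass
    class Link:
        source: str
        target: str
        attributes: dict = field(default_factory=dict)

    @dataclass
    class Network:
        nodes: list
        links: list

    def validate_against_template(network, required_node_attrs):
        """List (node, attribute) pairs missing for a model's template."""
        return [(n.name, a) for n in network.nodes
                for a in required_node_attrs if a not in n.attributes]

    net = Network([Node("reservoir", {"capacity": 5e6}), Node("city")],
                  [Link("reservoir", "city", {"max_flow": 100.0})])
    print(validate_against_template(net, ["capacity"]))  # [('city', 'capacity')]
    ```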

  5. Role of turbulence fluctuations on uncertainties of acoustic Doppler current profiler discharge measurements

    USGS Publications Warehouse

    Tarrab, Leticia; Garcia, Carlos M.; Cantero, Mariano I.; Oberg, Kevin

    2012-01-01

    This work presents a systematic analysis quantifying the effect of turbulence fluctuations on uncertainties (random errors) in acoustic Doppler current profiler (ADCP) discharge measurements from moving platforms. Data sets of three-dimensional flow velocities with high temporal and spatial resolution were generated from direct numerical simulation (DNS) of turbulent open channel flow. Dimensionless functions relating parameters quantifying the uncertainty in discharge measurements due to flow turbulence (relative variance and relative maximum random error) to sampling configuration were developed from the DNS simulations and then validated with field-scale discharge measurements. The validated functions were used to evaluate the effect of flow turbulence fluctuations on uncertainties in ADCP discharge measurements. The results of this work indicate that random errors due to flow turbulence are significant when: (a) a low number of transects is used for a discharge measurement, and (b) measurements are made in shallow rivers using high boat velocity (short time for the boat to cross a flow turbulence structure).

  6. Perception SoC Based on an Ultrasonic Array of Sensors: Efficient DSP Core Implementation and Subsequent Experimental Results

    NASA Astrophysics Data System (ADS)

    Kassem, A.; Sawan, M.; Boukadoum, M.; Haidar, A.

    2005-12-01

    We are concerned with the design, implementation, and validation of a perception SoC based on an ultrasonic array of sensors. The proposed SoC is dedicated to ultrasonic echography applications. A rapid prototyping platform is used to implement and validate the new architecture of the digital signal processing (DSP) core. The proposed DSP core efficiently integrates all of the necessary ultrasonic B-mode processing modules. It includes digital beamforming, quadrature demodulation of RF signals, digital filtering, and envelope detection of the received signals. This system handles 128 scan lines and 6400 samples per scan line over its angle-of-view span. The design uses a minimum-size lookup memory to store the initial scan information. Rapid prototyping using an ARM/FPGA combination is used to validate the operation of the described system. This system offers significant advantages of portability and a rapid time to market.
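
    Two of the listed DSP stages, quadrature demodulation and envelope detection, reduce to mixing the RF line with the carrier and low-pass filtering. A sketch on synthetic data; the sample rate, carrier frequency, and filter are illustrative, not the SoC's actual parameters:

    ```python
    import numpy as np

    FS = 40e6   # sample rate (Hz), illustrative
    F0 = 5e6    # ultrasound carrier (Hz), illustrative

    def envelope(rf):
        """Quadrature demodulation followed by envelope detection."""
        t = np.arange(len(rf)) / FS
        i = rf * np.cos(2 * np.pi * F0 * t)   # in-phase mixing
        q = -rf * np.sin(2 * np.pi * F0 * t)  # quadrature mixing
        lp = np.ones(32) / 32                 # crude moving-average low-pass
        i, q = np.convolve(i, lp, "same"), np.convolve(q, lp, "same")
        return 2 * np.hypot(i, q)             # factor 2 restores RF amplitude

    # synthetic echo: Gaussian-windowed burst at the carrier frequency
    t = np.arange(2048) / FS
    rf = np.exp(-((t - 25e-6) ** 2) / (2 * (3e-6) ** 2)) * np.cos(2 * np.pi * F0 * t)
    print(envelope(rf).max())  # ~1.0, the burst amplitude
    ```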

  7. Harmonics analysis of the ITER poloidal field converter based on a piecewise method

    NASA Astrophysics Data System (ADS)

    Xudong, WANG; Liuwei, XU; Peng, FU; Ji, LI; Yanan, WU

    2017-12-01

    Poloidal field (PF) converters provide controlled DC voltage and current to PF coils. The many harmonics generated by the PF converter flow into the power grid and seriously affect power systems and electric equipment. Due to the complexity of the system, the traditional integral operation in Fourier analysis is complicated and inaccurate. This paper presents a piecewise method to calculate the harmonics of the ITER PF converter. The relationship between the grid input current and the DC output current of the ITER PF converter is deduced. The grid current is decomposed into a sum of simple functions. By calculating the harmonics of each simple function with the piecewise method, the harmonics of the PF converter under different operating modes are obtained. To examine the validity of the method, a simulation model was built in Matlab/Simulink and a corresponding experiment was carried out on the ITER PF integration test platform. Comparative results are given: the calculated results are consistent with both simulation and experiment, demonstrating that the piecewise method is correct and valid for calculating the system harmonics.
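
    The piecewise idea, splitting the grid current into simple segments and summing each segment's contribution to the Fourier coefficients, can be sketched numerically. The waveform below is an idealized 120°-conduction line current used for illustration, not the actual ITER PF converter current:

    ```python
    import numpy as np

    # Piecewise description of one period of an idealized line current:
    # (start, end, function) segments over [0, 2*pi); zero elsewhere.
    segments = [
        (np.pi / 6, 5 * np.pi / 6, lambda th: np.ones_like(th)),       # +1 block
        (7 * np.pi / 6, 11 * np.pi / 6, lambda th: -np.ones_like(th)), # -1 block
    ]

    def harmonic(n, npts=20001):
        """Magnitude of the n-th Fourier coefficient, summed segment by segment."""
        a = b = 0.0
        for lo, hi, f in segments:
            th = np.linspace(lo, hi, npts)
            w = f(th)
            a += (hi - lo) * np.mean(w * np.cos(n * th)) / np.pi
            b += (hi - lo) * np.mean(w * np.sin(n * th)) / np.pi
        return np.hypot(a, b)

    # Characteristic harmonics n = 1, 5, 7, 11, ... fall off as 1/n;
    # triplen harmonics (n = 3, 9, ...) vanish for this waveform.
    for n in (1, 3, 5, 7, 11, 13):
        print(n, round(harmonic(n), 4))
    ```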

  8. Phage display peptide libraries in molecular allergology: from epitope mapping to mimotope-based immunotherapy.

    PubMed

    Luzar, J; Štrukelj, B; Lunder, M

    2016-11-01

    Identification of allergen epitopes is a key component in proper understanding of the pathogenesis of type I allergies, for understanding cross-reactivity and for the development of mimotope immunotherapeutics. Phage particles have garnered recognition in the field of molecular allergology due to their value not only in competitive immunoscreening of peptide libraries but also as immunogenic carriers of allergen mimotopes. They integrate epitope discovery technology and immunization functions into a single platform. This article provides an overview of allergen mimotopes identified through the phage display technique. We discuss the contribution of phage display peptide libraries in determining dominant B-cell epitopes of allergens, in developing mimotope immunotherapy, in understanding cross-reactivity, and in determining IgE epitope profiles of individual patients to improve diagnostics and individualize immunotherapy. We also discuss the advantages and pitfalls of the methodology used to identify and validate the mimotopes.

  9. The Development of a Finite Volume Method for Modeling Sound in Coastal Ocean Environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Long, Wen; Yang, Zhaoqing; Copping, Andrea E.

    With the rapid growth of marine renewable energy and offshore wind energy, there have been concerns that the noise generated from construction and operation of the devices may interfere with marine animals' communication. In this research, an underwater sound model is developed to simulate sound propagation generated by marine hydrokinetic energy (MHK) devices or offshore wind (OSW) energy platforms. Finite volume and finite difference methods are developed to solve the 3D Helmholtz equation of sound propagation in the coastal environment. For the finite volume method, the grid system consists of triangular grids in the horizontal plane and sigma layers in the vertical dimension. A 3D sparse matrix solver with complex coefficients is formed for solving the resulting acoustic pressure field. The Complex Shifted Laplacian Preconditioner (CSLP) method is applied to efficiently solve the matrix system iteratively with MPI parallelization on a high-performance cluster. The sound model is then coupled with the Finite Volume Community Ocean Model (FVCOM) to simulate sound propagation generated by human activities in a range-dependent setting, such as offshore wind energy platform construction and tidal stream turbines. As a proof of concept, initial validation of the finite difference solver is presented for two coastal wedge problems. Validation of the finite volume method will be reported separately.
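
    As a toy counterpart to the solver described above, a 1D Helmholtz equation with complex coefficients can be discretized by finite differences and solved directly; the production code instead uses a CSLP-preconditioned iterative solver with MPI. All parameters here are illustrative:

    ```python
    import numpy as np

    # 1D Helmholtz: u'' + k^2 (1 + i*eta) u = f, with u = 0 at both ends.
    # A small complex absorption term eta keeps the discrete problem well posed.
    N, L = 400, 100.0      # grid points, domain length (m), illustrative
    K = 2 * np.pi / 10.0   # wavenumber for a 10 m wavelength
    ETA = 0.02             # loss factor, illustrative

    h = L / (N + 1)
    idx = np.arange(N)

    # tridiagonal operator assembled densely for brevity (sparse in production)
    A = np.zeros((N, N), dtype=complex)
    A[idx, idx] = -2.0 / h**2 + K**2 * (1 + 1j * ETA)
    A[idx[:-1], idx[:-1] + 1] = 1.0 / h**2
    A[idx[1:], idx[1:] - 1] = 1.0 / h**2

    f = np.zeros(N, dtype=complex)
    f[N // 2] = 1.0 / h    # point source mid-domain

    u = np.linalg.solve(A, f)  # direct solve; CSLP + Krylov at scale
    print(np.abs(u).max())
    ```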

  10. A GPU OpenCL based cross-platform Monte Carlo dose calculation engine (goMC)

    NASA Astrophysics Data System (ADS)

    Tian, Zhen; Shi, Feng; Folkerts, Michael; Qin, Nan; Jiang, Steve B.; Jia, Xun

    2015-09-01

    Monte Carlo (MC) simulation has been recognized as the most accurate dose calculation method for radiotherapy. However, the extremely long computation time impedes its clinical application. Recently, a lot of effort has been made to realize fast MC dose calculation on graphics processing units (GPUs). However, most of the GPU-based MC dose engines have been developed under NVidia's CUDA environment. This limits the code portability to other platforms, hindering the introduction of GPU-based MC simulations to clinical practice. The objective of this paper is to develop a GPU OpenCL-based cross-platform MC dose engine named goMC with coupled photon-electron simulation for external photon and electron radiotherapy in the MeV energy range. Compared to our previously developed GPU-based MC code named gDPM (Jia et al 2012 Phys. Med. Biol. 57 7783-97), goMC has two major differences. First, it was developed under the OpenCL environment for high code portability and hence could be run not only on different GPU cards but also on CPU platforms. Second, we adopted the electron transport model used in the EGSnrc MC package and PENELOPE's random hinge method in our new dose engine, instead of the dose planning method employed in gDPM. Dose distributions were calculated for a 15 MeV electron beam and a 6 MV photon beam in a homogeneous water phantom, a water-bone-lung-water slab phantom and a half-slab phantom. Satisfactory agreement between the two MC dose engines goMC and gDPM was observed in all cases. The average dose differences in the regions that received a dose higher than 10% of the maximum dose were 0.48-0.53% for the electron beam cases and 0.15-0.17% for the photon beam cases. In terms of efficiency, goMC was ~4-16% slower than gDPM when running on the same NVidia TITAN card for all the cases we tested, due to both the different electron transport models and the different development environments. The code portability of our new dose engine goMC was validated by successfully running it on a variety of different computing devices including an NVidia GPU card, two AMD GPU cards and an Intel CPU processor. Computational efficiency among these platforms was compared.
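
    The portability claim, one kernel source running on NVidia GPUs, AMD GPUs, and CPUs, is a property of OpenCL itself and is easy to demonstrate with a toy kernel. A sketch using PyOpenCL; the kernel is a stand-in, since the goMC source is not reproduced in this record:

    ```python
    import numpy as np
    import pyopencl as cl

    SRC = """
    __kernel void scale(__global float *dose, const float factor) {
        int gid = get_global_id(0);
        dose[gid] *= factor;           // stand-in for real transport physics
    }
    """

    ctx = cl.create_some_context()     # picks any available OpenCL device
    queue = cl.CommandQueue(ctx)
    prog = cl.Program(ctx, SRC).build()

    dose = np.random.rand(1 << 20).astype(np.float32)
    mf = cl.mem_flags
    buf = cl.Buffer(ctx, mf.READ_WRITE | mf.COPY_HOST_PTR, hostbuf=dose)

    prog.scale(queue, dose.shape, None, buf, np.float32(2.0))
    out = np.empty_like(dose)
    cl.enqueue_copy(queue, out, buf)
    print(out[:3], ctx.devices[0].name)  # same source on GPU or CPU devices
    ```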

  11. A GPU OpenCL based cross-platform Monte Carlo dose calculation engine (goMC).

    PubMed

    Tian, Zhen; Shi, Feng; Folkerts, Michael; Qin, Nan; Jiang, Steve B; Jia, Xun

    2015-10-07

    Monte Carlo (MC) simulation has been recognized as the most accurate dose calculation method for radiotherapy. However, the extremely long computation time impedes its clinical application. Recently, a lot of effort has been made to realize fast MC dose calculation on graphics processing units (GPUs). However, most of the GPU-based MC dose engines have been developed under NVidia's CUDA environment. This limits the code portability to other platforms, hindering the introduction of GPU-based MC simulations to clinical practice. The objective of this paper is to develop a GPU OpenCL-based cross-platform MC dose engine named goMC with coupled photon-electron simulation for external photon and electron radiotherapy in the MeV energy range. Compared to our previously developed GPU-based MC code named gDPM (Jia et al 2012 Phys. Med. Biol. 57 7783-97), goMC has two major differences. First, it was developed under the OpenCL environment for high code portability and hence could be run not only on different GPU cards but also on CPU platforms. Second, we adopted the electron transport model used in the EGSnrc MC package and PENELOPE's random hinge method in our new dose engine, instead of the dose planning method employed in gDPM. Dose distributions were calculated for a 15 MeV electron beam and a 6 MV photon beam in a homogeneous water phantom, a water-bone-lung-water slab phantom and a half-slab phantom. Satisfactory agreement between the two MC dose engines goMC and gDPM was observed in all cases. The average dose differences in the regions that received a dose higher than 10% of the maximum dose were 0.48-0.53% for the electron beam cases and 0.15-0.17% for the photon beam cases. In terms of efficiency, goMC was ~4-16% slower than gDPM when running on the same NVidia TITAN card for all the cases we tested, due to both the different electron transport models and the different development environments. The code portability of our new dose engine goMC was validated by successfully running it on a variety of different computing devices including an NVidia GPU card, two AMD GPU cards and an Intel CPU processor. Computational efficiency among these platforms was compared.

  12. Performance Assessment and Geometric Calibration of RESOURCESAT-2

    NASA Astrophysics Data System (ADS)

    Radhadevi, P. V.; Solanki, S. S.; Akilan, A.; Jyothi, M. V.; Nagasubramanian, V.

    2016-06-01

    Resourcesat-2 (RS-2) has successfully completed five years of operations in its orbit. This satellite provides multi-resolution and multi-spectral capabilities on a single platform. Continuous and autonomous co-registration, geo-location, and radiometric calibration of image data from different sensors with widely varying view angles and resolutions was one of the challenges of RS-2 data processing. On-orbit geometric performance of RS-2 sensors was widely assessed and calibrated during initial-phase operations. Since then, as an ongoing activity, various geometric performance data are being generated periodically. This is performed with sites of dense ground control points (GCPs). These parameters are correlated with the direct geo-location accuracy of the RS-2 sensors and are monitored and validated to maintain the performance. This paper presents the geometric accuracy assessment, calibration, and validation done for about 500 datasets of RS-2. The objectives of this study are to ensure the best absolute and relative location accuracy of the different cameras, location performance with payload steering, and co-registration of multiple bands. This is done using a viewing geometry model, given ephemeris and attitude data, precise camera geometry, and datum transformation. In the model, the forward and reverse transformations between the coordinate systems associated with the focal plane, payload, body, orbit and ground are rigorously and explicitly defined. System-level tests using comparisons to ground check points have validated the operational geo-location accuracy performance and the stability of the calibration parameters.

  13. Visible to Short Wavelength Infrared Spectroscopy on Rovers: Why We Need it on Mars and What We Need to do on Earth

    NASA Technical Reports Server (NTRS)

    Blaney, D. L.

    2002-01-01

    The next stage of Mars exploration will include the use of rovers to seek out specific mineralogies. Understanding the mineralogical diversity of the locale will be used to determine which targets should be investigated with the full suite of in situ capability on the rover. Visible to Short Wavelength Infrared (VSWIR) spectroscopy is critical for evaluating mineralogical diversity and for validating the global remote sensing data sets to be collected by Mars Express and the Mars Reconnaissance Orbiter. However, spectroscopy on mobile platforms presents challenges in both the design of instruments and in the efficient operation of the instrument and mission. Field-testing and validation on Earth can be used to develop instrument requirements and the analysis tools needed for use on Mars.

  14. Software platform for simulation of a prototype proton CT scanner.

    PubMed

    Giacometti, Valentina; Bashkirov, Vladimir A; Piersimoni, Pierluigi; Guatelli, Susanna; Plautz, Tia E; Sadrozinski, Hartmut F-W; Johnson, Robert P; Zatserklyaniy, Andriy; Tessonnier, Thomas; Parodi, Katia; Rosenfeld, Anatoly B; Schulte, Reinhard W

    2017-03-01

    Proton computed tomography (pCT) is a promising imaging technique to substitute or at least complement x-ray CT for more accurate proton therapy treatment planning, as it allows direct calculation of proton relative stopping power from proton energy loss measurements. A proton CT scanner with a silicon-based particle tracking system and a five-stage scintillating energy detector has been completed. In parallel, a modular software platform was developed to characterize the performance of the proposed pCT. The modular pCT software platform consists of (1) a Geant4-based simulation modeling the Loma Linda proton therapy beam line and the prototype proton CT scanner, (2) water equivalent path length (WEPL) calibration of the scintillating energy detector, and (3) an image reconstruction algorithm for the reconstruction of the relative stopping power (RSP) of the scanned object. In this work, each component of the modular pCT software platform is described and validated with respect to experimental data and benchmarked against theoretical predictions. In particular, the RSP reconstruction was validated with experimental scans, water column measurements, and theoretical calculations. The results show that the pCT software platform accurately reproduces the performance of the existing prototype pCT scanner, with an RSP agreement between experimental and simulated values to better than 1.5%. The validated platform is a versatile tool for clinical proton CT performance and application studies in a virtual setting. The platform is flexible and can be modified to simulate not-yet-existing versions of pCT scanners and higher proton energies than those currently clinically available.
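
    The WEPL calibration maps detector response to water-equivalent path length. Under the common Bragg-Kleeman range approximation R(E) ≈ αE^p, the WEPL of a proton is simply the difference of the ranges at its entry and exit energies. A sketch with approximate literature constants for protons in water (not the scanner's calibration):

    ```python
    # Bragg-Kleeman range fit for protons in water: R(E) ~ ALPHA * E**P,
    # with R in cm and E in MeV. ALPHA and P are approximate literature
    # values, used here only for illustration.
    ALPHA = 0.0022
    P = 1.77

    def range_cm(energy_mev):
        return ALPHA * energy_mev ** P

    def wepl_cm(e_in_mev, e_out_mev):
        """Water-equivalent path length from entry/exit proton energies."""
        return range_cm(e_in_mev) - range_cm(e_out_mev)

    # 200 MeV proton leaving the phantom with 80 MeV:
    print(wepl_cm(200.0, 80.0))  # ~21 cm of water-equivalent material
    ```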

  15. Calibration of the Thermal Infrared Sensor on the Landsat Data Continuity Mission

    NASA Technical Reports Server (NTRS)

    Thome, K; Reuter, D.; Lunsford, D.; Montanaro, M.; Smith, J.; Tesfaye, Z.; Wenny, B.

    2011-01-01

    The Landsat series of satellites provides the longest-running continuous data set of moderate-spatial-resolution imagery, beginning with the launch of Landsat 1 in 1972 and continuing with the 1999 launch of Landsat 7 and current operation of Landsats 5 and 7. The Landsat Data Continuity Mission (LDCM) will continue this program into a fourth decade, providing data that are key to understanding land-use change and resource management. LDCM consists of a two-sensor platform comprising the Operational Land Imager (OLI) and the Thermal Infrared Sensor (TIRS). A description of the applications and design of the TIRS instrument is given as well as the plans for calibration and characterization. Included are early results from preflight calibration and a description of the inflight validation.

  16. Treatment Planning and Image Guidance for Radiofrequency Ablations of Large Tumors

    PubMed Central

    Ren, Hongliang; Campos-Nanez, Enrique; Yaniv, Ziv; Banovac, Filip; Abeledo, Hernan; Hata, Nobuhiko; Cleary, Kevin

    2014-01-01

    This article addresses the two key challenges in computer-assisted percutaneous tumor ablation: planning multiple overlapping ablations for large tumors while avoiding critical structures, and executing the prescribed plan. Towards semi-automatic treatment planning for image-guided surgical interventions, we develop a systematic approach to the needle-based ablation placement task, ranging from pre-operative planning algorithms to an intra-operative execution platform. The planning system incorporates clinical constraints on ablations and trajectories using a multiple objective optimization formulation, which consists of optimal path selection and ablation coverage optimization based on integer programming. The system implementation is presented and validated in phantom studies and on an animal model. The presented system can potentially be further extended for other ablation techniques such as cryotherapy. PMID:24235279
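
    The coverage step, choosing a minimum set of overlapping ablations that covers every tumour voxel, is a set-cover-style integer program. A toy sketch with PuLP; the candidate sites, radius, and voxels below are invented, and the paper's formulation additionally handles trajectories and critical structures:

    ```python
    import numpy as np
    from pulp import LpProblem, LpMinimize, LpVariable, lpSum, PULP_CBC_CMD

    rng = np.random.default_rng(0)
    voxels = rng.uniform(0, 30, size=(200, 3))     # toy tumour voxels (mm)
    candidates = rng.uniform(0, 30, size=(40, 3))  # toy ablation centres (mm)
    RADIUS = 12.0                                  # ablation sphere radius (mm)

    # which candidate spheres cover which voxels
    covers = [[j for j in range(len(candidates))
               if np.linalg.norm(voxels[v] - candidates[j]) <= RADIUS]
              for v in range(len(voxels))]

    prob = LpProblem("ablation_coverage", LpMinimize)
    x = [LpVariable(f"x{j}", cat="Binary") for j in range(len(candidates))]
    prob += lpSum(x)                               # minimize number of ablations
    for js in covers:
        if js:                                     # voxels with no reachable
            prob += lpSum(x[j] for j in js) >= 1   # candidate would need more sites

    prob.solve(PULP_CBC_CMD(msg=False))
    print("ablations used:", sum(int(v.value()) for v in x))
    ```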

  17. Predictive modeling of outcomes following definitive chemoradiotherapy for oropharyngeal cancer based on FDG-PET image characteristics

    NASA Astrophysics Data System (ADS)

    Folkert, Michael R.; Setton, Jeremy; Apte, Aditya P.; Grkovski, Milan; Young, Robert J.; Schöder, Heiko; Thorstad, Wade L.; Lee, Nancy Y.; Deasy, Joseph O.; Oh, Jung Hun

    2017-07-01

    In this study, we investigate the use of imaging feature-based outcomes research ('radiomics') combined with machine learning techniques to develop robust predictive models for the risk of all-cause mortality (ACM), local failure (LF), and distant metastasis (DM) following definitive chemoradiation therapy (CRT). One hundred seventy-four patients with stage III-IV oropharyngeal cancer (OC) treated at our institution with CRT with retrievable pre- and post-treatment 18F-fluorodeoxyglucose positron emission tomography (FDG-PET) scans were identified. From pre-treatment PET scans, 24 representative imaging features of FDG-avid disease regions were extracted. Using machine learning-based feature selection methods, multiparameter logistic regression models were built incorporating clinical factors and imaging features. All model building methods were tested by cross-validation to avoid overfitting, and final outcome models were validated on an independent dataset from a collaborating institution. Multiparameter models were statistically significant on 5-fold cross-validation with the area under the receiver operating characteristic curve (AUC) = 0.65 (p = 0.004), 0.73 (p = 0.026), and 0.66 (p = 0.015) for ACM, LF, and DM, respectively. The model for LF retained significance on the independent validation cohort with AUC = 0.68 (p = 0.029), whereas the models for ACM and DM did not reach statistical significance but resulted in comparable predictive power to the 5-fold cross-validation with AUC = 0.60 (p = 0.092) and 0.65 (p = 0.062), respectively. In the largest study of its kind to date, predictive features including increasing metabolic tumor volume, increasing image heterogeneity, and increasing tumor surface irregularity significantly correlated with mortality, LF, and DM on 5-fold cross-validation in a relatively uniform single-institution cohort. The LF model also retained significance in an independent population.
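
    The protocol described, multiparameter logistic regression with feature selection scored by AUC under 5-fold cross-validation, has a compact expression in scikit-learn. A sketch on synthetic placeholders for the PET features and outcome labels; keeping feature selection inside the pipeline is what prevents information leaking across folds:

    ```python
    import numpy as np
    from sklearn.feature_selection import SelectKBest, f_classif
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import StratifiedKFold, cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(1)
    X = rng.normal(size=(174, 24))    # 24 image features, as in the study
    y = rng.integers(0, 2, size=174)  # placeholder outcome labels

    # Feature selection lives inside the pipeline so each CV fold reselects
    # features on its own training split -- the guard against overfitting.
    model = make_pipeline(StandardScaler(),
                          SelectKBest(f_classif, k=5),
                          LogisticRegression(max_iter=1000))
    auc = cross_val_score(model, X, y, scoring="roc_auc",
                          cv=StratifiedKFold(5, shuffle=True, random_state=0))
    print(auc.mean())
    ```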

  18. Evaluation of an Online Platform for Multiple Sclerosis Research: Patient Description, Validation of Severity Scale, and Exploration of BMI Effects on Disease Course

    PubMed Central

    Bove, Riley; Secor, Elizabeth; Healy, Brian C.; Musallam, Alexander; Vaughan, Timothy; Glanz, Bonnie I.; Greeke, Emily; Weiner, Howard L.; Chitnis, Tanuja; Wicks, Paul; De Jager, Philip L.

    2013-01-01

    Objectives To assess the potential of an online platform, PatientsLikeMe.com (PLM), for research in multiple sclerosis (MS). An investigation of the role of body mass index (BMI) on MS disease course was conducted to illustrate the utility of the platform. Methods First, we compared the demographic characteristics of subjects from PLM and from a regional MS center. Second, we validated PLM's patient-reported outcome measure (MS Rating Scale, MSRS) against standard physician-rated tools. Finally, we analyzed the relation of BMI to the MSRS measure. Results Compared with 4,039 MS Center patients, the 10,255 PLM members were younger, more educated, and less often male and white. Disease course was more often relapsing-remitting, with younger symptom onset and shorter disease duration. Differences were significant because of large sample sizes but small in absolute terms. MSRS scores for 121 MS Center patients revealed acceptable agreement between patient-derived and physician-derived composite scores (weighted kappa = 0.46). The Walking domain showed the highest weighted kappa (0.73) and correlation (r_s = 0.86) between patient and physician scores. Additionally, there were good correlations between the patient-reported MSRS composite and walking scores and physician-derived measures: Expanded Disability Status Scale (composite r_s = 0.61, walking r_s = 0.74), Timed 25 Foot Walk (composite r_s = 0.70, walking r_s = 0.69), and Ambulation Index (composite r_s = 0.81, walking r_s = 0.84). Finally, using PLM data, we found a modest correlation between BMI and cross-sectional MSRS (rho = 0.17) and no association between BMI and disease course. Conclusions The PLM population is comparable to a clinic population, and its patient-reported MSRS is correlated with existing clinical instruments. Thus, this online platform may provide a venue for MS investigations with unique strengths (frequent data collection, large sample sizes). To illustrate its applicability, we assessed the role of BMI in MS disease course but did not find a clinically meaningful role for BMI in this setting. PMID:23527256
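
    The agreement statistics reported above (weighted kappa, Spearman r_s) are one-liners with standard libraries. A sketch on placeholder ordinal scores; quadratic weighting is chosen for illustration, as the record does not state the weighting scheme:

    ```python
    import numpy as np
    from scipy.stats import spearmanr
    from sklearn.metrics import cohen_kappa_score

    rng = np.random.default_rng(2)
    physician = rng.integers(0, 6, size=121)  # 0-5 ordinal scores, placeholder
    patient = np.clip(physician + rng.integers(-1, 2, size=121), 0, 5)

    # quadratic weights penalize large ordinal disagreements more heavily
    kappa = cohen_kappa_score(physician, patient, weights="quadratic")
    rho, _ = spearmanr(physician, patient)
    print(f"weighted kappa = {kappa:.2f}, r_s = {rho:.2f}")
    ```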

  19. Controlled Hydrogen Fleet and Infrastructure Demonstration and Validation Project Final Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Verma, Puneet; Casey, Dan

    This report summarizes the work conducted under U.S. Department of Energy (US DOE) contract DE-FC36-04GO14286 by Chevron Technology Ventures (CTV, a division of Chevron U.S.A., Inc.), Hyundai Motor Company (HMC), and UTC Power (UTCP, a United Technologies company) to validate hydrogen (H2) infrastructure technology and fuel cell hybrid vehicles. Chevron established hydrogen filling stations at fleet operator sites using multiple technologies for on-site hydrogen generation, storage, and dispensing. CTV constructed five demonstration stations to support a vehicle fleet of 33 fuel cell passenger vehicles, eight internal combustion engine (ICE) vehicles, three fuel cell transit buses, and eight internal combustion engine shuttle buses. Stations were operated between 2005 and 2010. HMC introduced 33 fuel cell hybrid electric vehicles (FCHEV) in the course of the project. Generation I included 17 vehicles that used UTCP fuel cell power plants and operated at 350 bar. Generation II included 16 vehicles that had upgraded UTC fuel cell power plants and demonstrated options such as the use of super-capacitors and operation at 700 bar. All 33 vehicles used the Hyundai Tucson sport utility vehicle (SUV) platform. Fleet operators demonstrated commercial operation of the vehicles in three climate zones (hot, moderate, and cold) and for various driving patterns. Fleet operators were Southern California Edison (SCE), AC Transit (of Oakland, California), Hyundai America Technical Center Inc. (HATCI), and the U.S. Army Tank Automotive Research, Development and Engineering Center (TARDEC, in a site agreement with Selfridge Army National Guard Base in Selfridge, Michigan).

  20. Robotic kidney autotransplantation in a porcine model: a procedure-specific training platform for the simulation of robotic intracorporeal vascular anastomosis.

    PubMed

    Tiong, Ho Yee; Goh, Benjamin Yen Seow; Chiong, Edmund; Tan, Lincoln Guan Lim; Vathsala, Anatharaman

    2018-03-31

    Robotic-assisted kidney transplantation (RKT) with the Da Vinci (Intuitive, USA) platform has been recently developed to improve outcomes by decreasing surgical site complications and morbidity, especially in obese patients. This potential paradigm shift in the surgical technique of kidney transplantation is performed in only a few centers. For wider adoption of this high-stakes, complex operation, we aimed to develop a procedure-specific simulation platform in a porcine model for training in robotic intracorporeal vascular anastomosis and for evaluating vascular anastomosis patency. This paper describes the requirements and steps developed for the above training purpose. Over a series of four animal-ethics-approved experiments, the technique of robotic-assisted laparoscopic autotransplantation of the kidney was developed in Amsterdam live pigs (60-70 kg). The surgery was based around the vascular anastomosis technique described by Menon et al. This non-survival porcine training model is targeted at transplant surgeons with robotic surgery experience. Under general anesthesia, each pig was placed in lateral decubitus position with the placement of one robotic camera port, two robotic 8 mm ports and one assistant port. Robotic docking over the pig posteriorly was performed. The training platform involved the following procedural steps. First, ipsilateral iliac vessel dissection was performed. Second, robotic-assisted laparoscopic donor nephrectomy was performed with in situ perfusion of the kidney with cold Hartmann's solution prior to complete division of the hilar vessels, ureter and kidney mobilization. Third, the kidney was either kept in situ for orthotopic autotransplantation or mobilized to the pelvis and orientated for the vascular anastomosis, which was performed end-to-end or end-to-side after vessel loop clamping of the iliac vessels, respectively, using 6/0 Gore-Tex sutures. Following autotransplantation and release of vessel loops, perfusion of the graft was assessed using intraoperative indocyanine green imaging and monitoring urine output after unclamping. This training platform demonstrates adequate face and content validity. With practice, arterial anastomotic time could be improved, showing its construct validity. This porcine training model can be useful in providing training for robotic intracorporeal vascular anastomosis and may facilitate confident translation to a human transplant recipient.

  1. Flight Technical Error Analysis of the SATS Higher Volume Operations Simulation and Flight Experiments

    NASA Technical Reports Server (NTRS)

    Williams, Daniel M.; Consiglio, Maria C.; Murdoch, Jennifer L.; Adams, Catherine H.

    2005-01-01

    This paper provides an analysis of Flight Technical Error (FTE) from recent SATS experiments, called the Higher Volume Operations (HVO) Simulation and Flight experiments, which NASA conducted to determine pilot acceptability of the HVO concept for normal operating conditions. Reported are FTE results from simulation and flight experiment data indicating the SATS HVO concept is viable and acceptable to low-time instrument-rated pilots when compared with today's system (baseline). Described is the comparative FTE analysis of lateral, vertical, and airspeed deviations from the baseline and SATS HVO experimental flight procedures. Based on FTE analysis, all evaluation subjects, low-time instrument-rated pilots, flew the HVO procedures safely and proficiently in comparison to today's system. In all cases, the results of the flight experiment validated the results of the simulation experiment and confirmed the utility of the simulation platform for comparative Human in the Loop (HITL) studies of SATS HVO and Baseline operations.
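
    As a sketch of what an FTE summary reduces to computationally, the following computes mean and 95th-percentile absolute deviations per axis; the deviation arrays and summary choices are invented placeholders, not the experiment's data or exact metrics.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    # synthetic tracking deviations from the commanded path, one value per sample
    deviations = {
        "lateral_m": rng.normal(0, 30, 500),
        "vertical_m": rng.normal(0, 15, 500),
        "airspeed_kt": rng.normal(0, 4, 500),
    }
    for axis, d in deviations.items():
        print(f"{axis}: mean abs = {np.mean(np.abs(d)):.1f}, "
              f"95th pct abs = {np.percentile(np.abs(d), 95):.1f}")
    ```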

  2. Multi-agent-based bio-network for systems biology: protein-protein interaction network as an example.

    PubMed

    Ren, Li-Hong; Ding, Yong-Sheng; Shen, Yi-Zhen; Zhang, Xiang-Feng

    2008-10-01

    Recently, a collective effort from multiple research areas has been made to understand biological systems at the system level. This research requires the ability to simulate particular biological systems as cells, organs, organisms, and communities. In this paper, a novel bio-network simulation platform is proposed for systems biology studies by combining multi-agent approaches. We consider a biological system as a set of active computational components interacting with each other and with an external environment. Then, we propose a bio-network platform for simulating the behaviors of biological systems and modelling them in terms of bio-entities and society-entities. As a demonstration, we discuss how a protein-protein interaction (PPI) network can be seen as a society of autonomous interactive components. From interactions among small PPI networks, a large PPI network can emerge that has a remarkable ability to accomplish a complex function or task. We also simulate the evolution of the PPI networks by using the bio-operators of the bio-entities. Based on the proposed approach, various simulators with different functions can be embedded in the simulation platform, and further research can be done from design to development, including complexity validation of the biological system.
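
    A toy sketch of the core idea, a PPI network emerging from a society of autonomous interacting agents; the entity names and the random interaction rule are invented for illustration and are not the paper's bio-entity model.

    ```python
    import random

    class BioEntity:
        """An autonomous agent standing in for a protein."""
        def __init__(self, name):
            self.name = name
            self.partners = set()

        def interact(self, population):
            # simplistic society rule: bind one randomly met entity
            other = random.choice(population)
            if other is not self:
                self.partners.add(other.name)
                other.partners.add(self.name)

    entities = [BioEntity(f"P{i}") for i in range(20)]
    for _ in range(100):                 # society-level simulation loop
        for e in entities:
            e.interact(entities)

    edges = sum(len(e.partners) for e in entities) // 2
    print(f"emergent PPI network with {edges} distinct interactions")
    ```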

  3. On the prospects of cross-calibrating the Cherenkov Telescope Array with an airborne calibration platform

    NASA Astrophysics Data System (ADS)

    Brown, Anthony M.

    2018-01-01

    Recent advances in unmanned aerial vehicle (UAV) technology have made UAVs an attractive possibility as an airborne calibration platform for astronomical facilities. This is especially true for arrays of telescopes spread over a large area such as the Cherenkov Telescope Array (CTA). In this paper, the feasibility of using UAVs to calibrate CTA is investigated. Assuming a UAV at 1 km altitude above CTA, operating on astronomically clear nights with stratified, low atmospheric dust content, appropriate thermal protection for the calibration light source and an onboard photodiode to monitor its absolute light intensity, inter-calibration of CTA's telescopes of the same size class is found to be achievable with a 6 - 8 % uncertainty. For cross-calibration of different telescope size classes, a systematic uncertainty of 8 - 10 % is found to be achievable. Importantly, equipping the UAV with a multi-wavelength calibration light source affords us the ability to monitor the wavelength-dependent degradation of CTA telescopes' optical system, allowing us to not only maintain this 6 - 10 % uncertainty after the first few years of telescope deployment, but also to accurately account for the effect of multi-wavelength degradation on the cross-calibration of CTA by other techniques, namely with images of air showers and local muons. A UAV-based system thus provides CTA with several independent and complementary methods of cross-calibrating the optical throughput of individual telescopes. Furthermore, housing environmental sensors on the UAV system allows us not only to minimise the systematic uncertainty associated with the atmospheric transmission of the calibration signal, but also to map the dust content above CTA and to monitor the temperature, humidity and pressure profiles of the first kilometre of atmosphere above CTA with each UAV flight.
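
    The quoted totals behave like independent error components combined in quadrature; this back-of-the-envelope budget uses invented component values, not the paper's, to show how a combined figure in the 6 % range can arise.

    ```python
    import math

    components = {  # fractional (1-sigma) uncertainties, all assumed
        "light-source intensity": 0.04,
        "photodiode monitoring": 0.03,
        "atmospheric transmission": 0.03,
        "UAV positioning": 0.02,
    }
    total = math.sqrt(sum(v ** 2 for v in components.values()))
    print(f"combined calibration uncertainty: {total:.1%}")  # ~6.2%
    ```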

  4. Impacts of Cross-Platform Vicarious Calibration on the Deep Blue Aerosol Retrievals for Moderate Resolution Imaging Spectroradiometer Aboard Terra

    NASA Technical Reports Server (NTRS)

    Jeong, Myeong-Jae; Hsu, N. Christina; Kwiatkowska, Ewa J.; Franz, Bryan A.; Meister, Gerhard; Salustro, Clare E.

    2012-01-01

    The retrieval of aerosol properties from spaceborne sensors requires highly accurate and precise radiometric measurements, thus placing stringent requirements on sensor calibration and characterization. For the Terra/Moderate Resolution Imaging Spedroradiometer (MODIS), the characteristics of the detectors of certain bands, particularly band 8 [(B8); 412 nm], have changed significantly over time, leading to increased calibration uncertainty. In this paper, we explore a possibility of utilizing a cross-calibration method developed for characterizing the Terral MODIS detectors in the ocean bands by the National Aeronautics and Space Administration Ocean Biology Processing Group to improve aerosol retrieval over bright land surfaces. We found that the Terra/MODIS B8 reflectance corrected using the cross calibration method resulted in significant improvements for the retrieved aerosol optical thickness when compared with that from the Multi-angle Imaging Spectroradiometer, Aqua/MODIS, and the Aerosol Robotic Network. The method reported in this paper is implemented for the operational processing of the Terra/MODIS Deep Blue aerosol products.

  5. SimHap GUI: An intuitive graphical user interface for genetic association analysis

    PubMed Central

    Carter, Kim W; McCaskie, Pamela A; Palmer, Lyle J

    2008-01-01

    Background Researchers wishing to conduct genetic association analysis involving single nucleotide polymorphisms (SNPs) or haplotypes are often confronted with the lack of user-friendly graphical analysis tools, requiring sophisticated statistical and informatics expertise to perform relatively straightforward tasks. Tools, such as the SimHap package for the R statistics language, provide the necessary statistical operations to conduct sophisticated genetic analysis, but lack a graphical user interface that allows anyone but a professional statistician to effectively utilise the tool. Results We have developed SimHap GUI, a cross-platform integrated graphical analysis tool for conducting epidemiological, single SNP and haplotype-based association analysis. SimHap GUI features a novel workflow interface that guides the user through each logical step of the analysis process, making it accessible to both novice and advanced users. This tool provides a seamless interface to the SimHap R package, while providing enhanced functionality such as sophisticated data checking, automated data conversion, and real-time estimations of haplotype simulation progress. Conclusion SimHap GUI provides a novel, easy-to-use, cross-platform solution for conducting a range of genetic and non-genetic association analyses. This provides a free alternative to commercial statistics packages that is specifically designed for genetic association analysis. PMID:19109877

  6. Signal replication in a DNA nanostructure

    NASA Astrophysics Data System (ADS)

    Mendoza, Oscar; Houmadi, Said; Aimé, Jean-Pierre; Elezgaray, Juan

    2017-01-01

    Logic circuits based on DNA strand displacement reactions are the basic building blocks of future nanorobotic systems. Circuits tethered to DNA origami platforms present several advantages over solution-phase versions, where couplings are always diffusion-limited. Here we consider a possible implementation of one of the basic operations needed in the design of these circuits, namely, signal replication. We show that with an appropriate preparation of the initial state, signal replication performs in a reproducible way. We also show the existence of side effects concomitant with the high effective concentrations in tethered circuits, such as slow leaky reactions and cross-activation.

  7. Accounting for one-channel depletion improves missing value imputation in 2-dye microarray data.

    PubMed

    Ritz, Cecilia; Edén, Patrik

    2008-01-19

    For 2-dye microarray platforms, some missing values may arise from an un-measurably low RNA expression in one channel only. Information about such "one-channel depletion" has so far not been included in algorithms for imputation of missing values. Calculating the mean deviation between imputed values and duplicate controls in five datasets, we show that KNN-based imputation gives a systematic bias of the imputed expression values of one-channel depleted spots. Evaluating the correction of this bias by cross-validation showed that the mean square deviation between imputed values and duplicates was reduced by up to 51%, depending on the dataset. By including more information in the imputation step, we more accurately estimate missing expression values.
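
    A toy numerical illustration of the reported bias, under invented numbers: plain KNN imputation pulls a one-channel-depleted log-ratio toward the bulk of the data, while a depletion-aware cap (an assumed detection-limit log-ratio of -2 here, not the paper's estimator) moves it back toward the truth.

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    M = rng.normal(0, 1, size=(100, 8))   # log-ratio matrix: spots x arrays
    M[0, 0] = np.nan                      # one-channel-depleted spot, assumed true ~ -3

    # plain KNN: average the missing value over the 10 most similar spots,
    # where similarity is measured on the arrays that are not missing
    dists = np.sum((M[1:, 1:] - M[0, 1:]) ** 2, axis=1)
    knn_estimate = M[1 + np.argsort(dists)[:10], 0].mean()

    # depletion-aware correction: the spot is known to be depleted in one
    # channel, so cap the estimate at the assumed detection-limit log-ratio
    corrected = min(knn_estimate, -2.0)
    print(f"plain KNN: {knn_estimate:.2f}, depletion-aware: {corrected:.2f}")
    ```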

  8. Risk-based process safety assessment and control measures design for offshore process facilities.

    PubMed

    Khan, Faisal I; Sadiq, Rehan; Husain, Tahir

    2002-09-02

    Process operation is the most hazardous activity next to the transportation and drilling operations on an offshore oil and gas (OOG) platform. Past experience of onshore and offshore oil and gas activities has revealed that a small mishap in the process operation might escalate into a catastrophe. This is of special concern on an OOG platform due to the limited space and compact geometry of the process area, less ventilation, and difficult escape routes. On an OOG platform, each extra control measure that is implemented not only occupies space on the platform and increases congestion but also adds extra load to the platform. Eventualities in the OOG platform process operation can be avoided by incorporating the appropriate control measures at the early design stage. In this paper, the authors describe a methodology for risk-based process safety decision making for OOG activities. The methodology is applied to various offshore process units, that is, the compressor, separators, flash drum and driers of an OOG platform. Based on the risk potential, appropriate safety measures are designed for each unit. This paper also illustrates that implementation of the designed safety measures reduces high fatal accident rate (FAR) values to an acceptable level.
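
    For context on the FAR figure of merit, it is conventionally expressed as fatalities per 10^8 hours of exposure; a one-line computation with invented inputs:

    ```python
    # FAR = fatalities per 1e8 person-hours of exposure (all inputs invented)
    fatalities, workers, hours_per_year, years = 2, 150, 4000, 25
    exposure_hours = workers * hours_per_year * years
    far = fatalities / exposure_hours * 1e8
    print(f"FAR = {far:.1f} fatalities per 1e8 exposure hours")
    ```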

  9. Evaluation of Game Engines for Cross-Platform Development of Mobile Serious Games for Health.

    PubMed

    Kleinschmidt, Carina; Haag, Martin

    2016-01-01

    Studies have shown that serious games for health can improve patient compliance and help to increase the quality of medical education. Due to the growing availability of mobile devices, the development of cross-platform mobile apps in particular is helpful for improving healthcare. As the development can be highly time-consuming and expensive, an alternative development process is needed. Game engines are expected to simplify this process. Therefore, this article examines whether using game engines for cross-platform serious games for health can simplify the development compared to the development of a plain HTML5 app. First, a systematic review of the literature was conducted in different databases (MEDLINE, ACM and IEEE). Afterwards, three different game engines were chosen, evaluated in different categories and compared to the development of a HTML5 app. This was realized by implementing a prototypical application in the different engines and conducting a utility analysis. The evaluation shows that the Marmalade engine is the best choice for development in this scenario. Furthermore, it is clear that the game engines have great benefits over plain HTML5 development, as they provide components for graphics, physics, sounds, etc. The authors recommend using the Marmalade engine for a cross-platform mobile serious game for health.

  10. A Hydrazone-Based Covalent Organic Framework as an Efficient and Reusable Photocatalyst for the Cross-Dehydrogenative Coupling Reaction of N-Aryltetrahydroisoquinolines.

    PubMed

    Liu, Wanting; Su, Qing; Ju, Pengyao; Guo, Bixuan; Zhou, Hui; Li, Guanghua; Wu, Qiaolin

    2017-02-22

    A hydrazone-based covalent organic framework (COF) was synthesized by condensation of 2,5-dimethoxyterephthalohydrazide with 1,3,5-triformylbenzene under solvothermal conditions. The COF material exhibits excellent porosity with a BET surface area of up to 1501 m² g⁻¹, high crystallinity, and good thermal and chemical stability. Moreover, it showed efficient photocatalytic activity towards cross-dehydrogenative coupling (CDC) reactions between tetrahydroisoquinolines and nucleophiles such as nitromethane, acetone, and phenylethyl ketone. The metal-free catalytic system also offers attractive advantages including simplicity of operation, wide substrate adaptability, ambient reaction conditions, and robust recycling capability of the catalyst, thus providing a promising platform for highly efficient and reusable photocatalysts. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. Development of Cross-Platform Software for Well Logging Data Visualization

    NASA Astrophysics Data System (ADS)

    Akhmadulin, R. K.; Miraev, A. I.

    2017-07-01

    Well logging data processing is one of the main sources of information in oil-gas field analysis and is of great importance in the process of field development and operation. Therefore, it is important to have software which accurately and clearly provides the user with processed data in the form of well logs. In this work, a software product has been developed which not only has the basic functionality for this task (loading data from .las files, well log curve display, etc.), but can also be run on different operating systems and devices. The article presents a subject field analysis and task formulation, and considers the software design stage. At the end of the work, the resulting software product's interface is described.
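
    As a sketch of the loading-and-display task the article describes, a .las file can be read and one curve plotted in a few lines of Python with the third-party lasio package; using lasio is an assumption for illustration (the authors built their own tool), and "example.las" is a placeholder path.

    ```python
    import lasio
    import matplotlib.pyplot as plt

    las = lasio.read("example.las")   # parse the .las well-log file
    depth = las.index                 # the depth (index) column
    curve = las.curves[1]             # first logged curve after depth

    plt.plot(curve.data, depth)
    plt.gca().invert_yaxis()          # well logs plot depth increasing downward
    plt.xlabel(f"{curve.mnemonic} [{curve.unit}]")
    plt.ylabel("Depth")
    plt.show()
    ```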

  12. Analytical Validation of a Highly Quantitative, Sensitive, Accurate, and Reproducible Assay (HERmark®) for the Measurement of HER2 Total Protein and HER2 Homodimers in FFPE Breast Cancer Tumor Specimens

    PubMed Central

    Larson, Jeffrey S.; Goodman, Laurie J.; Tan, Yuping; Defazio-Eli, Lisa; Paquet, Agnes C.; Cook, Jennifer W.; Rivera, Amber; Frankson, Kristi; Bose, Jolly; Chen, Lili; Cheung, Judy; Shi, Yining; Irwin, Sarah; Kiss, Linda D. B.; Huang, Weidong; Utter, Shannon; Sherwood, Thomas; Bates, Michael; Weidler, Jodi; Parry, Gordon; Winslow, John; Petropoulos, Christos J.; Whitcomb, Jeannette M.

    2010-01-01

    We report here the results of the analytical validation of assays that measure HER2 total protein (H2T) and HER2 homodimer (H2D) expression in Formalin Fixed Paraffin Embedded (FFPE) breast cancer tumors as well as cell line controls. The assays are based on the VeraTag technology platform and are commercially available through a central CAP-accredited clinical reference laboratory. The accuracy of H2T measurements spans a broad dynamic range (2-3 logs) as evaluated by comparison with cross-validating technologies. The measurement of H2T expression demonstrates a sensitivity that is approximately 7–10 times greater than conventional immunohistochemistry (IHC) (HercepTest). The HERmark assay is a quantitative assay that sensitively and reproducibly measures continuous H2T and H2D protein expression levels and therefore may have the potential to stratify patients more accurately with respect to response to HER2-targeted therapies than current methods which rely on semiquantitative protein measurements (IHC) or on indirect assessments of gene amplification (FISH). PMID:21151530

  13. Analytical Validation of a Highly Quantitative, Sensitive, Accurate, and Reproducible Assay (HERmark) for the Measurement of HER2 Total Protein and HER2 Homodimers in FFPE Breast Cancer Tumor Specimens.

    PubMed

    Larson, Jeffrey S; Goodman, Laurie J; Tan, Yuping; Defazio-Eli, Lisa; Paquet, Agnes C; Cook, Jennifer W; Rivera, Amber; Frankson, Kristi; Bose, Jolly; Chen, Lili; Cheung, Judy; Shi, Yining; Irwin, Sarah; Kiss, Linda D B; Huang, Weidong; Utter, Shannon; Sherwood, Thomas; Bates, Michael; Weidler, Jodi; Parry, Gordon; Winslow, John; Petropoulos, Christos J; Whitcomb, Jeannette M

    2010-06-28

    We report here the results of the analytical validation of assays that measure HER2 total protein (H2T) and HER2 homodimer (H2D) expression in Formalin Fixed Paraffin Embedded (FFPE) breast cancer tumors as well as cell line controls. The assays are based on the VeraTag technology platform and are commercially available through a central CAP-accredited clinical reference laboratory. The accuracy of H2T measurements spans a broad dynamic range (2-3 logs) as evaluated by comparison with cross-validating technologies. The measurement of H2T expression demonstrates a sensitivity that is approximately 7-10 times greater than conventional immunohistochemistry (IHC) (HercepTest). The HERmark assay is a quantitative assay that sensitively and reproducibly measures continuous H2T and H2D protein expression levels and therefore may have the potential to stratify patients more accurately with respect to response to HER2-targeted therapies than current methods which rely on semiquantitative protein measurements (IHC) or on indirect assessments of gene amplification (FISH).

  14. Rodent motor and neuropsychological behaviour measured in home cages using the integrated modular platform SmartCage™.

    PubMed

    Khroyan, Taline V; Zhang, Jingxi; Yang, Liya; Zou, Bende; Xie, James; Pascual, Conrado; Malik, Adam; Xie, Julian; Zaveri, Nurulain T; Vazquez, Jacqueline; Polgar, Willma; Toll, Lawrence; Fang, Jidong; Xie, Xinmin

    2012-07-01

    1. To facilitate investigation of diverse rodent behaviours in rodents' home cages, we have developed an integrated modular platform, the SmartCage(™) system (AfaSci, Inc. Burlingame, CA, USA), which enables automated neurobehavioural phenotypic analysis and in vivo drug screening in a relatively higher-throughput and more objective manner. 2. The individual platform consists of an infrared array, a vibration floor sensor and a variety of modular devices. One computer can simultaneously operate up to 16 platforms via USB cables. 3. The SmartCage(™) detects drug-induced increases and decreases in activity levels, as well as changes in movement patterns. Wake and sleep states of mice can be detected using the vibration floor sensor. The arousal state classification achieved up to 98% accuracy compared with results obtained by electroencephalography and electromyography. More complex behaviours, including motor coordination, anxiety-related behaviours and social approach behaviour, can be assessed using appropriate modular devices, and the results obtained are comparable with results obtained using conventional methods. 4. In conclusion, the SmartCage(™) system provides an automated and accurate tool to quantify various rodent behaviours in a 'stress-free' environment. This system, combined with the validated testing protocols, offers a powerful tool kit for transgenic phenotyping and in vivo drug screening. © 2012 The Authors. Clinical and Experimental Pharmacology and Physiology © 2012 Blackwell Publishing Asia Pty Ltd.

  15. From cutting-edge pointwise cross-section to groupwise reaction rate: A primer

    NASA Astrophysics Data System (ADS)

    Sublet, Jean-Christophe; Fleming, Michael; Gilbert, Mark R.

    2017-09-01

    The nuclear research and development community has a history of using both integral and differential experiments to support accurate lattice-reactor, nuclear reactor criticality and shielding simulations, as well as verification and validation efforts of cross sections and emitted particle spectra. An important aspect of this type of analysis is the proper consideration of the contribution of the neutron spectrum in its entirety, with correct propagation of uncertainties and standard deviations derived from Monte Carlo simulations, to the local and total uncertainty in the simulated reaction rates (RRs), which usually only apply to one application at a time. This paper identifies deficiencies in the traditional treatment and discusses correct handling of RR uncertainty quantification and propagation, including details of the cross section components in the RR uncertainty estimates, which are verified for relevant applications. The methodology that rigorously captures the spectral shift and cross section contributions to the uncertainty in the RR is discussed, with quantified examples that demonstrate the importance of the proper treatment of the spectrum profile and cross section contributions to the uncertainty in the RR and subsequent response functions. The recently developed inventory code FISPACT-II, when connected to the processed nuclear data libraries TENDL-2015, ENDF/B-VII.1, JENDL-4.0u or JEFF-3.2, forms an enhanced multi-physics platform providing a wide variety of advanced simulation methods for modelling activation, transmutation and burnup protocols and for simulating radiation damage source terms. The system has extended cutting-edge nuclear data forms and uncertainty quantification and propagation methods, which have been the subject of recent integral and differential fission, fusion and accelerator validation efforts. The simulation system is used to accurately and predictively probe, understand and underpin a modern and sustainable understanding of the nuclear physics that is so important for many areas of science and technology: advanced fission and fuel systems, magnetic and inertial confinement fusion, high-energy and accelerator physics, medical applications, isotope production, earth exploration, astrophysics and homeland security.
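
    The pointwise-to-groupwise collapse in the title can be illustrated in a few lines: with a flux-weighted group average, the groupwise reaction rate reproduces the pointwise one exactly when the same spectrum is used as the weight. The cross-section and spectrum below are toy functions, not evaluated nuclear data.

    ```python
    import numpy as np

    E = np.logspace(-5, 7, 1000)              # energy grid (eV)
    sigma = 1.0 / np.sqrt(E)                  # toy 1/v cross-section
    phi = np.exp(-(np.log10(E)) ** 2)         # toy flux spectrum

    edges = np.logspace(-5, 7, 10)            # 9 coarse energy groups
    groups = np.digitize(E, edges[1:-1])      # group index per energy point

    # flux-weighted group cross-sections and group fluxes
    sigma_g = np.array([np.average(sigma[groups == g], weights=phi[groups == g])
                        for g in range(9)])
    phi_g = np.array([phi[groups == g].sum() for g in range(9)])

    RR_point = np.sum(sigma * phi)            # pointwise reaction rate
    RR_group = np.sum(sigma_g * phi_g)        # groupwise collapse reproduces it
    print(f"pointwise RR = {RR_point:.6g}, groupwise RR = {RR_group:.6g}")
    ```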

  16. Profound Effect of Profiling Platform and Normalization Strategy on Detection of Differentially Expressed MicroRNAs – A Comparative Study

    PubMed Central

    Meyer, Swanhild U.; Kaiser, Sebastian; Wagner, Carola; Thirion, Christian; Pfaffl, Michael W.

    2012-01-01

    Background Adequate normalization minimizes the effects of systematic technical variations and is a prerequisite for getting meaningful biological changes. However, there is inconsistency about miRNA normalization performances and recommendations. Thus, we investigated the impact of seven different normalization methods (reference gene index, global geometric mean, quantile, invariant selection, loess, loessM, and generalized procrustes analysis) on intra- and inter-platform performance of two distinct and commonly used miRNA profiling platforms. Methodology/Principal Findings We included data from miRNA profiling analyses derived from a hybridization-based platform (Agilent Technologies) and an RT-qPCR platform (Applied Biosystems). Furthermore, we validated a subset of miRNAs by individual RT-qPCR assays. Our analyses incorporated data from the effect of differentiation and tumor necrosis factor alpha treatment on primary human skeletal muscle cells and a murine skeletal muscle cell line. Distinct normalization methods differed in their impact on (i) standard deviations, (ii) the area under the receiver operating characteristic (ROC) curve, (iii) the similarity of differential expression. Loess, loessM, and quantile analysis were most effective in minimizing standard deviations on the Agilent and TLDA platform. Moreover, loess, loessM, invariant selection and generalized procrustes analysis increased the area under the ROC curve, a measure for the statistical performance of a test. The Jaccard index revealed that inter-platform concordance of differential expression tended to be increased by loess, loessM, quantile, and GPA normalization of AGL and TLDA data as well as RGI normalization of TLDA data. Conclusions/Significance We recommend the application of loess, or loessM, and GPA normalization for miRNA Agilent arrays and qPCR cards as these normalization approaches showed to (i) effectively reduce standard deviations, (ii) increase sensitivity and accuracy of differential miRNA expression detection as well as (iii) increase inter-platform concordance. Results showed the successful adoption of loessM and generalized procrustes analysis to one-color miRNA profiling experiments. PMID:22723911
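
    Of the seven methods compared, quantile normalization is the simplest to sketch: every array is forced to share one reference distribution by replacing each value with the mean of the values at its rank. The matrix below is a synthetic stand-in for profiling data.

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    X = rng.lognormal(mean=2, sigma=1, size=(200, 6))   # miRNAs x arrays

    ranks = X.argsort(axis=0).argsort(axis=0)       # rank within each array
    mean_sorted = np.sort(X, axis=0).mean(axis=1)   # reference distribution
    X_qnorm = mean_sorted[ranks]                    # map ranks back to values

    # after normalization every column has the identical distribution
    print(np.allclose(np.sort(X_qnorm, axis=0), mean_sorted[:, None]))
    ```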

  17. Analyses of cosmic ray induced-neutron based on spectrometers operated simultaneously at mid-latitude and Antarctica high-altitude stations during quiet solar activity

    NASA Astrophysics Data System (ADS)

    Hubert, G.

    2016-10-01

    This paper describes a new neutron spectrometer that has operated at the Concordia station (Antarctica, Dome C) since December 2015. This instrument complements a network of neutron spectrometers operating at the Pic-du-Midi and Pico dos Dias observatories. This work therefore presents an analysis of cosmic-ray-induced neutrons based on spectrometers operated simultaneously at the Pic-du-Midi and Concordia stations during quiet solar activity. Both high-altitude station platforms allow investigation of the long-period dynamics, in order to analyze spectral variations and the effects of local and seasonal changes, as well as the short-term dynamics during solar flare events. The first part is devoted to analyzing the count rates, spectra and neutron fluxes, including cross-comparisons between data obtained at the two stations. In the second part, the measurement analyses are reinforced by modeling based on simulations of atmospheric cascades according to primary spectra that depend only on the solar modulation potential.

  18. Developmental Testing of Electric Thrust Vector Control Systems for Manned Launch Vehicle Applications

    NASA Technical Reports Server (NTRS)

    Bates, Lisa B.; Young, David T.

    2012-01-01

    This paper describes recent developmental testing to verify the integration of a developmental electromechanical actuator (EMA) with high rate lithium ion batteries and a cross platform extensible controller. Testing was performed at the Thrust Vector Control Research, Development and Qualification Laboratory at the NASA George C. Marshall Space Flight Center. Electric Thrust Vector Control (ETVC) systems like the EMA may significantly reduce recurring launch costs and complexity compared to heritage systems. Electric actuator mechanisms and control requirements across dissimilar platforms are also discussed with a focus on the similarities leveraged and differences overcome by the cross platform extensible common controller architecture.

  19. Progress on China nuclear data processing code system

    NASA Astrophysics Data System (ADS)

    Liu, Ping; Wu, Xiaofei; Ge, Zhigang; Li, Songyang; Wu, Haicheng; Wen, Lili; Wang, Wenming; Zhang, Huanyu

    2017-09-01

    China is developing the nuclear data processing code Ruler, which can be used to produce multi-group cross sections and related quantities from evaluated nuclear data in the ENDF format [1]. Ruler includes modules for reconstructing cross sections over the full energy range, generating Doppler-broadened cross sections for a given temperature, producing effective self-shielded cross sections in the unresolved energy range, calculating scattering cross sections in the thermal energy range, generating group cross sections and matrices, and preparing WIMS-D format data files for the reactor physics code WIMS-D [2]. Ruler is written in Fortran-90 and has been tested on 32-bit computers under the Windows-XP and Linux operating systems. Verification of Ruler has been performed by comparison with calculation results obtained with the NJOY99 [3] processing code. Validation of Ruler has been performed using the WIMSD5B code.

  20. Using airborne measurements and modelling to determine the leak rate of the Elgin platform in 2012

    NASA Astrophysics Data System (ADS)

    Mobbs, Stephen D.; Bauguitte, Stephane J.-B.; Wellpott, Axel; O'Shea, Sebastian

    2013-04-01

    On the 25th March 2012 the French multinational oil and gas company Total reported a gas leak at the Elgin gas field in the North Sea following an operation on well G4 on the wellhead platform. During operations to plug and decommission the well, methane leaked out, which led to the evacuation of the platform. Total made immense efforts to quickly stop the leak, and on the 16th May 2012 the company announced the successful "Top kill". The UK's National Centre for Atmospheric Science (NCAS) supported the Total response to the leak with flights of the Facility for Airborne Atmospheric Measurements (FAAM) BAe-146 aircraft. Between the 3rd of April and the 4th of May five missions were flown. The FAAM aircraft was equipped with a Fast Greenhouse Gas Analyser (FGGA, Model RMT-200, Los Gatos Research Inc., US) to measure CH4 mixing ratios with an accuracy of 0.07±2.48 ppbv. The measurement strategy closely followed NOAA's during the Deepwater Horizon (DWH) spill in the Gulf of Mexico in 2010. The basis of the method is to sample the cross-wind structure of the plume at different heights downwind of the source. The measurements were then fitted to a Gaussian dispersion model, which allowed the calculation of the leak rate. The first mission was flown on the 30th March 2012, only 5 days after Total reported the leak. On this day maximum CH4 concentrations exceeded 2800 ppbv. The plume was very distinct and narrow, especially near the platform (10 km), and it showed almost perfect Gaussian characteristics. Further downwind the plume was split up into several filaments. On this day the CH4 leak rate was estimated to be 1.1 kg/s. Between the 1st and 2nd missions (03/04/2012) the leak rate decreased significantly to about 0.5 kg/s. From the 2nd flight onwards only a minor decrease in leak rate was calculated. The last mission - while the platform was still leaking - was flown on the 4th of May, when the leak rate was estimated to be 0.3 kg/s. The FAAM aircraft measurements delivered time-critical, actionable information that accurately quantified the Elgin leak rate and contributed directly to safe and successful operational decision making.
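
    A much-simplified sketch of the mass-balance idea behind such estimates: fit a Gaussian to a crosswind transect of the CH4 enhancement, then multiply the integrated excess by wind speed, plume depth and gas density. Every number here (enhancement, widths, wind, depth) is an invented placeholder, not flight data, and the actual analysis fitted a full Gaussian dispersion model.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def gaussian(y, amp, y0, sigma):
        return amp * np.exp(-((y - y0) ** 2) / (2 * sigma ** 2))

    y = np.linspace(-2000, 2000, 200)            # crosswind distance (m)
    enhancement = gaussian(y, 800e-9, 100, 300)  # CH4 mixing ratio above background
    popt, _ = curve_fit(gaussian, y, enhancement, p0=(1e-6, 0, 500))

    amp, _, sigma_y = popt
    crosswind_integral = amp * sigma_y * np.sqrt(2 * np.pi)  # (mixing ratio) * m
    wind, depth, rho_ch4 = 8.0, 400.0, 0.66      # m/s, plume depth m, kg/m^3
    leak_rate = crosswind_integral * wind * depth * rho_ch4  # kg/s
    print(f"estimated leak rate: {leak_rate:.2f} kg/s")
    ```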

  1. [Effect of leptin on long-term spatial memory of rats with white matter damage in developing brain].

    PubMed

    Feng, Er-Cui; Jiang, Li

    2017-12-01

    To investigate the neuroprotective effect of leptin by observing its effect on spatial memory of rats with white matter damage in developing brain. A total of 80 neonatal rats were randomly divided into 3 groups: sham-operation (n=27), model (n=27) and leptin intervention (n=27). The rats in the model and leptin intervention groups were used to prepare a model of white matter damage in developing brain, and the rats in the leptin intervention group were given leptin (100 μg/kg) diluted with normal saline immediately after modelling for 4 consecutive days. The survival rate of the rats was observed and the change in body weight was monitored. When the rats reached the age of 21 days, the Morris water maze test was used to evaluate spatial memory. There was no significant difference in the survival rate of rats between the three groups (P>0.05). Within 10 days after birth, the leptin intervention group had similar body weight as the sham-operation group and significantly lower body weight than the model group (P<0.05); more than 10 days after birth, the leptin intervention group had rapid growth with higher body weight than the model and sham-operation groups (P>0.05). The results of place navigation showed that from the second day of experiment, there was a significant difference in the latency period between the three groups (P<0.05); from the fourth day of experiment, the leptin intervention group had a similar latency period as the sham-operation and a significantly shorter latency period than the model group (P<0.05). The results of space search experiment showed that compared with the sham-operation group, the model group had a significant reduction in the number of platform crossings and a significantly longer latency period (P<0.05); compared with the model group, the leptin intervention group had a significantly increased number of platform crossings and a significantly shortened latency period (P<0.05), while there was no significant difference between the leptin intervention and sham-operation groups. Leptin can alleviate spatial memory impairment of rats with white matter damage in developing brain. It thus exerts a neuroprotective effect, and is worthy of further research.

  2. DNA-Aptamer optical biosensors based on a LPG-SPR optical fiber platform for point-of-care diagnostic

    NASA Astrophysics Data System (ADS)

    Coelho, L.; Queirós, R. B.; Santos, J. L.; Martins, M. Cristina L.; Viegas, D.; Jorge, P. A. S.

    2014-03-01

    Surface Plasmon Resonance (SPR) is the basis for some of the most sensitive label-free optical fiber biosensors. However, most solutions presented to date require the use of fragile fiber optic structures such as adiabatic tapers or side-polished fibers. On the other hand, long-period fiber gratings (LPG) present themselves as an interesting solution to attain an evanescent wave refractive index sensor platform while preserving the optical fiber integrity. The combination of these two approaches constitutes a powerful platform that can potentially reach the highest sensitivities, as recently demonstrated by detailed theoretical studies [1, 2]. In this work, a LPG-SPR platform is explored in different configurations (metal coating between two LPGs - symmetric and asymmetric) operating in the telecom band (around 1550 nm). For this purpose, LPGs with a period of 396 μm are combined with tailor-made metallic thin films. In particular, the sensing regions were coated with 2 nm of chromium to improve the adhesion to the fiber and 16 nm of gold, followed by a 100 nm thick layer of TiO2 dielectric material strategically chosen to attain plasmon resonance in the desired wavelength range. The obtained refractometric platforms were then validated as biosensors. For this purpose, the detection of thrombin using an aptamer-based probe was used as a model system for protein detection. The surfaces of the sensing fibers were cleaned with isopropanol and dried with N2, and then the aminated thrombin aptamer (5'-[NH2]-GGTTGGTGTGGTTGG-3') was immobilized by physisorption using Poly-L-Lysine (PLL) as a cationic polymer. Preliminary results indicate the viability of the LPG-SPR-aptamer approach as a flexible platform for point-of-care diagnostic biosensors.

  3. Intelligent user interface concept for space station

    NASA Technical Reports Server (NTRS)

    Comer, Edward; Donaldson, Cameron; Bailey, Elizabeth; Gilroy, Kathleen

    1986-01-01

    The space station computing system must interface with a wide variety of users, from highly skilled operations personnel to payload specialists from all over the world. The interface must accommodate a wide variety of operations from the space platform, ground control centers and from remote sites. As a result, there is a need for a robust, highly configurable and portable user interface that can accommodate the various space station missions. The concept of an intelligent user interface executive, written in Ada, that would support a number of advanced human interaction techniques, such as windowing, icons, color graphics, animation, and natural language processing is presented. The user interface would provide intelligent interaction by understanding the various user roles, the operations and mission, the current state of the environment and the current working context of the users. In addition, the intelligent user interface executive must be supported by a set of tools that would allow the executive to be easily configured and to allow rapid prototyping of proposed user dialogs. This capability would allow human engineering specialists acting in the role of dialog authors to define and validate various user scenarios. The set of tools required to support development of this intelligent human interface capability is discussed and the prototyping and validation efforts required for development of the Space Station's user interface are outlined.

  4. Slope and basinal deposits adjacent to isolated carbonate platforms in the Indian Ocean: Sedimentology, geomorphology, and a new 1.2 Ma record of highstand shedding

    NASA Astrophysics Data System (ADS)

    Counts, J. W.; Jorry, S.; Jouet, G.

    2017-12-01

    Newly analyzed bathymetric, seismic, and core data from carbonate-topped seamounts in the Mozambique Channel reveal a variety of depositional processes and products operating on platform slopes and adjacent basins. Mass transport complexes (including turbidites and debrites), leveed channel systems with basin-floor fans, and contourites are imaged in high resolution in both seafloor maps and cross-section, and show both differences and similarities compared with platform slopes in the Bahamas and elsewhere. In some, though not all, platforms, increased sedimentation can be observed on the leeward margins, and slope rugosity may be asymmetric with respect to prevailing wind direction. Deposition is also controlled by glacial-interglacial cycles; cores taken from the lower slopes (3000+ m water depth) of carbonate platforms reveal a causative relationship between sea level and aragonite export to the deep ocean. δ18O isotopes from planktonic and benthic foraminifera of two 27-meter cores reveal a high-resolution, continuous depositional record of carbonate sediment dating back to 1.2 Ma. Sea level rise, as determined by correlation with the LR04 benthic stack, is coincident with increased aragonite flux from platform tops. Gravity flow deposits are also affected by platform flooding: the frequency of turbidite/debrite deposits on pinnacle slopes increases during highstand, although such deposits are also present during glacial episodes. The results reported here are the first record of highstand shedding in the southern Indian Ocean, and provide the longest Quaternary sediment record to date in the region, including the Mid-Brunhes transition (MIS 11) that serves as an analog for current climate conditions. In addition, this is the first study to describe sedimentation on the slopes of these platforms, providing an important point of comparison that has the potential to influence source-to-sink carbonate facies models.

  5. The Geohazards Exploitation Platform

    NASA Astrophysics Data System (ADS)

    Laur, Henri; Casu, Francesco; Bally, Philippe; Caumont, Hervé; Pinto, Salvatore

    2016-04-01

    The Geohazards Exploitation Platform, or Geohazards TEP (GEP), is an ESA-originated R&D activity of the EO ground segment to demonstrate the benefit of new technologies for large-scale processing of EO data. This encompasses on-demand processing for specific user needs, systematic processing to address common information needs of the geohazards community, and integration of newly developed processors for scientists and other expert users. The platform supports the geohazards community's objectives as defined in the context of the International Forum on Satellite EO and Geohazards organised by ESA and GEO in Santorini in 2012. The GEP is a follow-on to the Supersites Exploitation Platform (SSEP), an ESA initiative to support the Geohazards Supersites & Natural Laboratories initiative (GSNL). Today the GEP allows users to exploit 70+ terabytes of ERS and ENVISAT archive data and the Copernicus Sentinel-1 data available online. The platform has already engaged 22 European early adopters in a validation activity initiated in March 2015. Since September, this validation has reached 29 single-user projects. Each project is concerned with either integrating an application, running on-demand processing or systematically generating a product collection using an application available in the platform. The users primarily include 15 geoscience centres and universities based in Europe: British Geological Survey (UK), University of Leeds (UK), University College London (UK), ETH University of Zurich (CH), INGV (IT), CNR-IREA and CNR-IRPI (IT), University of L'Aquila (IT), NOA (GR), Univ. Blaise Pascal & CNRS (FR), Ecole Normale Supérieure (FR), ISTERRE / University of Grenoble-Alpes (FR). In addition, there are users from Africa and North America with the University of Rabat (MA) and the University of Miami (US). Furthermore, two space agencies and four private companies are involved: the German Space Research Centre DLR (DE), the European Space Agency (ESA), Altamira Information (ES), DEIMOS Space (ES), eGEOS (IT) and SATIM (PL). The GEP is now pursuing these projects with early adopters integrating additional conventional and advanced EO processors. It will also expand its user base to gradually reach a total of 60 separate users in pre-operations in 2017, with 6 new pilot projects being taken on board: photogrammetric processing using optical EO data with the University of Strasbourg (FR); an optical-based processing method for volcanic hazard monitoring with INGV (IT); systematic generation of interferometric displacement time series based on Sentinel-1 data with CNR IREA (IT); systematic processing of Sentinel-1 interferometric browse imagery with DLR (DE); precise terrain motion mapping with the SPN Persistent Scatterers Interferometric chain of Altamira Information (ES); and a campaign to test and exploit GEP applications with the Corinth Rift Laboratory, in which Greek and French experts on seismic hazards are engaged. Following the pre-operations phase starting in 2017, the Geohazards platform is intended to support a broad user community and has already established partnerships with large user networks, a particular example being the EPOS research infrastructure. Within EPOS, the GEP is intended to act as the main interface for accessing, processing, analysing and sharing products related to the Satellite Data Thematic Service.

  6. Using the NANA toolkit at home to predict older adults' future depression.

    PubMed

    Andrews, J A; Harrison, R F; Brown, L J E; MacLean, L M; Hwang, F; Smith, T; Williams, E A; Timon, C; Adlam, T; Khadra, H; Astell, A J

    2017-04-15

    Depression is currently underdiagnosed among older adults. As part of the Novel Assessment of Nutrition and Aging (NANA) validation study, 40 older adults self-reported their mood using a touchscreen computer over three, one-week periods. Here, we demonstrate the potential of these data to predict future depression status. We analysed data from the NANA validation study using a machine learning approach. We applied the least absolute shrinkage and selection operator with a logistic model to averages of six measures of mood, with depression status according to the Geriatric Depression Scale 10 weeks later as the outcome variable. We tested multiple values of the selection parameter in order to produce a model with low deviance. We used a cross-validation framework to avoid overspecialisation, and receiver operating characteristic (ROC) curve analysis to determine the quality of the fitted model. The model we report contained coefficients for two variables: sadness and tiredness, as well as a constant. The cross-validated area under the ROC curve for this model was 0.88 (CI: 0.69-0.97). While results are based on a small sample, the methodology for the selection of variables appears suitable for the problem at hand, suggesting promise for a wider study and ultimate deployment with older adults at increased risk of depression. We have identified self-reported scales of sadness and tiredness as sensitive measures which have the potential to predict future depression status in older adults, partially addressing the problem of underdiagnosis. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
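
    A hedged re-creation of the modelling step, with simulated features and outcomes in place of the NANA data: an L1-penalised ("lasso") logistic model over six mood averages, scored by cross-validated ROC AUC; the penalty strength C below stands in for the selection parameter the authors tuned.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(4)
    X = rng.normal(size=(40, 6))   # six self-reported mood averages, n = 40
    # simulated outcome driven by two of the six measures (cf. sadness, tiredness)
    y = (X[:, 0] + X[:, 1] + rng.normal(size=40) > 0).astype(int)

    model = LogisticRegression(penalty="l1", solver="liblinear", C=0.5)
    auc = cross_val_score(model, X, y, cv=5, scoring="roc_auc").mean()
    model.fit(X, y)
    print(f"cross-validated AUC = {auc:.2f}")
    print("coefficients (sparse):", np.round(model.coef_, 2))
    ```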

  7. Harvesting rockfall hazard evaluation parameters from Google Earth Street View

    NASA Astrophysics Data System (ADS)

    Partsinevelos, Panagiotis; Agioutantis, Zacharias; Tripolitsiotis, Achilles; Steiakakis, Chrysanthos; Mertikas, Stelios

    2015-04-01

    Rockfall incidents along highways and railways prove extremely dangerous for properties, infrastructure and human lives. Several qualitative metrics such as the Rockfall Hazard Rating System (RHRS) and the Colorado Rockfall Hazard Rating System (CRHRS) have been established to estimate rockfall potential and provide risk maps in order to control and monitor rockfall incidents. The implementation of such metrics for efficient and reliable risk modeling requires accurate knowledge of multi-parametric attributes such as the geological, geotechnical and topographic parameters of the study area. The Missouri Rockfall Hazard Rating System (MORH RS) identifies the most potentially problematic areas using digital video logging for the determination of parameters like slope height and angle, face irregularities, etc. This study aims to harvest, in a semi-automated approach, geometric and qualitative measures through open source platforms that may provide 3-dimensional views of the areas of interest. More specifically, the Street View platform from Google Maps is hereby used to provide essential information that can be used towards 3-dimensional reconstruction of slopes along highways. The potential of image capturing along a programmable virtual route to provide the input data for photogrammetric processing is also evaluated. Moreover, qualitative characterization of the geological and geotechnical status, based on the Street View images, is performed. These attributes are then integrated to deliver a GIS-based rockfall hazard map. The 3-dimensional models are compared to actual photogrammetric measures in a rockfall-prone area in Crete, Greece, while in-situ geotechnical characterization is also used to compare and validate the hazard risk. This work is considered the first step towards the exploitation of open source platforms to improve road safety and the development of an operational system where authorized agencies (i.e., civil protection) will be able to acquire near-real-time hazard maps based on video images retrieved either by open source platforms, operational unmanned aerial vehicles, and/or simple video recordings from users. This work has been performed under the framework of the "Cooperation 2011" project ISTRIA (11_SYN_9_13989) funded from the Operational Program "Competitiveness and Entrepreneurship" (co-funded by the European Regional Development Fund (ERDF)) and managed by the Greek General Secretariat for Research and Technology.

  8. Rapid Operational Access and Maneuver Support (ROAMS) Platform for Improved Military Logistics Lines of Communication and Operational Vessel Routing

    DTIC Science & Technology

    2017-06-01

    case study in a northeastern American metropolitan area. METHODOLOGY: The ROAMS platform provides expanded analysis, model automation, and enhanced...shoals. An initial route for such operations is selected much like the military logistics case. Subsequent adjustments to routes may be done on an ad... CASE STUDY: The ROAMS platform was applied to a large, northeast American metropolitan region to demonstrate the capability of

  9. 15 CFR 911.5 - NOAA Data Collection Systems Use Agreements.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... initial in-situ deployment of the platforms, and may be renewed for additional 3-year periods. (2... valid for 1 year from the date of initial in-situ deployment of the platforms, and may be renewed for... year from the date of initial in-situ deployment of the platforms, and may be renewed for additional 1...

  10. 15 CFR 911.5 - NOAA Data Collection Systems Use Agreements.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... initial in-situ deployment of the platforms, and may be renewed for additional 3-year periods. (2... valid for 1 year from the date of initial in-situ deployment of the platforms, and may be renewed for... year from the date of initial in-situ deployment of the platforms, and may be renewed for additional 1...

  11. 15 CFR 911.5 - NOAA Data Collection Systems Use Agreements.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... initial in-situ deployment of the platforms, and may be renewed for additional 3-year periods. (2... valid for 1 year from the date of initial in-situ deployment of the platforms, and may be renewed for... year from the date of initial in-situ deployment of the platforms, and may be renewed for additional 1...

  12. 15 CFR 911.5 - NOAA Data Collection Systems Use Agreements.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... initial in-situ deployment of the platforms, and may be renewed for additional 3-year periods. (2... valid for 1 year from the date of initial in-situ deployment of the platforms, and may be renewed for... year from the date of initial in-situ deployment of the platforms, and may be renewed for additional 1...

  13. 15 CFR 911.5 - NOAA Data Collection Systems Use Agreements.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... initial in-situ deployment of the platforms, and may be renewed for additional 3-year periods. (2... valid for 1 year from the date of initial in-situ deployment of the platforms, and may be renewed for... year from the date of initial in-situ deployment of the platforms, and may be renewed for additional 1...

  14. 30 CFR 250.903 - What records must I keep?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... SULPHUR OPERATIONS IN THE OUTER CONTINENTAL SHELF Platforms and Structures General Requirements for... platform safety, structural reliability, or operating capabilities. Items such as steel brackets, deck...

  15. Low-loss compact multilayer silicon nitride platform for 3D photonic integrated circuits.

    PubMed

    Shang, Kuanping; Pathak, Shibnath; Guan, Binbin; Liu, Guangyao; Yoo, S J B

    2015-08-10

    We design, fabricate, and demonstrate a silicon nitride (Si(3)N(4)) multilayer platform optimized for low-loss and compact multilayer photonic integrated circuits. The designed platform, with a 200 nm thick waveguide core and a 700 nm interlayer gap, is compatible with active thermal tuning and applicable to realizing compact photonic devices such as arrayed waveguide gratings (AWGs). We achieve ultra-low-loss vertical couplers with 0.01 dB coupling loss, a multilayer crossing loss of 0.167 dB at a 90° crossing angle, a 50 μm bending radius, a 100 × 2 μm(2) footprint, lateral misalignment tolerance up to 400 nm, and less than -52 dB interlayer crosstalk at 1550 nm wavelength. Based on the designed platform, we demonstrate a 27 × 32 × 2 multilayer star coupler.

  16. Xi-cam: a versatile interface for data visualization and analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pandolfi, Ronald J.; Allan, Daniel B.; Arenholz, Elke

    Xi-cam is an extensible platform for data management, analysis and visualization. Xi-cam aims to provide a flexible and extensible approach to synchrotron data treatment as a solution to rising demands for high-volume/high-throughput processing pipelines. The core of Xi-cam is an extensible plugin-based graphical user interface platform which provides users with an interactive interface to processing algorithms. Plugins are available for SAXS/WAXS/GISAXS/GIWAXS, tomography and NEXAFS data. With Xi-cam's 'advanced' mode, data processing steps are designed as a graph-based workflow, which can be executed live, locally or remotely. Remote execution utilizes high-performance computing or de-localized resources, allowing for the effective reduction of high-throughput data. Xi-cam's plugin-based architecture targets cross-facility and cross-technique collaborative development, in support of multi-modal analysis. Xi-cam is open-source and cross-platform, and available for download on GitHub.

  17. Xi-cam: a versatile interface for data visualization and analysis

    DOE PAGES

    Pandolfi, Ronald J.; Allan, Daniel B.; Arenholz, Elke; ...

    2018-05-31

    Xi-cam is an extensible platform for data management, analysis and visualization. Xi-cam aims to provide a flexible and extensible approach to synchrotron data treatment as a solution to rising demands for high-volume/high-throughput processing pipelines. The core of Xi-cam is an extensible plugin-based graphical user interface platform which provides users with an interactive interface to processing algorithms. Plugins are available for SAXS/WAXS/GISAXS/GIWAXS, tomography and NEXAFS data. With Xi-cam's 'advanced' mode, data processing steps are designed as a graph-based workflow, which can be executed live, locally or remotely. Remote execution utilizes high-performance computing or de-localized resources, allowing for the effective reduction of high-throughput data. Xi-cam's plugin-based architecture targets cross-facility and cross-technique collaborative development, in support of multi-modal analysis. Xi-cam is open-source and cross-platform, and available for download on GitHub.

  18. Graph processing platforms at scale: practices and experiences

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lim, Seung-Hwan; Lee, Sangkeun; Brown, Tyler C

    2015-01-01

    Graph analysis unveils hidden associations of data in many phenomena and artifacts, such as road networks, social networks, genomic information, and scientific collaboration. Unfortunately, the wide diversity in the characteristics of graphs and graph operations makes it challenging to find the right combination of tools and implementation of algorithms to discover desired knowledge from the target data set. This study presents an extensive empirical study of three representative graph processing platforms: Pegasus, GraphX, and Urika. Each system represents a combination of options in data model, processing paradigm, and infrastructure. We benchmarked each platform using three popular graph operations (degree distribution, connected components, and PageRank) over a variety of real-world graphs. Our experiments show that each graph processing platform has different strengths, depending on the type of graph operation. While Urika performs best in non-iterative operations like degree distribution, GraphX outperforms the others in iterative operations like connected components and PageRank. In addition, we discuss challenges in optimizing the performance of each platform over large-scale real-world graphs.
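
    For readers unfamiliar with the benchmarked operations, the sketch below gives minimal pure-Python versions of two of them, degree distribution and power-iteration PageRank, on a toy adjacency-list graph; the graph and damping factor are illustrative.

      # Tiny pure-Python versions of two of the benchmarked graph operations.
      from collections import Counter

      graph = {"a": ["b", "c"], "b": ["c"], "c": ["a"], "d": ["c"]}

      # Degree distribution: how many nodes have each out-degree.
      degree_dist = Counter(len(nbrs) for nbrs in graph.values())
      print(degree_dist)          # Counter({1: 3, 2: 1})

      # Power-iteration PageRank with damping factor 0.85.
      def pagerank(g, d=0.85, iters=50):
          n = len(g)
          rank = {v: 1.0 / n for v in g}
          for _ in range(iters):
              nxt = {v: (1 - d) / n for v in g}
              for v, nbrs in g.items():
                  share = d * rank[v] / len(nbrs) if nbrs else 0.0
                  for u in nbrs:
                      nxt[u] += share
              rank = nxt
          return rank

      print(pagerank(graph))      # "c" accumulates the highest rank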

  19. Space Station

    NASA Image and Video Library

    1981-12-01

    During 1980 and the first half of 1981, the Marshall Space Flight Center conducted studies concerned with a relatively low-cost, near-term, manned space platform to satisfy current user needs, yet capable of evolutionary growth to meet future needs. The Science and Application Manned Space Platform (SAMSP) studies were to serve as a test bed for developing scientific and operational capabilities required by later, more advanced manned platforms while accomplishing early science and operations. This concept illustrates a manned space platform.

  20. Cleaning and other control and validation strategies to prevent allergen cross-contact in food-processing operations.

    PubMed

    Jackson, Lauren S; Al-Taher, Fadwa M; Moorman, Mark; DeVries, Jonathan W; Tippett, Roger; Swanson, Katherine M J; Fu, Tong-Jen; Salter, Robert; Dunaif, George; Estes, Susan; Albillos, Silvia; Gendel, Steven M

    2008-02-01

    Food allergies affect an estimated 10 to 12 million people in the United States. Some of these individuals can develop life-threatening allergic reactions when exposed to allergenic proteins. At present, the only successful method to manage food allergies is to avoid foods containing allergens. Consumers with food allergies rely on food labels to disclose the presence of allergenic ingredients. However, undeclared allergens can be inadvertently introduced into a food via cross-contact during manufacturing. Although allergen removal through cleaning of shared equipment or processing lines has been identified as one of the critical points for effective allergen control, there is little published information on the effectiveness of cleaning procedures for removing allergenic materials from processing equipment. There also is no consensus on how to validate or verify the efficacy of cleaning procedures. The objectives of this review were (i) to study the incidence and cause of allergen cross-contact, (ii) to assess the science upon which the cleaning of food contact surfaces is based, (iii) to identify best practices for cleaning allergenic foods from food contact surfaces in wet and dry manufacturing environments, and (iv) to present best practices for validating and verifying the efficacy of allergen cleaning protocols.

  1. Robonaut 2 and You: Specifying and Executing Complex Operations

    NASA Technical Reports Server (NTRS)

    Baker, William; Kingston, Zachary; Moll, Mark; Badger, Julia; Kavraki, Lydia

    2017-01-01

    Crew time is a precious resource due to the expense of trained human operators in space. Efficient caretaker robots could lessen the manual labor load required by frequent vehicular and life support maintenance tasks, freeing astronaut time for scientific mission objectives. Humanoid robots can fluidly exist alongside human counterparts due to their form, but they are complex and high-dimensional platforms. This paper describes a system that human operators can use to maneuver Robonaut 2 (R2), a dexterous humanoid robot developed by NASA to research co-robotic applications. The system includes a specification of constraints used to describe operations, and the supporting planning framework that solves constrained problems on R2 at interactive speeds. The paper is developed in reference to an illustrative, typical example of an operation R2 performs to highlight the challenges inherent to the problems R2 must face. Finally, the interface and planner are validated through a case study using the guiding example on the physical robot in a simulated microgravity environment. This work reveals the complexity of employing humanoid caretaker robots and suggests solutions that are broadly applicable.

  2. Dynamic gene expression response to altered gravity in human T cells.

    PubMed

    Thiel, Cora S; Hauschild, Swantje; Huge, Andreas; Tauber, Svantje; Lauber, Beatrice A; Polzer, Jennifer; Paulsen, Katrin; Lier, Hartwin; Engelmann, Frank; Schmitz, Burkhard; Schütte, Andreas; Layer, Liliana E; Ullrich, Oliver

    2017-07-12

    We investigated the dynamics of immediate and initial gene expression response to different gravitational environments in human Jurkat T lymphocytic cells and compared expression profiles to identify potential gravity-regulated genes and adaptation processes. We used the Affymetrix GeneChip® Human Transcriptome Array 2.0, containing 44,699 protein-coding genes and 22,829 non-protein-coding genes, and performed the experiments during a parabolic flight and a suborbital ballistic rocket mission to cross-validate gravity-regulated gene expression through independent research platforms and different sets of control experiments, to exclude factors other than the alteration of gravity. We found that gene expression in human T cells rapidly responded to altered gravity in the time frame of 20 s and 5 min. The initial response to microgravity involved mostly regulatory RNAs. We identified three gravity-regulated genes which could be cross-validated in both completely independent experiment missions: ATP6V1A/D, a vacuolar H⁺-ATPase (V-ATPase) responsible for acidification during bone resorption; IGHD3-3/IGHD3-10, diversity genes of the immunoglobulin heavy-chain locus participating in V(D)J recombination; and LINC00837, a long intergenic non-protein-coding RNA. Due to the extensive and rapid alteration of gene expression associated with regulatory RNAs, we conclude that human cells are equipped with a robust and efficient adaptation potential when challenged with altered gravitational environments.
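
    At its simplest, the cross-validation logic amounts to intersecting the gene sets called significant on the two independent platforms, as in the sketch below. Apart from the three genes named in the abstract, the gene lists are invented placeholders.

      # Cross-validating hits across two independent experiment platforms
      # reduces, at its simplest, to a set intersection.
      parabolic_flight = {"ATP6V1A", "IGHD3-3", "LINC00837", "GENE_X"}
      ballistic_rocket = {"ATP6V1A", "IGHD3-3", "LINC00837", "GENE_Y"}

      cross_validated = parabolic_flight & ballistic_rocket
      print(sorted(cross_validated))   # ['ATP6V1A', 'IGHD3-3', 'LINC00837']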

  3. Development of a UAS-based survey module for ecological research

    NASA Astrophysics Data System (ADS)

    Meng, R.; McMahon, A. M.; Serbin, S.

    2016-12-01

    The development of small unmanned aircraft system (UAS, < 25 kg) techniques is enabling measurements of terrestrial ecosystems at unprecedented temporal and spatial scales. Given the potential for improved mission safety, high revisit frequency, and reduced operation cost, UAS platforms are of particular interest for scientific research. Our group is developing a UAS-based survey module for ecological research (e.g. scaling and mapping plant functional traits). However, in addition to technical challenges, the complicated regulations required to operate a UAS for research (e.g. Certificates of Waiver or Authorization, COA, for each location) and compliance with Federal Aviation Administration (FAA) restrictions, which are still actively evolving, can have significant impacts on research plans and schedules. Here we briefly discuss our lessons learned related to FAA registration and COA procedures, requirements, and regulations in the US, accompanied by our hands-on experience (our group currently has two COAs granted and three more under review by the FAA). We then introduce our design for a modular data collection software framework. This framework is open source (available on GitHub) and cross-platform compatible (written in Python), providing flexibility in development and deployment hardware configurations. In addition, our framework uses a central module to coordinate data acquisition, synchronization with the UAS control system, and data storage through a common interface and interchangeable, hardware-specific software modules. Utilizing this structure and a common data transfer format, the system can be easily reconfigured to meet the needs of a specific platform or operation, eliminating the need to redevelop acquisition systems for specific instrument/platform configurations. On-site data measurement tests of the UAS-based survey module were conducted, and data quality from multiple sensors (e.g. a high-resolution digital camera, a spectroradiometer, and a thermal infrared camera) was reported. Finally, the results of this prototype study show that UAS techniques can be used to develop a low-cost alternative for ecological research, but the practitioner must still take care to deal with flight regulations and to integrate off-the-shelf instrumentation.
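
    A minimal sketch of the modular-acquisition design described above, assuming a common record format and a central coordinator; the class and method names are hypothetical, not the group's actual GitHub code.

      # Hardware-specific modules share one interface; a central coordinator
      # drives them all, so swapping sensors needs no framework changes.
      from abc import ABC, abstractmethod

      class SensorModule(ABC):
          @abstractmethod
          def acquire(self) -> dict:
              """Return one sample in a common record format."""

      class CameraModule(SensorModule):
          def acquire(self):
              return {"sensor": "camera", "frame_id": 42}

      class SpectrometerModule(SensorModule):
          def acquire(self):
              return {"sensor": "spectrometer", "radiance": [0.1, 0.2]}

      class Coordinator:
          """Central module: synchronizes acquisition across all sensors."""
          def __init__(self, modules):
              self.modules = modules

          def collect(self):
              return [m.acquire() for m in self.modules]

      print(Coordinator([CameraModule(), SpectrometerModule()]).collect())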

  4. Vector 33: A reduce program for vector algebra and calculus in orthogonal curvilinear coordinates

    NASA Astrophysics Data System (ADS)

    Harper, David

    1989-06-01

    This paper describes a package which enables REDUCE 3.3 to perform algebra and calculus operations upon vectors. Basic algebraic operations between vectors and between scalars and vectors are provided, including the scalar (dot) product and vector (cross) product. The vector differential operators curl, divergence, gradient and Laplacian are also defined, and are valid in any orthogonal curvilinear coordinate system. The package is written in RLISP to allow algebra and calculus to be performed using notation identical to that for scalar operations. Scalars and vectors can be mixed quite freely in the same expression. The package will be of interest to mathematicians, engineers and scientists who need to perform vector calculations in orthogonal curvilinear coordinates.
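
    For comparison, the same kinds of operations can be performed in Python with SymPy's vector module (shown here in Cartesian coordinates; SymPy also supports other orthogonal curvilinear systems). This is an independent illustration, not a port of the REDUCE package.

      # Gradient, divergence, and curl with SymPy's vector module.
      from sympy.vector import CoordSys3D, divergence, curl, gradient

      N = CoordSys3D('N')
      f = N.x**2 * N.y                      # a scalar field
      v = N.x*N.i + N.y*N.j + N.z*N.k       # a vector field

      print(gradient(f))       # 2*N.x*N.y*N.i + N.x**2*N.j
      print(divergence(v))     # 3
      print(curl(v))           # 0 (the zero vector)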

  5. Improvements and Advances to the Cross-Calibrated Multi-Platform (CCMP) Ocean Vector Wind Analysis (V2.0 release)

    NASA Astrophysics Data System (ADS)

    Scott, J. P.; Wentz, F. J.; Hoffman, R. N.; Atlas, R. M.

    2016-02-01

    Ocean vector wind is a valuable climate data record (CDR) useful in observing and monitoring changes in climate and air-sea interactions. Ocean surface wind stress influences such processes as heat, moisture, and momentum fluxes between the atmosphere and ocean, driving ocean currents and forcing ocean circulation. The Cross-Calibrated Multi-Platform (CCMP) ocean vector wind analysis is a quarter-degree, six-hourly global ocean wind analysis product created using the variational analysis method (VAM) [Atlas et al., 1996; Hoffman et al., 2003]. The CCMP V1.1 wind product is a highly esteemed, widely used data set containing the longest gap-free record of satellite-based ocean vector wind data (July 1987 to June 2012). CCMP V1.1 was considered a "first-look" data set that used the most timely, albeit preliminary, releases of satellite, in situ, and modeled ECMWF-Operational wind background fields. The authors have been working with the original producers of CCMP V1.1 to create an updated, improved, and consistently reprocessed CCMP V2.0 ocean vector wind analysis data set. With Remote Sensing Systems (RSS) having recently updated all passive microwave satellite instrument calibrations and retrievals to the RSS Version-7 RTM standard, the reprocessing of the CCMP data set into a higher-quality CDR using inter-calibrated satellite inputs became feasible. In addition to the SSM/I, SSMIS, TRMM TMI, QuikSCAT, AMSRE, and WindSat instruments, AMSR2, GMI, and ASCAT have also been included in the CCMP V2.0 data set release, which has now been extended to the beginning of 2015. Additionally, the background field has been updated to use six-hourly, quarter-degree ERA-Interim wind vector inputs, and the quality checks on the in situ data have been carefully reviewed and improved. The goal of the release of the CCMP V2.0 ocean wind vector analysis product is to serve as a merged ocean wind vector data set for climate studies. Diligent effort has been made by the authors to minimize systematic and spurious sources of error. The authors will present a complete discussion of upgrades made to the CCMP V2.0 data set, as well as present validation work that has been completed on the CCMP V2.0 wind analysis product.

  6. Synthetic Diagnostics Platform for Fusion Plasma and a Two-Dimensional Synthetic Electron Cyclotron Emission Imaging Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shi, Lei

    Magnetic confinement fusion is one of the most promising approaches to achieving fusion energy. With the rapid increase of computational power over the past decades, numerical simulations have become an important tool for studying fusion plasmas. Eventually, the numerical models will be used to predict the performance of future devices, such as the International Thermonuclear Experimental Reactor (ITER) or DEMO. However, the reliability of these models needs to be carefully validated against experiments before the results can be trusted. The validation between simulations and measurements is particularly hard because the quantities directly available from the two sides are different. While the simulations have the information of the plasma quantities calculated explicitly, the measurements are usually in the form of diagnostic signals. The traditional way of making the comparison relies on the diagnosticians to interpret the measured signals as plasma quantities. The interpretation is in general very complicated and sometimes not even unique. In contrast, given the plasma quantities from the plasma simulations, we can unambiguously calculate the generation and propagation of the diagnostic signals. These calculations are called synthetic diagnostics, and they enable an alternative way to compare simulation results with measurements. In this dissertation, we present a platform for developing and applying synthetic diagnostic codes. Three diagnostics on the platform are introduced. The reflectometry and beam emission spectroscopy diagnostics measure the electron density, and the electron cyclotron emission diagnostic measures the electron temperature. The theoretical derivation and numerical implementation of a new two-dimensional Electron Cyclotron Emission Imaging code is discussed in detail. This new code has shown the potential to address many challenging aspects of present ECE measurements, such as runaway electron effects and detection of the cross phase between the electron temperature and density fluctuations.

  7. Large-scale cross-species chemogenomic platform proposes a new drug discovery strategy of veterinary drug from herbal medicines.

    PubMed

    Huang, Chao; Yang, Yang; Chen, Xuetong; Wang, Chao; Li, Yan; Zheng, Chunli; Wang, Yonghua

    2017-01-01

    Veterinary Herbal Medicine (VHM) is a comprehensive, current, and informative discipline on the utilization of herbs in veterinary practice. Driven by chemistry but progressively directed by pharmacology and the clinical sciences, drug research has contributed much to addressing the need for innovative veterinary medicines for curing animal diseases. However, research into veterinary medicine of vegetal origin in the pharmaceutical industry has declined, owing to issues such as the lack of compatibility of traditional natural-product extract libraries with high-throughput screening. Here, we present a cross-species chemogenomic screening platform to dissect the genetic basis of multifactorial diseases and to determine the most suitable points of attack for future veterinary medicines, thereby increasing the number of treatment options. First, based on critically examined pharmacology and text mining, we build a cross-species drug-likeness evaluation approach to screen the lead compounds in veterinary medicines. Second, a specific cross-species target prediction model is developed to infer drug-target connections, with the purpose of understanding how drugs work on the specific targets. Third, we focus on exploring the multiple-target interference effects of veterinary medicines by heterogeneous network convergence and modularization analysis. Finally, we manually integrate a disease pathway to test whether the cross-species chemogenomic platform could uncover the active mechanism of veterinary medicine, which is exemplified by a specific network module. We believe the proposed cross-species chemogenomic platform allows for the systematization of current and traditional knowledge of veterinary medicine and, importantly, for the application of this emerging body of knowledge to the development of new drugs for animal diseases.
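
    As a stand-in for the drug-likeness evaluation step, the sketch below filters candidate compounds with classic Lipinski rule-of-five thresholds. The paper's actual cross-species model is more elaborate, and the compound records here are invented.

      # Rule-of-five style drug-likeness filter (illustrative stand-in only).
      def drug_like(c):
          return (c["mol_weight"] <= 500 and c["logp"] <= 5
                  and c["h_donors"] <= 5 and c["h_acceptors"] <= 10)

      compounds = [
          {"name": "cpd-1", "mol_weight": 320, "logp": 2.1, "h_donors": 2, "h_acceptors": 5},
          {"name": "cpd-2", "mol_weight": 710, "logp": 6.3, "h_donors": 7, "h_acceptors": 12},
      ]
      leads = [c["name"] for c in compounds if drug_like(c)]
      print(leads)   # ['cpd-1']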

  8. Large-scale cross-species chemogenomic platform proposes a new drug discovery strategy of veterinary drug from herbal medicines

    PubMed Central

    Huang, Chao; Yang, Yang; Chen, Xuetong; Wang, Chao; Li, Yan; Zheng, Chunli

    2017-01-01

    Veterinary Herbal Medicine (VHM) is a comprehensive, current, and informative discipline on the utilization of herbs in veterinary practice. Driven by chemistry but progressively directed by pharmacology and the clinical sciences, drug research has contributed much to addressing the need for innovative veterinary medicines for curing animal diseases. However, research into veterinary medicine of vegetal origin in the pharmaceutical industry has declined, owing to issues such as the lack of compatibility of traditional natural-product extract libraries with high-throughput screening. Here, we present a cross-species chemogenomic screening platform to dissect the genetic basis of multifactorial diseases and to determine the most suitable points of attack for future veterinary medicines, thereby increasing the number of treatment options. First, based on critically examined pharmacology and text mining, we build a cross-species drug-likeness evaluation approach to screen the lead compounds in veterinary medicines. Second, a specific cross-species target prediction model is developed to infer drug-target connections, with the purpose of understanding how drugs work on the specific targets. Third, we focus on exploring the multiple-target interference effects of veterinary medicines by heterogeneous network convergence and modularization analysis. Finally, we manually integrate a disease pathway to test whether the cross-species chemogenomic platform could uncover the active mechanism of veterinary medicine, which is exemplified by a specific network module. We believe the proposed cross-species chemogenomic platform allows for the systematization of current and traditional knowledge of veterinary medicine and, importantly, for the application of this emerging body of knowledge to the development of new drugs for animal diseases. PMID:28915268

  9. RMS: a platform for managing cross-disciplinary and multi-institutional research project collaboration.

    PubMed

    Luo, Jake; Apperson-Hansen, Carolyn; Pelfrey, Clara M; Zhang, Guo-Qiang

    2014-11-30

    Cross-institutional cross-disciplinary collaboration has become a trend as researchers move toward building more productive and innovative teams for scientific research. Research collaboration is significantly changing the organizational structure and strategies used in the clinical and translational science domain. However, due to the obstacles of diverse administrative structures, differences in areas of expertise, and communication barriers, establishing and managing a cross-institutional research project is still a challenging task. We address these challenges by creating an integrated informatics platform to reduce the barriers to biomedical research collaboration. The Request Management System (RMS) is an informatics infrastructure designed to transform a patchwork of expertise and resources into an integrated support network. The RMS facilitates investigators' initiation of new collaborative projects and supports the management of the collaboration process. In RMS, experts and their knowledge areas are categorized and managed structurally to provide consistent service. A role-based collaborative workflow is tightly integrated with domain experts and services to streamline and monitor the life-cycle of a research project. The RMS has so far tracked over 1,500 investigators with over 4,800 tasks. The research network based on the data collected in RMS illustrated that the investigators' collaborative projects increased nearly threefold from 2009 to 2012. Our experience with RMS indicates that the platform reduces barriers for cross-institutional collaboration of biomedical research projects. Building a new generation of infrastructure to enhance cross-disciplinary and multi-institutional collaboration has become an important yet challenging task. In this paper, we share the experience of developing and utilizing a collaborative project management system. The results of this study demonstrate that a web-based integrated informatics platform can facilitate and increase research interactions among investigators.

  10. Geometric approach to segmentation and protein localization in cell culture assays.

    PubMed

    Raman, S; Maxwell, C A; Barcellos-Hoff, M H; Parvin, B

    2007-01-01

    Cell-based fluorescence imaging assays are heterogeneous and require the collection of a large number of images for detailed quantitative analysis. Complexities arise as a result of variation in spatial nonuniformity, shape, overlapping compartments, and scale (size). A new technique and methodology have been developed and tested for delineating subcellular morphology and partitioning overlapping compartments at multiple scales. This system is packaged as an integrated software platform for quantifying images that are obtained through fluorescence microscopy. The proposed methods are model based, leveraging geometric shape properties of subcellular compartments and corresponding protein localization. From the morphological perspective, a convexity constraint is imposed to delineate and partition nuclear compartments. From the protein localization perspective, radial symmetry is imposed to localize punctate protein events at submicron resolution. The convexity constraint is imposed against boundary information, which is extracted through a combination of zero-crossing and gradient operators. If the convexity constraint fails for the boundary, then positive curvature maxima are localized along the contour and the entire blob is partitioned into disjoint convex objects representing individual nuclear compartments, by enforcing geometric constraints. Nuclear compartments provide the context for protein localization, which may be diffuse or punctate. Punctate signals are localized through iterative voting and radial symmetries for improved reliability and robustness. The technique has been tested against 196 images that were generated to study centrosome abnormalities. Corresponding computed representations are compared against manual counts for validation.
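
    One common way to implement a convexity test of the kind described is to compare a region's pixel area with the area of its convex hull; the sketch below does this for binary masks. The approach and any threshold choice are illustrative, not the paper's exact formulation.

      # Solidity test: region pixel area / convex hull area. Convex regions
      # score visibly higher than regions with concavities.
      import numpy as np
      from scipy.spatial import ConvexHull

      def solidity(mask):
          """Pixel area of a binary region divided by its convex hull area."""
          pts = np.argwhere(mask)                 # (row, col) of foreground pixels
          return mask.sum() / ConvexHull(pts).volume   # .volume is area in 2-D

      # A filled square is convex; carving a quadrant out makes an L-shape.
      # (Hull area uses pixel centers, so the convex case can exceed 1;
      # what matters is the relative drop for the non-convex region.)
      square = np.zeros((20, 20), dtype=bool)
      square[5:15, 5:15] = True
      lshape = square.copy()
      lshape[10:15, 10:15] = False

      print(round(solidity(square), 2))   # higher (convex)
      print(round(solidity(lshape), 2))   # lower (non-convex)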

  11. Digital image analysis of Ki67 proliferation index in breast cancer using virtual dual staining on whole tissue sections: clinical validation and inter-platform agreement.

    PubMed

    Koopman, Timco; Buikema, Henk J; Hollema, Harry; de Bock, Geertruida H; van der Vegt, Bert

    2018-05-01

    The Ki67 proliferation index is a prognostic and predictive marker in breast cancer. Manual scoring is prone to inter- and intra-observer variability. The aims of this study were to clinically validate digital image analysis (DIA) of Ki67 using virtual dual staining (VDS) on whole tissue sections and to assess inter-platform agreement between two independent DIA platforms. Serial whole tissue sections of 154 consecutive invasive breast carcinomas were stained for Ki67 and cytokeratin 8/18 with immunohistochemistry in a clinical setting. Ki67 proliferation index was determined using two independent DIA platforms, implementing VDS to identify tumor tissue. Manual Ki67 score was determined using a standardized manual counting protocol. Inter-observer agreement between manual and DIA scores and inter-platform agreement between both DIA platforms were determined and calculated using Spearman's correlation coefficients. Correlations and agreement were assessed with scatterplots and Bland-Altman plots. Spearman's correlation coefficients were 0.94 (p < 0.001) for inter-observer agreement between manual counting and platform A, 0.93 (p < 0.001) between manual counting and platform B, and 0.96 (p < 0.001) for inter-platform agreement. Scatterplots and Bland-Altman plots revealed no skewness within specific data ranges. In the few cases with ≥ 10% difference between manual counting and DIA, results by both platforms were similar. DIA using VDS is an accurate method to determine the Ki67 proliferation index in breast cancer, as an alternative to manual scoring of whole sections in clinical practice. Inter-platform agreement between two different DIA platforms was excellent, suggesting vendor-independent clinical implementability.
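
    The agreement statistics used above can be reproduced on toy data with SciPy: Spearman correlation plus the Bland-Altman bias and limits of agreement. The Ki67 values below are invented.

      # Spearman correlation and Bland-Altman quantities for two platforms.
      import numpy as np
      from scipy.stats import spearmanr

      platform_a = np.array([12.0, 25.0, 40.0, 8.0, 55.0])   # % Ki67, invented
      platform_b = np.array([14.0, 23.0, 42.0, 9.0, 53.0])

      rho, p = spearmanr(platform_a, platform_b)
      diff = platform_a - platform_b
      bias = diff.mean()                          # mean difference
      sd = diff.std(ddof=1)
      loa = (bias - 1.96 * sd, bias + 1.96 * sd)  # 95% limits of agreement
      print(f"rho={rho:.2f}, bias={bias:.2f}, LoA={loa[0]:.2f}..{loa[1]:.2f}")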

  12. Coupled Modeling of Hydrodynamics and Sound in Coastal Ocean for Renewable Ocean Energy Development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Long, Wen; Jung, Ki Won; Yang, Zhaoqing

    An underwater sound model was developed to simulate sound propagation from marine and hydrokinetic energy (MHK) devices or offshore wind (OSW) energy platforms. Finite difference methods were developed to solve the 3D Helmholtz equation for sound propagation in the coastal environment. A 3D sparse matrix solver with complex coefficients was formed for solving the resulting acoustic pressure field. The Complex Shifted Laplacian Preconditioner (CSLP) method was applied to solve the matrix system iteratively with MPI parallelization using a high performance cluster. The sound model was then coupled with the Finite Volume Community Ocean Model (FVCOM) for simulating sound propagation generated by human activities, such as construction of OSW turbines or tidal stream turbine operations, in a range-dependent setting. As a proof of concept, initial validation of the solver is presented for two coastal wedge problems. This sound model can be useful for evaluating impacts on marine mammals due to deployment of MHK devices and OSW energy platforms.
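
    A minimal 1-D analogue of the described solver, assuming a unit interval with homogeneous Dirichlet boundaries: a finite-difference Helmholtz operator, a complex-shifted Laplacian (CSLP) preconditioner, and GMRES. Grid size, wavenumber, and shift are illustrative choices, not the study's parameters.

      # 1-D Helmholtz (u'' + k^2 u = f) with a CSLP-preconditioned GMRES solve.
      import numpy as np
      import scipy.sparse as sp
      import scipy.sparse.linalg as spla

      n, k = 200, 20.0                   # grid points, wavenumber (illustrative)
      h = 1.0 / (n + 1)
      lap = sp.diags([1.0, -2.0, 1.0], [-1, 0, 1], shape=(n, n), format="csc") / h**2
      A = (lap + k**2 * sp.identity(n, format="csc")).astype(complex)   # Helmholtz
      M = (lap + (1.0 - 0.5j) * k**2 * sp.identity(n, format="csc")).astype(complex)

      M_lu = spla.splu(M)                # factor the shifted operator once
      precond = spla.LinearOperator(A.shape, matvec=M_lu.solve, dtype=complex)

      f = np.zeros(n, dtype=complex)
      f[n // 2] = 1.0 / h                # point source at the midpoint
      u, info = spla.gmres(A, f, M=precond)
      print("converged" if info == 0 else f"gmres info = {info}")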

  13. DCT Trigger in a High-Resolution Test Platform for the Detection of Very Inclined Showers in Pierre Auger Surface Detectors

    NASA Astrophysics Data System (ADS)

    Szadkowski, Zbigniew; Wiedeński, Michał

    2017-06-01

    We present the first results from a trigger based on the discrete cosine transform (DCT) operating in new front-end boards with a Cyclone V E field-programmable gate array (FPGA) deployed in seven test surface detectors in the Pierre Auger Test Array. The patterns of the ADC traces generated by very inclined showers (arriving at 70° to 90° from the vertical) were obtained from the Auger database and from the CORSIKA simulation package supported by the Auger OffLine event reconstruction platform, which gives predicted digitized signal profiles. Simulations for many values of the initial cosmic ray angle of arrival, the shower initialization depth in the atmosphere, the type of particle, and its initial energy gave boundaries on the DCT coefficients used for the online pattern recognition in the FPGA. Preliminary results validated the approach used. We recorded several showers triggered by the DCT at 120 Msamples/s and 160 Msamples/s.
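
    A DCT-based trigger decision can be sketched as follows: transform the ADC trace and test low-order coefficients against precomputed bounds. The bounds and the toy pulse below are placeholders, not the coefficient limits derived from the CORSIKA simulations.

      # Trigger if the low-order DCT coefficients of a trace fall inside
      # precomputed pattern bounds.
      import numpy as np
      from scipy.fft import dct

      def dct_trigger(trace, lower, upper, ncoef=8):
          coef = dct(np.asarray(trace, dtype=float), norm="ortho")[:ncoef]
          return bool(np.all((coef >= lower) & (coef <= upper)))

      trace = np.exp(-0.5 * ((np.arange(64) - 20) / 6.0) ** 2)  # toy shower pulse
      lower = np.full(8, -5.0)
      upper = np.full(8, 5.0)
      print(dct_trigger(trace, lower, upper))   # True for this toy pulse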

  14. Cross-Platform Learning: On the Nature of Children's Learning from Multiple Media Platforms

    ERIC Educational Resources Information Center

    Fisch, Shalom M.

    2013-01-01

    It is increasingly common for an educational media project to span several media platforms (e.g., TV, Web, hands-on materials), assuming that the benefits of learning from multiple media extend beyond those gained from one medium alone. Yet research typically has investigated learning from a single medium in isolation. This paper reviews several…

  15. Prediction Models for 30-Day Mortality and Complications After Total Knee and Hip Arthroplasties for Veteran Health Administration Patients With Osteoarthritis.

    PubMed

    Harris, Alex Hs; Kuo, Alfred C; Bowe, Thomas; Gupta, Shalini; Nordin, David; Giori, Nicholas J

    2018-05-01

    Statistical models to preoperatively predict patients' risk of death and major complications after total joint arthroplasty (TJA) could improve the quality of preoperative management and informed consent. Although risk models for TJA exist, they have limitations including poor transparency and/or unknown or poor performance. Thus, it is currently impossible to know how well currently available models predict short-term complications after TJA, or if newly developed models are more accurate. We sought to develop and conduct cross-validation of predictive risk models, and report details and performance metrics as benchmarks. Over 90 preoperative variables were used as candidate predictors of death and major complications within 30 days for Veterans Health Administration patients with osteoarthritis who underwent TJA. Data were split into 3 samples: one for selection of model tuning parameters, one for model development, and one for cross-validation. C-indexes (discrimination) and calibration plots were produced. A total of 70,569 patients diagnosed with osteoarthritis who received primary TJA were included. C-statistics and bootstrapped confidence intervals for the cross-validation of the boosted regression models were highest for cardiac complications (0.75; 0.71-0.79) and 30-day mortality (0.73; 0.66-0.79) and lowest for deep vein thrombosis (0.59; 0.55-0.64) and return to the operating room (0.60; 0.57-0.63). Moderately accurate predictive models of 30-day mortality and cardiac complications after TJA in Veterans Health Administration patients were developed and internally cross-validated. By reporting model coefficients and performance metrics, other model developers can test these models on new samples and have a procedure- and indication-specific benchmark to surpass. Published by Elsevier Inc.
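
    The evaluation design (a three-way split plus a C-index, which for a binary outcome equals the ROC AUC) can be sketched with scikit-learn on synthetic data, as below; the model and split fractions are illustrative.

      # Three-way split (tuning / development / validation) and a C-index.
      from sklearn.datasets import make_classification
      from sklearn.ensemble import GradientBoostingClassifier
      from sklearn.metrics import roc_auc_score
      from sklearn.model_selection import train_test_split

      X, y = make_classification(n_samples=3000, n_features=20, random_state=0)
      X_tune, X_rest, y_tune, y_rest = train_test_split(X, y, train_size=0.2, random_state=0)
      X_dev, X_val, y_dev, y_val = train_test_split(X_rest, y_rest, test_size=0.5, random_state=0)

      # (Hyperparameter tuning on X_tune omitted for brevity.)
      model = GradientBoostingClassifier(random_state=0).fit(X_dev, y_dev)
      auc = roc_auc_score(y_val, model.predict_proba(X_val)[:, 1])
      print(f"validation C-index (AUC): {auc:.2f}")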

  16. Development and Validation of an NPSS Model of a Small Turbojet Engine

    NASA Astrophysics Data System (ADS)

    Vannoy, Stephen Michael

    Recent studies have shown that integrated gas turbine engine (GT)/solid oxide fuel cell (SOFC) systems for combined propulsion and power on aircraft offer a promising method for more efficient onboard electrical power generation. However, it appears that nobody has actually attempted to construct a hybrid GT/SOFC prototype for combined propulsion and electrical power generation. This thesis contributes to this ambition by developing an experimentally validated thermodynamic model of a small gas turbine (~230 N thrust) platform for a bench-scale GT/SOFC system. The thermodynamic model is implemented in a NASA-developed software environment called Numerical Propulsion System Simulation (NPSS). An indoor test facility was constructed to measure the engine's performance parameters: thrust, air flow rate, fuel flow rate, engine speed (RPM), and all axial stage stagnation temperatures and pressures. The NPSS model predictions are compared to the measured performance parameters for steady state engine operation.

  17. Polyethylene Glycol Modified, Cross-Linked Starch Coated Iron Oxide Nanoparticles for Enhanced Magnetic Tumor Targeting

    PubMed Central

    Cole, Adam J.; David, Allan E.; Wang, Jianxin; Galbán, Craig J.; Hill, Hannah L.; Yang, Victor C.

    2010-01-01

    While successful magnetic tumor targeting of iron oxide nanoparticles has been achieved in a number of models, the rapid blood clearance of magnetically suitable particles by the reticuloendothelial system (RES) limits their availability for targeting. This work aimed to develop a long-circulating magnetic iron oxide nanoparticle (MNP) platform capable of sustained tumor exposure via the circulation and, thus, enhanced magnetic tumor targeting. Aminated, cross-linked starch (DN) and aminosilane (A) coated MNPs were successfully modified with 5 kDa (A5, D5) or 20 kDa (A20, D20) polyethylene glycol (PEG) chains using simple N-Hydroxysuccinimide (NHS) chemistry and characterized. Identical PEG-weight analogues between platforms (A5 & D5, A20 & D20) were similar in size (140–190 nm) and relative PEG labeling (1.5% of surface amines – A5/D5, 0.4% – A20/D20), with all PEG-MNPs possessing magnetization properties suitable for magnetic targeting. Candidate PEG-MNPs were studied in RES simulations in vitro to predict long-circulating character. D5 and D20 performed best, showing sustained size stability in cell culture medium at 37°C and 7 (D20) to 10 (D5) fold less uptake in RAW264.7 macrophages when compared to previously targeted, unmodified starch MNPs (D). Observations in vitro were validated in vivo, with D5 (7.29 hr) and D20 (11.75 hr) showing much longer half-lives than D (0.12 hr). Improved plasma stability enhanced tumor MNP exposure 100 (D5) to 150 (D20) fold as measured by plasma AUC0-∞. Sustained tumor exposure over 24 hours was visually confirmed in a 9L-glioma rat model (12 mg Fe/kg) using magnetic resonance imaging (MRI). Findings indicate that both D5 and D20 are promising MNP platforms for enhanced magnetic tumor targeting, warranting further study in tumor models. PMID:21176955

  18. High fidelity wireless network evaluation for heterogeneous cognitive radio networks

    NASA Astrophysics Data System (ADS)

    Ding, Lei; Sagduyu, Yalin; Yackoski, Justin; Azimi-Sadjadi, Babak; Li, Jason; Levy, Renato; Melodia, Tammaso

    2012-06-01

    We present a high fidelity cognitive radio (CR) network emulation platform for wireless system tests, measurements, and validation. This versatile platform provides the configurable functionalities to control and repeat realistic physical channel effects in integrated space, air, and ground networks. We combine the advantages of a scalable simulation environment with reliable hardware performance for high fidelity and repeatable evaluation of heterogeneous CR networks. This approach extends CR design beyond the device (software-defined radio) or lower-level protocol (dynamic spectrum access) level to end-to-end cognitive networking, and facilitates low-cost deployment, development, and experimentation of new wireless network protocols and applications on frequency-agile programmable radios. Going beyond the channel emulator paradigm for point-to-point communications, we can support simultaneous transmissions by network-level emulation that allows realistic physical-layer interactions between diverse user classes, including secondary users, primary users, and adversarial jammers in CR networks. In particular, we can replay field tests in a lab environment with real radios perceiving and learning the dynamic environment, thereby adapting for end-to-end goals over distributed spectrum coordination channels that replace the common control channel as a single point of failure. CR networks offer several dimensions of tunable actions including channel, power, rate, and route selection. The proposed network evaluation platform is fully programmable and can reliably evaluate the necessary cross-layer design solutions with a configurable optimization space by leveraging the hardware experiments to represent the realistic effects of physical channel, topology, mobility, and jamming on spectrum agility, situational awareness, and network resiliency. We also provide the flexibility to scale up the test environment by introducing virtual radios and establishing seamless signal-level interactions with real radios. This holistic wireless evaluation approach supports a large-scale, heterogeneous, and dynamic CR network architecture and allows developing cross-layer network protocols under high fidelity, repeatable, and scalable wireless test scenarios suitable for heterogeneous space, air, and ground networks.

  19. Development of a Bayesian model to estimate health care outcomes in the severely wounded

    PubMed Central

    Stojadinovic, Alexander; Eberhardt, John; Brown, Trevor S; Hawksworth, Jason S; Gage, Frederick; Tadaki, Douglas K; Forsberg, Jonathan A; Davis, Thomas A; Potter, Benjamin K; Dunne, James R; Elster, E A

    2010-01-01

    Background: Graphical probabilistic models have the ability to provide insights as to how clinical factors are conditionally related. These models can be used to help us understand factors influencing health care outcomes and resource utilization, and to estimate morbidity and clinical outcomes in trauma patient populations. Study design: Thirty-two combat casualties with severe extremity injuries enrolled in a prospective observational study were analyzed using step-wise machine-learned Bayesian belief network (BBN) and step-wise logistic regression (LR). Models were evaluated using 10-fold cross-validation to calculate area-under-the-curve (AUC) from receiver operating characteristics (ROC) curves. Results: Our BBN showed important associations between various factors in our data set that could not be developed using standard regression methods. Cross-validated ROC curve analysis showed that our BBN model was a robust representation of our data domain and that LR models trained on these findings were also robust: hospital-acquired infection (AUC: LR, 0.81; BBN, 0.79), intensive care unit length of stay (AUC: LR, 0.97; BBN, 0.81), and wound healing (AUC: LR, 0.91; BBN, 0.72) showed strong AUC. Conclusions: A BBN model can effectively represent clinical outcomes and biomarkers in patients hospitalized after severe wounding, and is confirmed by 10-fold cross-validation and further confirmed through logistic regression modeling. The method warrants further development and independent validation in other, more diverse patient populations. PMID:21197361

  20. Cloud computing and validation of expandable in silico livers

    PubMed Central

    2010-01-01

    Background: In Silico Livers (ISLs) are works in progress. They are used to challenge multilevel, multi-attribute, mechanistic hypotheses about the hepatic disposition of xenobiotics coupled with hepatic responses. To enhance ISL-to-liver mappings, we added discrete time metabolism, biliary elimination, and bolus dosing features to a previously validated ISL and initiated re-validation experiments that required scaling the simulations to use more simulated lobules than previously, more than could be achieved using the local cluster technology. Rather than dramatically increasing the size of our local cluster, we undertook the re-validation experiments using the Amazon EC2 cloud platform. So doing required demonstrating the efficacy of scaling a simulation to use more cluster nodes and assessing the scientific equivalence of local cluster validation experiments with those executed using the cloud platform. Results: The local cluster technology was duplicated in the Amazon EC2 cloud platform. Synthetic modeling protocols were followed to identify a successful parameterization. Experiment sample sizes (number of simulated lobules) on both platforms were 49, 70, 84, and 152 (cloud only). Experimental indistinguishability was demonstrated for ISL outflow profiles of diltiazem using both platforms for experiments consisting of 84 or more samples. The process was analogous to demonstration of results equivalency from two different wet-labs. Conclusions: The results provide additional evidence that disposition simulations using ISLs can cover the behavior space of liver experiments in distinct experimental contexts (there is in silico-to-wet-lab phenotype similarity). The scientific value of experimenting with multiscale biomedical models has been limited to research groups with access to computer clusters. The availability of cloud technology coupled with the evidence of scientific equivalency has lowered the barrier and will greatly facilitate model sharing as well as provide straightforward tools for scaling simulations to encompass greater detail with no extra investment in hardware. PMID:21129207

  1. Control system design for the large space systems technology reference platform

    NASA Technical Reports Server (NTRS)

    Edmunds, R. S.

    1982-01-01

    Structural models and classical frequency domain control system designs were developed for the large space systems technology (LSST) reference platform which consists of a central bus structure, solar panels, and platform arms on which a variety of experiments may be mounted. It is shown that operation of multiple independently articulated payloads on a single platform presents major problems when subarc second pointing stability is required. Experiment compatibility will be an important operational consideration for systems of this type.

  2. Validation of the iPhone app using the force platform to estimate vertical jump height.

    PubMed

    Carlos-Vivas, Jorge; Martin-Martinez, Juan P; Hernandez-Mocholi, Miguel A; Perez-Gomez, Jorge

    2018-03-01

    Vertical jump performance has been evaluated with several devices: force platforms, contact mats, the Vertec, accelerometers, infrared cameras and high-velocity cameras; however, the force platform is considered the gold standard for measuring vertical jump height. The purpose of this study was to validate an iPhone app called My Jump, which measures vertical jump height, by comparing it with other methods that use the force platform to estimate vertical jump height, namely, vertical velocity at take-off and time in the air. A total of 40 sport sciences students (age 21.4±1.9 years) completed five countermovement jumps (CMJs) over a force platform. Thus, 200 CMJ heights were evaluated from the vertical velocity at take-off and the time in the air using the force platform, and from the time in the air with the My Jump mobile application. The heights obtained were compared using the intraclass correlation coefficient (ICC). Correlation between the app and the force platform using the time in the air was perfect (ICC=1.000, P<0.001). Correlation between the app and the force platform using the vertical velocity at take-off was also very high (ICC=0.996, P<0.001), with an error margin of 0.78%. Therefore, these results show that the application, My Jump, is an appropriate method to evaluate vertical jump performance; however, vertical jump height is slightly overestimated compared with that of the force platform.
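
    Both force-platform estimates follow from projectile motion: from flight time t, h = g t^2 / 8, and from take-off velocity v, h = v^2 / (2g). The sketch below checks that the two formulas agree for a sample jump; the numbers are illustrative.

      # The two vertical-jump-height estimates used with force platforms.
      G = 9.81  # m/s^2

      def height_from_flight_time(t):
          """h = g*t^2/8, with t the time in the air (s)."""
          return G * t**2 / 8

      def height_from_takeoff_velocity(v):
          """h = v^2/(2g), with v the vertical velocity at take-off (m/s)."""
          return v**2 / (2 * G)

      t = 0.55                                       # flight time, seconds
      print(f"{height_from_flight_time(t):.3f} m")   # 0.371 m
      # For the same jump, v = g*t/2, so both methods give the same height:
      print(f"{height_from_takeoff_velocity(G * t / 2):.3f} m")   # 0.371 m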

  3. Counter tunnel exploration, mapping, and localization with an unmanned ground vehicle

    NASA Astrophysics Data System (ADS)

    Larson, Jacoby; Okorn, Brian; Pastore, Tracy; Hooper, David; Edwards, Jim

    2014-06-01

    Covert, cross-border tunnels are a security vulnerability that enables people and contraband to illegally enter the United States. All of these tunnels to date have been constructed for the purpose of drug smuggling, but they may also be used to support terrorist activity. Past robotic tunnel exploration efforts have had limited success in aiding law enforcement to explore and map suspect cross-border tunnels. These efforts have made use of adapted explosive ordnance disposal (EOD) or pipe inspection robotic systems that are not ideally suited to the cross-border tunnel environment. The Counter Tunnel project was sponsored by the Office of the Secretary of Defense (OSD) Joint Ground Robotics Enterprise (JGRE) to develop a prototype robotic system for counter-tunnel operations, focusing on exploration, mapping, and characterization of tunnels. The purpose of this system is to provide a safe and effective solution for three-dimensional (3D) localization, mapping, and characterization of a tunnel environment. The system is composed of the robotic mobility platform, the mapping sensor payload, and the delivery apparatus. The system is able to deploy and retrieve the robotic mobility platform through a 20-cm-diameter borehole into the tunnel. This requirement posed many challenges in designing and packaging the sensor and robotic system to fit through this narrow opening while still being able to perform the mission. This paper provides a short description of a few aspects of the Counter Tunnel system, such as mobility, perception, and localization, which were developed to meet the unique challenges of accessing, exploring, and mapping tunnel environments.

  4. Novel laser communications transceiver with internal gimbal-less pointing and tracking

    NASA Astrophysics Data System (ADS)

    Chalfant, Charles H., III; Orlando, Fred J., Jr.; Gregory, Jeff T.; Sulham, Clifford; O'Neal, Chad B.; Taylor, Geoffrey W.; Craig, Douglas M.; Foshee, James J.; Lovett, J. Timothy

    2002-12-01

    This paper describes a novel laser communications transceiver for use in multi-platform satellite networks or clusters that provides an internal pointing and tracking technique, allowing static mounting of the transceiver subsystems and minimal use of mechanical stabilization techniques. This eliminates the need for the large, power-hungry mechanical gimbals that are required for laser cross-link pointing, acquisition and tracking. The miniature transceiver is designed for the pointing accuracies required for satellite cross-link distances of between 500 meters and 5000 meters. Specifically, the designs target the Air Force Research Lab's TechSat21 Program, although alternative transceiver configurations can provide for much greater link distances and other satellite systems. The receiver and transmitter are connected via fiber optic cabling from a separate electronics subsystem containing the optoelectronics PCBs, thereby eliminating active optoelectronic elements from the transceiver's mechanical housing. The internal acquisition and tracking capability is provided by an advanced micro-electro-mechanical system (MEMS) and an optical design that provides a specific field-of-view based on the satellite cluster's interface specifications. The acquisition and tracking control electronics will utilize conventional closed-loop tracking techniques. The link optical power budget and optoelectronics designs allow the use of transmitter sources with output powers of near 100 mW. The transceiver will provide data rates of up to 2.5 Gbps and operate at either 1310 nm or 1550 nm. In addition to space-based satellite-to-satellite cross-links, we are planning to develop a broad range of applications including air-to-air communications between highly mobile airborne platforms and terrestrial fixed point-to-point communications.

  5. Compact Modbus TCP/IP protocol for data acquisition systems based on limited hardware resources

    NASA Astrophysics Data System (ADS)

    Bai, Q.; Jin, B.; Wang, D.; Wang, Y.; Liu, X.

    2018-04-01

    Modbus TCP/IP has become a standard industrial communication protocol and is widely utilized for establishing sensor-cloud platforms on the Internet. However, numerous existing data acquisition systems built on traditional single-chip microcontrollers without sufficient resources cannot support it, because the complete Modbus TCP/IP protocol depends on a full operating system, which occupies abundant hardware resources. Hence, a compact Modbus TCP/IP protocol is proposed in this work to make it run efficiently and stably even on a resource-limited hardware platform. Firstly, the Modbus TCP/IP protocol stack is analyzed and a refined protocol suite is rebuilt by streamlining the typical TCP/IP suite. Then, the specific implementation of every hierarchical layer is presented in detail according to the protocol structure. Besides, the compact protocol is implemented on a traditional microprocessor to validate the feasibility of the scheme. Finally, the performance of the proposed scenario is assessed. The experimental results demonstrate that message packets match the frame format of the Modbus TCP/IP protocol and the average bandwidth reaches 1.15 Mbps. The compact protocol operates stably even on a traditional microcontroller with only 4 kB of RAM and a 12-MHz system clock, and no communication congestion or frequent packet loss occurs.
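
    To make the layering concrete, the sketch below builds a minimal Modbus TCP request frame (MBAP header followed by a PDU) for function 0x03, Read Holding Registers; the parameter values are examples.

      # Minimal Modbus TCP request: MBAP header + PDU.
      import struct

      def read_holding_registers(transaction_id, unit_id, start_addr, count):
          pdu = struct.pack(">BHH", 0x03, start_addr, count)  # func, addr, qty
          mbap = struct.pack(">HHHB",
                             transaction_id,   # echoed back by the server
                             0x0000,           # protocol id: always 0 for Modbus
                             len(pdu) + 1,     # bytes that follow: unit id + PDU
                             unit_id)
          return mbap + pdu

      frame = read_holding_registers(transaction_id=1, unit_id=1, start_addr=0, count=2)
      print(frame.hex())   # 000100000006010300000002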

  6. Geostationary platform systems concepts definition study. Volume 1: Executive summary

    NASA Technical Reports Server (NTRS)

    1980-01-01

    The results of a geostationary platform concept analysis are summarized. Mission and payloads definition, concept selection, the requirements of an experimental platform, supporting research and technology, and the Space Transportation System interface requirements are addressed. It is concluded that platforms represent a logical extension of current trends toward larger, more complex, multifrequency satellites. Geostationary platforms offer significant cost savings compared to individual satellites, with the majority of these economies being realized with single Shuttle launched platforms. Further cost savings can be realized, however, by having larger platforms. Platforms accommodating communications equipment that operates at multiple frequencies and which provide larger scale frequency reuse through the use of large aperture multibeam antennas and onboard switching maximize the useful capacity of the orbital arc and frequency spectrum. Projections of market demand indicate that such conservation measures are clearly essential if orderly growth is to be provided for. In addition, it is pointed out that a NASA experimental platform is required to demonstrate the technologies necessary for operational geostationary platforms of the 1990's.

  7. The 1980 Large space systems technology. Volume 2: Base technology

    NASA Technical Reports Server (NTRS)

    Kopriver, F., III (Compiler)

    1981-01-01

    Technology pertinent to large antenna systems, technology related to large space platform systems, and base technology applicable to both antenna and platform systems are discussed. Design studies, structural testing results, and theoretical applications are presented with accompanying validation data. A total systems approach including controls, platforms, and antennas is presented as a cohesive, programmatic plan for large space systems.

  8. Polarization Observations with the Cosmic Background Imager

    NASA Astrophysics Data System (ADS)

    Cartwright, J. K.; Padin, S.; Pearson, T. J.; Readhead, A. C. S.; Shepherd, M. C.; Taylor, G. B.

    2001-05-01

    We describe polarization observations of the CMBR with the Cosmic Background Imager, a 13-element interferometer which operates in the 26-36 GHz band from a site at 5000 m in northern Chile. The array consists of 90-cm Cassegrain antennas mounted on a single, fully steerable platform; this platform can be rotated about the optical axis to facilitate polarization observations. The CBI employs single-mode circularly polarized receivers, of which 12 are configured for LCP and one is configured for RCP. The 12 cross-polarized baselines sample multipoles from l ≈ 600 to l ≈ 3500. The instrumental polarization of the CBI was calibrated with observations of 3C279, a bright polarized source which is unresolved by the CBI. Because the centimeter flux of 3C279 is variable, it was monitored twice per month for 8 months in 2000 with the VLA at 22 and 43 GHz. These observations also established the stability of the polarization characteristics of the CBI. This work was made possible by NSF grant AST-9802989.

  9. YAG aerosol lidar

    NASA Technical Reports Server (NTRS)

    Sullivan, R.

    1988-01-01

    The Global Atmospheric Backscatter Experiment (GLOBE) Mission, using the NASA DC-8 aircraft platform, is designed to provide the magnitude and statistical distribution of atmospheric backscatter cross section at lidar operating wavelengths. This is a fundamental parameter required for the Doppler lidar proposed for use on a spacecraft platform for global wind field measurements. The prime measurements will be made by a CO2 lidar instrument in the 9 to 10 micron range. These measurements will be complemented with Goddard YAG Aerosol Lidar (YAL) data at two wavelengths, 0.532 and 1.06 micron, in the visible and near-infrared. The YAL is being designed to utilize as much existing hardware as feasible, to minimize cost and reduce implementation time. The laser, energy monitor, telescope and detector package will be mounted on an optical breadboard. The optical breadboard is mounted through isolation mounts between two low-boy racks. The detector package will utilize a photomultiplier tube for the 0.532 micron channel and a silicon avalanche photodetector (APD) for the 1.06 micron channel.

  10. StreptomycesInforSys: A web-enabled information repository

    PubMed Central

    Jain, Chakresh Kumar; Gupta, Vidhi; Gupta, Ashvarya; Gupta, Sanjay; Wadhwa, Gulshan; Sharma, Sanjeev Kumar; Sarethy, Indira P

    2012-01-01

    Members of Streptomyces produce 70% of natural bioactive products. There is a considerable amount of information available, based on a polyphasic approach, for the classification of Streptomyces. This information, based on phenotypic, genotypic and bioactive component production profiles, is crucial for pharmacological screening programmes, yet it is scattered across various journals, books and other resources, many of which are not freely accessible. The designed database incorporates polyphasic typing information using combinations of search options to aid in efficient screening of new isolates. This will help in the preliminary categorization of isolates into appropriate groups. It is a free relational database compatible with existing operating systems. A cross-platform technology with the XAMPP Web server has been used to develop and manage the database and to handle user queries effectively. Employment of PHP, a platform-independent scripting language embedded in HTML, and the database management software MySQL facilitates dynamic information storage and retrieval. The user-friendly, open and flexible freeware (PHP, MySQL and Apache) is foreseen to reduce running and maintenance costs. Availability: www.sis.biowaves.org PMID:23275736

  11. StreptomycesInforSys: A web-enabled information repository.

    PubMed

    Jain, Chakresh Kumar; Gupta, Vidhi; Gupta, Ashvarya; Gupta, Sanjay; Wadhwa, Gulshan; Sharma, Sanjeev Kumar; Sarethy, Indira P

    2012-01-01

    Members of Streptomyces produce 70% of natural bioactive products. There is a considerable amount of information available, based on a polyphasic approach, for the classification of Streptomyces. This information, based on phenotypic, genotypic and bioactive component production profiles, is crucial for pharmacological screening programmes, yet it is scattered across various journals, books and other resources, many of which are not freely accessible. The designed database incorporates polyphasic typing information using combinations of search options to aid in efficient screening of new isolates. This will help in the preliminary categorization of isolates into appropriate groups. It is a free relational database compatible with existing operating systems. A cross-platform technology with the XAMPP Web server has been used to develop and manage the database and to handle user queries effectively. Employment of PHP, a platform-independent scripting language embedded in HTML, and the database management software MySQL facilitates dynamic information storage and retrieval. The user-friendly, open and flexible freeware (PHP, MySQL and Apache) is foreseen to reduce running and maintenance costs. www.sis.biowaves.org.

  12. A Vision of Quantitative Imaging Technology for Validation of Advanced Flight Technologies

    NASA Technical Reports Server (NTRS)

    Horvath, Thomas J.; Kerns, Robert V.; Jones, Kenneth M.; Grinstead, Jay H.; Schwartz, Richard J.; Gibson, David M.; Taylor, Jeff C.; Tack, Steve; Dantowitz, Ronald F.

    2011-01-01

    Flight-testing is traditionally an expensive but critical element in the development and ultimate validation and certification of technologies destined for future operational capabilities. Measurements obtained in relevant flight environments also provide unique opportunities to observe flow phenomenon that are often beyond the capabilities of ground testing facilities and computational tools to simulate or duplicate. However, the challenges of minimizing vehicle weight and internal complexity as well as instrumentation bandwidth limitations often restrict the ability to make high-density, in-situ measurements with discrete sensors. Remote imaging offers a potential opportunity to noninvasively obtain such flight data in a complementary fashion. The NASA Hypersonic Thermodynamic Infrared Measurements Project has demonstrated such a capability to obtain calibrated thermal imagery on a hypersonic vehicle in flight. Through the application of existing and accessible technologies, the acreage surface temperature of the Shuttle lower surface was measured during reentry. Future hypersonic cruise vehicles, launcher configurations and reentry vehicles will, however, challenge current remote imaging capability. As NASA embarks on the design and deployment of a new Space Launch System architecture for access beyond earth orbit (and the commercial sector focused on low earth orbit), an opportunity exists to implement an imagery system and its supporting infrastructure that provides sufficient flexibility to incorporate changing technology to address the future needs of the flight test community. A long term vision is offered that supports the application of advanced multi-waveband sensing technology to aid in the development of future aerospace systems and critical technologies to enable highly responsive vehicle operations across the aerospace continuum, spanning launch, reusable space access and global reach. Motivations for development of an Agency level imagery-based measurement capability to support cross cutting applications that span the Agency mission directorates as well as meeting potential needs of the commercial sector and national interests of the Intelligence, Surveillance and Reconnaissance community are explored. A recommendation is made for an assessment study to baseline current imaging technology including the identification of future mission requirements. Development of requirements fostered by the applications suggested in this paper would be used to identify technology gaps and direct roadmapping for implementation of an affordable and sustainable next generation sensor/platform system.

  13. Design of shared instruments to utilize simulated gravities generated by a large-gradient, high-field superconducting magnet.

    PubMed

    Wang, Y; Yin, D C; Liu, Y M; Shi, J Z; Lu, H M; Shi, Z H; Qian, A R; Shang, P

    2011-03-01

    A high-field superconducting magnet can provide both high magnetic fields and large field gradients, which can be used as a special environment for research or practical applications in materials processing, life science studies, physical and chemical reactions, etc. To make full use of a superconducting magnet, shared instruments (an operating platform, sample holders, a temperature controller and an observation system) must be prepared as prerequisites. This paper introduces the design of a set of sample holders and a temperature controller in detail, with an emphasis on validating the performance of the force and temperature sensors in the high magnetic field.

  14. Design of shared instruments to utilize simulated gravities generated by a large-gradient, high-field superconducting magnet

    NASA Astrophysics Data System (ADS)

    Wang, Y.; Yin, D. C.; Liu, Y. M.; Shi, J. Z.; Lu, H. M.; Shi, Z. H.; Qian, A. R.; Shang, P.

    2011-03-01

    A high-field superconducting magnet can provide both high magnetic fields and large field gradients, which can be used as a special environment for research or practical applications in materials processing, life science studies, physical and chemical reactions, etc. To make full use of a superconducting magnet, shared instruments (an operating platform, sample holders, a temperature controller and an observation system) must be prepared as prerequisites. This paper introduces the design of a set of sample holders and a temperature controller in detail, with an emphasis on validating the performance of the force and temperature sensors in the high magnetic field.

  15. Teachable, high-content analytics for live-cell, phase contrast movies.

    PubMed

    Alworth, Samuel V; Watanabe, Hirotada; Lee, James S J

    2010-09-01

    CL-Quant is a new solution platform for broad, high-content, live-cell image analysis. Powered by novel machine learning technologies and teach-by-example interfaces, CL-Quant provides a platform for the rapid development and application of scalable, high-performance, and fully automated analytics for a broad range of live-cell microscopy imaging applications, including label-free phase contrast imaging. The authors used CL-Quant to teach off-the-shelf universal analytics, called standard recipes, for cell proliferation, wound healing, cell counting, and cell motility assays, using phase contrast movies collected on the BioStation CT and BioStation IM platforms. Similar to application modules, standard recipes are intended to work robustly across a wide range of imaging conditions without requiring customization by the end user. The authors validated the standard recipes by comparing their performance with truth created manually, or by custom analytics optimized for each individual movie (and therefore yielding the best possible result for that image) and validated by independent review. The validation data show that the standard recipes' performance is comparable to the validated truth, with low variation, confirming that the CL-Quant standard recipes can provide robust results without customization for live-cell assays across broad cell types and laboratory settings.
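
    A minimal sketch, on invented counts, of the truth comparison described here: automated results from a standard recipe scored movie by movie against manually created truth.

        import numpy as np

        # Hypothetical per-movie cell counts: manual truth vs. automated recipe.
        truth     = np.array([112, 340,  87, 205, 158])
        automated = np.array([108, 352,  90, 198, 161])

        rel_err = np.abs(automated - truth) / truth
        print(f"mean relative error: {rel_err.mean():.3f}")   # overall accuracy
        print(f"std of relative error: {rel_err.std():.3f}")  # variation across movies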

  16. BiofOmics: a Web platform for the systematic and standardized collection of high-throughput biofilm data.

    PubMed

    Lourenço, Anália; Ferreira, Andreia; Veiga, Nuno; Machado, Idalina; Pereira, Maria Olivia; Azevedo, Nuno F

    2012-01-01

    Consortia of microorganisms, commonly known as biofilms, are attracting much attention from the scientific community due to their impact on human activity. As biofilm research grows into a data-intensive discipline, the need for suitable bioinformatics approaches becomes compelling, both to manage and validate individual experiments and to execute inter-laboratory large-scale comparisons. However, biofilm data are spread across ad hoc, non-standardized individual files and, thus, data interchange among researchers, or any attempt at cross-laboratory experimentation or analysis, is rarely possible or even attempted. This paper presents BiofOmics, the first publicly accessible Web platform specialized in the management and analysis of data derived from biofilm high-throughput studies. The aim is to promote data interchange across laboratories, implement collaborative experiments, and enable the development of bioinformatics tools supporting the processing and analysis of the increasing volumes of experimental biofilm data being generated. BiofOmics' data deposition facility enforces data structuring and standardization, supported by controlled vocabulary. Researchers are responsible for the description of their experiments, results and conclusions. BiofOmics' curators interact with submitters only to enforce data structuring and the use of controlled vocabulary. BiofOmics' search facility then makes the profile and data associated with a submitted study publicly available, so that any researcher can profit from these standardization efforts to compare similar studies, generate new hypotheses to be tested, or extend the conditions tested in the study. BiofOmics' novelty lies in its support for standardized data deposition, the availability of computerizable data files and the free-of-charge dissemination of biofilm studies across the community. Hopefully, this will open promising research possibilities, namely the comparison of results between different laboratories, the reproducibility of methods within and between laboratories, and the development of guidelines and standardized protocols for biofilm formation operating procedures and analytical methods.
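
    The controlled-vocabulary enforcement described above amounts to a field-by-field membership check at deposition time; a sketch follows, with field names and terms invented for illustration rather than taken from BiofOmics.

        # Hypothetical controlled vocabulary for a biofilm experiment record.
        VOCAB = {
            "organism":       {"P. aeruginosa", "S. epidermidis", "C. albicans"},
            "surface":        {"polystyrene", "silicone", "stainless steel"},
            "quantification": {"crystal violet", "CFU counting", "XTT"},
        }

        def validate_record(record):
            """Return the (field, value) pairs that violate the vocabulary."""
            return [(f, record.get(f)) for f in VOCAB
                    if record.get(f) not in VOCAB[f]]

        rec = {"organism": "P. aeruginosa", "surface": "glass",
               "quantification": "crystal violet"}
        print(validate_record(rec))  # [('surface', 'glass')] -> curator follow-up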

  17. A new stratospheric sounding platform based on unmanned aerial vehicle (UAV) droppable from meteorological balloon

    NASA Astrophysics Data System (ADS)

    Efremov, Denis; Khaykin, Sergey; Lykov, Alexey; Berezhko, Yaroslav; Lunin, Aleksey

    High-resolution measurements of climate-relevant trace gases and aerosols in the upper troposphere and stratosphere (UTS) have been and remain technically challenging. The high cost of measurements onboard airborne platforms or heavy stratospheric balloons results in a lack of accurate information on the vertical distribution of atmospheric constituents. Whereas light-weight instruments carried by meteorological balloons are becoming progressively available, their usage is constrained by the cost of the equipment or of the recovery operations. The evolving need for cost-efficient observations for UTS process studies has led to the development of small airborne platforms - unmanned aerial vehicles (UAV) - capable of carrying small sensors for in-situ measurements. We present a new UAV-based stratospheric sounding platform capable of carrying a scientific payload of up to 2 kg. The airborne platform comprises a latex meteorological balloon and a detachable flying-wing UAV with an internal measurement controller. The UAV is launched on a balloon to stratospheric altitudes of up to 20 km, where it can be released automatically by the autopilot or by a remote command sent from ground control. Having been released from the balloon, the UAV glides down and returns to the launch position. The autopilot, using a 3-axis gyro, accelerometer, barometer, compass and GPS navigation, provides flight stabilization and an optimal return trajectory. Backup manual control is provided for emergencies. During the flight, the onboard measurement controller stores the data in internal memory and transmits current flight parameters to the ground station via telemetry. Precise operation of the flight control systems ensures safe landing at the launch point. A series of field tests of the detachable stratospheric UAV has been conducted. The scientific payload included the following instruments, flown on different flights: a) the stratospheric Lyman-alpha hygrometer (FLASH); b) a backscatter sonde; c) an electrochemical ozone sonde; d) an optical CO2 sensor; e) a radioactivity sensor; f) a solar radiation sensor. In addition, each payload included a temperature sensor, a barometric sensor and a GPS receiver. Design features of the measurement systems onboard the UAV and flight results are presented. Possible applications for atmospheric studies and for validation of remote ground-based and space-borne observations are discussed.

  18. High-resolution velocimetry in energetic tidal currents using a convergent-beam acoustic Doppler profiler

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sellar, Brian; Harding, Samuel F.; Richmond, Marshall C.

    An array of convergent acoustic Doppler velocimeters has been developed and tested for high-resolution measurement of three-dimensional tidal flow velocities at an energetic tidal site. This configuration was developed to increase the spatial resolution of velocity measurements in comparison to conventional acoustic Doppler profilers (ADPs), which characteristically use diverging acoustic beams emanating from a single instrument. This is achieved using converging acoustic beams with a sample volume at the focal point of 0.03 m³. The array is also able to simultaneously measure three-dimensional velocity components in a profile throughout the water column, and as such is referred to herein as a converging-beam acoustic Doppler profiler (CADP). Mid-depth profiling is achieved through integration of the sensor platform with the operational Alstom 1 MW DeepGen-IV Tidal Turbine. This proof-of-concept paper outlines the system configuration and a comparison to measurements provided by co-installed reference instrumentation. Comparison of CADP to standard ADP velocity measurements reveals a mean difference of 8 mm/s, a standard deviation of 18 mm/s, and an order-of-magnitude reduction in realizable length scale. CADP focal-point measurements compared to a proximal single-beam reference show a peak cross-correlation coefficient of 0.96 over a 4.0 s averaging period and a 47% reduction in Doppler noise. The dual functionality of the CADP as a profiling instrument with a high-resolution focal point makes this configuration a unique and valuable advancement in underwater velocimetry, enabling improved turbulence, resource and structural-loading quantification and validation of numerical simulations. Alternative modes of operation have been implemented, including noise-reducing bi-static sampling. Since waves are simultaneously measured, it is expected that derivatives of this system will be a powerful tool in wave-current interaction studies.
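
    The comparison statistics quoted here can be reproduced in kind on synthetic series (the turbine data themselves are not public): a mean difference and its spread between the two profilers, and a peak normalized cross-correlation for the focal-point check.

        import numpy as np

        rng = np.random.default_rng(0)
        t = np.arange(0, 600, 0.25)                 # 4 Hz samples, 10 min
        u = 2.0 + 0.3 * np.sin(2 * np.pi * t / 60)  # synthetic tidal velocity, m/s

        cadp = u + rng.normal(0, 0.02, t.size)      # converging-beam estimate
        adp  = u + rng.normal(0, 0.05, t.size)      # diverging-beam estimate

        diff = cadp - adp
        print(f"mean difference: {diff.mean() * 1000:.1f} mm/s, "
              f"std: {diff.std() * 1000:.1f} mm/s")

        # Peak normalized cross-correlation, as used for the single-beam check.
        a = (cadp - cadp.mean()) / cadp.std()
        b = (adp - adp.mean()) / adp.std()
        xcorr = np.correlate(a, b, mode="full") / t.size
        print(f"peak cross-correlation: {xcorr.max():.2f}")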

  19. PSI-Center Simulations of Validation Platform Experiments

    NASA Astrophysics Data System (ADS)

    Nelson, B. A.; Akcay, C.; Glasser, A. H.; Hansen, C. J.; Jarboe, T. R.; Marklin, G. J.; Milroy, R. D.; Morgan, K. D.; Norgaard, P. C.; Shumlak, U.; Victor, B. S.; Sovinec, C. R.; O'Bryan, J. B.; Held, E. D.; Ji, J.-Y.; Lukin, V. S.

    2013-10-01

    The Plasma Science and Innovation Center (PSI-Center - http://www.psicenter.org) supports collaborating validation platform experiments with extended MHD simulations. Collaborators include the Bellan Plasma Group (Caltech), CTH (Auburn U), FRX-L (Los Alamos National Laboratory), HIT-SI (U Wash - UW), LTX (PPPL), MAST (Culham), Pegasus (U Wisc-Madison), PHD/ELF (UW/MSNW), SSX (Swarthmore College), TCSU (UW), and ZaP/ZaP-HD (UW). Modifications have been made to the NIMROD, HiFi, and PSI-Tet codes to specifically model these experiments, including mesh generation/refinement, non-local closures, appropriate boundary conditions (external fields, insulating BCs, etc.), and kinetic and neutral particle interactions. The PSI-Center is exploring the application of validation metrics between experimental data and simulation results. Biorthogonal decomposition is proving to be a powerful method for comparing global temporal and spatial structures for validation. Results from these simulation and validation studies, as well as an overview of PSI-Center status, will be presented.
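
    In practice, the biorthogonal decomposition mentioned here reduces to a singular value decomposition of the space-by-time data matrix; a minimal sketch on synthetic diagnostic data:

        import numpy as np

        rng = np.random.default_rng(1)
        nx, nt = 64, 200
        x = np.linspace(0, 2 * np.pi, nx)[:, None]
        t = np.linspace(0, 10, nt)[None, :]

        # Two coherent space-time structures plus measurement noise.
        data = (np.sin(x) * np.cos(3 * t)
                + 0.5 * np.sin(2 * x) * np.sin(5 * t)
                + 0.05 * rng.normal(size=(nx, nt)))

        # Spatial modes ("topos", U), temporal modes ("chronos", Vt), weights (s);
        # comparing the leading pairs between experiment and simulation gives a
        # global validation metric.
        U, s, Vt = np.linalg.svd(data, full_matrices=False)
        energy = s**2 / np.sum(s**2)
        print(f"energy in first two modes: {energy[:2].sum():.3f}")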

  20. Understanding health-related quality of life in caregivers of civilians and service members/veterans with traumatic brain injury: Establishing the reliability and validity of PROMIS Fatigue and Sleep Disturbance item banks.

    PubMed

    Carlozzi, Noelle E; Ianni, Phillip A; Tulsky, David S; Brickell, Tracey A; Lange, Rael T; French, Louis M; Cella, David; Kallen, Michael A; Miner, Jennifer A; Kratz, Anna L

    2018-06-19

    To examine the reliability and validity of Patient-Reported Outcomes Measurement Information System (PROMIS) measures of sleep disturbance and fatigue in TBI caregivers, and to determine the severity of fatigue and sleep disturbance in these caregivers. Design: cross-sectional survey data collected through an online data-capture platform. Setting: four rehabilitation hospitals and Walter Reed National Military Medical Center. Participants: caregivers (N=560) of civilians (n=344) and service members/veterans (n=216) with TBI. Interventions: not applicable. Main outcome measures: PROMIS sleep and fatigue measures administered as both computerized adaptive tests (CATs) and 4-item short forms (SFs). For both samples, floor and ceiling effects for the PROMIS measures were low (<11%), internal consistency was very good (all alphas ≥0.80), and test-retest reliability was acceptable (all r≥0.70, except the fatigue CAT in the service member/veteran sample, r=0.63). Convergent validity was supported by moderate correlations between the PROMIS and related measures. Discriminant validity was supported by low correlations between PROMIS measures and measures of dissimilar constructs. PROMIS scores indicated significantly worse sleep and fatigue for those caring for someone with high versus low levels of impairment. Findings support the reliability and validity of the PROMIS CAT and SF measures of sleep disturbance and fatigue in caregivers of civilians and service members/veterans with TBI. Copyright © 2018. Published by Elsevier Inc.
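
    For reference, the internal-consistency figure reported here (Cronbach's alpha) for a 4-item short form is computed directly from the item-score matrix; the data below are simulated, not the study's.

        import numpy as np

        def cronbach_alpha(items):
            """items: (n_respondents x k_items) score matrix."""
            k = items.shape[1]
            item_vars = items.var(axis=0, ddof=1).sum()
            total_var = items.sum(axis=1).var(ddof=1)
            return k / (k - 1) * (1 - item_vars / total_var)

        rng = np.random.default_rng(2)
        latent = rng.normal(size=(500, 1))  # simulated caregiver fatigue level
        scores = np.clip(np.round(3 + latent + rng.normal(0, 0.7, (500, 4))), 1, 5)
        print(f"alpha = {cronbach_alpha(scores):.2f}")  # high, consistent with >= 0.80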

  1. The QCDOC Project

    NASA Astrophysics Data System (ADS)

    Boyle, P.; Chen, D.; Christ, N.; Clark, M.; Cohen, S.; Cristian, C.; Dong, Z.; Gara, A.; Joo, B.; Jung, C.; Kim, C.; Levkova, L.; Liao, X.; Liu, G.; Li, S.; Lin, H.; Mawhinney, R.; Ohta, S.; Petrov, K.; Wettig, T.; Yamaguchi, A.

    2005-03-01

    The QCDOC project has developed a supercomputer optimised for the needs of Lattice QCD simulations. It provides a very competitive price-to-sustained-performance ratio of around 1 US dollar per sustained megaflop/s, combined with outstanding scalability. Thus very large systems delivering over 5 TFlop/s of performance on the evolution of a single lattice are possible. Large prototypes have been built and are functioning correctly. The software environment raises the state of the art in such custom supercomputers. It is based on a lean custom node operating system that eliminates many of the unnecessary overheads that plague other systems. Despite its custom nature, the operating system implements a standards-compliant UNIX-like programming environment, easing the porting of software from other systems. The SciDAC QMP interface adds internode communication in a fashion that provides a uniform cross-platform programming environment.

  2. A remark on copy number variation detection methods.

    PubMed

    Li, Shuo; Dou, Xialiang; Gao, Ruiqi; Ge, Xinzhou; Qian, Minping; Wan, Lin

    2018-01-01

    Copy number variations (CNVs) are gains and losses of DNA sequence in a genome. High-throughput platforms such as microarrays and next-generation sequencing (NGS) technologies have been applied to genome-wide detection of copy number losses. Although progress has been made with both approaches, the accuracy and consistency of CNV calls from the two platforms remain in dispute. In this study, we perform a deep analysis of copy number losses in 254 human DNA samples for which both SNP microarray data and NGS data are publicly available, from the HapMap Project and the 1000 Genomes Project, respectively. We show that the copy number losses reported by the HapMap Project and the 1000 Genomes Project have less than 30% overlap, even though both projects required cross-platform (e.g., PCR, microarray and high-throughput sequencing) experimental support and employed state-of-the-art calling methods. On the other hand, when copy number losses are called directly from HapMap microarray data by an accurate algorithm, CNVhac, almost all calls show lower read-mapping depth in the NGS data, and 88% of them are supported by sequences with breakpoints in the NGS data. Our results demonstrate the ability of microarrays to call CNVs and suggest that the inessential requirement of additional cross-platform support may introduce false negatives. The inconsistency of CNV reports between the HapMap Project and the 1000 Genomes Project might result from the limited information contained in microarray data, inconsistent detection criteria, or the filtering effect of cross-platform support. Statistical tests on the CNVs called by CNVhac show that microarray data can yield reliable CNV reports, and the majority of CNV candidates can be confirmed by raw sequences. Therefore, CNV candidates given by a good caller can be highly reliable without cross-platform support, and additional experimental confirmation should be sought as needed rather than required universally.
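
    Overlap figures such as the <30% quoted here depend on how agreement between two call sets is defined; a common convention is 50% reciprocal overlap, sketched below on hypothetical calls.

        def reciprocal_overlap(a, b, frac=0.5):
            """True if intervals a = (start, end) and b overlap by >= frac of each."""
            inter = min(a[1], b[1]) - max(a[0], b[0])
            return inter >= frac * (a[1] - a[0]) and inter >= frac * (b[1] - b[0])

        microarray_calls = [(10_000, 25_000), (40_000, 42_000)]
        sequencing_calls = [(12_000, 26_000), (80_000, 90_000)]

        matched = [a for a in microarray_calls
                   if any(reciprocal_overlap(a, b) for b in sequencing_calls)]
        print(f"{len(matched)}/{len(microarray_calls)} microarray calls confirmed")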

  3. 30 CFR 250.406 - What additional safety measures must I take when I conduct drilling operations on a platform that...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... OFFSHORE OIL AND GAS AND SULPHUR OPERATIONS IN THE OUTER CONTINENTAL SHELF Oil and Gas Drilling Operations General Requirements § 250.406 What additional safety measures must I take when I conduct drilling... when I conduct drilling operations on a platform that has producing wells or has other hydrocarbon flow...

  4. 30 CFR 250.406 - What additional safety measures must I take when I conduct drilling operations on a platform that...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... when I conduct drilling operations on a platform that has producing wells or has other hydrocarbon flow... OF THE INTERIOR OFFSHORE OIL AND GAS AND SULPHUR OPERATIONS IN THE OUTER CONTINENTAL SHELF Oil and Gas Drilling Operations General Requirements § 250.406 What additional safety measures must I take...

  5. 30 CFR 250.406 - What additional safety measures must I take when I conduct drilling operations on a platform that...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... ENFORCEMENT, DEPARTMENT OF THE INTERIOR OFFSHORE OIL AND GAS AND SULPHUR OPERATIONS IN THE OUTER CONTINENTAL SHELF Oil and Gas Drilling Operations General Requirements § 250.406 What additional safety measures... when I conduct drilling operations on a platform that has producing wells or has other hydrocarbon flow...

  6. 30 CFR 250.406 - What additional safety measures must I take when I conduct drilling operations on a platform that...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... when I conduct drilling operations on a platform that has producing wells or has other hydrocarbon flow... OF THE INTERIOR OFFSHORE OIL AND GAS AND SULPHUR OPERATIONS IN THE OUTER CONTINENTAL SHELF Oil and Gas Drilling Operations General Requirements § 250.406 What additional safety measures must I take...

  7. 30 CFR 250.406 - What additional safety measures must I take when I conduct drilling operations on a platform that...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... when I conduct drilling operations on a platform that has producing wells or has other hydrocarbon flow... OF THE INTERIOR OFFSHORE OIL AND GAS AND SULPHUR OPERATIONS IN THE OUTER CONTINENTAL SHELF Oil and Gas Drilling Operations General Requirements § 250.406 What additional safety measures must I take...

  8. A Research on E - learning Resources Construction Based on Semantic Web

    NASA Astrophysics Data System (ADS)

    Rui, Liu; Maode, Deng

    Traditional e-learning platforms have the flaws that resources are difficult to query and locate and that cross-platform sharing and interoperability are hard to realize. In this paper, the semantic web and metadata standards are discussed, and an e-learning system framework based on the semantic web is put forward to address the flaws of traditional e-learning platforms.
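
    A minimal sketch of the idea using the rdflib library: resources are described as RDF triples and then retrieved by semantic query rather than by platform-specific lookup. The namespace and resource names are invented, and Dublin Core stands in for whichever e-learning metadata standard such a framework would adopt.

        from rdflib import Graph, Literal, Namespace, URIRef
        from rdflib.namespace import DC, RDF

        EX = Namespace("http://example.org/elearning#")  # hypothetical vocabulary
        g = Graph()

        lesson = URIRef("http://example.org/elearning/lesson/42")
        g.add((lesson, RDF.type, EX.LearningResource))
        g.add((lesson, DC.title, Literal("Introduction to RDF")))
        g.add((lesson, DC.subject, Literal("semantic web")))

        # Semantic querying: find resources by subject, independent of platform.
        q = 'SELECT ?r WHERE { ?r dc:subject "semantic web" }'
        for row in g.query(q, initNs={"dc": DC}):
            print(row.r)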

  9. CatSim: a new computer assisted tomography simulation environment

    NASA Astrophysics Data System (ADS)

    De Man, Bruno; Basu, Samit; Chandra, Naveen; Dunham, Bruce; Edic, Peter; Iatrou, Maria; McOlash, Scott; Sainath, Paavana; Shaughnessy, Charlie; Tower, Brendon; Williams, Eugene

    2007-03-01

    We present a new simulation environment for X-ray computed tomography, called CatSim. CatSim provides a research platform for GE researchers and collaborators to explore new reconstruction algorithms, CT architectures, and X-ray source or detector technologies. The main requirements for this simulator are accurate physics modeling, low computation times, and geometrical flexibility. CatSim allows simulating complex analytic phantoms, such as the FORBILD phantoms, including boxes, ellipsoids, elliptical cylinders, cones, and cut planes. CatSim incorporates polychromaticity, realistic quantum and electronic noise models, finite focal spot size and shape, finite detector cell size, detector cross-talk, detector lag or afterglow, bowtie filtration, finite detector efficiency, non-linear partial volume, scatter (variance-reduced Monte Carlo), and absorbed dose. We present an overview of CatSim along with a number of validation experiments.

  10. An empirical assessment of validation practices for molecular classifiers

    PubMed Central

    Castaldi, Peter J.; Dahabreh, Issa J.

    2011-01-01

    Proposed molecular classifiers may be overfit to idiosyncrasies of noisy genomic and proteomic data. Cross-validation methods are often used to obtain estimates of classification accuracy, but both simulations and case studies suggest that, when inappropriate methods are used, bias may ensue. Bias can be bypassed and generalizability can be tested by external (independent) validation. We evaluated 35 studies that have reported on external validation of a molecular classifier. We extracted information on study design and methodological features, and compared the performance of molecular classifiers in internal cross-validation versus external validation for 28 studies where both had been performed. We demonstrate that the majority of studies pursued cross-validation practices that are likely to overestimate classifier performance. Most studies were markedly underpowered to detect a 20% decrease in sensitivity or specificity between internal cross-validation and external validation [median power was 36% (IQR, 21–61%) and 29% (IQR, 15–65%), respectively]. The median reported classification performance for sensitivity and specificity was 94% and 98%, respectively, in cross-validation and 88% and 81% for independent validation. The relative diagnostic odds ratio was 3.26 (95% CI 2.04–5.21) for cross-validation versus independent validation. Finally, we reviewed all studies (n = 758) which cited those in our study sample, and identified only one instance of additional subsequent independent validation of these classifiers. In conclusion, these results document that many cross-validation practices employed in the literature are potentially biased and genuine progress in this field will require adoption of routine external validation of molecular classifiers, preferably in much larger studies than in current practice. PMID:21300697
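
    The diagnostic odds ratio used above folds sensitivity and specificity into a single accuracy figure; the sketch below applies it to the median values quoted (the paper's relative DOR of 3.26 comes from a paired per-study analysis, so it is not reproduced by this ratio of medians).

        def diagnostic_odds_ratio(sens, spec):
            """DOR = (sens / (1 - sens)) / ((1 - spec) / spec)."""
            return (sens / (1 - sens)) / ((1 - spec) / spec)

        dor_cv  = diagnostic_odds_ratio(0.94, 0.98)  # cross-validation medians
        dor_ext = diagnostic_odds_ratio(0.88, 0.81)  # external-validation medians
        print(f"DOR, internal cross-validation: {dor_cv:.0f}")
        print(f"DOR, external validation:       {dor_ext:.0f}")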

  11. EAQUATE: An International Experiment for Hyper-Spectral Atmospheric Sounding Validation

    NASA Technical Reports Server (NTRS)

    Taylor, J. P.; Smith, W.; Cuomo, V.; Larar, A.; Zhou, D.; Serio, C.; Maestri, T.; Rizzi, R.; Newman, S.; Antonelli, P.; et al.

    2008-01-01

    The international experiment called EAQUATE (European AQUA Thermodynamic Experiment) was held in September 2004 in Italy and the United Kingdom to demonstrate certain ground-based and airborne systems useful for validating hyperspectral satellite sounding observations. A range of flights over land and marine surfaces was conducted to coincide with overpasses of the AIRS instrument on the EOS Aqua platform. Direct radiance evaluation of AIRS against NAST-I and SHIS has shown excellent agreement. Comparisons of level-2 retrievals of temperature and water vapor from AIRS and NAST-I, validated against high-quality lidar and dropsonde data, show that the 1 K/1 km and 10%/1 km requirements for temperature and water vapor (respectively) are generally being met. The EAQUATE campaign has proven the need for synergistic measurements from a range of observing systems for satellite cal/val and has paved the way for future cal/val activities in support of IASI on the European MetOp platform and CrIS on the US NPP/NPOESS platform.

  12. McIDAS-V: Data Analysis and Visualization for NPOESS and GOES-R

    NASA Astrophysics Data System (ADS)

    Rink, T.; Achtor, T. H.

    2009-12-01

    McIDAS-V, the next-generation McIDAS, is being built on top of a modern, cross-platform software framework which supports the development of 4-D, interactive displays and the integration of a wide array of geophysical data. As the replacement for McIDAS, the development emphasis is on future satellite observation platforms such as NPOESS and GOES-R. Data interrogation, analysis and visualization capabilities have been developed for multi- and hyper-spectral instruments like MODIS, AIRS and IASI, and are being extended for application to VIIRS and CrIS. Compatibility with GOES-R ABI level-1 and level-2 product storage formats has been demonstrated. The abstract data model, which can internalize almost any geophysical data, opens up new possibilities for data fusion techniques, for example polar and geostationary (LEO/GEO) synergy for research and validation. McIDAS-V follows an object-oriented design model, using the Java programming language, allowing specialized extensions for new sources of data and for novel displays and interactive behavior. The reference application, what the user sees on startup, can be customized, and the system has a persistence mechanism allowing sharing of the application state across the internet. McIDAS-V is open-source and free to the public.

  13. Slope climbing challenges, fear of heights, anxiety and time of the day.

    PubMed

    Ennaceur, A; Hussain, M D; Abuhamdah, R M; Mostafa, R M; Chazot, P L

    2017-01-01

    When exposed to an unfamiliar open space, animals experience fear and attempt to find an escape route. Anxiety emerges when animals are confronted with a challenging obstacle to this fear-motivated escape. High-anxiety animals do not take risks; they avoid the challenge. The present experiments investigated this risk-avoidant behavior in mice. In experiment 1, BALB/c, C57BL/6J and CD-1 mice were exposed to a large platform with downward-inclined steep slopes attached on two opposite sides. The platform was elevated 75 and 100 cm from the ground in a standard (SPDS) and in a raised (RPDS) configuration, respectively. In experiment 2, the platform was elevated 75 cm from the ground, and mice had to climb onto a stand at the top of upward-inclined slopes (SPUS). In experiment 3, BALB/c mice were exposed to SPDS with steep or shallow slopes, either in early morning or in late afternoon. In all three test configurations, mice spent more time in the areas adjacent to the slopes than in the areas adjacent to the void; however, only C57BL/6J and CD-1 crossed onto the slopes in SPDS and onto the stands in SPUS, whereas BALB/c remained on the platform in SPDS and explored the slopes in SPUS. Elevation of the platform from the ground reduced the crossings onto the slopes in C57BL/6J and CD-1, and no differences were observed between BALB/c and C57BL/6J. BALB/c mice demonstrated no difference in anxiety when tested in early morning or late afternoon; they crossed onto shallow slopes and avoided the steep ones. Copyright © 2016 Elsevier B.V. All rights reserved.

  14. Validation of Cryosat-2 SAR Wind and Wave Products

    NASA Astrophysics Data System (ADS)

    Abdalla, Saleh; Dinardo, Salvatore; Benveniste, Jerome; Janssen, Peter

    2016-08-01

    Significant wave height (SWH) and surface wind speed (WS) products from the CryoSat-2 Synthetic Aperture Radar (SAR) mode are validated against operational ECMWF atmospheric and wave model results, in addition to available observations from buoys, platforms and other altimeters. The SAR data were processed with the SAMOSA ocean model in the ESRIN G-POD service using the SAR Versatile Altimetric Toolkit for Ocean Research & Exploitation (SARvatore). The data cover two geographic boxes: one in the northeast Atlantic Ocean extending from 32°N to 70°N and from 20°W to the prime meridian (NE Atlantic Box), for the period from 6 September 2010 to 30 June 2014, and the other in the eastern Pacific extending from 2.5°S to 25.5°S and from 160°W to 85°W (Pacific Box), for the period from 7 May 2012 to 30 June 2014. The amount of data is limited by the CryoSat SAR mode acquisition capability over the ocean but is high enough to ensure robustness and significance of the results (Sentinel-3 will operate in SAR mode over the whole ocean). The results show that the quality of both the SWH and WS products is very high.

  15. ibex: An open infrastructure software platform to facilitate collaborative work in radiomics

    PubMed Central

    Zhang, Lifei; Fried, David V.; Fave, Xenia J.; Hunter, Luke A.; Court, Laurence E.

    2015-01-01

    Purpose: Radiomics, which is the high-throughput extraction and analysis of quantitative image features, has been shown to have considerable potential to quantify the tumor phenotype. However, at present, a lack of software infrastructure has impeded the development of radiomics and its applications. Therefore, the authors developed the imaging biomarker explorer (IBEX), an open infrastructure software platform that flexibly supports common radiomics workflow tasks such as multimodality image data import and review, development of feature extraction algorithms, model validation, and consistent data sharing among multiple institutions. Methods: The IBEX software package was developed using the MATLAB and C/C++ programming languages. The software architecture deploys the modern model-view-controller, unit testing, and function handle programming concepts to isolate each quantitative imaging analysis task, to validate if their relevant data and algorithms are fit for use, and to plug in new modules. On one hand, IBEX is self-contained and ready to use: it has implemented common data importers, common image filters, and common feature extraction algorithms. On the other hand, IBEX provides an integrated development environment on top of MATLAB and C/C++, so users are not limited to its built-in functions. In the IBEX developer studio, users can plug in, debug, and test new algorithms, extending IBEX's functionality. IBEX also supports quality assurance for data and feature algorithms: image data, regions of interest, and feature algorithm-related data can be reviewed, validated, and/or modified. More importantly, two key elements in collaborative workflows, the consistency of data sharing and the reproducibility of calculation results, are embedded in the IBEX workflow: image data, feature algorithms, and model validation including newly developed ones from different users can be easily and consistently shared so that results can be more easily reproduced between institutions. Results: Researchers with a variety of technical skill levels, including radiation oncologists, physicists, and computer scientists, have found the IBEX software to be intuitive, powerful, and easy to use. IBEX can be run on any computer with the Windows operating system and 1 GB RAM. The authors fully validated the implementation of all importers, preprocessing algorithms, and feature extraction algorithms. Windows version 1.0 beta of stand-alone IBEX and IBEX's source code can be downloaded. Conclusions: The authors successfully implemented IBEX, an open infrastructure software platform that streamlines common radiomics workflow tasks. Its transparency, flexibility, and portability can greatly accelerate the pace of radiomics research and pave the way toward successful clinical translation. PMID:25735289

  16. IBEX: an open infrastructure software platform to facilitate collaborative work in radiomics.

    PubMed

    Zhang, Lifei; Fried, David V; Fave, Xenia J; Hunter, Luke A; Yang, Jinzhong; Court, Laurence E

    2015-03-01

    Radiomics, which is the high-throughput extraction and analysis of quantitative image features, has been shown to have considerable potential to quantify the tumor phenotype. However, at present, a lack of software infrastructure has impeded the development of radiomics and its applications. Therefore, the authors developed the imaging biomarker explorer (IBEX), an open infrastructure software platform that flexibly supports common radiomics workflow tasks such as multimodality image data import and review, development of feature extraction algorithms, model validation, and consistent data sharing among multiple institutions. The IBEX software package was developed using the MATLAB and C/C++ programming languages. The software architecture deploys the modern model-view-controller, unit testing, and function handle programming concepts to isolate each quantitative imaging analysis task, to validate if their relevant data and algorithms are fit for use, and to plug in new modules. On one hand, IBEX is self-contained and ready to use: it has implemented common data importers, common image filters, and common feature extraction algorithms. On the other hand, IBEX provides an integrated development environment on top of MATLAB and C/C++, so users are not limited to its built-in functions. In the IBEX developer studio, users can plug in, debug, and test new algorithms, extending IBEX's functionality. IBEX also supports quality assurance for data and feature algorithms: image data, regions of interest, and feature algorithm-related data can be reviewed, validated, and/or modified. More importantly, two key elements in collaborative workflows, the consistency of data sharing and the reproducibility of calculation results, are embedded in the IBEX workflow: image data, feature algorithms, and model validation including newly developed ones from different users can be easily and consistently shared so that results can be more easily reproduced between institutions. Researchers with a variety of technical skill levels, including radiation oncologists, physicists, and computer scientists, have found the IBEX software to be intuitive, powerful, and easy to use. IBEX can be run on any computer with the Windows operating system and 1 GB RAM. The authors fully validated the implementation of all importers, preprocessing algorithms, and feature extraction algorithms. Windows version 1.0 beta of stand-alone IBEX and IBEX's source code can be downloaded. The authors successfully implemented IBEX, an open infrastructure software platform that streamlines common radiomics workflow tasks. Its transparency, flexibility, and portability can greatly accelerate the pace of radiomics research and pave the way toward successful clinical translation.
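
    A sketch of the simplest kind of feature-extraction module such a platform hosts, first-order intensity statistics over a region of interest; this is a numpy illustration of the concept, not IBEX code.

        import numpy as np

        def first_order_features(image, mask):
            """First-order radiomics features from voxels inside a binary mask."""
            vox = image[mask].astype(float)
            hist, _ = np.histogram(vox, bins=32)
            p = hist[hist > 0] / vox.size
            return {
                "mean": vox.mean(),
                "std": vox.std(ddof=1),
                "skewness": ((vox - vox.mean()) ** 3).mean() / vox.std() ** 3,
                "entropy": float(-(p * np.log2(p)).sum()),
            }

        # Toy 3-D "CT" volume with a spherical region of interest.
        img = np.random.default_rng(3).normal(-100, 50, (32, 32, 32))
        z, y, x = np.ogrid[:32, :32, :32]
        roi = (z - 16) ** 2 + (y - 16) ** 2 + (x - 16) ** 2 <= 10 ** 2
        print(first_order_features(img, roi))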

  17. Strategy and Structure for Online News Production - Case Studies of CNN and NRK

    NASA Astrophysics Data System (ADS)

    Krumsvik, Arne H.

    This cross-national comparative case study of online news production analyzes the strategies of the Cable News Network (CNN) and the Norwegian Broadcasting Corporation (NRK), aiming at an understanding of the implications of organizational strategy for the role of journalists, and explains why traditional media organizations tend to develop a multi-platform approach (distributing content on several platforms, such as television, online and mobile) rather than the cross-media approach (with interplay between media types) or the multimedia approach anticipated by both scholars and practitioners.

  18. Estimation of Genetic Relationships Between Individuals Across Cohorts and Platforms: Application to Childhood Height.

    PubMed

    Fedko, Iryna O; Hottenga, Jouke-Jan; Medina-Gomez, Carolina; Pappa, Irene; van Beijsterveldt, Catharina E M; Ehli, Erik A; Davies, Gareth E; Rivadeneira, Fernando; Tiemeier, Henning; Swertz, Morris A; Middeldorp, Christel M; Bartels, Meike; Boomsma, Dorret I

    2015-09-01

    Combining genotype data across cohorts increases power to estimate the heritability due to common single nucleotide polymorphisms (SNPs), based on analysis of a Genetic Relationship Matrix (GRM). However, the combination of SNP data across multiple cohorts may lead to stratification when, for example, different genotyping platforms are used. In the current study, we address issues of combining SNP data from different cohorts, the Netherlands Twin Register (NTR) and the Generation R (GENR) study. Both cohorts include children of Northern European Dutch background (N = 3102 and 2826, respectively) who were genotyped on different platforms. We explore imputation and phasing as a tool and compare three GRM-building strategies, in which data from the two cohorts are (1) simply combined, (2) pre-combined and cross-platform imputed, or (3) cross-platform imputed and post-combined. We test these three strategies with data on childhood height for unrelated individuals (N = 3124, average age 6.7 years) to explore their effect on SNP-heritability estimates, and compare the results to those obtained from the independent studies. All combination strategies result in SNP-heritability estimates with standard errors smaller than those of the independent studies. We did not observe a significant difference in SNP-heritability estimates based on the various cross-platform-imputed GRMs. The SNP-heritability of childhood height was on average estimated at 0.50 (SE = 0.10). Introducing cohort as a covariate resulted in a drop of about 2%. Adjustment for principal components (PCs) resulted in SNP-heritability estimates of about 0.39 (SE = 0.11). Strikingly, we did not find a significant difference between cross-platform-imputed and simply combined GRMs. All estimates were significant regardless of PC adjustment. Based on these analyses we conclude that imputation to a reference set helps to increase the power to estimate SNP-heritability when combining cohorts of the same ethnicity genotyped on different platforms. However, important factors should be taken into account, such as residual cohort stratification after imputation and/or phenotypic heterogeneity between and within cohorts. Whether one should use imputation, or simply combine the genotype data, depends on the number of overlapping SNPs relative to the total number of genotyped SNPs in both cohorts, and on their ability to tag all the genetic variance related to the specific trait of interest.
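
    The GRM at the heart of these strategies is a cross-product of standardized genotypes (the usual GCTA-style estimator); a minimal sketch on toy 0/1/2-coded data:

        import numpy as np

        def grm(genotypes):
            """GRM from an (N individuals x M SNPs) matrix of minor-allele counts."""
            p = genotypes.mean(axis=0) / 2.0                    # allele frequencies
            z = (genotypes - 2 * p) / np.sqrt(2 * p * (1 - p))  # standardize SNPs
            return z @ z.T / genotypes.shape[1]

        # Toy example: 4 individuals, 6 SNPs.
        g = np.array([[0, 1, 2, 0, 1, 1],
                      [1, 1, 0, 2, 0, 1],
                      [2, 0, 1, 1, 1, 0],
                      [0, 2, 1, 0, 2, 1]])
        print(grm(g))  # diagonal near 1; off-diagonal near 0 for unrelateds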

  19. Implementation and Evaluation of a Fully Automated Multiplex Real-Time PCR Assay on the BD Max Platform to Detect and Differentiate Herpesviridae from Cerebrospinal Fluids

    PubMed Central

    Köller, Thomas; Kurze, Daniel; Lange, Mirjam; Scherdin, Martin; Podbielski, Andreas; Warnke, Philipp

    2016-01-01

    A fully automated multiplex real-time PCR assay—including a sample process control and a plasmid-based positive control—for the detection and differentiation of herpes simplex virus 1 (HSV1), herpes simplex virus 2 (HSV2) and varicella-zoster virus (VZV) from cerebrospinal fluid (CSF) was developed on the BD Max platform. Performance was compared to an established, accredited multiplex real-time PCR protocol utilizing the easyMAG and the LightCycler 480/II, both very common devices in viral molecular diagnostics. For clinical validation, 123 CSF specimens and 40 reference samples from national interlaboratory comparisons were examined with both methods, resulting in 97.6% and 100% concordance for CSF and reference samples, respectively. The BD Max platform achieved sensitivities of 173 (CI 95%, 88–258) copies/ml for HSV1, 171 (CI 95%, 148–194) copies/ml for HSV2 and 84 (CI 95%, 5–163) copies/ml for VZV. Cross-reactivity could be excluded by checking 25 common viral, bacterial and fungal human pathogens. Workflow analyses showed a shorter test duration as well as remarkably fewer and easier preparation steps, with the potential to reduce the error rates that occur when patient samples are assessed manually. This protocol provides a fully automated PCR assay on the BD Max platform for the simultaneous detection of herpesviruses from CSF specimens. Single or multiple infections due to HSV1, HSV2 and VZV can be reliably differentiated with good sensitivity. Control parameters are included within the assay, rendering it suitable for current quality-management requirements. PMID:27092772

  20. Cooperative Collision Avoidance Technology Demonstration Data Analysis Report

    NASA Technical Reports Server (NTRS)

    2007-01-01

    This report details the National Aeronautics and Space Administration (NASA) Access 5 Project Office Cooperative Collision Avoidance (CCA) Technology Demonstration for unmanned aircraft systems (UAS), conducted from 21 to 28 September 2005. The test platform chosen for the demonstration was the Proteus Optionally Piloted Vehicle operated by Scaled Composites, LLC, flown out of the Mojave Airport, Mojave, CA. A single intruder aircraft, a NASA Gulfstream III, was used during the demonstration to execute a series of near-collision encounter scenarios. Both aircraft were equipped with Traffic Alert and Collision Avoidance System-II (TCAS-II) and Automatic Dependent Surveillance-Broadcast (ADS-B) systems. The objective of this demonstration was to collect flight data to support validation efforts for the Access 5 CCA Work Package Performance Simulation and Systems Integration Laboratory (SIL). Correlation of the flight data with results obtained from the performance simulation serves as the basis for the simulation validation; a similar effort uses the flight data to validate the SIL architecture, which contains the same sensor hardware used during the flight demonstration.

  1. Aerodynamic Flow Control by Thermoacoustic Excitation from the Constituent Nanomaterials on the Platform Surface

    DTIC Science & Technology

    2016-02-01

    Report by Bryan Glaz, US Army Research Laboratory. Approved for public release; distribution is unlimited.

  2. Cost (non)-recovery by platform technology facilities in the Bio21 Cluster.

    PubMed

    Gibbs, Gerard; Clark, Stella; Quinn, Julieanne; Gleeson, Mary Joy

    2010-04-01

    Platform technologies (PT) are techniques or tools that enable a range of scientific investigations and are critical to today's advanced technology research environment. Once installed, they require specialized staff for their operations, who in turn, provide expertise to researchers in designing appropriate experiments. Through this pipeline, research outputs are raised to the benefit of the researcher and the host institution. Platform facilities provide access to instrumentation and expertise for a wide range of users beyond the host institution, including other academic and industry users. To maximize the return on these substantial public investments, this wider access needs to be supported. The question of support and the mechanisms through which this occurs need to be established based on a greater understanding of how PT facilities operate. This investigation was aimed at understanding if and how platform facilities across the Bio21 Cluster meet operating costs. Our investigation found: 74% of platforms surveyed do not recover 100% of direct operating costs and are heavily subsidized by their home institution, which has a vested interest in maintaining the technology platform; platform managers play a major role in establishing the costs and pricing of the facility, normally in a collaborative process with a management committee or institutional accountant; and most facilities have a three-tier pricing structure recognizing internal academic, external academic, and commercial clients.

  3. Cost (Non)-Recovery by Platform Technology Facilities in the Bio21 Cluster

    PubMed Central

    Gibbs, Gerard; Clark, Stella; Quinn, JulieAnne; Gleeson, Mary Joy

    2010-01-01

    Platform technologies (PT) are techniques or tools that enable a range of scientific investigations and are critical to today's advanced technology research environment. Once installed, they require specialized staff for their operations, who in turn, provide expertise to researchers in designing appropriate experiments. Through this pipeline, research outputs are raised to the benefit of the researcher and the host institution. Platform facilities provide access to instrumentation and expertise for a wide range of users beyond the host institution, including other academic and industry users. To maximize the return on these substantial public investments, this wider access needs to be supported. The question of support and the mechanisms through which this occurs need to be established based on a greater understanding of how PT facilities operate. This investigation was aimed at understanding if and how platform facilities across the Bio21 Cluster meet operating costs. Our investigation found: 74% of platforms surveyed do not recover 100% of direct operating costs and are heavily subsidized by their home institution, which has a vested interest in maintaining the technology platform; platform managers play a major role in establishing the costs and pricing of the facility, normally in a collaborative process with a management committee or institutional accountant; and most facilities have a three-tier pricing structure recognizing internal academic, external academic, and commercial clients. PMID:20357980

  4. AGARD Highlights.

    DTIC Science & Technology

    1980-09-01

    Air Vehicles as Sensor Platforms" (see inset) and would seek to determine the operational capability of advanced high altitude air vehicles as...Unmanned Air Vehicles as Sensor Platforms Determine the operational capability of advanced high-altitude air vehicles as platforms for surveillance...m) IT ala (1) a freccia ES sensacion (r) artificial ES indicador (m) tipo A NE pillvleugel FR I sensation MI artificielle FR indicateur (m) type A

  5. Development of a Platform to Enable Fully Automated Cross-Titration Experiments.

    PubMed

    Cassaday, Jason; Finley, Michael; Squadroni, Brian; Jezequel-Sur, Sylvie; Rauch, Albert; Gajera, Bharti; Uebele, Victor; Hermes, Jeffrey; Zuck, Paul

    2017-04-01

    In the triage of hits from a high-throughput screening campaign or during the optimization of a lead compound, it is relatively routine to test compounds at multiple concentrations to determine potency and maximal effect. Additional follow-up experiments, such as agonist shift, can be quite valuable in ascertaining compound mechanism of action (MOA). However, these experiments require cross-titration of a test compound with the activating ligand of the receptor, consuming 100-200 data points per compound and severely limiting the number tested in MOA assays in a screening triage. We describe a process to enhance the throughput of such cross-titration experiments through the integration of Hewlett Packard's D300 digital dispenser onto one of our robotics platforms to enable on-the-fly cross-titration of compounds in a 1536-well plate format. The process handles all the compound management and data tracking, as well as the biological assay. The process relies heavily on in-house-built software and hardware, and uses our proprietary control software for the platform. Using this system, we were able to automate the cross-titration of compounds for both positive and negative allosteric modulators of two different G protein-coupled receptors (GPCRs) using two distinct assay detection formats, IP1 and Ca 2+ detection, on nearly 100 compounds for each target.
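
    A sketch of the combinatorial layout that makes manual cross-titration so costly: two serial dilutions crossed into a well matrix. The concentrations and point counts below are invented for illustration, not taken from the authors' protocol.

        import numpy as np

        def serial_dilution(top, factor, n):
            """Concentrations from an n-point serial dilution starting at top."""
            return top / factor ** np.arange(n)

        compound = serial_dilution(top=10_000, factor=3, n=12)  # test compound, nM
        agonist  = serial_dilution(top=1_000,  factor=2, n=16)  # activating ligand, nM

        # Every (compound, agonist) pair becomes one well of the 1536-well plate.
        wells = [(c, a) for c in compound for a in agonist]
        print(f"{len(wells)} wells per compound")  # 192, in the 100-200 range cited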

  6. Optimized molten salt receivers for ultimate trough solar fields

    NASA Astrophysics Data System (ADS)

    Riffelmann, Klaus-J.; Richert, Timo; Kuckelkorn, Thomas

    2016-05-01

    Today, parabolic trough collectors are the most successful concentrating solar power (CSP) technology. For the next development step, new systems with increased operation temperatures and new heat transfer fluids (HTF) are currently being developed. Although the first power tower projects have been realized successfully, up to now there is no evidence of an all-dominant economic or technical advantage of either power towers or parabolic troughs. The development of parabolic trough technology towards higher performance and significant cost reduction has led to significant improvements in competitiveness. The use of molten salt instead of synthetic oil as the heat transfer fluid will bring the levelized cost of electricity (LCOE) down even further while providing dispatchable energy with high capacity factors. FLABEG has developed the Ultimate Trough™ (UT) collector jointly with sbp Sonne GmbH, supported by public funds. Due to its validated high optical accuracy, the collector is well suited to operate efficiently at elevated temperatures up to 550 °C. SCHOTT will drive the key innovations by introducing the 4th-generation solar receiver, which addresses the most significant performance and cost improvement measures. The new receivers have been completely redesigned to provide a product platform that is ready for high-temperature operation up to 550 °C. Moreover, distinct product features have been introduced to reduce costs and risks in solar field assembly and installation. The increased material and design challenges incurred by high-temperature operation are reflected in sophisticated qualification and validation procedures.

  7. Experimental platform utilising melting curve technology for detection of mutations in Mycobacterium tuberculosis isolates.

    PubMed

    Broda, Agnieszka; Nikolayevskyy, Vlad; Casali, Nicki; Khan, Huma; Bowker, Richard; Blackwell, Gemma; Patel, Bhakti; Hume, James; Hussain, Waqar; Drobniewski, Francis

    2018-04-20

    Tuberculosis (TB) remains one of the most deadly infections, with approximately a quarter of cases not being identified and/or treated, mainly due to a lack of resources. Rapid detection of TB or drug-resistant TB enables timely, adequate treatment and is a cornerstone of effective TB management. We evaluated the analytical performance of a single-tube assay for multidrug-resistant TB (MDR-TB) on an experimental platform utilising real-time PCR (RT-PCR) and melting curve analysis that could potentially be operated as a point-of-care (PoC) test in resource-constrained settings with a high burden of TB. First, we developed and evaluated the prototype MDR-TB assay using specimens extracted from well-characterised TB isolates carrying a variety of distinct rifampicin- and isoniazid-resistance-conferring mutations, and from nontuberculous mycobacteria (NTM) strains. Second, we validated the experimental platform using 98 clinical sputum samples from pulmonary TB patients collected in high-MDR-TB settings. The sensitivity of the platform for TB detection in clinical specimens was 75% for smear-negative and 92.6% for smear-positive sputum samples. The sensitivity of detection of rifampicin and isoniazid resistance was 88.9% and 96.0%, and the specificity was 87.5% and 100%, respectively. The observed limitations in sensitivity and specificity could be resolved by adjusting the sample preparation methodology and the melting curve recognition algorithm. Overall, the technology could be considered a promising PoC methodology, especially in resource-constrained settings, based on its combined accuracy, convenience, simplicity, speed, and cost.

  8. Predicting Droplet Formation on Centrifugal Microfluidic Platforms

    NASA Astrophysics Data System (ADS)

    Moebius, Jacob Alfred

    Centrifugal microfluidics is a widely known research tool for biological sample and water-quality analysis. Currently, the standard equipment used for such diagnostic applications consists of slow, bulky machines controlled by multiple operators; these machines can be condensed into a smaller, faster benchtop sample-to-answer system. Sample processing is an important step taken to extract, isolate, and convert biological factors, such as nucleic acids or proteins, from a raw sample into an analyzable solution. Volume definition is one such step. The focus of this thesis is the development of a model predicting monodisperse droplet formation and the application of droplets as a technique for volume definition. First, a background on droplet microfluidic platforms is presented, along with current biological analysis technologies and the advantages of integrating such technologies onto microfluidic platforms. Second, the background and theory of centrifugal microfluidics are given, followed by theory relevant to droplet emulsions. Third, fabrication techniques for centrifugal microfluidic designs are discussed. Finally, the development of a model for predicting droplet formation on the centrifugal microfluidic platform is presented in the remainder of the thesis. Predicting droplet formation analytically, based on the volumetric flow rates of the continuous and dispersed phases, the ratio of these two flow rates, and the interfacial tension between the two phases, presented many challenges, which are discussed in this work. Experimental validation was completed using continuous-phase solutions of different interfacial tensions. To conclude, prospective applications are discussed along with expected challenges.
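
    A hedged sketch of the dimensionless-number screening commonly used to anticipate droplet regimes in such generators; the capillary-number thresholds and fluid properties below are rough literature values, not results from the thesis.

        def capillary_number(mu, u, gamma):
            """Ca = mu * u / gamma for the continuous phase (SI units)."""
            return mu * u / gamma

        mu = 0.05      # continuous-phase viscosity, Pa*s (assumed oil)
        gamma = 0.005  # interfacial tension, N/m (assumed, with surfactant)
        for u in (0.001, 0.01, 0.1):  # continuous-phase velocity, m/s
            ca = capillary_number(mu, u, gamma)
            regime = ("squeezing/dripping" if ca < 0.015
                      else "dripping" if ca < 0.1 else "jetting")
            print(f"u = {u:6.3f} m/s -> Ca = {ca:.3f} ({regime})")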

  9. Concept and design of a UAS-based platform for measurements of RF signal-in-space

    NASA Astrophysics Data System (ADS)

    Schrader, Thorsten; Bredemeyer, Jochen; Mihalachi, Marius; Rohde, Jan; Kleine-Ostmann, Thomas

    2016-09-01

    Field strength or signal-in-space (SIS) measurements have traditionally been performed using manned helicopters and aircraft or from ground level using extendable masts. With the availability of unmanned aerial systems (UAS) such as multicopters, a new versatile platform for SIS measurements is deployable. Larger types feature up to eight individually driven electric motors and controllers (and are therefore called octocopters). They provide the ability to fly along predefined traces, to hover at waypoints and to initiate other actions when those have been reached. They provide self-levelling and stabilisation and, moreover, they can point at a location of interest regardless of their actual position, e.g. during a flight around a tower. Their payload mainly depends on the platform size and allows the integration of complex measurement equipment. Upgrading their navigation capabilities with a state-of-the-art global navigation satellite system (GNSS) receiver and a ground-station transmitter (real-time kinematic, RTK) enables precise localisation of the UAS. For operation in electromagnetically harsh environments, shielding can be considered and integrated into the concept. This paper describes the concept and design of an octocopter and its instrumentation, along with applications in recent projects in which we measure and validate terrestrial navigation systems used in air traffic and by weather forecast services. Among those are instrument landing systems (ILS), VHF omnidirectional radio ranges (VOR), airport traffic and weather radars as well as military surveillance radars, and UHF wind profilers. In particular, investigating the possible interaction of VORs and radars with single wind turbines (WT) or wind farms has become a major demand from industry, the military and policymakers. Here, UAS can be deployed to deliver measurement data on this interaction. Once developed and set up to a certain extent, UAS are easy and cost-efficient to operate. Moreover, due to their compact size, UAS have rather little interaction with the electromagnetic field to be measured, compared to the operation of manned helicopters.

  10. The balloon ring: a high-performance low-cost instrumentation platform for measuring atmospheric turbulence profiles

    NASA Astrophysics Data System (ADS)

    Kyrazis, Demos T.; Eaton, Frank D.; Black, Don G.; Black, Wiley T.; Black, Alastair

    2009-08-01

    Balloons, similar to those used for meteorological observations, are commonly used to carry a small instrumentation package for measuring optical turbulence in the atmosphere as a function of altitude. Two temperature sensors, one meter apart, measure a single point of the temperature structure function. The raw data are processed to provide the value of C_T^2, and the results are transmitted to a ground receiving site. These data are then converted to the index-of-refraction structure constant, C_n^2. The validity of these measurements depends on the correctness of a number of assumptions, including local isotropy of the turbulence, the existence of the Kolmogorov inertial subrange, and that the data are not contaminated by the wake of the ascending balloon. A variety of experiments on other platforms, and in the laboratory, demonstrate that the assumptions upon which these balloon measurements rest are not valid for a large percentage of the flights described above. In order to collect data whose interpretation does not require preconceived assumptions, the balloon ring instrumentation system was developed. The ring is 8.69 meters in diameter, with a cross-sectional diameter of 14 cm. It hangs just below the balloon, so that the wake passes through the center of the ring, and the sensors are mounted tangent to its circumference. The raw data are transmitted to the ground with a bandwidth extending to 1.25 kHz. A sample of the measurements taken during a flight at Vandenberg Air Force Base, Calif., is presented.
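
    The reduction described above can be summarized in a short sketch: estimate C_T^2 from the mean-square temperature difference of the probe pair (valid only if a Kolmogorov inertial subrange exists at the probe separation), then apply the standard optical conversion to C_n^2. All numbers below are synthetic stand-ins.

        # Standard reduction from paired temperature probes to Cn^2:
        # D_T(r) = <(T1 - T2)^2> = CT^2 * r^(2/3) inside the inertial
        # subrange, then Cn^2 = (79e-6 * P / T^2)^2 * CT^2, P in hPa, T in K.

        import numpy as np

        def ct2_from_probe_pair(t1, t2, separation_m=1.0):
            """Estimate CT^2 from two temperature series (K) at fixed separation."""
            d_t = np.mean((np.asarray(t1) - np.asarray(t2)) ** 2)
            return d_t / separation_m ** (2.0 / 3.0)

        def cn2_from_ct2(ct2, pressure_hpa, temperature_k):
            """Temperature structure constant -> refractive-index structure constant."""
            return (79e-6 * pressure_hpa / temperature_k**2) ** 2 * ct2

        # Example with synthetic stratospheric data (placeholder values).
        rng = np.random.default_rng(0)
        t1 = 220.0 + 0.02 * rng.standard_normal(10_000)
        t2 = 220.0 + 0.02 * rng.standard_normal(10_000)
        ct2 = ct2_from_probe_pair(t1, t2)
        print(f"CT^2 = {ct2:.3e} K^2 m^-2/3, "
              f"Cn^2 = {cn2_from_ct2(ct2, 55.0, 220.0):.3e} m^-2/3")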

  11. NASA's Analog Missions: Driving Exploration Through Innovative Testing

    NASA Technical Reports Server (NTRS)

    Reagan, Marcum L.; Janoiko, Barbara A.; Parker, Michele L.; Johnson, James E.; Chappell, Steven P.; Abercromby, Andrew F.

    2012-01-01

    Human exploration beyond low-Earth orbit (LEO) will require a unique collection of advanced, innovative technologies and the precise execution of complex and challenging operational concepts. One tool that we in the Analog Missions Project at the National Aeronautics and Space Administration (NASA) utilize to validate exploration system architecture concepts and conduct technology demonstrations, while gaining a deeper understanding of system-wide technical and operational challenges, is our analog missions. Analog missions are multi-disciplinary activities that test multiple features of future spaceflight missions in an integrated fashion to gain a deeper understanding of system-level interactions and integrated operations. These missions frequently occur in remote and extreme environments that are representative in one or more ways of future spaceflight destinations. They allow us to test robotics, vehicle prototypes, habitats, communications systems, in-situ resource utilization, and human performance as it relates to these technologies, and in doing so to validate architectural concepts and build the understanding needed to support crewed missions beyond LEO. As NASA develops a capability-driven architecture for transporting crew to a variety of space environments, including the Moon, near-Earth asteroids (NEA), Mars, and other destinations, it will use its analog missions to gather requirements and develop the technologies necessary to ensure successful human exploration beyond LEO. Currently, there are four analog mission platforms: Research and Technology Studies (RATS), NASA's Extreme Environment Mission Operations (NEEMO), In-Situ Resource Utilization (ISRU), and the International Space Station (ISS) Test bed for Analog Research (ISTAR).

  12. Arc4nix: A cross-platform geospatial analytical library for cluster and cloud computing

    NASA Astrophysics Data System (ADS)

    Tang, Jingyin; Matyas, Corene J.

    2018-02-01

    Big Data in geospatial technology poses a grand challenge for processing capacity. The ability to run GIS-based geospatial analysis on Cloud Computing and High Performance Computing (HPC) clusters has emerged as a new approach to provide feasible solutions. However, users often cannot migrate existing research tools to a Cloud Computing or HPC-based environment because the market-dominating ArcGIS software stack is incompatible with the Linux operating system. This manuscript details a cross-platform geospatial library, "arc4nix", that bridges this gap. Arc4nix provides an application programming interface compatible with ArcGIS and its Python library "arcpy". Arc4nix uses a decoupled client-server architecture that permits geospatial analytical functions to run on a remote server while other functions run in the native Python environment. It uses functional programming and meta-programming techniques to dynamically construct Python code containing the actual geospatial calculations, send it to a server, and retrieve the results. Arc4nix allows users to employ their arcpy-based scripts in a Cloud Computing or HPC environment with minimal or no modification. It also supports parallelizing tasks across multiple CPU cores and nodes for large-scale analyses. A case study of geospatial processing of a numerical weather model's output shows that arc4nix scales linearly in a distributed environment. Arc4nix is open-source software.
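
    The code-construction idea is easiest to see in miniature. The sketch below is not the actual arc4nix API; it only illustrates the general pattern of rendering an arcpy-style call as Python source text and executing it in a separate namespace that stands in for the remote server. The "Buffer" tool bound into the namespace is hypothetical.

        # Illustrative sketch (not arc4nix's real API) of the decoupled
        # client-server idea: the client turns a geoprocessing call into
        # Python source, ships it to a worker, and reads the result back.
        # submit() uses exec() locally to stand in for the server side.

        import json

        def build_remote_call(tool_name, *args, **kwargs):
            """Meta-programming step: render an arcpy-style call as source code."""
            arg_text = ", ".join([repr(a) for a in args] +
                                 [f"{k}={v!r}" for k, v in kwargs.items()])
            return (f"import json\n"
                    f"result = {tool_name}({arg_text})\n"
                    f"payload = json.dumps({{'result': str(result)}})\n")

        def submit(source, namespace):
            """Stand-in for the server: execute generated source, return payload."""
            exec(source, namespace)
            return json.loads(namespace["payload"])

        # Example with a hypothetical 'Buffer' tool bound into the namespace.
        ns = {"Buffer": lambda fc, out, dist: f"{out} ({fc} buffered by {dist})"}
        src = build_remote_call("Buffer", "storms.shp", "storms_buf.shp",
                                dist="10 Kilometers")
        print(submit(src, ns)["result"])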

  13. Building a Secure and Feature-rich Mobile Mapping Service App Using HTML5: Challenges and Best Practices

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Karthik, Rajasekar; Patlolla, Dilip Reddy; Sorokine, Alexandre

    Managing a wide variety of mobile devices across multiple mobile operating systems is a security challenge for any organization [1, 2]. With the wide adoption of mobile devices to access work-related apps, there is an increase in third-party apps that might either misuse or improperly handle a user's personal or sensitive data [3]. HTML5 has been receiving wide attention for developing cross-platform mobile apps. According to International Data Corporation (IDC), by 2015, 80% of all mobile apps will be based in part or wholly upon HTML5 [4]. Though HTML5 provides a rich set of features for building an app, it is a challenge for organizations to deploy and manage HTML5 apps on a wide variety of devices while keeping security policies intact. In this paper, we describe an upcoming secure mobile environment for HTML5 apps, called Sencha Space, that addresses these issues, and discuss how it will be used to design and build a secure, cross-platform mobile mapping service app. We also describe how HTML5 and a new set of related technologies, such as the Geolocation API, WebGL, OpenLayers 3, and Local Storage, can be used to provide a high-end, high-performance experience for users of the mapping service app.

  14. AnyWave: a cross-platform and modular software for visualizing and processing electrophysiological signals.

    PubMed

    Colombet, B; Woodman, M; Badier, J M; Bénar, C G

    2015-03-15

    The importance of digital signal processing in clinical neurophysiology is growing steadily, involving both clinical researchers and methodologists. There is a need to bridge the gap between these communities by providing efficient delivery of newly designed algorithms to end users. We have developed such a tool, which both visualizes and processes data and, additionally, acts as a software development platform. AnyWave was designed to run on all common operating systems. It provides access to a variety of data formats and employs high-fidelity visualization techniques. It also allows the use of external tools as plug-ins, which can be developed in languages including C++, MATLAB and Python. In the current version, plug-ins allow computation of connectivity graphs (non-linear correlation h2) and time-frequency representations (Morlet wavelets). The software is freely available under the LGPL3 license. AnyWave is designed as an open, highly extensible solution, with an architecture that permits rapid delivery of new techniques to end users. We have developed AnyWave as an efficient neurophysiological data visualizer able to integrate state-of-the-art techniques. AnyWave offers an interface well suited to the needs of clinical research and an architecture designed for integrating new tools. We expect this software to strengthen the collaboration between clinical neurophysiologists and researchers in biomedical engineering and signal processing. Copyright © 2015 Elsevier B.V. All rights reserved.
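
    The non-linear correlation h2 mentioned above has a compact definition: regress one signal on the other with a piecewise curve and measure the explained variance. A minimal sketch, approximating the published piecewise-linear fit (Pijn et al.) with per-bin conditional means; this is a simplification, not AnyWave's own plug-in code:

        # Nonlinear correlation h2: h2 = 1 - var(y - f(x)) / var(y),
        # where f is a piecewise regression of y on x. Here f is
        # approximated by binwise conditional means for brevity.

        import numpy as np

        def h2(x, y, n_bins=10):
            """Fraction of variance in y explained by a nonlinear map of x."""
            x, y = np.asarray(x, float), np.asarray(y, float)
            edges = np.quantile(x, np.linspace(0, 1, n_bins + 1))
            idx = np.clip(np.searchsorted(edges, x, side="right") - 1, 0, n_bins - 1)
            f = np.array([y[idx == b].mean() if np.any(idx == b) else y.mean()
                          for b in range(n_bins)])
            residual = y - f[idx]
            return 1.0 - residual.var() / y.var()

        # Example: a quadratic dependency that linear correlation misses.
        rng = np.random.default_rng(1)
        x = rng.uniform(-1, 1, 5000)
        y = x**2 + 0.05 * rng.standard_normal(5000)
        print(f"h2 = {h2(x, y):.2f}, |pearson r| = {abs(np.corrcoef(x, y)[0, 1]):.2f}")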

  15. The ideal laboratory information system.

    PubMed

    Sepulveda, Jorge L; Young, Donald S

    2013-08-01

    Laboratory information systems (LIS) are critical components of the operation of clinical laboratories. However, the functionalities of LIS have lagged significantly behind the capacities of current hardware and software technologies, while the complexity of the information produced by clinical laboratories has been increasing over time and will soon undergo rapid expansion with the use of new, high-throughput and high-dimensionality laboratory tests. In the broadest sense, LIS are essential to manage the flow of information between health care providers, patients, and laboratories and should be designed to optimize not only laboratory operations but also personalized clinical care. Objective. To list suggestions for designing LIS with the goal of optimizing the operation of clinical laboratories while improving clinical care by intelligent management of laboratory information. Data Sources. Literature review, interviews with laboratory users, and personal experience and opinion. Conclusions. Laboratory information systems can improve laboratory operations and improve patient care. Specific suggestions for improving the function of LIS are listed under the following sections: (1) Information Security; (2) Test Ordering; (3) Specimen Collection, Accessioning, and Processing; (4) Analytic Phase; (5) Result Entry and Validation; (6) Result Reporting; (7) Notification Management; (8) Data Mining and Cross-sectional Reports; (9) Method Validation; (10) Quality Management; (11) Administrative and Financial Issues; and (12) Other Operational Issues.

  16. mMass 3: a cross-platform software environment for precise analysis of mass spectrometric data.

    PubMed

    Strohalm, Martin; Kavan, Daniel; Novák, Petr; Volný, Michael; Havlícek, Vladimír

    2010-06-01

    While tools for the automated analysis of MS and LC-MS/MS data are continuously improving, it is still often the case that, at the end of an experiment, the mass spectrometrist will spend time carefully examining individual spectra. Current software support is mostly provided only by the instrument vendors, and the available software tools are often instrument-dependent. Here we present a new generation of mMass, a cross-platform environment for the precise analysis of individual mass spectra. The software covers a wide range of processing tasks such as import from various data formats, smoothing, baseline correction, peak picking, deisotoping, charge determination, and recalibration. Functions presented in earlier versions, such as in silico digestion and fragmentation, were redesigned and improved. In addition to Mascot, an interface for ProFound has been implemented. A specific tool is available for isotopic pattern modeling to enable precise data validation. The largest available lipid database (from the LIPID MAPS Consortium) has been incorporated, and together with the new compound search tool, lipids can be rapidly identified. In addition, the user can define custom libraries of compounds and use them analogously. The new version of mMass is based on a stand-alone Python library, which provides the basic functionality for data processing and interpretation. This library can serve as a good starting point for other developers in their own projects. Binary distributions of mMass, its source code, a detailed user's guide, and video tutorials are freely available from www.mmass.org.
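
    Isotopic pattern modeling of the kind mentioned above is classically done by convolving elemental isotope distributions. The sketch below shows that generic algorithm at nominal (1 Da) resolution; it is not mMass's own API, and the abundance table is truncated to two isotopes per element for brevity.

        # Isotopic pattern by repeated convolution of elemental isotope
        # distributions, discretized to 1-Da nominal spacing. Abundances
        # here cover only the two major isotopes of C and H.

        import numpy as np

        ISOTOPES = {"C": [0.9893, 0.0107], "H": [0.999885, 0.000115]}

        def pattern(formula):
            """Relative intensities of M, M+1, M+2, ... for e.g. {'C': 6, 'H': 12}."""
            dist = np.array([1.0])
            for element, count in formula.items():
                elem = np.array(ISOTOPES[element])
                for _ in range(count):
                    dist = np.convolve(dist, elem)
            return dist / dist.max()

        # Example: nominal pattern for C6H12, truncated to four peaks.
        print(np.round(pattern({"C": 6, "H": 12})[:4], 4))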

  17. Evaluation of Serologic and Antigenic Relationships Between Middle Eastern Respiratory Syndrome Coronavirus and Other Coronaviruses to Develop Vaccine Platforms for the Rapid Response to Emerging Coronaviruses

    PubMed Central

    Agnihothram, Sudhakar; Gopal, Robin; Yount, Boyd L.; Donaldson, Eric F.; Menachery, Vineet D.; Graham, Rachel L.; Scobey, Trevor D.; Gralinski, Lisa E.; Denison, Mark R.; Zambon, Maria; Baric, Ralph S.

    2014-01-01

    Background. Middle East respiratory syndrome coronavirus (MERS-CoV) emerged in 2012, causing severe acute respiratory disease and pneumonia, with 44% mortality among 136 cases to date. Design of vaccines to limit the virus spread or diagnostic tests to track newly emerging strains requires knowledge of antigenic and serologic relationships between MERS-CoV and other CoVs. Methods. Using synthetic genomics and Venezuelan equine encephalitis virus replicons (VRPs) expressing spike and nucleocapsid proteins from MERS-CoV and other human and bat CoVs, we characterize the antigenic responses (using Western blot and enzyme-linked immunosorbent assay) and serologic responses (using neutralization assays) against 2 MERS-CoV isolates in comparison with those of other human and bat CoVs. Results. Serologic and neutralization responses against the spike glycoprotein were primarily strain specific, with a very low level of cross-reactivity within or across subgroups. CoV N proteins within but not across subgroups share cross-reactive epitopes with MERS-CoV isolates. Our findings were validated using a convalescent-phase serum specimen from a patient infected with MERS-CoV (NA 01) and human antiserum against SARS-CoV, human CoV NL63, and human CoV OC43. Conclusions. Vaccine design for emerging CoVs should involve chimeric spike protein containing neutralizing epitopes from multiple virus strains across subgroups to reduce immune pathology, and a diagnostic platform should include a panel of nucleocapsid and spike proteins from phylogenetically distinct CoVs. PMID:24253287

  18. A system to measure minute hydraulic permeability of nanometer scale devices in a non-destructive manner

    NASA Astrophysics Data System (ADS)

    Smith, Ross A.; Fleischman, Aaron J.; Fissell, William H.; Zorman, Christian A.; Roy, Shuvo

    2011-04-01

    We report an automated system for measuring the hydraulic permeability of nanoporous membranes in a tangential-flow configuration. The system was designed and built specifically for micromachined silicon nanoporous membranes (SNM) with monodisperse slit-shaped pores. These novel membranes are under development for water filtration, artificial organ and drug delivery applications. The filtration cell permits non-destructive testing of the membrane over many remove-modify-replace testing cycles, allowing for direct experiments into the effects of surface modifications on such membranes. The experimental apparatus was validated using microfluidic tubing with circular cross sections that provided similar fluidic resistances to SNM. Further validation was performed with SNM chips for which the pore dimensions were known from scanning electron microscopy measurements. The system was then used to measure the hydraulic permeability of nanoporous membranes before and after surface modification. The system yields measurements with low variance and excellent agreement with predicted values, providing a platform for determining pore sizes in micro/nanofluidic systems with tight pore size distributions to a higher degree of precision than can be achieved with traditional techniques.
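
    A back-of-the-envelope version of the validation logic, comparing the Poiseuille resistance of circular calibration tubing with that of a slit pore, follows directly from the standard formulas. The dimensions below are illustrative, not the actual chip geometry from the paper.

        # Hydraulic resistances used to match a tubing standard to a
        # slit-pore membrane: R = 8*mu*L/(pi*r^4) for a circular tube,
        # R = 12*mu*L/(w*h^3) for a wide slit (h << w). Units: Pa*s/m^3.

        import math

        def r_hyd_circular(mu, length, radius):
            """Poiseuille resistance of a circular tube."""
            return 8.0 * mu * length / (math.pi * radius**4)

        def r_hyd_slit(mu, length, width, height):
            """Resistance of a wide slit of height h << w."""
            return 12.0 * mu * length / (width * height**3)

        mu = 1.0e-3                                   # water at ~20 C, Pa*s
        r_tube = r_hyd_circular(mu, 0.10, 25e-6)      # 10 cm of 50-um-ID tubing
        r_pore = r_hyd_slit(mu, 4e-6, 45e-6, 10e-9)   # one 10-nm slit pore
        n_pores = round(r_pore / r_tube)              # pores matching one tube
        print(f"tube: {r_tube:.3e}, pore: {r_pore:.3e}, ratio: {n_pores:,}")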

  19. By land, sea and air (and space): Verifying UK methane emissions at a range of scales by integrating multiple measurement platforms

    NASA Astrophysics Data System (ADS)

    Rigby, M. L.; Lunt, M. F.; Ganesan, A.

    2015-12-01

    The Greenhouse gAs Uk and Global Emissions (GAUGE) programme and the Department of Energy and Climate Change (DECC) network aim to quantify the magnitude and uncertainty of UK greenhouse gas (GHG) emissions at a resolution and accuracy higher than has previously been possible. The ongoing DECC tall-tower network consists of three sites, plus an eastern background site in Ireland. The GAUGE project adds instruments at two additional tall-tower sites, a high-density measurement network over agricultural land in eastern England, a ferry that performs near-daily transects along the east coast of the UK, and a research aircraft deployed on a campaign basis. Together with data collected by the GOSAT satellite, these observations constitute the GAUGE/DECC GHG measurement network that is being used to quantify UK GHG fluxes. As part of the wider GAUGE modelling effort, we have derived methane flux estimates for the UK and northwest Europe using the UK Met Office NAME atmospheric transport model and a novel hierarchical Bayesian "trans-dimensional" inversion framework. We will show that our estimated fluxes for the UK as a whole are largely consistent between individual measurement platforms, albeit with very different uncertainties. Our novel inversion approach uses the data to objectively determine the extent to which we can further refine our national estimates to the level of large urban areas, major hotspots, or larger sub-national regions. In this talk, we will outline some initial findings of the GAUGE project, tackling questions such as: At what spatial scale can we effectively derive greenhouse gas fluxes with a dense, multi-platform national network? Can we resolve individual metropolitan areas or major hotspots? What is the relative impact of individual stations, platforms, and network configurations on flux estimates for a country the size of the UK? How can we effectively use multi-platform observations to cross-validate flux estimates and determine likely errors in model transport?
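
    For readers unfamiliar with atmospheric inversions, the fixed-dimension Gaussian special case underlying such frameworks has a closed form, sketched below on toy data. The trans-dimensional extension used in the paper, in which the number of flux regions is itself sampled, is not reproduced here.

        # Closed-form Gaussian Bayesian inversion for y = Hx + e with
        # x ~ N(xa, B) and e ~ N(0, R):
        # x_hat = xa + B H^T (H B H^T + R)^-1 (y - H xa).

        import numpy as np

        def bayesian_inversion(H, y, xa, B, R):
            """Posterior mean and covariance of fluxes x given observations y."""
            S = H @ B @ H.T + R
            K = B @ H.T @ np.linalg.solve(S, np.eye(len(y)))
            x_hat = xa + K @ (y - H @ xa)
            P = B - K @ H @ B
            return x_hat, P

        # Toy example: 3 flux regions seen through a random transport matrix.
        rng = np.random.default_rng(2)
        x_true = np.array([10.0, 5.0, 1.0])
        H = rng.uniform(0, 1, (50, 3))               # sensitivities (obs x regions)
        y = H @ x_true + 0.5 * rng.standard_normal(50)
        x_hat, P = bayesian_inversion(H, y, xa=np.full(3, 4.0),
                                      B=np.eye(3) * 25.0, R=np.eye(50) * 0.25)
        print(np.round(x_hat, 2), np.round(np.sqrt(np.diag(P)), 2))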

  20. Bayesian approach to transforming public gene expression repositories into disease diagnosis databases.

    PubMed

    Huang, Haiyan; Liu, Chun-Chi; Zhou, Xianghong Jasmine

    2010-04-13

    The rapid accumulation of gene expression data has offered unprecedented opportunities to study human diseases. The National Center for Biotechnology Information Gene Expression Omnibus is currently the largest database that systematically documents the genome-wide molecular basis of diseases. However, thus far, this resource has been far from fully utilized. This paper describes the first study to transform public gene expression repositories into an automated disease diagnosis database. In particular, we have developed a systematic framework, including a two-stage Bayesian learning approach, to diagnose one or multiple diseases for a query expression profile along a hierarchical disease taxonomy. Our approach, which includes standardizing cross-platform gene expression data and heterogeneous disease annotations, allows both sources of information to be analyzed in a unified probabilistic system. Cross-validation showed a high level of overall diagnostic accuracy and demonstrated that the power of our method can increase significantly with the continued growth of public gene expression repositories. Finally, we showed how our disease diagnosis system can be used to characterize complex phenotypes and to construct a disease-drug connectivity map.
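
    The flavor of such a Bayesian diagnosis step can be sketched with a simple per-class Gaussian model over selected genes; the published two-stage, taxonomy-aware approach is considerably richer than this toy version.

        # Posterior over disease classes for a query expression profile,
        # assuming independent-gene Gaussian class models. A stand-in for
        # the paper's two-stage, taxonomy-aware Bayesian learning.

        import numpy as np

        def diagnose(query, class_means, class_vars, priors):
            """Return P(class | profile) under independent-gene Gaussians."""
            log_post = np.log(priors).copy()
            for c in range(len(priors)):
                log_post[c] += np.sum(
                    -0.5 * np.log(2 * np.pi * class_vars[c])
                    - (query - class_means[c]) ** 2 / (2 * class_vars[c]))
            log_post -= log_post.max()                 # stabilize before exp
            post = np.exp(log_post)
            return post / post.sum()

        # Toy example: two disease classes, five genes.
        means = np.array([[1.0, 0.0, 2.0, 0.5, 0.0],
                          [0.0, 1.0, 0.0, 0.5, 2.0]])
        vars_ = np.ones_like(means)
        print(np.round(diagnose(np.array([0.9, 0.1, 1.8, 0.4, 0.2]),
                                means, vars_, np.array([0.5, 0.5])), 3))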
