Padalino, Saverio; Sfondrini, Maria Francesca; Chenuil, Laura; Scudeller, Luigia; Gandini, Paola
2014-12-01
The aim of this study was to assess the feasibility of skeletal maturation analysis using the Cervical Vertebrae Maturation (CVM) method by means of dedicated software, developed in collaboration with Outside Format (Paullo-Milan), as compared with manual analysis. From a sample of patients aged 7-21 years, we gathered 100 lateral cephalograms, 20 for each of the five CVM stages. For each cephalogram, we traced cervical vertebrae C2, C3 and C4 both by hand, using a lead pencil and an acetate sheet, and with the dedicated software. All the tracings were made by an experienced operator (a dentofacial orthopedics resident) and by an inexperienced operator (a student in dental surgery). Each operator recorded the time needed to make each tracing so that tracing times could be compared. Concordance between the manual analysis and the analysis performed with the dedicated software was 94% for the resident and 93% for the student. Interobserver concordance was 99%. Hand-tracing was quicker than software-assisted tracing, by 28 seconds on average. The cervical vertebrae analysis software offers excellent clinical performance, even though the method takes longer than the manual technique. Copyright © 2014 Elsevier Masson SAS. All rights reserved.
The ALICE Software Release Validation cluster
NASA Astrophysics Data System (ADS)
Berzano, D.; Krzewicki, M.
2015-12-01
One of the most important steps of the software lifecycle is Quality Assurance: this process comprises both automated tests and manual reviews, and all of them must pass successfully before the software is approved for production. Some tests, such as source code static analysis, are executed on a single dedicated service: in High Energy Physics, a full simulation and reconstruction chain on a distributed computing environment, backed with a sample "golden" dataset, is also necessary for the quality sign-off. The ALICE experiment uses dedicated and virtualized computing infrastructures for the Release Validation in order not to taint the production environment (i.e. CVMFS and the Grid) with non-validated software and validation jobs: the ALICE Release Validation cluster is a disposable virtual cluster appliance based on CernVM and the Virtual Analysis Facility, capable of deploying on demand, with a single command, a dedicated virtual HTCondor cluster with an automatically scalable number of virtual workers on any cloud supporting the standard EC2 interface. Input and output data are stored externally on EOS, and a dedicated CVMFS service provides the software to be validated. We will show how Release Validation cluster deployment and disposal are completely transparent for the Release Manager, who simply triggers the validation from the ALICE build system's web interface. CernVM 3, based entirely on CVMFS, makes it possible to boot any past snapshot of the operating system: we will show how this allows us to certify each ALICE software release for an exact CernVM snapshot, addressing the problem of Long Term Data Preservation by ensuring a consistent environment for software execution and data reprocessing in the future.
Efficient Process Migration for Parallel Processing on Non-Dedicated Networks of Workstations
NASA Technical Reports Server (NTRS)
Chanchio, Kasidit; Sun, Xian-He
1996-01-01
This paper presents the design and preliminary implementation of MpPVM, a software system that supports process migration for PVM application programs in a non-dedicated heterogeneous computing environment. The new concepts of migration points, migration point analysis, and necessary data analysis are introduced. In MpPVM, process migrations occur only at previously inserted migration points. Migration point analysis determines appropriate locations at which to insert migration points, whereas necessary data analysis provides the minimum set of variables to be transferred at each migration point. A new methodology for performing reliable point-to-point data communications in a migration environment is also discussed. Finally, a preliminary implementation of MpPVM and its experimental results are presented, showing the correctness and promising performance of our process migration mechanism in a scalable non-dedicated heterogeneous computing environment. While MpPVM is developed on top of PVM, the process migration methodology introduced in this study is general and can be applied to any distributed software environment.
NASA Technical Reports Server (NTRS)
Bremmer, D. A.
1986-01-01
The feasibility of some off-the-shelf microprocessors and state-of-the-art software is assessed (1) as a development system for the principal investigator (PI) in the design of the experiment model, (2) as an example of applying available technology to future PIs' experiments, (3) as a system capable of operating interactively within the PCTC's simulation of the dedicated experiment processor (DEP), preferably by bringing the PI's DEP software directly into the simulation model, (4) as a system having bus compatibility with host VAX simulation computers, (5) as a system readily interfaced with mock-up panels and information displays, and (6) as a functional system for post-mission data analysis.
Salvador, R; Luque, M P; Ciudin, A; Paño, B; Buñesch, L; Sebastia, C; Nicolau, C
2016-01-01
To prospectively evaluate the usefulness of dual-energy computed tomography (DECT), with and without dedicated software, in identifying uric acid kidney stones in vivo. We studied 65 kidney stones in 63 patients. All stones were analyzed in vivo by DECT and ex vivo by spectrophotometry. We evaluated the diagnostic performance of DECT in identifying uric acid stones by analyzing the radiologic densities with dedicated software and without it (through manual measurements), as well as by analyzing the attenuation ratios of the stones at the two energies with and without the dedicated software. The six uric acid stones included were correctly identified by evaluating the attenuation ratios with a cutoff of 1.21, both with the dedicated software and without it, yielding perfect diagnostic performance with no false positives or false negatives. The analysis of stone attenuations yielded the following areas under the receiver operating characteristic curves for the classification of uric acid stones: 0.92 for the measurements done with the software and 0.89 for the manual measurements; a cutoff of 538 HU yielded 84% (42/50) diagnostic accuracy for the software and 83.1% (54/65) for the manual measurements. DECT enabled uric acid stones to be identified correctly through the ratio of the attenuations at the two energies. The results obtained with the dedicated software were similar to those obtained manually. Copyright © 2015 SERAM. Published by Elsevier España, S.L.U. All rights reserved.
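The ratio rule reported above lends itself to a one-line classifier. A minimal Python sketch follows; which energy goes in the numerator and the direction of the inequality are assumptions (the abstract gives only the 1.21 cutoff), and the attenuation values in the example are hypothetical:

    def classify_stone(hu_low_kv, hu_high_kv, ratio_cutoff=1.21):
        # Dual-energy ratio rule: uric acid stones show a lower
        # low-kV/high-kV attenuation ratio than calcified stones.
        # Numerator choice and inequality direction are assumptions.
        ratio = hu_low_kv / hu_high_kv
        label = "uric acid" if ratio < ratio_cutoff else "non-uric acid"
        return label, ratio

    print(classify_stone(410.0, 395.0))  # hypothetical HU pair -> uric acid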
Long-term Preservation of Data Analysis Capabilities
NASA Astrophysics Data System (ADS)
Gabriel, C.; Arviset, C.; Ibarra, A.; Pollock, A.
2015-09-01
While the long-term preservation of scientific data obtained by large astrophysics missions is ensured through science archives, the issue of data analysis software preservation has hardly been addressed. Efforts by large data centres have so far helped maintain some instrument- or mission-specific data reduction packages on top of high-level general-purpose data analysis software. However, it is always difficult to keep software alive without support and maintenance once the active phase of a mission is over, especially under the budgetary model followed by space agencies. We discuss the importance of extending the lifetime of dedicated data analysis packages and review diverse strategies under development at ESA using new paradigms such as Virtual Machines, Cloud Computing, and Software as a Service to keep data analysis and calibration software fully available for decades at minimal cost.
[Evaluation of Organ Dose Estimation from Indices of CT Dose Using Dose Index Registry].
Iriuchijima, Akiko; Fukushima, Yasuhiro; Ogura, Akio
Direct measurement of each patient's organ dose from computed tomography (CT) is not possible. Most methods for estimating patient organ dose rely on Monte Carlo simulation with dedicated software. However, dedicated software is too expensive for small-scale hospitals, so not every hospital can estimate organ dose this way. The purpose of this study was to evaluate a simple method of organ dose estimation using common indices of CT dose. The Monte Carlo simulation software Radimetrics (Bayer) was used to calculate organ dose and to analyze the relationship between CT dose indices and organ dose. Multidetector CT scanners from two manufacturers were compared (LightSpeed VCT, GE Healthcare; SOMATOM Definition Flash, Siemens Healthcare). Using stored patient data from Radimetrics, the relationships between CT dose indices and organ dose were expressed as formulas for estimating organ dose. The accuracy of the estimation method was compared with the results of Monte Carlo simulation using Bland-Altman plots. In the results, the size-specific dose estimate (SSDE) was a feasible index for estimating organ dose in almost all organs because it reflects each patient's size. The differences in organ dose between estimation and simulation were within 23%. In conclusion, our method of estimating organ dose from CT dose indices is convenient for clinical use and acceptably accurate.
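The estimation scheme described, per-organ linear formulas relating a CT dose index such as SSDE to the simulated organ dose, validated with Bland-Altman plots, can be sketched in a few lines of Python; the paired values below are hypothetical, not the study's data:

    import numpy as np

    # Hypothetical paired values: SSDE (mGy) vs. Monte Carlo organ dose (mGy)
    ssde = np.array([8.2, 10.5, 12.1, 14.8, 16.3])
    organ_dose = np.array([7.9, 10.9, 12.8, 15.2, 17.1])

    # Per-organ linear estimation formula: dose ~ a * SSDE + b
    a, b = np.polyfit(ssde, organ_dose, 1)
    estimated = a * ssde + b

    # Bland-Altman statistics: bias and 95% limits of agreement
    diff = estimated - organ_dose
    bias, loa = diff.mean(), 1.96 * diff.std(ddof=1)
    print(f"dose ~ {a:.2f}*SSDE {b:+.2f}; bias {bias:.2f} mGy, LoA +/-{loa:.2f} mGy")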
A field-to-desktop toolchain for X-ray CT densitometry enables tree ring analysis
De Mil, Tom; Vannoppen, Astrid; Beeckman, Hans; Van Acker, Joris; Van den Bulcke, Jan
2016-01-01
Background and Aims Disentangling tree growth requires more than ring width data only. Densitometry is considered a valuable proxy, yet laborious wood sample preparation and a lack of dedicated software limit the widespread use of density profiling for tree ring analysis. An X-ray computed tomography-based toolchain for tree increment cores is presented, which results in profile data sets suitable for visual exploration as well as density-based pattern matching. Methods Two temperate species (Quercus petraea, Fagus sylvatica) and one tropical species (Terminalia superba) were used for density profiling using an X-ray computed tomography facility with custom-made sample holders and dedicated processing software. Key Results Density-based pattern matching is developed and is able to detect anomalies in ring series, which can be corrected via interactive software. Conclusions A digital workflow allows generation of structure-corrected profiles of large sets of cores in a short time span that provide sufficient intra-annual density information for tree ring analysis. Furthermore, visual exploration of such data sets is of high value. The dated profiles can be used for high-resolution chronologies and also offer opportunities for fast screening of lesser-studied tropical tree species. PMID:27107414
A communications model for an ISAS to NASA span link
NASA Technical Reports Server (NTRS)
Green, James L.; Mcguire, Robert E.; Lopez-Swafford, Brian
1987-01-01
The authors propose that an initial computer-to-computer communication link use the public packet-switched networks (PPSN) Venus-P in Japan and TELENET in the U.S. When the traffic warrants it, this link would then be upgraded to a dedicated leased line that connects directly into the Space Physics Analysis Network (SPAN). The proposed system of hardware and software will easily support migration to such a dedicated link and therefore provides a cost-effective approach to the network problem. Once a dedicated line becomes operational, it is suggested that the public network link continue to coexist with it, providing a backup capability.
Mining collections of compounds with Screening Assistant 2
2012-01-01
Background High-throughput screening assays have become the starting point of many drug discovery programs for large pharmaceutical companies as well as academic organisations. Despite the increasing throughput of screening technologies, the almost infinite chemical space remains out of reach, calling for tools dedicated to the analysis and selection of the compound collections intended to be screened. Results We present Screening Assistant 2 (SA2), an open-source Java software package dedicated to the storage and analysis of small to very large chemical libraries. SA2 stores unique molecules in a MySQL database and encapsulates several chemoinformatics methods, among which: provider management, interactive visualisation, scaffold analysis, diverse subset creation, descriptor calculation, sub-structure / SMARTS search, similarity search and filtering. We illustrate the use of SA2 by analysing the composition of a database of 15 million compounds collected from 73 providers, in terms of scaffolds, frameworks, and undesired properties as defined by recently proposed HTS SMARTS filters. We also show how the software can be used to create diverse libraries based on existing ones. Conclusions Screening Assistant 2 is a user-friendly, open-source software package that can be used to manage collections of compounds and perform simple to advanced chemoinformatics analyses. Its modular design and growing documentation facilitate the addition of new functionalities, calling for contributions from the community. The software can be downloaded at http://sa2.sourceforge.net/. PMID:23327565
Mining collections of compounds with Screening Assistant 2.
Guilloux, Vincent Le; Arrault, Alban; Colliandre, Lionel; Bourg, Stéphane; Vayer, Philippe; Morin-Allory, Luc
2012-08-31
High-throughput screening assays have become the starting point of many drug discovery programs for large pharmaceutical companies as well as academic organisations. Despite the increasing throughput of screening technologies, the almost infinite chemical space remains out of reach, calling for tools dedicated to the analysis and selection of the compound collections intended to be screened. We present Screening Assistant 2 (SA2), an open-source Java software package dedicated to the storage and analysis of small to very large chemical libraries. SA2 stores unique molecules in a MySQL database and encapsulates several chemoinformatics methods, among which: provider management, interactive visualisation, scaffold analysis, diverse subset creation, descriptor calculation, sub-structure / SMARTS search, similarity search and filtering. We illustrate the use of SA2 by analysing the composition of a database of 15 million compounds collected from 73 providers, in terms of scaffolds, frameworks, and undesired properties as defined by recently proposed HTS SMARTS filters. We also show how the software can be used to create diverse libraries based on existing ones. Screening Assistant 2 is a user-friendly, open-source software package that can be used to manage collections of compounds and perform simple to advanced chemoinformatics analyses. Its modular design and growing documentation facilitate the addition of new functionalities, calling for contributions from the community. The software can be downloaded at http://sa2.sourceforge.net/.
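The SMARTS-based filtering described above is easy to illustrate; a minimal sketch using RDKit in Python rather than SA2's own Java stack, with two hypothetical filter patterns that are not the published HTS filter set:

    from rdkit import Chem

    # Hypothetical undesirable-substructure filters (illustrative only)
    filters = {
        "nitro": Chem.MolFromSmarts("[N+](=O)[O-]"),
        "acyl_halide": Chem.MolFromSmarts("C(=O)[Cl,Br,I]"),
    }

    def flag_undesired(smiles):
        # Return the names of the filters a compound matches
        mol = Chem.MolFromSmiles(smiles)
        if mol is None:
            return ["unparsable"]
        return [name for name, patt in filters.items()
                if mol.HasSubstructMatch(patt)]

    print(flag_undesired("O=[N+]([O-])c1ccccc1"))  # nitrobenzene -> ['nitro']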
pyLIMA : an open source microlensing software
NASA Astrophysics Data System (ADS)
Bachelet, Etienne
2017-01-01
Planetary microlensing is a unique tool to detect cold planets around low-mass stars and is approaching a watershed in discoveries as near-future missions incorporate dedicated surveys. NASA and ESA have decided to complement WFIRST-AFTA and Euclid with microlensing programs to enrich our statistics about this planetary population. Of the many challenges inherent in these missions, the data analysis is of primary importance, yet it is often perceived as a time-consuming, complex and daunting barrier to participation in the field. We present the first open-source modeling software for conducting a microlensing analysis. This software is written in Python and uses existing packages as much as possible.
Evaluation of the BreastSimulator software platform for breast tomography
NASA Astrophysics Data System (ADS)
Mettivier, G.; Bliznakova, K.; Sechopoulos, I.; Boone, J. M.; Di Lillo, F.; Sarno, A.; Castriconi, R.; Russo, P.
2017-08-01
The aim of this work was the evaluation of BreastSimulator, a breast x-ray imaging simulation software package, as a tool for the creation of 3D uncompressed digital breast models and for the simulation and optimization of computed tomography (CT) scanners dedicated to the breast. Eight 3D digital breast phantoms were created with glandular fractions in the range 10%-35%. The models have different sizes and realistically modelled anatomical features. X-ray CT projections were simulated for a dedicated cone-beam CT scanner and reconstructed with the FDK algorithm. X-ray projection images were simulated for 5 mono-energetic beams (27, 32, 35, 43 and 51 keV) and 3 poly-energetic x-ray spectra typically employed in current CT scanners dedicated to the breast (49, 60, or 80 kVp). Clinical CT images acquired from two different clinical breast CT scanners were used for comparison purposes. The quantitative evaluation included calculation of the power-law exponent, β, from simulated and real breast tomograms, based on the power spectrum fitted with a function of the spatial frequency, f, of the form S(f) = α/f^β. The breast models were validated by comparison against clinical breast CT and published data. We found that the calculated β coefficients were close to those of clinical CT data from a dedicated breast CT scanner and to reported data in the literature. In evaluating the ability of the BreastSimulator software package to generate breast models suitable for use in breast CT imaging, we found that the breast phantoms produced with the software tool can reproduce the anatomical structure of real breasts, as evaluated by calculating the β exponent from the power spectral analysis of simulated images. As such, this research tool might contribute considerably to the further development, testing and optimisation of breast CT imaging techniques.
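The β extraction described, fitting the tomogram power spectrum with S(f) = α/f^β, amounts to a straight-line fit in log-log space. A minimal sketch, in which the frequency fit range and the use of the full 2D spectrum (rather than a radially averaged 1D spectrum) are assumptions, not the paper's exact procedure:

    import numpy as np

    def power_law_beta(image, f_min=0.05, f_max=0.45):
        # 2D power spectrum of the (mean-subtracted) image
        spectrum = np.abs(np.fft.fftshift(np.fft.fft2(image - image.mean()))) ** 2
        ny, nx = image.shape
        fy = np.fft.fftshift(np.fft.fftfreq(ny))
        fx = np.fft.fftshift(np.fft.fftfreq(nx))
        radius = np.hypot(*np.meshgrid(fy, fx, indexing="ij"))
        # Fit log S = log(alpha) - beta * log(f) over the chosen band
        mask = (radius > f_min) & (radius < f_max)
        slope, _ = np.polyfit(np.log(radius[mask]), np.log(spectrum[mask]), 1)
        return -slope  # S(f) ~ f**(-beta)

    beta = power_law_beta(np.random.rand(256, 256))  # ~0 for white noise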
What's Happening in the Software Engineering Laboratory?
NASA Technical Reports Server (NTRS)
Pajerski, Rose; Green, Scott; Smith, Donald
1995-01-01
Since 1976 the Software Engineering Laboratory (SEL) has been dedicated to understanding and improving the way in which one NASA organization, the Flight Dynamics Division (FDD) at Goddard Space Flight Center, develops, maintains, and manages complex flight dynamics systems. This paper presents an overview of recent activities and studies in the SEL, using as a framework the SEL's organizational goals and experience-based software improvement approach. It focuses on two SEL experience areas: (1) the evolution of the measurement program and (2) an analysis of three generations of Cleanroom experiments.
BAM/DASS: Data Analysis Software for Sub-Microarcsecond Astrometry Device
NASA Astrophysics Data System (ADS)
Gardiol, D.; Bonino, D.; Lattanzi, M. G.; Riva, A.; Russo, F.
2010-12-01
The INAF - Osservatorio Astronomico di Torino is part of the Data Processing and Analysis Consortium (DPAC) for Gaia, a cornerstone mission of the European Space Agency. Gaia will perform global astrometry by means of two telescopes looking at the sky along two different lines of sight separated by a fixed angle, called the basic angle. Knowledge of the basic angle fluctuations at the sub-microarcsecond level over periods on the order of a minute is crucial to reach the mission goals. A specific device, the Basic Angle Monitoring (BAM), will be dedicated to this purpose. We present here the software system we are developing to analyze the BAM data and recover the basic angle variations. This tool is integrated into the whole DPAC data analysis software.
Dedicated computer system AOTK for image processing and analysis of horse navicular bone
NASA Astrophysics Data System (ADS)
Zaborowicz, M.; Fojud, A.; Koszela, K.; Mueller, W.; Górna, K.; Okoń, P.; Piekarska-Boniecka, H.
2017-07-01
The aim of the research was to develop the dedicated application AOTK (Polish: Analiza Obrazu Trzeszczki Kopytowej) for image processing and analysis of the horse navicular bone. The application was built with Visual Studio 2013 on the .NET platform, and the image processing and analysis algorithms were implemented using the AForge.NET libraries. The implemented algorithms enable accurate extraction of the characteristics of navicular bones and saving of the data to external files. The modules implemented in AOTK allow calculation of distances selected by the user and preliminary assessment of how well the structure of the examined objects is preserved. The application interface is designed in a way that ensures the user the best possible view of the analyzed images.
Introduction to Flight Test Engineering (Introduction aux techniques des essais en vol)
2005-07-01
…or aircraft parameters • Calculations in the frequency domain (Fast Fourier Transform) • Data analysis with dedicated software for: • Signal … density • Fast Fourier Transform • Transfer function analysis • Frequency response analysis • Etc. PRESENTATION: color/black & white display screen … envelope by operating the airplane at increasing ranges - representing increasing risk - of engine operation, airspeeds both fast and slow, altitude…
A field-to-desktop toolchain for X-ray CT densitometry enables tree ring analysis.
De Mil, Tom; Vannoppen, Astrid; Beeckman, Hans; Van Acker, Joris; Van den Bulcke, Jan
2016-06-01
Disentangling tree growth requires more than ring width data only. Densitometry is considered a valuable proxy, yet laborious wood sample preparation and a lack of dedicated software limit the widespread use of density profiling for tree ring analysis. An X-ray computed tomography-based toolchain for tree increment cores is presented, which results in profile data sets suitable for visual exploration as well as density-based pattern matching. Two temperate species (Quercus petraea, Fagus sylvatica) and one tropical species (Terminalia superba) were used for density profiling using an X-ray computed tomography facility with custom-made sample holders and dedicated processing software. Density-based pattern matching is developed and is able to detect anomalies in ring series, which can be corrected via interactive software. A digital workflow allows generation of structure-corrected profiles of large sets of cores in a short time span that provide sufficient intra-annual density information for tree ring analysis. Furthermore, visual exploration of such data sets is of high value. The dated profiles can be used for high-resolution chronologies and also offer opportunities for fast screening of lesser-studied tropical tree species. © The Author 2016. Published by Oxford University Press on behalf of the Annals of Botany Company. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
NASA Astrophysics Data System (ADS)
Chen, W.; Bauer, J.; Kurz, C.; Tessonnier, T.; Handrack, J.; Haberer, T.; Debus, J.; Parodi, K.
2017-01-01
We present the workflow of the offline PET-based range verification method used at the Heidelberg Ion Beam Therapy Center, detailing the functionalities of an in-house developed software application, SimInterface14, with which range analysis is performed. Moreover, we introduce the design of a decision support system that assesses uncertainties and assists physicians in decision making for plan adaptation.
NASA Astrophysics Data System (ADS)
Zaraska, Leszek; Stępniowski, Wojciech J.; Sulka, Grzegorz D.; Ciepiela, Eryk; Jaskuła, Marian
2014-02-01
Anodic porous alumina layers were fabricated by two-step self-organized anodization in 0.3 M oxalic acid under anodizing potentials ranging from 30 to 60 V at two different temperatures (10 and 17 °C). The effect of anodizing conditions on the structural features and pore arrangement of AAO was investigated in detail using a dedicated executable publication combined with ImageJ software. With increasing anodizing potential, a linear increase in the average pore diameter, interpore distance, wall thickness and barrier layer thickness, as well as a decrease in the pore density, were observed. In addition, higher pore diameter and porosity values were obtained for samples anodized at the elevated temperature, independent of the anodizing potential. The degree of pore order was investigated on the basis of Delaunay triangulations (defect maps) and the calculation of pair distribution and angle distribution functions (PDF and ADF), respectively. All methods confirmed that, in order to obtain nanoporous alumina with the best, hexagonal pore arrangement, a potential of 40 V should be applied during anodization. It was confirmed that the dedicated executable publication can be used for fast and comprehensive analysis of the nanopore arrangement and structural features of nanoporous oxide layers.
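The Delaunay-based defect mapping mentioned above reduces to counting each pore's triangulation neighbours: interior pores in a perfect hexagonal lattice have exactly six. A minimal SciPy sketch (boundary pores never reach six neighbours and would need to be excluded in practice):

    import numpy as np
    from scipy.spatial import Delaunay

    def pore_coordination(centers):
        # centers: (N, 2) array of pore-centre coordinates
        tri = Delaunay(centers)
        indptr, _ = tri.vertex_neighbor_vertices
        return np.diff(indptr)  # neighbour count per pore

    centers = np.random.rand(500, 2) * 1000.0  # hypothetical pore centres (nm)
    counts = pore_coordination(centers)
    defect_fraction = np.mean(counts != 6)  # deviation from hexagonal order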
Singendonk, M M J; Rosen, R; Oors, J; Rommel, N; van Wijk, M P; Benninga, M A; Nurko, S; Omari, T I
2017-11-01
Subtyping achalasia by high-resolution manometry (HRM) is clinically relevant, as response to therapy and prognosis have been shown to vary accordingly. The aim of this study was to assess the inter- and intrarater reliability of diagnosing achalasia and of achalasia subtyping in children using the Chicago Classification (CC) V3.0. Six observers analyzed 40 pediatric HRM recordings (22 achalasia and 18 non-achalasia) twice using dedicated analysis software (ManoView 3.0, Given Imaging, Los Angeles, CA, USA). Integrated relaxation pressure (IRP4s), distal contractile integral (DCI), intrabolus pressurization pattern (IBP), and distal latency (DL) were extracted and analyzed hierarchically. Cohen's κ (2 raters), Fleiss' κ (>2 raters) and the intraclass correlation coefficient (ICC) were used for categorical and ordinal data, respectively. Based on the results of the dedicated analysis software alone, intra- and interrater reliability was excellent and moderate (κ=0.89 and κ=0.52, respectively) for differentiating achalasia from non-achalasia. For subtyping achalasia, reliability decreased to substantial and fair (κ=0.72 and κ=0.28, respectively). When observers were allowed to change the software-driven diagnosis according to their own interpretation of the manometric patterns, intra- and interrater reliability increased both for diagnosing achalasia (κ=0.98 and κ=0.92, respectively) and for subtyping achalasia (κ=0.79 and κ=0.58, respectively). Intra- and interrater agreement for diagnosing achalasia using HRM and the CC was very good to excellent when the results of the automated analysis software were interpreted by experienced observers. More variability was seen when relying solely on the software-driven diagnosis and when subtyping achalasia. Therefore, diagnosing and subtyping achalasia should be performed in pediatric motility centers with significant expertise. © 2017 John Wiley & Sons Ltd.
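For reference, the two-rater agreement statistic used throughout this study is compact to compute. A minimal sketch with hypothetical subtype labels:

    import numpy as np

    def cohens_kappa(rater_a, rater_b):
        a, b = np.asarray(rater_a), np.asarray(rater_b)
        p_observed = np.mean(a == b)
        # Chance agreement from the raters' marginal label frequencies
        labels = np.union1d(a, b)
        p_expected = sum(np.mean(a == l) * np.mean(b == l) for l in labels)
        return (p_observed - p_expected) / (1.0 - p_expected)

    # Hypothetical achalasia subtype calls (1-3) for ten recordings
    k = cohens_kappa([1, 1, 2, 3, 1, 2, 2, 3, 1, 1],
                     [1, 1, 2, 3, 1, 2, 3, 3, 1, 1])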
Welding process modelling and control
NASA Technical Reports Server (NTRS)
Romine, Peter L.; Adenwala, Jinen A.
1993-01-01
The research and analysis performed, the software developed, and the hardware/software recommendations made during 1992 in the development of the PC-based data acquisition system supporting Welding Process Modeling and Control are reported. A need was identified by the Metals Processing Branch of NASA Marshall Space Flight Center for a mobile data acquisition and analysis system, customized for welding measurement and calibration. Several hardware configurations were evaluated and a PC-based system was chosen. The Welding Measurement System (WMS) is a dedicated instrument, strictly for data acquisition and analysis. Although the WMS supports many of the functions associated with process control, this system is not intended to be used for welding process control.
Real time software for a heat recovery steam generator control system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Valdes, R.; Delgadillo, M.A.; Chavez, R.
1995-12-31
This paper addresses the development and successful implementation of real-time software for the Heat Recovery Steam Generator (HRSG) control system of a combined cycle power plant. The real-time software for the HRSG control system physically resides in a Control and Acquisition System (SAC), which is a component of a distributed control system (DCS). The SAC is a programmable controller. The DCS installed at the Gomez Palacio power plant in Mexico accomplishes the functions of logic, analog and supervisory control. The DCS is based on microprocessors, and the architecture consists of workstations operating as a Man-Machine Interface (MMI), linked to SAC controllers by means of a communication system. The HRSG real-time software is composed of an operating system, drivers, dedicated computer programs and application computer programs. The operating system used for the development of this software was the MultiTasking Operating System (MTOS). The application software developed at IIE for the HRSG control system basically consists of a set of digital algorithms for the regulation of the main process variables of the HRSG. By using the multitasking feature of MTOS, the algorithms are executed pseudo-concurrently. In this way, the application programs continuously use the resources of the operating system to perform their functions through a uniform service interface. The application software of the HRSG consists of three tasks, each with dedicated responsibilities. The drivers were developed for handling the hardware resources of the SAC controller, which in turn allows signal acquisition and data communication with the MMI. The dedicated programs were developed for hardware diagnostics, task initialization, access to the database and fault tolerance. The application software and the dedicated software for the HRSG control system were developed using the C programming language for its compactness, portability and efficiency.
Implementation and Testing of VLBI Software Correlation at the USNO
NASA Technical Reports Server (NTRS)
Fey, Alan; Ojha, Roopesh; Boboltz, Dave; Geiger, Nicole; Kingham, Kerry; Hall, David; Gaume, Ralph; Johnston, Ken
2010-01-01
The Washington Correlator (WACO) at the U.S. Naval Observatory (USNO) is a dedicated VLBI processor built on custom hardware of ASIC design. The WACO is currently over 10 years old and is nearing the end of its expected lifetime. Plans for implementation and testing of software correlation at the USNO are currently being considered. The VLBI correlation process is, by its very nature, well suited to a parallelized computing environment. Commercial off-the-shelf computer hardware has advanced in processing power to the point where software correlation is now both economically and technologically feasible. The advantages of software correlation are manifold but include flexibility, scalability, and easy adaptability to changing environments and requirements. We discuss our experience with and plans for use of software correlation at USNO with emphasis on the use of the DiFX software correlator.
RSAT 2015: Regulatory Sequence Analysis Tools
Medina-Rivera, Alejandra; Defrance, Matthieu; Sand, Olivier; Herrmann, Carl; Castro-Mondragon, Jaime A.; Delerce, Jeremy; Jaeger, Sébastien; Blanchet, Christophe; Vincens, Pierre; Caron, Christophe; Staines, Daniel M.; Contreras-Moreira, Bruno; Artufel, Marie; Charbonnier-Khamvongsa, Lucie; Hernandez, Céline; Thieffry, Denis; Thomas-Chollier, Morgane; van Helden, Jacques
2015-01-01
RSAT (Regulatory Sequence Analysis Tools) is a modular software suite for the analysis of cis-regulatory elements in genome sequences. Its main applications are (i) motif discovery, appropriate to genome-wide data sets like ChIP-seq, (ii) transcription factor binding motif analysis (quality assessment, comparisons and clustering), (iii) comparative genomics and (iv) analysis of regulatory variations. Nine new programs have been added to the 43 described in the 2011 NAR Web Software Issue, including a tool to extract sequences from a list of coordinates (fetch-sequences from UCSC), novel programs dedicated to the analysis of regulatory variants from GWAS or population genomics (retrieve-variation-seq and variation-scan), a program to cluster motifs and visualize the similarities as trees (matrix-clustering). To deal with the drastic increase of sequenced genomes, RSAT public sites have been reorganized into taxon-specific servers. The suite is well-documented with tutorials and published protocols. The software suite is available through Web sites, SOAP/WSDL Web services, virtual machines and stand-alone programs at http://www.rsat.eu/. PMID:25904632
A digital acquisition and elaboration system for nuclear fast pulse detection
NASA Astrophysics Data System (ADS)
Esposito, B.; Riva, M.; Marocco, D.; Kaschuck, Y.
2007-03-01
A new digital acquisition and elaboration system has been developed and assembled in ENEA-Frascati for the direct sampling of fast pulses from nuclear detectors such as scintillators and diamond detectors. The system is capable of performing digital sampling of the pulses (200 MSamples/s, 14-bit) and simultaneous (compressed) data transfer for further storage and software elaboration. The FPGA-based design is oriented to real-time applications and has been developed to allow acquisition with no loss of pulses and data storage over long time intervals (tens of seconds at MHz pulse count rates) without the need for large on-board memory. Dedicated pulse analysis software, written in LabVIEW™, performs the treatment of the acquired pulses, including pulse recognition, pile-up rejection, baseline removal, pulse-shape particle separation and pulse height spectrum analysis. The acquisition and pre-elaboration programs have been fully integrated with the analysis software.
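Two of the elaboration steps listed, baseline removal and pulse-height extraction, can be illustrated compactly. A minimal NumPy sketch with a synthetic trace, not the LabVIEW implementation; the baseline window length is an assumption:

    import numpy as np

    def pulse_height(trace, n_baseline=50):
        # Baseline from the first samples (assumed pulse-free), then
        # height as the baseline-corrected maximum. Real pipelines add
        # pile-up rejection and pulse-shape discrimination.
        baseline = np.median(trace[:n_baseline])
        return (trace - baseline).max()

    rng = np.random.default_rng(0)
    t = np.arange(400)
    trace = 5.0 + 0.3 * rng.standard_normal(400)            # noisy baseline
    trace += 80.0 * np.exp(-(t - 120) / 30.0) * (t >= 120)  # fast pulse
    print(pulse_height(trace))  # ~80 ADC counts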
Wayland, Matthew T; Defaye, Arnaud; Rocha, Joao; Jayaram, Satish Arcot; Royet, Julien; Miguel-Aliaga, Irene; Leulier, François; Cognigni, Paola
2014-10-01
The intestinal physiology of Drosophila melanogaster can be monitored in an integrative, non-invasive manner by analysing graphical features of the excreta produced by flies fed on a dye-supplemented diet. This assay has been used by various labs to explore gut function and its regulation. To facilitate its use, we present here a free, stand-alone dedicated software tool for the analysis of fly excreta. The Ultimate Reader of Dung (T.U.R.D.) is designed to offer a flexible environment for a wide range of experimental designs, with special attention to automation and high-throughput processing. This software detects the distinctive changes in acid-base and water balance previously reported to occur in response to dietary challenges and mating. We have used T.U.R.D. to test the contribution of the bacterial environment of the flies to various intestinal parameters including the established diet- and mating-triggered responses. To this end, we have analysed the faecal patterns of flies reared in germ-free conditions, upon re-association with controlled microbiota and subjected to food-borne or systemic, non-lethal bacterial infections. We find that the tested faecal outputs are unchanged in all these conditions, suggesting that the impact of the bacterial environment on the intestinal features highlighted by faecal deposit analysis is minimal. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.
Casella, Ivan Benaduce; Fukushima, Rodrigo Bono; Marques, Anita Battistini de Azevedo; Cury, Marcus Vinícius Martins; Presti, Calógero
2015-03-01
To compare a new dedicated software program and Adobe Photoshop for gray-scale median (GSM) analysis of B-mode images of carotid plaques. A series of 42 carotid plaques generating ≥50% diameter stenosis was evaluated by a single observer. The best segment for visualization of the internal carotid artery plaque was identified on a single longitudinal view and images were recorded in JPEG format. Plaque analysis was performed with both programs. After normalization of image intensity (blood = 0, adventitial layer = 190), histograms were obtained after manual delineation of the plaque. Results were compared with the nonparametric Wilcoxon signed rank test and Kendall tau-b correlation analysis. GSM ranged from 0 to 100 with Adobe Photoshop and from 0 to 96 with IMTPC, with a high degree of similarity between image pairs and a highly significant correlation (R = 0.94, p < .0001). The IMTPC software appears suitable for GSM analysis of carotid plaques. © 2014 Wiley Periodicals, Inc.
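The normalisation and median computation behind GSM is itself only a few lines; a minimal sketch assuming boolean masks for the plaque and the two reference regions (blood = 0, adventitia = 190, as in the study):

    import numpy as np

    def gray_scale_median(image, plaque_mask, blood_mask, adventitia_mask):
        # Linear rescaling so median blood -> 0 and median adventitia -> 190
        blood = np.median(image[blood_mask])
        adventitia = np.median(image[adventitia_mask])
        normalised = (image - blood) * 190.0 / (adventitia - blood)
        return np.median(normalised[plaque_mask])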
Design sensitivity analysis and optimization tool (DSO) for sizing design applications
NASA Technical Reports Server (NTRS)
Chang, Kuang-Hua; Choi, Kyung K.; Perng, Jyh-Hwa
1992-01-01
The DSO tool, a structural design software system that provides the designer with a graphics-based, menu-driven design environment for easy design optimization in general applications, is presented. Three design stages, preprocessing, design sensitivity analysis, and postprocessing, are implemented in the DSO to allow the designer to carry out the design process systematically. A framework, including database, user interface, foundation class, and remote module, has been designed and implemented to facilitate software development for the DSO. A number of dedicated commercial software packages have been integrated into the DSO to support the design procedures. Instead of parameterizing a finite element model (FEM), design parameters are defined on a geometric model associated with physical quantities, and continuum design sensitivity analysis theory is implemented to compute design sensitivity coefficients using postprocessing data from the analysis codes. A tracked-vehicle road wheel is given as a sizing design application to demonstrate the DSO's easy and convenient design optimization process.
NASA Astrophysics Data System (ADS)
Zaborowicz, M.; Przybył, J.; Koszela, K.; Boniecki, P.; Mueller, W.; Raba, B.; Lewicki, A.; Przybył, K.
2014-04-01
The aim of the project was to develop software which, on the basis of an image of a greenhouse tomato, allows extraction of its characteristics. Data gathered during image analysis and processing were used to build learning sets for artificial neural networks. The program can process pictures in JPEG format, acquire statistical information from a picture and export it to an external file. The software is intended for batch analysis of the collected research material, with the obtained information saved as a CSV file. The program analyzes 33 independent parameters that implicitly describe the tested image. The application is dedicated to the processing and image analysis of greenhouse tomatoes, but it can also be used for the analysis of other fruits and vegetables of a spherical shape.
Mobile Applications and Multi-User Virtual Reality Simulations
NASA Technical Reports Server (NTRS)
Gordillo, Orlando Enrique
2016-01-01
This is my third internship with NASA and my second one at the Johnson Space Center. I work within the engineering directorate in ER7 (Software, Robotics and Simulation Division) at a graphics lab called IGOAL. We are a very well-rounded lab because we have dedicated software developers and dedicated 3D artists, and when you combine the two, what you get is the ability to create many different things such as interactive simulations, 3D models, animations, and mobile applications.
A new software for dimensional measurements in 3D endodontic root canal instrumentation.
Sinibaldi, Raffaele; Pecci, Raffaella; Somma, Francesco; Della Penna, Stefania; Bedini, Rossella
2012-01-01
The main issue to be faced in obtaining size estimates of the 3D modification of the root canal after endodontic treatment is the co-registration of the image stacks obtained through micro computed tomography (micro-CT) scans before and after treatment. Here, quantitative analysis of micro-CT images has been performed by means of new dedicated software targeted at the analysis of the root canal after endodontic instrumentation. This software analytically calculates the best superposition between the pre- and post-treatment structures using the inertia tensor of the tooth. This strategy avoids minimization procedures, which can be user-dependent and time-consuming. Once the co-registration has been achieved, dimensional measurements are performed by simultaneous evaluation of quantitative parameters over the two superimposed stacks of micro-CT images. The software automatically calculates the changes in volume, surface and 3D symmetry axes that occur after instrumentation. The calculation is based on direct comparison of the canal and the canal branches selected by the user on the pre-treatment image stack.
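The inertia-tensor superposition described above can be sketched as follows; pre_volume and post_volume are hypothetical binary voxel arrays, and the eigenvector sign and ordering ambiguities glossed over here must be resolved in a real implementation:

    import numpy as np

    def principal_axes(volume):
        # Centroid and principal axes of a binary voxel volume
        coords = np.argwhere(volume > 0).astype(float)
        centroid = coords.mean(axis=0)
        d = coords - centroid
        # Inertia tensor: I = sum(|d|^2 * Id - outer(d, d))
        inertia = (d ** 2).sum() * np.eye(3) - d.T @ d
        _, eigvecs = np.linalg.eigh(inertia)
        return centroid, eigvecs  # columns are the symmetry axes

    c_pre, axes_pre = principal_axes(pre_volume)     # pre-treatment stack
    c_post, axes_post = principal_axes(post_volume)  # post-treatment stack
    rotation = axes_pre @ axes_post.T  # maps post axes onto pre axes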
RSAT 2015: Regulatory Sequence Analysis Tools.
Medina-Rivera, Alejandra; Defrance, Matthieu; Sand, Olivier; Herrmann, Carl; Castro-Mondragon, Jaime A; Delerce, Jeremy; Jaeger, Sébastien; Blanchet, Christophe; Vincens, Pierre; Caron, Christophe; Staines, Daniel M; Contreras-Moreira, Bruno; Artufel, Marie; Charbonnier-Khamvongsa, Lucie; Hernandez, Céline; Thieffry, Denis; Thomas-Chollier, Morgane; van Helden, Jacques
2015-07-01
RSAT (Regulatory Sequence Analysis Tools) is a modular software suite for the analysis of cis-regulatory elements in genome sequences. Its main applications are (i) motif discovery, appropriate to genome-wide data sets like ChIP-seq, (ii) transcription factor binding motif analysis (quality assessment, comparisons and clustering), (iii) comparative genomics and (iv) analysis of regulatory variations. Nine new programs have been added to the 43 described in the 2011 NAR Web Software Issue, including a tool to extract sequences from a list of coordinates (fetch-sequences from UCSC), novel programs dedicated to the analysis of regulatory variants from GWAS or population genomics (retrieve-variation-seq and variation-scan), a program to cluster motifs and visualize the similarities as trees (matrix-clustering). To deal with the drastic increase of sequenced genomes, RSAT public sites have been reorganized into taxon-specific servers. The suite is well-documented with tutorials and published protocols. The software suite is available through Web sites, SOAP/WSDL Web services, virtual machines and stand-alone programs at http://www.rsat.eu/. © The Author(s) 2015. Published by Oxford University Press on behalf of Nucleic Acids Research.
Analysis of artery blood flow before and after angioplasty
NASA Astrophysics Data System (ADS)
Tomaszewski, Michał; Baranowski, Paweł; Małachowski, Jerzy; Damaziak, Krzysztof; Bukała, Jakub
2018-01-01
The study presents a comparison of results obtained from numerical simulations of blood flow in two different arteries. One of them was considered to be narrowed in order to simulate arteriosclerosis obstructing the blood flow in the vessel, whereas the second represents the vessel after angioplasty treatment. During the treatment, a biodegradable stent is inserted into the artery, which prevents the vessel walls from collapsing. The treatment was simulated numerically using the finite element method. The final mesh geometry obtained from the analysis was exported to dedicated software in order to create the geometry of a flow domain inside the artery with the stent. The flow analysis was conducted in ANSYS Fluent software with non-deformable vessel walls.
NASA Astrophysics Data System (ADS)
Foucher, Johann; Labrosse, Aurelien; Dervillé, Alexandre; Zimmermann, Yann; Bernard, Guilhem; Martinez, Sergio; Grönqvist, Hanna; Baderot, Julien; Pinzan, Florian
2017-03-01
The development and integration of new materials and structures at the nanoscale require multiple parallel characterizations in order to control mostly physico-chemical properties as a function of the application. Among these properties are physical properties such as size, shape, specific surface area, aspect ratio, agglomeration/aggregation state, size distribution, surface morphology/topography, structure (including crystallinity and defect structure) and solubility, and chemical properties such as structural formula/molecular structure, composition (including degree of purity, known impurities or additives), phase identity, surface chemistry (composition, charge, tension, reactive sites, physical structure, photocatalytic properties, zeta potential) and hydrophilicity/lipophilicity. Depending on the final material formulation (aerosol, powder, nanostructuration…) and the industrial application (semiconductor, cosmetics, chemistry, automotive…), a fleet of complementary characterization equipment must be used in synergy for accurate process tuning and high production yield. This synergy between equipment, so-called hybrid metrology, consists in using the strengths of each technique in order to reduce the global uncertainty for better and faster process control. The only way to succeed in this exercise is to use a data fusion methodology. In this paper, we will introduce the work that has been done to create the first generic hybrid metrology software platform dedicated to nanotechnology process control. The first part will be dedicated to process flow modeling related to a fleet of metrology tools. The second part will introduce the concept of an entity model, which describes the various parameters that have to be extracted. The entity model is fed with data analysis as a function of the application (automatic or semi-automated analysis). The final part will introduce two ways of performing data fusion on real data coming from imaging (SEM, TEM, AFM) and non-imaging techniques (SAXS). The first approach is dedicated to high-level fusion, which is the art of combining various populations of results from homogeneous or heterogeneous tools, taking into account the precision and repeatability of each of them, to obtain a new, more accurate result. The second approach is dedicated to deep-level fusion, which is the art of combining raw data from various tools in order to create new raw data. We will introduce a new concept of a virtual tool creator based on deep-level fusion. As a conclusion, we will discuss the implementation of hybrid metrology in the semiconductor environment for advanced process control.
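The paper does not spell out its high-level fusion algorithm; one standard reading, combining each tool's result weighted by its stated precision, is the inverse-variance weighted mean sketched below with hypothetical linewidth measurements:

    import numpy as np

    def fuse_measurements(values, sigmas):
        # Inverse-variance weighted mean and its standard uncertainty
        values = np.asarray(values, float)
        weights = 1.0 / np.asarray(sigmas, float) ** 2
        fused = np.sum(weights * values) / weights.sum()
        return fused, 1.0 / np.sqrt(weights.sum())

    # Same linewidth measured by CD-SEM, AFM and scatterometry (nm, hypothetical)
    fused, sigma = fuse_measurements([24.8, 25.3, 25.0], [0.4, 0.6, 0.2])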
Using Flash Technology for Motivation and Assessment
ERIC Educational Resources Information Center
Deal, Walter F., III
2004-01-01
A visit to most any technology education laboratory or classroom will reveal that computers, software, and multimedia software are rapidly becoming a mainstay in learning about technology and technological literacy. Almost all technology labs have at least several computers dedicated to specialized software or hardware such as Computer-aided…
2017-12-04
…gap spacing.[92,93] By running current through an EBL-fabricated gap array, it has been shown to be possible to impact atomic positions within a… Spectra were collected and the instrument was run using Wire 2.0 software operating on a dedicated computer. 2.5 Data Analysis: Data analysis… accomplished using the Unaxis VLR 700 Etch PM3-Dielectric etch. For this step it is important to first run the process on a dummy wafer to…
Dang, Catherine; Phuong, Thomas; Beddag, Mahmoud; Vega, Anabel; Denis, Céline
2018-07-01
To present a data model for clinical legal medicine and the software based on that data model for both practitioners and researchers. The main functionalities of the presented software are computer-assisted production of medical certificates and data capture, storage and retrieval. The data model and the software were jointly developed by the department of forensic medicine of the Jean Verdier Hospital (Bondy, France) and a bioinformatics laboratory (LIMICS, Paris Universities 6 and 13) between November 2015 and May 2016. The data model was built from four sources: i) a template used in our department for producing standardised medical certificates; ii) a random sample of medical certificates produced by the forensic department; iii) prior consensus between four healthcare professionals (two forensic practitioners, a psychologist and a forensic psychiatrist); and iv) anatomical dictionaries. The trial version of the open-source software was first designed for the examination of physical assault survivors. A UML-like data model dedicated to clinical legal practice was built. The data model describes the terminology for examinations of sexual assault survivors, physical assault survivors, individuals kept in police custody and undocumented migrants undergoing age estimation. A trial version of software relying on the data model was developed and tested by three physicians. The software allows file archiving, standardised data collection and extraction, and assistance with certificate generation. It can be used for research purposes through data exchange and analysis. Despite some current limitations of use, it is a tool that can be shared and used by other departments of forensic medicine and other specialties, improving data management and exploitation. Full integration with external sources and analytics software, and use of a semantic interoperability framework, are planned for the coming months. Copyright © 2016 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.
NASA Astrophysics Data System (ADS)
Singh, H.; Donetsky, D.; Liu, J.; Attenkofer, K.; Cheng, B.; Trelewicz, J. R.; Lubomirsky, I.; Stavitski, E.; Frenkel, A. I.
2018-04-01
We report the development, testing, and demonstration of a setup for modulation excitation spectroscopy experiments at the Inner Shell Spectroscopy beamline of the National Synchrotron Light Source II. A computer algorithm and dedicated software were developed for asynchronous data processing and analysis. We demonstrate the reconstruction of X-ray absorption spectra for different time points within the modulation pulse using a model system. This setup and the software are intended for a broad range of functional materials which exhibit structural and/or electronic responses to external stimulation, such as catalysts, energy and battery materials, and electromechanical devices.
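Reconstructing the response at time points within the modulation pulse is conventionally done by phase-sensitive detection; a textbook sketch, not the beamline's actual algorithm, which the abstract does not detail:

    import numpy as np

    def demodulate(signal, t, freq, k=1, phase=0.0):
        # Project the time-resolved response onto sin(k*omega*t + phase)
        # to isolate the component responding at the k-th harmonic of
        # the external stimulation.
        reference = np.sin(2.0 * np.pi * k * freq * t + phase)
        return (2.0 / (t[-1] - t[0])) * np.trapz(signal * reference, t)

    t = np.linspace(0.0, 10.0, 2000)  # 10 s of data, 1 Hz modulation
    signal = 0.05 * np.sin(2 * np.pi * t) + np.random.normal(0, 0.5, t.size)
    amplitude = demodulate(signal, t, freq=1.0)  # ~0.05 despite the noise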
Trimarchi, Matteo; Lund, Valerie J; Nicolai, Piero; Pini, Massimiliano; Senna, Massimo; Howard, David J
2004-04-01
The Neoplasms of the Sinonasal Tract software package (NSNT v 1.0) implements a complete visual database for patients with sinonasal neoplasia, facilitating standardization of data and statistical analysis. The software, which is compatible with the Macintosh and Windows platforms, provides a multiuser application with a dedicated server (on Windows NT/2000 or Macintosh OS 9/X) and a network of clients, together with web access if required. The system hardware consists of an Apple Power Macintosh G4 500 MHz computer with PCI bus, 256 Mb of RAM and a 60 Gb hard disk, or any IBM-compatible computer with a Pentium 2 processor. Image acquisition may be performed with different frame-grabber cards for analog or digital video input of different standards (PAL, SECAM, or NTSC) and levels of quality (VHS, S-VHS, Betacam, Mini DV, DV). The visual database is based on 4th Dimension by 4D Inc, and video compression is performed in real time in MPEG format. Six sections have been developed: demographics, symptoms, extent of disease, radiology, treatment, and follow-up. Acquired data include computed tomography and magnetic resonance imaging, histology, and endoscopy images, allowing sequential comparison. Statistical analysis integral to the program provides Kaplan-Meier survival curves. The development of a dedicated, user-friendly database for sinonasal neoplasia facilitates a multicenter network and has obvious clinical and research benefits.
A Voyager attitude control perspective on fault tolerant systems
NASA Technical Reports Server (NTRS)
Rasmussen, R. D.; Litty, E. C.
1981-01-01
In current spacecraft design, a trend can be observed toward achieving greater fault tolerance through the application of on-board software dedicated to detecting and isolating failures. Whether fault tolerance through software can meet the desired objectives depends on very careful consideration and control of the system in which the software is embedded. The investigation considered here aims to provide some of the insight needed for the required analysis of the system. The techniques developed in this connection during the development of the Voyager spacecraft are described. The Voyager Galileo Attitude and Articulation Control Subsystem (AACS) fault-tolerant design is discussed to emphasize basic lessons learned from this experience. The central driver of hardware redundancy implementation on Voyager was known as the 'single point failure criterion'.
Brainstorm: A User-Friendly Application for MEG/EEG Analysis
Tadel, François; Baillet, Sylvain; Mosher, John C.; Pantazis, Dimitrios; Leahy, Richard M.
2011-01-01
Brainstorm is a collaborative open-source application dedicated to magnetoencephalography (MEG) and electroencephalography (EEG) data visualization and processing, with an emphasis on cortical source estimation techniques and their integration with anatomical magnetic resonance imaging (MRI) data. The primary objective of the software is to connect MEG/EEG neuroscience investigators with both the best-established and cutting-edge methods through a simple and intuitive graphical user interface (GUI). PMID:21584256
Hansson, Jonny; Månsson, Lars Gunnar; Båth, Magnus
2016-06-01
The purpose of the present work was to investigate the validity of using single-reader-adapted receiver operating characteristic (ROC) software for the analysis of visual grading characteristics (VGC) data. VGC data from four published VGC studies on the optimisation of X-ray examinations, previously analysed using ROCFIT, were reanalysed using recently developed software dedicated to VGC analysis (VGC Analyzer), and the outcomes [the mean and 95% confidence interval (CI) of the area under the VGC curve (AUCVGC) and the p-value] were compared. The studies included both paired and non-paired data and were reanalysed both for the fixed-reader and the random-reader situations. The results showed good agreement between the software packages for the mean AUCVGC. For non-paired data, wider CIs were obtained with VGC Analyzer than previously reported, whereas for paired data the previously reported CIs were similar or even broader. Similar observations were made for the p-values. The results indicate that the use of single-reader-adapted ROC software such as ROCFIT for analysing non-paired VGC data may lead to an increased risk of committing Type I errors, especially in the random-reader situation. On the other hand, the use of ROC software for the analysis of paired VGC data may lead to an increased risk of committing Type II errors, especially in the fixed-reader situation. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.
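For orientation, the AUCVGC point estimate itself is a simple non-parametric quantity, numerically identical to the Mann-Whitney statistic; a minimal sketch without the confidence-interval machinery that distinguishes VGC Analyzer from single-reader ROC software:

    import numpy as np

    def auc_vgc(ratings_a, ratings_b):
        # Probability that a random rating under condition B exceeds one
        # under condition A, ties counted as 1/2 (trapezoidal VGC area)
        a = np.asarray(ratings_a)[:, None]
        b = np.asarray(ratings_b)[None, :]
        return np.mean((b > a) + 0.5 * (b == a))

    # Hypothetical 5-point quality ratings under two imaging conditions
    auc = auc_vgc([2, 3, 3, 4, 2, 3], [3, 4, 4, 5, 3, 4])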
The ImageJ ecosystem: an open platform for biomedical image analysis
Schindelin, Johannes; Rueden, Curtis T.; Hiner, Mark C.; Eliceiri, Kevin W.
2015-01-01
Technology in microscopy advances rapidly, enabling increasingly affordable, faster, and more precise quantitative biomedical imaging, which necessitates correspondingly more-advanced image processing and analysis techniques. A wide range of software is available – from commercial to academic, special-purpose to Swiss army knife, small to large – but a key characteristic of software that is suitable for scientific inquiry is its accessibility. Open-source software is ideal for scientific endeavors because it can be freely inspected, modified, and redistributed; in particular, the open-software platform ImageJ has had a huge impact on life sciences, and continues to do so. From its inception, ImageJ has grown significantly due largely to being freely available and its vibrant and helpful user community. Scientists as diverse as interested hobbyists, technical assistants, students, scientific staff, and advanced biology researchers use ImageJ on a daily basis, and exchange knowledge via its dedicated mailing list. Uses of ImageJ range from data visualization and teaching to advanced image processing and statistical analysis. The software's extensibility continues to attract biologists at all career stages as well as computer scientists who wish to effectively implement specific image-processing algorithms. In this review, we use the ImageJ project as a case study of how open-source software fosters its suites of software tools, making multitudes of image-analysis technology easily accessible to the scientific community. We specifically explore what makes ImageJ so popular, how it impacts life science, how it inspires other projects, and how it is self-influenced by coevolving projects within the ImageJ ecosystem. PMID:26153368
Dorr, Ricardo; Ozu, Marcelo; Parisi, Mario
2007-04-15
Members of the water channel (aquaporin) family have been identified in central nervous system cells. A classic method to measure membrane water permeability and its regulation is to capture and analyse images of Xenopus laevis oocytes expressing them. Laboratories dedicated to the analysis of motion images usually have powerful equipment valued at thousands of dollars. However, some scientists consider that new approaches are needed to reduce costs in scientific labs, especially in developing countries. The objective of this work is to share a very low-cost hardware and software setup based on a well-selected webcam, a hand-made adapter to a microscope and the use of free software to measure membrane water permeability in Xenopus oocytes. One of the main purposes of this setup is to maintain a high level of quality in images obtained at brief intervals (shorter than 70 ms). The presented setup helps to economize without sacrificing image analysis requirements.
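The permeability measurement the authors automate rests on a standard relation between swelling rate and osmotic gradient; below is a hedged numpy sketch of that calculation, Pf = V0 · d(V/V0)/dt / (S · Vw · Δosm), with all constants and the volume trace invented for illustration.

```python
import numpy as np

# Illustrative constants (SI units); the values are placeholders only.
VW = 18e-6        # partial molar volume of water, m^3/mol
V0 = 9.0e-10      # initial oocyte volume, m^3 (~1.2 mm diameter sphere)
S = 4.5e-6        # oocyte surface area, m^2
DOSM = 100.0      # imposed osmotic gradient, mol/m^3 (100 mOsm)

def pf_from_swelling(t, v_rel):
    """Osmotic water permeability from the initial slope of V/V0 vs time:
    Pf = V0 * d(V/V0)/dt / (S * Vw * dOsm)."""
    slope = np.polyfit(t, v_rel, 1)[0]       # initial linear fit, 1/s
    return V0 * slope / (S * VW * DOSM)      # m/s

t = np.linspace(0, 30, 10)                   # s
v_rel = 1.0 + 2.0e-3 * t                     # synthetic relative volume trace
print(f"Pf ~ {pf_from_swelling(t, v_rel):.2e} m/s")
```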
TUBEs-Mass Spectrometry for Identification and Analysis of the Ubiquitin-Proteome.
Azkargorta, Mikel; Escobes, Iraide; Elortza, Felix; Matthiesen, Rune; Rodríguez, Manuel S
2016-01-01
Mass spectrometry (MS) has become the method of choice for the large-scale analysis of protein ubiquitylation. There exist a number of proposed methods for mapping ubiquitin sites, each with different pros and cons. We present here a protocol for the MS analysis of the ubiquitin-proteome captured by TUBEs and subsequent data analysis. Using dedicated software and algorithms, specific information on the presence of ubiquitylated peptides can be obtained from the MS search results. In addition, a quantitative and functional analysis of the ubiquitylated proteins and their interacting partners helps to unravel the biological and molecular processes they are involved in.
Software Package Completed for Alloy Design at the Atomic Level
NASA Technical Reports Server (NTRS)
Bozzolo, Guillermo H.; Noebe, Ronald D.; Abel, Phillip B.; Good, Brian S.
2001-01-01
As a result of a multidisciplinary effort involving solid-state physics, quantum mechanics, and materials and surface science, the first version of a software package dedicated to the atomistic analysis of multicomponent systems was recently completed. Based on the BFS (Bozzolo, Ferrante, and Smith) method for the calculation of alloy and surface energetics, this package includes modules devoted to the analysis of many essential features that characterize any given alloy or surface system, including (1) surface structure analysis, (2) surface segregation, (3) surface alloying, (4) bulk crystalline material properties and atomic defect structures, and (5) thermal processes that allow us to perform phase diagram calculations. All the modules of this Alloy Design Workbench 1.0 (ADW 1.0) are designed to run in PC and workstation environments, and their operation and performance are substantially linked to the needs of the user and the specific application.
Regulatory sequence analysis tools.
van Helden, Jacques
2003-07-01
The web resource Regulatory Sequence Analysis Tools (RSAT) (http://rsat.ulb.ac.be/rsat) offers a collection of software tools dedicated to the prediction of regulatory sites in non-coding DNA sequences. These tools include sequence retrieval, pattern discovery, pattern matching, genome-scale pattern matching, feature-map drawing, random sequence generation and other utilities. Alternative formats are supported for the representation of regulatory motifs (strings or position-specific scoring matrices) and several algorithms are proposed for pattern discovery. RSAT currently holds >100 fully sequenced genomes and these data are regularly updated from GenBank.
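As an illustration of the matrix-based pattern matching RSAT offers (though not RSAT's own code), here is a toy position-specific scoring matrix scan over a DNA sequence; the matrix values and threshold are invented.

```python
# Toy position-specific scoring matrix (log-odds) for a 4-bp motif;
# keys are bases, list index is the motif position. Values are illustrative.
PSSM = {
    'A': [ 1.2, -0.5, -1.0,  0.8],
    'C': [-0.7,  1.1, -0.3, -1.2],
    'G': [-0.9, -0.8,  1.3, -0.4],
    'T': [ 0.1, -0.6, -1.1,  0.9],
}
WIDTH = 4

def scan(sequence, threshold=2.0):
    """Slide the PSSM over the sequence and report windows above threshold."""
    hits = []
    for i in range(len(sequence) - WIDTH + 1):
        window = sequence[i:i + WIDTH]
        score = sum(PSSM[base][j] for j, base in enumerate(window))
        if score >= threshold:
            hits.append((i, window, round(score, 2)))
    return hits

print(scan("TTACGGACGTACGA"))   # reports the ACG-like windows
```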
COMAN: a web server for comprehensive metatranscriptomics analysis.
Ni, Yueqiong; Li, Jun; Panagiotou, Gianni
2016-08-11
Microbiota-oriented studies based on metagenomic or metatranscriptomic sequencing have revolutionised our understanding of microbial ecology and the roles of both clinical and environmental microbes. The analysis of massive metatranscriptomic data requires extensive computational resources, a collection of bioinformatics tools and expertise in programming. We developed COMAN (Comprehensive Metatranscriptomics Analysis), a web-based tool dedicated to automatically and comprehensively analysing metatranscriptomic data. The COMAN pipeline includes quality control of raw reads and removal of reads derived from non-coding RNA, followed by functional annotation, comparative statistical analysis, pathway enrichment analysis, co-expression network analysis and high-quality visualisation. The essential data generated by COMAN are also provided in tabular format for additional analysis and integration with other software. The web server has an easy-to-use interface and detailed instructions, and is freely available at http://sbb.hku.hk/COMAN/. COMAN is an integrated web server dedicated to comprehensive functional analysis of metatranscriptomic data, translating massive amounts of reads into data tables and high-standard figures. It is expected to help researchers with less expertise in bioinformatics answer microbiota-related biological questions and to increase the accessibility and interpretation of microbiota RNA-Seq data.
van Zelst, J C M; Tan, T; Platel, B; de Jong, M; Steenbakkers, A; Mourits, M; Grivegnee, A; Borelli, C; Karssemeijer, N; Mann, R M
2017-04-01
To investigate the effect of dedicated Computer Aided Detection (CAD) software for automated breast ultrasound (ABUS) on the performance of radiologists screening for breast cancer. 90 ABUS views of 90 patients were randomly selected from a multi-institutional archive of cases collected between 2010 and 2013. This dataset included normal cases (n=40) with >1 year of follow-up, benign lesions (n=30) that were either biopsied or remained stable, and malignant lesions (n=20). Six readers evaluated all cases with and without CAD in two sessions. The CAD software included conventional CAD marks and an intelligent minimum intensity projection of the breast tissue. Readers reported using a likelihood-of-malignancy scale from 0 to 100. Alternative free-response ROC analysis was used to measure performance. Without CAD, the average area under the curve (AUC) of the readers was 0.77, and it significantly improved with CAD to 0.84 (p=0.001). Sensitivity of all readers improved (range 5.2-10.6%) by using CAD, but specificity decreased in four out of six readers (range 1.4-5.7%). No significant difference was observed in the AUC between experienced radiologists and residents, both with and without CAD. Dedicated CAD software for ABUS has the potential to improve the cancer detection rates of radiologists screening for breast cancer. Copyright © 2017 Elsevier B.V. All rights reserved.
The Other Infrastructure: Distance Education's Digital Plant.
ERIC Educational Resources Information Center
Boettcher, Judith V.; Kumar, M. S. Vijay
2000-01-01
Suggests a new infrastructure--the digital plant--for supporting flexible Web campus environments. Describes four categories which make up the infrastructure: personal communication tools and applications; network of networks for the Web campus; dedicated servers and software applications; software applications and services from external…
Multisensor system for tunnel inspection
NASA Astrophysics Data System (ADS)
Idoux, Maurice
2005-01-01
The system is aimed at assisting inspection and monitoring of the degradation of tunnels in order to minimize maintenance and repair time. ATLAS 70 is a complete sensor/software package which enables thorough diagnosis of tunnel wall conditions. The data collected locally are stored on a computer hard disk for subsequent analysis at a remote location via elaborate dedicated software. The sensors and local computer are loaded onto a rail and/or road vehicle of specific design, i.e. with an even travelling speed of 2 to 5 km/h. The system was originally developed for the Paris Underground Company and has since been applied to rail and road tunnels, large town sewage systems, clean water underground aqueducts and electric cable tunnels.
Automated Transfer Vehicle Proximity Flight Safety Overview
NASA Astrophysics Data System (ADS)
Cornier, Dominique; Berthelier, David; Requiston, Helene; Zekri, Eric; Chase, Richard
2005-12-01
The European Automated Transfer Vehicle (ATV) is an unmanned transportation spacecraft designed to contribute to the logistic servicing of the ISS. The ATV will be launched by ARIANE 5 and, after phasing and rendezvous maneuvers, it autonomously docks to the International Space Station (ISS). The ATV control is nominally handled by the Guidance, Navigation and Control (GNC) function using computers, software, sensors and actuators. During rendezvous operations, in order to cover the extreme situations where the GNC function fails to ensure a safe trajectory with respect to the ISS, a segregated Proximity Flight Safety (PFS) function is activated: this function will initiate a collision avoidance maneuver which will place the ATV on a trajectory ensuring safety with respect to the ISS. The PFS function relies on segregated computers, the Monitoring and Safing Units (MSUs) running specific software, on four dedicated thrusters, on dedicated batteries and on specific interfaces with ATV gyrometers. The PFS function being the ultimate protection to ensure ISS safety in case of ATV malfunction, specific rules have been applied to its implementation, in particular for the development of the MSU software, which is critical since any failure of this software may result in catastrophic consequences. This paper provides an overview of the ATV Proximity Flight Safety function. After a short description of the overall ATV avionics architecture and its rationale, the second part of the paper presents more details on the PFS function both in terms of hardware and software implementation. The third part of the paper is dedicated to the MSU software validation method, which is specific considering its criticality. The last part of the paper provides information on the different operations related to the use of the PFS function during an ATV flight.
NASA Astrophysics Data System (ADS)
Kuehnel, C.; Hennemuth, A.; Oeltze, S.; Boskamp, T.; Peitgen, H.-O.
2008-03-01
Diagnostic support in the field of coronary artery disease (CAD) is very complex due to the numerous symptoms and studies leading to the final diagnosis. CTA and MRI are on their way to replacing invasive catheter angiography. Thus, there is a need for sophisticated software tools that present the different analysis results and correlate the anatomical and dynamic image information. We introduce a new software assistant for the combined visualization of results from CTA and MR images, in which a dedicated concept for the structured presentation of original data, segmentation results, and individual findings is realized. Therefore, we define a comprehensive class hierarchy and assign suitable interaction functions. User guidance is coupled as closely as possible with the available data, supporting a straightforward workflow design. The analysis results are extracted from two previously developed software assistants, providing coronary artery analysis and measurements, function analysis, and late enhancement data investigation. As an extension we introduce a finding concept directly relating suspicious positions to the underlying data. An affine registration of CT and MR data in combination with the AHA 17-segment model enables the coupling of local findings to positions in all data sets. Furthermore, sophisticated visualization in 2D and 3D and interactive bull's eye plots facilitate the correlation of coronary stenoses and physiology. The software has been evaluated on 20 patient data sets.
Damaris: Addressing performance variability in data management for post-petascale simulations
Dorier, Matthieu; Antoniu, Gabriel; Cappello, Franck; ...
2016-10-01
With exascale computing on the horizon, reducing performance variability in data management tasks (storage, visualization, analysis, etc.) is becoming a key challenge in sustaining high performance. This variability significantly impacts the overall application performance at scale and its predictability over time. In this article, we present Damaris, a system that leverages dedicated cores in multicore nodes to offload data management tasks, including I/O, data compression, scheduling of data movements, in situ analysis, and visualization. We evaluate Damaris with the CM1 atmospheric simulation and the Nek5000 computational fluid dynamic simulation on four platforms, including NICS's Kraken and NCSA's Blue Waters. Our results show that (1) Damaris fully hides the I/O variability as well as all I/O-related costs, thus making simulation performance predictable; (2) it increases the sustained write throughput by a factor of up to 15 compared with standard I/O approaches; (3) it allows almost perfect scalability of the simulation up to over 9,000 cores, as opposed to state-of-the-art approaches that fail to scale; and (4) it enables a seamless connection to the VisIt visualization software to perform in situ analysis and visualization in a way that impacts neither the performance of the simulation nor its variability. In addition, we extended our implementation of Damaris to also support the use of dedicated nodes and conducted a thorough comparison of the two approaches—dedicated cores and dedicated nodes—for I/O tasks with the aforementioned applications.
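Damaris itself is an MPI-based system in C/C++; the following Python sketch only illustrates the underlying idea, handing output to a dedicated worker through a queue so the compute loop never blocks on variable-latency I/O. All names and sizes are illustrative.

```python
import queue
import threading

# A dedicated worker drains the queue and performs the slow writes, so the
# simulation loop returns from put() quickly. This mirrors, in miniature,
# the dedicated-core idea; the real system offloads to separate cores.
io_queue = queue.Queue(maxsize=64)

def io_worker(path):
    with open(path, "ab") as f:
        while True:
            chunk = io_queue.get()
            if chunk is None:        # sentinel: shut down
                break
            f.write(chunk)           # slow, variable-latency operation
            f.flush()

writer = threading.Thread(target=io_worker, args=("snapshots.bin",), daemon=True)
writer.start()

for step in range(100):
    data = step.to_bytes(8, "little") * 1024   # stand-in for simulation output
    io_queue.put(data)                          # fast hand-off to the worker
    # ... advance the simulation here ...

io_queue.put(None)
writer.join()
```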
Establishing Qualitative Software Metrics in Department of the Navy Programs
2015-10-29
dedicated to providing the highest quality software to its users. In doing so, there is a need for a formalized set of Software Quality Metrics. The goal of this paper is to establish the validity of those necessary quality metrics. In our approach we collected the data of over a dozen programs to provide the necessary variable data for our formulas and tested the formulas for validity. Keywords: metrics; software; quality
Automated data acquisition technology development:Automated modeling and control development
NASA Technical Reports Server (NTRS)
Romine, Peter L.
1995-01-01
This report documents the completion of, and improvements made to, the software developed for automated data acquisition and automated modeling and control development on the Texas Micro rackmounted PCs. This research was initiated because a need was identified by the Metal Processing Branch of NASA Marshall Space Flight Center for a mobile data acquisition and data analysis system, customized for welding measurement and calibration. Several hardware configurations were evaluated and a PC based system was chosen. The Welding Measurement System (WMS) is a dedicated instrument strictly for data acquisition and data analysis. In addition to the data acquisition functions described in this thesis, WMS also supports many functions associated with process control. The hardware and software requirements for an automated acquisition system for welding process parameters, welding equipment checkout, and welding process modeling were determined in 1992. From these recommendations, NASA purchased the necessary hardware and software. The new welding acquisition system is designed to collect welding parameter data and perform analysis to determine the voltage versus current arc-length relationship for VPPA welding. Once the results of this analysis are obtained, they can then be used to develop a RAIL function to control welding startup and shutdown without torch crashing.
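The voltage versus current arc-length analysis described here amounts to fitting a simple linear model to logged weld parameters; a hedged numpy sketch with synthetic data follows (the coefficients are invented, not NASA's calibration).

```python
import numpy as np

# Synthetic log of (current [A], arc length [mm], voltage [V]) samples.
rng = np.random.default_rng(0)
current = rng.uniform(80, 160, 50)
arc_len = rng.uniform(2.0, 6.0, 50)
voltage = 10.0 + 0.02 * current + 1.5 * arc_len + rng.normal(0, 0.1, 50)

# Least-squares fit of V = a + b*I + c*L, the kind of voltage-current-
# arc-length relationship used to guide welding control.
A = np.column_stack([np.ones_like(current), current, arc_len])
(a, b, c), *_ = np.linalg.lstsq(A, voltage, rcond=None)
print(f"V ~ {a:.2f} + {b:.4f}*I + {c:.3f}*L")
```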
GUIDOS: tools for the assessment of pattern, connectivity, and fragmentation
NASA Astrophysics Data System (ADS)
Vogt, Peter
2013-04-01
Pattern, connectivity, and fragmentation can be considered as pillars for a quantitative analysis of digital landscape images. The free software toolbox GUIDOS (http://forest.jrc.ec.europa.eu/download/software/guidos) includes a variety of dedicated methodologies for the quantitative assessment of these features. Amongst others, Morphological Spatial Pattern Analysis (MSPA) is used for an intuitive description of image pattern structures and the automatic detection of connectivity pathways. GUIDOS includes tools for the detection and quantitative assessment of key nodes and links, as well as to define connectedness in raster images and to set up appropriate input files for an enhanced network analysis using Conefor Sensinode. Finally, fragmentation is usually defined from a species point of view, but a generic and quantifiable indicator is needed to measure fragmentation and its changes. Some preliminary results for different conceptual approaches will be shown for a sample dataset. Complemented by pre- and post-processing routines and a complete GIS environment, the portable GUIDOS Toolbox may facilitate a holistic assessment in risk assessment studies, landscape planning, and conservation/restoration policies. Alternatively, individual analysis components may contribute to or enhance studies conducted with other software packages in landscape ecology.
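MSPA's full class set (core, edge, bridge, islet, and so on) is more elaborate, but its morphological starting point can be illustrated in a few lines of scipy; the toy raster below is invented and this is not GUIDOS code.

```python
import numpy as np
from scipy import ndimage

# Toy binary landscape: True = forest, False = background.
land = np.zeros((12, 12), dtype=bool)
land[2:10, 2:10] = True       # a forest patch
land[5, 10] = True            # an isolated pixel

# Core = foreground that survives erosion by the edge width;
# edge = remaining foreground bordering the background.
core = ndimage.binary_erosion(land, iterations=1)
edge = land & ~core

print("core pixels:", int(core.sum()), "edge pixels:", int(edge.sum()))
```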
Using OSG Computing Resources with (iLC)Dirac
NASA Astrophysics Data System (ADS)
Sailer, A.; Petric, M.; CLICdp Collaboration
2017-10-01
CPU cycles for small experiments and projects can be scarce, thus making use of all available resources, whether dedicated or opportunistic, is mandatory. While enabling uniform access to the LCG computing elements (ARC, CREAM), the DIRAC grid interware was not able to use OSG computing elements (GlobusCE, HTCondor-CE) without dedicated support at the grid site through so-called 'SiteDirectors', which directly submit to the local batch system. This in turn requires additional dedicated effort for small experiments on the grid site. Adding interfaces to the OSG CEs through the respective grid middleware therefore allows accessing them from within the DIRAC software without additional site-specific infrastructure. This enables greater use of opportunistic resources for experiments and projects without dedicated clusters or an established computing infrastructure with the DIRAC software. To allow sending jobs to HTCondor-CE and legacy Globus computing elements inside DIRAC, the required wrapper classes were developed. Not only is the usage of these types of computing elements now completely transparent for all DIRAC instances, which makes DIRAC a flexible solution for OSG-based virtual organisations, but it also allows LCG Grid Sites to move to the HTCondor-CE software without shutting DIRAC-based VOs out of their site. In these proceedings we detail how we interfaced the DIRAC system to the HTCondor-CE and Globus computing elements, explain the encountered obstacles and the solutions developed, and describe how the linear collider community uses resources in the OSG.
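The wrapper classes mentioned follow the classic adapter pattern: one abstract computing-element interface, one concrete submit path per backend. The sketch below illustrates only that pattern; class and method names are hypothetical and are not DIRAC's actual API.

```python
from abc import ABC, abstractmethod

class ComputingElement(ABC):
    """Common interface; callers stay agnostic of the CE backend."""
    @abstractmethod
    def submit(self, job_script: str) -> str: ...

class HTCondorCE(ComputingElement):
    def submit(self, job_script: str) -> str:
        return f"condor wrapper submitted {job_script}"    # placeholder logic

class GlobusCE(ComputingElement):
    def submit(self, job_script: str) -> str:
        return f"globus wrapper submitted {job_script}"    # placeholder logic

def dispatch(ce: ComputingElement, job: str) -> str:
    return ce.submit(job)   # same call, whichever backend is plugged in

print(dispatch(HTCondorCE(), "pilot.sh"))
print(dispatch(GlobusCE(), "pilot.sh"))
```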
Teaching Science and Mathematics Subjects Using the Excel Spreadsheet Package
ERIC Educational Resources Information Center
Ibrahim, Dogan
2009-01-01
The teaching of scientific subjects usually require laboratories where students can put the theory they have learned into practice. Traditionally, electronic programmable calculators, dedicated software, or expensive software simulation packages, such as MATLAB have been used to simulate scientific experiments. Recently, spreadsheet programs have…
Software Engineering Education Directory
1988-01-01
Dana Hausman and Suzanne Woolf were crucial to the successful completion of this edition of the directory. Their teamwork, energy, and dedication...for this directory began in the summer of 1986 with a questionnaire mailed to schools selected from Peterson's Graduate Programs in Engineering and...Christopher, and Siegel, Stan Software Cost Estimation and Life-Cycle Control by Putnam, Lawrence H. Software Quality Assurance: A Practical Approach by
Exploring the quality of life (QOL) in the Indian software industry: a public health viewpoint.
Jha, Ayan; Sadhukhan, Sanjoy Kumar; Velusamy, Saravanan; Banerjee, Gargi; Banerjee, Arpita; Saha, Amitava; Talukdar, Sumit
2012-04-01
Our objectives were to describe the QOL and its determinants among software professionals of Kolkata, and to compare the same according to information technology (IT) and IT-enabled services (ITeS) sub-sectors. An institution-based cross-sectional study was conducted among software professionals of Kolkata applying a two-stage stratified random sampling technique. The WHO QOL BREF questionnaire was administered along with a list of pertinent variables. Overall, the analysis for 338 software professionals (177 IT and 161 ITeS) clearly demonstrated significant differences between the mean scores of these two sectors for each of the six outcome domains of WHO QOL BREF. Multilevel multivariate analysis outlined 13 significant predictors of QOL: four positive (age, regular fitness regimes, foreign placements and changing companies frequently) and the remaining nine negative (multiple sex partners, multiple addictions, extended working hours, night-shift duties, income, expenditure, carrying office work home, current illness and ITeS company type). Our study helps in obtaining a clear understanding of the multifaceted risk factors prevailing in this sector, the majority of which can be effectively addressed by specific health promotion interventions. A dedicated health policy is mandated at both government and company levels.
Giancarlo, R; Scaturro, D; Utro, F
2015-02-01
The prediction of the number of clusters in a dataset, in particular for microarray data, is a fundamental task in biological data analysis, usually performed via validation measures. Unfortunately, it has received very little attention and in fact there is a growing need for software tools/libraries dedicated to it. Here we present ValWorkBench, a software library consisting of eleven well-known validation measures, together with novel heuristic approximations for some of them. The main objective of this paper is to provide the interested researcher with the full software documentation of an open source cluster validation platform having the main features of being easily extendible in a homogeneous way and of offering software components that can be readily re-used. Consequently, the focus of the presentation is on the architecture of the library, since it provides an essential map that can be used to access the full software documentation, which is available at the supplementary material website [1]. The mentioned main features of ValWorkBench are also discussed and exemplified, with emphasis on software abstraction design and re-usability. A comparison with existing cluster validation software libraries, mainly in terms of the mentioned features, is also offered. It suggests that ValWorkBench is a much needed contribution to the microarray software development/algorithm engineering community. For completeness, it is important to mention that previous accurate algorithmic experimental analysis of the relative merits of each of the implemented measures [19,23,25], carried out specifically on microarray data, gives useful insights on the effectiveness of ValWorkBench for cluster validation to researchers in the microarray community interested in its use for the mentioned task. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
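To make the notion of an internal validation measure concrete, here is a numpy implementation of one classical index of this family (a Calinski-Harabasz-style dispersion ratio); it is illustrative only and not one of ValWorkBench's components.

```python
import numpy as np

def calinski_harabasz(X, labels):
    """Ratio of between- to within-cluster dispersion; higher values
    suggest a better-separated clustering."""
    X = np.asarray(X, dtype=float)
    labels = np.asarray(labels)
    n, k = len(X), len(np.unique(labels))
    overall = X.mean(axis=0)
    between = within = 0.0
    for c in np.unique(labels):
        pts = X[labels == c]
        centroid = pts.mean(axis=0)
        between += len(pts) * np.sum((centroid - overall) ** 2)
        within += np.sum((pts - centroid) ** 2)
    return (between / (k - 1)) / (within / (n - k))

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 0.3, (20, 2)), rng.normal(3, 0.3, (20, 2))])
labels = np.array([0] * 20 + [1] * 20)
print(f"CH index: {calinski_harabasz(X, labels):.1f}")
```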
Husen, Peter; Tarasov, Kirill; Katafiasz, Maciej; Sokol, Elena; Vogt, Johannes; Baumgart, Jan; Nitsch, Robert; Ekroos, Kim; Ejsing, Christer S
2013-01-01
Global lipidomics analysis across large sample sizes produces high-content datasets that require dedicated software tools supporting lipid identification and quantification, efficient data management and lipidome visualization. Here we present a novel software-based platform for streamlined data processing, management and visualization of shotgun lipidomics data acquired using high-resolution Orbitrap mass spectrometry. The platform features the ALEX framework designed for automated identification and export of lipid species intensity directly from proprietary mass spectral data files, and an auxiliary workflow using database exploration tools for integration of sample information, computation of lipid abundance and lipidome visualization. A key feature of the platform is the organization of lipidomics data in "database table format" which provides the user with an unsurpassed flexibility for rapid lipidome navigation using selected features within the dataset. To demonstrate the efficacy of the platform, we present a comparative neurolipidomics study of cerebellum, hippocampus and somatosensory barrel cortex (S1BF) from wild-type and knockout mice devoid of the putative lipid phosphate phosphatase PRG-1 (plasticity related gene-1). The presented framework is generic, extendable to processing and integration of other lipidomic data structures, can be interfaced with post-processing protocols supporting statistical testing and multivariate analysis, and can serve as an avenue for disseminating lipidomics data within the scientific community. The ALEX software is available at www.msLipidomics.info.
Distributed Software for Observations in the Near Infrared
NASA Astrophysics Data System (ADS)
Gavryusev, V.; Baffa, C.; Giani, E.
We have developed an integrated system that performs astronomical observations in near-infrared bands, operating the two-dimensional instruments ARNICA (http://helios.arcetri.astro.it/home/idefix/Mosaic/instr/arnica/arnica.html) and LONGSP (http://helios.arcetri.astro.it/home/idefix/Mosaic/instr/longsp/longsp.html) at the Italian National Infrared Facility. This software consists of several communicating processes, generally executed across a network as well as on a single computer. The user interface is organized as a widget-based X11 client. The interprocess communication is provided by sockets and uses TCP/IP. The processes dedicated to hardware control (telescope and other instruments) currently run on a PC dedicated to this task under DESQview/X, while all other components (user interface, tools for data analysis, etc.) can also work under UNIX. The hardware-independent part of the software is based on the Athena Widget Set and is compiled with GNU C to provide maximum portability.
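The process structure described, a widget client talking to a hardware-control process over TCP/IP sockets, can be illustrated with a minimal client/server pair; the host, port, and one-line protocol below are invented.

```python
import socket
import threading
import time

HOST, PORT = "127.0.0.1", 5577   # illustrative address, not the original's

def control_server():
    """Stand-in for the hardware-control process: answers status queries."""
    with socket.create_server((HOST, PORT)) as srv:
        conn, _ = srv.accept()
        with conn:
            cmd = conn.recv(1024).decode().strip()
            reply = "IDLE" if cmd == "STATUS" else "ERR unknown command"
            conn.sendall(reply.encode())

threading.Thread(target=control_server, daemon=True).start()
time.sleep(0.2)                  # give the server a moment to start listening

# Stand-in for the widget-based user-interface client.
with socket.create_connection((HOST, PORT)) as c:
    c.sendall(b"STATUS\n")
    print("server replied:", c.recv(1024).decode())
```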
NASA Astrophysics Data System (ADS)
Barbasiewicz, Adrianna; Widerski, Tadeusz; Daliga, Karol
2018-01-01
This article presents results of research conducted for a master's thesis. The purpose of the measurements was to analyze the accuracy of point positioning by computer programs. The selected software was specialized software dedicated to photogrammetric work; for comparative purposes, tools with similar functionality were also used. The resolution of the photographs on which the key points were searched was selected as the basic parameter affecting the results. The locations of the determined points were obtained following the photogrammetric resection rule. In order to automate the measurement, measurement session planning was omitted. The coordinates of points collected by tachymetric measurement were used as the reference system. The resulting deviations and linear displacements are on the order of millimetres. The visual aspects of the point clouds have also been briefly analyzed.
Arduino: a low-cost multipurpose lab equipment.
D'Ausilio, Alessandro
2012-06-01
Typical experiments in psychological and neurophysiological settings often require the accurate control of multiple input and output signals. These signals are often generated or recorded via computer software and/or external dedicated hardware. Dedicated hardware is usually very expensive and requires additional software to control its behavior. In the present article, I present some accuracy tests on a low-cost and open-source I/O board (Arduino family) that may be useful in many lab environments. One of the strengths of Arduinos is the possibility they afford to load the experimental script on the board's memory and let it run without interfacing with computers or external software, thus granting complete independence, portability, and accuracy. Furthermore, a large community has arisen around the Arduino idea and offers many hardware add-ons and hundreds of free scripts for different projects. Accuracy tests show that Arduino boards may be an inexpensive tool for many psychological and neurophysiological labs.
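A typical accuracy test of this kind is a serial round-trip latency measurement; the sketch below assumes a board programmed to echo each byte back, uses the pyserial package, and the port name is a placeholder.

```python
import time
import serial  # pyserial

PORT = "/dev/ttyACM0"   # placeholder; depends on OS and board

with serial.Serial(PORT, 115200, timeout=1) as ser:
    time.sleep(2)                # many boards reset when the port opens
    samples = []
    for _ in range(100):
        t0 = time.perf_counter()
        ser.write(b"x")
        ser.read(1)              # wait for the echoed byte
        samples.append((time.perf_counter() - t0) * 1000)
    print(f"median round-trip: {sorted(samples)[50]:.2f} ms")
```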
Process Based on SysML for New Launchers System and Software Developments
NASA Astrophysics Data System (ADS)
Hiron, Emmanuel; Miramont, Philippe
2010-08-01
The purpose of this paper is to present the Astrium-ST engineering process based on SysML. This process is currently being set up in the frame of joint CNES/Astrium-ST R&T studies related to the Ariane 5 electrical system and flight software modelling. The tool used to set up this process is Rhapsody release 7.3 from IBM [1]. This process focuses on the system engineering phase dedicated to software, with the objective of generating both system documents (sequential system design and flight control) and software specifications.
Visual gene developer: a fully programmable bioinformatics software for synthetic gene optimization.
Jung, Sang-Kyu; McDonald, Karen
2011-08-16
Direct gene synthesis is becoming more popular owing to decreases in gene synthesis pricing. Compared with using natural genes, gene synthesis provides a good opportunity to optimize gene sequence for specific applications. In order to facilitate gene optimization, we have developed a stand-alone software called Visual Gene Developer. The software not only provides general functions for gene analysis and optimization along with an interactive user-friendly interface, but also includes unique features such as programming capability, dedicated mRNA secondary structure prediction, artificial neural network modeling, network & multi-threaded computing, and user-accessible programming modules. The software allows a user to analyze and optimize a sequence using main menu functions or specialized module windows. Alternatively, gene optimization can be initiated by designing a gene construct and configuring an optimization strategy. A user can choose several predefined or user-defined algorithms to design a complicated strategy. The software provides expandable functionality as platform software supporting module development using popular script languages such as VBScript and JScript in the software programming environment. Visual Gene Developer is useful for both researchers who want to quickly analyze and optimize genes, and those who are interested in developing and testing new algorithms in bioinformatics. The software is available for free download at http://www.visualgenedeveloper.net.
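One common optimization strategy in this space, most-frequent-codon back-translation, fits in a few lines; the usage table below is a toy covering two amino acids, and this is not Visual Gene Developer's algorithm.

```python
# Toy codon-usage table (fractions) for a hypothetical host; a real table
# would cover all 64 codons. Strategy: pick the most frequent synonym.
USAGE = {
    "GCT": 0.4, "GCC": 0.3, "GCA": 0.2, "GCG": 0.1,   # Ala
    "AAA": 0.7, "AAG": 0.3,                           # Lys
}
SYNONYMS = {"A": ["GCT", "GCC", "GCA", "GCG"], "K": ["AAA", "AAG"]}

def optimize(protein: str) -> str:
    """Back-translate a protein using each residue's most frequent codon."""
    return "".join(max(SYNONYMS[aa], key=USAGE.get) for aa in protein)

print(optimize("AKA"))   # -> GCTAAAGCT
```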
Evolution of user analysis on the grid in ATLAS
NASA Astrophysics Data System (ADS)
Dewhurst, A.; Legger, F.; ATLAS Collaboration
2017-10-01
More than one thousand physicists analyse data collected by the ATLAS experiment at the Large Hadron Collider (LHC) at CERN through 150 computing facilities around the world. Efficient distributed analysis requires optimal resource usage and the interplay of several factors: robust grid and software infrastructures, and system capability to adapt to different workloads. The continuous automatic validation of grid sites and the user support provided by a dedicated team of expert shifters have been proven to provide a solid distributed analysis system for ATLAS users. Typical user workflows on the grid, and their associated metrics, are discussed. Measurements of user job performance and typical requirements are also shown.
A USNRC perspective on the use of commercial-off-shelf software (COTS) in advanced reactors
DOE Office of Scientific and Technical Information (OSTI.GOV)
Stewart, J.C.
1997-12-01
The use of commercially available digital computer systems and components in safety critical systems (nuclear power plant, military, and commercial applications) is increasing rapidly. While this paper focuses on the software aspects of the application, most of these comments are applicable to the hardware aspects as well. Commercial dedication (the process of assuring that a commercial grade item will perform its intended safety function) has demonstrated benefits in cost savings and a wide base of user experience; however, care must be taken to avoid difficulties with some aspects of the dedication process, such as access to vendor development information, configuration management, long-term support, and system integration.
Fully automated corneal endothelial morphometry of images captured by clinical specular microscopy
NASA Astrophysics Data System (ADS)
Bucht, Curry; Söderberg, Per; Manneberg, Göran
2009-02-01
The corneal endothelium serves as the posterior barrier of the cornea. Factors such as clarity and refractive properties of the cornea are in direct relationship to the quality of the endothelium. The endothelial cell density is considered the most important morphological factor. Morphometry of the corneal endothelium is presently done by semi-automated analysis of pictures captured by a Clinical Specular Microscope (CSM). Because of the occasional need for operator involvement, this process can be tedious, having a negative impact on sampling size. This study was dedicated to the development of fully automated analysis of images of the corneal endothelium, captured by CSM, using Fourier analysis. Software was developed in the mathematical programming language Matlab. Pictures of the corneal endothelium, captured by CSM, were read into the analysis software. The software automatically performed digital enhancement of the images. The digitally enhanced images of the corneal endothelium were transformed using the fast Fourier transform (FFT). Tools were developed and applied for identification and analysis of relevant characteristics of the Fourier transformed images. The data obtained from each Fourier transformed image were used to calculate the mean cell density of its corresponding corneal endothelium. The calculation was based on well-known diffraction theory. Results in the form of estimated cell density of the corneal endothelium were obtained using the fully automated analysis software on images captured by CSM. The cell density obtained by the fully automated analysis was compared to the cell density obtained from classical, semi-automated analysis, and a high correlation was found.
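The Fourier route to cell density can be sketched compactly: a quasi-regular cell mosaic with mean spacing d concentrates spectral energy in a ring near frequency 1/d, and hexagonal packing relates spacing to cell area. The code below demonstrates the idea on a synthetic mosaic; it is a simplification, not the authors' Matlab pipeline.

```python
import numpy as np

def cell_density_from_fft(img, pixel_size_um):
    """Estimate cell density from the ring in the power spectrum; hexagonal
    packing gives cell area ~ (sqrt(3)/2) * spacing^2."""
    f = np.fft.fftshift(np.fft.fft2(img - img.mean()))
    power = np.abs(f) ** 2
    n = img.shape[0]
    yy, xx = np.indices(power.shape)
    r = np.hypot(yy - n // 2, xx - n // 2).astype(int)
    radial = np.bincount(r.ravel(), power.ravel()) / np.bincount(r.ravel())
    ring = radial[2:n // 2].argmax() + 2          # skip the DC neighbourhood
    spacing = (n * pixel_size_um) / ring          # mean cell spacing, um
    area_mm2 = (np.sqrt(3) / 2) * spacing**2 * 1e-6
    return 1.0 / area_mm2                         # cells per mm^2

# Synthetic mosaic: cosine gratings at ~20 um spacing, 1 um pixels.
n, d = 256, 20.0
y, x = np.indices((n, n))
img = np.cos(2 * np.pi * x / d) + np.cos(2 * np.pi * y / d)
print(f"~{cell_density_from_fft(img, 1.0):.0f} cells/mm^2")
```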
Open-source software: not quite endsville.
Stahl, Matthew T
2005-02-01
Open-source software will never achieve ubiquity. There are environments in which it simply does not flourish. By its nature, open-source development requires free exchange of ideas, community involvement, and the efforts of talented and dedicated individuals. However, pressures can come from several sources that prevent this from happening. In addition, openness and complex licensing issues invite misuse and abuse. Care must be taken to avoid the pitfalls of open-source software.
Free software for performing physical analysis of systems for digital radiography and mammography.
Donini, Bruno; Rivetti, Stefano; Lanconelli, Nico; Bertolini, Marco
2014-05-01
In this paper, the authors present free software for assisting users in the physical characterization of x-ray digital systems and in image quality checks. The program was developed as a plugin of the well-known public-domain suite ImageJ. The software can assist users in calculating various physical parameters such as the response curve (also termed signal transfer property), modulation transfer function (MTF), noise power spectra (NPS), and detective quantum efficiency (DQE). It also includes the computation of some image quality checks: defective pixel analysis, uniformity, dark analysis, and lag. The software was made available in 2009 and has been used during the last couple of years by many users, who gave us valuable feedback for improving its usability. It was tested for the physical characterization of several clinical systems for digital radiography and mammography. Various published papers made use of the outcomes of the plugin. This software is potentially beneficial to a variety of users working in hospitals and radiological departments: medical physicists, physicians, and engineers. The plugin, together with a brief user manual, is freely available online (www.medphys.it/downloads.htm). With our plugin users can estimate the three most important parameters used for physical characterization (MTF, NPS, and DQE). The plugin can run on any operating system equipped with the ImageJ suite. The authors validated the software by comparing MTF and NPS curves on a common set of images with those obtained with other dedicated programs, achieving very good agreement.
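The three quantities tie together in the standard relation DQE(f) = MTF(f)² / (q · NNPS(f)), with NNPS the NPS normalized by the squared mean signal and q the incident photon fluence; a hedged numpy sketch with synthetic inputs follows (this is not the plugin's code).

```python
import numpy as np

def dqe(mtf, nps, mean_signal, fluence_q):
    """DQE(f) = MTF(f)^2 / (q * NNPS(f)), with NNPS = NPS / mean_signal^2.
    NPS in (signal units)^2 * mm^2, q in photons/mm^2."""
    nnps = np.asarray(nps) / mean_signal**2
    return np.asarray(mtf) ** 2 / (fluence_q * nnps)

# Synthetic example values, for illustration only.
f = np.linspace(0.1, 5.0, 50)            # spatial frequency, cycles/mm
mtf = np.sinc(f / 6.0)                   # a plausible falling MTF
nps = 50.0 * np.ones_like(f)             # flat (quantum-limited-ish) NPS
print(dqe(mtf, nps, mean_signal=1000.0, fluence_q=30000.0)[:3])
```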
Cost analysis of Gamma Knife stereotactic radiosurgery.
Griffiths, Alison; Marinovich, Luke; Barton, Michael B; Lord, Sarah J
2007-01-01
Stereotactic radiosurgery (SRS) is used to treat intracranial lesions and vascular malformations as an addition or replacement to whole brain radiotherapy and microsurgery. SRS can be delivered by hardware and software appended to standard linear accelerators (Linacs) or by dedicated systems such as Gamma Knife, which has been proposed as a more accurate and user friendly technology. Internationally, dedicated systems have been funded, despite limitations in evidence. However, some countries including Australia have not recommended additional reimbursement for dedicated systems. This study compares the costs of Linac radiosurgery with Gamma Knife radiosurgery. Due to limited evidence on comparative effects, the economic analysis was restricted to a cost evaluation. The base-case analysis assumed a modified Linac was used only to treat SRS patients. However, because a modified Linac could be used to treat other radiotherapy patients, a second analysis assumed spare time was used to meet other radiotherapy needs, and Linac capital costs were apportioned according to SRS use. The incremental cost of Gamma Knife versus a modified Linac was estimated as AU$209 per patient. This result is sensitive to variations in assumptions. A second analysis proportioning capital costs according to SRS use showed that Gamma Knife may cost up to AU$1673 more per patient. Gamma Knife may be cost competitive only if demand for SRS services is high enough to fully use equipment working time. However, given low patient demand and competing radiotherapy needs, Gamma Knife appears more costly and further evidence of survival or quality of life advantages may be required to justify reimbursement.
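The sensitivity to equipment utilization that drives this result is easy to reproduce in a toy annualized cost model; every figure below is invented and none comes from the study.

```python
# Toy annualized cost model; all numbers are invented for illustration.
def per_patient_cost(capital, years, annual_running, patients_per_year,
                     srs_share=1.0):
    """Apportion (annualized capital * srs_share + running costs) over the
    SRS caseload. srs_share < 1 models a Linac also used for radiotherapy."""
    annual_capital = capital / years
    return (annual_capital * srs_share + annual_running) / patients_per_year

linac_dedicated = per_patient_cost(3.0e6, 10, 4.0e5, 200)             # base case
linac_shared = per_patient_cost(3.0e6, 10, 4.0e5, 200, srs_share=0.25)
gamma_knife = per_patient_cost(4.5e6, 10, 4.0e5, 200)

print(f"Linac (SRS only):    ${linac_dedicated:,.0f}/patient")
print(f"Linac (25% SRS use): ${linac_shared:,.0f}/patient")
print(f"Gamma Knife:         ${gamma_knife:,.0f}/patient")
```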
The influence of software filtering in digital mammography image quality
NASA Astrophysics Data System (ADS)
Michail, C.; Spyropoulou, V.; Kalyvas, N.; Valais, I.; Dimitropoulos, N.; Fountos, G.; Kandarakis, I.; Panayiotakis, G.
2009-05-01
Breast cancer is one of the most frequently diagnosed cancers among women. Several techniques have been developed to help in the early detection of breast cancer, such as conventional and digital x-ray mammography, positron and single-photon emission mammography, etc. A key advantage in digital mammography is that images can be manipulated as simple computer image files. Thus non-dedicated, commercially available image manipulation software can be employed to process and store the images. The image processing tools of the Photoshop (CS 2) software usually incorporate digital filters which may be used to reduce image noise, enhance contrast and increase spatial resolution. However, improving one image quality parameter may result in degradation of another. The aim of this work was to investigate the influence of three sharpening filters, named hereafter sharpen, sharpen more and sharpen edges, on image resolution and noise. Image resolution was assessed by means of the Modulation Transfer Function (MTF). In conclusion, it was found that the correct use of commercial non-dedicated software on digital mammograms may improve some aspects of image quality.
NASA Astrophysics Data System (ADS)
Malik, S.; Shipsey, I.; Cavanaugh, R.; Bloom, K.; Chan, Kai-Feng; D'Hondt, J.; Klima, B.; Narain, M.; Palla, F.; Rolandi, G.; Schörner-Sadenius, T.
2014-06-01
To impart hands-on training in physics analysis, the CMS experiment initiated the concept of the CMS Data Analysis School (CMSDAS). It was born over three years ago at the LPC (LHC Physics Centre) at Fermilab and is based on earlier workshops held at the LPC and the CLEO experiment. As CMS transitioned from construction to data-taking mode, the nature of the training also evolved to include more analysis tools, software tutorials and physics analysis. This effort, epitomized as CMSDAS, has proven key for new and young physicists to jump-start and contribute to the physics goals of CMS by looking for new physics with the collision data. With over 400 physicists trained in six CMSDAS events around the globe, CMS is trying to engage the collaboration in its discovery potential and maximize physics output. As a bigger goal, CMS is striving to nurture and increase engagement of its myriad talents in the development of physics, service, upgrades, the education of those new to CMS, and the career development of younger members. An extension of the concept to dedicated software and hardware schools is also planned, keeping in mind the ensuing upgrade phase.
Scalable cloud without dedicated storage
NASA Astrophysics Data System (ADS)
Batkovich, D. V.; Kompaniets, M. V.; Zarochentsev, A. K.
2015-05-01
We present a prototype of a scalable computing cloud. It is intended to be deployed on the basis of a cluster without separate dedicated storage. The dedicated storage is replaced by distributed software storage. In addition, all cluster nodes are used both as computing nodes and as storage nodes. This solution increases utilization of the cluster resources as well as improves fault tolerance and performance of the distributed storage. Another advantage of this solution is high scalability with a relatively low initial and maintenance cost. The solution is built on the basis of open source components like OpenStack, CEPH, etc.
Software Risk Identification for Interplanetary Probes
NASA Technical Reports Server (NTRS)
Dougherty, Robert J.; Papadopoulos, Periklis E.
2005-01-01
The need for a systematic and effective software risk identification methodology is critical for interplanetary probes that are using increasingly complex and critical software. Several probe failures are examined that suggest more attention and resources need to be dedicated to identifying software risks. The direct causes of these failures can often be traced to systemic problems in all phases of the software engineering process. These failures have led to the development of a practical methodology to identify risks for interplanetary probes. The proposed methodology is based upon the tailoring of the Software Engineering Institute's (SEI) method of taxonomy-based risk identification. The use of this methodology will ensure a more consistent and complete identification of software risks in these probes.
Use of PharmaCALogy Software in a PBL Programme to Teach Nurse Prescribing
ERIC Educational Resources Information Center
Coleman, Iain P. L.; Watts, Adam S.
2007-01-01
Pharmacology is taught on a dedicated module for nurse prescribers who have a limited physical science background. To facilitate learning a problem-based approach was adopted. However, to enhance students' knowledge of drug action a PharmaCALogy software package from the British Pharmacological Society was used. Students were alternately given a…
Designing software for operational decision support through coloured Petri nets
NASA Astrophysics Data System (ADS)
Maggi, F. M.; Westergaard, M.
2017-05-01
Operational support provides, during the execution of a business process, replies to questions such as 'how do I end the execution of the process in the cheapest way?' and 'is my execution compliant with some expected behaviour?' These questions may be asked several times during a single execution and, to answer them, dedicated software components (the so-called operational support providers) need to be invoked. Therefore, an infrastructure is needed to handle multiple providers, maintain data between queries about the same execution and discard information when it is no longer needed. In this paper, we use coloured Petri nets (CPNs) to model and analyse software implementing such an infrastructure. This analysis is needed to clarify the requirements before implementation and to guarantee that the resulting software is correct. To this aim, we present techniques to represent and analyse state spaces with 250 million states on a normal PC. We show how the specified requirements have been implemented as a plug-in of the process mining tool ProM and how the operational support in ProM can be used in combination with an existing operational support provider.
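At its core, the analysis mentioned is explicit state-space exploration; a minimal breadth-first exploration of a toy transition system (far from 250 million states, and not CPN Tools' engine) shows the shape of the computation.

```python
from collections import deque

# Toy transition system: states are (x, y) counters; the guard on t2
# plays the role of a transition condition in a coloured Petri net.
def successors(state):
    x, y = state
    nxt = []
    if x < 3:
        nxt.append((x + 1, y))           # transition t1
    if y < 3 and x > y:
        nxt.append((x, y + 1))           # transition t2, guarded
    return nxt

def explore(initial):
    """Breadth-first enumeration of all reachable states."""
    seen = {initial}
    frontier = deque([initial])
    while frontier:
        s = frontier.popleft()
        for t in successors(s):
            if t not in seen:
                seen.add(t)
                frontier.append(t)
    return seen

print(f"reachable states: {len(explore((0, 0)))}")
```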
Model-Based Fault Diagnosis: Performing Root Cause and Impact Analyses in Real Time
NASA Technical Reports Server (NTRS)
Figueroa, Jorge F.; Walker, Mark G.; Kapadia, Ravi; Morris, Jonathan
2012-01-01
Generic, object-oriented fault models, built according to causal-directed graph theory, have been integrated into an overall software architecture dedicated to monitoring and predicting the health of mission- critical systems. Processing over the generic fault models is triggered by event detection logic that is defined according to the specific functional requirements of the system and its components. Once triggered, the fault models provide an automated way for performing both upstream root cause analysis (RCA), and for predicting downstream effects or impact analysis. The methodology has been applied to integrated system health management (ISHM) implementations at NASA SSC's Rocket Engine Test Stands (RETS).
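A causal-directed-graph fault model supports both directions of traversal: upstream reachability for root cause analysis, downstream for impact analysis. The sketch below shows the idea on an invented miniature model; node names are hypothetical, not from the NASA implementation.

```python
# Minimal causal directed graph: edge cause -> effects.
CAUSES = {
    "valve_stuck": ["low_flow"],
    "pump_degraded": ["low_flow"],
    "low_flow": ["chamber_pressure_drop"],
    "chamber_pressure_drop": ["thrust_loss"],
}

def _reverse(graph):
    rev = {}
    for cause, effects in graph.items():
        for e in effects:
            rev.setdefault(e, []).append(cause)
    return rev

def closure(graph, start):
    """All nodes reachable from start (depth-first traversal)."""
    seen, stack = set(), [start]
    while stack:
        for m in graph.get(stack.pop(), []):
            if m not in seen:
                seen.add(m)
                stack.append(m)
    return seen

event = "low_flow"
print("root causes:", closure(_reverse(CAUSES), event))   # upstream RCA
print("impacts:    ", closure(CAUSES, event))             # downstream effects
```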
Numerical algorithm for optimization of positive electrode in lead-acid batteries
NASA Astrophysics Data System (ADS)
Murariu, Ancuta Teodora; Buimaga-Iarinca, Luiza; Morari, Cristian
2017-12-01
The positive electrode in lead-acid batteries is one of the most sensitive parts of the whole battery, since it is affected by various aggressive chemical processes during its life. Therefore, an optimal design of the positive electrode may dramatically improve the properties of the battery, such as total capacity or endurance during its life. Our efforts dedicated to this goal cover a range of rather complex tasks, from design based on numerical analysis to statistical analysis. We present the structure of the software implementation and the results obtained for three types of positive electrodes.
ZBIT Bioinformatics Toolbox: A Web-Platform for Systems Biology and Expression Data Analysis
Römer, Michael; Eichner, Johannes; Dräger, Andreas; Wrzodek, Clemens; Wrzodek, Finja; Zell, Andreas
2016-01-01
Bioinformatics analysis has become an integral part of research in biology. However, installation and use of scientific software can be difficult and often requires technical expert knowledge. Reasons are dependencies on certain operating systems or required third-party libraries, missing graphical user interfaces and documentation, or nonstandard input and output formats. In order to make bioinformatics software easily accessible to researchers, we here present a web-based platform. The Center for Bioinformatics Tuebingen (ZBIT) Bioinformatics Toolbox provides web-based access to a collection of bioinformatics tools developed for systems biology, protein sequence annotation, and expression data analysis. Currently, the collection encompasses software for conversion and processing of community standards SBML and BioPAX, transcription factor analysis, and analysis of microarray data from transcriptomics and proteomics studies. All tools are hosted on a customized Galaxy instance and run on a dedicated computation cluster. Users only need a web browser and an active internet connection in order to benefit from this service. The web platform is designed to facilitate the usage of the bioinformatics tools for researchers without advanced technical background. Users can combine tools for complex analyses or use predefined, customizable workflows. All results are stored persistently and are reproducible. For each tool, we provide documentation, tutorials, and example data to maximize usability. The ZBIT Bioinformatics Toolbox is freely available at https://webservices.cs.uni-tuebingen.de/. PMID:26882475
NASA Astrophysics Data System (ADS)
Nocente, M.; Tardocchi, M.; Olariu, A.; Olariu, S.; Pereira, R. C.; Chugunov, I. N.; Fernandes, A.; Gin, D. B.; Grosso, G.; Kiptily, V. G.; Neto, A.; Shevelev, A. E.; Silva, M.; Sousa, J.; Gorini, G.
2013-04-01
High resolution γ-ray spectroscopy measurements at MHz counting rates were carried out at nuclear accelerators, combining a LaBr3(Ce) detector with dedicated hardware and software solutions based on digitization and off-line analysis. Spectra were measured at counting rates up to 4 MHz, with little or no degradation of the energy resolution, adopting a pile-up rejection algorithm. The reported results represent a step forward towards the final goal of high resolution γ-ray spectroscopy measurements on a burning plasma device.
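A simple form of pile-up rejection on digitized traces, accepting a trace only if it contains a single peak above threshold, can be sketched as follows; this illustrates the concept only and is not the algorithm used in the paper.

```python
import numpy as np

def count_peaks(trace, threshold):
    """Count local maxima above threshold in a digitized pulse trace."""
    t = np.asarray(trace, dtype=float)
    above = t > threshold
    rising = (t[1:-1] > t[:-2]) & (t[1:-1] >= t[2:])
    return int(np.sum(above[1:-1] & rising))

def accept(trace, threshold=0.2):
    return count_peaks(trace, threshold) == 1    # single event only

x = np.arange(200)
single = np.exp(-((x - 60) ** 2) / 50)           # one clean pulse
piled = single + np.exp(-((x - 90) ** 2) / 50)   # a second pulse overlaps
print(accept(single), accept(piled))             # True False
```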
Gabadinho, José; Beteva, Antonia; Guijarro, Matias; Rey-Bakaikoa, Vicente; Spruce, Darren; Bowler, Matthew W.; Brockhauser, Sandor; Flot, David; Gordon, Elspeth J.; Hall, David R.; Lavault, Bernard; McCarthy, Andrew A.; McCarthy, Joanne; Mitchell, Edward; Monaco, Stéphanie; Mueller-Dieckmann, Christoph; Nurizzo, Didier; Ravelli, Raimond B. G.; Thibault, Xavier; Walsh, Martin A.; Leonard, Gordon A.; McSweeney, Sean M.
2010-01-01
The design and features of a beamline control software system for macromolecular crystallography (MX) experiments developed at the European Synchrotron Radiation Facility (ESRF) are described. This system, MxCuBE, allows users to easily and simply interact with beamline hardware components and provides automated routines for common tasks in the operation of a synchrotron beamline dedicated to experiments in MX. Additional functionality is provided through intuitive interfaces that enable the assessment of the diffraction characteristics of samples, experiment planning, automatic data collection and the on-line collection and analysis of X-ray emission spectra. The software can be run in a tandem client-server mode that allows for remote control and relevant experimental parameters and results are automatically logged in a relational database, ISPyB. MxCuBE is modular, flexible and extensible and is currently deployed on eight macromolecular crystallography beamlines at the ESRF. Additionally, the software is installed at MAX-lab beamline I911-3 and at BESSY beamline BL14.1. PMID:20724792
Automated Assignment of MS/MS Cleavable Cross-Links in Protein 3D-Structure Analysis
NASA Astrophysics Data System (ADS)
Götze, Michael; Pettelkau, Jens; Fritzsche, Romy; Ihling, Christian H.; Schäfer, Mathias; Sinz, Andrea
2015-01-01
CID-MS/MS cleavable cross-linkers hold enormous potential for the automated analysis of cross-linked products, which is essential for conducting structural proteomics studies. The characteristic fragment ion patterns they create can readily be used for automated assignment and discrimination of cross-linked products. To date, only a few software solutions make use of these properties, and none allows a fully automated analysis of cleavable cross-linked products. The MeroX software fills this gap and presents a powerful tool for protein 3D-structure analysis in combination with MS/MS cleavable cross-linkers. We show that MeroX allows automatic screening of characteristic fragment ions, considers static and variable peptide modifications, and effectively scores different types of cross-links. No manual input is required for correct assignment of cross-links, and false discovery rates are calculated. The self-explanatory graphical user interface of MeroX provides easy access to an automated cross-link search platform that is compatible with commonly used data file formats, enabling analysis of data originating from different instruments. The combination of an MS/MS cleavable cross-linker with a dedicated software tool for data analysis provides an automated workflow for 3D-structure analysis of proteins. MeroX is available at
Tillmar, Andreas O; Kling, Daniel; Butler, John M; Parson, Walther; Prinz, Mechthild; Schneider, Peter M; Egeland, Thore; Gusmão, Leonor
2017-07-01
Forensic genetic laboratories perform an increasing number of genetic analyses of the X chromosome, in particular to solve complex cases of kinship analysis. For some biological relationships X-chromosomal markers can be more informative than autosomal markers, and a large number of markers, methods and databases have been described for forensic use. Due to their particular mode of inheritance, and their physical location on a single chromosome, some specific considerations are required when estimating the weight of evidence for X-chromosomal marker DNA data. The DNA Commission of the International Society for Forensic Genetics (ISFG) hereby presents guidelines and recommendations for the use of X-chromosomal markers in kinship analysis, with a special focus on the biostatistical evaluation. Linkage and linkage disequilibrium (association of alleles) are of special importance for such evaluations, and these concepts and their implications for likelihood calculations are described in more detail. Furthermore, it is important to use appropriate computer software that accounts for linkage and linkage disequilibrium among loci, as well as for mutations. Even though some software exists, there is still a need for further improvement of dedicated software. Copyright © 2017 Elsevier B.V. All rights reserved.
Approaching the exa-scale: a real-world evaluation of rendering extremely large data sets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Patchett, John M; Ahrens, James P; Lo, Li - Ta
2010-10-15
Extremely large scale analysis is becoming increasingly important as supercomputers and their simulations move from petascale to exascale. The lack of dedicated hardware acceleration for rendering on today's supercomputing platforms motivates our detailed evaluation of the possibility of interactive rendering on the supercomputer. In order to facilitate our understanding of rendering on the supercomputing platform, we focus on the scalability of rendering algorithms and architectures envisioned for exascale datasets. To understand tradeoffs for dealing with extremely large datasets, we compare three different rendering algorithms for large polygonal data: software-based ray tracing, software-based rasterization and hardware-accelerated rasterization. We present a case study of strong and weak scaling of rendering extremely large data on both GPU- and CPU-based parallel supercomputers using ParaView, a parallel visualization tool. We use three different data sets: two synthetic and one from a scientific application. At an extreme scale, algorithmic rendering choices make a difference and should be considered while approaching exascale computing, visualization, and analysis. We find that software-based ray tracing offers a viable approach for scalable rendering of the projected future massive data sizes.
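For readers unfamiliar with the two scaling regimes compared in this case study, the sketch below (a minimal illustration, not code from the paper) shows how the corresponding efficiency figures are typically computed: strong scaling fixes the total problem size while adding cores, whereas weak scaling grows the problem together with the core count.

```python
# Minimal helpers illustrating strong/weak scaling metrics as used in
# rendering benchmarks like this one (illustrative only, not from the paper).

def strong_scaling_efficiency(t1: float, tn: float, n: int) -> float:
    """Same total problem on n cores: ideal is tn = t1 / n (efficiency 1.0)."""
    return (t1 / tn) / n

def weak_scaling_efficiency(t1: float, tn: float) -> float:
    """Problem grows with n: ideal is constant runtime (efficiency 1.0)."""
    return t1 / tn

# Example: 100 s on 1 core vs. 14 s on 8 cores -> ~0.89 strong-scaling efficiency.
print(strong_scaling_efficiency(100.0, 14.0, 8))
```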
Observatory software for the Maunakea Spectroscopic Explorer
NASA Astrophysics Data System (ADS)
Vermeulen, Tom; Isani, Sidik; Withington, Kanoa; Ho, Kevin; Szeto, Kei; Murowinski, Rick
2016-07-01
The Canada-France-Hawaii Telescope is currently in the conceptual design phase to redevelop its facility into the new Maunakea Spectroscopic Explorer (MSE). MSE is designed to be the largest non-ELT optical/NIR astronomical telescope, and will be a fully dedicated facility for multi-object spectroscopy over a broad range of spectral resolutions. This paper outlines the software and control architecture envisioned for the new facility. The architecture will be designed around much of the existing software infrastructure currently used at CFHT as well as the latest proven open-source software. CFHT plans to minimize risk and development time by leveraging existing technology.
Integrated web system of geospatial data services for climate research
NASA Astrophysics Data System (ADS)
Okladnikov, Igor; Gordov, Evgeny; Titov, Alexander
2016-04-01
Georeferenced datasets are currently actively used for modeling, interpretation and forecasting of climatic and ecosystem changes on different spatial and temporal scales. Due to the inherent heterogeneity of environmental datasets, as well as their huge size (up to tens of terabytes for a single dataset), dedicated software supporting studies in the climate and environmental change areas is required. An approach for integrated analysis of georeferenced climatological data sets, based on a combination of web and GIS technologies in the framework of the spatial data infrastructure paradigm, is presented. Following this approach, a dedicated data-processing web system for integrated analysis of heterogeneous georeferenced climatological and meteorological data is being developed. It is based on Open Geospatial Consortium (OGC) standards and involves many modern solutions such as an object-oriented programming model, modular composition, and JavaScript libraries based on the GeoExt library, the ExtJS framework and OpenLayers software. This work is supported by the Ministry of Education and Science of the Russian Federation, Agreement #14.613.21.0037.
Digital readout for image converter cameras
NASA Astrophysics Data System (ADS)
Honour, Joseph
1991-04-01
There is an increasing need for fast and reliable analysis of recorded sequences from image converter cameras so that experimental information can be readily evaluated without recourse to more time-consuming photographic procedures. A digital readout system has been developed using a randomly triggerable high resolution CCD camera, the output of which is suitable for use with an IBM AT-compatible PC. Within half a second of receipt of the trigger pulse, the frame reformatter displays the image, and transfer to storage media can be readily achieved via the PC and dedicated software. Two software programs offer different levels of image manipulation, including enhancement routines and parameter calculations with accuracy down to the pixel level. Hard copy prints can be acquired using a specially adapted Polaroid printer; outputs for laser and video printers extend the overall versatility of the system.
High-resolution, continuous field-of-view (FOV), non-rotating imaging system
NASA Technical Reports Server (NTRS)
Huntsberger, Terrance L. (Inventor); Stirbl, Robert C. (Inventor); Aghazarian, Hrand (Inventor); Padgett, Curtis W. (Inventor)
2010-01-01
A high-resolution CMOS imaging system especially suitable for use in a periscope head is described. The imaging system includes a sensor head for scene acquisition, and a control apparatus inclusive of distributed processors and software for device control, data handling, and display. The sensor head encloses a combination of wide field-of-view CMOS imagers and narrow field-of-view CMOS imagers. Each bank of imagers is controlled by a dedicated processing module in order to handle information flow and image analysis of the outputs of the camera system. The imaging system also includes an automated or manually controlled display system and software providing an interactive graphical user interface (GUI) that displays a full 360-degree field of view and allows the user or an automated ATR system to select regions for higher-resolution inspection.
Use of CellNetAnalyzer in biotechnology and metabolic engineering.
von Kamp, Axel; Thiele, Sven; Hädicke, Oliver; Klamt, Steffen
2017-11-10
Mathematical models of cellular metabolism have become an essential tool for the optimization of biotechnological processes. They help to obtain a systemic understanding of the metabolic processes in the microorganisms used and to find suitable genetic modifications maximizing production performance. In particular, methods of stoichiometric and constraint-based modeling are frequently used in the context of metabolic and bioprocess engineering. Since metabolic networks can be complex and comprise hundreds or even thousands of metabolites and reactions, dedicated software tools are required for efficient analysis. One such software suite is CellNetAnalyzer, a MATLAB package providing, among other features, various methods for analyzing stoichiometric and constraint-based metabolic models. CellNetAnalyzer can be used via command-line operations or via a graphical user interface with embedded network visualizations. Herein we present key functionalities of CellNetAnalyzer for applications in biotechnology and metabolic engineering and thereby review constraint-based modeling techniques such as metabolic flux analysis, flux balance analysis, flux variability analysis, metabolic pathway analysis (elementary flux modes) and methods for computational strain design. Copyright © 2017 The Author(s). Published by Elsevier B.V. All rights reserved.
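To make the constraint-based idea concrete: flux balance analysis solves a linear program, maximizing an objective flux subject to steady-state mass balance (Sv = 0) and flux bounds. The sketch below is a minimal toy illustration in Python/SciPy (a hypothetical two-metabolite network; it does not show CellNetAnalyzer's MATLAB API):

```python
# Flux balance analysis on a toy network: maximize flux through reaction R4
# subject to steady state (S v = 0) and bounds 0 <= v <= 10 (all hypothetical).
import numpy as np
from scipy.optimize import linprog

S = np.array([
    [1, -1,  0,  0],   # metabolite A: produced by R1, consumed by R2
    [0,  1, -1, -1],   # metabolite B: produced by R2, consumed by R3 and R4
])
bounds = [(0, 10)] * 4      # irreversible reactions with arbitrary caps
c = [0, 0, 0, -1]           # linprog minimizes, so negate to maximize v4

res = linprog(c, A_eq=S, b_eq=np.zeros(2), bounds=bounds)
print("optimal fluxes:", res.x, "max v4:", -res.fun)
```

Flux variability analysis then repeats this optimization, minimizing and maximizing each flux in turn under the same constraints.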
DOE Office of Scientific and Technical Information (OSTI.GOV)
Malik, S.; Shipsey, I.; Cavanaugh, R.
To impart hands-on training in physics analysis, the CMS experiment initiated the concept of the CMS Data Analysis School (CMSDAS). It was born over three years ago at the LPC (LHC Physics Centre) at Fermilab and is based on earlier workshops held at the LPC and the CLEO experiment. As CMS transitioned from construction to data taking, the nature of the training evolved to include more analysis tools, software tutorials and physics analysis. This effort, epitomized as CMSDAS, has proven to be key for new and young physicists to jump-start and contribute to the physics goals of CMS by looking for new physics with the collision data. With over 400 physicists trained in six CMSDAS schools around the globe, CMS is trying to engage the collaboration in its discovery potential and maximize physics output. As a bigger goal, CMS is striving to nurture and increase engagement of its myriad talents in the development of physics, service, upgrades, the education of those new to CMS, and the career development of younger members. An extension of the concept to dedicated software and hardware schools is also planned, keeping in mind the ensuing upgrade phase.
NASA Astrophysics Data System (ADS)
Papanikolaou, Xanthos; Anastasiou, Demitris; Marinou, Aggeliki; Zacharis, Vangelis; Paradissis, Demitris
2015-04-01
The Dionysos Satellite Observatory and Higher Geodesy Laboratory of the National Technical University of Athens have developed an automated processing scheme to accommodate the daily analysis of all available continuous GNSS stations in Greece. At the moment, a total of approximately 150 regional stations are processed, divided into 4 subnetworks. GNSS data are processed routinely on a daily basis via Bernese GNSS Software v5.0, developed by AIUB. Each network is solved twice within a period of 20 days, first using ultra-rapid products (with a latency of ~10 hours) and then using final products (with a latency of ~20 days). Carrier-phase observations are processed, modelled as double differences in the ionosphere-free linear combination. Analysis results include coordinate estimates, ionospheric corrections (TEC maps) and hourly tropospheric parameters (zenith delay). This processing scheme has proved helpful for near-real-time investigation of abrupt geophysical phenomena, as in the 2011 Santorini inflation episode and the 2014 Kephalonia earthquake events. All analysis results and products are made available via a dedicated webpage. Additionally, most of the GNSS data are hosted on a GSAC web platform, available to all interested parties. Data and results are made available through the laboratory's dedicated website: http://dionysos.survey.ntua.gr/.
NASA Astrophysics Data System (ADS)
Dervilllé, A.; Labrosse, A.; Zimmermann, Y.; Foucher, J.; Gronheid, R.; Boeckx, C.; Singh, A.; Leray, P.; Halder, S.
2016-03-01
The dimensional scaling in IC manufacturing strongly drives the demands on CD and defect metrology techniques and their measurement uncertainties. Defect review has become as important as CD metrology, and together they create a new metrology paradigm with a completely new need for flexible, robust and scalable metrology software. Current software architectures and metrology algorithms perform well, but they must be pushed to a higher level in order to follow the roadmap's speed and requirements. For example: managing defects and CD in a one-step algorithm, customizing algorithms and output features for each R&D team environment, and providing software updates every day or every week so that R&D teams can easily explore various development strategies. The final goal is to avoid spending hours and days manually tuning algorithms to analyze metrology data, allowing R&D teams to stay focused on their expertise. The benefits are drastic cost reductions, more efficient R&D teams and better process quality. In this paper, we propose a new generation of software platform and development infrastructure which can integrate specific metrology business modules. For example, we will show the integration of a chemistry module dedicated to electronics materials like Direct Self Assembly features. We will show a new generation of image analysis algorithms able to manage at the same time defect rates, image classifications, CD and roughness measurements, with high-throughput performance compatible with HVM. In a second part, we will assess the reliability and customizability of the algorithms and the software platform's capability to meet new semiconductor metrology software requirements: flexibility, robustness, high throughput and scalability. Finally, we will demonstrate how such an environment has allowed a drastic reduction of the data analysis cycle time.
Eurogrid: a new glideinWMS based portal for CDF data analysis
NASA Astrophysics Data System (ADS)
Amerio, S.; Benjamin, D.; Dost, J.; Compostella, G.; Lucchesi, D.; Sfiligoi, I.
2012-12-01
The CDF experiment at Fermilab ended its Run-II phase in September 2011 after 11 years of operations and 10 fb-1 of collected data. The CDF computing model is based on a Central Analysis Farm (CAF) consisting of local computing and storage resources, supported by OSG and LCG resources accessed through dedicated portals. At the beginning of 2011 a new portal, Eurogrid, was developed to effectively exploit computing and disk resources in Europe: a dedicated farm and storage area at the TIER-1 CNAF computing center in Italy, and additional LCG computing resources at different TIER-2 sites in Italy, Spain, Germany and France, are accessed through a common interface. The goal of this project is to develop a portal that is easy to integrate into the existing CDF computing model, is completely transparent to the user, and requires a minimum amount of maintenance support from the CDF collaboration. In this paper we review the implementation of this new portal and its performance in the first months of usage. Eurogrid is based on the glideinWMS software, a glidein-based Workload Management System (WMS) that works on top of Condor. As the CDF CAF is based on Condor, the choice of the glideinWMS software was natural and the implementation seamless. Thanks to the pilot jobs, user-specific requirements and site resources are matched in a very efficient way, completely transparent to the users. Official since June 2011, Eurogrid effectively complements and supports CDF computing resources, offering an optimal solution for the future in terms of the manpower required for administration, support and development.
NASA Technical Reports Server (NTRS)
Lux, James P.; Taylor, Gregory H.; Lang, Minh; Stern, Ryan A.
2011-01-01
An FPGA module leverages the previous work from Goddard Space Flight Center (GSFC) relating to NASA's Space Telecommunications Radio System (STRS) project. The STRS SpaceWire FPGA Module is written in the Verilog Register Transfer Level (RTL) language, and it encapsulates an unmodified GSFC core (which is written in VHDL). The module has the necessary inputs/outputs (I/Os) and parameters to integrate seamlessly with the SPARC I/O FPGA Interface module (also developed for the STRS operating environment, OE). Software running on the SPARC processor can access the configuration and status registers within the SpaceWire module. This allows software to control and monitor the SpaceWire functions, but it is also used to give software direct access to what is transmitted and received through the link. SpaceWire data characters can be sent/received through the software interface, as well as through the dedicated interface on the GSFC core. Similarly, SpaceWire time codes can be sent/received through the software interface or through a dedicated interface on the core. This innovation is designed for plug-and-play integration in the STRS OE. The SpaceWire module simplifies the interfaces to the GSFC core, and synchronizes all I/O to a single clock. An interrupt output (with optional masking) identifies time-sensitive events within the module. Test modes were added to allow internal loopback of the SpaceWire link and internal loopback of the client-side data interface.
Dynamic Positioning Capability Analysis for Marine Vessels Based on A DPCap Polar Plot Program
NASA Astrophysics Data System (ADS)
Wang, Lei; Yang, Jian-min; Xu, Sheng-wen
2018-03-01
Dynamic positioning capability (DPCap) analysis is essential in the selection of thrusters, in their configuration, and during preliminary investigation of the positioning ability of a newly designed vessel's dynamic positioning system. DPCap analysis can help determine the maximum environmental forces that the DP system can counteract at given headings. The accuracy of the DPCap analysis is determined by the precise estimation of the environmental forces as well as the effectiveness of the thrust allocation logic. This paper is dedicated to developing an effective and efficient software program for DPCap analysis of marine vessels. Estimates of the environmental forces can be obtained from model tests, hydrodynamic computation and empirical formulas. A quadratic programming method is adopted to allocate the total thrust over every thruster of the vessel. A detailed description of the thrust allocation logic of the software program is given. The effectiveness of the new program, DPCap Polar Plot (DPCPP), was validated by a DPCap analysis for a supply vessel. The present study indicates that the developed program can be used in DPCap analysis for marine vessels. Moreover, DPCap analysis considering the thruster failure mode may give guidance to the designers of vessels whose thrusters must meet higher safety requirements.
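The quadratic-programming allocation step can be pictured as follows: given the generalized environmental load τ = [Fx, Fy, Mz] to counteract and a thruster configuration matrix B, find individual thrusts u minimizing a power-like cost subject to B u = τ and per-thruster limits. The sketch below is a minimal illustration with a hypothetical four-thruster layout; it is not the paper's actual allocation code:

```python
# Toy thrust allocation as a QP: minimize u'u subject to B u = tau and bounds.
import numpy as np
from scipy.optimize import minimize

B = np.array([
    [1.0,  1.0,  0.0,   0.0],   # surge force contribution of each thruster
    [0.0,  0.0,  1.0,   1.0],   # sway force contribution
    [0.3, -0.3, 20.0, -20.0],   # yaw moment contribution (lever arms, m)
])
tau = np.array([50.0, 30.0, 100.0])   # required [Fx, Fy, Mz] (kN, kN, kN*m)
u_max = 100.0                         # per-thruster thrust limit (kN)

res = minimize(lambda u: u @ u,                     # power-like cost
               x0=np.zeros(4),
               bounds=[(-u_max, u_max)] * 4,
               constraints={"type": "eq", "fun": lambda u: B @ u - tau},
               method="SLSQP")
print("allocated thrusts (kN):", res.x)
```

Scanning τ over headings and scaling it until the QP becomes infeasible (or a thruster saturates) traces out exactly the kind of capability polar plot such a program produces; dropping a column of B simulates the thruster failure mode.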
ANSYS UIDL-Based CAE Development of Axial Support System for Optical Mirror
NASA Astrophysics Data System (ADS)
Yang, De-Hua; Shao, Liang
2008-09-01
The whiffle-tree type axial support mechanism is widely adopted for most relatively large optical mirrors. Based on the secondary development tools offered by the commonly used finite element analysis (FEA) software ANSYS, the ANSYS Parametric Design Language (APDL) is used for creating a parameter-driven FEA model of the mirror, and the ANSYS User Interface Design Language (UIDL) for generating custom interactive menus. Thereby, a relatively independent, dedicated Computer Aided Engineering (CAE) module is embedded in ANSYS for calculation and optimization of the axial whiffle-tree support of optical mirrors. An example is also described to illustrate the intuitive and effective usage of the dedicated module, which boosts work efficiency and reduces the engineering knowledge demanded of the user. The philosophy of a special module built by secondary development of commonly used software also suggests itself for product development in other industries.
Abreu, Rui Mv; Froufe, Hugo Jc; Queiroz, Maria João Rp; Ferreira, Isabel Cfr
2010-10-28
Virtual screening of small molecules using molecular docking has become an important tool in drug discovery. However, large scale virtual screening is time demanding and usually requires dedicated computer clusters. There are a number of software tools that perform virtual screening using AutoDock4, but they require access to dedicated Linux computer clusters. Also, no software is available for performing virtual screening with Vina using computer clusters. In this paper we present MOLA, an easy-to-use graphical user interface tool that automates parallel virtual screening using AutoDock4 and/or Vina in bootable non-dedicated computer clusters. MOLA automates several tasks including: ligand preparation, distribution of parallel AutoDock4/Vina jobs and result analysis. When the virtual screening project finishes, an open-office spreadsheet file opens with the ligands ranked by binding energy and distance to the active site. All result files can automatically be recorded on a USB flash drive or on the hard-disk drive using VirtualBox. MOLA works inside a customized Live CD GNU/Linux operating system, developed by us, that bypasses the original operating system installed on the computers used in the cluster. This operating system boots from a CD on the master node and then clusters other computers as slave nodes via ethernet connections. MOLA is an ideal virtual screening tool for non-experienced users with a limited number of multi-platform heterogeneous computers available and no access to dedicated Linux computer clusters. When a virtual screening project finishes, the computers can simply be restarted to their original operating system. The originality of MOLA lies in the fact that any available platform-independent computer can be added to the cluster, without ever using the computer's hard-disk drive and without interfering with the installed operating system. With a cluster of 10 processors, and a potential maximum speed-up of 10×, the parallel algorithm of MOLA performed with a speed-up of 8.64× using AutoDock4 and 8.60× using Vina.
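The core of such master/slave screening is embarrassingly parallel: each ligand is an independent docking job that can be farmed out to a worker, with its score collected afterwards. The sketch below illustrates the idea with Python's multiprocessing and hypothetical file names; it is not MOLA's actual implementation, and a real Vina run would also need a search-box definition (e.g. via a --config file):

```python
# Minimal parallel-docking sketch: one Vina process per ligand, 10 workers
# standing in for a 10-node cluster (paths and names are hypothetical).
import subprocess
from multiprocessing import Pool

def dock(ligand: str):
    out = "out/" + ligand.rsplit("/", 1)[-1]
    subprocess.run(["vina", "--receptor", "receptor.pdbqt",
                    "--ligand", ligand, "--out", out],  # plus a search-box config
                   check=True)
    return ligand, out

if __name__ == "__main__":
    ligands = [f"ligands/lig_{i:04d}.pdbqt" for i in range(100)]
    with Pool(processes=10) as pool:
        results = pool.map(dock, ligands)   # scores are parsed from the outputs
```

With scheduling and I/O overhead, such a 10-worker setup falls short of the ideal 10× speed-up, consistent with the ~8.6× figures reported above.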
Contribution of Electronic Medical Records to the Management of Rare Diseases
Bremond-Gignac, Dominique; Lewandowski, Elisabeth; Copin, Henri
2015-01-01
Purpose. Electronic health record systems provide a great opportunity to study most diseases. The objective of this study was to determine whether electronic medical records (EMR) in ophthalmology contribute to the management of rare eye diseases, isolated or in syndromes. The study was designed to identify and collect patients' data with ophthalmology-specific EMR. Methods. Ophthalmology-specific EMR software (Softalmo software, Corilus) was used to acquire ophthalmological consultation data from patients with five rare eye diseases. The rare eye diseases and data were selected and collected according to the expertise of the eye center. Results. A total of 135,206 outpatient consultations were performed between 2011 and 2014 in our medical center specialized in rare eye diseases. The search software identified 29 congenital aniridia, 6 Axenfeld/Rieger syndrome, 11 BEPS, 3 nanophthalmos, and 3 Rubinstein-Taybi syndrome cases. Discussion. EMR provides advantages for medical care. The use of ophthalmology-specific EMR is reliable and can contribute to a comprehensive ocular visual phenotype useful for clinical research. Conclusion. EMR routinely acquired with software dedicated to ophthalmology provides sufficient detail for rare diseases. These software-collected data appear useful for creating patient cohorts and recording ocular examinations, avoiding the time-consuming analysis of paper records and investigations, in a University Hospital linked to a National Rare Disease Reference Center. PMID:26539543
Chanu, A; Aboussouan, E; Tamaz, S; Martel, S
2006-01-01
The software architecture for the navigation of an untethered ferromagnetic device in 1D and 2D phantom environments is briefly described. Navigation is achieved using the real-time capabilities of a Siemens 1.5 T Avanto MRI system coupled with a dedicated software environment and a specially developed 3D tracking pulse sequence. Real-time control of the magnetic core is executed through the implementation of a simple PID controller. 1D and 2D experimental results are presented.
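As a point of reference for the control loop mentioned here, a discrete PID controller computes its command from the tracked position error plus its integral and derivative. The sketch below is a generic, minimal implementation with hypothetical gains and loop rate, not the authors' code:

```python
# Generic discrete PID controller (illustrative gains and 50 Hz loop rate).
class PID:
    def __init__(self, kp: float, ki: float, kd: float, dt: float):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint: float, measurement: float) -> float:
        error = setpoint - measurement
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

pid = PID(kp=2.0, ki=0.5, kd=0.1, dt=0.02)
command = pid.update(setpoint=10.0, measurement=9.2)  # e.g. a coil-drive demand
```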
The evolution of CMS software performance studies
NASA Astrophysics Data System (ADS)
Kortelainen, M. J.; Elmer, P.; Eulisse, G.; Innocente, V.; Jones, C. D.; Tuura, L.
2011-12-01
CMS has had an ongoing and dedicated effort to optimize software performance for several years. Initially this effort focused primarily on cleaning up many issues stemming from basic C++ errors, namely reducing dynamic memory churn and unnecessary copies/temporaries, and on tools to routinely monitor these issues. Over the past 1.5 years, however, the transition to 64-bit, newer versions of the gcc compiler, newer tools and the enabling of techniques like vectorization have made possible more sophisticated improvements to software performance. This presentation will cover this evolution and describe the current avenues being pursued for software performance, as well as the corresponding gains.
Developing infrared array controller with software real time operating system
NASA Astrophysics Data System (ADS)
Sako, Shigeyuki; Miyata, Takashi; Nakamura, Tomohiko; Motohara, Kentaro; Uchimoto, Yuka Katsuno; Onaka, Takashi; Kataza, Hirokazu
2008-07-01
Real-time capabilities are required for the controller of a large format array to reduce the dead time caused by readout and data transfer. Real-time processing has traditionally been achieved with dedicated processors, including DSP, CPLD, and FPGA devices. However, dedicated processors have problems with memory resources, inflexibility, and high cost. Meanwhile, a recent PC has sufficient CPU and memory resources to control an infrared array and to process a large amount of frame data in real time. In this study, we have developed an infrared array controller with a software real-time operating system (RTOS) instead of dedicated processors. A Linux PC equipped with the RTAI extension and a dual-core CPU is used as the main computer, and one of the CPU cores is allocated to real-time processing. A digital I/O board with DMA functions is used as the I/O interface. The signal-processing cores are integrated into the OS kernel as a real-time driver module, which is composed of two virtual devices: the clock processor and the frame processor tasks. The array controller with the RTOS realizes complicated operations easily, flexibly, and at low cost.
The Seven Deadly Sins of Online Microcomputing.
ERIC Educational Resources Information Center
King, Alan
1989-01-01
Offers suggestions for avoiding common errors in online microcomputer use. Areas discussed include learning the basics; hardware protection; backup options; hard disk organization; software selection; file security; and the use of dedicated communications lines. (CLB)
High resolution image processing on low-cost microcomputers
NASA Technical Reports Server (NTRS)
Miller, R. L.
1993-01-01
Recent advances in microcomputer technology have resulted in systems that rival the speed, storage, and display capabilities of traditionally larger machines. Low-cost microcomputers can provide a powerful environment for image processing. A new software program offering sophisticated image display and analysis on IBM-based systems is presented. Designed specifically for a microcomputer, this program provides a wide range of functions normally found only on dedicated graphics systems, and therefore can provide most students, universities and research groups with an affordable computer platform for processing digital images. The processing of AVHRR images within this environment is presented as an example.
On data modeling for neurological application
NASA Astrophysics Data System (ADS)
Woźniak, Karol; Mulawka, Jan
The aim of this paper is to design and implement an information system containing a large database dedicated to supporting neurological-psychiatric examinations focused on the human brain after stroke. The approach encompasses the following steps: analysis of software requirements, presentation of the problem-solving concept, and design and implementation of the final information system. Certain experiments were performed in order to verify the correctness of the project ideas. The approach can be considered an interdisciplinary venture. Elaboration of the system architecture, the data model and the tools supporting medical examinations is provided. The achievement of the design goals is demonstrated in the final conclusion.
Challenges and perspectives of metaproteomic data analysis.
Heyer, Robert; Schallert, Kay; Zoun, Roman; Becher, Beatrice; Saake, Gunter; Benndorf, Dirk
2017-11-10
In nature, microorganisms live in complex communities. Comprehensive taxonomic and functional knowledge about microbial communities supports medical and technical applications such as fecal diagnostics as well as the operation of biogas plants or wastewater treatment plants. Furthermore, microbial communities are crucial for the global carbon and nitrogen cycles in soil and in the ocean. Among the methods available for the investigation of microbial communities, metaproteomics can approximate the activity of microorganisms by investigating the protein content of a sample. Although metaproteomics is a very powerful method, issues within the bioinformatic evaluation impede its success. In particular, the construction of databases for protein identification, the grouping of redundant proteins, and taxonomic and functional annotation pose big challenges. Furthermore, the growing amounts of data within a metaproteomics study require dedicated algorithms and software. This review summarizes recent metaproteomics software and addresses these issues in detail. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Bouzaki, Mohammed Moustafa; Chadel, Meriem; Benyoucef, Boumediene; Petit, Pierre; Aillerie, Michel
2016-07-01
This contribution analyzes the energy provided by a solar kit dedicated to autonomous usage and installed in Central Europe (longitude 6.10°, latitude 49.21°, altitude 160 m) using the simulation software PVSYST. We focused the analysis on the effect of temperature and solar irradiation on the I-V characteristic of a commercial PV panel. We also consider in this study the influence of charging and discharging the battery on the generator efficiency. Meteorological data are integrated into the simulation software. As expected, the solar kit provides energy that varies over the year, with a minimum in December. In the proposed approach, we consider this minimum as the lowest acceptable energy level to satisfy the use. Thus, for the other months, a loss of available renewable energy occurs if no storage system is associated.
Localization-based super-resolution imaging meets high-content screening.
Beghin, Anne; Kechkar, Adel; Butler, Corey; Levet, Florian; Cabillic, Marine; Rossier, Olivier; Giannone, Gregory; Galland, Rémi; Choquet, Daniel; Sibarita, Jean-Baptiste
2017-12-01
Single-molecule localization microscopy techniques have proven to be essential tools for quantitatively monitoring biological processes at unprecedented spatial resolution. However, these techniques are very low throughput and are not yet compatible with fully automated, multiparametric cellular assays. This shortcoming is primarily due to the huge amount of data generated during imaging and the lack of software for automation and dedicated data mining. We describe an automated quantitative single-molecule-based super-resolution methodology that operates in standard multiwell plates and uses analysis based on high-content screening and data-mining software. The workflow is compatible with fixed- and live-cell imaging and allows extraction of quantitative data such as fluorophore photophysics, protein clustering and the dynamic behavior of biomolecules. We demonstrate that the method is compatible with high-content screening using 3D dSTORM- and DNA-PAINT-based super-resolution microscopy as well as single-particle tracking.
Aeroelastic analysis of versatile thermal insulation (VTI) panels with pinched boundary conditions
NASA Astrophysics Data System (ADS)
Carrera, Erasmo; Zappino, Enrico; Patočka, Karel; Komarek, Martin; Ferrarese, Adriano; Montabone, Mauro; Kotzias, Bernhard; Huermann, Brian; Schwane, Richard
2014-03-01
Launch vehicle design and analysis is a crucial problem in space engineering. The large range of external conditions and the complexity of space vehicles make the solution of the problem truly challenging. The problem considered in the present work deals with the versatile thermal insulation (VTI) panel. This thermal protection system is designed to reduce heat fluxes on the LH2 tank during the long coasting phases. Because of the unconventional boundary conditions and the large-scale geometry of the panel, the aeroelastic behaviour of VTI is investigated in the present work. Known results from the literature on similar problems are reviewed, considering the effects of various Mach regimes, including boundary-layer thickness effects, in-plane mechanical and thermal loads, non-linear effects and the amplitude of limit-cycle oscillations. A dedicated finite element model is developed for the supersonic regime. The models used for coupling the orthotropic layered structural model with piston theory aerodynamic models allow the calculation of flutter conditions for curved panels supported at a discrete number of points. An advanced computational aeroelasticity tool is developed using various dedicated commercial software packages (CFX, ZAERO, EDGE). A wind tunnel test campaign is carried out to assess the computational tool in the analysis of this type of problem.
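The abstract does not spell out the aerodynamic model; for orientation, the classical first-order form of the piston theory it names approximates the local pressure perturbation on the panel as proportional to the effective downwash ("piston") velocity of the surface:

```latex
% First-order piston theory (classical form): w(x,t) is the transverse panel
% deflection; \rho_\infty, a_\infty, U_\infty are the freestream density,
% speed of sound and flow speed.
\Delta p(x,t) = \rho_\infty a_\infty
  \left( \frac{\partial w}{\partial t} + U_\infty \frac{\partial w}{\partial x} \right)
```

Substituting this pressure into the structural equations yields the aeroelastic eigenproblem whose instability boundary defines the flutter condition.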
Fully automated corneal endothelial morphometry of images captured by clinical specular microscopy
NASA Astrophysics Data System (ADS)
Bucht, Curry; Söderberg, Per; Manneberg, Göran
2010-02-01
The corneal endothelium serves as the posterior barrier of the cornea. Factors such as the clarity and refractive properties of the cornea are directly related to the quality of the endothelium. The endothelial cell density is considered the most important morphological factor of the corneal endothelium. Pathological conditions and physical trauma may reduce the endothelial cell density to such an extent that the optical properties of the cornea, and thus clear eyesight, are threatened. Diagnosis of the corneal endothelium through morphometry is an important part of several clinical applications. Morphometry of the corneal endothelium is presently carried out by semi-automated analysis of pictures captured by a Clinical Specular Microscope (CSM). Because of the occasional need for operator involvement, this process can be tedious, with a negative impact on sample size. This study was dedicated to the development and use of fully automated analysis of a very large range of images of the corneal endothelium captured by CSM, using Fourier analysis. Software was developed in the mathematical programming language Matlab. Pictures of the corneal endothelium captured by CSM were read into the analysis software. The software automatically performed digital enhancement of the images, normalizing lights and contrasts. The digitally enhanced images of the corneal endothelium were Fourier transformed, using the fast Fourier transform (FFT), and stored as new images. Tools were developed and applied for identification and analysis of relevant characteristics of the Fourier-transformed images. The data obtained from each Fourier-transformed image were used to calculate the mean cell density of its corresponding corneal endothelium. The calculation was based on well-known diffraction theory. Results in the form of estimated cell densities of the corneal endothelium were obtained by running the fully automated analysis software on 292 images captured by CSM. The cell density obtained by the fully automated analysis was compared to the cell density obtained from classical, semi-automated analysis, and a relatively high correlation was found.
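The diffraction-theory idea can be sketched compactly: a regular endothelial mosaic concentrates spectral energy in a ring whose radius corresponds to the dominant cell spacing, from which a density estimate follows. The following Python/NumPy fragment is a minimal illustration of that principle (hypothetical parameters, square-lattice approximation), not the authors' Matlab implementation:

```python
# Estimate cell density from the dominant spatial frequency of the mosaic.
import numpy as np

def estimate_cell_density(image: np.ndarray, pixel_size_um: float) -> float:
    """Cells per mm^2 from a (roughly square) specular microscopy image."""
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(image - image.mean()))) ** 2
    cy, cx = spectrum.shape[0] // 2, spectrum.shape[1] // 2
    y, x = np.indices(spectrum.shape)
    r = np.hypot(y - cy, x - cx).astype(int)
    radial = np.bincount(r.ravel(), weights=spectrum.ravel()) / np.bincount(r.ravel())
    peak = radial[1:].argmax() + 1                      # skip the DC term
    spacing_um = image.shape[0] * pixel_size_um / peak  # dominant cell spacing
    return (1000.0 / spacing_um) ** 2                   # square-lattice estimate
```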
Applications of the Coastal Zone Color Scanner in oceanography
NASA Technical Reports Server (NTRS)
Mcclain, C. R.
1988-01-01
Research activity has continued to focus on applications of Coastal Zone Color Scanner (CZCS) imagery in oceanography. A number of regional studies were completed, including investigations of the temporal and spatial variability of phytoplankton populations in the South Atlantic Bight, northwest Spain, the Weddell Sea, the Bering Sea, the Caribbean Sea and the tropical Atlantic Ocean. In addition to the regional studies, much work was dedicated to developing ancillary global-scale meteorological and hydrographic data sets to complement the global CZCS processing products. To accomplish this, SEAPAK's image analysis capability was complemented with an interface to GEMPAK (the Severe Storms Branch's meteorological analysis software package) for the analysis and graphical display of gridded data fields. Plans are being made to develop a similar SEAPAK interface for hydrographic data using EPIC (a hydrographic data analysis package developed by NOAA/PMEL).
Rodríguez-Olivares, Ramón; El Faquir, Nahid; Rahhab, Zouhair; Maugenest, Anne-Marie; Van Mieghem, Nicolas M; Schultz, Carl; Lauritsch, Guenter; de Jaegere, Peter P T
2016-07-01
To study the determinants of image quality of rotational angiography using dedicated research prototype software for motion compensation, without rapid ventricular pacing, after the implantation of four commercially available catheter-based valves. Prospective observational study including 179 consecutive patients who underwent transcatheter aortic valve implantation (TAVI) with either the Medtronic CoreValve (MCS), Edwards SAPIEN valve (ESV), Boston Sadra Lotus (BSL) or Saint-Jude Portico valve (SJP), in whom rotational angiography (R-angio) with motion-compensated 3D image reconstruction was performed. Image quality was graded from 1 (excellent image quality) to 5 (strongly degraded). A distinction was made between good (grades 1-2) and poor image quality (grades 3-5). Clinical (gender, body mass index, Agatston score, heart rate and rhythm, artifacts), procedural (valve type) and technical variables (isocentricity) were related to the image quality assessment. Image quality was good in 128 (72 %) and poor in 51 (28 %) patients. By univariable analysis, only valve type (BSL) and the presence of an artifact negatively affected image quality. By multivariate analysis (in which BMI was forced into the model), BSL valve (odds ratio 3.5, 95 % CI [1.3-9.6], p = 0.02), presence of an artifact (odds ratio 2.5, 95 % CI [1.2-5.4], p = 0.02) and BMI (odds ratio 1.1, 95 % CI [1.0-1.2], p = 0.04) were independent predictors of poor image quality. Rotational angiography with motion-compensated 3D image reconstruction using dedicated research prototype software offers good image quality for the evaluation of frame geometry after TAVI in the majority of patients. Valve type, presence of artifacts and higher BMI negatively affect image quality.
[Development of software for 3D virtual phantom design].
Zou, Lian; Xie, Zhao; Wu, Qi
2014-02-01
In this paper, we present 3D virtual phantom design software developed using object-oriented programming methodology and dedicated to medical physics research. The software, named Magical Phantom (MPhantom), is composed of a 3D visual builder module and a virtual CT scanner. Users can conveniently construct any complex 3D phantom and then export it as DICOM 3.0 CT images. MPhantom is a user-friendly and powerful tool for 3D phantom configuration and has passed a real-scene application test. MPhantom will accelerate Monte Carlo simulation for dose calculation in radiation therapy and research on X-ray imaging reconstruction algorithms.
Integrating opto-thermo-mechanical design tools: open engineering's project presentation
NASA Astrophysics Data System (ADS)
De Vincenzo, P.; Klapka, Igor
2017-11-01
An integrated numerical simulation package dedicated to the analysis of the coupled interactions of optical devices is presented. To reduce human intervention during data transfers, it is based on in-memory communication between the structural analysis software OOFELIE and the optical design application ZEMAX. It allows automated enhancement of the existing optical design with information related to the deformations of optical surfaces due to thermomechanical loads. From the knowledge of these deformations, a grid of points or a decomposition based on Zernike polynomials can be generated for each surface. These data are then applied to the optical design. Finally, indicators can be retrieved from ZEMAX in order to compare the optical performance with that of the system in its nominal configuration.
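The Zernike decomposition step amounts to a linear least-squares fit of the FEA-computed surface sag against a polynomial basis on the unit pupil. The sketch below illustrates this for a few low-order terms in Python/NumPy (unnormalized polynomials, hypothetical data; not the package's actual interface):

```python
# Fit low-order Zernike terms to sampled surface deformations (unit pupil).
import numpy as np

def fit_zernike(rho: np.ndarray, theta: np.ndarray, sag: np.ndarray) -> np.ndarray:
    """Least-squares coefficients for piston, tilts, defocus and astigmatism."""
    basis = np.column_stack([
        np.ones_like(rho),           # piston
        rho * np.cos(theta),         # tilt x
        rho * np.sin(theta),         # tilt y
        2 * rho**2 - 1,              # defocus
        rho**2 * np.cos(2 * theta),  # astigmatism 0/90
        rho**2 * np.sin(2 * theta),  # astigmatism +/-45
    ])
    coeffs, *_ = np.linalg.lstsq(basis, sag, rcond=None)
    return coeffs
```

The fitted coefficients (or, alternatively, the raw grid of points) are what get handed to the optical model as a perturbation of each surface.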
A Lossless Network for Data Acquisition
NASA Astrophysics Data System (ADS)
Jereczek, Grzegorz; Lehmann Miotto, Giovanna; Malone, David; Walukiewicz, Miroslaw
2017-06-01
The bursty many-to-one communication pattern, typical of data acquisition systems, is particularly demanding for commodity TCP/IP and Ethernet technologies. We expand the study of lossless switching in software running on commercial off-the-shelf servers, using the ATLAS experiment as a case study. In this paper, we extend the popular software switch, Open vSwitch, with a dedicated, throughput-oriented buffering mechanism for data acquisition. We compare the performance under heavy congestion of typical Ethernet switches with that of a commodity server acting as a switch. Our results indicate that software switches with large buffers perform significantly better. Next, we evaluate the scalability of the system when building a larger topology of interconnected software switches, exploiting the integration with software-defined networking technologies. We build an IP-only leaf-spine network consisting of eight software switches running on distinct physical servers as a demonstrator.
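Why large buffers help can be seen with a toy discrete-time model of the incast pattern: N senders burst simultaneously into one egress port that drains at a fixed rate, and packets are lost only when the buffer overflows. The sketch below is a minimal illustration of this effect, not the paper's switch code:

```python
# Toy incast model: 40 senders burst for 50 ticks into a port draining
# 10 packets/tick; losses occur only when the FIFO buffer is full.
from collections import deque

def simulate(n_senders, burst_ticks, buffer_pkts, drain_per_tick):
    queue, dropped, delivered = deque(), 0, 0
    for _ in range(burst_ticks):
        for _ in range(n_senders):             # one packet per sender per tick
            if len(queue) < buffer_pkts:
                queue.append(1)
            else:
                dropped += 1                   # buffer overflow -> loss
        for _ in range(min(drain_per_tick, len(queue))):
            queue.popleft(); delivered += 1
    delivered += len(queue)                    # drain the residual backlog
    return delivered, dropped

print(simulate(40, 50, 500, 10))    # shallow buffer: heavy loss
print(simulate(40, 50, 4000, 10))   # deep buffer: lossless -> (2000, 0)
```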
NASA Astrophysics Data System (ADS)
Shiklomanov, A. I.; Okladnikov, I.; Gordov, E. P.; Proussevitch, A. A.; Titov, A. G.
2016-12-01
Presented is a collaborative project carried out by a joint team of researchers from the Institute of Monitoring of Climatic and Ecological Systems, Russia, and the Earth Systems Research Center, University of New Hampshire, USA. Its main objective is the development of a hardware and software prototype of a Distributed Research Center (DRC) for monitoring and projecting regional climatic changes and their impacts on the environment over the Northern extratropical areas. In the framework of the project, new approaches to "cloud" processing and analysis of large geospatial datasets (big geospatial data) are being developed; they will be deployed on the technical platforms of both institutions and applied in research on climate change and its consequences. Datasets available at NCEI and IMCES include multidimensional arrays of climatic, environmental, demographic, and socio-economic characteristics. The project is aimed at solving several major research and engineering tasks: 1) structural analysis of the huge heterogeneous climate and environmental geospatial datasets used in the project, and their preprocessing and unification; 2) development of a new distributed storage and processing model based on a "shared nothing" paradigm; 3) development of a dedicated database of metadata describing the geospatial datasets used in the project; 4) development of a dedicated geoportal and a high-end graphical frontend providing an intuitive user interface, internet-accessible online tools for the analysis of geospatial data, and web services for interoperability with other geoprocessing software packages. The DRC will operate as a single access point to distributed archives of spatial data and online tools for their processing. A flexible modular computational engine running verified data processing routines will provide solid results of geospatial data analysis. The "cloud" data analysis and visualization approach will guarantee access to the DRC online tools and data from all over the world. Additionally, export of data processing results through WMS and WFS services will provide their interoperability. Financial support of this activity by the RF Ministry of Education and Science under Agreement 14.613.21.0037 (RFMEFI61315X0037) and by the Iola Hubbard Climate Change Endowment is acknowledged.
Practical considerations of image analysis and quantification of signal transduction IHC staining.
Grunkin, Michael; Raundahl, Jakob; Foged, Niels T
2011-01-01
The dramatic increase in computer processing power, in combination with the availability of high-quality digital cameras during the last 10 years, has fertilized the grounds for quantitative microscopy based on digital image analysis. With the present introduction of robust scanners for whole-slide imaging in both research and routine use, the benefits of automation and objectivity in the analysis of tissue sections will be even more obvious. For in situ studies of signal transduction, the combination of tissue microarrays, immunohistochemistry, digital imaging, and quantitative image analysis will be central operations. However, immunohistochemistry is a multistep procedure with many technical pitfalls, leading to intra- and interlaboratory variability in its outcome. The resulting variations in staining intensity and disruption of the original morphology are an extra challenge for the image analysis software, which should therefore preferably be dedicated to the detection and quantification of histomorphometrical end points.
Practice Patterns Analysis of Ocular Proton Therapy Centers: The International OPTIC Survey.
Hrbacek, Jan; Mishra, Kavita K; Kacperek, Andrzej; Dendale, Remi; Nauraye, Catherine; Auger, Michel; Herault, Joel; Daftari, Inder K; Trofimov, Alexei V; Shih, Helen A; Chen, Yen-Lin E; Denker, Andrea; Heufelder, Jens; Horwacik, Tomasz; Swakoń, Jan; Hoehr, Cornelia; Duzenli, Cheryl; Pica, Alessia; Goudjil, Farid; Mazal, Alejandro; Thariat, Juliette; Weber, Damien C
2016-05-01
To assess the planning, treatment, and follow-up strategies worldwide in dedicated proton therapy ocular programs. Ten centers from 7 countries completed a questionnaire survey with 109 queries on the eye treatment planning system (TPS), hardware/software equipment, image acquisition/registration, patient positioning, eye surveillance, beam delivery, quality assurance (QA), clinical management, and workflow. Worldwide, 28,891 eye patients were treated with protons at the 10 centers as of the end of 2014. Most centers treated a vast number of ocular patients (1729 to 6369). Three centers treated fewer than 200 ocular patients. Most commonly, the centers treated uveal melanoma (UM) and other primary ocular malignancies, benign ocular tumors, conjunctival lesions, choroidal metastases, and retinoblastomas. The UM dose fractionation was generally within a standard range, whereas dosing for other ocular conditions was not standardized. The majority (80%) of centers used a specific ocular TPS in common. Variability existed in imaging registration, with magnetic resonance imaging (MRI) rarely being used in routine planning (20%). Increased patient to full-time-equivalent ratios were observed at higher-accruing centers (P=.0161). Generally, ophthalmologists followed up the post-radiation therapy patients, though in 40% of centers radiation oncologists also followed up the patients. Seven centers had a prospective outcomes database. All centers used a cyclotron to accelerate protons, with dedicated horizontal beam lines only. QA checks (range, modulation) varied substantially across centers. The first worldwide multi-institutional ophthalmic proton therapy survey of the clinical and technical approach shows areas of substantial overlap and areas of progress needed to achieve sustainable and systematic management. Future international efforts include research and development for imaging and planning software upgrades, increased use of MRI, development of clinical protocols, systematic patient-centered data acquisition, and publishing guidelines on QA, staffing, treatment, and follow-up parameters by dedicated ocular programs to ensure the highest level of care for ocular patients. Copyright © 2016 Elsevier Inc. All rights reserved.
On-Board Software Reference Architecture for Payloads
NASA Astrophysics Data System (ADS)
Bos, Victor; Rugina, Ana; Trcka, Adam
2016-08-01
The goal of the On-board Software Reference Architecture for Payloads (OSRA-P) is to identify an architecture for payload software that harmonizes the payload domain, enables more reuse of common/generic payload software across different payloads and missions, and eases the integration of payloads with the platform. To investigate the payload domain, recent and current payload instruments of European space missions were analyzed. This led to a Payload Catalogue describing 12 payload instruments, as well as a Capability Matrix listing the specific characteristics of each payload. In addition, a functional decomposition of payload software was prepared, containing the functionalities typically found in payload systems. The definition of OSRA-P was evaluated through case studies and a dedicated OSRA-P workshop to gather feedback from the payload community.
NASA Astrophysics Data System (ADS)
Maj, P.; Kasiński, K.; Gryboś, P.; Szczygieł, R.; Kozioł, A.
2015-12-01
Integrated circuits designed for specific applications generally use non-standard communication methods. Hybrid pixel detector readout electronics produce a huge amount of data as a result of the number of frames per second. The data need to be transmitted to a higher-level system without limiting the ASIC's capabilities. Nowadays, the Camera Link interface is still one of the fastest communication methods, allowing transmission speeds up to 800 MB/s. In order to communicate between a higher-level system and the ASIC with a dedicated protocol, an FPGA with dedicated code is required. The configuration data are received from the PC and written to the ASIC. At the same time, the same FPGA should be able to transmit the data from the ASIC to the PC at very high speed. The camera should be an embedded system enabling autonomous operation and self-monitoring. In the presented solution, at least three different hardware platforms are used: an FPGA, a microprocessor with a real-time operating system, and a PC with end-user software. We present the use of a single software platform for high-speed data transfer from a 65k-pixel camera to a personal computer.
Software Framework for Development of Web-GIS Systems for Analysis of Georeferenced Geophysical Data
NASA Astrophysics Data System (ADS)
Okladnikov, I.; Gordov, E. P.; Titov, A. G.
2011-12-01
Georeferenced datasets (meteorological databases, modeling and reanalysis results, remote sensing products, etc.) are currently actively used in numerous applications, including modeling, interpretation and forecast of climatic and ecosystem changes at various spatial and temporal scales. Due to the inherent heterogeneity of environmental datasets, as well as their size, which may reach tens of terabytes for a single dataset, present studies in the area of climate and environmental change require special software support. A dedicated software framework for rapid development of information-computational systems providing such support, based on Web-GIS technologies, has been created. The software framework consists of 3 basic parts: a computational kernel developed using ITTVIS Interactive Data Language (IDL), a set of PHP controllers run within a specialized web portal, and a JavaScript class library for development of typical components of a web mapping application graphical user interface (GUI) based on AJAX technology. The computational kernel comprises a number of modules for dataset access, mathematical and statistical data analysis, and visualization of results. The specialized web portal consists of the Apache web server, the OGC-standards-compliant GeoServer software, which is used as a base for presenting cartographical information over the Web, and a set of PHP controllers implementing the web mapping application logic and governing the computational kernel. The JavaScript library aimed at graphical user interface development is based on the GeoExt library, combining the ExtJS framework and OpenLayers software. Based on the software framework, an information-computational system for complex analysis of large georeferenced data archives was developed. Structured environmental datasets available for processing now include two editions of the NCEP/NCAR Reanalysis, the JMA/CRIEPI JRA-25 Reanalysis, the ECMWF ERA-40 Reanalysis, the ECMWF ERA Interim Reanalysis, the MRI/JMA APHRODITE's Water Resources Project Reanalysis, meteorological observational data for the territory of the former USSR for the 20th century, and others. The current version of the system is already used in scientific research; in particular, it was recently applied to the analysis of climate changes in Siberia and their regional impact. The software framework presented allows rapid development of Web-GIS systems for geophysical data analysis, thus providing specialists involved in multidisciplinary research projects with reliable and practical instruments for complex analysis of climate and ecosystem changes on global and regional scales. This work is partially supported by RFBR grants #10-07-00547, #11-05-01190, and SB RAS projects 4.31.1.5, 4.31.2.7, 4, 8, 9, 50 and 66.
OPEN-SOURCE SOFTWARE IN DENTISTRY: A SYSTEMATIC REVIEW.
Chruściel-Nogalska, Małgorzata; Smektała, Tomasz; Tutak, Marcin; Sporniak-Tutak, Katarzyna; Olszewski, Raphael
2017-01-01
Technological development and the need for electronic health records management resulted in the need for a computer with dedicated, commercial software in daily dental practice. An alternative to commercial software may be open-source solutions. Therefore, this study reviewed the current literature on the availability and use of open-source software (OSS) in dentistry. A comprehensive database search was performed on February 1, 2017. Only articles published in peer-reviewed journals with a focus on the use or description of OSS were retrieved. The level of evidence, according to the Oxford EBM Centre Levels of Evidence Scale, was classified for all studies. Experimental studies underwent additional quality-of-reporting assessment. The screening and evaluation process resulted in twenty-one studies from 1,940 articles found, with 10 of them being experimental studies. None of the articles provided level 1 evidence, and only one study was considered high quality following quality assessment. Twenty-six different OSS programs were described in the included studies, of which ten were used for image visualization, five for healthcare records management, four for education processes, one for remote consultation and simulation, and six for general purposes. Our analysis revealed that the dental literature on OSS consists of scarce, incomplete, and methodologically low-quality information.
Instrument Systems Analysis and Verification Facility (ISAVF) users guide
NASA Technical Reports Server (NTRS)
Davis, J. F.; Thomason, J. O.; Wolfgang, J. L.
1985-01-01
The ISAVF facility is primarily an interconnected system of computers, special purpose real time hardware, and associated generalized software systems, which will permit Instrument System Analysts, Design Engineers and Instrument Scientists to perform trade-off studies, specification development, instrument modeling, and verification of the instrument hardware performance. It is not the intent of the ISAVF to duplicate or replace existing special purpose facilities such as the Code 710 Optical Laboratories or the Code 750 Test and Evaluation facilities. The ISAVF will provide data acquisition and control services for these facilities, as needed, using remote computer stations attached to the main ISAVF computers via dedicated communication lines.
SysML: A Language for Space System Engineering
NASA Astrophysics Data System (ADS)
Mazzini, S.; Strangapede, A.
2008-08-01
This paper presents the results of an ESA/ESTEC internal study, performed with the support of INTECS, on modeling languages to support Space System Engineering activities and processes, with special emphasis on system requirements identification and analysis. The study focused on the assessment of dedicated UML profiles, their positioning alongside the system and software life cycles, and associated methodologies. Requirements for a Space System Requirements Language were identified considering the ECSS-E-10 and ECSS-E-40 processes. The study identified SysML as a very promising language, having as theoretical background the reference system processes defined by ISO 15288, as well as industrial practices.
The Seismic Tool-Kit (STK): an open source software for seismology and signal processing.
NASA Astrophysics Data System (ADS)
Reymond, Dominique
2016-04-01
We present an open source software project (GNU public license), named STK: Seismic ToolKit, dedicated mainly to seismology and signal processing. The STK project, started in 2007, is hosted by SourceForge.net and counts more than 19,500 downloads at the time of writing. The STK project is composed of two main branches. First, a graphical interface dedicated to signal processing of data in the SAC format (SAC_ASCII and SAC_BIN), where the signal can be plotted, zoomed, filtered, integrated, differentiated, etc. (a large variety of IIR and FIR filters is provided). The spectral density of the signal is estimated via the Fourier transform, with visualization of the Power Spectral Density (PSD) in linear or log scale, as well as an evolutive time-frequency representation (or sonagram). Three-component signals can also be processed to estimate their polarization properties, either for a given window or for evolutive windows along the time axis. This polarization analysis is useful for extracting polarized noise and for differentiating P waves, Rayleigh waves, Love waves, etc. Secondly, a panel of utility programs is provided for working in terminal mode, with basic programs for computing azimuth and distance in spherical geometry, inter/auto-correlation, spectral density, time-frequency analysis for an entire directory of signals, focal planes and main component axes, the radiation pattern of P waves, polarization analysis of different waves (including noise), under/over-sampling of signals, cubic-spline smoothing, and linear/non-linear regression analysis of data sets. A MINimum library of Linear AlGebra (MIN-LINAG) is also provided for the main matrix operations: QR/QL decomposition, Cholesky solution of linear systems, finding eigenvalues/eigenvectors, QR-solve/eigen-solve of linear equation systems, etc. STK is developed in C/C++, mainly under Linux OS, and has also been partially implemented under MS-Windows. Useful links: http://sourceforge.net/projects/seismic-toolkit/ http://sourceforge.net/p/seismic-toolkit/wiki/browse_pages/
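As a rough illustration of the PSD estimation step described above, the following sketch computes a Welch power spectral density in Python (STK itself is written in C/C++; the sampling rate and synthetic signal here are assumptions for demonstration):

```python
# Illustrative Welch PSD estimate in the spirit of STK's spectral module;
# STK is C/C++, so this NumPy/SciPy stand-in is a sketch, not its code.
import numpy as np
from scipy.signal import welch

fs = 100.0                                  # assumed sampling rate (Hz)
t = np.arange(0, 60, 1 / fs)                # one minute of synthetic data
x = np.sin(2 * np.pi * 5 * t) + 0.5 * np.random.randn(t.size)

freqs, psd = welch(x, fs=fs, nperseg=1024)  # power spectral density
# STK offers linear or log display; the matplotlib equivalent would be
# plt.semilogy(freqs, psd)
```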
PROFIL: A Method for the Development of Multimedia.
ERIC Educational Resources Information Center
Koper, Rob
1995-01-01
Describes a dedicated method for the design of multimedia courseware, called PROFIL, which integrates instructional design with software engineering techniques and incorporates media selection in the design methodology. The phases of development are outlined: preliminary investigation, definition, script, technical realization, implementation, and…
USDA-ARS?s Scientific Manuscript database
Welcome to the Morchella MLST database. This dedicated database was set up at the CBS-KNAW Biodiversity Center by Vincent Robert in February 2012, using BioloMICS software (Robert et al., 2011), to facilitate DNA sequence-based identifications of Morchella species via the Internet. The current datab...
Prototyping machine vision software on the World Wide Web
NASA Astrophysics Data System (ADS)
Karantalis, George; Batchelor, Bruce G.
1998-10-01
Interactive image processing is a proven technique for analyzing industrial vision applications and building prototype systems. Several of the previous implementations have used dedicated hardware to perform the image processing, with a top layer of software providing a convenient user interface. More recently, self-contained software packages have been devised that run on a standard computer. The advent of the Java programming language has made it possible to write platform-independent software operating over the Internet or a company-wide Intranet. Thus, there arises the possibility of designing at least some shop-floor inspection/control systems without the vision engineer ever entering the factories where they will be used. If successful, this project will have a major impact on the productivity of vision systems designers.
GenColors: annotation and comparative genomics of prokaryotes made easy.
Romualdi, Alessandro; Felder, Marius; Rose, Dominic; Gausmann, Ulrike; Schilhabel, Markus; Glöckner, Gernot; Platzer, Matthias; Sühnel, Jürgen
2007-01-01
GenColors (gencolors.fli-leibniz.de) is a new web-based software/database system aimed at improved and accelerated annotation of prokaryotic genomes, taking into account information on related genomes and making extensive use of genome comparison. It offers a seamless integration of data from ongoing sequencing projects and annotated genomic sequences obtained from GenBank. A variety of export/import filters manages an effective data flow from sequence assembly and manipulation programs (e.g., GAP4) to GenColors and back, as well as to standard GenBank file(s). The genome comparison tools include best bidirectional hits, gene conservation, syntenies, and gene core sets. Precomputed UniProt matches allow annotation and analysis in an effective manner. In addition to these analysis options, base-specific quality data (coverage and confidence) can also be handled if available. The GenColors system can be used both for annotation purposes in ongoing genome projects and as an analysis tool for finished genomes. GenColors comes in two types, as dedicated genome browsers and as the Jena Prokaryotic Genome Viewer (JPGV). Dedicated genome browsers contain genomic information on a set of related genomes and offer a large number of options for genome comparison. The system has been used efficiently in the genomic sequencing of Borrelia garinii and is currently applied to various ongoing genome projects on Borrelia, Legionella, Escherichia, and Pseudomonas genomes. One of these dedicated browsers, the Spirochetes Genome Browser (sgb.fli-leibniz.de) with Borrelia, Leptospira, and Treponema genomes, is freely accessible. The others will be released after finalization of the corresponding genome projects. JPGV (jpgv.fli-leibniz.de) offers information on almost all finished bacterial genomes, though with reduced genome comparison functionality compared to the dedicated browsers. As of January 2006, this viewer includes 632 genomic elements (e.g., chromosomes and plasmids) of 293 species. The system provides versatile quick and advanced search options for all currently known prokaryotic genomes and generates circular and linear genome plots. Gene information sheets contain basic gene information, database search options, and links to external databases. GenColors is also available on request for local installation.
An Integrated Nonlinear Analysis library - (INA) for solar system plasma turbulence
NASA Astrophysics Data System (ADS)
Munteanu, Costel; Kovacs, Peter; Echim, Marius; Koppan, Andras
2014-05-01
We present an integrated software library dedicated to the analysis of time series recorded in space and adapted to investigate turbulence, intermittency and multifractals. The library is written in MATLAB and provides a graphical user interface (GUI) customized for the analysis of space physics data available online, such as: Coordinated Data Analysis Web (CDAWeb), Automated Multi Dataset Analysis system (AMDA), Planetary Science Archive (PSA), World Data Center Kyoto (WDC), Ulysses Final Archive (UFA) and Cluster Active Archive (CAA). Three main modules are already implemented in INA: the Power Spectral Density (PSD) analysis, the wavelet and intermittency analysis, and the Probability Density Functions (PDF) analysis. The layered structure of the software allows the user to easily switch between different modules/methods while retaining the same time interval for the analysis. The wavelet analysis module includes algorithms to compute and analyse the PSD, the scalogram, the Local Intermittency Measure (LIM) and the flatness parameter. The PDF analysis module includes algorithms for computing the PDFs for a range of scales and parameters fully customizable by the user; it also computes the flatness parameter and enables fast comparison with standard PDF profiles such as, for instance, the Gaussian PDF. The library has already been tested on Cluster and Venus Express data and we will show relevant examples. Research supported by the European Community's Seventh Framework Programme (FP7/2007-2013) under grant agreement no 313038/STORM, and a grant of the Romanian Ministry of National Education, CNCS UEFISCDI, project number PN-II-ID PCE-2012-4-0418.
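To make the flatness diagnostic concrete, here is a generic increment-based sketch in Python (INA computes flatness from wavelet coefficients in MATLAB; the synthetic signal and scales below are placeholders):

```python
# Generic flatness-vs-scale sketch; a Python stand-in for INA's MATLAB
# wavelet-based computation. Flatness near 3 at all scales indicates
# Gaussian fluctuations; growth toward small scales signals intermittency.
import numpy as np

def flatness(signal, scales):
    result = {}
    for s in scales:
        incr = signal[s:] - signal[:-s]      # increments at scale s
        result[s] = np.mean(incr**4) / np.mean(incr**2)**2
    return result

b = np.cumsum(np.random.randn(100_000))      # synthetic field proxy
print(flatness(b, scales=[1, 4, 16, 64]))
```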
OpenMSI: A High-Performance Web-Based Platform for Mass Spectrometry Imaging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rubel, Oliver; Greiner, Annette; Cholia, Shreyas
Mass spectrometry imaging (MSI) enables researchers to probe endogenous molecules directly within the architecture of the biological matrix. Unfortunately, efficient access, management, and analysis of the data generated by MSI approaches remain major challenges to this rapidly developing field. Despite the availability of numerous dedicated file formats and software packages, it is a widely held viewpoint that the biggest challenge is simply opening, sharing, and analyzing a file without loss of information. Here we present OpenMSI, a software framework and platform that addresses these challenges via an advanced, high-performance, extensible file format and Web API for remote data access (http://openmsi.nersc.gov). The OpenMSI file format supports storage of raw MSI data, metadata, and derived analyses in a single, self-describing format based on HDF5 and is supported by a large range of analysis software (e.g., Matlab and R) and programming languages (e.g., C++, Fortran, and Python). Careful optimization of the storage layout of MSI data sets using chunking, compression, and data replication accelerates common, selective data access operations while minimizing data storage requirements, and is a critical enabler of rapid data I/O. The OpenMSI file format has been shown to provide a >2000-fold improvement for image access operations, enabling spectrum and image retrieval in less than 0.3 s across the Internet even for 50 GB MSI data sets. To make remote high-performance compute resources accessible for analysis and to facilitate data sharing and collaboration, we describe an easy-to-use yet powerful Web API, enabling fast and convenient access to MSI data, metadata, and derived analysis results stored remotely to facilitate high-performance data analysis and enable implementation of Web-based data sharing, visualization, and analysis.
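The role of chunking and compression can be illustrated with h5py; this generic sketch is not the actual OpenMSI storage layout (whose chunk shapes and replication strategy are tuned in the paper), and the cube dimensions are assumptions:

```python
# Generic sketch of chunked, compressed HDF5 storage for an MSI cube
# (x, y, m/z); not OpenMSI's actual layout.
import h5py
import numpy as np

data = np.random.rand(100, 100, 1000).astype("float32")   # toy MSI cube

with h5py.File("msi_demo.h5", "w") as f:
    dset = f.create_dataset(
        "msi",
        data=data,
        chunks=(8, 8, 256),        # chunked layout speeds selective reads
        compression="gzip",        # trades CPU for storage savings
        compression_opts=4,
    )
    dset.attrs["axes"] = "x,y,mz"  # minimal self-describing metadata

with h5py.File("msi_demo.h5", "r") as f:
    image = f["msi"][:, :, 500]    # one m/z image touches few chunks
```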
Using Amazon's Elastic Compute Cloud to dynamically scale CMS computational resources
NASA Astrophysics Data System (ADS)
Evans, D.; Fisk, I.; Holzman, B.; Melo, A.; Metson, S.; Pordes, R.; Sheldon, P.; Tiradani, A.
2011-12-01
Large international scientific collaborations such as the Compact Muon Solenoid (CMS) experiment at the Large Hadron Collider have traditionally addressed their data reduction and analysis needs by building and maintaining dedicated computational infrastructure. Emerging cloud computing services such as Amazon's Elastic Compute Cloud (EC2) offer short-term CPU and storage resources with costs based on usage. These services allow experiments to purchase computing resources as needed, without significant prior planning and without long-term investments in facilities and their management. We have demonstrated that services such as EC2 can successfully be integrated into the production-computing model of CMS, and find that they work very well as worker nodes. The cost structure and transient nature of EC2 services make them inappropriate for some CMS production services and functions. We also found that the resources are not truly "on-demand", as limits and caps on usage are imposed. Our trial workflows allow us to make a cost comparison between EC2 resources and dedicated CMS resources at a University, and conclude that it is most cost effective to purchase dedicated resources for the "base-line" needs of experiments such as CMS. However, if the ability to use cloud computing resources is built into an experiment's software framework before demand requires their use, cloud computing resources make sense for bursting during times when spikes in usage are required.
Evaluation and Selection of a Telecommunication System at the Naval Postgraduate School BOQ
1991-03-01
software, and operating system software as part of a complete package. An internal modem must be included for remote diagnostics and programming... 2 service delivery will follow the CCITT V.35 recommendation for physical, functional and electrical interfaces. Types 1 & 2 will transfer data at 56K... by a dedicated access line at 4.8 kbps, 9.6 kbps and 56/64 kbps. PSS will follow the CCITT X.25 recommendations. E-mail service may be provided on
Allasia, Paolo; Manconi, Andrea; Giordan, Daniele; Baldo, Marco; Lollino, Giorgio
2013-01-01
We present a new method for near-real-time monitoring of surface displacements due to landslide phenomena, namely the ADVanced dIsplaCement monitoring system for Early warning (ADVICE). The procedure includes: (i) data acquisition and transfer protocols; (ii) data collection, filtering, and validation; (iii) data analysis and restitution through a set of dedicated software; (iv) recognition of displacement/velocity thresholds, with early warning messages via SMS and/or email; (v) automatic publication of the results on a dedicated webpage. We show how the system evolved and the results obtained by applying ADVICE over three years in a real early warning scenario relevant to a large earthflow located in southern Italy. ADVICE has sped up and facilitated the understanding of the landslide phenomenon, the communication of the monitoring results to the partners, and consequently the decision-making process in a critical scenario. Our work might have potential applications not only for landslide monitoring but also in other contexts, such as the monitoring of other geohazards and of complex infrastructures (open-pit mines, buildings, dams, etc.). PMID:23807688
Networking CD-ROMs: A Tutorial Introduction.
ERIC Educational Resources Information Center
Perone, Karen
1996-01-01
Provides an introduction to CD-ROM networking. Highlights include LAN (local area network) architectures for CD-ROM networks, peer-to-peer networks, shared file and dedicated file servers, commercial software/vendor solutions, problems, multiple hardware platforms, and multimedia. Six figures illustrate network architectures and a sidebar contains…
Kellie, John F.; Tran, John C.; Lee, Ji Eun; Ahlf, Dorothy R.; Thomas, Haylee M.; Ntai, Ioanna; Catherman, Adam D.; Durbin, Kenneth R.; Zamdborg, Leonid; Vellaichamy, Adaikkalam; Thomas, Paul M.
2011-01-01
Top Down mass spectrometry (MS) has emerged as an alternative to common Bottom Up strategies for protein analysis. In the Top Down approach, intact proteins are fragmented directly in the mass spectrometer to achieve both protein identification and characterization, even capturing information on combinatorial post-translational modifications. Just in the past two years, Top Down MS has seen incremental advances in instrumentation and dedicated software, and has also experienced a major boost from refined separations of whole proteins in complex mixtures that have both high recovery and reproducibility. Combined with steadily advancing commercial MS instrumentation and data processing, a high-throughput workflow covering intact proteins and polypeptides up to 70 kDa is foreseeable in the near future. PMID:20711533
NASA Astrophysics Data System (ADS)
Gann, E.; Young, A. T.; Collins, B. A.; Yan, H.; Nasiatka, J.; Padmore, H. A.; Ade, H.; Hexemer, A.; Wang, C.
2012-04-01
We present the development and characterization of a dedicated resonant soft x-ray scattering facility. Capable of operation over a wide energy range, the beamline and endstation are primarily used for scattering from soft matter systems around the carbon K-edge (~285 eV). We describe the specialized design of the instrument and characteristics of the beamline. Operational characteristics of immediate interest to users, such as polarization control, degree of higher harmonic spectral contamination, and detector noise, are delineated. Of special interest is the development of a higher harmonic rejection system that improves the spectral purity of the x-ray beam. Special software and a user-friendly interface have been implemented to allow real-time data processing and preliminary data analysis simultaneous with data acquisition.
Recording and assessment of evoked potentials with electrode arrays.
Miljković, N; Malešević, N; Kojić, V; Bijelić, G; Keller, T; Popović, D B
2015-09-01
In order to optimize the procedure for the assessment of evoked potentials and to provide visualization of the flow of action potentials along the motor systems, we introduced array electrodes for stimulation and recording and developed software for the analysis of the recordings. The system uses a stimulator connected to an electrode array for the generation of evoked potentials; an electrode array connected to an amplifier, A/D converter and computer for the recording of evoked potentials; and a dedicated software application. The method has been tested for the assessment of the H-reflex on the triceps surae muscle in six healthy humans. The stimulation electrode array with 16 pads was positioned over the posterior aspect of the thigh, while the recording electrode array with 16 pads was positioned over the triceps surae muscle. The stimulator activated all the pads of the stimulation electrode array asynchronously, while the signals were recorded continuously at all the recording sites. The results are topography maps (spatial distribution of evoked potentials) and matrices (spatial visualization of nerve excitability). The software allows the automatic selection of the lowest stimulation intensity that achieves maximal H-reflex amplitude and selection of the recording/stimulation pads according to predefined criteria. The analysis of results shows that the method provides rich information compared with the conventional recording of the H-reflex with regard to spatial distribution.
Assessment of replicate bias in 454 pyrosequencing and a multi-purpose read-filtering tool.
Jérôme, Mariette; Noirot, Céline; Klopp, Christophe
2011-05-26
The Roche 454 pyrosequencing platform is often considered the most versatile of the Next Generation Sequencing technology platforms, permitting the sequencing of large genomes, the analysis of variations or the study of transcriptomes. A recently reported bias leads to the production of multiple reads for a unique DNA fragment in a random manner within a run. This bias has a direct impact on the quality of the measurement of the representation of the fragments using the reads. Other cleaning steps are usually performed on the reads before assembly or alignment. PyroCleaner is a software module intended to clean 454 pyrosequencing reads in order to ease the assembly process. This program is free software and is distributed under the terms of the GNU General Public License as published by the Free Software Foundation. It implements several filters using criteria such as read duplication, length, complexity, base-pair quality and number of undetermined bases. It can also clean flowgram files (.sff) of paired-end sequences, generating on one hand a validated paired-end file and on the other hand a single-read file. Read cleaning has always been an important step in sequence analysis. The PyroCleaner Python module is a Swiss army knife dedicated to 454 read cleaning. It includes commonly used filters as well as specialized ones such as duplicated-read removal and paired-end read verification.
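To make the duplicate-read filter concrete, here is a minimal prefix-based sketch in Python; it is not PyroCleaner's actual algorithm, and the prefix length is an assumed parameter:

```python
# Minimal duplicate-read filter sketch (not PyroCleaner's code): 454
# replicate reads typically share their 5' start, so reads with an
# identical prefix are collapsed, keeping the longest representative.
def dedup_reads(reads, prefix_len=50):
    """reads: iterable of (read_id, sequence); returns the kept subset."""
    best = {}
    for read_id, seq in reads:
        key = seq[:prefix_len]
        if key not in best or len(seq) > len(best[key][1]):
            best[key] = (read_id, seq)
    return list(best.values())

reads = [("r1", "ACGTACGTAA"), ("r2", "ACGTACGTAAGG"), ("r3", "TTGCA")]
print(dedup_reads(reads, prefix_len=8))  # r2 replaces r1; r3 is kept
```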
Krecsák, László; Micsik, Tamás; Kiszler, Gábor; Krenács, Tibor; Szabó, Dániel; Jónás, Viktor; Császár, Gergely; Czuni, László; Gurzó, Péter; Ficsor, Levente; Molnár, Béla
2011-01-18
The immunohistochemical detection of estrogen (ER) and progesterone (PR) receptors in breast cancer is routinely used for prognostic and predictive testing. Whole-slide digitization supported by dedicated software tools allows quantification of image objects (e.g. cell membranes, nuclei) and an unbiased analysis of immunostaining results. Validation studies of image analysis applications for the detection of ER and PR in breast cancer specimens provided strong concordance between the pathologist's manual assessment of slides and scoring performed using different software applications. The effectiveness of two connected semi-automated image analysis software applications (the NuclearQuant v. 1.13 application for Pannoramic™ Viewer v. 1.14) for determination of ER and PR status in formalin-fixed paraffin-embedded breast cancer specimens immunostained with the automated Leica Bond Max system was studied. First, the detection algorithm was calibrated to the scores provided by an independent assessor (pathologist), using selected areas from 38 small digital slides (created from 16 cases) containing a mean number of 195 cells. Each cell was manually marked and scored according to the Allred system, which combines frequency and intensity scores. The performance of the calibrated algorithm was then tested on 16 cases (14 invasive ductal carcinoma, 2 invasive lobular carcinoma) against the pathologist's manual scoring of digital slides. The detection was calibrated to 87 percent object detection agreement and almost perfect Total Score agreement (Cohen's kappa 0.859, quadratic weighted kappa 0.986), up from slight or moderate agreement at the start of the study with the uncalibrated algorithm. The performance of the application was tested against the pathologist's manual scoring of digital slides on 53 regions of interest of 16 ER and PR slides covering all positivity ranges, and the quadratic weighted kappa provided almost perfect agreement (κ = 0.981) between the two scoring schemes. The NuclearQuant v. 1.13 application for Pannoramic™ Viewer v. 1.14 proved to be a reliable image analysis tool for pathologists testing ER and PR status in breast cancer.
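For reference, the quadratic weighted kappa reported above is the standard agreement statistic sketched below; this is the generic formula, not the authors' code, and the toy score vectors are placeholders:

```python
# Generic quadratic weighted kappa (Cohen's kappa with quadratic weights).
import numpy as np

def quadratic_weighted_kappa(a, b, n_classes):
    a, b = np.asarray(a), np.asarray(b)
    O = np.zeros((n_classes, n_classes))             # observed agreement
    for i, j in zip(a, b):
        O[i, j] += 1
    E = np.outer(np.bincount(a, minlength=n_classes),
                 np.bincount(b, minlength=n_classes)) / len(a)
    idx = np.arange(n_classes)
    W = (idx[:, None] - idx[None, :]) ** 2 / (n_classes - 1) ** 2
    return 1 - (W * O).sum() / (W * E).sum()

# Allred total scores run 0-8, i.e. 9 classes:
print(quadratic_weighted_kappa([0, 3, 8, 5], [0, 4, 8, 5], n_classes=9))
```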
Iyoke, C A; Ugwu, O G; Ezugwu, F O; Onah, H E; Agbata, A T; Ajah, L C
2014-01-01
It has been suggested that women with early miscarriage or ectopic pregnancy are best cared for in dedicated units which offer rapid and definitive ultrasonographic and biochemical assessment at the initial review of the patient. Our aims were to describe the current protocols for the assessment and treatment of women with early miscarriage or ectopic pregnancy as reported by Nigerian gynecologists, and to determine whether dedicated early pregnancy services such as Early Pregnancy Assessment Units could be introduced to improve care. A cross-sectional survey of Nigerian gynecologists attending the 46th Annual Scientific Conference of the Society of Gynaecology and Obstetrics of Nigeria was conducted. This was a questionnaire-based study. Data analysis was by descriptive statistics using Statistical Package for the Social Sciences software, version 17.0 for Windows (IBM Corporation, Armonk, NY, USA). A total of 232 gynecologists working in 52 different secondary and tertiary health facilities participated in the survey. The mean age of the respondents was 42.6 ± 9.1 years (range 28-70 years). The proportion of gynecologists reporting that women with early miscarriage or ectopic pregnancy were first managed within the hospital general emergency room was 92%. The mean reported interval between arrival in hospital and first ultrasound scan was 4.9 ± 1.4 hours (range ½-8 hours). Transvaginal scan was stated as the routine initial imaging investigation by only 17.2% of respondents. Approximately 94.8% of respondents felt that dedicated early pregnancy services were feasible and should be introduced to improve the care of women with early miscarriage and ectopic pregnancy. Reported protocols for managing early miscarriage or ectopic pregnancy in many health facilities in Nigeria appear to engender unnecessary delays and avoidable costs, and dedicated early pregnancy services could be both useful and feasible in addressing these shortcomings in the way women with such conditions are currently managed.
A software solution for recording circadian oscillator features in time-lapse live cell microscopy.
Sage, Daniel; Unser, Michael; Salmon, Patrick; Dibner, Charna
2010-07-06
Fluorescent and bioluminescent time-lapse microscopy approaches have been successfully used to investigate molecular mechanisms underlying the mammalian circadian oscillator at the single cell level. However, most of the available software and common methods based on intensity-threshold segmentation and frame-to-frame tracking are not applicable in these experiments. This is due to cell movement and dramatic changes in the fluorescent/bioluminescent reporter protein during the circadian cycle, with the lowest expression level very close to the background intensity. At present, the standard approach to analyzing data sets obtained from time-lapse microscopy is either manual tracking or application of generic image-processing software/dedicated tracking software. To our knowledge, these existing software solutions for manual and automatic tracking have strong limitations in tracking individual cells if their plane shifts. In an attempt to improve the existing methodology for time-lapse tracking of a large number of moving cells, we have developed a semi-automatic software package. It extracts the trajectories of the cells by tracking their displacements, delineates the cell nucleus or whole cell, and finally yields measurements of various features, such as reporter protein expression level or cell displacement. As an example, we present here single-cell circadian pattern and motility analysis of NIH3T3 mouse fibroblasts expressing a fluorescent circadian reporter protein. Using the Circadian Gene Express plugin, we performed fast and unbiased analysis of large fluorescent time-lapse microscopy datasets. Our software solution, Circadian Gene Express (CGE), is easy to use and allows precise and semi-automatic tracking of moving cells over longer periods of time. In spite of significant circadian variations in protein expression, with extremely low expression levels at the valley phase, CGE allows accurate and efficient recording of a large number of cell parameters, including level of reporter protein expression, velocity, direction of movement, and others. CGE proves to be useful for the analysis of widefield fluorescent microscopy datasets, as well as for bioluminescence imaging. Moreover, it might easily be adapted for confocal image analysis by manually choosing one of the focal planes of each z-stack at the various time points of a time series. CGE is a Java plugin for ImageJ; it is freely available at: http://bigwww.epfl.ch/sage/soft/circadian/.
Image analysis for maintenance of coating quality in nickel electroplating baths--real time control.
Vidal, M; Amigo, J M; Bro, R; van den Berg, F; Ostra, M; Ubide, C
2011-11-07
The aim of this paper is to show how analytical information can be extracted from images acquired with a flatbed scanner and used for real-time control of a nickel plating process. Digital images of steel sheets plated in a nickel bath are used to follow the process under degradation of specific additives. Dedicated software has been developed to make the results accessible to process operators. The workflow includes acquiring the RGB image, selecting the red channel data exclusively, calculating the histogram of the red channel data, and calculating the mean colour value (MCV) and the standard deviation of the red channel data. The MCV is then used by the software to determine the concentration of the additives Supreme Plus Brightner (SPB) and SA-1 (for confidentiality reasons, the chemical contents cannot be further detailed) present in the bath (these two additives degrade and their concentration changes during the process). Finally, the software informs the operator when the bath is producing plating of unsuitable quality and suggests the amount of SPB and SA-1 to be added in order to recover the original plating quality. Copyright © 2011 Elsevier B.V. All rights reserved.
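A minimal sketch of the red-channel statistics described above might look as follows in Python (the paper's software is a dedicated operator tool; this stand-in assumes an RGB image file and the Pillow/NumPy libraries):

```python
# Sketch: red-channel histogram, mean colour value (MCV) and standard
# deviation from a scanned plate image; illustrative, not the paper's code.
import numpy as np
from PIL import Image

def red_channel_stats(path):
    rgb = np.asarray(Image.open(path).convert("RGB"), dtype=np.float64)
    red = rgb[:, :, 0]                                  # red channel only
    hist, _ = np.histogram(red, bins=256, range=(0, 256))
    return hist, red.mean(), red.std()                  # histogram, MCV, std

# hist, mcv, std = red_channel_stats("plated_sheet.png")
# The MCV would then be mapped to SPB / SA-1 concentrations via calibration.
```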
AnthropMMD: An R package with a graphical user interface for the mean measure of divergence.
Santos, Frédéric
2018-01-01
The mean measure of divergence is a dissimilarity measure between groups of individuals described by dichotomous variables. It is well suited to datasets with many missing values, and it is generally used to compute distance matrices and represent phenograms. Although often used in biological anthropology and archaeozoology, this method suffers from a lack of implementation in common statistical software. A package for the R statistical software, AnthropMMD, is presented here. Offering a dynamic graphical user interface, it is the first one dedicated to Smith's mean measure of divergence. The package also provides facilities for graphical representations and the crucial step of trait selection, so that the entire analysis can be performed through the graphical user interface. Its use is demonstrated using an artificial dataset, and the impact of trait selection is discussed. Finally, AnthropMMD is compared to three other free tools available for calculating the mean measure of divergence, and is proven to be consistent with them. © 2017 Wiley Periodicals, Inc.
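For orientation, one common variant of Smith's MMD can be sketched in a few lines (AnthropMMD itself is an R package; the Anscombe angular transformation and the correction term used below are one standard choice among several, so treat this as illustrative):

```python
# Illustrative MMD between two groups over dichotomous traits, using the
# Anscombe angular transformation; not AnthropMMD's R implementation.
import numpy as np

def mmd(k1, n1, k2, n2):
    """k*, n*: per-trait positive counts and sample sizes (equal-length
    arrays). Negative per-trait terms are possible for similar groups."""
    t1 = np.arcsin(1 - 2 * (k1 + 3/8) / (n1 + 3/4))
    t2 = np.arcsin(1 - 2 * (k2 + 3/8) / (n2 + 3/4))
    corr = 1 / (n1 + 0.5) + 1 / (n2 + 0.5)       # small-sample correction
    return np.mean((t1 - t2) ** 2 - corr)

k1, n1 = np.array([12, 5, 30]), np.array([40, 40, 38])
k2, n2 = np.array([20, 9, 22]), np.array([55, 54, 50])
print(mmd(k1, n1, k2, n2))
```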
Flexible, fast and accurate sequence alignment profiling on GPGPU with PaSWAS.
Warris, Sven; Yalcin, Feyruz; Jackson, Katherine J L; Nap, Jan Peter
2015-01-01
Obtaining large-scale sequence alignments in a fast and flexible way is an important step in the analysis of next-generation sequencing data. Applications based on the Smith-Waterman (SW) algorithm are often either not fast enough, limited to dedicated tasks or not sufficiently accurate due to statistical issues. Current SW implementations that run on graphics hardware do not report the alignment details necessary for further analysis. With the Parallel SW Alignment Software (PaSWAS) it is possible (a) to have easy access to the computational power of NVIDIA-based general purpose graphics processing units (GPGPUs) to perform high-speed sequence alignments, and (b) to retrieve relevant information such as score, number of gaps and mismatches. The software reports multiple hits per alignment. The added value of the new SW implementation is demonstrated with two test cases: (1) tag recovery in next generation sequence data and (2) isotype assignment within an immunoglobulin 454 sequence data set. Both cases show the usability and versatility of the new parallel Smith-Waterman implementation.
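For readers unfamiliar with the recurrence that PaSWAS parallelizes, a plain CPU Smith-Waterman score computation looks like this (a didactic Python sketch, not the GPGPU implementation; scoring parameters are arbitrary):

```python
# Plain CPU Smith-Waterman local alignment score; PaSWAS evaluates this
# recurrence in parallel on NVIDIA GPGPUs and also reports gaps/mismatches.
import numpy as np

def smith_waterman_score(a, b, match=2, mismatch=-1, gap=-2):
    H = np.zeros((len(a) + 1, len(b) + 1), dtype=int)
    best = 0
    for i in range(1, len(a) + 1):
        for j in range(1, len(b) + 1):
            diag = H[i-1, j-1] + (match if a[i-1] == b[j-1] else mismatch)
            H[i, j] = max(0, diag, H[i-1, j] + gap, H[i, j-1] + gap)
            best = max(best, H[i, j])
    return best

print(smith_waterman_score("ACACACTA", "AGCACACA"))
```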
A real-time monitoring system for the facial nerve.
Prell, Julian; Rachinger, Jens; Scheller, Christian; Alfieri, Alex; Strauss, Christian; Rampp, Stefan
2010-06-01
Damage to the facial nerve during surgery in the cerebellopontine angle is indicated by A-trains, a specific electromyogram pattern. These A-trains can be quantified by the parameter "traintime," which is reliably correlated with postoperative functional outcome. The system presented here was designed to monitor traintime in real time. A dedicated hardware and software platform for automated continuous analysis of the intraoperative facial nerve electromyogram was developed for this purpose. The automatic detection of A-trains is performed by a software algorithm for real-time analysis of nonstationary biosignals. The system was evaluated in a series of 30 patients operated on for vestibular schwannoma. A-trains can be detected and measured automatically by the described method for real-time analysis. Traintime is monitored continuously via a graphic display and is shown as an absolute numeric value during the operation. It is an expression of the overall, cumulative length of A-trains in a given channel; a high correlation between traintime as measured by real-time analysis and functional outcome was observed both immediately after the operation (Spearman correlation coefficient [rho] = 0.664, P < .001) and in long-term outcome (rho = 0.631, P < .001). Automated real-time analysis of the intraoperative facial nerve electromyogram is the first technique capable of reliable continuous real-time monitoring. It can contribute critically to the estimation of functional outcome during the course of the operative procedure.
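As a minimal illustration of the traintime parameter (the A-train detection algorithm itself is the system's core contribution and is not reproduced here), summing detected intervals might look like this; the interval representation is an assumption:

```python
# Hypothetical sketch: traintime as the cumulative duration of A-train
# episodes detected in one EMG channel.
def traintime(intervals):
    """intervals: iterable of (start_s, end_s) A-train detections."""
    return sum(end - start for start, end in intervals)

print(traintime([(12.0, 12.4), (30.1, 30.9)]))  # 1.2 s of A-train activity
```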
Concurrent Probabilistic Simulation of High Temperature Composite Structural Response
NASA Technical Reports Server (NTRS)
Abdi, Frank
1996-01-01
A computational structural/material analysis and design tool which would meet industry's future demand for expedience and reduced cost is presented. This unique software, GENOA, is dedicated to parallel and high-speed analysis to perform probabilistic evaluation of the high-temperature composite response of aerospace systems. The development is based on detailed integration and modification of diverse fields of specialized analysis techniques and mathematical models to combine their latest innovative capabilities into a commercially viable software package. The technique is specifically designed to exploit the availability of processors to perform computationally intense probabilistic analysis assessing uncertainties in structural reliability analysis and composite micromechanics. The primary objectives achieved in this development were: (1) utilization of the power of parallel processing and static/dynamic load balancing optimization to make the complex simulation of structure, material and processing of high-temperature composites affordable; (2) computational integration and synchronization of probabilistic mathematics, structural/material mechanics and parallel computing; (3) implementation of an innovative multi-level domain decomposition technique to identify the inherent parallelism and increase convergence rates through high- and low-level processor assignment; (4) creation of the framework for a portable parallel architecture for machine-independent Multiple Instruction Multiple Data (MIMD), Single Instruction Multiple Data (SIMD), hybrid and distributed workstation types of computers; and (5) market evaluation. The results of the Phase 2 effort provide a good basis for continuation and warrant a Phase 3 government and industry partnership.
14 CFR 1214.205 - Revisit and/or retrieval services.
Code of Federal Regulations, 2012 CFR
2012-01-01
... a scheduled Shuttle flight, he will only pay for added mission planning, unique hardware or software... Section 1214.205 Aeronautics and Space NATIONAL AERONAUTICS AND SPACE ADMINISTRATION SPACE FLIGHT... priced on the basis of estimated costs. If a special dedicated Shuttle flight is required, the full...
14 CFR 1214.205 - Revisit and/or retrieval services.
Code of Federal Regulations, 2011 CFR
2011-01-01
... a scheduled Shuttle flight, he will only pay for added mission planning, unique hardware or software... Section 1214.205 Aeronautics and Space NATIONAL AERONAUTICS AND SPACE ADMINISTRATION SPACE FLIGHT... priced on the basis of estimated costs. If a special dedicated Shuttle flight is required, the full...
14 CFR 1214.205 - Revisit and/or retrieval services.
Code of Federal Regulations, 2013 CFR
2013-01-01
... a scheduled Shuttle flight, he will only pay for added mission planning, unique hardware or software... Section 1214.205 Aeronautics and Space NATIONAL AERONAUTICS AND SPACE ADMINISTRATION SPACE FLIGHT... priced on the basis of estimated costs. If a special dedicated Shuttle flight is required, the full...
Rodrigues, Rita de Cassia Vieira; Peres, Heloisa Helena Ciqueto
2013-02-01
The objective of this study was to develop an educational software program for nursing continuing education. The development incorporated applied methodological research, using the learning management system methodology created by Galvis Panqueva in association with contextualized instructional design for the software design. As a result of this study, we created a computerized educational product (CEP) called ENFNET. This study describes all the steps taken during its development. The creation of a CEP demands a great deal of study, dedication and investment, as well as specialized technical personnel to construct it. At the end of the study, the software was positively evaluated and shown to be a useful strategy to help users in their education, skills development and professional training.
Pichler, Peter; Mazanek, Michael; Dusberger, Frederico; Weilnböck, Lisa; Huber, Christian G; Stingl, Christoph; Luider, Theo M; Straube, Werner L; Köcher, Thomas; Mechtler, Karl
2012-11-02
While the performance of liquid chromatography (LC) and mass spectrometry (MS) instrumentation continues to increase, applications such as analyses of complete or near-complete proteomes and quantitative studies require constant and optimal system performance. For this reason, research laboratories and core facilities alike are recommended to implement quality control (QC) measures as part of their routine workflows. Many laboratories perform sporadic quality control checks. However, successive and systematic longitudinal monitoring of system performance would be facilitated by dedicated automatic or semiautomatic software solutions that aid an effortless analysis and display of QC metrics over time. We present the software package SIMPATIQCO (SIMPle AuTomatIc Quality COntrol) designed for evaluation of data from LTQ Orbitrap, Q-Exactive, LTQ FT, and LTQ instruments. A centralized SIMPATIQCO server can process QC data from multiple instruments. The software calculates QC metrics supervising every step of data acquisition from LC and electrospray to MS. For each QC metric the software learns the range indicating adequate system performance from the uploaded data using robust statistics. Results are stored in a database and can be displayed in a comfortable manner from any computer in the laboratory via a web browser. QC data can be monitored for individual LC runs as well as plotted over time. SIMPATIQCO thus assists the longitudinal monitoring of important QC metrics such as peptide elution times, peak widths, intensities, total ion current (TIC) as well as sensitivity, and overall LC-MS system performance; in this way the software also helps identify potential problems. The SIMPATIQCO software package is available free of charge.
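The abstract does not spell out which robust statistics SIMPATIQCO uses to learn the adequate-performance range; a common choice is a median plus scaled-MAD band, sketched here as an assumption:

```python
# Assumed sketch of learning an 'adequate performance' band from historical
# QC values; median +/- k * MAD is one standard robust choice, not
# necessarily SIMPATIQCO's exact statistic.
import numpy as np

def robust_range(values, k=3.0):
    v = np.asarray(values, dtype=float)
    med = np.median(v)
    mad = 1.4826 * np.median(np.abs(v - med))   # scaled to ~1 sigma
    return med - k * mad, med + k * mad

history = [27.1, 26.8, 27.5, 26.9, 41.0, 27.2]  # e.g. peak widths (s)
print(robust_range(history))                     # outlier barely moves band
```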
Evaluation of a breast software model for 2D and 3D X-ray imaging studies of the breast.
Baneva, Yanka; Bliznakova, Kristina; Cockmartin, Lesley; Marinov, Stoyko; Buliev, Ivan; Mettivier, Giovanni; Bosmans, Hilde; Russo, Paolo; Marshall, Nicholas; Bliznakov, Zhivko
2017-09-01
In X-ray imaging, test objects reproducing breast anatomy characteristics are produced to support optimization studies of image processing and reconstruction, lesion detection performance, image quality and radiation-induced detriment. Recently, a physical phantom with a structured background has been introduced for both 2D mammography and breast tomosynthesis. A software version of this phantom and a few related versions are now available, and a comparison between these 3D software phantoms and the physical phantom is presented here. The software breast phantom simulates a semi-cylindrical container filled with spherical beads of different diameters. Four computational breast phantoms were generated with a dedicated software application, and for two of these, physical phantoms are also available and are used for the side-by-side comparison. Planar projections in mammography and tomosynthesis were simulated under identical incident air kerma conditions. Tomosynthesis slices were reconstructed with in-house developed reconstruction software. In addition to a visual comparison, parameters such as the fractal dimension, the power-law exponent β and second-order statistics (skewness, kurtosis) of planar projections and tomosynthesis reconstructed images were compared. Visually, an excellent agreement between simulated and real planar and tomosynthesis images is observed. The comparison also shows an overall very good agreement between parameters evaluated from simulated and experimental images. The computational breast phantoms showed a close match with their physical versions. The detailed mathematical analysis of the images confirms the agreement between real and simulated 2D mammography and tomosynthesis images. The software phantom is ready for optimization purposes and for extrapolation to other breast imaging techniques. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
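The power-law exponent β mentioned above is conventionally estimated from the radially averaged 2D power spectrum; the generic sketch below (not the authors' code, with a random test image as placeholder) shows the idea:

```python
# Sketch: estimate the power-law exponent beta, P(f) ~ 1/f^beta, from the
# radially averaged 2D power spectrum of a texture image.
import numpy as np

def power_law_exponent(image):
    f2 = np.abs(np.fft.fftshift(np.fft.fft2(image))) ** 2
    ny, nx = image.shape
    y, x = np.indices((ny, nx))
    r = np.hypot(y - ny // 2, x - nx // 2).astype(int)
    radial = np.bincount(r.ravel(), weights=f2.ravel()) / np.bincount(r.ravel())
    freqs = np.arange(1, min(ny, nx) // 2)     # skip DC, stay below Nyquist
    slope, _ = np.polyfit(np.log(freqs), np.log(radial[freqs]), 1)
    return -slope

print(power_law_exponent(np.random.rand(256, 256)))  # ~0 for white noise
```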
Discrete Spring Model for Predicting Delamination Growth in Z-Fiber Reinforced DCB Specimens
NASA Technical Reports Server (NTRS)
Ratcliffe, James G.; O'Brien, T. Kevin
2004-01-01
Beam theory analysis was applied to predict delamination growth in Double Cantilever Beam (DCB) specimens reinforced in the thickness direction with pultruded pins, known as Z-fibers. The specimen arms were modeled as cantilever beams supported by discrete springs, which were included to represent the pins. A bi-linear, irreversible damage law was used to represent Z-fiber damage, the parameters of which were obtained from previous experiments. Closed-form solutions were developed for specimen compliance and displacements corresponding to Z-fiber row locations. A solution strategy was formulated to predict delamination growth, in which the parent laminate mode I critical strain energy release rate was used as the criterion for delamination growth. The solution procedure was coded into FORTRAN 90, giving a dedicated software tool for performing the delamination prediction. Comparison of analysis results with previous analysis and experiment showed good agreement, yielding an initial verification for the analytical procedure.
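A bi-linear, irreversible spring law of the kind used for the Z-fiber pins can be sketched as follows; parameter values are illustrative placeholders (the paper's were obtained from experiments), and this is a generic cohesive-type law rather than the authors' exact formulation:

```python
# Hedged sketch of a bi-linear, irreversible damage law for one Z-fiber
# spring. k0: initial stiffness, d0: damage-onset displacement, df: failure
# displacement; dmax_prev carries the damage history between load steps.
def bilinear_spring(delta, dmax_prev, k0=1.0e6, d0=0.05e-3, df=0.5e-3):
    dmax = max(dmax_prev, abs(delta))       # irreversible: damage never heals
    if dmax <= d0:
        return k0 * delta, dmax             # undamaged linear branch
    if dmax >= df:
        return 0.0, dmax                    # pin fully failed
    damage = df * (dmax - d0) / (dmax * (df - d0))    # grows from 0 to 1
    return (1.0 - damage) * k0 * delta, dmax          # degraded secant force
```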
The tracking analysis in the Q-weak experiment
NASA Astrophysics Data System (ADS)
Pan, J.; Androic, D.; Armstrong, D. S.; Asaturyan, A.; Averett, T.; Balewski, J.; Beaufait, J.; Beminiwattha, R. S.; Benesch, J.; Benmokhtar, F.; Birchall, J.; Carlini, R. D.; Cates, G. D.; Cornejo, J. C.; Covrig, S.; Dalton, M. M.; Davis, C. A.; Deconinck, W.; Diefenbach, J.; Dowd, J. F.; Dunne, J. A.; Dutta, D.; Duvall, W. S.; Elaasar, M.; Falk, W. R.; Finn, J. M.; Forest, T.; Gaskell, D.; Gericke, M. T. W.; Grames, J.; Gray, V. M.; Grimm, K.; Guo, F.; Hoskins, J. R.; Johnston, K.; Jones, D.; Jones, M.; Jones, R.; Kargiantoulakis, M.; King, P. M.; Korkmaz, E.; Kowalski, S.; Leacock, J.; Leckey, J.; Lee, A. R.; Lee, J. H.; Lee, L.; MacEwan, S.; Mack, D.; Magee, J. A.; Mahurin, R.; Mammei, J.; Martin, J. W.; McHugh, M. J.; Meekins, D.; Mei, J.; Michaels, R.; Micherdzinska, A.; Mkrtchyan, A.; Mkrtchyan, H.; Morgan, N.; Myers, K. E.; Narayan, A.; Ndukum, L. Z.; Nelyubin, V.; Nuruzzaman; van Oers, W. T. H.; Opper, A. K.; Page, S. A.; Pan, J.; Paschke, K. D.; Phillips, S. K.; Pitt, M. L.; Poelker, M.; Rajotte, J. F.; Ramsay, W. D.; Roche, J.; Sawatzky, B.; Seva, T.; Shabestari, M. H.; Silwal, R.; Simicevic, N.; Smith, G. R.; Solvignon, P.; Spayde, D. T.; Subedi, A.; Subedi, R.; Suleiman, R.; Tadevosyan, V.; Tobias, W. A.; Tvaskis, V.; Waidyawansa, B.; Wang, P.; Wells, S. P.; Wood, S. A.; Yang, S.; Young, R. D.; Zhamkochyan, S.
2016-12-01
The Q-weak experiment at Jefferson Laboratory measured the parity-violating asymmetry (A_PV) in elastic electron-proton scattering at small momentum transfer squared (Q^2 = 0.025 (GeV/c)^2), with the aim of extracting the proton's weak charge (Q_W^p) to an accuracy of 5%. As one of the major sources of uncertainty in Q_W^p, Q^2 needs to be determined to ~1% in order to reach the proposed experimental precision. For this purpose, two sets of high-resolution tracking chambers were employed in the experiment to measure tracks before and after the magnetic spectrometer. Data collected by the tracking system were then reconstructed with dedicated software into individual electron trajectories for determination of the experimental kinematics. The Q-weak kinematics and the analysis scheme for tracking data are briefly described here. The sources that contribute to the uncertainty of Q^2 are discussed, and the current analysis status is reported.
NASA Astrophysics Data System (ADS)
Chęciński, Jakub; Frankowski, Marek
2016-10-01
We present a tool for fully automated generation of both simulation configuration files (Mif) and Matlab scripts for automated data analysis, dedicated to the Object Oriented Micromagnetic Framework (OOMMF). We introduce an extended graphical user interface (GUI) that allows for fast, error-proof and easy creation of Mifs, without the programming skills usually required for manual Mif writing. With MAGE we provide OOMMF extensions that complement it with magnetoresistance and spin-transfer-torque calculations, as well as selection of local magnetization data for output. Our software allows for the creation of advanced simulation conditions such as simultaneous parameter sweeps and synchronized excitation application. Furthermore, since the output of such simulations can be long and complicated, we provide another GUI allowing for automated creation of Matlab scripts suitable for analysis of such data with Fourier and wavelet transforms as well as user-defined operations.
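A template-based Mif generator in the spirit of the tool might look like the hypothetical sketch below (MAGE's actual output is far richer; the two Specify blocks shown use standard OOMMF classes, but the sizes and file names are made up):

```python
# Hypothetical sketch of template-based Mif generation (not MAGE's code).
# Oxs_BoxAtlas and Oxs_RectangularMesh are standard OOMMF extension classes.
MIF_TEMPLATE = """# MIF 2.1
Specify Oxs_BoxAtlas:atlas {{
  xrange {{0 {x}}} yrange {{0 {y}}} zrange {{0 {z}}}
}}
Specify Oxs_RectangularMesh:mesh {{
  cellsize {{{cx} {cy} {cz}}} atlas :atlas
}}
"""

def write_mif(path, size, cell):
    with open(path, "w") as f:
        f.write(MIF_TEMPLATE.format(x=size[0], y=size[1], z=size[2],
                                    cx=cell[0], cy=cell[1], cz=cell[2]))

# One file per point of a parameter sweep, e.g. varying the element length:
for i, length in enumerate((100e-9, 200e-9, 400e-9)):
    write_mif(f"sweep_{i}.mif", (length, 100e-9, 5e-9), (5e-9, 5e-9, 5e-9))
```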
BFPTool: a software tool for analysis of Biomembrane Force Probe experiments.
Šmít, Daniel; Fouquet, Coralie; Doulazmi, Mohamed; Pincet, Frédéric; Trembleau, Alain; Zapotocky, Martin
2017-01-01
The Biomembrane Force Probe is an approachable experimental technique commonly used for single-molecule force spectroscopy and experiments on biological interfaces. The technique operates in the range of forces from 0.1 pN to 1000 pN. Experiments are typically repeated many times, conditions are often not optimal, the captured video can be unstable and lose focus; this makes efficient analysis challenging, while out-of-the-box non-proprietary solutions are not freely available. This dedicated tool was developed to integrate and simplify the image processing and analysis of videomicroscopy recordings from BFP experiments. A novel processing feature, allowing the tracking of the pipette, was incorporated to address a limitation of preceding methods. Emphasis was placed on versatility and comprehensible user interface implemented in a graphical form. An integrated analytical tool was implemented to provide a faster, simpler and more convenient way to process and analyse BFP experiments.
Rubegni, Pietro; Nami, Niccolò; Poggiali, Sara; Tataranno, Domenico; Fimiani, M
2009-05-01
Because the skin is the only organ completely accessible to visual examination, digital technology has attracted the attention of dermatologists for documenting, monitoring, measuring and classifying morphological manifestations. To describe a digital image management system dedicated to dermatological health care environments and to compare it with other existing software for digital image storage. We designed a reliable hardware structure that could ensure future scaling, because storage needs tend to grow exponentially. For the software, we chose a client-web server application based on a relational database and with a 'minimalist' user interface. We developed software with a ready-made, adaptable index of skin pathologies. It facilitates classification by pathology, patient and visit, with an advanced search option allowing access to all images according to personalized criteria. The software also offers the possibility of comparing two or more digital images (follow-up). The fact that archives of years of digital photos acquired and saved on PCs can easily be entered into the program distinguishes it from others on the market. This option is fundamental for accessing all the photos taken in years of practice without entering them one by one. The program is available to any user connected to the local Intranet, and the system may in the future be accessible directly from the Internet. All clinics and surgeries, especially those that rely on digital images, are obliged to keep up with technological advances. It is therefore hoped that our project will become a model for medical structures intending to rationalise digital and other data according to statutory requirements.
MTL distributed magnet measurement system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Nogiec, J.M.; Craker, P.A.; Garbarini, J.P.
1993-04-01
The Magnet Test Laboratory (MTL) at the Superconducting Super Collider Laboratory will be required to precisely and reliably measure properties of magnets in a production environment. The extensive testing of the superconducting magnets comprises several types of measurements whose main purpose is to evaluate basic parameters characterizing the magnetic, mechanical and cryogenic properties of the magnets. The measurement process will produce a significant amount of data which will be subjected to complex analysis. Such massive measurements require a careful design of both the hardware and software of the computer systems, with a reliable, maximally automated system in mind. In order to fulfill this requirement, a dedicated Distributed Magnet Measurement System (DMMS) is being developed.
Handling knowledge via Concept Maps: a space weather use case
NASA Astrophysics Data System (ADS)
Messerotti, Mauro; Fox, Peter
Concept Maps (Cmaps) are powerful means for knowledge coding in graphical form. As flexible software tools exist to manipulate the knowledge embedded in Cmaps in machine-readable form, such complex entities are suitable candidates not only for the representation of ontologies and semantics in Virtual Observatory (VO) architectures, but also for knowledge handling and knowledge discovery. In this work, we present a use case relevant to space weather applications and elaborate on its possible implementation and advanced use in Semantic Virtual Observatories dedicated to Sun-Earth Connections. This analysis was carried out in the framework of the Electronic Geophysical Year (eGY) and represents an achievement synergized by the eGY Virtual Observatories Working Group.
Telemedicine using free voice over internet protocol (VoIP) technology.
Miller, David J; Miljkovic, Nikola; Chiesa, Chad; Callahan, John B; Webb, Brad; Boedeker, Ben H
2011-01-01
Though dedicated videoteleconference (VTC) systems deliver high-quality, low-latency audio and video for telemedical applications, they require expensive hardware and extensive infrastructure. The purpose of this study was to investigate free, commercially available Voice over Internet Protocol (VoIP) software as a low-cost alternative for telemedicine.
"TIS": An Intelligent Gateway Computer for Information and Modeling Networks. Overview.
ERIC Educational Resources Information Center
Hampel, Viktor E.; And Others
TIS (Technology Information System) is being used at the Lawrence Livermore National Laboratory (LLNL) to develop software for Intelligent Gateway Computers (IGC) suitable for the prototyping of advanced, integrated information networks. Dedicated to information management, TIS leads the user to available information resources, on TIS or…
76 FR 61717 - Government-Owned Inventions; Availability for Licensing
Federal Register 2010, 2011, 2012, 2013, 2014
2011-10-05
... computer science based technology that may provide the capability of detecting untoward events such as... is comprised of a dedicated computer server that executes specially designed software with input data... computer assisted clinical ordering. J Biomed Inform. 2003 Feb-Apr;36(1-2):4-22. [PMID 14552843...
Portable Intelligent Tritium in Air Monitor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Purghel, L.; Calin, M.R.; Bartos, D.
2005-07-15
The tritium detection method used for this monitor is original, patented in Romania. The detection unit consists of a single ionization chamber, a special fast preamplifier, and dedicated software associated with the detection unit for signal processing. Some results concerning tritium detection in relatively strong gamma-ray fields are presented.
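A back-of-envelope model of such an ionization-chamber response, assuming full deposition of the tritium beta energy in the chamber gas and saturation operation (illustrative constants; this is not the patented method):

```python
E_MEAN_T = 5.7e3     # mean tritium beta energy [eV]
W_AIR = 34.0         # mean energy per ion pair in air [eV]
Q_E = 1.602e-19      # elementary charge [C]

def ion_current(conc_bq_m3, volume_l):
    """Ideal ionization current from tritium in air: ion pairs per second
    times elementary charge (no gamma background, no recombination)."""
    pairs_per_s = conc_bq_m3 * (volume_l * 1e-3) * E_MEAN_T / W_AIR
    return pairs_per_s * Q_E  # [A]

print(ion_current(1e6, 1.0))   # 1 MBq/m^3 in a 1 L chamber -> ~2.7e-14 A
```

The femtoampere-scale result shows why a special fast, low-noise preamplifier is central to the design.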
BioAir: Bio-Inspired Airborne Infrastructure Reconfiguration
2016-01-01
PI minicomputer powered by a different supply. The ODROID and Raspberry PI communicate via an Ethernet connection through a software interface named...HardKernel, an Atheros Wi-Fi card connected to it, and a dedicated power pack developed by RavPower. The hexarotor’s autopilot runs on a separate Raspberry
[Computer-assisted management of depots for blood products in health establishments].
Carré, J
2008-11-01
To manage the filing of blood components at the hospital of the city of Bayeux, the laboratory uses Cursus, dedicated haemovigilance software. The benefits of using this software at different steps of blood bank management are: simplification, security and harmonization of practices during receipt and issuance of blood components; secure recording through the use of bar codes for patient identification and blood component listing; implementation of a computerized tracking system for transfusion; traceability; limitation of written documents; and availability of statistics on the management of the depot.
Development of a versatile user-friendly IBA experimental chamber
NASA Astrophysics Data System (ADS)
Kakuee, Omidreza; Fathollahi, Vahid; Lamehi-Rachti, Mohammad
2016-03-01
Reliable performance of Ion Beam Analysis (IBA) techniques is based on accurate geometry of the experimental setup, employment of reliable nuclear data, and implementation of dedicated analysis software for each of the IBA techniques. It has already been shown that geometrical imperfections lead to significant uncertainties in the quantification of IBA measurements. To minimize these uncertainties, a user-friendly experimental chamber with a heuristic sample positioning system for IBA analysis was recently developed in the Van de Graaff laboratory in Tehran. This system enhances IBA capabilities, in particular the Nuclear Reaction Analysis (NRA) and Elastic Recoil Detection Analysis (ERDA) techniques. The newly developed sample manipulator provides the possibility of both controlling the tilt angle of the sample and analyzing samples of different thicknesses. Moreover, a reasonable number of samples can be loaded in the sample wheel. A comparison of the measured cross section data of the 16O(d,p1)17O reaction with the data reported in the literature confirms the performance and capability of the newly developed experimental chamber.
Illuminator, a desktop program for mutation detection using short-read clonal sequencing.
Carr, Ian M; Morgan, Joanne E; Diggle, Christine P; Sheridan, Eamonn; Markham, Alexander F; Logan, Clare V; Inglehearn, Chris F; Taylor, Graham R; Bonthron, David T
2011-10-01
Current methods for sequencing clonal populations of DNA molecules yield several gigabases of data per day, typically comprising reads of < 100 nt. Such datasets permit widespread genome resequencing and transcriptome analysis or other quantitative tasks. However, this huge capacity can also be harnessed for the resequencing of smaller (gene-sized) target regions, through the simultaneous parallel analysis of multiple subjects, using sample "tagging" or "indexing". These methods promise to have a huge impact on diagnostic mutation analysis and candidate gene testing. Here we describe a software package developed for such studies, offering the ability to resolve pooled samples carrying barcode tags and to align reads to a reference sequence using a mutation-tolerant process. The program, Illuminator, can identify rare sequence variants, including insertions and deletions, and permits interactive data analysis on standard desktop computers. It facilitates the effective analysis of targeted clonal sequencer data without dedicated computational infrastructure or specialized training. Copyright © 2011 Elsevier Inc. All rights reserved.
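To illustrate the kind of barcode demultiplexing Illuminator performs, here is a minimal sketch (hypothetical 4-nt tags; Illuminator's own implementation and its mutation-tolerant aligner are considerably more elaborate):

```python
from collections import defaultdict

BARCODES = {"ACGT": "subject_1", "TGCA": "subject_2"}  # hypothetical tags

def demultiplex(reads, bc_len=4, max_mismatch=1):
    """Assign reads to subjects by their leading barcode, tolerating
    up to `max_mismatch` mismatches; the tag is trimmed before alignment."""
    pools = defaultdict(list)
    for r in reads:
        tag, insert = r[:bc_len], r[bc_len:]
        for bc, subject in BARCODES.items():
            if sum(a != b for a, b in zip(tag, bc)) <= max_mismatch:
                pools[subject].append(insert)
                break
        else:
            pools["unassigned"].append(r)
    return pools

print(demultiplex(["ACGTTTGACC", "TGCAGGATTA", "NNNNGGATTA"]))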
ProteoSign: an end-user online differential proteomics statistical analysis platform.
Efstathiou, Georgios; Antonakis, Andreas N; Pavlopoulos, Georgios A; Theodosiou, Theodosios; Divanach, Peter; Trudgian, David C; Thomas, Benjamin; Papanikolaou, Nikolas; Aivaliotis, Michalis; Acuto, Oreste; Iliopoulos, Ioannis
2017-07-03
Profiling of proteome dynamics is crucial for understanding cellular behavior in response to intrinsic and extrinsic stimuli and the maintenance of homeostasis. Over the last 20 years, mass spectrometry (MS) has emerged as the most powerful tool for large-scale identification and characterization of proteins. Bottom-up proteomics, the most common MS-based proteomics approach, has always been challenging in terms of data management, processing, analysis and visualization, with modern instruments capable of producing several gigabytes of data from a single experiment. Here, we present ProteoSign, a freely available web application dedicated to allowing users to perform proteomics differential expression/abundance analysis in a user-friendly and self-explanatory way. Although several non-commercial standalone tools have been developed for post-quantification statistical analysis of proteomics data, most of them are not appealing to end users, as they often require the installation of programming environments and third-party software packages, and sometimes further scripting or computer programming. To avoid this bottleneck, we have developed a user-friendly software platform accessible via a web interface, enabling proteomics laboratories and core facilities to statistically analyse quantitative proteomics data sets in a resource-efficient manner. ProteoSign is available at http://bioinformatics.med.uoc.gr/ProteoSign and the source code at https://github.com/yorgodillo/ProteoSign. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
Monte Carlo Uncertainty Quantification for an Unattended Enrichment Monitor
DOE Office of Scientific and Technical Information (OSTI.GOV)
Jarman, Kenneth D.; Smith, Leon E.; Wittman, Richard S.
As a case study for uncertainty analysis, we consider a model flow monitor for measuring enrichment in gas centrifuge enrichment plants (GCEPs) that could provide continuous monitoring of all declared gas flow and provide high-accuracy gas enrichment estimates as a function of time. The monitor system could include NaI(Tl) gamma-ray spectrometers, a pressure signal-sharing device to be installed on an operator's pressure gauge or a dedicated inspector pressure sensor, and temperature sensors attached to the outside of the header pipe, to provide pressure, temperature, and gamma-ray spectra measurements of UF6 gas flow through unit header pipes. Our study builds on previous modeling and analysis methods development for enrichment monitor concepts and a software tool that was developed at Oak Ridge National Laboratory to generate and analyze synthetic data.
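The enrichment-meter principle behind such a monitor, and a Monte Carlo propagation of sensor uncertainties in the spirit of the study, can be sketched as follows (all constants and sigmas are illustrative; `k_cal` is a hypothetical detector/geometry calibration factor, not a value from the paper):

```python
import numpy as np

R_GAS = 8.314  # J/(mol K)

def enrichment_estimate(rate_186, pressure_pa, temp_k, k_cal):
    """Enrichment-meter principle: the 186 keV count rate scales with the
    U-235 content, i.e. enrichment times UF6 gas density, and the gas
    density follows the ideal gas law, n/V = P/(RT)."""
    molar_density = pressure_pa / (R_GAS * temp_k)   # mol/m^3 of UF6
    return rate_186 / (k_cal * molar_density)

# Monte Carlo propagation of measurement uncertainties (illustrative)
rng = np.random.default_rng(0)
rate = rng.normal(1200.0, 15.0, 100_000)   # counts/s
p = rng.normal(2000.0, 10.0, 100_000)      # Pa
t = rng.normal(305.0, 0.5, 100_000)        # K
e = enrichment_estimate(rate, p, t, k_cal=2.0e4)
print(e.mean(), e.std())                   # enrichment estimate and spread
```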
Metrological digital audio reconstruction
Fadeyev, Vitaliy [Berkeley, CA]; Haber, Carl [Berkeley, CA]
2004-02-19
Audio information stored in the undulations of grooves in a medium such as a phonograph record may be reconstructed, with little or no contact, by measuring the groove shape using precision metrology methods coupled with digital image processing and numerical analysis. The effects of damage, wear, and contamination may be compensated, in many cases, through image processing and analysis methods. The speed and data handling capacity of available computing hardware make this approach practical. Two examples used a general purpose optical metrology system to study a 50-year-old 78 r.p.m. phonograph record and a commercial confocal scanning probe to study a 1920s celluloid Edison cylinder. Comparisons are presented with stylus playback of the samples and with a digitally re-mastered version of an original magnetic recording. There is also a more extensive implementation of this approach, with dedicated hardware and software.
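For lateral-cut media, the recoverable audio is essentially the stylus velocity, i.e. the spatial derivative of the measured groove profile scanned along the groove; a minimal sketch of this conversion (idealized: no damage compensation, de-warping, or click removal):

```python
import numpy as np

def groove_to_audio(displacement_um, groove_speed_mm_s, samples_per_mm):
    """Recover audio from a measured lateral groove profile: the signal
    is proportional to the spatial derivative of displacement times the
    groove's linear speed under the (virtual) stylus."""
    dx_mm = 1.0 / samples_per_mm                 # spacing along the groove
    fs = groove_speed_mm_s * samples_per_mm      # audio sample rate [Hz]
    velocity = np.gradient(displacement_um * 1e-3, dx_mm) * groove_speed_mm_s
    audio = velocity / np.max(np.abs(velocity))  # normalize to +-1
    return fs, audio
```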
Gann, E; Young, A T; Collins, B A; Yan, H; Nasiatka, J; Padmore, H A; Ade, H; Hexemer, A; Wang, C
2012-04-01
We present the development and characterization of a dedicated resonant soft x-ray scattering facility. Capable of operation over a wide energy range, the beamline and endstation are primarily used for scattering from soft matter systems around the carbon K-edge (∼285 eV). We describe the specialized design of the instrument and characteristics of the beamline. Operational characteristics of immediate interest to users such as polarization control, degree of higher harmonic spectral contamination, and detector noise are delineated. Of special interest is the development of a higher harmonic rejection system that improves the spectral purity of the x-ray beam. Special software and a user-friendly interface have been implemented to allow real-time data processing and preliminary data analysis simultaneous with data acquisition. © 2012 American Institute of Physics
YANA – a software tool for analyzing flux modes, gene-expression and enzyme activities
Schwarz, Roland; Musch, Patrick; von Kamp, Axel; Engels, Bernd; Schirmer, Heiner; Schuster, Stefan; Dandekar, Thomas
2005-01-01
Background A number of algorithms for steady state analysis of metabolic networks have been developed over the years. Of these, Elementary Mode Analysis (EMA) has proven especially useful. Despite its low user-friendliness, METATOOL, as a reliable high-performance implementation of the algorithm, has been the instrument of choice up to now. As reported here, the analysis of metabolic networks has been improved by an editor and analyzer of metabolic flux modes. Analysis routines for expression levels and for the most central, well-connected metabolites and their metabolic connections are of particular interest. Results YANA features a platform-independent, dedicated toolbox for metabolic networks with a graphical user interface to calculate (integrating METATOOL), edit (including support for the SBML format), visualize, centralize, and compare elementary flux modes. Further, YANA calculates expected flux distributions for a given Elementary Mode (EM) activity pattern and vice versa. Moreover, a dissection algorithm, a centralization algorithm, and an average diameter routine can be used to simplify and analyze complex networks. Proteomics or gene expression data give a rough indication of some individual enzyme activities, whereas the complete flux distribution in the network is often not known. As such data are noisy, YANA features a fast evolutionary algorithm (EA) for the prediction of EM activities with minimum error, including alerts for inconsistent experimental data. We offer the possibility to include further known constraints (e.g. growth constraints) in the EA calculation process. The redox metabolism around glutathione reductase serves as an illustration example. All software and documentation are available for download. Conclusion A graphical toolbox and an editor for METATOOL, as well as a series of additional routines for metabolic network analyses, constitute new user-friendly software for such efforts. PMID:15929789
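The steady-state condition underlying elementary-mode analysis is S·v = 0 for stoichiometric matrix S; a minimal sketch of the kernel computation (elementary modes additionally satisfy support-minimality and irreversibility constraints, which METATOOL/YANA handle and this sketch does not):

```python
import numpy as np

def nullspace(S, tol=1e-10):
    """Basis of the steady-state flux space {v : S v = 0} via SVD."""
    _, s, vt = np.linalg.svd(S)
    return vt[np.sum(s > tol):].T   # columns span the kernel of S

# Toy linear pathway A -> B -> C with uptake and excretion lumped in
S = np.array([[ 1, -1,  0],        # metabolite A
              [ 0,  1, -1]])       # metabolite B
print(nullspace(S))                # one mode: equal flux through all steps
```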
Unobtrusive Software and System Health Management with R2U2 on a Parallel MIMD Coprocessor
NASA Technical Reports Server (NTRS)
Schumann, Johann; Moosbrugger, Patrick
2017-01-01
Dynamic monitoring of software and system health of a complex cyber-physical system requires observers that continuously monitor variables of the embedded software in order to detect anomalies and reason about root causes. There exists a variety of techniques for code instrumentation, but instrumentation might change runtime behavior and could require costly software re-certification. In this paper, we present R2U2E, a novel realization of our real-time, Realizable, Responsive, and Unobtrusive Unit (R2U2). The R2U2E observers are executed in parallel on a dedicated 16-core EPIPHANY co-processor, thereby avoiding additional computational overhead to the system under observation. A DMA-based shared memory access architecture allows R2U2E to operate without any code instrumentation or program interference.
Towards a Community Environmental Observation Network
NASA Astrophysics Data System (ADS)
Mertl, Stefan; Lettenbichler, Anton
2014-05-01
The Community Environmental Observation Network (CEON) is dedicated to the development of a free sensor network to collect and distribute environmental data (e.g. ground shaking, climate parameters). Data collection will be done with contributions from citizens, research institutions and public authorities such as communities or schools. This will lead to a large, freely available database which can be used for public information, research, the arts, and more. To start a free sensor network, the most important step is to provide easy access to free data collection and distribution tools. The initial aims of the CEON project are dedicated to the development of these tools. A high-quality data logger based on open hardware and free software is being developed, and a software suite of already existing free software for near-real-time data communication and distribution over the Internet will be assembled. Foremost, the development focuses on the collection of data related to the deformation of the earth (such as ground shaking, and surface displacement of mass movements and glaciers) and the collection of climate data. Extension to other measurements will be considered in the design. The data logger is built using open hardware prototyping platforms such as the BeagleBone Black and Arduino. Main features of the data logger are: a 24-bit analog-to-digital converter; a GPS module for time reference and positioning; wireless mesh networking using Optimized Link State Routing; near-real-time data transmission and communication; and near-real-time differential GNSS positioning using the RTKLIB software. The CEON project is supported by the Internet Foundation Austria (IPA) within the NetIdee 2013 call.
Computer systems and software engineering
NASA Technical Reports Server (NTRS)
Mckay, Charles W.
1988-01-01
The High Technologies Laboratory (HTL) was established in the fall of 1982 at the University of Houston Clear Lake. Research conducted at the High Tech Lab is focused upon computer systems and software engineering. There is a strong emphasis on the interrelationship of these areas of technology and the United States' space program. In Jan. of 1987, NASA Headquarters announced the formation of its first research center dedicated to software engineering. Operated by the High Tech Lab, the Software Engineering Research Center (SERC) was formed at the University of Houston Clear Lake. The High Tech Lab/Software Engineering Research Center promotes cooperative research among government, industry, and academia to advance the edge-of-knowledge and the state-of-the-practice in key topics of computer systems and software engineering which are critical to NASA. The center also recommends appropriate actions, guidelines, standards, and policies to NASA in matters pertinent to the center's research. Results of the research conducted at the High Tech Lab/Software Engineering Research Center have given direction to many decisions made by NASA concerning the Space Station Program.
Giandini, Tommaso; Panaino, Costanza M V; Avuzzi, Barbara; Morlino, Sara; Villa, Sergio; Bedini, Nice; Carabelli, Gabriele; Frasca, Sarah C; Romanyukha, Anna; Rosenfeld, Anatoly; Pignoli, Emanuele; Valdagni, Riccardo; Carrara, Mauro
2017-03-24
To validate and apply a method for the quantification of breathing-induced prostate motion (BIPM) for patients treated with radiotherapy and implanted with electromagnetic transponders for prostate localization and tracking. For the analysis of the electromagnetic transponder signal, dedicated software was developed and validated with a programmable breathing simulator phantom. The software was then applied to 1,132 radiotherapy fractions of 30 patients treated in the supine position, and to a further 61 fractions of 2 patients treated in the prone position. Application of the software to the phantom demonstrated the reliability of the developed method in determining simulated breathing frequencies and amplitudes. For supine patients, the in vivo analysis of BIPM resulted in median (maximum) amplitudes of 0.10 mm (0.35 mm), 0.24 mm (0.66 mm), and 0.17 mm (0.61 mm) in the left-right (LR), cranio-caudal (CC), and anterior-posterior (AP) directions, respectively. Breathing frequency ranged between 7.73 and 29.43 breaths per minute. For prone patients, the ranges of the BIPM amplitudes were 0.1-0.5 mm, 0.5-1.3 mm, and 0.7-1.7 mm in the LR, CC, and AP directions, respectively. The developed method was able to detect BIPM with sub-millimeter accuracy. While for patients treated in the supine position BIPM represents a reduced source of treatment uncertainty, for patients treated in the prone position it can be higher than 3 mm.
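A simple way to see how a breathing frequency can be extracted from a transponder trace is spectral analysis; a minimal sketch (illustrative only, not the validated software described above):

```python
import numpy as np

def breathing_freq(cc_position_mm, fs_hz):
    """Dominant breathing frequency from the cranio-caudal transponder
    trace: peak of the amplitude spectrum in the 0.1-0.6 Hz band
    (roughly 6-36 breaths per minute)."""
    x = cc_position_mm - np.mean(cc_position_mm)
    freqs = np.fft.rfftfreq(x.size, d=1.0 / fs_hz)
    amp = np.abs(np.fft.rfft(x))
    band = (freqs >= 0.1) & (freqs <= 0.6)
    return 60.0 * freqs[band][np.argmax(amp[band])]  # breaths per minute
```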
The "VoiceForum" Platform for Spoken Interaction
ERIC Educational Resources Information Center
Fynn, Fohn; Wigham, Chiara R.
2011-01-01
Showcased in the courseware exhibition, "VoiceForum" is a web-based software platform for asynchronous learner interaction in threaded discussions using voice and text. A dedicated space is provided for the tutor who can give feedback on a posted message and dialogue with the participants at a separate level from the main interactional…
A High-Resolution Stopwatch for Cents
ERIC Educational Resources Information Center
Gingl, Z.; Kopasz, K.
2011-01-01
A very low-cost, easy-to-make stopwatch is presented to support various experiments in mechanics. The high-resolution stopwatch is based on two photodetectors connected directly to the microphone input of a sound card. Dedicated free open-source software has been developed and made available to download. The efficiency is demonstrated by a free…
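The timing principle here, two photogates wired to the two channels of a stereo microphone input, can be sketched in a few lines (illustrative; this is not the open-source software the authors distribute):

```python
import numpy as np

def elapsed_time(stereo, fs, thresh=0.5):
    """Elapsed time between the first threshold crossings of the two
    sound-card channels (one photogate per channel). Resolution is
    one sample period, e.g. ~22.7 us at fs = 44100 Hz."""
    times = []
    for ch in (stereo[:, 0], stereo[:, 1]):
        x = np.abs(ch) / np.max(np.abs(ch))
        idx = np.argmax(x > thresh)        # first sample above threshold
        times.append(idx / fs)
    return times[1] - times[0]
```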
Six Pillars of Dynamic Schools
ERIC Educational Resources Information Center
Edwards, Steven W.; Chapman, Paul E.
2009-01-01
"Six Pillars of Dynamic Schools" uncovers an often overlooked truth--effective change is the product of hard work and dedication. There is no silver bullet; no matter how many programs, software packages, or new initiatives a district uses, the magic won't just "happen." Dynamic schools result from consistent and redundant focus on the fundamental…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-11-20
... Sponsoring ETP Holder for the Sponsored Participant, including, among other things, criteria related to order... surrounding ports or port fees and that the Exchange is not aware of any problems that port users would have... gateway software and hardware enhancements and resources dedicated to gateway development, quality...
Podcasting in Higher Education: Does It Make a Difference?
ERIC Educational Resources Information Center
Baker, Russell; Harrison, Jeffery; Thornton, Barry; Yates, Rhett
2010-01-01
Podcasting is a growing trend in higher education. Major software companies, such as Apple, have dedicated entire websites to podcasting. These podcasts are available to college students to be used as supplemental material for specific coursework at their particular college or university. Unfortunately, due to the new and progressive nature of the…
Federal Register 2010, 2011, 2012, 2013, 2014
2012-01-11
..., which Items have been prepared by the Exchange. The Commission is publishing this notice to solicit... ETP ID.\\5\\ \\4\\ See supra note 3. \\5\\ The Exchange has a Common Customer Gateway (``CCG'') that... gateway software and hardware enhancements and resources dedicated to gateway development, quality...
Linking Advanced Visualization and MATLAB for the Analysis of 3D Gene Expression Data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ruebel, Oliver; Keranen, Soile V.E.; Biggin, Mark
Three-dimensional gene expression PointCloud data generated by the Berkeley Drosophila Transcription Network Project (BDTNP) provides quantitative information about the spatial and temporal expression of genes in early Drosophila embryos at cellular resolution. The BDTNP team visualizes and analyzes PointCloud data using the software application PointCloudXplore (PCX). To maximize the impact of novel, complex data sets, such as PointClouds, the data needs to be accessible to biologists and comprehensible to developers of analysis functions. We address this challenge by linking PCX and Matlab via a dedicated interface, thereby providing biologists seamless access to advanced data analysis functions and giving bioinformatics researchers the opportunity to integrate their analysis directly into the visualization application. To demonstrate the usefulness of this approach, we computationally model parts of the expression pattern of the gene even skipped using a genetic algorithm implemented in Matlab and integrated into PCX via our Matlab interface.
The 2017 Bioinformatics Open Source Conference (BOSC)
Harris, Nomi L.; Cock, Peter J.A.; Chapman, Brad; Fields, Christopher J.; Hokamp, Karsten; Lapp, Hilmar; Munoz-Torres, Monica; Tzovaras, Bastian Greshake; Wiencko, Heather
2017-01-01
The Bioinformatics Open Source Conference (BOSC) is a meeting organized by the Open Bioinformatics Foundation (OBF), a non-profit group dedicated to promoting the practice and philosophy of Open Source software development and Open Science within the biological research community. The 18th annual BOSC ( http://www.open-bio.org/wiki/BOSC_2017) took place in Prague, Czech Republic in July 2017. The conference brought together nearly 250 bioinformatics researchers, developers and users of open source software to interact and share ideas about standards, bioinformatics software development, open and reproducible science, and this year’s theme, open data. As in previous years, the conference was preceded by a two-day collaborative coding event open to the bioinformatics community, called the OBF Codefest. PMID:29118973
Changes and challenges in the Software Engineering Laboratory
NASA Technical Reports Server (NTRS)
Pajerski, Rose
1994-01-01
Since 1976, the Software Engineering Laboratory (SEL) has been dedicated to understanding and improving the way in which one NASA organization, the Flight Dynamics Division (FDD), develops, maintains, and manages complex flight dynamics systems. The SEL is composed of three member organizations: NASA/GSFC, the University of Maryland, and Computer Sciences Corporation. During the past 18 years, the SEL's overall goal has remained the same: to improve the FDD's software products and processes in a measured manner. This requires that each development and maintenance effort be viewed, in part, as a SEL experiment which examines a specific technology or builds a model of interest for use on subsequent efforts. The SEL has undertaken many technology studies while developing operational support systems for numerous NASA spacecraft missions.
MMX-I: data-processing software for multimodal X-ray imaging and tomography.
Bergamaschi, Antoine; Medjoubi, Kadda; Messaoudi, Cédric; Marco, Sergio; Somogyi, Andrea
2016-05-01
A new multi-platform freeware has been developed for the processing and reconstruction of scanning multi-technique X-ray imaging and tomography datasets. The software platform aims to treat different scanning imaging techniques: X-ray fluorescence, phase, absorption and dark field, and any of their combinations, thus providing an easy-to-use data processing tool for the X-ray imaging user community. A dedicated data input stream copes with the input and management of the large datasets (several hundred GB) collected during a typical multi-technique fast scan at the Nanoscopium beamline, even on a standard PC. To the authors' knowledge, this is the first software tool that aims at treating all of the modalities of scanning multi-technique imaging and tomography experiments.
The Expanded Owens Valley Solar Array
NASA Astrophysics Data System (ADS)
Gary, Dale E.; Hurford, G. J.; Nita, G. M.; White, S. M.; Tun, S. D.; Fleishman, G. D.; McTiernan, J. M.
2011-05-01
The Expanded Owens Valley Solar Array (EOVSA) is now under construction near Big Pine, CA as a solar-dedicated microwave imaging array operating in the frequency range 1-18 GHz. The solar science to be addressed focuses on the 3D structure of the solar corona (magnetic field, temperature and density), on the sudden release of energy and subsequent particle acceleration, transport and heating, and on space weather phenomena. The project will support the scientific community by providing open data access and software tools for analysis of the data, to exploit synergies with on-going solar research in other wavelengths. The New Jersey Institute of Technology (NJIT) is expanding OVSA from its previous complement of 7 antennas to a total of 15 by adding 8 new antennas, and will reinvest in the existing infrastructure by replacing the existing control systems, signal transmission, and signal processing with modern, far more capable and reliable systems based on new technology developed for the Frequency Agile Solar Radiotelescope (FASR). The project will be completed in time to provide solar-dedicated observations during the upcoming solar maximum in 2013 and beyond. We provide an update on current status and our preparations for exploiting the data through modeling and data analysis tools. This research is supported by NSF grants AST-0908344, and AGS-0961867 and NASA grant NNX10AF27G to New Jersey Institute of Technology.
An Ibm PC/AT-Based Image Acquisition And Processing System For Quantitative Image Analysis
NASA Astrophysics Data System (ADS)
Kim, Yongmin; Alexander, Thomas
1986-06-01
In recent years, a large number of applications have been developed for image processing systems in the area of biological imaging. We have already finished the development of a dedicated microcomputer-based image processing and analysis system for quantitative microscopy. The system's primary function has been to facilitate and ultimately automate quantitative image analysis tasks such as the measurement of cellular DNA contents. We have recognized from this development experience, and interaction with system users, biologists and technicians, that the increasingly widespread use of image processing systems, and the development and application of new techniques for utilizing the capabilities of such systems, would generate a need for some kind of inexpensive general purpose image acquisition and processing system specially tailored for the needs of the medical community. We are currently engaged in the development and testing of hardware and software for a fairly high-performance image processing computer system based on a popular personal computer. In this paper, we describe the design and development of this system. Biological image processing computer systems have now reached a level of hardware and software refinement where they could become convenient image analysis tools for biologists. The development of a general purpose image processing system for quantitative image analysis that is inexpensive, flexible, and easy-to-use represents a significant step towards making the microscopic digital image processing techniques more widely applicable not only in a research environment as a biologist's workstation, but also in clinical environments as a diagnostic tool.
Atomic clock ensemble in space (ACES) data analysis
NASA Astrophysics Data System (ADS)
Meynadier, F.; Delva, P.; le Poncin-Lafitte, C.; Guerlin, C.; Wolf, P.
2018-02-01
The Atomic Clocks Ensemble in Space (ACES/PHARAO mission, ESA & CNES) will be installed on board the International Space Station (ISS) next year. A crucial part of this experiment is its two-way microwave link (MWL), which will compare the timescale generated on board with those provided by several ground stations disseminated on the Earth. A dedicated data analysis center is being implemented at SYRTE—Observatoire de Paris, where our team currently develops theoretical modelling, numerical simulations and the data analysis software itself. In this paper, we present some key aspects of the MWL measurement method and the associated algorithms for simulations and data analysis. We show the results of tests using simulated data with fully realistic effects such as fundamental measurement noise, Doppler, atmospheric delays, or cycle ambiguities. We demonstrate satisfactory performance of the software with respect to the specifications of the ACES mission. The main scientific product of our analysis is the clock desynchronisation between ground and space clocks, i.e. the difference of proper times between the space clocks and ground clocks at participating institutes. While in flight, this measurement will allow for tests of general relativity and Lorentz invariance at unprecedented levels, e.g. measurement of the gravitational redshift at the 3×10-6 level. As a specific example, we use real ISS orbit data with estimated errors at the 10 m level to study the effect of such errors on the clock desynchronisation obtained from MWL data. We demonstrate that the resulting effects are totally negligible.
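The clock desynchronisation discussed above has a textbook first-order rate that can be checked directly; a rough sketch (circular ISS orbit, equatorial ground site, first post-Newtonian order only; not the mission's analysis software):

```python
GM = 3.986004418e14   # Earth's GM [m^3/s^2]
C = 299_792_458.0     # speed of light [m/s]
R_E = 6.371e6         # mean Earth radius [m]

def fractional_rate(h_orbit_m, v_ground=465.0):
    """Relativistic rate of an orbiting clock relative to a ground clock:
    y = (Phi_s - Phi_g)/c^2 - (v_s^2 - v_g^2)/(2 c^2).
    The ISS is higher (runs faster) but much faster-moving (runs slower)."""
    r = R_E + h_orbit_m
    grav = GM * (1.0 / R_E - 1.0 / r) / C**2
    v_s2 = GM / r                          # circular orbital speed squared
    doppler = (v_s2 - v_ground**2) / (2.0 * C**2)
    return grav - doppler

y = fractional_rate(400e3)
print(y, y * 86400 * 1e6, "us/day")   # ~-2.8e-10, i.e. about -25 us/day
```

The second-order Doppler term dominates for the ISS, so the net effect is a space clock that lags the ground clock by roughly 25 microseconds per day.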
NASA Astrophysics Data System (ADS)
Saroinsong, T.; A. S Kondoj, M.; Kandiyoh, G.; Pontoh, G.
2018-01-01
The State Polytechnic of Manado (Polimdo) is one of the reliable institutions in North Sulawesi and the first to implement ISO 9001. However, its institutional accreditation has not been satisfactory, which means there is still much to be prepared to achieve the expected target. One of the criteria for institutional accreditation concerns research and community service activities, in accordance with standard seven. Data documentation systems related to research and community service activities are not well integrated or well documented across the existing work units. As a result, the process of gathering information on the activities and results of research and community service to support institutional accreditation is still inefficient. This study aims to build software integrated across all work units in Polimdo to provide documentation and data synchronization in support of the reporting of accreditation documents in accordance with standard seven, specifically the submission of research and community service proposals. The software will be developed using the RUP method, with analysis using data flow diagrams and ERM, so that the result of this research is documentation and synchronization of the data and information on research and community service activities that can be used in preparing report documents for institutional accreditation.
NASA Astrophysics Data System (ADS)
Reymond, D.
2016-12-01
We present an open-source software project (GNU public license), named STK: Seismic Tool-Kit, that is dedicated mainly to learning signal processing and seismology. The STK project, started in 2007, is hosted by SourceForge.net and counts more than 20000 downloads at the time of writing. The STK project is composed of two main branches. First, a graphical interface dedicated to signal processing in the SAC format (SAC_ASCII and SAC_BIN), where the signal can be plotted, zoomed, filtered, integrated, differentiated, etc. (a large variety of IIR and FIR filters is provided). The passage into the frequency domain via the Fourier transform is used to introduce the estimation of the spectral density of the signal, with visualization of the Power Spectral Density (PSD) in linear or log scale, and also the evolutive time-frequency representation (or sonogram). Three-component signals can also be processed to estimate their polarization properties, either for a given window or for evolutive windows along the time axis. This polarization analysis is useful for extracting polarized noise and for differentiating P waves, Rayleigh waves, Love waves, etc. Secondly, a panel of utility programs is provided for working in terminal mode, with basic programs for computing azimuth and distance in spherical geometry, inter/auto-correlation, spectral density, time-frequency representations for an entire directory of signals, focal planes and principal component axes, the radiation pattern of P waves, polarization analysis of different waves (including noise), under/over-sampling of signals, cubic-spline smoothing, and linear/non-linear regression analysis of data sets. STK is developed in C/C++, mainly under Linux OS, and has also been partially implemented under MS-Windows. STK has been used in some schools for viewing and plotting seismic records provided by IRIS, and it has been used as practical support for teaching the basics of signal processing. Useful links: http://sourceforge.net/projects/seismic-toolkit/ http://sourceforge.net/p/seismic-toolkit/wiki/browse_pages/
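For illustration, the filter-then-PSD chain that the STK GUI exposes can be reproduced with off-the-shelf scientific Python (STK itself is written in C/C++; this is not its code, and the signal is synthetic):

```python
import numpy as np
from scipy.signal import butter, filtfilt, welch

fs = 100.0                         # sampling rate [Hz]
t = np.arange(0, 600, 1 / fs)
x = np.sin(2 * np.pi * 0.2 * t) + 0.5 * np.random.randn(t.size)  # toy record

# Zero-phase band-pass IIR filter, then Welch PSD estimate
b, a = butter(4, [0.05, 5.0], btype="band", fs=fs)
y = filtfilt(b, a, x)
f, psd = welch(y, fs=fs, nperseg=4096)   # plot f vs psd on a log scale
```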
NASA Astrophysics Data System (ADS)
Deleflie, Florent; Wailliez, Sébastien; Portmann, Christophe; Gilles, M.; Vienne, Alain; Berthier, J.; Valk, St; Hautesserres, Denis; Martin, Thierry; Fraysse, Hubert
To perform orbit modelling accurate enough to provide a good estimate of the lifetime of a satellite, or to ensure the stability of a disposal orbit through centuries, we built a new orbit propagator based on the theory of mean orbital motion. It is named SECS-SD2, for Simplified and Extended CODIOR Software - Space Debris Dedicated. The CODIOR software numerically propagates averaged equations of motion, with a typical integration step size on the order of a few hours, and was originally written in classical orbital elements. The Space Debris-dedicated version is written in orbital elements suitable for orbits with small eccentricities and inclinations, so as to characterize the main dynamical properties of motion within the LEO, MEO, and GEO regions. The orbital modelling accounts for the very first terms of the geopotential, the perturbations induced by luni-solar attraction, solar radiation pressure, and atmospheric drag (using classical models). The new software was designed to ensure short computation times, even over periods of decades or centuries. This paper aims first at describing and validating the main functionalities of the software: we explain how the simplified averaged equations of motion were built, we show how we obtain simplified luni-solar ephemerides without using any large file for orbit propagation over centuries, and we show how we averaged and simulated the solar flux. We show as well how we expressed the short-periodic terms to be added to the mean equations of motion, in order to obtain orbital elements comparable to those deduced from the classical numerical integration of the osculating equations of motion. The second part of the paper sheds light on some dynamical properties of space debris flying in the LEO and GEO regions, obtained with the new software. Knowing that each satellite in the LEO region is now supposed to re-enter the atmosphere within a period of 25 years, we estimated, in various dynamical configurations, the lifetime of LEO objects depending on their initial conditions of motion, on the solar flux models applied through the decades, on the atmospheric density models, and on the satellite area-to-mass ratio. In the GEO region, we investigated the dynamical causes of space debris re-entering the GEO protected region after the passivation of a disposal spacecraft.
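As a taste of what averaged equations of motion contain, the dominant secular J2 drift rates used by any mean-motion propagator follow standard first-order formulas; a small sketch (first-order J2 only; not SECS-SD2 code):

```python
import numpy as np

GM = 3.986004418e14   # [m^3/s^2]
R_E = 6.378137e6      # Earth equatorial radius [m]
J2 = 1.08263e-3

def j2_secular_rates(a, e, i_deg):
    """First-order secular J2 rates of node and perigee [rad/s]:
    dOmega/dt = -(3/2) n J2 (Re/p)^2 cos(i)
    domega/dt =  (3/4) n J2 (Re/p)^2 (5 cos^2(i) - 1)"""
    i = np.radians(i_deg)
    n = np.sqrt(GM / a**3)            # mean motion
    p = a * (1.0 - e**2)              # semi-latus rectum
    f = n * J2 * (R_E / p) ** 2
    return -1.5 * f * np.cos(i), 0.75 * f * (5.0 * np.cos(i) ** 2 - 1.0)

# Sanity check: ~800 km circular sun-synchronous orbit, i ~ 98.6 deg
dOmega, _ = j2_secular_rates(R_E + 800e3, 0.0, 98.6)
print(np.degrees(dOmega) * 86400)     # ~ +0.99 deg/day, tracking the Sun
```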
Development of web-GIS system for analysis of georeferenced geophysical data
NASA Astrophysics Data System (ADS)
Okladnikov, I.; Gordov, E. P.; Titov, A. G.; Bogomolov, V. Y.; Genina, E.; Martynova, Y.; Shulgina, T. M.
2012-12-01
Georeferenced datasets (meteorological databases, modeling and reanalysis results, remote sensing products, etc.) are currently actively used in numerous applications, including modeling, interpretation and forecasting of climatic and ecosystem changes on various spatial and temporal scales. Due to the inherent heterogeneity of environmental datasets, as well as their huge size, which may reach tens of terabytes for a single dataset, present-day studies in the area of climate and environmental change require special software support. A dedicated web-GIS information-computational system for the analysis of georeferenced climatological and meteorological data has been created. The information-computational system consists of four basic parts: a computational kernel developed using GNU Data Language (GDL); a set of PHP controllers run within a specialized web portal; JavaScript class libraries for the development of typical components of a web mapping application graphical user interface (GUI) based on AJAX technology; and an archive of geophysical datasets. The computational kernel comprises a number of dedicated modules for querying and extraction of data, mathematical and statistical data analysis, visualization, and preparation of output files in geoTIFF and netCDF formats containing processing results. The specialized web portal consists of an Apache web server, the OGC-compliant Geoserver software, which is used as a base for presenting cartographical information over the Web, and a set of PHP controllers implementing the web mapping application logic and governing the computational kernel. The JavaScript libraries for graphical user interface development are based on the GeoExt library, combining the ExtJS framework and OpenLayers software. The archive of geophysical data consists of a number of structured environmental datasets represented by data files in netCDF, HDF, GRIB and ESRI Shapefile formats. Available for processing by the system are: two editions of the NCEP/NCAR Reanalysis, the JMA/CRIEPI JRA-25 Reanalysis, the ECMWF ERA-40 Reanalysis, the ECMWF ERA Interim Reanalysis, the MRI/JMA APHRODITE Water Resources Project Reanalysis, the DWD Global Precipitation Climatology Centre's data, the GMAO Modern Era-Retrospective analysis for Research and Applications, meteorological observational data for the territory of the former USSR for the 20th century, results of modeling by global and regional climatological models, and others. The system is already involved in the scientific research process. In particular, it was recently used successfully for the analysis of Siberian climate changes and their regional impact. The web-GIS information-computational system for geophysical data analysis provides specialists involved in multidisciplinary research projects with reliable and practical instruments for the complex analysis of climate and ecosystem changes on global and regional scales. Using it, even an unskilled user without specific knowledge can perform computational processing and visualization of large meteorological, climatological and satellite monitoring datasets through a unified web interface in a common graphical web browser. This work is partially supported by the Ministry of Education and Science of the Russian Federation (contract #07.514.114044), projects IV.31.1.5, IV.31.2.7, RFBR grants #10-07-00547a, #11-05-01190a, and integrated project SB RAS #131.
Cognitive and Pedagogical Benefits of Argument Mapping: L.A.M.P. Guides the Way to Better Thinking
NASA Astrophysics Data System (ADS)
Rider, Yanna; Thomason, Neil
Experimental evidence shows that in dedicated Critical Thinking courses “Lots of Argument Mapping Practice” (LAMP) using a software tool like Rationale considerably improves students’ critical thinking skills. We believe that teaching with LAMP has additional cognitive and pedagogical benefits, even outside dedicated Critical Thinking subjects. Students learn to better understand and critique arguments, improve in their reading and writing, become clearer in their thinking and, perhaps, even gain meta-cognitive skills that ultimately make them better learners. We discuss some of the evidence for these claims, explain how, as we believe, LAMP confers these benefits, and call for proper experimental and educational research.
CAESAR, French Probative Public Service for In-Orbit Collision Avoidance
NASA Astrophysics Data System (ADS)
Laporte, Francois; Moury, Monique
2013-08-01
This paper starts by describing the conjunction analysis which has to be performed using CSM data provided by JSpOC. This description not only demonstrates that Collision Avoidance is a 2-step process (close approach detection followed by risk evaluation for collision avoidance decision) but also leads to the conclusion that there is a need for a Middle Man role. After describing the Middle Man concept, it introduces the French response CAESAR and the need for the collaborative work environment which is implied by the Middle Man concept. It includes a description of the environment put in place for CAESAR (secure website and dedicated tools), the content of the service, the conditions for the distribution of the CNES software JAC, and the advantages for subscribers.
CAESAR: An Initiative of Public Service for Collision Risks Mitigation
NASA Astrophysics Data System (ADS)
Laporte, Francois; Moury, Monique; Beaumet, Gregory
2013-09-01
This paper starts by describing the conjunction analysis which has to be performed using CSM data provided by JSpOC. This description not only demonstrates that Collision Avoidance is a 2-step process (close approach detection followed by risk evaluation for collision avoidance decision) but also leads to the conclusion that there is a need for a Middle Man role. After describing the Middle Man concept, it introduces the French response CAESAR and the need for the collaborative work environment which is implied by the Middle Man concept. It includes a description of the environment put in place for CAESAR (secure website and dedicated tools), the content of the service, the conditions for the distribution of the CNES software JAC, and the advantages for subscribers.
Real time 3D scanner: investigations and results
NASA Astrophysics Data System (ADS)
Nouri, Taoufik; Pflug, Leopold
1993-12-01
This article presents a concept for the reconstruction of 3-D objects using non-invasive, touchless techniques. The principle of the method is to project parallel interference optical fringes onto an object and then to record the object under two angles of view. With appropriate processing, the 3-D object can be reconstructed even when it has no plane of symmetry. The 3-D surface data is available immediately in digital form for computer visualization and for analysis software tools. The optical set-up for recording the 3-D object, the 3-D data extraction and treatment, as well as the reconstruction of the 3-D object are reported and commented on. This application is dedicated to reconstructive/cosmetic surgery, CAD, animation and research purposes.
Advanced active health monitoring system of liquid rocket engines
NASA Astrophysics Data System (ADS)
Qing, Xinlin P.; Wu, Zhanjun; Beard, Shawn; Chang, Fu-Kuo
2008-11-01
An advanced SMART TAPE system has been developed for real-time in-situ monitoring and long term tracking of structural integrity of pressure vessels in liquid rocket engines. The practical implementation of the structural health monitoring (SHM) system including distributed sensor network, portable diagnostic hardware and dedicated data analysis software is addressed based on the harsh operating environment. Extensive tests were conducted on a simulated large booster LOX-H2 engine propellant duct to evaluate the survivability and functionality of the system under the operating conditions of typical liquid rocket engines such as cryogenic temperature, vibration loads. The test results demonstrated that the developed SHM system could survive the combined cryogenic temperature and vibration environments and effectively detect cracks as small as 2 mm.
Online data monitoring in the LHCb experiment
NASA Astrophysics Data System (ADS)
Callot, O.; Cherukuwada, S.; Frank, M.; Gaspar, C.; Graziani, G.; Herwijnen, E. v.; Jost, B.; Neufeld, N.; P-Altarelli, M.; Somogyi, P.; Stoica, R.
2008-07-01
The High Level Trigger and Data Acquisition system selects about 2 kHz of events out of the 40 MHz of beam crossings. The selected events are sent to permanent storage for subsequent analysis. In order to ensure the quality of the collected data, identify possible malfunctions of the detector and perform calibration and alignment checks, a small fraction of the accepted events is sent to a monitoring farm, which consists of a few tens of general purpose processors. This contribution introduces the architecture of the data stream splitting mechanism from the storage system to the monitoring farm, where the raw data are analyzed by dedicated tasks. It describes the collaborating software components that are all based on the Gaudi event processing framework.
NASA Astrophysics Data System (ADS)
Ren, Danping; Wu, Shanshan; Zhang, Lijing
2016-09-01
In view of the global control and flexible monitoring characteristics of software-defined networking (SDN), we propose a new optical access network architecture dedicated to Wavelength Division Multiplexing-Passive Optical Network (WDM-PON) systems based on SDN. Network coding (NC) technology is also applied in this architecture to enhance the utilization of wavelength resources and reduce light source costs. Simulation results show that this scheme can optimize the throughput of the WDM-PON network and greatly reduce system delay and energy consumption.
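As a minimal illustration of the NC idea in a PON setting, consider the classic XOR scheme in which two ONUs' packets are combined into a single downstream transmission (the paper's exact coding scheme is not specified here; names and payloads are illustrative, and equal-length packets are assumed):

```python
def xor_encode(a: bytes, b: bytes) -> bytes:
    """Combine two equal-length packets into one coded packet; each ONU
    recovers the other stream by XOR-ing the coded packet with its own data."""
    return bytes(x ^ y for x, y in zip(a, b))

pkt_onu1, pkt_onu2 = b"DATA-ONU1", b"DATA-ONU2"
coded = xor_encode(pkt_onu1, pkt_onu2)           # one shared transmission
assert xor_encode(coded, pkt_onu1) == pkt_onu2   # ONU1 decodes ONU2's packet
assert xor_encode(coded, pkt_onu2) == pkt_onu1   # and vice versa
```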
Upper Atmosphere Research Satellite (UARS) science data processing center implementation history
NASA Technical Reports Server (NTRS)
Herring, Ellen L.; Taylor, K. David
1990-01-01
NASA-Goddard is responsible for the development of a ground system for the Upper Atmosphere Research Satellite (UARS) observatory, whose launch is scheduled for 1991. This ground system encompasses a dedicated Central Data Handling Facility (CDHF); attention is presently given to the management of software systems design and implementation phases for CDHF by the UARS organization. Also noted are integration and testing activities performed following software deliveries to the CDHF. The UARS project has an obvious requirement for a powerful and flexible data base management system; an off-the-shelf commercial system has been incorporated.
Development of dual sensor hand-held detector
NASA Astrophysics Data System (ADS)
Sezgin, Mehmet
2010-04-01
In this paper, development requirements for a hand-held dual-sensor detector dedicated to buried object detection are considered. Design characteristics of such a system are categorized and listed. Hardware and software structures, ergonomics, the user interface, the environmental and EMC/EMI tests to be applied, and performance test issues are studied. The main properties of the developed system (SEZER), which contains a Metal Detector (MD) and a Ground Penetrating Radar (GPR), are presented. The realized system has an ergonomic structure and can detect both metallic and non-metallic buried objects. Moreover, classification of a target is possible if it was defined to the signal processing software in the learning phase.
Ontological Model of Business Process Management Systems
NASA Astrophysics Data System (ADS)
Manoilov, G.; Deliiska, B.
2008-10-01
The activities which constitute business process management (BPM) can be grouped into five categories: design, modeling, execution, monitoring and optimization. Dedicated software packages for business process management systems (BPMS) are available on the market, but the efficiency of their exploitation depends on the ontological model used at the development time and run time of the system. In this article, an ontological model of BPMS in the area of the software industry is investigated. The model building is preceded by a conceptualization of the domain and a taxonomy of BPMS development. On the basis of the taxonomy, a simple online thesaurus is created.
Advanced automation of a prototypic thermal control system for Space Station
NASA Technical Reports Server (NTRS)
Dominick, Jeff
1990-01-01
Viewgraphs on advanced automation of a prototypic thermal control system for the Space Station are presented. The Thermal Expert System (TEXSYS) was initiated in 1986 as a cooperative project between ARC and JSC as a way to leverage on-going work at both centers. JSC contributed Thermal Control System (TCS) hardware and control software, TCS operational expertise, and integration expertise. ARC contributed expert system and display expertise. The first years of the project were dedicated to the parallel development of expert system tools, displays, interface software, and TCS technology and procedures by a total of four organizations.
NASA Astrophysics Data System (ADS)
Schwartz, Richard A.; Zarro, D.; Csillaghy, A.; Dennis, B.; Tolbert, A. K.; Etesi, L.
2009-05-01
We report on our activities to integrate VSO search and retrieval capabilities into standard data access, display, and analysis tools. In addition to its standard Web-based search form, the VSO provides an Interactive Data Language (IDL) client (vso_search) that is available through the Solar Software (SSW) package. We have incorporated this client into an IDL-widget interface program (show_synop) that allows for more simplified searching and downloading of VSO datasets directly into a user's IDL data analysis environment. In particular, we have provided the capability to read VSO datasets into a general purpose IDL package (plotman) that can display different datatypes (lightcurves, images, and spectra) and perform basic data operations such as zooming, image overlays, solar rotation, etc. Currently, the show_synop tool supports access to ground-based and space-based (SOHO, STEREO, and Hinode) observations, and has the capability to include new datasets as they become available. A user encounters two major hurdles when using the VSO: (1) Instrument-specific software (such as level-0 file readers and data-prepping procedures) may not be available in the user's local SSW distribution. (2) Recent calibration files (such as flat-fields) are not automatically distributed with the analysis software. To address these issues, we have developed a dedicated server (prepserver) that incorporates all the latest instrument-specific software libraries and calibration files. The prepserver uses an IDL-Java bridge to read and implement data processing requests from a client and return a processed data file that can be readily displayed with the show_synop/plotman package. The advantage of the prepserver is that the user is only required to install the general branch (gen) of the SSW tree, and is freed from the more onerous task of installing instrument-specific libraries and calibration files. We will demonstrate how the prepserver can be used to read, process, and overlay SOHO/EIT, TRACE, SECCHI/EUVI, and RHESSI images.
Building a virtual ligand screening pipeline using free software: a survey.
Glaab, Enrico
2016-03-01
Virtual screening, the search for bioactive compounds via computational methods, provides a wide range of opportunities to speed up drug development and reduce the associated risks and costs. While virtual screening is already a standard practice in pharmaceutical companies, its applications in preclinical academic research still remain under-exploited, in spite of an increasing availability of dedicated free databases and software tools. In this survey, an overview of recent developments in this field is presented, focusing on free software and data repositories for screening as alternatives to their commercial counterparts, and outlining how available resources can be interlinked into a comprehensive virtual screening pipeline using typical academic computing facilities. Finally, to facilitate the set-up of corresponding pipelines, a downloadable software system is provided, using platform virtualization to integrate pre-installed screening tools and scripts for reproducible application across different operating systems. © The Author 2015. Published by Oxford University Press.
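To give a concrete taste of the kind of free-software filtering step such a pipeline might contain, the sketch below applies a simple Lipinski-style drug-likeness filter with the open-source RDKit toolkit. It is a generic example under our own assumptions, not code from the survey.

```python
# Minimal compound-filtering sketch using the free RDKit toolkit;
# a generic pre-screening step, not taken from the surveyed pipeline.
from rdkit import Chem
from rdkit.Chem import Descriptors, Crippen

def passes_lipinski(mol):
    """Rule-of-five filter commonly applied before docking."""
    return (Descriptors.MolWt(mol) <= 500
            and Crippen.MolLogP(mol) <= 5
            and Descriptors.NumHDonors(mol) <= 5
            and Descriptors.NumHAcceptors(mol) <= 10)

aspirin = Chem.MolFromSmiles("CC(=O)Oc1ccccc1C(=O)O")
print(passes_lipinski(aspirin))  # True
```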
Paskevich, Valerie F.
1992-01-01
The Branch of Atlantic Marine Geology has been involved in the collection, processing and digital mosaicking of high-, medium- and low-resolution side-scan sonar data during the past 6 years. In the past, processing and digital mosaicking have been accomplished with a dedicated, shore-based computer system. With the need to process sidescan data in the field, and with the increased power and reduced cost of major workstations, a need was identified for an image processing package on a UNIX-based computer system that could be utilized in the field as well as be more generally available to Branch personnel. This report describes the initial development of that package, referred to as the Woods Hole Image Processing System (WHIPS). The software was developed using the Unidata NetCDF software interface to allow data to be more readily portable between different computer operating systems.
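The choice of NetCDF is what makes the data portable: the same files can be read from any modern scripting environment. The fragment below illustrates this in Python with hypothetical file and variable names; it is not part of WHIPS itself.

```python
# Reading a WHIPS-style NetCDF grid in Python; the file and variable
# names here are hypothetical illustrations.
from netCDF4 import Dataset

ds = Dataset("sidescan_mosaic.nc")        # hypothetical mosaic file
print(ds.variables.keys())                # inspect what the file contains
amplitude = ds.variables["amplitude"][:]  # hypothetical variable name
print(amplitude.shape, amplitude.dtype)
ds.close()
```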
Flight Software Development for the CHEOPS Instrument with the CORDET Framework
NASA Astrophysics Data System (ADS)
Cechticky, V.; Ottensamer, R.; Pasetti, A.
2015-09-01
CHEOPS is an ESA S-class mission dedicated to the precise measurement of radii of already known exoplanets using ultra-high precision photometry. The instrument flight software controlling the instrument and handling the science data is developed by the University of Vienna using the CORDET Framework offered by P&P Software GmbH. The CORDET Framework provides a generic software infrastructure for PUS-based applications. This paper describes how the framework is used for the CHEOPS application software to provide a consistent solution for the communication and control services, event handling and FDIR procedures. This approach is innovative in four respects: (a) it is a true third-party reuse; (b) re-use is done at specification, validation and code level; (c) the re-usable assets and their qualification data package are entirely open-source; (d) re-use is based on call-back, with the application developer providing functions which are called by the reusable architecture.
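The call-back style of reuse in point (d) can be illustrated with a short sketch: the framework owns the control flow, and the application developer only registers functions. All names below are hypothetical, and the actual CORDET Framework is not written in Python; this shows only the shape of the idea.

```python
# Toy illustration of call-back based reuse: the reusable architecture
# calls application-supplied functions. Names are hypothetical; the
# actual CORDET Framework is not written in Python.
class EventEngine:
    """Stands in for a reusable framework component."""
    def __init__(self):
        self._handlers = {}

    def register(self, event, handler):
        self._handlers[event] = handler

    def dispatch(self, event, payload):
        handler = self._handlers.get(event)
        if handler is not None:
            handler(payload)

# The application developer supplies the behaviour:
def on_overcurrent(payload):
    print("FDIR action: entering safe mode,", payload)

engine = EventEngine()
engine.register("OVERCURRENT", on_overcurrent)
engine.dispatch("OVERCURRENT", {"channel": 3})
```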
Architecture of the local spatial data infrastructure for regional climate change research
NASA Astrophysics Data System (ADS)
Titov, Alexander; Gordov, Evgeny
2013-04-01
Georeferenced datasets (meteorological databases, modeling and reanalysis results, etc.) are actively used in the modeling and analysis of climate change on various spatial and temporal scales. Due to the inherent heterogeneity of environmental datasets, as well as their size, which may reach tens of terabytes for a single dataset, studies of climate and environmental change require special software support based on the SDI approach. A dedicated architecture of a local spatial data infrastructure aimed at regional climate change analysis using modern web mapping technologies is presented. A geoportal is a key element of any SDI, allowing searches for geoinformation resources (datasets and services) using metadata catalogs, producing geospatial data selections by their parameters (data access functionality), and managing services and applications for cartographical visualization. It should be noted that, for objective reasons such as large dataset volumes, the complexity of the data models used, and syntactic and semantic differences between datasets, the development of environmental geodata access, processing and visualization services turns out to be quite a complex task. These circumstances were taken into account while developing the architecture of the local spatial data infrastructure as a universal framework providing geodata services. Thus, the architecture presented includes:
1. A storage model for big sets of regional georeferenced data that is effective in terms of search, access, retrieval and subsequent statistical processing, allowing in particular the storage of frequently used values (such as monthly and annual climate change indices), thus providing different temporal views of the datasets
2. A general architecture of the corresponding software components handling geospatial datasets within the storage model
3. A metadata catalog, as a basic element of the spatial data infrastructure, describing the datasets used in climate research in detail using the ISO 19115 and CF-convention standards, published according to the OGC CSW (Catalog Service Web) specification
4. Computational and mapping web services to work with geospatial datasets based on the OWS (OGC Web Services) standards: WMS, WFS, WPS (a usage sketch follows this abstract)
5. A geoportal as a key element of the thematic regional spatial data infrastructure, also providing a software framework for the development of dedicated web applications
To realize the web mapping services, GeoServer software is used, since it provides a natural WPS implementation as a separate software module. To provide geospatial metadata services, the GeoNetwork Opensource (http://geonetwork-opensource.org) product is planned to be used, as it supports the ISO 19115/ISO 19119/ISO 19139 metadata standards as well as the ISO CSW 2.0 profile for both client and server. To implement thematic applications based on geospatial web services within the framework of the local SDI geoportal, the following open source software has been selected:
1. The OpenLayers JavaScript library, providing basic web mapping functionality for a thin client such as a web browser
2. The GeoExt/ExtJS JavaScript libraries for building client-side web applications working with geodata services
The web interface developed will be similar to the interfaces of popular desktop GIS applications such as uDig, QuantumGIS, etc. The work is partially supported by RF Ministry of Education and Science grant 8345, SB RAS Program VIII.80.2.1 and IP 131.
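To make the OWS side of point 4 concrete, the sketch below requests a map image from a WMS endpoint with the Python OWSLib package; the endpoint URL and layer name are invented for illustration and do not belong to the infrastructure described above.

```python
# Hedged example of consuming an OGC WMS service with OWSLib;
# the endpoint URL and layer name are hypothetical.
from owslib.wms import WebMapService

wms = WebMapService("http://example.org/geoserver/wms", version="1.1.1")
print(list(wms.contents))  # layers advertised by the server

img = wms.getmap(layers=["climate:temperature_anomaly"],
                 srs="EPSG:4326",
                 bbox=(60.0, 50.0, 110.0, 75.0),  # lon/lat extent
                 size=(800, 400),
                 format="image/png")
with open("anomaly.png", "wb") as f:
    f.write(img.read())
```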
Development of an integrated sensor module for a non-invasive respiratory monitoring system
NASA Astrophysics Data System (ADS)
Kang, Seok-Won; Chang, Keun-Shik
2013-09-01
A respiratory monitoring system has been developed for analyzing the carbon dioxide (CO2) and oxygen (O2) concentrations in expired air using gas sensors. The data can be used to estimate some medical conditions, including the diffusion capability of the lung membrane, oxygen uptake, and carbon dioxide output. For this purpose, a 3-way valve driven by a servomotor was developed, which operates synchronously with human respiratory signals. In particular, the breath analysis system includes an integrated sensor module for valve control, data acquisition through the O2 and CO2 sensors, and respiratory rate monitoring, as well as software dedicated to the analysis of respiratory gases. In addition, an approximation technique for experimental data based on Haar wavelet decomposition is explored to remove noise as well as to reduce the file size of the data for long-term monitoring.
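A minimal version of such a Haar-wavelet noise-reduction step might look like the following; the decomposition level and threshold are arbitrary illustrative choices, not the values used by the authors.

```python
# Haar-wavelet denoising sketch with PyWavelets; the level and
# threshold are illustrative choices, not the authors' parameters.
import pywt

def haar_denoise(signal, level=4, threshold=0.1):
    coeffs = pywt.wavedec(signal, "haar", level=level)
    # Soft-threshold the detail coefficients, keep the approximation.
    coeffs = [coeffs[0]] + [pywt.threshold(c, threshold, mode="soft")
                            for c in coeffs[1:]]
    return pywt.waverec(coeffs, "haar")
```

Thresholding also zeroes out many small detail coefficients, which is what allows the stored data to compress well for long-term monitoring.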
NASA Astrophysics Data System (ADS)
Pradhan, Bikram; Delchambre, Ludovic; Hickson, Paul; Akhunov, Talat; Bartczak, Przemyslaw; Kumar, Brajesh; Surdej, Jean
2018-04-01
The 4-m International Liquid Mirror Telescope (ILMT) located at the ARIES Observatory (Devasthal, India) has been designed to scan, at a latitude of +29° 22' 26", a band of sky about half a degree wide in the Time Delayed Integration (TDI) mode. A special data-reduction and analysis pipeline is therefore needed to process online the large amount of optical data being produced. This requirement has led to the development of the 4-m ILMT data reduction pipeline, a new software package built with Python to simplify the large number of tasks involved in reducing the acquired TDI images. This software provides astronomers with specially designed data reduction functions and astrometry and photometry calibration tools. In this paper we discuss the various reduction and calibration steps followed to reduce TDI images obtained in May 2015 with the Devasthal 1.3m telescope. We report the detection and characterization of nine space-debris objects present in the TDI frames.
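The early steps of such a reduction are standard CCD corrections; in Python they come down to a few astropy calls, sketched below with hypothetical file names. The actual ILMT pipeline is considerably more involved.

```python
# Generic first steps of CCD image reduction with astropy; file names
# are hypothetical and this is not the actual ILMT pipeline code.
from astropy.io import fits

science = fits.getdata("tdi_frame.fits").astype(float)
bias = fits.getdata("master_bias.fits").astype(float)
flat = fits.getdata("master_flat.fits").astype(float)

calibrated = (science - bias) / flat   # bias subtraction, flat-fielding
fits.writeto("tdi_frame_cal.fits", calibrated, overwrite=True)
```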
Developing automated analytical methods for scientific environments using LabVIEW.
Wagner, Christoph; Armenta, Sergio; Lendl, Bernhard
2010-01-15
The development of new analytical techniques often requires the building of specially designed devices, each requiring its own dedicated control software. Especially in the research and development phase, LabVIEW has proven to be a highly useful tool for developing this software. Yet, it is still common practice to develop individual solutions for different instruments. In contrast to this, we present here a single LabVIEW-based program that can be directly applied to various analytical tasks without having to change the program code. Driven by a set of simple script commands, it can control a whole range of instruments, from valves and pumps to full-scale spectrometers. Fluid sample (pre-)treatment and separation procedures can thus be flexibly coupled to a wide range of analytical detection methods. Here, the capabilities of the program have been demonstrated by using it for the control of both a sequential injection analysis - capillary electrophoresis (SIA-CE) system with UV detection, and an analytical setup for studying the inhibition of enzymatic reactions using a SIA system with FTIR detection.
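The script-command idea generalizes beyond LabVIEW. The sketch below shows, in Python, the dispatch pattern such a program relies on; the command names are made up and no real hardware sits behind them.

```python
# Minimal script-command dispatcher in the spirit of the LabVIEW
# program described above; command names and actions are made up.
COMMANDS = {}

def command(name):
    def register(fn):
        COMMANDS[name] = fn
        return fn
    return register

@command("VALVE")
def set_valve(position):
    print(f"valve -> {position}")          # would drive hardware here

@command("PUMP")
def run_pump(rate, seconds):
    print(f"pump {rate} uL/s for {seconds} s")

def run_script(text):
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        name, *args = line.split()
        COMMANDS[name](*args)

run_script("VALVE A\nPUMP 10 5")
```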
Kyoda, Koji; Tohsato, Yukako; Ho, Kenneth H. L.; Onami, Shuichi
2015-01-01
Motivation: Recent progress in live-cell imaging and modeling techniques has resulted in generation of a large amount of quantitative data (from experimental measurements and computer simulations) on spatiotemporal dynamics of biological objects such as molecules, cells and organisms. Although many research groups have independently dedicated their efforts to developing software tools for visualizing and analyzing these data, these tools are often not compatible with each other because of different data formats. Results: We developed an open unified format, Biological Dynamics Markup Language (BDML; current version: 0.2), which provides a basic framework for representing quantitative biological dynamics data for objects ranging from molecules to cells to organisms. BDML is based on Extensible Markup Language (XML). Its advantages are machine and human readability and extensibility. BDML will improve the efficiency of development and evaluation of software tools for data visualization and analysis. Availability and implementation: A specification and a schema file for BDML are freely available online at http://ssbd.qbic.riken.jp/bdml/. Contact: sonami@riken.jp Supplementary Information: Supplementary data are available at Bioinformatics online. PMID:25414366
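Because BDML is plain XML, it can be consumed with any standard XML library. The fragment below sketches this in Python; the file and element names are purely illustrative and are not taken from the BDML schema.

```python
# Reading an XML-based dynamics file with the standard library; the
# file and element names are illustrative, not from the BDML schema.
import xml.etree.ElementTree as ET

tree = ET.parse("embryo_dynamics.bdml")   # hypothetical file
root = tree.getroot()
for entity in root.iter("object"):        # hypothetical element name
    print(entity.get("id"), entity.get("t"), entity.get("x"))
```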
NASA Technical Reports Server (NTRS)
Zhou, Hanying
2007-01-01
PREDICTS is a computer program that predicts the frequencies, as functions of time, of signals to be received by a radio science receiver in this case, a special-purpose digital receiver dedicated to analysis of signals received by an antenna in NASA s Deep Space Network (DSN). Unlike other software used in the DSN, PREDICTS does not use interpolation early in the calculations; as a consequence, PREDICTS is more precise and more stable. The precision afforded by the other DSN software is sufficient for telemetry; the greater precision afforded by PREDICTS is needed for radio-science experiments. In addition to frequencies as a function of time, PREDICTS yields the rates of change and interpolation coefficients for the frequencies and the beginning and ending times of reception, transmission, and occultation. PREDICTS is applicable to S-, X-, and Ka-band signals and can accommodate the following link configurations: (1) one-way (spacecraft to ground), (2) two-way (from a ground station to a spacecraft to the same ground station), and (3) three-way (from a ground transmitting station to a spacecraft to a different ground receiving station).
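At its core, a one-way frequency prediction is a Doppler calculation. The sketch below shows only the first-order term; the actual PREDICTS model of the geometry, relativistic corrections, and media effects is far more precise.

```python
# First-order one-way Doppler sketch; PREDICTS' actual model is far
# more precise (relativistic and media terms are omitted here).
C = 299_792_458.0  # speed of light, m/s

def received_frequency(f_transmit_hz, range_rate_m_s):
    """Positive range rate (spacecraft receding) lowers the frequency."""
    return f_transmit_hz * (1.0 - range_rate_m_s / C)

print(received_frequency(8.4e9, 12_000.0))  # X-band downlink example
```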
A system verification platform for high-density epiretinal prostheses.
Chen, Kuanfu; Lo, Yi-Kai; Yang, Zhi; Weiland, James D; Humayun, Mark S; Liu, Wentai
2013-06-01
Retinal prostheses have restored light perception to people worldwide who have poor or no vision as a consequence of retinal degeneration. To advance the quality of visual stimulation for retinal implant recipients, a higher number of stimulation channels is expected in the next generation retinal prostheses, which poses a great challenge to system design and verification. This paper presents a system verification platform dedicated to the development of retinal prostheses. The system includes primary processing, dual-band power and data telemetry, a high-density stimulator array, and two methods for output verification. End-to-end system validation and individual functional block characterization can be achieved with this platform through visual inspection and software analysis. Custom-built software running on the computers also provides a good way for testing new features before they are realized by the ICs. Real-time visual feedbacks through the video displays make it easy to monitor and debug the system. The characterization of the wireless telemetry and the demonstration of the visual display are reported in this paper using a 256-channel retinal prosthetic IC as an example.
7 Secrets of Lasting Relationships!
ERIC Educational Resources Information Center
Fredette, Michelle
2013-01-01
Do a web search on the phrase "CRM sucks" and one will find scores of articles, webinars, and blog rants dedicated to the theme. Indeed, if one uses constituent relationship management (CRM) software, one is probably familiar with the litany of complaints. But this is not the time to give up. After all, this is one relationship that really needs…
Project SUN (Students Understanding Nature)
NASA Technical Reports Server (NTRS)
Curley, T.; Yanow, G.
1995-01-01
Project SUN is part of NASA's 'Mission to Planet Earth' education outreach effort. It is based on development of low cost, scientifically accurate instrumentation and computer interfacing, coupled with Apple II computers as dedicated data loggers. The project is comprised of: instruments, interfacing, software, curriculum, a detailed operating manual, and a system of training at the school sites.
BookMeUp: Creating a Book Suggestion App
ERIC Educational Resources Information Center
Clark, Jason
2012-01-01
The rise of apps and mobile devices has opened the door to small, dedicated software programs that are focused on singular tasks. From the author's perspective as head of digital access and web service manager at Montana State University, these apps offered an opportunity to build a focused digital service aimed at allowing someone to enter a…
Software for Studying and Enhancing Educational Uses of Geospatial Semantics and Data
ERIC Educational Resources Information Center
Nodenot, Thierry; Sallaberry, Christian; Gaio, Mauro
2010-01-01
Geographically related queries form nearly one-fifth of all queries submitted to the Excite search engine and the most frequently occurring terms are names of places. This paper focuses on digital libraries and extends the basic services of existing library management systems to include new ones that are dedicated to geographic information…
YaQ: an architecture for real-time navigation and rendering of varied crowds.
Maïm, Jonathan; Yersin, Barbara; Thalmann, Daniel
2009-01-01
The YaQ software platform is a complete system dedicated to real-time crowd simulation and rendering. Fitting multiple application domains, such as video games and VR, YaQ aims to provide efficient algorithms to generate crowds comprising up to thousands of varied virtual humans navigating in large-scale, global environments.
ERIC Educational Resources Information Center
Carrington, Michal; Chen, Richard; Davies, Martin; Kaur, Jagjit; Neville, Benjamin
2011-01-01
An argument map visually represents the structure of an argument, outlining its informal logical connections and informing judgments as to its worthiness. Argument mapping can be augmented with dedicated software that aids the mapping process. Empirical evidence suggests that semester-length subjects using argument mapping along with dedicated…
Microbial community analysis using MEGAN.
Huson, Daniel H; Weber, Nico
2013-01-01
Metagenomics, the study of microbes in the environment using DNA sequencing, depends upon dedicated software tools for processing and analyzing very large sequencing datasets. One such tool is MEGAN (MEtaGenome ANalyzer), which can be used to interactively analyze and compare metagenomic and metatranscriptomic data, both taxonomically and functionally. To perform a taxonomic analysis, the program places the reads onto the NCBI taxonomy, while functional analysis is performed by mapping reads to the SEED, COG, and KEGG classifications. Samples can be compared taxonomically and functionally, using a wide range of different charting and visualization techniques. PCoA analysis and clustering methods allow high-level comparison of large numbers of samples. Different attributes of the samples can be captured and used within analysis. The program supports various input formats for loading data and can export analysis results in different text-based and graphical formats. The program is designed to work with very large samples containing many millions of reads. It is written in Java and installers for the three major computer operating systems are available from http://www-ab.informatik.uni-tuebingen.de. © 2013 Elsevier Inc. All rights reserved.
Höss, Angelika; Lampe, Christian; Panse, Ralf; Ackermann, Benjamin; Naumann, Jakob; Jäkel, Oliver
2014-03-21
According to the latest amendment of the Medical Device Directive standalone software qualifies as a medical device when intended by the manufacturer to be used for medical purposes. In this context, the EN 62304 standard is applicable which defines the life-cycle requirements for the development and maintenance of medical device software. A pilot project was launched to acquire skills in implementing this standard in a hospital-based environment (in-house manufacture). The EN 62304 standard outlines minimum requirements for each stage of the software life-cycle, defines the activities and tasks to be performed and scales documentation and testing according to its criticality. The required processes were established for the pre-existent decision-support software FlashDumpComparator (FDC) used during the quality assurance of treatment-relevant beam parameters. As the EN 62304 standard implicates compliance with the EN ISO 14971 standard on the application of risk management to medical devices, a risk analysis was carried out to identify potential hazards and reduce the associated risks to acceptable levels. The EN 62304 standard is difficult to implement without proper tools, thus open-source software was selected and integrated into a dedicated development platform. The control measures yielded by the risk analysis were independently implemented and verified, and a script-based test automation was retrofitted to reduce the associated test effort. After all documents facilitating the traceability of the specified requirements to the corresponding tests and of the control measures to the proof of execution were generated, the FDC was released as an accessory to the HIT facility. The implementation of the EN 62304 standard was time-consuming, and a learning curve had to be overcome during the first iterations of the associated processes, but many process descriptions and all software tools can be re-utilized in follow-up projects. It has been demonstrated that a standards-compliant development of small and medium-sized medical software can be carried out by a small team with limited resources in a clinical setting. This is of particular relevance as the upcoming revision of the Medical Device Directive is expected to harmonize and tighten the current legal requirements for all European in-house manufacturers.
Viewpoints: A High-Performance High-Dimensional Exploratory Data Analysis Tool
NASA Astrophysics Data System (ADS)
Gazis, P. R.; Levit, C.; Way, M. J.
2010-12-01
Scientific data sets continue to increase in both size and complexity. In the past, dedicated graphics systems at supercomputing centers were required to visualize large data sets, but as the price of commodity graphics hardware has dropped and its capability has increased, it is now possible, in principle, to view large complex data sets on a single workstation. To do this in practice, an investigator will need software that is written to take advantage of the relevant graphics hardware. The Viewpoints visualization package described herein is an example of such software. Viewpoints is an interactive tool for exploratory visual analysis of large high-dimensional (multivariate) data. It leverages the capabilities of modern graphics boards (GPUs) to run on a single workstation or laptop. Viewpoints is minimalist: it attempts to do a small set of useful things very well (or at least very quickly) in comparison with similar packages today. Its basic feature set includes linked scatter plots with brushing, dynamic histograms, normalization, and outlier detection/removal. Viewpoints was originally designed for astrophysicists, but it has since been used in a variety of fields that range from astronomy, quantum chemistry, fluid dynamics, machine learning, bioinformatics, and finance to information technology server log mining. In this article, we describe the Viewpoints package and show examples of its usage.
Reducing Time to Science: Unidata and JupyterHub Technology Using the Jetstream Cloud
NASA Astrophysics Data System (ADS)
Chastang, J.; Signell, R. P.; Fischer, J. L.
2017-12-01
Cloud computing can accelerate scientific workflows, discovery, and collaborations by reducing research and data friction. We describe the deployment of Unidata and JupyterHub technologies on the NSF-funded XSEDE Jetstream cloud. With the aid of virtual machines and Docker technology, we deploy a Unidata JupyterHub server co-located with a Local Data Manager (LDM), THREDDS data server (TDS), and RAMADDA geoscience content management system. We provide Jupyter Notebooks and the pre-built Python environments needed to run them. The notebooks can be used for instruction and as templates for scientific experimentation and discovery. We also supply a large quantity of NCEP forecast model results to allow data-proximate analysis and visualization. In addition, users can transfer data using Globus command line tools, and perform their own data-proximate analysis and visualization with Notebook technology. These data can be shared with others via a dedicated TDS server for scientific distribution and collaboration. There are many benefits of this approach. Not only is the cloud computing environment fast, reliable and scalable, but scientists can analyze, visualize, and share data using only their web browser. No local specialized desktop software or a fast internet connection is required. This environment will enable scientists to spend less time managing their software and more time doing science.
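A deployment along these lines is configured through JupyterHub's Python configuration file. The minimal sketch below assumes the DockerSpawner plugin and an invented image name; it is not the actual Unidata configuration.

```python
# jupyterhub_config.py -- minimal sketch of a Docker-backed hub,
# loaded by JupyterHub itself (which supplies the `c` config object);
# the image name is hypothetical, not Unidata's real deployment.
c.JupyterHub.ip = "0.0.0.0"
c.JupyterHub.port = 8000
c.JupyterHub.spawner_class = "dockerspawner.DockerSpawner"
c.DockerSpawner.image = "unidata/science-gateway:latest"  # hypothetical
c.Spawner.default_url = "/lab"   # drop users into JupyterLab
```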
DOE Office of Scientific and Technical Information (OSTI.GOV)
The system is developed to collect, process, store and present the information provided by radio frequency identification (RFID) devices. The system contains three parts: the application software, the database and the web page. The application software manages multiple RFID devices, such as readers and portals, simultaneously. It communicates with the devices through an application programming interface (API) provided by the device vendor. The application software converts data collected by the RFID readers and portals to readable information. It is capable of encrypting data using the 256-bit Advanced Encryption Standard (AES). The application software has a graphical user interface (GUI). The GUI mimics the configurations of the nuclear material storage sites or transport vehicles. The GUI gives the user and system administrator an intuitive way to read the information and/or configure the devices. The application software is capable of sending the information to a remote, dedicated and secured web and database server. Two captured screen samples, one for storage and one for transport, are attached. The database is constructed to handle a large number of RFID tag readers and portals. A SQL server is employed for this purpose. An XML script is used to update the database once the information is sent from the application software. The design of the web page imitates the design of the application software. The web page retrieves data from the database and presents it in different panels. The user needs a user name combined with a password to access the web page. The web page is capable of sending e-mail and text messages based on preset criteria, such as when alarm thresholds are exceeded. A captured screen sample is attached. The application software is designed to be installed on a local computer. The local computer is directly connected to the RFID devices and can be controlled locally or remotely. There are multiple local computers managing different sites or transport vehicles. Control from remote sites, and transmission of information to the central database server, is through the secured internet. The information stored in the central database server is shown on the web page. Users can view the web page on the internet. A dedicated and secured web and database server (https) is used to provide information security.
NASA Astrophysics Data System (ADS)
Teague, Kelly K.; Smith, G. Louis; Priestley, Kory; Lukashin, Constantine; Roithmayr, Carlos
2012-09-01
Five CERES scanning radiometers have been flown to date. The Proto-Flight Model flew aboard the Tropical Rainfall Measurement Mission spacecraft in November 1997. Two CERES instruments, Flight Models (FM) 1 and 2, are aboard the Terra spacecraft, which was launched in December 1999. Two more CERES instruments, FM-3 and FM-4, are on the Aqua spacecraft, which was placed in orbit in May 2002. These instruments continue to operate after providing over a decade of Earth Radiation Budget data. The CERES FM-5 instrument, onboard the Suomi-NPP spacecraft, launched in October 2011. The CERES FM-6 instrument is manifested on the JPSS-1 spacecraft to be launched in December 2016. A successor to these instruments is presently in the definition stage. This paper describes the evolving role of flight software in the operation of these instruments, both to meet the science objectives of the mission and to execute supplemental tasks as they arise. In order to obtain and maintain high accuracy in the data products from these instruments, a number of operational activities have been developed and implemented since the instruments were originally designed and placed in orbit. These new activities are possible because of the ability to exploit and modify the flight software, which operates the instruments. The CERES Flight Software interface was designed to allow for on-orbit modification, and as such, constantly evolves to meet changing needs. The purpose of this paper is to provide a brief overview of modifications which have been developed to allow dedicated targeting of specific geographic locations as the CERES sensor flies overhead on its host spacecraft. This new observing strategy greatly increases the temporal and angular sampling for specific targets of high scientific interest.
The 2016 Bioinformatics Open Source Conference (BOSC).
Harris, Nomi L; Cock, Peter J A; Chapman, Brad; Fields, Christopher J; Hokamp, Karsten; Lapp, Hilmar; Muñoz-Torres, Monica; Wiencko, Heather
2016-01-01
Message from the ISCB: The Bioinformatics Open Source Conference (BOSC) is a yearly meeting organized by the Open Bioinformatics Foundation (OBF), a non-profit group dedicated to promoting the practice and philosophy of Open Source software development and Open Science within the biological research community. BOSC has been run since 2000 as a two-day Special Interest Group (SIG) before the annual ISMB conference. The 17th annual BOSC ( http://www.open-bio.org/wiki/BOSC_2016) took place in Orlando, Florida in July 2016. As in previous years, the conference was preceded by a two-day collaborative coding event open to the bioinformatics community. The conference brought together nearly 100 bioinformatics researchers, developers and users of open source software to interact and share ideas about standards, bioinformatics software development, and open and reproducible science.
MMX-I: data-processing software for multimodal X-ray imaging and tomography
Bergamaschi, Antoine; Medjoubi, Kadda; Messaoudi, Cédric; Marco, Sergio; Somogyi, Andrea
2016-01-01
A new multi-platform freeware has been developed for the processing and reconstruction of scanning multi-technique X-ray imaging and tomography datasets. The software platform aims to treat different scanning imaging techniques: X-ray fluorescence, phase, absorption and dark field and any of their combinations, thus providing an easy-to-use data processing tool for the X-ray imaging user community. A dedicated data input stream copes with the input and management of large datasets (several hundred GB) collected during a typical multi-technique fast scan at the Nanoscopium beamline and even on a standard PC. To the authors’ knowledge, this is the first software tool that aims at treating all of the modalities of scanning multi-technique imaging and tomography experiments. PMID:27140159
Massively Parallel Processing for Fast and Accurate Stamping Simulations
NASA Astrophysics Data System (ADS)
Gress, Jeffrey J.; Xu, Siguang; Joshi, Ramesh; Wang, Chuan-tao; Paul, Sabu
2005-08-01
The competitive automotive market drives automotive manufacturers to speed up vehicle development cycles and reduce lead time. Fast tooling development is one of the key areas supporting fast and short vehicle development programs (VDP). In the past ten years, stamping simulation has become the most effective validation tool for predicting and resolving potential formability and quality problems before the dies are physically made. Stamping simulation and formability analysis has become a critical business segment in GM's math-based die engineering process. As simulation becomes one of the major production tools in the engineering factory, simulation speed and accuracy are two of the most important measures of stamping simulation technology. The speed and time-in-system of forming analysis become even more critical to supporting fast VDPs and tooling readiness. Since 1997, the General Motors Die Center has been working jointly with our software vendor to develop and implement a parallel version of simulation software for mass production analysis applications. By 2001, this technology had matured in the form of distributed memory processing (DMP) of draw die simulations in a networked distributed-memory computing environment. In 2004, this technology was refined to massively parallel processing (MPP) and extended to line die forming analysis (draw, trim, flange, and associated spring-back) running on a dedicated computing environment. The evolution of this technology and the insight gained through the implementation of DMP/MPP technology, as well as performance benchmarks, are discussed in this publication.
Surfing on the morning after: analysis of an emergency contraception website.
Gainer, Erin; Sollet, Christian; Ulmann, Marion; Lévy, Delphine; Ulmann, André
2003-03-01
The introduction of widespread nonprescription delivery of hormonal emergency contraception (EC) calls for development of innovative tools to provide information to and gather feedback from EC users. Individuals seeking confidential information on sexual health and contraception are increasingly turning to the Internet as the resource of choice. This study employed analytical software and manual content analysis to examine the use of a website dedicated to an EC product (www.norlevo.com) over the course of 2 years. Frequency of visits to and pageviews of the site increased consistently over the 2-year time period, and the bulk of the visitors to the site were EC users seeking responses to frequently asked questions. The most common concern raised by users was the occurrence of spotting and menstrual bleeding following EC use. This analysis reveals that within the context of nonprescription access to hormonal EC, a website can constitute a potent educational tool for health professionals and EC users and provide a valuable source of post-marketing feedback on product use.
The tracking analysis in the Q-weak experiment
Pan, J.; Androic, D.; Armstrong, D. S.; ...
2016-11-21
Here, the Q-weak experiment at Jefferson Laboratory measured the parity-violating asymmetry ($A_{PV}$) in elastic electron-proton scattering at small momentum transfer squared ($Q^2 = 0.025\ (\mathrm{GeV}/c)^2$), with the aim of extracting the proton's weak charge ($Q^p_W$) to an accuracy of 5%. As one of the major sources of uncertainty in $Q^p_W$, $Q^2$ needs to be determined to about 1% in order to reach the proposed experimental precision. For this purpose, two sets of high-resolution tracking chambers were employed in the experiment to measure tracks before and after the magnetic spectrometer. Data collected by the tracking system were then reconstructed with dedicated software into individual electron trajectories for determination of the experimental kinematics. The Q-weak kinematics and the analysis scheme for the tracking data are briefly described here. The sources that contribute to the uncertainty of $Q^2$ are discussed, and the current analysis status is reported.
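For orientation, in elastic scattering with the electron mass neglected, the four-momentum transfer squared follows from the beam energy $E$, the scattered-electron energy $E'$ and the scattering angle $\theta$, which is why the trajectory measurement drives the $Q^2$ precision:

$$Q^2 \simeq 4\,E\,E'\,\sin^2(\theta/2)$$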
Hartman, Amber L; Riddle, Sean; McPhillips, Timothy; Ludäscher, Bertram; Eisen, Jonathan A
2010-06-12
For more than two decades microbiologists have used a highly conserved microbial gene as a phylogenetic marker for bacteria and archaea. The small-subunit ribosomal RNA gene, also known as 16S rRNA, is encoded by ribosomal DNA, 16S rDNA, and has provided a powerful comparative tool to microbial ecologists. Over time, the microbial ecology field has matured from small-scale studies in a select number of environments to massive collections of sequence data that are paired with dozens of corresponding collection variables. As the complexity of data and tool sets has grown, the need for flexible automation and maintenance of the core processes of 16S rDNA sequence analysis has increased correspondingly. We present WATERS, an integrated approach to 16S rDNA analysis that bundles a suite of publicly available 16S rDNA analysis software tools into a single software package. The "toolkit" includes sequence alignment, chimera removal, OTU determination, taxonomy assignment, and phylogenetic tree construction, as well as a host of ecological analysis and visualization tools. WATERS employs a flexible, collection-oriented 'workflow' approach using the open-source Kepler system as a platform. By packaging available software tools into a single automated workflow, WATERS simplifies 16S rDNA analyses, especially for those without specialized bioinformatics or programming expertise. In addition, WATERS, like some of the newer comprehensive rRNA analysis tools, allows researchers to minimize the time dedicated to carrying out tedious informatics steps and to focus their attention instead on the biological interpretation of the results. One advantage of WATERS over other comprehensive tools is that the use of the Kepler workflow system facilitates result interpretation and reproducibility via a data provenance sub-system. Furthermore, new "actors" can be added to the workflow as desired, and we see WATERS as an initial seed for a sizeable and growing repository of interoperable, easy-to-combine tools for asking increasingly complex microbial ecology questions.
Advanced Oil Spill Detection Algorithms For Satellite Based Maritime Environment Monitoring
NASA Astrophysics Data System (ADS)
Radius, Andrea; Azevedo, Rui; Sapage, Tania; Carmo, Paulo
2013-12-01
In recent years, the increasing occurrence of pollution and the alarming deterioration of the environmental health of the sea have led to the need for global monitoring capabilities, namely for marine environment management in terms of oil spill detection and indication of the suspected polluter. The sensitivity of Synthetic Aperture Radar (SAR) to different phenomena on the sea, especially oil spills and vessels, makes it a key instrument for global pollution monitoring. The SAR performance in maritime pollution monitoring is being operationally explored by a set of service providers on behalf of the European Maritime Safety Agency (EMSA), which launched the CleanSeaNet (CSN) project, a pan-European satellite-based oil monitoring service, in 2007. EDISOFT, a service provider for CSN from the beginning, is continuously investing in R&D activities that will ultimately lead to better algorithms and better performance in oil spill detection from SAR imagery. This strategy is being pursued through EDISOFT's participation in the FP7 EC Sea-U project and in the Automatic Oil Spill Detection (AOSD) ESA project. The Sea-U project aims to improve the current state of oil spill detection algorithms through the maximization of informative content obtained with data fusion, the exploitation of different types of data/sensors, and the development of advanced image processing, segmentation and classification techniques. The AOSD project is closely related to the operational segment, because it is focused on the automation of the oil spill detection processing chain, integrating auxiliary data, such as wind information, together with image and geometry analysis techniques. The synergy between these different objectives (R&D versus operational) has allowed EDISOFT to develop oil spill detection software that combines the operational automatic aspect, obtained through dedicated integration of the processing chain into the existing open source NEST software, with new detection, filtering and classification algorithms. In particular, dedicated filtering algorithms based on wavelet filtering were developed to improve oil spill detection and classification. In this work we present the functionalities of the developed software and the main results supporting the validity of the developed algorithms.
Dual ant colony operational modal analysis parameter estimation method
NASA Astrophysics Data System (ADS)
Sitarz, Piotr; Powałka, Bartosz
2018-01-01
Operational Modal Analysis (OMA) is a common technique used to examine the dynamic properties of a system. Contrary to experimental modal analysis, the input signal is generated in the object's ambient environment. Operational modal analysis mainly aims at determining the number of pole pairs and at estimating modal parameters. Many methods are used for parameter identification. Some methods operate in the time domain, others in the frequency domain. The former use correlation functions, the latter spectral density functions. However, while some methods require the user to select poles from a stabilisation diagram, others try to automate the selection process. The dual ant colony operational modal analysis parameter estimation method (DAC-OMA) presents a new approach to the problem, avoiding the issues involved in the stabilisation diagram. The presented algorithm is fully automated. It uses deterministic methods to define the intervals of the estimated parameters, thus reducing the problem to an optimisation task, which is conducted with dedicated software based on an ant colony optimisation algorithm. The combination of deterministic methods restricting parameter intervals and artificial intelligence yields very good results, including for closely spaced modes and significantly varied mode shapes within one measurement point.
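To give a flavour of the optimisation step, the sketch below is a toy continuous ant-colony scheme (rank-weighted Gaussian sampling around an archive of good solutions, broadly in the ACO_R style). It is our own illustration, not the DAC-OMA implementation, and all parameter values are arbitrary.

```python
# Toy continuous ant-colony optimiser (ACO_R-style); an illustration
# only, not the DAC-OMA implementation.
import numpy as np

def aco_minimize(f, bounds, n_ants=20, archive=10, iters=200, q=0.3, xi=0.85):
    rng = np.random.default_rng(0)
    lo, hi = np.array(bounds, dtype=float).T
    X = rng.uniform(lo, hi, size=(archive, len(bounds)))
    F = np.array([f(x) for x in X])
    # Rank-based weights: better-ranked archive members attract more ants.
    w = np.exp(-np.arange(archive) ** 2 / (2.0 * (q * archive) ** 2))
    w /= w.sum()
    for _ in range(iters):
        order = np.argsort(F)              # sort archive by fitness
        X, F = X[order], F[order]
        for _ in range(n_ants):
            k = rng.choice(archive, p=w)   # pick a guiding solution
            sigma = xi * np.abs(X - X[k]).mean(axis=0) + 1e-12
            x = np.clip(rng.normal(X[k], sigma), lo, hi)
            fx = f(x)
            j = np.argmax(F)               # replace current worst if better
            if fx < F[j]:
                X[j], F[j] = x, fx
    best = np.argmin(F)
    return X[best], F[best]

best_x, best_f = aco_minimize(lambda x: float((x ** 2).sum()), [(-5, 5), (-5, 5)])
```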
Geographic Information Systems and Web Page Development
NASA Technical Reports Server (NTRS)
Reynolds, Justin
2004-01-01
The Facilities Engineering and Architectural Branch is responsible for the design and maintenance of buildings, laboratories, and civil structures. In order to improve efficiency and quality, the FEAB has dedicated itself to establishing a data infrastructure based on Geographic Information Systems, GIS. The value of GIS was explained in an article dating back to 1980 entitled "Need for a Multipurpose Cadastre," which stated, "There is a critical need for a better land-information system in the United States to improve land-conveyance procedures, furnish a basis for equitable taxation, and provide much-needed information for resource management and environmental planning." Scientists and engineers both point to GIS as the solution. What is GIS? According to most textbooks, a Geographic Information System is a class of software that stores, manages, and analyzes mappable features on, above, or below the surface of the earth. GIS software is basically database management software applied to the management of spatial data and information. Simply put, Geographic Information Systems manage, analyze, chart, graph, and map spatial information. GIS can be broken down into two main categories, urban GIS and natural resource GIS. Further still, natural resource GIS can be broken down into six sub-categories: agriculture, forestry, wildlife, catchment management, archaeology, and geology/mining. Agriculture GIS has several applications, such as agricultural capability analysis, land conservation, market analysis, or whole-farm planning. Forestry GIS can be used for timber assessment and management, harvest scheduling and planning, environmental impact assessment, and pest management. GIS, when used in wildlife applications, enables the user to assess and manage habitats, identify and track endangered and rare species, and monitor impact assessment.
Asfour, Aktham; Raoof, Kosai; Yonnet, Jean-Paul
2013-11-27
A proof-of-concept of the use of fully digital radiofrequency (RF) electronics for the design of dedicated Nuclear Magnetic Resonance (NMR) systems at low field (0.1 T) is presented. This digital electronics is based on the use of three key elements: a Direct Digital Synthesizer (DDS) for pulse generation, a Software Defined Radio (SDR) for digital receiving of NMR signals, and a Digital Signal Processor (DSP) for system control and for the generation of the gradient signals (pulse programmer). The SDR includes direct analog-to-digital conversion and Digital Down Conversion (digital quadrature demodulation, decimation filtering, processing gain…). The various aspects of the concept and of the realization are addressed in some detail. These include both hardware design and software considerations. One of the underlying ideas is to enable such NMR systems to benefit from existing advanced technology realized in other research areas, especially in the telecommunications domain. Another goal is to make these systems easy to build and replicate, so as to help research groups realize dedicated NMR desktops for a large palette of new applications. We would also like to give readers an idea of the current trends in this field. The performance of the developed electronics is discussed throughout the paper. First FID (Free Induction Decay) signals are also presented. Some development perspectives of our work in the area of low-field NMR/MRI are finally addressed.
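The digital receiving chain mentioned above (quadrature demodulation followed by decimation filtering) can be condensed, for illustration, into a few lines of numerical Python; the actual system performs these steps in dedicated SDR hardware.

```python
# Sketch of a digital down-conversion (DDC) chain: quadrature
# demodulation then decimation; the real system does this in SDR hardware.
import numpy as np
from scipy.signal import decimate

def digital_down_convert(samples, fs, f_lo, factor):
    t = np.arange(len(samples)) / fs
    baseband = samples * np.exp(-2j * np.pi * f_lo * t)  # quadrature mix
    i = decimate(baseband.real, factor)  # decimation filtering (I)
    q = decimate(baseband.imag, factor)  # decimation filtering (Q)
    return i + 1j * q
```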
NASA Technical Reports Server (NTRS)
Kurihara, Shinobu; Nozawa, Kentaro
2013-01-01
The K5/VSSP software correlator (Figure 1), located in Tsukuba, Japan, is operated by the Geospatial Information Authority of Japan (GSI). It is fully dedicated to processing the geodetic VLBI sessions of the International VLBI Service for Geodesy and Astrometry. All of the weekend IVS Intensives (INT2) and the Japanese domestic VLBI observations organized by GSI were processed at the Tsukuba VLBI Correlator.
NASA Astrophysics Data System (ADS)
Miękina, Andrzej; Wagner, Jakub; Mazurek, Paweł; Morawski, Roman Z.; Sudmann, Tobba T.; Børsheim, Ingebjørg T.; Øvsthus, Knut; Jacobsen, Frode F.; Ciamulski, Tomasz; Winiecki, Wiesław
2016-11-01
The importance of research on new technologies that could be employed in care services for elderly and disabled persons is highlighted. The advantages of radar sensors, when applied to non-invasive monitoring of such persons in their home environment, are indicated. The need for comprehensible visualisation of the intermediate results of measurement data processing is justified. The capability of an impulse-radar-based system to provide information of crucial importance for medical or healthcare personnel is investigated. An exemplary software interface, tailored for non-technical users, is proposed, and preliminary results of impulse-radar-based monitoring of human movements are demonstrated.
Cryogenic Characterization of FBK RGB-HD SiPMs
Aalseth, C. E.
2017-09-26
We report on the cryogenic characterization of Red Green Blue - High Density (RGB-HD) SiPMs developed at Fondazione Bruno Kessler (FBK) as part of the DarkSide program of dark matter searches with liquid argon time projection chambers. A dedicated setup was used to measure the primary dark noise, the correlated noise, and the gain of the SiPMs at varying temperatures. A custom-made data acquisition system and analysis software were used to precisely characterize these parameters. We demonstrate that FBK RGB-HD SiPMs with low quenching resistance (RGB-HD-LR$_q$) can be operated from 40 K to 300 K with gains in the range $10^5$ to $10^6$ and noise rates on the order of a few Hz/mm$^2$.
Light at Night Markup Language (LANML): XML Technology for Light at Night Monitoring Data
NASA Astrophysics Data System (ADS)
Craine, B. L.; Craine, E. R.; Craine, E. M.; Crawford, D. L.
2013-05-01
Light at Night Markup Language (LANML) is a standard, based upon XML, useful in acquiring, validating, transporting, archiving and analyzing multi-dimensional light at night (LAN) datasets of any size. The LANML standard can accommodate a variety of measurement scenarios including single spot measures, static time-series, web based monitoring networks, mobile measurements, and airborne measurements. LANML is human-readable, machine-readable, and does not require a dedicated parser. In addition LANML is flexible; ensuring future extensions of the format will remain backward compatible with analysis software. The XML technology is at the heart of communicating over the internet and can be equally useful at the desktop level, making this standard particularly attractive for web based applications, educational outreach and efficient collaboration between research groups.
NASA Astrophysics Data System (ADS)
Polkowski, Marcin; Grad, Marek
2016-04-01
The passive seismic experiment "13BB Star" has been operated since mid-2013 in northern Poland and consists of 13 broadband seismic stations. One of the elements of this experiment is a dedicated on-line data acquisition system comprising both client (station) side and server side modules, with a web-based interface that allows monitoring of network status and provides tools for preliminary data analysis. The station side is controlled by an ARM Linux board that is programmed to maintain the 3G/EDGE internet connection, receive data from the digitizer, and send data to the central server along with additional auxiliary parameters such as temperatures, voltages and electric current measurements. The station-side software is a set of easy-to-install PHP scripts. Data are transmitted securely over the SSH protocol to the central server. The central server is a dedicated Linux-based machine. Its duty is receiving and processing all data from all stations, including the auxiliary parameters. The server-side software is written in PHP and Python. Additionally, it allows remote station configuration and provides a web-based interface for user-friendly interaction. All collected data can be displayed for each day and station. The system also allows manual creation of event-oriented plots with different filtering options and provides numerous status and statistics views. Our solution is very flexible and easy to modify. In this presentation we would like to share our solution and experience. The National Science Centre Poland provided financial support for this work via NCN grant DEC-2011/02/A/ST10/00284.
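As an illustration of the station-to-server transfer over SSH, the sketch below uses Python's paramiko library. Host names, paths and credentials are invented, and the actual station software is a set of PHP scripts, so this is only an analogue of the transfer step.

```python
# Illustrative secure upload over SSH/SFTP with paramiko; host, paths
# and key are hypothetical (the real station software is PHP-based).
import paramiko

client = paramiko.SSHClient()
client.set_missing_host_key_policy(paramiko.AutoAddPolicy())
client.connect("data.example.org", username="station01",
               key_filename="/home/station/.ssh/id_rsa")
sftp = client.open_sftp()
sftp.put("/data/latest.mseed", "/incoming/station01/latest.mseed")
sftp.close()
client.close()
```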
Hierarchy Software Development Framework (h-dp-fwk) project
NASA Astrophysics Data System (ADS)
Zaytsev, A.
2010-04-01
The Hierarchy Software Development Framework provides a lightweight tool for building portable modular applications for performing automated data analysis tasks in batch mode. Design and development activities on the project began in March 2005, and from the very beginning they targeted building experimental data processing applications for the CMD-3 experiment, which is being commissioned at the Budker Institute of Nuclear Physics (BINP, Novosibirsk, Russia). The design addresses the generic case of a modular data processing application operating within a well-defined distributed computing environment. The main features of the framework are modularity, built-in message and data exchange mechanisms, XInclude- and XML-schema-enabled XML configuration management tools, dedicated log management tools, internal debugging tools, support for both dynamic and static module chains, internal DSO version and consistency checking, and a well-defined API for developing specialized frameworks. It is supported on Scientific Linux 4 and 5 and is planned to be ported to other platforms as well. The project is provided with a comprehensive set of technical documentation and users' guides. The licensing scheme for the source code, binaries and documentation implies that the product is free for non-commercial use. Although the development phase is not over and many features are yet to be implemented, the project is considered ready for public use and for creating applications in various fields, including the development of event reconstruction software for small and moderate scale HEP experiments.
From Excavations to Web: a GIS for Archaeology
NASA Astrophysics Data System (ADS)
D'Urso, M. G.; Corsi, E.; Nemeti, S.; Germani, M.
2017-05-01
The study and protection of Cultural Heritage have in recent years undergone a revolution in both research tools and reference disciplines. The technological approach to the collection, organization and publication of archaeological data using GIS software has completely changed the essence of traditional methods of investigation, paving the way to the development of several application areas, up to Cultural Resource Management. A relatively recent sector of archaeological GIS development is dedicated to intra-site analyses aimed at recording, processing and displaying information obtained during excavations. The case study of the archaeological site located in the south-east of the San Pietro Vetere plateau in Aquino, in southern Lazio, illustrates a procedure describing the complete digital workflow of an intra-site analysis of an archaeological dig. The implementation of the GIS project and its publication on the web, thanks to several software packages, particularly the FOSS (Free Open Source Software) Quantum-GIS, are an opportunity to reflect on the strengths and the critical aspects of this particular application of GIS technology. For future developments in research, the identification of a digital protocol for the processing of excavations (from acquisition and cataloguing up to data insertion) is of fundamental importance, also on account of a possible future Open Project on medieval Aquino.
Göbl, Rüdiger; Navab, Nassir; Hennersperger, Christoph
2018-06-01
Research in ultrasound imaging is limited in reproducibility by two factors: first, many existing ultrasound pipelines are protected by intellectual property, rendering the exchange of code difficult; second, most pipelines are implemented in special hardware, resulting in limited flexibility of the processing steps implemented on such platforms. With SUPRA, we propose an open-source pipeline for fully software-defined ultrasound processing for real-time applications to alleviate these problems. Covering all steps from beamforming to the output of B-mode images, SUPRA can help improve the reproducibility of results and make modifications to the image acquisition mode accessible to the research community. We evaluate the pipeline qualitatively, quantitatively, and with regard to its run time. The pipeline shows image quality comparable to a clinical system and, backed by point-spread-function measurements, a comparable resolution. Including all processing stages of a usual ultrasound pipeline, the run-time analysis shows that it can be executed in 2D and 3D on consumer GPUs in real time. Our software ultrasound pipeline opens up research in image acquisition. Given access to ultrasound data from early stages (raw channel data, radio-frequency data), it simplifies development in imaging. Furthermore, it tackles the reproducibility of research results, as code can be shared easily and even executed without dedicated ultrasound hardware.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hasenkamp, Daren; Sim, Alexander; Wehner, Michael
Extensive computing power has been used to tackle issues such as climate change, fusion energy, and other pressing scientific challenges. These computations produce a tremendous amount of data; however, many of the data analysis programs currently run on only a single processor. In this work, we explore the possibility of using the emerging cloud computing platform to parallelize such sequential data analysis tasks. As a proof of concept, we wrap a program for analyzing trends of tropical cyclones in a set of virtual machines (VMs). This approach allows the user to keep their familiar data analysis environment in the VMs, while we provide the coordination and data transfer services to ensure the necessary input and output are directed to the desired locations. This work extensively exercises the networking capability of the cloud computing systems and has revealed a number of weaknesses in the current cloud system software. In our tests, we are able to scale the parallel data analysis job to a modest number of VMs and achieve a speedup that is comparable to running the same analysis task using MPI. However, compared to MPI-based parallelization, the cloud-based approach has a number of advantages. The cloud-based approach is more flexible because the VMs can capture arbitrary software dependencies without requiring the user to rewrite their programs. The cloud-based approach is also more resilient to failure: as long as a single VM is running, it can make progress, whereas as soon as one MPI node fails the whole analysis job fails. In short, this initial work demonstrates that a cloud computing system is a viable platform for distributed scientific data analyses traditionally conducted on dedicated supercomputing systems.
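The coordination pattern described, fanning input files out to workers that each run the unmodified sequential program, can be sketched as follows. Plain local processes stand in for the paper's VMs, and both the analysis command and the file names are hypothetical:

```python
"""Sketch of the coordination idea only: distribute inputs to workers that
each run the sequential analysis program unchanged. The 'cyclone_trends'
command and the file names are hypothetical."""
import subprocess
from concurrent.futures import ProcessPoolExecutor

def analyze(path: str) -> str:
    out = path + ".trends"
    # Each worker runs the sequential program as-is, as the VMs do.
    subprocess.run(["cyclone_trends", path, "-o", out], check=True)
    return out

if __name__ == "__main__":
    files = [f"cam_slice_{i:03d}.nc" for i in range(16)]  # hypothetical inputs
    with ProcessPoolExecutor(max_workers=4) as pool:
        print(list(pool.map(analyze, files)))
```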
The 2016 Bioinformatics Open Source Conference (BOSC)
Harris, Nomi L.; Cock, Peter J.A.; Chapman, Brad; Fields, Christopher J.; Hokamp, Karsten; Lapp, Hilmar; Muñoz-Torres, Monica; Wiencko, Heather
2016-01-01
Message from the ISCB: The Bioinformatics Open Source Conference (BOSC) is a yearly meeting organized by the Open Bioinformatics Foundation (OBF), a non-profit group dedicated to promoting the practice and philosophy of Open Source software development and Open Science within the biological research community. BOSC has been run since 2000 as a two-day Special Interest Group (SIG) before the annual ISMB conference. The 17th annual BOSC ( http://www.open-bio.org/wiki/BOSC_2016) took place in Orlando, Florida in July 2016. As in previous years, the conference was preceded by a two-day collaborative coding event open to the bioinformatics community. The conference brought together nearly 100 bioinformatics researchers, developers and users of open source software to interact and share ideas about standards, bioinformatics software development, and open and reproducible science. PMID:27781083
Real-time graphic display utility for nuclear safety applications
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, S.; Huang, X.; Taylor, J.
2006-07-01
With increasing interest in nuclear energy, new nuclear power plants will be constructed and licensed, and older-generation ones will be upgraded to assure continuing operation. The tendency to adopt the latest proven technology and the fact that older parts become obsolete have made the upgrades imperative. One of the areas for upgrades is the older CRT display being replaced by the latest graphics displays running under a modern real-time operating system (RTOS) on a safety-graded modern computer. HFC has developed a graphic display utility (GDU) under the QNX RTOS. Standard off-the-shelf software with a long history of performance in industrial applications, the QNX RTOS used for safety applications has been examined via a commercial dedication process that is consistent with the regulatory guidelines. Through a commercial survey, a design life cycle and operating history evaluation, and the necessary tests dictated by the dedication plan, it was reasonably confirmed that the QNX RTOS is essentially equivalent to what would be expected in the nuclear industry. The developed GDU operates and communicates with the existing equipment through a dedicated serial channel of a flat panel controller (FPC) module. The FPC module drives a flat panel display (FPD) monitor. A touch screen mounted on the FPD serves as the normal operator interface with the FPC/FPD monitor system. The GDU can be used not only for replacing older CRTs but also in new applications. The replacement of the older CRT does not disturb the function of the existing equipment. It not only provides a modern proven technology upgrade but also improves human ergonomics. The FPC, which can be used as a standalone controller running the GDU, is an integrated hardware and software module. It operates as a single-board computer within a control system, and applies primarily to the graphics display, targeting, keyboard and mouse. During normal system operation, the GDU has two sources of data input: a serial interface with field equipment and a serial input from the FPD touch screen. The mechanism for data collection from the field equipment consists of the regular exchange of data update request messages and target commands sent to the equipment and the update messages returned to the FPC. The data updates from field equipment control the displays presented on the graphic pages. Touch screen contacts are decoded to identify the physical position that was contacted. If that position corresponds with one of the buttons on the graphic page, the software uses that input to initiate the function defined for the particular button contacted. In this paper, the FPC will be illustrated as a standalone system as well as a module in a dedicated control system. The GDU design concepts and design flow will be demonstrated. The dedication process of the QNX RTOS needed for the GDU will be highlighted. Finally, the GDU with a specific application example used in one of the nuclear power plants will be presented. (authors)
KNIME4NGS: a comprehensive toolbox for next generation sequencing analysis.
Hastreiter, Maximilian; Jeske, Tim; Hoser, Jonathan; Kluge, Michael; Ahomaa, Kaarin; Friedl, Marie-Sophie; Kopetzky, Sebastian J; Quell, Jan-Dominik; Mewes, H Werner; Küffner, Robert
2017-05-15
Analysis of Next Generation Sequencing (NGS) data requires the processing of large datasets by chaining various tools with complex input and output formats. In order to automate data analysis, we propose to standardize NGS tasks into modular workflows. This simplifies reliable handling and processing of NGS data, and the corresponding solutions become substantially more reproducible and easier to maintain. Here, we present a documented, Linux-based toolbox of 42 processing modules that can be combined to construct workflows facilitating a variety of tasks such as DNAseq and RNAseq analysis. We also describe important technical extensions. The high-throughput executor (HTE) helps to increase reliability and to reduce manual interventions when processing complex datasets. We also provide a dedicated binary manager that assists users in obtaining the modules' executables and keeping them up to date. As the basis for this actively developed toolbox we use the workflow management software KNIME. See http://ibisngs.github.io/knime4ngs for nodes and user manual (GPLv3 license). robert.kueffner@helmholtz-muenchen.de. Supplementary data are available at Bioinformatics online.
RepExplore: addressing technical replicate variance in proteomics and metabolomics data analysis.
Glaab, Enrico; Schneider, Reinhard
2015-07-01
High-throughput omics datasets often contain technical replicates included to account for technical sources of noise in the measurement process. Although summarizing these replicate measurements by using robust averages may help to reduce the influence of noise on downstream data analysis, the information on the variance across the replicate measurements is lost in the averaging process and is therefore typically disregarded in subsequent statistical analyses. We introduce RepExplore, a web service dedicated to exploiting the information captured in the technical replicate variance to provide more reliable and informative differential expression and abundance statistics for omics datasets. The software builds on previously published statistical methods, which have been applied successfully to biomedical omics data but are difficult to use without prior experience in programming or scripting. RepExplore facilitates the analysis by providing fully automated data processing and interactive ranking tables, whisker plots, heat maps and principal component analysis visualizations to interpret omics data and derived statistics. Freely available at http://www.repexplore.tk enrico.glaab@uni.lu Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press.
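To make concrete why the replicate variance is worth keeping, a toy score (not RepExplore's published method) can fold the technical-replicate standard errors into the denominator of a z-like statistic instead of discarding them by plain averaging:

```python
"""Toy illustration of using technical-replicate variance in a test
statistic; this is not RepExplore's algorithm, only the underlying idea."""
import numpy as np

def score(case_reps: np.ndarray, ctrl_reps: np.ndarray) -> float:
    """Each input: (n_technical_replicates,) measurements of one analyte."""
    m1, m2 = case_reps.mean(), ctrl_reps.mean()
    # Standard errors of the replicate means carry the technical noise.
    se1 = case_reps.std(ddof=1) / np.sqrt(len(case_reps))
    se2 = ctrl_reps.std(ddof=1) / np.sqrt(len(ctrl_reps))
    return (m1 - m2) / np.sqrt(se1**2 + se2**2)

print(score(np.array([5.1, 5.3, 5.2]), np.array([4.1, 4.0, 4.3])))
```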
NMRbox: A Resource for Biomolecular NMR Computation.
Maciejewski, Mark W; Schuyler, Adam D; Gryk, Michael R; Moraru, Ion I; Romero, Pedro R; Ulrich, Eldon L; Eghbalnia, Hamid R; Livny, Miron; Delaglio, Frank; Hoch, Jeffrey C
2017-04-25
Advances in computation have been enabling many recent advances in biomolecular applications of NMR. Due to the wide diversity of applications of NMR, the number and variety of software packages for processing and analyzing NMR data is quite large, with labs relying on dozens, if not hundreds of software packages. Discovery, acquisition, installation, and maintenance of all these packages is a burdensome task. Because the majority of software packages originate in academic labs, persistence of the software is compromised when developers graduate, funding ceases, or investigators turn to other projects. To simplify access to and use of biomolecular NMR software, foster persistence, and enhance reproducibility of computational workflows, we have developed NMRbox, a shared resource for NMR software and computation. NMRbox employs virtualization to provide a comprehensive software environment preconfigured with hundreds of software packages, available as a downloadable virtual machine or as a Platform-as-a-Service supported by a dedicated compute cloud. Ongoing development includes a metadata harvester to regularize, annotate, and preserve workflows and facilitate and enhance data depositions to BioMagResBank, and tools for Bayesian inference to enhance the robustness and extensibility of computational analyses. In addition to facilitating use and preservation of the rich and dynamic software environment for biomolecular NMR, NMRbox fosters the development and deployment of a new class of metasoftware packages. NMRbox is freely available to not-for-profit users. Copyright © 2017 Biophysical Society. All rights reserved.
NASA Astrophysics Data System (ADS)
Gordov, Evgeny; Lykosov, Vasily; Krupchatnikov, Vladimir; Okladnikov, Igor; Titov, Alexander; Shulgina, Tamara
2013-04-01
Analysis of the growing volume of climate-change-related data from sensors and model outputs requires collaborative multidisciplinary efforts of researchers. To do this in a timely and reliable way, a modern information and computational infrastructure supporting integrated studies in the field of environmental sciences is needed. The recently developed experimental software and hardware platform Climate (http://climate.scert.ru/) provides the required environment for regional climate change investigations. The platform combines a modern Web 2.0 approach, GIS functionality and capabilities to run climate and meteorological models, process large geophysical datasets and support the relevant analysis. It also supports joint software development by distributed research groups and the organization of thematic education for students and post-graduate students. In particular, the platform software includes dedicated modules for numerical processing of regional and global modelling results for subsequent analysis and visualization. Runs of the integrated WRF and "Planet Simulator" models, preprocessing of modelling results and visualization are also provided. All functions of the platform are accessible to users through a web portal using a common graphical web browser, in the form of an interactive graphical user interface that provides, in particular, selection of a geographical region of interest (pan and zoom), manipulation of data layers (order, enable/disable, feature extraction) and visualization of results. The platform provides users with capabilities for heterogeneous geophysical data analysis, including high-resolution data, and for discovering tendencies in climatic and ecosystem changes in the framework of different multidisciplinary studies. Using it, even an unskilled user without specific knowledge can perform reliable computational processing and visualization of large meteorological, climatic and satellite monitoring datasets through the unified graphical web interface. Partial support of RF Ministry of Education and Science grant 8345, SB RAS Program VIII.80.2, Projects 69, 131 and 140, and the APN CBA2012-16NSY project is acknowledged.
The Integrated Hazard Analysis Integrator
NASA Technical Reports Server (NTRS)
Morris, A. Terry; Massie, Michael J.
2009-01-01
Hazard analysis addresses hazards that arise in the design, development, manufacturing, construction, facilities, transportation, operations and disposal activities associated with hardware, software, maintenance, operations and environments. An integrated hazard is an event or condition that is caused by or controlled by multiple systems, elements, or subsystems. Integrated hazard analysis (IHA) is especially daunting and ambitious for large, complex systems such as NASA's Constellation program, which incorporates program, systems and element components that impact others (International Space Station, public, International Partners, etc.). An appropriate IHA should identify all hazards, causes, controls and verifications used to mitigate the risk of catastrophic loss of crew, vehicle and/or mission. Unfortunately, in the current age of increased technology dependence, there is a tendency to overlook the necessary and sufficient qualifications of the integrator, that is, the person/team that identifies the parts, analyzes the architectural structure, aligns the analysis with the program plan and then communicates/coordinates with large and small components, each contributing necessary hardware, software and/or information to prevent catastrophic loss. As seen in both the Challenger and Columbia accidents, lack of appropriate communication, management errors and lack of resources dedicated to safety were cited as major contributors to these fatalities. From the accident reports, it would appear that the organizational impact of managers, integrators and safety personnel contributes more significantly to mission success and mission failure than purely technological components. If this is so, then organizations that sincerely desire mission success must put as much effort into selecting managers and integrators as they do when designing the hardware, writing the software code and analyzing competitive proposals. This paper will discuss the necessary and sufficient requirements of one of the significant contributors to mission success, the IHA integrator. Discussions will be provided to describe both the mindset required as well as deleterious assumptions/behaviors to avoid when integrating within a large-scale system.
NASA Astrophysics Data System (ADS)
Candia, Sante; Lisio, Giovanni; Campolo, Giovanni; Pascucci, Dario
2010-08-01
The Avionics Software (ASW), in charge of controlling the Low Earth Orbit (LEO) Spacecraft PRIMA Platform (Piattaforma Ri-configurabile Italiana Multi-Applicativa), is evolving towards a highly modular and re-usable architecture, based on an architectural framework allowing the effective integration of the software building blocks (SWBBs) that provide the on-board control functions. In recent years, the PRIMA ASW design and production processes have been improved to reach the following objectives: (a) at PUS Services level, separation of the mission-independent software mechanisms from the mission-dependent configuration information; (b) at Application level, identification of mission-independent recurrent functions to promote abstraction and obtain a more efficient and safe ASW production, with positive implications also for the software validation activities. This paper is dedicated to the characterisation activity performed at Application level for a software component abstracting a set of functions for the generic On-Board Assembly (OBA), a set of hardware units used to deliver an on-board service. Moreover, the ASW production process is described to show how it changes after the introduction of the new design features.
NASA Astrophysics Data System (ADS)
Puig, Albert; LHCb Starterkit Team
2017-10-01
The vast majority of high-energy physicists use and produce software every day. Software skills are usually acquired "on the go" and dedicated training courses are rare. The LHCb Starterkit is a new training format for getting LHCb collaborators started in effectively using software to perform their research. The course focuses on teaching basic skills for research computing. Unlike traditional tutorials, we start with the basics, present all the material live with a high degree of interactivity, and give priority to understanding the tools as opposed to handing out recipes that work "as if by magic". The LHCb Starterkit was started by two young members of the collaboration inspired by the principles of Software Carpentry, and the material is created in a collaborative fashion using the tools we teach. Three successful entry-level workshops, as well as an advanced one, have taken place since the start of the initiative in 2015, and were taught largely by PhD students to other PhD students.
NASA Astrophysics Data System (ADS)
Langlois, Serge; Fouquet, Olivier; Gouy, Yann; Riant, David
2014-08-01
On-Board Computers (OBC) increasingly use integrated systems-on-chip (SoC) that embed processors running from 50 MHz up to several hundred MHz, around which are plugged dedicated communication controllers together with other input/output channels. For ground testing and On-Board Software (OBSW) validation purposes, a representative simulation of these systems, faster than real time and with cycle-true timing of execution, is not achievable with current purely software simulators. For several years, hybrid solutions have been put in place ([1], [2]), including hardware in the loop so as to add accuracy and performance to computer software simulation. This paper presents the results of the work undertaken by Thales Alenia Space (TAS-F) at the end of 2010, which led to a validated hardware simulator of the UT699 by mid-2012 that is now qualified and fully used in operational contexts.
A user-friendly LabVIEW software platform for grating based X-ray phase-contrast imaging.
Wang, Shenghao; Han, Huajie; Gao, Kun; Wang, Zhili; Zhang, Can; Yang, Meng; Wu, Zhao; Wu, Ziyu
2015-01-01
X-ray phase-contrast imaging can provide greatly improved contrast over conventional absorption-based imaging for weakly absorbing samples, such as biological soft tissues and fibre composites. In this study, we introduce an easy and fast way to develop a user-friendly software platform dedicated to the new grating-based X-ray phase-contrast imaging setup at the National Synchrotron Radiation Laboratory of the University of Science and Technology of China. The control of 21 motorized stages, a piezoelectric stage and an X-ray tube is achieved with this software, which also covers image acquisition with a flat-panel detector for automatic phase-stepping scans. Moreover, a data post-processing module for signal retrieval and other custom features is available in principle. With a seamless integration of all the necessary functions in one software package, this platform greatly facilitates users' activities during experimental runs with this grating-based X-ray phase-contrast imaging setup.
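For context on what the post-processing module must do, signal retrieval from a phase-stepping scan is commonly performed with a per-pixel Fourier analysis over the stepping dimension. The sketch below illustrates that standard approach in NumPy; it is not the platform's LabVIEW code, and the synthetic input is invented:

```python
"""Standard Fourier-based retrieval for grating interferometry phase
stepping; an illustration of the common technique, not the platform's code."""
import numpy as np

def retrieve(stack: np.ndarray):
    """stack: (n_steps, H, W) intensities over one grating period."""
    f = np.fft.fft(stack, axis=0)
    absorption = np.abs(f[0]) / stack.shape[0]    # mean intensity
    dphase = np.angle(f[1])                       # differential phase signal
    visibility = 2 * np.abs(f[1]) / np.abs(f[0])  # fringe visibility (dark field)
    return absorption, dphase, visibility

steps = np.linspace(0, 2 * np.pi, 8, endpoint=False)
stack = 1.0 + 0.4 * np.cos(steps[:, None, None] + 0.3)  # synthetic single pixel
a, p, v = retrieve(stack)
print(a[0, 0], p[0, 0], v[0, 0])  # ~1.0, ~0.3 rad, ~0.4
```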
Experimental Adaptive Optics Setup with FPGA Technology
NASA Astrophysics Data System (ADS)
Rodriguez Brizuela, F.; Verasay, J. P.; Recabarren, P.
An experimental platform based on FPGA devices, dedicated to implementing active and adaptive optics software in HDL, has been developed. The developed assembly is the first of a series of works focused on this important area of instrumental astronomy. The development presented is part of a Final Project in Electronic Engineering at the National University of Cordoba. FULL TEXT IN SPANISH
Evaluation of image registration in PET/CT of the liver and recommendations for optimized imaging.
Vogel, Wouter V; van Dalen, Jorn A; Wiering, Bas; Huisman, Henkjan; Corstens, Frans H M; Ruers, Theo J M; Oyen, Wim J G
2007-06-01
Multimodality PET/CT of the liver can be performed with an integrated (hybrid) PET/CT scanner or with software fusion of dedicated PET and CT. Accurate anatomic correlation and good image quality of both modalities are important prerequisites, regardless of the applied method. Registration accuracy is influenced by breathing motion differences on PET and CT, which may also have impact on (attenuation correction-related) artifacts, especially in the upper abdomen. The impact of these issues was evaluated for both hybrid PET/CT and software fusion, focused on imaging of the liver. Thirty patients underwent hybrid PET/CT, 20 with CT during expiration breath-hold (EB) and 10 with CT during free breathing (FB). Ten additional patients underwent software fusion of dedicated PET and dedicated expiration breath-hold CT (SF). The image registration accuracy was evaluated at the location of liver borders on CT and uncorrected PET images and at the location of liver lesions. Attenuation-correction artifacts were evaluated by comparison of liver borders on uncorrected and attenuation-corrected PET images. CT images were evaluated for the presence of breathing artifacts. In EB, 40% of patients had an absolute registration error of the diaphragm in the craniocaudal direction of >1 cm (range, -16 to 44 mm), and 45% of lesions were mispositioned >1 cm. In 50% of cases, attenuation-correction artifacts caused a deformation of the liver dome on PET of >1 cm. Poor compliance to breath-hold instructions caused CT artifacts in 55% of cases. In FB, 30% had registration errors of >1 cm (range, -4 to 16 mm) and PET artifacts were less extensive, but all CT images had breathing artifacts. As SF allows independent alignment of PET and CT, no registration errors or artifacts of >1 cm of the diaphragm occurred. Hybrid PET/CT of the liver may have significant registration errors and artifacts related to breathing motion. The extent of these issues depends on the selected breathing protocol and the speed of the CT scanner. No protocol or scanner can guarantee perfect image fusion. On the basis of these findings, recommendations were formulated with regard to scanner requirements, breathing protocols, and reporting.
NASA Astrophysics Data System (ADS)
Efthimiou, N.; Papadimitroulas, P.; Kostou, T.; Loudos, G.
2015-09-01
Commercial clinical and preclinical PET scanners rely on a full cylindrical geometry for whole-body scans as well as for dedicated-organ imaging. In this study we propose the construction of a low-cost dual-head C-shaped PET system dedicated to small-animal brain imaging. Monte Carlo simulation studies were performed using the GATE toolkit to evaluate the optimum design in terms of sensitivity, distortions in the FOV and spatial resolution. The PET model is based on SiPMs and BGO pixelated arrays. Four different configurations, with C-angles of 0°, 15°, 30° and 45° within the modules, were considered. Geometrical phantoms were used for the evaluation process. STIR software, extended by an efficient multi-threaded ray-tracing technique, was used for the image reconstruction. The algorithm automatically adjusts the size of the FOV according to the shape of the detector geometry. The results showed an improvement in sensitivity of ∼15% for the 45° C-angle compared to the 0° case. The spatial resolution was found to be 2 mm for the 45° C-angle.
Simulator predicts transient flow for Malaysian subsea pipeline
DOE Office of Scientific and Technical Information (OSTI.GOV)
Inayat-Hussain, A.A.; Ayob, M.S.; Zain, A.B.M.
1996-04-15
In a step towards acquiring in-house capability in multiphase flow technology, Petronas Research and Scientific Services Sdn. Bhd., Kuala Lumpur, has developed two-phase flow simulation software for analyzing slow gas-condensate transient flow. Unlike its general-purpose contemporaries -- TACITE, OLGA, Traflow (OGJ, Jan. 3, 1994, p. 42; OGJ, Jan. 10, 1994, p. 52), and PLAC (AEA Technology, U.K.) -- ABASs is dedicated software for the slow transient flows generated during pigging operations in the Duyong network, offshore Malaysia. This network links the Duyong and Bekok fields to the onshore gas terminal (OGT) on the east coast of peninsular Malaysia. It predicts the steady-state pressure drop vs. flow rates, condensate volume in the network, pigging dynamics including the volume of produced slug, and the condensate build-up following pigging. The predictions of ABASs have been verified against field data obtained from the Duyong network. Presented here is an overview of the development, verification, and application of the ABASs software. Field data are presented for verification of the software, and several operational scenarios are simulated using the software. The field data and simulation study documented here will provide software users and developers with a further set of results on which to benchmark their own software and two-phase pipeline operating guidelines.
Flightspeed Integral Image Analysis Toolkit
NASA Technical Reports Server (NTRS)
Thompson, David R.
2009-01-01
The Flightspeed Integral Image Analysis Toolkit (FIIAT) is a C library that provides image analysis functions in a single, portable package. It provides basic low-level filtering, texture analysis, and subwindow descriptors for applications dealing with image interpretation and object recognition. Designed with spaceflight in mind, it addresses: ease of integration (minimal external dependencies); fast, real-time operation using integer arithmetic where possible (useful for platforms lacking a dedicated floating-point processor); implementation entirely in C (easily modified); mostly static memory allocation; and 8-bit image data. The basic goal of the FIIAT library is to compute meaningful numerical descriptors for images or rectangular image regions. These n-vectors can then be used directly for novelty detection or pattern recognition, or as a feature space for higher-level pattern recognition tasks. The library provides routines for leveraging training data to derive descriptors that are most useful for a specific data set. Its runtime algorithms exploit a structure known as the "integral image." This is a caching method that permits fast summation of values within rectangular regions of an image, and it facilitates a wide range of fast image-processing functions. This toolkit has applicability to a wide range of autonomous image analysis tasks in the space-flight domain, including novelty detection, object and scene classification, target detection for autonomous instrument placement, and science analysis of geomorphology. It makes real-time texture and pattern recognition possible for platforms with severe computational constraints. The software provides an order-of-magnitude speed increase over alternative software libraries currently in use by the research community. FIIAT can commercially support intelligent video cameras used in intelligent surveillance. It is also useful for object recognition by robots or other autonomous vehicles.
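The integral-image structure at the heart of FIIAT can be stated in a few lines. The following NumPy sketch (FIIAT itself is plain C with static allocation) shows the linear-time precomputation and the four-lookup rectangle sum:

```python
"""Minimal integral-image sketch: cumulative sums make any rectangle sum
O(1) after a single O(N) precomputation. Illustrative only, not FIIAT code."""
import numpy as np

def integral_image(img: np.ndarray) -> np.ndarray:
    # Pad with a leading zero row/column so box sums need no edge cases.
    ii = np.zeros((img.shape[0] + 1, img.shape[1] + 1), dtype=np.int64)
    ii[1:, 1:] = img.cumsum(axis=0).cumsum(axis=1)
    return ii

def box_sum(ii, top, left, bottom, right):
    """Sum of img[top:bottom, left:right] from four table lookups."""
    return ii[bottom, right] - ii[top, right] - ii[bottom, left] + ii[top, left]

img = np.arange(16, dtype=np.int64).reshape(4, 4)
ii = integral_image(img)
assert box_sum(ii, 1, 1, 3, 3) == img[1:3, 1:3].sum()
```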
Becker, Anton S; Mueller, Michael; Stoffel, Elina; Marcon, Magda; Ghafoor, Soleen; Boss, Andreas
2018-02-01
To train a generic deep learning software (DLS) to classify breast cancer on ultrasound images and to compare its performance to human readers with variable breast imaging experience. In this retrospective study, all breast ultrasound examinations from January 1, 2014 to December 31, 2014 at our institution were reviewed. Patients with post-surgical scars, initially indeterminate lesions, or malignant lesions with histological diagnoses or 2-year follow-up were included. The DLS was trained with 70% of the images, and the remaining 30% were used to validate the performance. Three readers with variable expertise also evaluated the validation set (radiologist, resident, medical student). Diagnostic accuracy was assessed with a receiver operating characteristic (ROC) analysis. 82 patients with malignant and 550 with benign lesions were included. Time needed for training was 7 min (DLS). Evaluation times for the test data set were 3.7 s (DLS) and 28, 22 and 25 min for the human readers (in order of decreasing experience). ROC analysis revealed non-significant differences (p-values 0.45-0.47) in the area under the curve of 0.84 (DLS), 0.88 (experienced and intermediate readers) and 0.79 (inexperienced reader). DLS may aid in diagnosing cancer on breast ultrasound images with an accuracy comparable to radiologists, and it learns better and faster than a human reader with no prior experience. Further clinical trials with dedicated algorithms are warranted. Advances in knowledge: DLS can be trained to classify cancer on breast ultrasound images with high accuracy, even with comparably few training cases. The fast evaluation speed makes real-time image analysis feasible.
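For readers unfamiliar with the reported metric, the comparison boils down to computing the area under the ROC curve for classifier scores and for reader ratings against histological ground truth. A hedged sketch with purely synthetic data (none of the study's numbers) follows:

```python
"""Sketch of an AUC comparison between classifier scores and reader
ratings; all data below are synthetic, not the study's."""
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
y = rng.integers(0, 2, 200)                        # 0 = benign, 1 = malignant
dls_scores = y * 0.8 + rng.normal(0, 0.4, 200)     # synthetic DLS outputs
reader_scores = y * 0.7 + rng.normal(0, 0.5, 200)  # synthetic reader ratings
print("DLS AUC:   ", roc_auc_score(y, dls_scores))
print("Reader AUC:", roc_auc_score(y, reader_scores))
```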
Computer Synthesis Approaches of Hyperboloid Gear Drives with Linear Contact
NASA Astrophysics Data System (ADS)
Abadjiev, Valentin; Kawasaki, Haruhisa
2014-09-01
Computer-aided design has advanced through the creation of different types of software for scientific research in the field of gearing theory, as well as for providing adequate scientific support of gear drive manufacture. Such computer programs are based on mathematical models resulting from scientific research. Modern gear transmissions require new mathematical approaches to their geometric, technological and strength analysis. The process of optimization, synthesis and design is based on adequate iteration procedures to find an optimal solution by varying definite parameters. This study is dedicated to the accepted methodology in the creation of software for the synthesis of a class of high-reduction hyperboloid gears, Spiroid and Helicon ones (Spiroid and Helicon are trademarks registered by the Illinois Tool Works, Chicago, Ill.). The developed basic computer products are software based on original mathematical models. They rest on two mathematical models for the synthesis: "upon a pitch contact point" and "upon a mesh region". Computer programs have been worked out on the basis of the described mathematical models, and the relations between them are shown. The application of the presented approaches to the synthesis of the gear drives discussed is illustrated.
Updates in metabolomics tools and resources: 2014-2015.
Misra, Biswapriya B; van der Hooft, Justin J J
2016-01-01
Data processing and interpretation represent the most challenging and time-consuming steps in high-throughput metabolomic experiments, regardless of the analytical platform (MS- or NMR-spectroscopy based) used for data acquisition. Improved machinery in metabolomics generates increasingly complex datasets that create the need for more and better processing and analysis software and in silico approaches to understand the resulting data. However, a comprehensive source of information describing the utility of the most recently developed and released metabolomics resources -- in the form of tools, software, and databases -- is currently lacking. Thus, here we provide an overview of freely available and open-source tools, algorithms, and frameworks to make both upcoming and established metabolomics researchers aware of recent developments, in an attempt to advance and facilitate data processing workflows in their metabolomics research. The major topics include tools and resources for data processing, data annotation, and data visualization in MS- and NMR-based metabolomics. Most of the tools described in this review are dedicated to untargeted metabolomics workflows; however, some more specialist tools are described as well. All tools and resources described, including their analytical and computational platform dependencies, are summarized in an overview table. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Wildlife tracking data management: a new vision.
Urbano, Ferdinando; Cagnacci, Francesca; Calenge, Clément; Dettki, Holger; Cameron, Alison; Neteler, Markus
2010-07-27
To date, the processing of wildlife location data has relied on a diversity of software and file formats. Data management and the following spatial and statistical analyses were undertaken in multiple steps, involving many time-consuming importing/exporting phases. Recent technological advancements in tracking systems have made large, continuous, high-frequency datasets of wildlife behavioural data available, such as those derived from the global positioning system (GPS) and other animal-attached sensor devices. These data can be further complemented by a wide range of other information about the animals' environment. Management of these large and diverse datasets for modelling animal behaviour and ecology can prove challenging, slowing down analysis and increasing the probability of mistakes in data handling. We address these issues by critically evaluating the requirements for good management of GPS data for wildlife biology. We highlight that dedicated data management tools and expertise are needed. We explore current research in wildlife data management. We suggest a general direction of development, based on a modular software architecture with a spatial database at its core, where interoperability, data model design and integration with remote-sensing data sources play an important role in successful GPS data handling.
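The modular, database-centred architecture the authors advocate can be miniaturized as follows. SQLite stands in here for the spatial database they have in mind, and the schema and sample fixes are invented for illustration:

```python
"""Sketch of the core idea: one authoritative store for GPS fixes instead
of loose files. SQLite stands in for a spatial database; schema invented."""
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE gps_fix (
    animal_id TEXT, acquired_utc TEXT, lon REAL, lat REAL, temp_c REAL)""")
fixes = [("roe_01", "2009-05-01T02:00Z", 11.04, 46.01, 7.5),
         ("roe_01", "2009-05-01T03:00Z", 11.05, 46.02, 7.1)]
conn.executemany("INSERT INTO gps_fix VALUES (?,?,?,?,?)", fixes)
# Analyses query the single store rather than re-importing files each time.
for row in conn.execute(
        "SELECT animal_id, COUNT(*) FROM gps_fix GROUP BY animal_id"):
    print(row)
```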
Sleep: An Open-Source Python Software for Visualization, Analysis, and Staging of Sleep Data
Combrisson, Etienne; Vallat, Raphael; Eichenlaub, Jean-Baptiste; O'Reilly, Christian; Lajnef, Tarek; Guillot, Aymeric; Ruby, Perrine M.; Jerbi, Karim
2017-01-01
We introduce Sleep, a new Python open-source graphical user interface (GUI) dedicated to visualization, scoring and analyses of sleep data. Among its most prominent features are: (1) Dynamic display of polysomnographic data, spectrogram, hypnogram and topographic maps with several customizable parameters, (2) Implementation of several automatic detection of sleep features such as spindles, K-complexes, slow waves, and rapid eye movements (REM), (3) Implementation of practical signal processing tools such as re-referencing or filtering, and (4) Display of main descriptive statistics including publication-ready tables and figures. The software package supports loading and reading raw EEG data from standard file formats such as European Data Format, in addition to a range of commercial data formats. Most importantly, Sleep is built on top of the VisPy library, which provides GPU-based fast and high-level visualization. As a result, it is capable of efficiently handling and displaying large sleep datasets. Sleep is freely available (http://visbrain.org/sleep) and comes with sample datasets and an extensive documentation. Novel functionalities will continue to be added and open-science community efforts are expected to enhance the capacities of this module. PMID:28983246
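A minimal launch of the tool, based on its public documentation, might look like the sketch below; the import path may differ between visbrain versions, and the file name is hypothetical:

```python
"""Minimal launch sketch based on the Sleep documentation; the exact
import path may vary between visbrain versions."""
from visbrain.gui import Sleep

# Opens the GUI on an EDF recording; hypnogram and montage are optional.
Sleep(data="subject01.edf").show()  # hypothetical file name
```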
Li, Yajie; Zhao, Yongli; Zhang, Jie; Yu, Xiaosong; Jing, Ruiquan
2017-11-27
Network operators generally provide dedicated lightpaths for customers to meet the demand for high-quality transmission. Considering the variation of traffic load, customers usually rent peak bandwidth that exceeds their practical average traffic requirement. In this case, bandwidth provisioning is unmetered and customers have to pay according to peak bandwidth. If network operators could keep track of traffic load and allocate bandwidth dynamically, bandwidth could be provided as a metered service and customers would pay only for the bandwidth that they actually use. To achieve cost-effective bandwidth provisioning, this paper proposes an autonomic bandwidth adjustment scheme based on data analysis of traffic load. The scheme is implemented in a software defined networking (SDN) controller and is demonstrated in a field trial on multi-vendor optical transport networks. The field trial shows that the proposed scheme can track traffic load and realize autonomic bandwidth adjustment. In addition, a simulation experiment is conducted to evaluate the performance of the proposed scheme. We also investigate the impact of different parameters on autonomic bandwidth adjustment. Simulation results show that the step size and adjustment period have significant influence on bandwidth savings and packet loss. A small step size and adjustment period can bring more benefits by tracking traffic variation with high accuracy. For network operators, the scheme can serve as technical support for realizing bandwidth as a metered service in the future.
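The step-size and adjustment-period trade-off the simulation explores can be captured in a toy control loop (not the authors' SDN implementation; all parameter values below are illustrative):

```python
"""Toy version of a periodic bandwidth-adjustment loop: each period, move
the allocation toward the observed load in fixed steps, keeping headroom
to limit packet loss. Parameter values are illustrative only."""
STEP_MBPS = 50          # step size
HEADROOM = 1.2          # allocate 20% above the observed load

def next_allocation(current_mbps: float, observed_load_mbps: float) -> float:
    target = observed_load_mbps * HEADROOM
    if target > current_mbps:
        return current_mbps + STEP_MBPS      # scale up toward the target
    if target < current_mbps - STEP_MBPS:
        return current_mbps - STEP_MBPS      # release unused bandwidth
    return current_mbps                      # within one step: hold

alloc = 400.0
for load in [300, 520, 610, 350, 200]:       # one sample per adjustment period
    alloc = next_allocation(alloc, load)
    print(f"load={load:4} Mb/s -> allocated={alloc:.0f} Mb/s")
```

A smaller step and shorter period track the load more closely (fewer loss events, less stranded bandwidth) at the cost of more frequent reconfiguration, which is the trade-off the paper quantifies.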
Concept of a programmable maintenance processor applicable to multiprocessing systems
NASA Technical Reports Server (NTRS)
Glover, Richard D.
1988-01-01
A programmable maintenance processor concept applicable to multiprocessing systems has been developed at the NASA Ames Research Center's Dryden Flight Research Facility. This stand-alone processor is intended to provide support for system and application software testing as well as hardware diagnostics. An initial mechanization has been incorporated into the extended aircraft interrogation and display system (XAIDS), which is multiprocessing general-purpose ground support equipment. The XAIDS maintenance processor has independent terminal and printer interfaces and a dedicated magnetic bubble memory that stores system test sequences entered from the terminal. This report describes the hardware and software embodied in this processor and shows a typical application in the check-out of a new XAIDS.
Power System Simulations For The Globalstar2 Mission Using The PowerCap Software
NASA Astrophysics Data System (ADS)
Defoug, S.; Pin, R.
2011-10-01
The Globalstar system aims to enable customers to communicate all around the world thanks to its constellation of 48 LEO satellites. Thales Alenia Space is in charge of the design and manufacturing of the second generation of Globalstar satellites. For such a long-duration mission (15 years), and with payload power consumption varying incessantly, the optimization of the solar arrays and battery has to be consolidated by an accurate power simulation tool. After a general overview of the Globalstar power system and of the PowerCap software, this paper presents the dedicated version elaborated for the Globalstar2 mission, the simulation results and their correlation with the tests.
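The core bookkeeping such a power simulator must perform, stepping the battery state of charge against paired solar-array and load profiles, can be sketched as follows; PowerCap itself is proprietary, and every number here is invented:

```python
"""Schematic orbit power-balance trace of the kind such a tool must
compute; all values are invented and no PowerCap internals are implied."""
def battery_trace(p_solar_w, p_load_w, capacity_wh, soc_wh, dt_h=0.1):
    """Step the battery state of charge over paired solar/load profiles."""
    trace = []
    for p_sun, p_use in zip(p_solar_w, p_load_w):
        soc_wh += (p_sun - p_use) * dt_h          # charge or discharge
        soc_wh = min(max(soc_wh, 0.0), capacity_wh)
        trace.append(soc_wh)
    return trace

sun = [1200] * 6 + [0] * 4      # sunlight then eclipse, 0.1 h steps
load = [900] * 10               # payload draw varies in the real mission
print(battery_trace(sun, load, capacity_wh=1500, soc_wh=1400))
```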
A multiarchitecture parallel-processing development environment
NASA Technical Reports Server (NTRS)
Townsend, Scott; Blech, Richard; Cole, Gary
1993-01-01
A description is given of the hardware and software of a multiprocessor test bed - the second generation Hypercluster system. The Hypercluster architecture consists of a standard hypercube distributed-memory topology, with multiprocessor shared-memory nodes. By using standard, off-the-shelf hardware, the system can be upgraded to use rapidly improving computer technology. The Hypercluster's multiarchitecture nature makes it suitable for researching parallel algorithms in computational field simulation applications (e.g., computational fluid dynamics). The dedicated test-bed environment of the Hypercluster and its custom-built software allows experiments with various parallel-processing concepts such as message passing algorithms, debugging tools, and computational 'steering'. Such research would be difficult, if not impossible, to achieve on shared, commercial systems.
NASA Technical Reports Server (NTRS)
Lederer, S. M.; Hickson, P.; Cowardin, H. M.; Buckalew, B.; Frith, J.; Alliss, R.
2017-01-01
In June 2015, the construction of the Meter Class Autonomous Telescope was completed and MCAT saw the light of the stars for the first time. In 2017, MCAT was newly dedicated as the Eugene Stansbery-MCAT telescope by NASA's Orbital Debris Program Office (ODPO), in honor of his inspiration and dedication to this newest optical member of the NASA ODPO. Since that time, MCAT has viewed the skies with one engineering camera and two scientific cameras, and the ODPO optical team has begun the process of vetting the entire system. The full system vetting includes verification and validation of: (1) the hardware comprising the system (e.g. the telescopes and its instruments, the dome, weather systems, all-sky camera, FLIR cloud infrared camera, etc.), (2) the custom-written Observatory Control System (OCS) master software designed to autonomously control this complex system of instruments, each with its own control software, and (3) the custom written Orbital Debris Processing software for post-processing the data. ES-MCAT is now capable of autonomous observing to include Geosynchronous survey, TLE (Two-line element) tracking of individual catalogued debris at all orbital regimes (Low-Earth Orbit all the way to Geosynchronous (GEO) orbit), tracking at specified non-sidereal rates, as well as sidereal rates for proper calibration with standard stars. Ultimately, the data will be used for validation of NASA's Orbital Debris Engineering Model, ORDEM, which aids in engineering designs of spacecraft that require knowledge of the orbital debris environment and long-term risks for collisions with Resident Space Objects (RSOs).
A usability evaluation of medical software at an expert conference setting.
Bond, Raymond Robert; Finlay, Dewar D; Nugent, Chris D; Moore, George; Guldenring, Daniel
2014-01-01
A usability test was employed to evaluate two medical software applications at an expert conference setting. One software application is a medical diagnostic tool (electrocardiogram [ECG] viewer) and the other is a medical research tool (electrode misplacement simulator [EMS]). These novel applications have yet to be adopted by the healthcare domain, thus, (1) we wanted to determine the potential user acceptance of these applications and (2) we wanted to determine the feasibility of evaluating medical diagnostic and medical research software at a conference setting as opposed to the conventional laboratory setting. The medical diagnostic tool (ECG viewer) was evaluated using seven delegates and the medical research tool (EMS) was evaluated using 17 delegates that were recruited at the 2010 International Conference on Computing in Cardiology. Each delegate/participant was required to use the software and undertake a set of predefined tasks during the session breaks at the conference. User interactions with the software were recorded using screen-recording software. The 'think-aloud' protocol was also used to elicit verbal feedback from the participants whilst they attempted the pre-defined tasks. Before and after each session, participants completed a pre-test and a post-test questionnaire respectively. The average duration of a usability session at the conference was 34.69 min (SD=10.28). However, taking into account that 10 min was dedicated to the pre-test and post-test questionnaires, the average time dedication to user interaction of the medical software was 24.69 min (SD=10.28). Given we have shown that usability data can be collected at conferences, this paper details the advantages of conference-based usability studies over the laboratory-based approach. For example, given delegates gather at one geographical location, a conference-based usability evaluation facilitates recruitment of a convenient sample of international subject experts. This would otherwise be very expensive to arrange. A conference-based approach also allows for data to be collected over a few days as opposed to months by avoiding administration duties normally involved in laboratory based approach, e.g. mailing invitation letters as part of a recruitment campaign. Following analysis of the user video recordings, 41 (previously unknown) use errors were identified in the advanced ECG viewer and 29 were identified in the EMS application. All use errors were given a consensus severity rating from two independent usability experts. Out of a rating scale of 4 (where 1=cosmetic and 4=critical), the average severity rating for the ECG viewer was 2.24 (SD=1.09) and the average severity rating for the EMS application was 2.34 (SD=0.97). We were also able to extract task completion rates and times from the video recordings to determine the effectiveness of the software applications. For example, six out of seven tasks were completed by all participants when using both applications. This statistic alone suggests both applications already have a high degree of usability. As well as extracting data from the video recordings, we were also able to extract data from the questionnaires. Using a semantic differential scale (where 1=poor and 5=excellent), delegates highly rated the 'responsiveness', 'usefulness', 'learnability' and the 'look and feel' of both applications. This study has shown the potential user acceptance and user-friendliness of the novel EMS and the ECG viewer applications within the healthcare domain. 
It has also shown that both medical diagnostic software and medical research software can be evaluated for their usability at an expert conference setting. The primary advantage of a conference-based usability evaluation over a laboratory-based evaluation is the high concentration of experts at one location, which is convenient, less time-consuming and less expensive. Copyright © 2013 Elsevier Ireland Ltd. All rights reserved.
Tuominen, Vilppu J; Ruotoistenmäki, Sanna; Viitanen, Arttu; Jumppanen, Mervi; Isola, Jorma
2010-01-01
Accurate assessment of estrogen receptor (ER), progesterone receptor (PR), and Ki-67 is essential in the histopathologic diagnostics of breast cancer. Commercially available image analysis systems are usually bundled with dedicated analysis hardware and, to our knowledge, no easily installable, free software for immunostained slide scoring has been described. In this study, we describe a free, Internet-based web application for quantitative image analysis of ER, PR, and Ki-67 immunohistochemistry in breast cancer tissue sections. The application, named ImmunoRatio, calculates the percentage of positively stained nuclear area (labeling index) by using a color deconvolution algorithm for separating the staining components (diaminobenzidine and hematoxylin) and adaptive thresholding for nuclear area segmentation. ImmunoRatio was calibrated using cell counts defined visually as the gold standard (training set, n = 50). Validation was done using a separate set of 50 ER, PR, and Ki-67 stained slides (test set, n = 50). In addition, Ki-67 labeling indexes determined by ImmunoRatio were studied for their prognostic value in a retrospective cohort of 123 breast cancer patients. The labeling indexes by calibrated ImmunoRatio analyses correlated well with those defined visually in the test set (correlation coefficient r = 0.98). Using the median Ki-67 labeling index (20%) as a cutoff, a hazard ratio of 2.2 was obtained in the survival analysis (n = 123, P = 0.01). ImmunoRatio was shown to adapt to various staining protocols, microscope setups, digital camera models, and image acquisition settings. The application can be used directly with web browsers running on modern operating systems (e.g., Microsoft Windows, Linux distributions, and Mac OS). No software downloads or installations are required. ImmunoRatio is open source software, and the web application is publicly accessible on our website. We anticipate that free web applications, such as ImmunoRatio, will make the quantitative image analysis of ER, PR, and Ki-67 easy and straightforward in the diagnostic assessment of breast cancer specimens.
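The described pipeline, colour deconvolution followed by thresholding and an area ratio, can be approximated in a few lines of scikit-image. This sketch uses plain Otsu thresholding rather than the paper's adaptive scheme and calibration, so it is an illustration of the approach, not ImmunoRatio:

```python
"""Approximation of a labeling-index computation: separate DAB from
hematoxylin by colour deconvolution, threshold both, and report the
DAB-positive fraction of nuclear area. Not ImmunoRatio's calibrated code."""
import numpy as np
from skimage.color import rgb2hed
from skimage.filters import threshold_otsu

def labeling_index(rgb: np.ndarray) -> float:
    hed = rgb2hed(rgb)                    # channels: hematoxylin, eosin, DAB
    hema, dab = hed[..., 0], hed[..., 2]
    dab_mask = dab > threshold_otsu(dab)
    nuc_mask = dab_mask | (hema > threshold_otsu(hema))
    return 100.0 * dab_mask.sum() / max(nuc_mask.sum(), 1)

rgb = np.random.rand(64, 64, 3)           # stand-in for a stained tissue image
print(f"labeling index: {labeling_index(rgb):.1f}%")
```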
Reliability of a Single Light Source Purkinjemeter in Pseudophakic Eyes.
Janunts, Edgar; Chashchina, Ekaterina; Seitz, Berthold; Schaeffel, Frank; Langenbucher, Achim
2015-08-01
To study the reliability of Purkinje image analysis for assessment of intraocular lens tilt and decentration in pseudophakic eyes. The study comprised 64 eyes of 39 patients. All eyes underwent phacoemulsification with an intraocular lens implanted in the capsular bag. Lens decentration and tilt were measured multiple times by an infrared Purkinjemeter. A total of 396 measurements were performed 1 week and 1 month postoperatively. Lens tilt (Tx, Ty) and decentration (Dx, Dy) in the horizontal and vertical directions, respectively, were calculated by dedicated software based on regression analysis for each measurement using only four images; afterward, the data were averaged (mean values, MV) over each repeated sequence of measurements. We designed new software to recalculate the lens misalignment parameters offline, using the complete set of Purkinje images obtained through the repeated measurements (9 to 15 Purkinje images) (recalculated values, MV'). MV and MV' were compared using the SPSS statistical software package. MV and MV' were found to be highly correlated for the Tx and Ty parameters (R2 > 0.9; p < 0.001), moderately correlated for the Dx parameter (R2 > 0.7; p < 0.001), and weakly correlated for the Dy parameter (R2 = 0.23; p < 0.05). Reliability was high (Cronbach α > 0.9) for all measured parameters. Standard deviation values were 0.86 ± 0.69 degrees, 0.72 ± 0.65 degrees, 0.04 ± 0.05 mm, and 0.23 ± 0.34 mm for Tx, Ty, Dx, and Dy, respectively. The Purkinjemeter demonstrated high reliability and reproducibility for lens misalignment parameters. To further improve reliability, we recommend capturing at least six Purkinje images instead of three.
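The reliability figure quoted above is Cronbach's alpha. A generic computation for one misalignment parameter, assuming the repeated readings are arranged as a repetitions-by-eyes matrix, is sketched below; this is the standard formula, not the authors' SPSS workflow.

    import numpy as np

    def cronbach_alpha(x):
        # x: (k repetitions) x (n eyes) matrix of one parameter, e.g. Tx.
        x = np.asarray(x, dtype=float)
        k = x.shape[0]
        item_var = x.var(axis=1, ddof=1).sum()   # variance of each repetition
        total_var = x.sum(axis=0).var(ddof=1)    # variance of per-eye sums
        return k / (k - 1.0) * (1.0 - item_var / total_var)

    reps = [[1.2, 0.8, 1.0],
            [1.1, 0.9, 1.1],
            [1.3, 0.7, 1.0]]   # 3 repetitions x 3 eyes (made-up values)
    print(f"alpha = {cronbach_alpha(reps):.2f}")   # ~0.91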
Bi-Force: large-scale bicluster editing and its application to gene expression data biclustering
Sun, Peng; Speicher, Nora K.; Röttger, Richard; Guo, Jiong; Baumbach, Jan
2014-01-01
The explosion of biological data has dramatically reformed today's biological research. The need to integrate and analyze high-dimensional biological data on a large scale is driving the development of novel bioinformatics approaches. Biclustering, also known as 'simultaneous clustering' or 'co-clustering', has been successfully utilized to discover local patterns in gene expression data and similar biomedical data types. Here, we contribute a new heuristic, 'Bi-Force', based on the weighted bicluster editing model, which performs biclustering on arbitrary sets of biological entities, given any kind of pairwise similarities. We first evaluated the power of Bi-Force to solve dedicated bicluster editing problems by comparing Bi-Force with two existing algorithms in the BiCluE software package. We then followed the biclustering evaluation protocol of a recent review paper by Eren et al. (2013) (A comparative analysis of biclustering algorithms for gene expression data. Brief. Bioinform., 14:279–292.) and compared Bi-Force against eight existing tools: FABIA, QUBIC, Cheng and Church, Plaid, BiMax, Spectral, xMOTIFs and ISA. To this end, a suite of synthetic datasets as well as nine large gene expression datasets from Gene Expression Omnibus were analyzed. All resulting biclusters were subsequently investigated by Gene Ontology enrichment analysis to evaluate their biological relevance. The distinct theoretical foundation of Bi-Force (bicluster editing) is more powerful than strict biclustering. Bi-Force thus outperformed existing tools, at least when following the evaluation protocols of Eren et al. Bi-Force is implemented in Java and integrated into the open source software package BiCluE. The software as well as all used datasets are publicly available at http://biclue.mpi-inf.mpg.de. PMID:24682815
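For readers unfamiliar with the bicluster editing model underlying Bi-Force, the sketch below evaluates its objective on a toy instance: positive similarities are edges that cost their weight to delete, negative similarities are non-edges that cost their absolute weight to add, and a solution assigns rows and columns to disjoint biclusters. This is only the cost function, not Bi-Force's heuristic search.

    import numpy as np

    def editing_cost(sim, row_labels, col_labels):
        # Cost of editing a weighted bipartite similarity matrix into
        # disjoint bicliques: delete present edges (sim > 0) that cross
        # biclusters, add missing edges (sim < 0) inside biclusters.
        sim = np.asarray(sim, dtype=float)
        cost = 0.0
        for i in range(sim.shape[0]):
            for j in range(sim.shape[1]):
                inside = row_labels[i] == col_labels[j]
                if inside and sim[i, j] < 0:
                    cost += -sim[i, j]      # pay to add a missing edge
                elif not inside and sim[i, j] > 0:
                    cost += sim[i, j]       # pay to delete a present edge
        return cost

    sim = np.array([[ 2.0, -1.0],
                    [-0.5,  3.0]])
    print(editing_cost(sim, row_labels=[0, 1], col_labels=[0, 1]))  # 0.0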
NASA Technical Reports Server (NTRS)
Defeo, P.; Doane, D.; Saito, J.
1982-01-01
A Digital Flight Control Systems Verification Laboratory (DFCSVL) has been established at NASA Ames Research Center. This report describes the major elements of the laboratory, the research activities that can be supported in the area of verification and validation of digital flight control systems (DFCS), and the operating scenarios within which these activities can be carried out. The DFCSVL consists of a palletized dual-dual flight-control system linked to a dedicated PDP-11/60 processor. Major software support programs are hosted in a remotely located UNIVAC 1100 accessible from the PDP-11/60 through a modem link. Important features of the DFCSVL include extensive hardware and software fault insertion capabilities, a real-time closed loop environment to exercise the DFCS, an integrated set of software verification tools, and a user-oriented interface to all the resources and capabilities.
A quantitative reconstruction software suite for SPECT imaging
NASA Astrophysics Data System (ADS)
Namías, Mauro; Jeraj, Robert
2017-11-01
Quantitative Single Photon Emission Computed Tomography (SPECT) imaging allows for the measurement of activity concentrations of a given radiotracer in vivo. Although SPECT has usually been perceived as non-quantitative by the medical community, the introduction of accurate CT-based attenuation correction and scatter correction in hybrid SPECT/CT scanners has enabled SPECT systems to be as quantitative as Positron Emission Tomography (PET) systems. We implemented a software suite to reconstruct quantitative SPECT images from hybrid or dedicated SPECT systems with a separate CT scanner. Attenuation, scatter and collimator response corrections were included in an Ordered Subset Expectation Maximization (OSEM) algorithm. A novel scatter fraction estimation technique was introduced. The SPECT/CT system was calibrated with a cylindrical phantom, and quantitative accuracy was assessed with an anthropomorphic phantom and a NEMA/IEC image quality phantom. Accurate activity measurements were achieved at an organ level. This software suite helps increase the quantitative accuracy of SPECT scanners.
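The OSEM update used in such suites has a compact matrix form; the toy Python sketch below shows it, with the caveat that in a real quantitative pipeline the attenuation, scatter and collimator-response corrections are folded into the system matrix and projection model rather than the bare matrix used here.

    import numpy as np

    def osem(y, A, n_subsets=8, n_iter=4):
        # Toy OSEM: y = measured projections, A = system matrix
        # (rows: projection bins, columns: image voxels).
        y = np.asarray(y, dtype=float)
        x = np.ones(A.shape[1])                     # uniform initial image
        subsets = np.array_split(np.arange(A.shape[0]), n_subsets)
        for _ in range(n_iter):
            for s in subsets:
                fwd = A[s] @ x                      # forward projection
                ratio = np.divide(y[s], fwd, out=np.zeros_like(fwd),
                                  where=fwd > 0)
                sens = A[s].sum(axis=0)             # subset sensitivity
                update = np.divide(A[s].T @ ratio, sens,
                                   out=np.zeros_like(x), where=sens > 0)
                x *= update
        return x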
Dynamic provisioning of a HEP computing infrastructure on a shared hybrid HPC system
NASA Astrophysics Data System (ADS)
Meier, Konrad; Fleig, Georg; Hauth, Thomas; Janczyk, Michael; Quast, Günter; von Suchodoletz, Dirk; Wiebelt, Bernd
2016-10-01
Experiments in high-energy physics (HEP) rely on elaborate hardware, software and computing systems to sustain the high data rates necessary to study rare physics processes. The Institut für Experimentelle Kernphysik (EKP) at KIT is a member of the CMS and Belle II experiments, located at the LHC and the Super-KEKB accelerators, respectively. These detectors share the requirement that enormous amounts of measurement data must be processed and analyzed, and that a comparable amount of simulated events is required to compare experimental results with theoretical predictions. Classical HEP computing centers are dedicated sites which support multiple experiments and have the required software pre-installed. Nowadays, funding agencies encourage research groups to participate in shared HPC cluster models, where scientists from different domains use the same hardware to increase synergies. This shared usage proves to be challenging for HEP groups, due to their specialized software setup which includes a custom OS (often Scientific Linux), libraries and applications. To overcome this hurdle, the EKP and the data center team of the University of Freiburg have developed a system to enable the HEP use case on a shared HPC cluster. To achieve this, an OpenStack-based virtualization layer is installed on top of a bare-metal cluster. While other user groups can run their batch jobs via the Moab workload manager directly on bare metal, HEP users can request virtual machines with a specialized machine image which contains a dedicated operating system and software stack. In contrast to similar installations, in this hybrid setup no static partitioning of the cluster into a physical and a virtualized segment is required. As a unique feature, the placement of the virtual machines on the cluster nodes is scheduled by Moab, and the job lifetime is coupled to the lifetime of the virtual machine. This allows for a seamless integration with the jobs sent by other user groups and honors the fairshare policies of the cluster. The developed thin integration layer between OpenStack and Moab can be adapted to other batch servers and virtualization systems, making the concept applicable for other cluster operators as well. This contribution reports on the concept and implementation of an OpenStack-virtualized cluster used for HEP workflows. While the full cluster will be installed in spring 2016, a test-bed setup with 800 cores has been used to study the overall system performance, and dedicated HEP jobs were run in a virtualized environment over many weeks. Furthermore, the dynamic integration of the virtualized worker nodes, depending on the workload at the institute's computing system, is described.
VLT instruments: industrial solutions for non-scientific detector systems
NASA Astrophysics Data System (ADS)
Duhoux, P.; Knudstrup, J.; Lilley, P.; Di Marcantonio, P.; Cirami, R.; Mannetta, M.
2014-07-01
Recent improvements in industrial vision technology and products, together with the increasing need for high performance, cost efficient technical detectors for astronomical instrumentation, have led ESO, with the contribution of INAF, to evaluate this trend and elaborate ad-hoc solutions which are interoperable and compatible with the evolution of VLT standards. The ESPRESSO spectrograph shall be the first instrument deploying this technology. ESO's Technical CCD (hereafter TCCD) requirements are extensive and demanding. A lightweight, low maintenance, rugged and high performance TCCD camera product or family of products is required which can operate in the extreme environmental conditions present at ESO's observatories with minimum maintenance and minimal downtime. In addition the camera solution needs to be interchangeable between different technical roles, e.g. slit viewing, pupil and field stabilization, with excellent performance characteristics under a wide range of observing conditions together with ease of use for the end user. Interoperability is enhanced by conformance to recognized electrical, mechanical and software standards. Technical requirements and evaluation criteria for the TCCD solution are discussed in more detail. A software architecture has been adopted which facilitates easy integration with TCCDs from different vendors. The communication with the devices is implemented by means of dedicated adapters allowing usage of the same core framework (business logic). Preference has been given to cameras with an Ethernet interface, using standard TCP/IP based communication. While the preferred protocol is the industrial standard GigE Vision, not all vendors supply cameras with this interface, hence proprietary socket-based protocols are also acceptable with the provision of a validated Linux compliant API. A fundamental requirement of the TCCD software is that it shall allow for a seamless integration with the existing VLT software framework. ESPRESSO is a fiber-fed, cross-dispersed echelle spectrograph that will be located in the Combined-Coudé Laboratory of the VLT in the Paranal Observatory in Chile. It will be able to operate either using the light of any of the UTs or using the incoherently combined light of up to four UTs. The stabilization of the incoming beam is achieved by dedicated piezo systems controlled via active loops closed on 4 + 4 dedicated TCCDs for the stabilization of the pupil image and of the field, with a frequency goal of 3 Hz on a 2nd to 3rd magnitude star. An additional 9th TCCD system shall be used as an exposure meter. In this paper we will present the technical CCD solution for future VLT instruments.
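The adapter approach mentioned above (one core framework, one thin protocol adapter per vendor) can be sketched in a few lines; the class and method names below are illustrative, not the VLT software's actual interfaces.

    from abc import ABC, abstractmethod

    class CameraAdapter(ABC):
        # Vendor-neutral interface the core framework (business logic)
        # programs against; one concrete adapter per camera protocol.
        @abstractmethod
        def connect(self, host: str) -> None: ...
        @abstractmethod
        def expose(self, seconds: float) -> bytes: ...

    class GigEVisionAdapter(CameraAdapter):
        def connect(self, host):
            # A real adapter would perform the GigE Vision control
            # handshake here; this sketch only records the endpoint.
            self.host = host
        def expose(self, seconds):
            # A real adapter would trigger the camera and stream back
            # a frame; return an empty buffer in this sketch.
            return bytes()

    cam = GigEVisionAdapter()
    cam.connect("192.168.0.42")   # hypothetical camera address
    frame = cam.expose(0.05)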
Detecting planets in Kepler lightcurves using methods developed for CoRoT.
NASA Astrophysics Data System (ADS)
Grziwa, S.; Korth, J.; Pätzold, M.
2011-10-01
Launched in March 2009, Kepler is the second space telescope dedicated to the search for extrasolar planets. NASA released 150,000 lightcurves to the public in 2010 and announced that Kepler had found 1,235 candidates. The Rhenish Institute for Environmental Research (RIU-PF) is one of the detection groups of the CoRoT space mission. RIU-PF developed the software package EXOTRANS for the detection of transits in stellar lightcurves. EXOTRANS is designed for the fast automated processing of huge amounts of data and was easily adapted to the analysis of Kepler lightcurves. The use of different techniques and philosophies helps to find more candidates and to rule out others. We present the analysis of the Kepler lightcurves with EXOTRANS. Results of our filter (trend, harmonic) and detection (dcBLS) techniques are compared with the techniques used by Kepler (PDC, TPS). The different approaches to ruling out false positives are discussed, and additional candidates found by EXOTRANS are presented.
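The detection step rests on box least squares (BLS) searches; dcBLS is RIU-PF's variant, but the standard algorithm is available in astropy, and a minimal run on a synthetic lightcurve looks like this (all numbers are invented for illustration).

    import numpy as np
    from astropy.timeseries import BoxLeastSquares

    # Synthetic detrended lightcurve with box-shaped dips every 3 days.
    rng = np.random.default_rng(0)
    t = np.arange(0.0, 90.0, 0.02)                     # time in days
    y = 1.0 + 1e-4 * rng.standard_normal(t.size)
    y[(t % 3.0) < 0.1] -= 5e-4                         # transit depth 500 ppm

    bls = BoxLeastSquares(t, y)
    result = bls.autopower(0.1)                        # trial duration: 0.1 d
    best = np.argmax(result.power)
    print(f"detected period: {result.period[best]:.3f} d")   # ~3.0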
NASA Astrophysics Data System (ADS)
Frolova, M. A.; Razumova, T. A.
2017-01-01
This article is dedicated to the analysis of business processes in a comprehensive educational institution on the basis of the process approach. Decomposition of the processes under study is carried out by means of the IDEF0 methodology; both the basic mechanisms and control actions are determined, and AS-IS diagrams for the documentation support of educational service provision are developed. Disadvantages of the existing business processes are revealed on the basis of these diagrams, and a solution is proposed that increases the efficiency of labor resource use. The results of implementing the solution, which relies on software as its means, are presented as TO-BE diagrams. The analysis carried out on the basis of the diagrams led to the conclusion that the formation of the test-task database used to prepare students for the State Final Examination needs to be automated.
MRI-based dynamic tracking of an untethered ferromagnetic microcapsule navigating in liquid
NASA Astrophysics Data System (ADS)
Dahmen, Christian; Belharet, Karim; Folio, David; Ferreira, Antoine; Fatikow, Sergej
2016-04-01
The propulsion of ferromagnetic objects by means of MRI gradients is a promising approach to enable new forms of therapy. In this work, the techniques necessary to make this approach work are presented. These include path-planning algorithms working on MRI data, ferromagnetic artifact imaging, a tracking algorithm that delivers position feedback for the ferromagnetic objects, and a propulsion sequence that enables interleaved magnetic propulsion and imaging. Using a dedicated software environment integrating the path-planning methods and real-time tracking, a clinical MRI system is adapted to provide this new functionality for controlled interventional targeted therapeutic applications. Based on an analysis of MRI sensing, this article proposes a framework for planning a robust pathway that enhances the ability to navigate to deep locations in the human body. The proposed approaches are validated in different experiments.
Test Analysis Tools to Ensure Higher Quality of On-Board Real Time Software for Space Applications
NASA Astrophysics Data System (ADS)
Boudillet, O.; Mescam, J.-C.; Dalemagne, D.
2008-08-01
EADS Astrium Space Transportation, at its Les Mureaux premises, is responsible for the onboard SW of the French M51 nuclear deterrent missile. It has also developed over 1 million lines of code, mostly in ADA, for the Automated Transfer Vehicle (ATV) onboard SW and for the flight control SW of the ARIANE5 launcher that put the ATV into orbit. As part of the ATV SW, ASTRIUM ST developed the first Category A SW ever qualified for a European space application. To ensure that all this embedded SW is developed to the highest quality and reliability level, specific development tools have been designed to cover source code verification, automated validation testing and complete target instruction coverage verification. Three such dedicated tools are presented here.
Scalable Technology for a New Generation of Collaborative Applications
2007-04-01
pyZELDA: Python code for Zernike wavefront sensors
NASA Astrophysics Data System (ADS)
Vigan, A.; N'Diaye, M.
2018-06-01
pyZELDA analyzes data from Zernike wavefront sensors dedicated to high-contrast imaging applications. This modular software was originally designed to analyze data from the ZELDA wavefront sensor prototype installed in VLT/SPHERE; simple configuration files allow it to be extended to support several other instruments and testbeds. pyZELDA also includes simple simulation tools to measure the theoretical sensitivity of a sensor and to compare it to other sensors.
An algorithm for deriving core magnetic field models from the Swarm data set
NASA Astrophysics Data System (ADS)
Rother, Martin; Lesur, Vincent; Schachtschneider, Reyko
2013-11-01
In view of an optimal exploitation of the Swarm data set, we have prepared and tested software dedicated to the determination of accurate core magnetic field models and of the Euler angles between the magnetic sensors and the satellite reference frame. The dedicated core field model estimation is derived directly from the GFZ Reference Internal Magnetic Model (GRIMM) inversion and modeling family. The data selection techniques and the model parameterizations are similar to those used for the derivation of the second (Lesur et al., 2010) and third versions of GRIMM, although the usage of observatory data is not planned in the framework of the application to Swarm. The regularization technique applied during the inversion process smooths the magnetic field model in time. The algorithm to estimate the Euler angles is also derived from the CHAMP studies. The inversion scheme includes Euler angle determination with a quaternion representation for describing the rotations. It has been built to handle possible weak time variations of these angles. The modeling approach and software were initially validated on a simple, noise-free, synthetic data set and on CHAMP vector magnetic field measurements. We present results of test runs applied to the synthetic Swarm test data set.
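A quaternion representation avoids the singularities of direct Euler-angle parameterizations. The sketch below shows the standard rotation of a vector by a unit quaternion, which is the core operation such an inversion scheme builds on (a generic formula, not the GRIMM code).

    import numpy as np

    def quat_rotate(q, v):
        # Rotate vector v by unit quaternion q = (w, x, y, z) using
        # v' = v + 2w (u x v) + 2 u x (u x v), with u = (x, y, z).
        w, x, y, z = q
        u = np.array([x, y, z])
        uv = np.cross(u, v)
        return np.asarray(v) + 2.0 * w * uv + 2.0 * np.cross(u, uv)

    # Example: a 90 degree rotation about z maps the x-axis to the y-axis.
    angle = np.pi / 2
    q = np.array([np.cos(angle / 2), 0.0, 0.0, np.sin(angle / 2)])
    print(quat_rotate(q, [1.0, 0.0, 0.0]))   # ~ [0, 1, 0]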
NASA Technical Reports Server (NTRS)
Trevino, Luis; Patterson, Jonathan; Teare, David; Johnson, Stephen
2015-01-01
The engineering development of the new Space Launch System (SLS) launch vehicle requires cross-discipline teams with extensive knowledge of launch vehicle subsystems, information theory, and autonomous algorithms dealing with all operations from pre-launch through on-orbit operations. The characteristics of these spacecraft systems must be matched with the autonomous algorithm monitoring and mitigation capabilities for accurate control and response to abnormal conditions throughout all vehicle mission flight phases, including precipitating safing actions and crew aborts. This presents a large and complex systems engineering challenge, which is being addressed in part by focusing on the specific subsystems involved in the handling of off-nominal mission events and fault tolerance with response management. Using traditional model-based system and software engineering design principles from the Unified Modeling Language (UML) and Systems Modeling Language (SysML), the Mission and Fault Management (M&FM) algorithms for the vehicle are crafted and vetted in specialized Integrated Development Teams (IDTs) composed of multiple development disciplines such as Systems Engineering (SE), Flight Software (FSW), Safety and Mission Assurance (S&MA) and the major subsystems and vehicle elements such as Main Propulsion Systems (MPS), boosters, avionics, Guidance, Navigation, and Control (GNC), Thrust Vector Control (TVC), and liquid engines. These model-based algorithms and their development lifecycle from inception through Flight Software certification are an important focus of this development effort to further ensure reliable detection of and response to off-nominal vehicle states during all phases of vehicle operation from pre-launch through end of flight. NASA formed a dedicated M&FM team to address fault management early in the development lifecycle of the SLS initiative. As part of the development of the M&FM capabilities, this team has developed a dedicated testbed that integrates specific M&FM algorithms, specialized nominal and off-nominal test cases, and vendor-supplied physics-based launch vehicle subsystem models. Additionally, the team has developed processes for implementing and validating these algorithms for concept validation and risk reduction for the SLS program. The flexibility of the Vehicle Management End-to-end Testbed (VMET) enables thorough testing of the M&FM algorithms by providing configurable suites of both nominal and off-nominal test cases to validate the developed algorithms utilizing actual subsystem models such as MPS. The intent of VMET is to validate the M&FM algorithms and substantiate them with performance baselines for each of the target vehicle subsystems in an independent platform exterior to the flight software development infrastructure and its related testing entities. In any software development process there is inherent risk in the interpretation and implementation of concepts into software, through requirements and test cases, into flight software, compounded with potential human errors throughout the development lifecycle. Risk reduction is addressed by the M&FM analysis group working with other organizations such as S&MA, Structures and Environments, GNC, Orion, the Crew Office, Flight Operations, and Ground Operations by assessing performance of the M&FM algorithms in terms of their ability to reduce Loss of Mission and Loss of Crew probabilities.
In addition, through state machine and diagnostic modeling, analysis efforts investigate a broader suite of failure effects and associated detection and responses that can be tested in VMET to ensure that failures can be detected, and to confirm that responses do not create additional risks or cause undesired states through interactive dynamic effects with other algorithms and systems. VMET further contributes to risk reduction by prototyping and exercising the M&FM algorithms early in their implementation and without inherent hindrances such as meeting FSW processor scheduling constraints imposed by their target platform (an ARINC 653 partitioned OS), resource limitations, and other factors related to integration with subsystems not directly involved with M&FM, such as telemetry packing and processing. The baseline plan for use of VMET encompasses testing the original M&FM algorithms coded in the same C++ language and state machine architectural concepts as those used by the Flight Software. This enables the development of performance standards and test cases to characterize the M&FM algorithms and sets a benchmark from which to measure the effectiveness of M&FM algorithm performance in the FSW development and test processes.
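The state-machine style of detection-and-response logic described above can be miniaturized as follows; the modes, events and transitions are invented for illustration and bear no relation to the actual SLS M&FM algorithm set.

    from enum import Enum, auto

    class Mode(Enum):
        NOMINAL = auto()
        SAFING = auto()
        ABORT = auto()

    # Transition table: (current mode, detected condition) -> new mode.
    TRANSITIONS = {
        (Mode.NOMINAL, "sensor_disagree"): Mode.SAFING,
        (Mode.NOMINAL, "engine_out"): Mode.ABORT,
        (Mode.SAFING, "condition_cleared"): Mode.NOMINAL,
        (Mode.SAFING, "engine_out"): Mode.ABORT,
    }

    def step(mode, condition):
        # One detection/response cycle; unknown events keep the mode,
        # so an unmodeled condition can never produce an undefined state.
        return TRANSITIONS.get((mode, condition), mode)

    mode = Mode.NOMINAL
    for event in ["sensor_disagree", "condition_cleared", "engine_out"]:
        mode = step(mode, event)
        print(event, "->", mode.name)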
Design Criteria For Networked Image Analysis System
NASA Astrophysics Data System (ADS)
Reader, Cliff; Nitteberg, Alan
1982-01-01
Image systems design is currently undergoing a metamorphosis from the conventional computing systems of the past into a new generation of special purpose designs. This change is motivated by several factors, notable among which is the increased opportunity for high performance at low cost offered by advances in semiconductor technology. Another key issue is a maturing in the understanding of problems and the applicability of digital processing techniques. These factors allow the design of cost-effective systems that are functionally dedicated to specific applications and used in a utilitarian fashion. Following an overview of the issues stated above, the paper presents a top-down approach to the design of networked image analysis systems. The requirements for such a system are presented, with orientation toward the hospital environment. The three main areas are image data base management, viewing of image data and image data processing. This is followed by a survey of the current state of the art, covering image display systems, data base techniques, communications networks and software systems control. The paper concludes with a description of the functional subsystems and architectural framework for networked image analysis in a production environment.
NASA Astrophysics Data System (ADS)
Bonduà, Stefano; Battistelli, Alfredo; Berry, Paolo; Bortolotti, Villiam; Consonni, Alberto; Cormio, Carlo; Geloni, Claudio; Vasini, Ester Maria
2017-11-01
A full three-dimensional (3D) unstructured grid permits a great degree of flexibility when performing accurate numerical reservoir simulations. However, when the Integral Finite Difference Method (IFDM) is used for spatial discretization, constraints (arising from the required orthogonality between the segment connecting the block nodes and the interface area between blocks) pose difficulties in the creation of grids with irregularly shaped blocks. The full 3D Voronoi approach guarantees that the IFDM constraints are respected and allows generation of grids that conform to geological formations and structural objects while providing higher grid resolution in volumes of interest. In this work, we present dedicated pre- and post-processing gridding software tools for the TOUGH family of numerical reservoir simulators, developed by the Geothermal Research Group of the DICAM Department, University of Bologna. VORO2MESH is a new software tool coded in C++, based on the voro++ library, allowing computation of the 3D Voronoi tessellation for a given domain and the creation of a ready-to-use TOUGH2 MESH file. If a set of geological surfaces is available, the software can directly generate the set of Voronoi seed points used for tessellation. In order to reduce the number of connections and thus decrease computation time, VORO2MESH can produce a mixed grid with regular blocks (orthogonal prisms) and irregular blocks (polyhedral Voronoi blocks) at the point of contact between different geological formations. In order to visualize 3D Voronoi grids together with the results of numerical simulations, the functionality of the TOUGH2Viewer post-processor has been extended. We describe an application of VORO2MESH and TOUGH2Viewer to validate the two tools. The case study deals with the simulation of the migration of gases in deep layered sedimentary formations at basin scale using TOUGH2-TMGAS. A comparison between the simulation performances of unstructured and structured grids is presented.
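For a flavor of the underlying operation (VORO2MESH itself builds on the C++ voro++ library), a 3D Voronoi tessellation can be computed in Python with SciPy. Each ridge between two cells lies on the perpendicular bisector plane of the segment joining their seed nodes, which is exactly the orthogonality the IFDM requires.

    import numpy as np
    from scipy.spatial import Voronoi

    # Seed points (block nodes); in VORO2MESH these would be placed to
    # conform to geological surfaces rather than drawn at random.
    rng = np.random.default_rng(1)
    seeds = rng.uniform(0.0, 100.0, size=(200, 3))

    vor = Voronoi(seeds)
    # Each ridge is the polygonal interface between two neighbouring
    # blocks and is orthogonal to the segment joining their nodes.
    print(f"{len(vor.ridge_points)} block-to-block connections")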
Jourdren, Laurent; Delaveau, Thierry; Marquenet, Emelie; Jacq, Claude; Garcia, Mathilde
2010-07-01
Recent improvements in microscopy technology allow detection of single molecules of RNA, but tools for large-scale automatic analysis of particle distributions are lacking. An increasing number of imaging studies emphasize the importance of mRNA localization in the definition of cell territory or the biogenesis of cell compartments. CORSEN is a new tool dedicated to three-dimensional (3D) distance measurements from imaging experiments, developed especially to measure the minimal distance between RNA molecules and cellular compartment markers. CORSEN includes a 3D segmentation algorithm allowing the extraction and characterization of the cellular objects to be processed (surface determination, aggregate decomposition) for minimal distance calculations. CORSEN's main contribution lies in exploratory statistical analysis, cell population characterization, and high-throughput assays that are made possible by the implementation of batch-process analysis. We highlighted CORSEN's utility for the study of the relative positions of mRNA molecules and mitochondria: CORSEN clearly discriminates mRNAs localized to the vicinity of mitochondria from those that are translated on free cytoplasmic polysomes. Moreover, it quantifies the cell-to-cell variations of mRNA localization and emphasizes the necessity for statistical approaches. This method can be extended to assess the evolution of the distance between specific mRNAs and other cellular structures in different cellular contexts. CORSEN was designed for the biologist community with the concern to provide an easy-to-use and highly flexible tool that can be applied to diverse distance quantification issues.
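The core minimal-distance query CORSEN performs can be expressed with a k-d tree; the sketch below, with invented coordinates, shows the idea rather than CORSEN's own segmentation-driven pipeline.

    import numpy as np
    from scipy.spatial import cKDTree

    def min_distances(rna_spots, marker_voxels):
        # Minimal 3D distance from each RNA spot to the nearest
        # compartment marker (e.g. a mitochondrial surface voxel).
        tree = cKDTree(np.asarray(marker_voxels, dtype=float))
        distances, _ = tree.query(np.asarray(rna_spots, dtype=float))
        return distances

    rng = np.random.default_rng(0)
    spots = rng.uniform(0, 10, size=(500, 3))      # micrometres, made up
    mito = rng.uniform(0, 10, size=(2000, 3))
    print(f"median distance: {np.median(min_distances(spots, mito)):.2f} um")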
Dedicated Stereophotogrammetric X-Ray System For Craniofacial Research And Treatment Planning
NASA Astrophysics Data System (ADS)
Baumrind, Sheldon; Moffitt, Francis; Curry, Sean; Isaacson, Robert J.
1983-07-01
We have constructed and brought into use what we believe to be the first dedicated coplanar craniofacial stereometric x-ray system for clinical use. Paired Machlett Dynamax 50/58 x-ray tubes with 0.3 mm focal spots are employed. Displacement between emitters is 16 inches. The focus film distance for both emitters is 66.5 inches. The mid-sagittal plane to focus distance is 60 inches. One film of each stereo pair conforms with the standards of the Second Roentgenocephalometric Workshop and can be used to make all standard two-dimensional orthodontic and cephalometric measurements. When supplemented by data from the conjugate film, a three-dimensional coordinate map can be generated as a machine operation. Specialized complementary software has been developed to increase the reliability of landmark location both in two and in three dimensions.
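With the geometry given above (16-inch tube separation, 66.5-inch focus-film distance), the height of a landmark above the film follows from its parallax by similar triangles; a small sketch, assuming coplanar tubes and an ideal point landmark.

    def height_from_parallax(p, b=16.0, D=66.5):
        # Height (inches) of a landmark above the film plane, from the
        # parallax p between its conjugate images, for twin tubes
        # separated by b at a common focus-film distance D.
        # Similar triangles give p = b*h/(D - h), so h = p*D/(b + p).
        return p * D / (b + p)

    # A landmark imaged with 1.5 in of parallax sits ~5.7 in off the film:
    print(f"{height_from_parallax(1.5):.2f} in")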
Space Shuttle avionics upgrade - Issues and opportunities
NASA Astrophysics Data System (ADS)
Swaim, Richard A.; Wingert, William B.
An overview is presented of existing Space Shuttle avionics and the possibilities for upgrading the cockpit to reduce costs and increase functionality. The current avionics include five general-purpose computers fitted with multifunction displays, dedicated switches and indicators, and dedicated flight instruments. The operational needs of the Shuttle are reviewed in the light of the avionics and potential upgrades in the form of microprocessors and display systems. The use of better processors can provide hardware support for multitasking and memory management and can reduce the life-cycle cost for software. Some limitations of the current technology are acknowledged, including the Shuttle's power budget and structural configuration. A phased infusion of upgraded avionics is proposed that provides a functionally transparent replacement of crew-interface equipment as well as the addition of interface enhancements and the migration of selected functions.
Yang, Deshan; Brame, Scott; El Naqa, Issam; Aditya, Apte; Wu, Yu; Goddu, S Murty; Mutic, Sasa; Deasy, Joseph O; Low, Daniel A
2011-01-01
Recent years have witnessed tremendous progress in image-guided radiotherapy technology and a growing interest in the possibilities for adapting treatment planning and delivery over the course of treatment. One obstacle faced by the research community has been the lack of a comprehensive open-source software toolkit dedicated to adaptive radiotherapy (ART). To address this need, the authors have developed a software suite called the Deformable Image Registration and Adaptive Radiotherapy Toolkit (DIRART). DIRART is an open-source toolkit developed in MATLAB. It is designed in an object-oriented style with a focus on user-friendliness, features, and flexibility. It contains four classes of DIR algorithms, including the newer inverse-consistency algorithms, which provide consistent displacement vector fields in both directions. It also contains common ART functions, an integrated graphical user interface, a variety of visualization and image-processing features, dose metric analysis functions, and interface routines. These interface routines make DIRART a powerful complement to the Computational Environment for Radiotherapy Research (CERR) and popular image-processing toolkits such as ITK. DIRART provides a set of image processing/registration algorithms and postprocessing functions to facilitate the development and testing of DIR algorithms. It also offers a wide range of options for DIR result visualization, evaluation, and validation. By exchanging data with treatment planning systems via DICOM-RT files and CERR, and by bringing image registration algorithms closer to radiotherapy applications, DIRART is potentially a convenient and flexible platform that may facilitate ART and DIR research. © 2011 American Association of Physicists in Medicine.
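One way to see what inverse consistency buys is to measure the residual of composing the forward and backward displacement fields; the sketch below does this for a 2D field pair and is a generic diagnostic, not DIRART's MATLAB implementation.

    import numpy as np
    from scipy.ndimage import map_coordinates

    def inverse_consistency_error(fwd, bwd):
        # Mean residual |fwd(x) + bwd(x + fwd(x))| for a forward/backward
        # displacement-vector-field pair, each of shape (2, H, W).
        # An inverse-consistent DIR algorithm drives this toward zero.
        _, h, w = fwd.shape
        yy, xx = np.mgrid[0:h, 0:w].astype(float)
        coords = [yy + fwd[0], xx + fwd[1]]        # forward-warped positions
        bwd_at_warped = np.stack(
            [map_coordinates(bwd[c], coords, order=1, mode="nearest")
             for c in range(2)])
        residual = fwd + bwd_at_warped
        return np.mean(np.linalg.norm(residual, axis=0))

    rng = np.random.default_rng(0)
    fwd = 0.5 * rng.standard_normal((2, 32, 32))
    print(inverse_consistency_error(fwd, -fwd))    # small but nonzero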
Sample Analysis at Mars Instrument Simulator
NASA Technical Reports Server (NTRS)
Benna, Mehdi; Nolan, Tom
2013-01-01
The Sample Analysis at Mars Instrument Simulator (SAMSIM) is a numerical model dedicated to planning and validating operations of the Sample Analysis at Mars (SAM) instrument on the surface of Mars. The SAM instrument suite, currently operating on the Mars Science Laboratory (MSL), is an analytical laboratory designed to investigate the chemical and isotopic composition of the atmosphere and of volatiles extracted from solid samples. SAMSIM was developed using Matlab and Simulink libraries of MathWorks Inc. to provide MSL mission planners with accurate predictions of the instrument's electrical, thermal, mechanical, and fluid responses to scripted commands. This tool is a first example of multi-purpose, full-scale numerical modeling of a flight instrument with the purpose of supplementing or even entirely eliminating the need for a hardware engineering model during instrument development and operation. SAMSIM simulates the complex interactions that occur between the instrument Command and Data Handling unit (C&DH) and all subsystems during the execution of experiment sequences. A typical SAM experiment takes many hours to complete and involves hundreds of components. During the simulation, the electrical, mechanical, thermal, and gas dynamics states of each hardware component are accurately modeled and propagated within the simulation environment at faster than real time. This allows the simulation, in just a few minutes, of experiment sequences that take many hours to execute on the real instrument. The SAMSIM model is divided into five distinct but interacting modules: software, mechanical, thermal, gas flow, and electrical. The software module simulates the instrument C&DH by executing a customized version of the instrument flight software in a Matlab environment. The inputs and outputs of this synthetic C&DH are mapped to virtual sensors and command lines that mimic, in their structure and connectivity, the layout of the instrument harnesses. This module executes, and thus validates, complex command scripts prior to their uplinking to the SAM instrument. As an output, this module generates synthetic data and message logs at a rate that is similar to that of the actual instrument.
NASA Astrophysics Data System (ADS)
de Brum, A. G. V.; da Cruz, F. C.; Hetem, A., Jr.
2015-10-01
To assist in the investigation of the triple asteroid system 2001-SN263, the deep space mission ASTER will carry a laser altimeter onboard. The instrument was named ALR, and its development is now in progress. To help in the instrument design, with a view to the creation of software to control the instrument, a package of computer programs was produced to simulate the operation of a pulsed laser altimeter whose operating principle is based on measuring the time of flight of the travelling pulse. This software simulator was called ALR_Sim, and the results obtained with it represent what should be expected as the return signal when laser pulses are fired toward a target, reflect off it and return to be detected by the instrument. The program was successfully tested in some of the most common situations expected. It now constitutes the main workbench dedicated to the creation and testing of the control software to be embarked in the ALR. In addition, the simulator is also an important tool to assist the creation of software to be used on Earth in the processing and analysis of the data received from the instrument. This work presents the results obtained in the special case of modeling a surface with a crater, along with the simulation of the instrument operating above this type of terrain. This study points out that comparing the waveform obtained as the return signal after reflection of the laser pulse on the surface of the crater with the return signal expected for a flat and homogeneous surface is a useful method for terrain detail extraction.
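The operating principle is easy to state in code: range is half the round-trip light time, and the return waveform is the superposition of echoes from all illuminated surface patches, which is why a crater broadens or splits the pulse relative to a flat surface. A minimal sketch with invented numbers follows.

    import numpy as np

    C = 299_792_458.0  # speed of light, m/s

    def range_from_tof(tof_seconds):
        # Pulsed altimetry: the pulse travels out and back, so r = c*t/2.
        return C * tof_seconds / 2.0

    def return_waveform(echo_ranges, weights, sigma=2.0):
        # Superpose Gaussian echoes from surface patches at different
        # ranges; a crater floor and rim return separated echoes, while
        # a flat homogeneous surface returns a single narrow pulse.
        echo_ranges = np.asarray(echo_ranges, dtype=float)
        r = np.linspace(echo_ranges.min() - 10, echo_ranges.max() + 10, 500)
        w = sum(a * np.exp(-0.5 * ((r - r0) / sigma) ** 2)
                for r0, a in zip(echo_ranges, weights))
        return r, w

    print(f"{range_from_tof(6.67e-6):.1f} m")            # ~1 km range
    r, w = return_waveform([1000.0, 1006.0], [1.0, 0.6])  # floor + rim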
TE-Tracker: systematic identification of transposition events through whole-genome resequencing.
Gilly, Arthur; Etcheverry, Mathilde; Madoui, Mohammed-Amin; Guy, Julie; Quadrana, Leandro; Alberti, Adriana; Martin, Antoine; Heitkam, Tony; Engelen, Stefan; Labadie, Karine; Le Pen, Jeremie; Wincker, Patrick; Colot, Vincent; Aury, Jean-Marc
2014-11-19
Transposable elements (TEs) are DNA sequences that are able to move from their location in the genome by cutting or copying themselves to another locus. As such, they are increasingly recognized as impacting all aspects of genome function. With the dramatic reduction in the cost of DNA sequencing, it is now possible to resequence whole genomes in order to systematically characterize novel TE mobilization in a particular individual. However, this task is made difficult by the inherently repetitive nature of TE sequences, which in some eukaryotes compose over half of the genome sequence. Currently, only a few software tools dedicated to the detection of TE mobilization using next-generation sequencing are described in the literature. They often target specific TEs for which annotation is available, and are only able to identify families of closely related TEs, rather than individual elements. We present TE-Tracker, a general and accurate computational method for the de novo detection of germline TE mobilization from resequenced genomes, as well as the identification of both their source and destination sequences. We compare our method with the two classes of existing software: specialized TE-detection tools and generic structural variant (SV) detection tools. We show that TE-Tracker, while working independently of any prior annotation, bridges the gap between these two approaches in terms of detection power. Indeed, its positive predictive value (PPV) is comparable to that of dedicated TE software while its sensitivity is typical of a generic SV detection tool. TE-Tracker demonstrates the benefit of adopting an annotation-independent, de novo approach for the detection of TE mobilization events. We use TE-Tracker to provide a comprehensive view of transposition events induced by loss of DNA methylation in Arabidopsis. TE-Tracker is freely available at http://www.genoscope.cns.fr/TE-Tracker . We show that TE-Tracker accurately detects both the source and destination of novel transposition events in resequenced genomes. Moreover, TE-Tracker is able to detect all potential donor sequences for a given insertion, and can identify the correct one among them. Furthermore, TE-Tracker produces significantly fewer false positives than common SV detection programs, thus greatly facilitating the detection and analysis of TE mobilization events.
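Annotation-independent TE detection of this kind starts from discordantly mapped read pairs; the sketch below extracts that raw signal with pysam, while TE-Tracker's actual pipeline adds the clustering and donor-sequence resolution on top.

    import pysam

    def discordant_pairs(bam_path, min_mapq=20):
        # Collect read pairs whose mapping violates the expected insert
        # geometry; clusters of such pairs are candidate mobilization
        # events (source on one side, destination on the other).
        pairs = []
        with pysam.AlignmentFile(bam_path, "rb") as bam:
            for read in bam:
                if (read.is_paired and not read.is_unmapped
                        and not read.mate_is_unmapped
                        and not read.is_secondary
                        and not read.is_proper_pair
                        and read.mapping_quality >= min_mapq):
                    pairs.append((read.reference_name, read.reference_start,
                                  read.next_reference_name,
                                  read.next_reference_start))
        return pairs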
Dimensional Error in Rapid Prototyping with Open Source Software and Low-cost 3D-printer.
Rendón-Medina, Marco A; Andrade-Delgado, Laura; Telich-Tarriba, Jose E; Fuente-Del-Campo, Antonio; Altamirano-Arcos, Carlos A
2018-01-01
Rapid prototyping models (RPMs) have been extensively used in craniofacial and maxillofacial surgery, especially in areas such as orthognathic surgery, posttraumatic or oncological reconstructions, and implantology. Economic limitations are higher in developing countries such as Mexico, where resources dedicated to health care are limited, restricting the use of RPMs to a few selected centers. This article aims to determine the dimensional error of a low-cost fused deposition modeling 3D printer (Tronxy P802MA, Shenzhen, Tronxy Technology Co) used with open source software. An ordinary dry human mandible was scanned with a computed tomography device. The data were processed with open software to build a rapid prototype with a fused deposition machine. Linear measurements were performed to find the mean absolute and relative difference. The mean absolute and relative difference was 0.65 mm and 1.96%, respectively (P = 0.96). Low-cost FDM machines and open source software are excellent options to manufacture RPMs, with the benefit of low cost and a relative error similar to other, more expensive technologies.
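The two error figures reported above are straightforward to compute; a minimal sketch with invented measurements (not the study's raw data) follows.

    import numpy as np

    def dimensional_error(measured, reference):
        # Mean absolute difference (mm) and mean relative difference (%)
        # between corresponding landmark measurements on the print and
        # on the original specimen.
        measured = np.asarray(measured, dtype=float)
        reference = np.asarray(reference, dtype=float)
        abs_diff = np.abs(measured - reference)
        rel_diff = 100.0 * abs_diff / reference
        return abs_diff.mean(), rel_diff.mean()

    mad, mrd = dimensional_error([33.4, 50.9], [33.0, 50.1])
    print(f"{mad:.2f} mm, {mrd:.2f} %")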
Galileo spacecraft power distribution and autonomous fault recovery
NASA Technical Reports Server (NTRS)
Detwiler, R. C.
1982-01-01
There is a trend in current spacecraft design to achieve greater fault tolerance through the implementation of on-board software dedicated to detecting and isolating failures. A combination of hardware and software is utilized in the Galileo power system for autonomous fault recovery. Galileo is a dual-spun spacecraft designed to carry a number of scientific instruments into a series of orbits around the planet Jupiter. In addition to its self-contained scientific payload, it will also carry a probe system which will be separated from the spacecraft some 150 days prior to Jupiter encounter. The Galileo spacecraft is scheduled to be launched in 1985. Attention is given to the power system, the fault protection requirements, and the power fault recovery implementation.
Computing Properties of Hadrons, Nuclei and Nuclear Matter from Quantum Chromodynamics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Savage, Martin J.
This project was part of a coordinated software development effort which the nuclear physics lattice QCD community pursues in order to ensure that lattice calculations can make optimal use of present and forthcoming leadership-class and dedicated hardware, including that of the national laboratories, and prepares for the exploitation of future computational resources in the exascale era. The UW team improved and extended software libraries used in lattice QCD calculations related to multi-nucleon systems, enhanced production running codes related to load balancing multi-nucleon production on large-scale computing platforms, developed SQLite (addressable database) interfaces to efficiently archive and analyze multi-nucleon data, and developed a Mathematica interface for the SQLite databases.
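An SQLite-backed archive of this kind can be sketched with Python's standard sqlite3 module; the schema, ensemble and correlator names below are illustrative, not the project's actual design.

    import sqlite3

    con = sqlite3.connect(":memory:")   # a file path in real use
    con.execute("""CREATE TABLE correlators (
                     ensemble TEXT, source TEXT, timeslice INTEGER,
                     re REAL, im REAL)""")
    # Hypothetical two-nucleon correlator values, purely for illustration.
    rows = [("a12m220", "NN_1S0", t, 1.0 / (t + 1), 0.0) for t in range(16)]
    con.executemany("INSERT INTO correlators VALUES (?, ?, ?, ?, ?)", rows)
    con.commit()
    # Analyses can then address exactly the slice of data they need:
    for timeslice, re in con.execute(
            "SELECT timeslice, re FROM correlators WHERE source = 'NN_1S0'"):
        pass  # feed into effective-mass fits, etc.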
Lend Me Your Voice - A Constructivist Approach to Augmentative Communication
NASA Astrophysics Data System (ADS)
Mangiatordi, Andrea; Acosta, Micaela; Castellano, Roxana
This paper envisions a software project dedicated to disabled children with communication impairments or restrictions. The idea is to develop a cost-free and functional communication aid capable of voice output. This software could be a very convenient solution if used on the laptops provided by One Laptop Per Child to children in different developing countries. These laptops, together with their operating system, are designed following constructionist ideas, focused on cooperative learning and "learning learning". This work discusses the creation of a tool which is both an aid for disabled children and a base on which inclusive contexts can be built in school classes. An ideal lesson plan is discussed which could be used as a guideline for teachers.
Virtual Power Electronics: Novel Software Tools for Design, Modeling and Education
NASA Astrophysics Data System (ADS)
Hamar, Janos; Nagy, István; Funato, Hirohito; Ogasawara, Satoshi; Dranga, Octavian; Nishida, Yasuyuki
The current paper presents browser-based, multimedia-rich software tools and an e-learning curriculum to support the design and modeling process of power electronics circuits and to explain sometimes rather sophisticated phenomena. Two projects will be discussed. The so-called Inetele project is financed by the Leonardo da Vinci program of the European Union (EU). It is a collaborative project between numerous EU universities and institutes to develop a state-of-the-art curriculum in Electrical Engineering. Another cooperative project, with participation of Japanese, European and Australian institutes, focuses especially on developing e-learning curricula and interactive design and modeling tools, and furthermore on the development of a virtual laboratory. Snapshots from these two projects will be presented.
NASA Astrophysics Data System (ADS)
Berthou, B.; Binosi, D.; Chouika, N.; Colaneri, L.; Guidal, M.; Mezrag, C.; Moutarde, H.; Rodríguez-Quintero, J.; Sabatié, F.; Sznajder, P.; Wagner, J.
2018-06-01
We describe the architecture and functionalities of a C++ software framework, coined PARTONS, dedicated to the phenomenology of Generalized Parton Distributions. These distributions describe the three-dimensional structure of hadrons in terms of quarks and gluons, and can be accessed in deeply exclusive lepto- or photo-production of mesons or photons. PARTONS provides a necessary bridge between models of Generalized Parton Distributions and experimental data collected in various exclusive production channels. We outline the specification of the PARTONS framework in terms of practical needs, physical content and numerical capacity. This framework will be useful for physicists - theorists or experimentalists - not only to develop new models, but also to interpret existing measurements and even design new experiments.
NASA Astrophysics Data System (ADS)
Sagnotti, Leonardo
2013-04-01
Modern rock magnetometers and stepwise demagnetization procedures produce large datasets, which need versatile and fast software for their display and analysis. Various software packages for paleomagnetic analyses have recently been developed to overcome the problems linked to the limited capability and the loss of operability of early codes written in obsolete computer languages and/or for platforms not compatible with modern 64-bit processors. The Demagnetization Analysis in Excel (DAIE) workbook is a new tool designed to make the analysis of demagnetization data easy and accessible in an application (Microsoft Excel) that is widely diffused and available on both the Microsoft Windows and Mac OS X operating systems. The widespread diffusion of Excel should guarantee a long working life, since compatibility and functionality of current Excel files will most likely be maintained during the development of new processors and operating systems. DAIE is designed for viewing and analyzing stepwise demagnetization data of both discrete and u-channel samples. DAIE consists of a single file and has an open modular structure organized in 10 distinct worksheets. The standard demagnetization diagrams and various parameters of common use are shown on the same worksheet, including selectable parameters and user choices. The remanence characteristic components may be computed by principal component analysis (PCA) on a selected interval of demagnetization steps. Saving of the PCA data can be done either sample by sample or automatically, by applying the selected choices to all the samples included in the file. The DAIE open structure allows easy personalization, development and improvement. The workbook has the following features, which may be valuable for various users: - Operability on nearly all computers and platforms; - Easy input of demagnetization data by "copy and paste" from ASCII files; - Easy export of computed parameters and demagnetization plots; - Complete control of the whole workflow and the possibility of implementation of the workbook by any user; - A modular structure in distinct worksheets for each type of analysis and plot, to make implementation and personalization easier; - Suitability for educational purposes, since all the computations and analyses are easily traceable and accessible; - Automatic and fast analysis of large batches of demagnetization data, such as those measured on u-channel samples. The DAIE workbook and the "User manual" are available for download on a dedicated web site (http://roma2.rm.ingv.it/en/facilities/software/49/daie).
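The PCA step is the classic Kirschvink-style fit: the characteristic direction is the first principal component of the centered demagnetization vectors, and the maximum angular deviation (MAD) comes from the residual variance. An equivalent computation outside Excel, as a sketch:

    import numpy as np

    def pca_direction(remanence):
        # remanence: n x 3 Cartesian remanence vectors over the chosen
        # demagnetization interval. Free line fit through the centroid;
        # MAD follows Kirschvink's arctan of the residual-to-principal
        # variance ratio.
        x = np.asarray(remanence, dtype=float)
        centred = x - x.mean(axis=0)
        _, s, vt = np.linalg.svd(centred, full_matrices=False)
        direction = vt[0]                      # first principal component
        mad = np.degrees(np.arctan(np.sqrt((s[1]**2 + s[2]**2) / s[0]**2)))
        return direction, mad

    steps = np.array([[10.0, 5.0, 8.0], [8.1, 4.0, 6.4],
                      [6.0, 3.1, 4.8], [4.1, 2.0, 3.2]])   # made-up data
    d, mad = pca_direction(steps)
    print(d, f"MAD = {mad:.1f} deg")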
Preliminary design studies on a nuclear seawater desalination system
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wibisono, A. F.; Jung, Y. H.; Choi, J.
2012-07-01
Seawater desalination is one of the most promising technologies for providing fresh water, especially in arid regions. The most widely used technologies in seawater desalination are thermal desalination (MSF and MED) and membrane desalination (RO). Some developments have been made in coupling the desalination plant with a nuclear reactor to reduce the cost of the energy required for thermal desalination. Coupling a nuclear reactor to a desalination plant can be done either by using co-generation or by using dedicated heat from a nuclear system. The comparison of a co-generation nuclear reactor with a desalination plant, a dedicated nuclear heat system, and a fossil-fueled system will be discussed in this paper using an economic assessment with the IAEA DEEP software. A newly designed nuclear system dedicated to seawater desalination, suggested by a KAIST (Korea Advanced Inst. of Science and Technology) research team, will also be described in detail within this paper. The suggested reactor system uses a gas-cooled reactor, and in this preliminary study the scope of design is limited to the comparison of two cases in different operating temperature ranges. (authors)
Esposito, Stefano Andrea; Huybrechts, Bart; Slagmolen, Pieter; Cotti, Elisabetta; Coucke, Wim; Pauwels, Ruben; Lambrechts, Paul; Jacobs, Reinhilde
2013-09-01
The routine use of high-resolution images derived from 3-dimensional cone-beam computed tomography (CBCT) datasets enables the linear measurement of lesions in the maxillary and mandibular bones on 3 planes of space. Measurements on different planes would make it possible to obtain real volumetric assessments. In this study, we tested, in vitro, the accuracy and reliability of new dedicated software developed for volumetric lesion assessment in clinical endodontics. Twenty-seven bone defects were created around the apices of 8 teeth in 1 young bovine mandible to simulate periapical lesions of different sizes and shapes. The volume of each defect was determined by taking an impression of the defect using a silicone material. The samples were scanned using an Accuitomo 170 CBCT (J. Morita Mfg Co, Kyoto, Japan), and the data were uploaded into a newly developed dedicated software tool. Two endodontists acted as independent and calibrated observers. They analyzed each bone defect for volume. The difference between the direct volumetric measurements and the measurements obtained with the CBCT images was statistically assessed using a lack-of-fit test. A correlation study was undertaken using the Pearson product-moment correlation coefficient. Intra- and interobserver agreement was also evaluated. The results showed a good fit and strong correlation between both volume measurements (ρ > 0.9) with excellent inter- and intraobserver agreement. Using this software, CBCT proved to be a reliable method in vitro for the estimation of endodontic lesion volumes in bovine jaws. Therefore, it may constitute a new, validated technique for the accurate evaluation and follow-up of apical periodontitis. Copyright © 2013 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.
HCS road: an enterprise system for integrated HCS data management and analysis.
Jackson, Donald; Lenard, Michael; Zelensky, Alexander; Shaikh, Mohammad; Scharpf, James V; Shaginaw, Richard; Nawade, Mahesh; Agler, Michele; Cloutier, Normand J; Fennell, Myles; Guo, Qi; Wardwell-Swanson, Judith; Zhao, Dandan; Zhu, Yingjie; Miller, Christopher; Gill, James
2010-08-01
The effective analysis and interpretation of high-content screening (HCS) data requires joining results to information on experimental treatments and controls, normalizing data, and selecting hits or fitting concentration-response curves. HCS data have unique requirements that are not supported by traditional high-throughput screening databases, including the ability to designate separate positive and negative controls for different measurements in multiplexed assays; the ability to capture information on the cell lines, fluorescent reagents, and treatments in each assay; the ability to store and use individual-cell and image data; and the ability to support HCS readers and software from multiple vendors along with third-party image analysis tools. To address these requirements, the authors developed an enterprise system for the storage and processing of HCS images and results. This system, HCS Road, supports target identification, lead discovery, lead evaluation, and lead profiling activities. A dedicated client supports experimental design, data review, and core analyses and displays images together with results for assay development, hit assessment, and troubleshooting. Data can be exported to third-party applications for further analysis and exploration. HCS Road provides a single source for high-content results across the organization, regardless of the group or instrument that produced them.
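Control-based normalization, with separate controls per readout in multiplexed assays, is the kind of processing such a system automates; a minimal sketch with invented well values (not HCS Road's actual code) follows.

    import numpy as np

    def percent_of_control(values, neg_ctrl, pos_ctrl):
        # Normalize one HCS readout to its plate controls; a multiplexed
        # assay calls this once per measurement, each with that
        # measurement's own positive/negative control wells.
        lo = np.median(neg_ctrl)
        hi = np.median(pos_ctrl)
        return 100.0 * (np.asarray(values, dtype=float) - lo) / (hi - lo)

    # Illustrative intensities: test wells plus control wells.
    print(percent_of_control([120, 340, 560],
                             neg_ctrl=[100, 110],
                             pos_ctrl=[600, 580]))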
Artese, Anna; Alcaro, Stefano; Moraca, Federica; Reina, Rocco; Ventura, Marzia; Costantino, Gabriele; Beccari, Andrea R; Ortuso, Francesco
2013-05-01
During the first edition of the Computationally Driven Drug Discovery meeting, held in November 2011 at Dompé Pharma (L'Aquila, Italy), a questionnaire regarding the diffusion and the use of computational tools for drug-design purposes in both academia and industry was distributed among all participants. This is a follow-up of a previously reported investigation carried out among a few companies in 2007. The new questionnaire implemented five sections dedicated to: research group identification and classification; 18 different computational techniques; software information; hardware data; and economical business considerations. In this article, together with a detailed history of the different computational methods, a statistical analysis of the survey results that enabled the identification of the prevalent computational techniques adopted in drug-design projects is reported and a profile of the computational medicinal chemist currently working in academia and pharmaceutical companies in Italy is highlighted.
VAX Cluster upgrade: Report of a CPC task force
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hanson, J.; Berry, H.; Kessler, P.
The CSCF VAX cluster provides interactive computing for 100 users during prime time, plus a considerable amount of daytime and overnight batch processing. While this cluster represents less than 10% of the VAX computing power at BNL (6 MIPS out of 70), it has served as an important center for this larger network, supporting special hardware and software too expensive to maintain on every machine. In addition, it is the only unrestricted facility available to VAX/VMS users (other machines are typically dedicated to special projects). This committee's analysis shows that the CPUs on the CSCF cluster are currently badly oversaturated, frequently giving extremely poor interactive response. Short batch jobs (a necessary part of interactive work) typically take 3 to 4 times as long to execute as they would on an idle machine. There is also an immediate need for more scratch disk space and user permanent file space.
Radiation-induced chromosomal instability in human mammary epithelial cells
NASA Technical Reports Server (NTRS)
Durante, M.; Grossi, G. F.; Yang, T. C.
1996-01-01
Karyotypes of human cells surviving X- and alpha-irradiation have been studied. Human mammary epithelial cells of the immortal, non-tumorigenic cell line H184B5 F5-1 M/10 were irradiated and surviving clones isolated and expanded in culture. Cytogenetic analysis was performed using dedicated software with an image analyzer. We have found that both high- and low-LET radiation induced chromosomal instability in long-term cultures, but with different characteristics. Complex chromosomal rearrangements were observed after X-rays, while chromosome loss predominated after alpha-particles. Deletions were observed in both cases. In clones derived from cells exposed to alpha-particles, some cells showed extensive chromosome breaking and double minutes. Genomic instability was correlated to delayed reproductive death and neoplastic transformation. These results indicate that chromosomal instability is a radiation-quality-dependent effect which could determine late genetic effects, and should therefore be carefully considered in the evaluation of risk for space missions.
Repository-Based Software Engineering Program: Working Program Management Plan
NASA Technical Reports Server (NTRS)
1993-01-01
The Repository-Based Software Engineering Program (RBSE) is a National Aeronautics and Space Administration (NASA)-sponsored program dedicated to introducing and supporting common, effective approaches to software engineering practices. The process of conceiving, designing, building, and maintaining software systems by using existing software assets that are stored in a specialized operational reuse library or repository, accessible to system designers, is the foundation of the program. In addition to operating a software repository, RBSE promotes (1) software engineering technology transfer, (2) academic and instructional support of reuse programs, (3) the use of common software engineering standards and practices, (4) software reuse technology research, and (5) interoperability between reuse libraries. This Program Management Plan (PMP) is intended to communicate program goals and objectives, describe major work areas, and define a management report and control process. This process will assist the Program Manager, University of Houston at Clear Lake (UHCL), in tracking work progress and describing major program activities to NASA management. The goal of this PMP is to make managing the RBSE program a relatively easy process that improves the work of all team members. The PMP describes work areas addressed and work efforts being accomplished by the program; however, it is not intended as a complete description of the program. Its focus is on providing management tools and management processes for monitoring, evaluating, and administering the program; and it includes schedules for charting milestones and deliveries of program products. The PMP was developed by soliciting and obtaining guidance from appropriate program participants, analyzing program management guidance, and reviewing related program management documents.
Peeters, Marieke; de Moor, Jan; Verhoeven, Ludo
2011-01-01
The goal of the present study was to get an overview of the emergent literacy activities, instructional adaptations and school absence of children with cerebral palsy (CP) compared to normally developing peers. The results showed that there were differences between the groups regarding the amount of emergent literacy instruction. While time dedicated to storybook reading and independent picture-book reading was comparable, the children with CP received fewer opportunities to work with educational software, and more time was dedicated to rhyming games and singing. For the children with CP, the levels of speech, intellectual, and physical impairment were all related to the amount of time spent in emergent literacy instruction. Additionally, both the amount of time spent training reading precursors and the number of specific reading precursors trained were related to emergent literacy skills. Copyright © 2010 Elsevier Ltd. All rights reserved.
The Victor C++ library for protein representation and advanced manipulation.
Hirsh, Layla; Piovesan, Damiano; Giollo, Manuel; Ferrari, Carlo; Tosatto, Silvio C E
2015-04-01
Protein sequence and structure representation and manipulation require dedicated software libraries to support methods of increasing complexity. Here, we describe the VIrtual Construction TOol for pRoteins (Victor) C++ library, an open source platform dedicated to enabling inexperienced users to develop advanced tools and to gathering contributions from the community. The provided application examples cover statistical energy potentials, profile-profile sequence alignments and ab initio loop modeling. Victor has been used over the last 15 years in several publications and optimized for efficiency. It is provided as a GitHub repository with source files and unit tests, plus extensive online documentation, including a Wiki with help files and tutorials, examples and Doxygen documentation. The C++ library and online documentation, distributed under a GPL license, are available from URL: http://protein.bio.unipd.it/victor/. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Scientific Services on the Cloud
NASA Astrophysics Data System (ADS)
Chapman, David; Joshi, Karuna P.; Yesha, Yelena; Halem, Milt; Yesha, Yaacov; Nguyen, Phuong
Scientific computing was one of the first ever applications for parallel and distributed computation. To this day, scientific applications remain some of the most compute intensive, and have inspired the creation of petaflop compute infrastructure such as the Oak Ridge Jaguar and Los Alamos RoadRunner. Large dedicated hardware infrastructure has become both a blessing and a curse to the scientific community. Scientists are interested in cloud computing for much the same reasons as businesses and other professionals. The hardware is provided, maintained, and administered by a third party. Software abstraction and virtualization provide reliability and fault tolerance. Graduated fees allow for multi-scale prototyping and execution. Cloud computing resources are only a few clicks away, and by far the easiest high performance distributed platform to gain access to. There may still be dedicated infrastructure for ultra-scale science, but the cloud can easily play a major part in the scientific computing initiative.
Jürgens, Julian H W; Schulz, Nadine; Wybranski, Christian; Seidensticker, Max; Streit, Sebastian; Brauner, Jan; Wohlgemuth, Walter A; Deuerling-Zheng, Yu; Ricke, Jens; Dudeck, Oliver
2015-02-01
The objective of this study was to compare the parameter maps of a new flat-panel detector application for time-resolved perfusion imaging in the angiography room (FD-CTP) with computed tomography perfusion (CTP) in an experimental tumor model. Twenty-four VX2 tumors were implanted into the hind legs of 12 rabbits. Three weeks later, FD-CTP (Artis zeego; Siemens) and CTP (SOMATOM Definition AS +; Siemens) were performed. The parameter maps for the FD-CTP were calculated using prototype software, and those for the CTP were calculated with VPCT-body software on a dedicated syngo MultiModality Workplace. The parameters were compared using the Pearson product-moment correlation coefficient and linear regression analysis. The Pearson product-moment correlation coefficient showed good correlation for both the intratumoral blood volume (0.848, P < 0.01) and the blood flow (0.698, P < 0.01). The linear regression analysis of the perfusion between FD-CTP and CTP yielded a regression equation of y = 4.44x + 36.72 (P < 0.01) for the blood volume and y = 0.75x + 14.61 (P < 0.01) for the blood flow. This preclinical study provides evidence that FD-CTP allows time-resolved (dynamic) perfusion imaging of tumors similar to CTP, which provides the basis for clinical applications such as the assessment of tumor response to locoregional therapies directly in the angiography suite.
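For readers unfamiliar with the statistics used, a small sketch of this kind of comparison (arrays and values are hypothetical, not the study's data):

```python
# Illustrative sketch: comparing two perfusion parameter maps with a Pearson
# correlation and a least-squares regression line, as in the study's analysis.
import numpy as np
from scipy import stats

fdctp_bv = np.array([52.1, 61.4, 48.9, 70.2, 55.6])  # FD-CTP blood volume (hypothetical)
ctp_bv   = np.array([3.6,  5.2,  2.9,  7.8,  4.1])   # CTP blood volume (hypothetical)

r, p = stats.pearsonr(fdctp_bv, ctp_bv)
slope, intercept, *_ = stats.linregress(ctp_bv, fdctp_bv)  # y = slope*x + intercept
print(f"r={r:.3f} (p={p:.3g}); y = {slope:.2f}x + {intercept:.2f}")
```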
The use of programmable logic controllers (PLC) for rocket engine component testing
NASA Technical Reports Server (NTRS)
Nail, William; Scheuermann, Patrick; Witcher, Kern
1991-01-01
The application of PLCs to rocket engine component testing at the new Stennis Space Center Component Test Facility is suggested as an alternative to dedicated specialized computers. The PLC systems are characterized by rugged design, intuitive software, fault tolerance, flexibility, multiple end-device options, networking capability, and built-in diagnostics. A distributed PLC-based system is projected to be used for testing the LH2/LOx turbopumps required for the ALS/NLS rocket engines.
A Dedicated Computational Platform for Cellular Monte Carlo T-CAD Software Tools
2015-07-14
computer that establishes an encrypted Virtual Private Network (OpenVPN [44]) based on the Secure Socket Layer (SSL) paradigm. Each user is given a...security certificate for each device used to connect to the computing nodes. Stable OpenVPN clients are available for Linux, Microsoft Windows, Apple OSX...platform is granted by an encrypted connection based on the Secure Socket Layer (SSL) protocol, and implemented in the OpenVPN Virtual Private Network
Towards open-source, low-cost haptics for surgery simulation.
Suwelack, Stefan; Sander, Christian; Schill, Julian; Serf, Manuel; Danz, Marcel; Asfour, Tamim; Burger, Wolfgang; Dillmann, Rüdiger; Speidel, Stefanie
2014-01-01
In minimally invasive surgery (MIS), virtual reality (VR) training systems have become a promising education tool. However, the adoption of these systems in research and clinical settings is still limited by the high costs of dedicated haptics hardware for MIS. In this paper, we present ongoing research towards an open-source, low-cost haptic interface for MIS simulation. We demonstrate the basic mechanical design of the device, the sensor setup as well as its software integration.
Digital Waveguide Architectures for Virtual Musical Instruments
NASA Astrophysics Data System (ADS)
Smith, Julius O.
Digital sound synthesis has become a staple of modern music studios, video games, personal computers, and hand-held devices. As processing power has increased over the years, sound synthesis implementations have evolved from dedicated chip sets, to single-chip solutions, and ultimately to software implementations within processors used primarily for other tasks (such as for graphics or general purpose computing). With the cost of implementation dropping closer and closer to zero, there is increasing room for higher quality algorithms.
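As a concrete illustration of the digital waveguide idea, here is the classic Karplus-Strong plucked string (my sketch, not an excerpt from this work): a delay line plus a lowpass filter in a feedback loop models wave propagation and losses on a string.

```python
# Minimal Karplus-Strong plucked string: the delay-line length sets the pitch,
# a two-point average acts as a one-zero lowpass modeling frequency-dependent loss.
import numpy as np

def pluck(freq_hz: float, dur_s: float, fs: int = 44100) -> np.ndarray:
    n = int(fs / freq_hz)               # delay-line length sets the pitch
    line = np.random.uniform(-1, 1, n)  # noise burst = pluck excitation
    out = np.empty(int(dur_s * fs))
    for i in range(out.size):
        out[i] = line[i % n]
        line[i % n] = 0.5 * (line[i % n] + line[(i + 1) % n])
    return out

tone = pluck(440.0, 1.0)  # one second of an A4 string
```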
D'Alessandro, M P; Ackerman, M J; Sparks, S M
1993-11-01
Educational Technology Network (ET Net) is a free, easy to use, on-line computer conferencing system organized and funded by the National Library of Medicine that is accessible via the SprintNet (SprintNet, Reston, VA) and Internet (Merit, Ann Arbor, MI) computer networks. It is dedicated to helping bring together, in a single continuously running electronic forum, developers and users of computer applications in the health sciences, including radiology. ET Net uses the Caucus computer conferencing software (Camber-Roth, Troy, NY) running on a microcomputer. This microcomputer is located in the National Library of Medicine's Lister Hill National Center for Biomedical Communications and is directly connected to the SprintNet and the Internet networks. The advanced computer conferencing software of ET Net allows individuals who are separated in space and time to unite electronically to participate, at any time, in interactive discussions on applications of computers in radiology. A computer conferencing system such as ET Net allows radiologists to maintain contact with colleagues on a regular basis when they are not physically together. Topics of discussion on ET Net encompass all applications of computers in radiological practice, research, and education. ET Net has been in successful operation for 3 years and has a promising future aiding radiologists in the exchange of information pertaining to applications of computers in radiology.
The NASA Exoplanet Science Institute Archives: KOA and NStED
NASA Astrophysics Data System (ADS)
Berriman, G. B.; Ciardi, D.; Abajian, M.; Barlow, T.; Bryden, G.; von Braun, K.; Good, J.; Kane, S.; Kong, M.; Laity, A.; Lynn, M.; Elroy, D. M.; Plavchan, P.; Ramirez, S.; Schmitz, M.; Stauffer, J.; Wyatt, P.; Zhang, A.; Goodrich, R.; Mader, J.; Tran, H.; Tsubota, M.; Beekley, A.; Berukoff, S.; Chan, B.; Lau, C.; Regelson, M.; Saucedo, M.; Swain, M.
2010-12-01
The NASA Exoplanet Science Institute (NExScI) maintains a series of archival services in support of NASA’s planet finding and characterization goals. Two of the larger archival services at NExScI are the Keck Observatory Archive (KOA) and the NASA Star and Exoplanet Database (NStED). KOA, a collaboration between the W. M. Keck Observatory and NExScI, serves raw data from the High Resolution Echelle Spectrograph (HIRES) and extracted spectral browse products. As of June 2009, KOA hosts over 28 million files (4.7 TB) from over 2,000 nights. In Spring 2010, it will begin to serve data from the Near-Infrared Echelle Spectrograph (NIRSPEC). NStED is a general purpose archive with the aim of providing support for NASA’s planet finding and characterization goals, and stellar astrophysics. There are two principal components of NStED: a database of (currently) all known exoplanets and images, and an archive dedicated to high precision photometric surveys for transiting exoplanets. NStED is the US portal to the CNES mission CoRoT, the first space mission dedicated to the discovery and characterization of exoplanets. These archives share a common software and hardware architecture with the NASA/IPAC Infrared Science Archive (IRSA). The software architecture consists of standalone utilities that perform generic query and retrieval functions. They are called through program interfaces and plugged together to form applications through a simple executive library.
Parallel computing on Unix workstation arrays
NASA Astrophysics Data System (ADS)
Reale, F.; Bocchino, F.; Sciortino, S.
1994-12-01
We have tested arrays of general-purpose Unix workstations used as MIMD systems for massive parallel computations. In particular we have solved numerically a demanding test problem with a 2D hydrodynamic code, developed to study astrophysical flows, by executing it on arrays either of DECstations 5000/200 on an Ethernet LAN, or of DECstations 3000/400, equipped with powerful Alpha processors, on an FDDI LAN. The code is appropriate for data-domain decomposition, and we have used a library for parallelization previously developed at our Institute and easily extended to work on Unix workstation arrays using the PVM software toolset. We have compared the parallel efficiencies obtained on arrays of several processors to those obtained on a dedicated MIMD parallel system, namely a Meiko Computing Surface (CS-1), equipped with Intel i860 processors. We discuss the feasibility of using non-dedicated parallel systems and conclude that the convenience depends essentially on the size of the computational domain as compared to the relative processor power and network bandwidth. We point out that for future perspectives a parallel development of processor and network technology is important, and that the software still offers great opportunities for improvement, especially in terms of latency times in the message-passing protocols. In conditions of significant gain in terms of speedup, such workstation arrays represent a cost-effective approach to massive parallel computations.
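A rough cost model of that trade-off might look like the following (my own illustration with hypothetical parameter values, not the paper's library): compute time scales with the subdomain volume, communication with its boundary.

```python
# Back-of-envelope speedup model for a 2D domain split into n_proc strips:
# large domains amortize the boundary exchange, small domains are comm-bound.
def speedup(n_cells: int, n_proc: int, flops_per_cell: float = 200.0,
            proc_flops: float = 1e8, bytes_per_boundary_cell: float = 16.0,
            bandwidth_bytes_s: float = 1.25e6) -> float:
    t_serial = n_cells * flops_per_cell / proc_flops
    t_comp = t_serial / n_proc
    boundary = 2 * n_cells ** 0.5            # cells on the two strip edges
    t_comm = boundary * bytes_per_boundary_cell / bandwidth_bytes_s
    return t_serial / (t_comp + t_comm)

print(speedup(512 * 512, 8))   # large domain: near-linear speedup
print(speedup(64 * 64, 8))     # small domain: communication-bound
```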
hEIDI: An Intuitive Application Tool To Organize and Treat Large-Scale Proteomics Data.
Hesse, Anne-Marie; Dupierris, Véronique; Adam, Claire; Court, Magali; Barthe, Damien; Emadali, Anouk; Masselon, Christophe; Ferro, Myriam; Bruley, Christophe
2016-10-07
Advances in high-throughput proteomics have led to a rapid increase in the number, size, and complexity of the associated data sets. Managing and extracting reliable information from such large series of data sets require the use of dedicated software organized in a consistent pipeline to reduce, validate, exploit, and ultimately export data. The compilation of multiple mass-spectrometry-based identification and quantification results obtained in the context of a large-scale project represents a real challenge for developers of bioinformatics solutions. In response to this challenge, we developed a dedicated software suite called hEIDI to manage and combine both identifications and semiquantitative data related to multiple LC-MS/MS analyses. This paper describes how, through a user-friendly interface, hEIDI can be used to compile analyses and retrieve lists of nonredundant protein groups. Moreover, hEIDI allows direct comparison of series of analyses, on the basis of protein groups, while ensuring consistent protein inference and also computing spectral counts. hEIDI ensures that validated results are compliant with MIAPE guidelines as all information related to samples and results is stored in appropriate databases. Thanks to the database structure, validated results generated within hEIDI can be easily exported in the PRIDE XML format for subsequent publication. hEIDI can be downloaded from http://biodev.extra.cea.fr/docs/heidi .
Performance Characteristic Mems-Based IMUs for UAVs Navigation
NASA Astrophysics Data System (ADS)
Mohamed, H. A.; Hansen, J. M.; Elhabiby, M. M.; El-Sheimy, N.; Sesay, A. B.
2015-08-01
Accurate 3D reconstruction has become essential for non-traditional mapping applications such as urban planning, mining industry, environmental monitoring, navigation, surveillance, pipeline inspection, infrastructure monitoring, landslide hazard analysis, indoor localization, and military simulation. The needs of these applications cannot be satisfied by traditional mapping, which is based on dedicated data acquisition systems designed for mapping purposes. Recent advances in hardware and software development have made it possible to conduct accurate 3D mapping without using costly and high-end data acquisition systems. Low-cost digital cameras, laser scanners, and navigation systems can provide accurate mapping if they are properly integrated at the hardware and software levels. Unmanned Aerial Vehicles (UAVs) are emerging as a mobile mapping platform that can provide additional economical and practical advantages. However, such economical and practical requirements call for navigation systems that can provide an uninterrupted navigation solution. Hence, testing the performance characteristics of Micro-Electro-Mechanical Systems (MEMS) or low-cost navigation sensors for various UAV applications is an important research topic. This work focuses on studying the performance characteristics of such sensors under different manoeuvres, using inertial measurements integrated with single point positioning, Real-Time Kinematic (RTK) positioning, and additional navigational aiding sensors. Furthermore, the performance of the inertial sensors is tested during Global Positioning System (GPS) signal outage.
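One of the simplest ways to illustrate aiding an inertial solution through a GPS outage is a complementary filter; the sketch below is a hedged, one-dimensional toy of that general idea, not the integration scheme used in the paper:

```python
# 1D complementary filter: blend high-rate inertial data with low-rate GPS
# fixes so the navigation solution survives short GPS outages.
def fuse(accel: float, gps_pos: float | None, state: dict,
         dt: float = 0.01, alpha: float = 0.98) -> float:
    """state holds 'pos' and 'vel'; gps_pos is None during an outage."""
    state["vel"] += accel * dt                 # integrate the IMU
    predicted = state["pos"] + state["vel"] * dt
    if gps_pos is None:
        state["pos"] = predicted               # dead-reckon through the outage
    else:
        # lean on the smooth inertial prediction, correct drift with GPS
        state["pos"] = alpha * predicted + (1 - alpha) * gps_pos
    return state["pos"]

state = {"pos": 0.0, "vel": 0.0}
for k in range(5):
    fuse(accel=0.2, gps_pos=0.0 if k % 2 else None, state=state)
```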
Oxygen saturation profile in healthy preterm infants.
Harigopal, S; Satish, H P; Taktak, A F G; Southern, K W; Shaw, N J
2011-09-01
To establish a reference range for oxygen saturation (SpO(2)) in well preterm infants, to guide home oxygen therapy, using a pulse oximeter and the Pulse Oximetry Data Analysis Software (PODS). SpO(2) and heart-rate profiles of healthy preterm infants receiving mechanical ventilation for less than 6 h and supplemental oxygen for less than 48 h were monitored using a pulse oximeter. The stored data were downloaded from the monitor to a personal computer as individual files. Each infant's SpO(2) files were subsequently displayed in graphic form, and a reference range was constructed using the dedicated software, PODS. 43 infants were studied. The median of all infants' mean SpO(2) values was 95% (range 92-99%). The median durations of saturation less than 85% and between 85% and 90% were 1% and 2%, respectively. Using the study group's median and 5th and 95th percentiles, a cumulative frequency curve of time against SpO(2) value was constructed (representing the reference range of SpO(2) profiles in healthy preterm infants). The SpO(2) reference range can be used as an easy and practical guide to compare the SpO(2) profiles of infants on home oxygen therapy and to guide their oxygen therapy.
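A sketch of how such a percentile-based reference range can be constructed (hypothetical data, not the study's recordings; PODS itself is not reproduced here):

```python
# Per-infant SpO2 traces reduced to a median and 5th/95th percentile
# cumulative-frequency band across the group.
import numpy as np

rng = np.random.default_rng(0)
spo2 = np.clip(rng.normal(95, 2, size=(43, 1000)), 80, 100)  # 43 infants (hypothetical)

thresholds = np.arange(80, 101)
# per infant: fraction of time spent at or below each SpO2 value
cumfreq = (spo2[:, :, None] <= thresholds).mean(axis=1)
p5, median, p95 = np.percentile(cumfreq, [5, 50, 95], axis=0)
for t, lo, md, hi in zip(thresholds, p5, median, p95):
    print(f"SpO2<={t}%: {lo:.2%} / {md:.2%} / {hi:.2%} of time")
```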
MATISSE: A novel tool to access, visualize and analyse data from planetary exploration missions
NASA Astrophysics Data System (ADS)
Zinzi, A.; Capria, M. T.; Palomba, E.; Giommi, P.; Antonelli, L. A.
2016-04-01
The increasing number and complexity of planetary exploration space missions require new tools to access, visualize and analyse data to improve their scientific return. The ASI Science Data Center (ASDC) addresses this need with the web tool MATISSE (Multi-purpose Advanced Tool for the Instruments of the Solar System Exploration), which allows the visualization of single observations or real-time computed high-order products, directly projected on the three-dimensional model of the selected target body. Using MATISSE, it is no longer necessary to download huge quantities of data or to write specific code for every instrument analysed, greatly encouraging studies based on joint analysis of different datasets. In addition, the extremely high-resolution output, usable offline with Python-based free software, together with files readable by specific GIS software, makes it a valuable tool for further processing the data at the best spatial accuracy available. MATISSE's modular structure permits the addition of new missions or tasks and, thanks to dedicated future developments, it will be possible to make it compliant with the Planetary Virtual Observatory standards currently under definition. In this context, an interface to the NASA ODE REST API, through which public repositories can be accessed, has recently been developed.
Exploratory analysis regarding the domain definitions for computer based analytical models
NASA Astrophysics Data System (ADS)
Raicu, A.; Oanta, E.; Barhalescu, M.
2017-08-01
Our previous computer-based studies dedicated to structural problems using analytical methods defined the composite cross section of a beam as a result of Boolean operations with so-called 'simple' shapes. Through generalisation, the class of 'simple' shapes came to include areas bounded by curves approximated using spline functions and areas approximated as polygons. However, particular definitions lead to particular solutions. In order to move beyond these limitations, we conceived a general definition of the cross sections, which are now considered calculus domains consisting of several subdomains. The corresponding set of input data uses complex parameterizations. This new vision allows us to naturally assign an arbitrary number of attributes to the subdomains. In this way, new phenomena that use map-wise information, such as metal alloy equilibrium diagrams, may be modelled. The hierarchy of the input data text files, which use the comma-separated-value format, and their structure are also presented and discussed in the paper. This new approach allows us to reuse the concepts and part of the data-processing software instruments already developed. The software to be subsequently developed will be modularised and generalised in order to be used in upcoming projects that require rapid development of computer-based models.
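A minimal sketch of such a subdomain-with-attributes model fed from comma-separated-value files (all names are hypothetical; this is not the authors' code):

```python
# A cross section as a calculus domain made of subdomains, each carrying
# arbitrary attributes (e.g. alloy phase information) loaded from CSV files.
import csv
from dataclasses import dataclass, field

@dataclass
class Subdomain:
    vertices: list[tuple[float, float]]        # polygonal boundary
    attributes: dict[str, str] = field(default_factory=dict)

@dataclass
class Domain:
    subdomains: list[Subdomain] = field(default_factory=list)

def load_subdomain(path: str, **attrs: str) -> Subdomain:
    with open(path, newline="") as f:
        pts = [(float(x), float(y)) for x, y in csv.reader(f)]
    return Subdomain(pts, dict(attrs))

# e.g. Domain([load_subdomain("flange.csv", material="steel", phase="ferrite")])
```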
The Personal Hearing System—A Software Hearing Aid for a Personal Communication System
NASA Astrophysics Data System (ADS)
Grimm, Giso; Guilmin, Gwénaël; Poppen, Frank; Vlaming, Marcel S. M. G.; Hohmann, Volker
2009-12-01
A concept and architecture of a personal communication system (PCS) is introduced that integrates audio communication and hearing support for the elderly and hearing-impaired through a personal hearing system (PHS). The concept envisions a central processor connected to audio headsets via a wireless body area network (WBAN). To demonstrate the concept, a prototype PCS is presented that is implemented on a netbook computer with a dedicated audio interface in combination with a mobile phone. The prototype can be used for field-testing possible applications and to reveal possibilities and limitations of the concept of integrating hearing support in consumer audio communication devices. It is shown that the prototype PCS can integrate hearing aid functionality, telephony, public announcement systems, and home entertainment. An exemplary binaural speech enhancement scheme that represents a large class of possible PHS processing schemes is shown to be compatible with the general concept. However, an analysis of hardware and software architectures shows that the implementation of a PCS on future advanced cell phone-like devices is challenging. Because of limitations in processing power, recoding of prototype implementations into fixed point arithmetic will be required and WBAN performance is still a limiting factor in terms of data rate and delay.
Science yield modeling with the Exoplanet Open-Source Imaging Mission Simulator (EXOSIMS)
NASA Astrophysics Data System (ADS)
Delacroix, Christian; Savransky, Dmitry; Garrett, Daniel; Lowrance, Patrick; Morgan, Rhonda
2016-08-01
We report on our ongoing development of EXOSIMS and mission simulation results for WFIRST. We present the interface control and the modular structure of the software, along with corresponding prototypes and class definitions for some of the software modules. More specifically, we focus on describing the main steps of our high-fidelity mission simulator EXOSIMS, i.e., the completeness, optical system and zodiacal light modules definition, the target list module filtering, and the creation of a planet population within our simulated universe module. For the latter, we introduce the integration of a recent mass-radius model from the FORECASTER software. We also provide custom modules dedicated to WFIRST using both the Hybrid Lyot Coronagraph (HLC) and the Shaped Pupil Coronagraph (SPC) for detection and characterization, respectively. In that context, we show and discuss the results of some preliminary WFIRST simulations, focusing on comparing different methods of integration time calculation, through ensembles (large numbers) of survey simulations.
Setti, E; Musumeci, R
2001-06-01
The World Wide Web is an exciting service that allows one to publish electronic documents made of text and images on the internet. Client software called a web browser can access these documents, display them, and print them. The most popular browsers are currently Microsoft Internet Explorer (Microsoft, Redmond, WA) and Netscape Communicator (Netscape Communications, Mountain View, CA). These browsers can display text in hypertext markup language (HTML) format and images in Joint Photographic Expert Group (JPEG) and Graphic Interchange Format (GIF). Currently, neither browser can display radiologic images in the native Digital Imaging and Communications in Medicine (DICOM) format. With the aim of publishing radiologic images on the internet, we wrote a dedicated Java applet. Our software can display radiologic and histologic images in DICOM, JPEG, and GIF formats, and provides a number of functions such as windowing and a magnification lens. The applet is compatible with several web browsers, including older versions. The software is free and available from the author.
Cameo: A Python Library for Computer Aided Metabolic Engineering and Optimization of Cell Factories.
Cardoso, João G R; Jensen, Kristian; Lieven, Christian; Lærke Hansen, Anne Sofie; Galkina, Svetlana; Beber, Moritz; Özdemir, Emre; Herrgård, Markus J; Redestig, Henning; Sonnenschein, Nikolaus
2018-04-20
Computational systems biology methods enable rational design of cell factories on a genome-scale and thus accelerate the engineering of cells for the production of valuable chemicals and proteins. Unfortunately, the majority of these methods' implementations are either not published, rely on proprietary software, or do not provide documented interfaces, which has precluded their mainstream adoption in the field. In this work we present cameo, a platform-independent software that enables in silico design of cell factories and targets both experienced modelers as well as users new to the field. It is written in Python and implements state-of-the-art methods for enumerating and prioritizing knockout, knock-in, overexpression, and down-regulation strategies and combinations thereof. Cameo is an open source software project and is freely available under the Apache License 2.0. A dedicated Web site including documentation, examples, and installation instructions can be found at http://cameo.bio . Users can also give cameo a try at http://try.cameo.bio .
The GenABEL Project for statistical genomics.
Karssen, Lennart C; van Duijn, Cornelia M; Aulchenko, Yurii S
2016-01-01
Development of free/libre open source software is usually done by a community of people with an interest in the tool. For scientific software, however, this is less often the case. Most scientific software is written by only a few authors, often a student working on a thesis. Once the paper describing the tool has been published, the tool is no longer developed further and is left to its own devices. Here we describe the broad, multidisciplinary community we formed around a set of tools for statistical genomics. The GenABEL project for statistical omics actively promotes open interdisciplinary development of statistical methodology and its implementation in efficient and user-friendly software under an open source licence. The software tools developed within the project collectively make up the GenABEL suite, which currently consists of eleven tools. The open framework of the project actively encourages involvement of the community in all stages, from formulation of methodological ideas to application of software to specific data sets. A web forum is used to channel user questions and discussions, further promoting the use of the GenABEL suite. Developer discussions take place on a dedicated mailing list, and development is further supported by robust development practices including use of public version control, code review and continuous integration. Use of this open science model attracts contributions from users and developers outside the "core team", facilitating agile statistical omics methodology development and fast dissemination.
Development of Radar Control system for Multi-mode Active Phased Array Radar for atmospheric probing
NASA Astrophysics Data System (ADS)
Yasodha, Polisetti; Jayaraman, Achuthan; Thriveni, A.
2016-07-01
Modern multi-mode active phased array radars require a highly efficient radar control system for hassle-free real-time radar operation. The requirement arises from the distributed architecture of the active phased array radar, in which each antenna element in the array is connected to a dedicated Transmit-Receive (TR) module. Controlling the TR modules, which are generally a few hundred in number, and operating them in synchronisation is a huge task during real-time radar operation and should be handled with utmost care. The Indian MST Radar, located at NARL, Gadanki, established in the early 1990s as an outcome of the middle atmosphere programme, is a remote sensing instrument for probing the atmosphere. This radar has a semi-active array consisting of 1024 antenna elements, with limited beam steering possible only along the principal planes. To overcome these limitations, the radar is being augmented into a fully active phased array to accomplish beam agility and multi-mode operations. Each antenna element is excited by a dedicated 1 kW TR module, located in the field, enabling the radar beam to be positioned within a 20° conical volume. A multi-channel receiver allows the radar to operate in various modes like Doppler Beam Swinging (DBS), Spaced Antenna (SA), Frequency Domain Interferometry (FDI) etc.

The present work describes the real-time radar control (RC) system for the active phased array radar described above. The radar control system consists of a Spartan-6 FPGA based Timing and Control Signal Generator (TCSG) and a computer containing the software for controlling all the subsystems of the radar during real-time operation and for calibrating the radar. The main function of the TCSG is to generate the control and timing waveforms required by the various subsystems of the radar. Important components of the RC system software are (i) the TR module configuration software, which programs, controls, and monitors the health parameters of the TR modules; (ii) the radar operation software, which facilitates experimental parameter setting and operating the radar in different modes; (iii) the beam steering software, which computes the amplitude coefficients and phases required by each TR module to form the selected beams with the desired shape; and (iv) the calibration software, which calibrates the radar by measuring the differential insertion phases and amplitudes in all 1024 transmit and receive paths and correcting them.

The TR module configuration software is a major task, as it needs to control 1024 TR modules located in the field about 150 m away from the RC system in the control room. Each TR module has a processor identified by a dedicated IP address, along with memory to store the instructions and parameters required for radar operation. A communication link is designed using Gigabit Ethernet (GbE) switches to realise a 1-to-1024-way switching network. The RC system computer communicates with each processor using its IP address and establishes a connection via the 1-to-1024-port GbE switching network. The experimental parameter data are pre-loaded in parallel into all the TR modules, along with the phase shifter data required for beam steering, using this network. A reference timing pulse is sent to all the TR modules simultaneously, indicating the start of radar operation. The RC system also monitors status parameters from the TR modules, indicating their health, at regular intervals during radar operation via the GbE switching network.
The beam steering software generates the phase shift required by each TR module for the beams selected for operation. The radar operation software retrieves the phase shift data required for beam steering, adds the calibration phase obtained through the calibration software, and loads the resultant phase data into the TR modules. Timed command/data transfer to and from subsystems and synchronisation of subsystems are essential for proper real-time operation of the active phased array radar, and the RC system ensures that commands and experimental parameter data are properly transferred to all subsystems, especially the TR modules. In case of failure of any TR module, the failure is indicated to the user for further rectification. Realisation of the RC system is at an advanced stage. More details will be presented at the conference.
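To make the beam-steering arithmetic concrete, here is a hedged sketch of computing per-module phase commands and adding the calibration phase (the element layout, frequency, and function names are my assumptions, not the NARL implementation):

```python
# Per-element phase command for a planar phased array steered to (theta, phi),
# plus the measured calibration phase, as the text describes.
import numpy as np

def steering_phases(xy: np.ndarray, theta_deg: float, phi_deg: float,
                    freq_hz: float = 53e6) -> np.ndarray:
    """xy: (N, 2) element positions in metres; returns phases in degrees."""
    k = 2 * np.pi * freq_hz / 3e8            # free-space wavenumber
    th, ph = np.radians(theta_deg), np.radians(phi_deg)
    u = np.array([np.sin(th) * np.cos(ph), np.sin(th) * np.sin(ph)])
    return np.degrees(-k * (xy @ u)) % 360

xy = np.array([[i * 2.8, j * 2.8] for i in range(4) for j in range(4)])  # toy 4x4 grid
calibration = np.zeros(len(xy))           # measured insertion phases (hypothetical)
commands = (steering_phases(xy, 10, 45) + calibration) % 360
```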
Fischbach, Martin; Wiebusch, Dennis; Latoschik, Marc Erich
2017-04-01
Modularity, modifiability, reusability, and API usability are important software qualities that determine the maintainability of software architectures. Virtual, Augmented, and Mixed Reality (VR, AR, MR) systems, modern computer games, as well as interactive human-robot systems often include various dedicated input-, output-, and processing subsystems. These subsystems collectively maintain a real-time simulation of a coherent application state. The resulting interdependencies between individual state representations, mutual state access, overall synchronization, and flow of control imply a conceptually close coupling, whereas software quality asks for decoupling to develop maintainable solutions. This article presents five semantics-based software techniques that address this contradiction: semantic grounding, code from semantics, grounded actions, semantic queries, and decoupling by semantics. These techniques are applied to extend the well-established entity-component-system (ECS) pattern to overcome some of this pattern's deficits with respect to the implied state access. A walk-through of central implementation aspects of a multimodal (speech and gesture) VR interface is used to highlight the techniques' benefits. This use case is chosen as a prototypical example of the complex architectures with multiple interacting subsystems found in many VR, AR, and MR systems. Finally, implementation hints are given, lessons learned regarding maintainability are pointed out, and performance implications are discussed.
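For readers unfamiliar with the ECS pattern the article extends, a minimal sketch (my illustration, not the authors' system) of component-based state access via queries:

```python
# Minimal entity-component-system: systems query entities by the components
# they carry rather than by entity type, which is the state access the
# article's semantics-based techniques further decouple.
from dataclasses import dataclass

@dataclass
class Position:
    x: float
    y: float

@dataclass
class Velocity:
    dx: float
    dy: float

class World:
    def __init__(self):
        self.components: dict[int, dict[type, object]] = {}

    def spawn(self, eid: int, *comps: object) -> None:
        self.components[eid] = {type(c): c for c in comps}

    def query(self, *types: type):
        """Yield component tuples for entities holding all requested types."""
        for comps in self.components.values():
            if all(t in comps for t in types):
                yield tuple(comps[t] for t in types)

def movement_system(world: World, dt: float) -> None:
    for pos, vel in world.query(Position, Velocity):
        pos.x += vel.dx * dt
        pos.y += vel.dy * dt

world = World()
world.spawn(1, Position(0, 0), Velocity(1, 2))
movement_system(world, 0.1)
```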
Self-service for software development projects and HPC activities
NASA Astrophysics Data System (ADS)
Husejko, M.; Høimyr, N.; Gonzalez, A.; Koloventzos, G.; Asbury, D.; Trzcinska, A.; Agtzidis, I.; Botrel, G.; Otto, J.
2014-05-01
This contribution describes how CERN has implemented several essential tools for agile software development processes, ranging from version control (Git) to issue tracking (Jira) and documentation (Wikis). Running such services in a large organisation like CERN requires many administrative actions both by users and service providers, such as creating software projects, managing access rights, users and groups, and performing tool-specific customisation. Dealing with these requests manually would be a time-consuming task. Another area of our CERN computing services that has required dedicated manual support has been clusters for specific user communities with special needs. Our aim is to move all our services to a layered approach, with server infrastructure running on the internal cloud computing infrastructure at CERN. This contribution illustrates how we plan to optimise the management of our services by means of an end-user-facing platform acting as a portal into all the related services for software projects, inspired by popular portals for open-source development such as Sourceforge, GitHub and others. Furthermore, the contribution will discuss recent activities with tests and evaluations of High Performance Computing (HPC) applications on different hardware and software stacks, and plans to offer a dynamically scalable HPC service at CERN, based on affordable hardware.
Propulsion/flight control integration technology (PROFIT) software system definition
NASA Technical Reports Server (NTRS)
Carlin, C. M.; Hastings, W. J.
1978-01-01
The Propulsion Flight Control Integration Technology (PROFIT) program is designed to develop a flying testbed dedicated to controls research. The control software for PROFIT is defined. Maximum flexibility, needed for long-term use of the flight facility, is achieved through a modular design. The Host program processes inputs from the telemetry uplink, aircraft central computer, cockpit computer control, and plant sensors to form an input data base for use by the control algorithms. The control algorithms, programmed as application modules, process the input data to generate an output data base. The Host program formats the data for output to the telemetry downlink, the cockpit computer control, and the control effectors. Two application modules are defined: the bill-of-materials F-100 engine control and the bill-of-materials F-15 inlet control.
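A schematic sketch of the Host/application-module data flow described above (my illustration; the module and field names are hypothetical, and the control law is a placeholder, not the F-100 algorithm):

```python
# Host builds an input data base, each control module maps it to outputs,
# and the Host routes the output data base to downlink and effectors.
from typing import Callable

Module = Callable[[dict], dict]

def engine_control(inputs: dict) -> dict:
    return {"fuel_cmd": 0.5 * inputs["throttle"]}  # placeholder control law

def host_cycle(sensors: dict, uplink: dict, modules: list[Module]) -> dict:
    input_db = {**sensors, **uplink}          # Host: assemble input data base
    output_db: dict = {}
    for module in modules:                    # application modules run in turn
        output_db.update(module(input_db))
    return output_db                          # Host: format for downlink/effectors

print(host_cycle({"throttle": 0.8}, {}, [engine_control]))
```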
Automated Micro-Object Detection for Mobile Diagnostics Using Lens-Free Imaging Technology
Roy, Mohendra; Seo, Dongmin; Oh, Sangwoo; Chae, Yeonghun; Nam, Myung-Hyun; Seo, Sungkyu
2016-01-01
Lens-free imaging technology has been extensively used recently for microparticle and biological cell analysis because of its high throughput, low cost, and simple and compact arrangement. However, this technology still lacks a dedicated and automated detection system. In this paper, we describe a custom-developed automated micro-object detection method for a lens-free imaging system. In our previous work (Roy et al.), we developed a lens-free imaging system using low-cost components. This system was used to generate and capture the diffraction patterns of micro-objects, and a global threshold was used to locate the diffraction patterns. In this work, we used the same setup to develop an improved automated detection and analysis algorithm based on adaptive thresholding and clustering of signals. For this purpose, images from the lens-free system were used to understand the features and characteristics of the diffraction patterns of several types of samples. On the basis of this information, we developed an automated algorithm for the lens-free imaging system, and all the lens-free images were then processed with it. The performance of this approach was evaluated by comparing the counting results with standard optical microscope results. We evaluated the counting results for polystyrene microbeads, red blood cells, and HepG2, HeLa, and MCF7 cell lines. The comparison shows good agreement between the systems, with a correlation coefficient of 0.91 and a linearity slope of 0.877. We also evaluated the automated size profiles of the microparticle samples. This Wi-Fi-enabled lens-free imaging system, along with the dedicated software, possesses great potential for telemedicine applications in resource-limited settings. PMID:27164146
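A hedged sketch of the described pipeline, adaptive thresholding followed by clustering of connected pixels (parameter values and the toy image are hypothetical, not the authors' code):

```python
# Adaptive threshold: compare each pixel to its local mean; then cluster the
# detected pixels into per-object labels for counting and sizing.
import numpy as np
from scipy import ndimage

def detect(img: np.ndarray, block: int = 32, offset: float = 0.05):
    """img: 2D float array in [0, 1]; returns (count, label image)."""
    local_mean = ndimage.uniform_filter(img, size=block)
    mask = img < (local_mean - offset)     # diffraction patterns are darker locally
    labels, count = ndimage.label(mask)    # cluster connected pixels into objects
    return count, labels

rng = np.random.default_rng(1)
img = rng.normal(0.8, 0.02, (256, 256))
img[100:110, 100:110] = 0.4                # toy "diffraction pattern"
count, labels = detect(img)
print(count)
```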
Visual exploration and analysis of ionospheric scintillation monitoring data: The ISMR Query Tool
NASA Astrophysics Data System (ADS)
Vani, Bruno César; Shimabukuro, Milton Hirokazu; Galera Monico, João Francisco
2017-07-01
Ionospheric scintillations are rapid variations in the phase and/or amplitude of a radio signal as it passes through ionospheric plasma irregularities. The ionosphere is a layer of the Earth's atmosphere located approximately between 50 km and 1000 km above the Earth's surface. As Global Navigation Satellite Systems (GNSS) - such as GPS, Galileo, BDS and GLONASS - use radio signals, these variations degrade their positioning service quality. Due to its location, Brazil is one of the places most affected by scintillation in the world. For that reason, ionosphere monitoring stations have been deployed over Brazilian territory since 2011 through cooperative projects between several institutions in Europe and Brazil. Such monitoring stations compose a network that generates a large amount of monitoring data every day. GNSS receivers deployed at these stations - named Ionospheric Scintillation Monitor Receivers (ISMR) - provide scintillation indices and related signal metrics for available satellites dedicated to satellite-based navigation and positioning services. With this monitoring infrastructure, more than ten million observation values are generated and stored every day. Extracting the relevant information from this huge amount of data was a hard process and required the expertise of computer scientists and geoscientists. This paper describes the concepts, design and implementation aspects of the software that has been supporting research on ISMR data - the so-called ISMR Query Tool. Usability and other aspects are also presented via examples of application. This web-based software has been designed and developed to enable insights into the huge amount of ISMR data fetched every day on an integrated platform. The software applies and adapts time-series mining and information visualization techniques to extend the possibilities for exploring and analyzing ISMR data. The software is available to the scientific community through the World Wide Web, therefore constituting an analysis infrastructure that complements the monitoring one, providing support for research on ionospheric scintillation in the GNSS context. Interested researchers can access the functionalities without cost at http://is-cigala-calibra.fct.unesp.br/, upon online request to the Space Geodesy Study Group of UNESP - Univ Estadual Paulista at Presidente Prudente.
Learning GIS and exploring geolocated data with the all-in-one Geolokit toolbox for Google Earth
NASA Astrophysics Data System (ADS)
Watlet, A.; Triantafyllou, A.; Bastin, C.
2016-12-01
GIS software packages are today's essential tools for gathering and visualizing geological data, applying spatial and temporal analysis and, finally, creating and sharing interactive maps for further investigations in geosciences. Such skills are especially essential for students who go through field trips, sample collections or field experiments. However, time is generally lacking to teach in detail all the aspects of visualizing geolocated geoscientific data. For these purposes, we developed Geolokit: a lightweight freeware dedicated to geodata visualization and written in Python, a high-level, cross-platform programming language. Geolokit is accessible through a graphical user interface designed to run in parallel with Google Earth, benefitting from its numerous interactive capabilities. It is designed as a very user-friendly toolbox that allows 'geo-users' to import their raw data (e.g. GPS, sample locations, structural data, field pictures, maps), to use fast data analysis tools and to visualize these in the Google Earth environment using KML code, with no third-party software required except Google Earth itself. Geolokit comes with a large number of geoscience labels, symbols, colours and placemarks and can display several types of geolocated data, including:
- multi-point datasets;
- automatically computed contours of multi-point datasets via several interpolation methods;
- discrete planar and linear structural geology data in 2D or 3D, supporting a large range of structural input formats;
- clustered stereonets and rose diagrams;
- 2D cross-sections as vertical sections;
- georeferenced maps and grids with user-defined coordinates;
- field pictures, using either geo-tracking metadata from a camera's built-in GPS module or the same-day track of an external GPS.
In the end, Geolokit is helpful for quickly visualizing and exploring data without losing too much time in the numerous capabilities of GIS software suites. We are looking for students and teachers to discover all the functionalities of Geolokit. As this project is under development and planned to be open source, we welcome discussions regarding particular needs or ideas, and contributions to the Geolokit project.
Yang, Deshan; Brame, Scott; El Naqa, Issam; Aditya, Apte; Wu, Yu; Murty Goddu, S.; Mutic, Sasa; Deasy, Joseph O.; Low, Daniel A.
2011-01-01
Purpose: Recent years have witnessed tremendous progress in image-guided radiotherapy technology and a growing interest in the possibilities for adapting treatment planning and delivery over the course of treatment. One obstacle faced by the research community has been the lack of a comprehensive open-source software toolkit dedicated to adaptive radiotherapy (ART). To address this need, the authors have developed a software suite called the Deformable Image Registration and Adaptive Radiotherapy Toolkit (DIRART). Methods: DIRART is an open-source toolkit developed in MATLAB. It is designed in an object-oriented style with a focus on user-friendliness, features, and flexibility. It contains four classes of DIR algorithms, including the newer inverse consistency algorithms that provide a consistent displacement vector field in both directions. It also contains common ART functions, an integrated graphical user interface, a variety of visualization and image-processing features, dose metric analysis functions, and interface routines. These interface routines make DIRART a powerful complement to the Computational Environment for Radiotherapy Research (CERR) and popular image-processing toolkits such as ITK. Results: DIRART provides a set of image processing/registration algorithms and postprocessing functions to facilitate the development and testing of DIR algorithms. It also offers a wide range of options for DIR result visualization, evaluation, and validation. Conclusions: By exchanging data with treatment planning systems via DICOM-RT files and CERR, and by bringing image registration algorithms closer to radiotherapy applications, DIRART is potentially a convenient and flexible platform that may facilitate ART and DIR research. PMID:21361176
NASA Astrophysics Data System (ADS)
Marino, Alessandra; Ludovisi, Giancarlo; Moccaldi, Antonio; Damiani, Fiorenzo
2001-02-01
The aim of this paper is to outline the potential of imaging spectroscopy and GIS techniques as tools for the management of data-rich environments, such as complex fluvial areas exposed to geological, geomorphological, and hydrogeological risks. The area of study, the Pescara River Basin, is characterized by the presence of important industrial sites and by the occurrence of floods, landslides and seismic events. Data were collected, during a specific flight, using a hyperspectral MIVIS sensor. Images have been processed in order to obtain updated and accurate land-cover and land-use maps, which have been inserted in a specific GIS database and integrated with further information such as lithology, geological structure, geomorphology, hydrogeological features, and the location and character of productive plants. The processing of data layers was performed, using dedicated software, through typical GIS operators like indexing, recording, matrix analysis, and proximity analysis. The interactions between natural risks, industrial installations, agricultural areas, water resources and urban settlements have been analyzed. This allowed the creation and processing of thematic layers like vulnerability, risk and impact maps.
Economou, Anastasios; Voulgaropoulos, Anastasios
2003-01-01
The development of a dedicated automated sequential-injection analysis apparatus for anodic stripping voltammetry (ASV) and adsorptive stripping voltammetry (AdSV) is reported. The instrument comprised a peristaltic pump, a multiposition selector valve and a home-made potentiostat and used a mercury-film electrode as the working electrodes in a thin-layer electrochemical detector. Programming of the experimental sequence was performed in LabVIEW 5.1. The sequence of operations included formation of the mercury film, electrolytic or adsorptive accumulation of the analyte on the electrode surface, recording of the voltammetric current-potential response, and cleaning of the electrode. The stripping step was carried out by applying a square-wave (SW) potential-time excitation signal to the working electrode. The instrument allowed unattended operation since multiple-step sequences could be readily implemented through the purpose-built software. The utility of the analyser was tested for the determination of copper(II), cadmium(II), lead(II) and zinc(II) by SWASV and of nickel(II), cobalt(II) and uranium(VI) by SWAdSV. PMID:18924623
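To make the excitation signal concrete, here is a sketch of generating a square-wave voltammetry potential ramp, a staircase with a superimposed square wave (parameter values are typical textbook numbers, not necessarily the instrument's settings, and this is my illustration rather than the paper's LabVIEW code):

```python
# Square-wave voltammetry excitation: a staircase potential ramp with a
# forward/reverse pulse superimposed on each step.
import numpy as np

def squarewave_potential(e_start=-1.2, e_end=0.0, step=0.004,
                         amplitude=0.025, points_per_half=10):
    """Return the potential-time series for a SW scan (volts)."""
    out = []
    n_steps = int(round((e_end - e_start) / step))
    for i in range(n_steps):
        base = e_start + i * step
        out += [base + amplitude] * points_per_half   # forward pulse
        out += [base - amplitude] * points_per_half   # reverse pulse
    return np.array(out)

wave = squarewave_potential()
```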
Analysis of satellite multibeam antennas’ performances
NASA Astrophysics Data System (ADS)
Sterbini, Guido
2006-07-01
In this work, we discuss the application of the concept of frequency reuse in satellite communications, stressing the importance of a design-oriented mathematical model as a first step in dimensioning antenna systems. We consider multibeam reflector antennas. The first part of the work consists in reorganizing, unifying and completing the models already developed in the scientific literature. In doing so, we adopt the multidimensional Taylor development formalism. For computing the spillover efficiency of the antenna, we consider different feed illuminations and we propose a completely original mathematical model, obtained by interpolation of simulator results. The second part of the work is dedicated to characterizing the secondary far-field pattern. Combining this model with information on the cellular coverage geometry makes it possible to evaluate the isolation and the minimum directivity on the cell. In the third part, in order to test the model and its analysis and synthesis capabilities, we implement a software tool that helps the designer in the rapid tuning of the fundamental quantities for optimization of the performance: the proposed model shows excellent agreement with the results of the simulations.
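For orientation, the spillover efficiency has a closed form under the common idealized cos-power feed pattern (a standard textbook result stated here for context; it is not the interpolated model the paper proposes):

```latex
% Spillover efficiency for an idealized feed pattern
% G_f(\theta) = 2(n+1)\cos^{2n}\theta illuminating a reflector that
% subtends the half-angle \theta_0 from the feed:
\eta_s \;=\; \frac{\displaystyle\int_0^{\theta_0} G_f(\theta)\,\sin\theta\,d\theta}
                  {\displaystyle\int_0^{\pi/2} G_f(\theta)\,\sin\theta\,d\theta}
       \;=\; 1 - \cos^{2n+1}\theta_0 .
```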
NASA Astrophysics Data System (ADS)
Fuc, Pawel; Lijewski, Piotr; Ziolkowski, Andrzej; Dobrzyński, Michal
2017-05-01
An analysis of the energy balance for the exhaust system of a diesel engine fitted with an automotive thermoelectric generator (ATEG) of our own design has been carried out. A special measurement system and dedicated software were developed to measure the power generated by the modules. The research object was a 1.3-l small diesel engine with a power output of 66 kW. The tests were carried out on a dynamic engine test bed that allows reproduction of an actual driving cycle expressed as a function V = f( t), simulating drivetrain (clutch, transmission) operating characteristics, vehicle geometrical parameters, and driver behavior. Measurements of exhaust gas thermodynamic parameters (temperature, pressure, and mass flow), as well as the voltage and current generated by the thermoelectric modules, were performed during the tests. Based on the results obtained, the flow of exhaust gas energy in the entire exhaust system was determined along with the ATEG power output. The optimal area of the exhaust system for locating the ATEG was identified so as to ensure the highest thermal energy recovery efficiency.
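A back-of-envelope version of the energy-balance bookkeeping described above (all numbers are hypothetical; this is my illustration, not the authors' measurement software):

```python
# Thermal power available in the exhaust stream versus electrical power
# recovered by the ATEG modules.
def exhaust_thermal_power(mdot_kg_s: float, t_exh_c: float,
                          t_amb_c: float = 25.0, cp_j_kgk: float = 1100.0) -> float:
    """Thermal power (W) carried by the exhaust above ambient."""
    return mdot_kg_s * cp_j_kgk * (t_exh_c - t_amb_c)

p_thermal = exhaust_thermal_power(mdot_kg_s=0.05, t_exh_c=350.0)  # ~17.9 kW
p_electric = sum(v * i for v, i in [(4.2, 1.1), (3.9, 1.0)])      # module V*I readings
print(f"{p_thermal:.0f} W thermal, {p_electric:.1f} W electric, "
      f"eta={p_electric / p_thermal:.2%}")
```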
The Use of Twitter by Radiology Journals: An Analysis of Twitter Activity and Impact Factor.
Kelly, Brendan S; Redmond, Ciaran E; Nason, Gregory J; Healy, Gerard M; Horgan, Niall A; Heffernan, Eric J
2016-11-01
Medical journals use social media as a means to disseminate new research and interact with readers. The microblogging site Twitter is one such platform. The aim of this study was to analyze the recent use of Twitter by the leading radiology journals. The top 50 journals by Impact Factor were included. Twitter profiles associated with these journals, or their corresponding societies, were identified. Whether each journal used other social media platforms was also recorded. Each Twitter profile was analyzed over a one-year period, with data collected via Twitonomy software. Klout scores of social media influence were calculated. Results were analyzed in SPSS using Student's t test, Fisher contingency tables, and Pearson correlations to identify any association between social media interaction and Impact Factors of journals. Fourteen journals (28%) had dedicated Twitter profiles. Of the 36 journals without dedicated Twitter profiles, 25 (50%) were associated with societies that had profiles, leaving 11 (22%) journals without a presence on Twitter. The mean Impact Factor of all journals was 3.1 ± 1.41 (range, 1.7-6.9). Journals with Twitter profiles had higher Impact Factors than those without (mean, 3.37 vs 2.14; P < .001). There was no statistically significant difference between the Impact Factors of the journals with dedicated Twitter profiles and those associated with affiliated societies (P = .47). Since joining Twitter, 7 of the 11 journals (64%) experienced increases in Impact Factor. A greater number of Twitter followers was correlated with higher journal Impact Factor (R² = 0.581, P = .029). The investigators assessed the prevalence and activity of the leading radiology journals on Twitter. Radiology journals with Twitter profiles have higher Impact Factors than those without profiles, and the number of followers of a journal's Twitter profile is positively associated with Impact Factor. Copyright © 2016 American College of Radiology. Published by Elsevier Inc. All rights reserved.
The November 1, 2017 issue of Cancer Research is dedicated to a collection of computational resource papers in genomics, proteomics, animal models, imaging, and clinical subjects for non-bioinformaticists looking to incorporate computing tools into their work. Scientists at Pacific Northwest National Laboratory have developed P-MartCancer, an open, web-based interactive software tool that enables statistical analyses of peptide or protein data generated from mass-spectrometry (MS)-based global proteomics experiments.
Microcomputer programming skills
NASA Technical Reports Server (NTRS)
Barth, C. W.
1979-01-01
Some differences in the skills and techniques required for the conversion from programmer to microprogrammer are discussed. The primary areas with which the programmer must become familiar are hardware architecture, hardware/software trade-offs, and interfacing. The biggest differences, however, will stem more from differences in applications than from differences in machine size. The change to real-time programming is the most important of these differences, particularly on dedicated microprocessors. Another primary change is programming with a more computer-naive user in mind, and dealing with that user's limitations and expectations.
Bandit: Technologies for Proximity Operations of Teams of Sub-10Kg Spacecraft
2007-10-16
and adding a dedicated overhead camera system. As will be explained below, the forced-air system did not work and the existing system has proven too ... erratic to justify the expense of the camera system. 6DOF Software Simulator. The existing Java-based graphical 6DOF simulator was to be improved for ... proposed camera system for a nonfunctional table. The C-9 final report is enclosed. [Figure captions: Figure 1. Forced-air table schematic; Figure 2.]
Electromagnetic Imaging Methods for Nondestructive Evaluation Applications
Deng, Yiming; Liu, Xin
2011-01-01
Electromagnetic nondestructive tests are important and widely used within the field of nondestructive evaluation (NDE). Recent advances in sensing technology, in hardware and software development dedicated to imaging and image processing, and in materials science have greatly expanded the application fields, made systems design more sophisticated, and rendered the potential of electromagnetic NDE imaging seemingly unlimited. This review provides a comprehensive summary of research work on electromagnetic imaging methods for NDE applications, followed by a summary and discussion of future directions. PMID:22247693
AVIRIS ground data-processing system
NASA Technical Reports Server (NTRS)
Reimer, John H.; Heyada, Jan R.; Carpenter, Steve C.; Deich, William T. S.; Lee, Meemong
1987-01-01
The Airborne Visible/Infrared Imaging Spectrometer (AVIRIS) has been under development at JPL for the past four years. During this time, a dedicated ground data-processing system has been designed and implemented to store and process the large amounts of data expected. This paper reviews the objectives of this ground data-processing system and describes the hardware. An outline of the data flow through the system is given, and the software and incorporated algorithms developed specifically for the systematic processing of AVIRIS data are described.
Architectural design proposal for real time clock for wireless microcontroller unit
NASA Astrophysics Data System (ADS)
Alias, Muhammad Nor Azwan Mohd; Nizam Mohyar, Shaiful
2017-11-01
In this project, we develop an intellectual property (IP) core: a dedicated real-time clock (RTC) system for a wireless microcontroller. The IP is developed in Verilog Hardware Description Language (Verilog HDL) and simulated using Quartus II and Synopsys software. The RTC will be used in microcontroller systems to provide the precise time and date required by various applications, and it plays an important role in real-time systems such as digital clocks, attendance systems, and digital cameras.
Modular space vehicle boards, control software, reprogramming, and failure recovery
DOE Office of Scientific and Technical Information (OSTI.GOV)
Judd, Stephen; Dallmann, Nicholas; McCabe, Kevin
A space vehicle may have a modular board configuration that commonly uses some or all components and a common operating system for at least some of the boards. Each modular board may have its own dedicated processing, and processing loads may be distributed. The space vehicle may be reprogrammable, and may be launched without code that enables all functionality and/or components. Code errors may be detected and the space vehicle may be reset to a working code version to prevent system failure.
IRIS COLOUR CLASSIFICATION SCALES – THEN AND NOW
Grigore, Mariana; Avram, Alina
2015-01-01
Eye colour is one of the most obvious phenotypic traits of an individual. Since the first documented classification scale, developed in 1843, there have been numerous attempts to classify iris colour. In past centuries, iris colour classification scales have had various colour categories and mostly relied on comparison of an individual's eye with painted glass eyes. Once photography techniques were refined, standard iris photographs replaced painted eyes, but this did not solve the problem of painted/printed colour variability over time. Early clinical scales were easy to use, but lacked objectivity and were not standardised or statistically tested for reproducibility. The era of automated iris colour classification systems came with technological development. Spectrophotometry, digital analysis of high-resolution iris images, hyperspectral analysis of the real human iris, and dedicated iris colour analysis software have all achieved objective, accurate iris colour classification, but they are quite expensive and their use is limited to research environments. Iris colour classification systems have evolved continuously owing to their use in a wide range of studies, especially in the fields of anthropology, epidemiology, and genetics. Despite the wide range of existing scales, up to the present there has been no generally accepted iris colour classification scale. PMID:27373112
The Shock and Vibration Digest. Volume 16, Number 4
1984-04-01
The 2nd International Modal Analysis Conference, which was held in Orlando, Florida, this past February, was highly successful in all respects. A ... announcement of the formation of a new technical society dedicated to advancing modal analysis technology, the International Society for Modal Testing and Analysis. This new society is unique in two respects. First, it is dedicated to a specific branch of a specialized technical field. Second, it is a ...
Internet teleconferencing as a clinical tool for anesthesiologists.
Ruskin, K J; Palmer, T E; Hagenouw, R R; Lack, A; Dunnill, R
1998-04-01
Internet teleconferencing software can be used to hold "virtual" meetings, during which participants around the world can share ideas. A core group of anesthetic medical practitioners, largely consisting of the Society for Advanced Telecommunications in Anesthesia (SATA), has begun to hold regularly scheduled "virtual grand rounds." This paper examines currently available software and offers impressions of our own early experiences with this technology. Two teleconferencing systems have been used: White Pine Software CU-SeeMe and Microsoft NetMeeting. While both provided acceptable results, each had specific advantages and disadvantages. CU-SeeMe is easier to use when conferences include more than two participants. NetMeeting provides higher quality audio and video signals under crowded network conditions, and is better for conferences with only two participants. While some effort is necessary to get these teleconferencing systems to work well, we have been using desktop conferencing for six months to hold virtual Internet meetings. The sound and video images produced by Internet teleconferencing software are inferior to dedicated point-to-point teleconferencing systems. However, low cost, wide availability, and ease of use make this technology a potentially valuable tool for clinicians and researchers.
SCOSII OL: A dedicated language for mission operations
NASA Technical Reports Server (NTRS)
Baldi, Andrea; Elgaard, Dennis; Lynenskjold, Steen; Pecchioli, Mauro
1994-01-01
The Spacecraft Control and Operations System 2 (SCOSII) is the new generation of Mission Control Systems (MCS) to be used at ESOC. The system is generic because it offers a collection of standard functions configured through a database, upon which a dedicated MCS is established for a given mission. An integral component of SCOSII is the support of a dedicated Operations Language (OL). The spacecraft operations engineers edit, test, validate, and install OL scripts as part of the configuration of the system with, e.g., expressions for computing derived parameters and procedures for performing flight operations, all without the involvement of software support engineers. A layered approach has been adopted for the implementation, centered around the explicit representation of a data model. The data model is object-oriented, defining the structure of the objects in terms of attributes (data) and services (functions) which can be accessed by the OL. SCOSII supports the creation of a mission model: system elements such as a gyro are explicit, as are the attributes that describe them and the services they provide. The data-model-driven approach makes it possible to take immediate advantage of this higher level of abstraction without requiring expansion of the language. This article describes the background and context leading to the OL, its concepts, language facilities, implementation, status, and the conclusions reached so far.
SDN solutions for switching dedicated long-haul connections: Measurements and comparative analysis
Rao, Nageswara S. V.
2016-01-01
We consider a scenario of two sites connected over a dedicated, long-haul connection that must quickly fail over in response to degradations in host-to-host application performance. Traditional layer-2/3 hot-standby fail-over solutions do not adequately address the variety of application degradations, and more recent single-controller Software Defined Networking (SDN) solutions are not effective for long-haul connections. We present two methods for such a path fail-over using OpenFlow-enabled switches: (a) a light-weight method that utilizes host scripts to monitor application performance and the dpctl API for switching, and (b) a generic method that uses two OpenDaylight (ODL) controllers and REST interfaces. For both methods, the restoration dynamics of applications contain significant statistical variations due to the complexities of controllers, northbound interfaces, and switches; these variations, together with the wide variety of vendor implementations, complicate the choice among such solutions. We develop an impulse-response method based on regression functions of performance parameters to provide a rigorous and objective comparison of different solutions. We describe testing results for the two proposed methods, using TCP throughput and connection RTT as the main parameters, over a testbed consisting of HP and Cisco switches connected over long-haul connections emulated in hardware by ANUE devices. Lastly, the combination of analytical and experimental results demonstrates that the dpctl method responds seconds faster on average than the ODL method, even though both methods eventually restore the original TCP throughput.
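The light-weight method (a) couples a host-side performance monitor with a switch-side flow update. A minimal sketch of that control loop follows; the throughput probe, threshold, and the switching script are placeholders, since the real dpctl flow modification depends on the testbed's switches and port wiring:

```python
# Sketch of a host-script fail-over loop: watch end-to-end throughput
# and repoint the dedicated connection to a standby path when it
# degrades. Fail-over only; fail-back logic is omitted.
import json
import subprocess
import time

THROUGHPUT_FLOOR_MBPS = 500.0   # assumed degradation threshold
PRIMARY, STANDBY = "primary", "standby"

def measure_throughput_mbps():
    """Short iperf3 probe to the peer host (field names per iperf3 -J)."""
    out = subprocess.run(["iperf3", "-c", "peer-host", "-t", "2", "-J"],
                         capture_output=True, text=True, check=True)
    bps = json.loads(out.stdout)["end"]["sum_received"]["bits_per_second"]
    return bps / 1e6

def switch_path(path):
    """Placeholder: the real script issues a dpctl flow modification
    that steers traffic onto the named path."""
    subprocess.run(["/usr/local/bin/failover-to-" + path + ".sh"], check=True)

active = PRIMARY
while True:
    if active == PRIMARY and measure_throughput_mbps() < THROUGHPUT_FLOOR_MBPS:
        switch_path(STANDBY)   # seconds-scale reaction, per the comparison above
        active = STANDBY
    time.sleep(5)
```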
Intelligent Urban Public Transportation for Accessibility Dedicated to People with Disabilities
Zhou, Haiying; Hou, Kun-Mean; Zuo, Decheng; Li, Jian
2012-01-01
The traditional urban public transport system generally cannot provide an effective access service for people with disabilities, especially for disabled, wheelchair and blind (DWB) passengers. In this paper, based on advanced information and communication technologies (ICT) and green technologies (GT) concepts, a dedicated public urban transportation service access system named Mobi+ is introduced, which facilitates the mobility of DWB passengers. The Mobi+ project consists of three subsystems: a wireless communication subsystem, which provides the data exchange and network connection services between buses and stations in complex urban environments; the bus subsystem, which provides the DWB class detection and bus arrival notification services; and the station subsystem, which implements the urban environmental surveillance and bus auxiliary access services. The Mobi+ card, which supports multiple microcontrollers and transceivers, adopts a fault-tolerant component-based hardware architecture into which dedicated embedded system software, i.e., an operating-system micro-kernel and a wireless protocol, has been integrated. The dedicated Mobi+ embedded system provides a fault-tolerant resource-aware communication and scheduling mechanism to ensure reliability in data exchange and service provision. At present, the Mobi+ system has been implemented on the buses and stations of line ‘2’ in the city of Clermont-Ferrand (France). The experimental results show that, on the one hand, the Mobi+ prototype system meets the design expectations and provides an effective urban bus access service for people with disabilities; on the other hand, the Mobi+ system is easy to deploy on buses and at bus stations thanks to its low energy consumption and small form factor. PMID:23112622
Ruokolainen, Mervi; Mauno, Saija; Cheng, Ting
2014-11-01
To examine the moderating roles of job dedication and age in the job insecurity-family-related well-being relationship. As job insecurity is a rather permanent stressor among nurses nowadays, more research is needed on the buffering factors alleviating its negative effects on well-being. A total of 1719 Finnish nurses representing numerous health care organisations participated in this cross-sectional study. Moderated hierarchical regression analysis was used to examine the associations. Nurses' younger age and low job dedication operated as protective factors against the negative effect of high job insecurity on parental satisfaction. The effect of job dedication on family-related well-being was also age-specific: high job dedication protected younger nurses from the negative effect of job insecurity on work-family conflict and parental stress, whereas among older nurses those who reported low job dedication showed better well-being in the presence of high job insecurity. The most job-dedicated nurses were more vulnerable to job insecurity in relation to parental satisfaction. In addition, high job dedication combined with high age implied more work-family conflict and parental stress in the presence of high job insecurity. Managers should seek to boost younger nurses' job dedication and to prevent older nurses' over-commitment. © 2013 John Wiley & Sons Ltd.
Bi-Force: large-scale bicluster editing and its application to gene expression data biclustering.
Sun, Peng; Speicher, Nora K; Röttger, Richard; Guo, Jiong; Baumbach, Jan
2014-05-01
The explosion of biological data has dramatically transformed today's biological research. The need to integrate and analyze high-dimensional biological data on a large scale is driving the development of novel bioinformatics approaches. Biclustering, also known as 'simultaneous clustering' or 'co-clustering', has been successfully utilized to discover local patterns in gene expression data and similar biomedical data types. Here, we contribute a new heuristic, 'Bi-Force', based on the weighted bicluster editing model, to perform biclustering on arbitrary sets of biological entities, given any kind of pairwise similarities. We first evaluated the power of Bi-Force to solve dedicated bicluster editing problems by comparing Bi-Force with two existing algorithms in the BiCluE software package. We then followed the biclustering evaluation protocol of a recent review by Eren et al. (2013) (A comparative analysis of biclustering algorithms for gene expression data. Brief. Bioinform., 14:279-292) and compared Bi-Force against eight existing tools: FABIA, QUBIC, Cheng and Church, Plaid, BiMax, Spectral, xMOTIFs and ISA. To this end, a suite of synthetic datasets as well as nine large gene expression datasets from Gene Expression Omnibus were analyzed. All resulting biclusters were subsequently investigated by Gene Ontology enrichment analysis to evaluate their biological relevance. The distinct theoretical foundation of Bi-Force (bicluster editing) is more powerful than strict biclustering, and Bi-Force thus outperformed existing tools, at least when following the evaluation protocols of Eren et al. Bi-Force is implemented in Java and integrated into the open-source software package BiCluE. The software as well as all used datasets are publicly available at http://biclue.mpi-inf.mpg.de. © The Author(s) 2014. Published by Oxford University Press on behalf of Nucleic Acids Research.
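To make the weighted bicluster editing model concrete, here is a toy sketch of its cost function (our own illustration, not Bi-Force's heuristic): editing a weighted bipartite similarity graph into disjoint bicliques pays |w| for deleting a positive-weight edge between clusters and |w| for inserting a missing (negative-weight) edge within a cluster.

```python
# Toy cost function for weighted bicluster editing: given pairwise
# similarities w[r][c] (positive = evidence for co-membership) and a
# proposed assignment of rows and columns to biclusters, the editing
# cost is the total |weight| of edges that must be deleted (positive
# weight across clusters) or inserted (negative weight within one).
def editing_cost(weights, row_cluster, col_cluster):
    cost = 0.0
    for r, row in enumerate(weights):
        for c, w in enumerate(row):
            same = row_cluster[r] == col_cluster[c]
            if same and w < 0:        # must insert a missing edge
                cost += -w
            elif not same and w > 0:  # must delete a present edge
                cost += w
    return cost

# 3 genes x 3 conditions, one clean bicluster {g0,g1} x {c0,c1}:
w = [[ 2.0,  1.5, -1.0],
     [ 1.0,  2.5, -0.5],
     [-2.0, -1.0,  3.0]]
print(editing_cost(w, row_cluster=[0, 0, 1], col_cluster=[0, 0, 1]))  # 0.0
```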
Strauss, Ludwig G; Pan, Leyun; Cheng, Caixia; Haberkorn, Uwe; Dimitrakopoulou-Strauss, Antonia
2011-03-01
(18)F-FDG kinetics are quantified by a 2-tissue-compartment model. The routine use of dynamic PET is limited because of this modality's 1-h acquisition time. We evaluated shortened acquisition protocols of up to 0-30 min with respect to the accuracy of data analysis with the 2-tissue-compartment model. Full dynamic series for 0-60 min were analyzed using a 2-tissue-compartment model, and the time-activity curves and the resulting model parameters were stored in a database. Shortened acquisition data were generated from the database using the following time intervals: 0-10, 0-16, 0-20, 0-25, and 0-30 min. Furthermore, the impact of adding a 60-min uptake value to the dynamic series was evaluated. The datasets were analyzed using dedicated software to predict the results of the full dynamic series. The software is based on a modified support vector machines (SVM) algorithm and predicts the compartment parameters of the full dynamic series. The SVM-based software provides user-independent results and was accurate at predicting the compartment parameters of the full dynamic series. If a squared correlation coefficient of 0.8 (corresponding to 80% explained variance of the data) was used as a limit, a shortened acquisition of 0-16 min was sufficient to predict the 60-min 2-tissue-compartment parameters. If a limit of 0.9 (90% explained variance) was used, a dynamic series of at least 0-20 min together with the 60-min uptake value was required. Shortened acquisition protocols can thus be used to predict the parameters of the 2-tissue-compartment model: either a dynamic PET series of 0-16 min or a combination of a dynamic PET/CT series of 0-20 min and a 60-min uptake value is accurate enough for analysis with a 2-tissue-compartment model.
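The published predictor is a modified SVM proprietary to that workflow; as an illustration of the general idea (regressing full-series compartment parameters from a truncated time-activity curve), a sketch with scikit-learn follows, trained here on synthetic placeholder data:

```python
# Sketch: predict 2-tissue-compartment parameters fitted on the full
# 0-60 min series from a truncated (e.g. 0-16 min) time-activity curve.
# Training data are synthetic placeholders; the published tool used a
# modified SVM trained on a database of real dynamic series.
import numpy as np
from sklearn.multioutput import MultiOutputRegressor
from sklearn.svm import SVR

rng = np.random.default_rng(0)
n_cases, n_frames, n_params = 200, 16, 4      # 16 early frames -> K1,k2,k3,k4
X = rng.random((n_cases, n_frames))           # truncated TAC samples
true_map = rng.random((n_frames, n_params))   # synthetic frame->parameter map
Y = X @ true_map + 0.01 * rng.standard_normal((n_cases, n_params))

model = MultiOutputRegressor(SVR(kernel="rbf", C=10.0))
model.fit(X[:150], Y[:150])
pred = model.predict(X[150:])
r2 = 1 - ((Y[150:] - pred) ** 2).sum() / ((Y[150:] - Y[150:].mean(0)) ** 2).sum()
print(f"explained variance on held-out cases: {r2:.2f}")
```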
Web-GIS platform for monitoring and forecasting of regional climate and ecological changes
NASA Astrophysics Data System (ADS)
Gordov, E. P.; Krupchatnikov, V. N.; Lykosov, V. N.; Okladnikov, I.; Titov, A. G.; Shulgina, T. M.
2012-12-01
The growing volume of environmental data from sensors and model outputs makes the development of a software infrastructure, based on modern information and telecommunication technologies, for the information support of integrated research in the Earth sciences an urgent and important task (Gordov et al., 2012; van der Wel, 2005). The inherent heterogeneity of datasets obtained from different sources and institutions not only hampers the interchange of data and analysis results but also complicates their intercomparison, reducing the reliability of analysis results. Modern geophysical data processing techniques, however, allow different technological solutions to be combined when organizing such information resources. It is now generally accepted that an information-computational infrastructure should rely on the combined use of web and GIS technologies for creating applied information-computational web systems (Titov et al., 2009; Gordov et al., 2010; Gordov, Okladnikov and Titov, 2011). Using these approaches for the development of internet-accessible thematic information-computational systems, and arranging data and knowledge interchange between them, is a very promising way to create a distributed information-computational environment supporting multidisciplinary regional and global research in the Earth sciences, including analysis of climate changes and their impact on the spatial-temporal distribution and state of vegetation. We present an experimental software and hardware platform supporting the operation of a web-oriented production and research center for regional climate change investigations, which combines a modern Web 2.0 approach, GIS functionality, and capabilities for running climate and meteorological models, processing large geophysical datasets, visualization, joint software development by distributed research groups, scientific analysis, and the education of students and post-graduate students. The platform software developed (Shulgina et al., 2012; Okladnikov et al., 2012) includes dedicated modules for the numerical processing of regional and global modeling results for subsequent analysis and visualization. Data preprocessing, runs, and visualization of results of the WRF and «Planet Simulator» models integrated into the platform are also provided. All functions of the center are accessible to users through a web portal from a common graphical web browser, in the form of an interactive graphical user interface which provides, in particular, visualization of processing results, selection of a geographical region of interest (pan and zoom), and data-layer manipulation (order, enable/disable, feature extraction). The platform provides users with capabilities for heterogeneous geophysical data analysis, including high-resolution data, and for discovering tendencies in climatic and ecosystem changes in the framework of different multidisciplinary studies (Shulgina et al., 2011). Using it, even an unskilled user without specific knowledge can perform computational processing and visualization of large meteorological, climatological, and satellite monitoring datasets through a unified graphical web interface.
Neural classifier in the estimation process of maturity of selected varieties of apples
NASA Astrophysics Data System (ADS)
Boniecki, P.; Piekarska-Boniecka, H.; Koszela, K.; Zaborowicz, M.; Przybył, K.; Wojcieszak, D.; Zbytek, Z.; Ludwiczak, A.; Przybylak, A.; Lewicki, A.
2015-07-01
This paper presents methods of neural image analysis aimed at estimating the maturity of selected varieties of apples popular in Poland. The degree of maturity was identified on the basis of information encoded in graphical form in digital photographs. The process applies the BBCH scale, which is widely used in the EU to determine the maturity of apples; it has been developed for many species of monocotyledonous and dicotyledonous plants and enables a detailed determination of the development stage of a given plant. The purpose of this work is to identify the maturity level of selected varieties of apples with the support of image analysis methods and classification techniques based on artificial neural networks. Analysis of representative graphical features extracted by image analysis enabled the assessment of apple maturity. For practical use, the "JabVis 1.1" neural IT system was created in accordance with software engineering requirements, to support decision-making processes in the broadly understood production and processing of apples.
NASA Astrophysics Data System (ADS)
Ryazanova, A. A.; Okladnikov, I. G.; Gordov, E. P.
2017-11-01
The frequency of occurrence and the magnitude of extreme precipitation and temperature events show positive trends in several geographical regions. These events must be analyzed and studied in order to better understand their impact on the environment, predict their occurrence, and mitigate their effects. For this purpose, we augmented the web-GIS “CLIMATE” with a dedicated statistical package developed in the R language. The web-GIS “CLIMATE” is a software platform for cloud storage, processing, and visualization of distributed archives of spatial datasets. It is based on the combined use of web and GIS technologies with reliable procedures for searching, extracting, processing, and visualizing spatial data archives. The system provides a set of thematic online tools for the complex analysis of current and future climate changes and their effects on the environment. The package includes new powerful methods of time-dependent statistics of extremes, quantile regression, and a copula approach for the detailed analysis of various climate extreme events. In particular, the very promising copula approach allows one to obtain the structural connections between the extremes and various environmental characteristics. The new statistical methods integrated into the web-GIS “CLIMATE” can significantly facilitate and accelerate the complex analysis of climate extremes using only a desktop PC connected to the Internet.
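As a sketch of one of the methods mentioned, quantile regression of an extreme-value indicator against time can be written in a few lines (shown here in Python with statsmodels rather than the R package actually integrated into the platform; the data are synthetic placeholders):

```python
# Sketch of time-dependent extreme-value analysis via quantile
# regression: trend of the 95th percentile of daily maximum
# temperature over the years.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
years = np.repeat(np.arange(1980, 2020), 365)
tmax = 25 + 0.03 * (years - 1980) + 5 * rng.standard_normal(years.size)
df = pd.DataFrame({"year": years, "tmax": tmax})

res = smf.quantreg("tmax ~ year", df).fit(q=0.95)
print(res.params)   # slope ~ trend of the hot extremes, deg C per year
```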
Zhang, Fan; Briones, Andrea; Soloviev, Mikhail
2016-01-01
This chapter describes the principles of selecting antigenic peptides for the development of anti-peptide antibodies for use in microarray-based multiplex affinity assays and with mass-spectrometry detection. The methods described here are mostly applicable to small to medium scale arrays. Although the same principles of peptide selection would suit larger scale arrays (with 100+ features), the informatics software and printing methods may well differ. Because of the sheer number of proteins/peptides to be processed and analyzed, dedicated software capable of handling all the proteins, and enterprise-level array robotics, may be necessary for larger scale efforts. This report aims to provide practical advice to those who develop or use arrays with up to ~100 different peptide or protein features.
Manufacturing of ArF chromeless hard shifter for 65-nm technology
NASA Astrophysics Data System (ADS)
Park, Keun-Taek; Dieu, Laurent; Hughes, Greg P.; Green, Kent G.; Croffie, Ebo H.; Taravade, Kunal N.
2003-12-01
For logic design, the chromeless phase-shift mask is one of the possible solutions for defining small geometries with a low MEF (mask enhancement factor) at the 65-nm node. There have been many dedicated studies of PCO (Phase Chrome Off-axis) mask technology, and several design approaches have been proposed, including grating backgrounds and chrome patches (or chrome shields), for applying PCO to line/space and contact patterns. In this paper, we study the feasibility of grating designs for line and contact patterns. The design of the grating pattern was derived from EM simulation software (TEMPEST) and aerial image simulation software. AIMS measurements with high-NA annular illumination were performed, and resist images of the designed patterns were taken at different focus settings. Simulation and AIMS results were compared to verify the consistency of the process with wafer-printed performance.
The GenABEL Project for statistical genomics
Karssen, Lennart C.; van Duijn, Cornelia M.; Aulchenko, Yurii S.
2016-01-01
Development of free/libre open source software is usually done by a community of people with an interest in the tool. For scientific software, however, this is less often the case. Most scientific software is written by only a few authors, often a student working on a thesis. Once the paper describing the tool has been published, the tool is no longer developed further and is left to its own devices. Here we describe the broad, multidisciplinary community we formed around a set of tools for statistical genomics. The GenABEL project for statistical omics actively promotes open interdisciplinary development of statistical methodology and its implementation in efficient and user-friendly software under an open source licence. The software tools developed within the project collectively make up the GenABEL suite, which currently consists of eleven tools. The open framework of the project actively encourages involvement of the community in all stages, from the formulation of methodological ideas to the application of software to specific data sets. A web forum is used to channel user questions and discussions, further promoting the use of the GenABEL suite. Developer discussions take place on a dedicated mailing list, and development is further supported by robust development practices, including the use of public version control, code review, and continuous integration. Use of this open science model attracts contributions from users and developers outside the “core team”, facilitating agile statistical omics methodology development and fast dissemination. PMID:27347381
Operating a wide-area remote observing system for the W. M. Keck Observatory
NASA Astrophysics Data System (ADS)
Wirth, Gregory D.; Kibrick, Robert I.; Goodrich, Robert W.; Lyke, James E.
2008-07-01
For over a decade, the W. M. Keck Observatory's two 10-meter telescopes have been operated remotely from its Waimea headquarters. Over the last 6 years, WMKO remote observing has expanded to allow teams at dedicated sites in California to observe either in collaboration with colleagues in Waimea or entirely from the U.S. mainland. Once an experimental effort, the Observatory's mainland observing capability is now fully operational, supported on all science instruments (except the interferometer) and regularly used by astronomers at eight mainland sites. Establishing a convenient and secure observing capability from those sites required careful planning to ensure that they are properly equipped and configured. It also entailed a significant investment in hardware and software, including both custom scripts to simplify launching the instrument interface at remote sites and automated routers employing ISDN backup lines to ensure continuation of observing during Internet outages. Observers often wait until shortly before their runs to request use of the mainland facilities. Scheduling these requests and ensuring proper system operation prior to observing requires close coordination between personnel at WMKO and the mainland sites. An established protocol for approving requests and carrying out pre-run checkout has proven useful in ensuring success. The Observatory anticipates enhancing and expanding its remote observing system. Future plans include deploying dedicated summit computers for running VNC server software, implementing a web-based tracking system for mainland-based observing requests, expanding the system to additional mainland sites, and converting to full-time VNC operation for all instruments.
NASA Astrophysics Data System (ADS)
Swift, Jonathan J.; Bottom, Michael; Johnson, John A.; Wright, Jason T.; McCrady, Nate; Wittenmyer, Robert A.; Plavchan, Peter; Riddle, Reed; Muirhead, Philip S.; Herzig, Erich; Myles, Justin; Blake, Cullen H.; Eastman, Jason; Beatty, Thomas G.; Barnes, Stuart I.; Gibson, Steven R.; Lin, Brian; Zhao, Ming; Gardner, Paul; Falco, Emilio; Criswell, Stephen; Nava, Chantanelle; Robinson, Connor; Sliski, David H.; Hedrick, Richard; Ivarsen, Kevin; Hjelstrom, Annie; de Vera, Jon; Szentgyorgyi, Andrew
2015-04-01
The Miniature Exoplanet Radial Velocity Array (MINERVA) is a U.S.-based observational facility dedicated to the discovery and characterization of exoplanets around a nearby sample of bright stars. MINERVA employs a robotic array of four 0.7-m telescopes outfitted for both high-resolution spectroscopy and photometry, and is designed for completely autonomous operation. The primary science program is a dedicated radial velocity survey, and the secondary science objective is to obtain high-precision transit light curves. The modular design of the facility and the flexibility of our hardware allow both science programs to be pursued simultaneously, while the robotic control software provides a robust and efficient means to carry out nightly observations. We describe the design of MINERVA, including major hardware components, software, and science goals. The telescopes and photometry cameras are characterized at our test facility on the Caltech campus in Pasadena, California, and their on-sky performance is validated. The design and simulated performance of the spectrograph is briefly discussed as we await its completion. New observations from our test facility demonstrate sub-mmag photometric precision on one of our radial velocity survey targets, and we present new transit observations and fits of WASP-52b, a known hot Jupiter with an inflated radius and a misaligned orbit. The process of relocating the MINERVA hardware to its final destination at the Fred Lawrence Whipple Observatory in southern Arizona has begun, and science operations are expected to commence in 2015.
Extending the Fermi-LAT Data Processing Pipeline to the Grid
NASA Astrophysics Data System (ADS)
Zimmer, S.; Arrabito, L.; Glanzman, T.; Johnson, T.; Lavalley, C.; Tsaregorodtsev, A.
2012-12-01
The Data Handling Pipeline (“Pipeline”) has been developed for the Fermi Gamma-Ray Space Telescope (Fermi) Large Area Telescope (LAT), which launched in June 2008. Since then it has been used to completely automate the production of data quality monitoring quantities, the reconstruction and routine analysis of all data received from the satellite, and the delivery of science products to the collaboration and the Fermi Science Support Center. Aside from the reconstruction of raw data from the satellite (Level 1), data reprocessing and various event-level analyses also place reasonably heavy loads on the pipeline and computing resources; these other loads, unlike Level 1, can run continuously for weeks or months at a time. In addition, the Pipeline receives heavy use in performing production Monte Carlo tasks. In daily use it receives a new data download every 3 hours and launches about 2000 jobs to process each download, typically completing the processing of the data before the next download arrives. The need for manual intervention has been reduced to less than 0.01% of submitted jobs. The Pipeline software is written almost entirely in Java and comprises several modules, including web services that allow online monitoring and provide charts summarizing workflow aspects and performance information. The server supports communication with several batch systems such as LSF and BQS, and recently also Sun Grid Engine and Condor. This is accomplished through dedicated job control services that, for Fermi, run at SLAC and at the other computing site involved in this large-scale framework, the Lyon computing center of IN2P3. Although the task logic differs, we are evaluating a separate interface to the DIRAC system in order to communicate with EGI sites and utilize Grid resources, using dedicated Grid-optimized systems rather than developing our own. More recently the Pipeline and its associated data catalog have been generalized for use by other experiments; they are currently being used by the Enriched Xenon Observatory (EXO) and Cryogenic Dark Matter Search (CDMS) experiments, as well as for Monte Carlo simulations for the future Cherenkov Telescope Array (CTA).
Development of climate data storage and processing model
NASA Astrophysics Data System (ADS)
Okladnikov, I. G.; Gordov, E. P.; Titov, A. G.
2016-11-01
We present a storage and processing model for climate datasets elaborated in the framework of a virtual research environment (VRE) for climate and environmental monitoring and for analysis of the impact of climate change on socio-economic processes at local and regional scales. The model is based on a «shared nothing» distributed computing architecture and assumes a computing network where each node is independent and self-sufficient. Each node holds dedicated software for the processing and visualization of geospatial data, providing programming interfaces to communicate with the other nodes. The nodes are interconnected by a local network or the Internet and exchange data and control instructions via SSH connections and web services. Geospatial data are represented by collections of netCDF files stored in a hierarchy of directories within a file system. To speed up data reading and processing, three approaches are proposed: precalculation of intermediate products, distribution of data across multiple storage systems (with or without redundancy), and caching and reuse of previously obtained products. For fast search and retrieval of the required data, a metadata database is developed according to the data storage and processing model. It contains descriptions of the space-time features of the datasets available for processing and their locations, as well as descriptions and run options of the software components for data analysis and visualization. Together, the model and the metadata database will provide a reliable technological basis for the development of a high-performance virtual research environment for climatic and environmental monitoring.
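A minimal sketch of the metadata harvesting step described above, assuming collections of netCDF files under a directory tree (the file layout and table schema are our own placeholders, not the system's actual design):

```python
# Walk a directory tree of netCDF files and record per-file metadata
# (variables, dimensions, path) in an SQLite table, so that a search
# service can locate data without opening the files again.
import os
import sqlite3
from netCDF4 import Dataset

db = sqlite3.connect("climate_meta.db")
db.execute("""CREATE TABLE IF NOT EXISTS datasets
              (path TEXT PRIMARY KEY, variables TEXT, dims TEXT)""")

for root, _dirs, files in os.walk("/data/collections"):   # assumed layout
    for name in files:
        if not name.endswith(".nc"):
            continue
        path = os.path.join(root, name)
        with Dataset(path) as nc:
            variables = ",".join(nc.variables.keys())
            dims = ",".join(f"{d}:{len(nc.dimensions[d])}" for d in nc.dimensions)
        db.execute("INSERT OR REPLACE INTO datasets VALUES (?, ?, ?)",
                   (path, variables, dims))
db.commit()
```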
NASA Technical Reports Server (NTRS)
Lee, Alice T.; Gunn, Todd; Pham, Tuan; Ricaldi, Ron
1994-01-01
This handbook documents the three software analysis processes the Space Station Software Analysis team uses to assess space station software, including their backgrounds, theories, tools, and analysis procedures. Potential applications of these analysis results are also presented. The first section describes how software complexity analysis provides quantitative information on code, such as code structure and risk areas, throughout the software life cycle. Software complexity analysis allows an analyst to understand the software structure, identify critical software components, assess risk areas within a software system, identify testing deficiencies, and recommend program improvements. Performing this type of analysis during the early design phases of software development can positively affect the process, and may prevent later, much larger, difficulties. The second section describes how software reliability estimation and prediction analysis, or software reliability, provides a quantitative means to measure the probability of failure-free operation of a computer program, and describes the two tools used by JSC to determine failure rates and design tradeoffs between reliability, costs, performance, and schedule.
The SEL Adapts to Meet Changing Times
NASA Technical Reports Server (NTRS)
Pajerski, Rose S.; Basili, Victor R.
1997-01-01
Since 1976, the Software Engineering Laboratory (SEL) has been dedicated to understanding and improving the way in which one NASA organization, the Flight Dynamics Division (FDD) at Goddard Space Flight Center, develops, maintains, and manages complex flight dynamics systems. It has done this by developing and refining a continual process improvement approach that allows an organization such as the FDD to fine-tune its process for its particular domain. Experimental software engineering and measurement play a significant role in this approach. The SEL is a partnership of NASA Goddard, its major software contractor, Computer Sciences Corporation (CSC), and the University of Maryland's (UM) Department of Computer Science. The FDD primarily builds software systems that provide ground-based flight dynamics support for scientific satellites. They fall into two sets: ground systems and simulators. Ground systems are midsize systems that average around 250 thousand source lines of code (KSLOC). Ground system development projects typically last 1-2 years. Recent systems have been rehosted to workstations from IBM mainframes, and also contain significant new subsystems written in C and C++. The simulators are smaller systems, averaging around 60 KSLOC, that provide the test data for the ground systems. Simulator development lasts up to 1 year. Most of the simulators have been built in Ada on workstations. The SEL is responsible for the management and continual improvement of the software engineering processes used on these FDD projects.
APEX - the Hyperspectral ESA Airborne Prism Experiment
Itten, Klaus I.; Dell'Endice, Francesco; Hueni, Andreas; Kneubühler, Mathias; Schläpfer, Daniel; Odermatt, Daniel; Seidel, Felix; Huber, Silvia; Schopfer, Jürg; Kellenberger, Tobias; Bühler, Yves; D'Odorico, Petra; Nieke, Jens; Alberti, Edoardo; Meuleman, Koen
2008-01-01
The airborne ESA-APEX (Airborne Prism Experiment) hyperspectral mission simulator is described with its distinct specifications to provide high-quality remote sensing data. The concept of an automatic calibration, performed in the Calibration Home Base (CHB) by using the Control Test Master (CTM), the In-Flight Calibration facility (IFC), quality flagging (QF) and specific processing in a dedicated Processing and Archiving Facility (PAF), and vicarious calibration experiments are presented. A preview of major applications and the corresponding development efforts to provide scientific data products up to level 2/3 to the user is presented for limnology, vegetation, aerosols, general classification routines, and rapid mapping tasks. BRDF (Bidirectional Reflectance Distribution Function) issues are discussed and the spectral database SPECCHIO (Spectral Input/Output) is introduced. The optical performance as well as the dedicated software utilities make APEX a state-of-the-art hyperspectral sensor, capable of (a) satisfying the needs of several research communities and (b) furthering the understanding of the Earth's complex mechanisms. PMID:27873868
NASA Technical Reports Server (NTRS)
Germany, G. A.
2001-01-01
The primary goal of the funded task was to restore and distribute the ISO ATLAS-1 space science data set with enhanced software and database utilities. The first year was primarily dedicated to physically transferring the data from its original format to its initial CD archival format; the remainder of the first year was devoted to verification of the restored data set and database. The second year was devoted to enhancement of the data set, especially the development of IDL utilities and the redesign of the database and search interface as needed. This period was also devoted to distribution of the rescued data set, principally the creation and maintenance of a web interface to the data set. The final six months were dedicated to working with NSSDC to create a permanent, off-site archive of the data set and supporting utilities. This time was also used to resolve last-minute quality and design issues.
NASA Astrophysics Data System (ADS)
Pierleoni, Arnaldo; Casagrande, Luca; Bellezza, Michele; Casadei, Stefano
2010-05-01
The need for increasingly complex geospatial algorithms dedicated to the management of water resources, the fact that many of them require specific knowledge, and the need for dedicated computing machines have led to the necessity of centralizing and sharing all the server applications and plugins developed. For this purpose, a Web Processing Service (WPS) has been developed that makes available to users a range of geospatial analysis algorithms, geostatistics, and remote sensing procedures, which can be used simply by providing data and input parameters and downloading the results. The core of the system infrastructure is GRASS GIS, which acts as the computational engine, providing more than 350 analysis modules and the opportunity to create new ad hoc procedures. The WPS was implemented using the PyWPS software, written in Python, which is easily manageable and configurable. All these instruments are managed by a daemon named "Arcibald", created specifically to queue incoming requests: if processes are already running, the system queues new ones, registering each request and running it only when the previous calculations have completed. Each geoprocess, however, carries an indicator of the resources needed to execute it, so that geoprocesses that do not require excessive computing time can run in parallel; this assessment also takes into account the size of the input files provided. The WPS standard defines methods for accessing and running geoprocesses regardless of the client used; nevertheless, a graphical client was developed specifically within the project to access the resources. The client was built as a plugin for the QGIS software, which provides the most common tools for viewing and consulting geographically referenced data. The tool was tested using data taken during the bathymetric campaign at the Montedoglio Reservoir on the Tiber River, in order to generate a digital model of the reservoir bed. Starting from a text file containing the coordinates and depths of the points (previously treated statistically to remove any inaccuracies), we used the QGIS plugin to connect to the web service and started a cross-validation process to obtain the parameters to be used for interpolation. This makes it possible to highlight morphological variations of reservoir basins due to silting phenomena, and therefore to assess the actual capacity of the basin for a proper evaluation of the available water resource, a critical step for the subsequent management phase. In this case, since the procedure is very long (on the order of days), the system automatically chooses to send the results via email. Moreover, once the invoked procedures end, the system lets the user choose whether to share the data and results or to remove all traces of the calculation, because in some cases sensitive data and information are used and sharing could violate privacy policies. The entire project is built exclusively with open-source software.
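The serialization idea behind the "Arcibald" daemon can be sketched in a few lines (our own simplification; the real daemon also weighs each geoprocess's resource indicator and input size):

```python
# Minimal job-queue daemon in the spirit of "Arcibald": requests are
# registered as they arrive, heavy geoprocesses run one at a time,
# and jobs flagged as cheap may run immediately in parallel.
import queue
import threading

heavy_jobs = queue.Queue()

def worker():
    while True:
        job = heavy_jobs.get()      # blocks until a job is queued
        try:
            job()                   # run the geoprocess to completion
        finally:
            heavy_jobs.task_done()

threading.Thread(target=worker, daemon=True).start()

def submit(job, cheap=False):
    """Run cheap jobs at once; serialize the expensive ones."""
    if cheap:
        threading.Thread(target=job, daemon=True).start()
    else:
        heavy_jobs.put(job)

submit(lambda: print("interpolating bathymetry..."))        # queued
submit(lambda: print("quick extent query"), cheap=True)     # immediate
heavy_jobs.join()
```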
Antonica, Filippo; Asabella, Artor Niccoli; Ferrari, Cristina; Rubini, Domenico; Notaristefano, Antonio; Nicoletti, Adriano; Altini, Corinna; Merenda, Nunzio; Mossa, Emilio; Guarini, Attilio; Rubini, Giuseppe
2014-01-01
In the last decade numerous attempts have been made to co-register and integrate different imaging data. Following PET/CT, the integration of PET with MR has attracted great interest, and PET/MR scanners have recently been tested on various regional and systemic pathologies. Unfortunately, PET/MR scanners are expensive and diagnostic protocols are still under study and investigation. Nuclear medicine imaging highlights functional and biometabolic information but has poor anatomic detail. The aim of this study is to integrate MR and PET data to produce regional or whole-body fused images acquired on different scanners, even on different days. We propose an offline method to fuse PET with MR data using open-source software that is inexpensive, reproducible, and capable of exchanging data over the network. We also evaluate the global quality, alignment quality, and diagnostic confidence of the fused PET-MR images. We selected PET/CT studies performed in our nuclear medicine unit and MR studies provided by patients on DICOM CD media or received over the network. We used the open-source version of OsiriX 5.7. CT slices were aligned with the first MR slice, points were marked for co-registration using the MR-T1 sequence and CT as references, and the result was fused with PET to produce a PET-MR image. A total of 100 PET/CT studies were fused with the following MR studies: 20 head, 15 thorax, 24 abdomen, 31 pelvis, and 10 whole body. An interval of no more than 15 days between PET and MR was the inclusion criterion. The PET/CT, MR, and fused studies were evaluated by two experienced radiologists and two experienced nuclear medicine physicians, each of whom filled in a five-point evaluation scoring scheme covering image quality, image artifacts, segmentation errors, fusion misalignment, and diagnostic confidence. Our fusion method showed the best results for the head, thorax, and pelvic districts in terms of global quality, alignment quality, and diagnostic confidence, while for the abdomen and pelvis the alignment quality and global quality were poor, owing to variations in internal organ filling and the time shift between examinations. PET/CT images with time-of-flight reconstruction and real attenuation correction were combined with anatomically detailed MRI images. We used OsiriX, an open-source image processing application dedicated to DICOM images; no additional costs to buy and upgrade proprietary software are required for combining data, and no very expensive, high-technology PET/MR scanner, requiring dedicated shielded room space and personnel to be employed or trained, is needed. Our method allows patient PET/MR fused data to be shared among different medical staff over dedicated networks. The proposed method may be applied to any MR sequence (MR-DWI, MR-STIR, and contrast-enhanced sequences) to characterize soft-tissue alterations and improve disease discrimination. It can be applied not only to PET with MR but virtually to any DICOM study.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brower, Richard C.
This proposal is to develop the software and algorithmic infrastructure needed for the numerical study of quantum chromodynamics (QCD), and of theories that have been proposed to describe physics beyond the Standard Model (BSM) of high energy physics, on current and future computers. This infrastructure will enable users (1) to improve the accuracy of QCD calculations to the point where they no longer limit what can be learned from high-precision experiments that seek to test the Standard Model, and (2) to determine the predictions of BSM theories in order to understand which of them are consistent with the data that will soon be available from the LHC. Work will include the extension and optimization of community codes for the next generation of leadership class computers, the IBM Blue Gene/Q and the Cray XE/XK, and for the dedicated hardware funded for our field by the Department of Energy. Members of our collaboration at Brookhaven National Laboratory and Columbia University worked on the design of the Blue Gene/Q, and have begun to develop software for it. Under this grant we will build upon their experience to produce high-efficiency production codes for this machine. Cray XE/XK computers with many thousands of GPU accelerators will soon be available, and the dedicated commodity clusters we obtain with DOE funding include growing numbers of GPUs. We will work with our partners in NVIDIA's Emerging Technology group to scale our existing software to thousands of GPUs, and to produce highly efficient production codes for these machines. Work under this grant will also include the development of new algorithms for the effective use of heterogeneous computers, and their integration into our codes. It will include improvements of Krylov solvers and the development of new multigrid methods in collaboration with members of the FASTMath SciDAC Institute, using their HYPRE framework, as well as work on improved symplectic integrators.
NASA Astrophysics Data System (ADS)
Ben Salah, Ahmed; Ragot, Nicolas; Paquet, Thierry
2013-01-01
The French National Library (BnF) has launched many mass digitization projects in order to give access to its collections. The indexing of digital documents on Gallica (the digital library of the BnF) relies on their textual content, obtained from service providers that use Optical Character Recognition (OCR) software. OCR software has become an increasingly complex system composed of several subsystems dedicated to the analysis and recognition of the elements in a page. However, the reliability of these systems remains an issue: in some cases, errors appear in OCR outputs because several errors accumulate at different levels of the OCR process. One of the frequent errors in OCR outputs is missed text components, whose presence may lead to severe defects in digital libraries. In this paper, we investigate the detection of missed text components to control the OCR results for the collections of the French National Library. Our verification approach uses local information inside the pages, based on Radon transform descriptors and Local Binary Pattern (LBP) descriptors coupled with the OCR results, to control their consistency. The experimental results show that our method detects 84.15% of the missed textual components by comparing the OCR ALTO output files (produced by the service providers) with the images of the documents.
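A sketch of the two descriptor computations named above, using scikit-image (the parameter choices are ours; the paper's exact descriptor construction and the comparison against ALTO zones are not reproduced here):

```python
# Compute Radon-transform and LBP descriptors for a page region: two
# kinds of local texture evidence that can flag zones where the OCR
# output reports no text although text-like structure is present.
import numpy as np
from skimage import io
from skimage.feature import local_binary_pattern
from skimage.transform import radon

region = io.imread("page_region.png", as_gray=True)  # placeholder file

# Radon projections at a few angles: text lines give strong peaks
# around 0/90 degrees in the per-angle energy profile.
theta = np.linspace(0.0, 180.0, 36, endpoint=False)
sinogram = radon(region, theta=theta, circle=False)
radon_descriptor = sinogram.var(axis=0)

# Uniform LBP histogram: a compact texture signature of the region.
lbp = local_binary_pattern(region, P=8, R=1.0, method="uniform")
lbp_descriptor, _ = np.histogram(lbp, bins=10, range=(0, 10), density=True)
```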
SOFIA: a flexible source finder for 3D spectral line data
NASA Astrophysics Data System (ADS)
Serra, Paolo; Westmeier, Tobias; Giese, Nadine; Jurek, Russell; Flöer, Lars; Popping, Attila; Winkel, Benjamin; van der Hulst, Thijs; Meyer, Martin; Koribalski, Bärbel S.; Staveley-Smith, Lister; Courtois, Hélène
2015-04-01
We introduce SOFIA, a flexible software application for the detection and parametrization of sources in 3D spectral line data sets. SOFIA combines for the first time in a single piece of software a set of new source-finding and parametrization algorithms developed on the way to future H I surveys with ASKAP (WALLABY, DINGO) and APERTIF. It is designed to enable the general use of these new algorithms by the community on a broad range of data sets. The key advantages of SOFIA are the ability to: search for line emission on multiple scales to detect 3D sources in a complete and reliable way, taking into account noise level variations and the presence of artefacts in a data cube; estimate the reliability of individual detections; look for signal in arbitrarily large data cubes using a catalogue of 3D coordinates as a prior; provide a wide range of source parameters and output products which facilitate further analysis by the user. We highlight the modularity of SOFIA, which makes it a flexible package allowing users to select and apply only the algorithms useful for their data and science questions. This modularity makes it also possible to easily expand SOFIA in order to include additional methods as they become available. The full SOFIA distribution, including a dedicated graphical user interface, is publicly available for download.
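The core of multiscale source finding, smoothing a cube with kernels of different sizes and keeping voxels that exceed a noise threshold at any scale, fits in a short sketch (a simplification of what a package like SOFIA does; reliability estimation and parametrization are omitted, and the scales and threshold below are arbitrary choices of ours):

```python
# "Smooth-and-clip" sketch for 3D spectral-line source finding: smooth
# the cube on several spatial/spectral scales, flag voxels above
# n*sigma at any scale, and label connected regions as candidate
# detections. Real finders also handle noise variation and artefacts.
import numpy as np
from scipy import ndimage

def find_sources(cube, scales=((0, 0, 0), (2, 2, 1), (4, 4, 3)), nsigma=4.0):
    mask = np.zeros(cube.shape, dtype=bool)
    for sigma in scales:
        smoothed = ndimage.gaussian_filter(cube, sigma=sigma)
        # robust noise estimate via the median absolute deviation
        noise = 1.4826 * np.median(np.abs(smoothed - np.median(smoothed)))
        mask |= smoothed > nsigma * noise
    labels, n_candidates = ndimage.label(mask)  # connected candidate regions
    return labels, n_candidates

rng = np.random.default_rng(2)
cube = rng.standard_normal((64, 64, 128))     # placeholder noise cube
cube[30:34, 30:34, 60:80] += 3.0              # injected faint source
labels, n = find_sources(cube)
print(f"{n} candidate region(s) found")
```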
NASA Astrophysics Data System (ADS)
Neidhardt, Alexander; Collioud, Arnaud
2014-12-01
Central monitoring of VLBI network status can be realized by using online status information about current VLBI sessions together with real-time status data directly from each radio telescope. Such monitoring helps to organize sessions and to get immediate feedback from the active telescopes. To this end, the remote control software for VLBI radio telescopes "e-RemoteCtrl" (http://www.econtrol-software.de), which enables remote access as an extension to the NASA Field System, provides real-time data streams to dedicated data centers. The software has direct access to status information about the current observation (e.g., schedule, scan, source) and the telescope (e.g., current state, temperature, pressure) in real time. This information is sent directly to "IVS Live" (http://ivslive.obs.u-bordeaux1.fr/), a Web tool that can be used to follow the observing sessions organized by the International VLBI Service for Geodesy and Astrometry (IVS), navigate through past or upcoming sessions, or search and display specific information about sessions, sources (like VLBI images), and stations, using an Internet browser.
Durbin, Kenneth R.; Tran, John C.; Zamdborg, Leonid; Sweet, Steve M. M.; Catherman, Adam D.; Lee, Ji Eun; Li, Mingxi; Kellie, John F.; Kelleher, Neil L.
2011-01-01
Applying high-throughput Top-Down MS to an entire proteome requires a yet-to-be-established model for data processing. Since Top-Down is becoming possible on a large scale, we report our latest software pipeline dedicated to capturing the full value of intact protein data in an automated fashion. For intact mass detection, we combine algorithms for processing MS1 data from both isotopically resolved (FT) and charge-state resolved (ion trap) LC-MS data, which are then linked to their fragment ions for database searching using ProSight. Automated determination of human keratin and tubulin isoforms is one result. Optimized for the intricacies of whole proteins, new software modules visualize proteome-scale data based on the LC retention time and intensity of intact masses and enable selective detection of PTMs to automatically screen for acetylation, phosphorylation, and methylation. Software functionality was demonstrated using comparative LC-MS data from yeast strains in addition to human cells undergoing chemical stress. We present these advances as a key step toward realizing Top-Down MS on a proteomic scale. PMID:20848673
A real-time GNSS-R system based on software-defined radio and graphics processing units
NASA Astrophysics Data System (ADS)
Hobiger, Thomas; Amagai, Jun; Aida, Masanori; Narita, Hideki
2012-04-01
Reflected signals of the Global Navigation Satellite System (GNSS) from the sea or land surface can be utilized to deduce and monitor physical and geophysical parameters of the reflecting area. Unlike most other remote sensing techniques, GNSS-Reflectometry (GNSS-R) operates as a passive radar that takes advantage of the increasing number of navigation satellites that broadcast their L-band signals. To date, most GNSS-R receiver architectures have been based on dedicated hardware solutions. Software-defined radio (SDR) technology has advanced in recent years and now enables real-time signal processing, which makes it an ideal candidate for the realization of a flexible GNSS-R system. Additionally, modern commodity graphics cards, which offer massive parallel computing performance, make it possible to handle the whole signal processing chain without interfering with the PC's CPU. Thus, this paper describes a GNSS-R system which has been developed on the principles of software-defined radio supported by General Purpose Graphics Processing Units (GPGPUs), and presents results from initial field tests which confirm the anticipated capability of the system.
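The core operation in such a receiver is correlating the received signal against a replica of the satellite's ranging code to find the delay of the (direct or reflected) signal. A toy baseband sketch using the FFT; the code, delay, and noise level are made-up illustrative values.

import numpy as np

def correlate_circular(received, replica):
    """Circular cross-correlation via FFT (real-valued baseband sketch)."""
    R = np.fft.fft(received) * np.conj(np.fft.fft(replica))
    return np.fft.ifft(R).real

rng = np.random.default_rng(1)
replica = rng.choice([-1.0, 1.0], size=4096)     # pseudo-random code chips
received = np.roll(replica, 123) + 0.5 * rng.normal(size=4096)
corr = correlate_circular(received, replica)
print("estimated delay:", np.argmax(corr))       # -> 123 samples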
Optics derotator servo control system for SONG Telescope
NASA Astrophysics Data System (ADS)
Xu, Jin; Ren, Changzhi; Ye, Yu
2012-09-01
The Stellar Oscillations Network Group (SONG) is an initiative which aims at designing and building a ground-based network of 1 m telescopes dedicated to the study of phenomena occurring in the time domain. The Chinese standard node of SONG is a 1 m diameter Alt-Az telescope at F/37. The optics derotator control system of the SONG telescope adopts the development model of "Industrial Computer + UMAC Motion Controller + Servo Motor". The industrial computer is the core processing part of the motion control, the motion control card (UMAC) is in charge of the details of the motion control, and the servo amplifier accepts control commands from UMAC and drives the servo motor. Position feedback comes from the encoder, forming a closed-loop control system. This paper describes in detail the hardware and software design of the optics derotator servo control system. In terms of hardware design, the principle, structure, and control algorithm of the derotator servo system are analyzed and explored. In terms of software design, the paper proposes a system software architecture based on object-oriented programming.
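The inner position loop of such a servo is typically a PID-type controller. A minimal discrete sketch in Python for illustration; in the real system this loop runs inside the UMAC controller, and all gains below are arbitrary.

class PID:
    """Discrete PID position loop (illustrative stand-in for the UMAC loop)."""
    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def update(self, setpoint, measured):
        err = setpoint - measured
        self.integral += err * self.dt
        deriv = (err - self.prev_err) / self.dt
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv

# loop skeleton: read encoder, compute command, send to servo amplifier
pid = PID(kp=2.0, ki=0.5, kd=0.05, dt=0.001)
position, target = 0.0, 10.0
for _ in range(5000):
    command = pid.update(target, position)
    position += command * 0.001      # stand-in for the real plant dynamics
print(round(position, 2))            # converges toward the 10.0 setpoint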
The Development of the Puerto Rico Lightning Detection Network for Meteorological Research
NASA Technical Reports Server (NTRS)
Legault, Marc D.; Miranda, Carmelo; Medin, J.; Ojeda, L. J.; Blakeslee, Richard J.
2011-01-01
A land-based Puerto Rico Lightning Detection Network (PR-LDN) dedicated to the academic research of meteorological phenomena has been developed. Five Boltek StormTracker PCI-Receivers with LTS-2 GPS Timestamp Cards and lightning detectors were integrated into Pentium III PC-workstations running the CentOS Linux operating system. The Boltek detector Linux driver was compiled under CentOS, modified, and thoroughly tested. These PC-workstations with integrated lightning detectors were installed at five of the University of Puerto Rico (UPR) campuses distributed around the island of PR. The PC-workstations are left on permanently in order to monitor lightning activity at all times. Each is networked to its campus network backbone, permitting quasi-instantaneous data transfer to a central server at the UPR-Bayamón campus. Information generated by each lightning detector is managed by an in-house C program called the LDN-client. The LDN-client maintains an open connection to the central server operating the LDN-server program, where data are sent in real time for analysis and archival. The LDN-client also manages the storing of data on the PC-workstation hard disk. The LDN-server software (also an in-house effort) analyses the data from each client and performs event triangulations. Time-of-arrival (TOA) and related hybrid algorithms, as well as lightning-type and event-discriminating routines, are also implemented in the LDN-server software. We have also developed software to visually monitor lightning events in real time from all clients, together with the triangulated events. We are currently monitoring and studying the spatial, temporal, and type distribution of lightning strikes associated with electrical storms and tropical cyclones in the vicinity of Puerto Rico.
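TOA triangulation of a strike reduces to a small nonlinear least-squares problem: find the 2D position and emission time that best explain the arrival times at the sensors. A self-contained sketch with SciPy; the station layout and strike location are toy values, not the PR-LDN geometry.

import numpy as np
from scipy.optimize import least_squares

C = 2.998e8  # speed of light, m/s

def locate(stations, toas):
    """Least-squares TOA multilateration: solve for (x, y, t0).
    stations: (N, 2) sensor coordinates in metres; toas: N arrival times."""
    def residuals(p):
        x, y, t0 = p
        d = np.hypot(stations[:, 0] - x, stations[:, 1] - y)
        return t0 + d / C - toas
    x0 = (*stations.mean(axis=0), toas.min())
    return least_squares(residuals, x0).x

# toy network of 5 sensors and a strike at (20 km, 30 km)
stations = np.array([[0, 0], [50e3, 0], [0, 50e3], [50e3, 50e3], [25e3, 10e3]])
strike = np.array([20e3, 30e3])
toas = np.hypot(*(stations - strike).T) / C
print(locate(stations, toas))   # ~ [20000, 30000, 0]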
A free software for the calculation of T2* values for iron overload assessment.
Fernandes, Juliano Lara; Fioravante, Luciana Andrea Barozi; Verissimo, Monica P; Loggetto, Sandra R
2017-06-01
Background: Iron overload assessment with magnetic resonance imaging (MRI) using T2* has become a key diagnostic method in the management of many diseases. Quantitative analysis of the MRI images with a cost-effective tool has been a limitation to increased use of the method. Purpose: To provide a free software solution for this purpose, comparing the results with a commercial solution. Material and Methods: The free tool was developed as a standalone program to be directly downloaded and run on a common personal computer platform without the need for a dedicated workstation. Liver and cardiac T2* values were calculated using both tools and the values compared between them in a group of 56 patients with suspected iron overload, using Bland-Altman plots and concordance correlation coefficients (CCC). Results: In the heart, the mean T2* difference between the two methods was 0.46 ms (95% confidence interval [CI], -0.037 to 0.965) and in the liver 0.49 ms (95% CI, 0.257-0.722). The CCC for both the heart and the liver were significantly high (0.98 [95% CI, 0.966-0.988] with a Pearson ρ of 0.9811, and 0.991 [95% CI, 0.986-0.994] with a Pearson ρ of 0.996, respectively). No significant differences were observed when analyzing only patients with abnormal concentrations of iron in both organs compared to the whole cohort. Conclusion: The proposed free software tool is accurate for calculation of T2* values of the liver and heart and might be a solution for centers that cannot use paid commercial solutions.
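The underlying computation in any such tool is a mono-exponential fit of signal versus echo time, S(TE) = S0·exp(-TE/T2*). A minimal sketch with SciPy; the echo times and noise level below are made-up illustrative values, not the software's settings.

import numpy as np
from scipy.optimize import curve_fit

def t2star_model(te, s0, t2star):
    return s0 * np.exp(-te / t2star)

# echo times (ms) and simulated signal for a liver with T2* = 8 ms
te = np.array([1.2, 2.9, 4.6, 6.3, 8.0, 9.7, 11.4, 13.1])
signal = t2star_model(te, 1000.0, 8.0) + np.random.normal(0, 5, te.size)

popt, _ = curve_fit(t2star_model, te, signal, p0=(signal[0], 10.0))
print("T2* = %.2f ms" % popt[1])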
Design of the Next Generation Target at the Lujan Neutron Scattering Center, LANSCE
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ferres, Laurent
Los Alamos National Laboratory (LANL) supports scientific research in many diverse fields such as biology, chemistry, and nuclear science. The Laboratory was established in 1943 during the Second World War to develop nuclear weapons. Today, LANL is one of the largest laboratories dedicated to nuclear defense and operates an 800 MeV proton linear accelerator for basic and applied research, including production of high- and low-energy neutron beams, isotope production for medical applications, and proton radiography. This accelerator is located at the Los Alamos Neutron Science Center (LANSCE). The work performed involved the redesign of the target for the low-energy neutron source at the Lujan Neutron Scattering Center, which is one of the facilities built around the accelerator. The redesign of the target involves modeling various arrangements of the moderator-reflector-shield for the next generation neutron production target. This is done using Monte Carlo N-Particle eXtended (MCNPX) and the ROOT analysis framework, a C++-based software package, to analyze the results.
Illés, Tamás
2011-03-01
The EOS system is a new medical imaging device based on low-dose X-rays, gaseous detectors and dedicated software for 3D reconstruction. It was developed by Nobel prizewinner Georges Charpak. A new concept--the vertebral vector--is used to facilitate the interpretation of EOS data, especially in the horizontal plane. We studied 95 cases of idiopathic scoliosis before and after surgery by means of classical methods and using vertebral vectors, in order to compare the accuracy of the two approaches. The vertebral vector permits simultaneous analysis of the scoliotic curvature in the frontal, sagittal and horizontal planes, as precisely as classical methods. The use of the vertebral vector simplifies and facilitates the interpretation of the mass of information provided by EOS. After analyzing the horizontal data, the first goal of corrective intervention would be to reduce the lateral vertebral deviation. The reduction in vertebral rotation seems less important. This is a new element in the therapeutic management of spinal deformations.
Modelling Sawing of Metal Tubes Through FEM Simulation
NASA Astrophysics Data System (ADS)
Bort, C. M. Giorgio; Bosetti, P.; Bruschi, S.
2011-05-01
The paper presents the development of a numerical model of the sawing process of AISI 304 thin tubes, which are cut with a circular blade with alternating roughing and finishing teeth. The numerical simulation environment is the three-dimensional FEM software Deform™ v.10.1. The actual tooth trajectories were determined through a blade kinematics analysis developed in Matlab™. Due to the manufacturing rolling steps and subsequent welding stage, the tube material is characterized by a gradient of properties along its thickness. Consequently, a simplified cutting test was set up and carried out in order to identify the values of the relevant material parameters to be used in the numerical model. The dedicated test was the Orthogonal Tube Cutting (OTC) test, which was performed on an instrumented lathe. The proposed numerical model was validated by comparing numerical results and experimental data obtained from sawing tests carried out on an industrial machine. The following outputs were compared: the cutting force, the chip thickness, and the chip contact area.
Scherer, N M; Basso, D M
2008-09-16
DNATagger is a web-based tool for coloring and editing DNA, RNA and protein sequences and alignments. It is dedicated to the visualization of protein coding sequences and also protein sequence alignments to facilitate the comprehension of evolutionary processes in sequence analysis. The distinctive feature of DNATagger is the use of codons as informative units for coloring DNA and RNA sequences. The codons are colored according to their corresponding amino acids. It is the first program that colors codons in DNA sequences without being affected by "out-of-frame" gaps of alignments. It can handle single gaps and gaps inside the triplets. The program also provides the possibility to edit the alignments and change color patterns and translation tables. DNATagger is a JavaScript application, following the W3C guidelines, designed to work on standards-compliant web browsers. It therefore requires no installation and is platform independent. The web-based DNATagger is available as free and open source software at http://www.inf.ufrgs.br/~dmbasso/dnatagger/.
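A toy sketch of the codon-aware coloring idea: gaps are skipped when assembling triplets, so an out-of-frame gap does not corrupt downstream codons. The lookup tables are tiny excerpts for illustration, and this Python sketch is not DNATagger's JavaScript implementation.

# excerpt tables for illustration only; a full implementation needs all 64 codons
CODON_TO_AA = {"ATG": "M", "TGG": "W", "TTT": "F", "TTC": "F", "GGA": "G"}
AA_COLOR = {"M": "green", "W": "blue", "F": "orange", "G": "grey", "?": "black"}

def color_codons(seq):
    codon, positions, out = [], [], []
    for i, base in enumerate(seq.upper()):
        if base == "-":
            continue                      # gaps do not advance the reading frame
        codon.append(base)
        positions.append(i)
        if len(codon) == 3:
            aa = CODON_TO_AA.get("".join(codon), "?")
            out.append((positions[0], positions[-1], AA_COLOR[aa]))
            codon, positions = [], []
    return out

print(color_codons("ATG--TGGT-TT"))  # three codons despite mid-triplet gaps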
NASA Astrophysics Data System (ADS)
Chadel, Meriem; Bouzaki, Mohammed Moustafa; Chadel, Asma; Petit, Pierre; Sawicki, Jean-Paul; Aillerie, Michel; Benyoucef, Boumediene
2017-02-01
We present and analyze experimental results obtained with a laboratory setup based on hardware and smart instrumentation for the complete study of the performance of PV panels, using an artificial radiation source (halogen lamps) for illumination. Combined with an accurate analysis, this global experimental procedure allows the determination of effective performance under standard conditions thanks to a simulation process originally developed in the Matlab software environment. The uniformity of the irradiated surface was checked by simulation of the light field. We studied the response of standard commercial photovoltaic panels under illumination measured by a spectrometer with different spectra for two sources, halogen lamps and sunlight. We then pay special attention to the influence of the spectral distribution of light on the characteristics of the photovoltaic panel, which we measured as a function of temperature and for different illuminations, with dedicated measurements and studies of the open-circuit voltage and short-circuit current.
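A common way to simulate a panel's I-V response under different irradiance levels in such Matlab models is a single-diode equation; the Python sketch below neglects series and shunt resistance, and every parameter value is an illustrative assumption rather than a fit to the panels studied here.

import numpy as np

def iv_curve(v, i_ph, i_0=1e-6, n=1.3, t_cell=298.15, n_s=36):
    """Simplified single-diode model; i_ph scales with irradiance."""
    k, q = 1.380649e-23, 1.602176634e-19
    vt = n_s * n * k * t_cell / q      # thermal voltage of the 36-cell string
    return i_ph - i_0 * np.expm1(v / vt)

v = np.linspace(0, 22, 200)
for g in (1000, 600):                  # W/m^2, e.g. sun vs halogen rig
    i = np.clip(iv_curve(v, i_ph=8.5 * g / 1000), 0, None)
    print(f"G={g}: Isc~{i[0]:.2f} A, Pmax~{np.max(v * i):.1f} W")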
Fang, Joyce; Savransky, Dmitry
2016-08-01
Automation of alignment tasks can provide improved efficiency and greatly increase the flexibility of an optical system. Current optical systems with automated alignment capabilities are typically designed to include a dedicated wavefront sensor. Here, we demonstrate a self-aligning method for a reconfigurable system using only focal plane images. We define a two-lens optical system with 8 degrees of freedom. Images are simulated for given misalignment parameters using ZEMAX software. We perform a principal component analysis on the simulated data set to obtain Karhunen-Loève modes, which form the basis set whose weights are the system measurements. A model function, which maps the state to the measurement, is learned using nonlinear least-squares fitting and serves as the measurement function for the nonlinear estimators (extended and unscented Kalman filters) used to calculate control inputs to align the system. We present and discuss simulated and experimental results of the full system in operation.
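The Karhunen-Loève basis in such a pipeline can be obtained from the SVD of the mean-subtracted image set; a compact sketch, with random arrays standing in for the ZEMAX-simulated images and the mode count chosen arbitrarily.

import numpy as np

rng = np.random.default_rng(2)
images = rng.normal(size=(200, 32 * 32))       # 200 flattened simulated images
mean = images.mean(axis=0)
U, S, Vt = np.linalg.svd(images - mean, full_matrices=False)
modes = Vt[:8]                                  # first 8 KL modes (basis set)

new_image = rng.normal(size=32 * 32)
weights = modes @ (new_image - mean)            # system "measurement" vector
print(weights.shape)                            # (8,) -> fed to the estimator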
Enlarging the STEM pipeline working with youth-serving organizations
NASA Astrophysics Data System (ADS)
Porro, I.
2005-12-01
The After-School Astronomy Project (ASAP) is a comprehensive initiative to promote the pursuit of science learning among underrepresented youth. To this end ASAP specifically aims at building the capacity of urban community-based centers to deliver innovative out-of-school science programming to their youth. ASAP makes use of a modular curriculum consisting of a combination of hands-on activities and youth-led explorations of the night sky using MicroObservatory. Through project-based investigations students reinforce learning in astronomy and develop an understanding of science as inquiry, while also developing communication and computer skills. Through MicroObservatory students gain access to a network of educational telescopes that they control over the Internet, software analysis tools, and an online community of users. An integral part of ASAP is to provide professional development opportunities for after-school workers. This promotes the long-term sustainability of ASAP and fosters the creation of a cadre of after-school professionals dedicated to facilitating science-based programs.
NASA Astrophysics Data System (ADS)
Stoeckel, Gerhard P.; Doyle, Keith B.
2017-08-01
The Transiting Exoplanet Survey Satellite (TESS) is an instrument consisting of four wide field-of-view CCD cameras dedicated to the discovery of exoplanets around the brightest stars, and to understanding the diversity of planets and planetary systems in our galaxy. Each camera utilizes a seven-element lens assembly with low-power and low-noise CCD electronics. Advanced multivariable optimization and numerical simulation capabilities accommodating arbitrarily complex objective functions have been added to the internally developed Lincoln Laboratory Integrated Modeling and Analysis Software (LLIMAS) and used to assess system performance. Various optical phenomena are accounted for in these analyses, including full dn/dT spatial distributions in lenses and charge diffusion in the CCD electronics. These capabilities are utilized to design CCD shims for thermal vacuum chamber testing and flight, and to verify comparable performance in both environments across a range of wavelengths, field points, and temperature distributions. Additionally, optimizations and simulations are used for model correlation and robustness optimizations.
AirLab: a cloud-based platform to manage and share antibody-based single-cell research.
Catena, Raúl; Özcan, Alaz; Jacobs, Andrea; Chevrier, Stephane; Bodenmiller, Bernd
2016-06-29
Single-cell analysis technologies are essential tools in research and clinical diagnostics. These methods include flow cytometry, mass cytometry, and other microfluidics-based technologies. Most laboratories that employ these methods maintain large repositories of antibodies. These ever-growing collections of antibodies, their multiple conjugates, and the large amounts of data generated in assays using specific antibodies and conditions make a dedicated software solution necessary. We have developed AirLab, a cloud-based tool with web and mobile interfaces, for the organization of these data. AirLab streamlines the processes of antibody purchase, organization, and storage, antibody panel creation, results logging, and antibody validation data sharing and distribution. Furthermore, AirLab enables inventory of other laboratory stocks, such as primers or clinical samples, through user-controlled customization. Thus, AirLab is a mobile-powered and flexible tool that harnesses the capabilities of mobile tools and cloud-based technology to facilitate inventory and sharing of antibody and sample collections and associated validation data.
FPGA based control system for space instrumentation
NASA Astrophysics Data System (ADS)
Di Giorgio, Anna M.; Cerulli Irelli, Pasquale; Nuzzolo, Francesco; Orfei, Renato; Spinoglio, Luigi; Liu, Giovanni S.; Saraceno, Paolo
2008-07-01
The prototype for a general-purpose FPGA-based control system for space instrumentation is presented, with particular attention to the instrument control application software. The system HW is based on the LEON3FT processor, which gives the flexibility to configure the chip with only the necessary HW functionalities, from simple logic up to small dedicated processors. The instrument control SW is developed in ANSI C and, for time-critical (<10 μs) commanding sequences, implements an internal instruction sequencer triggered via an interrupt service routine bound to a high-priority HW interrupt.
A Generic Multibody Parachute Simulation Model
NASA Technical Reports Server (NTRS)
Neuhaus, Jason Richard; Kenney, Patrick Sean
2006-01-01
Flight simulation of dynamic atmospheric vehicles with parachute systems is a complex task that is not easily modeled in many simulation frameworks. In the past, the performance of vehicles with parachutes was analyzed by simulations dedicated to parachute operations, which were generally not used for any other portion of the vehicle flight trajectory. This approach required multiple simulation resources to completely analyze the performance of the vehicle. Recently, improved software engineering practices and increased computational power have allowed a single simulation to model the entire flight profile of a vehicle employing a parachute.
1982-10-01
EXECUTIVE SUMMARY. The O'Hare Runway Configuration Management System (CMS) is an interactive multi-user computer ... MITRE Washington's Computer Center. Currently, CMS is housed in an IBM 4341 computer with the VM/SP operating system. CMS employs the IBM Display ... At O'Hare, it will operate on a dedicated minicomputer which permits multi-tasking (that is, multiple users).
Redesigning a risk-management process for tracking injuries.
Wenzel, G R
1998-01-01
The changing responsibilities of registered nurses are challenging even the most dedicated professionals. To survive within her newly-defined roles, one nurse used a total quality improvement model to understand, analyze, and improve a medical center's system for tracking inpatient injuries. This process led to the drafting of an original software design that implemented a nursing informatics tracking system. It has resulted in significant savings of time and money and has far surpassed the accuracy, efficiency, and scope of the previous method. This article presents an overview of the design process.
GIS Toolsets for Planetary Geomorphology and Landing-Site Analysis
NASA Astrophysics Data System (ADS)
Nass, Andrea; van Gasselt, Stephan
2015-04-01
Modern Geographic Information Systems (GIS) allow expert and lay users alike to load and position geographic data and perform simple to highly complex surface analyses. For many applications dedicated and ready-to-use GIS tools are available in standard software systems, while other applications require the modular combination of available basic tools to answer more specific questions. This also applies to analyses in modern planetary geomorphology, where many such (basic) tools can be used to build complex analysis tools, e.g. in image and terrain-model analysis. Apart from the simple application of sets of different tools, many complex tasks require a more sophisticated design for storing and accessing data using databases (e.g. ArcHydro for hydrological data analysis). In planetary sciences, complex database-driven models are often required to efficiently analyse potential landing sites or store rover data, but geologic mapping data can also be stored and accessed more efficiently using database models rather than stand-alone shapefiles. For landing-site analyses, relief and surface roughness estimates are two common concepts of particular interest, and for both a number of different definitions co-exist. We here present an advanced toolset for the analysis of image and terrain-model data with an emphasis on the extraction of landing-site characteristics using established criteria. We provide working examples and particularly focus on the concepts of terrain roughness as it is interpreted in geomorphology and engineering studies.
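One widely used roughness definition, windowed RMS height variation, reduces to two box filters over the terrain model; a sketch (the window size and the toy DEM are arbitrary choices, and this is only one of the co-existing definitions mentioned above).

import numpy as np
from scipy.ndimage import uniform_filter

def rms_roughness(dem, window=9):
    """Local RMS height variation: windowed standard deviation of the DEM."""
    mean = uniform_filter(dem, window)
    mean_sq = uniform_filter(dem * dem, window)
    return np.sqrt(np.clip(mean_sq - mean * mean, 0, None))

dem = np.cumsum(np.random.normal(size=(256, 256)), axis=0)  # toy terrain
print(rms_roughness(dem).max())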
Simulation study of a high performance brain PET system with dodecahedral geometry.
Tao, Weijie; Chen, Gaoyu; Weng, Fenghua; Zan, Yunlong; Zhao, Zhixiang; Peng, Qiyu; Xu, Jianfeng; Huang, Qiu
2018-05-25
In brain imaging, a spherical PET system achieves the highest sensitivity as far as solid angle is concerned; however, it is not practical to build. In this work we designed an alternative sphere-like scanner, the dodecahedral scanner, which offers high imaging sensitivity and is feasible to manufacture. We simulated this system and compared its performance with a few other dedicated brain PET systems. Monte Carlo simulations were conducted to generate data for the dedicated brain PET system with the dodecahedral geometry (11 regular pentagon detectors). The data were then reconstructed using in-house developed software with the fully three-dimensional maximum-likelihood expectation maximization (3D-MLEM) algorithm. Results show that the proposed system has a high sensitivity distribution over the whole field of view (FOV). With a depth-of-interaction (DOI) resolution around 6.67 mm, the proposed system achieves a spatial resolution of 1.98 mm. Our simulation study also shows that the proposed system improves image contrast and reduces noise compared with a few other dedicated brain PET systems. Finally, simulations with the Hoffman phantom show the potential of the proposed system in clinical applications. In conclusion, the proposed dodecahedral PET system has potential for widespread application in high-sensitivity, high-resolution PET imaging, allowing the injected dose to be lowered. This article is protected by copyright. All rights reserved.
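The MLEM update has a compact multiplicative form, x <- x / (A^T 1) * A^T (y / (A x)); the sketch below runs it on a toy random system matrix. This is the generic algorithm, not the group's in-house implementation.

import numpy as np

def mlem(A, y, n_iter=50):
    """MLEM reconstruction for projections y given system matrix A."""
    x = np.ones(A.shape[1])
    sens = A.T @ np.ones(A.shape[0])          # sensitivity image A^T 1
    for _ in range(n_iter):
        proj = A @ x
        proj[proj == 0] = 1e-12               # guard against division by zero
        x *= (A.T @ (y / proj)) / sens
    return x

rng = np.random.default_rng(3)
A = rng.random((120, 40))                     # toy system matrix
x_true = rng.random(40)
y = rng.poisson(A @ x_true * 50) / 50.0       # noisy projections
print(np.corrcoef(mlem(A, y), x_true)[0, 1])  # close to 1 for this toy case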
Antonuzzo, A; Vasile, E; Sbrana, A; Lucchesi, M; Galli, L; Brunetti, I M; Musettini, G; Farnesi, A; Biasco, E; Virgili, N; Falcone, A; Ricci, S
2017-01-01
Supportive care in oncology is a primary need for every oncology department nowadays. In 2012, in our institution, a dedicated supportive care service (SCS) was created in order to deal with any need our on-treatment patients might have (e.g. tumour-related or treatment-related symptoms). We hypothesized that this service had a positive impact on the number of unplanned hospitalizations; to confirm our hypothesis, we decided to review admission data for 2011 and 2012. Using our internal software, we compared admission data for 2011 (the year before the dedicated service was created) and 2012 (the service began in April of that year). We also evaluated the costs of these hospitalizations. Despite an increase in the number of patients treated in our day hospital (+6.5%), the number of unplanned hospital admissions decreased by 3.2% (from 17.3% to 14.1%). The proportion of patients accessing the emergency room went from 66% to 61% (a reduction of 5%). The costs of these hospitalizations were reduced by 2.2%. The introduction of the dedicated SCS in our oncology department produced a net reduction of 3.2% in the number of unplanned hospitalizations of on-treatment cancer patients.
[Intranet applications in radiology].
Knopp, M V; von Hippel, G M; Koch, T; Knopp, M A
2000-01-01
The aim of the paper is to present the conceptual basis and capabilities of intranet applications in radiology. The intranet, the local counterpart of the internet, can be readily realized using existing computer components and a network. All current computer operating systems support intranet applications, which allow hardware- and software-independent communication of text, images, video, and sound using browser software, without dedicated programs on the individual personal computers. Radiological applications include text communication (e.g., department-specific bulletin boards and access to examination protocols), image communication for viewing, limited processing, and documentation of radiological images on decentralized PCs, and speech communication for dictation, distribution of dictation, and speech recognition. The intranet helps to optimize organizational efficiency and cost-effectiveness in the daily work of radiological departments in outpatient and hospital settings. The general interest in internet and intranet technology will guarantee its continuous development.
NASA Astrophysics Data System (ADS)
Nagy, Tamás; Vadai, Gergely; Gingl, Zoltán
2017-09-01
Modern measurement of physical signals is based on the use of sensors, electronic signal conditioning, analog-to-digital conversion, and digital signal processing carried out by dedicated software. The same signal chain is used in many devices such as home appliances, automotive electronics, medical instruments, and smartphones. Teaching the theoretical, experimental, and signal processing background must be an essential part of improving the standard of higher education, and it fits well with the increasingly multidisciplinary nature of physics and engineering. In this paper, we show how digital phonocardiography can be used in university education as a universal, highly scalable, exciting, and inspiring laboratory practice and as a demonstration at various levels and degrees of complexity. We have developed open-source software templates in modern programming languages to support immediate use and to serve as a basis for further modifications using personal computers, tablets, and smartphones.
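One classroom-friendly processing chain for a recorded heart sound: band-pass to the main heart-sound band, then take the analytic-signal envelope. The 25-150 Hz band and the simulated input below are typical illustrative choices, not values taken from the paper.

import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def heart_sound_envelope(pcg, fs):
    """Band-pass a phonocardiogram and return its Hilbert envelope."""
    b, a = butter(4, [25, 150], btype="band", fs=fs)
    filtered = filtfilt(b, a, pcg)           # zero-phase filtering
    return np.abs(hilbert(filtered))

fs = 4000
t = np.arange(0, 2, 1 / fs)
pcg = np.random.normal(0, 0.1, t.size)       # stand-in for a recorded signal
env = heart_sound_envelope(pcg, fs)
print(env.shape)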
Real-time control of the robotic lunar observatory telescope
Anderson, J.M.; Becker, K.J.; Kieffer, H.H.; Dodd, D.N.
1999-01-01
The US Geological Survey operates an automated observatory dedicated to the radiometry of the Moon with the objective of developing a multispectral, spatially resolved photometric model of the Moon to be used in the calibration of Earth-orbiting spacecraft. Interference filters are used with two imaging instruments to observe the Moon in 32 passbands from 350-2500 nm. Three computers control the telescope mount and instruments with a fourth computer acting as a master system to control all observation activities. Real-time control software has been written to operate the instrumentation and to automate the observing process. The observing software algorithms use information including the positions of objects in the sky, the phase of the Moon, and the times of evening and morning twilight to decide how to observe program objects. The observatory has been operating in a routine mode since late 1995 and is expected to continue through at least 2002 without significant modifications.
Solutions for acceleration measurement in vehicle crash tests
NASA Astrophysics Data System (ADS)
Dima, D. S.; Covaciu, D.
2017-10-01
Crash tests are useful for validating computer simulations of road traffic accidents. One of the most important parameters measured is the acceleration. The evolution of acceleration versus time during a crash test forms a crash pulse. The correctness of the crash pulse determination depends on the data acquisition system used. Recommendations regarding the instrumentation for impact tests are given in standards, which are focused on the use of accelerometers as impact sensors. The goal of this paper is to present the device and software developed by the authors for data acquisition and processing. The system includes two accelerometers with different input ranges, a processing unit based on a 32-bit microcontroller, and a data logging unit with an SD card. Data collected on the card as text files are processed with dedicated software running on personal computers. The processing is based on diagrams and includes the digital filters recommended in the standards.
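A sketch of the kind of filtering such software applies: a zero-phase low-pass over the raw acceleration trace. The channel-class filters of SAE J211 are specified around a 4-pole phaseless Butterworth response; the cutoff, order, sampling rate, and synthetic pulse below are illustrative, not the normative values.

import numpy as np
from scipy.signal import butter, filtfilt

def crash_pulse_filter(acc, fs, fc=300.0, order=2):
    """Zero-phase Butterworth low-pass of a raw acceleration trace."""
    b, a = butter(order, fc, btype="low", fs=fs)
    return filtfilt(b, a, acc)   # forward-backward pass: 2*order poles, no lag

fs = 10_000.0                     # Hz, assumed DAQ rate
t = np.arange(0, 0.2, 1 / fs)
raw = 50 * np.exp(-((t - 0.05) / 0.01) ** 2) + np.random.normal(0, 2, t.size)
pulse = crash_pulse_filter(raw, fs)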
Persuasive Mobile Health Applications
NASA Astrophysics Data System (ADS)
Garcia Wylie, Carlos; Coulton, Paul
With many industrialized societies bearing the cost of an increasingly sedentary lifestyle on the health of their populations, there is a need to find new ways of encouraging physical activity to promote better health and well-being. With the increasing power of mobile phones and the recent emergence of personal heart rate monitors aimed at dedicated amateur runners, there is now a possibility to develop "Persuasive Mobile Health Applications" that promote well-being through the use of real-time physiological data and persuade users to adopt a healthier lifestyle. In this paper we present novel general health monitoring software for mobile phones called Heart Angel. This software is aimed at helping users monitor, record, and improve their fitness level through built-in cardio-respiratory tests, a location tracking application for analyzing heart rate exertion over time and location, and a fun mobile exergame called Health Defender.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Park, Mi-Ae; Moore, Stephen C.; McQuaid, Sarah J.
Purpose: The authors have previously reported the advantages of high-sensitivity single-photon emission computed tomography (SPECT) systems for imaging structures located deep inside the brain. DaTscan (Ioflupane I-123) is a dopamine transporter (DaT) imaging agent that has shown potential for early detection of Parkinson disease (PD), as well as for monitoring progression of the disease. Realizing the full potential of DaTscan requires efficient estimation of striatal uptake from SPECT images. They evaluated two SPECT systems, a conventional dual-head gamma camera with low-energy high-resolution collimators (conventional) and a dedicated high-sensitivity multidetector cardiac imaging system (dedicated), for imaging tasks related to PD. Methods: Cramer-Rao bounds (CRB) on the precision of estimates of striatal and background activity concentrations were calculated from high-count, separate acquisitions of the compartments (right striata, left striata, background) of a striatal phantom. CRB on striatal and background activity concentrations were calculated from essentially noise-free projection datasets, synthesized by scaling and summing the compartment projection datasets, for a range of total detected counts. They also calculated variances of estimates of specific-to-nonspecific binding ratios (BR) and asymmetry indices from these values using propagation-of-error analysis, as well as the precision of measuring changes in BR on the order of the average annual decline in early PD. Results: Under typical clinical conditions, the conventional camera detected 2 M counts while the dedicated camera detected 12 M counts. Assuming a normal BR of 5, the standard deviation of BR estimates was 0.042 and 0.021 for the conventional and dedicated system, respectively. For an 8% decrease to BR = 4.6, the signal-to-noise ratios were 6.8 (conventional) and 13.3 (dedicated); for a 5% decrease, they were 4.2 (conventional) and 8.3 (dedicated). Conclusions: This implies that PD can be detected earlier with the dedicated system than with the conventional system; therefore, earlier identification of PD progression should be possible with the high-sensitivity dedicated SPECT camera.
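The quoted signal-to-noise ratios are consistent with treating the measured change as the difference of two independent BR estimates; a quick arithmetic check (this interpretation is our reading of the numbers, not a formula stated in the abstract):

import math

def change_snr(delta_br, sigma):
    # SNR for a change between two independent BR estimates, assuming
    # var(BR1 - BR2) = 2 * sigma**2
    return delta_br / (math.sqrt(2) * sigma)

for name, sigma in (("conventional", 0.042), ("dedicated", 0.021)):
    for drop in (0.08, 0.05):            # 8% and 5% declines from BR = 5
        print(name, f"{100*drop:.0f}%", round(change_snr(5 * drop, sigma), 1))
# yields ~6.7/4.2 and ~13.5/8.4, matching the quoted 6.8/4.2 and 13.3/8.3
# to within rounding of the sigma values given in the abstract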
Deep near-infrared survey of the Southern Sky (DENIS)
NASA Technical Reports Server (NTRS)
Deul, E.
1992-01-01
DENIS (Deep Near-Infrared Survey of the Southern Sky) will be the first complete census of astronomical sources in the near-infrared spectral range. The challenges of this novel survey are both scientific and technical. As phenomena radiating in the near-infrared range from brown dwarfs to galaxies in the early stages of cosmological evolution, the scientific exploitation of data relevant over such a wide range requires pooling expertise from several of the leading European astronomical centers. The technical challenges of a project which will provide an order of magnitude more sources than given by the IRAS space mission, and which will involve advanced data-handling and image-processing techniques, likewise require pooling of hardware and software resources, as well as of human expertise. The DENIS project team is composed of some 40 scientists, computer specialists, and engineers located in 5 European Community countries (France, Germany, Italy, The Netherlands, and Spain), with important contributions from specialists in Australia, Brazil, Chile, and Hungary. DENIS will survey the entire southern sky in 3 colors, namely in the I band at a wavelength of 0.8 micron, in the 1.25 micron J band, and in the 2.15 micron K' band. The sensitivity limits will be 18th magnitude in the I band, 16th in the J band, and 14.5th in the K' band. The angular resolution achieved will be 1 arcsecond in the I band, and 3.0 arcseconds in the J and K' bands. The European Southern Observatory 1 m telescope on La Silla will be dedicated to survey use during operations expected to last four years, commencing in late 1993. DENIS aims to provide the astronomical community with complete digitized infrared images of the full southern sky and a catalogue of extracted objects, both of the best quality and in readily accessible form. This will be achieved through dedicated software packages and specialized catalogues, and with assistance from the Leiden and Paris Data Analysis Centers. The data will be mailed on DAT tapes from La Silla to the two Data Analysis Centers for further processing. Two centers are necessary because of the sheer quantity of data and because of the complementary roles the Centers will develop, each exploiting its own particular expertise. The Leiden Data Analysis Center (LDAC) will extract objects, establish their parameters, and archive them into a source catalogue. The LDAC will collaborate with the Groningen Space Research group, which has gained experience in infrared image handling from the IRAS satellite. The Paris Data Analysis Center (PDAC) will be responsible for archiving and preprocessing the raw data to provide a homogeneous set of data suitable for further reduction in both the Leiden and Paris data analysis streams. The PDAC will also extract and archive images for the sources flagged by the LDAC as extended, and create a catalogue of galaxies. In exploiting the DENIS data we foresee collaboration with other data analysis centers, such as the Observatoire de Lyon, where the relevant DENIS catalogue of galaxies can be incorporated into their extragalactic database. The Point Sources and the Small Extended Sources catalogues could be incorporated in the Late Type Star database at Montpellier, and in the SIMBAD database at CDS. At Groningen the IRAS Point Source catalogue and/or image data can be merged with the DENIS catalogues.
At Meudon, algorithms and software will be developed with the main goal of assessing the limits reachable for the homogeneity and intrinsic consistency across the ensemble of images in the database (flat-fielding, relative positioning of the fields, bootstrapped flux calibration), but also for the data analysis.
Global Ocean Currents Database
NASA Astrophysics Data System (ADS)
Boyer, T.; Sun, L.
2016-02-01
The NOAA National Centers for Environmental Information has released an ocean currents database portal that aims 1) to integrate global ocean current observations from a variety of instruments with different resolution, accuracy, and response to spatial and temporal variability into a uniform network common data form (NetCDF) format and 2) to provide dedicated online data discovery and access to NCEI-hosted and distributed data sources for ocean currents data. The portal provides a tailored web application that allows users to search for ocean currents data by platform type and spatial/temporal ranges of interest. The dedicated web application is available at http://www.nodc.noaa.gov/gocd/index.html. The NetCDF format supports widely used data access protocols and catalog services such as OPeNDAP (Open-source Project for a Network Data Access Protocol) and THREDDS (Thematic Real-time Environmental Distributed Data Services), so that GOCD users can work with the data files in their favorite analysis and visualization client software without downloading them to their local machines. The potential users of the ocean currents database include, but are not limited to: 1) ocean modelers, for model skill assessment; 2) scientists and researchers studying the impact of ocean circulation on climate variability; 3) the ocean shipping industry, for safe navigation and for finding optimal routes for ship fuel efficiency; 4) ocean resources managers, when planning optimal sites for waste and sewage dumping and for renewable hydro-kinetic energy; and 5) state and federal governments, which can use historical (analyzed) ocean circulation as an aid for search and rescue.
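A sketch of the remote-access workflow the NetCDF/OPeNDAP stack enables: the netCDF4 Python library reads an OPeNDAP URL directly, so no file is downloaded to the local machine. The URL and the variable name below are placeholders, not an actual GOCD endpoint.

from netCDF4 import Dataset

url = "https://example.noaa.gov/thredds/dodsC/gocd/sample_currents.nc"  # placeholder
ds = Dataset(url)                     # opens the remote dataset lazily
print(ds.variables.keys())            # discover the real variable names first
u = ds.variables["u"][0, :, :]        # "u" (eastward velocity) is an assumed name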
STSE: Spatio-Temporal Simulation Environment Dedicated to Biology.
Stoma, Szymon; Fröhlich, Martina; Gerber, Susanne; Klipp, Edda
2011-04-28
Recently, the availability of high-resolution microscopy, together with advancements in the development of biomarkers as reporters of biomolecular interactions, has increased the importance of imaging methods in molecular cell biology. These techniques enable the investigation of cellular characteristics like volume, size, and geometry, as well as the volume and geometry of intracellular compartments and the amount of existing proteins, in a spatially resolved manner. Such detailed investigations have opened up many new areas of research in the study of spatial, complex, and dynamic cellular systems. One of the crucial challenges for the study of such systems is the design of a well-structured and optimized workflow to provide systematic and efficient hypothesis verification. Computer science can efficiently address this task by providing software that facilitates handling, analysis, and evaluation of biological data to the benefit of experimenters and modelers. The Spatio-Temporal Simulation Environment (STSE) is a set of open-source tools provided to conduct spatio-temporal simulations in discrete structures based on microscopy images. The framework contains modules to digitize, represent, analyze, and mathematically model spatial distributions of biochemical species. Graphical user interface (GUI) tools provided with the software enable meshing of the simulation space based on the Voronoi concept. In addition, it supports automatically transferring spatial information from the images to the mesh based on pixel luminosity (e.g. corresponding to molecular levels from microscopy images). STSE is freely available either as a stand-alone version or included in the Linux live distribution Systems Biology Operational Software (SB.OS) and can be downloaded from http://www.stse-software.org/. The Python source code as well as a comprehensive user manual and video tutorials are also offered to the research community. We discuss the main concepts of the STSE design and workflow. We demonstrate its usefulness using the example of a signaling cascade leading to the formation of a morphological gradient of Fus3 within the cytoplasm of the mating yeast cell Saccharomyces cerevisiae. STSE is an efficient and powerful novel platform, designed for computational handling and evaluation of microscopic images. It allows for an uninterrupted workflow including digitization, representation, analysis, and mathematical modeling. By providing the means to relate the simulation to the image data, it allows for systematic, image-driven model validation or rejection. STSE can be scripted and extended using the Python language. STSE should be considered as an API together with workflow guidelines and a collection of GUI tools, rather than a stand-alone application. The priority of the project is to provide an easy and intuitive way of extending and customizing the software using the Python language.
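The Voronoi meshing concept used by STSE's GUI tools can be sketched with SciPy directly (this uses scipy.spatial rather than the STSE API; the seed points stand in for segmented cell centers).

import numpy as np
from scipy.spatial import Voronoi

seeds = np.random.default_rng(4).uniform(0, 100, size=(50, 2))  # e.g. cell centers
vor = Voronoi(seeds)
print(len(vor.regions), "regions,", len(vor.vertices), "vertices")
# vor.point_region[i] gives the mesh compartment belonging to seed i;
# per-compartment luminosity from the image would then initialize species levels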
An arrhythmia classification algorithm using a dedicated wavelet adapted to different subjects.
Kim, Jinkwon; Min, Se Dong; Lee, Myoungho
2011-06-27
Numerous studies have been conducted regarding heartbeat classification algorithms over the past several decades. However, many algorithms have also been studied to acquire robust performance, as biosignals have a large amount of variation among individuals. Various methods have been proposed to reduce the differences coming from personal characteristics, but these expand the differences caused by arrhythmia. In this paper, an arrhythmia classification algorithm using a dedicated wavelet adapted to individual subjects is proposed. We reduced the performance variation using dedicated wavelets matched to the ECG morphologies of the subjects. The proposed algorithm utilizes morphological filtering and a continuous wavelet transform with a dedicated wavelet. A principal component analysis and linear discriminant analysis were utilized to compress the morphological data transformed by the dedicated wavelets. An extreme learning machine was used as a classifier in the proposed algorithm. A performance evaluation was conducted with the MIT-BIH arrhythmia database. The results showed a high sensitivity of 97.51%, specificity of 85.07%, accuracy of 97.94%, and a positive predictive value of 97.26%. The proposed algorithm achieves better accuracy than other state-of-the-art algorithms with no intrasubject overlap between the training and evaluation datasets. It also significantly reduces the amount of intervention needed by physicians.
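A sketch of the transform-then-compress stage of such a pipeline: continuous wavelet transform of each beat followed by PCA. A standard Morlet wavelet stands in for the paper's subject-adapted wavelet, and the beat data, scales, and component count are illustrative.

import numpy as np
import pywt
from sklearn.decomposition import PCA

rng = np.random.default_rng(5)
beats = rng.normal(size=(100, 180))            # 100 beats, 180 samples each
scales = np.arange(1, 33)

features = np.array([
    pywt.cwt(beat, scales, "morl")[0].ravel()  # CWT coefficients, flattened
    for beat in beats
])
compressed = PCA(n_components=12).fit_transform(features)
print(compressed.shape)                        # (100, 12) -> classifier input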
NASA Astrophysics Data System (ADS)
Burgisser, Alain; Alletti, Marina; Scaillet, Bruno
2015-06-01
Modeling magmatic degassing, or how the volatile distribution between gas and melt changes as pressure varies, is a complex task that involves a large number of thermodynamic relationships and requires dedicated software. This article presents the software D-Compress, which computes the gas and melt volatile composition of five element sets in magmatic systems (O-H, S-O-H, C-S-O-H, C-S-O-H-Fe, and C-O-H). It has been calibrated so as to simulate the volatiles coexisting with three common types of silicate melts (basalt, phonolite, and rhyolite). Operational temperatures depend on melt composition and range from 790 to 1400 °C. A specificity of D-Compress is the calculation of volatile composition as pressure varies along a (de)compression path between atmospheric pressure and 3000 bars. The software was designed to maximize versatility by proposing different sets of input parameters. In particular, whenever new solubility laws for specific melt compositions become available, the model parameters can easily be tuned to run the code on that composition. Parameter gaps were minimized by including sets of chemical species for which calibration data were available over a wide range of pressure, temperature, and melt composition. A brief description of the model rationale is followed by the presentation of the software capabilities. Examples of use are then presented, with output comparisons between D-Compress and other currently available thermodynamic models. The compiled software and the source code are available as electronic supplementary materials.
Jaspard, Emmanuel; Macherel, David; Hunault, Gilles
2012-01-01
Late Embryogenesis Abundant Proteins (LEAPs) are ubiquitous proteins expected to play major roles in desiccation tolerance. Little is known about their structure-function relationships because of the scarcity of 3-D structures for LEAPs. The previous building of LEAPdb, a database dedicated to LEAPs from plants and other organisms, led to the classification of 710 LEAPs into 12 non-overlapping classes with distinct properties. Using this resource, numerous physico-chemical properties of LEAPs and amino acid usage by LEAPs have been computed and statistically analyzed, revealing distinctive features for each class. This unprecedented analysis allowed a rigorous characterization of the 12 LEAP classes, which differed also in multiple structural and physico-chemical features. Although most LEAPs can be predicted as intrinsically disordered proteins, the analysis indicates that LEAP class 7 (PF03168) and probably LEAP class 11 (PF04927) are natively folded proteins. This study thus provides a detailed description of the structural properties of this protein family, opening the path toward further LEAP structure-function analysis. Finally, since each LEAP class can be clearly characterized by a unique set of physico-chemical properties, this will allow development of software to predict proteins as LEAPs. PMID:22615859
Distributed and Collaborative Software Analysis
NASA Astrophysics Data System (ADS)
Ghezzi, Giacomo; Gall, Harald C.
Throughout the years software engineers have come up with a myriad of specialized tools and techniques that focus on a certain type of
Bohler, Anwesha; Eijssen, Lars M T; van Iersel, Martijn P; Leemans, Christ; Willighagen, Egon L; Kutmon, Martina; Jaillard, Magali; Evelo, Chris T
2015-08-23
Biological pathways are descriptive diagrams of biological processes widely used for functional analysis of differentially expressed genes or proteins. Primary data analysis, such as quality control, normalisation, and statistical analysis, is often performed in scripting languages like R, Perl, and Python. Subsequent pathway analysis is usually performed using dedicated external applications. Workflows involving manual use of multiple environments are time consuming and error prone. Therefore, tools are needed that enable pathway analysis directly within the same scripting languages used for primary data analyses. Existing tools have limited capability in terms of available pathway content, pathway editing and visualisation options, and export file formats. Consequently, making the full-fledged pathway analysis tool PathVisio available from various scripting languages will benefit researchers. We developed PathVisioRPC, an XMLRPC interface for the pathway analysis software PathVisio. PathVisioRPC enables creating and editing biological pathways, visualising data on pathways, performing pathway statistics, and exporting results in several image formats in multiple programming environments. We demonstrate PathVisioRPC functionalities using examples in Python. Subsequently, we analyse a publicly available NCBI GEO gene expression dataset studying tumour bearing mice treated with cyclophosphamide in R. The R scripts demonstrate how calls to existing R packages for data processing and calls to PathVisioRPC can directly work together. To further support R users, we have created RPathVisio simplifying the use of PathVisioRPC in this environment. We have also created a pathway module for the microarray data analysis portal ArrayAnalysis.org that calls the PathVisioRPC interface to perform pathway analysis. This module allows users to use PathVisio functionality online without having to download and install the software and exemplifies how the PathVisioRPC interface can be used by data analysis pipelines for functional analysis of processed genomics data. PathVisioRPC enables data visualisation and pathway analysis directly from within various analytical environments used for preliminary analyses. It supports the use of existing pathways from WikiPathways or pathways created using the RPC itself. It also enables automation of tasks performed using PathVisio, making it useful to PathVisio users performing repeated visualisation and analysis tasks. PathVisioRPC is freely available for academic and commercial use at http://projects.bigcat.unimaas.nl/pathvisiorpc.
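A sketch of what calling an XML-RPC service like PathVisioRPC from Python looks like. The port and the commented method names are illustrative assumptions, not the documented PathVisioRPC API; the standard introspection call at the end, if the server enables it, lists the real method names.

import xmlrpc.client

srv = xmlrpc.client.ServerProxy("http://localhost:7777")  # port is an assumption

# hypothetical calls sketching the workflow described above:
# pwy = srv.createPathway("Apoptosis", "Homo sapiens")
# srv.addGeneProduct(pwy, "TP53")
# srv.exportPathway(pwy, "apoptosis.png")

print(srv.system.listMethods())  # standard XML-RPC introspection, if enabled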
Gas monitoring onboard ISS using FTIR spectroscopy
NASA Astrophysics Data System (ADS)
Gisi, Michael; Stettner, Armin; Seurig, Roland; Honne, Atle; Witt, Johannes; Rebeyre, Pierre
2017-06-01
In the confined, enclosed environment of a spacecraft, the air quality must be monitored continuously in order to safeguard the crew's health. For this reason, OHB is building the ANITA2 (Analysing Interferometer for Ambient Air) technology demonstrator for trace gas monitoring onboard the International Space Station (ISS). The measurement principle of ANITA2 is based on Fourier Transform Infrared (FTIR) technology, with dedicated gas analysis software from the Norwegian partner SINTEF. This combination proved to provide high sensitivity, accuracy, and precision for parallel measurements of 33 trace gases onboard the ISS by the precursor instrument ANITA1. The paper gives a technical overview of the opto-mechanical components of ANITA2, such as the interferometer, the reference laser, the infrared source, and the gas cell design, and a quick overview of the gas analysis. ANITA2 is very well suited for measuring gas concentrations, specifically but not limited to usage onboard spacecraft, as no consumables are required and measurements are performed autonomously. ANITA2 is a programme under contract to the European Space Agency, and the air quality monitoring system is a stepping stone into the future, as a precursor system for manned exploration missions.
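A generic sketch of multi-gas FTIR retrieval: fit reference absorption spectra to a measured spectrum by non-negative least squares. This is the textbook approach, not SINTEF's proprietary algorithm, and all spectra below are synthetic.

import numpy as np
from scipy.optimize import nnls

rng = np.random.default_rng(6)
n_channels, n_gases = 500, 5
refs = np.abs(rng.normal(size=(n_channels, n_gases)))   # unit-concentration spectra
true_conc = np.array([1.2, 0.0, 3.4, 0.5, 0.0])
measured = refs @ true_conc + rng.normal(0, 0.01, n_channels)

conc, resid = nnls(refs, measured)      # concentrations constrained >= 0
print(np.round(conc, 2))                # ~ [1.2, 0.0, 3.4, 0.5, 0.0]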
IoT for Real-Time Measurement of High-Throughput Liquid Dispensing in Laboratory Environments.
Shumate, Justin; Baillargeon, Pierre; Spicer, Timothy P; Scampavia, Louis
2018-04-01
Critical to maintaining quality control in high-throughput screening is the need for constant monitoring of liquid-dispensing fidelity. Traditional methods involve operator intervention with gravimetric analysis to monitor the gross accuracy of full plate dispenses, visual verification of contents, or dedicated weigh stations on screening platforms that introduce potential bottlenecks and increase the plate-processing cycle time. We present a unique solution using open-source hardware, software, and 3D printing to automate dispenser accuracy determination by providing real-time dispense weight measurements via a network-connected precision balance. This system uses an Arduino microcontroller to connect a precision balance to a local network. By integrating the precision balance as an Internet of Things (IoT) device, it gains the ability to provide real-time gravimetric summaries of dispensing, generate timely alerts when problems are detected, and capture historical dispensing data for future analysis. All collected data can then be accessed via a web interface for reviewing alerts and dispensing information in real time or remotely for timely intervention of dispense errors. The development of this system also leveraged 3D printing to rapidly prototype sensor brackets, mounting solutions, and component enclosures.
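A sketch of the monitoring side of such a system: a client that reads the network-connected balance and flags out-of-tolerance dispenses. The host, port, line format (one reading in grams per line), and tolerance are assumptions for illustration, not the published system's protocol.

import socket

TARGET_UL, DENSITY_G_PER_ML, TOL = 25.0, 1.0, 0.05   # target, density, +/-5%

def monitor(host="192.168.1.50", port=8080):
    """Stream weight readings and alert on per-well volume deviations."""
    last = None
    with socket.create_connection((host, port)) as s, s.makefile() as lines:
        for line in lines:
            grams = float(line.strip())
            if last is not None:
                vol_ul = (grams - last) / DENSITY_G_PER_ML * 1000.0
                if abs(vol_ul - TARGET_UL) > TOL * TARGET_UL:
                    print(f"ALERT: {vol_ul:.2f} uL dispensed, target {TARGET_UL}")
            last = grams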
Analysis of Factors Limiting Bacterial Growth in PDMS Mother Machine Devices.
Yang, Da; Jennings, Anna D; Borrego, Evalynn; Retterer, Scott T; Männik, Jaan
2018-01-01
The microfluidic mother machine platform has attracted much interest for its potential in studies of bacterial physiology, cellular organization, and cell mechanics. Despite numerous experiments and development of dedicated analysis software, differences in bacterial growth and morphology in narrow mother machine channels compared to typical liquid media conditions have not been systematically characterized. Here we determine changes in E. coli growth rates and cell dimensions in different sized dead-end microfluidic channels using high resolution optical microscopy. We find that E. coli adapt to the confined channel environment by becoming narrower and longer compared to the same strain grown in liquid culture. Cell dimensions decrease as the channel length increases and width decreases. These changes are accompanied by increases in doubling times in agreement with the universal growth law. In channels 100 μm and longer, cell doublings can completely stop as a result of frictional forces that oppose cell elongation. Before complete cessation of elongation, mechanical stresses lead to substantial deformation of cells and changes in their morphology. Our work shows that mechanical forces rather than nutrient limitation are the main growth limiting factor for bacterial growth in long and narrow channels.
phenoVein—A Tool for Leaf Vein Segmentation and Analysis
Pflugfelder, Daniel; Huber, Gregor; Scharr, Hanno; Hülskamp, Martin; Koornneef, Maarten; Jahnke, Siegfried
2015-01-01
Precise measurements of leaf vein traits are an important aspect of plant phenotyping for ecological and genetic research. Here, we present a powerful and user-friendly image analysis tool named phenoVein. It is dedicated to automated segmentation and analysis of leaf veins in images acquired with different imaging modalities (microscope, macrophotography, etc.), including options for comfortable manual correction. Advanced image filtering emphasizes veins against the background and compensates for local brightness inhomogeneities. The most important traits calculated are total vein length, vein density, piecewise vein lengths and widths, areole area, and skeleton graph statistics, like the number of branching or ending points. For the determination of vein widths, a model-based vein edge estimation approach has been implemented. Validation was performed for the measurement of vein length, vein width, and vein density of Arabidopsis (Arabidopsis thaliana), confirming the reliability of phenoVein. We demonstrate the power of phenoVein on a set of previously described vein structure mutants of Arabidopsis (hemivenata, ondulata3, and asymmetric leaves2-101) compared with the wild-type accessions Columbia-0 and Landsberg erecta-0. phenoVein is freely available as open-source software. PMID:26468519
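As a toy illustration of the skeleton-based traits phenoVein reports (total vein length, vein density), here is a sketch using scikit-image on a synthetic vein mask; the mask and pixel size are invented, and phenoVein's model-based width estimation is not reproduced:

```python
import numpy as np
from skimage.morphology import skeletonize

# Synthetic binary vein mask: a cross of two "veins" on a 100x100 patch.
mask = np.zeros((100, 100), dtype=bool)
mask[48:52, 10:90] = True   # horizontal vein, ~4 px wide
mask[10:90, 48:52] = True   # vertical vein

skeleton = skeletonize(mask)
px = 0.01  # assumed pixel size in mm
# Crude length estimate: counting skeleton pixels ignores diagonal steps.
total_length_mm = skeleton.sum() * px
leaf_area_mm2 = mask.size * px ** 2
print(f"vein length ~ {total_length_mm:.2f} mm, "
      f"density ~ {total_length_mm / leaf_area_mm2:.2f} mm/mm^2")
```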
In vivo growth of 60 non-screening detected lung cancers: a computed tomography study.
Mets, Onno M; Chung, Kaman; Zanen, Pieter; Scholten, Ernst T; Veldhuis, Wouter B; van Ginneken, Bram; Prokop, Mathias; Schaefer-Prokop, Cornelia M; de Jong, Pim A
2018-04-01
Current pulmonary nodule management guidelines are based on nodule volume doubling time, which assumes exponential growth behaviour. However, this theory has never been validated in vivo in the routine-care target population. This study evaluates growth patterns of untreated solid and subsolid lung cancers of various histologies in a non-screening setting. Growth behaviour of pathology-proven lung cancers from two academic centres that were imaged at least three times before diagnosis (n=60) was analysed using dedicated software. Random-intercept random-slope mixed-models analysis was applied to test which growth pattern most accurately described lung cancer growth. Individual growth curves were plotted per pathology subgroup and nodule type. We confirmed that growth in both subsolid and solid lung cancers is best explained by an exponential model. However, subsolid lesions generally progress more slowly than solid ones. Baseline lesion volume was not related to growth, indicating that smaller lesions do not grow more slowly than larger ones. By showing that lung cancer conforms to exponential growth, we provide the first experimental basis in the routine-care setting for the assumption made in volume doubling time analysis. Copyright ©ERS 2018.
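Volume doubling time under the exponential model the study validates follows directly from two volume measurements; a worked example with hypothetical volumes:

```python
import math

def volume_doubling_time(v1_mm3, v2_mm3, dt_days):
    """Doubling time under exponential growth: VDT = dt*ln(2) / ln(V2/V1)."""
    return dt_days * math.log(2) / math.log(v2_mm3 / v1_mm3)

# Hypothetical example: a nodule growing from 200 to 300 mm^3 in 90 days.
print(f"VDT ~ {volume_doubling_time(200, 300, 90):.0f} days")  # ~154 days
```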
NASA Astrophysics Data System (ADS)
Prévereaud, Y.; Vérant, J.-L.; Balat-Pichelin, M.; Moschetta, J.-M.
2016-05-01
To answer the question of space debris survivability during atmospheric entry, ONERA uses its software MUSIC/FAST. The first part of this paper is therefore dedicated to the presentation of the ONERA tool and its validation by comparison with flight data and CFD computations. However, the influence of oxidation on the thermal degradation process and material properties under atmospheric entry conditions is still unknown. A second part is then devoted to the presentation of an experimental campaign investigating TA6V oxidation under atmospheric entry conditions, as most of the debris found on the ground is made of this material. Experiments were performed using the MESOX facility implemented at the 6 kW solar furnace of the PROMES-CNRS laboratory. Finally, an application of MUSIC/FAST is proposed for the atmospheric re-entry of a generic TA6V tank. Aiming at degradation assessment, a sensitivity study on initial conditions is conducted. To complete the computational analysis of the degradation process by melting, a numerical analysis of the influence of oxidation on thermal wall degradation during the tank's atmospheric re-entry is presented as well.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hussain, Hameed; Malik, Saif Ur Rehman; Hameed, Abdul
An efficient resource allocation is a fundamental requirement in high performance computing (HPC) systems. Many projects are dedicated to large-scale distributed computing systems that have designed and developed resource allocation mechanisms with a variety of architectures and services. In our study, a comprehensive survey describing resource allocation in various HPC systems is reported. The aim of the work is to aggregate the existing solutions for HPC under a joint framework, in order to provide a thorough analysis and characterization of the resource management and allocation strategies. Resource allocation mechanisms and strategies play a vital role in the performance improvement of all HPC classifications. Therefore, a comprehensive discussion of widely used resource allocation strategies deployed in HPC environments is required, which is one of the motivations of this survey. Moreover, we have classified HPC systems into three broad categories, namely: (a) cluster, (b) grid, and (c) cloud systems, and define the characteristics of each class by extracting sets of common attributes. All of the aforementioned systems are catalogued into pure software and hybrid/hardware solutions. The system classification is used to identify approaches followed by the implementation of existing resource allocation strategies that are widely presented in the literature.
Using Wide-Field Meteor Cameras to Actively Engage Students in Science
NASA Astrophysics Data System (ADS)
Kuehn, D. M.; Scales, J. N.
2012-08-01
Astronomy has always afforded teachers an excellent topic to develop students' interest in science. New technology allows the opportunity to inexpensively outfit local school districts with sensitive, wide-field video cameras that can detect and track brighter meteors and other objects. While the data-collection and analysis process can be mostly automated by software, there is substantial human involvement that is necessary in the rejection of spurious detections, in performing dynamics and orbital calculations, and the rare recovery and analysis of fallen meteorites. The continuous monitoring allowed by dedicated wide-field surveillance cameras can provide students with a better understanding of the behavior of the night sky including meteors and meteor showers, stellar motion, the motion of the Sun, Moon, and planets, phases of the Moon, meteorological phenomena, etc. Additionally, some students intrigued by the possibility of UFOs and "alien visitors" may find that actual monitoring data can help them develop methods for identifying "unknown" objects. We currently have two ultra-low light-level surveillance cameras coupled to fish-eye lenses that are actively obtaining data. We have developed curricula suitable for middle or high school students in astronomy and earth science courses and are in the process of testing and revising our materials.
Operating Dedicated Data Centers - Is It Cost-Effective?
NASA Astrophysics Data System (ADS)
Ernst, M.; Hogue, R.; Hollowell, C.; Strecker-Kellog, W.; Wong, A.; Zaytsev, A.
2014-06-01
The advent of cloud computing centres such as Amazon's EC2 and Google's Computing Engine has elicited comparisons with dedicated computing clusters. Discussions on appropriate usage of cloud resources (both academic and commercial) and costs have ensued. This presentation discusses a detailed analysis of the costs of operating and maintaining the RACF (RHIC and ATLAS Computing Facility) compute cluster at Brookhaven National Lab and compares them with the cost of cloud computing resources under various usage scenarios. An extrapolation of likely future cost effectiveness of dedicated computing resources is also presented.
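The core of such a cost comparison is that cloud capacity is pay-per-use while dedicated capacity is amortized regardless of load; a toy sketch with placeholder prices (not the RACF figures) shows how the break-even point depends on utilization:

```python
# All figures are illustrative placeholders, not the RACF numbers.
DEDICATED_COST_PER_CORE_YEAR = 120.0   # hardware + power + staff, amortized
CLOUD_COST_PER_CORE_HOUR = 0.05        # on-demand rate

HOURS_PER_YEAR = 8760

def cloud_cost_per_core_year(utilization):
    """Cloud is pay-per-use, so cost scales with the fraction of time used."""
    return CLOUD_COST_PER_CORE_HOUR * HOURS_PER_YEAR * utilization

for u in (0.25, 0.50, 0.90):
    c = cloud_cost_per_core_year(u)
    cheaper = "cloud" if c < DEDICATED_COST_PER_CORE_YEAR else "dedicated"
    print(f"utilization {u:.0%}: cloud ${c:.0f}/core-year -> {cheaper} wins")
```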
NASA Technical Reports Server (NTRS)
1976-01-01
The preliminary analysis of strawman earth-viewing shuttle sortie payloads, begun with the partial Spacelab payload, is continued. The payloads analyzed represent the two extremes of shuttle sortie application payloads: a full shuttle sortie payload dedicated to earth-viewing applications, and a small structure payload which can fly on a space-available basis with another primary shuttle payload such as a free-flying satellite. The intent of the dedicated mission analysis was to configure an ambitious but feasible payload which, while rich in scientific return, would also stress the system and reveal any deficiencies or problem areas in mission planning, support equipment, and operations. Conversely, the intent of the small structure payload was to demonstrate the ease with which a small, simple, flexible payload can be accommodated on shuttle flights.
Ampudia-Blasco, Francisco Javier; García-Soidán, Francisco Javier; Rubio Sánchez, Manuela; Phan, Tra-Mi
2017-03-01
DiaScope® is a software tool to support individualized prescription of antidiabetic treatment in type 2 diabetes. This study assessed its value and acceptability among different professionals. DiaScope® was developed based on the ADA-EASD 2012 algorithm and on the recommendations of 12 international diabetes experts using the RAND/UCLA appropriateness method. The current study was performed in a single session. In the first phase, 5 clinical scenarios were evaluated, selecting the most appropriate therapeutic option among 4 possibilities (initial test). In a second phase, the same clinical cases were evaluated with DiaScope® (final test). Opinion surveys on DiaScope® were also performed (questionnaire). DiaScope® changed the selected option one or more times in 70.5% of cases. Among 275 evaluated questionnaires, 54.0% strongly agreed that DiaScope® made it easy to find a therapeutic scenario similar to the corresponding patient, and 52.5% of the obtained answers were clinically plausible. Up to 58.3% would recommend it to a colleague. In particular, primary care physicians with >20 years of professional practice found the most appropriate option for a particular situation with DiaScope® more often than specialists or those with less professional experience (p<.05). DiaScope® is an easy-to-use tool for antidiabetic drug prescription that provides plausible solutions and is especially useful for primary care physicians with more years of professional practice. Copyright © 2017 SEEN. Published by Elsevier España, S.L.U. All rights reserved.
Software architecture and design of the web services facilitating climate model diagnostic analysis
NASA Astrophysics Data System (ADS)
Pan, L.; Lee, S.; Zhang, J.; Tang, B.; Zhai, C.; Jiang, J. H.; Wang, W.; Bao, Q.; Qi, M.; Kubar, T. L.; Teixeira, J.
2015-12-01
Climate model diagnostic analysis is a computationally- and data-intensive task because it involves multiple numerical model outputs and satellite observation data that can both be high resolution. We have built an online tool that facilitates this process, called the Climate Model Diagnostic Analyzer (CMDA). It employs web service technology and provides a web-based user interface. The benefits of these choices include: (1) no installation of any software other than a browser, hence platform independence; (2) co-location of computation and big data on the server side, with only small results and plots downloaded on the client side, hence high data efficiency; (3) a multi-threaded implementation to achieve parallel performance on multi-core servers; and (4) cloud deployment, so each user has a dedicated virtual machine. In this presentation, we will focus on the computer science aspects of this tool, namely the architectural design, the infrastructure of the web services, the implementation of the web-based user interface, the mechanism of provenance collection, the approach to virtualization, and the Amazon Cloud deployment. As an example, we will describe our methodology to transform an existing science application code into a web service using a Python wrapper interface and Python web service frameworks (i.e., Flask, Gunicorn, and Tornado). Another example is the use of Docker, a light-weight virtualization container, to distribute and deploy CMDA onto an Amazon EC2 instance. CMDA has been successfully used in the 2014 Summer School hosted by the JPL Center for Climate Science. Students gave positive feedback in general, and we will report their comments. An enhanced version of CMDA with several new features, some requested by the 2014 students, will be used in the 2015 Summer School soon.
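As a sketch of the wrapping approach described (an existing science code exposed as a web service via a Python framework), the following minimal Flask app is illustrative only; the route name and parameters are invented, not CMDA's actual API:

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

def science_kernel(variable, region):
    """Stand-in for an existing analysis code wrapped as a service."""
    return {"variable": variable, "region": region, "mean": 42.0}

@app.route("/diagnostic", methods=["GET"])
def diagnostic():
    # Query parameters select the analysis; defaults are placeholders.
    variable = request.args.get("variable", "cloud_fraction")
    region = request.args.get("region", "global")
    return jsonify(science_kernel(variable, region))

if __name__ == "__main__":
    # In production one would run this under Gunicorn, e.g.:
    #   gunicorn -w 4 app:app
    app.run(port=8080)
```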
CAMECA IMS 1300-HR3: The New Generation Ion Microprobe
NASA Astrophysics Data System (ADS)
Peres, P.; Choi, S. Y.; Renaud, L.; Saliot, P.; Larson, D. J.
2016-12-01
The success of secondary ion mass spectrometry (SIMS) in Geo- and Cosmo-chemistry relies on its performance in terms of: 1) very high sensitivity (mandatory for high precision measurements or to achieve low detection limits); 2) a broad mass range of elemental and isotopic species, from low mass (H) to high mass (U and above); 3) in-situ analysis of any solid flat polished surface; and 4) high spatial resolution from tens of microns down to sub-micron scale. The IMS 1300-HR3 (High Reproducibility, High spatial Resolution, High mass Resolution) is the latest generation of CAMECA's large geometry magnetic sector SIMS (or ion microprobe), successor to the internationally recognized IMS 1280-HR. The 1300-HR3 delivers unmatched analytical performance for a wide range of applications (stable isotopes, geochronology, trace elements, nuclear safeguards and environmental studies…) due to: • High brightness RF-plasma oxygen ion source with enhanced beam density and current stability, dramatically improving spatial resolution, data reproducibility, and throughput • Automated sample loading system with motorized sample height (Z) adjustment, significantly increasing analysis precision, ease-of-use, and productivity • UV-light microscope for enhanced optical image resolution, together with dedicated software for easy sample navigation (developed by University of Wisconsin, USA) • Low noise 1012Ω resistor Faraday cup preamplifier boards for measuring low signal intensities In addition, improvements in electronics and software have been integrated into the new instrument. In order to meet a growing demand from geochronologists, CAMECA also introduces the KLEORA, a fully optimized ion microprobe for advanced mineral dating derived from the IMS 1300-HR3. Instrumental developments as well as data obtained for stable isotope and U-Pb dating applications will be presented in detail.
Bioelectricity versus bioethanol from sugarcane bagasse: is it worth being flexible?
2013-01-01
Background Sugarcane is the most efficient crop for production of (1G) ethanol. Additionally, sugarcane bagasse can be used to produce (2G) ethanol. However, the manufacture of 2G ethanol in large scale is not a consolidated process yet. Thus, a detailed economic analysis, based on consistent simulations of the process, is worthwhile. Moreover, both ethanol and electric energy markets have been extremely volatile in Brazil, which suggests that a flexible biorefinery, able to switch between 2G ethanol and electric energy production, could be an option to absorb fluctuations in relative prices. Simulations of three cases were run using the software EMSO: production of 1G ethanol + electric energy, of 1G + 2G ethanol and a flexible biorefinery. Bagasse for 2G ethanol was pretreated with a weak acid solution, followed by enzymatic hydrolysis, while 50% of sugarcane trash (mostly leaves) was used as surplus fuel. Results With maximum diversion of bagasse to 2G ethanol (74% of the total), an increase of 25.8% in ethanol production (reaching 115.2 L/tonne of sugarcane) was achieved. An increase of 21.1% in the current ethanol price would be enough to make all three biorefineries economically viable (11.5% for the 1G + 2G dedicated biorefinery). For 2012 prices, the flexible biorefinery presented a lower Internal Rate of Return (IRR) than the 1G + 2G dedicated biorefinery. The impact of electric energy prices (auction and spot market) and of enzyme costs on the IRR was not as significant as it would be expected. Conclusions For current market prices in Brazil, not even production of 1G bioethanol is economically feasible. However, the 1G + 2G dedicated biorefinery is closer to feasibility than the conventional 1G + electric energy industrial plant. Besides, the IRR of the 1G + 2G biorefinery is more sensitive with respect to the price of ethanol, and an increase of 11.5% in this value would be enough to achieve feasibility. The ability of the flexible biorefinery to take advantage of seasonal fluctuations does not make up for its higher investment cost, in the present scenario. PMID:24088415
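The IRR figures discussed here come from standard discounted-cash-flow analysis; a minimal sketch with invented cash flows (not the paper's EMSO results):

```python
def npv(rate, cashflows):
    """Net present value of yearly cashflows (index 0 = year 0)."""
    return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))

def irr(cashflows, lo=-0.99, hi=1.0, tol=1e-6):
    """Internal rate of return by bisection on the NPV sign change."""
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(lo, cashflows) * npv(mid, cashflows) <= 0:
            hi = mid
        else:
            lo = mid
    return (lo + hi) / 2

# Hypothetical biorefinery: investment of 100, then 15 years of net revenue 14.
flows = [-100.0] + [14.0] * 15
print(f"IRR ~ {irr(flows):.1%}")
```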
Ultra-rapid EOP determination with VLBI
NASA Astrophysics Data System (ADS)
Haas, Rüdiger; Kurihara, Shinobu; Nozawa, Kentaro; Hobiger, Thomas; Lovell, Jim; McCallum, Jamie; Quick, Jonathan
2013-04-01
In 2007 the Geospatial information Authority of Japan (GSI) and the Onsala Space Observatory (OSO) started a project aiming at determining the earth rotation angle, usually expressed as dUT1, in near real-time. In the beginning of this project dedicated one hour long one-baseline experiments were observed periodically using the VLBI stations Onsala (Sweden) and Tsukuba (Japan). The strategy is that the observed VLBI-data are sent in real-time via the international optical fibre backbone to the VLBI-correlator at Tsukuba where the data are correlated and analyzed in near-real time, producing ultra-rapid dUT1 results. An offline version of this strategy has been adopted in 2009 for the regular VLBI intensive series INT-2 involving Wettzell (Germany) and Tsukuba. Since March 2010 the INT-2 is using real-time e-transfer, too, and since June 2010 also automated analysis. Starting in 2009 the ultra-rapid approach was applied to regular 24 hour long VLBI-sessions that involve Tsukuba and Onsala, so that ultra-rapid dUT1 results can be produced already during ongoing VLBI-sessions. This strategy was successfully operated during the 15 days long CONT11 campaign. In 2011 the ultra-rapid strategy was extended to involve a network of VLBI-stations, so that not only dUT1 but also the polar motion components can be determined in near real-time. Initially, in November 2011 a dedicated three-station session was observed involving Onsala, Tsukuba and Hobart (Tasmania, Australia). In 2012 several regular 24 hour long IVS-sessions that involved Onsala, Tsukuba and HartRAO (South Africa) were operated with the ultra-rapid strategy, and in several cases also Hobart was added as a fourth station. For this project we use the new analysis software c5++ developed by the National Institute of Information and Communications Technology (NICT). In this presentation we give an overview of the UREOP-project, describe the recent developments, and discuss the obtained results.
Boukazouha, F; Poulin-Vittrant, G; Tran-Huu-Hue, L P; Bavencoffe, M; Boubenider, F; Rguiti, M; Lethiecq, M
2015-07-01
This article is dedicated to the study of Piezoelectric Transformers (PTs), which offer promising solutions to the increasing need for integrated power electronics modules within autonomous systems. The advantages offered by such transformers include immunity to electromagnetic disturbances; ease of miniaturisation, for example using conventional microfabrication processes; and enhanced performance in terms of voltage gain and power efficiency. Central to the adequate description of such transformers is the need for complex analytical modeling tools, especially if one is attempting to include the combined contributions of (i) mechanical phenomena owing to the different propagation modes at the primary and secondary sides of the PT; and (ii) electrical phenomena, such as the voltage gain and power efficiency, which depend on the electrical load. The present work demonstrates an original one-dimensional (1D) analytical model dedicated to a Rosen-type PT; simulation results are successively compared against those of a three-dimensional (3D) Finite Element Analysis (COMSOL Multiphysics software) and against experimental results. The Rosen-type PT studied here is based on a single-layer soft PZT (P191) with dimensions 18 mm × 3 mm × 1.5 mm, operated at the second harmonic of 176 kHz. Detailed simulation and experimental results show that the presented 1D model predicts the measured voltage gain to within 10% error at the second and third resonance frequency modes. Adjustment of the analytical model parameters decreases the error relative to the experimental voltage gain to within 1%, while a 2.5% error on the output admittance magnitude at the second resonance mode was obtained. Relying on the single assumption of one-dimensionality, the present analytical model appears to be a useful tool for Rosen-type PT design and behavior understanding. Copyright © 2015 Elsevier B.V. All rights reserved.
Liu, Ya-Fei; Yuan, Hong-Fu; Song, Chun-Feng; Xie, Jin-Chun; Li, Xiao-Yu; Yan, De-Lin
2014-11-01
A new method is proposed for the fast determination of the induction period of gasoline using Fourier transform attenuated total reflection infrared spectroscopy (ATR-FTIR). A dedicated analysis system with spectral measurement, data processing, display and storage functions was designed and integrated using a Fourier transform infrared spectrometer module and chemometric software. The sample presentation accessory, designed for a constant optical path and convenient sample injection and cleaning, is composed of a nine-reflection attenuated total reflectance (ATR) zinc selenide (ZnSe) crystal coated with a diamond film and a stainless steel lid with a sealing device. The influence of the number of spectral scans and of repeated sample loadings on the spectral signal-to-noise ratio was studied; the optimum number of spectral scans is 15 and the optimum number of sample loadings is 4. Sixty-four different gasoline samples were collected from the Beijing-Tianjin area and their induction period values were determined as reference data by the standard method GB/T 8018-87. The infrared spectra of these samples were collected under the operating conditions mentioned above using the dedicated fast analysis system. Spectra were pretreated using mean centering and the first derivative to reduce the influence of spectral noise and baseline shift. A PLS calibration model for the induction period was established by correlating the known induction period values of the samples with their spectra. The correlation coefficient (R2), standard error of calibration (SEC) and standard error of prediction (SEP) of the model are 0.897, 68.3 minutes and 91.9 minutes, respectively. The relative deviation of the model for gasoline induction period prediction is less than 5%, which meets the repeatability tolerance requirements of the GB method. The new method is simple and fast, taking no more than 3 minutes per sample; it is therefore feasible for fast determination of the gasoline induction period and of positive value for fuel quality evaluation.
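A PLS calibration of the kind described, i.e. regressing a property value on pretreated spectra, can be sketched with scikit-learn on synthetic data; the spectra and coefficients below are invented, not the Beijing-Tianjin dataset:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic stand-in for 64 ATR-FTIR spectra (500 wavenumber points) with
# the property value encoded in a few bands plus noise.
X = rng.normal(size=(64, 500))
true_coef = np.zeros(500)
true_coef[[50, 120, 300]] = [2.0, -1.5, 1.0]
y = X @ true_coef + rng.normal(scale=0.5, size=64)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
# PLSRegression centers X and y internally, mirroring the mean-centering step.
pls = PLSRegression(n_components=5).fit(X_tr, y_tr)
pred = pls.predict(X_te).ravel()
sep = np.sqrt(np.mean((pred - y_te) ** 2))
print(f"SEP on held-out spectra: {sep:.2f}")
```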
van Veen-Berkx, Elizabeth; Elkhuizen, Sylvia G; Kuijper, Bart; Kazemier, Geert
2016-01-01
Two approaches prevail for reserving operating room (OR) capacity for emergency surgery: (1) dedicated emergency ORs and (2) evenly allocating capacity to all elective ORs, thereby creating a virtual emergency team. Previous studies contradict which approach leads to the best performance in OR utilization. Quasi-experimental controlled time-series design with empirical data from 3 university medical centers. Four different time periods were compared with analysis of variance with contrasts. Performance was measured based on 467,522 surgical cases. After closing the dedicated emergency OR, utilization slightly increased; overtime also increased. This was in contrast to earlier simulated results. The 2 control centers, maintaining a dedicated emergency OR, showed a higher increase in utilization and a decrease in overtime, along with a smaller ratio of case cancellations because of emergency surgery. This study shows that in daily practice a dedicated emergency OR is the preferred approach in performance terms regarding utilization, overtime, and case cancellations. Copyright © 2016 Elsevier Inc. All rights reserved.
DOE Office of Scientific and Technical Information (OSTI.GOV)
CHEN, JOANNA; SIMIRENKO, LISA; TAPASWI, MANJIRI
The DIVA software interfaces a process in which researchers design their DNA with a web-based graphical user interface, submit their designs to a central queue, and a few weeks later receive their sequence-verified clonal constructs. Each researcher independently designs the DNA to be constructed with a web-based BioCAD tool and presses a button to submit their designs to a central queue. Researchers have web-based access to their DNA design queues and can track the progress of their submitted designs as they progress from "evaluation", to "waiting for reagents", to "in progress", to "complete". Researchers access their completed constructs through the central DNA repository. Along the way, all DNA construction success/failure rates are captured in a central database. Once a design has been submitted to the queue, a small number of dedicated staff evaluate the design for feasibility and provide feedback to the responsible researcher if the design is either unreasonable (e.g., encompasses a combinatorial library of a billion constructs) or small design changes could significantly facilitate the downstream implementation process. The dedicated staff then use DNA assembly design automation software to optimize the DNA construction process for the design, leveraging existing parts from the DNA repository where possible and ordering synthetic DNA where necessary. SynTrack software manages the physical locations and availability of the various requisite reagents and process inputs (e.g., DNA templates). Once all requisite process inputs are available, the design progresses from "waiting for reagents" to "in progress" in the design queue. Human-readable and machine-parseable DNA construction protocols output by the DNA assembly design automation software are then executed by the dedicated staff, exploiting lab automation devices wherever possible. Since all the employed DNA construction methods are sequence-agnostic and standardized (utilizing the same enzymatic master mixes and reaction conditions), completely independent DNA construction tasks can be aggregated into the same multi-well plates and pursued in parallel. The resulting sets of cloned constructs can then be screened by high-throughput next-gen sequencing platforms for sequence correctness. A combination of long read-length (e.g., PacBio) and paired-end read platforms (e.g., Illumina) would be exploited depending on the particular task at hand (e.g., PacBio might be sufficient to screen a set of pooled constructs with significant gene divergence). After sequence verification, designs for which at least one correct clone was identified progress to a "complete" status, while designs for which no correct clones were identified progress to a "failure" status. Depending on the failure mode (e.g., no transformants), and on how many prior attempts/variations of the assembly protocol have already been made for a given design, subsequent attempts may be made or the design can progress to a "permanent failure" state. All success and failure rate information is captured during the process, including the stage at which a given clonal construction procedure failed (e.g., no PCR product) and what the exact failure was (e.g., assembly piece 2 missing). This success/failure rate data can be leveraged to refine the DNA assembly design process.
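The design queue described above is essentially a small state machine; a sketch of the status transitions, with the transition set inferred from the text rather than taken from the DIVA code:

```python
from enum import Enum, auto

class Status(Enum):
    EVALUATION = auto()
    WAITING_FOR_REAGENTS = auto()
    IN_PROGRESS = auto()
    COMPLETE = auto()
    FAILURE = auto()
    PERMANENT_FAILURE = auto()

# Allowed transitions, as described in the abstract (an assumption here).
TRANSITIONS = {
    Status.EVALUATION: {Status.WAITING_FOR_REAGENTS, Status.FAILURE},
    Status.WAITING_FOR_REAGENTS: {Status.IN_PROGRESS},
    Status.IN_PROGRESS: {Status.COMPLETE, Status.FAILURE},
    Status.FAILURE: {Status.IN_PROGRESS, Status.PERMANENT_FAILURE},
}

def advance(current, nxt):
    if nxt not in TRANSITIONS.get(current, set()):
        raise ValueError(f"illegal transition {current.name} -> {nxt.name}")
    return nxt

s = Status.EVALUATION
for nxt in (Status.WAITING_FOR_REAGENTS, Status.IN_PROGRESS, Status.COMPLETE):
    s = advance(s, nxt)
print(s.name)  # COMPLETE
```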
Unger, L W; Muckenhuber, M; Riss, S; Argeny, S; Stift, J; Mesteri, I; Stift, A
2018-04-28
As adjuvant chemotherapy in colorectal cancer relies on the identification of lymph node metastases, the pathologist's dedication may have a considerable influence on postoperative survival. The aim of this retrospective study was to assess the impact of the pathologist's dedication on lymph node detection rate and postoperative survival in patients operated on by a single experienced colorectal surgeon within a 5-year period. We assessed 229 patients undergoing total mesorectal excision or complete mesocolic excision by the senior author between 1 January 2009 and 31 December 2013. Pathologists were grouped as 'general pathologist' or 'dedicated pathologist' depending on their dedication/specialization. Dedicated pathologists found statistically significantly more lymph nodes in colorectal specimens than general pathologists [23 (interquartile range 24) vs 14 (interquartile range 11), respectively; P < 0.001]. The detection rate of ≥ 12 lymph nodes per specimen was significantly higher in the dedicated pathologist group [65/74 (87.8%) vs 105/155 (67.7%); P = 0.016]. However, postoperative survival did not differ in the respective subgroups. In the multivariable analysis by Cox proportional hazard model, International Union against Cancer Stage IV was the only factor associated with decreased disease-specific survival (hazard ratio 28.257; 95% CI 3.850-207.386; P = 0.001). In our centre, the pathologist's dedication has an impact on lymph node detection rate but does not influence postoperative disease-specific survival. Colorectal Disease © 2018 The Association of Coloproctology of Great Britain and Ireland.
SiFAP: a Simple Sub-Millisecond Astronomical Photometer
NASA Astrophysics Data System (ADS)
Ambrosino, F.; Meddi, F.; Nesci, R.; Rossi, C.; Sclavi, S.; Bruni, I.
2013-09-01
A new fast photometer based on SiPM technology has been developed at the University of Rome "La Sapienza" starting from 2009. A first prototype was successfully tested by observing the Crab pulsar at the Loiano telescope of the Bologna Observatory. In this paper we illustrate the improvements we applied to our instrument, concerning new cooled commercial sensors, a new version of our custom dedicated electronics, and upgraded timing-control software. Finally, we report the results obtained with this instrument in December 2012 on the Crab pulsar at the Loiano telescope, demonstrating its performance and capabilities.
[Management of pre-analytical nonconformities].
Berkane, Z; Dhondt, J L; Drouillard, I; Flourié, F; Giannoli, J M; Houlbert, C; Surgat, P; Szymanowicz, A
2010-12-01
The main nonconformities are enumerated to facilitate consensual codification. In each case, an action is defined: refusal to perform the examination with request for a new sample, request for information or correction, cancellation of results, or notification of the nurse or physician. Traceability of the curative, corrective and preventive actions is needed. Then, a methodology and indicators are proposed to assess nonconformities and to follow quality improvements. The laboratory information system can be used instead of dedicated software. Tools for the follow-up of nonconformity scores are proposed. Finally, we propose an organization and some tools allowing the management and control of the nonconformities occurring during the pre-examination phase.
Beamforming strategy of ULA and UCA sensor configuration in multistatic passive radar
NASA Astrophysics Data System (ADS)
Hossa, Robert
2009-06-01
A Beamforming Network (BN) concept for Uniform Linear Array (ULA) and Uniform Circular Array (UCA) dipole configurations designed for multistatic passive radar is considered in detail. In the case of the UCA configuration, a computationally efficient beamspace transformation from the UCA to a virtual ULA configuration with omnidirectional coverage is utilized. In effect, the idea of the proposed solution is equivalent to the antenna array factor shaping techniques dedicated to the ULA structure. Finally, exemplary results from computer simulations of the elaborated spatial filtering solutions for the reference and surveillance channels are provided and discussed.
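The ULA array factor shaping that the UCA is mapped onto can be sketched with a conventional (phase-conjugate) beamformer; a minimal example, with element count and spacing chosen arbitrarily:

```python
import numpy as np

def ula_array_factor(n_elem, d_lambda, steer_deg, angles_deg):
    """Array factor of an N-element ULA steered to steer_deg.

    d_lambda: element spacing in wavelengths.
    """
    angles = np.radians(angles_deg)
    steer = np.radians(steer_deg)
    n = np.arange(n_elem)
    # Steering vector toward the desired look direction.
    w = np.exp(1j * 2 * np.pi * d_lambda * n * np.sin(steer))
    # Response of the conventional beamformer w^H a(theta) over all angles.
    a = np.exp(1j * 2 * np.pi * d_lambda * np.outer(n, np.sin(angles)))
    return np.abs(w.conj() @ a) / n_elem

angles = np.linspace(-90, 90, 181)
af = ula_array_factor(n_elem=8, d_lambda=0.5, steer_deg=20, angles_deg=angles)
print(f"peak at {angles[af.argmax()]:.0f} deg")  # ~20 deg, as steered
```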
Helping dentists manage accounts receivable.
Scott, J
2001-01-01
First Pacific Corporation (FPC) has worked with dental practices since 1961, providing personal services that optimize practice performance. In addition to being the premier service provider for administrative tasks in dental offices, it supplies state-of-the-art hardware and accounts receivable management software. FPC designs and teaches practice development strategies, delivers on-site training, and much more. FPC is dedicated to the long-term professional success of dental clients, their staff, and their practices through a unique, integrated package of services. A family-owned business headquartered in Salem, Oregon, FPC employs approximately two hundred staff who serve practices in twenty-two states.
NASA Technical Reports Server (NTRS)
Thomas, V. C.
1986-01-01
A Vibroacoustic Data Base Management Center has been established at the Jet Propulsion Laboratory (JPL). The center utilizes the Vibroacoustic Payload Environment Prediction System (VAPEPS) software package to manage a data base of shuttle and expendable launch vehicle flight and ground test data. Remote terminal access over telephone lines to a dedicated VAPEPS computer system has been established to provide the payload community a convenient means of querying the global VAPEPS data base. This guide describes the functions of the JPL Data Base Management Center and contains instructions for utilizing the resources of the center.
A smartphone photogrammetry method for digitizing prosthetic socket interiors.
Hernandez, Amaia; Lemaire, Edward
2017-04-01
Prosthetic CAD/CAM systems require accurate 3D limb models; however, difficulties arise when working from the person's socket, since current 3D scanners have difficulty scanning socket interiors. While dedicated scanners exist, they are expensive and the cost may be prohibitive for a limited number of scans per year. A low-cost and accessible photogrammetry method for socket interior digitization is proposed, using a smartphone camera and cloud-based photogrammetry services. Fifteen two-dimensional images of the socket's interior are captured using a smartphone camera, and a 3D model is generated using cloud-based software. Linear measurements were compared between sockets and the related 3D models. 3D reconstruction accuracy averaged 2.6 ± 2.0 mm and 0.086 ± 0.078 L, which was less accurate than models obtained by high-quality 3D scanners. However, after processing in prosthetic CAD software, this method provides a viable 3D digital socket reproduction that is accessible and low-cost. Clinical relevance: The described method provides a low-cost and accessible means to digitize a socket interior for use in prosthetic CAD/CAM systems, employing a smartphone camera and cloud-based photogrammetry software.
AlgoRun: a Docker-based packaging system for platform-agnostic implemented algorithms.
Hosny, Abdelrahman; Vera-Licona, Paola; Laubenbacher, Reinhard; Favre, Thibauld
2016-08-01
There is a growing need in bioinformatics for easy-to-use software implementations of algorithms that are usable across platforms. At the same time, reproducibility of computational results is critical and often a challenge due to source code changes over time and dependencies. The approach introduced in this paper addresses both of these needs with AlgoRun, a dedicated packaging system for implemented algorithms, using Docker technology. Implemented algorithms, packaged with AlgoRun, can be executed through a user-friendly interface directly from a web browser or via a standardized RESTful web API to allow easy integration into more complex workflows. The packaged algorithm includes the entire software execution environment, thereby eliminating the common problem of software dependencies and the irreproducibility of computations over time. AlgoRun-packaged algorithms can be published on http://algorun.org, a centralized searchable directory to find existing AlgoRun-packaged algorithms. AlgoRun is available at http://algorun.org and the source code under GPL license is available at https://github.com/algorun. Contact: laubenbacher@uchc.edu. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
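Invoking an AlgoRun-packaged algorithm over its RESTful API might then look like the following sketch; the port, route and payload field are hypothetical, so the container's documentation should be consulted for the real interface:

```python
import requests

# Hypothetical endpoint of a locally running AlgoRun container.
URL = "http://localhost:8765/v1/run"

# Hypothetical payload field carrying the algorithm's input data.
payload = {"input": ">seq1\nACGTACGT\n"}

resp = requests.post(URL, data=payload, timeout=60)
resp.raise_for_status()
print(resp.text)  # algorithm output; reproducible across hosts by design
```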
Casimage project: a digital teaching files authoring environment.
Rosset, Antoine; Muller, Henning; Martins, Martina; Dfouni, Natalia; Vallée, Jean-Paul; Ratib, Osman
2004-04-01
The goal of the Casimage project is to offer an authoring and editing environment integrated with the Picture Archiving and Communication Systems (PACS) for creating image-based electronic teaching files. This software is based on a client/server architecture allowing remote access of users to a central database. This authoring environment allows radiologists to create reference databases and collection of digital images for teaching and research directly from clinical cases being reviewed on PACS diagnostic workstations. The environment includes all tools to create teaching files, including textual description, annotations, and image manipulation. The software also allows users to generate stand-alone CD-ROMs and web-based teaching files to easily share their collections. The system includes a web server compatible with the Medical Imaging Resource Center standard (MIRC, http://mirc.rsna.org) to easily integrate collections in the RSNA web network dedicated to teaching files. This software could be installed on any PACS workstation to allow users to add new cases at any time and anywhere during clinical operations. Several images collections were created with this tool, including thoracic imaging that was subsequently made available on a CD-Rom and on our web site and through the MIRC network for public access.
Software development for a gamma-ray burst rapid-response observatory in the US Virgin Islands.
NASA Astrophysics Data System (ADS)
Davis, K. A.; Giblin, T. W.; Neff, J. E.; Hakkila, J.; Hartmann, D.
2004-12-01
The site is situated near the crest of Crown Mountain on the island of St. Thomas in the US Virgin Islands. The observing site is strategically located 65 W longitude, placing it as the most eastern GRB-dedicated observing site in the western hemisphere. The observatory has a 0.5 m robotic telescope and a Marconi 4240 2048 by 2048 CCD with BVRI filters. The field of view is identical to that of the XRT onboard Swift, 19 by 19 arc minutes. The telescope is operated through the Talon telescope control software. The observatory is notified of a burst trigger through the GRB Coordinates Network (GCN). This GCN notification is received through a socket connection to the control computer on site. A Perl script passes this information to the Talon software, which automatically interrupts concurrent observations and inserts a new GRB observing schedule. Once the observations are made the resulting images are then analyzed in IRAF. A source extraction is necessary to identify known sources and the optical transient. The system is being calibrated for automatic GRB response and is expected to be ready to follow up Swift observations. This work has been supported by NSF and NASA-EPSCoR.
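The trigger path described (a GCN socket notification parsed and handed to the scheduler) can be sketched as follows; the host and port are placeholders, the 160-byte, 40-integer packet layout is the standard GCN socket format, and the site itself used a Perl script rather than Python:

```python
import socket
import struct

HOST, PORT = "0.0.0.0", 5000   # placeholder values

def recv_exact(conn, n):
    """Read exactly n bytes, since recv() may return partial packets."""
    buf = b""
    while len(buf) < n:
        chunk = conn.recv(n - len(buf))
        if not chunk:
            raise ConnectionError("socket closed mid-packet")
        buf += chunk
    return buf

with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
    srv.bind((HOST, PORT))
    srv.listen(1)
    conn, addr = srv.accept()
    with conn:
        data = recv_exact(conn, 160)        # GCN packets are 160 bytes
        words = struct.unpack("!40i", data)  # 40 big-endian 32-bit words
        print("received alert, first words:", words[:4])
        # Here one would write a new observing schedule and interrupt
        # the current observation, as the Talon interface does on site.
```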
NASA Astrophysics Data System (ADS)
Gordov, Evgeny; Okladnikov, Igor; Titov, Alexander
2017-04-01
For comprehensive usage of large geospatial meteorological and climate datasets it is necessary to create a distributed software infrastructure based on the spatial data infrastructure (SDI) approach. Currently, it is generally accepted that the development of client applications as integrated elements of such an infrastructure should be based on modern web and GIS technologies. The paper describes a Web GIS for complex processing and visualization of geospatial datasets (mainly in NetCDF and PostGIS formats) as an integral part of a dedicated Virtual Research Environment for comprehensive study of ongoing and possible future climate change and analysis of its implications, providing full information and computing support for the study of economic, political and social consequences of global climate change at the global and regional levels. The Web GIS consists of two basic software parts:
1. A server-side part comprising PHP applications of the SDI geoportal, realizing the functionality of interaction with the computational core backend and the WMS/WFS/WPS cartographical services, as well as implementing an open API for browser-based client software. Being secondary, this part provides a limited set of procedures accessible via a standard HTTP interface.
2. A front-end part representing the Web GIS client, developed according to the "single page application" approach based on the JavaScript libraries OpenLayers (http://openlayers.org/), ExtJS (https://www.sencha.com/products/extjs) and GeoExt (http://geoext.org/). It implements the application business logic and provides an intuitive user interface similar to that of popular desktop GIS applications such as uDig, QuantumGIS etc. The Boundless/OpenGeo architecture was used as a basis for the Web GIS client development.
According to general INSPIRE requirements for data visualization, the Web GIS provides such standard functionality as data overview, image navigation, scrolling, scaling and graphical overlay, displaying map legends and corresponding metadata information. The specialized Web GIS client contains three basic tiers:
• a tier of NetCDF metadata in JSON format;
• a middleware tier of JavaScript objects implementing methods to work with the NetCDF metadata, the XML file of the selected calculation configuration (XML task), and the WMS/WFS/WPS cartographical services;
• a graphical user interface tier of JavaScript objects realizing the general application business logic.
The Web GIS developed provides launching of computational processing services to support tasks in the area of environmental monitoring, as well as presentation of calculation results in the form of WMS/WFS cartographical layers in raster (PNG, JPG, GeoTIFF), vector (KML, GML, Shape) and binary (NetCDF) formats. It has shown its effectiveness in solving real climate change research problems and disseminating investigation results in cartographical formats. The work is supported by the Russian Science Foundation grant No 16-19-10257.
Design optimization of large-size format edge-lit light guide units
NASA Astrophysics Data System (ADS)
Hastanin, J.; Lenaerts, C.; Fleury-Frenette, K.
2016-04-01
In this paper, we present an original method of dot pattern generation dedicated to large-size format light guide plate (LGP) design optimization; in such applications, e.g. photo-bioreactors, the number of dots greatly exceeds the maximum allowable number of optical objects supported by most common ray-tracing software. In the proposed method, in order to simplify the computational problem, the original optical system is replaced by an equivalent one. Accordingly, the original dot pattern is split into multiple small sections, inside which the dot size variation is less than the typical printing resolution of ink dots. These sections are then replaced by equivalent cells with a continuous diffusing film. After that, we adjust the two-dimensional TIS (Total Integrated Scatter) distribution over the grid of equivalent cells using an iterative optimization procedure. Finally, the obtained optimal TIS distribution is converted into a dot size distribution by applying an appropriate conversion rule. An original semi-empirical equation dedicated to rectangular large-size LGPs is proposed for the initial guess of the TIS distribution. The method significantly reduces the total time needed for dot pattern optimization.
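A toy version of the iterative TIS adjustment can convey the idea: under a simple forward model where each equivalent cell extracts a fraction of the remaining guided flux, cell TIS values are rescaled toward a uniform output target. The model and numbers are invented for illustration and do not reproduce the paper's semi-empirical equation:

```python
import numpy as np

n_cells = 50
tis = np.full(n_cells, 0.02)          # initial TIS guess along the guide

def forward(tis):
    """Toy 1D model: each cell emits (local flux) x (local TIS)."""
    flux, out = 1.0, np.zeros_like(tis)
    for i, t in enumerate(tis):
        out[i] = flux * t
        flux *= (1 - t)               # extracted light leaves the guide
    return out

target = np.full(n_cells, forward(tis).mean())   # uniform output target
for _ in range(200):
    out = forward(tis)
    # Multiplicative update: brighten dim cells, dim bright ones.
    tis = np.clip(tis * target / out, 1e-4, 0.5)

out = forward(tis)
print(f"residual non-uniformity: {out.std() / out.mean():.3%}")
```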
@TOME-2: a new pipeline for comparative modeling of protein-ligand complexes.
Pons, Jean-Luc; Labesse, Gilles
2009-07-01
@TOME 2.0 is a new web pipeline dedicated to protein structure modeling and small-ligand docking based on comparative analyses. @TOME 2.0 allows fold recognition, template selection, structural alignment editing, structure comparisons, 3D-model building and evaluation. These tasks are routinely used in sequence analyses for structure prediction. In our pipeline the necessary software is efficiently interconnected in an original manner to accelerate all the processes. Furthermore, we have also connected comparative docking of small ligands, which is performed using protein-protein superposition. The input is a simple protein sequence in one-letter code with no comment. The resulting 3D model, protein-ligand complexes and structural alignments can be visualized through dedicated Web interfaces or can be downloaded for further studies. These original features will aid in the functional annotation of proteins and the selection of templates for molecular modeling and virtual screening. Several examples are described to highlight some of the new functionalities provided by this pipeline. The server and its documentation are freely available at http://abcis.cbs.cnrs.fr/AT2/
The ID23-2 structural biology microfocus beamline at the ESRF
Flot, David; Mairs, Trevor; Giraud, Thierry; Guijarro, Matias; Lesourd, Marc; Rey, Vicente; van Brussel, Denis; Morawe, Christian; Borel, Christine; Hignette, Olivier; Chavanne, Joel; Nurizzo, Didier; McSweeney, Sean; Mitchell, Edward
2010-01-01
The first phase of the ESRF beamline ID23 to be constructed was ID23-1, a tunable MAD-capable beamline which opened to users in early 2004. The second phase of the beamline to be constructed is ID23-2, a monochromatic microfocus beamline dedicated to macromolecular crystallography experiments. Beamline ID23-2 makes use of well characterized optical elements: a single-bounce silicon (111) monochromator and two mirrors in Kirkpatrick–Baez geometry to focus the X-ray beam. A major design goal of the ID23-2 beamline is to provide a reliable, easy-to-use and routine microfocus beam. ID23-2 started operation in November 2005, as the first beamline dedicated to microfocus macromolecular crystallography. The beamline has taken the standard automated ESRF macromolecular crystallography environment (both hardware and software), allowing users of ID23-2 to be rapidly familiar with the microfocus environment. This paper describes the beamline design, the special considerations taken into account given the microfocus beam, and summarizes the results of the first years of the beamline operation. PMID:20029119
NASA Astrophysics Data System (ADS)
Perez-Hoyos, Santiago; Sanchez-Lavega, A.; Hueso, R.; Rojas, J. F.
2010-10-01
The Aula Espazio Gela is a facility at the School of Technical Engineering of the Universidad del País Vasco (Bilbao, Spain) dedicated to educating undergraduate and graduate students in space science and technology research. It also promotes collaboration between the University and the space industry sector. One of the main elements of this facility is an astronomical observatory oriented to the activities of the students of the Master in Space Science and Technology. The main instrument is a fully robotized 50 cm aperture Dall-Kirkham telescope on an equatorial mount, equipped with different CCD cameras. Here we present some of the projects developed by graduate and undergraduate students in the field of the solar system, in particular studies of planetary atmospheres and work to acquire skills in the software management of planetary images. Acknowledgements: This project is supported by the Dpto. Innovación y Promoción Económica de la Diputación Foral de Bizkaia (Basque Country).
Dedicated to Gifted Education: An Interview with Karen Rogers
ERIC Educational Resources Information Center
Hay, Peta
2017-01-01
Karen B. Rogers has dedicated her career to serving gifted students. In this interview she outlines her major research studies, and explores some of her experiences in the field, with special emphasis on her time in Australia. She discusses her use of the meta-synthesis and meta-analysis methodologies, and outlines key areas of gifted education…
A System for Modelling Cell–Cell Interactions during Plant Morphogenesis
Dupuy, Lionel; Mackenzie, Jonathan; Rudge, Tim; Haseloff, Jim
2008-01-01
Background and aims During the development of multicellular organisms, cells are capable of interacting with each other through a range of biological and physical mechanisms. A description of these networks of cell–cell interactions is essential for an understanding of how cellular activity is co-ordinated in regionalized functional entities such as tissues or organs. The difficulty of experimenting on living tissues, given the multitude of parallel interactions that underlie cellular morphogenesis, has been a major limitation to describing such systems, and computer modelling appears particularly helpful for characterizing the behaviour of multicellular systems. Methods A new generic model of plant cellular morphogenesis is described that expresses interactions amongst cellular entities explicitly: the plant is described as a multi-scale structure, and interactions between distinct entities are established through a topological neighbourhood. Tissues are represented as 2D biphasic systems in which the cell wall responds to turgor pressure through viscous yielding. Key Results This principle was used in the development of the CellModeller software, a generic tool dedicated to the analysis and modelling of plant morphogenesis. The system was applied to three contrasting study cases illustrating genetic, hormonal and mechanical factors involved in plant morphogenesis. Conclusions Plant morphogenesis is fundamentally a cellular process, and the CellModeller software, through its underlying generic model, provides an advanced research tool to analyse coupled physical and biological morphogenetic mechanisms. PMID:17921524
NASA Technical Reports Server (NTRS)
Jaggi, S.
1993-01-01
A study is conducted to investigate the effects and advantages of data compression techniques on multispectral imagery data acquired by NASA's airborne scanners at the Stennis Space Center. The first technique used was vector quantization. In the multispectral imagery context, the vector is defined as an array of pixels from the same location in each channel. The error incurred in substituting the reconstructed images for the original set is compared for different compression ratios. Also, the eigenvalues of the covariance matrix obtained from the reconstructed data set are compared with the eigenvalues of the original set. The effects of varying the size of the vector codebook on the quality of the compression and on subsequent classification are also presented. The output data from the vector quantization algorithm were further compressed by a lossless technique called Difference-mapped Shift-extended Huffman coding. The overall compression for 7 channels of data acquired by the Calibrated Airborne Multispectral Scanner (CAMS) was 195:1 (0.41 bpp) with an RMS error of 15.8 pixels, and 18:1 (0.447 bpp) with an RMS error of 3.6 pixels. The algorithms were implemented in software and interfaced, with the help of dedicated image processing boards, to an 80386 PC-compatible computer. Modules were developed for the tasks of image compression and image analysis, along with supporting software for visual display and interpretation of the compressed/classified images.
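Vector quantization of multispectral pixels reduces each 7-channel vector to a small codebook index; a self-contained sketch with a codebook trained by plain k-means on synthetic data (not the CAMS imagery):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic 7-channel image: each pixel is a 7-vector, as in the CAMS data.
img = rng.normal(size=(64, 64, 7)).astype(np.float32)
vectors = img.reshape(-1, 7)

# Train a small codebook with Lloyd's algorithm (plain k-means).
k = 16
codebook = vectors[rng.choice(len(vectors), k, replace=False)]
for _ in range(10):
    d = ((vectors[:, None, :] - codebook[None, :, :]) ** 2).sum(-1)
    idx = d.argmin(1)
    for j in range(k):
        members = vectors[idx == j]
        if len(members):
            codebook[j] = members.mean(0)

# Final assignment and reconstruction from codebook indices.
idx = ((vectors[:, None, :] - codebook[None, :, :]) ** 2).sum(-1).argmin(1)
recon = codebook[idx].reshape(img.shape)
rms = np.sqrt(((img - recon) ** 2).mean())
# Each pixel stores a 4-bit index instead of 7 full channels.
print(f"RMS error {rms:.3f}, {np.log2(k) / 7:.2f} bits per pixel per channel")
```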
SeedVicious: Analysis of microRNA target and near-target sites.
Marco, Antonio
2018-01-01
Here I describe seedVicious, a versatile microRNA target site prediction software that can be easily fitted into annotation pipelines and run over custom datasets. SeedVicious finds microRNA canonical sites plus other, less efficient, target sites. Among other novel features, seedVicious can compute evolutionary gains/losses of target sites using maximum parsimony, and also detect near-target sites, which have one nucleotide different from a canonical site. Near-target sites are important to study population variation in microRNA regulation. Some analyses suggest that near-target sites may also be functional sites, although there is no conclusive evidence for that, and they may actually be target alleles segregating in a population. SeedVicious does not aim to outperform but to complement existing microRNA prediction tools. For instance, the precision of TargetScan is almost doubled (from 11% to ~20%) when we filter predictions by the distance between target sites using this program. Interestingly, two adjacent canonical target sites are more likely to be present in bona fide target transcripts than pairs of target sites at slightly longer distances. The software is written in Perl and runs on 64-bit Unix computers (Linux and MacOS X). Users with no computing experience can also run the program in a dedicated web-server by uploading custom data, or browse pre-computed predictions. SeedVicious and its associated web-server and database (SeedBank) are distributed under the GPL/GNU license.
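Canonical and near-target site detection of the kind seedVicious performs reduces, at its core, to matching the reverse complement of the miRNA seed against the transcript with at most one mismatch; a simplified sketch (not seedVicious's actual implementation):

```python
def revcomp(seq):
    """Reverse complement of an RNA sequence."""
    return seq.translate(str.maketrans("ACGU", "UGCA"))[::-1]

def seed_sites(mirna, utr):
    """Find 7mer matches to miRNA positions 2-8 (canonical sites) and
    windows differing by exactly one nucleotide (near-target sites)."""
    seed_match = revcomp(mirna[1:8])  # target-side 7mer
    hits, near = [], []
    for i in range(len(utr) - 6):
        window = utr[i:i + 7]
        mism = sum(a != b for a, b in zip(window, seed_match))
        if mism == 0:
            hits.append(i)
        elif mism == 1:
            near.append(i)
    return hits, near

# Hypothetical let-7-like miRNA and target fragment (RNA alphabet).
mirna = "UGAGGUAGUAGGUUGUAUAGUU"
utr = "AAACUACCUCAAAACUACCUCU"
print(seed_sites(mirna, utr))  # canonical hits at positions 3 and 14
```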
Label-free tissue scanner for colorectal cancer screening
NASA Astrophysics Data System (ADS)
Kandel, Mikhail E.; Sridharan, Shamira; Liang, Jon; Luo, Zelun; Han, Kevin; Macias, Virgilia; Shah, Anish; Patel, Roshan; Tangella, Krishnarao; Kajdacsy-Balla, Andre; Guzman, Grace; Popescu, Gabriel
2017-06-01
The current practice of surgical pathology relies on external contrast agents to reveal tissue architecture, which is then qualitatively examined by a trained pathologist. The diagnosis is based on the comparison with standardized empirical, qualitative assessments of limited objectivity. We propose an approach to pathology based on interferometric imaging of "unstained" biopsies, which provides unique capabilities for quantitative diagnosis and automation. We developed a label-free tissue scanner based on "quantitative phase imaging," which maps out optical path length at each point in the field of view and, thus, yields images that are sensitive to the "nanoscale" tissue architecture. Unlike analysis of stained tissue, which is qualitative in nature and affected by color balance, staining strength and imaging conditions, optical path length measurements are intrinsically quantitative, i.e., images can be compared across different instruments and clinical sites. These critical features allow us to automate the diagnosis process. We paired our interferometric optical system with highly parallelized, dedicated software algorithms for data acquisition, allowing us to image at a throughput comparable to that of commercial tissue scanners while maintaining the nanoscale sensitivity to morphology. Based on the measured phase information, we implemented software tools for autofocusing during imaging, as well as image archiving and data access. To illustrate the potential of our technology for large volume pathology screening, we established an "intrinsic marker" for colorectal disease that detects tissue with dysplasia or colorectal cancer and flags specific areas for further examination, potentially improving the efficiency of existing pathology workflows.
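A minimal sketch of the quantity such a scanner maps: a measured phase image converted to optical path length via OPL = φ·λ/(2π). The phase array here is synthetic, and real instruments additionally require phase unwrapping and background correction, which are omitted.

    # Sketch of the quantity the scanner maps: a phase image converted to
    # optical path length via OPL = phase * wavelength / (2*pi). The phase
    # array is synthetic; unwrapping and background correction are omitted.
    import numpy as np

    wavelength_nm = 550.0                                  # assumed wavelength
    phase = np.random.uniform(0, np.pi, size=(512, 512))   # stand-in phase map (rad)
    opl_nm = phase * wavelength_nm / (2 * np.pi)           # path length in nm
    print(f"mean optical path length: {opl_nm.mean():.1f} nm")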
Mechanical energy assessment of adult with Down syndrome during walking with obstacle avoidance.
Salami, Firooz; Vimercati, Sara Laura; Rigoldi, Chiara; Taebi, Amirtaha; Albertini, Giorgio; Galli, Manuela
2014-08-01
The aim of this study was to analyse the differences between plane walking and stepping over an obstacle in two groups, healthy people and people with Down syndrome, and then to evaluate movement efficiency by comparing their mechanical energy exchanges. Thirty-nine adults participated: 21 people with Down syndrome (age: 21.6 ± 7 years) and 18 healthy people (age: 25.1 ± 2.4 years). The test was performed in two conditions, first plane walking and second walking over an obstacle (10% of the subject's height). Gait data were acquired using quantitative movement analysis with an optoelectronic system (Elite2002, BTS) comprising eight infrared cameras. Mechanical energy exchanges were computed by dedicated software, and the resulting data, including spatiotemporal parameters, mechanical energy parameters and energy recovery over the gait cycle, were analysed with statistical software to identify significant differences. Regarding spatiotemporal parameters, velocity and step length were lower in people with Down syndrome. Mechanical energy parameters, particularly energy recovery, did not differ between healthy people and people with Down syndrome. However, some inter-group differences appeared between plane walking and obstacle avoidance, suggesting that people with Down syndrome probably use their residual abilities in the most efficient way to achieve the main goal of an efficient energy recovery. Copyright © 2014 Elsevier Ltd. All rights reserved.
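A minimal sketch of the conventional energy-recovery index computed from centre-of-mass energy curves (Cavagna-style positive-work bookkeeping). The energy traces are synthetic, and the dedicated software used in the study is not reproduced.

    # Sketch of the conventional energy-recovery index used in gait analysis,
    # computed from centre-of-mass energy curves. Inputs are synthetic.
    import numpy as np

    def positive_work(energy):
        """Sum of the positive increments of an energy time series."""
        return np.sum(np.clip(np.diff(energy), 0.0, None))

    t = np.linspace(0, 1, 200)
    e_kin = 40 + 5 * np.sin(2 * np.pi * 2 * t)              # synthetic kinetic energy
    e_pot = 500 + 5 * np.sin(2 * np.pi * 2 * t + np.pi)     # roughly out of phase
    w_kin, w_pot = positive_work(e_kin), positive_work(e_pot)
    w_ext = positive_work(e_kin + e_pot)
    recovery = 100 * (w_kin + w_pot - w_ext) / (w_kin + w_pot)
    print(f"energy recovery: {recovery:.1f} %")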
Real Time Target Tracking Using Dedicated Vision Hardware
NASA Astrophysics Data System (ADS)
Kambies, Keith; Walsh, Peter
1988-03-01
This paper describes a real-time vision target tracking system developed by Adaptive Automation, Inc. and delivered to NASA's Launch Equipment Test Facility, Kennedy Space Center, Florida. The target tracking system is part of the Robotic Application Development Laboratory (RADL), which was designed to provide NASA with a general-purpose robotic research and development test bed for the integration of robot and sensor systems. One of the first RADL system applications is the closing of a position control loop around a six-axis articulated-arm industrial robot, using a camera and dedicated vision processor as the input sensor, so that the robot can locate and track a moving target. Because the vision system sits inside the loop closure of the robot tracking system, tight throughput and latency constraints are imposed on it that can only be met with specialized hardware and a concurrent approach to the processing algorithms. State-of-the-art VME-based vision boards capable of processing the image at frame rate were used with a real-time, multi-tasking operating system to achieve the required performance. This paper describes the high-speed vision-based tracking task, the system throughput requirements, the dedicated vision hardware architecture, and the implementation design details. Central to the overall philosophy of the complete system was the hierarchical and modular approach applied to all aspects of the system, hardware and software alike, so special emphasis is placed on this topic in the paper.
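A minimal sketch of the kind of inner loop such a tracker closes: locate the target in each frame (here via a simple intensity centroid) and turn the pixel error into a proportional velocity command. This toy controller only stands in for, and does not reproduce, the RADL hardware implementation; the gain and image geometry are assumptions.

    # Toy sketch of a vision-in-the-loop tracking step: find the target by
    # intensity centroid and convert the pixel error into a velocity command.
    # Gain, image size and target are invented for illustration.
    import numpy as np

    def centroid(frame, threshold=0.5):
        """Intensity-weighted centroid of pixels above threshold (row, col)."""
        mask = frame > threshold
        rows, cols = np.nonzero(mask)
        w = frame[mask]
        return np.average(rows, weights=w), np.average(cols, weights=w)

    frame = np.zeros((480, 640))
    frame[200:210, 320:330] = 1.0                   # synthetic bright target
    gain = 0.1                                      # proportional gain (assumed)
    err_r, err_c = np.subtract(centroid(frame), (240, 320))
    cmd = (-gain * err_r, -gain * err_c)            # drive the error toward zero
    print(f"velocity command (rows/s, cols/s): {cmd}")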
Microchip electrophoresis for wine analysis.
Gomez, Federico J V; Silva, M Fernanda
2016-12-01
The present critical review provides a summary of representative articles describing the analysis of wine by microchip electrophoresis. Special emphasis has been given to those compounds able to provide background information to achieve the differentiation of wines according to botanical origin, provenance, vintage and quality, or to assure wine authentication. This review focuses on capillary electrophoresis (CE) microchips dedicated to the analysis of wine, covering all the contributions in this area. The most relevant compounds in wine analysis, such as phenols, organic acids, inorganic species, aldehydes, sugars, alcohols and neuroactive amines, were considered. Moreover, a special section is dedicated to the potential of CE microchips for wine classification. Finally, potential directions for the future are discussed.
NASA Astrophysics Data System (ADS)
Lague, D.
2014-12-01
High Resolution Topographic (HRT) datasets are predominantly stored and analyzed as 2D raster grids of elevations (i.e., Digital Elevation Models). Raster grid processing is common in GIS software and benefits from a large library of fast algorithms dedicated to geometrical analysis, drainage network computation and topographic change measurement. Yet, all instruments or methods currently generating HRT datasets (e.g., ALS, TLS, SFM, stereo satellite imagery) natively output 3D unstructured point clouds that are (i) non-regularly sampled, (ii) incomplete (e.g., submerged parts of river channels are rarely measured), and (iii) include 3D elements (e.g., vegetation, vertical features such as river banks or cliffs) that cannot be accurately described in a DEM. Interpolating the raw point cloud onto a 2D grid generally entails a loss of position accuracy and spatial resolution, as well as more or less controlled interpolation effects. Here I demonstrate how studying earth surface topography and processes directly on native 3D point cloud datasets offers several advantages over raster-based methods: point cloud methods preserve the accuracy of the original data, can better handle the evaluation of uncertainty associated with topographic change measurements, and are more suitable for studying vegetation characteristics and steep features of the landscape. In this presentation, I will illustrate and compare point-cloud-based and raster-based workflows with various examples involving ALS, TLS and SFM for the analysis of bank erosion processes in bedrock and alluvial rivers, rockfall statistics (including rockfall volume estimation directly from point clouds) and the interaction of vegetation/hydraulics and sedimentation in salt marshes. These workflows use two recently published algorithms for point cloud classification (CANUPO) and point cloud comparison (M3C2), now implemented in the open-source software CloudCompare.
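A deliberately simplified sketch of cloud-to-cloud change measurement using closest-point distances with a k-d tree. The published M3C2 algorithm is considerably more elaborate (local normals, projection cylinders, confidence intervals), so this only illustrates the native-3D workflow on synthetic data.

    # Simplified sketch of point-cloud change detection: closest-point
    # distances between two survey epochs using a k-d tree. This is NOT
    # M3C2, which adds local normals, averaging cylinders and uncertainty.
    import numpy as np
    from scipy.spatial import cKDTree

    cloud_t1 = np.random.rand(10000, 3)                 # stand-in survey, epoch 1
    cloud_t2 = cloud_t1 + np.array([0.0, 0.0, 0.01])    # epoch 2, uplifted 1 cm

    dist, _ = cKDTree(cloud_t1).query(cloud_t2)         # nearest-neighbour distance
    print(f"median apparent change: {np.median(dist):.4f}")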
A practical guide to environmental association analysis in landscape genomics.
Rellstab, Christian; Gugerli, Felix; Eckert, Andrew J; Hancock, Angela M; Holderegger, Rolf
2015-09-01
Landscape genomics is an emerging research field that aims to identify the environmental factors that shape adaptive genetic variation and the gene variants that drive local adaptation. Its development has been facilitated by next-generation sequencing, which allows for screening thousands to millions of single nucleotide polymorphisms in many individuals and populations at reasonable costs. In parallel, data sets describing environmental factors have greatly improved and increasingly become publicly accessible. Accordingly, numerous analytical methods for environmental association studies have been developed. Environmental association analysis identifies genetic variants associated with particular environmental factors and has the potential to uncover adaptive patterns that are not discovered by traditional tests for the detection of outlier loci based on population genetic differentiation. We review methods for conducting environmental association analysis including categorical tests, logistic regressions, matrix correlations, general linear models and mixed effects models. We discuss the advantages and disadvantages of different approaches, provide a list of dedicated software packages and their specific properties, and stress the importance of incorporating neutral genetic structure in the analysis. We also touch on additional important aspects such as sampling design, environmental data preparation, pooled and reduced-representation sequencing, candidate-gene approaches, linearity of allele-environment associations and the combination of environmental association analyses with traditional outlier detection tests. We conclude by summarizing expected future directions in the field, such as the extension of statistical approaches, environmental association analysis for ecological gene annotation, and the need for replication and post hoc validation studies. © 2015 John Wiley & Sons Ltd.
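A minimal sketch of one family of methods reviewed here: a logistic regression of allele presence/absence on an environmental variable. The data are simulated, and a real environmental association analysis would also need to correct for neutral population structure, as stressed above.

    # Sketch of a logistic-regression environmental association test for a
    # single SNP. Data are simulated; population-structure correction,
    # which the review stresses, is deliberately omitted here.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(1)
    temperature = rng.normal(size=500)                    # environmental factor
    p = 1 / (1 + np.exp(-(0.8 * temperature - 0.2)))      # built-in association
    allele = rng.binomial(1, p)                           # genotype at one SNP

    X = sm.add_constant(temperature)
    fit = sm.Logit(allele, X).fit(disp=False)
    print(fit.params, fit.pvalues)                        # slope and its p-value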
NASA Technical Reports Server (NTRS)
Becker, D. D.
1980-01-01
The orbiter subsystems and interfacing program elements which interact with the orbiter computer flight software are analyzed. The failure modes identified in the subsystem/element failure mode and effects analysis are examined. Potential interaction with the software is examined through an evaluation of the software requirements. The analysis is restricted to flight software requirements and excludes utility/checkout software. The results of the hardware/software interaction analysis for the forward reaction control system are presented.
Karger, Axel; Stock, Rüdiger; Ziller, Mario; Elschner, Mandy C; Bettin, Barbara; Melzer, Falk; Maier, Thomas; Kostrzewa, Markus; Scholz, Holger C; Neubauer, Heinrich; Tomaso, Herbert
2012-10-10
Burkholderia (B.) pseudomallei and B. mallei are genetically closely related species. B. pseudomallei causes melioidosis in humans and animals, whereas B. mallei is the causative agent of glanders in equines and rarely also in humans. Both agents have been classified by the CDC as priority category B biological agents. Rapid identification is crucial, because both agents are intrinsically resistant to many antibiotics. Matrix-assisted laser desorption/ionisation mass spectrometry (MALDI-TOF MS) has the potential of rapid and reliable identification of pathogens, but is limited by the availability of a database containing validated reference spectra. The aim of this study was to evaluate the use of MALDI-TOF MS for the rapid and reliable identification and differentiation of B. pseudomallei and B. mallei and to build up a reliable reference database for both organisms. A collection of ten B. pseudomallei and seventeen B. mallei strains was used to generate a library of reference spectra. Samples of both species could be identified by MALDI-TOF MS, if a dedicated subset of the reference spectra library was used. In comparison with samples representing B. mallei, higher genetic diversity among B. pseudomallei was reflected in the higher average Euclidean distances between the mass spectra and a broader range of identification score values obtained with commercial software for the identification of microorganisms. The type strain of B. pseudomallei (ATCC 23343) was isolated decades ago and is outstanding in the spectrum-based dendrograms probably due to massive methylations as indicated by two intensive series of mass increments of 14 Da specifically and reproducibly found in the spectra of this strain. Handling of pathogens under BSL 3 conditions is dangerous and cumbersome but can be minimized by inactivation of bacteria with ethanol, subsequent protein extraction under BSL 1 conditions and MALDI-TOF MS analysis being faster than nucleic acid amplification methods. Our spectra demonstrated a higher homogeneity in B. mallei than in B. pseudomallei isolates. As expected for closely related species, the identification process with MALDI Biotyper software (Bruker Daltonik GmbH, Bremen, Germany) requires the careful selection of spectra from reference strains. When a dedicated reference set is used and spectra of high quality are acquired, it is possible to distinguish both species unambiguously. The need for a careful curation of reference spectra databases is stressed.
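A minimal sketch of the spectrum-comparison idea underlying the dendrograms described above: binned mass spectra compared by Euclidean distance and grouped by hierarchical clustering with SciPy. The spectra are simulated, and the commercial Biotyper scoring is not reproduced.

    # Sketch of spectrum comparison: binned mass spectra compared by
    # Euclidean distance and clustered hierarchically. Spectra are
    # simulated; the Biotyper identification score is not reproduced.
    import numpy as np
    from scipy.cluster.hierarchy import linkage, dendrogram
    from scipy.spatial.distance import pdist

    rng = np.random.default_rng(0)
    n_strains, n_bins = 6, 1000
    spectra = rng.poisson(5, size=(n_strains, n_bins)).astype(float)
    spectra[:3] += rng.poisson(3, size=n_bins)       # shared peaks: pseudo-group A

    dists = pdist(spectra, metric="euclidean")       # pairwise spectral distances
    tree = linkage(dists, method="average")          # spectrum-based dendrogram
    print(dendrogram(tree, no_plot=True)["ivl"])     # leaf order of the clustering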
DOE Office of Scientific and Technical Information (OSTI.GOV)
M.D. Stine
1996-01-23
The purpose of this analysis is to select the critical characteristics to be verified for steel sets and accessories, and the verification methods to be implemented through a material dedication process, for the procurement and use of commercial-grade structural steel sets and accessories (which have a nuclear safety function) in ground support for the Exploratory Studies Facility (ESF) Topopah Spring (TS) Loop. Alcove ground support and alcove opening framing are not addressed in this analysis. The ESF TS Loop includes the North Ramp, Main Drift, and South Ramp underground openings.
Low-cost real-time infrared scene generation for image projection and signal injection
NASA Astrophysics Data System (ADS)
Buford, James A., Jr.; King, David E.; Bowden, Mark H.
1998-07-01
As cost becomes an increasingly important factor in the development and testing of infrared sensors and flight computer/processors, the need for accurate hardware-in-the-loop (HWIL) simulations is critical. In the past, expensive and complex dedicated scene generation hardware was needed to attain the fidelity necessary for accurate testing. Recent technological advances and innovative applications of established technologies are beginning to allow development of cost-effective replacements for dedicated scene generators. These new scene generators are mainly constructed from commercial-off-the-shelf (COTS) hardware and software components. At the U.S. Army Aviation and Missile Command (AMCOM) Missile Research, Development, and Engineering Center (MRDEC), researchers have developed such a dynamic IR scene generator (IRSG) built around COTS hardware and software. The IRSG is used to provide dynamic inputs to an IR scene projector for in-band seeker testing and for direct signal injection into the seeker or processor electronics. AMCOM MRDEC has developed a second-generation IRSG, namely IRSG2, using the latest Silicon Graphics Incorporated (SGI) Onyx2 with Infinite Reality graphics. As reported in previous papers, the SGI Onyx Reality Engine 2 is the platform of the original IRSG, now referred to as IRSG1. IRSG1 has been in operation and used daily for the past three years on several IR projection and signal injection HWIL programs. With this second-generation IRSG, frame rates have increased from 120 Hz to 400 Hz and intensity resolution from 12 bits to 16 bits. The key features of the IRSGs are real-time missile frame rates and frame sizes, a dynamic missile-to-target(s) viewpoint updated each frame in real time by a six-degree-of-freedom (6DOF) system-under-test (SUT) simulation, multiple dynamic objects (e.g. targets, terrain/background, countermeasures, and atmospheric effects), latency compensation, point-to-extended-source anti-aliased targets, and sensor modeling effects. This paper provides a comparison between the IRSG1 and IRSG2 systems and focuses on the IRSG software, real-time features, and database development tools.
Brandsch, Rainer
2017-10-01
Migration modelling provides reliable migration estimates from food-contact materials (FCM) to food or food simulants based on mass-transfer parameters such as diffusion and partition coefficients related to individual materials. In most cases, mass-transfer parameters are not readily available from the literature and for this reason are estimated with a given uncertainty. Historically, uncertainty was accounted for by introducing upper-limit concepts, which turned out to be of limited applicability due to highly overestimated migration results. Probabilistic migration modelling makes it possible to consider the uncertainty of the mass-transfer parameters as well as other model inputs. With respect to a functional barrier, the most important parameters among others are the diffusion properties of the functional barrier and its thickness. A software tool that accepts distributions as inputs and applies Monte Carlo methods, i.e., random sampling from the input distributions of the relevant parameters (diffusion coefficient and layer thickness), predicts migration results with associated uncertainty and confidence intervals. The capabilities of probabilistic migration modelling are presented through three case studies: (1) sensitivity analysis, (2) functional barrier efficiency and (3) validation by experimental testing. Based on the migration predicted by probabilistic modelling and the related exposure estimates, safety evaluation of new materials in the context of existing or new packaging concepts becomes possible, and associated migration risks and potential safety concerns can be identified at an early stage of packaging development. Furthermore, dedicated material selection exhibiting the required functional-barrier efficiency under application conditions becomes feasible. Validation of the migration risk assessment by probabilistic migration modelling through a minimum of dedicated experimental testing is strongly recommended.
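A minimal sketch of the probabilistic approach: Monte Carlo sampling of the diffusion coefficient and layer thickness propagated through a simple short-time migration estimate for a semi-infinite slab, M = 2·c0·ρ·√(Dt/π). The distributions, units and the short-time approximation are illustrative assumptions; a true functional-barrier model would add multilayer transport.

    # Sketch of probabilistic migration modelling: sample D and layer
    # thickness, propagate through a semi-infinite-slab estimate, and read
    # off percentiles. Distributions and values are assumed, not measured.
    import numpy as np

    rng = np.random.default_rng(42)
    n = 100_000
    log_D = rng.normal(loc=-12.0, scale=0.5, size=n)    # log10 D in cm^2/s (assumed)
    D = 10.0 ** log_D
    L = rng.normal(50e-4, 5e-4, size=n)                 # layer thickness, cm (assumed 50 um)
    c0, rho, t = 100.0, 1.0, 10 * 24 * 3600             # mg/kg, g/cm^3, 10 days

    m = 2 * (c0 / 1000.0) * rho * np.sqrt(D * t / np.pi)   # mg/cm^2 migrated
    m = np.minimum(m, (c0 / 1000.0) * rho * L)             # cannot exceed total content
    lo, med, hi = np.percentile(m, [2.5, 50, 97.5])
    print(f"migration: median {med:.3e}, 95% interval [{lo:.3e}, {hi:.3e}] mg/cm^2")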
Effect of combined digital imaging parameters on endodontic file measurements.
de Oliveira, Matheus Lima; Pinto, Geraldo Camilo de Souza; Ambrosano, Glaucia Maria Bovi; Tosoni, Guilherme Monteiro
2012-10-01
This study assessed the effect of the combination of a dedicated endodontic filter, spatial resolution, and contrast resolution on the determination of endodontic file lengths. Forty extracted single-rooted teeth were x-rayed with K-files (ISO size 10 and 15) in the root canals. Images were acquired using the VistaScan system (Dürr Dental, Beitigheim-Bissingen, Germany) under different combinations of spatial resolution (10 and 25 line pairs per millimeter [lp/mm]) and contrast resolution (8- and 16-bit depths). Subsequently, a dedicated endodontic filter was applied to the 16-bit images, creating 2 additional parameter sets. Six observers measured the length of the endodontic files in the root canals using the software that accompanies the system. The mean values of the actual file lengths and the measurements of the radiographic images were submitted to one-way analysis of variance and the Tukey test at a significance level of 5%. Intraobserver reproducibility was assessed by the intraclass correlation coefficient. All combined imaging parameters showed excellent intraobserver agreement, with intraclass correlation coefficient means higher than 0.98. The imaging parameters of 25 lp/mm and 16 bit associated with the use of the endodontic filter did not differ significantly from the actual file lengths when both file sizes were analyzed together or separately (P > .05). When the size 15 file was evaluated separately, only 8-bit images differed significantly from the actual file lengths (P ≤ .05). The combination of an endodontic filter with high spatial resolution and high contrast resolution is recommended for the determination of file lengths when using storage phosphor plates. Copyright © 2012 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.
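A minimal sketch of the statistical comparison described: measurements under different imaging parameters compared with the actual lengths by one-way ANOVA followed by Tukey's test. The data and group names are invented for illustration, not the study's measurements.

    # Sketch of the one-way ANOVA + Tukey comparison of file-length
    # measurements against actual lengths. All data are simulated and the
    # group labels are invented for illustration.
    import numpy as np
    from scipy.stats import f_oneway
    from statsmodels.stats.multicomp import pairwise_tukeyhsd

    rng = np.random.default_rng(7)
    actual = rng.normal(21.0, 0.5, 40)                  # true lengths (mm)
    low_res = actual + rng.normal(0.3, 0.2, 40)         # biased imaging condition
    filtered = actual + rng.normal(0.0, 0.1, 40)        # filter + 25 lp/mm, 16 bit

    print(f_oneway(actual, low_res, filtered))
    values = np.concatenate([actual, low_res, filtered])
    groups = ["actual"] * 40 + ["8bit_10lp"] * 40 + ["filter_16bit_25lp"] * 40
    print(pairwise_tukeyhsd(values, groups, alpha=0.05))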
A software platform for phase contrast x-ray breast imaging research.
Bliznakova, K; Russo, P; Mettivier, G; Requardt, H; Popov, P; Bravin, A; Buliev, I
2015-06-01
To present and validate a computer-based simulation platform dedicated to phase contrast x-ray breast imaging research. The software platform, developed at the Technical University of Varna on the basis of a previously validated x-ray imaging software simulator, comprises modules for object creation and for x-ray image formation. These modules were updated to take into account the refractive index for phase contrast imaging and to implement the Fresnel-Kirchhoff diffraction theory for the propagating x-ray waves. Projection images are generated in an in-line acquisition geometry. To test and validate the platform, several phantoms differing in their complexity were constructed and imaged at 25 keV and 60 keV at the beamline ID17 of the European Synchrotron Radiation Facility. The software platform was used to design computational phantoms that mimic those used in the experimental study and to generate x-ray images in absorption and phase contrast modes. The visual and quantitative results of the validation process showed an overall good correlation between simulated and experimental images and demonstrate the potential of this platform for research in phase contrast x-ray imaging of the breast. The application of the platform is demonstrated in a feasibility study of phase contrast images of complex inhomogeneous and anthropomorphic breast phantoms, compared with x-ray images generated in absorption mode. The improved visibility of mammographic structures suggests further investigation and optimisation of phase contrast x-ray breast imaging, especially when abnormalities are present. The software platform can also be exploited for educational purposes. Copyright © 2015 Elsevier Ltd. All rights reserved.
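A minimal sketch of the wave-propagation step such a simulator needs: free-space propagation of a complex wavefield by the angular-spectrum method, a common numerical route to Fresnel diffraction. The geometry, sampling and toy phantom are illustrative assumptions, not the platform's actual implementation.

    # Sketch of in-line phase contrast formation: propagate a pure phase
    # object's exit wave with a Fresnel transfer function in Fourier space.
    # Wavelength, pixel pitch, distance and phantom are assumed values.
    import numpy as np

    wavelength = 5e-11          # ~25 keV photons, metres
    pixel = 1e-6                # sampling pitch, metres
    z = 1.0                     # propagation distance, metres
    n = 512

    x = (np.arange(n) - n // 2) * pixel
    X, Y = np.meshgrid(x, x)
    thickness = np.exp(-(X**2 + Y**2) / (50e-6) ** 2)        # toy phantom
    field = np.exp(1j * 2.0 * thickness)                     # pure phase object

    fx = np.fft.fftfreq(n, d=pixel)
    FX, FY = np.meshgrid(fx, fx)
    H = np.exp(-1j * np.pi * wavelength * z * (FX**2 + FY**2))   # Fresnel kernel
    intensity = np.abs(np.fft.ifft2(np.fft.fft2(field) * H)) ** 2
    print("edge-enhancement contrast (std of intensity):", intensity.std())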