Sample records for DiFX software correlator

  1. Haystack Observatory VLBI Correlator

    NASA Technical Reports Server (NTRS)

    Titus, Mike; Cappallo, Roger; Corey, Brian; Dudevoir, Kevin; Niell, Arthur; Whitney, Alan

    2013-01-01

    This report summarizes the activities of the Haystack Correlator during 2012. Highlights include finding a solution to the DiFX InfiniBand timeout problem and other DiFX software development, conducting a DBE comparison test following the First International VLBI Technology Workshop, conducting a Mark IV and DiFX correlator comparison, more broadband delay experiments, more u-VLBI Galactic Center observations, and conversion of RDV session processing to the Mark IV/HOPS path. Non-real-time e-VLBI transfers and engineering support of other correlators continued.

  2. The Bonn Astro/Geo Correlator

    NASA Technical Reports Server (NTRS)

    Bernhart, Simone; Alef, Walter; Bertarini, Alessandra; La Porta, Laura; Muskens, Arno; Rottmann, Helge; Roy, Alan

    2013-01-01

    The Bonn Distributed FX (DiFX) correlator is a software correlator operated jointly by the Max-Planck-Institut für Radioastronomie (MPIfR), the Institut für Geodäsie und Geoinformation der Universität Bonn (IGG), and the Bundesamt für Kartographie und Geodäsie (BKG) in Frankfurt.

  3. IAA Correlator Center

    NASA Technical Reports Server (NTRS)

    Surkis, Igor; Ken, Voitsekh; Melnikov, Alexey; Mishin, Vladimir; Sokolova, Nadezda; Shantyr, Violet; Zimovsky, Vladimir

    2013-01-01

    The activities of the six-station IAA RAS correlator include regular processing of national geodetic VLBI programs Ru-E, Ru-U, and Ru-F. The Ru-U sessions have been transferred in e-VLBI mode and correlated in the IAA Correlator Center automatically since 2011. The DiFX software correlator is used at the IAA in some astrophysical experiments.

  4. Implementation and Testing of VLBI Software Correlation at the USNO

    NASA Technical Reports Server (NTRS)

    Fey, Alan; Ojha, Roopesh; Boboltz, Dave; Geiger, Nicole; Kingham, Kerry; Hall, David; Gaume, Ralph; Johnston, Ken

    2010-01-01

    The Washington Correlator (WACO) at the U.S. Naval Observatory (USNO) is a dedicated VLBI processor based on special-purpose hardware of ASIC design. The WACO is currently over 10 years old and is nearing the end of its expected lifetime. Plans for implementation and testing of software correlation at the USNO are currently being considered. The VLBI correlation process is, by its very nature, well suited to a parallelized computing environment. Commercial off-the-shelf computer hardware has advanced in processing power to the point where software correlation is now both economically and technologically feasible. The advantages of software correlation are manifold and include flexibility, scalability, and easy adaptability to changing environments and requirements. We discuss our experience with and plans for use of software correlation at USNO with emphasis on the use of the DiFX software correlator.
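
    The parallelism referred to above is easy to see in miniature: in an FX correlator, each segment of data on each baseline is Fourier transformed and cross-multiplied independently, so segments can be farmed out to arbitrarily many cores. A minimal single-baseline NumPy sketch (an illustration only, not the DiFX implementation; all names are hypothetical):

        import numpy as np

        def fx_cross_spectrum(x, y, nchan=256):
            # Average cross-spectrum of two station sample streams (one
            # baseline).  Each length-2*nchan segment is FFT'd ("F") and
            # cross-multiplied ("X"); segments are independent, which is
            # what makes the task parallelize so well.
            nseg = min(len(x), len(y)) // (2 * nchan)
            acc = np.zeros(nchan, dtype=complex)
            for k in range(nseg):                      # trivially parallel loop
                sl = slice(k * 2 * nchan, (k + 1) * 2 * nchan)
                X = np.fft.rfft(x[sl])[:nchan]
                Y = np.fft.rfft(y[sl])[:nchan]
                acc += X * np.conj(Y)                  # cross-power accumulation
            return acc / nseg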

  5. IVS Technology Coordinator Report

    NASA Technical Reports Server (NTRS)

    Whitney, Alan

    2013-01-01

    This report of the Technology Coordinator includes the following: 1) continued work to implement the new VLBI2010 system, 2) the 1st International VLBI Technology Workshop, 3) a VLBI Digital-Backend Intercomparison Workshop, 4) DiFX software correlator development for geodetic VLBI, 5) a review of progress towards global VLBI standards, and 6) a welcome to new IVS Technology Coordinator Bill Petrachenko.

  6. Continuous integration and quality control for scientific software

    NASA Astrophysics Data System (ADS)

    Neidhardt, A.; Ettl, M.; Brisken, W.; Dassing, R.

    2013-08-01

    Modern software has to be stable, portable, fast, and reliable. This is becoming more and more important for scientific software as well. But it requires a sophisticated way to inspect, check, and evaluate the quality of source code with a suitable, automated infrastructure. A centralized server with a software repository and a version control system is one essential part, used to manage the code base and to control the different development versions. While each project can be compiled separately, the whole code base can also be compiled with one central “Makefile”. This is used to create automated, nightly builds. Additionally, all sources are inspected automatically with static code analysis and inspection tools, which check for well-known error situations, memory and resource leaks, performance issues, or style issues. In combination with an automatic documentation generator, it is possible to create the developer documentation directly from the code and the inline comments. All reports and generated information are presented as HTML pages on a web server. Because this environment increased the stability and quality of the software of the Geodetic Observatory Wettzell tremendously, it is now also available to scientific communities. One regular user is already the developer group of the DiFX software correlator project.
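
    As a rough picture of how such a nightly pipeline can be driven, the sketch below chains a central Makefile build, a static-analysis pass, and documentation generation, collecting each step's log for the report server. It is a minimal illustration under stated assumptions (cppcheck and doxygen as the analysis and documentation tools; all paths and report names are hypothetical), not the Wettzell system itself:

        #!/usr/bin/env python3
        # Minimal nightly-build driver sketch (hypothetical paths and tools).
        import subprocess, datetime, pathlib

        def run(cmd, logfile):
            # Capture stdout+stderr of each step into a log for the report page.
            with open(logfile, "w") as log:
                return subprocess.run(cmd, stdout=log,
                                      stderr=subprocess.STDOUT).returncode

        stamp = datetime.date.today().isoformat()
        out = pathlib.Path("reports") / stamp
        out.mkdir(parents=True, exist_ok=True)

        status = {
            "build":    run(["make", "-k", "all"], out / "build.log"),
            "analysis": run(["cppcheck", "--enable=all", "src/"], out / "cppcheck.log"),
            "docs":     run(["doxygen", "Doxyfile"], out / "doxygen.log"),
        }
        print(status)   # non-zero exit codes flag the nightly report for attention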

  7. Single baseline GLONASS observations with VLBI: data processing and first results

    NASA Astrophysics Data System (ADS)

    Tornatore, V.; Haas, R.; Duev, D.; Pogrebenko, S.; Casey, S.; Molera Calvés, G.; Keimpema, A.

    2011-07-01

    Several tests to observe signals transmitted by GLONASS (GLObal NAvigation Satellite System) satellites have been performed using the geodetic VLBI (Very Long Baseline Interferometry) technique. The radio telescopes involved in these experiments were Medicina (Italy) and Onsala (Sweden), both equipped with L-band receivers. Observations at the stations were performed using the standard Mark4 VLBI data acquisition rack and Mark5A disk-based recorders. The goals of the observations were to develop and test the scheduling, signal acquisition, and processing routines needed to verify the full tracking pipeline, in anticipation of cross-correlating the recorded data on the Onsala-Medicina baseline. The natural radio source 3C286 was used as a calibrator before the start of the satellite observation sessions. Delay models, including the tropospheric and ionospheric corrections, which are consistent for both far- and near-field sources, are under development. Correlation of the calibrator signal has been performed using the DiFX software, while the satellite signals have been processed using the narrow-band approach with the Metsähovi software and analysed with a near-field delay model. Delay models for both the calibrator signals and the satellite signals, using the same geometrical, tropospheric, and ionospheric models, are under investigation to make a correlation of the satellite signals possible.
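
    The far-field/near-field distinction mentioned above is, at its geometric core, the difference between a plane-wave delay and a difference of ranges. A simplified vacuum-geometry sketch (hypothetical helper names; it ignores the tropospheric, ionospheric, and relativistic terms that the models under development must handle, and sign conventions vary between correlators):

        import numpy as np

        C = 299_792_458.0  # speed of light, m/s

        def near_field_delay(r_src, r_1, r_2):
            # Finite-distance source (e.g. a GLONASS satellite): the delay is
            # the difference of the two station-to-source ranges, in seconds.
            return (np.linalg.norm(r_src - r_2) - np.linalg.norm(r_src - r_1)) / C

        def far_field_delay(s_hat, r_1, r_2):
            # Plane-wave limit for a distant natural source with unit
            # direction vector s_hat: projection of the baseline onto s_hat.
            return -np.dot(s_hat, r_2 - r_1) / C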

  8. VGOS Operations and Geodetic Results

    NASA Astrophysics Data System (ADS)

    Niell, Arthur E.; Beaudoin, Christopher J.; Bolotin, Sergei; Cappallo, Roger J.; Corey, Brian E.; Gipson, John; Gordon, David; McWhirter, Russell; Ruszczyk, Chester A.; SooHoo, Jason

    2014-12-01

    Over the past two years the first VGOS geodetic results were obtained using the GGAO12M and Westford broadband systems that have been developed under NASA sponsorship and funding. These observations demonstrated full broadband operation, from data acquisition through correlation, delay extraction, and baseline estimation. The May 2013 24-hour session proceeded almost without human intervention in anticipation of the goal of unattended operation. A recent test observation successfully demonstrated the use of what is expected to be the operational version of the RDBE digital back end and the Mark 6 system on which the outputs of four RDBEs, each processing one RF band, were recorded on a single module at eight gigabits per second. The complex-sample VDIF data from GGAO12M and Westford were cross-correlated on the Haystack DiFX software correlator, and the instrumental delay was calculated from all of the phase calibration tones in each channel. A minimum redundancy frequency sequence (1, 2, 4, 6, 9, 13, 14, 15) was utilized to minimize the first sidelobes of the multiband delay resolution function.
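
    The instrumental delay extracted from the phase-calibration tones is, in essence, the slope of tone phase versus tone frequency. A simplified sketch of that extraction (illustrative only, not the Haystack processing; real pcal extraction must handle phase ambiguities, per-channel offsets, and sign conventions):

        import numpy as np

        def pcal_delay(tone_freqs_hz, tone_phases_rad):
            # For a pure delay, phi(f) = -2*pi*f*tau, so the delay is the
            # negated slope of unwrapped phase vs. frequency over 2*pi.
            phases = np.unwrap(tone_phases_rad)
            slope, _intercept = np.polyfit(tone_freqs_hz, phases, 1)
            return -slope / (2 * np.pi)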

  9. V-FASTR: THE VLBA FAST RADIO TRANSIENTS EXPERIMENT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wayth, Randall B.; Tingay, Steven J.; Brisken, Walter F.

    2011-07-10

    Recent discoveries of dispersed, non-periodic impulsive radio signals with single-dish radio telescopes have sparked significant interest in exploring the relatively uncharted space of fast transient radio signals. Here we describe V-FASTR, an experiment to perform a blind search for fast transient radio signals using the Very Long Baseline Array (VLBA). The experiment runs entirely in a commensal mode, alongside normal VLBA observations and operations. It is made possible by the features and flexibility of the DiFX software correlator that is used to process VLBA data. Using the VLBA for this type of experiment offers significant advantages over single-dish experiments, including a larger field of view, the ability to easily distinguish local radio-frequency interference from real signals, and the possibility to localize detected events on the sky to milliarcsecond accuracy. We describe our software pipeline, which accepts short-integration (∼ms) spectrometer data from each antenna in real time during correlation and performs an incoherent dedispersion separately for each antenna, over a range of trial dispersion measures. The dedispersed data are processed by a sophisticated detector and candidate events are recorded. At the end of the correlation, small snippets of the raw data at the time of the events are stored for further analysis. We present the results of our event detection pipeline from some test observations of the pulsars B0329+54 and B0531+21 (the Crab pulsar).
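
    The incoherent dedispersion step described above is conceptually simple: each frequency channel is shifted by the cold-plasma dispersion delay for a trial dispersion measure (DM), and the channels are summed. A compact NumPy sketch of that step alone (an illustration, not the V-FASTR pipeline; function and variable names are hypothetical):

        import numpy as np

        def dedisperse(dynspec, freqs_ghz, dm, dt_s):
            # dynspec: (nchan, ntime) dynamic spectrum; freqs_ghz: channel
            # frequencies.  Dispersion delay relative to the highest channel:
            # t(f) = 4.149e-3 s * DM * f_GHz**-2  (DM in pc cm^-3).
            f_ref = freqs_ghz.max()
            delays = 4.149e-3 * dm * (freqs_ghz**-2 - f_ref**-2)   # seconds
            shifts = np.round(delays / dt_s).astype(int)           # samples
            out = np.zeros(dynspec.shape[1])
            for ch, s in enumerate(shifts):
                out += np.roll(dynspec[ch], -s)    # align channel to reference
            return out                             # repeated over trial DMs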

  10. VLITE Surveys the Sky: A 340 MHz Companion to the VLA Sky Survey (VLASS)

    NASA Astrophysics Data System (ADS)

    Peters, Wendy; Clarke, Tracy; Brisken, Walter; Cotton, William; Richards, Emily E.; Giacintucci, Simona; Kassim, Namir

    2018-01-01

    The VLA Low Band Ionosphere and Transient Experiment (VLITE) is a commensal observing system on the Karl G. Jansky Very Large Array (VLA) which was developed by the Naval Research Laboratory and NRAO. A 64 MHz sub-band from the prime-focus 240-470 MHz dipoles is correlated during nearly all regular VLA observations. VLITE uses dedicated samplers and fibers, as well as a custom-designed, real-time DiFX software correlator, and requires no additional resources from the VLA system running the primary science program. The experiment has been operating since November 2014 with 10 antennas; a recent expansion in summer 2017 increased that number to 16 and more than doubled the number of baselines. The VLA Sky Survey (VLASS) is an ongoing survey of the entire sky visible to the VLA at a frequency of 2-4 GHz. The observations are made using an "on-the-fly" (OTF) continuous RA scanning technique which fills in the sky by observing along rows of constant declination. VLITE breaks the data into 2-second integrations and correlates these at a central position every 1.5 degrees. All data for each correlator position are imaged separately, corrected and weighted by an appropriately elongated primary beam model, and then combined in the image plane to create a mosaic of the sky. A catalog of the sources is extracted to provide a 340 MHz sky model. We present preliminary images and catalogs from the 2017 VLASS observations, which began in early September 2017 and continued on a nearly daily basis throughout the fall. In addition to providing a unique sky model at 340 MHz, these data complement VLASS by providing spectral indices for all cataloged sources.

  11. Towards Cloud Processing of GGOS Big Data

    NASA Astrophysics Data System (ADS)

    Weston, Stuart; Kim, Bumjun; Litchfield, Alan; Gulyaev, Sergei; Hall, Dylan; Chorao, Carlos; Ruthven, Andrew; Davies, Glyn; Lagos, Bruno; Christie, Don

    2017-04-01

    We report on our initial steps towards the development of a cloud-like correlation infrastructure for geodetic Very Long Baseline Interferometry (VLBI), whose raw data are of the order of 10-100 TB (big data). Data are generated by multiple VLBI radio telescopes and are then used for geodetic, geophysical, and astrometric research and operational activities through the International VLBI Service (IVS), as well as for corrections of GPS satellite orbits. Currently, IVS data are correlated at several international correlators (Correlation Centres), which receive data from individual radio telescope stations either on hard drives via regular mail service or via fibre using e-transfer mode. The latter is strongly limited by the connectivity of existing correlation centres, which creates bottlenecks and slows down the turnaround of the data. This becomes critical in many applications; for example, it currently takes 1-2 weeks to generate the dUT1 parameter for corrections of GNSS orbits, while a delay of less than 1-2 days is desirable. We started with a blade server at the AUT campus to emulate a cloud server using virtual machines (VMware). The New Zealand Data Head node is connected to the high-speed (100 Gbps) network ring circuit courtesy of the Research and Education Advanced Network New Zealand (REANNZ), with the additional nodes at remote physical sites connected via 10 Gbps fibre. We use real Australian Long Baseline Array (LBA) observational data from 6 radio telescopes in Australia, South Africa, and New Zealand (15 baselines), 1.5 hours in duration and amounting to 8 TB, to emulate data transfer from remote locations and to provide a meaningful benchmark dataset for correlation. Data were successfully transferred using bespoke UDT network transfer tools and correlated with a speed-up factor of 0.8 using the DiFX software correlator. In partnership with the New Zealand office of Catalyst IT Ltd we have moved this environment into the Catalyst Cloud and report on the first correlation of a VLBI dataset in a true cloud environment.

  12. Update on the Commensal VLA Low-band Ionospheric and Transient Experiment (VLITE)

    NASA Astrophysics Data System (ADS)

    Kassim, Namir E.; Clarke, Tracy E.; Ray, Paul S.; Polisensky, Emil; Peters, Wendy M.; Giacintucci, Simona; Helmboldt, Joseph F.; Hyman, Scott D.; Brisken, Walter; Hicks, Brian; Deneva, Julia S.

    2017-01-01

    The JVLA Low-band Ionospheric and Transient Experiment (VLITE) is a commensal observing system on the NRAO JVLA. The separate optical path of the prime-focus sub-GHz dipole feeds and the Cassegrain-focus GHz feeds provided an opportunity to expand the simultaneous frequency operation of the JVLA through joint observations across both systems. The low-band receivers on 10 JVLA antennas are outfitted with dedicated samplers and use spare fibers to transport the 320-384 MHz band to the VLITE correlator. The initial phase of VLITE uses a custom-designed real-time DiFX software correlator to produce autocorrelations, as well as parallel and cross-hand cross-correlations from the linear dipole feeds. NRL and NRAO have worked together to explore the scientific potential of the commensal low-frequency system for ionospheric remote sensing, astrophysics, and transients. VLITE operates at nearly 70% wall time, with roughly 6200 hours of JVLA time recorded each year. VLITE data are used in real time for ionospheric research and are transferred daily to NRL for processing in the astrophysics and transient pipelines. These pipelines provide automated radio frequency interference excision, calibration, imaging, and self-calibration of data. We will review early scientific results from VLITE across all three science focus areas, including the ionosphere, slow (> 1 sec) transients, and astrophysics. We also discuss the future of the project, which includes its planned expansion to eVLITE with the addition of more antennas, and a parallel capability to search for fast (< 1 sec), dispersed transients, such as Fast Radio Bursts and Rotating Radio Transients. We will also present early results of commissioning tests to utilize VLITE data products to complement NRAO’s 3 GHz VLA Sky Survey (VLASS). Revised pipelines are under development for operation during the on-the-fly operation mode of the sky survey.

  13. VLBI2010 and the Westford Station - The Path Forward

    NASA Astrophysics Data System (ADS)

    Beaudoin, C.; Wilson, K.; Whittier, B.; Whitney, A.; McWhirter, R.; Smythe, J.; SooHoo, D.; Ruszczyk, C.; Rogers, A.; Poirier, M.; Niell, A.; Corey, B.; Cappallo, R.; Byford, J.; Bolis, P.

    2012-12-01

    For the past three years the role of the Westford antenna in geodetic VLBI has been two-fold. Over this time its primary purpose has been to participate in standard S/X-band geodetic VLBI observations. In its secondary role the Westford antenna has been converted into a research instrument, facilitating the development of the broadband geodetic VLBI observing technique. As a research instrument, the Westford antenna incorporates a commercially-available ETS-Lindgren 3164 quadridge antenna as a radio telescope feed. The system also uses the VLBI2010 data acquisition system that incorporates digital backends (DBEs) implementing a polyphase filter bank processor. The process of converting the station from its operational mode to a research instrument often introduces subtle anomalies that must be diagnosed prior to broadband observing. Furthermore, this bifurcation of the station's role is not in line with the goals of the VLBI2010 specifications. Until recently it has not been possible for the Westford station to serve as both an operational and a research instrument without conversion, for two reasons: poor sensitivity and incompatibility of backend baseband filter bandwidths. The poor sensitivity of the Westford antenna as a broadband radio telescope is in large part due to the commercial broadband feed which was readily available when the proof-of-concept VLBI2010 observations were initiated. However, with the development of the quad-ridged flared horn (QRFH) by the California Institute of Technology and with the improvements in the DiFX software correlator, the necessary components are now available to upgrade the Westford station to full-broadband capability while adhering to the mandate to maintain backwards compatibility with the legacy S/X systems. In this paper we will present the path forward for upgrading the Westford site to full-broadband capability while maintaining S/X compatibility.

  14. Development problem analysis of correlation leak detector’s software

    NASA Astrophysics Data System (ADS)

    Faerman, V. A.; Avramchuk, V. S.; Marukyan, V. M.

    2018-05-01

    In this article, the practical application and structure of software for correlation leak detectors is studied and the task of designing it is analyzed. In the first part of the paper, the expediency of developing correlation leak detectors for improving the operating efficiency of public utilities is shown. The functional structure of correlation leak detectors is analyzed and the tasks of their software are defined. In the second part, several steps in the development of the software package – requirements gathering, definition of the program structure, and creation of the software concept – are examined in the context of experience with a hardware-software prototype of a correlation leak detector.

  15. Real-time software receiver

    NASA Technical Reports Server (NTRS)

    Psiaki, Mark L. (Inventor); Kintner, Jr., Paul M. (Inventor); Ledvina, Brent M. (Inventor); Powell, Steven P. (Inventor)

    2007-01-01

    A real-time software receiver that executes on a general purpose processor. The software receiver includes data acquisition and correlator modules that perform, in place of hardware correlation, baseband mixing and PRN code correlation using bit-wise parallelism.
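
    The bit-wise parallelism mentioned here is the trick of packing one-bit sign samples into machine words so that mixing and PRN code correlation reduce to an XOR followed by a population count, processing one word's worth of samples per instruction. A toy sketch of the idea (illustrative only, not the patented receiver's code; Python's int.bit_count() requires Python 3.10+):

        def bitwise_correlate(signal_bits, code_bits, nbits):
            # One sample per bit, sign-only (+1/-1) quantization: the
            # correlation is (#agreeing bits - #disagreeing bits), computed
            # over all packed samples with a single XOR plus popcount.
            disagree = (signal_bits ^ code_bits).bit_count()
            return nbits - 2 * disagree

        # 8 samples packed LSB-first; 7 bits agree, 1 disagrees -> 6
        sig, code = 0b10110010, 0b10110011
        print(bitwise_correlate(sig, code, 8))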

  16. Real-time software receiver

    NASA Technical Reports Server (NTRS)

    Psiaki, Mark L. (Inventor); Ledvina, Brent M. (Inventor); Powell, Steven P. (Inventor); Kintner, Jr., Paul M. (Inventor)

    2006-01-01

    A real-time software receiver that executes on a general purpose processor. The software receiver includes data acquisition and correlator modules that perform, in place of hardware correlation, baseband mixing and PRN code correlation using bit-wise parallelism.

  17. The Software Correlator of the Chinese VLBI Network

    NASA Technical Reports Server (NTRS)

    Zheng, Weimin; Quan, Ying; Shu, Fengchun; Chen, Zhong; Chen, Shanshan; Wang, Weihua; Wang, Guangli

    2010-01-01

    The software correlator of the Chinese VLBI Network (CVN) has played an irreplaceable role in CVN routine data processing, e.g., in the Chinese lunar exploration project. This correlator will be upgraded to process geodetic and astronomical observation data. In the future, with several new stations joining the network, CVN will carry out crustal movement observations, quick UT1 measurements, astrophysical observations, and deep space exploration activities. For geodetic or astronomical observations, we need a wide-band 10-station correlator. For spacecraft tracking, a real-time and highly reliable correlator is essential. To meet the scientific and navigation requirements of CVN, two parallel software correlators for multiprocessor environments are under development. A high-speed, 10-station prototype correlator using a mixed Pthreads and MPI (Message Passing Interface) parallel algorithm on a computer cluster platform is being developed. Another real-time software correlator for spacecraft tracking adopts thread-parallel technology and runs on SMP (Symmetric Multiple Processor) servers. Both correlators are characterized by a flexible structure and scalability.

  18. Network-Based Analysis of Software Change Propagation

    PubMed Central

    Wang, Rongcun; Qu, Binbin

    2014-01-01

    Object-oriented software systems frequently evolve to meet new change requirements. Understanding the characteristics of changes aids testers and system designers in improving the quality of software. Identifying important modules becomes a key issue in the process of evolution. In this context, a novel network-based approach is proposed to comprehensively investigate change distributions and the correlation between centrality measures and the scope of change propagation. First, software dependency networks are constructed at the class level. Then, the number of co-changes among classes is mined from software repositories. According to the dependency relationships and the number of co-changes among classes, the scope of change propagation is calculated. Spearman rank correlation is used to analyze the correlation between centrality measures and the scope of change propagation. Three case studies on the Java open-source software projects Findbugs, Hibernate, and Spring are conducted to research the characteristics of change propagation. Experimental results show that (i) the change distribution is very uneven; (ii) PageRank, Degree, and CIRank are significantly correlated with the scope of change propagation. In particular, CIRank shows a higher correlation coefficient, which suggests that it can be a more useful indicator for measuring the scope of change propagation of classes in an object-oriented software system. PMID:24790557
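
    As a small illustration of the kind of analysis described (not the paper's own code: CIRank is the authors' metric and is not reproduced here, so PageRank stands in, and the data are invented), centrality measures from a class-level dependency network can be rank-correlated with a mined change-propagation scope:

        import networkx as nx
        from scipy.stats import spearmanr

        # Toy class-level dependency network; change_scope would in practice
        # be mined from the co-change history in the software repository.
        G = nx.DiGraph([("A", "B"), ("B", "C"), ("C", "A"), ("D", "B")])
        change_scope = {"A": 5, "B": 9, "C": 4, "D": 2}   # hypothetical counts

        pagerank = nx.pagerank(G)
        nodes = sorted(G.nodes)
        rho, p = spearmanr([pagerank[n] for n in nodes],
                           [change_scope[n] for n in nodes])
        print(f"Spearman rho = {rho:.2f} (p = {p:.2f})")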

  19. Network-based analysis of software change propagation.

    PubMed

    Wang, Rongcun; Huang, Rubing; Qu, Binbin

    2014-01-01

    Object-oriented software systems frequently evolve to meet new change requirements. Understanding the characteristics of changes aids testers and system designers in improving the quality of software. Identifying important modules becomes a key issue in the process of evolution. In this context, a novel network-based approach is proposed to comprehensively investigate change distributions and the correlation between centrality measures and the scope of change propagation. First, software dependency networks are constructed at the class level. Then, the number of co-changes among classes is mined from software repositories. According to the dependency relationships and the number of co-changes among classes, the scope of change propagation is calculated. Spearman rank correlation is used to analyze the correlation between centrality measures and the scope of change propagation. Three case studies on the Java open-source software projects Findbugs, Hibernate, and Spring are conducted to research the characteristics of change propagation. Experimental results show that (i) the change distribution is very uneven; (ii) PageRank, Degree, and CIRank are significantly correlated with the scope of change propagation. In particular, CIRank shows a higher correlation coefficient, which suggests that it can be a more useful indicator for measuring the scope of change propagation of classes in an object-oriented software system.

  20. VLBI Correlators in Kashima

    NASA Technical Reports Server (NTRS)

    Sekido, Mamoru; Takefuji, Kazuhiro

    2013-01-01

    Kashima Space Technology Center (KSTC) is making use of two kinds of software correlators, the multi-channel K5/VSSP software correlator and the fast wide-band correlator 'GICO3,' for geodetic and R&D VLBI experiments. An overview of the activities and future plans is given in this paper.

  21. Using Utility Functions to Control a Distributed Storage System

    DTIC Science & Technology

    2008-05-01

    Pinheiro et al. [2007] suggest this is not an accurate assumption. Nicola and Goyal [1990] examined correlated failures across multiversion software... F. and Goyal, A. (1990). Modeling of correlated failures and community error recovery in multiversion software. IEEE Transactions on Software

  22. Implementing the concurrent operation of sub-arrays in the ALMA correlator

    NASA Astrophysics Data System (ADS)

    Amestica, Rodrigo; Perez, Jesus; Lacasse, Richard; Saez, Alejandro

    2016-07-01

    The ALMA correlator processes the digitized signals from 64 individual antennas to produce a grand total of 2016 correlated baselines, with runtime-selectable lag resolution and integration time. The on-line software system can process a maximum of 125M visibilities per second, producing an archiving data rate close to one sixteenth of the former (7.8M visibilities per second, with a network transfer limit of 60 MB/sec). Mechanisms in the correlator hardware design make it possible to split the total number of antennas in the array into smaller subsets, or sub-arrays, such that they can share correlator resources while executing independent observations. The software part of the sub-system is responsible for configuring and scheduling correlator resources in such a way that observations among independent sub-arrays occur simultaneously while internally sharing correlator resources under a cooperative arrangement. Configuration of correlator modes through its CAN-bus interface and periodic geometric delay updates are the most relevant activities to schedule concurrently while observations happen at the same time among a number of sub-arrays. For that to work correctly, the software interface to sub-arrays schedules shared correlator resources sequentially before observations actually start on each sub-array. Start times for specific observations are optimized and reported back to the higher-level observing software. After that initial sequential phase has taken place, simultaneous executions and recording of correlated data across different sub-arrays move forward concurrently, sharing the local network to broadcast results to other software sub-systems. This paper presents an overview of the different hardware and software actors within the correlator sub-system that implement the degree of concurrency and synchronization needed for seamless and simultaneous operation of multiple sub-arrays, limitations stemming from the resource-sharing nature of the correlator, limitations intrinsic to the digital technology available in the correlator hardware, and milestones so far reached by this new ALMA feature.
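
    The figure of 2016 correlated baselines quoted above follows directly from the pair count for 64 antennas; a one-line check:

        n = 64
        print(n * (n - 1) // 2)   # -> 2016 baselines (antenna pairs)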

  23. Process Correlation Analysis Model for Process Improvement Identification

    PubMed Central

    Park, Sooyong

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by a process assessment identifying strengths and weaknesses; based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used throughout the process of software process improvement as the base. CMMI defines a set of process areas involved in software development and what is to be carried out in each process area in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. However, in current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to the significant effort required and the lack of required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data. PMID:24977170

  24. Process correlation analysis model for process improvement identification.

    PubMed

    Choi, Su-jin; Kim, Dae-Kyoo; Park, Sooyong

    2014-01-01

    Software process improvement aims at improving the development process of software systems. It is initiated by a process assessment identifying strengths and weaknesses; based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used throughout the process of software process improvement as the base. CMMI defines a set of process areas involved in software development and what is to be carried out in each process area in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of the software development process. However, in current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to the significant effort required and the lack of required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.

  25. Low-noise correlation measurements based on software-defined-radio receivers and cooled microwave amplifiers.

    PubMed

    Nieminen, Teemu; Lähteenmäki, Pasi; Tan, Zhenbing; Cox, Daniel; Hakonen, Pertti J

    2016-11-01

    We present a microwave correlation measurement system based on two low-cost USB-connected software-defined-radio dongles modified to operate as coherent receivers by using a common local oscillator. Existing software is used to obtain I/Q samples from both dongles simultaneously at a software-tunable frequency. To achieve low noise, we introduce a simple low-noise solution for cryogenic amplification at 600-900 MHz based on a single discrete HEMT with 21 dB of gain and a 7 K noise temperature. In addition, we discuss the quantization effects in a digital correlation measurement and the determination of the optimal integration time by applying Allan deviation analysis.
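
    The Allan-deviation criterion mentioned at the end is a standard way to find the integration time beyond which further averaging stops helping (drifts begin to dominate). A minimal non-overlapping implementation (a sketch only; real analyses often use the overlapping estimator):

        import numpy as np

        def allan_deviation(y, m):
            # Average y in consecutive blocks of m samples, then return
            # sqrt(0.5 * <(ybar[k+1] - ybar[k])**2>).
            n = len(y) // m
            ybar = y[:n * m].reshape(n, m).mean(axis=1)
            return np.sqrt(0.5 * np.mean(np.diff(ybar) ** 2))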

  26. Enhanced Visualization of Subtle Outer Retinal Pathology by En Face Optical Coherence Tomography and Correlation with Multi-Modal Imaging

    PubMed Central

    Chew, Avenell L.; Lamey, Tina; McLaren, Terri; De Roach, John

    2016-01-01

    Purpose: To present en face optical coherence tomography (OCT) images generated by graph-search theory algorithm-based custom software and examine correlation with other imaging modalities. Methods: En face OCT images derived from high-density OCT volumetric scans of 3 healthy subjects and 4 patients using a custom algorithm (graph-search theory) and commercial software (Heidelberg Eye Explorer software, Heidelberg Engineering) were compared and correlated with near-infrared reflectance, fundus autofluorescence, adaptive optics flood-illumination ophthalmoscopy (AO-FIO), and microperimetry. Results: Commercial software was unable to generate accurate en face OCT images in eyes with retinal pigment epithelium (RPE) pathology due to segmentation error at the level of Bruch's membrane (BM). Accurate segmentation of the basal RPE and BM was achieved using the custom software. The en face OCT images from eyes with isolated interdigitation or ellipsoid zone pathology were of similar quality between the custom software and the Heidelberg Eye Explorer software in the absence of any other significant outer retinal pathology. En face OCT images demonstrated angioid streaks, lesions of acute macular neuroretinopathy, hydroxychloroquine toxicity, and Bietti crystalline deposits that correlated with other imaging modalities. Conclusions: The graph-search theory algorithm helps to overcome the limitations of outer retinal segmentation inaccuracies in commercial software. En face OCT images can provide detailed topography of the reflectivity within a specific layer of the retina which correlates with other forms of fundus imaging. Our results highlight the need for standardization of image reflectivity to facilitate quantification of en face OCT images and longitudinal analysis. PMID:27959968

  27. Real-time autocorrelator for fluorescence correlation spectroscopy based on graphical-processor-unit architecture: method, implementation, and comparative studies

    NASA Astrophysics Data System (ADS)

    Laracuente, Nicholas; Grossman, Carl

    2013-03-01

    We developed an algorithm and software to calculate autocorrelation functions from real-time photon-counting data using the fast, parallel capabilities of graphical processor units (GPUs). Recent developments in hardware and software have allowed for general-purpose computing with inexpensive GPU hardware. These devices are better suited to emulating hardware autocorrelators than traditional CPU-based software applications, as they emphasize parallel throughput over sequential speed. Incoming data are binned in a standard multi-tau scheme with configurable points-per-bin size and are mapped into a GPU memory pattern to reduce time-expensive memory access. Applications include dynamic light scattering (DLS) and fluorescence correlation spectroscopy (FCS) experiments. We ran the software on a 64-core graphics PCI card in a 3.2 GHz Intel i5-based computer running Linux. FCS measurements were made on Alexa-546 and Texas Red dyes in a standard buffer (PBS). Software correlations were compared to hardware correlator measurements on the same signals. Supported by HHMI and Swarthmore College.
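
    The multi-tau scheme mentioned above achieves quasi-logarithmic lag spacing by evaluating a few lags at full resolution and then repeatedly halving the time resolution. A serial NumPy sketch of the binning logic (the GPU version parallelizes the products; names and defaults here are hypothetical):

        import numpy as np

        def multi_tau_autocorr(x, p=8, levels=6):
            # Level 0: lags 1..p at full resolution; each further level
            # averages adjacent samples and evaluates lags p/2+1..p in the
            # coarser units, so the lag spacing grows geometrically.
            taus, g = [], []
            dt, level_x = 1, np.asarray(x, dtype=float)
            for level in range(levels):
                lags = range(1, p + 1) if level == 0 else range(p // 2 + 1, p + 1)
                for k in lags:
                    taus.append(k * dt)
                    g.append((level_x[:-k] * level_x[k:]).mean()
                             / level_x.mean() ** 2)        # normalized g2
                n = len(level_x) // 2                       # rebin by 2
                level_x = level_x[:2 * n].reshape(n, 2).mean(axis=1)
                dt *= 2
            return np.array(taus), np.array(g)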

  28. Open source posturography.

    PubMed

    Rey-Martinez, Jorge; Pérez-Fernández, Nicolás

    2016-12-01

    The proposed validation goal of 0.9 for the intra-class correlation coefficient was reached with the results of this study. With the obtained results, we consider the developed software (RombergLab) to be validated balance assessment software. The reliability of this software depends on the technical specifications of the force platform used. Objective: to develop and validate posturography software and share its source code in open source terms. Prospective non-randomized validation study: 20 consecutive adults underwent two balance assessment tests; six-condition posturography was performed using clinically approved software and a force platform, and the same conditions were measured using the newly developed open source software and a low-cost force platform. The intra-class correlation index of the sway area obtained from the center-of-pressure variations in both devices for the six conditions was the main variable used for validation. Excellent concordance between RombergLab and the clinically approved force platform was obtained (intra-class correlation coefficient = 0.94). A Bland and Altman concordance plot was also obtained. The source code used to develop RombergLab was published in open source terms.

  29. Influence of Smartphones and Software on Acoustic Voice Measures

    PubMed Central

    GRILLO, ELIZABETH U.; BROSIOUS, JENNA N.; SORRELL, STACI L.; ANAND, SUPRAJA

    2016-01-01

    This study assessed the within-subject variability of voice measures captured using different recording devices (i.e., smartphones and a head-mounted microphone) and software programs (i.e., Analysis of Dysphonia in Speech and Voice (ADSV), Multi-Dimensional Voice Program (MDVP), and Praat). Correlations between the software programs that calculated the voice measures were also analyzed. Results demonstrated no significant within-subject variability across devices and software and showed that some of the measures were highly correlated across software programs. The study suggests that certain smartphones may be appropriate for recording daily voice measures representing the effects of vocal loading within individuals. In addition, even though different algorithms are used to compute voice measures across software programs, some of the programs and measures share a similar relationship. PMID:28775797

  30. Job satisfaction, job stress and psychosomatic health problems in software professionals in India

    PubMed Central

    Madhura, Sahukar; Subramanya, Pailoor; Balaram, Pradhan

    2014-01-01

    This questionnaire-based study investigates the correlation between job satisfaction, job stress, and psychosomatic health in Indian software professionals. It also examines how yoga-practicing Indian software professionals cope with stress and psychosomatic health problems. The sample consisted of yoga-practicing and non-yoga-practicing Indian software professionals working in India. The findings of this study show that there is a significant correlation among job satisfaction, job stress, and health. In yoga practitioners, job satisfaction is not significantly related to psychosomatic health, whereas in the non-yoga group psychosomatic health symptoms showed a significant relationship with job satisfaction. PMID:25598623

  31. Effectiveness of back-to-back testing

    NASA Technical Reports Server (NTRS)

    Vouk, Mladen A.; Mcallister, David F.; Eckhardt, David E.; Caglayan, Alper; Kelly, John P. J.

    1987-01-01

    Three models of back-to-back testing processes are described. Two models treat the case where there is no intercomponent failure dependence. The third model describes the more realistic case where there is correlation among the failure probabilities of the functionally equivalent components. The theory indicates that back-to-back testing can, under the right conditions, provide a considerable gain in software reliability. The models are used to analyze the data obtained in a fault-tolerant software experiment. It is shown that the expected gain is indeed achieved, and exceeded, provided the intercomponent failure dependence is sufficiently small. However, even with relatively high correlation, the use of several functionally equivalent components coupled with back-to-back testing may provide a considerable reliability gain. The implication of this finding is that multiversion software development is a feasible and cost-effective approach to providing highly reliable software components intended for fault-tolerant software systems, on condition that special attention is directed at early detection and elimination of correlated faults.

  32. Integrating voice evaluation: correlation between acoustic and audio-perceptual measures.

    PubMed

    Vaz Freitas, Susana; Melo Pestana, Pedro; Almeida, Vítor; Ferreira, Aníbal

    2015-05-01

    This article aims to establish correlations between acoustic and audio-perceptual measures using the GRBAS scale with respect to four different voice analysis software programs. Study design: exploratory, transversal. A total of 90 voice records were collected and analyzed with the Dr. Speech (Tiger Electronics, Seattle, WA), Multidimensional Voice Program (Kay Elemetrics, NJ, USA), PRAAT (University of Amsterdam, The Netherlands), and Voice Studio (Seegnal, Oporto, Portugal) software programs. The acoustic measures were correlated with the audio-perceptual parameters of the GRBAS scale as rated by 10 experts. The predictive value of the acoustic measurements for the audio-perceptual parameters exhibited magnitudes ranging from weak (adjusted R² = 0.17) to moderate (adjusted R² = 0.71). The parameter exhibiting the highest correlation magnitude was B (Breathiness), whereas the weakest correlation magnitudes were found for A (Asthenia) and S (Strain). The acoustic measures with the strongest predictive value were local shimmer, harmonics-to-noise ratio, APQ5 shimmer, and PPQ5 jitter, with different magnitudes for each of the studied software programs. Some acoustic measures are identified as significant predictors of GRBAS parameters, but they differ among software programs. B (Breathiness) was the parameter exhibiting the highest correlation magnitude. Copyright © 2015 The Voice Foundation. Published by Elsevier Inc. All rights reserved.

  33. INTERSPECIES CORRELATION ESTIMATION (ICE) FOR ACUTE TOXICITY TO AQUATIC ORGANISMS AND WILDLIFE. II. USER MANUAL AND SOFTWARE

    EPA Science Inventory

    Asfaw, Amha, Mark R. Ellersieck and Foster L. Mayer. 2003. Interspecies Correlation Estimations (ICE) for Acute Toxicity to Aquatic Organisms and Wildlife. II. User Manual and Software. EPA/600/R-03/106. U.S. Environmental Protection Agency, National Health and Environmental Effe...

  34. Software Correlator for Radioastron Mission

    NASA Astrophysics Data System (ADS)

    Likhachev, Sergey F.; Kostenko, Vladimir I.; Girin, Igor A.; Andrianov, Andrey S.; Rudnitskiy, Alexey G.; Zharov, Vladimir E.

    In this paper, we discuss the characteristics and operation of the Astro Space Center (ASC) software FX correlator, an important component of the space-ground interferometer of the Radioastron project. This project performs joint observations of compact radio sources using a 10-m space radio telescope (SRT) together with ground radio telescopes at 92, 18, 6, and 1.3 cm wavelengths. We describe the main features of space-ground VLBI data processing in the Radioastron project using the ASC correlator. The implemented fringe-search procedure provides positive results without significant losses in correlated amplitude. The ASC correlator has computational power sufficient for close to real-time operation. The correlator has a number of processing modes: “Continuum”, “Spectral Line”, “Pulsars”, “Giant Pulses”, “Coherent”. Special attention is paid to the peculiarities of Radioastron space-ground VLBI data processing. The algorithms for time delay and delay rate calculation are also discussed, which are of fundamental importance for data correlation in space-ground interferometers. During five years of successful Radioastron SRT operation, the ASC correlator showed high potential for satisfying the steadily growing needs of current and future ground and space VLBI science. Results of ASC software correlator operation are demonstrated.
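
    Central to any space-ground correlator is applying the computed delay and delay rate to one station's stream before cross-multiplication. The sketch below shows the bare mechanics for a linear delay model (a coarse integer-sample shift plus fringe stopping); it is a schematic illustration, not the ASC algorithm, which must model the moving SRT and relativistic terms:

        import numpy as np

        def apply_delay_model(samples, t, tau0, tau_rate, f_sky_hz, fs_hz):
            # tau(t) = tau0 + tau_rate * t.  Remove the bulk delay with an
            # integer sample shift, then counter-rotate the residual fringe
            # phase 2*pi*f_sky*tau(t) ("fringe stopping").
            tau = tau0 + tau_rate * t                   # seconds, at sample times t
            shift = int(round(tau.mean() * fs_hz))      # coarse part, in samples
            shifted = np.roll(samples, -shift)
            return shifted * np.exp(-2j * np.pi * f_sky_hz * tau)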

  35. Identifying Dyscalculia Symptoms Related to Magnocellular Reasoning Using Smartphones.

    PubMed

    Knudsen, Greger Siem; Babic, Ankica

    2016-01-01

    This paper presents a study that developed a mobile software application for assisting the diagnosis of a learning disability in mathematics, called dyscalculia, and for measuring correlations between dyscalculia symptoms and magnocellular reasoning. Usually, software aids for dyscalculic individuals focus on both assisting diagnosis and teaching the material. The software developed in this study, however, maintains a specific focus on the former, and in the process attempts to capture alleged correlations between dyscalculia symptoms and possible underlying causes of the condition. Classification of symptoms is performed by a k-Nearest Neighbor algorithm operating on five parameters that evaluate the user's skills, returning the calculated performance in each category as well as the correlation strength between detected symptoms and magnocellular reasoning abilities. Expert evaluations have found the application to be appropriate and productive for its intended purpose, showing that mobile software is a suitable and valuable tool for assisting dyscalculia diagnosis and identifying root causes of developing the condition.
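
    For concreteness, the classification step described (five skill parameters fed to a k-Nearest Neighbor classifier) can be sketched as follows; the feature values, labels, and choice of k are invented for illustration and are not the study's data:

        import numpy as np
        from sklearn.neighbors import KNeighborsClassifier

        # Hypothetical training data: five per-category skill scores per user,
        # labelled by expert assessment (0 = no symptoms, 1 = symptoms).
        X_train = np.array([[0.9, 0.8, 0.7, 0.9, 0.8],
                            [0.4, 0.3, 0.5, 0.2, 0.4],
                            [0.8, 0.9, 0.8, 0.7, 0.9],
                            [0.3, 0.2, 0.4, 0.3, 0.1]])
        y_train = np.array([0, 1, 0, 1])

        clf = KNeighborsClassifier(n_neighbors=3).fit(X_train, y_train)
        new_user = np.array([[0.5, 0.3, 0.4, 0.4, 0.3]])
        print(clf.predict(new_user))   # -> [1]: symptoms indicated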

  36. Measurement of the area of venous ulcers using two software programs

    PubMed Central

    Eberhardt, Thaís Dresch; de Lima, Suzinara Beatriz Soares; Lopes, Luis Felipe Dias; Borges, Eline de Lima; Weiller, Teresinha Heck; da Fonseca, Graziele Gorete Portella

    2016-01-01

    Objective: to compare the measured area of venous ulcers using AutoCAD(r) and Image Tool software. Method: this was a reproducibility assessment conducted in an angiology clinic of a university hospital. Data were collected from 21 patients with venous ulcers, in the period from March to July of 2015, using a collection form and photographs of wounds. Five nurses (evaluators) of the hospital skin wound study group participated. The wounds were measured using both software programs. Data were analyzed using the intraclass correlation coefficient, the concordance correlation coefficient, and Bland-Altman analysis. The study met the ethical requirements of current legislation. Results: the size of ulcers varied widely, however, without significant difference between the measurements; an excellent intraclass and concordance correlation was found between both software programs, which seem to be more accurate when measuring a wound area >10 cm². Conclusion: the use of both software programs is appropriate for measurement of venous ulcers, appearing to be more accurate when used to measure a wound area > 10 cm². PMID:27992028
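
    Of the agreement statistics used above, the Bland-Altman analysis is the simplest to make concrete: the bias and 95% limits of agreement of the paired differences. A small sketch with invented paired areas (not the study's data):

        import numpy as np

        def bland_altman(a, b):
            # Bias and 95% limits of agreement for paired measurements,
            # e.g. the same wound measured in two software programs.
            diff = np.asarray(a, float) - np.asarray(b, float)
            bias, spread = diff.mean(), 1.96 * diff.std(ddof=1)
            return bias, (bias - spread, bias + spread)

        autocad   = [12.1, 8.4, 15.3, 22.0, 5.2]   # hypothetical areas, cm^2
        imagetool = [12.4, 8.1, 15.0, 22.5, 5.5]
        print(bland_altman(autocad, imagetool))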

  37. Development of Cell Analysis Software for Cultivated Corneal Endothelial Cells.

    PubMed

    Okumura, Naoki; Ishida, Naoya; Kakutani, Kazuya; Hongo, Akane; Hiwa, Satoru; Hiroyasu, Tomoyuki; Koizumi, Noriko

    2017-11-01

    To develop analysis software for cultured human corneal endothelial cells (HCECs). The software was designed to recognize cell borders and to provide parameters such as cell density, coefficient of variation, and polygonality of cultured HCECs based on phase contrast images. Cultured HCECs with high or low cell density were incubated with Ca-free and Mg-free phosphate-buffered saline for 10 minutes to reveal the cell borders and were then analyzed with the software (n = 50). Phase contrast images showed that cell borders were not distinctly outlined, but these borders became more distinctly outlined after the phosphate-buffered saline treatment and were recognized by the cell analysis software. The cell density value provided by the software was similar to that obtained by manual cell counting by an experienced researcher. Morphometric parameters, such as the coefficient of variation and polygonality, were also produced by the software, and these values were significantly correlated with cell density (Pearson correlation coefficients -0.62 and 0.63, respectively). The software described here provides morphometric information from phase contrast images, and it enables objective and noninvasive quality assessment for tissue engineering therapy of the corneal endothelium.

  38. WGCNA: an R package for weighted correlation network analysis.

    PubMed

    Langfelder, Peter; Horvath, Steve

    2008-12-29

    Correlation networks are increasingly being used in bioinformatics applications. For example, weighted gene co-expression network analysis is a systems biology method for describing the correlation patterns among genes across microarray samples. Weighted correlation network analysis (WGCNA) can be used for finding clusters (modules) of highly correlated genes, for summarizing such clusters using the module eigengene or an intramodular hub gene, for relating modules to one another and to external sample traits (using eigengene network methodology), and for calculating module membership measures. Correlation networks facilitate network based gene screening methods that can be used to identify candidate biomarkers or therapeutic targets. These methods have been successfully applied in various biological contexts, e.g. cancer, mouse genetics, yeast genetics, and analysis of brain imaging data. While parts of the correlation network methodology have been described in separate publications, there is a need to provide a user-friendly, comprehensive, and consistent software implementation and an accompanying tutorial. The WGCNA R software package is a comprehensive collection of R functions for performing various aspects of weighted correlation network analysis. The package includes functions for network construction, module detection, gene selection, calculations of topological properties, data simulation, visualization, and interfacing with external software. Along with the R package we also present R software tutorials. While the methods development was motivated by gene expression data, the underlying data mining approach can be applied to a variety of different settings. The WGCNA package provides R functions for weighted correlation network analysis, e.g. co-expression network analysis of gene expression data. The R package along with its source code and additional material are freely available at http://www.genetics.ucla.edu/labs/horvath/CoexpressionNetwork/Rpackages/WGCNA.
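
    The core construction in WGCNA is compact enough to sketch: an unsigned adjacency is the absolute gene-gene correlation raised to a soft-thresholding power beta (chosen in the package by a scale-free topology criterion); module detection then proceeds via topological overlap and hierarchical clustering. A NumPy stand-in for the first step only (illustrative; use the R package itself for real analyses):

        import numpy as np

        def wgcna_adjacency(expr, beta=6):
            # expr: (n_samples, n_genes).  Unsigned soft-threshold adjacency
            # a_ij = |cor(x_i, x_j)|**beta suppresses weak correlations
            # smoothly instead of hard-thresholding them.
            corr = np.corrcoef(expr, rowvar=False)
            adj = np.abs(corr) ** beta
            np.fill_diagonal(adj, 0.0)
            return adj

        expr = np.random.default_rng(0).normal(size=(20, 5))   # toy data
        k = wgcna_adjacency(expr).sum(axis=0)   # (intramodular) connectivity
        print(k)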

  39. WGCNA: an R package for weighted correlation network analysis

    PubMed Central

    Langfelder, Peter; Horvath, Steve

    2008-01-01

    Background: Correlation networks are increasingly being used in bioinformatics applications. For example, weighted gene co-expression network analysis is a systems biology method for describing the correlation patterns among genes across microarray samples. Weighted correlation network analysis (WGCNA) can be used for finding clusters (modules) of highly correlated genes, for summarizing such clusters using the module eigengene or an intramodular hub gene, for relating modules to one another and to external sample traits (using eigengene network methodology), and for calculating module membership measures. Correlation networks facilitate network based gene screening methods that can be used to identify candidate biomarkers or therapeutic targets. These methods have been successfully applied in various biological contexts, e.g. cancer, mouse genetics, yeast genetics, and analysis of brain imaging data. While parts of the correlation network methodology have been described in separate publications, there is a need to provide a user-friendly, comprehensive, and consistent software implementation and an accompanying tutorial. Results: The WGCNA R software package is a comprehensive collection of R functions for performing various aspects of weighted correlation network analysis. The package includes functions for network construction, module detection, gene selection, calculations of topological properties, data simulation, visualization, and interfacing with external software. Along with the R package we also present R software tutorials. While the methods development was motivated by gene expression data, the underlying data mining approach can be applied to a variety of different settings. Conclusion: The WGCNA package provides R functions for weighted correlation network analysis, e.g. co-expression network analysis of gene expression data. The R package along with its source code and additional material are freely available at . PMID:19114008

  40. Digital Image Correlation from Commercial to FOS Software: a Mature Technique for Full-Field Displacement Measurements

    NASA Astrophysics Data System (ADS)

    Belloni, V.; Ravanelli, R.; Nascetti, A.; Di Rita, M.; Mattei, D.; Crespi, M.

    2018-05-01

    In the last few decades, there has been growing interest in non-contact methods for full-field displacement and strain measurement. Among such techniques, Digital Image Correlation (DIC) has received particular attention thanks to its ability to provide this information by comparing digital images of a sample surface before and after deformation. The method is now commonly adopted in the fields of civil, mechanical, and aerospace engineering, and different companies and some research groups have implemented 2D and 3D DIC software. In this work, a review of the status of DIC software is given first. Moreover, a free and open-source 2D DIC software package is presented, named py2DIC and developed in Python at the Geodesy and Geomatics Division of DICEA of the University of Rome "La Sapienza"; its potential was evaluated by processing the images captured during tensile tests performed in the Structural Engineering Lab of the University of Rome "La Sapienza" and comparing the results to those obtained using the commercial software Vic-2D developed by Correlated Solutions Inc., USA. The agreement of these results at the one-hundredth-of-a-millimetre level demonstrates the possibility of using this open-source software as a valuable 2D DIC tool to measure full-field displacements on the investigated sample surface.
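
    At the heart of any 2D DIC code of this kind is subset matching by normalized cross-correlation; py2DIC and Vic-2D add subpixel interpolation and subset shape functions on top of it. A bare integer-pixel sketch of the matching step (illustrative names, not the py2DIC implementation; it assumes textured, non-constant patches):

        import numpy as np

        def zncc(a, b):
            # Zero-normalized cross-correlation of two equal-size patches.
            a, b = a - a.mean(), b - b.mean()
            return (a * b).sum() / np.sqrt((a ** 2).sum() * (b ** 2).sum())

        def track_subset(ref, cur, y, x, half=10, search=5):
            # Integer-pixel displacement of the subset centred at (y, x).
            tpl = ref[y - half:y + half + 1, x - half:x + half + 1]
            best, best_uv = -2.0, (0, 0)
            for dv in range(-search, search + 1):
                for du in range(-search, search + 1):
                    win = cur[y + dv - half:y + dv + half + 1,
                              x + du - half:x + du + half + 1]
                    c = zncc(tpl, win)
                    if c > best:
                        best, best_uv = c, (dv, du)
            return best_uv, best   # displacement (dy, dx) and its correlation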

  41. Research on software behavior trust based on hierarchy evaluation

    NASA Astrophysics Data System (ADS)

    Long, Ke; Xu, Haishui

    2017-08-01

    In view of the correlation of software behaviors, we evaluate software behavior credibility at two levels: control flow and data flow. At the control flow level, a method for tracing software behavior based on a support vector machine (SVM) is proposed. At the data flow level, a behavioral evidence evaluation based on the fuzzy decision analysis method is put forward.

  42. Maximum entropy analysis of polarized fluorescence decay of (E)GFP in aqueous solution

    NASA Astrophysics Data System (ADS)

    Novikov, Eugene G.; Skakun, Victor V.; Borst, Jan Willem; Visser, Antonie J. W. G.

    2018-01-01

    The maximum entropy method (MEM) was used for the analysis of polarized fluorescence decays of enhanced green fluorescent protein (EGFP) in buffered water/glycerol mixtures, obtained with time-correlated single-photon counting (Visser et al 2016 Methods Appl. Fluoresc. 4 035002). To this end, we used a general-purpose MEM software module that was developed earlier to analyze (complex) laser photolysis kinetics of ligand rebinding reactions in oxygen-binding proteins. We demonstrate that the MEM software provides reliable results and is easy to use for the analysis of both the total fluorescence decay and the fluorescence anisotropy decay of aqueous solutions of EGFP. The rotational correlation times of EGFP in water/glycerol mixtures, obtained by MEM as maxima of the correlation-time distributions, are identical to the single correlation times determined by global analysis of the parallel and perpendicular polarized decay components. The MEM software is also able to determine homo-FRET in another dimeric GFP, for which the transfer correlation time is an order of magnitude shorter than the rotational correlation time. One important advantage of MEM analysis is that no initial guesses of parameters are required, since MEM selects the least correlated solution from the feasible set of solutions.

  43. Visual and computer software-aided estimates of Dupuytren's contractures: correlation with clinical goniometric measurements.

    PubMed

    Smith, R P; Dias, J J; Ullah, A; Bhowal, B

    2009-05-01

    Corrective surgery for Dupuytren's disease represents a significant proportion of a hand surgeon's workload. The decision to go ahead with surgery and the success of surgery require measuring the degree of contracture of the diseased finger(s). This is performed in clinic with a goniometer, pre- and postoperatively. Monitoring the recurrence of the contracture can inform surgical outcome, research, and audit. We compared visual and computer software-aided estimation of Dupuytren's contractures to clinical goniometric measurements in 60 patients with Dupuytren's disease. Patients' hands were digitally photographed. There were 76 contracted finger joints: 70 proximal interphalangeal joints and six distal interphalangeal joints. The degree of contracture in these images was visually assessed by six orthopaedic staff of differing seniority and re-assessed with computer software. Across assessors, the Pearson correlation between the goniometric measurements and the visual estimations was 0.83, and this significantly improved to 0.88 with computer software. Reliability with intra-class correlations achieved 0.78 and 0.92 for the visual and computer-aided estimations, respectively, and with test-retest analysis, 0.92 for visual estimation and 0.95 for computer-aided measurements. Visual estimations of Dupuytren's contractures correlate well with actual clinical goniometric measurements and improve further if measured with computer software. Digital images permit monitoring of contracture after surgery and may facilitate research into disease progression and auditing of surgical technique.

  4. Efficacy of a Newly Designed Cephalometric Analysis Software for McNamara Analysis in Comparison with Dolphin Software.

    PubMed

    Nouri, Mahtab; Hamidiaval, Shadi; Akbarzadeh Baghban, Alireza; Basafa, Mohammad; Fahim, Mohammad

    2015-01-01

    Cephalometric norms of McNamara analysis have been studied in various populations due to their optimal efficiency. Dolphin cephalometric software greatly enhances the conduction of this analysis for orthodontic measurements. However, Dolphin is very expensive and cannot be afforded by many clinicians in developing countries. A suitable alternative software program in Farsi/English would greatly help Farsi-speaking clinicians. The present study aimed to develop an affordable Iranian cephalometric analysis software program and compare it with Dolphin, the standard software available on the market for cephalometric analysis. In this diagnostic, descriptive study, 150 lateral cephalograms of individuals with normal occlusion were selected in Mashhad and Qazvin, two major cities of Iran mainly populated with Fars ethnicity, the main Iranian ethnic group. The cephalometric software was designed using Microsoft Visual C++ under Windows XP. After tracing the cephalograms, the McNamara analysis standards were measured both with Dolphin and with the new software, and the measurements were compared on both series of cephalograms. The validity and reliability were tested using the intra-class correlation coefficient. Calculations showed a very high correlation between the results of the Iranian cephalometric analysis software and Dolphin (ICC 0.570-1.0), confirming the validity and optimal efficacy of the newly designed software. According to our results, the newly designed software has acceptable validity and reliability and can be used for orthodontic diagnosis, treatment planning, and assessment of treatment outcome.

  5. Current Practice in Software Development for Computational Neuroscience and How to Improve It

    PubMed Central

    Gewaltig, Marc-Oliver; Cannon, Robert

    2014-01-01

    Almost all research work in computational neuroscience involves software. As researchers try to understand ever more complex systems, there is a continual need for software with new capabilities. Because of the wide range of questions being investigated, new software is often developed rapidly by individuals or small groups. In these cases, it can be hard to demonstrate that the software gives the right results. Software developers are often open about the code they produce and willing to share it, but there is little appreciation among potential users of the great diversity of software development practices and end results, and how this affects the suitability of software tools for use in research projects. To help clarify these issues, we have reviewed a range of software tools and asked how the culture and practice of software development affects their validity and trustworthiness. We identified four key questions that can be used to categorize software projects and correlate them with the type of product that results. The first question addresses what is being produced. The other three concern why, how, and by whom the work is done. The answers to these questions show strong correlations with the nature of the software being produced, and its suitability for particular purposes. Based on our findings, we suggest ways in which current software development practice in computational neuroscience can be improved and propose checklists to help developers, reviewers, and scientists to assess the quality of software and whether particular pieces of software are ready for use in research. PMID:24465191

  6. Current practice in software development for computational neuroscience and how to improve it.

    PubMed

    Gewaltig, Marc-Oliver; Cannon, Robert

    2014-01-01

    Almost all research work in computational neuroscience involves software. As researchers try to understand ever more complex systems, there is a continual need for software with new capabilities. Because of the wide range of questions being investigated, new software is often developed rapidly by individuals or small groups. In these cases, it can be hard to demonstrate that the software gives the right results. Software developers are often open about the code they produce and willing to share it, but there is little appreciation among potential users of the great diversity of software development practices and end results, and how this affects the suitability of software tools for use in research projects. To help clarify these issues, we have reviewed a range of software tools and asked how the culture and practice of software development affects their validity and trustworthiness. We identified four key questions that can be used to categorize software projects and correlate them with the type of product that results. The first question addresses what is being produced. The other three concern why, how, and by whom the work is done. The answers to these questions show strong correlations with the nature of the software being produced, and its suitability for particular purposes. Based on our findings, we suggest ways in which current software development practice in computational neuroscience can be improved and propose checklists to help developers, reviewers, and scientists to assess the quality of software and whether particular pieces of software are ready for use in research.

  7. Multi-version software reliability through fault-avoidance and fault-tolerance

    NASA Technical Reports Server (NTRS)

    Vouk, Mladen A.; Mcallister, David F.

    1989-01-01

    A number of experimental and theoretical issues associated with the practical use of multi-version software to provide run-time tolerance to software faults were investigated. A specialized tool was developed and evaluated for measuring testing coverage for a variety of metrics. The tool was used to collect information on the relationships between software faults and the coverage provided by the testing process as measured by different metrics (including data flow metrics). Considerable correlation was found between the coverage provided by some higher metrics and the elimination of faults in the code. Back-to-back testing was continued as an efficient mechanism for the removal of uncorrelated faults and of common-cause faults of variable span. Work on software reliability estimation methods also continued, based on non-random sampling and on the relationship between software reliability and the code coverage provided through testing. New fault tolerance models were formulated. Simulation studies of the Acceptance Voting and Multi-stage Voting algorithms were finished, and it was found that these two schemes for software fault tolerance are superior in many respects to some commonly used schemes. Particularly encouraging are the safety properties of the Acceptance Voting scheme.
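
    As a hedged sketch of the Acceptance Voting idea evaluated above (the acceptance test, outputs, and failure handling are invented for illustration), each version's output must first pass an acceptance test, and only the accepted outputs are voted on:

```python
# Illustrative Acceptance Voting for N-version software: filter version
# outputs through an acceptance test, then majority-vote the survivors.
from collections import Counter

def acceptance_voting(outputs, accept):
    accepted = [o for o in outputs if accept(o)]
    if not accepted:
        raise RuntimeError("no version passed the acceptance test")
    value, votes = Counter(accepted).most_common(1)[0]
    if votes <= len(accepted) // 2:          # require a strict majority
        raise RuntimeError("no majority among accepted outputs")
    return value

# Toy run: three versions compute a value; one result is out of range.
accept = lambda x: 0.0 <= x <= 100.0         # invented plausibility check
print(acceptance_voting([42.0, 42.0, 250.0], accept))   # -> 42.0
```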

  8. ALMA Correlator Real-Time Data Processor

    NASA Astrophysics Data System (ADS)

    Pisano, J.; Amestica, R.; Perez, J.

    2005-10-01

    The design of a real-time Linux application utilizing the Real-Time Application Interface (RTAI) to process real-time data from the radio astronomy correlator for the Atacama Large Millimeter Array (ALMA) is described. The correlator is a custom-built digital signal processor which computes the cross-correlation function of two digitized signal streams. ALMA will have 64 antennas with 2080 signal streams, each with a sample rate of 4 giga-samples per second. The correlator's aggregate data output will be 1 gigabyte per second. The software is defined by hard deadlines with high input and processing data rates, while requiring interfaces to non-real-time external computers. The designed computer system, the Correlator Data Processor (CDP), consists of a cluster of 17 SMP computers: 16 compute nodes plus a master controller node, all running real-time Linux kernels. Each compute node uses an RTAI kernel module to interface to a 32-bit parallel interface which accepts raw data at 64 megabytes per second in 1-megabyte chunks every 16 milliseconds. These data are transferred to tasks running on multiple CPUs in hard real time using RTAI's LXRT facility to perform quantization corrections, data windowing, FFTs, and phase corrections for a processing rate of approximately 1 GFLOPS. Highly accurate timing signals are distributed to all seventeen computer nodes in order to synchronize them to other time-dependent devices in the observatory array; RTAI kernel tasks interface to the timing signals, providing sub-millisecond timing resolution. The CDP interfaces, via the master node, to other computer systems on an external intranet for command and control, data storage, and further data (image) processing. The master node accesses these external systems utilizing ALMA Common Software (ACS), a CORBA-based client-server software infrastructure providing logging, monitoring, data delivery, and intra-computer function invocation. The software is being developed in tandem with the correlator hardware, which presents software engineering challenges as the hardware evolves. The current status of this project and future goals are also presented.
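
    A loose, numpy-only sketch of the kind of per-chunk processing chain described above (window the lag data, FFT to a spectrum, apply a phase correction); the chunk size, window choice, and correction are placeholders, not ALMA's actual algorithms:

```python
# Rough sketch of one lag-to-spectrum processing step: window the raw lag
# cross-correlation from the hardware correlator, FFT it, and apply a
# frequency-dependent phase correction.
import numpy as np

def process_chunk(lags, delay, dt=1.0):
    n = lags.size
    windowed = lags * np.hanning(n)              # data windowing
    spectrum = np.fft.rfft(windowed)             # lag -> frequency domain
    freqs = np.fft.rfftfreq(n, d=dt)
    return spectrum * np.exp(-2j * np.pi * freqs * delay)  # phase correction

chunk = np.random.default_rng(0).normal(size=1024)  # stand-in for one chunk
print(process_chunk(chunk, delay=3.0).shape)        # (513,)
```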

  9. Analysis of Performance of Stereoscopic-Vision Software

    NASA Technical Reports Server (NTRS)

    Kim, Won; Ansar, Adnan; Steele, Robert; Steinke, Robert

    2007-01-01

    A team of JPL researchers has analyzed stereoscopic vision software and produced a document describing its performance. This software is of the type used in maneuvering exploratory robotic vehicles on Martian terrain. The software in question utilizes correlations between portions of the images recorded by two electronic cameras to compute stereoscopic disparities, which, in conjunction with camera models, are used in computing distances to terrain points to be included in constructing a three-dimensional model of the terrain. The analysis included effects of correlation-window size, a pyramidal image down-sampling scheme, vertical misalignment, focus, maximum disparity, stereo baseline, and range ripples. Contributions of sub-pixel interpolation, vertical misalignment, and foreshortening to stereo correlation error were examined theoretically and experimentally. It was found that camera-calibration inaccuracy contributes to both down-range and cross-range error but stereo correlation error affects only the down-range error. Experimental data for quantifying the stereo disparity error were obtained by use of reflective metrological targets taped to corners of bricks placed at known positions relative to the cameras. For the particular 1,024-by-768-pixel cameras of the system analyzed, the standard deviation of the down-range disparity error was found to be 0.32 pixel.
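
    To make the correlation step concrete, here is a simplified one-dimensional sketch (not JPL's implementation) of matching a window along a scanline by normalized cross-correlation, including the parabolic sub-pixel refinement that the analysis refers to:

```python
# Simplified 1D stereo matching: slide a template along the right scanline,
# score by normalized cross-correlation, refine the peak to sub-pixel.
import numpy as np

def ncc(a, b):
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return (a * b).sum() / denom if denom else 0.0

def disparity(left, right, x, win=5, max_disp=20):
    tpl = left[x:x + win]
    scores = np.array([ncc(tpl, right[x - d:x - d + win])
                       for d in range(max_disp)])
    d = int(scores.argmax())
    if 0 < d < max_disp - 1:                 # parabolic sub-pixel peak
        y0, y1, y2 = scores[d - 1], scores[d], scores[d + 1]
        d += 0.5 * (y0 - y2) / (y0 - 2 * y1 + y2)
    return d

right_line = np.random.default_rng(1).normal(size=200)
left_line = np.roll(right_line, 7)           # true disparity of 7 pixels
print(disparity(left_line, right_line, x=100))   # ~7.0
```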

  10. Parallel algorithm of VLBI software correlator under multiprocessor environment

    NASA Astrophysics Data System (ADS)

    Zheng, Weimin; Zhang, Dong

    2007-11-01

    The correlator is the key signal processing equipment of a Very Long Baseline Interferometry (VLBI) synthetic aperture telescope. It receives the mass data collected by the VLBI observatories and produces the visibility function of the target, which can be used for spacecraft positioning, baseline length measurement, synthesis imaging, and other scientific applications. VLBI data correlation is both data-intensive and computation-intensive. This paper presents the algorithms of two parallel software correlators under multiprocessor environments. A near real-time correlator for spacecraft tracking adopts pipelining and thread-parallel technology, and runs on SMP (Symmetric Multiple Processor) servers. Another high-speed prototype correlator, using a mixed Pthreads and MPI (Message Passing Interface) parallel algorithm, is realized on a small Beowulf cluster platform. Both correlators have a flexible structure, are scalable, and are able to correlate data from 10 stations.
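
    The paper's pipelined and Pthreads/MPI schemes are tied to its SMP and Beowulf platforms; as a generic illustration of the same decomposition, the sketch below (block size and data invented) distributes time blocks of two station streams across a process pool and FFT-correlates each block:

```python
# Generic parallel software-correlation sketch: carve two station streams
# into time blocks and cross-correlate the blocks in a process pool.
import numpy as np
from multiprocessing import Pool

BLOCK = 4096

def correlate_block(args):
    a, b = args
    return np.fft.rfft(a) * np.conj(np.fft.rfft(b))   # cross-power spectrum

def parallel_correlate(x, y, workers=4):
    blocks = [(x[i:i + BLOCK], y[i:i + BLOCK])
              for i in range(0, len(x) - BLOCK + 1, BLOCK)]
    with Pool(workers) as pool:
        spectra = pool.map(correlate_block, blocks)
    return np.mean(spectra, axis=0)                   # integrate over blocks

if __name__ == "__main__":
    rng = np.random.default_rng(2)
    x = rng.normal(size=1 << 16)
    y = np.roll(x, 3) + 0.1 * rng.normal(size=x.size)  # delayed noisy copy
    print(parallel_correlate(x, y).shape)              # (2049,)
```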

  11. Software-assisted small bowel motility analysis using free-breathing MRI: feasibility study.

    PubMed

    Bickelhaupt, Sebastian; Froehlich, Johannes M; Cattin, Roger; Raible, Stephan; Bouquet, Hanspeter; Bill, Urs; Patak, Michael A

    2014-01-01

    To validate a software prototype allowing for small bowel motility analysis in free breathing by comparing it to manual measurements. In all, 25 patients (15 male, 10 female; mean age 39 years) were included in this Institutional Review Board-approved, retrospective study. Magnetic resonance imaging (MRI) was performed on a 1.5T system after standardized preparation, acquiring motility sequences in free breathing over 69-84 seconds. Small bowel motility was analyzed manually and with the software. Functional parameters, measurement time, and reproducibility were compared using the coefficient of variance and the paired Student's t-test. Correlation was analyzed using Pearson's correlation coefficient and linear regression. The 25 segments were analyzed twice, both by hand and using the software with automatic breathing correction. All assessed parameters correlated significantly between the methods (P < 0.01), but the scattering of repeated measurements was significantly (P < 0.01) lower using the software (3.90%, standard deviation [SD] ± 5.69) than with manual examination (9.77%, SD ± 11.08). The time needed was significantly shorter (P < 0.001) with the software (4.52 minutes, SD ± 1.58) than with manual measurement (17.48 minutes, SD ± 1.75). The software provides reliable and faster small bowel motility measurements in free-breathing MRI compared to manual analyses. The new technique allows for analyses of prolonged sequences acquired in free breathing, improving the informative value of the examinations by amplifying the evaluable data. Copyright © 2013 Wiley Periodicals, Inc.

  12. Semi-automated software to measure luminal and stromal areas of choroid in optical coherence tomographic images.

    PubMed

    Sonoda, Shozo; Sakamoto, Taiji; Kakiuchi, Naoko; Shiihara, Hideki; Sakoguchi, Tomonori; Tomita, Masatoshi; Yamashita, Takehiro; Uchino, Eisuke

    2018-03-01

    To determine the capabilities of the "EyeGround" software in measuring choroidal cross-sectional areas in optical coherence tomographic (OCT) images. Cross-sectional, prospective study. The cross-sectional area of the subfoveal choroid within a 1500 µm diameter circle centered on the fovea was measured both with and without the EyeGround software in the OCT images. The differences between the evaluation times and the results of the measurements were compared. The inter-rater, intra-rater, and inter-method agreements were determined. Fifty-one eyes of 51 healthy subjects were studied: 24 men and 27 women with an average age of 35.0 ± 8.8 years. The time for analyzing a single image was significantly shorter with the software at 3.2±1.1 min than without the software at 12.1±5.1 min (P <0.001). The inter-method correlation coefficient for the measurements of the whole choroid was high [0.989, 95% CI (0.981-0.994)]. With the software, the inter-rater correlation coefficient was significantly high [0.997, 95% CI (0.995-0.999)], and the intra-rater correlation coefficient was also significantly high [0.999, 95% CI (0.999-1.0)]. The EyeGround software can measure the choroidal area in OCT cross-sectional images with good reproducibility and in a significantly shorter time. It can be a valuable tool for analyzing the choroid.

  13. GEOSTATISTICS FOR WASTE MANAGEMENT: A USER'S MANUAL FOR THE GEOPACK (VERSION 1.0) GEOSTATISTICAL SOFTWARE SYSTEM

    EPA Science Inventory

    GEOPACK, a comprehensive user-friendly geostatistical software system, was developed to help in the analysis of spatially correlated data. The software system was developed to be used by scientists, engineers, regulators, etc., with little experience in geostatistical techniques...

  14. GEOSTATISTICS FOR WASTE MANAGEMENT: A USER'S MANUAL FOR THE GEOPACK (VERSION 1.0) GEOSTATISTICAL SOFTWARE SYSTEM

    EPA Science Inventory

    A comprehensive, user-friendly geostatistical software system called GEOPACk has been developed. The purpose of this software is to make available the programs necessary to undertake a geostatistical analysis of spatially correlated data. The programs were written so that they ...

  15. Deriving the Cost of Software Maintenance for Software Intensive Systems

    DTIC Science & Technology

    2011-08-29

    more of software maintenance). Figure 4. SEER-SEM Maintenance Effort by Year Report (Reifer, Allen, Fersch, Hitchings, Judy, & Rosa, 2010)...understand the linear relationship between two variables. The formula for the simple Pearson product-moment correlation is represented in Equation 5...standardization is required across the software maintenance community in order to ensure that the data being recorded can be employed beyond the agency or
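
    Equation 5 itself is elided in the snippet above; the simple Pearson product-moment correlation it refers to has the standard form:

```latex
% Standard Pearson product-moment correlation between samples x_i and y_i
% (the elided Equation 5 presumably takes this or an equivalent form).
\[
  r = \frac{\sum_{i=1}^{n} (x_i - \bar{x})(y_i - \bar{y})}
           {\sqrt{\sum_{i=1}^{n} (x_i - \bar{x})^{2}}\,
            \sqrt{\sum_{i=1}^{n} (y_i - \bar{y})^{2}}}
\]
```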

  16. Computerized assessment of placental calcification post-ultrasound: a novel software tool.

    PubMed

    Moran, M; Higgins, M; Zombori, G; Ryan, J; McAuliffe, F M

    2013-05-01

    Placental calcification is associated with an increased risk of perinatal morbidity and mortality. The subjectivity of current ultrasound methods of assessment of placental calcification indicates that a more objective method is required. The aim of this study was to correlate the percentage of calcification defined by the clinician using a new software tool for calculating the extent of placental calcification with traditional ultrasound methods and with pregnancy outcome. Ninety placental images were individually assessed. An upper threshold was defined, based on high intensity, to quantify calcification within the placenta. Output metrics were then produced including the overall percentage of calcification with respect to the total number of pixels within the region of interest. The results were correlated with traditional ultrasound methods of assessment of placental calcification and with pregnancy outcome. The results demonstrate a significant correlation between placental calcification, as defined using the software, and traditional methods of Grannum grading of placental calcification. Whilst correlation with perinatal outcome and cord pH was not significant as a result of small numbers, patients with placental calcification assessed using the computerized software at the upper quartile had higher rates of poor perinatal outcome when compared with those at the lower quartile (8/22 (36%) vs 3/23 (13%); P = 0.069). These results suggest that this computerized software tool has the potential to become an alternative method of assessing placental calcification. Copyright © 2012 ISUOG. Published by John Wiley & Sons Ltd.
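
    The output metric described reduces to counting pixels above an intensity threshold inside the region of interest; a minimal numpy sketch follows (the threshold, image, and mask are arbitrary stand-ins, not the tool's calibrated values):

```python
# Percentage of high-intensity ("calcified") pixels within a placental ROI.
import numpy as np

def percent_calcified(image, roi_mask, threshold=200):
    roi = image[roi_mask]                       # pixels inside the ROI only
    return 100.0 * np.count_nonzero(roi >= threshold) / roi.size

rng = np.random.default_rng(3)
img = rng.integers(0, 256, size=(64, 64))       # stand-in ultrasound image
mask = np.zeros((64, 64), dtype=bool)
mask[16:48, 16:48] = True                       # rectangular ROI
print(round(percent_calcified(img, mask), 1))
```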

  17. Assessment of left ventricular mechanical dyssynchrony by phase analysis of gated-SPECT myocardial perfusion imaging and tissue Doppler imaging: comparison between QGS and ECTb software packages.

    PubMed

    Rastgou, Fereydoon; Shojaeifard, Maryam; Amin, Ahmad; Ghaedian, Tahereh; Firoozabadi, Hasan; Malek, Hadi; Yaghoobi, Nahid; Bitarafan-Rajabi, Ahmad; Haghjoo, Majid; Amouzadeh, Hedieh; Barati, Hossein

    2014-12-01

    Recently, the phase analysis of gated single-photon emission computed tomography (SPECT) myocardial perfusion imaging (MPI) has become feasible via several software packages for the evaluation of left ventricular mechanical dyssynchrony. We compared two quantitative software packages, quantitative gated SPECT (QGS) and the Emory cardiac toolbox (ECTb), with tissue Doppler imaging (TDI) as the conventional method for the evaluation of left ventricular mechanical dyssynchrony. Thirty-one patients with severe heart failure (ejection fraction ≤35%) and regular heart rhythm, who were referred for gated-SPECT MPI, were enrolled. TDI was performed within 3 days after MPI. Dyssynchrony parameters derived from gated-SPECT MPI were analyzed by QGS and ECTb and were compared with the Yu index and septal-lateral wall delay measured by TDI. QGS and ECTb showed a good correlation for the assessment of phase histogram bandwidth (PHB) and phase standard deviation (PSD) (r = 0.664 and r = 0.731, P < .001, respectively). However, the mean values of PHB and PSD by ECTb were significantly higher than those of QGS. No significant correlation was found between ECTb or QGS and the Yu index. Nevertheless, PHB, PSD, and entropy derived from QGS revealed a significant (r = 0.424, r = 0.478, r = 0.543, respectively; P < .02) correlation with septal-lateral wall delay. Despite a good correlation between the QGS and ECTb software packages, different normal cut-off values of PSD and PHB should be defined for each software package. There was only a modest correlation between phase analysis of gated-SPECT MPI and TDI data, especially in this population of heart failure patients with both narrow and wide QRS complexes.

  18. Fast blood flow monitoring in deep tissues with real-time software correlators

    PubMed Central

    Wang, Detian; Parthasarathy, Ashwin B.; Baker, Wesley B.; Gannon, Kimberly; Kavuri, Venki; Ko, Tiffany; Schenkel, Steven; Li, Zhe; Li, Zeren; Mullen, Michael T.; Detre, John A.; Yodh, Arjun G.

    2016-01-01

    We introduce, validate and demonstrate a new software correlator for high-speed measurement of blood flow in deep tissues based on diffuse correlation spectroscopy (DCS). The software correlator scheme employs standard PC-based data acquisition boards to measure temporal intensity autocorrelation functions continuously at 50 – 100 Hz, the fastest blood flow measurements reported with DCS to date. The data streams, obtained in vivo for typical source-detector separations of 2.5 cm, easily resolve pulsatile heart-beat fluctuations in blood flow which were previously considered to be noise. We employ the device to separate tissue blood flow from tissue absorption/scattering dynamics and thereby show that the origin of the pulsatile DCS signal is primarily flow, and we monitor cerebral autoregulation dynamics in healthy volunteers more accurately than with traditional instrumentation as a result of increased data acquisition rates. Finally, we characterize measurement signal-to-noise ratio and identify count rate and averaging parameters needed for optimal performance. PMID:27231588
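
    The core quantity such a software correlator computes is the normalized intensity autocorrelation g2(τ). The direct numpy sketch below is illustrative only; it is neither the authors' real-time implementation nor a multi-tau scheme:

```python
# Direct computation of the normalized intensity autocorrelation
# g2(tau) = <I(t) I(t+tau)> / <I>^2 from a photon-count time series.
import numpy as np

def g2(counts, max_lag):
    counts = np.asarray(counts, dtype=float)
    mean_sq = counts.mean() ** 2
    return np.array([np.mean(counts[:-lag or None] * counts[lag:]) / mean_sq
                     for lag in range(max_lag)])

rng = np.random.default_rng(4)
intensity = rng.poisson(lam=50, size=100_000)   # stand-in detector counts
print(np.round(g2(intensity, max_lag=5), 3))    # ~[1.02, 1.0, 1.0, ...]
```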

  19. Art care: A multi-modality coronary 3D reconstruction and hemodynamic status assessment software.

    PubMed

    Siogkas, Panagiotis K; Stefanou, Kostas A; Athanasiou, Lambros S; Papafaklis, Michail I; Michalis, Lampros K; Fotiadis, Dimitrios I

    2018-01-01

    Due to increasing clinical interest in software that allows 3-dimensional (3D) reconstruction and functional assessment of the coronary vasculature, several software packages have been developed and are available today. Taking this into consideration, we have developed an innovative suite of software modules that performs 3D reconstruction of coronary arterial segments using different coronary imaging modalities, such as IntraVascular UltraSound (IVUS) and invasive coronary angiography (ICA) images, Optical Coherence Tomography (OCT) and ICA images, or plain ICA images, and can safely and accurately assess the hemodynamic status of the artery of interest. The user can perform automated or manual segmentation of the IVUS or OCT images, visualize the reconstructed vessel in 3D, and export it to formats which are compatible with other Computer Aided Design (CAD) software systems. We employ finite elements to provide the capability to assess the hemodynamic functionality of the reconstructed vessels by calculating the virtual functional assessment index (vFAI), an index that has been shown to correspond and correlate well with the actual fractional flow reserve (FFR) value. All the modules of the proposed system have been thoroughly validated. In brief, the 3D-QCA module, compared to a successful commercial software package of the same genre, presented very good correlation across several validation metrics, with Pearson's correlation coefficients (R) for the calculated volumes, vFAI, length, and minimum lumen diameter of 0.99, 0.99, 0.99, and 0.88, respectively. Moreover, the automatic lumen detection modules for IVUS and OCT presented very high accuracy compared to annotations by medical experts, with Pearson's correlation coefficients reaching 0.94 and 0.99, respectively. In this study, we have presented a user-friendly software package for the 3D reconstruction of coronary arterial segments and the accurate hemodynamic assessment of the severity of existing stenoses.

  20. Support Materials for the Software Technical Review Process

    DTIC Science & Technology

    1988-04-01

    the Software Technical Review Process. Software reviewing is a general term applied to techniques for the use of human intellectual power to detect...more systematic than random. It utilizes data supplied by students, rather than relying solely on the subjective opinions of the instructor. The...The experience of other users is now essential.) "• Are the resulting grades accurate? (Thus far, they appear to correlate with student grades on

  1. Do We Really Know What Makes Educational Software Effective? A Call for Empirical Research on Effectiveness.

    ERIC Educational Resources Information Center

    Jolicoeur, Karen; Berger, Dale E.

    1986-01-01

    Examination of methods used by two software review services in evaluating microcomputer courseware--EPIE (Educational Products Information Exchange) and MicroSIFT (Microcomputer Software and Information for Teachers)--found low correlations between their recommendations for 82 programs. This lack of agreement casts doubts on the usefulness of…

  2. Effective Electronic Materials: Are Teachers Aware of These?

    ERIC Educational Resources Information Center

    Luik, P.

    2012-01-01

    This study analyses to what extent teachers recognise which interactive multimedia software is efficient and which is not. The results are based on two correlation studies. The first study was carried out with 35 different pieces of interactive multimedia software for secondary students, and 34 pieces of interactive multimedia software for primary…

  3. Metric analysis and data validation across FORTRAN projects

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Selby, Richard W., Jr.; Phillips, Tsai-Yun

    1983-01-01

    The desire to predict the effort in developing software, or to explain the quality of software, has led to the proposal of several metrics. As a step toward validating these metrics, the Software Engineering Laboratory (SEL) has analyzed the software science metrics, cyclomatic complexity, and various standard program measures for their relation to effort (including design through acceptance testing), development errors (both discrete and weighted according to the amount of time to locate and fix), and one another. The data investigated are collected from a FORTRAN project environment and examined across several projects at once, within individual projects, and with reporting accuracy checks demonstrating the need to validate a database. When the data come from individual programmers or certain validated projects, the metrics' correlations with actual effort seem to be strongest. For modules developed entirely by individual programmers, the validity ratios induce a statistically significant ordering of several of the metrics' correlations. When comparing the strongest correlations, neither the software science E metric, cyclomatic complexity, nor source lines of code appears to relate convincingly better with effort than the others.

  4. Multi-modality 3D breast imaging with X-Ray tomosynthesis and automated ultrasound.

    PubMed

    Sinha, Sumedha P; Roubidoux, Marilyn A; Helvie, Mark A; Nees, Alexis V; Goodsitt, Mitchell M; LeCarpentier, Gerald L; Fowlkes, J Brian; Chalek, Carl L; Carson, Paul L

    2007-01-01

    This study evaluated the utility of 3D automated ultrasound in conjunction with 3D digital X-Ray tomosynthesis for breast cancer detection and assessment, to better localize and characterize lesions in the breast. Tomosynthesis image volumes and automated ultrasound image volumes were acquired in the same geometry and in the same view for 27 patients. 3 MQSA certified radiologists independently reviewed the image volumes, visually correlating the images from the two modalities with in-house software. More sophisticated software was used on a smaller set of 10 cases, which enabled the radiologist to draw a 3D box around the suspicious lesion in one image set and isolate an anatomically correlated, similarly boxed region in the other modality image set. In the primary study, correlation was found to be moderately useful to the readers. In the additional study, using improved software, the median usefulness rating increased and confidence in localizing and identifying the suspicious mass increased in more than half the cases. As automated scanning and reading software techniques advance, superior results are expected.

  5. Characterizing Cyclostationary Features of Digital Modulated Signals with Empirical Measurements using Spectral Correlation Function

    DTIC Science & Technology

    2011-06-01

    Thesis by Mujun Song, Captain, ROKA (AFIT/GCE/ENG/11-09), Air Force Institute of Technology, Air University. ...generator, Agilent E4438C ESG Vector Signal Generator. Universal Software Radio Peripheral 2 (USRP2), which is a Software Defined Radio (SDR), is used
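
    For context, the spectral correlation function named in the title is conventionally defined from the cyclic autocorrelation (standard definitions, not quoted from the thesis):

```latex
% Cyclic autocorrelation at cycle frequency alpha and its Fourier
% transform, the spectral correlation function; alpha = 0 recovers the
% ordinary power spectral density.
\[
  R_x^{\alpha}(\tau) = \lim_{T \to \infty} \frac{1}{T}
    \int_{-T/2}^{T/2} x\!\left(t + \tfrac{\tau}{2}\right)
    x^{*}\!\left(t - \tfrac{\tau}{2}\right) e^{-j 2\pi \alpha t}\, dt,
  \qquad
  S_x^{\alpha}(f) = \int_{-\infty}^{\infty} R_x^{\alpha}(\tau)\,
    e^{-j 2\pi f \tau}\, d\tau
\]
```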

  6. Evaluation of a Multicolor, Single-Tube Technique To Enumerate Lymphocyte Subpopulations

    PubMed Central

    Colombo, F.; Cattaneo, A.; Lopa, R.; Portararo, P.; Rebulla, P.; Porretti, L.

    2008-01-01

    To evaluate the fully automated FACSCanto software, we compared lymphocyte subpopulation counts obtained using three-color FACSCalibur-CELLQuest and six-color FACSCanto-FACSCanto software techniques. High correlations were observed between data obtained with these techniques. Our study indicated that FACSCanto clinical software is accurate and sensitive in single-platform lymphocyte immunophenotyping. PMID:18448621

  7. Supporting Early Math--Rationales and Requirements for High Quality Software

    ERIC Educational Resources Information Center

    Haake, Magnus; Husain, Layla; Gulz, Agneta

    2015-01-01

    There is substantial evidence that preschoolers' performance in early math is highly correlated with math performance throughout school, as well as with academic skills in general. One way to help children attain early math skills is by using targeted educational software, and the paper discusses potential gains of using such software to support early math…

  8. Quantitative Measures for Software Independent Verification and Validation

    NASA Technical Reports Server (NTRS)

    Lee, Alice

    1996-01-01

    As software is maintained or reused, it undergoes an evolution which tends to increase the overall complexity of the code. To understand the effects of this, we brought in statistics experts and leading researchers in software complexity, reliability, and their interrelationships. Their project has resulted in our ability to statistically correlate specific code complexity attributes, in orthogonal domains, to errors found over time in the HAL/S flight software which flies in the Space Shuttle. Although only a prototype-tools experiment, the result of this research appears to be extendable to all other NASA software, given appropriate data similar to that logged for the Shuttle onboard software. Our research has demonstrated that a more complete domain coverage can be mathematically demonstrated with the approach we have applied, thereby ensuring full insight into the cause-and-effect relationship between the complexity of a software system and the fault density of that system. By applying the operational profile we can characterize the dynamic effects of software path complexity under this same approach. We now have the ability to measure specific attributes which have been statistically demonstrated to correlate with increased error probability, and to know which actions to take, for each complexity domain. Shuttle software verifiers can now monitor the changes in the software complexity, assess the added or decreased risk of software faults in modified code, and determine necessary corrections. The reports, tool documentation, user's guides, and new approach that have resulted from this research effort represent advances in the state of the art of software quality and reliability assurance. Details describing how to apply this technique to other NASA code are contained in this document.

  9. CONTIN XPCS: software for inverse transform analysis of X-ray photon correlation spectroscopy dynamics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andrews, Ross N.; Narayanan, Suresh; Zhang, Fan

    X-ray photon correlation spectroscopy (XPCS) and dynamic light scattering (DLS) reveal materials dynamics using coherent scattering, with XPCS permitting the investigation of dynamics in a more diverse array of materials than DLS. Heterogeneous dynamics occur in many material systems. The authors' recent work has shown how classic tools employed in the DLS analysis of heterogeneous dynamics can be extended to XPCS, revealing additional information that conventional Kohlrausch exponential fitting obscures. The present work describes the software implementation of inverse transform analysis of XPCS data. This software, called CONTIN XPCS, is an extension of traditional CONTIN analysis and accommodates the various dynamics encountered in equilibrium XPCS measurements.

  10. CONTIN XPCS: software for inverse transform analysis of X-ray photon correlation spectroscopy dynamics

    DOE PAGES

    Andrews, Ross N.; Narayanan, Suresh; Zhang, Fan; ...

    2018-02-01

    X-ray photon correlation spectroscopy (XPCS) and dynamic light scattering (DLS) reveal materials dynamics using coherent scattering, with XPCS permitting the investigation of dynamics in a more diverse array of materials than DLS. Heterogeneous dynamics occur in many material systems. The authors' recent work has shown how classic tools employed in the DLS analysis of heterogeneous dynamics can be extended to XPCS, revealing additional information that conventional Kohlrausch exponential fitting obscures. The present work describes the software implementation of inverse transform analysis of XPCS data. This software, called CONTIN XPCS, is an extension of traditional CONTIN analysis and accommodates the various dynamics encountered in equilibrium XPCS measurements.
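
    For orientation, the inverse transform at the heart of CONTIN-style analysis is standard (general DLS/XPCS relations, not this paper's notation): the Siegert relation links the measured intensity correlation to the field correlation, which in turn is a Laplace transform of the relaxation-rate distribution that the software recovers by regularized inversion:

```latex
% Siegert relation and the Laplace-transform representation of the field
% correlation; G(Gamma) is the relaxation-rate distribution recovered by
% the regularized inversion.
\[
  g_2(\tau) = 1 + \beta \lvert g_1(\tau) \rvert^{2},
  \qquad
  g_1(\tau) = \int_0^{\infty} G(\Gamma)\, e^{-\Gamma \tau}\, d\Gamma
\]
```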

  11. Software Defined GPS API: Development and Implementation of GPS Correlator Architectures Using MATLAB with Focus on SDR Implementations

    DTIC Science & Technology

    2014-05-18

    ...with the intention of offering improved software libraries for GNSS signal acquisition. It has been the team mission to implement new and improved techniques to...
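
    A common building block in such GNSS software libraries is code-phase search by circular correlation via the FFT; below is a hedged, self-contained sketch using a toy ±1 code rather than a real C/A sequence:

```python
# FFT-based circular correlation for code-phase acquisition, a core step
# in a software GPS correlator (toy +/-1 code, not a real C/A sequence).
import numpy as np

def acquire_code_phase(received, replica):
    corr = np.fft.ifft(np.fft.fft(received) *
                       np.conj(np.fft.fft(replica)))   # circular correlation
    return int(np.abs(corr).argmax())                  # best code phase

rng = np.random.default_rng(5)
code = rng.choice([-1.0, 1.0], size=1023)              # toy spreading code
rx = np.roll(code, 347) + 0.5 * rng.normal(size=1023)  # delayed, noisy
print(acquire_code_phase(rx, code))                    # -> 347
```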

  12. Surveillance of industrial processes with correlated parameters

    DOEpatents

    White, Andrew M.; Gross, Kenny C.; Kubic, William L.; Wigeland, Roald A.

    1996-01-01

    A system and method for surveillance of an industrial process. The system and method includes a plurality of sensors monitoring industrial process parameters, devices to convert the sensed data to computer compatible information and a computer which executes computer software directed to analyzing the sensor data to discern statistically reliable alarm conditions. The computer software is executed to remove serial correlation information and then calculate Mahalanobis distribution data to carry out a probability ratio test to determine alarm conditions.
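
    As an illustration of the distance computation named in the claims (the serial-correlation-removal step is omitted, and the sensor statistics are invented), a Mahalanobis anomaly score for a vector of correlated sensor readings, assuming numpy:

```python
# Mahalanobis distance of a new sensor reading from the distribution of
# normal operation; large distances would feed the probability ratio test.
import numpy as np

def mahalanobis(x, mean, cov_inv):
    d = x - mean
    return float(np.sqrt(d @ cov_inv @ d))

rng = np.random.default_rng(6)
normal_ops = rng.multivariate_normal([10.0, 5.0],
                                     [[1.0, 0.8], [0.8, 1.0]], size=5000)
mu = normal_ops.mean(axis=0)
cov_inv = np.linalg.inv(np.cov(normal_ops, rowvar=False))
print(mahalanobis(np.array([10.2, 4.9]), mu, cov_inv))  # small -> normal
print(mahalanobis(np.array([13.0, 2.0]), mu, cov_inv))  # large -> alarm
```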

  13. The Effects of Development Team Skill on Software Product Quality

    NASA Technical Reports Server (NTRS)

    Beaver, Justin M.; Schiavone, Guy A.

    2006-01-01

    This paper provides an analysis of the effect of the skill/experience of the software development team on the quality of the final software product. A method for the assessment of software development team skill and experience is proposed, and was derived from a workforce management tool currently in use by the National Aeronautics and Space Administration. Using data from 26 small-scale software development projects, the team skill measures are correlated to 5 software product quality metrics from the ISO/IEC 9126 Software Engineering Product Quality standard. In the analysis of the results, development team skill is found to be a significant factor in the adequacy of the design and implementation. In addition, the results imply that inexperienced software developers are tasked with responsibilities ill-suited to their skill level, and thus have a significant adverse effect on the quality of the software product. Keywords: software quality, development skill, software metrics

  14. Endoscopic Stone Measurement During Ureteroscopy.

    PubMed

    Ludwig, Wesley W; Lim, Sunghwan; Stoianovici, Dan; Matlaga, Brian R

    2018-01-01

    Currently, stone size cannot be accurately measured while performing flexible ureteroscopy (URS). We developed novel software for ureteroscopic, stone size measurement, and then evaluated its performance. A novel application capable of measuring stone fragment size, based on the known distance of the basket tip in the ureteroscope's visual field, was designed and calibrated in a laboratory setting. Complete URS procedures were recorded and 30 stone fragments were extracted and measured using digital calipers. The novel software program was applied to the recorded URS footage to obtain ureteroscope-derived stone size measurements. These ureteroscope-derived measurements were then compared with the actual-measured fragment size. The median longitudinal and transversal errors were 0.14 mm (95% confidence interval [CI] 0.1, 0.18) and 0.09 mm (95% CI 0.02, 0.15), respectively. The overall software accuracy and precision were 0.17 and 0.15 mm, respectively. The longitudinal and transversal measurements obtained by the software and digital calipers were highly correlated (r = 0.97 and 0.93). Neither stone size nor stone type was correlated with error measurements. This novel method and software reliably measured stone fragment size during URS. The software ultimately has the potential to make URS safer and more efficient.

  15. PSGMiner: A modular software for polysomnographic analysis.

    PubMed

    Umut, İlhan

    2016-06-01

    Sleep disorders affect a great percentage of the population. The diagnosis of these disorders is usually made by polysomnography. This paper details the development of new software to carry out feature extraction in order to perform robust analysis and classification of sleep events using polysomnographic data. The software, called PSGMiner, is a tool, which visualizes, processes and classifies bioelectrical data. The purpose of this program is to provide researchers with a platform with which to test new hypotheses by creating tests to check for correlations that are not available in commercially available software. The software is freely available under the GPL3 License. PSGMiner is composed of a number of diverse modules such as feature extraction, annotation, and machine learning modules, all of which are accessible from the main module. Using the software, it is possible to extract features of polysomnography using digital signal processing and statistical methods and to perform different analyses. The features can be classified through the use of five classification algorithms. PSGMiner offers an architecture designed for integrating new methods. Automatic scoring, which is available in almost all commercial PSG software, is not inherently available in this program, though it can be implemented by two different methodologies (machine learning and algorithms). While similar software focuses on a certain signal or event composed of a small number of modules with no expansion possibility, the software introduced here can handle all polysomnographic signals and events. The software simplifies the processing of polysomnographic signals for researchers and physicians that are not experts in computer programming. It can find correlations between different events which could help predict an oncoming event such as sleep apnea. The software could also be used for educational purposes. Copyright © 2016 Elsevier Ltd. All rights reserved.

  16. Objective breast symmetry analysis with the breast analyzing tool (BAT): improved tool for clinical trials.

    PubMed

    Krois, Wilfried; Romar, Alexander Ken; Wild, Thomas; Dubsky, Peter; Exner, Ruth; Panhofer, Peter; Jakesz, Raimund; Gnant, Michael; Fitzal, Florian

    2017-07-01

    Objective cosmetic analysis is important to evaluate the cosmetic outcome after breast surgery or breast radiotherapy. For this purpose, we aimed to improve our recently developed objective scoring software, the Breast Analyzing Tool (BAT®). A questionnaire about important factors for breast symmetry was handed out to ten experts (surgeons) and eight non-experts (students). Using these factors, the first-generation BAT® software formula was modified, and the breast symmetry index (BSI) of 129 women after breast surgery was calculated by the first author with this new BAT® formula. The resulting BSI values of these 129 breast cancer patients were then correlated with subjective symmetry scores from the 18 observers using the Harris scale. The BSI of ten images was also calculated by five observers other than the first author to assess inter-rater reliability. In a second phase, the new BAT® formula was validated and correlated with subjective scores of an additional 50 women after breast surgery. The inter-rater reliability analysis of the objective evaluation by the BAT® from five individuals showed an ICC of 0.992, with almost no difference between observers. All subjective scores of the 50 patients correlated with the modified BSI score, with a high Pearson correlation coefficient of 0.909 (p < .001), which was better than with the old software (r = 0.769; p < .001). The modified BAT® software improves the correlation between subjective and objective BSI values, and may be a new standard for trials evaluating breast symmetry.

  17. Validation of a Quality Management Metric

    DTIC Science & Technology

    2000-09-01

    quality management metric (QMM) was used to measure the performance of ten software managers on Department of Defense (DoD) software development programs. Informal verification and validation of the metric compared the QMM score to an overall program success score for the entire program and yielded positive correlation. The results of applying the QMM can be used to characterize the quality of software management and can serve as a template to improve software management performance. Future work includes further refining the QMM, applying the QMM scores to provide feedback

  18. Estimation and enhancement of real-time software reliability through mutation analysis

    NASA Technical Reports Server (NTRS)

    Geist, Robert; Offutt, A. J.; Harris, Frederick C., Jr.

    1992-01-01

    A simulation-based technique for obtaining numerical estimates of the reliability of N-version, real-time software is presented. An extended stochastic Petri net is employed to represent the synchronization structure of N versions of the software, where dependencies among versions are modeled through correlated sampling of module execution times. Test results utilizing specifications for NASA's planetary lander control software indicate that mutation-based testing could hold greater potential for enhancing reliability than the desirable but perhaps unachievable goal of independence among N versions.

  19. SCOPA and META-SCOPA: software for the analysis and aggregation of genome-wide association studies of multiple correlated phenotypes.

    PubMed

    Mägi, Reedik; Suleimanov, Yury V; Clarke, Geraldine M; Kaakinen, Marika; Fischer, Krista; Prokopenko, Inga; Morris, Andrew P

    2017-01-11

    Genome-wide association studies (GWAS) of single nucleotide polymorphisms (SNPs) have been successful in identifying loci contributing genetic effects to a wide range of complex human diseases and quantitative traits. The traditional approach to GWAS analysis is to consider each phenotype separately, despite the fact that many diseases and quantitative traits are correlated with each other, and often measured in the same sample of individuals. Multivariate analyses of correlated phenotypes have been demonstrated, by simulation, to increase power to detect association with SNPs, and thus may enable improved detection of novel loci contributing to diseases and quantitative traits. We have developed the SCOPA software to enable GWAS analysis of multiple correlated phenotypes. The software implements "reverse regression" methodology, which treats the genotype of an individual at a SNP as the outcome and the phenotypes as predictors in a general linear model. SCOPA can be applied to quantitative traits and categorical phenotypes, and can accommodate imputed genotypes under a dosage model. The accompanying META-SCOPA software enables meta-analysis of association summary statistics from SCOPA across GWAS. Application of SCOPA to two GWAS of high- and low-density lipoprotein cholesterol, triglycerides and body mass index, and subsequent meta-analysis with META-SCOPA, highlighted stronger association signals than univariate phenotype analysis at established lipid and obesity loci. The META-SCOPA meta-analysis also revealed a novel signal of association at genome-wide significance for triglycerides mapping to GPC5 (lead SNP rs71427535, p = 1.1×10⁻⁸), which has not been reported in previous large-scale GWAS of lipid traits. The SCOPA and META-SCOPA software enable discovery and dissection of multiple phenotype association signals through implementation of a powerful reverse regression approach.
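
    A minimal sketch of the reverse-regression idea that SCOPA implements (genotype dosage as the outcome, phenotypes as predictors), using ordinary least squares and a joint F-test; this is illustrative rather than SCOPA's code, and the simulated data are invented:

```python
# "Reverse regression": regress genotype dosage on multiple correlated
# phenotypes and F-test the joint fit against an intercept-only model.
import numpy as np
from scipy import stats

def reverse_regression(dosage, phenos):
    n, p = phenos.shape
    X = np.column_stack([np.ones(n), phenos])    # intercept + phenotypes
    _, rss, *_ = np.linalg.lstsq(X, dosage, rcond=None)
    rss = float(rss[0])
    tss = float(np.sum((dosage - dosage.mean()) ** 2))
    F = ((tss - rss) / p) / (rss / (n - p - 1))
    return stats.f.sf(F, p, n - p - 1)           # joint association p-value

rng = np.random.default_rng(7)
n = 2000
g = rng.binomial(2, 0.3, size=n).astype(float)   # SNP dosage (0, 1, 2)
phenos = np.column_stack([0.2 * g + rng.normal(size=n),
                          0.1 * g + rng.normal(size=n)])
print(reverse_regression(g, phenos))             # small p: joint association
```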

  20. Surveillance of industrial processes with correlated parameters

    DOEpatents

    White, A.M.; Gross, K.C.; Kubic, W.L.; Wigeland, R.A.

    1996-12-17

    A system and method for surveillance of an industrial process are disclosed. The system and method includes a plurality of sensors monitoring industrial process parameters, devices to convert the sensed data to computer compatible information and a computer which executes computer software directed to analyzing the sensor data to discern statistically reliable alarm conditions. The computer software is executed to remove serial correlation information and then calculate Mahalanobis distribution data to carry out a probability ratio test to determine alarm conditions. 10 figs.

  1. Environmental Health Monitor: Advanced Development of Temperature Sensor Suite.

    DTIC Science & Technology

    1995-07-30

    systems was implemented using program code existing at Veritay. The software, written in Microsoft® QuickBASIC, facilitated program changes for...currently unforeseen reason re-calibration is needed, this can be readily accommodated by a straightforward change in the software program---without...unit. A linear relationship between these differences was obtained using curve-fitting software. The ½-inch globe to 6-inch globe correlation was

  2. Comparison Of Pre-Operative Curvature With Postoperative Curvature In Root Canals Treated With K-3 Rotary Systems.

    PubMed

    Nagi, Sana Ehsen; Khan, Farhan Raza

    2017-01-01

    With root canal treatment, the organic debris and micro-organisms from the pulp space are removed and an ideal canal preparation is achieved that is conducive to hermetic obturation. The purpose of this study was to correlate the pre-operative canal curvature with the postoperative curvature in human extracted teeth prepared with the K-3 rotary system. Root canal preparation was carried out on extracted human molars and premolars using K-3 endodontic rotary files. Pre- and post-operative digital radiographs of the teeth were taken in order to compare pre- and post-operative canal curvature. The images were saved in an image retrieval system (Gendex software, USA). Change in the canal curvature was measured using the software measuring tool (Vixwin software, USA). Student's paired t-test and Pearson's correlation test were applied at the 0.05 level of significance. There is a statistically significant difference between pre-operative and post-operative canal curvature (p-value <0.001) and a strong positive correlation (91%) between pre-operative and post-operative canal curvature in teeth prepared with the K-3 rotary files. A significant difference between pre- and post-instrumentation curvature was found. Degree of canal curvature was not correlated with time taken for canal preparation.

  3. Novel Assessment of Interstitial Lung Disease Using the "Computer-Aided Lung Informatics for Pathology Evaluation and Rating" (CALIPER) Software System in Idiopathic Inflammatory Myopathies.

    PubMed

    Ungprasert, Patompong; Wilton, Katelynn M; Ernste, Floranne C; Kalra, Sanjay; Crowson, Cynthia S; Rajagopalan, Srinivasan; Bartholmai, Brian J

    2017-10-01

    To evaluate the correlation between measurements from quantitative thoracic high-resolution CT (HRCT) analysis with "Computer-Aided Lung Informatics for Pathology Evaluation and Rating" (CALIPER) software and measurements from pulmonary function tests (PFTs) in patients with idiopathic inflammatory myopathies (IIM)-associated interstitial lung disease (ILD). A cohort of patients with IIM-associated ILD seen at Mayo Clinic was identified from medical record review. Retrospective analysis of HRCT data and PFTs at baseline and 1 year was performed. The abnormalities in HRCT were quantified using CALIPER software. A total of 110 patients were identified. At baseline, total interstitial abnormalities as measured by CALIPER, both by absolute volume and by percentage of total lung volume, had a significant negative correlation with diffusing capacity for carbon monoxide (DLCO), total lung capacity (TLC), and oxygen saturation. Analysis by subtype of interstitial abnormality revealed significant negative correlations between ground glass opacities (GGO) and reticular density (RD) with DLCO and TLC. At one year, changes of total interstitial abnormalities compared with baseline had a significant negative correlation with changes of TLC and oxygen saturation. A negative correlation between changes of total interstitial abnormalities and DLCO was also observed, but it was not statistically significant. Analysis by subtype of interstitial abnormality revealed negative correlations between changes of GGO and RD and changes of DLCO, TLC, and oxygen saturation, but most of the correlations did not achieve statistical significance. CALIPER measurements correlate well with functional measurements in patients with IIM-associated ILD.

  4. Prospective comparison of speckle tracking longitudinal bidimensional strain between two vendors.

    PubMed

    Castel, Anne-Laure; Szymanski, Catherine; Delelis, François; Levy, Franck; Menet, Aymeric; Mailliet, Amandine; Marotte, Nathalie; Graux, Pierre; Tribouilloy, Christophe; Maréchaux, Sylvestre

    2014-02-01

    Speckle tracking is a relatively new, largely angle-independent technique used for the evaluation of myocardial longitudinal strain (LS). However, significant differences have been reported between LS values obtained by speckle tracking with the first generation of software products. To compare LS values obtained with the most recently released equipment from two manufacturers. Systematic scanning with head-to-head acquisition with no modification of the patient's position was performed in 64 patients with equipment from two different manufacturers, with subsequent off-line post-processing for speckle tracking LS assessment (Philips QLAB 9.0 and General Electric [GE] EchoPAC BT12). The interobserver variability of each software product was tested on a randomly selected set of 20 echocardiograms from the study population. GE and Philips interobserver coefficients of variation (CVs) for global LS (GLS) were 6.63% and 5.87%, respectively, indicating good reproducibility. Reproducibility was very variable for regional and segmental LS values, with CVs ranging from 7.58% to 49.21% with both software products. The concordance correlation coefficient (CCC) between GLS values was high at 0.95, indicating substantial agreement between the two methods. While good agreement was observed between midwall and apical regional strains with the two software products, basal regional strains were poorly correlated. The agreement between the two software products at a segmental level was very variable; the highest correlation was obtained for the apical cap (CCC 0.90) and the poorest for basal segments (CCC range 0.31-0.56). A high level of agreement and reproducibility for global but not for basal regional or segmental LS was found with two vendor-dependent software products. This finding may help to reinforce clinical acceptance of GLS in everyday clinical practice. Copyright © 2014 Elsevier Masson SAS. All rights reserved.

  5. Oscillating-Flow Regenerator Test Rig: Hardware and Theory With Derived Correlations for Screens and Felts

    NASA Technical Reports Server (NTRS)

    Gedeon, D.; Wood, J. G.

    1996-01-01

    A number of wire mesh and metal felt test samples, with a range of porosities, yield generic correlations for friction factor, Nusselt number, enhanced axial conduction ratio, and overall heat flux ratio. This information is directed primarily toward Stirling cycle regenerator modelers, but will be of use to anyone seeking to better model fluid flow through these porous materials. Behind these results lie an oscillating-flow test rig, which measures pumping dissipation and thermal energy transport in sample matrices, and several stages of data-reduction software, which correlate instantaneous values for the above dimensionless groups. Within the software, a theoretical model reduces instantaneous quantities from cycle-averaged measurables using standard parameter estimation techniques.

  6. On Correlated Failures in Survivable Storage Systems

    DTIC Science & Technology

    2002-05-01

    Littlewood, D.R. Miller, “Conceptual modeling of coincident failures in multiversion software”, IEEE Transactions on Software Engineering, Volume: 15 Issue...Recovery in Multiversion Software”. IEEE Transaction on Software Engineering, Vol. 16 No.3, March 1990 [Plank1997] J. Plank “A tutorial on Reed-Solomon

  7. Software-implemented fault insertion: An FTMP example

    NASA Technical Reports Server (NTRS)

    Czeck, Edward W.; Siewiorek, Daniel P.; Segall, Zary Z.

    1987-01-01

    This report presents a model for fault insertion through software; describes its implementation on a fault-tolerant computer, FTMP; presents a summary of fault detection, identification, and reconfiguration data collected with software-implemented fault insertion; and compares the results to hardware fault insertion data. Experimental results show detection time to be a function of time of insertion and system workload. For the fault detection time, there is no correlation between software-inserted faults and hardware-inserted faults; this is because hardware-inserted faults must manifest as errors before detection, whereas software-inserted faults immediately exercise the error detection mechanisms. In summary, the software-implemented fault insertion is able to be used as an evaluation technique for the fault-handling capabilities of a system in fault detection, identification and recovery. Although the software-inserted faults do not map directly to hardware-inserted faults, experiments show software-implemented fault insertion is capable of emulating hardware fault insertion, with greater ease and automation.

  8. Attributes Effecting Software Testing Estimation; Is Organizational Trust an Issue?

    ERIC Educational Resources Information Center

    Hammoud, Wissam

    2013-01-01

    This quantitative correlational research explored the potential association between levels of organizational trust and software testing estimation. This was done by exploring the relationships between organizational trust, tester expertise, organizational technology used, and the number of hours, number of testers, and time-coding…

  9. Improving the quality of care of patients with rheumatic disease using patient-centric electronic redesign software.

    PubMed

    Newman, Eric D; Lerch, Virginia; Billet, Jon; Berger, Andrea; Kirchner, H Lester

    2015-04-01

    Electronic health records (EHRs) are not optimized for chronic disease management. To improve the quality of care for patients with rheumatic disease, we developed electronic data capture, aggregation, display, and documentation software. The software integrated and reassembled information from the patient (via a touchscreen questionnaire), nurse, physician, and EHR into a series of actionable views. Core functions included trends over time, rheumatology-related demographics, and documentation for patient and provider. Quality measures collected included patient-reported outcomes, disease activity, and function. The software was tested and implemented in 3 rheumatology departments, and integrated into routine care delivery. Post-implementation evaluation measured adoption, efficiency, productivity, and patient perception. Over 2 years, 6,725 patients completed 19,786 touchscreen questionnaires. The software was adopted for use by 86% of patients and rheumatologists. Chart review and documentation time trended downward, and productivity increased by 26%. Patient satisfaction, activation, and adherence remained unchanged, although pre-implementation values were high. A strong correlation was seen between use of the software and disease control (weighted Pearson's correlation coefficient 0.5927, P = 0.0095), and a relative increase in patients with low disease activity of 3% per quarter was noted. We describe innovative software that aggregates, stores, and displays information vital to improving the quality of care for patients with chronic rheumatic disease. The software was well-adopted by patients and providers. Post-implementation, significant improvements in quality of care, efficiency of care, and productivity were demonstrated.

  10. Genome-wide study of correlations between genomic features and their relationship with the regulation of gene expression.

    PubMed

    Kravatsky, Yuri V; Chechetkin, Vladimir R; Tchurikov, Nikolai A; Kravatskaya, Galina I

    2015-02-01

    A broad class of tasks in genetics and epigenetics can be reduced to the study of various features that are distributed over the genome (genome tracks). The rapid and efficient processing of the huge amount of data stored in genome-scale databases cannot be achieved without software packages based on analytical criteria. However, the strong inhomogeneity of genome tracks hampers the development of relevant statistics. We developed criteria for the assessment of genome track inhomogeneity and of correlations between two genome tracks, as well as a software package, Genome Track Analyzer, based on this theory. The theory and software were tested on simulated data and were applied to the study of correlations between CpG islands and transcription start sites in the Homo sapiens genome, between profiles of protein-binding sites in chromosomes of Drosophila melanogaster, and between DNA double-strand breaks and histone marks in the H. sapiens genome. Significant correlations between transcription start sites on the forward and the reverse strands were observed in the genomes of D. melanogaster, Caenorhabditis elegans, Mus musculus, H. sapiens, and Danio rerio. The observed correlations may be related to the regulation of gene expression in eukaryotes. Genome Track Analyzer is freely available at http://ancorr.eimb.ru/.
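
    To give a feel for the underlying task, the sketch below correlates two genome tracks in the simplest possible way: feature positions are binned into fixed windows along a chromosome, and the binned counts are correlated. Genome Track Analyzer's actual criteria account for track inhomogeneity and are considerably more elaborate; the positions below are synthetic.

        import numpy as np

        def binned_counts(positions, chrom_length, bin_size):
            """Count features per fixed-size window along a chromosome."""
            edges = np.arange(0, chrom_length + bin_size, bin_size)
            counts, _ = np.histogram(positions, bins=edges)
            return counts

        chrom_length = 1_000_000
        rng = np.random.default_rng(0)
        tss = rng.integers(0, chrom_length, size=500)          # e.g., transcription start sites
        cpg = np.clip(tss + rng.integers(-2000, 2000, size=500),
                      0, chrom_length - 1)                     # e.g., nearby CpG islands

        a = binned_counts(tss, chrom_length, bin_size=10_000)
        b = binned_counts(cpg, chrom_length, bin_size=10_000)
        r = np.corrcoef(a, b)[0, 1]
        print(f"Pearson correlation between binned tracks: r = {r:.2f}")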

  11. Quantification of Abdominal Fat in Obese and Healthy Adolescents Using 3 Tesla Magnetic Resonance Imaging and Free Software for Image Analysis.

    PubMed

    Eloi, Juliana Cristina; Epifanio, Matias; de Gonçalves, Marília Maia; Pellicioli, Augusto; Vieira, Patricia Froelich Giora; Dias, Henrique Bregolin; Bruscato, Neide; Soder, Ricardo Bernardi; Santana, João Carlos Batista; Mouzaki, Marialena; Baldisserotto, Matteo

    2017-01-01

    Computed tomography, which uses ionizing radiation and expensive software packages for analysis of scans, can be used to quantify abdominal fat. The objective of this study is to measure abdominal fat with 3T MRI using free software for image analysis and to correlate these findings with anthropometric and laboratory parameters in adolescents. This prospective observational study included 24 overweight/obese and 33 healthy adolescents (mean age 16.55 years). All participants underwent abdominal MRI exams. Visceral and subcutaneous fat area and percentage were correlated with anthropometric parameters, lipid profile, glucose metabolism, and insulin resistance. Student's t test and Mann-Whitney's test was applied. Pearson's chi-square test was used to compare proportions. To determine associations Pearson's linear correlation or Spearman's correlation were used. In both groups, waist circumference (WC) was associated with visceral fat area (P = 0.001 and P = 0.01 respectively), and triglycerides were associated with fat percentage (P = 0.046 and P = 0.071 respectively). In obese individuals, total cholesterol/HDL ratio was associated with visceral fat area (P = 0.03) and percentage (P = 0.09), and insulin and HOMA-IR were associated with visceral fat area (P = 0.001) and percentage (P = 0.005). 3T MRI can provide reliable and good quality images for quantification of visceral and subcutaneous fat by using a free software package. The results demonstrate that WC is a good predictor of visceral fat in obese adolescents and visceral fat area is associated with total cholesterol/HDL ratio, insulin and HOMA-IR.

  12. Student Use of Scaffolding Software: Relationships with Motivation and Conceptual Understanding

    NASA Astrophysics Data System (ADS)

    Butler, Kyle A.; Lumpe, Andrew

    2008-10-01

    This study was designed to theoretically articulate and empirically assess the role of computer scaffolds. In this project, several examples of educational software were developed to scaffold the learning of students performing high level cognitive activities. The software used in this study, Artemis, focused on scaffolding the learning of students as they performed information seeking activities. As 5th grade students traveled through a project-based science unit on photosynthesis, researchers used a pre-post design to test for both student motivation and student conceptual understanding of photosynthesis. To measure both variables, a motivation survey and three methods of concept map analysis were used. The student use of the scaffolding features was determined using a database that tracked students' movement between scaffolding tools. The gain scores of each dependent variable was then correlated to the students' feature use (time and hits) embedded in the Artemis Interface. This provided the researchers with significant relationships between the scaffolding features represented in the software and student motivation and conceptual understanding of photosynthesis. There were a total of three significant correlations in comparing the scaffolding use by hits (clicked on) with the dependent variables and only one significant correlation when comparing the scaffold use in time. The first significant correlation ( r = .499, p < .05) was between the saving/viewing features hits and the students' task value. This correlation supports the assumption that there is a positive relationship between the student use of the saving/viewing features and the students' perception of how interesting, how important, and how useful the task is. The second significant correlation ( r = 0.553, p < 0.01) was between the searching features hits and the students' self-efficacy for learning and performance. This correlation supports the assumption that there is a positive relationship between the student use of the searching features and the students' perception of their ability to accomplish a task as well as their confidence in their skills to perform that task. The third significant correlation ( r = 0.519, p < 0.05) was between the collaborative features hits and the students' essay performance scores. This correlation supports the assumption that there is a positive relationship between the student use of the collaborative features and the students' ability to perform high cognitive tasks. Finally, the last significant correlation ( r = 0.576, p < 0.01) was between the maintenance features time and the qualitative analysis of the concept maps. This correlation supports the assumption that there is a positive relationship between the student use of the maintenance features and student conceptual understanding of photosynthesis.

  13. Average correlation clustering algorithm (ACCA) for grouping of co-regulated genes with similar pattern of variation in their expression values.

    PubMed

    Bhattacharya, Anindya; De, Rajat K

    2010-08-01

    Distance-based clustering algorithms can group genes that show similar expression values under multiple experimental conditions, but they are unable to identify a group of genes that have a similar pattern of variation in their expression values. Previously we developed an algorithm called the divisive correlation clustering algorithm (DCCA), based on the concept of correlation clustering, to tackle this situation. But this algorithm may also fail in certain cases. In order to overcome these situations, we propose a new clustering algorithm, called the average correlation clustering algorithm (ACCA), which is able to produce better clustering solutions than those produced by some other methods. ACCA is able to find groups of genes having more common transcription factors and a similar pattern of variation in their expression values. Moreover, ACCA is more efficient than DCCA with respect to execution time. Like DCCA, ACCA uses the concept of correlation clustering introduced by Bansal et al. ACCA uses the correlation matrix in such a way that all genes in a cluster have the highest average correlation values with the genes in that cluster. We have applied ACCA and some well-known conventional methods, including DCCA, to two artificial and nine gene expression datasets, and compared the performance of the algorithms. The clustering results of ACCA are found to be more significantly relevant to the biological annotations than those of the other methods. Analysis of the results shows the superiority of ACCA over the others in determining a group of genes having more common transcription factors and a similar pattern of variation in their expression profiles. Availability of the software: The software has been developed using the C and Visual Basic languages and can be executed on Microsoft Windows platforms. The software may be downloaded as a zip file from http://www.isical.ac.in/~rajat. It then needs to be installed. Two Word files (included in the zip file) should be consulted before installation and execution of the software.
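
    The core assignment rule the abstract describes (place each gene in the cluster with which it has the highest average correlation, and iterate until assignments stabilize) can be sketched in a few lines. This is an illustration of the principle only, not the authors' C/Visual Basic implementation, and the expression matrix below is synthetic.

        import numpy as np

        def acca_like(expr, k, n_iter=100, seed=0):
            """expr: genes x conditions matrix; k: number of clusters."""
            rng = np.random.default_rng(seed)
            corr = np.corrcoef(expr)                    # gene-gene correlation matrix
            labels = rng.integers(0, k, size=expr.shape[0])
            for _ in range(n_iter):
                new_labels = labels.copy()
                for g in range(expr.shape[0]):
                    # average correlation of gene g with each cluster's members
                    # (self-correlation is included; acceptable for a sketch)
                    avg = [corr[g, labels == c].mean() if np.any(labels == c) else -np.inf
                           for c in range(k)]
                    new_labels[g] = int(np.argmax(avg))
                if np.array_equal(new_labels, labels):  # assignments stabilized
                    break
                labels = new_labels
            return labels

        expr = np.random.default_rng(1).normal(size=(100, 12))  # synthetic expression data
        print(acca_like(expr, k=3)[:10])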

  14. Grayscale Optical Correlator Workbench

    NASA Technical Reports Server (NTRS)

    Hanan, Jay; Zhou, Hanying; Chao, Tien-Hsin

    2006-01-01

    Grayscale Optical Correlator Workbench (GOCWB) is a computer program for use in automatic target recognition (ATR). GOCWB performs ATR with an accurate simulation of a hardware grayscale optical correlator (GOC). This simulation is performed to test filters that are created in GOCWB. Thus, GOCWB can be used as a stand-alone ATR software tool or in combination with GOC hardware for building (target training), testing, and optimization of filters. The software is divided into three main parts, denoted filter, testing, and training. The training part is used for assembling training images as input to a filter. The filter part is used for combining training images into a filter and optimizing that filter. The testing part is used for testing new filters and for general simulation of GOC output. The current version of GOCWB relies on MATLAB binaries for matrix operations and fast Fourier transforms. Optimization of filters is based on an algorithm known as OT-MACH, in which variables specified by the user are parameterized and the best filter is selected on the basis of an average result for correct identification of targets in multiple test images.
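
    A commonly cited frequency-domain form of the OT-MACH filter is H = m / (alpha*C + beta*D + gamma*S), where m is the mean training-image spectrum, C the noise power spectrum, D the mean power spectral density of the training set, and S the spectral variance. The sketch below implements that textbook form; GOCWB's exact formulation and its weighting choices may differ, and the training images here are random stand-ins.

        import numpy as np

        def ot_mach(train_images, alpha=0.1, beta=0.1, gamma=0.8):
            """Build an OT-MACH-style filter in the frequency domain."""
            X = np.stack([np.fft.fft2(im) for im in train_images])  # training spectra
            m = X.mean(axis=0)                                      # mean spectrum
            C = np.ones_like(m.real)                                # white-noise PSD assumed
            D = (np.abs(X) ** 2).mean(axis=0)                       # mean power spectral density
            S = (np.abs(X - m) ** 2).mean(axis=0)                   # spectral variance
            return m / (alpha * C + beta * D + gamma * S)

        def correlate(filt, image):
            """Correlation-plane output; a sharp peak indicates a match."""
            return np.abs(np.fft.ifft2(np.conj(filt) * np.fft.fft2(image)))

        train = [np.random.rand(64, 64) for _ in range(5)]          # hypothetical target views
        plane = correlate(ot_mach(train), train[0])
        print("correlation peak:", plane.max())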

  15. Cobalt: A GPU-based correlator and beamformer for LOFAR

    NASA Astrophysics Data System (ADS)

    Broekema, P. Chris; Mol, J. Jan David; Nijboer, R.; van Amesfoort, A. S.; Brentjens, M. A.; Loose, G. Marcel; Klijn, W. F. A.; Romein, J. W.

    2018-04-01

    For low-frequency radio astronomy, software correlation and beamforming on general purpose hardware is a viable alternative to custom designed hardware. LOFAR, a new-generation radio telescope centered in the Netherlands with international stations in Germany, France, Ireland, Poland, Sweden and the UK, has successfully used software real-time processors based on IBM Blue Gene technology since 2004. Since then, developments in technology have allowed us to build a system based on commercial off-the-shelf components that combines the same capabilities with lower operational cost. In this paper, we describe the design and implementation of a GPU-based correlator and beamformer with the same capabilities as the Blue Gene based systems. We focus on the design approach taken, and show the challenges faced in selecting an appropriate system. The design, implementation and verification of the software system show the value of a modern test-driven development approach. Operational experience, based on three years of operations, demonstrates that a general purpose system is a good alternative to the previous supercomputer-based system or custom-designed hardware.
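
    For background, the "FX" correlation at the heart of such systems (channelize each station's voltage stream with an FFT, the F step, then cross-multiply and accumulate, the X step) fits in a few lines of NumPy. Cobalt adds polyphase filter banks, delay compensation, beamforming, and hand-tuned GPU kernels, none of which appears in this toy single-baseline sketch.

        import numpy as np

        def fx_correlate(x, y, nchan=256):
            """Time-averaged cross-spectrum (visibility) of two voltage streams."""
            nspec = min(len(x), len(y)) // nchan
            X = np.fft.fft(x[:nspec * nchan].reshape(nspec, nchan), axis=1)  # F step
            Y = np.fft.fft(y[:nspec * nchan].reshape(nspec, nchan), axis=1)
            return (X * np.conj(Y)).mean(axis=0)                             # X step

        rng = np.random.default_rng(0)
        common = rng.normal(size=100_000)                  # shared sky signal
        x = common + rng.normal(size=100_000)              # station 1: signal + noise
        y = np.roll(common, 3) + rng.normal(size=100_000)  # station 2: delayed copy + noise
        vis = fx_correlate(x, y)
        print("mean |visibility|:", np.abs(vis).mean())    # phase slope encodes the delay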

  16. GPU Based Software Correlators - Perspectives for VLBI2010

    NASA Technical Reports Server (NTRS)

    Hobiger, Thomas; Kimura, Moritaka; Takefuji, Kazuhiro; Oyama, Tomoaki; Koyama, Yasuhiro; Kondo, Tetsuro; Gotoh, Tadahiro; Amagai, Jun

    2010-01-01

    Owing to their historically separate development, and driven by the requirements of the PC gaming industry, Graphics Processing Units (GPUs) have evolved into massively parallel processing systems that have entered the arena of non-graphics applications. Although a single processing core on the GPU is much slower and provides less functionality than its counterpart on the CPU, the huge number of these small processing entities outperforms classical processors when the application can be parallelized. Thus, in recent years various radio astronomy projects have started to make use of this technology, either to realize the correlator on this platform or to establish the post-processing pipeline with GPUs. The feasibility of GPUs as a choice for a VLBI correlator is therefore being investigated, including the pros and cons of this technology. Additionally, a GPU-based software correlator is reviewed with respect to energy consumption per GFLOP/s and cost per GFLOP/s.

  17. CONTIN XPCS: Software for Inverse Transform Analysis of X-Ray Photon Correlation Spectroscopy Dynamics

    PubMed Central

    Andrews, Ross N; Narayanan, Suresh; Zhang, Fan; Kuzmenko, Ivan; Ilavsky, Jan

    2018-01-01

    X-ray photon correlation spectroscopy (XPCS) and dynamic light scattering (DLS) both reveal dynamics using coherent scattering, but X-rays permit investigation of dynamics in a much more diverse array of materials. Heterogeneous dynamics occur in many such materials, and we showed how classic tools employed in the analysis of heterogeneous DLS dynamics extend to XPCS, revealing additional information that conventional Kohlrausch exponential fitting obscures. This work presents the software implementation of inverse transform analysis of XPCS data, called CONTIN XPCS, an extension of traditional CONTIN that accommodates dynamics encountered in equilibrium XPCS measurements. PMID:29875507
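
    For context, the quantity such analyses start from is the intensity autocorrelation g2(tau) = <I(t)I(t+tau)> / <I>^2. The sketch below estimates g2 directly for a synthetic, exponentially correlated intensity trace; the regularized inverse transform that CONTIN XPCS then applies to g2 is not reproduced here.

        import numpy as np

        def g2(intensity, max_lag):
            """Direct (non-multi-tau) estimate of g2 for one detector pixel."""
            mean_sq = intensity.mean() ** 2
            return np.array([
                (intensity[:-lag] * intensity[lag:]).mean() / mean_sq
                for lag in range(1, max_lag + 1)
            ])

        # Hypothetical intensity trace with a single exponential correlation
        # time: an AR(1) process riding on a constant background.
        rng = np.random.default_rng(0)
        n, rate = 20_000, 0.05
        signal = np.zeros(n)
        for t in range(1, n):
            signal[t] = (1 - rate) * signal[t - 1] + rng.normal()
        intensity = 10.0 + signal
        print(np.round(g2(intensity, max_lag=5), 3))   # decays toward 1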

  18. New fully automated software for assessment of brachial artery flow-mediated dilation with advantages of continuous measurement.

    PubMed

    Ercan, Ertuğrul; Kırılmaz, Bahadır; Kahraman, İsmail; Bayram, Vildan; Doğan, Hüseyin

    2012-11-01

    Flow-mediated dilation (FMD) is used to evaluate endothelial function. Computer-assisted analysis utilizing edge detection permits continuous measurements along the vessel wall. We have developed a new fully automated software program to allow accurate and reproducible measurement. FMD was measured and analyzed in 18 coronary artery disease (CAD) patients and 17 controls, both manually and with the newly developed (computer-supported) software. The agreement between methods was assessed by Bland-Altman analysis. Mean age, body mass index, and cardiovascular risk factors were higher in the CAD group. Automated FMD% was 18.3±8.5 for the control subjects and 6.8±6.5 for the CAD group (p=0.0001). The intraobserver and interobserver correlations for the automated measurement were high (r=0.974, r=0.981, r=0.937, and r=0.918, respectively). Manual FMD% at the 60th second was correlated with automated FMD% (r=0.471, p=0.004). The new fully automated software can be used for precise measurement of FMD, with lower intra- and interobserver variability than manual assessment.
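
    The quantity itself is simple: FMD% is the percent change of the brachial artery diameter from its baseline mean to its post-occlusion peak. The sketch below computes it from a diameter time series; the paper's contribution, automated edge detection and continuous wall tracking to produce that series, is not shown, and the data are invented.

        import numpy as np

        def fmd_percent(diameters_mm, baseline_samples):
            """FMD% = (peak - baseline) / baseline * 100."""
            baseline = np.mean(diameters_mm[:baseline_samples])
            peak = np.max(diameters_mm[baseline_samples:])
            return (peak - baseline) / baseline * 100.0

        d = np.concatenate([np.full(60, 4.00),                 # 60 baseline frames at 4.00 mm
                            4.00 + 0.30 * np.hanning(120)])    # dilation up to ~4.30 mm
        print(f"FMD = {fmd_percent(d, baseline_samples=60):.1f}%")   # ~7.5%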

  19. A Survey On Management Of Software Engineering In Japan

    NASA Astrophysics Data System (ADS)

    Kadono, Yasuo; Tsubaki, Hiroe; Tsuruho, Seishiro

    2008-05-01

    The purpose of this study is to clarify how software engineering capabilities relate to the business performance of IT vendors in Japan. To do this, we developed a structural model using factors related to software engineering, business performance, and the competitive environment. By analyzing data collected from 78 major IT vendors in Japan, we found that superior deliverables and business performance were correlated with the effort expended particularly on human resource development, quality assurance, research and development, and process improvement.

  1. Specification-based software sizing: An empirical investigation of function metrics

    NASA Technical Reports Server (NTRS)

    Jeffery, Ross; Stathis, John

    1993-01-01

    For some time the software industry has espoused the need for improved specification-based software size metrics. This paper reports on a study of nineteen recently developed systems in a variety of application domains. The systems were developed by a single software services corporation using a variety of languages. The study investigated several metric characteristics. It shows that earlier research into inter-item correlation within the overall function count is partially supported; that a priori function counts, in themselves, do not explain the majority of the effort variation in software development in the organization studied; that documentation quality is critical to accurate function identification; and that rater error is substantial in manual function counting. The implications of these findings for organizations using function-based metrics are explored.

  2. A semi-automated volumetric software for segmentation and perfusion parameter quantification of brain tumors using 320-row multidetector computed tomography: a validation study.

    PubMed

    Chae, Soo Young; Suh, Sangil; Ryoo, Inseon; Park, Arim; Noh, Kyoung Jin; Shim, Hackjoon; Seol, Hae Young

    2017-05-01

    We developed a semi-automated volumetric software tool, NPerfusion, to segment brain tumors and quantify perfusion parameters on whole-brain CT perfusion (WBCTP) images. The purpose of this study was to assess the feasibility of the software and to validate its performance against manual segmentation. Twenty-nine patients with pathologically proven brain tumors who underwent preoperative WBCTP between August 2012 and February 2015 were included. Three perfusion parameters of brain tumors, arterial flow (AF), equivalent blood volume (EBV), and Patlak flow (PF, a measure of capillary permeability), were generated by a commercial software package and then quantified volumetrically by NPerfusion, which also semi-automatically segmented tumor boundaries. The quantification was validated by comparison with manual segmentation in terms of the concordance correlation coefficient and Bland-Altman analysis. With NPerfusion, we successfully performed segmentation and quantified whole volumetric perfusion parameters of all 29 brain tumors, which showed perfusion trends consistent with previous studies. The validation of the perfusion parameter quantification exhibited almost perfect agreement with manual segmentation, with Lin concordance correlation coefficients (ρc) for AF, EBV, and PF of 0.9988, 0.9994, and 0.9976, respectively. On Bland-Altman analysis, most differences between this software and manual segmentation on the commercial software were within the limits of agreement. NPerfusion successfully performs segmentation of brain tumors and calculates their perfusion parameters. We validated this semi-automated segmentation software by comparing it with manual segmentation. NPerfusion can be used to calculate volumetric perfusion parameters of brain tumors from WBCTP.
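
    The Bland-Altman part of this validation is easy to make concrete: the limits of agreement are the mean difference between paired methods plus or minus 1.96 standard deviations of the differences. A minimal sketch, with invented paired values, follows.

        import numpy as np

        def bland_altman_limits(a, b):
            """Return (bias, lower limit, upper limit) for paired methods a, b."""
            diff = np.asarray(a, float) - np.asarray(b, float)
            bias = diff.mean()
            spread = 1.96 * diff.std(ddof=1)
            return bias, bias - spread, bias + spread

        auto   = [12.1, 15.3, 9.8, 20.2, 14.0]   # e.g., semi-automated values (hypothetical)
        manual = [12.4, 15.0, 10.1, 19.8, 14.3]  # e.g., manual segmentation (hypothetical)
        bias, lo, hi = bland_altman_limits(auto, manual)
        print(f"bias = {bias:.2f}, limits of agreement = [{lo:.2f}, {hi:.2f}]")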

  3. A Highly Accurate Face Recognition System Using Filtering Correlation

    NASA Astrophysics Data System (ADS)

    Watanabe, Eriko; Ishikawa, Sayuri; Kodate, Kashiko

    2007-09-01

    The authors previously constructed a highly accurate fast face recognition optical correlator (FARCO) [E. Watanabe and K. Kodate: Opt. Rev. 12 (2005) 460] and subsequently developed an improved, super-high-speed FARCO (S-FARCO), which is able to process several hundred thousand frames per second. The principal advantage of the new system is its wide applicability to any correlation scheme. Three different configurations were proposed, each depending on correlation speed. This paper describes and evaluates a software correlation filter. The face recognition function proved highly accurate, even though a low-resolution facial image size (64 × 64 pixels) was used. An operation speed of less than 10 ms was achieved using a personal computer with a 3 GHz central processing unit (CPU) and 2 GB of memory. When the software correlation filter was applied to a high-security cellular phone face recognition system, experiments on 30 female students over a period of three months yielded low error rates: a 0% false acceptance rate and a 2% false rejection rate. The filtering correlation therefore works effectively when applied to low-resolution images such as web-based images or faces captured by a monitoring camera.
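
    The sketch below shows the generic FFT-based correlation that any software correlation filter builds on: correlate a probe image against an enrolled reference and threshold the normalized correlation peak. The actual S-FARCO filter design is considerably more sophisticated, and the images here are random stand-ins for faces.

        import numpy as np

        def correlation_peak(reference, probe):
            """Peak of the normalized, mean-removed cross-correlation plane."""
            r = reference - reference.mean()
            p = probe - probe.mean()
            plane = np.abs(np.fft.ifft2(np.conj(np.fft.fft2(r)) * np.fft.fft2(p)))
            return plane.max() / np.sqrt((r**2).sum() * (p**2).sum())

        rng = np.random.default_rng(0)
        enrolled = rng.random((64, 64))                   # 64 x 64 pixels, as in the paper
        same = enrolled + 0.05 * rng.random((64, 64))     # same "face", slight noise
        other = rng.random((64, 64))                      # different "face"
        # The enrolled face should correlate far more strongly with itself
        print(correlation_peak(enrolled, same) > correlation_peak(enrolled, other))  # True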

  4. HIGH-RATE FORMABILITY OF HIGH-STRENGTH ALUMINUM ALLOYS: A STUDY ON OBJECTIVITY OF MEASURED STRAIN AND STRAIN RATE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Upadhyay, Piyush; Rohatgi, Aashish; Stephens, Elizabeth V.

    2015-02-18

    Al alloy AA7075 sheets were deformed at room temperature at strain rates exceeding 1000/s using the electrohydraulic forming (EHF) technique. A method that combines high-speed imaging and the digital image correlation technique, developed at Pacific Northwest National Laboratory, is used to investigate the high strain rate deformation behavior of AA7075. For strain-rate-sensitive materials, the ability to accurately model high-rate deformation behavior depends on the ability to accurately quantify the strain rate that the material is subjected to. This work investigates the objectivity of software-calculated strain and strain rate by varying different parameters within commonly used, commercially available digital image correlation software. Except very close to the time of crack opening, the calculated strain and strain rates are very consistent and independent of the adjustable parameters of the software.

  5. Reliability and accuracy of three imaging software packages used for 3D analysis of the upper airway on cone beam computed tomography images.

    PubMed

    Chen, Hui; van Eijnatten, Maureen; Wolff, Jan; de Lange, Jan; van der Stelt, Paul F; Lobbezoo, Frank; Aarab, Ghizlane

    2017-08-01

    The aim of this study was to assess the reliability and accuracy of three different imaging software packages for three-dimensional analysis of the upper airway using CBCT images. To assess the reliability of the software packages, 15 NewTom 5G® (QR Systems, Verona, Italy) CBCT data sets were randomly and retrospectively selected. Two observers measured the volume, minimum cross-sectional area, and length of the upper airway using the Amira® (Visage Imaging Inc., Carlsbad, CA), 3Diagnosys® (3diemme, Cantu, Italy), and OnDemand3D® (CyberMed, Seoul, Republic of Korea) software packages. The intra- and inter-observer reliability of the upper airway measurements was determined using intraclass correlation coefficients and Bland-Altman agreement tests. To assess the accuracy of the software packages, one NewTom 5G® CBCT data set was used to print a three-dimensional anthropomorphic phantom with known dimensions to serve as the gold standard. This phantom was subsequently scanned using a NewTom 5G® scanner. Based on the CBCT data set of the phantom, one observer measured the volume, minimum cross-sectional area, and length of the upper airway using Amira®, 3Diagnosys®, and OnDemand3D®, and compared these measurements with the gold standard. The intra- and inter-observer reliability of the upper airway measurements made with the different software packages was excellent (intraclass correlation coefficient ≥0.75), and there was excellent agreement among the three software packages in volume, minimum cross-sectional area, and length measurements. All software packages underestimated the upper airway volume by -8.8% to -12.3%, the minimum cross-sectional area by -6.2% to -14.6%, and the length by -1.6% to -2.9%. All three software packages offered reliable volume, minimum cross-sectional area, and length measurements of the upper airway; the length measurements were the most accurate in all packages, and all packages underestimated the dimensions of the anthropomorphic phantom.

  6. A Novel Survey to Examine the Relationship between Health IT Adoption and Nurse-Physician Communication.

    PubMed

    Holmgren, A Jay; Pfeifer, Eric; Manojlovich, Milisa; Adler-Milstein, Julia

    2016-12-21

    As EHR adoption in US hospitals becomes ubiquitous, a wide range of IT options is theoretically available to facilitate physician-nurse communication, but we know little about the adoption rates of specific technologies or the impact of their use. The objectives of this study were to measure the adoption of hardware, software, and telephony relevant to nurse-physician communication in US hospitals; to assess the relationship between non-IT communication practices and hardware, software, and telephony adoption; and to identify hospital characteristics associated with greater adoption of hardware, software, telephony, and non-IT communication practices. We conducted a survey of 105 hospitals in the National Nursing Practice Network. The survey captured adoption of hardware, software, and telephony to support nurse-physician communication, along with non-IT communication practices. We calculated descriptive statistics and then created four indices, one for each category, by scoring the degree of adoption of technologies or practices within each category. Next, we examined correlations between the three technology indices and the non-IT communication practices index. We used multivariate OLS regression to assess whether certain types of hospitals had higher index scores. The majority of hospitals surveyed have a range of hardware, software, and telephony tools available to support nurse-physician communication; we found substantial heterogeneity across hospitals in non-IT communication practices. More intensive non-IT communication was associated with greater adoption of software (r = 0.31, p = 0.01) but was not correlated with hardware or telephony. Medium-sized hospitals had lower adoption of software (r = -1.14, p = 0.04) than small hospitals, while federally owned hospitals had lower software (r = -2.57, p = 0.02) and hardware adoption (r = -1.63, p = 0.01). The positive relationship between non-IT communication and level of software adoption suggests a complementary, rather than substitutive, relationship. Our results suggest that some technologies with the potential to further enhance communication, such as CPOE and secure messaging, are not being utilized to their full potential in many hospitals.

  7. Relationship between postural alignment in sitting by photogrammetry and seated postural control in post-stroke subjects.

    PubMed

    Iyengar, Y R; Vijayakumar, K; Abraham, J M; Misri, Z K; Suresh, B V; Unnikrishnan, B

    2014-01-01

    This study was conducted to examine the correlation between postural alignment in sitting, measured through photogrammetry, and postural control in sitting following stroke. A cross-sectional study with convenience sampling included 45 subjects with acute and sub-acute stroke. Postural alignment in sitting was measured through photogrammetry, and the relevant angles were obtained with the software MB Ruler (version 5.0). Seated postural control was measured with the Function in Sitting Test (FIST). Correlations were obtained using Spearman's rank correlation coefficient in SPSS software (version 17.0). A moderate positive correlation (r = 0.385; p < 0.01) was found between the angle of lordosis and the angle between the acromion, the lateral epicondyle, and the point between the radius and ulna. A strong negative correlation (r = -0.435; p < 0.01) was found between the cranio-vertebral angle and kyphosis. FIST showed a moderate positive correlation (r = 0.3446; p < 0.05) with the cranio-vertebral angle and a strong positive correlation (r = 0.4336; p < 0.01) with the Brunnstrom stage of recovery in the upper extremity. The degree of forward head posture in sitting correlates directly with seated postural control and inversely with the degree of kyphosis in sitting post-stroke. Postural control in sitting post-stroke is directly related to the Brunnstrom stage of recovery of the affected upper extremity in sitting.

  8. The Relationship between Gender and Students' Attitude and Experience of Using a Mathematical Software Program (MATLAB)

    ERIC Educational Resources Information Center

    Ocak, Mehmet A.

    2006-01-01

    This correlation study examined the relationship between gender and the students' attitude and prior knowledge of using one of the mathematical software programs (MATLAB). Participants were selected from one community college, one state university and one private college. Students were volunteers from three Calculus I classrooms (one class from…

  9. Comparison of two kinds of skin imaging analysis software: VISIA® from Canfield and IPP® from Media Cybernetics.

    PubMed

    Wang, X; Shu, X; Li, Z; Huo, W; Zou, L; Tang, Y; Li, L

    2018-01-27

    Skin imaging analysis, acting as a supplement to noninvasive bioengineering devices, has been widely used in medical cosmetology and cosmetic product evaluation. The main aim of this study was to assess the differences and correlations between two commercially available image analysis software packages in measuring skin spots, wrinkles, vascular features, porphyrins, and pores. Seventy healthy women were included in the study. Before pictures were taken, a dermatologist evaluated the subjects' skin condition. Test sites included the forehead, cheek, and periorbital skin. A 2 × 2 cm piece of cardboard was used to make a mark on the skin surface. Pictures were taken using VISIA® under three lighting conditions and analyzed using VISIA® and IPP®, respectively. (1) Skin pore, red area, ultraviolet spot, brown spot, porphyrin, and wrinkle measurements from VISIA® were correlated with those from IPP® (P < .01). (2) Spot, wrinkle, fine line, brown spot, and red area values analyzed with VISIA® were correlated with age on the forehead and periorbital skin (P < .05); L-value, crow's feet, ultraviolet spot, brown spot, and red area values analyzed with IPP® were correlated with age on the periorbital skin (P < .05). (3) L-value, spot, wrinkle, fine line, porphyrin, red area, and pore values analyzed with VISIA® and IPP® showed correlations with the subjective evaluation scores (P < .05). VISIA® and IPP® showed acceptable correlation in measuring various skin conditions. VISIA® showed high sensitivity when measuring the forehead skin, and IPP® is available as an alternative software program for evaluating skin features.

  10. Nasendoscopy: an analysis of measurement uncertainties.

    PubMed

    Gilleard, Onur; Sommerlad, Brian; Sell, Debbie; Ghanem, Ali; Birch, Malcolm

    2013-05-01

    Objective: The purpose of this study was to analyze the optical characteristics of two different nasendoscopes used to assess velopharyngeal insufficiency and to quantify the measurement uncertainties that will occur in a typical set of clinical data. Design: The magnification and barrel distortion associated with nasendoscopy were estimated by using computer software to analyze the apparent dimensions of a spatially calibrated test object at varying object-lens distances. In addition, a method of semiquantitative analysis of velopharyngeal closure using nasendoscopy and computer software is described. To calculate the reliability of this method, 10 nasendoscopy examinations were analyzed twice by three separate operators, and intraoperator and interoperator agreement was evaluated using Pearson's r correlation coefficient. Results: Over an object-lens distance of 9 mm, magnification caused the visualized dimensions of the test object to increase by 80%. In addition, the dimensions of objects visualized in the far-peripheral field of the nasendoscopic examinations appeared approximately 40% smaller than those visualized in the central field. Using computer software to analyze velopharyngeal closure, the mean correlation coefficient for intrarater reliability was .94 and for interrater reliability was .90. Conclusion: Using a custom-designed apparatus, the effect of object-lens distance on the magnification of nasendoscopic images has been quantified. Barrel distortion has also been quantified and was found to be independent of object-lens distance. Using computer software to analyze clinical images, the intraoperator and interoperator correlations appear to show that ratiometric measurements are reliable.

  11. A Visual Basic simulation software tool for performance analysis of a membrane-based advanced water treatment plant.

    PubMed

    Pal, P; Kumar, R; Srivastava, N; Chaudhuri, J

    2014-02-01

    A Visual Basic simulation software package (WATTPPA) has been developed to analyze the performance of an advanced wastewater treatment plant. This user-friendly, menu-driven software is based on a dynamic mathematical model for an industrial wastewater treatment scheme that integrates chemical, biological, and membrane-based unit operations. The software-predicted results agree very well with the experimental findings, as indicated by an overall correlation coefficient of the order of 0.99. The software permits pre-analysis and manipulation of input data, helps in optimization, and displays the performance of an integrated plant visually on a graphical platform. It allows quick performance analysis of the whole system as well as of the individual units. The software, the first of its kind in its domain and built in the well-known Microsoft Excel environment, is likely to be very useful in the successful design, optimization, and operation of an advanced hybrid treatment plant for hazardous wastewater.

  12. Tsukuba VLBI Correlator

    NASA Technical Reports Server (NTRS)

    Kurihara, Shinobu; Nozawa, Kentaro

    2013-01-01

    The K5/VSSP software correlator, located in Tsukuba, Japan, is operated by the Geospatial Information Authority of Japan (GSI). It is fully dedicated to processing the geodetic VLBI sessions of the International VLBI Service for Geodesy and Astrometry. All of the weekend IVS Intensives (INT2) and the Japanese domestic VLBI observations organized by GSI were processed at the Tsukuba VLBI Correlator.

  13. Validation of a free software for unsupervised assessment of abdominal fat in MRI.

    PubMed

    Maddalo, Michele; Zorza, Ivan; Zubani, Stefano; Nocivelli, Giorgio; Calandra, Giulio; Soldini, Pierantonio; Mascaro, Lorella; Maroldi, Roberto

    2017-05-01

    To demonstrate the accuracy of an unsupervised (fully automated) software tool for fat segmentation in magnetic resonance imaging. The proposed software is a freeware solution developed in ImageJ that enables the quantification of metabolically different adipose tissues in large cohort studies. The lumbar part of the abdomen (19 cm in the craniocaudal direction, centered on L3) of eleven healthy volunteers (age range: 21-46 years, BMI range: 21.7-31.6 kg/m²) was examined in a breath hold on expiration with a GE T1 Dixon sequence. Single-slice and volumetric data were considered for each subject. The results of the visceral and subcutaneous adipose tissue assessments obtained by the unsupervised software were compared to supervised segmentations of reference. The associated statistical analysis included Pearson correlations, Bland-Altman plots, and volumetric differences (VD%). Values calculated by the unsupervised software correlated significantly with the corresponding supervised segmentations of reference for both subcutaneous adipose tissue (SAT; R = 0.9996, p < 0.001) and visceral adipose tissue (VAT; R = 0.995, p < 0.001). Bland-Altman plots showed the absence of systematic errors and a limited spread of the differences. In the single-slice analysis, VD% was (1.6±2.9)% for SAT and (4.9±6.9)% for VAT; in the volumetric analysis, VD% was (1.3±0.9)% for SAT and (2.9±2.7)% for VAT. The developed software is capable of segmenting the metabolically different adipose tissues with a high degree of accuracy. This free add-on software for ImageJ can easily achieve widespread use and enable large-scale population studies regarding adipose tissue and its related diseases.

  14. Statistical correlation of structural mode shapes from test measurements and NASTRAN analytical values

    NASA Technical Reports Server (NTRS)

    Purves, L.; Strang, R. F.; Dube, M. P.; Alea, P.; Ferragut, N.; Hershfeld, D.

    1983-01-01

    The software and procedures of a system of programs used to generate a report of the statistical correlation between NASTRAN modal analysis results and physical test results from modal surveys are described. Topics discussed include: a mathematical description of statistical correlation, a user's guide for generating a statistical correlation report, a programmer's guide describing the organization and functions of the individual programs leading to a statistical correlation report, and a set of examples including complete listings of programs and input and output data.
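
    The report predates today's standard naming, but the statistic most commonly used now for correlating measured and analytical mode shapes is the modal assurance criterion (MAC). The sketch below computes a MAC matrix as a stand-in for the kind of test/analysis correlation the report describes; it is not taken from the report itself, and the mode shapes are synthetic.

        import numpy as np

        def mac(phi_test, phi_fem):
            """MAC matrix between the columns (mode shapes) of two matrices."""
            num = np.abs(phi_test.T @ phi_fem) ** 2
            den = np.outer(np.sum(phi_test * phi_test, axis=0),
                           np.sum(phi_fem * phi_fem, axis=0))
            return num / den

        rng = np.random.default_rng(0)
        fem = rng.normal(size=(50, 3))                 # 3 analytical mode shapes, 50 DOFs
        test = fem + 0.1 * rng.normal(size=(50, 3))    # noisy "measured" shapes
        print(np.round(mac(test, fem), 2))             # near-identity => well correlated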

  15. Development of automatic visceral fat volume calculation software for CT volume data.

    PubMed

    Nemoto, Mitsutaka; Yeernuer, Tusufuhan; Masutani, Yoshitaka; Nomura, Yukihiro; Hanaoka, Shouhei; Miki, Soichiro; Yoshikawa, Takeharu; Hayashi, Naoto; Ohtomo, Kuni

    2014-01-01

    To develop automatic visceral fat volume calculation software for computed tomography (CT) volume data and to evaluate its feasibility. A total of 24 sets of whole-body CT volume data and anthropometric measurements were obtained, with three sets for each of four BMI categories (under 20, 20 to 25, 25 to 30, and over 30) in both sexes. True visceral fat volumes were defined on the basis of manual segmentation of the whole-body CT volume data by an experienced radiologist. Software to automatically calculate visceral fat volumes was developed using a region segmentation technique based on morphological analysis with a CT value threshold. Automatically calculated visceral fat volumes were evaluated in terms of the correlation coefficient with the true volumes and the error relative to the true volume. Automatic visceral fat volume calculation results were obtained successfully for all 24 data sets, and the average calculation time was 252.7 seconds per case. The correlation coefficients between the true visceral fat volumes and the automatically calculated visceral fat volumes were over 0.999. The newly developed software calculates visceral fat volumes in a reasonable time and was shown to have high accuracy.
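
    The basic operations named in the abstract, thresholding CT values in the adipose range and cleaning the result with morphological analysis, can be sketched as below. The Hounsfield window of roughly -190 to -30 HU is a commonly used adipose range, not necessarily the authors' threshold, and separating visceral from subcutaneous fat (which requires delineating the abdominal wall) is not shown.

        import numpy as np
        from scipy import ndimage

        def fat_mask(ct_hu, lo=-190, hi=-30):
            """Binary adipose-tissue mask from a CT volume in Hounsfield units."""
            mask = (ct_hu >= lo) & (ct_hu <= hi)
            mask = ndimage.binary_opening(mask, iterations=2)   # remove speckle
            mask = ndimage.binary_closing(mask, iterations=2)   # fill small gaps
            return mask

        def fat_volume_ml(mask, voxel_mm3):
            """Convert voxel count to milliliters."""
            return mask.sum() * voxel_mm3 / 1000.0

        ct = np.random.normal(-60, 120, size=(40, 128, 128))    # synthetic stand-in volume
        print(f"fat volume ~ {fat_volume_ml(fat_mask(ct), voxel_mm3=0.8*0.8*5.0):.0f} mL")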

  16. Framework for Small-Scale Experiments in Software Engineering: Guidance and Control Software Project: Software Engineering Case Study

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J.

    1998-01-01

    Software is becoming increasingly significant in today's critical avionics systems. To achieve safe, reliable software, government regulatory agencies such as the Federal Aviation Administration (FAA) and the Department of Defense mandate the use of certain software development methods. However, little scientific evidence exists to show a correlation between software development methods and product quality. Given this lack of evidence, a series of experiments has been conducted to understand why and how software fails. The Guidance and Control Software (GCS) project is the latest in this series. The GCS project is a case study of the Requirements and Technical Concepts for Aviation RTCA/DO-178B guidelines, Software Considerations in Airborne Systems and Equipment Certification. All civil transport airframe and equipment vendors are expected to comply with these guidelines in building systems to be certified by the FAA for use in commercial aircraft. For the case study, two implementations of a guidance and control application were developed to comply with the DO-178B guidelines for Level A (critical) software. The development included the requirements, design, coding, verification, configuration management, and quality assurance processes. This paper discusses the details of the GCS project and presents the results of the case study.

  17. Quantitative comparison and evaluation of software packages for assessment of abdominal adipose tissue distribution by magnetic resonance imaging.

    PubMed

    Bonekamp, S; Ghosh, P; Crawford, S; Solga, S F; Horska, A; Brancati, F L; Diehl, A M; Smith, S; Clark, J M

    2008-01-01

    To examine five available software packages for the assessment of abdominal adipose tissue with magnetic resonance imaging, compare their features, and assess the reliability of their measurement results. Feature evaluation and test-retest reliability of software packages (NIHImage, SliceOmatic, Analyze, HippoFat, and EasyVision) used in manual, semi-automated, or automated segmentation of abdominal adipose tissue. A random sample of 15 obese adults with type 2 diabetes. Axial T1-weighted spin echo images centered at the vertebral bodies of L2-L3 were acquired at 1.5 T. The five software packages were evaluated comparing manual, semi-automated, and automated segmentation approaches. Images were segmented into cross-sectional area (CSA) and the areas of visceral (VAT) and subcutaneous adipose tissue (SAT). Ease of learning and use and the design of the graphical user interface (GUI) were rated. Intra-observer accuracy and agreement between the software packages were calculated using intra-class correlation. The intra-class correlation coefficient was used to obtain test-retest reliability. Three of the five evaluated programs offered a semi-automated technique to segment the images based on histogram values or a user-defined threshold. One software package allowed manual delineation only. One fully automated program demonstrated the drawbacks of uncritical automated processing. The semi-automated approaches reduced variability and measurement error and improved reproducibility. There was no significant difference in the intra-observer agreement in SAT and CSA. The VAT measurements showed significantly lower test-retest reliability. There were some differences between the software packages in qualitative aspects, such as user friendliness. Four of the five packages provided essentially the same results with respect to inter- and intra-rater reproducibility. Our results using SliceOmatic, Analyze, or NIHImage were comparable, and these packages could be used interchangeably. Newly developed fully automated approaches should be compared to one of the examined software packages.

  18. Measurement and analysis of operating system fault tolerance

    NASA Technical Reports Server (NTRS)

    Lee, I.; Tang, D.; Iyer, R. K.

    1992-01-01

    This paper demonstrates a methodology to model and evaluate the fault tolerance characteristics of operational software. The methodology is illustrated through case studies on three different operating systems: the Tandem GUARDIAN fault-tolerant system, the VAX/VMS distributed system, and the IBM/MVS system. Measurements are made on these systems for substantial periods to collect software error and recovery data. In addition to investigating basic dependability characteristics such as major software problems and error distributions, we develop two levels of models to describe error and recovery processes inside an operating system and on multiple instances of an operating system running in a distributed environment. Based on the models, reward analysis is conducted to evaluate the loss of service due to software errors and the effect of the fault-tolerance techniques implemented in the systems. Software error correlation in multicomputer systems is also investigated.

  19. An Evaluation of Jet Impingement Heat Transfer Correlations for Piccolo Tube Application

    NASA Technical Reports Server (NTRS)

    Bond, Thomas (Technical Monitor); Wright, William B.

    2004-01-01

    Impinging jets have been used for a wide variety of applications where high rates of heat transfer are desired. This report will present a review of heat transfer correlations that have been published. The correlations were then added to the LEWICE software to evaluate the applicability of these correlations to a piccolo tube anti-icing system. The results of this analysis were then compared quantitatively to test results on a representative piccolo tube system.

  1. [Dental arch form reverting by four-point method].

    PubMed

    Pan, Xiao-Gang; Qian, Yu-Fen; Weng, Si-En; Feng, Qi-Ping; Yu, Quan

    2008-04-01

    To explore a simple method of reverting an individual dental arch form template for wire bending. The individual dental arch form was reverted by a four-point method: by defining the central points of the brackets on the bilateral lower second premolars and first molars, a certain individual dental arch form could be generated. The arch form generating procedure was then developed into computer software for printing the arch form. The four-point arch form was evaluated by comparison with direct model measurement on linear and angular parameters. Accuracy and reproducibility were assessed by the paired t test and the concordance correlation coefficient with the Medcalc 9.3 software package. The arch form generated by the four-point method was of good accuracy and reproducibility (linear concordance correlation coefficient 0.9909; angular concordance correlation coefficient 0.8419). The dental arch form reverted by the four-point method can reproduce the individual dental arch form.

  2. Comprehensive analysis of T cell epitope discovery strategies using 17DD yellow fever virus structural proteins and BALB/c (H2d) mice model.

    PubMed

    Maciel, Milton; Kellathur, Srinivasan N; Chikhlikar, Pryia; Dhalia, Rafael; Sidney, John; Sette, Alessandro; August, Thomas J; Marques, Ernesto T A

    2008-08-15

    Immunomics research uses in silico epitope prediction as well as in vivo and in vitro approaches. We inoculated BALB/c (H2d) mice with the 17DD yellow fever vaccine to investigate the correlations between approaches used for epitope discovery: ELISPOT assays, binding assays, and prediction software. Our results showed good agreement between the ELISPOT and binding assays, which seemed to correlate with protein immunogenicity. The PREDBALB/c prediction software partially agreed with the ELISPOT and binding assay results but presented low specificity. Using prediction software to exclude peptides containing no epitopes, followed by high-throughput screening of the remaining peptides by ELISPOT, with MHC-binding assays to characterize the MHC restrictions, proved to be an efficient strategy. The results allowed the characterization of 2 MHC class I and 17 class II epitopes in the envelope protein of the YF virus in BALB/c (H2d) mice.

  3. Comparison between a new computer program and the reference software for gray-scale median analysis of atherosclerotic carotid plaques.

    PubMed

    Casella, Ivan Benaduce; Fukushima, Rodrigo Bono; Marques, Anita Battistini de Azevedo; Cury, Marcus Vinícius Martins; Presti, Calógero

    2015-03-01

    To compare a new dedicated software program and Adobe Photoshop for gray-scale median (GSM) analysis of B-mode images of carotid plaques. A series of 42 carotid plaques generating ≥50% diameter stenosis was evaluated by a single observer. The best segment for visualization of the internal carotid artery plaque was identified on a single longitudinal view, and images were recorded in JPEG format. Plaque analysis was performed with both programs. After normalization of image intensity (blood = 0, adventitial layer = 190), histograms were obtained after manual delineation of the plaque. Results were compared with the nonparametric Wilcoxon signed rank test and Kendall tau-b correlation analysis. GSM ranged from 0 to 100 with Adobe Photoshop and from 0 to 96 with IMTPC, with a high degree of similarity between image pairs and a highly significant correlation (R = 0.94, p < .0001). The IMTPC software appears suitable for GSM analysis of carotid plaques.

  4. A Correlational Study between IT Governance and the Effect on Strategic Management Functioning among Senior & Middle Management in Medium Scale Software Organizations

    ERIC Educational Resources Information Center

    Kurien, Sam

    2013-01-01

    The purpose of the study was to explore whether there are relationships between elements of information technology (IT) governance, strategic planning, and strategic functions among senior and mid-level management at medium-scale software development firms. Several topics and models of the IT governance literature were discussed and the gap in…

  5. SU-F-R-45: The Prognostic Value of Radiotherapy Based On the Changes of Texture Features Between Pre-Treatment and Post-Treatment FDG PET Image for NSCLC Patients

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ma, C; Yin, Y

    Purpose: The purpose of this research was to investigate which texture features extracted from FDG-PET images by the gray-level co-occurrence matrix (GLCM) have a higher prognostic value than the others. Methods: 21 non-small cell lung cancer (NSCLC) patients were enrolled in the study. Patients underwent 18F-FDG PET/CT scans both pre-treatment and post-treatment. First, the tumors were extracted by our in-house software. Second, the clinical features, including maximum SUV and tumor volume, were extracted with MIM Vista software, and the texture features, including angular second moment, contrast, inverse difference moment, entropy, and correlation, were extracted using MATLAB. The differences were calculated by subtracting the pre-treatment features from the post-treatment features. Finally, SPSS software was used to obtain the Pearson correlation coefficients and Spearman rank correlation coefficients between the change ratios of the texture features and the change ratios of the clinical features. Results: The Pearson and Spearman rank correlation coefficients between contrast and maximum SUV were 0.785 and 0.709, and the corresponding coefficients between inverse difference moment and tumor volume were 0.953 and 0.942. Conclusion: This preliminary study showed that the relationships between different texture features and the same clinical feature differ; the prognostic value of contrast and inverse difference moment was higher than that of the other three texture features extracted by GLCM.
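
    Four of the five GLCM features named above can be computed with plain NumPy for a single pixel offset, as sketched below (the GLCM correlation feature is omitted for brevity). This illustrates the definitions only; the study's MATLAB pipeline, tumor segmentation, and SUV handling are not reproduced.

        import numpy as np

        def glcm(img, levels=8):
            """Normalized GLCM for the horizontal (dx=1, dy=0) offset."""
            q = np.clip((img * levels).astype(int), 0, levels - 1)  # quantize [0,1) image
            left, right = q[:, :-1].ravel(), q[:, 1:].ravel()       # neighboring pixel pairs
            P = np.zeros((levels, levels))
            np.add.at(P, (left, right), 1)
            return P / P.sum()

        def features(P):
            i, j = np.indices(P.shape)
            eps = 1e-12                                             # avoid log(0)
            return {
                "ASM":      np.sum(P**2),                           # angular second moment
                "contrast": np.sum(P * (i - j)**2),
                "IDM":      np.sum(P / (1 + (i - j)**2)),           # inverse difference moment
                "entropy": -np.sum(P * np.log(P + eps)),
            }

        img = np.random.rand(64, 64)        # stand-in for a normalized PET tumor slice
        print(features(glcm(img)))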

  6. Myocardial blood flow quantification by Rb-82 cardiac PET/CT: A detailed reproducibility study between two semi-automatic analysis programs.

    PubMed

    Dunet, Vincent; Klein, Ran; Allenbach, Gilles; Renaud, Jennifer; deKemp, Robert A; Prior, John O

    2016-06-01

    Several analysis software packages for myocardial blood flow (MBF) quantification from cardiac PET studies exist, but they have not been compared using concordance analysis, which can characterize precision and bias separately. Reproducible measurements are needed for quantification to fully develop its clinical potential. Fifty-one patients underwent dynamic Rb-82 PET at rest and during adenosine stress. Data were processed with PMOD and FlowQuant (Lortie model). MBF and myocardial flow reserve (MFR) polar maps were quantified and analyzed using a 17-segment model. Comparisons used Pearson's correlation ρ (measuring precision), Bland and Altman limit-of-agreement and Lin's concordance correlation ρc = ρ·C b (C b measuring systematic bias). Lin's concordance and Pearson's correlation values were very similar, suggesting no systematic bias between software packages with an excellent precision ρ for MBF (ρ = 0.97, ρc = 0.96, C b = 0.99) and good precision for MFR (ρ = 0.83, ρc = 0.76, C b = 0.92). On a per-segment basis, no mean bias was observed on Bland-Altman plots, although PMOD provided slightly higher values than FlowQuant at higher MBF and MFR values (P < .0001). Concordance between software packages was excellent for MBF and MFR, despite higher values by PMOD at higher MBF values. Both software packages can be used interchangeably for quantification in daily practice of Rb-82 cardiac PET.
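
    Lin's concordance coefficient used above decomposes as ρc = ρ·C b, with ρ capturing precision and C b capturing systematic bias. A small numpy sketch of that decomposition (function name and inputs are illustrative, e.g. paired per-segment MBF values from the two packages; C b near 1 indicates no systematic shift, as reported):

    ```python
    import numpy as np

    def lin_ccc(x, y):
        """Lin's concordance correlation coefficient rho_c = rho * C_b:
        rho measures precision, C_b measures systematic bias (C_b = 1 means
        the identity line fits with no shift or scale difference)."""
        x, y = np.asarray(x, float), np.asarray(y, float)
        rho = np.corrcoef(x, y)[0, 1]
        ccc = 2 * np.cov(x, y, ddof=0)[0, 1] / (
            x.var() + y.var() + (x.mean() - y.mean()) ** 2)
        return ccc, rho, ccc / rho  # rho_c, rho, C_b
    ```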

  7. Influence of the medication environment on the unsafe medication behavior of nurses: A path analysis.

    PubMed

    Yu, Xi; Li, Ce; Gao, Xueqin; Liu, Furong; Lin, Ping

    2018-04-20

    To explore the relationship between the medication environment and the unsafe medication behavior of nurses and to analyze its influence path. Unsafe medication behavior is the direct cause of medication error. The organizational environment is the foundation of and plays a guiding role in work behavior. Whether the medication environment correlates with the unsafe medication behavior of nurses remains unclear. This study used a correlative design with self-administered questionnaires, and the SHEL model, an acronym of its elements of software, hardware, environment, and liveware, was used as the framework for the medication environment. A survey was conducted among 1012 clinical nurses from five tertiary hospitals in China using the nurse unsafe medication behavior scale (NUMBS) and the nurses' perceptions of the medication environment scale (NPMES). Data were collected from January to February 2017. Path analyses were used to examine the hypothesized model. The medication environment correlated negatively with unsafe medication behavior (r=-0.48, p<0.001). The path analysis showed that software, liveware and nurses' personal factors directly affected unsafe medication behavior. Software, hardware and the environment indirectly influenced unsafe medication behavior, and nurses' personal factors played a mediating role in the relationships of unsafe medication behavior with software, hardware, and the environment. The unsafe medication behavior of nurses should be further improved. The medication environment was a predictor of unsafe medication behavior. Care managers should actively improve the medication environment to reduce the incidence of unsafe medication behaviors. This article is protected by copyright. All rights reserved.

  8. BurnCase 3D software validation study: Burn size measurement accuracy and inter-rater reliability.

    PubMed

    Parvizi, Daryousch; Giretzlehner, Michael; Wurzer, Paul; Klein, Limor Dinur; Shoham, Yaron; Bohanon, Fredrick J; Haller, Herbert L; Tuca, Alexandru; Branski, Ludwik K; Lumenta, David B; Herndon, David N; Kamolz, Lars-P

    2016-03-01

    The aim of this study was to compare the accuracy of burn size estimation using the computer-assisted software BurnCase 3D (RISC Software GmbH, Hagenberg, Austria) with that using a 2D scan, considered to be the actual burn size. Thirty artificial burn areas were pre-planned and prepared on three mannequins (one child, one female, and one male). Five trained physicians (raters) were asked to assess the size of all wound areas using BurnCase 3D software. The results were then compared with the real wound areas, as determined by 2D planimetry imaging. To examine inter-rater reliability, we performed an intraclass correlation analysis with a 95% confidence interval. The mean wound area estimations of the five raters using BurnCase 3D were in total 20.7±0.9% for the child, 27.2±1.5% for the female, and 16.5±0.1% for the male mannequin. Our analysis showed relative overestimations of 0.4%, 2.8%, and 1.5% for the child, female, and male mannequins respectively, compared to the 2D scan. The intraclass correlation between the single raters for the mean percentage of the artificial burn areas was 98.6%. There was also a high intraclass correlation between the single raters and the 2D scan. BurnCase 3D is a valid and reliable tool for the determination of total body surface area burned in standard models. Further clinical studies including different pediatric and overweight adult mannequins are warranted. Copyright © 2016 Elsevier Ltd and ISBI. All rights reserved.

  9. Effect of system workload on operating system reliability - A study on IBM 3081

    NASA Technical Reports Server (NTRS)

    Iyer, R. K.; Rossetti, D. J.

    1985-01-01

    This paper presents an analysis of operating system failures on an IBM 3081 running VM/SP. Three broad categories of software failures are found: error handling, program control or logic, and hardware related; it is found that more than 25 percent of software failures occur in the hardware/software interface. Measurements show that results on software reliability cannot be considered representative unless the system workload is taken into account. The overall CPU execution rate, although measured to be close to 100 percent most of the time, is not found to correlate strongly with the occurrence of failures. Possible reasons for the observed workload failure dependency, based on detailed investigations of the failure data, are discussed.

  10. Integrated fiducial sample mount and software for correlated microscopy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Timothy R McJunkin; Jill R. Scott; Tammy L. Trowbridge

    2014-02-01

    A novel sample-mount design with integrated fiducials, and software for assisting operators in easily and efficiently locating points of interest established in previous analytical sessions, are described. The sample holder and software were evaluated with experiments to demonstrate the utility and ease of finding the same points of interest in two different microscopy instruments. In addition, a numerical analysis of the expected errors in determining the same position, free of operator bias, was performed. Based on the results, issues related to achieving reproducibility and best practices for using the sample mount and software were identified. Overall, the sample mount methodology allows data to be efficiently and easily collected on different instruments for the same sample location.

  11. New hardware and workflows for semi-automated correlative cryo-fluorescence and cryo-electron microscopy/tomography.

    PubMed

    Schorb, Martin; Gaechter, Leander; Avinoam, Ori; Sieckmann, Frank; Clarke, Mairi; Bebeacua, Cecilia; Bykov, Yury S; Sonnen, Andreas F-P; Lihl, Reinhard; Briggs, John A G

    2017-02-01

    Correlative light and electron microscopy allows features of interest defined by fluorescence signals to be located in an electron micrograph of the same sample. Rare dynamic events or specific objects can be identified, targeted and imaged by electron microscopy or tomography. To combine it with structural studies using cryo-electron microscopy or tomography, fluorescence microscopy must be performed while maintaining the specimen vitrified at liquid-nitrogen temperatures and in a dry environment during imaging and transfer. Here we present instrumentation, software and an experimental workflow that improves the ease of use, throughput and performance of correlated cryo-fluorescence and cryo-electron microscopy. The new cryo-stage incorporates a specially modified high-numerical aperture objective lens and provides a stable and clean imaging environment. It is combined with a transfer shuttle for contamination-free loading of the specimen. Optimized microscope control software allows automated acquisition of the entire specimen area by cryo-fluorescence microscopy. The software also facilitates direct transfer of the fluorescence image and associated coordinates to the cryo-electron microscope for subsequent fluorescence-guided automated imaging. Here we describe these technological developments and present a detailed workflow, which we applied for automated cryo-electron microscopy and tomography of various specimens. Copyright © 2016 The Authors. Published by Elsevier Inc. All rights reserved.

  12. A Novel Survey to Examine the Relationship Between Health IT Adoption and Nurse-Physician Communication

    PubMed Central

    Pfeifer, Eric; Manojlovich, Milisa; Adler-Milstein, Julia

    2016-01-01

    Summary Background As EHR adoption in US hospitals becomes ubiquitous, a wide range of IT options are theoretically available to facilitate physician-nurse communication, but we know little about the adoption rate of specific technologies or the impact of their use. Objectives To measure adoption of hardware, software, and telephony relevant to nurse-physician communication in US hospitals. To assess the relationship between non-IT communication practices and hardware, software, and telephony adoption. To identify hospital characteristics associated with greater adoption of hardware, software, telephony, and non-IT communication practices. Methods We conducted a survey of 105 hospitals in the National Nursing Practice Network. The survey captured adoption of hardware, software, and telephony to support nurse-physician communication, along with non-IT communication practices. We calculated descriptive statistics and then created four indices, one for each category, by scoring degree of adoption of technologies or practices within each category. Next, we examined correlations between the three technology indices and the non-IT communication practices index. We used multivariate OLS regression to assess whether certain types of hospitals had higher index scores. Results The majority of hospitals surveyed have a range of hardware, software, and telephony tools available to support nurse-physician communication; we found substantial heterogeneity across hospitals in non-IT communication practices. More intensive non-IT communication was associated with greater adoption of software (r = 0.31, p = 0.01), but was not correlated with hardware or telephony. Medium-sized hospitals had lower adoption of software (r = -1.14, p = 0.04) in comparison to small hospitals, while federally-owned hospitals had lower software (r = -2.57, p = 0.02) and hardware adoption (r = -1.63, p = 0.01). Conclusions The positive relationship between non-IT communication and level of software adoption suggests that there is a complementary, rather than substitutive, relationship. Our results suggest that some technologies with the potential to further enhance communication, such as CPOE and secure messaging, are not being utilized to their full potential in many hospitals. PMID:27999841

  13. Evaluation of an Innovative Digital Assessment Tool in Dental Anatomy.

    PubMed

    Lam, Matt T; Kwon, So Ran; Qian, Fang; Denehy, Gerald E

    2015-05-01

    The E4D Compare software is an innovative tool that provides immediate feedback on students' projects and competencies. It should provide consistent scores even when different scanners are used, which may have inherent subtle differences in calibration. This study aimed to evaluate potential discrepancies in evaluation using the E4D Compare software based on four different NEVO scanners in dental anatomy projects. Additionally, the correlation between digital and visual scores was evaluated. Thirty-five projects of maxillary left central incisors were evaluated. Among these, thirty wax-ups were performed by four operators and five consisted of standard dentoform teeth. Five scores were obtained for each project: one from an instructor who visually graded the project and four from the different NEVO scanners. A faculty member involved in teaching the dental anatomy course blindly scored the 35 projects. One operator scanned all projects with the four NEVO scanners (D4D Technologies, Richardson, TX, USA). The images were aligned to the gold standard, and the tolerance was set at 0.3 mm to generate a score. The score reflected the percentage match between the project and the gold standard. One-way ANOVA with repeated measures was used to determine whether there was a significant difference in scores among the four NEVO scanners. A paired-sample t-test was used to detect any difference between visual scores and the average scores of the four NEVO scanners. Pearson's correlation test was used to assess the relationship between visual and average scores of the NEVO scanners. There was no significant difference in mean scores among the four NEVO scanners [F(3, 102) = 2.27, p = 0.0852; one-way ANOVA with repeated measures]. Moreover, the data provided strong evidence that a significant difference existed between visual and digital scores (p = 0.0217; paired-sample t-test). Mean visual scores were significantly lower than digital scores (72.4 vs 75.1). Pearson's correlation coefficient of 0.85 indicated a strong correlation between visual and digital scores (p < 0.0001). The E4D Compare software provides consistent scores even when different scanners are used and correlates well with visual scores. The use of innovative digital assessment tools in dental education is promising, with the E4D Compare software correlating well with visual scores and providing consistent scores across scanners.

  14. Commercial counterboard for 10 ns software correlator for photon and fluorescence correlation spectroscopy.

    PubMed

    Molteni, Matteo; Ferri, Fabio

    2016-11-01

    A 10 ns time resolution, multi-tau software correlator, capable of computing simultaneous autocorrelation (A-A, B-B) and cross (A-B) correlation functions at count rates up to ∼10 MHz, with no data loss, has been developed in LabVIEW and C++ by using the National Instruments timer/counterboard (NI PCIe-6612) and a fast Personal Computer (PC) (Intel Core i7-4790 Processor, 3.60 GHz). The correlator works by using two algorithms: for large lag times (τ ≳ 1 μs), a classical time-mode scheme, based on the measure of the number of pulses per time interval, is used; differently, for τ ≲ 1 μs a photon-mode (PM) scheme is adopted and the correlation function is retrieved from the sequence of the photon arrival times. Single auto- and cross-correlation functions can be processed online in full real time up to count rates of ∼1.8 MHz and ∼1.2 MHz, respectively. Two autocorrelation (A-A, B-B) and a cross correlation (A-B) functions can be simultaneously processed in full real time only up to count rates of ∼750 kHz. At higher count rates, the online processing takes place in a delayed modality, but with no data loss. When tested with simulated correlation data and latex sphere solutions, the overall performances of the correlator appear to be comparable with those of commercial hardware correlators, but with several nontrivial advantages related to its flexibility, low cost, and easy adaptability to future developments of PC and data acquisition technology.
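
    The time-mode, multi-tau scheme described here can be sketched independently of the authors' LabVIEW/C++ implementation. A minimal numpy version, assuming a photon-count trace already binned at the finest lag time (the photon-mode scheme for τ ≲ 1 μs works on arrival times instead and is not shown):

    ```python
    import numpy as np

    def multitau_autocorr(counts, m=16, levels=8):
        """Multi-tau estimate of g2(tau) from a binned photon-count trace.
        Each level evaluates m lags, then the trace is coarsened 2x, so the
        lag spacing grows geometrically as in hardware correlators."""
        counts = np.asarray(counts, float)
        taus, g2, dt = [], [], 1
        for level in range(levels):
            start = 1 if level == 0 else m // 2 + 1  # finer level covers small lags
            for k in range(start, m + 1):
                if k >= counts.size:
                    break
                taus.append(k * dt)
                g2.append(np.mean(counts[:-k] * counts[k:]) / counts.mean() ** 2)
            n = counts.size - counts.size % 2
            counts = counts[:n].reshape(-1, 2).sum(axis=1)  # coarsen by 2
            dt *= 2
        return np.array(taus), np.array(g2)
    ```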

  15. Commercial counterboard for 10 ns software correlator for photon and fluorescence correlation spectroscopy

    NASA Astrophysics Data System (ADS)

    Molteni, Matteo; Ferri, Fabio

    2016-11-01

    A 10 ns time resolution, multi-tau software correlator, capable of computing simultaneous autocorrelation (A-A, B-B) and cross (A-B) correlation functions at count rates up to ∼10 MHz, with no data loss, has been developed in LabVIEW and C++ by using the National Instruments timer/counterboard (NI PCIe-6612) and a fast Personal Computer (PC) (Intel Core i7-4790 Processor, 3.60 GHz). The correlator works by using two algorithms: for large lag times (τ ≳ 1 μs), a classical time-mode scheme, based on the measure of the number of pulses per time interval, is used; differently, for τ ≲ 1 μs a photon-mode (PM) scheme is adopted and the correlation function is retrieved from the sequence of the photon arrival times. Single auto- and cross-correlation functions can be processed online in full real time up to count rates of ∼1.8 MHz and ∼1.2 MHz, respectively. Two autocorrelation (A-A, B-B) and a cross correlation (A-B) functions can be simultaneously processed in full real time only up to count rates of ∼750 kHz. At higher count rates, the online processing takes place in a delayed modality, but with no data loss. When tested with simulated correlation data and latex sphere solutions, the overall performances of the correlator appear to be comparable with those of commercial hardware correlators, but with several nontrivial advantages related to its flexibility, low cost, and easy adaptability to future developments of PC and data acquisition technology.

  16. DAQ: Software Architecture for Data Acquisition in Sounding Rockets

    NASA Technical Reports Server (NTRS)

    Ahmad, Mohammad; Tran, Thanh; Nichols, Heidi; Bowles-Martinez, Jessica N.

    2011-01-01

    A multithreaded software application was developed by the Jet Propulsion Laboratory (JPL) to collect a set of correlated imagery, Inertial Measurement Unit (IMU), and GPS data for a Wallops Flight Facility (WFF) sounding rocket flight. The data set will be used to advance Terrain Relative Navigation (TRN) technology algorithms being researched at JPL. This paper describes the software architecture and the tests used to meet the timing and data rate requirements for the software used to collect the dataset. Also discussed are the challenges of using commercial off-the-shelf (COTS) flight hardware and open source software, including multiple Camera Link (C-link) based cameras, a Pentium-M based computer, and the Linux Fedora 11 operating system. Additionally, the paper covers the history of the software architecture's usage in other JPL projects and its applicability to future missions, such as cubesats, UAVs, and research planes/balloons, as well as the human aspects of the project, especially JPL's Phaeton program, and the results of the launch.

  17. Quantitative analysis of tympanic membrane perforation: a simple and reliable method.

    PubMed

    Ibekwe, T S; Adeosun, A A; Nwaorgu, O G

    2009-01-01

    Accurate assessment of the features of tympanic membrane perforation, especially size, site, duration and aetiology, is important, as it enables optimum management. To describe a simple, cheap and effective method of quantitatively analysing tympanic membrane perforations. The system described comprises a video-otoscope (capable of generating still and video images of the tympanic membrane), adapted via a universal serial bus box to a computer screen, with images analysed using the Image J geometrical analysis software package. The reproducibility of results and their correlation with conventional otoscopic methods of estimation were tested statistically with the paired t-test and correlational tests, using the Statistical Package for the Social Sciences version 11 software. The following equation was generated: (P/T) × 100% = percentage perforation, where P is the area (in pixels²) of the tympanic membrane perforation and T is the total area (in pixels²) of the entire tympanic membrane (including the perforation). Illustrations are shown. Comparison of blinded data on tympanic membrane perforation area obtained independently from assessments by two trained otologists, of comparative years of experience, using the video-otoscopy system described, showed similar findings, with strong correlations devoid of inter-observer error (p = 0.000, r = 1). Comparison with conventional otoscopic assessment also indicated significant correlation, comparing results for two trained otologists, but some inter-observer variation was present (p = 0.000, r = 0.896). Correlation between the two methods for each of the otologists was also highly significant (p = 0.000). A computer-adapted video-otoscope, with images analysed by Image J software, represents a cheap, reliable, technology-driven, clinical method of quantitative analysis of tympanic membrane perforations and injuries.
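
    A worked instance of the equation, with hypothetical pixel counts traced in Image J:

    ```python
    # Hypothetical areas traced on a video-otoscopic image (not study data).
    P = 11_250   # perforation area, pixels^2
    T = 96_400   # total tympanic-membrane area incl. perforation, pixels^2
    print(f"perforation: {P / T * 100:.1f}%")   # -> perforation: 11.7%
    ```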

  18. Evaluation of software tools for automated identification of neuroanatomical structures in quantitative β-amyloid PET imaging to diagnose Alzheimer's disease.

    PubMed

    Tuszynski, Tobias; Rullmann, Michael; Luthardt, Julia; Butzke, Daniel; Tiepolt, Solveig; Gertz, Hermann-Josef; Hesse, Swen; Seese, Anita; Lobsien, Donald; Sabri, Osama; Barthel, Henryk

    2016-06-01

    For regional quantification of nuclear brain imaging data, defining volumes of interest (VOIs) by hand is still the gold standard. As this procedure is time-consuming and operator-dependent, a variety of software tools for automated identification of neuroanatomical structures were developed. As the quality and performance of these tools in analyzing amyloid PET data have so far been poorly investigated, we compared in this project four algorithms for automated VOI definition (HERMES Brass, two PMOD approaches, and FreeSurfer) against the conventional method. We systematically analyzed florbetaben brain PET and MRI data of ten patients with probable Alzheimer's dementia (AD) and ten age-matched healthy controls (HCs) collected in a previous clinical study. VOIs were manually defined on the data as well as through the four automated workflows. Standardized uptake value ratios (SUVRs) with the cerebellar cortex as a reference region were obtained for each VOI. SUVR comparisons between ADs and HCs were carried out using Mann-Whitney-U tests, and effect sizes (Cohen's d) were calculated. SUVRs of automatically generated VOIs were correlated with SUVRs of conventionally derived VOIs (Pearson's tests). The composite neocortex SUVRs obtained by manually defined VOIs were significantly higher for ADs vs. HCs (p=0.010, d=1.53). This was also the case for the four tested automated approaches which achieved effect sizes of d=1.38 to d=1.62. SUVRs of automatically generated VOIs correlated significantly with those of the hand-drawn VOIs in a number of brain regions, with regional differences in the degree of these correlations. Best overall correlation was observed in the lateral temporal VOI for all tested software tools (r=0.82 to r=0.95, p<0.001). Automated VOI definition by the software tools tested has a great potential to substitute for the current standard procedure to manually define VOIs in β-amyloid PET data analysis.
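
    The SUVR and group-comparison steps described here are standard; a brief sketch under assumed inputs (mean uptakes per VOI; function names are illustrative, not from any of the tested tools):

    ```python
    import numpy as np
    from scipy.stats import mannwhitneyu

    def suvr(voi_mean_uptake, cerebellar_mean_uptake):
        """Standardized uptake value ratio, cerebellar cortex as reference."""
        return voi_mean_uptake / cerebellar_mean_uptake

    def cohens_d(ad, hc):
        """Pooled-SD Cohen's d between AD and HC SUVR samples."""
        ad, hc = np.asarray(ad, float), np.asarray(hc, float)
        pooled_sd = np.sqrt(((ad.size - 1) * ad.var(ddof=1) +
                             (hc.size - 1) * hc.var(ddof=1)) /
                            (ad.size + hc.size - 2))
        return (ad.mean() - hc.mean()) / pooled_sd

    # Group comparison as in the study: stat, p = mannwhitneyu(ad_suvrs, hc_suvrs)
    ```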

  19. Head circumference as a useful surrogate for intracranial volume in older adults.

    PubMed

    Hshieh, Tammy T; Fox, Meaghan L; Kosar, Cyrus M; Cavallari, Michele; Guttmann, Charles R G; Alsop, David; Marcantonio, Edward R; Schmitt, Eva M; Jones, Richard N; Inouye, Sharon K

    2016-01-01

    Intracranial volume (ICV) has been proposed as a measure of maximum lifetime brain size. Accurate ICV measures require neuroimaging which is not always feasible for epidemiologic investigations. We examined head circumference as a useful surrogate for ICV in older adults. 99 older adults underwent Magnetic Resonance Imaging (MRI). ICV was measured by Statistical Parametric Mapping 8 (SPM8) software or Functional MRI of the Brain Software Library (FSL) extraction with manual editing, typically considered the gold standard. Head circumferences were determined using standardized tape measurement. We examined estimated correlation coefficients between head circumference and the two MRI-based ICV measurements. Head circumference and ICV by SPM8 were moderately correlated (overall r = 0.73, men r = 0.67, women r = 0.63). Head circumference and ICV by FSL were also moderately correlated (overall r = 0.69, men r = 0.63, women r = 0.49). Head circumference measurement was strongly correlated with MRI-derived ICV. Our study presents a simple method to approximate ICV among older patients, which may prove useful as a surrogate for cognitive reserve in large scale epidemiologic studies of cognitive outcomes. This study also suggests the stability of head circumference correlation with ICV throughout the lifespan.

  20. [Classification of results of studying blood plasma with laser correlation spectroscopy based on semiotics of preclinical and clinical states].

    PubMed

    Ternovoĭ, K S; Kryzhanovskiĭ, G N; Musiĭchuk, Iu I; Noskin, L A; Klopov, N V; Noskin, V A; Starodub, N F

    1998-01-01

    The use of laser correlation spectroscopy for the verification of preclinical and clinical states is substantiated. A "semiotic" classifier developed for distinguishing preclinical and clinical states is presented. The biological rationale for the algorithms, as well as the mathematical support and software for the proposed classifier of laser correlation spectroscopy data of blood plasma, are presented.

  1. ROI Analysis of the System Architecture Virtual Integration Initiative

    DTIC Science & Technology

    2018-04-01

    The ROI analysis uses conservative estimates of costs and benefits, especially for those parameters that have a proven, strong correlation to overall... formula: • In Section 3, we discuss the exponential growth of avionics software systems in terms of SLOC by analyzing the historical data to correlate... which implies that the system has good structure (high cohesion, low coupling), good application clarity (good correlation between program and

  2. Analytic posteriors for Pearson's correlation coefficient.

    PubMed

    Ly, Alexander; Marsman, Maarten; Wagenmakers, Eric-Jan

    2018-02-01

    Pearson's correlation is one of the most common measures of linear dependence. Recently, Bernardo (11th International Workshop on Objective Bayes Methodology, 2015) introduced a flexible class of priors to study this measure in a Bayesian setting. For this large class of priors, we show that the (marginal) posterior for Pearson's correlation coefficient and all of the posterior moments are analytic. Our results are available in the open-source software package JASP.
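
    The exact posterior is implemented in JASP. As a hedged illustration of what such a posterior looks like, the classical Jeffreys approximation, which the exact analytic result of this paper refines with a hypergeometric correction factor, can be evaluated on a grid:

    ```python
    import numpy as np

    def rho_posterior_grid(r, n, grid=np.linspace(-0.999, 0.999, 2001)):
        """Jeffreys' classical approximation to the posterior density of
        Pearson's rho given observed correlation r and sample size n
        (an approximation, not the exact analytic posterior of the paper)."""
        logp = 0.5 * (n - 1) * np.log1p(-grid ** 2) \
               - (n - 1.5) * np.log1p(-grid * r)
        p = np.exp(logp - logp.max())
        return grid, p / (p.sum() * (grid[1] - grid[0]))  # normalized on the grid
    ```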

  3. The Dimensionality and Correlates of Flow in Human-Computer Interactions.

    ERIC Educational Resources Information Center

    Webster, Jane; And Others

    1993-01-01

    Defines playfulness in human-computer interactions in terms of flow theory and explores the dimensionality of the flow concept. Two studies are reported that investigated the factor structure and correlates of flow in human-computer interactions: one examined MBA students using Lotus 1-2-3 spreadsheet software, and one examined employees using…

  4. Identifying technical aliases in SELDI mass spectra of complex mixtures of proteins

    PubMed Central

    2013-01-01

    Background Biomarker discovery datasets created using mass spectrum protein profiling of complex mixtures of proteins contain many peaks that represent the same protein with different charge states. Correlated variables such as these can confound the statistical analyses of proteomic data. Previously we developed an algorithm that clustered mass spectrum peaks that were biologically or technically correlated. Here we demonstrate an algorithm that clusters correlated technical aliases only. Results In this paper, we propose a preprocessing algorithm that can be used for grouping technical aliases in mass spectrometry protein profiling data. The stringency of the variance allowed for clustering is customizable, thereby affecting the number of peaks that are clustered. Subsequent analysis of the clusters, instead of individual peaks, helps reduce difficulties associated with technically-correlated data, and can aid more efficient biomarker identification. Conclusions This software can be used to pre-process and thereby decrease the complexity of protein profiling proteomics data, thus simplifying the subsequent analysis of biomarkers by decreasing the number of tests. The software is also a practical tool for identifying which features to investigate further by purification, identification and confirmation. PMID:24010718
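
    The paper's algorithm is its own; as a generic sketch of the idea, peaks whose intensities co-vary across spectra can be grouped by hierarchical clustering on correlation distance, with the cut height playing the role of the adjustable stringency described above (all names assumed):

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.spatial.distance import squareform

    def cluster_aliases(intensities, max_dist=0.2):
        """Group mass-spectrum peaks whose intensity profiles across spectra
        are highly correlated (candidate charge-state aliases).
        intensities: array (n_spectra, n_peaks); max_dist = 1 - min correlation."""
        corr = np.corrcoef(intensities.T)           # peak-by-peak correlation
        dist = squareform(1 - corr, checks=False)   # condensed distance matrix
        tree = linkage(dist, method="average")
        return fcluster(tree, t=max_dist, criterion="distance")  # cluster labels
    ```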

  5. A Low Cost Simulation System to Demonstrate Pilot Induced Oscillation Phenomenon

    NASA Technical Reports Server (NTRS)

    Ali, Syed Firasat

    1997-01-01

    A flight simulation system with graphics and software on Silicon Graphics computer workstations has been installed in the Flight Vehicle Design Laboratory at Tuskegee University. The system has F-15E flight simulation software from NASA Dryden which uses the graphics of SGI flight simulation demos. On the system, thus installed, a study of pilot induced oscillations is planned for future work. Preliminary research is conducted by obtaining two sets of straight level flights with the pilot in the loop. In one set of flights, no additional delay is used between the stick input and the appearance of the airplane response on the computer monitor. In another set of flights, a 500 ms additional delay is used. The flight data is analyzed to find cross correlations between deflections of control surfaces and the response of the airplane. The pilot dynamics features derived from cross correlations of the straight level flights are discussed in this report. The correlations presented here will serve as reference material for the corresponding correlations in a future study of pitch attitude tracking tasks involving pilot induced oscillations.
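
    The cross-correlation between stick input and displayed response also yields the effective delay directly; a minimal numpy sketch (signal names and sampling interval are assumptions, not from the report):

    ```python
    import numpy as np

    def input_response_lag(stick, response, dt):
        """Estimate the delay between stick input and displayed aircraft
        response from the peak of their cross-correlation."""
        stick = stick - stick.mean()
        response = response - response.mean()
        xcorr = np.correlate(response, stick, mode="full")
        lags = np.arange(-stick.size + 1, stick.size) * dt
        return lags[np.argmax(xcorr)]   # e.g. ~0.5 s with the added delay
    ```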

  6. Finite-Difference Time-Domain Analysis of Tapered Photonic Crystal Fiber

    NASA Astrophysics Data System (ADS)

    Ali, M. I. Md; Sanusidin, S. N.; Yusof, M. H. M.

    2018-03-01

    This paper describes the simulation of a tapered single-mode photonic crystal fiber (PCF) of the LMA-8 type, based on the correlation of the scattering pattern at a wavelength of 1.55 μm, analysis of the transmission spectrum over the wavelength range of 1.0 to 2.5 μm, and correlation of the transmission spectrum with the refractive index change in the photonic crystal holes for taper sizes of 0.1 to 1.0, using Optiwave simulation software. The main objective is to simulate the tapered LMA-8 PCF using the Finite-Difference Time-Domain (FDTD) technique for sensing applications, improving the capabilities of the PCF without collapsing the crystal holes. The FDTD techniques used are scattering-pattern and transverse-transmission simulations, with principal component analysis (PCA) used as a mathematical tool to model the data in MathCad software. The simulation results showed no obvious correlation in the scattering pattern at a wavelength of 1.55 μm, a correlation between taper size and transverse transmission, and a parabolic relationship between the refractive index changes inside the crystal structure.

  7. The Stability and Validity of Automated Vocal Analysis in Preverbal Preschoolers With Autism Spectrum Disorder

    PubMed Central

    Woynaroski, Tiffany; Oller, D. Kimbrough; Keceli-Kaysili, Bahar; Xu, Dongxin; Richards, Jeffrey A.; Gilkerson, Jill; Gray, Sharmistha; Yoder, Paul

    2017-01-01

    Theory and research suggest that vocal development predicts “useful speech” in preschoolers with autism spectrum disorder (ASD), but conventional methods for measurement of vocal development are costly and time consuming. This longitudinal correlational study examines the reliability and validity of several automated indices of vocalization development relative to an index derived from human coded, conventional communication samples in a sample of preverbal preschoolers with ASD. Automated indices of vocal development were derived using software that is presently “in development” and/or only available for research purposes and using commercially available Language ENvironment Analysis (LENA) software. Indices of vocal development that could be derived using the software available for research purposes: (a) were highly stable with a single day-long audio recording, (b) predicted future spoken vocabulary to a degree that was nonsignificantly different from the index derived from conventional communication samples, and (c) continued to predict future spoken vocabulary even after controlling for concurrent vocabulary in our sample. The score derived from standard LENA software was similarly stable, but was not significantly correlated with future spoken vocabulary. Findings suggest that automated vocal analysis is a valid and reliable alternative to time intensive and expensive conventional communication samples for measurement of vocal development of preverbal preschoolers with ASD in research and clinical practice. PMID:27459107

  8. Reproducibility of a novel echocardiographic 3D automated software for the assessment of mitral valve anatomy.

    PubMed

    Aquila, Iolanda; González, Ariana; Fernández-Golfín, Covadonga; Rincón, Luis Miguel; Casas, Eduardo; García, Ana; Hinojar, Rocio; Jiménez-Nacher, José Julio; Zamorano, José Luis

    2016-05-17

    3D transesophageal echocardiography (TEE) is superior to 2D TEE in quantitative anatomic evaluation of the mitral valve (MV), but it shows limitations regarding automatic quantification. Here, we tested the inter-/intra-observer reproducibility of a novel fully automated software package in the evaluation of MV anatomy compared to manual 3D assessment. Thirty-six out of 61 screened patients referred to our Cardiac Imaging Unit for TEE were retrospectively included. 3D TEE analysis was performed both manually and with the automated software by two independent operators. Mitral annular area, intercommissural distance, anterior leaflet length and posterior leaflet length were assessed. A significant correlation between both methods was found for all variables: intercommissural diameter (r = 0.84, p < 0.01), mitral annular area (r = 0.94, p < 0.01), anterior leaflet length (r = 0.83, p < 0.01) and posterior leaflet length (r = 0.67, p < 0.01). Interobserver variability assessed by the intraclass correlation coefficient was superior for the automatic software: intercommissural distance 0.997 vs. 0.76; mitral annular area 0.957 vs. 0.858; anterior leaflet length 0.963 vs. 0.734; and posterior leaflet length 0.936 vs. 0.838. Intraobserver variability was good for both methods with a better level of agreement with the automatic software. The novel 3D automated software is reproducible in MV anatomy assessment. The incorporation of this new tool in clinical MV assessment may improve patient selection and outcomes for MV interventions as well as patient diagnosis and prognosis stratification. Yet, high-quality 3D images are indispensable.

  9. Validation of a Video Analysis Software Package for Quantifying Movement Velocity in Resistance Exercises.

    PubMed

    Sañudo, Borja; Rueda, David; Pozo-Cruz, Borja Del; de Hoyo, Moisés; Carrasco, Luis

    2016-10-01

    Sañudo, B, Rueda, D, del Pozo-Cruz, B, de Hoyo, M, and Carrasco, L. Validation of a video analysis software package for quantifying movement velocity in resistance exercises. J Strength Cond Res 30(10): 2934-2941, 2016-The aim of this study was to establish the validity of a video analysis software package in measuring mean propulsive velocity (MPV) and the maximal velocity during bench press. Twenty-one healthy males (21 ± 1 year) with weight training experience were recruited, and the MPV and the maximal velocity of the concentric phase (Vmax) were compared with a linear position transducer system during a standard bench press exercise. Participants performed a 1 repetition maximum test using the supine bench press exercise. The testing procedures involved the simultaneous assessment of bench press propulsive velocity using 2 kinematic (linear position transducer and semi-automated tracking software) systems. High Pearson's correlation coefficients for MPV and Vmax between both devices (r = 0.473 to 0.993) were observed. The intraclass correlation coefficients for barbell velocity data and the kinematic data obtained from video analysis were high (>0.79). In addition, the low coefficients of variation indicate that measurements had low variability. Finally, Bland-Altman plots with the limits of agreement of the MPV and Vmax with different loads showed a negative trend, which indicated that the video analysis had higher values than the linear transducer. In conclusion, this study has demonstrated that the software used for the video analysis was an easy to use and cost-effective tool with a very high degree of concurrent validity. This software can be used to evaluate changes in velocity of training load in resistance training, which may be important for the prescription and monitoring of training programmes.
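
    The Bland-Altman limits of agreement used above come from the mean and SD of the paired differences; a short numpy sketch (variable names assumed):

    ```python
    import numpy as np

    def bland_altman_limits(video_v, transducer_v):
        """Mean bias and 95% limits of agreement between paired velocity
        measurements from the video software and the linear transducer."""
        diff = np.asarray(video_v, float) - np.asarray(transducer_v, float)
        bias = diff.mean()
        sd = diff.std(ddof=1)
        return bias, bias - 1.96 * sd, bias + 1.96 * sd  # bias, lower, upper
    ```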

  10. A near-infrared fluorescence-based surgical navigation system imaging software for sentinel lymph node detection

    NASA Astrophysics Data System (ADS)

    Ye, Jinzuo; Chi, Chongwei; Zhang, Shuang; Ma, Xibo; Tian, Jie

    2014-02-01

    Sentinel lymph node (SLN) in vivo detection is vital in breast cancer surgery. A new near-infrared fluorescence-based surgical navigation system (SNS) imaging software, developed by our research group, is presented for SLN detection surgery in this paper. The software is based on the fluorescence-based surgical navigation hardware system (SNHS) developed in our lab and is designed specifically for intraoperative imaging and postoperative data analysis. The surgical navigation imaging software consists of the following modules: the control module, the image grabbing module, the real-time display module, the data saving module, and the image processing module. Several algorithms have been designed to achieve the required performance, for example an image registration algorithm based on correlation matching. Some of the key features of the software include: setting the control parameters of the SNS; acquiring, displaying, and storing the intraoperative imaging data automatically in real time; and analyzing and processing the saved image data. The developed software has been used to successfully detect the SLNs in 21 cases of breast cancer patients. In the near future, we plan to improve the software performance, and it will be extensively used for clinical purposes.
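
    The abstract names, but does not detail, an image registration algorithm based on correlation matching; a generic sketch of that idea using scikit-image's normalized cross-correlation (not the authors' implementation):

    ```python
    import numpy as np
    from skimage.feature import match_template

    def register_by_correlation(frame, template):
        """Locate a reference patch in a new frame by normalized
        cross-correlation and return the (row, col) of the best match."""
        ncc = match_template(frame, template)        # correlation map
        peak = np.unravel_index(np.argmax(ncc), ncc.shape)
        return peak, float(ncc[peak])                # offset, peak correlation
    ```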

  11. Development and assessment of a digital X-ray software tool to determine vertebral rotation in adolescent idiopathic scoliosis.

    PubMed

    Eijgenraam, Susanne M; Boselie, Toon F M; Sieben, Judith M; Bastiaenen, Caroline H G; Willems, Paul C; Arts, Jacobus J; Lataster, Arno

    2017-02-01

    The amount of vertebral rotation in the axial plane is of key importance in the prognosis and treatment of adolescent idiopathic scoliosis (AIS). Current methods to determine vertebral rotation are either designed for use in analogue plain radiographs and not useful in digital images, or lack measurement precision and are therefore less suitable for the follow-up of rotation in AIS patients. This study aimed to develop a digital X-ray software tool with high measurement precision to determine vertebral rotation in AIS, and to assess its (concurrent) validity and reliability. In this study a combination of basic science and reliability methodology applied in both laboratory and clinical settings was used. Software was developed using the algorithm of the Perdriolle torsion meter for analogue AP plain radiographs of the spine. Software was then assessed for (1) concurrent validity and (2) intra- and interobserver reliability. Plain radiographs of both human cadaver vertebrae and outpatient AIS patients were used. Concurrent validity was measured by two independent observers, both experienced in the assessment of plain radiographs. Reliability-measurements were performed by three independent spine surgeons. Pearson correlation of the software compared with the analogue Perdriolle torsion meter for mid-thoracic vertebrae was 0.98, for low-thoracic vertebrae 0.97 and for lumbar vertebrae 0.97. Measurement exactness of the software was within 5° in 62% of cases and within 10° in 97% of cases. Intraclass correlation coefficient (ICC) for inter-observer reliability was 0.92 (0.91-0.95), ICC for intra-observer reliability was 0.96 (0.94-0.97). We developed a digital X-ray software tool to determine vertebral rotation in AIS with a substantial concurrent validity and reliability, which may be useful for the follow-up of vertebral rotation in AIS patients. Copyright © 2015 Elsevier Inc. All rights reserved.

  12. Correlation and agreement of a digital and conventional method to measure arch parameters.

    PubMed

    Nawi, Nes; Mohamed, Alizae Marny; Marizan Nor, Murshida; Ashar, Nor Atika

    2018-01-01

    The aim of the present study was to determine the overall reliability and validity of arch parameters measured digitally compared with conventional measurement. A sample of 111 plaster study models of Down syndrome (DS) patients was digitized using a blue light three-dimensional (3D) scanner. Digital and manual measurements of the defined parameters were performed using Geomagic analysis software (Geomagic Studio 2014 software, 3D Systems, Rock Hill, SC, USA) on digital models and with a digital calliper (Tuten, Germany) on plaster study models. Both measurements were repeated twice to validate intraexaminer reliability based on intraclass correlation coefficients (ICCs), using the independent t test and Pearson's correlation, respectively. The Bland-Altman method of analysis was used to evaluate the agreement of the measurements between the digital and plaster models. No statistically significant differences (p > 0.05) were found between the manual and digital methods when measuring arch width, arch length, and space analysis. In addition, all parameters showed a significant correlation coefficient (r ≥ 0.972; p < 0.01) between all digital and manual measurements. Furthermore, a positive agreement between digital and manual measurements of arch width (90-96%) and of arch length and space analysis (95-99%) was also distinguished using the Bland-Altman method. These results demonstrate that 3D blue light scanning and the measurement software are able to precisely produce a 3D digital model and measure arch width, arch length, and space analysis. The 3D digital model is valid for use in various clinical applications.

  13. Eye/Brain/Task Testbed And Software

    NASA Technical Reports Server (NTRS)

    Janiszewski, Thomas; Mainland, Nora; Roden, Joseph C.; Rothenheber, Edward H.; Ryan, Arthur M.; Stokes, James M.

    1994-01-01

    Eye/brain/task (EBT) testbed records electroencephalograms, movements of eyes, and structures of tasks to provide comprehensive data on neurophysiological experiments. Intended to serve continuing effort to develop means for interactions between human brain waves and computers. Software library associated with testbed provides capabilities to recall collected data, to process data on movements of eyes, to correlate eye-movement data with electroencephalographic data, and to present data graphically. Cognitive processes investigated in ways not previously possible.

  14. An Empirical Approach to Logical Clustering of Software Failure Regions

    DTIC Science & Technology

    1994-03-01

    "this is a coincidence or normal behavior of failure regions." Software faults were numbered in order as they were discovered by the various testing... locations of the associated faults. The goal of this research will be an improved testing technique that incorporates failure region behavior. To do this... clustering behavior. This, however, does not correlate with the structural clustering of failure regions observed by Ginn (1991) on the same set of data

  15. Open release of the DCA++ project

    NASA Astrophysics Data System (ADS)

    Haehner, Urs; Solca, Raffaele; Staar, Peter; Alvarez, Gonzalo; Maier, Thomas; Summers, Michael; Schulthess, Thomas

    We present the first open release of the DCA++ project, a highly scalable and efficient research code to solve quantum many-body problems with cutting edge quantum cluster algorithms. The implemented dynamical cluster approximation (DCA) and its DCA+ extension with a continuous self-energy capture nonlocal correlations in strongly correlated electron systems thereby allowing insight into high-Tc superconductivity. With the increasing heterogeneity of modern machines, DCA++ provides portable performance on conventional and emerging new architectures, such as hybrid CPU-GPU and Xeon Phi, sustaining multiple petaflops on ORNL's Titan and CSCS' Piz Daint. Moreover, we will describe how best practices in software engineering can be applied to make software development sustainable and scalable in a research group. Software testing and documentation not only prevent productivity collapse, but more importantly, they are necessary for correctness, credibility and reproducibility of scientific results. This research used resources of the Oak Ridge Leadership Computing Facility (OLCF) awarded by the INCITE program, and of the Swiss National Supercomputing Center. OLCF is a DOE Office of Science User Facility supported under Contract DE-AC05-00OR22725.

  16. Inexpensive electronics and software for photon statistics and correlation spectroscopy.

    PubMed

    Gamari, Benjamin D; Zhang, Dianwen; Buckman, Richard E; Milas, Peker; Denker, John S; Chen, Hui; Li, Hongmin; Goldner, Lori S

    2014-07-01

    Single-molecule-sensitive microscopy and spectroscopy are transforming biophysics and materials science laboratories. Techniques such as fluorescence correlation spectroscopy (FCS) and single-molecule sensitive fluorescence resonance energy transfer (FRET) are now commonly available in research laboratories but are as yet infrequently available in teaching laboratories. We describe inexpensive electronics and open-source software that bridges this gap, making state-of-the-art research capabilities accessible to undergraduates interested in biophysics. We include a discussion of the intensity correlation function relevant to FCS and how it can be determined from photon arrival times. We demonstrate the system with a measurement of the hydrodynamic radius of a protein using FCS that is suitable for the undergraduate teaching laboratory. The FPGA-based electronics, which are easy to construct, are suitable for more advanced measurements as well, and several applications are described. As implemented, the system has 8 ns timing resolution, can control up to four laser sources, and can collect information from as many as four photon-counting detectors.
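
    The text's claim that the intensity correlation can be determined from photon arrival times can be illustrated with a simple time-mode estimator (a sketch, not the article's FPGA firmware; names and binning are assumptions, and lags are taken to be much shorter than the record):

    ```python
    import numpy as np

    def g2_from_arrivals(arrival_times, bin_width, lags):
        """Estimate the normalized intensity correlation g(tau) from photon
        arrival times: bin the arrivals, then correlate the binned counts.
        lags: positive integer lags in units of bin_width."""
        edges = np.arange(0.0, arrival_times.max() + bin_width, bin_width)
        counts, _ = np.histogram(arrival_times, edges)
        mean_sq = counts.mean() ** 2
        return np.array([np.mean(counts[:-k] * counts[k:]) / mean_sq
                         for k in lags])
    ```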

  17. Inexpensive electronics and software for photon statistics and correlation spectroscopy

    PubMed Central

    Gamari, Benjamin D.; Zhang, Dianwen; Buckman, Richard E.; Milas, Peker; Denker, John S.; Chen, Hui; Li, Hongmin; Goldner, Lori S.

    2016-01-01

    Single-molecule-sensitive microscopy and spectroscopy are transforming biophysics and materials science laboratories. Techniques such as fluorescence correlation spectroscopy (FCS) and single-molecule sensitive fluorescence resonance energy transfer (FRET) are now commonly available in research laboratories but are as yet infrequently available in teaching laboratories. We describe inexpensive electronics and open-source software that bridges this gap, making state-of-the-art research capabilities accessible to undergraduates interested in biophysics. We include a discussion of the intensity correlation function relevant to FCS and how it can be determined from photon arrival times. We demonstrate the system with a measurement of the hydrodynamic radius of a protein using FCS that is suitable for the undergraduate teaching laboratory. The FPGA-based electronics, which are easy to construct, are suitable for more advanced measurements as well, and several applications are described. As implemented, the system has 8 ns timing resolution, can control up to four laser sources, and can collect information from as many as four photon-counting detectors. PMID:26924846

  18. The interactive digital video interface

    NASA Technical Reports Server (NTRS)

    Doyle, Michael D.

    1989-01-01

    A frequent complaint in the computer-oriented trade journals is that current hardware technology is progressing so quickly that software developers cannot keep up. An example of this phenomenon can be seen in the field of microcomputer graphics. To exploit the advantages of new mechanisms of information storage and retrieval, new approaches must be taken towards incorporating existing programs as well as developing entirely new applications. A particular area of need is the correlation of discrete image elements to textual information. The interactive digital video (IDV) interface embodies a new concept in software design which addresses these needs. The IDV interface is a patented device- and language-independent process for identifying image features on a digital video display which allows a number of different processes to be keyed to that identification. Its capabilities include the correlation of discrete image elements to relevant text information and the correlation of these image features to other images as well as to program control mechanisms. Sophisticated interrelationships can be set up between images, text, and program control mechanisms.

  19. The CoreWall Project: An Update for 2007

    NASA Astrophysics Data System (ADS)

    Yu-Chung Chen, J.; Higgins, S.; Hur, H.; Ito, E.; Jenkins, C. J.; Johnson, A.; Leigh, J.; Morin, P.; Lee, J.

    2007-12-01

    The CoreWall Suite is an NSF-supported collaborative development of real-time core description (Corelyzer), stratigraphic correlation (Correlator), and data visualization (CoreNavigator) software to be used by the marine, terrestrial, and Antarctic science communities. The overall goal of the CoreWall software development is to bring portable cross-platform tools to the broader drilling and coring communities to expand and enhance data visualization and enhance collaborative integration of multiple datasets. The CoreWall Project is now in its second year, and significant progress has been made on all three software components. Corelyzer has undergone two field deployments and testing by the ANDRILL program in 2006 (and again in Fall 2007) and by ICDP's SAFOD project (summer 2007). In addition, the CoreWall group and ICDP are working together so that the core description (DIS) system can expose DIS core data directly into Corelyzer seamlessly and be available to future ICDP and IODP Mission Specific Platform expeditions. Educators have also taken note of the software's ease of use and strong visualization capabilities and have begun exploring curriculum projects with the Corelyzer software. To ensure that the software development is integrated with other community IT activities, such as the development of the U.S. IODP Phase 2 Scientific Ocean Drilling Vessel (SODV), a Steering Committee was constituted. It is composed of key U.S. IODP and related database (e.g., CHRONOS, SedDB) developers and users as well as representatives of other core-based enterprises (e.g., ANDRILL, ICDP, LacCore). Corelyzer (CoreWall's main visual core description tool) displays digital core images from one or more cores along with discrete data streams (e.g., physical properties, downhole logs) and nested images (e.g., thin sections, fossils) to provide a robust approach to the description of sediment cores. Corelyzer's digital image handling allows the cores to be viewed from micron to km scale, determined by the image resolution along a sliding plane, effectively making it a "digital microscope". Detailed features such as lithologic variation, macroscopic grain size variation, bioturbation intensity, chemical composition, and micropaleontology are easier to interpret and annotate. Significant new capabilities have been added to allow for importing multiple images and data types, sharing/exporting Corelyzer "work sessions" for multiple users, and enhanced annotations, as well as support for other activities such as examining clasts and sample requests. The new Correlator software, the updated version of the Splicer/Sagan software used by ODP for over 10 years, has been ported into a single new analysis tool that will work across multiple platforms and interact seamlessly with JANUS (ODP's relational database), CHRONOS, PetDB, SedDB, dbSEABED, and other databases. This functionality will result in a CoreWall Suite module that can be used and distributed anywhere for stratigraphic and age correlation tasks. CoreNavigator, a spatial data discovery tool, has taken on a virtual globe interface that allows users to enter Corelyzer from a geographic-visual standpoint.

  20. Accuracy and Reliability Assessment of CT and MR Perfusion Analysis Software Using a Digital Phantom

    PubMed Central

    Christensen, Soren; Sasaki, Makoto; Østergaard, Leif; Shirato, Hiroki; Ogasawara, Kuniaki; Wintermark, Max; Warach, Steven

    2013-01-01

    Purpose: To design a digital phantom data set for computed tomography (CT) perfusion and perfusion-weighted imaging on the basis of the widely accepted tracer kinetic theory in which the true values of cerebral blood flow (CBF), cerebral blood volume (CBV), mean transit time (MTT), and tracer arrival delay are known and to evaluate the accuracy and reliability of postprocessing programs using this digital phantom. Materials and Methods: A phantom data set was created by generating concentration-time curves reflecting true values for CBF (2.5–87.5 mL/100 g per minute), CBV (1.0–5.0 mL/100 g), MTT (3.4–24 seconds), and tracer delays (0–3.0 seconds). These curves were embedded in human brain images. The data were analyzed by using 13 algorithms each for CT and magnetic resonance (MR), including five commercial vendors and five academic programs. Accuracy was assessed by using the Pearson correlation coefficient (r) for true values. Delay-, MTT-, or CBV-dependent errors and correlations between time to maximum of residue function (Tmax) were also evaluated. Results: In CT, CBV was generally well reproduced (r > 0.9 in 12 algorithms), but not CBF and MTT (r > 0.9 in seven and four algorithms, respectively). In MR, good correlation (r > 0.9) was observed in one-half of commercial programs, while all academic algorithms showed good correlations for all parameters. Most algorithms had delay-dependent errors, especially for commercial software, as well as CBV dependency for CBF or MTT calculation and MTT dependency for CBV calculation. Correlation was good in Tmax except for one algorithm. Conclusion: The digital phantom readily evaluated the accuracy and characteristics of the CT and MR perfusion analysis software. All commercial programs had delay-induced errors and/or insufficient correlations with true values, while academic programs for MR showed good correlations with true values. © RSNA, 2012 Supplemental material: http://radiology.rsna.org/lookup/suppl/doi:10.1148/radiol.12112618/-/DC1 PMID:23220899
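
    The phantom construction follows the tracer kinetic theory cited above: the tissue curve is the arterial input convolved with a residue function scaled by CBF, and a boxcar residue of width MTT enforces CBV = CBF × MTT. A minimal sketch under those assumptions (the actual phantom's AIF shape and value grids are not reproduced here):

    ```python
    import numpy as np

    def tissue_curve(aif, dt, cbf_ml_100g_min, mtt_s, delay_s=0.0):
        """Ground-truth tissue concentration curve from indicator-dilution
        theory: C(t) = CBF * (AIF convolved with R)(t). A boxcar residue
        function R of width MTT makes CBV = CBF * MTT hold exactly."""
        cbf = cbf_ml_100g_min / (100.0 * 60.0)        # to mL/g/s
        n_mtt = max(1, int(round(mtt_s / dt)))
        residue = np.ones(n_mtt)                      # boxcar residue function
        curve = cbf * np.convolve(aif, residue)[: len(aif)] * dt
        shift = int(round(delay_s / dt))              # tracer arrival delay
        return np.concatenate([np.zeros(shift), curve[: len(curve) - shift]])
    ```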

  1. Reliability and scientific use of a surgical planning software for anterior cervical discectomy and fusion (ACDF).

    PubMed

    Barth, Martin; Weiß, Christel; Brenke, Christopher; Schmieder, Kirsten

    2017-04-01

    Software-based planning of a spinal implant inheres in the promise of precision and superior results. The purpose of the study was to analyze the measurement reliability, prognostic value, and scientific use of a surgical planning software in patients receiving anterior cervical discectomy and fusion (ACDF). Lateral neutral, flexion, and extension radiographs of patients receiving tailored cages as suggested by the planning software were available for analysis. Differences of vertebral wedging angles and segmental height of all cervical segments were determined at different timepoints using intraclass correlation coefficients (ICC). Cervical lordosis (C2/C7), segmental heights, global, and segmental range of motion (ROM) were determined at different timepoints. Clinical and radiological variables were correlated 12 months after surgery. 282 radiographs of 35 patients with a mean age of 53.1 ± 12.0 years were analyzed. Measurement of segmental height was highly accurate with an ICC near to 1, but angle measurements showed low ICC values. Likewise, the ICCs of the prognosticated values were low. Postoperatively, there was a significant decrease of segmental height (p < 0.0001) and loss of C2/C7 ROM (p = 0.036). ROM of unfused segments also significantly decreased (p = 0.016). High NDI was associated with low subsidence rates. The surgical planning software showed high accuracy in the measurement of height differences and lower accuracy values with angle measurements. Both the prognosticated height and angle values were arbitrary. Global ROM, ROM of the fused and intact segments, is restricted after ACDF.

  2. Head Circumference as a Useful Surrogate for Intracranial Volume in Older Adults

    PubMed Central

    Hshieh, Tammy T.; Fox, Meaghan L.; Kosar, Cyrus M.; Cavallari, Michele; Guttmann, Charles R.G.; Alsop, David; Marcantonio, Edward R.; Schmitt, Eva M.; Jones, Richard N.; Inouye, Sharon K.

    2015-01-01

    Background: Intracranial volume (ICV) has been proposed as a measure of maximum lifetime brain size. Accurate ICV measures require neuroimaging, which is not always feasible for epidemiologic investigations. We examined head circumference as a useful surrogate for intracranial volume in older adults. Methods: 99 older adults underwent Magnetic Resonance Imaging (MRI). ICV was measured by Statistical Parametric Mapping 8 (SPM8) software or Functional MRI of the Brain Software Library (FSL) extraction with manual editing, typically considered the gold standard. Head circumferences were determined using standardized tape measurement. We examined estimated correlation coefficients between head circumference and the two MRI-based ICV measurements. Results: Head circumference and ICV by SPM8 were moderately correlated (overall r=0.73, men r=0.67, women r=0.63). Head circumference and ICV by FSL were also moderately correlated (overall r=0.69, men r=0.63, women r=0.49). Conclusions: Head circumference measurement was strongly correlated with MRI-derived ICV. Our study presents a simple method to approximate ICV among older patients, which may prove useful as a surrogate for cognitive reserve in large scale epidemiologic studies of cognitive outcomes. This study also suggests the stability of the head circumference correlation with ICV throughout the lifespan. PMID:26631180

  3. A software to measure phase-velocity dispersion from ambient-noise correlations and its application to the SNSN data

    NASA Astrophysics Data System (ADS)

    Sadeghisorkhani, Hamzeh; Gudmundsson, Ólafur

    2017-04-01

    Graphical software for phase-velocity dispersion measurements of surface waves in noise-correlation traces, called GSpecDisp, is presented. It is an interactive environment for the measurements and presentation of the results. It measures phase-velocity dispersion curves in the frequency domain based on matching the real part of the cross-correlation spectrum with the appropriate Bessel function. The inputs are time-domain cross-correlations in SAC format. It can measure two types of phase-velocity dispersion curves: (1) the average phase velocity of a region and (2) single-pair phase velocities. The average phase-velocity dispersion curve of a region can be used as a reference curve to automatically select the dispersion curves from each single-pair cross-correlation in that region. It also allows users to manually refine the selections; therefore, no prior knowledge is needed for an unknown region. GSpecDisp can measure the phase velocity of Rayleigh and Love waves from all possible components of the noise-correlation tensor, including diagonal and off-diagonal components. First, we explain how GSpecDisp is applied to measure phase-velocity dispersion curves. Then, we demonstrate measurement results on synthetic and real data from the Swedish National Seismic Network (SNSN). We compare the results with two other methods of phase-velocity dispersion measurement. Finally, we compare phase-velocity dispersion curves of Rayleigh waves obtained from different components of the correlation tensor.
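
    The core of the method is that, for a diffuse noise field, the real part of the cross-correlation spectrum of two stations a distance r apart behaves as J0(ωr/c(ω)); scanning candidate phase velocities and matching this Bessel prediction to the observed spectrum yields the dispersion curve. The sketch below illustrates that matching step on synthetic data, with a toy dispersion curve and a simple continuity constraint standing in for GSpecDisp's interactive selection; every value is illustrative.

        import numpy as np
        from scipy.special import j0

        r = 20e3                                   # station separation in metres (illustrative)
        freqs = np.linspace(0.05, 0.3, 60)         # Hz
        c_true = 3000.0 + 2000.0 * np.exp(-freqs / 0.1)   # toy dispersion curve (m/s)

        # "Observed" real part of the noise cross-correlation spectrum
        spec = j0(2.0 * np.pi * freqs * r / c_true)

        # Match the Bessel prediction frequency by frequency, keeping candidates
        # near the previous pick so the dispersion curve stays smooth
        c_grid = np.linspace(1500.0, 5500.0, 2000)
        c_est, prev = np.empty_like(freqs), 4250.0
        for i, f in enumerate(freqs):
            misfit = (j0(2.0 * np.pi * f * r / c_grid) - spec[i]) ** 2
            near = np.abs(c_grid - prev) < 150.0
            c_est[i] = c_grid[near][np.argmin(misfit[near])]
            prev = c_est[i]

        # c_est tracks c_true closely (up to local ambiguity of the Bessel match)
        print(np.median(np.abs(c_est - c_true)))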

  4. Processing techniques for software based SAR processors

    NASA Technical Reports Server (NTRS)

    Leung, K.; Wu, C.

    1983-01-01

    Software SAR processing techniques defined to treat Shuttle Imaging Radar-B (SIR-B) data are reviewed. The algorithms are devised for data processing procedure selection, SAR correlation function implementation, multiple array processor utilization, cornerturning, variable reference length azimuth processing, and range migration handling. The Interim Digital Processor (IDP), originally implemented for handling Seasat SAR data, has been adapted for the SIR-B and offers a resolution of 100 km using a processing procedure based on the fast Fourier transform (FFT) fast-correlation approach. Peculiarities of the Seasat SAR data processing requirements are reviewed, along with modifications introduced for the SIR-B. An Advanced Digital SAR Processor (ADSP) is under development for use with the SIR-B in the 1986 time frame as an upgrade for the IDP, which will be in service in 1984-85.
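
    The FFT fast-correlation approach mentioned here is the standard frequency-domain shortcut: correlate a received echo against a reference chirp by multiplying spectra instead of sliding in the time domain. A minimal sketch under stated assumptions follows; the chirp parameters, array sizes, and delay are all illustrative, not SIR-B values.

        import numpy as np

        def fast_correlate(signal, reference):
            """Cross-correlation via the FFT, zero-padded to avoid wrap-around."""
            n = signal.size + reference.size - 1
            nfft = 1 << (n - 1).bit_length()        # next power of two
            S = np.fft.fft(signal, nfft)
            R = np.fft.fft(reference, nfft)
            return np.fft.ifft(S * np.conj(R))[:n]

        # Illustrative linear-FM (chirp) reference and an echo delayed by 400 samples
        fs, T, bw = 1e6, 1e-3, 2e5
        t = np.arange(0, T, 1 / fs)
        ref = np.exp(1j * np.pi * (bw / T) * t**2)
        echo = np.zeros(4096, dtype=complex)
        echo[400:400 + ref.size] = ref

        peak = np.argmax(np.abs(fast_correlate(echo, ref)))
        print(peak)   # ~400: the compressed pulse lands at the echo delay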

  5. Using Group Explorer in teaching abstract algebra

    NASA Astrophysics Data System (ADS)

    Schubert, Claus; Gfeller, Mary; Donohue, Christopher

    2013-04-01

    This study explores the use of Group Explorer in an undergraduate mathematics course in abstract algebra. The visual nature of Group Explorer in representing concepts in group theory is an attractive incentive to use this software in the classroom. However, little is known about students' perceptions of this technology in learning concepts in abstract algebra. A total of 26 participants in an undergraduate course studying group theory were surveyed regarding their experiences using Group Explorer. Findings indicate that all participants believed that the software was beneficial to their learning and described their attitudes regarding the software in terms of using the technology and its helpfulness in learning concepts. A multiple regression analysis reveals that representational fluency of concepts with the software correlated significantly with participants' understanding of group concepts; yet participants' attitudes about Group Explorer and technology in general were not significant factors.

  6. Happy software developers solve problems better: psychological measurements in empirical software engineering

    PubMed Central

    Graziotin, Daniel; Wang, Xiaofeng; Abrahamsson, Pekka

    2014-01-01

    For more than thirty years, it has been claimed that a way to improve software developers’ productivity and software quality is to focus on people and to provide incentives to make developers satisfied and happy. This claim has rarely been verified in software engineering research, which faces an additional challenge in comparison to more traditional engineering fields: software development is an intellectual activity and is dominated by often-neglected human factors (called human aspects in software engineering research). Among the many skills required for software development, developers must possess high analytical problem-solving skills and creativity for the software construction process. According to psychology research, affective states—emotions and moods—deeply influence the cognitive processing abilities and performance of workers, including creativity and analytical problem solving. Nonetheless, little research has investigated the correlation between the affective states, creativity, and analytical problem-solving performance of programmers. This article echoes the call to employ psychological measurements in software engineering research. We report a study with 42 participants to investigate the relationship between the affective states, creativity, and analytical problem-solving skills of software developers. The results offer support for the claim that happy developers are indeed better problem solvers in terms of their analytical abilities. The following contributions are made by this study: (1) providing a better understanding of the impact of affective states on the creativity and analytical problem-solving capacities of developers, (2) introducing and validating psychological measurements, theories, and concepts of affective states, creativity, and analytical-problem-solving skills in empirical software engineering, and (3) raising the need for studying the human factors of software engineering by employing a multidisciplinary viewpoint. PMID:24688866

  7. Happy software developers solve problems better: psychological measurements in empirical software engineering.

    PubMed

    Graziotin, Daniel; Wang, Xiaofeng; Abrahamsson, Pekka

    2014-01-01

    For more than thirty years, it has been claimed that a way to improve software developers' productivity and software quality is to focus on people and to provide incentives to make developers satisfied and happy. This claim has rarely been verified in software engineering research, which faces an additional challenge in comparison to more traditional engineering fields: software development is an intellectual activity and is dominated by often-neglected human factors (called human aspects in software engineering research). Among the many skills required for software development, developers must possess high analytical problem-solving skills and creativity for the software construction process. According to psychology research, affective states (emotions and moods) deeply influence the cognitive processing abilities and performance of workers, including creativity and analytical problem solving. Nonetheless, little research has investigated the correlation between the affective states, creativity, and analytical problem-solving performance of programmers. This article echoes the call to employ psychological measurements in software engineering research. We report a study with 42 participants to investigate the relationship between the affective states, creativity, and analytical problem-solving skills of software developers. The results offer support for the claim that happy developers are indeed better problem solvers in terms of their analytical abilities. The following contributions are made by this study: (1) providing a better understanding of the impact of affective states on the creativity and analytical problem-solving capacities of developers, (2) introducing and validating psychological measurements, theories, and concepts of affective states, creativity, and analytical-problem-solving skills in empirical software engineering, and (3) raising the need for studying the human factors of software engineering by employing a multidisciplinary viewpoint.

  8. Protein evolution analysis of S-hydroxynitrile lyase by complete sequence design utilizing the INTMSAlign software.

    PubMed

    Nakano, Shogo; Asano, Yasuhisa

    2015-02-03

    Development of software and methods for design of complete sequences of functional proteins could contribute to studies of protein engineering and protein evolution. To this end, we developed the INTMSAlign software, and used it to design functional proteins and evaluate their usefulness. The software could assign both consensus and correlation residues of target proteins. We generated three protein sequences with S-selective hydroxynitrile lyase (S-HNL) activity, which we call designed S-HNLs; these proteins folded as efficiently as the native S-HNL. Sequence and biochemical analysis of the designed S-HNLs suggested that accumulation of neutral mutations occurs during the process of S-HNLs evolution from a low-activity form to a high-activity (native) form. Taken together, our results demonstrate that our software and the associated methods could be applied not only to design of complete sequences, but also to predictions of protein evolution, especially within families such as esterases and S-HNLs.
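
    The abstract states that INTMSAlign assigns both consensus and correlation residues from an alignment but does not detail its algorithm; the toy sketch below only illustrates those two quantities in their textbook forms (per-column consensus and pairwise column correlation via mutual information) on a hypothetical alignment, and makes no claim about how INTMSAlign itself computes them.

        import numpy as np
        from collections import Counter
        from itertools import combinations

        msa = ["MKTAY", "MKSAY", "MRTAF", "MKTAY"]   # toy alignment, equal lengths
        cols = [[seq[i] for seq in msa] for i in range(len(msa[0]))]

        # Consensus: the most frequent residue in each column
        consensus = "".join(Counter(c).most_common(1)[0][0] for c in cols)

        def mutual_information(a, b):
            """Mutual information between two alignment columns (natural log)."""
            n = len(a)
            mi = 0.0
            for x in set(a):
                for y in set(b):
                    pxy = sum(1 for u, v in zip(a, b) if (u, v) == (x, y)) / n
                    if pxy:
                        mi += pxy * np.log(pxy / ((a.count(x) / n) * (b.count(y) / n)))
            return mi

        # Rank column pairs by correlation; the top pair co-varies across sequences
        pairs = {(i, j): mutual_information(cols[i], cols[j])
                 for i, j in combinations(range(len(cols)), 2)}
        print(consensus, max(pairs, key=pairs.get))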

  9. Protein evolution analysis of S-hydroxynitrile lyase by complete sequence design utilizing the INTMSAlign software

    NASA Astrophysics Data System (ADS)

    Nakano, Shogo; Asano, Yasuhisa

    2015-02-01

    Development of software and methods for design of complete sequences of functional proteins could contribute to studies of protein engineering and protein evolution. To this end, we developed the INTMSAlign software, and used it to design functional proteins and evaluate their usefulness. The software could assign both consensus and correlation residues of target proteins. We generated three protein sequences with S-selective hydroxynitrile lyase (S-HNL) activity, which we call designed S-HNLs; these proteins folded as efficiently as the native S-HNL. Sequence and biochemical analysis of the designed S-HNLs suggested that accumulation of neutral mutations occurs during the process of S-HNLs evolution from a low-activity form to a high-activity (native) form. Taken together, our results demonstrate that our software and the associated methods could be applied not only to design of complete sequences, but also to predictions of protein evolution, especially within families such as esterases and S-HNLs.

  10. Enhanced cardio vascular image analysis by combined representation of results from dynamic MRI and anatomic CTA

    NASA Astrophysics Data System (ADS)

    Kuehnel, C.; Hennemuth, A.; Oeltze, S.; Boskamp, T.; Peitgen, H.-O.

    2008-03-01

    Diagnosis support in the field of coronary artery disease (CAD) is very complex owing to the numerous symptoms and studies that lead to the final diagnosis. CTA and MRI are on their way to replacing invasive catheter angiography. Thus, there is a need for sophisticated software tools that present the different analysis results and correlate the anatomical and dynamic image information. We introduce a new software assistant for the combined result visualization of CTA and MR images, in which a dedicated concept for the structured presentation of original data, segmentation results, and individual findings is realized. To this end, we define a comprehensive class hierarchy and assign suitable interaction functions. User guidance is coupled as closely as possible with the available data, supporting a straightforward workflow design. The analysis results are extracted from two previously developed software assistants, providing coronary artery analysis and measurements, function analysis, and late enhancement data investigation. As an extension, we introduce a finding concept that directly relates suspicious positions to the underlying data. An affine registration of CT and MR data in combination with the AHA 17-segment model enables the coupling of local findings to positions in all data sets. Furthermore, sophisticated visualization in 2D and 3D and interactive bull's eye plots facilitate a correlation of coronary stenoses and physiology. The software has been evaluated on 20 patient data sets.

  11. Periorbital Biometric Measurements using ImageJ Software: Standardisation of Technique and Assessment of Intra- and Interobserver Variability

    PubMed Central

    Rajyalakshmi, R.; Prakash, Winston D.; Ali, Mohammad Javed; Naik, Milind N.

    2017-01-01

    Purpose: To assess the reliability and repeatability of periorbital biometric measurements using ImageJ software and to assess whether the horizontal visible iris diameter (HVID) serves as a reliable scale for facial measurements. Methods: This was a prospective, single-blind, comparative study. Two clinicians performed 12 periorbital measurements on 100 standardised face photographs. Each individual's HVID was determined by Orbscan IIz and used as a scale for measurements using ImageJ software. All measurements were repeated using the 'average' HVID of the study population as the measurement scale. The intraclass correlation coefficient (ICC) and the Pearson product-moment coefficient were used as statistical tests to analyse the data. Results: The ICC for intra- and interobserver variability ranged over 0.79–0.99 and 0.86–0.99, respectively. Test-retest reliability ranged over 0.66–1.0 and 0.77–0.98, respectively. When the average HVID of the study population was used as the scale, the ICC ranged from 0.83 to 0.99, the test-retest reliability ranged from 0.83 to 0.96, and the measurements correlated well with recordings done with individual Orbscan HVID measurements. Conclusion: Periorbital biometric measurements using ImageJ software are reproducible and repeatable. The average HVID of the population as measured by Orbscan is a reliable scale for facial measurements. PMID:29403183
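
    The calibration step the study describes amounts to using the iris as an in-photo ruler: the known HVID in millimetres divided by its span in pixels gives a scale for every other distance in the image. A minimal sketch, with hypothetical function names and illustrative numbers:

        def mm_per_pixel(hvid_mm, hvid_px):
            """Calibrate the photograph: the iris diameter acts as an in-image ruler."""
            return hvid_mm / hvid_px

        # Example: an Orbscan-measured HVID of 11.8 mm spanning 236 px in the photo
        scale = mm_per_pixel(11.8, 236.0)      # 0.05 mm per pixel
        palpebral_px = 412.0                   # a periorbital distance measured in pixels
        print(palpebral_px * scale)            # -> 20.6 mm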

  12. Handling of computational in vitro/in vivo correlation problems by Microsoft Excel: I. Principles and some general algorithms.

    PubMed

    Langenbucher, Frieder

    2002-01-01

    Most computations in the field of in vitro/in vivo correlations can be handled directly by Excel worksheets, without the need for specialized software. Following a summary of Excel features, applications are illustrated for numerical computation of AUC and Mean, Wagner-Nelson and Loo-Riegelman absorption plots, and polyexponential curve fitting.
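
    These worksheet computations port directly to code. Below is a sketch of two of the named operations under stated assumptions: cumulative trapezoidal AUC, and the Wagner-Nelson fraction absorbed F(t) = (C(t) + ke·AUC(0,t)) / (ke·AUC(0,∞)), assuming a one-compartment model with a known elimination rate constant ke; the concentration data are illustrative.

        import numpy as np

        def cum_auc(t, c):
            """Cumulative trapezoidal AUC at each sampling time."""
            steps = 0.5 * (c[1:] + c[:-1]) * np.diff(t)
            return np.concatenate([[0.0], np.cumsum(steps)])

        def wagner_nelson(t, c, ke, auc_inf=None):
            """Fraction absorbed: F(t) = (C(t) + ke*AUC(0,t)) / (ke*AUC(0,inf))."""
            auc = cum_auc(t, c)
            if auc_inf is None:                 # extrapolate the terminal phase
                auc_inf = auc[-1] + c[-1] / ke
            return (c + ke * auc) / (ke * auc_inf)

        t = np.array([0, 0.5, 1, 2, 4, 8, 12.0])            # h
        c = np.array([0, 4.1, 6.2, 6.8, 4.9, 2.0, 0.8])     # mg/L, illustrative
        print(wagner_nelson(t, c, ke=0.25))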

  13. RELAP-7 Closure Correlations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zou, Ling; Berry, R. A.; Martineau, R. C.

    The RELAP-7 code is the next generation nuclear reactor system safety analysis code being developed at the Idaho National Laboratory (INL). The code is based on INL's modern scientific software development framework, MOOSE (Multi-Physics Object Oriented Simulation Environment). The overall design goal of RELAP-7 is to take advantage of the previous thirty years of advancements in computer architecture, software design, numerical integration methods, and physical models. The end result will be a reactor systems analysis capability that retains and improves upon RELAP5's and TRACE's capabilities and extends their analysis capabilities to all reactor system simulation scenarios. The RELAP-7 code utilizes the well-posed 7-equation two-phase flow model for compressible two-phase flow. Closure models used in the TRACE code have been reviewed and selected to reflect the progress made during the past decades and to provide a basis for the closure correlations implemented in the RELAP-7 code. This document provides a summary of the closure correlations that are currently implemented in the RELAP-7 code. The closure correlations include sub-grid models that describe interactions between the fluids and the flow channel, and interactions between the two phases.
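
    The report itself enumerates which closures were adopted; purely as a generic illustration of what one such closure correlation looks like in code, here is the classic Dittus-Boelter single-phase wall heat-transfer relation. It is a textbook example of a fluid/wall sub-grid closure, not necessarily one of the correlations RELAP-7 implements.

        def dittus_boelter_nu(re, pr, heating=True):
            """Nu = 0.023 Re^0.8 Pr^n, n = 0.4 (heating) or 0.3 (cooling).
            Valid roughly for Re > 1e4, 0.7 < Pr < 160, fully developed turbulent flow."""
            n = 0.4 if heating else 0.3
            return 0.023 * re**0.8 * pr**n

        def wall_htc(re, pr, k_fluid, d_hyd, heating=True):
            """Wall heat-transfer coefficient h = Nu * k / D_h (W/m^2/K)."""
            return dittus_boelter_nu(re, pr, heating) * k_fluid / d_hyd

        # Illustrative water-like conditions in a 1 cm hydraulic-diameter channel
        print(wall_htc(re=5e4, pr=1.0, k_fluid=0.6, d_hyd=0.01))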

  14. Proposed software system for atomic-structure calculation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fischer, C.F.

    1981-07-01

    Atomic structure calculations are understood well enough that, at a routine level, an atomic structure software package can be developed. At the Atomic Physics Conference in Riga in 1978, L.V. Chernysheva and M.Y. Amusia of Leningrad University presented a paper on software for atomic calculations. Their system, called ATOM, is based on the Hartree-Fock approximation, and correlation is included within the framework of the RPAE. Energy level calculations, transition probabilities, photo-ionization cross sections, and electron scattering cross sections are some of the physical properties that can be evaluated by their system. The MCHF method, together with CI techniques and the Breit-Pauli approximation, also provides a sound theoretical basis for atomic structure calculations.

  15. Comparison of cyclic correlation algorithm implemented in MATLAB and Python

    NASA Astrophysics Data System (ADS)

    Carr, Richard; Whitney, James

    Simulation is a necessary step for all engineering projects. Simulation gives engineers an approximation of how their devices will perform under different circumstances, without having to build, or before building, a physical prototype. This is especially true for space-bound devices, e.g., space communication systems, where the impact of system malfunction or failure is several orders of magnitude over that of terrestrial applications. Therefore, having a reliable simulation tool is key in developing these devices and systems. MathWorks' MATrix LABoratory (MATLAB) is matrix-based software used by scientists and engineers to solve problems and perform complex simulations. MATLAB has applications in a wide variety of fields, including communications, signal processing, image processing, mathematics, economics, and physics. Because of its many uses, MATLAB has become the preferred software for many engineers; it is also very expensive, especially for students and startups. One alternative to MATLAB is Python. Python is a powerful, easy-to-use, open-source programming environment that can be used to perform many of the same functions as MATLAB. The Python programming environment has been steadily gaining popularity in niche programming circles. While not as many functions are included in the software as in MATLAB, many open-source functions have been developed and are available to download for free. This paper illustrates how Python can implement the cyclic correlation algorithm and compares the results to the cyclic correlation algorithm implemented in the MATLAB environment. Some of the characteristics to be compared are the accuracy and precision of the results and the length of the programs. The paper demonstrates that Python is capable of performing simulations of complex algorithms such as cyclic correlation.
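
    Cyclic (circular) correlation is a one-liner in either environment via the convolution theorem: the inverse DFT of one spectrum times the conjugate of the other. A NumPy sketch equivalent to the usual MATLAB idiom ifft(fft(x).*conj(fft(y))); the signal contents are illustrative.

        import numpy as np

        def cyclic_correlation(x, y):
            """Circular cross-correlation of two equal-length sequences via the DFT."""
            assert len(x) == len(y)
            return np.fft.ifft(np.fft.fft(x) * np.conj(np.fft.fft(y)))

        # A sequence correlated against a circularly shifted copy of itself
        rng = np.random.default_rng(1)
        x = rng.normal(size=256)
        y = np.roll(x, 37)
        lag = np.argmax(np.abs(cyclic_correlation(y, x)))
        print(lag)   # 37: the correlation peak recovers the circular shift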

  16. Software-based on-site estimation of fractional flow reserve using standard coronary CT angiography data.

    PubMed

    De Geer, Jakob; Sandstedt, Mårten; Björkholm, Anders; Alfredsson, Joakim; Janzon, Magnus; Engvall, Jan; Persson, Anders

    2016-10-01

    The significance of a coronary stenosis can be determined by measuring the fractional flow reserve (FFR) during invasive coronary angiography. Recently, methods have been developed which claim to be able to estimate FFR using image data from standard coronary computed tomography angiography (CCTA) exams. To evaluate the accuracy of non-invasively computed fractional flow reserve (cFFR) from CCTA, a total of 23 vessels in 21 patients who had undergone both CCTA and invasive angiography with FFR measurement were evaluated using a cFFR software prototype. The cFFR results were compared to the invasively obtained FFR values. Correlation was calculated using Spearman's rank correlation, and agreement using the intraclass correlation coefficient (ICC). Sensitivity, specificity, accuracy, negative predictive value, and positive predictive value for significant stenosis (defined as both FFR ≤0.80 and FFR ≤0.75) were calculated. The mean cFFR value for the whole group was 0.81, and the corresponding mean invasive FFR value was 0.84. The cFFR sensitivity for significant stenosis (FFR ≤0.80/0.75) on a per-lesion basis was 0.83/0.80, specificity was 0.76/0.89, and accuracy 0.78/0.87. The positive predictive value was 0.56/0.67, and the negative predictive value was 0.93/0.94. The Spearman rank correlation coefficient was ρ = 0.77 (P < 0.001), and the ICC = 0.73 (P < 0.001). This particular CCTA-based cFFR software prototype allows for a rapid, non-invasive, on-site evaluation of cFFR. The results are encouraging, and cFFR may in the future be of help in the triage to invasive coronary angiography. © The Foundation Acta Radiologica 2015.
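
    The per-lesion figures follow from a 2x2 table at the chosen FFR cutoff. As a check, the sketch below computes them from one set of counts that is consistent with the reported FFR ≤0.80 results for 23 vessels; the counts are reverse-engineered for illustration, not taken from the paper.

        def diagnostic_metrics(tp, fp, tn, fn):
            """Per-lesion accuracy metrics from a 2x2 table at a fixed cutoff."""
            return {
                "sensitivity": tp / (tp + fn),
                "specificity": tn / (tn + fp),
                "ppv": tp / (tp + fp),
                "npv": tn / (tn + fn),
                "accuracy": (tp + tn) / (tp + fp + tn + fn),
            }

        # 5+4+13+1 = 23 vessels; reproduces 0.83/0.76/0.56/0.93/0.78
        print(diagnostic_metrics(tp=5, fp=4, tn=13, fn=1))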

  17. Measurement of the area of venous ulcers using two software programs.

    PubMed

    Eberhardt, Thaís Dresch; Lima, Suzinara Beatriz Soares de; Lopes, Luis Felipe Dias; Borges, Eline de Lima; Weiller, Teresinha Heck; Fonseca, Graziele Gorete Portella da

    2016-12-19

    To compare the measurement of venous ulcer area using the AutoCAD(r) and Image Tool software programs, an assessment of test reproducibility was conducted in an angiology clinic of a university hospital. Data were collected from 21 patients with venous ulcers, in the period from March to July of 2015, using a collection form and photographs of the wounds. Five nurses (evaluators) of the hospital skin wound study group participated. The wounds were measured using both software programs. Data were analyzed using the intraclass correlation coefficient, the concordance correlation coefficient, and Bland-Altman analysis. The study met the ethical aspects in accordance with current legislation. The size of the ulcers varied widely, however, without significant difference between the measurements; an excellent intraclass and concordance correlation was found between both software programs, which seem to be more accurate when measuring a wound area >10 cm². The use of both software programs is appropriate for measurement of venous ulcers, appearing to be more accurate when used to measure a wound area >10 cm².
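
    Agreement between the two programs is summarized with the Bland-Altman bias and 95% limits of agreement; a minimal sketch of that computation follows (the paired area values are illustrative, not the study's data).

        import numpy as np

        def bland_altman(a, b):
            """Bias and 95% limits of agreement between two measurement methods."""
            a, b = np.asarray(a, float), np.asarray(b, float)
            diff = a - b
            bias = diff.mean()
            half_width = 1.96 * diff.std(ddof=1)
            return bias, bias - half_width, bias + half_width

        autocad = [12.4, 8.1, 15.3, 22.0, 5.6]     # cm^2, illustrative areas
        imagetool = [12.1, 8.4, 15.0, 21.5, 5.9]
        print(bland_altman(autocad, imagetool))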

  18. Evaluation of pharyngeal space and its correlation with mandible and hyoid bone in patients with different skeletal classes and facial types.

    PubMed

    Nejaim, Yuri; Aps, Johan K M; Groppo, Francisco Carlos; Haiter Neto, Francisco

    2018-06-01

    The purpose of this article was to evaluate the pharyngeal space volume, and the size and shape of the mandible and the hyoid bone, as well as their relationships, in patients with different facial types and skeletal classes. Furthermore, we estimated the volume of the pharyngeal space with a formula using only linear measurements. A total of 161 i-CAT Next Generation (Imaging Sciences International, Hatfield, Pa) cone-beam computed tomography images (80 men, 81 women; ages, 21-58 years; mean age, 27 years) were retrospectively studied. Skeletal class and facial type were determined for each patient from multiplanar reconstructions using the NemoCeph software (Nemotec, Madrid, Spain). Linear and angular measurements were performed using 3D imaging software (version 3.4.3; Carestream Health, Rochester, NY), and volumetric analysis of the pharyngeal space was carried out with ITK-SNAP (version 2.4.0; Cognitica, Philadelphia, Pa) segmentation software. For the statistics, analysis of variance and the Tukey test with a significance level of 0.05, Pearson correlation, and linear regression were used. The pharyngeal space volume, when correlated with mandible and hyoid bone linear and angular measurements, showed significant correlations with skeletal class or facial type. The linear regression performed to estimate the volume of the pharyngeal space showed an R of 0.92 and an adjusted R² of 0.8362. There were significant correlations between pharyngeal space volume, and the mandible and hyoid bone measurements, suggesting that the stomatognathic system should be evaluated in an integral and nonindividualized way. Furthermore, it was possible to develop a linear regression model, resulting in a useful formula for estimating the volume of the pharyngeal space. Copyright © 2018 American Association of Orthodontists. Published by Elsevier Inc. All rights reserved.

  19. Parallel Planes Information Visualization

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bush, Brian

    2015-12-26

    This software presents a user-provided multivariate dataset as an interactive three-dimensional visualization so that the user can explore the correlation between variables in the observations and the distribution of observations among the variables.

  20. Arsenic removal from contaminated groundwater by membrane-integrated hybrid plant: optimization and control using Visual Basic platform.

    PubMed

    Chakrabortty, S; Sen, M; Pal, P

    2014-03-01

    A simulation software package (ARRPA) has been developed on the Microsoft Visual Basic platform for optimization and control of a novel membrane-integrated arsenic separation plant, against the backdrop of an absence of such software. The user-friendly, menu-driven software is based on a dynamic linearized mathematical model developed for the hybrid treatment scheme. The model captures the chemical kinetics in the pre-treating chemical reactor and the separation and transport phenomena involved in nanofiltration. The software has been validated through extensive experimental investigations. The agreement between the outputs from the computer simulation program and the experimental findings is excellent and consistent under varying operating conditions, reflecting the high degree of accuracy and reliability of the software. High values of the overall correlation coefficient (R² = 0.989) and the Willmott d-index (0.989) are indicators of the capability of the software in analyzing performance of the plant. The software permits pre-analysis and manipulation of input data, helps in optimization, and exhibits the performance of an integrated plant visually on a graphical platform. Performance analysis of the whole system as well as the individual units is possible using the tool. The software, the first of its kind in its domain and in the well-known Microsoft Excel environment, is likely to be very useful in successful design, optimization, and operation of an advanced hybrid treatment plant for removal of arsenic from contaminated groundwater.
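
    The two goodness-of-fit statistics quoted here are both short computations. A sketch under the standard definitions, with R² as the squared Pearson correlation and Willmott's index d = 1 − Σ(P−O)² / Σ(|P−Ō| + |O−Ō|)²; the data are illustrative.

        import numpy as np

        def willmott_d(pred, obs):
            """Willmott's index of agreement, d in [0, 1]."""
            pred, obs = np.asarray(pred, float), np.asarray(obs, float)
            om = obs.mean()
            return 1.0 - np.sum((pred - obs) ** 2) / np.sum(
                (np.abs(pred - om) + np.abs(obs - om)) ** 2)

        sim = [0.91, 0.85, 0.78, 0.66, 0.52]    # simulated arsenic removal fractions
        meas = [0.90, 0.87, 0.76, 0.68, 0.50]   # measured values, illustrative
        r2 = np.corrcoef(sim, meas)[0, 1] ** 2  # squared Pearson correlation
        print(r2, willmott_d(sim, meas))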

  1. SMC Message Browser Projects

    NASA Technical Reports Server (NTRS)

    Wichmann, Benjamin C.

    2013-01-01

    I work directly with the System Monitoring and Control (SMC) software engineers who develop, test, and release custom and commercial software in support of the Kennedy Space Center Spaceport Command and Control System (SCCS). SMC uses Commercial Off-The-Shelf (COTS) Enterprise Management Systems (EMS) software, which provides a centralized subsystem for configuring, monitoring, and controlling SCCS hardware and software used in the Control Rooms. There are multiple projects being worked on using the COTS EMS software. I am currently working with the HP Operations Manager for UNIX (OMU) software, which allows Master Console Operators (MCO) to access, view, and interpret messages regarding the status of the SCCS hardware and software. The OMU message browser gets cluttered with messages, which can make it difficult for the MCO to manage. My main project involves determining ways to reduce the number of messages being displayed in the OMU message browser. I plan to accomplish this task in two different ways: (1) by correlating multiple messages into one single message being displayed and (2) by creating policies that will determine the significance of each message and whether or not it needs to be displayed to the MCO. The core idea is to lessen the number of messages being sent to the OMU message browser so the MCO can use it more effectively.

  2. Software for the EVLA

    NASA Astrophysics Data System (ADS)

    Butler, Bryan J.; van Moorsel, Gustaaf; Tody, Doug

    2004-09-01

    The Expanded Very Large Array (EVLA) project is the next generation instrument for high resolution long-millimeter to short-meter wavelength radio astronomy. It is currently funded by NSF, with completion scheduled for 2012. The EVLA will upgrade the VLA with new feeds, receivers, data transmission hardware, a correlator, and a new software system to enable the instrument to achieve its full potential. This software includes both that required for controlling and monitoring the instrument and that involved with the scientific dataflow. We concentrate here on a portion of the dataflow software, including: proposal preparation, submission, and handling; observation preparation, scheduling, and remote monitoring; data archiving; and data post-processing, including both automated (pipeline) and manual processing. The primary goals of the software are: to maximize the scientific return of the EVLA; to provide ease of use for both novices and experts; and to exploit commonality amongst all NRAO telescopes where possible. This last point is both a bane and a blessing: we are not at liberty to do whatever we want in the software, but on the other hand we may borrow from other projects (notably ALMA and GBT) where appropriate. The software design methodology includes detailed initial use-cases and requirements from the scientists, intimate interaction between the scientists and the programmers during design and implementation, and a thorough testing and acceptance plan.

  3. Tool development in threat assessment: syntax regularization and correlative analysis. Final report Task I and Task II, November 21, 1977-May 21, 1978. [Linguistic analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Miron, M.S.; Christopher, C.; Hirshfield, S.

    1978-05-01

    Psycholinguistics provides crisis managers in nuclear threat incidents with a quantitative methodology which can aid in the determination of threat credibility, authorship identification, and perpetrator apprehension. The objective of this contract is to improve and enhance present psycholinguistic software systems by means of newly-developed, computer-automated techniques which significantly extend the technology of automated content and stylistic analysis of nuclear threats. In accordance with this overall objective, the first two contract Tasks have been completed and are reported on in this document. The first Task specifies the development of software support for the purpose of syntax regularization of vocabulary to root form. The second calls for the exploration and development of alternative approaches to correlative analysis of vocabulary usage.

  4. Shadow analysis via the C+K Visioline: A technical note.

    PubMed

    Houser, T; Zerweck, C; Grove, G; Wickett, R

    2017-11-01

    This research investigated the ability of shadow analysis (via the Courage + Khazaka Visioline and Image Pro Premiere 9.0 software) to accurately assess the differences in skin topography associated with photoaging. Analyses were performed on impressions collected from a microfinish comparator scale (GAR Electroforming) as well as a series of impressions collected from the crow's feet region of 9 women who represent each point on the Zerweck Crow's Feet classification scale. Analyses were performed using a Courage + Khazaka Visioline VL 650 as well as Image Pro Premiere 9.0 software. Shadow analysis showed an ability to accurately measure groove depth when measuring impressions collected from grooves of known depth. Several shadow analysis parameters showed a correlation with the expert grader ratings of crow's feet when averaging measurements taken from the North and South directions. The Max Depth parameter in particular showed a strong correlation with the expert grader's ratings, which improved when a more sophisticated analysis was performed using Image Pro Premiere. When used properly, shadow analysis is effective at accurately measuring skin surface impressions for differences in skin topography. Shadow analysis is shown to accurately assess the differences across a range of crow's feet severity, correlating to a 0-8 grader scale. The Visioline VL 650 is a good tool for this measurement, with room for improvement in analysis, which can be achieved through third-party image analysis software. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  5. Studying Axon-Astrocyte Functional Interactions by 3D Two-Photon Ca2+ Imaging: A Practical Guide to Experiments and "Big Data" Analysis.

    PubMed

    Savtchouk, Iaroslav; Carriero, Giovanni; Volterra, Andrea

    2018-01-01

    Recent advances in fast volumetric imaging have enabled rapid generation of large amounts of multi-dimensional functional data. While many computer frameworks exist for data storage and analysis of the multi-gigabyte Ca2+ imaging experiments in neurons, they are less useful for analyzing Ca2+ dynamics in astrocytes, where transients do not follow a predictable spatio-temporal distribution pattern. In this manuscript, we provide a detailed protocol and commentary for recording and analyzing three-dimensional (3D) Ca2+ transients through time in GCaMP6f-expressing astrocytes of adult brain slices in response to axonal stimulation, using our recently developed tools to perform interactive exploration, filtering, and time-correlation analysis of the transients. In addition to the protocol, we release our in-house software tools and discuss parameters pertinent to conducting axonal stimulation/response experiments across various brain regions and conditions. Our software tools are available from the Volterra Lab webpage at https://wwwfbm.unil.ch/dnf/group/glia-an-active-synaptic-partner/member/volterra-andrea-volterra in the form of software plugins for ImageJ (NIH), a de facto standard in scientific image analysis. Three programs are available: MultiROI_TZ_profiler for interactive graphing of several movable ROIs simultaneously, Gaussian_Filter5D for Gaussian filtering in several dimensions, and Correlation_Calculator for computing various cross-correlation parameters on voxel collections through time.
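
    The named plugins are ImageJ tools; as a language-neutral illustration of the two operations they perform (multi-dimensional Gaussian smoothing and voxelwise time-correlation against a reference trace), here is a hedged NumPy/SciPy sketch on a synthetic (t, z, y, x) stack. The array shape, sigmas, and seed ROI are all illustrative and imply nothing about the plugins' internals.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        rng = np.random.default_rng(0)
        stack = rng.normal(size=(120, 8, 32, 32))        # toy (t, z, y, x) recording

        # Smooth jointly in time and space (Gaussian filtering in several dimensions)
        smooth = gaussian_filter(stack, sigma=(2.0, 1.0, 1.5, 1.5))

        # Correlate every voxel's time course against a seed ROI trace
        seed = smooth[:, 4, 10:14, 10:14].mean(axis=(1, 2))
        seed_z = (seed - seed.mean()) / seed.std()
        vox_z = (smooth - smooth.mean(0)) / smooth.std(0)
        corr = np.tensordot(seed_z, vox_z, axes=(0, 0)) / seed.size
        print(corr.shape)    # (8, 32, 32): a Pearson-r map over the volume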

  6. Ventilation/perfusion SPECT/CT in patients with pulmonary emphysema. Evaluation of software-based analysing.

    PubMed

    Schreiter, V; Steffen, I; Huebner, H; Bredow, J; Heimann, U; Kroencke, T J; Poellinger, A; Doellinger, F; Buchert, R; Hamm, B; Brenner, W; Schreiter, N F

    2015-01-01

    The purpose of this study was to evaluate the reproducibility of a new software-based analysing system for ventilation/perfusion single-photon emission computed tomography/computed tomography (V/P SPECT/CT) in patients with pulmonary emphysema and to compare it to visual interpretation. 19 patients (mean age: 68.1 years) with pulmonary emphysema who underwent V/P SPECT/CT were included. Data were analysed by two independent observers using visual interpretation (VI) and a software-based analysis system (SBAS). For the SBAS, PMOD version 3.4 (PMOD Technologies Ltd, Zurich, Switzerland) was used to assess counts and volume per lung lobe/per lung and to calculate the count density per lung and lobe, the ratio of counts, and the ratio of count density. VI was performed using a visual scale to assess the mean counts per lung lobe. Interobserver variability and association for SBAS and VI were analysed using Spearman's rho correlation coefficient. Interobserver agreement correlated highly in perfusion (rho: 0.982, 0.957, 0.90, 0.979) and ventilation (rho: 0.972, 0.924, 0.941, 0.936) for counts/count density per lobe and ratio of counts/count density in SBAS. Interobserver agreement correlated clearly for perfusion (rho: 0.655) and weakly for ventilation (rho: 0.458) in VI. SBAS provides more reproducible measures than VI for the relative tracer uptake in V/P SPECT/CT in patients with pulmonary emphysema. However, SBAS has to be improved for routine clinical use.

  7. A New Measurement Technique of the Characteristics of Nutrient Artery Canals in Tibias Using Materialise's Interactive Medical Image Control System Software

    PubMed Central

    Li, Jiantao; Zhang, Hao; Yin, Peng; Su, Xiuyun; Zhao, Zhe; Zhou, Jianfeng; Li, Chen; Li, Zhirui; Zhang, Lihai; Tang, Peifu

    2015-01-01

    We established a novel measurement technique to evaluate the anatomic information of nutrient artery canals using Mimics (Materialise's Interactive Medical Image Control System) software, which will provide full knowledge of nutrient artery canals to assist in the diagnosis of longitudinal fractures of the tibia and in choosing an optimal therapy. We collected Digital Imaging and Communications in Medicine (DICOM) format data of 199 patients hospitalized in our hospital. Three-dimensional models of all tibias were reconstructed in Mimics. In the 3-matic software, we marked five points on each tibia, located at the intercondylar eminence, the tibial tuberosity, the outer ostium, the inner ostium, and the bottom of the medial malleolus. We then recorded the Z-coordinate values of the five points and performed statistical analysis. Our results indicate that the foramen was absent in 9 (2.3%) tibias, and 379 (95.2%) tibias had a single nutrient foramen. Double foramina were observed in 10 (2.5%) tibias. The mean tibia length was 358 ± 22 mm. The mean foraminal index was 31.8% ± 3%. The mean distance between the tibial tuberosity and the foramen (TFD) was 66 ± 12 mm. The foraminal index has a significant positive correlation with TFD (r = 0.721, P < 0.01). The length of the nutrient artery canals has a significant negative correlation with TFD (r = −0.340, P < 0.01) and with the foraminal index (r = −0.541, P < 0.01). PMID:26788498

  8. Monitoring of noninvasive ventilation by built-in software of home bilevel ventilators: a bench study.

    PubMed

    Contal, Olivier; Vignaux, Laurence; Combescure, Christophe; Pepin, Jean-Louis; Jolliet, Philippe; Janssens, Jean-Paul

    2012-02-01

    Current bilevel positive-pressure ventilators for home noninvasive ventilation (NIV) provide physicians with software that records items important for patient monitoring, such as compliance, tidal volume (Vt), and leaks. However, to our knowledge, the validity of this information has not yet been independently assessed. Testing was done for seven home ventilators on a bench model adapted to simulate NIV and generate unintentional leaks (ie, other than of the mask exhalation valve). Five levels of leaks were simulated using a computer-driven solenoid valve (0-60 L/min) at different levels of inspiratory pressure (15 and 25 cm H(2)O) and at a fixed expiratory pressure (5 cm H(2)O), for a total of 10 conditions. Bench data were compared with results retrieved from ventilator software for leaks and Vt. For assessing leaks, three of the devices tested were highly reliable, with a small bias (0.3-0.9 L/min), narrow limits of agreement (LA), and high correlations (R(2), 0.993-0.997) when comparing ventilator software and bench results; conversely, for four ventilators, bias ranged from -6.0 L/min to -25.9 L/min, exceeding -10 L/min for two devices, with wide LA and lower correlations (R(2), 0.70-0.98). Bias for leaks increased markedly with the importance of leaks in three devices. Vt was underestimated by all devices, and bias (range, 66-236 mL) increased with higher insufflation pressures. Only two devices had a bias < 100 mL, with all testing conditions considered. Physicians monitoring patients who use home ventilation must be aware of differences in the estimation of leaks and Vt by ventilator software. Also, leaks are reported in different ways according to the device used.

  9. Comparison of Perfusion CT Software to Predict the Final Infarct Volume After Thrombectomy.

    PubMed

    Austein, Friederike; Riedel, Christian; Kerby, Tina; Meyne, Johannes; Binder, Andreas; Lindner, Thomas; Huhndorf, Monika; Wodarg, Fritz; Jansen, Olav

    2016-09-01

    Computed tomographic perfusion represents an interesting physiological imaging modality for selecting patients for reperfusion therapy in acute ischemic stroke. The purpose of our study was to determine the accuracy of different commercial perfusion CT software packages (Philips (A), Siemens (B), and RAPID (C)) in predicting the final infarct volume (FIV) after mechanical thrombectomy. Single-institutional computed tomographic perfusion data from 147 mechanically recanalized acute ischemic stroke patients were postprocessed. Ischemic core and FIV were compared with respect to the thrombolysis in cerebral infarction (TICI) score and the time interval to reperfusion. FIV was measured at follow-up imaging between days 1 and 8 after stroke. In 118 successfully recanalized patients (TICI 2b/3), a moderately to strongly positive correlation was observed between ischemic core and FIV. The highest accuracy and best correlation were seen in early and fully recanalized patients (Pearson r for A=0.42, B=0.64, and C=0.83; P<0.001). Bland-Altman plots and boxplots demonstrate smaller ranges for package C than for A and B. Significant differences were found between the packages with respect to over- and underestimation of the ischemic core. Package A, compared with B and C, estimated more than twice as many patients with a malignant stroke profile (P<0.001). Package C best predicted hypoperfusion volume in nonsuccessfully recanalized patients. Our study demonstrates the best accuracy and approximation between the results of a fully automated software package (RAPID) and FIV, especially in early and fully recanalized patients. Furthermore, this software package overestimated the FIV to a significantly lower degree and estimated a malignant mismatch profile less often than the other software. © 2016 American Heart Association, Inc.

  10. Comparison of in-hospital versus 30-day mortality assessments for selected medical conditions.

    PubMed

    Borzecki, Ann M; Christiansen, Cindy L; Chew, Priscilla; Loveland, Susan; Rosen, Amy K

    2010-12-01

    In-hospital mortality measures such as the Agency for Healthcare Research and Quality (AHRQ) Inpatient Quality Indicators (IQIs) are easily derived using hospital discharge abstracts and publicly available software. However, hospital assessments based on a 30-day postadmission interval might be more accurate given potential differences in facility discharge practices. To compare in-hospital and 30-day mortality rates for 6 medical conditions using the AHRQ IQI software. We used IQI software (v3.1) and 2004-2007 Veterans Health Administration (VA) discharge and Vital Status files to derive 4-year facility-level in-hospital and 30-day observed mortality rates and observed/expected ratios (O/Es) for admissions with a principal diagnosis of acute myocardial infarction, congestive heart failure, stroke, gastrointestinal hemorrhage, hip fracture, and pneumonia. We standardized software-calculated O/Es to the VA population and compared O/Es and outlier status across sites using correlation, observed agreement, and kappas. Of 119 facilities, in-hospital versus 30-day mortality O/E correlations were generally high (median: r = 0.78; range: 0.31-0.86). Examining outlier status, observed agreement was high (median: 84.7%, 80.7%-89.1%). Kappas showed at least moderate agreement (k > 0.40) for all indicators except stroke and hip fracture (k ≤ 0.22). Across indicators, few sites changed from a high to nonoutlier or low outlier, or vice versa (median: 10, range: 7-13). The AHRQ IQI software can be easily adapted to generate 30-day mortality rates. Although 30-day mortality has better face validity as a hospital performance measure than in-hospital mortality, site assessments were similar despite the definition used. Thus, the measure selected for internal benchmarking should primarily depend on the healthcare system's data linkage capabilities.

  11. An introduction to the interim digital SAR processor and the characteristics of the associated Seasat SAR imagery

    NASA Technical Reports Server (NTRS)

    Wu, C.; Barkan, B.; Huneycutt, B.; Leang, C.; Pang, S.

    1981-01-01

    Basic engineering data regarding the Interim Digital SAR Processor (IDP) and the digitally correlated Seasat synthetic aperture radar (SAR) imagery are presented. The correlation function and the IDP hardware/software configuration are described, and a preliminary performance assessment is presented. The geometric and radiometric characteristics, with special emphasis on those peculiar to the IDP-produced imagery, are described.

  12. Development of an automated asbestos counting software based on fluorescence microscopy.

    PubMed

    Alexandrov, Maxym; Ichida, Etsuko; Nishimura, Tomoki; Aoki, Kousuke; Ishida, Takenori; Hirota, Ryuichi; Ikeda, Takeshi; Kawasaki, Tetsuo; Kuroda, Akio

    2015-01-01

    An emerging alternative to the commonly used analytical methods for asbestos analysis is fluorescence microscopy (FM), which relies on highly specific asbestos-binding probes to distinguish asbestos from interfering non-asbestos fibers. However, all types of microscopic asbestos analysis require laborious examination of a large number of fields of view and are prone to subjective errors and large variability between asbestos counts by different analysts and laboratories. A possible solution to these problems is automated counting of asbestos fibers by image analysis software, which would lower the cost and increase the reliability of asbestos testing. This study seeks to develop fiber recognition and counting software for FM-based asbestos analysis. We discuss the main features of the developed software and the results of its testing. Software testing showed good correlation between automated and manual counts for samples with medium and high fiber concentrations. At low fiber concentrations, the automated counts were less accurate, leading us to implement a correction mode for automated counts. While the full automation of asbestos analysis would require further improvements in the accuracy of fiber identification, the developed software could already assist professional asbestos analysts and record detailed fiber dimensions for use in epidemiological research.

  13. The Supernovae Analysis Application (SNAP)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bayless, Amanda J.; Fryer, Christopher Lee; Wollaeger, Ryan Thomas

    The SuperNovae Analysis aPplication (SNAP) is a new tool for the analysis of SN observations and validation of SN models. SNAP consists of a publicly available relational database with observational light curve, theoretical light curve, and correlation table sets with statistical comparison software, and a web interface available to the community. The theoretical models are intended to span a gridded range of parameter space. The goal is to have users upload new SN models or new SN observations and run the comparison software to determine correlations via the website. There are problems looming on the horizon that SNAP is beginning to solve. For example, large surveys will discover thousands of SNe annually. Frequently, the parameter space of a new SN event is unbounded. SNAP will be a resource to constrain parameters and determine if an event needs follow-up without spending resources to create new light curve models from scratch. Second, there is no rapidly available, systematic way to determine degeneracies between parameters, or even what physics is needed to model a realistic SN. The correlations made within the SNAP system are beginning to solve these problems.

  14. The Supernovae Analysis Application (SNAP)

    DOE PAGES

    Bayless, Amanda J.; Fryer, Christopher Lee; Wollaeger, Ryan Thomas; ...

    2017-09-06

    The SuperNovae Analysis aPplication (SNAP) is a new tool for the analysis of SN observations and validation of SN models. SNAP consists of a publicly available relational database with observational light curve, theoretical light curve, and correlation table sets with statistical comparison software, and a web interface available to the community. The theoretical models are intended to span a gridded range of parameter space. The goal is to have users upload new SN models or new SN observations and run the comparison software to determine correlations via the website. There are problems looming on the horizon that SNAP is beginning to solve. For example, large surveys will discover thousands of SNe annually. Frequently, the parameter space of a new SN event is unbounded. SNAP will be a resource to constrain parameters and determine if an event needs follow-up without spending resources to create new light curve models from scratch. Second, there is no rapidly available, systematic way to determine degeneracies between parameters, or even what physics is needed to model a realistic SN. The correlations made within the SNAP system are beginning to solve these problems.

  15. The Supernovae Analysis Application (SNAP)

    NASA Astrophysics Data System (ADS)

    Bayless, Amanda J.; Fryer, Chris L.; Wollaeger, Ryan; Wiggins, Brandon; Even, Wesley; de la Rosa, Janie; Roming, Peter W. A.; Frey, Lucy; Young, Patrick A.; Thorpe, Rob; Powell, Luke; Landers, Rachel; Persson, Heather D.; Hay, Rebecca

    2017-09-01

    The SuperNovae Analysis aPplication (SNAP) is a new tool for the analysis of SN observations and validation of SN models. SNAP consists of a publicly available relational database with observational light curve, theoretical light curve, and correlation table sets with statistical comparison software, and a web interface available to the community. The theoretical models are intended to span a gridded range of parameter space. The goal is to have users upload new SN models or new SN observations and run the comparison software to determine correlations via the website. There are problems looming on the horizon that SNAP is beginning to solve. For example, large surveys will discover thousands of SNe annually. Frequently, the parameter space of a new SN event is unbounded. SNAP will be a resource to constrain parameters and determine if an event needs follow-up without spending resources to create new light curve models from scratch. Second, there is no rapidly available, systematic way to determine degeneracies between parameters, or even what physics is needed to model a realistic SN. The correlations made within the SNAP system are beginning to solve these problems.

  16. On Fitting Generalized Linear Mixed-effects Models for Binary Responses using Different Statistical Packages

    PubMed Central

    Zhang, Hui; Lu, Naiji; Feng, Changyong; Thurston, Sally W.; Xia, Yinglin; Tu, Xin M.

    2011-01-01

    Summary: The generalized linear mixed-effects model (GLMM) is a popular paradigm to extend models for cross-sectional data to a longitudinal setting. When applied to modeling binary responses, different software packages and even different procedures within a package may give quite different results. In this report, we describe the statistical approaches that underlie these different procedures and discuss their strengths and weaknesses when applied to fit correlated binary responses. We then illustrate these considerations by applying these procedures implemented in some popular software packages to simulated and real study data. Our simulation results indicate a lack of reliability for most of the procedures considered, which carries significant implications for applying such popular software packages in practice. PMID:21671252
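
    Simulated data for such comparisons can be generated from the random-intercept logistic model itself, where within-subject correlation enters through a shared intercept. A minimal sketch with illustrative parameter values (the paper's simulation design may differ); the resulting response matrix would then be fitted with whichever package or procedure is under study.

        import numpy as np

        rng = np.random.default_rng(42)
        n_subj, n_rep = 200, 5
        sigma_u = 1.0                    # random-intercept SD drives within-subject correlation
        beta0, beta1 = -0.5, 0.8         # fixed effects

        u = rng.normal(0.0, sigma_u, n_subj)             # subject-level random intercepts
        x = rng.normal(size=(n_subj, n_rep))             # a covariate per observation
        eta = beta0 + beta1 * x + u[:, None]             # linear predictor
        y = rng.binomial(1, 1.0 / (1.0 + np.exp(-eta)))  # correlated binary responses

        print(y.mean())   # marginal rate differs from expit(beta0) under the GLMM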

  17. The stability and validity of automated vocal analysis in preverbal preschoolers with autism spectrum disorder.

    PubMed

    Woynaroski, Tiffany; Oller, D Kimbrough; Keceli-Kaysili, Bahar; Xu, Dongxin; Richards, Jeffrey A; Gilkerson, Jill; Gray, Sharmistha; Yoder, Paul

    2017-03-01

    Theory and research suggest that vocal development predicts "useful speech" in preschoolers with autism spectrum disorder (ASD), but conventional methods for measurement of vocal development are costly and time consuming. This longitudinal correlational study examines the reliability and validity of several automated indices of vocalization development relative to an index derived from human coded, conventional communication samples in a sample of preverbal preschoolers with ASD. Automated indices of vocal development were derived using software that is presently "in development" and/or only available for research purposes and using commercially available Language ENvironment Analysis (LENA) software. Indices of vocal development that could be derived using the software available for research purposes: (a) were highly stable with a single day-long audio recording, (b) predicted future spoken vocabulary to a degree that was nonsignificantly different from the index derived from conventional communication samples, and (c) continued to predict future spoken vocabulary even after controlling for concurrent vocabulary in our sample. The score derived from standard LENA software was similarly stable, but was not significantly correlated with future spoken vocabulary. Findings suggest that automated vocal analysis is a valid and reliable alternative to time intensive and expensive conventional communication samples for measurement of vocal development of preverbal preschoolers with ASD in research and clinical practice. Autism Res 2017, 10: 508-519. © 2016 International Society for Autism Research, Wiley Periodicals, Inc.

  18. Spatial problem-solving strategies of middle school students: Wayfinding with geographic information systems

    NASA Astrophysics Data System (ADS)

    Wigglesworth, John C.

    2000-06-01

    Geographic Information Systems (GIS) is a powerful computer software package that emphasizes the use of maps and the management of spatially referenced environmental data archived in a system's database. Professional applications of GIS have been in place since the 1980s, but only recently has GIS gained significant attention in the K-12 classroom. Students using GIS are able to manipulate and query data in order to solve all manner of spatial problems. Very few studies have examined how this technological innovation can support classroom learning. In particular, there has been little research on how experience in using the software correlates with a child's spatial cognition and his/her ability to understand spatial relationships. This study investigates the strategies used by middle school students to solve a wayfinding (route-finding) problem using the ArcView GIS software. The research design combined an individual background questionnaire, results from the Group Assessment of Logical Thinking (GALT) test, and analysis of reflective think-aloud sessions to define the characteristics of the strategies students used to solve this particular class of spatial problem. Three uniquely different spatial problem-solving strategies were identified. Visual/Concrete Wayfinders used a highly visual strategy; Logical/Abstract Wayfinders used GIS software tools to apply a more analytical and systematic approach; Transitional Wayfinders used an approach that showed evidence of shifting from a visual strategy to a more analytical one. The triangulation of data sources indicates that this progression of wayfinding strategy can be correlated both to Piagetian stages of logical thought and to experience with the use of maps. These findings suggest that GIS teachers must be aware that their students' performance will lie on a continuum based on cognitive development, spatial ability, and prior experience with maps. To be most effective, GIS teaching strategies and curriculum development should also represent a progression that corresponds to the learners' current skills and experience.

  19. Assessment of left ventricular size and function by 3-dimensional transthoracic echocardiography: Impact of the echocardiography platform and analysis software.

    PubMed

    Castel, Anne Laure; Toledano, Manuel; Tribouilloy, Christophe; Delelis, François; Mailliet, Amandine; Marotte, Nathalie; Guerbaai, Raphaëlle A; Levy, Franck; Graux, Pierre; Ennezat, Pierre-Vladimir; Maréchaux, Sylvestre

    2018-05-27

    Whether the echocardiography platform and analysis software impact left ventricular (LV) volumes, ejection fraction (EF), and stroke volume (SV) measured by transthoracic three-dimensional echocardiography (3DE) has not yet been assessed. Hence, our aim was to compare 3DE LV end-diastolic and end-systolic volumes (EDV and ESV), LVEF, and SV obtained with echocardiography platforms from 2 different manufacturers. 3DE was performed in 84 patients (65% of screened consecutive patients) with equipment from 2 different manufacturers, with subsequent off-line postprocessing to obtain parameters of LV function and size (Philips QLAB 3DQ and General Electric EchoPAC 4D autoLVQ). Twenty-five patients with a clinical indication for cardiac magnetic resonance imaging served as a validation subgroup. LVEDV and LVESV from the 2 vendors were highly correlated (r = 0.93), but compared with 4D autoLVQ, the use of QLAB 3DQ resulted in lower LVEDV and LVESV (bias: 11 mL, limits of agreement: -25 to +47; and bias: 6 mL, limits of agreement: -22 to +34, respectively). The agreement between the LVEF values from the two software packages was poor (intraclass correlation coefficient 0.62) despite no or minimal bias. SVs were also lower with QLAB 3DQ advanced compared with 4D autoLVQ, and the two were poorly correlated (r = 0.66). Consistently, the underestimation of LVEDV, LVESV, and SV by 3DE compared with cardiac magnetic resonance imaging was more pronounced with Philips QLAB 3DQ advanced than with 4D autoLVQ. The echocardiography platform and analysis software significantly affect the values of LV parameters obtained by 3DE. Intervendor standardization and improvements in 3DE modalities are needed to broaden the use of LV parameters obtained by 3DE in clinical practice. Copyright © 2018. Published by Elsevier Inc.
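    The bias and limits-of-agreement figures quoted above follow the standard Bland-Altman construction; a minimal sketch, using made-up paired volumes rather than the study data:

    ```python
    # Minimal sketch of bias / 95% limits-of-agreement (Bland-Altman) statistics
    # for paired LV volume measurements from two vendors. The arrays below are
    # illustrative stand-ins, not study data.
    import numpy as np

    def bland_altman(a, b):
        """Return bias and 95% limits of agreement between paired measurements."""
        diff = np.asarray(a, float) - np.asarray(b, float)
        bias = diff.mean()
        half_width = 1.96 * diff.std(ddof=1)
        return bias, (bias - half_width, bias + half_width)

    edv_vendor1 = np.array([120.0, 135.0, 150.0, 110.0, 160.0])  # e.g. 4D autoLVQ
    edv_vendor2 = np.array([110.0, 128.0, 138.0, 105.0, 148.0])  # e.g. QLAB 3DQ
    bias, loa = bland_altman(edv_vendor1, edv_vendor2)
    print(f"bias = {bias:.1f} mL, 95% LOA = {loa[0]:.1f} to {loa[1]:.1f} mL")
    ```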

  20. A software sensor model based on hybrid fuzzy neural network for rapid estimation water quality in Guangzhou section of Pearl River, China.

    PubMed

    Zhou, Chunshan; Zhang, Chao; Tian, Di; Wang, Ke; Huang, Mingzhi; Liu, Yanbiao

    2018-01-02

    In order to manage water resources, a software sensor model was designed to estimate water quality using a hybrid fuzzy neural network (FNN) in the Guangzhou section of the Pearl River, China. The software sensor system was composed of a data storage module, a fuzzy decision-making module, a neural network module, and a fuzzy reasoning generator module. Fuzzy subtractive clustering was employed to capture the character of the model and to optimize the network architecture for enhanced performance. The results indicate that, on the basis of available on-line measured variables, the software sensor model can accurately predict water quality according to the relationship between chemical oxygen demand (COD) and dissolved oxygen (DO), pH, and NH4+-N. Owing to its ability to recognize time series patterns and non-linear characteristics, the FNN-based software sensor is clearly superior to the traditional neural network model; its R (correlation coefficient), MAPE (mean absolute percentage error), and RMSE (root mean square error) are 0.8931, 10.9051, and 0.4634, respectively.
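    A minimal sketch of the three goodness-of-fit metrics reported for the model (R, MAPE, RMSE), computed here on illustrative placeholder values rather than the Pearl River data:

    ```python
    # Minimal sketch of the reported metrics: correlation coefficient R,
    # mean absolute percentage error (MAPE), and root mean square error (RMSE).
    import numpy as np

    def evaluate(observed, predicted):
        observed = np.asarray(observed, float)
        predicted = np.asarray(predicted, float)
        r = np.corrcoef(observed, predicted)[0, 1]                 # correlation
        mape = 100.0 * np.mean(np.abs((observed - predicted) / observed))
        rmse = np.sqrt(np.mean((observed - predicted) ** 2))
        return r, mape, rmse

    obs = [12.1, 15.3, 9.8, 11.2, 14.0]    # illustrative observed COD values
    pred = [11.5, 14.8, 10.6, 11.9, 13.2]  # illustrative model predictions
    r, mape, rmse = evaluate(obs, pred)
    print(f"R = {r:.4f}, MAPE = {mape:.2f}%, RMSE = {rmse:.4f}")
    ```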

  1. Design and Implementation of Scientific Software Components to Enable Multiscale Modeling: The Effective Fragment Potential (QM/EFP) Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gaenko, Alexander; Windus, Theresa L.; Sosonkina, Masha

    2012-10-19

    The design and development of scientific software components to provide an interface to the effective fragment potential (EFP) methods are reported. Multiscale modeling of physical and chemical phenomena demands the merging of software packages developed by research groups in significantly different fields. Componentization offers an efficient way to realize new high performance scientific methods by combining the best models available in different software packages without a need for package readaptation after the initial componentization is complete. The EFP method is an efficient electronic structure theory based model potential that is suitable for predictive modeling of intermolecular interactions in large molecular systems, such as liquids, proteins, atmospheric aerosols, and nanoparticles, with an accuracy that is comparable to that of correlated ab initio methods. The developed components make the EFP functionality accessible for any scientific component-aware software package. The performance of the component is demonstrated on a protein interaction model, and its accuracy is compared with results obtained with coupled cluster methods.

  2. Rapid Isolation and Detection for RNA Biomarkers for TBI Diagnostics

    DTIC Science & Technology

    2015-10-01

    Grape and wine sensory attributes correlate with pattern-based discrimination of Cabernet Sauvignon wines by a peptidic sensor array (Tetrahedron). Partial Least Squares Regression (PLSR) was used for the correlation of wine sensory attributes to the peptide-based receptor responses. Data analysis was done using the software XLSTAT (Addinsoft, New York) and R. Absorbance values due to wine without the sensing ensembles were …

  3. Research and Development in Very Long Baseline Interferometry (VLBI)

    NASA Technical Reports Server (NTRS)

    Himwich, William E.

    2004-01-01

    Contents include the following: 1.Observation coordination. 2. Data acquisition system control software. 3. Station support. 4. Correlation, data processing, and analysis. 5. Data distribution and archiving. 6. Technique improvement and research. 7. Computer support.

  4. Ultraviolet Spectrometer and Polarimeter (UVSP) software development and hardware tests for the solar maximum mission

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Analyses of UVSP wavelength drive hardware, problems, and recovery procedures; of radiative power loss from solar plasmas; and of correlations between observed UV brightness and inferred photospheric currents are given.

  5. 25 ns software correlator for photon and fluorescence correlation spectroscopy

    NASA Astrophysics Data System (ADS)

    Magatti, Davide; Ferri, Fabio

    2003-02-01

    A 25 ns time resolution, multi-tau software correlator developed in LabVIEW, based on the use of a standard photon counting unit, a fast timer/counter board (6602-PCI, National Instruments), and a personal computer (PC) (1.5 GHz Pentium 4), is presented and quantitatively discussed. The correlator works by processing the stream of incoming data in parallel according to two different algorithms: for large lag times (τ⩾100 μs), a classical time-mode (TM) scheme, based on measuring the number of pulses per time interval, is used; conversely, for τ⩽100 μs a photon-mode (PM) scheme is adopted, in which the time sequence of the arrival times of the photon pulses is measured. By combining the two methods, we developed a system capable of working out correlation functions online, in full real time for the TM correlator and partially in batch processing for the PM correlator. For the latter, the duty cycle depends on the count rate of the incoming pulses, being ˜100% for count rates ⩽3×10⁴ Hz, ˜15% at 10⁵ Hz, and ˜1% at 10⁶ Hz. Owing to limitations imposed by the fairly small first-in, first-out (FIFO) buffer available on the counter board, the maximum count rate permissible for proper functioning of the PM correlator is limited to ˜10⁵ Hz. However, this limit can be removed by using a board with a deeper FIFO. Similarly, the 25 ns time resolution is limited only by the maximum clock frequency available on the 6602-PCI and can easily be improved by using a faster clock. When tested on dilute solutions of calibrated latex spheres, the overall performance of the correlator is comparable with that of commercial hardware correlators, but with several nontrivial advantages related to its flexibility, low cost, and easy adaptability to future developments in PC and data acquisition technology.
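    A minimal sketch of the time-mode (TM) scheme described above, with photon counts binned per sampling interval and the normalized intensity correlation accumulated over a set of lags; the multi-tau rebinning stage and the photon-mode path are omitted, and the test trace is synthetic:

    ```python
    # Minimal sketch of a time-mode (TM) correlator: counts are binned per
    # sampling interval and the normalized intensity correlation g2(tau) is
    # computed over linear lags. A real multi-tau correlator also rebins the
    # data at progressively coarser resolutions; that stage is omitted here.
    import numpy as np

    def g2_time_mode(counts, max_lag):
        """Normalized intensity autocorrelation of a binned photon-count trace."""
        counts = np.asarray(counts, float)
        mean_sq = counts.mean() ** 2
        g2 = np.empty(max_lag)
        for lag in range(1, max_lag + 1):
            g2[lag - 1] = np.mean(counts[:-lag] * counts[lag:]) / mean_sq
        return g2

    rng = np.random.default_rng(1)
    trace = rng.poisson(5.0, size=100_000)   # uncorrelated test signal
    print(g2_time_mode(trace, 5))            # should hover around 1.0
    ```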

  6. Computerized Bone Age Estimation Using Deep Learning Based Program: Evaluation of the Accuracy and Efficiency.

    PubMed

    Kim, Jeong Rye; Shim, Woo Hyun; Yoon, Hee Mang; Hong, Sang Hyup; Lee, Jin Seong; Cho, Young Ah; Kim, Sangki

    2017-12-01

    The purpose of this study is to evaluate the accuracy and efficiency of a new automatic software system for bone age assessment and to validate its feasibility in clinical practice. A Greulich-Pyle method-based deep-learning technique was used to develop the automatic software system for bone age determination. Using this software, bone age was estimated from left-hand radiographs of 200 patients (3-17 years old) using first-rank bone age (software only), computer-assisted bone age (two radiologists with software assistance), and Greulich-Pyle atlas-assisted bone age (two radiologists with Greulich-Pyle atlas assistance only). The reference bone age was determined by the consensus of two experienced radiologists. First-rank bone ages determined by the automatic software system showed a 69.5% concordance rate and significant correlations with the reference bone age (r = 0.992; p < 0.001). Concordance rates increased with the use of the automatic software system for both reviewer 1 (63.0% for Greulich-Pyle atlas-assisted bone age vs 72.5% for computer-assisted bone age) and reviewer 2 (49.5% for Greulich-Pyle atlas-assisted bone age vs 57.5% for computer-assisted bone age). Reading times were reduced by 18.0% and 40.0% for reviewers 1 and 2, respectively. The automatic software system produced reliably accurate bone age estimations and appeared to enhance efficiency by reducing reading times without compromising diagnostic accuracy.

  7. OPTHYLIC: An Optimised Tool for Hybrid Limits Computation

    NASA Astrophysics Data System (ADS)

    Busato, Emmanuel; Calvet, David; Theveneaux-Pelzer, Timothée

    2018-05-01

    A software tool, computing observed and expected upper limits on Poissonian process rates using a hybrid frequentist-Bayesian CLs method, is presented. This tool can be used for simple counting experiments where only signal, background and observed yields are provided or for multi-bin experiments where binned distributions of discriminating variables are provided. It allows the combination of several channels and takes into account statistical and systematic uncertainties, as well as correlations of systematic uncertainties between channels. It has been validated against other software tools and analytical calculations, for several realistic cases.
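    As an illustration of the hybrid frequentist-Bayesian CLs construction the tool implements, the toy-Monte-Carlo sketch below computes CLs for a single-bin counting experiment, marginalizing the background uncertainty by sampling; it is a simplified stand-in, not OPTHYLIC's actual algorithm:

    ```python
    # Minimal toy-MC sketch of a hybrid frequentist-Bayesian CLs calculation
    # for one counting channel: the background uncertainty is marginalized by
    # sampling ("Bayesian"), and the test statistic is simply the observed
    # count ("frequentist" p-values). Illustrative only.
    import numpy as np

    def cls_counting(n_obs, s, b, sigma_b, n_toys=200_000, seed=2):
        rng = np.random.default_rng(seed)
        b_toys = np.clip(rng.normal(b, sigma_b, n_toys), 0.0, None)
        n_sb = rng.poisson(s + b_toys)            # signal+background ensemble
        n_b = rng.poisson(b_toys)                 # background-only ensemble
        p_sb = np.mean(n_sb <= n_obs)             # CL_{s+b}
        p_b = np.mean(n_b <= n_obs)               # CL_b
        return p_sb / p_b

    # Scan the signal strength until CLs drops below 0.05 (95% CL upper limit).
    for s in (1.0, 3.0, 5.0, 7.0):
        print(s, cls_counting(n_obs=4, s=s, b=3.0, sigma_b=0.5))
    ```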

  8. Assessment of lumbosacral kyphosis in spondylolisthesis: a computer-assisted reliability study of six measurement techniques

    PubMed Central

    Glavas, Panagiotis; Mac-Thiong, Jean-Marc; Parent, Stefan; de Guise, Jacques A.

    2008-01-01

    Although recognized as an important aspect in the management of spondylolisthesis, there is no consensus on the most reliable and optimal measure of lumbosacral kyphosis (LSK). Using custom computer software, four raters evaluated 60 standing lateral radiographs of the lumbosacral spine during two sessions at a 1-week interval. The sample size consisted of 20 normal, 20 low grade, and 20 high grade spondylolisthetic subjects. Six parameters were included for analysis: Boxall’s slip angle, Dubousset’s lumbosacral angle (LSA), the Spinal Deformity Study Group’s (SDSG) LSA, the dysplastic SDSG LSA, sagittal rotation (SR), and the kyphotic Cobb angle (k-Cobb). Intra- and inter-rater reliability for all parameters was assessed using intra-class correlation coefficients (ICC). Correlations between parameters and slip percentage were evaluated with Pearson coefficients. The intra-rater ICCs for all the parameters ranged between 0.81 and 0.97, and the inter-rater ICCs were between 0.74 and 0.98. All parameters except sagittal rotation showed a medium to large correlation with slip percentage. Dubousset’s LSA and the k-Cobb showed the largest correlations (r = −0.78 and r = −0.50, respectively). SR was associated with the weakest correlation (r = −0.10). All other parameters had medium correlations with percent slip (r = 0.31–0.43). All measurement techniques provided excellent inter- and intra-rater reliability. Dubousset’s LSA showed the strongest correlation with slip grade. This parameter can be used in the clinical setting with PACS software capabilities to assess LSK. A computer-assisted technique is recommended in order to increase the reliability of the measurement of LSK in spondylolisthesis. PMID:19015898

  9. Computational measurement of joint space width and structural parameters in normal hips.

    PubMed

    Nishii, Takashi; Shiomi, Toshiyuki; Sakai, Takashi; Takao, Masaki; Yoshikawa, Hideki; Sugano, Nobuhiko

    2012-05-01

    Joint space width (JSW) of hip joints on radiographs in the normal population may vary with related factors, but previous investigations were insufficient due to limitations in the sources of radiographs, inclusion of subjects with osteoarthritis, and manual measurement techniques. We investigated factors influencing JSW using semiautomatic computational software on pelvic radiographs of asymptomatic subjects without radiological osteoarthritic findings. Global and local JSW at the medial, middle, and lateral compartments, and the hip structural parameters, were measured in 150 asymptomatic, normal cases (300 hips), using customized computational software. Reliability of measurement of global and local JSWs was high, with intraobserver reproducibility (intraclass correlation coefficient) ranging from 0.957 to 0.993 and interobserver reproducibility ranging from 0.925 to 0.985. There were significant differences among the three local JSWs, with the largest JSW at the lateral compartment. Global and medial local JSWs were significantly larger in the right hip, and global, medial, and middle local JSWs were significantly smaller in women. Global and local JSWs were inversely correlated with CE angle and positively correlated with the horizontal distance of the head center, but not correlated with body mass index in men or women. They were positively correlated with age and inversely correlated with the vertical distance of the head center only in men. There were interindividual variations in JSW in the normal population, depending on the site of the weight-bearing area, side, gender, age, and hip structural parameters. For accurate diagnosis and assessment of hip osteoarthritis, consideration of these influential factors, beyond degenerative change alone, is important.

  10. Correlation of serum uric acid with heart rate variability in hypertension.

    PubMed

    Kunikullaya, K U; Purushottam, N; Prakash, V; Mohan, S; Chinnaswamy, R

    2015-01-01

    Autonomic dysfunction with dominant sympathetic tone is a common finding among hypertensives and prehypertensives. Uric acid is one of the independent predictors of hypertension. Very few studies have examined the relationship between autonomic tone and the uric acid generation pathway among prehypertensives and hypertensives. The aim of the study was to estimate serum uric acid levels and correlate them with autonomic function as measured by heart rate variability (HRV) among prehypertensives and hypertensives. This cross-sectional study included three groups, prehypertensives, hypertensives, and normotensives, classified according to Joint National Committee VII criteria, with 35 subjects in each group. Serum uric acid levels were estimated using a colorimetric assay kit. HRV was analyzed after recording a lead II electrocardiogram using RMS Vagus HRV software (RMS, India). One-way ANOVA and Pearson's correlation analyses were done using SPSS 18.0 software. Mean uric acid levels were 5.62±2.21 mg/dL in normal subjects, 7.06±2.87 mg/dL in prehypertensives, and 9.77±2.04 mg/dL in hypertensives. There was a statistically significant negative correlation between uric acid and the time domain parameters of HRV in the whole sample and among prehypertensives, and a positive correlation with low frequency power (LF) in ms² and n.u. Serum uric acid levels were high in prehypertensives and hypertensives compared with normal subjects. Further, a statistically significant correlation was seen between uric acid levels and sympathetic domain parameters, particularly among prehypertensives. Copyright © 2015 SEH-LELHA. Published by Elsevier España, S.L.U. All rights reserved.

  11. Validating New Software for Semiautomated Liver Volumetry--Better than Manual Measurement?

    PubMed

    Noschinski, L E; Maiwald, B; Voigt, P; Wiltberger, G; Kahn, T; Stumpp, P

    2015-09-01

    This prospective study compared a manual program for liver volumetry with semiautomated software. The hypothesis was that the semiautomated software would be faster, more accurate, and less dependent on the evaluator's experience. Ten patients undergoing hemihepatectomy were included in this IRB-approved study after written informed consent. All patients underwent a preoperative abdominal 3-phase CT scan, which was used for whole-liver volumetry and volume prediction for the liver part to be resected. Two different types of software were used: 1) manual method: borders of the liver had to be defined per slice by the user; 2) semiautomated software: automatic identification of the liver volume with manual assistance for the definition of Couinaud segments. Measurements were done by six observers with different experience levels. Water displacement volumetry immediately after partial liver resection served as the gold standard. The resected part was examined with a CT scan after displacement volumetry. Volumetry of the resected liver scan showed excellent correlation with water displacement volumetry (manual: ρ = 0.997; semiautomated software: ρ = 0.995). The difference between the predicted volume and the real volume was significantly smaller with the semiautomated software than with the manual method (33% vs. 57%, p = 0.002). The semiautomated software was almost four times faster for volumetry of the whole liver (manual: 6:59 ± 3:04 min; semiautomated: 1:47 ± 1:11 min). Both methods for liver volumetry give an estimated liver volume close to the real one. The tested semiautomated software is faster, more accurate in predicting the volume of the resected liver part, gives more reproducible results, and is less dependent on the user's experience. Both tested types of software allow exact volumetry of resected liver parts. Preoperative prediction can be performed more accurately with the semiautomated software, which is nearly four times faster than the tested manual program and less dependent on the user's experience. © Georg Thieme Verlag KG Stuttgart · New York.

  12. Measuring Sea-Ice Motion in the Arctic with Real Time Photogrammetry

    NASA Astrophysics Data System (ADS)

    Brozena, J. M.; Hagen, R. A.; Peters, M. F.; Liang, R.; Ball, D.

    2014-12-01

    The U.S. Naval Research Laboratory, in coordination with other groups, has been collecting sea-ice data in the Arctic off the north coast of Alaska with an airborne system employing a radar altimeter, LiDAR and a photogrammetric camera in an effort to obtain wide swaths of measurements coincident with Cryosat-2 footprints. Because the satellite tracks traverse areas of moving pack ice, precise real-time estimates of the ice motion are needed to fly a survey grid that will yield complete data coverage. This requirement led us to develop a method to find the ice motion from the aircraft during the survey. With the advent of real-time orthographic photogrammetric systems, we developed a system that measures the sea ice motion in-flight, and also permits post-process modeling of sea ice velocities to correct the positioning of radar and LiDAR data. For the 2013 and 2014 field seasons, we used this Real Time Ice Motion Estimation (RTIME) system to determine ice motion using Applanix's Inflight Ortho software with an Applanix DSS439 system. Operationally, a series of photos were taken in the survey area. The aircraft then turned around and took more photos along the same line several minutes later. Orthophotos were generated within minutes of collection and evaluated by custom software to find photo footprints and potential overlap. Overlapping photos were passed to the correlation software, which selects a series of "chips" in the first photo and looks for the best matches in the second photo. The correlation results are then passed to a density-based clustering algorithm to determine the offset of the photo pair. To investigate any systematic errors in the photogrammetry, we flew several flight lines over a fixed point on various headings, over an area of non-moving ice in 2013. The orthophotos were run through the correlation software to find any residual offsets, and run through additional software to measure chip positions and offsets relative to the aircraft heading. X- and Y-offsets in situations where one of the chips was near the center of its photo were plotted to find the along- and across-track errors vs. distance from the photo center. Corrections were determined and applied to the survey data, reducing the mean error by about 1 meter. The corrections were applied to all of the subsequent survey data.
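    A minimal sketch of the chip-matching step described above: a small chip from the first orthophoto is located in the second by normalized cross-correlation, and the recovered pixel offset (scaled by the ground sample distance) becomes an ice-motion estimate. The brute-force search and synthetic scene are illustrative only:

    ```python
    # Minimal sketch: match a "chip" from photo 1 against a search window in
    # photo 2 by normalized cross-correlation, returning the best pixel offset.
    import numpy as np

    def match_chip(chip, window):
        """Return (row, col) offset and score of best normalized correlation."""
        ch, cw = chip.shape
        c = (chip - chip.mean()) / chip.std()
        best, best_rc = -np.inf, (0, 0)
        for r in range(window.shape[0] - ch + 1):
            for col in range(window.shape[1] - cw + 1):
                patch = window[r:r + ch, col:col + cw]
                p = (patch - patch.mean()) / (patch.std() + 1e-12)
                score = np.mean(c * p)
                if score > best:
                    best, best_rc = score, (r, col)
        return best_rc, best

    rng = np.random.default_rng(3)
    scene = rng.normal(size=(64, 64))          # stand-in for orthophoto 2
    chip = scene[20:36, 20:36]                 # 16x16 chip from "photo 1"
    offset, score = match_chip(chip, scene)    # should recover (20, 20)
    print(offset, round(score, 3))
    ```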

  13. Oxygen octahedra picker: A software tool to extract quantitative information from STEM images.

    PubMed

    Wang, Yi; Salzberger, Ute; Sigle, Wilfried; Eren Suyolcu, Y; van Aken, Peter A

    2016-09-01

    In perovskite oxide based materials and hetero-structures there are often strong correlations between oxygen octahedral distortions and functionality. Thus, atomistic understanding of the octahedral distortion, which requires accurate measurements of atomic column positions, will greatly help to engineer their properties. Here, we report the development of a software tool to extract quantitative information of the lattice and of BO6 octahedral distortions from STEM images. Center-of-mass and 2D Gaussian fitting methods are implemented to locate positions of individual atom columns. The precision of atomic column distance measurements is evaluated on both simulated and experimental images. The application of the software tool is demonstrated using practical examples. Copyright © 2016 The Authors. Published by Elsevier B.V. All rights reserved.
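    A minimal sketch of the 2D Gaussian fitting approach the tool implements for locating atom columns, here recovering the sub-pixel position of a synthetic column with scipy's least-squares fitter; the model and data are illustrative:

    ```python
    # Minimal sketch: model an atom column as an elliptical 2D Gaussian and
    # recover its sub-pixel position with scipy.optimize.curve_fit.
    import numpy as np
    from scipy.optimize import curve_fit

    def gauss2d(xy, amp, x0, y0, sx, sy, offset):
        x, y = xy
        return (amp * np.exp(-((x - x0) ** 2 / (2 * sx ** 2)
                               + (y - y0) ** 2 / (2 * sy ** 2))) + offset).ravel()

    # Synthesize a noisy 15x15 patch containing one column at (7.3, 6.8).
    y, x = np.mgrid[0:15, 0:15]
    true = (100.0, 7.3, 6.8, 1.5, 1.8, 10.0)
    rng = np.random.default_rng(4)
    image = gauss2d((x, y), *true).reshape(15, 15) + rng.normal(0, 1, (15, 15))

    p0 = (image.max() - image.min(), 7, 7, 2, 2, image.min())
    popt, _ = curve_fit(gauss2d, (x, y), image.ravel(), p0=p0)
    print("fitted column position: x0=%.2f, y0=%.2f" % (popt[1], popt[2]))
    ```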

  14. Cross-instrument Analysis Correlation Software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McJunkin, Timothy R.

    This program has been designed to assist with the tracking of a sample from one analytical instrument to another, such as SEM, microscopes, micro x-ray diffraction, and other instruments where particular positions/locations on the sample are examined, photographed, etc. The software is designed for easy entry of the positions of fiducials and locations of interest, such that in a future session in the same or a different instrument the positions of interest can be re-found by using the known fiducial locations in the current and reference sessions to transform each point into the current session's coordinate system. The software is dialog-box driven, guiding the user through the necessary data entry and program choices. Information is stored in a series of text-based extensible markup language (XML) files.
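    A minimal sketch of the fiducial-based registration idea, assuming an affine relation between sessions (the abstract does not specify the transform model): fiducial positions recorded in the reference and current sessions define a least-squares transform that maps stored points of interest into the current session's coordinates:

    ```python
    # Minimal sketch: estimate an affine transform from fiducial pairs and use
    # it to map a stored point of interest into the current session's frame.
    import numpy as np

    def fit_affine(ref_pts, cur_pts):
        """Solve cur = ref_h @ M in the least-squares sense (homogeneous coords)."""
        ref_h = np.hstack([ref_pts, np.ones((len(ref_pts), 1))])
        M, *_ = np.linalg.lstsq(ref_h, np.asarray(cur_pts, float), rcond=None)
        return M  # 3x2 matrix mapping homogeneous reference coords to current coords

    ref_fiducials = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
    cur_fiducials = np.array([[2.0, 1.0], [12.1, 1.2], [1.8, 11.0]])
    M = fit_affine(ref_fiducials, cur_fiducials)

    point_of_interest = np.array([5.0, 5.0, 1.0])   # stored in reference frame
    print(point_of_interest @ M)                    # location in current frame
    ```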

  15. On fitting generalized linear mixed-effects models for binary responses using different statistical packages.

    PubMed

    Zhang, Hui; Lu, Naiji; Feng, Changyong; Thurston, Sally W; Xia, Yinglin; Zhu, Liang; Tu, Xin M

    2011-09-10

    The generalized linear mixed-effects model (GLMM) is a popular paradigm to extend models for cross-sectional data to a longitudinal setting. When applied to modeling binary responses, different software packages and even different procedures within a package may give quite different results. In this report, we describe the statistical approaches that underlie these different procedures and discuss their strengths and weaknesses when applied to fit correlated binary responses. We then illustrate these considerations by applying these procedures implemented in some popular software packages to simulated and real study data. Our simulation results indicate a lack of reliability for most of the procedures considered, which carries significant implications for applying such popular software packages in practice. Copyright © 2011 John Wiley & Sons, Ltd.

  16. VLBI Technology Development at SHAO

    NASA Technical Reports Server (NTRS)

    Zhang, Xiuzhong; Shu, Fengchun; Xiang, Ying; Zhu, Renjie; Xu, Zhijun; Chen, Zhong; Zheng, Weimin; Luo, Jintao; Wu, Yajun

    2010-01-01

    VLBI technology development made significant progress at SHAO in the last few years. The development status of the Chinese DBBC, the software and FPGA-based correlators, and the new VLBI antenna, as well as VLBI applications are summarized in this paper.

  17. Digital correlation of DDRS data

    NASA Technical Reports Server (NTRS)

    Griffin, C. R.; Estes, J. M.

    1981-01-01

    The reduction of digital SAR (synthetic aperture radar) data to radar images for use in remote sensing applications was investigated. The critical software operations are discussed in detail, and suggestions and recommendations are made for improving the algorithms currently being used.

  18. g-PRIME: A Free, Windows Based Data Acquisition and Event Analysis Software Package for Physiology in Classrooms and Research Labs.

    PubMed

    Lott, Gus K; Johnson, Bruce R; Bonow, Robert H; Land, Bruce R; Hoy, Ronald R

    2009-01-01

    We present g-PRIME, a software-based tool for physiology data acquisition, analysis, and stimulus generation in education and research. This software was developed in an undergraduate neurophysiology course and strongly influenced by instructor and student feedback. g-PRIME is a free, stand-alone Windows application coded and "compiled" in Matlab (it does not require a Matlab license). g-PRIME supports many data acquisition interfaces, from the PC sound card to expensive, high-throughput, calibrated equipment. The program is designed as a software oscilloscope with standard trigger modes, multi-channel visualization controls, and data logging features. Extensive analysis options allow real-time and offline filtering of signals, multi-parameter threshold-and-window based event detection, and two-dimensional display of a variety of parameters including event time, energy density, maximum FFT frequency component, max/min amplitudes, and inter-event rate and intervals. The software also correlates detected events with another simultaneously acquired source (event-triggered average) in real time or offline. g-PRIME supports parameter histogram production and a variety of elegant publication-quality graphics outputs. A major goal of this software is to merge powerful engineering acquisition and analysis tools with a biological approach to studies of nervous system function.
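    A minimal sketch of two of the analyses named above, threshold-based event detection on one channel followed by an event-triggered average of a second, simultaneously acquired channel; signals and parameters are synthetic stand-ins, not g-PRIME code:

    ```python
    # Minimal sketch: detect upward threshold crossings on one channel, then
    # average snippets of a second channel aligned on those events.
    import numpy as np

    def detect_events(signal, threshold):
        """Indices where the signal crosses the threshold upward."""
        above = signal > threshold
        return np.flatnonzero(above[1:] & ~above[:-1]) + 1

    def event_triggered_average(source, events, pre, post):
        """Average source snippets aligned on event indices."""
        snippets = [source[i - pre:i + post] for i in events
                    if i - pre >= 0 and i + post <= len(source)]
        return np.mean(snippets, axis=0)

    rng = np.random.default_rng(5)
    spikes = rng.normal(size=50_000)
    spikes[::1000] += 8.0                      # embedded "spikes"
    muscle = np.roll(spikes, 20) * 0.5 + rng.normal(size=50_000)

    events = detect_events(spikes, threshold=5.0)
    eta = event_triggered_average(muscle, events, pre=50, post=100)
    print(len(events), eta.shape)              # ~50 events, (150,) average
    ```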

  19. Link Correlated Military Data for Better Decision Support

    DTIC Science & Technology

    2011-06-01

    Correlations can be automatically translated into URI-based links, which can greatly reduce the manpower cost of software development. Linked Data, a technique introduced by Tim Berners-Lee, while usually considered part of the Semantic Web ("the Semantic Web done right," as Berners-Lee himself described it), has been … The automatic link construction mechanism requires data on more kinds of correlations. Reference: [1] T. Berners-Lee, "The next Web of open, linked data" …

  20. A wideband analog correlator system for AMiBA

    NASA Astrophysics Data System (ADS)

    Li, Chao-Te; Kubo, Derek; Han, Chih-Chiang; Chen, Chung-Cheng; Chen, Ming-Tang; Lien, Chun-Hsien; Wang, Huei; Wei, Ray-Ming; Yang, Chia-Hsiang; Chiueh, Tzi-Dar; Peterson, Jeffrey; Kesteven, Michael; Wilson, Warwick

    2004-10-01

    A wideband correlator system with a bandwidth of 16 GHz or more is required for the Array for Microwave Background Anisotropy (AMiBA) to achieve a sensitivity of 10 μK in one hour of observation. Double-balanced diode mixers were used as multipliers in 4-lag correlator modules. Several wideband modules were developed for IF signal distribution between the receivers and correlators. Correlator outputs were amplified and digitized by voltage-to-frequency converters. Data acquisition circuits were designed using field programmable gate arrays (FPGAs). Subsequent data transfer and control software were based on the configuration for the Australia Telescope Compact Array. A transform-matrix method will be adopted during calibration to take into account the phase and amplitude variations of the analog devices across the passband.

  1. The effect of proposed software products' features on the satisfaction and dissatisfaction of potential customers

    NASA Astrophysics Data System (ADS)

    Hussain, Azham; Mkpojiogu, Emmanuel O. C.; Yusof, Muhammad Mat

    2016-08-01

    This paper reports the effect of proposed software products' features on the satisfaction and dissatisfaction of potential customers. The Kano model's functional and dysfunctional technique was used along with Berger et al.'s customer satisfaction coefficients. The results show that only two feature categories performed strongly in influencing the satisfaction and dissatisfaction of would-be customers of the proposed software product: attractive and one-dimensional features had the highest impact. This result will benefit requirements analysts, developers, designers, and project and sales managers in preparing for proposed products. Additional analysis showed that the Kano model's satisfaction and dissatisfaction scores were highly related to Park et al.'s average satisfaction coefficient (r = 96%), implying that these variables can be used interchangeably or in place of one another to elicit customer satisfaction. Furthermore, the average satisfaction coefficients and the satisfaction and dissatisfaction indexes were all positively and linearly correlated.

  2. Reliability and validity of the AutoCAD software method in lumbar lordosis measurement

    PubMed Central

    Letafatkar, Amir; Amirsasan, Ramin; Abdolvahabi, Zahra; Hadadnezhad, Malihe

    2011-01-01

    Objective The aim of this study was to determine the reliability and validity of the AutoCAD software method in lumbar lordosis measurement. Methods Fifty healthy volunteers with a mean age of 23 ± 1.80 years were enrolled. A lumbar lateral radiograph was taken on all participants, and the lordosis was measured according to the Cobb method. Afterward, the lumbar lordosis degree was measured via AutoCAD software and flexible ruler methods. The current study is accomplished in 2 parts: intratester and intertester evaluations of reliability as well as the validity of the flexible ruler and software methods. Results Based on the intraclass correlation coefficient, AutoCAD's reliability and validity in measuring lumbar lordosis were 0.984 and 0.962, respectively. Conclusions AutoCAD showed to be a reliable and valid method to measure lordosis. It is suggested that this method may replace those that are costly and involve health risks, such as radiography, in evaluating lumbar lordosis. PMID:22654681

  3. Reliability and validity of the AutoCAD software method in lumbar lordosis measurement.

    PubMed

    Letafatkar, Amir; Amirsasan, Ramin; Abdolvahabi, Zahra; Hadadnezhad, Malihe

    2011-12-01

    The aim of this study was to determine the reliability and validity of the AutoCAD software method in lumbar lordosis measurement. Fifty healthy volunteers with a mean age of 23 ± 1.80 years were enrolled. A lumbar lateral radiograph was taken on all participants, and the lordosis was measured according to the Cobb method. Afterward, the lumbar lordosis degree was measured via AutoCAD software and flexible ruler methods. The current study is accomplished in 2 parts: intratester and intertester evaluations of reliability as well as the validity of the flexible ruler and software methods. Based on the intraclass correlation coefficient, AutoCAD's reliability and validity in measuring lumbar lordosis were 0.984 and 0.962, respectively. AutoCAD showed to be a reliable and valid method to measure lordosis. It is suggested that this method may replace those that are costly and involve health risks, such as radiography, in evaluating lumbar lordosis.

  4. The development of automated behavior analysis software

    NASA Astrophysics Data System (ADS)

    Jaana, Yuki; Prima, Oky Dicky A.; Imabuchi, Takashi; Ito, Hisayoshi; Hosogoe, Kumiko

    2015-03-01

    The measurement of behavior of participants in a conversation scene involves verbal and nonverbal communication. Measurement validity may vary across observers owing to factors such as human error, poorly designed measurement systems, and inadequate observer training. Although some systems have been introduced in previous studies to measure behaviors automatically, these systems prevent participants from talking in a natural way. In this study, we propose a software application program to automatically analyze the behaviors of participants, including utterances, facial expressions (happy or neutral), head nods, and poses, using only a single omnidirectional camera. The camera is small enough to be embedded into a table, allowing participants to have spontaneous conversation. The proposed software utilizes facial feature tracking based on a constrained local model to observe the changes in the facial features captured by the camera, and the Japanese female facial expression database to recognize expressions. Our experimental results show significant correlations between measurements made by human observers and by the software.

  5. TBI server: a web server for predicting ion effects in RNA folding.

    PubMed

    Zhu, Yuhong; He, Zhaojian; Chen, Shi-Jie

    2015-01-01

    Metal ions play a critical role in the stabilization of RNA structures. Therefore, accurate prediction of ion effects in RNA folding can have a far-reaching impact on our understanding of RNA structure and function. Multivalent ions, especially Mg²⁺, are essential for RNA tertiary structure formation. These ions can become strongly correlated in the close vicinity of the RNA surface. Most of the currently available software packages, which have had widespread success in predicting ion effects in biomolecular systems, do not explicitly account for this ion correlation effect. It is therefore important to develop a software package/web server that predicts ion electrostatics in RNA folding while including ion correlation effects. The TBI web server (http://rna.physics.missouri.edu/tbi_index.html) provides predictions of the total electrostatic free energy, the different free energy components, and the mean number and most probable distributions of the bound ions. A novel feature of the TBI server is its ability to account for ion correlation and ion distribution fluctuation effects. By accounting for these effects, the TBI server is a unique online tool for computing ion-mediated electrostatic properties of given RNA structures. The results can provide important data for in-depth analysis of ion effects in RNA folding, including the ion dependence of folding stability, ion uptake during the folding process, and the interplay between the different energetic components.

  6. DSN Beowulf Cluster-Based VLBI Correlator

    NASA Technical Reports Server (NTRS)

    Rogstad, Stephen P.; Jongeling, Andre P.; Finley, Susan G.; White, Leslie A.; Lanyi, Gabor E.; Clark, John E.; Goodhart, Charles E.

    2009-01-01

    The NASA Deep Space Network (DSN) requires a broadband VLBI (very long baseline interferometry) correlator to process data routinely taken as part of the VLBI source Catalogue Maintenance and Enhancement task (CAT M&E) and the Time and Earth Motion Precision Observations task (TEMPO). The data provided by these measurements are a crucial ingredient in the formation of precision deep-space navigation models. In addition, a VLBI correlator is needed to support other VLBI-related activities for both internal and external customers. The JPL VLBI Correlator (JVC) was designed, developed, and delivered to the DSN as a successor to the legacy Block II Correlator. The JVC is a full-capability VLBI correlator that uses software processes running on multiple computers to cross-correlate two-antenna broadband noise data. Components of this new system consist of Linux PCs integrated into a Beowulf cluster, an existing Mark5 data storage system, a RAID array, an existing software correlator package (SoftC) originally developed for Delta DOR navigation processing, and various custom-developed software processes and scripts. Parallel processing on the JVC is achieved by assigning slave nodes of the Beowulf cluster to process separate scans in parallel until all scans have been processed. Due to the single-stream sequential playback of the Mark5 data, some ramp-up time is required before all nodes can have access to the required scan data. Core functions of each processing step are accomplished using optimized C programs. The coordination and execution of these programs across the cluster is accomplished using Perl scripts, PostgreSQL commands, and a handful of miscellaneous system utilities. Mark5 data modules are loaded on Mark5 playback units, one per station. Data processing is started when the operator scans the Mark5 systems and runs a script that reads various configuration files and then creates an experiment-dependent status database used to delegate parallel tasks between nodes and storage areas. This script forks into three processes: extract, translate, and correlate. Each of these processes iterates over the available scan data and updates the status database as the work for each scan is completed. The extract process coordinates and monitors the transfer of data from each of the Mark5s to the Beowulf RAID storage systems. The translate process monitors and executes the data conversion processes on available scan files and writes the translated files to the slave nodes. The correlate process monitors the execution of SoftC correlation processes on the slave nodes for scans that have completed translation. A comparison of the JVC and legacy Block II correlator outputs reveals that they agree well within the formal errors and that the data are comparable with respect to their use in flight navigation. The processing speed of the JVC is improved over the Block II correlator by a factor of 4, largely due to the elimination of the reel-to-reel tape drives used in the Block II correlator.
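    The coordination pattern is easy to caricature: independent scans are farmed out to worker processes, each scan flows through extract, translate, and correlate stages, and a shared status table records completed products. The sketch below uses Python stand-ins purely for illustration; the real system uses Perl scripts, PostgreSQL, and the SoftC correlator:

    ```python
    # Minimal sketch of the scan-parallel pipeline pattern: each scan passes
    # through extract -> translate -> correlate on a worker process, and a
    # status table (here a dict, standing in for the status database) tracks
    # completed products. All names are illustrative.
    from concurrent.futures import ProcessPoolExecutor

    def extract(scan):    return f"{scan}.raw"       # stand-in for Mark5 readout
    def translate(raw):   return raw.replace(".raw", ".trans")
    def correlate(trans): return trans.replace(".trans", ".corr")

    def process_scan(scan):
        """Run one scan through the full pipeline on a worker process."""
        return scan, correlate(translate(extract(scan)))

    if __name__ == "__main__":
        scans = [f"scan{n:03d}" for n in range(8)]
        status = {}                                   # stand-in status database
        with ProcessPoolExecutor(max_workers=4) as pool:
            for scan, product in pool.map(process_scan, scans):
                status[scan] = product                # record completed product
        print(status)
    ```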

  7. Parallel-Processing Software for Creating Mosaic Images

    NASA Technical Reports Server (NTRS)

    Klimeck, Gerhard; Deen, Robert; McCauley, Michael; DeJong, Eric

    2008-01-01

    A computer program implements parallel processing for nearly real-time creation of panoramic mosaics of images of terrain acquired by video cameras on an exploratory robotic vehicle (e.g., a Mars rover). Because the original images are typically acquired at various camera positions and orientations, it is necessary to warp the images into the reference frame of the mosaic before stitching them together to create the mosaic. [Also see "Parallel-Processing Software for Correlating Stereo Images," Software Supplement to NASA Tech Briefs, Vol. 31, No. 9 (September 2007) page 26.] The warping algorithm in this computer program reflects the considerations that (1) for every pixel in the desired final mosaic, a good corresponding point must be found in one or more of the original images and (2) for this purpose, one needs a good mathematical model of the cameras and a good correlation of individual pixels with respect to their positions in three dimensions. The desired mosaic is divided into slices, each of which is assigned to one of a number of central processing units (CPUs) operating simultaneously. The results from the CPUs are gathered and placed into the final mosaic. The time taken to create the mosaic depends upon the number of CPUs, the speed of each CPU, and whether a local or a remote data-staging mechanism is used.

  8. GOEAST: a web-based software toolkit for Gene Ontology enrichment analysis.

    PubMed

    Zheng, Qi; Wang, Xiu-Jie

    2008-07-01

    Gene Ontology (GO) analysis has become a commonly used approach for functional studies of large-scale genomic or transcriptomic data. Although many software tools with GO-related analysis functions exist, new tools are still needed to meet the requirements of data generated by newly developed technologies or for advanced analysis purposes. Here, we present the Gene Ontology Enrichment Analysis Software Toolkit (GOEAST), an easy-to-use web-based toolkit that identifies statistically overrepresented GO terms within given gene sets. Compared with available GO analysis tools, GOEAST has the following improved features: (i) GOEAST displays enriched GO terms in graphical format according to their relationships in the hierarchical tree of each GO category (biological process, molecular function, and cellular component), and therefore provides a better understanding of the correlations among enriched GO terms; (ii) GOEAST supports analysis of data from various sources (probe or probe-set IDs of Affymetrix, Illumina, Agilent, or customized microarrays, as well as different gene identifiers) and multiple species (about 60 prokaryote and eukaryote species); (iii) one unique feature of GOEAST is the ability to cross-compare the GO enrichment status of multiple experiments to identify functional correlations among them. GOEAST also provides rigorous statistical tests to enhance the reliability of analysis results. GOEAST is freely accessible at http://omicslab.genetics.ac.cn/GOEAST/
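    GO term enrichment of the kind GOEAST reports is conventionally assessed with a hypergeometric test; a minimal sketch with illustrative counts (the abstract does not state which test GOEAST uses, so this is a generic stand-in):

    ```python
    # Minimal sketch of a hypergeometric enrichment test: in a genome of N
    # genes, K carry a given GO term; a study set of n genes contains k of
    # them. The P-value is the upper tail of the hypergeometric distribution.
    from scipy.stats import hypergeom

    N, K = 20_000, 300        # genome size; genes annotated with the term
    n, k = 500, 25            # study-set size; study genes carrying the term

    p_value = hypergeom.sf(k - 1, N, K, n)   # P(X >= k)
    fold_enrichment = (k / n) / (K / N)
    print(f"fold enrichment = {fold_enrichment:.1f}, P = {p_value:.3g}")
    ```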

  9. Meta-analysis in Stata using gllamm.

    PubMed

    Bagos, Pantelis G

    2015-12-01

    There are several user-written programs for performing meta-analysis in Stata (Stata Statistical Software; College Station, TX: StataCorp LP). These include metan, metareg, mvmeta, and glst. However, there are several cases for which these programs do not suffice. For instance, there is no software for performing univariate meta-analysis with correlated estimates, for multilevel or hierarchical meta-analysis, or for meta-analysis of longitudinal data. In this work, we show with practical applications that many disparate models, including but not limited to the ones mentioned earlier, can be fitted using gllamm. The software is very versatile and can handle a wide variety of models with applications in a wide range of disciplines. The method presented here takes advantage of these modeling capabilities and makes use of appropriate transformations, based on the Cholesky decomposition of the inverse of the covariance matrix (generalized least squares), in order to handle correlated data. The models described earlier can be thought of as special instances of a general linear mixed-model formulation, but to the author's knowledge, a general exposition incorporating all the available models for meta-analysis as special cases, together with instructions for fitting them in Stata, has not been presented so far. Source code is available at http://www.compgen.org/tools/gllamm. Copyright © 2015 John Wiley & Sons, Ltd.
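    A minimal sketch of the Cholesky/GLS transformation the article exploits: correlated estimates with covariance V are premultiplied by the inverse Cholesky factor of V, after which an ordinary least-squares routine gives the correct pooled estimate; the numbers are illustrative:

    ```python
    # Minimal sketch of generalized least squares via Cholesky whitening:
    # correlated effect estimates y with covariance V become independent
    # observations after premultiplication by inv(L), where V = L @ L.T.
    import numpy as np

    y = np.array([0.42, 0.51, 0.38])                 # correlated estimates
    V = np.array([[0.010, 0.004, 0.002],
                  [0.004, 0.012, 0.003],
                  [0.002, 0.003, 0.011]])            # their covariance matrix
    X = np.ones((3, 1))                              # intercept-only design

    L = np.linalg.cholesky(V)
    y_star = np.linalg.solve(L, y)                   # whitened responses
    X_star = np.linalg.solve(L, X)                   # whitened design

    beta, *_ = np.linalg.lstsq(X_star, y_star, rcond=None)
    se = np.sqrt(np.linalg.inv(X_star.T @ X_star)[0, 0])
    print(f"pooled estimate = {beta[0]:.3f} +/- {se:.3f}")
    ```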

  10. Assessment of global longitudinal strain using standardized myocardial deformation imaging: a modality independent software approach.

    PubMed

    Riffel, Johannes H; Keller, Marius G P; Aurich, Matthias; Sander, Yannick; Andre, Florian; Giusca, Sorin; Aus dem Siepen, Fabian; Seitz, Sebastian; Galuschky, Christian; Korosoglou, Grigorios; Mereles, Derliz; Katus, Hugo A; Buss, Sebastian J

    2015-07-01

    Myocardial deformation measurement is superior to left ventricular ejection fraction in identifying early changes in myocardial contractility and predicting cardiovascular outcome, but the lack of standardization hinders its clinical implementation. The aim of this study is to investigate a novel standardized deformation imaging approach, based on the feature tracking algorithm, for the assessment of global longitudinal strain (GLS) and global circumferential strain (GCS) in echocardiography and cardiac magnetic resonance imaging (CMR). 70 subjects undergoing CMR were consecutively investigated with echocardiography within a median time of 30 min. GLS and GCS were analyzed with post-processing software incorporating the same standardized algorithm for both modalities. Global strain was defined as the relative shortening of the whole endocardial contour length and calculated according to the strain formula. Mean GLS values were -16.2 ± 5.3 and -17.3 ± 5.3 % for echocardiography and CMR, respectively. GLS did not differ significantly between the two imaging modalities, which showed strong correlation (r = 0.86), a small bias (-1.1 %), and narrow 95 % limits of agreement (LOA ± 5.4 %). Mean GCS values were -17.9 ± 6.3 and -24.4 ± 7.8 % for echocardiography and CMR, respectively. GCS was significantly underestimated by echocardiography (p < 0.001). A weaker correlation (r = 0.73), a higher bias (-6.5 %), and wider LOA (± 10.5 %) were observed for GCS. GLS showed a strong correlation (r = 0.92) when image quality was good, while the correlation dropped to r = 0.82 with poor acoustic windows in echocardiography. GCS assessment revealed a strong correlation (r = 0.87) only when echocardiographic image quality was good. No significant differences for GLS between the two echocardiographic vendors could be detected. Quantitative assessment of GLS using a standardized software algorithm allows the direct comparison of values acquired irrespective of the imaging modality. GLS may, therefore, serve as a reliable parameter for the assessment of global left ventricular function in clinical routine besides standard evaluation of the ejection fraction.

  11. Validation of a semi-automatic protocol for the assessment of the tear meniscus central area based on open-source software

    NASA Astrophysics Data System (ADS)

    Pena-Verdeal, Hugo; Garcia-Resua, Carlos; Yebra-Pimentel, Eva; Giraldez, Maria J.

    2017-08-01

    Purpose: Different lower tear meniscus parameters can be clinically assessed in dry eye diagnosis. The aim of this study was to propose and analyse the variability of a semi-automatic method for measuring the lower tear meniscus central area (TMCA) using open-source software. Material and methods: In a group of 105 subjects, one video of the lower tear meniscus after fluorescein instillation was recorded by a digital camera attached to a slit lamp. A short light beam (3x5 mm) with moderate illumination in the central portion of the meniscus (6 o'clock) was used. Images were extracted from each video by a masked observer. Using open-source software based on Java (NIH ImageJ), a further observer measured, in a masked and randomized order, the TMCA in the area illuminated by the short light beam by two methods: (1) a manual method, in which the TMCA was measured by hand on the images; (2) a semi-automatic method, in which the TMCA images were converted to 8-bit binary images, holes inside the resulting shape were filled, and the area of the isolated shape was obtained. Finally, the manual and semi-automatic measurements were compared. Results: A paired t-test showed no statistically significant difference between the results of the two techniques (p = 0.102). Pearson analysis showed a significant, near-perfect positive correlation between the techniques (r = 0.99; p < 0.001). Conclusions: This study presented a useful tool for objectively measuring the frontal central area of the meniscus in photographs with free open-source software.
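    A minimal sketch of the semi-automatic pipeline described above, thresholding a synthetic fluorescein image to a binary mask, filling holes inside the meniscus shape, and counting pixels; the threshold and pixel scale are assumed values, not those of the study:

    ```python
    # Minimal sketch: threshold -> binary mask -> fill holes -> pixel area.
    import numpy as np
    from scipy.ndimage import binary_fill_holes

    rng = np.random.default_rng(6)
    image = rng.normal(30, 5, (120, 200))      # synthetic stand-in image
    image[50:70, 40:160] = 180                 # bright meniscus band
    image[58:62, 90:110] = 20                  # dark hole inside the band

    mask = image > 100                         # simple global threshold (assumed)
    filled = binary_fill_holes(mask)           # fill holes inside the shape
    pixel_area_mm2 = 0.0025                    # assumed pixel scale (hypothetical)
    print(f"TMCA = {filled.sum() * pixel_area_mm2:.3f} mm^2")
    ```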

  12. Variance in predicted cup size by 2-dimensional vs 3-dimensional computerized tomography-based templating in primary total hip arthroplasty.

    PubMed

    Osmani, Feroz A; Thakkar, Savyasachi; Ramme, Austin; Elbuluk, Ameer; Wojack, Paul; Vigdorchik, Jonathan M

    2017-12-01

    Preoperative total hip arthroplasty templating can be performed on radiographs using acetate prints or digital viewing software, or with computed tomography (CT) images. Our hypothesis was that 3D templating is more precise and accurate in cup size prediction than 2D templating with acetate prints or digital templating software. Data collected from 45 patients undergoing robotic-assisted total hip arthroplasty were used to compare cup sizes templated on acetate prints and in OrthoView software with those from MAKOplasty software, which uses the CT scan. Kappa analysis determined the strength of agreement between each templating modality and the final size used. t tests compared the mean cup-size variance from the final size for each templating technique. Intraclass correlation coefficient (ICC) analysis determined the reliability of digital and acetate planning by comparing the predictions of the operating surgeon and a blinded adult reconstructive fellow. The Kappa values for CT-guided, digital, and acetate templating against the final size were 0.974, 0.233, and 0.262, respectively. Both digital and acetate templating significantly overpredicted cup size compared with CT-guided methods (P < .001). There was no significant difference between digital and acetate templating (P = .117). The ICC values for digital and acetate templating were 0.928 and 0.931, respectively. CT-guided planning more accurately predicts hip implant cup size than the significantly overpredicting digital and acetate templating. CT-guided templating may also lead to better outcomes due to bone stock preservation from a smaller, more accurately predicted cup size than that of digital and acetate predictions.

  13. MaROS: Information Management Service

    NASA Technical Reports Server (NTRS)

    Allard, Daniel A.; Gladden, Roy E.; Wright, Jesse J.; Hy, Franklin H.; Rabideau, Gregg R.; Wallick, Michael N.

    2011-01-01

    This software is provided by the Mars Relay Operations Service (MaROS) task to a variety of Mars projects for the purpose of coordinating communications sessions between landed spacecraft assets and orbiting spacecraft assets at Mars. The Information Management Service centralizes a set of functions previously distributed across multiple spacecraft operations teams, and as such, greatly improves visibility into the end-to-end strategic coordination process. Most of the process revolves around the scheduling of communications sessions between the spacecraft during periods of time when a landed asset on Mars is geometrically visible by an orbiting spacecraft. These relay sessions are used to transfer data both to and from the landed asset via the orbiting asset on behalf of Earth-based spacecraft operators. This software component is an application process running as a Java virtual machine. The component provides all service interfaces via a Representational State Transfer (REST) protocol over https to external clients. There are two general interaction modes with the service: upload and download of data. For data upload, the service must execute logic specific to the upload data type and trigger any applicable calculations including pass delivery latencies and overflight conflicts. For data download, the software must retrieve and correlate requested information and deliver to the requesting client. The provision of this service enables several key advancements over legacy processes and systems. For one, this service represents the first time that end-to-end relay information is correlated into a single shared repository. The software also provides the first multimission latency calculator; previous latency calculations had been performed on a mission-by-mission basis.

  14. Accuracy and reproducibility of novel echocardiographic three-dimensional automated software for the assessment of the aortic root in candidates for transcatheter aortic valve replacement.

    PubMed

    García-Martín, Ana; Lázaro-Rivera, Carla; Fernández-Golfín, Covadonga; Salido-Tahoces, Luisa; Moya-Mur, Jose-Luis; Jiménez-Nacher, Jose-Julio; Casas-Rojo, Eduardo; Aquila, Iolanda; González-Gómez, Ariana; Hernández-Antolín, Rosana; Zamorano, José Luis

    2016-07-01

    A specialized three-dimensional transoesophageal echocardiography (3D-TOE) reconstruction tool has recently been introduced; the system automatically configures a geometric model of the aortic root from the images obtained by 3D-TOE and performs quantitative analysis of these structures. The aim of this study was to compare the measurements of the aortic annulus (AA) obtained by the new model with those obtained by manual 3D-TOE analysis and multidetector computed tomography (MDCT) in candidates for transcatheter aortic valve implantation (TAVI), and to assess the reproducibility of this new method. We included 31 patients who underwent TAVI. The AA diameters and area were evaluated by the manual 3D-TOE method and by the automatic software. We found an excellent correlation between the measurements obtained by both methods: intraclass correlation coefficient (ICC) 0.731 (0.508-0.862), r = 0.742 for the AA diameter, and ICC 0.723 (0.662-0.923), r = 0.723 for the AA area, with no significant differences regardless of the method used. Interobserver reproducibility was better for the automatic measurements than for the manual ones. In a subgroup of 10 patients, we also found an excellent correlation between the automatic measurements and those obtained by MDCT: ICC 0.941 (0.761-0.985), r = 0.901 for the AA diameter, and ICC 0.853 (0.409-0.964), r = 0.744 for the AA area. The new automatic 3D-TOE software allows modelling and quantifying the aortic root from 3D-TOE data with high reproducibility, and the automated measurements correlate well with other validated 3D techniques. Our results support its use in clinical practice as an alternative to MDCT prior to TAVI. Published on behalf of the European Society of Cardiology. All rights reserved. © The Author 2015. For permissions please email: journals.permissions@oup.com.

  15. Computer aided detection in prostate cancer diagnostics: A promising alternative to biopsy? A retrospective study from 104 lesions with histological ground truth.

    PubMed

    Thon, Anika; Teichgräber, Ulf; Tennstedt-Schenk, Cornelia; Hadjidemetriou, Stathis; Winzler, Sven; Malich, Ansgar; Papageorgiou, Ismini

    2017-01-01

    Prostate cancer (PCa) diagnosis by means of multiparametric magnetic resonance imaging (mpMRI) is a current challenge for the development of computer-aided detection (CAD) tools. An innovative CAD software (Watson Elementary™) was proposed to achieve high sensitivity and specificity, as well as to provide a correlate to Gleason grade. To assess the performance of Watson Elementary™ in automated PCa diagnosis in our hospital's database of MRI-guided prostate biopsies. The evaluation was retrospective for 104 lesions (47 PCa, 57 benign) from 79 patients (aged 64.61 ± 6.64 years) using 3T T2-weighted imaging, Apparent Diffusion Coefficient (ADC) maps, and dynamic contrast enhancement series. Watson Elementary™ utilizes signal intensity, diffusion properties, and kinetic profile to compute a proportional Gleason grade predictor, termed the Malignancy Attention Index (MAI). The analysis focused on (i) the CAD sensitivity and specificity in classifying suspect lesions and (ii) the MAI correlation with the histopathological ground truth. The software revealed a sensitivity of 46.80% for PCa classification. The specificity for PCa was 75.43%, with a positive predictive value of 61.11%, a negative predictive value of 63.23%, and a false discovery rate of 38.89%. CAD classified PCa and benign lesions with equal probability (P = 0.06, χ² test). Accordingly, receiver operating characteristic analysis suggests a poor predictive value for MAI, with an area under the curve of 0.65 (P = 0.02), which is not superior to the performance of board-certified observers. Moreover, MAI revealed no significant correlation with Gleason grade (P = 0.60, Pearson's correlation). The tested CAD software for mpMRI analysis was a weak PCa biomarker in this dataset. Targeted prostate biopsy and histology remain the gold standard for prostate cancer diagnosis.

  16. Computer aided detection in prostate cancer diagnostics: A promising alternative to biopsy? A retrospective study from 104 lesions with histological ground truth

    PubMed Central

    Thon, Anika; Teichgräber, Ulf; Tennstedt-Schenk, Cornelia; Hadjidemetriou, Stathis; Winzler, Sven; Malich, Ansgar

    2017-01-01

    Background Prostate cancer (PCa) diagnosis by means of multiparametric magnetic resonance imaging (mpMRI) is a current challenge for the development of computer-aided detection (CAD) tools. An innovative CAD software (Watson Elementary™) was proposed to achieve high sensitivity and specificity, as well as to provide a correlate to Gleason grade. Aim/Objective To assess the performance of Watson Elementary™ in automated PCa diagnosis in our hospital's database of MRI-guided prostate biopsies. Methods The evaluation was retrospective for 104 lesions (47 PCa, 57 benign) from 79 patients (aged 64.61 ± 6.64 years) using 3T T2-weighted imaging, Apparent Diffusion Coefficient (ADC) maps, and dynamic contrast enhancement series. Watson Elementary™ utilizes signal intensity, diffusion properties, and kinetic profile to compute a proportional Gleason grade predictor, termed the Malignancy Attention Index (MAI). The analysis focused on (i) the CAD sensitivity and specificity in classifying suspect lesions and (ii) the MAI correlation with the histopathological ground truth. Results The software revealed a sensitivity of 46.80% for PCa classification. The specificity for PCa was 75.43%, with a positive predictive value of 61.11%, a negative predictive value of 63.23%, and a false discovery rate of 38.89%. CAD classified PCa and benign lesions with equal probability (P = 0.06, χ² test). Accordingly, receiver operating characteristic analysis suggests a poor predictive value for MAI, with an area under the curve of 0.65 (P = 0.02), which is not superior to the performance of board-certified observers. Moreover, MAI revealed no significant correlation with Gleason grade (P = 0.60, Pearson's correlation). Conclusion The tested CAD software for mpMRI analysis was a weak PCa biomarker in this dataset. Targeted prostate biopsy and histology remain the gold standard for prostate cancer diagnosis. PMID:29023572

  17. Reproducibility of Lobar Perfusion and Ventilation Quantification Using SPECT/CT Segmentation Software in Lung Cancer Patients.

    PubMed

    Provost, Karine; Leblond, Antoine; Gauthier-Lemire, Annie; Filion, Édith; Bahig, Houda; Lord, Martin

    2017-09-01

    Planar perfusion scintigraphy with 99mTc-labeled macroaggregated albumin is often used for pretherapy quantification of regional lung perfusion in lung cancer patients, particularly those with poor respiratory function. However, subdividing lung parenchyma into rectangular regions of interest, as done on planar images, is a poor reflection of true lobar anatomy. New tridimensional methods using SPECT and SPECT/CT have been introduced, including semiautomatic lung segmentation software. The present study evaluated inter- and intraobserver agreement on quantification using SPECT/CT software and compared the results for regional lung contribution obtained with SPECT/CT and planar scintigraphy. Methods: Thirty lung cancer patients underwent ventilation-perfusion scintigraphy with 99mTc-macroaggregated albumin and 99mTc-Technegas. The regional lung contribution to perfusion and ventilation was measured on both planar scintigraphy and SPECT/CT using semiautomatic lung segmentation software by 2 observers. Interobserver and intraobserver agreement for the SPECT/CT software was assessed using the intraclass correlation coefficient, Bland-Altman plots, and absolute differences in measurements. Measurements from planar and tridimensional methods were compared using the paired-sample t test and mean absolute differences. Results: Intraclass correlation coefficients were in the excellent range (above 0.9) for both interobserver and intraobserver agreement using the SPECT/CT software. Bland-Altman analyses showed very narrow limits of agreement. Absolute differences were below 2.0% in 96% of both interobserver and intraobserver measurements. There was a statistically significant difference between planar and SPECT/CT methods (P < 0.001) for quantification of perfusion and ventilation for all right lung lobes, with a maximal mean absolute difference of 20.7% for the right middle lobe. There was no statistically significant difference in quantification of perfusion and ventilation for the left lung lobes using either method; however, absolute differences reached 12.0%. The total right and left lung contributions were similar for the two methods, with a mean difference of 1.2% for perfusion and 2.0% for ventilation. Conclusion: Quantification of regional lung perfusion and ventilation using SPECT/CT-based lung segmentation software is highly reproducible. This tridimensional method yields statistically significant differences in measurements for right lung lobes when compared with planar scintigraphy. We recommend that SPECT/CT-based quantification be used for all lung cancer patients undergoing pretherapy evaluation of regional lung function. © 2017 by the Society of Nuclear Medicine and Molecular Imaging.
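    A Bland-Altman analysis like the one reported reduces to the bias and the 95% limits of agreement of the paired differences; a minimal sketch with illustrative numbers, not the study's data:

```python
# Sketch of a Bland-Altman agreement analysis: bias and 95% limits of
# agreement for paired measurements from two methods (invented values).
import numpy as np

def bland_altman(a, b):
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)  # 95% LOA

planar = np.array([18.2, 9.5, 24.1, 12.8, 20.3])   # % lobar perfusion, method 1
spect  = np.array([17.6, 10.1, 23.4, 13.5, 19.8])  # % lobar perfusion, method 2
bias, (lo, hi) = bland_altman(planar, spect)
print(f"bias = {bias:.2f}%, LOA = [{lo:.2f}%, {hi:.2f}%]")
```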

  18. A comparative study of software programmes for cross-sectional skeletal muscle and adipose tissue measurements on abdominal computed tomography scans of rectal cancer patients.

    PubMed

    van Vugt, Jeroen L A; Levolger, Stef; Gharbharan, Arvind; Koek, Marcel; Niessen, Wiro J; Burger, Jacobus W A; Willemsen, Sten P; de Bruin, Ron W F; IJzermans, Jan N M

    2017-04-01

    The association between body composition (e.g. sarcopenia or visceral obesity) and treatment outcomes, such as survival, using single-slice computed tomography (CT)-based measurements has recently been studied in various patient groups. These studies have been conducted with different software programmes, each with their specific characteristics, of which the inter-observer, intra-observer, and inter-software correlation are unknown. Therefore, a comparative study was performed. Fifty abdominal CT scans were randomly selected from 50 different patients and independently assessed by two observers. Cross-sectional muscle area (CSMA, i.e. rectus abdominis, oblique and transverse abdominal muscles, paraspinal muscles, and the psoas muscle), visceral adipose tissue area (VAT), and subcutaneous adipose tissue area (SAT) were segmented by using standard Hounsfield unit ranges and computed for regions of interest. The inter-software, intra-observer, and inter-observer agreement for CSMA, VAT, and SAT measurements using FatSeg, OsiriX, ImageJ, and sliceOmatic were calculated using intra-class correlation coefficients (ICCs) and Bland-Altman analyses. Cohen's κ was calculated for the agreement of sarcopenia and visceral obesity assessment. The Jaccard similarity coefficient was used to compare the similarity and diversity of measurements. Bland-Altman analyses and ICC indicated that the CSMA, VAT, and SAT measurements between the different software programmes were highly comparable (ICC 0.979-1.000, P < 0.001). All programmes adequately distinguished between the presence or absence of sarcopenia (κ = 0.88-0.96 for one observer and κ = 1.00 for all comparisons of the other observer) and visceral obesity (all κ = 1.00). Furthermore, excellent intra-observer (ICC 0.999-1.000, P < 0.001) and inter-observer (ICC 0.998-0.999, P < 0.001) agreement was found for all software programmes. Accordingly, excellent Jaccard similarity coefficients were found for all comparisons (mean ≥ 0.964). FatSeg, OsiriX, ImageJ, and sliceOmatic showed an excellent agreement for CSMA, VAT, and SAT measurements on abdominal CT scans. Furthermore, excellent inter-observer and intra-observer agreement was achieved. Therefore, results of studies using these different software programmes can reliably be compared. © 2016 The Authors. Journal of Cachexia, Sarcopenia and Muscle published by John Wiley & Sons Ltd on behalf of the Society on Sarcopenia, Cachexia and Wasting Disorders.
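    The two categorical agreement statistics used here are compact enough to state in code; the sketch below computes Cohen's κ for binary sarcopenia calls and a Jaccard coefficient, with invented data rather than the study's measurements.

```python
# Sketch of the two agreement statistics used above: Cohen's kappa for
# binary classifications and the Jaccard coefficient for set overlap.
import numpy as np

def cohens_kappa(a, b):
    po = np.mean(a == b)                           # observed agreement
    pe = sum(np.mean(a == c) * np.mean(b == c)     # chance agreement
             for c in np.union1d(a, b))
    return (po - pe) / (1 - pe)

def jaccard(a, b):
    return len(a & b) / len(a | b)

obs1 = np.array([1, 0, 1, 1, 0, 0, 1, 0])   # sarcopenia calls, observer 1
obs2 = np.array([1, 0, 1, 0, 0, 0, 1, 0])   # sarcopenia calls, observer 2
print(round(cohens_kappa(obs1, obs2), 2))    # -> 0.75
print(jaccard({"p1", "p3", "p4", "p7"}, {"p1", "p3", "p7"}))  # -> 0.75
```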

  19. A comparative study of software programmes for cross‐sectional skeletal muscle and adipose tissue measurements on abdominal computed tomography scans of rectal cancer patients

    PubMed Central

    Levolger, Stef; Gharbharan, Arvind; Koek, Marcel; Niessen, Wiro J.; Burger, Jacobus W.A.; Willemsen, Sten P.; de Bruin, Ron W.F.

    2016-01-01

    Abstract Background The association between body composition (e.g. sarcopenia or visceral obesity) and treatment outcomes, such as survival, using single‐slice computed tomography (CT)‐based measurements has recently been studied in various patient groups. These studies have been conducted with different software programmes, each with their specific characteristics, of which the inter‐observer, intra‐observer, and inter‐software correlation are unknown. Therefore, a comparative study was performed. Methods Fifty abdominal CT scans were randomly selected from 50 different patients and independently assessed by two observers. Cross‐sectional muscle area (CSMA, i.e. rectus abdominis, oblique and transverse abdominal muscles, paraspinal muscles, and the psoas muscle), visceral adipose tissue area (VAT), and subcutaneous adipose tissue area (SAT) were segmented by using standard Hounsfield unit ranges and computed for regions of interest. The inter‐software, intra‐observer, and inter‐observer agreement for CSMA, VAT, and SAT measurements using FatSeg, OsiriX, ImageJ, and sliceOmatic were calculated using intra‐class correlation coefficients (ICCs) and Bland–Altman analyses. Cohen's κ was calculated for the agreement of sarcopenia and visceral obesity assessment. The Jaccard similarity coefficient was used to compare the similarity and diversity of measurements. Results Bland–Altman analyses and ICC indicated that the CSMA, VAT, and SAT measurements between the different software programmes were highly comparable (ICC 0.979–1.000, P < 0.001). All programmes adequately distinguished between the presence or absence of sarcopenia (κ = 0.88–0.96 for one observer and κ = 1.00 for all comparisons of the other observer) and visceral obesity (all κ = 1.00). Furthermore, excellent intra‐observer (ICC 0.999–1.000, P < 0.001) and inter‐observer (ICC 0.998–0.999, P < 0.001) agreement was found for all software programmes. Accordingly, excellent Jaccard similarity coefficients were found for all comparisons (mean ≥ 0.964). Conclusions FatSeg, OsiriX, ImageJ, and sliceOmatic showed an excellent agreement for CSMA, VAT, and SAT measurements on abdominal CT scans. Furthermore, excellent inter‐observer and intra‐observer agreement was achieved. Therefore, results of studies using these different software programmes can reliably be compared. PMID:27897414

  20. Development and Implementation of GPS Correlator Structures in MATLAB and Simulink with Focus on SDR Applications: Implementation of a Standard GPS Correlator Architecture (Baseline) Implementation of the MIT Quicksynch Sparse Algorithm Development and Implementation of Parallel Circular Correlator Constructs

    DTIC Science & Technology

    2014-05-01


  1. Computer-Generated, Three-Dimensional Spine Model From Biplanar Radiographs: A Validity Study in Idiopathic Scoliosis Curves Greater Than 50 Degrees.

    PubMed

    Carreau, Joseph H; Bastrom, Tracey; Petcharaporn, Maty; Schulte, Caitlin; Marks, Michelle; Illés, Tamás; Somoskeöy, Szabolcs; Newton, Peter O

    2014-03-01

    Reproducibility study of SterEOS 3-dimensional (3D) software in large idiopathic scoliosis (IS) spinal curves. To determine the accuracy and reproducibility of various 3D, software-generated radiographic measurements acquired from a 2-dimensional (2D) imaging system. SterEOS software allows a user to reconstruct a 3D spinal model from an upright, biplanar, low-dose X-ray system. The validity and internal consistency of this system have not been tested in large IS curves. EOS images from 30 IS patients with curves greater than 50° were collected for analysis. Three observers blinded to the study protocol conducted repeated, randomized, manual 2D measurements and 3D software-generated measurements from biplanar images acquired with an EOS Imaging system. Three-dimensional measurements were repeated using both the Full 3D and Fast 3D guided processes. A total of 180 (120 3D and 60 2D) sets of measurements were obtained of coronal (Cobb angle) and sagittal (T1-T12 and T4-T12 kyphosis; L1-S1 and L1-L5; and pelvic tilt, pelvic incidence, and sacral slope) parameters. Intra-class correlation coefficients were compared, as were the calculated differences between values generated by the SterEOS 3D software and manual 2D measurements. The 95% confidence intervals of the mean differences in measures were calculated as an estimate of reproducibility. Average intra-class correlation coefficients were excellent: 0.97, 0.97, and 0.93 for Full 3D, Fast 3D, and 2D measures, respectively (p = .11). Measurement errors for some sagittal measures were significantly lower with the 3D techniques. Both the Full 3D and Fast 3D techniques provided consistent measurements of axial-plane vertebral rotation. SterEOS 3D reconstruction spine software creates reproducible measurements in all 3 planes of deformity in curves greater than 50°. Advancements in 3D scoliosis imaging are expected to improve our understanding and treatment of idiopathic scoliosis. Copyright © 2014 Scoliosis Research Society. Published by Elsevier Inc. All rights reserved.

  2. Microwave Scanning System Correlations

    DTIC Science & Technology

    2010-08-11

    The following equipment is needed for each of the individual scanning systems. Handheld Scanner equipment list: 1. Dell Netbook (with the proper software installed by Evisive); 2. Bluetooth USB port transmitter; 3. Handheld Probe; 4. USB to mini-USB converter (links camera to netbook).

  3. Ability and efficiency of an automatic analysis software to measure microvascular parameters.

    PubMed

    Carsetti, Andrea; Aya, Hollmann D; Pierantozzi, Silvia; Bazurro, Simone; Donati, Abele; Rhodes, Andrew; Cecconi, Maurizio

    2017-08-01

    Analysis of the microcirculation is currently performed offline, is time consuming, and is operator dependent. The aim of this study was to assess the ability and efficiency of the automatic analysis software CytoCamTools 1.7.12 (CC) to measure microvascular parameters in comparison with Automated Vascular Analysis (AVA) software 3.2. Twenty-two patients admitted to the cardiothoracic intensive care unit following cardiac surgery were prospectively enrolled. Sublingual microcirculatory videos were analysed using the AVA and CC software. The total vessel density (TVD) for small vessels, perfused vessel density (PVD), and proportion of perfused vessels (PPV) were calculated. Blood flow was assessed using the microvascular flow index (MFI) for the AVA software and the averaged perfused speed indicator (APSI) for the CC software. The duration of the analysis was also recorded. Eighty-four videos from 22 patients were analysed. The bias between TVD-CC and TVD-AVA was 2.20 mm/mm2 (95% CI 1.37-3.03) with limits of agreement (LOA) of -4.39 (95% CI -5.66 to -3.16) and 8.79 (95% CI 7.50-10.01) mm/mm2. The percentage error (PE) for TVD was ±32.2%. TVD was positively correlated between CC and AVA (r = 0.74, p < 0.001). The bias between PVD-CC and PVD-AVA was 6.54 mm/mm2 (95% CI 5.60-7.48) with LOA of -4.25 (95% CI -8.48 to -0.02) and 17.34 (95% CI 13.11-21.57) mm/mm2. The PE for PVD was ±61.2%. PVD was positively correlated between CC and AVA (r = 0.66, p < 0.001). The median PPV-AVA was significantly higher than the median PPV-CC [97.39% (95.25, 100%) vs. 81.65% (61.97, 88.99), p < 0.0001]. MFI categories cannot estimate or predict APSI values (p = 0.45). The time required for the analysis was shorter with CC than with the AVA system [2'42″ (2'12″, 3'31″) vs. 16'12″ (13'38″, 17'57″), p < 0.001]. TVD is comparable between the two software packages, and the analysis is faster with the CC software. The values for PVD and PPV are not interchangeable given the different approaches to assessing microcirculatory flow.

  4. Is liver perfusion CT reproducible? A study on intra- and interobserver agreement of normal hepatic haemodynamic parameters obtained with two different software packages.

    PubMed

    Bretas, Elisa Almeida Sathler; Torres, Ulysses S; Torres, Lucas Rios; Bekhor, Daniel; Saito Filho, Celso Fernando; Racy, Douglas Jorge; Faggioni, Lorenzo; D'Ippolito, Giuseppe

    2017-10-01

    To evaluate the agreement between the measurements of perfusion CT parameters in normal livers by using two different software packages. This retrospective study was based on 78 liver perfusion CT examinations acquired for detecting suspected liver metastasis. Patients with any morphological or functional hepatic abnormalities were excluded. The final analysis included 37 patients (59.7 ± 14.9 y). Two readers (1 and 2) independently measured perfusion parameters using different software packages from two major manufacturers (A and B). Arterial perfusion (AP) and portal perfusion (PP) were determined using the dual-input vascular one-compartmental model. Inter-reader agreement for each package and intrareader agreement between both packages were assessed with intraclass correlation coefficients (ICC) and Bland-Altman statistics. Inter-reader agreement was substantial for AP using software A (ICC = 0.82) and B (ICC = 0.85-0.86), fair for PP using software A (ICC = 0.44) and fair to moderate for PP using software B (ICC = 0.56-0.77). Intrareader agreement between software A and B ranged from slight to moderate (ICC = 0.32-0.62) for readers 1 and 2 considering the AP parameters, and from fair to moderate (ICC = 0.40-0.69) for readers 1 and 2 considering the PP parameters. At best there was only moderate agreement between both software packages, resulting in some uncertainty and suboptimal reproducibility. Advances in knowledge: Software-dependent factors may contribute to variance in perfusion measurements, demanding further technical improvements. AP measurements seem to be the most reproducible parameter to be adopted when evaluating liver perfusion CT.

  5. PiVoT GPS Receiver

    NASA Technical Reports Server (NTRS)

    Wennersten, Miriam; Banes, Vince; Boegner, Greg; Clagnett, Charles; Dougherty, Lamar; Edwards, Bernard; Roman, Joe; Bauer, Frank H. (Technical Monitor)

    2001-01-01

    NASA Goddard Space Flight Center has built an open-architecture, 24-channel spaceflight Global Positioning System (GPS) receiver. The CompactPCI PiVoT GPS receiver card is based on the Mitel/GEC Plessey Builder 2 board. PiVoT uses two Plessey 2021 correlators to allow tracking of up to 24 separate GPS SVs on unique channels. Its four front ends can support four independent antennas, making it a useful card for hosting GPS attitude determination algorithms. It has been built using space-quality, radiation-tolerant parts. The PiVoT card works at a lower signal-to-noise ratio than the original Builder 2 board. It also hosts an improved clock oscillator. The PiVoT software is based on the original Plessey Builder 2 software ported to the Linux operating system. The software is POSIX compliant and can be easily converted to other POSIX operating systems. The software is open source to anyone with a licensing agreement with Plessey. Additional tasks can be added to the software to support GPS science experiments or attitude determination algorithms. The next-generation PiVoT receiver will be a single radiation-hardened CompactPCI card containing the microprocessor and the GPS receiver, optimized for use above the GPS constellation.

  6. [Job stressors in software developers--a comparison with other occupations].

    PubMed

    Kadokura, M

    1997-09-01

    The aim of this study is to investigate the differences in job stressors among software developers, sales staff, and clerical staff (n = 2,079) in two companies (A Co. and B Co.) using a self-administered questionnaire that included a job stressor scale and the 30-item General Health Questionnaire (GHQ). We developed the job stressor scale based on interviews with out-patients engaged in software development and on previous studies of job stressors. Factor analysis with a seven-factor solution showed that seven subscales were abstracted from the job stressor scale, namely: quantitative load of work, dissatisfaction with work, demanding work, uneasiness about work, human relations, ambiguity of work, and shortage of private time. Each subscale was significantly (r = .313-.442, p < 0.0001) correlated with the GHQ score and proved to be a reliable instrument, as indicated by a Cronbach's alpha of greater than 0.73. Stepwise multiple regression analysis revealed that the quantitative load of work and shortage of private time subscale scores were significantly high in software developers in A Co. Software developers in A Co. also tended to score higher (P < .10) than the others on the demanding work and ambiguity of work subscales. All subscale scores were significantly low in the clerical staff in B Co. There was no significant difference between the sales staff and software developers in B Co. Interviews with out-patients indicated that demanding work, hard deadlines, ambiguity of work, and precarious work would cause trouble for software developers. The implications of these findings with respect to occupational issues related to software developers are discussed.

  7. [Correlation research on contents of podophyllotoxin and total lignans in Sinopodophyllum hexandrum and ecological factors].

    PubMed

    Li, Min; Zhong, Guo-yue; Wu, Ao-lin; Zhang, Shou-wen; Jiang, Wei; Liang, Jian

    2015-05-01

    To explore the correlation between ecological factors and the contents of podophyllotoxin and total lignans in the root and rhizome of Sinopodophyllum hexandrum, podophyllotoxin in 87 samples (from 5 provinces) was determined by HPLC and total lignans by UV. A correlation and regression analysis was made with the software SPSS 16.0 in combination with ecological factors (terrain, soil, and climate). The content determination results showed a great difference between podophyllotoxin and total lignans, ranging from 1.001% to 6.230% and from 5.350% to 16.34%, respectively. The correlation and regression analysis in SPSS showed a positive linear correlation between the two contents; a strong positive correlation between the contents and both latitude and annual average rainfall within the sampling area; a weak negative correlation with soil pH value and organic material; weaker and stronger positive correlations with soil potassium; a weak negative correlation with slope and annual average temperature; and a weaker positive correlation between the podophyllotoxin content and soil potassium.

  8. Parallel-Processing Software for Correlating Stereo Images

    NASA Technical Reports Server (NTRS)

    Klimeck, Gerhard; Deen, Robert; Mcauley, Michael; DeJong, Eric

    2007-01-01

    A computer program implements parallel-processing algorithms for correlating images of terrain acquired by stereoscopic pairs of digital stereo cameras on an exploratory robotic vehicle (e.g., a Mars rover). Such correlations are used to create three-dimensional computational models of the terrain for navigation. In this program, the scene viewed by the cameras is segmented into subimages. Each subimage is assigned to one of a number of central processing units (CPUs) operating simultaneously.
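    The scheme is easy to picture in miniature: tile the left image, hand each tile to a worker process, and search for the best-correlating patch in the right image. The brute-force matcher, tile size, and search range below are simplified stand-ins for the real terrain correlator, not the NASA program itself.

```python
# Sketch of the parallelization described above: left-image tiles are
# distributed across worker processes, each finding the best horizontal
# match (disparity) in the right image by a simple correlation score.
import numpy as np
from multiprocessing import Pool

TILE = 64  # subimage size in pixels (illustrative)

def correlate_tile(args):
    (y, x), left_tile, right = args
    best, best_dx = -np.inf, 0
    for dx in range(-8, 9):                       # small horizontal search
        xs = x + dx
        if 0 <= xs and xs + TILE <= right.shape[1]:
            score = np.sum(left_tile * right[y:y + TILE, xs:xs + TILE])
            if score > best:
                best, best_dx = score, dx
    return (y, x), best_dx                        # disparity for this tile

def disparities(left, right, workers=4):
    jobs = [((y, x), left[y:y + TILE, x:x + TILE], right)
            for y in range(0, left.shape[0] - TILE + 1, TILE)
            for x in range(0, left.shape[1] - TILE + 1, TILE)]
    with Pool(workers) as pool:                   # one tile per CPU at a time
        return dict(pool.map(correlate_tile, jobs))

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    L = rng.random((256, 256))
    R = np.roll(L, 3, axis=1)                     # synthetic 3-pixel shift
    print(list(disparities(L, R).items())[:3])    # tiles report dx = 3
```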

  9. Study of Optimum Simulation Techniques for the Design and Evaluation of Anti-Jam Communication Systems

    DTIC Science & Technology

    1976-03-01

    pseudo-range and range rate correlations, and GDM software efficiency. Other simplifications include the elimination of all or part of the multipath...signal is available. Then the pdf parameters are trivially available by simple mean, variance, and correlation measurements on the quadrature signal...This report investigates the application of CSEL to the LES 8/9 and GPS satellite programs. In addition, a new analysis of the effects of soft and

  10. A graphical simulation software for instruction in cardiovascular mechanics physiology.

    PubMed

    Wildhaber, Reto A; Verrey, François; Wenger, Roland H

    2011-01-25

    Computer-supported, interactive e-learning systems are widely used in the teaching of physiology. However, the currently available complimentary software tools in the field of the physiology of cardiovascular mechanics have not yet been adapted to the latest systems software. Therefore, a simple-to-use replacement for undergraduate and graduate education was needed, including up-to-date graphical software that is validated and field-tested. Software compatible with Windows, based on modified versions of existing mathematical algorithms, was newly developed. Testing was performed during a full term of physiological lecturing to medical and biology students. The newly developed CLabUZH software models a reduced human cardiovascular loop containing all basic compartments: an isolated heart including an artificial electrical stimulator, main vessels, and the peripheral resistive components. Students can alter several physiological parameters interactively. The resulting output variables are plotted in x-y diagrams and, in addition, shown in an animated graphical model. CLabUZH offers insight into the relations of volume, pressure, and time dependency in the circulation and their correlation to the electrocardiogram (ECG). Established mechanisms such as the Frank-Starling law and the Windkessel effect are incorporated in this model. The CLabUZH software is self-contained, requires no extra installation, and runs on most of today's personal computer systems. CLabUZH is a user-friendly interactive computer programme that has proved useful in teaching the basic physiological principles of heart mechanics.
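    As one concrete example of the mechanisms such a model covers, a two-element Windkessel (C dP/dt = Q(t) - P/R) can be integrated in a few lines. This is an illustrative sketch with assumed parameter values, not the CLabUZH implementation.

```python
# Minimal two-element Windkessel sketch: arterial pressure P driven by a
# pulsatile inflow Q(t) against compliance C and peripheral resistance R.
# All parameter values are assumptions chosen for plausibility.
import math

R = 1.0      # peripheral resistance (mmHg*s/mL), assumed
C = 1.5      # arterial compliance (mL/mmHg), assumed
HR = 75      # heart rate (beats/min)
T = 60 / HR  # beat period (s)

def inflow(t):
    """Half-sine ejection during the first third of each beat (mL/s)."""
    phase = t % T
    return 300 * math.sin(math.pi * phase / (T / 3)) if phase < T / 3 else 0.0

P, dt = 80.0, 1e-3                          # initial pressure, Euler step
trace = []
for i in range(int(10 * T / dt)):           # simulate ten beats
    P += dt * (inflow(i * dt) - P / R) / C  # forward-Euler update
    trace.append(P)

last_beat = trace[-int(T / dt):]
print(f"systolic={max(last_beat):.1f} mmHg  diastolic={min(last_beat):.1f} mmHg")
```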

  11. NASA Tech Briefs, October 2003

    NASA Technical Reports Server (NTRS)

    2003-01-01

    Topics covered include: Cryogenic Temperature-Gradient Foam/Substrate Tensile Tester; Flight Test of an Intelligent Flight-Control System; Slat Heater Boxes for Thermal Vacuum Testing; System for Testing Thermal Insulation of Pipes; Electrical-Impedance-Based Ice-Thickness Gauges; Simulation System for Training in Laparoscopic Surgery; Flasher Powered by Photovoltaic Cells and Ultracapacitors; Improved Autoassociative Neural Networks; Toroidal-Core Microinductors Biased by Permanent Magnets; Using Correlated Photons to Suppress Background Noise; Atmospheric-Fade-Tolerant Tracking and Pointing in Wireless Optical Communication; Curved Focal-Plane Arrays Using Back-Illuminated High-Purity Photodetectors; Software for Displaying Data from Planetary Rovers; Software for Refining or Coarsening Computational Grids; Software for Diagnosis of Multiple Coordinated Spacecraft; Software Helps Retrieve Information Relevant to the User; Software for Simulating a Complex Robot; Software for Planning Scientific Activities on Mars; Software for Training in Pre-College Mathematics; Switching and Rectification in Carbon-Nanotube Junctions; Scandia-and-Yttria-Stabilized Zirconia for Thermal Barriers; Environmentally Safer, Less Toxic Fire-Extinguishing Agents; Multiaxial Temperature- and Time-Dependent Failure Model; Cloverleaf Vibratory Microgyroscope with Integrated Post; Single-Vector Calibration of Wind-Tunnel Force Balances; Microgyroscope with Vibrating Post as Rotation Transducer; Continuous Tuning and Calibration of Vibratory Gyroscopes; Compact, Pneumatically Actuated Filter Shuttle; Improved Bearingless Switched-Reluctance Motor; Fluorescent Quantum Dots for Biological Labeling; Growing Three-Dimensional Corneal Tissue in a Bioreactor; Scanning Tunneling Optical Resonance Microscopy; The Micro-Arcsecond Metrology Testbed; Detecting Moving Targets by Use of Soliton Resonances; and Finite-Element Methods for Real-Time Simulation of Surgery.

  12. Digital PIV (DPIV) Software Analysis System

    NASA Technical Reports Server (NTRS)

    Blackshire, James L.

    1997-01-01

    A software package was developed to provide a Digital PIV (DPIV) capability for NASA LaRC. The system provides an automated image capture, test correlation, and autocorrelation analysis capability for the Kodak Megaplus 1.4 digital camera system for PIV measurements. The package includes three separate programs that, when used together with the PIV data validation algorithm, constitute a complete DPIV analysis capability. The programs run on an IBM PC/AT host computer under either Microsoft Windows 3.1 or Windows 95 using a 'quickwin' format that allows a simple user interface and output capabilities in the Windows environment.
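    The correlation step at the core of any DPIV analysis can be illustrated independently of the LaRC package: FFT-based cross-correlation of one interrogation-window pair, with the correlation peak giving the particle-image displacement. A minimal sketch on synthetic data:

```python
# Core DPIV operation in NumPy: cross-correlate one interrogation-window
# pair via FFTs; the correlation peak location is the displacement.
import numpy as np

def piv_displacement(win_a, win_b):
    a = win_a - win_a.mean()
    b = win_b - win_b.mean()
    corr = np.fft.ifft2(np.fft.fft2(a).conj() * np.fft.fft2(b)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # map wrapped FFT indices to signed shifts
    return tuple(p if p <= s // 2 else p - s
                 for p, s in zip(peak, corr.shape))   # (dy, dx)

rng = np.random.default_rng(1)
frame1 = rng.random((32, 32))
frame2 = np.roll(frame1, (2, -3), axis=(0, 1))  # "particles" moved (+2, -3)
print(piv_displacement(frame1, frame2))          # -> (2, -3)
```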

  13. Theory of post-block 2 VLBI observable extraction

    NASA Technical Reports Server (NTRS)

    Lowe, Stephen T.

    1992-01-01

    The algorithms used in the post-Block II fringe-fitting software called 'Fit' are described. The steps needed to derive the very long baseline interferometry (VLBI) charged-particle corrected group delay, phase delay rate, and phase delay (the latter without resolving cycle ambiguities) are presented beginning with the set of complex fringe phasors as a function of observation frequency and time. The set of complex phasors is obtained from the JPL/CIT Block II correlator. The output of Fit is the set of charged-particle corrected observables (along with ancillary information) in a form amenable to the software program 'Modest.'
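    The heart of the extraction step can be sketched compactly: with complex fringe phasors on a frequency-by-time grid, the group delay is the fitted phase slope across frequency and the fringe rate is the fitted phase slope across time. The sketch below is a deliberately simplified illustration of that idea on synthetic phasors; the actual Fit program additionally handles charged-particle corrections, ambiguities, and noise weighting.

```python
# Simplified fringe-observable extraction: estimate group delay (phase
# slope vs. frequency) and fringe rate (phase slope vs. time) from a
# synthetic complex phasor grid. Illustrative only, not the 'Fit' code.
import numpy as np

def group_delay_and_fringe_rate(phasors, freqs_hz, times_s):
    """phasors: (n_freq, n_time) complex fringe array."""
    phi_f = np.unwrap(np.angle(phasors[:, 0]))               # phase vs freq
    tau_g = np.polyfit(freqs_hz, phi_f, 1)[0] / (2 * np.pi)  # group delay (s)
    phi_t = np.unwrap(np.angle(phasors[0, :]))               # phase vs time
    rate = np.polyfit(times_s, phi_t, 1)[0] / (2 * np.pi)    # fringe rate (Hz)
    return tau_g, rate

freqs = np.linspace(8.2e9, 8.6e9, 32)   # channel frequencies (Hz), assumed
times = np.arange(0, 10, 0.5)           # accumulation periods (s), assumed
true_tau, true_rate = 2.0e-9, 0.2       # 2 ns delay, 0.2 turns/s
model = np.exp(2j * np.pi * (true_tau * freqs[:, None]
                             + true_rate * times[None, :]))
print(group_delay_and_fringe_rate(model, freqs, times))  # ~ (2e-09, 0.2)
```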

  14. Identification of seedling cabbages and weeds using hyperspectral imaging

    USDA-ARS?s Scientific Manuscript database

    Target detection is one of the research focuses for precision chemical application. This study developed a method to identify seedling cabbages and weeds using hyperspectral imaging. In processing the image data with ENVI software, after dimension reduction, noise reduction, de-correlation for h...

  15. Software Requirements Analysis as Fault Predictor

    NASA Technical Reports Server (NTRS)

    Wallace, Dolores

    2003-01-01

    Waiting until the integration and system test phase to discover errors leads to more costly rework than resolving those same errors earlier in the lifecycle. Costs increase even more significantly once a software system has become operational. We can assess the quality of system requirements, but we do little to correlate this information either to system assurance activities or to long-term reliability projections - both of which remain unclear and anecdotal. Extending earlier work on requirements accomplished by the ARM tool, measuring requirements quality information against code complexity and test data for the same system may be used to predict specific software modules containing high-impact or deeply embedded faults now escaping in operational systems. Such knowledge would lead to more effective and efficient test programs. It may also enable insight into whether a program should be maintained or started over.

  16. A scalable correlator for multichannel diffuse correlation spectroscopy.

    PubMed

    Stapels, Christopher J; Kolodziejski, Noah J; McAdams, Daniel; Podolsky, Matthew J; Fernandez, Daniel E; Farkas, Dana; Christian, James F

    2016-02-01

    Diffuse correlation spectroscopy (DCS) is a technique which enables powerful and robust non-invasive optical studies of tissue micro-circulation and vascular blood flow. The technique amounts to autocorrelation analysis of coherent photons after their migration through moving scatterers and subsequent collection by single-mode optical fibers. A primary cost driver of DCS instruments is the commercial hardware-based correlator, limiting the proliferation of multi-channel instruments for validation of perfusion analysis as a clinical diagnostic metric. We present the development of a low-cost scalable correlator enabled by microchip-based time-tagging and a software-based multi-tau data analysis method. We discuss the capabilities of the instrument as well as the implementation and validation of 2- and 8-channel systems built for live-animal and pre-clinical settings.
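    A software multi-tau scheme of the kind mentioned can be sketched in a few lines: compute a handful of linear lags per level, then rebin the series 2:1 so the lag spacing (and averaging time) doubles at each level. The sketch below is a simplified illustration applied to a synthetic intensity trace, not the authors' analysis code.

```python
# Simplified software multi-tau autocorrelator: linear lags at each level,
# with 2:1 rebinning between levels so lag spacing doubles per level.
import numpy as np

def multitau_autocorr(x, lags_per_level=8, levels=6, dt=1e-6):
    x = np.asarray(x, float)
    taus, g2 = [], []
    for level in range(levels):
        for k in range(1, lags_per_level + 1):
            if k >= len(x):
                break
            num = np.mean(x[:-k] * x[k:])
            den = np.mean(x[:-k]) * np.mean(x[k:])
            taus.append(k * dt * 2 ** level)
            g2.append(num / den)                          # normalized g2(tau)
        n = len(x) // 2
        x = 0.5 * (x[:2 * n:2] + x[1:2 * n:2])            # 2:1 rebinning
    return np.array(taus), np.array(g2)

# Synthetic "intensity" with ~50-sample correlation time (AR(1) process).
rng = np.random.default_rng(2)
n, rho = 100_000, np.exp(-1 / 50)
sig = np.empty(n); sig[0] = 0.0
for i in range(1, n):
    sig[i] = rho * sig[i - 1] + np.sqrt(1 - rho**2) * rng.standard_normal()
tau, g2 = multitau_autocorr(3.0 + sig)
print(tau[:4], np.round(g2[:4], 3))
```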

  17. Extended local similarity analysis (eLSA) of microbial community and other time series data with replicates.

    PubMed

    Xia, Li C; Steele, Joshua A; Cram, Jacob A; Cardon, Zoe G; Simmons, Sheri L; Vallino, Joseph J; Fuhrman, Jed A; Sun, Fengzhu

    2011-01-01

    The increasing availability of time series microbial community data from metagenomics and other molecular biological studies has enabled the analysis of large-scale microbial co-occurrence and association networks. Among the many analytical techniques available, the Local Similarity Analysis (LSA) method is unique in that it captures local and potentially time-delayed co-occurrence and association patterns in time series data that cannot otherwise be identified by ordinary correlation analysis. However, LSA, as originally developed, does not consider time series data with replicates, which hinders the full exploitation of the available information. With replicates, it is possible to understand the variability of the local similarity (LS) score and to obtain its confidence interval. We extended our LSA technique to time series data with replicates and termed it extended LSA, or eLSA. Simulations showed the capability of eLSA to capture subinterval and time-delayed associations. We implemented the eLSA technique in an easy-to-use analytic software package. The software pipeline integrates data normalization, statistical correlation calculation, statistical significance evaluation, and association network construction steps. We applied the eLSA technique to microbial community and gene expression datasets, where unique time-dependent associations were identified. The extended LSA analysis technique was demonstrated to reveal statistically significant local and potentially time-delayed association patterns in replicated time series data beyond those found by ordinary correlation analysis. These statistically significant associations can provide insights into the real dynamics of biological systems. The newly designed eLSA software efficiently streamlines the analysis and is freely available from the eLSA homepage, which can be accessed at http://meta.usc.edu/softs/lsa.
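    The local similarity score at the heart of LSA can be sketched as a small dynamic program over rank-normalized series: keep a running sum of pointwise products that resets at zero, over a range of candidate time shifts. The sketch below shows only that scoring kernel; the replicate handling and significance machinery that eLSA adds are omitted.

```python
# Sketch of an LSA-style local similarity score: the strongest contiguous,
# possibly time-shifted run of co-variation between two normalized series.
import numpy as np
from scipy.stats import rankdata, norm

def normal_scores(x):
    """Rank-based inverse normal transform of one series."""
    return norm.ppf(rankdata(x) / (len(x) + 1))

def local_similarity(x, y, max_delay=3):
    x, y = normal_scores(x), normal_scores(y)
    n, best = len(x), 0.0
    for d in range(-max_delay, max_delay + 1):    # candidate time shifts
        xs = x[max(0, d):n + min(0, d)]
        ys = y[max(0, -d):n - max(0, d)]
        pos = neg = 0.0
        for p in xs * ys:
            pos = max(0.0, pos + p)               # positive-association run
            neg = max(0.0, neg - p)               # negative-association run
            best = max(best, pos, neg)
    return best / n                                # LS score

t = np.arange(60)
a = np.sin(t / 5) + 0.3 * np.random.default_rng(3).standard_normal(60)
b = np.roll(a, 2)                                  # b lags a by two steps
print(round(local_similarity(a, b), 2))
```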

  18. Extended local similarity analysis (eLSA) of microbial community and other time series data with replicates

    PubMed Central

    2011-01-01

    Background The increasing availability of time series microbial community data from metagenomics and other molecular biological studies has enabled the analysis of large-scale microbial co-occurrence and association networks. Among the many analytical techniques available, the Local Similarity Analysis (LSA) method is unique in that it captures local and potentially time-delayed co-occurrence and association patterns in time series data that cannot otherwise be identified by ordinary correlation analysis. However, LSA, as originally developed, does not consider time series data with replicates, which hinders the full exploitation of the available information. With replicates, it is possible to understand the variability of the local similarity (LS) score and to obtain its confidence interval. Results We extended our LSA technique to time series data with replicates and termed it extended LSA, or eLSA. Simulations showed the capability of eLSA to capture subinterval and time-delayed associations. We implemented the eLSA technique in an easy-to-use analytic software package. The software pipeline integrates data normalization, statistical correlation calculation, statistical significance evaluation, and association network construction steps. We applied the eLSA technique to microbial community and gene expression datasets, where unique time-dependent associations were identified. Conclusions The extended LSA analysis technique was demonstrated to reveal statistically significant local and potentially time-delayed association patterns in replicated time series data beyond those found by ordinary correlation analysis. These statistically significant associations can provide insights into the real dynamics of biological systems. The newly designed eLSA software efficiently streamlines the analysis and is freely available from the eLSA homepage, which can be accessed at http://meta.usc.edu/softs/lsa. PMID:22784572

  19. A novel method to estimate the volume of bone defects using cone-beam computed tomography: an in vitro study.

    PubMed

    Esposito, Stefano Andrea; Huybrechts, Bart; Slagmolen, Pieter; Cotti, Elisabetta; Coucke, Wim; Pauwels, Ruben; Lambrechts, Paul; Jacobs, Reinhilde

    2013-09-01

    The routine use of high-resolution images derived from 3-dimensional cone-beam computed tomography (CBCT) datasets enables the linear measurement of lesions in the maxillary and mandibular bones on 3 planes of space. Measurements on different planes would make it possible to obtain real volumetric assessments. In this study, we tested, in vitro, the accuracy and reliability of new dedicated software developed for volumetric lesion assessment in clinical endodontics. Twenty-seven bone defects were created around the apices of 8 teeth in 1 young bovine mandible to simulate periapical lesions of different sizes and shapes. The volume of each defect was determined by taking an impression of the defect using a silicone material. The samples were scanned using an Accuitomo 170 CBCT (J. Morita Mfg Co, Kyoto, Japan), and the data were uploaded into a newly developed dedicated software tool. Two endodontists acted as independent and calibrated observers. They analyzed each bone defect for volume. The difference between the direct volumetric measurements and the measurements obtained with the CBCT images was statistically assessed using a lack-of-fit test. A correlation study was undertaken using the Pearson product-moment correlation coefficient. Intra- and interobserver agreement was also evaluated. The results showed a good fit and strong correlation between both volume measurements (ρ > 0.9) with excellent inter- and intraobserver agreement. Using this software, CBCT proved to be a reliable method in vitro for the estimation of endodontic lesion volumes in bovine jaws. Therefore, it may constitute a new, validated technique for the accurate evaluation and follow-up of apical periodontitis. Copyright © 2013 American Association of Endodontists. Published by Elsevier Inc. All rights reserved.

  20. Reliability of a Single Light Source Purkinjemeter in Pseudophakic Eyes.

    PubMed

    Janunts, Edgar; Chashchina, Ekaterina; Seitz, Berthold; Schaeffel, Frank; Langenbucher, Achim

    2015-08-01

    To study the reliability of Purkinje image analysis for the assessment of intraocular lens tilt and decentration in pseudophakic eyes. The study comprised 64 eyes of 39 patients. All eyes underwent phacoemulsification with an intraocular lens implanted in the capsular bag. Lens decentration and tilt were measured multiple times by an infrared Purkinjemeter. A total of 396 measurements were performed 1 week and 1 month postoperatively. Lens tilt (Tx, Ty) and decentration (Dx, Dy) in the horizontal and vertical directions, respectively, were calculated by dedicated software based on regression analysis for each measurement using only four images, and afterward the data were averaged (mean values, MV) over repeated sequences of measurements. We designed new software to recalculate lens misalignment parameters offline, using the complete set of Purkinje images obtained through the repeated measurements (9 to 15 Purkinje images) (recalculated values, MV'). MV and MV' were compared using the SPSS statistical software package. MV and MV' were found to be highly correlated for the Tx and Ty parameters (R2 > 0.9; p < 0.001), moderately correlated for the Dx parameter (R2 > 0.7; p < 0.001), and weakly correlated for the Dy parameter (R2 = 0.23; p < 0.05). Reliability was high (Cronbach α > 0.9) for all measured parameters. Standard deviation values were 0.86 ± 0.69 degrees, 0.72 ± 0.65 degrees, 0.04 ± 0.05 mm, and 0.23 ± 0.34 mm for Tx, Ty, Dx, and Dy, respectively. The Purkinjemeter demonstrated high reliability and reproducibility for lens misalignment parameters. To further improve reliability, we recommend capturing at least six Purkinje images instead of three.

  1. Dental age estimation employing CBCT scans enhanced with Mimics software: Comparison of two different approaches using pulp/tooth volumetric analysis.

    PubMed

    Asif, Muhammad Khan; Nambiar, Phrabhakaran; Mani, Shani Ann; Ibrahim, Norliza Binti; Khan, Iqra Muhammad; Sukumaran, Prema

    2018-02-01

    The methods of dental age estimation and identification of unknown deceased individuals are evolving with the introduction of advanced, innovative imaging technologies in forensic investigations. However, assessing small structures like root canal volumes can be challenging in spite of using highly advanced technology. The aim of the study was to investigate which of two methods of volumetric analysis of maxillary central incisors displayed the higher strength of correlation between chronological age and the pulp/tooth volume ratio for Malaysian adults. Volumetric analysis of the pulp cavity/tooth ratio was employed in Method 1, and the pulp chamber/crown ratio (up to the cemento-enamel junction) was analysed in Method 2. The images were acquired with CBCT scans and enhanced by manipulating them with the Mimics software. The scans belonged to 56 males and 54 females whose ages ranged from 16 to 65 years. Pearson correlation and regression analysis indicated that both methods used for volumetric measurements showed a strong correlation between chronological age and the pulp/tooth volume ratio. However, Method 2 gave a higher coefficient of determination (R2 = 0.78) than Method 1 (R2 = 0.64). Moreover, manipulation in Method 2 was less time-consuming and revealed higher inter-examiner reliability (0.982), as no manual intervention during the 'multiple slice editing' phase of the software was required. In conclusion, this study showed that volumetric analysis of the pulp cavity/tooth ratio is a valuable gender-independent technique and that the Method 2 regression equation should be recommended for dental age estimation. Copyright © 2018 Elsevier Ltd and Faculty of Forensic and Legal Medicine. All rights reserved.
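    The kind of regression equation recommended above is straightforward to derive once the volume ratios are in hand; a minimal sketch with synthetic numbers (not the study's data):

```python
# Sketch of deriving an age-estimation regression from pulp/tooth volume
# ratios; the ratio and age values below are invented for illustration.
import numpy as np

ratio = np.array([0.085, 0.071, 0.060, 0.052, 0.043, 0.036, 0.030, 0.026])
age   = np.array([18,    24,    31,    38,    45,    52,    58,    64   ])

slope, intercept = np.polyfit(ratio, age, 1)   # linear fit: age ~ ratio
pred = slope * ratio + intercept
r2 = 1 - np.sum((age - pred) ** 2) / np.sum((age - age.mean()) ** 2)
print(f"age = {slope:.1f} * ratio + {intercept:.1f}   (R2 = {r2:.2f})")
```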

  2. Automated analysis of flow cytometric data for measuring neutrophil CD64 expression using a multi-instrument compatible probability state model.

    PubMed

    Wong, Linda; Hill, Beth L; Hunsberger, Benjamin C; Bagwell, C Bruce; Curtis, Adam D; Davis, Bruce H

    2015-01-01

    Leuko64™ (Trillium Diagnostics) is a flow cytometric assay that measures neutrophil CD64 expression and serves as an in vitro indicator of infection/sepsis or the presence of a systemic acute inflammatory response. The Leuko64 assay currently utilizes QuantiCALC, a semiautomated software that employs cluster algorithms to define cell populations. The software reduces subjective gating decisions, resulting in interanalyst variability of <5%. We evaluated a completely automated approach to measuring neutrophil CD64 expression using GemStone™ (Verity Software House) and probability state modeling (PSM). Four hundred and fifty-seven human blood samples were processed using the Leuko64 assay. Samples were analyzed on four different flow cytometer models: BD FACSCanto II, BD FACScan, BC Gallios/Navios, and BC FC500. A probability state model was designed to identify calibration beads and three leukocyte subpopulations based on differences in intensity levels of several parameters. PSM automatically calculates CD64 index values for each cell population using equations programmed into the model. GemStone software uses PSM that requires no operator intervention, thus totally automating data analysis and internal quality control flagging. Expert analysis with the predicate method (QuantiCALC) was performed. Interanalyst precision was evaluated for both methods of data analysis. PSM with GemStone correlates well with the expert manual analysis, r2 = 0.99675 for the neutrophil CD64 index values with no intermethod bias detected. The average interanalyst imprecision for the QuantiCALC method was 1.06% (range 0.00-7.94%), which was reduced to 0.00% with the GemStone PSM. The operator-to-operator agreement in GemStone was a perfect correlation, r2 = 1.000. Automated quantification of CD64 index values produced results that strongly correlate with expert analysis using a standard gate-based data analysis method. PSM successfully evaluated flow cytometric data generated by multiple instruments across multiple lots of the Leuko64 kit in all 457 cases. The probability-based method provides greater objectivity, higher data analysis speed, and allows for greater precision for in vitro diagnostic flow cytometric assays. © 2015 International Clinical Cytometry Society.

  3. Software for Verifying Image-Correlation Tie Points

    NASA Technical Reports Server (NTRS)

    Klimeck, Gerhard; Yagi, Gary

    2008-01-01

    A computer program enables assessment of the quality of tie points in the image-correlation processes of the software described in the immediately preceding article. Tie points are computed in mappings between corresponding pixels in the left and right images of a stereoscopic pair. The mappings are sometimes not perfect because image data can be noisy and parallax can cause some points to appear in one image but not the other. The present computer program relies on the availability of a left- right correlation map in addition to the usual right left correlation map. The additional map must be generated, which doubles the processing time. Such increased time can now be afforded in the data-processing pipeline, since the time for map generation is now reduced from about 60 to 3 minutes by the parallelization discussed in the previous article. Parallel cluster processing time, therefore, enabled this better science result. The first mapping is typically from a point (denoted by coordinates x,y) in the left image to a point (x',y') in the right image. The second mapping is from (x',y') in the right image to some point (x",y") in the left image. If (x,y) and(x",y") are identical, then the mapping is considered perfect. The perfect-match criterion can be relaxed by introducing an error window that admits of round-off error and a small amount of noise. The mapping procedure can be repeated until all points in each image not connected to points in the other image are eliminated, so that what remains are verified correlation data.

  4. Age estimation by dentin translucency measurement using digital method: An institutional study

    PubMed Central

    Gupta, Shalini; Chandra, Akhilesh; Agnihotri, Archana; Gupta, Om Prakash; Maurya, Niharika

    2017-01-01

    Aims: The aims of the present study were to measure translucency on sectioned teeth using available computer hardware and software, to correlate dimensions of root dentin translucency with age, and to assess whether translucency is reliable for age estimation. Materials and Methods: A pilot study was done on 62 freshly extracted single-rooted permanent teeth from 62 different individuals (35 males and 27 females); 250 μm thick sections were prepared with a micromotor, carborundum disks, and an Arkansas stone. Each tooth section was scanned, and the images were opened in the Adobe Photoshop software. Measurement of root dentin translucency (TD length) was done on the scanned image by placing two guides (A and B) along the x-axis of an ABFO No. 2 scale. The unpaired t-test, regression analysis, and Pearson correlation coefficient were used as statistical tools. Results: A linear relationship was observed between TD length and age in the regression analysis. The Pearson correlation analysis showed a positive correlation (r = 0.52, P = 0.0001) between TD length and age. However, no significant (P > 0.05) difference was observed in TD length between male (8.44 ± 2.92 mm) and female (7.80 ± 2.79 mm) samples. Conclusion: Translucency of the root dentin increases with age, and it can be used as a reliable parameter for age estimation. The method used here to digitally select and measure translucent root dentin is more refined, better correlated to age, and produces superior age estimation. PMID:28584476

  5. Pile/shaft designs using artificial neural networks (i.e., genetic programming) with spatial variability considerations : [summary].

    DOT National Transportation Integrated Search

    2014-03-01

    In this project, University of Florida researchers sought to improve the unit skin friction and tip resistance correlations embedded in the FB-Deep software algorithm for estimating driven pile and drilled shaft resistance. They utilized an a...

  6. GATE: software for the analysis and visualization of high-dimensional time series expression data.

    PubMed

    MacArthur, Ben D; Lachmann, Alexander; Lemischka, Ihor R; Ma'ayan, Avi

    2010-01-01

    We present Grid Analysis of Time series Expression (GATE), an integrated computational software platform for the analysis and visualization of high-dimensional biomolecular time series. GATE uses a correlation-based clustering algorithm to arrange molecular time series on a two-dimensional hexagonal array and dynamically colors individual hexagons according to the expression level of the molecular component to which they are assigned, to create animated movies of systems-level molecular regulatory dynamics. In order to infer potential regulatory control mechanisms from patterns of correlation, GATE also allows interactive interrogation of movies against a wide variety of prior knowledge datasets. GATE movies can be paused and are interactive, allowing users to reconstruct networks and perform functional enrichment analyses. Movies created with GATE can be saved in Flash format and can be inserted directly into PDF manuscript files as interactive figures. GATE is available for download and is free for academic use from http://amp.pharm.mssm.edu/maayan-lab/gate.htm

  7. THE DiskMass SURVEY. III. STELLAR KINEMATICS VIA CROSS-CORRELATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Westfall, Kyle B.; Bershady, Matthew A.; Verheijen, Marc A. W., E-mail: westfall@astro.rug.nl, E-mail: mab@astro.wisc.edu, E-mail: verheyen@astro.rug.nl

    2011-03-15

    We describe a new cross-correlation (CC) approach used by our survey to derive stellar kinematics from galaxy-continuum spectroscopy. This approach adopts the formal error analysis derived by Statler, but properly handles spectral masks. Thus, we address the primary concerns regarding application of the CC method to censored data, while maintaining its primary advantage by consolidating kinematic and template-mismatch information toward different regions of the CC function. We identify a systematic error in the nominal CC method of approximately 10% in velocity dispersion incurred by a mistreatment of detector-censored data, which is eliminated by our new method. We derive our approach from first principles, and we use Monte Carlo simulations to demonstrate its efficacy. An identical set of Monte Carlo simulations performed using the well-established penalized-pixel-fitting code of Cappellari and Emsellem compares favorably with the results from our newly implemented software. Finally, we provide a practical demonstration of this software by extracting stellar kinematics from SparsePak spectra of UGC 6918.
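    The kernel of any CC approach of this kind can be sketched briefly: on a log-wavelength grid, one pixel of lag corresponds to a fixed velocity step, so the peak of the galaxy-template cross-correlation gives the velocity shift. The sketch below uses synthetic spectra and omits exactly what the paper is about, masking and template-mismatch handling.

```python
# Core CC step for stellar kinematics: cross-correlate a galaxy spectrum
# with a template on a log-wavelength grid, where one pixel of lag equals
# a fixed velocity step dv = c * d(ln lambda). Synthetic data only.
import numpy as np

C_KMS = 299_792.458
dlnlam = 1e-4                 # log-wavelength step per pixel (assumed)
dv = C_KMS * dlnlam           # ~30 km/s per pixel

def cc_velocity(galaxy, template):
    g = galaxy - galaxy.mean()
    t = template - template.mean()
    cc = np.correlate(g, t, mode="full")
    lag = np.argmax(cc) - (len(t) - 1)   # signed pixel lag of the CC peak
    return lag * dv                       # velocity shift (km/s)

rng = np.random.default_rng(4)
template = rng.random(2000)
galaxy = np.roll(template, 5) + 0.05 * rng.standard_normal(2000)  # +5 px
print(cc_velocity(galaxy, template))      # ~ 5 * dv, i.e. ~150 km/s
```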

  8. Implementation of a noise reduction circuit for spaceflight IR spectrometers

    NASA Technical Reports Server (NTRS)

    Ramirez, L.; Hickok, R.; Pain, B.; Staller, C.

    1992-01-01

    The paper discusses the implementation and analysis of a correlated triple sampling circuit using analog subtractor/integrators. The software and test setup for noise measurements are also described. The correlation circuitry is part of the signal chain for a 256-element InSb line array used in the Visible and Infrared Mapping Spectrometer. Using a focal-plane array (FPA) simulator, system noise measurements of 0.7 DN are obtained. A test setup for FPA/SPE (signal processing electronics) characterization along with noise measurements is demonstrated.

  9. Correlator optical wavefront sensor COWS

    NASA Astrophysics Data System (ADS)

    1991-02-01

    This report documents the significant upgrades and improvements made to the correlator optical wavefront sensor (COWS) optical bench during this phase of the program. Software for the experiment was reviewed and documented. Flowcharts showing the program flow are included, as well as documentation for programs written to calculate and display Zernike polynomials. The system was calibrated and aligned, and a series of experiments was conducted to determine the optimum settings for the input and output MOSLM polarizers. In addition, the design of a simple aberration generator is included.

  10. Mingus Discontinuous Multiphysics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pat Notz, Dan Turner

    Mingus provides hybrid coupled local/non-local mechanics analysis capabilities that extend several traditional methods to applications with inherent discontinuities. Its primary features include adaptations of solid mechanics, fluid dynamics, and digital image correlation that naturally accommodate disjointed data or irregular solution fields by assimilating a variety of discretizations (such as control-volume finite elements, peridynamics, and meshless control point clouds). The goal of this software is to provide an analysis framework for multiphysics engineering problems, with an integrated image correlation capability that can be used for experimental validation and model

  11. Caries risk assessment in schoolchildren - a form based on Cariogram® software

    PubMed Central

    CABRAL, Renata Nunes; HILGERT, Leandro Augusto; FABER, Jorge; LEAL, Soraya Coelho

    2014-01-01

    Identifying caries risk factors is an important measure which contributes to a better understanding of the cariogenic profile of the patient. The Cariogram® software provides this analysis, and protocols simplifying the method have been suggested. Objectives The aim of this study was to determine whether a newly developed Caries Risk Assessment (CRA) form based on the Cariogram® software could classify schoolchildren according to their caries risk, and to evaluate relationships between caries risk and the variables in the form. Material and Methods 150 schoolchildren aged 5 to 7 years old were included in this survey. Caries prevalence was obtained according to the International Caries Detection and Assessment System (ICDAS) II. Information for filling in the form based on the Cariogram® was collected clinically and from questionnaires sent to parents. Linear regression and a forward stepwise multiple regression model were applied to correlate the variables included in the form with the caries risk. Results Caries prevalence in the primary dentition was 98.6% when enamel and dentine carious lesions were included, and 77.3% when only dentine lesions were considered. Eighty-six percent of the children were classified as at moderate caries risk. The forward stepwise multiple regression model result was significant (R2=0.904; p<0.00001), showing that the most significant factors influencing caries risk were caries experience, oral hygiene, frequency of food consumption, sugar consumption and fluoride sources. Conclusion The use of the form based on the Cariogram® software enabled classification of the schoolchildren at low, moderate and high caries risk. Caries experience, oral hygiene, frequency of food consumption, sugar consumption and fluoride sources are the variables that were shown to be highly correlated with caries risk. PMID:25466473

  12. Quantification of protein expression in cells and cellular subcompartments on immunohistochemical sections using a computer supported image analysis system.

    PubMed

    Braun, Martin; Kirsten, Robert; Rupp, Niels J; Moch, Holger; Fend, Falko; Wernert, Nicolas; Kristiansen, Glen; Perner, Sven

    2013-05-01

    Quantification of protein expression based on immunohistochemistry (IHC) is an important step for translational research and clinical routine. Several manual ('eyeballing') scoring systems are used in order to semi-quantify protein expression based on chromogenic intensities and distribution patterns. However, manual scoring systems are time-consuming and subject to significant intra- and interobserver variability. The aim of our study was to explore whether new image analysis software proves to be sufficient as an alternative tool to quantify protein expression. For the IHC experiments, one nucleus-specific marker (i.e., ERG antibody), one cytoplasm-specific marker (i.e., SLC45A3 antibody), and one marker expressed in both compartments (i.e., TMPRSS2 antibody) were chosen. Stainings were applied on TMAs containing tumor material of 630 prostate cancer patients. A pathologist visually quantified all IHC stainings in a blinded manner, applying a four-step scoring system. For digital quantification, image analysis software (Tissue Studio v.2.1, Definiens AG, Munich, Germany) was applied to obtain a continuous spectrum of average staining intensity. For each of the three antibodies we found a strong correlation between the manual protein expression score and the score of the image analysis software. Spearman's rank correlation coefficient was 0.94, 0.92, and 0.90 for ERG, SLC45A3, and TMPRSS2, respectively (p<0.01). Our data suggest that the image analysis software Tissue Studio is a powerful tool for quantification of protein expression in IHC stainings. Further, since the digital analysis is precise and reproducible, computer-supported protein quantification might help to overcome intra- and interobserver variability and increase objectivity of IHC based protein assessment.
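    The agreement check reported above amounts to a Spearman rank correlation between the ordinal visual score and the continuous software intensity; a sketch with made-up stand-in values, not study data:

```python
# Spearman rank correlation between a four-step manual score and a
# continuous software-derived staining intensity (illustrative values).
from scipy.stats import spearmanr

manual_score = [0, 1, 1, 2, 2, 3, 3, 3]
software_intensity = [0.05, 0.21, 0.18, 0.44, 0.52, 0.71, 0.69, 0.80]

rho, p = spearmanr(manual_score, software_intensity)
print(f"Spearman rho = {rho:.2f}, p = {p:.3g}")
```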

  13. Total focusing method with correlation processing of antenna array signals

    NASA Astrophysics Data System (ADS)

    Kozhemyak, O. A.; Bortalevich, S. I.; Loginov, E. L.; Shinyakov, Y. A.; Sukhorukov, M. P.

    2018-03-01

    The article proposes a method of preliminary correlation processing of a complete set of antenna array signals used in the image reconstruction algorithm. The results of experimental studies of 3D reconstruction of various reflectors, with and without correlation processing, are presented in the article. The software ‘IDealSystem3D’ by IDeal-Technologies was used for the experiments. Copper wires of different diameters located in a water bath were used as reflectors. The use of correlation processing makes it possible to obtain a more accurate reconstruction of the image of the reflectors and to increase the signal-to-noise ratio. The experimental results were processed using an original program that allows the parameters of the antenna array and the sampling frequency to be varied.
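    The reconstruction step itself is typically a delay-and-sum over the full matrix of transmit-receive A-scans; the sketch below shows that core loop under simple assumptions (known sound speed, linear array at y = 0, no correlation pre-processing, illustrative names):

```python
# Minimal total focusing method (delay-and-sum) over full matrix capture
# a[tx, rx, t]; elem_x holds element positions, fs is the sampling rate.
import numpy as np

def tfm_image(a, elem_x, fs, c, grid_x, grid_z):
    n_el = len(elem_x)
    img = np.zeros((len(grid_z), len(grid_x)))
    for iz, z in enumerate(grid_z):
        for ix, x in enumerate(grid_x):
            d = np.hypot(elem_x - x, z)          # element-to-pixel distances
            for tx in range(n_el):
                for rx in range(n_el):
                    t = (d[tx] + d[rx]) / c      # round-trip delay
                    s = int(round(t * fs))       # nearest time sample
                    if s < a.shape[2]:
                        img[iz, ix] += a[tx, rx, s]
    return np.abs(img)
```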

  14. Separation in Logistic Regression: Causes, Consequences, and Control.

    PubMed

    Mansournia, Mohammad Ali; Geroldinger, Angelika; Greenland, Sander; Heinze, Georg

    2018-04-01

    Separation is encountered in regression models with a discrete outcome (such as logistic regression) where the covariates perfectly predict the outcome. It is most frequent under the same conditions that lead to small-sample and sparse-data bias, such as presence of a rare outcome, rare exposures, highly correlated covariates, or covariates with strong effects. In theory, separation will produce infinite estimates for some coefficients. In practice, however, separation may be unnoticed or mishandled because of software limits in recognizing and handling the problem and in notifying the user. We discuss causes of separation in logistic regression and describe how common software packages deal with it. We then describe methods that remove separation, focusing on the same penalized-likelihood techniques used to address more general sparse-data problems. These methods improve accuracy, avoid software problems, and allow interpretation as Bayesian analyses with weakly informative priors. We discuss likelihood penalties, including some that can be implemented easily with any software package, and their relative advantages and disadvantages. We provide an illustration of ideas and methods using data from a case-control study of contraceptive practices and urinary tract infection.
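    A small sketch of the phenomenon and of the stabilizing effect of a likelihood penalty. Ridge (L2) penalization is used here purely as an illustration of the general idea; the Firth-type penalties discussed in the paper need a dedicated package (e.g., R's logistf) and are not shown.

```python
# A covariate that perfectly predicts the outcome creates separation;
# unpenalized ML would drive the coefficient toward infinity, while an
# L2 penalty keeps the estimate finite.
import numpy as np
from sklearn.linear_model import LogisticRegression

x = np.array([[0.0], [1.0], [2.0], [3.0], [4.0], [5.0]])
y = np.array([0, 0, 0, 1, 1, 1])          # x >= 3 perfectly predicts y = 1

penalized = LogisticRegression(penalty="l2", C=1.0).fit(x, y)
print(penalized.coef_)                     # finite, shrunken coefficient
```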

  15. Comparative assessment of software for non-targeted data analysis in the study of volatile fingerprint changes during storage of a strawberry beverage.

    PubMed

    Morales, M L; Callejón, R M; Ordóñez, J L; Troncoso, A M; García-Parrilla, M C

    2017-11-03

    Five free software packages were compared to assess their utility for the non-targeted study of changes in the volatile profile during the storage of a novel strawberry beverage. AMDIS coupled to Gavin software turned out to be easy to use, required minimal handling for subsequent data treatment, and gave results closest to those obtained by manual integration. However, AMDIS coupled to SpectConnect software provided more information for the study of volatile profile changes during storage of the strawberry beverage. During storage, the volatile profile changed, differentiating strawberry beverages stored at different temperatures, and this difference increased with time; these results were also supported by PCA. As expected, cold temperature appears to be the best means of preserving this product during long-term storage. Variable Importance in the Projection (VIP) and correlation scores pointed to four volatile compounds as potential shelf-life markers for the strawberry beverage: 2-phenylethyl acetate, decanoic acid, γ-decalactone and furfural.

  16. A dose-response curve for biodosimetry from a 6 MV electron linear accelerator

    PubMed Central

    Lemos-Pinto, M.M.P.; Cadena, M.; Santos, N.; Fernandes, T.S.; Borges, E.; Amaral, A.

    2015-01-01

    Biological dosimetry (biodosimetry) is based on the investigation of radiation-induced biological effects (biomarkers), mainly dicentric chromosomes, in order to correlate them with radiation dose. To interpret the dicentric score in terms of absorbed dose, a calibration curve is needed. Each curve should be constructed with respect to basic physical parameters, such as the type of ionizing radiation characterized by low or high linear energy transfer (LET) and dose rate. This study was designed to obtain dose calibration curves by scoring of dicentric chromosomes in peripheral blood lymphocytes irradiated in vitro with a 6 MV electron linear accelerator (Mevatron M, Siemens, USA). Two software programs, CABAS (Chromosomal Aberration Calculation Software) and Dose Estimate, were used to generate the curve. The two software programs are discussed; the results obtained were compared with each other and with other published low LET radiation curves. Both software programs resulted in identical linear and quadratic terms for the curve presented here, which was in good agreement with published curves for similar radiation quality and dose rates. PMID:26445334
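    The calibration curve in question is the linear-quadratic dicentric yield, Y = c + alpha*D + beta*D², fitted by the cited programs; a sketch with made-up dose points (a full treatment would weight each point by its Poisson error):

```python
# Fit the linear-quadratic dose-response Y = c + alpha*D + beta*D^2.
# Dose points and dicentric yields below are illustrative, not study data.
import numpy as np
from scipy.optimize import curve_fit

def lq(D, c, alpha, beta):
    return c + alpha * D + beta * D ** 2

dose = np.array([0.0, 0.5, 1.0, 2.0, 3.0, 4.0])                 # Gy
yield_dic = np.array([0.001, 0.03, 0.09, 0.29, 0.58, 0.97])     # dicentrics/cell

(c, alpha, beta), _ = curve_fit(lq, dose, yield_dic)
print(f"c={c:.4f}, alpha={alpha:.3f}/Gy, beta={beta:.3f}/Gy^2")
```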

  17. An artificial intelligence approach to lithostratigraphic correlation using geophysical well logs

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Olea, R.A.; Davis, J.C.

    1986-01-01

    Computer programs for lithostratigraphic correlation of well logs have achieved limited success. Their algorithms are based on an oversimplified view of the manual process used by analysts to establish geologically correct correlations. The programs experience difficulties if the correlated rocks deviate from an ideal geometry of perfectly homogeneous, parallel layers of infinite extent. Artificial intelligence provides a conceptual basis for formulating the task of lithostratigraphic correlation, leading to more realistic procedures. A prototype system using the ''production rule'' approach of expert systems successfully correlates well logs in areas of stratigraphic complexity. Two digitized logs are used per well, one for curve matching and the other for lithologic comparison. The software has been successfully used to correlate more than 100,000 ft (30 480 m) of section, through clastic sequences in Louisiana and through carbonate sequences in Kansas. Correlations have been achieved even in the presence of faults, unconformities, facies changes, and lateral variations in bed thickness.

  18. Artificial neural networks to model formulation-property correlations in the process of inline-compounding on an injection moulding machine

    NASA Astrophysics Data System (ADS)

    Moritzer, Elmar; Müller, Ellen; Martin, Yannick; Kleeschulte, Rainer

    2015-05-01

    Today the global market poses great challenges for industrial product development. Complexity, diversity of variants, flexibility and individuality are just some of the features that products have to offer today. In addition, the product series have shorter lifetimes. Because of their high capacity for adaption, polymers are increasingly able to displace traditional materials such as wood, glass and metals from various fields of application. Polymers can only be used to substitute other materials, however, if they are optimally suited to the applications in question. Hence, product-specific material development is becoming increasingly important. Integrating the compounding step in the injection moulding process permits a more efficient and faster development process for a new polymer formulation, making it possible to create new product-specific materials. This process is called inline-compounding on an injection moulding machine. The entire process sequence is supported by software from Bayer Technology called Product Design Workbench (PDWB), which provides assistance in all the individual steps from data management, via analysis and model compilation, right through to the optimization of the formulation and the design of experiments. The software is based on artificial neural networks and can model the formulation-property correlations and thus enable different formulations to be optimized. In the study presented, the workflow and the modelling with the software are presented.
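    A sketch of the general idea only (not Bayer's PDWB): fit a small neural network to formulation-property data, then query candidate formulations. The feature columns and the toy property relation are invented for illustration.

```python
# Learn a formulation -> property mapping with a small MLP, then predict
# the property of a candidate formulation.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import make_pipeline

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(200, 3))      # three hypothetical fractions
y = 40 + 25 * X[:, 0] - 10 * X[:, 1] ** 2 + 5 * X[:, 0] * X[:, 2]  # toy property

model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16, 16),
                                   max_iter=5000, random_state=0))
model.fit(X, y)
print(model.predict([[0.5, 0.2, 0.3]]))   # property of a candidate formulation
```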

  19. Cartilage quantification using contrast-enhanced MRI in the wrist of rheumatoid arthritis: cartilage loss is associated with bone marrow edema.

    PubMed

    Fujimori, Motoshi; Nakamura, Satoko; Hasegawa, Kiminori; Ikeno, Kunihiro; Ichikawa, Shota; Sutherland, Kenneth; Kamishima, Tamotsu

    2017-08-01

    To quantify wrist cartilage using contrast-enhanced MRI and to compare it with the extent of adjacent synovitis and bone marrow edema (BME) in patients with rheumatoid arthritis (RA). 18 patients with RA underwent post-contrast fat-suppressed T1-weighted coronal imaging. Cartilage area at the centre of the scaphoid-capitate (SC) and radius-scaphoid (RS) joints was measured by in-house developed software. We defined cartilage as the pixels with signal intensity between two thresholds (lower: 0.4, 0.5 and 0.6 times the muscle signal; upper: 0.9, 1.0, 1.1, 1.2 and 1.3 times the muscle signal). We investigated the association of cartilage loss with the synovitis and BME scores derived from the RA MRI scoring system. Cartilage area was correlated with BME score when the thresholds were adequately set, with the lower threshold at 0.6 times and the upper threshold at 1.2 times the muscle signal, for both the SC (rs = -0.469, p < 0.05) and RS (rs = -0.486, p < 0.05) joints, while it showed no significant correlation with synovitis score at any threshold. Our software can quantify cartilage in the wrist using conventional MR images, and BME is associated with cartilage loss in patients with RA. Advances in knowledge: Our software can quantify cartilage using conventional MR images of the wrist. BME is associated with cartilage loss in RA patients.
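    The pixel classification described reduces to a two-sided threshold relative to the mean muscle signal; a minimal sketch with illustrative names, using the 0.6/1.2 thresholds reported above:

```python
# Count ROI pixels whose signal lies between lower and upper multiples of
# the mean muscle signal and convert the count to an area.
import numpy as np

def cartilage_area(roi, muscle_mean, lower=0.6, upper=1.2, pixel_mm2=0.1):
    mask = (roi > lower * muscle_mean) & (roi < upper * muscle_mean)
    return mask.sum() * pixel_mm2         # cartilage area in mm^2
```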

  20. Comparison of Eyemetrics and Orbscan automated method to determine horizontal corneal diameter

    PubMed Central

    Venkataraman, Arvind; Mardi, Sapna K; Pillai, Sarita

    2010-01-01

    Purpose: To compare horizontal corneal diameter measurements using the Orbscan Eyemetrics function and the Orbscan corneal topographer. Materials and Methods: Seventy-three eyes of 37 patients were included in the study. In all cases, automated white-to-white (WTW) measurements were obtained using the Orbscan by two observers. Using the Eyemetrics function, the WTW was measured manually by the same observers from limbus to limbus using the digital caliper passing through the five-point corneal reflections on the Orbscan real image. The data were analyzed using SPSS software for correlation, reliability and inter-rater repeatability. Results: The mean horizontal corneal diameter was 11.74 ± 0.32 mm (SD) with the Orbscan and 11.92 ± 0.33 mm (SD) with the Eyemetrics software-based measurement. A good positive correlation (Spearman r = 0.720, P = 0.026) was found between these two measurements. The coefficient of inter-rater repeatability was 0.89 for the Orbscan and 0.94 for the Eyemetrics software measurements on the anterior segment images. The Bland and Altman analysis showed large limits of agreement between the Orbscan WTW and Eyemetrics WTW measurements. The intra-session repeatability scores for repeat measurements of the Orbscan WTW and Eyemetrics measurements were good. Conclusion: Eyemetrics can be used to measure WTW, and the Eyemetrics-measured WTW was longer than the WTW measured by Orbscan. PMID:20413925
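    The Bland and Altman analysis mentioned computes the bias and 95% limits of agreement from the paired differences; a generic sketch of that computation:

```python
# Bland-Altman bias and 95% limits of agreement for two paired methods.
import numpy as np

def bland_altman(a, b):
    diff = np.asarray(a, float) - np.asarray(b, float)
    bias = diff.mean()
    loa = 1.96 * diff.std(ddof=1)
    return bias, bias - loa, bias + loa   # bias, lower limit, upper limit
```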

  1. Accurate analysis and visualization of cardiac (11)C-PIB uptake in amyloidosis with semiautomatic software.

    PubMed

    Kero, Tanja; Lindsjö, Lars; Sörensen, Jens; Lubberink, Mark

    2016-08-01

    (11)C-PIB PET is a promising non-invasive diagnostic tool for cardiac amyloidosis. Semiautomatic analysis of PET data is now available but it is not known how accurate these methods are for amyloid imaging. The aim of this study was to evaluate the feasibility of one semiautomatic software tool for analysis and visualization of (11)C-PIB left ventricular retention index (RI) in cardiac amyloidosis. Patients with systemic amyloidosis and cardiac involvement (n = 10) and healthy controls (n = 5) were investigated with dynamic (11)C-PIB PET. Two observers analyzed the PET studies with semiautomatic software to calculate the left ventricular RI of (11)C-PIB and to create parametric images. The mean RI at 15-25 min from the semiautomatic analysis was compared with RI based on manual analysis and showed comparable values (0.056 vs 0.054 min(-1) for amyloidosis patients and 0.024 vs 0.025 min(-1) in healthy controls; P = .78) and the correlation was excellent (r = 0.98). Inter-reader reproducibility also was excellent (intraclass correlation coefficient, ICC > 0.98). Parametric polarmaps and histograms made visual separation of amyloidosis patients and healthy controls fast and simple. Accurate semiautomatic analysis of cardiac (11)C-PIB RI in amyloidosis patients is feasible. Parametric polarmaps and histograms make visual interpretation fast and simple.

  2. Using BMDP and SPSS for a Q factor analysis.

    PubMed

    Tanner, B A; Koning, S M

    1980-12-01

    While Euclidean distances and Q factor analysis may sometimes be preferred to correlation coefficients and cluster analysis for developing a typology, commercially available software does not always facilitate their use. Commands are provided for using BMDP and SPSS in a Q factor analysis with Euclidean distances.

  3. Business Computer Programming I. Curriculum Guide. Second Edition.

    ERIC Educational Resources Information Center

    Patton, Jan

    This guide provides instructors of business computer programming with a curriculum correlated directly to the office education essential elements mandated by the Texas Education Agency. Introductory materials include a scope and sequence and lists of suggested textbooks, resource books, software, audiovisuals, and magazines. Eleven units are…

  4. A Data Collection and Representation Framework for Software and Human-Computer Interaction Measurements.

    DTIC Science & Technology

    2000-01-04

    by Miara, Musselman, Navarro, and Shneiderman [Miara et al. 1983] they found that indentation correlated strongly with comprehension. They tested 47... Dissertation, Auburn University, Auburn, AL, August 1996. MIARA, R.J., MUSSELMAN, J.A., NAVARRO, J.A., AND SHNEIDERMAN, B. 1983. Program Indentation and

  5. CATCHING THE WIND: A LOW COST METHOD FOR WIND POWER SITE ASSESSMENT

    EPA Science Inventory

    Our Phase I successes involve the installation of a wind monitoring station in Humboldt County, the evaluation of four different measure-correlate-predict methods for wind site assessment, and the creation of SWEET, an open source software package implementing the prediction ...
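    The simplest measure-correlate-predict (MCP) variant regresses concurrent site wind speeds on a reference station and applies the fit to the reference's long-term record; a sketch of that idea (not SWEET itself, whose methods are not described here):

```python
# Linear MCP: fit site = slope*ref + intercept on the concurrent period,
# then extrapolate the site's wind climate from the reference record.
import numpy as np

def mcp_linear(site, ref_concurrent, ref_longterm):
    slope, intercept = np.polyfit(ref_concurrent, site, 1)
    return slope * np.asarray(ref_longterm) + intercept  # predicted site speeds
```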

  6. Process air quality data

    NASA Technical Reports Server (NTRS)

    Butler, C. M.; Hogge, J. E.

    1978-01-01

    Air quality sampling was conducted. Data for air quality parameters, recorded on written forms, punched cards or magnetic tape, are available for 1972 through 1975. Computer software was developed to (1) calculate several daily statistical measures of location, (2) plot time histories of data or the calculated daily statistics, (3) calculate simple correlation coefficients, and (4) plot scatter diagrams. Computer software was developed for processing air quality data to include time series analysis and goodness of fit tests. Computer software was developed to (1) calculate a larger number of daily statistical measures of location, and a number of daily monthly and yearly measures of location, dispersion, skewness and kurtosis, (2) decompose the extended time series model and (3) perform some goodness of fit tests. The computer program is described, documented and illustrated by examples. Recommendations are made for continuation of the development of research on processing air quality data.
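    The described processing (daily statistical measures of location plus simple correlation coefficients) maps directly onto modern tooling; a sketch with a hypothetical file name and column names:

```python
# Daily statistics and pairwise correlations from a timestamped series.
import pandas as pd

df = pd.read_csv("air_quality.csv", parse_dates=["time"], index_col="time")
daily = df["ozone"].resample("D").agg(["mean", "median", "std", "max"])
print(daily.head())
print(df[["ozone", "no2"]].corr())        # simple correlation coefficients
```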

  7. PiVoT GPS Receiver

    NASA Technical Reports Server (NTRS)

    Wennersten, Miriam Dvorak; Banes, Anthony Vince; Boegner, Gregory J.; Dougherty, Lamar; Edwards, Bernard L.; Roman, Joseph; Bauer, Frank H. (Technical Monitor)

    2001-01-01

    NASA Goddard Space Flight Center has built an open architecture, 24 channel space flight GPS receiver. The CompactPCI PiVoT GPS receiver card is based on the Mitel/GEC Plessey Builder-2 board. PiVoT uses two Plessey 2021 correlators to allow tracking of up to 24 separate GPS SVs on unique channels. Its four front ends can support four independent antennas, making it a useful card for hosting GPS attitude determination algorithms. It has been built using space quality, radiation tolerant parts. The PiVoT card will track a weaker signal than the original Builder-2 board. It also hosts an improved clock oscillator. The PiVoT software is based on the original Plessey Builder-2 software ported to the Linux operating system. The software is POSIX compliant and can easily be converted to other POSIX operating systems. The software is open source to anyone with a licensing agreement with Plessey. Additional tasks can be added to the software to support GPS science experiments or attitude determination algorithms. The next generation PiVoT receiver will be a single radiation hardened CompactPCI card containing the microprocessor and the GPS receiver, optimized for use above the GPS constellation. PiVoT flew successfully on a balloon in July 2001, its first non-simulated flight.

  8. Large-N correlator systems for low frequency radio astronomy

    NASA Astrophysics Data System (ADS)

    Foster, Griffin

    Low frequency radio astronomy has entered a second golden age driven by the development of a new class of large-N interferometric arrays. The low frequency array (LOFAR) and a number of redshifted HI Epoch of Reionization (EoR) arrays are currently undergoing commissioning and observing regularly. Future arrays of unprecedented sensitivity and resolution at low frequencies, such as the square kilometer array (SKA) and the hydrogen epoch of reionization array (HERA), are in development. The combination of advancements in specialized field programmable gate array (FPGA) hardware for signal processing, computing and graphics processing unit (GPU) resources, and new imaging and calibration algorithms has opened up the oft underused radio band below 300 MHz. These interferometric arrays require efficient implementation of digital signal processing (DSP) hardware to compute the baseline correlations. FPGA technology provides an optimal platform to develop new correlators. The significant growth in data rates from these systems requires automated software to reduce the correlations in real time before storing the data products to disk. Low frequency, widefield observations introduce a number of unique calibration and imaging challenges. The efficient implementation of FX correlators using FPGA hardware is presented. Two correlators have been developed, one for the 32 element BEST-2 array at Medicina Observatory and the other for the 96 element LOFAR station at Chilbolton Observatory. In addition, calibration and imaging software has been developed for each system which makes use of the radio interferometry measurement equation (RIME) to derive calibrations. A process for generating sky maps from widefield LOFAR station observations is presented. Shapelets, a method of modelling extended structures such as resolved sources and beam patterns, has been adapted for radio astronomy use to further improve system calibration. Scaling of computing technology allows for the development of larger correlator systems, which in turn allows for improvements in sensitivity and resolution. This requires new calibration techniques which account for a broad range of systematic effects.
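    The FX architecture named above channelizes each antenna's voltage stream with an FFT (the F stage), then cross-multiplies and accumulates spectra for every baseline (the X stage); a bare-bones sketch of that dataflow (real correlators add polyphase filterbanks, delay tracking and fringe rotation, all omitted here):

```python
# Minimal FX correlator: FFT each antenna's samples, then cross-multiply
# and accumulate per baseline. voltages has shape (n_ant, n_samples).
import numpy as np

def fx_correlate(voltages, nchan):
    n_ant, n_samp = voltages.shape
    n_spec = n_samp // (2 * nchan)
    vis = np.zeros((n_ant, n_ant, nchan), dtype=complex)
    for k in range(n_spec):
        seg = voltages[:, k * 2 * nchan:(k + 1) * 2 * nchan]
        spec = np.fft.rfft(seg, axis=1)[:, :nchan]           # F stage
        vis += spec[:, None, :] * spec[None, :, :].conj()    # X stage
    return vis / n_spec           # time-averaged visibility spectra
```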

  9. Exploring Initiative as a Signal of Knowledge Co-Construction During Collaborative Problem Solving.

    PubMed

    Howard, Cynthia; Di Eugenio, Barbara; Jordan, Pamela; Katz, Sandra

    2017-08-01

    Peer interaction has been found to be conducive to learning in many settings. Knowledge co-construction (KCC) has been proposed as one explanatory mechanism. However, KCC is a theoretical construct that is too abstract to guide the development of instructional software that can support peer interaction. In this study, we present an extensive analysis of a corpus of peer dialogs that we collected in the domain of introductory Computer Science. We show that the notion of task initiative shifts correlates with both KCC and learning. Speakers take task initiative when they contribute new content that advances problem solving and that is not invited by their partner; if initiative shifts between the partners, it indicates they both contribute to problem solving. We found that task initiative shifts occur more frequently within KCC episodes than outside. In addition, task initiative shifts within KCC episodes correlate with learning for low pre-testers, and total task initiative shifts correlate with learning for high pre-testers. As recognizing task initiative shifts does not require as much deep knowledge as recognizing KCC, task initiative shifts as an indicator of productive collaboration are potentially easier to model in instructional software that simulates a peer.

  10. The Effects of Size and Type of Vocal Fold Polyp on Some Acoustic Voice Parameters.

    PubMed

    Akbari, Elaheh; Seifpanahi, Sadegh; Ghorbani, Ali; Izadi, Farzad; Torabinezhad, Farhad

    2018-03-01

    Vocal abuse and misuse can result in vocal fold polyps. Certain features define the extent of a vocal fold polyp's effects on acoustic voice parameters. The present study aimed to define the effects of polyp size on acoustic voice parameters, and to compare these parameters in hemorrhagic and non-hemorrhagic polyps. In the present retrospective study, 28 individuals with hemorrhagic or non-hemorrhagic polyps of the true vocal folds were recruited to investigate the acoustic voice parameters of the vowel /æ/ computed by the Praat software. The data were analyzed using the SPSS software, version 17.0. According to the type and size of the polyps, mean acoustic differences and correlations were analyzed by the statistical t test and Pearson correlation test, respectively, with significance level below 0.05. The results indicated that jitter and the harmonics-to-noise ratio had significant positive and negative correlations with polyp size (P=0.01), respectively. In addition, both mentioned parameters were significantly different between the two types of the investigated polyps. Both the type and size of polyps have effects on acoustic voice characteristics. In the present study, a novel method to measure polyp size was introduced. Further confirmation of this method as a tool to compare polyp sizes requires additional investigations.

  11. The Effects of Size and Type of Vocal Fold Polyp on Some Acoustic Voice Parameters

    PubMed Central

    Akbari, Elaheh; Seifpanahi, Sadegh; Ghorbani, Ali; Izadi, Farzad; Torabinezhad, Farhad

    2018-01-01

    Background Vocal abuse and misuse can result in vocal fold polyps. Certain features define the extent of a vocal fold polyp's effects on acoustic voice parameters. The present study aimed to define the effects of polyp size on acoustic voice parameters, and to compare these parameters in hemorrhagic and non-hemorrhagic polyps. Methods In the present retrospective study, 28 individuals with hemorrhagic or non-hemorrhagic polyps of the true vocal folds were recruited to investigate the acoustic voice parameters of the vowel /æ/ computed by the Praat software. The data were analyzed using the SPSS software, version 17.0. According to the type and size of the polyps, mean acoustic differences and correlations were analyzed by the statistical t test and Pearson correlation test, respectively, with significance level below 0.05. Results The results indicated that jitter and the harmonics-to-noise ratio had significant positive and negative correlations with polyp size (P=0.01), respectively. In addition, both mentioned parameters were significantly different between the two types of the investigated polyps. Conclusion Both the type and size of polyps have effects on acoustic voice characteristics. In the present study, a novel method to measure polyp size was introduced. Further confirmation of this method as a tool to compare polyp sizes requires additional investigations. PMID:29749984

  12. the-wizz: clustering redshift estimation for everyone

    NASA Astrophysics Data System (ADS)

    Morrison, C. B.; Hildebrandt, H.; Schmidt, S. J.; Baldry, I. K.; Bilicki, M.; Choi, A.; Erben, T.; Schneider, P.

    2017-05-01

    We present the-wizz, an open-source, user-friendly software package for estimating the redshift distributions of photometric galaxies with unknown redshifts by spatially cross-correlating them against a reference sample with known redshifts. The main benefit of the-wizz is in separating the angular pair finding and correlation estimation from the computation of the output clustering redshifts, allowing anyone to create a clustering redshift for their sample without the intervention of an 'expert'. It allows the end user of a given survey to select any subsample of photometric galaxies with unknown redshifts, match this sample's catalogue indices into a value-added data file and produce a clustering redshift estimation for this sample in a fraction of the time it would take to run all the angular correlations needed to produce a clustering redshift. We show results with this software using photometric data from the Kilo-Degree Survey (KiDS) and spectroscopic redshifts from the Galaxy and Mass Assembly survey and the Sloan Digital Sky Survey. The results we present for KiDS are consistent with the redshift distributions used in a recent cosmic shear analysis from the survey. We also present results using a hybrid machine learning-clustering redshift analysis that enables the estimation of clustering redshifts for individual galaxies. the-wizz can be downloaded at http://github.com/morriscb/The-wiZZ/.

  13. MODEST: A Tool for Geodesy and Astronomy

    NASA Technical Reports Server (NTRS)

    Sovers, Ojars J.; Jacobs, Christopher S.; Lanyi, Gabor E.

    2004-01-01

    Features of the JPL VLBI modeling and estimation software "MODEST" are reviewed. Its main advantages include thoroughly documented model physics, portability, and detailed error modeling. Two unique models are included: modeling of source structure and modeling of both spatial and temporal correlations in tropospheric delay noise. History of the code parallels the development of the astrometric and geodetic VLBI technique and the software retains many of the models implemented during its advancement. The code has been traceably maintained since the early 1980s, and will continue to be updated with recent IERS standards. Scripts are being developed to facilitate user-friendly data processing in the era of e-VLBI.

  14. Diffraction-geometry refinement in the DIALS framework

    DOE PAGES

    Waterman, David G.; Winter, Graeme; Gildea, Richard J.; ...

    2016-03-30

    Rapid data collection and modern computing resources provide the opportunity to revisit the task of optimizing the model of diffraction geometry prior to integration. A comprehensive description is given of new software that builds upon established methods by performing a single global refinement procedure, utilizing a smoothly varying model of the crystal lattice where appropriate. This global refinement technique extends to multiple data sets, providing useful constraints to handle the problem of correlated parameters, particularly for small wedges of data. Examples of advanced uses of the software are given and the design is explained in detail, with particular emphasis on the flexibility and extensibility it entails.

  15. Scientific Computation Application Partnerships in Materials and Chemical Sciences, Charge Transfer and Charge Transport in Photoactivated Systems, Developing Electron-Correlated Methods for Excited State Structure and Dynamics in the NWChem Software Suite

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cramer, Christopher J.

    Charge transfer and charge transport in photoactivated systems are fundamental processes that underlie solar energy capture, solar energy conversion, and photoactivated catalysis, both organometallic and enzymatic. We developed methods, algorithms, and software tools needed for reliable treatment of the underlying physics for charge transfer and charge transport, an undertaking with broad applicability to the goals of the fundamental-interaction component of the Department of Energy Office of Basic Energy Sciences and the exascale initiative of the Office of Advanced Scientific Computing Research.

  16. Self-learning health monitoring algorithm in composite structures

    NASA Astrophysics Data System (ADS)

    Grassia, Luigi; Iannone, Michele; Califano, America; D'Amore, Alberto

    2018-02-01

    The paper describes a system that is able to monitor the health state of a composite structure in real time. The hardware of the system consists of an array of strain sensors wired to a control unit. The software elaborates the strain data and is able to detect, in real time, the presence of eventual damage in the structures monitored with the strain sensors. The algorithm requires as input only the strains of the monitored structure measured in real time, i.e., the strains arising from deformation of the composite structure under the working loads. The health monitoring system does not require any additional device to interrogate the structure, as is often used in the literature; instead, it is based on a self-learning procedure. The strain data acquired while the structure is healthy are used to set up the correlations between the strains at different positions of the structure by means of a neural network. Once the correlations between the strains at different positions have been set up, they act as a fingerprint of the healthy structure. In case of damage, the correlation between the strains at positions near the damage will change, owing to the change in the stiffness of the structure caused by the damage. The developed software is able to recognize the change in the transfer function between the strains and consequently is able to detect the damage.
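    A sketch of the self-learning scheme as described: learn to predict each sensor from the others on healthy data, then flag damage when the prediction residual grows. The network size and the 3-sigma alarm threshold are illustrative choices, not the paper's.

```python
# Residual-based damage detection: a network reconstructs one sensor from
# the others; a persistent jump in its residual flags possible damage.
import numpy as np
from sklearn.neural_network import MLPRegressor

def train_healthy(strains_healthy, target=0):
    X = np.delete(strains_healthy, target, axis=1)   # the other sensors
    y = strains_healthy[:, target]                   # sensor to reconstruct
    net = MLPRegressor(hidden_layer_sizes=(32,), max_iter=3000,
                       random_state=0).fit(X, y)
    resid = y - net.predict(X)
    return net, 3.0 * resid.std()                    # model + alarm threshold

def is_damaged(net, threshold, strains_now, target=0):
    X = np.delete(strains_now, target, axis=1)
    resid = strains_now[:, target] - net.predict(X)
    return np.abs(resid).mean() > threshold
```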

  17. Dynamic gadolinium-enhanced magnetic resonance imaging allows accurate assessment of the synovial inflammatory activity in rheumatoid arthritis knee joints: a comparison with synovial histology.

    PubMed

    Axelsen, M B; Stoltenberg, M; Poggenborg, R P; Kubassova, O; Boesen, M; Bliddal, H; Hørslev-Petersen, K; Hanson, L G; Østergaard, M

    2012-03-01

    To determine whether dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) evaluated using semi-automatic image processing software can accurately assess synovial inflammation in rheumatoid arthritis (RA) knee joints. In 17 RA patients undergoing knee surgery, the average grade of histological synovial inflammation was determined from four biopsies obtained during surgery. A preoperative series of T(1)-weighted dynamic fast low-angle shot (FLASH) MR images was obtained. Parameters characterizing contrast uptake dynamics, including the initial rate of enhancement (IRE), were generated by the software in three different areas: (I) the entire slice (Whole slice); (II) a manually outlined region of interest (ROI) drawn quickly around the joint, omitting large artefacts such as blood vessels (Quick ROI); and (III) a manually outlined ROI following the synovial capsule of the knee joint (Precise ROI). Intra- and inter-reader agreement was assessed using the intra-class correlation coefficient (ICC). The IRE from the Quick ROI and the Precise ROI revealed high correlations to the grade of histological inflammation (Spearman's correlation coefficient (rho) = 0.70, p = 0.001 and rho = 0.74, p = 0.001, respectively). Intra- and inter-reader ICCs were very high (0.93-1.00). No Whole slice parameters were correlated to histology. DCE-MRI provides fast and accurate assessment of synovial inflammation in RA patients. Manual outlining of the joint to omit large artefacts is necessary.

  18. The correlation between preoperative volumetry and real graft weight: comparison of two volumetry programs.

    PubMed

    Mussin, Nadiar; Sumo, Marco; Lee, Kwang-Woong; Choi, YoungRok; Choi, Jin Yong; Ahn, Sung-Woo; Yoon, Kyung Chul; Kim, Hyo-Sin; Hong, Suk Kyun; Yi, Nam-Joon; Suh, Kyung-Suk

    2017-04-01

    Liver volumetry is a vital component in living donor liver transplantation to determine an adequate graft volume that meets the metabolic demands of the recipient and at the same time ensures donor safety. Most institutions use preoperative contrast-enhanced CT image-based software programs to estimate graft volume. The objective of this study was to evaluate the accuracy of 2 liver volumetry programs (Rapidia vs. Dr. Liver) in preoperative right liver graft estimation compared with real graft weight. Data from 215 consecutive right lobe living donors between October 2013 and August 2015 were retrospectively reviewed. One hundred seven patients were enrolled in the Rapidia group and 108 patients were included in the Dr. Liver group. Estimated graft volumes generated by both software programs were compared with the real graft weight measured during surgery, and further classified into minimal difference (≤15%) and big difference (>15%). Correlation coefficients and degrees of difference were determined. Linear regressions were calculated and results depicted as scatterplots. Minimal difference was observed in 69.4% of cases from the Dr. Liver group and big difference was seen in 44.9% of cases from the Rapidia group (P = 0.035). Linear regression analysis showed positive correlation in both groups (P < 0.01). However, the correlation coefficient was better for the Dr. Liver group (R² = 0.719) than for the Rapidia group (R² = 0.688). Dr. Liver can predict right liver graft size more accurately and faster than Rapidia, and can facilitate preoperative planning in living donor liver transplantation.

  19. Mathematical and Statistical Software Index.

    DTIC Science & Technology

    1986-08-01

    [Index excerpt; layout lost in extraction] The index lists statistical routines and keywords such as a geometric-mean routine, HMEAN (harmonic mean), MEDIAN (median), MODE (mode), QUANT (quantiles), OGIVE (distribution curve), IQRNG (interpercentile range), RANGE (range), a multiphase pivoting algorithm, cross-classification, multiple discriminant analysis, cross-tabulation, multiple-objective models, curve fitting, *RANGEX (Correct Correlations for Curtailment of Range) and *RUMMAGE II (Analysis ...).

  20. Correlating Computer Database Programs with Social Studies Instruction.

    ERIC Educational Resources Information Center

    Northwest Regional Educational Lab., Portland, OR.

    This unit emphasizes the integration of software in a focus on the classroom instruction process. Student activities are based on plans and ideas for instructional units presented by a teacher who describes and demonstrates the activities. Integration has occurred when computer applications are included in an instructional activity. This guide…

  1. An Undergraduate Electrical Engineering Course on Computer Organization.

    ERIC Educational Resources Information Center

    Commission on Engineering Education, Washington, DC.

    Outlined is an undergraduate electrical engineering course on computer organization designed to meet the need for electrical engineers familiar with digital system design. The program includes both hardware and software aspects of digital systems essential to design function and correlates design and organizational aspects of the subject. The…

  2. Quantification of myocardial fibrosis by digital image analysis and interactive stereology

    PubMed Central

    2014-01-01

    Background Cardiac fibrosis disrupts the normal myocardial structure and has a direct impact on heart function and survival. Despite already available digital methods, the pathologist’s visual score is still widely considered as ground truth and used as a primary method in histomorphometric evaluations. The aim of this study was to compare the accuracy of digital image analysis tools and the pathologist’s visual scoring for evaluating fibrosis in human myocardial biopsies, based on reference data obtained by point counting performed on the same images. Methods Endomyocardial biopsy material from 38 patients diagnosed with inflammatory dilated cardiomyopathy was used. The extent of total cardiac fibrosis was assessed by image analysis on Masson’s trichrome-stained tissue specimens using automated Colocalization and Genie software, by Stereology grid count and manually by Pathologist’s visual score. Results A total of 116 slides were analyzed. The mean results obtained by the Colocalization software (13.72 ± 12.24%) were closest to the reference value of stereology (RVS), while the Genie software and Pathologist score gave a slight underestimation. RVS values correlated strongly with values obtained using the Colocalization and Genie (r > 0.9, p < 0.001) software as well as the pathologist visual score. Differences in fibrosis quantification by Colocalization and RVS were statistically insignificant. However, significant bias was found in the results obtained by using Genie versus RVS and pathologist score versus RVS with mean difference values of: -1.61% and 2.24%. Bland-Altman plots showed a bidirectional bias dependent on the magnitude of the measurement: Colocalization software overestimated the area fraction of fibrosis in the lower end, and underestimated in the higher end of the RVS values. Meanwhile, Genie software as well as the pathologist score showed more uniform results throughout the values, with a slight underestimation in the mid-range for both. Conclusion Both applied digital image analysis methods revealed almost perfect correlation with the criterion standard obtained by stereology grid count and, in terms of accuracy, outperformed the pathologist’s visual score. Genie algorithm proved to be the method of choice with the only drawback of a slight underestimation bias, which is considered acceptable for both clinical and research evaluations. Virtual slides The virtual slide(s) for this article can be found here: http://www.diagnosticpathology.diagnomx.eu/vs/9857909611227193 PMID:24912374

  3. Quantification of myocardial fibrosis by digital image analysis and interactive stereology.

    PubMed

    Daunoravicius, Dainius; Besusparis, Justinas; Zurauskas, Edvardas; Laurinaviciene, Aida; Bironaite, Daiva; Pankuweit, Sabine; Plancoulaine, Benoit; Herlin, Paulette; Bogomolovas, Julius; Grabauskiene, Virginija; Laurinavicius, Arvydas

    2014-06-09

    Cardiac fibrosis disrupts the normal myocardial structure and has a direct impact on heart function and survival. Despite already available digital methods, the pathologist's visual score is still widely considered as ground truth and used as a primary method in histomorphometric evaluations. The aim of this study was to compare the accuracy of digital image analysis tools and the pathologist's visual scoring for evaluating fibrosis in human myocardial biopsies, based on reference data obtained by point counting performed on the same images. Endomyocardial biopsy material from 38 patients diagnosed with inflammatory dilated cardiomyopathy was used. The extent of total cardiac fibrosis was assessed by image analysis on Masson's trichrome-stained tissue specimens using automated Colocalization and Genie software, by Stereology grid count and manually by Pathologist's visual score. A total of 116 slides were analyzed. The mean results obtained by the Colocalization software (13.72 ± 12.24%) were closest to the reference value of stereology (RVS), while the Genie software and Pathologist score gave a slight underestimation. RVS values correlated strongly with values obtained using the Colocalization and Genie (r>0.9, p<0.001) software as well as the pathologist visual score. Differences in fibrosis quantification by Colocalization and RVS were statistically insignificant. However, significant bias was found in the results obtained by using Genie versus RVS and pathologist score versus RVS with mean difference values of: -1.61% and 2.24%. Bland-Altman plots showed a bidirectional bias dependent on the magnitude of the measurement: Colocalization software overestimated the area fraction of fibrosis in the lower end, and underestimated in the higher end of the RVS values. Meanwhile, Genie software as well as the pathologist score showed more uniform results throughout the values, with a slight underestimation in the mid-range for both. Both applied digital image analysis methods revealed almost perfect correlation with the criterion standard obtained by stereology grid count and, in terms of accuracy, outperformed the pathologist's visual score. Genie algorithm proved to be the method of choice with the only drawback of a slight underestimation bias, which is considered acceptable for both clinical and research evaluations. The virtual slide(s) for this article can be found here: http://www.diagnosticpathology.diagnomx.eu/vs/9857909611227193.

  4. Comparative test-retest reliability of metabolite values assessed with magnetic resonance spectroscopy of the brain. The LCModel versus the manufacturer software.

    PubMed

    Fayed, Nicolas; Modrego, Pedro J; Medrano, Jaime

    2009-06-01

    Reproducibility is an essential strength of any diagnostic technique for cross-sectional and longitudinal work. The aim was to comparatively determine the short-term in vivo test-retest reliability of magnetic resonance spectroscopy (MRS) of the brain using the manufacturer's software package and the widely used linear combination of models (LCModel) technique. Single-voxel H-MRS was performed in a series of patients with different pathologies on a 1.5 T clinical scanner. Four areas of the brain were explored with the point resolved spectroscopy technique acquisition mode; the echo time was 35 milliseconds and the repetition time was 2000 milliseconds. We enrolled 15 patients for every area, and the intra-individual variations of metabolites were studied in two consecutive scans without removing the patient from the scanner. Curve fitting and analysis of metabolites were made with the GE software and the LCModel. Spectra not fulfilling the minimum quality criteria for linewidth and signal-to-noise ratio were rejected. The intraclass correlation coefficients for the N-acetylaspartate/creatine (NAA/Cr) ratios were 0.93, 0.89, 0.9 and 0.8 for the posterior cingulate gyrus, occipital, prefrontal and temporal regions, respectively, with the GE software. For the LCModel, the coefficients were 0.9, 0.89, 0.87 and 0.84, respectively. For the absolute value of NAA, the GE software was also slightly more reproducible than the LCModel. However, for the choline/Cr and myo-inositol/Cr ratios, the LCModel was more reliable than the GE software. The variability we observed hovers around the percentages reported previously (around 10% for the NAA/Cr ratios). We did not find the LCModel software to be superior to the manufacturer's software. Reproducibility of metabolite values relies more on observance of the quality parameters than on the software used.

  5. Software use in the (re)habilitation of hearing impaired children.

    PubMed

    Silva, Mariane Perin da; Comerlatto Junior, Ademir Antonio; Balen, Sheila Andreoli; Bevilacqua, Maria Cecília

    2012-01-01

    To verify the applicability of a software program in the (re)habilitation of hearing impaired children. The sample comprised 17 children with hearing impairment, ten with cochlear implants (CI) and seven with hearing aids (HA). The Software Auxiliar na Reabilitação de Distúrbios Auditivos - SARDA (Auxiliary Software for the Rehabilitation of Hearing Disorders) was used. The training protocol was applied for 30 minutes, twice a week, for the time necessary to complete the strategies proposed in the software. To measure the software's applicability for training speech perception in quiet and in noise, subjects were assessed with the Hearing in Noise Test (HINT) before and after the auditory training. Data were statistically analyzed. The group of CI users needed, on average, 12.2 days to finish the strategies, and the group of HA users 10.14 days on average. Both groups presented differences between pre- and post-assessments, both in quiet and in noise. Younger children showed more difficulty executing the strategies; however, there was no correlation between age and performance. The type of electronic device did not influence the training. Children presented greater difficulty in the strategy involving non-verbal stimuli and in the strategy with verbal stimuli that trains sustained attention. The children's attention and motivation during stimulation were fundamental for successful auditory training. Auditory training using the SARDA was effective, improving speech perception both in quiet and in noise for the hearing impaired children.

  6. Multicenter Study Validating Accuracy of a Continuous Respiratory Rate Measurement Derived From Pulse Oximetry: A Comparison With Capnography.

    PubMed

    Bergese, Sergio D; Mestek, Michael L; Kelley, Scott D; McIntyre, Robert; Uribe, Alberto A; Sethi, Rakesh; Watson, James N; Addison, Paul S

    2017-04-01

    Intermittent measurement of respiratory rate via observation is routine in many patient care settings. This approach has several inherent limitations that diminish the clinical utility of these measurements because it is intermittent, susceptible to human error, and requires clinical resources. As an alternative, a software application that derives continuous respiratory rate measurement from a standard pulse oximeter has been developed. We sought to determine the performance characteristics of this new technology by comparison with clinician-reviewed capnography waveforms in both healthy subjects and hospitalized patients in a low-acuity care setting. Two independent observational studies were conducted to validate the performance of the Medtronic Nellcor Respiration Rate Software application. One study enrolled 26 healthy volunteer subjects in a clinical laboratory, and a second multicenter study enrolled 53 hospitalized patients. During a 30-minute study period taking place while participants were breathing spontaneously, pulse oximeter and nasal/oral capnography waveforms were collected. Pulse oximeter waveforms were processed to determine respiratory rate via the Medtronic Nellcor Respiration Rate Software. Capnography waveforms reviewed by a clinician were used to determine the reference respiratory rate. A total of 23,243 paired observations between the pulse oximeter-derived respiratory rate and the capnography reference method were collected and examined. The mean reference-based respiratory rate was 15.3 ± 4.3 breaths per minute with a range of 4 to 34 breaths per minute. The Pearson correlation coefficient between the Medtronic Nellcor Respiration Rate Software values and the capnography reference respiratory rate is reported as a linear correlation, R, as 0.92 ± 0.02 (P < .001), whereas Lin's concordance correlation coefficient indicates an overall agreement of 0.85 ± 0.04 (95% confidence interval [CI] +0.76; +0.93) (healthy volunteers: 0.94 ± 0.02 [95% CI +0.91; +0.97]; hospitalized patients: 0.80 ± 0.06 [95% CI +0.68; +0.92]). The mean bias of the Medtronic Nellcor Respiration Rate Software was 0.18 breaths per minute with a precision (SD) of 1.65 breaths per minute (healthy volunteers: 0.37 ± 0.78 [95% limits of agreement: -1.16; +1.90] breaths per minute; hospitalized patients: 0.07 ± 1.99 [95% limits of agreement: -3.84; +3.97] breaths per minute). The root mean square deviation was 1.35 breaths per minute (healthy volunteers: 0.81; hospitalized patients: 1.60). These data demonstrate the performance of the Medtronic Nellcor Respiration Rate Software in healthy subjects and patients hospitalized in a low-acuity care setting when compared with clinician-reviewed capnography. The observed performance of this technology suggests that it may be a useful adjunct to continuous pulse oximetry monitoring by providing continuous respiratory rate measurements. The potential patient safety benefit of using combined continuous pulse oximetry and respiratory rate monitoring warrants assessment.
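    Lin's concordance correlation coefficient reported above measures agreement rather than mere correlation, penalizing both scatter and systematic offset; a generic sketch of its computation from paired measurements:

```python
# Lin's concordance correlation coefficient for paired measurements:
# CCC = 2*cov(x, y) / (var(x) + var(y) + (mean(x) - mean(y))**2).
import numpy as np

def lins_ccc(x, y):
    x, y = np.asarray(x, float), np.asarray(y, float)
    sxy = np.cov(x, y, bias=True)[0, 1]
    return 2 * sxy / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)
```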

  7. Multicenter Study Validating Accuracy of a Continuous Respiratory Rate Measurement Derived From Pulse Oximetry: A Comparison With Capnography

    PubMed Central

    Bergese, Sergio D.; Kelley, Scott D.; McIntyre, Robert; Uribe, Alberto A.; Sethi, Rakesh; Watson, James N.; Addison, Paul S.

    2017-01-01

    BACKGROUND: Intermittent measurement of respiratory rate via observation is routine in many patient care settings. This approach has several inherent limitations that diminish the clinical utility of these measurements because it is intermittent, susceptible to human error, and requires clinical resources. As an alternative, a software application that derives continuous respiratory rate measurement from a standard pulse oximeter has been developed. We sought to determine the performance characteristics of this new technology by comparison with clinician-reviewed capnography waveforms in both healthy subjects and hospitalized patients in a low-acuity care setting. METHODS: Two independent observational studies were conducted to validate the performance of the Medtronic Nellcor™ Respiration Rate Software application. One study enrolled 26 healthy volunteer subjects in a clinical laboratory, and a second multicenter study enrolled 53 hospitalized patients. During a 30-minute study period taking place while participants were breathing spontaneously, pulse oximeter and nasal/oral capnography waveforms were collected. Pulse oximeter waveforms were processed to determine respiratory rate via the Medtronic Nellcor Respiration Rate Software. Capnography waveforms reviewed by a clinician were used to determine the reference respiratory rate. RESULTS: A total of 23,243 paired observations between the pulse oximeter-derived respiratory rate and the capnography reference method were collected and examined. The mean reference-based respiratory rate was 15.3 ± 4.3 breaths per minute with a range of 4 to 34 breaths per minute. The Pearson correlation coefficient between the Medtronic Nellcor Respiration Rate Software values and the capnography reference respiratory rate is reported as a linear correlation, R, as 0.92 ± 0.02 (P < .001), whereas Lin’s concordance correlation coefficient indicates an overall agreement of 0.85 ± 0.04 (95% confidence interval [CI] +0.76; +0.93) (healthy volunteers: 0.94 ± 0.02 [95% CI +0.91; +0.97]; hospitalized patients: 0.80 ± 0.06 [95% CI +0.68; +0.92]). The mean bias of the Medtronic Nellcor Respiration Rate Software was 0.18 breaths per minute with a precision (SD) of 1.65 breaths per minute (healthy volunteers: 0.37 ± 0.78 [95% limits of agreement: –1.16; +1.90] breaths per minute; hospitalized patients: 0.07 ± 1.99 [95% limits of agreement: –3.84; +3.97] breaths per minute). The root mean square deviation was 1.35 breaths per minute (healthy volunteers: 0.81; hospitalized patients: 1.60). CONCLUSIONS: These data demonstrate the performance of the Medtronic Nellcor Respiration Rate Software in healthy subjects and patients hospitalized in a low-acuity care setting when compared with clinician-reviewed capnography. The observed performance of this technology suggests that it may be a useful adjunct to continuous pulse oximetry monitoring by providing continuous respiratory rate measurements. The potential patient safety benefit of using combined continuous pulse oximetry and respiratory rate monitoring warrants assessment. PMID:28099286

  8. [Study on correlation between ITS sequence of Arctium lappa and quality of Fructus Arctii].

    PubMed

    Xu, Liang; Dou, Deqiang; Wang, Bing; Yang, Yanyun; Kang, Tingguo

    2011-07-01

    To study the correlation between the ITS sequence of Arctium lappa and the quality of Fructus Arctii of different origins. Samples of Fructus Arctii were collected from 26 different producing areas. Their ITS sequences were determined after polymerase chain reaction (PCR), and quality was evaluated by determining arctiin content with HPLC. Genetic diversity, genotype, and correlation were analyzed with the ClustalX (1.81), MEGA 4.0, and SPSS 13.0 statistical software packages. The ITS sequence of A. lappa was obtained from all 26 samples and registered in GenBank. The corresponding arctiin content of Fructus Arctii and the 1000-grain weight were determined. Statistical analysis showed that A. lappa genotype correlated with Fructus Arctii quality. The research provides a foundation for revealing the molecular mechanism of Fructus Arctii geoherbs.

  9. Using Dispersed Modes During Model Correlation

    NASA Technical Reports Server (NTRS)

    Stewart, Eric C.; Hathcock, Megan L.

    2017-01-01

    The model correlation process for the modal characteristics of a launch vehicle is well established. After a test, parameters within the nominal model are adjusted to reflect the structural dynamics revealed during testing. However, a full model correlation process for a complex structure can take months of man-hours and many computational resources. If the analyst has only weeks, or even days, in which to correlate the nominal model to the experimental results, then the traditional correlation process is not suitable. This paper describes using model dispersions to assist the model correlation process and decrease its overall cost. The process creates thousands of model dispersions from the nominal model prior to the test and then compares each of them to the test data. Using mode shape and frequency error metrics, one dispersion is selected as the best match to the test data. This dispersion is further improved using commercial model correlation software. In the three examples shown in this paper, this dispersion-based model correlation process performs well compared with models correlated using traditional techniques and saves time in the post-test analysis.
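
    The selection step lends itself to a compact implementation: score every pre-computed dispersion against the test data with a frequency error and a mode-shape (MAC) error, then keep the minimum. A Python sketch under assumed inputs (paired mode lists and equal weights; the paper's exact metrics and weighting are not specified here):

      import numpy as np

      def mac(phi_a, phi_b):
          # Modal Assurance Criterion between two real mode-shape vectors.
          return (phi_a @ phi_b) ** 2 / ((phi_a @ phi_a) * (phi_b @ phi_b))

      def score(freqs_d, shapes_d, freqs_t, shapes_t, w_f=1.0, w_m=1.0):
          # Lower is better: relative frequency error plus (1 - MAC),
          # summed over paired modes (columns are mode shapes).
          f_err = np.abs(freqs_d - freqs_t) / freqs_t
          m_err = [1.0 - mac(a, b) for a, b in zip(shapes_d.T, shapes_t.T)]
          return w_f * f_err.sum() + w_m * np.sum(m_err)

      # best = min(dispersions,
      #            key=lambda d: score(d.freqs, d.shapes, test_freqs, test_shapes))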

  10. Genomic Model with Correlation Between Additive and Dominance Effects.

    PubMed

    Xiang, Tao; Christensen, Ole Fredslund; Vitezica, Zulma Gladis; Legarra, Andres

    2018-05-09

    Dominance genetic effects are rarely included in pedigree-based genetic evaluation. With the availability of single nucleotide polymorphism markers and the development of genomic evaluation, estimates of dominance genetic effects have become feasible using genomic best linear unbiased prediction (GBLUP). Usually, studies involving additive and dominance genetic effects ignore possible relationships between them. It has often been suggested that the magnitudes of functional additive and dominance effects at the quantitative trait loci are related, but there is no existing GBLUP-like approach accounting for such correlation. Wellmann and Bennewitz showed two ways of considering directional relationships between additive and dominance effects, which they estimated in a Bayesian framework. However, these relationships cannot be fitted at the level of individuals instead of loci in a mixed model, and they are not compatible with standard animal or plant breeding software. This comes from a fundamental ambiguity in assigning the reference allele at a given locus. We show that, if there has been selection, assigning the most frequent allele as the reference orients the correlation between functional additive and dominance effects. As a consequence, the most frequent reference allele is expected to have a positive value. We also demonstrate that selection creates negative covariance between genotypic additive and dominance genetic values. For parameter estimation, it is possible to use a combined additive and dominance relationship matrix computed from marker genotypes, and to use standard restricted maximum likelihood (REML) algorithms based on an equivalent model. Through a simulation study, we show that such correlations can easily be estimated by mixed model software, and that the accuracy of prediction for genetic values is slightly improved if such correlations are used in GBLUP. However, a model assuming uncorrelated effects and fitting orthogonal breeding values and dominance deviations performed similarly for prediction. Copyright © 2018, Genetics.
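
    For readers unfamiliar with the GBLUP machinery, the marker-based relationship matrices mentioned above are typically built from centred genotype codes. A minimal Python sketch of the standard VanRaden-style additive matrix and one classical dominance analogue (this is a generic textbook construction under assumed 0/1/2 coding, not the authors' exact parameterization):

      import numpy as np

      def genomic_matrices(M):
          # M: (n_individuals x n_markers) genotypes coded as 0/1/2 copies
          # of the reference allele.
          p = M.mean(axis=0) / 2.0                      # allele frequencies
          q = 1.0 - p
          Z = M - 2 * p                                 # centred additive codes
          G = Z @ Z.T / np.sum(2 * p * q)               # additive relationships
          H = (M == 1).astype(float) - 2 * p * q        # centred heterozygosity codes
          D = H @ H.T / np.sum(2 * p * q * (1 - 2 * p * q))  # dominance relationships
          return G, D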

  11. Correlation between computerised findings and Newman's scaling on vascularity using power Doppler ultrasonography imaging and its predictive value in patients with plantar fasciitis

    PubMed Central

    Chen, H; Ho, H M; Ying, M; Fu, S N

    2012-01-01

    Objectives The purpose of this study was to correlate computerised measurements of small-vessel vascularity with Newman's grading scale on power Doppler ultrasonography (PDU) imaging, and to assess their predictive value in patients with plantar fasciitis. Methods PDU was performed on 44 patients (age range 30–66 years; mean age 48 years) with plantar fasciitis and 46 healthy subjects (age range 18–61 years; mean age 36 years). The vascularity was quantified from the ultrasound images by a customised software program and graded by Newman's grading scale. The vascular index (VI) was calculated by the software program as the ratio of the number of colour pixels to the total number of pixels within a standardised selected area of the proximal plantar fascia. The 46 healthy subjects were examined on 2 occasions 7–10 days apart, and 18 of them were assessed by 2 examiners. Statistical analyses were performed using the intraclass correlation coefficient and linear regression analysis. Results Good correlation was found between the averaged VI ratios and Newman's qualitative scale (ρ = 0.70; p<0.001). Intratester and intertester reliability were 0.89 and 0.61, respectively. Furthermore, a higher VI was correlated with less reduction in pain after physiotherapeutic intervention. Conclusions The computerised VI not only has a high level of concordance with the Newman grading scale but is also reliable in reflecting the vascularity of the proximal plantar fascia, and can predict pain reduction after intervention. This index can be used to characterise changes in vascularity in patients with plantar fasciitis, and it may also be helpful for evaluating treatment and monitoring progress after intervention in future studies. PMID:22167513
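
    The vascular index itself is a simple pixel ratio. A Python sketch of one way to compute it from a power Doppler frame, assuming the colour overlay can be separated from the greyscale background by channel spread; the threshold is a hypothetical heuristic, not the study's calibrated value:

      import numpy as np

      def vascular_index(rgb, roi_mask, chroma_thresh=40):
          # rgb: HxWx3 uint8 PDU frame; roi_mask: HxW boolean fascia region.
          rgb = rgb.astype(int)
          # Greyscale pixels have nearly equal channels; Doppler colour does not.
          chroma = rgb.max(axis=2) - rgb.min(axis=2)
          colour_pixels = (chroma > chroma_thresh) & roi_mask
          return colour_pixels.sum() / roi_mask.sum()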

  12. Can the analysis of built-in software of CPAP devices replace polygraphy in children?

    PubMed

    Khirani, Sonia; Delord, Vincent; Olmo Arroyo, Jorge; De Sanctis, Livio; Frapin, Annick; Amaddeo, Alessandro; Fauroux, Brigitte

    2017-09-01

    Polysomnography (PSG) is the gold standard for the scoring of residual respiratory events during continuous positive airway pressure (CPAP) treatment. Studies comparing PSG scoring with automatic scoring by the built-in software of CPAP devices have reported acceptable agreement, except for the hypopnea index (HI), in adult patients, but no study has yet been conducted in children. The aim of the present study was to compare automatic scoring by the CPAP device, and manual scoring using the software tracings of the CPAP device integrating pulse oximetry (SpO2), with in-lab polygraphy (PG). Consecutive clinically stable children treated with constant CPAP (ResMed) for at least one month and scheduled for a nocturnal PG were recruited. A pulse oximeter was connected to the CPAP device. The PG apnea-hypopnea index (AHI-PG), scored according to modified AASM guidelines, was compared with the automatic AHI reported by the CPAP device (AHI-aCPAP) and with the manual scoring of the AHI on the CPAP software (AHI-mCPAP). Fifteen children (1.5-18.6 years) were included. Mean residual AHI-PG was 0.9 ± 1.2/hour (0.0-4.6/hour), vs. an AHI-aCPAP of 3.6 ± 3.6/hour (0.5-14.7/hour) (p < 0.001) and an AHI-mCPAP of 1.2 ± 1.6/hour (0.0-5.1/hour) (p = 0.01). The correlation between AHI-PG and AHI-aCPAP was good (r = 0.667; p = 0.007) and improved when considering AHI-mCPAP (r = 0.933; p < 0.001). Strong correlations were also observed between the PG apnea index (AI) and HI and the manually scored AI and HI on CPAP, respectively. Manual scoring of respiratory events on the built-in software tracings of CPAP devices integrating the SpO2 signal may be helpful. These results have to be confirmed in patients with higher AHI. Copyright © 2017 Elsevier B.V. All rights reserved.

  13. Reproducibility of dynamic contrast-enhanced MRI and dynamic susceptibility contrast MRI in the study of brain gliomas: a comparison of data obtained using different commercial software.

    PubMed

    Conte, Gian Marco; Castellano, Antonella; Altabella, Luisa; Iadanza, Antonella; Cadioli, Marcello; Falini, Andrea; Anzalone, Nicoletta

    2017-04-01

    Dynamic susceptibility contrast MRI (DSC) and dynamic contrast-enhanced MRI (DCE) are useful tools in the diagnosis and follow-up of brain gliomas; nevertheless, both techniques leave open the issue of data reproducibility. We evaluated the reproducibility of data obtained using two different commercial software packages for perfusion map calculation and analysis, as one of the potential sources of variability can be the software itself. DSC and DCE analyses from 20 patients with gliomas were tested for both intrasoftware (intraobserver and interobserver) and intersoftware reproducibility, as well as for the impact of different postprocessing choices [vascular input function (VIF) selection and deconvolution algorithms] on the quantification of the perfusion biomarkers plasma volume (Vp), volume transfer constant (Ktrans), and rCBV. Data reproducibility was evaluated with the intraclass correlation coefficient (ICC) and Bland-Altman analysis. For all the biomarkers, the intra- and interobserver reproducibility resulted in almost perfect agreement for each software package, whereas for intersoftware reproducibility the ICC ranged from 0.311 to 0.577, suggesting fair to moderate agreement; Bland-Altman analysis showed high dispersion of data, confirming these findings. Comparison of different VIF estimation methods for DCE biomarkers resulted in ICCs of 0.636 for Ktrans and 0.662 for Vp; comparison of two deconvolution algorithms in DSC resulted in an ICC of 0.999. The use of a single software package ensures very good intraobserver and interobserver reproducibility. Caution should be taken when comparing data obtained using different software, or different postprocessing within the same software, as reproducibility is no longer guaranteed.

  14. Basic to Advanced InSAR Processing: GMTSAR

    NASA Astrophysics Data System (ADS)

    Sandwell, D. T.; Xu, X.; Baker, S.; Hogrelius, A.; Mellors, R. J.; Tong, X.; Wei, M.; Wessel, P.

    2017-12-01

    Monitoring crustal deformation using InSAR is becoming a standard technique for the science and application communities. Optimal use of the new data streams from Sentinel-1 and NISAR will require open software tools as well as education on the strengths and limitations of the InSAR methods. Over the past decade we have developed freely available, open-source software for processing InSAR data. The software relies on the Generic Mapping Tools (GMT) for the back-end data analysis and display and is thus called GMTSAR. With startup funding from NSF, we accelerated the development of GMTSAR to include more satellite data sources and provide better integration and distribution with GMT. In addition, with support from UNAVCO we have offered 6 GMTSAR short courses to educate mostly novice InSAR users. Currently, the software is used by hundreds of scientists and engineers around the world to study deformation at more than 4300 different sites. The most challenging aspect of the recent software development was the transition from image alignment using the cross-correlation method to a completely new alignment algorithm that uses only the precise orbital information to geometrically align images to an accuracy of better than 7 cm. This development was needed to process a new data type that is being acquired by the Sentinel-1A/B satellites. This combination of software and open data is transforming radar interferometry from a research tool into a fully operational time series analysis tool. Over the next 5 years we are planning to continue to broaden the user base through: improved software delivery methods; code hardening; better integration with data archives; support for high level products being developed for NISAR; and continued education and outreach.

  15. Log ASCII Standard (LAS) Files for Geophysical Wireline Well Logs and Their Application to Geologic Cross Sections Through the Central Appalachian Basin

    USGS Publications Warehouse

    Crangle, Robert D.

    2007-01-01

    Introduction The U.S. Geological Survey (USGS) uses geophysical wireline well logs for a variety of purposes, including stratigraphic correlation (Hettinger, 2001; Ryder, 2002), petroleum reservoir analyses (Nelson and Bird, 2005), aquifer studies (Balch, 1988), and synthetic seismic profiles (Kulander and Ryder, 2005). Commonly, well logs are easier to visualize, manipulate, and interpret when available in a digital format. In recent geologic cross sections E-E' and D-D', constructed through the central Appalachian basin (Ryder, Swezey, and others, in press; Ryder, Crangle, and others, in press), gamma ray well log traces and lithologic logs were used to correlate key stratigraphic intervals (Fig. 1). The stratigraphy and structure of the cross sections are illustrated through the use of graphical software applications (e.g., Adobe Illustrator). The gamma ray traces were digitized in Neuralog (proprietary software) from paper well logs and converted to the Log ASCII Standard (LAS) format. Once converted, the LAS files were transformed to images through an LAS-reader application (e.g., GeoGraphix Prizm) and then overlain, in positions adjacent to the well locations used for stratigraphic control, on each cross section. This report summarizes the procedures used to convert paper logs to a digital LAS format using a third-party software application, Neuralog. Included in this report are LAS files for sixteen wells used in geologic cross section E-E' (Table 1) and thirteen wells used in geologic cross section D-D' (Table 2).
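
    Once logs are in LAS form they are straightforward to read programmatically. As an illustration (not part of the USGS workflow described above), the open-source Python package lasio can load a converted file and expose its curves; the file name and curve mnemonic below are hypothetical:

      import lasio  # community Python reader for Log ASCII Standard files

      las = lasio.read("well_log.las")  # hypothetical converted file
      depth = las.index                 # depth track
      gr = las["GR"]                    # gamma ray curve, if present
      print(las.curves)                 # available curves and their units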

  16. Inter- and Intrarater Reliability Using Different Software Versions of E4D Compare in Dental Education.

    PubMed

    Callan, Richard S; Cooper, Jeril R; Young, Nancy B; Mollica, Anthony G; Furness, Alan R; Looney, Stephen W

    2015-06-01

    The problems associated with intra- and interexaminer reliability when assessing preclinical performance continue to hinder dental educators' ability to provide accurate and meaningful feedback to students. Many studies have been conducted to evaluate the validity of utilizing various technologies to assist educators in achieving that goal. The purpose of this study was to compare two different versions of E4D Compare software to determine if either could be expected to deliver consistent and reliable comparative results, independent of the individual utilizing the technology. Five faculty members obtained E4D digital images of students' attempts (sample model) at ideal gold crown preparations for tooth #30 performed on typodont teeth. These images were compared to an ideal (master model) preparation utilizing two versions of E4D Compare software. The percent correlations between and within these faculty members were recorded and averaged. The intraclass correlation coefficient was used to measure both inter- and intrarater agreement among the examiners. The study found that using the older version of E4D Compare did not result in acceptable intra- or interrater agreement among the examiners. However, the newer version of E4D Compare, when combined with the Nevo scanner, resulted in a remarkable degree of agreement both between and within the examiners. These results suggest that consistent and reliable results can be expected when utilizing this technology under the protocol described in this study.

  17. Sampling and sensitivity analyses tools (SaSAT) for computational modelling

    PubMed Central

    Hoare, Alexander; Regan, David G; Wilson, David P

    2008-01-01

    SaSAT (Sampling and Sensitivity Analysis Tools) is a user-friendly software package for applying uncertainty and sensitivity analyses to mathematical and computational models of arbitrary complexity and context. The toolbox is built in Matlab®, a numerical mathematical software package, and utilises algorithms contained in the Matlab® Statistics Toolbox. However, Matlab® is not required to use SaSAT as the software package is provided as an executable file with all the necessary supplementary files. The SaSAT package is also designed to work seamlessly with Microsoft Excel but no functionality is forfeited if that software is not available. A comprehensive suite of tools is provided to enable the following tasks to be easily performed: efficient and equitable sampling of parameter space by various methodologies; calculation of correlation coefficients; regression analysis; factor prioritisation; and graphical output of results, including response surfaces, tornado plots, and scatterplots. Use of SaSAT is exemplified by application to a simple epidemic model. To our knowledge, a number of the methods available in SaSAT for performing sensitivity analyses have not previously been used in epidemiological modelling and their usefulness in this context is demonstrated. PMID:18304361
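
    As a flavour of what such a toolbox computes, here is a compact Python/SciPy sketch (rather than Matlab) of Latin hypercube sampling followed by partial rank correlation coefficients: rank-transform everything, regress the other parameters out, and correlate the residuals. The toy model and bounds are purely illustrative:

      import numpy as np
      from scipy.stats import qmc, rankdata

      def prcc(X, y):
          # Partial rank correlation of each column of X with output y.
          Xr = np.column_stack([rankdata(c) for c in X.T])
          yr = rankdata(y)
          n, k = Xr.shape
          out = np.empty(k)
          for j in range(k):
              others = np.column_stack([np.ones(n), np.delete(Xr, j, axis=1)])
              rx = Xr[:, j] - others @ np.linalg.lstsq(others, Xr[:, j], rcond=None)[0]
              ry = yr - others @ np.linalg.lstsq(others, yr, rcond=None)[0]
              out[j] = np.corrcoef(rx, ry)[0, 1]
          return out

      # Latin hypercube sample of a toy 3-parameter model:
      X = qmc.scale(qmc.LatinHypercube(d=3, seed=1).random(500), [0, 0, 0], [1, 2, 5])
      y = 3 * X[:, 0] - X[:, 1] ** 2 + np.random.default_rng(1).normal(0, 0.1, 500)
      print(prcc(X, y))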

  18. Robonaut's Flexible Information Technology Infrastructure

    NASA Technical Reports Server (NTRS)

    Askew, Scott; Bluethmann, William; Alder, Ken; Ambrose, Robert

    2003-01-01

    Robonaut, NASA's humanoid robot, is designed to work as both an astronaut assistant and, in certain situations, an astronaut surrogate. This highly dexterous robot performs complex tasks under telepresence control that could previously only be carried out directly by humans. Currently with 47 degrees of freedom (DOF), Robonaut is a state-of-the-art human-size telemanipulator system. While many of Robonaut's embedded components have been custom designed to meet packaging or environmental requirements, the primary computing systems used in Robonaut are currently commercial-off-the-shelf (COTS) products which have some correlation to flight-qualified computer systems. This loose coupling of information technology (IT) resources allows Robonaut to exploit cost-effective solutions while floating the technology base to take advantage of the rapid pace of IT advances. These IT systems utilize a software development environment which is compatible with both COTS hardware and flight-proven computing systems, preserving the majority of software development for a flight system. The ability to use highly integrated and flexible COTS software development tools improves productivity while minimizing redesign for a space flight system. Further, the flexibility of Robonaut's software and communication architecture has allowed it to become a widely used distributed development testbed for integrating new capabilities and furthering experimental research.

  19. Video Altimeter and Obstruction Detector for an Aircraft

    NASA Technical Reports Server (NTRS)

    Delgado, Frank J.; Abernathy, Michael F.; White, Janis; Dolson, William R.

    2013-01-01

    Video-based altimetric and obstruction-detection systems for aircraft have been partially developed. The hardware of a system of this type includes a downward-looking video camera, a video digitizer, a Global Positioning System receiver or other means of measuring the aircraft velocity relative to the ground, a gyroscope-based or other attitude-determination subsystem, and a computer running altimetric and/or obstruction-detection software. From the digitized video data, the altimetric software computes the pixel velocity in an appropriate part of the video image and the corresponding angular relative motion of the ground within the field of view of the camera. Then, by use of trigonometric relationships among the aircraft velocity, the attitude of the camera, and the angular relative motion, the software computes the altitude. The obstruction-detection software performs somewhat similar calculations as part of a larger task in which it uses the pixel velocity data from the entire video image to compute a depth map, which can be correlated with a terrain map showing locations of potential obstructions. The depth map can be used as a real-time hazard display and/or to update an obstruction database.
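
    For the simplest nadir-pointing, level-flight case, the altitude computation reduces to one line of trigonometry: the ground's angular rate across the image is the pixel velocity divided by the focal length in pixels, and altitude is ground speed divided by that angular rate. A Python sketch under those simplifying assumptions (the flight software described above additionally corrects for camera attitude):

      def altitude_from_flow(ground_speed, pixel_velocity, focal_px):
          # ground_speed in m/s (e.g., from GPS); pixel_velocity in px/s at
          # the image centre; focal_px = focal length expressed in pixels.
          omega = pixel_velocity / focal_px  # angular rate of the ground, rad/s
          return ground_speed / omega        # altitude in metres

      # 50 m/s ground speed, 200 px/s image motion, 1000 px focal length -> 250 m
      print(altitude_from_flow(50.0, 200.0, 1000.0))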

  20. [Cytocompatibility of Co-Cr ceramic alloys after recasting].

    PubMed

    Hu, Yu-Feng; Jin, Wen-Zhong

    2017-06-01

    To study the correlation between the apical foramen area and the accuracy of the ProPex II electronic apex locator when the apical constriction is destroyed. Forty extracted teeth with single straight root canals were ground down 1 mm at the root tip and placed in 2% liquid agar gel injected into a Castro model. The length of each root canal was measured by the ProPex II electronic apex locator. The difference (L) between the electronic length (LP) and the actual length was calculated. Images of the apical foramen were recorded under a microscope, and the apical foramen area (S) was measured with the image processing software Photoshop CS. The SPSS 22.0 software package was used for linear correlation and regression analysis. With ±0.5 mm as the allowable range, all values of L were positive. The accuracy rate of the ProPex II was 52.5% when the apical constriction was destroyed. There was a linear relationship between S and L (S=0.04+0.11×L, R=0.903). Accuracy decreases when the apical constriction is destroyed, and is worse when the apical foramen area is larger.

  1. Quantitative evaluation of skeletal muscle defects in second harmonic generation images.

    PubMed

    Liu, Wenhua; Raben, Nina; Ralston, Evelyn

    2013-02-01

    Skeletal muscle pathologies cause irregularities in the normally periodic organization of the myofibrils. Objective grading of muscle morphology is necessary to assess muscle health, compare biopsies, and evaluate treatments and the evolution of disease. To facilitate such quantitation, we have developed a fast, sensitive, automatic imaging analysis software. It detects major and minor morphological changes by combining texture features and Fourier transform (FT) techniques. We apply this tool to second harmonic generation (SHG) images of muscle fibers which visualize the repeating myosin bands. Texture features are then calculated by using a Haralick gray-level cooccurrence matrix in MATLAB. Two scores are retrieved from the texture correlation plot by using FT and curve-fitting methods. The sensitivity of the technique was tested on SHG images of human adult and infant muscle biopsies and of mouse muscle samples. The scores are strongly correlated to muscle fiber condition. We named the software MARS (muscle assessment and rating scores). It is executed automatically and is highly sensitive even to subtle defects. We propose MARS as a powerful and unbiased tool to assess muscle health.
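
    The core measurement is a Haralick correlation computed at increasing pixel offsets: periodic myosin banding appears as an oscillation in the correlation-versus-offset curve, which can then be scored with an FFT or curve fit. A Python sketch with scikit-image; the offsets, orientation, and scoring are illustrative assumptions, not the published MARS parameters:

      import numpy as np
      from skimage.feature import graycomatrix, graycoprops

      def texture_correlation_curve(img, max_offset=64):
          # img: 2-D uint8 SHG image, assumed rotated so bands run vertically.
          dists = np.arange(1, max_offset + 1)
          glcm = graycomatrix(img, distances=dists, angles=[0],
                              levels=256, symmetric=True, normed=True)
          return graycoprops(glcm, 'correlation')[:, 0]  # one value per offset

      # curve = texture_correlation_curve(shg_image)
      # spectrum = np.abs(np.fft.rfft(curve - curve.mean()))  # periodicity peak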

  2. Quantitative evaluation of skeletal muscle defects in second harmonic generation images

    NASA Astrophysics Data System (ADS)

    Liu, Wenhua; Raben, Nina; Ralston, Evelyn

    2013-02-01

    Skeletal muscle pathologies cause irregularities in the normally periodic organization of the myofibrils. Objective grading of muscle morphology is necessary to assess muscle health, compare biopsies, and evaluate treatments and the evolution of disease. To facilitate such quantitation, we have developed a fast, sensitive, automatic imaging analysis software. It detects major and minor morphological changes by combining texture features and Fourier transform (FT) techniques. We apply this tool to second harmonic generation (SHG) images of muscle fibers which visualize the repeating myosin bands. Texture features are then calculated by using a Haralick gray-level cooccurrence matrix in MATLAB. Two scores are retrieved from the texture correlation plot by using FT and curve-fitting methods. The sensitivity of the technique was tested on SHG images of human adult and infant muscle biopsies and of mouse muscle samples. The scores are strongly correlated to muscle fiber condition. We named the software MARS (muscle assessment and rating scores). It is executed automatically and is highly sensitive even to subtle defects. We propose MARS as a powerful and unbiased tool to assess muscle health.

  3. GP Workbench Manual: Technical Manual, User's Guide, and Software Guide

    USGS Publications Warehouse

    Oden, Charles P.; Moulton, Craig W.

    2006-01-01

    GP Workbench is an open-source general-purpose geophysical data processing software package written primarily for ground penetrating radar (GPR) data. It also includes support for several USGS prototype electromagnetic instruments such as the VETEM and ALLTEM. The two main programs in the package are GP Workbench and GP Wave Utilities. GP Workbench has routines for filtering, gridding, and migrating GPR data, as well as an inversion routine for characterizing UXO (unexploded ordnance) using ALLTEM data. GP Workbench provides two-dimensional (section view) and three-dimensional (plan view or time slice view) processing for GPR data. GP Workbench can produce high-quality graphics for reports when Surfer 8 or higher (Golden Software) is installed. GP Wave Utilities provides a wide range of processing algorithms for single waveforms, such as filtering, correlation, deconvolution, and calculation of GPR waveforms. GP Wave Utilities is used primarily for calibrating radar systems and processing individual traces. Both programs also contain research features related to the calibration of GPR systems and the calculation of subsurface waveforms. The software is written to run on Windows operating systems. GP Workbench can import GPR data file formats used by major commercial instrument manufacturers, including Sensors and Software, GSSI, and Mala. The GP Workbench native file format is SU (Seismic Unix); consequently, files generated by GP Workbench can be read by Seismic Unix as well as many other data processing packages.

  4. AnyWave: a cross-platform and modular software for visualizing and processing electrophysiological signals.

    PubMed

    Colombet, B; Woodman, M; Badier, J M; Bénar, C G

    2015-03-15

    The importance of digital signal processing in clinical neurophysiology is growing steadily, involving clinical researchers and methodologists. There is a need to bridge the gap between these communities by providing efficient delivery of newly designed algorithms to end users. We have developed such a tool, which both visualizes and processes data and, additionally, acts as a software development platform. AnyWave was designed to run on all common operating systems. It provides access to a variety of data formats and employs high-fidelity visualization techniques. It also allows external tools to be used as plug-ins, which can be developed in languages including C++, MATLAB, and Python. In the current version, plug-ins allow computation of connectivity graphs (non-linear correlation h2) and time-frequency representations (Morlet wavelets). The software is freely available under the LGPL3 license. AnyWave is designed as an open, highly extensible solution, with an architecture that permits rapid delivery of new techniques to end users. We have developed the AnyWave software as an efficient neurophysiological data visualizer able to integrate state-of-the-art techniques. AnyWave offers an interface well suited to the needs of clinical research and an architecture designed for integrating new tools. We expect this software to strengthen the collaboration between clinical neurophysiologists and researchers in biomedical engineering and signal processing. Copyright © 2015 Elsevier B.V. All rights reserved.

  5. User Interactive Software for Analysis of Human Physiological Data

    NASA Technical Reports Server (NTRS)

    Cowings, Patricia S.; Toscano, William; Taylor, Bruce C.; Acharya, Soumydipta

    2006-01-01

    Ambulatory physiological monitoring has been used to study human health and performance in space and in a variety of Earth-based environments (e.g., military aircraft, armored vehicles, small groups in isolation, and patients). Large, multi-channel data files are typically recorded in these environments, and these files often require the removal of contaminated data prior to processing and analysis. Physiological data processing can now be performed with user-friendly, interactive software developed by the Ames Psychophysiology Research Laboratory. This software, which runs on a Windows platform, contains various signal-processing routines for both time- and frequency-domain data analyses (e.g., peak detection, differentiation and integration, digital filtering, adaptive thresholds, Fast Fourier Transform power spectrum, auto-correlation, etc.). Data acquired with any ambulatory monitoring system that provides text or binary file formats are easily imported into the processing software. The application provides a graphical user interface where one can manually select and correct data artifacts, utilizing linear and zero interpolation and adding trigger points for missed peaks. Block and moving average routines are also provided for data reduction. Processed data in numeric and graphic format can be exported to Excel. This software, PostProc (for post-processing), requires the Dadisp engineering spreadsheet (DSP Development Corp), or equivalent, for implementation. Specific processing routines were written for electrocardiography, electroencephalography, electromyography, blood pressure, skin conductance level, impedance cardiography (cardiac output, stroke volume, thoracic fluid volume), temperature, and respiration.
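
    To make this kind of routine concrete, here is a toy time-domain example in Python (not the PostProc/Dadisp implementation): band-pass filter an ECG trace, detect R-peaks with an adaptive threshold, and derive the mean heart rate. The cutoff frequencies and threshold rule are illustrative assumptions:

      import numpy as np
      from scipy.signal import butter, filtfilt, find_peaks

      def mean_heart_rate(ecg, fs):
          # Band-pass around the QRS energy (5-15 Hz), zero-phase filtered.
          b, a = butter(3, [5 / (fs / 2), 15 / (fs / 2)], btype='band')
          filtered = filtfilt(b, a, ecg)
          thresh = filtered.mean() + 2 * filtered.std()  # adaptive threshold
          peaks, _ = find_peaks(filtered, height=thresh, distance=int(0.3 * fs))
          rr = np.diff(peaks) / fs                       # R-R intervals, seconds
          return 60.0 / rr.mean()                        # beats per minute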

  6. Evaluation of three different validation procedures regarding the accuracy of template-guided implant placement: an in vitro study.

    PubMed

    Vasak, Christoph; Strbac, Georg D; Huber, Christian D; Lettner, Stefan; Gahleitner, André; Zechner, Werner

    2015-02-01

    This study aims to evaluate the accuracy of the NobelGuide™ (Medicim/Nobel Biocare, Göteborg, Sweden) concept while maximally reducing the influence of clinical and surgical parameters. Moreover, the study compares and validates two validation procedures against a reference method. Overall, 60 implants were placed in 10 artificial edentulous mandibles according to the NobelGuide™ protocol. For merging the pre- and postoperative DICOM data sets, three different fusion methods (Triple Scan Technique, NobelGuide™ Validation software, and AMIRA® software [VSG - Visualization Sciences Group, Burlington, MA, USA] as reference) were applied. Discrepancies between the virtual and the actual implant positions were measured. The mean deviations measured with AMIRA® were 0.49 mm (implant shoulder), 0.69 mm (implant apex), and 1.98° (implant axis). The Triple Scan Technique as well as the NobelGuide™ Validation software revealed deviations similar to the reference method. A significant correlation between angular and apical deviations was seen (r = 0.53; p < .001). A greater implant diameter was associated with greater deviations (p = .03). The Triple Scan Technique, as a system-independent validation procedure, as well as the NobelGuide™ Validation software, are in accordance with the AMIRA® software. The NobelGuide™ system showed similar or smaller spatial and angular deviations compared with others. © 2013 Wiley Periodicals, Inc.

  7. A study of land mobile satellite service multipath effects using SATLAB software

    NASA Technical Reports Server (NTRS)

    Campbell, Richard L.

    1991-01-01

    A software package is proposed that uses the known properties of signals received in multipath environments, along with the mathematical relationships between signal characteristics, to explore the effects of antenna pattern, vehicle velocity, shadowing of the direct wave, distributions of scatterers around the moving vehicle, and levels of scattered signals on the received complex envelope, fade rates and fade durations, Doppler spectrum, signal arrival angle spectrum, and spatial correlation. The database may be either actual measured received signals entered as ASCII flat files or data synthesized using a built-in model. An example illustrates the effect of using different antennas to receive signals in the same environment.

  8. Power System Simulations For The Globalstar2 Mission Using The PowerCap Software

    NASA Astrophysics Data System (ADS)

    Defoug, S.; Pin, R.

    2011-10-01

    The Globalstar system aims to enable customers to communicate all around the world thanks to its constellation of 48 LEO satellites. Thales Alenia Space is in charge of the design and manufacturing of the second generation of the Globalstar satellites. For such a long-duration mission (15 years), with payload power consumption that varies continually, the optimization of the solar arrays and battery has to be consolidated by an accurate power simulation tool. After a general overview of the Globalstar power system and of the PowerCap software, this paper presents the dedicated version elaborated for the Globalstar2 mission, the simulation results, and their correlation with the tests.

  9. Generalisation of the identity method for determination of high-order moments of multiplicity distributions with a software implementation

    NASA Astrophysics Data System (ADS)

    Maćkowiak-Pawłowska, Maja; Przybyła, Piotr

    2018-05-01

    Incomplete particle identification limits the experimentally available phase space region for identified particle analysis. This problem affects ongoing fluctuation and correlation studies, including the search for the critical point of strongly interacting matter performed at the SPS and RHIC accelerators. In this paper we provide a procedure to obtain nth-order moments of the multiplicity distribution using the identity method, generalising previously published solutions for n=2 and n=3. Moreover, we present an open source software implementation of this computation, called Idhim, that allows one to obtain the true moments of identified particle multiplicity distributions from the measured ones, provided the response function of the detector is known.

  10. A wideband software reconfigurable modem

    NASA Astrophysics Data System (ADS)

    Turner, J. H., Jr.; Vickers, H.

    A wideband modem is described which provides signal processing capability for four Lx-band signals employing QPSK, MSK, and PPM waveforms and employs a software-reconfigurable architecture for maximum system flexibility and graceful degradation. The current processor uses a 2901 and two 8086 microprocessors per channel and performs acquisition, tracking, and data demodulation for JTIDS, GPS, IFF, and TACAN systems. The next generation processor will be implemented using a VHSIC chip set employing a programmable complex array vector processor module, a GP computer module, customized gate array modules, and a digital array correlator. This integrated processor is applicable to a wide number of diverse system waveforms and will bring the benefits of VHSIC technology insertion to avionic antijam communications systems.

  11. [Spatial distribution of birth defects among children aged 0 to 5 years and its relationship with soil chemical elements in Chongqing].

    PubMed

    Dong, Yan; Zhong, Zhao-hui; Li, Hong; Li, Jie; Wang, Ying-xiong; Peng, Bin; Zhang, Mao-zhong; Huang, Qiao; Yan, Ju; Xu, Fei-long

    2013-10-01

    To explore the correlation between the incidence of birth defects and the contents of soil elements, so as to provide a scientific basis for screening pathogenic factors that induce birth defects and for developing related preventive and control strategies. MapInfo 7.0 software was used to draw maps of the spatial distribution of the incidence rates of birth defects and the contents of 11 chemical elements in soil in the 33 studied areas. Variables on the two maps were superposed for analyzing the spatial correlation. SAS 8.0 software was used for single-factor, multi-factor, and principal component analyses, as well as for a comprehensive evaluation of the degrees of relevance. The incidence rates of birth defects shown in the spatial distribution maps were negatively correlated, to a certain degree, with anomalies of soil chemical elements including copper, chromium, iodine, selenium, and zinc, and positively correlated with the levels of lead. Results from the principal component regression equation indicated that the contents of copper (0.002), arsenic (-0.07), cadmium (0.05), chromium (-0.001), zinc (0.001), iodine (-0.03), lead (0.08), and fluorine (-0.002) might serve as important factors related to the prevalence of birth defects. Through the study of spatial distribution, we noticed that the incidence rates of birth defects were related to the contents of copper, chromium, iodine, selenium, zinc, and lead in soil, and that the contents of chromium, iodine, and lead might lead to the occurrence of birth defects.

  12. Correlation Assessment of Climate and Geographic Distribution of Tuberculosis Using Geographical Information System (GIS).

    PubMed

    Beiranvand, Reza; Karimi, Asrin; Delpisheh, Ali; Sayehmiri, Kourosh; Soleimani, Samira; Ghalavandi, Shahnaz

    2016-01-01

    Tuberculosis (TB) spread patterns are influenced by geographic and social factors. Nowadays, geographic information systems (GIS) are among the most important epidemiological instruments for identifying high-risk population groups and geographic areas of TB. The aim of this study was to determine the correlation between climate and the geographic distribution of TB in Khuzestan Province using GIS during 2005-2012. Through an ecological study, all 6363 patients with a definite diagnosis of TB from 2005 until the end of September 2012 in Khuzestan Province, southern Iran, were included. Data were recorded using TB-Register software. Tuberculosis incidence in relation to climate and average annual rainfall was evaluated using GIS. Data were analyzed with SPSS software. Independent t-tests, ANOVA, linear regression, and Pearson and Eta correlation coefficients, with a significance level of less than 5%, were used for the statistical analysis. TB incidence differed across geographic conditions. The highest mean TB cumulative incidence rate was observed in extra-dry areas (P = 0.017). There was a significant inverse correlation between annual rainfall and TB incidence rate (R = -0.45, P = 0.001). The lowest TB incidence rate (0-100 cases per 100,000) was in areas with average annual rainfall of more than 1000 mm (P = 0.003). The risk of TB has a strong relationship with climate and average annual rainfall, such that the risk of TB in areas with low annual rainfall and an extra-dry climate is higher than in other regions. Services and special care for high-risk regions of TB are recommended.

  13. Automated CT Scan Scores of Bronchiectasis and Air Trapping in Cystic Fibrosis

    PubMed Central

    Swiercz, Waldemar; Heltshe, Sonya L.; Anthony, Margaret M.; Szefler, Paul; Klein, Rebecca; Strain, John; Brody, Alan S.; Sagel, Scott D.

    2014-01-01

    Background: Computer analysis of high-resolution CT (HRCT) scans may improve the assessment of structural lung injury in children with cystic fibrosis (CF). The goal of this cross-sectional pilot study was to validate automated, observer-independent image analysis software to establish objective, simple criteria for bronchiectasis and air trapping. Methods: HRCT scans of the chest were performed in 35 children with CF and compared with scans from 12 disease control subjects. Automated image analysis software was developed to count visible airways on inspiratory images and to measure a low attenuation density (LAD) index on expiratory images. Among the children with CF, relationships among automated measures, Brody HRCT scanning scores, lung function, and sputum markers of inflammation were assessed. Results: The number of total, central, and peripheral airways on inspiratory images and LAD (%) on expiratory images were significantly higher in children with CF compared with control subjects. Among subjects with CF, peripheral airway counts correlated strongly with Brody bronchiectasis scores by two raters (r = 0.86, P < .0001; r = 0.91, P < .0001), correlated negatively with lung function, and were positively associated with sputum free neutrophil elastase activity. LAD (%) correlated with Brody air trapping scores (r = 0.83, P < .0001; r = 0.69, P < .0001) but did not correlate with lung function or sputum inflammatory markers. Conclusions: Quantitative airway counts and LAD (%) on HRCT scans appear to be useful surrogates for bronchiectasis and air trapping in children with CF. Our automated methodology provides objective quantitative measures of bronchiectasis and air trapping that may serve as end points in CF clinical trials. PMID:24114359
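
    An LAD-style index is conceptually simple: the percentage of lung voxels on the expiratory scan below an attenuation cutoff. A Python sketch using the widely cited -856 HU air-trapping threshold (an assumption here; the study's exact criterion may differ):

      import numpy as np

      def low_attenuation_density(expiratory_hu, lung_mask, threshold=-856):
          # expiratory_hu: CT volume in Hounsfield units; lung_mask: boolean.
          voxels = expiratory_hu[lung_mask]
          return 100.0 * np.mean(voxels < threshold)  # percent of lung voxels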

  14. The correlation between preoperative volumetry and real graft weight: comparison of two volumetry programs

    PubMed Central

    Mussin, Nadiar; Sumo, Marco; Choi, YoungRok; Choi, Jin Yong; Ahn, Sung-Woo; Yoon, Kyung Chul; Kim, Hyo-Sin; Hong, Suk Kyun; Yi, Nam-Joon; Suh, Kyung-Suk

    2017-01-01

    Purpose Liver volumetry is a vital component in living donor liver transplantation to determine an adequate graft volume that meets the metabolic demands of the recipient and at the same time ensures donor safety. Most institutions use preoperative contrast-enhanced CT image-based software programs to estimate graft volume. The objective of this study was to evaluate the accuracy of 2 liver volumetry programs (Rapidia vs. Dr. Liver) in preoperative right liver graft estimation compared with real graft weight. Methods Data from 215 consecutive right lobe living donors between October 2013 and August 2015 were retrospectively reviewed. One hundred seven patients were enrolled in Rapidia group and 108 patients were included in the Dr. Liver group. Estimated graft volumes generated by both software programs were compared with real graft weight measured during surgery, and further classified into minimal difference (≤15%) and big difference (>15%). Correlation coefficients and degree of difference were determined. Linear regressions were calculated and results depicted as scatterplots. Results Minimal difference was observed in 69.4% of cases from Dr. Liver group and big difference was seen in 44.9% of cases from Rapidia group (P = 0.035). Linear regression analysis showed positive correlation in both groups (P < 0.01). However, the correlation coefficient was better for the Dr. Liver group (R2 = 0.719), than for the Rapidia group (R2 = 0.688). Conclusion Dr. Liver can accurately predict right liver graft size better and faster than Rapidia, and can facilitate preoperative planning in living donor liver transplantation. PMID:28382294

  15. A versatile software package for inter-subject correlation based analyses of fMRI.

    PubMed

    Kauppi, Jukka-Pekka; Pajula, Juha; Tohka, Jussi

    2014-01-01

    In inter-subject correlation (ISC) based analysis of functional magnetic resonance imaging (fMRI) data, the extent of shared processing across subjects during the experiment is determined by calculating correlation coefficients between the fMRI time series of the subjects in corresponding brain locations. This implies that ISC can be used to analyze fMRI data without explicitly modeling the stimulus, making ISC a potential method for analyzing fMRI data acquired under complex naturalistic stimuli. Despite the suitability of the ISC-based approach for analyzing complex fMRI data, no generic software tools have been made available for this purpose, limiting widespread use of ISC-based analysis techniques in the neuroimaging community. In this paper, we present a graphical user interface (GUI) based software package, ISC Toolbox, implemented in Matlab for computing various ISC-based analyses. Many advanced computations, such as comparison of ISCs between different stimuli, time window ISC, and inter-subject phase synchronization, are supported by the toolbox. The analyses are coupled with re-sampling based statistical inference. ISC-based analyses are data and computation intensive, and the ISC Toolbox is equipped with mechanisms to execute the parallel computations in a cluster environment automatically, with automatic detection of the cluster environment in use. Currently, SGE-based (Oracle Grid Engine, Son of a Grid Engine, or Open Grid Scheduler) and Slurm environments are supported. In this paper, we present a detailed account of the methods behind the ISC Toolbox and its implementation, and we demonstrate possible uses of the toolbox by summarizing selected example applications. We also report computation time experiments using both a single desktop computer and two grid environments, demonstrating that parallelization effectively reduces computing time. The ISC Toolbox is available at https://code.google.com/p/isc-toolbox/
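
    The basic computation is small even though the toolbox's pipelines are not: z-score each subject's time series, then average the pairwise Pearson correlations at every voxel. A NumPy sketch of that core step (the toolbox itself is Matlab, and this omits its statistical inference and parallelization):

      import numpy as np

      def isc_map(data):
          # data: (n_subjects, n_voxels, n_timepoints), already co-registered.
          n_sub = data.shape[0]
          z = (data - data.mean(-1, keepdims=True)) / data.std(-1, keepdims=True)
          isc = np.zeros(data.shape[1])
          n_pairs = 0
          for i in range(n_sub):
              for j in range(i + 1, n_sub):
                  isc += (z[i] * z[j]).mean(-1)  # Pearson r at every voxel
                  n_pairs += 1
          return isc / n_pairs                   # mean pairwise ISC per voxel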

  16. A versatile software package for inter-subject correlation based analyses of fMRI

    PubMed Central

    Kauppi, Jukka-Pekka; Pajula, Juha; Tohka, Jussi

    2014-01-01

    In inter-subject correlation (ISC) based analysis of functional magnetic resonance imaging (fMRI) data, the extent of shared processing across subjects during the experiment is determined by calculating correlation coefficients between the fMRI time series of the subjects in corresponding brain locations. This implies that ISC can be used to analyze fMRI data without explicitly modeling the stimulus, making ISC a potential method for analyzing fMRI data acquired under complex naturalistic stimuli. Despite the suitability of the ISC-based approach for analyzing complex fMRI data, no generic software tools have been made available for this purpose, limiting widespread use of ISC-based analysis techniques in the neuroimaging community. In this paper, we present a graphical user interface (GUI) based software package, ISC Toolbox, implemented in Matlab for computing various ISC-based analyses. Many advanced computations, such as comparison of ISCs between different stimuli, time window ISC, and inter-subject phase synchronization, are supported by the toolbox. The analyses are coupled with re-sampling based statistical inference. ISC-based analyses are data and computation intensive, and the ISC Toolbox is equipped with mechanisms to execute the parallel computations in a cluster environment automatically, with automatic detection of the cluster environment in use. Currently, SGE-based (Oracle Grid Engine, Son of a Grid Engine, or Open Grid Scheduler) and Slurm environments are supported. In this paper, we present a detailed account of the methods behind the ISC Toolbox and its implementation, and we demonstrate possible uses of the toolbox by summarizing selected example applications. We also report computation time experiments using both a single desktop computer and two grid environments, demonstrating that parallelization effectively reduces computing time. The ISC Toolbox is available at https://code.google.com/p/isc-toolbox/ PMID:24550818

  17. Interactive Retinal Blood Flow Analysis of the Macular Region

    PubMed Central

    Tian, Jing; Somfai, Gábor Márk; Campagnoli, Thalmon R.; Smiddy, William E.; Debuc, Delia Cabrera

    2015-01-01

    The study of retinal hemodynamics plays an important role in understanding the onset and progression of diabetic retinopathy, which is a leading cause of blindness in American adults. In this work, we developed an interactive retinal analysis tool to quantitatively measure the blood flow velocity (BFV) and blood flow rate (BFR) in the macular region using the Retinal Function Imager (RFI-3005, Optical Imaging, Rehovot, Israel). By employing a high definition stroboscopic fundus camera, the RFI device is able to assess retinal blood flow characteristics in vivo, even in the capillaries. However, measurements of BFV using a user-guided vessel segmentation tool may induce significant inter-observer differences, and BFR is not provided in the built-in software. In this work, we have developed an interactive tool to assess retinal BFV as well as BFR in the macular region. Optical coherence tomography (OCT) data from commercially available devices were registered with the RFI image to locate the fovea accurately. The boundaries of the vessels were delineated on a motion contrast enhanced image, and BFV was computed by maximizing the cross-correlation of pixel intensities in a ratio video. Furthermore, we were able to calculate the BFR in absolute values (μl/s), which other currently available devices targeting the retinal microcirculation cannot yet do. Experiments were conducted on 122 vessels from 5 healthy and 5 mild non-proliferative diabetic retinopathy (NPDR) subjects. The Pearson correlation of the vessel diameter measurements between our method and manual labeling on 40 vessels was 0.984. The intraclass correlations (ICC) of BFV between our proposed method and the built-in software were 0.924 and 0.830 for vessels from healthy and NPDR subjects, respectively. The coefficient of variation between repeated sessions was reduced significantly, from 22.5% with the RFI built-in software to 15.9% with our proposed method (p<0.001). PMID:26569349

  18. WE-FG-202-12: Investigation of Longitudinal Salivary Gland DCE-MRI Changes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ger, R; Howell, R; Li, H

    Purpose: To determine the correlation between dose and changes through treatment in dynamic contrast enhanced (DCE) MRI voxel parameters (Ktrans, kep, Ve, and Vp) within the salivary glands of head and neck oropharyngeal squamous cell carcinoma (HNSCC) patients. Methods: 17 HNSCC patients treated with definitive radiation therapy completed DCE-MRI scans on a 3T scanner at pre-treatment, mid-treatment, and post-treatment time points. Mid-treatment and post-treatment DCE images were deformably registered to pre-treatment DCE images (Velocity software package). Pharmacokinetic analysis of the DCE images used a modified Tofts model to produce parameter maps, with an arterial input function selected from each patient's perivertebral space on the image (NordicICE software package). In-house software was developed for voxel-by-voxel longitudinal analysis of the salivary glands within the registered images. The planning CT was rigidly registered to the pre-treatment DCE image to obtain dose values in each voxel. Voxels within the lower and upper dose quartiles of each gland were averaged for each patient, and the averages of the patients' means for the two quartiles were compared. Dose relationships were also assessed by Spearman correlations between dose and voxel parameter changes for each patient's gland. Results: Changes in parameter means between time points were observed, but inter-patient variability was high. Ve of the parotid was the only parameter that had a consistently significant longitudinal difference between dose quartiles. The highest Spearman correlation was for Vp of the sublingual gland for the change from pre-treatment to mid-treatment, with only ρ = 0.29. Conclusion: In this preliminary study, there was large inter-patient variability in the changes of DCE voxel parameters, with no clear relationship with dose. Additional patients may reduce the uncertainties and allow determination of whether parameter-dose relationships exist.
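
    For reference, the modified (extended) Tofts model expresses tissue concentration as Ct(t) = Vp·Cp(t) + Ktrans·∫ Cp(u)·exp(-kep·(t-u)) du, with Ve = Ktrans/kep. A discrete-convolution Python sketch of this standard pharmacokinetic form (generic, not the NordicICE implementation):

      import numpy as np

      def extended_tofts(t, cp, ktrans, kep, vp):
          # t: uniformly spaced time points (min); cp: arterial input function.
          dt = t[1] - t[0]
          kernel = np.exp(-kep * t)
          conv = np.convolve(cp, kernel)[: len(t)] * dt  # int Cp(u)e^{-kep(t-u)}du
          return vp * cp + ktrans * conv                 # tissue concentration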

  19. Earth Surface Monitoring with COSI-Corr, Techniques and Applications

    NASA Astrophysics Data System (ADS)

    Leprince, S.; Ayoub, F.; Avouac, J.

    2009-12-01

    Co-registration of Optically Sensed Images and Correlation (COSI-Corr) is a software package developed at the California Institute of Technology (USA) for accurate geometrical processing of optical satellite and aerial imagery. Initially developed for the measurement of co-seismic ground deformation using optical imagery, COSI-Corr is now used for a wide range of applications in Earth Sciences that take advantage of the software's capability to co-register, with very high accuracy, images taken from different sensors and acquired at different times. As long as a sensor is supported in COSI-Corr, all images between the supported sensors can be accurately orthorectified and co-registered. For example, it is possible to co-register a series of SPOT images, a series of aerial photographs, or a series of aerial photographs with a series of SPOT images. Currently supported sensors include the SPOT 1-5, Quickbird, Worldview 1, and Formosat 2 satellites, the ASTER instrument, and frame camera acquisitions from, e.g., aerial surveys or declassified satellite imagery. Potential applications include accurate change detection between multi-temporal and multi-spectral images, and the calibration of pushbroom cameras. In particular, COSI-Corr provides a powerful correlation tool, which allows for accurate estimation of surface displacement. The accuracy depends on many factors (e.g., cloud, snow, and vegetation cover, shadows, temporal changes in general, steadiness of the imaging platform, and defects of the imaging system), but in practice the standard deviation of the measurements obtained from the correlation of multi-temporal images is typically around 1/20 to 1/10 of the pixel size. The software package also includes post-processing tools such as denoising, destriping, and stacking tools to facilitate data interpretation. Examples drawn from current research in, e.g., seismotectonics, glaciology, and geomorphology will be presented. COSI-Corr is developed in IDL (Interactive Data Language), integrated under the user-friendly interface ENVI (Environment for Visualizing Images), and is distributed free of charge for academic research purposes.

  20. An image-processing software package: UU and Fig for optical metrology applications

    NASA Astrophysics Data System (ADS)

    Chen, Lujie

    2013-06-01

    Modern optical metrology applications are largely supported by computational methods, such as phase shifting [1], Fourier transform [2], digital image correlation [3], and camera calibration [4], in which image processing is a critical and indispensable component. While it is not difficult to obtain a wide variety of image-processing programs from the internet, few cater to the relatively specialized area of optical metrology. This paper introduces an image-processing software package, UU (data processing) and Fig (data rendering), that incorporates many useful functions for processing optical metrological data. The cross-platform programs UU and Fig are built on wxWidgets. At the time of writing, they have been tested on Windows, Linux, and Mac OS. The user interface is designed to offer precise control of the underlying processing procedures in a scientific manner. The data input/output mechanism is designed to accommodate diverse file formats and to facilitate interaction with other independent programs. In terms of robustness, although the software was initially developed for personal use, it is comparable in stability and accuracy to most commercial software of a similar nature. In addition to functions for optical metrology, the software package has a rich collection of useful tools in the following areas: real-time image streaming from USB and GigE cameras, computational geometry, computer vision, fitting of data, 3D image processing, vector image processing, precision device control (rotary stages, PZT stages, etc.), point cloud to surface reconstruction, volume rendering, and batch processing. The software package is currently used in a number of universities for teaching and research.

  1. A real-time monitoring system for the facial nerve.

    PubMed

    Prell, Julian; Rachinger, Jens; Scheller, Christian; Alfieri, Alex; Strauss, Christian; Rampp, Stefan

    2010-06-01

    Damage to the facial nerve during surgery in the cerebellopontine angle is indicated by A-trains, a specific electromyogram pattern. These A-trains can be quantified by the parameter "traintime," which is reliably correlated with postoperative functional outcome. The system presented here was designed to monitor traintime in real time. A dedicated hardware and software platform for automated continuous analysis of the intraoperative facial nerve electromyogram was specifically designed. The automatic detection of A-trains is performed by a software algorithm for real-time analysis of nonstationary biosignals. The system was evaluated in a series of 30 patients operated on for vestibular schwannoma. A-trains can be detected and measured automatically by the described method for real-time analysis. Traintime is monitored continuously via a graphic display and is shown as an absolute numeric value during the operation. It expresses the overall cumulative length of A-trains in a given channel; a high correlation between traintime as measured by real-time analysis and functional outcome was observed both immediately after the operation (Spearman correlation coefficient [rho] = 0.664, P < .001) and in long-term outcome (rho = 0.631, P < .001). Automated real-time analysis of the intraoperative facial nerve electromyogram is the first technique capable of reliable continuous real-time monitoring. It can contribute critically to the estimation of functional outcome during the course of the operative procedure.

  2. [Evaluation of dental plaque by quantitative digital image analysis system].

    PubMed

    Huang, Z; Luan, Q X

    2016-04-18

    To analyze plaque staining images using image analysis software; to verify the maneuverability, practicability, and repeatability of this technique; and to evaluate the influence of different plaque stains. In the study, 30 volunteers were enrolled from the new dental students of Peking University Health Science Center in accordance with the inclusion criteria. Digital images of the anterior teeth were acquired after plaque staining, following a standardized imaging protocol. The image analysis was performed using Image Pro Plus 7.0, and the Quigley-Hein plaque indexes of the anterior teeth were evaluated. The plaque stain area percentage and the corresponding dental plaque index were highly correlated, with a Spearman correlation coefficient of 0.776 (P < 0.01). Intraclass correlation coefficients of the tooth area and plaque area calculated by the two researchers using the software were 0.956 and 0.930 (P < 0.01). The Bland-Altman analysis chart showed only a few points outside the 95% limits of agreement. The analyses of images with different plaque stains showed that the difference in the tooth area measurements was not significant, while the difference in the plaque area measurements was significant (P < 0.01). This method is easy to operate and control, correlates highly with the traditional plaque index through the calculated percentage of plaque area, and has good reproducibility. The plaque staining method has little effect on image segmentation results; a stain that is sensitive for image analysis is suggested.
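
    The study used Image Pro Plus 7.0; the underlying computation, the stained area as a percentage of the tooth area, can be sketched with two binary masks (Python/NumPy; the toy masks are invented for illustration):

        import numpy as np

        def plaque_percentage(tooth_mask, plaque_mask):
            """Percentage of the tooth area covered by stained plaque.

            Both arguments are boolean images of identical shape; plaque
            pixels are assumed to lie within the tooth region.
            """
            tooth_px = np.count_nonzero(tooth_mask)
            plaque_px = np.count_nonzero(plaque_mask & tooth_mask)
            return 100.0 * plaque_px / tooth_px

        # Toy example: a 100x100 tooth region with a 20x50 stained patch
        tooth = np.zeros((120, 120), dtype=bool)
        tooth[10:110, 10:110] = True
        plaque = np.zeros_like(tooth)
        plaque[10:30, 10:60] = True
        print(plaque_percentage(tooth, plaque))  # -> 10.0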

  3. DIGE Analysis Software and Protein Identification Approaches.

    PubMed

    Hmmier, Abduladim; Dowling, Paul

    2018-01-01

    DIGE is a high-resolution two-dimensional gel electrophoresis method with excellent dynamic range, obtained by fluorescent tag labeling of protein samples. Scanned images of DIGE gels show thousands of protein spots, each representing a single protein or a group of protein isoforms. Using commercially available software, each protein spot is defined by an outline, which is digitized and correlated with the quantity of protein present in the spot. Software packages include DeCyder, SameSpots, and Dymension 3. In addition, proteins of interest can be excised from post-stained gels and identified with conventional mass spectrometry techniques. High-throughput mass spectrometry is performed using sophisticated instrumentation, including matrix-assisted laser desorption ionization time-of-flight (MALDI-TOF), MALDI-TOF/TOF, and liquid chromatography tandem mass spectrometry (LC-MS/MS). Tandem MS (MALDI-TOF/TOF or LC-MS/MS) analyzes fragmented peptides, yielding amino acid sequence information that is especially useful when protein spots are of low abundance or when a mixture of proteins is present.

  4. SBML-SAT: a systems biology markup language (SBML) based sensitivity analysis tool

    PubMed Central

    Zi, Zhike; Zheng, Yanan; Rundell, Ann E; Klipp, Edda

    2008-01-01

    Background It has long been recognized that sensitivity analysis plays a key role in modeling and analyzing cellular and biochemical processes. Systems biology markup language (SBML) has become a well-known platform for coding and sharing mathematical models of such processes. However, current SBML-compatible software tools are limited in their ability to perform global sensitivity analyses of these models. Results This work introduces a freely downloadable software package, SBML-SAT, which implements algorithms for simulation, steady state analysis, robustness analysis, and local and global sensitivity analysis of SBML models. This software tool extends current capabilities through its execution of global sensitivity analyses using multi-parametric sensitivity analysis, partial rank correlation coefficient, Sobol's method, and weighted average of local sensitivity analyses, in addition to its ability to handle systems with discontinuous events and its intuitive graphical user interface. Conclusion SBML-SAT provides the community of systems biologists a new tool for the analysis of their SBML models of biochemical and cellular processes. PMID:18706080
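
    SBML-SAT's own implementation is not shown here; as a generic illustration of one of the listed methods, a partial rank correlation coefficient (PRCC) can be computed by rank-transforming the data and correlating the residuals after regressing out the other parameters (Python sketch; the toy model is invented for illustration):

        import numpy as np
        from scipy.stats import rankdata

        def prcc(params, output):
            """Partial rank correlation coefficient of each parameter with a
            model output (params: n_samples x n_params, output: n_samples)."""
            X = np.column_stack([rankdata(col) for col in params.T])
            y = rankdata(output)
            n, k = X.shape
            coeffs = []
            for j in range(k):
                others = np.column_stack([np.ones(n), np.delete(X, j, axis=1)])
                # Residuals of parameter j and of the output after removing
                # the linear (rank-space) influence of the other parameters
                rx = X[:, j] - others @ np.linalg.lstsq(others, X[:, j], rcond=None)[0]
                ry = y - others @ np.linalg.lstsq(others, y, rcond=None)[0]
                coeffs.append(np.corrcoef(rx, ry)[0, 1])
            return np.array(coeffs)

        # Toy model: the output depends strongly on p0, weakly on p1
        rng = np.random.default_rng(0)
        P = rng.uniform(size=(500, 2))
        y = 5 * P[:, 0] + 0.5 * P[:, 1] + rng.normal(scale=0.1, size=500)
        print(prcc(P, y))  # first coefficient near 1, second much smaller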

  5. SBML-SAT: a systems biology markup language (SBML) based sensitivity analysis tool.

    PubMed

    Zi, Zhike; Zheng, Yanan; Rundell, Ann E; Klipp, Edda

    2008-08-15

    It has long been recognized that sensitivity analysis plays a key role in modeling and analyzing cellular and biochemical processes. Systems biology markup language (SBML) has become a well-known platform for coding and sharing mathematical models of such processes. However, current SBML-compatible software tools are limited in their ability to perform global sensitivity analyses of these models. This work introduces a freely downloadable software package, SBML-SAT, which implements algorithms for simulation, steady state analysis, robustness analysis, and local and global sensitivity analysis of SBML models. This software tool extends current capabilities through its execution of global sensitivity analyses using multi-parametric sensitivity analysis, partial rank correlation coefficient, Sobol's method, and weighted average of local sensitivity analyses, in addition to its ability to handle systems with discontinuous events and its intuitive graphical user interface. SBML-SAT provides the community of systems biologists a new tool for the analysis of their SBML models of biochemical and cellular processes.

  6. Cytoscape: a software environment for integrated models of biomolecular interaction networks.

    PubMed

    Shannon, Paul; Markiel, Andrew; Ozier, Owen; Baliga, Nitin S; Wang, Jonathan T; Ramage, Daniel; Amin, Nada; Schwikowski, Benno; Ideker, Trey

    2003-11-01

    Cytoscape is an open source software project for integrating biomolecular interaction networks with high-throughput expression data and other molecular states into a unified conceptual framework. Although applicable to any system of molecular components and interactions, Cytoscape is most powerful when used in conjunction with large databases of protein-protein, protein-DNA, and genetic interactions that are increasingly available for humans and model organisms. Cytoscape's software Core provides basic functionality to layout and query the network; to visually integrate the network with expression profiles, phenotypes, and other molecular states; and to link the network to databases of functional annotations. The Core is extensible through a straightforward plug-in architecture, allowing rapid development of additional computational analyses and features. Several case studies of Cytoscape plug-ins are surveyed, including a search for interaction pathways correlating with changes in gene expression, a study of protein complexes involved in cellular recovery to DNA damage, inference of a combined physical/functional interaction network for Halobacterium, and an interface to detailed stochastic/kinetic gene regulatory models.

  7. Implementation of metal-friendly EAM/FS-type semi-empirical potentials in HOOMD-blue: A GPU-accelerated molecular dynamics software

    NASA Astrophysics Data System (ADS)

    Yang, Lin; Zhang, Feng; Wang, Cai-Zhuang; Ho, Kai-Ming; Travesset, Alex

    2018-04-01

    We present an implementation of EAM and FS interatomic potentials, which are widely used in simulating metallic systems, in HOOMD-blue, a software package designed to perform classical molecular dynamics simulations with GPU acceleration. We first discuss the details of our implementation and then report extensive benchmark tests. We demonstrate that single-precision floating point operations efficiently implemented on GPUs can produce sufficient accuracy when compared against double-precision codes, as shown in test calculations of the glass-transition temperature of Cu64.5Zr35.5 and of the pair correlation function g(r) of liquid Ni3Al. Our code scales well with the size of the simulated system on NVIDIA Tesla M40 and P100 GPUs. Compared with the popular LAMMPS software running on 32 cores of AMD Opteron 6220 processors, the GPU/CPU performance ratio can reach as high as 4.6. The source code can be accessed for free through the HOOMD-blue web page by any interested user.
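
    The EAM/FS code itself lives on the HOOMD-blue project page; purely to illustrate the benchmark quantity mentioned above, the following sketch computes a pair correlation function g(r) for particles in a periodic cubic box (plain Python/NumPy; the random ideal-gas input is a stand-in for a real MD trajectory):

        import numpy as np

        def pair_correlation(positions, box, r_max, dr):
            """Radial distribution function g(r) for particles in a cubic
            periodic box (positions: N x 3, box: edge length)."""
            n = len(positions)
            rho = n / box**3
            edges = np.arange(0.0, r_max + dr, dr)
            counts = np.zeros(len(edges) - 1)
            for i in range(n - 1):
                d = positions[i + 1:] - positions[i]
                d -= box * np.round(d / box)          # minimum-image convention
                r = np.linalg.norm(d, axis=1)
                counts += np.histogram(r, bins=edges)[0]
            shell = 4.0 / 3.0 * np.pi * (edges[1:]**3 - edges[:-1]**3)
            # Each pair is counted once, hence the factor 2/n per particle
            return edges[:-1] + dr / 2, 2.0 * counts / (n * shell * rho)

        # Ideal-gas check: g(r) should fluctuate around 1
        rng = np.random.default_rng(1)
        r, g = pair_correlation(rng.uniform(0, 10.0, (500, 3)), 10.0, 4.0, 0.1)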

  8. Evaluation of interaction dynamics of concurrent processes

    NASA Astrophysics Data System (ADS)

    Sobecki, Piotr; Białasiewicz, Jan T.; Gross, Nicholas

    2017-03-01

    The purpose of this paper is to present wavelet tools that enable the detection of temporal interactions between concurrent processes. In particular, the interaction coherence of time-varying signals is determined using a complex continuous wavelet transform. The paper uses an electrocardiogram (ECG) and seismocardiogram (SCG) data set to demonstrate multiple continuous wavelet analysis techniques based on the Morlet wavelet transform. A MATLAB graphical user interface (GUI), developed in the reported research to assist in quick and simple data analysis, is presented. These software tools can uncover the interaction dynamics of time-varying signals: they can reveal correlations in phase and amplitude as well as non-linear interconnections. The user-friendly MATLAB GUI makes the software straightforward to use: the two processes under investigation are loaded, the required processing parameters are chosen, and the analysis is then performed. The software is a useful tool for researchers who need to investigate the interaction dynamics of concurrent processes.
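
    The authors' MATLAB GUI is not reproduced here; the sketch below (Python/NumPy, with a hand-rolled complex Morlet kernel and synthetic signals standing in for ECG/SCG data) illustrates the core idea of extracting a phase relationship between two concurrent signals from their complex wavelet coefficients:

        import numpy as np

        def morlet_cwt(x, fs, freqs, w=6.0):
            """Complex Morlet continuous wavelet transform of signal x.
            Returns an array (len(freqs), len(x)) of complex coefficients."""
            out = np.empty((len(freqs), len(x)), dtype=complex)
            for k, f in enumerate(freqs):
                s = w / (2 * np.pi * f)                  # scale for frequency f
                tk = np.arange(-4 * s, 4 * s, 1 / fs)    # kernel support
                kernel = np.exp(2j * np.pi * f * tk) * np.exp(-tk**2 / (2 * s**2))
                kernel /= np.sqrt(np.pi) * s * fs        # rough normalization
                out[k] = np.convolve(x, kernel, mode="same")
            return out

        # Two signals sharing a 5 Hz component with a 90-degree phase lag
        fs, t = 200, np.arange(0, 10, 1 / 200)
        a = np.sin(2 * np.pi * 5 * t)
        b = np.sin(2 * np.pi * 5 * t - np.pi / 2) + 0.3 * np.random.randn(len(t))
        Wa, Wb = morlet_cwt(a, fs, [5.0]), morlet_cwt(b, fs, [5.0])
        phase_diff = np.angle(Wa * np.conj(Wb))  # ~ +pi/2 at the shared frequency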

  9. Comparison of two software systems for quantification of myocardial blood flow in patients with hypertrophic cardiomyopathy.

    PubMed

    Yalcin, Hulya; Valenta, Ines; Zhao, Min; Tahari, Abdel; Lu, Dai-Yin; Higuchi, Takahiro; Yalcin, Fatih; Kucukler, Nagehan; Soleimanifard, Yalda; Zhou, Yun; Pomper, Martin G; Abraham, Theodore P; Tsui, Ben; Lodge, Martin A; Schindler, Thomas H; Roselle Abraham, M

    2018-01-22

    Quantification of myocardial blood flow (MBF) by positron emission tomography (PET) is important for the investigation of angina in hypertrophic cardiomyopathy (HCM). Several software programs exist for MBF quantification, but they have mostly been evaluated in patients (with normal cardiac geometry) referred for evaluation of coronary artery disease (CAD). Software performance has not been evaluated in HCM patients, who frequently have hyperdynamic LV function, LV outflow tract (LVOT) obstruction, small LV cavity size, and variation in the degree/location of LV hypertrophy. We compared MBF results obtained using PMod, which permits manual segmentation, with those obtained by the FDA-approved QPET software, which has an automated segmentation algorithm. 13N-ammonia PET perfusion data were acquired in list mode at rest and during pharmacologic vasodilation in 76 HCM patients and 10 non-HCM patients referred for evaluation of CAD (CAD group). Data were resampled to create static, ECG-gated, and 36-frame dynamic images. Myocardial flow reserve (MFR) and MBF (in ml/min/g) were calculated using the QPET and PMod software packages. All HCM patients had asymmetric septal hypertrophy, and 50% had evidence of LVOT obstruction, whereas non-HCM patients (CAD group) had normal wall thickness and ejection fraction. PMod yielded significantly higher values for global and regional stress-MBF and MFR than QPET in HCM. Reasonably fair correlation was observed for global rest-MBF, stress-MBF, and MFR between the two software packages (rest-MBF: r = 0.78; stress-MBF: r = 0.66; MFR: r = 0.7) in HCM patients. Agreement between global MBF and MFR values improved when HCM patients with high spillover fractions (>0.65) were excluded from the analysis (rest-MBF: r = 0.84; stress-MBF: r = 0.72; MFR: r = 0.8). Regionally, the highest agreement between PMod and QPET was observed in the LAD territory (rest-MBF: r = 0.82; stress-MBF: r = 0.68), where the spillover fraction was the lowest. Unlike HCM patients, the non-HCM patients (CAD group) demonstrated excellent agreement in MBF/MFR values obtained by the two software packages when patients with high spillover fractions were excluded (rest-MBF: r = 0.95; stress-MBF: r = 0.92; MFR: r = 0.95). Anatomic characteristics specific to HCM hearts contribute to lower correlations between MBF/MFR values obtained by PMod and QPET compared with non-HCM patients. These differences indicate that PMod and QPET cannot be used interchangeably for MBF/MFR analyses in HCM patients.

  10. An online open-source tool for automated quantification of liver and myocardial iron concentrations by T2* magnetic resonance imaging.

    PubMed

    Git, K-A; Fioravante, L A B; Fernandes, J L

    2015-09-01

    To assess whether an online open-source tool provides accurate calculations of T2* values for iron concentrations in the liver and heart compared with a standard reference software. An online open-source tool, written in pure HTML5/JavaScript, was tested in 50 patients (age 26.0 ± 18.9 years, 46% males) who underwent T2* MRI of the liver and heart for iron overload assessment as part of their routine workup. Automated truncation correction was the default, with optional manual adjustment provided if needed. The results were compared against a standard reference measurement using commercial software with manual truncation (CVI42® v. 5.1; Circle Cardiovascular Imaging, Calgary, AB). The mean liver T2* value calculated with the automated tool was 4.3 ms [95% confidence interval (CI) 3.1 to 5.5 ms] vs 4.26 ms using the reference software (95% CI 3.1 to 5.4 ms), without any significant difference (p = 0.71). In the liver, the mean difference was 0.036 ms (95% CI -0.1609 to 0.2329 ms) with a regression correlation coefficient of 0.97. For the heart, the automated T2* value was 26.0 ms (95% CI 22.9 to 29.0 ms) vs 25.3 ms (95% CI 22.3 to 28.3 ms), p = 0.28. The mean difference was 0.72 ms (95% CI 0.08191 to 1.3621 ms) with a correlation coefficient of 0.96. The automated online tool provides T2* values for liver and myocardial iron concentrations similar to those of a standard reference software. The online program provides an open-source tool for the calculation of T2* values, incorporating an automated correction algorithm in a simple and easy-to-use interface.
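
    The tool's exact truncation algorithm is not described in this summary; the sketch below shows one simple heuristic for mono-exponential T2* fitting with tail truncation (Python/SciPy; the synthetic decay, noise floor, and selection criterion are illustrative assumptions, not the authors' algorithm):

        import numpy as np
        from scipy.optimize import curve_fit

        def fit_t2star(te_ms, signal, min_points=3):
            """Mono-exponential T2* fit with tail truncation: late echoes
            are progressively dropped and the fit with the smallest
            normalized residual is kept (a simple heuristic)."""
            model = lambda te, s0, t2s: s0 * np.exp(-te / t2s)
            best = None
            for n in range(len(te_ms), min_points - 1, -1):
                p, _ = curve_fit(model, te_ms[:n], signal[:n],
                                 p0=(signal[0], 10.0), maxfev=5000)
                resid = signal[:n] - model(te_ms[:n], *p)
                rmse = np.sqrt(np.mean(resid**2)) / signal[0]
                if best is None or rmse < best[0]:
                    best = (rmse, p[1])
            return best[1]  # T2* in ms

        # Synthetic liver decay: T2* = 4.3 ms plus a constant noise floor
        te = np.arange(1.0, 17.0, 2.0)
        sig = 1000 * np.exp(-te / 4.3) + 25
        print(fit_t2star(te, sig))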

  11. Correlated regions of cerebral blood flow with clinical parameters in Parkinson's disease; comparison using 'Anatomy' and 'Talairach Daemon' software.

    PubMed

    Yoon, Hyun Jin; Cheon, Sang Myung; Jeong, Young Jin; Kang, Do-Young

    2012-02-01

    We assign the anatomical names of functional activation regions in the brain, based on the probabilistic cyto-architectonic atlas of Anatomy 1.7, from an analysis of correlations between regional cerebral blood flow (rCBF) and clinical parameters of non-demented Parkinson's disease (PD) patients in SPM8. We evaluated Anatomy 1.7 of the SPM toolbox against the 'Talairach Daemon' (TD) Client 2.4.2 software. One hundred and thirty-six patients (mean age 60.0 ± 9.09 years; 73 women and 63 men) with non-demented PD were selected. Tc-99m-HMPAO brain single-photon emission computed tomography (SPECT) scans were performed on the patients using a two-head gamma camera. We analyzed the brain images of PD patients with SPM8 and identified the anatomical names of regions whose rCBF perfusion correlated with the clinical parameters, using TD Client 2.4.2 and Anatomy 1.7. SPM8 provided a correlation coefficient between clinical parameters and cerebral hypoperfusion by a simple regression method. The clinical parameters were age, duration of disease, education period, Hoehn and Yahr (H&Y) stage, and Korean mini-mental state examination (K-MMSE) score. Age was correlated with cerebral perfusion in Brodmann area (BA) 6 and BA 3b as assigned by Anatomy 1.7, and in BA 6 and the pyramis in gray matter by TD Client 2.4.2, with p < 0.001 uncorrected. Significant correlated regions were also found in the left and right lobules VI (Hem) with duration of disease, in left and right lobules VIIa crus I (Hem) with education, in the left insula (Ig2) and left and right lobules VI (Hem) with H&Y stage, and in BA 4a and 6 with K-MMSE score, with p < 0.05 uncorrected, by Anatomy 1.7. Most areas of correlation overlapped between the two anatomical labeling methods, but some correlation areas were given different names. Age was the clinical parameter most significantly correlated with rCBF. TD Client found the exact anatomical name from the peak intensity position of the cluster, while Anatomy 1.7 of the SPM8 toolbox, using the cyto-architectonic probability maps, assigned the anatomical name by the percentage value of the probability.

  12. 25 CFR 547.14 - What are the minimum technical standards for electronic random number generation?

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... the application under the rules of the game. For example, if a bingo game with 75 objects with numbers... test potency and degree of serial correlation (outcomes must be independent from the previous game...) General requirements. (1) Software that calls an RNG to derive game outcome events must immediately use...

  13. 25 CFR 547.14 - What are the minimum technical standards for electronic random number generation?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... the application under the rules of the game. For example, if a bingo game with 75 objects with numbers... test potency and degree of serial correlation (outcomes must be independent from the previous game...) General requirements. (1) Software that calls an RNG to derive game outcome events must immediately use...

  14. Operational Based Vision Assessment Automated Vision Test Collection User Guide

    DTIC Science & Technology

    2017-05-15

    repeatability to support correlation analysis. The AVT research grade tests also support interservice, international, industry, and academic partnerships...software, provides information concerning various menu options and operation of the test, and provides a brief description of each of the automated vision...

  15. SEU induced errors observed in microprocessor systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Asenek, V.; Underwood, C.; Oldfield, M.

    In this paper, the authors present software tools for predicting the rate and nature of observable SEU induced errors in microprocessor systems. These tools are built around a commercial microprocessor simulator and are used to analyze real satellite application systems. Results obtained from simulating the nature of SEU induced errors are shown to correlate with ground-based radiation test data.

  16. Comparing Students' Scratch Skills with Their Computational Thinking Skills in Terms of Different Variables

    ERIC Educational Resources Information Center

    Oluk, Ali; Korkmaz, Özgen

    2016-01-01

    This study aimed to compare 5th graders' scores obtained from Scratch projects developed in the framework of Information Technologies and Software classes via Dr Scratch web tool with the scores obtained from Computational Thinking Levels Scale and to examine this comparison in terms of different variables. Correlational research model was…

  17. APPLICATION OF COMPUTER-AIDED TOMOGRAPHY (CAT) AS A POTENTIAL INDICATOR OF MARINE MACROBENTHIC ACTIVITY ALONG POLLUTION GRADIENTS

    EPA Science Inventory

    Sediment cores were imaged using a local hospital CAT scanner. These image data were transferred to a personal computer at our laboratory using specially developed software. Previously, we reported an inverse correlation (r² = 0.98, P < 0.01) between the average sediment x-ray atte...

  18. The Relationship between Gender and Students' Attitude and Experience of Using a Computer Algebra System

    ERIC Educational Resources Information Center

    Ocak, Mehmet

    2008-01-01

    This correlational study examined the relationship between gender and the students' attitude and prior knowledge of using one of the mathematical software programs (MATLAB). Participants were selected from one community college, one state university and one private college. Students were volunteers from three Calculus I classrooms (one class from…

  19. The Effect of the Ill-posed Problem on Quantitative Error Assessment in Digital Image Correlation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lehoucq, R. B.; Reu, P. L.; Turner, D. Z.

    This work explores the effect of the ill-posed problem on uncertainty quantification for motion estimation using digital image correlation (DIC) (Sutton et al. 2009). We develop a correction factor for standard uncertainty estimates based on the cosine of the angle between the true motion and the image gradients, in an integral sense over a subregion of the image. This correction factor accounts for variability in the DIC solution previously unaccounted for when considering only image noise, interpolation bias, contrast, and software settings such as subset size and spacing.

  20. The Effect of the Ill-posed Problem on Quantitative Error Assessment in Digital Image Correlation

    DOE PAGES

    Lehoucq, R. B.; Reu, P. L.; Turner, D. Z.

    2017-11-27

    This work explores the effect of the ill-posed problem on uncertainty quantification for motion estimation using digital image correlation (DIC) (Sutton et al. 2009). We develop a correction factor for standard uncertainty estimates based on the cosine of the angle between the true motion and the image gradients, in an integral sense over a subregion of the image. This correction factor accounts for variability in the DIC solution previously unaccounted for when considering only image noise, interpolation bias, contrast, and software settings such as subset size and spacing.
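
    The paper's integral formulation is not reproduced in this summary; the sketch below merely illustrates the geometric quantity it describes, the cosine of the angle between an assumed motion direction and the image gradients averaged over a subregion (Python/NumPy; the stripe image and the averaging of |cos θ| are illustrative assumptions, not the authors' correction factor):

        import numpy as np

        def mean_gradient_cosine(image, motion_dir):
            """Average |cos(theta)| between a unit motion direction and the
            image gradient over a subregion (illustrative only)."""
            gy, gx = np.gradient(image.astype(float))
            mag = np.hypot(gx, gy)
            valid = mag > 1e-12
            ux, uy = motion_dir / np.linalg.norm(motion_dir)
            cos_t = (gx[valid] * ux + gy[valid] * uy) / mag[valid]
            return np.mean(np.abs(cos_t))

        # Vertical stripes: gradients point in x, so x-motion is well posed
        # and y-motion is ill posed (cosine near 0)
        img = np.sin(np.linspace(0, 20 * np.pi, 200))[None, :] * np.ones((200, 1))
        print(mean_gradient_cosine(img, np.array([1.0, 0.0])))  # near 1
        print(mean_gradient_cosine(img, np.array([0.0, 1.0])))  # near 0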

  1. Agatha: Disentangling period signals from correlated noise in a periodogram framework

    NASA Astrophysics Data System (ADS)

    Feng, F.; Tuomi, M.; Jones, H. R. A.

    2018-04-01

    Agatha is a framework of periodograms to disentangle periodic signals from correlated noise and to solve the two-dimensional model selection problem: signal dimension and noise model dimension. These periodograms are calculated by applying likelihood maximization and marginalization and combined in a self-consistent way. Agatha can be used to select the optimal noise model and to test the consistency of signals in time and can be applied to time series analyses in other astronomical and scientific disciplines. An interactive web implementation of the software is also available at http://agatha.herts.ac.uk/.
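
    Agatha's likelihood-based periodograms are available through the web app above; as a generic baseline for the same kind of analysis, a Lomb-Scargle periodogram of an unevenly sampled time series can be computed with Astropy (Python; the synthetic 17-day signal is invented for illustration):

        import numpy as np
        from astropy.timeseries import LombScargle

        # Irregularly sampled series with a 17-day periodic signal
        rng = np.random.default_rng(2)
        t = np.sort(rng.uniform(0, 500, 300))          # days
        y = 2.0 * np.sin(2 * np.pi * t / 17.0) + rng.normal(scale=1.0, size=t.size)

        freq, power = LombScargle(t, y).autopower()
        best_period = 1.0 / freq[np.argmax(power)]
        print(f"best period ~ {best_period:.1f} d, FAP = "
              f"{LombScargle(t, y).false_alarm_probability(power.max()):.2e}")

    Unlike a plain periodogram, Agatha additionally models correlated noise; a naive Lomb-Scargle analysis like this one can misidentify noise features as periodic signals, which is the problem the framework is designed to address.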

  2. Frontiers of Two-Dimensional Correlation Spectroscopy. Part 1. New concepts and noteworthy developments

    NASA Astrophysics Data System (ADS)

    Noda, Isao

    2014-07-01

    A comprehensive survey review of new and noteworthy developments that have advanced the frontiers of 2D correlation spectroscopy during the last four years is compiled. This review covers books, proceedings, and review articles published on 2D correlation spectroscopy; a number of significant conceptual developments in the field; data pretreatment methods and other pertinent topics; as well as patent and publication trends and citation activities. Developments discussed include projection 2D correlation analysis, concatenated 2D correlation, and correlation under multiple perturbation effects, as well as orthogonal sample design, prediction of 2D correlation spectra, manipulation and comparison of 2D spectra, correlation strategies based on segmented data blocks (such as moving-window analysis), features like determination of sequential order and enhanced spectral resolution, statistical 2D spectroscopy using covariance and other statistical metrics, hetero-correlation analysis, and the sample-sample correlation technique. Data pretreatment operations prior to 2D correlation analysis are discussed, including correction for physical effects, background and baseline subtraction, selection of the reference spectrum, normalization and scaling of data, derivative spectra and deconvolution techniques, and smoothing and noise reduction. Other pertinent topics include chemometrics and statistical considerations, peak position shift phenomena, variable sampling increments, computation and software, and display schemes, such as color-coded formats, slice and power spectra, tabulation, and other schemes.

  3. Monitoring and Discovery for Self-Organized Network Management in Virtualized and Software Defined Networks

    PubMed Central

    Valdivieso Caraguay, Ángel Leonardo; García Villalba, Luis Javier

    2017-01-01

    This paper presents the Monitoring and Discovery Framework of the Self-Organized Network Management in Virtualized and Software Defined Networks SELFNET project. This design takes into account the scalability and flexibility requirements needed by 5G infrastructures. In this context, the present framework focuses on gathering and storing the information (low-level metrics) related to physical and virtual devices, cloud environments, flow metrics, SDN traffic and sensors. Similarly, it provides the monitoring data as a generic information source in order to allow the correlation and aggregation tasks. Our design enables the collection and storing of information provided by all the underlying SELFNET sublayers, including the dynamically onboarded and instantiated SDN/NFV Apps, also known as SELFNET sensors. PMID:28362346

  4. Data Acquisition and Environmental Monitoring of the MAJORANA DEMONSTRATOR

    NASA Astrophysics Data System (ADS)

    Meijer, Samuel; Majorana Collaboration

    2015-04-01

    Low-background non-accelerator experiments have unique requirements for their data acquisition and environmental monitoring. Background signals can easily overwhelm the signals of interest, so events which could contribute to the background must be identified. There is a need to correlate events between detectors and environmental conditions, and data integrity must be maintained. Here, we describe several of the software and hardware techniques developed by the MAJORANA Collaboration for the MAJORANA DEMONSTRATOR, such as the use of the Object-oriented Real-time Control and Acquisition (ORCA) software package. This material is based upon work supported by the U.S. Department of Energy, Office of Science, Office of Nuclear Physics, the Particle Astrophysics Program of the National Science Foundation, and the Sanford Underground Research Facility.

  5. The GRIDView Visualization Package

    NASA Astrophysics Data System (ADS)

    Kent, B. R.

    2011-07-01

    Large three-dimensional data cubes, catalogs, and spectral line archives are increasingly important elements of the data discovery process in astronomy. Visualization of large data volumes is of vital importance for the success of large spectral line surveys. Examples of data reduction utilizing the GRIDView software package are shown. The package allows users to manipulate data cubes, extract spectral profiles, and measure line properties. The package and included graphical user interfaces (GUIs) are designed with pipeline infrastructure in mind. The software has been used with great success analyzing spectral line and continuum data sets obtained from large radio survey collaborations. The tools are also important for multi-wavelength cross-correlation studies and incorporate Virtual Observatory client applications for overlaying database information in real time as cubes are examined by users.

  6. High Resolution X-Ray Micro-CT of Ultra-Thin Wall Space Components

    NASA Technical Reports Server (NTRS)

    Roth, Don J.; Rauser, R. W.; Bowman, Randy R.; Bonacuse, Peter; Martin, Richard E.; Locci, I. E.; Kelley, M.

    2012-01-01

    A high resolution micro-CT system has been assembled and is being used to provide optimal characterization for ultra-thin wall space components. The Glenn Research Center NDE Sciences Team, using this CT system, has assumed the role of inspection vendor for the Advanced Stirling Convertor (ASC) project at NASA. This article will discuss many aspects of the development of the CT scanning for this type of component, including CT system overview; inspection requirements; process development, software utilized and developed to visualize, process, and analyze results; calibration sample development; results on actual samples; correlation with optical/SEM characterization; CT modeling; and development of automatic flaw recognition software. Keywords: Nondestructive Evaluation, NDE, Computed Tomography, Imaging, X-ray, Metallic Components, Thin Wall Inspection

  7. Correlation of heat transfer coefficient in quenching process using ABAQUS

    NASA Astrophysics Data System (ADS)

    Davare, Sandeep Kedarnath; Balachandran, G.; Singh, R. K. P.

    2018-04-01

    During heat treatment by quenching in a liquid medium, the convective heat transfer coefficient plays a crucial role in the extraction of heat. The heat extraction ultimately influences the cooling rate and hence the hardness and mechanical properties. A finite element analysis of quenching a simple flat copper sample, with different sample orientations and different quenchant temperatures, was carried out to check and verify the results obtained from the experiments. The heat transfer coefficient (HTC) was calculated experimentally from the temperature history of a simple flat copper disc sample. This HTC data was then used as input to the simulation software, and the cooling curves were back-calculated. The results obtained from the software and from the experiments show nearly consistent values.
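
    The study back-calculates cooling curves with a finite element model in ABAQUS, which is not reproducible in a short snippet; as a much-simplified illustration of the underlying energy balance, a lumped-capacitance estimate of the HTC from a measured cooling curve looks like this (Python; the sample properties and synthetic curve are invented):

        import numpy as np

        def htc_lumped(time_s, temp_c, t_quench_c, mass_kg, cp, area_m2):
            """Instantaneous heat transfer coefficient from a cooling curve,
            assuming a lumped (spatially uniform) sample temperature:
                h = -m * cp * dT/dt / (A * (T - T_quench))
            """
            dTdt = np.gradient(temp_c, time_s)
            return -mass_kg * cp * dTdt / (area_m2 * (temp_c - t_quench_c))

        # Synthetic cooling curve for a copper disc (true h = 1500 W/m^2K)
        m, cp, A, Tq = 0.05, 385.0, 2.0e-3, 30.0
        tau = m * cp / (1500.0 * A)
        t = np.linspace(0, 60, 200)
        T = Tq + (850.0 - Tq) * np.exp(-t / tau)
        print(htc_lumped(t, T, Tq, m, cp, A)[5])  # ~1500

    The lumped assumption only holds for small Biot numbers; that limitation is precisely why the study resorts to finite element simulation for a realistic sample.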

  8. Monitoring and Discovery for Self-Organized Network Management in Virtualized and Software Defined Networks.

    PubMed

    Caraguay, Ángel Leonardo Valdivieso; Villalba, Luis Javier García

    2017-03-31

    This paper presents the Monitoring and Discovery Framework of the Self-Organized Network Management in Virtualized and Software Defined Networks SELFNET project. This design takes into account the scalability and flexibility requirements needed by 5G infrastructures. In this context, the present framework focuses on gathering and storing the information (low-level metrics) related to physical and virtual devices, cloud environments, flow metrics, SDN traffic and sensors. Similarly, it provides the monitoring data as a generic information source in order to allow the correlation and aggregation tasks. Our design enables the collection and storing of information provided by all the underlying SELFNET sublayers, including the dynamically onboarded and instantiated SDN/NFV Apps, also known as SELFNET sensors.

  9. Data on assessment of groundwater quality with application of ArcGIS in Zanjan, Iran.

    PubMed

    Asghari, Farzaneh Baghal; Mohammadi, Ali Akbar; Dehghani, Mohammad Hadi; Yousefi, Mahmood

    2018-06-01

    The aim of this study was to monitor the physical and chemical characteristics of groundwater, including Ca2+, Mg2+, EC, pH, TDS, TH, HCO3-, Na+, K+, Cl-, SAR, %Na, and SO42-, in Zanjan city, Iran. To assess the physico-chemical parameters, water samples from 15 wells were collected and examined four times at different dates. Data were analyzed using R and ArcGIS software. According to the calculated correlation coefficients, the highest correlation coefficient belonged to the TDS-EC pair, while HCO3- and Cl- showed low, weak correlations. However, Na+, Mg2+, K+, and Ca2+ exhibited good positive correlations with EC and TDS. The results show that the water in the study area at the time of the study met the WHO standards and was appropriate for drinking.
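
    Neither the R scripts nor the data are included in this summary; a minimal sketch of the kind of correlation screening described, using pandas with invented water-quality values, might look like this:

        import pandas as pd

        # Hypothetical water-quality table; column names follow the abstract
        df = pd.DataFrame({
            "EC":  [820, 760, 910, 1005, 690],
            "TDS": [525, 480, 588, 652, 430],
            "Na":  [45, 39, 57, 66, 31],
            "Cl":  [38, 30, 49, 60, 25],
        })

        # Pearson correlation matrix; TDS-EC should show the strongest link
        print(df.corr(method="pearson").round(2))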

  10. Radiological assessment of breast density by visual classification (BI-RADS) compared to automated volumetric digital software (Quantra): implications for clinical practice.

    PubMed

    Regini, Elisa; Mariscotti, Giovanna; Durando, Manuela; Ghione, Gianluca; Luparia, Andrea; Campanino, Pier Paolo; Bianchi, Caterina Chiara; Bergamasco, Laura; Fonio, Paolo; Gandini, Giovanni

    2014-10-01

    This study was done to assess breast density on digital mammography and digital breast tomosynthesis according to the visual Breast Imaging Reporting and Data System (BI-RADS) classification, to compare visual assessment with the Quantra software for automated density measurement, and to establish the role of the software in clinical practice. We analysed 200 digital mammograms performed in 2D and 3D modality, 100 of which were positive for breast cancer and 100 negative. Radiological density was assessed with the BI-RADS classification; a Quantra density cut-off value was sought on the 2D images only, to discriminate between BI-RADS categories 1-2 and BI-RADS 3-4. Breast density was correlated with age, use of hormone therapy, and increased risk of disease. The agreement between the 2D and 3D assessments of BI-RADS density was high (K = 0.96). A cut-off value of 21% best discriminates between BI-RADS categories 1-2 and 3-4. Breast density was negatively correlated with age (r = -0.44) and positively with use of hormone therapy (p = 0.0004). Quantra density was higher in breasts with cancer than in healthy breasts. There is no clear difference between the visual assessments of density on 2D and 3D images. Use of the automated system requires the adoption of a cut-off value (set at 21%) to effectively discriminate BI-RADS 1-2 and 3-4, and could be useful in clinical practice.
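
    The paper reports the 21% cut-off directly; as an illustration of how such a threshold can be derived from labeled data, the sketch below applies Youden's J statistic to a ROC curve (Python/scikit-learn; the density values and labels are invented):

        import numpy as np
        from sklearn.metrics import roc_curve

        # Hypothetical data: Quantra density (%) and a binary label that is
        # 1 for visually dense breasts (BI-RADS 3-4), 0 otherwise
        density = np.array([8, 12, 15, 18, 19, 22, 24, 27, 30, 35], float)
        dense = np.array([0, 0, 0, 0, 1, 1, 0, 1, 1, 1])

        fpr, tpr, thr = roc_curve(dense, density)
        best = thr[np.argmax(tpr - fpr)]  # Youden's J statistic
        print(f"optimal cut-off ~ {best:.0f}%")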

  11. Reliable measurement of E. coli single cell fluorescence distribution using a standard microscope set-up.

    PubMed

    Cortesi, Marilisa; Bandiera, Lucia; Pasini, Alice; Bevilacqua, Alessandro; Gherardi, Alessandro; Furini, Simone; Giordano, Emanuele

    2017-01-01

    Quantifying gene expression at the single cell level is fundamental for the complete characterization of synthetic gene circuits, due to the significant impact of noise and inter-cellular variability on the system's functionality. Commercial set-ups that allow the acquisition of fluorescent signals at the single cell level (flow cytometers or quantitative microscopes) are expensive apparatuses that are hardly affordable by small laboratories. A protocol is presented that makes a standard optical microscope able to acquire quantitative, single cell fluorescence data from a bacterial population transformed with synthetic gene circuitry. Single cell fluorescence values, acquired with the microscope set-up and processed with custom-made software, are compared with results obtained with a flow cytometer in a bacterial population transformed with the same gene circuitry. The high correlation between data from the two experimental set-ups, with a correlation coefficient computed over the tested dynamic range > 0.99, proves that a standard optical microscope, when coupled with appropriate software for image processing, can be used for quantitative single cell fluorescence measurements. The calibration of the set-up, together with its validation, is described. The experimental protocol described in this paper makes quantitative measurement of single cell fluorescence accessible to laboratories equipped with standard optical microscope set-ups. Our method allows for affordable measurement and quantification of inter-cellular variability; a better understanding of this phenomenon will improve our comprehension of cellular behaviors and the design of synthetic gene circuits. All the required software (MUSIQ: Microscope flUorescence SIngle cell Quantification) is freely available to the synthetic biology community.

  12. A simple method of measuring tibial tubercle to trochlear groove distance on MRI: description of a novel and reliable technique.

    PubMed

    Camp, Christopher L; Heidenreich, Mark J; Dahm, Diane L; Bond, Jeffrey R; Collins, Mark S; Krych, Aaron J

    2016-03-01

    Tibial tubercle-trochlear groove (TT-TG) distance is a variable that helps guide surgical decision-making in patients with patellar instability. The purpose of this study was to compare the accuracy and reliability of an MRI TT-TG measuring technique using a simple external alignment method with a previously validated gold standard technique that requires advanced software read by radiologists. TT-TG was calculated by MRI on 59 knees with a clinical diagnosis of patellar instability in a blinded and randomized fashion by two musculoskeletal radiologists using advanced software and by two orthopaedists using the study technique, which relies on measurements taken on a simple electronic imaging platform. Interrater reliability between the two radiologists and the two orthopaedists, and intermethod reliability between the two techniques, were calculated using intraclass correlation coefficients (ICC) and concordance correlation coefficients (CCC). ICC and CCC values greater than 0.75 were considered to represent excellent agreement. The mean TT-TG distance was 14.7 mm (standard deviation [SD] 4.87 mm) and 15.4 mm (SD 5.41 mm) as measured by the radiologists and orthopaedists, respectively. Excellent interobserver agreement was noted between the radiologists (ICC 0.941; CCC 0.941), the orthopaedists (ICC 0.978; CCC 0.976), and the two techniques (ICC 0.941; CCC 0.933). The simple TT-TG distance measurement technique analysed in this study showed excellent agreement and reliability compared with the gold standard technique. This method can predictably be performed by orthopaedic surgeons without advanced radiologic software. Level of evidence: II.
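
    Neither software platform is reproduced here; as a generic illustration of one agreement statistic used above, Lin's concordance correlation coefficient can be computed directly from its definition (Python sketch; the paired measurements are invented):

        import numpy as np

        def concordance_ccc(x, y):
            """Lin's concordance correlation coefficient between two raters:
            CCC = 2*cov(x, y) / (var(x) + var(y) + (mean(x) - mean(y))^2)."""
            x, y = np.asarray(x, float), np.asarray(y, float)
            cov = np.mean((x - x.mean()) * (y - y.mean()))
            return 2 * cov / (x.var() + y.var() + (x.mean() - y.mean()) ** 2)

        # Hypothetical TT-TG distances (mm) from two observers
        rad = [12.1, 14.3, 16.8, 13.0, 19.5, 15.2]
        orth = [12.6, 14.9, 17.1, 13.8, 20.2, 15.9]
        print(round(concordance_ccc(rad, orth), 3))

    Unlike Pearson's r, the CCC penalizes systematic offsets between raters, which is why agreement studies often report it alongside the ICC.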

  13. Using image J to document healing in ulcers of the foot in diabetes.

    PubMed

    Jeffcoate, William J; Musgrove, Alison J; Lincoln, Nadina B

    2017-12-01

    The aim of the study was to assess the reliability of measuring the cross-sectional area of diabetic foot ulcers using Image J software. The inter- and intra-rater reliability of ulcer area measures were assessed using digital images of acetate tracings of foot ulcers affecting 31 participants in an off-loading randomised trial. Observations were made independently by five specialist podiatrists, one of whom was experienced in the use of Image J software and educated the other four in a single session. The mean (±SD) of the mean cross-sectional areas of the 31 ulcers determined independently by the five observers was 1386·7 (±22·7) mm². The correlation between all pairs of observers was >0·99 (P < 0·001). There was no significant difference overall between the five observers (ANOVA F = 1·538; P = 0·165) and no difference between any two (paired samples test t = −0·787 to 1·396; P ≥ 0·088). The correlation between the areas determined by two observers on two occasions separated by not less than 1 week was very high (0·997 and 0·999; both P < 0·001). The inter- and intra-rater reliability of the Image J software is very high, with no evidence of a difference either between or within observers. This technique should be considered for both research and clinical use in order to document changes in ulcer area.

  14. FlowerMorphology: fully automatic flower morphometry software.

    PubMed

    Rozov, Sergey M; Deineko, Elena V; Deyneko, Igor V

    2018-05-01

    The software FlowerMorphology is designed for automatic morphometry of actinomorphic flowers. The novel complex parameters of flowers calculated by FlowerMorphology allowed us to quantitatively characterize a polyploid series of tobacco. Morphological differences between plants representing closely related lineages or mutants are mostly quantitative, and often there are only very fine variations in plant morphology. Therefore, accurate and high-throughput methods are needed for their quantification. In addition, new characteristics are necessary for reliable detection of subtle changes in morphology. FlowerMorphology is an all-in-one software package to automatically image and analyze five-petal actinomorphic flowers of dicotyledonous plants. Sixteen directly measured parameters and ten calculated complex parameters of a flower allow variations to be characterized with high accuracy. The program was developed for the automatic characterization of Nicotiana tabacum flowers, but it is applicable to many other plants with five-petal actinomorphic flowers and can be adapted to flowers of other merosity. A genetically similar polyploid series of N. tabacum plants was used to investigate differences in flower morphology. For the first time, we could quantify the dependence between ploidy and the size and form of the tobacco flowers. We found that the radius of inner petal incisions shows a persistent positive correlation with the chromosome number. In contrast, a commonly used parameter, the radius of the outer corolla, does not discriminate 2n and 4n plants. Other parameters show that polyploidy leads to significant aberrations in flower symmetry that are also positively correlated with chromosome number. Executables of FlowerMorphology, source code, documentation, and examples are available at the program website: https://github.com/Deyneko/FlowerMorphology .

  15. Knickpoint finder: A software tool that improves neotectonic analysis

    NASA Astrophysics Data System (ADS)

    Queiroz, G. L.; Salamuni, E.; Nascimento, E. R.

    2015-03-01

    This work presents a new software tool for morphometric analysis of drainage networks based on the methods of Hack (1973) and Etchebehere et al. (2004). This tool is applicable to studies of morphotectonics and neotectonics. The software uses a digital elevation model (DEM) to identify the relief breakpoints (knickpoints) along drainage profiles. The program was coded in Python for use on the ArcGIS platform and is called Knickpoint Finder. A study area was selected to test and evaluate the software's ability to analyze and identify neotectonic morphostructures based on the morphology of the terrain. To assess its validity, we chose an area of the James River basin, which covers most of the Piedmont area of Virginia (USA), an area of constant intraplate seismicity and non-orogenic active tectonics that exhibits a relatively homogeneous geodesic surface currently being altered by the seismogenic features of the region. After using the tool in the chosen area, we found that the knickpoint locations are associated with geologic structures, epicenters of recent earthquakes, and drainages with rectilinear anomalies. The regional analysis demanded a spatial representation of the data after processing with Knickpoint Finder. The results were satisfactory in terms of the correlation of knickpoint-dense areas with active lineaments and the rapidity of the identification of deformed areas. Therefore, this software tool may be considered useful in neotectonic analyses of large areas and may be applied to any area with DEM coverage.
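
    Knickpoint Finder itself runs inside ArcGIS and follows Hack's method, whose details this summary does not spell out; the sketch below is only a generic slope-break detector on a synthetic longitudinal profile (plain Python; the window size and threshold are invented parameters):

        import numpy as np

        def find_knickpoints(distance_m, elevation_m, window=5, thresh=2.0):
            """Flag relief breakpoints along a longitudinal stream profile as
            locations where the local channel slope changes abruptly."""
            slope = np.gradient(elevation_m, distance_m)
            # Smooth the slope, then look for large jumps between neighbors
            kernel = np.ones(window) / window
            smooth = np.convolve(slope, kernel, mode="same")
            jump = np.abs(np.diff(smooth))
            return np.where(jump > thresh * jump.std())[0]

        # Synthetic profile with one sharp slope break at 5 km
        d = np.linspace(0, 10_000, 500)
        z = np.where(d < 5000, 800 - 0.02 * d, 700 - 0.004 * (d - 5000))
        print(d[find_knickpoints(d, z)])  # locations cluster near 5000 m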

  16. Reliability and Validity of the Footprint Assessment Method Using Photoshop CS5 Software in Young People with Down Syndrome.

    PubMed

    Gutiérrez-Vilahú, Lourdes; Massó-Ortigosa, Núria; Rey-Abella, Ferran; Costa-Tutusaus, Lluís; Guerra-Balic, Myriam

    2016-05-01

    People with Down syndrome present skeletal abnormalities in their feet that can be analyzed by commonly used gold standard indices (the Hernández-Corvo index, the Chippaux-Smirak index, the Staheli arch index, and the Clarke angle) based on footprint measurements. The use of Photoshop CS5 software (Adobe Systems Software Ireland Ltd, Dublin, Ireland) to measure footprints has been validated in the general population. The present study aimed to assess the reliability and validity of this footprint assessment technique in the population with Down syndrome. Using optical podography and photography, 44 footprints from 22 patients with Down syndrome (11 men [mean ± SD age, 23.82 ± 3.12 years] and 11 women [mean ± SD age, 24.82 ± 6.81 years]) were recorded in a static bipedal standing position. A blinded observer performed the measurements using a validated manual method three times during the 4-month study, with 2 months between measurements. Test-retest was used to check the reliability of the Photoshop CS5 software measurements. Validity and reliability were obtained by intraclass correlation coefficient (ICC). The reliability test for all of the indices showed very good values for the Photoshop CS5 method (ICC, 0.982-0.995). Validity testing also found no differences between the techniques (ICC, 0.988-0.999). The Photoshop CS5 software method is reliable and valid for the study of footprints in young people with Down syndrome.

  17. Comparison of two high-resolution manometry software systems in evaluating esophageal motor function.

    PubMed

    Rengarajan, A; Drapekin, J; Patel, A; Gyawali, C P

    2016-12-01

    High-resolution manometry (HRM) utilizes software tools to diagnose esophageal motor disorders. The performance of these software metrics could be affected by averaging and by software characteristics of different manufacturers. High-resolution manometry studies on 86 patients referred for antireflux surgery (61.6 ± 1.4 years, 70% F) and 20 healthy controls (27.9 ± 0.7 years, 45% F) were first subjected to standard analysis (Medtronic, Duluth, GA, USA). Coordinates for each of 10 test swallows were exported and averaged to generate a composite swallow. The swallows and averaged composites were imported in ASCII file format into Manoview (Medtronic) and the Medical Measurement Systems database reporter (MMS, Dover, NH, USA), and the analyses repeated. Comparisons were made between standard and composite swallow interpretations. Correlation between the two systems was high for mean distal contractile integral (DCI, r² ≥ 0.9) but lower for integrated relaxation pressure (IRP, r² = 0.7). Excluding achalasia, six patients with outflow obstruction (mean IRP 23.2 ± 2.1 with 10-swallow average) were identified by both systems. An additional nine patients (10.5%) were identified as having outflow obstruction (15 mmHg threshold) with MMS 10-swallow evaluation and four with MMS composite swallow evaluation; only one was confirmed. Ineffective esophageal motility was diagnosed by 10-swallow evaluation in 19 patients (22.1%) with Manoview and 20 (23.3%) with MMS. On Manoview composite analysis, 17 had DCI <450 mmHg/cm/s, and on MMS composite analysis, 21 (p ≥ 0.85 for each comparison), but these did not impact diagnostic conclusions. Comparison of 10-swallow and composite swallows demonstrates variability in software metrics between manometry systems. Our data support the use of manufacturer-specific software metrics on 10-swallow sequences.

  18. Platform-Independent Cirrus and Spectralis Thickness Measurements in Eyes with Diabetic Macular Edema Using Fully Automated Software

    PubMed Central

    Willoughby, Alex S.; Chiu, Stephanie J.; Silverman, Rachel K.; Farsiu, Sina; Bailey, Clare; Wiley, Henry E.; Ferris, Frederick L.; Jaffe, Glenn J.

    2017-01-01

    Purpose We determine whether the automated segmentation software, Duke Optical Coherence Tomography Retinal Analysis Program (DOCTRAP), can measure, in a platform-independent manner, retinal thickness on Cirrus and Spectralis spectral domain optical coherence tomography (SD-OCT) images in eyes with diabetic macular edema (DME) under treatment in a clinical trial. Methods Automatic segmentation software was used to segment the internal limiting membrane (ILM), inner retinal pigment epithelium (RPE), and Bruch's membrane (BM) in SD-OCT images acquired by Cirrus and Spectralis commercial systems, from the same eye, on the same day during a clinical interventional DME trial. Mean retinal thickness differences were compared across commercial and DOCTRAP platforms using intraclass correlation (ICC) and Bland-Altman plots. Results The mean 1 mm central subfield thickness difference (standard error [SE]) comparing segmentation of Spectralis images with DOCTRAP versus HEYEX was 0.7 (0.3) μm (0.2 pixels). The corresponding values comparing segmentation of Cirrus images with DOCTRAP versus Cirrus software was 2.2 (0.7) μm. The mean 1 mm central subfield thickness difference (SE) comparing segmentation of Cirrus and Spectralis scan pairs with DOCTRAP using BM as the outer retinal boundary was −2.3 (0.9) μm compared to 2.8 (0.9) μm with inner RPE as the outer boundary. Conclusions DOCTRAP segmentation of Cirrus and Spectralis images produces validated thickness measurements that are very similar to each other, and very similar to the values generated by the corresponding commercial software in eyes with treated DME. Translational Relevance This software enables automatic total retinal thickness measurements across two OCT platforms, a process that is impractical to perform manually. PMID:28180033

  19. Upgrading Custom Simulink Library Components for Use in Newer Versions of Matlab

    NASA Technical Reports Server (NTRS)

    Stewart, Camiren L.

    2014-01-01

    The Spaceport Command and Control System (SCCS) at Kennedy Space Center (KSC) is a control system for monitoring and launching manned launch vehicles. Simulations of ground support equipment (GSE) and the launch vehicle systems are required throughout the life cycle of SCCS to test software, hardware, and procedures and to train the launch team. The simulations of the GSE at the launch site, in conjunction with off-line processing locations, are developed using Simulink, a piece of commercial off-the-shelf (COTS) software. The simulations are then converted into code and run in a simulation engine called Trick, a government off-the-shelf (GOTS) piece of software developed by NASA. In the world of hardware and software, it is not uncommon for products to be upgraded and patched or to eventually fade into obsolescence. In the case of SCCS simulation software, MathWorks has released a number of stable versions of Simulink since the software was deployed on the Development Work Stations in the Linux environment (DWLs). The upgraded versions of Simulink have introduced a number of new tools and resources that, if utilized fully and correctly, will save time and resources during the overall development of the GSE simulations and their accompanying documentation. Unfortunately, simply importing the already-built simulations into the new Matlab environment will not suffice, as it may produce results that differ from those of the currently used version. Thus, an upgrade execution plan was developed and executed to fully upgrade the simulation environment to one of the latest versions of Matlab.

  20. Systematic on-site monitoring of compliance dust samples

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Grayson, R.L.; Gandy, J.R.

    1996-12-31

    Maintaining compliance with U.S. respirable coal mine dust standards can be difficult on high-productivity longwall panels. Comprehensive and systematic analysis of compliance dust sample data, coupled with access to the U.S. Bureau of Mines (USBM) DUSTPRO, can yield important information for use in maintaining compliance. The objective of this study was to develop and apply customized software for the collection, storage, modification, and analysis of respirable dust data while providing for flexible export of data and linking with the USBM's expert advisory system on dust control. An executable, IBM-compatible software package was created and customized for use by the person in charge of collecting, submitting, analyzing, and monitoring respirable dust compliance samples. Both descriptive statistics and multiple regression analysis were incorporated. The software allows ASCII files to be exported and links directly with DUSTPRO. After development and validation of the software, longwall compliance data from two different mines were analyzed to evaluate the value of the software. Data included variables on respirable dust concentration, tons produced, the existence of roof/floor rock (dummy variable), and the sampling cycle (dummy variables). Because of confidentiality, specific data are not presented, only the equations and ANOVA tables. The final regression models explained 83.8% and 61.1% of the variation in the data for the two panels. Important correlations among variables within sampling cycles showed the value of using dummy variables for sampling cycles. The software proved flexible and fast for its intended use. The insights obtained from its use improved the systematic monitoring of respirable dust compliance data, especially for pinpointing the most effective dust control methods during specific sampling cycles.
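
    The study's actual equations are withheld for confidentiality; the sketch below only illustrates the modeling pattern described, an OLS regression of dust concentration on tonnage with dummy variables for rock presence and sampling cycle (Python/statsmodels; all data are synthetic):

        import numpy as np
        import pandas as pd
        import statsmodels.formula.api as smf

        # Hypothetical compliance-sample data in the spirit of the study
        rng = np.random.default_rng(3)
        n = 60
        df = pd.DataFrame({
            "tons": rng.uniform(5000, 12000, n),
            "rock": rng.integers(0, 2, n),       # roof/floor rock dummy
            "cycle": rng.integers(1, 4, n),      # bimonthly sampling cycle
        })
        df["dust"] = (0.4 + 8e-5 * df.tons + 0.3 * df.rock
                      + 0.1 * df.cycle + rng.normal(0, 0.1, n))

        # C(cycle) expands the sampling cycle into dummy variables
        model = smf.ols("dust ~ tons + rock + C(cycle)", data=df).fit()
        print(model.rsquared, model.params, sep="\n")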

  1. Design, Fabrication, and Testing of a Hopper Spacecraft Simulator

    NASA Astrophysics Data System (ADS)

    Mucasey, Evan Phillip Krell

    A robust test bed is needed to facilitate future development of guidance, navigation, and control software for vehicles capable of vertical takeoff and landing. Specifically, this work aims to develop both a hardware and a software simulator that can be used for future flight software development for extra-planetary vehicles. To achieve the program requirements of a high thrust-to-weight ratio with large payload capability, the vehicle is designed with a novel combination of electric motors and a micro jet engine acting as the propulsion elements. The spacecraft simulator underwent several iterations of hardware development using different materials and fabrication methods. The final design used a combination of carbon fiber and fiberglass cured under vacuum to serve as the frame of the vehicle, providing a strong, lightweight platform for all flight components and future payloads. The vehicle also uses an open source software development platform, Arduino, as the initial flight computer, and has onboard accelerometers, gyroscopes, and magnetometers to sense the vehicle's attitude. To prevent instability due to noise, a polynomial Kalman filter was designed; it feeds the sensed angles and rates into a robust attitude controller that autonomously controls the vehicle's yaw, pitch, and roll angles. In addition to the hardware development of the vehicle itself, both a software simulation and a real-time data acquisition interface were written in MATLAB/SIMULINK so that real flight data could be taken and then correlated with the simulation to prove the accuracy of the analytical model. As a result, the full scale vehicle was designed and flown outside of the lab environment, and the data showed that the software model accurately predicted the flight dynamics of the vehicle.
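
    The thesis's polynomial Kalman filter is not specified in this summary; purely as an illustration of the noise-filtering step it describes, here is a minimal linear Kalman filter tracking a single attitude angle and its rate from noisy measurements (Python sketch; the process and measurement noise values are invented):

        import numpy as np

        def kalman_attitude(z, dt, q=1e-4, r=0.05):
            """Minimal linear Kalman filter tracking one attitude angle and
            its rate from noisy angle measurements z (illustrative only)."""
            F = np.array([[1.0, dt], [0.0, 1.0]])   # constant-rate model
            H = np.array([[1.0, 0.0]])              # only the angle is measured
            Q = q * np.eye(2)
            x, P = np.zeros(2), np.eye(2)
            out = []
            for zk in z:
                x = F @ x                           # predict
                P = F @ P @ F.T + Q
                S = H @ P @ H.T + r                 # innovation covariance
                K = (P @ H.T) / S                   # 2x1 Kalman gain
                x = x + (K * (zk - H @ x)).ravel()  # update
                P = (np.eye(2) - K @ H) @ P
                out.append(x[0])
            return np.array(out)

        # Noisy pitch measurements around a slow ramp
        t = np.arange(0, 5, 0.01)
        meas = 2.0 * t + np.random.normal(0, 0.5, t.size)
        est = kalman_attitude(meas, 0.01)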

  2. Shade matching assisted by digital photography and computer software.

    PubMed

    Schropp, Lars

    2009-04-01

    To evaluate the efficacy of digital photographs and graphic computer software for color matching compared to conventional visual matching. The shade of a tab from a shade guide (Vita 3D-Master Guide) placed in a phantom head was matched to a second guide of the same type by nine observers. This was done for twelve selected shade tabs (tests). The shade-matching procedure was performed visually in a simulated clinic environment and with digital photographs, and the time spent on both procedures was recorded. An alternative arrangement of the shade tabs was used in the digital photographs. In addition, a graphic software program was used for color analysis. Hue, chroma, and lightness values of the test tab and of all tabs of the second guide were derived from the digital photographs. According to the CIE L*C*h* color system, the color differences between the test tab and the tabs of the second guide were calculated. The shade guide tab that deviated least from the test tab was determined to be the match. Shade matching performance by means of the graphic software was compared with the two visual methods and tested by Chi-square tests (alpha = 0.05). Eight of twelve test tabs (67%) were matched correctly by the computer software method. This was significantly better (p < 0.02) than the performance of the visual shade matching methods conducted in the simulated clinic (32% correct) and with photographs (28% correct). No correlation between time consumption for the visual shade matching methods and frequency of correct matches was observed. Shade matching assisted by digital photographs and computer software was significantly more reliable than conventional visual methods.
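
    The abstract names the CIE L*C*h* system but not the software's exact metric; a common choice, sketched below, is to convert L*C*h* to L*a*b* and take the Euclidean distance ΔE*ab, picking the guide tab with the smallest difference (Python; the tab values and names are invented):

        import numpy as np

        def delta_e_lch(lch1, lch2):
            """Color difference between two CIE L*C*h* triplets (h in degrees),
            computed by converting to L*a*b* and taking Euclidean distance."""
            def to_lab(lch):
                L, C, h = lch
                return np.array([L, C * np.cos(np.radians(h)),
                                    C * np.sin(np.radians(h))])
            return np.linalg.norm(to_lab(lch1) - to_lab(lch2))

        # Hypothetical test tab vs. two candidate guide tabs
        test = (72.0, 18.0, 85.0)
        candidates = {"3M2": (71.5, 17.6, 84.0), "3R1.5": (69.0, 21.0, 78.0)}
        match = min(candidates, key=lambda k: delta_e_lch(test, candidates[k]))
        print(match)  # the tab with the smallest color difference wins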

  3. Plateletpheresis efficiency and mathematical correction of software-derived platelet yield prediction: A linear regression and ROC modeling approach.

    PubMed

    Jaime-Pérez, José Carlos; Jiménez-Castillo, Raúl Alberto; Vázquez-Hernández, Karina Elizabeth; Salazar-Riojas, Rosario; Méndez-Ramírez, Nereida; Gómez-Almaguer, David

    2017-10-01

    Advances in automated cell separators have improved the efficiency of plateletpheresis and the possibility of obtaining double products (DP). We assessed cell processor accuracy of predicted platelet (PLT) yields with the goal of a better prediction of DP collections. This retrospective proof-of-concept study included 302 plateletpheresis procedures performed on a Trima Accel v6.0 at the apheresis unit of a hematology department. Donor variables, software-predicted yield, and actual PLT yield were statistically evaluated. Software prediction was optimized by linear regression analysis, and its optimal cut-off to obtain a DP was assessed by receiver operating characteristic (ROC) curve modeling. Three hundred and two plateletpheresis procedures were performed; on 271 (89.7%) occasions donors were men and on 31 (10.3%) women. Pre-donation PLT count had the best direct correlation with actual PLT yield (r = 0.486, P < .001). Mean software-derived values differed significantly from actual PLT yields, 4.72 × 10^11 vs. 6.12 × 10^11, respectively (P < .001). The following equation was developed to adjust these values: actual PLT yield = 0.221 + (1.254 × theoretical platelet yield). The ROC curve model showed an optimal apheresis device software prediction cut-off of 4.65 × 10^11 to obtain a DP, with a sensitivity of 82.2%, specificity of 93.3%, and an area under the curve (AUC) of 0.909. Trima Accel v6.0 software consistently underestimated PLT yields. A simple correction derived from linear regression analysis accurately corrected this underestimation, and ROC analysis identified a precise cut-off to reliably predict a DP. © 2016 Wiley Periodicals, Inc.
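    The reported correction and cut-off are simple enough to apply directly. A short sketch using the numbers quoted above, with yields in units of 10^11 platelets; the function names are illustrative.

```python
def corrected_yield(predicted):
    """Adjust the software-predicted platelet yield (units of 1e11)
    with the linear correction reported in the study."""
    return 0.221 + 1.254 * predicted

def predicts_double_product(predicted, cutoff=4.65):
    """Apply the ROC-derived cut-off on the software prediction
    (sensitivity 82.2%, specificity 93.3% in the study)."""
    return predicted >= cutoff

for p in (4.2, 4.65, 5.1):
    print(p, round(corrected_yield(p), 2), predicts_double_product(p))
```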

  4. Semi-Automatic Segmentation Software for Quantitative Clinical Brain Glioblastoma Evaluation

    PubMed Central

    Zhu, Y; Young, G; Xue, Z; Huang, R; You, H; Setayesh, K; Hatabu, H; Cao, F; Wong, S.T.

    2012-01-01

    Rationale and Objectives: Quantitative measurement provides essential information about disease progression and treatment response in patients with Glioblastoma multiforme (GBM). The goal of this paper is to present and validate a software pipeline for semi-automatic GBM segmentation, called AFINITI (Assisted Follow-up in NeuroImaging of Therapeutic Intervention), using clinical data from GBM patients. Materials and Methods: Our software adopts the current state-of-the-art tumor segmentation algorithms and combines them into one clinically usable pipeline. Both the advantages of the traditional voxel-based and the deformable shape-based segmentation are embedded into the software pipeline. The former provides an automatic tumor segmentation scheme based on T1- and T2-weighted MR brain data, and the latter refines the segmentation results with minimal manual input. Results: Twenty-six clinical MR brain images of GBM patients were processed and compared with manual results. The results can be visualized using the embedded graphic user interface (GUI). Conclusion: Validation results using clinical GBM data showed high correlation between the AFINITI results and manual annotation. Compared to voxel-wise segmentation, AFINITI yielded more accurate results in segmenting the enhanced GBM from multimodality MRI data. The proposed pipeline could be used as additional information to interpret MR brain images in neuroradiology. PMID:22591720

  5. From proteomics to systems biology: MAPA, MASS WESTERN, PROMEX, and COVAIN as a user-oriented platform.

    PubMed

    Weckwerth, Wolfram; Wienkoop, Stefanie; Hoehenwarter, Wolfgang; Egelhofer, Volker; Sun, Xiaoliang

    2014-01-01

    Genome sequencing and systems biology are revolutionizing life sciences. Proteomics emerged as a fundamental technique of this novel research area as it is the basis for gene function analysis and modeling of dynamic protein networks. Here a complete proteomics platform suited for functional genomics and systems biology is presented. The strategy includes MAPA (mass accuracy precursor alignment; http://www.univie.ac.at/mosys/software.html ) as a rapid exploratory analysis step; MASS WESTERN for targeted proteomics; COVAIN ( http://www.univie.ac.at/mosys/software.html ) for multivariate statistical analysis, data integration, and data mining; and PROMEX ( http://www.univie.ac.at/mosys/databases.html ) as a database module for proteogenomics and proteotypic peptides for targeted analysis. Moreover, the presented platform can also be utilized to integrate metabolomics and transcriptomics data for the analysis of metabolite-protein-transcript correlations and time course analysis using COVAIN. Examples for the integration of MAPA and MASS WESTERN data, proteogenomic and metabolic modeling approaches for functional genomics, phosphoproteomics by integration of MOAC (metal-oxide affinity chromatography) with MAPA, and the integration of metabolomics, transcriptomics, proteomics, and physiological data using this platform are presented. All software and step-by-step tutorials for data processing and data mining can be downloaded from http://www.univie.ac.at/mosys/software.html.

  6. PIRATE: pediatric imaging response assessment and targeting environment

    NASA Astrophysics Data System (ADS)

    Glenn, Russell; Zhang, Yong; Krasin, Matthew; Hua, Chiaho

    2010-02-01

    By combining the strengths of various imaging modalities, the multimodality imaging approach has potential to improve tumor staging, delineation of tumor boundaries, chemo-radiotherapy regimen design, and treatment response assessment in cancer management. To address the urgent need for efficient tools to analyze large-scale clinical trial data, we have developed an integrated multimodality, functional and anatomical imaging analysis software package for target definition and therapy response assessment in pediatric radiotherapy (RT) patients. Our software provides quantitative tools for automated image segmentation, region-of-interest (ROI) histogram analysis, spatial volume-of-interest (VOI) analysis, and voxel-wise correlation across modalities. To demonstrate the clinical applicability of this software, histogram analyses were performed on baseline and follow-up 18F-fluorodeoxyglucose (18F-FDG) PET images of nine patients with rhabdomyosarcoma enrolled in an institutional clinical trial at St. Jude Children's Research Hospital. In addition, we combined 18F-FDG PET, dynamic-contrast-enhanced (DCE) MR, and anatomical MR data to visualize the heterogeneity in tumor pathophysiology with the ultimate goal of adaptive targeting of regions with high tumor burden. Our software is able to simultaneously analyze multimodality images across multiple time points, which could greatly speed up the analysis of large-scale clinical trial data and validation of potential imaging biomarkers.

  7. Validation of the Simbionix PROcedure Rehearsal Studio sizing module: A comparison of software for endovascular aneurysm repair sizing and planning.

    PubMed

    Velu, Juliëtte F; Groot Jebbink, Erik; de Vries, Jean-Paul P M; Slump, Cornelis H; Geelkerken, Robert H

    2017-02-01

    An important determinant of successful endovascular aortic aneurysm repair is proper sizing of the dimensions of the aortic-iliac vessels. The goal of the present study was to determine the concurrent validity, a method for comparison of test scores, for EVAR sizing and planning of the recently introduced Simbionix PROcedure Rehearsal Studio (PRORS). Seven vascular specialists analyzed anonymized computed tomography angiography scans of 70 patients with an infrarenal aneurysm of the abdominal aorta, using three different sizing software packages: Simbionix PRORS (Simbionix USA Corp., Cleveland, OH, USA), 3mensio (Pie Medical Imaging BV, Maastricht, The Netherlands), and TeraRecon (Aquarius, Foster City, CA, USA). The following measurements were included in the protocol: diameter 1 mm below the most distal main renal artery, diameter 15 mm below the lowest renal artery, maximum aneurysm diameter, and length from the most distal renal artery to the left iliac artery bifurcation. Averaged over the locations, the intraclass correlation coefficient was 0.83 for Simbionix versus 3mensio, 0.81 for Simbionix versus TeraRecon, and 0.86 for 3mensio versus TeraRecon. It can be concluded that the Simbionix sizing software is as precise as the two other validated and commercially available software packages.

  8. Semi-automated and automated glioma grading using dynamic susceptibility-weighted contrast-enhanced perfusion MRI relative cerebral blood volume measurements.

    PubMed

    Friedman, S N; Bambrough, P J; Kotsarini, C; Khandanpour, N; Hoggard, N

    2012-12-01

    Despite the established role of MRI in the diagnosis of brain tumours, histopathological assessment remains the clinically used technique, especially for the glioma group. Relative cerebral blood volume (rCBV) is a dynamic susceptibility-weighted contrast-enhanced perfusion MRI parameter that has been shown to correlate with tumour grade, but assessment requires a specialist and is time-consuming. We developed analysis software to determine glioma gradings from perfusion rCBV scans in a manner that is quick, easy and does not require a specialist operator. MRI perfusion data from 47 patients with different histopathological grades of glioma were analysed with custom-designed software. Semi-automated analysis was performed with a specialist and a non-specialist operator separately determining the maximum rCBV value corresponding to the tumour. Automated histogram analysis was performed by calculating the mean, standard deviation, median, mode, skewness and kurtosis of rCBV values. All values were compared with the histopathologically assessed tumour grade. A strong correlation between specialist and non-specialist observer measurements was found. Significantly different values were obtained between tumour grades using both semi-automated and automated techniques, consistent with previous results. The semi-automated single-pixel maximum rCBV value from the raw (unnormalised) data had the strongest correlation with glioma grade. The standard deviation of the raw data had the strongest correlation among the automated measures. Semi-automated calculation of the raw maximum rCBV value was the best indicator of tumour grade and does not require a specialist operator. Both semi-automated and automated MRI perfusion techniques provide viable non-invasive alternatives to biopsy for glioma tumour grading.
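    The automated branch of the analysis reduces to computing summary statistics over the voxel values of an rCBV map. A minimal sketch of such a feature extractor, assuming the tumour-ROI voxels have already been extracted into a 1-D array; the binned mode and the function name are illustrative choices, not the authors' implementation.

```python
import numpy as np
from scipy import stats

def rcbv_features(rcbv):
    """Histogram-style summary statistics of tumour-ROI rCBV values."""
    rcbv = np.asarray(rcbv, dtype=float)
    counts, edges = np.histogram(rcbv, bins=50)
    k = np.argmax(counts)
    return {
        "mean": rcbv.mean(),
        "std": rcbv.std(ddof=1),                 # strongest automated correlate above
        "median": np.median(rcbv),
        "mode": 0.5 * (edges[k] + edges[k + 1]), # centre of the fullest bin
        "skewness": stats.skew(rcbv),
        "kurtosis": stats.kurtosis(rcbv),
        "max": rcbv.max(),                       # semi-automated single-pixel maximum
    }
```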

  9. Somatostatin receptor immunohistochemistry in neuroendocrine tumors: comparison between manual and automated evaluation

    PubMed Central

    Daniel, Kaemmerer; Maria, Athelogou; Amelie, Lupp; Isabell, Lenhardt; Stefan, Schulz; Luisa, Peter; Merten, Hommann; Vikas, Prasad; Gerd, Binnig; Paul, Baum Richard

    2014-01-01

    Background: Manual evaluation of somatostatin receptor (SSTR) immunohistochemistry (IHC) is a time-consuming and cost-intensive procedure. The aim of the study was to compare manual evaluation of SSTR subtype IHC to an automated software-based analysis, and to in-vivo imaging by SSTR-based PET/CT. Methods: We examined 25 gastroenteropancreatic neuroendocrine tumor (GEP-NET) patients and correlated their in-vivo SSTR-PET/CT data (determined by the standardized uptake values SUVmax and SUVmean) with the corresponding ex-vivo IHC data of SSTR subtype (1, 2A, 4, 5) expression. Exactly the same lesions were imaged by PET/CT, resected, and analyzed by IHC in each patient. After manual evaluation, the IHC slides were digitized and automatically evaluated for SSTR expression by Definiens XD software. A virtual IHC score “BB1” was created for comparing the manual and automated analysis of SSTR expression. Results: BB1 showed a significant correlation with the corresponding conventionally determined Her2/neu score of the SSTR subtypes 2A (rs: 0.57), 4 (rs: 0.44) and 5 (rs: 0.43). BB1 of SSTR2A also significantly correlated with the SUVmax (rs: 0.41) and the SUVmean (rs: 0.50). Likewise, a significant correlation was seen between the conventionally evaluated SSTR2A status and the SUVmax (rs: 0.42) and SUVmean (rs: 0.62). Conclusion: Our data demonstrate that the evaluation of the SSTR status by automated analysis (BB1 score), using digitized histopathology slides (“virtual microscopy”), corresponds well with the SSTR2A, 4 and 5 expression as determined by conventional manual histopathology. The BB1 score also exhibited a significant association with the SSTR-PET/CT data, in accordance with the high-affinity profile of the SSTR analogues used for imaging. PMID:25197368

  10. Clinical utility of automated assessment of left ventricular ejection fraction using artificial intelligence-assisted border detection.

    PubMed

    Rahmouni, Hind W; Ky, Bonnie; Plappert, Ted; Duffy, Kevin; Wiegers, Susan E; Ferrari, Victor A; Keane, Martin G; Kirkpatrick, James N; Silvestry, Frank E; St John Sutton, Martin

    2008-03-01

    Ejection fraction (EF) calculated from 2-dimensional echocardiography provides important prognostic and therapeutic information in patients with heart disease. However, quantification of EF requires planimetry and is time-consuming. As a result, visual assessment is frequently used but is subjective and requires extensive experience. New computer software to assess EF automatically is now available and could be used routinely in busy digital laboratories (>15,000 studies per year) and in core laboratories running large clinical trials. We tested Siemens AutoEF software (Siemens Medical Solutions, Erlangen, Germany) to determine whether it correlated with visual estimates of EF, manual planimetry, and cardiac magnetic resonance (CMR). Siemens AutoEF is based on learned patterns and artificial intelligence. An expert and a novice reader assessed EF visually by reviewing transthoracic echocardiograms from consecutive patients. An experienced sonographer quantified EF in all studies using Simpson's method of disks. AutoEF results were compared to CMR. Ninety-two echocardiograms were analyzed. Visual assessment by the expert (R = 0.86) and the novice reader (R = 0.80) correlated more closely with manual planimetry using Simpson's method than did AutoEF (R = 0.64). The correlation between AutoEF and CMR was 0.63, 0.28, and 0.51 for EF, end-diastolic and end-systolic volumes, respectively. The discrepancies in EF estimates between AutoEF and manual tracing using Simpson's method and between AutoEF and CMR preclude routine clinical use of AutoEF until it has been validated in a number of large, busy echocardiographic laboratories. Visual assessment of EF, with its strong correlation with quantitative EF, underscores its continued clinical utility.

  11. Digital image measurement of specimen deformation based on CCD cameras and Image J software: an application to human pelvic biomechanics

    NASA Astrophysics Data System (ADS)

    Jia, Yongwei; Cheng, Liming; Yu, Guangrong; Lou, Yongjian; Yu, Yan; Chen, Bo; Ding, Zuquan

    2008-03-01

    A method of digital image measurement of specimen deformation based on CCD cameras and Image J software was developed. This method was used to measure the biomechanical behavior of the human pelvis. Six cadaveric specimens from the third lumbar vertebra to the proximal 1/3 of the femur were tested. The specimens, without any structural abnormalities, were dissected of all soft tissue, sparing the hip joint capsules and the ligaments of the pelvic ring and floor. Markers with a black dot on a white background were affixed to the key regions of the pelvis. Axial loading from the proximal lumbar spine was applied by MTS in increments from 0 N to 500 N, which simulated the double-foot standing stance. The anterior and lateral images of the specimen were obtained through two CCD cameras. Digital 8-bit images were processed with Image J, digital image processing software that can be freely downloaded from the National Institutes of Health. The procedure includes recognition of the digital markers, image inversion, sub-pixel reconstruction, image segmentation, and a center-of-mass algorithm based on the weighted average of pixel gray values. Vertical displacements of S1 (the first sacral vertebra) in the front view and the micro-angular rotation of the sacroiliac joint in the lateral view were calculated from the marker movement. The digital image measurements showed the following: marker image correlation before and after deformation was excellent, with an average correlation coefficient of about 0.983. For the 768 × 576 pixel images (pixel size 0.68 mm × 0.68 mm), the precision of the displacement detected in our experiment was about 0.018 pixels, and the relative error could reach 1.11‰. The average vertical displacement of S1 of the pelvis was 0.8356 ± 0.2830 mm under a vertical load of 500 N, and the average micro-angular rotation of the sacroiliac joint in the lateral view was 0.584 ± 0.221°. The load-displacement curves obtained from our optical measurement system matched the clinical results. Digital image measurement of specimen deformation based on CCD cameras and Image J software has good prospects for application in biomechanical research, with the advantages of a simple optical setup, no contact, high precision, and no special requirements on the test environment.
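    The sub-pixel step above rests on an intensity-weighted centre-of-mass calculation. A minimal sketch, assuming a small grayscale patch around one marker that has already been inverted so the dot is bright on a dark background; the crude background suppression is an illustrative simplification of the segmentation the authors describe.

```python
import numpy as np

def marker_centroid(gray_patch):
    """Sub-pixel marker centre via intensity-weighted centre of mass."""
    g = gray_patch.astype(float)
    g -= g.min()                      # crude background suppression
    total = g.sum()
    rows, cols = np.indices(g.shape)
    r = (rows * g).sum() / total      # weighted mean row
    c = (cols * g).sum() / total      # weighted mean column
    return r, c
```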

  12. Laboratory demonstration of Stellar Intensity Interferometry using a software correlator

    NASA Astrophysics Data System (ADS)

    Matthews, Nolan; Kieda, David

    2017-06-01

    In this talk I will present measurements of the spatial coherence function of laboratory thermal (black-body) sources using Hanbury Brown and Twiss interferometry with a digital off-line correlator. Correlations in the intensity fluctuations of a thermal source, such as a star, allow retrieval of the second-order coherence function, which can be used to perform high-resolution imaging and source geometry characterization. We also demonstrate that intensity fluctuations between orthogonal polarization states are uncorrelated but can be used to reduce systematic noise. The work performed here can readily be applied to existing and future Imaging Air-Cherenkov telescopes to measure spatial properties of stellar sources. Some possible candidates for astronomy applications include close binary star systems, fast rotators, Cepheid variables, and potentially even exoplanet characterization.
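    At its core, an off-line HBT correlator estimates the normalized second-order coherence g2 from two digitized intensity streams. A toy sketch of that estimate, at zero lag and as a function of discrete lag; the function names and the simple normalization are assumptions, not the authors' correlator.

```python
import numpy as np

def g2_zero_lag(i1, i2):
    """Normalized zero-lag intensity correlation g2 = <I1*I2>/(<I1><I2>),
    the quantity an HBT software correlator estimates per baseline."""
    i1 = np.asarray(i1, float)
    i2 = np.asarray(i2, float)
    return np.mean(i1 * i2) / (i1.mean() * i2.mean())

def g2_vs_lag(i1, i2, max_lag):
    """g2 as a function of discrete time lag, computed off-line."""
    i1 = np.asarray(i1, float)
    i2 = np.asarray(i2, float)
    lags = np.arange(-max_lag, max_lag + 1)
    out = []
    for k in lags:
        if k >= 0:
            a, b = i1[k:], i2[:len(i2) - k]
        else:
            a, b = i1[:len(i1) + k], i2[-k:]
        out.append(np.mean(a * b) / (a.mean() * b.mean()))
    return lags, np.array(out)
```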

  13. Vids: Version 2.0 Alpha Visualization Engine

    DTIC Science & Technology

    2018-04-25

    fidelity than existing efforts. Vids is a project aimed at producing more dynamic and interactive visualization tools using modern computer game ...move through and interact with the data to improve informational understanding. The Vids software leverages off-the-shelf modern game development...analysis and correlations. Recently, an ARL-pioneered project named Virtual Reality Data Analysis Environment (VRDAE) used VR and a modern game engine

  14. Temporal Progression of Visual Injury from Blast Exposure

    DTIC Science & Technology

    2017-09-01

    seen throughout the duration of the study. To correlate experimental blast exposures in rodents to human blast exposures, a computational parametric...software (JMP 10.0, Cary, NC). Descriptive and univariate analyses will first be performed to identify the occurrence of delayed visual system...later). The biostatistician evaluating the retrospective data has completed the descriptive analysis and is working on the multiple regression. Table

  15. Regression methods for spatially correlated data: an example using beetle attacks in a seed orchard

    Treesearch

    Preisler Haiganoush; Nancy G. Rappaport; David L. Wood

    1997-01-01

    We present a statistical procedure for studying the simultaneous effects of observed covariates and unmeasured spatial variables on responses of interest. The procedure uses regression type analyses that can be used with existing statistical software packages. An example using the rate of twig beetle attacks on Douglas-fir trees in a seed orchard illustrates the...

  16. The Relationship between the Use of Spaced Repetition Software with a TOEIC Word List and TOEIC Score Gains

    ERIC Educational Resources Information Center

    Bower, Jack Victor; Rutson-Griffiths, Arthur

    2016-01-01

    A strong relationship between L2 vocabulary knowledge and L2 reading and listening comprehension is well established. However, less research has been conducted to explore correlations between pedagogic interventions to increase vocabulary knowledge and score gains on standardized L2 proficiency tests. This study addresses this gap in the research…

  17. Development of a Software-Defined Radar

    DTIC Science & Technology

    2017-10-01

    waveform to the widest available (unoccupied) instantaneous bandwidth in real time. Consequently, the radar range resolution and target detection are...LabVIEW The matched filter range profile is calculated in real time using fast Fourier transform (FFT) operations to perform a cross-correlation...between the transmitted waveform and the received complex data. Figure 4 demonstrates the block logic used to achieve real-time range profile
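    Frequency-domain matched filtering of this kind is compact to express. A small sketch, assuming `tx` holds the transmitted waveform samples and `rx` the received complex data; the zero-padding to a power of two and the function name are illustrative choices.

```python
import numpy as np

def matched_filter_range_profile(tx, rx):
    """Cross-correlate received samples with the transmitted waveform
    via FFTs (frequency-domain matched filtering)."""
    n = len(tx) + len(rx) - 1
    nfft = 1 << (n - 1).bit_length()          # next power of two
    TX = np.fft.fft(tx, nfft)
    RX = np.fft.fft(rx, nfft)
    profile = np.fft.ifft(RX * np.conj(TX))   # correlation theorem
    return np.abs(profile[:n])
```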

  18. A Correlational Study Assessing the Relationships among Information Technology Project Complexity, Project Complication, and Project Success

    ERIC Educational Resources Information Center

    Williamson, David J.

    2011-01-01

    The specific problem addressed in this study was the low success rate of information technology (IT) projects in the U.S. Due to the abstract nature and inherent complexity of software development, IT projects are among the most complex projects encountered. Most existing schools of project management theory are based on the rational systems…

  19. Corrigendum to "Nearest neighbor imputation of species-level, plot-scale forest structure attributes from LiDAR data"

    Treesearch

    Andrew T. Hudak; Nicholas L. Crookston; Jeffrey S. Evans; David E. hall; Michael J. Falkowski

    2009-01-01

    The authors regret that an error was discovered in the code within the R software package, yaImpute (Crookston & Finley, 2008), which led to incorrect results reported in the above article. The Most Similar Neighbor (MSN) method computes the distance between reference observations and target observations in a projected space defined using canonical correlation...

  20. Inter- and intrarater reliability of the Chicago Classification in pediatric high-resolution esophageal manometry recordings.

    PubMed

    Singendonk, M M J; Smits, M J; Heijting, I E; van Wijk, M P; Nurko, S; Rosen, R; Weijenborg, P W; Abu-Assi, R; Hoekman, D R; Kuizenga-Wessel, S; Seiboth, G; Benninga, M A; Omari, T I; Kritas, S

    2015-02-01

    The Chicago Classification (CC) facilitates interpretation of high-resolution manometry (HRM) recordings. The applicability of this adult-based algorithm to the pediatric population is unknown. We therefore assessed intra- and interrater reliability of software-based CC diagnosis in a pediatric cohort. Thirty pediatric solid-state HRM recordings (13 M; mean age 12.1 ± 5.1 years) assessing 10 liquid swallows per patient were analyzed twice by 11 raters (six experts, five non-experts). Software-placed anatomical landmarks required manual adjustment or removal. Integrated relaxation pressure (IRP4s), distal contractile integral (DCI), contractile front velocity (CFV), distal latency (DL) and break size (BS), and an overall CC diagnosis were software-generated. In addition, raters provided their subjective CC diagnosis. Reliability was calculated with Cohen's and Fleiss' kappa (κ) and the intraclass correlation coefficient (ICC). Intra- and interrater reliability of software-generated CC diagnosis after manual adjustment of landmarks was substantial (mean κ = 0.69 and 0.77, respectively) and moderate-substantial for subjective CC diagnosis (mean κ = 0.70 and 0.58, respectively). Reliability of both software-generated and subjective diagnosis of normal motility was high (κ = 0.81 and κ = 0.79). Intra- and interrater reliability were excellent for IRP4s, DCI, and BS. Experts had higher interrater reliability than non-experts for DL (ICC = 0.65 vs. ICC = 0.36, respectively) and for the software-generated diagnosis of diffuse esophageal spasm (DES, κ = 0.64 vs. κ = 0.30). Among experts, the reliability for the subjective diagnosis of achalasia and esophagogastric junction outflow obstruction was moderate-substantial (κ = 0.45-0.82). Inter- and intrarater reliability of software-based CC diagnosis of pediatric HRM recordings was high overall. However, experience was a factor influencing the diagnosis of some motility disorders, particularly DES and achalasia. © 2014 John Wiley & Sons Ltd.

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tuazon, B; Narayanasamy, G; Kirby, N

    Purpose: The purpose of this study was to evaluate and compare the accuracy of the dose calculation algorithms in the second-check software programs Radcalc, Diamond, IMSure, and MUcheck against the Pinnacle3 treatment planning system (TPS). Methods: Baseline accuracy of the second-check software was established by comparison against Pinnacle TPS data using open square fields of 5, 10, 20, 30 and 40 cm in an SAD setup. Files of 18 previously treated patients were exported from the Pinnacle3 TPS to each of the four second-check programs, comprising 146 step-and-shoot intensity modulated radiotherapy (IMRT) beams and 60 Smart Arcs. Monitor units (MU) calculated in each program were compared with the TPS, and the values were represented as a percent difference. Open fields of 5×5, 10×10, 20×20, 30×30, and 40×40 cm were calculated as a baseline for each program's accuracy. Box plots, Pearson correlation, and Bland-Altman analysis were used for comparison of the results. Results: The baseline accuracy was established to within 0.6%, −1.4%, −0.2%, and −1.0% for Diamond, IMSure, MUcheck, and Radcalc, respectively. In the clinical data, the dose differences, represented as mean ± 1 standard deviation, were 0.7% ± 0.1%, −0.3% ± 0.1%, −1.5% ± 0.1%, and 0.4% ± 0.0% for Diamond, IMSure, MUcheck, and Radcalc, respectively. Conclusion: The implementation of the Clarkson algorithm for dose calculation can vary considerably among the programs in question. The currently used second-check software, Radcalc, showed the best agreement on average, variance, and smallest percent range from the Pinnacle3 TPS values. The closest in average percent difference from the TPS data was the IMSure software, but it had significantly larger variance and percent range. The mean percent differences in Diamond and MUcheck were significantly larger than in Radcalc and IMSure.
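    For comparisons like these, a Bland-Altman analysis is just the bias and 95% limits of agreement of the paired differences. A short sketch, assuming paired MU values from the TPS and one second-check program; expressing differences in percent mirrors the study's reporting, and the function name is illustrative.

```python
import numpy as np

def bland_altman(mu_tps, mu_check):
    """Bland-Altman statistics for paired MU values: percent bias and
    95% limits of agreement of second-check vs. TPS."""
    a = np.asarray(mu_tps, float)
    b = np.asarray(mu_check, float)
    diff = 100.0 * (b - a) / a        # percent difference, as in the study
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)
```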

  2. An analysis of functional shoulder movements during task performance using Dartfish movement analysis software.

    PubMed

    Khadilkar, Leenesh; MacDermid, Joy C; Sinden, Kathryn E; Jenkyn, Thomas R; Birmingham, Trevor B; Athwal, George S

    2014-01-01

    Video-based movement analysis software (Dartfish) has potential for clinical applications in understanding shoulder motion if functional measures can be reliably obtained. The primary purpose of this study was to describe the functional range of motion (ROM) of the shoulder used to perform a subset of functional tasks. A second purpose was to assess the reliability of functional ROM measurements obtained by different raters using Dartfish software. Ten healthy participants, mean age 29 ± 5 years, were videotaped while performing five tasks selected from the Disabilities of the Arm, Shoulder and Hand (DASH). Video cameras and markers were used to obtain video images suitable for analysis in Dartfish software. Three repetitions of each task were performed. Shoulder movements from all three repetitions were analyzed using Dartfish software. The tracking tool of the Dartfish software was used to obtain shoulder joint angles and arcs of motion. Test-retest and inter-rater reliability of the measurements were evaluated using intraclass correlation coefficients (ICC). Maximum (coronal plane) abduction (118° ± 16°) and (sagittal plane) flexion (111° ± 15°) were observed during 'washing one's hair;' maximum extension (-68° ± 9°) was identified during 'washing one's own back.' Minimum shoulder ROM was observed during 'opening a tight jar' (33° ± 13° abduction and 13° ± 19° flexion). Test-retest reliability (ICC = 0.45 to 0.94) suggests high inter-individual task variability, and inter-rater reliability (ICC = 0.68 to 1.00) showed moderate to excellent agreement. Key findings include: 1) the functional shoulder ROM identified in this study was comparable to that of similar studies; 2) healthy individuals require less than full ROM when performing five common ADL tasks; 3) high participant variability was observed during performance of the five ADL tasks; and 4) Dartfish software provides a clinically relevant tool to analyze shoulder function.

  3. Accurate and fiducial-marker-free correction for three-dimensional chromatic shift in biological fluorescence microscopy.

    PubMed

    Matsuda, Atsushi; Schermelleh, Lothar; Hirano, Yasuhiro; Haraguchi, Tokuko; Hiraoka, Yasushi

    2018-05-15

    Correction of chromatic shift is necessary for precise registration of multicolor fluorescence images of biological specimens. New emerging technologies in fluorescence microscopy with increasing spatial resolution and penetration depth have prompted the need for more accurate methods to correct chromatic aberration. However, the amount of chromatic shift of the region of interest in biological samples often deviates from the theoretical prediction because of unknown dispersion in the biological samples. To measure and correct chromatic shift in biological samples, we developed a quadrisection phase correlation approach to computationally calculate translation, rotation, and magnification from reference images. Furthermore, to account for local chromatic shifts, images are split into smaller elements, for which the phase correlation between channels is measured individually and corrected accordingly. We implemented this method in an easy-to-use open-source software package, called Chromagnon, that is able to correct shifts with a 3D accuracy of approximately 15 nm. Applying this software, we quantified the level of uncertainty in chromatic shift correction, depending on the imaging modality used, and for different existing calibration methods, along with the proposed one. Finally, we provide guidelines to choose the optimal chromatic shift registration method for any given situation.
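    The registration step described here is classic phase correlation. A compact sketch of the translation-only core, assuming two single-channel images of the same shape; Chromagnon itself goes further (sub-pixel peaks, rotation and magnification, and the quadrisection scheme for local shifts), so this only illustrates the principle.

```python
import numpy as np

def phase_correlation_shift(ref, moving):
    """Integer-pixel 2-D shift between two channels by phase correlation."""
    F1 = np.fft.fft2(ref)
    F2 = np.fft.fft2(moving)
    cross = F1 * np.conj(F2)
    cross /= np.abs(cross) + 1e-12          # keep phase only
    corr = np.fft.ifft2(cross).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Map peaks past the midpoint to negative shifts
    shifts = [p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape)]
    return tuple(shifts)
```

    Applying the same estimator to each quadrant of the image, as the quadrisection idea suggests, would additionally expose rotation and magnification between channels.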

  4. Survey for δ Sct components in eclipsing binaries and new correlations between pulsation frequency and fundamental stellar characteristics

    NASA Astrophysics Data System (ADS)

    Liakos, A.; Niarchos, P.; Soydugan, E.; Zasche, P.

    2012-05-01

    CCD observations of 68 eclipsing binary systems, candidates for containing δ Scuti components, were obtained. Their light curves are analysed using the PERIOD04 software for possible pulsational behaviour. For the systems QY Aql, CZ Aqr, TY Cap, WY Cet, UW Cyg, HL Dra, HZ Dra, AU Lac, CL Lyn and IO UMa, complete light curves were observed due to the detection of a pulsating component. All of them, except QY Aql and IO UMa, are analysed with modern astronomical software packages in order to determine their geometrical and pulsational characteristics. Spectroscopic observations of WY Cet and UW Cyg were used to estimate the spectral class of their primary components, while for HZ Dra radial velocities of its primary were measured. O–C diagram analysis was performed for the cases showing peculiar orbital period variations, namely CZ Aqr, TY Cap, WY Cet and UW Cyg, with the aim of obtaining a comprehensive picture of these systems. An updated catalogue of 74 close binaries including a δ Scuti companion is presented. Moreover, a connection between orbital and pulsation periods, as well as a correlation between evolutionary status and dominant pulsation frequency for these systems, is discussed.

  5. Cutoff Finder: A Comprehensive and Straightforward Web Application Enabling Rapid Biomarker Cutoff Optimization

    PubMed Central

    Budczies, Jan; Klauschen, Frederick; Sinn, Bruno V.; Győrffy, Balázs; Schmitt, Wolfgang D.; Darb-Esfahani, Silvia; Denkert, Carsten

    2012-01-01

    Gene or protein expression data are usually represented by metric or at least ordinal variables. In order to translate a continuous variable into a clinical decision, it is necessary to determine a cutoff point and to stratify patients into two groups, each requiring a different kind of treatment. Currently, there is no standard method or standard software for biomarker cutoff determination. Therefore, we developed Cutoff Finder, a bundle of optimization and visualization methods for cutoff determination that is accessible online. While one of the methods for cutoff optimization is based solely on the distribution of the marker under investigation, other methods optimize the correlation of the dichotomization with respect to an outcome or survival variable. We illustrate the functionality of Cutoff Finder by the analysis of the gene expression of estrogen receptor (ER) and progesterone receptor (PgR) in breast cancer tissues. The distribution of these important markers is analyzed and correlated with immunohistochemically determined ER status and distant metastasis-free survival. Cutoff Finder is expected to fill a relevant gap in the available biometric software repertoire and will enable faster optimization of new diagnostic biomarkers. The tool can be accessed at http://molpath.charite.de/cutoff. PMID:23251644
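    One of the outcome-based criteria mentioned above can be sketched as a scan over candidate cutoffs, keeping the dichotomization most strongly associated with a binary outcome. A small illustration using Fisher's exact test as the association measure; the criterion choice and function name are assumptions, not Cutoff Finder's exact internals.

```python
import numpy as np
from scipy import stats

def optimal_cutoff(marker, outcome):
    """Scan candidate cutoffs on a continuous marker and return the one
    whose high/low split is most strongly associated with a binary
    outcome, together with its Fisher p-value."""
    marker = np.asarray(marker, float)
    outcome = np.asarray(outcome, bool)
    best = (None, 1.0)
    for c in np.unique(marker)[1:]:          # skip the minimum: both groups non-empty
        high = marker >= c
        table = [[np.sum(high & outcome), np.sum(high & ~outcome)],
                 [np.sum(~high & outcome), np.sum(~high & ~outcome)]]
        p = stats.fisher_exact(table)[1]
        if p < best[1]:
            best = (c, p)
    return best
```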

  6. MCTBI: a web server for predicting metal ion effects in RNA structures.

    PubMed

    Sun, Li-Zhen; Zhang, Jing-Xiang; Chen, Shi-Jie

    2017-08-01

    Metal ions play critical roles in RNA structure and function. However, web servers and software packages for predicting ion effects in RNA structures are notably scarce. Furthermore, the existing web servers and software packages mainly neglect ion correlation and fluctuation effects, which are potentially important for RNAs. We here report a new web server, the MCTBI server (http://rna.physics.missouri.edu/MCTBI), for the prediction of ion effects for RNA structures. This server is based on the recently developed MCTBI, a model that can account for ion correlation and fluctuation effects for nucleic acid structures and can provide improved predictions for the effects of metal ions, especially multivalent ions such as Mg2+, as shown by extensive theory-experiment test results. The MCTBI web server predicts metal ion binding fractions, the most probable bound ion distribution, the electrostatic free energy of the system, and the free energy components. The results provide mechanistic insights into the role of metal ions in RNA structure formation and folding stability, which is important for understanding RNA functions and the rational design of RNA structures. © 2017 Sun et al.; Published by Cold Spring Harbor Laboratory Press for the RNA Society.

  7. Assessment of changes in expression and presentation of NKG2D under influence of MICA serum factor in different stages of breast cancer.

    PubMed

    Roshani, R; Boroujerdnia, M Ghafourian; Talaiezadeh, A H; Khodadadi, A

    2016-05-01

    Breast cancer is the most common cancer in women worldwide. In this study, we correlated the serum level of major histocompatibility complex class I-related chain A (sMICA) with the expression and presentation of NKG2D receptors on NK cells among patients with breast cancer. Peripheral blood (PB) samples were collected from 49 healthy volunteers and 49 breast cancer patients before surgery and chemotherapy. The expression and presentation of NKG2D were assessed using qRT-PCR and flow cytometry, respectively. Furthermore, sMICA levels were determined using ELISA. For flow cytometry, whole blood samples were stained with anti-CD56/NKG2D/CD3, and the obtained results were analyzed using WinMDI software. In addition, SPSS software was used for statistical analysis of the data. Significantly higher levels of sMICA were detected in the sera of the majority of cancer patients in contrast to healthy volunteers (P < 0.001). The expression and presentation of the NKG2D receptor were significantly lower than in healthy persons, showed an inverse correlation with sMICA, and correlated positively with tumor stage. Our study showed that sMICA may have an important role in diminishing the expression and presentation of the NKG2D receptor in breast cancer patients and supports the notion that sMICA can be a candidate target for the treatment of breast cancer.

  8. Development of a Gas Dynamic and Thermodynamic Simulation Model of the Lontra Blade Compressor™

    NASA Astrophysics Data System (ADS)

    Karlovsky, Jerome

    2015-08-01

    The Lontra Blade Compressor™ is a patented double-acting, internally compressing, positive displacement rotary compressor of innovative design. The Blade Compressor is in production for waste-water treatment, and will soon be launched for a range of applications at higher pressure ratios. In order to aid the design and development process, a thermodynamic and gas dynamic simulation program has been written in-house. The software has been successfully used to optimise geometries and running conditions of current designs, and is also being used to evaluate future designs for different applications and markets. The simulation code has three main elements: a positive displacement chamber model, a leakage model, and a gas dynamic model to simulate gas flow through ports and to track pressure waves in the inlet and outlet pipes. All three of these models are interlinked in order to track mass and energy flows within the system. A correlation study has been carried out to verify the software. The main correlation markers used were mass flow, chamber pressure, pressure wave tracking in the outlet pipe, and volumetric efficiency. It will be shown that excellent correlation has been achieved between measured and simulated data. Mass flow predictions were to within 2% of measured data, and the timings and magnitudes of all major gas dynamic effects were well replicated. The simulation will be further developed in the near future to help with the optimisation of exhaust and inlet silencers.

  9. Latin Hypercube Sampling (LHS) UNIX Library/Standalone

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2004-05-13

    The LHS UNIX Library/Standalone software provides the capability to draw random samples from over 30 distribution types. It performs the sampling by a stratified sampling method called Latin Hypercube Sampling (LHS). Multiple distributions can be sampled simultaneously, with user-specified correlations amongst the input distributions; LHS UNIX Library/Standalone thus provides a way to generate multi-variate samples. The LHS samples can be generated either as a callable library (e.g., from within the DAKOTA software framework) or as a standalone capability. LHS is a constrained Monte Carlo sampling scheme. In LHS, the range of each variable is divided into non-overlapping intervals on the basis of equal probability. A sample is selected at random with respect to the probability density in each interval. If multiple variables are sampled simultaneously, the values obtained for each are paired in a random manner with the n values of the other variables. In some cases, the pairing is restricted to obtain specified correlations amongst the input variables. Many simulation codes have input parameters that are uncertain and can be specified by a distribution. To perform uncertainty analysis and sensitivity analysis, random values are drawn from the input parameter distributions, and the simulation is run with these values to obtain output values. If this is done repeatedly, with many input samples drawn, one can build up a distribution of the output as well as examine correlations between input and output variables.
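    The stratify-then-pair procedure described above is short to write down. A minimal sketch on the unit hypercube, without the restricted-pairing correlation control the library offers; the function name and the inverse-CDF mapping at the end are illustrative.

```python
import numpy as np
from scipy.stats import norm

def latin_hypercube(n_samples, n_vars, rng=None):
    """Plain LHS: split each variable's range into n equal-probability
    bins, draw one point per bin, then shuffle the pairing between
    variables so rows are matched at random."""
    rng = np.random.default_rng(rng)
    u = rng.random((n_samples, n_vars))                  # position inside each bin
    strata = (np.arange(n_samples)[:, None] + u) / n_samples
    for j in range(n_vars):
        strata[:, j] = rng.permutation(strata[:, j])     # random pairing
    return strata

# Map the uniform samples through an inverse CDF to get, e.g., normals
samples = latin_hypercube(10, 2, rng=42)
x_normal = norm.ppf(samples[:, 0], loc=0.0, scale=1.0)
```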

  10. Determination of Soil Moisture Content using Laboratory Experimental and Field Electrical Resistivity Values

    NASA Astrophysics Data System (ADS)

    Hazreek, Z. A. M.; Rosli, S.; Fauziah, A.; Wijeyesekera, D. C.; Ashraf, M. I. M.; Faizal, T. B. M.; Kamarudin, A. F.; Rais, Y.; Dan, M. F. Md; Azhar, A. T. S.; Hafiz, Z. M.

    2018-04-01

    The efficiency of a civil engineering structure requires comprehensive geotechnical data obtained from site investigation. In the past, conventional site investigation relied heavily on drilling techniques and thus suffered from several limitations: it is time-consuming and expensive, and data collection is limited. Consequently, this study presents the determination of soil moisture content using laboratory experimental and field electrical resistivity values (ERV). Field and laboratory electrical resistivity (ER) tests were performed using an ABEM SAS4000 and a Nilsson400 soil resistance meter. The soil samples used for the resistivity tests were subjected to characterization tests, specifically particle size distribution and moisture content, according to BS1377 (1990). Field ER data were processed using RES2DINV software, while laboratory ER data were analyzed using SPSS and Excel software. The correlation of ERV and moisture content showed a medium-strength relationship (r = 0.506). Moreover, the coefficient of determination demonstrated that the statistical correlation obtained was very good (R² = 0.9382). In order to determine soil moisture content based on the statistical correlation (w = 110.68ρ^-0.347), a correction factor C = 19.27 was established through the laboratory and field ERV. Finally, this study has shown that basic geotechnical soil properties, with particular reference to water content, can be determined using an integration of laboratory and field ERV data analysis, and is thus able to complement the conventional approach owing to its economy, speed, and wider data coverage.
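    The fitted power law is straightforward to apply. A one-function sketch using the figures quoted above; note that the abstract does not spell out how the correction factor C = 19.27 enters the equation, so it is deliberately left unapplied here.

```python
def moisture_from_resistivity(rho_ohm_m):
    """Moisture content (%) from electrical resistivity via the fitted
    power law w = 110.68 * rho**-0.347 (R^2 = 0.9382 in the study).

    The reported laboratory-to-field correction factor C = 19.27 is
    not applied, since its exact role is not stated in the abstract.
    """
    return 110.68 * rho_ohm_m ** -0.347

print(moisture_from_resistivity(100.0))  # ~22.4 %
```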

  11. Time-resolved perfusion imaging at the angiography suite: preclinical comparison of a new flat-detector application to computed tomography perfusion.

    PubMed

    Jürgens, Julian H W; Schulz, Nadine; Wybranski, Christian; Seidensticker, Max; Streit, Sebastian; Brauner, Jan; Wohlgemuth, Walter A; Deuerling-Zheng, Yu; Ricke, Jens; Dudeck, Oliver

    2015-02-01

    The objective of this study was to compare the parameter maps of a new flat-panel detector application for time-resolved perfusion imaging in the angiography room (FD-CTP) with computed tomography perfusion (CTP) in an experimental tumor model. Twenty-four VX2 tumors were implanted into the hind legs of 12 rabbits. Three weeks later, FD-CTP (Artis zeego; Siemens) and CTP (SOMATOM Definition AS +; Siemens) were performed. The parameter maps for the FD-CTP were calculated using a prototype software, and those for the CTP were calculated with VPCT-body software on a dedicated syngo MultiModality Workplace. The parameters were compared using Pearson product-moment correlation coefficient and linear regression analysis. The Pearson product-moment correlation coefficient showed good correlation values for both the intratumoral blood volume of 0.848 (P < 0.01) and the blood flow of 0.698 (P < 0.01). The linear regression analysis of the perfusion between FD-CTP and CTP showed for the blood volume a regression equation y = 4.44x + 36.72 (P < 0.01) and for the blood flow y = 0.75x + 14.61 (P < 0.01). This preclinical study provides evidence that FD-CTP allows a time-resolved (dynamic) perfusion imaging of tumors similar to CTP, which provides the basis for clinical applications such as the assessment of tumor response to locoregional therapies directly in the angiography suite.

  12. Intraoperative Right Ventricular Fractional Area Change Is a Good Indicator of Right Ventricular Contractility: A Retrospective Comparison Using Two- and Three-Dimensional Echocardiography.

    PubMed

    Imada, Tatsuyuki; Kamibayashi, Takahiko; Ota, Chiho; Carl Shibata, Sho; Iritakenishi, Takeshi; Sawa, Yoshiki; Fujino, Yuji

    2015-08-01

    Intraoperative two-dimensional echocardiography is technically challenging, given the unique geometry of the right ventricle (RV). It was hypothesized that the RV fractional area change (RVFAC) could be used as a simple method to evaluate RV function during surgery. Therefore, the correlation between the intraoperative RVFAC and the true right ventricular ejection fraction (RVEF), as measured using newly developed three-dimensional (3D) analysis software, was evaluated. Retrospective study. University hospital. Patients who underwent cardiac surgery with transesophageal echocardiography monitoring between March 2014 and June 2014. None. Sixty-two patients were included in this study. After the exclusion of poor imaging data and patients with arrhythmias, 54 data sets were analyzed. RVFAC was measured by one anesthesiologist during surgery, and full-volume 3D echocardiographic data were recorded simultaneously. The 3D data were analyzed postoperatively using off-line 3D analysis software by a second anesthesiologist, who was blinded to the RVFAC results. The mean RVFAC was 38.8% ± 8.7%, the mean RVEF was 41.4% ± 8.3%, and there was a good correlation between the RVFAC and the RVEF (r² = 0.638; p < 0.0001). The RVFAC correlated well with the RVEF calculated using 3D echocardiography; therefore, RVFAC provides a simple and useful method for anesthesiologists to evaluate intraoperative RV function. Copyright © 2015 Elsevier Inc. All rights reserved.

  13. Proposal for future diagnosis and management of vascular tumors by using automatic software for image processing and statistic prediction.

    PubMed

    Popescu, M D; Draghici, L; Secheli, I; Secheli, M; Codrescu, M; Draghici, I

    2015-01-01

    Infantile hemangiomas (IH) are the most frequent tumors of vascular origin, and the differential diagnosis from vascular malformations is difficult to establish. Specific types of IH, owing to their location, dimensions, and fast evolution, can cause important functional and esthetic sequelae. To avoid these unfortunate consequences it is necessary to establish the exact appropriate moment to begin treatment and to decide which therapeutic procedure is most adequate. Based on clinical data collected through serial clinical observations correlated with imaging data, and processed by a computer-aided diagnosis (CAD) system, the study intended to develop a treatment algorithm to accurately predict the best final result, from the esthetic and functional point of view, for a certain type of lesion. The preliminary database was composed of 75 patients divided into 4 groups according to the treatment management they received: medical therapy, sclerotherapy, surgical excision, and no treatment. The serial clinical observation was performed each month, and all the data were processed using the CAD system. The project goal was to create software that incorporated advanced methods to accurately measure the specific IH lesions, integrating medical information, statistical methods, and computational methods to correlate this information with that obtained from the processing of images. Based on these correlations, a mechanism for predicting the evolution of a hemangioma was established, which helps determine the best method of therapeutic intervention to minimize further complications.

  14. Accuracy of laser-scanned models compared to plaster models and cone-beam computed tomography.

    PubMed

    Kim, Jooseong; Heo, Giseon; Lagravère, Manuel O

    2014-05-01

    To compare the accuracy of measurements obtained from three-dimensional (3D) laser scans to those taken from cone-beam computed tomography (CBCT) scans and those obtained from plaster models. Eighteen different measurements, encompassing the mesiodistal width of teeth and both maxillary and mandibular arch length and width, were selected using various landmarks. CBCT scans and plaster models were prepared from 60 patients. Plaster models were scanned using the Ortho Insight 3D laser scanner, and the selected landmarks were measured using its software. CBCT scans were imported and analyzed using the Avizo software, and the 26 landmarks corresponding to the selected measurements were located and recorded. The plaster models were also measured using a digital caliper. Descriptive statistics and the intraclass correlation coefficient (ICC) were used to analyze the data. The ICC results showed that the values obtained by the three different methods were highly correlated in all measurements, all having correlations > 0.808. When checking the differences between values and methods, the largest mean difference found was 0.59 mm ± 0.38 mm. In conclusion, plaster models, CBCT models, and laser-scanned models are three different diagnostic records, each with its own advantages and disadvantages. The present results showed that the laser-scanned models are highly accurate compared with plaster models and CBCT scans. This gives general clinicians an alternative that takes into consideration the advantages of laser-scanned models over plaster models and CBCT reconstructions.

  15. Usefulness of the automatic quantitative estimation tool for cerebral blood flow: clinical assessment of the application software tool AQCEL.

    PubMed

    Momose, Mitsuhiro; Takaki, Akihiro; Matsushita, Tsuyoshi; Yanagisawa, Shin; Yano, Kesato; Miyasaka, Tadashi; Ogura, Yuka; Kadoya, Masumi

    2011-01-01

    AQCEL enables automatic reconstruction of single-photon emission computed tomography (SPECT) images without image degradation and quantitative analysis of cerebral blood flow (CBF) after the input of simple parameters. We ascertained the usefulness and quality of images obtained by the application software AQCEL in clinical practice. Twelve patients underwent brain perfusion SPECT using technetium-99m ethyl cysteinate dimer at rest and after acetazolamide (ACZ) loading. Images reconstructed using AQCEL were compared with those reconstructed using the conventional filtered back projection (FBP) method for qualitative estimation. Two experienced nuclear medicine physicians interpreted the image quality using the following visual scores: 0, same; 1, slightly superior; 2, superior. For quantitative estimation, the mean CBF values of the normal hemispheres of the 12 patients after ACZ loading calculated by the AQCEL method were compared with those calculated by the conventional method. The CBF values of the 24 regions of the 3-dimensional stereotaxic region of interest template (3DSRT) calculated by the AQCEL method at rest and after ACZ loading were compared to those calculated by the conventional method. No significant qualitative difference was observed between the AQCEL and conventional FBP methods in the rest study. The average score by the AQCEL method was 0.25 ± 0.45 and that by the conventional method was 0.17 ± 0.39 (P = 0.34). There was a significant qualitative difference between the AQCEL and conventional methods in the ACZ loading study. The average score for AQCEL was 0.83 ± 0.58 and that for the conventional method was 0.08 ± 0.29 (P = 0.003). In the quantitative estimation using ACZ, the mean CBF values of the 12 patients calculated by the AQCEL method were 3-8% higher than those calculated by the conventional method. The square of the correlation coefficient between these methods was 0.995. When comparing the 24 3DSRT regions of the 12 patients, the squares of the correlation coefficients between the AQCEL and conventional methods were 0.973 and 0.986 for the normal and affected sides at rest, respectively, and 0.977 and 0.984 for the normal and affected sides after ACZ loading, respectively. The quality of images reconstructed using the application software AQCEL was superior to that obtained using the conventional method after ACZ loading, and high quantitative correlations were shown at rest and after ACZ loading. This software can be applied in clinical practice and is a useful tool for improving reproducibility and throughput.

  16. Evolving attractive faces using morphing technology and a genetic algorithm: a new approach to determining ideal facial aesthetics.

    PubMed

    Wong, Brian J F; Karimi, Koohyar; Devcic, Zlatko; McLaren, Christine E; Chen, Wen-Pin

    2008-06-01

    The objectives of this study were to: 1) determine if a genetic algorithm in combination with morphing software can be used to evolve more attractive faces; and 2) evaluate whether this approach can be used as a tool to define or identify the attributes of the ideal attractive face. Basic research study incorporating focus group evaluations. Digital images were acquired of 250 female volunteers (18-25 y). Randomly selected images were used to produce a parent generation (P) of 30 synthetic faces using morphing software. Then, a focus group of 17 trained volunteers (18-25 y) scored each face on an attractiveness scale ranging from 1 (unattractive) to 10 (attractive). A genetic algorithm was used to select 30 new pairs from the parent generation, and these were morphed using software to produce a new first generation (F1) of faces. The F1 faces were scored by the focus group, and the process was repeated for a total of four iterations of the algorithm. The algorithm mimics natural selection by using the attractiveness score as the selection pressure; the more attractive faces are more likely to morph. All five generations (P-F4) were then scored by three focus groups: a) surgeons (n = 12), b) cosmetology students (n = 44), and c) undergraduate students (n = 44). Morphometric measurements were made of 33 specific features on each of the 150 synthetic faces and correlated with attractiveness scores using univariate and multivariate analysis. The average facial attractiveness scores increased with each generation and were 3.66 (±0.60), 4.59 (±0.73), 5.50 (±0.62), 6.23 (±0.31), and 6.39 (±0.24) for the P and F1-F4 generations, respectively. Histograms of attractiveness score distributions show a significant shift in the skew of each curve toward more attractive faces with each generation. Univariate analysis identified nasal width, eyebrow arch height, and lip thickness as being significantly correlated with attractiveness scores. Multivariate analysis identified a similar collection of morphometric measures. No correlation with more commonly accepted measures such as the lengths of facial thirds or fifths was identified. When the images are examined as a montage (by generation), clear distinct trends are identified: oval-shaped faces, distinct arched eyebrows, and full lips predominate. Faces evolve to approximate the guidelines suggested by classical canons. The F3 and F4 generation faces look profoundly similar. The statistical and qualitative analysis indicates that the algorithm and methodology succeed in generating successively more attractive faces. The use of genetic algorithms in combination with morphing software and traditional focus-group-derived attractiveness scores can be used to evolve attractive synthetic faces. We have demonstrated that the evolution of attractive faces can be mimicked in software. Genetic algorithms and morphing provide a robust alternative to traditional approaches rooted in comparing attractiveness scores with a series of morphometric measurements in human subjects.
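    The selection scheme described above is a standard fitness-proportional genetic algorithm. A toy sketch of one iteration, in which 'morphing' two parents is approximated by averaging their morphometric feature vectors; that stand-in, and the function name, are illustrative assumptions, since the study morphs actual face images.

```python
import numpy as np

def next_generation(features, scores, n_offspring=30, rng=None):
    """One GA iteration: attractiveness scores act as the selection
    pressure, and a 'morph' of two parents is modelled as the average
    of their feature vectors."""
    rng = np.random.default_rng(rng)
    features = np.asarray(features, float)    # shape (n_faces, n_features)
    scores = np.asarray(scores, float)
    p = scores / scores.sum()                 # fitness-proportional selection
    children = []
    for _ in range(n_offspring):
        i, j = rng.choice(len(features), size=2, replace=False, p=p)
        children.append(0.5 * (features[i] + features[j]))   # the "morph"
    return np.array(children)
```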

  17. Feature tracking CMR reveals abnormal strain in preclinical arrhythmogenic right ventricular dysplasia/ cardiomyopathy: a multisoftware feasibility and clinical implementation study.

    PubMed

    Bourfiss, Mimount; Vigneault, Davis M; Aliyari Ghasebeh, Mounes; Murray, Brittney; James, Cynthia A; Tichnell, Crystal; Mohamed Hoesein, Firdaus A; Zimmerman, Stefan L; Kamel, Ihab R; Calkins, Hugh; Tandri, Harikrishna; Velthuis, Birgitta K; Bluemke, David A; Te Riele, Anneline S J M

    2017-09-01

    Regional right ventricular (RV) dysfunction is the hallmark of Arrhythmogenic Right Ventricular Dysplasia/Cardiomyopathy (ARVD/C), but is currently only qualitatively evaluated in the clinical setting. Feature Tracking Cardiovascular Magnetic Resonance (FT-CMR) is a novel quantitative method that uses cine CMR to calculate strain values. However, most prior FT-CMR studies in ARVD/C have focused on global RV strain using different software methods, complicating implementation of FT-CMR in clinical practice. We aimed to assess the clinical value of global and regional strain using FT-CMR in ARVD/C and to determine differences between commercially available FT-CMR software packages. We analyzed cine CMR images of 110 subjects (39 overt ARVD/C [mutation+/phenotype+], 40 preclinical ARVD/C [mutation+/phenotype-] and 31 control) for global and regional (subtricuspid, anterior, apical) RV strain in the horizontal longitudinal axis using four FT-CMR software methods (Multimodality Tissue Tracking, TomTec, Medis and Circle Cardiovascular Imaging). Intersoftware agreement was assessed using Bland-Altman plots. For global strain, all methods showed reduced strain in overt ARVD/C patients compared to control subjects (p < 0.041), whereas none distinguished preclinical from control subjects (p > 0.275). For regional strain, overt ARVD/C patients showed reduced strain compared to control subjects in all segments, which reached statistical significance in the subtricuspid region for all software methods (p < 0.037), in the anterior wall for two methods (p < 0.005) and in the apex for one method (p = 0.012). Preclinical subjects showed abnormal subtricuspid strain compared to control subjects using one of the software methods (p = 0.009). Agreement between software methods for absolute strain values was low (intraclass correlation coefficient = 0.373). Despite large intersoftware variability of FT-CMR derived strain values, all four software methods distinguished overt ARVD/C patients from control subjects by both global and subtricuspid strain values. In the subtricuspid region, one software package distinguished preclinical from control subjects, suggesting the potential to identify early ARVD/C prior to overt disease expression.

  18. Correlation analysis of a ground-water level monitoring network, Miami-Dade County, Florida

    USGS Publications Warehouse

    Prinos, Scott T.

    2005-01-01

    The U.S. Geological Survey cooperative ground-water monitoring program in Miami-Dade County, Florida, expanded from 4 to 98 continuously recording water-level monitoring wells during the 1939-2001 period. Network design was based on area-specific assessments; however, no countywide statistical assessments of network coverage had been performed for the purpose of assessing network redundancy. To aid in the assessment of network redundancy, correlation analyses were performed using S-PLUS 2000 statistical analysis software for daily maximum water-level data from 98 monitoring wells for the November 1, 1973, to October 31, 2000 period. Because of the complexities of the hydrologic, water-supply, and water-management systems in Miami-Dade County and the changes that have occurred to these systems through time, spatial and temporal variations in the degree of correlation had to be considered. To assess temporal variation in correlation, water-level data from each well were subdivided by year and by wet and dry seasons. For each well, year, and season, correlation analyses were performed on the data from those wells that had available data. For selected wells, the resulting correlation coefficients from each year and season were plotted with respect to time. To assess spatial variation in correlation, the coefficients determined from the correlation analysis were averaged. These average wet- and dry-season correlation coefficients were plotted spatially using geographic information system software. Wells with water-level data that correlated with a coefficient of 0.95 or greater were almost always located in relatively close proximity to each other. Five areas were identified where the water-level data from wells within the area remained correlated with those of other wells in the area during the wet and dry seasons. These areas are located in or near the C-1 and C-102 basins (2 wells), in or near the C-6 and C-7 basins (2 wells), near the Florida Keys Aqueduct Authority Well Field (2 wells), near the Hialeah-Miami Springs Well Field (6 wells), and near the West Well Field (21 wells). Data from the remaining 65 wells (most of the wells in the network) generally did not correlate with data from other wells at an average coefficient of 0.95 or greater in both the wet and dry seasons. Because many of the wells near the West Well Field and some near the Hialeah-Miami Springs Well Field had not been in operation for very long (most having been installed in 1994), the averaged correlation coefficients for these wells were often determined using only a few seasons of data. For the few instances where water-level data were found to be well correlated on average for a lengthy period of record, short-term declines in correlation were often identified. In general, it would be beneficial to compare data for longer periods of record than currently available.
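
    The season-by-season correlation procedure described above is straightforward to reproduce with modern tools. Below is a minimal pandas sketch (not the original S-PLUS code); the wet-season months and the 0.95 redundancy threshold are assumptions taken from the text and the region's typical climate.

        import numpy as np
        import pandas as pd

        def redundant_pairs(levels, threshold=0.95):
            """levels: DataFrame of daily maximum water levels,
            indexed by date, one column per monitoring well.

            Correlates every well pair within each year's wet (Jun-Oct,
            assumed) and dry season, averages the coefficients over all
            year-seasons, and returns pairs averaging >= threshold.
            """
            wet = levels.index.month.isin(range(6, 11))
            season = np.where(wet, "wet", "dry")
            corrs = [g.corr() for _, g in levels.groupby([levels.index.year, season])]
            mean_corr = sum(corrs) / len(corrs)      # average over year-seasons
            pairs = mean_corr.stack()
            return pairs[(pairs >= threshold) & (pairs < 1.0)]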

  19. Working covariance model selection for generalized estimating equations.

    PubMed

    Carey, Vincent J; Wang, You-Gan

    2011-11-20

    We investigate methods for data-based selection of working covariance models in the analysis of correlated data with generalized estimating equations. We study two selection criteria: Gaussian pseudolikelihood and a geodesic distance based on discrepancy between model-sensitive and model-robust regression parameter covariance estimators. The Gaussian pseudolikelihood is found in simulation to be reasonably sensitive for several response distributions and noncanonical mean-variance relations for longitudinal data. Application is also made to a clinical dataset. Assessment of adequacy of both correlation and variance models for longitudinal data should be routine in applications, and we describe open-source software supporting this practice. Copyright © 2011 John Wiley & Sons, Ltd.
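
    As a concrete illustration of the first criterion, a Gaussian pseudolikelihood for a candidate working covariance model can be computed from the per-cluster residuals of a GEE fit. The sketch below is a minimal reading of that criterion, not the authors' software; the residuals, scale, and correlation parameter alpha are assumed inputs.

        import numpy as np

        def gaussian_pseudolikelihood(residuals, working_corr, scale):
            """residuals    : list of per-cluster residual vectors from a GEE fit
            working_corr : function n -> (n x n) working correlation matrix
            scale        : estimated dispersion parameter
            Larger values favour the candidate working covariance model.
            """
            pl = 0.0
            for r in residuals:
                V = scale * working_corr(len(r))        # working covariance V_i
                _, logdet = np.linalg.slogdet(V)
                pl -= 0.5 * (logdet + r @ np.linalg.solve(V, r))
            return pl

        def exchangeable(alpha):
            return lambda n: (1 - alpha) * np.eye(n) + alpha * np.ones((n, n))

        independence = lambda n: np.eye(n)
        # Compare, e.g.:
        # gaussian_pseudolikelihood(res, independence, s)
        # gaussian_pseudolikelihood(res, exchangeable(0.3), s)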

  20. Application of ChemDraw NMR Tool: Correlation of Program-Generated 13C Chemical Shifts and pKa Values of para-Substituted Benzoic Acids

    NASA Astrophysics Data System (ADS)

    Wang, Hongyi

    2005-09-01

    An application of the ChemDraw NMR Tool was demonstrated by correlation of program-generated 13C NMR chemical shifts and pKa values of para-substituted benzoic acids. Experimental 13C NMR chemical shifts were analyzed in the same way for comparison. The project can be used as an assignment at the end of the first-year organic chemistry course to review topics or explore new techniques: Hammett equation, acid-base equilibrium theory, electronic nature of functional groups, inductive and resonance effects, structure-reactivity relationship, NMR spectroscopy, literature search, database search, and ChemDraw software.
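
    The correlation exercise reduces to a straight-line fit of pKa against the program-generated shift of a chosen carbon. A minimal sketch follows; the shift and pKa values are purely illustrative placeholders, not the article's data.

        import numpy as np

        # Illustrative ChemDraw-estimated 13C shifts (ppm) for one ring carbon
        # and literature pKa values of para-substituted benzoic acids:
        shift = np.array([170.3, 170.9, 171.4, 172.0])
        pka = np.array([3.41, 3.98, 4.20, 4.47])

        slope, intercept = np.polyfit(shift, pka, 1)   # least-squares line
        r = np.corrcoef(shift, pka)[0, 1]              # Pearson correlation
        print(f"pKa ~ {slope:.2f} * shift + {intercept:.1f}, r = {r:.3f}")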

  1. The AuScope Project and Trans-Tasman VLBI

    NASA Technical Reports Server (NTRS)

    Lovell, Jim; Dickey, John; Gulyaev, Sergei; Natusch, Tim; Titov, Oleg; Tingay, Steven

    2010-01-01

    Three 12-meter radio telescopes are being built in Australia (the AuScope project) and one in New Zealand. These facilities will be fully-equipped for undertaking S and X-band geodetic VLBI observations and correlation will take place on a software correlator (part of the AuScope project). All sites are equipped with permanent GPS receivers to provide co-location of several space geodetic techniques. The following scientific tasks of geodesy and astrometry are considered. 1. Improvement and densification of the International Celestial Reference Frame in the southern hemisphere; 2. Improvement of the International Terrestrial Reference Frame in the region; 3. Measurement of intraplate deformation of the Australian tectonic plate.

  2. Observation VLBI Session RAPL02. the Results of the Data Processing

    NASA Astrophysics Data System (ADS)

    Chuprikov, A. A.

    Results of processing the data of a VLBI experiment titled RAPL02 are presented. The observations were made in February 2011 with five antennas. All three antennas of the Institute of Applied Astronomy (IAA) in St. Petersburg were used in this session: the antennas in Svetloe, Zelenchukskaya, and Badary. Additionally, the 22-m antenna in Pushchino and the 32-m antenna in Medicina (Italy) were included in the observations. The raw data were correlated at the software correlator of the Astro Space Center. Secondary data processing was performed for three quasars: 3C273, 3C279, and 3C286.

  3. The eating and physical activity habits of inner-city adolescents.

    PubMed

    Sweeney, Nancy M; Glaser, Dale; Tedeschi, Christine

    2007-01-01

    The purposes of this study were to (a) analyze the body mass index (BMI) percentile and the eating and physical activity habits of adolescents, viewed by sex, ethnicity, household type, and foreign-born versus US-born status, and (b) evaluate diet and activity analysis software for use by practitioners and adolescent clients. In a descriptive-correlational study, 74 adolescents from low-income households completed a 24-hour recall of their foods, drinks, and activities, which was analyzed using MyPyramidTracker software. Data were analyzed using parametric and nonparametric methods to test associations and conduct between-group comparisons. Girls, Hispanics, adolescents living with single-parent mothers, and those who were foreign born had the highest mean BMI percentiles and the least healthy eating and physical activity habits. BMI percentile fell as daily calorie expenditure rose. MyPyramidTracker software is suitable for use by adolescents and their family members. To contribute to the reversal of national trends of increasing overweight status among adolescents, practitioners can focus their teaching, counseling, and advocacy on adolescents in these groups.

  4. Implementation of metal-friendly EAM/FS-type semi-empirical potentials in HOOMD-blue: A GPU-accelerated molecular dynamics software

    DOE PAGES

    Yang, Lin; Zhang, Feng; Wang, Cai-Zhuang; ...

    2018-01-12

    We present an implementation of EAM and FS interatomic potentials, which are widely used in simulating metallic systems, in HOOMD-blue, a software package designed to perform classical molecular dynamics simulations using GPU acceleration. We first discuss the details of our implementation and then report extensive benchmark tests. We demonstrate that single-precision floating-point operations efficiently implemented on GPUs can produce sufficient accuracy when compared against double-precision codes, as shown in test calculations of the glass-transition temperature of Cu64.5Zr35.5 and the pair correlation function of liquid Ni3Al. Our code scales well with the size of the simulated system on NVIDIA Tesla M40 and P100 GPUs. Compared with the popular LAMMPS package running on 32 cores of AMD Opteron 6220 processors, the GPU/CPU performance ratio can reach as high as 4.6. The source code can be accessed through the HOOMD-blue web page for free by any interested user.

  5. Implementation of metal-friendly EAM/FS-type semi-empirical potentials in HOOMD-blue: A GPU-accelerated molecular dynamics software

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, Lin; Zhang, Feng; Wang, Cai-Zhuang

    We present an implementation of EAM and FS interatomic potentials, which are widely used in simulating metallic systems, in HOOMD-blue, a software package designed to perform classical molecular dynamics simulations using GPU acceleration. We first discuss the details of our implementation and then report extensive benchmark tests. We demonstrate that single-precision floating-point operations efficiently implemented on GPUs can produce sufficient accuracy when compared against double-precision codes, as shown in test calculations of the glass-transition temperature of Cu64.5Zr35.5 and the pair correlation function of liquid Ni3Al. Our code scales well with the size of the simulated system on NVIDIA Tesla M40 and P100 GPUs. Compared with the popular LAMMPS package running on 32 cores of AMD Opteron 6220 processors, the GPU/CPU performance ratio can reach as high as 4.6. The source code can be accessed through the HOOMD-blue web page for free by any interested user.

  6. Forensic microradiology: micro-computed tomography (Micro-CT) and analysis of patterned injuries inside of bone.

    PubMed

    Thali, Michael J; Taubenreuther, Ulrike; Karolczak, Marek; Braun, Marcel; Brueschweiler, Walter; Kalender, Willi A; Dirnhofer, Richard

    2003-11-01

    When a knife is stabbed into bone, it leaves an impression whose characteristics (shape, size, etc.) may indicate the type of tool used to produce the patterned injury. Until now it has been impossible in the forensic sciences to document such damage precisely and non-destructively. Micro-computed tomography (Micro-CT) offers an opportunity to analyze patterned injuries of tool marks made in bone. Using high-resolution Micro-CT and computer software, detailed analysis of three-dimensional (3D) architecture has recently become feasible and allows microstructural 3D bone information to be collected. With adequate viewing software, data from a 2D slice in an arbitrary plane can be extracted from the 3D datasets. Using such software as a "digital virtual knife," the examiner can interactively section and analyze the 3D sample. Analysis of the bone injury revealed that Micro-CT provides an opportunity to correlate a bone injury to an injury-causing instrument. Even broken knife tips can be graphically and non-destructively assigned to a suspect weapon.

  7. Psychometric characteristics of Clinical Reasoning Problems (CRPs) and its correlation with routine multiple choice question (MCQ) in Cardiology department.

    PubMed

    Derakhshandeh, Zahra; Amini, Mitra; Kojuri, Javad; Dehbozorgian, Marziyeh

    2018-01-01

    Clinical reasoning is one of the most important skills in the process of training a medical student to become an efficient physician. Assessment of reasoning skills in a medical school program is important to direct students' learning. One of the tests for measuring clinical reasoning ability is Clinical Reasoning Problems (CRPs). The major aim of this study was to measure the psychometric qualities of CRPs and to define the correlation between this test and the routine MCQ in the cardiology department of Shiraz medical school. This descriptive study was conducted on all cardiology residents of Shiraz Medical School; the study population consisted of 40 residents in 2014. The routine CRPs and MCQ tests were designed based on similar objectives and were carried out simultaneously. Reliability, item difficulty, item discrimination, and the correlation between each item and the total score of the CRPs were measured using Excel and SPSS software to check the psychometric properties of the CRPs test. Furthermore, we calculated the correlation between the CRPs test and the MCQ test. The mean differences in CRPs test scores between residents' academic years (second, third, and fourth year) were also evaluated by analysis of variance (one-way ANOVA) using SPSS software (version 20) (α = 0.05). The mean and standard deviation of the CRPs score was 10.19 ± 3.39 out of 20; for the MCQ, it was 13.15 ± 3.81 out of 20. Item difficulty was in the range of 0.27-0.72; item discrimination was 0.30-0.75, with question No. 3 being the exception (0.24). The correlation between each item and the total score of the CRPs was 0.26-0.87; the correlation between the CRPs test and the MCQ test was 0.68 (p < 0.001). The reliability of the CRPs was 0.72, as calculated using Cronbach's alpha. The mean score of CRPs differed among residents based on their academic year, and this difference was statistically significant (p < 0.001). The results of the present investigation revealed that CRPs could be a reliable test for measuring clinical reasoning in residents. It can be included in cardiology residency assessment programs.
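
    The item statistics reported above (difficulty, discrimination, Cronbach's alpha) follow standard classical-test-theory formulas. A compact sketch, assuming a residents-by-items score matrix rather than the authors' Excel/SPSS workflow:

        import numpy as np

        def item_analysis(X):
            """X: (n_residents x n_items) matrix of item scores scaled to 0-1."""
            X = np.asarray(X, float)
            difficulty = X.mean(axis=0)                      # mean item score
            total = X.sum(axis=1)
            discrimination = np.array([
                np.corrcoef(X[:, j], total - X[:, j])[0, 1]  # corrected item-total r
                for j in range(X.shape[1])
            ])
            k = X.shape[1]
            alpha = k / (k - 1) * (1 - X.var(axis=0, ddof=1).sum()
                                   / total.var(ddof=1))      # Cronbach's alpha
            return difficulty, discrimination, alpha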

  8. A software platform for phase contrast x-ray breast imaging research.

    PubMed

    Bliznakova, K; Russo, P; Mettivier, G; Requardt, H; Popov, P; Bravin, A; Buliev, I

    2015-06-01

    To present and validate a computer-based simulation platform dedicated to phase contrast x-ray breast imaging research. The software platform, developed at the Technical University of Varna on the basis of a previously validated x-ray imaging software simulator, comprises modules for object creation and for x-ray image formation. These modules were updated to take into account the refractive index for phase contrast imaging, as well as to implement the Fresnel-Kirchhoff diffraction theory for the propagating x-ray waves. Projection images are generated in an in-line acquisition geometry. To test and validate the platform, several phantoms differing in their complexity were constructed and imaged at 25 keV and 60 keV at beamline ID17 of the European Synchrotron Radiation Facility. The software platform was used to design computational phantoms that mimic those used in the experimental study and to generate x-ray images in absorption and phase contrast modes. The visual and quantitative results of the validation process showed an overall good correlation between simulated and experimental images and show the potential of this platform for research in phase contrast x-ray imaging of the breast. The application of the platform is demonstrated in a feasibility study for phase contrast images of complex inhomogeneous and anthropomorphic breast phantoms, compared to x-ray images generated in absorption mode. The improved visibility of mammographic structures suggests further investigation and optimisation of phase contrast x-ray breast imaging, especially when abnormalities are present. The software platform can also be used for educational purposes. Copyright © 2015 Elsevier Ltd. All rights reserved.
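
    In-line phase contrast formation is, at its core, free-space Fresnel propagation of the complex wave leaving the object. The sketch below shows the transfer-function form of that step with NumPy FFTs; it is a generic illustration under idealised assumptions (monochromatic plane-wave illumination, paraxial regime), not the Varna platform's code.

        import numpy as np

        def fresnel_propagate(wave, wavelength, z, pixel):
            """Propagate a complex exit wave a distance z and return intensity.

            wave       : 2D complex field just behind the object
            wavelength : x-ray wavelength in m (~0.5e-10 m at 25 keV)
            z          : object-to-detector distance in m
            pixel      : sampling pitch in m
            """
            ny, nx = wave.shape
            fx = np.fft.fftfreq(nx, d=pixel)
            fy = np.fft.fftfreq(ny, d=pixel)
            FX, FY = np.meshgrid(fx, fy)
            H = np.exp(-1j * np.pi * wavelength * z * (FX**2 + FY**2))
            field = np.fft.ifft2(np.fft.fft2(wave) * H)
            return np.abs(field) ** 2     # intensity recorded at the detector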

  9. Novel semi-automated kidney volume measurements in autosomal dominant polycystic kidney disease.

    PubMed

    Muto, Satoru; Kawano, Haruna; Isotani, Shuji; Ide, Hisamitsu; Horie, Shigeo

    2018-06-01

    We assessed the effectiveness and convenience of a novel semi-automatic kidney volume (KV) measurement in the high-speed 3D-image analysis system SYNAPSE VINCENT® (Fuji Medical Systems, Tokyo, Japan) for autosomal dominant polycystic kidney disease (ADPKD) patients. We developed novel semi-automated KV measurement software for patients with ADPKD, included in the imaging analysis software SYNAPSE VINCENT®. The software extracts renal regions using image recognition and measures KV (VINCENT KV). The algorithm was designed to work with the manual designation of a long axis of a kidney, including cysts. After using the software to assess the predictive accuracy of the VINCENT method, we performed an external validation study and compared accurate KV and ellipsoid KV, based on geometric modeling, by linear regression analysis and Bland-Altman analysis. Median eGFR was 46.9 ml/min/1.73 m². Median accurate KV, Vincent KV, and ellipsoid KV were 627.7 ml, 619.4 ml (IQR 431.5-947.0), and 694.0 ml (IQR 488.1-1107.4), respectively. Compared with ellipsoid KV (r = 0.9504), Vincent KV correlated strongly with accurate KV (r = 0.9968), without systematic underestimation or overestimation (ellipsoid KV: 14.2 ± 22.0%; Vincent KV: -0.6 ± 6.0%). There were no significant slice-thickness-specific differences (p = 0.2980). The VINCENT method is an accurate and convenient semi-automatic method to measure KV in patients with ADPKD compared with the conventional ellipsoid method.
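
    The conventional ellipsoid method that VINCENT is compared against is a one-line geometric formula; a sketch (standard ellipsoid approximation, with illustrative dimensions):

        import math

        def ellipsoid_kidney_volume(length_cm, width_cm, depth_cm):
            """Conventional ellipsoid estimate of kidney volume in ml."""
            return math.pi / 6.0 * length_cm * width_cm * depth_cm

        # e.g. a 15 x 9 x 8 cm polycystic kidney (illustrative numbers):
        kv = ellipsoid_kidney_volume(15, 9, 8)   # ~565 ml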

  10. SEI Innovation Center Report: Cyber Intelligence Tradecraft Project: Summary of Key Findings

    DTIC Science & Technology

    2013-01-01

    …source news, social media), and focuses collection on the pertinent threats and strategic needs analysts identify while learning about their…difficult to correlate with other data sources (network data, social media, chat rooms, geopolitical news sites) and complicates trend analysis or…use of commonly exploited software, prohibiting USB storage devices, and impeding access to websites associated with scams and malware make it very…

  11. Development of a Simulink Library for the Design, Testing and Simulation of Software Defined GPS Radios. With Application to the Development of Parallel Correlator Structures

    DTIC Science & Technology

    2014-05-01

        function Value = Select_Element(Index, Signal) %#eml
        Value = Signal(Index);

    (Code Listing 1: Code for Selector Block)

        function shiftedSignal = fcn(signal, Shift) %#eml
        shiftedSignal = circshift(signal, Shift);

    (Code Listing 2: Code for CircShift)

  12. NDVI and Panchromatic Image Correlation Using Texture Analysis

    DTIC Science & Technology

    2010-03-01

    …Figure 5. Spectral reflectance of vegetation and soil from 0.4 to 1.1 µm (From Perry…)…should help the classification methods to be able to classify kelp…(1988). Image processing software for imaging spectrometry analysis. Remote Sensing of Environment, 24: 201-210. Perry, C., & Lautenschlager, L. F…

  13. SAS Code for Calculating Intraclass Correlation Coefficients and Effect Size Benchmarks for Site-Randomized Education Experiments

    ERIC Educational Resources Information Center

    Brandon, Paul R.; Harrison, George M.; Lawton, Brian E.

    2013-01-01

    When evaluators plan site-randomized experiments, they must conduct the appropriate statistical power analyses. These analyses are most likely to be valid when they are based on data from the jurisdictions in which the studies are to be conducted. In this method note, we provide software code, in the form of a SAS macro, for producing statistical…
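
    For site-randomized designs, the intraclass correlation the macro targets can be estimated from a one-way ANOVA decomposition. A minimal Python sketch of the ICC(1) estimator follows (not the authors' SAS macro), assuming a balanced design with equal cluster sizes:

        import numpy as np

        def icc_oneway(groups):
            """ICC(1) from one-way ANOVA; groups is a list of 1D arrays,
            one per site, each of equal size n."""
            k = len(groups)                    # number of sites
            n = len(groups[0])                 # members per site
            grand = np.concatenate(groups).mean()
            means = np.array([g.mean() for g in groups])
            msb = n * ((means - grand) ** 2).sum() / (k - 1)
            msw = sum(((g - g.mean()) ** 2).sum() for g in groups) / (k * (n - 1))
            return (msb - msw) / (msb + (n - 1) * msw)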

  14. Structure, composition and thermal state of the crust in Brazil. [geomagnetic survey

    NASA Technical Reports Server (NTRS)

    Pacca, I. I. G. (Principal Investigator); Shukowsky, W.

    1981-01-01

    Efforts in support of a geomagnetic survey of the Brazilian area are described. Software to convert MAGSAT data tapes to the Burroughs/B-6700 binary format was developed and tested. A preliminary analysis of the first total intensity anomaly map was performed and methodologies for more intensive analysis were defined. The sources for correlative geological, aeromagnetic, and gravimetric data are described.

  15. 3D planning in orthognathic surgery: CAD/CAM surgical splints and prediction of the soft and hard tissues results - our experience in 16 cases.

    PubMed

    Aboul-Hosn Centenero, Samir; Hernández-Alfaro, Federico

    2012-02-01

    The aim of this article is to determine the advantages of 3D planning in orthognathic surgery for predicting postoperative results and for manufacturing surgical splints using CAD/CAM (Computer Aided Design/Computer Aided Manufacturing) technology; the software program Simplant OMS 10.1 (Materialise®, Leuven, Belgium) was used for this study, which was carried out on 16 patients. A conventional preoperative treatment plan was devised for each patient following our Centre's standard protocol, and surgical splints were manufactured. These splints were used as study controls. The preoperative treatment plans were then transferred to a 3D virtual environment on a personal computer (PC). Surgery was simulated, predictions of the results on soft and hard tissue were produced, and surgical splints were manufactured using CAD/CAM technology. In the operating room, both types of surgical splints were compared and the degree of similarity of the results in three planes was calculated. The maxillary osteotomy line was taken as the point of reference. The level of concordance was used to compare the surgical splints. Three months after surgery, a second set of 3D images was obtained and used to obtain linear and angular measurements on screen. Using the Intraclass Correlation Coefficient, these postoperative measurements were compared with the measurements obtained when predicting postoperative results. Results showed a high degree of correlation in 15 of the 16 cases. A high coefficient of correlation was obtained in the majority of predictions of results in hard tissue, although less precise results were obtained for measurements of soft tissue in the labial area. The study shows that the software program used is reliable for 3D planning and for the manufacture of surgical splints using CAD/CAM technology. Nevertheless, further progress in the development of technologies for the acquisition of 3D images, new versions of software programs, and further studies of objective data are necessary to increase precision in computerised 3D planning. Copyright © 2011 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.

  16. Evaluation of atlas-based auto-segmentation software in prostate cancer patients

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Greenham, Stuart, E-mail: stuart.greenham@ncahs.health.nsw.gov.au; Dean, Jenna; Fu, Cheuk Kuen Kenneth

    2014-09-15

    The performance and limitations of an atlas-based auto-segmentation software package (ABAS; Elekta Inc.) were evaluated using male pelvic anatomy as the area of interest. Contours from 10 prostate patients were selected to create atlases in ABAS. The contoured regions of interest were created manually to align with published guidelines and included the prostate, bladder, rectum, femoral heads, and external patient contour. Twenty-four clinically treated prostate patients were auto-contoured using a randomised selection of two, four, six, eight, or ten atlases. The concordance between the manually drawn and computer-generated contours was evaluated statistically using Pearson's product-moment correlation coefficient (r) and clinically in a validated qualitative evaluation. In the latter evaluation, six radiation therapists classified the degree of agreement for each structure using seven clinically appropriate categories. The ABAS software generated clinically acceptable contours for the bladder, rectum, femoral heads, and external patient contour. For these structures, ABAS-generated volumes were highly correlated with 'as treated' volumes, manually drawn; for four atlases, for example, bladder r = 0.988 (P < 0.001), rectum r = 0.739 (P < 0.001), and left femoral head r = 0.560 (P < 0.001). The poorest results were seen for the prostate (r = 0.401, P < 0.05) (four atlases); however, this was attributed to the comparison prostate volume being contoured on magnetic resonance imaging (MRI) rather than computed tomography (CT) data. For all structures, increasing the number of atlases did not consistently improve accuracy. ABAS-generated contours are clinically useful for a range of structures in the male pelvis. Clinically appropriate volumes were created, but editing of some contours was inevitably required. The ideal number of atlases for improving automatically generated contours is yet to be determined.

  17. Using pharmacokinetic modelling to improve prescribing practices of intravenous aminophylline in childhood asthma exacerbations.

    PubMed

    Cooney, Lewis; McBride, Antonia; Lilley, Andrew; Sinha, Ian; Johnson, Trevor N; Hawcutt, Daniel B

    2017-04-01

    To evaluate physiologically based pharmacokinetic (PBPK) modelling software in paediatric asthma patients using intravenous aminophylline. Prospective clinical audit of children receiving iv aminophylline (July 2014 to June 2016), and in-silico modelling using Simcyp software. Thirty-eight admissions (25 children) were included. Children with aminophylline levels ≥10 mg/L had clinical outcomes equivalent to those <10 mg/L, and adverse effects occurred in 57%. Therapeutic drug monitoring (TDM) data correlated well with the PBPK model. PBPK modelling of a 5 mg/kg iv loading dose (≤18 years) shows a mean Cmax of 8.99 mg/L (5th-95th centiles 5.5-13.7 mg/L), with 70.3% of subjects <10 mg/L, 29.4% achieving 10-20 mg/L, and 0.1% >20 mg/L. For an aminophylline infusion (0-12 years) of 1.0 mg/kg/h, the mean steady-state infusion concentration was 16.4 mg/L (5th-95th centiles 5.3-32 mg/L), with 26.8% having a serum concentration >20 mg/L. For 12-18 year olds receiving a 0.5 mg/kg/h infusion, the mean steady-state infusion concentration was 9.37 mg/L (5th-95th centiles 3.4-18 mg/L), with 59.8% having a serum concentration <10 mg/L. PBPK software modelling correlates well with clinical data. Current aminophylline iv loading dose recommendations achieve levels <10 mg/L in 70% of children. Routine TDM may need altering, given the low risk of toxicity (>20 mg/L). Copyright © 2017 Elsevier Ltd. All rights reserved.
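
    The reported loading-dose and infusion levels are consistent with a back-of-envelope one-compartment calculation. The sketch below is only that sanity check, not the Simcyp PBPK model; the volume of distribution, clearance, and aminophylline-to-theophylline salt factor are assumed textbook values.

        def theophylline_levels(dose_mg_kg, rate_mg_kg_h,
                                vd_l_kg=0.5, cl_l_kg_h=0.05, salt_factor=0.8):
            """One-compartment estimates (assumed parameters, illustration only)."""
            c_peak = salt_factor * dose_mg_kg / vd_l_kg     # level after loading dose
            c_ss = salt_factor * rate_mg_kg_h / cl_l_kg_h   # steady-state infusion level
            return c_peak, c_ss

        # 5 mg/kg load + 1.0 mg/kg/h infusion:
        peak, css = theophylline_levels(5.0, 1.0)   # ~8 mg/L peak, ~16 mg/L steady state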

  18. Computational Pipeline for NIRS-EEG Joint Imaging of tDCS-Evoked Cerebral Responses-An Application in Ischemic Stroke.

    PubMed

    Guhathakurta, Debarpan; Dutta, Anirban

    2016-01-01

    Transcranial direct current stimulation (tDCS) modulates cortical neural activity and hemodynamics. Electrophysiological methods (electroencephalography-EEG) measure neural activity while optical methods (near-infrared spectroscopy-NIRS) measure hemodynamics coupled through neurovascular coupling (NVC). Assessment of NVC requires development of NIRS-EEG joint-imaging sensor montages that are sensitive to the tDCS affected brain areas. In this methods paper, we present a software pipeline incorporating freely available software tools that can be used to target vascular territories with tDCS and develop a NIRS-EEG probe for joint imaging of tDCS-evoked responses. We apply this software pipeline to target primarily the outer convexity of the brain territory (superficial divisions) of the middle cerebral artery (MCA). We then present a computational method based on Empirical Mode Decomposition of NIRS and EEG time series into a set of intrinsic mode functions (IMFs), and then perform a cross-correlation analysis on those IMFs from NIRS and EEG signals to model NVC at the lesional and contralesional hemispheres of an ischemic stroke patient. For the contralesional hemisphere, a strong positive correlation between IMFs of regional cerebral hemoglobin oxygen saturation and the log-transformed mean-power time-series of IMFs for EEG with a lag of about -15 s was found after a cumulative 550 s stimulation of anodal tDCS. It is postulated that system identification, for example using a continuous-time autoregressive model, of this coupling relation under tDCS perturbation may provide spatiotemporal discriminatory features for the identification of ischemia. Furthermore, portable NIRS-EEG joint imaging can be incorporated into brain computer interfaces to monitor tDCS-facilitated neurointervention as well as cortical reorganization.
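
    The coupling analysis pairs one NIRS IMF with one EEG-derived IMF and locates the lag of maximum cross-correlation. A minimal sketch of that final step, assuming the IMFs have already been extracted by an EMD implementation and resampled to a common rate fs:

        import numpy as np

        def xcorr_lag(imf_nirs, imf_eeg, fs, max_lag_s=60.0):
            """Normalized cross-correlation over lags within +/- max_lag_s.

            Returns (lags in seconds, correlation at each lag); the location of
            the peak estimates the neurovascular coupling delay. Positive lag
            means the NIRS IMF trails the EEG IMF.
            """
            x = (imf_nirs - imf_nirs.mean()) / imf_nirs.std()
            y = (imf_eeg - imf_eeg.mean()) / imf_eeg.std()
            n = len(x)
            c = np.correlate(x, y, mode="full") / n
            lags = np.arange(-n + 1, n) / fs
            keep = np.abs(lags) <= max_lag_s
            return lags[keep], c[keep]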

  19. Computational Pipeline for NIRS-EEG Joint Imaging of tDCS-Evoked Cerebral Responses—An Application in Ischemic Stroke

    PubMed Central

    Guhathakurta, Debarpan; Dutta, Anirban

    2016-01-01

    Transcranial direct current stimulation (tDCS) modulates cortical neural activity and hemodynamics. Electrophysiological methods (electroencephalography-EEG) measure neural activity while optical methods (near-infrared spectroscopy-NIRS) measure hemodynamics coupled through neurovascular coupling (NVC). Assessment of NVC requires development of NIRS-EEG joint-imaging sensor montages that are sensitive to the tDCS affected brain areas. In this methods paper, we present a software pipeline incorporating freely available software tools that can be used to target vascular territories with tDCS and develop a NIRS-EEG probe for joint imaging of tDCS-evoked responses. We apply this software pipeline to target primarily the outer convexity of the brain territory (superficial divisions) of the middle cerebral artery (MCA). We then present a computational method based on Empirical Mode Decomposition of NIRS and EEG time series into a set of intrinsic mode functions (IMFs), and then perform a cross-correlation analysis on those IMFs from NIRS and EEG signals to model NVC at the lesional and contralesional hemispheres of an ischemic stroke patient. For the contralesional hemisphere, a strong positive correlation between IMFs of regional cerebral hemoglobin oxygen saturation and the log-transformed mean-power time-series of IMFs for EEG with a lag of about −15 s was found after a cumulative 550 s stimulation of anodal tDCS. It is postulated that system identification, for example using a continuous-time autoregressive model, of this coupling relation under tDCS perturbation may provide spatiotemporal discriminatory features for the identification of ischemia. Furthermore, portable NIRS-EEG joint imaging can be incorporated into brain computer interfaces to monitor tDCS-facilitated neurointervention as well as cortical reorganization. PMID:27378836

  20. Quality of Radiomic Features in Glioblastoma Multiforme: Impact of Semi-Automated Tumor Segmentation Software

    PubMed Central

    Lee, Myungeun; Woo, Boyeong; Kuo, Michael D.; Jamshidi, Neema

    2017-01-01

    Objective The purpose of this study was to evaluate the reliability and quality of radiomic features in glioblastoma multiforme (GBM) derived from tumor volumes obtained with semi-automated tumor segmentation software. Materials and Methods MR images of 45 GBM patients (29 males, 16 females) were downloaded from The Cancer Imaging Archive, in which post-contrast T1-weighted imaging and fluid-attenuated inversion recovery MR sequences were used. Two raters independently segmented the tumors using two semi-automated segmentation tools (TumorPrism3D and 3D Slicer). Regions of interest corresponding to the contrast-enhancing lesion, necrotic portions, and non-enhancing T2 high signal intensity component were segmented for each tumor. A total of 180 imaging features were extracted, and their quality was evaluated in terms of stability, normalized dynamic range (NDR), and redundancy, using intra-class correlation coefficients, cluster consensus, and the Rand Statistic. Results Our study results showed that most of the radiomic features in GBM were highly stable. Over 90% of the 180 features showed good stability (intra-class correlation coefficient [ICC] ≥ 0.8), whereas only 7 features were of poor stability (ICC < 0.5). Most first-order statistics and morphometric features showed moderate-to-high NDR (4 > NDR ≥ 1), while more than 35% of the texture features showed poor NDR (< 1). Features were shown to cluster into only 5 groups, indicating that they were highly redundant. Conclusion The use of semi-automated software tools provided sufficiently reliable tumor segmentation and feature stability, thus helping to overcome the inherent inter-rater and intra-rater variability of user intervention. However, certain aspects of feature quality, including NDR and redundancy, need to be assessed for determination of representative signature features before further development of radiomics. PMID:28458602

  1. Quality of Radiomic Features in Glioblastoma Multiforme: Impact of Semi-Automated Tumor Segmentation Software.

    PubMed

    Lee, Myungeun; Woo, Boyeong; Kuo, Michael D; Jamshidi, Neema; Kim, Jong Hyo

    2017-01-01

    The purpose of this study was to evaluate the reliability and quality of radiomic features in glioblastoma multiforme (GBM) derived from tumor volumes obtained with semi-automated tumor segmentation software. MR images of 45 GBM patients (29 males, 16 females) were downloaded from The Cancer Imaging Archive, in which post-contrast T1-weighted imaging and fluid-attenuated inversion recovery MR sequences were used. Two raters independently segmented the tumors using two semi-automated segmentation tools (TumorPrism3D and 3D Slicer). Regions of interest corresponding to the contrast-enhancing lesion, necrotic portions, and non-enhancing T2 high signal intensity component were segmented for each tumor. A total of 180 imaging features were extracted, and their quality was evaluated in terms of stability, normalized dynamic range (NDR), and redundancy, using intra-class correlation coefficients, cluster consensus, and the Rand Statistic. Our study results showed that most of the radiomic features in GBM were highly stable. Over 90% of the 180 features showed good stability (intra-class correlation coefficient [ICC] ≥ 0.8), whereas only 7 features were of poor stability (ICC < 0.5). Most first-order statistics and morphometric features showed moderate-to-high NDR (4 > NDR ≥ 1), while more than 35% of the texture features showed poor NDR (< 1). Features were shown to cluster into only 5 groups, indicating that they were highly redundant. The use of semi-automated software tools provided sufficiently reliable tumor segmentation and feature stability, thus helping to overcome the inherent inter-rater and intra-rater variability of user intervention. However, certain aspects of feature quality, including NDR and redundancy, need to be assessed for determination of representative signature features before further development of radiomics.

  2. Integrated data analysis for genome-wide research.

    PubMed

    Steinfath, Matthias; Repsilber, Dirk; Scholz, Matthias; Walther, Dirk; Selbig, Joachim

    2007-01-01

    Integrated data analysis is introduced as the intermediate level of a systems biology approach to analyse different 'omics' datasets, i.e., genome-wide measurements of transcripts, protein levels or protein-protein interactions, and metabolite levels aiming at generating a coherent understanding of biological function. In this chapter we focus on different methods of correlation analyses ranging from simple pairwise correlation to kernel canonical correlation which were recently applied in molecular biology. Several examples are presented to illustrate their application. The input data for this analysis frequently originate from different experimental platforms. Therefore, preprocessing steps such as data normalisation and missing value estimation are inherent to this approach. The corresponding procedures, potential pitfalls and biases, and available software solutions are reviewed. The multiplicity of observations obtained in omics-profiling experiments necessitates the application of multiple testing correction techniques.
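
    Of the correlation methods surveyed, canonical correlation is the one that links two whole 'omics' blocks at once. Below is a small sketch using scikit-learn's linear CCA as a simpler stand-in for the kernel variant discussed in the chapter; the data shapes are hypothetical and the matrices are random placeholders.

        import numpy as np
        from sklearn.cross_decomposition import CCA

        rng = np.random.default_rng(1)
        transcripts = rng.normal(size=(50, 30))  # 50 samples x 30 transcripts
        metabolites = rng.normal(size=(50, 20))  # 50 samples x 20 metabolites

        cca = CCA(n_components=2)
        u, v = cca.fit_transform(transcripts, metabolites)
        # Canonical correlations = correlations of the paired latent scores:
        canon_r = [np.corrcoef(u[:, i], v[:, i])[0, 1] for i in range(2)]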

  3. Demosaicing images from colour cameras for digital image correlation

    NASA Astrophysics Data System (ADS)

    Forsey, A.; Gungor, S.

    2016-11-01

    Digital image correlation is not the intended use for consumer colour cameras, but with care they can be successfully employed in such a role. The main obstacle is the sparsely sampled colour data caused by the use of a colour filter array (CFA) to separate the colour channels. It is shown that the method used to convert consumer camera raw files into a monochrome image suitable for digital image correlation (DIC) can have a significant effect on the DIC output. A number of widely available software packages and two in-house methods are evaluated in terms of their performance when used with DIC. Using an in-plane rotating disc to produce a highly constrained displacement field, it was found that the bicubic spline based in-house demosaicing method outperformed the other methods in terms of accuracy and aliasing suppression.
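
    The key preprocessing step is turning the sparsely sampled CFA data into a full monochrome grid. A crude stand-in for the paper's bicubic-spline method is to interpolate only the green sites of an RGGB mosaic (an assumed layout; real raws require their true CFA pattern), as sketched below.

        import numpy as np
        from scipy.interpolate import griddata

        def green_to_mono(raw):
            """Monochrome image for DIC from the green sites of an RGGB Bayer raw.

            Green occupies the checkerboard where (row + col) is odd in an
            RGGB layout; values elsewhere are interpolated (cubic).
            """
            h, w = raw.shape
            yy, xx = np.mgrid[0:h, 0:w]
            green = (yy + xx) % 2 == 1
            pts = np.column_stack([yy[green], xx[green]])
            return griddata(pts, raw[green], (yy, xx),
                            method="cubic", fill_value=0.0)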

  4. Estimation of Rank Correlation for Clustered Data

    PubMed Central

    Rosner, Bernard; Glynn, Robert

    2017-01-01

    It is well known that the sample correlation coefficient (Rxy) is the maximum likelihood estimator (MLE) of the Pearson correlation (ρxy) for i.i.d. bivariate normal data. However, this is not true for ophthalmologic data where X (e.g., visual acuity) and Y (e.g., visual field) are available for each eye and there is positive intraclass correlation for both X and Y in fellow eyes. In this paper, we provide a regression-based approach for obtaining the MLE of ρxy for clustered data, which can be implemented using standard mixed effects model software. This method is also extended to allow for estimation of partial correlation by controlling both X and Y for a vector U of other covariates. In addition, these methods can be extended to allow for estimation of rank correlation for clustered data by (a) converting ranks of both X and Y to the probit scale, (b) estimating the Pearson correlation between probit scores for X and Y, and (c) using the relationship between Pearson and rank correlation for bivariate normally distributed data. The validity of the methods in finite-sized samples is supported by simulation studies. Finally, two examples from ophthalmology and analgesic abuse are used to illustrate the methods. PMID:28399615
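
    Steps (a)-(c) of the rank-correlation extension can be sketched directly; the version below ignores the clustering adjustment (which the paper handles through the mixed-effects MLE) and only illustrates the probit-and-convert recipe.

        import numpy as np
        from scipy.stats import norm, rankdata

        def probit_scores(x):
            """Steps (a)-(b): ranks -> (0,1) -> normal (probit) scale."""
            n = len(x)
            return norm.ppf(rankdata(x) / (n + 1))

        def rank_correlation(x, y):
            zx, zy = probit_scores(x), probit_scores(y)
            r = np.corrcoef(zx, zy)[0, 1]         # Pearson r on the probit scale
            # Step (c): Pearson -> Spearman relation for bivariate normal data
            return (6.0 / np.pi) * np.arcsin(r / 2.0)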

  5. Acquisition of multiple image stacks with a confocal laser scanning microscope

    NASA Astrophysics Data System (ADS)

    Zuschratter, Werner; Steffen, Thomas; Braun, Katharina; Herzog, Andreas; Michaelis, Bernd; Scheich, Henning

    1998-06-01

    Image acquisition at high magnification is inevitably associated with a limited view over the entire tissue section. To overcome this limitation we designed software for multiple image-stack acquisition (3D-MISA) in confocal laser scanning microscopy (CLSM). The system consists of a 4-channel Leica CLSM equipped with a high-resolution z-scanning stage mounted on an xy-motorized stage. The 3D-MISA software is implemented in the microscope scanning software and uses the microscope settings for the movements of the xy-stage. It allows storage and recall of 70 xyz-positions and the automatic 3D scanning of image arrays between selected xyz-coordinates. The number of images within one array is limited only by the amount of disk space or memory available. Although for most applications the accuracy of the xy-scanning stage is sufficient for a precise alignment of tiled views, the software provides the possibility of an adjustable overlap between two image stacks by shifting the moving steps of the xy-scanning stage. After scanning, a tiled image gallery of the extended-focus images of each channel is displayed on a graphics monitor. In addition, a tiled image gallery of individual focal planes can be created. In summary, 3D-MISA allows 3D image acquisition of coherent regions in combination with high resolution of single images.

  6. A new approach to aid the characterisation and identification of metabolites of a model drug; partial isotope enrichment combined with novel formula elucidation software.

    PubMed

    Hobby, Kirsten; Gallagher, Richard T; Caldwell, Patrick; Wilson, Ian D

    2009-01-01

    This work describes the identification of 'isotopically enriched' metabolites of 4-cyanoaniline using the unique features of the software package 'Spectral Simplicity'. The software is capable of creating the theoretical mass spectra for partially isotope-enriched compounds, and subsequently performing an elemental composition analysis to give the elemental formula for the 'isotopically enriched' metabolite. A novel mass spectral correlation method, called 'FuzzyFit', was employed. 'FuzzyFit' utilises the expected experimental distribution of errors in both mass accuracy and isotope pattern and enables discrimination between statistically probable and improbable candidate formulae. The software correctly determined the molecular formulae of ten previously described metabolites of 4-cyanoaniline confirming the technique of partial isotope enrichment can produce results analogous to standard methodologies. Six previously unknown species were also identified, based on the presence of the unique 'designer' isotope ratio. Three of the unknowns were tentatively identified as N-acetylglutamine, O-methyl-N acetylglucuronide and a putative fatty acid conjugate. The discovery of a significant number of unknown species of a model drug with a comprehensive history of investigation highlights the potential for enhancement to the analytical process by the use of 'designer' isotope ratio compounds. The 'FuzzyFit' methodology significantly aided the elucidation of candidate formulae, by provision of a vastly simplified candidate formula data set. Copyright (c) 2008 John Wiley & Sons, Ltd.

  7. Digitized hand-wrist radiographs: comparison of subjective and software-derived image quality at various compression ratios.

    PubMed

    McCord, Layne K; Scarfe, William C; Naylor, Rachel H; Scheetz, James P; Silveira, Anibal; Gillespie, Kevin R

    2007-05-01

    The objectives of this study were to compare the effect of JPEG 2000 compression of hand-wrist radiographs on observers' qualitative assessment of image quality and to compare this with a software-derived quantitative image quality index. Fifteen hand-wrist radiographs were digitized and saved as TIFF and JPEG 2000 images at 4 levels of compression (20:1, 40:1, 60:1, and 80:1). The images, including rereads, were viewed by 13 orthodontic residents who rated image quality on a scale of 1 to 5. A quantitative analysis was also performed using readily available software based on the human visual system (Image Quality Measure Computer Program, version 6.2, Mitre, Bedford, Mass). ANOVA was used to determine the optimal compression level (P ≤ .05). When we compared subjective indexes, JPEG 2000 compression greater than 60:1 significantly reduced image quality. When we used quantitative indexes, the JPEG 2000 images had lower quality at all compression ratios compared with the original TIFF images. There was excellent correlation (R² > 0.92) between qualitative and quantitative indexes. Image Quality Measure indexes are more sensitive than subjective image quality assessments in quantifying image degradation with compression. There is potential for this software-based quantitative method in determining the optimal compression ratio for any image without the use of subjective raters.

  8. Performance of new automated transthoracic three-dimensional echocardiographic software for left ventricular volumes and function assessment in routine clinical practice: Comparison with 3 Tesla cardiac magnetic resonance.

    PubMed

    Levy, Franck; Dan Schouver, Elie; Iacuzio, Laura; Civaia, Filippo; Rusek, Stephane; Dommerc, Carinne; Marechaux, Sylvestre; Dor, Vincent; Tribouilloy, Christophe; Dreyfus, Gilles

    2017-11-01

    Three-dimensional (3D) transthoracic echocardiography (TTE) is superior to two-dimensional Simpson's method for assessment of left ventricular (LV) volumes and LV ejection fraction (LVEF). Nevertheless, 3D TTE is not incorporated into everyday practice, as current LV chamber quantification software products are time-consuming. To evaluate the feasibility, accuracy and reproducibility of new fully automated fast 3D TTE software (HeartModel A.I.; Philips Healthcare, Andover, MA, USA) for quantification of LV volumes and LVEF in routine practice; to compare the 3D LV volumes and LVEF obtained with a cardiac magnetic resonance (CMR) reference; and to optimize automated default border settings with CMR as reference. Sixty-three consecutive patients, who had comprehensive 3D TTE and CMR examinations within 24 hours, were eligible for inclusion. Nine patients (14%) were excluded because of insufficient echogenicity in the 3D TTE. Thus, 54 patients (40 men; mean age 63 ± 13 years) were prospectively included into the study. The inter- and intraobserver reproducibilities of 3D TTE were excellent (coefficient of variation < 10%) for end-diastolic volume (EDV), end-systolic volume (ESV) and LVEF. Despite a slight underestimation of EDV using 3D TTE compared with CMR (bias = -22 ± 34 mL; P < 0.0001), a significant correlation was found between the two measurements (r = 0.93; P = 0.0001). Enlarging default border detection settings leads to frequent volume overestimation in the general population, but improved agreement with CMR in patients with LVEF ≤ 50%. Correlations between 3D TTE and CMR for ESV and LVEF were excellent (r = 0.93 and r = 0.91, respectively; P < 0.0001). 3D TTE using new-generation fully automated software is a feasible, fast, reproducible and accurate imaging modality for LV volumetric quantification in routine practice. Optimization of border detection settings may increase agreement with CMR for EDV assessment in dilated ventricles. Copyright © 2017 Elsevier Masson SAS. All rights reserved.

  9. Quantitative assessment of primary mitral regurgitation using left ventricular volumes obtained with new automated three-dimensional transthoracic echocardiographic software: A comparison with 3-Tesla cardiac magnetic resonance.

    PubMed

    Levy, Franck; Marechaux, Sylvestre; Iacuzio, Laura; Schouver, Elie Dan; Castel, Anne Laure; Toledano, Manuel; Rusek, Stephane; Dor, Vincent; Tribouilloy, Christophe; Dreyfus, Gilles

    2018-03-30

    Quantitative assessment of primary mitral regurgitation (MR) using left ventricular (LV) volumes obtained with three-dimensional transthoracic echocardiography (3D TTE) recently showed encouraging results. Nevertheless, 3D TTE is not incorporated into everyday practice, as current LV chamber quantification software products are time consuming. To investigate the accuracy and reproducibility of new automated fast 3D TTE software (HeartModel A.I.; Philips Healthcare, Andover, MA, USA) for the quantification of LV volumes and MR severity in patients with isolated degenerative primary MR; and to compare regurgitant volume (RV) obtained with 3D TTE with a cardiac magnetic resonance (CMR) reference. Fifty-three patients (37 men; mean age 64 ± 12 years) with at least mild primary isolated MR, and having comprehensive 3D TTE and CMR studies within 24 h, were eligible for inclusion. MR RV was calculated using the proximal isovelocity surface area (PISA) method and the volumetric method (total LV stroke volume minus aortic stroke volume) with either CMR or 3D TTE. Inter- and intraobserver reproducibility of 3D TTE was excellent (coefficient of variation ≤ 10%) for LV volumes. MR RV was similar using CMR and 3D TTE (57 ± 23 mL vs 56 ± 28 mL; P = 0.22), but was significantly higher using the PISA method (69 ± 30 mL; P < 0.05 compared with CMR and 3D TTE). The PISA method consistently overestimated MR RV compared with CMR (bias 12 ± 21 mL), while no significant bias was found between 3D TTE and CMR (bias 2 ± 14 mL). Concordance between echocardiography and CMR was higher using 3D TTE MR grading (intraclass correlation coefficient [ICC] = 0.89) than with PISA MR grading (ICC = 0.78). Complete agreement with CMR grading was more frequent with 3D TTE than with the PISA method (76% vs 63%). 3D TTE RV assessment using the new generation of automated software correlates well with CMR in patients with isolated degenerative primary MR. Copyright © 2018 Elsevier Masson SAS. All rights reserved.

  10. Training Children in Pedestrian Safety: Distinguishing Gains in Knowledge from Gains in Safe Behavior

    PubMed Central

    McClure, Leslie A.

    2014-01-01

    Pedestrian injuries contribute greatly to child morbidity and mortality. Recent evidence suggests that training within virtual pedestrian environments may improve children’s street crossing skills, but may not convey knowledge about safety in street environments. We hypothesized that (a) children will gain pedestrian safety knowledge via videos/software/internet websites, but not when trained by virtual pedestrian environment or other strategies; (b) pedestrian safety knowledge will be associated with safe pedestrian behavior both before and after training; and (c) increases in knowledge will be associated with increases in safe behavior among children trained individually at streetside locations, but not those trained by means of other strategies. We analyzed data from a randomized controlled trial evaluating pedestrian safety training. We randomly assigned 240 children ages 7–8 to one of four training conditions: videos/software/internet, virtual reality (VR), individualized streetside instruction, or a no-contact control. Both virtual and field simulations of street crossing at 2-lane bi-directional mid-block locations assessed pedestrian behavior at baseline, post-training, and 6-month follow-up. Pedestrian knowledge was assessed orally on all three occasions. Children trained by videos/software/internet, and those trained individually, showed increased knowledge following training relative to children in the other groups (ps < 0.01). Correlations between pedestrian safety knowledge and pedestrian behavior were mostly non-significant. Correlations between change in knowledge and change in behavior from pre- to post-intervention also were non-significant, both for the full sample and within conditions. Children trained using videos/software/internet gained knowledge but did not change their behavior. Children trained individually gained in both knowledge and safer behavior. Children trained virtually gained in safer behavior but not knowledge. If VR is used for training, tools like videos/internet might effectively supplement training. We discovered few associations between knowledge and behavior, and none between changes in knowledge and behavior. Pedestrian safety knowledge and safe pedestrian behavior may be orthogonal constructs that should be considered independently for research and training purposes. PMID:24573688

  11. Speech Spectrum's Correlation with Speakers' Eysenck Personality Traits

    PubMed Central

    Hu, Chao; Wang, Qiandong; Short, Lindsey A.; Fu, Genyue

    2012-01-01

    The current study explored the correlation between speakers' Eysenck personality traits and speech spectrum parameters. Forty-six subjects completed the Eysenck Personality Questionnaire. They were instructed to verbally answer the questions shown on a computer screen and their responses were recorded by the computer. Spectrum parameters of /sh/ and /i/ were analyzed by Praat voice software. Formant frequencies of the consonant /sh/ in lying responses were significantly lower than that in truthful responses, whereas no difference existed on the vowel /i/ speech spectrum. The second formant bandwidth of the consonant /sh/ speech spectrum was significantly correlated with the personality traits of Psychoticism, Extraversion, and Neuroticism, and the correlation differed between truthful and lying responses, whereas the first formant frequency of the vowel /i/ speech spectrum was negatively correlated with Neuroticism in both response types. The results suggest that personality characteristics may be conveyed through the human voice, although the extent to which these effects are due to physiological differences in the organs associated with speech or to a general Pygmalion effect is yet unknown. PMID:22439014

  12. Robust image alignment for cryogenic transmission electron microscopy.

    PubMed

    McLeod, Robert A; Kowal, Julia; Ringler, Philippe; Stahlberg, Henning

    2017-03-01

    Cryo-electron microscopy recently experienced great improvements in structure resolution due to direct electron detectors with improved contrast and fast read-out enabling single-electron counting. High frame rates enabled dose fractionation, where a long exposure is broken into a movie, permitting specimen drift to be registered and corrected. The typical approach for image registration, with high shot noise and low contrast, is multi-reference (MR) cross-correlation. Here we present the software package Zorro, which provides robust drift correction for dose fractionation by use of an intensity-normalized cross-correlation and a logistic noise model to weight each cross-correlation in the MR model and filter each cross-correlation optimally. Frames are reliably registered by Zorro at low dose and defocus. Methods to evaluate performance are presented, using independently evaluated even- and odd-frame stacks, by trajectory comparison and Fourier ring correlation. Alignment of tiled sub-frames is also introduced and demonstrated on an example dataset. Zorro source code is available at github.com/CINA/zorro. Copyright © 2016 Elsevier Inc. All rights reserved.
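
    The core registration step is an intensity-normalized cross-correlation between frames, computed efficiently with FFTs. The sketch below recovers an integer-pixel shift against a single reference; Zorro's multi-reference weighting, logistic noise model, and optimal filtering are deliberately omitted.

        import numpy as np

        def frame_shift(ref, frame):
            """Shift (dy, dx) that realigns `frame` with `ref`, found via FFT
            cross-correlation of intensity-normalized images."""
            a = (ref - ref.mean()) / ref.std()
            b = (frame - frame.mean()) / frame.std()
            xc = np.fft.ifft2(np.fft.fft2(a) * np.conj(np.fft.fft2(b))).real
            peak = np.unravel_index(np.argmax(xc), xc.shape)
            # Unwrap cyclic peak coordinates to signed shifts:
            return tuple(p if p <= s // 2 else p - s
                         for p, s in zip(peak, xc.shape))

        # Apply with: aligned = np.roll(frame, frame_shift(ref, frame), axis=(0, 1))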

  13. Spurious but systematic correlations in functional connectivity MRI networks arise from subject motion

    PubMed Central

    Power, Jonathan D; Barnes, Kelly A; Snyder, Abraham Z; Schlaggar, Bradley L; Petersen, Steven E

    2011-01-01

    Here, we demonstrate that subject motion produces substantial changes in the timecourses of resting state functional connectivity MRI (rs-fcMRI) data despite compensatory spatial registration and regression of motion estimates from the data. These changes cause systematic but spurious correlation structures throughout the brain. Specifically, many long-distance correlations are decreased by subject motion, whereas many short-distance correlations are increased. These changes in rs-fcMRI correlations do not arise from, nor are they adequately countered by, some common functional connectivity processing steps. Two indices of data quality are proposed, and a simple method to reduce motion-related effects in rs-fcMRI analyses is demonstrated that should be flexibly implementable across a variety of software platforms. We demonstrate how application of this technique impacts our own data, modifying previous conclusions about brain development. These results suggest the need for greater care in dealing with subject motion, and the need to critically revisit previous rs-fcMRI work that may not have adequately controlled for effects of transient subject movements. PMID:22019881
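
    One of the two quality indices proposed here is commonly implemented as framewise displacement (FD): the sum of the absolute backward differences of the six realignment parameters, with rotations converted to millimetres on an assumed 50 mm head sphere. A minimal sketch, assuming rotations are stored in radians:

        import numpy as np

        def framewise_displacement(motion, head_radius_mm=50.0):
            """motion: (n_volumes x 6) realignment estimates
            (x, y, z translations in mm; three rotations in radians)."""
            motion = np.asarray(motion, float)
            d = np.abs(np.diff(motion, axis=0))
            d[:, 3:] *= head_radius_mm     # radians -> mm of surface arc
            return np.concatenate([[0.0], d.sum(axis=1)])

        # e.g. flag volumes for scrubbing where FD exceeds a chosen threshold:
        # bad = framewise_displacement(params) > 0.5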

  14. The Correlation Between Dislocations and Vacancy Defects Using Positron Annihilation Spectroscopy

    NASA Astrophysics Data System (ADS)

    Pang, Jinbiao; Li, Hui; Zhou, Kai; Wang, Zhu

    2012-07-01

    An analysis program for positron annihilation lifetime spectra is only applicable to isolated defects, and is of no use in the presence of correlations between defects. Such limitations have long caused problems for positron researchers in their studies of complicated defective systems. To address this problem, we take a semiconductor material as an example and obtain credible average lifetimes for single-crystal silicon under plastic deformation at different temperatures using positron lifetime spectroscopy. By establishing reasonable positron trapping models with defect correlations and sorting out four lifetime components with multiple parameters, as well as their respective intensities, information is obtained on the positron trapping centers, such as the positron trapping rates of defects, the density of the dislocation lines, and the correlation between the dislocation lines and the vacancy defects, by fitting the average lifetime with the aid of Matlab software. These results give strong grounds for the existence of dislocation-vacancy correlation in plastically deformed silicon and lay a theoretical foundation for the analysis of positron lifetime spectra when the positron trapping model involves dislocation-related defects.

  15. Effect of 21-day head down bed rest on urine proteins related to endothelium: Correlations with changes in carbohydrate metabolism

    NASA Astrophysics Data System (ADS)

    Kashirina, D.; Pastushkova, L.; Custaud, M. A.; Dobrokhotov, I.; Brzhozovsky, A.; Navasiolava, N.; Nosovsky, A.; Kononikhin, A.; Nikolaev, E.; Larina, I.

    2017-08-01

    We performed a liquid chromatography-mass spectrometry study of the urine proteome in 8 healthy volunteers aged 20 to 44 years who completed a 21-day head-down bed rest. ANDSystem software, which builds associative networks, was used to identify the urinary proteins functionally related to the endothelium. We identified 7 endothelium-related biological processes directly linked to 13 urine proteins and manually annotated the proteins most important in terms of endothelial function. Analysis of the correlations with biochemical variables revealed a positive correlation between fasting blood glucose and the following urine proteins: albumin, CD44 antigen, endothelial protein C receptor, mucin-1, osteopontin, and receptor tyrosine kinase. We also found a positive correlation between the HOMA insulin resistance index and the urine proteins endothelial protein C receptor and syndecan-4. These results might suggest the involvement of the above-mentioned proteins in glucose metabolism and their participation in the response to changes in blood glucose level.

  16. Space Vehicle Pose Estimation via Optical Correlation and Nonlinear Estimation

    NASA Technical Reports Server (NTRS)

    Rakoczy, John M.; Herren, Kenneth A.

    2008-01-01

    A technique for 6-degree-of-freedom (6DOF) pose estimation of space vehicles is being developed. This technique draws upon recent developments in implementing optical correlation measurements in a nonlinear estimator, which relates the optical correlation measurements to the pose states (orientation and position). For the optical correlator, the use of both conjugate filters and binary, phase-only filters in the design of synthetic discriminant function (SDF) filters is explored. A static neural network is trained a priori and used as the nonlinear estimator. New commercial animation and image rendering software is exploited to design the SDF filters and to generate a large filter set with which to train the neural network. The technique is applied to pose estimation for rendezvous and docking of free-flying spacecraft and to terrestrial surface mobility systems for NASA's Vision for Space Exploration. Quantitative pose estimation performance will be reported. Advantages and disadvantages of the implementation of this technique are discussed.

  17. Space Vehicle Pose Estimation via Optical Correlation and Nonlinear Estimation

    NASA Technical Reports Server (NTRS)

    Rakoczy, John; Herren, Kenneth

    2007-01-01

    A technique for 6-degree-of-freedom (6DOF) pose estimation of space vehicles is being developed. This technique draws upon recent developments in implementing optical correlation measurements in a nonlinear estimator, which relates the optical correlation measurements to the pose states (orientation and position). For the optical correlator, the use of both conjugate filters and binary, phase-only filters in the design of synthetic discriminant function (SDF) filters is explored. A static neural network is trained a priori and used as the nonlinear estimator. New commercial animation and image rendering software is exploited to design the SDF filters and to generate a large filter set with which to train the neural network. The technique is applied to pose estimation for rendezvous and docking of free-flying spacecraft and to terrestrial surface mobility systems for NASA's Vision for Space Exploration. Quantitative pose estimation performance will be reported. Advantages and disadvantages of the implementation of this technique are discussed.

  18. [Descriptive analysis of pelvic asymmetry in an asymptomatic population].

    PubMed

    Barbosa, A C; Bonifácio, D N; Lopes, I P; Martins, F L M; Barbosa, M C S A; Barbosa, A C

    2014-01-01

    Pelvic tilt is clinically assessed based on its relationship with spinal conditions, but there is little evidence from the asymptomatic population for comparison purposes. To analyze an asymptomatic population focusing on pelvic asymmetries using photogrammetry. 92 subjects (18-35 years old) underwent marking of the anterior and posterior iliac spines and were photographed. Alcimage software was used to measure the pelvic tilt angle. Other tests included the Kolmogorov normality test, t test, Wilcoxon test, and Pearson coefficient to measure correlation. 11.96% of males had anteversion and 34.78% normality; 38.04% of females had anteversion and 15.22% normality. Angles between iliacs for bilateral tilt showed no difference, but a difference was seen when one side predominated. For unilateral tilt, a difference between iliacs was seen. A good correlation of predominance versus anteversion was observed, whereas the correlation for side angles was poor. The remaining comparisons showed weak or non-significant correlations. Tilt alone cannot be used to characterize pelvic dysfunction or pathology.

  19. Multivariate meta-analysis using individual participant data

    PubMed Central

    Riley, R. D.; Price, M. J.; Jackson, D.; Wardle, M.; Gueyffier, F.; Wang, J.; Staessen, J. A.; White, I. R.

    2016-01-01

    When combining results across related studies, a multivariate meta-analysis allows the joint synthesis of correlated effect estimates from multiple outcomes. Joint synthesis can improve efficiency over separate univariate syntheses, may reduce selective outcome reporting biases, and enables joint inferences across the outcomes. A common issue is that within-study correlations needed to fit the multivariate model are unknown from published reports. However, provision of individual participant data (IPD) allows them to be calculated directly. Here, we illustrate how to use IPD to estimate within-study correlations, using a joint linear regression for multiple continuous outcomes and bootstrapping methods for binary, survival and mixed outcomes. In a meta-analysis of 10 hypertension trials, we then show how these methods enable multivariate meta-analysis to address novel clinical questions about continuous, survival and binary outcomes; treatment–covariate interactions; adjusted risk/prognostic factor effects; longitudinal data; prognostic and multiparameter models; and multiple treatment comparisons. Both frequentist and Bayesian approaches are applied, with example software code provided to derive within-study correlations and to fit the models. PMID:26099484
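
    As a sketch of the bootstrapping idea for estimating a within-study correlation, both treatment effects can be re-estimated in each resample and the resulting pairs correlated. The code below is a minimal illustration using a simple difference in means as the effect estimate; it is not the authors' code, and the variable names are hypothetical.

        import numpy as np

        rng = np.random.default_rng(0)

        def within_study_correlation(y1, y2, treat, n_boot=2000):
            # y1, y2: two continuous outcomes; treat: 0/1 treatment indicator.
            # Resampling is stratified by arm so neither arm is ever empty.
            g0 = np.where(treat == 0)[0]
            g1 = np.where(treat == 1)[0]
            effects = np.empty((n_boot, 2))
            for b in range(n_boot):
                i0 = rng.choice(g0, size=g0.size)
                i1 = rng.choice(g1, size=g1.size)
                # Difference in means as a simple per-outcome effect estimate.
                effects[b, 0] = y1[i1].mean() - y1[i0].mean()
                effects[b, 1] = y2[i1].mean() - y2[i0].mean()
            return np.corrcoef(effects[:, 0], effects[:, 1])[0, 1]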

  20. Effective correlator for RadioAstron project

    NASA Astrophysics Data System (ADS)

    Sergeev, Sergey

    This paper presents the implementation of a software FX correlator for Very Long Baseline Interferometry, adapted for the RadioAstron project. The correlator is implemented on heterogeneous computing systems using graphics accelerators, and it is shown that graphics hardware is highly efficient for the interferometry task. The host processor of the heterogeneous system forms the data streams for the graphics accelerators, whose number corresponds to the number of frequency channels; for the RadioAstron project there are seven such channels. Each accelerator computes the correlation matrix for all baselines in a single frequency channel. The input data are converted to floating-point format and corrected with the corresponding delay function, and the entire correlation matrix is computed simultaneously using a sliding Fourier transform. Thanks to the good match between this problem and the architecture of graphics accelerators, a single GPU of the Kepler platform achieves performance on this task corresponding to that of a four-node Intel computing cluster. The task scales successfully not only to a large number of graphics accelerators but also to a large number of nodes with multiple accelerators.
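
    The FX principle described here (Fourier transform first, then cross-multiplication per baseline) can be sketched compactly. The code below is a plain NumPy illustration of the data flow only; the GPU kernels, delay correction, and sliding Fourier transform of the actual correlator are not reproduced.

        import numpy as np

        def fx_correlate(streams, nfft=1024):
            # streams: dict of station name -> 1-D baseband samples, assumed
            # already delay-corrected and of equal length.
            # F step: channelize each station's data with an FFT per segment.
            spectra = {s: np.fft.rfft(d[:len(d) // nfft * nfft].reshape(-1, nfft), axis=1)
                       for s, d in streams.items()}
            stations = sorted(spectra)
            visibilities = {}
            # X step: cross-multiply spectra and time-average per baseline.
            for i, a in enumerate(stations):
                for b in stations[i + 1:]:
                    visibilities[(a, b)] = (spectra[a] * np.conj(spectra[b])).mean(axis=0)
            return visibilities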

  1. The reliability of Little's Irregularity Index for the upper dental arch using three dimensional (3D) digital models.

    PubMed

    Burns, Angus; Dowling, Adam H; Garvey, Thérèse M; Fleming, Garry J P

    2014-10-01

    To investigate the inter-examiner variability of contact point displacement measurements (used to calculate the overall Little's Irregularity Index (LII) score) from digital models of the maxillary arch by four independent examiners. Maxillary orthodontic pre-treatment study models of ten patients were scanned using the Lava(tm) Chairside Oral Scanner (LCOS) and 3D digital models were created using Creo(®) computer aided design (CAD) software. Four independent examiners measured the contact point displacements of the anterior maxillary teeth using the software. Measurements were recorded randomly on three separate occasions by the examiners and the measurements (n=600) obtained were analysed using correlation analyses and analyses of variance (ANOVA). LII contact point displacement measurements for the maxillary arch were reproducible for inter-examiner assessment when using the digital method and were highly correlated between examiner pairs for contact point displacement measurements >2mm. The digital measurement technique showed poor correlation for smaller contact point displacement measurements (<2mm) for repeated measurements. The coefficient of variation (CoV) of the digital contact point displacement measurements highlighted 348 of the 600 measurements differed by more than 20% of the mean compared with 516 of 600 for the same measurements performed using the conventional LII measurement technique. Although the inter-examiner variability of LII contact point displacement measurements on the maxillary arch was reduced using the digital compared with the conventional LII measurement methodology, neither method was considered appropriate for orthodontic research purposes particularly when measuring small contact point displacements. Copyright © 2014 Elsevier Ltd. All rights reserved.

  2. Automated Real-Time Behavioral and Physiological Data Acquisition and Display Integrated with Stimulus Presentation for fMRI

    PubMed Central

    Voyvodic, James T.; Glover, Gary H.; Greve, Douglas; Gadde, Syam

    2011-01-01

    Functional magnetic resonance imaging (fMRI) is based on correlating blood oxygen-level dependent (BOLD) signal fluctuations in the brain with other time-varying signals. Although the most common reference for correlation is the timing of a behavioral task performed during the scan, many other behavioral and physiological variables can also influence fMRI signals. Variations in cardiac and respiratory functions in particular are known to contribute significant BOLD signal fluctuations. Variables such as skin conduction, eye movements, and other measures that may be relevant to task performance can also be correlated with BOLD signals and can therefore be used in image analysis to differentiate multiple components in complex brain activity signals. Combining real-time recording and data management of multiple behavioral and physiological signals in a way that can be routinely used with any task stimulus paradigm is a non-trivial software design problem. Here we discuss software methods that allow users control of paradigm-specific audio–visual or other task stimuli combined with automated simultaneous recording of multi-channel behavioral and physiological response variables, all synchronized with sub-millisecond temporal accuracy. We also discuss the implementation and importance of real-time display feedback to ensure data quality of all recorded variables. Finally, we discuss standards and formats for storage of temporal covariate data and its integration into fMRI image analysis. These neuroinformatics methods have been adopted for behavioral task control at all sites in the Functional Biomedical Informatics Research Network (FBIRN) multi-center fMRI study. PMID:22232596

  3. Meningiomas: Objective assessment of proliferative indices by immunohistochemistry and automated counting method.

    PubMed

    Chavali, Pooja; Uppin, Megha S; Uppin, Shantveer G; Challa, Sundaram

    2017-01-01

    The most reliable histological correlate of recurrence risk in meningiomas is increased mitotic activity. The proliferative index with Ki-67 immunostaining is a helpful adjunct to manual counting. However, both show considerable inter-observer variability. A new immunohistochemical method for counting mitotic figures, using an antibody against the phosphohistone H3 (PHH3) protein, was introduced. Similarly, computer-based automated counting for the Ki-67 labelling index (LI) is available. To study the use of these new techniques in the objective assessment of proliferation indices in meningiomas. This was a retrospective study of intracranial meningiomas diagnosed during the year 2013. The hematoxylin and eosin (H and E) sections and immunohistochemistry (IHC) with Ki-67 were reviewed by two pathologists. Photomicrographs of the representative areas were subjected to Ki-67 analysis by Immunoratio (IR) software. Mean Ki-67 LI values, both manual and by IR, were calculated. IHC with PHH3 was performed. PHH3-positive nuclei were counted and mean values calculated. Data analysis was done using SPSS software. A total of 64 intracranial meningiomas were diagnosed. Evaluation on H and E, PHH3, and Ki-67 LI (both manual and IR) was done in 32 cases (22 grade I and 10 grade II meningiomas). A statistically significant correlation was seen between the mitotic count in each grade and PHH3 values, and also between the grade of the tumor and the values of Ki-67 and PHH3. Both techniques correlated well with, and offered advantages over, the existing methods and hence can be applied in routine use.

  4. The design of a fast Fourier filter for enhancing diagnostically relevant structures - endodontic files.

    PubMed

    Bruellmann, Dan; Sander, Steven; Schmidtmann, Irene

    2016-05-01

    The endodontic working length is commonly determined by electronic apex locators and intraoral periapical radiographs. No algorithms for the automatic detection of endodontic files in dental radiographs have been described in the recent literature. Teeth from the mandibles of pig cadavers were accessed, and digital radiographs of these specimens were obtained using an optical bench. The specimens were then recorded in identical positions and settings after the insertion of endodontic files of known sizes (ISO sizes 10-15). The frequency bands generated by the endodontic files were determined using fast Fourier transforms (FFTs) to convert the resulting images into frequency spectra. The detected frequencies were used to design a pre-segmentation filter, which was programmed using Delphi XE RAD Studio software (Embarcadero Technologies, San Francisco, USA) and tested on 20 radiographs. For performance evaluation purposes, the gauged lengths (measured with a caliper) of visible endodontic files were measured in the native and filtered images. The software was able to segment the endodontic files in both the samples and similar dental radiographs. We observed median length differences of 0.52 mm (SD: 2.76 mm) and 0.46 mm (SD: 2.33 mm) in the native and post-segmentation images, respectively. Pearson's correlation test revealed a significant correlation of 0.915 between the true length and the measured length in the native images; the corresponding correlation for the filtered images was 0.97 (p=0.0001). The algorithm can be used to automatically detect and measure the lengths of endodontic files in digital dental radiographs. Copyright © 2016 Elsevier Ltd. All rights reserved.
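
    A pre-segmentation filter of the kind described, passing only the frequency band generated by the files, can be sketched as a radial band-pass mask in the 2D Fourier domain. The band limits below are placeholders, not the values measured in the study, and the sketch is in Python rather than the authors' Delphi implementation.

        import numpy as np

        def fft_bandpass(image, lo=0.05, hi=0.25):
            # lo, hi: normalized radial spatial frequencies (cycles/pixel)
            # kept by the mask; placeholder values for illustration.
            F = np.fft.fftshift(np.fft.fft2(image))
            ny, nx = image.shape
            y, x = np.ogrid[-(ny // 2):ny - ny // 2, -(nx // 2):nx - nx // 2]
            radius = np.hypot(y / ny, x / nx)
            mask = (radius >= lo) & (radius <= hi)
            return np.real(np.fft.ifft2(np.fft.ifftshift(F * mask)))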

  5. TACT: A Set of MSC/PATRAN- and MSC/NASTRAN-based Modal Correlation Tools

    NASA Technical Reports Server (NTRS)

    Marlowe, Jill M.; Dixon, Genevieve D.

    1998-01-01

    This paper describes the functionality and demonstrates the utility of the Test Analysis Correlation Tools (TACT), a suite of MSC/PATRAN Command Language (PCL) tools which automate the process of correlating finite element models to modal survey test data. The initial release of TACT provides a basic yet complete set of tools for performing correlation totally inside the PATRAN/NASTRAN environment. Features include a step-by-step menu structure, pre-test accelerometer set evaluation and selection, analysis and test result export/import in Universal File Format, calculation of frequency percent difference and cross-orthogonality correlation results using NASTRAN, creation and manipulation of mode pairs, and five different ways of viewing synchronized animations of analysis and test modal results. For the PATRAN-based analyst, TACT eliminates the repetitive, time-consuming and error-prone steps associated with transferring finite element data to a third-party modal correlation package, which allows the analyst to spend more time on the more challenging task of model updating. The usefulness of this software is presented using a case history, the correlation for a NASA Langley Research Center (LaRC) low aspect ratio research wind tunnel model. To demonstrate the improvements that TACT offers the MSC/PATRAN- and MSC/NASTRAN-based structural analysis community, a comparison of the modal correlation process using TACT within PATRAN versus external third-party modal correlation packages is presented.
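
    The cross-orthogonality check mentioned above is conventionally the mass-weighted product of the test and analysis mode-shape matrices, with values near 1 on the diagonal and near 0 elsewhere indicating good correlation. A minimal sketch of that computation follows; it is an illustration in Python, not TACT's PCL/NASTRAN implementation.

        import numpy as np

        def cross_orthogonality(phi_test, phi_fem, mass):
            # phi_test, phi_fem: (n_dof, m) test and FEM mode-shape matrices;
            # mass: (n_dof, n_dof) mass matrix (e.g., test-analysis reduced).
            num = phi_test.T @ mass @ phi_fem
            d_test = np.sqrt(np.diag(phi_test.T @ mass @ phi_test))
            d_fem = np.sqrt(np.diag(phi_fem.T @ mass @ phi_fem))
            return np.abs(num) / np.outer(d_test, d_fem)  # (m, m) matrix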

  6. [Design of a pulse oximeter for low perfusion and low oxygen saturation].

    PubMed

    Tan, Shuangping; Ai, Zhiguang; Yang, Yuxing; Xie, Qingguo

    2013-05-01

    This paper presents a new pulse oximeter for low perfusion down to 0.125% and a wide oxygen saturation range from 35% to 100%. In order to acquire the best PPG signals, a variable gain amplifier (VGA) is adopted in the hardware. A self-developed auto-correlation modeling method is adopted in the software; it can extract the pulse wave from low-perfusion signals and partly remove motion artifacts.
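
    The auto-correlation modeling method itself is not detailed in the abstract; as a generic illustration of why autocorrelation helps here, a pulse rate can be recovered from a noisy, low-perfusion PPG segment by locating the first autocorrelation peak in a physiologically plausible lag range. The sketch below is an assumption-laden stand-in, not the authors' method.

        import numpy as np

        def pulse_rate_bpm(ppg, fs):
            # ppg: photoplethysmogram samples; fs: sampling rate in Hz.
            # Assumes the segment spans at least a few pulse periods.
            x = ppg - np.mean(ppg)
            ac = np.correlate(x, x, mode='full')[len(x) - 1:]
            lo = int(fs * 60 / 220)  # lag of fastest plausible pulse (220 bpm)
            hi = int(fs * 60 / 30)   # lag of slowest plausible pulse (30 bpm)
            lag = lo + np.argmax(ac[lo:hi])
            return 60.0 * fs / lag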

  7. Colometer: a real-time quality feedback system for screening colonoscopy.

    PubMed

    Filip, Dobromir; Gao, Xuexin; Angulo-Rodríguez, Leticia; Mintchev, Martin P; Devlin, Shane M; Rostom, Alaa; Rosen, Wayne; Andrews, Christopher N

    2012-08-28

    To investigate the performance of a new software-based colonoscopy quality assessment system. The software-based system employs a novel image processing algorithm which detects the levels of image clarity, withdrawal velocity, and level of the bowel preparation in a real-time fashion from live video signal. Threshold levels of image blurriness and the withdrawal velocity below which the visualization could be considered adequate have initially been determined arbitrarily by review of sample colonoscopy videos by two experienced endoscopists. Subsequently, an overall colonoscopy quality rating was computed based on the percentage of the withdrawal time with adequate visualization (scored 1-5; 1, when the percentage was 1%-20%; 2, when the percentage was 21%-40%, etc.). In order to test the proposed velocity and blurriness thresholds, screening colonoscopy withdrawal videos from a specialized ambulatory colon cancer screening center were collected, automatically processed and rated. Quality ratings on the withdrawal were compared to the insertion in the same patients. Then, 3 experienced endoscopists reviewed the collected videos in a blinded fashion and rated the overall quality of each withdrawal (scored 1-5; 1, poor; 3, average; 5, excellent) based on 3 major aspects: image quality, colon preparation, and withdrawal velocity. The automated quality ratings were compared to the averaged endoscopist quality ratings using Spearman correlation coefficient. Fourteen screening colonoscopies were assessed. Adenomatous polyps were detected in 4/14 (29%) of the collected colonoscopy video samples. As a proof of concept, the Colometer software rated colonoscope withdrawal as having better visualization than the insertion in the 10 videos which did not have any polyps (average percent time with adequate visualization: 79% ± 5% for withdrawal and 50% ± 14% for insertion, P < 0.01). Withdrawal times during which no polyps were removed ranged from 4-12 min. The median quality rating from the automated system and the reviewers was 3.45 [interquartile range (IQR), 3.1-3.68] and 3.00 (IQR, 2.33-3.67) respectively for all colonoscopy video samples. The automated rating revealed a strong correlation with the reviewer's rating (ρ coefficient= 0.65, P = 0.01). There was good correlation of the automated overall quality rating and the mean endoscopist withdrawal speed rating (Spearman r coefficient= 0.59, P = 0.03). There was no correlation of automated overall quality rating with mean endoscopists image quality rating (Spearman r coefficient= 0.41, P = 0.15). The results from a novel automated real-time colonoscopy quality feedback system strongly agreed with the endoscopists' quality assessments. Further study is required to validate this approach.

  8. Changes in Sunken Eyes Combined with Blepharoptosis after Levator Resection.

    PubMed

    Mawatari, Yuki; Fukushima, Mikiko; Kawaji, Takahiro

    2017-12-01

    This study aims to report the changes in sunken eyes combined with blepharoptosis after levator resection. Analysis involved 60 eyes from 32 patients with sunken eyes combined with blepharoptosis. Advancement of the levator aponeurosis and the Müller's muscle complex (levator resection) was performed in these patients. Area of upper eyelid sulcus (AES) was defined as the area of the upper eyelid shadow. The digital images were converted to black and white using image-processing software (Adobe Photoshop), and the AES was calculated using ImageJ software. In addition, margin reflex distance, eyebrow height (EBH), and AES were measured before and 3 months after surgery to assess the changes in the eyelids. Preoperative AES was significantly correlated to age ( P < 0.0001; r = 0.8062). Sunken eyes were remarkably improved after levator resection in all patients. Mean margin reflex distance significantly increased, whereas mean EBH and mean AES significantly decreased at 3 months after surgery ( P < 0.0001). The AES change was significantly correlated to the EBH change ( P < 0.0001; r = 0.5184). The principal aim of levator resection is to improve upper eyelid height and visual fields; however, this technique can alter the location of the eyebrow and upper orbital fat. The effects fill the hollowness of the upper eyelid and can remarkably improve sunken eyes.

  9. A comparison between Philips and Tomtec for left ventricular deformation and volume measurements in neonatal intensive care patients.

    PubMed

    de Waal, Koert; Phad, Nilkant

    2018-03-01

    Two-dimensional speckle tracking echocardiography is an emerging technique for analyzing cardiac function in newborns. Strain is a highly reliable and reproducible parameter, and reference values have been established for term and preterm newborns. Its implementation into clinical practice has been slow, partly due to a lack of inter-vendor consistency. Our aim was to compare recent versions of Philips and Tomtec speckle tracking software for deformation and semiautomated volume and area measurements in neonatal intensive care patients. Longitudinal and circumferential deformation and cavity dimensions (volume, area) were determined offline from apical and short-axis images in 50 consecutive newborns with a median birthweight of 760 g (range 460-3200 g). Absolute mean endocardial global longitudinal strain measurements were similar between vendors, but with wide limits of agreement (Philips -18.9 [2.1]%, Tomtec -18.6 [2.5]%, bias -0.3 [1.7]%, limits of agreement -3.6% to 3.1%). Longitudinal strain rate and circumferential measurements showed poor correlation. All volume and area measurements correlated well between the vendors, but with significant bias. Global longitudinal strain measurements compared well between vendors but with wide limits of agreement, suggesting that longitudinal measurements are best performed using the same hardware and software. © 2017, Wiley Periodicals, Inc.
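
    The bias and limits of agreement quoted above follow the standard Bland-Altman convention of mean difference plus or minus 1.96 standard deviations. A minimal sketch of that computation, assuming paired measurements from the two vendors:

        import numpy as np

        def bland_altman(method_a, method_b):
            # Returns the bias and 95% limits of agreement between two
            # paired measurement series (e.g., strain from two packages).
            diff = np.asarray(method_a) - np.asarray(method_b)
            bias = diff.mean()
            half_width = 1.96 * diff.std(ddof=1)
            return bias, (bias - half_width, bias + half_width)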

  10. CGI: Java Software for Mapping and Visualizing Data from Array-based Comparative Genomic Hybridization and Expression Profiling

    PubMed Central

    Gu, Joyce Xiuweu-Xu; Wei, Michael Yang; Rao, Pulivarthi H.; Lau, Ching C.; Behl, Sanjiv; Man, Tsz-Kwong

    2007-01-01

    With the increasing application of various genomic technologies in biomedical research, there is a need to integrate these data to correlate candidate genes/regions that are identified by different genomic platforms. Although there are tools that can analyze data from individual platforms, essential software for integration of genomic data is still lacking. Here, we present a novel Java-based program called CGI (Cytogenetics-Genomics Integrator) that matches the BAC clones from array-based comparative genomic hybridization (aCGH) to genes from RNA expression profiling datasets. The matching is computed via a fast, backend MySQL database containing UCSC Genome Browser annotations. This program also provides an easy-to-use graphical user interface for visualizing and summarizing the correlation of DNA copy number changes and RNA expression patterns from a set of experiments. In addition, CGI uses a Java applet to display the copy number values of a specific BAC clone in aCGH experiments side by side with the expression levels of genes that are mapped back to that BAC clone from the microarray experiments. The CGI program is built on top of extensible, reusable graphic components specifically designed for biologists. It is cross-platform compatible and the source code is freely available under the General Public License. PMID:19936083

  11. CGI: Java software for mapping and visualizing data from array-based comparative genomic hybridization and expression profiling.

    PubMed

    Gu, Joyce Xiuweu-Xu; Wei, Michael Yang; Rao, Pulivarthi H; Lau, Ching C; Behl, Sanjiv; Man, Tsz-Kwong

    2007-10-06

    With the increasing application of various genomic technologies in biomedical research, there is a need to integrate these data to correlate candidate genes/regions that are identified by different genomic platforms. Although there are tools that can analyze data from individual platforms, essential software for integration of genomic data is still lacking. Here, we present a novel Java-based program called CGI (Cytogenetics-Genomics Integrator) that matches the BAC clones from array-based comparative genomic hybridization (aCGH) to genes from RNA expression profiling datasets. The matching is computed via a fast, backend MySQL database containing UCSC Genome Browser annotations. This program also provides an easy-to-use graphical user interface for visualizing and summarizing the correlation of DNA copy number changes and RNA expression patterns from a set of experiments. In addition, CGI uses a Java applet to display the copy number values of a specific BAC clone in aCGH experiments side by side with the expression levels of genes that are mapped back to that BAC clone from the microarray experiments. The CGI program is built on top of extensible, reusable graphic components specifically designed for biologists. It is cross-platform compatible and the source code is freely available under the General Public License.

  12. Relation between diagnosis of atheromatous plaque from orthopantomographs and cardiovascular risk factors. A study of cases and control subjects

    PubMed Central

    Gutierrez-Bonet, Carmen; Leco-Berrocal, Isabel; Fernández-Cáliz, Fernando; Martínez-González, José-María

    2016-01-01

    Background In recent years the use of orthopantomography has been proposed as a low-cost, reliable and non-invasive diagnostic medium for detecting atheromatous plaque. The purpose of this study was to correlate the presence of carotid calcifications (atheroma) in orthopantomographs with specific risk factors for cerebrovascular accidents (previous cerebrovascular accidents, arterial hypertension, and diabetes). Material and Methods The methods used in this observational study of cases and control subjects followed STROBE (Strengthening the Reporting of Observational studies in Epidemiology) recommendations. The study analyzed a total of 1,602 panoramic radiographs taken for dental diagnostic purposes between January 2010 and February 2014. The main variables analyzed were the incidence of atheromatous plaque and other cardiovascular risk factors. Epidat 3.1 statistical software was used to determine minimum sample sizes and the results were analyzed using PASW (Predictive Analytics Software) Statistics 10.0.0. Results For all the variables analyzed, the correlation between radiographic detection of atheromatous plaque and the presence of cardiovascular disease risk factors was found to be statistically significant (RR>1.5). Conclusions The presence of cardiovascular risk factors is related to the incidence of radiopaque lesions at the carotid artery bifurcation, indicating the presence of atheromatous plaque. Key words:Orthopantomography, atheromatous plaque, cerebrovascular accident, diabetes, arterial hypertension. PMID:26595828

  13. Relative azimuth inversion by way of damped maximum correlation estimates

    USGS Publications Warehouse

    Ringler, A.T.; Edwards, J.D.; Hutt, C.R.; Shelly, F.

    2012-01-01

    Horizontal seismic data are utilized in a large number of Earth studies. Such work depends on the published orientations of the sensitive axes of seismic sensors relative to true North. These orientations can be estimated using a number of different techniques: SensOrLoc (Sensitivity, Orientation and Location), comparison to synthetics (Ekstrom and Busby, 2008), or by way of magnetic compass. Current methods for finding relative station azimuths are unable to do so with arbitrary precision quickly because of limitations in the algorithms (e.g. grid search methods). Furthermore, in order to determine instrument orientations during station visits, it is critical that any analysis software be easily run on a large number of different computer platforms and the results be obtained quickly while on site. We developed a new technique for estimating relative sensor azimuths by inverting for the orientation with the maximum correlation to a reference instrument, using a non-linear parameter estimation routine. By making use of overlapping windows, we are able to make multiple azimuth estimates, which helps to identify the confidence of our azimuth estimate, even when the signal-to-noise ratio (SNR) is low. Finally, our algorithm has been written as a stand-alone, platform independent, Java software package with a graphical user interface for reading and selecting data segments to be analyzed.
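
    The inversion can be viewed as a one-parameter optimization: rotate the test instrument's horizontal components by a trial angle and maximize the correlation with the reference instrument. The sketch below (SciPy, with an assumed rotation sign convention) illustrates the idea only; it is not the authors' Java implementation and omits the overlapping-window confidence estimates.

        import numpy as np
        from scipy.optimize import minimize_scalar

        def best_azimuth_deg(ref_north, test_north, test_east):
            # Find the rotation angle that best aligns the test horizontals
            # with the reference north trace (sign convention assumed).
            def neg_corr(theta):
                rotated = np.cos(theta) * test_north + np.sin(theta) * test_east
                return -np.corrcoef(ref_north, rotated)[0, 1]
            result = minimize_scalar(neg_corr, bounds=(0.0, 2 * np.pi),
                                     method='bounded')
            return np.degrees(result.x)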

  14. PrePhyloPro: phylogenetic profile-based prediction of whole proteome linkages

    PubMed Central

    Niu, Yulong; Liu, Chengcheng; Moghimyfiroozabad, Shayan; Yang, Yi

    2017-01-01

    Direct and indirect functional links between proteins as well as their interactions as part of larger protein complexes or common signaling pathways may be predicted by analyzing the correlation of their evolutionary patterns. Based on phylogenetic profiling, here we present a highly scalable and time-efficient computational framework for predicting linkages within the whole human proteome. We have validated this method through analysis of 3,697 human pathways and molecular complexes and a comparison of our results with the prediction outcomes of previously published co-occurrency model-based and normalization methods. Here we also introduce PrePhyloPro, a web-based software that uses our method for accurately predicting proteome-wide linkages. We present data on interactions of human mitochondrial proteins, verifying the performance of this software. PrePhyloPro is freely available at http://prephylopro.org/phyloprofile/. PMID:28875072
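
    The underlying linkage score, correlation of presence/absence patterns across genomes, can be computed for all protein pairs at once with a single matrix product, which is what makes whole-proteome prediction tractable. The sketch below is a minimal illustration, not PrePhyloPro's actual scoring model.

        import numpy as np

        def profile_correlations(profiles):
            # profiles: (n_proteins, n_genomes) binary presence/absence
            # matrix; rows are assumed non-constant so std is nonzero.
            z = profiles - profiles.mean(axis=1, keepdims=True)
            z /= z.std(axis=1, keepdims=True)
            # Pearson correlation of every protein pair at once.
            return (z @ z.T) / profiles.shape[1]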

  15. Novel wavelength diversity technique for high-speed atmospheric turbulence compensation

    NASA Astrophysics Data System (ADS)

    Arrasmith, William W.; Sullivan, Sean F.

    2010-04-01

    The defense, intelligence, and homeland security communities are driving a need for software-dominant, real-time or near-real-time atmospheric-turbulence-compensated imagery. Developments in parallel processing capabilities are finding application in diverse areas including image processing, target tracking, pattern recognition, and image fusion, to name a few. A novel approach to the computationally intensive case of software-dominant optical and near-infrared imaging through atmospheric turbulence is addressed in this paper. Previously, the somewhat conventional wavelength diversity method has been used to compensate for atmospheric turbulence with great success. We apply a new correlation-based approach to the wavelength diversity methodology using a parallel processing architecture, enabling high-speed atmospheric turbulence compensation. Methods for optical imaging through distributed turbulence are discussed, simulation results are presented, and computational and performance assessments are provided.

  16. Computerized Analysis of Digital Photographs for Evaluation of Tooth Movement.

    PubMed

    Toodehzaeim, Mohammad Hossein; Karandish, Maryam; Karandish, Mohammad Nabi

    2015-03-01

    Various methods have been introduced for the evaluation of tooth movement in orthodontics. The challenge is to adopt the most accurate and most beneficial method for patients. This study was designed to introduce analysis of digital photographs with AutoCAD software as a method to evaluate tooth movement and to assess the reliability of this method. Eighteen patients were evaluated in this study. Three intraoral digital images from the buccal view were captured from each patient within a half-hour interval. All the photos were sent to AutoCAD software 2011 and calibrated, and the distances between canine and molar hooks were measured. The data were analyzed using the intraclass correlation coefficient. The photographs were found to have a high reliability coefficient (P > 0.05). The introduced method is an accurate, efficient, and reliable method for the evaluation of tooth movement.

  17. ICER-3D Hyperspectral Image Compression Software

    NASA Technical Reports Server (NTRS)

    Xie, Hua; Kiely, Aaron; Klimesh, matthew; Aranki, Nazeeh

    2010-01-01

    Software has been developed to implement the ICER-3D algorithm. ICER-3D effects progressive, three-dimensional (3D), wavelet-based compression of hyperspectral images. If a compressed data stream is truncated, the progressive nature of the algorithm enables reconstruction of hyperspectral data at fidelity commensurate with the given data volume. The ICER-3D software is capable of providing either lossless or lossy compression, and incorporates an error-containment scheme to limit the effects of data loss during transmission. The compression algorithm, which was derived from the ICER image compression algorithm, includes wavelet-transform, context-modeling, and entropy coding subalgorithms. The 3D wavelet decomposition structure used by ICER-3D exploits correlations in all three dimensions of sets of hyperspectral image data, while facilitating elimination of spectral ringing artifacts, using a technique summarized in "Improving 3D Wavelet-Based Compression of Spectral Images" (NPO-41381), NASA Tech Briefs, Vol. 33, No. 3 (March 2009), page 7a. Correlation is further exploited by a context-modeling subalgorithm, which exploits spectral dependencies in the wavelet-transformed hyperspectral data, using an algorithm that is summarized in "Context Modeler for Wavelet Compression of Hyperspectral Images" (NPO-43239), which follows this article. An important feature of ICER-3D is a scheme for limiting the adverse effects of loss of data during transmission. In this scheme, as in the similar scheme used by ICER, the spatial-frequency domain is partitioned into rectangular error-containment regions. In ICER-3D, the partitions extend through all the wavelength bands. The data in each partition are compressed independently of those in the other partitions, so that loss or corruption of data from any partition does not affect the other partitions. Furthermore, because compression is progressive within each partition, when data are lost, any data from that partition received prior to the loss can be used to reconstruct that partition at lower fidelity. By virtue of the compression improvement it achieves relative to previous means of onboard data compression, this software enables (1) increased return of hyperspectral scientific data in the presence of limits on the rates of transmission of data from spacecraft to Earth via radio communication links and/or (2) reduction in spacecraft radio-communication power and/or cost through reduction in the amounts of data required to be downlinked and stored onboard prior to downlink. The software is also suitable for compressing hyperspectral images for ground storage or archival purposes.

  18. The effect of applying principles of reformed teaching and learning to an asynchronous online environment on student cognition of physics concepts in kinematics

    NASA Astrophysics Data System (ADS)

    Turner, Michael Lloyd

    Research on student difficulties in learning physics concepts has been coincident with a general reform movement in science education with the aim of increasing the level of inquiry in the teaching and learning of science. Coincident with these efforts has been a dramatic increase in the offering of online courses. Generally, this movement toward online course offerings has taken place without the inclusion of laboratory science offerings. The Learn Anytime Anywhere Physics (LAAPhysics) program for asynchronous online introductory physics learning is a notable exception. LAAPhysics software attempts to implement the principles of reformed science teaching and learning in an online environment. The purpose of this study was to measure how student cognition of physics concepts in kinematics was affected through use of LAAPhysics online kinematics tutorials. The normalized gains between pre-instruction and post-instruction scores on the Test of Understanding Graphs in Kinematics (TUG-K) for a treatment group of LAAPhysics testers were calculated. This normalized gain was compared to normalized gains typically found for students taking face-to-face physics courses. The normalized gain scores for LAAPhysics testers were also tested for correlation against time-on-task variables as measured by connectivity to the online software. Finally, a content analysis of student responses recorded in the LAAPhysics software was conducted. Normalized gain scores for LAAPhysics testers were not found to be greater than gain scores typically found in face-to-face courses. The number of student connections to the software and their total time working in the software were found to be significantly related to normalized gain on the TUG-K. The content analysis of student responses in the LAAPhysics software revealed variation in initial understanding of physics concepts in kinematics as well as variation in change in understanding across students.
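
    The abstract does not spell out the gain formula; normalized gain is conventionally Hake's g, the realized fraction of the possible pre-to-post improvement. A one-line sketch under that assumption:

        def normalized_gain(pre_score, post_score, max_score=100.0):
            # Hake's normalized gain: g = (post - pre) / (max - pre).
            return (post_score - pre_score) / (max_score - pre_score)

        # Example: improving from 40 to 55 out of 100 gives g = 0.25.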

  19. Risk-Significant Adverse Condition Awareness Strengthens Assurance of Fault Management Systems

    NASA Technical Reports Server (NTRS)

    Fitz, Rhonda

    2017-01-01

    As spaceflight systems increase in complexity, Fault Management (FM) systems are ranked high in risk-based assessment of software criticality, emphasizing the importance of establishing highly competent domain expertise to provide assurance. Adverse conditions (ACs) and specific vulnerabilities encountered by safety- and mission-critical software systems have been identified through efforts to reduce the risk posture of software-intensive NASA missions. Acknowledgement of potential off-nominal conditions and analysis to determine software system resiliency are important aspects of hazard analysis and FM. A key component of assuring FM is an assessment of how well software addresses susceptibility to failure through consideration of ACs. Focus on significant risk predicted through experienced analysis conducted at the NASA Independent Verification & Validation (IV&V) Program enables the scoping of effective assurance strategies with regard to overall asset protection of complex spaceflight as well as ground systems. Research efforts sponsored by NASA's Office of Safety and Mission Assurance (OSMA) defined terminology, categorized data fields, and designed a baseline repository that centralizes and compiles a comprehensive listing of ACs and correlated data relevant across many NASA missions. This prototype tool helps projects improve analysis by tracking ACs and allowing queries based on project, mission type, domain/component, causal fault, and other key characteristics. Vulnerability in off-nominal situations, architectural design weaknesses, and unexpected or undesirable system behaviors in reaction to faults are curtailed with the awareness of ACs and risk-significant scenarios modeled for analysts through this database. Integration within the Enterprise Architecture at NASA IV&V enables interfacing with other tools and datasets, technical support, and accessibility across the Agency. This paper discusses the development of an improved workflow process utilizing this database for adaptive, risk-informed FM assurance that critical software systems will safely and securely protect against faults and respond to ACs in order to achieve successful missions.

  20. A Flexible Method for Producing F.E.M. Analysis of Bone Using Open-Source Software

    NASA Technical Reports Server (NTRS)

    Boppana, Abhishektha; Sefcik, Ryan; Meyers, Jerry G.; Lewandowski, Beth E.

    2016-01-01

    This project, performed in support of the NASA GRC Space Academy summer program, sought to develop an open-source workflow methodology that segmented medical image data, created a 3D model from the segmented data, and prepared the model for finite-element analysis. In an initial step, a technological survey evaluated the performance of various existing open-source software packages that claim to perform these tasks. However, the survey concluded that no single package exhibited the wide array of functionality required for the potential NASA application in the area of bone, muscle, and biofluidic studies. As a result, a series of Python scripts provided the bridging mechanism to address the shortcomings of the available open-source tools. The VTK library provided the quickest and most effective means of segmenting regions of interest from the medical images; it allowed for the export of a 3D model by using the marching cubes algorithm to build a surface mesh. Developing the model domain from this extracted information required the surface mesh to be processed in the open-source software packages Blender and Gmsh. The Preview program of the FEBio suite proved sufficient for volume-filling the model with an unstructured mesh and preparing boundary specifications for finite element analysis. To fully enable FEM modeling, an in-house-developed Python script assigned material properties on an element-by-element basis by performing a weighted interpolation of the voxel intensities of the parent medical image, correlated to published relationships between image intensity and material properties such as ash density. A graphical user interface combined the Python scripts and other software into a user-friendly interface. This work using Python scripts provides a potential alternative to expensive commercial software and inadequate, limited open-source freeware for the creation of 3D computational models. More work will be needed to validate this approach in creating finite-element models.
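
    The element-by-element material assignment described above can be sketched as an interpolation from image intensity to density, followed by a density-to-modulus relation. The calibration points and power-law coefficients below are placeholders of the kind reported in the bone literature, not the project's values.

        import numpy as np

        def element_modulus(mean_intensities, hu_points, density_points,
                            a=10500.0, b=2.29):
            # mean_intensities: mean voxel intensity per element.
            # hu_points (increasing) and density_points: calibration pairs
            # relating image intensity to ash density (placeholder inputs).
            rho = np.interp(mean_intensities, hu_points, density_points)
            # Placeholder power-law relation from density (g/cm^3) to
            # Young's modulus (MPa).
            return a * rho ** b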

  1. Arterial pressure-based cardiac output monitoring: a multicenter validation of the third-generation software in septic patients.

    PubMed

    De Backer, Daniel; Marx, Gernot; Tan, Andrew; Junker, Christopher; Van Nuffelen, Marc; Hüter, Lars; Ching, Willy; Michard, Frédéric; Vincent, Jean-Louis

    2011-02-01

    Second-generation FloTrac software has been shown to reliably measure cardiac output (CO) in cardiac surgical patients. However, concerns have been raised regarding its accuracy in vasoplegic states. The aim of the present multicenter study was to investigate the accuracy of the third-generation software in patients with sepsis, particularly when total systemic vascular resistance (TSVR) is low. Fifty-eight septic patients were included in this prospective observational study in four university-affiliated ICUs. Reference CO was measured by bolus pulmonary thermodilution (iCO) using 3-5 cold saline boluses. Simultaneously, CO was computed from the arterial pressure curve recorded on a computer using the second-generation (CO(G2)) and third-generation (CO(G3)) FloTrac software. CO was also measured by semi-continuous pulmonary thermodilution (CCO). A total of 401 simultaneous measurements of iCO, CO(G2), CO(G3), and CCO were recorded. The mean (95%CI) biases between CO(G2) and iCO, CO(G3) and iCO, and CCO and iCO were -10 (-15 to -5)% [-0.8 (-1.1 to -0.4) L/min], 0 (-4 to 4)% [0 (-0.3 to 0.3) L/min], and 9 (6-13)% [0.7 (0.5-1.0) L/min], respectively. The percentage errors were 29 (20-37)% for CO(G2), 30 (24-37)% for CO(G3), and 28 (22-34)% for CCO. The difference between iCO and CO(G2) was significantly correlated with TSVR (r(2) = 0.37, p < 0.0001). A very weak (r(2) = 0.05) relationship was also observed for the difference between iCO and CO(G3). In patients with sepsis, the third-generation FloTrac software is more accurate, as precise, and less influenced by TSVR than the second-generation software.
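
    The percentage errors quoted follow the customary Critchley and Critchley criterion: 1.96 times the standard deviation of the bias divided by the mean reference cardiac output, with roughly 30% as the usual acceptability threshold. A minimal sketch, assuming that definition:

        import numpy as np

        def percentage_error(test_co, reference_co):
            # test_co, reference_co: paired cardiac output values (L/min).
            diff = np.asarray(test_co) - np.asarray(reference_co)
            return 100.0 * 1.96 * diff.std(ddof=1) / np.mean(reference_co)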

  2. Precision analysis of a quantitative CT liver surface nodularity score.

    PubMed

    Smith, Andrew; Varney, Elliot; Zand, Kevin; Lewis, Tara; Sirous, Reza; York, James; Florez, Edward; Abou Elkassem, Asser; Howard-Claudio, Candace M; Roda, Manohar; Parker, Ellen; Scortegagna, Eduardo; Joyner, David; Sandlin, David; Newsome, Ashley; Brewster, Parker; Lirette, Seth T; Griswold, Michael

    2018-04-26

    To evaluate precision of a software-based liver surface nodularity (LSN) score derived from CT images. An anthropomorphic CT phantom was constructed with simulated liver containing smooth and nodular segments at the surface and simulated visceral and subcutaneous fat components. The phantom was scanned multiple times on a single CT scanner with adjustment of image acquisition and reconstruction parameters (N = 34) and on 22 different CT scanners from 4 manufacturers at 12 imaging centers. LSN scores were obtained using a software-based method. Repeatability and reproducibility were evaluated by intraclass correlation (ICC) and coefficient of variation. Using abdominal CT images from 68 patients with various stages of chronic liver disease, inter-observer agreement and test-retest repeatability among 12 readers assessing LSN by software- vs. visual-based scoring methods were evaluated by ICC. There was excellent repeatability of LSN scores (ICC:0.79-0.99) using the CT phantom and routine image acquisition and reconstruction parameters (kVp 100-140, mA 200-400, and auto-mA, section thickness 1.25-5.0 mm, field of view 35-50 cm, and smooth or standard kernels). There was excellent reproducibility (smooth ICC: 0.97; 95% CI 0.95, 0.99; CV: 7%; nodular ICC: 0.94; 95% CI 0.89, 0.97; CV: 8%) for LSN scores derived from CT images from 22 different scanners. Inter-observer agreement for the software-based LSN scoring method was excellent (ICC: 0.84; 95% CI 0.79, 0.88; CV: 28%) vs. good for the visual-based method (ICC: 0.61; 95% CI 0.51, 0.69; CV: 43%). Test-retest repeatability for the software-based LSN scoring method was excellent (ICC: 0.82; 95% CI 0.79, 0.84; CV: 12%). The software-based LSN score is a quantitative CT imaging biomarker with excellent repeatability, reproducibility, inter-observer agreement, and test-retest repeatability.
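
    For illustration, repeatability statistics of this kind can be computed from a targets-by-repeats table. The sketch below implements the simplest one-way random-effects ICC together with a coefficient of variation; the study's exact ICC form may differ.

        import numpy as np

        def icc_oneway(scores):
            # scores: (n_targets, k_repeats) table of repeated LSN scores.
            n, k = scores.shape
            grand = scores.mean()
            target_means = scores.mean(axis=1)
            msb = k * ((target_means - grand) ** 2).sum() / (n - 1)
            msw = ((scores - target_means[:, None]) ** 2).sum() / (n * (k - 1))
            return (msb - msw) / (msb + (k - 1) * msw)

        def coefficient_of_variation(scores):
            # Mean within-target CV, expressed as a percentage.
            return 100.0 * (scores.std(axis=1, ddof=1) / scores.mean(axis=1)).mean()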

  3. Risk-Significant Adverse Condition Awareness Strengthens Assurance of Fault Management Systems

    NASA Technical Reports Server (NTRS)

    Fitz, Rhonda

    2017-01-01

    As spaceflight systems increase in complexity, Fault Management (FM) systems are ranked high in risk-based assessment of software criticality, emphasizing the importance of establishing highly competent domain expertise to provide assurance. Adverse conditions (ACs) and specific vulnerabilities encountered by safety- and mission-critical software systems have been identified through efforts to reduce the risk posture of software-intensive NASA missions. Acknowledgement of potential off-nominal conditions and analysis to determine software system resiliency are important aspects of hazard analysis and FM. A key component of assuring FM is an assessment of how well software addresses susceptibility to failure through consideration of ACs. Focus on significant risk predicted through experienced analysis conducted at the NASA Independent Verification & Validation (IV&V) Program enables the scoping of effective assurance strategies with regard to overall asset protection of complex spaceflight as well as ground systems. Research efforts sponsored by NASA's Office of Safety and Mission Assurance defined terminology, categorized data fields, and designed a baseline repository that centralizes and compiles a comprehensive listing of ACs and correlated data relevant across many NASA missions. This prototype tool helps projects improve analysis by tracking ACs and allowing queries based on project, mission type, domain/component, causal fault, and other key characteristics. Vulnerability in off-nominal situations, architectural design weaknesses, and unexpected or undesirable system behaviors in reaction to faults are curtailed with the awareness of ACs and risk-significant scenarios modeled for analysts through this database. Integration within the Enterprise Architecture at NASA IV&V enables interfacing with other tools and datasets, technical support, and accessibility across the Agency. This paper discusses the development of an improved workflow process utilizing this database for adaptive, risk-informed FM assurance that critical software systems will safely and securely protect against faults and respond to ACs in order to achieve successful missions.

  4. Evolving Attractive Faces Using Morphing Technology and a Genetic Algorithm: A New Approach to Determining Ideal Facial Aesthetics

    PubMed Central

    Wong, Brian J. F.; Karmi, Koohyar; Devcic, Zlatko; McLaren, Christine E.; Chen, Wen-Pin

    2013-01-01

    Objectives The objectives of this study were to: 1) determine if a genetic algorithm in combination with morphing software can be used to evolve more attractive faces; and 2) evaluate whether this approach can be used as a tool to define or identify the attributes of the ideal attractive face. Study Design Basic research study incorporating focus group evaluations. Methods Digital images were acquired of 250 female volunteers (18–25 y). Randomly selected images were used to produce a parent generation (P) of 30 synthetic faces using morphing software. Then, a focus group of 17 trained volunteers (18–25 y) scored each face on an attractiveness scale ranging from 1 (unattractive) to 10 (attractive). A genetic algorithm was used to select 30 new pairs from the parent generation, and these were morphed using software to produce a new first generation (F1) of faces. The F1 faces were scored by the focus group, and the process was repeated for a total of four iterations of the algorithm. The algorithm mimics natural selection by using the attractiveness score as the selection pressure; the more attractive faces are more likely to morph. All five generations (P-F4) were then scored by three focus groups: a) surgeons (n = 12), b) cosmetology students (n = 44), and c) undergraduate students (n = 44). Morphometric measurements were made of 33 specific features on each of the 150 synthetic faces, and correlated with attractiveness scores using univariate and multivariate analysis. Results The average facial attractiveness scores increased with each generation and were 3.66 (±0.60), 4.59 (±0.73), 5.50 (±0.62), 6.23 (±0.31), and 6.39 (±0.24) for P and F1–F4 generations, respectively. Histograms of attractiveness score distributions show a significant shift in the skew of each curve toward more attractive faces with each generation. Univariate analysis identified nasal width, eyebrow arch height, and lip thickness as being significantly correlated with attractiveness scores. Multivariate analysis identified a similar collection of morphometric measures. No correlation with more commonly accepted measures, such as the lengths of the facial thirds or fifths, was identified. When images are examined as a montage (by generation), clear distinct trends are identified: oval-shaped faces, distinct arched eyebrows, and full lips predominate. Faces evolve to approximate the guidelines suggested by classical canon. F3 and F4 generation faces look profoundly similar. The statistical and qualitative analysis indicates that the algorithm and methodology succeed in generating successively more attractive faces. Conclusions The use of genetic algorithms in combination with morphing software and traditional focus-group derived attractiveness scores can be used to evolve attractive synthetic faces. We have demonstrated that the evolution of attractive faces can be mimicked in software. Genetic algorithms and morphing provide a robust alternative to traditional approaches rooted in comparing attractiveness scores with a series of morphometric measurements in human subjects. PMID:18401273
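
    The selection step of such an algorithm can be sketched compactly: faces are drawn as parents with probability proportional to their attractiveness scores, and each pair is combined into a child. In the sketch below, morphing is crudely approximated by averaging hypothetical feature vectors; the actual study morphed images.

        import numpy as np

        rng = np.random.default_rng(1)

        def next_generation(face_features, scores, n_children=30):
            # face_features: (N, d) array of facial feature vectors
            # (hypothetical stand-ins for morphable images).
            # scores: focus-group ratings used as the selection pressure.
            p = np.asarray(scores, dtype=float)
            p = p / p.sum()
            children = []
            for _ in range(n_children):
                i, j = rng.choice(len(face_features), size=2, replace=False, p=p)
                # "Morphing" approximated by averaging the parents' features.
                children.append((face_features[i] + face_features[j]) / 2.0)
            return np.array(children)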

  5. Computed gray levels in multislice and cone-beam computed tomography.

    PubMed

    Azeredo, Fabiane; de Menezes, Luciane Macedo; Enciso, Reyes; Weissheimer, Andre; de Oliveira, Rogério Belle

    2013-07-01

    Gray level is the range of shades of gray in the pixels, representing the x-ray attenuation coefficient that allows for tissue density assessments in computed tomography (CT). An in-vitro study was performed to investigate the relationship between computed gray levels in 3 cone-beam CT (CBCT) scanners and 1 multislice spiral CT device using 5 software programs. Six materials (air, water, wax, acrylic, plaster, and gutta-percha) were scanned with the CBCT and CT scanners, and the computed gray levels for each material at predetermined points were measured with OsiriX Medical Imaging software (Geneva, Switzerland), OnDemand3D (CyberMed International, Seoul, Korea), E-Film (Merge Healthcare, Milwaukee, Wis), Dolphin Imaging (Dolphin Imaging & Management Solutions, Chatsworth, Calif), and InVivo Dental Software (Anatomage, San Jose, Calif). The repeatability of these measurements was calculated with intraclass correlation coefficients, and the gray levels were averaged to represent each material. Repeated analysis of variance tests were used to assess the differences in gray levels among scanners and materials. There were no differences in mean gray levels with the different software programs. There were significant differences in gray levels between scanners for each material evaluated (P <0.001). The software programs were reliable and had no influence on the CT and CBCT gray level measurements. However, the gray levels might have discrepancies when different CT and CBCT scanners are used. Therefore, caution is essential when interpreting or evaluating CBCT images because of the significant differences in gray levels between different CBCT scanners, and between CBCT and CT values. Copyright © 2013 American Association of Orthodontists. Published by Mosby, Inc. All rights reserved.

  6. Assessing symmetry using the mirror stand device with manual and software-assisted methods in postoperative zygomatic fracture patients

    NASA Astrophysics Data System (ADS)

    Syarif, A. N.; Bangun, K.

    2017-08-01

    Zygomatic fractures are among the most common fractures of the facial skeleton. However, because no standard and reliable method of evaluation is available to assess postoperative patients, we often rely on photographs and subjective assessments. A portable mirror stand device (MiRS), a new method for the standardization of photography, was developed in our institution. Used with image analysis software, this device provides a new method for evaluating outcomes after the open reduction and internal fixation of zygomatic fractures. The portable mirror stand device was set up in our outpatient clinic at the Cleft Craniofacial Center at Cipto Mangunkusumo Hospital. Photographs of 11 postoperative patients were taken using the device and analyzed for symmetry both manually and using image analysis software (ImageJ 1.46). The two methods were then compared to assess the correlation and agreement of the results. The measurements taken using the manual method and the software-assisted method did not differ significantly, indicating good agreement between the two methods. The symmetry results achieved at our center were similar to those of other centers in the Asian region (ΔZy = 3.4±1.5 mm, ΔBc = 2.6±1.6 mm, ΔCh = 2.3±2.4 mm vs. ΔZy = 3.2±1.7 mm, ΔBc = 2.6±1.6 mm, ΔCh = 2.3±2.5 mm). The treatment of zygomatic fractures at our center achieved good results. The portable mirror stand device combined with the image analysis software (ImageJ 1.46) could be beneficial in assessing symmetry in postoperative zygomatic fracture patients.

  7. Comparison of grey scale median (GSM) measurement in ultrasound images of human carotid plaques using two different softwares.

    PubMed

    Östling, Gerd; Persson, Margaretha; Hedblad, Bo; Gonçalves, Isabel

    2013-11-01

    Grey scale median (GSM) measured on ultrasound images of carotid plaques has been used for several years in research to find the vulnerable plaque. Centres have used different software packages and also different methods for GSM measurement. This has resulted in a wide range of GSM values and cut-off values for the detection of the vulnerable plaque. The aim of this study was to compare the values obtained with two different software packages, using different standardization methods, for the measurement of GSM on ultrasound images of human carotid plaques. GSM was measured with Adobe Photoshop® and with Artery Measurement System (AMS) on duplex ultrasound images of 100 consecutive medium- to large-sized carotid plaques of the Beta-blocker Cholesterol-lowering Asymptomatic Plaque Study (BCAPS). The mean values of GSM were 35.2 ± 19.3 and 55.8 ± 22.5 for Adobe Photoshop® and AMS, respectively. The mean difference was 20.45 (95% CI: 19.17-21.73). Although the absolute values of GSM differed, the agreement between the two measurements was good, with a correlation coefficient of 0.95. A chi-square test revealed a kappa value of 0.68 when studying quartiles of GSM. The intra-observer variability was 1.9% for AMS and 2.5% for Adobe Photoshop. The differences between software packages and standardization methods must be taken into consideration when comparing studies. To avoid these problems, researchers should come to a consensus regarding software and standardization method for GSM measurement on ultrasound images of plaques in the arteries. © 2013 Scandinavian Society of Clinical Physiology and Nuclear Medicine. Published by John Wiley & Sons Ltd.
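
    A minimal sketch of the agreement statistics reported above (mean difference with 95% CI, correlation, and kappa on quartiles), using simulated GSM values in place of the BCAPS measurements; the numbers only echo the reported offset and are not real data:

        import numpy as np
        import pandas as pd
        from sklearn.metrics import cohen_kappa_score

        rng = np.random.default_rng(0)
        # Simulated paired GSM values for 100 plaques; the offset loosely
        # mimics the reported difference between the two programs.
        gsm_ams = rng.normal(55.8, 22.5, 100)
        gsm_ps = gsm_ams - rng.normal(20.45, 6.5, 100)

        diff = gsm_ams - gsm_ps
        se = diff.std(ddof=1) / np.sqrt(diff.size)
        lo, hi = diff.mean() - 1.96 * se, diff.mean() + 1.96 * se
        print(f"mean difference {diff.mean():.2f} (95% CI {lo:.2f}-{hi:.2f})")
        print(f"correlation {np.corrcoef(gsm_ams, gsm_ps)[0, 1]:.2f}")

        # Agreement on quartiles, analogous to the reported kappa of 0.68.
        q_ams = pd.qcut(gsm_ams, 4, labels=False)
        q_ps = pd.qcut(gsm_ps, 4, labels=False)
        print(f"kappa on quartiles {cohen_kappa_score(q_ams, q_ps):.2f}")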

  8. Evidence-based development of a diagnosis-dependent therapy planning system and its implementation in modern diagnostic software.

    PubMed

    Ahlers, M O; Jakstat, H A

    2005-07-01

    The prerequisite for structured individual therapy of craniomandibular dysfunctions is differential diagnostics. Suggestions for the structured recording of findings and their structured evaluation beyond the global diagnosis of "craniomandibular disorders" have been published. Only this structured approach enables computerization of the diagnostic process. The respective software is available for use in practice (CMDcheck for CMD screening, CMDfact for differential diagnostics). Based on this structured diagnostics, knowledge-based therapy planning is also conceivable. The prerequisite for this would be a model for achieving consensus on the indicated forms of therapy related to the diagnosis. Therefore, a procedure for the evidence-based achievement of consensus on suitable forms of therapy in CMD was developed first in multicentric cooperation, and then implemented in corresponding software. The clinical knowledge of experienced specialists was consciously included in the consensus achievement process. At the same time, anonymized mathematical statistical evaluations were used for control and objectification. Different examiners from different departments of several universities, working independently of one another, assigned the theoretically conceivable therapeutic alternatives to the already published diagnostic scheme. After anonymization, the correlation of these assignments was calculated mathematically. In those cases for which no agreement initially existed, consensus was subsequently reached in the course of a consensus conference on the basis of literature evaluations and the discussion of clinical case examples. This consensus in turn served as the basis of a therapy planner implemented in the above-mentioned diagnostic software CMDfact. Contributing to quality assurance, the principles of programming this assistant, as well as the interface for linking it into the diagnostic software, are documented and published here.

  9. Intra- and interrater reliability of the Chicago Classification of achalasia subtypes in pediatric high-resolution esophageal manometry (HRM) recordings.

    PubMed

    Singendonk, M M J; Rosen, R; Oors, J; Rommel, N; van Wijk, M P; Benninga, M A; Nurko, S; Omari, T I

    2017-11-01

    Subtyping achalasia by high-resolution manometry (HRM) is clinically relevant, as response to therapy and prognosis have been shown to vary accordingly. The aim of this study was to assess inter- and intrarater reliability of diagnosing achalasia and achalasia subtyping in children using the Chicago Classification (CC) V3.0. Six observers analyzed 40 pediatric HRM recordings (22 achalasia and 18 non-achalasia) twice using dedicated analysis software (ManoView 3.0, Given Imaging, Los Angeles, CA, USA). Integrated relaxation pressure (IRP4s), distal contractile integral (DCI), intrabolus pressurization pattern (IBP), and distal latency (DL) were extracted and analyzed hierarchically. Cohen's κ (2 raters), Fleiss' κ (>2 raters), and the intraclass correlation coefficient (ICC) were used for categorical and ordinal data, respectively. Based on the results of the dedicated analysis software only, intra- and interrater reliability was excellent and moderate (κ=0.89 and κ=0.52, respectively) for differentiating achalasia from non-achalasia. For subtyping achalasia, reliability decreased to substantial and fair (κ=0.72 and κ=0.28, respectively). When observers were allowed to change the software-driven diagnosis according to their own interpretation of the manometric patterns, intra- and interrater reliability increased for diagnosing achalasia (κ=0.98 and κ=0.92, respectively) and for subtyping achalasia (κ=0.79 and κ=0.58, respectively). Intra- and interrater agreement for diagnosing achalasia when using HRM and the CC was very good to excellent when results of automated analysis software were interpreted by experienced observers. More variability was seen when relying solely on the software-driven diagnosis and when subtyping achalasia. Therefore, diagnosing and subtyping achalasia should be performed in pediatric motility centers with significant expertise. © 2017 John Wiley & Sons Ltd.
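
    The hierarchical use of IRP4s, DCI, IBP, and DL lends itself to a rule-based sketch. The following decision chain uses commonly cited adult Chicago v3.0 cut-offs and is a simplification of mine, not the study's pediatric analysis:

        def classify_achalasia(median_irp4s, failed_peristalsis_pct,
                               panesoph_pressurization_pct, premature_pct):
            """Simplified Chicago-v3.0-style hierarchy (illustrative only).

            Thresholds are the commonly cited adult cut-offs, NOT the
            pediatric- or assay-specific values used in the study.
            """
            if median_irp4s <= 15:  # mmHg, assumed upper limit of normal
                return "no EGJ outflow obstruction (by this simplified rule)"
            if failed_peristalsis_pct < 100 and premature_pct < 20:
                return "EGJ outflow obstruction / further workup"
            if premature_pct >= 20:
                return "achalasia type III (spastic)"
            if panesoph_pressurization_pct >= 20:
                return "achalasia type II (panesophageal pressurization)"
            return "achalasia type I (no pressurization)"

        print(classify_achalasia(22, 100, 40, 0))  # -> type II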

  10. Area of ischemia assessed by physicians and software packages from myocardial perfusion scintigrams

    PubMed Central

    2014-01-01

    Background The European Society of Cardiology recommends that patients with >10% area of ischemia should receive revascularization. We investigated inter-observer variability for the extent of ischemic defects reported by different physicians and by different software tools, and whether inter-observer variability was reduced when the physicians were provided with a computerized suggestion of the defects. Methods Twenty-five myocardial perfusion single photon emission computed tomography (SPECT) patients who were regarded as ischemic according to the final report were included. Eleven physicians in nuclear medicine delineated the extent of the ischemic defects. After at least two weeks, they delineated the defects again, and were this time provided with a suggestion of the defect delineation by EXINI Heart™ (EXINI). Summed difference scores and ischemic extent values were obtained from four software programs. Results The median extent values obtained from the 11 physicians varied between 8% and 34%, and between 9% and 16% for the software programs. For all 25 patients, the mean extent obtained from EXINI was 17.0% (± standard deviation (SD) 14.6%). Mean extent for the physicians was 22.6% (± 15.6%) for the first delineation and 19.1% (± 14.9%) for the evaluation in which they were provided with the computerized suggestion. Intra-class correlation (ICC) increased from 0.56 (95% confidence interval (CI) 0.41-0.72) to 0.81 (95% CI 0.71-0.90) between the first and the second delineation, and the between-physician SDs were 7.8% (first delineation) and 5.9% (second delineation). Conclusions There was large variability in the estimated ischemic defect size obtained both from different physicians and from different software packages. When the physicians were provided with a suggested delineation, the inter-observer variability decreased significantly. PMID:24479846
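
    A toy simulation of the reported drop in between-physician spread when a computerized suggestion is provided; all numbers are synthetic, chosen only to echo the SDs quoted above:

        import numpy as np

        rng = np.random.default_rng(1)
        n_patients, n_physicians = 25, 11
        true_extent = rng.uniform(5, 50, n_patients)

        # Hypothetical delineations: unaided (spread ~7.8) vs. aided by a
        # computerized suggestion (spread ~5.9), echoing the reported SDs.
        first = true_extent[:, None] + rng.normal(0, 7.8, (n_patients, n_physicians))
        second = true_extent[:, None] + rng.normal(0, 5.9, (n_patients, n_physicians))

        for label, x in [("first", first), ("second", second)]:
            sd = x.std(axis=1, ddof=1).mean()
            print(f"{label} delineation: mean between-physician SD = {sd:.1f}%")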

  11. Analysis of linear measurements on 3D surface models using CBCT data segmentation obtained by automatic standard pre-set thresholds in two segmentation software programs: an in vitro study.

    PubMed

    Poleti, Marcelo Lupion; Fernandes, Thais Maria Freire; Pagin, Otávio; Moretti, Marcela Rodrigues; Rubira-Bullen, Izabel Regina Fischer

    2016-01-01

    The aim of this in vitro study was to evaluate the reliability and accuracy of linear measurements on three-dimensional (3D) surface models obtained by standard pre-set thresholds in two segmentation software programs. Ten mandibles with 17 silica markers were scanned at 0.3-mm voxel size in the i-CAT Classic (Imaging Sciences International, Hatfield, PA, USA). Twenty linear measurements were carried out twice by two observers on the 3D surface models: in Dolphin Imaging 11.5 (Dolphin Imaging & Management Solutions, Chatsworth, CA, USA), using two filters (Translucent and Solid-1), and in InVesalius 3.0.0 (Centre for Information Technology Renato Archer, Campinas, SP, Brazil). The physical measurements were made twice by another observer using a digital caliper on the dry mandibles. Excellent intra- and inter-observer reliability for the markers, physical measurements, and 3D surface models was found (intra-class correlation coefficient (ICC) and Pearson's r ≥ 0.91). The linear measurements on 3D surface models made with the Dolphin and InVesalius software programs were accurate (Dolphin Solid-1 > InVesalius > Dolphin Translucent). The highest absolute and percentage errors were obtained for the variables R1-R1 (1.37 mm) and MF-AC (2.53%) in the Dolphin Translucent and InVesalius software, respectively. Linear measurements on 3D surface models obtained by standard pre-set thresholds in the Dolphin and InVesalius software programs are reliable and accurate compared with physical measurements. Studies that evaluate the reliability and accuracy of 3D models are necessary to ensure error predictability and to establish diagnosis, treatment plan, and prognosis in a more realistic way.
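
    The accuracy figures above reduce to absolute and percentage errors against the caliper gold standard; a minimal sketch with hypothetical paired values:

        import numpy as np

        # Hypothetical paired measurements (mm): 3D model vs. digital caliper.
        model = np.array([12.41, 25.10, 31.87, 9.63])
        caliper = np.array([12.30, 25.00, 32.05, 9.55])

        abs_err = np.abs(model - caliper)            # absolute error, mm
        pct_err = 100 * abs_err / caliper            # percentage error, %
        print("absolute error (mm):", abs_err.round(2))
        print("percentage error (%):", pct_err.round(2))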

  12. [VALUE OF SMART PHONE Scoliometer SOFTWARE IN OBTAINING OPTIMAL LUMBAR LORDOSIS DURING L4-S1 FUSION SURGERY].

    PubMed

    Yu, Weibo; Liang, De; Ye, Linqiang; Jiang, Xiaobing; Yao, Zhensong; Tang, Jingjing; Tang, Yongchao

    2015-10-01

    To investigate the value of smart phone Scoliometer software in obtaining optimal lumbar lordosis (LL) during L4-S1 fusion surgery. Between November 2014 and February 2015, 20 patients scheduled for L4-S1 fusion surgery were prospectively enrolled in the study. There were 8 males and 12 females, aged 41-65 years (mean, 52.3 years). The disease duration ranged from 6 months to 6 years (mean, 3.4 years). Before operation, the pelvic incidence (PI) and Cobb angle of L4-S1 (CobbL4-S1) were measured on lateral X-ray films of the lumbosacral spine by PACS system; the ideal CobbL4-S1 was then calculated according to previously published methods [(PI + 9 degrees) x 70%]. Subsequently, the intraoperative CobbL4-S1 was monitored with the Scoliometer software and was defined as optimal when it differed from the ideal CobbL4-S1 by less than 5 degrees. Finally, the CobbL4-S1 was measured by the PACS system after operation, and the consistency between the Scoliometer software and the PACS system was compared to evaluate the accuracy of the software. In addition, the value of this method in obtaining optimal LL was validated by comparing the difference between the ideal CobbL4-S1 and the preoperative one with the difference between the ideal CobbL4-S1 and the postoperative one. The CobbL4-S1 was (36.17 ± 1.53) degrees for the ideal value, (22.57 ± 5.50) degrees preoperatively, (32.25 ± 1.46) degrees intraoperatively as measured by the Scoliometer software, and (34.43 ± 1.72) degrees postoperatively. The observed intraclass correlation coefficient (ICC) was excellent [ICC = 0.96, 95% confidence interval (0.93, 0.97)] and the mean absolute difference (MAD) was low (MAD = 1.23) between the Scoliometer software and the PACS system. The deviation between the ideal CobbL4-S1 and the postoperative CobbL4-S1 was (2.31 ± 0.23) degrees, which was significantly lower than the deviation between the ideal CobbL4-S1 and the preoperative CobbL4-S1 [(13.60 ± 1.85) degrees; t = 6.065, P = 0.001]. The Scoliometer software can help the surgeon obtain optimal LL and deserves further dissemination.
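
    The planning rule quoted above is directly computable. A small sketch using the published formula [(PI + 9 degrees) x 70%] and the study's 5-degree tolerance; the PI value and intraoperative reading below are hypothetical:

        def ideal_cobb_l4s1(pelvic_incidence_deg):
            # Published planning rule from the abstract: (PI + 9 deg) x 70%.
            return (pelvic_incidence_deg + 9) * 0.70

        pi_deg = 42.7                       # hypothetical pelvic incidence
        target = ideal_cobb_l4s1(pi_deg)    # about 36.2 deg
        intraop = 32.3                      # hypothetical Scoliometer reading
        status = "optimal" if abs(target - intraop) < 5 else "keep correcting"
        print(f"target {target:.1f} deg, intraoperative {intraop:.1f} deg: {status}")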

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    CorAL is a software library designed to aid in the analysis of femtoscopic data. Femtoscopic data are a class of measured quantities used in heavy-ion collisions to characterize particle-emitting source sizes. The most common type of such data is two-particle correlations induced by the Hanbury Brown/Twiss (HBT) effect, but it can also include correlations induced by final-state interactions between pairs of emitted particles in a heavy-ion collision. Because heavy-ion collisions are complex many-particle systems, modeling them requires hydrodynamical models or hybrid techniques. Using the CRAB module, CorAL can turn the output from these models into something that can be directly compared to experimental data. CorAL can also take the raw experimentally measured correlation functions and image them by inverting the Koonin-Pratt equation to extract the space-time emission profile of the particle-emitting source. This source function can be further analyzed or directly compared to theoretical calculations.
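
    As an illustration of the imaging step, the angle-averaged Koonin-Pratt relation for identical bosons without final-state interactions, C(q) - 1 = 4 pi Int dr r^2 [sin(2qr)/(2qr)] S(r), can be discretized and inverted with Tikhonov-regularized least squares. This sketch shows the idea only and is not CorAL's actual algorithm:

        import numpy as np

        # Discretize C(q) - 1 = 4*pi * Int dr r^2 K(q, r) S(r),
        # with K(q, r) = sin(2*q*r) / (2*q*r); hbar*c = 0.1973 GeV*fm.
        q = np.linspace(0.002, 0.2, 80)            # GeV/c (assumed binning)
        r = np.linspace(0.25, 20.0, 40)            # fm
        dr = r[1] - r[0]
        phase = 2 * np.outer(q, r) / 0.1973        # dimensionless 2qr
        K = 4 * np.pi * r**2 * np.sinc(phase / np.pi) * dr

        # Forward model: a normalized Gaussian source of radius R = 5 fm.
        R = 5.0
        S = np.exp(-r**2 / (4 * R**2)) / (4 * np.pi * R**2) ** 1.5
        c_minus_1 = K @ S

        # Imaging step: Tikhonov-regularized inversion of the kernel.
        KtK = K.T @ K
        lam = 1e-8 * np.trace(KtK)
        S_rec = np.linalg.solve(KtK + lam * np.eye(r.size), K.T @ c_minus_1)
        print(f"S at r = {r[9]:.1f} fm: true {S[9]:.2e}, recovered {S_rec[9]:.2e}")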

  14. A study on the relationships between age, work experience, cognition, and work ability in older employees working in heavy industry.

    PubMed

    Chung, Jaeyeop; Park, Juhyung; Cho, Milim; Park, Yunhee; Kim, DeokJu; Yang, Dongju; Yang, Yeongae

    2015-01-01

    [Purpose] The purpose of this study was to examine the correlation of age, work experience, cognition, and work ability in older employees working in heavy industry. [Subjects and Methods] The study was conducted using 100 subjects who were over 55 years old and worked in heavy industry. To obtain data, we first had the subjects complete the MoCA-K test and Work Ability Index (WAI). The data were then analyzed by frequency and correlation using statistical software (SPSS 21.0). [Results] Through this study, we discovered a significant positive correlation between WAI and MoCA-K, age, and work experience. [Conclusion] This study revealed that work ability in older employees increases not with the number of years worked but with the enhancement of cognitive ability. Special management that focuses on cognition is therefore required for senior employees working in the field of heavy industry.

  15. Study on the Influence of Leadership Style on Employee’s Organizational Commitment

    NASA Astrophysics Data System (ADS)

    Wang, Lin

    2018-03-01

    Talent is the core competitiveness of an enterprise; how to retain talented employees, stimulate their creativity, and exert their advantages so as to bring the enterprise profit maximization and value appreciation has always been a focus of enterprises and scholars. A great number of studies have shown that organizational commitment has an important impact on employees' attitudes, thoughts, and behaviors, and that leadership style is an important variable affecting employees' organizational commitment. Based on a questionnaire survey, empirical analysis of the collected data with the statistical software SPSS 24.0 shows that there is a positive correlation between leadership style and employee commitment, although the correlation with employees' normative commitment is not significant; the established leadership style is negatively correlated with employees' emotional commitment and normative commitment, while its correlation with continuous commitment is not significant.

  16. Development and Validation of a Pressurization System Model for a Crossfeed Subscale Water Test Article

    NASA Technical Reports Server (NTRS)

    Nguyen, Han; Mazurkivich, Pete

    2006-01-01

    A pressurization system model was developed for a crossfeed subscale water test article using the EASY5 modeling software. The model consisted of an integrated tank pressurization and pressurization line model. The tank model was developed using the general purpose library, while the line model was assembled from the gas dynamic library. The pressurization system model was correlated to water test data obtained from nine test runs conducted on the crossfeed subscale test article. The model was first correlated to a representative test run and frozen. The correlated model was then used to predict the tank pressures and compared with the test data for eight other runs. The model prediction showed excellent agreement with the test data, allowing it to be used in a later study to analyze the pressurization system performance of a full-scale bimese vehicle with cryogenic propellants.
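
    A lumped-parameter sketch of the kind of tank pressurization model described: an isothermal ideal-gas ullage with constant pressurant inflow and constant liquid outflow. All values are assumed for illustration; this is not the EASY5 model:

        import numpy as np
        from scipy.integrate import solve_ivp

        R_N2, T = 296.8, 293.0      # N2 gas constant [J/(kg.K)], ullage temp [K]
        mdot_in = 0.02              # pressurant inflow [kg/s] (assumed)
        q_liq = 0.001               # liquid outflow [m^3/s] (assumed)
        m0, V0 = 0.10, 0.05         # initial gas mass [kg], ullage volume [m^3]

        def rhs(t, y):
            m, V = y                # gas mass grows, ullage volume grows
            return [mdot_in, q_liq]

        sol = solve_ivp(rhs, (0.0, 60.0), [m0, V0], max_step=1.0)
        m, V = sol.y
        p = m * R_N2 * T / V        # isothermal ideal-gas ullage pressure [Pa]
        print(f"ullage pressure after 60 s: {p[-1] / 1e3:.0f} kPa")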

  17. An open-architecture approach to defect analysis software for mask inspection systems

    NASA Astrophysics Data System (ADS)

    Pereira, Mark; Pai, Ravi R.; Reddy, Murali Mohan; Krishna, Ravi M.

    2009-04-01

    Industry data suggest that mask inspection represents the second biggest component of mask cost and mask turnaround time (TAT). Ever-decreasing defect size targets lead to more sensitive mask inspection across the chip, thus generating too many defects. Hence, more operator time is being spent in analyzing and dispositioning defects. The fact that multiple mask inspection systems and defect analysis strategies are typically in use in a mask shop or a wafer foundry further complicates the situation. In this scenario, there is a need for versatile, user-friendly, and extensible defect analysis software that reduces operator analysis time and enables correct classification and disposition of mask defects by providing intuitive visual and analysis aids. We propose new vendor-neutral defect analysis software, NxDAT, based on an open architecture. The open architecture of NxDAT makes it easily extensible to support defect analysis for mask inspection systems from different vendors. The capability to load results from mask inspection systems from different vendors, either directly or through a common interface, enables correlation between inspections carried out by mask inspection systems from different vendors. This capability enhances the effectiveness of defect analysis, as it directly addresses the real-life scenario where multiple types of mask inspection systems from different vendors co-exist in mask shops or wafer foundries. The open architecture also potentially enables loading wafer inspection results, as well as loading data from other related tools such as review tools, repair tools, CD-SEM tools, etc., and correlating them with the corresponding mask inspection results. A unique plug-in interface concept further enhances the openness of the NxDAT architecture by enabling end users to add their own proprietary defect analysis and image processing algorithms. The plug-in interface makes it possible for end users to make use of the knowledge collected through years of experience in the mask inspection process by encapsulating that knowledge into software utilities and plugging them into NxDAT. The plug-in interface is designed with the intent of enabling proactive mask defect analysis teams to build competitive differentiation into their defect analysis process while protecting their knowledge internally within their company. By providing interfaces to all major standard layout and mask data formats, NxDAT enables correlation of defect data on reticles with design and mask databases, further extending the effectiveness of defect analysis for D2DB inspection. NxDAT also includes many other advanced features for easy and fast navigation, visual display of defects, defect selection, multi-tier classification, defect clustering and gridding, sophisticated CD and contact measurement analysis, and repeatability analysis such as adder analysis, defect trend, capture rate, etc.
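
    The plug-in concept can be sketched as an abstract interface plus a registry. The names and the size-threshold example below are hypothetical and are not NxDAT's actual API:

        from abc import ABC, abstractmethod

        class DefectAnalysisPlugin(ABC):
            """Hypothetical plug-in contract in the spirit of the text above."""

            name: str

            @abstractmethod
            def classify(self, defect: dict) -> str:
                """Return a classification label for one defect record."""

        _REGISTRY: dict[str, DefectAnalysisPlugin] = {}

        def register(plugin: DefectAnalysisPlugin) -> None:
            _REGISTRY[plugin.name] = plugin

        class SizeThresholdPlugin(DefectAnalysisPlugin):
            name = "size-threshold"

            def __init__(self, limit_nm: float):
                self.limit_nm = limit_nm

            def classify(self, defect: dict) -> str:
                return "critical" if defect["size_nm"] > self.limit_nm else "nuisance"

        register(SizeThresholdPlugin(limit_nm=50.0))
        print(_REGISTRY["size-threshold"].classify({"size_nm": 72.0}))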

  18. Regional Earthquake Shaking and Loss Estimation

    NASA Astrophysics Data System (ADS)

    Sesetyan, K.; Demircioglu, M. B.; Zulfikar, C.; Durukal, E.; Erdik, M.

    2009-04-01

    This study, conducted under the JRA-3 component of the EU NERIES Project, develops a methodology and software (ELER) for the rapid estimation of earthquake shaking and losses in the Euro-Mediterranean region. This multi-level methodology, developed together with researchers from Imperial College, NORSAR, and ETH Zurich, is capable of incorporating regional variability and sources of uncertainty stemming from ground motion predictions, fault finiteness, site modifications, the inventory of physical and social elements subjected to earthquake hazard, and the associated vulnerability relationships. GRM Risk Management, Inc. of Istanbul serves as sub-contractor for the coding of the ELER software. The methodology encompasses the following general steps: 1. Finding the most likely location of the source of the earthquake using a regional seismotectonic data base and basic source parameters and, if and when possible, estimating fault rupture parameters from rapid inversion of data from on-line stations. 2. Estimation of the spatial distribution of selected ground motion parameters through region-specific ground motion attenuation relationships and using shear wave velocity distributions (Shake Mapping). 4. Incorporation of strong ground motion and other empirical macroseismic data for the improvement of the Shake Map. 5. Estimation of the losses (damage, casualty, and economic) at different levels of sophistication (0, 1, and 2) commensurate with the available inventory of the human-built environment (Loss Mapping). Both the Level 0 (similar to the PAGER system of the USGS) and Level 1 analyses of the ELER routine are based on obtaining intensity distributions analytically and estimating the total number of casualties and their geographic distribution, either using regionally adjusted intensity-casualty or magnitude-casualty correlations (Level 0) or using regional building inventory data bases (Level 1). For given basic source parameters, the intensity distributions can be computed using: a) regional intensity attenuation relationships, b) intensity correlations with attenuation-relationship-based PGV, PGA, and spectral amplitudes, and c) intensity correlations with a synthetic Fourier amplitude spectrum. In the Level 1 analysis, EMS98-based building vulnerability relationships are used for regional estimates of building damage and casualty distributions. Results obtained from pilot applications of the Level 0 and Level 1 analysis modes of the ELER software to the 1999 M 7.4 Kocaeli, 1995 M 6.1 Dinar, and 2007 M 5.4 Bingol earthquakes in terms of ground shaking and losses are presented, and comparisons with the observed losses are made. The regional earthquake shaking and loss information is intended for dissemination in a timely manner to related agencies for the planning and coordination of post-earthquake emergency response. However, the same software can also be used for scenario earthquake loss estimation and related Monte Carlo-type simulations.
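
    A toy Level-0-style chain, pairing a generic intensity attenuation relation with a toy intensity-casualty correlation; the functional forms and coefficients are placeholders, not ELER's regional fits:

        import numpy as np

        def intensity(magnitude, dist_km, a=1.5, b=3.5, c=1.0):
            # Generic attenuation I = a*M - b*log10(R) + c; the coefficients
            # are placeholders, not ELER's regional fits.
            return a * magnitude - b * np.log10(np.maximum(dist_km, 1.0)) + c

        def casualties(population, inten, k=1e-5):
            # Toy intensity-casualty correlation, quadratic above intensity 5.
            return population * k * np.maximum(inten - 5.0, 0.0) ** 2

        dist = np.array([5.0, 20.0, 50.0, 100.0])   # km from the source
        pop = np.array([2e5, 5e5, 1e6, 3e6])        # exposed population
        inten = intensity(7.4, dist)
        print("intensities:", inten.round(1))
        print("estimated casualties:", casualties(pop, inten).round(0))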

  19. MARZ: Manual and automatic redshifting software

    NASA Astrophysics Data System (ADS)

    Hinton, S. R.; Davis, Tamara M.; Lidman, C.; Glazebrook, K.; Lewis, G. F.

    2016-04-01

    The Australian Dark Energy Survey (OzDES) is a 100-night spectroscopic survey underway on the Anglo-Australian Telescope using the fibre-fed 2-degree-field (2dF) spectrograph. We have developed a new redshifting application, MARZ, with greater usability, flexibility, and the capacity to analyse a wider range of object types than the RUNZ software package previously used for redshifting spectra from 2dF. MARZ is an open-source, client-based JavaScript web application which provides an intuitive interface and powerful automatic matching capabilities on spectra generated from the AAOmega spectrograph to produce high-quality spectroscopic redshift measurements. The software can be run interactively or via the command line, and is easily adaptable to other instruments and pipelines if conforming to the current FITS file standard. Behind the scenes, a modified version of the AUTOZ cross-correlation algorithm is used to match input spectra against a variety of stellar and galaxy templates, and automatic matching performance for OzDES spectra has increased from 54% (RUNZ) to 91% (MARZ). Spectra not matched correctly by the automatic algorithm can be easily redshifted manually by cycling through automatic results, manual template comparison, or marking spectral features.
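
    The core of template cross-correlation redshifting can be sketched in a few lines: on a log-wavelength grid a redshift is a rigid shift, so the best-fitting lag of a template against the spectrum gives z. This bare-bones version omits the filtering, weighting, and peak refinement that AUTOZ/MARZ perform:

        import numpy as np

        # On a log-wavelength grid, redshifting is a rigid shift:
        # log(lambda_obs) = log(lambda_rest) + log(1 + z).
        loglam = np.linspace(np.log(3600.0), np.log(8800.0), 4000)
        dloglam = loglam[1] - loglam[0]

        # Toy rest-frame template: a single emission line near 4000 A.
        template = np.exp(-0.5 * ((loglam - np.log(4000.0)) / 0.002) ** 2)

        z_true = 0.30
        shift = int(round(np.log(1 + z_true) / dloglam))
        rng = np.random.default_rng(2)
        spectrum = np.roll(template, shift) + rng.normal(0, 0.05, loglam.size)

        corr = np.correlate(spectrum - spectrum.mean(),
                            template - template.mean(), mode="full")
        lag = int(corr.argmax()) - (template.size - 1)
        print(f"recovered z ~ {np.expm1(lag * dloglam):.4f}")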

  20. MRM-DIFF: data processing strategy for differential analysis in large scale MRM-based lipidomics studies.

    PubMed

    Tsugawa, Hiroshi; Ohta, Erika; Izumi, Yoshihiro; Ogiwara, Atsushi; Yukihira, Daichi; Bamba, Takeshi; Fukusaki, Eiichiro; Arita, Masanori

    2014-01-01

    Based on theoretically calculated comprehensive lipid libraries, as many as 1000 multiple reaction monitoring (MRM) transitions can be monitored in a single lipidomics run. On the other hand, lipid analysis from each MRM chromatogram requires tremendous manual effort to identify and quantify lipid species. Isotopic peaks differing by up to a few atomic masses further complicate analysis. To accelerate the identification and quantification process, we developed novel software, MRM-DIFF, for the differential analysis of large-scale MRM assays. It supports a correlation optimized warping (COW) algorithm to align MRM chromatograms and utilizes quality control (QC) sample datasets to automatically adjust the alignment parameters. Moreover, user-defined reference libraries that include the molecular formula, retention time, and MRM transition can be used to identify target lipids and to correct peak abundances by considering isotopic peaks. Here, we demonstrate the software pipeline and introduce key points for MRM-based lipidomics research to reduce the mis-identification and overestimation of lipid profiles. The MRM-DIFF program, an example data set, and tutorials are downloadable in the "Standalone software" section of the PRIMe (Platform for RIKEN Metabolomics, http://prime.psc.riken.jp/) database website.
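
    A much-simplified stand-in for the alignment step: real COW stretches and compresses segment lengths via dynamic programming, whereas this sketch only translates fixed-width segments to the integer shift that maximizes correlation with a reference chromatogram:

        import numpy as np

        def align(sample, reference, seg=50, max_shift=5):
            """Shift each fixed-width segment of `sample` to best match
            `reference` (toy translation-only variant of COW)."""
            out = sample.copy()
            for start in range(0, sample.size - seg + 1, seg):
                ref_seg = reference[start:start + seg]
                best, best_r = 0, -np.inf
                for s in range(-max_shift, max_shift + 1):
                    lo, hi = start + s, start + s + seg
                    if lo < 0 or hi > sample.size:
                        continue
                    r = np.corrcoef(sample[lo:hi], ref_seg)[0, 1]
                    if r > best_r:
                        best, best_r = s, r
                out[start:start + seg] = sample[start + best:start + best + seg]
            return out

        # Synthetic chromatograms: the sample peak is slightly late.
        t = np.linspace(0, 10, 500)
        rng = np.random.default_rng(3)
        ref = np.exp(-(t - 4.0) ** 2 / 0.05) + rng.normal(0, 0.01, t.size)
        sam = np.exp(-(t - 4.1) ** 2 / 0.05) + rng.normal(0, 0.01, t.size)
        print("r before:", np.corrcoef(sam, ref)[0, 1].round(3))
        print("r after: ", np.corrcoef(align(sam, ref), ref)[0, 1].round(3))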
