Assessment of three different software systems in the evaluation of dynamic MRI of the breast.
Kurz, K D; Steinhaus, D; Klar, V; Cohnen, M; Wittsack, H J; Saleh, A; Mödder, U; Blondin, D
2009-02-01
The aim was to compare the diagnostic performance and handling of dynamic contrast-enhanced MRI of the breast with two commercial software solutions ("CADstream" and "3TP") and one self-developed software system ("Mammatool"). Identical data sets of dynamic breast MRI from 21 patients were evaluated retrospectively with all three software systems. Exams were classified according to BI-RADS. The number of lesions in the parametric mapping was compared to histology or follow-up of more than 2 years. In addition, 25 quality criteria were scored from 0 to 5 by 3 independent investigators. Statistical analysis was performed to document the quality ranking of the different software systems. There were 9 invasive carcinomas, one pure DCIS, one papilloma, one radial scar, three histologically proven changes due to mastopathy, one adenosis and two fibroadenomas. Additionally, two patients with enhancing parenchyma followed with MRI for more than 3 years and one scar after breast-conserving therapy were included. All malignant lesions were classified as BI-RADS 4 or 5 using all software systems and showed significant enhancement in the parametric mapping. "CADstream" showed the best score on subjective quality criteria. "3TP" showed the lowest number of false-positive results. "Mammatool" produced the lowest number of benign tissues indicated with parametric overlay. All three software programs tested were adequate for sensitive and efficient assessment of dynamic MRI of the breast. Improvements in specificity may be achievable.
Ceschin, Rafael; Panigrahy, Ashok; Gopalakrishnan, Vanathi
2015-01-01
A major challenge in the diagnosis and treatment of brain tumors is tissue heterogeneity leading to mixed treatment response. Additionally, these tumors are often difficult or very risky to biopsy, further hindering the clinical management process. To overcome this, novel advanced imaging methods are increasingly being adapted clinically to identify useful noninvasive biomarkers capable of disease stage characterization and treatment response prediction. One promising technique is called functional diffusion mapping (fDM), which uses diffusion-weighted imaging (DWI) to generate parametric maps between two imaging time points in order to identify significant voxel-wise changes in water diffusion within the tumor tissue. Here we introduce serial functional diffusion mapping (sfDM), an extension of existing fDM methods, to analyze the entire tumor diffusion profile along the temporal course of the disease. sfDM provides the tools necessary to analyze a tumor data set in the context of spatiotemporal parametric mapping: the image registration pipeline, biomarker extraction, and visualization tools. We present the general workflow of the pipeline, along with a typical use case for the software. sfDM is written in Python and is freely available as an open-source package under the Berkeley Software Distribution (BSD) license to promote transparency and reproducibility.
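The core fDM operation that sfDM extends, voxel-wise classification of diffusion change between two registered time points, can be sketched as follows. The threshold value and array layout are illustrative assumptions, not sfDM's actual defaults:

```python
import numpy as np

def fdm_classify(adc_pre, adc_post, threshold=0.55e-3):
    """Classify voxel-wise ADC change between two registered time points.

    Returns an integer map: +1 where diffusion increased beyond the
    threshold, -1 where it decreased, 0 otherwise. The threshold
    (in mm^2/s) is illustrative only.
    """
    delta = adc_post - adc_pre
    fdm = np.zeros_like(delta, dtype=int)
    fdm[delta > threshold] = 1
    fdm[delta < -threshold] = -1
    return fdm
```

In practice the two ADC volumes must first be brought into spatial correspondence by the registration pipeline; the classification itself is this simple per-voxel comparison.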
Ghosh, Adarsh; Singh, Tulika; Singla, Veenu; Bagga, Rashmi; Khandelwal, Niranjan
2017-12-01
Apparent diffusion coefficient (ADC) maps are usually generated by built-in software provided by the MRI scanner vendors; however, various open-source postprocessing software packages are available for image manipulation and parametric map generation. The purpose of this study is to establish the reproducibility of absolute ADC values obtained using different postprocessing software programs. DW images with three b values were acquired with a 1.5-T MRI scanner, and trace images were generated. ADC maps were automatically generated by the in-line software provided by the vendor during image generation and were also separately generated on postprocessing software. These ADC maps were compared on the basis of ROIs using paired t test, Bland-Altman plot, mountain plot, and Passing-Bablok regression plot. There was a statistically significant difference in the mean ADC values obtained from the different postprocessing software programs when the same baseline trace DW images were used for the ADC map generation. For using ADC values as a quantitative cutoff for histologic characterization of tissues, standardization of the postprocessing algorithm is essential across processing software packages, especially in view of the implementation of vendor-neutral archiving.
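The mono-exponential fit that such postprocessing packages perform can be sketched in a few lines; the signal floor and array layout here are illustrative assumptions:

```python
import numpy as np

def adc_map(signals, bvalues):
    """Fit ADC voxel-wise from DW images at several b-values.

    signals: array of shape (nb, ny, nx); bvalues: length-nb sequence.
    Uses the mono-exponential model S(b) = S0 * exp(-b * ADC), fitted by
    least squares on log(S). Units follow the b-values (s/mm^2 -> mm^2/s).
    """
    b = np.asarray(bvalues, dtype=float)
    # Clip to avoid log(0); the floor value is an arbitrary choice.
    logs = np.log(np.clip(signals, 1e-6, None)).reshape(len(b), -1)
    # polyfit returns [slope, intercept] per voxel column; ADC = -slope.
    slope, _ = np.polyfit(b, logs, 1)
    return -slope.reshape(signals.shape[1:])
```

Differences between packages of the kind the study reports can arise from exactly such details: which b-values enter the fit, how zero or noisy signals are handled, and whether a weighted or unweighted log-linear fit is used.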
Robust biological parametric mapping: an improved technique for multimodal brain image analysis
NASA Astrophysics Data System (ADS)
Yang, Xue; Beason-Held, Lori; Resnick, Susan M.; Landman, Bennett A.
2011-03-01
Mapping the quantitative relationship between structure and function in the human brain is an important and challenging problem. Numerous volumetric, surface, region of interest and voxelwise image processing techniques have been developed to statistically assess potential correlations between imaging and non-imaging metrics. Recently, biological parametric mapping has extended the widely popular statistical parametric approach to enable application of the general linear model to multiple image modalities (both for regressors and regressands) along with scalar valued observations. This approach offers great promise for direct, voxelwise assessment of structural and functional relationships with multiple imaging modalities. However, as presented, the biological parametric mapping approach is not robust to outliers and may lead to invalid inferences (e.g., artifactual low p-values) due to slight mis-registration or variation in anatomy between subjects. To enable widespread application of this approach, we introduce robust regression and robust inference in the neuroimaging context of application of the general linear model. Through simulation and empirical studies, we demonstrate that our robust approach reduces sensitivity to outliers without substantial degradation in power. The robust approach and associated software package provides a reliable way to quantitatively assess voxelwise correlations between structural and functional neuroimaging modalities.
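The robust regression idea can be sketched with an iteratively reweighted least-squares (IRLS) fit using Huber weights; the paper does not commit to this exact estimator, so treat the weight function and tuning constant as assumptions:

```python
import numpy as np

def huber_irls(X, y, k=1.345, iters=20):
    """Robust GLM fit at one voxel via iteratively reweighted least
    squares with Huber weights. X: (n, p) design matrix, y: (n,)
    response. k=1.345 gives ~95% efficiency under Gaussian noise."""
    beta = np.linalg.lstsq(X, y, rcond=None)[0]
    for _ in range(iters):
        r = y - X @ beta
        s = np.median(np.abs(r)) / 0.6745 + 1e-12   # robust scale (MAD)
        u = np.abs(r) / s
        w = np.where(u <= k, 1.0, k / u)            # Huber weights
        sw = np.sqrt(w)
        beta = np.linalg.lstsq(X * sw[:, None], y * sw, rcond=None)[0]
    return beta
```

An outlying voxel value, e.g. from slight mis-registration in one subject, is down-weighted rather than allowed to dominate the fit, which is the mechanism behind the reduced sensitivity to artifactual low p-values described above.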
Perfusion CT in acute stroke: effectiveness of automatically-generated colour maps.
Ukmar, Maja; Degrassi, Ferruccio; Pozzi Mucelli, Roberta Antea; Neri, Francesca; Mucelli, Fabio Pozzi; Cova, Maria Assunta
2017-04-01
To evaluate the accuracy of perfusion CT (pCT) in the definition of the infarcted core and the penumbra, comparing the data obtained from the evaluation of parametric maps [cerebral blood volume (CBV), cerebral blood flow (CBF) and mean transit time (MTT)] with software-generated colour maps. A retrospective analysis was performed to identify patients with suspected acute ischaemic stroke who had undergone unenhanced CT and pCT within 4.5 h from the onset of symptoms. A qualitative evaluation of the CBV, CBF and MTT maps was performed, followed by an analysis of the colour maps automatically generated by the software. 26 patients were identified, but direct CT follow-up after 24-48 h was performed in only 19. In the qualitative analysis, 14 patients showed perfusion abnormalities. Specifically, 29 perfusion deficit areas were detected, of which 15 suggested penumbra and the remaining 14 suggested infarct. In the automatically generated maps, 12 patients showed perfusion abnormalities; 25 perfusion deficit areas were identified, 15 of which suggested penumbra and the other 10 infarct. McNemar's test showed no statistically significant difference between the two methods of evaluation in highlighting infarcted areas confirmed later at CT follow-up. We demonstrated that pCT provides good diagnostic accuracy in the identification of acute ischaemic lesions. The main limits in lesion identification lie at the level of the pons and in the basal ganglia. Qualitative analysis proved more efficient than software-generated maps in identifying perfusion lesions. However, software-generated maps proved very useful in the emergency setting. Advances in knowledge: The use of CT perfusion is requested in increasingly more patients in order to optimize treatment, thanks also to the technological evolution of CT, which now allows whole-brain studies.
The need to perform CT perfusion studies in the emergency setting can pose a problem for physicians who are not used to interpreting the parametric maps (CBV, MTT etc.). Software-generated maps could be of value in these settings, helping less experienced physicians differentiate between the different areas.
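The three parametric maps are linked by the central volume principle, MTT = CBV / CBF, which is how perfusion software typically derives one map from the other two. A minimal sketch (the unit conventions are assumptions, not taken from the study):

```python
import numpy as np

def mtt_map(cbv, cbf):
    """Mean transit time map via the central volume principle,
    MTT = CBV / CBF. With CBV in ml/100 g and CBF in ml/100 g/min,
    multiplying by 60 expresses MTT in seconds. Voxels with zero
    flow (e.g. outside brain tissue) are set to 0."""
    cbv = np.asarray(cbv, dtype=float)
    cbf = np.asarray(cbf, dtype=float)
    with np.errstate(divide="ignore", invalid="ignore"):
        mtt = np.where(cbf > 0, cbv / cbf * 60.0, 0.0)
    return mtt
```

The penumbra/infarct distinction rests on patterns across these maps: classically, prolonged MTT with preserved CBV suggests penumbra, while matched CBV and CBF reduction suggests infarct core.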
A tool for the estimation of the distribution of landslide area in R
NASA Astrophysics Data System (ADS)
Rossi, M.; Cardinali, M.; Fiorucci, F.; Marchesini, I.; Mondini, A. C.; Santangelo, M.; Ghosh, S.; Riguer, D. E. L.; Lahousse, T.; Chang, K. T.; Guzzetti, F.
2012-04-01
We have developed a tool in R (the free software environment for statistical computing, http://www.r-project.org/) to estimate the probability density and the frequency density of landslide area. The tool implements parametric and non-parametric approaches to the estimation of the probability density and the frequency density of landslide area, including: (i) Histogram Density Estimation (HDE), (ii) Kernel Density Estimation (KDE), and (iii) Maximum Likelihood Estimation (MLE). The tool is available as a standard Open Geospatial Consortium (OGC) Web Processing Service (WPS), and is accessible through the web using different GIS software clients. We tested the tool to compare Double Pareto and Inverse Gamma models for the probability density of landslide area in different geological, morphological and climatological settings, and to compare landslides shown in inventory maps prepared using different mapping techniques, including (i) field mapping, (ii) visual interpretation of monoscopic and stereoscopic aerial photographs, (iii) visual interpretation of monoscopic and stereoscopic VHR satellite images and (iv) semi-automatic detection and mapping from VHR satellite images. Results show that both models are applicable in different geomorphological settings. In most cases the two models provided very similar results. Non-parametric estimation methods (i.e., HDE and KDE) provided reasonable results for all the tested landslide datasets. For some of the datasets, MLE failed to provide a result due to convergence problems. The two tested models (Double Pareto and Inverse Gamma) produced very similar results for large and very large datasets (> 150 samples). Differences in the modeling results were observed for small datasets affected by systematic biases.
A distinct rollover was observed in all analyzed landslide datasets, except for a few datasets obtained from landslide inventories prepared through field mapping or by semi-automatic mapping from VHR satellite imagery. The tool can also be used to evaluate the probability density and the frequency density of landslide volume.
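As an illustration of the non-parametric estimators the tool implements (written here in Python rather than R, with log-space smoothing as an assumed design choice, since landslide areas span several orders of magnitude), a Gaussian KDE for landslide area might look like:

```python
import numpy as np

def kde_density(areas, grid, bandwidth=None):
    """Gaussian kernel density estimate of the landslide-area
    probability density, evaluated on `grid`. Smoothing is done in
    log10(area) space; bandwidth defaults to Silverman's rule."""
    x = np.log10(np.asarray(areas, dtype=float))
    g = np.log10(np.asarray(grid, dtype=float))
    n = x.size
    if bandwidth is None:
        bandwidth = 1.06 * x.std(ddof=1) * n ** (-1 / 5)
    z = (g[:, None] - x[None, :]) / bandwidth
    dens_log = np.exp(-0.5 * z**2).sum(axis=1) / (n * bandwidth * np.sqrt(2 * np.pi))
    # Change of variables back to linear area: p(A) = p(log10 A) / (A ln 10)
    return dens_log / (np.asarray(grid, dtype=float) * np.log(10))
```

The rollover mentioned above would appear as the maximum of this estimated density at small areas, below which the inventory under-samples small landslides.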
DOE Office of Scientific and Technical Information (OSTI.GOV)
Balasubramoniam, A; Bednarek, D; Rudin, S
Purpose: To create 4D parametric images using biplane Digital Subtraction Angiography (DSA) sequences co-registered with the 3D vascular geometry obtained from Cone Beam-CT (CBCT). Methods: We investigated a method to derive multiple 4D Parametric Imaging (PI) maps using only one CBCT acquisition. During this procedure a 3D-DSA geometry is stored and used subsequently for all 4D images. Each time a biplane DSA is acquired, we calculate 2D parametric maps of Bolus Arrival Time (BAT), Mean Transit Time (MTT) and Time to Peak (TTP). Arterial segments that are nearly parallel with one of the biplane imaging planes in the 2D parametric maps are co-registered with the 3D geometry. Because the co-registered points on the vasculature are discrete, values in the remaining vascular network are filled in using spline interpolation. To evaluate the method we used a patient CT volume data set to 3D print a neurovascular phantom containing a complete Circle of Willis. We connected the phantom to a flow loop with a peristaltic pump, simulating physiological flow conditions. Contrast media was injected with an automatic injector at 10 ml/s. Images were acquired with a Toshiba Infinix C-arm and 4D parametric image maps of the vasculature were calculated. Results: 4D BAT, MTT, and TTP parametric image maps of the Circle of Willis were derived. We generated color-coded 3D geometries which avoided artifacts due to vessel overlap or foreshortening in the projection direction. Conclusion: The software was tested successfully and multiple 4D parametric images were obtained from biplane DSA sequences without the need to acquire additional 3D-DSA runs. This can benefit the patient by reducing the contrast media and the radiation dose normally associated with these procedures. Partial support from NIH Grant R01-EB002873 and Toshiba Medical Systems Corp.
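The per-pixel parameter extraction behind such maps can be sketched as follows; the 10%-of-peak arrival criterion is an illustrative assumption, not necessarily the authors' definition:

```python
import numpy as np

def bolus_parameters(curve, dt, frac=0.1):
    """Extract Bolus Arrival Time (BAT) and Time To Peak (TTP) from a
    single contrast time-density curve sampled every `dt` seconds.

    BAT is taken here as the first frame whose value reaches `frac`
    of the peak; TTP is the time of the maximum. Both conventions are
    illustrative choices."""
    c = np.asarray(curve, dtype=float)
    peak = c.max()
    bat = np.argmax(c >= frac * peak) * dt
    ttp = np.argmax(c) * dt
    return bat, ttp
```

Applying this at every vessel pixel of each biplane DSA frame sequence yields the 2D BAT and TTP maps that are then painted onto the 3D geometry.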
Malyarenko, Dariya; Fedorov, Andriy; Bell, Laura; Prah, Melissa; Hectors, Stefanie; Arlinghaus, Lori; Muzi, Mark; Solaiyappan, Meiyappan; Jacobs, Michael; Fung, Maggie; Shukla-Dave, Amita; McManus, Kevin; Boss, Michael; Taouli, Bachir; Yankeelov, Thomas E; Quarles, Christopher Chad; Schmainda, Kathleen; Chenevert, Thomas L; Newitt, David C
2018-01-01
This paper reports on results of a multisite collaborative project launched by the MRI subgroup of the Quantitative Imaging Network to assess current capability and provide future guidelines for generating a standard Digital Imaging and Communications in Medicine (DICOM) parametric diffusion map in clinical trials that utilize quantitative diffusion-weighted imaging (DWI). Participating sites used a multivendor DWI DICOM dataset of a single phantom to generate parametric maps (PMs) of the apparent diffusion coefficient (ADC) based on two models. The results were evaluated for numerical consistency among models and true phantom ADC values, as well as for consistency of metadata with attributes required by the DICOM standards. This analysis identified missing metadata descriptive of the sources for detected numerical discrepancies among ADC models. Instead of the DICOM PM object, all sites stored ADC maps as DICOM MR objects, generally lacking designated attributes and coded terms for quantitative DWI modeling. Source-image reference, model parameters, ADC units and scale, deemed important for numerical consistency, were either missing or stored using nonstandard conventions. Guided by the identified limitations, the DICOM PM standard has been amended to include coded terms for the relevant diffusion models. Open-source software has been developed to support conversion of site-specific formats into the standard representation.
Biological Parametric Mapping: A Statistical Toolbox for Multi-Modality Brain Image Analysis
Casanova, Ramon; Ryali, Srikanth; Baer, Aaron; Laurienti, Paul J.; Burdette, Jonathan H.; Hayasaka, Satoru; Flowers, Lynn; Wood, Frank; Maldjian, Joseph A.
2006-01-01
In recent years multiple brain MR imaging modalities have emerged; however, analysis methodologies have mainly remained modality specific. In addition, when comparing across imaging modalities, most researchers have been forced to rely on simple region-of-interest type analyses, which do not allow the voxel-by-voxel comparisons necessary to answer more sophisticated neuroscience questions. To overcome these limitations, we developed a toolbox for multimodal image analysis called biological parametric mapping (BPM), based on a voxel-wise use of the general linear model. The BPM toolbox incorporates information obtained from other modalities as regressors in a voxel-wise analysis, thereby permitting investigation of more sophisticated hypotheses. The BPM toolbox has been developed in MATLAB with a user-friendly interface for performing analyses, including voxel-wise multimodal correlation, ANCOVA, and multiple regression. It has a high degree of integration with the SPM (statistical parametric mapping) software, relying on it for visualization and statistical inference. Furthermore, statistical inference for a correlation field, rather than a widely-used T-field, has been implemented in the correlation analysis for more accurate results. An example with in-vivo data is presented, demonstrating the potential of the BPM methodology as a tool for multimodal image analysis. PMID:17070709
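The voxel-wise multimodal correlation BPM performs can be sketched in NumPy (Python rather than the toolbox's MATLAB, and without the correlation-field inference the toolbox adds on top):

```python
import numpy as np

def correlation_map(mod_a, mod_b):
    """Voxel-wise Pearson correlation between two registered modalities
    across subjects. mod_a, mod_b: arrays of shape (nsubj, nvox).
    Returns r per voxel and the corresponding t statistic with
    nsubj - 2 degrees of freedom."""
    a = mod_a - mod_a.mean(axis=0)
    b = mod_b - mod_b.mean(axis=0)
    r = (a * b).sum(axis=0) / np.sqrt((a**2).sum(axis=0) * (b**2).sum(axis=0))
    n = mod_a.shape[0]
    t = r * np.sqrt((n - 2) / (1 - r**2))
    return r, t
```

Here one modality plays the role of regressor and the other of regressand at every voxel, which is the basic move that distinguishes BPM from single-modality statistical parametric mapping.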
NASA Astrophysics Data System (ADS)
Dore, C.; Murphy, M.
2013-02-01
This paper outlines a new approach for generating digital heritage models from laser scan or photogrammetric data using Historic Building Information Modelling (HBIM). HBIM is a plug-in for Building Information Modelling (BIM) software that uses parametric library objects and procedural modelling techniques to automate the modelling stage. The HBIM process involves a reverse engineering solution whereby parametric interactive objects representing architectural elements are mapped onto laser scan or photogrammetric survey data. A library of parametric architectural objects has been designed from historic manuscripts and architectural pattern books. These parametric objects were built using an embedded programming language within the ArchiCAD BIM software called Geometric Description Language (GDL). Procedural modelling techniques have been implemented with the same language to create a parametric building façade which automatically combines library objects based on architectural rules and proportions. Different configurations of the façade are controlled by user parameter adjustment. The automatically positioned elements of the façade can be subsequently refined using graphical editing while overlaying the model with orthographic imagery. Along with this semi-automatic method for generating façade models, manual plotting of library objects can also be used to generate a BIM model from survey data. After the 3D model has been completed, conservation documents such as plans, sections, elevations and 3D views can be automatically generated for conservation projects.
Can color-coded parametric maps improve dynamic enhancement pattern analysis in MR mammography?
Baltzer, P A; Dietzel, M; Vag, T; Beger, S; Freiberg, C; Herzog, A B; Gajda, M; Camara, O; Kaiser, W A
2010-03-01
Post-contrast enhancement characteristics (PEC) are a major criterion for differential diagnosis in MR mammography (MRM). Manual placement of regions of interest (ROIs) to obtain time/signal intensity curves (TSIC) is the standard approach to assess dynamic enhancement data. Computers can automatically calculate the TSIC in every lesion voxel and combine this data to form one color-coded parametric map (CCPM). Thus, the TSIC of the whole lesion can be assessed. This investigation was conducted to compare the diagnostic accuracy (DA) of CCPM with TSIC for the assessment of PEC. 329 consecutive patients with 469 histologically verified lesions were examined. MRM was performed according to a standard protocol (1.5 T, 0.1 mmol/kg body weight Gd-DTPA). ROIs were drawn manually within each lesion to calculate the TSIC. CCPMs were created in all patients using dedicated software (CAD Sciences). Both methods were rated by 2 observers in consensus on an ordinal scale. Receiver operating characteristics (ROC) analysis was used to compare both methods. The area under the curve (AUC) was significantly (p=0.026) higher for CCPM (0.829) than TSIC (0.749). The sensitivity was 88.5% (CCPM) vs. 82.8% (TSIC), whereas equal specificity levels were found (CCPM: 63.7%, TSIC: 63.0%). The color-coded parametric maps (CCPMs) showed a significantly higher DA compared to TSIC; in particular, sensitivity was increased. Therefore, the CCPM method is a feasible approach to assessing dynamic data in MRM and condenses several imaging series into one parametric map. © Georg Thieme Verlag KG Stuttgart · New York.
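The curve-type logic underlying both ROI-based TSIC reading and CCPM generation can be sketched as follows; the 10% bands and the assumption that index 1 is the first post-contrast frame are illustrative, not the study's exact definitions:

```python
import numpy as np

def curve_type(tsic, plateau_tol=0.10):
    """Classify a time/signal intensity curve (TSIC) by its delayed
    phase: 'persistent' if the late signal rises more than 10% above
    the initial post-contrast value, 'washout' if it falls more than
    10% below, otherwise 'plateau'. Assumes tsic[0] is pre-contrast
    and tsic[1] the first post-contrast measurement."""
    c = np.asarray(tsic, dtype=float)
    initial = c[1]
    late = c[-1]
    rel = (late - initial) / initial
    if rel > plateau_tol:
        return "persistent"
    if rel < -plateau_tol:
        return "washout"
    return "plateau"
```

A CCPM simply runs this classification at every lesion voxel and assigns each class a color, so heterogeneous lesions containing even small washout regions become visible at a glance.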
Semiautomated Workflow for Clinically Streamlined Glioma Parametric Response Mapping
Keith, Lauren; Ross, Brian D.; Galbán, Craig J.; Luker, Gary D.; Galbán, Stefanie; Zhao, Binsheng; Guo, Xiaotao; Chenevert, Thomas L.; Hoff, Benjamin A.
2017-01-01
Management of glioblastoma multiforme remains a challenging problem despite recent advances in targeted therapies. Timely assessment of therapeutic agents is hindered by the lack of standard quantitative imaging protocols for determining targeted response. Clinical response assessment for brain tumors is determined by volumetric changes assessed at 10 weeks post-treatment initiation. Further, current clinical criteria fail to use advanced quantitative imaging approaches, such as diffusion and perfusion magnetic resonance imaging. Development of the parametric response mapping (PRM) approach applied to diffusion-weighted magnetic resonance imaging has provided a sensitive and early biomarker of successful cytotoxic therapy in brain tumors while maintaining a spatial context within the tumor. Although PRM provides an earlier readout than volumetry and sometimes greater sensitivity compared with traditional whole-tumor diffusion statistics, it is not routinely used for patient management; automated, standardized software for performing the analysis and generating a clinical report document is required for this. We present a semiautomated and seamless workflow for image coregistration, segmentation, and PRM classification of glioblastoma multiforme diffusion-weighted magnetic resonance imaging scans. The software solution can be integrated using local hardware or performed remotely in the cloud while providing connectivity to existing picture archive and communication systems. This is an important step toward implementing PRM analysis of solid tumors in routine clinical practice. PMID:28286871
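The PRM summary statistics used as response biomarkers can be sketched as tumor volume fractions with increased, decreased, and unchanged diffusion; the ADC-change threshold below is illustrative (published PRM work derives it from test-retest repeatability):

```python
import numpy as np

def prm_fractions(adc_pre, adc_post, tumor_mask, threshold=0.4e-3):
    """Summarize a parametric response map as the fractions of tumor
    voxels whose ADC increased, decreased, or stayed within +/-threshold
    between two coregistered scans. Threshold in mm^2/s, illustrative."""
    delta = (adc_post - adc_pre)[tumor_mask]
    n = delta.size
    inc = np.count_nonzero(delta > threshold) / n
    dec = np.count_nonzero(delta < -threshold) / n
    return {"increased": inc, "decreased": dec, "unchanged": 1.0 - inc - dec}
```

The coregistration and segmentation stages of the workflow exist precisely to make this per-voxel subtraction valid; the classification itself is then a small final step.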
Automated detection of extended sources in radio maps: progress from the SCORPIO survey
NASA Astrophysics Data System (ADS)
Riggi, S.; Ingallinera, A.; Leto, P.; Cavallaro, F.; Bufano, F.; Schillirò, F.; Trigilio, C.; Umana, G.; Buemi, C. S.; Norris, R. P.
2016-08-01
Automated source extraction and parametrization represents a crucial challenge for the next-generation radio interferometer surveys, such as those performed with the Square Kilometre Array (SKA) and its precursors. In this paper, we present a new algorithm, called CAESAR (Compact And Extended Source Automated Recognition), to detect and parametrize extended sources in radio interferometric maps. It is based on a pre-filtering stage, allowing image denoising, compact source suppression and enhancement of diffuse emission, followed by an adaptive superpixel clustering stage for final source segmentation. A parametrization stage provides source flux information and a wide range of morphology estimators for post-processing analysis. We developed CAESAR in a modular software library, also including different methods for local background estimation and image filtering, along with alternative algorithms for both compact and diffuse source extraction. The method was applied to real radio continuum data collected at the Australian Telescope Compact Array (ATCA) within the SCORPIO project, a pathfinder of the Evolutionary Map of the Universe (EMU) survey at the Australian Square Kilometre Array Pathfinder (ASKAP). The source reconstruction capabilities were studied over different test fields in the presence of compact sources, imaging artefacts and diffuse emission from the Galactic plane and compared with existing algorithms. When compared to a human-driven analysis, the designed algorithm was found capable of detecting known target sources and regions of diffuse emission, outperforming alternative approaches over the considered fields.
NASA Technical Reports Server (NTRS)
Wallace, Dolores R.
2003-01-01
In FY01 we learned that hardware reliability models need substantial changes to account for differences in software, thus making software reliability measurements more effective, accurate, and easier to apply. These reliability models are generally based on familiar distributions or parametric methods. An obvious question is: "What new statistical and probability models can be developed using non-parametric and distribution-free methods instead of the traditional parametric methods?" Two approaches to software reliability engineering appear promising. The first study, begun in FY01, is grounded in hardware reliability, a well-established science with many aspects that can be applied to software. This research effort has investigated mathematical aspects of hardware reliability and has identified those applicable to software. Currently the effort is applying and testing these approaches to software reliability measurement. These parametric models require extensive project data that may be difficult to obtain and interpret. Projects at GSFC are often complex in both technology and schedules. Assessing and estimating the reliability of the final system is extremely difficult when various subsystems are tested and completed long before others. Non-parametric and distribution-free techniques may offer a new and accurate way of modeling failure time and other project data to provide earlier and more accurate estimates of system reliability.
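A classic distribution-free estimator of the kind contemplated here is the Kaplan-Meier survival curve, which handles censored failure times, e.g. subsystems whose testing ended before any failure was seen. A minimal sketch (not drawn from the report itself):

```python
def kaplan_meier(times, events):
    """Non-parametric (distribution-free) estimate of the survival
    function from failure data. events[i] is True if a failure was
    observed at times[i], False if the observation was censored.
    Returns the sorted failure times and the survival estimate after
    each one."""
    data = sorted(zip(times, events))
    n_at_risk = len(data)
    surv, out_t, out_s = 1.0, [], []
    i = 0
    while i < len(data):
        t = data[i][0]
        deaths = sum(1 for tt, e in data[i:] if tt == t and e)
        ties = sum(1 for tt, _ in data[i:] if tt == t)
        if deaths:
            surv *= 1.0 - deaths / n_at_risk
            out_t.append(t)
            out_s.append(surv)
        n_at_risk -= ties
        i += ties
    return out_t, out_s
```

Because no failure-time distribution is assumed, such an estimate can be formed early in a project from whatever subsystem test data exist, which is exactly the advantage the passage attributes to distribution-free techniques.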
STAMPS: Software Tool for Automated MRI Post-processing on a supercomputer.
Bigler, Don C; Aksu, Yaman; Miller, David J; Yang, Qing X
2009-08-01
This paper describes a Software Tool for Automated MRI Post-processing (STAMP) of multiple types of brain MRIs on a workstation and for parallel processing on a supercomputer (STAMPS). This software tool enables the automation of nonlinear registration for a large image set and for multiple MR image types. The tool uses standard brain MRI post-processing tools (such as SPM, FSL, and HAMMER) for multiple MR image types in a pipeline fashion. It also contains novel MRI post-processing features. The STAMP image outputs can be used to perform brain analysis using Statistical Parametric Mapping (SPM) or single-/multi-image modality brain analysis using Support Vector Machines (SVMs). Since STAMPS is PBS-based, the supercomputer may be a multi-node computer cluster or one of the latest multi-core computers.
Booth, Brian G; Keijsers, Noël L W; Sijbers, Jan; Huysmans, Toon
2018-05-03
Pedobarography produces large sets of plantar pressure samples that are routinely subsampled (e.g. using regions of interest) or aggregated (e.g. center of pressure trajectories, peak pressure images) in order to simplify statistical analysis and provide intuitive clinical measures. We hypothesize that these data reductions discard gait information that can be used to differentiate between groups or conditions. To test the hypothesis of null information loss, we created an implementation of statistical parametric mapping (SPM) for dynamic plantar pressure datasets (i.e. plantar pressure videos). Our SPM software framework brings all plantar pressure videos into anatomical and temporal correspondence, then performs statistical tests at each sampling location in space and time. As a novel contribution, we introduce non-linear temporal registration into the framework in order to normalize for timing differences within the stance phase. We refer to our software framework as STAPP: spatiotemporal analysis of plantar pressure measurements. Using STAPP, we tested our hypothesis on plantar pressure videos from 33 healthy subjects walking at different speeds. As walking speed increased, STAPP was able to identify significant decreases in plantar pressure at mid-stance from the heel through the lateral forefoot. The extent of these plantar pressure decreases has not previously been observed using existing plantar pressure analysis techniques. We therefore conclude that the subsampling of plantar pressure videos - a task which led to the discarding of gait information in our study - can be avoided using STAPP. Copyright © 2018 Elsevier B.V. All rights reserved.
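Once the videos are in spatial and temporal correspondence, the per-sample statistical testing at the core of such an SPM framework reduces to a paired t statistic at every pixel (and frame); corrections for multiple comparisons, as SPM applies, are a separate step:

```python
import numpy as np

def paired_t_map(cond_a, cond_b):
    """Pixel-wise paired t statistic between two conditions.
    cond_a, cond_b: arrays of shape (nsubj, ny, nx) holding one
    registered plantar pressure frame per subject per condition.
    Returns the uncorrected t map."""
    d = cond_a - cond_b
    n = d.shape[0]
    mean = d.mean(axis=0)
    sd = d.std(axis=0, ddof=1)
    # Small epsilon avoids division by zero where all differences vanish.
    return mean / (sd / np.sqrt(n) + 1e-12)
```

Extending the same computation along the time axis of the stance phase gives the spatiotemporal statistic maps that STAPP thresholds to localize effects such as the mid-stance pressure decreases reported above.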
Machine learning-based dual-energy CT parametric mapping
NASA Astrophysics Data System (ADS)
Su, Kuan-Hao; Kuo, Jung-Wen; Jordan, David W.; Van Hedent, Steven; Klahr, Paul; Wei, Zhouping; Helo, Rose Al; Liang, Fan; Qian, Pengjiang; Pereira, Gisele C.; Rassouli, Negin; Gilkeson, Robert C.; Traughber, Bryan J.; Cheng, Chee-Wai; Muzic, Raymond F., Jr.
2018-06-01
The aim is to develop and evaluate machine learning methods for generating quantitative parametric maps of effective atomic number (Zeff), relative electron density (ρe), mean excitation energy (Ix), and relative stopping power (RSP) from clinical dual-energy CT data. The maps could be used for material identification and radiation dose calculation. Machine learning methods of historical centroid (HC), random forest (RF), and artificial neural networks (ANN) were used to learn the relationship between dual-energy CT input data and ideal output parametric maps calculated for phantoms from the known compositions of 13 tissue substitutes. After training and model selection steps, the machine learning predictors were used to generate parametric maps from independent phantom and patient input data. Precision and accuracy were evaluated using the ideal maps. This process was repeated for a range of exposure doses, and performance was compared to that of the clinically-used dual-energy, physics-based method which served as the reference. The machine learning methods generated more accurate and precise parametric maps than those obtained using the reference method. Their performance advantage was particularly evident when using data from the lowest exposure, one-fifth of a typical clinical abdomen CT acquisition. The RF method achieved the greatest accuracy. In comparison, the ANN method was only 1% less accurate but had much better computational efficiency than RF, being able to produce parametric maps in 15 s. Machine learning methods outperformed the reference method in terms of accuracy and noise tolerance when generating parametric maps, encouraging further exploration of the techniques. Among the methods we evaluated, ANN is the most suitable for clinical use due to its combination of accuracy, excellent low-noise performance, and computational efficiency.
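Of the three predictors, the historical-centroid idea is the simplest to sketch: each calibration material contributes a centroid in (HU_low, HU_high) space together with its known reference values, and a new voxel inherits the values of its nearest centroid. The class below is an illustrative nearest-centroid reading of that idea, not the authors' implementation:

```python
import numpy as np

class CentroidMapper:
    """Nearest-centroid predictor for dual-energy CT parametric mapping.

    fit() stores one centroid per calibration material (e.g. the 13
    tissue substitutes) with its reference values (e.g. Zeff, rho_e,
    RSP); predict() assigns each voxel the values of the closest
    centroid in (HU_low, HU_high) space."""

    def fit(self, hu_pairs, ref_values):
        self.centroids = np.asarray(hu_pairs, dtype=float)   # (m, 2)
        self.values = np.asarray(ref_values, dtype=float)    # (m, k)
        return self

    def predict(self, voxels):
        v = np.asarray(voxels, dtype=float)                  # (n, 2)
        d = ((v[:, None, :] - self.centroids[None, :, :]) ** 2).sum(axis=-1)
        return self.values[d.argmin(axis=1)]
```

The RF and ANN methods replace this piecewise-constant assignment with smooth learned regressions, which is one plausible reason for their better accuracy and noise tolerance.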
Technological Innovations in Magnetic Resonance for Early Detection of Cardiovascular Diseases.
Santarelli, Maria F; Positano, Vincenzo; Martini, Nicola; Valvano, Giuseppe; Landini, Luigi
2016-01-01
This review presents the most recent technical innovations in cardiovascular MR imaging (CMRI), including hardware and software developments and novelties in parametric mapping. These recent improvements provide high spatial and temporal resolution and quantitative information on heart structure and function. They make achievable ambitious goals in the field of magnetic resonance, such as the early detection of cardiovascular pathologies. In this review article, we present recent innovations in CMRI, emphasizing the progress made and the solutions proposed to some still-open technical problems.
Seo, Seongho; Kim, Su Jin; Lee, Dong Soo; Lee, Jae Sung
2014-10-01
Tracer kinetic modeling in dynamic positron emission tomography (PET) has been widely used to investigate the characteristic distribution patterns or dysfunctions of neuroreceptors in brain diseases. Its practical goal has progressed from regional data quantification to parametric mapping, which produces images of kinetic-model parameters by fully exploiting the spatiotemporal information in dynamic PET data. Graphical analysis (GA) is a major parametric mapping technique that is independent of any compartmental model configuration, robust to noise, and computationally efficient. In this paper, we provide an overview of recent advances in the parametric mapping of neuroreceptor binding based on GA methods. The associated basic concepts in tracer kinetic modeling are presented, including commonly used compartment models and major parameters of interest. Technical details of GA approaches for reversible and irreversible radioligands are described, considering both plasma input and reference tissue input models. Their statistical properties are discussed in view of parametric imaging.
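The graphical-analysis idea can be illustrated with a Logan-style sketch on synthetic data. Everything here is an assumption for illustration: the plasma input function and the rate constants K1 and k2 are invented, chosen only so the late-time slope can be checked against the known total volume of distribution VT = K1/k2 of a one-tissue compartment model.

```python
import numpy as np

# Simulate a one-tissue compartment tissue curve from an assumed input.
dt = 0.1
t = np.arange(0, 90, dt)                # minutes
Cp = np.exp(-0.05 * t)                  # arbitrary plasma input function
K1, k2 = 0.3, 0.15                      # assumed rate constants -> VT = 2.0

# Tissue curve: Ct = K1 * (Cp convolved with exp(-k2 t))
Ct = K1 * np.convolve(Cp, np.exp(-k2 * t))[:len(t)] * dt

# Logan transform: int(Ct)/Ct versus int(Cp)/Ct becomes linear;
# the slope estimates VT and the intercept approaches -1/k2.
int_Ct = np.cumsum(Ct) * dt
int_Cp = np.cumsum(Cp) * dt
late = t > 40                           # fit only the late, linear portion
x = int_Cp[late] / Ct[late]
y = int_Ct[late] / Ct[late]
slope, intercept = np.polyfit(x, y, 1)  # slope ~ VT = 2.0
```

For a reversible one-tissue model the Logan relation is exact (integrating dCt/dt = K1·Cp − k2·Ct gives ∫Ct = VT·∫Cp − Ct/k2), which is why the fitted slope recovers VT up to discretization error.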
NASA Technical Reports Server (NTRS)
1973-01-01
Parametric studies and subsystem comparisons for the orbital radar mapping mission to planet Venus are presented. Launch vehicle requirements and primary orbiter propulsion system requirements are evaluated. The systems parametric analysis indicated that orbit size and orientation interrelated with almost all of the principal spacecraft systems and influenced significantly the definition of orbit insertion propulsion requirements, weight in orbit capability, radar system design, and mapping strategy.
Occupancy Grid Map Merging Using Feature Maps
2010-11-01
each robot begins exploring at different starting points, once two robots can communicate, they send their odometry data, LIDAR observations, and maps...robots [11]. Moreover, it is relevant to mention that significant success has been achieved in solving SLAM problems when using hybrid maps [12...represents the environment by parametric features. Our method is capable of representing a LIDAR scanned environment map in a parametric fashion. In general
Fan, Zhen; Dani, Melanie; Femminella, Grazia D; Wood, Melanie; Calsolaro, Valeria; Veronese, Mattia; Turkheimer, Federico; Gentleman, Steve; Brooks, David J; Hinz, Rainer; Edison, Paul
2018-07-01
Neuroinflammation and microglial activation play an important role in amnestic mild cognitive impairment (MCI) and Alzheimer's disease. In this study, we investigated the spatial distribution of neuroinflammation in MCI subjects, using spectral analysis (SA) to generate parametric maps and quantify 11C-PBR28 PET, and compared these with compartmental and other kinetic models of quantification. Thirteen MCI subjects and nine healthy controls (HC) were enrolled in this study. Subjects underwent 11C-PBR28 PET scans with arterial cannulation. Spectral analysis with an arterial plasma input function was used to generate 11C-PBR28 parametric maps. These maps were then compared with regional 11C-PBR28 VT (volume of distribution) using a two-tissue compartment model and Logan graphical analysis. Amyloid load was also assessed with 18F-Flutemetamol PET. With SA, three component peaks were identified in addition to blood volume. The 11C-PBR28 impulse response function (IRF) at 90 min produced the lowest coefficient of variation. Single-subject analysis using this IRF demonstrated microglial activation in five out of seven amyloid-positive MCI subjects. IRF parametric maps of 11C-PBR28 uptake revealed a group-wise significant increase in neuroinflammation in amyloid-positive MCI subjects versus HC in multiple cortical association areas, and particularly in the temporal lobe. Interestingly, compartmental analysis detected a group-wise increase in 11C-PBR28 binding in the thalamus of amyloid-positive MCI subjects, while Logan parametric maps did not perform well. This study demonstrates for the first time that spectral analysis can be used to generate parametric maps of 11C-PBR28 uptake and is able to detect microglial activation in amyloid-positive MCI subjects. IRF parametric maps of 11C-PBR28 uptake allow voxel-wise single-subject analysis and could be used to evaluate microglial activation in individual subjects.
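The spectral-analysis step can be sketched with a Cunningham–Jones-style nonnegative decomposition on synthetic data. All rate constants and the input function below are assumptions for illustration, not values from the study: the tissue curve is expressed as a nonnegative sum of the plasma input convolved with a grid of decaying exponentials, solved by nonnegative least squares.

```python
import numpy as np
from scipy.optimize import nnls

dt = 0.1
t = np.arange(0, 90, dt)
Cp = t * np.exp(-0.3 * t)                       # arbitrary input function

def conv_basis(beta):
    # plasma input convolved with a single decaying exponential
    return np.convolve(Cp, np.exp(-beta * t))[:len(t)] * dt

true_alpha, true_beta = 0.5, 0.2                # one "true" component
Ct = true_alpha * conv_basis(true_beta)         # noiseless tissue curve

betas = np.arange(0.02, 2.0, 0.02)              # grid of candidate rates
A = np.column_stack([conv_basis(b) for b in betas])
alphas, _ = nnls(A, Ct)                         # nonnegative spectrum

# Impulse response function recovered from the spectrum;
# irf[0] equals the sum of the spectral coefficients.
irf = np.exp(-np.outer(t, betas)) @ alphas
```

With noise-free data and the true rate on the grid, the recovered IRF closely matches the simulated component; with real noisy data the spectrum spreads across neighbouring grid points.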
NASA Astrophysics Data System (ADS)
Mazzetti, S.; Giannini, V.; Russo, F.; Regge, D.
2018-05-01
Computer-aided diagnosis (CAD) systems are increasingly being used in clinical settings to report multi-parametric magnetic resonance imaging (mp-MRI) of the prostate. Usually, CAD systems automatically highlight cancer-suspicious regions to the radiologist, reducing reader variability and interpretation errors. Nevertheless, implementing this software requires the selection of which mp-MRI parameters can best discriminate between malignant and non-malignant regions. To exploit functional information, some parameters are derived from dynamic contrast-enhanced (DCE) acquisitions. In particular, much CAD software employs pharmacokinetic features, such as Ktrans and kep, derived from the Tofts model, to estimate a likelihood map of malignancy. However, non-pharmacokinetic models can also be used to describe DCE-MRI curves, without any requirement for prior knowledge or measurement of the arterial input function, which could potentially lead to large errors in parameter estimation. In this work, we implemented an empirical function derived from the phenomenological universalities (PUN) class to fit DCE-MRI. The parameters of the PUN model are used in combination with T2-weighted and diffusion-weighted acquisitions to feed a support vector machine classifier to produce a voxel-wise malignancy likelihood map of the prostate. The results were compared to those for a CAD system based on Tofts pharmacokinetic features to describe DCE-MRI curves, using different quality aspects of image segmentation, while also evaluating the number and size of false positive (FP) candidate regions. This study included 61 patients with 70 biopsy-proven prostate cancers (PCa). The metrics used to evaluate segmentation quality between the two CAD systems were not statistically different, although the PUN-based CAD reported a lower number of FPs, with reduced size compared to the Tofts-based CAD.
In conclusion, the CAD software based on PUN parameters is a feasible means with which to detect PCa, without affecting segmentation quality, and hence it could be successfully applied in clinical settings, improving the automated diagnosis process and reducing computational complexity.
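The voxel-wise classification step can be sketched as follows. This is a hedged illustration, not the authors' pipeline: each voxel is described by synthetic stand-ins for the mp-MRI features (two DCE shape parameters, a T2-weighted intensity, an ADC value), and the feature distributions are invented.

```python
import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(1)

# Invented feature distributions for benign and malignant voxels.
n = 500
benign = rng.normal([0.2, 0.5, 1.0, 1.4], 0.15, size=(n, 4))
malign = rng.normal([0.6, 0.9, 0.7, 0.9], 0.15, size=(n, 4))
X = np.vstack([benign, malign])
y = np.array([0] * n + [1] * n)

# SVM classifier producing per-voxel malignancy probabilities.
clf = SVC(kernel="rbf", probability=True, random_state=0).fit(X, y)

# Malignancy likelihood map for a toy 8x8 "slice" of voxels.
voxels = rng.normal(0.5, 0.3, size=(64, 4))
likelihood_map = clf.predict_proba(voxels)[:, 1].reshape(8, 8)
```

Thresholding the likelihood map and grouping connected voxels would then yield candidate regions, the objects counted as true or false positives above.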
Design of a Horizontal Penetrometer for Measuring On-the-Go Soil Resistance
Topakci, Mehmet; Unal, Ilker; Canakci, Murad; Celik, Huseyin Kursat; Karayel, Davut
2010-01-01
Soil compaction is one of the main negative factors limiting plant growth and crop yield. Therefore, it is important to determine the soil resistance level and map it across the field to find solutions for the negative effects of compaction. Nowadays, high-powered communication technology and computers help with this issue within the approach of precision agriculture applications. This study focused on the design of a penetrometer that can make instantaneous soil resistance measurements horizontally in the soil, together with data acquisition software based on GPS (Global Positioning System). The penetrometer was designed using commercial 3D parametric solid modelling design software. The data acquisition software was developed in the Microsoft Visual Basic.NET programming language. After the design of the system, manufacturing and assembly were completed and a field experiment was carried out. Using the GPS data and penetration resistance values collected in a Microsoft SQL Server database, the Kriging method in ArcGIS was applied and soil resistance was mapped across the field for a soil depth of 40 cm. During operation, no faults were seen in either the mechanical or the software parts. Minimum and maximum soil resistance values of 0.2 MPa and 3 MPa, respectively, were obtained. In conclusion, the experimental results showed that the designed system works quite well in the field and that the horizontal penetrometer is a practical tool for providing on-line soil resistance measurements. This study contributes to further research for the development of on-line soil resistance measurements and mapping within precision agriculture applications. PMID:22163410
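The gridding step can be sketched with inverse-distance weighting (IDW). Note the substitution: the study used Kriging in ArcGIS, and IDW is shown here only as a lightweight stand-in for interpolating scattered readings onto a map grid; the sample points mimic GPS-tagged penetration-resistance readings and are randomly generated.

```python
import numpy as np

rng = np.random.default_rng(2)

# Invented GPS-tagged samples: field coordinates in metres, resistance in MPa.
pts = rng.uniform(0, 100, size=(50, 2))
vals = rng.uniform(0.2, 3.0, size=50)

def idw(grid_xy, pts, vals, power=2.0):
    # Weighted average of samples, weights = 1 / distance^power.
    d = np.linalg.norm(grid_xy[:, None, :] - pts[None, :, :], axis=2)
    d = np.maximum(d, 1e-9)                      # avoid division by zero
    w = 1.0 / d**power
    return (w @ vals) / w.sum(axis=1)

gx, gy = np.meshgrid(np.linspace(0, 100, 20), np.linspace(0, 100, 20))
grid = np.column_stack([gx.ravel(), gy.ravel()])
res_map = idw(grid, pts, vals).reshape(20, 20)   # soil resistance map
```

Unlike Kriging, IDW provides no variance estimate, but the resulting grid can be visualized or exported the same way.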
Changing space and sound: Parametric design and variable acoustics
NASA Astrophysics Data System (ADS)
Norton, Christopher William
This thesis examines the potential for parametric design software to create performance-based design using acoustic metrics as the design criteria. A former soundstage at the University of Southern California used by the Thornton School of Music serves as a case study for a multiuse space for orchestral, percussion, master class and recital use. The criteria for each programmatic use include reverberation time, bass ratio, and the early energy ratios of the clarity index and objective support. Using a panelized ceiling as a design element to vary the parameters of volume, panel orientation and type of absorptive material, the relationships between these parameters and the design criteria are explored. These relationships and the subsequently derived equations are applied in Grasshopper, parametric modeling software for Rhino 3D (a NURBS modeler). Using the target reverberation time and bass ratio for each programmatic use as input for the parametric model, Grasshopper's evolutionary optimization solver, Galapagos, is run to identify the optimum ceiling geometry and material distribution.
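The kind of relationship such a parametric model drives can be illustrated with Sabine's classic reverberation-time estimate, RT60 = 0.161·V/A, where V is the room volume and A the total absorption. The surface areas and absorption coefficients below are invented example values, not the thesis data.

```python
# Sabine reverberation-time estimate (SI units: volume in m^3, areas in m^2).
def rt60_sabine(volume_m3, surfaces):
    """surfaces: list of (area_m2, absorption_coefficient) pairs."""
    absorption = sum(area * alpha for area, alpha in surfaces)
    return 0.161 * volume_m3 / absorption

# Example hall: floor, absorptive ceiling panels, walls (invented values).
surfaces = [(400, 0.05), (350, 0.30), (250, 0.10)]

print(round(rt60_sabine(3000, surfaces), 2))     # 3.22 s
print(round(rt60_sabine(2400, surfaces), 2))     # 2.58 s: lower ceiling, shorter RT
```

This is exactly the volume/absorption trade-off the panelized ceiling exploits: lowering the ceiling reduces V (shorter reverberation), while swapping panel materials changes A.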
Parametric Modelling of As-Built Beam Framed Structure in Bim Environment
NASA Astrophysics Data System (ADS)
Yang, X.; Koehl, M.; Grussenmeyer, P.
2017-02-01
A complete documentation and conservation of a historic timber roof requires the integration of geometry modelling, attribute and dynamic information management, and results of structural analysis. The recently developed as-built Building Information Modelling (BIM) technique has the potential to provide a uniform platform that integrates traditional geometry modelling, parametric element management and structural analysis. The main objective of the project presented in this paper is to develop a parametric modelling tool for a timber roof structure whose elements are leaning and crossing beam frames. Since Autodesk Revit, as typical BIM software, provides a platform for parametric modelling and information management, an API plugin was developed that automatically creates the parametric beam elements and links them together with strict relationships. The plugin under development, introduced in this paper, can build the parametric beam model via the Autodesk Revit API from total station points and terrestrial laser scanning data. The results show the potential of automating parametric modelling through interactive API development in a BIM environment. The approach also integrates the separate data processing steps and different platforms into the single Revit environment.
A general framework for parametric survival analysis.
Crowther, Michael J; Lambert, Paul C
2014-12-30
Parametric survival models are being increasingly used as an alternative to the Cox model in biomedical research. Through direct modelling of the baseline hazard function, we can gain greater understanding of the risk profile of patients over time, obtaining absolute measures of risk. Commonly used parametric survival models, such as the Weibull, make restrictive assumptions of the baseline hazard function, such as monotonicity, which is often violated in clinical datasets. In this article, we extend the general framework of parametric survival models proposed by Crowther and Lambert (Journal of Statistical Software 53:12, 2013), to incorporate relative survival, and robust and cluster robust standard errors. We describe the general framework through three applications to clinical datasets, in particular, illustrating the use of restricted cubic splines, modelled on the log hazard scale, to provide a highly flexible survival modelling framework. Through the use of restricted cubic splines, we can derive the cumulative hazard function analytically beyond the boundary knots, resulting in a combined analytic/numerical approach, which substantially improves the estimation process compared with only using numerical integration. User-friendly Stata software is provided, which significantly extends parametric survival models available in standard software. Copyright © 2014 John Wiley & Sons, Ltd.
NASA Technical Reports Server (NTRS)
Parsons-Wingerter, Patricia; Kao, David; Valizadegan, Hamed; Martin, Rodney; Murray, Matthew C.; Ramesh, Sneha; Sekaran, Srinivaas
2017-01-01
Currently, astronauts face significant health risks in future long-duration exploration missions such as colonizing the Moon and traveling to Mars. Numerous risks include greatly increased radiation exposures beyond the low earth orbit (LEO) of the ISS, and visual and ocular impairments in response to microgravity environments. The cardiovascular system is a key mediator in human physiological responses to radiation and microgravity. Moreover, blood vessels are necessarily involved in the progression and treatment of vascular-dependent terrestrial diseases such as cancer, coronary vessel disease, wound-healing, reproductive disorders, and diabetes. NASA developed an innovative, globally requested beta-level software, VESsel GENeration Analysis (VESGEN) to map and quantify vascular remodeling for application to astronaut and terrestrial health challenges. VESGEN mappings of branching vascular trees and networks are based on a weighted multi-parametric analysis derived from vascular physiological branching rules. Complex vascular branching patterns are determined by biological signaling mechanisms together with the fluid mechanics of multi-phase laminar blood flow.
Amen, Daniel G; Hanks, Chris; Prunella, Jill R; Green, Aisa
2007-01-01
The authors explored differences in regional cerebral blood flow (rCBF) in 11 impulsive murderers and 11 healthy comparison subjects using single photon emission computed tomography. The authors assessed subjects at rest and during a computerized go/no-go concentration task. Using statistical parametric mapping software, the authors performed voxel-by-voxel t tests to assess significant differences, making family-wise error corrections for multiple comparisons. Murderers were found to have significantly lower relative rCBF during concentration, particularly in areas associated with concentration and impulse control. These results indicate that nonemotionally laden stimuli may result in frontotemporal dysregulation in people predisposed to impulsive violence.
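The voxel-by-voxel comparison can be sketched in a few lines on toy data. This is a hedged illustration in the spirit of SPM, not the study's analysis: group sizes match the abstract (11 per group), but the "rCBF maps", effect size, and the simple Bonferroni-style correction are invented.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Synthetic "rCBF maps": 11 subjects per group, 10x10 voxels each.
group_a = rng.normal(50, 5, size=(11, 10, 10))
group_b = rng.normal(50, 5, size=(11, 10, 10))
group_b[:, 2:5, 2:5] -= 10                      # lower flow in one patch

# Voxel-wise two-sample t-test across the subject axis.
t_map, p_map = stats.ttest_ind(group_a, group_b, axis=0)

# Bonferroni-style family-wise correction over the 100 voxels.
significant = p_map < (0.05 / p_map.size)
```

SPM additionally models the spatial smoothness of the maps (random field theory) rather than applying a plain Bonferroni threshold, but the per-voxel test statistic is the same idea.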
Bim and Gis: when Parametric Modeling Meets Geospatial Data
NASA Astrophysics Data System (ADS)
Barazzetti, L.; Banfi, F.
2017-12-01
Geospatial data have a crucial role in several projects related to infrastructure and land management. GIS software is able to perform advanced geospatial analyses but lacks several instruments and tools for parametric modelling typically available in BIM. At the same time, BIM software designed for buildings has limited tools to handle geospatial data. As things stand at the moment, BIM and GIS can appear to be complementary solutions, although research work is currently under way to ensure a better level of interoperability, especially at the scale of the building. On the other hand, the transition from the local (building) scale to the infrastructure scale (where geospatial data cannot be neglected) has already demonstrated that parametric modelling integrated with geoinformation is a powerful tool to simplify and speed up some phases of the design workflow. This paper reviews such mixed approaches with both simulated and real examples, demonstrating that integration is already a reality at specific scales, which are not dominated by "pure" GIS or BIM. The paper also demonstrates that some traditional operations carried out with GIS software are available in parametric modelling software for BIM as well, such as transformation between reference systems, DEM generation, feature extraction, and geospatial queries. A real case study is illustrated and discussed to show the advantage of a combined use of both technologies. BIM and GIS integration can generate greater usage of geospatial data in the AECOO (Architecture, Engineering, Construction, Owner and Operator) industry, as well as new solutions for parametric modelling with additional geoinformation.
Accelerating MR Parameter Mapping Using Sparsity-Promoting Regularization in Parametric Dimension
Velikina, Julia V.; Alexander, Andrew L.; Samsonov, Alexey
2013-01-01
MR parameter mapping requires sampling along an additional (parametric) dimension, which often limits its clinical appeal due to a several-fold increase in scan times compared to conventional anatomic imaging. Data undersampling combined with parallel imaging is an attractive way to reduce scan time in such applications. However, the inherent SNR penalties of parallel MRI due to noise amplification often limit its utility even at moderate acceleration factors, requiring regularization by prior knowledge. In this work, we propose a novel regularization strategy, which utilizes smoothness of signal evolution in the parametric dimension within a compressed sensing framework (p-CS) to provide accurate and precise estimation of parametric maps from undersampled data. The performance of the method was demonstrated with variable flip angle T1 mapping and compared favorably to two representative reconstruction approaches, image space-based total variation regularization and an analytical model-based reconstruction. The proposed p-CS regularization was found to provide efficient suppression of noise amplification and preservation of parameter mapping accuracy without explicit utilization of analytical signal models. The developed method may facilitate acceleration of quantitative MRI techniques that are not suitable for model-based reconstruction because of complex signal models or when signal deviations from the expected analytical model exist. PMID:23213053
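The variable flip angle (VFA) T1 estimation underlying this mapping problem can be sketched with the standard SPGR signal model and its DESPOT1-style linearization. The TR, T1, and flip angles below are arbitrary noise-free example values, not parameters from the paper.

```python
import numpy as np

# SPGR signal model: S = M0 sin(a) (1 - E1) / (1 - E1 cos(a)), E1 = exp(-TR/T1)
TR, T1, M0 = 5.0, 1000.0, 1.0          # ms, ms, arbitrary units (assumed)
E1 = np.exp(-TR / T1)
alphas = np.deg2rad([3.0, 15.0])       # two flip angles
S = M0 * np.sin(alphas) * (1 - E1) / (1 - E1 * np.cos(alphas))

# DESPOT1 linearization: S/sin(a) = E1 * (S/tan(a)) + M0 (1 - E1),
# so the slope of y vs x equals E1 and gives T1 directly.
y = S / np.sin(alphas)
x = S / np.tan(alphas)
slope = (y[1] - y[0]) / (x[1] - x[0])
T1_est = -TR / np.log(slope)
print(round(T1_est))                    # 1000 (recovers the true T1 in ms)
```

With undersampled, noisy data this per-voxel fit becomes unstable, which is the motivation for the regularized reconstruction the abstract describes.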
Sparse-grid, reduced-basis Bayesian inversion: Nonaffine-parametric nonlinear equations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen, Peng, E-mail: peng@ices.utexas.edu; Schwab, Christoph, E-mail: christoph.schwab@sam.math.ethz.ch
2016-07-01
We extend the reduced basis (RB) accelerated Bayesian inversion methods for affine-parametric, linear operator equations which are considered in [16,17] to non-affine, nonlinear parametric operator equations. We generalize the analysis of sparsity of parametric forward solution maps in [20] and of Bayesian inversion in [48,49] to the fully discrete setting, including Petrov–Galerkin high-fidelity (“HiFi”) discretization of the forward maps. We develop adaptive, stochastic collocation based reduction methods for the efficient computation of reduced bases on the parametric solution manifold. The nonaffinity and nonlinearity with respect to (w.r.t.) the distributed, uncertain parameters and the unknown solution is collocated; specifically, by the so-called Empirical Interpolation Method (EIM). For the corresponding Bayesian inversion problems, computational efficiency is enhanced in two ways: first, expectations w.r.t. the posterior are computed by adaptive quadratures with dimension-independent convergence rates proposed in [49]; the present work generalizes [49] to account for the impact of the PG discretization in the forward maps on the convergence rates of the Quantities of Interest (QoI for short). Second, we propose to perform the Bayesian estimation only w.r.t. a parsimonious, RB approximation of the posterior density. Based on the approximation results in [49], the infinite-dimensional parametric, deterministic forward map and operator admit N-term RB and EIM approximations which converge at rates which depend only on the sparsity of the parametric forward map. In several numerical experiments, the proposed algorithms exhibit dimension-independent convergence rates which equal, at least, the currently known rate estimates for N-term approximation. We propose to accelerate Bayesian estimation by first offline construction of reduced basis surrogates of the Bayesian posterior density. The parsimonious surrogates can then be employed for online data assimilation and for Bayesian estimation. They also open a perspective for optimal experimental design.
NASA Astrophysics Data System (ADS)
Lee, Jae-Seung; Im, In-Chul; Kang, Su-Man; Goo, Eun-Hoe; Kwak, Byung-Joon
2013-07-01
This study aimed to quantitatively analyze data from diffusion tensor imaging (DTI) using statistical parametric mapping (SPM) in patients with brain disorders and to assess its potential utility for analyzing brain function. DTI was performed by 3.0-T magnetic resonance imaging in patients with Alzheimer's disease (AD) and vascular dementia (VD), and the data were analyzed using Matlab-based SPM software. The two-sample t-test was used for error analysis of the location of the activated pixels. We compared regions of white matter where the fractional anisotropy (FA) values were low and the apparent diffusion coefficients (ADCs) were increased. In the AD group, the FA values were low in the right superior temporal gyrus, right inferior temporal gyrus, right sub-lobar insula, and right occipital lingual gyrus, whereas the ADCs were significantly increased in the right inferior frontal gyrus and right middle frontal gyrus. In the VD group, the FA values were low in the right superior temporal gyrus, right inferior temporal gyrus, right limbic cingulate gyrus, and right sub-lobar caudate tail, whereas the ADCs were significantly increased in the left lateral globus pallidus and left medial globus pallidus. In conclusion, by using DTI and SPM analysis, we were able not only to determine the structural state of the regions affected by brain disorders but also to quantitatively analyze and assess brain function.
NASA Astrophysics Data System (ADS)
Kazmi, K. R.; Khan, F. A.
2008-01-01
In this paper, using the proximal-point mapping technique of P-η-accretive mappings and the property of the fixed-point set of set-valued contractive mappings, we study the behavior and sensitivity analysis of the solution set of a parametric generalized implicit quasi-variational-like inclusion involving a P-η-accretive mapping in a real uniformly smooth Banach space. Further, under suitable conditions, we discuss the Lipschitz continuity of the solution set with respect to the parameter. The technique and results presented in this paper can be viewed as an extension of the techniques and corresponding results given in [R.P. Agarwal, Y.-J. Cho, N.-J. Huang, Sensitivity analysis for strongly nonlinear quasi-variational inclusions, Appl. Math. Lett. 13 (2002) 19-24; S. Dafermos, Sensitivity analysis in variational inequalities, Math. Oper. Res. 13 (1988) 421-434; X.-P. Ding, Sensitivity analysis for generalized nonlinear implicit quasi-variational inclusions, Appl. Math. Lett. 17 (2) (2004) 225-235; X.-P. Ding, Parametric completely generalized mixed implicit quasi-variational inclusions involving h-maximal monotone mappings, J. Comput. Appl. Math. 182 (2) (2005) 252-269; X.-P. Ding, C.L. Luo, On parametric generalized quasi-variational inequalities, J. Optim. Theory Appl. 100 (1999) 195-205; Z. Liu, L. Debnath, S.M. Kang, J.S. Ume, Sensitivity analysis for parametric completely generalized nonlinear implicit quasi-variational inclusions, J. Math. Anal. Appl. 277 (1) (2003) 142-154; R.N. Mukherjee, H.L. Verma, Sensitivity analysis of generalized variational inequalities, J. Math. Anal. Appl. 167 (1992) 299-304; M.A. Noor, Sensitivity analysis framework for general quasi-variational inclusions, Comput. Math. Appl. 44 (2002) 1175-1181; M.A. Noor, Sensitivity analysis for quasivariational inclusions, J. Math. Anal. Appl. 236 (1999) 290-299; J.Y. Park, J.U. Jeong, Parametric generalized mixed variational inequalities, Appl. Math. Lett. 17 (2004) 43-48].
Chaotic map clustering algorithm for EEG analysis
NASA Astrophysics Data System (ADS)
Bellotti, R.; De Carlo, F.; Stramaglia, S.
2004-03-01
The non-parametric chaotic map clustering algorithm has been applied to the analysis of electroencephalographic signals in order to recognize Huntington's disease, one of the most dangerous pathologies of the central nervous system. The performance of the method has been compared with those obtained through parametric algorithms, such as K-means and deterministic annealing, and a supervised multi-layer perceptron. While supervised neural networks need a training phase, performed by means of data tagged by the genetic test, and the parametric methods require a prior choice of the number of classes to find, chaotic map clustering gives natural evidence of the pathological class, without any training or supervision, thus providing a new efficient methodology for the recognition of patterns affected by Huntington's disease.
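The parametric baseline mentioned above can be sketched in a few lines: K-means needs the number of classes specified in advance, which is precisely the requirement the chaotic-map approach avoids. The two synthetic "EEG feature" clouds below are invented for illustration.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(4)

# Two invented clouds of 2-D "EEG features", one per class.
healthy = rng.normal([0.0, 0.0], 0.3, size=(100, 2))
disease = rng.normal([2.0, 2.0], 0.3, size=(100, 2))
X = np.vstack([healthy, disease])

# K-means must be told n_clusters=2 up front.
km = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
labels = km.labels_
```

On well-separated data like this, K-means recovers the two groups; the abstract's point is that on real EEG data the number of classes is not known a priori, which motivates the non-parametric alternative.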
Developing Software for Pharmacodynamics and Bioassay Studies
The objective of the project is to develop a software system to process general pharmacologic, toxicological, or other biomedical research data that...exhibit a non-monotonic dose-response relationship - for which the current parametric models fail. The software will analyze dose-response
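The non-monotonic dose-response situation described above can be illustrated with a simple biphasic curve fit. The model form (a stimulatory saturation term multiplied by a high-dose inhibition term) and all parameter values are invented for illustration; they are not the project's actual model.

```python
import numpy as np
from scipy.optimize import curve_fit

# Biphasic (bell-shaped) dose-response: rises at low dose, falls at high dose.
def biphasic(dose, e0, emax, ec50, inhib):
    rise = dose / (dose + ec50)          # stimulatory phase
    fall = 1.0 / (1.0 + dose / inhib)    # inhibitory phase at high dose
    return e0 + emax * rise * fall

doses = np.logspace(-2, 2, 30)
true = biphasic(doses, 1.0, 5.0, 0.5, 20.0)      # assumed "true" parameters
rng = np.random.default_rng(5)
obs = true + rng.normal(0, 0.05, size=doses.size)

popt, _ = curve_fit(biphasic, doses, obs, p0=[1, 4, 1, 10])
```

A monotone model (e.g. a plain Hill equation) cannot capture the downturn at high doses, which is the failure mode the project targets.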
Algorithmic procedures for Bayesian MEG/EEG source reconstruction in SPM
López, J.D.; Litvak, V.; Espinosa, J.J.; Friston, K.; Barnes, G.R.
2014-01-01
The MEG/EEG inverse problem is ill-posed, giving different source reconstructions depending on the initial assumption sets. Parametric Empirical Bayes allows one to implement most popular MEG/EEG inversion schemes (Minimum Norm, LORETA, etc.) within the same generic Bayesian framework. It also provides a cost-function in terms of the variational Free energy—an approximation to the marginal likelihood or evidence of the solution. In this manuscript, we revisit the algorithm for MEG/EEG source reconstruction with a view to providing a didactic and practical guide. The aim is to promote and help standardise the development and consolidation of other schemes within the same framework. We describe the implementation in the Statistical Parametric Mapping (SPM) software package, carefully explaining each of its stages with the help of a simple simulated data example. We focus on the Multiple Sparse Priors (MSP) model, which we compare with the well-known Minimum Norm and LORETA models, using the negative variational Free energy for model comparison. The manuscript is accompanied by Matlab scripts to allow the reader to test and explore the underlying algorithm. PMID:24041874
Goh, Vicky; Sanghera, Bal; Wellsted, David M; Sundin, Josefin; Halligan, Steve
2009-06-01
The aim was to evaluate the feasibility of fractal analysis for assessing the spatial pattern of colorectal tumour perfusion at dynamic contrast-enhanced CT (perfusion CT). Twenty patients with colorectal adenocarcinoma underwent a 65-s perfusion CT study from which a perfusion parametric map was generated using validated commercial software. The tumour was identified by an experienced radiologist, segmented via thresholding, and fractal analysis was applied using in-house software: fractal dimension, abundance and lacunarity were assessed for the entire outlined tumour and for selected representative areas within the tumour of low and high perfusion. Comparison was made with ten patients with normal colons, processed in a similar manner, using two-way mixed analysis of variance with statistical significance at the 5% level. Fractal values were higher in cancer than normal colon (p ≤ 0.001): mean (SD) 1.71 (0.07) versus 1.61 (0.07) for fractal dimension and 7.82 (0.62) and 6.89 (0.47) for fractal abundance. Fractal values were lower in 'high' than 'low' perfusion areas. Lacunarity curves were shifted to the right for cancer compared with normal colon. In conclusion, colorectal cancer mapped by perfusion CT demonstrates fractal properties. Fractal analysis is feasible, potentially providing a quantitative measure of the spatial pattern of tumour perfusion.
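A minimal box-counting estimate of fractal dimension illustrates the core computation; this is a sketch of the idea, not the in-house software. For a completely filled region the estimate approaches 2, and for a one-pixel-wide line it approaches 1.

```python
import numpy as np

def box_count_dimension(mask, sizes=(1, 2, 4, 8, 16)):
    """Box-counting fractal dimension of a 2-D binary mask."""
    counts = []
    for s in sizes:
        # Number of s-by-s boxes containing at least one foreground pixel.
        h, w = mask.shape
        view = mask[:h - h % s, :w - w % s].reshape(h // s, s, w // s, s)
        counts.append(view.any(axis=(1, 3)).sum())
    # Dimension = negative slope of log(count) versus log(box size).
    slope, _ = np.polyfit(np.log(sizes), np.log(counts), 1)
    return -slope

filled = np.ones((64, 64), dtype=bool)
print(round(box_count_dimension(filled), 2))     # 2.0
```

Applied to a thresholded perfusion map, the same routine yields the fractional dimensions (between 1 and 2) reported in the abstract.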
Automated, Parametric Geometry Modeling and Grid Generation for Turbomachinery Applications
NASA Technical Reports Server (NTRS)
Harrand, Vincent J.; Uchitel, Vadim G.; Whitmire, John B.
2000-01-01
The objective of this Phase I project is to develop a highly automated software system for rapid geometry modeling and grid generation for turbomachinery applications. The proposed system features a graphical user interface for interactive control, a direct interface to commercial CAD/PDM systems, support for IGES geometry output, and a scripting capability for obtaining a high level of automation and end-user customization of the tool. The developed system is fully parametric and highly automated, and, therefore, significantly reduces the turnaround time for 3D geometry modeling, grid generation and model setup. This facilitates design environments in which a large number of cases need to be generated, such as for parametric analysis and design optimization of turbomachinery equipment. In Phase I we have successfully demonstrated the feasibility of the approach. The system has been tested on a wide variety of turbomachinery geometries, including several impellers and a multi stage rotor-stator combination. In Phase II, we plan to integrate the developed system with turbomachinery design software and with commercial CAD/PDM software.
NASA Astrophysics Data System (ADS)
Oliveira, N. P.; Maciel, L.; Catarino, A. P.; Rocha, A. M.
2017-10-01
This work proposes the creation of models of surfaces using a parametric computer modelling software to obtain three-dimensional structures in weft knitted fabrics produced on single needle system machines. Digital prototyping, another feature of digital modelling software, was also explored in three-dimensional drawings generated using the Rhinoceros software. With this approach, different 3D structures were developed and produced. Physical characterization tests were then performed on the resulting 3D weft knitted structures to assess their ability to promote comfort. From the obtained results, it is apparent that the developed structures have potential for application in different market segments, such as clothing and interior textiles.
Guehl, Nicolas J; Normandin, Marc D; Wooten, Dustin W; Rozen, Guy; Ruskin, Jeremy N; Shoup, Timothy M; Woo, Jonghye; Ptaszek, Leon M; Fakhri, Georges El; Alpert, Nathaniel M
2017-09-01
We have recently reported a method for measuring rest-stress myocardial blood flow (MBF) using a single, relatively short, PET scan session. The method requires two IV tracer injections, one to initiate rest imaging and one at peak stress. We previously validated absolute flow quantitation in ml/min/cc for standard bull's eye, segmental analysis. In this work, we extend the method for fast computation of rest-stress MBF parametric images. We provide an analytic solution to the single-scan rest-stress flow model which is then solved using a two-dimensional table lookup method (LM). Simulations were performed to compare the accuracy and precision of the lookup method with the original nonlinear method (NLM). Then the method was applied to 16 single scan rest/stress measurements made in 12 pigs: seven studied after infarction of the left anterior descending artery (LAD) territory, and nine imaged in the native state. Parametric maps of rest and stress MBF as well as maps of left (f_LV) and right (f_RV) ventricular spill-over fractions were generated. Regions of interest (ROIs) for 17 myocardial segments were defined in bull's eye fashion on the parametric maps. The mean of each ROI was then compared to the rest (K1r) and stress (K1s) MBF estimates obtained from fitting the 17 regional TACs with the NLM. In simulation, the LM performed as well as the NLM in terms of precision and accuracy. The simulation did not show that bias was introduced by the use of a predefined two-dimensional lookup table. In experimental data, parametric maps demonstrated good statistical quality and the LM was computationally much more efficient than the original NLM. Very good agreement was obtained between the mean MBF calculated on the parametric maps for each of the 17 ROIs and the regional MBF values estimated by the NLM (K1(map, LM) = 1.019 × K1(ROI, NLM) + 0.019, R² = 0.986; mean difference = 0.034 ± 0.036 mL/min/cc).
We developed a table lookup method for fast computation of parametric imaging of rest and stress MBF. Our results show the feasibility of obtaining good quality MBF maps using modest computational resources, thus demonstrating that the method can be applied in a clinical environment to obtain full quantitative MBF information. © 2017 American Association of Physicists in Medicine.
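The table-lookup idea can be sketched generically (an illustrative stand-in, not the authors' analytic solution or their rest-stress model): pre-compute model time-activity curves over a two-dimensional parameter grid once, then replace per-voxel nonlinear fitting with a table search.

```python
import numpy as np

# Hypothetical one-tissue model: C(t) = K1 * exp(-k2 t) convolved with the
# arterial input Cin(t), evaluated on a fixed time grid. Cin is a toy curve.
t = np.linspace(0.0, 5.0, 60)                  # minutes
cin = t * np.exp(-1.5 * t)                     # toy input function

def model_tac(k1, k2):
    # Discrete convolution of the input with the impulse response K1*exp(-k2 t).
    irf = k1 * np.exp(-k2 * t)
    return np.convolve(cin, irf)[: t.size] * (t[1] - t[0])

# Pre-compute a 2-D lookup table of model TACs over a (K1, k2) grid.
k1_grid = np.linspace(0.2, 2.0, 46)
k2_grid = np.linspace(0.1, 1.0, 46)
table = np.array([[model_tac(a, b) for b in k2_grid] for a in k1_grid])

def lookup_fit(tac):
    """Replace per-voxel nonlinear fitting with a search over the table."""
    sse = ((table - tac) ** 2).sum(axis=-1)
    i, j = np.unravel_index(np.argmin(sse), sse.shape)
    return k1_grid[i], k2_grid[j]

k1_hat, k2_hat = lookup_fit(model_tac(1.0, 0.5))
print(k1_hat, k2_hat)  # recovers approximately (1.0, 0.5)
```

The table is built once per input function, so the per-voxel cost drops to an array comparison, which is the source of the speed-up over iterative fitting.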
Donald Gagliasso; Susan Hummel; Hailemariam Temesgen
2014-01-01
Various methods have been used to estimate the amount of above ground forest biomass across landscapes and to create biomass maps for specific stands or pixels across ownership or project areas. Without an accurate estimation method, land managers might end up with incorrect biomass estimate maps, which could lead them to make poorer decisions in their future...
Quantifying parametric uncertainty in the Rothermel model
S. Goodrick
2008-01-01
The purpose of the present work is to quantify parametric uncertainty in the Rothermel wildland fire spread model, which underlies many of the fire spread software systems used in the United States. This model consists of a non-linear system of equations that relates environmental variables (input parameter groups...
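Parametric uncertainty quantification of this kind is commonly done by Monte Carlo sampling of the inputs; a minimal sketch with a toy surrogate model (the functional form and coefficients below are illustrative, not Rothermel's equations) might look like:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy surrogate for a fire-spread model: rate of spread rises with wind
# speed and fuel load and falls with fuel moisture.
def spread_rate(wind, moisture, load):
    return load * (1.0 + 0.3 * wind ** 1.5) / (1.0 + 8.0 * moisture)

# Parametric uncertainty: sample inputs from assumed distributions and
# propagate them through the model.
n = 100_000
wind = rng.gamma(25.0, 0.2, n)            # m/s, mean 5, sd 1, always positive
moisture = rng.uniform(0.05, 0.15, n)     # fraction
load = rng.normal(1.2, 0.1, n)            # kg/m^2

ros = spread_rate(wind, moisture, load)
print(f"mean ROS = {ros.mean():.2f}, 90% interval = "
      f"[{np.quantile(ros, 0.05):.2f}, {np.quantile(ros, 0.95):.2f}]")
```

The spread of the output interval relative to its mean is one simple summary of how strongly input uncertainty propagates through the model.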
fMRI paradigm designing and post-processing tools
James, Jija S; Rajesh, PG; Chandran, Anuvitha VS; Kesavadas, Chandrasekharan
2014-01-01
In this article, we first review some aspects of functional magnetic resonance imaging (fMRI) paradigm designing for major cognitive functions by using stimulus delivery systems like Cogent, E-Prime, Presentation, etc., along with their technical aspects. We also review the stimulus presentation possibilities (block, event-related) for visual or auditory paradigms and their advantage in both clinical and research settings. The second part mainly focuses on various fMRI data post-processing tools such as Statistical Parametric Mapping (SPM) and Brain Voyager, and discusses the particulars of the various preprocessing steps involved (realignment, co-registration, normalization, smoothing) in these software packages, and also the statistical analysis principles of General Linear Modeling for final interpretation of a functional activation result. PMID:24851001
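The General Linear Modeling step that tools like SPM and Brain Voyager perform can be sketched for one synthetic voxel (a simplified illustration; the single-gamma HRF below is a toy stand-in for SPM's canonical HRF, and no multiple-comparison correction is shown):

```python
import numpy as np

tr, n_scans = 2.0, 120
frame_times = np.arange(n_scans) * tr

# Block design: alternating 20 s off / 20 s on.
boxcar = (frame_times % 40 >= 20).astype(float)

# Toy single-gamma haemodynamic response function (illustrative shape only).
t_hrf = np.arange(0, 24, tr)
hrf = (t_hrf ** 5) * np.exp(-t_hrf)
hrf /= hrf.sum()
regressor = np.convolve(boxcar, hrf)[:n_scans]

# Design matrix: task regressor plus intercept.
X = np.column_stack([regressor, np.ones(n_scans)])

rng = np.random.default_rng(1)
y = 2.0 * regressor + 10.0 + rng.normal(0, 0.5, n_scans)  # synthetic voxel

beta, res, *_ = np.linalg.lstsq(X, y, rcond=None)
dof = n_scans - X.shape[1]
sigma2 = res[0] / dof
c = np.array([1.0, 0.0])                  # contrast: task effect
t_stat = (c @ beta) / np.sqrt(sigma2 * (c @ np.linalg.inv(X.T @ X) @ c))
print(round(beta[0], 2), round(t_stat, 1))
```

In a real analysis this fit is repeated at every voxel, and the resulting t map is thresholded with an appropriate correction for multiple comparisons.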
Chavarrías, Cristina; García-Vázquez, Verónica; Alemán-Gómez, Yasser; Montesinos, Paula; Pascau, Javier; Desco, Manuel
2016-05-01
The purpose of this study was to develop a multi-platform automatic software tool for full processing of fMRI rodent studies. Existing tools require the usage of several different plug-ins, significant user interaction and/or programming skills. Based on a user-friendly interface, the tool provides statistical parametric brain maps (t and Z) and percentage of signal change for user-provided regions of interest. The tool is coded in MATLAB (MathWorks®) and implemented as a plug-in for SPM (Statistical Parametric Mapping, the Wellcome Trust Centre for Neuroimaging). The automatic pipeline loads default parameters that are appropriate for preclinical studies and processes multiple subjects in batch mode (from images in either Nifti or raw Bruker format). In advanced mode, all processing steps can be selected or deselected and executed independently. Processing parameters and workflow were optimized for rat studies and assessed using 460 male-rat fMRI series on which we tested five smoothing kernel sizes and three different hemodynamic models. A smoothing kernel of FWHM = 1.2 mm (four times the voxel size) yielded the highest t values at the primary somatosensory cortex, and a boxcar response function provided the lowest residual variance after fitting. fMRat offers the features of a thorough SPM-based analysis combined with the functionality of several SPM extensions in a single automatic pipeline with a user-friendly interface. The code and sample images can be downloaded from https://github.com/HGGM-LIM/fmrat.
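The FWHM-to-sigma conversion behind the smoothing step can be sketched as follows (assuming, for illustration, a 0.3 mm voxel so that FWHM = 1.2 mm is four voxels; SciPy's `gaussian_filter` stands in for SPM's smoothing routine):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def smooth_fwhm(volume, fwhm_mm, voxel_mm):
    """Gaussian-smooth a volume with a kernel given in FWHM (mm)."""
    # FWHM = sigma * 2*sqrt(2*ln 2), so sigma is roughly FWHM / 2.355.
    sigma_mm = fwhm_mm / (2.0 * np.sqrt(2.0 * np.log(2.0)))
    return gaussian_filter(volume, sigma=sigma_mm / voxel_mm)

# Impulse test volume: smoothing spreads the unit spike but preserves its sum.
vol = np.zeros((21, 21, 21))
vol[10, 10, 10] = 1.0
smoothed = smooth_fwhm(vol, fwhm_mm=1.2, voxel_mm=0.3)
print(round(float(smoothed[10, 10, 10]), 4))
```

Choosing the kernel as a multiple of the voxel size, as the study does, keeps the effective resolution trade-off comparable across acquisitions with different geometry.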
Breast-Lesion Characterization using Textural Features of Quantitative Ultrasound Parametric Maps.
Sadeghi-Naini, Ali; Suraweera, Harini; Tran, William Tyler; Hadizad, Farnoosh; Bruni, Giancarlo; Rastegar, Rashin Fallah; Curpen, Belinda; Czarnota, Gregory J
2017-10-20
This study evaluated, for the first time, the efficacy of quantitative ultrasound (QUS) spectral parametric maps in conjunction with texture-analysis techniques to non-invasively differentiate benign from malignant breast lesions. Ultrasound B-mode images and radiofrequency data were acquired from 78 patients with suspicious breast lesions. QUS spectral-analysis techniques were performed on radiofrequency data to generate parametric maps of mid-band fit, spectral slope, spectral intercept, spacing among scatterers, average scatterer diameter, and average acoustic concentration. Texture-analysis techniques were applied to determine imaging biomarkers consisting of mean, contrast, correlation, energy and homogeneity features of parametric maps. These biomarkers were utilized to classify benign versus malignant lesions with leave-one-patient-out cross-validation. Results were compared to histopathology findings from biopsy specimens and radiology reports on MR images to evaluate the accuracy of the technique. Among the biomarkers investigated, one mean-value parameter and 14 textural features demonstrated statistically significant differences (p < 0.05) between the two lesion types. A hybrid biomarker developed using a stepwise feature selection method could classify the lesions with a sensitivity of 96%, a specificity of 84%, and an AUC of 0.97. Findings from this study pave the way towards adapting novel QUS-based frameworks for breast cancer screening and rapid diagnosis in the clinic.
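The textural features named above (contrast, correlation, energy, homogeneity) are standard gray-level co-occurrence matrix (GLCM) statistics; a minimal NumPy sketch for a single pixel offset follows (the quantization and test images are hypothetical, not the authors' pipeline):

```python
import numpy as np

def glcm_features(img, levels=8):
    """GLCM features for one offset (right neighbor) of an image in [0, 1]."""
    q = np.minimum((img * levels).astype(int), levels - 1)  # quantize
    glcm = np.zeros((levels, levels))
    for a, b in zip(q[:, :-1].ravel(), q[:, 1:].ravel()):
        glcm[a, b] += 1
    p = glcm / glcm.sum()                                   # joint probabilities
    i, j = np.indices(p.shape)
    mu_i, mu_j = (i * p).sum(), (j * p).sum()
    sd_i = np.sqrt(((i - mu_i) ** 2 * p).sum())
    sd_j = np.sqrt(((j - mu_j) ** 2 * p).sum())
    return {
        "contrast": ((i - j) ** 2 * p).sum(),
        "energy": (p ** 2).sum(),
        "homogeneity": (p / (1.0 + np.abs(i - j))).sum(),
        "correlation": ((i - mu_i) * (j - mu_j) * p).sum() / (sd_i * sd_j),
    }

# A smooth gradient is more homogeneous than a noisy map.
rng = np.random.default_rng(0)
smooth = np.tile(np.linspace(0.0, 1.0, 32), (32, 1))
noisy = rng.random((32, 32))
f_s, f_n = glcm_features(smooth), glcm_features(noisy)
print(round(f_s["homogeneity"], 3), round(f_n["homogeneity"], 3))
```

Applied to QUS parametric maps, such features summarize how parameter values co-vary between neighboring pixels, which is the information the mean value alone discards.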
Linkage analysis of high myopia susceptibility locus in 26 families.
Paget, Sandrine; Julia, Sophie; Vitezica, Zulma G; Soler, Vincent; Malecaze, François; Calvas, Patrick
2008-01-01
We conducted a linkage analysis in high myopia families to replicate suggestive results from chromosome 7q36 using a model of autosomal dominant inheritance and genetic heterogeneity. We also performed a genome-wide scan to identify novel loci. Twenty-six families, with at least two high-myopic subjects (i.e. a refractive value of -5 diopters in the less affected eye) in each family, were included. Phenotypic examination included standard autorefractometry, ultrasonographic eye length measurement, and clinical confirmation of the non-syndromic character of the refractive disorder. Nine families were collected de novo, including 136 available members, of whom 34 were highly myopic. Twenty new subjects were added in 5 of the 17 remaining families. A total of 233 subjects were submitted to a genome scan using the ABI linkage mapping set LMSv2-MD-10; additional markers were used in all regions where preliminary LOD scores were greater than 1.5. Multipoint parametric and non-parametric analyses were conducted with the software packages Genehunter 2.0 and Merlin 1.0.1. Two autosomal recessive, two autosomal dominant, and four autosomal additive models were used in the parametric linkage analyses. No linkage was found using the subset of nine newly collected families. Study of the entire population of 26 families with a parametric model did not yield a significant LOD score (>3), even for the previously suggestive locus on 7q36. A non-parametric model demonstrated significant linkage to chromosome 7p15 in the entire population (Z-NPL=4.07, p=0.00002). The interval is 7.81 centimorgans (cM) between markers D7S2458 and D7S2515. The significant interval reported here needs confirmation in other cohorts. Among possible susceptibility genes in the interval, certain candidates are likely to be involved in eye growth and development.
NASA Astrophysics Data System (ADS)
Agapiou, Sergios; Burger, Martin; Dashti, Masoumeh; Helin, Tapio
2018-04-01
We consider the inverse problem of recovering an unknown functional parameter u in a separable Banach space, from a noisy observation vector y of its image through a known, possibly non-linear, map G. We adopt a Bayesian approach to the problem and consider Besov space priors (see Lassas et al (2009 Inverse Problems Imaging 3 87-122)), which are well-known for their edge-preserving and sparsity-promoting properties and have recently attracted wide attention especially in the medical imaging community. Our key result is to show that in this non-parametric setup the maximum a posteriori (MAP) estimates are characterized by the minimizers of a generalized Onsager-Machlup functional of the posterior. This is done independently for the so-called weak and strong MAP estimates, which as we show coincide in our context. In addition, we prove a form of weak consistency for the MAP estimators in the infinitely informative data limit. Our results are remarkable for two reasons: first, the prior distribution is non-Gaussian and does not meet the smoothness conditions required in previous research on non-parametric MAP estimates. Second, the result analytically justifies existing uses of the MAP estimate in finite but high dimensional discretizations of Bayesian inverse problems with the considered Besov priors.
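Schematically, such a characterization says the MAP estimate minimizes a Tikhonov-type functional combining the data misfit with a Besov-norm penalty (a sketch only; the noise covariance Γ and weight λ are illustrative notation, not the paper's exact statement):

```latex
u_{\mathrm{MAP}} \in \operatorname*{arg\,min}_{u} \;
  \tfrac{1}{2}\,\bigl\lVert \Gamma^{-1/2}\bigl(y - \mathcal{G}(u)\bigr)\bigr\rVert^{2}
  \;+\; \lambda \,\lVert u \rVert_{B^{s}_{11}}
```

The Besov penalty with integrability indices 1,1 is what yields the sparsity-promoting, edge-preserving behavior the abstract refers to.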
Algorithmic procedures for Bayesian MEG/EEG source reconstruction in SPM.
López, J D; Litvak, V; Espinosa, J J; Friston, K; Barnes, G R
2014-01-01
The MEG/EEG inverse problem is ill-posed, giving different source reconstructions depending on the initial assumption sets. Parametric Empirical Bayes allows one to implement most popular MEG/EEG inversion schemes (Minimum Norm, LORETA, etc.) within the same generic Bayesian framework. It also provides a cost-function in terms of the variational free energy, an approximation to the marginal likelihood or evidence of the solution. In this manuscript, we revisit the algorithm for MEG/EEG source reconstruction with a view to providing a didactic and practical guide. The aim is to promote and help standardise the development and consolidation of other schemes within the same framework. We describe the implementation in the Statistical Parametric Mapping (SPM) software package, carefully explaining each of its stages with the help of a simple simulated data example. We focus on the Multiple Sparse Priors (MSP) model, which we compare with the well-known Minimum Norm and LORETA models, using the negative variational free energy for model comparison. The manuscript is accompanied by Matlab scripts to allow the reader to test and explore the underlying algorithm. © 2013. Published by Elsevier Inc. All rights reserved.
Current State of the Art Historic Building Information Modelling
NASA Astrophysics Data System (ADS)
Dore, C.; Murphy, M.
2017-08-01
In an extensive review of existing literature a number of observations were made in relation to the current approaches for recording and modelling existing buildings and environments: Data collection and pre-processing techniques are becoming increasingly automated to allow for near real-time data capture and fast processing of this data for later modelling applications. Current BIM software is almost completely focused on new buildings and has very limited tools and pre-defined libraries for modelling existing and historic buildings. The development of reusable parametric library objects for existing and historic buildings supports modelling with high levels of detail while decreasing the modelling time. Mapping these parametric objects to survey data, however, is still a time-consuming task that requires further research. Promising developments have been made towards automatic object recognition and feature extraction from point clouds for as-built BIM. However, results are currently limited to simple and planar features. Further work is required for automatic accurate and reliable reconstruction of complex geometries from point cloud data. Procedural modelling can provide an automated solution for generating 3D geometries but lacks the detail and accuracy required for most as-built applications in AEC and heritage fields.
Lee, Peter; Yan, Ping; Ewart, Paul; Kohl, Peter
2012-01-01
Whole-heart multi-parametric optical mapping has provided valuable insight into the interplay of electrophysiological parameters, and this technology will continue to thrive as dyes are improved and technical solutions for imaging become simpler and cheaper. Here, we show the advantage of using improved 2nd-generation voltage dyes, provide a simple solution to panoramic multi-parametric mapping, and illustrate the application of flash photolysis of caged compounds for studies in the whole heart. For proof of principle, we used the isolated rat whole-heart model. After characterising the blue and green isosbestic points of di-4-ANBDQBS and di-4-ANBDQPQ, respectively, two voltage and calcium mapping systems are described. With two newly custom-made multi-band optical filters, (1) di-4-ANBDQBS and fluo-4 and (2) di-4-ANBDQPQ and rhod-2 mapping are demonstrated. Furthermore, we demonstrate three-parameter mapping using di-4-ANBDQPQ, rhod-2 and NADH. Using off-the-shelf optics and the di-4-ANBDQPQ and rhod-2 combination, we demonstrate panoramic multi-parametric mapping, affording a 360° spatiotemporal record of activity. Finally, local optical perturbation of calcium dynamics in the whole heart is demonstrated using the caged compound, o-nitrophenyl ethylene glycol tetraacetic acid (NP-EGTA), with an ultraviolet light-emitting diode (LED). Calcium maps (heart loaded with di-4-ANBDQPQ and rhod-2) demonstrate successful NP-EGTA loading and local flash photolysis. All imaging systems were built using only a single camera. In conclusion, using novel 2nd-generation voltage dyes, we developed scalable techniques for multi-parametric optical mapping of the whole heart from one point of view and panoramically. In addition to these parameter imaging approaches, we show that it is possible to use caged compounds and ultraviolet LEDs to locally perturb electrophysiological parameters in the whole heart. PMID:22886365
Influence of signal intensity non-uniformity on brain volumetry using an atlas-based method.
Goto, Masami; Abe, Osamu; Miyati, Tosiaki; Kabasawa, Hiroyuki; Takao, Hidemasa; Hayashi, Naoto; Kurosu, Tomomi; Iwatsubo, Takeshi; Yamashita, Fumio; Matsuda, Hiroshi; Mori, Harushi; Kunimatsu, Akira; Aoki, Shigeki; Ino, Kenji; Yano, Keiichi; Ohtomo, Kuni
2012-01-01
Many studies have reported pre-processing effects for brain volumetry; however, no study has investigated whether non-parametric non-uniform intensity normalization (N3) correction processing results in reduced system dependency when using an atlas-based method. To address this shortcoming, the present study assessed whether N3 correction processing provides reduced system dependency in atlas-based volumetry. Contiguous sagittal T1-weighted images of the brain were obtained from 21 healthy participants, by using five magnetic resonance protocols. After image preprocessing using the Statistical Parametric Mapping 5 software, we measured the structural volume of the segmented images with the WFU-PickAtlas software. We applied six different bias-correction levels (Regularization 10, Regularization 0.0001, Regularization 0, Regularization 10 with N3, Regularization 0.0001 with N3, and Regularization 0 with N3) to each set of images. The structural volume change ratio (%) was defined as the change ratio (%) = (100 × [measured volume - mean volume of five magnetic resonance protocols] / mean volume of five magnetic resonance protocols) for each bias-correction level. A low change ratio was synonymous with lower system dependency. The results showed that the images with the N3 correction had a lower change ratio compared with those without the N3 correction. The present study is the first atlas-based volumetry study to show that the precision of atlas-based volumetry improves when using N3-corrected images. Therefore, correction for signal intensity non-uniformity is strongly advised for multi-scanner or multi-site imaging trials.
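The change-ratio metric defined above is straightforward to compute; a minimal sketch (the volumes below are hypothetical, not the study's data):

```python
import numpy as np

def change_ratio(volumes_ml):
    """Percentage deviation of each protocol's measured volume from the
    mean volume across all protocols, as defined in the study."""
    v = np.asarray(volumes_ml, dtype=float)
    return 100.0 * (v - v.mean()) / v.mean()

# Hypothetical volumes (ml) of one structure from five MR protocols.
vols = [3.9, 4.1, 4.0, 4.2, 3.8]
print(np.round(change_ratio(vols), 2))
```

A protocol set with small change ratios indicates low system dependency, which is the property the N3 correction improved.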
Influence of Signal Intensity Non-Uniformity on Brain Volumetry Using an Atlas-Based Method
Abe, Osamu; Miyati, Tosiaki; Kabasawa, Hiroyuki; Takao, Hidemasa; Hayashi, Naoto; Kurosu, Tomomi; Iwatsubo, Takeshi; Yamashita, Fumio; Matsuda, Hiroshi; Mori, Harushi; Kunimatsu, Akira; Aoki, Shigeki; Ino, Kenji; Yano, Keiichi; Ohtomo, Kuni
2012-01-01
Objective Many studies have reported pre-processing effects for brain volumetry; however, no study has investigated whether non-parametric non-uniform intensity normalization (N3) correction processing results in reduced system dependency when using an atlas-based method. To address this shortcoming, the present study assessed whether N3 correction processing provides reduced system dependency in atlas-based volumetry. Materials and Methods Contiguous sagittal T1-weighted images of the brain were obtained from 21 healthy participants, by using five magnetic resonance protocols. After image preprocessing using the Statistical Parametric Mapping 5 software, we measured the structural volume of the segmented images with the WFU-PickAtlas software. We applied six different bias-correction levels (Regularization 10, Regularization 0.0001, Regularization 0, Regularization 10 with N3, Regularization 0.0001 with N3, and Regularization 0 with N3) to each set of images. The structural volume change ratio (%) was defined as the change ratio (%) = (100 × [measured volume - mean volume of five magnetic resonance protocols] / mean volume of five magnetic resonance protocols) for each bias-correction level. Results A low change ratio was synonymous with lower system dependency. The results showed that the images with the N3 correction had a lower change ratio compared with those without the N3 correction. Conclusion The present study is the first atlas-based volumetry study to show that the precision of atlas-based volumetry improves when using N3-corrected images. Therefore, correction for signal intensity non-uniformity is strongly advised for multi-scanner or multi-site imaging trials. PMID:22778560
Direct parametric reconstruction in dynamic PET myocardial perfusion imaging: in vivo studies.
Petibon, Yoann; Rakvongthai, Yothin; El Fakhri, Georges; Ouyang, Jinsong
2017-05-07
Dynamic PET myocardial perfusion imaging (MPI) used in conjunction with tracer kinetic modeling enables the quantification of absolute myocardial blood flow (MBF). However, MBF maps computed using the traditional indirect method (i.e. post-reconstruction voxel-wise fitting of a kinetic model to PET time-activity curves, TACs) suffer from poor signal-to-noise ratio (SNR). Direct reconstruction of kinetic parameters from raw PET projection data has been shown to offer parametric images with higher SNR compared to the indirect method. The aim of this study was to extend and evaluate the performance of a direct parametric reconstruction method using in vivo dynamic PET MPI data for the purpose of quantifying MBF. Dynamic PET MPI studies were performed on two healthy pigs using a Siemens Biograph mMR scanner. List-mode PET data for each animal were acquired following a bolus injection of ~7-8 mCi of 18F-flurpiridaz, a myocardial perfusion agent. Fully-3D dynamic PET sinograms were obtained by sorting the coincidence events into 16 temporal frames covering ~5 min after radiotracer administration. Additionally, eight independent noise realizations of both scans (each containing 1/8th of the total number of events) were generated from the original list-mode data. Dynamic sinograms were then used to compute parametric maps using the conventional indirect method and the proposed direct method. For both methods, a one-tissue compartment model accounting for spillover from the left and right ventricle blood-pools was used to describe the kinetics of 18F-flurpiridaz. An image-derived arterial input function obtained from a TAC taken in the left ventricle cavity was used for tracer kinetic analysis.
For the indirect method, frame-by-frame images were estimated using two fully-3D reconstruction techniques: the standard ordered subset expectation maximization (OSEM) reconstruction algorithm on one side, and the one-step late maximum a posteriori (OSL-MAP) algorithm on the other side, which incorporates a quadratic penalty function. The parametric images were then calculated using voxel-wise weighted least-square fitting of the reconstructed myocardial PET TACs. For the direct method, parametric images were estimated directly from the dynamic PET sinograms using a maximum a posteriori (MAP) parametric reconstruction algorithm which optimizes an objective function comprised of the Poisson log-likelihood term, the kinetic model and a quadratic penalty function. Maximization of the objective function with respect to each set of parameters was achieved using a preconditioned conjugate gradient algorithm with a specifically developed pre-conditioner. The performance of the direct method was evaluated by comparing voxel- and segment-wise estimates of K1, the tracer transport rate (ml·min⁻¹·ml⁻¹), to those obtained using the indirect method applied to both OSEM and OSL-MAP dynamic reconstructions. The proposed direct reconstruction method produced K1 maps with visibly lower noise than the indirect method based on OSEM and OSL-MAP reconstructions. At normal count levels, the direct method was shown to outperform the indirect method based on OSL-MAP in the sense that at matched level of bias, reduced regional noise levels were obtained. At lower count levels, the direct method produced K1 estimates with significantly lower standard deviation across noise realizations than the indirect method based on OSL-MAP at matched bias level. In all cases, the direct method yielded lower noise and standard deviation than the indirect method based on OSEM.
Overall, the proposed direct reconstruction offered a better bias-variance tradeoff than the indirect method applied to either OSEM and OSL-MAP. Direct parametric reconstruction as applied to in vivo dynamic PET MPI data is therefore a promising method for producing MBF maps with lower variance.
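The indirect route's voxel-wise fitting step can be sketched with a toy one-tissue compartment fit (an illustration only: spillover terms, weights, and the authors' reconstruction pipeline are omitted, and the input function is synthetic):

```python
import numpy as np
from scipy.optimize import curve_fit

# One-tissue model: C(t) = K1 * exp(-k2 t) convolved with the arterial
# input Ca(t), sampled at 16 frame mid-times as in the study's framing.
t = np.linspace(0.05, 5.0, 16)                 # minutes
dt = t[1] - t[0]
ca = 30.0 * t * np.exp(-2.0 * t)               # toy arterial input

def one_tissue(_, k1, k2):
    irf = k1 * np.exp(-k2 * t)
    return np.convolve(ca, irf)[: t.size] * dt

# Synthetic noisy voxel TAC generated from known parameters.
rng = np.random.default_rng(0)
tac = one_tissue(t, 0.9, 0.4) + rng.normal(0, 0.05, t.size)

(k1_hat, k2_hat), _ = curve_fit(one_tissue, t, tac, p0=(0.5, 0.5))
print(round(k1_hat, 2), round(k2_hat, 2))
```

Repeating this fit at every voxel after reconstruction is exactly where the indirect method accumulates noise; the direct method instead folds the same kinetic model into the reconstruction objective.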
Direct parametric reconstruction in dynamic PET myocardial perfusion imaging: in-vivo studies
Petibon, Yoann; Rakvongthai, Yothin; Fakhri, Georges El; Ouyang, Jinsong
2017-01-01
Dynamic PET myocardial perfusion imaging (MPI) used in conjunction with tracer kinetic modeling enables the quantification of absolute myocardial blood flow (MBF). However, MBF maps computed using the traditional indirect method (i.e. post-reconstruction voxel-wise fitting of a kinetic model to PET time-activity curves, TACs) suffer from poor signal-to-noise ratio (SNR). Direct reconstruction of kinetic parameters from raw PET projection data has been shown to offer parametric images with higher SNR compared to the indirect method. The aim of this study was to extend and evaluate the performance of a direct parametric reconstruction method using in-vivo dynamic PET MPI data for the purpose of quantifying MBF. Dynamic PET MPI studies were performed on two healthy pigs using a Siemens Biograph mMR scanner. List-mode PET data for each animal were acquired following a bolus injection of ~7-8 mCi of 18F-flurpiridaz, a myocardial perfusion agent. Fully-3D dynamic PET sinograms were obtained by sorting the coincidence events into 16 temporal frames covering ~5 min after radiotracer administration. Additionally, eight independent noise realizations of both scans (each containing 1/8th of the total number of events) were generated from the original list-mode data. Dynamic sinograms were then used to compute parametric maps using the conventional indirect method and the proposed direct method. For both methods, a one-tissue compartment model accounting for spillover from the left and right ventricle blood-pools was used to describe the kinetics of 18F-flurpiridaz. An image-derived arterial input function obtained from a TAC taken in the left ventricle cavity was used for tracer kinetic analysis.
For the indirect method, frame-by-frame images were estimated using two fully-3D reconstruction techniques: the standard Ordered Subset Expectation Maximization (OSEM) reconstruction algorithm on one side, and the One-Step Late Maximum a Posteriori (OSL-MAP) algorithm on the other side, which incorporates a quadratic penalty function. The parametric images were then calculated using voxel-wise weighted least-square fitting of the reconstructed myocardial PET TACs. For the direct method, parametric images were estimated directly from the dynamic PET sinograms using a maximum a posteriori (MAP) parametric reconstruction algorithm which optimizes an objective function comprised of the Poisson log-likelihood term, the kinetic model and a quadratic penalty function. Maximization of the objective function with respect to each set of parameters was achieved using a preconditioned conjugate gradient algorithm with a specifically developed pre-conditioner. The performance of the direct method was evaluated by comparing voxel- and segment-wise estimates of K1, the tracer transport rate (mL·min⁻¹·mL⁻¹), to those obtained using the indirect method applied to both OSEM and OSL-MAP dynamic reconstructions. The proposed direct reconstruction method produced K1 maps with visibly lower noise than the indirect method based on OSEM and OSL-MAP reconstructions. At normal count levels, the direct method was shown to outperform the indirect method based on OSL-MAP in the sense that at matched level of bias, reduced regional noise levels were obtained. At lower count levels, the direct method produced K1 estimates with significantly lower standard deviation across noise realizations than the indirect method based on OSL-MAP at matched bias level. In all cases, the direct method yielded lower noise and standard deviation than the indirect method based on OSEM.
Overall, the proposed direct reconstruction offered a better bias-variance tradeoff than the indirect method applied to either OSEM and OSL-MAP. Direct parametric reconstruction as applied to in-vivo dynamic PET MPI data is therefore a promising method for producing MBF maps with lower variance. PMID:28379843
Direct parametric reconstruction in dynamic PET myocardial perfusion imaging: in vivo studies
NASA Astrophysics Data System (ADS)
Petibon, Yoann; Rakvongthai, Yothin; El Fakhri, Georges; Ouyang, Jinsong
2017-05-01
Dynamic PET myocardial perfusion imaging (MPI) used in conjunction with tracer kinetic modeling enables the quantification of absolute myocardial blood flow (MBF). However, MBF maps computed using the traditional indirect method (i.e. post-reconstruction voxel-wise fitting of a kinetic model to PET time-activity curves, TACs) suffer from poor signal-to-noise ratio (SNR). Direct reconstruction of kinetic parameters from raw PET projection data has been shown to offer parametric images with higher SNR compared to the indirect method. The aim of this study was to extend and evaluate the performance of a direct parametric reconstruction method using in vivo dynamic PET MPI data for the purpose of quantifying MBF. Dynamic PET MPI studies were performed on two healthy pigs using a Siemens Biograph mMR scanner. List-mode PET data for each animal were acquired following a bolus injection of ~7-8 mCi of 18F-flurpiridaz, a myocardial perfusion agent. Fully-3D dynamic PET sinograms were obtained by sorting the coincidence events into 16 temporal frames covering ~5 min after radiotracer administration. Additionally, eight independent noise realizations of both scans (each containing 1/8th of the total number of events) were generated from the original list-mode data. Dynamic sinograms were then used to compute parametric maps using the conventional indirect method and the proposed direct method. For both methods, a one-tissue compartment model accounting for spillover from the left and right ventricle blood-pools was used to describe the kinetics of 18F-flurpiridaz. An image-derived arterial input function obtained from a TAC taken in the left ventricle cavity was used for tracer kinetic analysis.
For the indirect method, frame-by-frame images were estimated using two fully-3D reconstruction techniques: the standard ordered subset expectation maximization (OSEM) reconstruction algorithm on one side, and the one-step late maximum a posteriori (OSL-MAP) algorithm on the other side, which incorporates a quadratic penalty function. The parametric images were then calculated using voxel-wise weighted least-square fitting of the reconstructed myocardial PET TACs. For the direct method, parametric images were estimated directly from the dynamic PET sinograms using a maximum a posteriori (MAP) parametric reconstruction algorithm which optimizes an objective function comprised of the Poisson log-likelihood term, the kinetic model and a quadratic penalty function. Maximization of the objective function with respect to each set of parameters was achieved using a preconditioned conjugate gradient algorithm with a specifically developed pre-conditioner. The performance of the direct method was evaluated by comparing voxel- and segment-wise estimates of K1, the tracer transport rate (ml·min⁻¹·ml⁻¹), to those obtained using the indirect method applied to both OSEM and OSL-MAP dynamic reconstructions. The proposed direct reconstruction method produced K1 maps with visibly lower noise than the indirect method based on OSEM and OSL-MAP reconstructions. At normal count levels, the direct method was shown to outperform the indirect method based on OSL-MAP in the sense that at matched level of bias, reduced regional noise levels were obtained. At lower count levels, the direct method produced K1 estimates with significantly lower standard deviation across noise realizations than the indirect method based on OSL-MAP at matched bias level. In all cases, the direct method yielded lower noise and standard deviation than the indirect method based on OSEM.
Overall, the proposed direct reconstruction offered a better bias-variance tradeoff than the indirect method applied to either OSEM or OSL-MAP reconstructions. Direct parametric reconstruction applied to in vivo dynamic PET MPI data is therefore a promising method for producing MBF maps with lower variance.
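The indirect method referenced above (voxel-wise fitting of a one-tissue compartment model to reconstructed TACs) can be sketched in plain Python. This is a minimal illustration, not the authors' implementation: the convolution uses a simple rectangular discretization, and a coarse grid search stands in for the weighted least-squares optimizer; the function names and parameter grids are invented for the example.

```python
import math

def one_tissue_tac(K1, k2, cp, dt):
    """Tissue TAC of a one-tissue compartment model:
    Ct(t) = K1 * [exp(-k2*t) convolved with Cp(t)], rectangular discretization."""
    ct = []
    for i in range(len(cp)):
        acc = 0.0
        for j in range(i + 1):
            acc += cp[j] * math.exp(-k2 * (i - j) * dt)
        ct.append(K1 * acc * dt)
    return ct

def fit_indirect(tac, cp, dt, weights=None):
    """Voxel-wise weighted least-squares fit of (K1, k2); a coarse grid
    search stands in for the optimizer used in practice."""
    if weights is None:
        weights = [1.0] * len(tac)
    best = (0.0, 0.0, float("inf"))
    for K1 in [0.05 * i for i in range(1, 41)]:      # 0.05 .. 2.0 (illustrative)
        for k2 in [0.02 * i for i in range(1, 41)]:  # 0.02 .. 0.8 (illustrative)
            model = one_tissue_tac(K1, k2, cp, dt)
            sse = sum(w * (m - y) ** 2 for w, m, y in zip(weights, model, tac))
            if sse < best[2]:
                best = (K1, k2, sse)
    return best[0], best[1]
```

On noise-free synthetic data the grid search recovers the simulated (K1, k2) exactly, which is the sanity check one would run before adding noise to study the SNR behaviour the abstract discusses.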
NASA Technical Reports Server (NTRS)
Samareh, Jamshid A.; Sensmeier, Mark D.; Stewart, Bret A.
2006-01-01
Algorithms for rapid generation of moderate-fidelity structural finite element models of air vehicle structures to allow more accurate weight estimation earlier in the vehicle design process have been developed. Application of these algorithms should help to rapidly assess many structural layouts before the start of the preliminary design phase and eliminate weight penalties imposed when actual structure weights exceed those estimated during conceptual design. By defining the structural topology in a fully parametric manner, the structure can be mapped to arbitrary vehicle configurations being considered during conceptual design optimization. Recent enhancements to this approach include the porting of the algorithms to the platform-independent language Python, and modifications to specifically consider morphing aircraft-type configurations. Two sample cases which illustrate these recent developments are presented.
[How to start a neuroimaging study].
Narumoto, Jin
2012-06-01
In order to help researchers understand how to start a neuroimaging study, several tips are described in this paper. These include 1) Choice of an imaging modality, 2) Statistical method, and 3) Interpretation of the results. 1) There are several imaging modalities available in clinical research. Advantages and disadvantages of each modality are described. 2) Statistical Parametric Mapping, which is the most common statistical software for neuroimaging analysis, is described in terms of parameter setting in normalization and level of significance. 3) In the discussion section, the region which shows a significant difference between patients and normal controls should be discussed in relation to the neurophysiology of the disease, making reference to previous reports from neuroimaging studies in normal controls, lesion studies and animal studies. A typical pattern of discussion is described.
Construction and comparative evaluation of different activity detection methods in brain FDG-PET.
Buchholz, Hans-Georg; Wenzel, Fabian; Gartenschläger, Martin; Thiele, Frank; Young, Stewart; Reuss, Stefan; Schreckenberger, Mathias
2015-08-18
We constructed and evaluated reference brain FDG-PET databases for usage by three software programs (Computer-aided diagnosis for dementia (CAD4D), Statistical Parametric Mapping (SPM) and NEUROSTAT), which allow a user-independent detection of dementia-related hypometabolism in patients' brain FDG-PET. Thirty-seven healthy volunteers were scanned in order to construct brain FDG reference databases, which reflect the normal, age-dependent glucose consumption in human brain, using each software package. Databases were compared to each other to assess the impact of the different stereotactic normalization algorithms used by each software package. In addition, performance of the new reference databases in the detection of altered glucose consumption in the brains of patients was evaluated by calculating statistical maps of regional hypometabolism in FDG-PET of 20 patients with confirmed Alzheimer's dementia (AD) and of 10 non-AD patients. Extent (hypometabolic volume referred to as cluster size) and magnitude (peak z-score) of detected hypometabolism was statistically analyzed. Differences between the reference databases built by CAD4D, SPM or NEUROSTAT were observed. Due to the different normalization methods, altered spatial FDG patterns were found. When analyzing patient data with the reference databases created using CAD4D, SPM or NEUROSTAT, similar characteristic clusters of hypometabolism in the same brain regions were found in the AD group with each software package. However, larger z-scores were observed with CAD4D and NEUROSTAT than those reported by SPM. Better concordance with CAD4D and NEUROSTAT was achieved using the spatially normalized images of SPM and an independent z-score calculation. The three software packages identified the peak z-scores in the same brain region in 11 of 20 AD cases, and there was concordance between CAD4D and SPM in 16 AD subjects.
The clinical evaluation of brain FDG-PET of 20 AD patients with either CAD4D-, SPM- or NEUROSTAT-generated databases from an identical reference dataset showed similar patterns of hypometabolism in the brain regions known to be involved in AD. The extent of hypometabolism and peak z-score appeared to be influenced by the calculation method used in each software package rather than by different spatial normalization parameters.
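At its core, the user-independent detection step shared by these packages reduces to voxel-wise z-scoring of a patient scan against the age-matched reference database. A minimal sketch, assuming intensity- and spatially normalized voxel values; the sign convention (positive z for hypometabolism here) varies between packages:

```python
def z_map(patient, ref_mean, ref_std):
    """Voxel-wise z-scores of a patient FDG scan against a normal database:
    z = (reference mean - patient) / reference SD, so larger z means deeper
    hypometabolism under this (package-dependent) sign convention."""
    return [(m - p) / s for p, m, s in zip(patient, ref_mean, ref_std)]
```

Cluster size and peak z-score, the two outcome measures compared in the study, would then be derived from a threshold applied to this map.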
Mapping the Chevallier-Polarski-Linder parametrization onto physical dark energy Models
NASA Astrophysics Data System (ADS)
Scherrer, Robert J.
2015-08-01
We examine the Chevallier-Polarski-Linder (CPL) parametrization, in the context of quintessence and barotropic dark energy models, to determine the subset of such models to which it can provide a good fit. The CPL parametrization gives the equation of state parameter w for the dark energy as a linear function of the scale factor a, namely w = w0 + wa(1 - a). In the case of quintessence models, we find that over most of the w0, wa parameter space the CPL parametrization maps onto a fairly narrow form of behavior for the potential V(ϕ), while a one-dimensional subset of parameter space, for which wa = κ(1 + w0), with κ constant, corresponds to a wide range of functional forms for V(ϕ). For barotropic models, we show that the functional dependence of the pressure on the density, up to a multiplicative constant, depends only on wi = w0 + wa and not on w0 and wa separately. Our results suggest that the CPL parametrization may not be optimal for testing either type of model.
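The CPL form and its standard consequence for the dark energy density evolution can be written down directly. A small sketch (the density formula is the usual result of integrating the continuity equation with w(a) = w0 + wa(1 - a); normalization rho0 is illustrative):

```python
import math

def w_cpl(a, w0, wa):
    """CPL equation-of-state parameter w(a) = w0 + wa*(1 - a).
    Note w(1) = w0 today and w(0) = w0 + wa = wi in the early universe."""
    return w0 + wa * (1.0 - a)

def rho_de(a, w0, wa, rho0=1.0):
    """Dark energy density implied by the CPL form:
    rho(a) = rho0 * a**(-3*(1 + w0 + wa)) * exp(-3*wa*(1 - a))."""
    return rho0 * a ** (-3.0 * (1.0 + w0 + wa)) * math.exp(-3.0 * wa * (1.0 - a))
```

The early-time limit w(0) = w0 + wa is exactly the combination wi that the abstract identifies as the only one barotropic models are sensitive to.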
Parametrization of local CR automorphisms by finite jets and applications
NASA Astrophysics Data System (ADS)
Lamel, Bernhard; Mir, Nordine
2007-04-01
For any real-analytic hypersurface M ⊂ C^N which does not contain any complex-analytic subvariety of positive dimension, we show that for every point p ∈ M the local real-analytic CR automorphisms of M fixing p can be parametrized real-analytically by their ℓ_p jets at p. As a direct application, we derive a Lie group structure for the topological group Aut(M, p). Furthermore, we also show that the order ℓ_p of the jet space in which the group Aut(M, p) embeds can be chosen to depend upper-semicontinuously on p. As a first consequence, it follows that given any compact real-analytic hypersurface M in C^N, there exists an integer k depending only on M such that for every point p ∈ M, germs at p of CR diffeomorphisms mapping M into another real-analytic hypersurface in C^N are uniquely determined by their k-jet at that point. Another consequence is the following boundary version of H. Cartan's uniqueness theorem: given any bounded domain Ω with smooth real-analytic boundary, there exists an integer k depending only on ∂Ω such that if H: Ω → Ω is a proper holomorphic mapping extending smoothly up to ∂Ω near some point p ∈ ∂Ω with the same k-jet at p as that of the identity mapping, then necessarily H = Id. Our parametrization theorem also holds for the stability group of any essentially finite minimal real-analytic CR manifold of arbitrary codimension. One of the new main tools developed in the paper, which may be of independent interest, is a parametrization theorem for invertible solutions of a certain kind of singular analytic equation, which roughly speaking consists of inverting certain families of parametrized maps with singularities.
Deep space network software cost estimation model
NASA Technical Reports Server (NTRS)
Tausworthe, R. C.
1981-01-01
A parametric software cost estimation model prepared for Jet Propulsion Laboratory (JPL) Deep Space Network (DSN) Data System implementation tasks is described. The resource estimation model modifies and combines a number of existing models. The model calibrates the task magnitude and difficulty, development environment, and software technology effects through prompted responses to a set of approximately 50 questions. Parameters in the model are adjusted to fit JPL software life-cycle statistics.
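The DSN model's actual calibration is not reproduced here, but parametric cost models of this family generally take the shape effort = a·(size)^b scaled by multipliers derived from questionnaire responses. A generic sketch with illustrative, COCOMO-style constants (not the JPL values):

```python
def effort_person_months(ksloc, a=2.4, b=1.05, multipliers=()):
    """Generic parametric software-cost form: effort in person-months is
    a * KSLOC**b times a product of effort multipliers. In a model like
    the DSN one, each of the ~50 prompted responses would contribute one
    multiplier; the constants here are illustrative only."""
    effort = a * ksloc ** b
    for m in multipliers:
        effort *= m
    return effort
```

Calibration against life-cycle statistics, as the abstract describes, amounts to fitting a, b and the multiplier tables to historical project data.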
NASA Technical Reports Server (NTRS)
Choo, Yung K.; Slater, John W.; Henderson, Todd L.; Bidwell, Colin S.; Braun, Donald C.; Chung, Joongkee
1998-01-01
TURBO-GRD is a software system for interactive two-dimensional boundary/field grid generation, modification, and refinement. Its features allow users to explicitly control grid quality locally and globally. The grid control can be achieved interactively by using control points that the user picks and moves on the workstation monitor or by direct stretching and refining. The techniques used in the code are the control point form of algebraic grid generation, a damped cubic spline for edge meshing, and parametric mapping between physical and computational domains. It also performs elliptic grid smoothing and free-form boundary control for boundary geometry manipulation. Internal block boundaries are constructed and shaped by using Bézier curves. Because TURBO-GRD is a highly interactive code, users can read in an initial solution, display its solution contour in the background of the grid and control net, and modify the grid using the solution contour as a guide. This process can be called interactive solution-adaptive grid generation.
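The parametric mapping between computational and physical domains that TURBO-GRD relies on can be illustrated with transfinite interpolation, a basic algebraic grid-generation scheme (simpler than the control point form the code actually uses; shown only to convey the idea of mapping the (u, v) unit square onto a region bounded by four curves):

```python
def tfi_grid(bottom, top, left, right, ni, nj):
    """2D algebraic grid by transfinite interpolation: each boundary is a
    function of a parameter in [0,1] returning (x, y); the interior is the
    blend of the four curves minus the doubly-counted corner terms."""
    grid = []
    for j in range(nj):
        v = j / (nj - 1)
        row = []
        for i in range(ni):
            u = i / (ni - 1)
            pt = []
            for c in range(2):  # x then y
                val = ((1 - v) * bottom(u)[c] + v * top(u)[c]
                       + (1 - u) * left(v)[c] + u * right(v)[c]
                       - ((1 - u) * (1 - v) * bottom(0.0)[c]
                          + u * (1 - v) * bottom(1.0)[c]
                          + (1 - u) * v * top(0.0)[c]
                          + u * v * top(1.0)[c]))
                pt.append(val)
            row.append(tuple(pt))
        grid.append(row)
    return grid
```

On a unit square with straight boundaries the mapping reduces to the identity, a convenient correctness check before supplying curved (e.g. spline) edges.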
Visual Literacy and the Integration of Parametric Modeling in the Problem-Based Curriculum
ERIC Educational Resources Information Center
Assenmacher, Matthew Benedict
2013-01-01
This quasi-experimental study investigated the application of visual literacy skills in the form of parametric modeling software in relation to traditional forms of sketching. The study included two groups of high school technical design students. The control and experimental groups involved in the study consisted of two randomly selected groups…
Complex mapping of aerofoils - a different perspective
NASA Astrophysics Data System (ADS)
Matthews, Miccal T.
2012-01-01
In this article an application of conformal mapping to aerofoil theory is studied from a geometric and calculus point of view. The problem is suitable for undergraduate teaching in terms of a project or extended piece of work, and brings together the concepts of geometric mapping, parametric equations, complex numbers and calculus. The Joukowski and Karman-Trefftz aerofoils are studied, and it is shown that the Karman-Trefftz aerofoil is an improvement over the Joukowski aerofoil from a practical point of view. For the most part only a spreadsheet program and pen and paper are required; a symbolic computer package is employed only for the last portion of the study of the Karman-Trefftz aerofoil. Ignoring the concept of a conformal mapping and instead viewing the problem from a parametric point of view, some interesting mappings are obtained. By considering the derivative of the mapping via the chain rule, some new and interesting analytical results are obtained for the Joukowski aerofoil, and numerical results for the Karman-Trefftz aerofoil.
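Viewed parametrically, as the article suggests, the Joukowski aerofoil is simply the image of a circle passing through ζ = 1 under w = ζ + 1/ζ. A short sketch (the circle-centre offsets xc, yc are illustrative choices that produce camber and thickness):

```python
import math
import cmath

def joukowski(zeta):
    """The Joukowski transform w = zeta + 1/zeta."""
    return zeta + 1.0 / zeta

def joukowski_aerofoil(n=64, xc=-0.1, yc=0.1):
    """Parametric aerofoil: image under the Joukowski map of a circle with
    centre (xc, yc) chosen so the circle passes through zeta = 1 (which
    maps to the sharp trailing edge at w = 2)."""
    centre = complex(xc, yc)
    r = abs(1.0 - centre)  # radius so that zeta = 1 lies on the circle
    pts = []
    for k in range(n):
        t = 2.0 * math.pi * k / n
        zeta = centre + r * cmath.exp(1j * t)
        pts.append(joukowski(zeta))
    return pts
```

Plotting the real and imaginary parts of the returned points in a spreadsheet reproduces the aerofoil shape discussed in the article; the trailing-edge cusp at w = 2 is where the mapping's derivative vanishes.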
Linkage mapping of beta 2 EEG waves via non-parametric regression.
Ghosh, Saurabh; Begleiter, Henri; Porjesz, Bernice; Chorlian, David B; Edenberg, Howard J; Foroud, Tatiana; Goate, Alison; Reich, Theodore
2003-04-01
Parametric linkage methods for analyzing quantitative trait loci are sensitive to violations in trait distributional assumptions. Non-parametric methods are relatively more robust. In this article, we modify the non-parametric regression procedure proposed by Ghosh and Majumder [2000: Am J Hum Genet 66:1046-1061] to map Beta 2 EEG waves using genome-wide data generated in the COGA project. Significant linkage findings are obtained on chromosomes 1, 4, 5, and 15 with findings at multiple regions on chromosomes 4 and 15. We analyze the data both with and without incorporating alcoholism as a covariate. We also test for epistatic interactions between regions of the genome exhibiting significant linkage with the EEG phenotypes and find evidence of epistatic interactions between a region each on chromosome 1 and chromosome 4 with one region on chromosome 15. While regressing out the effect of alcoholism does not affect the linkage findings, the epistatic interactions become statistically insignificant. Copyright 2003 Wiley-Liss, Inc.
Huang, Huiyuan; Ding, Zhongxiang; Mao, Dewang; Yuan, Jianhua; Zhu, Fangmei; Chen, Shuda; Xu, Yan; Lou, Lin; Feng, Xiaoyan; Qi, Le; Qiu, Wusi; Zhang, Han; Zang, Yu-Feng
2016-10-01
The main goal of brain tumor surgery is to maximize tumor resection while minimizing the risk of irreversible postoperative functional sequelae. Eloquent functional areas should be delineated preoperatively, particularly for patients with tumors near eloquent areas. Functional magnetic resonance imaging (fMRI) is a noninvasive technique that demonstrates great promise for presurgical planning. However, specialized data processing toolkits for presurgical planning remain lacking. Based on several functions in open-source software such as Statistical Parametric Mapping (SPM), Resting-State fMRI Data Analysis Toolkit (REST), Data Processing Assistant for Resting-State fMRI (DPARSF) and Multiple Independent Component Analysis (MICA), here, we introduce an open-source MATLAB toolbox named PreSurgMapp. This toolbox can reveal eloquent areas using comprehensive methods and various complementary fMRI modalities. For example, PreSurgMapp supports both model-based (general linear model, GLM, and seed correlation) and data-driven (independent component analysis, ICA) methods and processes both task-based and resting-state fMRI data. PreSurgMapp is designed for highly automatic and individualized functional mapping with a user-friendly graphical user interface (GUI) for time-saving pipeline processing. For example, sensorimotor and language-related components can be automatically identified without manual intervention, using an effective and accurate component-identification algorithm based on a discriminability index. All the results generated can be further evaluated and compared by neuro-radiologists or neurosurgeons. This software has substantial value for clinical neuro-radiology and neuro-oncology, including application to patients with low- and high-grade brain tumors and those with epilepsy foci in the dominant language hemisphere who are planning to undergo a temporal lobectomy.
Modeling and Visualization Process of the Curve of Pen Point by GeoGebra
ERIC Educational Resources Information Center
Aktümen, Muharem; Horzum, Tugba; Ceylan, Tuba
2013-01-01
This study describes the mathematical construction of a real-life model by means of parametric equations, as well as the two- and three-dimensional visualization of the model using the software GeoGebra. The model was initially considered as "determining the parametric equation of the curve formed on a plane by the point of a pen, positioned…
A software tool for advanced MRgFUS prostate therapy planning and follow up
NASA Astrophysics Data System (ADS)
van Straaten, Dörte; Hoogenboom, Martijn; van Amerongen, Martinus J.; Weiler, Florian; Issawi, Jumana Al; Günther, Matthias; Fütterer, Jurgen; Jenne, Jürgen W.
2017-03-01
US-guided HIFU/FUS ablation for the therapy of prostate cancer is a clinically established method, while MR-guided HIFU/FUS applications for the prostate have only recently entered clinical evaluation. Although MRI examination is an excellent diagnostic tool for prostate cancer, it is a time-consuming procedure and not practicable within an MRgFUS therapy session. The aim of our ongoing work is to develop software to support therapy planning and post-therapy follow-up for MRgFUS treatment of localized prostate cancer, based on multi-parametric MR protocols. The clinical workflow of diagnosis, therapy and follow-up of MR-guided FUS on prostate cancer was analyzed in depth. Based on this, the image processing workflow was designed and all necessary components, e.g. GUI, viewer, registration tools etc., were defined and implemented. The software is based on MeVisLab with several C++ modules implemented for the image processing tasks. The developed software, called LTC (Local Therapy Control), will automatically register and visualize all images (T1w, T2w, DWI etc.) and ADC or perfusion maps gained from the diagnostic MRI session. This maximum of diagnostic information helps to segment all necessary ROIs, e.g. the tumor, for therapy planning. Final therapy planning will be performed based on these segmentation data in the following MRgFUS therapy session. In addition, the developed software should help to evaluate therapy success by synchronization and display of pre-therapeutic, therapy and follow-up image data, including the therapy plan and thermal dose information. In this ongoing project, the first stand-alone prototype was completed and will be clinically evaluated.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Najafi, M; El Kaffas, A; Han, B
Purpose: The Clarity Autoscan ultrasound monitoring system allows acquisition of raw radiofrequency (RF) ultrasound data before and during radiotherapy. This enables the computation of 3D quantitative ultrasound (QUS) tissue parametric maps from the RF data. We aim to evaluate whether QUS parameters undergo changes with radiotherapy and could thus potentially be used as early predictors and/or markers of treatment response in prostate cancer patients. Methods: In-vivo evaluation was performed under an IRB protocol to allow data collection in prostate patients treated with VMAT, whereby the prostate was imaged through the acoustic window of the perineum. QUS spectroscopy analysis was carried out by computing a tissue power spectrum normalized to the power spectrum obtained from a quartz reference to remove system transfer function effects. A ROI was selected within the 3D image volume of the prostate. Because longitudinal registration was optimal, the same features could be used to select ROIs at roughly the same location in images acquired on different days. Parametric maps were generated within the rectangular ROIs with window sizes that were approximately 8 times the wavelength of the ultrasound. The mid-band fit (MBF), spectral slope (SS) and spectral intercept (SI) QUS parameters were computed for each window within the ROI and displayed as parametric maps. Quantitative parameters were obtained by averaging each of the spectral parameters over the whole ROI. Results: Data were acquired over 21 treatment fractions. Preliminary results show changes in the parametric maps. MBF values decreased from −33.9 dB to −38.7 dB from pre-treatment to the last day of treatment. The spectral slope increased from −1.1 a.u. to −0.5 a.u., and the spectral intercept decreased from −28.2 dB to −36.3 dB over the 21-fraction treatment regimen. Conclusion: QUS parametric maps change over the course of treatment, which warrants further investigation of their potential use for treatment planning and predicting treatment outcomes.
Research was supported by Elekta.
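The three spectral parameters named above come from a straight-line fit to the normalized power spectrum in dB over the usable bandwidth. A minimal sketch of that step (the windowing and quartz normalization are assumed to have been done upstream; names are illustrative):

```python
def qus_spectral_params(freqs_mhz, spectrum_db, ref_db, f_center):
    """Midband fit (MBF), spectral slope (SS) and spectral intercept (SI)
    from a linear least-squares fit to the normalized power spectrum:
    tissue spectrum minus reference spectrum, both in dB, vs frequency."""
    norm = [s - r for s, r in zip(spectrum_db, ref_db)]
    n = len(freqs_mhz)
    fbar = sum(freqs_mhz) / n
    ybar = sum(norm) / n
    sxx = sum((f - fbar) ** 2 for f in freqs_mhz)
    sxy = sum((f - fbar) * (y - ybar) for f, y in zip(freqs_mhz, norm))
    ss = sxy / sxx             # spectral slope, dB per MHz
    si = ybar - ss * fbar      # spectral intercept: the fit evaluated at f = 0
    mbf = si + ss * f_center   # midband fit: the fit evaluated at band centre
    return mbf, ss, si
```

Computing these three numbers per sliding window, rather than per ROI, is what turns the fit into the parametric maps described in the abstract.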
Parametric mapping of [18F]fluoromisonidazole positron emission tomography using basis functions.
Hong, Young T; Beech, John S; Smith, Rob; Baron, Jean-Claude; Fryer, Tim D
2011-02-01
In this study, we present a basis function method (BAFPIC) for voxelwise calculation of kinetic parameters (K1, k2, k3, Ki) and blood volume using an irreversible two-tissue compartment model. BAFPIC was applied to rat ischaemic stroke micro-positron emission tomography data acquired with the hypoxia tracer [18F]fluoromisonidazole because irreversible two-tissue compartmental modelling provided good fits to data from both hypoxic and normoxic tissues. Simulated data show that BAFPIC produces kinetic parameters with significantly lower variability and bias than nonlinear least squares (NLLS) modelling in hypoxic tissue. The advantage of BAFPIC over NLLS is less pronounced in normoxic tissue. Ki determined from BAFPIC has lower variability than that from the Patlak-Gjedde graphical analysis (PGA) by up to 40% and lower bias, except for normoxic tissue at mid-high noise levels. Consistent with the simulation results, BAFPIC parametric maps of real data suffer less noise-induced variability than do NLLS and PGA maps. Delineation of hypoxia on BAFPIC k3 maps is aided by low variability in normoxic tissue, which matches that in Ki maps. BAFPIC produces Ki values that correlate well with those from PGA (r2 = 0.93 to 0.97; slope 0.99 to 1.05; absolute intercept < 0.00002 mL/g per min). BAFPIC is a computationally efficient method of determining parametric maps with low bias and variance.
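The basis function idea is that, for a fixed θ = k2 + k3, the irreversible two-tissue model is linear in two coefficients, so each candidate θ needs only a linear least-squares solve rather than a full nonlinear fit. A simplified sketch of this scheme (no blood volume term, rectangular discretization, unweighted fit; not the published BAFPIC code, and names are illustrative):

```python
import math

def cumint(c, dt):
    """Running integral of a sampled curve, rectangular rule."""
    out, acc = [], 0.0
    for v in c:
        acc += v * dt
        out.append(acc)
    return out

def conv_exp(theta, cp, dt):
    """exp(-theta*t) convolved with cp, rectangular discretization."""
    return [sum(cp[j] * math.exp(-theta * (i - j) * dt) for j in range(i + 1)) * dt
            for i in range(len(cp))]

def basis_function_fit(tac, cp, dt, thetas):
    """Irreversible two-tissue model as Ct = a*int(Cp) + b*(exp(-theta*t) (*) Cp)
    with theta = k2 + k3, a = K1*k3/theta (= Ki) and b = K1*k2/theta.
    Linear least squares per basis; best theta kept. Returns (K1, k2, k3, Ki)."""
    x1 = cumint(cp, dt)
    best = None
    for th in thetas:
        x2 = conv_exp(th, cp, dt)
        s11 = sum(v * v for v in x1)
        s22 = sum(v * v for v in x2)
        s12 = sum(u * v for u, v in zip(x1, x2))
        t1 = sum(u * y for u, y in zip(x1, tac))
        t2 = sum(v * y for v, y in zip(x2, tac))
        det = s11 * s22 - s12 * s12
        if det == 0.0:
            continue
        a = (t1 * s22 - t2 * s12) / det
        b = (t2 * s11 - t1 * s12) / det
        sse = sum((a * u + b * v - y) ** 2 for u, v, y in zip(x1, x2, tac))
        if best is None or sse < best[0]:
            best = (sse, a, b, th)
    _, a, b, th = best
    K1 = a + b
    return K1, b * th / K1, a * th / K1, a
```

Because the nonlinear search is reduced to a one-dimensional sweep over θ, the fit is fast enough for voxelwise parametric mapping, which is the efficiency the abstract claims.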
An Interactive Software for Conceptual Wing Flutter Analysis and Parametric Study
NASA Technical Reports Server (NTRS)
Mukhopadhyay, Vivek
1996-01-01
An interactive computer program was developed for wing flutter analysis in the conceptual design stage. The objective was to estimate the flutter instability boundary of a flexible cantilever wing, when well-defined structural and aerodynamic data are not available, and then study the effect of change in Mach number, dynamic pressure, torsional frequency, sweep, mass ratio, aspect ratio, taper ratio, center of gravity, and pitch inertia, to guide the development of the concept. The software was developed for Macintosh or IBM compatible personal computers, on MathCad application software with integrated documentation, graphics, data base and symbolic mathematics. The analysis method was based on non-dimensional parametric plots of two primary flutter parameters, namely Regier number and Flutter number, with normalization factors based on torsional stiffness, sweep, mass ratio, taper ratio, aspect ratio, center of gravity location and pitch inertia radius of gyration. The parametric plots were compiled in a Vought Corporation report from a vast data base of past experiments and wind-tunnel tests. The computer program was utilized for flutter analysis of the outer wing of a Blended-Wing-Body concept, proposed by McDonnell Douglas Corp. Using a set of assumed data, preliminary flutter boundary and flutter dynamic pressure variation with altitude, Mach number and torsional stiffness were determined.
The parametric resonance—from LEGO Mindstorms to cold atoms
NASA Astrophysics Data System (ADS)
Kawalec, Tomasz; Sierant, Aleksandra
2017-07-01
We show an experimental setup based on a popular LEGO Mindstorms set, allowing us both to observe and to investigate the parametric resonance phenomenon. The presented method is simple but covers a variety of student activities, such as embedded software development, conducting measurements, and data collection and analysis. It may be used during science shows, as part of student projects, and to illustrate parametric resonance in mechanics or even quantum physics during lectures or classes. The parametrically driven LEGO pendulum gains energy in a spectacular way, increasing its amplitude from 10° to about 100° within a few tens of seconds. We also provide a short description of a wireless absolute-orientation sensor that may be used in quantitative analysis of driven or free pendulum movement.
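The amplitude growth described above can be reproduced numerically: modulating the effective gravity at twice the pendulum's natural frequency pumps energy into the swing. A minimal sketch (semi-implicit Euler integration; the parameters are illustrative, not measurements from the LEGO setup):

```python
import math

def driven_pendulum_peak(theta0, eps, t_end, dt=0.001, omega0=1.0):
    """Pendulum with parametric drive at twice its natural frequency:
    theta'' = -omega0**2 * (1 + eps*cos(2*omega0*t)) * sin(theta).
    Integrated with semi-implicit Euler; returns the peak |theta| reached."""
    th, w, t = theta0, 0.0, 0.0
    peak = abs(theta0)
    while t < t_end:
        w += -omega0 ** 2 * (1.0 + eps * math.cos(2.0 * omega0 * t)) * math.sin(th) * dt
        th += w * dt
        t += dt
        peak = max(peak, abs(th))
    return peak
```

With the drive on, the amplitude grows roughly exponentially (rate ~ eps*omega0/4 in the small-angle Mathieu regime) until the pendulum's amplitude-dependent period detunes the resonance; with the drive off, the amplitude stays put, mirroring the classroom demonstration.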
Schuitemaker, Alie; van Berckel, Bart N M; Kropholler, Marc A; Veltman, Dick J; Scheltens, Philip; Jonker, Cees; Lammertsma, Adriaan A; Boellaard, Ronald
2007-05-01
(R)-[11C]PK11195 has been used for quantifying cerebral microglial activation in vivo. In previous studies, both plasma input and reference tissue methods have been used, usually in combination with a region of interest (ROI) approach. Definition of ROIs, however, can be laborious and prone to interobserver variation. In addition, results are only obtained for predefined areas, and (unexpected) signals in undefined areas may be missed. On the other hand, standard pharmacokinetic models are too sensitive to noise to calculate (R)-[11C]PK11195 binding on a voxel-by-voxel basis. Linearised versions of both plasma input and reference tissue models have been described, and these are more suitable for parametric imaging. The purpose of this study was to compare the performance of these plasma input and reference tissue parametric methods on the outcome of statistical parametric mapping (SPM) analysis of (R)-[11C]PK11195 binding. Dynamic (R)-[11C]PK11195 PET scans with arterial blood sampling were performed in 7 younger and 11 elderly healthy subjects. Parametric images of volume of distribution (Vd) and binding potential (BP) were generated using linearised versions of plasma input (Logan) and reference tissue (Reference Parametric Mapping) models. Images were compared at the group level using SPM with a two-sample t-test per voxel, both with and without proportional scaling. Parametric BP images without scaling provided the most sensitive framework for determining differences in (R)-[11C]PK11195 binding between younger and elderly subjects. Vd images could only demonstrate differences in (R)-[11C]PK11195 binding when analysed with proportional scaling, due to intersubject variation in K1/k2 (blood-brain barrier transport and non-specific binding).
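The linearised plasma-input method used here (Logan analysis) estimates Vd as the late-time slope of a transformed plot: the running integral of the tissue curve over the tissue activity, against the running integral of the plasma curve over the tissue activity. A minimal sketch of the slope estimation (rectangular integration, unweighted fit; the start index t* is a user choice):

```python
def logan_vd(tac, cp, dt, t_star_idx):
    """Logan graphical analysis: the slope of
    (int_0^t Ct)/Ct  versus  (int_0^t Cp)/Ct
    for frames after t*, which estimates the total volume of
    distribution Vd for a reversibly binding tracer."""
    def cumint(c):
        out, acc = [], 0.0
        for v in c:
            acc += v * dt
            out.append(acc)
        return out
    ict, icp = cumint(tac), cumint(cp)
    xs = [icp[i] / tac[i] for i in range(t_star_idx, len(tac))]
    ys = [ict[i] / tac[i] for i in range(t_star_idx, len(tac))]
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    slope = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) \
        / sum((x - xbar) ** 2 for x in xs)
    return slope
```

The noise sensitivity the abstract alludes to enters through the tissue activity in both denominators, which is why basis-function and other refined linearisations are preferred for low-count voxel data.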
Pixel-based parametric source depth map for Cerenkov luminescence imaging
NASA Astrophysics Data System (ADS)
Altabella, L.; Boschi, F.; Spinelli, A. E.
2016-01-01
Optical tomography represents a challenging problem in optical imaging because of the intrinsically ill-posed inverse problem due to photon diffusion. Cerenkov luminescence tomography (CLT) for optical photons produced in tissues by several radionuclides (i.e. 32P, 18F, 90Y) has been investigated using both a 3D multispectral approach and multiview methods. Difficulty in achieving convergence with 3D algorithms can discourage the use of this technique for obtaining information on source depth and intensity. For these reasons, we developed a faster, corrected 2D approach based on multispectral acquisitions to obtain the source depth and its intensity using pixel-based fitting of the source intensity. Monte Carlo simulations and experimental data were used to develop and validate the method for obtaining the parametric map of source depth. With this approach we obtain parametric source depth maps with a precision between 3% and 7% for Monte Carlo simulations and 5-6% for experimental data. Using this method we can obtain reliable information about the depth of the Cerenkov luminescence source with a simple and flexible procedure.
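The pixel-based multispectral idea can be illustrated with a single-exponential attenuation model: if the signal in wavelength band k is attenuated as exp(-mu_k·d), then the depth d is the slope of a linear fit of log-intensity (corrected for the emission spectrum) against -mu_k. This is a deliberately simplified stand-in for the tissue optics model actually used in the paper; all names are illustrative:

```python
import math

def fit_depth(intensities, mu, spectrum):
    """Per-pixel source depth from multispectral measurements, assuming
    I_k = A * S_k * exp(-mu_k * d) with known attenuation coefficients
    mu_k and emission spectrum S_k: a linear fit of log(I_k / S_k)
    against -mu_k gives slope d (and intercept log A)."""
    ys = [math.log(i / s) for i, s in zip(intensities, spectrum)]
    xs = [-m for m in mu]
    n = len(xs)
    xbar, ybar = sum(xs) / n, sum(ys) / n
    d = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) \
        / sum((x - xbar) ** 2 for x in xs)
    return d
```

Applying this fit independently at every pixel of the multispectral image stack yields exactly the kind of parametric source depth map the abstract describes.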
NASA Astrophysics Data System (ADS)
Montoya, Paula; Ballesteros, José; Gervás, Pablo
2015-04-01
The increasing complexity of space use and resource cycles in cities demands an understanding of the built environment as "ecological": enabling mutation while remaining balanced and biologically sustainable. Designing man's environment is no longer a question of defining types, but rather an act of inserting changes within a complex system. Architecture and urban planning have become increasingly aware of their condition as system-oriented disciplines, and they are in the process of developing the necessary languages, design tools, and alliances. We will argue the relevance of parametric maps as one of the most powerful of those tools, in terms of their potential for adaptive prototype design, convergence of disciplines, and collaborative work. Cities need to change in order to survive. As the main human landscape (by 2050, 75% of the world's population will live in urban areas), cities follow biological patterns of behaviour, constantly replacing their cells, renovating infrastructure systems and refining methods for energy provision and waste management. They need to adapt constantly. As responsive entities, they develop their own protocols for reaction to environmental change and face the increasing pressure of several issues related to scale: population, mobility, water and energy supply, pollution... The representation of these urban issues on maps becomes crucial for understanding and addressing them in design. Maps enhanced with parametric tools are relational: they not only register environmental dynamics but also allow adaptation of the system through interwoven parameters of mutation. Citizens are taking part in decisions and becoming aware of their role as urban experts in a bottom-up design process of the cities where they live. Modern tools for dynamic visualisation and collaborative editing of maps have an important role to play in this process. More and more people consult maps on hand-held devices as part of their daily routine.
The advent of open-access collaborative maps allows them to actively extend and modify these maps by uploading data of their own design. This can generate an immense amount of unique information that is publicly available. The work of architects, planners, and political agents can be informed by the contributions of a community of volunteer cartographers. Counter-cartographies built through collaboration arise from spontaneous processes of knowledge and data collection, and demand continuous non-commercial revision. Both scientific and non-academic users have direct access to geostrategic information and actively take part in exploring, recording and inserting their verified contributions into the way in which our world is described. This proposal explores the idea of a counter-cartography as a collection of maps that unveil territorial environmental conditions different from those shown in official maps. By using parametric tools we can incorporate information of this type directly into architectural documents and generate interlaced changes in the design. A parametric map is a flexible yet accurate tool for design and discovery: it integrates multiple particular views into a precise physical context that culminates in a generative design. Working with complex maps in this way is gradually making them the ultimate document for designing the city in an integrated manner.
Multiscale Reconstruction for Magnetic Resonance Fingerprinting
Pierre, Eric Y.; Ma, Dan; Chen, Yong; Badve, Chaitra; Griswold, Mark A.
2015-01-01
Purpose To reduce acquisition time needed to obtain reliable parametric maps with Magnetic Resonance Fingerprinting. Methods An iterative-denoising algorithm is initialized by reconstructing the MRF image series at low image resolution. For subsequent iterations, the method enforces pixel-wise fidelity to the best-matching dictionary template then enforces fidelity to the acquired data at slightly higher spatial resolution. After convergence, parametric maps with desirable spatial resolution are obtained through template matching of the final image series. The proposed method was evaluated on phantom and in-vivo data using the highly-undersampled, variable-density spiral trajectory and compared with the original MRF method. The benefits of additional sparsity constraints were also evaluated. When available, gold standard parameter maps were used to quantify the performance of each method. Results The proposed approach allowed convergence to accurate parametric maps with as few as 300 time points of acquisition, as compared to 1000 in the original MRF work. Simultaneous quantification of T1, T2, proton density (PD) and B0 field variations in the brain was achieved in vivo for a 256×256 matrix for a total acquisition time of 10.2s, representing a 3-fold reduction in acquisition time. Conclusions The proposed iterative multiscale reconstruction reliably increases MRF acquisition speed and accuracy. PMID:26132462
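The template-matching step that MRF (and the proposed multiscale variant) relies on selects, for each pixel, the dictionary entry whose simulated signal evolution is most collinear with the measured one; that entry's simulation parameters (e.g. T1, T2) are then assigned to the pixel. A minimal sketch with an invented two-entry dictionary:

```python
def mrf_match(signal, dictionary):
    """MRF dictionary matching: return the parameter key of the template
    with the largest normalized inner product with the measured signal."""
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))
    def norm(a):
        return dot(a, a) ** 0.5
    best_key, best_score = None, -1.0
    for params, template in dictionary.items():
        score = abs(dot(signal, template)) / (norm(signal) * norm(template))
        if score > best_score:
            best_key, best_score = params, score
    return best_key
```

In the multiscale reconstruction, this matching is what enforces "pixel-wise fidelity to the best-matching dictionary template" at each iteration before the data-consistency step.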
ZOOM Lite: next-generation sequencing data mapping and visualization software
Zhang, Zefeng; Lin, Hao; Ma, Bin
2010-01-01
High-throughput next-generation sequencing technologies pose increasing demands on the efficiency, accuracy and usability of data analysis software. In this article, we present ZOOM Lite, a software package for efficient read mapping and result visualization. With a kernel capable of mapping tens of millions of Illumina or AB SOLiD sequencing reads efficiently and accurately, and an intuitive graphical user interface, ZOOM Lite integrates read mapping and result visualization into an easy-to-use pipeline on a desktop PC. The software handles both single-end and paired-end reads, and can output either the unique mapping result or the top N mapping results for each read. Additionally, the software accepts a variety of input file formats and outputs to several commonly used result formats. The software is freely available at http://bioinfor.com/zoom/lite/. PMID:20530531
Lausch, Anthony; Yeung, Timothy Pok-Chi; Chen, Jeff; Law, Elton; Wang, Yong; Urbini, Benedetta; Donelli, Filippo; Manco, Luigi; Fainardi, Enrico; Lee, Ting-Yim; Wong, Eugene
2017-11-01
Parametric response map (PRM) analysis of functional imaging has been shown to be an effective tool for early prediction of cancer treatment outcomes and may also be well-suited toward guiding personalized adaptive radiotherapy (RT) strategies such as sub-volume boosting. However, the PRM method was primarily designed for analysis of longitudinally acquired pairs of single-parameter image data. The purpose of this study was to demonstrate the feasibility of a generalized parametric response map analysis framework, which enables analysis of multi-parametric data while maintaining the key advantages of the original PRM method. MRI-derived apparent diffusion coefficient (ADC) and relative cerebral blood volume (rCBV) maps acquired at 1 and 3 months post-RT for 19 patients with high-grade glioma were used to demonstrate the algorithm. Images were first co-registered and then standardized using normal tissue image intensity values. Tumor voxels were then plotted in a four-dimensional Cartesian space with coordinate values equal to a voxel's image intensity in each of the image volumes and an origin defined as the multi-parametric mean of normal tissue image intensity values. Voxel positions were orthogonally projected onto a line defined by the origin and a pre-determined response vector. Voxels were subsequently classified as positive, negative or nil, according to whether their projected positions along the response vector exceeded a threshold distance from the origin. The response vector was selected by identifying the direction in which the standard deviation of tumor image intensity values was maximally different between responding and non-responding patients within a training dataset. Voxel classifications were visualized via familiar three-class response maps and then the fraction of tumor voxels associated with each of the classes was investigated for predictive utility analogous to the original PRM method.
Independent PRM and MPRM analyses of the contrast-enhancing lesion (CEL) and a 1 cm shell of surrounding peri-tumoral tissue were performed. Prediction using tumor volume metrics was also investigated. Leave-one-out cross validation (LOOCV) was used in combination with permutation testing to assess preliminary predictive efficacy and estimate statistically robust P-values. The predictive endpoint was overall survival (OS) greater than or equal to the median OS of 18.2 months. Single-parameter PRM and multi-parametric response maps (MPRMs) were generated for each patient and used to predict OS via the LOOCV. Tumor volume metrics (P ≥ 0.071 ± 0.01) and single-parameter PRM analyses (P ≥ 0.170 ± 0.01) were not found to be predictive of OS within this study. MPRM analysis of the peri-tumoral region but not the CEL was found to be predictive of OS with a classification sensitivity, specificity and accuracy of 80%, 100%, and 89%, respectively (P = 0.001 ± 0.01). The feasibility of a generalized MPRM analysis framework was demonstrated with improved prediction of overall survival compared to the original single-parameter method when applied to a glioblastoma dataset. The proposed algorithm takes the spatial heterogeneity in multi-parametric response into consideration and enables visualization. MPRM analysis of peri-tumoral regions was shown to have predictive potential supporting further investigation of a larger glioblastoma dataset. © 2017 American Association of Physicists in Medicine.
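The voxel classification at the heart of this multi-parametric response map (MPRM) framework reduces to an orthogonal projection onto the response vector followed by thresholding. A minimal Python sketch (function name, toy data and threshold are illustrative assumptions, not the authors' implementation):

```python
# Sketch of MPRM-style voxel classification: project each voxel's
# multi-parametric intensity vector onto a response vector and threshold.

def classify_voxels(voxels, origin, response_vec, threshold):
    """Return +1 (positive), -1 (negative) or 0 (nil) per voxel."""
    # normalize the response direction
    norm = sum(c * c for c in response_vec) ** 0.5
    unit = [c / norm for c in response_vec]
    labels = []
    for v in voxels:
        # signed distance of the orthogonal projection from the origin
        d = sum((vi - oi) * ui for vi, oi, ui in zip(v, origin, unit))
        if d > threshold:
            labels.append(1)       # positive response
        elif d < -threshold:
            labels.append(-1)      # negative response
        else:
            labels.append(0)       # nil
    return labels

# two standardized parameters per voxel (e.g. ADC and rCBV, made-up values)
voxels = [(2.0, 2.0), (-1.5, -1.5), (0.1, -0.1)]
labels = classify_voxels(voxels, origin=(0.0, 0.0),
                         response_vec=(1.0, 1.0), threshold=1.0)
print(labels)  # [1, -1, 0]
```

The fraction of voxels in each class then serves as the scalar predictor, as in the original PRM method.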
Head circumference as a useful surrogate for intracranial volume in older adults.
Hshieh, Tammy T; Fox, Meaghan L; Kosar, Cyrus M; Cavallari, Michele; Guttmann, Charles R G; Alsop, David; Marcantonio, Edward R; Schmitt, Eva M; Jones, Richard N; Inouye, Sharon K
2016-01-01
Intracranial volume (ICV) has been proposed as a measure of maximum lifetime brain size. Accurate ICV measures require neuroimaging which is not always feasible for epidemiologic investigations. We examined head circumference as a useful surrogate for ICV in older adults. 99 older adults underwent Magnetic Resonance Imaging (MRI). ICV was measured by Statistical Parametric Mapping 8 (SPM8) software or Functional MRI of the Brain Software Library (FSL) extraction with manual editing, typically considered the gold standard. Head circumferences were determined using standardized tape measurement. We examined estimated correlation coefficients between head circumference and the two MRI-based ICV measurements. Head circumference and ICV by SPM8 were moderately correlated (overall r = 0.73, men r = 0.67, women r = 0.63). Head circumference and ICV by FSL were also moderately correlated (overall r = 0.69, men r = 0.63, women r = 0.49). Head circumference measurement was strongly correlated with MRI-derived ICV. Our study presents a simple method to approximate ICV among older patients, which may prove useful as a surrogate for cognitive reserve in large scale epidemiologic studies of cognitive outcomes. This study also suggests the stability of head circumference correlation with ICV throughout the lifespan.
Park, Hae-Jeong; Kwon, Jun Soo; Youn, Tak; Pae, Ji Soo; Kim, Jae-Jin; Kim, Myung-Sun; Ha, Kyoo-Seob
2002-11-01
We describe a method for the statistical parametric mapping of low resolution electromagnetic tomography (LORETA) using high-density electroencephalography (EEG) and individual magnetic resonance images (MRI) to investigate the characteristics of the mismatch negativity (MMN) generators in schizophrenia. LORETA, using a realistic head model of the boundary element method derived from the individual anatomy, estimated the current density maps from the scalp topography of the 128-channel EEG. From the current density maps that covered the whole cortical gray matter (up to 20,000 points), volumetric current density images were reconstructed. Intensity normalization of the smoothed current density images was used to reduce the confounding effect of subject-specific global activity. After transforming each image into a standard stereotaxic space, we carried out statistical parametric mapping of the normalized current density images. We applied this method to the source localization of MMN in schizophrenia. The MMN generators, produced by a deviant tone of 1,200 Hz (5% of 1,600 trials) under the standard tone of 1,000 Hz, 80 dB binaural stimuli with 300 msec of inter-stimulus interval, were measured in 14 right-handed schizophrenic subjects and 14 age-, gender-, and handedness-matched controls. We found that the schizophrenic group exhibited significant current density reductions of MMN in the left superior temporal gyrus and the left inferior parietal gyrus (P < 0.0005). This study is the first voxel-by-voxel statistical mapping of current density using individual MRI and high-density EEG. Copyright 2002 Wiley-Liss, Inc.
Complex Mapping of Aerofoils--A Different Perspective
ERIC Educational Resources Information Center
Matthews, Miccal T.
2012-01-01
In this article an application of conformal mapping to aerofoil theory is studied from a geometric and calculus point of view. The problem is suitable for undergraduate teaching in terms of a project or extended piece of work, and brings together the concepts of geometric mapping, parametric equations, complex numbers and calculus. The Joukowski…
Complex Geometric Models of Diffusion and Relaxation in Healthy and Damaged White Matter
Farrell, Jonathan A.D.; Smith, Seth A.; Reich, Daniel S.; Calabresi, Peter A.; van Zijl, Peter C.M.
2010-01-01
Which aspects of tissue microstructure affect diffusion weighted MRI signals? Prior models, many of which use Monte-Carlo simulations, have focused on relatively simple models of the cellular microenvironment and have not considered important anatomic details. With the advent of higher-order analysis models for diffusion imaging, such as high-angular-resolution diffusion imaging (HARDI), more realistic models are necessary. This paper presents and evaluates the reproducibility of simulations of diffusion in complex geometries. Our framework is quantitative, does not require specialized hardware, is easily implemented with little programming experience, and is freely available as open-source software. Models may include compartments with different diffusivities, permeabilities, and T2 time constants using both parametric (e.g., spheres and cylinders) and arbitrary (e.g., mesh-based) geometries. Three-dimensional diffusion displacement-probability functions are mapped with high reproducibility, and thus can be readily used to assess reproducibility of diffusion-derived contrasts. PMID:19739233
Lee, L.; Helsel, D.
2007-01-01
Analysis of low concentrations of trace contaminants in environmental media often results in left-censored data that are below some limit of analytical precision. Interpretation of values becomes complicated when there are multiple detection limits in the data, perhaps as a result of changing analytical precision over time. Parametric and semi-parametric methods, such as maximum likelihood estimation and robust regression on order statistics, can be employed to model distributions of multiply censored data and provide estimates of summary statistics. However, these methods are based on assumptions about the underlying distribution of data. Nonparametric methods provide an alternative that does not require such assumptions. A standard nonparametric method for estimating summary statistics of multiply censored data is the Kaplan-Meier (K-M) method. This method has seen widespread usage in the medical sciences within a general framework termed "survival analysis", where it is employed with right-censored time-to-failure data. However, K-M methods are equally valid for the left-censored data common in the geosciences. Our S-language software provides an analytical framework based on K-M methods that is tailored to the needs of the earth and environmental sciences community. This includes routines for the generation of empirical cumulative distribution functions, prediction or exceedance probabilities, and related confidence limits computation. Additionally, our software contains K-M-based routines for nonparametric hypothesis testing among an unlimited number of grouping variables. A primary characteristic of K-M methods is that they do not perform extrapolation or interpolation. Thus, these routines cannot be used to model statistics beyond the observed data range or when linear interpolation is desired. For such applications, the aforementioned parametric and semi-parametric methods must be used.
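The K-M approach described above can be applied to left-censored nondetects through the standard "flipping" trick: subtracting every value from a constant larger than all observations turns left-censoring into right-censoring. A hedged Python sketch of this idea (not the authors' S-language routines; the data are made up):

```python
from itertools import groupby

def kaplan_meier(times, events):
    """Standard right-censored Kaplan-Meier estimator.
    events[i] is True for an observed value, False for a censored one.
    Returns (time, survival) pairs at each observed-event time."""
    data = sorted(zip(times, events))
    at_risk, s, out = len(data), 1.0, []
    for t, grp in groupby(data, key=lambda p: p[0]):
        grp = list(grp)
        d = sum(1 for _, observed in grp if observed)
        if d:
            s *= 1.0 - d / at_risk
            out.append((t, s))
        at_risk -= len(grp)
    return out

def km_left_censored(values, detected):
    """K-M for left-censored (nondetect) data via the flipping trick:
    y = M - x turns 'x below detection limit' into right-censoring.
    Returns (x, estimated P(X < x)) pairs in the original scale."""
    M = max(values) + 1.0            # any constant above all values works
    surv = kaplan_meier([M - v for v in values], detected)
    # survival of the flipped data at t estimates P(X < M - t)
    return [(M - t, s) for t, s in reversed(surv)]

# three detects (1, 2, 3) and two nondetects (<0.5, <0.8)
cdf = km_left_censored([0.5, 1.0, 2.0, 0.8, 3.0],
                       [False, True, True, False, True])
print([(x, round(p, 3)) for x, p in cdf])  # [(1.0, 0.4), (2.0, 0.6), (3.0, 0.8)]
```

Because the nondetects here fall below the smallest detect, the estimate coincides with the empirical CDF; with interleaved detection limits the K-M weighting departs from simple counting.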
ERIC Educational Resources Information Center
Lozano-Parada, Jaime H.; Burnham, Helen; Martinez, Fiderman Machuca
2018-01-01
A classical nonlinear system, the "Brusselator", was used to illustrate the modeling and simulation of oscillating chemical systems using stability analysis techniques with modern software tools such as Comsol Multiphysics, Matlab, and Excel. A systematic approach is proposed in order to establish a regime of parametric conditions that…
Kero, Tanja; Lindsjö, Lars; Sörensen, Jens; Lubberink, Mark
2016-08-01
(11)C-PIB PET is a promising non-invasive diagnostic tool for cardiac amyloidosis. Semiautomatic analysis of PET data is now available but it is not known how accurate these methods are for amyloid imaging. The aim of this study was to evaluate the feasibility of one semiautomatic software tool for analysis and visualization of (11)C-PIB left ventricular retention index (RI) in cardiac amyloidosis. Patients with systemic amyloidosis and cardiac involvement (n = 10) and healthy controls (n = 5) were investigated with dynamic (11)C-PIB PET. Two observers analyzed the PET studies with semiautomatic software to calculate the left ventricular RI of (11)C-PIB and to create parametric images. The mean RI at 15-25 min from the semiautomatic analysis was compared with RI based on manual analysis and showed comparable values (0.056 vs 0.054 min(-1) for amyloidosis patients and 0.024 vs 0.025 min(-1) in healthy controls; P = .78) and the correlation was excellent (r = 0.98). Inter-reader reproducibility also was excellent (intraclass correlation coefficient, ICC > 0.98). Parametric polarmaps and histograms made visual separation of amyloidosis patients and healthy controls fast and simple. Accurate semiautomatic analysis of cardiac (11)C-PIB RI in amyloidosis patients is feasible. Parametric polarmaps and histograms make visual interpretation fast and simple.
Research on AutoCAD secondary development and function expansion based on VBA technology
NASA Astrophysics Data System (ADS)
Zhang, Runmei; Gu, Yehuan
2017-06-01
AutoCAD is the most widely used drawing tool among comparable design drafting products. Producing different types of design drawings for the same product involves a great deal of repetitive, monotonous work, and drawing them manually in AutoCAD suffers from low efficiency, a high error rate and high labor cost. To solve these problems, a parametric drawing system for hot-rolled I-beam (steel beam) cross-sections was designed using the VBA secondary development tool together with the Access database for large-capacity data storage, and the functional extension of plane drawing and parametric drawing design is analyzed in this paper. Through this secondary development of AutoCAD functions, drawing work is simplified and efficiency is greatly improved. Introducing parametric design into an AutoCAD drawing system can likewise support the industrial mass production of standardized hot-rolled products such as I-beams and the economic growth of related industries.
Developing integrated parametric planning models for budgeting and managing complex projects
NASA Technical Reports Server (NTRS)
Etnyre, Vance A.; Black, Ken U.
1988-01-01
The applicability of integrated parametric models for the budgeting and management of complex projects is investigated. Methods for building a very flexible, interactive prototype for a project planning system, and software resources available for this purpose, are discussed and evaluated. The prototype is required to be sensitive to changing objectives, changing target dates, changing cost relationships, and changing budget constraints. To achieve the integration of costs and project and task durations, parametric cost functions are defined by a process of trapezoidal segmentation, where the total cost for the project is the sum of the various project cost segments, and each project cost segment is the integral of a linearly segmented cost loading function over a specific interval. The cost can thus be expressed algebraically. The prototype was designed using Lotus-123 as the primary software tool. This prototype implements a methodology for interactive project scheduling that provides a model of a system that meets most of the goals for the first phase of the study and some of the goals for the second phase.
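The trapezoidal segmentation above has a simple closed form: if the cost-loading rate varies linearly over an interval, the segment's cost is the area of a trapezoid, and the project cost is the sum over segments. An illustrative Python sketch (the loading profile and numbers are invented, not from the study):

```python
def segment_cost(t0, t1, r0, r1):
    """Integral of a linearly varying cost-loading rate over [t0, t1]:
    the area of a trapezoid with parallel sides r0 and r1."""
    return (t1 - t0) * (r0 + r1) / 2.0

def project_cost(segments):
    """Total project cost as the sum of trapezoidal segment integrals.
    Each segment is (t0, t1, r0, r1), rates in cost units per unit time."""
    return sum(segment_cost(*s) for s in segments)

# ramp-up, plateau, ramp-down loading profile (illustrative numbers)
segments = [(0, 2, 0, 10), (2, 6, 10, 10), (6, 8, 10, 0)]
print(project_cost(segments))  # 10 + 40 + 10 = 60.0
```

This is what makes the cost expressible algebraically: each segment contributes a polynomial in its endpoints and rates.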
The binned bispectrum estimator: template-based and non-parametric CMB non-Gaussianity searches
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bucher, Martin; Racine, Benjamin; Tent, Bartjan van, E-mail: bucher@apc.univ-paris7.fr, E-mail: benjar@uio.no, E-mail: vantent@th.u-psud.fr
2016-05-01
We describe the details of the binned bispectrum estimator as used for the official 2013 and 2015 analyses of the temperature and polarization CMB maps from the ESA Planck satellite. The defining aspect of this estimator is the determination of a map bispectrum (3-point correlation function) that has been binned in harmonic space. For a parametric determination of the non-Gaussianity in the map (the so-called f_NL parameters), one takes the inner product of this binned bispectrum with theoretically motivated templates. However, as a complementary approach one can also smooth the binned bispectrum using a variable smoothing scale in order to suppress noise and make coherent features stand out above the noise. This allows one to look in a model-independent way for any statistically significant bispectral signal. This approach is useful for characterizing the bispectral shape of the galactic foreground emission, for which a theoretical prediction of the bispectral anisotropy is lacking, and for detecting a serendipitous primordial signal, for which a theoretical template has not yet been put forth. Both the template-based and the non-parametric approaches are described in this paper.
UQTools: The Uncertainty Quantification Toolbox - Introduction and Tutorial
NASA Technical Reports Server (NTRS)
Kenny, Sean P.; Crespo, Luis G.; Giesy, Daniel P.
2012-01-01
UQTools is the short name for the Uncertainty Quantification Toolbox, a software package designed to efficiently quantify the impact of parametric uncertainty on engineering systems. UQTools is a MATLAB-based software package and was designed to be discipline independent, employing very generic representations of the system models and uncertainty. Specifically, UQTools accepts linear and nonlinear system models and permits arbitrary functional dependencies between the system's measures of interest and the probabilistic or non-probabilistic parametric uncertainty. One of the most significant features incorporated into UQTools is the theoretical development centered on homothetic deformations and their application to set bounding and approximating failure probabilities. Beyond the set bounding technique, UQTools provides a wide range of probabilistic and uncertainty-based tools to solve key problems in science and engineering.
Multiscale reconstruction for MR fingerprinting.
Pierre, Eric Y; Ma, Dan; Chen, Yong; Badve, Chaitra; Griswold, Mark A
2016-06-01
To reduce the acquisition time needed to obtain reliable parametric maps with Magnetic Resonance Fingerprinting. An iterative-denoising algorithm is initialized by reconstructing the MRF image series at low image resolution. For subsequent iterations, the method enforces pixel-wise fidelity to the best-matching dictionary template then enforces fidelity to the acquired data at slightly higher spatial resolution. After convergence, parametric maps with desirable spatial resolution are obtained through template matching of the final image series. The proposed method was evaluated on phantom and in vivo data using the highly undersampled, variable-density spiral trajectory and compared with the original MRF method. The benefits of additional sparsity constraints were also evaluated. When available, gold standard parameter maps were used to quantify the performance of each method. The proposed approach allowed convergence to accurate parametric maps with as few as 300 time points of acquisition, as compared to 1000 in the original MRF work. Simultaneous quantification of T1, T2, proton density (PD), and B0 field variations in the brain was achieved in vivo for a 256 × 256 matrix for a total acquisition time of 10.2 s, representing a three-fold reduction in acquisition time. The proposed iterative multiscale reconstruction reliably increases MRF acquisition speed and accuracy. Magn Reson Med 75:2481-2492, 2016. © 2015 Wiley Periodicals, Inc.
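The template-matching step that MRF methods share, including the multiscale reconstruction above, amounts to finding the dictionary entry whose normalized inner product with the measured signal evolution is largest. A toy Python sketch (the four-point "fingerprints" and (T1, T2) keys are invented for illustration; real dictionaries hold thousands of long simulated evolutions):

```python
def match_fingerprint(signal, dictionary):
    """Return the (T1, T2) key of the dictionary template with the
    largest normalized inner product with the measured signal."""
    def norm(v):
        return sum(x * x for x in v) ** 0.5
    ns = norm(signal)
    best, best_score = None, -1.0
    for params, template in dictionary.items():
        score = abs(sum(a * b for a, b in zip(signal, template)))
        score /= ns * norm(template)
        if score > best_score:
            best, best_score = params, score
    return best

# hypothetical 4-point fingerprints keyed by (T1, T2) in ms
dictionary = {
    (800, 60):  [1.0, 0.5, 0.3, 0.2],
    (1200, 90): [1.0, 0.7, 0.5, 0.4],
}
noisy = [0.98, 0.72, 0.49, 0.41]        # noisy copy of the second entry
print(match_fingerprint(noisy, dictionary))  # (1200, 90)
```

Because the score is normalized, proton density can be recovered afterwards from the ratio of the measured and matched signal magnitudes.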
Newitt, David C; Malyarenko, Dariya; Chenevert, Thomas L; Quarles, C Chad; Bell, Laura; Fedorov, Andriy; Fennessy, Fiona; Jacobs, Michael A; Solaiyappan, Meiyappan; Hectors, Stefanie; Taouli, Bachir; Muzi, Mark; Kinahan, Paul E; Schmainda, Kathleen M; Prah, Melissa A; Taber, Erin N; Kroenke, Christopher; Huang, Wei; Arlinghaus, Lori R; Yankeelov, Thomas E; Cao, Yue; Aryal, Madhava; Yen, Yi-Fen; Kalpathy-Cramer, Jayashree; Shukla-Dave, Amita; Fung, Maggie; Liang, Jiachao; Boss, Michael; Hylton, Nola
2018-01-01
Diffusion weighted MRI has become ubiquitous in many areas of medicine, including cancer diagnosis and treatment response monitoring. Reproducibility of diffusion metrics is essential for their acceptance as quantitative biomarkers in these areas. We examined the variability in the apparent diffusion coefficient (ADC) obtained from both postprocessing software implementations utilized by the NCI Quantitative Imaging Network and online scan time-generated ADC maps. Phantom and in vivo breast studies were evaluated for two ([Formula: see text]) and four ([Formula: see text]) [Formula: see text]-value diffusion metrics. Concordance of the majority of implementations was excellent for both phantom ADC measures and in vivo [Formula: see text], with relative biases [Formula: see text] ([Formula: see text]) and [Formula: see text] (phantom [Formula: see text]) but with higher deviations in ADC at the lowest phantom ADC values. In vivo [Formula: see text] concordance was good, with typical biases of [Formula: see text] to 3% but higher for online maps. Multiple b-value ADC implementations were separated into two groups determined by the fitting algorithm. Intergroup mean ADC differences ranged from negligible for phantom data to 2.8% for [Formula: see text] in vivo data. Some higher deviations were found for individual implementations and online parametric maps. Despite generally good concordance, implementation biases in ADC measures are sometimes significant and may be large enough to be of concern in multisite studies.
ERIC Educational Resources Information Center
Cui, Zhongmin; Kolen, Michael J.
2008-01-01
This article considers two methods of estimating standard errors of equipercentile equating: the parametric bootstrap method and the nonparametric bootstrap method. Using a simulation study, these two methods are compared under three sample sizes (300, 1,000, and 3,000), for two test content areas (the Iowa Tests of Basic Skills Maps and Diagrams…
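The two bootstrap variants compared in the article differ only in how replicate samples are drawn: the nonparametric bootstrap resamples the observed data with replacement, while the parametric bootstrap draws from a fitted model. A hedged Python sketch for the standard error of a mean (a normal model and toy data are assumed; this is not the equipercentile-equating code):

```python
import random
import statistics

def nonparametric_boot_se(sample, stat, reps=2000, seed=1):
    """Nonparametric bootstrap SE: resample the data with replacement."""
    rng = random.Random(seed)
    n = len(sample)
    vals = [stat([rng.choice(sample) for _ in range(n)]) for _ in range(reps)]
    return statistics.stdev(vals)

def parametric_boot_se(sample, stat, reps=2000, seed=1):
    """Parametric bootstrap SE: resample from a normal fitted to the data
    (the normal model is an assumption of this sketch)."""
    rng = random.Random(seed)
    mu, sigma = statistics.mean(sample), statistics.stdev(sample)
    n = len(sample)
    vals = [stat([rng.gauss(mu, sigma) for _ in range(n)]) for _ in range(reps)]
    return statistics.stdev(vals)

scores = [12.1, 9.8, 11.4, 10.3, 10.9, 9.5, 11.0, 10.6]
# both should land near the analytic SE of the mean, s/sqrt(n) ≈ 0.30
print(round(nonparametric_boot_se(scores, statistics.mean), 2))
print(round(parametric_boot_se(scores, statistics.mean), 2))
```

For equating, `stat` would be the equipercentile-equated score at a given raw score rather than a mean, but the resampling logic is unchanged.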
NASA Astrophysics Data System (ADS)
Yu, Haiqing; Chen, Shuhang; Chen, Yunmei; Liu, Huafeng
2017-05-01
Dynamic positron emission tomography (PET) is capable of providing both spatial and temporal information of radio tracers in vivo. In this paper, we present a novel joint estimation framework to reconstruct temporal sequences of dynamic PET images and the coefficients characterizing the system impulse response function, from which the associated parametric images of the system macro parameters for tracer kinetics can be estimated. The proposed algorithm, which combines statistical data measurement and tracer kinetic models, integrates dictionary sparse coding (DSC) into a total variation minimization based algorithm for simultaneous reconstruction of the activity distribution and parametric map from measured emission sinograms. DSC, based on compartmental theory, provides biologically meaningful regularization, and total variation regularization is incorporated to provide edge-preserving guidance. We rely on techniques from minimization algorithms (the alternating direction method of multipliers) to first generate the estimated activity distributions with sub-optimal kinetic parameter estimates, and then recover the parametric maps given these activity estimates. These coupled iterative steps are repeated as necessary until convergence. Experiments with synthetic, Monte Carlo generated data, and real patient data have been conducted, and the results are very promising.
Butterfly valve in a virtual environment
NASA Astrophysics Data System (ADS)
Talekar, Aniruddha; Patil, Saurabh; Thakre, Prashant; Rajkumar, E.
2017-11-01
Assembly of components is one of the processes involved in product design and development. The present paper deals with the assembly of the components of a simple butterfly valve in a virtual environment. The assembly has been carried out using virtual reality software by trial and error. The parts are modelled using parametric software (SolidWorks), meshed accordingly, and then brought into the virtual environment for assembly.
ERIC Educational Resources Information Center
Schochet, Peter Z.
2015-01-01
This report presents the statistical theory underlying the "RCT-YES" software that estimates and reports impacts for RCTs for a wide range of designs used in social policy research. The report discusses a unified, non-parametric design-based approach for impact estimation using the building blocks of the Neyman-Rubin-Holland causal…
1984-09-28
Model variables are expressed as probability density distributions before simulation of the model, and reality checks are sought. A prior probability that the software contains errors is updated as test failure data are accumulated. The bootstrap is discussed in both parametric and nonparametric versions; it is shown by the author that the bootstrap underlies the jackknife method.
2011-01-01
Background The identification of genes or quantitative trait loci that are expressed in response to different environmental factors such as temperature and light, through functional mapping, critically relies on precise modeling of the covariance structure. Previous work used separable parametric covariance structures, such as a Kronecker product of autoregressive one [AR(1)] matrices, that do not account for interaction effects of different environmental factors. Results We implement a more robust nonparametric covariance estimator to model these interactions within the framework of functional mapping of reaction norms to two signals. Our results from Monte Carlo simulations show that this estimator can be useful in modeling interactions that exist between two environmental signals. The interactions are simulated using nonseparable covariance models with spatio-temporal structural forms that mimic interaction effects. Conclusions The nonparametric covariance estimator has an advantage over separable parametric covariance estimators in the detection of QTL location, thus extending the breadth of use of functional mapping in practical settings. PMID:21269481
BROCCOLI: Software for fast fMRI analysis on many-core CPUs and GPUs
Eklund, Anders; Dufort, Paul; Villani, Mattias; LaConte, Stephen
2014-01-01
Analysis of functional magnetic resonance imaging (fMRI) data is becoming ever more computationally demanding as temporal and spatial resolutions improve, and large, publicly available data sets proliferate. Moreover, methodological improvements in the neuroimaging pipeline, such as non-linear spatial normalization, non-parametric permutation tests and Bayesian Markov Chain Monte Carlo approaches, can dramatically increase the computational burden. Despite these challenges, there do not yet exist any fMRI software packages which leverage inexpensive and powerful graphics processing units (GPUs) to perform these analyses. Here, we therefore present BROCCOLI, a free software package written in OpenCL (Open Computing Language) that can be used for parallel analysis of fMRI data on a large variety of hardware configurations. BROCCOLI has, for example, been tested with an Intel CPU, an Nvidia GPU, and an AMD GPU. These tests show that parallel processing of fMRI data can lead to significantly faster analysis pipelines. This speedup can be achieved on relatively standard hardware, but further, dramatic speed improvements require only a modest investment in GPU hardware. BROCCOLI (running on a GPU) can perform non-linear spatial normalization to a 1 mm³ brain template in 4–6 s, and run a second level permutation test with 10,000 permutations in about a minute. These non-parametric tests are generally more robust than their parametric counterparts, and can also enable more sophisticated analyses by estimating complicated null distributions. Additionally, BROCCOLI includes support for Bayesian first-level fMRI analysis using a Gibbs sampler. The new software is freely available under GNU GPL3 and can be downloaded from github (https://github.com/wanderine/BROCCOLI/). PMID:24672471
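The permutation tests that BROCCOLI accelerates on GPUs follow the usual recipe: recompute the test statistic under random relabelings and compare with the observed value. A serial Python sketch of a two-sample version (toy data; BROCCOLI's own OpenCL implementation is far more elaborate and voxel-wise):

```python
import random
import statistics

def permutation_test(a, b, n_perm=10000, seed=0):
    """Two-sample permutation test on the absolute difference of means.
    Returns a Monte-Carlo p-value with an add-one correction."""
    rng = random.Random(seed)
    observed = abs(statistics.mean(a) - statistics.mean(b))
    pooled = list(a) + list(b)
    n_a = len(a)
    exceed = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)                     # random relabeling
        diff = abs(statistics.mean(pooled[:n_a]) -
                   statistics.mean(pooled[n_a:]))
        if diff >= observed:
            exceed += 1
    return (exceed + 1) / (n_perm + 1)

group1 = [2.1, 2.5, 2.8, 3.0, 2.7]
group2 = [1.2, 1.5, 1.1, 1.8, 1.4]
p = permutation_test(group1, group2, n_perm=2000)
print(p < 0.05)  # True: the groups are clearly separated
```

Because the null distribution is built empirically, no normality assumption is needed, which is exactly why these tests parallelize so well but cost so much serially.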
Deep space network software cost estimation model
NASA Technical Reports Server (NTRS)
Tausworthe, R. C.
1981-01-01
A parametric software cost estimation model prepared for Deep Space Network (DSN) Data Systems implementation tasks is presented. The resource estimation model incorporates principles and data from a number of existing models. The model calibrates task magnitude and difficulty, development environment, and software technology effects through prompted responses to a set of approximately 50 questions. Parameters in the model are adjusted to fit DSN software life cycle statistics. The estimation model output scales a standard DSN Work Breakdown Structure skeleton, which is then input into a PERT/CPM system, producing a detailed schedule and resource budget for the project being planned.
Zhou, Yu; Ren, Jie
2011-04-01
We put forward a new concept of a software oversampling mapping system for electrocardiography (ECG) to assist research on the ECG inverse problem by improving the generality of the mapping system and the quality of the mapped signals. We then developed a conceptual system based on a traditional ECG detection circuit, LabVIEW, and a DAQ card produced by National Instruments, integrating the newly developed oversampling method into the system. The results indicated that the system could map ECG signals accurately and that the signal quality was good. The improvement of the hardware and enhancement of the software make the system suitable for mapping in different situations. The initial development of the software for the oversampling mapping system was thus successful, and further research and development can make the system a powerful tool for studying the ECG inverse problem.
Parametric model of the scala tympani for haptic-rendered cochlear implantation.
Todd, Catherine; Naghdy, Fazel
2005-01-01
A parametric model of the human scala tympani has been designed for use in a haptic-rendered computer simulation of cochlear implant surgery. It will be the first surgical simulator of this kind. A geometric model of the scala tympani has been derived from measured data for this purpose. The model is compared with two existing descriptions of the cochlear spiral. A first approximation of the basilar membrane is also produced. The structures are imported into a force-rendering software application for system development.
Convergence optimization of parametric MLEM reconstruction for estimation of Patlak plot parameters.
Angelis, Georgios I; Thielemans, Kris; Tziortzi, Andri C; Turkheimer, Federico E; Tsoumpas, Charalampos
2011-07-01
In dynamic positron emission tomography data many researchers have attempted to exploit kinetic models within reconstruction such that parametric images are estimated directly from measurements. This work studies a direct parametric maximum likelihood expectation maximization algorithm applied to [(18)F]DOPA data using a reference-tissue input function. We use a modified version for direct reconstruction with a gradually descending scheme of subsets (i.e. 18-6-1) initialized with the FBP parametric image for faster convergence and higher accuracy. The results compared with analytic reconstructions show quantitative robustness (i.e. minimal bias) and clinical reproducibility within six human acquisitions in the region of clinical interest. Bland-Altman plots for all the studies showed sufficient quantitative agreement between the direct reconstructed parametric maps and the indirect FBP (-0.035x + 0.48E-5). Copyright © 2011 Elsevier Ltd. All rights reserved.
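For reference, the Patlak plot parameters targeted by the direct reconstruction above come from a linear fit of tissue-to-input ratios against the normalized integrated input function. A self-contained Python sketch of the indirect (post-reconstruction) estimate, with synthetic curves chosen so the true slope and intercept are known:

```python
def patlak_fit(t, ct, cp, t_star=0.0):
    """Patlak graphical analysis: ordinary least squares of y = Ct/Cp
    against x = (cumulative integral of Cp)/Cp for times t >= t_star.
    Returns (slope Ki, intercept V)."""
    integral, cum = [0.0], 0.0
    for i in range(1, len(t)):
        cum += 0.5 * (cp[i] + cp[i - 1]) * (t[i] - t[i - 1])  # trapezoid rule
        integral.append(cum)
    xs, ys = [], []
    for ti, ci, pi, ii in zip(t, ct, cp, integral):
        if ti >= t_star and pi > 0:
            xs.append(ii / pi)
            ys.append(ci / pi)
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

# synthetic data built to satisfy Ct = Ki*integral(Cp) + V*Cp exactly
t = [0.0, 1.0, 2.0, 3.0, 4.0, 5.0]
cp = [1.0] * 6                          # constant input function
ct = [0.05 * ti + 0.2 for ti in t]      # true Ki = 0.05, V = 0.2
ki, v = patlak_fit(t, ct, cp)
print(round(ki, 3), round(v, 3))  # 0.05 0.2
```

Direct methods such as the one in the abstract fold this regression into the reconstruction itself, trading the simplicity above for lower-bias parametric maps.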
Integrating diffusion maps with umbrella sampling: Application to alanine dipeptide
NASA Astrophysics Data System (ADS)
Ferguson, Andrew L.; Panagiotopoulos, Athanassios Z.; Debenedetti, Pablo G.; Kevrekidis, Ioannis G.
2011-04-01
Nonlinear dimensionality reduction techniques can be applied to molecular simulation trajectories to systematically extract a small number of variables with which to parametrize the important dynamical motions of the system. For molecular systems exhibiting free energy barriers exceeding a few kBT, inadequate sampling of the barrier regions between stable or metastable basins can lead to a poor global characterization of the free energy landscape. We present an adaptation of a nonlinear dimensionality reduction technique known as the diffusion map that extends its applicability to biased umbrella sampling simulation trajectories in which restraining potentials are employed to drive the system into high free energy regions and improve sampling of phase space. We then propose a bootstrapped approach to iteratively discover good low-dimensional parametrizations by interleaving successive rounds of umbrella sampling and diffusion mapping, and we illustrate the technique through a study of alanine dipeptide in explicit solvent.
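The core of the diffusion map is a Gaussian kernel on pairwise distances, normalized into a Markov matrix whose leading nontrivial eigenvector supplies the first collective coordinate. A small pure-Python sketch using power iteration (the point set and kernel bandwidth are illustrative; production codes use full eigensolvers and, as above, reweighting for biased sampling):

```python
import math

def first_diffusion_coordinate(points, eps=1.0, iters=500):
    """First nontrivial diffusion-map coordinate for 1-D points:
    Gaussian kernel -> symmetric normalization -> power iteration
    with deflation of the trivial leading eigenvector."""
    n = len(points)
    w = [[math.exp(-(points[i] - points[j]) ** 2 / eps) for j in range(n)]
         for i in range(n)]
    deg = [sum(row) for row in w]
    # A = D^{-1/2} W D^{-1/2}: symmetric, same spectrum as D^{-1} W
    a = [[w[i][j] / math.sqrt(deg[i] * deg[j]) for j in range(n)]
         for i in range(n)]
    # leading eigenvector of A (eigenvalue 1) is proportional to sqrt(deg)
    v1 = [math.sqrt(d) for d in deg]
    s = math.sqrt(sum(x * x for x in v1))
    v1 = [x / s for x in v1]
    x = [float((-1) ** i) for i in range(n)]      # arbitrary start vector
    for _ in range(iters):
        y = [sum(a[i][j] * x[j] for j in range(n)) for i in range(n)]
        dot = sum(yi * vi for yi, vi in zip(y, v1))
        y = [yi - dot * vi for yi, vi in zip(y, v1)]  # deflate v1
        nrm = math.sqrt(sum(yi * yi for yi in y)) or 1.0
        x = [yi / nrm for yi in y]
    # convert back to the random-walk (diffusion map) eigenvector
    return [xi / math.sqrt(d) for xi, d in zip(x, deg)]

pts = [0.0, 0.1, 0.2, 5.0, 5.1, 5.2]        # two well-separated clusters
coord = first_diffusion_coordinate(pts)
# the coordinate separates the clusters by sign
print([c > 0 for c in coord][:3] != [c > 0 for c in coord][3:])  # True
```

With well-separated clusters the first coordinate plays the role of a reaction coordinate between basins, which is what the umbrella-sampling extension exploits.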
NIRS-SPM: statistical parametric mapping for near infrared spectroscopy
NASA Astrophysics Data System (ADS)
Tak, Sungho; Jang, Kwang Eun; Jung, Jinwook; Jang, Jaeduck; Jeong, Yong; Ye, Jong Chul
2008-02-01
Even though a powerful statistical parametric mapping (SPM) tool exists for fMRI, similar public domain tools are not available for near infrared spectroscopy (NIRS). In this paper, we describe a new public domain statistical toolbox called NIRS-SPM for quantitative analysis of NIRS signals. Specifically, NIRS-SPM statistically analyzes the NIRS data using the GLM and makes inference as the excursion probability of random fields interpolated from the sparse measurements. In order to obtain correct inference, NIRS-SPM offers pre-coloring and pre-whitening methods for temporal correlation estimation. For simultaneous recording of the NIRS signal with fMRI, the spatial mapping between the fMRI image and real coordinates from a 3-D digitizer is estimated using Horn's algorithm. These powerful tools allow super-resolution localization of brain activation, which is not possible using conventional NIRS analysis tools.
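The GLM fit at the heart of such analyses reduces to ordinary least squares plus a t-statistic per channel. A bare-bones sketch follows; NIRS-SPM's pre-coloring/pre-whitening and random-field inference are omitted, and the boxcar design and effect size are invented:

```python
import numpy as np

def glm_tstat(y, X, c):
    """OLS GLM fit and t-statistic for contrast vector c."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    dof = X.shape[0] - np.linalg.matrix_rank(X)
    resid = y - X @ beta
    sigma2 = (resid @ resid) / dof                   # residual variance
    var_c = sigma2 * c @ np.linalg.pinv(X.T @ X) @ c
    return (c @ beta) / np.sqrt(var_c)

# toy block design: alternating rest/task regressor plus a constant
rng = np.random.default_rng(1)
n = 100
task = np.tile([0.0] * 10 + [1.0] * 10, 5)
X = np.column_stack([task, np.ones(n)])
y = 2.0 * task + rng.normal(0, 1, n)                 # true effect size 2
t = glm_tstat(y, X, np.array([1.0, 0.0]))
```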
Head Circumference as a Useful Surrogate for Intracranial Volume in Older Adults
Hshieh, Tammy T.; Fox, Meaghan L.; Kosar, Cyrus M.; Cavallari, Michele; Guttmann, Charles R.G.; Alsop, David; Marcantonio, Edward R.; Schmitt, Eva M.; Jones, Richard N.; Inouye, Sharon K.
2015-01-01
Background Intracranial volume (ICV) has been proposed as a measure of maximum lifetime brain size. Accurate ICV measures require neuroimaging, which is not always feasible for epidemiologic investigations. We examined head circumference as a surrogate for intracranial volume in older adults. Methods Ninety-nine older adults underwent Magnetic Resonance Imaging (MRI). ICV was measured by Statistical Parametric Mapping 8 (SPM8) software or Functional MRI of the Brain Software Library (FSL) extraction with manual editing, typically considered the gold standard. Head circumferences were determined using standardized tape measurement. We examined estimated correlation coefficients between head circumference and the two MRI-based ICV measurements. Results Head circumference and ICV by SPM8 were moderately correlated (overall r=0.73, men r=0.67, women r=0.63). Head circumference and ICV by FSL were also moderately correlated (overall r=0.69, men r=0.63, women r=0.49). Conclusions Head circumference measurement was moderately correlated with MRI-derived ICV. Our study presents a simple method to approximate ICV among older patients, which may prove useful as a surrogate for cognitive reserve in large-scale epidemiologic studies of cognitive outcomes. This study also suggests the stability of the head circumference correlation with ICV throughout the lifespan. PMID:26631180
Novel fully integrated computer system for custom footwear: from 3D digitization to manufacturing
NASA Astrophysics Data System (ADS)
Houle, Pascal-Simon; Beaulieu, Eric; Liu, Zhaoheng
1998-03-01
This paper presents a recently developed custom footwear system, which integrates 3D digitization technology, range image fusion techniques, a 3D graphical environment for corrective actions, parametric curved surface representation and computer numerical control (CNC) machining. In this system, a support designed with the help of biomechanics experts can stabilize the foot in a correct and neutral position. The foot surface is then captured by a 3D camera using active ranging techniques. Software using a library of documented foot pathologies suggests corrective actions on the orthosis. Three kinds of deformations can be achieved. The first method uses pad surfaces previously scanned by our 3D scanner, which can easily be mapped onto the foot surface to locally modify the surface shape. The second kind of deformation is the construction of B-Spline surfaces by manipulating control points and modifying knot vectors in a 3D graphical environment to build the desired deformation. The last one is a manual electronic 3D pen, which may be of different shapes and sizes, and has adjustable 'pressure' information. All applied deformations should respect G1 surface continuity, which ensures that the surface can accommodate a foot. Once the surface modification process is completed, the resulting data are sent to manufacturing software for CNC machining.
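The control-point manipulation described above can be illustrated in one dimension with SciPy (assuming SciPy is available; the knot vector and values are invented): moving a single control point deforms the profile smoothly while leaving the clamped endpoints fixed.

```python
import numpy as np
from scipy.interpolate import BSpline

k = 3                                                    # cubic B-spline
t = np.concatenate([[0] * (k + 1), [1/3, 2/3], [1] * (k + 1)])  # clamped knots
c = np.zeros(6)                                          # flat initial profile
spl = BSpline(t, c, k)

c_mod = c.copy()
c_mod[3] = 1.0                                           # raise one control point
spl_mod = BSpline(t, c_mod, k)

x = np.linspace(0, 1, 101)
bump = spl_mod(x) - spl(x)       # smooth deformation, zero at the clamped ends
```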
Goto, Masami; Abe, Osamu; Aoki, Shigeki; Hayashi, Naoto; Miyati, Tosiaki; Takao, Hidemasa; Iwatsubo, Takeshi; Yamashita, Fumio; Matsuda, Hiroshi; Mori, Harushi; Kunimatsu, Akira; Ino, Kenji; Yano, Keiichi; Ohtomo, Kuni
2013-07-01
This study aimed to investigate whether the effect of scanner for cortex volumetry with atlas-based method is reduced using Diffeomorphic Anatomical Registration Through Exponentiated Lie Algebra (DARTEL) normalization compared with standard normalization. Three-dimensional T1-weighted magnetic resonance images (3D-T1WIs) of 21 healthy subjects were obtained and evaluated for effect of scanner in cortex volumetry. 3D-T1WIs of the 21 subjects were obtained with five MRI systems. Imaging of each subject was performed on each of five different MRI scanners. We used the Voxel-Based Morphometry 8 tool implemented in Statistical Parametric Mapping 8 and WFU PickAtlas software (Talairach brain atlas theory). The following software default settings were used as bilateral region-of-interest labels: "Frontal Lobe," "Hippocampus," "Occipital Lobe," "Orbital Gyrus," "Parietal Lobe," "Putamen," and "Temporal Lobe." Effect of scanner for cortex volumetry using the atlas-based method was reduced with DARTEL normalization compared with standard normalization in Frontal Lobe, Occipital Lobe, Orbital Gyrus, Putamen, and Temporal Lobe; was the same in Hippocampus and Parietal Lobe; and showed no increase with DARTEL normalization for any region of interest (ROI). DARTEL normalization reduces the effect of scanner, which is a major problem in multicenter studies.
Castro, Marcelo P; Pataky, Todd C; Sole, Gisela; Vilas-Boas, Joao Paulo
2015-07-16
Ground reaction force (GRF) data from men and women are commonly pooled for analyses. However, it may not be justifiable to pool sexes on the basis of discrete parameters extracted from continuous GRF gait waveforms, because this can miss continuous effects. Forty healthy participants (20 men and 20 women) walked at a cadence of 100 steps per minute across two force plates, recording GRFs. Two statistical methods were used to test the null hypothesis of no mean GRF differences between sexes: (i) Statistical Parametric Mapping, using the entire three-component GRF waveform; and (ii) the traditional approach, using the first and second vertical GRF peaks. Statistical Parametric Mapping results suggested large sex differences, which post-hoc analyses suggested were due predominantly to higher anterior-posterior and vertical GRFs in early stance in women compared to men. Statistically significant differences were observed for the first GRF peak, with similar values for the second GRF peak. These contrasting results emphasise that different parts of the waveform have different signal strengths, and thus that with the traditional approach one may choose arbitrary metrics and draw arbitrary conclusions. We suggest that researchers and clinicians consider both the entire gait waveforms and sex-specificity when analysing GRF data. Copyright © 2015 Elsevier Ltd. All rights reserved.
Eisenbrey, John R; Dave, Jaydev K; Merton, Daniel A; Palazzo, Juan P; Hall, Anne L; Forsberg, Flemming
2011-01-01
Parametric maps showing perfusion of contrast media can be useful tools for characterizing lesions in breast tissue. In this study, we show the feasibility of parametric subharmonic imaging (SHI), which allows imaging of a vascular marker (the ultrasound contrast agent) while providing near-complete tissue suppression. Digital SHI clips of 16 breast lesions from 14 women were acquired. Patients were scanned using a modified LOGIQ 9 scanner (GE Healthcare, Waukesha, WI) transmitting/receiving at 4.4/2.2 MHz. Using motion-compensated cumulative maximum intensity (CMI) sequences, parametric maps were generated for each lesion showing the time to peak (TTP), estimated perfusion (EP), and area under the time-intensity curve (AUC). Findings were grouped and compared according to biopsy results as benign lesions (n = 12, including 5 fibroadenomas and 3 cysts) and carcinomas (n = 4). For each lesion CMI, TTP, EP, and AUC parametric images were generated. No significant variations were detected with CMI (P = .80), TTP (P = .35), or AUC (P = .65). A statistically significant variation was detected for the average pixel EP (P = .002). Specifically, differences were seen between carcinoma and benign lesions (mean ± SD, 0.10 ± 0.03 versus 0.05 ± 0.02 intensity units [IU]/s; P = .0014) and between carcinoma and fibroadenoma (0.10 ± 0.03 versus 0.04 ± 0.01 IU/s; P = .0044), whereas differences between carcinomas and cysts were found to be nonsignificant. In conclusion, a parametric imaging method for characterization of breast lesions using the high contrast-to-tissue signal provided by SHI has been developed. While the preliminary sample size was limited, results show potential for breast lesion characterization based on perfusion flow parameters.
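The per-pixel parameters (TTP; EP taken here as the peak wash-in slope; AUC) can be computed from a motion-compensated time-intensity stack roughly as follows. This is a sketch with invented toy data; the paper's exact estimators may differ:

```python
import numpy as np

def perfusion_maps(frames, dt):
    """Per-pixel parametric maps from a time-intensity stack.

    frames : (T, H, W) intensity over time; dt : frame interval (s).
    Returns time-to-peak (s), estimated perfusion (steepest wash-in
    slope, IU/s) and area under the time-intensity curve (trapezoid rule).
    """
    ttp = frames.argmax(axis=0) * dt
    ep = np.diff(frames, axis=0).max(axis=0) / dt
    auc = ((frames[1:] + frames[:-1]) * 0.5 * dt).sum(axis=0)
    return ttp, ep, auc

# toy 2x2 "lesion": one fast-enhancing pixel, one slow, two empty
t = np.arange(10.0)
frames = np.zeros((10, 2, 2))
frames[:, 0, 0] = np.minimum(t / 2.0, 1.0)   # fast wash-in, peak at t = 2 s
frames[:, 1, 1] = np.minimum(t / 8.0, 1.0)   # slow wash-in, peak at t = 8 s
ttp, ep, auc = perfusion_maps(frames, dt=1.0)
```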
77 FR 15369 - Mobility Fund Phase I Auction GIS Data of Potentially Eligible Census Blocks
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-15
....fcc.gov/auctions/901/ , are the following: Downloadable shapefile Web mapping service MapBox map tiles... GIS software allows you to add this service as a layer to your session or project. 6. MapBox map tiles are cached map tiles of the data. With this open source software approach, these image tiles can be...
Bahrami, Sheyda; Shamsi, Mousa
2017-01-01
Functional magnetic resonance imaging (fMRI) is a popular method to probe the functional organization of the brain using hemodynamic responses. In this method, volume images of the entire brain are obtained with very good spatial resolution and low temporal resolution. However, they always suffer from high dimensionality in the face of classification algorithms. In this work, we combine a support vector machine (SVM) with a self-organizing map (SOM) to perform feature-based classification: the SOM is used for feature extraction and labeling of the datasets, and a linear-kernel SVM is then used for detecting the active areas. The SOM has two major advantages: (i) it reduces the dimensionality of the data sets, lowering computational complexity, and (ii) it is useful for identifying brain regions with small onset differences in hemodynamic responses. Our non-parametric model is compared with parametric and non-parametric methods. We use simulated fMRI data sets and block-design inputs, with a contrast-to-noise ratio (CNR) of 0.6 for the simulated datasets. The simulated fMRI dataset has 1-4% contrast in active areas. The accuracy of our proposed method is 93.63% and the error rate is 6.37%.
Parametric Design and Mechanical Analysis of Beams based on SINOVATION
NASA Astrophysics Data System (ADS)
Xu, Z. G.; Shen, W. D.; Yang, D. Y.; Liu, W. M.
2017-07-01
In engineering practice, engineers need to carry out complicated calculations when the loads on a beam are complex. These analysis and calculation processes take a lot of time, and the results can be unreliable. Therefore, VS2005 and ADK were used to develop software for beam design based on the 3D CAD software SINOVATION, using the C++ programming language. The software can perform mechanical analysis and parameterized design of various types of beams and output a design report in HTML format. The efficiency and reliability of beam design are thereby improved.
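The kind of closed-form check such software automates can be sketched for the simplest case, a simply supported beam under a uniform load, using standard textbook formulas (the section properties below are illustrative, not from SINOVATION):

```python
def uniform_load_beam(w, L, E, I):
    """Simply supported beam under uniformly distributed load w (N/m).

    Returns max bending moment (N*m), max shear (N) and midspan
    deflection (m) from the classic closed-form results.
    """
    M_max = w * L**2 / 8.0                      # midspan bending moment
    V_max = w * L / 2.0                         # shear at the supports
    d_max = 5.0 * w * L**4 / (384.0 * E * I)    # midspan deflection
    return M_max, V_max, d_max

# 6 m steel beam (E = 210 GPa, I = 8.36e-5 m^4, roughly an IPE 300)
# under 10 kN/m
M, V, d = uniform_load_beam(w=10e3, L=6.0, E=210e9, I=8.36e-5)
```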
Nadal-Serrano, Jose M; Nadal-Serrano, Adolfo; Lopez-Vallejo, Marisa
2017-01-01
This paper focuses on the application of rapid prototyping techniques using additive manufacturing in combination with parametric design to create low-cost, yet accurate and reliable instruments. The methodology followed makes it possible to make instruments with a degree of customization until now available only to a narrow audience, helping democratize science. The proposal discusses a holistic design-for-manufacturing approach that comprises advanced modeling techniques, open-source design strategies, and an optimization algorithm using free parametric software for both professional and educational purposes. The design and fabrication of an instrument for scattering measurement is used as a case of study to present the previous concepts.
Systematics in lensing reconstruction: dark matter rings in the sky?
NASA Astrophysics Data System (ADS)
Ponente, P. P.; Diego, J. M.
2011-11-01
Context. Non-parametric lensing methods are a useful way of reconstructing the lensing mass of a cluster without making assumptions about the way the mass is distributed in the cluster. These methods are particularly powerful in the case of galaxy clusters with a large number of constraints. The advantage of not assuming implicitly that the luminous matter follows the dark matter is particularly interesting in those cases where the cluster is in a non-relaxed dynamical state. On the other hand, non-parametric methods have several limitations that should be taken into account carefully. Aims: We explore some of these limitations and focus on their implications for the possible ring of dark matter around the galaxy cluster CL0024+17. Methods: We project three background galaxies through a mock cluster of known radial density profile and obtain a map for the arcs (θ map). We also calculate the shear field associated with the mock cluster across the whole field of view (3.3 arcmin). Combining the positions of the arcs and the two-direction shear, we perform an inversion of the lens equation using two separate methods, the biconjugate gradient and quadratic programming (QADP), to reconstruct the convergence map of the mock cluster. Results: We explore the space of solutions of the convergence map and compare the radial density profiles to the density profile of the mock cluster. When the inversion matrix algorithms are forced to find the exact solution, we encounter systematic effects resembling ring structures that clearly depart from the original convergence map. Conclusions: Overfitting lensing data with a non-parametric method can produce ring-like structures similar to the alleged one in CL0024.
A controlled experiment on the impact of software structure on maintainability
NASA Technical Reports Server (NTRS)
Rombach, Dieter H.
1987-01-01
The impact of software structure on maintainability aspects including comprehensibility, locality, modifiability, and reusability in a distributed system environment is studied in a controlled maintenance experiment involving six medium-size distributed software systems implemented in LADY (language for distributed systems) and six in an extended version of sequential PASCAL. For all maintenance aspects except reusability, the results were quantitatively given in terms of complexity metrics which could be automated. The results showed LADY to be better suited to the development of maintainable software than the extension of sequential PASCAL. The strong typing combined with high parametrization of units is suggested to improve the reusability of units in LADY.
Green, Adam E; Kraemer, David J M; Fugelsang, Jonathan A; Gray, Jeremy R; Dunbar, Kevin N
2010-01-01
Solving problems often requires seeing new connections between concepts or events that seemed unrelated at first. Innovative solutions of this kind depend on analogical reasoning, a relational reasoning process that involves mapping similarities between concepts. Brain-based evidence has implicated the frontal pole of the brain as important for analogical mapping. Separately, cognitive research has identified semantic distance as a key characteristic of the kind of analogical mapping that can support innovation (i.e., identifying similarities across greater semantic distance reveals connections that support more innovative solutions and models). However, the neural substrates of semantically distant analogical mapping are not well understood. Here, we used functional magnetic resonance imaging (fMRI) to measure brain activity during an analogical reasoning task, in which we parametrically varied the semantic distance between the items in the analogies. Semantic distance was derived quantitatively from latent semantic analysis. Across 23 participants, activity in an a priori region of interest (ROI) in left frontopolar cortex covaried parametrically with increasing semantic distance, even after removing effects of task difficulty. This ROI was centered on a functional peak that we previously associated with analogical mapping. To our knowledge, these data represent a first empirical characterization of how the brain mediates semantically distant analogical mapping.
NASA Technical Reports Server (NTRS)
Page, Lance; Shen, C. N.
1991-01-01
This paper describes skyline-based terrain matching, a new method for locating the vantage point of laser range-finding measurements on a global map previously prepared by satellite or aerial mapping. Skylines can be extracted from the range-finding measurements and modelled from the global map, and are represented in parametric, cylindrical form with azimuth angle as the independent variable. The three translational parameters of the vantage point are determined with a three-dimensional matching of these two sets of skylines.
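With skylines represented as elevation-versus-azimuth curves, the azimuthal part of the match reduces to a circular shift search, which can be sketched as below. This is a simplification: the paper solves for three translational parameters of the vantage point, which is not shown here, and the skyline curve is synthetic:

```python
import numpy as np

def best_shift(measured, modeled):
    """Azimuth shift (in samples) that best aligns two skylines given
    as elevation-vs-azimuth curves on the same angular grid."""
    errs = [np.sum((np.roll(modeled, s) - measured) ** 2)
            for s in range(len(modeled))]
    return int(np.argmin(errs))

az = np.linspace(0, 2 * np.pi, 360, endpoint=False)   # 1-degree sampling
model = np.sin(az) + 0.3 * np.sin(3 * az)             # skyline from the "map"
measured = np.roll(model, 40)                         # same skyline, rotated 40 deg
shift = best_shift(measured, model)
```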
Detection And Mapping (DAM) package. Volume 4B: Software System Manual, part 2
NASA Technical Reports Server (NTRS)
Schlosser, E. H.
1980-01-01
Computer programs, graphic devices, and an integrated set of manual procedures designed for efficient production of precisely registered and formatted maps from digital data are presented. The software can be used on any Univac 1100 series computer. The software includes pre-defined spectral limits for use in classifying and mapping surface water for LANDSAT-1, LANDSAT-2, and LANDSAT-3.
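Classification by pre-defined spectral limits amounts to a per-band range test; a minimal sketch follows, with hypothetical limits and pixel values rather than the actual LANDSAT ones:

```python
import numpy as np

def classify_water(bands, limits):
    """Label a pixel as water when every band falls inside its
    pre-defined [low, high] spectral limits.

    bands : (B, H, W) band values; limits : (B, 2) per-band limits.
    """
    low = limits[:, 0][:, None, None]
    high = limits[:, 1][:, None, None]
    return ((bands >= low) & (bands <= high)).all(axis=0)

# 4-band, 2x2 image; limits chosen so only pixel (0, 0) is water
bands = np.zeros((4, 2, 2))
bands[:, 0, 0] = [10, 8, 5, 3]     # inside all limits
bands[:, 0, 1] = [10, 8, 5, 30]    # out of range in the last band
limits = np.array([[5., 15.], [4., 12.], [2., 8.], [1., 6.]])
mask = classify_water(bands, limits)
```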
NASA Astrophysics Data System (ADS)
Chiabrando, F.; Lo Turco, M.; Rinaudo, F.
2017-08-01
The recent trends in architectural data management imply scientific and professional collaboration among the several disciplines involved in design, restoration and maintenance. It is now widely accepted that, in the near future, all the information connected to new interventions or conservation activities on historical buildings will be managed using a BIM platform. Current range- or image-based metric survey techniques (mainly Terrestrial Laser Scanning or photogrammetric platforms, today largely based on projective geometry) can generate 3D point clouds, 3D models, orthophotos and other outputs with assessed accuracy. The subsequent conversion of 3D information into parametric components, especially in a historical environment, is not easy and raises many open issues. Working with current commercial BIM software and its embedded tools and plug-ins, the paper deals with the methodology followed for the realization of two parametric 3D models (Palazzo Sarmatoris and the Smistamento RoundHouse, two historical buildings in the north-west of Italy). The paper describes the proposed workflow according to the employed plug-ins for automatic reconstruction and the solutions adopted for the well-known problems of the modeling phase, such as modeling vaults and irregular 3D surfaces. Finally, the strategy studied for mapping decay in a BIM environment and the connected results are critically discussed, together with conclusions and future perspectives.
Aeroelastic Scaling of a Joined Wing Aircraft Concept
2010-01-11
waxed and then peel ply is laid down, next the layers of fabric are laid down (outermost first) with an outer layer of light glass scrim used as the... A parametric model is developed using Phoenix Integration's Model Center software (MC). This model includes the vortex lattice software AVL that... [Figure 35: initial autonomous flight, real-time footage from the on-board gimbaled camera]
Spatial data software integration - Merging CAD/CAM/mapping with GIS and image processing
NASA Technical Reports Server (NTRS)
Logan, Thomas L.; Bryant, Nevin A.
1987-01-01
The integration of CAD/CAM/mapping with image processing using geographic information systems (GISs) as the interface is examined. Particular emphasis is given to the development of software interfaces between JPL's Video Image Communication and Retrieval (VICAR)/Image Based Information System (IBIS) raster-based GIS and the CAD/CAM/mapping system. The design and functions of the VICAR and IBIS are described. Vector data capture and editing are studied. Various software programs for interfacing between the VICAR/IBIS and CAD/CAM/mapping are presented and analyzed.
Non-Grey Radiation Modeling using Thermal Desktop/Sindaworks TFAWS06-1009
NASA Technical Reports Server (NTRS)
Anderson, Kevin R.; Paine, Chris
2006-01-01
This paper provides an overview of the non-grey radiation modeling capabilities of Cullimore and Ring's Thermal Desktop® Version 4.8 SindaWorks software. The non-grey radiation analysis theory implemented by SindaWorks and the methodology used by the software are outlined. Representative results from a parametric trade study of a radiation shield composed of a series of v-groove-shaped deployable panels are used to illustrate the capabilities of the SindaWorks non-grey radiation thermal analysis software, using emissivities with temperature and wavelength dependency modeled via a Hagen-Rubens relationship.
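The Hagen-Rubens relation mentioned above ties a metal's long-wavelength spectral emissivity to its DC resistivity, which is where the temperature dependence (through rho(T)) and wavelength dependence enter such models. A sketch with an aluminum-like resistivity (the numbers are illustrative, not from the paper's trade study):

```python
import math

def hagen_rubens_emissivity(rho, lam):
    """Normal spectral emissivity from the Hagen-Rubens relation,
    eps = 2 * sqrt(4*pi*eps0*c*rho/lam), valid at long wavelengths
    where eps << 1.  rho: DC resistivity (ohm*m), lam: wavelength (m);
    temperature dependence enters through rho(T)."""
    eps0, c = 8.854e-12, 2.998e8
    return 2.0 * math.sqrt(4.0 * math.pi * eps0 * c * rho / lam)

# aluminum-like resistivity at room temperature, 10 um wavelength
eps = hagen_rubens_emissivity(rho=2.7e-8, lam=10e-6)
```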
Simulation of parametric model towards the fixed covariate of right censored lung cancer data
NASA Astrophysics Data System (ADS)
Afiqah Muhamad Jamil, Siti; Asrul Affendi Abdullah, M.; Kek, Sie Long; Ridwan Olaniran, Oyebayo; Enera Amran, Syahila
2017-09-01
In this study, a simulation procedure was applied to measure the fixed covariate of right censored data using a parametric survival model. The scale and shape parameters were modified to differentiate the analysis of the parametric regression survival model. Statistically, biases, mean biases and coverage probabilities were used in this analysis. Different sample sizes of 50, 100, 150 and 200 were employed to distinguish the impact of the parametric regression model on right censored data. The R statistical software was utilised to develop the simulation coding for right censored data. Finally, the right censored simulation model was compared with right censored lung cancer data from Malaysia. It was found that varying the shape and scale parameters across different sample sizes helps to improve the simulation strategy for right censored data, and that the Weibull regression survival model is a suitable fit for the survival data of lung cancer patients in Malaysia.
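Generating right-censored Weibull survival data, the basic building block of such a simulation study, can be sketched as follows (parameter values are illustrative, not those used in the study, and the original work used R rather than Python):

```python
import numpy as np

def simulate_right_censored(n, shape, scale, censor_time, seed=0):
    """Draw Weibull event times and right-censor them at a fixed time.

    Returns observed times and event indicators (1 = event observed,
    0 = censored at censor_time).
    """
    rng = np.random.default_rng(seed)
    t = scale * rng.weibull(shape, size=n)       # true event times
    observed = np.minimum(t, censor_time)
    event = (t <= censor_time).astype(int)
    return observed, event

obs, event = simulate_right_censored(n=200, shape=1.5, scale=10.0,
                                     censor_time=12.0)
```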
An application of quantile random forests for predictive mapping of forest attributes
E.A. Freeman; G.G. Moisen
2015-01-01
Increasingly, random forest models are used in predictive mapping of forest attributes. Traditional random forests output the mean prediction from the random trees. Quantile regression forests (QRF) is an extension of random forests developed by Nicolai Meinshausen that provides non-parametric estimates of the median predicted value as well as prediction quantiles. It...
Suppa, Per; Anker, Ulrich; Spies, Lothar; Bopp, Irene; Rüegger-Frey, Brigitte; Klaghofer, Richard; Gocke, Carola; Hampel, Harald; Beck, Sacha; Buchert, Ralph
2015-01-01
Hippocampal volume is a promising biomarker to enhance the accuracy of the diagnosis of dementia due to Alzheimer's disease (AD). However, whereas hippocampal volume is well studied in patient samples from clinical trials, its value in clinical routine patient care is still rather unclear. The aim of the present study, therefore, was to evaluate fully automated atlas-based hippocampal volumetry for detection of AD in the setting of a secondary care expert memory clinic for outpatients. One hundred consecutive patients with memory complaints were clinically evaluated and categorized into three diagnostic groups: AD, intermediate AD, and non-AD. A software tool based on open source software (Statistical Parametric Mapping SPM8) was employed for fully automated tissue segmentation and stereotactical normalization of high-resolution three-dimensional T1-weighted magnetic resonance images. Predefined standard masks were used for computation of grey matter volume of the left and right hippocampus, which then was scaled to the patient's total grey matter volume. The right hippocampal volume provided an area under the receiver operating characteristic curve of 84% for detection of AD patients in the whole sample. This indicates that fully automated MR-based hippocampal volumetry fulfills the requirements of a feasible core biomarker for detection of AD in everyday patient care in a secondary care memory clinic for outpatients. The software used in the present study has been made freely available as an SPM8 toolbox. It is robust and fast so that it is easily integrated into routine workflow.
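The reported 84% area under the ROC curve is equivalent to the probability that a randomly chosen AD patient's marker value exceeds a randomly chosen non-AD patient's. A minimal AUC computation via the Mann-Whitney U statistic (the volumes below are invented toy numbers, not study data):

```python
import numpy as np

def roc_auc(scores_pos, scores_neg):
    """AUC as the probability that a random positive outscores a
    random negative (ties count 1/2)."""
    sp = np.asarray(scores_pos, float)[:, None]
    sn = np.asarray(scores_neg, float)[None, :]
    return (sp > sn).mean() + 0.5 * (sp == sn).mean()

# hypothetical scaled hippocampal volumes (smaller in AD);
# negate so that a larger score means "more AD-like"
ad = -np.array([2.1, 2.3, 2.0, 2.6])
non_ad = -np.array([2.9, 3.1, 2.5, 3.0])
auc = roc_auc(ad, non_ad)
```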
Ground and Airborne Methane Measurements with an Optical Parametric Amplifier
NASA Technical Reports Server (NTRS)
Numata, Kenji
2012-01-01
We report on ground and airborne atmospheric methane measurements with a differential absorption lidar using an optical parametric amplifier (OPA). Methane is a strong greenhouse gas on Earth and its accurate global mapping is urgently needed to understand climate change. We are developing a nanosecond-pulsed OPA for remote measurements of methane from an Earth-orbiting satellite. We have successfully demonstrated the detection of methane on the ground and from an airplane at approximately 11-km altitude.
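The basic differential-absorption retrieval behind such a lidar is a two-wavelength ratio; a textbook sketch with invented numbers (the instrument's actual processing is more involved):

```python
import math

def dial_number_density(p_on, p_off, delta_sigma, path_length):
    """Path-averaged molecular number density from on/off-line
    integrated-path DIAL returns:
        N = ln(P_off / P_on) / (2 * delta_sigma * L)
    with the factor 2 for the round trip.  delta_sigma is the
    differential absorption cross-section (m^2), L the path (m)."""
    return math.log(p_off / p_on) / (2.0 * delta_sigma * path_length)

# hypothetical: 5% extra on-line round-trip absorption over a 1 km path
n = dial_number_density(p_on=0.95, p_off=1.0,
                        delta_sigma=1e-24, path_length=1.0e3)
```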
MapEdit: solution to continuous raster map creation
NASA Astrophysics Data System (ADS)
Rančić, Dejan; Djordjevi-Kajan, Slobodanka
2003-03-01
The paper describes MapEdit, MS Windows™ software for georeferencing and rectification of scanned paper maps. The software produces continuous raster maps which can be used as backgrounds in geographical information systems. The process of continuous raster map creation using MapEdit's "mosaicking" function is also described, as well as the georeferencing and rectification algorithms used in MapEdit. Our approach to georeferencing and rectification, using four control points and two linear transformations for each scanned map part together with the nearest neighbor resampling method, represents a low-cost, high-speed solution that produces continuous raster maps of satisfactory quality for many purposes (±1 pixel). Quality assessment of several continuous raster maps at different scales created using our software and methodology has been undertaken, and the results are presented in the paper. For the quality control of the produced raster maps we referred to three widely adopted standards: the US Standard for Digital Cartographic Data, the National Standard for Spatial Data Accuracy and the US National Map Accuracy Standard. The results obtained during the quality assessment process show that our maps meet all three standards.
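Georeferencing from control points of the kind MapEdit performs can be sketched as a least-squares linear (affine) transform. This toy fits a single transform from four invented control points, whereas MapEdit applies two linear transformations per scanned map part:

```python
import numpy as np

def fit_affine(pixel_xy, map_xy):
    """Least-squares affine transform (6 parameters) from control
    points, so that map coords = [px, py, 1] @ coef."""
    n = len(pixel_xy)
    G = np.hstack([pixel_xy, np.ones((n, 1))])
    coef, *_ = np.linalg.lstsq(G, map_xy, rcond=None)
    return coef                          # shape (3, 2)

# four control points: scale 2 m/pixel, offset (1000, 500), y flipped
px = np.array([[0, 0], [100, 0], [0, 100], [100, 100]], float)
mp = np.column_stack([1000 + 2 * px[:, 0], 500 - 2 * px[:, 1]])
coef = fit_affine(px, mp)

# georeference a new pixel
new = np.array([[50.0, 50.0, 1.0]]) @ coef
```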
Hyperspectral Soil Mapper (HYSOMA) software interface: Review and future plans
NASA Astrophysics Data System (ADS)
Chabrillat, Sabine; Guillaso, Stephane; Eisele, Andreas; Rogass, Christian
2014-05-01
With the upcoming launch of the next generation of hyperspectral satellites that will routinely deliver high spectral resolution images for the entire globe (e.g. EnMAP, HISUI, HyspIRI, HypXIM, PRISMA), an increasing demand for the availability/accessibility of hyperspectral soil products is coming from the geoscience community. Indeed, many robust methods for the prediction of soil properties based on imaging spectroscopy already exist and have been successfully used for a wide range of airborne soil mapping applications. Nevertheless, these methods require expert know-how and fine-tuning, which means they are used only sparingly. More development is needed toward easy-to-access soil toolboxes as a major step toward the operational use of hyperspectral soil products for monitoring and modelling Earth's surface processes, allowing non-expert users to obtain new information from inexpensive software packages in which repeatability of the results is an important prerequisite. In this frame, based on the EU-FP7 EUFAR (European Facility for Airborne Research) project and the EnMAP satellite science program, higher-performing soil algorithms were developed at the GFZ German Research Center for Geosciences as demonstrators for end-to-end processing chains with harmonized quality measures. The algorithms were built into the HYSOMA (Hyperspectral SOil MApper) software interface, providing an experimental platform for soil mapping applications of hyperspectral imagery that gives the choice of multiple algorithms for each soil parameter. The software interface focuses on fully automatic generation of semi-quantitative soil maps such as soil moisture, soil organic matter, iron oxide, clay content, and carbonate content. Additionally, a field calibration option calculates fully quantitative soil maps provided ground truth soil data are available. The implemented soil algorithms have been tested and validated using extensive in-situ ground truth data sets.
The HYSOMA code was developed as standalone IDL software to allow easy implementation in the hyperspectral and non-hyperspectral communities. Within the hyperspectral community the IDL language is very widely used, and non-expert users without an ENVI license can execute such software as a binary version using the free IDL virtual machine under various operating systems. Based on the growing interest of users in the software interface, the experimental software was adapted for a public release version in 2012, and since then ~80 users of hyperspectral soil products have downloaded the soil algorithms at www.gfz-potsdam.de/hysoma. The software interface is distributed for free as IDL plug-ins under the IDL virtual machine. Up to now, distribution of HYSOMA has been based on a closed-source license model for non-commercial and educational purposes. Currently, HYSOMA is under further development in the context of the EnMAP satellite mission, for extension and implementation in the EnMAP Box as EnSoMAP (EnMAP SOil MAPper). The EnMAP Box is freely available, platform-independent software distributed under an open source license. In the presentation we will focus on an update of the HYSOMA software interface status and the upcoming implementation in the EnMAP Box. Scientific software validation, the associated publication record and user responses, as well as software management and the transition to open source, will be discussed.
Vectoring of parallel synthetic jets: A parametric study
NASA Astrophysics Data System (ADS)
Berk, Tim; Gomit, Guillaume; Ganapathisubramani, Bharathram
2016-11-01
The vectoring of a pair of parallel synthetic jets can be described using five dimensionless parameters: the aspect ratio of the slots, the Strouhal number, the Reynolds number, the phase difference between the jets and the spacing between the slots. In the present study, the influence of the latter four on the vectoring behaviour of the jets is examined experimentally using particle image velocimetry. Time-averaged velocity maps are used to study the variations in vectoring behaviour for a parametric sweep of each of the four parameters independently. A topological map is constructed for the full four-dimensional parameter space. The vectoring behaviour is described both qualitatively and quantitatively. A vectoring mechanism is proposed, based on measured vortex positions. We acknowledge the financial support from the European Research Council (ERC Grant Agreement No. 277472).
One-dimensional statistical parametric mapping in Python.
Pataky, Todd C
2012-01-01
Statistical parametric mapping (SPM) is a topological methodology for detecting field changes in smooth n-dimensional continua. Many classes of biomechanical data are smooth and contained within discrete bounds and as such are well suited to SPM analyses. The current paper accompanies release of 'SPM1D', a free and open-source Python package for conducting SPM analyses on a set of registered 1D curves. Three example applications are presented: (i) kinematics, (ii) ground reaction forces and (iii) contact pressure distribution in probabilistic finite element modelling. In addition to offering a high-level interface to a variety of common statistical tests like t tests, regression and ANOVA, SPM1D also emphasises fundamental concepts of SPM theory through stand-alone example scripts. Source code and documentation are available at: www.tpataky.net/spm1d/.
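The pointwise statistic underlying a 1D SPM analysis can be illustrated without the package itself. The sketch below is plain NumPy/SciPy, not the SPM1D API: it computes a two-sample t-statistic curve for synthetic registered 1D data and thresholds it with an uncorrected pointwise critical value, whereas SPM1D would derive the threshold from random field theory.

```python
import numpy as np
from scipy import stats

def t_curve_two_sample(YA, YB):
    """Pointwise two-sample t-statistic for registered 1D curves.

    YA, YB : arrays of shape (n_subjects, n_nodes). Returns t at each node."""
    nA, nB = YA.shape[0], YB.shape[0]
    var = ((nA - 1) * YA.var(axis=0, ddof=1)
           + (nB - 1) * YB.var(axis=0, ddof=1)) / (nA + nB - 2)
    return (YA.mean(axis=0) - YB.mean(axis=0)) / np.sqrt(var * (1 / nA + 1 / nB))

# Synthetic example: two groups of 10 curves sampled at 101 nodes, with a
# true group difference only near the middle of the domain.
rng = np.random.default_rng(0)
q = np.linspace(0, 1, 101)
signal = 2.0 * np.exp(-((q - 0.5) ** 2) / 0.005)
YA = rng.normal(size=(10, 101)) + signal
YB = rng.normal(size=(10, 101))
t_stat = t_curve_two_sample(YA, YB)

# SPM proper would use a random-field-theory threshold controlling the
# family-wise error across the whole continuum; this one is uncorrected.
t_crit = stats.t.ppf(1 - 0.025, df=18)
supra = q[np.abs(t_stat) > t_crit]
```

The suprathreshold nodes in `supra` correspond to the regions SPM would report as significant clusters.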
A 100-3000 GHz model of thermal dust emission observed by Planck, DIRBE and IRAS
NASA Astrophysics Data System (ADS)
Meisner, Aaron M.; Finkbeiner, Douglas P.
2015-01-01
We apply the Finkbeiner et al. (1999) two-component thermal dust emission model to the Planck HFI maps. This parametrization of the far-infrared dust spectrum as the sum of two modified blackbodies serves as an important alternative to the commonly adopted single modified blackbody (MBB) dust emission model. Analyzing the joint Planck/DIRBE dust spectrum, we show that two-component models provide a better fit to the 100-3000 GHz emission than do single-MBB models, though by a lesser margin than found by Finkbeiner et al. (1999) based on FIRAS and DIRBE. We also derive full-sky 6.1' resolution maps of dust optical depth and temperature by fitting the two-component model to Planck 217-857 GHz along with DIRBE/IRAS 100μm data. Because our two-component model matches the dust spectrum near its peak, accounts for the spectrum's flattening at millimeter wavelengths, and specifies dust temperature at 6.1' FWHM, our model provides reliable, high-resolution thermal dust emission foreground predictions from 100 to 3000 GHz. We find that, in diffuse sky regions, our two-component 100-217 GHz predictions are on average accurate to within 2.2%, while extrapolating the Planck Collaboration (2013) single-MBB model systematically underpredicts emission by 18.8% at 100 GHz, 12.6% at 143 GHz and 7.9% at 217 GHz. We calibrate our two-component optical depth to reddening, and compare with reddening estimates based on stellar spectra. We find the dominant systematic problems in our temperature/reddening maps to be zodiacal light on large angular scales and the cosmic infrared background anisotropy on small angular scales. We have recently released maps and associated software utilities for obtaining thermal dust emission and reddening predictions using our Planck-based two-component model.
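The spectral form of the two-component model can be sketched directly: each component is a modified blackbody, an emissivity power law ν^β times the Planck function B_ν(T). The parameter values below are illustrative, of the same order as FDS99-style fits, not the values fitted to the Planck HFI maps.

```python
import numpy as np

H = 6.62607015e-34    # Planck constant [J s]
K_B = 1.380649e-23    # Boltzmann constant [J/K]
C = 2.99792458e8      # speed of light [m/s]

def planck(nu, T):
    """Planck function B_nu(T) [W m^-2 Hz^-1 sr^-1]."""
    return 2.0 * H * nu**3 / C**2 / np.expm1(H * nu / (K_B * T))

def two_component_sed(nu, tau, f1, beta1, T1, beta2, T2, nu0=3.0e12):
    """Sum of two modified blackbodies: a cold and a warm dust component.

    tau : optical-depth normalization at reference frequency nu0
    f1  : fractional contribution of the cold component at nu0
    """
    return tau * (f1 * (nu / nu0) ** beta1 * planck(nu, T1)
                  + (1.0 - f1) * (nu / nu0) ** beta2 * planck(nu, T2))

# Evaluate the model over the 100-3000 GHz range discussed in the text.
nu = np.array([100e9, 217e9, 857e9, 3000e9])
sed = two_component_sed(nu, tau=1e-4, f1=0.036,
                        beta1=1.67, T1=9.4, beta2=2.70, T2=16.2)
```

Fitting the six parameters per pixel to multi-band maps is then a standard nonlinear least-squares problem.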
Liu, Ren-Hu; Meng, Jin-Ling
2003-05-01
MAPMAKER is one of the most widely used software packages for constructing genetic linkage maps. However, the PC version, MAPMAKER 3.0 for PC, cannot draw the genetic linkage maps that its Macintosh counterpart, MAPMAKER 3.0 for Macintosh, was able to produce. Moreover, in recent years the Macintosh has become much less popular than the PC, and most geneticists use a PC to analyze their genetic linkage data, so new software that draws on the PC the same genetic linkage maps that MAPMAKER for Macintosh drew on the Macintosh has long been needed. Microsoft Excel, a component of the Microsoft Office package, is among the most popular software for laboratory data processing, and Microsoft Visual Basic for Applications (VBA) is one of its most powerful features: with this programming language one can take creative control of Excel, including genetic linkage map construction, automatic data processing and more. In this paper, a Microsoft Excel macro called MapDraw is presented that draws genetic linkage maps on a PC from given genetic linkage data. With this software you can construct genetic linkage maps in Excel and freely edit and copy them into Word or other applications. The software is simply an Excel-format file that can be copied from ftp://211.69.140.177 or ftp://brassica.hzau.edu.cn; the source code can be found in Excel's Visual Basic Editor.
Development of a Cross-Flow Fan Rotor for Vertical Take-Off and Landing Aircraft
2013-06-01
ANSYS CFX, along with the commercial computer-aided design software SolidWorks, was used to model and perform a parametric study on the number of rotor...the results found using ANSYS CFX. The experimental and analytical models were successfully compared at speeds ranging from 4,000 to 7,000 RPM...will make vertical take-off possible. The commercial computational fluid dynamics software ANSYS CFX, along with the commercial computer-aided design
5m RPV for Exploring Joined Wing Gust Response
2009-12-01
an outer layer of light glass scrim used as the first layer. Varying layers of carbon cloth are used and then the core material is added. In this...for various angles of attack and sideslip angles. A parametric model is developed using Phoenix Integration’s Model Center Software (MC). This model...by the ground control software and finally a piece of real-time footage taken from the on-board, gimbaled camera. 2009 Progress Report 27
Computer-assisted concept mapping: Visual aids for knowledge construction
Mammen, Jennifer R.
2016-01-01
Background: Concept mapping is a visual representation of ideas that facilitates critical thinking and is applicable to many areas of nursing education. Computer-assisted concept maps are more flexible and less constrained than traditional paper methods, allowing for analysis and synthesis of complex topics and larger amounts of data. The ability to iteratively revise and collaboratively create computerized maps can contribute to enhanced interpersonal learning. However, there is limited awareness of free software that can support these types of applications. Discussion: This educational brief examines affordances and limitations of computer-assisted concept maps and reviews free software for the development of complex, collaborative, malleable maps. Free software such as VUE, XMind, MindMaple, and others can substantially contribute to the utility of concept mapping for nursing education. Conclusions: Computerized concept mapping is an important tool for nursing and is likely to hold greater benefit for students and faculty than traditional pen-and-paper methods alone. PMID:27351610
Packham, B; Barnes, G; Dos Santos, G Sato; Aristovich, K; Gilad, O; Ghosh, A; Oh, T; Holder, D
2016-06-01
Electrical impedance tomography (EIT) allows for the reconstruction of internal conductivity from surface measurements. A change in conductivity occurs as ion channels open during neural activity, making EIT a potential tool for functional brain imaging. EIT images can have >10 000 voxels, which means statistical analysis of such images presents a substantial multiple testing problem. One way to optimally correct for these issues and still maintain the flexibility of complicated experimental designs is to use random field theory. This parametric method estimates the distribution of peaks one would expect by chance in a smooth random field of a given size. Random field theory has been used in several other neuroimaging techniques but never validated for EIT images of fast neural activity; such validation can be achieved using non-parametric techniques. Both parametric and non-parametric techniques were used to analyze a set of 22 images collected from 8 rats. Significant group activations were detected using both techniques (corrected p < 0.05). Both parametric and non-parametric analyses yielded similar results, although the latter was less conservative. These results demonstrate the first statistical analysis of such an image set and indicate that random field theory is a viable approach for EIT images of neural activity.
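The non-parametric validation mentioned above is commonly done with a maximum-statistic permutation test, which controls the family-wise error rate without random-field assumptions. A minimal sketch on synthetic data follows; the sign-flip scheme for a one-sample design is an assumption here, not the authors' EIT pipeline.

```python
import numpy as np

def max_stat_permutation(images, n_perm=1000, alpha=0.05, seed=0):
    """Family-wise-error-corrected one-sample test via sign-flip permutations.

    images : (n_subjects, n_voxels) array of difference images.
    Returns (observed t-statistic per voxel, corrected critical value).
    """
    rng = np.random.default_rng(seed)
    n = images.shape[0]

    def tstat(x):
        return x.mean(axis=0) / (x.std(axis=0, ddof=1) / np.sqrt(n))

    t_obs = tstat(images)
    # Null distribution of the maximum |t| over all voxels under sign flips:
    # thresholding at its (1 - alpha) quantile controls the family-wise error.
    max_t = np.empty(n_perm)
    for i in range(n_perm):
        signs = rng.choice([-1.0, 1.0], size=(n, 1))
        max_t[i] = np.abs(tstat(images * signs)).max()
    return t_obs, np.quantile(max_t, 1 - alpha)

rng = np.random.default_rng(1)
imgs = rng.normal(size=(8, 500))   # 8 subjects, 500 voxels of pure noise
imgs[:, 10] += 5.0                 # one truly active voxel
t_obs, t_crit = max_stat_permutation(imgs)
active = np.flatnonzero(np.abs(t_obs) > t_crit)
```

Because the null maximum is taken over all voxels, any voxel exceeding `t_crit` is significant at the corrected level.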
Fitting C² Continuous Parametric Surfaces to Frontiers Delimiting Physiologic Structures
Bayer, Jason D.
2014-01-01
We present a technique to fit C² continuous parametric surfaces to scattered geometric data points forming frontiers delimiting physiologic structures in segmented images. Such a mathematical representation is valuable because it facilitates a large number of operations in modeling. While the fitting of C² continuous parametric curves to scattered geometric data points is quite trivial, the fitting of C² continuous parametric surfaces is not. The difficulty comes from the fact that each scattered data point must be assigned a unique parametric coordinate, and the fit is quite sensitive to their distribution on the parametric plane. We present a new approach in which a polygonal (quadrilateral or triangular) surface is extracted from the segmented image. This surface is subsequently projected onto a parametric plane in a manner that ensures a one-to-one mapping. The resulting polygonal mesh is then regularized for area and edge length. From this point, surface fitting is relatively trivial. The novelty of our approach lies in the regularization of the polygonal mesh. Process performance is assessed with the reconstruction of a geometric model of mouse heart ventricles from a computerized tomography scan. Our results show an excellent reproduction of the geometric data with surfaces that are C² continuous. PMID:24782911
Neural network representation and learning of mappings and their derivatives
NASA Technical Reports Server (NTRS)
White, Halbert; Hornik, Kurt; Stinchcombe, Maxwell; Gallant, A. Ronald
1991-01-01
Discussed here are recent theorems proving that artificial neural networks are capable of approximating an arbitrary mapping and its derivatives as accurately as desired. This fact forms the basis for further results establishing the learnability of the desired approximations, using results from non-parametric statistics. These results have potential applications in robotics, chaotic dynamics, control, and sensitivity analysis. An example involving learning the transfer function and its derivatives for a chaotic map is discussed.
Software cost/resource modeling: Deep space network software cost estimation model
NASA Technical Reports Server (NTRS)
Tausworthe, R. J.
1980-01-01
A parametric software cost estimation model prepared for JPL deep space network (DSN) data systems implementation tasks is presented. The resource estimation model incorporates principles and data from a number of existing models, such as those of the General Research Corporation, Doty Associates, IBM (Walston-Felix), Rome Air Force Development Center, University of Maryland, and Rayleigh-Norden-Putnam. The model calibrates task magnitude and difficulty, development environment, and software technology effects through prompted responses to a set of approximately 50 questions. Parameters in the model are adjusted to fit JPL software lifecycle statistics. The estimation model output scales a standard DSN work breakdown structure skeleton, which is then input to a PERT/CPM system, producing a detailed schedule and resource budget for the project being planned.
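A multiplicative parametric cost model of the kind described can be sketched as follows; the constants and factor names are invented placeholders for illustration, not the calibrated JPL values or the model's actual ~50 questionnaire items.

```python
import math

def estimate_effort(kloc, difficulty, env_factors):
    """Toy parametric cost model: effort = a * size**(b + difficulty) * multipliers.

    kloc        : estimated delivered size in thousands of lines of code
    difficulty  : exponent adjustment for task difficulty (0.0 = nominal)
    env_factors : multiplicative environment/technology adjustments, one per
                  prompted question (1.0 = nominal)
    a and b are invented calibration constants, not fitted lifecycle statistics.
    """
    a, b = 2.8, 1.05
    return a * kloc ** (b + difficulty) * math.prod(env_factors.values())

# Hypothetical task: 32 KLOC, slightly harder than nominal, three adjustments.
effort_pm = estimate_effort(
    kloc=32,
    difficulty=0.05,
    env_factors={"team_experience": 0.85,
                 "tooling": 0.95,
                 "requirements_volatility": 1.15},
)  # effort in person-months
```

In the model described above, an estimate of this kind would then be distributed over a work breakdown structure and fed to a PERT/CPM scheduler.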
PyPWA: A partial-wave/amplitude analysis software framework
NASA Astrophysics Data System (ADS)
Salgado, Carlos
2016-05-01
The PyPWA project aims to develop a software framework for Partial Wave and Amplitude Analysis of data, providing the user with software tools to identify resonances from multi-particle final states in photoproduction. Most of the code is written in Python. The software is divided into two main branches: a general shell in which amplitude parameters (or any parametric model) are estimated from the data; this branch also includes software to produce simulated data sets using the fitted amplitudes. A second branch contains a specific realization of the isobar model (with room to include Deck-type and other isobar model extensions) to perform PWA with an interface to the computing resources at Jefferson Lab. We are currently implementing parallelism and vectorization using the Intel Xeon Phi family of coprocessors.
Parametric-Studies and Data-Plotting Modules for the SOAP
NASA Technical Reports Server (NTRS)
2008-01-01
"Parametric Studies" and "Data Table Plot View" are the names of software modules in the Satellite Orbit Analysis Program (SOAP). Parametric Studies enables parameterization of as many as three satellite or ground-station attributes across a range of values and computes the average, minimum, and maximum of a specified metric, the revisit time, or 21 other functions at each point in the parameter space. This computation produces a one-, two-, or three-dimensional table of data representing statistical results across the parameter space. Inasmuch as the output of a parametric study in three dimensions can be a very large data set, visualization is a paramount means of discovering trends in the data (see figure). Data Table Plot View enables visualization of the data table created by Parametric Studies or by another data source: this module quickly generates a display of the data in the form of a rotatable three-dimensional-appearing plot, making it unnecessary to load the SOAP output data into a separate plotting program. The rotatable three-dimensionalappearing plot makes it easy to determine which points in the parameter space are most desirable. Both modules provide intuitive user interfaces for ease of use.
NASA Astrophysics Data System (ADS)
Durmaz, Murat; Karslioglu, Mahmut Onur
2015-04-01
There are various global and regional methods that have been proposed for the modeling of ionospheric vertical total electron content (VTEC). The global distribution of VTEC is usually modeled by spherical harmonic expansions, while tensor products of compactly supported univariate B-splines can be used for regional modeling. In these empirical parametric models, the coefficients of the basis functions as well as the differential code biases (DCBs) of satellites and receivers can be treated as unknown parameters to be estimated from geometry-free linear combinations of global positioning system observables. In this work we propose a new semi-parametric multivariate adaptive regression B-splines (SP-BMARS) method for the regional modeling of VTEC together with satellite and receiver DCBs, where the parametric part of the model is related to the DCBs as fixed parameters and the non-parametric part adaptively models the spatio-temporal distribution of VTEC. The latter is based on multivariate adaptive regression B-splines, a non-parametric modeling technique making use of compactly supported B-spline basis functions that are generated from the observations automatically. The algorithm takes advantage of an adaptive scale-by-scale model building strategy that searches for the best-fitting B-splines to the data at each scale. The VTEC maps generated by the proposed method are compared numerically and visually with the global ionosphere maps (GIMs) provided by the Center for Orbit Determination in Europe (CODE). The VTEC values from SP-BMARS and the CODE GIMs are also compared with VTEC values obtained through calibration using a local ionospheric model. The estimated satellite and receiver DCBs from the SP-BMARS model are compared with the CODE-distributed DCBs. The results show that the SP-BMARS algorithm can be used to estimate satellite and receiver DCBs while adaptively and flexibly modeling the daily regional VTEC.
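The non-parametric part of such a model rests on tensor products of compactly supported univariate B-splines. A minimal SciPy sketch of assembling a tensor-product design matrix follows; the uniform knot layout and the sampling grids are illustrative, not the adaptive SP-BMARS scheme.

```python
import numpy as np
from scipy.interpolate import BSpline

def bspline_basis(knots, degree, x):
    """Evaluate all compactly supported B-spline basis functions at points x.

    Returns an array of shape (len(x), n_basis); values outside an element's
    support come back from SciPy as NaN and are zeroed."""
    n_basis = len(knots) - degree - 1
    cols = [BSpline.basis_element(knots[i:i + degree + 2], extrapolate=False)(x)
            for i in range(n_basis)]
    return np.nan_to_num(np.column_stack(cols))

# Uniform quadratic splines; the basis sums to one on the central interval [0, 1].
deg = 2
knots = np.linspace(-0.4, 1.4, 10)
lat = np.linspace(0.05, 0.95, 7)   # normalized latitude samples
lon = np.linspace(0.05, 0.95, 9)   # normalized longitude samples
B_lat = bspline_basis(knots, deg, lat)
B_lon = bspline_basis(knots, deg, lon)

# Tensor-product design matrix: VTEC(lat, lon) = sum_ij c_ij B_i(lat) B_j(lon)
design = np.einsum("ai,bj->abij", B_lat, B_lon).reshape(len(lat) * len(lon), -1)
```

Stacking such design blocks with DCB indicator columns yields the combined semi-parametric least-squares system the abstract describes.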
Quantitative evaluation of microvascular blood flow by contrast-enhanced ultrasound (CEUS).
Greis, Christian
2011-01-01
Ultrasound contrast agents consist of tiny gas-filled microbubbles the size of red blood cells. Due to their size distribution, they are purely intravascular tracers which do not extravasate into the interstitial fluid, and thus they are ideal agents for imaging blood distribution and flow. Using ultrasound scanners with contrast-specific software, the specific microbubble-derived echo signals can be separated from tissue signals in real time, allowing selective imaging of the contrast agent. The signal intensity obtained is linearly related to the amount of microbubbles in the target organ, which allows easy and reliable assessment of relative blood volume. Imaging of the contrast wash-in and wash-out after bolus injection, or more precisely using the flash-replenishment technique, allows assessment of regional blood flow velocity. Commercially available quantification software packages can calculate time-related intensity values from the contrast wash-in and wash-out phases for each image pixel from stored video clips. After fitting a mathematical model curve according to the respective kinetic model (bolus or flash-replenishment kinetics), time/intensity curves (TIC) can be calculated from single pixels or user-defined regions of interest (ROI). Characteristic parameters of these TICs (e.g. peak intensity, area under the curve, wash-in rate, etc.) can be displayed as color-coded parametric maps on top of the anatomical image, to identify cold and hot spots with abnormal perfusion.
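The curve-fitting step can be illustrated with a gamma-variate bolus model, one common choice of TIC kinetic model (commercial packages may use different model curves). The sketch below fits synthetic data and extracts the characteristic parameters the text lists.

```python
import numpy as np
from scipy.optimize import curve_fit

def gamma_variate(t, A, t0, alpha, beta):
    """Gamma-variate bolus model; a common TIC kinetic model, not the only one."""
    dt = np.clip(t - t0, 0.0, None)
    return A * dt ** alpha * np.exp(-dt / beta)

# Synthetic time/intensity curve for one ROI (arbitrary intensity units).
rng = np.random.default_rng(0)
t = np.linspace(0, 60, 121)                        # seconds
tic = gamma_variate(t, A=2.0, t0=5.0, alpha=1.5, beta=6.0)
tic = tic + rng.normal(scale=0.5, size=t.size)     # measurement noise

popt, _ = curve_fit(gamma_variate, t, tic, p0=[1.0, 4.0, 1.0, 5.0],
                    bounds=([0.1, 0.0, 0.2, 1.0], [10.0, 20.0, 4.0, 20.0]))
fit = gamma_variate(t, *popt)

# Characteristic TIC parameters of the kind the text describes.
peak_intensity = fit.max()
time_to_peak = t[fit.argmax()]
auc = float(np.sum(0.5 * (fit[1:] + fit[:-1]) * np.diff(t)))   # trapezoid rule
wash_in_rate = float(np.max(np.gradient(fit, t)))
```

Repeating the fit per pixel instead of per ROI yields the color-coded parametric maps mentioned above.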
Suppa, Per; Hampel, Harald; Spies, Lothar; Fiebach, Jochen B; Dubois, Bruno; Buchert, Ralph
2015-01-01
Hippocampus volumetry based on magnetic resonance imaging (MRI) has not yet been translated into everyday clinical diagnostic patient care, at least in part due to limited availability of appropriate software tools. In the present study, we evaluate a fully-automated and computationally efficient processing pipeline for atlas based hippocampal volumetry using freely available Statistical Parametric Mapping (SPM) software in 198 amnestic mild cognitive impairment (MCI) subjects from the Alzheimer's Disease Neuroimaging Initiative (ADNI1). Subjects were grouped into MCI stable and MCI to probable Alzheimer's disease (AD) converters according to follow-up diagnoses at 12, 24, and 36 months. Hippocampal grey matter volume (HGMV) was obtained from baseline T1-weighted MRI and then corrected for total intracranial volume and age. Average processing time per subject was less than 4 minutes on a standard PC. The area under the receiver operator characteristic curve of the corrected HGMV for identification of MCI to probable AD converters within 12, 24, and 36 months was 0.78, 0.72, and 0.71, respectively. Thus, hippocampal volume computed with the fully-automated processing pipeline provides similar power for prediction of MCI to probable AD conversion as computationally more expensive methods. The whole processing pipeline has been made freely available as an SPM8 toolbox. It is easily set up and integrated into everyday clinical patient care.
Approximating prediction uncertainty for random forest regression models
John W. Coulston; Christine E. Blinn; Valerie A. Thomas; Randolph H. Wynne
2016-01-01
Machine learning approaches such as random forest are increasingly used for the spatial modeling and mapping of continuous variables. Random forest is a non-parametric ensemble approach, and unlike traditional regression approaches there is no direct quantification of prediction error. Understanding prediction uncertainty is important when using model-based continuous maps as...
NASA Astrophysics Data System (ADS)
Bandos, Igor A.; Ortín, Tomás
2016-08-01
We review and investigate different aspects of scalar fields in supergravity theories both when they parametrize symmetric spaces and when they parametrize spaces of special holonomy which are not necessarily symmetric (Kähler and Quaternionic-Kähler spaces): their rôle in the definition of derivatives of the fermions covariant under the R-symmetry group and (in gauged supergravities) under some gauge group, their dualization into (d - 2)-forms, their rôle in the supersymmetry transformation rules (via fermion shifts, for instance) etc. We find a general definition of momentum map that applies to any manifold admitting a Killing vector and coincides with those of the holomorphic and tri-holomorphic momentum maps in Kähler and quaternionic-Kähler spaces and with an independent definition that can be given in symmetric spaces. We show how the momentum map occurs ubiquitously: in gauge-covariant derivatives of fermions, in fermion shifts, in the supersymmetry transformation rules of (d - 2)-forms etc. We also give the general structure of the Noether-Gaillard-Zumino conserved currents in theories with fields of different ranks in any dimension.
DOT National Transportation Integrated Search
2014-04-01
This report provides a Road Map for implementing the AASHTOWare Pavement ME Design software for the Idaho Transportation Department (ITD). The Road Map calls for a series of three stages: Stage 1 - Immediate, Stage 2 - Near Term, and Stage 3 - Future...
Detection And Mapping (DAM) package. Volume 4A: Software System Manual, part 1
NASA Technical Reports Server (NTRS)
Schlosser, E. H.
1980-01-01
The package is an integrated set of manual procedures, computer programs, and graphic devices designed for efficient production of precisely registered and formatted maps from digital LANDSAT multispectral scanner (MSS) data. The software can be readily implemented on any Univac 1100 series computer with standard peripheral equipment. This version of the software includes predefined spectral limits for use in classifying and mapping surface water for LANDSAT-1, LANDSAT-2, and LANDSAT-3. Tape formats supported include X, AM, and PM.
3-D Geometric Modeling for the 21st Century.
ERIC Educational Resources Information Center
Ault, Holly K.
1999-01-01
Describes new geometric computer models used in contemporary computer-aided design (CAD) software including wire frame, surface, solid, and parametric models. Reviews their use in engineering design and discusses the impact of these new technologies on the engineering design graphics curriculum. (Author/CCM)
Direct Parametric Reconstruction With Joint Motion Estimation/Correction for Dynamic Brain PET Data.
Jiao, Jieqing; Bousse, Alexandre; Thielemans, Kris; Burgos, Ninon; Weston, Philip S J; Schott, Jonathan M; Atkinson, David; Arridge, Simon R; Hutton, Brian F; Markiewicz, Pawel; Ourselin, Sebastien
2017-01-01
Direct reconstruction of parametric images from raw photon counts has been shown to improve the quantitative analysis of dynamic positron emission tomography (PET) data. However, it suffers from subject motion, which is inevitable during the typical acquisition time of 1-2 hours. In this work we propose a framework to jointly estimate subject head motion and reconstruct the motion-corrected parametric images directly from raw PET data, so that the effects of distorted tissue-to-voxel mapping due to subject motion can be reduced in reconstructing the parametric images with motion-compensated attenuation correction and spatially aligned temporal PET data. The proposed approach is formulated within the maximum likelihood framework, and efficient solutions are derived for estimating subject motion and kinetic parameters from raw PET photon count data. Results from evaluations on simulated [11C]raclopride data using the Zubal brain phantom and real clinical [18F]florbetapir data of a patient with Alzheimer's disease show that the proposed joint direct parametric reconstruction and motion correction approach can improve the accuracy of quantifying dynamic PET data with large subject motion.
Prediction of contaminant fate and transport in potable water systems using H2OFate
NASA Astrophysics Data System (ADS)
Devarakonda, Venkat; Manickavasagam, Sivakumar; VanBlaricum, Vicki; Ginsberg, Mark
2009-05-01
BlazeTech has recently developed software called H2OFate to predict the fate and transport of chemical and biological contaminants in water distribution systems. This software includes models for the reactions of these contaminants with residual disinfectant in bulk water and at the pipe wall, and for their adhesion to and reactions with the pipe walls. The software can be interfaced with sensors through SCADA systems to monitor water distribution networks for contamination events and activate countermeasures as needed. This paper presents results from parametric calculations carried out using H2OFate for a simulated contaminant release into a sample water distribution network.
Nakayama, N; Okumura, A; Shinoda, J; Nakashima, T; Iwama, T
2006-07-01
The cerebral metabolism of patients in the chronic stage of traumatic diffuse brain injury (TDBI) has not been fully investigated. The aim of this study was to examine the relationship between regional cerebral metabolism (rCM) and consciousness disturbance in patients with TDBI. 52 patients with TDBI in the chronic stage without large focal lesions were enrolled, and rCM was evaluated by fluorine-18-fluorodeoxyglucose positron emission tomography (FDG-PET) with statistical parametric mapping (SPM). All the patients were found to have disturbed consciousness or cognitive function and were divided into three groups: group A (n = 22), patients with higher brain dysfunction; group B (n = 13), patients in a minimally conscious state; and group C (n = 17), patients in a vegetative state. rCM patterns on FDG-PET among these groups were evaluated and compared with those of normal control subjects on statistical parametric maps. Hypometabolism was consistently indicated bilaterally in the medial prefrontal regions, the medial frontobasal regions, the cingulate gyrus and the thalamus. Hypometabolism in these regions was most widespread and prominent in group C, and more widespread and prominent in group B than in group A. Bilateral hypometabolism in the medial prefrontal regions, the medial frontobasal regions, the cingulate gyrus and the thalamus may reflect the clinical deterioration of TDBI, which is due to functional and structural disconnection of neural networks rather than direct cerebral focal contusion.
Parametric versus Cox's model: an illustrative analysis of divorce in Canada.
Balakrishnan, T R; Rao, K V; Krotki, K J; Lapierre-adamcyk, E
1988-06-01
Recent demographic literature clearly recognizes the importance of survival models in the analysis of cross-sectional event histories. Of the various survival models, Cox's (1972) semi-parametric model has been very popular due to its simplicity and readily available computer software for estimation, sometimes at the cost of precision and parsimony of the model. This paper focuses on parametric failure time models for event history analysis, such as the Weibull, lognormal, log-logistic, and exponential models. The authors also test the goodness of fit of these parametric models versus Cox's proportional hazards model, taking the Kaplan-Meier estimate as the baseline. As an illustration, the authors reanalyze the Canadian Fertility Survey data on first-marriage dissolution with parametric models. Though the parametric model estimates were not very different from each other, the log-logistic model seemed to fit slightly better. When 8 covariates were used in the analysis, the coefficients were similar across models, and the overall conclusions about the relative risks would not have been different. The findings reveal that in marriage dissolution, the differences according to demographic and socioeconomic characteristics may be far more important than is generally found in many studies. Therefore, one should not treat the population as homogeneous in analyzing survival probabilities of marriages, other than for cursory analysis of overall trends.
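The comparison of a parametric survival curve against the Kaplan-Meier estimate can be sketched library-free (in practice packages such as lifelines or statsmodels would be used). The data below are synthetic Weibull durations with random censoring, and the parametric curve uses the generating parameters rather than maximum-likelihood fits.

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic duration data (years): Weibull event times, random censoring.
shape_true, scale_true = 1.4, 20.0        # invented, not survey estimates
event_time = scale_true * rng.weibull(shape_true, size=2000)
censor_time = rng.uniform(5.0, 40.0, size=2000)
time = np.minimum(event_time, censor_time)
observed = event_time <= censor_time      # True if the event was observed

def kaplan_meier(time, observed):
    """Kaplan-Meier survival estimate, one (time, survival) row per event."""
    order = np.argsort(time)
    t, d = time[order], observed[order]
    rows, s, at_risk = [], 1.0, len(t)
    for i in range(len(t)):
        if d[i]:
            s *= 1.0 - 1.0 / at_risk
            rows.append((t[i], s))
        at_risk -= 1
    return np.array(rows)

km = kaplan_meier(time, observed)
# Parametric alternative: Weibull survival S(t) = exp(-(t/scale)**shape),
# evaluated here at the generating parameters rather than fitted ones.
weibull_surv = np.exp(-(km[:, 0] / scale_true) ** shape_true)
max_gap = float(np.abs(km[:, 1] - weibull_surv).max())
```

Goodness-of-fit checks of the kind described above compare each fitted parametric curve against the Kaplan-Meier step function in exactly this way.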
Mazzotta, Laura; Cozzani, Mauro; Mutinelli, Sabrina; Castaldo, Attilio; Silvestrini-Biavati, Armando
2013-01-01
Objectives. To build a 3D parametric model to detect the shape and volume of dental roots from a panoramic radiograph (PAN) of the patient. Materials and Methods. A PAN and a cone beam computed tomography (CBCT) scan of a patient were acquired. For each tooth, various parameters were considered (coronal and root lengths and widths); these were measured from the CBCT and from the PAN, and the measures were compared to evaluate the accuracy of the PAN measurements. Using CAD software, parametric models of an incisor and of a molar were constructed employing B-spline curves and free-form surfaces. PAN measures of teeth 2.1 and 3.6 were assigned to the parametric models; the same two teeth were segmented from the CBCT. The two models were superimposed to assess the accuracy of the parametric model. Results. The PAN measures proved accurate and comparable with all other measurements. From the model superimposition, the maximum error was 1.1 mm on the incisor crown and 2 mm on the molar furcation. Conclusion. This study shows that it is possible to build a 3D parametric model starting from 2D information with a clinically valid level of accuracy. This can ultimately lead to crown-root movement simulation. PMID:23554814
Painting a picture across the landscape with ModelMap
Brian Cooke; Elizabeth Freeman; Gretchen Moisen; Tracey Frescino
2017-01-01
Scientists and statisticians working for the Rocky Mountain Research Station have created a software package that simplifies and automates many of the processes needed for converting models into maps. This software package, called ModelMap, has helped a variety of specialists and land managers to quickly convert data into easily understood graphical images. The...
Two-component Thermal Dust Emission Model: Application to the Planck HFI Maps
NASA Astrophysics Data System (ADS)
Meisner, Aaron M.; Finkbeiner, Douglas P.
2014-06-01
We present full-sky, 6.1 arcminute resolution maps of dust optical depth and temperature derived by fitting the Finkbeiner et al. (1999) two-component dust emission model to the Planck HFI and IRAS 100 micron maps. This parametrization of the far infrared thermal dust SED as the sum of two modified blackbodies serves as an important alternative to the commonly adopted single modified blackbody dust emission model. We expect our Planck-based maps of dust temperature and optical depth to form the basis for a next-generation, high-resolution extinction map which will additionally incorporate small-scale detail from WISE imaging.
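A minimal sketch of the two-modified-blackbody parametrization described above: the SED is modeled as the sum of two terms of the form tau·(nu/nu0)^beta·B_nu(T). The optical depths, reference frequency, and the FDS99-like temperatures used below are illustrative placeholders, not the per-pixel values of the published fit:

```python
import numpy as np

H = 6.62607e-34   # Planck constant [J s]
C = 2.99792e8     # speed of light [m/s]
KB = 1.38065e-23  # Boltzmann constant [J/K]

def planck(nu, temp):
    """Blackbody spectral radiance B_nu(T) in SI units."""
    return 2.0 * H * nu**3 / C**2 / np.expm1(H * nu / (KB * temp))

def two_mbb_sed(nu, tau1, beta1, t1, tau2, beta2, t2, nu0=3.0e12):
    """Far-IR dust SED as the sum of two modified blackbodies:
    tau_i * (nu/nu0)^beta_i * B_nu(T_i)."""
    return (tau1 * (nu / nu0) ** beta1 * planck(nu, t1)
            + tau2 * (nu / nu0) ** beta2 * planck(nu, t2))

nu = np.logspace(11, 13, 50)  # 100 GHz to 10 THz
# Cold and warm components with FDS99-like temperatures (~9.4 K, ~16.2 K);
# optical depths here are invented for illustration.
sed = two_mbb_sed(nu, 1e-4, 1.7, 9.4, 1e-5, 2.7, 16.2)
```

In a map-making context a function like `two_mbb_sed` would be fitted per pixel to the Planck HFI and IRAS intensities to recover optical depth and temperature.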
Multiple Concurrent Visual-Motor Mappings: Implications for Models of Adaptation
NASA Technical Reports Server (NTRS)
Cunningham, H. A.; Welch, Robert B.
1994-01-01
Previous research on adaptation to visual-motor rearrangement suggests that the central nervous system accurately represents only one visual-motor mapping at a time. This idea was examined in 3 experiments in which subjects tracked a moving target under repeated alternations between 2 initially interfering mappings (the 'normal' mapping characteristic of computer input devices and a 108° rotation of the normal mapping). Alternation between the 2 mappings led to a significant reduction in error under the rotated mapping and a significant reduction in the adaptation aftereffect ordinarily caused by switching between mappings. Color as a discriminative cue, interference versus decay in the adaptation aftereffect, and intermanual transfer were also examined. The results reveal a capacity for multiple concurrent visual-motor mappings, possibly controlled by a parametric process near the motor output stage of processing.
Computer-Assisted Concept Mapping: Visual Aids for Knowledge Construction.
Mammen, Jennifer R
2016-07-01
Concept mapping is a visual representation of ideas that facilitates critical thinking and is applicable to many areas of nursing education. Computer-assisted concept maps are more flexible and less constrained than traditional paper methods, allowing for analysis and synthesis of complex topics and larger amounts of data. Ability to iteratively revise and collaboratively create computerized maps can contribute to enhanced interpersonal learning. However, there is limited awareness of free software that can support these types of applications. This educational brief examines affordances and limitations of computer-assisted concept maps and reviews free software for development of complex, collaborative malleable maps. Free software, such as VUE, XMind, MindMaple, and others, can substantially contribute to the utility of concept mapping for nursing education. Computerized concept-mapping is an important tool for nursing and is likely to hold greater benefit for students and faculty than traditional pen-and-paper methods alone. [J Nurs Educ. 2016;55(7):403-406.]. Copyright 2016, SLACK Incorporated.
18F-FLT uptake kinetics in head and neck squamous cell carcinoma: a PET imaging study.
Liu, Dan; Chalkidou, Anastasia; Landau, David B; Marsden, Paul K; Fenwick, John D
2014-04-01
To analyze the kinetics of 3'-deoxy-3'-[18F]-fluorothymidine (18F-FLT) uptake by head and neck squamous cell carcinomas and involved nodes imaged using positron emission tomography (PET). Two- and three-tissue compartment models were fitted to 12 tumor time-activity curves (TACs) obtained for 6 structures (tumors or involved nodes) imaged in ten dynamic PET studies of 1 h duration, carried out for five patients. The ability of the models to describe the data was assessed using a runs test, the Akaike information criterion (AIC) and leave-one-out cross-validation. To generate parametric maps, the models were also fitted to the TACs of individual voxels. Correlations between maps of different parameters were characterized using Pearson's r coefficient; in particular the phosphorylation rate-constants k3-2tiss and k5 of the two- and three-tissue models were studied alongside the flux parameters KFLT-2tiss and KFLT of these models, and standardized uptake values (SUV). A methodology based on expectation-maximization clustering and the Bayesian information criterion ("EM-BIC clustering") was used to distil the information from noisy parametric images. Fits of two-tissue models 2C3K and 2C4K and three-tissue models 3C5K and 3C6K, comprising three, four, five, and six rate-constants respectively, passed the runs test for 4, 8, 10, and 11 of 12 tumor TACs. The three-tissue models have lower AIC and cross-validation scores for nine of the 12 tumors. Overall the 3C6K model has the lowest AIC and cross-validation scores and its fitted parameter values are of the same orders of magnitude as literature estimates. Maps of KFLT and KFLT-2tiss are strongly correlated (r = 0.85) and also correlate closely with SUV maps (r = 0.72 for KFLT-2tiss, 0.64 for KFLT). Phosphorylation rate-constant maps are moderately correlated with flux maps (r = 0.48 for k3-2tiss vs KFLT-2tiss and r = 0.68 for k5 vs KFLT); however, neither phosphorylation rate-constant correlates significantly with SUV.
EM-BIC clustering reduces the parametric maps to a small number of levels: on average 5.8, 3.5, 3.4, and 1.4 for KFLT-2tiss, KFLT, k3-2tiss, and k5. This large simplification is potentially useful for radiotherapy dose-painting, but demonstrates the high noise in some maps. Statistical simulations show that voxel-level noise degrades TACs generated from the 3C6K model sufficiently that the average AIC score, parameter bias, and total uncertainty of 2C4K model fits are similar to those of 3C6K fits, whereas at the whole-tumor level the scores are lower for 3C6K fits. For the patients studied here, whole-tumor FLT uptake time-courses are represented better overall by a three-tissue than by a two-tissue model. EM-BIC clustering simplifies noisy parametric maps, providing the best description of the underlying information they contain, and is potentially useful for radiotherapy dose-painting. However, the clustering highlights the large degree of noise present in maps of the phosphorylation rate-constants k5 and k3-2tiss, which are conceptually tightly linked to cellular proliferation. Methods must be found to make these maps more robust, either by constraining other model parameters or by modifying dynamic imaging protocols. © 2014 American Association of Physicists in Medicine.
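A compartment model of the kind named above (an irreversible two-tissue, "2C3K"-style model) can be sketched by direct numerical integration of its rate equations. The input function, rate constants, and simple Euler integrator below are illustrative, and the net-flux expression K = K1·k3/(k2+k3) is the standard one for this model class, not a value from the study:

```python
import numpy as np

def tac_2c3k(t, cp, K1, k2, k3):
    """Irreversible two-tissue compartment model:
    dC1/dt = K1*Cp - (k2 + k3)*C1,  dC2/dt = k3*C1.
    Returns the total tissue activity C1 + C2 (Euler integration)."""
    dt = t[1] - t[0]
    c1 = np.zeros_like(t)
    c2 = np.zeros_like(t)
    for i in range(1, len(t)):
        c1[i] = c1[i - 1] + dt * (K1 * cp[i - 1] - (k2 + k3) * c1[i - 1])
        c2[i] = c2[i - 1] + dt * k3 * c1[i - 1]
    return c1 + c2

t = np.linspace(0.0, 60.0, 601)        # minutes
cp = t * np.exp(-t / 2.0)              # toy arterial input function
tac = tac_2c3k(t, cp, K1=0.1, k2=0.2, k3=0.05)
k_flux = 0.1 * 0.05 / (0.2 + 0.05)     # net influx K = K1*k3/(k2+k3) = 0.02
```

Voxel-wise fitting of such a model to measured TACs is what produces the rate-constant and flux maps discussed in the abstract.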
Educational Software for Mapping Quantitative Trait Loci (QTL)
ERIC Educational Resources Information Center
Helms, T. C.; Doetkott, C.
2007-01-01
This educational software was developed to aid teachers and students in their understanding of how the process of identifying the most likely quantitative trait loci (QTL) position is determined between two flanking DNA markers. The objective of the software that we developed was to: (1) show how a QTL is mapped to a position on a chromosome using…
NASA Astrophysics Data System (ADS)
Alshakova, E. L.
2017-01-01
The program in the AutoLISP language allows automatically to form parametrical drawings during the work in the AutoCAD software product. Students study development of programs on AutoLISP language with the use of the methodical complex containing methodical instructions in which real examples of creation of images and drawings are realized. Methodical instructions contain reference information necessary for the performance of the offered tasks. The method of step-by-step development of the program is the basis for training in programming on AutoLISP language: the program draws elements of the drawing of a detail by means of definitely created function which values of arguments register in that sequence in which AutoCAD gives out inquiries when performing the corresponding command in the editor. The process of the program design is reduced to the process of step-by-step formation of functions and sequence of their calls. The author considers the development of the AutoLISP program for the creation of parametrical drawings of details, the defined design, the user enters the dimensions of elements of details. These programs generate variants of tasks of the graphic works performed in educational process of "Engineering graphics", "Engineering and computer graphics" disciplines. Individual tasks allow to develop at students skills of independent work in reading and creation of drawings, as well as 3D modeling.
Dieringer, Matthias A.; Deimling, Michael; Santoro, Davide; Wuerfel, Jens; Madai, Vince I.; Sobesky, Jan; von Knobelsdorff-Brenkenhoff, Florian; Schulz-Menger, Jeanette; Niendorf, Thoralf
2014-01-01
Introduction: Visual but subjective reading of longitudinal relaxation time (T1) weighted magnetic resonance images is commonly used for the detection of brain pathologies. For this non-quantitative measure, diagnostic quality depends on hardware configuration, imaging parameters, radio frequency transmission field (B1+) uniformity, as well as observer experience. Parametric quantification of the tissue T1 relaxation parameter offsets the propensity for these effects, but is typically time consuming. For this reason, this study examines the feasibility of rapid 2D T1 quantification using a variable flip angle (VFA) approach at magnetic field strengths of 1.5 Tesla, 3 Tesla, and 7 Tesla. These efforts include validation in phantom experiments and application to brain T1 mapping. Methods: T1 quantification included simulations of the Bloch equations to correct for slice profile imperfections, and a correction for B1+. Fast gradient echo acquisitions were conducted using three adjusted flip angles for the proposed T1 quantification approach, which was benchmarked against slice-profile-uncorrected 2D VFA and an inversion-recovery spin-echo based reference method. Brain T1 mapping was performed in six healthy subjects, one multiple sclerosis patient, and one stroke patient. Results: Phantom experiments showed a mean T1 estimation error of (-63±1.5)% for slice-profile-uncorrected 2D VFA and (0.2±1.4)% for the proposed approach compared to the reference method. Scan time for single-slice T1 mapping including B1+ mapping could be reduced to 5 seconds at an in-plane resolution of (2×2) mm², a scan time reduction of more than 99% compared to the reference method. Conclusion: Our results demonstrate that rapid 2D T1 quantification using a variable flip angle approach is feasible at 1.5T/3T/7T. It represents a valuable alternative for rapid T1 mapping due to the gain in speed versus conventional approaches.
This progress may serve to enhance the capabilities of parametric MR based lesion detection and brain tissue characterization. PMID:24621588
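The variable flip angle approach evaluated above rests on the spoiled gradient-echo signal equation, which linearizes so that T1 follows from a straight-line fit (the DESPOT1 formulation: S/sin(a) = E1·S/tan(a) + M0·(1-E1), with E1 = exp(-TR/T1)). A minimal noise-free sketch, assuming illustrative TR and flip-angle values and ignoring the slice-profile and B1+ corrections the study applies:

```python
import numpy as np

def spgr_signal(alpha, m0, t1, tr):
    """Ideal spoiled gradient-echo steady-state signal (perfect slice
    profile and B1+ assumed)."""
    e1 = np.exp(-tr / t1)
    return m0 * np.sin(alpha) * (1 - e1) / (1 - e1 * np.cos(alpha))

def vfa_t1(signals, alphas, tr):
    """DESPOT1-style linear fit: the slope of S/sin(a) vs S/tan(a)
    equals E1 = exp(-TR/T1)."""
    y = signals / np.sin(alphas)
    x = signals / np.tan(alphas)
    e1, _ = np.polyfit(x, y, 1)
    return -tr / np.log(e1)

tr = 5.0                                  # ms (illustrative)
alphas = np.deg2rad([3.0, 10.0, 17.0])    # three flip angles (illustrative values)
s = spgr_signal(alphas, m0=1000.0, t1=1000.0, tr=tr)
t1_est = vfa_t1(s, alphas, tr)            # recovers 1000 ms on noise-free data
```

On real data, the slope is biased by imperfect slice profiles and B1+ nonuniformity, which is exactly what the study's Bloch-simulation and B1+ corrections address.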
4D-PET reconstruction using a spline-residue model with spatial and temporal roughness penalties
NASA Astrophysics Data System (ADS)
Ralli, George P.; Chappell, Michael A.; McGowan, Daniel R.; Sharma, Ricky A.; Higgins, Geoff S.; Fenwick, John D.
2018-05-01
4D reconstruction of dynamic positron emission tomography (dPET) data can improve the signal-to-noise ratio in reconstructed image sequences by fitting smooth temporal functions to the voxel time-activity-curves (TACs) during the reconstruction, though the optimal choice of function remains an open question. We propose a spline-residue model, which describes TACs as weighted sums of convolutions of the arterial input function with cubic B-spline basis functions. Convolution with the input function constrains the spline-residue model at early time-points, potentially enhancing noise suppression in early time-frames, while still allowing a wide range of TAC descriptions over the entire imaged time-course, thus limiting bias. Spline-residue based 4D-reconstruction is compared to that of a conventional (non-4D) maximum a posteriori (MAP) algorithm, and to 4D-reconstructions based on adaptive-knot cubic B-splines, the spectral model and an irreversible two-tissue compartment (‘2C3K’) model. 4D reconstructions were carried out using a nested-MAP algorithm including spatial and temporal roughness penalties. The algorithms were tested using Monte-Carlo simulated scanner data, generated for a digital thoracic phantom with uptake kinetics based on a dynamic [18F]-fluoromisonidazole scan of a non-small cell lung cancer patient. For every algorithm, parametric maps were calculated by fitting each voxel TAC within a sub-region of the reconstructed images with the 2C3K model. Compared to conventional MAP reconstruction, spline-residue-based 4D reconstruction achieved >50% improvements, in terms of the bias and noise measures used, for five of the eight combinations of the four kinetic parameters for which parametric maps were created, and produced better results for 5/8 combinations than any of the other reconstruction algorithms studied, while spectral-model-based 4D reconstruction produced the best results for 2/8.
2C3K model-based 4D reconstruction generated the most biased parametric maps. Inclusion of a temporal roughness penalty function improved the performance of 4D reconstruction based on the cubic B-spline, spectral and spline-residue models.
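The spline-residue idea above, a TAC modeled as the arterial input function convolved with a weighted sum of cubic B-spline basis functions, can be sketched as follows. The input function, knot grid, and weights are all illustrative (the paper also studies adaptive knot placement):

```python
import numpy as np
from scipy.interpolate import splev

dt = 0.1
t = np.arange(0.0, 60.0, dt)              # minutes
aif = t * np.exp(-t / 1.5)                # toy arterial input function (AIF)

# Cubic B-spline basis on a coarse knot grid with repeated end knots.
knots = np.concatenate(([0.0] * 3, np.linspace(0.0, 60.0, 8), [60.0] * 3))
n_basis = len(knots) - 4                  # cubic splines: order 4
basis = []
for i in range(n_basis):
    c = np.zeros(n_basis)
    c[i] = 1.0                            # unit coefficient isolates basis i
    basis.append(splev(t, (knots, c, 3)))
basis = np.array(basis)

w = np.abs(np.random.default_rng(1).normal(size=n_basis))  # toy weights
residue = w @ basis                        # residue = weighted spline sum
tac = np.convolve(aif, residue)[:len(t)] * dt  # TAC = AIF convolved with residue
```

In the 4D reconstruction, weights like `w` are estimated per voxel inside the nested-MAP loop rather than drawn at random as here.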
Stochastic climate dynamics: Stochastic parametrizations and their global effects
NASA Astrophysics Data System (ADS)
Ghil, Michael
2010-05-01
A well-known difficulty in modeling the general circulation of the atmosphere and oceans is the limited, albeit increasing, resolution possible in the numerical solution of the governing partial differential equations. While the mass, energy and momentum of an individual cloud, in the atmosphere, or convection chimney, in the oceans, are negligible, their combined effects over long times are not. Until recently, small, subgrid-scale processes were represented in general circulation models (GCMs) by deterministic "parametrizations." While A. Arakawa and associates realized over three decades ago the conceptual need for ensembles of clouds in such parametrizations, it is only very recently that truly stochastic parametrizations have been introduced into GCMs and weather prediction models. These parametrizations essentially transform a deterministic autonomous system into a non-autonomous one, subject to random forcing. To study the long-term effects of such forcing systematically, one has to rely on the theory of random dynamical systems (RDS). This theory allows one to consider the detailed geometric structure of the random attractors associated with nonlinear, stochastically perturbed systems. These attractors extend the concept of strange attractors from autonomous dynamical systems to non-autonomous systems with random forcing. To illustrate the essence of the theory, its concepts and methods, we carry out a high-resolution numerical study of two "toy" models in their respective phase spaces. This study allows one to obtain a good approximation of their global random attractors, as well as of the time-dependent invariant measures supported by these attractors. The first of the two models studied herein is the Arnol'd family of circle maps in the presence of noise.
The maps' fine-grained, resonant landscape --- associated with Arnol'd tongues --- is smoothed by the noise, thus permitting a comparison with the observable aspects of the "Devil's staircase" that arises in modeling the El Nino-Southern Oscillation (ENSO). These results are confirmed by studying a "French garden" that is obtained by smoothing a "Devil's quarry." Such a quarry results from coupling two circle maps, and random forcing leads to a smoothed version thereof. We thus suspect that stochastic parametrizations will stabilize the sensitive dependence on parameters that has been noticed in the development of GCMs. This talk represents joint work with Mickael D. Chekroun, D. Kondrashov, Eric Simonnet and I. Zaliapin. Several other talks and posters complement the results presented here and provide further insights into RDS theory and its application to the geosciences.
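The first toy model mentioned above, the Arnol'd circle map with additive noise, is straightforward to iterate; the parameter values below are illustrative, not those of the study:

```python
import numpy as np

def noisy_circle_map(theta0, omega, k, sigma, n, seed=0):
    """Iterate the Arnol'd circle map with additive Gaussian noise:
    theta_{n+1} = theta_n + omega - (k/2pi)*sin(2pi*theta_n) + xi_n  (mod 1)."""
    rng = np.random.default_rng(seed)
    theta = np.empty(n + 1)
    theta[0] = theta0
    for i in range(n):
        theta[i + 1] = (theta[i] + omega
                        - k / (2 * np.pi) * np.sin(2 * np.pi * theta[i])
                        + sigma * rng.normal()) % 1.0
    return theta

# At sigma = 0 this map exhibits the Arnol'd-tongue locking structure
# (the "Devil's staircase"); a small sigma smooths that structure.
orbit = noisy_circle_map(0.2, omega=0.33, k=0.9, sigma=0.01, n=5000)
```

Sweeping `omega` and plotting the estimated rotation number with and without noise reproduces the smoothing effect discussed in the abstract.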
PET image reconstruction using multi-parametric anato-functional priors
NASA Astrophysics Data System (ADS)
Mehranian, Abolfazl; Belzunce, Martin A.; Niccolini, Flavia; Politis, Marios; Prieto, Claudia; Turkheimer, Federico; Hammers, Alexander; Reader, Andrew J.
2017-08-01
In this study, we investigate the application of multi-parametric anato-functional (MR-PET) priors for the maximum a posteriori (MAP) reconstruction of brain PET data in order to address the limitations of the conventional anatomical priors in the presence of PET-MR mismatches. In addition to partial volume correction benefits, the suitability of these priors for reconstruction of low-count PET data is also introduced and demonstrated, comparing to standard maximum-likelihood (ML) reconstruction of high-count data. The conventional local Tikhonov and total variation (TV) priors and current state-of-the-art anatomical priors including the Kaipio, non-local Tikhonov prior with Bowsher and Gaussian similarity kernels are investigated and presented in a unified framework. The Gaussian kernels are calculated using both voxel- and patch-based feature vectors. To cope with PET and MR mismatches, the Bowsher and Gaussian priors are extended to multi-parametric priors. In addition, we propose a modified joint Burg entropy prior that by definition exploits all parametric information in the MAP reconstruction of PET data. The performance of the priors was extensively evaluated using 3D simulations and two clinical brain datasets of [18F]florbetaben and [18F]FDG radiotracers. For simulations, several anato-functional mismatches were intentionally introduced between the PET and MR images, and furthermore, for the FDG clinical dataset, two PET-unique active tumours were embedded in the PET data. Our simulation results showed that the joint Burg entropy prior far outperformed the conventional anatomical priors in terms of preserving PET unique lesions, while still reconstructing functional boundaries with corresponding MR boundaries. In addition, the multi-parametric extension of the Gaussian and Bowsher priors led to enhanced preservation of edge and PET unique features and also an improved bias-variance performance. 
In agreement with the simulation results, the clinical results also showed that the Gaussian prior with voxel-based feature vectors, the Bowsher and the joint Burg entropy priors were the best performing priors. However, for the FDG dataset with simulated tumours, the TV and proposed priors were capable of preserving the PET-unique tumours. Finally, an important outcome was the demonstration that the MAP reconstruction of a low-count FDG PET dataset using the proposed joint entropy prior can lead to comparable image quality to a conventional ML reconstruction with up to 5 times more counts. In conclusion, multi-parametric anato-functional priors provide a solution to address the pitfalls of the conventional priors and are therefore likely to increase the diagnostic confidence in MR-guided PET image reconstructions.
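One ingredient of the anatomical priors discussed above, the Bowsher neighbourhood selection (keep only the most anatomically similar neighbours of each voxel), can be sketched in 2D. This toy version uses binary weights on an invented MR image; it is only the prior's neighbour-selection step, not the paper's multi-parametric MAP reconstruction:

```python
import numpy as np

def bowsher_weights(mr, n_keep=4):
    """For each interior pixel of a 2D anatomical image, keep the n_keep
    most similar of its 8 neighbours (binary Bowsher weights)."""
    h, w_dim = mr.shape
    offsets = [(di, dj) for di in (-1, 0, 1) for dj in (-1, 0, 1)
               if (di, dj) != (0, 0)]
    w = np.zeros((h, w_dim, 8))
    for i in range(1, h - 1):
        for j in range(1, w_dim - 1):
            diffs = [abs(mr[i + di, j + dj] - mr[i, j]) for di, dj in offsets]
            for idx in np.argsort(diffs)[:n_keep]:
                w[i, j, idx] = 1.0   # neighbour retained in the prior
    return w

mr = np.zeros((16, 16))
mr[:, 8:] = 1.0                      # toy MR image with a single vertical edge
w = bowsher_weights(mr)              # weights avoid averaging across the edge
```

Because the retained neighbours lie on the same side of the anatomical edge, a penalty built from these weights smooths within regions without blurring across MR boundaries.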
GIS Methodic and New Database for Magmatic Rocks. Application for Atlantic Oceanic Magmatism.
NASA Astrophysics Data System (ADS)
Asavin, A. M.
2001-12-01
Several geochemical databases are now available on the Internet. One of the main peculiarities of the stored geochemical information is that each sample carries geographical coordinates. As a rule, the database software uses this spatial information only in user-interface search procedures. On the other hand, GIS software (Geographical Information System software), for example ARC/INFO, which is used for creating and analyzing special geological, geochemical and geophysical e-maps, is deeply involved with the geographical coordinates of samples. We have joined the strengths of GIS systems and a relational geochemical database through special software. Our geochemical information system was created at the Vernadsky State Geological Museum and the Institute of Geochemistry and Analytical Chemistry in Moscow. We have tested the system with geochemical data on oceanic rocks from the Atlantic and Pacific oceans, about 10,000 chemical analyses. The GIS content consists of e-map covers of the globe. For the Atlantic ocean these covers include a gravimetric map (with a 2'' grid), oceanic-bottom heat flow, altimetric maps, seismic activity, a tectonic map and a geological map. Combining this content makes it possible to create new geochemical maps and to combine spatial analysis with numerical geochemical modeling of volcanic processes in an ocean segment. The information system has so far been tested on thick-client technology. The interface between the GIS system Arc/View and the database resides in a special sequence of multiple SQL queries. The result of these queries is a simple DBF file with geographical coordinates, which serves for the creation of geochemical and other special e-maps of the oceanic region. A more complex method is used for geophysical data: from ArcView we created a grid cover for the polygon spatial geophysical information.
Ultraino: An Open Phased-Array System for Narrowband Airborne Ultrasound Transmission.
Marzo, Asier; Corkett, Tom; Drinkwater, Bruce W
2018-01-01
Modern ultrasonic phased-array controllers are electronic systems capable of delaying the transmitted or received signals of multiple transducers. Configurable transmit-receive array systems, capable of electronic steering and shaping of the beam in near real-time, are available commercially, for example, for medical imaging. However, emerging applications, such as ultrasonic haptics, parametric audio, or ultrasonic levitation, require only a small subset of the capabilities provided by the existing controllers. To meet this need, we present Ultraino, a modular, inexpensive, and open platform that provides hardware, software, and example applications specifically aimed at controlling the transmission of narrowband airborne ultrasound. Our system is composed of software, driver boards, and arrays that enable users to quickly and efficiently perform research in various emerging applications. The software can be used to define array geometries, simulate the acoustic field in real time, and control the connected driver boards. The driver board design is based on an Arduino Mega and can control 64 channels with a square wave of up to 17 Vpp and π/5 phase resolution. Multiple boards can be chained together to increase the number of channels. The 40-kHz arrays with flat and spherical geometries are demonstrated for parametric audio generation, acoustic levitation, and haptic feedback.
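Focusing such an array amounts to advancing each element's phase by the wavenumber times its distance to the focal point, so all wavefronts arrive in phase, then quantizing to the driver's π/5 phase resolution. A sketch assuming an invented 8×8 flat array with 10 mm pitch (Ultraino's own array layouts and software APIs may differ):

```python
import numpy as np

SPEED_OF_SOUND = 343.0   # m/s in air
FREQ = 40e3              # 40 kHz transducers, as in the arrays described

def focus_phases(positions, focal_point):
    """Per-element phase delays that focus the array on one point:
    advance each element by k*d so all contributions arrive in phase."""
    k = 2 * np.pi * FREQ / SPEED_OF_SOUND            # wavenumber
    d = np.linalg.norm(positions - focal_point, axis=1)
    return (-k * d) % (2 * np.pi)

# 8x8 flat array with 10 mm pitch in the z = 0 plane (illustrative geometry)
xy = np.stack(np.meshgrid(np.arange(8), np.arange(8)), -1).reshape(-1, 2) * 0.01
pos = np.column_stack([xy, np.zeros(len(xy))])
phases = focus_phases(pos, np.array([0.035, 0.035, 0.10]))  # focus 10 cm above centre

# Quantize to the pi/5 phase resolution of the driver boards
quantized = np.round(phases / (np.pi / 5)) * (np.pi / 5)
```

The same phase pattern, emitted as delayed square waves, is what a driver board chain would reproduce in hardware.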
Towards the Optimal Pixel Size of dem for Automatic Mapping of Landslide Areas
NASA Astrophysics Data System (ADS)
Pawłuszek, K.; Borkowski, A.; Tarolli, P.
2017-05-01
Determining the appropriate spatial resolution of a digital elevation model (DEM) is a key step for effective landslide analysis based on remote sensing data. Several studies have demonstrated that choosing the finest DEM resolution is not always the best solution, and various DEM resolutions can be applicable for diverse landslide applications. Thus, this study aims to assess the influence of spatial resolution on automatic landslide mapping. A pixel-based approach using parametric and non-parametric classification methods, namely feed forward neural network (FFNN) and maximum likelihood classification (ML), was applied in this study. Additionally, this allowed us to determine the impact of the classification method on the selection of DEM resolution. Landslide-affected areas were mapped based on four DEMs generated at 1 m, 2 m, 5 m and 10 m spatial resolution from airborne laser scanning (ALS) data. The performance of the landslide mapping was then evaluated by applying a landslide inventory map and computing the confusion matrix. The results of this study suggest that the finest DEM scale is not always the best fit; working at 1 m DEM resolution, on the micro-topography scale, can show different results. The best performance was found using the 5 m DEM resolution for FFNN and the 1 m DEM resolution for ML classification.
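The confusion-matrix evaluation mentioned above reduces, for a binary landslide/non-landslide map, to counting the four agreement cases between the inventory map and the classification. A minimal sketch with invented toy maps:

```python
import numpy as np

def confusion_matrix(truth, pred):
    """2x2 confusion matrix for binary landslide (1) / non-landslide (0)
    maps: rows index the inventory label, columns the classified label."""
    cm = np.zeros((2, 2), dtype=int)
    for t, p in zip(truth.ravel(), pred.ravel()):
        cm[t, p] += 1
    return cm

truth = np.array([[1, 1, 0], [0, 1, 0]])   # toy landslide inventory map
pred = np.array([[1, 0, 0], [0, 1, 1]])    # toy classification result
cm = confusion_matrix(truth, pred)
accuracy = np.trace(cm) / cm.sum()         # overall agreement fraction
```

Metrics such as producer's and user's accuracy follow directly from the row and column sums of `cm`, which is how the per-resolution comparison in the study can be quantified.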
Broët, Philippe; Tsodikov, Alexander; De Rycke, Yann; Moreau, Thierry
2004-06-01
This paper presents two-sample statistics suited for testing equality of survival functions against improper semi-parametric accelerated failure time alternatives. These tests are designed for comparing either the short- or the long-term effect of a prognostic factor, or both. These statistics are obtained as partial likelihood score statistics from a time-dependent Cox model. As a consequence, the proposed tests can be very easily implemented using widely available software. A breast cancer clinical trial is presented as an example to demonstrate the utility of the proposed tests.
Computer aided system for parametric design of combination die
NASA Astrophysics Data System (ADS)
Naranje, Vishal G.; Hussein, H. M. A.; Kumar, S.
2017-09-01
In this paper, a computer aided system for the parametric design of combination dies is presented. The system is developed using the knowledge-based system technique of artificial intelligence and is capable of designing combination dies for the production of sheet metal parts involving punching and cupping operations. The system is coded in Visual Basic and interfaced with AutoCAD software. The low cost of the proposed system will help die designers in small and medium scale sheet metal industries to design combination dies for similar types of products. The proposed system can reduce the design time and effort required of die designers for combination dies.
NASA Astrophysics Data System (ADS)
Bereau, Tristan; Wang, Zun-Jing; Deserno, Markus
2014-03-01
Interfacial systems are at the core of fascinating phenomena in many disciplines, such as biochemistry, soft-matter physics, and food science. However, the parametrization of accurate, reliable, and consistent coarse-grained (CG) models for systems at interfaces remains a challenging endeavor. In the present work, we explore to what extent two independently developed solvent-free CG models of peptides and lipids—of different mapping schemes, parametrization methods, target functions, and validation criteria—can be combined by only tuning the cross-interactions. Our results show that the cross-parametrization can reproduce a number of structural properties of membrane peptides (for example, tilt and hydrophobic mismatch), in agreement with existing peptide-lipid CG force fields. We find encouraging results for two challenging biophysical problems: (i) membrane pore formation mediated by the cooperative action of several antimicrobial peptides, and (ii) the insertion and folding of the helix-forming peptide WALP23 in the membrane.
Regional Brain Activity in Abstinent Methamphetamine Dependent Males Following Cue Exposure.
Malcolm, Robert; Myrick, Hugh; Li, Xingbao; Henderson, Scott; Brady, Kathleen T; George, Mark S; See, Ronald E
Neuroimaging of drug-associated cue presentations has aided in understanding the neurobiological substrates of craving and relapse for cocaine, alcohol, and nicotine. However, cue-reactivity imaging in methamphetamine addiction has been much less studied. Nine Caucasian male methamphetamine-dependent subjects and nine healthy controls were scanned in a Philips 3.0T MRI scanner while they viewed a randomized presentation of visual cues of methamphetamine, neutral objects, and rest conditions. Functional imaging data were analyzed with Statistical Parametric Mapping software 5 (SPM 5). Methamphetamine subjects showed significant brain activation in the ventral striatum and medial frontal cortex for methamphetamine pictures versus neutral pictures, relative to healthy controls (p<0.005, threshold 15 voxels). Interestingly, the ventral striatum activation correlated significantly with the days since last methamphetamine use (r=-0.76, p=0.017). No significant activity was found in the healthy control group. These preliminary data suggest that methamphetamine-dependent subjects, when exposed to methamphetamine-associated visual cues, have increased brain activity in the ventral striatum, caudate nucleus and medial frontal cortex, regions that subserve craving, drug-seeking, and drug use.
Britz, Juliane; Pitts, Michael A
2011-11-01
We used an intermittent stimulus presentation to investigate event-related potential (ERP) components associated with perceptual reversals during binocular rivalry. The combination of spatiotemporal ERP analysis with source imaging and statistical parametric mapping of the concomitant source differences yielded differences in three time windows: reversals showed increased activity in early visual (∼120 ms) and in inferior frontal and anterior temporal areas (∼400-600 ms) and decreased activity in the ventral stream (∼250-350 ms). The combination of source imaging and statistical parametric mapping suggests that these differences were due to differences in generator strength and not generator configuration, unlike the initiation of reversals in right inferior parietal areas. These results are discussed within the context of the extensive network of brain areas that has been implicated in the initiation, implementation, and appraisal of bistable perceptual reversals. Copyright © 2011 Society for Psychophysiological Research.
NASA Astrophysics Data System (ADS)
Oreni, D.; Karimi, G.; Barazzetti, L.
2017-08-01
This paper presents the development of a BIM model for a stratified historic structure characterized by a complex geometry: Filarete's Ospedale Maggiore ice house, one of the few remaining historic ice houses in Milan (Fig. 1). Filarete, a well-known Renaissance architect and theorist, planned the hospital in the 15th century, but the ice house was built two centuries later as a double-storey irregular octagonal brick structure, half under and half above ground, enclosing another circular structure called the ice room. The purpose of the double-walled structure was to store ice in the middle and to store and preserve perishable food and medicine on the outer side of the ice room. During World War II, major portions of the hospital and the above-ground section of the ice house were bombed and heavily damaged. Later, in 1962, the hospital was restored and rehabilitated into a university, with the plan to conceal the ice house's remaining structure in the courtyard; ultimately it was excavated and incorporated into a new library for the university. A team of engineers, architects, and students from Politecnico di Milano and Carleton University conducted two heritage recording surveys in 2015 and 2016 to fully document the existing condition of the ice house, resulting in a comprehensive laser scanning and photogrammetric point cloud dataset. The point cloud data was consolidated and imported into two leading parametric modelling software packages, Autodesk Revit© and Graphisoft ArchiCAD©, with the goal of developing two BIMs in parallel in order to study and compare each package's BIM workflow, parametric capabilities, ability to capture the complex geometry with high accuracy, and the time required for parametric modelling. The comparison revealed workflow limitations in both packages, leading to integration of the BIM generative process with other pure modelling software such as Rhinoceros©.
The integrative BIM process led to the production of a comprehensive BIM model that documented related historic data and the existing physical state of the ice house, to be used as a baseline for preventive maintenance, monitoring, and future conservation projects.
Using Google Earth as an innovative tool for community mapping.
Lefer, Theodore B; Anderson, Matthew R; Fornari, Alice; Lambert, Anastasia; Fletcher, Jason; Baquero, Maria
2008-01-01
Maps are used to track diseases and illustrate the social context of health problems. However, commercial mapping software requires special training. This article illustrates how nonspecialists used Google Earth, a free program, to create community maps. The Bronx, New York, is characterized by high levels of obesity and diabetes. Residents and medical students measured the variety and quality of food and exercise sources around a residency training clinic and a student-run free clinic, using Google Earth to create maps with minimal assistance. Locations were identified using street addresses or simply by pointing to them on a map. Maps can be shared via e-mail, viewed online with Google Earth or Google Maps, and the data can be incorporated into other mapping software.
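Google Earth reads KML, so community maps like these can also be produced programmatically once locations are known. A minimal sketch follows; the site names and coordinates (roughly in the Bronx) are hypothetical illustrations, not data from the study.

```python
# Minimal sketch: generate a KML file that Google Earth or Google Maps can
# open, marking food or exercise sources. Site names and coordinates are
# hypothetical illustrations, not data from the study.
sites = [
    ("Community garden", -73.89, 40.85),
    ("Fast-food outlet", -73.90, 40.86),
]

placemarks = "\n".join(
    f"  <Placemark><name>{name}</name>"
    f"<Point><coordinates>{lon},{lat},0</coordinates></Point></Placemark>"
    for name, lon, lat in sites
)

kml = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<kml xmlns="http://www.opengis.net/kml/2.2">\n<Document>\n'
    f"{placemarks}\n</Document>\n</kml>\n"
)

with open("community_map.kml", "w") as f:
    f.write(kml)
```

The resulting file can be e-mailed or posted online, matching the sharing workflow the article describes.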
Kelley, James J; Maor, Shay; Kim, Min Kyung; Lane, Anatoliy; Lun, Desmond S
2017-08-15
Visualization of metabolites, reactions and pathways in genome-scale metabolic networks (GEMs) can assist in understanding cellular metabolism. Three attributes are desirable in software used for visualizing GEMs: (i) automation, since GEMs can be quite large; (ii) production of understandable maps that make pathways, reactions and metabolites easy to identify; and (iii) visualization of the entire network to show how pathways are interconnected. No existing software for visualizing GEMs satisfies all three characteristics, but MOST-Visualization, an extension of the software package MOST (Metabolic Optimization and Simulation Tool), satisfies (i), and by using a pre-drawn overview map of metabolism based on the Roche map it satisfies (ii) and comes close to satisfying (iii). MOST is distributed for free under the GNU General Public License. The software and full documentation are available at http://most.ccib.rutgers.edu/. Contact: dslun@rutgers.edu. Supplementary data are available at Bioinformatics online.
How Conoco uses GIS technology to map geology, geography through time
DOE Office of Scientific and Technical Information (OSTI.GOV)
Foley, D.C.; Ghazi, T.Y.
1995-05-08
Conoco Inc.'s Advanced Exploration Organization (AEO) is in the business of studying foreign sedimentary basins from a regional perspective to evaluate their potential for petroleum exploration. Recently the company decided to focus some of the AEO's resources on developing a global ranking system for those areas of the world where hydrocarbons might occur. AEO obtained software from the University of Texas, Arlington that rotates continents or portions of continents through time. Using the software, company geoscientists have created a series of maps, known as a PaleoAtlas, that depicts the geography and selected geological features for different periods in Phanerozoic time. In addition, the AEO has developed a software package based on ARC/INFO (ESRI Inc., Redlands, Calif.), a commercial GIS platform, to manage, integrate, and analyze those time-slice maps. Entitled the PaleoAtlas Geographic Evaluation system (Pages), this software also sequences portions of the maps in a montage effect that geoscientists can use to study the geological evolution of petroleum source rocks. The paper describes the AEO project and its software.
AIRS Maps from Space Processing Software
NASA Technical Reports Server (NTRS)
Thompson, Charles K.; Licata, Stephen J.
2012-01-01
This software package processes Atmospheric Infrared Sounder (AIRS) Level 2 swath standard product geophysical parameters, and generates global, colorized, annotated maps. It automatically generates daily and multi-day averaged colorized and annotated maps of various AIRS Level 2 swath geophysical parameters. It also generates AIRS input data sets for Eyes on Earth, Puffer-sphere, and Magic Planet. This program is tailored to AIRS Level 2 data products. It re-projects data into 1/4-degree grids that can be combined and averaged for any number of days. The software scales and colorizes global grids utilizing AIRS-specific color tables, and annotates images with title and color bar. This software can be tailored for use with other swath data products for the purposes of visualization.
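The re-projection step, binning swath samples onto a 1/4-degree global grid and averaging, can be sketched as follows. The variable names and sample values are illustrative assumptions, not AIRS code.

```python
import numpy as np

# Hypothetical swath samples (lat, lon, value), e.g. a retrieved geophysical
# parameter along an AIRS-like orbit track -- illustrative numbers only.
lats = np.array([10.1, 10.12, -45.3, 60.7])
lons = np.array([20.2, 20.22, 100.0, -30.5])
vals = np.array([295.0, 296.0, 270.0, 250.0])

# Accumulate samples into a 1/4-degree global grid (720 x 1440 cells).
nlat, nlon = 720, 1440
acc = np.zeros((nlat, nlon))
cnt = np.zeros((nlat, nlon))

rows = ((lats + 90.0) / 0.25).astype(int).clip(0, nlat - 1)
cols = ((lons + 180.0) / 0.25).astype(int).clip(0, nlon - 1)
np.add.at(acc, (rows, cols), vals)   # unbuffered add handles repeated cells
np.add.at(cnt, (rows, cols), 1)

# Average where data exist; accumulators from several days can be summed
# the same way before dividing, giving multi-day means.
grid = np.full((nlat, nlon), np.nan)
mask = cnt > 0
grid[mask] = acc[mask] / cnt[mask]
print(grid[400, 800])  # → 295.5, mean of the two co-located samples
```

Colorization and annotation would then be applied to `grid` with a parameter-specific color table.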
Design and Applications of Rapid Image Tile Producing Software Based on Mosaic Dataset
NASA Astrophysics Data System (ADS)
Zha, Z.; Huang, W.; Wang, C.; Tang, D.; Zhu, L.
2018-04-01
Map tile technology is widely used in web geographic information services, and efficient tile production is a key technology for rapidly serving imagery on the web. In this paper, software for rapidly producing image tiles based on a mosaic dataset is designed, and the tile production workflow is presented. Key technologies such as cluster processing, map representation, tile checking, tile conversion and in-memory compression are discussed. The software was implemented and tested with actual image data; the results show that it has a high degree of automation, effectively reduces the number of I/O operations, improves tile production efficiency, and significantly reduces manual operations.
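The paper does not spell out its tiling scheme, but the common Web Mercator "slippy map" convention illustrates the tile arithmetic such software relies on:

```python
import math

def lonlat_to_tile(lon, lat, zoom):
    """Map a WGS84 coordinate to the XYZ tile that contains it
    (standard Web Mercator 'slippy map' convention)."""
    n = 2 ** zoom
    x = int((lon + 180.0) / 360.0 * n)
    y = int((1.0 - math.asinh(math.tan(math.radians(lat))) / math.pi) / 2.0 * n)
    return x, y

# The tile count grows fourfold per zoom level, which is why bulk tile
# production benefits from cluster processing and in-memory compression.
print(lonlat_to_tile(0.0, 0.0, 2))  # → (2, 2)
```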
Descriptions of Free and Freeware Software in the Mathematics Teaching
NASA Astrophysics Data System (ADS)
Antunes de Macedo, Josue; Neves de Almeida, Samara; Voelzke, Marcos Rincon
2016-05-01
This paper presents an analysis and cataloging of free and freeware mathematical software available on the internet, a brief explanation of each package, and the types of licenses for use in teaching and learning. The methodology is based on qualitative research. Among the different types of software found, Winmat stands out in algebra, as it works with linear algebra, matrices and linear systems. In geometry there is GeoGebra, which can be used in the study of functions, plane and spatial geometry, algebra and calculus. For graphing, one can cite Graph and Graphequation. With the Graphmatica software, it is possible to build various graphs of mathematical equations on the same screen, representing Cartesian equations, inequalities, and parametric equations, among other functions. Winplot allows the user to build graphs of functions and mathematical equations in two and three dimensions. Thus, this work aims to present teachers with some free mathematical software suitable for use in the classroom.
Software for Generating Troposphere Corrections for InSAR Using GPS and Weather Model Data
NASA Technical Reports Server (NTRS)
Moore, Angelyn W.; Webb, Frank H.; Fishbein, Evan F.; Fielding, Eric J.; Owen, Susan E.; Granger, Stephanie L.; Bjoerndahl, Fredrik; Loefgren, Johan; Fang, Peng; Means, James D.;
2013-01-01
Atmospheric errors due to the troposphere are a limiting error source for spaceborne interferometric synthetic aperture radar (InSAR) imaging. This software generates tropospheric delay maps that can be used to correct atmospheric artifacts in InSAR data. The software automatically acquires all needed GPS (Global Positioning System), weather, and Digital Elevation Map data, and generates a tropospheric correction map using a novel algorithm for combining GPS and weather information while accounting for terrain. Existing JPL software was prototypical in nature, required a MATLAB license, required additional steps to acquire and ingest needed GPS and weather data, and did not account for topography in interpolation; nor did it achieve a level of automation suitable for integration in a Web portal. This software overcomes these issues. GPS estimates of tropospheric delay are a source of corrections that can be used to form correction maps to be applied to InSAR data, but the spacing of GPS stations is insufficient to remove short-wavelength tropospheric artifacts. This software combines interpolated GPS delay with weather-model precipitable water vapor (PWV) and a digital elevation model to account for terrain, increasing the spatial resolution of the tropospheric correction maps and thus removing short-wavelength tropospheric artifacts to a greater extent. It will be integrated into a Web portal request system, allowing use in a future L-band SAR Earth radar mission data system. This will be a significant contribution to its technology readiness, building on existing investments in in situ space geodetic networks, and improving the timeliness, quality, and science value of the collected data.
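A toy sketch of the combination idea follows. This is not the JPL algorithm: the station positions, delays, synthetic DEM, and the 2 km water-vapor scale height are all invented for illustration.

```python
import numpy as np

# Toy sketch: interpolate sparse GPS zenith wet delays onto a grid, then
# rescale by terrain height using an assumed exponential scale height.
# All numbers below are illustrative assumptions.
H = 2000.0  # assumed water-vapor scale height, meters

# Hypothetical GPS stations: (x, y) grid position, height (m), zenith delay (m)
stations = np.array([[10.0, 10.0, 500.0, 0.10],
                     [80.0, 20.0, 1500.0, 0.06],
                     [40.0, 70.0, 200.0, 0.12]])

ny, nx = 100, 100
yy, xx = np.mgrid[0:ny, 0:nx]
dem = 1000.0 + 500.0 * np.sin(xx / 15.0)  # synthetic DEM, meters

# Reduce each station delay to sea level, interpolate with inverse-distance
# weights, then map back up to the DEM height at every pixel.
sea_level = stations[:, 3] * np.exp(stations[:, 2] / H)
num = np.zeros((ny, nx))
den = np.zeros((ny, nx))
for (sx, sy, _, _), d0 in zip(stations, sea_level):
    w = 1.0 / ((xx - sx) ** 2 + (yy - sy) ** 2 + 1.0)
    num += w * d0
    den += w
delay_map = (num / den) * np.exp(-dem / H)  # delay shrinks over high terrain
```

Scaling the interpolated field by terrain height is what lets the sparse GPS network constrain short-wavelength, topography-correlated delay.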
NASA Astrophysics Data System (ADS)
Gangeh, Mehrdad J.; Fung, Brandon; Tadayyon, Hadi; Tran, William T.; Czarnota, Gregory J.
2016-03-01
A non-invasive computer-aided-theragnosis (CAT) system was developed for the early assessment of responses to neoadjuvant chemotherapy in patients with locally advanced breast cancer. The CAT system was based on quantitative ultrasound spectroscopy methods comprising several modules, including feature extraction, a metric to measure the dissimilarity between "pre-" and "mid-treatment" scans, and a supervised learning algorithm for the classification of patients as responders or non-responders. One major requirement for the successful design of a high-performance CAT system is to accurately measure the changes in parametric maps before treatment onset and during the course of treatment. To this end, a unified framework based on the Hilbert-Schmidt independence criterion (HSIC) was used for the design of both the feature extraction from parametric maps and the dissimilarity measure between the "pre-" and "mid-treatment" scans. For the feature extraction, HSIC was used to design a supervised dictionary learning (SDL) method by maximizing the dependency between the scans taken from "pre-" and "mid-treatment" with "dummy labels" given to the scans. For the dissimilarity measure, an HSIC-based metric was employed to effectively measure the changes in parametric maps as an indication of treatment effectiveness. The HSIC-based feature extraction and dissimilarity measure used a kernel function to nonlinearly transform input vectors into a higher dimensional feature space and computed the population means in the new space, where enhanced group separability is ideally obtained. The results of the classification using the developed CAT system indicated an improvement in performance compared to a CAT system with basic features using histogram of intensity.
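The HSIC dependence measure at the heart of this framework has a simple biased empirical estimator, trace(KHLH)/(n-1)², where K and L are kernel matrices on the two variables and H centers them. A sketch with an assumed RBF kernel (toy data, not the paper's ultrasound features):

```python
import numpy as np

def rbf_kernel(X, sigma=1.0):
    """Gaussian (RBF) kernel matrix of the rows of X."""
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-d2 / (2.0 * sigma**2))

def hsic(X, Y, sigma=1.0):
    """Biased empirical HSIC: trace(K H L H) / (n-1)^2,
    where H = I - (1/n) 11^T centers the kernel matrices."""
    n = X.shape[0]
    K, L = rbf_kernel(X, sigma), rbf_kernel(Y, sigma)
    H = np.eye(n) - np.ones((n, n)) / n
    return np.trace(K @ H @ L @ H) / (n - 1) ** 2

# A dependent pair (Y = X^2) scores well above an independent pair.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 1))
Y_ind = rng.normal(size=(100, 1))
h_dep, h_ind = hsic(X, X**2), hsic(X, Y_ind)
print(h_dep, h_ind)
```

Used as a dissimilarity, a larger HSIC-style statistic between "pre-" and "mid-treatment" features signals a larger treatment-induced change.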
Inspection design using 2D phased array, TFM and cueMAP software
DOE Office of Scientific and Technical Information (OSTI.GOV)
McGilp, Ailidh; Dziewierz, Jerzy; Lardner, Tim
2014-02-18
A simulation suite, cueMAP, has been developed to facilitate the design of inspection processes and sparse 2D array configurations. At the core of cueMAP is a Total Focusing Method (TFM) imaging algorithm that enables computer-assisted design of ultrasonic inspection scenarios, including the design of bespoke array configurations to match the inspection criteria. This in-house developed TFM code allows for interactive evaluation of image quality indicators of ultrasonic imaging performance when utilizing a 2D phased array working in FMC/TFM mode. The cueMAP software uses a series of TFM images to build a map of the resolution, contrast and sensitivity of imaging performance for a simulated reflector swept across the inspection volume. The software takes into account probe properties, wedge or water standoff, and effects of specimen curvature. In the validation process of this new software package, two 2D arrays have been evaluated on 304n stainless steel samples, typical of the primary circuit in nuclear plants. Thick section samples have been inspected using a 1 MHz 2D matrix array. Due to the processing efficiency of the software, the data collected from these array configurations have been used to investigate the influence of sub-aperture operation on inspection performance.
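The core TFM operation is delay-and-sum of Full Matrix Capture (FMC) data over every transmit/receive element pair. A minimal sketch with an assumed 16-element linear aperture and synthetic impulse data from one point reflector (an idealized illustration, not the cueMAP implementation):

```python
import numpy as np

# Idealized TFM sketch: assumed 16-element linear aperture, synthetic
# impulse FMC data from one point reflector. Not the cueMAP code.
c = 5900.0                             # assumed wave speed in steel, m/s
fs = 50e6                              # sampling frequency, Hz
elems = np.linspace(-0.01, 0.01, 16)   # element x-positions, m
target = np.array([0.0, 0.02])         # point reflector at 20 mm depth
n_samp = 2000

fmc = np.zeros((16, 16, n_samp))       # full matrix capture: tx x rx x time
for t in range(16):
    for r in range(16):
        dist = (np.hypot(elems[t] - target[0], target[1]) +
                np.hypot(elems[r] - target[0], target[1]))
        fmc[t, r, int(round(dist / c * fs))] = 1.0

def tfm_pixel(x, z):
    """Delay-and-sum: focus every tx/rx pair at image point (x, z)."""
    val = 0.0
    for t in range(16):
        for r in range(16):
            dist = np.hypot(elems[t] - x, z) + np.hypot(elems[r] - x, z)
            idx = int(round(dist / c * fs))
            if idx < n_samp:
                val += fmc[t, r, idx]
    return val

peak, off = tfm_pixel(0.0, 0.02), tfm_pixel(0.005, 0.01)
print(peak, off)  # the image focuses sharply at the true reflector
```

Sweeping `tfm_pixel` over a grid of simulated reflector positions is how a resolution/contrast/sensitivity map of the kind described above can be assembled.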
Heating and thermal squeezing in parametrically driven oscillators with added noise.
Batista, Adriano A
2012-11-01
In this paper we report a theoretical model, based on Green's functions, Floquet theory, and averaging techniques up to second order, that describes the dynamics of parametrically driven oscillators with added thermal noise. Quantitative estimates for heating and quadrature thermal noise squeezing near and below the transition line of the first parametric instability zone of the oscillator are given. Furthermore, we give an intuitive explanation as to why heating and thermal squeezing occur. For small amplitudes of the parametric pump, the Floquet multipliers are complex conjugates of each other with a constant magnitude. As the pump amplitude is increased past a threshold value in the stable zone near the first parametric instability, the two Floquet multipliers become real and acquire different magnitudes. This creates two different effective dissipation rates (one smaller and the other larger than the real dissipation rate) along the stable manifolds of the first-return Poincaré map. We also show that the statistical average of the input power due to thermal noise is constant and independent of the pump amplitude and frequency. The combination of these effects causes most of the heating and thermal squeezing. Very good agreement between analytical and numerical estimates of the thermal fluctuations is achieved.
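The transition from a complex-conjugate multiplier pair to two real multipliers of different magnitude can be reproduced numerically for a damped Mathieu oscillator, a standard toy model of a parametrically driven oscillator. Parameter values here are illustrative, not the paper's.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Toy model: damped Mathieu oscillator pumped at twice its natural frequency,
#   x'' + 2*gamma*x' + (1 + h*cos(2t)) * x = 0.
# The Floquet multipliers are the eigenvalues of the monodromy matrix,
# obtained by integrating two independent solutions over one pump period.
gamma = 0.05

def multipliers(h):
    def rhs(t, y):
        return [y[1], -2.0 * gamma * y[1] - (1.0 + h * np.cos(2.0 * t)) * y[0]]
    cols = []
    for y0 in ([1.0, 0.0], [0.0, 1.0]):
        sol = solve_ivp(rhs, (0.0, np.pi), y0, rtol=1e-10, atol=1e-12)
        cols.append(sol.y[:, -1])          # state after one period T = pi
    return np.linalg.eigvals(np.array(cols).T)

# h = 0: complex-conjugate pair with equal magnitude exp(-gamma*pi).
# h = 0.1 (still stable; instability threshold is near h = 4*gamma): both
# multipliers real, with different magnitudes -- two effective dissipation
# rates, as described in the abstract.
m0, m1 = multipliers(0.0), multipliers(0.1)
print(m0)
print(m1)
```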
Nam, Yoonho; Jang, Jinhee; Park, Sonya Youngju; Choi, Hyun Seok; Jung, So-Lyung; Ahn, Kook-Jin; Kim, Bum-Soo
2018-05-22
To explore the feasibility of using correlation-based time-delay (CTD) maps produced from time-resolved MR angiography (TRMRA) to diagnose perfusion abnormalities in patients suspected to have steno-occlusive lesions in the craniocervical arteries. Twenty-seven patients who were suspected to have steno-occlusive lesions in the craniocervical arteries underwent both TRMRA and brain single-photon emission computed tomography (SPECT). TRMRA was performed on the supra-aortic area after intravenous injection of a 0.03 mmol/kg gadolinium-based contrast agent. Time-to-peak (TTP) maps and CTD maps of the brain were automatically generated from TRMRA data, and their quality was assessed. Detection of perfusion abnormalities was compared between CTD maps and the time-series maximal intensity projection (MIP) images from TRMRA and TTP maps. Correlation coefficients between quantitative changes in SPECT and parametric maps for the abnormal perfusion areas were calculated. The CTD maps were of significantly better quality than the TTP maps (p < 0.01). For perfusion abnormality detection, CTD maps (kappa 0.84, 95% confidence interval [CI] 0.67-1.00) showed better agreement with SPECT than TTP maps (0.66, 0.46-0.85). For perfusion deficit detection, CTD maps showed higher accuracy (85.2%, 95% CI 66.3-95.8) than MIP images (66.7%, 46-83.5), with marginal significance (p = 0.07). In abnormal perfusion areas, correlation coefficients between SPECT and CTD (r = 0.74, 95% CI 0.34-0.91) were higher than those between SPECT and TTP (r = 0.66, 0.20-0.88). CTD maps generated from TRMRA were of high quality and offered good diagnostic performance for detecting perfusion abnormalities associated with steno-occlusive arterial lesions in the craniocervical area. • Generation of perfusion parametric maps from time-resolved MR angiography is clinically useful. • Correlation-based delay maps can be used to detect perfusion abnormalities associated with steno-occlusive craniocervical arteries.
• Estimation of correlation-based delay is robust for low signal-to-noise 4D MR data.
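The correlation-based delay estimate can be illustrated with synthetic time curves. This is an assumed form of the approach (find the lag maximizing correlation between a voxel curve and a reference curve); the authors' implementation details may differ.

```python
import numpy as np

# Synthetic illustration of correlation-based time-delay estimation.
# The bolus shape, timing, and noise level are invented for illustration.
t = np.arange(60.0)  # one frame per second (hypothetical)

def bolus(t, t0):
    """Toy gamma-variate-like contrast bolus arriving at time t0."""
    s = np.clip(t - t0, 0.0, None)
    return s**2 * np.exp(-s / 3.0)

reference = bolus(t, 10.0)   # e.g. a curve from normally perfused tissue
voxel = bolus(t, 14.0) + 0.01 * np.random.default_rng(1).normal(size=t.size)

# Shift the reference over a range of lags; keep the best-correlated lag.
lags = np.arange(-10, 11)
corrs = [np.corrcoef(np.roll(reference, k), voxel)[0, 1] for k in lags]
delay = float(lags[int(np.argmax(corrs))])
print(delay)  # → 4.0: this voxel's bolus arrives ~4 s after the reference
```

Repeating this per voxel yields a CTD map; because the whole curve is used rather than a single peak, the estimate tolerates low SNR better than a TTP map.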
Using Automation to Improve the Flight Software Testing Process
NASA Technical Reports Server (NTRS)
ODonnell, James R., Jr.; Andrews, Stephen F.; Morgenstern, Wendy M.; Bartholomew, Maureen O.; McComas, David C.; Bauer, Frank H. (Technical Monitor)
2001-01-01
One of the critical phases in the development of a spacecraft attitude control system (ACS) is the testing of its flight software. The testing (and test verification) of ACS flight software requires a mix of skills involving software, attitude control, data manipulation, and analysis. The process of analyzing and verifying flight software test results often creates a bottleneck which dictates the speed at which flight software verification can be conducted. In the development of the Microwave Anisotropy Probe (MAP) spacecraft ACS subsystem, an integrated design environment was used that included a MAP high fidelity (HiFi) simulation, a central database of spacecraft parameters, a script language for numeric and string processing, and plotting capability. In this integrated environment, it was possible to automate many of the steps involved in flight software testing, making the entire process more efficient and thorough than on previous missions. In this paper, we will compare the testing process used on MAP to that used on previous missions. The software tools that were developed to automate testing and test verification will be discussed, including the ability to import and process test data, synchronize test data and automatically generate HiFi script files used for test verification, and an automated capability for generating comparison plots. A summary of the perceived benefits of applying these test methods on MAP will be given. Finally, the paper will conclude with a discussion of re-use of the tools and techniques presented, and the ongoing effort to apply them to flight software testing of the Triana spacecraft ACS subsystem.
Indoor Modelling Benchmark for 3D Geometry Extraction
NASA Astrophysics Data System (ADS)
Thomson, C.; Boehm, J.
2014-06-01
A combination of faster, cheaper and more accurate hardware, more sophisticated software, and greater industry acceptance has laid the foundations for increased demand for accurate 3D parametric models of buildings. Pointclouds are currently the data source of choice, with static terrestrial laser scanning the predominant tool for large, dense volume measurement. The current importance of pointclouds as the primary source of real-world representation is endorsed by CAD software vendors' acquisitions of pointcloud engines in 2011. Both the capture and modelling of indoor environments require considerable operator time (and therefore cost). Automation is seen as a way to reduce the workload of the user, and some commercial packages have appeared that provide automation to some degree. In the data capture phase, advances in indoor mobile mapping systems are speeding up the process, albeit currently with a reduction in accuracy. This paper therefore presents freely accessible pointcloud datasets of two typical areas of a building, each captured with two different methods and each accompanied by an accurate, wholly manually created model. These datasets are provided as a benchmark for the research community to gauge the performance and improvements of various techniques for indoor geometry extraction. With this in mind, non-proprietary, interoperable formats are provided, such as E57 for the scans and IFC for the reference model. The datasets can be found at: http://indoor-bench.github.io/indoor-bench.
Koush, Yury; Ashburner, John; Prilepin, Evgeny; Sladky, Ronald; Zeidman, Peter; Bibikov, Sergei; Scharnowski, Frank; Nikonorov, Artem; De Ville, Dimitri Van
2017-08-01
Neurofeedback based on real-time functional magnetic resonance imaging (rt-fMRI) is a novel and rapidly developing research field. It allows for training of voluntary control over localized brain activity and connectivity and has demonstrated promising clinical applications. Because of the rapid technical developments of MRI techniques and the availability of high-performance computing, new methodological advances in rt-fMRI neurofeedback become possible. Here we outline the core components of a novel open-source neurofeedback framework, termed Open NeuroFeedback Training (OpenNFT), which efficiently integrates these new developments. This framework is implemented using Python and Matlab source code to allow for diverse functionality, high modularity, and rapid extendibility of the software depending on the user's needs. In addition, it provides an easy interface to the functionality of Statistical Parametric Mapping (SPM), which is also open-source and one of the most widely used fMRI data analysis software packages. We demonstrate the functionality of our new framework by describing case studies that include neurofeedback protocols based on brain activity levels, effective connectivity models, and pattern classification approaches. This open-source initiative provides a suitable framework for actively engaging in the development of novel neurofeedback approaches, so that local methodological developments can easily be made accessible to a wider range of users.
Andersen, Flemming; Watanabe, Hideaki; Bjarkam, Carsten; Danielsen, Erik H; Cumming, Paul
2005-07-15
The analysis of physiological processes in the brain by positron emission tomography (PET) is facilitated when images are spatially normalized to a standard coordinate system. Thus, PET activation studies of the human brain frequently employ the common stereotaxic coordinates of Talairach. We have developed an analogous stereotaxic coordinate system for the brain of the Göttingen miniature pig, based on automatic co-registration of magnetic resonance (MR) images obtained in 22 male pigs. The origin of the pig brain stereotaxic space (0, 0, 0) was arbitrarily placed at the centroid of the pineal gland as identified on the average MRI template. The orthogonal planes were imposed using the line between stereotaxic zero and the optic chiasm. A series of mean MR images in the coronal, sagittal and horizontal planes were generated. To test the utility of the common coordinate system for functional imaging studies of minipig brain, we calculated cerebral blood flow (CBF) maps from normal minipigs and from minipigs with a syndrome of parkinsonism induced by 1-methyl-4-phenyl-1,2,3,6-tetrahydropyridine (MPTP) poisoning. These maps were transformed from native space into the common stereotaxic space. After global normalization of these maps, an undirected search for differences between the groups was performed using statistical parametric mapping. Using this method, we detected a statistically significant focal increase in CBF in the left cerebellum of the MPTP-lesioned group. We expect the present approach to be of general use in the statistical parametric mapping of CBF and other physiological parameters in the living pig brain.
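The voxel-wise group comparison underlying statistical parametric mapping can be sketched on toy data. This is illustrative only: a real SPM analysis also smooths the images, applies global normalization, and corrects for multiple comparisons across the volume.

```python
import numpy as np
from scipy import stats

# Toy sketch of a voxel-wise group comparison of the kind SPM performs.
# The data, group sizes, and implanted effect are invented.
rng = np.random.default_rng(0)
shape = (8, 8, 8)
control = rng.normal(50.0, 5.0, size=(11,) + shape)   # e.g. CBF maps
lesioned = rng.normal(50.0, 5.0, size=(11,) + shape)
lesioned[:, 2, 3, 4] += 20.0   # implant a focal CBF increase at one voxel

# Two-sample t-test at every voxel of the common stereotaxic space.
tmap, pmap = stats.ttest_ind(lesioned, control, axis=0)
peak = np.unravel_index(np.argmax(tmap), shape)
print(peak, round(float(tmap[peak]), 1))  # peak lands on the implanted focus
```

Spatial normalization to a common template is what makes this voxel-by-voxel correspondence across animals meaningful.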
Instructional Strategies to Promote Student Strategic Thinking When Using SolidWorks
ERIC Educational Resources Information Center
Toto, Roxanne; Colledge, Thomas; Frederick, David; Pung, Wik Hung
2014-01-01
Reflective of current trends in industry, engineering design professionals are expected to have knowledge of 3D modeling software. Responding to this need, engineering curricula seek to effectively prepare students for the workforce by requiring instruction in the use of 3D parametric solid modeling. Recent literature contains many examples that…
Martí-Bonmatí, Luis; Lull, Juan José; García-Martí, Gracián; Aguilar, Eduardo J; Moratal-Pérez, David; Poyatos, Cecilio; Robles, Montserrat; Sanjuán, Julio
2007-08-01
To prospectively evaluate whether functional magnetic resonance (MR) imaging abnormalities associated with auditory emotional stimuli coexist with focal brain reductions in schizophrenic patients with chronic auditory hallucinations. Institutional review board approval was obtained and all participants gave written informed consent. Twenty-one right-handed male patients with schizophrenia and persistent hallucinations (onset at a mean age of 23 years +/- 10, with a mean illness duration of 15 years +/- 8) and 10 matched healthy participants (same ethnic group [white], age, and education level [secondary school]) were studied. Functional echo-planar T2*-weighted (after both emotional and neutral auditory stimulation) and morphometric three-dimensional gradient-recalled echo T1-weighted MR images were analyzed using Statistical Parametric Mapping (SPM2) software. Brain activation images were extracted by contrasting responses to emotional versus neutral words. Anatomic differences were explored by optimized voxel-based morphometry. The functional and morphometric MR images were overlaid to depict voxels reported as statistically significant by both techniques. A coincidence map was generated by multiplying the emotionally subtracted functional MR map and the volume-decrement morphometric map. Statistical analysis used the general linear model, Student t tests, random effects analyses, and analysis of covariance, with correction for multiple comparisons following the false discovery rate method. Large coinciding brain clusters (P < .005) were found in the left and right middle temporal and superior temporal gyri. Smaller coinciding clusters were found in the left posterior and right anterior cingulate gyri, left inferior frontal gyrus, and middle occipital gyrus. The middle and superior temporal and the cingulate gyri are closely related to the abnormal neural network involved in the auditory emotional dysfunction seen in schizophrenic patients.
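The coincidence-map construction reduces to a voxel-wise product of thresholded maps, as in this toy sketch. The arrays and threshold are invented; the study used SPM2 statistical maps with FDR-corrected thresholds.

```python
import numpy as np

# Toy sketch of a coincidence map: binarize the thresholded functional
# activation map and the morphometric volume-decrement map, then multiply,
# so only voxels flagged by BOTH techniques survive.
rng = np.random.default_rng(3)
fmri_t = rng.normal(size=(4, 4, 4))      # functional contrast statistics
vbm_t = rng.normal(size=(4, 4, 4))       # gray-matter decrement statistics
fmri_t[1, 1, 1] = vbm_t[1, 1, 1] = 5.0   # a voxel abnormal on both maps

thr = 3.0
coincidence = (fmri_t > thr).astype(int) * (vbm_t > thr).astype(int)
print(np.argwhere(coincidence == 1))     # only the jointly abnormal voxel
```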
Optical Imaging and Radiometric Modeling and Simulation
NASA Technical Reports Server (NTRS)
Ha, Kong Q.; Fitzmaurice, Michael W.; Moiser, Gary E.; Howard, Joseph M.; Le, Chi M.
2010-01-01
OPTOOL software is a general-purpose optical systems analysis tool that was developed to offer a solution to problems associated with computational programs written for the James Webb Space Telescope optical system. It integrates existing routines into coherent processes, and provides a structure with reusable capabilities that allow additional processes to be quickly developed and integrated. It has an extensive graphical user interface, which makes the tool more intuitive and friendly. OPTOOL is implemented in MATLAB with a Fourier optics-based approach for point spread function (PSF) calculations. It features parametric and Monte Carlo simulation capabilities, and uses a direct integration calculation to permit high spatial sampling of the PSF. Exit pupil optical path difference (OPD) maps can be generated using combinations of Zernike polynomials or shaped power spectral densities. The graphical user interface allows rapid creation of arbitrary pupil geometries, and entry of all other modeling parameters to support basic imaging and radiometric analyses. OPTOOL provides the capability to generate wavefront-error (WFE) maps for arbitrary grid sizes. These maps are 2D arrays containing digitally sampled versions of functions ranging from Zernike polynomials to combinations of sinusoidal wave functions in 2D, to functions generated from a spatial frequency power spectral distribution (PSD). It also can generate optical transfer functions (OTFs), which are incorporated into the PSF calculation. The user can specify radiometrics for the target and sky background, and key performance parameters for the instrument's focal plane array (FPA). This radiometric and detector model setup is fairly extensive, and includes parameters such as zodiacal background, thermal emission noise, read noise, and dark current. The setup also includes the target spectral energy distribution as a function of wavelength for polychromatic sources, detector pixel size, and the FPA's charge diffusion modulation transfer function (MTF).
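The Fourier-optics PSF calculation at the heart of such a tool can be sketched in a few lines. The pupil geometry, wavelength, and OPD map below are assumptions; OPTOOL's direct-integration calculation would differ in sampling.

```python
import numpy as np

# Minimal Fourier-optics PSF sketch: circular pupil, a Zernike-defocus-shaped
# OPD map applied as a phase error, and |FFT|^2 giving the PSF.
n = 256
y, x = np.mgrid[-n // 2:n // 2, -n // 2:n // 2] / (n / 4.0)
r2 = x**2 + y**2
pupil = (r2 <= 1.0).astype(float)

wavelength = 1e-6                       # 1 micron (assumed)
opd = 50e-9 * (2.0 * r2 - 1.0) * pupil  # 50 nm defocus-shaped OPD (assumed)

field = pupil * np.exp(2j * np.pi * opd / wavelength)
psf = np.abs(np.fft.fft2(field)) ** 2
psf0 = np.abs(np.fft.fft2(pupil)) ** 2  # unaberrated reference

# By Parseval's theorem both PSFs carry the same total energy, so the
# Strehl ratio is simply the ratio of peak intensities.
strehl = psf.max() / psf0.max()
print(round(strehl, 3))  # slightly below 1 for this mild aberration
```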
Source Lines Counter (SLiC) Version 4.0
NASA Technical Reports Server (NTRS)
Monson, Erik W.; Smith, Kevin A.; Newport, Brian J.; Gostelow, Roli D.; Hihn, Jairus M.; Kandt, Ronald K.
2011-01-01
Source Lines Counter (SLiC) is a software utility designed to measure software source code size using logical source statements and other common measures for 22 of the programming languages commonly used at NASA and in the aerospace industry. Such metrics can be used in a wide variety of applications, from parametric cost estimation to software defect analysis. SLiC has a variety of unique features such as automatic code search, automatic file detection, hierarchical directory totals, and spreadsheet-compatible output. SLiC was written for extensibility; new programming language support can be added with minimal effort in a short amount of time. SLiC runs on a variety of platforms including UNIX, Windows, and Mac OS X. Its straightforward command-line interface allows for customization and incorporation into the software build process for tracking development metrics.
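Logical source-statement counting can be illustrated with a toy counter for a C-like language; SLiC's actual per-language rules are more elaborate than this sketch.

```python
# Toy sketch of logical-line counting (not SLiC's actual rules): skip blank
# and comment-only lines, then count semicolon-terminated statements.
def count_logical_lines(source: str) -> int:
    in_block_comment = False
    logical = 0
    for line in source.splitlines():
        line = line.strip()
        if in_block_comment:
            if "*/" not in line:
                continue
            in_block_comment = False
            line = line.split("*/", 1)[1].strip()
        if not line or line.startswith("//"):
            continue
        if line.startswith("/*"):        # comment-only line (toy rule)
            if "*/" not in line:
                in_block_comment = True
            continue
        logical += line.count(";")
    return logical

code = """
int main(void) {
    /* a block
       comment */
    int x = 1;  // declaration
    return x;
}
"""
print(count_logical_lines(code))  # → 2
```

Counting statements rather than physical lines is what makes the metric comparable across formatting styles, the property cost-estimation models rely on.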
NASA Astrophysics Data System (ADS)
Wilcox, T.
2016-12-01
How quickly can students (and educators) get started using a "ready to fly" UAS and popular publicly available photogrammetric mapping software for student research at the undergraduate level? This poster presentation focuses on the challenges of starting up your own drone-mapping program for undergraduate research in a compressed timescale of three months. Particular focus will be given to learning the operation of the platforms, hardware and software interface challenges, and using these electronic systems in real-world field settings that pose a range of physical challenges to both operators and equipment. We will be using a combination of the popular DJI Phantom UAS and Pix4D mapping software to investigate mass wasting processes and potential hazards present in public lands popular with recreational users. Projects are aimed at characterizing active geological hazards that operate on short timescales and may include gully headwall erosion in Flaming Geyser State Park and potential landslide instability within Capital State Forest, both in the Puget Sound region of Washington State.
Parametrically Optimized Carbon Nanotube-Coated Cold Cathode Spindt Arrays
Yuan, Xuesong; Cole, Matthew T.; Zhang, Yu; Wu, Jianqiang; Milne, William I.; Yan, Yang
2017-01-01
Here, we investigate, through parametrically optimized macroscale simulations, the field electron emission from arrays of carbon nanotube (CNT)-coated Spindts towards the development of an emerging class of novel vacuum electron devices. The present study builds on empirical data gleaned from our recent experimental findings on the room temperature electron emission from large area CNT electron sources. We determine the field emission current of the present microstructures directly using particle in cell (PIC) software and present a new CNT cold cathode array variant which has been geometrically optimized to provide maximal emission current density, with current densities of up to 11.5 A/cm2 at low operational electric fields of 5.0 V/μm. PMID:28336845
Conversion of Component-Based Point Definition to VSP Model and Higher Order Meshing
NASA Technical Reports Server (NTRS)
Ordaz, Irian
2011-01-01
Vehicle Sketch Pad (VSP) has become a powerful conceptual and parametric geometry tool with numerous export capabilities for third-party analysis codes as well as robust surface meshing capabilities for computational fluid dynamics (CFD) analysis. However, a capability gap currently exists for reconstructing a fully parametric VSP model of a geometry generated by third-party software. A computer code called GEO2VSP has been developed to close this gap and to allow the integration of VSP into a closed-loop geometry design process with other third-party design tools. Furthermore, the automated CFD surface meshing capabilities of VSP are demonstrated for component-based point definition geometries in a conceptual analysis and design framework.
BROËT, PHILIPPE; TSODIKOV, ALEXANDER; DE RYCKE, YANN; MOREAU, THIERRY
2010-01-01
This paper presents two-sample statistics suited for testing equality of survival functions against improper semi-parametric accelerated failure time alternatives. These tests are designed for comparing either the short- or the long-term effect of a prognostic factor, or both. These statistics are obtained as partial likelihood score statistics from a time-dependent Cox model. As a consequence, the proposed tests can be very easily implemented using widely available software. A breast cancer clinical trial is presented as an example to demonstrate the utility of the proposed tests. PMID:15293627
Organizing Space Shuttle parametric data for maintainability
NASA Technical Reports Server (NTRS)
Angier, R. C.
1983-01-01
A model of organization and management of Space Shuttle data is proposed. Shuttle avionics software is parametrically altered by a reconfiguration process for each flight. As the flight rate approaches an operational level, current methods of data management would become increasingly complex. An alternative method is introduced, using modularized standard data, and its implications for data collection, integration, validation, and reconfiguration processes are explored. Information modules are cataloged for later use, and may be combined in several levels for maintenance. For each flight, information modules can then be selected from the catalog at a high level. These concepts take advantage of the reusability of Space Shuttle information to reduce the cost of reconfiguration as flight experience increases.
Bim Automation: Advanced Modeling Generative Process for Complex Structures
NASA Astrophysics Data System (ADS)
Banfi, F.; Fai, S.; Brumana, R.
2017-08-01
The new paradigm of the complexity of modern and historic structures, which are characterised by complex forms and morphological and typological variables, is one of the greatest challenges for building information modelling (BIM). Generation of complex parametric models requires new scientific knowledge concerning new digital technologies. These elements are helpful for storing a vast quantity of information during the life cycle of buildings (LCB). The latest developments of parametric applications do not provide advanced tools, resulting in time-consuming work for the generation of models. This paper presents a method capable of processing and creating complex parametric Building Information Models (BIM) with Non-Uniform Rational B-Splines (NURBS) at multiple levels of detail (Mixed and Reverse LoD) based on accurate 3D photogrammetric and laser scanning surveys. Complex 3D elements are converted into parametric BIM software and finite element applications (BIM to FEA) using specific exchange formats and new modelling tools. The proposed approach has been applied to different case studies: the BIM of a modern structure for the courtyard of the West Block on Parliament Hill in Ottawa (Ontario) and the BIM of Castel Masegra in Sondrio (Italy), encouraging the dissemination and interaction of scientific results without losing information during the generative process.
Simple heterogeneity parametrization for sea surface temperature and chlorophyll
NASA Astrophysics Data System (ADS)
Skákala, Jozef; Smyth, Timothy J.
2016-06-01
Using satellite maps, this paper offers a comprehensive analysis of chlorophyll and SST heterogeneity in the shelf seas around the southwest of the UK. The heterogeneity scaling follows a simple power law and is consequently parametrized by two parameters. It is shown that in most cases these two parameters vary only relatively little with time. The paper offers a detailed comparison of field heterogeneity between different regions and also determines how much of each region's heterogeneity is preserved in the annual median data. It is demonstrated explicitly how these results can be used to calculate a representative measurement area for in situ networks.
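A two-parameter power-law parametrization of the kind the abstract describes can be fitted by ordinary least squares in log-log space. This is an illustrative sketch only (the synthetic data and the `fit_power_law` helper are not from the paper):

```python
# Sketch: if heterogeneity H scales with window size L as H = a * L**b,
# the two parameters (a, b) follow from a linear fit in log-log space.
import math

def fit_power_law(L, H):
    x = [math.log(v) for v in L]
    y = [math.log(v) for v in H]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    # Ordinary least squares slope (b) and intercept (log a).
    b = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    a = math.exp(my - b * mx)
    return a, b

# Synthetic scaling data generated with a = 0.1, b = 0.5.
L = [1, 2, 4, 8, 16]
H = [0.1 * l ** 0.5 for l in L]
a, b = fit_power_law(L, H)
print(round(a, 3), round(b, 3))  # 0.1 0.5
```

With real satellite fields the points scatter around the power law, so the fit also carries an uncertainty on both parameters.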
Tau-REx: A new look at the retrieval of exoplanetary atmospheres
NASA Astrophysics Data System (ADS)
Waldmann, Ingo
2014-11-01
The field of exoplanetary spectroscopy is as fast-moving as it is new. With an increasing number of space- and ground-based instruments obtaining data on a large set of extrasolar planets, we are indeed entering the era of exoplanetary characterisation. Permanently at the edge of instrument feasibility, it is as important as it is difficult to find the most optimal and objective methodologies for analysing and interpreting current data. This is particularly true for smaller and fainter Earth- and super-Earth-type planets. For low to mid signal-to-noise (SNR) observations, we are prone to two sources of bias: 1) prior selection in the data reduction and analysis; 2) prior constraints on the spectral retrieval. In Waldmann et al. (2013), Morello et al. (2014) and Waldmann (2012, 2014) we have shown a prior-free approach to data analysis based on non-parametric machine learning techniques. Following these approaches, we will present a new take on the spectral retrieval of extrasolar planets. Tau-REx (tau retrieval of exoplanets) is a new line-by-line atmospheric retrieval framework. In the past, the decision on which opacity sources go into an atmospheric model was usually user defined. Manual input can lead to model biases and poor convergence of the atmospheric model to the data. In Tau-REx we have set out to solve this. Through custom-built pattern recognition software, Tau-REx is able to rapidly identify the most likely atmospheric opacities from a large number of possible absorbers/emitters (the ExoMol or HITRAN databases) and non-parametrically constrain the prior space for the Bayesian retrieval. Unlike other (MCMC-based) techniques, Tau-REx is able to fully integrate high-dimensional log-likelihood spaces and to calculate the full Bayesian evidence of the atmospheric models. We achieve this through a combination of Nested Sampling and a high degree of code parallelisation.
This allows for an exact and unbiased Bayesian model selection and a full mapping of potential model-data degeneracies. Together with non-parametric data de-trending of exoplanetary spectra, we can reach an unprecedented level of objectivity in our atmospheric characterisation of these foreign worlds.
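The Bayesian evidence that Tau-REx computes via Nested Sampling can be illustrated with a toy one-parameter model, where the evidence integral is tractable by direct quadrature (this sketch is not Tau-REx; the Gaussian model, data, and prior ranges are invented, and quadrature only works in low dimensions):

```python
# Toy illustration of Bayesian evidence Z = integral of likelihood * prior.
# Nested Sampling generalises this integral to high-dimensional models.
import math

def gaussian_loglike(data, mu, sigma=1.0):
    return sum(-0.5 * ((d - mu) / sigma) ** 2
               - math.log(sigma * math.sqrt(2 * math.pi)) for d in data)

def evidence(data, mu_lo, mu_hi, n=2000):
    # Uniform prior on mu over [mu_lo, mu_hi]; trapezoidal quadrature.
    prior = 1.0 / (mu_hi - mu_lo)
    step = (mu_hi - mu_lo) / n
    total = 0.0
    for i in range(n + 1):
        mu = mu_lo + i * step
        w = 0.5 if i in (0, n) else 1.0
        total += w * math.exp(gaussian_loglike(data, mu)) * prior * step
    return total

data = [0.9, 1.1, 1.0, 0.8, 1.2]
z_narrow = evidence(data, 0.0, 2.0)   # prior that brackets the data tightly
z_wide = evidence(data, -10.0, 10.0)  # same likelihood, much vaguer prior
print(z_narrow > z_wide)  # True: the diffuse prior pays an Occam penalty
```

The ratio of two such evidences is the Bayes factor used for model selection, e.g. between atmospheric models with different opacity sources.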
NASA Astrophysics Data System (ADS)
Pavlis, Terry; Hurtado, Jose; Langford, Richard; Serpa, Laura
2014-05-01
Although many geologists refuse to admit it, it is time to put paper-based geologic mapping into the historical archives and move to the full potential of digital mapping techniques. For our group, flat-map digital geologic mapping is now a routine operation in both research and instruction. Several software options are available, and basic proficiency with the software can be learned in a few hours of instruction and practice. The first practical field GIS software, ArcPad, remains a viable, stable option on Windows-based systems. However, the vendor seems to be moving away from ArcPad in favor of mobile software solutions that are difficult to implement without GIS specialists. Thus, we have pursued a second software option based on the open-source program QGIS. Our QGIS system uses the same shapefile-centric data structure as our ArcPad system, including similar pop-up data entry forms and generic graphics for easy data management in the field. The advantage of QGIS is that the same software runs on virtually all common platforms except iOS, although the Android version remains unstable as of this writing. A third software option we are experimenting with for flat-map-based field work is Fieldmove, a derivative of the 3D-capable program Move developed by Midland Valley. Our initial experiments with Fieldmove are positive, particularly with the new, inexpensive (<300 Euros) Windows tablets. However, the lack of flexibility in data structure makes for cumbersome workflows when trying to interface our existing shapefile-centric data structures to Move. Nonetheless, in spring 2014 we will experiment with full-3D immersion in the field using the full Move software package in combination with ground-based LiDAR and photogrammetry. One new workflow suggested by our initial experiments is that field geologists should consider using photogrammetry software to capture 3D visualizations of key outcrops.
This process is now straightforward in several software packages, and it affords previously unheard-of potential for communicating the complexity of key exposures. For example, in studies of metamorphic structures we often search for days to find "Rosetta Stone" outcrops that display key geometric relationships. While conventional photographs rarely capture the essence of a field exposure, capturing a true 3D representation of the exposure with multiple photos from many orientations can solve this communication problem. As spatial databases evolve, these 3D models should be readily importable into the database.
A case study in programming a quantum annealer for hard operational planning problems
NASA Astrophysics Data System (ADS)
Rieffel, Eleanor G.; Venturelli, Davide; O'Gorman, Bryan; Do, Minh B.; Prystay, Elicia M.; Smelyanskiy, Vadim N.
2015-01-01
We report on a case study in programming an early quantum annealer to attack optimization problems related to operational planning. While a number of studies have looked at the performance of quantum annealers on problems native to their architecture, and others have examined performance of select problems stemming from an application area, ours is one of the first studies of a quantum annealer's performance on parametrized families of hard problems from a practical domain. We explore two different general mappings of planning problems to quadratic unconstrained binary optimization (QUBO) problems, and apply them to two parametrized families of planning problems, navigation-type and scheduling-type. We also examine two more compact, but problem-type specific, mappings to QUBO, one for the navigation-type planning problems and one for the scheduling-type planning problems. We study embedding properties and parameter setting and examine their effect on the efficiency with which the quantum annealer solves these problems. From these results, we derive insights useful for the programming and design of future quantum annealers: problem choice, the mapping used, the properties of the embedding, and the annealing profile all matter, each significantly affecting the performance.
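A toy example of the kind of QUBO mapping the study explores (this is not the paper's actual encoding; the one-hot constraint and brute-force search are invented for illustration): a planning constraint such as "exactly one of these actions is chosen" becomes a quadratic penalty whose minima are exactly the feasible assignments.

```python
# Map the constraint "exactly one of x0..x2 is 1" to a QUBO penalty:
# (x0 + x1 + x2 - 1)^2 = sum_i (-1)*x_i + 2*sum_{i<j} x_i*x_j + 1,
# using x_i^2 = x_i for binary variables (the constant 1 is dropped).
from itertools import product

def qubo_energy(x, Q):
    return sum(Q[i][j] * x[i] * x[j]
               for i in range(len(x)) for j in range(len(x)))

n = 3
Q = [[0.0] * n for _ in range(n)]
for i in range(n):
    Q[i][i] = -1.0            # linear terms on the diagonal
    for j in range(i + 1, n):
        Q[i][j] = 2.0         # pairwise penalty for choosing two at once

# A quantum annealer samples low-energy states; here we brute-force instead.
best = min(product([0, 1], repeat=n), key=lambda x: qubo_energy(x, Q))
print(sum(best))  # 1: every minimiser satisfies the one-hot constraint
```

Real planning mappings compose many such penalties (preconditions, mutexes, goals) and must then be embedded into the annealer's hardware graph, which is where the embedding properties studied in the paper come in.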
NASA Technical Reports Server (NTRS)
Nez, G. (Principal Investigator); Mutter, D.
1977-01-01
The author has identified the following significant results. The project mapped land use/cover classifications from LANDSAT computer compatible tape data and combined those results with other multisource data via computer mapping/compositing techniques to analyze various land use planning/natural resource management problems. Data were analyzed on 1:24,000 scale maps at 1.1 acre resolution. LANDSAT analysis software and linkages with other computer mapping software were developed. Significant results were also achieved in training, communication, and identification of needs for developing the LANDSAT/computer mapping technologies into operational tools for use by decision makers.
Functional mapping of language networks in the normal brain using a word-association task.
Ghosh, Shantanu; Basu, Amrita; Kumaran, Senthil S; Khushu, Subash
2010-08-01
Language functions are known to be affected in diverse neurological conditions, including ischemic stroke, traumatic brain injury, and brain tumors. Because language networks are extensive, interpretation of functional data depends on the task completed during evaluation. The aim was to map the hemodynamic consequences of word association using functional magnetic resonance imaging (fMRI) in normal human subjects. Ten healthy subjects underwent fMRI scanning with a postlexical-access semantic association task versus a lexical processing task. The fMRI protocol involved a T2*-weighted gradient-echo echo-planar imaging (GE-EPI) sequence (TR 4523 ms, TE 64 ms, flip angle 90°) with alternate baseline and activation blocks. A total of 78 scans were taken (interscan interval = 3 s) with a total imaging time of 587 s. Functional data were processed in Statistical Parametric Mapping software (SPM2) with an 8-mm Gaussian kernel, convolving the blood oxygenation level-dependent (BOLD) signal with a hemodynamic response function and estimating effects with the general linear model to generate SPM{t} and SPM{F} maps. Single-subject analysis of the functional data (FWE-corrected, P≤0.001) revealed extensive activation in the frontal lobes, with overlaps among the middle frontal gyrus (MFG) and the superior and inferior frontal gyri. BOLD activity was also found in the medial frontal gyrus, middle occipital gyrus (MOG), anterior fusiform gyrus, superior and inferior parietal lobules, and, to a smaller extent, the thalamus and right anterior cerebellum. Group analysis (FWE-corrected, P≤0.001) revealed neural recruitment of the bilateral lingual gyri, left MFG, bilateral MOG, left superior occipital gyrus, left fusiform gyrus, bilateral thalami, and right cerebellar areas.
Group data analysis revealed a cerebellar-occipital-fusiform-thalamic network centered around bilateral lingual gyri for word association, thereby indicating how these areas facilitate language comprehension by activating a semantic association network of words processed postlexical access. This finding is important when assessing the extent of cognitive damage and/or recovery and can be used for presurgical planning after optimization.
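The block-design GLM analysis described above can be sketched for a single voxel (illustration only: the HRF shape, block length, effect size, and noise level are invented, and SPM's actual implementation differs in many details):

```python
# Schematic single-voxel GLM: convolve a baseline/activation boxcar with a
# simple gamma-shaped HRF, then estimate the effect (beta) and a t-statistic
# by ordinary least squares.
import math, random

random.seed(0)
TR, n_scans = 4.5, 78
# Alternating baseline/activation blocks of 6 scans each.
boxcar = [(i // 6) % 2 for i in range(n_scans)]
# Simple gamma-like HRF sampled at the TR (a stand-in for the canonical HRF).
hrf = [t ** 2 * math.exp(-t) for t in [TR * k / 10 for k in range(10)]]
design = [sum(hrf[k] * boxcar[i - k] for k in range(len(hrf)) if i - k >= 0)
          for i in range(n_scans)]

# Synthetic voxel time series: true effect size 2.0 plus Gaussian noise.
y = [2.0 * d + random.gauss(0, 0.5) for d in design]

# Least-squares fit of y on the (mean-centred) design regressor.
mx = sum(design) / n_scans
my = sum(y) / n_scans
sxx = sum((d - mx) ** 2 for d in design)
beta = sum((d - mx) * (v - my) for d, v in zip(design, y)) / sxx
resid = [v - my - beta * (d - mx) for d, v in zip(design, y)]
sigma2 = sum(r * r for r in resid) / (n_scans - 2)
t_stat = beta / math.sqrt(sigma2 / sxx)
print(beta, t_stat)
```

SPM does this fit at every voxel simultaneously, producing the SPM{t} maps that are then thresholded with a multiple-comparison correction such as FWE.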
Hernández-Martin, Estefania; Marcano, Francisco; Casanova, Oscar; Modroño, Cristian; Plata-Bello, Julio; González-Mora, Jose Luis
2017-01-01
Abstract. Diffuse optical tomography (DOT) measures concentration changes in both oxy- and deoxyhemoglobin, providing three-dimensional images of local brain activations. A pilot study is presented which compares DOT and functional magnetic resonance imaging (fMRI) volumes through the t-maps given by canonical statistical parametric mapping (SPM) processing of both data modalities. The DOT series were processed using a method based on applying a Bayesian filter to the raw DOT data to remove physiological changes, applying a minimum description length index to select the number of singular values that reduce the data dimensionality during image reconstruction, and adapting the DOT volume series to normalized standard space. Statistical analysis is therefore performed with canonical SPM software in the same way as fMRI analysis, accepting DOT volumes as if they were fMRI volumes. The results show the reproducibility and robustness of the method for processing DOT series in group analyses using cognitive paradigms on the prefrontal cortex. Difficulties are considered, such as the fact that scalp–brain distances vary between subjects and that cerebral activations are difficult to reproduce because of the strategies subjects use to solve arithmetic problems. T-images given by fMRI and DOT volume series analyzed in SPM show that, at the functional level, both DOT and fMRI detect the same areas, although DOT provides complementary information to fMRI signals about cerebral activity. PMID:28386575
Erberich, Stephan G; Bhandekar, Manasee; Chervenak, Ann; Kesselman, Carl; Nelson, Marvin D
2007-01-01
Functional MRI is successfully being used in clinical and research applications including preoperative planning, language mapping, and outcome monitoring. However, clinical use of fMRI is less widespread owing to the complexity of its imaging, image workflow, and post-processing, and to a lack of algorithmic standards that hinders result comparability. As a consequence, widespread adoption of fMRI as a clinical tool is low, contributing to community physicians' uncertainty about how to integrate fMRI into practice. In addition, training of physicians in fMRI is in its infancy and requires both clinical and technical understanding. Therefore, many institutions which perform fMRI maintain a team of basic researchers and physicians to run fMRI as a routine imaging tool. In order to provide fMRI as an advanced diagnostic tool to the benefit of a larger patient population, image acquisition and image post-processing must be streamlined, standardized, and made available to institutions that do not have these resources in-house. Here we describe a software architecture, the functional imaging laboratory (funcLAB/G), which addresses (i) standardized image processing using Statistical Parametric Mapping and (ii) its extension to secure sharing and availability for the community using standards-based Grid technology (the Globus Toolkit). funcLAB/G carries the potential to overcome the limitations of fMRI in clinical use and thus makes standardized fMRI available to the broader healthcare enterprise utilizing the Internet and HealthGrid Web Services technology.
Neural correlates of the popular music phenomenon: evidence from functional MRI and PET imaging.
Chen, Qiaozhen; Zhang, Ying; Hou, Haifeng; Du, Fenglei; Wu, Shuang; Chen, Lin; Shen, Yehua; Chao, Fangfang; Chung, June-Key; Zhang, Hong; Tian, Mei
2017-06-01
Music can induce different emotions, but its neural mechanism remains unknown. The aim of this study was to use functional magnetic resonance imaging (fMRI) and positron emission tomography (PET) imaging to map the neural changes induced by highly popular music in healthy volunteers. Blood-oxygen-level-dependent (BOLD) fMRI and monoamine receptor PET imaging with 11C-N-methylspiperone (11C-NMSP) were conducted during the popular music Gangnam Style and the light music A Comme Amour in healthy subjects. PET and fMRI images were analyzed using the Statistical Parametric Mapping software (SPM). Significantly increased fMRI BOLD signals were found in the bilateral superior temporal cortices, left cerebellum, left putamen and right thalamus. Monoamine receptor availability was increased significantly in the left superior temporal gyrus and left putamen, but decreased in the bilateral superior occipital cortices, under Gangnam Style compared with the light music condition. A significant positive correlation was found between 11C-NMSP binding and fMRI BOLD signals in the left temporal cortex. Furthermore, increased 11C-NMSP binding in the left putamen was positively correlated with the mood arousal level score under the Gangnam Style condition. The popular song Gangnam Style can arouse a pleasure experience and a strong emotional response. Our results reveal characteristic patterns of brain activity associated with Gangnam Style and may also provide more general insights into music-induced emotional processing.
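The correlation analysis reported above reduces, in essence, to Pearson's r between two measures across regions or subjects; a minimal sketch with invented per-region values (not the study's data):

```python
# Pearson correlation between PET receptor binding and fMRI BOLD change.
import math

def pearson_r(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    num = sum((a - mx) * (b - my) for a, b in zip(x, y))
    den = math.sqrt(sum((a - mx) ** 2 for a in x) *
                    sum((b - my) ** 2 for b in y))
    return num / den

# Hypothetical per-region values (arbitrary units), for illustration only.
binding = [1.2, 1.5, 1.1, 1.8, 2.0, 1.4]
bold = [0.8, 1.1, 0.7, 1.3, 1.6, 1.0]
print(round(pearson_r(binding, bold), 2))
```

In practice such correlations are computed voxel-wise or per region of interest within SPM, with the significance threshold corrected for multiple comparisons.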
Nissan, Noam; Furman-Haran, Edna; Feinberg-Shapiro, Myra; Grobgeld, Dov; Eyal, Erez; Zehavi, Tania; Degani, Hadassa
2014-12-15
Breast cancer is the most common cause of cancer among women worldwide. Early detection of breast cancer has a critical role in improving the quality of life and survival of breast cancer patients. In this paper a new approach for the detection of breast cancer is described, based on tracking the mammary architectural elements using diffusion tensor imaging (DTI). The paper focuses on the scanning protocols and image processing algorithms and software that were designed to fit the diffusion properties of the mammary fibroglandular tissue and its changes during malignant transformation. The final output yields pixel by pixel vector maps that track the architecture of the entire mammary ductal glandular trees and parametric maps of the diffusion tensor coefficients and anisotropy indices. The efficiency of the method to detect breast cancer was tested by scanning women volunteers including 68 patients with breast cancer confirmed by histopathology findings. Regions with cancer cells exhibited a marked reduction in the diffusion coefficients and in the maximal anisotropy index as compared to the normal breast tissue, providing an intrinsic contrast for delineating the boundaries of malignant growth. Overall, the sensitivity of the DTI parameters to detect breast cancer was found to be high, particularly in dense breasts, and comparable to the current standard breast MRI method that requires injection of a contrast agent. Thus, this method offers a completely non-invasive, safe and sensitive tool for breast cancer detection.
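One of the diffusion-anisotropy indices this method maps can be sketched with the standard fractional anisotropy (FA) formula applied to the three diffusion-tensor eigenvalues (the eigenvalue sets below are invented for illustration, not taken from the study):

```python
# Fractional anisotropy from the three eigenvalues of the diffusion tensor.
import math

def fractional_anisotropy(l1, l2, l3):
    mean = (l1 + l2 + l3) / 3.0
    num = math.sqrt((l1 - mean) ** 2 + (l2 - mean) ** 2 + (l3 - mean) ** 2)
    den = math.sqrt(l1 ** 2 + l2 ** 2 + l3 ** 2)
    return math.sqrt(1.5) * num / den

# Anisotropic, duct-like diffusion vs. nearly isotropic diffusion
# (eigenvalues in units of 1e-3 mm^2/s, invented values).
fa_ductal = fractional_anisotropy(2.0, 0.8, 0.7)
fa_isotropic = fractional_anisotropy(1.0, 0.95, 0.9)
print(fa_ductal > fa_isotropic)  # True: FA drops as diffusion becomes isotropic
```

A reduction in both the diffusion coefficients and the maximal anisotropy index, as reported for regions with cancer cells, corresponds to lower eigenvalues and a lower FA-like index in this picture.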
Development and Analysis of a Bi-Directional Tidal Turbine
2012-03-01
In the present study, the commercial CFD software ANSYS CFX was utilized to build a turbine map for a bi-directional turbine created for this purpose. The basic turbine map was developed for a 25-blade bi-axial turbine. The simulation definition was created using ANSYS CFX-Pre.
An open source multivariate framework for n-tissue segmentation with evaluation on public data.
Avants, Brian B; Tustison, Nicholas J; Wu, Jue; Cook, Philip A; Gee, James C
2011-12-01
We introduce Atropos, an ITK-based multivariate n-class open source segmentation algorithm distributed with ANTs ( http://www.picsl.upenn.edu/ANTs). The Bayesian formulation of the segmentation problem is solved using the Expectation Maximization (EM) algorithm with the modeling of the class intensities based on either parametric or non-parametric finite mixtures. Atropos is capable of incorporating spatial prior probability maps (sparse), prior label maps and/or Markov Random Field (MRF) modeling. Atropos has also been efficiently implemented to handle large quantities of possible labelings (in the experimental section, we use up to 69 classes) with a minimal memory footprint. This work describes the technical and implementation aspects of Atropos and evaluates its performance on two different ground-truth datasets. First, we use the BrainWeb dataset from Montreal Neurological Institute to evaluate three-tissue segmentation performance via (1) K-means segmentation without use of template data; (2) MRF segmentation with initialization by prior probability maps derived from a group template; (3) Prior-based segmentation with use of spatial prior probability maps derived from a group template. We also evaluate Atropos performance by using spatial priors to drive a 69-class EM segmentation problem derived from the Hammers atlas from University College London. These evaluation studies, combined with illustrative examples that exercise Atropos options, demonstrate both performance and wide applicability of this new platform-independent open source segmentation tool.
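A toy one-dimensional analogue of the parametric (finite Gaussian mixture) EM modelling described above (illustration only; Atropos itself is an ITK-based, image-domain implementation with spatial priors and MRF regularisation):

```python
# Two-class EM for a 1-D Gaussian mixture: alternate E-step (responsibilities)
# and M-step (weights, means, standard deviations) until the classes separate.
import math, random

random.seed(1)
data = [random.gauss(0.0, 0.5) for _ in range(200)] + \
       [random.gauss(3.0, 0.5) for _ in range(200)]

def norm_pdf(x, mu, sd):
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

mu = [min(data), max(data)]          # crude initialisation
sd, w = [1.0, 1.0], [0.5, 0.5]
for _ in range(30):                  # EM iterations
    # E-step: per-sample class responsibilities.
    resp = []
    for x in data:
        p = [w[k] * norm_pdf(x, mu[k], sd[k]) for k in (0, 1)]
        s = sum(p)
        resp.append([pk / s for pk in p])
    # M-step: update mixture weights, means, and standard deviations.
    for k in (0, 1):
        nk = sum(r[k] for r in resp)
        w[k] = nk / len(data)
        mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
        sd[k] = math.sqrt(sum(r[k] * (x - mu[k]) ** 2
                              for r, x in zip(resp, data)) / nk)
print(round(mu[0], 1), round(mu[1], 1))
```

In image segmentation each class plays the role of a tissue, and spatial prior probability maps or an MRF term bias the responsibilities toward spatially coherent labelings.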
NASA Astrophysics Data System (ADS)
Marchand, Paul J.; Bouwens, Arno; Shamaei, Vincent; Nguyen, David; Extermann, Jerome; Bolmont, Tristan; Lasser, Theo
2016-03-01
Magnetic Resonance Imaging has revolutionised our understanding of brain function through its ability to image human cerebral structures non-invasively over the entire brain. By exploiting the different magnetic properties of oxygenated and deoxygenated blood, functional MRI can indirectly map areas undergoing neural activation. Alongside the development of fMRI, powerful statistical tools have been developed in an effort to shed light on the neural pathways involved in the processing of sensory and cognitive information. In spite of the major improvements made in fMRI technology, the obtained spatial resolution of hundreds of microns prevents MRI from resolving and monitoring processes occurring at the cellular level. In this regard, Optical Coherence Microscopy (OCM) is an ideal instrument, as it can image at high spatio-temporal resolution. Moreover, by measuring the mean and the width of the Doppler spectra of light scattered by moving particles, OCM allows extraction of the axial and lateral velocity components of red blood cells. The ability to assess total blood velocity quantitatively, as opposed to classical axial-velocity Doppler OCM, is of paramount importance in brain imaging, as a large proportion of the cortical vasculature is oriented perpendicularly to the optical axis. Here we combine quantitative blood flow imaging with extended-focus Optical Coherence Microscopy and Statistical Parametric Mapping tools to generate maps of stimulus-evoked cortical hemodynamics at the capillary level.
Incorporating Manual and Autonomous Code Generation
NASA Technical Reports Server (NTRS)
McComas, David
1998-01-01
Code can be generated manually or with code-generating software tools, but how do you integrate the two? This article looks at a design methodology that combines object-oriented design with autonomous code generation for attitude control flight software. Recent improvements in space flight computers are allowing software engineers to spend more time engineering the applications software. The application developed was the attitude control flight software for an astronomical satellite called the Microwave Anisotropy Probe (MAP). The MAP flight system is being designed, developed, and integrated at NASA's Goddard Space Flight Center. The MAP controls engineers are using Integrated Systems Inc.'s MATRIXx for their controls analysis. In addition to providing a graphical analysis environment, MATRIXx includes an autonomous code generation facility called AutoCode. This article examines the forces that shaped the final design and describes three highlights of the design process: (1) defining the interface between manual and autonomously generated code; (2) applying object-oriented design to the manual flight code; (3) implementing the object-oriented design in C.
Edison, Paul; Brooks, David J; Turkheimer, Federico E; Archer, Hilary A; Hinz, Rainer
2009-11-01
Pittsburgh compound B, or [11C]PIB, is an amyloid imaging agent which shows a clear differentiation between subjects with Alzheimer's disease (AD) and controls. However, the observed signal difference in other forms of dementia, such as dementia with Lewy bodies (DLB), is smaller, and mild cognitively impaired (MCI) subjects and some healthy elderly normals may show intermediate levels of [11C]PIB binding. The cerebellum, a commonly used reference region for non-specific tracer uptake in [11C]PIB studies in AD, may not be valid in prion disorders or monogenic forms of AD. The aims of this work were to: (1) compare methods for generating parametric maps of [11C]PIB retention in tissue using a plasma input function with respect to their ability to discriminate between AD subjects and controls; and (2) estimate the test-retest reproducibility in AD subjects. Twelve AD subjects (5 of whom underwent a repeat scan within 6 weeks) and 10 control subjects had 90-minute [11C]PIB dynamic PET scans, and arterial plasma input functions were measured. Parametric maps were generated with graphical analysis of reversible binding (Logan plot), irreversible binding (Patlak plot), and spectral analysis. Between-group differentiation was calculated using Student's t-test, and comparisons between different methods were made using p values. Reproducibility was assessed by intraclass correlation coefficients (ICC). We found that the 75-minute value of the impulse response function showed the best group differentiation and had a higher ICC than the volume of distribution maps generated from Logan and spectral analysis. Patlak analysis of [11C]PIB binding was the least reproducible.
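The Logan graphical analysis mentioned above can be sketched on synthetic one-tissue-compartment data (the rate constants and plasma input below are invented for illustration; real analyses use measured time-activity curves):

```python
# Logan plot: plotting int(Ct)/Ct against int(Cp)/Ct becomes linear at late
# times, with slope equal to the total volume of distribution VT = K1/k2
# for a one-tissue compartment model.
import math

K1, k2 = 0.1, 0.2            # hypothetical rate constants (1/min); VT = 0.5
dt, T = 0.01, 90.0
ct, int_ct, int_cp = 0.0, 0.0, 0.0
points = []
for i in range(int(T / dt)):
    t = i * dt
    cp = math.exp(-0.05 * t)           # synthetic plasma input function
    ct += (K1 * cp - k2 * ct) * dt     # one-tissue compartment, Euler step
    int_cp += cp * dt
    int_ct += ct * dt
    if t > 40.0:                       # keep only the late, linear phase
        points.append((int_cp / ct, int_ct / ct))

# Least-squares slope of the late-phase Logan plot.
mx = sum(p[0] for p in points) / len(points)
my = sum(p[1] for p in points) / len(points)
slope = sum((x - mx) * (y - my) for x, y in points) / \
        sum((x - mx) ** 2 for x, _ in points)
print(round(slope, 2))  # close to VT = K1/k2 = 0.5
```

Applied voxel-by-voxel, this slope yields the volume-of-distribution parametric maps the study compares against Patlak and spectral-analysis maps.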
Brain serotonin transporter density and aggression in abstinent methamphetamine abusers.
Sekine, Yoshimoto; Ouchi, Yasuomi; Takei, Nori; Yoshikawa, Etsuji; Nakamura, Kazuhiko; Futatsubashi, Masami; Okada, Hiroyuki; Minabe, Yoshio; Suzuki, Katsuaki; Iwata, Yasuhide; Tsuchiya, Kenji J; Tsukada, Hideo; Iyo, Masaomi; Mori, Norio
2006-01-01
In animals, methamphetamine is known to have a neurotoxic effect on serotonin neurons, which have been implicated in the regulation of mood, anxiety, and aggression. It remains unknown whether methamphetamine damages serotonin neurons in humans. To investigate the status of brain serotonin neurons and their possible relationship with clinical characteristics in currently abstinent methamphetamine abusers. Case-control analysis. A hospital research center. Twelve currently abstinent former methamphetamine abusers (5 women and 7 men) and 12 age-, sex-, and education-matched control subjects recruited from the community. The brain regional density of the serotonin transporter, a structural component of serotonin neurons, was estimated using positron emission tomography and trans-1,2,3,5,6,10-beta-hexahydro-6-[4-(methylthio)phenyl]pyrrolo-[2,1-a]isoquinoline ([(11)C](+)McN-5652). Estimates were derived from region-of-interest and statistical parametric mapping methods, followed by within-case analysis using the measures of clinical variables. The duration of methamphetamine use, the magnitude of aggression and depressive symptoms, and changes in serotonin transporter density represented by the [(11)C](+)McN-5652 distribution volume. Methamphetamine abusers showed increased levels of aggression compared with controls. Region-of-interest and statistical parametric mapping analyses revealed that the serotonin transporter density in global brain regions (eg, the midbrain, thalamus, caudate, putamen, cerebral cortex, and cerebellum) was significantly lower in methamphetamine abusers than in control subjects, and this reduction was significantly inversely correlated with the duration of methamphetamine use. Furthermore, statistical parametric mapping analyses indicated that the density in the orbitofrontal, temporal, and anterior cingulate areas was closely associated with the magnitude of aggression in methamphetamine abusers. 
Protracted abuse of methamphetamine may reduce the density of the serotonin transporter in the brain, leading to elevated aggression, even in currently abstinent abusers.
A working environment for digital planetary data processing and mapping using ISIS and GRASS GIS
Frigeri, A.; Hare, T.; Neteler, M.; Coradini, A.; Federico, C.; Orosei, R.
2011-01-01
Since the beginning of planetary exploration, mapping has been fundamental to summarizing observations returned by scientific missions. Sensor-based mapping has been used to highlight specific features of planetary surfaces by means of processing. Interpretative mapping uses instrumental observations to produce thematic maps that summarize the actual data under a specific theme. Geologic maps, for example, are thematic interpretative maps that focus on the representation of materials and processes and their relative timing. The technological advancements of the last 30 years have allowed us to develop specialized systems in which the mapping process can be carried out entirely in the digital domain. The spread of networked computers on a global scale has allowed the rapid propagation of software and digital data, so that every researcher can now access digital mapping facilities from the desktop. Efforts to keep planetary mission data accessible to the scientific community have led to the creation of standardized digital archives that facilitate access to different datasets by software capable of processing these data from the raw level to the map-projected one. Geographic Information Systems (GIS) were developed to optimize the storage, analysis, and retrieval of spatially referenced Earth-based environmental geodata; over the last decade these programs have become popular among the planetary science community, and recent mission data have started to be distributed in formats compatible with these systems. Among all the systems developed for the analysis of planetary and spatially referenced data, we have created a working environment combining two software suites that have similar characteristics in their modular design, development history, distribution policy, and support system.
The first, the Integrated Software for Imagers and Spectrometers (ISIS), developed by the United States Geological Survey, represents the state of the art for processing planetary remote sensing data, from the raw unprocessed state to the map-projected product. The second, the Geographic Resources Analysis Support System (GRASS), is a Geographic Information System developed by an international team of developers and one of the core projects promoted by the Open Source Geospatial Foundation (OSGeo). We have worked on enabling the combined use of these software systems through the set-up of a common user interface, the unification of the cartographic reference system nomenclature, and the minimization of data conversion. Both software packages are distributed under free open-source licenses, as are the source code, scripts, and configuration files presented hereafter. In this paper we describe the work done to merge these working environments into a common one, in which the user benefits from the functionalities of both systems without needing to switch or transfer data from one software suite to the other. We then provide an example of its usage in the handling of planetary data and the crafting of a digital geologic map. © 2010 Elsevier Ltd. All rights reserved.
ABMapper: a suffix array-based tool for multi-location searching and splice-junction mapping.
Lou, Shao-Ke; Ni, Bing; Lo, Leung-Yau; Tsui, Stephen Kwok-Wing; Chan, Ting-Fung; Leung, Kwong-Sak
2011-02-01
Sequencing reads generated by RNA-sequencing (RNA-seq) must first be mapped back to the genome through alignment before they can be further analyzed. Current fast and memory-saving short-read mappers could give us a quick view of the transcriptome. However, they are neither designed for reads that span across splice junctions nor for repetitive reads, which can be mapped to multiple locations in the genome (multi-reads). Here, we describe a new software package: ABMapper, which is specifically designed for exploring all putative locations of reads that are mapped to splice junctions or repetitive in nature. The software is freely available at: http://abmapper.sourceforge.net/. The software is written in C++ and PERL. It runs on all major platforms and operating systems including Windows, Mac OS X and LINUX.
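The exhaustive multi-location lookup described above is exactly what a suffix array supports. As a minimal illustrative sketch (not ABMapper's actual C++ implementation, which additionally handles splice junctions), a naive Python version might look like:

```python
import bisect

def suffix_array(text):
    """Naive O(n^2 log n) suffix array: start indices of suffixes in sorted order."""
    return sorted(range(len(text)), key=lambda i: text[i:])

def locate_all(text, sa, read):
    """Return every genomic position where `read` occurs, including repeats,
    via binary search over the sorted suffixes."""
    keys = [text[i:i + len(read)] for i in sa]  # prefix of each sorted suffix
    lo = bisect.bisect_left(keys, read)
    hi = bisect.bisect_right(keys, read)
    return sorted(sa[lo:hi])

genome = "ACGTACGTAC"
sa = suffix_array(genome)
print(locate_all(genome, sa, "AC"))  # [0, 4, 8]: all copies of the repeat
```

A production mapper would precompute the search structure once and avoid materializing the prefix list, but the principle of reporting all candidate locations of a repetitive read is the same.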
Validation of a Parametric Approach for 3d Fortification Modelling: Application to Scale Models
NASA Astrophysics Data System (ADS)
Jacquot, K.; Chevrier, C.; Halin, G.
2013-02-01
A parametric modelling approach applied to the virtual representation of cultural heritage is a field of research that has been explored for years, since it can address many limitations of digitising tools. For example, essential historical sources for virtual reconstructions of fortifications, such as plans-reliefs, have several shortcomings when they are scanned. To overcome those problems, knowledge-based modelling can be used: knowledge models based on the analysis of the theoretical literature of a specific domain, such as bastioned fortification treatises, can be the cornerstone of the creation of a parametric library of fortification components. Implemented in Grasshopper, these components are manually adjusted to the data available (i.e., 3D surveys of plans-reliefs or scanned maps). Most of the fortification area is now modelled, and the question of accuracy assessment is raised. A specific method is used to evaluate the accuracy of the parametric components. The results of the assessment process will allow us to validate the parametric approach. The automation of the adjustment process can then be planned. The virtual model of the fortification is part of a larger project aimed at valorising and diffusing a very unique cultural heritage item: the collection of plans-reliefs. As such, knowledge models are precious assets when automation and semantic enhancements are considered.
18F-FLT uptake kinetics in head and neck squamous cell carcinoma: A PET imaging study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Liu, Dan, E-mail: dan.liu@oncology.ox.ac.uk; Fenwick, John D.; Chalkidou, Anastasia
2014-04-15
Purpose: To analyze the kinetics of 3'-deoxy-3'-[F-18]-fluorothymidine (18F-FLT) uptake by head and neck squamous cell carcinomas and involved nodes imaged using positron emission tomography (PET). Methods: Two- and three-tissue compartment models were fitted to 12 tumor time-activity curves (TACs) obtained for 6 structures (tumors or involved nodes) imaged in ten dynamic PET studies of 1 h duration, carried out for five patients. The ability of the models to describe the data was assessed using a runs test, the Akaike information criterion (AIC), and leave-one-out cross-validation. To generate parametric maps, the models were also fitted to TACs of individual voxels. Correlations between maps of different parameters were characterized using Pearson's r coefficient; in particular, the phosphorylation rate-constants k3-2tiss and k5 of the two- and three-tissue models were studied alongside the flux parameters KFLT-2tiss and KFLT of these models, and standardized uptake values (SUV). A methodology based on expectation-maximization clustering and the Bayesian information criterion ("EM-BIC clustering") was used to distil the information from noisy parametric images. Results: Fits of two-tissue models 2C3K and 2C4K and three-tissue models 3C5K and 3C6K, comprising three, four, five, and six rate-constants, respectively, pass the runs test for 4, 8, 10, and 11 of 12 tumor TACs. The three-tissue models have lower AIC and cross-validation scores for nine of the 12 tumors. Overall the 3C6K model has the lowest AIC and cross-validation scores, and its fitted parameter values are of the same orders of magnitude as literature estimates. Maps of KFLT and KFLT-2tiss are strongly correlated (r = 0.85) and also correlate closely with SUV maps (r = 0.72 for KFLT-2tiss, 0.64 for KFLT). Phosphorylation rate-constant maps are moderately correlated with flux maps (r = 0.48 for k3-2tiss vs. KFLT-2tiss and r = 0.68 for k5 vs. KFLT); however, neither phosphorylation rate-constant correlates significantly with SUV. EM-BIC clustering reduces the parametric maps to a small number of levels (on average 5.8, 3.5, 3.4, and 1.4 for KFLT-2tiss, KFLT, k3-2tiss, and k5). This large simplification is potentially useful for radiotherapy dose-painting, but demonstrates the high noise in some maps. Statistical simulations show that voxel-level noise degrades TACs generated from the 3C6K model sufficiently that the average AIC score, parameter bias, and total uncertainty of 2C4K model fits are similar to those of 3C6K fits, whereas at the whole-tumor level the scores are lower for 3C6K fits. Conclusions: For the patients studied here, whole-tumor FLT uptake time-courses are represented better overall by a three-tissue than by a two-tissue model. EM-BIC clustering simplifies noisy parametric maps, providing the best description of the underlying information they contain, and is potentially useful for radiotherapy dose-painting. However, the clustering highlights the large degree of noise present in maps of the phosphorylation rate-constants k5 and k3-2tiss, which are conceptually tightly linked to cellular proliferation. Methods must be found to make these maps more robust, either by constraining other model parameters or by modifying dynamic imaging protocols.
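Two quantities central to the analysis above are the net-influx macro-parameter of an irreversible two-tissue compartment model and the least-squares AIC used for model comparison. A hedged sketch of both (standard textbook formulas, not the authors' fitting code; variable names are mine):

```python
import math

def net_influx(K1, k2, k3):
    """Net tracer influx K = K1*k3/(k2+k3) of an irreversible two-tissue
    compartment model (the KFLT-2tiss flux parameter of the abstract)."""
    return K1 * k3 / (k2 + k3)

def aic_ls(rss, n_points, n_params):
    """Akaike information criterion for a least-squares fit:
    AIC = n*ln(RSS/n) + 2k; lower is better."""
    return n_points * math.log(rss / n_points) + 2 * n_params

# A model with more rate-constants must reduce RSS enough to pay its
# 2-per-parameter penalty before the AIC prefers it:
print(aic_ls(2.0, 30, 4) < aic_ls(2.5, 30, 3))
```

This is the same trade-off the study reports: the six-rate-constant 3C6K model wins on whole-tumor TACs, but at noisy voxel level the simpler 2C4K model scores comparably.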
ActionMap: A web-based software that automates loci assignments to framework maps.
Albini, Guillaume; Falque, Matthieu; Joets, Johann
2003-07-01
Genetic linkage computation may be a repetitive and time consuming task, especially when numerous loci are assigned to a framework map. We thus developed ActionMap, a web-based software that automates genetic mapping on a fixed framework map without adding the new markers to the map. Using this tool, hundreds of loci may be automatically assigned to the framework in a single process. ActionMap was initially developed to map numerous ESTs with a small plant mapping population and is limited to inbred lines and backcrosses. ActionMap is highly configurable and consists of Perl and PHP scripts that automate command steps for the MapMaker program. A set of web forms were designed for data import and mapping settings. Results of automatic mapping can be displayed as tables or drawings of maps and may be exported. The user may create personal access-restricted projects to store raw data, settings and mapping results. All data may be edited, updated or deleted. ActionMap may be used either online or downloaded for free (http://moulon.inra.fr/~bioinfo/).
ActionMap: a web-based software that automates loci assignments to framework maps
Albini, Guillaume; Falque, Matthieu; Joets, Johann
2003-01-01
Genetic linkage computation may be a repetitive and time consuming task, especially when numerous loci are assigned to a framework map. We thus developed ActionMap, a web-based software that automates genetic mapping on a fixed framework map without adding the new markers to the map. Using this tool, hundreds of loci may be automatically assigned to the framework in a single process. ActionMap was initially developed to map numerous ESTs with a small plant mapping population and is limited to inbred lines and backcrosses. ActionMap is highly configurable and consists of Perl and PHP scripts that automate command steps for the MapMaker program. A set of web forms were designed for data import and mapping settings. Results of automatic mapping can be displayed as tables or drawings of maps and may be exported. The user may create personal access-restricted projects to store raw data, settings and mapping results. All data may be edited, updated or deleted. ActionMap may be used either online or downloaded for free (http://moulon.inra.fr/~bioinfo/). PMID:12824426
Stress Recovery and Error Estimation for Shell Structures
NASA Technical Reports Server (NTRS)
Yazdani, A. A.; Riggs, H. R.; Tessler, A.
2000-01-01
The Penalized Discrete Least-Squares (PDLS) stress recovery (smoothing) technique developed for two-dimensional linear elliptic problems is adapted here to three-dimensional shell structures. The surfaces are restricted to those which have a 2-D parametric representation, or which can be built up of such surfaces. The proposed strategy involves mapping the finite element results to the 2-D parametric space which describes the geometry, and smoothing is carried out in the parametric space using the PDLS-based Smoothing Element Analysis (SEA). Numerical results for two well-known shell problems are presented to illustrate the performance of SEA/PDLS for these problems. The recovered stresses are used in the Zienkiewicz-Zhu a posteriori error estimator. The estimated errors are used to demonstrate the performance of SEA-recovered stresses in automated adaptive mesh refinement of shell structures. The numerical results are encouraging. Further testing involving more complex, practical structures is necessary.
NASA Astrophysics Data System (ADS)
Boschi, Lapo
2006-10-01
I invert a large set of teleseismic phase-anomaly observations to derive tomographic maps of fundamental-mode surface wave phase velocity, first via ray theory, then accounting for finite-frequency effects through scattering theory, in the far-field approximation and neglecting mode coupling. I make use of a multiple-resolution pixel parametrization which, under the assumption of sufficient data coverage, should be adequate to represent strongly oscillatory Fréchet kernels. The parametrization is finer over North America, a region particularly well covered by the data. For each surface-wave mode for which phase-anomaly observations are available, I derive a wide spectrum of plausible, differently damped solutions; I then conduct a trade-off analysis and select as the optimal solution model the one associated with the point of maximum curvature on the trade-off curve. I repeat this exercise in both theoretical frameworks, finding that the selected scattering and ray theoretical phase-velocity maps are coincident in pattern and differ only slightly in amplitude.
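The "point of maximum curvature on the trade-off curve" used above to pick the damping can be located discretely by fitting a circumcircle through each triple of neighboring points. A sketch (illustrative only; the author's actual corner criterion may differ in detail):

```python
import math

def max_curvature_index(xs, ys):
    """Return the index of the interior point of maximum discrete curvature,
    i.e. the 'corner' of a damping trade-off (L-)curve."""
    best_i, best_k = 1, -1.0
    for i in range(1, len(xs) - 1):
        x0, y0, x1, y1, x2, y2 = xs[i-1], ys[i-1], xs[i], ys[i], xs[i+1], ys[i+1]
        a = math.hypot(x1 - x0, y1 - y0)   # side lengths of the local triangle
        b = math.hypot(x2 - x1, y2 - y1)
        c = math.hypot(x2 - x0, y2 - y0)
        area2 = abs((x1 - x0) * (y2 - y0) - (x2 - x0) * (y1 - y0))
        # curvature = 1/R of the circumscribed circle: k = 4*area/(a*b*c)
        k = 2.0 * area2 / (a * b * c) if a * b * c else 0.0
        if k > best_k:
            best_i, best_k = i, k
    return best_i

# misfit vs. model-norm pairs for differently damped solutions; corner at index 2
print(max_curvature_index([0, 0, 0, 1, 2], [2, 1, 0, 0, 0]))  # 2
```

In practice each (x, y) pair would be the data misfit and model norm of one damped inversion from the spectrum of solutions the abstract describes.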
Harada, Kengo; Saeki, Hiroshi; Matsuya, Eiji; Okita, Izumi
2013-11-01
We carried out differential diagnosis of brain blood flow images obtained using single-photon emission computed tomography (SPECT) for patients with Parkinson's disease (PD) or progressive supranuclear palsy (PSP), using statistical parametric mapping (SPM) after anatomical standardization. We studied two groups and compared brain blood flow SPECT images (N-isopropyl-4-iodoamphetamine [(123)I] hydrochloride injection, 222 MBq i.v.). A total of 27 patients were studied using SPM: 18 with PD and 9 with PSP; the hummingbird sign on MRI ranged from moderate to medium. The decline in cerebral blood flow in the PSP group was more notable in the midbrain, near the region where the hummingbird sign was observable, than in the PD group. The observable differences in midbrain blood flow decline between PSP and PD patients suggest the potential usefulness of this technique for clinical application to differential diagnosis.
NASA Software Cost Estimation Model: An Analogy Based Estimation Model
NASA Technical Reports Server (NTRS)
Hihn, Jairus; Juster, Leora; Menzies, Tim; Mathew, George; Johnson, James
2015-01-01
The cost estimation of software development activities is increasingly critical for large-scale integrated projects such as those at DOD and NASA, especially as the software systems become larger and more complex. As an example, MSL (Mars Science Laboratory), developed at the Jet Propulsion Laboratory, launched with over 2 million lines of code, making it the largest robotic spacecraft ever flown (based on the size of its software). Software development activities are also notorious for their cost growth, with NASA flight software averaging over 50% cost growth. All across the agency, estimators and analysts are increasingly being tasked to develop reliable cost estimates in support of program planning and execution. While there has been extensive work on improving parametric methods, there is very little focus on the use of models based on analogy and clustering algorithms. In this paper we summarize our findings on effort/cost model estimation and model development based on ten years of software effort estimation research using data mining and machine learning methods to develop estimation models based on analogy and clustering. The NASA Software Cost Model's performance is evaluated by comparing it to COCOMO II, linear regression, and k-nearest-neighbor prediction model performance on the same data set.
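Analogy-based estimation of the kind described reduces, at its simplest, to k-nearest-neighbor prediction over past projects. A minimal sketch (the feature set, distance, and data are my assumptions for illustration, not the NASA model itself):

```python
def knn_effort(history, query, k=2):
    """Estimate effort as the mean effort of the k most similar past projects.
    history: list of (feature_vector, effort_months); query: feature_vector."""
    def dist(a, b):
        # Euclidean distance in feature space; real models normalize features
        return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    nearest = sorted(history, key=lambda proj: dist(proj[0], query))[:k]
    return sum(effort for _, effort in nearest) / k

past = [([10, 1.0], 12.0),    # hypothetical [KSLOC, complexity] -> work-months
        ([20, 1.2], 30.0),
        ([200, 2.0], 400.0)]
print(knn_effort(past, [15, 1.1]))  # mean of the two small-project analogues
```

Clustering-based variants group the historical projects first and predict from the query's cluster rather than from raw neighbors, which is the direction the paper's model takes.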
Rastas, Pasi; Calboli, Federico C. F.; Guo, Baocheng; Shikano, Takahito; Merilä, Juha
2016-01-01
High-density linkage maps are important tools for genome biology and evolutionary genetics, quantifying the extent of recombination, linkage disequilibrium, and chromosomal rearrangements across chromosomes, sexes, and populations. They provide one of the best ways to validate and refine de novo genome assemblies, with the power to identify errors in assemblies increasing with marker density. However, assembly of high-density linkage maps is still challenging due to software limitations. We describe Lep-MAP2, software for ultradense genome-wide linkage map construction. Lep-MAP2 can handle various family structures and can account for achiasmatic meiosis to gain linkage map accuracy. Simulations show that Lep-MAP2 outperforms other available mapping software both in computational efficiency and accuracy. When applied to two large F2-generation recombinant crosses between two nine-spined stickleback (Pungitius pungitius) populations, it produced two high-density (∼6 markers/cM) linkage maps containing 18,691 and 20,054 single nucleotide polymorphisms. The two maps showed a high degree of synteny, but female maps were 1.5–2 times longer than male maps in all linkage groups, suggesting genome-wide recombination suppression in males. Comparison with the genome sequence of the three-spined stickleback (Gasterosteus aculeatus) revealed a high degree of interspecific synteny with a low frequency (<5%) of interchromosomal rearrangements. However, a fairly large (ca. 10 Mb) translocation from autosome to sex chromosome was detected in both maps. These results illustrate the utility and novel features of Lep-MAP2 in assembling high-density linkage maps, and their usefulness in revealing evolutionarily interesting properties of genomes, such as strong genome-wide sex bias in recombination rates. PMID:26668116
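Linkage map distances such as the ~6 markers/cM figure above come from converting recombination fractions into map distance. The classical Haldane mapping function is one standard conversion (a textbook formula, not necessarily the one Lep-MAP2 uses internally):

```python
import math

def haldane_cm(r):
    """Haldane mapping function: recombination fraction r (0 <= r < 0.5)
    -> map distance in centimorgans, d = -50 * ln(1 - 2r).
    Assumes no crossover interference."""
    return -50.0 * math.log(1.0 - 2.0 * r)

print(haldane_cm(0.01))  # for small r, distance is close to 100*r cM
```

Because the function diverges as r approaches 0.5, distant marker pairs contribute little ordering information, which is one reason dense marker sets improve map accuracy.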
Building a Snow Data Management System using Open Source Software (and IDL)
NASA Astrophysics Data System (ADS)
Goodale, C. E.; Mattmann, C. A.; Ramirez, P.; Hart, A. F.; Painter, T.; Zimdars, P. A.; Bryant, A.; Brodzik, M.; Skiles, M.; Seidel, F. C.; Rittger, K. E.
2012-12-01
At NASA's Jet Propulsion Laboratory, free and open source software is used every day to support a wide range of projects, from planetary to climate to research and development. In this abstract I will discuss the key role that open source software has played in building a robust science data processing pipeline for snow hydrology research, and how the system is also able to leverage programs written in IDL, making JPL's Snow Data System a hybrid of open source and proprietary software. Main points: the design of the Snow Data System (illustrating how the collection of sub-systems is combined to create a complete data processing pipeline); the challenges of moving from a single algorithm on a laptop to running hundreds of parallel algorithms on a cluster of servers, and the lessons learned regarding code changes, software-license-related challenges, and storage requirements; system evolution (from data archiving, to data processing, to data on a map, to near-real-time products and maps); and a road map for the next 6 months (including how easily we re-used the SnowDS code base to support the Airborne Snow Observatory mission). Software in use and software licenses: IDL, used for pre- and post-processing of data, licensed under a proprietary license held by Exelis; Apache OODT, used for data management and workflow processing, licensed under the Apache License Version 2; GDAL, a geospatial data processing library currently used for data re-projection, licensed under the X/MIT license; GeoServer, a WMS server, licensed under the General Public License Version 2.0; Leaflet.js, a JavaScript web mapping library, licensed under the Berkeley Software Distribution License; Python, used for glue code and miscellaneous data processing support, licensed under the Python Software Foundation License; Perl, a script wrapper for running the SCAG algorithm, licensed under the General Public License Version 3; and PHP, used for front-end web application programming, licensed under the PHP License Version 3.01.
Software For Tie-Point Registration Of SAR Data
NASA Technical Reports Server (NTRS)
Rignot, Eric; Dubois, Pascale; Okonek, Sharon; Van Zyl, Jacob; Burnette, Fred; Borgeaud, Maurice
1995-01-01
The SAR-REG software package registers synthetic-aperture-radar (SAR) image data to a common reference frame based on manual tie-pointing. Image data can be in binary, integer, floating-point, or AIRSAR compressed format. Data can be registered, for example, to a map of soil characteristics, a vegetation map, a digital elevation map, or a SPOT multispectral image, as long as the user can generate a binary image to be used by the tie-pointing routine and the data are available in one of the previously mentioned formats. Written in FORTRAN 77.
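Tie-point registration at its simplest estimates a transform that minimizes the misfit between matched point pairs. A deliberately minimal Python sketch fitting only a translation (SAR-REG itself is FORTRAN 77 and presumably fits a richer warp; the data here are invented):

```python
def fit_translation(src, dst):
    """Least-squares shift between matched tie points: the mean offset.
    src, dst: equal-length lists of (x, y) pairs picked in each image."""
    n = len(src)
    dx = sum(d[0] - s[0] for s, d in zip(src, dst)) / n
    dy = sum(d[1] - s[1] for s, d in zip(src, dst)) / n
    return dx, dy

def apply_shift(points, shift):
    """Map points from the source frame into the reference frame."""
    dx, dy = shift
    return [(x + dx, y + dy) for x, y in points]

tie_src = [(0, 0), (10, 0), (0, 10)]   # manually picked in the SAR image
tie_dst = [(2, 3), (12, 3), (2, 13)]   # same features in the reference image
print(fit_translation(tie_src, tie_dst))  # (2.0, 3.0)
```

Extending the same least-squares idea to affine or polynomial warps only changes the model being fitted, not the tie-point workflow.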
A Fast and Scalable Radiation Hybrid Map Construction and Integration Strategy
Agarwala, Richa; Applegate, David L.; Maglott, Donna; Schuler, Gregory D.; Schäffer, Alejandro A.
2000-01-01
This paper describes a fast and scalable strategy for constructing a radiation hybrid (RH) map from data on different RH panels. The maps on each panel are then integrated to produce a single RH map for the genome. Recurring problems in using maps from several sources are that the maps use different markers, the maps do not place the overlapping markers in the same order, and the objective functions for map quality are incomparable. We use methods from combinatorial optimization to develop a strategy that addresses these issues. We show that by the standard objective functions of obligate chromosome breaks and maximum likelihood, software for the traveling salesman problem produces RH maps with better quality much more quickly than software specifically tailored for RH mapping. We use known algorithms for the longest common subsequence problem as part of our map integration strategy. We demonstrate our methods by reconstructing and integrating maps for markers typed on the Genebridge 4 (GB4) and the Stanford G3 panels publicly available from the RH database. We compare the map quality of our integrated map with published maps for the GB4 and G3 panels by considering whether markers occur in the same order on a map and in DNA sequence contigs submitted to GenBank. We find that all of the maps are inconsistent with the sequence data for at least 50% of the contigs, but our integrated maps are more consistent. The map integration strategy scales not only to multiple RH maps but also to any maps that have comparable criteria for measuring map quality. Our software improves on current technology for RH mapping in computation time and in algorithms for considering a large number of markers. The essential impediments to producing dense high-quality RH maps are data quality and panel size, not computation. PMID:10720576
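The obligate-chromosome-breaks objective that the authors hand to traveling-salesman software counts, for a candidate marker order, the retention-pattern changes between adjacent markers. A toy sketch (the marker names and retention vectors are invented for illustration):

```python
def obligate_breaks(order, retention):
    """Count obligate chromosome breaks implied by a marker order.
    retention: marker -> string over hybrids, '1' = fragment retained,
    '0' = not retained; adjacent markers with differing calls in a hybrid
    imply a break between them."""
    return sum(
        sum(a != b for a, b in zip(retention[m1], retention[m2]))
        for m1, m2 in zip(order, order[1:])
    )

rh = {"A": "1100", "B": "1101", "C": "0011"}
print(obligate_breaks(["A", "B", "C"], rh))  # 4: the better order
print(obligate_breaks(["A", "C", "B"], rh))  # 7: a worse order
```

A TSP solver searches the space of orders for the one minimizing this count, which is why generic TSP software can outperform RH-specific tools on the same objective.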
Automatic Rock Detection and Mapping from HiRISE Imagery
NASA Technical Reports Server (NTRS)
Huertas, Andres; Adams, Douglas S.; Cheng, Yang
2008-01-01
This system includes a C software program and a set of MATLAB software tools for statistical analysis and rock distribution mapping. The major functions include rock detection and rock detection validation. The rock detection code has evolved into a production tool that can be used by engineers and geologists with minor training.
An overview of the Hadoop/MapReduce/HBase framework and its current applications in bioinformatics
2010-01-01
Background Bioinformatics researchers are now confronted with analysis of ultra large-scale data sets, a problem that will only increase at an alarming rate in coming years. Recent developments in open source software, that is, the Hadoop project and associated software, provide a foundation for scaling to petabyte scale data warehouses on Linux clusters, providing fault-tolerant parallelized analysis on such data using a programming style named MapReduce. Description An overview is given of the current usage within the bioinformatics community of Hadoop, a top-level Apache Software Foundation project, and of associated open source software projects. The concepts behind Hadoop and the associated HBase project are defined, and current bioinformatics software that employ Hadoop is described. The focus is on next-generation sequencing, as the leading application area to date. Conclusions Hadoop and the MapReduce programming paradigm already have a substantial base in the bioinformatics community, especially in the field of next-generation sequencing analysis, and such use is increasing. This is due to the cost-effectiveness of Hadoop-based analysis on commodity Linux clusters, and in the cloud via data upload to cloud vendors who have implemented Hadoop/HBase; and due to the effectiveness and ease-of-use of the MapReduce method in parallelization of many data analysis algorithms. PMID:21210976
An overview of the Hadoop/MapReduce/HBase framework and its current applications in bioinformatics.
Taylor, Ronald C
2010-12-21
Bioinformatics researchers are now confronted with analysis of ultra large-scale data sets, a problem that will only increase at an alarming rate in coming years. Recent developments in open source software, that is, the Hadoop project and associated software, provide a foundation for scaling to petabyte scale data warehouses on Linux clusters, providing fault-tolerant parallelized analysis on such data using a programming style named MapReduce. An overview is given of the current usage within the bioinformatics community of Hadoop, a top-level Apache Software Foundation project, and of associated open source software projects. The concepts behind Hadoop and the associated HBase project are defined, and current bioinformatics software that employ Hadoop is described. The focus is on next-generation sequencing, as the leading application area to date. Hadoop and the MapReduce programming paradigm already have a substantial base in the bioinformatics community, especially in the field of next-generation sequencing analysis, and such use is increasing. This is due to the cost-effectiveness of Hadoop-based analysis on commodity Linux clusters, and in the cloud via data upload to cloud vendors who have implemented Hadoop/HBase; and due to the effectiveness and ease-of-use of the MapReduce method in parallelization of many data analysis algorithms.
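The MapReduce style that Hadoop popularized can be mimicked in a few lines of single-machine Python: a map phase emitting key-value pairs, a shuffle grouping pairs by key, and a reduce phase aggregating each group. Hadoop distributes these same three steps across a cluster with fault tolerance; the sketch below only illustrates the programming model, using the classic word count:

```python
from collections import defaultdict
from itertools import chain

def map_phase(record):
    # mapper: emit one (word, 1) pair per token
    return [(word, 1) for word in record.split()]

def shuffle_and_reduce(pairs):
    # shuffle: group values by key; reduce: sum each group's values
    groups = defaultdict(int)
    for key, value in pairs:
        groups[key] += value
    return dict(groups)

records = ["reads map to the genome", "map reads map again"]
counts = shuffle_and_reduce(chain.from_iterable(map_phase(r) for r in records))
print(counts["map"])  # 3
```

In sequencing pipelines the same pattern applies with reads as records and, for example, alignment positions as keys, which is why the paradigm parallelizes so many bioinformatics algorithms naturally.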
KinSNP software for homozygosity mapping of disease genes using SNP microarrays
2010-01-01
Consanguineous families affected with a recessive genetic disease caused by homozygotisation of a mutation offer a unique advantage for positional cloning of rare diseases. Homozygosity mapping of patient genotypes is a powerful technique for the identification of the genomic locus harbouring the causative mutation. This strategy relies on the observation that in these patients a large region spanning the disease locus is also homozygous with high probability. The high marker density in single nucleotide polymorphism (SNP) arrays is extremely advantageous for homozygosity mapping. We present KinSNP, a user-friendly software tool for homozygosity mapping using SNP arrays. The software searches for stretches of SNPs which are homozygous for the same allele in all ascertained sick individuals. User-specified parameters control the number of allowed genotyping 'errors' within homozygous blocks. Candidate disease regions are then reported in a detailed, coloured Excel file, along with genotypes of family members and healthy controls. An interactive genome browser has been included which shows homozygous blocks, individual genotypes, genes and further annotations along the chromosomes, with zooming and scrolling capabilities. The software has been used to identify the location of a mutated gene causing insensitivity to pain in a large Bedouin family. KinSNP is freely available from http://bioinfo.bgu.ac.il/bsu/software/kinSNP. PMID:20846928
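The core scan described above, marking SNPs at which every affected individual is homozygous for the same allele and then finding long runs while tolerating a few genotyping errors, can be sketched as follows (a simplification of the actual tool; the data and parameter names are mine):

```python
def consistent_snps(individuals):
    """Flag each SNP True when every affected individual is homozygous for
    the same allele there (e.g. all 'AA' or all 'BB').
    individuals: one list of genotype strings per person, same SNP order."""
    return [all(g[0] == g[1] for g in snp) and len(set(snp)) == 1
            for snp in zip(*individuals)]

def best_block(flags, max_errors=1):
    """Longest window containing at most max_errors inconsistent SNPs
    (two-pointer sliding window); returns (start, end) SNP indices."""
    best, errors, lo = (0, 0), 0, 0
    for hi, ok in enumerate(flags):
        if not ok:
            errors += 1
        while errors > max_errors:
            if not flags[lo]:
                errors -= 1
            lo += 1
        if hi + 1 - lo > best[1] - best[0]:
            best = (lo, hi + 1)
    return best

sick1 = ["AA", "AA", "AB", "AA", "AA", "BB"]
sick2 = ["AA", "AA", "AA", "AA", "AA", "AA"]
flags = consistent_snps([sick1, sick2])
print(best_block(flags))  # the window tolerates the single discordant SNP
```

A real run would additionally exclude blocks shared with healthy relatives, since those cannot contain the disease locus.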
Analysis of computer-aided techniques for virtual planning in nasoalveolar moulding.
Loeffelbein, D J; Ritschl, L M; Rau, A; Wolff, K-D; Barbarino, M; Pfeifer, S; Schönberger, M; Wintermantel, E
2015-05-01
We compared two methods of planning virtual alveolar moulding as the first step in nasoalveolar moulding to provide the basis for an automated process to fabricate nasoalveolar moulding appliances using computer-aided design and computer-aided manufacturing (CAD/CAM). First, the initial intraoral casts taken from seven newborn babies with complete unilateral cleft lip and palate were digitised. This was repeated for the target models after conventional nasoalveolar moulding had been completed. The initial digital model for each patient was then virtually modified by two different modelling techniques to achieve the corresponding target model: parametric and freeform modelling with the software Geomagic(®). The digitally remodelled casts were quantitatively compared with the actual target model for each patient. The comparison between the two modified models and the target model showed that freeform modelling of the initial cast was successful (mean (SD) deviation, n=7, +0.723 (0.148) to -0.694 (0.157) mm) but needed continuous orientation and was difficult to automate. The results from the parametric modelling (mean (SD) deviation, n=7, +1.168 (0.185) to -1.067 (0.221) mm) were not as good as those from freeform modelling. During parametric modelling, we found some irregularities on the surface, and transverse growth of the maxilla was not accounted for. However, this method seems to be the right one as far as automation is concerned. In addition, an external algorithm must be implemented because the functionality of the commercial software is limited. Copyright © 2015 The British Association of Oral and Maxillofacial Surgeons. Published by Elsevier Ltd. All rights reserved.
Graded-threshold parametric response maps: towards a strategy for adaptive dose painting
NASA Astrophysics Data System (ADS)
Lausch, A.; Jensen, N.; Chen, J.; Lee, T. Y.; Lock, M.; Wong, E.
2014-03-01
Purpose: To modify the single-threshold parametric response map (ST-PRM) method for predicting treatment outcomes in order to facilitate its use for guidance of adaptive dose painting in intensity-modulated radiotherapy. Methods: Multiple graded thresholds were used to extend the ST-PRM method (Nat. Med. 2009;15(5):572-576) such that the full functional change distribution within tumours could be represented with respect to multiple confidence interval estimates for functional changes in similar healthy tissue. The ST-PRM and graded-threshold PRM (GT-PRM) methods were applied to functional imaging scans of 5 patients treated for hepatocellular carcinoma. Pre and post-radiotherapy arterial blood flow maps (ABF) were generated from CT-perfusion scans of each patient. ABF maps were rigidly registered based on aligning tumour centres of mass. ST-PRM and GT-PRM analyses were then performed on overlapping tumour regions within the registered ABF maps. Main findings: The ST-PRMs contained many disconnected clusters of voxels classified as having a significant change in function. While this may be useful to predict treatment response, it may pose challenges for identifying boost volumes or for informing dose-painting by numbers strategies. The GT-PRMs included all of the same information as ST-PRMs but also visualized the full tumour functional change distribution. Heterogeneous clusters in the ST-PRMs often became more connected in the GT-PRMs by voxels with similar functional changes. Conclusions: GT-PRMs provided additional information which helped to visualize relationships between significant functional changes identified by ST-PRMs. This may enhance ST-PRM utility for guiding adaptive dose painting.
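A graded-threshold classification generalizes the single-threshold PRM by binning each voxel's functional change against several confidence-interval half-widths rather than one. A sketch of the idea (the threshold values and sign convention are my assumptions, not the authors' exact scheme):

```python
def graded_prm(changes, thresholds):
    """Label each voxel's pre-to-post functional change with a signed grade:
    the highest ascending threshold its magnitude reaches, or 0 if it lies
    within all confidence intervals. A single-threshold PRM is the special
    case of a one-element threshold list."""
    labels = []
    for change in changes:
        grade = 0
        for level, t in enumerate(thresholds, start=1):
            if abs(change) >= t:
                grade = level
        labels.append(grade if change >= 0 else -grade)
    return labels

# arterial-blood-flow changes (arbitrary units) vs. CI half-widths 5, 10, 15
print(graded_prm([2, 7, -12, 20], [5, 10, 15]))  # [0, 1, -2, 3]
```

Mapping these graded labels back onto the tumour volume yields the connected, full-distribution visualization the abstract describes, rather than the scattered binary clusters of a single-threshold map.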
Southern Salish Sea Habitat Map Series: Admiralty Inlet
Cochrane, Guy R.; Dethier, Megan N.; Hodson, Timothy O.; Kull, Kristine K.; Golden, Nadine E.; Ritchie, Andrew C.; Moegling, Crescent; Pacunski, Robert E.; Cochrane, Guy R.
2015-01-01
This publication includes four map sheets, explanatory text, and a descriptive pamphlet. Each map sheet is published as a portable document format (PDF) file. ESRI ArcGIS compatible geotiffs (for example, bathymetry) and shapefiles (for example, video observation points) will be available for download in the data catalog associated with this publication (Cochrane, 2015). An ArcGIS Project File with the symbology used to generate the map sheets is also provided. For those who do not own the full suite of ESRI GIS and mapping software, the data can be read using ESRI ArcReader, a free viewer that is available at http://www.esri.com/software/arcgis/arcreader/index.html.
Software support for SBGN maps: SBGN-ML and LibSBGN.
van Iersel, Martijn P; Villéger, Alice C; Czauderna, Tobias; Boyd, Sarah E; Bergmann, Frank T; Luna, Augustin; Demir, Emek; Sorokin, Anatoly; Dogrusoz, Ugur; Matsuoka, Yukiko; Funahashi, Akira; Aladjem, Mirit I; Mi, Huaiyu; Moodie, Stuart L; Kitano, Hiroaki; Le Novère, Nicolas; Schreiber, Falk
2012-08-01
LibSBGN is a software library for reading, writing and manipulating Systems Biology Graphical Notation (SBGN) maps stored using the recently developed SBGN-ML file format. The library (available in C++ and Java) makes it easy for developers to add SBGN support to their tools, whereas the file format facilitates the exchange of maps between compatible software applications. The library also supports validation of maps, which simplifies the task of ensuring compliance with the detailed SBGN specifications. With this effort we hope to increase the adoption of SBGN in bioinformatics tools, ultimately enabling more researchers to visualize biological knowledge in a precise and unambiguous manner. Milestone 2 was released in December 2011. Source code, example files and binaries are freely available under the terms of either the LGPL v2.1+ or Apache v2.0 open source licenses from http://libsbgn.sourceforge.net. sbgn-libsbgn@lists.sourceforge.net.
Investigations on the Bundle Adjustment Results from Sfm-Based Software for Mapping Purposes
NASA Astrophysics Data System (ADS)
Lumban-Gaol, Y. A.; Murtiyoso, A.; Nugroho, B. H.
2018-05-01
Since its first inception, aerial photography has been used for topographic mapping. Large-scale aerial photography contributed to the creation of many of the topographic maps around the world. In Indonesia, a 2013 government directive on spatial management has re-stressed the need for topographic maps, with aerial photogrammetry providing the main method of acquisition. However, the large-scale need to generate such maps is often constrained by budget. Today, SfM (Structure-from-Motion) offers quicker and less expensive solutions to this problem. However, considering the precision required for topographic missions, these solutions need to be assessed to determine whether they provide a sufficient level of accuracy. In this paper, the popular SfM-based software Agisoft PhotoScan is used to perform bundle adjustment on a set of large-scale aerial images. The aim of the paper is to compare its bundle adjustment results with those generated by more classical photogrammetric software, namely Trimble Inpho and ERDAS IMAGINE. Furthermore, to provide additional bundle adjustment statistics for comparison, the Damped Bundle Adjustment Toolbox (DBAT) was also used to reprocess the PhotoScan project. Results show that the PhotoScan results are less stable than those generated by the two photogrammetric software programmes. This translates to lower accuracy, which may impact the final photogrammetric product.
Observation model and parameter partials for the JPL geodetic GPS modeling software GPSOMC
NASA Technical Reports Server (NTRS)
Sovers, O. J.; Border, J. S.
1988-01-01
The physical models employed in GPSOMC and the modeling module of the GIPSY software system developed at JPL for analysis of geodetic Global Positioning Satellite (GPS) measurements are described. Details of the various contributions to range and phase observables are given, as well as the partial derivatives of the observed quantities with respect to model parameters. A glossary of parameters is provided to enable persons doing data analysis to identify quantities in the current report with their counterparts in the computer programs. There are no basic model revisions, with the exceptions of an improved ocean loading model and some new options for handling clock parametrization. Such misprints as were discovered were corrected. Further revisions include modeling improvements and assurances that the model description is in accord with the current software.
High-Performance Tiled WMS and KML Web Server
NASA Technical Reports Server (NTRS)
Plesea, Lucian
2007-01-01
This software is an Apache 2.0 module implementing a high-performance map server to support interactive map viewers and virtual planet client software. It can be used in applications that require access to very-high-resolution geolocated images, such as GIS, virtual planet applications, and flight simulators. It serves Web Map Service (WMS) requests that comply with a given request grid from an existing tile dataset. It also generates the KML super-overlay configuration files required to access the WMS image tiles.
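To illustrate the kind of grid-aligned request such a tiled server handles, here is a hedged sketch that computes a tile's bounding box in a simple global geographic grid and assembles a standard WMS 1.1.1 GetMap URL. The server address, layer name, and grid layout are invented; only the GetMap parameter names follow the WMS specification.

```python
# Illustrative sketch of a grid-aligned WMS GetMap request: a tiled map
# server expects tiles that fall exactly on a predefined request grid.
from urllib.parse import urlencode

def tile_bbox(level, col, row, tile_deg=180.0):
    """Bounding box of a tile in a toy global geographic grid.

    At level 0 a tile spans `tile_deg` degrees; each level halves that span.
    """
    span = tile_deg / (2 ** level)
    minx = -180.0 + col * span
    miny = -90.0 + row * span
    return (minx, miny, minx + span, miny + span)

def getmap_url(base, layer, level, col, row, size=512):
    """Assemble a WMS 1.1.1 GetMap URL for one tile of the request grid."""
    bbox = tile_bbox(level, col, row)
    params = {
        "SERVICE": "WMS", "VERSION": "1.1.1", "REQUEST": "GetMap",
        "LAYERS": layer, "STYLES": "", "SRS": "EPSG:4326",
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": size, "HEIGHT": size, "FORMAT": "image/jpeg",
    }
    return base + "?" + urlencode(params)

print(getmap_url("http://example.org/wms", "global_mosaic", 2, 3, 1))
```

A server like the one described can reject or snap any BBOX that does not coincide with a grid tile, which is what makes pre-tiled datasets fast to serve.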
Remote Viewer for Maritime Robotics Software
NASA Technical Reports Server (NTRS)
Kuwata, Yoshiaki; Wolf, Michael; Huntsberger, Terrance L.; Howard, Andrew B.
2013-01-01
This software is a viewer program for maritime robotics software that provides a 3D visualization of the boat pose, its position history, ENC (Electronic Navigational Chart) information, camera images, map overlay, and detected tracks.
Karabatsos, George
2017-02-01
Most of applied statistics involves regression analysis of data. In practice, it is important to specify a regression model that has minimal assumptions which are not violated by data, to ensure that statistical inferences from the model are informative and not misleading. This paper presents a stand-alone and menu-driven software package, Bayesian Regression: Nonparametric and Parametric Models, constructed from MATLAB Compiler. Currently, this package gives the user a choice from 83 Bayesian models for data analysis. They include 47 Bayesian nonparametric (BNP) infinite-mixture regression models; 5 BNP infinite-mixture models for density estimation; and 31 normal random effects models (HLMs), including normal linear models. Each of the 78 regression models handles either a continuous, binary, or ordinal dependent variable, and can handle multi-level (grouped) data. All 83 Bayesian models can handle the analysis of weighted observations (e.g., for meta-analysis), and the analysis of left-censored, right-censored, and/or interval-censored data. Each BNP infinite-mixture model has a mixture distribution assigned one of various BNP prior distributions, including priors defined by either the Dirichlet process, Pitman-Yor process (including the normalized stable process), beta (two-parameter) process, normalized inverse-Gaussian process, geometric weights prior, dependent Dirichlet process, or the dependent infinite-probits prior. The software user can mouse-click to select a Bayesian model and perform data analysis via Markov chain Monte Carlo (MCMC) sampling. After the sampling completes, the software automatically opens text output that reports MCMC-based estimates of the model's posterior distribution and model predictive fit to the data. Additional text and/or graphical output can be generated by mouse-clicking other menu options. 
This includes output of MCMC convergence analyses, and estimates of the model's posterior predictive distribution, for selected functionals and values of covariates. The software is illustrated through the BNP regression analysis of real data.
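As a generic illustration of the MCMC sampling the package performs behind its menus, here is a minimal Metropolis sampler for the slope of a simple normal linear model with a flat prior. The data, step size, and model are invented for illustration and bear no relation to the package's 83 models.

```python
# Minimal Metropolis sketch: sample the posterior of the slope beta in
# y = beta * x + noise (known unit noise, flat prior). Purely illustrative.
import math, random

def log_likelihood(beta, xs, ys, sigma=1.0):
    return sum(-0.5 * ((y - beta * x) / sigma) ** 2 for x, y in zip(xs, ys))

def metropolis(xs, ys, n_iter=5000, step=0.1, seed=42):
    random.seed(seed)
    beta, samples = 0.0, []
    for _ in range(n_iter):
        prop = beta + random.gauss(0.0, step)       # random-walk proposal
        # accept with probability min(1, likelihood ratio)
        if math.log(random.random()) < (log_likelihood(prop, xs, ys)
                                        - log_likelihood(beta, xs, ys)):
            beta = prop
        samples.append(beta)
    return samples

xs = [0.0, 1.0, 2.0, 3.0, 4.0]
ys = [0.1, 2.1, 3.9, 6.2, 7.9]   # roughly slope 2
draws = metropolis(xs, ys)
posterior_mean = sum(draws[1000:]) / len(draws[1000:])
print(round(posterior_mean, 1))
```

After discarding burn-in, the sample mean approximates the posterior mean, which is the kind of summary the package reports automatically after sampling completes.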
Crowther, Michael J; Look, Maxime P; Riley, Richard D
2014-09-28
Multilevel mixed effects survival models are used in the analysis of clustered survival data, such as repeated events, multicenter clinical trials, and individual participant data (IPD) meta-analyses, to investigate heterogeneity in baseline risk and covariate effects. In this paper, we extend parametric frailty models including the exponential, Weibull and Gompertz proportional hazards (PH) models and the log logistic, log normal, and generalized gamma accelerated failure time models to allow any number of normally distributed random effects. Furthermore, we extend the flexible parametric survival model of Royston and Parmar, modeled on the log-cumulative hazard scale using restricted cubic splines, to include random effects while also allowing for non-PH (time-dependent effects). Maximum likelihood is used to estimate the models utilizing adaptive or nonadaptive Gauss-Hermite quadrature. The methods are evaluated through simulation studies representing clinically plausible scenarios of a multicenter trial and IPD meta-analysis, showing good performance of the estimation method. The flexible parametric mixed effects model is illustrated using a dataset of patients with kidney disease and repeated times to infection and an IPD meta-analysis of prognostic factor studies in patients with breast cancer. User-friendly Stata software is provided to implement the methods. Copyright © 2014 John Wiley & Sons, Ltd.
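The quadrature step can be sketched as follows, assuming an exponential proportional-hazards model with a single normal random intercept and a hand-coded 3-node, non-adaptive Gauss-Hermite rule. The data and parameter values are invented; the actual software supports several distributions, multiple random effects, and adaptive quadrature with many more nodes.

```python
# Gauss-Hermite sketch: integrate a cluster-level normal random intercept b
# out of an exponential proportional-hazards likelihood. Illustrative only.
import math

# 3-node Gauss-Hermite rule for the physicists' weight exp(-x^2)
GH_NODES = [-math.sqrt(1.5), 0.0, math.sqrt(1.5)]
GH_WEIGHTS = [math.sqrt(math.pi) / 6.0,
              2.0 * math.sqrt(math.pi) / 3.0,
              math.sqrt(math.pi) / 6.0]

def cluster_marginal_likelihood(times, events, log_lambda, sigma):
    """Marginal likelihood of one cluster, integrating b ~ N(0, sigma^2)."""
    total = 0.0
    for x, w in zip(GH_NODES, GH_WEIGHTS):
        b = math.sqrt(2.0) * sigma * x        # change of variables
        lam = math.exp(log_lambda + b)        # cluster-specific hazard
        lik = 1.0
        for t, d in zip(times, events):
            # exponential PH: density lam*exp(-lam*t) if event, else survivor
            lik *= (lam ** d) * math.exp(-lam * t)
        total += w * lik
    return total / math.sqrt(math.pi)

times = [1.2, 0.7, 2.5]
events = [1, 1, 0]   # third observation censored
print(cluster_marginal_likelihood(times, events, log_lambda=-0.5, sigma=0.5))
```

As sigma approaches zero the marginal likelihood collapses to the ordinary exponential likelihood, a useful sanity check on the quadrature.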
Kargarian-Marvasti, Sadegh; Rimaz, Shahnaz; Abolghasemi, Jamileh; Heydari, Iraj
2017-01-01
Cox proportional hazard model is the most common method for analyzing the effects of several variables on survival time. However, under certain circumstances, parametric models give more precise estimates to analyze survival data than Cox. The purpose of this study was to investigate the comparative performance of Cox and parametric models in a survival analysis of factors affecting the event time of neuropathy in patients with type 2 diabetes. This study included 371 patients with type 2 diabetes without neuropathy who were registered at Fereydunshahr diabetes clinic. Subjects were followed up for the development of neuropathy between 2006 and March 2016. To investigate the factors influencing the event time of neuropathy, significant variables in the univariate model (P < 0.20) were entered into the multivariate Cox and parametric models (P < 0.05). In addition, the Akaike information criterion (AIC) and area under ROC curves were used to evaluate the relative goodness of the fitted models and the efficiency of each procedure, respectively. Statistical computing was performed using R software version 3.2.3 (UNIX platforms, Windows and MacOS). Using Kaplan-Meier, survival time of neuropathy was computed 76.6 ± 5 months after initial diagnosis of diabetes. After multivariate analysis of Cox and parametric models, ethnicity, high-density lipoprotein and family history of diabetes were identified as predictors of event time of neuropathy (P < 0.05). According to AIC, the log-normal model, with the lowest AIC, was the best-fitting model among the Cox and parametric models. According to the comparison of survival receiver operating characteristic curves, the log-normal model was considered the most efficient and best-fitting model.
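The AIC comparison described above can be sketched generically. The toy example below uses invented, uncensored event times (the study's data involved censoring and covariates, which are omitted here) and fits exponential and log-normal models by maximum likelihood before comparing their AICs.

```python
# Toy AIC model comparison for survival times, assuming no censoring.
# Data are invented; a lower AIC indicates the better-fitting model.
import math

def aic_exponential(times):
    n = len(times)
    lam = n / sum(times)                       # MLE of the rate
    ll = n * math.log(lam) - lam * sum(times)  # log-likelihood
    return 2 * 1 - 2 * ll                      # one free parameter

def aic_lognormal(times):
    n = len(times)
    logs = [math.log(t) for t in times]
    mu = sum(logs) / n                         # MLE of log-mean
    var = sum((l - mu) ** 2 for l in logs) / n # MLE of log-variance
    ll = sum(-math.log(t * math.sqrt(var) * math.sqrt(2 * math.pi))
             - (math.log(t) - mu) ** 2 / (2 * var) for t in times)
    return 2 * 2 - 2 * ll                      # two free parameters

# times cluster around 5 with one long survivor: skewed, log-normal-ish
times = [5.0, 6.1, 4.8, 5.5, 30.0, 5.2, 6.0, 4.9]
print(aic_exponential(times), aic_lognormal(times))
```

On this skewed toy sample the log-normal model attains the lower AIC, mirroring the kind of conclusion the study draws for its data.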
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ray, Jaideep; Lee, Jina; Lefantzi, Sophia
The estimation of fossil-fuel CO2 emissions (ffCO2) from limited ground-based and satellite measurements of CO2 concentrations will form a key component of the monitoring of treaties aimed at the abatement of greenhouse gas emissions. To that end, we construct a multiresolution spatial parametrization for fossil-fuel CO2 emissions (ffCO2), to be used in atmospheric inversions. Such a parametrization does not currently exist. The parametrization uses wavelets to accurately capture the multiscale, nonstationary nature of ffCO2 emissions and employs proxies of human habitation, e.g., images of lights at night and maps of built-up areas, to reduce the dimensionality of the multiresolution parametrization. The parametrization is used in a synthetic data inversion to test its suitability for use in atmospheric inverse problems. This linear inverse problem is predicated on observations of ffCO2 concentrations collected at measurement towers. We adapt a convex optimization technique, commonly used in the reconstruction of compressively sensed images, to perform sparse reconstruction of the time-variant ffCO2 emission field. We also borrow concepts from compressive sensing to impose boundary conditions, i.e., to limit ffCO2 emissions within an irregularly shaped region (the United States, in our case). We find that the optimization algorithm performs a data-driven sparsification of the spatial parametrization and retains only those wavelets whose weights could be estimated from the observations. Further, our method for the imposition of boundary conditions leads to a significant computational saving over conventional means of doing so. We conclude with a discussion of the accuracy of the estimated emissions and the suitability of the spatial parametrization for use in inverse problems with a significant degree of regularization.
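The sparsification step can be illustrated with a toy example. The sketch below (pure Python, invented data) applies a 1-D Haar wavelet transform to a small "emission field", discards low-weight coefficients, and reconstructs the field from the few that remain. The paper's actual method operates on 2-D fields with proxy-guided dimensionality reduction and convex optimization, none of which is shown here.

```python
# Toy illustration of wavelet sparsification (NOT the paper's method): a
# spiky field is well represented by a handful of Haar coefficients.

def haar(x):
    """Full 1-D Haar decomposition; length of x must be a power of two."""
    x = list(x)
    coeffs = []
    n = len(x)
    while n > 1:
        avg = [(x[2 * i] + x[2 * i + 1]) / 2 for i in range(n // 2)]
        dif = [(x[2 * i] - x[2 * i + 1]) / 2 for i in range(n // 2)]
        coeffs = dif + coeffs   # finest differences end up last
        x = avg
        n //= 2
    return x + coeffs           # [overall average, coarse..fine differences]

def ihaar(c):
    """Inverse of haar()."""
    x = [c[0]]
    pos = 1
    while pos < len(c):
        dif = c[pos:pos + len(x)]
        pos += len(x)
        x = [v for a, d in zip(x, dif) for v in (a + d, a - d)]
    return x

signal = [0.0, 0.0, 8.0, 8.2, 0.1, 0.0, 0.0, 0.0]  # a "city" in a flat field
coeffs = haar(signal)
sparse = [v if abs(v) >= 0.5 else 0.0 for v in coeffs]  # keep big weights only
print(ihaar(sparse))
```

Only three of eight coefficients survive the threshold, yet the reconstruction stays close to the original: the same economy the inversion exploits when it retains only estimable wavelets.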
Parametric Crowd Generation Software for MS&T Simulations and Training
2007-02-20
Report contents: Technology Overview; Dynemotion System Components; Dynemotion System Architecture; Dynemotion-Enabled NPC Brain Cycles; Dynemotion API...; Development Project Background Information; Potential Application and Impact for the DoD; Project Objectives, Scope... Methodology; Benefits of the Project; Project Innovation; Lessons Learned and Open Questions; Research and Development Challenges.
Uncertainty quantification in Rothermel's Model using an efficient sampling method
Edwin Jimenez; M. Yousuff Hussaini; Scott L. Goodrick
2007-01-01
The purpose of the present work is to quantify parametric uncertainty in Rothermel's wildland fire spread model (implemented in software such as BehavePlus3 and FARSITE), which is undoubtedly among the most widely used fire spread models in the United States. This model consists of a nonlinear system of equations that relates environmental variables (input parameter...
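The sampling-based uncertainty quantification can be illustrated generically. The sketch below propagates uniform input uncertainty through a toy nonlinear spread function by plain Monte Carlo; the function and the input distributions are invented stand-ins, not Rothermel's equations, and the paper's point is precisely that more efficient sampling than plain Monte Carlo exists.

```python
# Generic Monte Carlo sketch of parametric uncertainty propagation through a
# nonlinear model. The toy spread function is a stand-in, NOT Rothermel's
# actual equations; input ranges are illustrative.
import random, statistics

def toy_spread_rate(wind, moisture):
    """Stand-in nonlinear response: spread grows with wind, drops with moisture."""
    return 2.0 * (1.0 + 0.3 * wind ** 1.5) / (1.0 + 4.0 * moisture)

random.seed(0)
samples = []
for _ in range(20000):
    wind = random.uniform(2.0, 8.0)        # uncertain wind speed
    moisture = random.uniform(0.05, 0.25)  # uncertain fuel moisture fraction
    samples.append(toy_spread_rate(wind, moisture))

print(statistics.mean(samples), statistics.stdev(samples))
```

The mean and standard deviation of the output samples summarize how input uncertainty maps to spread-rate uncertainty; efficient sampling schemes aim to get the same summaries with far fewer model evaluations.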
Temporal Progression of Visual Injury from Blast Exposure
2017-09-01
seen throughout the duration of the study. To correlate experimental blast exposures in rodents to human blast exposures, a computational parametric... software (JMP 10.0, Cary, NC). Descriptive and univariate analyses will first be performed to identify the occurrence of delayed visual system... later). The biostatistician evaluating the retrospective data has completed the descriptive analysis and is working on the multiple regression.
Controlling CAMAC instrumentation through the USB port
NASA Astrophysics Data System (ADS)
Ribas, R. V.
2012-02-01
A programmable device to interface CAMAC instrumentation to the USB port of computers, without the need for heavy, noisy and expensive CAMAC crates, is described in this article. Up to four single-width modules can be used. Also, all software necessary for a multi-parametric data acquisition system was developed. A standard crate-controller based on the same project is being designed.
Mendrek, Adrianna; Mancini-Marië, Adham; Fahim, Cherine; Stip, Emmanuel
2007-02-01
Impaired processing of various emotions is considered one of the fundamental features of schizophrenia. In a recent study, intriguing sex differences were observed in the cerebral function associated with the experience of sadness in schizophrenia patients. The aim of the present study was to explore this phenomenon during exposure to aversive stimuli. Fifteen men and 10 women with the DSM-IV diagnosis of schizophrenia underwent functional magnetic resonance imaging (fMRI) while viewing alternating blocks of negative and neutral pictures. Data were analysed using a random-effects model within the statistical parametric mapping (SPM99) software. Processing of negative stimuli evoked significantly greater activations in men in the thalamus, cerebellum, temporal, occipital and posterior cingulate cortex, while women exhibited greater activations in the left middle frontal gyrus. The sex differences in the cerebral activations in schizophrenia patients deviate from what has been observed in the general population during exposure to and experience of negative affect. As such, the present study supports and extends the authors' preliminary observation of anomalous sexual dimorphism in schizophrenia at the functional neuroanatomical level, suggesting potential masculinization of female subjects and feminization of male subjects with schizophrenia.
Modeling genome coverage in single-cell sequencing
Daley, Timothy; Smith, Andrew D.
2014-01-01
Motivation: Single-cell DNA sequencing is necessary for examining genetic variation at the cellular level, which remains hidden in bulk sequencing experiments. But because they begin with such small amounts of starting material, the amount of information that is obtained from a single-cell sequencing experiment is highly sensitive to the choice of protocol employed and variability in library preparation. In particular, the fraction of the genome represented in single-cell sequencing libraries exhibits extreme variability due to quantitative biases in amplification and loss of genetic material. Results: We propose a method to predict the genome coverage of a deep sequencing experiment using information from an initial shallow sequencing experiment mapped to a reference genome. The observed coverage statistics are used in a non-parametric empirical Bayes Poisson model to estimate the gain in coverage from deeper sequencing. This approach allows researchers to know statistical features of deep sequencing experiments without actually sequencing deeply, providing a basis for optimizing and comparing single-cell sequencing protocols or screening libraries. Availability and implementation: The method is available as part of the preseq software package. Source code is available at http://smithlabresearch.org/preseq. Contact: andrewds@usc.edu Supplementary information: Supplementary material is available at Bioinformatics online. PMID:25107873
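The core Poisson idea behind coverage extrapolation can be sketched in a few lines. The toy example below (invented counts; NOT the preseq algorithm) treats each position's read count from a shallow run as a Poisson rate estimate and predicts the covered fraction after deeper sequencing.

```python
# Toy Poisson sketch of coverage extrapolation (not the preseq method):
# under a Poisson model, P(position covered) = 1 - exp(-rate * fold).
import math

def predicted_coverage(shallow_counts, fold):
    """Expected fraction of positions covered when depth is scaled by `fold`."""
    n = len(shallow_counts)
    return sum(1.0 - math.exp(-c * fold) for c in shallow_counts) / n

# observed read counts at 8 positions from a shallow run (invented)
counts = [0, 1, 0, 2, 0, 0, 3, 1]
print(round(predicted_coverage(counts, fold=1), 3))
print(round(predicted_coverage(counts, fold=5), 3))
```

Note the limitation this naive version has: positions with zero observed reads predict zero coverage at any depth. Per the abstract, the actual method instead places a non-parametric empirical Bayes prior over the rates, which is what lets it extrapolate sensibly beyond the observed counts.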
Perspectives on Machine Learning for Classification of Schizotypy Using fMRI Data.
Madsen, Kristoffer H; Krohne, Laerke G; Cai, Xin-Lu; Wang, Yi; Chan, Raymond C K
2018-03-15
Functional magnetic resonance imaging is capable of estimating functional activation and connectivity in the human brain, and lately there has been increased interest in the use of these functional modalities combined with machine learning for identification of psychiatric traits. While these methods bear great potential for early diagnosis and better understanding of disease processes, there are wide ranges of processing choices and pitfalls that may severely hamper interpretation and generalization performance unless carefully considered. In this perspective article, we aim to motivate the use of machine learning in schizotypy research. To this end, we describe common data processing steps while commenting on best practices and procedures. First, we introduce the important role of schizotypy to motivate the importance of reliable classification, and summarize existing machine learning literature on schizotypy. Then, we describe procedures for extraction of features based on fMRI data, including statistical parametric mapping, parcellation, complex network analysis, and decomposition methods, as well as classification with a special focus on support vector classification and deep learning. We provide more detailed descriptions and software as supplementary material. Finally, we present current challenges in machine learning for classification of schizotypy and comment on future trends and perspectives.
Kang, Jee In; Park, Hae-Jeong; Kim, Se Joo; Kim, Kyung Ran; Lee, Su Young; Lee, Eun; An, Suk Kyoon; Kwon, Jun Soo; Lee, Jong Doo
2014-05-01
Altered transmission of gamma-aminobutyric acid (GABA), a major inhibitory neurotransmitter, may contribute to the development of schizophrenia. The purpose of the present study was to investigate the presence of GABA-A/benzodiazepine (BZ) receptor binding abnormalities in individuals at ultra-high risk (UHR) for psychosis in comparison with normal controls using [(18)F]-fluoroflumazenil (FFMZ) positron emission tomography (PET). In particular, we set regions of interest in the striatum (caudate, putamen, and nucleus accumbens) and medial temporal area (hippocampus and parahippocampal gyrus). Eleven BZ-naive people at UHR and 15 normal controls underwent PET scanning using [(18)F]-FFMZ to measure GABA-A/BZ receptor binding potential. The regional group differences between UHR individuals and normal controls were analyzed using Statistical Parametric Mapping 8 software. Participants were evaluated using the structured interview for prodromal syndromes and neurocognitive function tasks. People at UHR demonstrated significantly reduced binding potential of GABA-A/BZ receptors in the right caudate. Altered GABAergic transmission and/or the imbalance of inhibitory and excitatory systems in the striatum may be present at the putative prodromal stage and play a pivotal role in the pathophysiology of psychosis.
A Scalable Framework For Segmenting Magnetic Resonance Images
Hore, Prodip; Goldgof, Dmitry B.; Gu, Yuhua; Maudsley, Andrew A.; Darkazanli, Ammar
2009-01-01
A fast, accurate and fully automatic method of segmenting magnetic resonance images of the human brain is introduced. The approach scales well allowing fast segmentations of fine resolution images. The approach is based on modifications of the soft clustering algorithm, fuzzy c-means, that enable it to scale to large data sets. Two types of modifications to create incremental versions of fuzzy c-means are discussed. They are much faster when compared to fuzzy c-means for medium to extremely large data sets because they work on successive subsets of the data. They are comparable in quality to application of fuzzy c-means to all of the data. The clustering algorithms coupled with inhomogeneity correction and smoothing are used to create a framework for automatically segmenting magnetic resonance images of the human brain. The framework is applied to a set of normal human brain volumes acquired from different magnetic resonance scanners using different head coils, acquisition parameters and field strengths. Results are compared to those from two widely used magnetic resonance image segmentation programs, Statistical Parametric Mapping and the FMRIB Software Library (FSL). The results are comparable to FSL while providing significant speed-up and better scalability to larger volumes of data. PMID:20046893
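The batch algorithm underlying the incremental variants can be sketched compactly. Below is a plain 1-D fuzzy c-means implementation with invented intensity data; the incremental versions discussed in the paper run updates of this kind over successive subsets of the data rather than the full set.

```python
# Compact 1-D fuzzy c-means sketch (the standard batch algorithm that the
# paper's incremental variants modify). Intensities are illustrative.

def fcm(xs, centers, m=2.0, iters=50):
    for _ in range(iters):
        # membership update: u_ik = 1 / sum_j (d_ik / d_jk)^(2/(m-1))
        u = []
        for x in xs:
            d = [max(abs(x - c), 1e-12) for c in centers]  # avoid div-by-zero
            u.append([1.0 / sum((d[i] / d[j]) ** (2.0 / (m - 1.0))
                                for j in range(len(centers)))
                      for i in range(len(centers))])
        # center update: weighted mean with weights u^m
        centers = [sum(u[k][i] ** m * xs[k] for k in range(len(xs))) /
                   sum(u[k][i] ** m for k in range(len(xs)))
                   for i in range(len(centers))]
    return centers

# toy "tissue intensity" samples forming two clusters near 1 and 10
xs = [0.9, 1.1, 1.0, 9.8, 10.2, 10.0]
print([round(c, 1) for c in fcm(xs, centers=[0.0, 5.0])])
```

An incremental variant would call an update like this on each data chunk and carry the weighted centers forward, which is what makes the approach scale to fine-resolution volumes.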
Techniques for Down-Sampling a Measured Surface Height Map for Model Validation
NASA Technical Reports Server (NTRS)
Sidick, Erkin
2012-01-01
This software allows one to down-sample a measured surface map for model validation, not only without introducing any re-sampling errors, but also eliminating existing measurement noise and measurement errors. The two new techniques implemented in this software can be used in all optical model validation processes involving large space optical surfaces.
MapFactory - Towards a mapping design pattern for big geospatial data
NASA Astrophysics Data System (ADS)
Rautenbach, Victoria; Coetzee, Serena
2018-05-01
With big geospatial data emerging, cartographers and geographic information scientists have to find new ways of dealing with the volume, variety, velocity, and veracity (4Vs) of the data. This requires the development of tools that allow processing, filtering, analysing, and visualising of big data through multidisciplinary collaboration. In this paper, we present the MapFactory design pattern that will be used for the creation of different maps according to the (input) design specification for big geospatial data. The design specification is based on elements from ISO19115-1:2014 Geographic information - Metadata - Part 1: Fundamentals that would guide the design and development of the map or set of maps to be produced. The results of the exploratory research suggest that the MapFactory design pattern will help with software reuse and communication. The MapFactory design pattern will aid software developers to build the tools that are required to automate map making with big geospatial data. The resulting maps would assist cartographers and others to make sense of big geospatial data.
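The factory idea at the heart of MapFactory can be sketched with a conventional factory design pattern: a design specification selects which map product gets built. The class and field names below are invented for illustration and are not taken from the paper.

```python
# Hedged sketch of a factory design pattern for map creation: a
# metadata-like design specification drives which map type is produced.
# All names here are hypothetical.

class ChoroplethMap:
    def render(self):
        return "choropleth"

class HeatMap:
    def render(self):
        return "heatmap"

class MapFactory:
    """Creates a map object from a design specification dictionary."""
    _registry = {"choropleth": ChoroplethMap, "heatmap": HeatMap}

    @classmethod
    def create(cls, spec):
        try:
            return cls._registry[spec["map_type"]]()
        except KeyError:
            raise ValueError(f"unsupported map type: {spec.get('map_type')}")

spec = {"map_type": "heatmap", "title": "Taxi pickups"}  # spec-driven choice
print(MapFactory.create(spec).render())
```

The registry makes the pattern open to extension: new map products register themselves without changes to client code, which supports the reuse and automation goals the paper describes.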
Concept Maps Applied to Mars Exploration Public Outreach
NASA Technical Reports Server (NTRS)
Briggs, Geoffrey; Canas, Alberto; Shamma, David; Scargle, Jeffrey; Novak, Joseph
2004-01-01
This paper describes CMEX Mars, an effort in the creation of a comprehensive set of concept maps to describe all aspects of Mars exploration. These concept maps, created using the CmapTools software developed by the Institute for Human and Machine Cognition, are available on the Internet at http://cmex.arc.nasa.gov/CMEX and are linked among themselves as well as to resources on the Internet. The work described took place mainly between 1998 and 2001 and combined the goals of: 1) developing a library of concept maps for educational outreach while also 2) refining the capabilities of the software used to create the interactive maps and 3) making them available on the Internet. Here we focus on the library of Mars exploration concept maps that has been created.
Remanent magnetization and three-dimensional density model of the Kentucky anomaly region
NASA Technical Reports Server (NTRS)
1982-01-01
Existing software was modified to handle 3-D density and magnetization models of the Kentucky body and is being tested. Gravity and magnetic anomaly data sets are ready for use. A preliminary block model is under construction using the 1:1,000,000 maps. An x-y grid to overlay the 1:2,500,000 Albers maps and keyed to the 1:1,000,000 scale block models was created. Software was developed to generate a smoothed MAGSAT data set over this grid; this is to be input to an inversion program for generating the regional magnetization map. The regional scale 1:2,500,000 map mosaic is being digitized using previous magnetization models, the U.S. magnetic anomaly map, and regional tectonic maps as a guide.
Miniature microwave applicator for murine bladder hyperthermia studies.
Salahi, Sara; Maccarini, Paolo F; Rodrigues, Dario B; Etienne, Wiguins; Landon, Chelsea D; Inman, Brant A; Dewhirst, Mark W; Stauffer, Paul R
2012-01-01
Novel combinations of heat with chemotherapeutic agents are often studied in murine tumour models. Currently, no device exists to selectively heat small tumours at depth in mice. In this project we modelled, built and tested a miniature microwave heat applicator, the physical dimensions of which can be scaled to adjust the volume and depth of heating to focus on the tumour volume. Of particular interest is a device that can selectively heat murine bladder. Using Avizo(®) segmentation software, we created a numerical mouse model based on micro-MRI scan data. The model was imported into HFSS™ (Ansys) simulation software and parametric studies were performed to optimise the dimensions of a water-loaded circular waveguide for selective power deposition inside a 0.15 mL bladder. A working prototype was constructed operating at 2.45 GHz. Heating performance was characterised by mapping fibre-optic temperature sensors along catheters inserted at depths of 0-1 mm (subcutaneous), 2-3 mm (vaginal), and 4-5 mm (rectal) below the abdominal wall, with the mid depth catheter adjacent to the bladder. Core temperature was monitored orally. Thermal measurements confirm the simulations which demonstrate that this applicator can provide local heating at depth in small animals. Measured temperatures in murine pelvis show well-localised bladder heating to 42-43°C while maintaining normothermic skin and core temperatures. Simulation techniques facilitate the design optimisation of microwave antennas for use in pre-clinical applications such as localised tumour heating in small animals. Laboratory measurements demonstrate the effectiveness of a new miniature water-coupled microwave applicator for localised heating of murine bladder.
A Non-parametric Approach to Constrain the Transfer Function in Reverberation Mapping
NASA Astrophysics Data System (ADS)
Li, Yan-Rong; Wang, Jian-Min; Bai, Jin-Ming
2016-11-01
Broad emission lines of active galactic nuclei stem from a spatially extended region (broad-line region, BLR) that is composed of discrete clouds and photoionized by the central ionizing continuum. The temporal behaviors of these emission lines are blurred echoes of continuum variations (i.e., reverberation mapping, RM) and directly reflect the structures and kinematic information of BLRs through the so-called transfer function (also known as the velocity-delay map). Based on the previous works of Rybicki and Press and Zu et al., we develop an extended, non-parametric approach to determine the transfer function for RM data, in which the transfer function is expressed as a sum of a family of relatively displaced Gaussian response functions. Therefore, arbitrary shapes of transfer functions associated with complicated BLR geometry can be seamlessly included, enabling us to relax the presumption of a specified transfer function frequently adopted in previous studies and to let it be determined by observation data. We formulate our approach in a previously well-established framework that incorporates the statistical modeling of continuum variations as a damped random walk process and takes into account long-term secular variations which are irrelevant to RM signals. The application to RM data shows the fidelity of our approach.
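The sum-of-Gaussians representation can be sketched numerically: the delayed line response is the continuum convolved with a transfer function built from displaced Gaussian components. All amplitudes, widths, displacements, and the toy continuum below are illustrative, not fitted values from the paper.

```python
# Numerical sketch of a transfer function expressed as a sum of relatively
# displaced Gaussians, convolved with a continuum light curve.
import math

def transfer_function(tau, amps, centers, width):
    """Psi(tau) = sum_k A_k * Gaussian(tau; center_k, width)."""
    return sum(a * math.exp(-0.5 * ((tau - c) / width) ** 2) /
               (width * math.sqrt(2.0 * math.pi))
               for a, c in zip(amps, centers))

def line_flux(t, continuum, amps, centers, width, dtau=0.1, tau_max=30.0):
    """Discretized convolution: integral of Psi(tau) * C(t - tau) dtau."""
    n = int(tau_max / dtau)
    return sum(transfer_function(i * dtau, amps, centers, width) *
               continuum(t - i * dtau) * dtau for i in range(n))

continuum = lambda t: 1.0 + 0.5 * math.sin(0.3 * t)   # toy continuum variation
# three displaced Gaussian response components at delays of 5, 10, 15 days
print(round(line_flux(50.0, continuum, amps=[1.0, 0.5, 0.25],
                      centers=[5.0, 10.0, 15.0], width=1.5), 3))
```

Fitting the amplitudes of many such displaced components against observed light curves is what lets arbitrary transfer-function shapes emerge from the data rather than being assumed.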
The design of electric vehicle intelligent charger
NASA Astrophysics Data System (ADS)
Xu, Yangyang; Wang, Ying
2018-05-01
As energy shortages and environmental pollution worsen rapidly, the electric vehicle, a new type of transport, is being researched worldwide. As core components of an electric vehicle, the battery and the charger play an important role in its overall quality. This paper therefore presents the design of an intelligent charger for electric vehicles based on the C language. The hardware system produces the input signals of the intelligent charger, while the software system uses C as its development environment. The design can measure parameters such as voltage, current and temperature.
iPads at Field Camp: A First Test of the Challenges and Opportunities
NASA Astrophysics Data System (ADS)
Hurst, S. D.; Stewart, M. A.
2011-12-01
An iPad 2 was given to approximately half of the University of Illinois students attending the Wasatch-Uinta Field Camp (WUFC) in summer 2011. The iPads were provisioned with orientation measuring, mapping and location software. The software would automatically transfer an orientation measurement to the current location on the Google Maps application, and was able to output a full list of orientation data. Students also had normal access to more traditional mapping tools such as Brunton compasses and GPS units and were required to map with these tools along with other students of WUFC not provided iPads. Compared to traditional tools, iPads have drawbacks such as increased weight, breakability, and the need for a power source and wireless connectivity; in sum, they need a substantial infrastructure that reduces range, availability, and probably most importantly, convenience. Some of these drawbacks inhibited adoption by our students, the primary reasons being the added weight and the inability to map directly to a GIS application with detailed topographic maps equivalent to the physical topographic map sheets used at WUFC. In their favor, the iPads combine a host of tools into one, including software that can measure orientation in a fashion more intuitive than a Brunton. They also allow storage, editing and analysis of data, notes (spoken and/or written) and potentially unlimited access to a variety of maps. Via a post-field camp survey of the University of Illinois students at WUFC, we have identified some of the important issues that need to be addressed before portable tablets like the iPad become the tool of choice for general field work. Some problems are intrinsic to almost any advanced technology, some are artifacts of the current generations of hardware and software available for these devices. 
Technical drawbacks aside, the adoption of iPads was further inhibited primarily by inexperience with their use as a mapping tool and secondarily by their redundancy with traditional tools. We are addressing some aspects of software limitations and future technology improvements by the industry will naturally reduce other limitations. We will continue testing iPads during field trips and courses for the foreseeable future. As we begin to deal with these limitations and students become more accustomed to their use in the field, we expect our students to more fully embrace iPads as a convenient field and mapping tool.
Managing mapping data using commercial data base management software.
Elassal, A.A.
1985-01-01
Electronic computers are involved in almost every aspect of the map making process. This involvement has become so thorough that it is practically impossible to find a recently developed process or device in the mapping field which does not employ digital processing in some form or another. This trend, which has been evolving over two decades, is accelerated by the significant improvements in capability, reliability, and cost-effectiveness of electronic devices. Computerized mapping processes and devices share a common need for machine-readable data. Integrating groups of these components into automated mapping systems requires careful planning for data flow amongst them. Exploring the utility of commercial data base management software to assist in this task is the subject of this paper.
From Neutron Star Observables to the Equation of State. I. An Optimal Parametrization
NASA Astrophysics Data System (ADS)
Raithel, Carolyn A.; Özel, Feryal; Psaltis, Dimitrios
2016-11-01
The increasing number and precision of measurements of neutron star masses, radii, and, in the near future, moments of inertia offer the possibility of precisely determining the neutron star equation of state (EOS). One way to facilitate the mapping of observables to the EOS is through a parametrization of the latter. We present here a generic method for optimizing the parametrization of any physically allowed EOS. We use mock EOS that incorporate physically diverse and extreme behavior to test how well our parametrization reproduces the global properties of the stars, by minimizing the errors in the observables of mass, radius, and the moment of inertia. We find that using piecewise polytropes and sampling the EOS with five fiducial densities between ˜1-8 times the nuclear saturation density results in optimal errors for the smallest number of parameters. Specifically, it recreates the radii of the assumed EOS to within less than 0.5 km for the extreme mock EOS and to within less than 0.12 km for 95% of a sample of 42 proposed, physically motivated EOS. Such a parametrization is also able to reproduce the maximum mass to within 0.04 {M}⊙ and the moment of inertia of a 1.338 {M}⊙ neutron star to within less than 10% for 95% of the proposed sample of EOS.
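The piecewise-polytrope construction described above can be sketched in code. This is an illustrative reimplementation, not the authors' software: the fiducial densities and pressures passed in are placeholder inputs, and each segment obeys P = K_i * rho^Gamma_i with Gamma_i fixed by the two bounding fiducial points, which keeps P(rho) continuous across segment boundaries.

```python
import numpy as np

def piecewise_polytrope(rho_fid, p_fid):
    """Build a piecewise-polytropic EOS P(rho) from pressures given
    at fiducial densities (illustrative sketch, not the paper's code).

    Each segment obeys P = K_i * rho**Gamma_i; Gamma_i is fixed by
    requiring the segment to pass through both endpoints, so P(rho)
    is continuous at the segment boundaries."""
    rho_fid = np.asarray(rho_fid, dtype=float)
    p_fid = np.asarray(p_fid, dtype=float)
    # Adiabatic index of each segment from its two bounding points
    gammas = np.diff(np.log(p_fid)) / np.diff(np.log(rho_fid))
    ks = p_fid[:-1] / rho_fid[:-1] ** gammas

    def pressure(rho):
        # Clip so densities outside the grid use the nearest segment
        i = np.clip(np.searchsorted(rho_fid, rho) - 1, 0, len(gammas) - 1)
        return ks[i] * rho ** gammas[i]

    return pressure
```

A parametrization like the one in the abstract would use five such fiducial densities between roughly one and eight times nuclear saturation density; the sketch accepts any number.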
A physiology-based parametric imaging method for FDG-PET data
NASA Astrophysics Data System (ADS)
Scussolini, Mara; Garbarino, Sara; Sambuceti, Gianmario; Caviglia, Giacomo; Piana, Michele
2017-12-01
Parametric imaging is a compartmental approach that processes nuclear imaging data to estimate the spatial distribution of the kinetic parameters governing tracer flow. The present paper proposes a novel and efficient computational method for parametric imaging which is potentially applicable to several compartmental models of diverse complexity and which is effective in the determination of the parametric maps of all kinetic coefficients. We consider applications to [18F]-fluorodeoxyglucose positron emission tomography (FDG-PET) data and analyze the two-compartment catenary model describing the standard FDG metabolization by a homogeneous tissue and the three-compartment non-catenary model representing the renal physiology. We show uniqueness theorems for both models. The proposed imaging method starts from the reconstructed FDG-PET images of tracer concentration and preliminarily applies image processing algorithms for noise reduction and image segmentation. The optimization procedure solves pixel-wise the non-linear inverse problem of determining the kinetic parameters from dynamic concentration data through a regularized Gauss-Newton iterative algorithm. The reliability of the method is validated against synthetic data, for the two-compartment system, and experimental real data of murine models, for the renal three-compartment system.
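The pixel-wise regularized Gauss-Newton iteration described above can be sketched generically. This is a minimal illustration with a numerical Jacobian and Tikhonov damping; the model function, damping parameter, and iteration count are assumptions for the example, not the paper's implementation.

```python
import numpy as np

def gauss_newton(model, t, y, k0, lam=1e-3, n_iter=50):
    """Regularized Gauss-Newton fit of kinetic parameters k to
    time-activity data y sampled at times t (illustrative sketch).

    Update: k <- k + (J^T J + lam*I)^(-1) J^T r, with residual
    r = y - model(t, k) and J the Jacobian of the model in k."""
    k = np.asarray(k0, dtype=float)
    eps = 1e-6
    for _ in range(n_iter):
        r = y - model(t, k)
        # Forward-difference Jacobian, one column per parameter
        J = np.empty((len(t), len(k)))
        for j in range(len(k)):
            dk = np.zeros_like(k)
            dk[j] = eps * max(1.0, abs(k[j]))
            J[:, j] = (model(t, k + dk) - model(t, k)) / dk[j]
        step = np.linalg.solve(J.T @ J + lam * np.eye(len(k)), J.T @ r)
        k = k + step
    return k
```

Running this independently at every pixel of a dynamic image, against that pixel's time-activity curve, yields a parametric map of each kinetic coefficient.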
NASA Technical Reports Server (NTRS)
Meyer, Peter; Larson, Steven A.; Hansen, Earl G.; Itten, Klaus I.
1993-01-01
Remotely sensed data have geometric characteristics and representation which depend on the type of the acquisition system used. To correlate such data over large regions with other real world representation tools like conventional maps or Geographic Information System (GIS) for verification purposes, or for further treatment within different data sets, a coregistration has to be performed. In addition to the geometric characteristics of the sensor there are two other dominating factors which affect the geometry: the stability of the platform and the topography. There are two basic approaches for a geometric correction on a pixel-by-pixel basis: (1) A parametric approach using the location of the airplane and inertial navigation system data to simulate the observation geometry; and (2) a non-parametric approach using tie points or ground control points. It is well known that the non-parametric approach is not reliable enough for the unstable flight conditions of airborne systems, and is not satisfying in areas with significant topography, e.g. mountains and hills. The present work describes a parametric preprocessing procedure which corrects effects of flight line and attitude variation as well as topographic influences and is described in more detail by Meyer.
NASA Astrophysics Data System (ADS)
Holmes, David W.; Sheehan, Madoc; Birks, Melanie; Smithson, John
2018-01-01
Mapping the curriculum of a professional degree to the associated competency standard ensures graduates have the competence to perform as professionals. Existing approaches to competence mapping vary greatly in depth, complexity, and effectiveness, and a standardised approach remains elusive. This paper describes a new mapping software tool that streamlines and standardises the competency mapping process. The available analytics facilitate ongoing programme review, management, and accreditation. The complete mapping and analysis of an Australian mechanical engineering degree programme is described as a case study. Each subject is mapped by evaluating the amount and depth of competence development present. Combining subject results then enables highly detailed programme level analysis. The mapping process is designed to be administratively light, with aspects of professional development embedded in the software. The effective competence mapping described in this paper enables quantification of learning within a professional degree programme, and provides a mechanism for holistic programme improvement.
Single-Frame Terrain Mapping Software for Robotic Vehicles
NASA Technical Reports Server (NTRS)
Rankin, Arturo L.
2011-01-01
This software is a component in an unmanned ground vehicle (UGV) perception system that builds compact, single-frame terrain maps for distribution to other systems, such as a world model or an operator control unit, over a local area network (LAN). Each cell in the map encodes an elevation value, terrain classification, object classification, terrain traversability, terrain roughness, and a confidence value into four bytes of memory. The input to this software component is a range image (from a lidar or stereo vision system), and optionally a terrain classification image and an object classification image, both registered to the range image. The single-frame terrain map generates estimates of the support surface elevation, ground cover elevation, and minimum canopy elevation; generates terrain traversability cost; detects low overhangs and high-density obstacles; and can perform geometry-based terrain classification (ground, ground cover, unknown). A new origin is automatically selected for each single-frame terrain map in global coordinates such that it coincides with the corner of a world map cell. That way, single-frame terrain maps correctly line up with the world map, facilitating the merging of map data into the world map. Instead of using 32 bits to store a floating-point elevation for each map cell, the map origin elevation is set to the vehicle elevation, and each cell reports the change in elevation (from the origin elevation) as a number of discrete steps. The single-frame terrain map elevation resolution is 2 cm. At that resolution, terrain elevation from -20.5 to 20.5 m (with respect to the vehicle's elevation) is encoded into 11 bits. For each four-byte map cell, bits are assigned to encode elevation, terrain roughness, terrain classification, object classification, terrain traversability cost, and a confidence value.
The vehicle's current position and orientation, the map origin, and the map cell resolution are all included in a header for each map. The map is compressed into a vector prior to delivery to another system.
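The four-byte cell encoding can be illustrated with a bit-packing sketch. The 11-bit, 2 cm elevation field follows the abstract; the remaining field widths and the offset-binary elevation encoding are assumptions chosen to fill 32 bits, not the actual layout used by the software.

```python
# Assumed field layout (name, bit width); only the 11-bit elevation
# field is taken from the abstract, the rest is illustrative.
FIELDS = [("elev", 11), ("rough", 5), ("tclass", 4),
          ("oclass", 4), ("cost", 4), ("conf", 4)]  # 32 bits total

def pack_cell(**vals):
    """Pack named field values into one 32-bit map-cell word."""
    word, shift = 0, 0
    for name, width in FIELDS:
        v = vals[name]
        assert 0 <= v < (1 << width), f"{name} out of range"
        word |= v << shift
        shift += width
    return word

def unpack_cell(word):
    """Recover the field values from a packed 32-bit cell word."""
    out, shift = {}, 0
    for name, width in FIELDS:
        out[name] = (word >> shift) & ((1 << width) - 1)
        shift += width
    return out

def encode_elev(delta_m):
    """Elevation change from the map origin in 2 cm steps, stored
    offset-binary so roughly -20.5..20.5 m fits in 11 bits."""
    return int(round(delta_m / 0.02)) + 1024
```

At 2 cm resolution, 2**11 = 2048 steps span about 41 m, which is why the approximately +/-20.5 m range quoted in the abstract fits in 11 bits.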
Investigating the Impact of Concept Mapping Software on Greek Students with Attention Deficit (AD)
ERIC Educational Resources Information Center
Riga, Asimina; Papayiannis, Nikolaos
2015-01-01
The present study investigates if there is a positive effect of the use of concept mapping software on students with Attention Deficit (AD) when learning descriptive writing in the secondary level of education. It also examines what kind of difficulties AD students may have come across during this learning procedure. Sample students were selected…
Digital Geological Mapping for Earth Science Students
NASA Astrophysics Data System (ADS)
England, Richard; Smith, Sally; Tate, Nick; Jordan, Colm
2010-05-01
This SPLINT (SPatial Literacy IN Teaching) supported project is developing pedagogies for the introduction of teaching of digital geological mapping to Earth Science students. Traditionally students are taught to make geological maps on a paper basemap with a notebook to record their observations. Learning to use a tablet pc with GIS based software for mapping and data recording requires emphasis on training staff and students in specific GIS and IT skills and beneficial adjustments to the way in which geological data is recorded in the field. A set of learning and teaching materials are under development to support this learning process. Following the release of the British Geological Survey's Sigma software we have been developing generic methodologies for the introduction of digital geological mapping to students that already have experience of mapping by traditional means. The teaching materials introduce the software to the students through a series of structured exercises. The students learn the operation of the software in the laboratory by entering existing observations, preferably data that they have collected. Through this the students benefit from being able to reflect on their previous work, consider how it might be improved and plan new work. Following this they begin fieldwork in small groups using both methods simultaneously. They are able to practise what they have learnt in the classroom and review the differences, advantages and disadvantages of the two methods, while adding to the work that has already been completed. Once the field exercises are completed students use the data that they have collected in the production of high quality map products and are introduced to the use of integrated digital databases which they learn to search and extract information from. 
The relatively recent development of the technologies which underpin digital mapping also means that many academic staff also require training before they are able to deliver the course materials. Consequently, a set of staff training materials are being developed in parallel to those for the students. These focus on the operation of the software and an introduction to the structure of the exercises. The presentation will review the teaching exercises and student and staff responses to their introduction.
BenMAP 1.0 is the legacy version of the BenMAP software that the EPA is making available for archival purposes. It is designed for regional and national-scale analyses within the Continental United States.
Download the current and legacy versions of the BenMAP program. Download configuration and aggregation/pooling/valuation files to estimate benefits. BenMAP-CE is free and open source software, and the source code is available upon request.
X-MATE: a flexible system for mapping short read data
Pearson, John V.; Cloonan, Nicole; Grimmond, Sean M.
2011-01-01
Summary: Accurate and complete mapping of short-read sequencing to a reference genome greatly enhances the discovery of biological results and improves statistical predictions. We recently presented RNA-MATE, a pipeline for the recursive mapping of RNA-Seq datasets. With the rapid increase in genome re-sequencing projects, progression of available mapping software and the evolution of file formats, we now present X-MATE, an updated version of RNA-MATE, capable of mapping both RNA-Seq and DNA datasets and with improved performance, output file formats, configuration files, and flexibility in core mapping software. Availability: Executables, source code, junction libraries, test data and results and the user manual are available from http://grimmond.imb.uq.edu.au/X-MATE/. Contact: n.cloonan@uq.edu.au; s.grimmond@uq.edu.au Supplementary information: Supplementary data are available at Bioinformatics Online. PMID:21216778
Three-Dimensional Geologic Map of the Hayward Fault Zone, San Francisco Bay Region, California
Phelps, G.A.; Graymer, R.W.; Jachens, R.C.; Ponce, D.A.; Simpson, R.W.; Wentworth, C.M.
2008-01-01
A three-dimensional (3D) geologic map of the Hayward Fault zone was created by integrating the results from geologic mapping, potential field geophysics, and seismology investigations. The map volume is 100 km long, 20 km wide, and extends to a depth of 12 km below sea level. The map volume is oriented northwest and is approximately bisected by the Hayward Fault. The complex geologic structure of the region makes it difficult to trace many geologic units into the subsurface. Therefore, the map units are generalized from 1:24,000-scale geologic maps. Descriptions of geologic units and structures are offered, along with a discussion of the methods used to map them and incorporate them into the 3D geologic map. The map spatial database and associated viewing software are provided. Elements of the map, such as individual fault surfaces, are also provided in a non-proprietary format so that the user can access the map via open-source software. The sheet accompanying this manuscript shows views taken from the 3D geologic map for the user to access. The 3D geologic map is designed as a multi-purpose resource for further geologic investigations and process modeling.
The Trial Software version for DEMETER power spectrum files visualization and mapping
NASA Astrophysics Data System (ADS)
Lozbin, Anatoliy; Inchin, Alexander; Shpadi, Maxim
2010-05-01
In the frame of Kazakhstan's Scientific Space System creation for earthquake precursor research, the hardware and software of the DEMETER satellite were investigated. The data processing software of DEMETER is based on the SWAN package under the IDL Virtual Machine and realizes many features, but it lacks an important tool for spectrogram analysis: space-time visualization of power spectrum files from the electromagnetic instruments ICE and IMSC. To fill this gap we have developed the software offered here. DeSS (DEMETER Spectrogram Software) is software for visualization, analysis, and mapping of power spectrum data from the electromagnetic instruments ICE and IMSC. The software's primary goal is to give the researcher a friendly tool for the analysis of electromagnetic data from the DEMETER satellite for earthquake precursor and other ionospheric event research. The input data for the DeSS software are power spectrum files: - power spectrum of 1 component of the electric field in the VLF range (APID 1132); - power spectrum of 1 component of the electric field in the HF range (APID 1134); - power spectrum of 1 component of the magnetic field in the VLF range (APID 1137). The main features and operations of the software are: - various time and frequency filtration; - visualization of the time dependence of signal intensity at a fixed frequency; - spectral density visualization for a fixed frequency range; - spectrogram autosize and smoothing; - the information at each point of the spectrogram: time, frequency, and intensity; - the spectrum information in a separate window, consisting of 4 blocks; - data mapping with a 6-range scale. On the map we can browse the following information: - satellite orbit; - conjugate point at the satellite altitude; - north conjugate point at an altitude of 110 km; - south conjugate point at an altitude of 110 km.
This is only a trial software version to help researchers, and we are always ready to collaborate with scientists on software improvement. References: 1. D. Lagoutte, J.Y. Brochot, D. de Carvalho, L. Madrias and M. Parrot. DEMETER Microsatellite. Scientific Mission Center. Data product description. DMT-SP-9-CM-6054-LPC. 2. D. Lagoutte, J.Y. Brochot, P. Latremoliere. SWAN - Software for Waveform Analysis. LPCE/NI/003.E - Part 1 (User's guide), Part 2 (Analysis tools), Part 3 (User's project interface).
Gottlieb, Josh; Princenthal, Robert; Cohen, Martin I
2017-07-01
To evaluate the multi-parametric MRI (mpMRI) findings in patients with biopsy-proven granulomatous prostatitis and prior Bacillus Calmette-Guérin (BCG) exposure. MRI was performed in six patients with pathologically proven granulomatous prostatitis and a prior history of bladder cancer treated with intravesical BCG therapy. Multi-parametric prostate MRI images were recorded on a GE 750W or Philips Achieva 3.0 Tesla MRI scanner with high-resolution, small-field-of-view imaging consisting of axial T2, axial T1, coronal T2, sagittal T2, axial multiple b-value diffusion (multiple values up to 1200 or 1400), and dynamic contrast-enhanced 3D axial T1 with fat suppression sequence. Two different patterns of MR findings were observed. Five of the six patients had a low mean ADC value <1000 (decreased signal on ADC map images) and isointense signal on high-b-value imaging (b = 1200 or 1400), consistent with nonspecific granulomatous prostatitis. The other pattern seen in one of the six patients was decreased signal on the ADC map images with increased signal on the high-b-value sequence, revealing true restricted diffusion indistinguishable from aggressive prostate cancer. This patient had biopsy-confirmed acute BCG prostatitis. Our study suggests that patients with known BCG exposure and PI-RADS v2 scores ≤3, showing similar mpMRI findings as demonstrated, may not require prostate biopsy.
Daniel J. Leduc; Thomas G. Matney; Keith L. Belli; V. Clark Baldwin
2001-01-01
Artificial neural networks (NN) are becoming a popular estimation tool. Because they require no assumptions about the form of a fitting function, they can free the modeler from reliance on parametric approximating functions that may or may not satisfactorily fit the observed data. To date there have been few applications in forestry science, but as better NN software...
Engine structures modeling software system: Computer code. User's manual
NASA Technical Reports Server (NTRS)
1992-01-01
ESMOSS is a specialized software system for the construction of geometric descriptive and discrete analytical models of engine parts, components and substructures which can be transferred to finite element analysis programs such as NASTRAN. The software architecture of ESMOSS is designed in modular form with a central executive module through which the user controls and directs the development of the analytical model. Modules consist of a geometric shape generator, a library of discretization procedures, interfacing modules to join both geometric and discrete models, a deck generator to produce input for NASTRAN and a 'recipe' processor which generates geometric models from parametric definitions. ESMOSS can be executed both in interactive and batch modes. Interactive mode is considered to be the default mode and that mode will be assumed in the discussion in this document unless stated otherwise.
Remotely Monitored Sealing Array Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
2012-09-12
The Remotely Monitored Sealing Array (RMSA) utilizes the Secure Sensor Platform (SSP) framework to establish the fundamental operating capabilities for communication, security, power management, and cryptography. In addition to the SSP framework the RMSA software has unique capabilities to support monitoring a fiber optic seal. Fiber monitoring includes open and closed as well as parametric monitoring to detect tampering attacks. The fiber monitoring techniques, using the SSP power management processes, allow the seals to last for years while maintaining the security requirements of the monitoring application. The seal is enclosed in a tamper resistant housing with software to support active tamper monitoring. New features include LED notification of fiber closure, the ability to retrieve the entire fiber optic history via translator command, separate memory storage for fiber optic events, and a more robust method for tracking and resending failed messages.
Grid adaption based on modified anisotropic diffusion equations formulated in the parametric domain
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hagmeijer, R.
1994-11-01
A new grid-adaption algorithm for problems in computational fluid dynamics is presented. The basic equations are derived from a variational problem formulated in the parametric domain of the mapping that defines the existing grid. Modification of the basic equations provides desirable properties in boundary layers. The resulting modified anisotropic diffusion equations are solved for the computational coordinates as functions of the parametric coordinates and these functions are numerically inverted. Numerical examples show that the algorithm is robust, that shocks and boundary layers are well-resolved on the adapted grid, and that the flow solution becomes a globally smooth function of the computational coordinates.
McGoron, Anthony J; Capille, Michael; Georgiou, Michael F; Sanchez, Pablo; Solano, Juan; Gonzalez-Brito, Manuel; Kuluz, John W
2008-01-01
Background Assessment of cerebral blood flow (CBF) by SPECT could be important in the management of patients with severe traumatic brain injury (TBI) because changes in regional CBF can affect outcome by promoting edema formation and intracranial pressure elevation (with cerebral hyperemia), or by causing secondary ischemic injury including post-traumatic stroke. The purpose of this study was to establish an improved method for evaluating regional CBF changes after TBI in piglets. Methods The focal effects of moderate traumatic brain injury (TBI) on cerebral blood flow (CBF) by SPECT cerebral blood perfusion (CBP) imaging in an animal model were investigated by parallelized statistical techniques. Regional CBF was measured by radioactive microspheres and by SPECT 2 hours after injury in sham-operated piglets versus those receiving severe TBI by fluid-percussion injury to the left parietal lobe. Qualitative SPECT CBP accuracy was assessed against reference radioactive microsphere regional CBF measurements by map reconstruction, registration and smoothing. Cerebral hypoperfusion in the test group was identified at the voxel level using statistical parametric mapping (SPM). Results A significant area of hypoperfusion (P < 0.01) was found as a response to the TBI. Statistical mapping of the reference microsphere CBF data confirms a focal decrease found with SPECT and SPM. Conclusion The suitability of SPM for application to the experimental model and ability to provide insight into CBF changes in response to traumatic injury was validated by the SPECT SPM result of a decrease in CBP at the left parietal region injury area of the test group. Further study and correlation of this characteristic lesion with long-term outcomes and auxiliary diagnostic modalities is critical to developing more effective critical care treatment guidelines and automated medical imaging processing techniques. PMID:18312639
McGoron, Anthony J; Capille, Michael; Georgiou, Michael F; Sanchez, Pablo; Solano, Juan; Gonzalez-Brito, Manuel; Kuluz, John W
2008-02-29
Assessment of cerebral blood flow (CBF) by SPECT could be important in the management of patients with severe traumatic brain injury (TBI) because changes in regional CBF can affect outcome by promoting edema formation and intracranial pressure elevation (with cerebral hyperemia), or by causing secondary ischemic injury including post-traumatic stroke. The purpose of this study was to establish an improved method for evaluating regional CBF changes after TBI in piglets. The focal effects of moderate traumatic brain injury (TBI) on cerebral blood flow (CBF) by SPECT cerebral blood perfusion (CBP) imaging in an animal model were investigated by parallelized statistical techniques. Regional CBF was measured by radioactive microspheres and by SPECT 2 hours after injury in sham-operated piglets versus those receiving severe TBI by fluid-percussion injury to the left parietal lobe. Qualitative SPECT CBP accuracy was assessed against reference radioactive microsphere regional CBF measurements by map reconstruction, registration and smoothing. Cerebral hypoperfusion in the test group was identified at the voxel level using statistical parametric mapping (SPM). A significant area of hypoperfusion (P < 0.01) was found as a response to the TBI. Statistical mapping of the reference microsphere CBF data confirms a focal decrease found with SPECT and SPM. The suitability of SPM for application to the experimental model and ability to provide insight into CBF changes in response to traumatic injury was validated by the SPECT SPM result of a decrease in CBP at the left parietal region injury area of the test group. Further study and correlation of this characteristic lesion with long-term outcomes and auxiliary diagnostic modalities is critical to developing more effective critical care treatment guidelines and automated medical imaging processing techniques.
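The voxel-level group comparison that SPM performs can be illustrated with a minimal two-sample (Welch) t-map. This is a hedged stand-in for SPM's full general-linear-model machinery, not the authors' pipeline: it simply computes a t statistic at every voxel between two groups of co-registered perfusion volumes.

```python
import numpy as np

def voxelwise_t(group_a, group_b):
    """Two-sample Welch t statistic at every voxel (minimal sketch
    of a voxel-level group comparison, not SPM itself).

    group_a, group_b: arrays of shape (n_subjects, *voxel_dims),
    assumed spatially registered and smoothed beforehand.
    Returns a t-map with the voxel dimensions."""
    ma, mb = group_a.mean(axis=0), group_b.mean(axis=0)
    va = group_a.var(axis=0, ddof=1) / group_a.shape[0]
    vb = group_b.var(axis=0, ddof=1) / group_b.shape[0]
    return (ma - mb) / np.sqrt(va + vb)
```

Thresholding the resulting t-map (with a multiple-comparison correction, which SPM provides and this sketch omits) is what identifies clusters such as the hypoperfused region reported at P < 0.01.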
NASA Astrophysics Data System (ADS)
Cerretelli, Stefania; Poggio, Laura; Gimona, Alessandro; Peressotti, Alessandro; Black, Helaina
2017-04-01
Land and soil degradation are widespread, especially in dry and developing countries such as Ethiopia. Land degradation leads to ecosystem services (ESS) degradation, because it causes the depletion and loss of several soil functions. Ethiopia's farmland faces intense degradation due to deforestation, agricultural land expansion, land overexploitation and overgrazing. In this study we modelled the impact of physical factors on ESS degradation, in particular soil erodibility, carbon storage and nutrient retention, in the Ethiopian Great Rift Valley, northwest of Hawassa. We used the Sediment Retention/Loss and Nutrient Retention/Loss models (from the software suite InVEST) and a Carbon Storage model. To run the models we coupled local soil data (such as soil organic carbon and soil texture) with remote sensing data as input in the parametrization phase, e.g. to derive a land use map and to calculate the aboveground and belowground carbon, the evapotranspiration coefficient and the capacity of vegetation to retain nutrients. We then used spatialised Bayesian Belief Networks (sBBNs) to predict ecosystem services degradation on the basis of the results of the three mechanistic models. The results show i) the importance of mapping ESS degradation while taking into consideration the spatial heterogeneity and the cross-correlations between impacts, ii) the fundamental role of remote sensing data in monitoring and modelling in remote, data-poor areas, and iii) the important role of spatial BBNs in providing spatially explicit measures of risk and uncertainty. This approach could help decision makers to identify priority areas for intervention in order to reduce land and ecosystem services degradation.
Modularity, quaternion-Kähler spaces, and mirror symmetry
DOE Office of Scientific and Technical Information (OSTI.GOV)
Alexandrov, Sergei; Banerjee, Sibasish
2013-10-15
We provide an explicit twistorial construction of quaternion-Kähler manifolds obtained by deformation of c-map spaces and carrying an isometric action of the modular group SL(2,Z). The deformation is not assumed to preserve any continuous isometry and therefore this construction presents a general framework for describing NS5-brane instanton effects in string compactifications with N= 2 supersymmetry. In this context the modular invariant parametrization of twistor lines found in this work yields the complete non-perturbative mirror map between type IIA and type IIB physical fields.
Spectral analysis for nonstationary and nonlinear systems: a discrete-time-model-based approach.
He, Fei; Billings, Stephen A; Wei, Hua-Liang; Sarrigiannis, Ptolemaios G; Zhao, Yifan
2013-08-01
A new frequency-domain analysis framework for nonlinear time-varying systems is introduced based on parametric time-varying nonlinear autoregressive with exogenous input models. It is shown how the time-varying effects can be mapped to the generalized frequency response functions (FRFs) to track nonlinear features in frequency, such as intermodulation and energy transfer effects. A new mapping to the nonlinear output FRF is also introduced. A simulated example and the application to intracranial electroencephalogram data are used to illustrate the theoretical results.
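For the linear (AR) part of such a time-varying model, the mapping from coefficients to a frequency response at each time step can be sketched as follows. This is only an illustration of the time-varying-coefficients-to-frequency-domain idea; the paper's generalized FRFs for the nonlinear and exogenous-input terms are beyond this sketch.

```python
import numpy as np

def tv_ar_frf(a_coeffs, freqs, fs=1.0):
    """Frequency response of a time-varying AR model at each time step.

    a_coeffs: array (n_time, order) of coefficients a_k(t) in
              y(t) = sum_k a_k(t) * y(t-k) + e(t)
    freqs:    frequencies in the same units as fs.
    Returns H of shape (n_time, n_freqs):
              H(f, t) = 1 / (1 - sum_k a_k(t) * exp(-i*2*pi*f*k/fs))"""
    a = np.asarray(a_coeffs, dtype=float)
    k = np.arange(1, a.shape[1] + 1)
    # (n_freqs, order) grid of complex exponentials for each lag
    w = np.exp(-2j * np.pi * np.outer(freqs, k) / fs)
    return 1.0 / (1.0 - a @ w.T)
```

Tracking |H(f, t)| over t is the simplest version of following spectral features through time; intermodulation and energy-transfer effects require the higher-order FRFs developed in the paper.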
Applying the metro map to software development management
NASA Astrophysics Data System (ADS)
Aguirregoitia, Amaia; Dolado, J. Javier; Presedo, Concepción
2010-01-01
This paper presents MetroMap, a new graphical representation model for controlling and managing the software development process. Metromap uses metaphors and visual representation techniques to explore several key indicators in order to support problem detection and resolution. The resulting visualization addresses diverse management tasks, such as tracking of deviations from the plan, analysis of patterns of failure detection and correction, overall assessment of change management policies, and estimation of product quality. The proposed visualization uses a metaphor with a metro map along with various interactive techniques to represent information concerning the software development process and to deal efficiently with multivariate visual queries. Finally, the paper shows the implementation of the tool in JavaFX with data of a real project and the results of testing the tool with the aforementioned data and users attempting several information retrieval tasks. The conclusion shows the results of analyzing user response time and efficiency using the MetroMap visualization system. The utility of the tool was positively evaluated.
Kumar, Rajendra; Sobhy, Haitham
2017-01-01
Abstract Hi-C experiments generate data in the form of large genome contact maps (Hi-C maps). These show that chromosomes are arranged in a hierarchy of three-dimensional compartments. But to understand how these compartments form and by how much they affect genetic processes such as gene regulation, biologists and bioinformaticians need efficient tools to visualize and analyze Hi-C data. However, this is technically challenging because these maps are big. In this paper, we remedied this problem, partly by implementing an efficient file format, and developed the genome contact map explorer platform. Apart from tools to process Hi-C data, such as normalization methods and a programmable interface, we made a graphical interface that lets users browse, scroll and zoom Hi-C maps to visually search for patterns in the Hi-C data. In the software, it is also possible to browse several maps simultaneously and plot related genomic data. The software is openly accessible to the scientific community. PMID:28973466
Space transfer vehicle concepts and requirements study. Volume 3, book 1: Program cost estimates
NASA Technical Reports Server (NTRS)
Peffley, Al F.
1991-01-01
The Space Transfer Vehicle (STV) Concepts and Requirements Study cost estimate and program planning analysis is presented. The cost estimating technique used to support STV system, subsystem, and component cost analysis is a mixture of parametric cost estimating and selective cost analogy approaches. The parametric cost analysis is aimed at developing cost-effective aerobrake, crew module, tank module, and lander designs with the parametric cost estimates data. This is accomplished using cost as a design parameter in an iterative process with conceptual design input information. The parametric estimating approach segregates costs by major program life cycle phase (development, production, integration, and launch support). These phases are further broken out into major hardware subsystems, software functions, and tasks according to the STV preliminary program work breakdown structure (WBS). The WBS is defined to a low enough level of detail by the study team to highlight STV system cost drivers. This level of cost visibility provided the basis for cost sensitivity analysis against various design approaches aimed at achieving a cost-effective design. The cost approach, methodology, and rationale are described. A chronological record of the interim review material relating to cost analysis is included along with a brief summary of the study contract tasks accomplished during that period of review and the key conclusions or observations identified that relate to STV program cost estimates. The STV life cycle costs are estimated on the proprietary parametric cost model (PCM) with inputs organized by a project WBS. Preliminary life cycle schedules are also included.
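The parametric estimating approach described above typically rests on power-law cost-estimating relationships (CERs) rolled up through the WBS. A minimal sketch with made-up coefficients and subsystem weights (the actual PCM model is proprietary and its form is not given in the study):

```python
# Minimal parametric cost-estimating relationship (CER) sketch.
# The power-law form and all coefficients here are illustrative
# assumptions, not values from the STV study or the proprietary PCM.

def cer_cost(weight_kg, a=150.0, b=0.7):
    """Power-law CER: estimated cost = a * weight^b (arbitrary $K)."""
    return a * weight_kg ** b

def phase_costs(subsystem_weights, phase_factors):
    """Roll subsystem CER estimates up into life-cycle phases,
    mirroring a WBS-style breakout (development, production, ...)."""
    base = sum(cer_cost(w) for w in subsystem_weights.values())
    return {phase: base * f for phase, f in phase_factors.items()}

weights = {"aerobrake": 1200.0, "crew_module": 3500.0, "tank_module": 2800.0}
phases = {"development": 1.0, "production": 0.6, "integration": 0.25}
estimate = phase_costs(weights, phases)
```

Because cost is an explicit function of design parameters, such a model can sit inside an iterative design loop, which is the "cost as a design parameter" idea the abstract describes.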
An overview of the Hadoop/MapReduce/HBase framework and its current applications in bioinformatics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taylor, Ronald C.
Bioinformatics researchers are increasingly confronted with analysis of ultra large-scale data sets, a problem that will only grow at an alarming rate in coming years. Recent developments in open source software, namely the Hadoop project and its associated software, provide a foundation for scaling to petabyte-scale data warehouses on Linux clusters, offering fault-tolerant parallelized analysis of such data using a programming style named MapReduce. An overview is given of the current usage within the bioinformatics community of Hadoop, a top-level Apache Software Foundation project, and of associated open source software projects. The concepts behind Hadoop and the associated HBase project are defined, and current bioinformatics software that employs Hadoop is described. The focus is on next-generation sequencing, as the leading application area to date.
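The MapReduce style named above can be sketched in a few lines of plain Python, here counting k-mers in sequencing reads. This shows only the programming model that Hadoop parallelizes across a cluster, not Hadoop code:

```python
# Toy MapReduce-style k-mer counting in pure Python, illustrating the
# map / shuffle / reduce phases Hadoop distributes over many machines.
from collections import defaultdict

def map_phase(read, k=3):
    """Emit (k-mer, 1) pairs from one sequencing read."""
    return [(read[i:i + k], 1) for i in range(len(read) - k + 1)]

def shuffle(pairs):
    """Group emitted values by key, as the framework does between phases."""
    grouped = defaultdict(list)
    for key, value in pairs:
        grouped[key].append(value)
    return grouped

def reduce_phase(grouped):
    """Sum the grouped counts per k-mer."""
    return {kmer: sum(vals) for kmer, vals in grouped.items()}

reads = ["GATTACA", "TTACAGA"]
pairs = [p for read in reads for p in map_phase(read)]
counts = reduce_phase(shuffle(pairs))
```

In Hadoop, `map_phase` and `reduce_phase` would run as independent tasks on cluster nodes while the framework handles the shuffle, fault tolerance, and data locality.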
Parametric Quantum Search Algorithm as Quantum Walk: A Quantum Simulation
NASA Astrophysics Data System (ADS)
Ellinas, Demosthenes; Konstandakis, Christos
2016-02-01
Parametric quantum search algorithm (PQSA) is a form of quantum search that results from relaxing the unitarity of the original algorithm. PQSA can naturally be cast in the form of a quantum walk (QW) by means of the formalism of oracle algebra. This is due to the fact that the completely positive trace-preserving search map used by PQSA admits a unitarization (unitary dilation) a la quantum walk, at the expense of introducing an auxiliary quantum coin-qubit space. The ensuing QW describes a process of spiral motion, chosen to be driven by two unitary Kraus generators that generate planar rotations of the Bloch vector around an axis. The quadratic acceleration of quantum search translates into an equivalent quadratic saving in the number of coin qubits in the QW analogue. The Hamiltonian operator associated with the QW model is obtained and is shown to represent a multi-particle, long-range interacting quantum system that simulates parametric search. Finally, the relation of the PQSA-QW simulator to the QW search algorithm is elucidated.
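For contrast with the parametric (non-unitary) variant, the original unitary Grover search that PQSA relaxes can be simulated directly on a state vector; a minimal sketch:

```python
# Pure-Python state-vector simulation of standard (unitary) Grover
# search, the algorithm whose relaxation PQSA describes. Illustrative
# only; not the paper's parametric (CPTP) search map.
import math

def grover(n_qubits, marked, iterations):
    n = 2 ** n_qubits
    amp = [1.0 / math.sqrt(n)] * n          # uniform superposition
    for _ in range(iterations):
        amp[marked] = -amp[marked]          # oracle: phase-flip marked item
        mean = sum(amp) / n                 # diffusion: inversion about mean
        amp = [2.0 * mean - a for a in amp]
    return amp

amp = grover(3, marked=5, iterations=2)     # optimal ~ (pi/4)*sqrt(8) ~ 2
p_success = amp[5] ** 2
```

Two iterations on eight items already concentrate most of the probability on the marked item, the quadratic speed-up the abstract refers to.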
NASA Astrophysics Data System (ADS)
Löwe, Peter
2015-04-01
Many Free and Open Source Software (FOSS) tools have been created for the various application fields within geoscience. While FOSS allows re-implementation of functionalities in new environments by access to the original codebase, the easiest approach to building new software solutions for new problems is the combination or merging of existing software tools. Such mash-ups are implemented by embedding and encapsulating FOSS tools within one another, effectively focusing the use of the embedded software on the specific role it needs to perform in the given scenario, while ignoring all its other capabilities. GRASS GIS is a powerful and established FOSS GIS for raster, vector and volume data processing, while the Generic Mapping Tools (GMT) are a suite of powerful Open Source mapping tools, which exceed the mapping capabilities of GRASS GIS. This poster reports on the new GRASS GIS add-on module r.out.polycones. It enables users to utilize non-continuous projections for map production within the GRASS production environment. This is implemented on the software level by encapsulating a subset of GMT mapping capabilities in a GRASS GIS (Version 6.x) add-on module. The module was developed at the German National Library of Science and Technology (TIB) to provide custom global maps of scientific collaboration networks, such as the DataCite consortium, the registration agency for Digital Object Identifiers (DOI) for research data. The GRASS GIS add-on module can be used for global mapping of raster data into a variety of non-continuous sinusoidal projections, allowing the creation of printable biangles (gores) to be used for globe making. Due to the well-structured modular nature of GRASS modules, technical follow-up work will focus on API-level Python-based integration in GRASS 7 [1]. Based on this, GMT-based mapping capabilities in GRASS will be extended beyond non-continuous sinusoidal maps and advanced from raster layers to the content of GRASS display monitors.
References: [1] Petras, V., Petrasova, A., Chemin, Y., Zambelli, P., Landa, M., Gebbert, S., Neteler, N., Löwe, P.: Analyzing rasters, vectors and time series using new Python interfaces in GRASS GIS 7, Geophysical Research Abstracts Vol. 17, EGU2015-8142, 2015 (in preparation)
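The gore construction mentioned above follows from the sinusoidal projection, in which horizontal extent shrinks with the cosine of latitude within each gore. A minimal sketch (not the r.out.polycones source):

```python
# Sinusoidal (interrupted) projection behind printable globe gores:
# within each gore, x contracts with cos(latitude), so the biangle
# tapers toward the poles. Illustrative sketch only.
import math

def sinusoidal_gore(lon_deg, lat_deg, gore_center_deg, radius=1.0):
    """Project (lon, lat) into a gore centred on gore_center_deg."""
    lat = math.radians(lat_deg)
    dlon = math.radians(lon_deg - gore_center_deg)
    x = radius * dlon * math.cos(lat)   # width shrinks with latitude
    y = radius * lat
    return x, y

# Edge of a 30-degree-wide gore centred on the Greenwich meridian:
x_eq, y_eq = sinusoidal_gore(15.0, 0.0, 0.0)    # at the equator
x_60, y_60 = sinusoidal_gore(15.0, 60.0, 0.0)   # same meridian at 60 N
```

At 60 degrees latitude the gore edge sits at exactly half its equatorial distance from the central meridian, which is what makes the printed biangles wrap onto a sphere.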
National Geothermal Data System: Open Access to Geoscience Data, Maps, and Documents
NASA Astrophysics Data System (ADS)
Caudill, C. M.; Richard, S. M.; Musil, L.; Sonnenschein, A.; Good, J.
2014-12-01
The U.S. National Geothermal Data System (NGDS) provides free open access to millions of geoscience data records, publications, maps, and reports via distributed web services to propel geothermal research, development, and production. NGDS is built on the US Geoscience Information Network (USGIN) data integration framework, which is a joint undertaking of the USGS and the Association of American State Geologists (AASG), and is compliant with international standards and protocols. NGDS currently serves geoscience information from 60+ data providers in all 50 states. Free and open source software is used in this federated system, where data owners maintain control of their data. This interactive online system makes geoscience data easily discoverable, accessible, and interoperable at no cost to users. The dynamic project site http://geothermaldata.org serves as the information source and gateway to the system, allowing data and applications discovery and availability of the system's data feed. It also provides access to NGDS specifications and the free and open source code base (on GitHub), a map-centric and library-style search interface, other software applications utilizing NGDS services, NGDS tutorials (via YouTube and the USGIN site), and user-created tools and scripts. The user-friendly, map-centric web application supports finding, visualizing, mapping, and acquiring data by topic, location, time, provider, or key words. Geographic datasets visualized through the map interface also allow users to inspect the details of individual GIS data points (e.g. wells, geologic units, etc.). In addition, the interface provides the information necessary for users to access the GIS data from third-party software applications such as Google Earth, uDig, and ArcGIS.
A redistributable, free and open source software package called GINstack (USGIN software stack) was also created to give data providers a simple way to release data using interoperable and shareable standards, upload data and documents, and expose those data as a node in the NGDS or any larger data system through a CSW endpoint. The easy-to-use interface is supported by back-end software including Postgres, GeoServer, and custom CKAN extensions among others.
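A client consuming such standards-based services typically just composes OGC request URLs. A sketch of a WMS 1.3.0 GetMap request, with a placeholder endpoint and layer name (not real NGDS values):

```python
# Composing an OGC WMS 1.3.0 GetMap request URL, the kind of call
# third-party clients issue against services like those NGDS exposes.
# The endpoint and layer name are placeholders, not real NGDS values.
from urllib.parse import urlencode

def wms_getmap_url(endpoint, layer, bbox, width=800, height=600):
    params = {
        "SERVICE": "WMS",
        "VERSION": "1.3.0",
        "REQUEST": "GetMap",
        "LAYERS": layer,
        "CRS": "EPSG:4326",
        # WMS 1.3.0 + EPSG:4326 uses lat/lon axis order:
        # minlat, minlon, maxlat, maxlon
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": width,
        "HEIGHT": height,
        "FORMAT": "image/png",
    }
    return endpoint + "?" + urlencode(params)

url = wms_getmap_url("https://example.org/wms", "geothermal_wells",
                     (30.0, -120.0, 45.0, -100.0))
```

The same URL works from a browser, a script, or a desktop GIS, which is what makes WMS layers composable across servers.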
USDA-ARS?s Scientific Manuscript database
Ecophysiological crop models encode intra-species behaviors using parameters that are presumed to summarize genotypic properties of individual lines or cultivars. These genotype-specific parameters (GSP’s) can be interpreted as quantitative traits that can be mapped or otherwise analyzed, as are mor...
NASA Astrophysics Data System (ADS)
Li, Xin; Rooney, William D.; Várallyay, Csanád G.; Gahramanov, Seymur; Muldoon, Leslie L.; Goodman, James A.; Tagge, Ian J.; Selzer, Audrey H.; Pike, Martin M.; Neuwelt, Edward A.; Springer, Charles S.
2010-10-01
The accurate mapping of the tumor blood volume (TBV) fraction (vb) is a highly desired imaging biometric goal. It is commonly thought that achieving this is difficult, if not impossible, when small-molecule contrast reagents (CRs) are used for the T1-weighted Dynamic-Contrast-Enhanced (DCE) MRI technique, because angiogenic malignant tumor vessels allow facile CR extravasation. Here, a three-site equilibrium water exchange model is applied to DCE-MRI data from the cerebrally implanted rat brain U87 glioma, a tumor exhibiting rapid CR extravasation. Analyses of segments of the DCE data time-course (and of the entire time-course) with this "shutter-speed" pharmacokinetic model, which admits finite water exchange kinetics, allow TBV estimation from the first-pass segment. Pairwise parameter determinances were tested with grid searches of 2D parametric error surfaces. Parametric maps of tumor blood volume (vb), ve (the extracellular, extravascular space volume fraction), and Ktrans (a measure of the CR extravasation rate) are presented. The role of the Patlak plot in DCE-MRI is also considered.
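The Patlak plot mentioned in closing linearizes irreversible-uptake kinetics: plotting Ct/Cp against the time-integral of Cp divided by Cp yields a line whose slope estimates Ktrans and whose intercept estimates the plasma fraction vp. A self-contained sketch on synthetic data (illustrative of the Patlak construction only, not the shutter-speed model):

```python
# Patlak-plot fit on synthetic DCE data with known parameters.
# Illustrative sketch; not the paper's shutter-speed analysis.
import math

def cumtrapz(ts, ys):
    """Cumulative trapezoidal integral of ys over ts."""
    out, acc = [0.0], 0.0
    for i in range(1, len(ts)):
        acc += 0.5 * (ys[i] + ys[i - 1]) * (ts[i] - ts[i - 1])
        out.append(acc)
    return out

def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

ktrans_true, vp_true = 0.05, 0.1                 # per-min rate, plasma fraction
ts = [0.5 * i for i in range(1, 21)]             # minutes
cp = [10.0 * math.exp(-0.2 * t) for t in ts]     # plasma input function
icp = cumtrapz(ts, cp)
ct = [ktrans_true * i + vp_true * c for i, c in zip(icp, cp)]

xs = [i / c for i, c in zip(icp, cp)]            # Patlak abscissa
ys = [t / c for t, c in zip(ct, cp)]             # Patlak ordinate
ktrans_est, vp_est = linear_fit(xs, ys)
```

With noiseless irreversible-uptake data the fit recovers the generating parameters exactly; real data deviate from the line once back-flux or water-exchange effects matter, which is where the shutter-speed model differs.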
3-D Quantitative Dynamic Contrast Ultrasound for Prostate Cancer Localization.
Schalk, Stefan G; Huang, Jing; Li, Jia; Demi, Libertario; Wijkstra, Hessel; Huang, Pintong; Mischi, Massimo
2018-04-01
To investigate quantitative 3-D dynamic contrast-enhanced ultrasound (DCE-US), and in particular 3-D contrast-ultrasound dispersion imaging (CUDI), for prostate cancer detection and localization, 43 patients referred for 10-12-core systematic biopsy underwent 3-D DCE-US. For each 3-D DCE-US recording, parametric maps of CUDI-based and perfusion-based parameters were computed. The parametric maps were divided into regions, each corresponding to a biopsy core. The obtained parameters were validated per biopsy location and after combining two or more adjacent regions. For CUDI by correlation (r) and for the wash-in time (WIT), a significant difference in parameter values between benign and malignant biopsy cores was found (p < 0.001). In a per-prostate analysis, sensitivity and specificity were 94% and 50% for r, and 53% and 81% for WIT. Based on these results, it can be concluded that quantitative 3-D DCE-US could aid in localizing prostate cancer. Therefore, we recommend follow-up studies to investigate its value for targeting biopsies. Copyright © 2018 World Federation for Ultrasound in Medicine and Biology. Published by Elsevier Inc. All rights reserved.
Elastostatic stress analysis of orthotropic rectangular center-cracked plates
NASA Technical Reports Server (NTRS)
Gyekenyesi, G. S.; Mendelson, A.
1972-01-01
A mapping-collocation method was developed for the elastostatic stress analysis of finite, anisotropic plates with centrally located traction-free cracks. The method essentially consists of mapping the crack into the unit circle and satisfying the crack boundary conditions exactly with the help of Muskhelishvili's function extension concept. The conditions on the outer boundary are satisfied approximately by applying the method of least-squares boundary collocation. A parametric study of finite-plate stress intensity factors, employing this mapping-collocation method, is presented. It shows the effects of varying material properties, orientation angle, and crack-length-to-plate-width and plate-height-to-plate-width ratios for rectangular orthotropic plates under constant tensile and shear loads.
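The least-squares boundary collocation step can be illustrated in miniature: sample the boundary at more points than there are series coefficients and solve the normal equations. A toy version with a polynomial basis (a sketch of the idea, not the paper's complex stress functions):

```python
# Least-squares boundary collocation in miniature: fit a truncated
# basis to boundary data at more points than coefficients via the
# normal equations. Sketch of the idea, not the paper's method.

def solve(A, b):
    """Gaussian elimination with partial pivoting."""
    n = len(A)
    M = [row[:] + [b[i]] for i, row in enumerate(A)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c]
                              for c in range(r + 1, n))) / M[r][r]
    return x

def collocate(basis, points, values):
    """Least-squares fit of basis coefficients to boundary values."""
    A = [[f(p) for f in basis] for p in points]
    # normal equations: (A^T A) c = A^T v
    AtA = [[sum(A[k][i] * A[k][j] for k in range(len(points)))
            for j in range(len(basis))] for i in range(len(basis))]
    Atv = [sum(A[k][i] * values[k] for k in range(len(points)))
           for i in range(len(basis))]
    return solve(AtA, Atv)

basis = [lambda x: 1.0, lambda x: x, lambda x: x * x]
points = [0.0, 0.25, 0.5, 0.75, 1.0]               # collocation points
values = [2.0 - 3.0 * p + p * p for p in points]   # boundary data
coeffs = collocate(basis, points, values)
```

In the paper the basis functions are terms of a mapped complex stress series and the "boundary data" are traction conditions on the plate edge, but the least-squares machinery is the same.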
Integrating Multibody Simulation and CFD: toward Complex Multidisciplinary Design Optimization
NASA Astrophysics Data System (ADS)
Pieri, Stefano; Poloni, Carlo; Mühlmeier, Martin
This paper describes the use of integrated multidisciplinary analysis and optimization of a race car model on a predefined circuit. The objective is the definition of the most efficient geometric configuration that can guarantee the lowest lap time. To carry out this study it was necessary to interface the design optimization software modeFRONTIER with the following software packages: CATIA v5, a three-dimensional CAD package, used for the definition of the parametric geometry; A.D.A.M.S./Motorsport, a multi-body dynamic simulation package; IcemCFD, a mesh generator, for the automatic generation of the CFD grid; and CFX, a Navier-Stokes code, for the prediction of fluid-dynamic forces. The process integration makes it possible to compute, for each geometric configuration, a set of aerodynamic coefficients that are then used in the multi-body simulation to compute the lap time. Finally, an automatic optimization procedure is run and the lap time minimized. The whole process is executed on a Linux cluster running CFD simulations in parallel.
NASA Astrophysics Data System (ADS)
Deliparaschos, Kyriakos M.; Michail, Konstantinos; Zolotas, Argyrios C.; Tzafestas, Spyros G.
2016-05-01
This work presents a field programmable gate array (FPGA)-based embedded software platform coupled with a software-based plant, forming a hardware-in-the-loop (HIL) system that is used to validate a systematic sensor selection framework. The framework combines multi-objective optimization, linear-quadratic-Gaussian (LQG)-type control, and the nonlinear model of a maglev suspension. A robustness analysis of the closed loop follows (prior to implementation), supporting the appropriateness of the solution under parametric variation. The analysis also shows that quantization is robust under different controller gains. While the LQG controller is implemented on an FPGA, the physical process is realized in a high-level system modeling environment. FPGA technology enables rapid evaluation of the algorithms and test designs under realistic scenarios, avoiding the heavy time penalty associated with hardware description language (HDL) simulators. The HIL technique delivers a significant speed-up in execution time compared to its software-based counterpart model.
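The quantization robustness noted above concerns representing controller gains in fixed point on the FPGA. A minimal sketch of Q-format rounding and its worst-case error (the gains here are invented, not the maglev controller's):

```python
# Fixed-point quantization of controller gains, as an FPGA
# implementation of an LQG controller implies. Gains are invented
# for illustration; not the maglev suspension controller's values.

def quantize(value, frac_bits):
    """Round to the nearest multiple of 2**-frac_bits (Q-format)."""
    scale = 2 ** frac_bits
    return round(value * scale) / scale

gains = [12.73, -4.118, 0.0625, 157.9]
q16 = [quantize(g, 16) for g in gains]            # Q*.16 representation
worst_err = max(abs(g - q) for g, q in zip(gains, q16))
lsb = 2 ** -16
```

Rounding bounds the per-gain error by half an LSB; a closed-loop robustness analysis like the one in the paper then checks that perturbations of this size cannot destabilize the loop.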
Guetarni, F; Rigoard, P
2015-03-01
Conventional spinal cord stimulation (SCS) generates paraesthesia, as the efficacy of this technique is based on the relationship between the paraesthesia provided by SCS on the painful zone and an analgesic effect on the stimulated zone. Although this basic postulate is based on clinical evidence, this relationship has never been formally demonstrated by scientific studies. There is a need for objective evaluation tools ("transducers") to transpose electrical signals to clinical effects and to guide therapeutic choices. We have developed software at Poitiers University Hospital that allows real-time objective mapping, on a touch-screen interface, of the paraesthesia generated by SCS lead placement and programming during the implantation procedure itself. The purpose of this article is to describe this intraoperative mapping software, in terms of its concept and technical aspects. The Neuro-Mapping Locator (NML) software enables patients with failed back surgery syndrome who are candidates for SCS lead implantation to participate actively in the implantation procedure. Real-time geographical localization of the paraesthesia generated by a percutaneous or multicolumn surgical SCS lead implanted under awake anaesthesia allows intraoperative lead programming, and possibly lead positioning, to be modified with the patient's cooperation. Software updates should enable us to refine objectives related to the use of this tool and minimize observational biases. The ultimate goals of the NML software should not be limited to optimizing one specific device implantation in a patient, but should also allow various stimulation strategies to be compared instantaneously, by characterizing new technical parameters such as "coverage efficacy" and "device specificity" in selected subgroups of patients.
Another longer-term objective would be to organize these predictive factors into computer science ontologies, which could constitute robust and helpful data for device selection and programming of tomorrow's neurostimulators. Copyright © 2014 Elsevier Masson SAS. All rights reserved.
Photometric Modeling of Simulated Surface-Resolved Bennu Images
NASA Astrophysics Data System (ADS)
Golish, D.; DellaGiustina, D. N.; Clark, B.; Li, J. Y.; Zou, X. D.; Bennett, C. A.; Lauretta, D. S.
2017-12-01
The Origins, Spectral Interpretation, Resource Identification, Security, Regolith Explorer (OSIRIS-REx) is a NASA mission to study and return a sample of asteroid (101955) Bennu. Imaging data from the mission will be used to develop empirical surface-resolved photometric models of Bennu at a series of wavelengths. These models will be used to photometrically correct panchromatic and color base maps of Bennu, compensating for variations due to shadows and photometric angle differences, thereby minimizing seams in mosaicked images. Well-corrected mosaics are critical to the generation of a global hazard map and a global 1064-nm reflectance map that predicts LIDAR response. These data products directly feed into the selection of a site from which to safely acquire a sample. We also require photometric correction for the creation of color ratio maps of Bennu. Color ratio maps provide insight into the composition and geological history of the surface and allow for comparison to other Solar System small bodies. In advance of OSIRIS-REx's arrival at Bennu, we use simulated images to judge the efficacy of both the photometric modeling software and the mission observation plan. Our simulation software is based on the USGS Integrated Software for Imagers and Spectrometers (ISIS) and uses a synthetic shape model, a camera model, and an empirical photometric model to generate simulated images. This approach gives us the flexibility to create simulated images of Bennu based on analog surfaces from other small Solar System bodies and to test our modeling software under those conditions. Our photometric modeling software fits image data to several conventional empirical photometric models and produces the best-fit model parameters. The process is largely automated, which is crucial to the efficient production of data products during proximity operations.
The software also produces several metrics on the quality of the observations themselves, such as surface coverage and the completeness of the data set for evaluating the phase and disk functions of the surface. Application of this software to simulated mission data has revealed limitations in the initial mission design, which has fed back into the planning process. The entire photometric pipeline further serves as an exercise of planned activities for proximity operations.
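The photometric correction these data products rely on can be illustrated with the Lommel-Seeliger disk function: normalizing each measurement to a common reference geometry removes illumination-driven seams. An illustrative model choice, not the OSIRIS-REx pipeline:

```python
# Photometric correction sketch using the Lommel-Seeliger disk
# function. Illustrative model choice; not the OSIRIS-REx pipeline.
import math

def lommel_seeliger(inc_deg, emi_deg):
    """Disk function mu0 / (mu0 + mu) for incidence/emission angles."""
    mu0 = math.cos(math.radians(inc_deg))
    mu = math.cos(math.radians(emi_deg))
    return mu0 / (mu0 + mu)

def correct(i_of_f, inc_deg, emi_deg, ref_inc=30.0, ref_emi=0.0):
    """Rescale a measured radiance factor to a reference geometry."""
    return i_of_f * lommel_seeliger(ref_inc, ref_emi) / \
        lommel_seeliger(inc_deg, emi_deg)

# Two observations of an identical surface under different geometry
# should agree after correction:
truth = 0.045                                  # intrinsic reflectance factor
obs_a = truth * lommel_seeliger(20.0, 10.0)
obs_b = truth * lommel_seeliger(60.0, 35.0)
corr_a = correct(obs_a, 20.0, 10.0)
corr_b = correct(obs_b, 60.0, 35.0)
```

Applied pixel-by-pixel across overlapping images, this is what makes mosaic seams shrink; the mission software fits the model parameters rather than assuming them.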
Benefits Mapping and Analysis Program (BenMAP)
This area summarizes the key features of the BenMAP-CE program and links to pages that provide more details regarding the program, the basic principles of air pollution benefits analysis and a link to download the software.
Comparison of Survival Models for Analyzing Prognostic Factors in Gastric Cancer Patients
Habibi, Danial; Rafiei, Mohammad; Chehrei, Ali; Shayan, Zahra; Tafaqodi, Soheil
2018-03-27
Objective: There are a number of models for determining risk factors for survival of patients with gastric cancer. This study was conducted to select the model showing the best fit with the available data. Methods: Cox regression and parametric models (Exponential, Weibull, Gompertz, Log normal, Log logistic and Generalized Gamma) were utilized in unadjusted and adjusted forms to detect factors influencing mortality of patients. Comparisons were made with the Akaike Information Criterion (AIC) using the STATA 13 and R 3.1.3 software packages. Results: The results of this study indicated that all parametric models outperform the Cox regression model. The Log normal, Log logistic and Generalized Gamma models provided the best performance in terms of AIC values (179.2, 179.4 and 181.1, respectively). On unadjusted analysis, the results of the Cox regression and parametric models indicated stage, grade, largest diameter of metastatic nest, largest diameter of LM, number of involved lymph nodes and the largest ratio of metastatic nests to lymph nodes to be variables influencing the survival of patients with gastric cancer. On adjusted analysis, according to the best model (log normal), grade was found to be the significant variable. Conclusion: The results suggested that all parametric models outperform the Cox model. The log normal model provides the best fit and is a good substitute for Cox regression.
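The model ranking reduces to computing AIC = 2k - 2 ln L for each fitted model and taking the minimum. A sketch with log-likelihoods back-solved purely for illustration (the study reports only the AIC values, 179.2-181.1, not the likelihoods or parameter counts):

```python
# AIC model comparison in miniature: lower AIC wins. Log-likelihoods
# and parameter counts below are illustrative assumptions, back-solved
# so the log-normal AIC matches the reported 179.2; they are not the
# study's fitted values.

def aic(log_likelihood, n_params):
    return 2 * n_params - 2 * log_likelihood

models = {
    "cox":        aic(log_likelihood=-92.0, n_params=6),
    "log_normal": aic(log_likelihood=-82.6, n_params=7),
    "weibull":    aic(log_likelihood=-85.1, n_params=7),
}
best = min(models, key=models.get)
```

The 2k term penalizes extra parameters, so a parametric model only beats Cox here when its likelihood gain outweighs its added complexity.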
Nieuwenhuys, Angela; Papageorgiou, Eirini; Desloovere, Kaat; Molenaers, Guy; De Laet, Tinne
2017-01-01
Experts recently identified 49 joint motion patterns in children with cerebral palsy during a Delphi consensus study. Pattern definitions were therefore the result of subjective expert opinion. The present study aims to provide objective, quantitative data supporting the identification of these consensus-based patterns. To do so, statistical parametric mapping was used to compare the mean kinematic waveforms of 154 trials of typically developing children (n = 56) to the mean kinematic waveforms of 1719 trials of children with cerebral palsy (n = 356), which were classified following the classification rules of the Delphi study. Three hypotheses stated that: (a) joint motion patterns with 'no or minor gait deviations' (n = 11 patterns) do not differ significantly from the gait pattern of typically developing children; (b) all other pathological joint motion patterns (n = 38 patterns) differ from typically developing gait and the locations of difference within the gait cycle, highlighted by statistical parametric mapping, concur with the consensus-based classification rules. (c) all joint motion patterns at the level of each joint (n = 49 patterns) differ from each other during at least one phase of the gait cycle. Results showed that: (a) ten patterns with 'no or minor gait deviations' differed somewhat unexpectedly from typically developing gait, but these differences were generally small (≤3°); (b) all other joint motion patterns (n = 38) differed from typically developing gait and the significant locations within the gait cycle that were indicated by the statistical analyses, coincided well with the classification rules; (c) joint motion patterns at the level of each joint significantly differed from each other, apart from two sagittal plane pelvic patterns. In addition to these results, for several joints, statistical analyses indicated other significant areas during the gait cycle that were not included in the pattern definitions of the consensus study. 
Based on these findings, suggestions to improve pattern definitions were made.
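Statistical parametric mapping of one-dimensional waveforms amounts, in its simplest form, to a pointwise two-sample t-statistic across the gait cycle with a threshold. A conceptual sketch on synthetic waveforms (not the software used in the study, which also corrects for multiple comparisons across the cycle):

```python
# Pointwise two-sample t-statistic across a 1-D waveform, the core of
# statistical parametric mapping for gait curves. Conceptual sketch;
# the study's analysis additionally handles field-wide inference.
import math

def t_statistic(sample_a, sample_b):
    """Pooled-variance two-sample t-statistic."""
    na, nb = len(sample_a), len(sample_b)
    ma, mb = sum(sample_a) / na, sum(sample_b) / nb
    va = sum((x - ma) ** 2 for x in sample_a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in sample_b) / (nb - 1)
    sp = math.sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))
    return (ma - mb) / (sp * math.sqrt(1 / na + 1 / nb))

def spm_map(group_a, group_b, threshold):
    """Return indices of the cycle where |t| exceeds the threshold."""
    n_points = len(group_a[0])
    ts = [t_statistic([w[i] for w in group_a], [w[i] for w in group_b])
          for i in range(n_points)]
    return [i for i, t in enumerate(ts) if abs(t) > threshold]

# Synthetic groups that differ only in the second half of the cycle:
group_a = [[0.01 * j] * 10 + [5.0 + 0.01 * j] * 10 for j in range(8)]
group_b = [[0.01 * j] * 20 for j in range(8)]
regions = spm_map(group_a, group_b, threshold=3.0)
```

The supra-threshold indices are the "locations of difference within the gait cycle" that the abstract compares against the consensus classification rules.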
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ren, S; Tianjin University, Tianjin; Hara, W
Purpose: MRI has a number of advantages over CT as a primary modality for radiation treatment planning (RTP). However, one key bottleneck problem remains: the lack of electron density information in MRI. In this work, a reliable method to map electron density is developed by leveraging the differential contrast of multi-parametric MRI. Methods: We propose a probabilistic Bayesian approach for electron density mapping based on T1- and T2-weighted MRI, using multiple patients as atlases. For each voxel, we compute two conditional probabilities: (1) electron density given its image intensity on T1- and T2-weighted MR images, and (2) electron density given its geometric location in a reference anatomy. The two sources of information (image intensity and spatial location) are combined into a unifying posterior probability density function using the Bayesian formalism. The mean value of the posterior probability density function provides the estimated electron density. Results: We evaluated the method on 10 head and neck patients and performed leave-one-out cross validation (9 patients as atlases and the remaining 1 as test). The proposed method significantly reduced the errors in electron density estimation, with a mean absolute HU error of 138, compared with 193 for the T1-weighted intensity approach and 261 without density correction. For bone detection (HU>200), the proposed method had an accuracy of 84% and a sensitivity of 73% at a specificity of 90% (AUC = 87%). In comparison, the AUC for bone detection is 73% and 50% using the intensity approach and without density correction, respectively. Conclusion: The proposed unifying method provides accurate electron density estimation and bone detection based on multi-parametric MRI of the head with highly heterogeneous anatomy. This could allow for accurate dose calculation and reference image generation for patient setup in MRI-based radiation treatment planning.
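The Bayesian fusion described above can be shown for a single voxel: multiply an intensity-based likelihood by a location-based prior over a discretized HU axis and take the posterior mean. The Gaussian forms and all numbers below are illustrative assumptions, not the authors' atlas-derived distributions:

```python
# One-voxel sketch of Bayesian electron-density fusion: combine an
# intensity likelihood with a spatial-location prior over a discretized
# HU axis and take the posterior mean. Gaussian shapes and all numbers
# are illustrative assumptions, not the paper's atlas distributions.
import math

def gaussian(x, mean, sigma):
    """Unnormalized Gaussian weight (normalization cancels below)."""
    return math.exp(-0.5 * ((x - mean) / sigma) ** 2)

def posterior_mean_hu(hu_grid, intensity_mean, intensity_sigma,
                      prior_mean, prior_sigma):
    weights = [gaussian(h, intensity_mean, intensity_sigma) *
               gaussian(h, prior_mean, prior_sigma) for h in hu_grid]
    total = sum(weights)
    return sum(h * w for h, w in zip(hu_grid, weights)) / total

hu_grid = list(range(-200, 1201, 10))
# Intensity suggests soft tissue (~40 HU) with high uncertainty, while
# the location prior suggests bone (~700 HU):
estimate = posterior_mean_hu(hu_grid, 40.0, 300.0, 700.0, 200.0)
```

The posterior mean lands between the two sources, pulled toward whichever is more confident (smaller sigma), which is how the method resolves the intensity ambiguity between bone and air in MRI.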
NASA Technical Reports Server (NTRS)
Hen, Itay; Rieffel, Eleanor G.; Do, Minh; Venturelli, Davide
2014-01-01
There are two common ways to evaluate algorithms: performance on benchmark problems derived from real applications and analysis of performance on parametrized families of problems. The two approaches complement each other, each having its advantages and disadvantages. The planning community has concentrated on the first approach, with few ways of generating parametrized families of hard problems known prior to this work. Our group's main interest is in comparing approaches to solving planning problems using a novel type of computational device - a quantum annealer - to existing state-of-the-art planning algorithms. Because only small-scale quantum annealers are available, we must compare on small problem sizes. Small problems are primarily useful for comparison only if they are instances of parametrized families of problems for which scaling analysis can be done. In this technical report, we discuss our approach to the generation of hard planning problems from classes of well-studied NP-complete problems that map naturally to planning problems or to aspects of planning problems that many practical planning problems share. These problem classes exhibit a phase transition between easy-to-solve and easy-to-show-unsolvable planning problems. The parametrized families of hard planning problems lie at the phase transition. The exponential scaling of hardness with problem size is apparent in these families even at very small problem sizes, thus enabling us to characterize even very small problems as hard. The families we developed will prove generally useful to the planning community in analyzing the performance of planning algorithms, providing a complementary approach to existing evaluation methods. We illustrate the hardness of these problems and their scaling with results on four state-of-the-art planners, observing significant differences between these planners on these problem families. 
Finally, we describe two general, and quite different, mappings of planning problems to QUBOs, the form of input required for a quantum annealing machine such as the D-Wave II.
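A QUBO (quadratic unconstrained binary optimization) instance, the input form named above, asks for a binary vector x minimizing x^T Q x. A toy encoding of maximum independent set on a triangle graph, brute-forced for checking (a sketch of the QUBO form only, not the report's actual mappings of planning problems):

```python
# Toy QUBO: maximum independent set on a triangle graph, encoded as
# minimize x^T Q x with a penalty on selecting adjacent vertices.
# Sketch of the input form for a quantum annealer; not the report's
# planning-problem mappings.
from itertools import product

def qubo_energy(Q, x):
    n = len(x)
    return sum(Q[i][j] * x[i] * x[j] for i in range(n) for j in range(n))

def brute_force_minimum(Q):
    n = len(Q)
    return min((qubo_energy(Q, x), x) for x in product((0, 1), repeat=n))

penalty = 2.0
edges = [(0, 1), (1, 2), (0, 2)]                   # triangle graph
Q = [[-1.0 if i == j else 0.0 for j in range(3)] for i in range(3)]
for i, j in edges:
    Q[i][j] += penalty                              # penalize adjacent picks

energy, assignment = brute_force_minimum(Q)
```

The diagonal rewards selecting vertices and the penalty terms forbid selecting both ends of an edge, so the minimum-energy assignment picks exactly one vertex of the triangle; an annealer searches the same energy landscape physically.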
NASA Astrophysics Data System (ADS)
Müller, Peter; Krause, Marita; Beck, Rainer; Schmidt, Philip
2017-10-01
Context. The venerable NOD2 data reduction software package for single-dish radio continuum observations, which was developed for use at the 100-m Effelsberg radio telescope, has been successfully applied over many decades. Modern computing facilities, however, call for a new design. Aims: We aim to develop an interactive software tool with a graphical user interface for the reduction of single-dish radio continuum maps. We make a special effort to reduce the distortions along the scanning direction (scanning effects) by combining maps scanned in orthogonal directions or dual- or multiple-horn observations that need to be processed in a restoration procedure. The package should also process polarisation data and offer the possibility to include special tasks written by the individual user. Methods: Based on the ideas of the NOD2 package we developed NOD3, which includes all necessary tasks from the raw maps to the final maps in total intensity and linear polarisation. Furthermore, plot routines and several methods for map analysis are available. The NOD3 package is written in Python, which allows the extension of the package via additional tasks. The required data format for the input maps is FITS. Results: The NOD3 package is a sophisticated tool to process and analyse maps from single-dish observations that are affected by scanning effects from clouds, receiver instabilities, or radio-frequency interference. The "basket-weaving" tool combines orthogonally scanned maps into a final map that is almost free of scanning effects. The new restoration tool for dual-beam observations reduces the noise by a factor of about two compared to the NOD2 version. Combining single-dish with interferometer data in the map plane ensures the full recovery of the total flux density. Conclusions: This software package is available under the open source license GPL for free use at other single-dish radio telescopes of the astronomical community. 
The NOD3 package is designed to be extendable to multi-channel data represented by data cubes in Stokes I, Q, and U.
Open source software integrated into data services of Japanese planetary explorations
NASA Astrophysics Data System (ADS)
Yamamoto, Y.; Ishihara, Y.; Otake, H.; Imai, K.; Masuda, K.
2015-12-01
Scientific data obtained by Japanese scientific satellites and lunar and planetary explorations are archived in DARTS (Data ARchives and Transmission System). DARTS provides the data through simple methods such as HTTP directory listing for long-term preservation, while also offering rich web applications for ease of access, built with modern web technologies on top of open source software. This presentation showcases the availability of open source software through our services. KADIAS is a web-based application to search, analyze, and obtain scientific data measured by SELENE (Kaguya), a Japanese lunar orbiter. KADIAS uses OpenLayers to display maps distributed from a Web Map Service (WMS); the open source MapServer is adopted as the WMS server. KAGUYA 3D GIS (KAGUYA 3D Moon NAVI) provides a virtual globe for SELENE's data, mainly for public outreach; it was developed with the NASA World Wind Java SDK. C3 (Cross-Cutting Comparisons) is a tool to compare data from various observations and simulations. It uses Highcharts to draw graphs in web browsers. FLOW is a tool to simulate the field of view of an instrument onboard a spacecraft. FLOW itself is open source software developed by JAXA/ISAS under the BSD 3-Clause License, and the SPICE Toolkit is essential to compile it. The SPICE Toolkit is also open source software, developed by NASA/JPL, whose website distributes data for many spacecraft. Nowadays, open source software is an indispensable tool for integrating DARTS services.
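Several of the services above sit on the OGC Web Map Service protocol. As a rough illustration (not code from DARTS; the endpoint and layer name below are hypothetical), a client such as KADIAS ultimately issues GetMap requests of this form, assembled here per the WMS 1.1.1 specification:

```python
from urllib.parse import urlencode

def wms_getmap_url(base_url, layers, bbox, width, height,
                   srs="EPSG:4326", fmt="image/png", version="1.1.1"):
    """Build a WMS 1.1.1 GetMap request URL (parameter names per the OGC WMS spec)."""
    params = {
        "SERVICE": "WMS",
        "VERSION": version,
        "REQUEST": "GetMap",
        "LAYERS": ",".join(layers),
        "STYLES": "",
        "SRS": srs,  # this key is CRS in WMS >= 1.3.0
        "BBOX": ",".join(str(v) for v in bbox),
        "WIDTH": str(width),
        "HEIGHT": str(height),
        "FORMAT": fmt,
    }
    return base_url + "?" + urlencode(params)

# Hypothetical endpoint and layer; the real DARTS/KADIAS URLs are not given here.
url = wms_getmap_url("https://example.org/cgi-bin/mapserv",
                     ["moon_dem"], (-180, -90, 180, 90), 512, 256)
```

A MapServer instance configured with a matching mapfile would answer such a request with a rendered PNG tile, which OpenLayers then composites into the interactive map.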
Boulos, Maged N Kamel; Honda, Kiyoshi
2006-01-01
Open Source Web GIS software systems have reached a stage of maturity, sophistication, robustness and stability, and usability and user-friendliness rivalling those of commercial, proprietary GIS and Web GIS server products. The Open Source Web GIS community is also actively embracing OGC (Open Geospatial Consortium) standards, including WMS (Web Map Service). WMS enables the creation of Web maps whose layers come from multiple different remote servers/sources. In this article we present one easy-to-implement Web GIS server solution that is based on the Open Source University of Minnesota (UMN) MapServer. By following the accompanying step-by-step tutorial instructions, interested readers running mainstream Microsoft® Windows machines and with no prior technical experience in Web GIS or Internet map servers will be able to publish their own health maps on the Web and add to those maps additional layers retrieved from remote WMS servers. The 'digital Asia' and 2004 Indian Ocean tsunami experiences in using free Open Source Web GIS software are also briefly described. PMID:16420699
Operational excellence (six sigma) philosophy: Application to software quality assurance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lackner, M.
1997-11-01
This report contains viewgraphs on the operational excellence philosophy of six sigma applied to software quality assurance. The report outlines the following: the goal of six sigma; six sigma tools; manufacturing vs. administrative processes; software quality assurance document inspections; mapping the software quality assurance requirements document; failure mode effects analysis for the requirements document; measuring the right response variables; and questions.
Profiling a Mind Map User: A Descriptive Appraisal
ERIC Educational Resources Information Center
Tucker, Joanne M.; Armstrong, Gary R.; Massad, Victor J.
2010-01-01
Whether manually or through the use of software, a non-linear information organization framework known as mind mapping offers an alternative method for capturing thoughts, ideas and information to linear thinking modes such as outlining. Mind mapping is brainstorming, organizing, and problem solving. This paper examines mind mapping techniques,…
KinSNP software for homozygosity mapping of disease genes using SNP microarrays.
Amir, El-Ad David; Bartal, Ofer; Morad, Efrat; Nagar, Tal; Sheynin, Jony; Parvari, Ruti; Chalifa-Caspi, Vered
2010-08-01
Consanguineous families affected by a recessive genetic disease caused by homozygotisation of a mutation offer a unique advantage for positional cloning of rare diseases. Homozygosity mapping of patient genotypes is a powerful technique for identifying the genomic locus harbouring the causative mutation. This strategy relies on the observation that in these patients a large region spanning the disease locus is also homozygous with high probability. The high marker density of single nucleotide polymorphism (SNP) arrays is extremely advantageous for homozygosity mapping. We present KinSNP, a user-friendly software tool for homozygosity mapping using SNP arrays. The software searches for stretches of SNPs which are homozygous for the same allele in all ascertained sick individuals. User-specified parameters control the number of allowed genotyping 'errors' within homozygous blocks. Candidate disease regions are then reported in a detailed, coloured Excel file, along with genotypes of family members and healthy controls. An interactive genome browser has been included which shows homozygous blocks, individual genotypes, genes and further annotations along the chromosomes, with zooming and scrolling capabilities. The software has been used to identify the location of a mutated gene causing insensitivity to pain in a large Bedouin family. KinSNP is freely available from.
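The core search described above can be sketched in a few lines. This is an illustrative reimplementation, not KinSNP's actual code; the genotype encoding (0 and 2 homozygous, 1 heterozygous) and the per-block error allowance `max_errors` are assumptions of the sketch:

```python
def homozygous_blocks(genotypes, max_errors=0):
    """Find candidate disease regions: maximal runs of SNP indices where all
    affected individuals share the same homozygous genotype, tolerating up to
    `max_errors` non-conforming SNPs inside a run.
    genotypes: one list per SNP, holding the genotype of each affected
    individual; encoding assumed here: 0/2 homozygous, 1 heterozygous."""
    blocks, start, last_ok, errors = [], None, None, 0
    for i, snp in enumerate(genotypes):
        # the SNP conforms if every patient carries the same homozygous call
        ok = snp[0] in (0, 2) and all(g == snp[0] for g in snp)
        if ok:
            if start is None:
                start, errors = i, 0
            last_ok = i
        elif start is not None:
            errors += 1
            if errors > max_errors:
                blocks.append((start, last_ok))  # close block at last good SNP
                start = None
    if start is not None:
        blocks.append((start, last_ok))
    return blocks
```

With `max_errors=0` every heterozygous or discordant SNP breaks a block; raising it lets blocks bridge isolated genotyping errors, mirroring the user-specified parameter mentioned in the abstract.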
NASA Astrophysics Data System (ADS)
Duffy, Alan; Yates, Brian; Takacs, Peter
2012-09-01
The Optical Metrology Facility at the Canadian Light Source (CLS) has recently purchased MountainsMap surface analysis software from Digital Surf, and we report here our experiences with this package and its usefulness as a tool for examining metrology data of synchrotron x-ray mirrors. The package has a number of operators that are useful for determining surface roughness and slope error, including compliance with ISO standards (viz. ISO 4287 and ISO 25178). The software is extensible with MATLAB scripts, either by loading an m-file or by a user-written script. This makes it possible to apply a custom operator to measurement data sets. Using this feature we have applied the simple six-line MATLAB code for the direct least-squares fitting of ellipses developed by Fitzgibbon et al. to investigate the residual slope error of elliptical mirrors upon the removal of the best-fit ellipse. The software includes support for many instruments (e.g., Zygo and MicroMap) and can import ASCII data (e.g., LTP data). The stitching module allows the user to assemble overlapping images, and we report on our experiences with this feature applied to MicroMap surface roughness data. The power spectral density function was determined for the stitched and unstitched data and compared.
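The Fitzgibbon direct least-squares ellipse fit referenced above translates readily from the six-line MATLAB original to NumPy. The sketch below is our own port (not the MountainsMap custom script): it solves the generalized eigenproblem and selects the eigenvector satisfying the ellipse constraint aᵀCa > 0:

```python
import numpy as np

def fit_ellipse(x, y):
    """Direct least-squares ellipse fit (after Fitzgibbon et al.): minimise the
    algebraic error of A x^2 + B xy + C y^2 + D x + E y + F subject to the
    ellipse constraint 4AC - B^2 = 1, via a generalized eigenproblem."""
    D = np.column_stack([x * x, x * y, y * y, x, y, np.ones_like(x)])
    S = D.T @ D
    C = np.zeros((6, 6))
    C[0, 2] = C[2, 0] = 2.0
    C[1, 1] = -1.0
    # S a = lambda C a  is equivalent to  inv(S) C a = (1/lambda) a
    _, v = np.linalg.eig(np.linalg.solve(S, C))
    # the elliptical solution is the eigenvector with a^T C a > 0
    scores = [np.real(v[:, k]) @ C @ np.real(v[:, k]) for k in range(6)]
    return np.real(v[:, int(np.argmax(scores))])

# synthetic ellipse centred at (1, 2) with semi-axes 3 and 1, plus small noise
rng = np.random.default_rng(0)
t = np.linspace(0, 2 * np.pi, 200)
x = 1 + 3 * np.cos(t) + 0.01 * rng.standard_normal(t.size)
y = 2 + np.sin(t) + 0.01 * rng.standard_normal(t.size)
A, B, C_, D_, E_, F_ = fit_ellipse(x, y)
# conic centre from the fitted coefficients
xc = (2 * C_ * D_ - B * E_) / (B * B - 4 * A * C_)
yc = (2 * A * E_ - B * D_) / (B * B - 4 * A * C_)
```

Removing the best-fit ellipse from LTP slope data then amounts to evaluating this conic along the mirror length and subtracting it, as done in the residual-slope-error analysis described above.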
NASA Technical Reports Server (NTRS)
Laubenthal, N. A.; Bertsch, D.; Lal, N.; Etienne, A.; Mcdonald, L.; Mattox, J.; Sreekumar, P.; Nolan, P.; Fierro, J.
1992-01-01
The Energetic Gamma Ray Telescope Experiment (EGRET) on the Compton Gamma Ray Observatory has been in orbit for more than a year and is being used to map the full sky for gamma rays in a wide energy range from 30 to 20,000 MeV. Already these measurements have resulted in a wide range of exciting new information on quasars, pulsars, galactic sources, and diffuse gamma ray emission. The central part of the analysis is done with sky maps that typically cover an 80 x 80 degree section of the sky for an exposure time of several days. Specific software developed for this program generates the counts, exposure, and intensity maps. The analysis is done on a network of UNIX-based workstations and takes full advantage of a custom-built user interface called X-dialog. The maps that are generated are stored in the FITS format for a collection of energies. These, along with similar diffuse-emission background maps generated from a model calculation, serve as input to a maximum likelihood program that produces maps of likelihood, with optional contours that are used to evaluate regions for sources. The likelihood program also evaluates the background-corrected intensity at each location for each energy interval, from which spectra can be generated. Being in a standard FITS format permits all of the maps to be easily accessed by the full complement of tools available in several commercial astronomical analysis systems. In the EGRET case, IDL is used to produce graphics plots in two and three dimensions and to quickly implement any special evaluation that might be desired. Other custom-built software, such as the spectral and pulsar analyses, takes advantage of the XView toolkit for display and Postscript output for color hard copy. This poster paper outlines the data flow and provides examples of the user interfaces and output products.
It stresses the advantages that are derived from the integration of the specific instrument-unique software and powerful commercial tools for graphics and statistical evaluation. This approach has several proven advantages including flexibility, a minimum of development effort, ease of use, and portability.
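A toy version of the likelihood step conveys the idea: at each map position, compare the Poisson likelihood of the observed counts under the diffuse background model alone against background plus a point source. This sketch is not the EGRET likelihood code; it simplifies the source to a single pixel:

```python
import numpy as np

def ts_map(counts, background):
    """Poisson likelihood-ratio test statistic per pixel for an extra
    non-negative source amplitude s confined to that pixel (toy model).
    With lnL(s) = n*ln(b+s) - (b+s) + const, the ML amplitude is
    s = max(n - b, 0), giving TS = 2*[n*ln(n/b) - (n - b)] when n > b,
    and TS = 0 otherwise."""
    n = np.asarray(counts, dtype=float)
    b = np.asarray(background, dtype=float)
    with np.errstate(divide="ignore", invalid="ignore"):
        ts = 2.0 * (n * np.log(np.where(n > 0, n / b, 1.0)) - (n - b))
    return np.where(n > b, ts, 0.0)
```

High-TS pixels then mark candidate sources, exactly the role of the likelihood contours described in the abstract; the real analysis additionally convolves the source with the instrument point-spread function and fits the amplitude over a neighbourhood of pixels.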
ERIC Educational Resources Information Center
Carrington, Michal; Chen, Richard; Davies, Martin; Kaur, Jagjit; Neville, Benjamin
2011-01-01
An argument map visually represents the structure of an argument, outlining its informal logical connections and informing judgments as to its worthiness. Argument mapping can be augmented with dedicated software that aids the mapping process. Empirical evidence suggests that semester-length subjects using argument mapping along with dedicated…
Niphadkar, Madhura; Nagendra, Harini; Tarantino, Cristina; Adamo, Maria; Blonda, Palma
2017-01-01
The establishment of invasive alien species in varied habitats across the world is now recognized as a genuine threat to the preservation of biodiversity. Specifically, plant invasions in the understorey of tropical forests are detrimental to the persistence of healthy ecosystems. Monitoring such invasions using Very High Resolution (VHR) satellite remote sensing has been shown to be valuable in designing management interventions for the conservation of native habitats. Object-based classification methods are very helpful in identifying invasive plants in various habitats because they mimic the pattern-recognition ability of the human brain. However, these methods have not been tested adequately in dense tropical mixed forests where invasion occurs in the understorey. This study compares a pixel-based and an object-based classification method for mapping the understorey invasive shrub Lantana camara (Lantana) in a tropical mixed forest habitat in the Western Ghats biodiversity hotspot in India. Overall, a hierarchical approach of mapping the top canopy first and then processing further for the understorey shrub, using measures such as texture and vegetation indices, proved effective in separating Lantana from other cover types. In the first method, we implement a simple parametric supervised classification for mapping cover types, and then process within these types for Lantana delineation. In the second method, we use an object-based segmentation algorithm to map cover types, and then perform further processing to separate Lantana. The improved ability of the object-based approach to delineate structurally distinct objects, with characteristic spectral and spatial properties of their own as well as with reference to their surroundings, allows for much more flexibility in identifying invasive understorey shrubs among the complex vegetation of the tropical forest than the parametric classifier provides.
Conservation practices in tropical mixed forests can benefit greatly from adopting methods that use high-resolution remotely sensed data and advanced techniques to monitor the patterns and effective functioning of native ecosystems by periodically mapping disturbances such as invasion. PMID:28620400
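The "simple parametric supervised classification" of the first method is typically a Gaussian maximum-likelihood classifier over per-pixel features (bands, texture, vegetation indices). A minimal sketch on synthetic data; the class names and feature layout below are hypothetical:

```python
import numpy as np

def train_gaussian_ml(samples):
    """samples: {class_name: (n, d) array of training pixels}.
    Returns per-class (mean, inverse covariance, log-determinant)."""
    model = {}
    for name, X in samples.items():
        mu = X.mean(axis=0)
        cov = np.cov(X, rowvar=False)
        model[name] = (mu, np.linalg.inv(cov), np.linalg.slogdet(cov)[1])
    return model

def classify(model, pixels):
    """Assign each pixel (row of features) to the class maximising the
    Gaussian log-likelihood -0.5*(logdet + (x-mu)^T Sigma^-1 (x-mu))."""
    names = list(model)
    scores = []
    for name in names:
        mu, icov, logdet = model[name]
        d = pixels - mu
        scores.append(-0.5 * (logdet + np.einsum("ij,jk,ik->i", d, icov, d)))
    return [names[k] for k in np.argmax(scores, axis=0)]

# demo on synthetic two-class data (class names hypothetical)
rng = np.random.default_rng(2)
samples = {"canopy": rng.standard_normal((100, 2)),
           "lantana": rng.standard_normal((100, 2)) + 5.0}
model = train_gaussian_ml(samples)
labels = classify(model, np.array([[0.0, 0.0], [5.0, 5.0]]))
```

The hierarchical scheme in the study would apply such a classifier (or the object-based alternative) to map cover types first, then restrict the Lantana delineation to the relevant types.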
Ertürk, M Arcan; Sathyanarayana Hegde, Shashank; Bottomley, Paul A
2016-12-01
Purpose: To develop and demonstrate in vitro and in vivo a single interventional magnetic resonance (MR)-active device that integrates the functions of precise identification of a tissue site with the delivery of radiofrequency (RF) energy for ablation, high-spatial-resolution thermal mapping to monitor thermal dose, and quantitative MR imaging relaxometry to document ablation-induced tissue changes for characterizing ablated tissue. Materials and Methods: All animal studies were approved by the institutional animal care and use committee. A loopless MR imaging antenna composed of a tuned microcable either 0.8 or 2.2 mm in diameter with an extended central conductor was switched between a 3-T MR imaging unit and an RF power source to monitor and perform RF ablation in bovine muscle and human artery samples in vitro and in rabbits in vivo. High-spatial-resolution (250-300-μm) proton resonance frequency shift MR thermometry was interleaved with ablations. Quantitative spin-lattice (T1) and spin-spin (T2) relaxation time MR imaging mapping was performed before and after ablation. These maps were compared with findings from gross tissue examination of the region of ablated tissue after MR imaging. Results: High-spatial-resolution MR imaging afforded temperature mapping in less than 8 seconds for monitoring ablation temperatures in excess of 85°C delivered by the same device. This produced irreversible thermal injury and necrosis. Quantitative MR imaging relaxation time maps demonstrated up to a twofold variation in mean regional T1 and T2 after ablation versus before ablation. Conclusion: A simple, integrated, minimally invasive interventional probe that provides image-guided therapy delivery, thermal mapping of dose, and detection of ablation-associated MR imaging parametric changes was developed and demonstrated. With this single-device approach, coupling-related safety concerns associated with multiple conductor approaches were avoided.
© RSNA, 2016. Online supplemental material is available for this article.
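Proton resonance frequency (PRF) shift thermometry, as used in this study, maps the phase difference between two gradient-echo images to a temperature change via Δφ = 2π·γ·α·B0·TE·ΔT, with the thermal coefficient α ≈ -0.01 ppm/°C. A sketch using standard literature constants, not the authors' reconstruction code:

```python
import math

GAMMA_HZ_PER_T = 42.577e6   # proton gyromagnetic ratio / (2*pi), in Hz/T
ALPHA_PPM_PER_C = -0.01     # PRF thermal coefficient, ppm per deg C (literature value)

def prf_delta_temperature(dphi_rad, b0_tesla=3.0, te_s=0.010):
    """Temperature change from the inter-image phase difference of a
    gradient-echo pair: dphi = 2*pi*gamma*alpha*B0*TE*dT (PRF shift method)."""
    alpha = ALPHA_PPM_PER_C * 1e-6
    return dphi_rad / (2 * math.pi * GAMMA_HZ_PER_T * alpha * b0_tesla * te_s)
```

Applied voxel-by-voxel to the interleaved phase images, this yields the temperature maps used to monitor the ablation dose; the 3-T field strength and 10-ms echo time here are illustrative defaults.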
User's Guide for MapIMG 2: Map Image Re-projection Software Package
Finn, Michael P.; Trent, Jason R.; Buehler, Robert A.
2006-01-01
BACKGROUND: Scientists routinely accomplish small-scale geospatial modeling in the raster domain, using high-resolution datasets for large parts of continents and low-resolution to high-resolution datasets for the entire globe. Direct implementation of point-to-point transformation with appropriate functions yields the variety of projections available in commercial software packages, but implementation with data other than points requires specific adaptation of the transformation equations or prior preparation of the data to allow the transformation to succeed. It seems that some of these packages use the U.S. Geological Survey's (USGS) General Cartographic Transformation Package (GCTP) or similar point transformations without adaptation to the specific characteristics of raster data (Usery and others, 2003a). Usery and others (2003b) compiled and tabulated the accuracy of categorical areas in projected raster datasets of global extent. Based on the shortcomings identified in these studies, geographers and applications programmers at the USGS expanded and evolved a USGS software package, MapIMG, for raster map projection transformation (Finn and Trent, 2004). Daniel R. Steinwand of Science Applications International Corporation, National Center for Earth Resources Observation and Science, originally developed MapIMG for the USGS, basing it on GCTP. Through previous and continuing efforts at the USGS' National Geospatial Technical Operations Center, this program has been transformed from an application based on command line input into a software package based on a graphical user interface for Windows, Linux, and other UNIX machines.
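The raster-specific adaptation that point transformations lack is commonly done by inverse mapping: iterate over output cells, inverse-project each cell centre, and sample the input raster, so no output cell is left unfilled. A toy nearest-neighbour sketch (not MapIMG/GCTP code; the geotransform convention here is an assumption of the sketch):

```python
def reproject_raster(src, src_geot, dst_shape, dst_geot, inverse_project, nodata=None):
    """Nearest-neighbour raster reprojection by inverse mapping.
    src: 2-D list of values; geotransforms (x0, dx, y0, dy) map col/row to
    projected x/y; inverse_project maps a dst (x, y) to src-projection (x, y)."""
    rows, cols = dst_shape
    out = []
    for r in range(rows):
        line = []
        for c in range(cols):
            # centre of the destination cell in destination coordinates
            x = dst_geot[0] + (c + 0.5) * dst_geot[1]
            y = dst_geot[2] + (r + 0.5) * dst_geot[3]
            sx, sy = inverse_project(x, y)
            # nearest source cell; out-of-range cells get the nodata fill
            sc = int((sx - src_geot[0]) / src_geot[1])
            sr = int((sy - src_geot[2]) / src_geot[3])
            if 0 <= sr < len(src) and 0 <= sc < len(src[0]):
                line.append(src[sr][sc])
            else:
                line.append(nodata)
        out.append(line)
    return out
```

Because every destination cell is visited, categorical rasters keep contiguous areas without the gaps and overlaps that forward point-by-point projection produces; categorical resampling methods beyond nearest neighbour are one of MapIMG's refinements.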
NASA Astrophysics Data System (ADS)
Enescu (Balaş), M. L.; Alexandru, C.
2016-08-01
The paper deals with the optimal design of the control system for a 6-DOF robot used in thin-layer deposition. The optimization is based on a parametric technique: the design objective is modelled as a numerical function, and the optimal values of the design variables are then established so as to minimize the objective function. The robotic system is a mechatronic product that integrates the mechanical device and the controlled operating device. The mechanical device of the robot was designed in the CAD (Computer Aided Design) software CATIA, the 3D model then being transferred to the MBS (Multi-Body Systems) environment ADAMS/View. The control system was developed in the concurrent engineering concept, through integration with the MBS mechanical model, using the DFC (Design for Control) software solution EASY5. The necessary angular motions in the six joints of the robot, needed to obtain the imposed trajectory of the end-effector, were established by performing an inverse kinematic analysis. The positioning error in each joint of the robot is used as the design objective, the optimization goal being to minimize its root mean square during simulation, which is a measure of the magnitude of the varying positioning error.
Software design for analysis of multichannel intracardial and body surface electrocardiograms.
Potse, Mark; Linnenbank, André C; Grimbergen, Cornelis A
2002-11-01
Analysis of multichannel ECG recordings (body surface maps (BSMs) and intracardial maps) requires special software. We created a software package and a user interface on top of a commercial data analysis package (MATLAB) by a combination of high-level and low-level programming. Our software was created to satisfy the needs of a diverse group of researchers. It can handle a large variety of recording configurations. It allows for interactive usage through a fast and robust user interface, and batch processing for the analysis of large amounts of data. The package is user-extensible, includes routines for both common and experimental data processing tasks, and works on several computer platforms. The source code is made intelligible using software for structured documentation and is available to the users. The package is currently used by more than ten research groups analysing ECG data worldwide.
NASA Astrophysics Data System (ADS)
Azhdari, G. H.; Deilami, K.; Firooznia, E.
2015-12-01
Natural resources are essential for the security and sustainable development of every country. Therefore, in order to reach sustainable development, with conservation as well as optimum utilization of natural resources, executing a natural resources cadastral plan is necessary and essential. The government conducts land management in Iran, so there is a need for a comprehensive plan with an arranged program for best evaluation. In this research, the city of Pasargadae was chosen as a pilot. The Pasargadae region is located north-east of Shiraz in Fars province, at latitude 30°15′53″N and longitude 53°13′29″E. In order to generate the cadastral maps, images from the QuickBird satellite with 50-60 cm resolution were first georeferenced by utilizing ground control points with accurate GPS coordinates. In addition to the satellite images, old paper maps at 1:10000 scale in a local coordinate system, produced by the agriculture ministry in 1963, were digitized in AutoCAD software against a 1:25000 scale map from the army geographical organization. Besides these, paper maps at 1:50000 scale and Google Earth were used to find changes over time. All the above maps were added to the QuickBird images as new layers using ArcMap software. These maps were also utilized to determine the different land uses. Thus, by employing ArcMap software, lands were divided into two groups: firstly, lands with official documents, owned by either natural or legal persons; and secondly, national lands under different uses such as forestry, range management, and desertification plans. Consequently, the generation of cadastral maps leads to better differentiation between private and national lands. In addition, producing cadastral maps prevents the destruction and illegal possession of natural lands by individuals.
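Georeferencing scanned map sheets against ground control points, as done for the 1:10000 sheets above, usually reduces to a least-squares affine fit between pixel and ground coordinates. A minimal numpy sketch (illustrative; not the AutoCAD/ArcMap workflow itself):

```python
import numpy as np

def fit_affine(pixel_xy, ground_xy):
    """Least-squares affine georeferencing: [X, Y] = [col, row, 1] @ coef.
    pixel_xy, ground_xy: (n, 2) arrays of matched control points, n >= 3."""
    P = np.column_stack([pixel_xy, np.ones(len(pixel_xy))])
    coef, *_ = np.linalg.lstsq(P, ground_xy, rcond=None)
    return coef  # shape (3, 2): rows are the col, row, and offset terms

def apply_affine(coef, pixel_xy):
    """Transform pixel coordinates to ground coordinates."""
    P = np.column_stack([pixel_xy, np.ones(len(pixel_xy))])
    return P @ coef
```

With more than three control points the residuals of the fit give the georeferencing accuracy figure that GIS packages report; higher-order polynomial or rubber-sheet transforms follow the same pattern with a larger design matrix.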
Optimal boundary conditions for ORCA-2 model
NASA Astrophysics Data System (ADS)
Kazantsev, Eugene
2013-08-01
A 4D-Var data assimilation technique is applied to the ORCA-2 configuration of NEMO in order to identify the optimal parametrization of boundary conditions on the lateral boundaries as well as on the bottom and on the surface of the ocean. The influence of boundary conditions on the solution is analyzed both within and beyond the assimilation window. It is shown that the optimal bottom and surface boundary conditions allow us to better represent the jet streams, such as the Gulf Stream and the Kuroshio. Analyzing the reasons for the reinforcement of the jets, we notice that data assimilation has a major impact on the parametrization of the bottom boundary conditions for u and v. Automatic generation of the tangent and adjoint codes is also discussed. The Tapenade software is shown to be able to produce adjoint code that can be used after a memory-usage optimization.
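The adjoint machinery at the heart of 4D-Var can be shown on a toy scalar model x_{n+1} = a·x_n: a reverse sweep accumulates the gradient of the misfit cost with respect to the parameter a, the same kind of code an AD tool like Tapenade generates for a full model (this sketch is hand-written and illustrative only):

```python
def forward(a, x0, nsteps):
    """Integrate the toy model x_{n+1} = a * x_n and keep the trajectory."""
    xs = [x0]
    for _ in range(nsteps):
        xs.append(a * xs[-1])
    return xs

def cost_and_gradient(a, x0, obs):
    """J(a) = 0.5 * sum_n (x_{n+1} - y_n)^2, with dJ/da from the adjoint
    (reverse) sweep: each step x_{n+1} = a*x_n contributes lam_{n+1}*x_n to
    dJ/da and propagates a*lam_{n+1} back to lam_n."""
    xs = forward(a, x0, len(obs))
    J = 0.5 * sum((xs[n + 1] - obs[n]) ** 2 for n in range(len(obs)))
    lam, dJda = 0.0, 0.0
    for n in reversed(range(len(obs))):
        lam += xs[n + 1] - obs[n]   # misfit forcing at this step
        dJda += lam * xs[n]         # adjoint of x_{n+1} = a * x_n
        lam *= a                    # propagate the adjoint backwards
    return J, dJda
```

The gradient agrees with a finite-difference check, which is exactly how adjoint codes for ocean models are validated; in the real system, a is replaced by the boundary-condition parameters and the forward model by NEMO.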
Aerodynamic shape optimization of a HSCT type configuration with improved surface definition
NASA Technical Reports Server (NTRS)
Thomas, Almuttil M.; Tiwari, Surendra N.
1994-01-01
Two distinct parametrization procedures for generating free-form surfaces to represent aerospace vehicles are presented. The first is representation using spline functions such as non-uniform rational B-splines (NURBS); the second is a novel geometrical parametrization using solutions of a suitably chosen partial differential equation. The main idea is to develop a surface representation that is more versatile and can be used in an optimization process. An unstructured volume grid is generated by an advancing-front algorithm and solutions are obtained using an Euler solver. Grid sensitivity with respect to the surface design parameters and aerodynamic sensitivity coefficients based on potential flow are obtained using an automatic-differentiation precompiler software tool. Aerodynamic shape optimization of a complete aircraft with twenty-four design variables is performed. High-speed civil transport (HSCT) aircraft configurations are targeted to demonstrate the process.
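The sensitivity coefficients above rely on automatic differentiation. The paper used a precompiler tool; the same idea in miniature is forward-mode AD with dual numbers (an illustrative stand-in, not the tool in question):

```python
import math

class Dual:
    """Forward-mode AD: carry a value and its derivative through arithmetic."""
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __add__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val + o.val, self.der + o.der)
    __radd__ = __add__
    def __sub__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        return Dual(self.val - o.val, self.der - o.der)
    def __mul__(self, o):
        o = o if isinstance(o, Dual) else Dual(o)
        # product rule: (uv)' = u'v + uv'
        return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
    __rmul__ = __mul__
    def sin(self):
        return Dual(math.sin(self.val), math.cos(self.val) * self.der)

def derivative(f, x):
    """Exact (to machine precision) derivative of f at x via a seeded dual."""
    return f(Dual(x, 1.0)).der
```

A precompiler applies the same propagation rules to whole Fortran or C routines, which is how grid and aerodynamic sensitivities with respect to the 24 design variables are obtained without hand-coded derivatives.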
NASA Astrophysics Data System (ADS)
Lim, Teik-Cheng
2004-01-01
A parametric relationship between the Pearson-Takai-Halicioglu-Tiller (PTHT) and the Kaxiras-Pandey (KP) empirical potential energy functions is developed for the case of 2-body interaction. The need for such a relationship arises when the preferred parametric data and the adopted software correspond to different potential functions. The analytical relationship was obtained by equating the potential functions' derivatives at zeroth, first and second order with respect to the interatomic distance at the equilibrium bond length, followed by comparison of coefficients in the repulsive and attractive terms. Plots of the non-dimensional 2-body energy versus the non-dimensional interatomic distance verified the analytical relationships developed herein. The discrepancy revealed in the theoretical plots suggests that the 2-body PTHT and KP potentials are more suitable for curve-fitting "softer" and "harder" bonds, respectively.
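We do not reproduce the PTHT and KP functional forms here, but the matching procedure itself, equating value, first and second derivative at the equilibrium separation, can be illustrated with a Lennard-Jones/Morse pair (a hypothetical stand-in for the paper's potentials):

```python
import math

def lj(r, eps=1.0, sigma=1.0):
    """Lennard-Jones 12-6 potential."""
    return 4 * eps * ((sigma / r) ** 12 - (sigma / r) ** 6)

def match_morse_to_lj(eps=1.0, sigma=1.0, h=1e-5):
    """Fix Morse parameters (D, alpha, r0) by equating the two potentials'
    value, first and second derivative at the LJ minimum (zeroth/first/second-
    order matching, as in the abstract, applied to a stand-in pair)."""
    r0 = 2 ** (1 / 6) * sigma        # LJ minimum, so V'(r0) = 0 is automatic
    D = -lj(r0, eps, sigma)          # well depth: Morse gives V(r0) = -D
    # curvature of LJ at the minimum, by central finite difference
    k = (lj(r0 - h, eps, sigma) - 2 * lj(r0, eps, sigma) + lj(r0 + h, eps, sigma)) / h ** 2
    alpha = math.sqrt(k / (2 * D))   # Morse curvature at minimum: V'' = 2*D*alpha^2
    return D, alpha, r0

def morse(r, D, alpha, r0):
    """Morse potential with well depth D and width parameter alpha."""
    return D * (math.exp(-2 * alpha * (r - r0)) - 2 * math.exp(-alpha * (r - r0)))
```

As in the paper, the two curves then agree exactly at the equilibrium bond length up to second order and diverge away from it, which is what reveals which potential suits "softer" or "harder" bonds.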
NASA Astrophysics Data System (ADS)
Acomi, Nicoleta; Ancuţa, Cristian; Andrei, Cristian; Boştină, Alina; Boştină, Aurel
2016-12-01
Ships are mainly built to sail and transport cargo at sea. Environmental conditions and the state of the sea are communicated to vessels through periodic weather forecasts. Even though officers are aware of the sea state, their sea-time experience is a decisive factor when the vessel encounters severe environmental conditions. Another important factor is the loading condition of the vessel, which triggers different behaviour under similar marine environmental conditions. This paper aims to analyse the behaviour of a container vessel in severe environmental conditions and to estimate the potential conditions for parametric roll resonance. The Octopus simulation software is employed to simulate vessel motions under given sea conditions, making it possible to analyse the behaviour of ships and the impact of high waves due to specific wave-encounter situations. The study should be regarded as a supporting tool for the decision-making process.
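The classic first screen for parametric roll, which a simulation study like this refines, is the resonance condition that the wave encounter period be near half the natural roll period. A sketch using the deep-water dispersion relation and the standard encounter-frequency formula (the tolerance and the numbers in the usage note are illustrative, not from the paper):

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def encounter_period(wave_length_m, speed_kn, heading_deg):
    """Wave encounter period in deep water.
    omega = sqrt(2*pi*g/lambda) (dispersion relation);
    omega_e = omega - omega^2 * V * cos(mu) / g, with mu = 180 deg for head seas."""
    v = speed_kn * 0.5144  # knots to m/s
    omega = math.sqrt(2 * math.pi * G / wave_length_m)
    omega_e = omega - omega ** 2 * v * math.cos(math.radians(heading_deg)) / G
    return 2 * math.pi / omega_e

def parametric_roll_risk(wave_length_m, speed_kn, heading_deg, roll_period_s, tol=0.1):
    """Flag the principal resonance condition T_e ~ T_roll / 2."""
    te = encounter_period(wave_length_m, speed_kn, heading_deg)
    return abs(te - roll_period_s / 2) / (roll_period_s / 2) < tol
```

For example, 150-m head waves met at 10 knots give an encounter period near 7.3 s, so a loading condition with a 14.7-s natural roll period would sit close to resonance, while one with a 25-s roll period would not; the simulations then quantify the actual roll response.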
NASA Technical Reports Server (NTRS)
Dill, Loren H.; Choo, Yung K. (Technical Monitor)
2004-01-01
Software was developed to construct approximating NURBS curves for iced airfoil geometries. Users specify a tolerance that determines the extent to which the approximating curve follows the rough ice. The user can therefore smooth the ice geometry in a controlled manner, thereby enabling the generation of grids suitable for numerical aerodynamic simulations. Ultimately, this ability to smooth the ice geometry will permit studies of the effects of smoothing upon the aerodynamics of iced airfoils. The software was applied to several different types of iced airfoil data collected in the Icing Research Tunnel at NASA Glenn Research Center, and in all cases was found to efficiently generate suitable approximating NURBS curves. This method is an improvement over the current "control point formulation" of SmaggIce (v. 1.2). In this report, we present the relevant theory of approximating NURBS curves and discuss typical results of the software.
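The tolerance-controlled approximation idea can be illustrated with a single least-squares cubic Bézier segment instead of a full NURBS curve (a deliberate simplification; the software described fits NURBS with many segments): fit the control points, then report the maximum deviation for comparison with the user's tolerance.

```python
import numpy as np

def bernstein3(t):
    """Cubic Bernstein basis evaluated at parameters t; returns shape (n, 4)."""
    t = np.asarray(t)[:, None]
    return np.hstack([(1 - t) ** 3, 3 * t * (1 - t) ** 2, 3 * t ** 2 * (1 - t), t ** 3])

def fit_cubic_bezier(points):
    """Least-squares cubic Bezier through chord-length-parameterised points,
    with the end control points clamped to the data endpoints.
    Returns (control points, max deviation at the data parameters)."""
    pts = np.asarray(points, dtype=float)
    d = np.r_[0.0, np.cumsum(np.linalg.norm(np.diff(pts, axis=0), axis=1))]
    t = d / d[-1]                     # chord-length parameterisation
    B = bernstein3(t)
    # solve only for the inner control points P1, P2 (P0, P3 are clamped)
    rhs = pts - np.outer(B[:, 0], pts[0]) - np.outer(B[:, 3], pts[-1])
    inner, *_ = np.linalg.lstsq(B[:, 1:3], rhs, rcond=None)
    ctrl = np.vstack([pts[0], inner, pts[-1]])
    return ctrl, np.max(np.linalg.norm(B @ ctrl - pts, axis=1))
```

A tolerance-driven fitter of the kind described would keep subdividing the point set and refitting until the reported deviation falls below the user's tolerance, so rough ice detail smaller than the tolerance is smoothed away.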
A parametric ribcage geometry model accounting for variations among the adult population.
Wang, Yulong; Cao, Libo; Bai, Zhonghao; Reed, Matthew P; Rupp, Jonathan D; Hoff, Carrie N; Hu, Jingwen
2016-09-06
The objective of this study is to develop a parametric ribcage model that can account for morphological variations among the adult population. Ribcage geometries, including 12 pairs of ribs, the sternum, and the thoracic spine, were collected from CT scans of 101 adult subjects through image segmentation, landmark identification (1016 landmarks per subject), symmetry adjustment, and template mesh mapping (26,180 elements per subject). Generalized Procrustes analysis (GPA), principal component analysis (PCA), and regression analysis were used to develop a parametric ribcage model, which can predict the nodal locations of the template mesh according to age, sex, height, and body mass index (BMI). Two regression models were developed: a quadratic model for estimating ribcage size and a linear model for estimating ribcage shape. The results showed that ribcage size was dominated by height (p=0.000) and the age-sex interaction (p=0.007), and that ribcage shape was significantly affected by age (p=0.0005), sex (p=0.0002), height (p=0.0064), and BMI (p=0.0000). Along with proper assignment of cortical bone thickness, material properties, and failure properties, this parametric ribcage model can directly serve as the mesh of finite element ribcage models for quantifying the effects of human characteristics on thoracic injury risks. Copyright © 2016 Elsevier Ltd. All rights reserved.
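The PCA-plus-regression core of such a parametric shape model fits in a short sketch (synthetic data; the GPA alignment step is omitted and the variable layout is hypothetical):

```python
import numpy as np

def build_parametric_model(shapes, predictors, n_modes=2):
    """shapes: (n_subjects, n_coords) flattened landmark coordinates, assumed
    already aligned (e.g. by GPA); predictors: (n_subjects, p) covariates such
    as age, sex, height, BMI. Returns a callable predicting a shape from new
    predictor values via PCA modes and a linear score regression."""
    mean = shapes.mean(axis=0)
    _, _, Vt = np.linalg.svd(shapes - mean, full_matrices=False)
    modes = Vt[:n_modes]                        # principal shape modes
    scores = (shapes - mean) @ modes.T          # per-subject PC scores
    X = np.column_stack([predictors, np.ones(len(predictors))])
    coef, *_ = np.linalg.lstsq(X, scores, rcond=None)  # scores ~ predictors
    def predict(pred_row):
        x = np.append(pred_row, 1.0)
        return mean + (x @ coef) @ modes
    return predict
```

The study's model follows this pattern at a much larger scale (1016 landmarks, four covariates, and a separate quadratic size regression), with the predicted nodal locations feeding the finite element mesh directly.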
kruX: matrix-based non-parametric eQTL discovery.
Qi, Jianlong; Asl, Hassan Foroughi; Björkegren, Johan; Michoel, Tom
2014-01-14
The Kruskal-Wallis test is a popular non-parametric statistical test for identifying expression quantitative trait loci (eQTLs) from genome-wide data due to its robustness against variations in the underlying genetic model and expression trait distribution, but testing billions of marker-trait combinations one by one can become computationally prohibitive. We developed kruX, an algorithm implemented in Matlab, Python and R that uses matrix multiplications to simultaneously calculate the Kruskal-Wallis test statistic for several million marker-trait combinations at once. kruX is more than ten thousand times faster than computing associations one by one on a typical human dataset. We used kruX and a dataset of more than 500k SNPs and 20k expression traits measured in 102 human blood samples to compare eQTLs detected by the Kruskal-Wallis test to eQTLs detected by the parametric ANOVA and linear model methods. We found that the Kruskal-Wallis test is more robust against data outliers and heterogeneous genotype group sizes and detects a higher proportion of non-linear associations, but is more conservative for calling additive linear associations. kruX enables the use of robust non-parametric methods for massive eQTL mapping without the need for a high-performance computing infrastructure and is freely available from http://krux.googlecode.com.
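The matrix trick is simple to sketch (our illustration of the idea, omitting kruX's tie correction and missing-genotype handling): rank each expression trait once, then obtain the per-genotype rank-sums for all marker-trait pairs at once as matrix products with 0/1 genotype indicator matrices.

```python
import numpy as np

def rank_rows(E):
    """Average ranks within each row (ties receive the mean rank)."""
    R = np.empty_like(E, dtype=float)
    for i, row in enumerate(E):
        order = np.argsort(row, kind="stable")
        ranks = np.empty(len(row))
        srow = row[order]
        j = 0
        while j < len(row):
            k = j
            while k + 1 < len(row) and srow[k + 1] == srow[j]:
                k += 1
            ranks[order[j:k + 1]] = 0.5 * (j + k) + 1
            j = k + 1
        R[i] = ranks
    return R

def kruskal_wallis_matrix(E, G):
    """Kruskal-Wallis statistics for every (trait, marker) pair at once.
    E: (traits, samples) expression; G: (markers, samples) genotypes in {0,1,2}.
    H = 12/(N(N+1)) * sum_g S_g^2 / n_g - 3(N+1), with all rank-sums S_g
    obtained as the matrix product R @ I_g^T (tie correction omitted)."""
    N = E.shape[1]
    R = rank_rows(E)
    H = np.zeros((E.shape[0], G.shape[0]))
    for g in (0, 1, 2):
        I = (G == g).astype(float)   # markers x samples indicator
        n = I.sum(axis=1)            # group size per marker
        S = R @ I.T                  # rank-sums for all pairs: traits x markers
        with np.errstate(divide="ignore", invalid="ignore"):
            H += np.where(n > 0, S ** 2 / n, 0.0)
    return 12.0 / (N * (N + 1)) * H - 3.0 * (N + 1)
```

Three dense matrix products replace one loop iteration per marker-trait pair, which is where the reported four-orders-of-magnitude speedup over one-by-one testing comes from.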
NASA Technical Reports Server (NTRS)
Haj-Ali, Rami; Aboudi, Jacob
2012-01-01
The recent two-dimensional (2-D) parametric formulation of the high-fidelity generalized method of cells (HFGMC) reported by the authors is generalized for the micromechanical analysis of three-dimensional (3-D) multiphase composites with periodic microstructure. Arbitrary hexahedral subcell geometry is developed to discretize a triply periodic repeating unit-cell (RUC). A linear parametric-geometric mapping is employed to transform the arbitrary hexahedral subcell shapes from the physical space to an auxiliary orthogonal shape, where a complete quadratic displacement expansion is performed. Previously, in the 2-D case, three additional equations were needed in the form of average moments of equilibrium as a result of the inclusion of the bilinear terms. The present 3-D parametric HFGMC formulation, however, eliminates the need for such additional equations. This is achieved by expressing the coefficients of the full quadratic polynomial expansion of the subcell in terms of the side or face average-displacement vectors. The 2-D parametric and orthogonal HFGMC are special cases of the present 3-D formulation. The continuity of displacements and tractions, as well as the equilibrium equations, are imposed in the average (integral) sense as in the original HFGMC formulation. Each of the six sides (faces) of a subcell has an independent average-displacement micro-variable vector which forms an energy-conjugate pair with the transformed average-traction vector. This allows generating symmetric stiffness matrices along with internal resisting vectors for the subcells, which enhances the computational efficiency. The established new parametric 3-D HFGMC equations are formulated and solution implementations are addressed. Several applications for triply periodic 3-D composites are presented to demonstrate the general capability and versatility of the present parametric HFGMC method for refined micromechanical analysis by generating the spatial distributions of local stress fields. These applications include triply periodic composites with inclusions in the form of a cavity, a spherical inclusion, an ellipsoidal inclusion, and discontinuous aligned short fibers. A 3-D repeating unit-cell for a foam material composite is also simulated.
VLBI Analysis with the Multi-Technique Software GEOSAT
NASA Technical Reports Server (NTRS)
Kierulf, Halfdan Pascal; Andersen, Per-Helge; Boeckmann, Sarah; Kristiansen, Oddgeir
2010-01-01
GEOSAT is a multi-technique geodetic analysis software developed at Forsvarets Forsknings Institutt (Norwegian defense research establishment). The Norwegian Mapping Authority has now installed the software and has, together with Forsvarets Forsknings Institutt, adapted the software to deliver datum-free normal equation systems in SINEX format. The goal is to be accepted as an IVS Associate Analysis Center and to provide contributions to the IVS EOP combination on a routine basis. GEOSAT is based on an upper diagonal factorized Kalman filter which allows estimation of time variable parameters like the troposphere and clocks as stochastic parameters. The tropospheric delays in various directions are mapped to tropospheric zenith delay using ray-tracing. Meteorological data from ECMWF with a resolution of six hours is used to perform the ray-tracing which depends both on elevation and azimuth. Other models are following the IERS and IVS conventions. The Norwegian Mapping Authority has submitted test SINEX files produced with GEOSAT to IVS. The results have been compared with the existing IVS combined products. In this paper the outcome of these comparisons is presented.
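Estimating a time-variable parameter such as a clock offset or tropospheric zenith delay "as a stochastic parameter" means giving it random-walk dynamics inside the filter. A minimal scalar illustration of that idea (GEOSAT itself uses an upper-diagonal factorized formulation for numerical stability, which this sketch does not reproduce):

```python
def kalman_random_walk(obs, obs_var, process_var, x0=0.0, p0=1e6):
    """Scalar Kalman filter for a random-walk state x_k = x_{k-1} + w_k,
    w_k ~ N(0, process_var), with direct observations y_k = x_k + v_k,
    v_k ~ N(0, obs_var). Returns the filtered estimates."""
    x, p, estimates = x0, p0, []
    for y in obs:
        p = p + process_var       # predict: state unchanged, uncertainty grows
        k = p / (p + obs_var)     # Kalman gain
        x = x + k * (y - x)       # update with the innovation
        p = (1 - k) * p
        estimates.append(x)
    return estimates
```

The process-noise variance plays the role of the tuning knob that lets clocks wander quickly while the troposphere varies more smoothly; a multi-parameter VLBI filter carries one such state per stochastic parameter alongside the constant geodetic parameters.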
Section 4. The GIS Weasel User's Manual
Viger, Roland J.; Leavesley, George H.
2007-01-01
INTRODUCTION The GIS Weasel was designed to aid in the preparation of spatial information for input to lumped and distributed parameter hydrologic or other environmental models. The GIS Weasel provides geographic information system (GIS) tools to help create maps of geographic features relevant to a user's model and to generate parameters from those maps. The operation of the GIS Weasel does not require the user to be a GIS expert, only that the user have an understanding of the spatial information requirements of the environmental simulation model being used. The GIS Weasel software system uses a GIS-based graphical user interface (GUI), the C programming language, and external scripting languages. The software will run on any computing platform where ArcInfo Workstation (version 8.0.2 or later) and the GRID extension are accessible. The user controls the processing of the GIS Weasel by interacting with menus, maps, and tables. The purpose of this document is to describe the operation of the software. This document is not intended to describe the usage of this software in support of any particular environmental simulation model. Such guides are published separately.
Area of Stochastic Scrape-Off Layer for a Single-Null Divertor Tokamak Using Simple Map
NASA Astrophysics Data System (ADS)
Fisher, Tiffany; Verma, Arun; Punjabi, Alkesh
1996-11-01
The magnetic topology of a single-null divertor tokamak is represented by the Simple Map (Punjabi A, Verma A and Boozer A, Phys. Rev. Lett. 69, 3322 (1992) and J. Plasma Phys. 52, 91 (1994)). The Simple Map is characterized by a single parameter k representing the toroidal asymmetry. The width of the stochastic scrape-off layer and its area vary with the map parameter k. We calculate the area of the stochastic scrape-off layer for different k's and obtain a parametric expression for the area in terms of k and y_LastGoodSurface(k). This work is supported by US DOE OFES. Tiffany Fisher is a HU CFRT Summer Fusion High School Workshop Scholar from New Bern High School in North Carolina. She is supported by the NASA SHARP Plus Program.
Miniature Microwave Applicator for Murine Bladder Hyperthermia Studies
Salahi, Sara; Maccarini, Paolo F.; Rodrigues, Dario B.; Etienne, Wiguins; Landon, Chelsea D.; Inman, Brant A.; Dewhirst, Mark W.; Stauffer, Paul R.
2012-01-01
Purpose Novel combinations of heat with chemotherapeutic agents are often studied in murine tumor models. Currently, no device exists to selectively heat small tumors at depth in mice. In this project, we modelled, built and tested a miniature microwave heat applicator, the physical dimensions of which can be scaled to adjust the volume and depth of heating to focus on the tumor volume. Of particular interest is a device that can selectively heat murine bladder. Materials and Methods Using Avizo® segmentation software, we created a numerical mouse model based on micro-MRI scan data. The model was imported into HFSS™ simulation software and parametric studies were performed to optimize the dimensions of a water-loaded circular waveguide for selective power deposition inside a 0.15ml bladder. A working prototype was constructed operating at 2.45GHz. Heating performance was characterized by mapping fiber-optic temperature sensors along catheters inserted at depths of 0-1mm (subcutaneous), 2-3mm (vaginal), and 4-5mm (rectal) below the abdominal wall, with the mid-depth catheter adjacent to the bladder. Core temperature was monitored orally. Results Thermal measurements confirm the simulations which demonstrate that this applicator can provide local heating at depth in small animals. Measured temperatures in murine pelvis show well-localized bladder heating to 42-43°C while maintaining normothermic skin and core temperatures. Conclusions Simulation techniques facilitate the design optimization of microwave antennas for use in pre-clinical applications such as localized tumor heating in small animals. Laboratory measurements demonstrate the effectiveness of a new miniature water-coupled microwave applicator for localized heating of murine bladder. PMID:22690856
Python based high-level synthesis compiler
NASA Astrophysics Data System (ADS)
Cieszewski, Radosław; Pozniak, Krzysztof; Romaniuk, Ryszard
2014-11-01
This paper presents a Python-based high-level synthesis (HLS) compiler. The compiler interprets an algorithmic description of a desired behavior written in Python and maps it to VHDL. FPGAs combine many benefits of both software and ASIC implementations. Like software, the mapped circuit is flexible and can be reconfigured over the lifetime of the system. FPGAs therefore have the potential to achieve far greater performance than software as a result of bypassing the fetch-decode-execute cycle of traditional processors, and possibly exploiting a greater level of parallelism. Creating parallel programs implemented in FPGAs is not trivial. This article describes the design, implementation and first results of the Python-based compiler.
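The abstract does not detail the compiler's internals, so the following is only a toy sketch of the general idea, translating a one-line Python arithmetic function into a VHDL entity with the standard-library ast module. The entity layout, the 32-bit signed ports, and the restriction to "+" and "-" are illustrative assumptions; a real HLS compiler also performs scheduling, binding, and pipelining.

```python
import ast

OPS = {ast.Add: "+", ast.Sub: "-"}

def emit_expr(node):
    """Recursively render a Python arithmetic expression as VHDL."""
    if isinstance(node, ast.Name):
        return node.id
    if isinstance(node, ast.BinOp) and type(node.op) in OPS:
        return f"({emit_expr(node.left)} {OPS[type(node.op)]} {emit_expr(node.right)})"
    raise NotImplementedError(ast.dump(node))

def python_to_vhdl(src):
    """Map a single-return Python function to a combinational VHDL entity."""
    fn = ast.parse(src).body[0]
    args = [a.arg for a in fn.args.args]
    expr = emit_expr(fn.body[0].value)          # assumes "return <expr>"
    ports = ";\n    ".join(f"{a} : in  signed(31 downto 0)" for a in args)
    return (f"entity {fn.name} is\n  port (\n    {ports};\n"
            f"    result : out signed(31 downto 0));\nend {fn.name};\n\n"
            f"architecture rtl of {fn.name} is\nbegin\n"
            f"  result <= {expr};\nend rtl;")

print(python_to_vhdl("def acc(a, b, c):\n    return a + b - c"))
```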
QGene 4.0, an extensible Java QTL-analysis platform.
Joehanes, Roby; Nelson, James C
2008-12-01
Of the many statistical methods developed to date for quantitative trait locus (QTL) analysis, only a limited subset are available in public software allowing their exploration, comparison and practical application by researchers. We have developed QGene 4.0, a plug-in platform that allows execution and comparison of a variety of modern QTL-mapping methods and supports third-party addition of new ones. The software accommodates line-cross mating designs consisting of any arbitrary sequence of selfing, backcrossing, intercrossing and haploid-doubling steps; includes map, population, and trait simulators; and is scriptable. Software and documentation are available at http://coding.plantpath.ksu.edu/qgene. Source code is available on request.
Viger, Roland J.
2008-01-01
This fact sheet provides a high-level description of the GIS Weasel, a software system designed to aid users in preparing spatial information as input to lumped and distributed parameter environmental simulation models (ESMs). The GIS Weasel provides geographic information system (GIS) tools to help create maps of geographic features relevant to the application of a user's ESM and to generate parameters from those maps. The operation of the GIS Weasel does not require a user to be a GIS expert, only that a user has an understanding of the spatial information requirements of the model. The GIS Weasel software system provides a GIS-based graphical user interface (GUI), C programming language executables, and general utility scripts. The software will run on any computing platform where ArcInfo Workstation (version 8.1 or later) and the GRID extension are accessible. The user controls the GIS Weasel by interacting with menus, maps, and tables.
Historical evolution of disease mapping in general and specifically of cancer mapping.
Howe, G M
1989-01-01
The presentation of areal data in epidemiology is illustrated by such mapping techniques as dots (spots), shading (choropleth, thematic) and isolines (isopleths). Examples are also given of computer-assisted cartography (computer graphics) which employs hardware and software components of digital computers, together with the use of geographical and demographic base maps.
Analyzing the Scientific Evolution of Social Work Using Science Mapping
ERIC Educational Resources Information Center
Martínez, Ma Angeles; Cobo, Manuel Jesús; Herrera, Manuel; Herrera-Viedma, Enrique
2015-01-01
Objectives: This article reports the first science mapping analysis of the social work field, which shows its conceptual structure and scientific evolution. Methods: Science Mapping Analysis Software Tool, a bibliometric science mapping tool based on co-word analysis and h-index, is applied using a sample of 18,794 research articles published from…
A Parametric Model of Shoulder Articulation for Virtual Assessment of Space Suit Fit
NASA Technical Reports Server (NTRS)
Kim, K. Han; Young, Karen S.; Bernal, Yaritza; Boppana, Abhishektha; Vu, Linh Q.; Benson, Elizabeth A.; Jarvis, Sarah; Rajulu, Sudhakar L.
2016-01-01
Shoulder injury is one of the most severe risks that have the potential to impair crewmembers' performance and health in long duration space flight. Overall, 64% of crewmembers experience shoulder pain after extra-vehicular training in a space suit, and 14% of symptomatic crewmembers require surgical repair (Williams & Johnson, 2003). Suboptimal suit fit, in particular at the shoulder region, has been identified as one of the predominant risk factors. However, traditional suit fit assessments and laser scans represent only a single person's data, and thus may not be generalized across wide variations of body shapes and poses. The aim of this work is to develop a software tool based on a statistical analysis of a large dataset of crewmember body shapes. This tool can accurately predict the skin deformation and shape variations for any body size and shoulder pose for a target population, from which the geometry can be exported and evaluated against suit models in commercial CAD software. A preliminary software tool was developed by statistically analyzing 150 body shapes matched with body dimension ranges specified in the Human-Systems Integration Requirements of NASA ("baseline model"). Further, the baseline model was incorporated with shoulder joint articulation ("articulation model"), using additional subjects scanned in a variety of shoulder poses across a pre-specified range of motion. Scan data was cleaned and aligned using body landmarks. The skin deformation patterns were dimensionally reduced and the co-variation with shoulder angles was analyzed. A software tool is currently in development and will be presented in the final proceeding. This tool would allow suit engineers to parametrically generate body shapes in strategically targeted anthropometry dimensions and shoulder poses. 
This would also enable virtual fit assessments, with which the contact volume and clearance between the suit and body surface can be predictively quantified at reduced time and cost.
The Frontier Fields lens modelling comparison project
NASA Astrophysics Data System (ADS)
Meneghetti, M.; Natarajan, P.; Coe, D.; Contini, E.; De Lucia, G.; Giocoli, C.; Acebron, A.; Borgani, S.; Bradac, M.; Diego, J. M.; Hoag, A.; Ishigaki, M.; Johnson, T. L.; Jullo, E.; Kawamata, R.; Lam, D.; Limousin, M.; Liesenborgs, J.; Oguri, M.; Sebesta, K.; Sharon, K.; Williams, L. L. R.; Zitrin, A.
2017-12-01
Gravitational lensing by clusters of galaxies offers a powerful probe of their structure and mass distribution. Several research groups have developed techniques independently to achieve this goal. While these methods have all provided remarkably high-precision mass maps, particularly with exquisite imaging data from the Hubble Space Telescope (HST), the reconstructions themselves have never been directly compared. In this paper, we present for the first time a detailed comparison of methodologies for fidelity, accuracy and precision. For this collaborative exercise, the lens modelling community was provided simulated cluster images that mimic the depth and resolution of the ongoing HST Frontier Fields. The results of the submitted reconstructions with the un-blinded true mass profile of these two clusters are presented here. Parametric, free-form and hybrid techniques have been deployed by the participating groups and we detail the strengths and trade-offs in accuracy and systematics that arise for each methodology. We note in conclusion that several properties of the lensing clusters are recovered equally well by most of the lensing techniques compared in this study. For example, the reconstruction of azimuthally averaged density and mass profiles by both parametric and free-form methods matches the input models at the level of ∼10 per cent. Parametric techniques are generally better at recovering the 2D maps of the convergence and of the magnification. For the best-performing algorithms, the accuracy in the magnification estimate is ∼10 per cent at μtrue = 3 and it degrades to ∼30 per cent at μtrue ∼ 10.
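The accuracy figures above refer to reconstructed maps of the convergence kappa and the magnification mu. For context, the two are tied by the standard lensing relation mu = 1 / ((1 - kappa)^2 - gamma^2), where gamma is the shear, so errors in the reconstructed kappa and gamma maps propagate directly into the magnification estimate. A minimal sketch with illustrative (not paper-derived) values:

```python
import numpy as np

def magnification(kappa, gamma):
    """Standard lensing magnification from convergence and shear maps."""
    return 1.0 / ((1.0 - kappa) ** 2 - gamma ** 2)

# Illustrative values running from the weak- to the strong-lensing regime.
kappa = np.array([0.2, 0.5, 0.8])
gamma = np.array([0.1, 0.2, 0.1])
mu = magnification(kappa, gamma)
```

The steep growth of mu as kappa approaches 1 is why the paper reports the magnification accuracy degrading from ~10 per cent at mu = 3 to ~30 per cent at mu ~ 10.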
NASA Astrophysics Data System (ADS)
Mamalakis, Antonios; Langousis, Andreas; Deidda, Roberto; Marrocu, Marino
2017-03-01
Distribution mapping has been identified as the most efficient approach to bias-correct climate model rainfall, while reproducing its statistics at spatial and temporal resolutions suitable to run hydrologic models. Yet its implementation based on empirical distributions derived from control samples (referred to as nonparametric distribution mapping) makes the method's performance sensitive to sample length variations, the presence of outliers, the spatial resolution of climate model results, and may lead to biases, especially in extreme rainfall estimation. To address these shortcomings, we propose a methodology for simultaneous bias correction and high-resolution downscaling of climate model rainfall products that uses: (a) a two-component theoretical distribution model (i.e., a generalized Pareto (GP) model for rainfall intensities above a specified threshold u*, and an exponential model for lower rainrates), and (b) proper interpolation of the corresponding distribution parameters on a user-defined high-resolution grid, using kriging for uncertain data. We assess the performance of the suggested parametric approach relative to the nonparametric one, using daily raingauge measurements from a dense network in the island of Sardinia (Italy), and rainfall data from four GCM/RCM model chains of the ENSEMBLES project. The obtained results shed light on the competitive advantages of the parametric approach, which is proved more accurate and considerably less sensitive to the characteristics of the calibration period, independent of the GCM/RCM combination used. This is especially the case for extreme rainfall estimation, where the GP assumption allows for more accurate and robust estimates, also beyond the range of the available data.
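A minimal sketch of the two-component quantile-mapping step described above: a generalized Pareto (GP) tail above the threshold u and an exponential body below it, with a raw model value pushed through the model-rainfall CDF and then through the inverse observed-rainfall distribution. All thresholds and distribution parameters below are illustrative assumptions, not fitted values from the study, and the kriging-based interpolation of parameters to a fine grid is omitted.

```python
import numpy as np
from scipy import stats

def make_cdf(u, phi, lam, xi, sigma):
    """CDF of the mixed model: exponential below u, GP above u.
    phi = P(X > u); lam = exponential rate; xi, sigma = GP shape/scale."""
    def F(x):
        x = np.asarray(x, float)
        below = (1 - phi) * stats.expon.cdf(x, scale=1/lam) / stats.expon.cdf(u, scale=1/lam)
        above = 1 - phi * stats.genpareto.sf(x - u, xi, scale=sigma)
        return np.where(x <= u, below, above)
    return F

def make_ppf(u, phi, lam, xi, sigma):
    """Closed-form inverse of the mixed CDF above."""
    Fu = stats.expon.cdf(u, scale=1/lam)
    def Q(q):
        q = np.asarray(q, float)
        lo = stats.expon.ppf(np.clip(q / (1 - phi), 0, 1) * Fu, scale=1/lam)
        hi = u + stats.genpareto.isf(np.clip((1 - q) / phi, 0, 1), xi, scale=sigma)
        return np.where(q <= 1 - phi, lo, hi)
    return Q

# Quantile mapping for one grid cell: model CDF -> observed quantile function.
F_mod = make_cdf(u=10.0, phi=0.05, lam=0.30, xi=0.10, sigma=6.0)
Q_obs = make_ppf(u=12.0, phi=0.05, lam=0.25, xi=0.15, sigma=8.0)
corrected = Q_obs(F_mod(np.array([2.0, 8.0, 40.0])))
```

The two branches meet continuously at u (both give x = u at q = 1 - phi), which is what makes the mixture usable as a single quantile-mapping transfer function.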
NASA Astrophysics Data System (ADS)
Pandey, Palak; Kunte, Pravin D.
2016-10-01
This study presents an easy, modular, user-friendly, and flexible software package for processing Landsat 7 ETM+ and Landsat 8 OLI-TIRS data to estimate suspended particulate matter concentrations in coastal waters. This package includes 1) an algorithm developed using the freely downloadable SCILAB package, 2) ERDAS models for iterative processing of Landsat images, and 3) an ArcMap tool for plotting and map making. Utilizing the SCILAB package, a module is written for geometric corrections, radiometric corrections and obtaining normalized water-leaving reflectance from Landsat 8 OLI-TIRS and Landsat 7 ETM+ data. Using ERDAS models, a sequence of modules is developed for iterative processing of Landsat images and estimating suspended particulate matter concentrations. The processed images are used for preparing suspended sediment concentration maps. The applicability of this software package is demonstrated by estimating and plotting seasonal suspended sediment concentration maps off the Bengal delta. The software is flexible enough to accommodate other remotely sensed data, such as Ocean Colour Monitor (OCM), Indian Remote Sensing (IRS), and MODIS data, by replacing a few parameters in the algorithm for estimating suspended sediment concentration in coastal waters.
GACD: Integrated Software for Genetic Analysis in Clonal F1 and Double Cross Populations.
Zhang, Luyan; Meng, Lei; Wu, Wencheng; Wang, Jiankang
2015-01-01
Clonal species are common among plants. Clonal F1 progenies are derived from the hybridization between 2 heterozygous clones. In self- and cross-pollinated species, double crosses can be made from 4 inbred lines. A clonal F1 population can be viewed as a double cross population when the linkage phase is determined. The software package GACD (Genetic Analysis of Clonal F1 and Double cross) is freely available public software, capable of building high-density linkage maps and mapping quantitative trait loci (QTL) in clonal F1 and double cross populations. Three functionalities are integrated in GACD version 1.0: binning of redundant markers (BIN); linkage map construction (CDM); and QTL mapping (CDQ). Output of BIN can be directly used as input of CDM. After adding the phenotypic data, the output of CDM can be used as input of CDQ. Thus, GACD acts as a pipeline for genetic analysis. GACD and example datasets are freely available from www.isbreeding.net. © The American Genetic Association. 2015. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
Cartographic applications software
,
1992-01-01
The Office of the Assistant Division Chief for Research, National Mapping Division, develops computer software for the solution of geometronic problems in the fields of surveying, geodesy, remote sensing, and photogrammetry. Software that has been developed using public funds is available on request for a nominal charge to recover the cost of duplication.
Software for Managing Parametric Studies
NASA Technical Reports Server (NTRS)
Yarrow, Maurice; McCann, Karen M.; DeVivo, Adrian
2003-01-01
The Information Power Grid Virtual Laboratory (ILab) is a Practical Extraction and Reporting Language (PERL) graphical-user-interface computer program that generates shell scripts to facilitate parametric studies performed on the Grid. (The Grid denotes a worldwide network of supercomputers used for scientific and engineering computations involving data sets too large to fit on desktop computers.) Heretofore, parametric studies on the Grid have been impeded by the need to create control-language scripts and edit input data files, painstaking tasks that are necessary for managing multiple jobs on multiple computers. ILab reflects an object-oriented approach to the automation of these tasks: all data and operations are organized into packages in order to accelerate development and debugging. A container or document object in ILab, called an experiment, contains all the information (data and file paths) necessary to define a complex series of repeated, sequenced, and/or branching processes. For convenience and to enable reuse, this object is serialized to and from disk storage. At run time, the current ILab experiment is used to generate required input files and shell scripts, create directories, copy data files, and then both initiate and monitor the execution of all computational processes.
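ILab itself is a PERL program whose internals are not given here; the following Python sketch only mirrors the core idea of an "experiment" that expands parameter ranges into one shell script per run. The output directory, solver name, and flag style are all hypothetical.

```python
import itertools
import pathlib
import textwrap

def generate_scripts(outdir, command, params):
    """Write one run script per point in the Cartesian parameter grid."""
    outdir = pathlib.Path(outdir)
    outdir.mkdir(parents=True, exist_ok=True)
    scripts = []
    keys = sorted(params)
    for i, values in enumerate(itertools.product(*(params[k] for k in keys))):
        args = " ".join(f"--{k}={v}" for k, v in zip(keys, values))
        script = outdir / f"run_{i:03d}.sh"
        script.write_text(textwrap.dedent(f"""\
            #!/bin/sh
            mkdir -p case_{i:03d}
            cd case_{i:03d}
            {command} {args}
        """))
        scripts.append(script)
    return scripts

# "flow_solver" is a hypothetical executable name for the illustration.
runs = generate_scripts("sweep", "flow_solver",
                        {"mach": [0.7, 0.8], "aoa": [0, 2, 4]})
print(len(runs))  # 2 x 3 = 6 scripts
```

Each script sets up its own case directory, so the runs can be submitted independently to different machines, which is the job-management burden the paper describes automating.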
Automated a complex computer aided design concept generated using macros programming
NASA Astrophysics Data System (ADS)
Rizal Ramly, Mohammad; Asrokin, Azharrudin; Abd Rahman, Safura; Zulkifly, Nurul Ain Md
2013-12-01
Changing a complex computer-aided design profile such as a car or aircraft surface has always been difficult and challenging. The capability of CAD software such as AutoCAD and CATIA shows that a simple configuration of a CAD design can be easily modified without hassle, but this is not the case with complex design configurations. Design changes help users to test and explore various configurations of the design concept before the production of a model. The purpose of this study is to look into macro programming as a parametric method for commercial aircraft design. Macro programming is a method in which the configurations of the design are produced by recording a script of commands, editing the data values and adding new command lines to create the elements of a parametric design. The steps and procedure to create a macro program are discussed, besides looking into some difficulties encountered during the process of creation and the advantages of its usage. Generally, the advantages of macro programming as a method of parametric design are: allowing flexibility for design exploration, increasing the usability of the design solution, allowing some design elements to be properly constrained by the model while restricting others, and providing real-time feedback on changes.
Parametric regression model for survival data: Weibull regression model as an example
2016-01-01
The Weibull regression model is one of the most popular forms of parametric regression model, in that it provides an estimate of the baseline hazard function as well as coefficients for covariates. Because of technical difficulties, the Weibull regression model is seldom used in the medical literature compared to the semi-parametric proportional hazards model. To make clinical investigators familiar with the Weibull regression model, this article introduces some basic knowledge of the model and then illustrates how to fit it with R software. The SurvRegCensCov package is useful in converting estimated coefficients to clinically relevant statistics such as the hazard ratio (HR) and the event time ratio (ETR). Model adequacy can be assessed by inspecting Kaplan-Meier curves stratified by categorical variables. The eha package provides an alternative method to fit the Weibull regression model. The check.dist() function helps to assess the goodness-of-fit of the model. Variable selection is based on the importance of a covariate, which can be tested using the anova() function. Alternatively, backward elimination starting from a full model is an efficient way for model development. Visualization of the Weibull regression model after model development is useful in that it provides another way to report your findings. PMID:28149846
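The article's workflow is in R (survreg, SurvRegCensCov, eha); as a rough Python counterpart, here is a minimal Weibull accelerated-failure-time fit by maximum likelihood with right censoring, on simulated data. All parameter values and the single binary covariate are assumptions for the illustration.

```python
import numpy as np
from scipy.optimize import minimize

# Simulate Weibull AFT data: the covariate rescales the Weibull scale.
rng = np.random.default_rng(0)
n = 400
x = rng.integers(0, 2, n)                 # binary covariate, e.g. treatment
shape, b0, b1 = 1.5, 2.0, -0.5            # "true" illustrative parameters
scale = np.exp(b0 + b1 * x)
t = scale * rng.weibull(shape, n)
c = rng.uniform(0, 15, n)                 # random right-censoring times
time, event = np.minimum(t, c), (t <= c).astype(float)

def negloglik(p):
    """Negative log-likelihood: events contribute log f = log h + log S,
    censored observations contribute log S = -(t/lam)^k."""
    logk, beta0, beta1 = p
    k = np.exp(logk)
    lam = np.exp(beta0 + beta1 * x)
    z = (time / lam) ** k
    logh = np.log(k) - np.log(lam) + (k - 1) * (np.log(time) - np.log(lam))
    return -(event * logh - z).sum()

fit = minimize(negloglik, x0=[0.0, 0.0, 0.0], method="Nelder-Mead")
k_hat, b0_hat, b1_hat = np.exp(fit.x[0]), fit.x[1], fit.x[2]
# Hazard ratio in the equivalent proportional-hazards parametrization,
# the conversion SurvRegCensCov performs for survreg output:
hr = np.exp(-b1_hat * k_hat)
```

The final line is the AFT-to-PH conversion mentioned in the article: for a Weibull model an AFT coefficient beta translates to a log-hazard-ratio of -k * beta.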
O'Connor, B P
2000-08-01
Popular statistical software packages do not have the proper procedures for determining the number of components in factor and principal components analyses. Parallel analysis and Velicer's minimum average partial (MAP) test are validated procedures, recommended widely by statisticians. However, many researchers continue to use alternative, simpler, but flawed procedures, such as the eigenvalues-greater-than-one rule. Use of the proper procedures might be increased if these procedures could be conducted within familiar software environments. This paper describes brief and efficient programs for using SPSS and SAS to conduct parallel analyses and the MAP test.
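The paper's programs target SPSS and SAS; a NumPy sketch of the parallel-analysis criterion they implement (the MAP test is omitted here) retains components whose observed eigenvalues exceed a high percentile of eigenvalues from random data of the same dimensions:

```python
import numpy as np

def parallel_analysis(data, n_iter=200, percentile=95, seed=0):
    """Retain components whose correlation-matrix eigenvalues exceed the
    given percentile of eigenvalues from same-sized random normal data."""
    rng = np.random.default_rng(seed)
    n, p = data.shape
    real = np.linalg.eigvalsh(np.corrcoef(data, rowvar=False))[::-1]
    rand = np.empty((n_iter, p))
    for i in range(n_iter):
        r = rng.standard_normal((n, p))
        rand[i] = np.linalg.eigvalsh(np.corrcoef(r, rowvar=False))[::-1]
    thresh = np.percentile(rand, percentile, axis=0)
    return int(np.sum(real > thresh)), real, thresh

# Simulated example: 6 variables loading on 2 correlated blocks.
rng = np.random.default_rng(1)
f = rng.standard_normal((500, 2))
data = np.repeat(f, 3, axis=1) + 0.5 * rng.standard_normal((500, 6))
k, real, thresh = parallel_analysis(data)
```

Unlike the eigenvalues-greater-than-one rule criticized in the paper, the retention threshold here adapts to sample size and variable count.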
Kang, Jee In; Park, Hae-Jeong; An, Suk Kyoon
2014-01-01
Background: Altered transmission of gamma-aminobutyric acid (GABA), a major inhibitory neurotransmitter, may contribute to the development of schizophrenia. The purpose of the present study was to investigate the presence of GABA-A/benzodiazepine (BZ) receptor binding abnormalities in individuals at ultra-high risk (UHR) for psychosis in comparison with normal controls using [18F]-fluoroflumazenil (FFMZ) positron emission tomography (PET). In particular, we set regions of interest in the striatum (caudate, putamen, and nucleus accumbens) and medial temporal area (hippocampus and parahippocampal gyrus). Methods: Eleven BZ-naive people at UHR and 15 normal controls underwent PET scanning using [18F]-FFMZ to measure GABA-A/BZ receptor binding potential. The regional group differences between UHR individuals and normal controls were analyzed using Statistical Parametric Mapping 8 software. Participants were evaluated using the structured interview for prodromal syndromes and neurocognitive function tasks. Results: People at UHR demonstrated significantly reduced binding potential of GABA-A/BZ receptors in the right caudate. Conclusions: Altered GABAergic transmission and/or the imbalance of inhibitory and excitatory systems in the striatum may be present at the putative prodromal stage and play a pivotal role in the pathophysiology of psychosis. PMID:23588475
Fu, Yue; Zhang, Quan; Zhang, Jing; Zhang, Yun Ting
2015-01-01
To compare the effects of active and passive movements on brain activation in patients with cerebral infarction using fMRI. Twenty-four hemiplegic patients with cerebral infarction were evaluated using fMRI. All patients performed active and passive finger opposition movements. Patients were instructed to perform the finger opposition movement for the active movement task. For the passive movement task, the subject's fingers were moved by the examiner to perform the finger opposition movement. Statistical parametric mapping software was used for statistical analyses and to process all data. In the affected hemisphere, sensorimotor cortex (SMC) activation intensity and range were significantly stronger during the passive movement of the affected fingers compared to the active movement of the affected fingers (p < 0.05). However, there were no significant differences between active and passive movements of unaffected fingers in SMC activation intensity and range in the unaffected hemisphere (p > 0.05). In addition, the passive movement activated many other regions of the brain. The brain regions activated by passive movements of the affected fingers tended to center toward the contralateral SMC. Our findings suggest that passive movements induce cortical reorganization in patients with cerebral infarction. Therefore, passive movement is likely beneficial for motor function recovery in patients with cerebral infarction.
A hybrid short read mapping accelerator
2013-01-01
Background The rapid growth of short read datasets poses a new challenge to the short read mapping problem in terms of sensitivity and execution speed. Existing methods often use a restrictive error model for computing the alignments to improve speed, whereas more flexible error models are generally too slow for large-scale applications. A number of short read mapping software tools have been proposed. However, designs based on hardware are relatively rare. Field programmable gate arrays (FPGAs) have been successfully used in a number of specific application areas, such as the DSP and communications domains due to their outstanding parallel data processing capabilities, making them a competitive platform to solve problems that are “inherently parallel”. Results We present a hybrid system for short read mapping utilizing both FPGA-based hardware and CPU-based software. The computation intensive alignment and the seed generation operations are mapped onto an FPGA. We present a computationally efficient, parallel block-wise alignment structure (Align Core) to approximate the conventional dynamic programming algorithm. The performance is compared to the multi-threaded CPU-based GASSST and BWA software implementations. For single-end alignment, our hybrid system achieves faster processing speed than GASSST (with a similar sensitivity) and BWA (with a higher sensitivity); for pair-end alignment, our design achieves a slightly worse sensitivity than that of BWA but has a higher processing speed. Conclusions This paper shows that our hybrid system can effectively accelerate the mapping of short reads to a reference genome based on the seed-and-extend approach. The performance comparison to the GASSST and BWA software implementations under different conditions shows that our hybrid design achieves a high degree of sensitivity and requires less overall execution time with only modest FPGA resource utilization. 
Our hybrid system design also shows that the performance bottleneck for the short read mapping problem can be changed from the alignment stage to the seed generation stage, which provides an additional requirement for the future development of short read aligners. PMID:23441908
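The FPGA design itself is not reproduced here; the following pure-Python sketch only illustrates the seed-and-extend idea the accelerator implements: exact k-mer seeds located through a reference index, followed by a mismatch-counting extension of each candidate placement. The sequences and parameters are made up for the illustration.

```python
def build_index(ref, k):
    """Seed stage: map every k-mer of the reference to its positions."""
    index = {}
    for i in range(len(ref) - k + 1):
        index.setdefault(ref[i:i + k], []).append(i)
    return index

def map_read(read, ref, index, k, max_mismatches=2):
    """Extend stage: return (position, mismatches) of the best hit."""
    best = None
    for s in range(len(read) - k + 1):
        for pos in index.get(read[s:s + k], []):
            start = pos - s                      # candidate alignment start
            if start < 0 or start + len(read) > len(ref):
                continue
            mm = sum(a != b for a, b in zip(read, ref[start:start + len(read)]))
            if mm <= max_mismatches and (best is None or mm < best[1]):
                best = (start, mm)
    return best

ref = "ACGTACGTTGCAGGCTTACGATCGATCG"
idx = build_index(ref, k=8)
print(map_read("GCAGGCTAACGATCGA", ref, idx, k=8))  # → (9, 1)
```

In the paper's hybrid system the mismatch-counting extension (here a simple loop) is the block-wise Align Core in hardware, while index construction and bookkeeping remain in software.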
2014-12-01
The aim of this study was to design and implement a seroprevalence map based on business intelligence for low pathogenicity notifiable avian influenza (LPNAI) in broiler chickens in Comunidad Valenciana (Spain). The software mapping tool developed for this study consisted of three main phases: data collection, data analysis and data representation. To obtain the serological data, the authors analysed 8,520 serum samples from broiler farms over three years. The data were represented on a map of Comunidad Valenciana, including geographical information on flock locations to facilitate disease monitoring. No clinical signs of LPNAI were reported in the studied flocks. The data from this study showed no evidence of contact with LPNAI in broiler flocks, and the novel software mapping tool proved to be a valuable method for easily monitoring the serological response to avian influenza, including geographical information.
Development of a Two-Wheel Contingency Mode for the MAP Spacecraft
NASA Technical Reports Server (NTRS)
Starin, Scott R.; ODonnell, James R., Jr.; Bauer, Frank (Technical Monitor)
2002-01-01
The Microwave Anisotropy Probe (MAP) is a follow-on mission to the Cosmic Background Explorer (COBE), and is currently collecting data from its orbit near the second Sun-Earth libration point. Due to limited mass, power, and financial resources, a traditional reliability concept including fully redundant components was not feasible for MAP. Instead, the MAP design employs selective hardware redundancy in tandem with contingency software modes and algorithms to improve the odds of mission success. One direction for such improvement has been the development of a two-wheel backup control strategy. This strategy would allow MAP to position itself for maneuvers and collect science data should one of its three reaction wheels fail. Along with operational considerations, the strategy includes three new control algorithms. These algorithms would use the remaining attitude control actuators-thrusters and two reaction wheels-in ways that achieve control goals while minimizing adverse impacts on the functionality of other subsystems and software.
Simple Parametric Model for Airfoil Shape Description
NASA Astrophysics Data System (ADS)
Ziemkiewicz, David
2017-12-01
We show a simple, analytic equation describing a class of two-dimensional shapes well suited for representation of aircraft airfoil profiles. Our goal was to create a description characterized by a small number of parameters with easily understandable meaning, providing a tool to alter the shape with optimization procedures as well as manual tweaks by the designer. The generated shapes are well suited for numerical analysis with 2D flow solving software such as XFOIL.
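The paper's specific shape equation is not reproduced in the abstract. As a stand-in illustration of the same idea, a few intuitive parameters generating an analytic airfoil, here is the classic NACA 4-digit description (m = maximum camber, p = its chordwise position, t = thickness, all as fractions of chord). The yc +/- yt surface construction is the common thin-airfoil simplification of the exact camber-normal offset.

```python
import numpy as np

def naca4(m, p, t, n=101):
    """Upper/lower surfaces of a NACA 4-digit airfoil on cosine-spaced x."""
    x = (1 - np.cos(np.linspace(0, np.pi, n))) / 2
    # Half-thickness; the -0.1036 coefficient closes the trailing edge.
    yt = 5 * t * (0.2969 * np.sqrt(x) - 0.1260 * x - 0.3516 * x**2
                  + 0.2843 * x**3 - 0.1036 * x**4)
    # Piecewise-parabolic camber line.
    yc = np.where(x < p,
                  m / p**2 * (2 * p * x - x**2),
                  m / (1 - p)**2 * (1 - 2 * p + 2 * p * x - x**2))
    return x, yc + yt, yc - yt

x, yu, yl = naca4(m=0.02, p=0.4, t=0.12)   # NACA 2412
```

Three numbers with obvious physical meaning fully determine the shape, which is exactly the property the paper seeks: a handle for optimization procedures and manual tweaks alike.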
A linear programming approach to characterizing norm bounded uncertainty from experimental data
NASA Technical Reports Server (NTRS)
Scheid, R. E.; Bayard, D. S.; Yam, Y.
1991-01-01
The linear programming spectral overbounding and factorization (LPSOF) algorithm, an algorithm for finding a minimum phase transfer function of specified order whose magnitude tightly overbounds a specified nonparametric function of frequency, is introduced. This method has direct application to transforming nonparametric uncertainty bounds (available from system identification experiments) into parametric representations required for modern robust control design software (i.e., a minimum-phase transfer function multiplied by a norm-bounded perturbation).
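The LPSOF algorithm itself is not reproduced in the abstract; the following sketch shows only the linear-programming overbounding step on toy data, fitting a cosine polynomial that lies above a sampled log-magnitude bound while minimizing its mean. The spectral factorization that then yields the minimum-phase transfer function is omitted, and the bound data are invented.

```python
import numpy as np
from scipy.optimize import linprog

# Sampled nonparametric log-magnitude bound (toy stand-in for data from
# a system-identification experiment).
w = np.linspace(0, np.pi, 200)
bound = np.log(1.0 + 0.5 * np.sin(3 * w) ** 2)

# Fit log|W| ~= sum_k c_k cos(k w): linear in c, so overbounding
# (fit >= bound at every sample) is a set of linear constraints.
order = 6
B = np.cos(np.outer(w, np.arange(order + 1)))

# minimize mean(B @ c)  subject to  B @ c >= bound  (i.e. -B @ c <= -bound)
res = linprog(c=B.mean(axis=0), A_ub=-B, b_ub=-bound,
              bounds=[(None, None)] * (order + 1))
fit = B @ res.x
```

Because the objective penalizes the average of the fit, the LP produces the tightest overbound in this basis, which is the "tightly overbounds" property the algorithm is built around.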
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smidts, Carol; Huang, Funqun; Li, Boyuan
With the current transition from analog to digital instrumentation and control systems in nuclear power plants, the number and variety of software-based systems have significantly increased. The sophisticated nature and increasing complexity of software raises trust in these systems as a significant challenge. The trust placed in a software system is typically termed software dependability. Software dependability analysis faces uncommon challenges since software systems’ characteristics differ from those of hardware systems. The lack of systematic science-based methods for quantifying the dependability attributes in software-based instrumentation as well as control systems in safety critical applications has proved itself to be a significant inhibitor to the expanded use of modern digital technology in the nuclear industry. Dependability refers to the ability of a system to deliver a service that can be trusted. Dependability is commonly considered as a general concept that encompasses different attributes, e.g., reliability, safety, security, availability and maintainability. Dependability research has progressed significantly over the last few decades. For example, various assessment models and/or design approaches have been proposed for software reliability, software availability and software maintainability. Advances have also been made to integrate multiple dependability attributes, e.g., integrating security with other dependability attributes, measuring availability and maintainability, modeling reliability and availability, quantifying reliability and security, exploring the dependencies between security and safety and developing integrated analysis models. However, there is still a lack of understanding of the dependencies between various dependability attributes as a whole and of how such dependencies are formed.
To address the need for quantification and give a more objective basis to the review process -- therefore reducing regulatory uncertainty -- measures and methods are needed to assess dependability attributes early on, as well as throughout the life-cycle process of software development. In this research, extensive expert opinion elicitation is used to identify the measures and methods for assessing software dependability. Semi-structured questionnaires were designed to elicit expert knowledge. A new notation system, Causal Mechanism Graphing, was developed to extract and represent such knowledge. The Causal Mechanism Graphs were merged, thus, obtaining the consensus knowledge shared by the domain experts. In this report, we focus on how software contributes to dependability. However, software dependability is not discussed separately from the context of systems or socio-technical systems. Specifically, this report focuses on software dependability, reliability, safety, security, availability, and maintainability. Our research was conducted in the sequence of stages found below. Each stage is further examined in its corresponding chapter. Stage 1 (Chapter 2): Elicitation of causal maps describing the dependencies between dependability attributes. These causal maps were constructed using expert opinion elicitation. This chapter describes the expert opinion elicitation process, the questionnaire design, the causal map construction method and the causal maps obtained. Stage 2 (Chapter 3): Elicitation of the causal map describing the occurrence of the event of interest for each dependability attribute. The causal mechanisms for the “event of interest” were extracted for each of the software dependability attributes. The “event of interest” for a dependability attribute is generally considered to be the “attribute failure”, e.g. security failure. The extraction was based on the analysis of expert elicitation results obtained in Stage 1. 
Stage 3 (Chapter 4): Identification of relevant measurements. Measures for the “events of interest” and their causal mechanisms were obtained from expert opinion elicitation for each of the software dependability attributes. The measures extracted are presented in this chapter. Stage 4 (Chapter 5): Assessment of the coverage of the causal maps via measures. Coverage was assessed to determine whether the measures obtained were sufficient to quantify software dependability, and what measures are further required. Stage 5 (Chapter 6): Identification of “missing” measures and measurement approaches for concepts not covered. New measures, for concepts that had not been covered sufficiently as determined in Stage 4, were identified using supplementary expert opinion elicitation as well as literature reviews. Stage 6 (Chapter 7): Building of a detailed quantification model based on the causal maps and measurements obtained. The ability to derive such a quantification model shows that the causal models and measurements derived from the previous stages (Stage 1 to Stage 5) can form the technical basis for developing dependability quantification models. Scope restrictions have led us to prioritize this demonstration effort. The demonstration was focused on a critical system, i.e., the reactor protection system. For this system, a ranking of the software dependability attributes by nuclear stakeholders was developed. As expected for this application, the stakeholder ranking identified safety as the most critical attribute to be quantified. A safety quantification model limited to the requirements phase of development was built. Two case studies were conducted for verification. A preliminary control gate for software safety for the requirements stage was proposed and applied to the first case study. The control gate allows a cost-effective selection of the duration of the requirements phase.
Open source pipeline for ESPaDOnS reduction and analysis
NASA Astrophysics Data System (ADS)
Martioli, Eder; Teeple, Doug; Manset, Nadine; Devost, Daniel; Withington, Kanoa; Venne, Andre; Tannock, Megan
2012-09-01
OPERA is a Canada-France-Hawaii Telescope (CFHT) open source collaborative software project currently under development for an ESPaDOnS echelle spectro-polarimetric image reduction pipeline. OPERA is designed to be fully automated, performing calibrations and reduction and producing one-dimensional intensity and polarimetric spectra. The calibrations are performed on two-dimensional images. Spectra are extracted using an optimal extraction algorithm. While primarily designed for CFHT ESPaDOnS data, the pipeline is being written to be extensible to other echelle spectrographs. A primary design goal is to make use of fast, modern object-oriented technologies. Processing is controlled by a harness that manages a set of processing modules, which make use of a collection of native OPERA software libraries and standard external software libraries. The harness and modules are completely parametrized by site configuration and instrument parameters. The software is open-ended, permitting users of OPERA to extend the pipeline capabilities. All these features have been designed to provide a portable infrastructure that facilitates collaborative development, code re-usability and extensibility. OPERA is free software with support for both GNU/Linux and MacOSX platforms. The pipeline is hosted on SourceForge under the name "opera-pipeline".
ERIC Educational Resources Information Center
Holmes, David W.; Sheehan, Madoc; Birks, Melanie; Smithson, John
2018-01-01
Mapping the curriculum of a professional degree to the associated competency standard ensures graduates have the competence to perform as professionals. Existing approaches to competence mapping vary greatly in depth, complexity, and effectiveness, and a standardised approach remains elusive. This paper describes a new mapping software tool that…
MAP - a mapping and analysis program for harvest planning
Robert N. Eli; Chris B. LeDoux; Penn A. Peters
1984-01-01
The Northeastern Forest Experiment Station and the Department of Civil Engineering at West Virginia University are cooperating in the development of a Mapping and Analysis Program, to be named MAP. The goal of this computer software package is to significantly improve the planning and harvest efficiency of small to moderately sized harvest units located in mountainous...
The IHMC CmapTools software in research and education: a multi-level use case in Space Meteorology
NASA Astrophysics Data System (ADS)
Messerotti, Mauro
2010-05-01
The IHMC (Institute for Human and Machine Cognition, Florida University System, USA) CmapTools software is a powerful multi-platform tool for knowledge modelling in graphical form based on concept maps. In this work we present its application for the high-level development of a set of multi-level concept maps in the framework of Space Meteorology to act as the kernel of a space meteorology domain ontology. This is an example of a research use case, as a domain ontology coded in machine-readable form via e.g. OWL (Web Ontology Language) is suitable to be an active layer of any knowledge management system embedded in a Virtual Observatory (VO). Apart from being manageable at machine level, concept maps developed via CmapTools are intrinsically human-readable and can embed hyperlinks and objects of many kinds. Therefore they are suitable to be published on the web: the coded knowledge can be exploited for educational purposes by students and the public, as the level of information can be naturally organized among linked concept maps in progressively increasing complexity levels. Hence CmapTools and its advanced version COE (Concept-map Ontology Editor) represent effective and user-friendly software tools for high-level knowledge representation in research and education.
Accelerating artificial intelligence with reconfigurable computing
NASA Astrophysics Data System (ADS)
Cieszewski, Radoslaw
Reconfigurable computing is emerging as an important area of research in computer architectures and software systems. Many algorithms can be greatly accelerated by placing the computationally intense portions of an algorithm into reconfigurable hardware. Reconfigurable computing combines many benefits of both software and ASIC implementations. Like software, the mapped circuit is flexible, and can be changed over the lifetime of the system. Similar to an ASIC, reconfigurable systems provide a method to map circuits into hardware. Reconfigurable systems therefore have the potential to achieve far greater performance than software as a result of bypassing the fetch-decode-execute operations of traditional processors, and possibly exploiting a greater level of parallelism. Artificial intelligence is one such field, with many different algorithms that can be accelerated. This paper presents example hardware implementations of Artificial Neural Networks, Genetic Algorithms and Expert Systems.
Ieva, Francesca; Jackson, Christopher H; Sharples, Linda D
2017-06-01
In chronic diseases like heart failure (HF), the disease course and associated clinical event histories for the patient population vary widely. To improve understanding of the prognosis of patients and enable health care providers to assess and manage resources, we wish to jointly model disease progression, mortality and their relation with patient characteristics. We show how episodes of hospitalisation for disease-related events, obtained from administrative data, can be used as a surrogate for disease status. We propose flexible multi-state models for serial hospital admissions and death in HF patients, that are able to accommodate important features of disease progression, such as multiple ordered events and competing risks. Fully parametric and semi-parametric semi-Markov models are implemented using freely available software in R. The models were applied to a dataset from the administrative data bank of the Lombardia region in Northern Italy, which included 15,298 patients who had a first hospitalisation ending in 2006 and 4 years of follow-up thereafter. This provided estimates of the associations of age and gender with rates of hospital admission and length of stay in hospital, and estimates of the expected total time spent in hospital over five years. For example, older patients and men were readmitted more frequently, though the total time in hospital was roughly constant with age. We also discuss the relative merits of parametric and semi-parametric multi-state models, and model assessment and comparison.
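The multi-state idea can be illustrated with a minimal Monte Carlo sketch in Python rather than the R packages the authors use; the function name, the rates, and the pure-Markov (exponential sojourn) simplification are illustrative assumptions, not the paper's fitted model:

```python
import random

def simulate_patient(rate_adm, rate_dis, rate_die, horizon=5.0, rng=None):
    """Simulate one patient's path home -> hospital -> home ..., with death
    as a competing risk from either state (exponential sojourn times).
    Returns total years spent in hospital within the follow-up horizon.
    Illustrative Markov version only; the paper also fits semi-Markov and
    semi-parametric variants."""
    rng = rng or random.Random()
    t, state, in_hosp = 0.0, "home", 0.0
    while t < horizon and state != "dead":
        rate_move = rate_adm if state == "home" else rate_dis
        dt_move = rng.expovariate(rate_move)   # admission or discharge
        dt_die = rng.expovariate(rate_die)     # competing risk of death
        step = min(dt_move, dt_die)
        dt = min(step, horizon - t)
        if state == "hospital":
            in_hosp += dt
        t += dt
        if dt < step:                          # censored at the horizon
            break
        if dt_die <= dt_move:
            state = "dead"
        else:
            state = "hospital" if state == "home" else "home"
    return in_hosp

# Expected total time in hospital over five years, by Monte Carlo
# (rates are made up: ~1 admission/year, ~1-month stays, 10%/year mortality):
rng = random.Random(42)
sims = [simulate_patient(rate_adm=1.0, rate_dis=12.0, rate_die=0.1, rng=rng)
        for _ in range(2000)]
print(sum(sims) / len(sims))
```

Averaging over simulated patients reproduces the kind of quantity the paper estimates from fitted models, such as expected total time spent in hospital over five years.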
Synthesis and Analysis of Custom Bi-directional Reflectivity Distribution Functions in DIRSIG
NASA Astrophysics Data System (ADS)
Dank, J.; Allen, D.
2016-09-01
The bi-directional reflectivity distribution function (BRDF) is a fundamental optical property of materials, characterizing important properties of light scattered by a surface. For accurate radiance calculations using synthetic targets and numerical simulations such as the Digital Imaging and Remote Sensing Image Generation (DIRSIG) model, fidelity of the target BRDFs is critical. While fits to measured BRDF data can be used in DIRSIG, obtaining high-quality data over a large spectral continuum can be time-consuming and expensive, requiring significant investment in illumination sources, sensors, and other specialized hardware. As a consequence, numerous parametric BRDF models are available to approximate actual behavior, but these all have shortcomings. Further, DIRSIG does not allow direct visualization of BRDFs, making it difficult for the user to understand the numerical impact of various models. Here, we discuss the innovative use of "mixture maps" to synthesize custom BRDFs as linear combinations of parametric models and measured data. We also show how DIRSIG's interactive mode can be used to visualize and analyze both available parametric models currently used in DIRSIG and custom BRDFs developed using our methods.
NASA Astrophysics Data System (ADS)
Lewis, Debra
2013-05-01
Relative equilibria of Lagrangian and Hamiltonian systems with symmetry are critical points of appropriate scalar functions parametrized by the Lie algebra (or its dual) of the symmetry group. Setting aside the structures - symplectic, Poisson, or variational - generating dynamical systems from such functions highlights the common features of their construction and analysis, and supports the construction of analogous functions in non-Hamiltonian settings. If the symmetry group is nonabelian, the functions are invariant only with respect to the isotropy subgroup of the given parameter value. Replacing the parametrized family of functions with a single function on the product manifold and extending the action using the (co)adjoint action on the algebra or its dual yields a fully invariant function. An invariant map can be used to reverse the usual perspective: rather than selecting a parametrized family of functions and finding their critical points, conditions under which functions will be critical on specific orbits, typically distinguished by isotropy class, can be derived. This strategy is illustrated using several well-known mechanical systems - the Lagrange top, the double spherical pendulum, the free rigid body, and the Riemann ellipsoids - and generalizations of these systems.
Parametric nonfeedback resonance in period doubling systems
NASA Astrophysics Data System (ADS)
Pisarchik, A. N.; Corbalán, R.
1999-02-01
Slow periodic modulation of a control parameter in a period-doubling system leads to an interaction between stable and unstable periodic orbits. This causes a resonance in the system response at the modulation frequency. The conditions for this resonance are studied through numerical simulations of a quadratic map and of laser equations. The results are confirmed by experiments in a CO2 laser with modulated losses.
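The setup can be illustrated with a toy quadratic (logistic) map whose control parameter is slowly modulated, then probing the response at the modulation frequency; the parameter values, modulation depth, and helper names below are illustrative choices, not those of the paper:

```python
import numpy as np

def modulated_logistic(r0, amp, f_mod, n_steps=4096, x0=0.3):
    """Iterate x -> r(t) * x * (1 - x) with slow sinusoidal modulation of r."""
    x = np.empty(n_steps)
    x[0] = x0
    for t in range(1, n_steps):
        r = r0 + amp * np.sin(2 * np.pi * f_mod * t)
        x[t] = r * x[t - 1] * (1.0 - x[t - 1])
    return x

def response_at(f_mod, x):
    """Amplitude of the system response in the FFT bin at the modulation frequency."""
    spec = np.abs(np.fft.rfft(x - x.mean())) / len(x)
    return spec[int(round(f_mod * len(x)))]

# r0 = 3.55 sits in a period-doubled regime; modulate it slowly and weakly.
x = modulated_logistic(r0=3.55, amp=0.02, f_mod=1 / 512)
print(response_at(1 / 512, x))
```

Sweeping `f_mod` and plotting `response_at` against it would expose the resonance peak the abstract describes.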
Non-parametric analysis of LANDSAT maps using neural nets and parallel computers
NASA Technical Reports Server (NTRS)
Salu, Yehuda; Tilton, James
1991-01-01
Nearest neighbor approaches and a new neural network, the Binary Diamond, are used for the classification of images of ground pixels obtained by LANDSAT satellite. The performances are evaluated by comparing classifications of a scene in the vicinity of Washington DC. The problem of optimal selection of categories is addressed as a step in the classification process.
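A minimal sketch of the nearest-neighbour step on toy two-band pixel "spectra" (the Binary Diamond network itself is not reproduced here, and the band values and labels are invented for illustration):

```python
import numpy as np

def nearest_neighbor_classify(train_X, train_y, pixels):
    """Assign each pixel the label of its closest training sample (Euclidean)."""
    # Pairwise distances, shape (n_pixels, n_train)
    d = np.linalg.norm(pixels[:, None, :] - train_X[None, :, :], axis=2)
    return train_y[np.argmin(d, axis=1)]

# Toy two-band reflectances: water is dark in both bands,
# vegetation is bright in band 2, urban is bright in band 1.
train_X = np.array([[0.1, 0.1], [0.2, 0.8], [0.7, 0.6]])
train_y = np.array(["water", "vegetation", "urban"])
pixels = np.array([[0.15, 0.12], [0.25, 0.75]])
print(nearest_neighbor_classify(train_X, train_y, pixels))
# → ['water' 'vegetation']
```

The same call scales to full scenes by flattening the image into an (n_pixels, n_bands) array, which is essentially how per-pixel classification of LANDSAT data proceeds.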
Ronald E. McRoberts; Erkki O. Tomppo; Andrew O. Finley; Heikkinen Juha
2007-01-01
The k-Nearest Neighbor (k-NN) technique has become extremely popular for a variety of forest inventory mapping and estimation applications. Much of this popularity may be attributed to the non-parametric, multivariate features of the technique, its intuitiveness, and its ease of use. When used with satellite imagery and forest...
Stroet, Martin; Koziara, Katarzyna B; Malde, Alpeshkumar K; Mark, Alan E
2017-12-12
A general method for parametrizing atomic interaction functions is presented. The method is based on an analysis of surfaces corresponding to the difference between calculated and target data as a function of alternative combinations of parameters (parameter space mapping). The consideration of surfaces in parameter space as opposed to local values or gradients leads to a better understanding of the relationships between the parameters being optimized and a given set of target data. This in turn enables a range of target data from multiple molecules to be combined in a robust manner and the optimal region of parameter space to be trivially identified. The effectiveness of the approach is illustrated by using the method to refine the chlorine 6-12 Lennard-Jones parameters against experimental solvation free enthalpies in water and hexane as well as the density and heat of vaporization of the liquid at atmospheric pressure for a set of 10 aromatic-chloro compounds simultaneously. Single-step perturbation is used to efficiently calculate solvation free enthalpies for a wide range of parameter combinations. The capacity of this approach to parametrize accurate and transferable force fields is discussed.
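The parameter-space-mapping idea can be sketched as a brute-force difference surface over a two-parameter grid; the cheap analytic stand-in for the simulation, the property names, and the target values below are invented for illustration (a real application would run a force-field simulation at each grid point):

```python
import numpy as np

def difference_surface(calc, targets, eps_grid, sigma_grid):
    """Map the summed |calculated - target| over a 2D parameter grid."""
    surf = np.zeros((len(eps_grid), len(sigma_grid)))
    for i, eps in enumerate(eps_grid):
        for j, sig in enumerate(sigma_grid):
            surf[i, j] = sum(abs(calc(eps, sig, p) - t)
                             for p, t in targets.items())
    return surf

def calc(eps, sig, prop):
    """Hypothetical stand-in for an expensive simulation: each 'property'
    is a simple analytic function of two Lennard-Jones-like parameters."""
    return {"density": 10.0 * eps * sig, "dHvap": 5.0 * eps + sig}[prop]

targets = {"density": 3.0, "dHvap": 2.6}       # made-up target data
eps_grid = np.linspace(0.1, 1.0, 10)
sigma_grid = np.linspace(0.5, 3.0, 26)
surf = difference_surface(calc, targets, eps_grid, sigma_grid)
i, j = np.unravel_index(surf.argmin(), surf.shape)
print(eps_grid[i], sigma_grid[j])              # best grid point
```

Inspecting the whole surface, rather than just its minimum, is what reveals how flat or degenerate the optimal region is, which is the point the abstract makes about surfaces versus local gradients.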
NASA Astrophysics Data System (ADS)
Rezaei Kh., S.; Bailer-Jones, C. A. L.; Hanson, R. J.; Fouesneau, M.
2017-02-01
We present a non-parametric model for inferring the three-dimensional (3D) distribution of dust density in the Milky Way. Our approach uses the extinction measured towards stars at different locations in the Galaxy at approximately known distances. Each extinction measurement is proportional to the integrated dust density along its line of sight (LoS). Making simple assumptions about the spatial correlation of the dust density, we can infer the most probable 3D distribution of dust across the entire observed region, including along sight lines which were not observed. This is possible because our model employs a Gaussian process to connect all LoS. We demonstrate the capability of our model to capture detailed dust density variations using mock data and simulated data from the Gaia Universe Model Snapshot. We then apply our method to a sample of giant stars observed by APOGEE and Kepler to construct a 3D dust map over a small region of the Galaxy. Owing to our smoothness constraint and its isotropy, we provide one of the first maps which does not show the "fingers of God" effect.
ERIC Educational Resources Information Center
Zheng, Yongjie
2012-01-01
Software architecture plays an increasingly important role in complex software development. Its further application, however, is challenged by the fact that software architecture, over time, is often found not conformant to its implementation. This is usually caused by frequent development changes made to both artifacts. Against this background,…
Eisen, Lars; Lozano-Fuentes, Saul
2009-01-01
The aims of this review paper are to 1) provide an overview of how mapping and spatial and space-time modeling approaches have been used to date to visualize and analyze mosquito vector and epidemiologic data for dengue; and 2) discuss the potential for these approaches to be included as routine activities in operational vector and dengue control programs. Geographical information system (GIS) software is becoming more user-friendly and is now complemented by free mapping software that provides access to satellite imagery and basic feature-making tools and has the capacity to generate static maps as well as dynamic time-series maps. Our challenge is now to move beyond the research arena by transferring mapping and GIS technologies and spatial statistical analysis techniques in user-friendly packages to operational vector and dengue control programs. This will enable control programs to, for example, generate risk maps for exposure to dengue virus, develop Priority Area Classifications for vector control, and explore socioeconomic associations with dengue risk. PMID:19399163
ShakeMap manual: technical manual, user's guide, and software guide
Wald, David J.; Worden, Bruce C.; Quitoriano, Vincent; Pankow, Kris L.
2005-01-01
ShakeMap (http://earthquake.usgs.gov/shakemap) --rapidly, automatically generated shaking and intensity maps--combines instrumental measurements of shaking with information about local geology and earthquake location and magnitude to estimate shaking variations throughout a geographic area. The results are rapidly available via the Web through a variety of map formats, including Geographic Information System (GIS) coverages. These maps have become a valuable tool for emergency response, public information, loss estimation, earthquake planning, and post-earthquake engineering and scientific analyses. With the adoption of ShakeMap as a standard tool for a wide array of users and uses came an impressive demand for up-to-date technical documentation and more general guidelines for users and software developers. This manual is meant to address this need. ShakeMap, and associated Web and data products, are rapidly evolving as new advances in communications, earthquake science, and user needs drive improvements. As such, this documentation is organic in nature. We will make every effort to keep it current, but undoubtedly necessary changes in operational systems take precedence over producing and making documentation publishable.
An interactive dynamic analysis and decision support software for MR mammography.
Ertaş, Gökhan; Gülçür, H Ozcan; Tunaci, Mehtap
2008-06-01
Fully automated software is introduced to facilitate MR mammography (MRM) examinations and overcome subjectiveness in diagnosis using normalized maximum intensity-time ratio (nMITR) maps. These maps inherently suppress enhancements due to normal parenchyma and blood vessels that surround lesions and have natural tolerance to small field inhomogeneities and motion artifacts. The classifier embedded within the software is trained with the normalized complexity and maximum nMITR of 22 lesions and tested with the features of the remaining 22 lesions. The achieved diagnostic performance is 92% sensitivity, 90% specificity, 91% accuracy, 92% positive predictive value and 90% negative predictive value. DynaMammoAnalyst shortens evaluation time considerably and reduces inter- and intra-observer variability by providing decision support.
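A voxel-wise nMITR map can be sketched as below; the abstract does not give the exact formula used by DynaMammoAnalyst, so the normalization chosen here (peak enhancement rate relative to the pre-contrast baseline) is an assumption:

```python
import numpy as np

def nmitr_map(series, dt=1.0):
    """Voxel-wise normalized maximum intensity-time ratio for a dynamic series.

    series: array (T, H, W); frame 0 is the pre-contrast baseline.
    Assumed formulation: for each voxel, the maximum over post-contrast
    frames of (I_t - I_0) / (I_0 * t * dt), i.e. the steepest normalized
    enhancement rate. The paper's exact definition may differ."""
    base = series[0].astype(float)
    base = np.where(base > 0, base, np.inf)        # guard against /0
    t = np.arange(1, series.shape[0])[:, None, None] * dt
    rates = (series[1:] - series[0]) / (base * t)
    return rates.max(axis=0)

# Two voxels: one enhances rapidly (lesion-like), one slowly (parenchyma-like).
series = np.array([[[100, 100]],
                   [[200, 110]],
                   [[210, 120]],
                   [[215, 130]]], dtype=float)
print(nmitr_map(series))   # the fast voxel scores much higher
```

Dividing by elapsed time is what penalizes the slow, steady enhancement of normal parenchyma, matching the abstract's claim that such maps inherently suppress background tissue.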
NASA Astrophysics Data System (ADS)
Vrabec, Marko; Dolžan, Erazem
2016-04-01
The undergraduate field course in Geological Mapping at the University of Ljubljana involves 20-40 students per year, which precludes the use of specialized rugged digital field equipment as the costs would be far beyond the capabilities of the Department. A different mapping area is selected each year with the aim to provide typical conditions that a professional geologist might encounter when doing fieldwork in Slovenia, which includes rugged relief, dense tree cover, and moderately-well- to poorly-exposed bedrock due to vegetation and urbanization. It is therefore mandatory that the digital tools and workflows are combined with classical methods of fieldwork, since, for example, full-time precise GNSS positioning is not viable under such circumstances. Additionally, due to the prevailing combination of complex geological structure with generally poor exposure, students cannot be expected to produce line (vector) maps of geological contacts on the go, so there is no need for such functionality in the hardware and software that we use in the field. Our workflow therefore still relies on paper base maps, but is strongly complemented with digital tools to provide robust positioning, track recording, and acquisition of various point-based data. Primary field hardware are students' Android-based smartphones and optionally tablets. For our purposes, the built-in GNSS chips provide adequate positioning precision most of the time, particularly if they are GLONASS-capable. We use Oruxmaps, a powerful free offline map viewer for the Android platform, which facilitates the use of custom-made geopositioned maps. For digital base maps, which we prepare in free Windows QGIS software, we use scanned topographic maps provided by the National Geodetic Authority, but also other maps such as aerial imagery, processed Digital Elevation Models, scans of existing geological maps, etc.
Point data, like important outcrop locations or structural measurements, are entered into Oruxmaps as waypoints. Students are also encouraged to directly measure structural data with specialized Android apps such as the MVE FieldMove Clino. Digital field data is exported from Oruxmaps to Windows computers primarily in the ubiquitous GPX data format and then integrated in the QGIS environment. Recorded GPX tracks are also used with the free Geosetter Windows software to geoposition and tag any digital photographs taken in the field. With minimal expenses, our workflow provides the students with basic familiarity and experience in using digital field tools and methods. The workflow is also practical enough for the prevailing field conditions of Slovenia that the faculty staff is using it in geological mapping for scientific research and consultancy work.
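The GPX import step of a workflow like this can be sketched with Python's standard library; the sample document and waypoint fields are illustrative, not verbatim Oruxmaps output:

```python
import xml.etree.ElementTree as ET

GPX_NS = {"gpx": "http://www.topografix.com/GPX/1/1"}

def read_waypoints(gpx_text):
    """Extract (name, lat, lon) tuples from a GPX 1.1 document."""
    root = ET.fromstring(gpx_text)
    points = []
    for wpt in root.findall("gpx:wpt", GPX_NS):
        name = wpt.findtext("gpx:name", default="", namespaces=GPX_NS)
        points.append((name, float(wpt.get("lat")), float(wpt.get("lon"))))
    return points

sample = """<gpx xmlns="http://www.topografix.com/GPX/1/1" version="1.1" creator="demo">
  <wpt lat="46.05" lon="14.51"><name>outcrop-1</name></wpt>
</gpx>"""
print(read_waypoints(sample))
# → [('outcrop-1', 46.05, 14.51)]
```

Once parsed, such tuples map directly onto point layers in QGIS, which is essentially what the GPX-to-QGIS integration step in the described workflow accomplishes.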
Prehn, Alexander; Glöckner, Rosa; Rempe, Gerhard; Zeppenfeld, Martin
2017-03-01
Optical frequency combs (OFCs) provide a convenient reference for the frequency stabilization of continuous-wave lasers. We demonstrate a frequency control method relying on tracking over a wide range and stabilizing the beat note between the laser and the OFC. The approach combines fast frequency ramps on a millisecond timescale in the entire mode-hop free tuning range of the laser and precise stabilization to single frequencies. We apply it to a commercially available optical parametric oscillator (OPO) and demonstrate tuning over more than 60 GHz with a ramping speed up to 3 GHz/ms. Frequency ramps spanning 15 GHz are performed in less than 10 ms, with the OPO instantly relocked to the OFC after the ramp at any desired frequency. The developed control hardware and software are able to stabilize the OPO to sub-MHz precision and to perform sequences of fast frequency ramps automatically.
Development of Parametric Mass and Volume Models for an Aerospace SOFC/Gas Turbine Hybrid System
NASA Technical Reports Server (NTRS)
Tornabene, Robert; Wang, Xiao-yen; Steffen, Christopher J., Jr.; Freeh, Joshua E.
2005-01-01
In aerospace power systems, mass and volume are key considerations to produce a viable design. The utilization of fuel cells is being studied for a commercial aircraft electrical power unit. Based on preliminary analyses, a SOFC/gas turbine system may be a potential solution. This paper describes the parametric mass and volume models that are used to assess an aerospace hybrid system design. The design tool utilizes input from the thermodynamic system model and produces component sizing, performance, and mass estimates. The software is designed such that the thermodynamic model is linked to the mass and volume model to provide immediate feedback during the design process. It allows for automating an optimization process that accounts for mass and volume in its figure of merit. Each component in the system is modeled with a combination of theoretical and empirical approaches. A description of the assumptions and design analyses is presented.
Using virtual reality for science mission planning: A Mars Pathfinder case
NASA Technical Reports Server (NTRS)
Kim, Jacqueline H.; Weidner, Richard J.; Sacks, Allan L.
1994-01-01
NASA's Mars Pathfinder Project requires a Ground Data System (GDS) that supports both engineering and scientific payloads with reduced mission operations staffing and short planning schedules. Also, successful surface operation of the lander camera requires efficient mission planning and accurate pointing of the camera. To meet these challenges, a new software strategy was developed that integrates virtual reality technology with existing navigational ancillary information and image processing capabilities. The result is an interactive, workstation-based application software that provides a high-resolution, 3-dimensional, stereo display of Mars as if it were viewed through the lander camera. The design, implementation strategy, and parametric specification phases for the development of this software were completed, and the prototype tested. When completed, the software will allow scientists and mission planners to access simulated and actual scenes of Mars' surface. The perspective from the lander camera will enable scientists to plan activities more accurately and completely. The application will also support the sequence and command generation process and will allow testing and verification of camera pointing commands via simulation.
Automation Hooks Architecture Trade Study for Flexible Test Orchestration
NASA Technical Reports Server (NTRS)
Lansdowne, Chatwin A.; Maclean, John R.; Graffagnino, Frank J.; McCartney, Patrick A.
2010-01-01
We describe the conclusions of a technology and communities survey supported by concurrent and follow-on proof-of-concept prototyping to evaluate feasibility of defining a durable, versatile, reliable, visible software interface to support strategic modularization of test software development. The objective is that test sets and support software with diverse origins, ages, and abilities can be reliably integrated into test configurations that assemble and tear down and reassemble with scalable complexity in order to conduct both parametric tests and monitored trial runs. The resulting approach is based on integration of three recognized technologies that are currently gaining acceptance within the test industry and when combined provide a simple, open and scalable test orchestration architecture that addresses the objectives of the Automation Hooks task. The technologies are automated discovery using multicast DNS Zero Configuration Networking (zeroconf), commanding and data retrieval using resource-oriented RESTful Web Services, and XML data transfer formats based on Automatic Test Markup Language (ATML). This open-source standards-based approach provides direct integration with existing commercial off-the-shelf (COTS) analysis software tools.
Software project management tools in global software development: a systematic mapping study.
Chadli, Saad Yasser; Idri, Ali; Ros, Joaquín Nicolás; Fernández-Alemán, José Luis; de Gea, Juan M Carrillo; Toval, Ambrosio
2016-01-01
Global software development (GSD), which is a growing trend in the software industry, is characterized by a highly distributed environment. Performing software project management (SPM) in such conditions implies the need to overcome new limitations resulting from cultural, temporal and geographic separation. The aim of this research is to discover and classify the various tools mentioned in the literature that provide GSD project managers with support and to identify in what way they support group interaction. A systematic mapping study has been performed by means of automatic searches in five sources. We have then synthesized the data extracted and presented the results of this study. A total of 102 tools were identified as being used in SPM activities in GSD. We have classified these tools according to the software life cycle process on which they focus and how they support the 3C collaboration model (communication, coordination and cooperation). The majority of the tools found are standalone tools (77%). A small number of platforms (8%) also offer a set of interacting tools that cover the software development lifecycle. Results also indicate that SPM areas in GSD are not adequately supported by corresponding tools and deserve more attention from tool builders.
Postprocessing classification images
NASA Technical Reports Server (NTRS)
Kan, E. P.
1979-01-01
Program cleans up remote-sensing maps. It can be used with existing image-processing software. Remapped images closely resemble familiar resource information maps and can replace or supplement classification images not postprocessed by this program.
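A common post-classification cleanup of this kind is a majority (modal) filter; the abstract does not specify the program's algorithm, so the sketch below is an assumed stand-in:

```python
import numpy as np
from collections import Counter

def majority_filter(label_map, size=3):
    """Smooth a classification image by majority vote in a size x size window,
    a standard cleanup that removes isolated misclassified pixels."""
    h, w = label_map.shape
    r = size // 2
    out = label_map.copy()
    for i in range(h):
        for j in range(w):
            window = label_map[max(0, i - r):i + r + 1,
                               max(0, j - r):j + r + 1]
            out[i, j] = Counter(window.ravel().tolist()).most_common(1)[0][0]
    return out

noisy = np.array([[1, 1, 1],
                  [1, 2, 1],
                  [1, 1, 1]])
print(majority_filter(noisy))
# the isolated centre "speckle" is re-labelled 1
```

Applied to a full classification image, this yields the smoother, resource-map-like appearance the abstract describes.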
Representations of spacetime diffeomorphisms. I. Canonical parametrized field theories
DOE Office of Scientific and Technical Information (OSTI.GOV)
Isham, C.J.; Kuchar, K.V.
The super-Hamiltonian and supermomentum in canonical geometrodynamics, or in a parametrized field theory on a given Riemannian background, have Poisson brackets which obey the Dirac relations. By smearing the supermomentum with vector fields V ∈ LDiff Σ on the space manifold Σ, the Lie algebra LDiff Σ of the spatial diffeomorphism group Diff Σ can be mapped antihomomorphically into the Poisson bracket algebra on the phase space of the system. The explicit dependence of the Poisson brackets between two super-Hamiltonians on canonical coordinates (spatial metrics in geometrodynamics and embedding variables in parametrized theories) is usually regarded as an indication that the Dirac relations cannot be connected with a representation of the complete Lie algebra LDiff M of spacetime diffeomorphisms.
Pechukas-Yukawa approach to the evolution of the quantum state of a parametrically perturbed system
NASA Astrophysics Data System (ADS)
Qureshi, Mumnuna A.; Zhong, Johnny; Qureshi, Zihad; Mason, Peter; Betouras, Joseph J.; Zagoskin, Alexandre M.
2018-03-01
We consider the evolution of the quantum states of a Hamiltonian that is parametrically perturbed via a term proportional to the adiabatic parameter λ(t). Starting with the Pechukas-Yukawa mapping of the energy eigenvalue evolution in a generalized Calogero-Sutherland model of a one-dimensional classical gas, we consider the adiabatic approximation with two different expansions of the quantum state in powers of dλ/dt and compare them with a direct numerical simulation. We show that one of these expansions (Magnus series) is especially convenient for the description of nonadiabatic evolution of the system. Applying the expansion to the exact cover 3-satisfiability problem, we obtain the occupation dynamics, which provides insight into the population of states and sources of decoherence in a quantum system.
Pinching parameters for open (super) strings
NASA Astrophysics Data System (ADS)
Playle, Sam; Sciuto, Stefano
2018-02-01
We present an approach to the parametrization of (super) Schottky space obtained by sewing together three-punctured discs with strips. Different cubic ribbon graphs classify distinct sets of pinching parameters; we show how they are mapped onto each other. The parametrization is particularly well-suited to describing the region within (super) moduli space where open bosonic or Neveu-Schwarz string propagators become very long and thin, which dominates the IR behaviour of string theories. We show how worldsheet objects such as the Green's function converge to graph theoretic objects such as the Symanzik polynomials in the α′ → 0 limit, allowing us to see how string theory reproduces the sum over Feynman graphs. The (super) string measure takes on a simple and elegant form when expressed in terms of these parameters.
Economic policy optimization based on both one stochastic model and the parametric control theory
NASA Astrophysics Data System (ADS)
Ashimov, Abdykappar; Borovskiy, Yuriy; Onalbekov, Mukhit
2016-06-01
A nonlinear dynamic stochastic general equilibrium model with financial frictions is developed to describe two interacting national economies in the environment of the rest of the world. The parameters of the nonlinear model are estimated from its log-linearization by the Bayesian approach. The nonlinear model is verified by retroprognosis, by estimating stability indicators of the mappings specified by the model, and by estimating the degree of coincidence between the effects of internal and external shocks on macroeconomic indicators computed with the estimated nonlinear model and with its log-linearization. On the basis of the nonlinear model, parametric control problems for economic growth and for the volatility of macroeconomic indicators of Kazakhstan are formulated and solved for two exchange rate regimes (free floating and managed floating exchange rates).
Panagiotopoulou, Olga; Pataky, Todd C; Hill, Zoe; Hutchinson, John R
2012-05-01
Foot pressure distributions during locomotion have causal links with the anatomical and structural configurations of the foot tissues and the mechanics of locomotion. Elephant feet have five toes bound in a flexible pad of fibrous tissue (digital cushion). Does this specialized foot design control peak foot pressures in such giant animals? And how does body size, such as during ontogenetic growth, influence foot pressures? We addressed these questions by studying foot pressure distributions in elephant feet and their correlation with body mass and centre of pressure trajectories, using statistical parametric mapping (SPM), a neuro-imaging technology. Our results show a positive correlation between body mass and peak pressures, with the highest pressures dominated by the distal ends of the lateral toes (digits 3, 4 and 5). We also demonstrate that pressure reduction in the elephant digital cushion is a complex interaction of its viscoelastic tissue structure and its centre of pressure trajectories, because there is a tendency to avoid rear 'heel' contact as an elephant grows. Using SPM, we present a complete map of pressure distributions in elephant feet during ontogeny by performing statistical analysis at the pixel level across the entire plantar/palmar surface. We hope that our study will build confidence in the potential clinical and scaling applications of mammalian foot pressures, given our findings in support of a link between regional peak pressures and pathogenesis in elephant feet.
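Pixel-level statistical parametric mapping of this kind amounts to running a statistical test at every pixel of the registered pressure images. The NumPy sketch below is only a minimal illustration of that idea, not the SPM software used in the study; the array names are hypothetical. It computes the Pearson correlation between body mass and pressure at each pixel across trials.

```python
import numpy as np

def pixelwise_correlation(pressure_maps, body_mass):
    """Pearson correlation of pressure with body mass at every pixel.

    pressure_maps : (trials, rows, cols) registered foot-pressure images
    body_mass     : (trials,) covariate

    A minimal illustration of pixel-level statistical parametric mapping;
    a full SPM analysis would also correct for multiple comparisons.
    """
    p = pressure_maps - pressure_maps.mean(axis=0)   # center each pixel series
    m = body_mass - body_mass.mean()                 # center the covariate
    cov = np.tensordot(m, p, axes=1)                 # per-pixel covariance, (rows, cols)
    return cov / (np.sqrt((p**2).sum(axis=0)) * np.sqrt((m**2).sum()))
```

The resulting map can then be thresholded to locate regions where pressure scales with body mass.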
Novel cardiac magnetic resonance biomarkers: native T1 and extracellular volume myocardial mapping.
Cannaò, Paola Maria; Altabella, Luisa; Petrini, Marcello; Alì, Marco; Secchi, Francesco; Sardanelli, Francesco
2016-04-28
Cardiac magnetic resonance (CMR) is a non-invasive diagnostic tool playing a key role in the assessment of cardiac morphology and function as well as in tissue characterization. Late gadolinium enhancement is a fundamental CMR technique for detecting focal or regional abnormalities such as scar tissue, replacement fibrosis, or inflammation using qualitative, semi-quantitative, or quantitative methods, but not allowing for evaluating the whole myocardium in the presence of diffuse disease. The novel T1 mapping approach permits a quantitative assessment of the entire myocardium providing a voxel-by-voxel map of native T1 relaxation time, obtained before the intravenous administration of gadolinium-based contrast material. Combining T1 data obtained before and after contrast injection, it is also possible to calculate the voxel-by-voxel extracellular volume (ECV), resulting in another myocardial parametric map. This article describes technical challenges and clinical perspectives of these two novel CMR biomarkers: myocardial native T1 and ECV mapping.
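The ECV computation described above follows the standard relation ECV = (1 − Hct) · ΔR1_myocardium / ΔR1_blood, where R1 = 1/T1 and ΔR1 is the post- minus pre-contrast change. A minimal sketch (array names are illustrative; T1 in any consistent unit):

```python
def ecv_map(t1_myo_pre, t1_myo_post, t1_blood_pre, t1_blood_post, hematocrit):
    """Voxel-wise extracellular volume (ECV) from pre/post-contrast T1 maps.

    Uses the standard relation ECV = (1 - Hct) * dR1_myo / dR1_blood,
    with R1 = 1/T1. Inputs may be scalars or NumPy arrays of T1 values.
    """
    dr1_myo = 1.0 / t1_myo_post - 1.0 / t1_myo_pre
    dr1_blood = 1.0 / t1_blood_post - 1.0 / t1_blood_pre
    return (1.0 - hematocrit) * dr1_myo / dr1_blood
```

Applied to co-registered pre- and post-contrast T1 maps, this yields the voxel-by-voxel parametric ECV map described in the article.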
ERIC Educational Resources Information Center
Lee, Woon Jee
2012-01-01
The purpose of this study was to explore the nature of students' mapping and discourse behaviors while constructing causal maps to articulate their understanding of a complex, ill-structured problem. In this study, six graduate-level students were assigned to one of three pair groups, and each pair used the causal mapping software program,…
Ground-Based GPS Sensing of Azimuthal Variations in Precipitable Water Vapor
NASA Technical Reports Server (NTRS)
Kroger, P. M.; Bar-Sever, Y. E.
1997-01-01
Current models for troposphere delay employed by GPS software packages map the total zenith delay to the line-of-sight delay of the individual satellite-receiver link under the assumption of azimuthal homogeneity. This could be a poor approximation for many sites, in particular, those located at an ocean front or next to a mountain range. We have modified the GIPSY-OASIS II software package to include a simple non-symmetric mapping function (MacMillan, 1995) which introduces two new parameters.
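The asymmetric delay model can be sketched as follows. The 1/sin(e) symmetric mapping function below is a simplification, and the cot(e)-weighted gradient term is only an approximation of the MacMillan (1995) form; the north and east gradients are the "two new parameters" the abstract refers to.

```python
import numpy as np

def slant_delay(zenith_delay, elev, azim, grad_n=0.0, grad_e=0.0):
    """Map a zenith troposphere delay to the line of sight.

    Uses a simple 1/sin(e) symmetric mapping function plus a
    MacMillan-style azimuthal gradient term
        m(e) * cot(e) * (G_N cos A + G_E sin A),
    where A is the azimuth. Angles in radians; delays and gradients
    in consistent length units. Illustrative only.
    """
    m = 1.0 / np.sin(elev)                                   # symmetric mapping function
    gradient = m / np.tan(elev) * (grad_n * np.cos(azim) + grad_e * np.sin(azim))
    return m * zenith_delay + gradient
```

With both gradients zero this reduces to the azimuthally homogeneous model the abstract says is assumed by standard GPS software.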
Mapping Ad Hoc Communications Network of a Large Number Fixed-Wing UAV Swarm
2017-03-01
partitioned sub-swarms. The work covered in this thesis is to build a model of the NPS swarm's communication network in ns-3 simulation software and use… NAVAL POSTGRADUATE SCHOOL, MONTEREY, CALIFORNIA. Thesis: Mapping Ad Hoc Communications Network of a Large Number Fixed-Wing UAV Swarm, by Alexis…
An open-source java platform for automated reaction mapping.
Crabtree, John D; Mehta, Dinesh P; Kouri, Tina M
2010-09-27
This article presents software applications that have been built upon a modular, open-source, reaction mapping library that can be used in both cheminformatics and bioinformatics research. We first describe the theoretical underpinnings and modular architecture of the core software library. We then describe two applications that have been built upon that core. The first is a generic reaction viewer and mapper, and the second classifies reactions according to rules that can be modified by end users with little or no programming skills.
Video Altimeter and Obstruction Detector for an Aircraft
NASA Technical Reports Server (NTRS)
Delgado, Frank J.; Abernathy, Michael F.; White, Janis; Dolson, William R.
2013-01-01
Video-based altimetric and obstruction detection systems for aircraft have been partially developed. The hardware of a system of this type includes a downward-looking video camera, a video digitizer, a Global Positioning System receiver or other means of measuring the aircraft velocity relative to the ground, a gyroscope-based or other attitude-determination subsystem, and a computer running altimetric and/or obstruction-detection software. From the digitized video data, the altimetric software computes the pixel velocity in an appropriate part of the video image and the corresponding angular relative motion of the ground within the field of view of the camera. Then by use of trigonometric relationships among the aircraft velocity, the attitude of the camera, the angular relative motion, and the altitude, the software computes the altitude. The obstruction-detection software performs somewhat similar calculations as part of a larger task in which it uses the pixel velocity data from the entire video image to compute a depth map, which can be correlated with a terrain map, showing locations of potential obstructions. The depth map can be used as a real-time hazard display and/or to update an obstruction database.
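For the simplest case of a nadir-pointing camera over level terrain, the altimetric calculation reduces to dividing ground speed by the angular rate inferred from the pixel velocity. The sketch below is a hypothetical simplification of the trigonometric relationships mentioned above; the actual system handles arbitrary camera attitudes.

```python
def altitude_from_optical_flow(ground_speed, pixel_rate, pixel_pitch, focal_length):
    """Estimate altitude from the image motion of a nadir-pointing camera.

    For a camera looking straight down, ground features sweep across the
    focal plane at an angular rate
        omega = pixel_rate * pixel_pitch / focal_length   [rad/s],
    and altitude = ground_speed / omega. Units: m/s, px/s, m/px, m.
    Nadir-only geometry; an illustrative simplification.
    """
    omega = pixel_rate * pixel_pitch / focal_length
    return ground_speed / omega
```

The obstruction-detection depth map generalizes the same relation pixel by pixel: slower apparent motion implies greater range, faster motion implies a closer (and potentially hazardous) surface.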
1998-03-28
This image-based surface map of Pluto was assembled by computer image-processing software from four separate images of Pluto's disk taken with the European Space Agency's Faint Object Camera aboard NASA's Hubble Space Telescope.
Method for Pre-Conditioning a Measured Surface Height Map for Model Validation
NASA Technical Reports Server (NTRS)
Sidick, Erkin
2012-01-01
This software allows one to up-sample or down-sample a measured surface map for model validation, not only without introducing any re-sampling errors, but also eliminating the existing measurement noise and measurement errors. Because the re-sampling of a surface map is accomplished based on the analytical expressions of Zernike polynomials and a power spectral density model, such re-sampling does not introduce the aliasing and interpolation errors that are introduced by the conventional interpolation and FFT-based (fast-Fourier-transform-based) spatial-filtering methods. Also, this new method automatically eliminates the measurement noise and other measurement errors such as artificial discontinuity. The developmental cycle of an optical system, such as a space telescope, includes, but is not limited to, the following two steps: (1) deriving requirements or specs on the optical quality of individual optics before they are fabricated through optical modeling and simulations, and (2) validating the optical model using the measured surface height maps after all optics are fabricated. There are a number of computational issues related to model validation, one of which is the "pre-conditioning" or pre-processing of the measured surface maps before using them in a model validation software tool. This software addresses the following issues: (1) up- or down-sampling a measured surface map to match it with the gridded data format of a model validation tool, and (2) eliminating the surface measurement noise or measurement errors such that the resulting surface height map is continuous or smoothly varying. So far, the preferred method used for re-sampling a surface map is two-dimensional interpolation. The main problem of this method is that the same pixel can take different values when the method of interpolation is changed among the different methods such as the "nearest," "linear," "cubic," and "spline" fitting in Matlab.
The conventional, FFT-based spatial filtering method used to eliminate the surface measurement noise or measurement errors can also suffer from aliasing effects. During re-sampling of a surface map, this software preserves the low spatial-frequency characteristic of a given surface map through the use of Zernike-polynomial fit coefficients, and maintains mid- and high-spatial-frequency characteristics of the given surface map by the use of a PSD model derived from the two-dimensional PSD data of the mid- and high-spatial-frequency components of the original surface map. Because this new method creates the new surface map in the desired sampling format from analytical expressions only, it does not encounter any aliasing effects and does not cause any discontinuity in the resultant surface map.
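The low-spatial-frequency part of this re-sampling step can be sketched as a least-squares Zernike fit that is then evaluated analytically on the output grid, so no pixel interpolation ever occurs. The sketch below is not the described software: it uses only six unnormalized modes and omits the PSD-based reconstruction of mid- and high-spatial-frequency content.

```python
import numpy as np

def zernike_modes(rho, theta):
    """A small set of unnormalized Zernike polynomials on (rho, theta).

    Only piston, tilts, defocus and astigmatism are included; a real
    pre-conditioning tool would fit many more modes.
    """
    return np.stack([
        np.ones_like(rho),              # piston
        rho * np.cos(theta),            # tilt x
        rho * np.sin(theta),            # tilt y
        2 * rho**2 - 1,                 # defocus
        rho**2 * np.cos(2 * theta),     # astigmatism 0/90
        rho**2 * np.sin(2 * theta),     # astigmatism 45
    ], axis=-1)

def resample_map(surface, n_out):
    """Least-squares Zernike fit of a measured map, re-evaluated on a new grid."""
    n_in = surface.shape[0]
    y, x = np.mgrid[-1:1:n_in * 1j, -1:1:n_in * 1j]
    rho, theta = np.hypot(x, y), np.arctan2(y, x)
    mask = rho <= 1.0                                  # unit-disc aperture
    A = zernike_modes(rho[mask], theta[mask])
    coeffs, *_ = np.linalg.lstsq(A, surface[mask], rcond=None)
    # evaluate the analytic fit on the output grid (no interpolation, no aliasing)
    yo, xo = np.mgrid[-1:1:n_out * 1j, -1:1:n_out * 1j]
    ro, to = np.hypot(xo, yo), np.arctan2(yo, xo)
    out = np.full((n_out, n_out), np.nan)
    m = ro <= 1.0
    out[m] = zernike_modes(ro[m], to[m]) @ coeffs
    return coeffs, out
```

Because the output map comes from the fitted analytic expression rather than from neighboring pixels, every choice of output grid is consistent with the same underlying surface, which is the property the abstract emphasizes.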
Abdoun, Oussama; Joucla, Sébastien; Mazzocco, Claire; Yvert, Blaise
2010-01-01
A major characteristic of neural networks is the complexity of their organization at various spatial scales, from microscopic local circuits to macroscopic brain-scale areas. Understanding how neural information is processed thus entails the ability to study them at multiple scales simultaneously. This is made possible using microelectrodes array (MEA) technology. Indeed, high-density MEAs provide large-scale coverage (several square millimeters) of whole neural structures combined with microscopic resolution (about 50 μm) of unit activity. Yet, current options for spatiotemporal representation of MEA-collected data remain limited. Here we present NeuroMap, a new interactive Matlab-based software for spatiotemporal mapping of MEA data. NeuroMap uses thin plate spline interpolation, which provides several assets with respect to conventional mapping methods used currently. First, any MEA design can be considered, including 2D or 3D, regular or irregular, arrangements of electrodes. Second, spline interpolation allows the estimation of activity across the tissue with local extrema not necessarily at recording sites. Finally, this interpolation approach provides a straightforward analytical estimation of the spatial Laplacian for better current sources localization. In this software, coregistration of 2D MEA data on the anatomy of the neural tissue is made possible by fine matching of anatomical data with electrode positions using rigid-deformation-based correction of anatomical pictures. Overall, NeuroMap provides substantial material for detailed spatiotemporal analysis of MEA data. The package is distributed under GNU General Public License and available at http://sites.google.com/site/neuromapsoftware. PMID:21344013
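Thin plate spline interpolation of scattered electrode data can be sketched directly in NumPy (NeuroMap itself is Matlab-based; the names below are illustrative). The spline minimizes bending energy and passes exactly through the samples, which is why local extrema can fall between recording sites.

```python
import numpy as np

def tps_kernel(r):
    # thin plate spline radial basis U(r) = r^2 log(r), with U(0) = 0
    with np.errstate(divide="ignore", invalid="ignore"):
        u = r**2 * np.log(r)
    return np.where(r > 0, u, 0.0)

def tps_fit(points, values):
    """Fit a 2D thin plate spline through scattered (electrode) samples.

    Solves the standard TPS linear system [[K, P], [P.T, 0]] for the
    radial weights and the affine part [1, x, y].
    """
    n = len(points)
    r = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    K = tps_kernel(r)
    P = np.hstack([np.ones((n, 1)), points])
    A = np.block([[K, P], [P.T, np.zeros((3, 3))]])
    b = np.concatenate([values, np.zeros(3)])
    sol = np.linalg.solve(A, b)
    return sol[:n], sol[n:]                        # radial weights, affine coeffs

def tps_eval(points, weights, affine, query):
    """Evaluate the fitted spline at query positions (k, 2)."""
    r = np.linalg.norm(query[:, None, :] - points[None, :, :], axis=-1)
    return tps_kernel(r) @ weights + affine[0] + query @ affine[1:]
```

Because the interpolant is analytic, its spatial Laplacian can also be written in closed form, which is the property NeuroMap exploits for current source localization.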
The Impact of Graphic Organisers on Learning from Presentations
ERIC Educational Resources Information Center
Casteleyn, Jordi; Mottart, André; Valcke, Martin
2013-01-01
There is abundant educational research indicating that graphic organisers (knowledge maps, concept maps, or mind maps) have a beneficial impact on learning, but hardly any research has examined this in the context of presentations. This study therefore investigated how graphic organisers -- as delivered via presentation software -- affect learning…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bynum, Leo
FASTMap is a mapping application, available on the web or on mobile devices (iOS and Android), that browses geospatial data and produces detailed reports of objects within any area of analysis. FASTMap can access any geospatial dataset and can provide immediate access to the selected data through a fully symbolized interactive mapping interface. It can load arbitrary contours that represent a region of interest and can dynamically identify and geospatially select objects that reside within the region. The software can produce a report listing the objects and aggregations for the region, as well as publication-quality maps. FASTMap also has the ability to post and maintain authored maps, any GIS data included in the map, areas of interest, and any titles and labels. These defining ingredients of a map are called map contexts. Map contexts can be instantly broadcast via the internet through any of an unlimited number of named channels to small or large numbers of users monitoring the channels being posted to, so a user can author a map and immediately share it with others, whether they are on a traditional desktop computer, laptop, mobile tablet or smartphone. Further, users receiving broadcast maps can alter them, or create new ones, and publish back to the channel in a collaborative manner. FASTMap can be configured to access virtually any geospatial data.
ERIC Educational Resources Information Center
Ortega, Ryan A.; Brame, Cynthia J.
2015-01-01
Concept mapping was developed as a method of displaying and organizing hierarchical knowledge structures. Using the new, multidimensional presentation software Prezi, we have developed a new teaching technique designed to engage higher-level skills in the cognitive domain. This tool, synthesis mapping, is a natural evolution of concept mapping,…
A procedure for automated land use mapping using remotely sensed multispectral scanner data
NASA Technical Reports Server (NTRS)
Whitley, S. L.
1975-01-01
A system of processing remotely sensed multispectral scanner data by computer programs to produce color-coded land use maps for large areas is described. The procedure is explained, the software and the hardware are described, and an analogous example of the procedure is presented. Detailed descriptions of the multispectral scanners currently in use are provided together with a summary of the background of current land use mapping techniques. The data analysis system used in the procedure and the pattern recognition software used are functionally described. Current efforts by the NASA Earth Resources Laboratory to evaluate operationally a less complex and less costly system are discussed in a separate section.
Villanueva, Pia; Newbury, Dianne F; Jara, Lilian; De Barbieri, Zulema; Mirza, Ghazala; Palomino, Hernán M; Fernández, María Angélica; Cazier, Jean-Baptiste; Monaco, Anthony P; Palomino, Hernán
2011-01-01
Specific language impairment (SLI) is an unexpected deficit in the acquisition of language skills and affects between 5 and 8% of pre-school children. Despite its prevalence and high heritability, our understanding of the aetiology of this disorder is only emerging. In this paper, we apply genome-wide techniques to investigate an isolated Chilean population who exhibit an increased frequency of SLI. Loss of heterozygosity (LOH) mapping and parametric and non-parametric linkage analyses indicate that complex genetic factors are likely to underlie susceptibility to SLI in this population. Across all analyses performed, the most consistently implicated locus was on chromosome 7q. This locus achieved highly significant linkage under all three non-parametric models (max NPL = 6.73, P = 4.0 × 10⁻¹¹). In addition, it yielded a HLOD of 1.24 in the recessive parametric linkage analyses and contained a segment that was homozygous in two affected individuals. Further investigation of this region identified a two-SNP haplotype that occurs at an increased frequency in language-impaired individuals (P = 0.008). We hypothesise that the linkage regions identified here, in particular that on chromosome 7, may contain variants that underlie the high prevalence of SLI observed in this isolated population and may be of relevance to other populations affected by language impairments. PMID:21248734
kruX: matrix-based non-parametric eQTL discovery
2014-01-01
Background The Kruskal-Wallis test is a popular non-parametric statistical test for identifying expression quantitative trait loci (eQTLs) from genome-wide data due to its robustness against variations in the underlying genetic model and expression trait distribution, but testing billions of marker-trait combinations one-by-one can become computationally prohibitive. Results We developed kruX, an algorithm implemented in Matlab, Python and R that uses matrix multiplications to simultaneously calculate the Kruskal-Wallis test statistic for several millions of marker-trait combinations at once. KruX is more than ten thousand times faster than computing associations one-by-one on a typical human dataset. We used kruX and a dataset of more than 500k SNPs and 20k expression traits measured in 102 human blood samples to compare eQTLs detected by the Kruskal-Wallis test to eQTLs detected by the parametric ANOVA and linear model methods. We found that the Kruskal-Wallis test is more robust against data outliers and heterogeneous genotype group sizes and detects a higher proportion of non-linear associations, but is more conservative for calling additive linear associations. Conclusion kruX enables the use of robust non-parametric methods for massive eQTL mapping without the need for a high-performance computing infrastructure and is freely available from http://krux.googlecode.com. PMID:24423115
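The matrix trick behind kruX can be illustrated for a single marker: after ranking each expression trait, the per-genotype rank sums for all traits come from one matrix product with a genotype indicator matrix. The NumPy sketch below is a simplified illustration (it ignores tie correction, unlike kruX), computing the Kruskal-Wallis statistic for every trait at once.

```python
import numpy as np

def kruskal_wallis_matrix(expr, geno):
    """Kruskal-Wallis statistics for every expression trait against one marker.

    expr : (traits, samples) expression matrix
    geno : (samples,) integer genotype codes, e.g. 0/1/2

    Ranks come from a double argsort (assumes no ties; kruX applies a
    tie correction). Returns one H statistic per trait.
    """
    n_traits, n = expr.shape
    ranks = np.argsort(np.argsort(expr, axis=1), axis=1) + 1.0
    groups = np.unique(geno)
    G = (geno[None, :] == groups[:, None]).T.astype(float)   # (samples, groups)
    counts = G.sum(axis=0)                                   # samples per genotype
    rank_sums = ranks @ G                                    # (traits, groups) in one product
    return 12.0 / (n * (n + 1)) * ((rank_sums**2) / counts).sum(axis=1) - 3 * (n + 1)
```

Repeating the `ranks @ G` product for each marker (or batching markers into a third dimension) is what lets millions of marker-trait pairs be tested without an explicit loop over combinations.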
2012-01-01
Background Quantitative trait loci (QTL) detection on a huge number of phenotypes, like eQTL detection on transcriptomic data, can be dramatically impaired by the statistical properties of interval mapping methods. One major outcome is the high number of QTLs detected at marker locations. The present study aims at identifying and specifying the sources of this bias, in particular in the case of analysis of data from outbred populations. Analytical developments were carried out in a backcross situation in order to specify the bias and to propose an algorithm to control it. The outbred population context was studied through simulated data sets in a wide range of situations. The likelihood ratio test was first analyzed under the "one QTL" hypothesis in a backcross population. Designs of sib families were then simulated and analyzed using the QTL Map software. On the basis of the theoretical results in backcross, parameters such as the population size, the density of the genetic map, the QTL effect and the true location of the QTL were taken into account under the "no QTL" and the "one QTL" hypotheses. A combination of two non-parametric tests - the Kolmogorov-Smirnov test and the Mann-Whitney-Wilcoxon test - was used in order to identify the parameters that affected the bias and to specify how much they influenced the estimation of QTL location. Results A theoretical expression of the bias of the estimated QTL location was obtained for a backcross type population. We demonstrated a common source of bias under the "no QTL" and the "one QTL" hypotheses and qualified the possible influence of several parameters. Simulation studies confirmed that the bias exists in outbred populations under both the hypotheses of "no QTL" and "one QTL" on a linkage group. The QTL location was systematically closer to marker locations than expected, particularly in the case of low QTL effect, small population size or low density of markers, i.e. designs with low power.
Practical recommendations for experimental designs for QTL detection in outbred populations are given on the basis of this bias quantification. Furthermore, an original algorithm is proposed to adjust the location of a QTL, obtained with interval mapping, that co-locates with a marker. Conclusions Therefore, one should be attentive when a QTL is mapped at the location of a marker, especially under low power conditions. PMID:22520935
NASA Astrophysics Data System (ADS)
Ozkaya, Sait I.
2018-03-01
Fracture corridors are interconnected large fractures in a narrow, sub-vertical, tabular array, which usually traverse the entire reservoir vertically and extend for several hundred meters laterally. Fracture corridors, with their huge conductivities, constitute an important element of many fractured reservoirs. Unlike small diffuse fractures, actual fracture corridors must be mapped deterministically for simulation or field development purposes. Fracture corridors can be identified and quantified definitively with borehole image logs and well testing. However, there are rarely sufficient image logs or well tests, and it is necessary to utilize various fracture corridor indicators with varying degrees of reliability. Integration of data from many different sources, in turn, requires a platform with powerful editing and layering capability. Available commercial reservoir characterization software packages with layering and editing capabilities can be cost intensive. CAD packages are far more affordable and may easily acquire the versatility and power of commercial software packages with the addition of a small software toolbox. The objective of this communication is to present FRACOR, a software toolbox which enables deterministic 2D fracture corridor mapping and modeling on the AutoCAD platform. The FRACOR toolbox is written in AutoLISP and contains several independent routines to import and integrate available fracture corridor data from an oil field and export results as text files. The resulting fracture corridor maps consist mainly of fracture corridors with different confidence levels, derived from a combination of static and dynamic data, and of exclusion zones where no fracture corridor can exist. The exported text file of fracture corridors from FRACOR can be imported into an upscaling program to generate a fracture grid for dual-porosity simulation, or used for field development and well planning.
1997-09-01
Illinois Institute of Technology Research Institute (IITRI) calibrated seven parametric models including SPQR/20, the forerunner of CHECKPOINT. The… a semicolon); thus, SPQR/20 was calibrated using SLOC sizing data (IITRI, 1989: 3-4). The results showed only slight overall improvements in accuracy… even when validating the calibrated models with the same data sets. The IITRI study demonstrated SPQR/20 to be one of two models that were most
Multi-Mission Power Analysis Tool (MMPAT) Version 3
NASA Technical Reports Server (NTRS)
Wood, Eric G.; Chang, George W.; Chen, Fannie C.
2012-01-01
The Multi-Mission Power Analysis Tool (MMPAT) simulates a spacecraft power subsystem including the power source (solar array and/or radioisotope thermoelectric generator), bus-voltage control, secondary battery (lithium-ion or nickel-hydrogen), thermostatic heaters, and power-consuming equipment. It handles multiple mission types including heliocentric orbiters, planetary orbiters, and surface operations. Being parametrically driven along with its user-programmable features can reduce or even eliminate any need for software modifications when configuring it for a particular spacecraft. It provides multiple levels of fidelity, thereby fulfilling the vast majority of a project's power simulation needs throughout the lifecycle. It can operate in a stand-alone mode with a graphical user interface, in batch mode, or as a library linked with other tools. This software can simulate all major aspects of a spacecraft power subsystem. It is parametrically driven to reduce or eliminate the need for a programmer. Added flexibility is provided through user-designed state models and table-driven parameters. MMPAT is designed to be used by a variety of users, such as power subsystem engineers for sizing power subsystem components; mission planners for adjusting mission scenarios using power profiles generated by the model; system engineers for performing system-level trade studies using the results of the model during the early design phases of a spacecraft; and operations personnel for high-fidelity modeling of the essential power aspect of the planning picture.
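At its core, a parametrically driven power-subsystem simulation of this general kind reduces to an energy-balance loop over a mission timeline. The toy sketch below is not MMPAT: it omits bus-voltage control, heater logic, conversion efficiencies, and battery chemistry, and every name is illustrative.

```python
def simulate_soc(capacity_wh, loads_w, source_w, dt_s):
    """Battery state of charge over a timeline of load and source power.

    capacity_wh : usable battery capacity in watt-hours
    loads_w     : per-step total load power draw (W)
    source_w    : per-step power from solar array / RTG (W)
    dt_s        : time step in seconds

    The battery absorbs any surplus and supplies any deficit, clamped
    to [0, capacity]. A deliberately minimal energy-balance model.
    """
    soc = capacity_wh                      # start fully charged
    history = []
    for load, src in zip(loads_w, source_w):
        soc += (src - load) * dt_s / 3600.0   # W * s -> Wh
        soc = min(max(soc, 0.0), capacity_wh)
        history.append(soc)
    return history
```

Because the inputs are plain parameter tables, changing the spacecraft configuration means changing data, not code, which is the design philosophy the abstract describes.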
SOFIA: a flexible source finder for 3D spectral line data
NASA Astrophysics Data System (ADS)
Serra, Paolo; Westmeier, Tobias; Giese, Nadine; Jurek, Russell; Flöer, Lars; Popping, Attila; Winkel, Benjamin; van der Hulst, Thijs; Meyer, Martin; Koribalski, Bärbel S.; Staveley-Smith, Lister; Courtois, Hélène
2015-04-01
We introduce SOFIA, a flexible software application for the detection and parametrization of sources in 3D spectral line data sets. SOFIA combines for the first time in a single piece of software a set of new source-finding and parametrization algorithms developed on the way to future H I surveys with ASKAP (WALLABY, DINGO) and APERTIF. It is designed to enable the general use of these new algorithms by the community on a broad range of data sets. The key advantages of SOFIA are the ability to: search for line emission on multiple scales to detect 3D sources in a complete and reliable way, taking into account noise level variations and the presence of artefacts in a data cube; estimate the reliability of individual detections; look for signal in arbitrarily large data cubes using a catalogue of 3D coordinates as a prior; provide a wide range of source parameters and output products which facilitate further analysis by the user. We highlight the modularity of SOFIA, which makes it a flexible package allowing users to select and apply only the algorithms useful for their data and science questions. This modularity makes it also possible to easily expand SOFIA in order to include additional methods as they become available. The full SOFIA distribution, including a dedicated graphical user interface, is publicly available for download.
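Multi-scale source finding of the smooth-and-clip variety can be illustrated on a 1D spectrum; SOFIA itself operates on full 3D cubes and adds reliability estimation, noise-variation handling, and parametrization. In the illustrative sketch below, the spectrum is smoothed at several scales and any channel significant at any scale is flagged.

```python
import numpy as np

def smooth_and_clip(spectrum, widths=(1, 3, 9), nsigma=4.0):
    """Multi-scale source mask for a 1D spectrum.

    The spectrum is smoothed with boxcars of several widths; channels
    exceeding nsigma times the scale-dependent noise (estimated robustly
    via the median absolute deviation) in any smoothed version are flagged.
    An illustrative analogue of a smooth+clip finder, not SOFIA's code.
    """
    mask = np.zeros(spectrum.size, dtype=bool)
    for w in widths:
        smooth = np.convolve(spectrum, np.ones(w) / w, mode="same")
        mad = np.median(np.abs(smooth - np.median(smooth)))
        sigma = 1.4826 * mad                    # MAD -> Gaussian-equivalent sigma
        mask |= np.abs(smooth) > nsigma * sigma
    return mask
```

Searching at multiple scales is what lets both narrow bright lines and broad faint ones be detected with a single threshold.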
New technologies for advanced three-dimensional optimum shape design in aeronautics
NASA Astrophysics Data System (ADS)
Dervieux, Alain; Lanteri, Stéphane; Malé, Jean-Michel; Marco, Nathalie; Rostaing-Schmidt, Nicole; Stoufflet, Bruno
1999-05-01
The analysis of complex flows around realistic aircraft geometries is becoming more and more predictive. In order to obtain this result, the complexity of flow analysis codes has been constantly increasing, involving more refined fluid models and sophisticated numerical methods. These codes can only run on top computers, exhausting their memory and CPU capabilities. It is, therefore, difficult to introduce the best analysis codes in a shape optimization loop: most previous works in the optimum shape design field used only simplified analysis codes. Moreover, as the most popular optimization methods are the gradient-based ones, the more complex the flow solver, the more difficult it is to compute the sensitivity code. However, emerging technologies are contributing to make such an ambitious project, of including a state-of-the-art flow analysis code into an optimization loop, feasible. Among those technologies, there are three important issues that this paper wishes to address: shape parametrization, automated differentiation and parallel computing. Shape parametrization allows faster optimization by reducing the number of design variables; in this work, it relies on a hierarchical multilevel approach. The sensitivity code can be obtained using automated differentiation. The automated approach is based on software manipulation tools, which allow the differentiation to be quick and the resulting differentiated code to be rather fast and reliable. In addition, the parallel algorithms implemented in this work allow the resulting optimization software to run on increasingly larger geometries.
Implementation of AN Unmanned Aerial Vehicle System for Large Scale Mapping
NASA Astrophysics Data System (ADS)
Mah, S. B.; Cryderman, C. S.
2015-08-01
Unmanned Aerial Vehicles (UAVs), digital cameras, powerful personal computers, and software have made it possible for geomatics professionals to capture aerial photographs and generate digital terrain models and orthophotographs without using full scale aircraft or hiring mapping professionals. This has been made possible by the availability of miniaturized computers and sensors, and software which has been driven, in part, by the demand for this technology in consumer items such as smartphones. The other force that is in play is the increasing number of Do-It-Yourself (DIY) people who are building UAVs as a hobby or for professional use. Building a UAV system for mapping is an alternative to purchasing a turnkey system. This paper describes factors to be considered when building a UAV mapping system, the choices made, and the test results of a project using this completed system.
Application of GIS Rapid Mapping Technology in Disaster Monitoring
NASA Astrophysics Data System (ADS)
Wang, Z.; Tu, J.; Liu, G.; Zhao, Q.
2018-04-01
With the rapid development of GIS and RS technology in recent years, GIS software functions have become increasingly mature and powerful, and the rapid development of mathematical and statistical tools for spatial modeling and simulation has promoted the widespread application of quantitative methods in the field of geology. Based on field disaster investigation and the construction of a spatial database, this paper uses remote sensing imagery, DEMs and GIS technology to obtain the data needed for disaster vulnerability analysis, and applies an information model to carry out disaster risk assessment mapping. Using ArcGIS software and its spatial data modeling methods, the basic data for the disaster risk mapping process were acquired and processed, and the spatial data simulation tools were used to map the disaster rapidly.
Exploring physics concepts among novice teachers through CMAP tools
NASA Astrophysics Data System (ADS)
Suprapto, N.; Suliyanah; Prahani, B. K.; Jauhariyah, M. N. R.; Admoko, S.
2018-03-01
Concept maps are graphical tools for organising, elaborating and representing knowledge. Through the Cmap tools software, the understanding and hierarchical structuring of physics concepts among novice teachers can be explored. The software helps physics teachers indicate a physics context, focus questions, parking lots, cross-links, branching, hierarchy, and propositions. Using an exploratory quantitative study, a total of 13 concept maps on different physics topics created by novice physics teachers were analysed. The main differences between the lecturer's scoring and the peer teachers' scoring are also illustrated. The study offers some implications, especially for physics educators: determining the hierarchical structure of physics concepts, constructing a physics focus question, and seeing how a concept in one domain of knowledge represented on the map is related to a concept in another domain shown on the map.
Broadband set-top box using MAP-CA processor
NASA Astrophysics Data System (ADS)
Bush, John E.; Lee, Woobin; Basoglu, Chris
2001-12-01
Advances in broadband access are expected to exert a profound impact on our everyday life. They will be the key to the digital convergence of communication, computer and consumer equipment. A common thread that facilitates this convergence comprises digital media and the Internet. To address this market, Equator Technologies, Inc., is developing the Dolphin broadband set-top box reference platform using its MAP-CA Broadband Signal Processor chip. The Dolphin reference platform is a universal media platform for the display and presentation of digital content on end-user entertainment systems. The objective of the Dolphin reference platform is to provide a complete set-top box system based on the MAP-CA processor. It includes all the necessary hardware and software components for the emerging broadcast and broadband digital media market based on IP protocols. Such a reference design requires broadband Internet access and high-performance digital signal processing. By using the MAP-CA processor, the Dolphin reference platform is completely programmable, allowing various codecs to be implemented in software, such as MPEG-2, MPEG-4, H.263 and proprietary codecs. The software implementation also enables field upgrades to keep pace with evolving technology and industry demands.
Calibration Software for Use with Jurassicprok
NASA Technical Reports Server (NTRS)
Chapin, Elaine; Hensley, Scott; Siqueira, Paul
2004-01-01
The Jurassicprok Interferometric Calibration Software (also called "Calibration Processor" or simply "CP") estimates the calibration parameters of an airborne synthetic-aperture-radar (SAR) system, the raw measurement data of which are processed by the Jurassicprok software described in the preceding article. Calibration parameters estimated by CP include time delays, baseline offsets, phase screens, and radiometric offsets. CP examines raw radar-pulse data, single-look complex image data, and digital elevation map data. For each type of data, CP compares the actual values with values expected on the basis of ground-truth data. CP then converts the differences between the actual and expected values into updates for the calibration parameters in an interferometric calibration file (ICF) and a radiometric calibration file (RCF) for the particular SAR system. The updated ICF and RCF are used as inputs to both Jurassicprok and to the companion Motion Measurement Processor software (described in the following article) for use in generating calibrated digital elevation maps.
The Tetracorder user guide: version 4.4
Livo, Keith Eric; Clark, Roger N.
2014-01-01
Imaging spectroscopy mapping software assists in the identification and mapping of materials based on their chemical properties as expressed in spectral measurements of a planet including the solid or liquid surface or atmosphere. Such software can be used to analyze field, aircraft, or spacecraft data; remote sensing datasets; or laboratory spectra. Tetracorder is a set of software algorithms commanded through an expert system to identify materials based on their spectra (Clark and others, 2003). Tetracorder also can be used in traditional remote sensing analyses, because some of the algorithms are a version of a matched filter. Thus, depending on the instructions fed to the Tetracorder system, results can range from simple matched filter output, to spectral feature fitting, to full identification of surface materials (within the limits of the spectral signatures of materials over the spectral range and resolution of the imaging spectroscopy data). A basic understanding of spectroscopy by the user is required for developing an optimum mapping strategy and assessing the results.
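The spectral-feature-fitting step that Tetracorder builds on can be illustrated with a toy continuum removal followed by a least-squares fit of a library feature to an observed spectrum. This is a simplified sketch of the general technique described by Clark and others, not Tetracorder's actual code; the two-endpoint continuum and the correlation-based fit score are simplifying assumptions.

```python
import numpy as np

def feature_fit(observed, library, continuum_idx=(0, -1)):
    """Toy spectral-feature fit: remove a linear continuum defined by two
    endpoint channels, then score how well the continuum-removed library
    feature matches the continuum-removed observation."""
    def remove_continuum(spectrum):
        i, j = continuum_idx
        x = np.arange(len(spectrum), dtype=float)
        slope = (spectrum[j] - spectrum[i]) / (x[j] - x[i])
        continuum = spectrum[i] + slope * (x - x[i])
        return spectrum / continuum
    obs_cr, lib_cr = remove_continuum(observed), remove_continuum(library)
    # least-squares scaling of the library band depth onto the observation
    d_obs, d_lib = obs_cr - 1.0, lib_cr - 1.0
    scale = np.dot(d_obs, d_lib) / np.dot(d_lib, d_lib)
    fit = np.corrcoef(d_obs, d_lib)[0, 1]  # goodness-of-fit score
    return fit, scale
```

Ranking such fit scores across a library of reference spectra is the basic mechanism by which a material identification can be made from a single measured spectrum.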
Automation of the CFD Process on Distributed Computing Systems
NASA Technical Reports Server (NTRS)
Tejnil, Ed; Gee, Ken; Rizk, Yehia M.
2000-01-01
A script system was developed to automate and streamline portions of the CFD process. The system was designed to facilitate the use of CFD flow solvers on supercomputer and workstation platforms within a parametric design event. Integrating solver pre- and postprocessing phases, the fully automated ADTT script system marshalled the required input data, submitted the jobs to available computational resources, and processed the resulting output data. A number of codes were incorporated into the script system, which itself was part of a larger integrated design environment software package. The IDE and scripts were used in a design event involving a wind tunnel test. This experience highlighted the need for efficient data and resource management in all parts of the CFD process. To facilitate the use of CFD methods to perform parametric design studies, the script system was developed using UNIX shell and Perl languages. The goal of the work was to minimize the user interaction required to generate the data necessary to fill a parametric design space. The scripts wrote out the required input files for the user-specified flow solver, transferred all necessary input files to the computational resource, submitted and tracked the jobs using the resource queuing structure, and retrieved and post-processed the resulting dataset. For computational resources that did not run queueing software, the script system established its own simple first-in-first-out queueing structure to manage the workload. A variety of flow solvers were incorporated in the script system, including INS2D, PMARC, TIGER and GASP. Adapting the script system to a new flow solver was made easier through the use of object-oriented programming methods. The script system was incorporated into an ADTT integrated design environment and evaluated as part of a wind tunnel experiment. The system successfully generated the data required to fill the desired parametric design space. 
This stressed the computational resources required to compute and store the information. The scripts were continually modified to improve the utilization of the computational resources and reduce the likelihood of data loss due to failures. An ad-hoc file server was created to manage the large amount of data being generated as part of the design event. Files were stored and retrieved as needed to create new jobs and analyze the results. Additional information is contained in the original.
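The ad-hoc first-in-first-out queue described above can be sketched as a small job manager. The original system was written in UNIX shell and Perl; this Python analogue is purely illustrative, and the class and method names are invented for the example.

```python
from collections import deque

class FifoJobQueue:
    """Minimal first-in-first-out job manager, analogous to the ad-hoc
    queue the script system established on hosts without queueing
    software (illustrative sketch, not the original implementation)."""
    def __init__(self, max_running=2):
        self.waiting = deque()      # jobs submitted but not yet started
        self.running = set()        # jobs currently occupying a slot
        self.finished = []          # completed jobs, in completion order
        self.max_running = max_running
    def submit(self, job_id):
        self.waiting.append(job_id)
        self._dispatch()
    def complete(self, job_id):
        self.running.discard(job_id)
        self.finished.append(job_id)
        self._dispatch()
    def _dispatch(self):
        # start waiting jobs in submission order while slots are free
        while self.waiting and len(self.running) < self.max_running:
            self.running.add(self.waiting.popleft())
```

Capping `max_running` is what keeps a parametric sweep from oversubscribing a workstation that has no batch scheduler of its own.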
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sturtevant, Judith E.; Heaphy, Robert; Hodges, Ann Louise
2006-09-01
The purpose of the Sandia National Laboratories Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. The plan defines the ASC program software quality practices and provides mappings of these practices to Sandia Corporate Requirements CPR 1.3.2 and 1.3.6 and to a Department of Energy document, ASCI Software Quality Engineering: Goals, Principles, and Guidelines. This document also identifies ASC management's and software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals.
Measurement of CIB power spectra with CAM-SPEC from Planck HFI maps
NASA Astrophysics Data System (ADS)
Mak, Suet Ying; Challinor, Anthony; Efstathiou, George; Lagache, Guilaine
2015-08-01
We present new measurements of the cosmic infrared background (CIB) anisotropies and its first likelihood using Planck HFI data at 353, 545, and 857 GHz. The measurements are based on cross-frequency power spectra and a likelihood analysis using the CAM-SPEC package, rather than the map-based template removal of foregrounds done in the previous Planck CIB analysis. We construct the likelihood of the CIB temperature fluctuations, an extension of the CAM-SPEC likelihood used in CMB analysis to higher frequencies, and use it to derive the best estimate of the CIB power spectrum over three decades in multipole moment, l, covering 50 ≤ l ≤ 2500. We adopt parametric models of the CIB and foreground contaminants (Galactic cirrus, infrared point sources, and cosmic microwave background anisotropies), and calibrate the dataset uniformly across frequencies with known Planck beam and noise properties in the likelihood construction. We validate our likelihood through simulations and an extensive suite of consistency tests, and assess the impact of instrumental and data selection effects on the final CIB power spectrum constraints. Two approaches are developed for interpreting the CIB power spectrum. The first is based on a simple parametric model which models the cross-frequency power using amplitudes, correlation coefficients, and a known multipole dependence. The second is based on physical models for galaxy clustering and the evolution of the infrared emission of galaxies. The new approaches fit all auto- and cross-power spectra very well, with a best fit of χ²ν = 1.04 (parametric model). Using the best foreground solution, we find that the cleaned CIB power spectra are in good agreement with previous Planck and Herschel measurements.
West, Amanda M.; Evangelista, Paul H.; Jarnevich, Catherine S.; Young, Nicholas E.; Stohlgren, Thomas J.; Talbert, Colin; Talbert, Marian; Morisette, Jeffrey; Anderson, Ryan
2016-01-01
Early detection of invasive plant species is vital for the management of natural resources and protection of ecosystem processes. The use of satellite remote sensing for mapping the distribution of invasive plants is becoming more common; however, conventional imaging software and classification methods have been shown to be unreliable. In this study, we test and evaluate the use of five species distribution model techniques fit with satellite remote sensing data to map invasive tamarisk (Tamarix spp.) along the Arkansas River in Southeastern Colorado. The models tested included boosted regression trees (BRT), Random Forest (RF), multivariate adaptive regression splines (MARS), generalized linear model (GLM), and Maxent. These analyses were conducted using a newly developed software package called the Software for Assisted Habitat Modeling (SAHM). All models were trained with 499 presence points, 10,000 pseudo-absence points, and predictor variables acquired from the Landsat 5 Thematic Mapper (TM) sensor over an eight-month period to distinguish tamarisk from native riparian vegetation using detection of phenological differences. From the Landsat scenes, we used individual bands and calculated Normalized Difference Vegetation Index (NDVI), Soil-Adjusted Vegetation Index (SAVI), and tasseled cap transformations. All five models identified current tamarisk distribution on the landscape successfully based on threshold independent and threshold dependent evaluation metrics with independent location data. To account for model specific differences, we produced an ensemble of all five models with map output highlighting areas of agreement and areas of uncertainty. Our results demonstrate the usefulness of species distribution models in analyzing remotely sensed data and the utility of ensemble mapping, and showcase the capability of SAHM in pre-processing and executing multiple complex models.
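The ensemble agreement/uncertainty mapping described above can be sketched as a per-pixel vote across models: each model flags presence where its suitability exceeds its own threshold, and pixels where the models disagree form the uncertainty map. This is an illustrative sketch of the idea, not SAHM's actual implementation.

```python
import numpy as np

def ensemble_agreement(prob_maps, thresholds):
    """Combine per-model suitability maps into an ensemble: each model
    votes 'presence' where its probability exceeds its own threshold.
    Pixels with partial agreement form the uncertainty map."""
    votes = [(p >= t).astype(int) for p, t in zip(prob_maps, thresholds)]
    agreement = np.sum(votes, axis=0)               # 0..n_models votes per pixel
    n = len(prob_maps)
    uncertain = (agreement > 0) & (agreement < n)   # models disagree here
    return agreement, uncertain
```

Per-model thresholds are used because each technique (BRT, RF, MARS, GLM, Maxent) produces suitability scores on its own scale, so a single global cutoff would bias the vote.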
Evolution families of conformal mappings with fixed points and the Löwner-Kufarev equation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Goryainov, V V
2015-01-31
The paper is concerned with evolution families of conformal mappings of the unit disc to itself that fix an interior point and a boundary point. Conditions are obtained for the evolution families to be differentiable, and an existence and uniqueness theorem for an evolution equation is proved. A convergence theorem is established which describes the topology of locally uniform convergence of evolution families in terms of infinitesimal generating functions. The main result in this paper is the embedding theorem which shows that any conformal mapping of the unit disc to itself with two fixed points can be embedded into a differentiable evolution family of such mappings. This result extends the range of the parametric method in the theory of univalent functions. In this way the problem of the mutual change of the derivative at an interior point and the angular derivative at a fixed point on the boundary is solved for a class of mappings of the unit disc to itself. In particular, the rotation theorem is established for this class of mappings. Bibliography: 27 titles.
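For orientation, the classical radial Löwner-Kufarev equation, of which the evolution equation studied here is a variant adapted to two fixed points, can be written in its standard form from the theory of univalent functions (this is the textbook form, not the specific equation derived in the paper):

```latex
\frac{\partial w(z,t)}{\partial t} = -\,w(z,t)\, p\bigl(w(z,t),\,t\bigr),
\qquad w(z,0) = z,
\qquad \operatorname{Re} p(\,\cdot\,,t) > 0,
```

where the driving function p is holomorphic in the unit disc for each t; the positivity of its real part is what makes the flow contract the disc into itself.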
NASA Technical Reports Server (NTRS)
Harvey, James E.; Wissinger, Alan B.; Bunner, Alan N.
1986-01-01
The comparative advantages of synthetic aperture telescopes (SATs) of the segmented primary mirror and common secondary mirror type, on the one hand, and those employing an array of independent telescopes, on the other, are discussed. The diffraction-limited optical performance of both redundant and nonredundant subaperture configurations is compared in terms of point spread function characteristics and encircled energy plots. Coherent imaging with afocal-telescope SATs involves a pupil-mapping operation followed by a Fourier transform operation. A quantitative analysis of the off-axis optical performance degradation due to pupil-mapping errors is presented, together with the field-dependent effects of residual design aberrations of the independent telescopes.
A statistical method (cross-validation) for bone loss region detection after spaceflight
Zhao, Qian; Li, Wenjun; Li, Caixia; Chu, Philip W.; Kornak, John; Lang, Thomas F.
2010-01-01
Astronauts experience bone loss after long spaceflight missions. Identifying the specific regions that undergo the greatest losses (e.g. the proximal femur) could reveal information about the processes of bone loss in disuse and disease. Methods for detecting such regions, however, remain an open problem. This paper focuses on statistical methods to detect such regions. We perform statistical parametric mapping to obtain t-maps of changes in images, and propose a new cross-validation method to select an optimum suprathreshold for forming clusters of pixels. Once these candidate clusters are formed, we use permutation testing of longitudinal labels to derive significant changes. PMID:20632144
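The cluster-forming step on a t-map can be sketched as follows. This is only one building block of the pipeline described above (the cross-validated selection of the suprathreshold and the permutation test are not reproduced), and it is an illustrative sketch rather than the authors' code.

```python
import numpy as np
from scipy import ndimage

def clusters_above(t_map, threshold):
    """Form clusters of pixels whose t-statistic exceeds a chosen
    suprathreshold, returning the cluster label map and cluster sizes."""
    labels, n = ndimage.label(t_map > threshold)  # connected suprathreshold pixels
    sizes = ndimage.sum(np.ones_like(t_map), labels, index=range(1, n + 1))
    return labels, np.asarray(sizes, dtype=int)
```

In an SPM-style analysis, the distribution of such cluster sizes under permuted longitudinal labels is what converts a candidate cluster into a statement of significance.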
NASA Astrophysics Data System (ADS)
Hou, Gary Y.; Marquet, Fabrice; Wang, Shutao; Konofagou, Elisa E.
2014-03-01
Harmonic motion imaging for focused ultrasound (HMIFU) is a recently developed high-intensity focused ultrasound (HIFU) treatment monitoring method with feasibilities demonstrated in vitro and in vivo. Here, a multi-parametric study is performed to investigate both elastic and acoustics-independent viscoelastic tissue changes using the Harmonic Motion Imaging (HMI) displacement, axial compressive strain and change in relative phase shift during high energy HIFU treatment with tissue boiling. Forty-three (n = 43) thermal lesions were formed in ex vivo canine liver specimens (n = 28). Two-dimensional (2D) transverse HMI displacement maps were also obtained before and after lesion formation. The same method was repeated in 10 s, 20 s and 30 s HIFU durations at three different acoustic powers of 8, 10, and 11 W, which were selected and verified as treatment parameters capable of inducing boiling using both thermocouple and passive cavitation detection (PCD) measurements. Although a steady decrease in the displacement, compressive strain, and relative change in the focal phase shift (Δϕ) were obtained in numerous cases, indicating an overall increase in relative stiffness, the study outcomes also showed that during boiling, a reverse lesion-to-background displacement contrast was detected, indicating potential change in tissue absorption, geometrical change, and/or mechanical gelatification or pulverization. Following treatment, corresponding 2D HMI displacement images of the thermal lesions also mapped a consistent discrepancy in the lesion-to-background displacement contrast. Despite the expectedly chaotic changes in acoustic properties with boiling, the relative change in phase shift showed a consistent decrease, indicating its robustness to monitor biomechanical properties independent of the acoustic property changes throughout the HIFU treatment.
In addition, the 2D HMI displacement images confirmed and indicated the increase in the thermal lesion size with treatment duration, which was validated against pathology. In conclusion, multi-parametric HMIFU was shown capable of monitoring and mapping tissue viscoelastic response changes during and after HIFU boiling, some of which were independent of the acoustic parameter changes.
Whyte, Enda F; Richter, Chris; OʼConnor, Siobhan; Moran, Kieran A
2018-06-01
Whyte, EF, Richter, C, O'Connor, S, and Moran, KA. Investigation of the effects of high-intensity, intermittent exercise and unanticipation on trunk and lower limb biomechanics during a side-cutting maneuver using statistical parametric mapping. J Strength Cond Res 32(6): 1583-1593, 2018. Anterior cruciate ligament (ACL) injuries frequently occur during side-cutting maneuvers when fatigued or reacting to the sporting environment. Trunk and hip biomechanics are proposed to influence ACL loading during these activities. However, the effects of fatigue and unanticipation on the biomechanics of the kinetic chain may be limited by traditional discrete point analysis. We recruited 28 male, varsity, Gaelic footballers (21.7 ± 2.2 years; 178.7 ± 14.6 cm; 81.8 ± 11.4 kg) to perform anticipated and unanticipated side-cutting maneuvers before and after a high-intensity, intermittent exercise protocol (HIIP). Statistical parametric mapping (repeated-measures analysis of variance) identified differences in phases of trunk and stance leg biomechanics during weight acceptance. Unanticipation resulted in less trunk flexion (p < 0.001) and greater side flexion away from the direction of cut (p < 0.001). This led to smaller (internal) knee flexor and greater (internal) knee extensor (p = 0.002-0.007), hip adductor (p = 0.005), and hip external rotator (p = 0.007) moments. The HIIP resulted in increased trunk flexion (p < 0.001) and side flexion away from the direction of cut (p = 0.038), resulting in smaller (internal) knee extensor moments (p = 0.006). One interaction effect was noted demonstrating greater hip extensor moments in the unanticipated condition post-HIIP (p = 0.025). Results demonstrate that unanticipation resulted in trunk kinematics considered an ACL injury risk factor. A subsequent increase in frontal and transverse plane hip loading and sagittal plane knee loading was observed, which may increase ACL strain.
Conversely, HIIP-induced trunk kinematic alterations resulted in reduced sagittal plane knee and subsequent ACL loading. Therefore, adequate hip and knee control is important during unanticipated side-cutting maneuvers.
Katiyar, Prateek; Divine, Mathew R; Kohlhofer, Ursula; Quintanilla-Martinez, Leticia; Schölkopf, Bernhard; Pichler, Bernd J; Disselhorst, Jonathan A
2017-04-01
In this study, we described and validated an unsupervised segmentation algorithm for the assessment of tumor heterogeneity using dynamic 18F-FDG PET. The aim of our study was to objectively evaluate the proposed method and make comparisons with compartmental modeling parametric maps and SUV segmentations using simulations of clinically relevant tumor tissue types. Methods: An irreversible 2-tissue-compartmental model was implemented to simulate clinical and preclinical 18F-FDG PET time-activity curves using population-based arterial input functions (80 clinical and 12 preclinical) and the kinetic parameter values of 3 tumor tissue types. The simulated time-activity curves were corrupted with different levels of noise and used to calculate the tissue-type misclassification errors of spectral clustering (SC), parametric maps, and SUV segmentation. The utility of the inverse noise variance- and Laplacian score-derived frame weighting schemes before SC was also investigated. Finally, the SC scheme with the best results was tested on a dynamic 18F-FDG measurement of a mouse bearing subcutaneous colon cancer and validated using histology. Results: In the preclinical setup, the inverse noise variance-weighted SC exhibited the lowest misclassification errors (8.09%-28.53%) at all noise levels in contrast to the Laplacian score-weighted SC (16.12%-31.23%), unweighted SC (25.73%-40.03%), parametric maps (28.02%-61.45%), and SUV (45.49%-45.63%) segmentation. The classification efficacy of both weighted SC schemes in the clinical case was comparable to the unweighted SC. When applied to the dynamic 18F-FDG measurement of colon cancer, the proposed algorithm accurately identified densely vascularized regions from the rest of the tumor. In addition, the segmented regions and clusterwise average time-activity curves showed excellent correlation with the tumor histology.
Conclusion: The promising results of SC mark its position as a robust tool for quantification of tumor heterogeneity using dynamic PET studies. Because SC tumor segmentation is based on the intrinsic structure of the underlying data, it can be easily applied to other cancer types as well. © 2017 by the Society of Nuclear Medicine and Molecular Imaging.
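The inverse-noise-variance-weighted spectral clustering of time-activity curves can be sketched as follows. This is a simplified stand-in, not the authors' algorithm: the RBF affinity, the gamma value, and the k-means step on the Laplacian embedding are generic spectral-clustering choices assumed for illustration.

```python
import numpy as np
from scipy.cluster.vq import kmeans2

def weighted_spectral_clustering(tacs, noise_var, k=2, gamma=1.0):
    """Cluster time-activity curves (rows of `tacs`) after weighting each
    frame by its inverse noise variance, then apply plain spectral
    clustering (RBF affinity, normalized Laplacian, k-means on the
    embedding)."""
    x = tacs / np.sqrt(noise_var)               # inverse-noise-variance frame weighting
    d2 = ((x[:, None, :] - x[None, :, :]) ** 2).sum(-1)
    w = np.exp(-gamma * d2)                     # RBF affinity matrix
    d = w.sum(1)
    lap = np.eye(len(x)) - w / np.sqrt(np.outer(d, d))  # normalized Laplacian
    vals, vecs = np.linalg.eigh(lap)
    emb = vecs[:, :k]                           # k smallest eigenvectors
    emb /= np.linalg.norm(emb, axis=1, keepdims=True)
    _, labels = kmeans2(emb, k, seed=0, minit="++")
    return labels
```

Down-weighting noisy frames before computing affinities is what makes the clustering robust at the noise levels typical of short dynamic PET frames.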
Association Fields via Cuspless Sub-Riemannian Geodesics in SE(2).
Duits, R; Boscain, U; Rossi, F; Sachkov, Y
To model association fields that underlie perceptual organization (gestalt) in psychophysics we consider the problem P_curve of minimizing [Formula: see text] for a planar curve having fixed initial and final positions and directions. Here κ(s) is the curvature of the curve with free total length ℓ. This problem comes from a model of the geometry of vision due to Petitot (in J. Physiol. Paris 97:265-309, 2003; Math. Inf. Sci. Humaines 145:5-101, 1999), and Citti & Sarti (in J. Math. Imaging Vis. 24(3):307-326, 2006). In previous work we proved that the range [Formula: see text] of the exponential map of the underlying geometric problem formulated on SE(2) consists of precisely those end-conditions (x_fin, y_fin, θ_fin) that can be connected by a globally minimizing geodesic starting at the origin (x_in, y_in, θ_in) = (0,0,0). From the applied imaging point of view it is relevant to analyze the sub-Riemannian geodesics and [Formula: see text] in detail. In this article we: show that [Formula: see text] is contained in the half space x ≥ 0 and that (0, y_fin) ≠ (0,0) is reached with angle π; show that the boundary [Formula: see text] consists of endpoints of minimizers either starting or ending in a cusp; analyze and plot the cones of reachable angles θ_fin per spatial endpoint (x_fin, y_fin); relate the endings of association fields to [Formula: see text] and compute the length towards a cusp; analyze the exponential map both with the common arc-length parametrization t in the sub-Riemannian manifold [Formula: see text] and with spatial arc-length parametrization s in the plane [Formula: see text] (surprisingly, s-parametrization simplifies the exponential map, the curvature formulas, the cusp-surface, and the boundary value problem); present a novel efficient algorithm solving the boundary value problem; show that sub-Riemannian geodesics solve Petitot's circle bundle model (cf. Petitot in J. Physiol. Paris 97:265-309, 2003); and show a clear similarity with association field lines and sub-Riemannian geodesics.
Hou, Gary Y.; Marquet, Fabrice; Wang, Shutao; Konofagou, Elisa E.
2014-01-01
Harmonic Motion Imaging for Focused Ultrasound (HMIFU) is a recently developed high-intensity focused ultrasound (HIFU) treatment monitoring method with feasibilities demonstrated in vitro and in vivo. Here, a multi-parametric study is performed to investigate both elastic and acoustics-independent viscoelastic tissue changes using the Harmonic Motion Imaging (HMI) displacement, axial compressive strain and change in relative phase-shift during high energy HIFU treatment with tissue boiling. Forty three (n=43) thermal lesions were formed in ex vivo canine liver specimens (n=28). Two dimensional (2D) transverse HMI displacement maps were also obtained before and after lesion formation. The same method was repeated in 10-s, 20-s and 30-s HIFU durations at three different acoustic powers of 8, 10, and 11W, which were selected and verified as treatment parameters capable of inducing boiling using both thermocouple and Passive Cavitation Detection (PCD) measurements. Although a steady decrease in the displacement, compressive strain, and relative change in the focal phase shift (Δφ) were obtained in numerous cases, indicating an overall increase in relative stiffness, the study outcomes also showed that during boiling, a reverse lesion-to-background displacement contrast was detected, indicating potential change in tissue absorption, geometrical change and/or, mechanical gelatification or pulverization. Following treatment, corresponding 2D HMI displacement images of the thermal lesions also mapped consistent discrepancy in the lesion-to-background displacement contrast. Despite unpredictable changes in acoustic properties with boiling, the relative change in phase shift showed a consistent decrease, indicating its robustness to monitor biomechanical properties independent of the acoustic property change throughout the HIFU treatment. 
In addition, the 2D HMI displacement images confirmed and indicated the increase in the thermal lesion size with treatment duration, which was validated against pathology. In conclusion, multi-parametric HMIFU was shown capable of monitoring and mapping tissue viscoelastic response changes during and after HIFU boiling, some of which were independent of the acoustic parameter changes. PMID:24556974
Hou, Gary Y; Marquet, Fabrice; Wang, Shutao; Konofagou, Elisa E
2014-03-07
Harmonic motion imaging for focused ultrasound (HMIFU) is a recently developed high-intensity focused ultrasound (HIFU) treatment monitoring method with feasibilities demonstrated in vitro and in vivo. Here, a multi-parametric study is performed to investigate both elastic and acoustics-independent viscoelastic tissue changes using the Harmonic Motion Imaging (HMI) displacement, axial compressive strain and change in relative phase shift during high energy HIFU treatment with tissue boiling. Forty three (n = 43) thermal lesions were formed in ex vivo canine liver specimens (n = 28). Two-dimensional (2D) transverse HMI displacement maps were also obtained before and after lesion formation. The same method was repeated in 10 s, 20 s and 30 s HIFU durations at three different acoustic powers of 8, 10, and 11 W, which were selected and verified as treatment parameters capable of inducing boiling using both thermocouple and passive cavitation detection (PCD) measurements. Although a steady decrease in the displacement, compressive strain, and relative change in the focal phase shift (Δϕ) were obtained in numerous cases, indicating an overall increase in relative stiffness, the study outcomes also showed that during boiling, a reverse lesion-to-background displacement contrast was detected, indicating potential change in tissue absorption, geometrical change and/or, mechanical gelatification or pulverization. Following treatment, corresponding 2D HMI displacement images of the thermal lesions also mapped consistent discrepancy in the lesion-to-background displacement contrast. Despite the expectedly chaotic changes in acoustic properties with boiling, the relative change in phase shift showed a consistent decrease, indicating its robustness to monitor biomechanical properties independent of the acoustic property changes throughout the HIFU treatment. 
NASA Astrophysics Data System (ADS)
Triantafyllou, Antoine; Bastin, Christophe; Watlet, Arnaud
2016-04-01
GIS software suites are today's essential tools to gather and visualise geological data, to apply spatial and temporal analyses and, ultimately, to create and share interactive maps for further geoscience investigations. For these purposes, we developed GeolOkit: an open-source, freeware and lightweight software package, written in Python, a high-level, cross-platform programming language. GeolOkit is accessible through a graphical user interface designed to run in parallel with Google Earth. It is a user-friendly toolbox that allows 'geo-users' to import their raw data (e.g. GPS, sample locations, structural data, field pictures, maps), to use fast data analysis tools and to plot these into the Google Earth environment using KML code. This workflow requires no third-party software except Google Earth itself. GeolOkit comes with a large number of geoscience labels, symbols, colours and placemarks and may process: (i) multi-point data, (ii) contours via several interpolation methods, (iii) discrete planar and linear structural data in 2D or 3D, supporting a large range of structure input formats, (iv) clustered stereonets and rose diagrams, (v) drawn cross-sections as vertical sections, (vi) georeferenced maps and vectors, and (vii) field pictures using either geo-tagging metadata from a camera's built-in GPS module or the same-day track of an external GPS. We invite you to discover the functionalities of the GeolOkit software. As this project is under development, we welcome discussions regarding your needs, your ideas and your contributions to the GeolOkit project.
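The KML export step described above can be sketched in a few lines of Python. GeolOkit's actual writer is not shown in the abstract, so the function names and the sample placemark below are purely illustrative; only the KML element structure (and its lon,lat coordinate ordering) follows the OGC KML 2.2 format.

```python
# Minimal sketch of emitting a KML placemark for a field observation,
# of the kind GeolOkit plots into Google Earth. Function names and the
# sample data are illustrative, not GeolOkit's real API.
def kml_placemark(name, lat, lon, description=""):
    """Return a KML Placemark string for one field observation."""
    return (
        "<Placemark>"
        f"<name>{name}</name>"
        f"<description>{description}</description>"
        # KML expects lon,lat(,alt) ordering, not lat,lon
        f"<Point><coordinates>{lon},{lat},0</coordinates></Point>"
        "</Placemark>"
    )

def kml_document(placemarks):
    """Wrap placemarks in a complete KML document."""
    body = "".join(placemarks)
    return ('<?xml version="1.0" encoding="UTF-8"?>'
            '<kml xmlns="http://www.opengis.net/kml/2.2">'
            f"<Document>{body}</Document></kml>")

doc = kml_document([kml_placemark("Sample S01", 50.45, 4.87, "granite outcrop")])
print(doc[:60])
```

A file written this way opens directly in Google Earth, which is the "no third-party software" workflow the abstract describes.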
OGS improvements in the year 2011 in running the Northeastern Italy Seismic Network
NASA Astrophysics Data System (ADS)
Bragato, P. L.; Pesaresi, D.; Saraò, A.; Di Bartolomeo, P.; Durı, G.
2012-04-01
The Centro di Ricerche Sismologiche (CRS, Seismological Research Center) of the Istituto Nazionale di Oceanografia e di Geofisica Sperimentale (OGS, Italian National Institute for Oceanography and Experimental Geophysics) in Udine (Italy) started to operate the Northeastern Italy Seismic Network after the strong earthquake of magnitude M=6.4 that struck the Italian Friuli-Venezia Giulia region in 1976. The network currently consists of 15 very sensitive broadband and 21 simpler short-period seismic stations, all telemetered to and acquired in real time at the OGS-CRS data center in Udine. Real-time data exchange agreements with other Italian, Slovenian, Austrian and Swiss seismological institutes bring the total number of seismic stations acquired in real time to about 100, which makes OGS the reference institute for seismic monitoring of Northeastern Italy. Since 2002 OGS-CRS has been using the Antelope software suite on several workstations plus a SUN Cluster as the main tool for collecting, analyzing, archiving and exchanging seismic data, initially in the framework of the EU Interreg IIIA project "Trans-national seismological networks in the South-Eastern Alps". SeisComP is also used as a real-time data exchange server tool. In order to improve the seismological monitoring of the Northeastern Italy area, at OGS-CRS we tuned existing programs and created ad hoc ones, such as: a customized web server named PickServer to manually relocate earthquakes, a script for automatic moment tensor determination, scripts for web publishing of earthquake parametric data, waveforms, state-of-health parameters and shaking maps, noise characterization by means of automatic spectral analysis, and, last but not least, scripts for email/SMS/fax alerting.
The OGS-CRS Real Time Seismological website (RTS, http://rts.crs.inogs.it/), operational for several years, was initially developed in the framework of the Italian DPC-INGV S3 Project: the RTS website shows classic earthquake location parametric data plus ShakeMap and moment tensor information. At OGS-CRS we have also spent considerable effort improving the long-period performance of broadband seismic stations, either by carrying out full re-installations or by applying thermal insulation to the seismometers: examples of PSD plots of the PRED broadband seismic station installed in the cave tunnel of Cave del Predil, using a Quanterra Q330HR high-resolution digitizer and a Streckeisen STS-2 broadband seismometer, will be illustrated. Efforts to strengthen the reliability of data links, exploring the use of redundant satellite/radio/GPRS links, will also be shown.
Wang, Rui; Meinel, Felix G; Schoepf, U Joseph; Canstein, Christian; Spearman, James V; De Cecco, Carlo N
2015-12-01
To evaluate the accuracy, reliability and time-saving potential of a novel cardiac CT (CCT)-based, automated software for the assessment of segmental left ventricular (LV) function compared to visual and manual quantitative assessment of CCT and cardiac magnetic resonance (CMR). Forty-seven patients with suspected or known coronary artery disease (CAD) were enrolled in the study. Wall thickening was calculated. Segmental LV wall motion was automatically calculated and shown as a colour-coded polar map. Processing time for each method was recorded. Mean wall thickness in both systolic and diastolic phases on the polar map, CCT and CMR was 9.2 ± 0.1 mm and 14.9 ± 0.2 mm, 8.9 ± 0.1 mm and 14.5 ± 0.1 mm, and 8.3 ± 0.1 mm and 13.6 ± 0.1 mm, respectively. Mean wall thickening was 68.4 ± 1.5 %, 64.8 ± 1.4 % and 67.1 ± 1.4 %, respectively. Agreement for the assessment of LV wall motion between CCT, CMR and polar maps was good. Bland-Altman plots and ICC indicated good agreement between CCT, CMR and automated polar maps for diastolic and systolic segmental wall thickness and thickening. The processing time using the polar map was significantly shorter than with CCT and CMR. Automated evaluation of segmental LV function with polar maps provides measurements similar to manual CCT and CMR evaluation, albeit with substantially reduced analysis time. • Cardiac computed tomography (CCT) can accurately assess segmental left ventricular wall function. • A novel automated software permits accurate and fast evaluation of wall function. • The software may improve the clinical implementation of segmental functional analysis.
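Segmental wall thickening is conventionally derived from the end-diastolic and end-systolic wall thickness. A minimal sketch, using one common definition (percent change normalized by the diastolic thickness, which may differ from the exact formula this software applies):

```python
def percent_wall_thickening(systolic_mm, diastolic_mm):
    """Percent systolic wall thickening relative to diastolic thickness."""
    return 100.0 * (systolic_mm - diastolic_mm) / diastolic_mm

# Population mean thicknesses reported for the automated polar map:
# 14.9 mm systolic, 9.2 mm diastolic. Applying the formula to these
# means does not reproduce the paper's 68.4 % (which averages
# per-segment ratios), so this is only an order-of-magnitude check.
print(round(percent_wall_thickening(14.9, 9.2), 1))  # → 62.0
```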
R-CMap-An open-source software for concept mapping.
Bar, Haim; Mentch, Lucas
2017-02-01
Planning and evaluating projects often involves input from many stakeholders. Fusing and organizing many different ideas, opinions, and interpretations into a coherent and acceptable plan or project evaluation is challenging. This is especially true when seeking contributions from a large number of participants, when not all of them can take part in group discussions, or when some prefer to contribute their perspectives anonymously. One of the major breakthroughs in the area of evaluation and program planning has been the use of graphical tools to represent the brainstorming process. This provides a quantitative framework for organizing ideas and general concepts into simple-to-interpret graphs. We developed a new, open-source concept mapping software called R-CMap, which is implemented in R. This software provides a graphical user interface to guide users through the analytical process of concept mapping. The R-CMap software allows users to generate a variety of plots, including cluster maps, point rating and cluster rating maps, as well as pattern matching and go-zone plots. Additionally, R-CMap is capable of generating detailed reports that contain useful statistical summaries of the data. The plots and reports can be embedded in Microsoft Office tools such as Word and PowerPoint, where users may manually adjust various plot and table features to achieve the best visual results in their presentations and official reports. The graphical user interface of R-CMap allows users to define cluster names, change the number of clusters, select rating variables for relevant plots, and, importantly, select subsets of respondents by demographic criteria. The latter is particularly useful for project managers to identify different patterns of preferences across subpopulations. R-CMap is user-friendly, and does not require any programming experience.
However, proficient R users can add to its functionality by directly accessing built-in functions in R and sharing new features with the concept mapping community.
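Concept mapping of this kind typically starts from participants' card-sorting data: statements placed in the same pile are counted as similar, and the resulting similarity matrix is clustered (tools like R-CMap also apply multidimensional scaling before clustering). A pure-Python sketch of just the co-occurrence and clustering steps, on invented toy data:

```python
from itertools import combinations

def cooccurrence(sorts, statements):
    """Count how many participants placed each pair of statements in
    the same pile. `sorts` is a list of pile-sets, one per participant."""
    sim = {pair: 0 for pair in combinations(statements, 2)}
    for piles in sorts:
        for pile in piles:
            for a, b in combinations(sorted(pile), 2):
                sim[(a, b)] += 1
    return sim

def cluster(sim, statements, k):
    """Greedy agglomerative clustering down to k clusters: repeatedly
    merge the two clusters joined by the most-often co-sorted pair
    (single linkage)."""
    clusters = [{s} for s in statements]
    def link(c1, c2):
        return max(sim.get((min(a, b), max(a, b)), 0)
                   for a in c1 for b in c2)
    while len(clusters) > k:
        i, j = max(((i, j) for i in range(len(clusters))
                    for j in range(i + 1, len(clusters))),
                   key=lambda ij: link(clusters[ij[0]], clusters[ij[1]]))
        clusters[i] |= clusters.pop(j)
    return clusters

# Two participants sort four statements into piles.
sorts = [[{"A", "B"}, {"C", "D"}], [{"A", "B", "C"}, {"D"}]]
sim = cooccurrence(sorts, ["A", "B", "C", "D"])
result = cluster(sim, ["A", "B", "C", "D"], 2)
print(result)
```

With these toy sorts, A and B co-occur twice and end up clustered with C, while D separates out, which is the kind of structure a cluster map visualizes.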
Operational Use of OGC Web Services at the Met Office
NASA Astrophysics Data System (ADS)
Wright, Bruce
2010-05-01
The Met Office has adopted the Service-Oriented Architecture paradigm to deliver services to a range of customers through Rich Internet Applications (RIAs). The approach uses standard Open Geospatial Consortium (OGC) web services to provide information to web-based applications through a range of generic data services. "Invent", the Met Office beta site, is used to showcase the Met Office's future plans for presenting web-based weather forecasts, products and information to the public. This currently hosts a freely accessible Weather Map Viewer, written in JavaScript, which accesses a Web Map Service (WMS) to deliver innovative web-based visualizations of weather and its potential impacts to the public. The intention is to engage the public in the development of new web-based services that more accurately meet their needs. As the service is intended for public use within the UK, it has been designed to support a user base of 5 million, the analysed level of UK web traffic reaching the Met Office's public weather information site. The required scalability has been realised through the use of multi-tier tile caching: - WMS requests are made for 256x256 tiles for fixed areas and zoom levels; - a Tile Cache, developed in-house, efficiently serves tiles on demand, managing WMS requests for new tiles; - Edge Servers, externally hosted by Akamai, provide a highly scalable (UK-centric) service for pre-cached tiles, passing new requests to the Tile Cache; - the Invent Weather Map Viewer uses the Google Maps API to request tiles from the Edge Servers. (We would expect to make use of the Web Map Tiling Service when it becomes an OGC standard.) The Met Office delivers specialist commercial products to market sectors such as transport, utilities and defence, which exploit a Web Feature Service (WFS) for data relating forecasts and observations to specific geographic features, and a Web Coverage Service (WCS) for sub-selections of gridded data.
These are locally rendered as maps or graphs and combined with the WMS pre-rendered images and text in a FLEX application to provide a sophisticated, user-impact-based view of the weather. The OGC web services supporting these applications have been developed in collaboration with commercial companies. Visual Weather was originally a desktop application for forecasters, but IBL have developed it to expose the full range of forecast and observation data through standard web services (WCS and WMS). Forecasts and observations relating to specific locations and geographic features are held in an Oracle Database and exposed as a WFS using Snowflake Software's GO-Publisher application. The Met Office has worked closely with both IBL and Snowflake Software to ensure that the web services provided strike a balance between conformance to the standards and performance in an operational environment. This has proved challenging in areas where the standards are rapidly evolving (e.g. WCS) or do not allow adequate description of the Met-Ocean domain (e.g. multiple time coordinates and parametric vertical coordinates). It has also become clear that careful selection of the features to expose, based on the way in which you expect users to query those features, is necessary in order to deliver adequate performance. These experiences are providing useful 'real-world' input into the recently launched OGC MetOcean Domain Working Group and World Meteorological Organisation (WMO) initiatives in this area.
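The tile-caching scheme above (fixed 256x256 tiles per zoom level, requested through the Google Maps API) follows the standard Web Mercator tiling. The Met Office's exact cache keys are not specified, but the generic formula for the index of the tile covering a given point can be sketched as:

```python
import math

def tile_index(lat_deg, lon_deg, zoom):
    """Return the (x, y) index of the 256x256 Web Mercator tile
    containing the given WGS84 point at the given zoom level."""
    n = 2 ** zoom  # number of tiles per axis at this zoom level
    x = int((lon_deg + 180.0) / 360.0 * n)
    lat = math.radians(lat_deg)
    y = int((1.0 - math.asinh(math.tan(lat)) / math.pi) / 2.0 * n)
    return x, y

print(tile_index(51.5074, -0.1278, 10))  # central London at zoom 10
```

A cache (or an Akamai edge server) can then key each pre-rendered tile by the triple (zoom, x, y), so repeated viewer requests for the same map area hit the cache instead of the WMS.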
Beam Position and Phase Monitor - Wire Mapping System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Watkins, Heath A; Shurter, Robert B.; Gilpatrick, John D.
2012-04-10
The Los Alamos Neutron Science Center (LANSCE) deploys many cylindrical beam position and phase monitors (BPPM) throughout the linac to measure the beam central position, phase and bunched-beam current. Each monitor is calibrated and qualified prior to installation to ensure it meets LANSCE requirements. The BPPM wire mapping system is used to map the BPPM electrode offset, sensitivity and higher-order coefficients. This system uses a three-axis motion table to position the wire antenna structure within the cavity, simulating the beam excitation of a BPPM at a fundamental frequency of 201.25 MHz. RF signal strength is measured and recorded for the four electrodes as the antenna position is updated. An effort is underway to extend the system's service to the LANSCE facility by replacing obsolete electronic hardware and taking advantage of software enhancements. This paper describes the upgraded wire positioning system's new hardware and software capabilities, including its revised antenna structure, motion control interface, RF measurement equipment and LabVIEW software upgrades. The main purpose of the wire mapping system at LANSCE is to characterize the amplitude response versus beam central position of BPPMs before they are installed in the beam line. The wire mapping system is able to simulate a beam using a thin wire and measure the signal response as the wire position is varied within the BPPM aperture.
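The amplitude-to-position relation that such a wire mapper characterizes is, to first order, the standard difference-over-sum estimate for a four-electrode pickup. The sensitivity constant and the higher-order coefficients are exactly what the mapping measures, so the value of k below is illustrative only, not a LANSCE calibration:

```python
def beam_position(right, left, top, bottom, k=1.0):
    """First-order beam (or wire) position estimate from four electrode
    amplitudes using the common difference-over-sum formula. k is the
    sensitivity (e.g. mm per unit ratio) that a wire-mapping system
    calibrates; k = 1.0 here is a placeholder, not a measured value."""
    x = k * (right - left) / (right + left)
    y = k * (top - bottom) / (top + bottom)
    return x, y

print(beam_position(1.0, 1.0, 1.0, 1.0))   # centered wire
print(beam_position(1.2, 0.8, 1.0, 1.0))   # wire displaced toward +x
```

The wire mapper sweeps the antenna across the aperture and fits measured amplitude ratios against known positions, which yields k and the higher-order correction terms this linear formula omits.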
Digital Mapping Techniques '11–12 workshop proceedings
Soller, David R.
2014-01-01
At these meetings, oral and poster presentations and special discussion sessions emphasized: (1) methods for creating and publishing map products (here, "publishing" includes Web-based release); (2) field data capture software and techniques, including the use of LiDAR; (3) digital cartographic techniques; (4) migration of digital maps into ArcGIS Geodatabase formats; (5) analytical GIS techniques; and (6) continued development of the National Geologic Map Database.
2006-01-01
GDA (Geologic Data Assistant) is an extension to ArcPad, a mobile mapping software program by Environmental Systems Research Institute (ESRI) designed to run on personal digital assistant (PDA) computers. GDA and ArcPad allow a PDA to replace the paper notebook and field map traditionally used for geologic mapping. GDA allows easy collection of field data.
Evaluation criteria for software classification inventories, accuracies, and maps
NASA Technical Reports Server (NTRS)
Jayroe, R. R., Jr.
1976-01-01
Statistical criteria are presented for modifying the contingency table used to evaluate tabular classification results obtained from remote sensing and ground truth maps. The modified evaluation provides information on the spatial complexity of the test site, on the relative location of classification errors and on the agreement of the classification maps with the ground truth maps, and reduces back to the original information normally found in a contingency table.
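The baseline that any modified table must reduce back to is the ordinary contingency (confusion) matrix and its summary statistics. A minimal sketch of those standard summaries, with an invented two-class example (the paper's spatial modifications are not reproduced here):

```python
def contingency_summary(matrix):
    """Overall accuracy plus per-class producer's and user's accuracy
    from a square contingency (confusion) matrix, with rows = ground
    truth classes and columns = classified classes."""
    total = sum(sum(row) for row in matrix)
    correct = sum(matrix[i][i] for i in range(len(matrix)))
    # producer's accuracy: correct pixels / ground-truth pixels per class
    producers = [matrix[i][i] / sum(matrix[i]) for i in range(len(matrix))]
    # user's accuracy: correct pixels / classified pixels per class
    users = [matrix[i][i] / sum(row[i] for row in matrix)
             for i in range(len(matrix))]
    return correct / total, producers, users

# Illustrative two-class table: 40 + 45 correctly classified of 100 pixels.
overall, prod, user = contingency_summary([[40, 10], [5, 45]])
print(round(overall, 2))  # → 0.85
```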
NASA Astrophysics Data System (ADS)
Grochocka, M.
2013-12-01
Mobile laser scanning (MLS) is a dynamically developing measurement technology that is becoming increasingly widespread for acquiring three-dimensional spatial information. Continuous technical progress, based on the use of new tools and on technology development, and thus better use of existing resources, reveals new horizons for the extensive use of MLS technology. Mobile laser scanning systems are usually used for mapping linear objects, in particular for inventories of roads, railways, bridges, shorelines, shafts, tunnels and even geometrically complex urban spaces. The measurement is carried out while the object remains in normal use and does not interfere with movement or ongoing work. This paper presents initial results of the segmentation of data acquired by MLS. The data used in this work were obtained as part of an inventory measurement of railway line infrastructure. The point clouds were measured using profile scanners installed on a railway platform. To process the data, the tools of the open-source Point Cloud Library (PCL) were used; these tools are provided as templated programming libraries. PCL is an open, independent project, operating on a large scale, for processing 2D/3D images and point clouds. The PCL software is released under the terms of the BSD (Berkeley Software Distribution) license, which means it is free for commercial and research use. The article presents a number of issues related to the use of this software and its capabilities. Data segmentation is based on the pcl_segmentation template library, which contains segmentation algorithms that separate clusters. These algorithms are best suited to processing point clouds consisting of a number of spatially isolated regions. The library performs cluster extraction based on model fitting by the sample consensus method for various parametric models (planes, cylinders, spheres, lines, etc.). Most of the mathematical operations are carried out on the basis of the Eigen library, a set of templates for linear algebra.
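The sample consensus model fitting mentioned above is essentially RANSAC: repeatedly fit a candidate model to a minimal random sample of points and keep the model with the most inliers. A pure-Python sketch for the plane model, on invented data (PCL's own implementation is templated C++ and far more elaborate):

```python
import random

def plane_from_points(p1, p2, p3):
    """Unit-normal plane (a, b, c, d) with ax+by+cz+d = 0 through three
    points, or None if the points are (nearly) collinear."""
    u = [p2[i] - p1[i] for i in range(3)]
    v = [p3[i] - p1[i] for i in range(3)]
    n = [u[1]*v[2] - u[2]*v[1], u[2]*v[0] - u[0]*v[2], u[0]*v[1] - u[1]*v[0]]
    norm = sum(c * c for c in n) ** 0.5
    if norm < 1e-9:
        return None
    a, b, c = [comp / norm for comp in n]
    d = -(a * p1[0] + b * p1[1] + c * p1[2])
    return a, b, c, d

def ransac_plane(points, threshold=0.1, iterations=200, seed=0):
    """Return the indices of the inliers of the best plane found."""
    rng = random.Random(seed)
    best = []
    for _ in range(iterations):
        model = plane_from_points(*rng.sample(points, 3))
        if model is None:
            continue  # degenerate (collinear) sample, try again
        a, b, c, d = model
        inliers = [i for i, p in enumerate(points)
                   if abs(a * p[0] + b * p[1] + c * p[2] + d) <= threshold]
        if len(inliers) > len(best):
            best = inliers
    return best

# Toy scene: a flat ground plane (z = 0) plus a few elevated outliers.
pts = [(x * 0.5, y * 0.5, 0.0) for x in range(6) for y in range(5)]
pts += [(1.0, 1.0, 5.0), (2.0, 0.5, 4.0), (0.5, 2.0, 6.0)]
print(len(ransac_plane(pts)))  # inliers found on the ground plane
```

In a segmentation pipeline the inliers would be extracted as one segment and the procedure repeated on the remaining points for the next model.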
Concurrent and Accurate Short Read Mapping on Multicore Processors.
Martínez, Héctor; Tárraga, Joaquín; Medina, Ignacio; Barrachina, Sergio; Castillo, Maribel; Dopazo, Joaquín; Quintana-Ortí, Enrique S
2015-01-01
We introduce a parallel aligner with a work-flow organization for fast and accurate mapping of RNA sequences on servers equipped with multicore processors. Our software, HPG Aligner SA (an open-source application available at http://www.opencb.org), exploits a suffix array to rapidly map a large fraction of the RNA fragments (reads), and leverages the accuracy of the Smith-Waterman algorithm to deal with conflictive reads. The aligner is enhanced with a careful strategy to detect splice junctions based on an adaptive division of RNA reads into small segments (or seeds), which are then mapped onto a number of candidate alignment locations, providing crucial information for the successful alignment of the complete reads. The experimental results on a platform with Intel multicore technology report the parallel performance of HPG Aligner SA on RNA reads of 100-400 nucleotides, where it excels in execution time and sensitivity compared with state-of-the-art aligners such as TopHat 2+Bowtie 2, MapSplice, and STAR.
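The core suffix-array lookup is a pair of binary searches over lexicographically sorted suffixes. A pure-Python sketch of exact-match seeding (HPG Aligner SA's production index uses a far more efficient construction and handles mismatches; this only illustrates the data structure):

```python
from bisect import bisect_left, bisect_right

def build_suffix_array(text):
    """Positions of all suffixes of text in lexicographic order.
    O(n^2 log n) construction: fine for a sketch, not for a genome."""
    return sorted(range(len(text)), key=lambda i: text[i:])

def find_occurrences(text, sa, pattern):
    """All start positions of pattern in text via two binary searches
    over the sorted suffixes (truncated to the pattern length)."""
    prefixes = [text[i:i + len(pattern)] for i in sa]
    lo = bisect_left(prefixes, pattern)
    hi = bisect_right(prefixes, pattern)
    return sorted(sa[lo:hi])

ref = "ACGTACGT"
sa = build_suffix_array(ref)
print(find_occurrences(ref, sa, "ACG"))  # → [0, 4]
```

Because the suffixes are sorted, every occurrence of a seed forms one contiguous interval of the array, which is what makes mapping a large fraction of reads so fast before falling back to Smith-Waterman for the hard cases.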
NASA Astrophysics Data System (ADS)
Forkert, Nils Daniel; Siemonsen, Susanne; Dalski, Michael; Verleger, Tobias; Kemmling, Andre; Fiehler, Jens
2014-03-01
Acute ischemic stroke is a leading cause of death and disability in industrialized nations. In patients with an acute ischemic stroke, prediction of the future tissue outcome is of high interest for clinicians, as it can be used to support therapy decision making. Within this context, it has already been shown that voxel-wise multi-parametric tissue outcome prediction leads to more promising results than single-channel perfusion map thresholding. Most previously published multi-parametric predictions employ information from perfusion maps derived from perfusion-weighted MRI (PWI) together with other image sequences such as diffusion-weighted MRI. However, it remains unclear whether the typically calculated perfusion maps used for this purpose really include all the valuable information in the PWI dataset for an optimal tissue outcome prediction. To investigate this problem in more detail, two different methods to predict tissue outcome using a k-nearest-neighbor approach were developed in this work and evaluated based on 18 datasets of acute stroke patients with known tissue outcome. The first method integrates apparent diffusion coefficient and perfusion parameter (Tmax, MTT, CBV, CBF) information for the voxel-wise prediction, while the second method also employs apparent diffusion coefficient information but uses the complete perfusion information, in the form of the voxel-wise residue functions, instead of the perfusion parameter maps. Overall, the comparison of the results of the two prediction methods for the 18 patients using a leave-one-out cross-validation revealed no considerable differences. Quantitatively, the parameter-based prediction of tissue outcome led to a mean Dice coefficient of 0.474, while the prediction using the residue functions led to a mean Dice coefficient of 0.461.
Thus, it may be concluded from the results of this study that the perfusion parameter maps typically derived from PWI datasets include all valuable perfusion information required for a voxel-based tissue outcome prediction, while the complete analysis of the residue functions does not add further benefits for the voxel-wise tissue outcome prediction and is also computationally more expensive.
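The evaluation above rests on two simple components: a k-nearest-neighbor vote over per-voxel feature vectors and the Dice overlap between predicted and true lesion masks. A minimal pure-Python sketch of both; the feature values, labels and k below are illustrative toys, not the paper's settings:

```python
def knn_predict(train, query, k=3):
    """Majority vote over the k nearest training samples.
    train is a list of (feature_vector, label) pairs."""
    dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b))
    nearest = sorted(train, key=lambda fl: dist(fl[0], query))[:k]
    labels = [label for _, label in nearest]
    return max(set(labels), key=labels.count)

def dice(pred, truth):
    """Dice coefficient between two sets of lesion voxel indices."""
    return 2 * len(pred & truth) / (len(pred) + len(truth))

# Toy voxels with (ADC, Tmax) features; label 1 = infarcted at follow-up.
train = [((1.0, 2.0), 0), ((1.1, 2.1), 0), ((0.4, 9.0), 1),
         ((0.5, 8.5), 1), ((0.9, 2.3), 0), ((0.45, 9.5), 1)]
print(knn_predict(train, (0.5, 9.0)))    # falls in the low-ADC/high-Tmax cluster
print(dice({1, 2, 3, 4}, {2, 3, 4, 5}))  # → 0.75
```

In the study, each test patient's voxels are classified against the voxels of the other 17 patients (leave-one-out), and Dice is computed between the predicted and the known follow-up lesion.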
A high-resolution radiation hybrid map of the bovine genome
USDA-ARS?s Scientific Manuscript database
We are building high-resolution radiation hybrid maps of all 29 bovine autosomes and chromosome X, using a 58,000-marker genotyping assay, and a 12,000-rad whole-genome radiation hybrid (RH) panel. To accommodate the large number of markers, and to automate the map building procedure, a software pip...
Online Concept Maps: Enhancing Collaborative Learning by Using Technology with Concept Maps.
ERIC Educational Resources Information Center
Canas, Alberto J.; Ford, Kenneth M.; Novak, Joseph D.; Hayes, Patrick; Reichherzer, Thomas R.; Suri, Niranjan
2001-01-01
Describes a collaborative software system that allows students from distant schools to share claims derived from their concept maps. Sharing takes place by accessing The Knowledge Soup, a repository of propositions submitted by students and stored on a computer server. Students can use propositions from other students to enhance their concept…
Perceptual Mapping Software as a Tool for Facilitating School-Based Consultation
ERIC Educational Resources Information Center
Rush, S. Craig; Kalish, Ashley; Wheeler, Joanna
2013-01-01
Perceptual mapping is a systematic method for collecting, analyzing, and presenting group perceptions that is potentially useful in consultation. With input and feedback from a consultee group, perceptual mapping allows the consultant to capture the group's collective perceptions and display them as an organized image that may foster…
Collaborative Concept Mapping Activities in a Classroom Scenario
ERIC Educational Resources Information Center
Elorriaga, J. A.; Arruarte, A.; Calvo, I.; Larrañaga, M.; Rueda, U.; Herrán, E.
2013-01-01
The aim of this study is to test collaborative concept mapping activities using computers in a classroom scenario and to evaluate the possibilities that Elkar-CM offers for collaboratively learning non-technical topics. Elkar-CM is a multi-lingual and multi-media software program designed for drawing concept maps (CMs) collaboratively. Concept…
Geographical Information Systems (GIS) Mapping of Environmental Samples across College Campuses
ERIC Educational Resources Information Center
Purvis-Roberts, Kathleen L.; Moeur, Harriet P.; Zanella, Andrew
2007-01-01
In this laboratory experiment, students take environmental samples at various locations around the college campuses, take geospatial coordinates with a global position systems (GPS) unit, and map their results on a geo-referenced campus map with geographical information systems (GIS) software. Nitrogen dioxide air pollution sampling is used as an…
The MAP program: building the digital terrain model.
R.H. Twito; R.W. Mifflin; R.J. McGaughey
1987-01-01
PLANS, a software package for integrated timber-harvest planning, uses digital terrain models to provide the topographic data needed to fit harvest and transportation designs to specific terrain. MAP, an integral program in the PLANS package, is used to construct the digital terrain models required by PLANS. MAP establishes digital terrain models using digitizer-traced...
OpenStreetMap Collaborative Prototype, Phase 1
Wolf, Eric B.; Matthews, Greg D.; McNinch, Kevin; Poore, Barbara S.
2011-01-01
Phase One of the OpenStreetMap Collaborative Prototype (OSMCP) attempts to determine whether the open-source software developed for OpenStreetMap (OSM, http://www.openstreetmap.org) can be used for data contributions and improvements that meet or exceed the requirements for integration into The National Map (http://www.nationalmap.gov). OpenStreetMap Collaborative Prototype Phase One focused on road data aggregated at the state level by the Kansas Data Access and Support Center (DASC). Road data from the DASC were loaded into a system hosted by the U.S. Geological Survey (USGS) National Geospatial Technical Operations Center (NGTOC) in Rolla, Missouri. U.S. Geological Survey editing specifications were developed by NGTOC personnel (J. Walters and G. Matthews, USGS, unpub. report, 2010). Interstate and U.S. Highways in the dataset were edited to the specifications by NGTOC personnel, while State roads were edited by DASC personnel. The resulting data were successfully improved to meet standards for The National Map once the system and specifications were in place. The OSM software proved effective in providing a usable platform for collaborative data editing.
SBML-SAT: a systems biology markup language (SBML) based sensitivity analysis tool
Zi, Zhike; Zheng, Yanan; Rundell, Ann E; Klipp, Edda
2008-01-01
Background It has long been recognized that sensitivity analysis plays a key role in modeling and analyzing cellular and biochemical processes. The systems biology markup language (SBML) has become a well-known platform for coding and sharing mathematical models of such processes. However, current SBML-compatible software tools are limited in their ability to perform global sensitivity analyses of these models. Results This work introduces a freely downloadable software package, SBML-SAT, which implements algorithms for simulation, steady-state analysis, robustness analysis, and local and global sensitivity analysis for SBML models. This software tool extends current capabilities through its execution of global sensitivity analyses using multi-parametric sensitivity analysis, partial rank correlation coefficients, Sobol's method and weighted averages of local sensitivity analyses, its ability to handle systems with discontinuous events, and its intuitive graphical user interface. Conclusion SBML-SAT provides the community of systems biologists a new tool for the analysis of their SBML models of biochemical and cellular processes. PMID:18706080
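Of the global methods listed, the partial rank correlation coefficient is the simplest to sketch: rank-transform the sampled parameters and the model output, then correlate one parameter with the output while controlling for the others. A pure-Python sketch for the two-parameter case on an invented toy model (SBML-SAT itself handles arbitrarily many parameters, typically via regression residuals):

```python
def ranks(values):
    """1-based ranks (assumes no ties, as with continuously sampled
    parameters)."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0] * len(values)
    for rank, i in enumerate(order, start=1):
        r[i] = rank
    return r

def pearson(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

def prcc(x, y, z):
    """Partial rank correlation of parameter x with output y,
    controlling for the other parameter z."""
    rxy = pearson(ranks(x), ranks(y))
    rxz = pearson(ranks(x), ranks(z))
    ryz = pearson(ranks(y), ranks(z))
    return (rxy - rxz * ryz) / ((1 - rxz**2) ** 0.5 * (1 - ryz**2) ** 0.5)

# Toy model output dominated by parameter x1; x2 contributes only weakly.
x1 = list(range(1, 11))
x2 = [3, 1, 4, 1.5, 5, 9, 2, 6, 5.5, 3.5]
y = [2 * a + 0.1 * b for a, b in zip(x1, x2)]
print(round(prcc(x1, y, x2), 2))  # x1 strongly controls the output
```

A PRCC near ±1 flags a parameter whose monotone influence on the output survives after the other parameters' effects are removed, which is what makes it useful for ranking parameter importance in global sensitivity analysis.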
Kasnakoğlu, Coşku
2016-01-01
Some level of uncertainty is unavoidable in acquiring the mass, geometry parameters and stability derivatives of an aerial vehicle. In certain instances, tiny perturbations of these could potentially cause considerable variations in flight characteristics. This research considers the impact of varying these parameters altogether, a generalization of examining the effects of particular parameters on selected modes found in the existing literature. Conventional autopilot designs commonly assume that each flight channel is independent and develop single-input single-output (SISO) controllers for each one, which are then used in parallel for actual flight. It is demonstrated that an attitude controller built this way can function flawlessly in separate nominal cases, but can become unstable under a perturbation of no more than 2%. Two robust multi-input multi-output (MIMO) design strategies, specifically loop-shaping and μ-synthesis, are outlined as potential substitutes and are observed to handle large parametric changes of 30% while preserving decent performance. Duplicating the loop-shaping procedure for the outer loop, a complete flight control system is formed. It is confirmed through software-in-the-loop (SIL) verification utilizing blade element theory (BET) that the autopilot is capable of navigation and landing when exposed to high parametric variations and strong winds.
Turkbey, Baris; Xu, Sheng; Kruecker, Jochen; Locklin, Julia; Pang, Yuxi; Shah, Vijay; Bernardo, Marcelino; Baccala, Angelo; Rastinehad, Ardeshir; Benjamin, Compton; Merino, Maria J; Wood, Bradford J; Choyke, Peter L; Pinto, Peter A
2011-03-29
During transrectal ultrasound (TRUS)-guided prostate biopsies, the actual location of the biopsy site is rarely documented. Here, we demonstrate the capability of TRUS-magnetic resonance imaging (MRI) image fusion to document the biopsy site and correlate biopsy results with multi-parametric MRI findings. Fifty consecutive patients (median age 61 years) with a median prostate-specific antigen (PSA) level of 5.8 ng/ml underwent 12-core TRUS-guided biopsy of the prostate. Pre-procedural T2-weighted magnetic resonance images were fused to TRUS. A disposable needle guide with miniature tracking sensors was attached to the TRUS probe to enable fusion with MRI. Real-time TRUS images during biopsy and the corresponding tracking information were recorded. Each biopsy site was superimposed onto the MRI. Each biopsy site was classified as positive or negative for cancer based on the results of each MRI sequence. Sensitivity, specificity, and receiver operating characteristic (ROC) area under the curve (AUC) values were calculated for multi-parametric MRI. Gleason scores for each multi-parametric MRI pattern were also evaluated. Six hundred and five systematic biopsy cores were analyzed in 50 patients, of whom 20 patients had 56 positive cores. MRI identified 34 of the 56 positive cores. Overall sensitivity, specificity, and ROC area values for multi-parametric MRI were 0.607, 0.727 and 0.667, respectively. TRUS-MRI fusion after biopsy can be used to document the location of each biopsy site, which can then be correlated with MRI findings. Based on correlation with tracked biopsies, T2-weighted MRI and apparent diffusion coefficient maps derived from diffusion-weighted MRI are the most sensitive sequences, whereas the addition of delayed contrast-enhanced MRI and three-dimensional magnetic resonance spectroscopy demonstrated higher specificity, consistent with results obtained using radical prostatectomy specimens.
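The reported per-core sensitivity follows directly from the counts given in the abstract (34 MRI-detected of 56 cancer-positive cores); a quick arithmetic check:

```python
def sensitivity(true_pos, false_neg):
    """Per-core sensitivity = TP / (TP + FN)."""
    return true_pos / (true_pos + false_neg)

# 56 positive cores, 34 identified by multi-parametric MRI -> 22 missed.
print(round(sensitivity(34, 22), 3))  # → 0.607
```

The specificity of 0.727 would analogously be TN / (TN + FP) over the remaining negative cores, but the abstract does not break those counts out, so it cannot be re-derived here.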
Time-efficient high-resolution whole-brain three-dimensional macromolecular proton fraction mapping
Yarnykh, Vasily L.
2015-01-01
Purpose Macromolecular proton fraction (MPF) mapping is a quantitative MRI method that reconstructs parametric maps of the relative amount of macromolecular protons causing the magnetization transfer (MT) effect and provides a biomarker of myelination in neural tissues. This study aimed to develop a high-resolution whole-brain MPF mapping technique utilizing the minimum possible number of source images to reduce scan time. Methods The described technique is based on replacing the actually acquired reference image without MT saturation with a synthetic one reconstructed from R1 and proton density maps, thus requiring only three source images. This approach enabled whole-brain three-dimensional MPF mapping with an isotropic 1.25×1.25×1.25 mm3 voxel size and a scan time of 20 minutes. The synthetic reference method was validated against standard MPF mapping with acquired reference images based on data from 8 healthy subjects. Results Mean MPF values in segmented white and gray matter appeared in close agreement, with no significant bias and small within-subject coefficients of variation (<2%). High-resolution MPF maps demonstrated sharp white-gray matter contrast and clear visualization of anatomical details, including gray matter structures with high iron content. Conclusions The synthetic reference method improves the resolution of MPF mapping and combines accurate MPF measurements with unique neuroanatomical contrast features. PMID:26102097
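A synthetic no-MT reference of this kind can be understood through the standard spoiled-gradient-echo (Ernst) signal equation, which predicts the reference signal voxel-wise from the R1 and proton density maps; the sequence parameters and map values below are illustrative assumptions, not the paper's protocol:

```python
import numpy as np

def spgr_signal(pd, r1, tr, flip_deg):
    """Standard spoiled-gradient-echo (Ernst) signal equation, used here to
    synthesize a no-MT reference image from proton-density and R1 maps."""
    a = np.deg2rad(flip_deg)
    e1 = np.exp(-tr * r1)
    return pd * np.sin(a) * (1.0 - e1) / (1.0 - np.cos(a) * e1)

# Illustrative voxel values (not from the paper): PD map, R1 map in 1/s,
# TR in seconds, flip angle in degrees
pd_map = np.array([0.8, 1.0])
r1_map = np.array([1.0, 0.6])   # roughly white vs. gray matter
ref = spgr_signal(pd_map, r1_map, tr=0.021, flip_deg=10.0)
```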
Part of the CAMEO suite, MARPLOT® is a mapping application that people can use to quickly create, view, and modify maps. Users can create their own objects in MARPLOT (e.g., facilities, schools, response assets) and display them on top of a basemap.
Drainage identification analysis and mapping, phase 2 : technical brief.
DOT National Transportation Integrated Search
2017-01-01
This research studied, tested and rectified the compatibility issue related to the recent upgrades of : NJDOT vendor inspection software, and uploaded all collected data to make Drainage Identification : Analysis and Mapping System (DIAMS) current an...
Analysis of survival in breast cancer patients by using different parametric models
NASA Astrophysics Data System (ADS)
Enera Amran, Syahila; Asrul Afendi Abdullah, M.; Kek, Sie Long; Afiqah Muhamad Jamil, Siti
2017-09-01
In biomedical applications and clinical trials, right censoring often arises when studying time-to-event data: some individuals are still alive at the end of the study or are lost to follow-up at a certain time. Handling censored data properly is important to prevent biased results in the analysis. Therefore, this study analyzed right-censored data with three different parametric models: the exponential, Weibull, and log-logistic models. Data from breast cancer patients at Hospital Sultan Ismail, Johor Bahru, from 30 December 2008 until 15 February 2017 were used to illustrate right censoring. The covariates included in this study are the survival time t of each breast cancer patient, the patient's age X1, and the treatment given to the patient X2. To determine the best parametric model for analysing the survival of breast cancer patients, the performance of each model was compared based on the Akaike Information Criterion (AIC), the Bayesian Information Criterion (BIC), and the log-likelihood value, using the statistical software R. When analysing the breast cancer data, all three distributions were consistent with the data, with the line graph of the cumulative hazard function resembling a straight line through the origin. As a result, the log-logistic model was the best-fitting parametric model compared with the exponential and Weibull models, since it had the smallest AIC and BIC values and the largest log-likelihood.
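As a minimal illustration of the model-comparison step, the exponential model has a closed-form maximum-likelihood fit under right censoring, from which AIC and BIC follow directly; the data below are hypothetical, and the Weibull and log-logistic fits (which need numerical optimization) are omitted:

```python
import math

def exponential_fit(times, events):
    """MLE for an exponential survival model with right censoring.
    events[i] = 1 if the event was observed, 0 if censored.
    Closed form: lambda_hat = (#events) / (total follow-up time)."""
    d = sum(events)
    total = sum(times)
    lam = d / total
    loglik = d * math.log(lam) - lam * total
    aic = 2 * 1 - 2 * loglik                 # k = 1 parameter
    bic = math.log(len(times)) * 1 - 2 * loglik
    return lam, loglik, aic, bic

# Hypothetical survival times (months) and censoring indicators
times = [5.0, 12.0, 20.0, 30.0, 45.0, 60.0]
events = [1, 1, 1, 0, 1, 0]   # two patients censored at study end
lam, ll, aic, bic = exponential_fit(times, events)
```

Fitting each candidate model this way and keeping the one with the smallest AIC/BIC mirrors the selection procedure described above.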
Cortical Cartography and Caret Software
Van Essen, David C.
2011-01-01
Caret software is widely used for analyzing and visualizing many types of fMRI data, often in conjunction with experimental data from other modalities. This article places Caret’s development in a historical context that spans three decades of brain mapping – from the early days of manually generated flat maps to the nascent field of human connectomics. It also highlights some of Caret’s distinctive capabilities. This includes the ease of visualizing data on surfaces and/or volumes and on atlases as well as individual subjects. Caret can display many types of experimental data using various combinations of overlays (e.g., fMRI activation maps, cortical parcellations, areal boundaries), and it has other features that facilitate the analysis and visualization of complex neuroimaging datasets. PMID:22062192
A Semiparametric Approach for Composite Functional Mapping of Dynamic Quantitative Traits
Yang, Runqing; Gao, Huijiang; Wang, Xin; Zhang, Ji; Zeng, Zhao-Bang; Wu, Rongling
2007-01-01
Functional mapping has emerged as a powerful tool for mapping quantitative trait loci (QTL) that control developmental patterns of complex dynamic traits. Original functional mapping has been constructed within the context of simple interval mapping, without consideration of separate multiple linked QTL for a dynamic trait. In this article, we present a statistical framework for mapping QTL that affect dynamic traits by capitalizing on the strengths of functional mapping and composite interval mapping. Within this so-called composite functional-mapping framework, functional mapping models the time-dependent genetic effects of a QTL tested within a marker interval using a biologically meaningful parametric function, whereas composite interval mapping models the time-dependent genetic effects of the markers outside the test interval to control the genome background using a flexible nonparametric approach based on Legendre polynomials. Such a semiparametric framework was formulated by a maximum-likelihood model and implemented with the EM algorithm, allowing for the estimation and the test of the mathematical parameters that define the QTL effects and the regression coefficients of the Legendre polynomials that describe the marker effects. Simulation studies were performed to investigate the statistical behavior of composite functional mapping and compare its advantage in separating multiple linked QTL as compared to functional mapping. We used the new mapping approach to analyze a genetic mapping example in rice, leading to the identification of multiple QTL, some of which are linked on the same chromosome, that control the developmental trajectory of leaf age. PMID:17947431
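The nonparametric marker-effect component can be sketched as a Legendre-polynomial regression over time; the simulated trait trajectory below is an illustration, not the rice data:

```python
import numpy as np

# Fit a time-varying marker effect with Legendre polynomials, as in the
# composite interval mapping component (simulated data, not the rice example).
rng = np.random.default_rng(0)
t = np.linspace(-1, 1, 50)                 # measurement times scaled to [-1, 1]
# True trajectory: 0.5*P0 + 1.2*P1 - 0.8*P2, plus observation noise
y = 0.5 + 1.2 * t - 0.8 * (1.5 * t**2 - 0.5) + rng.normal(0, 0.05, t.size)

order = 2
X = np.polynomial.legendre.legvander(t, order)   # columns P0(t), P1(t), P2(t)
coef, *_ = np.linalg.lstsq(X, y, rcond=None)     # Legendre regression coefficients
fitted = X @ coef
```

The recovered coefficients approximate the generating values (0.5, 1.2, -0.8), showing how a low-order Legendre basis captures a smooth time-dependent effect.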
DOE Office of Scientific and Technical Information (OSTI.GOV)
BIRKEL, GARRETT; GARCIA MARTIN, HECTOR; MORRELL, WILLIAM
"Arrowland" is a web-based software application primarily for mapping, integrating and visualizing a variety of metabolism data of living organisms, including but not limited to metabolomics, proteomics, transcriptomics and fluxomics. The application makes multi-omics data analysis intuitive and interactive. It improves data sharing and communication by enabling users to visualize their omics data in a web browser (on a PC or mobile device). It increases users' productivity by simplifying multi-omics data analysis, using well-developed maps as a guide. Users of this tool can gain insights into their data sets that would be difficult or even impossible to tease out by looking at raw numbers or by using their existing toolchains to generate static single-use maps. Arrowland helps users save time by visualizing relative changes across different conditions or over time, helping them produce significant insights faster. Preexisting maps decrease the learning curve for beginners in the omics field. Sets of multi-omics data are presented in the browser as a two-dimensional flowchart resembling a map, with varying levels of detail based on the scaling of the map. Users can pan and zoom to explore different maps, compare maps, upload their own research data sets onto desired maps, alter map appearance in ways that facilitate interpretation, visualization and analysis of the given data, and export data, reports and actionable items.
Paraboloid-aspheric lenses free of spherical aberration
NASA Astrophysics Data System (ADS)
Lozano-Rincón, Ninfa del C.; Valencia-Estrada, Juan Camilo
2017-07-01
A method to design singlet paraboloid-aspheric lenses free of all orders of spherical aberration with maximum aperture is described. This work includes all the parametric formulas describing paraboloid-aspheric or aspheric-paraboloid lenses for any finite conjugate planes. It also includes the Schwarzschild approximations (which can be used to calculate a rigorous propagation of light waves in physical optics) to design convex paraboloid-aspheric lenses for imaging an object at infinity, with explicit formulas to calculate thicknesses easily. The results were verified with ray-tracing software.
Improvement of calculation method for electrical parameters of short network of ore-thermal furnaces
NASA Astrophysics Data System (ADS)
Aliferov, A. I.; Bikeev, R. A.; Goreva, L. P.
2017-10-01
The paper describes a new method for calculating the active and inductive resistances of split interleaved current-lead packages in ore-thermal electric furnaces. The method is developed on the basis of a regression analysis of the dependencies of the packages' active and inductive resistances on their geometrical parameters, mutual disposition and interleaving pattern. These multi-parametric calculations were performed with the ANSYS software. The proposed method makes it possible to minimize and balance the electrical parameters of split current leads in ore-thermal furnaces.
Boltzmann sampling for an XY model using a non-degenerate optical parametric oscillator network
NASA Astrophysics Data System (ADS)
Takeda, Y.; Tamate, S.; Yamamoto, Y.; Takesue, H.; Inagaki, T.; Utsunomiya, S.
2018-01-01
We present an experimental scheme for implementing multiple spins of a classical XY model using a non-degenerate optical parametric oscillator (NOPO) network. We built an NOPO network to simulate a one-dimensional XY Hamiltonian with 5000 spins and externally controllable effective temperatures. The XY spin variables in our scheme are mapped onto the phases of multiple NOPO pulses in a single ring cavity, and interactions between XY spins are implemented by mutual injections between NOPOs. We show that the steady-state distribution of the optical phases of such NOPO pulses is equivalent to the Boltzmann distribution of the corresponding XY model. The estimated effective temperatures converged to the set values, and the estimated temperatures and mean energy exhibited good agreement with numerical simulations of the Langevin dynamics of the NOPO phases.
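The claimed equivalence can be checked numerically: overdamped Langevin dynamics of XY phases has the Boltzmann distribution as its stationary law. A small sketch with illustrative parameters (a much shorter chain than the 5000-spin experiment):

```python
import numpy as np

# Overdamped Langevin dynamics of a 1-D XY ring, whose stationary distribution
# is the Boltzmann distribution exp(-E/T). Parameters are illustrative.
rng = np.random.default_rng(1)
n, J, T, dt, steps = 200, 1.0, 0.5, 0.01, 5000
theta = rng.uniform(0.0, 2.0 * np.pi, n)     # random initial phases

def energy(phases):
    """XY energy E = -J * sum_i cos(theta_i - theta_{i+1}) on a ring."""
    return -J * np.sum(np.cos(phases - np.roll(phases, 1)))

for _ in range(steps):
    drift = -J * (np.sin(theta - np.roll(theta, 1)) +
                  np.sin(theta - np.roll(theta, -1)))   # -dE/dtheta
    theta += drift * dt + np.sqrt(2.0 * T * dt) * rng.normal(size=n)

e_final = energy(theta) / n   # relaxes toward the Boltzmann mean (about -0.7 here)
```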
Comparison of Numerical Analyses with a Static Load Test of a Continuous Flight Auger Pile
NASA Astrophysics Data System (ADS)
Hoľko, Michal; Stacho, Jakub
2014-12-01
The article deals with numerical analyses of a Continuous Flight Auger (CFA) pile. The analyses include a comparison of calculated and measured load-settlement curves as well as a comparison of the load distribution over the pile's length. The numerical analyses were executed using two FEM-based programs, Ansys and Plaxis. The two programs differ in how they create numerical models, how they model the interface between the pile and the soil, and in the constitutive material models they use. The analyses were prepared in the form of a parametric study in which the method of modelling the interface and the material models of the soil are compared and analysed. Our analyses show that both programs permit the modelling of pile foundations. The Plaxis software offers advanced material models as well as the modelling of the impact of groundwater or overconsolidation. The load-settlement curve calculated using Plaxis matches the results of the static load test with better than 95 % accuracy. In comparison, the load-settlement curve calculated using Ansys provides only an approximate estimate, but the software allows large structural systems to be modelled together with their foundation system.
A method for tailoring the information content of a software process model
NASA Technical Reports Server (NTRS)
Perkins, Sharon; Arend, Mark B.
1990-01-01
The framework is defined for a general method for selecting a necessary and sufficient subset of a general software life cycle's information products to support a new software development process. Procedures for characterizing problem domains in general and mapping them to a tailored set of life cycle processes and products are presented. An overview of the method is shown using the following steps: (1) During the problem concept definition phase, perform standardized interviews and dialogs between developer and user, and between user and customer; (2) Generate a quality needs profile of the software to be developed, based on information gathered in step 1; (3) Translate the quality needs profile into a profile of quality criteria that must be met by the software to satisfy the quality needs; (4) Map the quality criteria to a set of accepted processes and products for achieving each criterion; (5) Select the information products which match or support the accepted processes and products of step 4; and (6) Select the design methodology which produces the information products selected in step 5.
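Steps 3-5 amount to chained table lookups from quality needs to criteria to accepted products; a hypothetical sketch (all names below are invented for illustration, not taken from the method):

```python
# Hypothetical mapping tables for steps 3-5 of the tailoring method
# (the need/criterion/product names are illustrative, not from the paper).
needs_to_criteria = {
    "reliability": ["fault tolerance", "error recovery"],
    "maintainability": ["modularity", "self-descriptiveness"],
}
criteria_to_products = {
    "fault tolerance": ["failure-mode analysis report"],
    "error recovery": ["exception-handling design spec"],
    "modularity": ["module decomposition document"],
    "self-descriptiveness": ["coding-standard checklist"],
}

def select_products(quality_needs):
    """Chain the lookups: quality needs -> criteria -> information products."""
    products = []
    for need in quality_needs:
        for criterion in needs_to_criteria.get(need, []):
            products.extend(criteria_to_products.get(criterion, []))
    return sorted(set(products))

docs = select_products(["reliability"])
```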
Archfield, Stacey A.; Steeves, Peter A.; Guthrie, John D.; Ries, Kernell G.
2013-01-01
Streamflow information is critical for addressing any number of hydrologic problems. Often, streamflow information is needed at locations that are ungauged and, therefore, have no observations on which to base water management decisions. Furthermore, there has been an increasing need for daily streamflow time series to manage rivers for both human and ecological functions. To facilitate negotiation between human and ecological demands for water, this paper presents the first publicly available, map-based, regional software tool to estimate historical, unregulated, daily streamflow time series (streamflow not affected by human alteration such as dams or water withdrawals) at any user-selected ungauged river location. The map interface allows users to locate and click on a river location, which then links to a spreadsheet-based program that computes estimates of daily streamflow for the river location selected. For a demonstration region in the northeast United States, daily streamflow was, in general, shown to be reliably estimated by the software tool. Estimating the highest and lowest streamflows that occurred in the demonstration region over the period from 1960 through 2004 also was accomplished, but with more difficulty and limitations. The software tool provides a general framework that can be applied to other regions for which daily streamflow estimates are needed.
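For context, the simplest widely used way to transfer daily streamflow from a gauged to an ungauged site is the drainage-area ratio method; the tool described above uses a more sophisticated regional approach, so this sketch (with hypothetical flows and areas) is illustrative only:

```python
# Drainage-area ratio transfer of a daily streamflow series from a gauged
# site to an ungauged site. All values below are hypothetical.
def drainage_area_ratio(q_gauged, area_gauged_km2, area_ungauged_km2):
    """Scale each daily flow by the ratio of ungauged to gauged drainage area."""
    ratio = area_ungauged_km2 / area_gauged_km2
    return [q * ratio for q in q_gauged]

# One hypothetical week of daily flows (m^3/s) at a gauge draining 120 km^2,
# transferred to an ungauged site draining 60 km^2
q_est = drainage_area_ratio([3.2, 2.9, 2.7, 5.1, 4.4, 3.8, 3.3], 120.0, 60.0)
```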
On-Line Planning and Mapping for Chemical Plume Tracing
2004-06-01
Final Report, 01/04/2001 - 31/10/2004. Performing organization: Department of Electrical Engineering, University of California. The objective was to develop and implement on-vehicle planning and mapping theory and software to find, trace, and map chemical plumes. This objective included accurate
Perea Palazón, R J; Solé Arqués, M; Prat González, S; de Caralt Robira, T M; Cibeira López, M T; Ortiz Pérez, J T
2015-01-01
Cardiac magnetic resonance imaging is considered the reference technique for characterizing myocardial tissue; for example, T2-weighted sequences make it possible to evaluate areas of edema or myocardial inflammation. However, traditional sequences have many limitations and provide only qualitative information. Moreover, traditional sequences depend on the reference to remote myocardium or skeletal muscle, which limits their ability to detect and quantify diffuse myocardial damage. Recently developed magnetic resonance myocardial mapping techniques enable quantitative assessment of parameters indicative of edema. These techniques have proven better than traditional sequences both in acute cardiomyopathy and in acute ischemic heart disease. This article synthesizes current developments in T2 mapping as well as their clinical applications and limitations. Copyright © 2014 SERAM. Published by Elsevier España, S.L.U. All rights reserved.
Improving Critical Thinking Using Web Based Argument Mapping Exercises with Automated Feedback
ERIC Educational Resources Information Center
Butchart, Sam; Forster, Daniella; Gold, Ian; Bigelow, John; Korb, Kevin; Oppy, Graham; Serrenti, Alexandra
2009-01-01
In this paper we describe a simple software system that allows students to practise their critical thinking skills by constructing argument maps of natural language arguments. As the students construct their maps of an argument, the system provides automatic, real time feedback on their progress. We outline the background and theoretical framework…
Conceptual Maps for Training Tutors in the Distance Learning of Business Administration Course
ERIC Educational Resources Information Center
Mendes, Elise; Jordão de Carvalho, Claudinê; Gargiulo, Victor; da Mota Alves, João Bosco
2014-01-01
This article aims at reporting on the process of tutors training for the planning of distance education at the undergraduate Administration course at the Federal University of Uberlandia-Brazil. It describes a participatory research training of tutors in the use of concept mapping (CM) and concept mapping software to encourage individual…
Evaluating kriging as a tool to improve moderate resolution maps of forest biomass
Elizabeth A. Freeman; Gretchen G. Moisen
2007-01-01
The USDA Forest Service, Forest Inventory and Analysis program (FIA) recently produced a nationwide map of forest biomass by modeling biomass collected on forest inventory plots as nonparametric functions of moderate resolution satellite data and other environmental variables using Cubist software. Efforts are underway to develop methods to enhance this initial map. We...
Enabling cost-effective multimodal trip planners through open transit data.
DOT National Transportation Integrated Search
2011-05-01
This study examined whether multimodal trip planners can be developed using open-source software and open data sources. OpenStreetMap (OSM), maintained by the nonprofit OpenStreetMap Foundation, is an open, freely available international rep...
Testing light-traces-mass in Hubble Frontier Fields Cluster MACS-J0416.1-2403
Sebesta, Kevin; Williams, Liliya L. R.; Mohammed, Irshad; ...
2016-06-17
Here, we reconstruct the projected mass distribution of a massive merging Hubble Frontier Fields cluster MACSJ0416 using the genetic algorithm based free-form technique called Grale. The reconstructions are constrained by 149 lensed images identified by Jauzac et al. using HFF data. No information about cluster galaxies or light is used, which makes our reconstruction unique in this regard. Using visual inspection of the maps, as well as galaxy-mass correlation functions, we conclude that overall light does follow mass. Furthermore, the fact that brighter galaxies are more strongly clustered with mass is an important confirmation of the standard biasing scenario in galaxy clusters. On the smallest scales, less than a few arcseconds, the resolution afforded by 149 images is still not sufficient to confirm or rule out galaxy-mass offsets of the kind observed in ACO 3827. We also compare the mass maps of MACSJ0416 obtained by three different groups: Grale, and two parametric Lenstool reconstructions from the CATS and Sharon/Johnson teams. Overall, the three agree well; one interesting discrepancy between Grale and Lenstool galaxy-mass correlation functions occurs on scales of tens of kpc and may suggest that cluster galaxies are more biased tracers of mass than parametric methods generally assume.
LORETA imaging of P300 in schizophrenia with individual MRI and 128-channel EEG.
Pae, Ji Soo; Kwon, Jun Soo; Youn, Tak; Park, Hae-Jeong; Kim, Myung Sun; Lee, Boreom; Park, Kwang Suk
2003-11-01
We investigated the characteristics of P300 generators in schizophrenics by using voxel-based statistical parametric mapping of current density images. P300 generators, produced by a rare target tone of 1500 Hz (15%) under a frequent nontarget tone of 1000 Hz (85%), were measured in 20 right-handed schizophrenics and 21 controls. Low-resolution electromagnetic tomography (LORETA), using a realistic head model of the boundary element method based on individual MRI, was applied to the 128-channel EEG. Three-dimensional current density images were reconstructed from the LORETA intensity maps that covered the whole cortical gray matter. Spatial normalization and intensity normalization of the smoothed current density images were used to reduce anatomical variance and subject-specific global activity, and statistical parametric mapping (SPM) was applied for the statistical analysis. We found that the sources of P300 were consistently localized at the left superior parietal area in normal subjects, while those of schizophrenics were diversely distributed. Upon statistical comparison, schizophrenics, with globally reduced current densities, showed a significant P300 current density reduction in the left medial temporal area and in the left inferior parietal area, while both left prefrontal and right orbitofrontal areas were relatively activated. The left parietotemporal area was found to correlate negatively with Positive and Negative Syndrome Scale total scores of schizophrenic patients. In conclusion, the areas of reduced and increased current density in schizophrenic patients suggest that the medial temporal and frontal areas contribute to the pathophysiology of schizophrenia through a frontotemporal circuitry abnormality.
Breast tumour visualization using 3D quantitative ultrasound methods
NASA Astrophysics Data System (ADS)
Gangeh, Mehrdad J.; Raheem, Abdul; Tadayyon, Hadi; Liu, Simon; Hadizad, Farnoosh; Czarnota, Gregory J.
2016-04-01
Breast cancer is one of the most common cancer types, accounting for 29% of all cancer cases. Early detection and treatment have a crucial impact on improving the survival of affected patients. Ultrasound (US) is a non-ionizing, portable, inexpensive, real-time imaging modality for screening and quantifying breast cancer. Due to these attractive attributes, the last decade has witnessed many studies on using quantitative ultrasound (QUS) methods in tissue characterization. However, these studies have mainly been limited to 2-D QUS methods using hand-held US (HHUS) scanners. With the availability of automated breast ultrasound (ABUS) technology, this study is the first to develop 3-D QUS methods for the ABUS visualization of breast tumours. Using an ABUS system, unlike the manual 2-D HHUS device, the patient's whole breast was scanned in an automated manner. The acquired frames were subsequently examined, and a region of interest (ROI) was selected in each frame where a tumour was identified. Standard 2-D QUS methods were used to compute spectral and backscatter coefficient (BSC) parametric maps on the selected ROIs. Next, the computed 2-D parameters were mapped to a Cartesian 3-D space, interpolated, and rendered to provide a transparent color-coded visualization of the entire breast tumour. Such 3-D visualization can potentially be used for further analysis of breast tumours in terms of their size and extension. Moreover, the 3-D volumetric scans can be used for tissue characterization and the categorization of breast tumours as benign or malignant by quantifying the computed parametric maps over the whole tumour volume.
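The mapping of per-frame 2-D parametric maps into a 3-D volume can be sketched as slice stacking with linear interpolation along the scan axis; the array sizes are illustrative, and the actual ABUS rendering pipeline is not specified in the abstract:

```python
import numpy as np

# Stack per-frame 2-D parametric maps into a volume and linearly interpolate
# intermediate slices along the scan axis (illustrative sketch only).
def interpolate_slices(maps, upsample=2):
    """maps: (n_slices, h, w) array; returns a volume with linearly
    interpolated slices inserted along axis 0."""
    maps = np.asarray(maps, dtype=float)
    n = maps.shape[0]
    z_new = np.linspace(0, n - 1, (n - 1) * upsample + 1)
    lo = np.floor(z_new).astype(int)
    hi = np.minimum(lo + 1, n - 1)
    w = (z_new - lo)[:, None, None]          # interpolation weights per new slice
    return (1 - w) * maps[lo] + w * maps[hi]

# Three 4x4 parametric maps with constant values 0, 1, 2 along the scan axis
vol = interpolate_slices(np.zeros((3, 4, 4)) + np.arange(3)[:, None, None])
```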
Suzuki, Satoshi
2017-09-01
This study investigated the spatial distribution of brain activity on body schema (BS) modification induced by natural body motion using two versions of a hand-tracing task. In Task 1, participants traced Japanese Hiragana characters using the right forefinger, requiring no BS expansion. In Task 2, participants performed the tracing task with a long stick, requiring BS expansion. Spatial distribution was analyzed using general linear model (GLM)-based statistical parametric mapping of near-infrared spectroscopy data contaminated with motion artifacts caused by the hand-tracing task. Three methods were utilized in series to counter the artifacts, and optimal conditions and modifications were investigated: a model-free method (Step 1), a convolution matrix method (Step 2), and a boxcar-function-based Gaussian convolution method (Step 3). The results revealed four methodological findings: (1) Deoxyhemoglobin was suitable for the GLM because both the Akaike information criterion and the variance against the averaged hemodynamic response function were smaller than for other signals, (2) a high-pass filter with a cutoff frequency of 0.014 Hz was effective, (3) the hemodynamic response function computed from a Gaussian kernel function and its first- and second-derivative terms should be included in the GLM, and (4) correction of non-autocorrelation and use of effective degrees of freedom were critical. Investigating z-maps computed according to these guidelines revealed that contiguous areas of BA7-BA40-BA21 in the right hemisphere became significantly activated ([Formula: see text], [Formula: see text], and [Formula: see text], respectively) during BS modification while performing the hand-tracing task.
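Guideline (3), an HRF built from a Gaussian kernel plus its first- and second-derivative terms in the GLM, can be sketched as follows; the block timing, kernel width, and synthetic signal are illustrative assumptions, not the study's values:

```python
import numpy as np

# GLM design sketch: a task boxcar convolved with a Gaussian kernel, plus its
# first and second derivatives, fit by least squares (illustrative parameters).
fs, dur = 10.0, 60.0                       # sampling rate (Hz), run length (s)
t = np.arange(0, dur, 1 / fs)
boxcar = ((t % 20) < 10).astype(float)     # 10 s task / 10 s rest blocks

g = np.exp(-0.5 * (np.arange(-30, 30, 1 / fs) / 2.0) ** 2)   # Gaussian, sigma = 2 s
hrf = np.convolve(boxcar, g / g.sum(), mode="same")          # smoothed regressor

# Design matrix: HRF, its first and second derivatives, and a constant term
X = np.column_stack([hrf, np.gradient(hrf), np.gradient(np.gradient(hrf)),
                     np.ones_like(hrf)])

rng = np.random.default_rng(0)
y = 2.0 * hrf + 0.3 + rng.normal(0, 0.05, t.size)   # synthetic deoxy-Hb signal
beta, *_ = np.linalg.lstsq(X, y, rcond=None)        # GLM coefficient estimates
```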
Multi-crop area estimation and mapping on a microprocessor/mainframe network
NASA Technical Reports Server (NTRS)
Sheffner, E.
1985-01-01
The data processing system is outlined for a 1985 test aimed at determining the performance characteristics of area estimation and mapping procedures connected with the California Cooperative Remote Sensing Project. The project is a joint effort of the USDA Statistical Reporting Service-Remote Sensing Branch, the California Department of Water Resources, NASA-Ames Research Center, and the University of California Remote Sensing Research Program. One objective of the program was to study performance when data processing is done on a microprocessor/mainframe network under operational conditions. The 1985 test covered the hardware, software, and network specifications and the integration of these three components. Plans for the year - including planned completion of PEDITOR software, testing of software on MIDAS, and accomplishment of data processing on the MIDAS-VAX-CRAY network - are discussed briefly.
NASA Astrophysics Data System (ADS)
Morse, P. E.; Reading, A. M.; Lueg, C.
2014-12-01
Pattern-recognition in scientific data is not only a computational problem but a human-observer problem as well. Human observation of, and interaction with, data visualization software can augment, select, interrupt and modify computational routines and facilitate processes of pattern and significant-feature recognition for subsequent human analysis, machine learning, expert and artificial intelligence systems. 'Tagger' is a Mac OS X interactive data visualisation tool that facilitates human-computer interaction for the recognition of patterns and significant structures. It is a graphical application developed using the Quartz Composer framework. 'Tagger' follows a Model-View-Controller (MVC) software architecture: the application problem domain (the Model) is to facilitate novel ways of abstractly representing data to a human interlocutor, presenting these via different viewer modalities (e.g. chart representations, particle systems, parametric geometry) to the user (View) and enabling interaction with the data (Controller) via a variety of Human Interface Devices (HID). The software enables the user to create an arbitrary array of tags that may be appended to the visualised data, which are then saved into output files as forms of semantic metadata. Three fundamental problems that are not strongly supported by conventional scientific visualisation software are addressed: (1) how to visually animate data over time; (2) how to rapidly deploy unconventional parametrically driven data visualisations; and (3) how to construct and explore novel interaction models that capture the activity of the end-user as semantic metadata that can be used to computationally enhance subsequent interrogation. Saved tagged data files may be loaded into Tagger, so that tags may be tagged, if desired.
Recursion opens up the possibility of refining or overlapping different types of tags, tagging a variety of different points of interest (POIs) or types of events, and capturing different types of specialist observations of important or noticeable events. Other visualisations and modes of interaction will also be demonstrated, with the aim of discovering knowledge in large datasets in the natural and physical sciences. [Fig. 1: wave height data from an oceanographic Wave Rider Buoy; colours/radii are driven by wave height data.]
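A minimal sketch of how such user-applied tags might be recorded as semantic metadata and recursively tagged; the field names and schema below are illustrative assumptions, not Tagger's documented output format:

```python
import json
from dataclasses import dataclass, field, asdict

@dataclass
class Tag:
    """A user-applied annotation attached to a visualised data point.

    The field names here are illustrative assumptions; Tagger's actual
    output schema is not documented in this abstract.
    """
    label: str             # e.g. "wave-height-spike"
    frame: int             # animation frame (time step) the tag refers to
    author: str = "user"
    tags: list = field(default_factory=list)   # tags applied to this tag (recursion)

# Tag a point of interest, then tag that tag -- refining an earlier observation.
poi = Tag(label="wave-height-spike", frame=1042)
poi.tags.append(Tag(label="confirmed-by-second-observer", frame=1042))

# Serialise to semantic metadata that a later analysis run could reload.
metadata = json.dumps(asdict(poi), indent=2)
print(metadata)
```

Because tags are ordinary nested records, reloading a saved file and appending tags to existing tags requires no special casing.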
ERIC Educational Resources Information Center
Hostager, Todd J.; Voiovich, Jason; Hughes, Raymond K.
2013-01-01
The authors apply a software-based content analysis method to uncover differences in responses by expert entrepreneurs and undergraduate entrepreneur majors to a new venture investment proposal. Data analyzed via the Leximancer software package yielded conceptual maps highlighting key differences in the nature of these responses. Study methods and…
DESI-Detection of early-season invasives (software-installation manual and user's guide version 1.0)
Kokaly, Raymond F.
2011-01-01
This report describes a software system for detecting early-season invasive plant species, such as cheatgrass. The report includes instructions for installing the software and serves as a user's guide in processing Landsat satellite remote sensing data to map the distributions of cheatgrass and other early-season invasive plants. The software was developed for application to the semi-arid regions of southern Utah; however, the detection parameters can be altered by the user for application to other areas.
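The detection idea can be illustrated with a generic band-ratio sketch; the index choice (NDVI), threshold value, and function names below are illustrative assumptions, not DESI's actual detection parameters:

```python
# Illustrative sketch only: DESI's actual algorithm and parameters are given in
# the USGS report.  This shows the general idea of flagging early-season
# green-up in a Landsat scene with a band-ratio index and a user-adjustable
# threshold, analogous to DESI's alterable detection parameters.
import numpy as np

def flag_early_greenup(red, nir, ndvi_threshold=0.3):
    """Return a boolean mask of pixels whose NDVI exceeds the threshold.

    red, nir: 2-D arrays of surface reflectance from an early-season scene.
    ndvi_threshold: assumed user-alterable parameter for regional tuning.
    """
    red = red.astype(float)
    nir = nir.astype(float)
    ndvi = (nir - red) / np.clip(nir + red, 1e-6, None)  # avoid divide-by-zero
    return ndvi > ndvi_threshold

# Tiny synthetic scene: one vegetated pixel, one bare-soil pixel.
red = np.array([[0.05, 0.30]])
nir = np.array([[0.40, 0.32]])
mask = flag_early_greenup(red, nir)
print(mask)  # vegetated pixel flagged, bare-soil pixel not
```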
Concept for a common operational picture in a guidance vehicle
NASA Astrophysics Data System (ADS)
Wagner, Boris; Eck, Ralf; Maier, Sebastian
2017-05-01
A Common Operational Picture (COP) shows many operational aspects in coded form inside a geodata representation such as a map. Many specialized groups produce information for this picture: besides the operating forces, these include intelligence, logistics, and the leaders' own planning group. Operations in which a COP is used are typically disaster management or military actions. Existing software for the Interactive Visualization of Integrated Geodata runs on tablet PCs, PCs, digital map tables, and video walls. It is already used by the Deutsche Führungsakademie (German military academy) for the education of staff officers, and the German civil disaster management agency has decided to use the digital map table for its intelligence analysis. In a mobile scenario, however, novel requirements have to be taken into account to adapt the software to the new environment. This paper investigates these requirements as well as the possible adaptations needed to provide a COP across multiple players on the go. When acting together, the groups do so in a widespread manner: they are physically dispersed, and they use a variety of software and hardware to produce their contributions. This requires hardware that is ruggedized and mobile and that supports a variety of interfaces. The limited bandwidth in such a setting poses the main challenge for the software, which has to synchronize while exchanging a minimum of information. Especially for mobile participants, a solution is planned that scales the amount of data (maps/intelligence data) to the available equipment, the upcoming mission, and the underlying theatre. Special focus is laid on a guidance vehicle leading a convoy.
The AR Sandbox: Augmented Reality in Geoscience Education
NASA Astrophysics Data System (ADS)
Kreylos, O.; Kellogg, L. H.; Reed, S.; Hsi, S.; Yikilmaz, M. B.; Schladow, G.; Segale, H.; Chan, L.
2016-12-01
The AR Sandbox is a combination of a physical box full of sand, a 3D (depth) camera such as a Microsoft Kinect, a data projector, and a computer running open-source software, creating a responsive and interactive system for teaching geoscience concepts in formal or informal contexts. As one or more users shape the sand surface to create planes, hills, or valleys, the 3D camera scans the surface in real time, the software creates a dynamic topographic map including elevation color maps and contour lines, and the projector projects that map back onto the sand surface such that real and projected features match exactly. In addition, users can add virtual water to the sandbox, which realistically flows over the real surface, driven by a real-time fluid flow simulation. The AR Sandbox can teach basic geographic and hydrologic skills and concepts such as reading topographic maps, interpreting contour lines, the formation of watersheds, flooding, or surface wave propagation in a hands-on and explorative manner. AR Sandbox installations in more than 150 institutions have shown high audience engagement and long dwell times, often 20 minutes or more. In a more formal context, the AR Sandbox can be used in field trip preparation and can teach advanced geoscience skills such as extrapolating 3D sub-surface shapes from surface expression, via advanced software features such as the ability to load digital models of real landscapes and guide users towards recreating them in the sandbox. Blueprints, installation instructions, and the open-source AR Sandbox software package are available at http://arsandbox.org.
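The core contour-extraction step of such a pipeline can be sketched as follows; this is a simplified stand-in with our own function names (the actual AR Sandbox software implements contouring differently, in real time on the scanned surface):

```python
import numpy as np

def contour_cells(elevation, level):
    """Return a boolean mask of grid cells crossed by the given contour level.

    A cell is crossed when the contour level lies between the minimum and
    maximum of its four corner elevations -- the same test marching squares
    applies before tracing the actual line segments.
    """
    corners = np.stack([
        elevation[:-1, :-1], elevation[:-1, 1:],
        elevation[1:, :-1], elevation[1:, 1:],
    ])
    lo = corners.min(axis=0)
    hi = corners.max(axis=0)
    return (lo <= level) & (level <= hi)

# A small synthetic "hill" like one a user might shape in the sand.
y, x = np.mgrid[0:20, 0:20]
hill = np.exp(-((x - 10) ** 2 + (y - 10) ** 2) / 30.0)
ring = contour_cells(hill, level=0.5)
print(ring.sum(), "cells crossed by the 0.5 contour")
```

Running this test per contour level over a freshly scanned elevation grid is what lets the projected contour lines follow the sand as it is reshaped.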
NASA Astrophysics Data System (ADS)
Le Bras, Ronan; Kushida, Noriyuki; Mialle, Pierrick; Tomuta, Elena; Arora, Nimar
2017-04-01
The Preparatory Commission for the Comprehensive Nuclear-Test-Ban Treaty Organization (CTBTO) has been developing a Bayesian method and software to perform the key step of automatic association of seismological, hydroacoustic, and infrasound (SHI) parametric data. In our preliminary testing at the CTBTO, NET-VISA shows much better performance than the currently operating automatic association module, with the rate of automatic events matching analyst-reviewed events increased by 10%, signifying that the percentage of missed events is lowered by 40%. Initial tests involving analysts also showed that the new software will complete the automatic bulletins of the CTBTO by adding previously missed events. Because CTBTO products are widely distributed to its member States as well as throughout the seismological community, the introduction of a new technology must be carried out carefully; the first step of operational integration is to use NET-VISA results within the interactive analysts' software so that the analysts can check the robustness of the Bayesian approach. We report on the latest results, both on progress in automatic processing and on the initial introduction of NET-VISA results into the analyst review process.
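The two quoted rates can be reconciled with a short back-of-the-envelope calculation; the 75% baseline below is an assumed illustrative figure, not a number from the CTBTO evaluation:

```python
# Illustrative arithmetic only -- the baseline matching rate below is assumed.
# It shows how a 10-percentage-point rise in the rate of automatic events
# matching analyst-reviewed events can correspond to a 40% relative drop in
# missed events.
baseline_match = 0.75                    # assumed: 75% of analyst events found
improved_match = baseline_match + 0.10   # 85% after the improvement

baseline_missed = 1.0 - baseline_match   # 25% missed before
improved_missed = 1.0 - improved_match   # 15% missed after

relative_reduction = (baseline_missed - improved_missed) / baseline_missed
print(f"missed events reduced by {relative_reduction:.0%}")  # 40%
```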
Morphogenic designer--an efficient tool to digitally design tooth forms.
Hajtó, J; Marinescu, C; Silva, N R F A
2014-01-01
Different digital software tools are available today for the purpose of designing anatomically correct anterior and posterior restorations. Current concepts have weaknesses that can potentially be addressed by more advanced modeling tools, such as those already available in professional CAD (computer-aided design) graphics software. This study describes the morphogenic designer (MGD) as an efficient and easy method for digitally designing tooth forms for the anterior and posterior dentition. Anterior and posterior tooth forms were selected from a collection of digitized natural teeth and subjectively assessed as "average". The models, in the form of STL files, were filtered, cleaned, idealized, and re-meshed to match the specifications of the software used. The shapes were then imported as Wavefront ".obj" models into Modo 701, software built for modeling, texturing, visualization, and animation. To create a parametric design system, intentional interactive deformations were performed on the average tooth shapes and then defined as morph targets. By combining various such parameters, several tooth shapes were formed virtually and their images are presented. MGD proved to be a versatile and powerful tool for esthetic and functional digital crown design.
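Morph targets of the kind described above are, in general, linear blends of vertex offsets. The sketch below shows that standard formulation; the toy mesh and names are our own illustration, not the study's data:

```python
import numpy as np

def blend_morph_targets(base, targets, weights):
    """Linear morph-target (blend-shape) combination, the standard technique
    behind parametric deformation systems of the kind described above.

    base: (n_vertices, 3) vertex positions of the average tooth mesh.
    targets: list of (n_vertices, 3) deformed versions of the same mesh.
    weights: one blend weight per target (0 = no influence, 1 = full).
    """
    result = base.astype(float).copy()
    for target, w in zip(targets, weights):
        result += w * (target - base)   # add each target's offset, scaled
    return result

# Toy "mesh": three vertices; one morph target pulls the tip vertex upward.
base = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.5, 1.0, 0.0]])
longer = base.copy()
longer[2, 1] = 1.4                                  # deformed tip position
halfway = blend_morph_targets(base, [longer], [0.5])
print(halfway[2])  # tip midway between base and target
```

Combining several such targets with independent weights is what yields a parametric family of tooth shapes from a single average form.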
Sensitivity Analysis and Optimization of Aerodynamic Configurations with Blend Surfaces
NASA Technical Reports Server (NTRS)
Thomas, A. M.; Tiwari, S. N.
1997-01-01
A novel geometric parametrization procedure using solutions of a suitably chosen fourth-order partial differential equation is used to define a class of airplane configurations. Included in this definition are surface grids, volume grids, and grid sensitivity. The general airplane configuration has a wing, fuselage, vertical tail, and horizontal tail. The design variables are incorporated into the boundary conditions, and the solution is expressed as a Fourier series. The fuselage has a circular cross section, and the radius is an algebraic function of four design parameters and an independent computational variable. Volume grids are obtained through an application of the Control Point Form method. Graphical interface software was developed that dynamically changes the surface of the airplane configuration as the input design variables change. The software is user friendly and is targeted toward the initial conceptual development of aerodynamic configurations. Grid sensitivity with respect to the surface design parameters, and aerodynamic sensitivity coefficients based on potential flow, are obtained using the automatic differentiation precompiler tool ADIFOR. Aerodynamic shape optimization of the complete aircraft with twenty-four design variables is performed. Unstructured and structured volume grids and Euler solutions are obtained with standard software to demonstrate the feasibility of the new surface definition.
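The Fourier-series surface definition and its design-variable sensitivity can be sketched generically; the series below is an illustrative stand-in for the paper's actual PDE solution and boundary conditions, and a finite difference stands in for ADIFOR's automatic differentiation:

```python
import numpy as np

def fuselage_radius(x, coeffs):
    """Radius distribution along the fuselage axis as a truncated Fourier sine
    series -- an assumed stand-in for the PDE solution, whose coefficients are
    set by the boundary conditions carrying the design variables.

    x: axial stations normalised to [0, 1]; coeffs: design-driven amplitudes.
    """
    r = np.zeros_like(x)
    for n, a_n in enumerate(coeffs, start=1):
        r += a_n * np.sin(n * np.pi * x)   # vanishes at nose (x=0) and tail (x=1)
    return r

x = np.linspace(0.0, 1.0, 101)
baseline = fuselage_radius(x, coeffs=[1.0, 0.2])
perturbed = fuselage_radius(x, coeffs=[1.0, 0.3])   # change one design variable

# Finite-difference surface sensitivity to that design variable; automatic
# differentiation computes the exact analogue without truncation error.
sensitivity = (perturbed - baseline) / 0.1
print(float(sensitivity.max()))
```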
cit: hypothesis testing software for mediation analysis in genomic applications.
Millstein, Joshua; Chen, Gary K; Breton, Carrie V
2016-08-01
The challenges of successfully applying causal inference methods include: (i) satisfying underlying assumptions, (ii) limitations in the data/models accommodated by the software and (iii) the low power of common multiple testing approaches. The causal inference test (CIT) is based on hypothesis testing rather than estimation, allowing the testable assumptions to be evaluated in the determination of statistical significance. A user-friendly software package provides P-values and, optionally, permutation-based FDR estimates (q-values) for potential mediators. It can handle single and multiple binary and continuous instrumental variables, binary or continuous outcome variables, and adjustment covariates. The permutation-based FDR option also provides a non-parametric implementation. Simulation studies demonstrate the validity of the cit package and show a substantial advantage of permutation-based FDR over other common multiple testing strategies. The cit open-source R package is freely available from the CRAN website (https://cran.r-project.org/web/packages/cit/index.html), with embedded C++ code that utilizes the GNU Scientific Library, also freely available (http://www.gnu.org/software/gsl/). Contact: joshua.millstein@usc.edu. Supplementary data are available at Bioinformatics online.
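The permutation-based FDR idea can be sketched generically; the statistics, permutation layout, and function names below are simplified assumptions for illustration, not the cit package's own tests:

```python
# A minimal sketch of permutation-based FDR estimation, the strategy offered
# as an alternative to standard multiple-testing corrections.  The q-value-like
# estimate for each observed statistic t is the average number of null
# (permuted) statistics >= t per permutation, divided by the number of
# observed statistics >= t.
import numpy as np

rng = np.random.default_rng(0)

def permutation_fdr(observed, null_matrix):
    """observed: 1-D array of test statistics (larger = stronger signal).
    null_matrix: (n_permutations, n_tests) statistics from permuted data.
    Returns an FDR estimate in [0, 1] for each observed statistic.
    """
    fdr = np.empty_like(observed, dtype=float)
    for i, t in enumerate(observed):
        expected_false = (null_matrix >= t).sum() / null_matrix.shape[0]
        discoveries = max((observed >= t).sum(), 1)
        fdr[i] = min(expected_false / discoveries, 1.0)
    return fdr

# Two strong signals among eight tests; null statistics drawn from N(0, 1).
observed = np.array([5.2, 4.8, 0.3, -0.1, 0.9, -0.5, 0.2, 1.1])
nulls = rng.normal(size=(1000, 8))
q = permutation_fdr(observed, nulls)
print(np.round(q, 3))
```

The strong signals receive near-zero estimates while the null-like statistics do not, which is the separation that gives permutation-based FDR its power advantage in the simulations.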
Zhang, Shu-Dong; Gant, Timothy W
2009-07-31
Connectivity mapping is a process for recognizing novel pharmacological and toxicological properties of small molecules by comparing their gene expression signatures with others in a database. A simple and robust method for connectivity mapping with increased specificity and sensitivity was recently developed, and its utility demonstrated using experimentally derived gene signatures. This paper introduces sscMap (statistically significant connections' map), a Java application designed to undertake connectivity mapping tasks using the recently published method. The software is bundled with a default collection of reference gene-expression profiles based on the publicly available dataset from the Broad Institute Connectivity Map 02, which includes data from over 7000 Affymetrix microarrays covering over 1000 small-molecule compounds and 6100 treatment instances in 5 human cell lines. In addition, the application allows users to add their own collections of reference profiles and is applicable to a wide range of other 'omics technologies. The utility of sscMap is twofold. First, it serves to make statistically significant connections between a user-supplied gene signature and the 6100 core reference profiles based on the Broad Institute expanded dataset. Second, it allows users to apply the same improved method to custom-built reference profiles, which can be added to the database for future referencing. The software can be freely downloaded from http://purl.oclc.org/NET/sscMap.
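The connectivity-scoring idea can be sketched as follows; the weighting scheme and names are a simplified stand-in for illustration, not the published sscMap statistic or its significance calculation:

```python
# A minimal sketch of the idea behind connectivity mapping: a query gene
# signature (genes with up/down directions) is scored against a reference
# profile's ranked gene list.  Genes near the top of the reference ranking
# contribute positively for up-regulated signature genes, genes near the
# bottom contribute positively for down-regulated ones.
def connection_score(signature, reference_ranks, n_genes):
    """signature: {gene: +1 (up) or -1 (down)}.
    reference_ranks: {gene: rank in the reference profile, 1 = most up-regulated}.
    Returns a score in [-1, 1]; +1 = perfect match, -1 = perfect anti-match.
    """
    total = 0.0
    for gene, direction in signature.items():
        rank = reference_ranks[gene]
        # Map rank to a signed weight in [-1, 1]: top of list -> +1, bottom -> -1.
        weight = 1.0 - 2.0 * (rank - 1) / (n_genes - 1)
        total += direction * weight
    return total / len(signature)

# Reference profile of 100 genes; the signature genes sit at its extremes.
reference = {"g1": 1, "g2": 2, "g3": 99, "g4": 100}
up_down_signature = {"g1": +1, "g2": +1, "g3": -1, "g4": -1}
score = connection_score(up_down_signature, reference, n_genes=100)
print(score)
```

A full implementation assesses such scores against a null distribution to decide which connections are statistically significant, which is the part sscMap's method improves.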
Griffiths, Silja Torvik; Gundersen, Hilde; Neto, Emanuel; Elgen, Irene; Markestad, Trond; Aukland, Stein M; Hugdahl, Kenneth
2013-08-01
Extremely preterm (EPT)/extremely low-birth-weight (ELBW) children attaining school age and adolescence often have problems with executive functions such as working memory and selective attention. Our aim was to investigate a hypothesized difference in blood oxygen level-dependent (BOLD) activation during a selective attention-working memory task in EPT/ELBW children as compared with term-born controls. A regional cohort of 28 EPT/ELBW children and 28 term-born controls underwent functional magnetic resonance imaging (fMRI) scanning at 11 y of age while performing a combined Stroop n-back task. Group differences in BOLD activation were analyzed with the Statistical Parametric Mapping 8 (SPM8) software package, and reaction times (RTs) and response accuracy (RA) were compared in a multifactorial ANOVA. The BOLD activation pattern in the preterm group involved the same areas (cingulate, prefrontal, and parietal cortices), but all areas displayed significantly less activation than in the control group, particularly when the cognitive load was increased. The RA results corresponded with the activation data in that the preterm group had significantly fewer correct responses. No group difference was found in RTs. Children born EPT/ELBW displayed reduced working memory and selective attention capacity as compared with term-born controls. These impairments had neuronal correlates in the form of reduced BOLD activation in areas responsible for online stimulus monitoring, working memory, and cognitive control.
Self-referential processing influences functional activation during cognitive control: an fMRI study
Koch, Kathrin; Schachtzabel, Claudia; Peikert, Gregor; Schultz, Carl Christoph; Reichenbach, Jürgen R.; Sauer, Heinrich; Schlösser, Ralf G.
2013-01-01
Rostral anterior cingulate cortex (rACC) plays a central role in the pathophysiology of major depressive disorder (MDD). As we reported in our previous study (Wagner et al., 2006), patients with MDD were characterized by an inability to deactivate this region during cognitive processing, leading to a compensatory prefrontal hyperactivation. This hyperactivation in rACC may be related to deficient inhibitory control of negative self-referential processes, which in turn may interfere with cognitive control task execution and the underlying fronto-cingulate network activation. To test this assumption, a functional magnetic resonance imaging study was conducted in 34 healthy subjects. Univariate and functional connectivity analyses in the Statistical Parametric Mapping 8 (SPM8) software were used. Self-referential stimuli and the Stroop task were presented in an event-related design. As hypothesized, rACC was specifically engaged during negative self-referential processing (SRP), and its activation was significantly related to the degree of depressive symptoms in participants. The BOLD signal in rACC showed an increased valence-dependent (negative vs. neutral SRP) interaction with the BOLD signal in prefrontal and dorsal anterior cingulate regions during Stroop task performance. This result provides strong support for the notion that enhanced rACC activation interacts with brain regions involved in cognitive control processes and substantiates our previous interpretation of increased rACC and prefrontal activation in patients during the Stroop task. PMID:22798398
A knowledge-guided active model method of cortical structure segmentation on pediatric MR images.
Shan, Zuyao Y; Parra, Carlos; Ji, Qing; Jain, Jinesh; Reddick, Wilburn E
2006-10-01
To develop an automated method for the quantification of cortical structures on pediatric MR images, a knowledge-guided active model (KAM) approach was proposed with a novel objective function similar to the Gibbs free energy function. Triangular mesh models were transformed to the images of a given subject by maximizing entropy and then actively deformed to the boundaries of structures by minimizing enthalpy. Volumetric results and image similarities of 10 different cortical structures segmented by KAM were compared with those traced manually. Furthermore, the segmentation performances of KAM and SPM2 (Statistical Parametric Mapping, a MATLAB software package) were compared. The average volumetric agreements between KAM- and manually-defined structures (both 0.95 for structures in healthy children and in children with medulloblastoma) were higher than the volumetric agreements for SPM2 (0.90 and 0.80, respectively). The similarity measurements (kappa) between KAM- and manually-defined structures (0.95 and 0.93, respectively) were higher than those for SPM2 (both 0.86). We have developed a novel automatic algorithm, KAM, for the segmentation of cortical structures on MR images of pediatric patients. Our preliminary results indicate that, when segmenting cortical structures, KAM agrees better with manually delineated structures than SPM2. KAM can potentially be used to segment cortical structures for conformal radiation therapy planning and for the quantitative evaluation of changes due to disease or abnormality.
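The two agreement measures reported above can be computed for binary segmentation masks as follows; the Dice coefficient here is a common overlap measure used as a simpler stand-in for the study's chance-corrected kappa statistic:

```python
import numpy as np

def volumetric_agreement(auto_mask, manual_mask):
    """Ratio of the smaller segmented volume to the larger (1.0 = equal volumes)."""
    v1, v2 = auto_mask.sum(), manual_mask.sum()
    return min(v1, v2) / max(v1, v2)

def dice(auto_mask, manual_mask):
    """Spatial overlap: 2 * |A intersect B| / (|A| + |B|)."""
    inter = np.logical_and(auto_mask, manual_mask).sum()
    return 2.0 * inter / (auto_mask.sum() + manual_mask.sum())

# Two overlapping synthetic segmentations of the same "structure".
auto = np.zeros((10, 10), dtype=bool)
auto[2:8, 2:8] = True       # automatic segmentation: 36 voxels
manual = np.zeros((10, 10), dtype=bool)
manual[3:9, 3:9] = True     # manual tracing, shifted by one voxel: 36 voxels
print(volumetric_agreement(auto, manual), round(dice(auto, manual), 3))
```

The example shows why both measures matter: the two masks have identical volumes (agreement 1.0) yet only partial spatial overlap, which only the similarity measure detects.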