Partial volume correction and image analysis methods for intersubject comparison of FDG-PET studies
NASA Astrophysics Data System (ADS)
Yang, Jun
2000-12-01
Partial volume effect is an artifact mainly due to the limited resolution of the imaging sensor. It biases the measured activity in small structures and around tissue boundaries. In brain FDG-PET studies, and especially in Alzheimer's disease studies where gray matter atrophy is severe, accurate estimation of the cerebral metabolic rate of glucose is even more problematic because of the large partial volume effect. In this dissertation, we developed a framework enabling inter-subject comparison of partial-volume-corrected brain FDG-PET studies. The framework is composed of the following image processing steps: (1) MRI segmentation, (2) MR-PET registration, (3) MR-based PVE correction, and (4) MR 3D inter-subject elastic mapping. Through simulation studies, we showed that the newly developed partial volume correction methods, whether pixel-based or ROI-based, performed better than previous methods. By applying this framework to a real Alzheimer's disease study, we demonstrated that the partial-volume-corrected glucose rates differ significantly among the control, at-risk and diseased patient groups, and that this framework is a promising tool for assisting early identification of Alzheimer's patients.
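The MR-based PVE correction step (3) can be illustrated with a minimal two-compartment, voxel-based sketch in the spirit of Meltzer-style correction: divide the PET image by the gray-matter mask smoothed with the scanner's point-spread function. This is an assumption for illustration, not the dissertation's actual algorithm; the function name and the 0.3 mask threshold are hypothetical.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def pvc_two_compartment(pet, gm_mask, fwhm_vox, threshold=0.3):
    """Voxel-based two-compartment PVC sketch (Meltzer-style).

    Divides the PET image by the gray-matter mask smoothed with an
    assumed Gaussian point-spread function, recovering activity
    diluted by partial volume near tissue boundaries.
    """
    sigma = fwhm_vox / 2.355  # convert FWHM (in voxels) to Gaussian sigma
    spill = gaussian_filter(gm_mask.astype(float), sigma)
    corrected = np.zeros_like(pet, dtype=float)
    ok = spill > threshold  # avoid amplifying noise far outside GM
    corrected[ok] = pet[ok] / spill[ok]
    return corrected
```

On a simulated PET image (true GM activity blurred by the same PSF), the corrected values inside gray matter recover the true activity.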
Development of a theoretical framework for analyzing cerebrospinal fluid dynamics
Cohen, Benjamin; Voorhees, Abram; Vedel, Søren; Wei, Timothy
2009-01-01
Background: To date, hydrocephalus researchers acknowledge the need for rigorous but utilitarian fluid mechanics understanding and methodologies in studying normal and hydrocephalic intracranial dynamics. Pressure-volume models and electric circuit analogs introduced pressure into volume conservation, but control volume analysis enforces independent conditions on pressure and volume. Previously, utilization of clinical measurements has been limited to understanding the relative amplitude and timing of flow, volume and pressure waveforms; these are qualitative approaches without a clear framework for meaningful quantitative comparison. Methods: Control volume analysis is presented to introduce the reader to the theoretical background of this foundational fluid mechanics technique for application to general control volumes. This approach is able to directly incorporate the diverse measurements obtained by clinicians to better elucidate intracranial dynamics and the progression to disorder. Results: Several examples of meaningful intracranial control volumes, and the particular measurement sets needed for the analysis, are discussed. Conclusion: Control volume analysis provides a framework to guide the type and location of measurements, and also a way to interpret the resulting data within a fundamental fluid physics analysis. PMID:19772652
Control volume based hydrocephalus research; analysis of human data
NASA Astrophysics Data System (ADS)
Cohen, Benjamin; Wei, Timothy; Voorhees, Abram; Madsen, Joseph; Anor, Tomer
2010-11-01
Hydrocephalus is a neuropathophysiological disorder primarily diagnosed by increased cerebrospinal fluid volume and pressure within the brain. To date, utilization of clinical measurements has been limited to understanding the relative amplitude and timing of flow, volume and pressure waveforms; these are qualitative approaches without a clear framework for meaningful quantitative comparison. Pressure-volume models and electric circuit analogs enforce volume conservation principles in terms of pressure. Control volume analysis, through the integral mass and momentum conservation equations, ensures that pressure and volume are accounted for using first-principles fluid physics. This approach is able to directly incorporate the diverse measurements obtained by clinicians into a simple, direct and robust mechanics-based framework. Clinical data obtained for analysis are discussed, along with the data processing techniques used to extract terms in the conservation equations. Control volume analysis provides a non-invasive, physics-based approach to extracting pressure information from magnetic resonance velocity data that cannot be measured directly by pressure instrumentation.
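For an incompressible fluid, the integral mass conservation at the heart of control volume analysis reduces to dV/dt = Q_in - Q_out. A minimal sketch of how through-plane MR velocity data could be turned into these terms (function names and units are illustrative assumptions, not the authors' pipeline):

```python
import numpy as np

def net_inflow(velocity, pixel_area):
    """Net volumetric flow rate through a control surface: the integral
    of the through-plane velocity over the lumen cross-section."""
    return float(np.sum(velocity) * pixel_area)

def volume_change(flow_series, dt):
    """Trapezoidal integration of the net flow over time; for a periodic
    cardiac cycle, the control-volume change should return to ~zero."""
    f = np.asarray(flow_series, dtype=float)
    return float(np.sum(0.5 * (f[1:] + f[:-1])) * dt)
```

Integrating a sinusoidal net flow over one full cycle recovers the expected near-zero volume change of a periodic system.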
Ghodsi, Seyed Hamed; Kerachian, Reza; Zahmatkesh, Zahra
2016-04-15
In this paper, an integrated framework is proposed for urban runoff management. To control and improve runoff quality and quantity, Low Impact Development (LID) practices are utilized. To determine the LIDs' areas and locations, the Non-dominated Sorting Genetic Algorithm-II (NSGA-II) is utilized, with three objective functions: minimizing runoff volume, runoff pollution and the implementation cost of LIDs. In this framework, the Storm Water Management Model (SWMM) is used for stream flow simulation. The non-dominated solutions provided by the NSGA-II are considered as management scenarios. To select the most preferred scenario, interactions among the main stakeholders in the study area with conflicting utilities are incorporated by utilizing bargaining models, including a non-cooperative game and the Nash model, and the social choice procedures of Borda count and approval voting. Moreover, a new social choice procedure, named the pairwise voting method, is proposed and applied. Based on each conflict resolution approach, a scenario is identified as the ideal solution, providing the LIDs' areas, locations and implementation cost. The proposed framework is applied for urban water quality and quantity management in the northern part of the Tehran metropolitan area, Iran. Results show that the proposed pairwise voting method tends to select a scenario with a higher percentage reduction in TSS (Total Suspended Solids) load and runoff volume than the Borda count and approval voting methods. In addition, the Nash method presents a management scenario with the highest cost of LID implementation and the maximum percentages of runoff volume reduction and TSS removal. The results also signify that the selection of an appropriate management scenario by stakeholders in the study area depends on the available financial resources and on the relative importance of runoff quality improvement compared with reducing runoff volume.
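The core of NSGA-II is non-dominated sorting of candidate scenarios. A minimal sketch of extracting the first Pareto front under minimization objectives (illustrative only; the study runs the full NSGA-II with SWMM simulations in the loop):

```python
def pareto_front(solutions):
    """Return the non-dominated subset for minimization of all objectives.

    Solution b dominates solution a if b is no worse in every objective
    and strictly better in at least one.
    """
    front = []
    for i, a in enumerate(solutions):
        dominated = False
        for j, b in enumerate(solutions):
            if (i != j
                    and all(x <= y for x, y in zip(b, a))
                    and any(x < y for x, y in zip(b, a))):
                dominated = True
                break
        if not dominated:
            front.append(a)
    return front
```

For example, among the cost/pollution pairs (1,5), (2,3), (3,4), (4,1), the point (3,4) is dominated by (2,3) and drops out of the front.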
Copyright © 2016 Elsevier B.V. All rights reserved.
NASA Technical Reports Server (NTRS)
1978-01-01
A unified framework for comparing intercity passenger and freight transportation systems is presented. Composite measures for cost, service/demand, energy, and environmental impact were determined. A set of 14 basic measures was articulated to form the foundation for computing the composite measures. A parameter dependency diagram, constructed to explicitly interrelate the composite and basic measures, is discussed. Ground rules and methodology for developing the values of the basic measures are provided, and the use of the framework with existing cost and service data is illustrated for various freight systems.
COMPARISON OF VOLUMETRIC REGISTRATION ALGORITHMS FOR TENSOR-BASED MORPHOMETRY
Villalon, Julio; Joshi, Anand A.; Toga, Arthur W.; Thompson, Paul M.
2015-01-01
Nonlinear registration of brain MRI scans is often used to quantify morphological differences associated with disease or genetic factors. Recently, surface-guided fully 3D volumetric registrations have been developed that combine intensity-guided volume registrations with cortical surface constraints. In this paper, we compare one such algorithm to two popular high-dimensional volumetric registration methods: large-deformation viscous fluid registration, formulated in a Riemannian framework, and the diffeomorphic “Demons” algorithm. We performed an objective morphometric comparison, using a large MRI dataset from 340 young adult twin subjects to examine 3D patterns of correlations in anatomical volumes. Surface-constrained volume registration gave greater effect sizes for detecting morphometric associations near the cortex, while the other two approaches gave greater effect sizes subcortically. These findings suggest novel ways to combine the advantages of multiple methods in the future. PMID:26925198
Evaluation metrics for bone segmentation in ultrasound
NASA Astrophysics Data System (ADS)
Lougheed, Matthew; Fichtinger, Gabor; Ungi, Tamas
2015-03-01
Tracked ultrasound is a safe alternative to X-ray for imaging bones. The interpretation of bony structures is challenging, as ultrasound has no intensity characteristic specific to bone. Several image segmentation algorithms have been devised to identify bony structures. We propose an open-source framework that aids in the development and comparison of such algorithms by quantitatively measuring segmentation performance in ultrasound images. True-positive and false-negative metrics used in the framework quantify algorithm performance based on correctly segmented bone and correctly segmented boneless regions. Ground truth for these metrics is defined manually and, along with the corresponding automatically segmented image, is used for the performance analysis. Manually created ground truth tests were generated to verify the accuracy of the analysis. Further evaluation metrics determine the average performance per slice and its standard deviation. The metrics provide a means of evaluating the accuracy of frames along the length of a volume. This aids in assessing the accuracy of the volume itself and the approach to image acquisition (frame positioning and frequency). The framework was implemented as an open-source module of the 3D Slicer platform. The ground truth tests verified that the framework correctly calculates the implemented metrics. The developed framework provides a convenient way to evaluate bone segmentation algorithms. The implementation fits into a widely used application for segmentation algorithm prototyping. Future algorithm development will benefit from monitoring the effects of adjustments to an algorithm in a standard evaluation framework.
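The per-frame metrics described (correctly segmented bone, correctly segmented boneless regions) could be computed roughly as below. This is a hedged sketch; the module's actual metric definitions in 3D Slicer may differ, and the function names are hypothetical.

```python
import numpy as np

def frame_metrics(auto_mask, truth_mask):
    """True-positive rate over ground-truth bone pixels and true-negative
    rate over ground-truth boneless pixels, for one ultrasound frame."""
    auto = auto_mask.astype(bool)
    truth = truth_mask.astype(bool)
    tpr = np.logical_and(auto, truth).sum() / max(truth.sum(), 1)
    tnr = np.logical_and(~auto, ~truth).sum() / max((~truth).sum(), 1)
    return float(tpr), float(tnr)

def volume_summary(per_frame):
    """Mean and standard deviation of a metric along the swept volume."""
    vals = np.asarray(per_frame, dtype=float)
    return float(vals.mean()), float(vals.std())
```

A frame where the algorithm finds 3 of 4 ground-truth bone pixels and no false positives scores a TPR of 0.75 and a TNR of 1.0.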
Generalized fourier analyses of the advection-diffusion equation - Part II: two-dimensional domains
NASA Astrophysics Data System (ADS)
Voth, Thomas E.; Martinez, Mario J.; Christon, Mark A.
2004-07-01
Part I of this work presents a detailed multi-methods comparison of the spatial errors associated with the one-dimensional finite difference, finite element and finite volume semi-discretizations of the scalar advection-diffusion equation. In Part II we extend the analysis to two-dimensional domains and also consider the effects of wave propagation direction and grid aspect ratio on the phase speed, and the discrete and artificial diffusivities. The observed dependence of dispersive and diffusive behaviour on propagation direction makes comparison of methods more difficult relative to the one-dimensional results. For this reason, integrated (over propagation direction and wave number) error and anisotropy metrics are introduced to facilitate comparison among the various methods. With respect to these metrics, the consistent mass Galerkin and consistent mass control-volume finite element methods, and their streamline upwind derivatives, exhibit comparable accuracy, and generally out-perform their lumped mass counterparts and finite-difference based schemes. While this work can only be considered a first step in a comprehensive multi-methods analysis and comparison, it serves to identify some of the relative strengths and weaknesses of multiple numerical methods in a common mathematical framework. Published in 2004 by John Wiley & Sons, Ltd.
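As a concrete one-dimensional instance of such a Fourier analysis, the second-order central finite difference semi-discretization of pure advection has numerical phase speed c* = c·sin(kh)/(kh), so short waves (large kh) lag the exact solution. This standard textbook case is given for illustration; the paper analyzes many more schemes and extends the analysis to 2D.

```python
import numpy as np

def phase_speed_ratio(kh):
    """Numerical-to-exact phase speed ratio c*/c = sin(kh)/(kh) for the
    second-order central-difference semi-discretization of advection."""
    return np.sin(kh) / kh
```

Well-resolved waves (kh → 0) propagate at nearly the exact speed, while a four-points-per-wavelength wave (kh = π/2) travels at only 2/π of it.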
Taylor bubbles at high viscosity ratios: experiments and numerical simulations
NASA Astrophysics Data System (ADS)
Hewakandamby, Buddhika; Hasan, Abbas; Azzopardi, Barry; Xie, Zhihua; Pain, Chris; Matar, Omar
2015-11-01
The Taylor bubble is a single long bubble that nearly fills the entire cross section of a liquid-filled circular tube, often occurring in gas-liquid slug flows in many industrial applications, particularly oil and gas production. The objective of this study is to investigate the fluid dynamics of a three-dimensional Taylor bubble rising in highly viscous silicone oil in a vertical pipe. An adaptive unstructured mesh modelling framework is adopted here which can modify and adapt anisotropic unstructured meshes to better represent the underlying physics of the rising bubble and reduce computational effort without sacrificing accuracy. The numerical framework consists of a mixed control volume and finite element formulation, a `volume of fluid'-type method for interface capturing based on a compressive control volume advection method, and a force-balanced algorithm for the surface tension implementation. Experimental results for the Taylor bubble shape and rise velocity are presented, together with numerical results for the dynamics of the bubbles. A comparison of the simulation predictions with experimental data available in the literature is also presented to demonstrate the capabilities of our numerical method. EPSRC Programme Grant, MEMPHIS, EP/K0039761/1.
Quantification of skeletal fraction volume of a soil pit by means of photogrammetry
NASA Astrophysics Data System (ADS)
Baruck, Jasmin; Zieher, Thomas; Bremer, Magnus; Rutzinger, Martin; Geitner, Clemens
2015-04-01
The grain size distribution of a soil is a key parameter determining soil water behaviour, soil fertility and land use potential. It plays an important role in soil classification and allows drawing conclusions on landscape development as well as soil formation processes. However, fine soil material (i.e. particle diameter ≤2 mm) is usually documented more thoroughly than the skeletal fraction (i.e. particle diameter >2 mm). While fine soil material is commonly analysed in the laboratory in order to determine the soil type, the skeletal fraction is typically estimated in the field at the profile. For a more precise determination of the skeletal fraction, other methods can be applied and combined. These methods can be volume-related (sampling rings, percussion coring tubes) or non-volume-related (sieving of spade excavations). In this study we present a framework for the quantification of skeletal fraction volumes of a soil pit by means of photogrammetry. As a first step, 3D point clouds of both the soil pit and the skeletal grains were generated. To this end, all skeletal grains of the pit were spread out onto a flat, clean plastic sheet in the field and numerous digital photos were taken using a reflex camera. With the help of the open source tool VisualSFM (structure from motion), two scaled 3D point clouds were derived. As a second step, the skeletal fraction point cloud was segmented by radiometric attributes in order to determine the volumes of single skeletal grains. The comparison of the total skeletal fraction volume with the volume of the pit (closed by spline interpolation) yields an estimate of the volumetric proportion of skeletal grains. The presented framework therefore provides an objective reference value of the skeletal fraction for the support of qualitative field records.
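The final ratio computation can be sketched as below, assuming (for illustration only) convex-hull volume estimates for the segmented grain point clouds; the authors' actual pipeline uses radiometric segmentation and a spline-closed pit volume, and convex hulls overestimate concave grains.

```python
import numpy as np
from scipy.spatial import ConvexHull

def skeletal_fraction(grain_clouds, pit_volume):
    """Volumetric proportion of the skeletal fraction: sum the convex-hull
    volumes of the segmented grain point clouds and divide by the pit
    volume (here an assumed, externally supplied value)."""
    total = sum(ConvexHull(np.asarray(pts)).volume for pts in grain_clouds)
    return total / pit_volume
```

Two unit-cube "grains" inside a pit of volume 10 give a skeletal fraction of 0.2.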
NASA Astrophysics Data System (ADS)
Yamashita, T.; Akagi, T.; Aso, T.; Kimura, A.; Sasaki, T.
2012-11-01
The pencil beam algorithm (PBA) is reasonably accurate and fast. It is, therefore, the primary method used in routine clinical treatment planning for proton radiotherapy; still, it needs to be validated for use in highly inhomogeneous regions. In our investigation of the effect of patient inhomogeneity, the PBA was compared with Monte Carlo (MC). A software framework was developed for the MC simulation of radiotherapy based on Geant4. The anatomical sites selected for the comparison were the head/neck, liver, lung and pelvis regions. The dose distributions calculated by the two methods in selected examples were compared, as well as the dose volume histograms (DVHs) derived from the dose distributions. The comparison of the off-center ratio (OCR) at the iso-center showed good agreement between the PBA and MC, while discrepancies were seen around the distal fall-off regions: MC showed fine structure on the OCR in the distal fall-off region, whereas the PBA showed a smoother distribution. The fine structures in the MC calculation appeared downstream of very low-density regions. Comparison of the DVHs showed that most of the target volumes were similarly covered, while some OARs located around the distal region received a higher dose when calculated by MC than by the PBA.
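A cumulative DVH like the ones compared here can be derived from a dose grid and a structure mask along these lines (a generic sketch, not the authors' implementation):

```python
import numpy as np

def cumulative_dvh(dose, structure_mask, bins=100):
    """Cumulative dose-volume histogram: the fraction of the structure
    volume receiving at least each dose level."""
    d = dose[structure_mask.astype(bool)]
    levels = np.linspace(0.0, d.max(), bins)
    volume_frac = np.array([(d >= lv).mean() for lv in levels])
    return levels, volume_frac
```

By construction the curve starts at 1.0 (the whole structure receives at least zero dose) and decreases monotonically toward the maximum dose.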
NASA Technical Reports Server (NTRS)
Mayer, Richard J.; Blinn, Thomas M.; Dewitte, Paula S.; Crump, John W.; Ackley, Keith A.
1992-01-01
In the second volume of the Demonstration Framework Document, the graphical representation of the demonstration framework is given. This second document was created to facilitate the reading and comprehension of the demonstration framework. It is designed to be viewed in parallel with Section 4.2 of the first volume to help give a picture of the relationships between the UOBs (Units of Behavior) of the model. The model is quite large, and the design team felt that this form of presentation would make it easier for the reader to get a feel for the processes described in this document. The IDEF3 (Process Description Capture Method) diagrams of the processes of an Information System Development are presented. Volume 1 describes the processes and the agents involved with each process, while this volume graphically shows the precedence relationships among the processes.
A finite-volume module for all-scale Earth-system modelling at ECMWF
NASA Astrophysics Data System (ADS)
Kühnlein, Christian; Malardel, Sylvie; Smolarkiewicz, Piotr
2017-04-01
We highlight recent advancements in the development of the finite-volume module (FVM) (Smolarkiewicz et al., 2016) for the IFS at ECMWF. FVM represents an alternative dynamical core that complements the operational spectral dynamical core of the IFS with new capabilities. Most notably, these include a compact-stencil finite-volume discretisation, flexible meshes, conservative non-oscillatory transport and all-scale governing equations. As a default, FVM solves the compressible Euler equations in a geospherical framework (Szmelter and Smolarkiewicz, 2010). The formulation incorporates a generalised terrain-following vertical coordinate. A hybrid computational mesh, fully unstructured in the horizontal and structured in the vertical, enables efficient global atmospheric modelling. Moreover, a centred two-time-level semi-implicit integration scheme is employed with 3D implicit treatment of acoustic, buoyant, and rotational modes. The associated 3D elliptic Helmholtz problem is solved using a preconditioned Generalised Conjugate Residual approach. The solution procedure employs the non-oscillatory finite-volume MPDATA advection scheme that is bespoke for the compressible dynamics on the hybrid mesh (Kühnlein and Smolarkiewicz, 2017). The recent progress of FVM is illustrated with results of benchmark simulations of intermediate complexity, and comparison to the operational spectral dynamical core of the IFS. C. Kühnlein, P.K. Smolarkiewicz: An unstructured-mesh finite-volume MPDATA for compressible atmospheric dynamics, J. Comput. Phys. (2017), in press. P.K. Smolarkiewicz, W. Deconinck, M. Hamrud, C. Kühnlein, G. Mozdzynski, J. Szmelter, N.P. Wedi: A finite-volume module for simulating global all-scale atmospheric flows, J. Comput. Phys. 314 (2016) 287-304. J. Szmelter, P.K. Smolarkiewicz: An edge-based unstructured mesh discretisation in geospherical framework, J. Comput. Phys. 229 (2010) 4980-4995.
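The first, upwind pass of MPDATA is a plain donor-cell finite-volume update; subsequent passes reapply the same operator with anti-diffusive pseudo-velocities. A 1D periodic sketch of that first pass (illustrative only; FVM applies the scheme on 3D hybrid unstructured meshes):

```python
import numpy as np

def donor_cell_step(psi, courant):
    """One conservative donor-cell (first-order upwind) update on a
    periodic 1D mesh; this is the first pass of MPDATA."""
    def flux(pl, pr, c):
        # upwind flux at a face: take the donor-cell value by flow sign
        return np.where(c > 0, c * pl, c * pr)
    right = flux(psi, np.roll(psi, -1), courant)  # faces i+1/2
    left = np.roll(right, 1)                      # faces i-1/2
    return psi - (right - left)
```

With Courant number 1 the update shifts the field by exactly one cell, and the total mass is conserved to round-off.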
Unifying framework for multimodal brain MRI segmentation based on Hidden Markov Chains.
Bricq, S; Collet, Ch; Armspach, J P
2008-12-01
In the frame of 3D medical imaging, accurate segmentation of multimodal brain MR images is of interest for many brain disorders. However, due to several factors such as noise, imaging artifacts, intrinsic tissue variation and partial volume effects, tissue classification remains a challenging task. In this paper, we present a unifying framework for unsupervised segmentation of multimodal brain MR images that includes the partial volume effect, bias field correction, and information given by a probabilistic atlas. The proposed method takes into account neighborhood information using a Hidden Markov Chain (HMC) model. Due to the limited resolution of imaging devices, voxels may be composed of a mixture of different tissue types; this partial volume effect is included to achieve an accurate segmentation of brain tissues. Instead of assigning each voxel to a single tissue class (i.e., hard classification), we compute the relative amount of each pure tissue class in each voxel (mixture estimation). Further, a bias field estimation step is added to the proposed algorithm to correct intensity inhomogeneities. Furthermore, atlas priors are incorporated using a probabilistic brain atlas containing prior expectations about the spatial localization of the different tissue classes. This atlas is considered as a complementary sensor, and the proposed method is extended to multimodal brain MRI without any user-tunable parameter (unsupervised algorithm). To validate this new unifying framework, we present experimental results on both synthetic and real brain images for which the ground truth is available. Comparison with other commonly used techniques demonstrates the accuracy and robustness of this new Markovian segmentation scheme.
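The mixture-estimation idea, computing the relative amount of each pure tissue per voxel rather than a hard label, can be sketched for two tissues under a linear intensity model (a deliberate simplification; the paper embeds this in an HMC model with bias field correction and atlas priors):

```python
import numpy as np

def mixture_fractions(intensity, mu_a, mu_b):
    """Relative amount of pure tissue A per voxel under a linear
    two-tissue partial volume model: y = alpha*mu_a + (1-alpha)*mu_b,
    solved for alpha and clipped to the physical range [0, 1]."""
    alpha = (np.asarray(intensity, dtype=float) - mu_b) / (mu_a - mu_b)
    return np.clip(alpha, 0.0, 1.0)
```

A voxel with intensity halfway between the two pure-tissue means receives a 50/50 mixture rather than a hard label.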
ERIC Educational Resources Information Center
UNESCO Institute for Lifelong Learning, 2015
2015-01-01
This second volume of the "Global Inventory of Regional and National Qualifications Frameworks" focuses on national and regional cases of national qualifications frameworks for eighty-six countries from Afghanistan to Uzbekistan and seven regional qualifications frameworks. Each country profile provides a thorough review of the main…
Comparison of international food allergen labeling regulations.
Gendel, Steven M
2012-07-01
Food allergy is a significant public health issue worldwide. Regulatory risk management strategies for allergic consumers have focused on providing information about the presence of food allergens through label declarations. A number of countries and regulatory bodies have recognized the importance of providing this information by enacting laws, regulations or standards for food allergen labeling of "priority allergens". However, different governments and organizations have taken different approaches to identifying these "priority allergens" and to designing labeling declaration regulatory frameworks. The increasing volume of the international food trade suggests that there would be value in supporting sensitive consumers by harmonizing (to the extent possible) these regulatory frameworks. As a first step toward this goal, an inventory of allergen labeling regulations was assembled and analyzed to identify commonalities, differences, and future needs. Published by Elsevier Inc.
NASA Astrophysics Data System (ADS)
Oh, Jungsu S.; Kim, Jae Seung; Chae, Sun Young; Oh, Minyoung; Oh, Seung Jun; Cha, Seung Nam; Chang, Ho-Jong; Lee, Chong Sik; Lee, Jae Hong
2017-03-01
We present an optimized voxelwise statistical parametric mapping (SPM) of partial-volume (PV)-corrected positron emission tomography (PET) with 11C Pittsburgh Compound B (PiB), incorporating the anatomical precision of magnetic resonance imaging (MRI) and the amyloid-β (Aβ) burden specificity of PiB PET. First, we applied region-based partial-volume correction (PVC), termed the geometric transfer matrix (GTM) method, to PiB PET, creating MRI-based lobar parcels filled with mean PiB uptakes. Then, we conducted a voxelwise PVC by multiplying the original PET by the ratio of the GTM-based PV-corrected PET to a 6-mm-smoothed PV-corrected PET. Finally, we conducted spatial normalizations of the PV-corrected PETs onto the study-specific template. As such, we increased the accuracy of the SPM normalization and the tissue specificity of the SPM results. Moreover, lobar smoothing (instead of whole-brain smoothing) was applied to increase the signal-to-noise ratio without degrading tissue specificity. Thereby, we could optimize a voxelwise group comparison between subjects with high and normal Aβ burdens (from 10 patients with Alzheimer's disease, 30 patients with Lewy body dementia, and 9 normal controls). Our SPM framework outperformed the conventional one in terms of the accuracy of the spatial normalization (85% of maximum likelihood tissue classification volume) and tissue specificity (a larger gray matter and a smaller cerebrospinal fluid volume fraction in the SPM results). Our SPM framework optimized the SPM of PV-corrected Aβ PET in terms of anatomical precision, normalization accuracy, and tissue specificity, resulting in better detection and localization of Aβ burdens in patients with Alzheimer's disease and Lewy body dementia.
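The voxelwise correction described, multiplying the original PET by the ratio of the GTM-based PV-corrected image to its 6-mm-smoothed version, can be sketched as below (the function name and the zero-guard on the denominator are assumptions for illustration):

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def voxelwise_pvc(pet, gtm_pet, fwhm_mm, voxel_mm):
    """RBV-style voxelwise PVC: original PET times the ratio of the
    region-based (GTM) PV-corrected image to its smoothed version."""
    sigma = (fwhm_mm / 2.355) / voxel_mm  # FWHM in mm -> sigma in voxels
    smoothed = gaussian_filter(gtm_pet, sigma)
    ratio = np.divide(gtm_pet, smoothed,
                      out=np.ones_like(gtm_pet, dtype=float),
                      where=smoothed > 0)
    return pet * ratio
```

Where the GTM image is locally uniform the ratio is one and the PET values pass through unchanged, as expected of a correction that only redistributes signal near boundaries.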
Santander, Julian E; Tsapatsis, Michael; Auerbach, Scott M
2013-04-16
We have constructed and applied an algorithm to simulate the behavior of zeolite frameworks during liquid adsorption. We applied this approach to compute the adsorption isotherms of furfural-water and hydroxymethyl furfural (HMF)-water mixtures adsorbing in silicalite zeolite at 300 K for comparison with experimental data. We modeled these adsorption processes under two different statistical mechanical ensembles: the grand canonical (V-Nz-μg-T or GC) ensemble keeping volume fixed, and the P-Nz-μg-T (osmotic) ensemble allowing volume to fluctuate. To optimize accuracy and efficiency, we compared pure Monte Carlo (MC) sampling to hybrid MC-molecular dynamics (MD) simulations. For the external furfural-water and HMF-water phases, we assumed the ideal solution approximation and employed a combination of tabulated data and extended ensemble simulations for computing solvation free energies. We found that MC sampling in the V-Nz-μg-T ensemble (i.e., standard GCMC) does a poor job of reproducing both the Henry's law regime and the saturation loadings of these systems. Hybrid MC-MD sampling of the V-Nz-μg-T ensemble, which includes framework vibrations at fixed total volume, provides better results in the Henry's law region, but this approach still does not reproduce experimental saturation loadings. Pure MC sampling of the osmotic ensemble was found to approach experimental saturation loadings more closely, whereas hybrid MC-MD sampling of the osmotic ensemble quantitatively reproduces such loadings because the MC-MD approach naturally allows for locally anisotropic volume changes wherein some pores expand whereas others contract.
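In MC sampling of the grand canonical ensemble, particle insertions are accepted with the standard Metropolis probability min[1, V/(Λ³(N+1))·exp(β(μ − ΔU))]. A sketch of that textbook rule (symbols follow convention; `lam3` is the cubed thermal de Broglie wavelength, and this is not the authors' code):

```python
import math

def gcmc_insertion_acceptance(n, volume, mu, delta_u, beta, lam3):
    """Metropolis acceptance probability for inserting one adsorbate in
    the grand canonical (V, N_z, mu_g, T) ensemble:
    min(1, V / (lam3*(N+1)) * exp(beta*(mu - delta_u)))."""
    return min(1.0, volume / (lam3 * (n + 1))
               * math.exp(beta * (mu - delta_u)))
```

Energetically costly insertions (large positive ΔU) are almost never accepted, while favorable ones are capped at probability one.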
NASA Astrophysics Data System (ADS)
Usman, M.; Atkinson, D.; Heathfield, E.; Greil, G.; Schaeffter, T.; Prieto, C.
2015-04-01
Two major challenges in cardiovascular MRI are long scan times due to slow MR acquisition and motion artefacts due to respiratory motion. Recently, a Motion Corrected-Compressed Sensing (MC-CS) technique has been proposed for free-breathing 2D dynamic cardiac MRI that addresses these challenges by simultaneously accelerating the MR acquisition and correcting for arbitrary motion within a compressed sensing reconstruction. In this work, the MC-CS framework is combined with parallel imaging for further acceleration, and is termed Motion Corrected Sparse SENSE (MC-SS). The MC-SS framework is validated in eight volunteers and three patients for left ventricular functional assessment, and the results are compared with breath-hold acquisitions as the reference. A non-significant difference (P > 0.05) was observed in the volumetric functional measurements (end-diastolic volume, end-systolic volume, ejection fraction) and myocardial border sharpness values obtained with the proposed and gold-standard methods. The proposed method achieves whole-heart multi-slice coverage in 2 min of free-breathing acquisition, eliminating the time needed between breath-holds for instructions and recovery. This results in a two-fold speed-up of the total acquisition time in comparison with the breath-hold acquisition.
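Compressed sensing reconstructions of the MC-CS/MC-SS kind typically alternate a data-consistency gradient step with a sparsity-promoting shrinkage. A generic ISTA sketch for min ‖Ax − y‖² + λ‖x‖₁ (illustrative only; the actual MC-SS reconstruction additionally incorporates motion operators and coil sensitivities):

```python
import numpy as np

def soft_threshold(x, lam):
    """Proximal operator of the l1 norm: the shrinkage step that
    promotes sparsity in compressed sensing reconstructions."""
    return np.sign(x) * np.maximum(np.abs(x) - lam, 0.0)

def ista_step(x, A, y, lam, step):
    """One iterative shrinkage-thresholding step:
    gradient step on the data term, then soft thresholding."""
    grad = A.T @ (A @ x - y)
    return soft_threshold(x - step * grad, step * lam)
```

Starting from zero with an identity sensing matrix, one step already shrinks small measurements to exactly zero while keeping the large coefficient (reduced by the threshold).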
StreamExplorer: A Multi-Stage System for Visually Exploring Events in Social Streams.
Wu, Yingcai; Chen, Zhutian; Sun, Guodao; Xie, Xiao; Cao, Nan; Liu, Shixia; Cui, Weiwei
2017-10-18
Analyzing social streams is important for many applications, such as crisis management. However, the considerable diversity, increasing volume, and high dynamics of social streams of large events continue to be significant challenges that must be overcome to ensure effective exploration. We propose a novel framework by which to handle complex social streams on a budget PC. This framework features two components: 1) an online method to detect important time periods (i.e., subevents), and 2) a tailored GPU-assisted Self-Organizing Map (SOM) method, which clusters the tweets of subevents stably and efficiently. Based on the framework, we present StreamExplorer to facilitate the visual analysis, tracking, and comparison of a social stream at three levels. At a macroscopic level, StreamExplorer uses a new glyph-based timeline visualization, which presents a quick multi-faceted overview of the ebb and flow of a social stream. At a mesoscopic level, a map visualization is employed to visually summarize the social stream from either a topical or geographical aspect. At a microscopic level, users can employ interactive lenses to visually examine and explore the social stream from different perspectives. Two case studies and a task-based evaluation are used to demonstrate the effectiveness and usefulness of StreamExplorer.
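The SOM at the core of the clustering component reduces, in its online form, to a best-matching-unit search plus a neighbourhood-weighted pull of nearby prototypes. A minimal 1D-lattice, CPU-only sketch (the paper's version is GPU-assisted and tailored to tweet streams):

```python
import numpy as np

def som_update(weights, x, bmu_sigma, lr):
    """One online Self-Organizing Map update on a 1D lattice: find the
    best-matching unit (BMU), then pull neighbouring prototypes toward
    the sample with a Gaussian neighbourhood function."""
    dists = np.linalg.norm(weights - x, axis=1)
    bmu = int(np.argmin(dists))
    idx = np.arange(len(weights))
    h = np.exp(-((idx - bmu) ** 2) / (2 * bmu_sigma ** 2))
    return weights + lr * h[:, None] * (x - weights), bmu
```

A sample near the second prototype selects it as BMU and moves it halfway toward the sample at learning rate 0.5, while distant prototypes barely move.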
Transport Phenomena During Equiaxed Solidification of Alloys
NASA Technical Reports Server (NTRS)
Beckermann, C.; deGroh, H. C., III
1997-01-01
Recent progress in modeling of transport phenomena during dendritic alloy solidification is reviewed. Starting from the basic theorems of volume averaging, a general multiphase modeling framework is outlined. This framework allows for the incorporation of a variety of microscale phenomena in the macroscopic transport equations. For the case of diffusion dominated solidification, a simplified set of model equations is examined in detail and validated through comparisons with numerous experimental data for both columnar and equiaxed dendritic growth. This provides a critical assessment of the various model assumptions. Models that include melt flow and solid phase transport are also discussed, although their validation is still at an early stage. Several numerical results are presented that illustrate some of the profound effects of convective transport on the final compositional and structural characteristics of a solidified part. Important issues that deserve continuing attention are identified.
DOT National Transportation Integrated Search
1988-10-01
This second volume of the study entitled "Optimizing Wartime Materiel Delivery: An Overview of DOD Containerization Efforts" outlines a framework for action to address containerization issues identified in Volume I. The objectives of the study inclu...
François, Marianne M.
2015-05-28
A review of recent advances made in numerical methods and algorithms within the volume tracking framework is presented. The volume tracking method, also known as the volume-of-fluid method, has become an established numerical approach to model and simulate interfacial flows. Its advantage is its strict mass conservation. However, because the interface is not explicitly tracked but captured via the material volume fraction on a fixed mesh, accurate estimation of the interface position, its geometric properties, and modeling of interfacial physics in the volume tracking framework remain difficult. Several improvements have been made over the last decade to address these challenges. In this study, the multimaterial interface reconstruction method via power diagram, curvature estimation via heights and mean values, and the balanced-force algorithm for surface tension are highlighted.
Comparison of an adaptive local thresholding method on CBCT and µCT endodontic images
NASA Astrophysics Data System (ADS)
Michetti, Jérôme; Basarab, Adrian; Diemer, Franck; Kouame, Denis
2018-01-01
Root canal segmentation on cone beam computed tomography (CBCT) images is difficult because of the noise level, resolution limitations, beam hardening and dental morphological variations. An image processing framework, based on an adaptive local threshold method, was evaluated on CBCT images acquired on extracted teeth. A comparison with high quality segmented endodontic images on micro computed tomography (µCT) images acquired from the same teeth was carried out using a dedicated registration process. Each segmented tooth was evaluated according to volume and root canal sections through the area and the Feret's diameter. The proposed method is shown to overcome the limitations of CBCT and to provide an automated and adaptive complete endodontic segmentation. Despite a slight underestimation (-4.08%), the local threshold segmentation method based on edge detection was shown to be fast and accurate. Strong correlations between CBCT and µCT segmentations were found both for the root canal area and diameter (respectively 0.98 and 0.88). Our findings suggest that combining CBCT imaging with this image processing framework may benefit experimental endodontology, teaching and could represent a first development step towards the clinical use of endodontic CBCT segmentation during pulp cavity treatment.
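The adaptive local threshold idea can be sketched as follows. This is a generic local-mean threshold with an integral image, with illustrative window radius and offset; it is not the edge-detection-based method the paper evaluates.

```python
import numpy as np

def local_mean(img, radius):
    """Mean over a (2r+1)x(2r+1) window, computed with an integral image."""
    pad = np.pad(img.astype(float), radius + 1, mode="edge")
    ii = pad.cumsum(0).cumsum(1)
    k = 2 * radius + 1
    # Box sums via the standard four-corner integral-image identity.
    s = ii[k:, k:] - ii[:-k, k:] - ii[k:, :-k] + ii[:-k, :-k]
    return s[: img.shape[0], : img.shape[1]] / (k * k)

def segment(img, radius=3, offset=0.0):
    """Binary segmentation: a pixel is foreground if it exceeds its
    local mean plus an offset, so the threshold adapts across the image."""
    return img > local_mean(img, radius) + offset
```

Because the threshold is computed per-pixel from the surrounding window, bright canal lumens remain detectable even where CBCT intensity varies across the field of view.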
An object-oriented framework for medical image registration, fusion, and visualization.
Zhu, Yang-Ming; Cochoff, Steven M
2006-06-01
An object-oriented framework for image registration, fusion, and visualization was developed based on the classic model-view-controller paradigm. The framework employs many design patterns to facilitate legacy code reuse, manage software complexity, and enhance the maintainability and portability of the framework. Three sample applications built atop this framework are illustrated to show the effectiveness of this framework: the first one is for volume image grouping and re-sampling, the second one is for 2D registration and fusion, and the last one is for visualization of single images as well as registered volume images.
A Distributed GPU-Based Framework for Real-Time 3D Volume Rendering of Large Astronomical Data Cubes
NASA Astrophysics Data System (ADS)
Hassan, A. H.; Fluke, C. J.; Barnes, D. G.
2012-05-01
We present a framework to volume-render three-dimensional data cubes interactively using distributed ray-casting and volume-bricking over a cluster of workstations powered by one or more graphics processing units (GPUs) and a multi-core central processing unit (CPU). The main design target for this framework is to provide an in-core visualization solution able to provide three-dimensional interactive views of terabyte-sized data cubes. We tested the presented framework using a computing cluster comprising 64 nodes with a total of 128 GPUs. The framework proved to be scalable to render a 204 GB data cube with an average of 30 frames per second. Our performance analyses also compare the use of NVIDIA Tesla 1060 and 2050 GPU architectures and the effect of increasing the visualization output resolution on the rendering performance. Although our initial focus, as shown in the examples presented in this work, is volume rendering of spectral data cubes from radio astronomy, we contend that our approach has applicability to other disciplines where close to real-time volume rendering of terabyte-order three-dimensional data sets is a requirement.
Sturla, Francesco; Onorati, Francesco; Puppini, Giovanni; Pappalardo, Omar A; Selmi, Matteo; Votta, Emiliano; Faggian, Giuseppe; Redaelli, Alberto
2017-04-01
Accurate quantification of mitral valve (MV) morphology and dynamic behavior over the cardiac cycle is crucial to understand the mechanisms of degenerative MV dysfunction and to guide the surgical intervention. Cardiac magnetic resonance (CMR) imaging has progressively been adopted to evaluate MV pathophysiology, although a dedicated framework is required to perform a quantitative assessment of the functional MV anatomy. We investigated MV dynamic behavior in subjects with normal MV anatomy (n=10) and patients referred to surgery due to degenerative MV prolapse, classified as fibro-elastic deficiency (FED, n=9) and Barlow's disease (BD, n=10). A CMR-dedicated framework was adopted to evaluate prolapse height and volume and quantitatively assess valvular morphology and papillary muscles (PAPs) function over the cardiac cycle. Multiple comparisons were used to investigate the hallmarks associated with MV degenerative prolapse and evaluate the feasibility of anatomical and functional distinction between FED and BD phenotypes. On average, annular dimensions were significantly (P<0.05) larger in BD than in FED and normal subjects, while no significant differences were noticed between FED and normal. MV eccentricity progressively decreased passing from normal to FED and BD, with the latter exhibiting a rounder annulus shape. Over the cardiac cycle, we noticed significant differences for BD during systole, with an abnormal annular enlargement between mid and late systole (LS) (P<0.001 vs. normal); the PAPs dynamics remained comparable in the three groups. Prolapse height and volume highlighted significant differences among normal, FED and BD valves. Our CMR-dedicated framework allows for the quantitative and dynamic evaluation of the MV apparatus, with quantifiable annular alterations representing the primary hallmark of severe MV degeneration. This may aid surgeons in the evaluation of the severity of MV dysfunction and the selection of the appropriate MV treatment.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brooks, Kriston P.; Sprik, Samuel J.; Tamburello, David A.
The U.S. Department of Energy (DOE) has developed a vehicle framework model to simulate fuel cell-based light-duty vehicle operation for various hydrogen storage systems. This transient model simulates the performance of the storage system, fuel cell, and vehicle for comparison to DOE's Technical Targets using four drive cycles/profiles. Chemical hydrogen storage models have been developed for the Framework model for both exothermic and endothermic materials. Despite the utility of such models, they require that material researchers input system design specifications that cannot be easily estimated. To address this challenge, a design tool has been developed that allows researchers to directly enter kinetic and thermodynamic chemical hydrogen storage material properties into a simple sizing module that then estimates the system parameters required to run the storage system model. Additionally, this design tool can be used as a standalone executable file to estimate the storage system mass and volume outside of the framework model and compare it to the DOE Technical Targets. These models will be explained and exercised with existing hydrogen storage materials.
Ikezoe, Yasuhiro; Washino, Gosuke; Uemura, Takashi; Kitagawa, Susumu; Matsui, Hiroshi
2012-01-01
A variety of microsystems have been developed that harness energy and convert it to mechanical motion. Here we developed new autonomous biochemical motors by integrating a metal-organic framework (MOF) and self-assembling peptides. The MOF is applied as an energy-storing cell that assembles peptides inside the nanoscale pores of the coordination framework. The robust assembling nature of the peptides enables reconfiguring their assemblies at the water-MOF interface, which is converted to fuel energy. Re-organization of hydrophobic peptides can create a large surface tension gradient around the MOF, which efficiently powers the translational motion of the MOF. By comparison, the volume-normalized velocity of the DPA-MOF particle is faster, and the kinetic energy per unit mass of fuel is more than twice as large as that of previous gel motor systems. This demonstration opens new applications of MOFs and reconfigurable molecular self-assembly, and it may evolve into smart autonomous motors that mimic bacteria to swim and harvest target chemicals by integrating recognition units. PMID:23104155
PRISM: An open source framework for the interactive design of GPU volume rendering shaders.
Drouin, Simon; Collins, D Louis
2018-01-01
Direct volume rendering has become an essential tool to explore and analyse 3D medical images. Despite several advances in the field, it remains a challenge to produce an image that highlights the anatomy of interest, avoids occlusion of important structures, and provides an intuitive perception of shape and depth while retaining sufficient contextual information. Although the computer graphics community has proposed several solutions to address specific visualization problems, the medical imaging community still lacks a general volume rendering implementation that can address a wide variety of visualization use cases while avoiding complexity. In this paper, we propose a new open source framework called the Programmable Ray Integration Shading Model, or PRISM, that implements a complete GPU ray-casting solution where critical parts of the ray integration algorithm can be replaced to produce new volume rendering effects. A graphical user interface allows clinical users to easily experiment with pre-existing rendering effect building blocks drawn from an open database. For programmers, the interface enables real-time editing of the code inside the blocks. We show that in its default mode, the PRISM framework produces images very similar to those produced by a widely-adopted direct volume rendering implementation in VTK at comparable frame rates. More importantly, we demonstrate the flexibility of the framework by showing how several volume rendering techniques can be implemented in PRISM with no more than a few lines of code. Finally, we demonstrate the simplicity of our system in a usability study with 5 medical imaging expert subjects who have little or no experience with volume rendering. The PRISM framework has the potential to greatly accelerate development of volume rendering for medical applications by promoting sharing and enabling faster development iterations and easier collaboration between engineers and clinical personnel.
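The ray-integration loop that PRISM makes programmable can be sketched roughly as follows. The `transfer` function, nearest-neighbor sampling, and step parameters are simplified placeholders of our own, not PRISM's actual shader code.

```python
import numpy as np

def cast_ray(volume, origin, direction, step=0.5, n_steps=200,
             transfer=lambda v: (np.array([v, v, v]), v * 0.1)):
    """Front-to-back compositing of color/opacity samples along one ray.

    `transfer` maps a scalar sample to an (rgb, alpha) pair; swapping this
    kind of stage is what a programmable ray-integration model exposes.
    """
    color = np.zeros(3)
    alpha = 0.0
    pos = np.array(origin, dtype=float)
    d = np.array(direction, dtype=float)
    d /= np.linalg.norm(d)
    for _ in range(n_steps):
        idx = np.round(pos).astype(int)  # nearest-neighbor sampling for brevity
        if np.all(idx >= 0) and np.all(idx < volume.shape):
            rgb, a = transfer(volume[tuple(idx)])
            # Standard front-to-back "over" compositing.
            color += (1.0 - alpha) * a * rgb
            alpha += (1.0 - alpha) * a
            if alpha > 0.99:  # early ray termination
                break
        pos += step * d
    return color, alpha
```

A full renderer runs one such loop per pixel on the GPU; replacing the compositing or transfer stage yields different rendering effects without touching the rest of the pipeline.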
Numerical simulation of air hypersonic flows with equilibrium chemical reactions
NASA Astrophysics Data System (ADS)
Emelyanov, Vladislav; Karpenko, Anton; Volkov, Konstantin
2018-05-01
The finite volume method is applied to solve the unsteady three-dimensional compressible Navier-Stokes equations on unstructured meshes. High-temperature gas effects altering the aerodynamics of vehicles are taken into account. The possibilities of using graphics processing units (GPUs) for the simulation of hypersonic flows are demonstrated. Solutions of some test cases on GPUs are reported, and a comparison between computational results for equilibrium chemically reacting and perfect air flowfields is performed. The speedup of the GPU solution with respect to the solution on central processing units (CPUs) is reported. The results obtained provide a promising perspective for designing a GPU-based software framework for practical applications.
Total-dose radiation effects data for semiconductor devices, volume 3
NASA Technical Reports Server (NTRS)
Price, W. E.; Martin, K. E.; Nichols, D. K.; Gauthier, M. K.; Brown, S. F.
1982-01-01
Volume 3 of this three-volume set provides a detailed analysis of the data in Volumes 1 and 2, most of which was generated for the Galileo Orbiter Program in support of NASA space programs. Volume 1 includes total ionizing dose radiation test data on diodes, bipolar transistors, field effect transistors, and miscellaneous discrete solid-state devices. Volume 2 includes similar data on integrated circuits and a few large-scale integrated circuits. The data of Volumes 1 and 2 are combined in graphic format in Volume 3 to provide a comparison of radiation sensitivities of devices of a given type and different manufacturer, a comparison of multiple tests for a single date code, a comparison of multiple tests for a single lot, and a comparison of radiation sensitivities vs. time (date codes). All data were generated using a steady-state 2.5-MeV electron source (Dynamitron) or a Cobalt-60 gamma ray source. The data that compose Volume 3 represent 26 different device types, 224 tests, and a total of 1040 devices. A comparison of the effects of steady-state electrons and Cobalt-60 gamma rays is also presented.
Self-evolving atomistic kinetic Monte Carlo simulations of defects in materials
Xu, Haixuan; Beland, Laurent K.; Stoller, Roger E.; ...
2015-01-29
The recent development of on-the-fly atomistic kinetic Monte Carlo methods has led to an increased amount of attention to the methods and their corresponding capabilities and applications. In this review, the framework and current status of Self-Evolving Atomistic Kinetic Monte Carlo (SEAKMC) are discussed. SEAKMC particularly focuses on defect interaction and evolution with atomistic details without assuming potential defect migration/interaction mechanisms and energies. The strengths and limitations of using an active volume, the key concept introduced in SEAKMC, are discussed. Potential criteria for characterizing an active volume are discussed, and the influence of active volume size on saddle point energies is illustrated. A procedure starting with a small active volume followed by larger active volumes was found to possess higher efficiency. Applications of SEAKMC, ranging from point defect diffusion, to complex interstitial cluster evolution, to helium interaction with tungsten surfaces, are summarized. A comparison of SEAKMC with molecular dynamics and conventional object kinetic Monte Carlo is demonstrated. Overall, SEAKMC is found to be complementary to conventional molecular dynamics, especially when the harmonic approximation of transition state theory is accurate. It is capable of reaching longer time scales than molecular dynamics, and it can be used to systematically increase the accuracy of other methods such as object kinetic Monte Carlo. Finally, the challenges and potential development directions are also outlined.
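The event-selection step underlying kinetic Monte Carlo methods such as SEAKMC can be sketched with the standard residence-time algorithm. The attempt frequency `nu` and barrier values are illustrative; SEAKMC's on-the-fly saddle-point searches inside active volumes are not shown.

```python
import math, random

def kmc_step(barriers, kT=0.025, nu=1e13, rng=random):
    """One residence-time KMC step: pick an event and advance the clock.

    `barriers` are saddle-point energies (eV) for the events currently
    available; `nu` is an assumed attempt frequency (1/s).
    """
    # Harmonic transition-state-theory rates.
    rates = [nu * math.exp(-e / kT) for e in barriers]
    total = sum(rates)
    # Select an event with probability proportional to its rate.
    r = rng.random() * total
    acc, chosen = 0.0, len(rates) - 1
    for i, k in enumerate(rates):
        acc += k
        if r < acc:
            chosen = i
            break
    # Exponentially distributed waiting time.
    dt = -math.log(rng.random()) / total
    return chosen, dt
```

Low-barrier events dominate exponentially, which is why accurate saddle-point energies from the active volume control the fidelity of the simulated trajectory.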
This research outlines a proposed Heavy-Duty Diesel Vehicle Modal Emission Modeling Framework (HDDV-MEMF) for heavy-duty diesel-powered trucks and buses. The heavy-duty vehicle modal modules being developed under this research effort, although different, should be compatible wi...
Brooks, Kriston P.; Sprik, Samuel J.; Tamburello, David A.; ...
2018-04-07
The U.S. Department of Energy (DOE) developed a vehicle Framework model to simulate fuel cell-based light-duty vehicle operation for various hydrogen storage systems. This transient model simulates the performance of the storage system, fuel cell, and vehicle for comparison to Technical Targets established by DOE for four drive cycles/profiles. Chemical hydrogen storage models have been developed for the Framework for both exothermic and endothermic materials. Despite the utility of such models, they require that material researchers input system design specifications that cannot be estimated easily. To address this challenge, a design tool has been developed that allows researchers to directly enter kinetic and thermodynamic chemical hydrogen storage material properties into a simple sizing module that then estimates system parameters required to run the storage system model. Additionally, the design tool can be used as a standalone executable file to estimate the storage system mass and volume outside of the Framework model. Here, these models will be explained and exercised with the representative hydrogen storage materials exothermic ammonia borane (NH3BH3) and endothermic alane (AlH3).
Virtual screening of inorganic materials synthesis parameters with deep learning
NASA Astrophysics Data System (ADS)
Kim, Edward; Huang, Kevin; Jegelka, Stefanie; Olivetti, Elsa
2017-12-01
Virtual materials screening approaches have proliferated in the past decade, driven by rapid advances in first-principles computational techniques and machine-learning algorithms. By comparison, computationally driven materials synthesis screening is still in its infancy, and is mired by the challenges of data sparsity and data scarcity: synthesis routes exist in a sparse, high-dimensional parameter space that is difficult to optimize over directly, and, for some materials of interest, only scarce volumes of literature-reported syntheses are available. In this article, we present a framework for suggesting quantitative synthesis parameters and potential driving factors for synthesis outcomes. We use a variational autoencoder to compress sparse synthesis representations into a lower dimensional space, which is found to improve the performance of machine-learning tasks. To realize this screening framework even in cases where there are few literature data, we devise a novel data augmentation methodology that incorporates literature synthesis data from related materials systems. We apply this variational autoencoder framework to generate potential SrTiO3 synthesis parameter sets, propose driving factors for brookite TiO2 formation, and identify correlations between alkali-ion intercalation and MnO2 polymorph selection.
Parkes, Marie V; Staiger, Chad L; Perry, John J; Allendorf, Mark D; Greathouse, Jeffery A
2013-06-21
The adsorption of noble gases and nitrogen by sixteen metal-organic frameworks (MOFs) was investigated using grand canonical Monte Carlo simulation. The MOFs were chosen to represent a variety of net topologies, pore dimensions, and metal centers. Three commercially available MOFs (HKUST-1, AlMIL-53, and ZIF-8) and PCN-14 were also included for comparison. Experimental adsorption isotherms, obtained from volumetric and gravimetric methods, were used to compare krypton, argon, and nitrogen uptake with the simulation results. Simulated trends in gas adsorption and predicted selectivities among the commercially available MOFs are in good agreement with experiment. In the low pressure regime, the expected trend of increasing adsorption with increasing noble gas polarizability is seen. For each noble gas, low pressure adsorption correlates with several MOF properties, including free volume, topology, and metal center. Additionally, a strong correlation exists between the Henry's constant and the isosteric heat of adsorption for all gases and MOFs considered. Finally, we note that the simulated and experimental gas selectivities demonstrated by this small set of MOFs show improved performance compared to similar values reported for zeolites.
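A grand canonical Monte Carlo calculation of the kind used here can be illustrated with a minimal lattice-gas sketch: independent adsorption sites with an illustrative binding energy and chemical potential, rather than the atomistic force fields and continuous insertion/deletion moves a real MOF simulation employs. For non-interacting sites the sampled coverage should reproduce the Langmuir isotherm, which gives a built-in correctness check.

```python
import math, random

def gcmc_lattice(n_sites=200, eps=-1.0, mu=-1.5, kT=0.3, moves=40000, seed=1):
    """Grand canonical MC on independent adsorption sites (lattice-gas model).

    Each site binds at most one particle with energy `eps` (illustrative);
    chemical potential `mu` plays the role of gas pressure.
    """
    rng = random.Random(seed)
    occ = [False] * n_sites
    n = 0
    total, samples = 0.0, 0
    for m in range(moves):
        i = rng.randrange(n_sites)
        dN = -1 if occ[i] else 1          # deletion or insertion attempt
        dE = -eps if occ[i] else eps      # energy change of flipping site i
        # Metropolis acceptance at fixed chemical potential.
        if rng.random() < min(1.0, math.exp(-(dE - mu * dN) / kT)):
            occ[i] = not occ[i]
            n += dN
        if m >= moves // 2:               # accumulate after burn-in
            total += n
            samples += 1
    return total / (samples * n_sites)    # mean fractional coverage
```

The analytic coverage for this model is 1 / (1 + exp((eps - mu)/kT)); the simulated average should converge to it.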
ECTA/DaSy Framework Self-Assessment Comparison Tool
ERIC Educational Resources Information Center
Center for IDEA Early Childhood Data Systems (DaSy), 2016
2016-01-01
The Self-Assessment Comparison (SAC) Tool is for state Part C and Section 619/Preschool programs to use to assess changes in the implementation of one or more components of the ECTA System Framework and/or subcomponents of the DaSy Data System Framework. It is a companion to the ECTA/DaSy Framework Self-Assessment. Key features of the SAC are…
NASA Astrophysics Data System (ADS)
Lefèvre, Victor; Lopez-Pamies, Oscar
2017-02-01
This paper presents an analytical framework to construct approximate homogenization solutions for the macroscopic elastic dielectric response - under finite deformations and finite electric fields - of dielectric elastomer composites with two-phase isotropic particulate microstructures. The central idea consists in employing the homogenization solution derived in Part I of this work for ideal elastic dielectric composites within the context of a nonlinear comparison medium method - this is derived as an extension of the comparison medium method of Lopez-Pamies et al. (2013) in nonlinear elastostatics to the coupled realm of nonlinear electroelastostatics - to generate in turn a corresponding solution for composite materials with non-ideal elastic dielectric constituents. Complementary to this analytical framework, a hybrid finite-element formulation to construct homogenization solutions numerically (in three dimensions) is also presented. The proposed analytical framework is utilized to work out a general approximate homogenization solution for non-Gaussian dielectric elastomers filled with nonlinear elastic dielectric particles that may exhibit polarization saturation. The solution applies to arbitrary (non-percolative) isotropic distributions of filler particles. By construction, it is exact in the limit of small deformations and moderate electric fields. For finite deformations and finite electric fields, its accuracy is demonstrated by means of direct comparisons with finite-element solutions. Aimed at gaining physical insight into the extreme enhancement in electrostriction properties displayed by emerging dielectric elastomer composites, various cases wherein the filler particles are of poly- and mono-disperse sizes and exhibit different types of elastic dielectric behavior are discussed in detail. 
Contrary to an initial conjecture in the literature, it is found (inter alia) that the isotropic addition of a small volume fraction of stiff (semi-)conducting/high-permittivity particles to dielectric elastomers does not lead to the extreme electrostriction enhancements observed in experiments. It is posited that such extreme enhancements are the manifestation of interphasial phenomena.
The informatics of a C57BL/6J mouse brain atlas.
MacKenzie-Graham, Allan; Jones, Eagle S; Shattuck, David W; Dinov, Ivo D; Bota, Mihail; Toga, Arthur W
2003-01-01
The Mouse Atlas Project (MAP) aims to produce a framework for organizing and analyzing the large volumes of neuroscientific data produced by the proliferation of genetically modified animals. Atlases provide an invaluable aid in understanding the impact of genetic manipulations by providing a standard for comparison. We use a digital atlas as the hub of an informatics network, correlating imaging data, such as structural imaging and histology, with text-based data, such as nomenclature, connections, and references. We generated brain volumes using magnetic resonance microscopy (MRM), classical histology, and immunohistochemistry, and registered them into a common and defined coordinate system. Specially designed viewers were developed in order to visualize multiple datasets simultaneously and to coordinate between textual and image data. Researchers can navigate through the brain interchangeably, in either a text-based or image-based representation that automatically updates information as they move. The atlas also allows the independent entry of other types of data, the facile retrieval of information, and the straightforward display of images. In conjunction with centralized servers, image and text data can be kept current and can decrease the burden on individual researchers' computers. A comprehensive framework that encompasses many forms of information in the context of anatomic imaging holds tremendous promise for producing new insights. The atlas and associated tools can be found at http://www.loni.ucla.edu/MAP.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xia, Yidong; Andrs, David; Martineau, Richard Charles
This document presents the theoretical background for a hybrid finite-element / finite-volume fluid flow solver, namely BIGHORN, based on the Multiphysics Object Oriented Simulation Environment (MOOSE) computational framework developed at the Idaho National Laboratory (INL). An overview of the numerical methods used in BIGHORN is given, followed by a presentation of the formulation details. The document begins with the governing equations for compressible fluid flow, with an outline of the requisite constitutive relations. A second-order finite volume method used for solving compressible fluid flow problems is presented next. A Pressure-Corrected Implicit Continuous-fluid Eulerian (PCICE) formulation for time integration is also presented. The multi-fluid formulation is under development; although it is not yet complete, BIGHORN has been designed to handle multi-fluid problems. Due to the flexibility in the underlying MOOSE framework, BIGHORN is quite extensible and can accommodate both multi-species and multi-phase formulations. This document also presents a suite of verification & validation benchmark test problems for BIGHORN. The intent of this suite of problems is to provide baseline comparison data that demonstrate the performance of the BIGHORN solution methods on problems that vary in complexity from laminar to turbulent flows. Wherever possible, some form of solution verification has been attempted to identify sensitivities in the solution methods and suggest best practices when using BIGHORN.
Methods for synthesizing microporous crystals and microporous crystal membranes
Dutta, Prabir; Severance, Michael; Sun, Chenhu
2017-02-07
A method of making a microporous crystal material, comprising: a. forming a mixture comprising NaOH, water, and one or more of an aluminum source, a silicon source, and a phosphate source, whereupon the mixture forms a gel; b. heating the gel for a first time period, whereupon a first volume of water is removed from the gel and microporous crystal nuclei form, the nuclei having a framework; c. (if a membrane is to be formed) applying the gel to a solid support seeded with microporous crystals having a framework that is the same as the framework of the nuclei; and d. heating the gel for a second time period, during which a second volume of water is added to the gel; wherein the rate of addition of the second volume of water is between about 0.5 and about 2.0 fold the rate of removal of the first volume of water.
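The quantitative constraint in the final clause (addition rate between about 0.5x and 2.0x the removal rate) can be expressed as a simple check. Function names and units here are ours, for illustration only.

```python
def rate(volume_ml, minutes):
    """Average water transfer rate (mL/min) over a heating period."""
    return volume_ml / minutes

def addition_rate_ok(removal_rate, addition_rate, low=0.5, high=2.0):
    """True if water is added at between `low` and `high` times the rate
    at which it was removed during the first heating period (inclusive)."""
    return low * removal_rate <= addition_rate <= high * removal_rate
```

For example, if 10 mL was removed over 20 minutes (0.5 mL/min), additions between 0.25 and 1.0 mL/min would satisfy the claimed range.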
Onorati, Francesco; Puppini, Giovanni; Pappalardo, Omar A.; Selmi, Matteo; Votta, Emiliano; Faggian, Giuseppe; Redaelli, Alberto
2017-01-01
Background Accurate quantification of mitral valve (MV) morphology and dynamic behavior over the cardiac cycle is crucial to understand the mechanisms of degenerative MV dysfunction and to guide the surgical intervention. Cardiac magnetic resonance (CMR) imaging has progressively been adopted to evaluate MV pathophysiology, although a dedicated framework is required to perform a quantitative assessment of the functional MV anatomy. Methods We investigated MV dynamic behavior in subjects with normal MV anatomy (n=10) and patients referred to surgery due to degenerative MV prolapse, classified as fibro-elastic deficiency (FED, n=9) and Barlow’s disease (BD, n=10). A CMR-dedicated framework was adopted to evaluate prolapse height and volume and quantitatively assess valvular morphology and papillary muscles (PAPs) function over the cardiac cycle. Multiple comparison was used to investigate the hallmarks associated to MV degenerative prolapse and evaluate the feasibility of anatomical and functional distinction between FED and BD phenotypes. Results On average, annular dimensions were significantly (P<0.05) larger in BD than in FED and normal subjects while no significant differences were noticed between FED and normal. MV eccentricity progressively decreased passing from normal to FED and BD, with the latter exhibiting a rounder annulus shape. Over the cardiac cycle, we noticed significant differences for BD during systole with an abnormal annular enlargement between mid and late systole (LS) (P<0.001 vs. normal); the PAPs dynamics remained comparable in the three groups. Prolapse height and volume highlighted significant differences among normal, FED and BD valves. Conclusions Our CMR-dedicated framework allows for the quantitative and dynamic evaluation of MV apparatus, with quantifiable annular alterations representing the primary hallmark of severe MV degeneration. 
This may aid surgeons in evaluating the severity of MV dysfunction and selecting the appropriate MV treatment. PMID:28540065
Unimodular Einstein-Cartan gravity: Dynamics and conservation laws
NASA Astrophysics Data System (ADS)
Bonder, Yuri; Corral, Cristóbal
2018-04-01
Unimodular gravity is an interesting approach to address the cosmological constant problem, since the vacuum energy density of quantum fields does not gravitate in this framework, and the cosmological constant appears as an integration constant. These features arise as a consequence of considering a constrained volume-element 4-form that breaks diffeomorphism invariance down to volume-preserving diffeomorphisms. In this work, the first-order formulation of unimodular gravity is presented by considering the spin density of matter fields as a source of spacetime torsion. Even though the most general matter Lagrangian allowed by the symmetries is considered, dynamical restrictions arise on its functional dependence. The field equations are obtained and the conservation laws associated with the symmetries are derived. It is found that, analogous to torsion-free unimodular gravity, the field equation for the vierbein is traceless; nevertheless, torsion is algebraically related to the spin density as in standard Einstein-Cartan theory. The particular example of massless Dirac spinors is studied, and comparisons with standard Einstein-Cartan theory are shown.
Development of an upwind, finite-volume code with finite-rate chemistry
NASA Technical Reports Server (NTRS)
Molvik, Gregory A.
1994-01-01
Under this grant, two numerical algorithms were developed to predict the flow of viscous, hypersonic, chemically reacting gases over three-dimensional bodies. Both algorithms take advantage of the benefits of upwind differencing, total variation diminishing techniques, and a finite-volume framework, but obtain their solution in two separate manners. The first algorithm is a zonal, time-marching scheme, and is generally used to obtain solutions in the subsonic portions of the flow field. The second algorithm is a much less expensive, space-marching scheme and can be used for the computation of the larger, supersonic portion of the flow field. Both codes compute their interface fluxes with a temporal Riemann solver and the resulting schemes are made fully implicit including the chemical source terms and boundary conditions. Strong coupling is used between the fluid dynamic, chemical, and turbulence equations. These codes have been validated on numerous hypersonic test cases and have provided excellent comparison with existing data.
3D numerical simulations of oblique droplet impact onto a deep liquid pool
NASA Astrophysics Data System (ADS)
Gelderblom, Hanneke; Reijers, Sten A.; Gielen, Marise; Sleutel, Pascal; Lohse, Detlef; Xie, Zhihua; Pain, Christopher C.; Matar, Omar K.
2017-11-01
We study the fluid dynamics of three-dimensional oblique droplet impact, which results in phenomena that include splashing and cavity formation. An adaptive, unstructured mesh modelling framework is employed here, which can modify and adapt unstructured meshes to better represent the underlying physics of droplet dynamics and reduce computational effort without sacrificing accuracy. The numerical framework consists of a mixed control-volume and finite-element formulation and a volume-of-fluid-type method for interface capturing based on a compressive control-volume advection method. The framework also features second-order finite-element methods and a force-balanced algorithm for the surface tension implementation, minimising the spurious velocities often found in many simulations involving capillary-driven flows. The numerical results generated using this framework are compared with high-speed images of the interfacial shapes of the deformed droplet and the cavity formed upon impact, yielding good agreement. EPSRC, UK, MEMPHIS program Grant (EP/K003976/1), RAEng Research Chair (OKM).
Fast algorithm for probabilistic bone edge detection (FAPBED)
NASA Astrophysics Data System (ADS)
Scepanovic, Danilo; Kirshtein, Joshua; Jain, Ameet K.; Taylor, Russell H.
2005-04-01
The registration of preoperative CT to intra-operative reality systems is a crucial step in Computer Assisted Orthopedic Surgery (CAOS). The intra-operative sensors include 3D digitizers, fiducials, X-rays, and ultrasound (US). FAPBED is designed to process CT volumes for registration to tracked US data. Tracked US is advantageous because it is real-time, noninvasive, and non-ionizing, but it is also known to have inherent inaccuracies, which create the need to develop a framework that is robust to various uncertainties and useful in US-CT registration. Furthermore, conventional registration methods depend on accurate and absolute segmentation. Our proposed probabilistic framework addresses the segmentation-registration duality, wherein exact segmentation is not a prerequisite to achieving accurate registration. In this paper, we develop a method for fast and automatic probabilistic bone surface (edge) detection in CT images. Various features that influence the likelihood of the surface at each spatial coordinate are combined using a simple probabilistic framework, which strikes a fair balance between a high-level understanding of features in an image and the low-level number crunching of standard image processing techniques. The algorithm evaluates different features for detecting the probability of a bone surface at each voxel and compounds the results of these methods to yield a final, low-noise probability map of bone surfaces in the volume. Such a probability map can then be used in conjunction with a similar map from tracked intra-operative US to achieve accurate registration. Eight sample pelvic CT scans were used to extract feature parameters and validate the final probability maps. An un-optimized, fully automatic Matlab implementation runs in five minutes per CT volume on average, and was validated by comparison against hand-segmented gold standards.
The mean probability assigned to nonzero surface points was 0.8, while nonzero non-surface points had a mean value of 0.38 indicating clear identification of surface points on average. The segmentation was also sufficiently crisp, with a full width at half maximum (FWHM) value of 1.51 voxels.
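The compounding of per-voxel feature likelihoods into a single probability map can be sketched as follows. This is an illustrative assumption only: the feature names, weights, and the weighted-geometric-mean combination rule below are hypothetical, not the actual FAPBED formulation.

```python
# Hypothetical sketch: compounding per-voxel feature likelihoods into one
# bone-surface probability map. Features, weights, and the combination rule
# (weighted geometric mean) are illustrative assumptions, not FAPBED itself.

def combine_likelihoods(feature_maps, weights):
    """Weighted geometric mean of per-voxel likelihoods, each in [0, 1]."""
    total_w = float(sum(weights))
    n_voxels = len(feature_maps[0])
    prob = []
    for v in range(n_voxels):
        acc = 1.0
        for fmap, w in zip(feature_maps, weights):
            # Floor at a tiny value so a zero likelihood does not zero the power
            acc *= max(fmap[v], 1e-12) ** (w / total_w)
        prob.append(acc)
    return prob

# Two toy features over four voxels: edge-gradient strength and how bone-like
# the CT intensity is; the gradient feature is weighted twice as heavily.
gradient = [0.9, 0.2, 0.8, 0.1]
intensity = [0.8, 0.3, 0.9, 0.05]
p = combine_likelihoods([gradient, intensity], weights=[2.0, 1.0])
```

A geometric mean keeps the result in [0, 1] and lets any feature with near-zero likelihood strongly suppress a voxel, which mimics the "compounding" behavior described in the abstract.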
New Horizons in Adult Education. Volumes 3-7. 1989-1993.
ERIC Educational Resources Information Center
New Horizons in Adult Education, 1993
1993-01-01
Volume 3 includes the following: "Comparison of Computer and Audio Teleconferencing" (Norman Coombs); "Intellectual Suppression" [book review] (Roger Boshier). Contents of volume 4 are as follows: "Straight Time and Standard Brand Adult Education" (John Ohliger); "Comparison of Folk High Schools in Denmark, and…
Assessment of the hybrid propagation model, Volume 2: Comparison with the Integrated Noise Model
DOT National Transportation Integrated Search
2012-08-31
This is the second of two volumes of the report on the Hybrid Propagation Model (HPM), an advanced prediction model for aviation noise propagation. This volume presents comparisons of the HPM and the Integrated Noise Model (INM) for conditions of une...
Generalized Fourier analyses of the advection-diffusion equation - Part I: one-dimensional domains
NASA Astrophysics Data System (ADS)
Christon, Mark A.; Martinez, Mario J.; Voth, Thomas E.
2004-07-01
This paper presents a detailed multi-methods comparison of the spatial errors associated with finite difference, finite element and finite volume semi-discretizations of the scalar advection-diffusion equation. The errors are reported in terms of non-dimensional phase and group speed, discrete diffusivity, artificial diffusivity, and grid-induced anisotropy. It is demonstrated that Fourier analysis provides an automatic process for separating the discrete advective operator into its symmetric and skew-symmetric components and characterizing the spectral behaviour of each operator. For each of the numerical methods considered, asymptotic truncation error and resolution estimates are presented for the limiting cases of pure advection and pure diffusion. It is demonstrated that streamline upwind Petrov-Galerkin and its control-volume finite element analogue, the streamline upwind control-volume method, produce both an artificial diffusivity and a concomitant phase speed adjustment in addition to the usual semi-discrete artifacts observed in the phase speed, group speed and diffusivity. The Galerkin finite element method and its streamline upwind derivatives are shown to exhibit super-convergent behaviour in terms of phase and group speed when a consistent mass matrix is used in the formulation. In contrast, the control-volume finite element method (CVFEM) and its streamline upwind derivatives yield strictly second-order behaviour. In Part II of this paper, we consider two-dimensional semi-discretizations of the advection-diffusion equation and also assess the effects of grid-induced anisotropy observed in the non-dimensional phase speed and the discrete and artificial diffusivities. Although this work can only be considered a first step in a comprehensive multi-methods analysis and comparison, it serves to identify some of the relative strengths and weaknesses of multiple numerical methods in a common analysis framework. Published in 2004 by John Wiley & Sons, Ltd.
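As a concrete instance of the kind of Fourier analysis described, the classical non-dimensional phase speed of the second-order centered-difference advection operator, c*/c = sin(kh)/(kh), can be evaluated directly. This is a standard textbook result for pure advection, shown here only to illustrate the analysis; it is not one of the paper's specific operators.

```python
import math

# For u_t + a u_x = 0, the centered difference (u[j+1] - u[j-1]) / (2h) has
# modified wavenumber k* = sin(k h) / h, so the semi-discrete non-dimensional
# phase speed is c*/c = sin(k h) / (k h).  Generic textbook example.

def phase_speed_ratio(kh):
    """Non-dimensional phase speed of centered-difference advection."""
    return math.sin(kh) / kh

# Well-resolved waves travel at nearly the exact speed; coarse ones lag.
coarse = phase_speed_ratio(math.pi / 2)   # 4 grid points per wavelength
fine = phase_speed_ratio(math.pi / 16)    # 32 grid points per wavelength
```

At four points per wavelength the wave travels at only 2/pi of the exact speed, which is the sort of dispersive lag the paper's phase-speed plots quantify for each method.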
Automatic short axis orientation of the left ventricle in 3D ultrasound recordings
NASA Astrophysics Data System (ADS)
Pedrosa, João.; Heyde, Brecht; Heeren, Laurens; Engvall, Jan; Zamorano, Jose; Papachristidis, Alexandros; Edvardsen, Thor; Claus, Piet; D'hooge, Jan
2016-04-01
The recent advent of three-dimensional echocardiography has led to an increased interest from the scientific community in left ventricle segmentation frameworks for cardiac volume and function assessment. An automatic orientation of the segmented left ventricular mesh is an important step to obtain a point-to-point correspondence between the mesh and the cardiac anatomy. Furthermore, this would allow for an automatic division of the left ventricle into the standard 17 segments and, thus, fully automatic per-segment analysis, e.g. regional strain assessment. In this work, a method for fully automatic short axis orientation of the segmented left ventricle is presented. The proposed framework aims at detecting the inferior right ventricular insertion point. 211 three-dimensional echocardiographic images were used to validate this framework by comparison to manual annotation of the inferior right ventricular insertion point. A mean unsigned error of 8.05° +/- 18.50° was found, whereas the mean signed error was 1.09°. Large deviations between the manual and automatic annotations (>30°) only occurred in 3.79% of cases. The average computation time was 666 ms in a non-optimized MATLAB environment, which makes real-time application feasible. In conclusion, a successful automatic real-time method for orientation of the segmented left ventricle is proposed.
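The distinction the abstract draws between mean signed and mean unsigned angular error (signed errors can cancel; unsigned ones cannot) can be sketched with wrap-around-aware arithmetic. The numbers below are toy values, not the study's data.

```python
# Sketch of signed vs. unsigned angular error with wrap-around to (-180, 180].
# Toy angles only; not the study's annotations.

def signed_angle_error(auto_deg, manual_deg):
    """Signed difference in degrees, wrapped to (-180, 180]."""
    d = (auto_deg - manual_deg + 180.0) % 360.0 - 180.0
    return 180.0 if d == -180.0 else d

auto = [10.0, 350.0, 185.0]
manual = [5.0, 0.0, 175.0]
errs = [signed_angle_error(a, m) for a, m in zip(auto, manual)]
# 350 vs 0 wraps to -10, not +350
mean_signed = sum(errs) / len(errs)
mean_unsigned = sum(abs(e) for e in errs) / len(errs)
```

Here opposite-sign errors largely cancel in the signed mean while the unsigned mean stays large, mirroring the abstract's 1.09° vs. 8.05° pattern.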
NASA Astrophysics Data System (ADS)
Huo, Jianqiang; Yan, Shuai; Li, Haiqiang; Yu, Donghui; Arulsamy, Navamoney
2018-03-01
A series of three-dimensional coordination polymers, namely [Cd(BIMB)(SCA)]n (1) and [M(BIMB)(trans-CHDC)]n (2, M = Cd2+; 3, M = Co2+), where BIMB = 1,4-di(1H-imidazol-1-yl)benzene, SCA2- = succinate dianion, and CHDC2- = cyclohexane-1,4-dicarboxylate dianion, are synthesized by hydro/solvothermal methods. The products are characterized by elemental analysis and single-crystal X-ray diffraction data. Both the dianion and BIMB bridge different pairs of the metal ions; the three complexes are polymeric, and their three-dimensional topology features a diamond-like metal-organic framework (MOF). Owing to the length of the two bridging ligands, moderately sized voids are formed in the diamondoid networks. However, the voids are filled by mutual interpenetration of four independent equivalent frameworks in a 5-fold interpenetrating architecture, and there is no sufficient void volume available for any guest molecules. The phase purity and thermal stability of the compounds are verified by powder X-ray diffraction (PXRD) and thermogravimetric (TG) data. The solid-state fluorescence spectra of the 3d10 Cd2+ MOFs 1 and 2 reveal significant enhancement in their emission intensities in comparison to non-metallated BIMB. The enhanced emission is attributed to perturbation of intra-ligand emission states due to Cd2+ coordination.
Lerner, Eitan; Ploetz, Evelyn; Hohlbein, Johannes; Cordes, Thorben; Weiss, Shimon
2016-07-07
Single-molecule, protein-induced fluorescence enhancement (PIFE) serves as a molecular ruler at molecular distances inaccessible to other spectroscopic rulers such as Förster-type resonance energy transfer (FRET) or photoinduced electron transfer. In order to provide two simultaneous measurements of two distances on different molecular length scales for the analysis of macromolecular complexes, we and others recently combined measurements of PIFE and FRET (PIFE-FRET) on the single molecule level. PIFE relies on steric hindrance of the fluorophore Cy3, which is covalently attached to a biomolecule of interest, to rotate out of an excited-state trans isomer to the cis isomer through a 90° intermediate. In this work, we provide a theoretical framework that accounts for relevant photophysical and kinetic parameters of PIFE-FRET, show how this framework allows the extraction of the fold-decrease in isomerization mobility from experimental data, and show how these results provide information on changes in the accessible volume of Cy3. The utility of this model is then demonstrated for experimental results on PIFE-FRET measurement of different protein-DNA interactions. The proposed model and extracted parameters could serve as a benchmark to allow quantitative comparison of PIFE effects in different biological systems.
DOT National Transportation Integrated Search
1978-09-01
This report documents comparisons between extensive rail freight service measurements (previously presented in Volume II) and simulations of the same operations using a sophisticated train performance calculator computer program. The comparisons cove...
Personalized models of bones based on radiographic photogrammetry.
Berthonnaud, E; Hilmi, R; Dimnet, J
2009-07-01
Radiographic photogrammetry is applied to locate anatomical landmarks in space from their two projected images. The goal of this paper is to define a personalized geometric model of bones based solely on photogrammetric reconstructions. The personalized models of bones are obtained in two successive steps: their functional frameworks are first determined experimentally, and then the 3D bone representation results from modeling techniques. Each bone's functional framework is derived from direct measurements upon two radiographic images. These images may be obtained using either perpendicular (spine and sacrum) or oblique incidences (pelvis and lower limb). Frameworks link together their functional axes and punctual landmarks. Each global bone volume is decomposed into several elementary components. Each volumic component is represented by simple geometric shapes. Volumic shapes are articulated to the patient's bone structure. The volumic personalization is obtained by best fitting the geometric model projections to their real images, using adjustable articulations. Examples are presented to illustrate the technique of personalization of bone volumes, directly issued from the treatment of only two radiographic images. The chosen techniques for treating data are then discussed. The 3D representation of bones completes, for clinical users, the information brought by radiographic images.
NASA Astrophysics Data System (ADS)
Min, Byung Jun; Nam, Heerim; Jeong, Il Sun; Lee, Hyebin
2015-07-01
In recent years, the use of a picture archiving and communication system (PACS) for radiation therapy has become the norm in hospital environments and has been suggested for collecting and managing data using Digital Imaging and Communications in Medicine (DICOM) objects from different treatment planning systems (TPSs). However, some TPSs do not provide the ability to export the dose-volume histogram (DVH) in text or other formats. In addition, plan review systems for various TPSs often recalculate DVHs with different algorithms, which results in inevitable discrepancies between the recalculated values and those obtained from the TPS itself. The purpose of this study was to develop a simple method for generating reproducible DVH values from the TPSs. Treatment planning information, including structures and delivered dose, was exported in the DICOM format from the Eclipse v8.9 or the Pinnacle v9.6 planning systems. Supersampling and trilinear interpolation were employed to calculate the DVH data from 35 treatment plans. The discrepancies between the DVHs extracted from each TPS and those obtained with the proposed calculation method were evaluated with respect to the supersampling ratio. The volume, minimum dose, maximum dose, and mean dose were compared. The variations in DVHs from multiple TPSs were compared by using the MIM software v6.1, a commercially available treatment plan comparison tool. The overall comparisons of the volume, minimum dose, maximum dose, and mean dose showed that the proposed method generated smaller discrepancies relative to the TPS than the MIM software did. As the structure volume decreased, the overall percent difference increased. The largest differences were observed in small organs, such as the eyeball, eye lens, and optic nerve, which had volumes below 10 cc.
A simple and useful technique was developed to generate a DVH with an acceptable error from a proprietary TPS. This study provides a convenient and common framework that will allow the use of a single well-managed storage solution for an independent information system.
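The basic cumulative-DVH computation underlying the abstract can be sketched as follows. The study's supersampling and trilinear interpolation of the dose grid are not reproduced here, and all numbers are illustrative.

```python
# Minimal sketch of a cumulative dose-volume histogram (DVH): for each dose
# level, the volume receiving at least that dose.  Illustrative only; the
# study's supersampling/interpolation pipeline is not reproduced.

def cumulative_dvh(voxel_doses_gy, voxel_volume_cc, bin_width_gy=1.0):
    """Return (dose levels, volume in cc receiving >= each level)."""
    max_dose = max(voxel_doses_gy)
    levels, volumes = [], []
    d = 0.0
    while d <= max_dose:
        vol = sum(voxel_volume_cc for v in voxel_doses_gy if v >= d)
        levels.append(d)
        volumes.append(vol)
        d += bin_width_gy
    return levels, volumes

# Four voxels of 2 cc each with toy doses
doses = [10.0, 20.0, 20.0, 30.0]
levels, volumes = cumulative_dvh(doses, voxel_volume_cc=2.0, bin_width_gy=10.0)
```

Supersampling, in this context, would subdivide each voxel before the threshold test so that partial-volume voxels at structure boundaries contribute fractionally, which is why the abstract evaluates discrepancies against the supersampling ratio.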
Economical and ecological comparison of granular activated carbon (GAC) adsorber refill strategies.
Bayer, Peter; Heuer, Edda; Karl, Ute; Finkel, Michael
2005-05-01
Technical constraints can leave considerable freedom in the design of a technology, production, or service strategy. Choosing between economic and ecological decision criteria then characteristically leads to controversial solutions of ideal systems. For the adaptation of granular activated carbon (GAC) fixed beds, various technical factors determine the adsorber volume required to achieve a desired service life. In considering carbon replacement and recycling, a variety of refill strategies are available that differ in terms of refill interval, respective adsorber volume, and the time-dependent use of virgin as well as recycled GAC. Focusing on the treatment of contaminated groundwater, we compare cost-optimal reactor configurations and refill strategies to the ecologically best alternatives. Costs and consumption of GAC are quantified within a technical-economical framework. The emissions from GAC production from hard coal, transport, and recycling are equally derived through a life cycle impact assessment. It is shown how high discount rates lead to a preference for small fixed-bed volumes and, accordingly, a high number of refills. For fixed discount rates, the investigation reveals that both the economic and the ecological assessments of refill strategies are especially sensitive to the relative valuation of virgin and recycled GAC. Since recycling results in economic and ecological benefits, optimized systems thus may differ only slightly.
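The effect of the discount rate on refill-strategy preference can be illustrated with a simple net-present-cost comparison of two hypothetical strategies. All figures below are invented for illustration and are not from the study.

```python
# Sketch: discounted cost of periodic GAC refills.  A high discount rate
# favors deferring spending, i.e., small beds refilled often over large beds
# refilled rarely.  All costs and intervals are illustrative assumptions.

def present_cost(refill_cost, interval_years, horizon_years, rate):
    """Sum of refill costs discounted to present value over a horizon."""
    cost, t = 0.0, 0.0
    while t < horizon_years:
        cost += refill_cost / (1.0 + rate) ** t
        t += interval_years
    return cost

# Same undiscounted total (200) over 20 years: small bed refilled yearly
# vs. large bed refilled every five years, at an 8% discount rate.
small_frequent = present_cost(refill_cost=10.0, interval_years=1,
                              horizon_years=20, rate=0.08)
large_rare = present_cost(refill_cost=50.0, interval_years=5,
                          horizon_years=20, rate=0.08)
```

At a zero discount rate both strategies cost the same; at 8% the frequently refilled small bed is cheaper in present-value terms, matching the abstract's observation that high discount rates favor small fixed-bed volumes.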
NASA Astrophysics Data System (ADS)
Barreca, Giovanni; Branca, Stefano; Monaco, Carmelo
2018-03-01
3-D modeling of Mount Etna, the largest and most active volcano in Europe, has for the first time enabled the acquisition of new information on the volumes of products emitted during the volcanic phases that formed Mount Etna, particularly during the last 60 ka, an issue not previously fully addressed. Volumes emitted over time allow determining the trend of eruption rates during the volcano's lifetime, also highlighting a drastic increase of emitted products in the last 15 ka. The comparison of Mount Etna's eruption rates with those of other volcanic systems in different geodynamic frameworks worldwide revealed that since 60 ka ago, eruption rates have reached a value near that of oceanic-arc volcanic systems, although Mount Etna is considered a continental rift strato-volcano. This finding agrees well with previous studies on a possible transition of Mount Etna's magmatic source from plume-related to island-arc related. As suggested by tomographic studies, trench-parallel breakoff of the Ionian slab has occurred north of Mount Etna. The formation of a slab gateway right between the Aeolian magmatic province and the Mount Etna area probably induced a previously softened and fluid-enriched suprasubduction mantle wedge to flow toward the volcano, with consequent magmatic source mixing.
NASA Astrophysics Data System (ADS)
Penfold, Scott; Zalas, Rafał; Casiraghi, Margherita; Brooke, Mark; Censor, Yair; Schulte, Reinhard
2017-05-01
A split feasibility formulation for the inverse problem of intensity-modulated radiation therapy treatment planning with dose-volume constraints included in the planning algorithm is presented. It involves a new type of sparsity constraint that enables the inclusion of a percentage-violation constraint in the model problem and its handling by continuous (as opposed to integer) methods. We propose an iterative algorithmic framework for solving such a problem by applying the feasibility-seeking CQ-algorithm of Byrne combined with the automatic relaxation method that uses cyclic projections. Detailed implementation instructions are furnished. Functionality of the algorithm was demonstrated through the creation of an intensity-modulated proton therapy plan for a simple 2D C-shaped geometry and also for a realistic base-of-skull chordoma treatment site. Monte Carlo simulations of proton pencil beams of varying energy were conducted to obtain dose distributions for the 2D test case. A research release of the Pinnacle 3 proton treatment planning system was used to extract pencil beam doses for a clinical base-of-skull chordoma case. In both cases the beamlet doses were calculated to satisfy dose-volume constraints according to our new algorithm. Examination of the dose-volume histograms following inverse planning with our algorithm demonstrated that it performed as intended. The application of our proposed algorithm to dose-volume constraint inverse planning was successfully demonstrated. Comparison with optimized dose distributions from the research release of the Pinnacle 3 treatment planning system showed the algorithm could achieve equivalent or superior results.
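The core feasibility-seeking step of Byrne's CQ algorithm, x ← P_C(x + γAᵀ(P_Q(Ax) − Ax)), is compact enough to sketch. The two-beamlet, two-voxel "plan" below is entirely hypothetical, and the paper's sparsity and percentage-violation constraints (and the automatic relaxation method) are not reproduced; only the basic CQ iteration is shown.

```python
# Sketch of Byrne's CQ iteration for the split feasibility problem:
# find x in C = {x >= 0} with A x in Q = [lo, hi] componentwise.
# Toy 2x2 "dose matrix" and bounds; not the paper's planning formulation.

def cq_iterate(x, A, lo, hi, gamma=0.1, iters=500):
    m, n = len(A), len(x)
    for _ in range(iters):
        Ax = [sum(A[i][j] * x[j] for j in range(n)) for i in range(m)]
        proj = [min(max(Ax[i], lo[i]), hi[i]) for i in range(m)]  # P_Q
        resid = [proj[i] - Ax[i] for i in range(m)]
        for j in range(n):
            x[j] += gamma * sum(A[i][j] * resid[i] for i in range(m))
            x[j] = max(x[j], 0.0)                                  # P_C
    return x

# Two beamlets, two voxels: target voxel dose in [1.0, 1.2], organ-at-risk
# voxel dose in [0.0, 0.5] (all numbers illustrative).
A = [[1.0, 0.5],
     [0.3, 0.8]]
x = cq_iterate([0.0, 0.0], A, lo=[1.0, 0.0], hi=[1.2, 0.5])
dose = [sum(A[i][j] * x[j] for j in range(2)) for i in range(2)]
```

The step size gamma must satisfy 0 < gamma < 2/||A||² for convergence; 0.1 is safe for this toy matrix. Starting from zero intensities, the iteration drives the dose vector into the prescribed intervals while keeping beamlet weights nonnegative.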
Morphing continuum theory for turbulence: Theory, computation, and visualization.
Chen, James
2017-10-01
A high-order morphing continuum theory (MCT) is introduced to model highly compressible turbulence. The theory is formulated under the rigorous framework of rational continuum mechanics. A set of linear constitutive equations and balance laws are deduced and presented from the Coleman-Noll procedure and Onsager's reciprocal relations. The governing equations are then arranged in conservation form and solved through the finite volume method with a second-order Lax-Friedrichs scheme for shock preservation. A numerical example of transonic flow over a three-dimensional bump is presented using MCT and the finite volume method. The comparison shows that MCT-based direct numerical simulation (DNS) provides a better prediction than Navier-Stokes (NS)-based DNS, with less than 10% of the mesh count, when compared with experiments. An MCT-based and frame-indifferent Q criterion is also derived to show the coherent eddy structure of the downstream turbulence in the numerical example. It should be emphasized that, unlike the NS-based Q criterion, the MCT-based Q criterion is objective, without the limitation of Galilean invariance.
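For reference, a minimal one-dimensional finite-volume update with the classical (first-order) Lax-Friedrichs flux is sketched below for linear advection. The paper's solver is second-order and applies to the MCT balance laws, so this is only a generic illustration of the scheme family, not the MCT code.

```python
# Generic 1D finite-volume step with the Lax-Friedrichs numerical flux for
# u_t + a u_x = 0, periodic boundaries.  Illustrative of the scheme family
# only; the paper uses a second-order variant on the MCT equations.

def lax_friedrichs_step(u, a, dx, dt):
    n = len(u)

    def flux(ul, ur):
        # Central flux plus dissipation proportional to the jump
        return 0.5 * a * (ul + ur) - 0.5 * (dx / dt) * (ur - ul)

    f = [flux(u[i], u[(i + 1) % n]) for i in range(n)]
    return [u[i] - (dt / dx) * (f[i] - f[i - 1]) for i in range(n)]

# Advect a square pulse one step at CFL number 0.5; the total of the cell
# averages (discrete "mass") is conserved by the telescoping flux sum.
u0 = [1.0 if 2 <= i < 5 else 0.0 for i in range(10)]
u1 = lax_friedrichs_step(u0, a=1.0, dx=0.1, dt=0.05)
```

Because the flux differences telescope, the scheme conserves the discrete integral of u exactly, and under the CFL condition the update is a convex combination of neighbors, so no new extrema appear: both properties motivate finite-volume schemes for shock-bearing flows.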
Zhou, Jinghao; Yan, Zhennan; Lasio, Giovanni; Huang, Junzhou; Zhang, Baoshe; Sharma, Navesh; Prado, Karl; D'Souza, Warren
2015-12-01
To resolve challenges in image segmentation in oncologic patients with severely compromised lung, we propose an automated right lung segmentation framework that uses a robust, atlas-based active volume model with a sparse shape composition prior. The robust atlas is achieved by combining the atlas with the output of sparse shape composition. Thoracic computed tomography images (n=38) from patients with lung tumors were collected. The right lung in each scan was manually segmented to build a reference training dataset against which the performance of the automated segmentation method was assessed. The quantitative results of this proposed segmentation method with sparse shape composition achieved mean Dice similarity coefficient (DSC) of (0.72, 0.81) with 95% CI, mean accuracy (ACC) of (0.97, 0.98) with 95% CI, and mean relative error (RE) of (0.46, 0.74) with 95% CI. Both qualitative and quantitative comparisons suggest that this proposed method can achieve better segmentation accuracy with less variance than other atlas-based segmentation methods in the compromised lung segmentation. Published by Elsevier Ltd.
Development of an upwind, finite-volume code with finite-rate chemistry
NASA Technical Reports Server (NTRS)
Molvik, Gregory A.
1995-01-01
Under this grant, two numerical algorithms were developed to predict the flow of viscous, hypersonic, chemically reacting gases over three-dimensional bodies. Both algorithms take advantage of the benefits of upwind differencing, total variation diminishing techniques, and a finite-volume framework, but obtain their solution in two separate manners. The first algorithm is a zonal, time-marching scheme, and is generally used to obtain solutions in the subsonic portions of the flow field. The second algorithm is a much less expensive, space-marching scheme and can be used for the computation of the larger, supersonic portion of the flow field. Both codes compute their interface fluxes with a temporal Riemann solver and the resulting schemes are made fully implicit, including the chemical source terms and boundary conditions. Strong coupling is used between the fluid dynamic, chemical, and turbulence equations. These codes have been validated on numerous hypersonic test cases and have provided excellent comparison with existing data. This report summarizes the research that took place from August 1, 1994 to January 1, 1995.
A Cost Comparison Framework for Use in Optimizing Ground Water Pump and Treat Systems
This fact sheet has been prepared to provide a framework for conducting cost comparisons to evaluate whether or not to pursue potential opportunities from an optimization evaluation for improving, replacing, or supplementing the P&T system.
Compositeness of hadron resonances in finite volume
NASA Astrophysics Data System (ADS)
Tsuchida, Yujiro; Hyodo, Tetsuo
2018-05-01
We develop a theoretical framework to quantify the structure of unstable hadron resonances. With the help of the corresponding system in a finite volume, we define the compositeness of resonance states, which can be interpreted as a probability. This framework is used to study the structure of the scalar mesons f0(980) and a0(980). In both mesons, the K̄K component accounts for about half of the wave function. The method is also applied to the Λ(1405) resonance. We argue that a single energy level in finite volume represents the two eigenstates in infinite volume. The K̄N component of Λ(1405), including contributions from both eigenstates, is found to be 58%, and the rest is composed of the πΣ and other channels.
A Graphics Design Framework to Visualize Multi-Dimensional Economic Datasets
ERIC Educational Resources Information Center
Chandramouli, Magesh; Narayanan, Badri; Bertoline, Gary R.
2013-01-01
This study implements a prototype graphics visualization framework to visualize multidimensional data. This graphics design framework serves as a "visual analytical database" for visualization and simulation of economic models. One of the primary goals of any kind of visualization is to extract useful information from colossal volumes of…
Numerical simulations of loop quantum Bianchi-I spacetimes
NASA Astrophysics Data System (ADS)
Diener, Peter; Joe, Anton; Megevand, Miguel; Singh, Parampreet
2017-05-01
Due to the numerical complexities of studying evolution in an anisotropic quantum spacetime, in comparison to isotropic models, the physics of loop quantized anisotropic models has remained largely unexplored. In particular, the robustness of the bounce and the validity of effective dynamics have so far not been established. Our analysis fills these gaps for the case of the vacuum Bianchi-I spacetime. To efficiently solve the quantum Hamiltonian constraint, we use an implementation based on the Cactus framework, which is conventionally used for applications in numerical relativity. Using high performance computing, numerical simulations for a large number of initial states with a wide variety of fluctuations are performed. The big bang singularity is found to be replaced by anisotropic bounces in all cases. We find that for initial states which are sharply peaked at late times in the classical regime and bounce at a mean volume much greater than the Planck volume, effective dynamics is an excellent approximation to the underlying quantum dynamics. Departures of the effective dynamics from the quantum evolution appear for states probing deep Planck volumes. A detailed analysis of the behavior of this departure reveals a non-monotonic and subtle dependence on the fluctuations of the initial states. We find that effective dynamics in almost all of the cases underestimates the volume, and hence overestimates the curvature, at the bounce, a result in synergy with earlier findings in the isotropic case. The expansion and shear scalars are found to be bounded throughout the evolution.
Ground and Space Radar Volume Matching and Comparison Software
NASA Technical Reports Server (NTRS)
Morris, Kenneth; Schwaller, Mathew
2010-01-01
This software enables easy comparison of ground- and space-based radar observations. The software was initially designed to compare ground radar reflectivity from operational, ground-based S- and C-band meteorological radars with comparable measurements from the Tropical Rainfall Measuring Mission (TRMM) satellite's Precipitation Radar (PR) instrument. The software is also applicable to other ground-based and space-based radars. The ground and space radar volume matching and comparison software was developed in response to requirements defined by the Ground Validation System (GVS) of Goddard's Global Precipitation Mission (GPM) project. This software innovation is specifically concerned with simplifying the comparison of ground- and space-based radar measurements for the purpose of GPM algorithm and data product validation. This software is unique in that it provides an operational environment to routinely create comparison products, and uses a direct geometric approach to derive common volumes of space- and ground-based radar data. In this approach, spatially coincident volumes are defined by the intersection of individual space-based Precipitation Radar rays with each of the conical elevation sweeps of the ground radar. Thus, the resampled volume elements of the space and ground radar reflectivity can be directly compared to one another.
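The geometric matching described above depends on where each ground-radar elevation sweep sits in altitude as a function of range. The standard 4/3-effective-Earth beam-height relation, sketched below, is the usual way to compute this under standard atmospheric refraction; it is an assumption here, not necessarily the GVS implementation.

```python
import math

# Standard-refraction beam-height model: the radar beam is treated as a
# straight ray above an Earth with 4/3 its true radius.  Generic radar
# meteorology relation, used here only to illustrate the geometry.
EFFECTIVE_EARTH_RADIUS_M = (4.0 / 3.0) * 6371000.0

def beam_height_m(range_m, elev_deg):
    """Height of the beam center above the antenna at a given slant range."""
    re = EFFECTIVE_EARTH_RADIUS_M
    elev = math.radians(elev_deg)
    return math.sqrt(range_m ** 2 + re ** 2
                     + 2.0 * range_m * re * math.sin(elev)) - re

# Height of a 0.5-degree sweep at 100 km range: roughly where a satellite
# radar ray crossing that sweep would be volume-matched.
h = beam_height_m(100e3, 0.5)
```

Intersecting a near-vertical satellite ray with each sweep then reduces to finding the slant range at which the sweep reaches the ray's ground location, and averaging the radar bins that fall inside the resulting common volume.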
Ferritin Levels and Their Association With Regional Brain Volumes in Tourette's Syndrome
Gorman, Daniel A.; Zhu, Hongtu; Anderson, George M.; Davies, Mark; Peterson, Bradley S.
2008-01-01
Objective A previous small study showed lower serum ferritin levels in subjects with Tourette's syndrome than in healthy subjects. The authors measured peripheral iron indices in a large group of Tourette's syndrome and comparison subjects and explored associations of ferritin levels with regional brain volumes. Method Ferritin was measured in 107 children and adults (63 Tourette's syndrome, 44 comparison); serum iron was measured in 73 (41 Tourette's syndrome, 32 comparison). Magnetic resonance imaging scans were used to measure volumes of the basal ganglia and cortical gray matter. Results Ferritin and serum iron were significantly lower in the Tourette's syndrome subjects, although still within the normal range. No association was found between tic severity and either iron index. In the Tourette's syndrome subjects, ferritin did not correlate significantly with caudate volume but did correlate positively with putamen volume. In the comparison subjects, ferritin correlated inversely with caudate volume but did not correlate significantly with putamen volume. Irrespective of diagnosis, ferritin correlated positively with volumes of the sensorimotor, midtemporal, and subgenual cortices. Conclusions The lower peripheral ferritin and iron levels in persons with Tourette's syndrome are consistent with findings in other movement disorders and suggest that lower iron availability may have a causal role in the pathophysiology of tic disorders. Lower iron stores may contribute to hypoplasia of the caudate and putamen, increasing vulnerability to developing tics or to having more severe tics. Lower iron stores may also contribute to smaller cortical volumes and consequently to reduced inhibitory control of tics. PMID:16816233
NASA Technical Reports Server (NTRS)
1976-01-01
The applicability of energy storage devices to any energy system depends on the performance and cost characteristics of the larger basic system. A comparative assessment of energy storage alternatives for application to IUS which addresses the systems aspects of the overall installation is described. Factors considered include: (1) descriptions of the two no-storage IUS baselines utilized as yardsticks for comparison throughout the study; (2) discussions of the assessment criteria and the selection framework employed; (3) a summary of the rationale utilized in selecting water storage as the primary energy storage candidate for near term application to IUS; (4) discussion of the integration aspects of water storage systems; and (5) an assessment of IUS with water storage in alternative climates.
ERIC Educational Resources Information Center
Arneberg, Marie; Bowitz, Einar
2006-01-01
International comparisons of data on expenditure on education use purchasing power parities for currency conversion and adjustment for price differences between countries to allow for volume comparisons. The resulting indicators are commonly interpreted as differences between countries in input volumes to the education sector: teachers, materials,…
NASA Astrophysics Data System (ADS)
Fischer, R. X.; Baur, W. H.
This document is part of Subvolume E 'Zeolite-Type Crystal Structures and their Chemistry. Framework Type Codes RON to STI' of Volume 14 'Microporous and other Framework Materials with Zeolite-Type Structures' of Landolt-Börnstein Group IV 'Physical Chemistry'.
NASA Astrophysics Data System (ADS)
Fischer, R. X.; Baur, W. H.
This document is part of Subvolume F 'Zeolite-Type Crystal Structures and their Chemistry. Framework Type Codes STO to ZON' of Volume 14 'Microporous and other Framework Materials with Zeolite-Type Structures' of Landolt-Börnstein Group IV 'Physical Chemistry'.
Unstructured medical image query using big data - An epilepsy case study.
Istephan, Sarmad; Siadat, Mohammad-Reza
2016-02-01
Big data technologies are critical to the medical field, which requires new frameworks to leverage them. Such frameworks would help medical experts test hypotheses by querying huge volumes of unstructured medical data, providing better patient care. The objective of this work is to implement and examine the feasibility of such a framework for efficient, open-ended querying of unstructured data. The feasibility study was conducted in the epilepsy field. The proposed framework evaluates a query in two phases. In phase 1, structured data is used to filter the clinical data warehouse. In phase 2, feature-extraction modules are executed on the unstructured data in a distributed manner via Hadoop to complete the query. Three modules have been created: volume comparer, surface-to-volume conversion, and average intensity. The framework allows user-defined modules to be imported, providing unlimited ways to process the unstructured data and hence potentially extending its application beyond the epilepsy field. Two types of criteria were used to validate the feasibility of the proposed framework: the ability/accuracy of fulfilling an advanced medical query, and the efficiency that Hadoop provides. For the first criterion, the framework executed an advanced medical query that spanned both structured and unstructured data with accurate results. For the second criterion, different architectures were explored to evaluate the performance of various Hadoop configurations and were compared to a traditional Single Server Architecture (SSA). The surface-to-volume conversion module performed up to 40 times faster than the SSA (using a 20-node Hadoop cluster), and the average-intensity module performed up to 85 times faster than the SSA (using a 40-node Hadoop cluster). Furthermore, the 40-node Hadoop cluster executed the average-intensity module on 10,000 models in 3 h, which was not even practical for the SSA.
The current study is limited to the epilepsy field; further research and more feature-extraction modules are required to show its applicability in other medical domains. The proposed framework advances data-driven medicine by unleashing the content of unstructured medical data in an efficient and unrestricted way, to be harnessed by medical experts. Copyright © 2015 Elsevier Inc. All rights reserved.
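The two-phase query flow described above can be sketched as follows; the function names and the `average_intensity` module are hypothetical, and a thread pool stands in for the Hadoop distribution layer.

```python
from concurrent.futures import ThreadPoolExecutor  # stands in for Hadoop here

def phase1_filter(records, predicate):
    """Phase 1: filter the clinical data warehouse on structured fields."""
    return [r for r in records if predicate(r)]

def average_intensity(volume):
    """A feature-extraction module, caricatured: mean voxel intensity
    of an unstructured imaging volume."""
    voxels = volume["voxels"]
    return sum(voxels) / len(voxels)

def phase2_extract(module, items, workers=4):
    """Phase 2: run a (possibly user-defined) module on each
    unstructured item that survived phase 1, in parallel."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(module, items))
```

A user-defined module is any callable taking one unstructured item, which is what lets the framework be extended beyond the three built-in modules.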
Xi, Kai; Cao, Shuai; Peng, Xiaoyu; Ducati, Caterina; Kumar, R Vasant; Cheetham, Anthony K
2013-03-18
This paper presents a novel method and rationale for utilizing carbonized MOFs for sulphur loading to fabricate cathode structures for lithium-sulphur batteries. Unique carbon materials with differing hierarchical pore structures were synthesized from four types of zinc-containing metal-organic frameworks (MOFs). It is found that cathode materials made from MOFs-derived carbons with higher mesopore (2-50 nm) volumes exhibit increased initial discharge capacities, whereas carbons with higher micropore (<2 nm) volumes lead to cathode materials with better cycle stability.
NASA Astrophysics Data System (ADS)
Babonis, G. S.; Csatho, B. M.; Schenk, A. F.
2016-12-01
We present a new record of Antarctic ice thickness changes, reconstructed from ICESat laser altimetry observations from 2004-2009 at over 100,000 locations across the Antarctic Ice Sheet (AIS). This work generates elevation time series at ICESat ground-track crossover regions on an observation-by-observation basis, with rigorous, quantified error estimates, using the SERAC approach (Schenk and Csatho, 2012). The results include average and annual elevation, volume and mass changes in Antarctica, fully corrected for glacial isostatic adjustment (GIA) and known intercampaign biases, and partitioned into contributions from surficial processes (e.g. firn densification) and ice dynamics. The modular flexibility of the SERAC framework allows for the assimilation of multiple ancillary datasets (e.g. GIA models and intercampaign bias corrections, IBC) in a common framework, to calculate mass changes for several different combinations of GIA models and IBCs and to arrive at a measure of variability from these results. We are able to determine the effect these corrections have on annual and average volume and mass change calculations in Antarctica, and to explore how these differences vary between drainage basins and with elevation. As such, this contribution presents a method that complements, and is consistent with, the 2012 Ice Sheet Mass Balance Inter-comparison Exercise (IMBIE) results (Shepherd 2012). Additionally, this work will contribute to the 2016 IMBIE, which seeks to reconcile ice sheet mass changes from different observations, including laser altimetry, using different methodologies and ancillary datasets including GIA models, firn densification models, and intercampaign bias corrections.
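The correction-and-scaling chain described above can be caricatured in a few lines; the simple area weighting, the constants, and the pure-ice density assumption (917 kg/m^3) are illustrative, not SERAC's actual processing.

```python
def corrected_dhdt(observed_dhdt, gia_uplift, bias):
    """Elevation-change rate (m/yr) at a crossover, after subtracting
    modeled GIA uplift and an intercampaign bias (both in m/yr)."""
    return observed_dhdt - gia_uplift - bias

def volume_change_km3(dhdt_m_per_yr, area_km2):
    """Scale the rate by the area the crossover represents: km^3/yr."""
    return dhdt_m_per_yr * 1e-3 * area_km2  # m -> km

def mass_change_gt(dv_km3):
    """Volume to mass (Gt/yr), assuming the change is all ice at
    917 kg/m^3; a firn-densification model would modify this step."""
    return dv_km3 * 0.917
```

Swapping the `gia_uplift` or `bias` inputs is the sketch-level analogue of recomputing mass change for different GIA-model/IBC combinations.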
NASA Astrophysics Data System (ADS)
Zhou, Qianqian; Leng, Guoyong; Huang, Maoyi
2018-01-01
As China becomes increasingly urbanised, flooding has become a regular occurrence in its major cities. Assessing the effects of future climate change on urban flood volumes is crucial to informing better management of such disasters given the severity of the devastating impacts of flooding (e.g. the 2016 flooding events across China). Although recent studies have investigated the impacts of future climate change on urban flooding, the effects of both climate change mitigation and adaptation have rarely been accounted for together in a consistent framework. In this study, we assess the benefits of mitigating climate change by reducing greenhouse gas (GHG) emissions and locally adapting to climate change by modifying drainage systems to reduce urban flooding under various climate change scenarios through a case study conducted in northern China. The urban drainage model - Storm Water Management Model - was used to simulate urban flood volumes using current and two adapted drainage systems (i.e. pipe enlargement and low-impact development, LID), driven by bias-corrected meteorological forcing from five general circulation models in the Coupled Model Intercomparison Project Phase 5 archive. Results indicate that urban flood volume is projected to increase by 52 % over 2020-2040 compared to the volume in 1971-2000 under the business-as-usual scenario (i.e. Representative Concentration Pathway (RCP) 8.5). The magnitudes of urban flood volumes are found to increase nonlinearly with changes in precipitation intensity. On average, the projected flood volume under RCP 2.6 is 13 % less than that under RCP 8.5, demonstrating the benefits of global-scale climate change mitigation efforts in reducing local urban flood volumes. Comparison of reduced flood volumes between climate change mitigation and local adaptation (by improving drainage systems) scenarios suggests that local adaptation is more effective than climate change mitigation in reducing future flood volumes. 
This has broad implications for the research community relative to drainage system design and modelling in a changing environment. This study highlights the importance of accounting for local adaptation when coping with future urban floods.
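For concreteness, the headline percentages combine as follows; the baseline flood volume is an arbitrary illustrative figure.

```python
baseline = 100.0            # flood volume in 1971-2000 (arbitrary units)
rcp85 = baseline * 1.52     # +52 % by 2020-2040 under RCP 8.5
rcp26 = rcp85 * (1 - 0.13)  # RCP 2.6 projected 13 % below RCP 8.5
mitigation_saving = rcp85 - rcp26
```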
DOE Office of Scientific and Technical Information (OSTI.GOV)
Barin, G; Krungleviciute, V; Gutov, O
2014-07-07
We successfully demonstrate an approach based on linker fragmentation to create defects and tune the pore volumes and surface areas of two metal-organic frameworks, NU-125 and HKUST-1, both of which feature copper paddlewheel nodes. Depending on the linker fragment composition, the defect can be either a vacant site or a functional group that the original linker does not have. In the first case, we show that both surface area and pore volume increase, while in the second case they decrease. The effect of defects on the high-pressure gas uptake is also studied over a large temperature and pressure range for different gases. We found that despite an increase in pore volume and surface area in structures with vacant sites, the absolute adsorption of methane decreases for HKUST-1 and slightly increases for NU-125. However, the working capacity (deliverable amount between 65 and 5 bar) in both cases remains similar to that of the parent frameworks due to lower uptakes at low pressures. In the case of NU-125, the effect of defects became more pronounced at lower temperatures, reflecting the greater surface areas and pore volumes of the altered forms.
Szewczyk, Wojciech; Prajsner, Andrzej; Kozina, Janusz; Login, Tomasz; Kaczorowski, Marek
2004-01-01
General practitioners often use transabdominal ultrasonography (TAUS) to measure prostatic volume. With this method it is difficult to distinguish between the tissue of benign prostatic hyperplasia (BPH) and the prostatic tissue forming the so-called surgical capsule of BPH. The aim of this study was to compare the prostatic volume measured by suprapubic (transabdominal) ultrasonography with the volume of the enucleated gland after open prostatectomy. Based on the results, the authors created a nomogram from the TAUS measurement of the prostate that helps to predict the volume of BPH. They also found that the surgical capsule of the BPH makes up about one third of the whole prostatic volume measured by TAUS.
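The reported rule of thumb (the surgical capsule accounts for about one third of the TAUS volume) amounts to the following estimate; the published nomogram itself is not reproduced here.

```python
def predicted_bph_volume(taus_volume_ml):
    """Adenoma (BPH) volume predicted from the total prostatic volume
    measured by TAUS, assuming the surgical capsule accounts for about
    one third of the measured volume (the study's rule of thumb)."""
    return taus_volume_ml * (2.0 / 3.0)

# A 60 mL prostate on TAUS would predict roughly 40 mL of enucleated tissue.
estimate_ml = predicted_bph_volume(60.0)
```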
Konheim, Jeremy A; Kon, Zachary N; Pasrija, Chetan; Luo, Qingyang; Sanchez, Pablo G; Garcia, Jose P; Griffith, Bartley P; Jeudy, Jean
2016-04-01
Size matching for lung transplantation is widely accomplished using height comparisons between donors and recipients. This gross approximation allows for wide variation in lung size and, potentially, size mismatch. Three-dimensional computed tomography (3D-CT) volumetry comparisons could offer more accurate size matching. Although recipient CT scans are universally available, donor CT scans are rarely performed. Therefore, predicted donor lung volumes could be used for comparison to measured recipient lung volumes, but no such predictive equations exist. We aimed to use 3D-CT volumetry measurements from a normal patient population to generate equations for predicted total lung volume (pTLV), predicted right lung volume (pRLV), and predicted left lung volume (pLLV), for size-matching purposes. Chest CT scans of 400 normal patients were retrospectively evaluated. 3D-CT volumetry was performed to measure total lung volume, right lung volume, and left lung volume of each patient, and predictive equations were generated. The fitted model was tested in a separate group of 100 patients. The model was externally validated by comparison of total lung volume with total lung capacity from pulmonary function tests in a subset of those patients. Age, gender, height, and race were independent predictors of lung volume. In the test group, there were strong linear correlations between predicted and actual lung volumes measured by 3D-CT volumetry for pTLV (r = 0.72), pRLV (r = 0.72), and pLLV (r = 0.69). A strong linear correlation was also observed when comparing pTLV and total lung capacity (r = 0.82). We successfully created a predictive model for pTLV, pRLV, and pLLV. These may serve as reference standards and predict donor lung volume for size matching in lung transplantation. Copyright © 2016 The American Association for Thoracic Surgery. Published by Elsevier Inc. All rights reserved.
Superpixel guided active contour segmentation of retinal layers in OCT volumes
NASA Astrophysics Data System (ADS)
Bai, Fangliang; Gibson, Stuart J.; Marques, Manuel J.; Podoleanu, Adrian
2018-03-01
Retinal OCT image segmentation is a precursor to subsequent medical diagnosis by a clinician or machine learning algorithm. In the last decade, many algorithms have been proposed to detect retinal layer boundaries and simplify the image representation. Inspired by the recent success of superpixel methods for pre-processing natural images, we present a novel framework for segmentation of retinal layers in OCT volume data. In our framework, the region of interest (e.g. the fovea) is located using an adaptive-curve method. The cell layer boundaries are then robustly detected, first using 1D superpixels applied to A-scans and then by fitting active contours in B-scan images. Thereafter the 3D cell layer surfaces are efficiently segmented from the volume data. The framework was tested on healthy eye data, and we show that it is capable of segmenting up to 12 layers. The experimental results demonstrate the effectiveness of the proposed method and indicate its robustness to low image resolution and intrinsic speckle noise.
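A 1D "superpixel" grouping over an A-scan can be caricatured as greedy merging of adjacent samples with similar intensity; the tolerance and data are illustrative, not the paper's algorithm.

```python
def superpixels_1d(a_scan, tol=0.2):
    """Group adjacent A-scan samples into segments: a sample joins the
    current segment while it stays within `tol` of the running mean,
    otherwise a new segment (candidate layer interval) starts."""
    segments, current = [], [a_scan[0]]
    for v in a_scan[1:]:
        mean = sum(current) / len(current)
        if abs(v - mean) <= tol:
            current.append(v)
        else:
            segments.append(current)
            current = [v]
    segments.append(current)
    return segments
```

The segment boundaries would then serve as coarse initializations for the active contours fitted in the B-scan images.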
Computer-aided pulmonary image analysis in small animal models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xu, Ziyue; Mansoor, Awais; Mollura, Daniel J.
Purpose: To develop an automated pulmonary image analysis framework for infectious lung diseases in small animal models. Methods: The authors describe a novel pathological lung and airway segmentation method for small animals. The proposed framework includes identification of abnormal imaging patterns pertaining to infectious lung diseases. First, the authors’ system estimates an expected lung volume by utilizing a regression function between total lung capacity and approximated rib cage volume. A significant difference between the expected lung volume and the initial lung segmentation indicates the presence of severe pathology and invokes a machine-learning-based abnormal imaging pattern detection system. The final stage of the proposed framework is the automatic extraction of the airway tree, for which new affinity relationships within the fuzzy connectedness image segmentation framework are proposed by combining Hessian and gray-scale morphological reconstruction filters. Results: 133 CT scans were collected from four different studies encompassing a wide spectrum of pulmonary abnormalities pertaining to two commonly used small animal models (ferret and rabbit). Sensitivity and specificity were greater than 90% for pathological lung segmentation (average Dice similarity coefficient > 0.9). While qualitative visual assessments of airway tree extraction were performed by the participating expert radiologists, for quantitative evaluation the authors validated the proposed airway extraction method using the publicly available EXACT’09 data set. Conclusions: The authors developed a comprehensive computer-aided pulmonary image analysis framework for preclinical research applications. The proposed framework consists of automatic pathological lung segmentation and accurate airway tree extraction. The framework has high sensitivity and specificity; therefore, it can contribute to advances in preclinical research in pulmonary diseases.
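The pathology trigger described in the Methods reduces to a regression prediction plus a threshold test; the regression coefficients and the 20% threshold used here are illustrative assumptions, not the authors' fitted values.

```python
def expected_lung_volume(rib_cage_volume_ml, slope=0.7, intercept=0.0):
    """Hypothetical regression between approximated rib cage volume and
    total lung capacity (placeholder coefficients)."""
    return slope * rib_cage_volume_ml + intercept

def severe_pathology(initial_seg_volume_ml, rib_cage_volume_ml, rel_tol=0.2):
    """Flag severe pathology when the initial segmentation volume falls
    far from the expected lung volume; a True result is what would
    invoke the ML-based abnormal-pattern detection stage."""
    expected = expected_lung_volume(rib_cage_volume_ml)
    return abs(expected - initial_seg_volume_ml) / expected > rel_tol
```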
Exposure Render: An Interactive Photo-Realistic Volume Rendering Framework
Kroes, Thomas; Post, Frits H.; Botha, Charl P.
2012-01-01
The field of volume visualization has undergone rapid development during the past years, both due to advances in suitable computing hardware and due to the increasing availability of large volume datasets. Recent work has focused on increasing the visual realism in Direct Volume Rendering (DVR) by integrating a number of visually plausible but often effect-specific rendering techniques, for instance modeling of light occlusion and depth of field. Besides yielding more attractive renderings, the more realistic lighting in particular has a positive effect on perceptual tasks. Although these new rendering techniques yield impressive results, they exhibit limitations in terms of their flexibility and their performance. Monte Carlo ray tracing (MCRT), coupled with physically based light transport, is the de facto standard for synthesizing highly realistic images in the graphics domain, although usually not from volumetric data. Due to the stochastic sampling of MCRT algorithms, numerous effects can be achieved in a relatively straightforward fashion. For this reason, we have developed a practical framework that applies MCRT techniques to direct volume rendering. With this work, we demonstrate that a host of realistic effects, including physically based lighting, can be simulated in a generic and flexible fashion, leading to interactive DVR with improved realism. In the hope that this improved approach to DVR will see more use in practice, we have made our framework available under a permissive open source license. PMID:22768292
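As a toy instance of the stochastic sampling that MCRT relies on, transmittance through a homogeneous volume can be estimated by sampling exponential free paths; this is a generic illustration, not code from the released framework.

```python
import math
import random

def mc_transmittance(sigma, length, n=50000, seed=1):
    """Monte Carlo estimate of the transmittance exp(-sigma * length)
    through a slab of homogeneous extinction sigma: sample exponential
    free-flight distances and count the fraction that clear the slab."""
    rng = random.Random(seed)
    clears = sum(1 for _ in range(n) if rng.expovariate(sigma) > length)
    return clears / n

# For sigma = 1 and unit thickness the estimate converges to e^-1.
t_est = mc_transmittance(1.0, 1.0)
```

The same sampling machinery, applied per ray and per scattering event, is what lets MCRT fold many lighting effects into one estimator.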
NASA Astrophysics Data System (ADS)
Gloger, Oliver; Tönnies, Klaus; Bülow, Robin; Völzke, Henry
2017-07-01
To develop the first fully automated 3D spleen segmentation framework derived from T1-weighted magnetic resonance (MR) imaging data and to verify its performance for spleen delineation and volumetry. This approach addresses the issue of low contrast between spleen and adjacent tissue in non-contrast-enhanced MR images. Native T1-weighted MR volume data were acquired on a 1.5 T MR system in an epidemiological study. We analyzed random subsamples of MR examinations without pathologies to develop and verify the spleen segmentation framework. The framework is modularized to include different kinds of prior knowledge in the segmentation pipeline. Classification by support vector machines differentiates between five different shape types in computed foreground probability maps and recognizes characteristic spleen regions in axial slices of MR volume data. A spleen-shape space generated by training produces subject-specific prior shape knowledge that is then incorporated into a final 3D level set segmentation method. Individually adapted shape-driven forces, as well as image-driven forces resulting from refined foreground probability maps, successfully steer the level set to segment the spleen. The framework achieves promising segmentation results with mean Dice coefficients of nearly 0.91 and low volumetric mean errors of 6.3%. The presented spleen segmentation approach can delineate spleen tissue in native MR volume data. Several kinds of prior shape knowledge, including subject-specific 3D prior shape knowledge, can be used to guide segmentation processes, achieving promising results.
Review of the National Research Council's Framework for K-12 Science Education
ERIC Educational Resources Information Center
Gross, Paul R.
2011-01-01
The new "Framework for K-12 Science Education: Practices, Crosscutting Concepts, and Core Ideas" is a big, comprehensive volume, carefully organized and heavily documented. It is the long-awaited product of the Committee on a Conceptual Framework for New K-12 Science Education Standards. As noted, it is a weighty document (more than 300…
A comparison of justice frameworks for international research.
Pratt, Bridget; Loff, Bebe
2015-07-01
Justice frameworks have been developed for international research that provide guidance on the selection of research targets, ancillary care, research capacity strengthening, and post-trial benefits. Yet there has been limited comparison of the different frameworks. This paper examines the underlying aims and theoretical bases of three such frameworks (the fair benefits framework, the human development approach, and research for health justice) and considers how their aims shape their guidance on the aforementioned four ethical issues. It shows that the frameworks' underlying objectives vary across two dimensions: first, whether they seek to prevent harmful or exploitative international research or to promote international research with health benefits for low- and middle-income countries; second, whether they address justice at the micro level or the macro level. The fair benefits framework focuses on reforming contractual elements in individual international research collaborations to ensure fairness, whereas the other two frameworks aim to connect international research with the reduction of global health inequities. The paper then highlights where there is overlap between the frameworks' requirements and where differences in the strength and content of the obligations they identify arise as a result of their varying objectives and theoretical bases. In doing so, it does not offer a critical comparison of the frameworks but rather seeks to add clarity to current debates on justice and international research by showing how they are positioned relative to one another. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
A Mechanics-Based Framework Leading to Improved Diagnosis and Treatment of Hydrocephalus
NASA Astrophysics Data System (ADS)
Cohen, Benjamin; Soren, Vedels; Wagshul, Mark; Egnor, Michael; Voorhees, Abram; Wei, Timothy
2007-11-01
Hydrocephalus is defined as an accumulation of cerebrospinal fluid (CSF) in the cranium, at the expense of brain tissue. The result is a disruption of the normal pressure and/or flow dynamics of the intracranial blood and CSF. We seek to introduce integral control volume analysis to the study of hydrocephalus. The goal is to provide a first-principles framework to integrate a broad spectrum of sometimes disparate investigations into a highly complex, multidisciplinary problem. The general technique for applying control volumes to hydrocephalus will be presented, including the factors faced in choosing control volumes and in making the measurements required to evaluate mass and momentum conservation. In addition, the use of our Digital Particle Image Velocimetry (DPIV) processing program has been extended to measure the displacement of the ventricles' walls from Magnetic Resonance (MR) images. This is done to determine the volume change of the intracranial fluid spaces.
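In symbols, the control volume statement underlying this approach is mass conservation for a fluid region of volume $V(t)$ bounded by a surface $S(t)$ moving with velocity $\mathbf{u}_s$ (the standard Reynolds-transport form, not a result specific to this work):

```latex
\frac{d}{dt}\int_{V(t)} \rho \, dV
  + \int_{S(t)} \rho\,(\mathbf{u}-\mathbf{u}_s)\cdot\mathbf{n}\, dS = 0
```

For incompressible CSF this reduces to $dV/dt = Q_{\mathrm{in}} - Q_{\mathrm{out}}$, which is what ties the MR-measured ventricular wall displacements (the volume change) to the measured flows across the chosen control surfaces.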
Chalal, Hocine; Abed-Meraim, Farid
2018-06-20
In the current contribution, prismatic and hexahedral quadratic solid-shell (SHB) finite elements are proposed for the geometrically nonlinear analysis of thin structures made of functionally graded material (FGM). The proposed SHB finite elements are developed within a purely 3D framework, with displacements as the only degrees of freedom. The in-plane reduced-integration technique is combined with the assumed-strain method to alleviate various locking phenomena. Furthermore, an arbitrary number of integration points are placed along a special direction, which represents the thickness. The developed elements are coupled with functionally graded behavior for the modeling of thin FGM plates. To this end, the Young's modulus of the FGM plate is assumed to vary gradually in the thickness direction, according to a volume fraction distribution. The resulting formulations are implemented into the quasi-static ABAQUS/Standard finite element software in the framework of large displacements and rotations. Popular nonlinear benchmark problems are considered to assess the performance and accuracy of the proposed SHB elements. Comparisons with reference solutions from the literature demonstrate the good capabilities of the developed SHB elements for the 3D simulation of thin FGM plates.
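A common way to realize such a graded modulus is the power-law volume-fraction rule sketched below; this is a generic FGM convention, not necessarily the exact distribution used in the paper.

```python
def fgm_young_modulus(z, h, e_metal, e_ceramic, n=2.0):
    """Power-law FGM rule through the thickness: for z in [-h/2, h/2],
    the ceramic volume fraction is V_c = (z/h + 1/2)**n and the
    effective modulus is the rule-of-mixtures blend of the two phases."""
    vc = (z / h + 0.5) ** n
    return e_metal + (e_ceramic - e_metal) * vc

# Fully metallic at the bottom face, fully ceramic at the top face
# (e.g. aluminium 70 GPa to alumina 380 GPa):
e_bottom = fgm_young_modulus(-0.5, 1.0, 70.0, 380.0)
e_top = fgm_young_modulus(0.5, 1.0, 70.0, 380.0)
```

In the element formulation, this function would be evaluated at each of the through-thickness integration points.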
Huang, Yuanbiao; Lin, Zujin; Fu, Hongru; Wang, Fei; Shen, Min; Wang, Xusheng; Cao, Rong
2014-09-01
A three-dimensional microporous anionic metal-organic framework (MOF) (Et4N)3[In3(TATB)4] (FJI-C1, H3TATB=4,4',4''-s-triazine-2,4,6-triyltribenzoic acid) with large unit cell volume has been synthesized. Assisted by the organic cation group Et4N in the pores of the compound, FJI-C1 not only shows high adsorption uptakes of C2 and C3 hydrocarbons, but also exhibits highly selective separation of propane, acetylene, ethane, and ethylene from methane at room temperature. Furthermore, it also exhibits high separation selectivity for propane over C2 hydrocarbons and acetylene can be readily separated from their C2 hydrocarbons mixtures at low pressure due to the high selectivity for C2H2 in comparison to C2H4 and C2H6. In addition, FJI-C1 with hydrophilic internal pores surfaces shows highly efficient adsorption separation of polar molecules from nonpolar molecules. Notably, it exhibits high separation selectivity for benzene over cyclohexane due to the π-π interactions between benzene molecules and s-triazine rings of the porous MOF. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
A conceptually and computationally simple method for the definition, display, quantification, and comparison of the shapes of three-dimensional mathematical molecular models is presented. Molecular or solvent-accessible volume and surface area can also be calculated. Algorithms, ...
Emmett, Christopher; Close, Helen; Mason, James; Taheri, Shiva; Stevens, Natasha; Eldridge, Sandra; Norton, Christine; Knowles, Charles; Yiannakou, Yan
2017-03-31
Constipation is common in adults and up to 20% of the population report this symptom. Chronic constipation (CC), usually defined as more than 6 months of symptoms, is less common but results in 0.5 million UK GP consultations per annum. The effect of symptoms on measured quality of life (QOL) is significant, and CC consumes significant health care resources. In the UK, it is estimated that 10% of district nursing time is spent on constipation. Trans-anal irrigation therapy has become a widely used treatment despite a lack of robust efficacy data to support its use. The long-term outcome of treatment is also unclear. A randomised comparison of two different methods of irrigation (high- and low-volume) will provide valuable evidence of superiority of one system over the other, as well as providing efficacy data for the treatment as a whole. Participants will be recruited based on predetermined eligibility criteria. Following informed consent, they will be randomised to either high-volume (HV) or low-volume (LV) irrigation and undergo standardised radiological and physiological investigations. Following training, they will commence home irrigation with the allocated device. Data will be collected at 1, 3, 6 and 12 months according to a standardised outcomes framework. The primary outcome is PAC-QOL, measured at 3 months. The study is powered to detect a 10% difference in outcome between systems at 3 months; this means that 300 patients will need to be recruited. This study will be the first randomised comparison of two different methods of trans-anal irrigation. It will also be the largest prospective study of CC patients treated with irrigation. It will provide evidence for the effectiveness of irrigation in the treatment of CC, as well as the comparative effectiveness of the two methods. This will enable more cost-effective and evidence-based use of irrigation. 
Also, the results will be combined with those of the other studies in the CapaCiTY programme to generate an evidence-based treatment algorithm for CC in adults. ISRCTN identifier: ISRCTN11093872. Registered on 11 November 2015; the trial was not retrospectively registered. Protocol version 3 (22 January 2016).
NASA Technical Reports Server (NTRS)
Spangelo, Sara
2015-01-01
The goal of this paper is to explore the mission opportunities that are uniquely enabled by U-class Solar Electric Propulsion (SEP) technologies. Small SEP thrusters offer significant advantages relative to existing technologies and will revolutionize the class of mission architectures that small spacecraft can accomplish by enabling trajectory maneuvers with large change-in-velocity (delta-v) requirements and reaction-wheel-free attitude control. This paper aims to develop and apply a common system-level modeling framework to evaluate these thrusters for relevant upcoming mission scenarios, taking into account the mass, power, volume, and operational constraints of small, highly constrained missions. We will identify the optimal technology for broad classes of mission applications for different U-class spacecraft sizes and provide insights into what constrains the system performance, to identify technology areas where improvements are needed.
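The first-order sizing relation behind any such delta-v budgeting is the Tsiolkovsky rocket equation. A minimal sketch follows; the specific impulse and mass figures used in testing are illustrative assumptions, not values from the paper:

```python
import math

def achievable_delta_v(isp_s, wet_mass_kg, propellant_mass_kg):
    """Delta-v achievable from a thruster's specific impulse and the
    spacecraft's propellant budget (Tsiolkovsky rocket equation).
    A system-level model builds on relations of this kind."""
    g0 = 9.80665  # standard gravity, m/s^2
    dry_mass = wet_mass_kg - propellant_mass_kg
    return isp_s * g0 * math.log(wet_mass_kg / dry_mass)
```

For example, a hypothetical 10 kg CubeSat carrying 1 kg of propellant through a 1400 s Isp SEP thruster achieves roughly 1.4 km/s of delta-v, which is what makes small-spacecraft trajectory maneuvers plausible at all.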
NASA Astrophysics Data System (ADS)
Yankovsky, Valentine A.; Manuilova, Rada O.
2017-11-01
The altitude profiles of ozone concentration are retrieved from measurements of the volume emission rate in the 1.27 μm oxygen band in the TIMED-SABER experiment. In this study we compare methods for retrieving the daytime [O3] altitude profile in the framework of two models: the electronic-vibrational kinetics and the purely electronic kinetics of the excited products of ozone and oxygen photolysis. To retrieve the [O3] altitude profile correctly from measurements of the intensity of the O2 band near 1.27 μm, it is necessary to use the photochemical model of the electronic-vibrational kinetics of the excited products of ozone and oxygen photolysis in the mesosphere and lower thermosphere.
A high-performance spatial database based approach for pathology imaging algorithm evaluation
Wang, Fusheng; Kong, Jun; Gao, Jingjing; Cooper, Lee A.D.; Kurc, Tahsin; Zhou, Zhengwen; Adler, David; Vergara-Niedermayr, Cristobal; Katigbak, Bryan; Brat, Daniel J.; Saltz, Joel H.
2013-01-01
Background: Algorithm evaluation provides a means to characterize variability across image analysis algorithms, validate algorithms by comparison with human annotations, combine results from multiple algorithms for performance improvement, and facilitate algorithm sensitivity studies. The sizes of images and image analysis results in pathology image analysis pose significant challenges in algorithm evaluation. We present an efficient parallel spatial database approach to model, normalize, manage, and query large volumes of analytical image result data. This provides an efficient platform for algorithm evaluation. Our experiments with a set of brain tumor images demonstrate the application, scalability, and effectiveness of the platform. Context: The paper describes an approach and platform for evaluation of pathology image analysis algorithms. The platform facilitates algorithm evaluation through a high-performance database built on the Pathology Analytic Imaging Standards (PAIS) data model. Aims: (1) Develop a framework to support algorithm evaluation by modeling and managing analytical results and human annotations from pathology images; (2) Create a robust data normalization tool for converting, validating, and fixing spatial data from algorithm or human annotations; (3) Develop a set of queries to support data sampling and result comparisons; (4) Achieve high performance computation capacity via a parallel data management infrastructure, parallel data loading and spatial indexing optimizations in this infrastructure. Materials and Methods: We have considered two scenarios for algorithm evaluation: (1) algorithm comparison where multiple result sets from different methods are compared and consolidated; and (2) algorithm validation where algorithm results are compared with human annotations. We have developed a spatial normalization toolkit to validate and normalize spatial boundaries produced by image analysis algorithms or human annotations. 
The validated data were formatted based on the PAIS data model and loaded into a spatial database. To support efficient data loading, we have implemented a parallel data loading tool that takes advantage of multi-core CPUs to accelerate data injection. The spatial database manages both geometric shapes and image features or classifications, and enables spatial sampling, result comparison, and result aggregation through expressive structured query language (SQL) queries with spatial extensions. To provide scalable and efficient query support, we have employed a shared-nothing parallel database architecture, which distributes data homogeneously across multiple database partitions to take advantage of parallel computation power, and implements spatial indexing to achieve high I/O throughput. Results: Our work proposes a high-performance, parallel spatial database platform for algorithm validation and comparison. This platform was evaluated by storing, managing, and comparing analysis results from a set of brain tumor whole-slide images. The tools we developed are open source and available for download. Conclusions: Pathology image algorithm validation and comparison are essential to iterative algorithm development and refinement. One critical component is support for queries involving spatial predicates and comparisons. In our work, we developed an efficient data model and parallel database approach to model, normalize, manage and query large volumes of analytical image result data. Our experiments demonstrate that the data partitioning strategy and the grid-based indexing result in good data distribution across database nodes and reduce I/O overhead in spatial join queries through parallel retrieval of relevant data and quick subsetting of datasets. The set of tools in the framework provides a full pipeline to normalize, load, manage and query analytical results for algorithm evaluation. PMID:23599905
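The grid-based indexing idea behind such spatial joins can be sketched in a few lines: objects are binned into grid cells by bounding box, so only objects sharing a cell are compared. This is an illustrative simplification in plain Python (axis-aligned boxes, hypothetical cell size), not the platform's polygon-level SQL machinery:

```python
from collections import defaultdict

def grid_cells(box, cell=100):
    """Yield the (i, j) grid cells touched by an axis-aligned box (x0, y0, x1, y1)."""
    x0, y0, x1, y1 = box
    for i in range(int(x0 // cell), int(x1 // cell) + 1):
        for j in range(int(y0 // cell), int(y1 // cell) + 1):
            yield (i, j)

def overlap_area(a, b):
    """Intersection area of two axis-aligned boxes; 0 if disjoint."""
    w = min(a[2], b[2]) - max(a[0], b[0])
    h = min(a[3], b[3]) - max(a[1], b[1])
    return w * h if w > 0 and h > 0 else 0

def spatial_join(set_a, set_b, cell=100):
    """Pair up overlapping boxes from two result sets (e.g. algorithm output
    vs. human annotation), consulting the grid index so only boxes that share
    a cell are ever compared -- the point of grid-based spatial indexing."""
    index = defaultdict(list)
    for k, box in enumerate(set_b):
        for c in grid_cells(box, cell):
            index[c].append(k)
    pairs = set()
    for i, box in enumerate(set_a):
        for c in grid_cells(box, cell):
            for k in index[c]:
                if overlap_area(box, set_b[k]) > 0:
                    pairs.add((i, k))
    return sorted(pairs)
```

Only candidates sharing a grid cell reach the exact overlap test, which is what keeps I/O and comparison cost down when result sets hold millions of boundaries.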
Global traffic and disease vector dispersal
Tatem, Andrew J.; Hay, Simon I.; Rogers, David J.
2006-01-01
The expansion of global air travel and seaborne trade overcomes geographic barriers to insect disease vectors, enabling them to move great distances in short periods of time. Here we apply a coupled human–environment framework to describe the historical spread of Aedes albopictus, a competent mosquito vector of 22 arboviruses in the laboratory. We contrast this dispersal with the relatively unchanged distribution of Anopheles gambiae and examine possible future movements of this malaria vector. We use a comprehensive database of international ship and aircraft traffic movements, combined with climatic information, to remap the global transportation network in terms of disease vector suitability and accessibility. The expansion of the range of Ae. albopictus proved to be surprisingly predictable using this combination of climate and traffic data. Traffic volumes were more than twice as high on shipping routes running from the historical distribution of Ae. albopictus to ports where it has established in comparison with routes to climatically similar ports where it has yet to invade. In contrast, An. gambiae has rarely spread from Africa, which we suggest is partly due to the low volume of sea traffic from the continent and, until very recently, a European destination for most flights. PMID:16606847
NASA Technical Reports Server (NTRS)
Alter, Stephen J.; Reuthler, James J.; McDaniel, Ryan D.
2003-01-01
A flexible framework for the development of block-structured volume grids for hypersonic Navier-Stokes flow simulations was developed for analysis of the Shuttle Orbiter Columbia. The flexible framework made it possible to quickly generate meshes so that solutions contributed by participating groups could be directly correlated on a common surface mesh, providing confidence for extending the envelope of solutions and damage scenarios. The framework draws on the experience of the NASA Langley and NASA Ames Research Centers in structured grid generation and consists of a grid generation process implemented through a division of responsibilities. The nominal division of labor consisted of NASA Johnson Space Center coordinating the damage scenarios to be analyzed by the Aerothermodynamics Columbia Accident Investigation (CAI) team, Ames developing the surface grids that described the computational volume about the orbiter, and Langley improving the grid quality of the Ames-generated data and constructing the final volume grids. Distributing the work among the participants in the Aerothermodynamics CAI team required significantly less time to construct complete meshes than would have been possible for any individual participant. The approach demonstrated that the One-NASA grid generation team could sustain the demand for new meshes to explore new damage scenarios within an aggressive timeline.
Min, Yugang; Santhanam, Anand; Neelakkantan, Harini; Ruddy, Bari H; Meeks, Sanford L; Kupelian, Patrick A
2010-09-07
In this paper, we present a graphics processing unit (GPU)-based simulation framework to calculate the dose delivered to a 3D moving lung tumor and its surrounding normal tissues, which are undergoing subject-specific lung deformations. The GPU-based simulation framework models the motion of the 3D volumetric lung tumor and its surrounding tissues, simulates the dose delivery using the dose extracted from a treatment plan generated with the Pinnacle Treatment Planning System (Philips) for one of the 3DCTs of the 4DCT, and predicts the amount and location of radiation dose deposited inside the lung. The 4DCT lung datasets were registered with each other using a modified optical flow algorithm. The motion of the tumor and of the surrounding tissues was simulated by measuring the changes in lung volume during the radiotherapy treatment using spirometry. The real-time dose delivered to the tumor for each beam is generated by summing the dose delivered to the target volume at each increment in lung volume during the beam delivery period. The simulation results showed the real-time capability of the framework at 20 discrete tumor motion steps per breath, which is higher than the number of 4DCT steps (approximately 12) reconstructed during multiple breathing cycles.
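The per-beam accumulation step described above reduces to summing dose over discrete motion phases weighted by time-in-phase. A minimal sketch, not the authors' GPU code; the dose rates, phase weights, and beam-on time below are hypothetical:

```python
def accumulate_dose(phase_dose_rate, phase_weights, beam_on_s=60.0):
    """Sum the dose a voxel receives over discrete tumor-motion phases.

    phase_dose_rate[k] -- dose rate seen by the voxel in phase k (Gy/s),
                          e.g. sampled from a planned dose grid at the
                          voxel's displaced position (hypothetical values)
    phase_weights[k]   -- fraction of beam-on time spent in phase k
    beam_on_s          -- beam delivery time in seconds (assumed)
    """
    assert abs(sum(phase_weights) - 1.0) < 1e-9, "weights must sum to 1"
    return beam_on_s * sum(d * w for d, w in zip(phase_dose_rate, phase_weights))
```

With 20 motion steps per breath, `phase_dose_rate` would hold 20 samples per breathing cycle instead of the two shown in this toy usage.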
Wu, Wei-Sheng; Jhou, Meng-Jhun
2017-01-13
Missing value imputation is important for microarray data analyses because missing values would significantly degrade the performance of the downstream analyses. Although many microarray missing value imputation algorithms have been developed, an objective and comprehensive performance comparison framework is still lacking. To solve this problem, we previously proposed a framework which can perform a comprehensive performance comparison of existing algorithms and can also evaluate the performance of a new algorithm. However, constructing our framework is not an easy task for interested researchers. To save researchers' time and effort, here we present an easy-to-use web tool named MVIAeval (Missing Value Imputation Algorithm evaluator) which implements our performance comparison framework. MVIAeval provides a user-friendly interface allowing users to upload the R code of their new algorithm and select (i) the test datasets among 20 benchmark microarray (time series and non-time series) datasets, (ii) the compared algorithms among 12 existing algorithms, (iii) the performance indices among three existing ones, (iv) the comprehensive performance scores between two possible choices, and (v) the number of simulation runs. The comprehensive performance comparison results are then generated and shown as both figures and tables. MVIAeval is a useful tool for researchers to easily conduct a comprehensive and objective performance evaluation of a newly developed missing value imputation algorithm for microarray data, or for any data which can be represented in matrix form (e.g. NGS or proteomics data). MVIAeval will thus greatly expedite progress in research on missing value imputation algorithms.
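The simulation-run design of such a comparison framework can be sketched as follows: hide a fraction of entries in a complete matrix, run the candidate imputer, and score it on the hidden entries. This is a simplified stand-in for illustration (row-mean baseline, normalized-RMSE-style index), not MVIAeval itself:

```python
import random

def benchmark_imputation(matrix, impute, missing_rate=0.1, seed=0):
    """Mask random entries of a complete matrix, impute, and return a
    normalized RMSE over the hidden entries (0 means perfect recovery)."""
    rng = random.Random(seed)
    cells = [(r, c) for r in range(len(matrix)) for c in range(len(matrix[0]))]
    hidden = rng.sample(cells, max(1, int(missing_rate * len(cells))))
    masked = [row[:] for row in matrix]
    for r, c in hidden:
        masked[r][c] = None          # None marks a missing value
    filled = impute(masked)
    truth = [matrix[r][c] for r, c in hidden]
    guess = [filled[r][c] for r, c in hidden]
    mean_t = sum(truth) / len(truth)
    num = sum((t - g) ** 2 for t, g in zip(truth, guess))
    den = sum((t - mean_t) ** 2 for t in truth) or 1.0  # guard zero variance
    return (num / den) ** 0.5

def row_mean_impute(masked):
    """Baseline imputer: replace each missing value with its row mean."""
    out = []
    for row in masked:
        known = [v for v in row if v is not None]
        m = sum(known) / len(known)
        out.append([m if v is None else v for v in row])
    return out
```

Repeating this over many seeds and several datasets, and averaging the index, gives a comprehensive score of the kind the tool reports.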
Comparison of standing volume estimates using optical dendrometers
Neil A. Clark; Stanley J. Zarnoch; Alexander Clark; Gregory A. Reams
2001-01-01
This study compared height and diameter measurements and volume estimates on 20 hardwood and 20 softwood stems using traditional optical dendrometers, an experimental camera instrument, and mechanical calipers. Multiple comparison tests showed significant differences among the means for lower stem diameters when the camera was used. There were no significant...
Research report on: Specialized physiological studies in support of manned space flight
NASA Technical Reports Server (NTRS)
Luft, U. C.
1975-01-01
An investigation of the role of O2 fluctuations in oxygen uptake observed with changing posture is reported. A comparison of the closing volume test with other pulmonary function measurements is presented, along with a comparison of hydrostatic weighing and a stereophotogrammetric method for determining body volume.
The U.S. Environmental Protection Agency has standardized methods for performing acute marine amphipod sediment toxicity tests. A test design reducing sediment volume from 200 to 50 ml and overlying water from 600 to 150 ml was recently proposed. An interlaboratory comparison wa...
ERIC Educational Resources Information Center
Wu, Margaret
2010-01-01
This paper makes an in-depth comparison of the PISA (OECD) and TIMSS (IEA) mathematics assessments conducted in 2003. First, a comparison of survey methodologies is presented, followed by an examination of the mathematics frameworks in the two studies. The methodologies and the frameworks in the two studies form the basis for providing…
Meshless Modeling of Deformable Shapes and their Motion
Adams, Bart; Ovsjanikov, Maks; Wand, Michael; Seidel, Hans-Peter; Guibas, Leonidas J.
2010-01-01
We present a new framework for interactive shape deformation modeling and key frame interpolation based on a meshless finite element formulation. Starting from a coarse nodal sampling of an object’s volume, we formulate rigidity and volume preservation constraints that are enforced to yield realistic shape deformations at interactive frame rates. Additionally, by specifying key frame poses of the deforming shape and optimizing the nodal displacements while targeting smooth interpolated motion, our algorithm extends to a motion planning framework for deformable objects. This allows reconstructing smooth and plausible deformable shape trajectories in the presence of possibly moving obstacles. The presented results illustrate that our framework can handle complex shapes at interactive rates and hence is a valuable tool for animators to realistically and efficiently model and interpolate deforming 3D shapes. PMID:24839614
Scaling of number, size, and metabolic rate of cells with body size in mammals.
Savage, Van M; Allen, Andrew P; Brown, James H; Gillooly, James F; Herman, Alexander B; Woodruff, William H; West, Geoffrey B
2007-03-13
The size and metabolic rate of cells affect processes from the molecular to the organismal level. We present a quantitative, theoretical framework for studying relationships among cell volume, cellular metabolic rate, body size, and whole-organism metabolic rate that helps reveal the feedback between these levels of organization. We use this framework to show that average cell volume and average cellular metabolic rate cannot both remain constant with changes in body size because of the well known body-size dependence of whole-organism metabolic rate. Based on empirical data compiled for 18 cell types in mammals, we find that many cell types, including erythrocytes, hepatocytes, fibroblasts, and epithelial cells, follow a strategy in which cellular metabolic rate is body size dependent and cell volume is body size invariant. We suggest that this scaling holds for all quickly dividing cells, and conversely, that slowly dividing cells are expected to follow a strategy in which cell volume is body size dependent and cellular metabolic rate is roughly invariant with body size. Data for slowly dividing neurons and adipocytes show that cell volume does indeed scale with body size. From these results, we argue that the particular strategy followed depends on the structural and functional properties of the cell type. We also discuss consequences of these two strategies for cell number and capillary densities. Our results and conceptual framework emphasize fundamental constraints that link the structure and function of cells to that of whole organisms.
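The core constraint described above can be checked symbolically: with whole-organism rate B ~ M^b, cell number N ~ M/V_cell, and B = N·b_cell, fixing cell volume forces the cellular rate to scale as M^(b-1), while fixing the cellular rate forces cell volume to scale as M^(1-b). A short sketch of that bookkeeping:

```python
def strategy_exponents(body_mass_exponent=0.75):
    """Return the forced scaling exponents under the two strategies, given
    whole-organism metabolic rate B ~ M^b (b = 3/4 in the classic allometry).

    Strategy 1 (quickly dividing cells): cell volume invariant, so cell
    number N ~ M and the cellular rate must scale as M^(b-1).
    Strategy 2 (slowly dividing cells): cellular rate invariant, so
    N ~ M^b and cell volume must scale as M^(1-b)."""
    b = body_mass_exponent
    cellular_rate_exp = b - 1.0   # strategy 1: b_cell ~ M^(b-1)
    cell_volume_exp = 1.0 - b     # strategy 2: V_cell ~ M^(1-b)
    return cellular_rate_exp, cell_volume_exp
```

With b = 3/4 this gives the M^(-1/4) cellular rate for size-invariant cells (erythrocytes, hepatocytes) and the M^(1/4) cell volume for rate-invariant cells (neurons, adipocytes) that the data support.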
ERIC Educational Resources Information Center
Aagaard, James S.; And Others
This two-volume document specifies a protocol that was developed using the Reference Model for Open Systems Interconnection (OSI), which provides a framework for communications within a heterogeneous network environment. The protocol implements the features necessary for bibliographic searching, record maintenance, and mail transfer between…
ERIC Educational Resources Information Center
Nyhan, Barry, Ed.; Kelleher, Michael, Ed.; Cressey, Peter, Ed.; Poell, Rob, Ed.
This volume, the second of a two-volume publication, comprises 15 papers that present the work of individual European projects dealing with learning within organizations. The five chapters in Part 1, The Meaning of the Learning Organization, examine the conceptual frameworks and dilemmas at the heart of the notion of the learning organization:…
Roldan-Valadez, Ernesto; Garcia-Ulloa, Ana Cristina; Gonzalez-Gutierrez, Omar; Martinez-Lopez, Manuel
2011-01-01
Computer-assisted three-dimensional (3D) data allow for a more accurate evaluation of volumes compared with traditional measurements. An in vitro method comparison between geometric volume and 3D volumetry was performed to obtain reference data for pituitary volumes in normal pituitary glands (PGs) and PGs containing adenomas. Prospective, transverse, analytical study. Forty-eight subjects underwent brain magnetic resonance imaging (MRI) with 3D sequencing for computer-aided volumetry. PG phantom volumes measured by both methods were compared. Using the better volumetric method, volumes of normal PGs and PGs with adenoma were compared. Statistical analysis used the Bland-Altman method, t-statistics, effect size and linear regression analysis. Method comparison between 3D volumetry and geometric volume revealed lower bias and better precision for 3D volumetry. A total of 27 patients exhibited normal PGs (mean age 42.07 ± 16.17 years), and length, height, width, geometric volume and 3D volumetry were greater in women than in men. A total of 21 patients exhibited adenomas (mean age 39.62 ± 10.79 years), and length, height, width, geometric volume and 3D volumetry were greater in men than in women, with significant volumetric differences. Age did not influence pituitary volumes on linear regression analysis. Results from the present study showed that 3D volumetry was more accurate than the geometric method. In addition, the upper normal limits of PGs overlapped with the lower volume limits of early-stage microadenomas.
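The Bland-Altman method used for the comparison reduces to the mean difference between paired measurements (bias) and limits of agreement at bias ± 1.96 SD. A minimal sketch with made-up volume pairs, not the study's data:

```python
def bland_altman(method_a, method_b):
    """Bland-Altman agreement statistics for paired measurements from two
    methods: returns (bias, lower limit, upper limit), where the limits
    are bias +/- 1.96 standard deviations of the differences."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = (sum((d - bias) ** 2 for d in diffs) / (n - 1)) ** 0.5
    return bias, bias - 1.96 * sd, bias + 1.96 * sd
```

A method with lower bias and narrower limits agrees more closely with the reference, which is the sense in which 3D volumetry outperformed the geometric estimate here.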
A Computational Framework for Bioimaging Simulation.
Watabe, Masaki; Arjunan, Satya N V; Fukushima, Seiya; Iwamoto, Kazunari; Kozuka, Jun; Matsuoka, Satomi; Shindo, Yuki; Ueda, Masahiro; Takahashi, Koichi
2015-01-01
Using bioimaging technology, biologists have attempted to identify and document analytical interpretations that underlie biological phenomena in biological cells. Theoretical biology aims at distilling those interpretations into knowledge in the mathematical form of biochemical reaction networks and understanding how higher level functions emerge from the combined action of biomolecules. However, there still remain formidable challenges in bridging the gap between bioimaging and mathematical modeling. Generally, measurements using fluorescence microscopy systems are influenced by systematic effects that arise from stochastic nature of biological cells, the imaging apparatus, and optical physics. Such systematic effects are always present in all bioimaging systems and hinder quantitative comparison between the cell model and bioimages. Computational tools for such a comparison are still unavailable. Thus, in this work, we present a computational framework for handling the parameters of the cell models and the optical physics governing bioimaging systems. Simulation using this framework can generate digital images of cell simulation results after accounting for the systematic effects. We then demonstrate that such a framework enables comparison at the level of photon-counting units.
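One of the systematic effects such a simulator must reproduce is photon-counting (shot) noise on the expected signal. An illustrative sketch using a textbook Poisson sampler, not the framework's actual optics model:

```python
import math
import random

def apply_shot_noise(expected_photons, seed=0):
    """Turn a list of expected photon counts (e.g. one per pixel of a
    simulated fluorescence image) into Poisson-distributed measured counts,
    mimicking photon-counting noise in the imaging apparatus."""
    rng = random.Random(seed)

    def poisson(lam):
        # Knuth's sampler: count uniforms whose running product stays above exp(-lam)
        limit, k, p = math.exp(-lam), 0, 1.0
        while p > limit:
            k += 1
            p *= rng.random()
        return k - 1

    return [poisson(lam) for lam in expected_photons]
```

Only after layering effects like this onto the simulated cell state can simulated and measured images be compared in the same photon-counting units.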
Asset management program enhancement plan : baseline assessment phases I and II.
DOT National Transportation Integrated Search
2015-09-01
This project resulted in the development of a framework for making asset management decisions on low-volume bridges. The : research focused on low-volume bridges located in the agricultural counties of Iowa because recent research has shown that thes...
Baxter, Emma F; Bennett, Thomas D; Cairns, Andrew B; Brownbill, Nick J; Goodwin, Andrew L; Keen, David A; Chater, Philip A; Blanc, Frédéric; Cheetham, Anthony K
2016-03-14
X-ray diffraction has been used to investigate the kinetics of amorphization through ball-milling at 20 Hz, for five zeolitic imidazolate frameworks (ZIFs) - ZIF-8, ZIF-4, ZIF-zni, BIF-1-Li and CdIF-1. We find that the rates of amorphization for the zinc-containing ZIFs increase with increasing solvent accessible volume (SAV) in the sequence ZIF-8 > ZIF-4 > ZIF-zni. The Li-B analogue of the dense ZIF-zni amorphizes more slowly than the corresponding zinc phase, with the behaviour showing a correlation with their relative bulk moduli and SAVs. The cadmium analogue of ZIF-8 (CdIF-1) amorphizes more rapidly than the zinc counterpart, which we ascribe primarily to its relatively weak M-N bonds as well as the higher SAV. The results for the ZIFs are compared to three classical zeolites - Na-X, Na-Y and ZSM-5 - with these taking up to four times longer to amorphize. The presence of adsorbed solvent in the pores is found to render both ZIF and zeolite frameworks more resistant to amorphization. X-ray total scattering measurements show that amorphous ZIF-zni is structurally indistinguishable from amorphous ZIF-4 with both structures retaining the same short-range order that is present in their crystalline precursors. By contrast, both X-ray total scattering measurements and (113)Cd NMR measurements point to changes in the local environment of amorphous CdIF-1 compared with its crystalline CdIF-1 precursor.
A novel adaptive scoring system for segmentation validation with multiple reference masks
NASA Astrophysics Data System (ADS)
Moltz, Jan H.; Rühaak, Jan; Hahn, Horst K.; Peitgen, Heinz-Otto
2011-03-01
The development of segmentation algorithms for different anatomical structures and imaging protocols is an important task in medical image processing. The validation of these methods, however, is often treated as a subordinate task. Since manual delineations, which are widely used as a surrogate for the ground truth, exhibit an inherent uncertainty, it is preferable to use multiple reference segmentations for an objective validation. This requires a consistent framework that should fulfill three criteria: 1) it should treat all reference masks equally a priori and not demand consensus between the experts; 2) it should evaluate the algorithmic performance in relation to the inter-reference variability, i.e., be more tolerant where the experts disagree about the true segmentation; and 3) it should produce results that are comparable across different test data. We show why current state-of-the-art frameworks, such as the one used at several MICCAI segmentation challenges, do not fulfill these criteria, and propose a new validation methodology. A score is computed in an adaptive way for each individual segmentation problem, using a combination of volume- and surface-based comparison metrics. These are transformed into the score by relating them to the variability between the reference masks, which can be measured by comparing the masks with each other or with an estimated ground truth. We present examples from a study on liver tumor segmentation in CT scans where our score gives a more adequate assessment of the segmentation results than the MICCAI framework.
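The adaptive-scoring principle can be sketched by normalizing the algorithm's error against the disagreement among the references themselves. This toy version uses a single overlap metric (Jaccard) on voxel sets, whereas the paper combines several volume- and surface-based metrics:

```python
def jaccard(a, b):
    """Jaccard overlap of two masks given as sets of voxel coordinates."""
    return len(a & b) / len(a | b)

def adaptive_score(algorithm_mask, reference_masks):
    """Mean algorithm-vs-reference error divided by mean pairwise
    reference-vs-reference error. Values <= 1 mean the algorithm stays
    within expert disagreement, so the score is automatically more
    tolerant where the references themselves diverge."""
    algo_err = sum(1 - jaccard(algorithm_mask, r)
                   for r in reference_masks) / len(reference_masks)
    pairs = [(r1, r2) for i, r1 in enumerate(reference_masks)
             for r2 in reference_masks[i + 1:]]
    ref_err = sum(1 - jaccard(r1, r2) for r1, r2 in pairs) / len(pairs)
    return algo_err / ref_err if ref_err > 0 else float('inf')
```

Because the denominator is recomputed per case, the same raw overlap is penalized more on an easy case (experts agree) than on a hard one (experts disagree), which is the comparability property the framework aims for.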
Functional validation and comparison framework for EIT lung imaging.
Grychtol, Bartłomiej; Elke, Gunnar; Meybohm, Patrick; Weiler, Norbert; Frerichs, Inéz; Adler, Andy
2014-01-01
Electrical impedance tomography (EIT) is an emerging clinical tool for monitoring ventilation distribution in mechanically ventilated patients, for which many image reconstruction algorithms have been suggested. We propose an experimental framework to assess such algorithms with respect to their ability to correctly represent well-defined physiological changes. We defined a set of clinically relevant ventilation conditions and induced them experimentally in 8 pigs by controlling three ventilator settings (tidal volume, positive end-expiratory pressure and the fraction of inspired oxygen). In this way, large and discrete shifts in global and regional lung air content were elicited. We use the framework to compare twelve 2D EIT reconstruction algorithms, including backprojection (the original and still most frequently used algorithm), GREIT (a more recent consensus algorithm for lung imaging), truncated singular value decomposition (TSVD), several variants of the one-step Gauss-Newton approach and two iterative algorithms. We consider the effects of using a 3D finite element model, assuming non-uniform background conductivity, noise modeling, reconstructing for electrode movement, total variation (TV) reconstruction, robust error norms, smoothing priors, and using difference vs. normalized difference data. Our results indicate that, while variation in the appearance of images reconstructed from the same data is not negligible, clinically relevant parameters do not vary considerably among the advanced algorithms. Several of the analysed algorithms perform well, while some others are significantly worse. Given its vintage and ad hoc formulation, backprojection works surprisingly well, supporting the validity of previous studies in lung EIT.
Comparing Observations, 1st Experimental Edition.
ERIC Educational Resources Information Center
Butts, David P.
Objectives for this module include the ability to: (1) order objects by comparing a property which the objects have in common (such as length, area, volume or mass), (2) describe objects (length, area, volume, mass, etc.) by comparing them quantitatively using either arbitrary units of comparison or standard units of comparison, and (3) describe…
SUPPLEMENTARY COMPARISON: COOMET.RI(II)-S1.Rn-222 (169/UA/98): Rn-222 volume activity comparison
NASA Astrophysics Data System (ADS)
Skliarov, V.; Röttger, A.; Honig, A.; Korostin, S.; Kuznetsov, S.; Lapenas, A.; Milevsky, V.; Ivaniukovich, A.; Kharitonov, I.; Sepman, S.
2009-01-01
According to the first program, a supplementary comparison of Rn-222 volume activity was drawn up as a bilateral comparison between NSC 'Institute of Metrology', Ukraine, and VNIIFTRI, Russia; it took place in March 2005. In April 2005, at the 5th meeting of COOMET held in Braunschweig (Germany), representatives of these institutes exchanged data which showed the comparability of the national standards of Ukraine and Russia at the check points. During the discussion of the procedure, several other institutes decided to join the comparison program, which was extended to BelGIM (Belarus), PTB (Germany), VNIIM (Russia) and RMTC (Latvia). The national standards of volume activity of radon-222 were thus calibrated using one standard radon radiometer as the transfer standard. Results are given in the Final Report of the comparison. Main text. To reach the main text of this paper, click on Final Report. Note that this text is that which appears in Appendix B of the BIPM key comparison database kcdb.bipm.org/. The final report has been peer-reviewed and approved for publication by COOMET, according to the provisions of the CIPM Mutual Recognition Arrangement (MRA).
Dosimetric comparison of photon and proton treatment techniques for chondrosarcoma of thoracic spine
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yadav, Poonam, E-mail: yadav@humonc.wisc.edu; Department of Medical Physics, University of Wisconsin, Madison, WI; University of Wisconsin Riverview Cancer Center, Wisconsin Rapids, WI
2013-10-01
Chondrosarcomas are relatively radiotherapy-resistant, and delivering high radiation doses is not feasible owing to anatomic constraints. In this study, the feasibility of helical tomotherapy for treatment of chondrosarcoma of the thoracic spine is explored and compared with other available photon and proton radiotherapy techniques in the clinical setting. A patient was treated for high-grade chondrosarcoma of the thoracic spine using tomotherapy. Retrospectively, the tomotherapy plan was compared with intensity-modulated radiation therapy, dynamic arc photon therapy, and proton therapy. Two primary comparisons were made: (1) comparison of normal tissue sparing with comparable target volume coverage (plan-1), and (2) comparison of target volume coverage with a constrained maximum dose to the cord center (plan-2). With constrained target volume coverage, proton plans were found to yield lower mean doses for all organs at risk (spinal cord, esophagus, heart, and both lungs). Tomotherapy planning resulted in the lowest mean dose to all organs at risk among photon-based methods. For cord-dose-constrained plans, the static-field intensity-modulated radiation therapy and dynamic arc plans resulted in target underdosing in 20% and 12% of the planning target volume (PTV2), respectively, whereas both proton and tomotherapy plans provided clinically acceptable target volume coverage, with no portion of PTV2 receiving less than 90% of the prescribed dose. Tomotherapy plans are comparable to proton plans and produce superior results compared with other photon modalities. This feasibility study suggests that tomotherapy is an attractive alternative to proton radiotherapy for delivering high doses to lesions in the thoracic spine.
Results of the supplementary comparison SIM.M.FF-S12 for volume of liquids at 20 L
NASA Astrophysics Data System (ADS)
Maldonado, M.; Castillo, E.; Rodríguez, L. D.
2018-01-01
A supplementary comparison was performed in order to compare national measurement systems for determining the volume of liquids, particularly at a fixed volume of 20 L. The participants were CENAM (Mexico), LACOMET (Costa Rica) and RECOPE (Costa Rica). The measurements were carried out from October 2016 to June 2017. The chosen volume (20 L) is representative of the Calibration and Measurement Capabilities (CMCs) declared by the three participants. The transfer standard (TS) was a stainless steel pipette for volume at 20 L. Prior to the beginning of the comparison, the TS was tested by CENAM. The results of the test phase showed excellent values for both repeatability and reproducibility. During the SIM.M.FF-S12, the results of the laboratories showed good agreement with the reference values. The best estimates of the measurand, as reported by the participants, showed ±0.0022% as the largest difference among the laboratories. Main text To reach the main text of this paper, click on Final Report. Note that this text is that which appears in Appendix B of the BIPM key comparison database kcdb.bipm.org/. The final report has been peer-reviewed and approved for publication by the CCM, according to the provisions of the CIPM Mutual Recognition Arrangement (CIPM MRA).
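Agreement with a reference value in interlaboratory comparisons is commonly judged with the normalized error (E_n) statistic: the lab-reference difference divided by the combined expanded uncertainty. The abstract does not state which statistic this comparison used, so the following is a generic sketch with hypothetical numbers:

```python
def normalized_error(lab_value, lab_u, ref_value, ref_u):
    """E_n score for a laboratory result against a reference value, where
    lab_u and ref_u are expanded (k=2) uncertainties. |E_n| <= 1 is the
    usual criterion for agreement with the reference."""
    return (lab_value - ref_value) / (lab_u ** 2 + ref_u ** 2) ** 0.5
```

For instance, a lab reporting 20.004 L ± 0.010 L against a reference of 20.000 L ± 0.008 L scores |E_n| well below 1 and would be judged in agreement.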
Comparison of infusion pumps calibration methods
NASA Astrophysics Data System (ADS)
Batista, Elsa; Godinho, Isabel; do Céu Ferreira, Maria; Furtado, Andreia; Lucas, Peter; Silva, Claudia
2017-12-01
Nowadays, several types of infusion pump are commonly used for drug delivery, such as syringe pumps and peristaltic pumps. These instruments present different measuring features and capacities according to their use and therapeutic application. In order to ensure the metrological traceability of this flow- and volume-measuring equipment, it is necessary to use suitable calibration methods and standards. Two different calibration methods can be used to determine the flow error of infusion pumps. One is the gravimetric method, considered a primary method, commonly used by National Metrology Institutes. The other calibration method, a secondary method, relies on an infusion device analyser (IDA) and is typically used by hospital maintenance offices. The suitability of the IDA calibration method was assessed by testing several infusion instruments at different flow rates using the gravimetric method. In addition, a measurement comparison between Portuguese Accredited Laboratories and hospital maintenance offices was performed under the coordination of the Portuguese Institute for Quality, the National Metrology Institute. The results obtained were directly related to the calibration method used and are presented in this paper. This work has been developed in the framework of the EURAMET projects EMRP MeDD and EMPIR 15SIP03.
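The gravimetric method described above reduces to timed mass collection converted to volume. A minimal sketch with illustrative numbers (real calibrations also apply buoyancy and evaporation corrections, omitted here):

```python
# Sketch of a gravimetric flow check. Density, masses, and the set point
# below are illustrative assumptions, not values from the comparison.

def gravimetric_flow_ml_per_h(mass_start_g, mass_end_g, minutes,
                              density_g_per_ml=0.998):
    """Mean flow rate from the mass collected over a timed interval."""
    volume_ml = (mass_end_g - mass_start_g) / density_g_per_ml
    return volume_ml * 60.0 / minutes

def flow_error_percent(measured_ml_per_h, set_point_ml_per_h):
    """Relative flow error of the pump against its set point."""
    return 100.0 * (measured_ml_per_h - set_point_ml_per_h) / set_point_ml_per_h
```

For example, 4.99 g of water (density 0.998 g/mL) collected in 60 min corresponds to a mean flow of 5.0 mL/h; compared against a 5.0 mL/h set point the flow error is essentially zero.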
Global Inventory of Regional and National Qualifications Frameworks. Volume I: Thematic Chapters
ERIC Educational Resources Information Center
Deij, Arjen; Graham, Michael; Bjornavold, Jens; Grm, Slava Pevec; Villalba, Ernesto; Christensen, Hanne; Chakroun, Borhene; Daelman, Katrien; Carlsen, Arne; Singh, Madhu
2015-01-01
The "Global Inventory of Regional and National Qualifications Frameworks," the result of collaborative work between the European Training Foundation (ETF), the European Centre for the Development of Vocational Training (Cedefop), UNESCO [United Nations Educational, Scientific and Cultural Organization] and UIL [UNESCO Institute for…
Transaction-Based Building Controls Framework, Volume 1: Reference Guide
DOE Office of Scientific and Technical Information (OSTI.GOV)
Somasundaram, Sriram; Pratt, Robert G.; Akyol, Bora A.
This document proposes a framework concept to achieve the objectives of raising buildings’ efficiency and energy savings potential, benefiting building owners and operators. We call it a transaction-based framework, wherein mutually beneficial and cost-effective market-based transactions can be enabled between multiple players across different domains. Transaction-based building controls are one part of the transactional energy framework. While these controls realize benefits by enabling automatic, market-based intra-building efficiency optimizations, the transactional energy framework provides similar benefits using the same market-based structure, yet on a larger scale and beyond just buildings, to society at large.
Development of a Flexible Framework for Hypersonic Navier-Stokes Space Shuttle Orbiter Meshes
NASA Technical Reports Server (NTRS)
Alter, Stephen J.; Reuther, James J.; McDaniel, Ryan D.
2004-01-01
A flexible framework for constructing block-structured volume grids for hypersonic Navier-Stokes flow simulations was developed for the analysis of the Shuttle Orbiter Columbia. The development of the framework, which was partially based on the requirements of the primary flow solvers used, resulted in an ability to directly correlate solutions contributed by participating groups on a common surface mesh. A foundation was built through the assessment of differences between different solvers, which provided confidence for independent assessment of other damage scenarios by team members. The framework draws on the experience of the NASA Langley and NASA Ames Research Centers in structured grid generation, and consists of a grid generation process implemented through a division of responsibilities. The nominal division of labor consisted of NASA Johnson Space Center coordinating the damage scenarios to be analyzed by the Aerothermodynamics Columbia Accident Investigation (ACAI) team, Ames developing the surface grids that described the computational volume about the Orbiter, and Langley improving the grid quality of the Ames-generated data and constructing the final computational volume grids. Distributing the work among the participants in the ACAI team resulted in significantly less time required to construct complete meshes than would have been possible for any individual participant. The approach demonstrated that the One-NASA grid generation team could sustain the demand for five new meshes to explore new damage scenarios within an aggressive timeline.
Gray matter abnormalities of the dorsal posterior cingulate in sleep walking.
Heidbreder, Anna; Stefani, Ambra; Brandauer, Elisabeth; Steiger, Ruth; Kremser, Christian; Gizewski, Elke R; Young, Peter; Poewe, Werner; Högl, Birgit; Scherfler, Christoph
2017-08-01
This study aimed to determine whether voxel-based analysis of T1 weighted magnetic resonance imaging (MRI) and diffusion tensor imaging is able to detect alterations of gray and white matter morphometry as well as measures of mean diffusivity and fractional anisotropy in patients with non-rapid eye movement parasomnia. 3 Tesla MRI was performed in 14 drug-free, polysomnography-confirmed adult patients with non-rapid eye movement parasomnia (age: 29 ± 4.2 years; disease duration 19.2 ± 7.7 years) and 14 healthy subjects, matched for age and gender. Statistical parametric mapping was applied to objectively identify focal changes of MRI parameters throughout the entire brain volume. Statistical parametric mapping localized significant decreases of gray matter volume in the left dorsal posterior cingulate cortex (BA23) and posterior midcingulate cortex (BA24) in patients with non-rapid eye movement parasomnias compared to the control group (p < 0.001, corrected for multiple comparisons). No significant differences of mean diffusivity and fractional anisotropy measures were found between the non-rapid eye movement parasomnia group and the healthy control group. Recently, the simultaneous co-existence of arousal or wakefulness originating from the motor and cingulate cortices and persistent sleep in associative cortical regions was suggested as a functional framework of somnambulism. Gray matter volume decline in the dorsal posterior and posterior midcingulate cortex reported in this study might represent the neuroanatomical substrate for this condition. Copyright © 2017 Elsevier B.V. All rights reserved.
A damage analysis for brittle materials using stochastic micro-structural information
NASA Astrophysics Data System (ADS)
Lin, Shih-Po; Chen, Jiun-Shyan; Liang, Shixue
2016-03-01
In this work, a micro-crack informed stochastic damage analysis is performed to consider the failure of materials with stochastic microstructures. The derivation of the damage evolution law is based on the Helmholtz free energy equivalence between the cracked microstructure and the homogenized continuum. The damage model is constructed under the stochastic representative volume element (SRVE) framework. The characteristics of the SRVE used in the construction of the stochastic damage model have been investigated based on the principle of minimum potential energy. The mesh dependency issue has been addressed by introducing a scaling law into the damage evolution equation. The proposed methods are then validated through the comparison between numerical simulations and experimental observations of a high strength concrete. It is observed that the standard deviation of porosity in the microstructures has a stronger effect on the damage states and the peak stresses than on the Young's and shear moduli in the macro-scale responses.
A low-dissipation monotonicity-preserving scheme for turbulent flows in hydraulic turbines
NASA Astrophysics Data System (ADS)
Yang, L.; Nadarajah, S.
2016-11-01
The objective of this work is to improve the inherent dissipation of the numerical schemes under the framework of a Reynolds-averaged Navier-Stokes (RANS) simulation. The governing equations are solved by the finite volume method with the k-ω SST turbulence model. Instead of the van Albada limiter, a novel eddy-preserving limiter is employed in the MUSCL reconstructions to minimize the dissipation of the vortex. The eddy-preserving procedure inactivates the van Albada limiter in the swirl plane and reduces the artificial dissipation to better preserve vortical flow structures. Steady and unsteady simulations of turbulent flows in a straight channel and a straight asymmetric diffuser are demonstrated. Profiles of velocity, Reynolds shear stress and turbulent kinetic energy are presented and compared against large eddy simulation (LES) and/or experimental data. Finally, comparisons are made to demonstrate the capability of the eddy-preserving limiter scheme.
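The van Albada limiter mentioned above has a standard closed form. The sketch below shows it together with a simple on/off switch standing in for the eddy-preserving deactivation; the actual switching criterion (detecting the swirl plane) is the paper's contribution and is not reproduced here:

```python
# Sketch: van Albada slope limiter for MUSCL reconstruction.
# The `active` flag is an illustrative stand-in for the paper's
# eddy-preserving deactivation logic, which is more involved.

def van_albada(r):
    """Limiter value for the ratio r of successive gradients; 0 for r <= 0."""
    if r <= 0.0:
        return 0.0
    return (r * r + r) / (r * r + 1.0)

def limited_slope(du_left, du_right, active=True):
    """MUSCL-style limited slope from the two one-sided differences.
    With active=False (eddy-preserving mode) limiting is skipped and the
    unlimited average slope is returned, reducing artificial dissipation."""
    if not active:
        return 0.5 * (du_left + du_right)
    if du_right == 0.0:
        return 0.0
    r = du_left / du_right
    return van_albada(r) * du_right
```

Note that for smooth data (r = 1) the limiter returns exactly 1, preserving second-order accuracy, while at extrema (r < 0) the slope is zeroed, which is the dissipation the eddy-preserving variant selectively avoids.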
Databases Are Not Toasters: A Framework for Comparing Data Warehouse Appliances
NASA Astrophysics Data System (ADS)
Trajman, Omer; Crolotte, Alain; Steinhoff, David; Nambiar, Raghunath Othayoth; Poess, Meikel
The success of Business Intelligence (BI) applications depends on two factors, the ability to analyze data ever more quickly and the ability to handle ever increasing volumes of data. Data Warehouse (DW) and Data Mart (DM) installations that support BI applications have historically been built using traditional architectures either designed from the ground up or based on customized reference system designs. The advent of Data Warehouse Appliances (DA) brings packaged software and hardware solutions that address performance and scalability requirements for certain market segments. The differences between DAs and custom installations make direct comparisons between them impractical and suggest the need for a targeted DA benchmark. In this paper we review data warehouse appliances by surveying thirteen products offered today. We assess the common characteristics among them and propose a classification for DA offerings. We hope our results will help define a useful benchmark for DAs.
Job Scheduling in a Heterogeneous Grid Environment
NASA Technical Reports Server (NTRS)
Shan, Hong-Zhang; Smith, Warren; Oliker, Leonid; Biswas, Rupak
2004-01-01
Computational grids have the potential for solving large-scale scientific problems using heterogeneous and geographically distributed resources. However, a number of major technical hurdles must be overcome before this potential can be realized. One problem that is critical to effective utilization of computational grids is the efficient scheduling of jobs. This work addresses this problem by describing and evaluating a grid scheduling architecture and three job migration algorithms. The architecture is scalable and does not assume control of local site resources. The job migration policies use the availability and performance of computer systems, the network bandwidth available between systems, and the volume of input and output data associated with each job. An extensive performance comparison is presented using real workloads from leading computational centers. The results, based on several key metrics, demonstrate that the performance of our distributed migration algorithms is significantly greater than that of a local scheduling framework and comparable to a non-scalable global scheduling approach.
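The migration policies above weigh local queue wait against remote wait plus data transfer cost. A back-of-envelope sketch of that trade-off, with a deliberately simplified decision rule and hypothetical numbers (the actual algorithms in the paper use richer availability and performance metrics):

```python
# Sketch: job-migration decision from the metrics the abstract lists
# (inter-site bandwidth and job data volume). The scoring rule and all
# inputs are illustrative assumptions, not the paper's algorithms.

def migration_time_s(data_volume_mb, bandwidth_mbps):
    """Time to move a job's input and output data between sites."""
    return data_volume_mb * 8.0 / bandwidth_mbps  # MB -> Mb, then / Mb/s

def worth_migrating(local_wait_s, remote_wait_s, data_volume_mb, bandwidth_mbps):
    """Migrate only if the remote wait plus transfer beats the local wait."""
    return remote_wait_s + migration_time_s(data_volume_mb, bandwidth_mbps) < local_wait_s
```

For a job with 100 MB of data on an 800 Mb/s link, transfer adds only about a second, so even a modest reduction in queue wait justifies migration; for very large data volumes the transfer term dominates and the job stays local.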
Jeazet, Harold B. Tanh; Koschine, Tönjes; Staudt, Claudia; Raetzke, Klaus; Janiak, Christoph
2013-01-01
Hydrothermally stable particles of the metal-organic framework MIL-101(Cr) were incorporated into a polysulfone (PSF) matrix to produce mixed-matrix or composite membranes with excellent dispersion of MIL-101 particles and good adhesion within the polymer matrix. Pure gas (O2, N2, CO2 and CH4) permeation tests showed a significant increase of gas permeabilities of the mixed-matrix membranes without any loss in selectivity. Positron annihilation lifetime spectroscopy (PALS) indicated that the increased gas permeability is due to the free volume in the PSF polymer and the added large free volume inside the MIL-101 particles. The trend of the gas transport properties of the composite membranes could be reproduced by a Maxwell model. PMID:24957061
NASA Astrophysics Data System (ADS)
Rusu, Mirabela; Wang, Haibo; Golden, Thea; Gow, Andrew; Madabhushi, Anant
2013-03-01
Mouse lung models facilitate the investigation of conditions such as chronic inflammation which are associated with common lung diseases. The multi-scale manifestation of lung inflammation prompted us to use multi-scale imaging - both in vivo and ex vivo MRI along with ex vivo histology - for its study in a new quantitative way. Some imaging modalities, such as MRI, are non-invasive and capture macroscopic features of the pathology, while others, e.g. ex vivo histology, depict detailed structures. Registering such multi-modal data to the same spatial coordinates will allow the construction of a comprehensive 3D model to enable the multi-scale study of diseases. Moreover, it may facilitate the identification and definition of quantitative in vivo imaging signatures for diseases and pathologic processes. We introduce a quantitative, image analytic framework to integrate in vivo MR images of the entire mouse with ex vivo histology of the lung alone, using ex vivo MRI of the lung as a conduit to facilitate their co-registration. In our framework, we first align the MR images by registering the in vivo and ex vivo MRI of the lung using an interactive rigid registration approach. Then we reconstruct the 3D volume of the ex vivo histological specimen by efficient groupwise registration of the 2D slices. The resulting 3D histologic volume is subsequently registered to the MRI volumes by interactive rigid registration, directly to the ex vivo MRI, and implicitly to the in vivo MRI. Qualitative evaluation of the registration framework was performed by comparing airway tree structures in ex vivo MRI and ex vivo histology, where airways are visible and may be annotated. We present a use case for evaluation of our co-registration framework in the context of studying chronic inflammation in a diseased mouse.
Higher Education: Handbook of Theory and Research. Volume XI.
ERIC Educational Resources Information Center
Smart, John C., Ed.
This volume contains 10 papers on higher education theory and research. "Variation Among Academic Disciplines: Analytical Frameworks and Research" (John M. Braxton and Lowell L. Hargens) reviews work on disciplinary differences and proposed conceptual schemes for explaining these differences. "Public Policy and Public Trust: The Use…
Parallel Distributed Processing at 25: further explorations in the microstructure of cognition.
Rogers, Timothy T; McClelland, James L
2014-08-01
This paper introduces a special issue of Cognitive Science initiated on the 25th anniversary of the publication of Parallel Distributed Processing (PDP), a two-volume work that introduced the use of neural network models as vehicles for understanding cognition. The collection surveys the core commitments of the PDP framework, the key issues the framework has addressed, and the debates the framework has spawned, and presents viewpoints on the current status of these issues. The articles focus on both historical roots and contemporary developments in learning, optimality theory, perception, memory, language, conceptual knowledge, cognitive control, and consciousness. Here we consider the approach more generally, reviewing the original motivations, the resulting framework, and the central tenets of the underlying theory. We then evaluate the impact of PDP both on the field at large and within specific subdomains of cognitive science and consider the current role of PDP models within the broader landscape of contemporary theoretical frameworks in cognitive science. Looking to the future, we consider the implications for cognitive science of the recent success of machine learning systems called "deep networks", systems that build on key ideas presented in the PDP volumes. Copyright © 2014 Cognitive Science Society, Inc.
R.B. Ferguson; V. Clark Baldwin
1995-01-01
Estimating tree and stand volume in mature plantations is time consuming, involving much manpower and equipment; however, several sampling and volume-prediction techniques are available. This study showed that a well-constructed, volume-equation method yields estimates comparable to those of the often more time-consuming, height-accumulation method, even though the...
Control volume based hydrocephalus research: a phantom study
NASA Astrophysics Data System (ADS)
Cohen, Benjamin; Voorhees, Abram; Madsen, Joseph; Wei, Timothy
2009-11-01
Hydrocephalus is a complex spectrum of neurophysiological disorders involving perturbation of the intracranial contents; primarily increased intraventricular cerebrospinal fluid (CSF) volume and intracranial pressure are observed. CSF dynamics are highly coupled to the cerebral blood flows and pressures as well as the mechanical properties of the brain. Hydrocephalus, as such, is a very complex biological problem. We propose integral control volume analysis as a method of tracking these important interactions using mass and momentum conservation principles. As a first step in applying this methodology in humans, an in vitro phantom is used as a simplified model of the intracranial space. The phantom's design consists of a rigid container filled with a compressible gel. Within the gel a hollow spherical cavity represents the ventricular system and a cylindrical passage represents the spinal canal. A computer controlled piston pump supplies sinusoidal volume fluctuations into and out of the flow phantom. MRI is used to measure fluid velocity and volume change as functions of time. Independent pressure measurements and momentum flow rate measurements are used to calibrate the MRI data. These data are used as a framework for future work with live patients and normal individuals. Flow and pressure measurements on the flow phantom will be presented through the control volume framework.
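The control-volume bookkeeping for the phantom follows directly from mass conservation: for an incompressible fluid the cavity volume changes at the net inflow rate, so a sinusoidal pump flow integrates to a cosine-shaped volume trace. A sketch with illustrative pump parameters (not the phantom's actual settings):

```python
# Sketch: conservation of mass for the phantom's ventricular cavity,
# dV/dt = Q(t), with a sinusoidal pump Q(t) = amp * sin(2*pi*f*t).
# v0, amp, and freq below are illustrative assumptions.
import math

def cavity_volume(t, v0_ml, amp_ml_per_s, freq_hz):
    """Cavity volume at time t: V(t) = V0 + (amp/w) * (1 - cos(w*t)),
    the closed-form integral of the sinusoidal inflow, w = 2*pi*f."""
    w = 2.0 * math.pi * freq_hz
    return v0_ml + amp_ml_per_s / w * (1.0 - math.cos(w * t))
```

The volume excursion peaks half a cycle after the flow peak, at 2·amp/w above baseline, which is the kind of amplitude/phase relation the MRI flow and volume measurements are used to check.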
NASA Astrophysics Data System (ADS)
Sobina, E.; Zimathis, A.; Prinz, C.; Emmerling, F.; Unger, W.; de Santis Neves, R.; Galhardo, C. E.; De Robertis, E.; Wang, H.; Mizuno, K.; Kurokawa, A.
2016-01-01
CCQM key comparison K-136, measurement of porosity properties (specific adsorption, BET specific surface area, specific pore volume and pore diameter) of nanoporous Al2O3, has been performed by the Surface Analysis Working Group (SAWG) of the Consultative Committee for Amount of Substance (CCQM). The objective of this key comparison is to compare the equivalency of the National Metrology Institutes (NMIs) and Designated Institutes (DIs) for the measurement of porosity properties (specific adsorption, BET specific surface area, specific pore volume and pore diameter) of nanoporous substances (sorbents, catalytic agents, cross-linkers, zeolites, etc) used in advanced technology. In this key comparison, a commercial sorbent (aluminum oxide) was supplied as a sample. Five NMIs participated in this key comparison. All participants used a gas adsorption method, here nitrogen adsorption at 77.3 K, for analysis according to the international standards ISO 15901-2 and ISO 9277. In this key comparison, the degrees of equivalence and their uncertainties for specific adsorption, BET specific surface area, specific pore volume and pore diameter were established. Main text To reach the main text of this paper, click on Final Report. Note that this text is that which appears in Appendix B of the BIPM key comparison database kcdb.bipm.org/. The final report has been peer-reviewed and approved for publication by the CCQM, according to the provisions of the CIPM Mutual Recognition Arrangement (CIPM MRA).
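The BET evaluation behind ISO 9277 fits the linearized isotherm and recovers the monolayer capacity, from which the specific surface area follows. A minimal sketch on synthetic data (not the K-136 sample; the nitrogen cross-section 0.162 nm² is the conventional value):

```python
# Sketch: BET analysis of an N2 adsorption isotherm. The data used in
# any example run are synthetic; real evaluations follow ISO 9277's
# range-selection and consistency criteria, omitted here.

def bet_fit(p_rel, v_ads):
    """Least-squares line through y = 1/(v*((1/x) - 1)) vs x = p/p0.
    Returns (v_m, c): monolayer volume (same units as v_ads) and BET constant,
    via v_m = 1/(slope + intercept), c = 1 + slope/intercept."""
    xs = list(p_rel)
    ys = [1.0 / (v * (1.0 / x - 1.0)) for x, v in zip(xs, v_ads)]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    intercept = my - slope * mx
    return 1.0 / (slope + intercept), 1.0 + slope / intercept

def bet_surface_area_m2_per_g(v_m_ml_stp):
    """Specific surface area from monolayer volume in mL(STP)/g
    (N2 cross-section 0.162 nm^2, molar volume 22414 mL/mol)."""
    n_a, sigma_m2, v_molar_ml = 6.02214e23, 0.162e-18, 22414.0
    return v_m_ml_stp * n_a * sigma_m2 / v_molar_ml
```

On an exactly linear synthetic isotherm the fit recovers the generating v_m and c; a monolayer volume of 10 mL(STP)/g corresponds to roughly 43.5 m²/g.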
A Computational Framework for Bioimaging Simulation
Watabe, Masaki; Arjunan, Satya N. V.; Fukushima, Seiya; Iwamoto, Kazunari; Kozuka, Jun; Matsuoka, Satomi; Shindo, Yuki; Ueda, Masahiro; Takahashi, Koichi
2015-01-01
Using bioimaging technology, biologists have attempted to identify and document analytical interpretations that underlie biological phenomena in biological cells. Theoretical biology aims at distilling those interpretations into knowledge in the mathematical form of biochemical reaction networks and understanding how higher level functions emerge from the combined action of biomolecules. However, there still remain formidable challenges in bridging the gap between bioimaging and mathematical modeling. Generally, measurements using fluorescence microscopy systems are influenced by systematic effects that arise from stochastic nature of biological cells, the imaging apparatus, and optical physics. Such systematic effects are always present in all bioimaging systems and hinder quantitative comparison between the cell model and bioimages. Computational tools for such a comparison are still unavailable. Thus, in this work, we present a computational framework for handling the parameters of the cell models and the optical physics governing bioimaging systems. Simulation using this framework can generate digital images of cell simulation results after accounting for the systematic effects. We then demonstrate that such a framework enables comparison at the level of photon-counting units. PMID:26147508
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chen Ting; Kim, Sung; Goyal, Sharad
2010-01-15
Purpose: High-speed nonrigid registration between the planning CT and the treatment CBCT data is critical for real-time image-guided radiotherapy (IGRT) to improve the dose distribution and to reduce the toxicity to adjacent organs. The authors propose a new fully automatic 3D registration framework that integrates object-based global and seed constraints with the grayscale-based "demons" algorithm. Methods: Clinical objects were segmented on the planning CT images and were utilized as meshless deformable models during the nonrigid registration process. The meshless models reinforced a global constraint in addition to the grayscale difference between CT and CBCT in order to maintain the shape and the volume of geometrically complex 3D objects during the registration. To expedite the registration process, the framework was stratified into hierarchies, and the authors used a frequency domain formulation to diffuse the displacement between the reference and the target in each hierarchy. Also, during the registration of pelvis images, they replaced the air region inside the rectum with estimated pixel values from the surrounding rectal wall and introduced an additional seed constraint to robustly track and match the seeds implanted into the prostate. The proposed registration framework and algorithm were evaluated on 15 real prostate cancer patients. For each patient, the prostate gland, seminal vesicles, bladder, and rectum were first segmented by a radiation oncologist on planning CT images for radiotherapy planning purposes. The same radiation oncologist also manually delineated the tumor volumes and critical anatomical structures in the corresponding CBCT images acquired at treatment. These delineated structures on the CBCT were only used as the ground truth for the quantitative validation, while structures on the planning CT were used both as the input to the registration method and the ground truth in validation.
By registering the planning CT to the CBCT, a displacement map was generated. Segmented volumes in the CT images deformed using the displacement field were compared against the manual segmentations in the CBCT images to quantitatively measure the convergence of the shape and the volume. Other image features were also used to evaluate the overall performance of the registration. Results: The algorithm was able to complete the segmentation and registration process within 1 min, and the superimposed clinical objects achieved a volumetric similarity measure of over 90% between the reference and the registered data. Validation results also showed that the proposed registration could accurately trace the deformation inside the target volume with average errors of less than 1 mm. The method had a solid performance in registering the simulated images with up to 20 Hounsfield units of white noise added. Also, the side-by-side comparison with the original demons algorithm demonstrated its improved registration performance over local pixel-based registration approaches. Conclusions: Given the strength and efficiency of the algorithm, the proposed method has significant clinical potential to accelerate and to improve CBCT delineation and target tracking in online IGRT applications.
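Volumetric similarity measures of this kind are typically Dice-style overlaps between the deformed and manual segmentations. A minimal voxel-set sketch (the toy index sets stand in for the CT/CBCT masks; the paper's exact similarity metric is not specified in the abstract):

```python
# Sketch: Dice overlap between two binary voxel masks, represented here
# as sets of voxel indices. Masks in the assertions are toy examples.

def dice_coefficient(mask_a, mask_b):
    """Dice similarity: 2*|A ∩ B| / (|A| + |B|), in [0, 1]."""
    inter = len(mask_a & mask_b)
    return 2.0 * inter / (len(mask_a) + len(mask_b))
```

Two masks sharing 90 of roughly 100 voxels each score about 0.95, i.e. comfortably above the 90% similarity level reported above.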
Penetration with Long Rods: A Theoretical Framework and Comparison with Instrumented Impacts,
1980-06-01
theoretical framework for an experimental program is described. The theory of one-dimensional wave propagation is used to show how data from instrumented long rods and targets may be fitted together to give a...the theoretical framework. In the final section the results to date are discussed.
Saini, Vipin K; Pires, João
2017-05-01
Reticulated foam-shaped adsorbents are more efficient for the removal of volatile organic compounds (VOCs), particularly from low VOC-concentration indoor air streams. In this study, a composite structure of zeolite and metal-organic frameworks (MOFs), referred to as ZMF, was fabricated by immobilization of fine MOF-199 powder on a foam-shaped Zeolite Socony Mobil-5 (ZSM-5) zeolitic structure, referred to as ZF. The ZMF possesses a uniform and well-dispersed coating of MOF-199 on the porous framework of ZF. It shows higher surface area, pore volume, and VOC adsorption capacity compared to the ZF structure. Post-fabrication changes in the selective adsorption properties of ZMF were studied with three common indoor VOCs (benzene, n-hexane, and cyclohexane), using a gravimetric adsorption technique. The adsorption capacity of ZMF for the different VOCs follows the order benzene > n-hexane > cyclohexane. In comparison with MOF-199 and ZF, the composite structure ZMF shows improved selectivity for benzene over the other two VOCs. Further, the improvement in efficiency and stability of the prepared ZMF was found to be associated with its high MOF loading capacity and unique morphological and structural properties. The developed composite structure with improved VOC removal and recyclability could be a promising material for small to limited scale air pollution treatment units. Copyright © 2016. Published by Elsevier B.V.
A novel analytical description of periodic volume coil geometries in MRI
NASA Astrophysics Data System (ADS)
Koh, D.; Felder, J.; Shah, N. J.
2018-03-01
MRI volume coils can be represented by equivalent lumped element circuits, and for a variety of these circuit configurations analytical design equations have been presented. The unification of several volume coil topologies results in a two-dimensional gridded equivalent lumped element circuit which comprises the birdcage resonator and its multiple-endring derivatives, but also novel structures like the capacitively coupled ring resonator. The theory section analyzes a general two-dimensional circuit by noting that its current distribution can be decomposed into a longitudinal and an azimuthal dependency. This can be exploited to compare the current distribution with the transfer function of filter circuits along one direction. The resonances of the transfer function coincide with the resonances of the volume resonator, and the simple analytical solution can be used as a design equation. The proposed framework is verified experimentally against a novel capacitively coupled ring structure which was derived from the general circuit formulation and is proven to exhibit a dominant homogeneous mode. In conclusion, a unified analytical framework is presented that allows the resonance frequency of any volume resonator that can be represented by a two-dimensional meshed equivalent circuit to be determined.
Dutta, Rabijit; Xing, Tao; Swanson, Craig; Heltborg, Jeff; Murdoch, Gordon K
2018-01-01
Objective: A comparison between flow and gas washout data for high-frequency percussive ventilation (HFPV) and pressure control ventilation (PCV) under similar conditions is currently not available. This bench study aims to compare and describe the flow and gas washout behavior of HFPV and PCV in a newly designed experimental setup and establish a framework for future clinical and animal studies. Approach: We studied gas washout behavior using a newly designed experimental setup that is motivated by the multi-breath nitrogen washout measurements. In this procedure, a test lung was filled with nitrogen gas before it was connected to a ventilator. Pressure, volume, and oxygen concentrations were recorded under different compliance and resistance conditions. PCV was compared with two settings of HFPV, namely, HFPV-High and HFPV-Low, to simulate the different variations in its clinical application. In the HFPV-Low mode, the peak pressures and drive pressures of HFPV and PCV are matched, whereas in the HFPV-High mode, the mean airway pressures (MAP) are matched. Main results: HFPV-Low mode delivers smaller tidal volume (VT) as compared to PCV under all lung conditions, whereas HFPV-High delivers a larger VT. HFPV-High provides rapid washout as compared to PCV under all lung conditions. HFPV-Low takes a longer time to wash out nitrogen except at a low compliance, where it expedites washout at a smaller VT and MAP compared to PCV washout. Significance: Various flow parameters for HFPV and PCV are mathematically defined. A shorter washout time at a small VT in low compliant test lungs for HFPV could be regarded as a hypothesis for lung protective ventilation for animal or human lungs. PMID:29369819
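An idealized way to reason about such washout times is a perfect-mixing dilution model, in which each breath of tidal volume VT multiplies the nitrogen concentration in a lung of volume V by V/(V + VT). This is a textbook simplification, not the bench model used in the study, and the volumes below are illustrative:

```python
# Sketch: multi-breath nitrogen washout under a perfect-mixing assumption.
# C_n = C0 * (V/(V+VT))**n, so the breath count to reach a target
# concentration follows from a logarithm. All inputs are illustrative.
import math

def breaths_to_washout(v_lung_ml, vt_ml, c0=0.8, target=0.02):
    """Number of breaths to dilute N2 from c0 to target concentration."""
    ratio = v_lung_ml / (v_lung_ml + vt_ml)
    return math.ceil(math.log(target / c0) / math.log(ratio))
```

For a 2 L lung, a 500 mL tidal volume needs 17 ideal breaths to wash nitrogen from 80% down to 2%, while a 1 L tidal volume needs fewer; real washout is slower because mixing is imperfect, which is exactly what the bench comparison probes.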
Dutta, Rabijit; Xing, Tao; Swanson, Craig; Heltborg, Jeff; Murdoch, Gordon K
2018-03-15
A comparison between flow and gas washout data for high-frequency percussive ventilation (HFPV) and pressure control ventilation (PCV) under similar conditions is currently not available. This bench study aims to compare and describe the flow and gas washout behavior of HFPV and PCV in a newly designed experimental setup and establish a framework for future clinical and animal studies. We studied gas washout behavior using a newly designed experimental setup that is motivated by the multi-breath nitrogen washout measurements. In this procedure, a test lung was filled with nitrogen gas before it was connected to a ventilator. Pressure, volume, and oxygen concentrations were recorded under different compliance and resistance conditions. PCV was compared with two settings of HFPV, namely, HFPV-High and HFPV-Low, to simulate the different variations in its clinical application. In the HFPV-Low mode, the peak pressures and drive pressures of HFPV and PCV are matched, whereas in the HFPV-High mode, the mean airway pressures (MAP) are matched. HFPV-Low mode delivers smaller tidal volume (VT) as compared to PCV under all lung conditions, whereas HFPV-High delivers a larger VT. HFPV-High provides rapid washout as compared to PCV under all lung conditions. HFPV-Low takes a longer time to wash out nitrogen except at a low compliance, where it expedites washout at a smaller VT and MAP compared to PCV washout. Various flow parameters for HFPV and PCV are mathematically defined. A shorter washout time at a small VT in low compliant test lungs for HFPV could be regarded as a hypothesis for lung protective ventilation for animal or human lungs.
Arsznov, Bradley M; Sakai, Sharleen T
2013-01-01
The present study investigated whether increased relative brain size, including regional brain volumes, is related to differing behavioral specializations exhibited by three member species of the family Procyonidae. Procyonid species exhibit continuums of behaviors related to social and physical environmental complexities: the mostly solitary, semiarboreal and highly dexterous raccoons (Procyon lotor); the exclusively arboreal kinkajous (Potos flavus), which live either alone or in small polyandrous family groups; and the social, terrestrial coatimundis (Nasua nasua, N. narica). Computed tomographic (CT) scans of 45 adult skulls including 17 coatimundis (9 male, 8 female), 14 raccoons (7 male, 7 female), and 14 kinkajous (7 male, 7 female) were used to create three-dimensional virtual endocasts. Endocranial volume was positively correlated with two separate measures of body size: skull basal length (r = 0.78, p < 0.01) and basicranial axis length (r = 0.45, p = 0.002). However, relative brain size (total endocranial volume as a function of body size) varied by species depending on which body size measurement (skull basal length or basicranial axis length) was used. Comparisons of relative regional brain volumes revealed that the anterior cerebrum volume and surface area, consisting mainly of frontal cortex, were significantly larger in the social coatimundi compared to kinkajous and raccoons. The dexterous raccoon had the largest relative posterior cerebrum volume, which includes the somatosensory cortex, in comparison to the other procyonid species studied. The exclusively arboreal kinkajou had the largest relative cerebellum and brain stem volume in comparison to the semiarboreal raccoon and the terrestrial coatimundi. Finally, intraspecific comparisons failed to reveal any sex differences, except in the social coatimundi. Female coatimundis possessed a larger relative frontal cortical volume than males.
Social life histories differ in male and female coatimundis but not in either kinkajous or raccoons. This difference may reflect the differing social life histories experienced by females who reside in their natal bands, and forage and engage in antipredator behavior as a group, while males disperse upon reaching adulthood and are usually solitary thereafter. This analysis in the three procyonid species supports the comparative neurology principle that behavioral specializations correspond to an expansion of neural tissue involved in that function.
NASA Astrophysics Data System (ADS)
Hachaj, Tomasz; Ogiela, Marek R.
2012-10-01
The proposed framework for cognitive analysis of perfusion computed tomography images is a fusion of image processing, pattern recognition, and image analysis procedures. The output of the algorithm consists of regions of perfusion abnormalities, an anatomy-atlas description of brain tissues, measures of perfusion parameters, and a prognosis for infarcted tissues. This information is superimposed onto volumetric computed tomography data and displayed to radiologists. Our rendering algorithm enables rendering of large volumes on off-the-shelf hardware. This portability is very important because our framework can be run without expensive dedicated hardware. Other important factors are the theoretically unlimited size of the rendered volume and the possibility of trading image quality for rendering speed. Such high-quality visualizations may be further used for intelligent identification of brain perfusion abnormalities and computer-aided diagnosis of selected types of pathologies.
NASA Technical Reports Server (NTRS)
Sadunas, J. A.; French, E. P.; Sexton, H.
1973-01-01
A 1/25-scale model S-2 stage base region thermal environment test is presented. Analytical results are included which reflect the effects of engine operating conditions, model scale, and turbo-pump exhaust gas injection on the base region thermal environment. Comparisons are made between full-scale flight data, model test data, and analytical results. The report is prepared in two volumes. The description of the analytical predictions and the comparisons with flight data are presented, and a tabulation of the test data is provided.
Measurement of limb volume: laser scanning versus volume displacement.
McKinnon, John Gregory; Wong, Vanessa; Temple, Walley J; Galbraith, Callum; Ferry, Paul; Clynch, George S; Clynch, Colin
2007-10-01
Determining the prevalence and treatment success of surgical lymphedema requires accurate and reproducible measurement. A new method of measurement of limb volume is described. A series of inanimate objects of known and unknown volume was measured using digital laser scanning and water displacement. A similar comparison was made with 10 human volunteers. Digital scanning was evaluated by comparison to the established method of water displacement, then to itself to determine reproducibility of measurement. (1) Objects of known volume: Laser scanning accurately measured the calculated volume but water displacement became less accurate as the size of the object increased. (2) Objects of unknown volume: As average volume increased, there was an increasing bias of underestimation of volume by the water displacement method. The coefficient of reproducibility of water displacement was 83.44 ml. In contrast, the reproducibility of the digital scanning method was 19.0 ml. (3) Human data: The mean difference between water displacement volume and laser scanning volume was 151.7 ml (SD +/- 189.5). The coefficient of reproducibility of water displacement was 450.8 ml whereas for laser scanning it was 174 ml. Laser scanning is an innovative method of measuring tissue volume that combines precision and reproducibility and may have clinical utility for measuring lymphedema. 2007 Wiley-Liss, Inc
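The two statistics reported above, the bias (mean difference) and the coefficient of reproducibility, can be computed directly from paired repeat measurements. A minimal sketch with hypothetical limb-volume data, assuming the common Bland-Altman convention that the coefficient of reproducibility is 1.96 times the standard deviation of the paired differences (the abstract does not state which convention the authors used):

```python
import numpy as np

def reproducibility_coefficient(first, second):
    """Coefficient of reproducibility for paired repeat measurements,
    taken here as 1.96 * SD of the within-pair differences; ~95% of
    repeat differences are expected to fall within this bound."""
    diffs = np.asarray(first, dtype=float) - np.asarray(second, dtype=float)
    return 1.96 * diffs.std(ddof=1)

# Hypothetical repeat limb-volume measurements (ml) for five objects
scan1 = np.array([1020.0, 1500.0, 980.0, 2100.0, 1750.0])
scan2 = np.array([1030.0, 1495.0, 990.0, 2090.0, 1760.0])

cr = reproducibility_coefficient(scan1, scan2)   # spread of repeat differences
bias = float(np.mean(scan1 - scan2))             # systematic offset
```

Applied to the study's repeat scans, `cr` would play the role of the 19.0 ml (laser scanning) and 83.44 ml (water displacement) figures, and `bias` the role of the 151.7 ml mean difference between methods.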
Building Background Knowledge through Reading: Rethinking Text Sets
ERIC Educational Resources Information Center
Lupo, Sarah M.; Strong, John Z.; Lewis, William; Walpole, Sharon; McKenna, Michael C.
2018-01-01
To increase reading volume and help students access challenging texts, the authors propose a four-dimensional framework for text sets. The quad text set framework is designed around a target text: a challenging content area text, such as a canonical literary work, research article, or historical primary source document. The three remaining…
ERIC Educational Resources Information Center
Sandieson, Robert W.; Kirkpatrick, Lori C.; Sandieson, Rachel M.; Zimmerman, Walter
2010-01-01
Digital technologies enable the storage of vast amounts of information, accessible with remarkable ease. However, along with this facility comes the challenge to find pertinent information from the volumes of nonrelevant information. The present article describes the pearl-harvesting methodological framework for information retrieval. Pearl…
Development of a Framework for Teaching Mathematics in Depth
ERIC Educational Resources Information Center
LaFramenta, Joanne Jensen
2011-01-01
This study illuminates the practice of teaching mathematics in depth by developing a framework to serve practicing teachers and those who educate teachers. A thorough reading of the literature that began with all of the volumes in the decades since the publication of the Standards (1989) identified six elements that were profitable for effective…
ERIC Educational Resources Information Center
Wohlstetter, Priscilla; Mohrman, Susan Albers
This document presents findings of the Assessment of School-Based Management Study, which identified the conditions in schools that promote high performance through school-based management (SBM). The study's conceptual framework was based on Edward E. Lawler's (1986) model. The high-involvement framework posits that four resources must spread…
Reasoning and Knowledge Acquisition Framework for 5G Network Analytics
Sotelo Monge, Marco Antonio; Maestre Vidal, Jorge; García Villalba, Luis Javier
2017-10-21
Autonomic self-management is a key challenge for next-generation networks. This paper proposes an automated analysis framework to infer knowledge in 5G networks with the aim of understanding the network status and predicting potential situations that might disrupt network operability. The framework is based on the Endsley situational awareness model and integrates automated capabilities for metrics discovery, pattern recognition, prediction techniques, and rule-based reasoning to infer anomalous situations in the current operational context. Those situations should then be mitigated, either proactively or reactively, by a more complex decision-making process. The framework is driven by a use-case methodology in which the network administrator is able to customize the knowledge inference rules and operational parameters. The proposal has also been instantiated to prove its adaptability to a real use case. To this end, a reference network traffic dataset was used to identify suspicious patterns and to predict the behavior of the monitored data volume. The preliminary results suggest a good level of accuracy in the inference of anomalous traffic volumes based on a simple configuration. PMID:29065473
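As an illustration of the kind of traffic-volume anomaly inference described above, the sketch below flags samples that deviate sharply from a trailing-window baseline. This is a deliberately simplified stand-in, not the paper's actual Endsley-model pipeline, and all names and data are hypothetical:

```python
import statistics

def flag_anomalies(volumes, window=5, k=3.0):
    """Flag traffic-volume samples deviating more than k standard
    deviations from the trailing-window mean (a toy stand-in for the
    framework's pattern-recognition and prediction stages)."""
    flags = []
    for i, v in enumerate(volumes):
        if i < window:
            flags.append(False)  # not enough history yet
            continue
        hist = volumes[i - window:i]
        mu = statistics.fmean(hist)
        sigma = statistics.stdev(hist)
        flags.append(sigma > 0 and abs(v - mu) > k * sigma)
    return flags

# Hypothetical per-minute monitored data volumes (MB) with one spike
traffic = [100, 102, 98, 101, 99, 100, 103, 400, 101, 100]
alerts = flag_anomalies(traffic)  # only the 400 MB spike is flagged
```

In the paper's terms, a flagged sample would feed the rule-based reasoning stage, which decides whether the situation warrants proactive or reactive mitigation.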
Craddock, William H.; Buursink, Marc L.; Covault, Jacob A.; Brennan, Sean T.; Doolan, Colin A.; Drake II, Ronald M.; Merrill, Matthew D.; Roberts-Ashby, Tina L.; Slucher, Ernie R.; Warwick, Peter D.; Blondes, Madalyn S.; Freeman, P.A.; Cahan, Steven N.; DeVera, Christina A.; Lohr, Celeste D.; Warwick, Peter D.; Corum, Margo D.
2014-01-01
For each SAU in both of the basins, we discuss the areal distribution of suitable CO2 sequestration reservoir rock. We also characterize the overlying sealing unit and describe the geologic characteristics that influence the potential CO2 storage volume and reservoir performance. These characteristics include reservoir depth, gross thickness, net thickness, porosity, permeability, and groundwater salinity. Case-by-case strategies for estimating the pore volume existing within structurally and (or) stratigraphically closed traps are presented. Although assessment results are not contained in this report, the geologic information included herein was employed to calculate the potential storage volume in the various SAUs. Lastly, in this report, we present the rationale for not conducting assessment work in fifteen sedimentary basins distributed across the Alaskan interior and within Alaskan State waters.
Application of Control Volume Analysis to Cerebrospinal Fluid Dynamics
NASA Astrophysics Data System (ADS)
Wei, Timothy; Cohen, Benjamin; Anor, Tomer; Madsen, Joseph
2011-11-01
Hydrocephalus is among the most common birth defects and currently can be neither prevented nor cured. Afflicted individuals face serious issues, which at present are too complicated and not well enough understood to treat via systematic therapies. This talk outlines the framework and application of a control volume methodology to clinical Phase Contrast MRI data. Specifically, integral control volume analysis utilizes a fundamental fluid dynamics methodology to quantify intracranial dynamics within a precise, direct, and physically meaningful framework. A chronically shunted hydrocephalic patient in need of a revision procedure was used as an in vivo case study. Magnetic resonance velocity measurements within the patient's aqueduct were obtained in four biomedical states and analyzed using the methods presented here. Pressure force estimates were obtained, showing distinct differences in amplitude, phase, and waveform shape for different intracranial states within the same individual. Thoughts on the physiological and diagnostic research and development implications and opportunities will be presented.
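The integral control volume analysis referred to above is grounded in the Reynolds transport theorem. In standard fluid mechanics notation (not taken from the talk itself), the momentum balance over a control volume CV bounded by the control surface CS reads:

```latex
\sum \vec{F} \;=\; \frac{\partial}{\partial t}\int_{CV} \rho\,\vec{u}\, dV
\;+\; \oint_{CS} \rho\,\vec{u}\,(\vec{u}\cdot\hat{n})\, dA
```

Velocity measurements over the control surface, such as the aqueductal MRI velocities, constrain the flux term, which is one route to the pressure force estimates mentioned above.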
Mirus, Benjamin B.; Halford, Keith J.; Sweetkind, Donald; Fenelon, Joseph M.
2016-02-18
The suitability of geologic frameworks for extrapolating hydraulic conductivity (K) to length scales commensurate with hydraulic data is difficult to assess. A novel method is presented for evaluating assumed relations between K and geologic interpretations for regional-scale groundwater modeling. The approach relies on simultaneous interpretation of multiple aquifer tests using alternative geologic frameworks of variable complexity, where each framework is incorporated as prior information that assumes homogeneous K within each model unit. This approach is tested at Pahute Mesa within the Nevada National Security Site (USA), where observed drawdowns from eight aquifer tests in complex, highly faulted volcanic rocks provide the necessary hydraulic constraints. The investigated volume encompasses 40 mi3 (167 km3) where drawdowns traversed major fault structures and were detected more than 2 mi (3.2 km) from pumping wells. Complexity of the five frameworks assessed ranges from an undifferentiated mass of rock with a single unit to 14 distinct geologic units. Results show that only four geologic units can be justified as hydraulically unique for this location. The approach qualitatively evaluates the consistency of hydraulic property estimates within extents of investigation and effects of geologic frameworks on extrapolation. Distributions of transmissivity are similar within the investigated extents irrespective of the geologic framework. In contrast, the extrapolation of hydraulic properties beyond the volume investigated with interfering aquifer tests is strongly affected by the complexity of a given framework. As a result, testing at Pahute Mesa illustrates how this method can be employed to determine the appropriate level of geologic complexity for large-scale groundwater modeling.
Atomic-Scale Lightning Rod Effect in Plasmonic Picocavities: A Classical View to a Quantum Effect.
Urbieta, Mattin; Barbry, Marc; Zhang, Yao; Koval, Peter; Sánchez-Portal, Daniel; Zabala, Nerea; Aizpurua, Javier
2018-01-23
Plasmonic gaps are known to produce nanoscale localization and enhancement of optical fields, providing small effective mode volumes of about a few hundred nm³. Atomistic quantum calculations based on time-dependent density functional theory reveal the effect of subnanometric localization of electromagnetic fields due to the presence of atomic-scale features at the interfaces of plasmonic gaps. Using a classical model, we explain this as a nonresonant lightning rod effect at the atomic scale that produces an extra enhancement over that of the plasmonic background. The near-field distribution of atomic-scale hot spots around atomic features is robust against dynamical screening and spill-out effects and follows the potential landscape determined by the electron density around the atomic sites. A detailed comparison of the field distribution around atomic hot spots from full quantum atomistic calculations and from the local classical approach considering the geometrical profile of the atoms' electronic density validates the use of a classical framework to determine the effective mode volume in these extreme subnanometric optical cavities. This finding is of practical importance for the community of surface-enhanced molecular spectroscopy and quantum nanophotonics, as it provides an adequate description of the local electromagnetic fields around atomic-scale features with use of simplified classical methods.
Measuring service line competitive position. A systematic methodology for hospitals.
Studnicki, J
1991-01-01
To mount a broad effort aimed at improving their competitive position for some service or group of services, hospitals have begun to pursue product line management techniques. A few hospitals have even reorganized completely under the product line framework. The benefits include focusing accountability for operations and results, facilitating coordination between departments and functions, stimulating market segmentation, and promoting rigorous examination of new and existing programs. As part of its strategic planning process, a suburban Baltimore hospital developed a product line management methodology with six basic steps: (1) define the service lines (which they did by grouping all existing diagnosis-related groups into 35 service lines), (2) determine the contribution of each service line to total inpatient volume, (3) determine trends in service line volumes (by comparing data over time), (4) derive a useful comparison group (competing hospitals or groups of hospitals with comparable size, scope of services, payer mix, and financial status), (5) review multiple time frames, and (6) summarize the long- and short-term performance of the hospital's service lines to focus further analysis. This type of systematic and disciplined analysis can become part of a permanent strategic intelligence program. When hospitals have such a program in place, their market research, planning, budgeting, and operations will be tied together in a true management decision support system.
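Step 2 of the methodology, determining each service line's contribution to total inpatient volume, is simple aggregation once diagnosis-related groups (DRGs) are mapped to service lines. A minimal sketch with hypothetical DRG codes and case counts (the hospital's actual 35-line grouping is not given in the abstract):

```python
from collections import defaultdict

# Hypothetical mapping of DRG codes to service lines, and annual case counts
drg_to_service_line = {
    "470": "Orthopedics",
    "871": "Medicine",
    "392": "Gastroenterology",
    "292": "Cardiology",
}
cases = [("470", 812), ("871", 640), ("392", 305), ("292", 541)]

def service_line_shares(cases, mapping):
    """Group DRG-level volumes into service lines and return each
    line's share of total inpatient volume."""
    totals = defaultdict(int)
    for drg, n in cases:
        totals[mapping[drg]] += n
    grand_total = sum(totals.values())
    return {line: n / grand_total for line, n in totals.items()}

shares = service_line_shares(cases, drg_to_service_line)
```

Repeating this calculation over several years (step 3) and over comparison hospitals (step 4) yields the trend and competitive-position data the methodology summarizes.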
Fluid-particle characteristics in fully-developed cluster-induced turbulence
NASA Astrophysics Data System (ADS)
Capecelatro, Jesse; Desjardins, Olivier; Fox, Rodney
2014-11-01
In this study, we present a theoretical framework for collisional fluid-particle turbulence. To identify the key mechanisms responsible for energy exchange between the two phases, an Eulerian-Lagrangian strategy is used to simulate fully-developed cluster-induced turbulence (CIT) under a range of Reynolds numbers, where fluctuations in particle concentration generate and sustain the carrier-phase turbulence. Using a novel filtering approach, a length-scale separation between the correlated particle velocity and uncorrelated granular temperature (GT) is achieved. This separation allows us to extract the instantaneous Eulerian volume fraction, velocity, and GT fields from the Lagrangian data. Direct comparisons can thus be made with the relevant terms that appear in the multiphase turbulence model. It is shown that the granular pressure is highly anisotropic, and thus additional transport equations (as opposed to a single equation for GT) are necessary in formulating a predictive multiphase turbulence model. In addition to reporting the relevant contributions to the Reynolds stresses of each phase, two-point statistics, integral length/timescales, averages conditioned on the local volume fraction, and PDFs of the key multiphase statistics are presented and discussed. The research reported in this paper is partially supported by the HPC equipment purchased through U.S. National Science Foundation MRI Grant Number CNS 1229081 and CRI Grant Number 1205413.
Ferreira-Pêgo, Cíntia; Nissensohn, Mariela; Kavouras, Stavros A; Babio, Nancy; Serra-Majem, Lluís; Martín Águila, Adys; Mauromoustakos, Andy; Álvarez Pérez, Jacqueline; Salas-Salvadó, Jordi
2016-07-30
We assess the repeatability and relative validity of a Spanish beverage intake questionnaire for assessing water intake from beverages. The present analysis was performed within the framework of the PREDIMED-PLUS trial. The study participants were adults (aged 55-75) with a BMI ≥27 and <40 kg/m² and at least three components of metabolic syndrome (MetS). A trained dietitian completed the questionnaire. Participants provided 24-h urine samples, and the volume and urine osmolality were recorded. The repeatability of the baseline measurement at 6 months and 1 year was examined by paired Student's t-test comparisons. A total of 160 participants were included in the analysis. The Bland-Altman analysis showed relatively good agreement between total daily fluid intake assessed using the fluid-specific questionnaire and urine osmolality and 24-h volume, with parameter estimates of -0.65 and 0.22, respectively (R² = 0.20; p < 0.001). In the repeatability test, no significant differences were found in either type of beverage or total daily fluid intake at the 6-month and 1-year assessments compared to baseline. The proposed fluid-specific assessment questionnaire designed to assess the consumption of water and other beverages in Spanish adults was found to be relatively valid, with good repeatability.
ERIC Educational Resources Information Center
Underhill, Robert G., Ed.
This document, presented in two volumes, reports on a psychology of mathematics education conference, the theme of which was "Theoretical and Conceptual Frameworks in Mathematics Education." The two volumes include 58 papers, descriptions of 4 poster and 2 video presentations, and reports of and reactions to 2 plenary sessions presented…
ERIC Educational Resources Information Center
Seymour, Daniel, Ed.; And Others
This publication provides research-based discussion in 20 chapters of possible extension of the Malcolm Baldrige National Quality Award to honor high performing colleges. Chapters are organized into two volumes, the first exploring a broad range of issues from a scholarly point of view and the second emphasizing the practical application of a…
Bruce, Pamela J; Helmer, Stephen D; Osland, Jacqueline S; Ammar, Alex D
2010-01-01
To determine the effect of the 80-hour work week restrictions on general surgery resident operative volume in a large, community-based, university-affiliated, general surgery residency program. We performed a retrospective review of Accreditation Council for Graduate Medical Education (ACGME) operative logs of general surgery residents graduating from a single residency. The control group consisted of the residents graduating in the 3 years prior to the work-hour restriction implementation (2001, 2002, and 2003). Our comparison group consisted of those residents graduating in the first 2 classes whose entire residency was conducted after the implementation of the 80-hour work week (2008 and 2009). Comparisons were made between the control and the comparison groups in the 19 ACGME defined categories, total number of major cases, total number of chief cases, and total number of teaching assist cases. Operative volumes in 13 categories (skin/soft tissue/breast, alimentary tract, abdominal, liver, pancreas, vascular, endocrine, pediatrics, endoscopy, laparoscopic-complex, total chief cases, total major cases, and teaching cases) were not significantly affected by the implementation of the 80-hour work week. One of the 19 categories (laparoscopic-basic) showed a significant increase in operative volume (p < 0.0001). In 4 of the 19 categories (head/neck, operative-trauma, thoracic, and plastics), operative volume was significantly decreased in the post-80-hour work week era (p < 0.05). Nonoperative trauma could not be assessed, as the category did not exist before the work-hour restrictions. Resident operative volume at our institution's general surgery residency program largely has been unaffected by implementation of the 80-hour work week. Residencies in general surgery can be structured in a manner to allow for compliance with duty-hour regulations while maintaining the required operative volume outlined by the ACGME defined categories. 
Copyright © 2010 Association of Program Directors in Surgery. Published by Elsevier Inc. All rights reserved.
NASA Technical Reports Server (NTRS)
Callender, E. David; Steinbacher, Jody
1989-01-01
This is the third of five volumes on Information System Life-Cycle and Documentation Standards which present a well organized, easily used standard for providing technical information needed for developing information systems, components, and related processes. This volume states the Software Management and Assurance Program documentation standard for a product specification document and for data item descriptions. The framework can be applied to any NASA information system, software, hardware, operational procedures components, and related processes.
Kawata, Yasuo; Arimura, Hidetaka; Ikushima, Koujirou; Jin, Ze; Morita, Kento; Tokunaga, Chiaki; Yabu-Uchi, Hidetake; Shioyama, Yoshiyuki; Sasaki, Tomonari; Honda, Hiroshi; Sasaki, Masayuki
2017-10-01
The aim of this study was to investigate the impact of pixel-based machine learning (ML) techniques, i.e., the fuzzy c-means clustering method (FCM), artificial neural network (ANN), and support vector machine (SVM), on an automated framework for delineation of gross tumor volume (GTV) regions of lung cancer for stereotactic body radiation therapy. The morphological and metabolic features for GTV regions, which were determined based on the knowledge of radiation oncologists, were fed on a pixel-by-pixel basis into the respective FCM, ANN, and SVM ML techniques. The ML techniques were then incorporated into the automated delineation framework of GTVs, followed by an optimum contour selection (OCS) method, which we proposed in a previous study. The three ML-based frameworks were evaluated for 16 lung cancer cases (six solid, four ground-glass opacity (GGO), six part-solid GGO) with datasets of planning computed tomography (CT) and 18F-fluorodeoxyglucose (FDG) positron emission tomography (PET)/CT images using the three-dimensional Dice similarity coefficient (DSC). DSC denotes the degree of region similarity between the GTVs contoured by radiation oncologists and those estimated using the automated framework. The FCM-based framework achieved the highest DSC of 0.79±0.06, whereas the DSCs of the ANN-based and SVM-based frameworks were 0.76±0.14 and 0.73±0.14, respectively. The FCM-based framework provided the highest segmentation accuracy and precision without a learning process (lowest calculation cost). Therefore, the FCM-based framework can be useful for delineation of tumor regions in practical treatment planning. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
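The three-dimensional DSC used for evaluation above has the standard definition DSC = 2|A∩B| / (|A| + |B|). A minimal sketch on toy binary masks (the mask data are illustrative, not from the study):

```python
import numpy as np

def dice_similarity(mask_a, mask_b):
    """3D Dice similarity coefficient between two binary masks.
    DSC = 2|A ∩ B| / (|A| + |B|); 1.0 means perfect overlap."""
    a = np.asarray(mask_a, dtype=bool)
    b = np.asarray(mask_b, dtype=bool)
    denom = a.sum() + b.sum()
    if denom == 0:
        return 1.0  # both masks empty: define as perfect agreement
    return 2.0 * np.logical_and(a, b).sum() / denom

# Toy example: two overlapping 4x4x4 cubes in a 10^3 voxel volume
vol_a = np.zeros((10, 10, 10), dtype=bool)
vol_b = np.zeros((10, 10, 10), dtype=bool)
vol_a[2:6, 2:6, 2:6] = True   # 64 voxels (e.g., oncologist contour)
vol_b[3:7, 3:7, 3:7] = True   # 64 voxels, 27 in the intersection
dsc = dice_similarity(vol_a, vol_b)
```

In the study, `vol_a` and `vol_b` would be the oncologist-contoured and framework-estimated GTV masks, with mean DSC of 0.79 for the FCM-based framework.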
CLARA: CLAS12 Reconstruction and Analysis Framework
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gyurjyan, Vardan; Matta, Sebastian Mancilla; Oyarzun, Ricardo
2016-11-01
In this paper we present the SOA-based CLAS12 event Reconstruction and Analysis (CLARA) framework. The CLARA design focuses on two main traits: real-time data-stream processing, and a service-oriented architecture (SOA) in a flow-based programming (FBP) paradigm. The data-driven and data-centric architecture of CLARA provides an environment for developing agile, elastic, multilingual data processing applications. The CLARA framework presents solutions capable of processing large volumes of data interactively and substantially faster than batch systems.
Penetration with Long Rods: A Theoretical Framework and Comparison with Instrumented Impacts
1981-05-01
program to begin probing the details of the interaction process. The theoretical framework underlying such a program is explained in detail. The theory of...of the time sequence of events during penetration. Data from one series of experiments, reported in detail elsewhere, is presented and discussed within the theoretical framework.
ERIC Educational Resources Information Center
Sianos, Helen
2015-01-01
In 2013 the Ontario Ministry of Training, Colleges and Universities released Ontario's Differentiation Policy Framework for Postsecondary Education, for colleges and universities in the province. All 24 Ontario colleges responded to this Framework by presenting their Strategic Mandate Agreements (SMA). The Framework contrasts the original…
The Impact and Implementation of National Qualifications Frameworks: A Comparison of 16 Countries
ERIC Educational Resources Information Center
Allais, Stephanie M.
2011-01-01
This article provides some of the key findings of a comparative study commissioned by the International Labour Organization (ILO), which attempted to understand more about the impact and implementation of national qualifications frameworks (NQFs). Sixteen case studies were produced, on qualifications frameworks in Australia; Bangladesh; Botswana;…
Trochesset, Denise A; Serchuk, Richard B; Colosi, Dan C
2014-03-01
Identification of unknown individuals using dental comparison is well established in the forensic setting. The identification technique can be time- and resource-consuming if many individuals need to be identified at once. Medical CT (MDCT) for dental profiling has had limited success, mostly due to artifact from metal-containing dental restorations and implants. The authors describe a CBCT reformatting technique that creates images which closely approximate conventional dental images. Using an i-CAT Platinum CBCT unit and standard-issue i-CAT Vision software, the authors developed a protocol to reproducibly and reliably reformat CBCT volumes. The reformatted images are presented alongside conventional digital images from the same anatomic area for comparison. The authors conclude that images derived from CBCT volumes following this protocol are similar enough to conventional dental radiographs to allow for dental forensic comparison and identification, and that CBCT offers a superior option over MDCT for this purpose. © 2013 American Academy of Forensic Sciences.
ERIC Educational Resources Information Center
Egetenmeyer, Regina, Ed.
2016-01-01
This volume presents comparisons of adult education and lifelong learning with a focus on educational policies, professionalization in adult education, participation in adult learning and education, quality in adult education, and educational guidance and counselling. The essays are based on comparisons discussed at the international Winter School…
The social impacts of dams: A new framework for scholarly analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kirchherr, Julian, E-mail: julian.kirchherr@sant.ox.ac.uk; Charles, Katrina J., E-mail: katrina.charles@ouce.ox.ac.uk
No commonly used framework exists in the scholarly study of the social impacts of dams. This hinders comparisons of analyses and thus the accumulation of knowledge. The aim of this paper is to unify scholarly understanding of dams' social impacts via the analysis and aggregation of the various frameworks currently used in the scholarly literature. For this purpose, we have systematically analyzed and aggregated 27 frameworks employed by academics analyzing dams' social impacts (found in a set of 217 articles). A key finding of the analysis is that currently used frameworks are often not specific to dams and thus omit key impacts associated with them. The result of our analysis and aggregation is a new framework for scholarly analysis (which we call the 'matrix framework') specifically for dams' social impacts, with space, time and value as its key dimensions, and infrastructure, community and livelihood as its key components. Building on the scholarly understanding of this topic enables us to conceptualize the inherently complex and multidimensional issues of dams' social impacts in a holistic manner. If commonly employed in academia (and possibly in practice), this framework would enable more transparent assessment and comparison of projects.
A Study of Child Variance, Volume 2: Interventions; Conceptual Project in Emotional Disturbance.
ERIC Educational Resources Information Center
Rhodes, William C.; Tracy, Michael L.
Presented in the second volume of a series emanating from a conceptual project on emotional disturbance are six papers on general aspects of interventions as well as biophysical, behavioral, psychodynamic, environmental, and counter theoretical interventions. In an "Overview of Interventions", W. Rhodes discusses a framework for viewing…
ERIC Educational Resources Information Center
Watts, Richard E., Ed.
This volume presents a collection of practical strategies for enhancing communication between couples and families. Experts in the field outline proven techniques from cognitive and constructivist/constructionist frameworks, structural and strategic orientations, and couple/family play therapy. Chapters are: (1) "Letter for a Change: Using Letter…
ERIC Educational Resources Information Center
Pinnell, Charles; Wacholder, Michael
The fourth of a five-volume series concerned with higher educational planning provides techniques for the estimation of an institution's facility requirements. The facilities are discussed within the framework of two broad categories--(1) academic program facilities, and (2) residential housing facilities. The academic program facilities provide…
Environmental Design Research. Volume One: Selected Papers. Community Development Series.
ERIC Educational Resources Information Center
Preiser, Wolfgang F. E., Ed.
The items contained in this volume are summaries and critiques of 43 research papers grouped within a framework of nine general topics which represents an attempt to delineate the basic concepts and structure of environmental design research. The papers are grouped under the following headings: (1) Theoretical issues in man-environment relations,…
DAsHER CD: Developing a Data-Oriented Human-Centric Enterprise Architecture for EarthCube
NASA Astrophysics Data System (ADS)
Yang, C. P.; Yu, M.; Sun, M.; Qin, H.; Robinson, E.
2015-12-01
One of the biggest challenges facing Earth scientists is discovering, accessing, and sharing resources in the desired fashion. EarthCube aims to enable geoscientists to address this challenge by fostering community-governed efforts that develop a common cyberinfrastructure for collecting, accessing, analyzing, sharing, and visualizing all forms of data and related resources, through the use of advanced technological and computational capabilities. Here we design an Enterprise Architecture (EA) for EarthCube to facilitate knowledge management, communication, and human collaboration in pursuit of unprecedented data sharing across the geosciences. The design will provide EarthCube a reference framework for developing geoscience cyberinfrastructure in collaboration with different stakeholders, and for identifying topics of high interest to the community. The development of this EarthCube EA leverages popular frameworks such as Zachman, Gartner, DoDAF, and FEAF. The science driver of the design is the needs of the EarthCube community, including user requirements analyzed from EarthCube End User Workshop reports and EarthCube working group roadmaps, and feedback from scientists gathered at organized workshops. The final product of this Enterprise Architecture is a four-volume reference document: 1) Volume one comprises an executive summary of the EarthCube architecture, serving as an overview in the initial phases of architecture development; 2) Volume two, the major body of the design product, outlines all the architectural design components and viewpoints; 3) Volume three provides a taxonomy of the EarthCube enterprise augmented with semantic relations; 4) Volume four describes an example of applying the architecture to a geoscience project.
CONRAD—A software framework for cone-beam imaging in radiology
DOE Office of Scientific and Technical Information (OSTI.GOV)
Maier, Andreas; Choi, Jang-Hwan; Riess, Christian
2013-11-15
Purpose: In the community of x-ray imaging, there is a multitude of tools and applications used in scientific practice. Many of these tools are proprietary and can only be used within a certain lab. Often the same algorithm is implemented multiple times by different groups in order to enable comparison. In an effort to tackle this problem, the authors created CONRAD, a software framework that provides many of the tools required to simulate basic processes in x-ray imaging and perform image reconstruction with consideration of nonlinear physical effects. Methods: CONRAD is a Java-based state-of-the-art software platform with extensive documentation. It is based on platform-independent technologies. Special libraries offer access to hardware acceleration such as OpenCL. There is an easy-to-use interface for parallel processing. The software package includes different simulation tools that are able to generate up to 4D projection and volume data and respective vector motion fields. Well-known reconstruction algorithms such as FBP, DBP, and ART are included. All algorithms in the package are referenced to a scientific source. Results: A total of 13 different phantoms and 30 processing steps have already been integrated into the platform at the time of writing. The platform comprises 74,000 nonblank lines of code, of which 19% are used for documentation. The software package is available for download at http://conrad.stanford.edu. To demonstrate the use of the package, the authors reconstructed images from two different scanners, a table-top system and a clinical C-arm system. Runtimes were evaluated using the RabbitCT platform and demonstrate state-of-the-art performance, with 2.5 s for the 256 problem size and 12.4 s for the 512 problem size. Conclusions: As a common software framework, CONRAD enables the medical physics community to share algorithms and develop new ideas. In particular, this offers new opportunities for scientific collaboration and quantitative performance comparison between the methods of different groups.
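CONRAD itself is a Java framework, but the core of the FBP algorithm it includes can be illustrated independently. The following is a minimal NumPy sketch of parallel-beam filtered backprojection (not CONRAD's actual implementation; the disk phantom, discrete ramp filter, and grid are simplified assumptions):

```python
import numpy as np

def ramp_filter(proj):
    """Apply a ramp (Ram-Lak) filter to one projection via the FFT."""
    freqs = np.fft.fftfreq(proj.shape[-1])
    return np.real(np.fft.ifft(np.fft.fft(proj) * np.abs(freqs)))

def fbp(sinogram, angles, size):
    """Reconstruct a size x size image from parallel-beam projections."""
    recon = np.zeros((size, size))
    xs = np.arange(size) - size / 2
    X, Y = np.meshgrid(xs, xs)
    for proj, theta in zip(sinogram, angles):
        filtered = ramp_filter(proj)
        # Detector coordinate of each pixel for this viewing angle.
        s = X * np.cos(theta) + Y * np.sin(theta) + size / 2
        recon += np.interp(s.ravel(), np.arange(proj.size), filtered,
                           left=0.0, right=0.0).reshape(size, size)
    return recon * np.pi / len(angles)

# Analytic sinogram of a centered disk of radius R: p(s) = 2*sqrt(R^2 - s^2),
# identical for every view, so no numerical forward projection is needed.
size, R = 64, 20
angles = np.linspace(0, np.pi, 90, endpoint=False)
s = np.arange(size) - size / 2
proj = 2 * np.sqrt(np.clip(R ** 2 - s ** 2, 0, None))
sino = np.tile(proj, (len(angles), 1))
recon = fbp(sino, angles, size)
```

The ramp filter compensates the 1/|ω| blurring of plain backprojection; a real cone-beam pipeline such as CONRAD's additionally needs cosine and redundancy weighting (e.g. FDK).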
Kim, Seung Hyup
2008-01-01
Objective To evaluate the correlations between prostate volumes estimated by transabdominal, transrectal, and three-dimensional US, and the factors affecting the differences. Materials and Methods The prostate volumes of 94 consecutive patients were measured by both transabdominal and transrectal US. Next, the prostate volumes of 58 other patients were measured by both transrectal and three-dimensional US. We evaluated the degree of correlation and mean difference in each comparison. We also analyzed possible factors affecting the differences, such as the examiners' experience in transrectal US, bladder volume, and prostate volume. Results In the comparison of the transabdominal and transrectal US methods, the mean difference was 8.4 ± 10.5 mL and the correlation coefficient (r) was 0.775 (p < 0.01). The examiner experienced in transrectal US had the highest correlation (r = 0.967) and a significantly smaller difference (5.4 ± 3.9 mL) than the other examiners (the beginner and the trained examiner; p < 0.05). Prostate volume measured by transrectal US showed a weak correlation with the difference (r = 0.360, p < 0.05). Bladder volume did not show a significant correlation with the difference (r = -0.043, p > 0.05). The comparison between the transrectal and three-dimensional US methods revealed a mean difference of 3.7 ± 3.4 mL and a correlation coefficient of 0.924 for the experienced examiner. Furthermore, no significant difference existed between examiners (p > 0.05). Prostate volume measured by transrectal US showed a positive correlation with the difference for the beginner only (r = 0.405, p < 0.05). Conclusion In prostate volume estimation by US, experience in transrectal US is important for correlation with transabdominal US, but not with three-dimensional US. Also, less experienced examiners' assessment of prostate volume can be affected by the prostate volume itself. PMID:18385560
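The agreement statistics reported above (mean difference ± SD and Pearson's r) are straightforward to compute. A minimal sketch with synthetic volumes (the bias and noise levels below are illustrative assumptions, not the study's data):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic prostate volumes (mL): "transrectal" as the reference method,
# a second method with an additive bias and measurement noise, loosely
# mimicking the study design (94 patients, ~8 mL bias).
transrectal = rng.uniform(20, 60, 94)
transabdominal = transrectal + 8.4 + rng.normal(0, 4, 94)

def agreement(a, b):
    """Mean difference +/- SD and Pearson r, as reported in the abstract."""
    diff = a - b
    r = np.corrcoef(a, b)[0, 1]
    return diff.mean(), diff.std(ddof=1), r

bias, sd, r = agreement(transabdominal, transrectal)
```

In practice, a Bland-Altman plot of `diff` against the pairwise means is the usual companion to these two numbers.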
Functional Validation and Comparison Framework for EIT Lung Imaging
Meybohm, Patrick; Weiler, Norbert; Frerichs, Inéz; Adler, Andy
2014-01-01
Introduction Electrical impedance tomography (EIT) is an emerging clinical tool for monitoring ventilation distribution in mechanically ventilated patients, for which many image reconstruction algorithms have been suggested. We propose an experimental framework to assess such algorithms with respect to their ability to correctly represent well-defined physiological changes. We defined a set of clinically relevant ventilation conditions and induced them experimentally in 8 pigs by controlling three ventilator settings (tidal volume, positive end-expiratory pressure, and the fraction of inspired oxygen). In this way, large and discrete shifts in global and regional lung air content were elicited. Methods We used the framework to compare twelve 2D EIT reconstruction algorithms, including backprojection (the original and still most frequently used algorithm), GREIT (a more recent consensus algorithm for lung imaging), truncated singular value decomposition (TSVD), several variants of the one-step Gauss-Newton approach, and two iterative algorithms. We considered the effects of using a 3D finite element model, assuming non-uniform background conductivity, noise modeling, reconstructing for electrode movement, total variation (TV) reconstruction, robust error norms, smoothing priors, and using difference vs. normalized difference data. Results and Conclusions Our results indicate that, while variation in the appearance of images reconstructed from the same data is not negligible, clinically relevant parameters do not vary considerably among the advanced algorithms. Among the analysed algorithms, several advanced ones perform well, while some others are significantly worse. Given its vintage and ad hoc formulation, backprojection works surprisingly well, supporting the validity of previous studies in lung EIT. PMID:25110887
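The one-step Gauss-Newton and TSVD reconstructions compared above share a common linear-algebra core. A toy sketch on a synthetic ill-conditioned linear inverse problem (a real EIT reconstruction would build the Jacobian J from a finite element forward model; the matrix, target, and regularization values here are illustrative assumptions):

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy ill-conditioned linearized forward model: d = J x + noise, with
# rapidly decaying column scales to mimic the ill-posedness of EIT.
J = rng.normal(size=(40, 60)) @ np.diag(1.0 / np.arange(1, 61))
x_true = np.zeros(60)
x_true[10:15] = 1.0          # a localized conductivity change
d = J @ x_true + rng.normal(0, 1e-3, 40)

def gauss_newton_one_step(J, d, lam):
    """One-step Gauss-Newton with Tikhonov prior: x = (J'J + lam I)^-1 J'd."""
    n = J.shape[1]
    return np.linalg.solve(J.T @ J + lam * np.eye(n), J.T @ d)

def tsvd(J, d, k):
    """Truncated SVD: keep only the k largest singular components."""
    U, s, Vt = np.linalg.svd(J, full_matrices=False)
    return Vt[:k].T @ ((U[:, :k].T @ d) / s[:k])

x_gn = gauss_newton_one_step(J, d, lam=1e-4)
x_tsvd = tsvd(J, d, k=20)
```

Both methods stabilize the inversion by suppressing small singular values: Tikhonov does so smoothly through the penalty `lam`, TSVD by hard truncation at rank `k`.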
NASA Astrophysics Data System (ADS)
Gloger, Oliver; Tönnies, Klaus; Mensel, Birger; Völzke, Henry
2015-11-01
In epidemiological studies as well as in clinical practice, the amount of medical image data produced has increased strongly in the last decade. In this context, organ segmentation in MR volume data has gained increasing attention for medical applications. Especially in large-scale population-based studies, organ volumetry is highly relevant and requires exact organ segmentation. Since manual segmentation is time-consuming and prone to reader variability, large-scale studies need automated methods to perform organ segmentation. Fully automatic organ segmentation in native MR image data has proven to be a very challenging task. Imaging artifacts as well as inter- and intrasubject MR-intensity differences complicate the application of supervised learning strategies. Thus, we propose a modular framework built on a two-step probabilistic approach that generates subject-specific probability maps for renal parenchyma tissue, which are subsequently refined by several extended segmentation strategies. We present a three-class support vector machine recognition system that incorporates Fourier descriptors as shape features to recognize and segment characteristic parenchyma parts. Probabilistic methods use the segmented characteristic parenchyma parts to generate high-quality subject-specific parenchyma probability maps. Several refinement strategies, including a final shape-based 3D level set segmentation technique, are used in subsequent processing modules to segment the renal parenchyma. Furthermore, our framework recognizes and excludes renal cysts from the parenchymal volume, which is important for analyzing renal function. Volume errors and Dice coefficients show that our framework outperforms existing approaches.
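The Fourier descriptors used as shape features in the recognition step can be sketched compactly. The contour below is synthetic, and the normalization choices (dropping the DC term, dividing by the first harmonic's magnitude) are one common convention, not necessarily the authors' exact feature set:

```python
import numpy as np

def fourier_descriptors(contour, n_coeffs=10):
    """Translation-, scale-, and rotation-invariant Fourier descriptors of a
    closed 2D contour given as an (N, 2) array of boundary points."""
    z = contour[:, 0] + 1j * contour[:, 1]   # boundary as a complex signal
    F = np.fft.fft(z)
    F[0] = 0.0                               # drop DC -> translation invariance
    mags = np.abs(F)                         # magnitudes -> rotation invariance
    return mags[1:n_coeffs + 1] / mags[1]    # normalize -> scale invariance

# An ellipse standing in for a kidney-like contour (synthetic).
t = np.linspace(0, 2 * np.pi, 128, endpoint=False)
contour = np.stack([3 * np.cos(t), 2 * np.sin(t)], axis=1)
fd = fourier_descriptors(contour)
fd_shifted = fourier_descriptors(contour + [7.0, -4.0])  # translated copy
fd_scaled = fourier_descriptors(2.5 * contour)           # scaled copy
```

Because translation changes only the DC coefficient and uniform scaling multiplies every coefficient by the same factor, the descriptors of the shifted and scaled copies match the original exactly.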
High efficient optical remote sensing images acquisition for nano-satellite-framework
NASA Astrophysics Data System (ADS)
Li, Feng; Xin, Lei; Liu, Yang; Fu, Jie; Liu, Yuhong; Guo, Yi
2017-09-01
Implementing optical Earth-observation missions on a Nano-satellite (NanoSat) is more difficult and challenging than on conventional satellites because of limits on volume, weight, and power consumption. In general, an image compression unit is a necessary onboard module to save data transmission bandwidth and disk space by removing redundant information from captured images. In this paper, a new image acquisition framework is proposed for NanoSat-based optical Earth-observation applications. The entire image acquisition and compression process is integrated into the photodetector array chip, so that the chip's output data are already compressed. An extra image compression unit is therefore no longer needed, largely saving the power, volume, and weight that common onboard compression units consume. The advantages of the proposed framework are: image acquisition and compression are combined into a single step; it can easily be built in a CMOS architecture; a quick view can be provided without reconstruction; and, at a given compression ratio, the reconstructed image quality is much better than that of CS-based methods. The framework holds promise for wide use in the future.
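The decoding side of such a compressive acquisition scheme can be sketched with a standard sparse-recovery algorithm. Below is a minimal orthogonal matching pursuit (OMP) example, with a random sensing matrix standing in for the on-chip measurement process; the sizes and the use of OMP are illustrative assumptions, since the abstract does not specify a reconstruction method:

```python
import numpy as np

rng = np.random.default_rng(2)

def omp(A, y, k):
    """Orthogonal matching pursuit: greedy k-sparse solve of y ~ A x."""
    n = A.shape[1]
    support, residual = [], y.copy()
    for _ in range(k):
        j = int(np.argmax(np.abs(A.T @ residual)))  # most correlated atom
        if j not in support:
            support.append(j)
        # Least-squares fit on the current support, then update the residual.
        coeffs, *_ = np.linalg.lstsq(A[:, support], y, rcond=None)
        residual = y - A[:, support] @ coeffs
    x = np.zeros(n)
    x[support] = coeffs
    return x

n, m, k = 256, 100, 5                 # signal length, measurements, sparsity
x_true = np.zeros(n)
idx = rng.choice(n, k, replace=False)
x_true[idx] = rng.uniform(1, 2, k) * rng.choice([-1.0, 1.0], k)
A = rng.normal(size=(m, n)) / np.sqrt(m)  # random "on-chip" sensing matrix
y = A @ x_true                            # compressed measurements
x_hat = omp(A, y, k)
```

With m well above the sparsity level, the greedy recovery typically locates the true support; on-chip schemes trade this decoding cost to the ground segment in exchange for a much smaller downlink volume.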
Cache-Cache Comparison for Supporting Meaningful Learning
ERIC Educational Resources Information Center
Wang, Jingyun; Fujino, Seiji
2015-01-01
The paper presents a meaningful discovery learning environment called "cache-cache comparison" for a personalized learning support system. The processing of seeking hidden relations or concepts in "cache-cache comparison" is intended to encourage learners to actively locate new knowledge in their knowledge framework and check…
ERIC Educational Resources Information Center
Commission of the European Communities, Brussels (Belgium).
This report, the second volume in a three-volume set, summarizes the results of a study performed by the DELTA (Developing European Learning through Technological Advance) unit in parallel with the projects underway in the research and development Exploratory Action. The report identifies the key issues, associated requirements and options, and…
As Teachers Tell It: Implementing All Aspects of the Industry. The Case Studies. [Volume One].
ERIC Educational Resources Information Center
Andrew, Erika Nielsen, Ed.
The All Aspects of the Industry (AAI) approach, one of a number of educational reforms designed to reduce the gap between vocational and academic education, provides a framework for schools to redesign their programs around broadly conceived, interdisciplinary, industry-focused programs. With an AAI framework, schools can prepare students for a…
ERIC Educational Resources Information Center
Becker, Robert
1992-01-01
Presents a framework used at Western State College to teach an interdisciplinary general education course. The framework helps students organize a large volume of material about Contemporary World Cultures according to a taxonomy of human experience, including artistic/literary expression; thought and belief; relationships/associations with…
ERIC Educational Resources Information Center
Lafferty, Meghan
2009-01-01
This article examines what is desirable in online reference books in science and technology and outlines a framework for evaluating their interfaces. The framework considers factors unique to these subject areas like chemical structures and numerical data. Criteria in three categories, navigability, searchability, and results, were applied to five…
Review article: A systematic review of emergency department incident classification frameworks.
Murray, Matthew; McCarthy, Sally
2018-06-01
As in any part of the hospital system, safety incidents can occur in the ED. These incidents arguably have a distinct character, as the ED involves unscheduled flows of urgent patients who require disparate services. To aid understanding of safety issues and support risk management of the ED, a comparison of published ED specific incident classification frameworks was performed. A review of emergency medicine, health management and general medical publications, using Ovid SP to interrogate Medline (1976-2016) was undertaken to identify any type of taxonomy or classification-like framework for ED related incidents. These frameworks were then analysed and compared. The review identified 17 publications containing an incident classification framework. Comparison of factors and themes making up the classification constituent elements revealed some commonality, but no overall consistency, nor evolution towards an ideal framework. Inconsistency arises from differences in the evidential basis and design methodology of classifications, with design itself being an inherently subjective process. It was not possible to identify an 'ideal' incident classification framework for ED risk management, and there is significant variation in the selection of categories used by frameworks. The variation in classification could risk an unbalanced emphasis in findings through application of a particular framework. Design of an ED specific, ideal incident classification framework should be informed by a much wider range of theories of how organisations and systems work, in addition to clinical and human factors. © 2017 Australasian College for Emergency Medicine and Australasian Society for Emergency Medicine.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gloudemans, J.R.
1991-08-01
The multiloop integral system test (MIST) was part of a multiphase program started in 1983 to address small-break loss-of-coolant accidents (SBLOCAs) specific to Babcock & Wilcox-designed plants. MIST was sponsored by the US Nuclear Regulatory Commission, the Babcock & Wilcox Owners Group, the Electric Power Research Institute, and Babcock & Wilcox. The unique features of the Babcock & Wilcox design, specifically the hot-leg U-bends and steam generators, prevented the use of existing integral system data or existing integral system facilities to address the thermal-hydraulic SBLOCA questions. MIST was specifically designed and constructed for this program, and an existing facility, the once-through integral system (OTIS), was also used. Data from MIST and OTIS are used to benchmark the adequacy of system codes, such as RELAP5 and TRAC, for predicting abnormal plant transients. The MIST program is reported in eleven volumes: Volumes 2 through 8 pertain to groups of Phase 3 tests by type, Volume 9 presents inter-group comparisons, Volume 10 provides comparisons between the RELAP5 MOD2 calculations and MIST observations, and Volume 11 (with addendum) presents the later Phase 4 tests. This is Volume 1 of the MIST final report, a summary of the entire MIST program. Major topics include: test advisory group (TAG) issues; facility scaling and design; test matrix; observations; comparisons of RELAP5 calculations to MIST observations; and MIST versus the TAG issues. 11 refs., 29 figs., 9 tabs.
The New Curriculum Standards for Astronomy in the United States
NASA Astrophysics Data System (ADS)
Schleigh, Sharon P.; Slater, Stephanie J.; Slater, Timothy F.; Stork, Debra J.
2015-12-01
There is widespread interest in constraining the wide range and vast domain of possible astronomy topics into a manageable teaching framework. Although there is no mandated national curriculum in the United States, an analysis of the three recent national efforts to create an age-appropriate sequence of astronomy concepts to be taught in primary and secondary schools reveals a considerable lack of consensus about which concepts are most age-appropriate and which topics should be covered. The most recent standardization framework for US science education, the Next Generation Science Standards, suggests that most astronomy concepts should be taught only in the last years of one’s education; however, the framework has been met with considerable criticism. A comparison of astronomy learning frameworks in the United States, and a brief discussion of their criticisms, may provide international astronomy educators with comparison data for formulating recommendations in their own regions.
NASA Astrophysics Data System (ADS)
Annor, Frank; van de Giesen, Nick; Bogaard, Thom; Eilander, Dirk
2013-04-01
Small water reservoirs for water resources management have the important socio-economic advantage that they bring water close to villages and households. This proximity allows for many water uses in addition to irrigation, such as fisheries, household water, building materials (loam, reeds), tourism and recreation, and cattle watering. These positive aspects are offset by relatively large evaporative losses in comparison to larger reservoirs, although it is not exactly known how large these losses are. For decision makers, investors and donors, the decision to construct a small reservoir should be multifactorial, based on economic, socio-cultural and environmental factors. For the latter, getting the water balance and the energy budget of small reservoirs right is key for any environmental impact analysis. For Northern Ghana, the relation between the volume of a small reservoir and its surface area has been established in a robust equation: Volume = 0.00857 · Area^1.4367, with the surface area explaining more than 95% of the variation in water volume of the reservoirs. This allows the use of remote sensing observations for estimating the water volume of small reservoirs in northern Ghana. Hydrological analysis of time series of small reservoir areas comprises estimates of evaporation fluxes and cumulative surface runoff curves. Once the reservoirs are full, spillage occurs and volumes and surface areas remain stable at their maximum extents. This implies that the time series of reservoir surface area contains information about the onset of downstream surface runoff. This onset does not coincide with the onset of the rainy season but largely depends on the distribution of rainfall events and the storage capacity in the subsurface. The main requirement for this analysis is that the reservoir has negligible seepage losses or water influx from the underlying subsurface.
In our research, we carried out a time series analysis of surface area extent for about 45 small reservoirs in the Upper East Region of Ghana. Reservoirs without obvious large seepage losses (field survey) were selected. To verify this, stable water isotope samples were collected from groundwater upstream and downstream of each reservoir. By looking at possible enrichment of downstream groundwater, a good estimate of seepage can be made in addition to estimates of evaporation. We estimated the evaporative losses and compared them with field measurements using eddy correlation. Lastly, we determined the cumulative surface runoff curves for the small reservoirs. We will present this analytical framework for extracting hydrological information from time series of small reservoirs and show the first results for our study region of northern Ghana.
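The volume-area relation above is directly usable on remotely sensed surface areas. A minimal sketch (the units are an assumption, since the abstract does not state them; with areas in m² the power-law form suggests volumes in m³):

```python
# Volume-area power law for small reservoirs in northern Ghana, as given
# in the abstract: V = 0.00857 * A**1.4367. Units (m^2 -> m^3) are an
# assumption, since the abstract does not state them.
def reservoir_volume(area_m2: float) -> float:
    return 0.00857 * area_m2 ** 1.4367

# Example: a short time series of surface areas derived from satellite
# imagery, converted to storage volumes.
areas = [5_000.0, 20_000.0, 80_000.0]
volumes = [reservoir_volume(a) for a in areas]
```

Because the exponent exceeds 1, a quadrupling of surface area implies more than a quadrupling of stored volume (by a factor of 4^1.4367 ≈ 7.3), which is why area time series alone carry useful storage information.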
ERIC Educational Resources Information Center
Weikart, David P.; And Others
This report describes the Cognitively Oriented Curriculum based on Piagetian theory which is used in the Perry Preschool Project. The purpose of this long-term project is to help educationally disadvantaged Negro children develop the concepts and abilities necessary for academic success. The Piagetian theory of cognitive development is discussed…
ERIC Educational Resources Information Center
Miller, S. J., Ed.; Kirkland, David E., Ed.
2010-01-01
"Change Matters," written by leading scholars committed to social justice in English education, provides researchers, university instructors, and preservice and inservice teachers with a framework that pivots social justice toward policy. The chapters in this volume detail rationales about generating social justice theory in what Freire calls "the…
Satellite-Distributed Educational Television For Developing Countries; Working Papers. Volume 4.
ERIC Educational Resources Information Center
Schramm, Wilbur; And Others
This volume contains a collection of working papers designed to accompany the case studies of India and Latin America in providing a framework for the planning of educational broadcasting systems. The first two papers offer useful orientation to policy-makers who are considering educational television. Working Paper No. 3 compares the two media of…
First passage times in homogeneous nucleation: Dependence on the total number of particles
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yvinec, Romain; Bernard, Samuel; Pujo-Menjouet, Laurent
2016-01-21
Motivated by nucleation and molecular aggregation in physical, chemical, and biological settings, we present an extension to a thorough analysis of the stochastic self-assembly of a fixed number of identical particles in a finite volume. We study the statistics of times required for maximal clusters to be completed, starting from a pure-monomeric particle configuration. For finite volumes, we extend previous analytical approaches to the case of arbitrary size-dependent aggregation and fragmentation kinetic rates. For larger volumes, we develop a scaling framework to study the first assembly time behavior as a function of the total quantity of particles. We find that the mean time to first completion of a maximum-sized cluster may have a surprisingly weak dependence on the total number of particles. We highlight how higher statistics (variance, distribution) of the first passage time may nevertheless help to infer key parameters, such as the size of the maximum cluster. Finally, we present a framework to quantify formation of macroscopic sized clusters, which are (asymptotically) very unlikely and occur as a large deviation phenomenon from the mean-field limit. We argue that this framework is suitable to describe phase transition phenomena, as inherent infrequent stochastic processes, in contrast to classical nucleation theory.
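The finite-volume first-assembly statistics can be explored numerically with a stochastic simulation. The sketch below uses a Gillespie algorithm with a constant coagulation kernel and no fragmentation, a simpler special case than the arbitrary size-dependent rates treated in the paper:

```python
import numpy as np

rng = np.random.default_rng(3)

def first_assembly_time(M, N, rate=1.0):
    """Gillespie simulation of irreversible aggregation with a constant
    kernel: any two clusters coalesce at `rate`. Returns the time at which
    the first cluster of size >= N appears, starting from M monomers."""
    sizes = [1] * M
    t = 0.0
    while max(sizes) < N:
        n = len(sizes)
        total = rate * n * (n - 1) / 2           # total rate over all pairs
        t += rng.exponential(1.0 / total)        # waiting time to next event
        i, j = rng.choice(n, size=2, replace=False)
        sizes[min(i, j)] = sizes[i] + sizes[j]   # merge the chosen pair
        sizes.pop(max(i, j))
    return t

# Sample the first passage time distribution over many realizations.
times = [first_assembly_time(M=30, N=10) for _ in range(200)]
mean_t, var_t = np.mean(times), np.var(times)
```

Running many realizations gives the mean and variance of the first passage time; the paper's point is that such higher statistics can reveal key parameters even when the mean depends only weakly on the total particle number.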
Accurate Characterization of the Pore Volume in Microporous Crystalline Materials
2017-01-01
Pore volume is one of the main properties for the characterization of microporous crystals. It is experimentally measurable, and it can also be obtained from the refined unit cell by a number of computational techniques. In this work, we assess the accuracy and the discrepancies between the different computational methods which are commonly used for this purpose, i.e, geometric, helium, and probe center pore volumes, by studying a database of more than 5000 frameworks. We developed a new technique to fully characterize the internal void of a microporous material and to compute the probe-accessible and -occupiable pore volume. We show that, unlike the other definitions of pore volume, the occupiable pore volume can be directly related to the experimentally measured pore volumes from nitrogen isotherms. PMID:28636815
Accurate Characterization of the Pore Volume in Microporous Crystalline Materials
Ongari, Daniele; Boyd, Peter G.; Barthel, Senja; ...
2017-06-21
Pore volume is one of the main properties for the characterization of microporous crystals. It is experimentally measurable, and it can also be obtained from the refined unit cell by a number of computational techniques. In this work, we assess the accuracy and the discrepancies between the different computational methods which are commonly used for this purpose, i.e., geometric, helium, and probe center pore volumes, by studying a database of more than 5000 frameworks. We developed a new technique to fully characterize the internal void of a microporous material and to compute the probe-accessible and -occupiable pore volume. Lastly, we show that, unlike the other definitions of pore volume, the occupiable pore volume can be directly related to the experimentally measured pore volumes from nitrogen isotherms.
NASA Astrophysics Data System (ADS)
Deich, Martha L.
The concept of density can be difficult to learn. In the middle grades, students characteristically conflate mass and density, and even after instruction many students do not distinguish them consistently (Smith, Maclin, Grosslight, & Davis, 1997). Few develop a conceptualization of density that accounts for the implications of changing mass, volume, temperature, and/or state. My work looks specifically at how students make sense of the relationship between mass and volume as they refine their understanding of density. The concept of density is challenging to teach. Traditional methods of teaching density in middle-school classrooms typically involve either the measurement of an object's mass and volume and the subsequent calculation of the ratio of the two quantities, or the observation of different materials in water to learn about their buoyancy. Unfortunately, as Carol Smith and her colleagues have documented (1985, 1992, 1997), these approaches leave many students stuck in their "commonsense frameworks" that merge mass and density into one concept. Teachers need better ways to teach density. Hence I designed an intervention to test potentially more effective approaches to teaching density. I developed and taught a complex intervention (Brown, 1992) featuring student modeling, extensive student dialogue on data and data analyses, formative assessments, the substitution of hands-on inquiry for mathematical problem sets, and multiple thought experiments. The hallmarks of the intervention were modeling and student dialogue, and the research question I posed was: Does classroom practice that encourages modeling with open-ended discourse help students differentiate between the concepts of mass and density? I patterned my research on a Smith study of density instruction in eighth grade (Smith, Maclin, Grosslight, & Davis, 1997), which had a quasi-experimental research design that compared the results of teaching density differently in two classrooms.
I selected an intervention class and a comparison class from those I was teaching. The core of the density curriculum was similar in both classes. Instead of the intervention, though, the comparison class closely followed the lesson sequences provided by the classroom textbook, which tended to focus on formal and formulaic density instruction. I modified Smith's assessments for sixth graders. After teaching one class the intervention curriculum and the other the textbook-based curriculum, I evaluated and compared the progress of research participants in both classrooms by means of a pre- and post-instruction clinical interview, a pre- and post-instruction written test, and the end-of-chapter test from the textbook used in the comparison classroom. The results of my study were consistent: the intervention students outperformed the comparison students and showed greater improvement on all assessments. In this study, modeling and student discourse were more effective ways to teach density than a standard textbook-based lesson sequence. The intervention helped students begin to disentangle mass and density, fostering both the comprehension of volume as a variable property of matter and a nuanced understanding of density beyond formulaic reasoning. This dissertation is a report of my study for two audiences: academics and science educators. For the latter, I include recommendations for improving density instruction that are informed by my research.
Spacecraft Habitable Volume: Results of an Interdisciplinary Workshop
NASA Technical Reports Server (NTRS)
Fitts, David J.; Connolly, Janis; Howard, Robert
2011-01-01
NASA's Human Exploration Framework Team posed the question: "Is 80 cubic meters per person of habitable volume acceptable for a proposed Deep Space Habitat?" The goal of the workshop was to address the "net habitable volume" necessary for long-duration human spaceflight missions and identify design and psychological issues and mitigations. The objectives were: (1) Identify psychological factors -- i.e., "stressors" -- that impact volume and layout specifications for long duration missions (2) Identify mitigation strategies for stressors, especially those that can be written as volume design specifications (3) Identify a forward research roadmap -- i.e., what future work is needed to define and validate objective design metrics? (4) Provide advisories on the human factors consequences of poor net habitable volume allocation and layout design.
Comparison and Contrast of Two General Functional Regression Modeling Frameworks
Morris, Jeffrey S.
2017-01-01
In this article, Greven and Scheipl describe an impressively general framework for performing functional regression that builds upon the generalized additive modeling framework. Over the past number of years, my collaborators and I have also been developing a general framework for functional regression, functional mixed models, which shares many similarities with this framework, but has many differences as well. In this discussion, I compare and contrast these two frameworks, to hopefully illuminate characteristics of each, highlighting their respective strengths and weaknesses, and providing recommendations regarding the settings in which each approach might be preferable. PMID:28736502
Hayashida, Kenshi; Imanaka, Yuichi; Fukuda, Haruhisa
2007-09-03
In Japan, as in many other countries, several quality and safety assurance measures have been implemented since the 1990s, in spite of cost containment efforts. Although government and hospital decision-makers demand comprehensive analysis of these activities at the hospital-wide level, there have been few studies that actually quantify them. Therefore, the aims of this study were to measure hospital-wide activities for patient safety and infection control through a systematic framework, and to identify the incremental volume of these activities implemented over the last five years. Using the conceptual framework of incremental activity corresponding to incremental cost, we defined the scope of patient safety and infection control activities. We then drafted a questionnaire to analyze these realms. After implementing the questionnaire, we conducted several in-person interviews with managers and other staff in charge of patient safety and infection control in seven acute care teaching hospitals in Japan. At most hospitals, nurses and clerical employees were the main actors in patient safety practices. The annual amount of activity ranged from 14,557 to 72,996 person-hours (per 100 beds: 6,240; per 100 staff: 3,323) across participant hospitals. Pharmacists performed more incremental activities than their proportional share. With respect to infection control activities, the annual volume ranged from 3,015 to 12,196 person-hours (per 100 beds: 1,141; per 100 staff: 613). For infection control, medical doctors and nurses tended to perform somewhat more of the duties relative to their share. We developed a systematic framework to quantify hospital-wide activities for patient safety and infection control. We also assessed the incremental volume of these activities in Japanese hospitals under the reimbursement containment policy. Government and hospital decision makers can benefit from this type of analytic framework and its empirical findings.
NASA Astrophysics Data System (ADS)
Dumpuri, Prashanth; Clements, Logan W.; Li, Rui; Waite, Jonathan M.; Stefansic, James D.; Geller, David A.; Miga, Michael I.; Dawant, Benoit M.
2009-02-01
Preoperative planning combined with image-guidance has shown promise towards increasing the accuracy of liver resection procedures. The purpose of this study was to validate one such preoperative planning tool for four patients undergoing hepatic resection. Preoperative computed tomography (CT) images acquired before surgery were used to identify tumor margins and to plan the surgical approach for resection of these tumors. Surgery was then performed with intraoperative digitization data acquired by an FDA-approved image-guided liver surgery system (Pathfinder Therapeutics, Inc., Nashville, TN). Within 5-7 days after surgery, post-operative CT image volumes were acquired. Registration of data within a common coordinate reference was achieved and preoperative plans were compared to the postoperative volumes. Semi-quantitative comparisons are presented in this work, and preliminary results indicate that significant liver regeneration/hypertrophy may be present in the postoperative CT images. This could challenge pre/post-operative CT volume change comparisons as a means to evaluate the accuracy of preoperative surgical plans.
A Conceptual Framework for Indoor Mapping by Using Grammars
NASA Astrophysics Data System (ADS)
Hu, X.; Fan, H.; Zipf, A.; Shang, J.; Gu, F.
2017-09-01
Maps are the foundation of indoor location-based services. Many automatic indoor mapping approaches have been proposed, but they rely heavily on sensor data, such as point clouds and users' location traces. To address this issue, this paper presents a conceptual framework to represent the layout principles of research buildings by using grammars. This framework can benefit the indoor mapping process by improving the accuracy of generated maps and by dramatically reducing the volume of the sensor data required by traditional reconstruction approaches. In addition, we present further details of several core modules of the framework. An example using the proposed framework is given to show the generation process of a semantic map. This framework is part of ongoing research on the development of an approach for reconstructing semantic maps.
ERIC Educational Resources Information Center
Commission of the European Communities, Brussels (Belgium).
This annex to the main report, the third volume in a three volume set, is based on a study performed by the DELTA (Developing European Learning through Technological Advance) unit in parallel with the projects underway in the research and development Exploratory Action. It provides an assessment of the world situation in flexible and distance…
Schwarzer, Ruth; Siebert, Uwe
2009-07-01
The objectives of this study were (i) to develop a systematic framework for describing and comparing different features of health technology assessment (HTA) agencies, (ii) to identify and describe similarities and differences between the agencies, and (iii) to draw conclusions both for producers and users of HTA in research, policy, and practice. We performed a systematic literature search, added information from HTA agencies, and developed a conceptual framework comprising eight main domains: organization, scope, processes, methods, dissemination, decision, implementation, and impact. We grouped relevant items of these domains in an evidence table and chose five HTA agencies to test our framework: DAHTA@DIMDI, HAS, IQWiG, NICE, and SBU. Item and domain similarity was assessed using the percentage of identical characteristics in pairwise comparisons across agencies. Results were interpreted across agencies by demonstrating similarities and differences. Based on 306 included documents, we identified 90 characteristics of eight main domains appropriate for our framework. After applying the framework to the five agencies, we were able to show 40 percent similarities in "dissemination," 38 percent in "scope," 35 percent in "organization," 29 percent in "methods," 26 percent in "processes," 23 percent in "impact," 19 percent in "decision," and 17 percent in "implementation." We found considerably more differences than similarities in HTA features across agencies and countries. Our framework and comparison provide insights into the need for harmonization. Our findings could serve as a descriptive database facilitating communication between producers and users.
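The similarity measure described above (percentage of identical characteristics in pairwise comparisons across agencies) can be computed in a few lines. The sketch below uses hypothetical agency profiles; agency names, item names, and values are invented for illustration.

```python
from itertools import combinations

def domain_similarity(profiles):
    """Percent of items with identical values across all pairwise
    agency comparisons within one domain.  `profiles` maps an agency
    name to a dict of {characteristic: value}; only characteristics
    recorded for both agencies in a pair are compared."""
    total = same = 0
    for a, b in combinations(profiles, 2):
        for item in set(profiles[a]) & set(profiles[b]):
            total += 1
            same += profiles[a][item] == profiles[b][item]
    return 100.0 * same / total if total else 0.0

# Hypothetical "dissemination" domain for three agencies:
profiles = {
    "AgencyA": {"public_report": True, "media_channel": "web"},
    "AgencyB": {"public_report": True, "media_channel": "print"},
    "AgencyC": {"public_report": False, "media_channel": "web"},
}
similarity = domain_similarity(profiles)   # percent identical, all pairs
```

Applied per domain, this reproduces the kind of percentage table reported in the abstract.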
Andersson, Jesper L R; Graham, Mark S; Drobnjak, Ivana; Zhang, Hui; Filippini, Nicola; Bastiani, Matteo
2017-05-15
Most motion correction methods work by aligning a set of volumes together, or to a volume that represents a reference location. These are based on an implicit assumption that the subject remains motionless during the several seconds it takes to acquire all slices in a volume, and that any movement occurs in the brief moment between acquiring the last slice of one volume and the first slice of the next. This approximation holds more or less well depending on how long it takes to acquire one volume and on how rapidly the subject moves. In this paper we present a method that increases the temporal resolution of the motion correction by modelling movement as a piecewise continuous function over time. This intra-volume movement correction is implemented within a previously presented framework that simultaneously estimates distortions, movement and movement-induced signal dropout. We validate the method on highly realistic simulated data containing all of these effects. It is demonstrated that we can estimate the true movement with high accuracy, and that scalar parameters derived from the data, such as fractional anisotropy, are estimated with greater fidelity when data has been corrected for intra-volume movement. Importantly, we also show that the difference in fidelity between data affected by different amounts of movement is much reduced when taking intra-volume movement into account. Additional validation was performed on data from a healthy volunteer scanned when lying still and when performing deliberate movements. We show an increased correspondence between the "still" and the "movement" data when the latter is corrected for intra-volume movement. Finally we demonstrate a large reduction in the telltale signs of intra-volume movement in data acquired on elderly subjects. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.
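The key idea, assigning motion parameters to individual slices rather than whole volumes, can be illustrated with a minimal sketch: per-volume rigid-body estimates are linearly interpolated at each slice's acquisition time, giving a simple piecewise-continuous model. This is not the authors' actual estimator, which estimates movement jointly with distortion and signal dropout.

```python
def slice_motion_params(vol_params, vol_times, slice_times):
    """Assign each slice its own rigid-body motion parameters by
    linear interpolation of per-volume estimates over time.
    vol_params: one 6-tuple (3 rotations, 3 translations) per volume;
    vol_times: acquisition time of each volume; slice_times: times at
    which individual slices were acquired.  Times outside the sampled
    interval are clamped to the nearest volume."""
    out = []
    for t in slice_times:
        if t <= vol_times[0]:                  # before first volume
            out.append(tuple(vol_params[0]))
            continue
        if t >= vol_times[-1]:                 # after last volume
            out.append(tuple(vol_params[-1]))
            continue
        for i in range(len(vol_times) - 1):
            t0, t1 = vol_times[i], vol_times[i + 1]
            if t0 <= t <= t1:
                w = (t - t0) / (t1 - t0)
                out.append(tuple((1.0 - w) * a + w * b for a, b in
                                 zip(vol_params[i], vol_params[i + 1])))
                break
    return out
```

Each slice can then be resampled with its own transform instead of the single per-volume transform used by conventional correction.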
Item Response Modeling of Paired Comparison and Ranking Data
ERIC Educational Resources Information Center
Maydeu-Olivares, Alberto; Brown, Anna
2010-01-01
The comparative format used in ranking and paired comparisons tasks can significantly reduce the impact of uniform response biases typically associated with rating scales. Thurstone's (1927, 1931) model provides a powerful framework for modeling comparative data such as paired comparisons and rankings. Although Thurstonian models are generally…
NASA Astrophysics Data System (ADS)
Ahmadibasir, Mohammad
In this study, an interpretive learning framework that aims to measure learning at the classroom level is introduced. In order to develop and evaluate the value of the framework, a theoretical/empirical study is designed. The researcher attempted to illustrate how the proposed framework provides insights on the problem of classroom-level learning. The framework is developed by constructing connections between the current literature on science learning and Wittgenstein's language-game theory. In this framework, learning is defined as change of classroom language-game or discourse. In the proposed framework, learning is measured by analysis of classroom discourse. The empirical explanatory power of the framework is evaluated by applying the framework in the analysis of learning in a fifth-grade science classroom. The researcher attempted to analyze how students' colloquial discourse changed to a discourse that bears more resemblance to science discourse. The results of the empirical part of the investigation are presented in three parts: first, the gap between what students did and what they were supposed to do was reported. The gap showed that students during the classroom inquiry wanted to do simple comparisons by direct observation, while they were supposed to do tool-assisted observation and procedural manipulation for a complete comparison. Second, it was illustrated that the first attempt to connect the colloquial to science discourse was done by what was immediately intelligible for students and then the teacher negotiated with students in order to help them to connect the old to the new language-game more purposefully. The researcher suggested that these two events in the science classroom are critical in discourse change. Third, it was illustrated that through the academic year, the way that students did the act of comparison was improved and by the end of the year more accurate causal inferences were observable in classroom communication.
At the end of the study, the researcher illustrates that the application of the proposed framework resulted in an improved version of the framework. The improved version of the proposed framework is more connected to the topic of science learning, and is able to measure the change of discourse in higher resolution.
Schweickart, Oliver; Brown, Norman R
2014-02-01
How do people compare quantitative attributes of real-world objects? (e.g., Which country has the higher per capita GDP, Mauritania or Nepal?). The research literature on this question is divided: Although researchers in the 1970s and 1980s assumed that a 2-stage magnitude comparison process underlies these types of judgments (Banks, 1977), more recent approaches emphasize the role of probabilistic cues and simple heuristics (Gigerenzer, Todd, & The ABC Research Group, 1999). In this article, we review the magnitude comparison literature and propose a framework for magnitude comparison under uncertainty (MaC). Predictions from this framework were tested in a choice context involving one recognized and one unrecognized object, and were contrasted with those based on the recognition heuristic (Goldstein & Gigerenzer, 2002). This was done in 2 paired-comparison studies. In both, participants were timed as they decided which of 2 countries had the higher per capita gross domestic product (GDP). Consistent with the MaC account, we found that response times (RTs) displayed a classic symbolic distance effect: RTs were inversely related to the difference between the subjective per capita GDPs of the compared countries. Furthermore, choice of the recognized country became more frequent as subjective difference increased. These results indicate that the magnitude comparison process extends to choice contexts that have previously been associated only with cue-based strategies. We end by discussing how several findings reported in the recent heuristics literature relate to the MaC framework.
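A toy instantiation of such a magnitude-comparison account: choice probability grows logistically with the subjective magnitude difference, while response time decreases with its absolute value (the symbolic distance effect). The functional forms and parameter values below are illustrative assumptions, not taken from the paper.

```python
import math

def mac_predictions(mu_a, mu_b, base_rt=1.5, slope=0.4, noise=1.0):
    """Toy magnitude-comparison (MaC-style) model.  mu_a, mu_b are
    subjective magnitudes (e.g. believed per capita GDPs) of the two
    options.  Returns (probability of choosing A, predicted RT in s).
    base_rt, slope and noise are invented illustrative parameters."""
    d = mu_a - mu_b
    p_choose_a = 1.0 / (1.0 + math.exp(-d / noise))   # logistic choice rule
    rt = base_rt - slope * math.tanh(abs(d))          # RT shrinks with |d|
    return p_choose_a, rt
```

Under this sketch, a pair of countries with very different subjective magnitudes yields both a faster response and a more reliable choice of the larger one, matching the symbolic distance pattern described above.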
Final report on the EURAMET.M.FF-K4.2.2014 volume comparison at 100 μL—calibration of micropipettes
NASA Astrophysics Data System (ADS)
Batista, Elsa; Matus, Michael; Metaxiotou, Zoe; Tudor, Maria; Lenard, Elzbieta; Buker, Oliver; Wennergren, Per; Piluri, Erinda; Miteva, Mariana; Vicarova, Martina; Vospĕlová, Alena; Turnsek, Urska; Micic, Ljiljana; Grue, Lise-Lote; Mihailovic, Mirjana; Sarevska, Anastazija
2017-01-01
During the EURAMET TC-F meeting of 2014 and following the finalization of the CCM.FF-K4.2.2011 comparison, it was agreed to start a Regional Key Comparison (KC) on volume measurements using two 100 μL micropipettes (piston pipettes), allowing the participating laboratories to assess the agreement of their results and uncertainties. Two 100 μL micropipettes were tested by 15 participants. One participant was not a member or associate member of the BIPM and was therefore removed from this report. The comparison started in July 2015 and ended in March 2016. The Volume and Flow Laboratory of the Portuguese Institute for Quality (IPQ) was the pilot laboratory and performed the initial and final measurements of the micropipettes. The micropipettes showed a stable volume during the whole comparison, which was confirmed by the results from the pilot laboratory. The original results of all participant NMIs were corrected to the standard atmospheric pressure in order to compare results under the same calibration conditions, and the contribution of the 'process-related handling contribution' was added to the uncertainty budget of each participant. In general the declared CMCs are in accordance with the KCDB. For the micropipette 354828Z, two laboratories had inconsistent results. For micropipette 354853Z, three laboratories had inconsistent results. This text appears in Appendix B of the BIPM key comparison database (kcdb.bipm.org). The final report has been peer-reviewed and approved for publication by the CCM, according to the provisions of the CIPM Mutual Recognition Arrangement (CIPM MRA).
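Consistency in key comparisons of this kind is typically judged with normalized error values: a laboratory result is consistent with the reference when |E_n| ≤ 1. A minimal sketch, with all numbers invented:

```python
import math

def en_value(x_lab, U_lab, x_ref, U_ref):
    """Normalized error for a key comparison result:
    E_n = (x_lab - x_ref) / sqrt(U_lab**2 + U_ref**2),
    where U_lab and U_ref are expanded (k=2) uncertainties.
    |E_n| <= 1 indicates consistency with the reference value."""
    return (x_lab - x_ref) / math.sqrt(U_lab ** 2 + U_ref ** 2)

# e.g. a lab reporting 100.05 uL +/- 0.08 uL against a reference
# value of 99.98 uL +/- 0.06 uL (invented numbers):
en = en_value(100.05, 0.08, 99.98, 0.06)   # |en| < 1 -> consistent
```

An "inconsistent result" in the report corresponds to a laboratory whose |E_n| exceeds 1 for that micropipette.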
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mendenhall, M.; Bowden, N.; Brodsky, J.
Electron antineutrino (ν̄e) detectors can support nuclear safeguards, from reactor monitoring to spent fuel characterization. In recent years, the scientific community has developed multiple detector concepts, many of which have been prototyped or deployed for specific measurements by their respective collaborations. However, the diversity of technical approaches, deployment conditions, and analysis techniques complicates direct performance comparison between designs. We have begun development of a simulation framework to compare and evaluate existing and proposed detector designs for nonproliferation applications in a uniform manner. This report demonstrates the intent and capabilities of the framework by evaluating four detector design concepts, calculating generic reactor antineutrino counting sensitivity and capabilities in a plutonium disposition application example.
A high-level 3D visualization API for Java and ImageJ.
Schmid, Benjamin; Schindelin, Johannes; Cardona, Albert; Longair, Mark; Heisenberg, Martin
2010-05-21
Current imaging methods such as Magnetic Resonance Imaging (MRI), Confocal microscopy, Electron Microscopy (EM) or Selective Plane Illumination Microscopy (SPIM) yield three-dimensional (3D) data sets in need of appropriate computational methods for their analysis. The reconstruction, segmentation and registration are best approached from the 3D representation of the data set. Here we present a platform-independent framework based on Java and Java 3D for accelerated rendering of biological images. Our framework is seamlessly integrated into ImageJ, a free image processing package with a vast collection of community-developed biological image analysis tools. Our framework enriches the ImageJ software libraries with methods that greatly reduce the complexity of developing image analysis tools in an interactive 3D visualization environment. In particular, we provide high-level access to volume rendering, volume editing, surface extraction, and image annotation. The ability to rely on a library that removes the low-level details enables concentrating software development efforts on the algorithm implementation parts. Our framework enables biomedical image software development to be built with 3D visualization capabilities with very little effort. We offer the source code and convenient binary packages along with extensive documentation at http://3dviewer.neurofly.de.
Molina-Romero, Miguel; Gómez, Pedro A; Sperl, Jonathan I; Czisch, Michael; Sämann, Philipp G; Jones, Derek K; Menzel, Marion I; Menze, Bjoern H
2018-03-23
The compartmental nature of brain tissue microstructure is typically studied by diffusion MRI, MR relaxometry or their correlation. Diffusion MRI relies on signal representations or biophysical models, while MR relaxometry and correlation studies are based on regularized inverse Laplace transforms (ILTs). Here we introduce a general framework for characterizing microstructure that does not depend on diffusion modeling and replaces ill-posed ILTs with blind source separation (BSS). This framework yields proton density, relaxation times, volume fractions, and signal disentanglement, allowing for separation of the free-water component. Diffusion experiments repeated for several different echo times contain entangled diffusion and relaxation compartmental information. These can be disentangled by BSS using a physically constrained nonnegative matrix factorization. Computer simulations and phantom studies, together with repeatability and reproducibility experiments, demonstrated that BSS is capable of estimating proton density, compartmental volume fractions and transversal relaxations. In vivo results proved its potential to correct for free-water contamination and to estimate tissue parameters. Formulation of the diffusion-relaxation dependence as a BSS problem introduces a new framework for studying microstructure compartmentalization, and a novel tool for free-water elimination. © 2018 International Society for Magnetic Resonance in Medicine.
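The factorization step can be illustrated with a generic nonnegative matrix factorization applied to a synthetic two-compartment signal sampled over echo times (rows) and b-values (columns). This plain multiplicative-update NMF is a stand-in for the physically constrained factorization used in the paper; all T2 values, diffusivities, and fractions below are invented.

```python
import numpy as np

def nmf(X, k, iters=2000, seed=0, eps=1e-9):
    """Plain multiplicative-update NMF: X ~ W @ H with W, H >= 0.
    A generic stand-in for the physically constrained factorization
    used for BSS in the paper; not the authors' algorithm."""
    rng = np.random.default_rng(seed)
    n, m = X.shape
    W = rng.random((n, k)) + eps
    H = rng.random((k, m)) + eps
    for _ in range(iters):
        H *= (W.T @ X) / (W.T @ W @ H + eps)
        W *= (X @ H.T) / (W @ H @ H.T + eps)
    return W, H

# Synthetic two-compartment signal over echo times and b-values:
TE = np.linspace(0.06, 0.12, 4)[:, None]               # echo times, s
b = np.linspace(0.0, 3000.0, 8)[None, :]               # b-values, s/mm^2
X = (0.7 * np.exp(-TE / 0.08) * np.exp(-b * 0.7e-3)    # "tissue"
     + 0.3 * np.exp(-TE / 0.5) * np.exp(-b * 3.0e-3))  # "free water"
W, H = nmf(X, 2)
err = np.linalg.norm(X - W @ H) / np.linalg.norm(X)    # reconstruction error
```

Because the mixture is nonnegative and effectively rank 2, a rank-2 factorization recovers the signal closely; the rows of H play the role of compartmental diffusion signals and the columns of W encode their TE-dependent (relaxation-weighted) fractions.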
Instrumentation for Environmental Monitoring: Water, Volume 2.
ERIC Educational Resources Information Center
California Univ., Berkeley. Lawrence Berkeley Lab.
This volume is one of a series discussing instrumentation for environmental monitoring. Each volume contains an overview of the basic problems, comparisons among the basic methods of sensing and detection, and notes that summarize the characteristics of presently available instruments and techniques. The text of this survey discusses the…
DESCQA: Synthetic Sky Catalog Validation Framework
NASA Astrophysics Data System (ADS)
Mao, Yao-Yuan; Uram, Thomas D.; Zhou, Rongpu; Kovacs, Eve; Ricker, Paul M.; Kalmbach, J. Bryce; Padilla, Nelson; Lanusse, François; Zu, Ying; Tenneti, Ananth; Vikraman, Vinu; DeRose, Joseph
2018-04-01
The DESCQA framework provides rigorous validation protocols for assessing the quality of simulated sky catalogs in a straightforward and comprehensive way. DESCQA enables the inspection, validation, and comparison of an inhomogeneous set of synthetic catalogs via the provision of a common interface within an automated framework. An interactive web interface is also available at portal.nersc.gov/project/lsst/descqa.
A metamorphic inorganic framework that can be switched between eight single-crystalline states
NASA Astrophysics Data System (ADS)
Zhan, Caihong; Cameron, Jamie M.; Gabb, David; Boyd, Thomas; Winter, Ross S.; Vilà-Nadal, Laia; Mitchell, Scott G.; Glatzel, Stefan; Breternitz, Joachim; Gregory, Duncan H.; Long, De-Liang; MacDonell, Andrew; Cronin, Leroy
2017-02-01
The design of highly flexible framework materials requires organic linkers, whereas inorganic materials are more robust but inflexible. Here, by using linkable inorganic rings made up of tungsten oxide (P8W48O184) building blocks, we synthesized an inorganic single crystal material that can undergo at least eight different crystal-to-crystal transformations, with gigantic crystal volume contraction and expansion changes ranging from -2,170 to +1,720 ų with no reduction in crystallinity. Not only does this material undergo the largest single crystal-to-single crystal volume transformation thus far reported (to the best of our knowledge), the system also shows conformational flexibility while maintaining robustness over several cycles in the reversible uptake and release of guest molecules, switching the crystal between different metamorphic states. This material combines the robustness of inorganic materials with the flexibility of organic frameworks, thereby challenging the notion that flexible materials with robustness are mutually exclusive.
Luo, Yi; Eickhoff, Simon B; Hétu, Sébastien; Feng, Chunliang
2018-01-01
Social comparison is ubiquitous across human societies with dramatic influence on people's well-being and decision making. Downward comparison (comparing to worse-off others) and upward comparison (comparing to better-off others) constitute two types of social comparisons that produce different neuropsychological consequences. Based on studies exploring neural signatures associated with downward and upward comparisons, the current study utilized a coordinate-based meta-analysis to provide a refinement of understanding about the underlying neural architecture of social comparison. We identified consistent involvement of the ventral striatum and ventromedial prefrontal cortex in downward comparison and consistent involvement of the anterior insula and dorsal anterior cingulate cortex in upward comparison. These findings fit well with the "common-currency" hypothesis that neural representations of social gain or loss resemble those for non-social reward or loss processing. Accordingly, we discussed our findings in the framework of general reinforcement learning (RL) hypothesis, arguing how social gain/loss induced by social comparisons could be encoded by the brain as a domain-general signal (i.e., prediction errors) serving to adjust people's decisions in social settings. Although the RL account may serve as a heuristic framework for the future research, other plausible accounts on the neuropsychological mechanism of social comparison were also acknowledged. Hum Brain Mapp 39:440-458, 2018. © 2017 Wiley Periodicals, Inc.
Dalwani, Manish S; McMahon, Mary Agnes; Mikulich-Gilbertson, Susan K; Young, Susan E; Regner, Michael F; Raymond, Kristen M; McWilliams, Shannon K; Banich, Marie T; Tanabe, Jody L; Crowley, Thomas J; Sakai, Joseph T
2015-01-01
Structural neuroimaging studies have demonstrated lower regional gray matter volume in adolescents with severe substance and conduct problems. These research studies, including ours, have generally focused on male-only or mixed-sex samples of adolescents with conduct and/or substance problems. Here we compare gray matter volume between female adolescents with severe substance and conduct problems and female healthy controls of similar ages. We hypothesized that female adolescents with severe substance and conduct problems would show significantly less gray matter volume in frontal regions critical to inhibition (i.e., dorsolateral prefrontal cortex and ventrolateral prefrontal cortex), conflict processing (i.e., anterior cingulate), valuation of expected outcomes (i.e., medial orbitofrontal cortex), and the dopamine reward system (i.e., striatum). We conducted whole-brain voxel-based morphometric comparison of structural MR images of 22 patients (14-18 years) with severe substance and conduct problems and 21 controls of similar age using statistical parametric mapping (SPM) and the voxel-based morphometry (VBM8) toolbox. We tested group differences in regional gray matter volume with analyses of covariance, adjusting for age and IQ at p<0.05, corrected for multiple comparisons at a whole-brain cluster-level threshold. Female adolescents with severe substance and conduct problems compared to controls showed significantly less gray matter volume in right dorsolateral prefrontal cortex, left ventrolateral prefrontal cortex, medial orbitofrontal cortex, anterior cingulate, bilateral somatosensory cortex, left supramarginal gyrus, and bilateral angular gyrus. Considering the entire brain, patients had 9.5% less overall gray matter volume compared to controls.
Female adolescents with severe substance and conduct problems in comparison to similarly aged female healthy controls showed substantially lower gray matter volume in brain regions involved in inhibition, conflict processing, valuation of outcomes, decision-making, reward, risk-taking, and rule-breaking antisocial behavior.
NASA Technical Reports Server (NTRS)
Mayer, Richard J.; Blinn, Thomas M.; Dewitte, Paul S.; Crump, John W.; Ackley, Keith A.
1992-01-01
The Framework Programmable Software Development Platform (FPP) is a project aimed at effectively combining tool and data integration mechanisms with a model of the software development process to provide an intelligent integrated software development environment. Guided by the model, this system development framework will take advantage of an integrated operating environment to automate effectively the management of the software development process so that costly mistakes during the development phase can be eliminated. The Advanced Software Development Workstation (ASDW) program is conducting research into development of advanced technologies for Computer Aided Software Engineering (CASE).
Janiszewska-Olszowska, Joanna; Tandecka, Katarzyna; Szatkiewicz, Tomasz; Stępień, Piotr; Sporniak-Tutak, Katarzyna; Grocholewicz, Katarzyna
2015-11-18
The present study aimed at 3D analysis of adhesive remnants and enamel loss following the debonding of orthodontic molar tubes and orthodontic clean-up, to assess the effectiveness and safety of a one-step finisher and polisher and of Adhesive Residue Remover in comparison to a tungsten carbide bur. Thirty human molars were bonded with chemical-cure orthodontic adhesive (Unite, 3M, USA), stored 24 h in 0.9% saline solution, then debonded and cleaned using three methods (three groups of ten): tungsten carbide bur (Dentaurum, Pforzheim, Germany), one-step finisher and polisher (One Gloss, Shofu Dental, Kyoto, Japan), and Adhesive Residue Remover (Dentaurum, Pforzheim, Germany). Direct 3D scanning in blue-light technology to the nearest 2 μm was performed before etching and after adhesive removal. Adhesive remnant height and volume as well as enamel loss depth and volume were calculated. An index of effectiveness and safety was proposed and calculated for every tool; the adhesive remnant volume and the doubled enamel loss volume were divided by their sum. Parametric ANOVA or nonparametric Kruskal-Wallis rank tests were used to compare tools with respect to adhesive remnant height and volume, enamel loss depth and volume, and the proposed index. No statistically significant differences in the volume (p = 0.35) or mean height (p = 0.24) of adhesive remnants were found (Kruskal-Wallis rank test) between the groups of teeth cleaned using the different tools. Mean volume of enamel loss was 2.159 mm³ for the tungsten carbide bur, 1.366 mm³ for Shofu One Gloss, and 0.659 mm³ for Adhesive Residue Remover (F = 2.816, p = 0.0078). A comparison of the proposed new index between tools revealed highly statistically significant differences (p = 0.0081), with the best value for Adhesive Residue Remover and the worst for the tungsten carbide bur. The evaluated tools were all characterized by similar effectiveness.
The most destructive tool with regard to enamel was the tungsten carbide bur, and the least destructive was Adhesive Residue Remover.
Direct and simultaneous estimation of cardiac four chamber volumes by multioutput sparse regression.
Zhen, Xiantong; Zhang, Heye; Islam, Ali; Bhaduri, Mousumi; Chan, Ian; Li, Shuo
2017-02-01
Cardiac four-chamber volume estimation plays a fundamental and crucial role in clinical quantitative analysis of whole-heart function. It is a challenging task due to the complexity of the four chambers, including great appearance variation, large shape deformation, and interference between chambers. Direct estimation has recently emerged as an effective and convenient tool for cardiac ventricular volume estimation. However, existing direct estimation methods were developed specifically for a single ventricle, i.e., the left ventricle (LV), or for bi-ventricles; they cannot be directly used for four-chamber volume estimation due to the great combinatorial variability and highly complex anatomical interdependency of the four chambers. In this paper, we propose a new, general framework for direct and simultaneous four-chamber volume estimation. We address two key issues, i.e., cardiac image representation and simultaneous four-chamber volume estimation, which enables accurate and efficient four-chamber volume estimation. We generate compact and discriminative image representations by supervised descriptor learning (SDL), which removes irrelevant information and extracts discriminative features. We achieve direct and simultaneous four-chamber volume estimation by multioutput sparse latent regression (MSLR), which jointly models nonlinear input-output relationships and captures four-chamber interdependence. The proposed method is highly general and independent of imaging modality, providing a regression framework that can be used extensively for clinical data prediction to achieve automated diagnosis. Experiments on both MR and CT images show that our method achieves high performance, with a correlation coefficient of up to 0.921 against ground truth obtained manually by human experts, which is clinically significant and enables more accurate, convenient, and comprehensive assessment of cardiac function.
Copyright © 2016 Elsevier B.V. All rights reserved.
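The direct-estimation idea above, mapping an image descriptor straight to all four chamber volumes at once, can be sketched with a plain multi-output ridge regression standing in for the paper's MSLR; the function name, feature dimensions, and regularizer below are illustrative assumptions, not the authors' method.

```python
import numpy as np

def multioutput_ridge(X, Y, lam=1.0):
    """Closed-form multi-output ridge regression: W = (X'X + lam*I)^-1 X'Y.

    X: (n_samples, n_features) image descriptors
    Y: (n_samples, 4) the four chamber volumes
    Returns W, an (n_features, 4) weight matrix; predict with X_new @ W.
    Stand-in only: the paper uses multioutput sparse latent regression
    (MSLR) on learned SDL descriptors, not plain ridge.
    """
    d = X.shape[1]
    return np.linalg.solve(X.T @ X + lam * np.eye(d), X.T @ Y)
```

The single weight matrix couples all four outputs to the same descriptor, which is the sense in which estimation is "simultaneous"; MSLR additionally imposes sparsity and a latent nonlinear mapping.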
Treuer, Harald; Hoevels, Moritz; Luyken, Klaus; Visser-Vandewalle, Veerle; Wirths, Jochen; Kocher, Martin; Ruge, Maximilian
2015-06-01
Stereotactic radiosurgery with an adapted linear accelerator (linac-SRS) is an established therapy option for brain metastases, benign brain tumors, and arteriovenous malformations. We intended to investigate whether the dosimetric quality of treatment plans achieved with a CyberKnife (CK) is at least equivalent to that of linac-SRS with circular or micro-multileaf collimators (microMLC). A random sample of 16 patients with 23 target volumes, previously treated with linac-SRS, was replanned for CK. Planning constraints were identical dose prescription and clinical applicability. In all cases, uniform optimization scripts and inverse planning objectives were used. Plans were compared with respect to coverage, minimal dose within the target volume, conformity index, and volume of brain tissue irradiated with ≥10 Gy. Generating the CK plan was unproblematic with simple optimization scripts in all cases. With the CK plans, coverage, minimal target volume dose, and conformity index were significantly better, while no significant improvement could be shown for the 10 Gy volume. Multiobjective comparison of the irradiated target volumes favored the CK plan in 20 of 23 cases and was equivalent in 3 of 23 cases. Multiobjective comparison per patient favored the CK plan in all 16 cases. The results clearly demonstrate the superiority of the CK irradiation plans compared to classical linac-SRS with circular collimators and microMLC. In particular, the average minimal target volume dose per patient increased by 1.9 Gy while the conformity index improved by 14%, which appears to be a clinically relevant improvement.
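The abstract compares plans by conformity index without stating which definition was used; the Paddick index is one common choice and serves here as a hypothetical sketch of the metric.

```python
def paddick_ci(target_vol, prescription_vol, overlap_vol):
    """Paddick conformity index: (TV_PIV)^2 / (TV * PIV).

    target_vol: planning target volume (cc)
    prescription_vol: volume enclosed by the prescription isodose (cc)
    overlap_vol: portion of the target covered by that isodose (cc)
    A perfectly conformal plan scores 1.0; lower values are worse.
    Assumed definition; the paper may use a different conformity index.
    """
    return overlap_vol ** 2 / (target_vol * prescription_vol)

# e.g. a 10 cc target, 12 cc prescription isodose, 9.5 cc overlap -> ~0.752
```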
Nagata, Koichi; Pethel, Timothy D
2017-07-01
Although anisotropic analytical algorithm (AAA) and Acuros XB (AXB) are both radiation dose calculation algorithms that take into account the heterogeneity within the radiation field, Acuros XB is inherently more accurate. The purpose of this retrospective method comparison study was to compare them and evaluate the dose discrepancy within the planning target volume (PTV). Radiation therapy (RT) plans of 11 dogs with intranasal tumors treated by radiation therapy at the University of Georgia were evaluated. All dogs were planned for intensity-modulated radiation therapy using nine coplanar X-ray beams that were equally spaced, then dose calculated with anisotropic analytical algorithm. The same plan with the same monitor units was then recalculated using Acuros XB for comparisons. Each dog's planning target volume was separated into air, bone, and tissue and evaluated. The mean dose to the planning target volume estimated by Acuros XB was 1.3% lower. It was 1.4% higher for air, 3.7% lower for bone, and 0.9% lower for tissue. The volume of planning target volume covered by the prescribed dose decreased by 21% when Acuros XB was used due to increased dose heterogeneity within the planning target volume. Anisotropic analytical algorithm relatively underestimates the dose heterogeneity and relatively overestimates the dose to the bone and tissue within the planning target volume for the radiation therapy planning of canine intranasal tumors. This can be clinically significant especially if the tumor cells are present within the bone, because it may result in relative underdosing of the tumor. © 2017 American College of Veterinary Radiology.
NASA Astrophysics Data System (ADS)
Rahman, Abdul Samad Abdul; Noor, Mohd Jamaludin Md; Ahmad, Juhaizad Bin; Sidek, Norbaya
2017-10-01
The concept of effective stress has been the principal concept for characterizing soil volume change behavior in soil mechanics, but the settlement models developed using this concept have been empirical in nature. Moreover, certain soil volume change behaviors cannot be explained using the effective stress concept; one such behavior is inundation settlement. Studies have begun to indicate the inevitable role of shear strength as a critical element to be incorporated in models to unravel these unexplained soil behaviors. One soil volume change model that applies the interaction between effective stress and shear strength is the Rotational Multiple Yield Surface Framework (RMYSF). This model has been developed from soil stress-strain behavior under anisotropic stress conditions. Hence, the RMYSF measures the soil's actual elasto-plastic response to stress rather than assuming it to be fully elastic or plastic, as normally perceived by the industry. The framework measures the increase in mobilized shear strength as the soil undergoes anisotropic settlement.
Cosmological singularity resolution from quantum gravity: The emergent-bouncing universe
NASA Astrophysics Data System (ADS)
Alesci, Emanuele; Botta, Gioele; Cianfrani, Francesco; Liberati, Stefano
2017-08-01
Alternative scenarios to the big bang singularity have been the subject of intense research for several decades by now. Most popular in this sense have been frameworks where such a singularity is replaced by a bounce around some minimal cosmological volume or by some early quantum phase. The latter scenario was devised a long time ago and is referred to as an "emergent universe" (in the sense that our universe emerged from a constant-volume quantum phase). We show here that within an improved framework of canonical quantum gravity (the so-called quantum reduced loop gravity) the Friedmann equations for cosmology are modified in such a way as to replace the big bang singularity with a short bounce preceded by a metastable quantum phase in which the volume of the universe oscillates between a series of local maxima and minima. We call this hybrid scenario an "emergent-bouncing universe," since after a purely oscillating quantum phase the classical Friedmann spacetime emerges. Prospective developments and possible tests of this scenario are discussed at the end.
A behavior-based framework for assessing barrier effects to wildlife from vehicle traffic volume
Sandra L. Jacobson; Leslie L. Bliss-Ketchum; Catherine E. de Rivera; Winston P. Smith; D. P. C. Peters
2016-01-01
Roads, while central to the function of human society, create barriers to animal movement through collisions and habitat fragmentation. Barriers to animal movement affect the evolution and trajectory of populations. Investigators have attempted to use traffic volume, the number of vehicles passing a point on a road segment, to predict effects to wildlife populations...
NASA Astrophysics Data System (ADS)
Shallcross, Gregory; Capecelatro, Jesse
2017-11-01
Compressible particle-laden flows are common in engineering systems. Applications include but are not limited to water injection in high-speed jet flows for noise suppression, rocket-plume surface interactions during planetary landing, and explosions during coal mining operations. Numerically, it is challenging to capture these interactions due to the wide range of length and time scales. Additionally, there are many forms of the multiphase compressible flow equations with volume fraction effects, some of which are conflicting in nature. The purpose of this presentation is to develop the capability to accurately capture particle-shock interactions in systems with a large number of particles from dense to dilute regimes. A thorough derivation of the volume filtered equations is presented. The volume filtered equations are then implemented in a high-order, energy-stable Eulerian-Lagrangian framework. We show this framework is capable of decoupling the fluid mesh from the particle size, enabling arbitrary particle size distributions in the presence of shocks. The proposed method is then assessed against particle-laden shock tube data. Quantities of interest include fluid-phase pressure profiles and particle spreading rates. The effect of collisions in 2D and 3D are also evaluated.
NASA Technical Reports Server (NTRS)
Dezfuli, Homayoon; Benjamin, Allan; Everett, Christopher; Feather, Martin; Rutledge, Peter; Sen, Dev; Youngblood, Robert
2015-01-01
This is the second of two volumes that collectively comprise the NASA System Safety Handbook. Volume 1 (NASA/SP-210-580) was prepared for the purpose of presenting the overall framework for System Safety and for providing the general concepts needed to implement the framework. Volume 2 provides guidance for implementing these concepts as an integral part of systems engineering and risk management. This guidance addresses the following functional areas: 1. The development of objectives that collectively define adequate safety for a system, and the safety requirements derived from these objectives that are levied on the system. 2. The conduct of system safety activities, performed to meet the safety requirements, with specific emphasis on the conduct of integrated safety analysis (ISA) as a fundamental means by which systems engineering and risk management decisions are risk-informed. 3. The development of a risk-informed safety case (RISC) at major milestone reviews to argue that the system safety objectives are satisfied (and therefore that the system is adequately safe). 4. The evaluation of the RISC (including supporting evidence) using a defined set of evaluation criteria, to assess the veracity of the claims made therein in order to support risk acceptance decisions.
Energy efficient industrialized housing research program
DOE Office of Scientific and Technical Information (OSTI.GOV)
Berg, R.; Brown, G.Z.; Finrow, J.
1989-01-01
This is the second volume of a two volume report on energy efficient industrialized housing. Volume II contains support documentation for Volume I. The following items are included: individual trip reports; software bibliography; industry contacts in the US, Denmark, and Japan; Cost comparison of industrialized housing in the US and Denmark; draft of the final report on the systems analysis for Fleetwood Mobile Home Manufacturers. (SM)
System for detecting operating errors in a variable valve timing engine using pressure sensors
Wiles, Matthew A.; Marriot, Craig D
2013-07-02
A method and control module includes a pressure sensor data comparison module that compares measured pressure volume signal segments to ideal pressure volume segments. A valve actuation hardware remedy module performs a hardware remedy in response to comparing the measured pressure volume signal segments to the ideal pressure volume segments when a valve actuation hardware failure is detected.
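The comparison-module logic in this patent abstract can be sketched as a simple tolerance check between measured and ideal pressure-volume signal segments; the function name, relative-error tolerance, and data format below are illustrative assumptions, not details from the patent.

```python
def valve_fault_detected(measured, ideal, tolerance=0.05):
    """Compare a measured pressure-volume signal segment to the ideal
    segment, sample by sample; return True (i.e., trigger the hardware
    remedy) when any sample deviates by more than `tolerance` relative
    error. Both inputs are equal-length sequences of pressure samples.
    """
    for p_meas, p_ideal in zip(measured, ideal):
        if abs(p_meas - p_ideal) > tolerance * abs(p_ideal):
            return True
    return False
```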
What can we learn from international comparisons of health systems and health system reform?
McPake, B.; Mills, A.
2000-01-01
Most commonly, lessons derived from comparisons of international health sector reform can only be generalized in a limited way to similar countries. However, there is little guidance as to what constitutes "similarity" in this respect. We propose that a framework for assessing similarity could be derived from the performance of individual policies in different contexts, and from the cause and effect processes related to the policies. We demonstrate this process by considering research evidence in the "public-private mix", and propose variables for an initial framework that we believe determine private involvement in the public health sector. The most influential model of public leadership places the private role in a contracting framework. Research in countries that have adopted this model suggests an additional list of variables to add to the framework. The variables can be grouped under the headings "demand factors", "supply factors", and "strength of the public sector". These illustrate the nature of a framework that could emerge, and which would help countries aiming to learn from international experience. PMID:10916918
NASA Astrophysics Data System (ADS)
Lane, R. J. L.
2015-12-01
At Geoscience Australia, we are upgrading our gravity and magnetic modeling tools to provide new insights into the composition, properties, and structure of the subsurface. The scale of the investigations varies from the size of tectonic plates to the size of a mineral prospect. To accurately model potential field data at all of these scales, we require modeling software that can operate in both spherical and Cartesian coordinate frameworks. The models are in the form of a mesh, with spherical prismatic (tesseroid) elements for spherical coordinate models of large volumes, and rectangular prisms for smaller volumes evaluated in a Cartesian coordinate framework. The software can compute the forward response of supplied rock property models and can perform inversions using constraints that vary from weak generic smoothness through to very specific reference models compiled from various types of "hard facts" (i.e., surface mapping, drilling information, crustal seismic interpretations). To operate efficiently, the software is being specifically developed to make use of the resources of the National Computational Infrastructure (NCI) at the Australian National University (ANU). The development of these tools is being carried out in collaboration with researchers from the Colorado School of Mines (CSM) and the China University of Geosciences (CUG) and is at the stage of advanced testing. The creation of individual 3D geological models will provide immediate insights. Users will also be able to combine models, either by stitching them together or by nesting smaller and more detailed models within a larger model. Comparison of the potential field response of a composite model with the observed fields will give users a sense of how comprehensively these models account for the observations. Users will also be able to model the residual fields (i.e., the observed minus calculated response) to discover features that are not represented in the input composite model.
Toward a dose reduction strategy using model-based reconstruction with limited-angle tomosynthesis
NASA Astrophysics Data System (ADS)
Haneda, Eri; Tkaczyk, J. E.; Palma, Giovanni; Iordache, Rǎzvan; Zelakiewicz, Scott; Muller, Serge; De Man, Bruno
2014-03-01
Model-based iterative reconstruction (MBIR) is an emerging technique for several imaging modalities and applications including medical CT, security CT, PET, and microscopy. Its success derives from an ability to preserve image resolution and perceived diagnostic quality under impressively reduced signal level. MBIR typically uses a cost optimization framework that models system geometry, photon statistics, and prior knowledge of the reconstructed volume. The challenge of tomosynthetic geometries is that the inverse problem becomes more ill-posed due to the limited angles, meaning the volumetric image solution is not uniquely determined by the incompletely sampled projection data. Furthermore, low signal level conditions introduce additional challenges due to noise. A fundamental strength of MBIR for limited-views and limited-angle is that it provides a framework for constraining the solution consistent with prior knowledge of expected image characteristics. In this study, we analyze through simulation the capability of MBIR with respect to prior modeling components for limited-views, limited-angle digital breast tomosynthesis (DBT) under low dose conditions. A comparison to ground truth phantoms shows that MBIR with regularization achieves a higher level of fidelity and lower level of blurring and streaking artifacts compared to other state of the art iterative reconstructions, especially for high contrast objects. The benefit of contrast preservation along with less artifacts may lead to detectability improvement of microcalcification for more accurate cancer diagnosis.
Discreteness-induced concentration inversion in mesoscopic chemical systems.
Ramaswamy, Rajesh; González-Segredo, Nélido; Sbalzarini, Ivo F; Grima, Ramon
2012-04-10
Molecular discreteness is apparent in small-volume chemical systems, such as biological cells, leading to stochastic kinetics. Here we present a theoretical framework to understand the effects of discreteness on the steady state of a monostable chemical reaction network. We consider independent realizations of the same chemical system in compartments of different volumes. Rate equations ignore molecular discreteness and predict the same average steady-state concentrations in all compartments. However, our theory predicts that the average steady state of the system varies with volume: if a species is more abundant than another for large volumes, then the reverse occurs for volumes below a critical value, leading to a concentration inversion effect. The addition of extrinsic noise increases the size of the critical volume. We theoretically predict the critical volumes and verify, by exact stochastic simulations, that rate equations are qualitatively incorrect in sub-critical volumes.
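A minimal Gillespie stochastic simulation illustrates how compartment volume enters the kinetics via the propensities. The single-species toy network below (production plus bimolecular decay) is an assumption for illustration only; the paper's concentration-inversion effect arises in multi-species networks and is not reproduced by this sketch.

```python
import random

def gillespie_mean_concentration(omega, k1=1.0, k2=1.0, steps=20000, seed=1):
    """Time-averaged steady-state concentration n/omega for the toy network
        0 -> A        (propensity k1 * omega)
        A + A -> 0    (propensity (k2 / omega) * n * (n - 1))
    in a compartment of volume omega, via the Gillespie SSA.
    """
    rng = random.Random(seed)
    n, t, acc = 0, 0.0, 0.0
    for _ in range(steps):
        a1 = k1 * omega                  # production propensity
        a2 = (k2 / omega) * n * (n - 1)  # bimolecular decay propensity
        a0 = a1 + a2
        dt = rng.expovariate(a0)         # exponential waiting time
        acc += n * dt                    # accumulate time-weighted copy number
        t += dt
        if rng.random() < a1 / a0:
            n += 1
        else:
            n -= 2
    return acc / (t * omega)
```

The deterministic rate equation for this network predicts a steady concentration of sqrt(k1 / (2 * k2)), independent of volume; running the simulation at small omega shows the volume-dependent deviations that rate equations miss.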
Sloped terrain segmentation for autonomous drive using sparse 3D point cloud.
Cho, Seoungjae; Kim, Jonghyun; Ikram, Warda; Cho, Kyungeun; Jeong, Young-Sik; Um, Kyhyun; Sim, Sungdae
2014-01-01
A ubiquitous environment for road travel that uses wireless networks requires the minimization of data exchange between vehicles. An algorithm that can segment the ground in real time is necessary to obtain location data between vehicles simultaneously executing autonomous drive. This paper proposes a framework for segmenting the ground in real time using a sparse three-dimensional (3D) point cloud acquired from undulating terrain. A sparse 3D point cloud can be acquired by scanning the geography using light detection and ranging (LiDAR) sensors. For efficient ground segmentation, 3D point clouds are quantized in units of volume pixels (voxels) and overlapping data is eliminated. We reduce nonoverlapping voxels to two dimensions by implementing a lowermost heightmap. The ground area is determined on the basis of the number of voxels in each voxel group. We execute ground segmentation in real time by proposing an approach to minimize the comparison between neighboring voxels. Furthermore, we experimentally verify that ground segmentation can be executed at about 19.31 ms per frame.
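The voxel-quantization and lowermost-heightmap steps described above can be sketched as follows; the voxel size, coordinate conventions, and return format are illustrative assumptions, and the paper's pipeline adds voxel-group sizing and neighbor-minimized comparisons on top of this reduction.

```python
import math

def lowermost_heightmap(points, voxel_size=0.5):
    """Quantize a 3D point cloud into voxels, discard duplicate voxels, and
    keep only the lowermost height per (x, y) column -- the 2D reduction
    step used before ground segmentation.
    points: iterable of (x, y, z) coordinates in meters.
    Returns {(ix, iy): lowest z voxel index}.
    """
    heightmap = {}
    for x, y, z in points:
        ix, iy, iz = (int(math.floor(c / voxel_size)) for c in (x, y, z))
        key = (ix, iy)
        if key not in heightmap or iz < heightmap[key]:
            heightmap[key] = iz
    return heightmap
```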
UCSC genome browser: deep support for molecular biomedical research.
Mangan, Mary E; Williams, Jennifer M; Lathe, Scott M; Karolchik, Donna; Lathe, Warren C
2008-01-01
The volume and complexity of genomic sequence data, and the additional experimental data required for annotation of the genomic context, pose a major challenge for display and access for biomedical researchers. Genome browsers organize this data and make it available in various ways to extract useful information to advance research projects. The UCSC Genome Browser is one of these resources. The official sequence data for a given species forms the framework to display many other types of data such as expression, variation, cross-species comparisons, and more. Visual representations of the data are available for exploration. Data can be queried with sequences. Complex database queries are also easily achieved with the Table Browser interface. Associated tools permit additional query types or access to additional data sources such as images of in situ localizations. Support for solving researchers' issues is provided with active discussion mailing lists and by providing updated training materials. The UCSC Genome Browser provides a source of deep support for a wide range of biomedical molecular research (http://genome.ucsc.edu).
ERIC Educational Resources Information Center
King, Donald W.; Boyce, Peter B.; Montgomery, Carol Hansen; Tenopir, Carol
2003-01-01
Focuses on library economic metrics, and presents a conceptual framework for library economic metrics including service input and output, performance, usage, effectiveness, outcomes, impact, and cost and benefit comparisons. Gives examples of these measures for comparison of library electronic and print collections and collection services.…
A Comparison of Regional and SiteSpecific Volume Estimation Equations
Joe P. McClure; Jana Anderson; Hans T. Schreuder
1987-01-01
Regression equations for volume by region and site class were examined for loblolly pine. The regressions for the Coastal Plain and Piedmont regions had significantly different slopes. The results showed important practical differences in the percentage of confidence intervals containing the true total volume and in the percentage of estimates within a specific proportion of...
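Fitting a regional volume equation and comparing slopes can be sketched with ordinary least squares. The combined-variable model form volume = b0 + b1·(D²H) assumed below is a classic choice for pine volume equations but is not confirmed by the abstract.

```python
import numpy as np

def fit_volume_line(d2h, volume):
    """Least-squares fit of volume = b0 + b1 * (D^2 * H), the classic
    combined-variable tree volume equation (assumed model form).
    d2h: array of D^2 * H values (diameter squared times height)
    volume: array of measured stem volumes
    Returns [intercept, slope]; fit once per region, then compare slopes.
    """
    X = np.column_stack([np.ones_like(d2h), d2h])
    coef, *_ = np.linalg.lstsq(X, volume, rcond=None)
    return coef
```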
Digging Deeper Using Neuroimaging Tools Reveals Important Clues to Early-Onset Schizophrenia
ERIC Educational Resources Information Center
Kumra, Sanjiv
2008-01-01
The article describes the use of structural neuroimaging to understand the psychopathology of childhood-onset schizophrenia. Results showed an increase in lateral volumes, reduced total and regional volumes of gray matter in the cortex and increased basal ganglia volumes as in adult-onset schizophrenia in comparison with healthy subjects.
ERIC Educational Resources Information Center
Stringer, William L.; Cunningham, Alisa F.
This report contains a conceptual framework for analyzing costs and prices by evaluating the higher education production function and the determinants of both prices and costs. The framework can be used to strengthen understanding of costs and prices within individual institutions and to inform macro level investments at state and national levels.…
Bremner, J. Douglas; Randall, Penny; Scott, Tammy M.; Bronen, Richard A.; Seibyl, John P.; Southwick, Steven M.; Delaney, Richard C.; McCarthy, Gregory; Charney, Dennis S.; Innis, Robert B.
2011-01-01
Objective Studies in nonhuman primates suggest that high levels of cortisol associated with stress have neurotoxic effects on the hippocampus, a brain structure involved in memory. The authors previously showed that patients with combat-related posttraumatic stress disorder (PTSD) had deficits in short-term memory. The purpose of this study was to compare the hippocampal volume of patients with PTSD to that of subjects without psychiatric disorder. Method Magnetic resonance imaging was used to measure the volume of the hippocampus in 26 Vietnam combat veterans with PTSD and 22 comparison subjects selected to be similar to the patients in age, sex, race, years of education, socioeconomic status, body size, and years of alcohol abuse. Results The PTSD patients had a statistically significant 8% smaller right hippocampal volume relative to that of the comparison subjects, but there was no difference in the volume of other brain regions (caudate and temporal lobe). Deficits in short-term verbal memory as measured with the Wechsler Memory Scale were associated with smaller right hippocampal volume in the PTSD patients only. Conclusions These findings are consistent with a smaller right hippocampal volume in PTSD that is associated with functional deficits in verbal memory. PMID:7793467
NASA Astrophysics Data System (ADS)
Zhang, Hao; Gang, Grace J.; Lee, Junghoon; Wong, John; Stayman, J. Webster
2017-03-01
Purpose: There are many clinical situations where diagnostic CT is used for an initial diagnosis or treatment planning, followed by one or more CBCT scans that are part of an image-guided intervention. Because the high-quality diagnostic CT scan is a rich source of patient-specific anatomical knowledge, this provides an opportunity to incorporate the prior CT image into subsequent CBCT reconstruction for improved image quality. We propose a penalized-likelihood method called reconstruction of difference (RoD), to directly reconstruct differences between the CBCT scan and the CT prior. In this work, we demonstrate the efficacy of RoD with clinical patient datasets. Methods: We introduce a data processing workflow using the RoD framework to reconstruct anatomical changes between the prior CT and current CBCT. This workflow includes processing steps to account for non-anatomical differences between the two scans including 1) scatter correction for CBCT datasets due to increased scatter fractions in CBCT data; 2) histogram matching for attenuation variations between CT and CBCT; and 3) registration for different patient positioning. CBCT projection data and CT planning volumes for two radiotherapy patients - one abdominal study and one head-and-neck study - were investigated. Results: In comparisons between the proposed RoD framework and more traditional FDK and penalized-likelihood reconstructions, we find a significant improvement in image quality when prior CT information is incorporated into the reconstruction. RoD is able to provide additional low-contrast details while correctly incorporating actual physical changes in patient anatomy. Conclusions: The proposed framework provides an opportunity to either improve image quality or relax data fidelity constraints for CBCT imaging when prior CT studies of the same patient are available. Possible clinical targets include CBCT image-guided radiotherapy and CBCT image-guided surgeries.
Ferreira-Pêgo, Cíntia; Nissensohn, Mariela; Kavouras, Stavros A.; Babio, Nancy; Serra-Majem, Lluís; Martín Águila, Adys; Mauromoustakos, Andy; Álvarez Pérez, Jacqueline; Salas-Salvadó, Jordi
2016-01-01
We assess the repeatability and relative validity of a Spanish beverage intake questionnaire for assessing water intake from beverages. The present analysis was performed within the framework of the PREDIMED-PLUS trial. The study participants were adults (aged 55–75) with a BMI ≥27 and <40 kg/m2 and at least three components of metabolic syndrome (MetS). A trained dietitian completed the questionnaire. Participants provided 24-h urine samples, and the urine volume and osmolality were recorded. The repeatability of the baseline measurement at 6 months and 1 year was examined by paired Student’s t-test comparisons. A total of 160 participants were included in the analysis. The Bland–Altman analysis showed relatively good agreement between total daily fluid intake assessed using the fluid-specific questionnaire and 24-h urine osmolality and volume, with parameter estimates of −0.65 and 0.22, respectively (R2 = 0.20; p < 0.001). In the repeatability test, no significant differences from baseline were found in either individual beverage types or total daily fluid intake at the 6-month and 1-year assessments. The proposed fluid-specific assessment questionnaire, designed to assess the consumption of water and other beverages in Spanish adults, was found to be relatively valid with good repeatability. PMID:27483318
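The Bland–Altman analysis used above reduces, in its simplest form, to the mean difference between two paired measurement series (the bias) and the 95% limits of agreement around it. A minimal sketch (function name illustrative; the paper's regression-based parameter estimates are not reproduced here):

```python
from statistics import mean, stdev

def bland_altman(method_a, method_b):
    """Bland-Altman agreement statistics for paired measurements:
    returns the mean difference (bias) and the 95% limits of agreement,
    bias +/- 1.96 * SD of the differences."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    bias = mean(diffs)
    sd = stdev(diffs)  # sample standard deviation
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)
```

The full method also plots each pair's difference against its mean to check for proportional bias, which the scalar summary above does not capture.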
Orbital flight test shuttle external tank aerothermal flight evaluation, volume 1
NASA Technical Reports Server (NTRS)
Praharaj, Sarat C.; Engel, Carl D.; Warmbrod, John D.
1986-01-01
This 3-volume report discusses the evaluation of aerothermal flight measurements made on the orbital flight test Space Shuttle External Tanks (ETs). Six ETs were instrumented to measure various quantities during flight, including heat transfer, pressure, and structural temperature. The flight data were reduced and analyzed against math models established from an extensive wind tunnel database and empirical heat-transfer relationships. This analysis has supported the validity of the current aeroheating methodology and existing database, and has also identified some problem areas which require methodology modifications. This is Volume 1, an Executive Summary. Volume 2 contains Appendices A (Aerothermal Comparisons) and B (Flight-Derived h sub i/h sub u vs. M sub inf. Plots), and Volume 3 contains Appendix C (Comparison of Interference Factors among OFT Flight, Prediction and 1H-97A Data), Appendix D (Freestream Stanton Number and Reynolds Number Correlation for Flight and Tunnel Data), and Appendix E (Flight-Derived h sub i/h sub u Tables).
Large CSF Volume Not Attributable to Ventricular Volume in Schizotypal Personality Disorder
Dickey, Chandlee C.; Shenton, Martha E.; Hirayasu, Yoshio; Fischer, Iris; Voglmaier, Martina M.; Niznikiewicz, Margaret A.; Seidman, Larry J.; Fraone, Stephanie; McCarley, Robert W.
2010-01-01
Objective The purpose of this study was to determine whether schizotypal personality disorder, which has the same genetic diathesis as schizophrenia, manifests abnormalities in whole-brain and CSF volumes. Method Sixteen right-handed and neuroleptic-naive men with schizotypal personality disorder were recruited from the community and were age-matched to 14 healthy comparison subjects. Magnetic resonance images were obtained from the subjects and automatically parcellated into CSF, gray matter, and white matter. Subsequent manual editing separated cortical from noncortical gray matter. Lateral ventricles and temporal horns were also delineated. Results The men with schizotypal personality disorder had larger CSF volumes than the comparison subjects; the difference was not attributable to larger lateral ventricles. The cortical gray matter was somewhat smaller in the men with schizotypal personality disorder, but the difference was not statistically significant. Conclusions Consistent with many studies of schizophrenia, this examination of schizotypal personality disorder indicated abnormalities in brain CSF volumes. PMID:10618012
Orbital flight test shuttle external tank aerothermal flight evaluation, volume 3
NASA Technical Reports Server (NTRS)
Praharaj, Sarat C.; Engel, Carl D.; Warmbrod, John D.
1986-01-01
This 3-volume report discusses the evaluation of aerothermal flight measurements made on the orbital flight test Space Shuttle External Tanks (ETs). Six ETs were instrumented to measure various quantities during flight, including heat transfer, pressure, and structural temperature. The flight data were reduced and analyzed against math models established from an extensive wind tunnel database and empirical heat-transfer relationships. This analysis has supported the validity of the current aeroheating methodology and existing database, and has also identified some problem areas which require methodology modifications. Volume 1 is the Executive Summary. Volume 2 contains Appendix A (Aerothermal Comparisons) and Appendix B (Flight-Derived h sub i/h sub u vs. M sub inf. Plots). This is Volume 3, containing Appendix C (Comparison of Interference Factors among OFT Flight, Prediction and 1H-97A Data), Appendix D (Freestream Stanton Number and Reynolds Number Correlation for Flight and Tunnel Data), and Appendix E (Flight-Derived h sub i/h sub u Tables).
Orbital flight test shuttle external tank aerothermal flight evaluation, volume 2
NASA Technical Reports Server (NTRS)
Praharaj, Sarat C.; Engel, Carl D.; Warmbrod, John D.
1986-01-01
This 3-volume report discusses the evaluation of aerothermal flight measurements made on the orbital flight test Space Shuttle External Tanks (ETs). Six ETs were instrumented to measure various quantities during flight, including heat transfer, pressure, and structural temperature. The flight data were reduced and analyzed against math models established from an extensive wind tunnel database and empirical heat-transfer relationships. This analysis has supported the validity of the current aeroheating methodology and existing database, and has also identified some problem areas which require methodology modifications. Volume 1 is the Executive Summary. This is Volume 2, containing Appendix A (Aerothermal Comparisons) and Appendix B (Flight-Derived h sub i/h sub u vs. M sub inf. Plots). Volume 3 contains Appendix C (Comparison of Interference Factors among OFT Flight, Prediction and 1H-97A Data), Appendix D (Freestream Stanton Number and Reynolds Number Correlation for Flight and Tunnel Data), and Appendix E (Flight-Derived h sub i/h sub u Tables).
Reliably Modeling the Mechanical Stability of Rigid and Flexible Metal–Organic Frameworks
2017-01-01
Conspectus Over the past two decades, metal–organic frameworks (MOFs) have matured from interesting academic peculiarities toward a continuously expanding class of hybrid, nanoporous materials tuned for targeted technological applications such as gas storage and heterogeneous catalysis. These oft-times crystalline materials, composed of inorganic moieties interconnected by organic ligands, can be endowed with desired structural and chemical features by judiciously functionalizing or substituting these building blocks. As a result of this reticular synthesis, MOF research is situated at the intriguing intersection between chemistry and physics, and the building block approach could pave the way toward the construction of an almost infinite number of possible crystalline structures, provided that they exhibit stability under the desired operational conditions. However, this enormous potential is largely untapped to date, as MOFs have not yet found a major breakthrough in technological applications. One of the remaining challenges for this scale-up is the densification of MOF powders, which is generally achieved by subjecting the material to a pressurization step. However, application of an external pressure may substantially alter the chemical and physical properties of the material. Reliable theoretical guidance that can presynthetically identify the most stable materials could help overcome this technological challenge. In this Account, we describe recent research progress on computational characterization of the mechanical stability of MOFs. So far, three complementary approaches have been proposed, focusing on different aspects of mechanical stability: (i) the Born stability criteria, (ii) the anisotropy in mechanical moduli such as the Young and shear moduli, and (iii) the pressure-versus-volume equations of state. As these three methods are grounded in distinct computational approaches, it is expected that their accuracy and efficiency will vary.
To date, however, it is unclear which set of properties is suited and reliable for a given application, as a comprehensive comparison for a broad variety of MOFs is absent, impeding the widespread use of these theoretical frameworks. Herein, we fill this gap by critically assessing the performance of the three computational models on a broad set of MOFs that are representative of current applications. These materials encompass the mechanically rigid UiO-66(Zr) and MOF-5(Zn) as well as the flexible MIL-47(V) and MIL-53(Al), which undergo pressure-induced phase transitions. It is observed that the Born stability criteria and pressure-versus-volume equations of state give complementary insight into the macroscopic and microscopic origins of instability, respectively. However, interpretation of the Born stability criteria becomes increasingly difficult when less symmetric materials are considered. Moreover, pressure fluctuations during the simulations hamper their accuracy for flexible materials. In contrast, the pressure-versus-volume equations of state are determined in a thermodynamic ensemble specifically targeted to mitigate the effects of these instantaneous fluctuations, yielding more accurate results. The critical Account presented here paves the way toward a solid computational framework for an extensive presynthetic screening of MOFs to select those that are mechanically stable and can be postsynthetically densified before their use in targeted applications. PMID:29155552
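The Born stability criteria mentioned under (i) amount to requiring that the crystal's 6x6 elastic stiffness matrix (Voigt notation) be positive definite; for cubic crystals this collapses to three simple inequalities. A generic sketch follows; the elastic constants used in the example are illustrative placeholders, not values from the Account:

```python
def is_born_stable(C):
    """Generic Born stability check: a crystal is mechanically stable iff
    its 6x6 stiffness matrix C (Voigt notation) is positive definite.
    Tested via an attempted Cholesky decomposition (pure Python)."""
    n = len(C)
    L = [[0.0] * n for _ in range(n)]
    for i in range(n):
        for j in range(i + 1):
            s = sum(L[i][k] * L[j][k] for k in range(j))
            if i == j:
                d = C[i][i] - s
                if d <= 0:            # not positive definite -> unstable
                    return False
                L[i][j] = d ** 0.5
            else:
                L[i][j] = (C[i][j] - s) / L[j][j]
    return True

def is_cubic_stable(c11, c12, c44):
    """Specialised Born criteria for cubic crystals."""
    return c11 - c12 > 0 and c11 + 2 * c12 > 0 and c44 > 0
```

For a cubic stiffness matrix the two checks agree, since its eigenvalues are exactly c11 + 2*c12, c11 - c12 (twice), and c44 (three times).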
Reliably Modeling the Mechanical Stability of Rigid and Flexible Metal-Organic Frameworks.
Rogge, Sven M J; Waroquier, Michel; Van Speybroeck, Veronique
2018-01-16
Over the past two decades, metal-organic frameworks (MOFs) have matured from interesting academic peculiarities toward a continuously expanding class of hybrid, nanoporous materials tuned for targeted technological applications such as gas storage and heterogeneous catalysis. These oft-times crystalline materials, composed of inorganic moieties interconnected by organic ligands, can be endowed with desired structural and chemical features by judiciously functionalizing or substituting these building blocks. As a result of this reticular synthesis, MOF research is situated at the intriguing intersection between chemistry and physics, and the building block approach could pave the way toward the construction of an almost infinite number of possible crystalline structures, provided that they exhibit stability under the desired operational conditions. However, this enormous potential is largely untapped to date, as MOFs have not yet found a major breakthrough in technological applications. One of the remaining challenges for this scale-up is the densification of MOF powders, which is generally achieved by subjecting the material to a pressurization step. However, application of an external pressure may substantially alter the chemical and physical properties of the material. Reliable theoretical guidance that can presynthetically identify the most stable materials could help overcome this technological challenge. In this Account, we describe recent research progress on computational characterization of the mechanical stability of MOFs. So far, three complementary approaches have been proposed, focusing on different aspects of mechanical stability: (i) the Born stability criteria, (ii) the anisotropy in mechanical moduli such as the Young and shear moduli, and (iii) the pressure-versus-volume equations of state. As these three methods are grounded in distinct computational approaches, it is expected that their accuracy and efficiency will vary.
To date, however, it is unclear which set of properties is suited and reliable for a given application, as a comprehensive comparison for a broad variety of MOFs is absent, impeding the widespread use of these theoretical frameworks. Herein, we fill this gap by critically assessing the performance of the three computational models on a broad set of MOFs that are representative of current applications. These materials encompass the mechanically rigid UiO-66(Zr) and MOF-5(Zn) as well as the flexible MIL-47(V) and MIL-53(Al), which undergo pressure-induced phase transitions. It is observed that the Born stability criteria and pressure-versus-volume equations of state give complementary insight into the macroscopic and microscopic origins of instability, respectively. However, interpretation of the Born stability criteria becomes increasingly difficult when less symmetric materials are considered. Moreover, pressure fluctuations during the simulations hamper their accuracy for flexible materials. In contrast, the pressure-versus-volume equations of state are determined in a thermodynamic ensemble specifically targeted to mitigate the effects of these instantaneous fluctuations, yielding more accurate results. The critical Account presented here paves the way toward a solid computational framework for an extensive presynthetic screening of MOFs to select those that are mechanically stable and can be postsynthetically densified before their use in targeted applications.
2014-01-09
workloads were determined by matching heart rate responses from each LBNP level. Heart rate and stroke volume (SV) were measured via Finometer. ECG, heat...learning algorithm for the assessment of central blood volume via pulse pressure [a noninvasive surrogate of stroke volume (SV)]. These data... [Figure axis residue removed; recoverable caption: "Fig. 3. Comparison of average stroke volume (SV) derived from the...", with series labeled Actual (Finometer) and Predicted (Algorithm).]
ERIC Educational Resources Information Center
Commission of the European Communities, Brussels (Belgium).
This report, the first volume in a three volume set, summarizes the results of a study performed by the DELTA (Developing European Learning through Technological Advance) Unit in parallel with the projects underway in the research and development Exploratory Action. The report identifies the key issues, associated requirements and options, and…
Automatic learning-based beam angle selection for thoracic IMRT
DOE Office of Scientific and Technical Information (OSTI.GOV)
Amit, Guy; Marshall, Andrea; Purdie, Thomas G., E-mail: tom.purdie@rmp.uhn.ca
Purpose: The treatment of thoracic cancer using external beam radiation requires an optimal selection of the radiation beam directions to ensure effective coverage of the target volume and to avoid unnecessary treatment of normal healthy tissues. Intensity modulated radiation therapy (IMRT) planning is a lengthy process, which requires the planner to iterate between choosing beam angles, specifying dose–volume objectives and executing IMRT optimization. In thorax treatment planning, where there are no class solutions for beam placement, beam angle selection is performed manually, based on the planner’s clinical experience. The purpose of this work is to propose and study a computationally efficient framework that utilizes machine learning to automatically select treatment beam angles. Such a framework may be helpful for reducing the overall planning workload. Methods: The authors introduce an automated beam selection method, based on learning the relationships between beam angles and anatomical features. Using a large set of clinically approved IMRT plans, a random forest regression algorithm is trained to map a multitude of anatomical features into an individual beam score. An optimization scheme is then built to select and adjust the beam angles, considering the learned interbeam dependencies. The validity and quality of the automatically selected beams were evaluated using the manually selected beams from the corresponding clinical plans as the ground truth. Results: The analysis included 149 clinically approved thoracic IMRT plans. For a randomly selected test subset of 27 plans, IMRT plans were generated using automatically selected beams and compared to the clinical plans. The comparison of the predicted and the clinical beam angles demonstrated a good average correspondence between the two (angular distance 16.8° ± 10°, correlation 0.75 ± 0.2).
The dose distributions of the semiautomatic and clinical plans were equivalent in terms of primary target volume coverage and organ-at-risk sparing, and were superior to plans produced with fixed sets of common beam angles. The great majority of the automatic plans (93%) were approved as clinically acceptable by three radiation therapy specialists. Conclusions: The results demonstrated the feasibility of utilizing a learning-based approach for automatic selection of beam angles in thoracic IMRT planning. The proposed method may assist in reducing the manual planning workload, while sustaining plan quality.
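The selection stage described above, scoring candidate angles and then choosing a set subject to interbeam dependencies, can be caricatured with a greedy rule. This is a simplified sketch, not the authors' optimization scheme: the learned random-forest score is stubbed out as a plain angle-to-score mapping, and the interbeam dependency is reduced to a minimum angular separation:

```python
def angular_distance(a, b):
    """Smallest separation between two gantry angles, in degrees."""
    d = abs(a - b) % 360
    return min(d, 360 - d)

def select_beams(scores, n_beams, min_sep=20.0):
    """Greedy beam selection: repeatedly take the highest-scoring candidate
    angle that stays at least `min_sep` degrees away from beams already
    chosen.  `scores` maps candidate angle (deg) -> predicted beam score
    (in the paper, the output of the trained random-forest regressor)."""
    chosen = []
    for angle in sorted(scores, key=scores.get, reverse=True):
        if all(angular_distance(angle, c) >= min_sep for c in chosen):
            chosen.append(angle)
        if len(chosen) == n_beams:
            break
    return sorted(chosen)
```

The actual framework adjusts angles jointly rather than greedily, but the sketch shows how per-beam scores and a pairwise constraint interact.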
Modeling the Earth System, volume 3
NASA Technical Reports Server (NTRS)
Ojima, Dennis (Editor)
1992-01-01
The topics covered fall under the following headings: critical gaps in the Earth system conceptual framework; development needs for simplified models; and validating Earth system models and their subcomponents.
ERIC Educational Resources Information Center
Markley, O. W.
This is the second volume of a report on a study that (1) investigated the "normative structure" (the governance system) of knowledge production and utilization (KPU) activities in education, (2) developed an analytical framework through which to understand how formal policy acts as a "regulator" of activities in KPU, (3) described the major…
NASA Astrophysics Data System (ADS)
Lee, K. David; Wiesenfeld, Eric; Gelfand, Andrew
2007-04-01
One of the greatest challenges in modern combat is maintaining a high level of timely Situational Awareness (SA). In many situations, computational complexity and accuracy considerations make the development and deployment of real-time, high-level inference tools very difficult. An innovative hybrid framework that combines Bayesian inference, in the form of Bayesian Networks, and Possibility Theory, in the form of Fuzzy Logic systems, has recently been introduced to provide a rigorous framework for high-level inference. Previous research developed the theoretical basis and benefits of the hybrid approach; what has been lacking is a concrete experimental comparison of the hybrid framework with traditional fusion methods to demonstrate and quantify this benefit. The goal of this research, therefore, is to provide a statistical comparison of the accuracy and performance of the hybrid framework with pure Bayesian and Fuzzy systems and with an inexact Bayesian system approximated using Particle Filtering. To accomplish this task, domain-specific models will be developed under these different theoretical approaches and then evaluated, via Monte Carlo simulation, against situational ground truth to measure accuracy and fidelity. Following this, a rigorous statistical analysis of the performance results will be performed to quantify the benefit of hybrid inference over other fusion tools.
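The planned evaluation, running each model against situational ground truth in Monte Carlo simulation, has a simple generic skeleton. The sketch below is a hedged illustration under broad assumptions: the classifier and ground-truth-generator interfaces are invented for this example and do not come from the paper:

```python
import random

def monte_carlo_accuracy(classifier, truth_model, n_trials=10000, seed=0):
    """Monte Carlo accuracy estimate: sample situations from a ground-truth
    generator and count how often the classifier's output matches the true
    label.  `truth_model(rng)` returns an (observation, true_label) pair;
    `classifier(observation)` returns a predicted label."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(n_trials):
        obs, label = truth_model(rng)
        hits += classifier(obs) == label
    return hits / n_trials
```

Each competing model (Bayesian, fuzzy, hybrid, particle-filter approximation) would be wrapped as a `classifier` and scored against the same simulated ground truth.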
Growth of nanostructures with controlled diameter
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pfefferle, Lisa; Haller, Gary; Ciuparu, Dragos
2009-02-03
Transition metal-substituted MCM-41 framework structures with a high degree of structural order and a narrow pore diameter distribution were reproducibly synthesized by a hydrothermal method using a surfactant and an anti-foaming agent. The pore size and the mesoporous volume depend linearly on the surfactant chain length. The transition metals, such as cobalt, are incorporated substitutionally and highly dispersed in the silica framework. Single wall carbon nanotubes with a narrow diameter distribution that correlates with the pore diameter of the catalytic framework structure were prepared by a Boudouard reaction. Nanostructures with a specified diameter or cross-sectional area can therefore be predictably prepared by selecting a suitable pore size of the framework structure.
Dostál, P; Senkeřík, M; Pařízková, R; Bareš, D; Zivný, P; Zivná, H; Cerný, V
2010-01-01
Hypothermia was shown to attenuate ventilator-induced lung injury due to large tidal volumes. It is unclear if the protective effect of hypothermia is maintained under less injurious mechanical ventilation in animals without previous lung injury. Tracheostomized rats were randomly allocated to a non-ventilated group (group C) or to ventilated groups maintained at normothermia (group N) or mild hypothermia (group H). After two hours of mechanical ventilation (inspiratory fraction of oxygen 1.0, respiratory rate 60 min(-1), tidal volume 10 ml x kg(-1), positive end-expiratory pressure (PEEP) 2 cm H2O), or immediately after tracheostomy in the non-ventilated animals, inspiratory pressures were recorded, the rats were sacrificed, the pressure-volume (PV) curve of the respiratory system was constructed, and bronchoalveolar lavage (BAL) fluid and aortic blood samples were obtained. Group N animals exhibited a higher rise in peak inspiratory pressures than group H animals. A rightward shift of the PV curve and higher total protein and interleukin-6 levels in BAL fluid were observed in the normothermia animals in comparison with the hypothermia animals and non-ventilated controls. Tumor necrosis factor-alpha was lower in the hypothermia group in comparison with the normothermia and non-ventilated groups. Mild hypothermia attenuated changes in respiratory system mechanics and modified cytokine concentrations in bronchoalveolar lavage fluid during low lung volume ventilation in animals without previous lung injury.
Hinaman, Kurt
2005-01-01
The Powder River Basin in Wyoming and Montana is an important source of energy resources for the United States. Coalbed methane gas is contained in Tertiary and upper Cretaceous hydrogeologic units in the Powder River Basin. This gas is released when water pressure in coalbeds is lowered, usually by pumping ground water. Issues related to disposal and uses of by-product water from coalbed methane production have developed, in part, due to uncertainties in hydrologic properties. One hydrologic property of primary interest is the amount of water contained in Tertiary and upper Cretaceous hydrogeologic units in the Powder River Basin. The U.S. Geological Survey, in cooperation with the Bureau of Land Management, conducted a study to describe the hydrogeologic framework and to estimate ground-water volumes in different facies of Tertiary and upper Cretaceous hydrogeologic units in the Powder River Basin in Wyoming. A geographic information system was used to compile and utilize hydrogeologic maps, to describe the hydrogeologic framework, and to estimate the volume of ground water in Tertiary and upper Cretaceous hydrogeologic units in the Powder River structural basin in Wyoming. Maps of the altitudes of potentiometric surfaces, altitudes of the tops and bottoms of hydrogeologic units, thicknesses of hydrogeologic units, percent sand of hydrogeologic units, and outcrop boundaries for the following hydrogeologic units were used: Tongue River-Wasatch aquifer, Lebo confining unit, Tullock aquifer, Upper Hell Creek confining unit, and the Fox Hills-Lower Hell Creek aquifer. Literature porosity values of 30 percent for sand and 35 percent for non-sand facies were used to calculate the volume of total ground water in each hydrogeologic unit. 
Literature specific yield values of 26 percent for sand and 10 percent for non-sand facies, and literature specific storage values of 0.0001 ft^-1 (per foot) for sand facies and 0.00001 ft^-1 for non-sand facies, were used to calculate a second volume of ground water for each hydrogeologic unit. Significant-figure considerations limited estimates of ground-water volumes to two significant digits. A total ground-water volume of 2.0 x 10^14 ft3 (cubic feet) was calculated using porosity values, and a total ground-water volume of 3.6 x 10^13 ft3 was calculated using specific yield and specific storage values. These results are consistent with retention properties, which would have some of the total water being retained in the sediments. Sensitivity analysis shows that the estimates of ground-water volume are most sensitive to porosity. The estimates also are sensitive to confined thickness and saturated thickness. Better spatial information for hydrogeologic units could help refine the ground-water volume estimates.
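The report's total-volume arithmetic is a facies-weighted porosity applied to the saturated bulk volume of each unit. The sketch below illustrates that calculation; the function name and example inputs are illustrative, while the 30 and 35 percent porosities are the literature values cited above:

```python
def groundwater_volume(area_ft2, thickness_ft, sand_fraction,
                       sand_porosity=0.30, nonsand_porosity=0.35):
    """Total ground-water volume (ft^3) in one hydrogeologic unit:
    saturated bulk volume times a facies-weighted porosity."""
    bulk = area_ft2 * thickness_ft
    weighted_porosity = (sand_fraction * sand_porosity
                         + (1.0 - sand_fraction) * nonsand_porosity)
    return bulk * weighted_porosity
```

The second estimate in the report follows the same pattern with specific yield (unconfined) and specific storage times saturated thickness (confined) in place of porosity.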
A probabilistic topic model for clinical risk stratification from electronic health records.
Huang, Zhengxing; Dong, Wei; Duan, Huilong
2015-12-01
Risk stratification aims to provide physicians with the accurate assessment of a patient's clinical risk such that an individualized prevention or management strategy can be developed and delivered. Existing risk stratification techniques mainly focus on predicting the overall risk of an individual patient in a supervised manner, and, at the cohort level, often offer little insight beyond a flat score-based segmentation from the labeled clinical dataset. To this end, in this paper, we propose a new approach for risk stratification by exploring a large volume of electronic health records (EHRs) in an unsupervised fashion. Along this line, this paper proposes a novel probabilistic topic modeling framework called probabilistic risk stratification model (PRSM) based on Latent Dirichlet Allocation (LDA). The proposed PRSM recognizes a patient clinical state as a probabilistic combination of latent sub-profiles, and generates sub-profile-specific risk tiers of patients from their EHRs in a fully unsupervised fashion. The achieved stratification results can be easily recognized as high-, medium- and low-risk, respectively. In addition, we present an extension of PRSM, called weakly supervised PRSM (WS-PRSM) by incorporating minimum prior information into the model, in order to improve the risk stratification accuracy, and to make our models highly portable to risk stratification tasks of various diseases. We verify the effectiveness of the proposed approach on a clinical dataset containing 3463 coronary heart disease (CHD) patient instances. Both PRSM and WS-PRSM were compared with two established supervised risk stratification algorithms, i.e., logistic regression and support vector machine, and showed the effectiveness of our models in risk stratification of CHD in terms of the Area Under the receiver operating characteristic Curve (AUC) analysis. 
In addition, WS-PRSM achieved a performance gain of over 2% relative to PRSM on the experimental dataset, demonstrating that incorporating risk-scoring knowledge as prior information can improve risk stratification performance. Experimental results reveal that our models achieve competitive performance in risk stratification in comparison with existing supervised approaches. Furthermore, the unsupervised nature of our models makes them highly portable to the risk stratification tasks of various diseases. Moreover, the patient sub-profiles and sub-profile-specific risk tiers generated by our models are coherent and informative, and provide significant potential to be explored in further tasks, such as patient cohort analysis. We hypothesize that the proposed framework can readily meet the demand for risk stratification from a large volume of EHRs in an open-ended fashion.
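The AUC metric used in the comparison has an equivalent rank formulation (the Mann-Whitney U statistic): the probability that a randomly chosen positive instance outscores a randomly chosen negative one. A minimal sketch, not tied to the paper's dataset:

```python
def auc(scores_pos, scores_neg):
    """Area under the ROC curve via the Mann-Whitney U formulation:
    the fraction of (positive, negative) pairs in which the positive
    instance scores higher, counting ties as one half."""
    wins = 0.0
    for p in scores_pos:
        for n in scores_neg:
            if p > n:
                wins += 1.0
            elif p == n:
                wins += 0.5
    return wins / (len(scores_pos) * len(scores_neg))
```

The O(n*m) pairwise loop is fine for a sketch; production code would sort once and use ranks.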
Comparison of Past, Present, and Future Volume Estimation Methods for Tennessee
Stanley J. Zarnoch; Alexander Clark; Ray A. Souter
2003-01-01
Forest Inventory and Analysis 1999 survey data for Tennessee were used to compare stem-volume estimates obtained using a previous method, the current method, and newly developed taper models that will be used in the future. Compared to the current method, individual tree volumes were consistently underestimated with the previous method, especially for the hardwoods....
ERIC Educational Resources Information Center
Van Velsor, Ellen; Leslie, Jean Brittain
"Feedback to Managers" is a two-volume report. Volume 2 compares 16 of the better feedback instruments available. The following are the instruments: (1) ACUMEN Group Feedback; (2) BENCHMARKS; (3) the Campbell Leadership Index; (4) COMPASS: the Managerial Practices Survey; (5) the Executive Success Profile; (6) Leader Behavior Analysis…
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fraysse, F., E-mail: francois.fraysse@rs2n.eu; E. T. S. de Ingeniería Aeronáutica y del Espacio, Universidad Politécnica de Madrid, Madrid; Redondo, C.
This article is devoted to the numerical discretisation of the hyperbolic two-phase flow model of Baer and Nunziato. Special attention is paid to the discretisation of intercell flux functions in the framework of Finite Volume and Discontinuous Galerkin approaches, where care has to be taken to efficiently approximate the non-conservative products inherent to the model equations. Various upwind approximate Riemann solvers have been tested on a suite of discontinuous test cases. New discretisation schemes are proposed in a Discontinuous Galerkin framework following the criterion of Abgrall and the path-conservative formalism. A stabilisation technique based on artificial viscosity is applied to the high-order Discontinuous Galerkin method and compared against classical TVD-MUSCL Finite Volume flux reconstruction.
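As a toy illustration of the upwind intercell-flux idea in a Finite Volume setting, consider first-order upwinding for scalar linear advection. This is a deliberately minimal sketch; the Baer-Nunziato system itself is a coupled two-phase model whose non-conservative products require the path-conservative machinery discussed above:

```python
def upwind_step(u, a, dx, dt):
    """One first-order finite-volume update for linear advection
    u_t + a*u_x = 0 on periodic cells.  The intercell flux at each face
    is taken from the upwind side -- the simplest approximate Riemann
    solver.  flux[i] is the flux through the right face of cell i."""
    n = len(u)
    flux = [a * u[i] if a >= 0 else a * u[(i + 1) % n] for i in range(n)]
    # conservative cell update: u_i^{n+1} = u_i - dt/dx * (F_{i+1/2} - F_{i-1/2})
    return [u[i] - dt / dx * (flux[i] - flux[i - 1]) for i in range(n)]
```

With dt = dx/|a| (CFL = 1) the scheme transports the profile exactly one cell per step; smaller time steps introduce the scheme's characteristic numerical diffusion.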
Stereometric body volume measurement
NASA Technical Reports Server (NTRS)
Herron, R. E.
1975-01-01
The following studies are reported: (1) effects of extended space flight on body form of Skylab astronauts using biostereometrics; (2) comparison of body volume determinations using hydrostatic weighing and biostereometrics; and (3) training of technicians in biostereometric principles and procedures.
NASA Astrophysics Data System (ADS)
Martin, Spencer; Brophy, Mark; Palma, David; Louie, Alexander V.; Yu, Edward; Yaremko, Brian; Ahmad, Belal; Barron, John L.; Beauchemin, Steven S.; Rodrigues, George; Gaede, Stewart
2015-02-01
This work aims to propose and validate a framework for tumour volume auto-segmentation based on ground-truth estimates derived from multi-physician input contours to expedite 4D-CT based lung tumour volume delineation. 4D-CT datasets of ten non-small cell lung cancer (NSCLC) patients were manually segmented by 6 physicians. Multi-expert ground truth (GT) estimates were constructed using the STAPLE algorithm for the gross tumour volume (GTV) on all respiratory phases. Next, using a deformable model-based method, multi-expert GT on each individual phase of the 4D-CT dataset was propagated to all other phases providing auto-segmented GTVs and motion encompassing internal gross target volumes (IGTVs) based on GT estimates (STAPLE) from each respiratory phase of the 4D-CT dataset. Accuracy assessment of auto-segmentation employed graph cuts for 3D-shape reconstruction and point-set registration-based analysis yielding volumetric and distance-based measures. STAPLE-based auto-segmented GTV accuracy ranged from (81.51 ± 1.92) to (97.27 ± 0.28)% volumetric overlap of the estimated ground truth. IGTV auto-segmentation showed significantly improved accuracies with reduced variance for all patients ranging from 90.87 to 98.57% volumetric overlap of the ground truth volume. Additional metrics supported these observations with statistical significance. Accuracy of auto-segmentation was shown to be largely independent of selection of the initial propagation phase. IGTV construction based on auto-segmented GTVs within the 4D-CT dataset provided accurate and reliable target volumes compared to manual segmentation-based GT estimates. While inter-/intra-observer effects were largely mitigated, the proposed segmentation workflow is more complex than that of current clinical practice and requires further development.
Cardona, Albert; Saalfeld, Stephan; Preibisch, Stephan; Schmid, Benjamin; Cheng, Anchi; Pulokas, Jim; Tomancak, Pavel; Hartenstein, Volker
2010-01-01
The analysis of microcircuitry (the connectivity at the level of individual neuronal processes and synapses), which is indispensable for our understanding of brain function, is based on serial transmission electron microscopy (TEM) or one of its modern variants. Due to technical limitations, most previous studies that used serial TEM recorded relatively small stacks of individual neurons. As a result, our knowledge of microcircuitry in any nervous system is very limited. We applied the software package TrakEM2 to reconstruct neuronal microcircuitry from TEM sections of a small brain, the early larval brain of Drosophila melanogaster. TrakEM2 enables us to embed the analysis of the TEM image volumes at the microcircuit level into a light microscopically derived neuro-anatomical framework, by registering confocal stacks containing sparsely labeled neural structures with the TEM image volume. We imaged two sets of serial TEM sections of the Drosophila first instar larval brain neuropile and one ventral nerve cord segment, and here report our first results pertaining to Drosophila brain microcircuitry. Terminal neurites fall into a small number of generic classes termed globular, varicose, axiform, and dendritiform. Globular and varicose neurites have large diameter segments that carry almost exclusively presynaptic sites. Dendritiform neurites are thin, highly branched processes that are almost exclusively postsynaptic. Due to the high branching density of dendritiform fibers and the fact that synapses are polyadic, neurites are highly interconnected even within small neuropile volumes. We describe the network motifs most frequently encountered in the Drosophila neuropile. Our study introduces an approach towards a comprehensive anatomical reconstruction of neuronal microcircuitry and delivers microcircuitry comparisons between vertebrate and insect neuropile. PMID:20957184
General Criteria for Evaluating Social Programs.
ERIC Educational Resources Information Center
Shipman, Stephanie
1989-01-01
A framework of general evaluation criteria for ensuring the comprehensiveness of program reviews and appropriate and fair comparison of children's programs is outlined. It has two components: (1) descriptive; and (2) evaluative. The framework was developed by researchers at the General Accounting Office for evaluation of federal programs. (TJH)
Analyzing Agricultural Technology Systems: A Research Report.
ERIC Educational Resources Information Center
Swanson, Burton E.
The International Program for Agricultural Knowledge Systems (INTERPAKS) research team is developing a descriptive and analytic framework to examine and assess agricultural technology systems. The first part of the framework is an inductive methodology that organizes data collection and orders data for comparison between countries. It requires and…
20th Annual Systems Engineering Conference, Thursday, Volume 4
2017-10-26
Presentations include: Daniel Dault (Air Force Research Lab), 19809, "Physics Based Modeling & Simulation For Shock and Vulnerability Assessments - Navy Enhanced Sierra..."; 19811, "Version 1.0 of the New INCOSE Competency Framework," Mr. Don Gelosh; 19515, "A Proposed Engineering Training Framework and Competency Methodology...". Capability topics: ...nonlinearity; QEV, transient, and frequency-domain analysis; inverse methods capability; coupled physics; fluids: nemo, aero, and sigma; thermal (unidirection): fuego.
NASA Technical Reports Server (NTRS)
Ripple, W. J.; Wang, S.; Isaacson, D. L.; Paine, D. P.
1991-01-01
Digital Landsat Thematic Mapper (TM) and SPOT high-resolution visible (HRV) images of coniferous forest canopies were compared in their relationship to forest wood volume using correlation and regression analyses. Significant inverse relationships were found between softwood volume and the spectral bands from both sensors (P less than 0.01). The highest correlations were between the log of softwood volume and the near-infrared bands.
Sideloading - Ingestion of Large Point Clouds Into the Apache Spark Big Data Engine
NASA Astrophysics Data System (ADS)
Boehm, J.; Liu, K.; Alis, C.
2016-06-01
In the geospatial domain we have now reached the point where the data volumes we handle have clearly grown beyond the capacity of most desktop computers. This is particularly true in the area of point cloud processing. It is therefore natural to explore established big data frameworks for big geospatial data. The very first hurdle is the import of geospatial data into big data frameworks, commonly referred to as data ingestion. Geospatial data is typically encoded in specialised binary file formats, which are not natively supported by existing big data frameworks. Instead, such file formats are supported by software libraries that are restricted to single-CPU execution. We present an approach that allows the use of existing point cloud file format libraries on the Apache Spark big data framework. We demonstrate the ingestion of large volumes of point cloud data into a compute cluster. The approach uses a map function to distribute the data ingestion across the nodes of a cluster. We test the capabilities of the proposed method to load billions of points into a commodity hardware compute cluster and we discuss the implications on scalability and performance. The performance is benchmarked against an existing native Apache Spark data import implementation.
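The map-based ingestion pattern described in this record can be sketched without a Spark cluster; here a thread pool and a toy binary format of packed XYZ doubles stand in for Spark's map over file paths and a real point cloud format such as LAS (the format, file names, and pool are illustrative assumptions, not the paper's implementation):

```python
# Each worker runs a single-threaded point cloud reader on one file.
# A thread pool stands in for the Spark cluster; with PySpark the same
# reader would be passed to sc.parallelize(paths).flatMap(read_points).
import os
import struct
import tempfile
from concurrent.futures import ThreadPoolExecutor

def write_points(path, points):
    """Write a toy binary tile: little-endian XYZ float64 triplets."""
    with open(path, "wb") as f:
        for x, y, z in points:
            f.write(struct.pack("<3d", x, y, z))

def read_points(path):
    """Single-file reader: the unit of work distributed over the cluster."""
    points = []
    with open(path, "rb") as f:
        while chunk := f.read(24):
            points.append(struct.unpack("<3d", chunk))
    return points

# Create four tiles of 100 points each, then ingest them in parallel.
tmp = tempfile.mkdtemp()
paths = [os.path.join(tmp, f"tile{i}.bin") for i in range(4)]
for i, p in enumerate(paths):
    write_points(p, [(float(i), float(j), 0.0) for j in range(100)])

with ThreadPoolExecutor() as pool:               # ~ sc.parallelize(paths)
    tiles = list(pool.map(read_points, paths))   # ~ .map(read_points)
points = [pt for tile in tiles for pt in tile]   # ~ .collect()
print(len(points))  # 400
```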
Meek, M E (Bette); Palermo, Christine M; Bachman, Ammie N; North, Colin M; Jeffrey Lewis, R
2014-01-01
The mode of action human relevance (MOA/HR) framework increases transparency in systematically considering data on MOA for end (adverse) effects and their relevance to humans. This framework continues to evolve as experience increases in its application. Though the MOA/HR framework is not designed to address the question of “how much information is enough” to support a hypothesized MOA in animals or its relevance to humans, its organizing construct has potential value in considering relative weight of evidence (WOE) among different cases and hypothesized MOA(s). This context is explored based on MOA analyses in published assessments to illustrate the relative extent of supporting data and their implications for dose–response analysis, and involved comparisons of chemical assessments on trichloropropane and carbon tetrachloride, with several hypothesized MOA(s) for cancer. The WOE for each hypothesized MOA was summarized in narrative tables based on comparison and contrast of the extent and nature of the supporting database versus potentially inconsistent or missing information. The comparison was based on evolved Bradford Hill considerations rank ordered to reflect their relative contribution to WOE determinations of MOA, taking into account increasing experience in their application internationally. This clarification of considerations for WOE determinations as a basis for comparative analysis is anticipated to contribute to increasing consistency in the application of MOA/HR analysis and, potentially, transparency in separating science judgment from public policy considerations in regulatory risk assessment. Copyright © 2014. The Authors. Journal of Applied Toxicology Published by John Wiley & Sons Ltd. The potential value of the mode of action (MOA)/human relevance (species concordance) framework in considering relative weight of evidence (WOE) amongst different cases and hypothesized MOA(s) is explored based on the content of several published assessments.
The comparison is based on evolved Bradford Hill considerations rank ordered to reflect their relative contribution to WOE determinations for MOA based on experience internationally. PMID:24777878
ELM Meets Urban Big Data Analysis: Case Studies
Chen, Huajun; Chen, Jiaoyan
2016-01-01
In recent years, the rapid progress of urban computing has generated big data issues, which create both opportunities and challenges. The heterogeneity and sheer volume of the data, and the large gap between the physical and virtual worlds, make it difficult to solve practical problems in urban computing quickly. In this paper, we propose a general application framework of ELM for urban computing. We present several real case studies of the framework, such as smog-related health hazard prediction and optimal retail store placement. Experiments involving urban data in China show the efficiency, accuracy, and flexibility of our proposed framework. PMID:27656203
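The abstract does not reproduce the ELM equations; the standard extreme learning machine (a random, untrained hidden layer plus a closed-form least-squares readout) can be sketched on a toy regression task as follows (sizes and data are illustrative, not the urban case studies):

```python
import numpy as np

rng = np.random.default_rng(0)

def elm_train(X, T, n_hidden=50):
    """ELM: random hidden layer, least-squares output weights."""
    W = rng.normal(size=(X.shape[1], n_hidden))  # random input weights
    b = rng.normal(size=n_hidden)                # random biases
    H = np.tanh(X @ W + b)                       # hidden-layer activations
    beta = np.linalg.pinv(H) @ T                 # closed-form readout
    return W, b, beta

def elm_predict(X, W, b, beta):
    return np.tanh(X @ W + b) @ beta

# Toy regression task: recover y = sin(x) from noisy samples.
X = rng.uniform(-3.0, 3.0, size=(200, 1))
T = np.sin(X) + 0.01 * rng.normal(size=X.shape)
W, b, beta = elm_train(X, T)
mse = float(np.mean((elm_predict(X, W, b, beta) - T) ** 2))
print(mse)  # small training error
```

The appeal of ELM in big-data settings is that training reduces to one pseudoinverse, with no iterative backpropagation.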
TETAM Model Verification Study. Volume I. Representation of Intervisibility, Initial Comparisons
1976-02-01
simulation models in terms of firings, engagements, and losses between tank and antitank as compared with the field data collected during the free play battles of Field Experiment 11.8 are found in Volume III. (Author)
Computing Diffeomorphic Paths for Large Motion Interpolation.
Seo, Dohyung; Jeffrey, Ho; Vemuri, Baba C
2013-06-01
In this paper, we introduce a novel framework for computing a path of diffeomorphisms between a pair of input diffeomorphisms. Direct computation of a geodesic path on the space of diffeomorphisms Diff(M) is difficult, mainly because of the infinite dimensionality of Diff(M). Our proposed framework, to some degree, bypasses this difficulty using the quotient map from Diff(M) to the quotient space Diff(M)/Diff(M)_μ obtained by quotienting out the subgroup of volume-preserving diffeomorphisms Diff(M)_μ. This quotient space was recently identified in the mathematics literature as the unit sphere in a Hilbert space, a space with well-known geometric properties. Our framework leverages this recent result by computing the diffeomorphic path in two stages. First, we project the given diffeomorphism pair onto this sphere and then compute the geodesic path between these projected points. Second, we lift the geodesic on the sphere back to the space of diffeomorphisms, by solving a quadratic programming problem with bilinear constraints using the augmented Lagrangian technique with penalty terms. In this way, we can estimate the path of diffeomorphisms while, first, staying in the space of diffeomorphisms and, second, preserving shapes/volumes in the deformed images along the path as much as possible. We have applied our framework to interpolate intermediate frames of frame-subsampled video sequences. In the reported experiments, our approach compares favorably with the popular Large Deformation Diffeomorphic Metric Mapping (LDDMM) framework.
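The geodesic between two points on a unit sphere has the closed-form spherical linear interpolation (slerp) expression; a finite-dimensional sketch of that middle step follows (the projection and lifting stages of the paper are omitted, and the vectors are illustrative):

```python
import numpy as np

def sphere_geodesic(p, q, t):
    """Point at parameter t in [0, 1] on the great-circle arc from p to q.

    p and q are unit vectors; this is the standard slerp formula
    gamma(t) = (sin((1-t)*theta)*p + sin(t*theta)*q) / sin(theta).
    """
    p, q = np.asarray(p, float), np.asarray(q, float)
    theta = np.arccos(np.clip(p @ q, -1.0, 1.0))  # angle between endpoints
    if theta < 1e-12:                             # nearly identical points
        return p
    return (np.sin((1 - t) * theta) * p + np.sin(t * theta) * q) / np.sin(theta)

p = np.array([1.0, 0.0, 0.0])
q = np.array([0.0, 1.0, 0.0])
mid = sphere_geodesic(p, q, 0.5)
print(mid, np.linalg.norm(mid))  # midpoint stays on the unit sphere
```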
A conformal truncation framework for infinite-volume dynamics
Katz, Emanuel; Khandker, Zuhair U.; Walters, Matthew T.
2016-07-28
Here, we present a new framework for studying conformal field theories deformed by one or more relevant operators. The original CFT is described in infinite volume using a basis of states with definite momentum, P, and conformal Casimir, C. The relevant deformation is then considered using lightcone quantization, with the resulting Hamiltonian expressed in terms of this CFT basis. Truncating to states with C ≤ C_max, one can numerically find the resulting spectrum, as well as other dynamical quantities, such as spectral densities of operators. This method requires the introduction of an appropriate regulator, which can be chosen to preserve the conformal structure of the basis. We check this framework in three dimensions for various perturbative deformations of a free scalar CFT, and for the case of a free O(N) CFT deformed by a mass term and a non-perturbative quartic interaction at large-N. In all cases, the truncation scheme correctly reproduces known analytic results. We also discuss a general procedure for generating a basis of Casimir eigenstates for a free CFT in any number of dimensions.
ERIC Educational Resources Information Center
Freedman, Ruth
The first of a six-volume series on the community adjustment of deinstitutionalized mentally retarded persons examines 28 research studies on community adjustment and proposes a framework for reviewing criterion measures and predictors of adjustment. Summary descriptions of major characteristics of the studies are provided, and matrices listing…
ERIC Educational Resources Information Center
Byrne, Eileen M.
This document is a "methodological annex" to volume I of the Women in Science and Technology in Australia (WISTA) final research report. The 10 discussion papers that make up this document deal with the 10 core factors of influence that formed one main axis of the study's theoretical framework for inquiry. A diagram illustrates this…
ERIC Educational Resources Information Center
Puig, Luis, Ed.; Gutierrez, Angel, Ed.
The first volume of this proceedings contains three plenary addresses: (1) "Visualization in 3-dimensional geometry: In search of a framework" (A. Gutierrez); (2) "The ongoing value of proof" (G. Hanna); and (3) "Modern times: The symbolic surfaces of language, mathematics and art" (D. Pimm). Plenary panels include: (1) "Contribution to the panel…
Connecting the Dots: A Handbook Series for Teachers of English Language Arts, Grades 6-8.
ERIC Educational Resources Information Center
North Carolina State Dept. of Public Instruction, Raleigh. Instructional Services.
Designed to grow into a 2-inch three-ring binder, this handbook is part of a series for teachers of grades 6-8 language arts. The handbook aims to clarify some points, to provide a framework for new teachers and to give veteran teachers some new ideas. Volume I contains activities to use in the classroom, and Volume II is "all about…
Tang, Yu Pan; Wang, Huan; Chung, Tai Shung
2015-01-01
The microstructural evolution of a series of triazine framework-based microporous (TFM) membranes under different conditions has been explored in this work. The pristine TFM membrane is fabricated in situ in the course of polymer synthesis via a facile Brønsted-acid-catalyzed cyclotrimerization reaction. The as-synthesized polymer exhibits a microporous network with high thermal stability. The free volume size of the TFM membranes gradually evolved from a unimodal distribution to a bimodal distribution under annealing, as analyzed by positron annihilation lifetime spectroscopy (PALS). The emergence of the bimodal distribution is probably ascribed to the synergetic effect of quenching and the thermal cyclization reaction. In addition, the fractional free volume (FFV) of the membranes presents a concave trend with increasing annealing temperature. Vapor sorption tests reveal that the mass transport properties are closely associated with the free volume evolution, which provides an optimal condition for dehydration of biofuels. A promising separation performance with extremely high water permeability has been attained for dehydration of an 85 wt % ethanol aqueous solution at 45 °C. The study on the free volume evolution of the TFM membranes may provide useful insights into the microstructure and mass transport behavior of microporous polymeric materials. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Little, Keriann; Olsson, Craig A; Youssef, George J; Whittle, Sarah; Simmons, Julian G; Yücel, Murat; Sheeber, Lisa B; Foley, Debra L; Allen, Nicholas B
2015-11-01
A single imaging gene-environment (IGxE) framework that is able to simultaneously model genetic, neurobiological, and environmental influences on psychopathology outcomes is needed to improve understanding of how complex interrelationships between allelic variation, differences in neuroanatomy or neuroactivity, and environmental experience affect risk for psychiatric disorder. In a longitudinal study of adolescent development we demonstrate the utility of such an IGxE framework by testing whether variation in parental behavior at age 12 altered the strength of an imaging genetics pathway, involving an indirect association between allelic variation in the serotonin transporter gene to variation in hippocampal volume and consequent onset of major depressive disorder by age 18. Results were consistent with the presence of an indirect effect of the serotonin transporter S-allele on depression onset via smaller left and right hippocampal volumes that was significant only in family environments involving either higher levels of parental aggression or lower levels of positive parenting. The previously reported finding of S-allele carriers' increased risk of depression in adverse environments may, therefore, be partly because of the effects of these environments on a neurobiological pathway from the serotonin transporter gene to depression onset that proceeds through variation in hippocampal volume. (c) 2015 APA, all rights reserved.
This report presents examples of the relationships between the results of laboratory leaching tests, as defined by the Leaching Environmental Assessment Framework (LEAF) or analogous international test methods, and leaching of constituents from a broad range of materials under di...
ERIC Educational Resources Information Center
Lysons, Art
1999-01-01
Suggests that organizational effectiveness research has made considerable progress in empirically deriving a systematic framework of theoretical and practical utility in Australian higher education. Offers a taxonomy based on the competing values framework and discusses use of inter-organizational comparisons and profiles for diagnosis in…
ERIC Educational Resources Information Center
Okazaki, Shintaro; Alonso Rivas, Javier
2002-01-01
Discussion of research methodology for evaluating the degree of standardization in multinational corporations' online communication strategies across differing cultures focuses on a research framework for cross-cultural comparison of corporate Web pages, applying traditional advertising content study techniques. Describes pre-tests that examined…
Teaching Conflict Management Using a Scenario-Based Approach
ERIC Educational Resources Information Center
Callanan, Gerard A.; Perri, David F.
2006-01-01
In this article, the authors present a framework for the teaching of conflict management in college courses. The framework describes an experiential learning approach for helping individuals understand the influence of contextual factors in the selection of conflict handling strategy. It also includes a comparison of participants' choice of style,…
ERIC Educational Resources Information Center
Wang, Chia-Yu; Barrow, Lloyd H.
2013-01-01
The purpose of the study was to explore students' conceptual frameworks of models of atomic structure and periodic variations, chemical bonding, and molecular shape and polarity, and how these conceptual frameworks influence their quality of explanations and ability to shift among chemical representations. This study employed a purposeful sampling…
Free volumes and gas transport in polymers: amine-modified epoxy resins as a case study.
Patil, Pushkar N; Roilo, David; Brusa, Roberto S; Miotello, Antonio; Aghion, Stefano; Ferragut, Rafael; Checchetto, Riccardo
2016-02-07
The CO2 transport process was studied in a series of amine-modified epoxy resins having different cross-linking densities but the same chemical environment for the penetrant molecules. Positron Annihilation Lifetime Spectroscopy (PALS) was used to monitor the free volume structure of the samples and experimentally evaluate their fractional free volume f_h(T) and its temperature evolution. The analysis of the free volume hole size distribution showed that all the holes have a size large enough to accommodate the penetrant molecules at temperatures T above the glass transition temperature Tg. The measured gas diffusion constants at T > Tg have been reproduced in the framework of the free volume theory of diffusion using a novel procedure based on the use of f_h(T) as an input experimental parameter.
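The free volume theory of diffusion referred to in this record is commonly written in the Cohen-Turnbull form D = A exp(-B / f_h); a minimal sketch with illustrative parameters (A and B below are not the paper's fitted values):

```python
import math

def diffusion_coefficient(f_h, A=1.0e-6, B=0.6):
    """Cohen-Turnbull free-volume form D = A * exp(-B / f_h).

    f_h is the fractional free volume; A (pre-exponential, m^2/s) and
    B (overlap factor) are illustrative constants.
    """
    return A * math.exp(-B / f_h)

# Fractional free volume grows with temperature above Tg, so D grows too:
for f_h in (0.03, 0.04, 0.05):
    print(f_h, diffusion_coefficient(f_h))
```

Because f_h appears in the exponent's denominator, small changes in free volume translate into large changes in the diffusion constant, which is why a measured f_h(T) is such a useful input.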
Large-eddy simulation of nitrogen injection at trans- and supercritical conditions
DOE Office of Scientific and Technical Information (OSTI.GOV)
Müller, Hagen; Pfitzner, Michael; Niedermeier, Christoph A.
2016-01-15
Large-eddy simulations (LESs) of cryogenic nitrogen injection into a warm environment at supercritical pressure are performed, and real-gas thermodynamics models and subgrid-scale (SGS) turbulence models are evaluated. The comparison of different SGS models — the Smagorinsky model, the Vreman model, and the adaptive local deconvolution method — shows that the representation of turbulence on the resolved scales has a notable effect on the location of jet break-up, whereas the particular modeling of unresolved scales is less important for the overall mean flow field evolution. More important are the models for the fluid’s thermodynamic state. The injected fluid is either in a supercritical or in a transcritical state and undergoes a pseudo-boiling process during mixing. Such flows typically exhibit strong density gradients that delay the instability growth and can lead to a redistribution of turbulence kinetic energy from the radial to the axial flow direction. We evaluate novel volume-translation methods on the basis of the cubic Peng-Robinson equation of state in the framework of LES. At small extra computational cost, their application considerably improves the simulation results compared to the standard formulation. Furthermore, we found that the choice of inflow temperature is crucial for the reproduction of the experimental results and that heat addition within the injector can affect the mean flow field in comparison to results with an adiabatic injector.
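The Peng-Robinson equation of state with a volume translation can be sketched as follows; the constant (Péneloux-style) shift used here is a simple stand-in for the paper's novel translation methods, and the nitrogen critical constants are approximate literature values:

```python
import numpy as np

R = 8.314462618          # J/(mol K)
# Approximate critical constants for nitrogen:
Tc, Pc, omega = 126.19, 3.3958e6, 0.0372

def pr_molar_volume(T, P, c=0.0):
    """Gas-phase molar volume (m^3/mol) from the Peng-Robinson EOS,
    with an optional constant volume translation c (Peneloux-style;
    the paper's translation methods are more elaborate)."""
    a = 0.45724 * R**2 * Tc**2 / Pc
    b = 0.07780 * R * Tc / Pc
    kappa = 0.37464 + 1.54226 * omega - 0.26992 * omega**2
    alpha = (1.0 + kappa * (1.0 - np.sqrt(T / Tc)))**2
    A = a * alpha * P / (R * T)**2
    B = b * P / (R * T)
    # Cubic in the compressibility factor Z:
    coeffs = [1.0, -(1.0 - B), A - 3.0*B**2 - 2.0*B, -(A*B - B**2 - B**3)]
    roots = np.roots(coeffs)
    Z = max(r.real for r in roots if abs(r.imag) < 1e-10)  # vapor root
    return Z * R * T / P - c

v = pr_molar_volume(300.0, 1.0e5)
print(v)  # near the ideal-gas value R*T/P at these mild conditions
```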
DETERMINATION OF THE SPEED OF SOUND ALONG THE HUGONIOT IN A SHOCKED MATERIAL
2017-04-25
correctly predict higher speeds of sound for the higher energy shocked states. The approximations of higher shock pressures diverge progressively... [Figures: 1, Copper Hugoniot in the pressure-specific volume plane; 2, Copper Hugoniot in the energy-specific volume plane; 3, Comparison between rate of...] ...volume and energy are being used: P = P(v, E). Then by the chain rule: dP = (∂P/∂v)|_E dv + (∂P/∂E)|_v dE. Dividing by dv
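The chain-rule step in this record completes, for an equation of state P = P(v, E) with dE = -P dv along an isentrope, to the standard sound-speed expression for a shocked state (a standard Mie-Grüneisen-style completion, not necessarily the report's exact route):

```latex
\left.\frac{dP}{dv}\right|_{s}
  = \left.\frac{\partial P}{\partial v}\right|_{E}
  + \left.\frac{\partial P}{\partial E}\right|_{v}\left.\frac{dE}{dv}\right|_{s}
  = \left.\frac{\partial P}{\partial v}\right|_{E}
  - P\,\left.\frac{\partial P}{\partial E}\right|_{v},
\qquad
c^{2} = -\,v^{2}\left.\frac{dP}{dv}\right|_{s}.
```

Evaluating the partial derivatives at the Hugoniot state (v, E) then gives the speed of sound along the Hugoniot.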
Christopher M. Oswalt; Adam M. Saunders
2009-01-01
Sound estimation procedures are desideratum for generating credible population estimates to evaluate the status and trends in resource conditions. As such, volume estimation is an integral component of the U.S. Department of Agriculture, Forest Service, Forest Inventory and Analysis (FIA) program's reporting. In effect, reliable volume estimation procedures are...
Comparing volume of fluid and level set methods for evaporating liquid-gas flows
NASA Astrophysics Data System (ADS)
Palmore, John; Desjardins, Olivier
2016-11-01
This presentation demonstrates three numerical strategies for simulating liquid-gas flows undergoing evaporation. The practical aim of this work is to choose a framework capable of simulating the combustion of liquid fuels in an internal combustion engine. Each framework is analyzed with respect to its accuracy and computational cost. All simulations are performed using a conservative, finite volume code for simulating reacting, multiphase flows under the low-Mach assumption. The strategies used in this study correspond to different methods for tracking the liquid-gas interface and handling the transport of the discontinuous momentum and vapor mass fractions fields. The first two strategies are based on conservative, geometric volume of fluid schemes using directionally split and un-split advection, respectively. The third strategy is the accurate conservative level set method. For all strategies, special attention is given to ensuring the consistency between the fluxes of mass, momentum, and vapor fractions. The study performs three-dimensional simulations of an isolated droplet of a single component fuel evaporating into air. Evaporation rates and vapor mass fractions are compared to analytical results.
Realization of the medium and high vacuum primary standard in CENAM, Mexico
NASA Astrophysics Data System (ADS)
Torres-Guzman, J. C.; Santander, L. A.; Jousten, K.
2005-12-01
A medium and high vacuum primary standard, based on the static expansion method, has been set up at Centro Nacional de Metrología (CENAM), Mexico. This system has four volumes and covers a measuring range of 1 × 10-5 Pa to 1 × 103 Pa of absolute pressure. As part of its realization, a characterization was performed, which included volume calibrations, several tests and a bilateral key comparison. To determine the expansion ratios, two methods were applied: the gravimetric method and the method with a linearized spinning rotor gauge. The outgassing ratios for the whole system were also determined. A comparison was performed with Physikalisch-Technische Bundesanstalt (comparison SIM-Euromet.M.P-BK3). By means of this comparison, a link has been achieved with the Euromet comparison (Euromet.M.P-K1.b). As a result, it is concluded that the value obtained at CENAM is equivalent to the Euromet reference value, and therefore the design, construction and operation of CENAM's SEE-1 vacuum primary standard were successful.
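The static expansion method reduces pressure by a known geometric factor at each stage; a minimal sketch of how successive expansions set the measured pressure (the volumes below are illustrative, not CENAM's calibrated values):

```python
def expansion_pressure(p_start, stages):
    """Pressure after successive static expansions.

    Each stage expands gas from a small calibrated volume V_small into
    an evacuated larger volume V_large, so (isothermal ideal gas)
    p -> p * V_small / (V_small + V_large).
    """
    p = p_start
    for v_small, v_large in stages:
        p *= v_small / (v_small + v_large)
    return p

# Three successive 1:100 expansions take 1000 Pa down to 1e-3 Pa:
print(expansion_pressure(1000.0, [(1.0, 99.0)] * 3))
```

This is why calibrating the expansion ratios (gravimetrically or with a spinning rotor gauge, as in the record) dominates the standard's uncertainty budget.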
NASA Technical Reports Server (NTRS)
Deutsch, W. F.
1972-01-01
Activities and data reported cover experimental design, mechanization onboard the aircraft, survey operations, quick look and automated data reduction, and a qualitative comparison of survey data with predicted values for the radio frequency survey. The survey was designed to measure amplitude, frequency and time of occurrence of terrestrial emissions in the VHF band during overflights of heavily populated metropolitan areas located on the Pacific Coast of the Continental United States by sensing and recording equipment installed in jet aircraft.
The effect of chronic erythrocytic polycythemia and high altitude upon plasma and blood volumes.
NASA Technical Reports Server (NTRS)
Burton, R. R.; Smith, A. H.
1972-01-01
Comparison of two kinds of physiological chronic erythrocytic polycythemias in order to differentiate the specific effect of erythrocytic polycythemia from the general effects of high altitude upon the plasma volume. The two kinds were produced hormonally in female chickens, at sea level, or by protracted high-altitude exposures. It appears that the vascular system of the body may account for an increase in red blood cell mass either by reduction in plasma volume, or by no change in plasma volume, resulting in differential changes in total blood volumes.
Human Factors of CC-130 Operations. Volume 5: Human Factors in Decision Making
1998-02-01
known about human information processing and decision making. Topics for HFDM training come directly from this theoretical framework. The proposed...The proposed training can be distinguished from other approaches with similar goals (either explicit or implicit) by its base within a theoretical framework of human information processing. The differences lie less in the content than in the way the material is organized and shaped by theory. The
Kim, Min-Soo; Lee, Jeong-Rim; Shin, Yang-Sik; Chung, Ji-Won; Lee, Kyu-Ho; Ahn, Ki Ryang
2014-03-01
This single-center, prospective, randomized, double-blind, 2-arm, parallel group comparison trial was performed to establish whether the adult-sized laryngeal mask airway (LMA) Classic (The Laryngeal Mask Company Ltd, Henley-on-Thames, UK) could be used safely without any consideration of cuff hyperinflation when a cuff of the LMA Classic was inflated using half the maximum inflation volume or the resting volume before insertion of device. Eighty patients aged 20 to 70 years scheduled for general anesthesia using the LMA Classic were included. Before insertion, the cuff was partially filled with half the maximum inflation volume in the half volume group or the resting volume created by opening the pilot balloon valve to equalize with atmospheric pressure in the resting volume group. Several parameters regarding insertion, intracuff pressure, airway leak pressure, and leakage volume/fraction were collected after LMA insertion. The LMA Classic with a partially inflated cuff was successfully inserted in all enrolled patients. Both groups had the same success rate of 95% at the first insertion attempt. The half volume group had a lower mean intracuff pressure compared with the resting volume group (54.5 ± 16.1 cm H2O vs 61.8 ± 16.1 cm H2O; P = .047). There was no difference in airway leak pressure or leakage volume/fraction between the 2 groups under mechanical ventilation. The partially inflated cuff method using half the maximum recommended inflation volume or the resting volume is feasible with the adult-sized LMA Classic, resulting in a high success rate of insertion and adequate range of intracuff pressures. Copyright © 2014 Elsevier Inc. All rights reserved.
Han, Doug Hyun; Lyoo, In Kyoon; Renshaw, Perry F.
2015-01-01
Patients with on-line game addiction (POGA) and professional video game players both play video games for extended periods of time, but experience very different consequences from their on-line game play. Differences in brain regions including the anterior cingulate, thalamus, and occipito-temporal areas may affect the likelihood of becoming a pro-gamer or developing POGA. Twenty POGA subjects, seventeen pro-gamers, and eighteen healthy comparison subjects (HC) were recruited. All magnetic resonance imaging (MRI) was performed on a 1.5 Tesla Espree MRI scanner (SIEMENS, Erlangen, Germany). Voxel-wise comparisons of gray matter volume were performed between the groups using the two-sample t-test with statistical parametric mapping (SPM5). Compared to HC, the POGA group showed increased impulsiveness and perseverative errors, increased gray matter volume in the left thalamus, and decreased gray matter volume in both inferior temporal gyri, the right middle occipital gyrus, and the left inferior occipital gyrus. Pro-gamers showed increased gray matter volume in the left cingulate gyrus but decreased gray matter volume in the left middle occipital gyrus and right inferior temporal gyrus compared with HC. Additionally, the pro-gamer group showed increased gray matter volume in the left cingulate gyrus and decreased left thalamus gray matter volume compared with the POGA group. The current study suggests that increased gray matter volumes of the left cingulate gyrus in pro-gamers and of the left thalamus in POGA may contribute to the different clinical characteristics of pro-gamers and POGA. PMID:22277302
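The voxel-wise group comparison described here amounts to a two-sample t test at every voxel; a minimal numpy sketch on simulated maps (group sizes match the study, but the maps, effect size, and threshold are purely illustrative, not SPM5's full pipeline):

```python
import numpy as np

rng = np.random.default_rng(1)

def two_sample_t(a, b):
    """Pooled-variance two-sample t statistic, computed voxel-wise (axis 0)."""
    na, nb = a.shape[0], b.shape[0]
    sp2 = ((na - 1) * a.var(0, ddof=1)
           + (nb - 1) * b.var(0, ddof=1)) / (na + nb - 2)
    return (a.mean(0) - b.mean(0)) / np.sqrt(sp2 * (1.0 / na + 1.0 / nb))

# Simulated gray matter maps flattened to 100 voxels: 20 "patients" with
# an enlarged region (voxels 0-9) versus 18 "controls".
patients = rng.normal(0.0, 1.0, size=(20, 100))
patients[:, :10] += 1.5                 # group difference in one region
controls = rng.normal(0.0, 1.0, size=(18, 100))

t = two_sample_t(patients, controls)
significant = np.abs(t) > 3.58          # |t| threshold ~ p < 0.001, df = 36
print(int(significant[:10].sum()), int(significant[10:].sum()))
```

SPM additionally smooths the maps and corrects for the multiple comparisons across voxels; the uncorrected threshold above is only the core statistic.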
Kinetics of the mechanochemical synthesis of alkaline-earth metal amides
NASA Astrophysics Data System (ADS)
Garroni, Sebastiano; Takacs, Laszlo; Leng, Haiyan; Delogu, Francesco
2014-07-01
A phenomenological framework is developed to model the kinetics of the formation of alkaline-earth metal amides by the ball-milling-induced reaction of their hydrides with gaseous ammonia. It is shown that the exponential character of the kinetic curves is modulated by the increase of the total volume of the powder inside the reactor, due to the substantially larger molar volume of the products compared with the reactants. It is claimed that the volume of powder effectively processed during each collision connects the transformation rate to the physical and chemical processes underlying the mechanochemical transformations.
A comparison between active and passive sensing of soil moisture from vegetated terrains
NASA Technical Reports Server (NTRS)
Fung, A. K.; Eom, H. J.
1985-01-01
A comparison between active and passive sensing of soil moisture over vegetated areas is studied via scattering models. In active sensing, three contributing terms to radar backscattering can be identified: (1) the ground surface scatter term; (2) the volume scatter term, representing scattering from the vegetation layer; and (3) the surface-volume scatter term, accounting for scattering from both surface and volume. In emission, three sources of contribution can also be identified: (1) surface emission; (2) upward volume emission from the vegetation layer; and (3) downward volume emission scattered upward by the ground surface. In the active case, as ground moisture increases, terms (1) and (3) increase due to the increase in permittivity. In passive sensing, however, term (1) decreases while term (3) increases for the same reason. This self-compensating effect produces a loss in sensitivity to changes in ground moisture. Furthermore, emission from vegetation may be larger than that from the ground. Hence, the presence of a vegetation layer causes a much greater loss of sensitivity for passive than for active sensing of soil moisture.
A comparison between active and passive sensing of soil moisture from vegetated terrains
NASA Technical Reports Server (NTRS)
Fung, A. K.; Eom, H. J.
1984-01-01
A comparison between active and passive sensing of soil moisture over vegetated areas is studied via scattering models. In active sensing, three contributing terms to radar backscattering can be identified: (1) the ground surface scatter term; (2) the volume scatter term, representing scattering from the vegetation layer; and (3) the surface-volume scatter term, accounting for scattering from both surface and volume. In emission, three sources of contribution can also be identified: (1) surface emission; (2) upward volume emission from the vegetation layer; and (3) downward volume emission scattered upward by the ground surface. In the active case, as ground moisture increases, terms (1) and (3) increase due to the increase in permittivity. In passive sensing, however, term (1) decreases while term (3) increases for the same reason. This self-compensating effect produces a loss in sensitivity to changes in ground moisture. Furthermore, emission from vegetation may be larger than that from the ground. Hence, the presence of a vegetation layer causes a much greater loss of sensitivity for passive than for active sensing of soil moisture.
Quantitative impact of pediatric sinus surgery on facial growth.
Senior, B; Wirtschafter, A; Mai, C; Becker, C; Belenky, W
2000-11-01
To quantitatively evaluate the long-term impact of sinus surgery on paranasal sinus development in the pediatric patient. Longitudinal review of eight pediatric patients treated with unilateral sinus surgery for periorbital or orbital cellulitis with an average follow-up of 6.9 years. Control subjects consisted of two groups, 9 normal adult patients with no computed tomographic evidence of sinusitis and 10 adult patients with scans consistent with sinusitis and a history of sinus-related symptoms extending to childhood. Application of computed tomography (CT) volumetrics, a technique allowing for precise calculation of volumes using thinly cut CT images, to the study and control groups. Paired Student t test analyses of side-to-side volume comparisons in the normal patients, patients with sinusitis, and patients who had surgery revealed no statistically significant differences. Comparisons between the orbital volumes of patients who did and did not have surgery revealed a statistically significant increase in orbital volume in patients who had surgery. Only minimal changes in facial volume measurements have been found, confirming clinical impressions that sinus surgery in children is safe and without significant cosmetic sequelae.
Poultry Processing Work and Respiratory Health of Latino Men and Women in North Carolina
Mirabelli, Maria C.; Chatterjee, Arjun B.; Arcury, Thomas A.; Mora, Dana C.; Blocker, Jill N.; Grzywacz, Joseph G.; Chen, Haiying; Marín, Antonio J.; Schulz, Mark R.; Quandt, Sara A.
2015-01-01
Objective To evaluate associations between poultry processing work and respiratory health among working Latino men and women in North Carolina. Methods Between May 2009 and November 2010, 402 poultry processing workers and 339 workers in a comparison population completed interviewer-administered questionnaires. Of these participants, 279 poultry processing workers and 222 workers in the comparison population also completed spirometry testing to provide measurements of forced expiratory volume in 1 second and forced vital capacity. Results Nine percent of poultry processing workers and 10% of workers in the comparison population reported current asthma. Relative to the comparison population, adjusted mean forced expiratory volume in 1 second and forced vital capacity were lower in the poultry processing population, particularly among men who reported sanitation job activities. Conclusions Despite the low prevalence of respiratory symptoms reported, poultry processing work may affect lung function. PMID:22237034
Albarrak, Abdulrahman; Coenen, Frans; Zheng, Yalin
2017-01-01
Three-dimensional (3D) (volumetric) diagnostic imaging techniques are indispensable with respect to the diagnosis and management of many medical conditions. However, there is a lack of automated diagnosis techniques to facilitate such 3D image analysis (although some support tools do exist). This paper proposes a novel framework for volumetric medical image classification founded on homogeneous decomposition and dictionary learning. In the proposed framework each image (volume) is recursively decomposed until homogeneous regions are arrived at. Each region is represented using a Histogram of Oriented Gradients (HOG), which is transformed into a set of feature vectors. A Gaussian Mixture Model (GMM) is then used to generate a "dictionary", and the Improved Fisher Kernel (IFK) approach is used to encode the feature vectors so as to generate a single feature vector for each volume, which can then be fed into a classifier generator. The principal advantage offered by the framework is that it does not require the detection (segmentation) of specific objects within the input data. The nature of the framework is fully described. A wide range of experiments was conducted to analyse the operation of the proposed framework; these are reported fully in the paper. Although the proposed approach is generally applicable to 3D volumetric images, the focus of the work is 3D retinal Optical Coherence Tomography (OCT) images in the context of the diagnosis of Age-related Macular Degeneration (AMD). The results indicate that excellent diagnostic predictions can be produced using the proposed framework. Copyright © 2016 Elsevier Ltd. All rights reserved.
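Purely as an illustrative sketch of the bag-of-features idea described above: the paper's actual pipeline uses recursive homogeneous decomposition, HOG descriptors, a GMM dictionary, and Improved Fisher Kernel encoding; the simplified stand-ins below (fixed blocks, gradient-orientation histograms, a tiny k-means dictionary, and soft-assignment encoding) are this editor's assumptions, not the authors' implementation.

```python
import numpy as np

def block_features(volume, block=8, bins=8):
    """Gradient-orientation histograms over fixed blocks of a 3D volume
    (a simplified stand-in for HOG over homogeneous regions)."""
    gx, gy, gz = np.gradient(volume.astype(float))
    ang = np.arctan2(gy, gx)              # orientation in the slice plane
    feats = []
    s = volume.shape
    for i in range(0, s[0] - block + 1, block):
        for j in range(0, s[1] - block + 1, block):
            for k in range(0, s[2] - block + 1, block):
                a = ang[i:i+block, j:j+block, k:k+block]
                h, _ = np.histogram(a, bins=bins, range=(-np.pi, np.pi))
                feats.append(h / (h.sum() + 1e-9))
    return np.array(feats)

def kmeans(X, k, iters=20, seed=0):
    """Tiny k-means, used here as a stand-in for a GMM dictionary."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), k, replace=False)]
    for _ in range(iters):
        d = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
        lab = d.argmin(1)
        for c in range(k):
            if (lab == c).any():
                centers[c] = X[lab == c].mean(0)
    return centers

def encode(feats, centers):
    """Soft-assignment histogram over dictionary atoms (a crude
    substitute for Improved Fisher Kernel encoding)."""
    d = ((feats[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    w = np.exp(-d)
    w /= w.sum(1, keepdims=True)
    return w.mean(0)                      # one fixed-length vector per volume

# Random volume standing in for an OCT scan
volume = np.random.default_rng(1).random((16, 16, 16))
feats = block_features(volume)            # 8 blocks x 8 orientation bins
centers = kmeans(feats, k=4)
code = encode(feats, centers)             # feed this vector to any classifier
```

The key property preserved from the framework is that each volume, whatever its size, ends up as a single fixed-length vector, so no object segmentation is needed before classification.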
A permutation testing framework to compare groups of brain networks.
Simpson, Sean L; Lyday, Robert G; Hayasaka, Satoru; Marsh, Anthony P; Laurienti, Paul J
2013-01-01
Brain network analyses have moved to the forefront of neuroimaging research over the last decade. However, methods for statistically comparing groups of networks have lagged behind. These comparisons have great appeal for researchers interested in gaining further insight into complex brain function and how it changes across different mental states and disease conditions. Current comparison approaches generally either rely on a summary metric or on mass-univariate nodal or edge-based comparisons that ignore the inherent topological properties of the network, yielding little power and failing to make network level comparisons. Gleaning deeper insights into normal and abnormal changes in complex brain function demands methods that take advantage of the wealth of data present in an entire brain network. Here we propose a permutation testing framework that allows comparing groups of networks while incorporating topological features inherent in each individual network. We validate our approach using simulated data with known group differences. We then apply the method to functional brain networks derived from fMRI data.
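As a minimal illustration of the permutation-testing idea (not the authors' implementation, which incorporates topological features of each network rather than a single summary metric), a two-group permutation test on a scalar network metric might look like:

```python
import numpy as np

def permutation_test(group_a, group_b, metric, n_perm=2000, seed=0):
    """Two-sample permutation test on a network summary metric.

    group_a, group_b: lists of adjacency matrices (numpy arrays).
    metric: function mapping an adjacency matrix to a scalar.
    Returns a two-sided p-value for the difference in group means.
    """
    rng = np.random.default_rng(seed)
    vals = np.array([metric(g) for g in group_a + group_b])
    n_a = len(group_a)
    observed = abs(vals[:n_a].mean() - vals[n_a:].mean())
    count = 0
    for _ in range(n_perm):
        perm = rng.permutation(vals)      # shuffle group labels
        if abs(perm[:n_a].mean() - perm[n_a:].mean()) >= observed:
            count += 1
    return (count + 1) / (n_perm + 1)     # add-one to avoid p = 0

# Toy example with a known group difference: group B networks are denser
rng = np.random.default_rng(1)
make_net = lambda p: (rng.random((20, 20)) < p).astype(float)
group_a = [make_net(0.2) for _ in range(10)]
group_b = [make_net(0.4) for _ in range(10)]
p_value = permutation_test(group_a, group_b, metric=lambda a: a.mean())
```

Because labels are permuted rather than assuming a parametric null, the same scaffold works for any network-level statistic, which is what lets the actual framework swap in topology-aware comparisons.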
Carlo, C N; Stefanacci, L; Semendeferi, K; Stevens, C F
2010-04-15
The amygdaloid complex (AC), a key component of the limbic system, is a brain region critical for the detection and interpretation of emotionally salient information. Therefore, changes in its structure and function are likely to provide correlates of mood and emotion disorders, diseases that afflict a large portion of the human population. Previous gross comparisons of the AC in control and diseased individuals have, however, mainly failed to discover these expected correlations with diseases. We have characterized AC nuclei in different nonhuman primate species to establish a baseline for more refined comparisons between the normal and the diseased amygdala. AC nuclei volume and neuron number in 19 subdivisions are reported from 13 Old and New World primate brains, spanning five primate species, and compared with corresponding data from humans. Analysis of the four largest AC nuclei revealed that volume and neuron number of one component, the central nucleus, has a negative allometric relationship with total amygdala volume and neuron number, which is in contrast with the isometric relationship found in the other AC nuclei (for both neuron number and volume). Neuron density decreases across all four nuclei according to a single power law with an exponent of about minus one-half. Because we have included quantitative comparisons with great apes and humans, our conclusions apply to human brains, and our scaling laws can potentially be used to study the anatomical correlates of the amygdala in disorders involving pathological emotion processing. (c) 2009 Wiley-Liss, Inc.
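The scaling result above (neuron density following a single power law with an exponent of about minus one-half) is the kind of relationship recoverable by log-log regression. A minimal sketch on synthetic data; the numbers are invented for illustration, not the paper's measurements:

```python
import numpy as np

# Hypothetical (synthetic) data: density ~ N^(-1/2) with multiplicative scatter
rng = np.random.default_rng(0)
total_neurons = np.logspace(5, 8, 30)     # total AC neuron number per specimen
density = 2.0e4 * total_neurons ** -0.5 * np.exp(rng.normal(0, 0.05, 30))

# A power law y = c * x^b is linear in log space: log y = b * log x + log c,
# so the exponent b is the slope of a log-log least-squares fit.
slope, intercept = np.polyfit(np.log(total_neurons), np.log(density), 1)
```

On the synthetic data the fitted slope recovers the generating exponent of -0.5 to within the injected scatter.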
NASA Astrophysics Data System (ADS)
Seryotkin, Yu. V.; Bakakin, V. V.; Likhacheva, A. Yu.; Dementiev, S. N.; Rashchenko, S. V.
2017-10-01
The structural evolution of Tl-exchanged natrolite with idealized formula Tl2[Al2Si3O10]·2H2O, compressed in penetrating (water:ethanol 1:1) and non-penetrating (paraffin) media, was studied up to 4 GPa. The presence of Tl+ with non-bonded electron lone pairs (E pairs), which can be either stereochemically active or passive, determines distinctive features of the high-pressure behavior of the Tl-form. The effective volume of the Tl+(O,H2O)n assemblages depends on the activity of the E pairs: single-sided coordination correlates with smaller volumes. Under ambient conditions, there are two types of Tl positions, only one of them having a nearly single-sided coordination, characteristic of a stereochemically active Tl+ E pair. Upon compression in paraffin, a phase transition occurs: a 5% volume contraction of the flexible natrolite framework is accompanied by the conversion of all the Tl+ cations into the stereochemically active state with a single-sided coordination. This effect requires the reconstruction of all the extra-framework subsystems, with the inversion of the cation and H2O positions. Compression in the water-containing medium increases the H2O content to three molecules per formula unit through the filling of partly vacant positions. This hinders a single-sided coordination of the Tl ions and preserves the configuration of their ion-molecular subsystem. It is likely that the extra-framework subsystem is responsible for the super-structure modulation.
An estimation framework for building information modeling (BIM)-based demolition waste by type.
Kim, Young-Chan; Hong, Won-Hwa; Park, Jae-Woo; Cha, Gi-Wook
2017-12-01
Most existing studies on demolition waste (DW) quantification lack an official standard for estimating the amount and type of DW. Therefore, there are limitations in the existing literature for estimating DW with a consistent classification system. Building information modeling (BIM) is a technology that can generate and manage all the information required during the life cycle of a building, from design to demolition. Nevertheless, there has been a lack of research regarding its application to the demolition stage of a building. For an effective waste management plan, the estimation of the type and volume of DW should begin at the building design stage. However, the lack of tools hinders early estimation. This study proposes a BIM-based framework that estimates DW in the early design stages, to achieve effective and streamlined planning, processing, and management. Specifically, the input construction materials in the Korean construction classification system were matched with those in the BIM library. Based on this matching, the estimates of DW by type were calculated by applying the weight/unit-volume factors and the rates of DW volume change. To verify the framework, its operation was demonstrated by means of an actual BIM modeling exercise and by comparing its results with those available in the literature. This study is expected to contribute not only to the estimation of DW at the building level, but also to the automated estimation of DW at the district level.
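The core quantification step described above, converting modeled material volumes into waste weights via weight-per-unit-volume factors and into hauled volumes via volume-change (bulking) rates, can be sketched as follows. The factor values and material names are placeholders, not the study's actual Korean classification figures:

```python
# Hypothetical per-material factors (placeholders, not the study's values):
# weight per unit volume in t/m3, and volume-change (bulking) rate after demolition
FACTORS = {
    "concrete": {"t_per_m3": 2.4, "bulking": 1.6},
    "brick":    {"t_per_m3": 1.9, "bulking": 1.5},
    "timber":   {"t_per_m3": 0.6, "bulking": 1.3},
}

def estimate_dw(bim_quantities_m3):
    """Estimate demolition waste weight (t) and bulked volume (m3) by type,
    from in-place material volumes taken off a BIM model."""
    out = {}
    for material, vol in bim_quantities_m3.items():
        f = FACTORS[material]
        out[material] = {
            "weight_t": vol * f["t_per_m3"],          # in-place volume -> weight
            "bulked_volume_m3": vol * f["bulking"],   # in-place -> loose volume
        }
    return out

# Example: quantities extracted from a (hypothetical) BIM model
dw = estimate_dw({"concrete": 120.0, "brick": 35.0, "timber": 10.0})
```

The point of pushing this into the design stage is that the BIM quantity take-off supplies `bim_quantities_m3` automatically, so the estimate updates with every design revision.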
Change of mean platelet volume values in asthmatic children as an inflammatory marker.
Tuncel, T; Uysal, P; Hocaoglu, A B; Erge, D O; Karaman, O; Uzuner, N
2012-01-01
Asthma is the most common chronic disease of childhood in industrialised countries. T helper-2 (Th-2) cells, mast cells and eosinophils have a role in the inflammation of asthma. Recently it was shown that platelets also play a role in asthma. Mean platelet volume indicates platelet size and reflects platelet activation. The aim of this retrospective study is to evaluate levels of mean platelet volume in asthmatic patients during asymptomatic periods and exacerbations compared with healthy controls. The study consisted of 100 asthmatic patients (male/female: 55/45, mean age: 8.2 ± 3.3 years) and 49 age- and sex-matched healthy children as a control group. Mean platelet volume values of asthmatic patients during the asymptomatic period were 7.7 ± 0.8 fL, while mean platelet volume values during exacerbation were 7.8 ± 0.9 fL. Comparison of mean platelet volume values of asthmatic patients and healthy controls, both during acute asthmatic attack and during the asymptomatic period, showed no difference (p>0.05). Comparison of mean platelet volume values at asthmatic attack and during the asymptomatic period also showed no difference (p>0.05). The presence of atopy, infection, eosinophilia, elevated immunoglobulin E, and severity of acute asthmatic attack did not influence mean platelet volume values. The results of our study suggest that mean platelet volume may not be useful as a marker in bronchial asthma, although prospective studies with larger numbers of patients are needed to evaluate the role of mean platelet volume in asthma. Copyright © 2011 SEICAP. Published by Elsevier Espana. All rights reserved.
A simple model for the spatially-variable coastal response to hurricanes
Stockdon, H.F.; Sallenger, A.H.; Holman, R.A.; Howd, P.A.
2007-01-01
The vulnerability of a beach to extreme coastal change during a hurricane can be estimated by comparing the relative elevations of storm-induced water levels to those of the dune or berm. A simple model that defines the coastal response based on these elevations was used to hindcast the potential impact regime along a 50-km stretch of the North Carolina coast for the landfalls of Hurricane Bonnie on August 27, 1998, and Hurricane Floyd on September 16, 1999. Maximum total water levels at the shoreline were calculated as the sum of modeled storm surge, astronomical tide, and wave runup, estimated from offshore wave conditions and the local beach slope using an empirical parameterization. Storm surge and wave runup each accounted for ∼ 48% of the signal (the remaining 4% is attributed to astronomical tides), indicating that wave-driven processes are a significant contributor to hurricane-induced water levels. Expected water levels and lidar-derived measures of pre-storm dune and berm elevation were used to predict the spatially-varying storm-impact regime: swash, collision, or overwash. Predictions were compared to the observed response quantified using a lidar topography survey collected following hurricane landfall. The storm-averaged mean accuracy of the model in predicting the observed impact regime was 55.4%, a significant improvement over the 33.3% accuracy associated with random chance. Model sensitivity varied between regimes and was highest within the overwash regime, where the accuracies were 84.2% and 89.7% for Hurricanes Bonnie and Floyd, respectively. The model not only allows for prediction of the general coastal response to storms, but also provides a framework for examining the longshore-variable magnitudes of observed coastal change. For Hurricane Bonnie, shoreline and beach volume changes within locations that experienced overwash or dune erosion were two times greater than at locations where wave runup was confined to the foreshore (swash regime).
During Hurricane Floyd, this pattern became more pronounced as magnitudes of change were four times greater within the overwash regime than in the swash regime. Comparisons of pre-storm topography to a calm weather survey collected one year after Hurricane Floyd's landfall show long-term beach volume loss at overwash locations. Here, the volume of sand eroded from the beach was balanced by the volume of overwash deposits, indicating that the majority of the sand removed from the beach was transported landward across the island rather than being transported offshore. In overwash locations, sand was removed from the nearshore system and unavailable for later beach recovery, resulting in a more permanent response than observed within the other regimes. These results support the predictive capabilities of the storm scaling model and illustrate that the impact regimes provide a framework for explaining the longshore-variable coastal response to hurricanes.
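The regime logic of the model described above, comparing the maximum total water level (storm surge + astronomical tide + wave runup) against dune toe and crest elevations, can be sketched as follows. The threshold structure follows the abstract; the elevation values and the pre-computed runup term are illustrative placeholders, not survey or hindcast data:

```python
def impact_regime(surge_m, tide_m, runup_m, dune_toe_m, dune_crest_m):
    """Classify the storm-impact regime from the maximum total water level
    (sum of storm surge, astronomical tide, and wave runup) relative to
    lidar-derived dune toe and crest elevations (all in metres above datum)."""
    twl = surge_m + tide_m + runup_m
    if twl < dune_toe_m:
        return "swash"        # runup confined to the foreshore
    elif twl < dune_crest_m:
        return "collision"    # waves attack the dune face
    else:
        return "overwash"     # water overtops the dune crest

# Illustrative elevations (not measured values)
regime = impact_regime(surge_m=1.5, tide_m=0.5, runup_m=1.2,
                       dune_toe_m=2.0, dune_crest_m=4.0)
```

Run along a longshore grid of lidar-derived toe/crest elevations, this per-transect classification yields the spatially-variable regime map that the study validates against post-storm surveys.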
ERIC Educational Resources Information Center
Rees, Alan M; Schultz, Douglas G.
An empirical study of the nature and variability of the relevance judgment process was conducted from July 1, 1965 to September 30, 1967. Volume I of the final report presents a literature review and statement of the theoretical framework of the study, a discussion of the experimental design and a summary of data analyses. The study had two…
ERIC Educational Resources Information Center
Schalock, H. Del, Ed.; Sell, G. Roger, Ed.
This volume represents the output of a yearlong effort to clarify and firm the conceptual base that underlies educational RDD&E. It contains three commissioned papers authored by Drs. Hendrik D. Gideonse, Gene V Glass and Blaine R. Worthen, and by Leslie J. Briggs and one paper prepared by H. Del Schalock and G. Roger Sell of the staff of the…
Comparison of logging residue from lump sum and log scale timber sales.
James O Howard; Donald J. DeMars
1985-01-01
Data from 1973 and 1980 logging residue studies were used to compare the volume of residue from lump sum and log scale timber sales. Covariance analysis was used to adjust the mean volume for each data set for potential variation resulting from differences in stand conditions. Mean residue volumes from the two sale types were significantly different at the 5-percent...
Dae-Kwan Kim; Daniel M. Spotts; Donald F. Holecek
1998-01-01
This paper compares estimates of pleasure trip volume and expenditures derived from a regional telephone survey to those derived from the TravelScope mail panel survey. Significantly different estimates emerged, suggesting that survey-based estimates of pleasure trip volume and expenditures, at least in the case of the two surveys examined, appear to be affected by...
Validation of Volume and Taper Equations for Loblolly, Shortleaf, and Slash Pine
Allan E. Tiarks; V. Clark Baldwin
1999-01-01
Inside-bark diameter measurements at 6.64 intervals of 137 loblolly, 52 shortleaf, and 64 slash pines were used to calculate the actual volume and taper of each species for comparison with volumes and tapers predicted from published equations. The loblolly pine were cut in Texas (TX) and Louisiana (LA) while the shortleaf was sampled only in TX. The slash pine were...
Don Minore; Donald R. Gedney
1960-01-01
A large proportion of present-day timber cruising is done by measuring or estimating three tree dimensions: diameter at breast height, form class, and merchantable height. Tree volumes are then determined from tables which equate volume to the varying combinations of height, d.b.h., and form class. Assumptions concerning merchantable height were made in constructing...
Harry V. Wiant, Jr.; Michael L. Spangler; John E. Baumgras
2002-01-01
Various taper systems and the centroid method were compared with unbiased volume estimates made by importance sampling for 720 hardwood trees selected throughout the state of West Virginia. Only the centroid method consistently gave volume estimates that did not differ significantly from those made by importance sampling, although some taper equations did well for most...
An Analytical Framework for the Cross-Country Comparison of Higher Education Governance
ERIC Educational Resources Information Center
Dobbins, Michael; Knill, Christoph; Vogtle, Eva Maria
2011-01-01
In this article we provide an integrated framework for the analysis of higher education governance which allows us to more systematically trace the changes that European higher education systems are currently undergoing. We argue that, despite highly insightful previous analyses, there is a need for more specific empirically observable indicators…
Evaluating theories of drought-induced vegetation mortality using a multimodel-experiment framework
Nate G. McDowell; Rosie A. Fisher; Chonggang Xu; J. C. Domec; Teemu Holtta; D. Scott Mackay; John S. Sperry; Amanda Boutz; Lee Dickman; Nathan Gehres; Jean Marc Limousin; Alison Macalady; Jordi Martinez-Vilalta; Maurizio Mencuccini; Jennifer A. Plaut; Jerome Ogee; Robert E. Pangle; Daniel P. Rasse; Michael G. Ryan; Sanna Sevanto; Richard H. Waring; A. Park Williams; Enrico A. Yepez; William T. Pockman
2013-01-01
Model-data comparisons of plant physiological processes provide an understanding of mechanisms underlying vegetation responses to climate. We simulated the physiology of a pinon pine-juniper woodland (Pinus edulis-Juniperus monosperma) that experienced mortality during a 5 yr precipitation-reduction experiment, allowing a framework with which to examine our knowledge...
ERIC Educational Resources Information Center
Butler, Norman L.; Pachocinski, Ryszard; Davidson, Barry S.
2006-01-01
The aim of this study was to compare Polish post-secondary vocational institutions with Canadian community colleges using an information technology theoretical framework consisting of three parts: participation, feedback and partnership. The research concentrated upon programs in nursing, tourism and information technology delivered by the three…
A Comparison of State Alternative Education Accountability Policies and Frameworks
ERIC Educational Resources Information Center
Schlessman, Amy; Hurtado, Kelly
2012-01-01
The purpose of this policy study was to report descriptive research on state-level policy and frameworks for accountability systems of alternative education in the United States. The six states (California, Colorado, Texas, Florida, Oklahoma, and North Carolina) identified in the 2010 Jobs for the Future policy analysis of alternative education…
Introducing an Equal Rights Framework for Older Persons in Residential Care
Jönson, Håkan; Harnett, Tove
2016-01-01
This article reconceptualizes residential care for older persons by introducing a framework developed from a rights-based principle of disability policies: the normalization principle. This principle is part of the social model and states that society should make available for people who have impairments living conditions that are as close as possible to those of “others.” Using the framework on the case of eldercare in Sweden shows that although disability policies have used people without impairments as a comparative (external) reference group for claiming rights, eldercare policies use internal reference groups, basing comparisons on other care users. The article highlights the need for external comparisons in eldercare and suggests that the third age, which so far has been a normative reference group for older people, could be a comparative reference group when older persons in need of care claim rights to equal conditions. PMID:26035884
NASA Astrophysics Data System (ADS)
Zhu, Cheng; Pouya, Ahmad; Arson, Chloé
2015-11-01
This paper aims to gain fundamental understanding of the microscopic mechanisms that control the transition between secondary and tertiary creep around salt caverns in typical geological storage conditions. We use a self-consistent inclusion-matrix model to homogenize the viscoplastic deformation of halite polycrystals and predict the number of broken grains in a Representative Elementary Volume of salt. We use this micro-macro modeling framework to simulate creep tests under various axial stresses, which gives us the critical viscoplastic strain at which grain breakage (i.e., tertiary creep) is expected to occur. The comparison of simulation results for short-term and long-term creep indicates that the initiation of tertiary creep depends on both the stress and the viscoplastic strain. We use the critical viscoplastic deformation as a yield criterion to control the transition between secondary and tertiary creep in a phenomenological viscoplastic model, which we implement into the Finite Element Method program POROFIS. We model an 850-m-deep salt cavern of irregular shape, in axisymmetric conditions. Simulations of cavern depressurization indicate that a strain-dependent damage evolution law is more suitable than a stress-dependent one, because it avoids high damage concentrations and allows capturing the formation of a damaged zone around the cavity. The modeling framework explained in this paper is expected to provide new insights to link grain breakage to phenomenological damage variables used in Continuum Damage Mechanics.
NASA Astrophysics Data System (ADS)
Mubako, S. T.; Fullerton, T. M.; Walke, A.; Collins, T.; Mubako, G.; Walker, W. S.
2014-12-01
Water productivity is an area of growing interest in assessing the impact of human economic activities on water resources, especially in arid regions. Indicators of water productivity can assist water users in evaluating sectoral water use efficiency, identifying sources of pressure on water resources, and supporting water allocation rationale under scarcity conditions. This case study for the water-scarce Middle Rio Grande River Basin aims to develop an environmental-economic accounting approach for water use in arid river basins, through a methodological framework that relates water use to the human economic activities impacting regional water resources. Water uses are coupled to economic transactions, and the complex but mutual relations between the various water-using sectors are estimated. A comparison is made between the calculated water productivity indicators and the representative cost/price per unit volume of water for the main water use sectors. Preliminary results confirm that Irrigation, although it contributes very little to regional economic output, is among the sectors with the largest direct water use intensities. High economic value, low water use intensity sectors in the study region include Manufacturing, Mining, and Steam Electric Power. Water accounting challenges revealed by the study include differences in water management regimes between jurisdictions and limited understanding of the impact of major economic activities on the interaction between surface and groundwater systems in this region. A more comprehensive assessment would require the incorporation of environmental and social sustainability indicators alongside the calculated water productivity indicators.
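A water productivity indicator of the kind described, economic output per unit volume of water used by a sector, reduces to a simple ratio. The sector figures below are invented placeholders, not the study's Middle Rio Grande data; they are chosen only to reproduce the qualitative pattern the abstract reports (irrigation: high water use, low output per unit of water):

```python
# Hypothetical sector data: economic output (million $) and water use (million m3)
SECTORS = {
    "irrigation":    {"output_musd": 120.0, "water_mcm": 800.0},
    "manufacturing": {"output_musd": 950.0, "water_mcm": 40.0},
    "mining":        {"output_musd": 400.0, "water_mcm": 25.0},
}

def water_productivity(sectors):
    """Output per unit of water by sector.

    Millions of dollars per million m3 cancel, so the ratio can be read
    directly as dollars per cubic metre.
    """
    return {name: d["output_musd"] / d["water_mcm"]
            for name, d in sectors.items()}

wp = water_productivity(SECTORS)
# With these placeholder numbers, irrigation has by far the lowest $/m3,
# mirroring the high direct water use intensity noted in the abstract.
```

Comparing such indicators against the cost or price per unit volume of water, as the study does, then shows where water is allocated to low-value uses under scarcity.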
NASA Technical Reports Server (NTRS)
Mohr, Karen Irene; Tao, Wei-Kuo; Chern, Jiun-Dar; Kumar, Sujay V.; Peters-Lidard, Christa D.
2013-01-01
The present generation of general circulation models (GCMs) uses parameterized cumulus schemes and runs at hydrostatic grid resolutions. To improve the representation of cloud-scale moist processes and land-atmosphere interactions, a global Multi-scale Modeling Framework (MMF) coupled to the Land Information System (LIS) has been developed at NASA Goddard Space Flight Center. The MMF-LIS has three components, a finite-volume (fv) GCM (Goddard Earth Observing System Ver. 4, GEOS-4), a 2D cloud-resolving model (Goddard Cumulus Ensemble, GCE), and the LIS, representing the large-scale atmospheric circulation, cloud processes, and land surface processes, respectively. The non-hydrostatic GCE model replaces the single-column cumulus parameterization of the fvGCM. The model grid is composed of an array of fvGCM gridcells, each with a series of embedded GCE models. A horizontal coupling strategy, GCE-fvGCM-Coupler-LIS, offered significant computational efficiency, with the scalability and I/O capabilities of LIS permitting land-atmosphere interactions at cloud scale. Global simulations of 2007-2008 and comparisons to observations and reanalysis products were conducted. Using two different versions of the same land surface model but the same initial conditions, divergence in regional, synoptic-scale surface pressure patterns emerged within two weeks. The sensitivity of large-scale circulations to land surface model physics revealed significant functional value to using a scalable, multi-model land surface modeling system in global weather and climate prediction.
Exact consideration of data redundancies for spiral cone-beam CT
NASA Astrophysics Data System (ADS)
Lauritsch, Guenter; Katsevich, Alexander; Hirsch, Michael
2004-05-01
In multi-slice spiral computed tomography (CT) there is an obvious trend toward adding more and more detector rows. The goals are numerous: volume coverage, isotropic spatial resolution, and speed. Consequently, there will be a variety of scan protocols optimizing clinical applications. Flexibility in table feed requires consideration of data redundancies to ensure efficient detector usage. Until recently this was achieved by approximate reconstruction algorithms only. However, due to the increasing cone angles there is a need for exact treatment of the cone-beam geometry. A new, exact and efficient 3-PI algorithm for considering three-fold data redundancies was derived from a general theoretical framework based on 3D Radon inversion using Grangeat's formula. The 3-PI algorithm possesses the same simple and efficient structure as the previously proposed 1-PI method for non-redundant data. Filtering is one-dimensional, performed along lines with variable tilt on the detector. This talk deals with a thorough evaluation of the performance of the 3-PI algorithm in comparison with the 1-PI method. Image quality of the 3-PI algorithm is superior: the prominent spiral artifacts and other discretization artifacts are significantly reduced due to averaging effects when redundant data are taken into account, and the signal-to-noise ratio is increased. The computational expense is comparable even to that of approximate algorithms. The 3-PI algorithm proves its practicability for applications in medical imaging. Other exact n-PI methods for n-fold data redundancies (n odd) can be deduced from the same general theoretical framework.
Inter-comparison of isotropic and anisotropic sea ice rheology in a fully coupled model
NASA Astrophysics Data System (ADS)
Roberts, A.; Cassano, J. J.; Maslowski, W.; Osinski, R.; Seefeldt, M. W.; Hughes, M.; Duvivier, A.; Nijssen, B.; Hamman, J.; Hutchings, J. K.; Hunke, E. C.
2015-12-01
We present the sea ice climate of the Regional Arctic System Model (RASM), using a suite of new physics available in the Los Alamos Sea Ice Model (CICE5). RASM is a high-resolution fully coupled pan-Arctic model that also includes the Parallel Ocean Program (POP), the Weather Research and Forecasting Model (WRF) and the Variable Infiltration Capacity (VIC) land model. The model domain extends from ~45˚N to the North Pole and is configured to run at ~9km resolution for the ice and ocean components, coupled to 50km resolution atmosphere and land models. The baseline sea ice model configuration includes mushy-layer sea ice thermodynamics and level-ice melt ponds. Using this configuration, we compare the use of isotropic and anisotropic sea ice mechanics, and evaluate model performance using these two variants against observations including Arctic buoy drift and deformation, satellite-derived drift and deformation, and sea ice volume estimates from ICESat. We find that the isotropic rheology better approximates spatial patterns of thickness observed across the Arctic, but that both rheologies closely approximate scaling laws observed in the pack using buoys and RGPS data. A fundamental property of both ice mechanics variants, the so-called Elastic-Viscous-Plastic (EVP) and Elastic-Anisotropic-Plastic (EAP) rheologies, is that they are highly sensitive to the timestep used for elastic sub-cycling in an inertial-resolving coupled framework, and this has a significant effect on surface fluxes in the fully coupled framework.
NASA Astrophysics Data System (ADS)
Mahootian, F.
2009-12-01
The rapid convergence of advancing sensor technology, computational power, and knowledge discovery techniques over the past decade has brought unprecedented volumes of astronomical data together with unprecedented capabilities of data assimilation and analysis. A key result is that a new, data-driven "observational-inductive" framework for scientific inquiry is taking shape and proving viable. The anticipated rise in data flow and processing power will have profound effects, e.g., confirmations and disconfirmations of existing theoretical claims both for and against the big bang model. But beyond enabling new discoveries, can new data-driven frameworks of scientific inquiry reshape the epistemic ideals of science? The history of physics offers a comparison. The Bohr-Einstein debate over the "completeness" of quantum mechanics centered on a question of ideals: what counts as science? We briefly examine lessons from that episode and pose questions about their applicability to cosmology. If the history of 20th century physics is any indication, the abandonment of absolutes (e.g., space, time, simultaneity, continuity, determinacy) can produce fundamental changes in understanding. The classical ideal of science, operative in both physics and cosmology, descends from the European Enlightenment. This ideal has for over 200 years guided science to seek the ultimate order of nature, to pursue the absolute theory, the "theory of everything." But now that we have new models of scientific inquiry powered by new technologies and driven more by data than by theory, it is time, finally, to relinquish dreams of a "final" theory.
Multi-Modal Glioblastoma Segmentation: Man versus Machine
Pica, Alessia; Schucht, Philippe; Beck, Jürgen; Verma, Rajeev Kumar; Slotboom, Johannes; Reyes, Mauricio; Wiest, Roland
2014-01-01
Background and Purpose Reproducible segmentation of brain tumors on magnetic resonance images is an important clinical need. This study was designed to evaluate the reliability of a novel fully automated segmentation tool for brain tumor image analysis in comparison to manually defined tumor segmentations. Methods We prospectively evaluated preoperative MR images from 25 glioblastoma patients. Two independent expert raters performed manual segmentations. Automatic segmentations were performed using the Brain Tumor Image Analysis software (BraTumIA). In order to study the different tumor compartments, the complete tumor volume TV (enhancing part plus non-enhancing part plus necrotic core of the tumor), the TV+ (TV plus edema) and the contrast-enhancing tumor volume CETV were identified. We quantified the overlap between manual and automated segmentation by calculation of diameter measurements as well as the Dice coefficients, the positive predictive values, sensitivity, relative volume error and absolute volume error. Results Comparison of automated versus manual extraction of 2-dimensional diameter measurements showed no significant difference (p = 0.29). Comparison of automated versus manual volumetric segmentations showed significant differences for TV+ and TV (p<0.05) but no significant differences for CETV (p>0.05) with regard to the Dice overlap coefficients. Spearman's rank correlation coefficients (ρ) of TV+, TV and CETV showed highly significant correlations between automatic and manual segmentations. Tumor localization did not influence the accuracy of segmentation. Conclusions In summary, we demonstrated that BraTumIA supports radiologists and clinicians by providing accurate measures of cross-sectional diameter-based tumor extensions. The automated volume measurements were comparable to manual tumor delineation for CETV tumor volumes, and outperformed inter-rater variability for overlap and sensitivity. PMID:24804720
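The overlap statistics named in the abstract above (Dice coefficient, positive predictive value, sensitivity) reduce to simple set operations on binary segmentation masks. A minimal sketch with illustrative arrays, not study data:

```python
import numpy as np

def overlap_metrics(auto_mask, manual_mask):
    """Dice, positive predictive value, and sensitivity for two binary masks."""
    a = np.asarray(auto_mask, dtype=bool)
    m = np.asarray(manual_mask, dtype=bool)
    tp = np.logical_and(a, m).sum()        # voxels labeled tumor by both
    dice = 2.0 * tp / (a.sum() + m.sum())  # overlap measure in [0, 1]
    ppv = tp / a.sum()                     # fraction of automatic voxels confirmed manually
    sensitivity = tp / m.sum()             # fraction of manual voxels recovered automatically
    return dice, ppv, sensitivity

# toy 2x3 masks: 2 voxels agree, each rater marks 3 voxels
auto = np.array([[1, 1, 0], [1, 0, 0]])
manual = np.array([[1, 1, 0], [0, 1, 0]])
dice, ppv, sens = overlap_metrics(auto, manual)
# tp = 2, so dice = 2*2/(3+3) = 2/3, ppv = 2/3, sensitivity = 2/3
```

A Dice value of 1 means perfect voxel-wise agreement; values reported in the study compare BraTumIA output against each expert rater.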
Multiloop integral system test (MIST): Final report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gloudemans, J.R.
1991-04-01
The Multiloop Integral System Test (MIST) is part of a multiphase program started in 1983 to address small-break loss-of-coolant accidents (SBLOCAs) specific to Babcock and Wilcox designed plants. MIST is sponsored by the US Nuclear Regulatory Commission, the Babcock and Wilcox Owners Group, the Electric Power Research Institute, and Babcock and Wilcox. The unique features of the Babcock and Wilcox design, specifically the hot leg U-bends and steam generators, prevented the use of existing integral system data or existing integral facilities to address the thermal-hydraulic SBLOCA questions. MIST was specifically designed and constructed for this program, and an existing facility, the Once Through Integral System (OTIS), was also used. Data from MIST and OTIS are used to benchmark the adequacy of system codes, such as RELAP5 and TRAC, for predicting abnormal plant transients. The MIST program is reported in 11 volumes. Volumes 2 through 8 pertain to groups of Phase 3 tests by type; Volume 9 presents inter-group comparisons; Volume 10 provides comparisons between the RELAP5/MOD2 calculations and MIST observations; and Volume 11 (with addendum) presents the later Phase 4 tests. This is Volume 1 of the MIST final report, a summary of the entire MIST program. Major topics include Test Advisory Group (TAG) issues, facility scaling and design, the test matrix, observations, comparison of RELAP5 calculations to MIST observations, and MIST versus the TAG issues. MIST generated consistent integral-system data covering a wide range of transient interactions. MIST provided insight into integral system behavior and assisted the code effort. The MIST observations addressed each of the TAG issues. 11 refs., 29 figs., 9 tabs.
Gilmore, John H.; Kang, Chaeryon; Evans, Dianne D.; Wolfe, Honor M.; Smith, J. Keith; Lieberman, Jeffrey A.; Lin, Weili; Hamer, Robert M.; Styner, Martin; Gerig, Guido
2011-01-01
Objective Schizophrenia is a neurodevelopmental disorder associated with abnormalities of brain structure and white matter, although little is known about when these abnormalities arise. This study was conducted to identify structural brain abnormalities in the prenatal and neonatal periods associated with genetic risk for schizophrenia. Method Prenatal ultrasound scans and neonatal structural magnetic resonance imaging (MRI) and diffusion tensor imaging were prospectively obtained in the offspring of mothers with schizophrenia or schizoaffective disorder (N=26) and matched comparison mothers without psychiatric illness (N=26). Comparisons were made for prenatal lateral ventricle width and head circumference, for neonatal intracranial, CSF, gray matter, white matter, and lateral ventricle volumes, and for neonatal diffusion properties of the genu and splenium of the corpus callosum and corticospinal tracts. Results Relative to the matched comparison subjects, the offspring of mothers with schizophrenia did not differ in prenatal lateral ventricle width or head circumference. Overall, the high-risk neonates had nonsignificantly larger intracranial, CSF, and lateral ventricle volumes. Subgroup analysis revealed that male high-risk infants had significantly larger intracranial, CSF, total gray matter, and lateral ventricle volumes; the female high-risk neonates were similar to the female comparison subjects. There were no group differences in white matter diffusion tensor properties. Conclusions Male neonates at genetic risk for schizophrenia had several larger than normal brain volumes, while females did not. To the authors' knowledge, this study provides the first evidence, in the context of its limitations, that early neonatal brain development may be abnormal in males at genetic risk for schizophrenia. PMID:20516153
Chedid, Aljamir D.; Chedid, Marcio F.; Winkelmann, Leonardo V.; Filho, Tomaz J. M. Grezzana; Kruel, Cleber D. P.
2015-01-01
Perioperative mortality following pancreaticoduodenectomy has improved over time and is lower than 5% in selected high-volume centers. Based on several large literature series on pancreaticoduodenectomy from high-volume centers, some defend that high annual volumes are necessary for good outcomes after pancreaticoduodenectomy. We report here the outcomes of a low annual volume pancreaticoduodenectomy series after incorporating technical expertise from a high-volume center. We included all patients who underwent pancreaticoduodenectomy performed by a single surgeon (A.D.C.) as treatment for periampullary malignancies from 1981 to 2005. Outcomes of this series were compared to those of 3 high-volume literature series. Additionally, outcomes for the first 10 cases in the present series were compared to those of all 37 remaining cases in this series. A total of 47 pancreaticoduodenectomies were performed over a 25-year period. Overall in-hospital mortality was 2 cases (4.3%), and morbidity occurred in 23 patients (48.9%). Both mortality and morbidity were similar to those of each of the three high-volume center comparison series. Comparison of the outcomes for the first 10 to the remaining 37 cases in this series revealed that the latter 37 cases had lower mortality (20% versus 0%; P = 0.042), fewer tumor-positive margins (50% versus 13.5%; P = 0.024), less frequent use of intraoperative blood transfusions (90% versus 32.4%; P = 0.003), and a tendency toward shorter in-hospital stay (20 versus 15.8 days; P = 0.053). Accumulation of surgical experience and incorporation of expertise from high-volume centers may enable achieving satisfactory outcomes after pancreaticoduodenectomy in low-volume settings whenever referral to a high-volume center is limited. PMID:25875555
A novel framework of tissue membrane systems for image fusion.
Zhang, Zulin; Yi, Xinzhong; Peng, Hong
2014-01-01
This paper proposes a tissue membrane system-based framework to deal with the optimal image fusion problem. A spatial domain fusion algorithm is given, and a tissue membrane system of multiple cells is used as its computing framework. Based on the multicellular structure and inherent communication mechanism of the tissue membrane system, an improved velocity-position model is developed. The performance of the fusion framework is studied in comparison with several traditional fusion methods as well as genetic algorithm (GA)-based and differential evolution (DE)-based spatial domain fusion methods. Experimental results show that the proposed fusion framework is superior or comparable to the other methods and can be efficiently used for image fusion.
Wang, Hongxiang; Wang, Wei; Hu, Dandan; Luo, Min; Xue, Chaozhuang; Li, Dongsheng; Wu, Tao
2018-06-04
Reported here is a unique crystalline semiconductor open-framework material built from large supertetrahedral T4 and T5 clusters with Mn-In-S composition. The hybrid assembly of T4 and T5 clusters through shared terminal μ2-S2- ions is observed for the first time among cluster-based chalcogenide open frameworks. The three-dimensional structure displays a non-interpenetrated diamond-type topology with an extra-large nonframework volume of 82%. Moreover, the ion exchange, CO2 adsorption, and photoluminescence properties of the title compound are also investigated.
Traffic offense sentencing processes and highway safety. Volume 1, Summary report
DOT National Transportation Integrated Search
1977-04-01
The history and development of traffic offense sanctions are reviewed. Criteria for traffic offense sanctions are discussed in terms of evenness, economy, appropriateness, rational allocation, effectiveness and parsimony. The framework for developmen...
Open Scenario Study, Phase I. Volume 3. Questionnaire Response
2008-03-01
a conceptual framework to assist users to grasp new ideas. As the concepts are used by a broad community and internationally, classified scenarios are not necessary for ... from other sources Comment: Normally have an old version that we can update. Occasionally we’ve had to develop new scenarios - the SSSP for example - we
Use of Mechanical Turk as a MapReduce Framework for Macular OCT Segmentation.
Lee, Aaron Y; Lee, Cecilia S; Keane, Pearse A; Tufail, Adnan
2016-01-01
Purpose. To evaluate the feasibility of using Mechanical Turk as a massively parallel platform to perform manual segmentations of macular spectral domain optical coherence tomography (SD-OCT) images using a MapReduce framework. Methods. A macular SD-OCT volume of 61 slice images was map-distributed to Amazon Mechanical Turk. Each Human Intelligence Task was set to $0.01 and required the user to draw five lines to outline the sublayers of the retinal OCT image after being shown example images. Each image was submitted twice for segmentation, and interrater reliability was calculated. The interface was created using custom HTML5 and JavaScript code, and data analysis was performed using R. An automated pipeline was developed to handle the map and reduce steps of the framework. Results. More than 93,500 data points were collected using this framework for the 61 images submitted. Pearson's correlation of interrater reliability was 0.995 (p < 0.0001) and coefficient of determination was 0.991. The cost of segmenting the macular volume was $1.21. A total of 22 individual Mechanical Turk users provided segmentations, each completing an average of 5.5 HITs. Each HIT was completed in an average of 4.43 minutes. Conclusions. Amazon Mechanical Turk provides a cost-effective, scalable, high-availability infrastructure for manual segmentation of OCT images.
PMID:27293877
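The interrater reliability quoted above is a plain Pearson correlation between paired measurements from the two submissions of each image. A self-contained sketch with made-up layer-boundary depths (not the study's data):

```python
import statistics

def pearson_r(x, y):
    """Pearson correlation between two raters' boundary positions on the same pixels."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = sum((a - mx) ** 2 for a in x) ** 0.5
    sy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (sx * sy)

# hypothetical layer-boundary depths (pixels) from two independent HIT passes
rater1 = [10.0, 12.5, 15.0, 20.0, 25.5]
rater2 = [10.2, 12.4, 15.3, 19.8, 25.9]
r = pearson_r(rater1, rater2)
r_squared = r ** 2  # coefficient of determination, as reported in the study
```

With nearly identical traces, r approaches 1; the study reports r = 0.995 over more than 93,500 paired points.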
Takahashi, Masahiro; Kimura, Fumiko; Umezawa, Tatsuya; Watanabe, Yusuke; Ogawa, Harumi
2016-01-01
Adaptive statistical iterative reconstruction (ASIR) has been used to reduce radiation dose in cardiac computed tomography. However, changes in image parameters caused by ASIR, as compared to filtered back projection (FBP), may influence the quantification of coronary calcium. The aim was to investigate the influence of ASIR on calcium quantification in comparison to FBP. In 352 patients, CT images were reconstructed using FBP alone and FBP combined with ASIR 30%, 50%, 70%, and 100%, based on the same raw data. Image noise, plaque density, Agatston scores and calcium volumes were compared among the techniques. Image noise, Agatston score, and calcium volume decreased significantly with ASIR compared to FBP (each P < 0.001). Use of ASIR reduced the Agatston score by 10.5% to 31.0%. In calcified plaques of both patients and a phantom, ASIR decreased maximum CT values and calcified plaque size. In comparison to FBP, adaptive statistical iterative reconstruction (ASIR) may significantly decrease Agatston scores and calcium volumes. Copyright © 2016 Society of Cardiovascular Computed Tomography. Published by Elsevier Inc. All rights reserved.
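For context, the Agatston score referenced above weights each calcified area by a factor derived from its peak attenuation (1 for 130-199 HU up to 4 for ≥400 HU), which is why noise reduction that lowers peak CT values also lowers the score. A per-slice sketch that, as a simplification, treats the whole above-threshold region as one lesion (real implementations label connected components per lesion):

```python
import numpy as np

def agatston_weight(max_hu):
    # density factor from the lesion's peak attenuation (HU)
    if max_hu >= 400: return 4
    if max_hu >= 300: return 3
    if max_hu >= 200: return 2
    if max_hu >= 130: return 1
    return 0

def agatston_score(hu, pixel_area_mm2, threshold=130):
    """Score one axial slice: calcified area (mm^2) times the density weight."""
    mask = hu >= threshold
    if not mask.any():
        return 0.0
    area_mm2 = mask.sum() * pixel_area_mm2
    return area_mm2 * agatston_weight(hu[mask].max())

hu = np.array([[0, 150], [450, 100]])            # toy 2x2 slice in HU
score = agatston_score(hu, pixel_area_mm2=0.25)
# 2 calcified pixels -> 0.5 mm^2; peak 450 HU -> weight 4 -> score 2.0
```

If smoothing dropped the peak from 450 HU to 350 HU, the weight would fall from 4 to 3 and the score by 25%, mirroring the effect the study measures.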
Framework for Identifying Cybersecurity Risks in Manufacturing
Hutchins, Margot J.; Bhinge, Raunak; Micali, Maxwell K.; ...
2015-10-21
Increasing connectivity, use of digital computation, and off-site data storage provide potential for dramatic improvements in manufacturing productivity, quality, and cost. However, there are also risks associated with the increased volume and pervasiveness of data that are generated and potentially accessible to competitors or adversaries. Enterprises have experienced cyber attacks that exfiltrate confidential and/or proprietary data, alter information to cause an unexpected or unwanted effect, and destroy capital assets. Manufacturers need tools to incorporate these risks into their existing risk management processes. This article establishes a framework that considers the data flows within a manufacturing enterprise and throughout its supply chain. The framework provides several mechanisms for identifying generic and manufacturing-specific vulnerabilities and is illustrated with details pertinent to an automotive manufacturer. Finally, in addition to providing manufacturers with insights into their potential data risks, this framework addresses an outcome identified by the NIST Cybersecurity Framework.
Vellela, Melissa; Qian, Hong
2009-10-06
Schlögl's model is the canonical example of a chemical reaction system that exhibits bistability. Because the biological examples of bistability and switching behaviour are increasingly numerous, this paper presents an integrated deterministic, stochastic and thermodynamic analysis of the model. After a brief review of the deterministic and stochastic modelling frameworks, the concepts of chemical and mathematical detailed balance are discussed and non-equilibrium conditions are shown to be necessary for bistability. Thermodynamic quantities such as the flux, chemical potential and entropy production rate are defined and compared across the two models. In the bistable region, the stochastic model exhibits an exchange of global stability between the two stable states under changes in the pump parameters and volume size. The stochastic entropy production rate shows a sharp transition that mirrors this exchange. A new hybrid model that includes continuous diffusion and discrete jumps is suggested to deal with the multiscale dynamics of the bistable system. Accurate approximations of the exponentially small eigenvalue associated with the time scale of this switching and the full time-dependent solution are calculated using MATLAB. A breakdown of previously known asymptotic approximations on small volume scales is observed through comparison with these and Monte Carlo results. Finally, the appendix illustrates how the diffusion approximation of the chemical master equation can fail to correctly represent the mesoscopically interesting steady-state behaviour of the system.
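The stochastic model discussed above can be simulated exactly with Gillespie's algorithm, which is also the standard way to generate the Monte Carlo trajectories the paper compares against. A minimal sketch of the Schlögl system; the rate constants here are illustrative values with the buffered species A and B folded into the rates:

```python
import random

# Schlögl reactions (A and B held at constant concentration):
#   (1) A + 2X -> 3X   (2) 3X -> A + 2X   (3) B -> X   (4) X -> B
def propensities(x, v, k1=3.0, k2=0.6, k3=0.25, k4=2.95):
    # combinatorial factors; propensities scale with the system volume v
    return [
        k1 * x * (x - 1) / v,               # autocatalytic production
        k2 * x * (x - 1) * (x - 2) / v**2,  # trimolecular degradation
        k3 * v,                             # constant influx from B
        k4 * x,                             # linear outflux to B
    ]

def gillespie(x0, v, t_end, seed=1):
    """Exact stochastic simulation; returns the final copy number of X."""
    rng = random.Random(seed)
    x, t = x0, 0.0
    changes = [+1, -1, +1, -1]
    while t < t_end:
        a = propensities(x, v)
        a0 = sum(a)
        if a0 == 0:
            break
        t += rng.expovariate(a0)        # exponential waiting time to next event
        r, acc = rng.random() * a0, 0.0
        for ai, dx in zip(a, changes):  # pick a reaction with probability a_i / a0
            acc += ai
            if r < acc:
                x += dx
                break
    return x

final_x = gillespie(x0=100, v=40, t_end=5.0)
```

Long trajectories of this jump process dwell near one stable state and occasionally switch to the other; the rate of such switches is governed by the exponentially small eigenvalue the paper approximates.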
Quantitative framework for prospective motion correction evaluation.
Pannetier, Nicolas A; Stavrinos, Theano; Ng, Peter; Herbst, Michael; Zaitsev, Maxim; Young, Karl; Matson, Gerald; Schuff, Norbert
2016-02-01
Establishing a framework to evaluate the performance of prospective motion correction (PMC) in MRI, considering motion variability between MRI scans. A framework was developed to obtain quantitative comparisons between different motion correction setups, considering that varying intrinsic motion patterns between acquisitions can induce bias. Intrinsic motion was accounted for by replaying, in a phantom experiment, the motion trajectories recorded from subjects. T1-weighted MRI on five volunteers and two different marker fixations (mouth guard and nose bridge fixations) were used to test the framework. Two metrics were investigated to quantify the improvement of image quality with PMC. Motion patterns vary between subjects as well as between repeated scans within a subject. This variability can be approximated by replaying the motion in a distinct phantom experiment and used as a covariate in models comparing motion corrections. We show that considering the intrinsic motion alters the statistical significance in comparing marker fixations. As an example, two marker fixations, a mouth guard and a nose bridge, were evaluated in terms of their effectiveness for PMC. The mouth guard achieved better PMC performance. Intrinsic motion patterns can bias comparisons between PMC configurations and must be considered for robust evaluations. A framework for evaluating intrinsic motion patterns in PMC is presented. © 2015 Wiley Periodicals, Inc.
Assessing Vocational Interests in the Basque Country Using Paired Comparison Design
ERIC Educational Resources Information Center
Elosua, Paula
2007-01-01
This article proposes the Thurstonian paired comparison model to assess vocational preferences and uses this approach to evaluate the Realistic, Investigative, Artistic, Social, Enterprising, and Conventional (RIASEC) model in the Basque Country (Spain). First, one unrestricted model is estimated in the Structural Equation Modelling framework using…
A Bayesian Missing Data Framework for Generalized Multiple Outcome Mixed Treatment Comparisons
ERIC Educational Resources Information Center
Hong, Hwanhee; Chu, Haitao; Zhang, Jing; Carlin, Bradley P.
2016-01-01
Bayesian statistical approaches to mixed treatment comparisons (MTCs) are becoming more popular because of their flexibility and interpretability. Many randomized clinical trials report multiple outcomes with possible inherent correlations. Moreover, MTC data are typically sparse (although richer than standard meta-analysis, comparing only two…
Bakken, Tor Haakon; Aase, Anne Guri; Hagen, Dagmar; Sundt, Håkon; Barton, David N; Lujala, Päivi
2014-07-01
Climate change and the needed reductions in the use of fossil fuels call for the development of renewable energy sources. However, renewable energy production, such as hydropower (both small- and large-scale) and wind power, has adverse impacts on the local environment by causing reductions in biodiversity and loss of habitats and species. This paper compares the environmental impacts of many small-scale hydropower plants with a few large-scale hydropower projects and one wind power farm, based on the same set of environmental parameters: land occupation, reduction in wilderness areas (INON), visibility and impacts on red-listed species. Our basis for comparison was similar energy volumes produced, without considering the quality of the energy services provided. The results show that small-scale hydropower performs less favourably in all parameters except land occupation. The land occupation of large hydropower and wind power is in the range of 45-50 m²/MWh, which is more than two times larger than that of small-scale hydropower, where the large land occupation for large hydropower is explained by the extent of the reservoirs. On all three other parameters small-scale hydropower performs more than two times worse than both large hydropower and wind power. Wind power compares similarly to large-scale hydropower regarding land occupation, much better on the reduction in INON areas, and in the same range regarding red-listed species. Our results demonstrate that the selected four parameters provide a basis for further development of a fair and consistent comparison of impacts between the analysed renewable technologies. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Hardiyanti, Y.; Haekal, M.; Waris, A.; Haryanto, F.
2016-08-01
This research compares the quadratic optimization program for Intensity Modulated Radiation Therapy Treatment Planning (IMRTP) with the Computational Environment for Radiotherapy Research (CERR) software. We assumed that the number of beams used by the treatment planner was 9 or 13. The case used an energy of 6 MV with a Source-Skin Distance (SSD) of 100 cm from the target volume. Dose calculation used the Quadratic Infinite Beam (QIB) method from CERR. CERR was used in a comparison study between the Gauss Primary threshold method and the Gauss Primary exponential method. In the lung cancer case, threshold variations of 0.01 and 0.004 were used. The output dose distribution was analyzed in the form of a DVH from CERR. With the exponential dose calculation method and 9 beams, the maximum dose distributions were obtained on the Planning Target Volume (PTV), Clinical Target Volume (CTV), Gross Tumor Volume (GTV), liver, and skin. With the threshold dose calculation method and 13 beams, the maximum dose distributions were obtained on the PTV, GTV, heart, and skin.
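The DVH analysis mentioned above is a cumulative histogram: for each dose level, the fraction of a structure's volume receiving at least that dose. A minimal sketch with hypothetical voxel doses (not values from the study):

```python
import numpy as np

def cumulative_dvh(dose_voxels, dose_levels):
    """Fraction of the structure's volume receiving at least each dose level (Gy)."""
    dose = np.asarray(dose_voxels, dtype=float)
    return np.array([(dose >= d).mean() for d in dose_levels])

# hypothetical per-voxel doses (Gy) sampled inside a PTV
ptv_dose = np.array([58.0, 59.5, 60.0, 60.5, 61.0, 57.0])
levels = np.array([0.0, 58.0, 60.0, 62.0])
dvh = cumulative_dvh(ptv_dose, levels)
# fractions: 1.0 at 0 Gy, 5/6 at 58 Gy, 0.5 at 60 Gy, 0.0 at 62 Gy
```

Plotting `dvh` against `levels` gives the monotonically decreasing curve used to compare the threshold and exponential calculation methods per structure.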
Hamoud Al-Tamimi, Mohammed Sabbih; Sulong, Ghazali; Shuaib, Ibrahim Lutfi
2015-07-01
Resection of brain tumors is a tricky task in surgery due to its direct influence on the patients' survival rate. Determining the extent of tumor resection vis-à-vis volume and dimensions in pre- and post-operative Magnetic Resonance Images (MRI) requires accurate estimation and comparison. The active contour segmentation technique is used to segment brain tumors on pre-operative MR images using self-developed software. Tumor volume is acquired from its contours via alpha shape theory. A graphical user interface is developed for rendering, visualizing and estimating the volume of a brain tumor. The Internet Brain Segmentation Repository (IBSR) dataset is employed to analyze and determine the repeatability and reproducibility of tumor volume estimates. Accuracy of the method is validated by comparing the volume estimated using the proposed method with that of a gold standard. Segmentation by the active contour technique is found to be capable of detecting brain tumor boundaries. Furthermore, the volume description and visualization enable an interactive examination of tumor tissue and its surroundings. The results demonstrate that alpha shape theory is superior to other existing standard methods for precise volumetric measurement of tumors. Copyright © 2015 Elsevier Inc. All rights reserved.
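As a point of reference for the volumetric comparison above: the paper derives volume from alpha-shape contours, while the simplest baseline it competes against is plain voxel counting over the segmented mask. A sketch of the baseline with an illustrative synthetic mask:

```python
import numpy as np

def tumor_volume_ml(mask, voxel_dims_mm):
    """Volume of a segmented region: voxel count times the volume of one voxel.
    (Alpha-shape volumetry, as in the paper, fits a surface to the contour
    points instead; voxel counting is the simpler baseline shown here.)"""
    voxel_mm3 = float(np.prod(voxel_dims_mm))
    return mask.sum() * voxel_mm3 / 1000.0  # mm^3 -> mL

mask = np.zeros((10, 10, 10), dtype=bool)
mask[2:8, 2:8, 2:8] = True                    # 6*6*6 = 216 voxels marked as tumor
vol = tumor_volume_ml(mask, (1.0, 1.0, 2.0))  # anisotropic voxel spacing in mm
# 216 voxels * 2 mm^3 = 432 mm^3 = 0.432 mL
```

Voxel counting is sensitive to jagged, stair-stepped boundaries, which is the error an alpha-shape surface fit is meant to reduce.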
Sexual steroids in serum and prostatic tissue of human non-cancerous prostate (STERPROSER trial).
Neuzillet, Yann; Raynaud, Jean-Pierre; Radulescu, Camélia; Fiet, Jean; Giton, Franck; Dreyfus, Jean-François; Ghoneim, Tarek P; Lebret, Thierry; Botto, Henry
2017-11-01
The specific involvement of the sex steroids in the growth of prostatic tissue remains unclear. Sex steroid concentrations in plasma and in fresh surgical samples of benign central prostate were correlated to prostate volume. This was a monocentric prospective study performed between September 2014 and January 2017. Age, obesity parameters, and both serum and intraprostatic concentrations of sex steroids were collected in compliance with the latest Endocrine Society guidelines, and the steroids were assessed by GC/MS. Statistical calculations were adjusted for age and body mass index (BMI). Thirty-two patients, equally divided between normal- and high-volume prostate groups, were included in the analysis. High-volume prostate patients were older, heavier, and had higher BMI. Comparison adjusted for age and BMI showed higher DHT concentrations in high-volume prostates. Both normal- and high-volume prostate tissues concentrate sex steroids in a similar way. Comparison of enzymatic activity surrogate marker ratios within tissue highlighted similar TT/E1 and TT/E2 ratios, a higher DHT/E1 ratio, and a lower DHT/PSA ratio in the high-volume prostates. The STERPROSER trial provides evidence for higher DHT concentration in high-volume prostates, which could reflect either higher 5-alpha reductase expression or lower expression of downstream metabolizing enzymes such as 3α-hydroxysteroid dehydrogenase. © 2017 Wiley Periodicals, Inc.
Towards a Transferable UAV-Based Framework for River Hydromorphological Characterization
Rivas Casado, Mónica; González, Rocío Ballesteros; Ortega, José Fernando; Leinster, Paul; Wright, Ros
Towards a Transferable UAV-Based Framework for River Hydromorphological Characterization.
Rivas Casado, Mónica; González, Rocío Ballesteros; Ortega, José Fernando; Leinster, Paul; Wright, Ros
2017-09-26
The multiple protocols that have been developed to characterize river hydromorphology, partly in response to legislative drivers such as the European Union Water Framework Directive (EU WFD), make the comparison of results obtained in different countries challenging. Recent studies have analyzed the comparability of existing methods, with remote sensing based approaches being proposed as a potential means of harmonizing hydromorphological characterization protocols. However, the resolution achieved by remote sensing products may not be sufficient to assess some of the key hydromorphological features that are required to allow an accurate characterization. Methodologies based on high resolution aerial photography taken from Unmanned Aerial Vehicles (UAVs) have been proposed by several authors as potential approaches to overcome these limitations. Here, we explore the applicability of an existing UAV based framework for hydromorphological characterization to three different fluvial settings representing some of the distinct ecoregions defined by the WFD geographical intercalibration groups (GIGs). The framework is based on the automated recognition of hydromorphological features via tested and validated Artificial Neural Networks (ANNs). Results show that the framework is transferable to the Central-Baltic and Mediterranean GIGs with accuracies in feature identification above 70%. Accuracies of 50% are achieved when the framework is implemented in the Very Large Rivers GIG. The framework successfully identified vegetation, deep water, shallow water, riffles, side bars and shadows for the majority of the reaches. However, further algorithm development is required to ensure a wider range of features (e.g., chutes, structures and erosion) are accurately identified. This study also highlights the need to develop an objective and fit for purpose hydromorphological characterization framework to be adopted within all EU member states to facilitate comparison of results.
PMID:28954434
Relative Evaluation of the Independent Volume Measures of Caverns
DOE Office of Scientific and Technical Information (OSTI.GOV)
MUNSON,DARRELL E.
2000-08-01
Throughout the construction and operation of the caverns of the Strategic Petroleum Reserve (SPR), three types of cavern volume measurements have been maintained. These are: (1) the calculated solution volume determined during initial construction by solution mining and any subsequent solutioning during oil transfers, (2) the calculated sonar volume determined through sonar surveys of the cavern dimensions, and (3) the direct metering of oil to determine the volume of the cavern occupied by the oil. The objective of this study is to compare these measurements to each other and determine, if possible, the uncertainties associated with a given type of measurement. Over time, each type of measurement has acquired a customary, or industry-accepted, stated uncertainty. This uncertainty is not necessarily the result of a technical analysis. Ultimately there is one definitive quantity, the oil volume measured by the oil custody transfer meters, taken by all parties to the transfer as the correct ledger amount and for which the SPR Project is accountable. However, subsequent transfers within a site may not be made with meters of the same accuracy. In this study, a very simple theory of the perfect relationship is used to evaluate the correlation (deviation) of the various measures. This theory permits separation of uncertainty and bias. Each of the four SPR sites is examined, first with comparisons between the calculated solution volumes and the sonar volumes determined during construction, then with comparisons of the oil inventories and the sonar volumes obtained either by surveying through brine prior to oil filling or through the oil directly.
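The separation of bias and uncertainty between two independent volume measures can be illustrated with a simple paired comparison: the systematic offset is the mean of the pairwise differences and the random scatter is their standard deviation. The sketch below uses invented cavern volumes, not SPR data, and a plain difference analysis rather than the report's exact "perfect relationship" theory.

```python
import numpy as np

# Hypothetical paired cavern-volume measures (million barrels):
# sonar survey vs. calculated solution volume for five caverns.
sonar    = np.array([10.2, 11.5,  9.8, 12.1, 10.9])
solution = np.array([10.0, 11.9, 10.1, 12.5, 11.2])

# Deviation of each pair from the perfect relationship (solution == sonar).
diff = solution - sonar
bias = diff.mean()              # systematic offset between the two measures
uncertainty = diff.std(ddof=1)  # random scatter about that offset

print(f"bias = {bias:+.3f} MMB, scatter = {uncertainty:.3f} MMB")
```

With more caverns per site, the same decomposition lets one judge whether a measurement type is consistently high or low (bias) versus merely noisy (uncertainty).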
Han, Doug Hyun; Lyoo, In Kyoon; Renshaw, Perry F
2012-04-01
Patients with on-line game addiction (POGA) and professional video game players both play video games for extended periods of time, but experience very different consequences for their on-line game play. Brain regions comprising the anterior cingulate, thalamus and occipito-temporal areas may increase the likelihood of becoming a pro-gamer or POGA. Twenty POGA, seventeen pro-gamers, and eighteen healthy comparison subjects (HC) were recruited. All magnetic resonance imaging (MRI) was performed on a 1.5 Tesla Espree MRI scanner (SIEMENS, Erlangen, Germany). Voxel-wise comparisons of gray matter volume were performed between the groups using the two-sample t-test with statistical parametric mapping (SPM5). Compared to HC, the POGA group showed increased impulsiveness and perseverative errors and increased left thalamus gray matter volume, but decreased gray matter volume in both inferior temporal gyri, the right middle occipital gyrus, and the left inferior occipital gyrus. Pro-gamers showed increased gray matter volume in the left cingulate gyrus, but decreased gray matter volume in the left middle occipital gyrus and right inferior temporal gyrus compared with HC. Additionally, the pro-gamer group showed increased gray matter volume in the left cingulate gyrus and decreased left thalamus gray matter volume compared with the POGA group. The current study suggests that increased gray matter volumes of the left cingulate gyrus in pro-gamers and of the left thalamus in POGA may contribute to the different clinical characteristics of pro-gamers and POGA. Copyright © 2012 Elsevier Ltd. All rights reserved.
Open Architecture Framework for Improved Early Stage Submarine Design
2010-06-01
[Excerpt from list of figures and nomenclature: VPH denotes pressure hull volume in the MIT Submarine Math Model (SMM); VPH-VB is the pressure hull volume excluding the in-hull variable ballast volume.]
The Blurred Line between Form and Process: A Comparison of Stream Channel Classification Frameworks
Kasprak, Alan; Hough-Snee, Nate
2016-01-01
Stream classification provides a means to understand the diversity and distribution of channels and floodplains that occur across a landscape while identifying links between geomorphic form and process. Accordingly, stream classification is frequently employed as a watershed planning, management, and restoration tool. At the same time, there has been intense debate and criticism of particular frameworks, on the grounds that these frameworks classify stream reaches based largely on their physical form, rather than direct measurements of their component hydrogeomorphic processes. Despite this debate surrounding stream classifications, and their ongoing use in watershed management, direct comparisons of channel classification frameworks are rare. Here we implement four stream classification frameworks and explore the degree to which each makes inferences about hydrogeomorphic process from channel form within the Middle Fork John Day Basin, a watershed of high conservation interest within the Columbia River Basin, U.S.A. We compare the results of the River Styles Framework, Natural Channel Classification, Rosgen Classification System, and a channel form-based statistical classification at 33 field-monitored sites. We found that the four frameworks consistently classified reach types into similar groups based on each reach or segment’s dominant hydrogeomorphic elements. Where classified channel types diverged, differences could be attributed to (a) the spatial scale of input data used, (b) the requisite metrics and their order in completing a framework’s decision tree, and/or (c) whether the framework attempts to classify current or historic channel form. Divergence in framework agreement was also observed at reaches where channel planform was decoupled from valley setting. Overall, the relative agreement between frameworks indicates that criticism of individual classifications for their use of form in grouping stream channels may be overstated.
These form-based criticisms may also ignore the geomorphic tenet that channel form reflects formative hydrogeomorphic processes across a given landscape. PMID:26982076
NASA Technical Reports Server (NTRS)
Long, W. C.
1988-01-01
The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL) are presented. The IOA effort first completed an analysis of the Communication and Tracking hardware, generating draft failure modes and potential critical items. The IOA results were then compared to the NASA FMEA/CIL baseline. A resolution of each discrepancy from the comparison is provided through additional analysis as required. This report documents the results of that comparison for the Orbiter Communication and Tracking hardware. Volume 2 continues the presentation of IOA worksheets.
A Comparative Analysis of PISA Scientific Literacy Framework in Finnish and Thai Science Curricula
ERIC Educational Resources Information Center
Sothayapetch, Pavinee; Lavonen, Jari; Juuti, Kalle
2013-01-01
A curriculum is a master plan that regulates teaching and learning. This paper compares Finnish and Thai primary school level science curricula to the PISA 2006 Scientific Literacy Framework. Curriculum comparison was made following the procedure of deductive content analysis. In the analysis, there were four main categories adopted from PISA…
ERIC Educational Resources Information Center
Chen, Haiwen
2012-01-01
In this article, linear item response theory (IRT) observed-score equating is compared under a generalized kernel equating framework with Levine observed-score equating for nonequivalent groups with anchor test design. Interestingly, these two equating methods are closely related despite being based on different methodologies. Specifically, when…
The Development of a Proposed Global Work-Integrated Learning Framework
ERIC Educational Resources Information Center
McRae, Norah; Johnston, Nancy
2016-01-01
Building on the work completed in BC that resulted in the development of a WIL Matrix for comparing and contrasting various forms of WIL with the Canadian co-op model, this paper proposes a Global Work-Integrated Learning Framework that allows for the comparison of a variety of models of work-integrated learning found in the international…
The GENIE Neutrino Monte Carlo Generator: Physics and User Manual
DOE Office of Scientific and Technical Information (OSTI.GOV)
Andreopoulos, Costas; Barry, Christopher; Dytman, Steve
2015-10-20
GENIE is a suite of products for the experimental neutrino physics community. This suite includes i) a modern software framework for implementing neutrino event generators, a state-of-the-art comprehensive physics model and tools to support neutrino interaction simulation for realistic experimental setups (the Generator product), ii) extensive archives of neutrino, charged-lepton and hadron scattering data and software to produce a comprehensive set of data/MC comparisons (the Comparisons product), and iii) a generator tuning framework and fitting applications (the Tuning product). This book provides the definitive guide for the GENIE Generator: It presents the software architecture and a detailed description of its physics model and official tunes. In addition, it provides a rich set of data/MC comparisons that characterise the physics performance of GENIE. Detailed step-by-step instructions on how to install and configure the Generator, run its applications and analyze its outputs are also included.
Prediction of compression-induced image interpretability degradation
NASA Astrophysics Data System (ADS)
Blasch, Erik; Chen, Hua-Mei; Irvine, John M.; Wang, Zhonghai; Chen, Genshe; Nagy, James; Scott, Stephen
2018-04-01
Image compression is an important component in modern imaging systems as the volume of the raw data collected is increasing. To reduce the volume of data while collecting imagery useful for analysis, choosing the appropriate image compression method is desired. Lossless compression is able to preserve all the information, but it has limited reduction power. On the other hand, lossy compression, which may result in very high compression ratios, suffers from information loss. We model the compression-induced information loss in terms of the National Imagery Interpretability Rating Scale or NIIRS. NIIRS is a user-based quantification of image interpretability widely adopted by the Geographic Information System community. Specifically, we present the Compression Degradation Image Function Index (CoDIFI) framework that predicts the NIIRS degradation (i.e., a decrease of NIIRS level) for a given compression setting. The CoDIFI-NIIRS framework enables a user to broker the maximum compression setting while maintaining a specified NIIRS rating.
Gaussian mixtures on tensor fields for segmentation: applications to medical imaging.
de Luis-García, Rodrigo; Westin, Carl-Fredrik; Alberola-López, Carlos
2011-01-01
In this paper, we introduce a new approach for tensor field segmentation based on the definition of mixtures of Gaussians on tensors as a statistical model. Working over the well-known Geodesic Active Regions segmentation framework, this scheme presents several interesting advantages. First, it yields a more flexible model than the use of a single Gaussian distribution, which enables the method to better adapt to the complexity of the data. Second, it can work directly on tensor-valued images or, through a parallel scheme that processes independently the intensity and the local structure tensor, on scalar textured images. Two different applications have been considered to show the suitability of the proposed method for medical imaging segmentation. First, we address DT-MRI segmentation on a dataset of 32 volumes, showing a successful segmentation of the corpus callosum and favourable comparisons with related approaches in the literature. Second, the segmentation of bones from hand radiographs is studied, and a complete automatic-semiautomatic approach has been developed that makes use of anatomical prior knowledge to produce accurate segmentation results. Copyright © 2010 Elsevier Ltd. All rights reserved.
Sloped Terrain Segmentation for Autonomous Drive Using Sparse 3D Point Cloud
Cho, Seoungjae; Kim, Jonghyun; Ikram, Warda; Cho, Kyungeun; Sim, Sungdae
2014-01-01
A ubiquitous environment for road travel that uses wireless networks requires the minimization of data exchange between vehicles. An algorithm that can segment the ground in real time is necessary to obtain location data between vehicles simultaneously executing autonomous drive. This paper proposes a framework for segmenting the ground in real time using a sparse three-dimensional (3D) point cloud acquired from undulating terrain. A sparse 3D point cloud can be acquired by scanning the geography using light detection and ranging (LiDAR) sensors. For efficient ground segmentation, 3D point clouds are quantized in units of volume pixels (voxels) and overlapping data is eliminated. We reduce nonoverlapping voxels to two dimensions by implementing a lowermost heightmap. The ground area is determined on the basis of the number of voxels in each voxel group. We execute ground segmentation in real time by proposing an approach to minimize the comparison between neighboring voxels. Furthermore, we experimentally verify that ground segmentation can be executed at about 19.31 ms per frame. PMID:25093204
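The voxel quantization and lowermost-heightmap steps described above can be sketched in a few lines. This is a simplified illustration on an invented point cloud; the voxel size, the flat-terrain assumption, and the height threshold are all assumptions, not the paper's parameters, and the neighbor-comparison minimization that gives the paper its speed is omitted.

```python
import numpy as np

rng = np.random.default_rng(1)
# Hypothetical sparse LiDAR returns (x, y, z) in meters: flat ground plus one obstacle.
ground_pts = np.column_stack([rng.uniform(0, 20, 400),
                              rng.uniform(0, 20, 400),
                              rng.normal(0.0, 0.05, 400)])
obstacle   = np.column_stack([rng.uniform(8, 10, 50),
                              rng.uniform(8, 10, 50),
                              rng.uniform(1.0, 2.0, 50)])
points = np.vstack([ground_pts, obstacle])

VOX = 0.5  # assumed voxel edge length (m)
# Quantize points to voxel indices and eliminate overlapping (duplicate) voxels.
voxels = np.unique((points // VOX).astype(int), axis=0)

# Lowermost heightmap: keep only the lowest voxel in each (x, y) column.
heightmap = {}
for x, y, z in voxels:
    if (x, y) not in heightmap or z < heightmap[(x, y)]:
        heightmap[(x, y)] = z

# Label a column as ground if its lowest voxel sits near the global minimum height.
zmin = min(heightmap.values())
ground_cols = [c for c, z in heightmap.items() if z <= zmin + 1]
print(f"{len(ground_cols)} of {len(heightmap)} columns classified as ground")
```

On undulating terrain the global-minimum test would fail; the paper instead groups voxels and decides per group, which is what makes the method robust on slopes.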
Ryder, Robert T.; Kinney, Scott A.; Suitt, Stephen E.; Merrill, Matthew D.; Trippi, Michael H.; Ruppert, Leslie F.; Ryder, Robert T.
2014-01-01
In 2006 and 2007, the greenline Appalachian basin field maps were digitized under the supervision of Scott Kinney and converted to geographic information system (GIS) files for chapter I.1 (this volume). By converting these oil and gas field maps to a digital format and maintaining the field names where noted, they are now available for a variety of oil and gas and possibly carbon-dioxide sequestration projects. Having historical names assigned to known digitized conventional fields provides a convenient classification scheme into which cumulative production and ultimate field-size databases can be organized. Moreover, as exploratory and development drilling expands across the basin, many previously named fields that were originally treated as conventional fields have evolved into large, commonly unnamed continuous-type accumulations. These new digital maps will facilitate a comparison between EUR values from recently drilled, unnamed parts of continuous accumulations and EUR values from named fields discovered early during the exploration cycle of continuous accumulations.
High-speed reacting flow simulation using USA-series codes
NASA Astrophysics Data System (ADS)
Chakravarthy, S. R.; Palaniswamy, S.
In this paper, the finite-rate chemistry (FRC) formulation for the USA-series of codes and three sets of validations are presented. USA-series computational fluid dynamics (CFD) codes are based on Unified Solution Algorithms including explicit and implicit formulations, factorization and relaxation approaches, time marching and space marching methodologies, etc., in order to be able to solve a very wide class of CFD problems using a single framework. Euler or Navier-Stokes equations are solved using a finite-volume treatment with upwind Total Variation Diminishing discretization for the inviscid terms. Perfect and real gas options are available including equilibrium and nonequilibrium chemistry. This capability has been widely used to study various problems including Space Shuttle exhaust plumes, National Aerospace Plane (NASP) designs, etc. (1) Numerical solutions are presented showing the full range of possible solutions to steady detonation wave problems. (2) Comparison between the solution obtained by the USA code and Generalized Kinetics Analysis Program (GKAP) is shown for supersonic combustion in a duct. (3) Simulation of combustion in a supersonic shear layer is shown to have reasonable agreement with experimental observations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Clark, Haley; BC Cancer Agency, Surrey, B.C.; BC Cancer Agency, Vancouver, B.C.
2014-08-15
Many have speculated about the future of computational technology in clinical radiation oncology. It has been advocated that the next generation of computational infrastructure will improve on the current generation by incorporating richer aspects of automation, more heavily and seamlessly featuring distributed and parallel computation, and providing more flexibility toward aggregate data analysis. In this report we describe how a recently created — but currently existing — analysis framework (DICOMautomaton) incorporates these aspects. DICOMautomaton supports a variety of use cases but is especially suited for dosimetric outcomes correlation analysis, investigation and comparison of radiotherapy treatment efficacy, and dose-volume computation. We describe: how it overcomes computational bottlenecks by distributing workload across a network of machines; how modern, asynchronous computational techniques are used to reduce blocking and avoid unnecessary computation; and how issues of out-of-date data are addressed using reactive programming techniques and data dependency chains. We describe internal architecture of the software and give a detailed demonstration of how DICOMautomaton could be used to search for correlations between dosimetric and outcomes data.
Hong, Ung Gi; Park, Hai Woong; Lee, Joongwon; Hwang, Sunhwan; Kwak, Jimin; Yi, Jongheop; Song, In Kyu
2013-11-01
Copper-containing mesoporous carbon (Cu-MC) was prepared by a single-step surfactant-templating method. For comparison, copper-impregnated mesoporous carbon (Cu/MC) was also prepared by a surfactant-templating method and a subsequent impregnation method. Rhenium catalysts supported on copper-containing mesoporous carbon and copper-impregnated mesoporous carbon (Re/Cu-MC and Re/Cu/MC, respectively) were then prepared by an incipient wetness method, and they were applied to the liquid-phase hydrogenation of succinic acid to 1,4-butanediol (BDO). It was observed that copper in the Re/Cu-MC catalyst was well incorporated into carbon framework, resulting in higher surface area and larger pore volume than those of Re/Cu/MC catalyst. Therefore, Re/Cu-MC catalyst showed higher copper dispersion than Re/Cu/MC catalyst, although both catalysts retained the same amounts of copper and rhenium. In the liquid-phase hydrogenation of succinic acid to BDO, Re/Cu-MC catalyst showed a better catalytic activity than Re/Cu/MC catalyst. Fine dispersion of copper in the Re/Cu-MC catalyst was responsible for its enhanced catalytic activity.
Ryder, Robert T.; Milici, Robert C.; Swezey, Christopher S.; Trippi, Michael H.; Ruppert, Leslie F.; Ryder, Robert T.
2014-01-01
The most recent U.S. Geological Survey (USGS) assessment of undiscovered oil and gas resources of the Appalachian basin was completed in 2002 (Milici and others, 2003). This assessment was based on the total petroleum system (TPS), a concept introduced by Magoon and Dow (1994) and developed during subsequent studies such as those by the U.S. Geological Survey World Energy Assessment Team (2000) and by Biteau and others (2003a,b). Each TPS is based on specific geologic elements that include source rocks, traps and seals, reservoir rocks, and the generation and migration of hydrocarbons. This chapter identifies the TPSs defined in the 2002 Appalachian basin oil and gas assessment and places them in the context of the stratigraphic framework associated with regional geologic cross sections D–D′ (Ryder and others, 2009, which was re-released in this volume, chap. E.4.1) and E–E′ (Ryder and others, 2008, which was re-released in this volume, chap. E.4.2). Furthermore, the chapter presents a recent estimate of the ultimate recoverable oil and natural gas in the basin.
A Scalable Framework For Segmenting Magnetic Resonance Images
Hore, Prodip; Goldgof, Dmitry B.; Gu, Yuhua; Maudsley, Andrew A.; Darkazanli, Ammar
2009-01-01
A fast, accurate and fully automatic method of segmenting magnetic resonance images of the human brain is introduced. The approach scales well allowing fast segmentations of fine resolution images. The approach is based on modifications of the soft clustering algorithm, fuzzy c-means, that enable it to scale to large data sets. Two types of modifications to create incremental versions of fuzzy c-means are discussed. They are much faster when compared to fuzzy c-means for medium to extremely large data sets because they work on successive subsets of the data. They are comparable in quality to application of fuzzy c-means to all of the data. The clustering algorithms coupled with inhomogeneity correction and smoothing are used to create a framework for automatically segmenting magnetic resonance images of the human brain. The framework is applied to a set of normal human brain volumes acquired from different magnetic resonance scanners using different head coils, acquisition parameters and field strengths. Results are compared to those from two widely used magnetic resonance image segmentation programs, Statistical Parametric Mapping and the FMRIB Software Library (FSL). The results are comparable to FSL while providing significant speed-up and better scalability to larger volumes of data. PMID:20046893
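The scaling idea in the abstract above — run fuzzy c-means on successive subsets of the data, carrying the cluster centers forward instead of revisiting everything — can be sketched as follows. This is an illustrative single-feature version with invented "intensity" data, not the authors' exact incremental algorithms, and it omits the inhomogeneity correction and smoothing steps of their framework.

```python
import numpy as np

def fcm(data, centers, m=2.0, iters=20):
    """One fuzzy c-means run (standard alternating updates), warm-started at `centers`."""
    for _ in range(iters):
        d = np.abs(data[:, None] - centers[None, :]) + 1e-12  # point-to-center distances
        u = 1.0 / d ** (2 / (m - 1))                          # unnormalized memberships
        u /= u.sum(axis=1, keepdims=True)                     # rows sum to 1
        centers = (u ** m * data[:, None]).sum(0) / (u ** m).sum(0)
    return centers

rng = np.random.default_rng(0)
# Synthetic 1-D voxel intensities: three tissue-like clusters.
data = np.concatenate([rng.normal(mu, 0.1, 2000) for mu in (0.2, 0.5, 0.8)])
rng.shuffle(data)

# Incremental variant (a sketch): process the volume chunk by chunk,
# passing the centers forward so each chunk refines the previous estimate.
centers = np.array([0.0, 0.5, 1.0])
for chunk in np.array_split(data, 6):
    centers = fcm(chunk, centers)
print(np.round(np.sort(centers), 2))
```

Because each pass touches only one subset, memory stays bounded while the final centers approximate what a full-data fuzzy c-means would find — the comparability-with-speedup trade-off the abstract reports.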
Rusu, Mirabela; Golden, Thea; Wang, Haibo; Gow, Andrew; Madabhushi, Anant
2015-08-01
Pulmonary inflammation is associated with a variety of diseases. Assessing pulmonary inflammation on in vivo imaging may facilitate the early detection and treatment of lung diseases. Although routinely used in thoracic imaging, computed tomography has thus far not been compellingly shown to characterize inflammation in vivo. Alternatively, magnetic resonance imaging (MRI) is a nonionizing radiation technique to better visualize and characterize pulmonary tissue. Prior to routine adoption of MRI for early characterization of inflammation in humans, a rigorous and quantitative characterization of the utility of MRI to identify inflammation is required. Such characterization may be achieved by considering ex vivo histology as the ground truth, since it enables the definitive spatial assessment of inflammation. In this study, the authors introduce a novel framework to integrate 2D histology, ex vivo and in vivo imaging to enable the mapping of the extent of disease from ex vivo histology onto in vivo imaging, with the goal of facilitating computerized feature analysis and interrogation of disease appearance on in vivo imaging. The authors' framework was evaluated in a preclinical preliminary study aimed to identify computer extracted features on in vivo MRI associated with chronic pulmonary inflammation. The authors' image analytics framework first involves reconstructing the histologic volume in 3D from individual histology slices. Second, the authors map the disease ground truth onto in vivo MRI via coregistration with 3D histology using the ex vivo lung MRI as a conduit. Finally, computerized feature analysis of the disease extent is performed to identify candidate in vivo imaging signatures of disease presence and extent. The authors evaluated the framework by assessing the quality of the 3D histology reconstruction and the histology-MRI fusion, in the context of an initial use case involving characterization of chronic inflammation in a mouse model. 
The authors' evaluation considered three mice, two with an inflammation phenotype and one control. The authors' iterative 3D histology reconstruction yielded a 70.1% ± 2.7% overlap with the ex vivo MRI volume. Across a total of 17 anatomic landmarks manually delineated at the division of airways, the target registration error between the ex vivo MRI and 3D histology reconstruction was 0.85 ± 0.44 mm, suggesting that a good alignment of the ex vivo 3D histology and ex vivo MRI had been achieved. The 3D histology-in vivo MRI coregistered volumes resulted in an overlap of 73.7% ± 0.9%. Preliminary computerized feature analysis was performed on an additional four control mice, for a total of seven mice considered in this study. Gabor texture filters appeared to best capture differences between the inflamed and noninflamed regions on MRI. The authors' 3D histology reconstruction and multimodal registration framework were successfully employed to reconstruct the histology volume of the lung and fuse it with in vivo MRI to create a ground truth map for inflammation on in vivo MRI. The analytic platform presented here lays the framework for a rigorous validation of the identified imaging features for chronic lung inflammation on MRI in a large prospective cohort.
NASA Astrophysics Data System (ADS)
Rodrigues, Pedro L.; Rodrigues, Nuno F.; Fonseca, Jaime C.; Vilaça, João. L.
2015-03-01
An accurate percutaneous puncture is essential for disintegration and removal of renal stones. Although this procedure has proven to be safe, some organs surrounding the renal target might be accidentally perforated. This work describes a new intraoperative framework where tracked surgical tools are superimposed within 4D ultrasound imaging for security assessment of the percutaneous puncture trajectory (PPT). A PPT is first generated from the skin puncture site towards an anatomical target, using the information retrieved by electromagnetic motion tracking sensors coupled to surgical tools. Then, 2D ultrasound images acquired with a tracked probe are used to reconstruct a 4D ultrasound volume around the PPT under GPU processing. Volume hole-filling was performed at different processing time intervals by a tri-linear interpolation method. At spaced time intervals, the volume of the anatomical structures was segmented to ascertain whether any vital structure lies along the PPT and might compromise the surgical success. To enhance the volume visualization of the reconstructed structures, different render transfer functions were used. Results: Real-time ultrasound volume reconstruction and rendering at more than 25 frames/s was only possible when rendering three orthogonal slice views. Using the whole reconstructed volume, 8-15 frames/s were achieved, and 3 frames/s were reached when the segmentation and detection of structures intersecting the PPT were introduced. The proposed framework creates a virtual and intuitive platform that can be used to identify and validate a PPT to safely and accurately perform the puncture in percutaneous nephrolithotomy.
Delpon, Grégory; Escande, Alexandre; Ruef, Timothée; Darréon, Julien; Fontaine, Jimmy; Noblet, Caroline; Supiot, Stéphane; Lacornerie, Thomas; Pasquier, David
2016-01-01
Automated atlas-based segmentation (ABS) algorithms present the potential to reduce the variability in volume delineation. Several vendors offer software that are mainly used for cranial, head and neck, and prostate cases. The present study will compare the contours produced by a radiation oncologist to the contours computed by different automated ABS algorithms for prostate bed cases, including femoral heads, bladder, and rectum. Contour comparison was evaluated by different metrics such as volume ratio, Dice coefficient, and Hausdorff distance. Results depended on the volume of interest and showed some discrepancies between the different software packages. Automatic contours could be a good starting point for the delineation of organs since efficient editing tools are provided by different vendors. It should become an important help in the next few years for organ at risk delineation. PMID:27536556
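Two of the comparison metrics named above have simple closed forms on binary masks: the volume ratio is the quotient of voxel counts, and the Dice coefficient is twice the overlap divided by the sum of the volumes. A toy example with invented 2-D masks standing in for manual and atlas-based contours:

```python
import numpy as np

def dice(a, b):
    """Dice similarity coefficient between two binary masks: 2|A∩B| / (|A|+|B|)."""
    inter = np.logical_and(a, b).sum()
    return 2.0 * inter / (a.sum() + b.sum())

# Toy "organ" masks: the automatic contour is shifted one row from the manual one.
manual = np.zeros((10, 10), dtype=bool)
manual[2:8, 2:8] = True   # 36 voxels
auto = np.zeros((10, 10), dtype=bool)
auto[3:8, 2:8] = True     # 30 voxels

print(f"volume ratio = {auto.sum() / manual.sum():.3f}")  # -> 0.833
print(f"Dice         = {dice(manual, auto):.3f}")         # -> 0.909
```

The Hausdorff distance, the third metric, needs the contour point sets rather than voxel counts and is usually taken from a library rather than recomputed by hand.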
Vogel, J.R.; Brown, G.O.
2003-01-01
Semivariograms of samples of Culebra Dolomite have been determined at two different resolutions for gamma ray computed tomography images. By fitting models to semivariograms, small-scale and large-scale correlation lengths are determined for four samples. Different semivariogram parameters were found for adjacent cores at both resolutions. Relative elementary volume (REV) concepts are related to the stationarity of the sample. A scale disparity factor is defined and is used to determine sample size required for ergodic stationarity with a specified correlation length. This allows for comparison of geostatistical measures and representative elementary volumes. The modifiable areal unit problem is also addressed and used to determine resolution effects on correlation lengths. By changing resolution, a range of correlation lengths can be determined for the same sample. Comparison of voxel volume to the best-fit model correlation length of a single sample at different resolutions reveals a linear scaling effect. Using this relationship, the range of the point value semivariogram is determined. This is the range approached as the voxel size goes to zero. Finally, these results are compared to the regularization theory of point variables for borehole cores and are found to be a better fit for predicting the volume-averaged range.
1978-12-01
prioritization. (We have chosen to use a variation of Saaty's method in our hierarchical analysis, discussed in chapter 5, but for a purpose different ...) ...the word "framework" to refer to an abstract structure for thinking through policy-level management problems. This structure raises methodological ...readiness and logistics system performance, and we relied heavily on "structural" and trend analysis. By structural analysis, we meant a formal method for
NASA Technical Reports Server (NTRS)
Ripple, William J.; Wang, S.; Isaacson, Dennis L.; Paine, D. P.
1995-01-01
Digital Landsat Thematic Mapper (TM) and Satellite Probatoire d'Observation de la Terre (SPOT) High Resolution Visible (HRV) images of coniferous forest canopies were compared in their relationship to forest wood volume using correlation and regression analyses. Significant inverse relationships were found between softwood volume and the spectral bands from both sensors (P less than 0.01). The highest correlations were between the log of softwood volume and the near-infrared bands (HRV band 3, r = -0.89; TM band 4, r = -0.83).
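The log-volume correlation analysis can be sketched as follows, using hypothetical stand data rather than the TM/HRV measurements:

```python
import numpy as np

# Hypothetical stand data: softwood volume and near-infrared reflectance.
# Reflectance falls as canopy volume grows (shadowing), as in the study.
volume = np.array([50, 100, 200, 400, 800, 1600], dtype=float)
nir    = np.array([80, 72, 63, 55, 46, 38], dtype=float)

# Correlate the log of volume with the NIR band, as reported above.
r = np.corrcoef(np.log(volume), nir)[0, 1]
print(round(r, 3))  # strong inverse relationship
```

The log transform linearizes the saturating volume-reflectance relationship before computing Pearson's r.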
Thermal Battery Operating Gas Atmosphere Control and Heat Transfer Optimization
2012-09-01
volume of water vapor present at 21.8 °C in sample bottles, std atm cc: 1.533645; maximum volume of water vapor present at 21.8 °C in gas handling system and...sample bottles, std atm cc. Comparison of gas volumes measured at 838.197 and 1682.297 seconds shows that no water vapor was present and that the gas reacted...temperature of 22.0 °C, torr: 0.241556; maximum volume of water vapor present in one sample bottle, std atm cc: 0.000194; maximum weight of water vapor present
NASA Astrophysics Data System (ADS)
Zhu, Weifang; Zhang, Li; Shi, Fei; Xiang, Dehui; Wang, Lirong; Guo, Jingyun; Yang, Xiaoling; Chen, Haoyu; Chen, Xinjian
2017-07-01
Cystoid macular edema (CME) and macular hole (MH) are the leading causes for visual loss in retinal diseases. The volume of the CMEs can be an accurate predictor for visual prognosis. This paper presents an automatic method to segment the CMEs from the abnormal retina with coexistence of MH in three-dimensional-optical coherence tomography images. The proposed framework consists of preprocessing and CMEs segmentation. The preprocessing part includes denoising, intraretinal layers segmentation and flattening, and MH and vessel silhouettes exclusion. In the CMEs segmentation, a three-step strategy is applied. First, an AdaBoost classifier trained with 57 features is employed to generate the initialization results. Second, an automated shape-constrained graph cut algorithm is applied to obtain the refined results. Finally, cyst area information is used to remove false positives (FPs). The method was evaluated on 19 eyes with coexistence of CMEs and MH from 18 subjects. The true positive volume fraction, FP volume fraction, dice similarity coefficient, and accuracy rate for CMEs segmentation were 81.0%±7.8%, 0.80%±0.63%, 80.9%±5.7%, and 99.7%±0.1%, respectively.
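The volume-fraction metrics reported above can be computed as sketched below (toy 1-D masks, not the OCT data):

```python
import numpy as np

def volume_fractions(pred, truth):
    # True-positive volume fraction: detected cyst volume / true cyst volume.
    tp = np.logical_and(pred, truth).sum()
    # False-positive volume fraction: FP volume / non-cyst volume.
    fp = np.logical_and(pred, ~truth).sum()
    tpvf = tp / truth.sum()
    fpvf = fp / (~truth).sum()
    return tpvf, fpvf

truth = np.zeros(100, dtype=bool); truth[40:60] = True  # 20-voxel "cyst"
pred  = np.zeros(100, dtype=bool); pred[42:61]  = True  # 19-voxel detection

tpvf, fpvf = volume_fractions(pred, truth)
print(tpvf, fpvf)  # 0.9 0.0125
```

FPVF is normalized by the non-cyst volume, which is why the reported FPVF values are so much smaller than 1 - TPVF.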
NASA Technical Reports Server (NTRS)
Connolly, Joseph W.; Kopasakis, George; Lemon, Kimberly A.
2010-01-01
A turbofan simulation has been developed for use in aero-propulso-servo-elastic coupling studies on supersonic vehicles. A one-dimensional lumped-volume approach is used whereby each component (fan, high-pressure compressor, combustor, etc.) is represented as a single volume using characteristic performance maps and conservation equations for continuity, momentum and energy. The simulation is developed in the MATLAB/SIMULINK (The MathWorks, Inc.) environment in order to facilitate controls development and ease of integration with a future aero-servo-elastic vehicle model being developed at NASA Langley. The complete simulation demonstrated steady-state results that closely match a proposed engine suitable for a supersonic business jet at the cruise condition. Preliminary investigation of the transient simulation revealed expected trends for fuel flow disturbances as well as upstream pressure disturbances. A framework for system identification enables development of linear models for controller design. Utilizing this framework, a transfer function modeling an upstream pressure disturbance's impact on the engine speed is developed as an illustrative case of the system identification. This work will eventually enable an overall vehicle aero-propulso-servo-elastic model.
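The lumped-volume bookkeeping can be illustrated with a toy single-plenum continuity integration; all constants below are illustrative placeholders, not the engine maps used in the simulation:

```python
# Minimal sketch of the lumped-volume idea: one plenum, continuity only,
# ideal-gas law, linearised outflow. Illustrative numbers throughout.
R, T, V = 287.0, 300.0, 0.05   # gas constant J/(kg*K), temperature K, volume m^3
mdot_in = 0.5                  # kg/s supply into the volume
k_out = 1e-5                   # linearised outflow: mdot_out = k_out * p

m = 0.01                       # initial gas mass in the volume, kg
dt = 1e-4                      # s, explicit Euler step
for _ in range(200_000):       # integrate continuity: dm/dt = mdot_in - mdot_out
    p = m * R * T / V          # ideal-gas pressure in the volume, Pa
    m += dt * (mdot_in - k_out * p)

print(round(p))                # settles near mdot_in / k_out = 50000 Pa
```

A first-order volume like this is exactly the kind of dynamics the system-identification step would capture as a single-pole transfer function.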
U.S., Soviets Face Common Science Problems.
ERIC Educational Resources Information Center
Lepkowski, Wil
1981-01-01
Summarizes recent findings reported in a two-volume publication, "Science Policy: USA/USSR," issued by the National Science Foundation. Volumes I and II review U.S. and Soviet science policy in research and development, respectively. Comparisons are made concerning common problems around energy, environment, and the meaning of security.…
COMPARISON OF TWO DIFFERENT SOLID PHASE EXTRACTION/LARGE VOLUME INJECTION PROCEDURES FOR METHOD 8270
Two solid-phase extraction (SPE) methods and one traditional continuous liquid-liquid extraction method are compared for analysis of Method 8270 SVOCs. Productivity parameters include data quality, sample volume, analysis time and solvent waste.
One SPE system, unique in the U.S., uses aut...
30 CFR 227.600 - What automated verification functions may a State perform?
Code of Federal Regulations, 2010 CFR
2010-07-01
... involves systematic monitoring of production and royalty reports to identify and resolve reporting or... reported by royalty reporters to sales and transfer volumes reported by production reporters. If you request delegation of automated comparison of sales and production volumes, you must perform at least the...
Volume, conservation and instruction: A classroom-based Solomon four group study of conflict
NASA Astrophysics Data System (ADS)
Rowell, J. A.; Dawson, C. J.
The research reported is an attempt to widen the applicability of Piagetian theory-based conflict methodology from individual situations to whole classes. A Solomon four group experimental design, augmented by a delayed posttest, was used to provide a controlled framework for studying the effects of conflict instruction on Grade 8 students' ability to conserve volume of noncompressible matter, and to apply that knowledge to gas volume. The results, reported for individuals and groups, show the methodology can be effective, particularly when instruction is preceded by a pretest. Immediate posttest differences in knowledge of gas volume between spontaneous (pretest) conservers and instructed conservers of volume of noncompressible matter were no longer in evidence on the delayed posttest. This observation, together with the effects of pretesting and of the instructional sequence, is shown to have a consistent Piagetian interpretation. Practical implications are discussed.
Changes in apparent molar water volume and DKP solubility yield insights on the Hofmeister effect.
Payumo, Alexander Y; Huijon, R Michael; Mansfield, Deauna D; Belk, Laurel M; Bui, Annie K; Knight, Anne E; Eggers, Daryl K
2011-12-15
This study examines the properties of a 4 × 2 matrix of aqueous cations and anions at concentrations up to 8.0 M. The apparent molar water volume, as calculated by subtracting the mass and volume of the ions from the corresponding solution density, was found to exceed the molar volume of ice in many concentrated electrolyte solutions, underscoring the nonideal behavior of these systems. The solvent properties of water were also analyzed by measuring the solubility of diketopiperazine (DKP) in 2.000 M salt solutions prepared from the same ion combinations. Solution rankings for DKP solubility were found to parallel the Hofmeister series for both cations and anions, whereas molar water volume concurred with the cation series only. The results are discussed within the framework of a desolvation energy model that attributes solute-specific changes in equilibria to solute-dependent changes in the free energy of bulk water.
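The apparent-molar-water-volume bookkeeping described above amounts to subtracting the ions' mass and volume from the measured solution density; a sketch with purely illustrative numbers (hypothetical 6.0 M NaClO4 density and an assumed intrinsic ion-pair volume, not the paper's measurements):

```python
# All numbers below are illustrative placeholders, not experimental values.
rho_solution = 1.25       # g/mL, hypothetical density of a 6.0 M solution
c_salt = 6.0              # mol/L
M_salt = 122.44           # g/mol (NaClO4, chosen as an example salt)
v_ion = 55.0              # cm^3/mol, assumed intrinsic volume of the ion pair
M_water = 18.015          # g/mol

# Per litre of solution: subtract the ions' mass and volume, then divide the
# remaining volume by the moles of water it contains.
mass_water = rho_solution * 1000 - c_salt * M_salt   # g of water per litre
vol_water = 1000.0 - c_salt * v_ion                  # cm^3 assigned to water
v_app = vol_water / (mass_water / M_water)           # cm^3 per mol of water

# With these inputs v_app exceeds both liquid water (~18.07 cm^3/mol) and
# ice (~19.66 cm^3/mol), illustrating the "expanded" apparent molar volume
# the study reports for many concentrated electrolytes.
print(round(v_app, 2))
```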
Changes in Apparent Molar Water Volume and DKP Solubility Yield Insights on the Hofmeister Effect
Payumo, Alexander Y.; Huijon, R. Michael; Mansfield, Deauna D.; Belk, Laurel M.; Bui, Annie K.; Knight, Anne E.; Eggers, Daryl K.
2011-01-01
This study examines the properties of a 4 × 2 matrix of aqueous cations and anions at concentrations up to 8.0 M. The apparent molar water volume, as calculated by subtracting the mass and volume of the ions from the corresponding solution density, was found to exceed the molar volume of ice in many concentrated electrolyte solutions, underscoring the non-ideal behavior of these systems. The solvent properties of water were also analyzed by measuring the solubility of diketopiperazine (DKP) in 2.000 M salt solutions prepared from the same ion combinations. Solution rankings for DKP solubility were found to parallel the Hofmeister series for both cations and anions, whereas molar water volume concurred with the cation series only. The results are discussed within the framework of a desolvation energy model that attributes solute-specific changes in equilibria to solute-dependent changes in the free energy of bulk water. PMID:22029390
NASA Astrophysics Data System (ADS)
Duarte Queirós, S. M.
2005-08-01
This letter reports on a stochastic dynamical scenario whose associated stationary probability density function is exactly a generalised form, with a power law instead of exponential decay, of the ubiquitous Gamma distribution. This generalisation, also known as the F-distribution, was first proposed empirically to fit high-frequency stock traded volume distributions in financial markets and has been verified in experiments with granular material. The dynamical assumption presented herein is based on local temporal fluctuations of the average value of the observable under study. This proposal is related to superstatistics and thus to the current nonextensive statistical mechanics framework. For the specific case of stock traded volume, we connect the local fluctuations in the mean stock traded volume with the typical herding behaviour presented by financial traders. Finally, NASDAQ 1 and 2 minute stock traded volume sequences and probability density functions are numerically reproduced.
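A minimal numerical sketch of such a power-law-tailed Gamma (F-type) density, with illustrative parameters rather than values fitted to NASDAQ data:

```python
import numpy as np

def q_gamma_pdf(v, alpha=2.0, theta=1.0, q=1.2):
    # Reduces to the ordinary Gamma density (exponential tail) as q -> 1;
    # for q > 1 the tail decays as a power law instead.
    return v ** (alpha - 1) * (1.0 + (q - 1.0) * v / theta) ** (-1.0 / (q - 1.0))

v = np.linspace(1e-6, 200.0, 2_000_000)
pdf = q_gamma_pdf(v)
pdf /= pdf.sum() * (v[1] - v[0])   # normalise numerically (rectangle rule)

# Log-log tail slope; tends to (alpha - 1) - 1/(q - 1) = -4 as v grows.
slope = (np.log(pdf[-1]) - np.log(pdf[-100_000])) \
        / (np.log(v[-1]) - np.log(v[-100_000]))
print(round(slope, 2))
```

The measured slope sits slightly above the asymptotic value because the tail regime is only approached, not reached, at finite v.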
Partial-Wave Representations of Laser Beams for Use in Light-Scattering Calculations
NASA Technical Reports Server (NTRS)
Gouesbet, Gerard; Lock, James A.; Grehan, Gerard
1995-01-01
In the framework of generalized Lorenz-Mie theory, laser beams are described by sets of beam-shape coefficients. The modified localized approximation to evaluate these coefficients for a focused Gaussian beam is presented. A new description of Gaussian beams, called standard beams, is introduced. A comparison is made between the values of the beam-shape coefficients in the framework of the localized approximation and the beam-shape coefficients of standard beams. This comparison leads to new insights concerning the electromagnetic description of laser beams. The relevance of our discussion is enhanced by a demonstration that the localized approximation provides a very satisfactory description of top-hat beams as well.
A framework for organizing and selecting quantitative approaches for benefit-harm assessment.
Puhan, Milo A; Singh, Sonal; Weiss, Carlos O; Varadhan, Ravi; Boyd, Cynthia M
2012-11-19
Several quantitative approaches for benefit-harm assessment of health care interventions exist but it is unclear how the approaches differ. Our aim was to review existing quantitative approaches for benefit-harm assessment and to develop an organizing framework that clarifies differences and aids selection of quantitative approaches for a particular benefit-harm assessment. We performed a review of the literature to identify quantitative approaches for benefit-harm assessment. Our team, consisting of clinicians, epidemiologists, and statisticians, discussed the approaches and identified their key characteristics. We developed a framework that helps investigators select quantitative approaches for benefit-harm assessment that are appropriate for a particular decisionmaking context. Our framework for selecting quantitative approaches requires a concise definition of the treatment comparison and population of interest, identification of key benefit and harm outcomes, and determination of the need for a measure that puts all outcomes on a single scale (which we call a benefit and harm comparison metric). We identified 16 quantitative approaches for benefit-harm assessment. These approaches can be categorized into those that consider single or multiple key benefit and harm outcomes, and those that use a benefit-harm comparison metric or not. Most approaches use aggregate data and can be used in the context of single studies or systematic reviews. Although the majority of approaches provides a benefit and harm comparison metric, only four approaches provide measures of uncertainty around the benefit and harm comparison metric (such as a 95 percent confidence interval). None of the approaches considers the actual joint distribution of benefit and harm outcomes, but one approach considers competing risks when calculating profile-specific event rates. Nine approaches explicitly allow incorporating patient preferences. 
The choice of quantitative approaches depends on the specific question and goal of the benefit-harm assessment as well as on the nature and availability of data. In some situations, investigators may identify only one appropriate approach. In situations where the question and available data justify more than one approach, investigators may want to use multiple approaches and compare the consistency of results. When more evidence on relative advantages of approaches accumulates from such comparisons, it will be possible to make more specific recommendations on the choice of approaches.
A framework for organizing and selecting quantitative approaches for benefit-harm assessment
2012-01-01
Background Several quantitative approaches for benefit-harm assessment of health care interventions exist but it is unclear how the approaches differ. Our aim was to review existing quantitative approaches for benefit-harm assessment and to develop an organizing framework that clarifies differences and aids selection of quantitative approaches for a particular benefit-harm assessment. Methods We performed a review of the literature to identify quantitative approaches for benefit-harm assessment. Our team, consisting of clinicians, epidemiologists, and statisticians, discussed the approaches and identified their key characteristics. We developed a framework that helps investigators select quantitative approaches for benefit-harm assessment that are appropriate for a particular decisionmaking context. Results Our framework for selecting quantitative approaches requires a concise definition of the treatment comparison and population of interest, identification of key benefit and harm outcomes, and determination of the need for a measure that puts all outcomes on a single scale (which we call a benefit and harm comparison metric). We identified 16 quantitative approaches for benefit-harm assessment. These approaches can be categorized into those that consider single or multiple key benefit and harm outcomes, and those that use a benefit-harm comparison metric or not. Most approaches use aggregate data and can be used in the context of single studies or systematic reviews. Although the majority of approaches provides a benefit and harm comparison metric, only four approaches provide measures of uncertainty around the benefit and harm comparison metric (such as a 95 percent confidence interval). None of the approaches considers the actual joint distribution of benefit and harm outcomes, but one approach considers competing risks when calculating profile-specific event rates. Nine approaches explicitly allow incorporating patient preferences. 
Conclusion The choice of quantitative approaches depends on the specific question and goal of the benefit-harm assessment as well as on the nature and availability of data. In some situations, investigators may identify only one appropriate approach. In situations where the question and available data justify more than one approach, investigators may want to use multiple approaches and compare the consistency of results. When more evidence on relative advantages of approaches accumulates from such comparisons, it will be possible to make more specific recommendations on the choice of approaches. PMID:23163976
Comparison between Hydrogen, Methane and Ethylene Fuels in a 3-D Scramjet at Mach 8
2016-06-24
characteristics in air. The disadvantage of hydrogen is its low density, which is a particular problem for small vehicles with significant internal volume...The low energy per unit volume of gaseous hydrogen, however, is a significant problem for small vehicles with internal volume constraints, in addition
Shiradkar, Rakesh; Podder, Tarun K; Algohary, Ahmad; Viswanath, Satish; Ellis, Rodney J; Madabhushi, Anant
2016-11-10
Radiomics, or computer-extracted texture features, have been shown to achieve superior performance compared with multiparametric MRI (mpMRI) signal intensities alone in targeting prostate cancer (PCa) lesions. Radiomics along with deformable co-registration tools can be used to develop a framework to generate targeted focal radiotherapy treatment plans. The Rad-TRaP framework comprises three distinct modules. First, a module for radiomics-based detection of PCa lesions on mpMRI via a feature-enabled machine learning classifier. The second module comprises a multi-modal deformable co-registration scheme to map tissue, organ, and delineated target volumes from MRI onto CT. Finally, the third module involves generation of a radiomics-based dose plan on MRI for brachytherapy and on CT for EBRT using the target delineations transferred from the MRI to the CT. The Rad-TRaP framework was evaluated using a retrospective cohort of 23 patient studies from two different institutions. 11 patients from the first institution were used to train a radiomics classifier, which was used to detect tumor regions in 12 patients from the second institution. The ground truth cancer delineations for training the machine learning classifier were made by an experienced radiation oncologist using mpMRI, knowledge of biopsy location, and radiology reports. The detected tumor regions were used to generate treatment plans for brachytherapy using mpMRI, and tumor regions mapped from MRI to CT to generate corresponding treatment plans for EBRT. For each of EBRT and brachytherapy, three dose plans were generated: whole gland homogeneous ([Formula: see text]), which is the current clinical standard; radiomics-based focal ([Formula: see text]); and whole gland with a radiomics-based focal boost ([Formula: see text]). 
Comparison of [Formula: see text] against conventional [Formula: see text] revealed that targeted focal brachytherapy would result in a marked reduction in dosage to the OARs while ensuring that the prescribed dose is delivered to the lesions. [Formula: see text] resulted in only a marginal increase in dosage to the OARs compared to [Formula: see text]. A similar trend was observed in the case of EBRT with [Formula: see text] and [Formula: see text] compared to [Formula: see text]. A radiotherapy planning framework to generate targeted focal treatment plans has been presented. The focal treatment plans generated using the framework showed reduced dosage to the organs at risk and a boosted dose delivered to the cancerous lesions.
Estimating near-road pollutant dispersion: a model inter-comparison
A model inter-comparison study to assess the abilities of steady-state Gaussian dispersion models to capture near-road pollutant dispersion has been carried out with four models (AERMOD, run with both the area-source and volume-source options to represent roadways, CALINE, versio...
Hydraulic fracturing volume is associated with induced earthquake productivity in the Duvernay play
NASA Astrophysics Data System (ADS)
Schultz, R.; Atkinson, G.; Eaton, D. W.; Gu, Y. J.; Kao, H.
2018-01-01
A sharp increase in the frequency of earthquakes near Fox Creek, Alberta, began in December 2013 in response to hydraulic fracturing. Using a hydraulic fracturing database, we explore relationships between injection parameters and seismicity response. We show that induced earthquakes are associated with completions that used larger injection volumes (104 to 105 cubic meters) and that seismic productivity scales linearly with injection volume. Injection pressure and rate have an insignificant association with seismic response. Further findings suggest that geological factors play a prominent role in seismic productivity, as evidenced by spatial correlations. Together, volume and geological factors account for ~96% of the variability in the induced earthquake rate near Fox Creek. This result is quantified by a seismogenic index–modified frequency-magnitude distribution, providing a framework to forecast induced seismicity.
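The linear volume scaling can be sketched with a seismogenic-index style forecast; sigma and b below are illustrative placeholders, not the values fitted near Fox Creek:

```python
# Expected induced-event count grows linearly with injected volume under a
# seismogenic-index model. Parameters are hypothetical, for illustration only.
def expected_events(volume_m3, sigma=-2.0, b=1.0, mag=2.0):
    # N(M >= mag) = V * 10**(sigma - b * mag)
    return volume_m3 * 10 ** (sigma - b * mag)

# Completion volumes spanning the 10^4 to 10^5 m^3 range cited above.
counts = [expected_events(V) for V in (1e4, 5e4, 1e5)]
print(counts)  # event counts scale in proportion to injected volume
```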
Comparison of JSFR design with EDF requirements for future SFR
DOE Office of Scientific and Technical Information (OSTI.GOV)
Uematsu, M. M.; Prele, G.; Mariteau, P.
2012-07-01
A comparison of the Japan sodium-cooled fast reactor (JSFR) design with the future French SFR concept has been carried out based on the requirements of EDF, the investor-operator of the future French SFR, and the French safety baseline, under the framework of the EDF-JAEA bilateral agreement on research and development cooperation for future SFRs. (authors)
Fine, David; Warner, Lee; Salomon, Sarah; Johnson, David M
2017-07-01
We assessed the impact of staff, clinic, and community interventions on male and female family planning client visit volume and sexually transmitted infection testing at a multisite community-based health care agency. Staff training, clinic environmental changes, in-reach/outreach, and efficiency assessments were implemented in two Family Health Center (San Diego, CA) family planning clinics during 2010-2012; five Family Health Center family planning programs were identified as comparison clinics. Client visit records were compared between preintervention (2007-2009) and postintervention (2010-2012) for both sets of clinics. Of 7,826 male client visits during the time before intervention, most were for clients who were aged <30 years (50%), Hispanic (64%), and uninsured (81%). From preintervention to postintervention, intervention clinics significantly increased the number of male visits (4,004 to 8,385; Δ = +109%); for comparison clinics, male visits increased modestly (3,822 to 4,500; Δ = +18%). The proportion of male clinic visits where chlamydia testing was performed increased in intervention clinics (35% to 42%; p < .001) but decreased in comparison clinics (37% to 33%; p < .001). Subgroup analyses conducted among adolescent and young adult males yielded similar findings for male client volume and chlamydia testing. The number of female visits declined nearly 40% in both comparison (21,800 to 13,202; -39%) and intervention clinics (30,830 to 19,971; -35%) between preintervention and postintervention periods. Multilevel interventions designed to increase male client volume and sexually transmitted infection testing services in family planning clinics succeeded without affecting female client volume or services. Copyright © 2017 Society for Adolescent Health and Medicine. All rights reserved.
Intelligent transportation systems impact assessment framework : final report
DOT National Transportation Integrated Search
1995-09-30
One of the most compelling reasons for investment in Intelligent Transportation System (ITS) services is to realize a reduction in traffic congestion. Volume on America's highway network is expected to double by the year 2020 from 1.9 trillion ...
Scalable Metadata Management for a Large Multi-Source Seismic Data Repository
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gaylord, J. M.; Dodge, D. A.; Magana-Zook, S. A.
In this work, we implemented the key metadata management components of a scalable seismic data ingestion framework to address limitations in our existing system, and to position it for anticipated growth in volume and complexity.
FRAMEWORK DESIGN FOR BMP PLACEMENT IN URBAN WATERSHEDS
A number of stormwater control strategies, commonly known as best management practices (BMPs), are used to mitigate runoff volumes and associated nonpoint source pollution due to wet-weather flows (WWFs). BMP types include ponds, bioretention facilities, infiltration trenches, g...
FRAMEWORK FOR PLACEMENT OF BMPS IN URBAN WATERSHEDS
A number of stormwater control strategies, commonly known as best management practices (BMPs), are used to mitigate runoff volumes and associated nonpoint source pollution due to wet-weather flows (WWFs). BMP types include ponds, bioretention facilities, infiltration trenches, g...
Porphyrin network materials: Chemical exploration in the supramolecular solid-state
NASA Astrophysics Data System (ADS)
Kosal, Margaret Elizabeth
Rational design of solid-state materials from molecular building blocks possessing desired physical and chemical characteristics remains among the most challenging tasks for the synthetic chemist. Using p-carboxylic acid tetraphenyl porphyrin molecules, H2T(p-CO2H)PP, as the organic building block, the synthesis of novel microporous coordination framework materials has been pursued for this work. The self-assembly of the anionic carboxylate with divalent alkaline earth or transition metal cations yielded clathrate, lamellar and three-dimensional network materials. The solvothermal synthesis, characterization, and selective sorption properties of a 3-dimensional metalloporphyrin network solid, [CoT(p-CO2)PPCo1.5], named PIZA-1 for Porphyrinic Illinois Zeolite Analogue 1, have been investigated. The extended structure reveals a single, independent, neutral network with large, bi-directional oval-shaped channels (9 x 7 Å) along the crystallographic b- and c-axes and another set of channels (14 x 7 Å) along the a-axis. At the intersection of channels, an internal chamber (31 x 31 x 10 Å) is realized. Channel shape is attributable to ruffling of the metalloporphyrin macrocycles when coordinated to the bridging trinuclear Co(II)-carboxylate clusters. The void volume of the stable, thermally robust, solvate-free material is calculated to be 74% of the total unit cell volume. Size-, shape- and functional-group-selective sorption indicates a preference for water and amines. This organic zeolite analogue also demonstrates remarkable ability as a molecular sieve for removal of water from common organic solvents. By powder X-ray diffraction, BET gas adsorption studies and FTIR, this material has been shown to maintain its porous structure as a guest-free solid when heated under vacuum to 250°C. PIZA-1 demonstrates extremely high capacity for repeated selective sorption of water. 
In comparison to 4A molecular sieves, PIZA-1 exhibits higher capacity and faster response for the selective adsorption of water from common organic solvents. Molecular modeling corroborates the experimental results. The large internal cavities of PIZA-1 are a consequence of the trinuclear Co(II)-carboxylate cluster forcing the ruffling of the porphyrin building blocks. The linear trinuclear metal-carboxylate cluster of PIZA-1 is contrasted with the bent trinuclear M(II) carboxylate clusters (M = Co, Mn) of the isostructural 3-dimensional frameworks PIZA-2 and PIZA-3. Containing near-planar metalloporphyrin macrocycles, PIZA-2 and PIZA-3 manifest lower void volumes (56%).
Comparison of Physics Frameworks for WebGL-Based Game Engine
NASA Astrophysics Data System (ADS)
Yogya, Resa; Kosala, Raymond
2014-03-01
Recently, a new technology called WebGL has shown a lot of potential for developing games. However, since this technology is still new, there is much unexplored potential in the game development area. This paper tries to uncover the potential of integrating physics frameworks with WebGL technology in a game engine for developing 2D or 3D games. Specifically, we integrated three open source physics frameworks: Bullet, Cannon, and JigLib into a WebGL-based game engine. Using experiments, we assessed these frameworks in terms of their correctness or accuracy, performance, completeness and compatibility. The results show that it is possible to integrate open source physics frameworks into a WebGL-based game engine, and that Bullet is the best physics framework to be integrated into the WebGL-based game engine.
Applications of algebraic topology to compatible spatial discretizations.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bochev, Pavel Blagoveston; Hyman, James M.
We provide a common framework for compatible discretizations using algebraic topology to guide our analysis. The main concept is the natural inner product on cochains, which induces a combinatorial Hodge theory. The framework comprises mutually consistent operations of differentiation and integration, has a discrete Stokes theorem, and preserves the invariants of the DeRham cohomology groups. The latter allows for an elementary calculation of the kernel of the discrete Laplacian. Our framework provides an abstraction that includes examples of compatible finite element, finite volume and finite difference methods. We describe how these methods result from the choice of a reconstruction operator and when they are equivalent.
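A toy instance of the cochain machinery: the coboundary operator on a tiny 1-complex, the discrete Laplacian built from it, and the kernel dimension recovering the 0th cohomology (number of connected components). This is an illustrative sketch, not the paper's mimetic framework:

```python
import numpy as np

# Two disjoint edges on four vertices: a 1-complex with two components.
edges = [(0, 1), (2, 3)]
n_vertices = 4

# Coboundary d on 0-cochains: (df)(e_ij) = f(j) - f(i) (edge-vertex incidence).
d = np.zeros((len(edges), n_vertices))
for k, (i, j) in enumerate(edges):
    d[k, i], d[k, j] = -1.0, 1.0

L = d.T @ d                            # discrete Laplacian on 0-cochains
kernel_dim = n_vertices - np.linalg.matrix_rank(L)
print(kernel_dim)                      # 2: one harmonic 0-form per component
```

The kernel of L consists of 0-cochains constant on each component, matching dim H^0 of the complex.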
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thallapally, Praveen K.; Grate, Jay W.; Motkuri, Radha K.
2012-01-11
Two well-known metal-organic frameworks (MOF-5 and NiDOBDC) were synthesized and studied for facile xenon capture and separation. Our results indicate that NiDOBDC adsorbs significantly more xenon than MOF-5, releases it more readily than activated carbon, and is more selective for Xe over Kr than activated carbon.
ERIC Educational Resources Information Center
Waite, Sue; Bølling, Mads; Bentsen, Peter
2016-01-01
Using a conceptual model focused on purposes, aims, content, pedagogy, outcomes, and barriers, we review and interpret literature on two forms of outdoor learning: Forest Schools in England and "udeskole" in Denmark. We examine pedagogical principles within a comparative analytical framework and consider how adopted pedagogies reflect…
ERIC Educational Resources Information Center
Pearce, Kimber Charles; Fadely, Dean
1992-01-01
Analyzes the quasi-logical argumentative framework of George Bush's address in which he endeavored to gain compliance and justify his actions at the beginning of the Persian Gulf War. Identifies arguments of comparison and sacrifice within that framework and examines the role of justice in the speech. (TB)
A Comparison of Product Realization Frameworks
1993-10-01
software (integrated FrameMaker). Also included are BOLD for on-line documentation delivery, printer/plotter support, and network licensing support. AMPLE...are built with DSS. Documentation tools include an on-line information system (BOLD), text editing (Notepad), word processing (integrated FrameMaker)...within an application. FrameMaker is fully integrated with the Falcon Framework to provide consistent documentation capabilities within engineering
NASA Astrophysics Data System (ADS)
Mao, Yiyin; Li, Gaoran; Guo, Yi; Li, Zhoupeng; Liang, Chengdu; Peng, Xinsheng; Lin, Zhan
2017-03-01
Lithium-sulfur batteries are promising technologies for powering flexible devices due to their high energy density, low cost and environmental friendliness, when the insulating nature, shuttle effect and volume expansion of sulfur electrodes are well addressed. Here, we report a strategy of using foldable interpenetrated metal-organic frameworks/carbon nanotubes thin film for binder-free advanced lithium-sulfur batteries through a facile confinement conversion. The carbon nanotubes interpenetrate through the metal-organic frameworks crystal and interweave the electrode into a stratified structure to provide both conductivity and structural integrity, while the highly porous metal-organic frameworks endow the electrode with strong sulfur confinement to achieve good cyclability. These hierarchical porous interpenetrated three-dimensional conductive networks with well confined S8 lead to high sulfur loading and utilization, as well as high volumetric energy density.
Wang, Mingming; Sweetapple, Chris; Fu, Guangtao; Farmani, Raziyeh; Butler, David
2017-10-01
This paper presents a new framework for decision making in sustainable drainage system (SuDS) scheme design. It integrates resilience, hydraulic performance, pollution control, rainwater usage, energy analysis, greenhouse gas (GHG) emissions and costs, and has 12 indicators. The multi-criteria analysis methods of entropy weight and Technique for Order Preference by Similarity to Ideal Solution (TOPSIS) were selected to support SuDS scheme selection. The effectiveness of the framework is demonstrated with a SuDS case in China. Indicators used include flood volume, flood duration, a hydraulic performance indicator, cost and resilience. Resilience is an important design consideration, and it supports scheme selection in the case study. The proposed framework will help a decision maker to choose an appropriate design scheme for implementation without subjectivity. Copyright © 2017 Elsevier Ltd. All rights reserved.
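The entropy-weight and TOPSIS steps this abstract relies on can be sketched as follows; the decision matrix and indicator values below are hypothetical stand-ins for the paper's 12 indicators, and all indicators are treated as benefit-type for simplicity:

```python
import numpy as np

# Hypothetical decision matrix: 3 SuDS schemes (rows) x 4 benefit indicators
# (columns), e.g. flood-volume reduction, resilience, cost saving, GHG saving.
X = np.array([[0.7, 0.5, 0.9, 0.4],
              [0.6, 0.8, 0.5, 0.7],
              [0.9, 0.4, 0.6, 0.6]])

# Entropy weighting: normalise each column, compute its entropy, and give
# higher weight to indicators with more dispersion (lower entropy).
P = X / X.sum(axis=0)
k = 1.0 / np.log(X.shape[0])
entropy = -k * (P * np.log(P)).sum(axis=0)
weights = (1 - entropy) / (1 - entropy).sum()

# TOPSIS: distance of each weighted alternative to the ideal best and worst.
V = (X / np.linalg.norm(X, axis=0)) * weights
best, worst = V.max(axis=0), V.min(axis=0)
d_best = np.linalg.norm(V - best, axis=1)
d_worst = np.linalg.norm(V - worst, axis=1)
closeness = d_worst / (d_best + d_worst)   # higher = better scheme
ranking = np.argsort(-closeness)
```

A real application would first split indicators into benefit and cost types and normalise them accordingly.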
Mao, Yiyin; Li, Gaoran; Guo, Yi; Li, Zhoupeng; Liang, Chengdu; Peng, Xinsheng; Lin, Zhan
2017-01-01
Lithium–sulfur batteries are promising technologies for powering flexible devices due to their high energy density, low cost and environmental friendliness, when the insulating nature, shuttle effect and volume expansion of sulfur electrodes are well addressed. Here, we report a strategy of using foldable interpenetrated metal-organic frameworks/carbon nanotubes thin film for binder-free advanced lithium–sulfur batteries through a facile confinement conversion. The carbon nanotubes interpenetrate through the metal-organic frameworks crystal and interweave the electrode into a stratified structure to provide both conductivity and structural integrity, while the highly porous metal-organic frameworks endow the electrode with strong sulfur confinement to achieve good cyclability. These hierarchical porous interpenetrated three-dimensional conductive networks with well confined S8 lead to high sulfur loading and utilization, as well as high volumetric energy density. PMID:28262801
Physics of automated driving in framework of three-phase traffic theory.
Kerner, Boris S
2018-04-01
We have revealed physical features of automated driving in the framework of the three-phase traffic theory for which there is no fixed time headway to the preceding vehicle. A comparison with the classical model approach to automated driving for which an automated driving vehicle tries to reach a fixed (desired or "optimal") time headway to the preceding vehicle has been made. It turns out that automated driving in the framework of the three-phase traffic theory can exhibit the following advantages in comparison with the classical model of automated driving: (i) The absence of string instability. (ii) Considerably smaller speed disturbances at road bottlenecks. (iii) Automated driving vehicles based on the three-phase theory can decrease the probability of traffic breakdown at the bottleneck in mixed traffic flow consisting of human driving and automated driving vehicles; on the contrary, even a single automated driving vehicle based on the classical approach can provoke traffic breakdown at the bottleneck in mixed traffic flow.
Physics of automated driving in framework of three-phase traffic theory
NASA Astrophysics Data System (ADS)
Kerner, Boris S.
2018-04-01
We have revealed physical features of automated driving in the framework of the three-phase traffic theory for which there is no fixed time headway to the preceding vehicle. A comparison with the classical model approach to automated driving for which an automated driving vehicle tries to reach a fixed (desired or "optimal") time headway to the preceding vehicle has been made. It turns out that automated driving in the framework of the three-phase traffic theory can exhibit the following advantages in comparison with the classical model of automated driving: (i) The absence of string instability. (ii) Considerably smaller speed disturbances at road bottlenecks. (iii) Automated driving vehicles based on the three-phase theory can decrease the probability of traffic breakdown at the bottleneck in mixed traffic flow consisting of human driving and automated driving vehicles; on the contrary, even a single automated driving vehicle based on the classical approach can provoke traffic breakdown at the bottleneck in mixed traffic flow.
Hierarchical probabilistic Gabor and MRF segmentation of brain tumours in MRI volumes.
Subbanna, Nagesh K; Precup, Doina; Collins, D Louis; Arbel, Tal
2013-01-01
In this paper, we present a fully automated hierarchical probabilistic framework for segmenting brain tumours from multispectral human brain magnetic resonance images (MRIs) using multiwindow Gabor filters and an adapted Markov Random Field (MRF) framework. In the first stage, a customised Gabor decomposition is developed, based on the combined-space characteristics of the two classes (tumour and non-tumour) in multispectral brain MRIs in order to optimally separate tumour (including edema) from healthy brain tissues. A Bayesian framework then provides a coarse probabilistic texture-based segmentation of tumours (including edema) whose boundaries are then refined at the voxel level through a modified MRF framework that carefully separates the edema from the main tumour. This customised MRF is not only built on the voxel intensities and class labels as in traditional MRFs, but also models the intensity differences between neighbouring voxels in the likelihood model, along with employing a prior based on local tissue class transition probabilities. The second inference stage is shown to resolve local inhomogeneities and impose a smoothing constraint, while also maintaining the appropriate boundaries as supported by the local intensity difference observations. The method was trained and tested on the publicly available MICCAI 2012 Brain Tumour Segmentation Challenge (BRATS) Database [1] on both synthetic and clinical volumes (low grade and high grade tumours). Our method performs well compared to state-of-the-art techniques, outperforming the results of the top methods in cases of clinical high grade and low grade tumour core segmentation by 40% and 45% respectively.
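The customised Gabor decomposition stage rests on standard 2-D Gabor filters: a Gaussian envelope modulating an oriented sinusoid. A minimal numpy sketch of such a filter bank, with illustrative sizes and frequencies rather than the paper's tuned parameters:

```python
import numpy as np

def gabor_kernel(size, sigma, theta, freq):
    """Real part of a 2-D Gabor filter: a Gaussian envelope modulating
    a sinusoidal carrier at orientation theta and spatial frequency freq."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    # rotate coordinates to the filter orientation
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr**2 + yr**2) / (2 * sigma**2))
    carrier = np.cos(2 * np.pi * freq * xr)
    return envelope * carrier

# A small bank over several orientations, as a multiwindow decomposition
# would use (sizes and frequencies here are illustrative only).
bank = [gabor_kernel(15, sigma=3.0, theta=t, freq=0.2)
        for t in np.linspace(0, np.pi, 4, endpoint=False)]
```

Convolving an image with each kernel in the bank yields the texture features a Bayesian classifier can then operate on.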
Meek, M E Bette; Palermo, Christine M; Bachman, Ammie N; North, Colin M; Jeffrey Lewis, R
2014-06-01
The mode of action human relevance (MOA/HR) framework increases transparency in systematically considering data on MOA for end (adverse) effects and their relevance to humans. This framework continues to evolve as experience increases in its application. Though the MOA/HR framework is not designed to address the question of "how much information is enough" to support a hypothesized MOA in animals or its relevance to humans, its organizing construct has potential value in considering relative weight of evidence (WOE) among different cases and hypothesized MOA(s). This context is explored based on MOA analyses in published assessments to illustrate the relative extent of supporting data and their implications for dose-response analysis; the comparisons involved chemical assessments of trichloropropane and carbon tetrachloride with several hypothesized MOA(s) for cancer. The WOE for each hypothesized MOA was summarized in narrative tables based on comparison and contrast of the extent and nature of the supporting database versus potentially inconsistent or missing information. The comparison was based on evolved Bradford Hill considerations rank ordered to reflect their relative contribution to WOE determinations of MOA, taking into account increasing experience in their application internationally. This clarification of considerations for WOE determinations as a basis for comparative analysis is anticipated to contribute to increasing consistency in the application of MOA/HR analysis and, potentially, transparency in separating science judgment from public policy considerations in regulatory risk assessment. Copyright © 2014. The Authors. Journal of Applied Toxicology Published by John Wiley & Sons Ltd.
Clinical knowledge-based inverse treatment planning
NASA Astrophysics Data System (ADS)
Yang, Yong; Xing, Lei
2004-11-01
Clinical IMRT treatment plans are currently made using dose-based optimization algorithms, which do not consider the nonlinear dose-volume effects for tumours and normal structures. The choice of structure specific importance factors represents an additional degree of freedom of the system and makes rigorous optimization intractable. The purpose of this work is to circumvent the two problems by developing a biologically more sensible yet clinically practical inverse planning framework. To implement this, the dose-volume status of a structure was characterized by using the effective volume in the voxel domain. A new objective function was constructed with the incorporation of the volumetric information of the system so that the figure of merit of a given IMRT plan depends not only on the dose deviation from the desired distribution but also the dose-volume status of the involved organs. The conventional importance factor of an organ was written into a product of two components: (i) a generic importance that parametrizes the relative importance of the organs in the ideal situation when the goals for all the organs are met; (ii) a dose-dependent factor that quantifies our level of clinical/dosimetric satisfaction for a given plan. The generic importance can be determined a priori, and in most circumstances, does not need adjustment, whereas the second one, which is responsible for the intractable behaviour of the trade-off seen in conventional inverse planning, was determined automatically. An inverse planning module based on the proposed formalism was implemented and applied to a prostate case and a head-neck case. A comparison with the conventional inverse planning technique indicated that, for the same target dose coverage, the critical structure sparing was substantially improved for both cases. 
The incorporation of clinical knowledge allows us to obtain better IMRT plans and makes it possible to auto-select the importance factors, greatly facilitating the inverse planning process. The new formalism proposed also reveals the relationship between different inverse planning schemes and gives important insight into the problem of therapeutic plan optimization. In particular, we show that the EUD-based optimization is a special case of the general inverse planning formalism described in this paper.
Robust Intratumor Partitioning to Identify High-Risk Subregions in Lung Cancer: A Pilot Study.
Wu, Jia; Gensheimer, Michael F; Dong, Xinzhe; Rubin, Daniel L; Napel, Sandy; Diehn, Maximilian; Loo, Billy W; Li, Ruijiang
2016-08-01
To develop an intratumor partitioning framework for identifying high-risk subregions from (18)F-fluorodeoxyglucose positron emission tomography (FDG-PET) and computed tomography (CT) imaging and to test whether tumor burden associated with the high-risk subregions is prognostic of outcomes in lung cancer. In this institutional review board-approved retrospective study, we analyzed the pretreatment FDG-PET and CT scans of 44 lung cancer patients treated with radiation therapy. A novel, intratumor partitioning method was developed, based on a 2-stage clustering process: first at the patient level, each tumor was over-segmented into many superpixels by k-means clustering of integrated PET and CT images; next, tumor subregions were identified by merging previously defined superpixels via population-level hierarchical clustering. The volume associated with each of the subregions was evaluated using Kaplan-Meier analysis regarding its prognostic capability in predicting overall survival (OS) and out-of-field progression (OFP). Three spatially distinct subregions were identified within each tumor that were highly robust to uncertainty in PET/CT co-registration. Among these, the volume of the most metabolically active and metabolically heterogeneous solid component of the tumor was predictive of OS and OFP on the entire cohort, with a concordance index or CI of 0.66-0.67. When restricting the analysis to patients with stage III disease (n=32), the same subregion achieved an even higher CI of 0.75 (hazard ratio 3.93, log-rank P=.002) for predicting OS, and a CI of 0.76 (hazard ratio 4.84, log-rank P=.002) for predicting OFP. In comparison, conventional imaging markers, including tumor volume, maximum standardized uptake value, and metabolic tumor volume using threshold of 50% standardized uptake value maximum, were not predictive of OS or OFP, with CI mostly below 0.60 (log-rank P>.05). 
We propose a robust intratumor partitioning method to identify clinically relevant, high-risk subregions in lung cancer. We envision that this approach will be applicable to identifying useful imaging biomarkers in many cancer types. Copyright © 2016 Elsevier Inc. All rights reserved.
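The 2-stage clustering process described above (patient-level k-means over-segmentation into superpixels, then population-level hierarchical merging) can be sketched with scikit-learn; the synthetic PET/CT feature vectors and cluster counts below are illustrative assumptions, not the study's data:

```python
import numpy as np
from sklearn.cluster import KMeans, AgglomerativeClustering

rng = np.random.default_rng(0)

# Synthetic stand-in for co-registered PET/CT: each voxel carries a
# (PET uptake, CT density) feature pair; values are illustrative only.
def patient_voxels(n, centers):
    return np.vstack([c + 0.05 * rng.standard_normal((n, 2)) for c in centers])

patients = [patient_voxels(200, [(0.2, 0.3), (0.8, 0.7), (0.5, 0.9)])
            for _ in range(5)]

# Stage 1 (patient level): over-segment each tumour into superpixels
# with k-means, keeping each superpixel's mean feature vector.
superpixel_means = []
for vox in patients:
    km = KMeans(n_clusters=10, n_init=10, random_state=0).fit(vox)
    superpixel_means.append(km.cluster_centers_)
superpixel_means = np.vstack(superpixel_means)

# Stage 2 (population level): merge superpixels across patients into a
# small number of subregions that are consistent across the cohort.
agg = AgglomerativeClustering(n_clusters=3).fit(superpixel_means)
subregion_labels = agg.labels_
```

The volume of each resulting subregion per patient would then feed the survival analysis.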
Robust Intratumor Partitioning to Identify High-Risk Subregions in Lung Cancer: A Pilot Study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Jia; Gensheimer, Michael F.; Dong, Xinzhe
2016-08-01
Purpose: To develop an intratumor partitioning framework for identifying high-risk subregions from (18)F-fluorodeoxyglucose positron emission tomography (FDG-PET) and computed tomography (CT) imaging and to test whether tumor burden associated with the high-risk subregions is prognostic of outcomes in lung cancer. Methods and Materials: In this institutional review board–approved retrospective study, we analyzed the pretreatment FDG-PET and CT scans of 44 lung cancer patients treated with radiation therapy. A novel, intratumor partitioning method was developed, based on a 2-stage clustering process: first at the patient level, each tumor was over-segmented into many superpixels by k-means clustering of integrated PET and CT images; next, tumor subregions were identified by merging previously defined superpixels via population-level hierarchical clustering. The volume associated with each of the subregions was evaluated using Kaplan-Meier analysis regarding its prognostic capability in predicting overall survival (OS) and out-of-field progression (OFP). Results: Three spatially distinct subregions were identified within each tumor that were highly robust to uncertainty in PET/CT co-registration. Among these, the volume of the most metabolically active and metabolically heterogeneous solid component of the tumor was predictive of OS and OFP on the entire cohort, with a concordance index or CI of 0.66-0.67. When restricting the analysis to patients with stage III disease (n=32), the same subregion achieved an even higher CI of 0.75 (hazard ratio 3.93, log-rank P=.002) for predicting OS, and a CI of 0.76 (hazard ratio 4.84, log-rank P=.002) for predicting OFP. In comparison, conventional imaging markers, including tumor volume, maximum standardized uptake value, and metabolic tumor volume using a threshold of 50% standardized uptake value maximum, were not predictive of OS or OFP, with CI mostly below 0.60 (log-rank P>.05). 
Conclusion: We propose a robust intratumor partitioning method to identify clinically relevant, high-risk subregions in lung cancer. We envision that this approach will be applicable to identifying useful imaging biomarkers in many cancer types.
Wavelet-based fMRI analysis: 3-D denoising, signal separation, and validation metrics
Khullar, Siddharth; Michael, Andrew; Correa, Nicolle; Adali, Tulay; Baum, Stefi A.; Calhoun, Vince D.
2010-01-01
We present a novel integrated wavelet-domain based framework (w-ICA) for 3-D de-noising functional magnetic resonance imaging (fMRI) data followed by source separation analysis using independent component analysis (ICA) in the wavelet domain. We propose the idea of a 3-D wavelet-based multi-directional de-noising scheme where each volume in a 4-D fMRI data set is sub-sampled using the axial, sagittal and coronal geometries to obtain three different slice-by-slice representations of the same data. The filtered intensity value of an arbitrary voxel is computed as an expected value of the de-noised wavelet coefficients corresponding to the three viewing geometries for each sub-band. This results in a robust set of de-noised wavelet coefficients for each voxel. Given the decorrelated nature of these de-noised wavelet coefficients, it is possible to obtain more accurate source estimates using ICA in the wavelet domain. The contributions of this work can be realized as two modules. First, the analysis module, where we combine a new 3-D wavelet denoising approach with the better signal separation properties of ICA in the wavelet domain to yield an activation component that corresponds closely to the true underlying signal and is maximally independent with respect to other components. Second, we propose and describe two novel shape metrics for post-ICA comparisons between activation regions obtained through different frameworks. We verified our method using simulated as well as real fMRI data and compared our results against the conventional scheme (Gaussian smoothing + spatial ICA: s-ICA). The results show significant improvements based on two important features: (1) preservation of shape of the activation region (shape metrics) and (2) receiver operating characteristic (ROC) curves. 
It was observed that the proposed framework was able to preserve the actual activation shape in a consistent manner even for very high noise levels in addition to significant reduction in false positives voxels. PMID:21034833
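The multi-directional idea (denoise slice-by-slice along each of the axial, sagittal and coronal geometries, then take the voxelwise expected value of the three estimates) can be illustrated as follows; a median filter stands in for the paper's per-sub-band wavelet shrinkage, and the volume is synthetic:

```python
import numpy as np
from scipy.ndimage import median_filter

rng = np.random.default_rng(1)
volume = rng.standard_normal((16, 16, 16)) * 0.1
volume[4:12, 4:12, 4:12] += 1.0          # a noisy "activation" block

# Denoise slice-by-slice along each of the three viewing geometries,
# then average the three de-noised estimates voxel by voxel.
views = []
for axis in range(3):
    sliced = np.moveaxis(volume, axis, 0)
    den = np.stack([median_filter(s, size=3) for s in sliced])
    views.append(np.moveaxis(den, 0, axis))
denoised = np.mean(views, axis=0)
```

In the actual framework each 2-D slice would be wavelet-decomposed and the averaging done per sub-band before reconstruction.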
NASA Technical Reports Server (NTRS)
DeLoach, Richard; Micol, John R.
2011-01-01
The factors that determine data volume requirements in a typical wind tunnel test are identified. It is suggested that productivity in wind tunnel testing can be enhanced by managing the inference error risk associated with evaluating residuals in a response surface modeling experiment. The relationship between minimum data volume requirements and the factors upon which they depend is described, and certain simplifications to this relationship are realized when specific model adequacy criteria are adopted. The question of response model residual evaluation is treated and certain practical aspects of response surface modeling are considered, including inference subspace truncation. A wind tunnel test plan developed by using the Modern Design of Experiments illustrates the advantages of an early estimate of data volume requirements. Comparisons are made with a representative One Factor At a Time (OFAT) wind tunnel test matrix developed to evaluate a surface-to-air missile.
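One driver of minimum data volume in a response surface modeling experiment is simply the number of model coefficients to be estimated. A small illustration of that arithmetic for a full quadratic model (generic design-of-experiments bookkeeping, not the paper's specific sizing relationship):

```python
# Number of coefficients in a full second-order (quadratic) response
# surface model in k factors: intercept + k linear + k pure-quadratic +
# k*(k-1)/2 two-factor interaction terms. A fit needs at least this many
# points; extra points buy residual degrees of freedom for the residual
# evaluation the abstract describes.
def quadratic_terms(k):
    return 1 + 2 * k + k * (k - 1) // 2

def min_points(k, residual_df):
    return quadratic_terms(k) + residual_df

# e.g. 4 hypothetical factors (alpha, beta, Mach, control deflection)
p = quadratic_terms(4)     # 15 model terms
n = min_points(4, 10)      # 25 points for 10 residual degrees of freedom
```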
Farooq, Zerwa; Behzadi, Ashkan Heshmatzadeh; Blumenfeld, Jon D; Zhao, Yize; Prince, Martin R
To compare MRI segmentation methods for measuring liver cyst volumes in autosomal dominant polycystic kidney disease (ADPKD). Liver cyst volumes in 42 ADPKD patients were measured using region growing, thresholding and cyst diameter techniques. Manual segmentation was the reference standard. Root mean square deviation was 113, 155, and 500 for cyst diameter, thresholding, and region growing, respectively. Thresholding error for cyst volumes below 500 ml was 550% vs 17% for cyst volumes above 500 ml (p<0.001). For measuring volume of a small number of cysts, cyst diameter and manual segmentation methods are recommended. For severe disease with numerous, large hepatic cysts, thresholding is an acceptable alternative. Copyright © 2017 Elsevier Inc. All rights reserved.
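Diameter-based cyst volumetry conventionally assumes each cyst is a sphere, V = πd³/6, and accuracy here is scored by root mean square deviation against manual segmentation. A sketch under those assumptions (the cyst measurements below are hypothetical, and the paper's exact formula may differ):

```python
import math

def sphere_volume_ml(diameter_cm):
    """Sphere volume from diameter; 1 cm^3 == 1 ml. (The common assumption
    behind diameter-based cyst volumetry.)"""
    return math.pi * diameter_cm**3 / 6.0

def rmsd(estimates, reference):
    """Root mean square deviation of a method against the reference standard."""
    n = len(estimates)
    return math.sqrt(sum((e - r) ** 2 for e, r in zip(estimates, reference)) / n)

# Hypothetical example: three cysts, diameter estimates vs manual volumes (ml)
diameters = [3.0, 5.2, 9.8]
est = [sphere_volume_ml(d) for d in diameters]
manual = [15.0, 70.0, 480.0]
error = rmsd(est, manual)
```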
ERIC Educational Resources Information Center
Keefe, James W., Ed.; Walberg, Herbert J., Ed.
This volume represents a variety of current efforts to incorporate thought-provoking methods into teaching. There are three sections. "Curriculum Developments" defines key curricular terms and offers a framework and general examples of teaching tactics. In this section, Barbara Presseisen distinguishes thinking from other cognitive…
FRAMEWORK FOR PLACEMENT OF BMP/LID IN URBAN WATERSHEDS
A number of stormwater control strategies, commonly known as best management practices (BMPs), are used to mitigate runoff volumes and associated nonpoint source pollution due to wet-weather flows (WWFs). BMP types include ponds, bioretention facilities, infiltration trenches, gr...
FRAMEWORK FOR PLACEMENT OF BMP/LID IN URBAN WATERSHED
A number of stormwater control strategies, commonly known as best management practices (BMPs), are used to mitigate runoff volumes and associated nonpoint source pollution due to wet-weather flows (WWFs). BMP types include ponds, bioretention facilities, infiltration trenches, g...
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hovanski, Yuri; Carsley, John; Carlson, Blair
2014-01-15
A comparison of welding techniques was performed to determine the most effective method for producing aluminum tailor-welded blanks for high volume automotive applications. Aluminum sheet was joined with an emphasis on post-weld formability, surface quality and weld speed. Comparative results from several laser based welding techniques along with friction stir welding are presented. The results of this study demonstrate a quantitative comparison of weld methodologies in preparing tailor-welded aluminum stampings for high volume production in the automotive industry. Evaluation of nearly a dozen welding variations ultimately led to down-selecting a single process based on post-weld quality and performance.
Yilmaz, Burak; Alshahrani, Faris A; Kale, Ediz; Johnston, William M
2018-02-06
Veneering with porcelain may adversely affect the marginal fit of long-span computer-aided design and computer-aided manufacturing (CAD-CAM) implant-supported fixed prostheses. Moreover, data regarding the precision of fit of CAD-CAM-fabricated implant-supported complete zirconia fixed dental prostheses (FDPs) before and after porcelain layering are limited. The purpose of this in vitro study was to evaluate the effect of porcelain layering on the marginal fit of CAD-CAM-fabricated complete-arch implant-supported, screw-retained FDPs with presintered zirconia frameworks compared with titanium. An autopolymerizing acrylic resin-fixed complete denture framework prototype was fabricated on an edentulous typodont master model (all-on-4 concept; Nobel Biocare) with 2 straight anterior and 2 distally tilted posterior internal-hexagon dental implants, with multiunit abutments bilaterally in the canine and first molar locations. A 3-dimensional (3D) laser scanner (S600 ARTI; Zirkonzahn) was used to digitize the prototype and the master model by using scan bodies to generate a virtual 3D CAD framework. Five presintered zirconia (ICE Zirkon Translucent - 95H16; Zirkonzahn) and 5 titanium (Titan 5 - 95H14; Zirkonzahn) frameworks were fabricated using the CAM milling unit (M1 Wet Heavy Metal Milling Unit; Zirkonzahn). The 1-screw test was applied by fixing the frameworks at the location of the maxillary left first molar abutment, and an industrial computed tomography (CT) scanner (XT H 225 - Basic Configuration; Nikon) was used to scan the framework-model complex to evaluate the passive fit of the frameworks on the master model. The scanned data were transported in standard tessellation language (STL) format from Volume Graphics analysis software to PolyWorks analysis software by using the maximum-fit algorithm to fit scanned planes so as to best mimic the mating surfaces. 
3D virtual assessment of the marginal fit was performed at the abutment-framework interface at the maxillary right canine (gap 3) and right first molar (gap 4) abutments without prosthetic screws. The facial or buccal aspects of the teeth on frameworks were layered with corresponding porcelain (Initial Dental Ceramic System; GC) and CT-scanned again using the same protocol. Marginal fit measurements were made for 4 groups: titanium (Ti) (control), porcelain-layered titanium (Ti-P) (control), zirconia (Zr), and porcelain-layered zirconia (Zr-P). 3D discrepancy mean values were computed and calculated, and the results were analyzed with a repeated measures 3-way ANOVA using the maximum likelihood estimation method and Bonferroni adjustments for selected pairwise comparison t-tests (α=.05). The 3D fit was measured at gap 3 and gap 4. Statistically significant differences in mean 3D discrepancies were observed between Zr-P (175 μm) and Zr (89 μm) and between Zr-P and Ti-P (71 μm) (P<.001). Porcelain layering had a significant effect on the marginal fit of CAD-CAM-fabricated complete-arch implant-supported, screw-retained FDPs with partially sintered zirconia frameworks. 3D marginal discrepancy mean values for all groups were within clinically acceptable limits (<120 μm), except for the layered zirconia framework. Copyright © 2017 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.
Thermodynamic Entropy and the Accessible States of Some Simple Systems
ERIC Educational Resources Information Center
Sands, David
2008-01-01
Comparison of the thermodynamic entropy with Boltzmann's principle shows that under conditions of constant volume the total number of arrangements in a simple thermodynamic system with temperature-independent constant-volume heat capacity, C, is T^(C/k). A physical interpretation of this function is given for three such systems: an…
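The claimed count of arrangements, W = T^(C/k), is consistent with Boltzmann's principle: S = k ln W = C ln T, the thermodynamic entropy of a constant-heat-capacity system up to an additive constant. A quick numeric check of that agreement:

```python
import math

k_B = 1.380649e-23          # Boltzmann constant, J/K
C = 3 * k_B                 # a toy temperature-independent heat capacity

def entropy_from_W(T):
    # S = k ln W with W = T^(C/k); C/k is small here, so W stays finite
    return k_B * math.log(T ** (C / k_B))

def entropy_thermo(T):
    # S = C ln T for constant heat capacity (additive constant dropped)
    return C * math.log(T)

# The two expressions agree for any T > 0
assert math.isclose(entropy_from_W(300.0), entropy_thermo(300.0))
```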
Volume Comparison of Pine, Spruce, and Aspen Growing Side by Side
David H. Alban
1985-01-01
Red pine produced significantly more volume than the other species on all five sites in the Lake States. By age 40 to 50, white spruce was second to red pine, and beyond this age these two species are expected to increase their lead over the other species even more.
Status and trends of land change in the Midwest–South Central United States—1973 to 2000
Auch, Roger F.; Karstensen, Krista A.; Auch, Roger F.; Karstensen, Krista A.
2015-12-10
U.S. Geological Survey (USGS) Professional Paper 1794–C is the third in a four-volume series on the status and trends of the Nation’s land use and land cover, providing an assessment of the rates and causes of land-use and land-cover change in the Midwest–South Central United States between 1973 and 2000. Volumes A, B, and D provide similar analyses for the Western United States, the Great Plains of the United States, and the Eastern United States, respectively. The assessments of land-use and land-cover trends are conducted on an ecoregion-by-ecoregion basis, and each ecoregion assessment is guided by a nationally consistent study design that includes mapping, statistical methods, field studies, and analysis. Individual assessments provide a picture of the characteristics of land change occurring in a given ecoregion; in combination, they provide a framework for understanding the complex national mosaic of change and also the causes and consequences of change. Thus, each volume in this series provides a regional assessment of how (and how fast) land use and land cover are changing, and why. The four volumes together form the first comprehensive picture of land change across the Nation.Geographic understanding of land-use and land-cover change is directly relevant to a wide variety of stakeholders, including land and resource managers, policymakers, and scientists. The chapters in this volume present brief summaries of the patterns and rates of land change observed in each ecoregion in the Midwest–South Central United States, together with field photographs, statistics, and comparisons with other assessments. In addition, a synthesis chapter summarizes the scope of land change observed across the entire Midwest–South Central United States. 
The studies provide a way of integrating information across the landscape, and they form a critical component in the efforts to understand how land use and land cover affect important issues such as the provision of ecological goods and services and also the determination of risks to, and vulnerabilities of, human communities. Results from this project also are published in peer-reviewed journals, and they are further used to produce maps of change and other tools for land management, as well as to provide inputs for carbon-cycle modeling and other climate change research.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Minelli, Matteo; Doghieri, Ferruccio
2014-05-15
Data for the kinetics of mass uptake in vapor sorption experiments on thin glassy polymer samples are interpreted here in terms of relaxation times for volume dilation. Both models from non-equilibrium thermodynamics and models from the mechanics of volume relaxation contribute to this result. Different kinds of sorption experiments were considered in order to facilitate direct comparison between the kinetics of solute-induced volume dilation and corresponding data from processes driven by pressure or temperature jumps.
NASA Astrophysics Data System (ADS)
Liu, Jiachao; Li, Ziyi; Chen, Kewei; Yao, Li; Wang, Zhiqun; Li, Kunchen; Guo, Xiaojuan
2011-03-01
Gray matter volume and cortical thickness are two indices of concern in brain structure magnetic resonance imaging research. Gray matter volume reflects mixed-measurement information of the cerebral cortex, while cortical thickness reflects only the distance between the inner and outer surfaces of the cerebral cortex. Using Scaled Subprofile Modeling based on Principal Component Analysis (SSM_PCA) and Pearson's Correlation Analysis, this study further provided quantitative comparisons and depicted both global and local relevance to comprehensively investigate morphometrical abnormalities in the cerebral cortex in Alzheimer's disease (AD). Thirteen patients with AD and thirteen age- and gender-matched healthy controls were included in this study. Results showed that factor scores from the first 8 principal components accounted for ~53.38% of the total variance for gray matter volume, and ~50.18% for cortical thickness. Factor scores from the fifth principal component showed significant correlation. In addition, gray matter voxel-based volume was closely related to cortical thickness alterations across most of the cerebral cortex, especially in typically abnormal brain regions such as the insula and the parahippocampal gyrus in AD. These findings suggest that these two measurements are effective indices for understanding the neuropathology of AD. Studies using both gray matter volume and cortical thickness can separate the causes of the discrepancy, provide complementary information and carry out a comprehensive description of the morphological changes of brain structure.
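The SSM_PCA-plus-correlation pipeline (principal component factor scores per subject for each morphometric measure, then Pearson correlation between corresponding component scores) can be sketched as follows; the subject-by-region matrices are random stand-ins for the real MRI measures:

```python
import numpy as np
from sklearn.decomposition import PCA
from scipy.stats import pearsonr

rng = np.random.default_rng(2)

# Hypothetical stand-ins: 26 subjects x 100 regional measures for gray
# matter volume and cortical thickness (the study had 13 AD + 13 controls).
n_subj, n_regions = 26, 100
shared = rng.standard_normal((n_subj, 1))    # a common morphometric factor
gm_volume = shared @ rng.standard_normal((1, n_regions)) \
    + 0.5 * rng.standard_normal((n_subj, n_regions))
thickness = shared @ rng.standard_normal((1, n_regions)) \
    + 0.5 * rng.standard_normal((n_subj, n_regions))

# PCA of each measure gives per-subject factor scores (SSM-style);
# explained variance plays the role of the ~53%/~50% figures above.
pca = PCA(n_components=8)
scores_vol = pca.fit_transform(gm_volume)
var_vol = pca.explained_variance_ratio_.sum()
scores_thk = pca.fit_transform(thickness)

# Correlate corresponding component scores across subjects
r, p_value = pearsonr(scores_vol[:, 0], scores_thk[:, 0])
```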
[Comparison of SIB-IMRT treatment plans for upper esophageal carcinoma].
Fu, Wei-hua; Wang, Lv-hua; Zhou, Zong-mei; Dai, Jian-rong; Hu, Yi-min
2003-06-01
To implement simultaneous integrated boost intensity-modulated radiotherapy (SIB-IMRT) plans for upper esophageal carcinoma and investigate the dose profiles of the tumor and electively treated region and the dose to organs at risk (OARs). SIB-IMRT plans were designed for two patients with upper esophageal carcinoma. Two target volumes were predefined: PTV1, the target volume of the primary lesion, which was given 67.2 Gy, and PTV2, the target volume of the electively treated region, which was given 50.4 Gy. With the same dose-volume constraints but different beam arrangements (3, 5, 7, or 9 equispaced coplanar beams), four plans were generated. Indices, including dose distribution, dose volume histogram (DVH) and conformity index, were used for comparison of these plans. The plan with three intensity-modulated beams could produce good dose distribution for the two target volumes. The dose conformity to targets and the dose to OARs were improved as the beam number increased. The dose distributions in targets changed little when the beam number increased from 7 to 9. Five to seven intensity-modulated beams can produce desirable dose distributions for simultaneous integrated boost (SIB) treatment of upper esophageal carcinoma. The primary tumor can receive a higher equivalent dose with SIB treatments. It is easier and more efficient to design plans with equispaced coplanar beams. The efficacy of SIB-IMRT remains to be determined by the clinical outcome.
Hippocampus and amygdala volumes in parents of children with autistic disorder.
Rojas, Donald C; Smith, J Allegra; Benkers, Tara L; Camou, Suzanne L; Reite, Martin L; Rogers, Sally J
2004-11-01
Structural and functional abnormalities in the medial temporal lobe, particularly the hippocampus and amygdala, have been described in people with autism. The authors hypothesized that parents of children with a diagnosis of autistic disorder would show similar changes in these structures. Magnetic resonance imaging scans were performed in 17 biological parents of children with a diagnosis of DSM-IV autistic disorder. The scans were compared with scans from 15 adults with autistic disorder and 17 age-matched comparison subjects with no personal or familial history of autism. The volumes of the hippocampus, amygdala, and total brain were measured in all participants. The volume of the left hippocampus was larger in both the parents of children with autistic disorder and the adults with autistic disorder, relative to the comparison subjects. The hippocampus was significantly larger in the adults with autistic disorder than in the parents of children with autistic disorder. The left amygdala was smaller in the adults with autistic disorder, relative to the other two groups. No differences in total brain volume were observed between the three groups. The finding of larger hippocampal volume in autism is suggestive of abnormal early neurodevelopmental processes but is partly consistent with only one prior study and contradicts the findings of several others. The finding of larger hippocampal volume for the parental group suggests a potential genetic basis for hippocampal abnormalities in autism.
Low cost sensing of vegetation volume and structure with a Microsoft Kinect sensor
NASA Astrophysics Data System (ADS)
Azzari, G.; Goulden, M.
2011-12-01
The market for videogames and digital entertainment has decreased the cost of advanced technology to affordable levels. The Microsoft Kinect sensor for Xbox 360 is an infrared time-of-flight camera designed to track body position and movement at a single-articulation level. Using open-source drivers and libraries, we acquired point clouds of vegetation directly from the Kinect sensor. The data were filtered for outliers, co-registered, and cropped to isolate the plant of interest from the surroundings and soil. The volume of single plants was then estimated with several techniques, including fitting with solid shapes (cylinders, spheres, boxes), voxel counts, and 3D convex/concave hulls. Preliminary results are presented here. The volume of a series of wild artichoke plants was measured from nadir using a Kinect on a 3-m-tall tower. The calculated volumes were compared with harvested biomass; comparisons and derived allometric relations will be presented, along with examples of the acquired point clouds. The Kinect sensor shows promise for ground-based, automated biomass measurement systems, and possibly for comparison/validation of remotely sensed LIDAR.
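Of the volume-estimation techniques listed, voxel counting is the simplest to sketch: discretize the point cloud into a regular grid and multiply the number of occupied voxels by the voxel volume. The example below applies this to an assumed stand-in cloud (points filling a hemisphere) rather than real Kinect data; point density and voxel size are illustrative choices that strongly affect the estimate.

```python
import numpy as np

rng = np.random.default_rng(1)
# Assumed stand-in for a plant point cloud: points filling a 0.5 m radius
# hemisphere (metres), so the true volume is known for comparison.
pts = rng.uniform(-0.5, 0.5, size=(50000, 3))
pts = pts[(np.linalg.norm(pts, axis=1) <= 0.5) & (pts[:, 2] >= 0.0)]

def voxel_volume(points, voxel=0.05):
    """Estimate occupied volume by counting distinct occupied voxels."""
    idx = np.floor(points / voxel).astype(int)
    occupied = len(np.unique(idx, axis=0))
    return occupied * voxel**3

v = voxel_volume(pts)
true_v = 0.5 * (4.0 / 3.0) * np.pi * 0.5**3   # hemisphere volume, m^3
print(round(v, 3), round(true_v, 3))
```

Voxel counting overestimates at the surface (partially filled voxels count fully) and underestimates where the cloud is sparse, which is one reason the abstract compares it against hull-based and shape-fitting estimates.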
Lustig, Audrey; Worner, Susan P; Pitt, Joel P W; Doscher, Crile; Stouffer, Daniel B; Senay, Senait D
2017-10-01
Natural and human-induced events are continuously altering the structure of our landscapes and as a result impacting the spatial relationships between individual landscape elements and the species living in the area. Yet, only recently has the influence of the surrounding landscape on invasive species spread started to be considered. The scientific community increasingly recognizes the need for a broader modeling framework that focuses on cross-study comparisons at different spatiotemporal scales. Using two illustrative examples, we introduce a general modeling framework that allows for a systematic investigation of the effect of habitat change on invasive species establishment and spread. The essential parts of the framework are (i) a mechanistic spatially explicit model (a modular dispersal framework, MDIG) that allows population dynamics and dispersal to be modeled in a geographical information system (GIS), (ii) a landscape generator that allows replicated landscape patterns with partially controllable spatial properties to be generated, and (iii) landscape metrics that depict the essential aspects of landscape with which dispersal and demographic processes interact. The modeling framework provides functionality for a wide variety of applications, ranging from predictions of the spatiotemporal spread of real species and comparison of potential management strategies, to theoretical investigation of the effect of habitat change on population dynamics. Such a framework allows one to quantify how small-grain landscape characteristics, such as habitat size and habitat connectivity, interact with life-history traits to determine the dynamics of invasive species spread in fragmented landscapes. As such, it will give deeper insights into species traits and landscape features that lead to establishment and spread success, and may be key to preventing new incursions and to developing efficient monitoring, surveillance, control or eradication programs.
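The core coupling the framework describes, population dynamics plus dispersal on a habitat map, can be sketched with a minimal grid model. The example below is not MDIG; it is a generic illustration with assumed parameters: logistic growth, a fixed fraction dispersing to the four neighbours each step (periodic boundaries for simplicity), and establishment only in suitable habitat cells drawn at random.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 50
habitat = rng.random((n, n)) < 0.7   # random suitable-habitat map (assumed)
habitat[25, 25] = True               # ensure the incursion cell is suitable
pop = np.zeros((n, n))
pop[25, 25] = 1.0                    # initial incursion

r, K, d = 0.5, 10.0, 0.2             # growth rate, capacity, dispersal fraction
for _ in range(60):
    pop += r * pop * (1.0 - pop / K)             # local logistic growth
    moved = d * pop                              # dispersing individuals
    pop = pop - moved + 0.25 * (np.roll(moved, 1, 0) + np.roll(moved, -1, 0)
                                + np.roll(moved, 1, 1) + np.roll(moved, -1, 1))
    pop *= habitat                               # no establishment off-habitat

occupied = int((pop > 0.01).sum())
print(occupied)
```

Rerunning this over replicated habitat maps with varying suitability fraction is the kind of systematic habitat-change experiment the framework is built to support.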
Free stream capturing in fluid conservation law for moving coordinates in three dimensions
NASA Technical Reports Server (NTRS)
Obayashi, Shigeru
1991-01-01
The free-stream capturing technique for both the finite-volume (FV) and finite-difference (FD) framework is summarized. For an arbitrary motion of the grid, the FV analysis shows that volumes swept by all six surfaces of the cell have to be computed correctly. This means that the free-stream capturing time-metric terms should be calculated not only from a surface vector of a cell at a single time level, but also from a volume swept by the cell surface in space and time. The FV free-stream capturing formulation is applicable to the FD formulation by proper translation from an FV cell to an FD mesh.
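The key geometric quantity above, the volume swept by a cell face between two time levels, can be computed by decomposing the swept hexahedron into tetrahedra. The sketch below (an illustration of the geometry, not the paper's full FV/FD formulation) uses the standard six-tetrahedron split sharing one diagonal, valid for the convex case, and checks it against a pure translation where the swept volume must equal face area times displacement.

```python
import numpy as np

def tet_volume(a, b, c, d):
    # Unsigned volume of a tetrahedron from its four corners.
    return abs(np.dot(np.cross(b - a, c - a), d - a)) / 6.0

def swept_volume(face_t0, face_t1):
    """Volume swept by a quadrilateral cell face between two time levels,
    via splitting the swept hexahedron into six tetrahedra that share the
    0-6 diagonal (a sketch; valid for convex swept regions)."""
    v = np.vstack([face_t0, face_t1])  # corners 0-3 at t0, 4-7 at t1
    tets = [(0, 1, 2, 6), (0, 2, 3, 6), (0, 3, 7, 6),
            (0, 7, 4, 6), (0, 4, 5, 6), (0, 5, 1, 6)]
    return sum(tet_volume(v[i], v[j], v[k], v[l]) for i, j, k, l in tets)

# Unit square face translated 0.25 along its normal: swept volume = area * h.
face0 = np.array([[0., 0., 0.], [1., 0., 0.], [1., 1., 0.], [0., 1., 0.]])
face1 = face0 + np.array([0., 0., 0.25])
print(swept_volume(face0, face1))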
World-volume effective theory for higher-dimensional black holes.
Emparan, Roberto; Harmark, Troels; Niarchos, Vasilis; Obers, Niels A
2009-05-15
We argue that the main feature behind novel properties of higher-dimensional black holes, compared to four-dimensional ones, is that their horizons can have two characteristic lengths of very different size. We develop a long-distance world-volume effective theory that captures the black hole dynamics at scales much larger than the short scale. In this limit the black hole is regarded as a blackfold: a black brane (possibly boosted locally) whose world volume spans a curved submanifold of the spacetime. This approach reveals black objects with novel horizon geometries and topologies more complex than the black ring, but more generally it provides a new organizing framework for the dynamics of higher-dimensional black holes.
Inter-algorithm lesion volumetry comparison of real and 3D simulated lung lesions in CT
NASA Astrophysics Data System (ADS)
Robins, Marthony; Solomon, Justin; Hoye, Jocelyn; Smith, Taylor; Ebner, Lukas; Samei, Ehsan
2017-03-01
The purpose of this study was to establish volumetric exchangeability between real and computational lung lesions in CT. We compared the overall relative volume estimation performance of segmentation tools when used to measure real lesions in actual patient CT images and computational lesions virtually inserted into the same patient images (i.e., hybrid datasets). Pathologically confirmed malignancies from 30 thoracic patient cases from the Reference Image Database to Evaluate Therapy Response (RIDER) were modeled and used as the basis for the comparison. Lesions included isolated nodules as well as those attached to the pleura or other lung structures. Patient images were acquired using a 16 detector row or 64 detector row CT scanner (Lightspeed 16 or VCT; GE Healthcare). Scans were acquired using standard chest protocols during a single breath-hold. Virtual 3D lesion models based on real lesions were developed in Duke Lesion Tool (Duke University) and inserted using a validated image-domain insertion program. Nodule volumes were estimated using multiple commercial segmentation tools (iNtuition, TeraRecon, Inc.; Syngo.via, Siemens Healthcare; and IntelliSpace, Philips Healthcare). Consensus-based volume comparison showed consistent trends in volume measurement between real and virtual lesions across all software. The average percent bias (+/- standard error) was -9.2+/-3.2% for real lesions versus -6.7+/-1.2% for virtual lesions with tool A, 3.9+/-2.5% and 5.0+/-0.9% for tool B, and 5.3+/-2.3% and 1.8+/-0.8% for tool C, respectively. Virtual lesion volumes were statistically similar to those of real lesions (< 4% difference) with p > .05 in most cases. Results suggest that hybrid datasets had similar inter-algorithm variability compared to real datasets.
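The "percent bias (+/- standard error)" metric reported above is straightforward to compute from paired measured and reference volumes. The sketch below uses hypothetical volumes, not the study's data, simply to show the calculation.

```python
import numpy as np

def percent_bias(measured, true):
    """Per-lesion percent volume error, returning its mean (the bias)
    and the standard error of that mean."""
    err = 100.0 * (np.asarray(measured) - np.asarray(true)) / np.asarray(true)
    return err.mean(), err.std(ddof=1) / np.sqrt(len(err))

# Hypothetical volumes in mm^3 (illustrative numbers, not study data).
true_v = np.array([500.0, 1200.0, 800.0, 300.0, 950.0])
meas_v = np.array([460.0, 1150.0, 730.0, 290.0, 880.0])
bias, se = percent_bias(meas_v, true_v)
print(round(bias, 1), round(se, 1))
```

A negative bias, as for tool A in the abstract, means the segmentation tool systematically underestimates volume relative to the reference.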
Vertical Case Studies and the Challenges of Culture, Context and Comparison
ERIC Educational Resources Information Center
Bartlett, Lesley
2014-01-01
The Teachers College Symposium invited scholars to rethink culture, context, and comparison in educational research. In his response to these questions (this volume), Joe Tobin promoted comparative ethnographies to understand how social, cultural, and political processes play out across multiple locations and time periods. He urged careful…
On Capillary Rise and Nucleation
ERIC Educational Resources Information Center
Prasad, R.
2008-01-01
A comparison of capillary rise and nucleation is presented. It is shown that both phenomena result from a balance between two competing energy factors: a volume energy and a surface energy. Such a comparison may help to introduce nucleation with a topic familiar to the students, capillary rise. (Contains 1 table and 3 figures.)
ERIC Educational Resources Information Center
Limongelli, Carla; Sciarrone, Filippo; Temperini, Marco; Vaste, Giulia
2011-01-01
LS-Lab provides automatic support to comparison/evaluation of the Learning Object Sequences produced by different Curriculum Sequencing Algorithms. Through this framework a teacher can verify the correspondence between the behaviour of different sequencing algorithms and her pedagogical preferences. In fact the teacher can compare algorithms…
A Submodularity Framework for Data Subset Selection
2013-09-01
[List-of-tables residue: language-modeling corpora for the Arabic-to-English NIST task; comparisons of BLEU (%) and PER scores on the Transtac Arabic-to-English and English-to-Arabic tasks.]
ERIC Educational Resources Information Center
St. Louis, Kenneth O.
2011-01-01
Purpose: The "Public Opinion Survey of Human Attributes-Stuttering" ("POSHA-S") was developed to make available worldwide a standard measure of public attitudes toward stuttering that is practical, reliable, valid, and translatable. Mean data from past field studies as comparisons for interpretation of "POSHA-S" results are reported. Method: Means…
Lai, Fu-Jou; Chang, Hong-Tsun; Wu, Wei-Sheng
2015-01-01
Background Computational identification of cooperative transcription factor (TF) pairs helps understand the combinatorial regulation of gene expression in eukaryotic cells. Many advanced algorithms have been proposed to predict cooperative TF pairs in yeast. However, it is still difficult to conduct a comprehensive and objective performance comparison of different algorithms because of the lack of sufficient performance indices and adequate overall performance scores. To solve this problem, in our previous study (published in BMC Systems Biology 2014), we adopted/proposed eight performance indices and designed two overall performance scores to compare the performance of 14 existing algorithms for predicting cooperative TF pairs in yeast. Most importantly, our performance comparison framework can be applied to comprehensively and objectively evaluate the performance of a newly developed algorithm. However, to use our framework, researchers have to put in a lot of effort to construct it first. To save researchers time and effort, here we develop a web tool to implement our performance comparison framework, featuring fast data processing, a comprehensive performance comparison and an easy-to-use web interface. Results The developed tool is called PCTFPeval (Predicted Cooperative TF Pair evaluator), written in the PHP and Python programming languages. The friendly web interface allows users to input a list of predicted cooperative TF pairs from their algorithm and select (i) the compared algorithms among the 15 existing algorithms, (ii) the performance indices among the eight existing indices, and (iii) the overall performance scores from two possible choices. The comprehensive performance comparison results are then generated in tens of seconds and shown as both bar charts and tables. The original comparison results for each compared algorithm and each selected performance index can be downloaded as text files for further analyses.
Conclusions Allowing users to select eight existing performance indices and 15 existing algorithms for comparison, our web tool benefits researchers who are eager to comprehensively and objectively evaluate the performance of their newly developed algorithm. Thus, our tool greatly expedites progress in the research of computational identification of cooperative TF pairs. PMID:26677932
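The basic ingredient of such a performance comparison is scoring a set of predicted TF pairs against a benchmark set of known cooperative pairs. The sketch below shows only precision and recall on hypothetical pair lists; PCTFPeval's actual eight indices and benchmark data are richer than this illustration.

```python
def evaluate_pairs(predicted, benchmark):
    """Precision and recall of predicted cooperative TF pairs against a
    benchmark set; pairs are unordered, so normalize each to a frozenset."""
    pred = {frozenset(p) for p in predicted}
    bench = {frozenset(p) for p in benchmark}
    tp = len(pred & bench)
    precision = tp / len(pred) if pred else 0.0
    recall = tp / len(bench) if bench else 0.0
    return precision, recall

# Hypothetical TF pairs (illustrative; not from the PCTFPeval benchmark).
predicted = [("MBP1", "SWI6"), ("SWI4", "SWI6"), ("GAL4", "GAL80")]
benchmark = [("SWI6", "MBP1"), ("SWI4", "SWI6"), ("FKH1", "FKH2")]
p, r = evaluate_pairs(predicted, benchmark)
print(p, r)
```

Note the frozenset normalization: ("MBP1", "SWI6") and ("SWI6", "MBP1") must count as the same pair, a detail any pair-level comparison framework has to handle.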
OpenCluster: A Flexible Distributed Computing Framework for Astronomical Data Processing
NASA Astrophysics Data System (ADS)
Wei, Shoulin; Wang, Feng; Deng, Hui; Liu, Cuiyin; Dai, Wei; Liang, Bo; Mei, Ying; Shi, Congming; Liu, Yingbo; Wu, Jingping
2017-02-01
The volume of data generated by modern astronomical telescopes is extremely large and rapidly growing. However, current high-performance data processing architectures/frameworks are not well suited for astronomers because of their limitations and programming difficulties. In this paper, we therefore present OpenCluster, an open-source distributed computing framework to support rapidly developing high-performance processing pipelines of astronomical big data. We first detail the OpenCluster design principles and implementations and present the APIs facilitated by the framework. We then demonstrate a case in which OpenCluster is used to resolve complex data processing problems for developing a pipeline for the Mingantu Ultrawide Spectral Radioheliograph. Finally, we present our OpenCluster performance evaluation. Overall, OpenCluster provides not only high fault tolerance and simple programming interfaces, but also a flexible means of scaling up the number of interacting entities. OpenCluster thereby provides an easily integrated distributed computing framework for quickly developing a high-performance data processing system of astronomical telescopes and for significantly reducing software development expenses.
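The abstract does not show OpenCluster's API, so the sketch below uses the Python standard library's thread pool only as an analogy for the general pattern such frameworks provide: scatter independent data chunks to workers, gather the results. The `process_frame` function and its inputs are hypothetical stand-ins for a per-frame pipeline step.

```python
from multiprocessing.pool import ThreadPool

def process_frame(frame_id):
    # Hypothetical stand-in for a per-frame calibration/imaging step.
    return frame_id, frame_id * frame_id

# Scatter 16 independent "frames" across 4 workers and gather the results.
with ThreadPool(4) as pool:
    results = dict(pool.map(process_frame, range(16)))
print(len(results), results[3])
```

A framework like OpenCluster adds what this toy lacks: distribution across machines, fault tolerance, and dynamic scaling of the number of interacting entities.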
DOE Office of Scientific and Technical Information (OSTI.GOV)
Efroymson, R.A.
2001-01-12
This is a companion report to the risk assessment framework proposed by Suter et al. (1998): ''A Framework for Assessment of Risks of Military Training and Testing to Natural Resources,'' hereafter referred to as the ''generic framework.'' The generic framework is an ecological risk assessment methodology for use in environmental assessments on Department of Defense (DoD) installations. In the generic framework, the ecological risk assessment framework of the US Environmental Protection Agency (EPA 1998) is modified for use in the context of (1) multiple and diverse stressors and activities at a military installation and (2) risks resulting from causal chains, e.g., effects on habitat that indirectly impact wildlife. Both modifications are important if the EPA framework is to be used on military installations. In order for the generic risk assessment framework to be useful to DoD environmental staff and contractors, the framework must be applied to specific training and testing activities. Three activity-specific ecological risk assessment frameworks have been written (1) to aid environmental staff in conducting risk assessments that involve these activities and (2) to guide staff in the development of analogous frameworks for other DoD activities. The three activities are: (1) low-altitude overflights by fixed-wing and rotary-wing aircraft (this volume), (2) firing at targets on land, and (3) ocean explosions. The activities were selected as priority training and testing activities by the advisory committee for this project.
Mathematical modelling of fluid transport and its regulation at multiple scales.
Chara, Osvaldo; Brusch, Lutz
2015-04-01
Living matter equals water, to a first approximation, and water transport across barriers such as membranes and epithelia is vital. Water serves two competing functions. On the one hand, it is the fundamental solvent enabling random mobility of solutes and therefore biochemical reactions and intracellular signal propagation. Homeostasis of the intracellular water volume is required such that messenger concentration encodes the stimulus and not inverse volume fluctuations. On the other hand, water flow is needed for transport of solutes to and away from cells in a directed manner, threatening volume homeostasis and signal transduction fidelity of cells. Feedback regulation of fluid transport reconciles these competing objectives. The regulatory mechanisms often span across multiple spatial scales from cellular interactions up to the architecture of organs. Open questions relate to the dependency of water fluxes and steady state volumes on control parameters and stimuli. We here review selected mathematical models of feedback regulation of fluid transport at the cell scale and identify a general "core-shell" structure of such models. We propose that fluid transport models at other spatial scales can be constructed in a generalised core-shell framework, in which the core accounts for the biophysical effects of fluid transport whilst the shell reflects the regulatory mechanisms. We demonstrate the applicability of this framework for tissue lumen growth and suggest future experiments in zebrafish to test lumen size regulation mechanisms. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
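The proposed "core-shell" structure can be made concrete with a minimal two-variable model: the core is osmotic water flux across the membrane, and the shell is a feedback pump that adjusts intracellular solute to hold a set-point volume. All parameters below are illustrative assumptions, not fitted values from the review.

```python
# Minimal core-shell sketch of cell volume regulation (Euler integration).
# Core:  dV/dt = Lp_A * (n/V - pi_out)   (osmotic water flux)
# Shell: dn/dt = -k_fb * (V - V_set)     (feedback solute pump)
Lp_A = 0.1     # water permeability x area (illustrative units)
pi_out = 1.0   # external osmolarity
V_set = 1.0    # regulated target volume
k_fb = 0.5     # feedback gain of the solute pump
n = 1.2        # intracellular osmoles (starts off balance)
V = 0.8        # initial volume

dt = 0.01
for _ in range(20000):
    dV = Lp_A * (n / V - pi_out)   # core: water follows the osmotic gradient
    dn = -k_fb * (V - V_set)       # shell: pump solute to correct the volume
    V += dt * dV
    n += dt * dn

print(round(V, 3), round(n, 3))
```

Without the shell term the volume would simply equilibrate wherever the solute load dictates; the feedback restores the set point after perturbation, which is the homeostasis-versus-transport trade-off the review emphasizes.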
Automated geometric optimization for robotic HIFU treatment of liver tumors.
Williamson, Tom; Everitt, Scott; Chauhan, Sunita
2018-05-01
High intensity focused ultrasound (HIFU) represents a non-invasive method for the destruction of cancerous tissue within the body. Heating of targeted tissue by focused ultrasound transducers results in the creation of ellipsoidal lesions at the target site, the locations of which can have a significant impact on treatment outcomes. Towards this end, this work describes a method for the optimization of lesion positions within arbitrary tumors, with specific anatomical constraints. A force-based optimization framework was extended to the case of arbitrary tumor position and constrained orientation. Analysis of the approximate reachable treatment volume for the specific case of treatment of liver tumors was performed based on four transducer configurations and constraint conditions derived. Evaluation was completed utilizing simplified spherical and ellipsoidal tumor models and randomly generated tumor volumes. The total volume treated, lesion overlap and healthy tissue ablated was evaluated. Two evaluation scenarios were defined and optimized treatment plans assessed. The optimization framework resulted in improvements of up to 10% in tumor volume treated, and reductions of up to 20% in healthy tissue ablated as compared to the standard lesion rastering approach. Generation of optimized plans proved feasible for both sub- and intercostally located tumors. This work describes an optimized method for the planning of lesion positions during HIFU treatment of liver tumors. The approach allows the determination of optimal lesion locations and orientations, and can be applied to arbitrary tumor shapes and sizes. Copyright © 2018 Elsevier Ltd. All rights reserved.
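A voxelized sketch of the underlying geometry problem, covering a tumour volume with ellipsoidal lesions, helps make the coverage and healthy-tissue metrics concrete. The raster placement and all dimensions below are illustrative assumptions; the paper's force-based optimizer and anatomical constraints are not reproduced here.

```python
import numpy as np

n = 50
grid = np.indices((n, n, n)) - (n - 1) / 2.0    # mm, tumour-centred coords
tumor = np.sum(grid**2, axis=0) <= 20.0**2      # 20 mm radius spherical tumour

def lesion_mask(center, a=2.0, b=2.0, c=5.0):
    # Ellipsoidal HIFU lesion, long axis along the beam (illustrative sizes).
    d = grid - np.asarray(center, float).reshape(3, 1, 1, 1)
    return (d[0] / a)**2 + (d[1] / b)**2 + (d[2] / c)**2 <= 1.0

treated = np.zeros_like(tumor)
for cx in np.arange(-16.0, 17.0, 4.0):          # plain raster of lesion centres
    for cy in np.arange(-16.0, 17.0, 4.0):
        for cz in (-15.0, -5.0, 5.0, 15.0):
            if cx**2 + cy**2 + cz**2 <= 20.0**2:
                treated |= lesion_mask((cx, cy, cz))

coverage = float((treated & tumor).sum()) / float(tumor.sum())
print(round(coverage, 3))
```

The gap between this raster coverage and 100% is exactly the headroom an optimizer over lesion positions and orientations exploits, alongside minimizing treated voxels outside the tumour mask.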
Balfour, Margaret E; Tanner, Kathleen; Jurica, Paul J; Rhoads, Richard; Carson, Chris A
2016-01-01
Crisis and emergency psychiatric services are an integral part of the healthcare system, yet there are no standardized measures for programs providing these services. We developed the Crisis Reliability Indicators Supporting Emergency Services (CRISES) framework to create measures that inform internal performance improvement initiatives and allow comparison across programs. The framework consists of two components-the CRISES domains (timely, safe, accessible, least-restrictive, effective, consumer/family centered, and partnership) and the measures supporting each domain. The CRISES framework provides a foundation for development of standardized measures for the crisis field. This will become increasingly important as pay-for-performance initiatives expand with healthcare reform.
ERIC Educational Resources Information Center
Hoffman, Michael F.; Quittner, Alexandra L.; Cejas, Ivette
2015-01-01
This study compared levels of social competence and language development in 74 young children with hearing loss and 38 hearing peers aged 2.5-5.3 years. This study was the first to examine the relationship between oral language and social competence using a dynamic systems framework in children with and without hearing loss. We hypothesized that,…
NASA Technical Reports Server (NTRS)
Narvet, Steven W.; Frigm, Ryan C.; Hejduk, Matthew D.
2011-01-01
Conjunction Assessment operations require screening assets against the space object catalog by placing a pre-determined spatial volume around each asset and predicting when another object will violate that volume. The selection of the screening volume used for each spacecraft is a trade-off between observing all conjunction events that may pose a potential risk to the primary spacecraft and the ability to analyze those predicted events. If the screening volumes are larger, then more conjunctions can be observed and therefore the probability of a missed detection of a high-risk conjunction event is small; however, the amount of data that needs to be analyzed increases. This paper characterizes the sensitivity of screening volume size to capturing typical orbit uncertainties and the expected number of conjunction events observed. These sensitivities are quantified in the form of a trade space that allows for selection of appropriate screening volumes to fit the desired concept of operations, system limitations, and tolerable analyst workloads. This analysis will specifically highlight the screening volume determination and selection process for use in the NASA Conjunction Assessment Risk Analysis process but will also provide a general framework for other Owner/Operators faced with similar decisions.
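The trade-off above can be sketched quantitatively: as the screening box grows, the probability of containing a conjunction with Gaussian positional uncertainty rises toward one, while the expected event count (a proxy for analyst workload) grows with the box volume. The sigmas and event density below are assumed illustrative values, and axis-independent Gaussian errors are a simplifying assumption.

```python
import math

def containment_probability(half_widths, sigmas):
    """P(zero-mean Gaussian position error lies inside the screening box),
    assuming independent errors along each axis (a simplification)."""
    p = 1.0
    for h, s in zip(half_widths, sigmas):
        p *= math.erf(h / (s * math.sqrt(2.0)))
    return p

sigmas = (0.4, 2.0, 0.5)   # km; radial / in-track / cross-track (assumed)
density = 1e-3             # expected conjunctions per km^3 per day (assumed)

probs, events = [], []
for scale in (1.0, 2.0, 3.0):
    half = tuple(scale * 3.0 * s for s in sigmas)   # scaled "3-sigma" box
    volume = 8.0 * half[0] * half[1] * half[2]
    probs.append(containment_probability(half, sigmas))
    events.append(density * volume)                 # analyst-workload proxy
    print(scale, round(probs[-1], 5), round(events[-1], 3))
```

Doubling each box dimension multiplies the event count by eight while the missed-detection probability is already tiny, which is the diminishing-returns shape of the trade space the paper maps out.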
NASA Technical Reports Server (NTRS)
Long, W. C.
1988-01-01
The results of the Independent Orbiter Assessment (IOA) of the Failure Modes and Effects Analysis (FMEA) and Critical Items List (CIL) are presented. The IOA effort first completed an analysis of the Communication and Tracking hardware, generating draft failure modes and potential critical items. To preserve independence, this analysis was accomplished without reliance upon the results contained within the NASA FMEA/CIL documentation. The IOA results were then compared to the NASA FMEA/CIL baseline. A resolution of each discrepancy from the comparison is provided through additional analysis as required. This report documents the results of that comparison for the Orbiter Communication and Tracking hardware. The IOA product for the Communication and Tracking consisted of 1,108 failure mode worksheets that resulted in 298 critical items being identified. Comparison was made to the NASA baseline, which consists of 697 FMEAs and 239 CIL items. The comparison determined if there were any results which had been found by IOA but were not in the NASA baseline. This comparison produced agreement on all but 407 FMEAs which caused differences in 294 CIL items. Volume 1 contains the subsystem description, assessment results, ground rules and assumptions, and some of the IOA worksheets.
Xie, Sheng-Ming; Zhang, Mei; Fei, Zhi-Xin; Yuan, Li-Ming
2014-10-10
Chiral metal-organic frameworks (MOFs) are a new class of multifunctional material, which possess diverse structures and unusual properties such as high surface area, uniform and permanent cavities, as well as good chemical and thermal stability. Their chiral functionality makes them attractive as novel enantioselective adsorbents and stationary phases in separation science. In this paper, an experimental comparison of a chiral MOF, [In₃O(obb)₃(HCO₂)(H₂O)]·solvent, used as a stationary phase was carried out in gas chromatography (GC), high-performance liquid chromatography (HPLC) and capillary electrochromatography (CEC). The potential relationship between the structure and components of chiral MOFs and their chiral recognition ability and selectivity is presented. Copyright © 2014 Elsevier B.V. All rights reserved.
de Hoop, Bartjan; Gietema, Hester; van Ginneken, Bram; Zanen, Pieter; Groenewegen, Gerard; Prokop, Mathias
2009-04-01
We compared interexamination variability of CT lung nodule volumetry with six currently available semi-automated software packages to determine the minimum change needed to detect the growth of solid lung nodules. We had ethics committee approval. To simulate a follow-up examination with zero growth, we performed two low-dose unenhanced CT scans in 20 patients referred for pulmonary metastases. Between examinations, patients got off and on the table. Volumes of all pulmonary nodules were determined on both examinations using six nodule evaluation software packages. Variability (upper limit of the 95% confidence interval of the Bland-Altman plot) was calculated for nodules for which segmentation was visually rated as adequate. We evaluated 214 nodules (mean diameter 10.9 mm, range 3.3 mm-30.0 mm). Software packages provided adequate segmentation in 71% to 86% of nodules (p < 0.001). In case of adequate segmentation, variability in volumetry between scans ranged from 16.4% to 22.3% for the various software packages. Variability with five to six software packages was significantly less for nodules >or=8 mm in diameter (range 12.9%-17.1%) than for nodules <8 mm (range 18.5%-25.6%). Segmented volumes of each package were compared to each of the other packages. Systematic volume differences were detected in 11/15 comparisons. This hampers comparison of nodule volumes between software packages.
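The variability metric used above, the upper limit of the 95% Bland-Altman interval for paired zero-growth measurements, is easy to compute from repeated volumes. The nodule volumes below are hypothetical, purely to show the calculation; the study's reported 16.4%-22.3% range comes from its own data.

```python
import numpy as np

def variability_upper_limit(v1, v2):
    """Upper limit of the 95% Bland-Altman interval for the percent
    difference between two volume measurements of the same nodules."""
    v1, v2 = np.asarray(v1, float), np.asarray(v2, float)
    pct_diff = 200.0 * (v2 - v1) / (v1 + v2)   # percent of the mean volume
    return pct_diff.mean() + 1.96 * pct_diff.std(ddof=1)

# Hypothetical paired nodule volumes (mm^3) from two zero-growth scans.
scan1 = [120.0, 340.0, 95.0, 510.0, 220.0, 780.0]
scan2 = [131.0, 322.0, 101.0, 540.0, 210.0, 800.0]
ul = variability_upper_limit(scan1, scan2)
print(round(ul, 1))
```

Only volume changes exceeding this upper limit can be interpreted as real growth rather than measurement noise, which is the clinical point of the study.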
DOT National Transportation Integrated Search
2011-05-01
Safety has always been an important component in the planning, design, and operation of highways. In an effort : to reduce crashes occurring on highway facilities, the Safe, Accountable, Flexible, and Efficient Transportation : Equity Act - A Legacy ...
Advances in Library Administration and Organization. Volume 16.
ERIC Educational Resources Information Center
Williams, Delmus E., Ed.; Garten, Edward D., Ed.
This book presents the following papers on library organizational and management issues: (1) "Knowledge Management: An Essential Framework for Corporate Library Leadership" (Deanna J. Streng); (2) "Critical Factors for Collaboration in an Academic Library Setting" (Charlotte M. Knoche); (3) "Salaries of Academic Librarians…
Volume Holograms in Photopolymers: Comparison between Analytical and Rigorous Theories
Gallego, Sergi; Neipp, Cristian; Estepa, Luis A.; Ortuño, Manuel; Márquez, Andrés; Francés, Jorge; Pascual, Inmaculada; Beléndez, Augusto
2012-01-01
There is no doubt that the concept of volume holography has led to a great amount of scientific research and technological applications. One of these applications is the use of volume holograms as optical memories, and in particular, the use of a photosensitive medium like a photopolymeric material to record information in all its volume. In this work we analyze the applicability of Kogelnik's Coupled Wave theory to the study of volume holograms recorded in photopolymers. Some of the theoretical models in the literature describing the mechanism of hologram formation in photopolymer materials use Kogelnik's theory to analyze the gratings recorded in photopolymeric materials. If Kogelnik's theory cannot be applied, it is necessary to use a more general Coupled Wave theory (CW) or the Rigorous Coupled Wave theory (RCW). The RCW does not incorporate any approximation and thus, since it is rigorous, permits judging the accuracy of the approximations included in Kogelnik's and CW theories. In this article, a comparison between the predictions of the three theories for phase transmission diffraction gratings is carried out. We have demonstrated the agreement in the prediction of CW and RCW and the validity of Kogelnik's theory only for gratings with spatial frequencies higher than 500 lines/mm for the usual values of the refractive index modulations obtained in photopolymers.
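For the simplest case, a lossless unslanted volume phase transmission grating read out at Bragg incidence, Kogelnik's theory gives a closed-form diffraction efficiency, eta = sin^2(pi * dn * d / (lambda * cos(theta))). The parameter values below are typical photopolymer magnitudes chosen for illustration, not data from this article.

```python
import math

def kogelnik_efficiency(delta_n, thickness_um, wavelength_um, theta_rad=0.0):
    """Diffraction efficiency of a lossless volume phase transmission
    grating at Bragg incidence (Kogelnik's coupled-wave result)."""
    nu = math.pi * delta_n * thickness_um / (wavelength_um * math.cos(theta_rad))
    return math.sin(nu) ** 2

# Illustrative photopolymer values: dn = 3e-3, d = 50 um, 633 nm readout.
eta = kogelnik_efficiency(3e-3, 50.0, 0.633)
print(round(eta, 3))
```

The efficiency oscillates with the index-modulation-thickness product, reaching 100% when the argument equals pi/2; the article's point is that this two-wave formula is only trustworthy above roughly 500 lines/mm, where higher diffraction orders are negligible.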
Comparison of Dynamic Contrast Enhanced MRI and Quantitative SPECT in a Rat Glioma Model
Skinner, Jack T.; Yankeelov, Thomas E.; Peterson, Todd E.; Does, Mark D.
2012-01-01
Pharmacokinetic modeling of dynamic contrast enhanced (DCE)-MRI data provides measures of the extracellular volume fraction (ve) and the volume transfer constant (Ktrans) in a given tissue. These parameter estimates may be biased, however, by confounding issues such as contrast agent and tissue water dynamics, or assumptions of vascularization and perfusion made by the commonly used model. In contrast to MRI, radiotracer imaging with SPECT is insensitive to water dynamics. A quantitative dual-isotope SPECT technique was developed to obtain an estimate of ve in a rat glioma model for comparison to the corresponding estimates obtained using DCE-MRI with a vascular input function (VIF) and reference region model (RR). Both DCE-MRI methods produced consistently larger estimates of ve in comparison to the SPECT estimates, and several experimental sources were postulated to contribute to these differences. PMID:22991315
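The pharmacokinetic model referred to above is commonly the standard Tofts model, in which the tissue concentration is a convolution of the arterial input with an exponential kernel, Ct(t) = Ktrans * integral of Cp(tau) * exp(-Ktrans*(t-tau)/ve) dtau. The sketch below evaluates this by discrete convolution; the input-function shape and parameter magnitudes are literature-typical assumptions, not values from this study.

```python
import numpy as np

def tofts_ct(t, cp, ktrans, ve):
    """Standard Tofts tissue concentration curve by discrete convolution:
    Ct(t) = Ktrans * conv(Cp, exp(-Ktrans*t/ve)) * dt."""
    dt = t[1] - t[0]
    kernel = np.exp(-ktrans * t / ve)
    return ktrans * dt * np.convolve(cp, kernel)[: len(t)]

t = np.linspace(0.0, 5.0, 500)              # minutes
cp = 5.0 * t * np.exp(-2.0 * t)             # assumed arterial input shape, mM
ct = tofts_ct(t, cp, ktrans=0.25, ve=0.3)   # /min and unitless (assumed)
print(round(float(ct.max()), 3))
```

Fitting measured Ct(t) curves with this forward model yields the Ktrans and ve estimates whose possible biases, relative to the water-dynamics-insensitive SPECT measurement, are the subject of the comparison.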
Calibration of mass and conventional mass of weights between INACAL-CENAM-CESMEC (SIM.M.M-S13)
NASA Astrophysics Data System (ADS)
Becerra, Luis Omar; Cori Almonte, Luz Marina; García, Fernando; García, Francisco; Hernández, Raúl; Peña, Luis Manuel; Quiroga Rojas, Aldo; Taipe Araujo, Donny
2017-01-01
This supplementary comparison was piloted by the Instituto Nacional de Calidad (INACAL, Peru) and describes the results of the comparison of mass and conventional mass of weights of 2 kg and 10 kg between INACAL-PERU, CENAM-MEXICO and CESMEC-CHILE. The travelling standards were OIML class E2 weights of stainless steel. INACAL measured the volume of the travelling standards of 10 kg, and the volume value used for the 2 kg mass standard was provided by the PTB. To reach the main text of this paper, click on Final Report. Note that this text is that which appears in Appendix B of the BIPM key comparison database kcdb.bipm.org/. The final report has been peer-reviewed and approved for publication by the CCM, according to the provisions of the CIPM Mutual Recognition Arrangement (CIPM MRA).
DOE Office of Scientific and Technical Information (OSTI.GOV)
Rusu, Mirabela, E-mail: mirabela.rusu@gmail.com; Wang, Haibo; Madabhushi, Anant
Purpose: Pulmonary inflammation is associated with a variety of diseases. Assessing pulmonary inflammation on in vivo imaging may facilitate the early detection and treatment of lung diseases. Although routinely used in thoracic imaging, computed tomography has thus far not been compellingly shown to characterize inflammation in vivo. Alternatively, magnetic resonance imaging (MRI) is a nonionizing radiation technique to better visualize and characterize pulmonary tissue. Prior to routine adoption of MRI for early characterization of inflammation in humans, a rigorous and quantitative characterization of the utility of MRI to identify inflammation is required. Such characterization may be achieved by considering ex vivo histology as the ground truth, since it enables the definitive spatial assessment of inflammation. In this study, the authors introduce a novel framework to integrate 2D histology, ex vivo and in vivo imaging to enable the mapping of the extent of disease from ex vivo histology onto in vivo imaging, with the goal of facilitating computerized feature analysis and interrogation of disease appearance on in vivo imaging. The authors’ framework was evaluated in a preclinical preliminary study aimed to identify computer extracted features on in vivo MRI associated with chronic pulmonary inflammation. Methods: The authors’ image analytics framework first involves reconstructing the histologic volume in 3D from individual histology slices. Second, the authors map the disease ground truth onto in vivo MRI via coregistration with 3D histology using the ex vivo lung MRI as a conduit. Finally, computerized feature analysis of the disease extent is performed to identify candidate in vivo imaging signatures of disease presence and extent.
Results: The authors evaluated the framework by assessing the quality of the 3D histology reconstruction and the histology-MRI fusion, in the context of an initial use case involving characterization of chronic inflammation in a mouse model. The authors’ evaluation considered three mice, two with an inflammation phenotype and one control. The authors’ iterative 3D histology reconstruction yielded a 70.1% ± 2.7% overlap with the ex vivo MRI volume. Across a total of 17 anatomic landmarks manually delineated at the division of airways, the target registration error between the ex vivo MRI and 3D histology reconstruction was 0.85 ± 0.44 mm, suggesting that a good alignment of the ex vivo 3D histology and ex vivo MRI had been achieved. The 3D histology-in vivo MRI coregistered volumes resulted in an overlap of 73.7% ± 0.9%. Preliminary computerized feature analysis was performed on an additional four control mice, for a total of seven mice considered in this study. Gabor texture filters appeared to best capture differences between the inflamed and noninflamed regions on MRI. Conclusions: The authors’ 3D histology reconstruction and multimodal registration framework were successfully employed to reconstruct the histology volume of the lung and fuse it with in vivo MRI to create a ground truth map for inflammation on in vivo MRI. The analytic platform presented here lays the framework for a rigorous validation of the identified imaging features for chronic lung inflammation on MRI in a large prospective cohort.
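The evaluation metrics reported above (landmark target registration error, volume overlap) can be sketched as follows. The landmark coordinates and the particular overlap formula are illustrative assumptions, since the abstract does not give its exact formulas:

```python
import numpy as np

def target_registration_error(fixed_pts, moving_pts):
    """Mean and SD of the Euclidean distance between corresponding
    anatomical landmarks after registration (N x 3 arrays, in mm)."""
    d = np.linalg.norm(np.asarray(fixed_pts) - np.asarray(moving_pts), axis=1)
    return d.mean(), d.std()

def overlap_fraction(mask_a, mask_b):
    """Fraction of mask_a voxels also in mask_b; one simple overlap
    definition (the study does not state which formula it used)."""
    return np.logical_and(mask_a, mask_b).sum() / mask_a.sum()

# Toy landmarks at airway divisions, with a 0.5 mm residual misalignment.
fixed = np.array([[10.0, 20.0, 5.0], [12.0, 18.0, 7.0], [15.0, 22.0, 6.0]])
moving = fixed + np.array([0.5, 0.0, 0.0])
tre_mean, tre_sd = target_registration_error(fixed, moving)

# Toy overlap of two half-plane masks (4 of 8 voxels shared).
a = np.zeros((4, 4), bool); a[:, :2] = True
b = np.zeros((4, 4), bool); b[:2, :] = True
ov = overlap_fraction(a, b)
```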
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, J; Gensheimer, M; Dong, X
Purpose: To develop an intra-tumor partitioning framework for identifying high-risk subregions from 18F-fluorodeoxyglucose positron emission tomography (FDG-PET) and CT imaging, and to test whether tumor burden associated with the high-risk subregions is prognostic of outcomes in lung cancer. Methods: In this institutional review board-approved retrospective study, we analyzed the pre-treatment FDG-PET and CT scans of 44 lung cancer patients treated with radiotherapy. A novel, intra-tumor partitioning method was developed based on a two-stage clustering process: first at patient-level, each tumor was over-segmented into many superpixels by k-means clustering of integrated PET and CT images; next, tumor subregions were identified by merging previously defined superpixels via population-level hierarchical clustering. The volume associated with each of the subregions was evaluated using Kaplan-Meier analysis regarding its prognostic capability in predicting overall survival (OS) and out-of-field progression (OFP). Results: Three spatially distinct subregions were identified within each tumor, which were highly robust to uncertainty in PET/CT co-registration. Among these, the volume of the most metabolically active and metabolically heterogeneous solid component of the tumor was predictive of OS and OFP on the entire cohort, with a concordance index or CI = 0.66–0.67. When restricting the analysis to patients with stage III disease (n = 32), the same subregion achieved an even higher CI = 0.75 (HR = 3.93, logrank p = 0.002) for predicting OS, and a CI = 0.76 (HR = 4.84, logrank p = 0.002) for predicting OFP. In comparison, conventional imaging markers including tumor volume, SUVmax and MTV50 were not predictive of OS or OFP, with CI mostly below 0.60 (p < 0.001). Conclusion: We propose a robust intra-tumor partitioning method to identify clinically relevant, high-risk subregions in lung cancer.
We envision that this approach will be applicable to identifying useful imaging biomarkers in many cancer types.
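The two-stage partitioning can be sketched with generic clustering routines: k-means over-segmentation at the patient level, then hierarchical merging at the population level. The toy features and cluster counts below are assumptions, not the study's settings:

```python
import numpy as np
from scipy.cluster.vq import kmeans2
from scipy.cluster.hierarchy import linkage, fcluster

rng = np.random.default_rng(0)

# Stage 1 (patient level): over-segment a tumor into superpixels by
# k-means on joint PET/CT intensity features (toy 2-feature data standing
# in for SUV and HU values; the paper clusters integrated PET/CT images).
features = rng.normal(size=(500, 2))
centroids, labels = kmeans2(features, 20, minit="++", seed=0)

# Stage 2 (population level): merge superpixels into a small number of
# subregions by hierarchical (Ward) clustering of superpixel centroids.
Z = linkage(centroids, method="ward")
subregion = fcluster(Z, t=3, criterion="maxclust")
```

In the actual study, stage 2 pools superpixels across all patients, so that the resulting subregion labels are comparable between tumors.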
NASA Astrophysics Data System (ADS)
Yuan, Chao; Chareyre, Bruno; Darve, Félix
2016-09-01
A pore-scale model is introduced for two-phase flow in dense packings of polydisperse spheres. The model is developed as a component of a more general hydromechanical coupling framework based on the discrete element method, which will be elaborated in future papers and will apply to various processes of interest in soil science, geomechanics, and oil and gas production. Here the emphasis is on the generation of a network of pores mapping the void space between spherical grains, and on the definition of local criteria governing the primary drainage process. The pore space is decomposed by Regular Triangulation, from which a set of pores connected by throats is identified. A local entry capillary pressure is evaluated for each throat, based on the balance of capillary pressure and surface tension at equilibrium. The model reflects the possible entrapment of disconnected patches of the receding wetting phase. It is validated by comparison with drainage experiments. In the last part of the paper, a series of simulations is reported to illustrate size and boundary effects, key questions when studying small samples of spherical particles, whether in simulations or experiments. Repeated tests on samples of different sizes yield water-content evolutions that are not only scattered but also strongly biased for small sample sizes. More than 20,000 spheres are needed to reduce the bias on saturation below 0.02. Additional statistics are generated by subsampling a large sample of 64,000 spheres. They suggest that the minimal sampling volume for evaluating saturation is one hundred times greater than the sampling volume needed for measuring porosity with the same accuracy. This requirement in terms of sample size induces a need for efficient computer codes; the method described herein has low algorithmic complexity in order to satisfy it.
It will be well suited to further developments toward coupled flow-deformation problems in which the evolution of the microstructure requires frequent updates of the pore network.
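For intuition, the local entry capillary pressure of a throat can be approximated with the Young-Laplace relation for a cylindrical constriction. The paper derives a more general criterion from force balance on the meniscus, so this is only the textbook special case, with assumed fluid properties:

```python
import math

def entry_capillary_pressure(gamma, theta, r_throat):
    """Young-Laplace estimate of the entry capillary pressure (Pa) of a
    cylindrical throat of radius r_throat (m), for surface tension gamma
    (N/m) and contact angle theta (rad). Illustrative special case only."""
    return 2.0 * gamma * math.cos(theta) / r_throat

# Air-water at 20 C (gamma ~ 0.0728 N/m), perfectly wetting grains,
# a 10-micron throat (assumed values):
pc = entry_capillary_pressure(gamma=0.0728, theta=0.0, r_throat=10e-6)
```

During a simulated drainage step, the non-wetting phase invades a throat once the imposed capillary pressure exceeds this local entry value.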
1984-06-29
effort that requires hard copy documentation. As a result, there are generally numerous delays in providing current quality information. In the FoF...process have had fixed controls or were based on "hard-coded" information. A template, for example, is hard-coded information defining the shape of a...represents soft-coded control information. (Although manual handling of punch tapes still possesses some of the limitations of "hard-coded" controls
Yaghi, Omar M.; Eddaoudi, Mohamed; Li, Hailian; Kim, Jaheon; Rosi, Nathaniel
2005-08-16
An isoreticular metal-organic framework (IRMOF) and method for systematically forming the same. The method comprises the steps of dissolving at least one source of metal cations and at least one organic linking compound in a solvent to form a solution; and crystallizing the solution under predetermined conditions to form a predetermined IRMOF. At least one of functionality, dimension, pore size and free volume of the IRMOF is substantially determined by the organic linking compound.
A framework for longitudinal data analysis via shape regression
NASA Astrophysics Data System (ADS)
Fishbaugh, James; Durrleman, Stanley; Piven, Joseph; Gerig, Guido
2012-02-01
Traditional longitudinal analysis begins by extracting desired clinical measurements, such as volume or head circumference, from discrete imaging data. Typically, the continuous evolution of a scalar measurement is estimated by choosing a 1D regression model, such as kernel regression or fitting a polynomial of fixed degree. This type of analysis not only leads to separate models for each measurement, but there is no clear anatomical or biological interpretation to aid in the selection of the appropriate paradigm. In this paper, we propose a consistent framework for the analysis of longitudinal data by estimating the continuous evolution of shape over time as twice differentiable flows of deformations. In contrast to 1D regression models, one model is chosen to realistically capture the growth of anatomical structures. From the continuous evolution of shape, we can simply extract any clinical measurements of interest. We demonstrate on real anatomical surfaces that volume extracted from a continuous shape evolution is consistent with a 1D regression performed on the discrete measurements. We further show how the visualization of shape progression can aid in the search for significant measurements. Finally, we present an example on a shape complex of the brain (left hemisphere, right hemisphere, cerebellum) that demonstrates a potential clinical application for our framework.
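One of the 1D baselines mentioned above, Nadaraya-Watson kernel regression of a scalar measurement against age, can be sketched as follows. The scan ages, volumes, and bandwidth are toy assumptions:

```python
import numpy as np

def kernel_regression(t_obs, y_obs, t_eval, bandwidth):
    """Nadaraya-Watson kernel regression with a Gaussian kernel, one of
    the 1D models the paper contrasts with continuous shape regression."""
    w = np.exp(-0.5 * ((t_eval[:, None] - t_obs[None, :]) / bandwidth) ** 2)
    return (w * y_obs).sum(axis=1) / w.sum(axis=1)

ages = np.array([6.0, 12.0, 24.0])          # months (toy scan ages)
volumes = np.array([410.0, 480.0, 530.0])   # cm^3 (toy volumes)
curve = kernel_regression(ages, volumes, np.linspace(6, 24, 50),
                          bandwidth=6.0)
```

The shape-regression framework instead fits one deformation flow to the anatomy and extracts volume (or any other measurement) from the estimated continuous shape.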
Nanothermodynamics in the strong coupling regime
NASA Astrophysics Data System (ADS)
Jarzynski, Christopher
In macroscopic thermodynamics, energy gained by a system is lost by its surroundings (or vice-versa), in accordance with the first law of thermodynamics. However, if the system-environment interaction energy cannot be neglected - as in the case of a microscopic system such as a single molecule in solution - then it is not immediately clear where to draw the line between the energy of the system and that of the environment. To which subsystem does the interaction energy belong? I will describe a microscopic formulation of both the first and second laws of thermodynamics that applies to this situation. In this framework, seven key thermodynamic quantities - internal energy, entropy, volume, enthalpy, Gibbs free energy, heat and work - are given precise microscopic definitions, and the first and second laws are preserved without requiring corrections due to finite system-environment coupling. Furthermore, these definitions reduce to the usual ones in the limit of macroscopic systems of interest. This condition establishes that a unifying framework can be developed, encompassing stochastic thermodynamics at one end and macroscopic thermodynamics at the other. A central element of this framework is a thermodynamic definition of the volume of the system of interest, which converges to the usual geometric definition when the system is large. This research was supported by the U.S. National Science Foundation through Grant No. DMR-1506969.
Demitri, Nevine; Zoubir, Abdelhak M
2017-01-01
Glucometers present an important self-monitoring tool for diabetes patients and, therefore, must exhibit high accuracy as well as good usability features. For devices based on an invasive photometric measurement principle that drastically reduces the volume of the blood sample needed from the patient, we present a framework that is capable of dealing with small blood samples while maintaining the required accuracy. The framework consists of two major parts: 1) image segmentation; and 2) convergence detection. Step 1 is based on iterative mode-seeking methods to estimate the intensity value of the region of interest. We present several variations of these methods and give theoretical proofs of their convergence. Our approach is able to deal with changes in the number and position of clusters without any prior knowledge. Furthermore, we propose a method based on sparse approximation to decrease the computational load while maintaining accuracy. Step 2 is achieved by employing temporal tracking and prediction, thereby decreasing the measurement time and thus improving usability. Our framework is tested on several real datasets with different characteristics. We show that we are able to estimate the underlying glucose concentration from much smaller blood samples than is currently state of the art, with sufficient accuracy according to the most recent ISO standards, and to reduce measurement time significantly compared to state-of-the-art methods.
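Iterative mode-seeking of the kind described in step 1 can be illustrated with a generic 1D mean-shift update. The paper's actual estimators differ, so this is only a sketch under assumed intensity distributions:

```python
import numpy as np

def mean_shift_mode(samples, start, bandwidth, tol=1e-6, max_iter=200):
    """1D mean-shift: repeatedly move an estimate to the Gaussian-weighted
    mean of nearby samples until it converges to a local density mode
    (a generic stand-in for the paper's mode-seeking segmentation)."""
    x = float(start)
    for _ in range(max_iter):
        w = np.exp(-0.5 * ((samples - x) / bandwidth) ** 2)
        x_new = float((w * samples).sum() / w.sum())
        if abs(x_new - x) < tol:   # step 2 in spirit: detect convergence
            break
        x = x_new
    return x

# Toy pixel intensities: background near 40, reacted region near 120.
rng = np.random.default_rng(1)
pixels = np.concatenate([rng.normal(40, 5, 300), rng.normal(120, 5, 100)])
mode = mean_shift_mode(pixels, start=110.0, bandwidth=10.0)
```

The paper's convergence-detection stage additionally tracks such estimates over time to predict the final reading early and shorten the measurement.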
Gkoumas, Spyridon; Villanueva-Perez, Pablo; Wang, Zhentian; Romano, Lucia; Abis, Matteo; Stampanoni, Marco
2016-01-01
In X-ray grating interferometry, dark-field contrast arises due to partial extinction of the detected interference fringes. This is also called visibility reduction and is attributed to small-angle scattering from unresolved structures in the imaged object. In recent years, analytical quantitative frameworks of dark-field contrast have been developed for highly diluted monodisperse microsphere suspensions with maximum 6% volume fraction. These frameworks assume that scattering particles are separated by large enough distances, which make any interparticle scattering interference negligible. In this paper, we start from the small-angle scattering intensity equation and, by linking Fourier and real-space, we introduce the structure factor and thus extend the analytical and experimental quantitative interpretation of dark-field contrast, for a range of suspensions with volume fractions reaching 40%. The structure factor accounts for interparticle scattering interference. Without introducing any additional fitting parameters, we successfully predict the experimental values measured at the TOMCAT beamline, Swiss Light Source. Finally, we apply this theoretical framework to an experiment probing a range of system correlation lengths by acquiring dark-field images at different energies. This proposed method has the potential to be applied in single-shot-mode using a polychromatic X-ray tube setup and a single-photon-counting energy-resolving detector. PMID:27734931
Diffusion tensor analysis with invariant gradients and rotation tangents.
Kindlmann, Gordon; Ennis, Daniel B; Whitaker, Ross T; Westin, Carl-Fredrik
2007-11-01
Guided by empirically established connections between clinically important tissue properties and diffusion tensor parameters, we introduce a framework for decomposing variations in diffusion tensors into changes in shape and orientation. Tensor shape and orientation both have three degrees-of-freedom, spanned by invariant gradients and rotation tangents, respectively. As an initial demonstration of the framework, we create a tunable measure of tensor difference that can selectively respond to shape and orientation. Second, to analyze the spatial gradient in a tensor volume (a third-order tensor), our framework generates edge strength measures that can discriminate between different neuroanatomical boundaries, as well as creating a novel detector of white matter tracts that are adjacent yet distinctly oriented. Finally, we apply the framework to decompose the fourth-order diffusion covariance tensor into individual and aggregate measures of shape and orientation covariance, including a direct approximation for the variance of tensor invariants such as fractional anisotropy.
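A familiar instance of a rotation-invariant shape measure in this setting is fractional anisotropy, computed from the tensor eigenvalues alone and therefore insensitive to orientation; the eigenvalue sets below are illustrative:

```python
import numpy as np

def fractional_anisotropy(eigvals):
    """Fractional anisotropy, a rotation-invariant tensor shape measure:
    normalized deviation of the eigenvalues from their mean."""
    l = np.asarray(eigvals, dtype=float)
    md = l.mean()                               # mean diffusivity
    num = np.sqrt(((l - md) ** 2).sum())
    den = np.sqrt((l ** 2).sum())
    return np.sqrt(1.5) * num / den

fa_iso = fractional_anisotropy([1.0, 1.0, 1.0])   # isotropic tensor
fa_lin = fractional_anisotropy([1.7, 0.2, 0.2])   # prolate, tract-like
```

Gradients of such invariants with respect to the tensor are exactly the "invariant gradients" that, together with rotation tangents, span the shape and orientation subspaces in the framework.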
Modeling complex biological flows in multi-scale systems using the APDEC framework
NASA Astrophysics Data System (ADS)
Trebotich, David
2006-09-01
We have developed advanced numerical algorithms to model biological fluids in multiscale flow environments using the software framework developed under the SciDAC APDEC ISIC. The foundation of our computational effort is an approach for modeling DNA laden fluids as ''bead-rod'' polymers whose dynamics are fully coupled to an incompressible viscous solvent. The method is capable of modeling short range forces and interactions between particles using soft potentials and rigid constraints. Our methods are based on higher-order finite difference methods in complex geometry with adaptivity, leveraging algorithms and solvers in the APDEC Framework. Our Cartesian grid embedded boundary approach to incompressible viscous flow in irregular geometries has also been interfaced to a fast and accurate level-sets method within the APDEC Framework for extracting surfaces from volume renderings of medical image data and used to simulate cardio-vascular and pulmonary flows in critical anatomies.
The Isprs Benchmark on Indoor Modelling
NASA Astrophysics Data System (ADS)
Khoshelham, K.; Díaz Vilariño, L.; Peter, M.; Kang, Z.; Acharya, D.
2017-09-01
Automated generation of 3D indoor models from point cloud data has been a topic of intensive research in recent years. While results on various datasets have been reported in literature, a comparison of the performance of different methods has not been possible due to the lack of benchmark datasets and a common evaluation framework. The ISPRS benchmark on indoor modelling aims to address this issue by providing a public benchmark dataset and an evaluation framework for performance comparison of indoor modelling methods. In this paper, we present the benchmark dataset comprising several point clouds of indoor environments captured by different sensors. We also discuss the evaluation and comparison of indoor modelling methods based on manually created reference models and appropriate quality evaluation criteria. The benchmark dataset is available for download at: http://www2.isprs.org/commissions/comm4/wg5/benchmark-on-indoor-modelling.html.
Informing the NCA: EPA's Climate Change Impact and Risk Analysis Framework
NASA Astrophysics Data System (ADS)
Sarofim, M. C.; Martinich, J.; Kolian, M.; Crimmins, A. R.
2017-12-01
The Climate Change Impact and Risk Analysis (CIRA) framework is designed to quantify the physical impacts and economic damages in the United States under future climate change scenarios. To date, the framework has been applied to 25 sectors, using scenarios and projections developed for the Fourth National Climate Assessment. The strength of this framework has been in the use of consistent climatic, socioeconomic, and technological assumptions and inputs across the impact sectors to maximize the ease of cross-sector comparison. The results of the underlying CIRA sectoral analyses are informing the sustained assessment process by helping to address key gaps related to economic valuation and risk. Advancing capacity and scientific literature in this area has created opportunity to consider future applications and strengthening of the framework. This presentation will describe the CIRA framework, present results for various sectors such as heat mortality, air & water quality, winter recreation, and sea level rise, and introduce potential enhancements that can improve the utility of the framework for decision analysis.
Sugar pine utilization: a 30-year transition.
Susan Willits; Thomas D. Fahey
1991-01-01
Utilization standards and measurement systems have changed since the first lumber recovery study was conducted on sugar pine in 1957. These changes prompted a new study to provide new information on lumber volume and value recovery and a comparison to older studies. Lumber volume and value recovery are presented for the recent study on board- and cubic-foot bases....
Comparison of optical dendrometers for prediction of standing tree volume
Robert C. Parker; Thomas G. Matney
2000-01-01
Enhanced sets of compatible stem profile equations were used with data collected from felled and standing pine trees to calculate tree volumes to various top merchantability limits. Standing trees were measured with the Criterion 400 Laser, Tele-Relaskop, and Wheeler Pentaprism. These measurements were used to compare accuracies of the optical dendrometers for the...
Planum Temporale Volume in Children and Adolescents with Autism
ERIC Educational Resources Information Center
Rojas, Donald C.; Camou, Suzanne L.; Reite, Martin L.; Rogers, Sally J.
2005-01-01
Previous research has revealed a lack of planum temporale (PT) asymmetry in adults with autism. This finding is now extended to children and adolescents with the disorder. MRI scans were obtained from 12 children with autism and 12 gender, handedness and age-matched comparison participants. The volume of gray matter in the PT and Heschl's gyrus…
Graphical Neuroimaging Informatics: Application to Alzheimer’s Disease
Bowman, Ian; Joshi, Shantanu H.; Greer, Vaughan
2013-01-01
The Informatics Visualization for Neuroimaging (INVIZIAN) framework allows one to graphically display image and meta-data information from sizeable collections of neuroimaging data as a whole using a dynamic and compelling user interface. Users can fluidly interact with an entire collection of cortical surfaces using only their mouse. In addition, users can cluster and group brains in multiple ways for subsequent comparison using graphical data-mining tools. In this article, we illustrate the utility of INVIZIAN for simultaneous exploration and mining of a large collection of extracted cortical surface data arising in clinical neuroimaging studies of patients with Alzheimer’s Disease, mild cognitive impairment, as well as healthy control subjects. Alzheimer’s Disease is particularly interesting due to its widespread effects on cortical architecture and alterations of volume in specific brain areas associated with memory. We demonstrate INVIZIAN’s ability to render multiple brain surfaces from multiple diagnostic groups of subjects, showcase the interactivity of the system, and show how INVIZIAN can be employed to generate hypotheses about the collection of data suitable for direct access to the underlying raw data and subsequent formal statistical analysis. Specifically, we use INVIZIAN to show how cortical thickness and hippocampal volume differences between groups are evident even in the absence of more formal hypothesis testing. In the context of neurological diseases linked to brain aging such as AD, INVIZIAN provides a unique means of considering whole-brain datasets in their entirety, looking for interesting relationships among them, and thereby deriving new ideas for further research and study. PMID:24203652
Computed tomography-based volumetric tool for standardized measurement of the maxillary sinus
Giacomini, Guilherme; Pavan, Ana Luiza Menegatti; Altemani, João Mauricio Carrasco; Duarte, Sergio Barbosa; Fortaleza, Carlos Magno Castelo Branco; Miranda, José Ricardo de Arruda
2018-01-01
Volume measurements of maxillary sinus may be useful to identify diseases affecting paranasal sinuses. However, literature shows a lack of consensus in studies measuring the volume. This may be attributable to different computed tomography data acquisition techniques, segmentation methods, focuses of investigation, among other reasons. Furthermore, methods for volumetrically quantifying the maxillary sinus are commonly manual or semiautomated, which require substantial user expertise and are time-consuming. The purpose of the present study was to develop an automated tool for quantifying the total and air-free volume of the maxillary sinus based on computed tomography images. The quantification tool seeks to standardize maxillary sinus volume measurements, thus allowing better comparisons and determinations of factors that influence maxillary sinus size. The automated tool utilized image processing techniques (watershed, threshold, and morphological operators). The maxillary sinus volume was quantified in 30 patients. To evaluate the accuracy of the automated tool, the results were compared with manual segmentation that was performed by an experienced radiologist using a standard procedure. The mean percent differences between the automated and manual methods were 7.19% ± 5.83% and 6.93% ± 4.29% for total and air-free maxillary sinus volume, respectively. Linear regression and Bland-Altman statistics showed good agreement and low dispersion between both methods. The present automated tool for maxillary sinus volume assessment was rapid, reliable, robust, accurate, and reproducible and may be applied in clinical practice. The tool may be used to standardize measurements of maxillary volume. Such standardization is extremely important for allowing comparisons between studies, providing a better understanding of the role of the maxillary sinus, and determining the factors that influence maxillary sinus size under normal and pathological conditions. PMID:29304130
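A stripped-down version of such a pipeline (threshold, morphological cleanup, connected-component labeling, voxel counting) can be sketched with scipy.ndimage. The watershed step of the published tool is omitted here, and the HU threshold and voxel size are assumptions:

```python
import numpy as np
from scipy import ndimage

def air_cavity_volumes(ct, air_threshold=-500.0, voxel_volume_mm3=0.5**3):
    """Toy pipeline: threshold air voxels (HU), clean the mask with a
    morphological opening, label connected components, and convert voxel
    counts to volumes in mm^3. (The published tool adds a watershed step
    to separate the sinus from neighboring air spaces, omitted here.)"""
    air = ct < air_threshold
    air = ndimage.binary_opening(air, iterations=1)
    labels, n = ndimage.label(air)
    counts = np.bincount(labels.ravel())[1:]   # skip background label 0
    return counts * voxel_volume_mm3

# Synthetic volume: soft tissue at 40 HU with one 10x10x10 air pocket.
ct = np.full((32, 32, 32), 40.0)
ct[10:20, 10:20, 10:20] = -1000.0
vols = air_cavity_volumes(ct)
```

The opening slightly erodes sharp corners of the toy cavity, which is why the recovered volume is a little below the nominal 125 mm^3.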
Chieng, Norman; Cicerone, Marcus T.; Zhong, Qin; Liu, Ming; Pikal, Michael J.
2013-01-01
Amorphous HES/disaccharide (trehalose or sucrose) formulations, with and without added polyols (glycerol and sorbitol), and disaccharide formulations of human growth hormone (hGH) were prepared by freeze drying and characterized, with particular interest in methodology for using high precision density measurements to evaluate free volume changes and a focus on comparisons between “free volume” changes obtained from analysis of density data, fast dynamics (local mobility), and PALS characterization of “free volume” hole size. Density measurements were performed using a helium gas pycnometer, and fast dynamics was characterized using an incoherent neutron scattering spectrometer. Addition of sucrose and trehalose to hGH decreases free volume in the system, with sucrose marginally more effective than trehalose, consistent with the superior pharmaceutical stability of sucrose hGH formulations well below Tg relative to trehalose. We find that density data may be analyzed in terms of free volume changes by evaluation of volume changes on mixing and calculation of apparent specific volumes from the densities. Addition of sucrose to HES decreases free volume, but the effect of trehalose is not detectable above experimental error. Addition of sorbitol or glycerol to HES/trehalose base formulations appears to significantly decrease free volume, consistent with the positive impact of such additions on pharmaceutical stability (i.e., degradation) in the glassy state. Free volume changes evaluated from density data, fast dynamics amplitude of local motion, and PALS hole size data are generally in qualitative agreement for the HES/disaccharide systems studied. All predict decreasing molecular mobility as disaccharides are added to HES. Global mobility, as measured by enthalpy relaxation times, increases as disaccharides, particularly sucrose, are added to HES. PMID:23623797
A generalized threshold model for computing bed load grain size distribution
NASA Astrophysics Data System (ADS)
Recking, Alain
2016-12-01
For morphodynamic studies, it is important to compute not only the transported volumes of bed load, but also the size of the transported material. A few bed load equations compute fractional transport (i.e., both the volume and grain size distribution), but many equations compute only the bulk transport (a volume) with no consideration of the transported grain sizes. To fill this gap, a method is proposed to compute the bed load grain size distribution separately from the bed load flux. The method is called the Generalized Threshold Model (GTM) because it extends the flow competence method, for the threshold of motion of the largest transported grain size, to the full bed surface grain size distribution. This was achieved by replacing dimensional diameters with their size indices in the standard hiding function, which offers a useful framework for computation, carried out for each index in the range [1, 100]. New functions are also proposed to account for partial transport. The method is very simple to implement and is sufficiently flexible to be tested in many environments. In addition to being a good complement to standard bulk bed load equations, it could also serve as a framework to assist in analyzing the physics of bed load transport in future research.
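The idea of writing the hiding function on size indices rather than diameters can be sketched as follows. The functional form, exponent, and reference stress are illustrative assumptions, not Recking's calibrated relations:

```python
import numpy as np

def mobile_grains(tau, tau_ref, indices, gamma=0.9):
    """Toy threshold model in the spirit of the GTM: a hiding function
    written on grain-size indices i in [1, 100] rather than on diameters,
    giving an index-dependent critical stress. Grains whose critical
    stress falls below the applied stress tau are flagged as mobile.
    The power-law form and exponent are assumed, not the paper's."""
    i = np.asarray(indices, dtype=float)
    tau_ci = tau_ref * (i / 50.0) ** gamma   # hiding function on indices
    return tau_ci <= tau

indices = np.arange(1, 101)                  # percentile-like size indices
mobile = mobile_grains(tau=30.0, tau_ref=25.0, indices=indices)
```

The set of mobile indices, combined with the bed surface grain size distribution and partial-transport corrections, yields the transported grain size distribution independently of the bulk flux equation.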
Direct Visuo-Haptic 4D Volume Rendering Using Respiratory Motion Models.
Fortmeier, Dirk; Wilms, Matthias; Mastmeyer, Andre; Handels, Heinz
2015-01-01
This article presents methods for direct visuo-haptic 4D volume rendering of virtual patient models under respiratory motion. Breathing models are computed based on patient-specific 4D CT image data sequences. Virtual patient models are visualized in real-time by ray casting based rendering of a reference CT image warped by a time-variant displacement field, which is computed using the motion models at run-time. Furthermore, haptic interaction with the animated virtual patient models is provided by using the displacements computed at high rendering rates to translate the position of the haptic device into the space of the reference CT image. This concept is applied to virtual palpation and the haptic simulation of insertion of a virtual bendable needle. To this aim, different motion models that are applicable in real-time are presented and the methods are integrated into a needle puncture training simulation framework, which can be used for simulated biopsy or vessel puncture in the liver. To confirm real-time applicability, a performance analysis of the resulting framework is given. It is shown that the presented methods achieve mean update rates around 2,000 Hz for haptic simulation and interactive frame rates for volume rendering and thus are well suited for visuo-haptic rendering of virtual patients under respiratory motion.
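The core rendering step described above, warping a reference CT image by a displacement field, can be sketched with backward mapping; the toy image and constant shift are assumptions:

```python
import numpy as np
from scipy.ndimage import map_coordinates

def warp_image(reference, displacement):
    """Warp a reference image by a displacement field via backward
    mapping, as when rendering the reference CT deformed by a motion
    model: the warped value at x is sampled from reference at x + u(x)."""
    grid = np.indices(reference.shape).astype(float)
    coords = grid + displacement
    return map_coordinates(reference, coords, order=1, mode="nearest")

reference = np.zeros((16, 16))
reference[4:8, 4:8] = 1.0        # toy anatomical feature
u = np.zeros((2, 16, 16))
u[0, :, :] = 2.0                 # constant 2-voxel shift along axis 0
warped = warp_image(reference, u)
```

In the simulator this evaluation runs per frame with a time-varying u(x, t) from the breathing model, and the same displacements map the haptic device position back into reference-image space.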
NASA Astrophysics Data System (ADS)
Sund, Nicole; Porta, Giovanni; Bolster, Diogo; Parashar, Rishi
2017-11-01
Prediction of effective transport for mixing-driven reactive systems at larger scales requires accurate representation of mixing at small scales, which poses a significant upscaling challenge. Depending on the problem at hand, a Lagrangian framework can be beneficial in some cases, while an Eulerian one has advantages in others. Here we propose and test a novel hybrid model which attempts to leverage the benefits of each. Specifically, our framework provides a Lagrangian closure required for a volume-averaging procedure of the advection-diffusion-reaction equation. This hybrid model is a LAgrangian Transport Eulerian Reaction Spatial Markov model (LATERS Markov model), which extends previous implementations of the Lagrangian Spatial Markov model and maps concentrations to an Eulerian grid to quantify the closure terms required to calculate the volume-averaged reaction terms. The advantage of this approach is that the Spatial Markov model is known to provide accurate predictions of transport, particularly at preasymptotic early times, when the assumptions required by traditional volume-averaging closures are least likely to hold; likewise, the Eulerian reaction method is efficient because it does not require calculation of distances between particles. This manuscript introduces the LATERS Markov model and demonstrates by example its ability to accurately predict bimolecular reactive transport in a simple benchmark 2-D porous medium.
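The Lagrangian transport half of the hybrid rests on the core Spatial Markov idea: a particle's travel time across successive fixed-length cells is correlated through a transition matrix between travel-time classes. The following is a minimal sketch of that idea under assumed values; the two classes, their mean cell-crossing times, and the transition probabilities are illustrative and are not the model's actual parameterization.

```python
import random

# Minimal Spatial Markov sketch: successive cell-crossing times are
# drawn from "fast" or "slow" classes, with the next class chosen from
# a transition matrix conditioned on the current class. All numbers
# below are illustrative assumptions.

CLASSES = {"fast": 1.0, "slow": 5.0}          # mean travel time per cell
P = {"fast": {"fast": 0.8, "slow": 0.2},      # P[current][next]
     "slow": {"fast": 0.2, "slow": 0.8}}

def arrival_time(n_cells, rng):
    """Total travel time of one particle across n_cells cells."""
    state = rng.choice(list(CLASSES))
    total = 0.0
    for _ in range(n_cells):
        total += rng.expovariate(1.0 / CLASSES[state])
        state = "fast" if rng.random() < P[state]["fast"] else "slow"
    return total

rng = random.Random(0)
times = [arrival_time(20, rng) for _ in range(1000)]
print(f"mean arrival time over 20 cells: {sum(times) / len(times):.1f}")
```

The persistence in the transition matrix (fast steps tend to follow fast steps) is what lets the model reproduce preasymptotic, non-Fickian arrival-time distributions that uncorrelated random-walk closures miss.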
Ambrisko, Tamas D; Klide, Alan M
2011-10-01
To assess agreement between anesthetic agent concentrations measured by use of an infrared anesthetic gas monitor (IAGM) and refractometry. SAMPLE: 4 IAGMs of the same type and 1 refractometer. Mixtures of oxygen and isoflurane, sevoflurane, desflurane, or N(2)O were used. Agent volume percent was measured simultaneously with the 4 IAGMs and the refractometer at the common gas outlet. Measurements obtained with each of the 4 IAGMs were compared with the corresponding refractometer measurements via the Bland-Altman method. Similarly, Bland-Altman plots were also created comparing either IAGM or refractometer measurements with the desflurane vaporizer dial settings. Bias ± 2 SD for comparisons of IAGM and refractometer measurements was as follows: isoflurane, -0.03 ± 0.18 volume percent; sevoflurane, -0.19 ± 0.23 volume percent; desflurane, 0.43 ± 1.22 volume percent; and N(2)O, -0.21 ± 1.88 volume percent. Bland-Altman plots comparing IAGM and refractometer measurements revealed nonlinear relationships for sevoflurane, desflurane, and N(2)O. Desflurane measurements were notably affected; bias ± limits of agreement (2 SD) were small (0.1 ± 0.22 volume percent) at < 12 volume percent, but both bias and limits of agreement increased at higher concentrations. Because refractometer measurements agreed with the desflurane vaporizer dial settings but IAGM measurements did not, infrared measurement technology was a suspected cause of the nonlinear relationships. Given that the assumption of linearity is a cornerstone of anesthetic monitor calibration, this assumption should be confirmed before anesthetic monitors are used in experiments.
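The Bland-Altman computation used throughout the study reduces to a mean paired difference (bias) and limits of agreement at ±2 SD. This is a minimal sketch; the paired isoflurane readings below are invented illustrative numbers, not study data.

```python
from statistics import mean, stdev

# Bland-Altman agreement sketch: bias is the mean of the paired
# differences; limits of agreement are bias +/- 2 SD of the differences.
# The readings below are made-up illustrative values.

iagm = [1.02, 1.48, 2.01, 2.55, 3.10]   # IAGM readings (volume percent)
refr = [1.05, 1.50, 2.06, 2.60, 3.12]   # paired refractometer readings

diffs = [a - b for a, b in zip(iagm, refr)]
bias = mean(diffs)
sd = stdev(diffs)
print(f"bias = {bias:.3f} volume percent")
print(f"limits of agreement: {bias - 2 * sd:.3f} to {bias + 2 * sd:.3f}")
```

Plotting each difference against the pair's mean (the actual Bland-Altman plot) is what reveals the nonlinearity the study reports: a trend in the differences across the concentration range rather than random scatter about the bias.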
Entorhinal Cortex Volume in Antipsychotic-naïve Schizophrenia.
Jose, Sam P; Sharma, Eesha; Narayanaswamy, Janardhanan C; Rajendran, Vishnurajan; Kalmady, Sunil V; Rao, Naren P; Venkatasubramanian, Ganesan; Gangadhar, Bangalore N
2012-04-01
Entorhinal cortex (ERC), a multimodal sensory relay station for the hippocampus, is critically involved in learning, emotion, and novelty detection. One proposed pathogenetic mechanism in schizophrenia involves aberrant information processing in the ERC. Several studies have examined cytoarchitectural and morphometric changes in the ERC, but results have been inconsistent, possibly due to the potential confounding effects of antipsychotic treatment. In this study, we examined entorhinal cortex volume in antipsychotic-naïve schizophrenia patients (n=40; M:F=22:18) in comparison with healthy subjects (n=42; M:F=25:17) matched as a group for age, sex, and handedness, using a validated method. 3-Tesla MR images with 1-mm sections were used, and the data were analyzed using SPSS software. Female schizophrenia patients (1.25±0.22 mL) showed a significant volume deficit in the right ERC in comparison with female healthy controls (1.45±0.34 mL) (F=4.9; P=0.03), after controlling for the potential confounding effect of intracranial volume. However, male patients did not differ from male controls, and left ERC volume did not differ between patients and controls. Consistent with the findings of a few earlier studies, we found a reduction in right ERC volume in patients; however, this was limited to women. Contextually, our finding supports a role for ERC deficits in schizophrenia pathogenesis, perhaps mediated through aberrant novelty detection. The sex-differential observation of ERC volume deficit in schizophrenia needs further study.
Li, Wei; Abram, François; Pelletier, Jean-Pierre; Raynauld, Jean-Pierre; Dorais, Marc; d'Anjou, Marc-André; Martel-Pelletier, Johanne
2010-01-01
Joint effusion is frequently associated with osteoarthritis (OA) flare-up and is an important marker of therapeutic response. This study aimed at developing and validating a fully automated system based on magnetic resonance imaging (MRI) for the quantification of joint effusion volume in knee OA patients. MRI examinations consisted of two axial sequences: a T2-weighted true fast imaging with steady-state precession and a T1-weighted gradient echo. An automated joint effusion volume quantification system using MRI was developed and validated (a) with calibrated phantoms (cylinder and sphere) and effusion from knee OA patients; (b) with assessment by manual quantification; and (c) by direct aspiration. Twenty-five knee OA patients with joint effusion were included in the study. The automated joint effusion volume quantification was developed as a four stage sequencing process: bone segmentation, filtering of unrelated structures, segmentation of joint effusion, and subvoxel volume calculation. Validation experiments revealed excellent coefficients of variation with the calibrated cylinder (1.4%) and sphere (0.8%) phantoms. Comparison of the OA knee joint effusion volume assessed by the developed automated system and by manual quantification was also excellent (r = 0.98; P < 0.0001), as was the comparison with direct aspiration (r = 0.88; P = 0.0008). The newly developed fully automated MRI-based system provided precise quantification of OA knee joint effusion volume with excellent correlation with data from phantoms, a manual system, and joint aspiration. Such an automated system will be instrumental in improving the reproducibility/reliability of the evaluation of this marker in clinical application.
Polarimetric and Structural Properties of a Boreal Forest at P-Band and L-Band
NASA Astrophysics Data System (ADS)
Tebaldini, S.; Rocca, F.
2010-12-01
With this paper we investigate the structural and polarimetric properties of the boreal forest within the Krycklan river catchment, northern Sweden, based on multi-polarimetric and multi-baseline SAR surveys at P-Band and L-Band collected in the framework of the ESA campaign BioSAR 2008. The analysis has been carried out by applying the Algebraic Synthesis (AS) technique, recently introduced in the literature, which provides a theoretical framework for the decomposition of the backscattered signal into ground-only and volume-only contributions, based on both baseline and polarization diversity. The availability of multiple baselines allows the formation of a synthetic aperture not only along the azimuth direction but also in elevation. Accordingly, the backscattered echoes can be focused not only in the slant-range, azimuth plane, but in the whole 3D space. This is the rationale of the SAR Tomography (T-SAR) concept, which has been widely considered in the recent literature. It follows that, as long as penetration into the scattering volume is guaranteed, the vertical profile of the vegetation layer can be retrieved by separating backscatter contributions along the vertical direction, which is the main reason for the exploitation of tomographic techniques at longer wavelengths. Still, the capabilities of T-SAR are limited to imaging the global vertical structure of the electromagnetic scattering in a certain polarization. It then becomes important to develop methodologies for investigating the vertical structure of different Scattering Mechanisms (SMs), such as ground and volume scattering, in such a way as to derive information that can also be delivered outside the field of radar processing. This issue may become especially relevant at longer wavelengths, such as P-Band, where multiple scattering arising from interaction with the terrain could hinder correct reconstruction of the forest structure.
The availability of multiple polarizations makes it possible to overcome this limitation, thus providing a way to obtain the vertical structures associated with volume-only contributions. Experimental results will be provided showing the following. At P-Band the most relevant scattering contributions are observed at ground level, not only in the co-polar channels but also in HV, consistent with the first BioSAR campaign. L-Band data have shown a remarkable difference, resulting in a more uniform distribution of the backscattered power along the vertical direction. Volume top height has been observed to be substantially invariant to the choice of the solution for volume-only scattering. These results underline the validity of modeling a forest scenario as constituted by volume and ground (or rather ground-locked) scattering, and the importance of forest top height as the most robust indicator of forest structure as imaged through microwave measurements. Nevertheless, it has also been shown that different solutions for volume scattering correspond to dramatically different vertical structures. In this framework, tomography represents a powerful tool for investigating the potential solutions, as it makes it possible to see what kind of vertical structure has been retrieved. On this basis, a solution has been proposed with a criterion to emphasize volume contributions at P-Band.
NASA Astrophysics Data System (ADS)
Mamali, Dimitra; Marinou, Eleni; Pikridas, Michael; Kottas, Michael; Binietoglou, Ioannis; Kokkalis, Panagiotis; Tsekeri, Aleksandra; Amiridis, Vasilis; Sciare, Jean; Keleshis, Christos; Engelmann, Ronny; Ansmann, Albert; Russchenberg, Herman W. J.; Biskos, George
2017-04-01
Vertical profiles of the aerosol mass concentration derived from light detection and ranging (lidar) measurements were compared to airborne dried optical particle counter (OPC; MetOne Model 212) measurements during the INUIT-BACCHUS-ACTRIS campaign. The campaign took place in April 2016 and its main focus was the study of aerosol dust particles. During the campaign, the NOA Polly-XT Raman lidar located at Nicosia (35.08° N, 33.22° E) provided round-the-clock vertical profiles of aerosol optical properties. In addition, an unmanned aerial vehicle (UAV) carrying an OPC flew during the early morning hours on 7 days. The flights were performed at Orounda (35.1018° N, 33.0944° E), reaching altitudes of 2.5 km a.s.l., which allows comparison with a good fraction of the recorded lidar data. The polarization lidar photometer networking method (POLIPHON) was used for the estimation of the fine-mode (non-dust) and coarse-mode (dust) aerosol mass concentration profiles. This method uses as input the particle backscatter coefficient and particle depolarization profiles of the lidar at 532 nm wavelength and derives the aerosol mass concentration. The first step in this approach uses the lidar observations to separate the backscatter and extinction contributions of the weakly depolarizing non-dust aerosol components from those of the strongly depolarizing dust particles, under the assumption of an externally mixed two-component aerosol. In the second step, sun photometer retrievals of the fine- and coarse-mode aerosol optical thickness (AOT) and volume concentration are used to calculate the associated concentrations from the extinction coefficients retrieved from the lidar. The estimated aerosol volume concentrations were converted into mass concentrations under an assumption for the bulk aerosol density, and compared with the OPC measurements. The first results show agreement within the experimental uncertainty. 
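The final conversion step described above is simple arithmetic: mass concentration is the retrieved particle volume concentration multiplied by an assumed bulk density. This is a hedged sketch of that unit conversion; the density values (e.g. ~2.6 g cm^-3 for dust) and the example volume concentration are typical literature assumptions, not the campaign's retrievals.

```python
# Volume-to-mass conversion sketch for a two-component aerosol.
# Densities are assumed typical values, not campaign retrievals.

RHO = {"dust": 2.6e3, "non_dust": 1.5e3}   # bulk density (kg m^-3)

def mass_conc(v_um3_cm3, component):
    """Mass concentration (ug m^-3) from a volume concentration
    given in um^3 of particles per cm^3 of air."""
    # 1 um^3 cm^-3 = 1e-18 m^3 / 1e-6 m^3 = 1e-12 m^3 particle per m^3 air
    v_m3_per_m3 = v_um3_cm3 * 1e-12
    kg_per_m3 = RHO[component] * v_m3_per_m3
    return kg_per_m3 * 1e9  # kg m^-3 -> ug m^-3

# e.g. an assumed coarse-mode volume concentration of 10 um^3 cm^-3:
print(f"{mass_conc(10.0, 'dust'):.1f} ug m^-3 of dust")
```

The same formula applied per altitude bin turns the retrieved volume-concentration profile into the mass-concentration profile compared against the OPC.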
This project received funding from the European Union's Seventh Framework Programme (FP7) project BACCHUS under grant agreement no. 603445, and the European Union's Horizon 2020 research and innovation programme ACTRIS-2 under grant agreement No 654109.
Maxwell, M; Howie, J G; Pryde, C J
1998-01-01
BACKGROUND: Prescribing matters (particularly budget setting and research into prescribing variation between doctors) have been handicapped by the absence of credible measures of the volume of drugs prescribed. AIM: To use the defined daily dose (DDD) method to study variation in the volume and cost of drugs prescribed across the seven main British National Formulary (BNF) chapters with a view to comparing different methods of setting prescribing budgets. METHOD: Study of one year of prescribing statistics from all 129 general practices in Lothian, covering 808,059 patients: analyses of prescribing statistics for 1995 to define volume and cost/volume of prescribing for one year for 10 groups of practices defined by the age and deprivation status of their patients, for seven BNF chapters; creation of prescribing budgets for 1996 for each individual practice based on the use of target volume and cost statistics; comparison of 1996 DDD-based budgets with those set using the conventional historical approach; and comparison of DDD-based budgets with budgets set using a capitation-based formula derived from local cost/patient information. RESULTS: The volume of drugs prescribed was affected by the age structure of the practices in BNF Chapters 1 (gastrointestinal), 2 (cardiovascular), and 6 (endocrine), and by deprivation structure for BNF Chapters 3 (respiratory) and 4 (central nervous system). Costs per DDD in the major BNF chapters were largely independent of age, deprivation structure, or fundholding status. Capitation and DDD-based budgets were similar to each other, but both differed substantially from historic budgets. One practice in seven gained or lost more than 100,000 Pounds per annum using DDD or capitation budgets compared with historic budgets. The DDD-based budget, but not the capitation-based budget, can be used to set volume-specific prescribing targets. 
CONCLUSIONS: DDD-based and capitation-based prescribing budgets can be set using a simple explanatory model and generalizable methods. In this study, both differed substantially from historic budgets. DDD budgets could be created to accommodate new prescribing strategies and raised or lowered to reflect local intentions to alter overall prescribing volume or cost targets. We recommend that future work on setting budgets and researching prescribing variations should be based on DDD statistics. PMID:10024703
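The DDD measure at the heart of the study is straightforward arithmetic: prescribing volume is the total quantity of a drug dispensed divided by its WHO-defined daily dose. The sketch below illustrates this; the DDD values and the dispensed quantities are illustrative assumptions, not figures from the Lothian data.

```python
# Defined daily dose (DDD) volume sketch. The DDD values below are
# assumed for illustration; authoritative values come from the WHO
# ATC/DDD index.

DDD_MG = {"ranitidine": 300, "atenolol": 75}  # assumed DDDs (mg)

def prescribed_ddds(total_mg, drug):
    """Number of DDDs represented by a total dispensed quantity (mg)."""
    return total_mg / DDD_MG[drug]

# e.g. 90 tablets of 150 mg ranitidine:
print(prescribed_ddds(90 * 150, "ranitidine"))  # 45.0 DDDs
```

Because a DDD is a fixed technical unit per drug, summing DDDs across products within a BNF chapter yields a prescribing volume that is comparable across practices regardless of which brands or pack sizes were dispensed, which is what makes volume-specific budget targets possible.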
Identifying and Comparing Scandinavian Ethnography: Comparisons and Influences
ERIC Educational Resources Information Center
Beach, Dennis
2010-01-01
In recent years, there has been a significant growth in the volume of research production in education ethnography in Scandinavia due partly to a regionally financed network. The present article makes some comparisons between Scandinavian and other education research contexts in relation to aspects of general ethnographic design to try to analyse…
Why is Coal Ash of Concern and How to Assess Potential Impacts?
EPA's new test methods, the Leaching Environmental Assessment Framework (LEAF), are discussed, including how they have been used to evaluate fly ash and scrubber residues. Work to evaluate high-volume encapsulated use of fly ash in cementitious material is also described.
Treatment of Bipolar Disorder in the University Student Population
ERIC Educational Resources Information Center
Federman, Russ
2011-01-01
University counseling centers are faced with the challenge of effectively treating bipolar students while also utilizing brief treatment frameworks and managing high patient volumes. Potential destabilization, particularly within the elevated mood phase, poses significant behavioral management issues for university clinicians and administrators,…
Genre and Literacy-Modeling Context in Educational Linguistics.
ERIC Educational Resources Information Center
Martin, James R.
1992-01-01
Complements review in previous volume concerning Australian literacy (in first- and second-language) initiatives that drew on systemic functional linguistics, highlights ongoing research within the same theoretical framework, and focuses on the question of modeling context in educational linguistics. The discussion includes modeling context as…
Variation and Linguistic Theory.
ERIC Educational Resources Information Center
Bailey, Charles-James N.
This volume presents principles and models for describing language variation, and introduces a time-based, dynamic framework for linguistic description. The book first summarizes some of the problems of grammatical description encountered from Saussure through the present and then outlines possibilities for new descriptions of language which take…
CALS Baseline Architecture Analysis of Weapons System. Technical Information: Army, Draft. Volume 8
DOT National Transportation Integrated Search
1989-09-01
This effort was performed to provide a common framework for analysis and planning of CALS initiatives across the military services, leading eventually to the development of a common DoD-wide architecture for CALS. This study addresses Army technical ...
Application of ISO 9000 Standards to Education and Training.
ERIC Educational Resources Information Center
Van den Berghe, Wouter
1998-01-01
ISO 9000 certification has the advantages of a measurable framework for quality efforts, continuous improvement, and better customer service. Drawbacks for education and training providers include volume of paperwork, ongoing cost, risk of a growing bureaucracy, and the difficulty of making changes quickly. (SK)