Science.gov

Sample records for 3-dimensional computational analysis

  1. MICRO-DOSIMETRY ANALYSIS USING 3-DIMENSIONAL COMPUTER SIMULATIONS OF PARTICLE DEPOSITION IN HUMAN LUNGS

    EPA Science Inventory

    The aim of this project is to develop three-dimensional computer simulations for aerosol transport and deposition in the human respiratory tract. Three-dimensional CFPD (computational fluid and particle dynamics) modeling is a powerful tool to obtain microscopic dose information at l...

  2. From 2-dimensional cephalograms to 3-dimensional computed tomography scans.

    PubMed

    Halazonetis, Demetrios J

    2005-05-01

    Computed tomography is entering the orthodontic specialty as a mainstream diagnostic modality. Radiation exposure and cost have decreased significantly, and the diagnostic value is very high compared with traditional radiographic options. However, 3-dimensional data present new challenges and need a different approach from traditional viewing of static images to make the most of the available possibilities. Advances in computer hardware and software now enable interactive display of the data on personal computers, with the ability to selectively view soft or hard tissues from any angle. Transfer functions are used to apply transparency and color. Cephalometric measurements can be taken by digitizing points in 3-dimensional coordinates. Application of 3-dimensional data is expected to increase significantly soon and might eventually replace many conventional orthodontic records that are in use today. PMID:15877045

  3. Particle trajectory computation on a 3-dimensional engine inlet. Final Report Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Kim, J. J.

    1986-01-01

    A 3-dimensional particle trajectory computer code was developed to compute the distribution of water droplet impingement efficiency on a 3-dimensional engine inlet. The computed results provide the essential droplet impingement data required for the engine inlet anti-icing system design and analysis. The droplet trajectories are obtained by solving the trajectory equation using the fourth order Runge-Kutta and Adams predictor-corrector schemes. A compressible 3-D full potential flow code is employed to obtain a cylindrical grid definition of the flowfield on and about the engine inlet. The inlet surface is defined mathematically through a system of bi-cubic parametric patches in order to compute the droplet impingement points accurately. Analysis results of the 3-D trajectory code obtained for an axisymmetric droplet impingement problem are in good agreement with NACA experimental data. Experimental data are not yet available for the engine inlet impingement problem analyzed. Applicability of the method to solid particle impingement problems, such as engine sand ingestion, is also demonstrated.
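
    The trajectory equation is a coupled system for droplet position and velocity, advanced here by the fourth-order Runge-Kutta scheme the report names. A minimal Python sketch under an assumed Stokes-drag model; the response time tau, the flow field, and all numbers are illustrative, not values from the thesis.

        import numpy as np

        def droplet_accel(x, v, flow_velocity, tau):
            # Assumed Stokes-drag equation of motion: a = (u_fluid - v) / tau,
            # with tau the droplet aerodynamic response time.
            return (flow_velocity(x) - v) / tau

        def rk4_step(x, v, dt, flow_velocity, tau):
            # One fourth-order Runge-Kutta step for (position, velocity).
            k1v = droplet_accel(x, v, flow_velocity, tau)
            k1x = v
            k2x = v + 0.5 * dt * k1v
            k2v = droplet_accel(x + 0.5 * dt * k1x, k2x, flow_velocity, tau)
            k3x = v + 0.5 * dt * k2v
            k3v = droplet_accel(x + 0.5 * dt * k2x, k3x, flow_velocity, tau)
            k4x = v + dt * k3v
            k4v = droplet_accel(x + dt * k3x, k4x, flow_velocity, tau)
            x_new = x + dt / 6.0 * (k1x + 2 * k2x + 2 * k3x + k4x)
            v_new = v + dt / 6.0 * (k1v + 2 * k2v + 2 * k3v + k4v)
            return x_new, v_new

        # Example: a droplet released slightly slower than a uniform 80 m/s flow.
        uniform = lambda x: np.array([80.0, 0.0, 0.0])
        x, v = np.zeros(3), np.array([60.0, 0.0, 0.0])
        for _ in range(1000):
            x, v = rk4_step(x, v, 1e-5, uniform, tau=1.2e-3)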

  4. Unification of color postprocessing techniques for 3-dimensional computational mechanics

    NASA Technical Reports Server (NTRS)

    Bailey, Bruce Charles

    1985-01-01

    To facilitate the understanding of complex three-dimensional numerical models, advanced interactive color postprocessing techniques are introduced. These techniques are sufficiently flexible so that postprocessing difficulties arising from model size, geometric complexity, response variation, and analysis type can be adequately overcome. Finite element, finite difference, and boundary element models may be evaluated with the prototype postprocessor. Elements may be removed from parent models to be studied as independent subobjects. Discontinuous responses may be contoured including responses which become singular, and nonlinear color scales may be input by the user for the enhancement of the contouring operation. Hit testing can be performed to extract precise geometric, response, mesh, or material information from the database. In addition, stress intensity factors may be contoured along the crack front of a fracture model. Stepwise analyses can be studied, and the user can recontour responses repeatedly, as if he were paging through the response sets. As a system, these tools allow effective interpretation of complex analysis results.

  5. Computer-assisted 3-dimensional anthropometry of the scaphoid.

    PubMed

    Pichler, Wolfgang; Windisch, Gunther; Schaffler, Gottfried; Heidari, Nima; Dorr, Katrin; Grechenig, Wolfgang

    2010-02-01

    Scaphoid fracture fixation using a cannulated headless compression screw and the Matti-Russe procedure for the treatment of scaphoid nonunions are performed routinely. Surgeons performing these procedures need to be familiar with the anatomy of the scaphoid. A literature review reveals relatively few articles on this subject. The goal of this anatomical study was to measure the scaphoid using current technology and to discuss the findings with respect to the current, relevant literature. Computed tomography scans of 30 wrists were performed using a 64-slice SOMATOM Sensation CT system (resolution 0.6 mm) (Siemens Medical Solutions Inc, Malvern, Pennsylvania). Three-dimensional reconstructions from the raw data were generated by MIMICS software (Materialise, Leuven, Belgium). The scaphoid had a mean length of 26.0 mm (range, 22.3-30.7 mm), and men had a significantly longer (P<.001) scaphoid than women (27.8±1.6 mm vs 24.5±1.6 mm, respectively). The width and height were measured at 3 different levels for volume calculations, resulting in a mean volume of 3389.5 mm(3). Men had a significantly larger (P<.001) scaphoid volume than women (4057.8±740.7 mm(3) vs 2846.5±617.5 mm(3), respectively). We found considerable variation in the length and volume of the scaphoid in our cohort. We also demonstrated a clear correlation between scaphoid size and sex. Surgeons performing operative fixation of scaphoid fractures and corticocancellous bone grafting for nonunions need to be familiar with these anatomical variations. PMID:20192143

  6. A 3-dimensional Analysis of the Cassiopeia A Supernova Remnant

    NASA Astrophysics Data System (ADS)

    Isensee, Karl

    We present a multi-wavelength study of the nearby supernova remnant Cassiopeia A (Cas A). Easily resolvable supernova remnants such as Cas A provide a unique opportunity to test supernova explosion models. Additionally, we can observe key processes in the interstellar medium as the ejecta from the initial explosion encounter Cas A's powerful shocks. In order to accomplish these science goals, we used the Spitzer Space Telescope's Infrared Spectrograph to create a high resolution spectral map of select regions of Cas A, allowing us to make a Doppler reconstruction of its 3-dimensional structure. In the center of the remnant, we find relatively pristine ejecta that have not yet reached Cas A's reverse shock or interacted with the circumstellar environment. We observe O, Si, and S emission. These ejecta can form both sheet-like structures as well as filaments. Si and O, which come from different nucleosynthetic layers of the star, are observed to be coincident in some regions, and separated by >500 km s(-1) in others. Observed ejecta traveling toward us are, on average, ~800 km s(-1) slower than the material traveling away from us. We compare our observations to recent supernova explosion models and find that no single model can simultaneously reproduce all the observed features. However, models of different supernova explosions can collectively produce the observed geometries and structures of the emission interior to Cas A's reverse shock. We use the results from the models to address the conditions during the supernova explosion, concentrating on asymmetries in the shock structure. We also predict that the back surface of Cassiopeia A will begin brightening in ~30 years, and the front surface in ~100 years. We then used similar observations from 3 regions on Cas A's reverse shock in order to create more 3-dimensional maps. In these regions, we observe supernova ejecta both immediately before and during the shock-ejecta interaction. We determine that the
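
    Doppler reconstructions of this kind rest on an assumed expansion law; for a young remnant a homologous (free) expansion is the usual choice, so line-of-sight depth is proportional to the measured radial velocity. A minimal sketch under that assumption (the remnant age and the example velocity are illustrative, not values fitted in this work):

        KM_PER_PC = 3.086e13
        AGE_S = 340 * 3.156e7  # assumed remnant age of ~340 yr, in seconds

        def line_of_sight_depth_pc(v_doppler_km_s):
            # Depth relative to the expansion center under z = v * t_age.
            return v_doppler_km_s * AGE_S / KM_PER_PC

        # An ejecta knot blueshifted by 800 km/s lies ~0.28 pc in front
        # of the remnant center.
        print(line_of_sight_depth_pc(800.0))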

  7. Simple computer program to model 3-dimensional underground heat flow with realistic boundary conditions

    NASA Astrophysics Data System (ADS)

    Metz, P. D.

    A FORTRAN computer program called GROCS (GRound Coupled Systems) has been developed to study 3-dimensional underground heat flow. Features include the use of up to 30 finite elements or blocks of Earth which interact via finite difference heat flow equations and a subprogram which sets realistic time and depth dependent boundary conditions. No explicit consideration of moisture movement or freezing is given. GROCS has been used to model the thermal behavior of buried solar heat storage tanks (with and without insulation) and serpentine pipe fields for solar heat pump space conditioning systems. The program is available independently or in a form compatible with specially written TRNSYS component TYPE subroutines. The approach taken in the design of GROCS, the mathematics contained and the program architecture, are described. Then, the operation of the stand-alone version is explained. Finally, the validity of GROCS is discussed.
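
    As a rough illustration of what such a program does, a block of earth can be stepped forward with explicit finite differences while a subprogram imposes a time-dependent surface temperature. This sketch is only in the spirit of GROCS; the grid size, soil diffusivity, and sinusoidal boundary are assumptions, not the program's actual model.

        import numpy as np

        nx = ny = nz = 10                  # small 3-D block grid
        alpha, dx, dt = 7e-7, 0.5, 3600.0  # diffusivity (m^2/s), spacing (m), step (s)
        T = np.full((nx, ny, nz), 10.0)    # initial ground temperature, deg C

        def surface_temperature(t_seconds):
            # Assumed seasonally varying surface boundary condition.
            year = 365.0 * 24 * 3600
            return 10.0 + 12.0 * np.sin(2 * np.pi * t_seconds / year)

        for step in range(24 * 30):                      # one month of hourly steps
            T[:, :, 0] = surface_temperature(step * dt)  # depth-zero boundary
            lap = (np.roll(T, 1, 0) + np.roll(T, -1, 0) +
                   np.roll(T, 1, 1) + np.roll(T, -1, 1) +
                   np.roll(T, 1, 2) + np.roll(T, -1, 2) - 6 * T) / dx**2
            T[1:-1, 1:-1, 1:-1] += alpha * dt * lap[1:-1, 1:-1, 1:-1]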

  8. The Effectiveness of an Interactive 3-Dimensional Computer Graphics Model for Medical Education

    PubMed Central

    Konishi, Takeshi; Tamura, Yoko; Moriguchi, Hiroki

    2012-01-01

    Background Medical students often have difficulty achieving a conceptual understanding of 3-dimensional (3D) anatomy, such as bone alignment, muscles, and complex movements, from 2-dimensional (2D) images. To this end, animated and interactive 3-dimensional computer graphics (3DCG) can provide better visual information to users. In medical fields, research on the advantages of 3DCG in medical education is relatively new. Objective To determine the educational effectiveness of interactive 3DCG. Methods We divided 100 participants (27 men, mean (SD) age 17.9 (0.6) years, and 73 women, mean (SD) age 18.1 (1.1) years) from the Health Sciences University of Mongolia (HSUM) into 3DCG (n = 50) and textbook-only (control) (n = 50) groups. The control group used a textbook and 2D images, while the 3DCG group was trained to use the interactive 3DCG shoulder model in addition to a textbook. We conducted a questionnaire survey via an encrypted satellite network between HSUM and Tokushima University. The questionnaire was scored on a 5-point Likert scale from strongly disagree (score 1) to strongly agree (score 5). Results Interactive 3DCG was effective in undergraduate medical education. Specifically, there was a significant difference in mean (SD) scores between the 3DCG and control groups in their response to questionnaire items regarding content (4.26 (0.69) vs 3.85 (0.68), P = .001) and teaching methods (4.33 (0.65) vs 3.74 (0.79), P < .001), but no significant difference in the Web category. Participants also provided meaningful comments on the advantages of interactive 3DCG. Conclusions Interactive 3DCG materials have positive effects on medical education when properly integrated into conventional education. In particular, our results suggest that interactive 3DCG is more efficient than textbooks alone in medical education and can motivate students to understand complex anatomical structures. PMID:23611759
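
    The group comparison reported above is an ordinary two-sample test on Likert scores. A sketch that simulates scores from the published means and SDs purely for illustration (these are not the study's raw responses):

        import numpy as np
        from scipy import stats

        rng = np.random.default_rng(0)
        g3d = np.clip(rng.normal(4.26, 0.69, 50).round(), 1, 5)  # 3DCG group
        gtx = np.clip(rng.normal(3.85, 0.68, 50).round(), 1, 5)  # textbook group
        t, p = stats.ttest_ind(g3d, gtx)
        print(f"t = {t:.2f}, P = {p:.4f}")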

  9. Cerebral Degeneration in Amyotrophic Lateral Sclerosis Revealed by 3-Dimensional Texture Analysis

    PubMed Central

    Maani, Rouzbeh; Yang, Yee-Hong; Emery, Derek; Kalra, Sanjay

    2016-01-01

    Introduction: Routine MR images do not consistently reveal pathological changes in the brain in ALS. Texture analysis, a method to quantitate voxel intensities and their patterns and interrelationships, can detect changes in images not apparent to the naked eye. Our objective was to evaluate cerebral degeneration in ALS using 3-dimensional texture analysis of MR images of the brain. Methods: In a case-control design, voxel-based texture analysis was performed on T1-weighted MR images of 20 healthy subjects and 19 patients with ALS. Four texture features, namely, autocorrelation, sum of squares variance, sum average, and sum variance were computed. Texture features were compared between the groups by statistical parametric mapping and correlated with clinical measures of disability and upper motor neuron dysfunction. Results: Texture features were different in ALS in motor regions including the precentral gyrus and corticospinal tracts. To a lesser extent, changes were also found in the thalamus, cingulate gyrus, and temporal lobe. Texture features in the precentral gyrus correlated with disease duration, and in the corticospinal tract they correlated with finger tapping speed. Conclusions: Changes in MR image textures are present in motor and non-motor regions in ALS and correlate with clinical features. Whole brain texture analysis has potential in providing biomarkers of cerebral degeneration in ALS. PMID:27064416
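
    The four features named are gray-level co-occurrence matrix (GLCM) statistics. A hedged sketch of their computation for a single 2-D patch and one pixel offset; the study itself works voxel-wise on 3-D T1 volumes, and the quantization below is an illustrative convention.

        import numpy as np

        def glcm_features(img, levels=8):
            # Quantize intensities, then accumulate co-occurrences at offset (0, 1).
            edges = np.linspace(img.min(), img.max(), levels + 1)[1:-1]
            q = np.digitize(img, edges)
            P = np.zeros((levels, levels))
            for a, b in zip(q[:, :-1].ravel(), q[:, 1:].ravel()):
                P[a, b] += 1
            P /= P.sum()
            i, j = np.indices(P.shape)
            mu = (i * P).sum()
            autocorrelation = (i * j * P).sum()
            sum_of_squares_variance = ((i - mu) ** 2 * P).sum()
            k = np.arange(2 * levels - 1)
            p_sum = np.array([P[(i + j) == s].sum() for s in k])  # p_{x+y}
            sum_average = (k * p_sum).sum()
            sum_variance = ((k - sum_average) ** 2 * p_sum).sum()
            return autocorrelation, sum_of_squares_variance, sum_average, sum_variance

        print(glcm_features(np.random.rand(64, 64)))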

  10. Computation of transonic potential flow about 3 dimensional inlets, ducts, and bodies

    NASA Technical Reports Server (NTRS)

    Reyhner, T. A.

    1982-01-01

    An analysis was developed and a computer code, P465 Version A, written for the prediction of transonic potential flow about three dimensional objects including inlet, duct, and body geometries. Finite differences and line relaxation are used to solve the complete potential flow equation. The coordinate system used for the calculations is independent of body geometry. Cylindrical coordinates are used for the computer code. The analysis is programmed in extended FORTRAN 4 for the CYBER 203 vector computer. The programming of the analysis is oriented toward taking advantage of the vector processing capabilities of this computer. Comparisons of computed results with experimental measurements are presented to verify the analysis. Descriptions of program input and output formats are also presented.

  11. Automated image analysis reveals the dynamic 3-dimensional organization of multi-ciliary arrays

    PubMed Central

    Galati, Domenico F.; Abuin, David S.; Tauber, Gabriel A.; Pham, Andrew T.; Pearson, Chad G.

    2016-01-01

    Multi-ciliated cells (MCCs) use polarized fields of undulating cilia (ciliary array) to produce fluid flow that is essential for many biological processes. Cilia are positioned by microtubule scaffolds called basal bodies (BBs) that are arranged within a spatially complex 3-dimensional geometry (3D). Here, we develop a robust and automated computational image analysis routine to quantify 3D BB organization in the ciliate, Tetrahymena thermophila. Using this routine, we generate the first morphologically constrained 3D reconstructions of Tetrahymena cells and elucidate rules that govern the kinetics of MCC organization. We demonstrate the interplay between BB duplication and cell size expansion through the cell cycle. In mutant cells, we identify a potential BB surveillance mechanism that balances large gaps in BB spacing by increasing the frequency of closely spaced BBs in other regions of the cell. Finally, by taking advantage of a mutant predisposed to BB disorganization, we locate the spatial domains that are most prone to disorganization by environmental stimuli. Collectively, our analyses reveal the importance of quantitative image analysis to understand the principles that guide the 3D organization of MCCs. PMID:26700722

  12. Automated image analysis reveals the dynamic 3-dimensional organization of multi-ciliary arrays.

    PubMed

    Galati, Domenico F; Abuin, David S; Tauber, Gabriel A; Pham, Andrew T; Pearson, Chad G

    2015-01-01

    Multi-ciliated cells (MCCs) use polarized fields of undulating cilia (ciliary array) to produce fluid flow that is essential for many biological processes. Cilia are positioned by microtubule scaffolds called basal bodies (BBs) that are arranged within a spatially complex 3-dimensional geometry (3D). Here, we develop a robust and automated computational image analysis routine to quantify 3D BB organization in the ciliate, Tetrahymena thermophila. Using this routine, we generate the first morphologically constrained 3D reconstructions of Tetrahymena cells and elucidate rules that govern the kinetics of MCC organization. We demonstrate the interplay between BB duplication and cell size expansion through the cell cycle. In mutant cells, we identify a potential BB surveillance mechanism that balances large gaps in BB spacing by increasing the frequency of closely spaced BBs in other regions of the cell. Finally, by taking advantage of a mutant predisposed to BB disorganization, we locate the spatial domains that are most prone to disorganization by environmental stimuli. Collectively, our analyses reveal the importance of quantitative image analysis to understand the principles that guide the 3D organization of MCCs. PMID:26700722
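
    One natural building block for quantifying BB organization of this kind is a nearest-neighbour spacing statistic over the detected BB coordinates, the sort of measure that can flag large gaps versus closely spaced BBs. A sketch with synthetic coordinates (an assumed illustration, not the authors' routine):

        import numpy as np
        from scipy.spatial import cKDTree

        def bb_spacing(points_xyz):
            # points_xyz: (N, 3) array of basal-body positions, e.g. in microns.
            tree = cKDTree(points_xyz)
            d, _ = tree.query(points_xyz, k=2)  # first hit is the point itself
            return d[:, 1]                      # distance to true nearest neighbour

        rng = np.random.default_rng(1)
        spacings = bb_spacing(rng.uniform(0, 50, size=(200, 3)))
        print(spacings.mean(), spacings.min(), spacings.max())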

  13. Contributions of the Musculus Uvulae to Velopharyngeal Closure Quantified With a 3-Dimensional Multimuscle Computational Model.

    PubMed

    Inouye, Joshua M; Lin, Kant Y; Perry, Jamie L; Blemker, Silvia S

    2016-02-01

    The convexity of the dorsal surface of the velum is critical for normal velopharyngeal (VP) function and is largely attributed to the levator veli palatini (LVP) and musculus uvulae (MU). Studies have correlated a concave or flat nasal velar surface to symptoms of VP dysfunction including hypernasality and nasal air emission. In the context of surgical repair of cleft palates, the MU has been given relatively little attention in the literature compared with the larger LVP. A greater understanding of the mechanics of the MU will provide insight into understanding the influence of a dysmorphic MU, as seen in cleft palate, as it relates to VP function. The purpose of this study was to quantify the contributions of the MU to VP closure in a computational model. We created a novel 3-dimensional (3D) finite element model of the VP mechanism from magnetic resonance imaging data collected from an individual with healthy noncleft VP anatomy. The model components included the velum, posterior pharyngeal wall (PPW), LVP, and MU. Simulations were based on the muscle and soft tissue mechanical properties from the literature. We found that, similar to previous hypotheses, the MU acts as (i) a space-occupying structure and (ii) a velar extensor. As a space-occupying structure, the MU helps to nearly triple the midline VP contact length. As a velar extensor, the MU acting alone without the LVP decreases the VP distance 62%. Furthermore, activation of the MU decreases the LVP activation required for closure almost 3-fold, from 20% (without MU) to 8% (with MU). Our study suggests that any possible salvaging and anatomical reconstruction of viable MU tissue in a cleft patient may improve VP closure due to its mechanical function. In the absence or dysfunction of MU tissue, implantation of autologous or engineered tissues at the velar midline, as a possible substitute for the MU, may produce a geometric convexity more favorable to VP closure. In the future, more complex models will

  14. Manufacturing models of fetal malformations built from 3-dimensional ultrasound, magnetic resonance imaging, and computed tomography scan data.

    PubMed

    Werner, Heron; Rolo, Liliam Cristine; Araujo Júnior, Edward; Dos Santos, Jorge Roberto Lopes

    2014-03-01

    Technological innovations accompanying advances in medicine have given rise to the possibility of obtaining better-defined fetal images that assist in medical diagnosis and contribute toward genetic counseling offered to parents during the prenatal period. In this article, we show our innovative experience of diagnosing fetal malformations through correlating 3-dimensional ultrasonography, magnetic resonance imaging, and computed tomography, which are accurate techniques for fetal assessment, with a fetal image reconstruction technique to create physical fetal models. PMID:24901782

  15. Computer-Aided Designed, 3-Dimensionally Printed Porous Tissue Bioscaffolds For Craniofacial Soft Tissue Reconstruction

    PubMed Central

    Zopf, David A.; Mitsak, Anna G.; Flanagan, Colleen L.; Wheeler, Matthew; Green, Glenn E.; Hollister, Scott J.

    2016-01-01

    Objectives To determine the potential of an integrated image-based computer-aided design (CAD) and 3D printing approach to engineer scaffolds for head and neck cartilaginous reconstruction, specifically auricular and nasal reconstruction. Study Design Proof of concept revealing novel methods for bioscaffold production with in vitro and in vivo animal data. Setting Multidisciplinary effort encompassing two academic institutions. Subjects and Methods DICOM CT images are segmented and utilized in image-based computer-aided design to create porous, anatomic structures. Bioresorbable, polycaprolactone scaffolds with spherical and random porous architecture are produced using a laser-based 3D printing process. Subcutaneous in vivo implantation of auricular and nasal scaffolds was performed in a porcine model. Auricular scaffolds were seeded with chondrogenic growth factors in a hyaluronic acid/collagen hydrogel and cultured in vitro over 2 months duration. Results Auricular and nasal constructs with several microporous architectures were rapidly manufactured with high fidelity to human patient anatomy. Subcutaneous in vivo implantation of auricular and nasal scaffolds resulted in excellent appearance and complete soft tissue ingrowth. Histologic analysis of in vitro scaffolds demonstrated native appearing cartilaginous growth respecting the boundaries of the scaffold. Conclusions Integrated image-based computer-aided design (CAD) and 3D printing processes generated patient-specific nasal and auricular scaffolds that supported cartilage regeneration. PMID:25281749

  16. A New 3-Dimensional Dynamic Quantitative Analysis System of Facial Motion: An Establishment and Reliability Test

    PubMed Central

    Feng, Guodong; Zhao, Yang; Tian, Xu; Gao, Zhiqiang

    2014-01-01

    This study aimed to establish a 3-dimensional dynamic quantitative facial motion analysis system, and then determine its accuracy and test-retest reliability. The system could automatically reconstruct the motion of the observational points. Standardized T-shaped and L-shaped rods were used to evaluate the static and dynamic accuracy of the system. Nineteen healthy volunteers were recruited to test the reliability of the system. The average static distance error was 0.19 mm, and the average angular error was 0.29°. Measurement accuracy decreased as the distance between the cameras and the object increased; a distance of 80 cm was considered optimal. It took only 58 seconds to perform the full facial measurement process. The average intra-class correlation coefficients for distance and angular measurements were 0.973 and 0.794, respectively. The results demonstrated that we successfully established a practical 3-dimensional dynamic quantitative analysis system that is accurate and reliable enough to meet both clinical and research needs. PMID:25390881

  17. Porous Media Contamination: 3-Dimensional Visualization and Quantification Using X-Ray Computed Tomography

    NASA Astrophysics Data System (ADS)

    Goldstein, L.; Prasher, S. O.; Ghoshal, S.

    2004-05-01

    Non-aqueous phase liquids (NAPLs), if spilled into the subsurface, will migrate downward, and a significant fraction will become trapped in the soil matrix. These trapped NAPL globules partition into the water and/or vapor phase, and serve as continuous sources of contamination (e.g. source zones). At present, the presence of NAPL in the subsurface is typically inferred from chemical analysis data. There are no accepted methodologies or protocols available for the direct characterization of NAPLs in the subsurface. Proven and cost-effective methodologies are needed to allow effective implementation of remediation technologies at NAPL contaminated sites. X-ray Computed Tomography (CT) has the potential to non-destructively quantify NAPL mass and distribution in soil cores due to this technology's ability to detect small atomic density differences of solid, liquid, gas, and NAPL phases present in a representative volume element. We have demonstrated that environmentally significant NAPLs, such as gasoline and other oil products, chlorinated solvents, and PCBs possess a characteristic and predictable X-ray attenuation coefficient that permits their quantification in porous media at incident beam energies typical of medical and industrial X-ray CT scanners. As part of this study, methodologies were developed for generating and analyzing X-ray CT data for the study of NAPLs in natural porous media. Columns of NAPL-contaminated soils were scanned, flushed with solvents and water to remove entrapped NAPL, and re-scanned. X-ray CT data were analyzed to obtain numerical arrays of soil porosity, NAPL saturation, and NAPL volume at a spatial resolution of 1 mm. This methodology was validated using homogeneous and heterogeneous soil columns with known quantities of gasoline and tetrachloroethylene. NAPL volumes computed using X-ray CT data were compared with known volumes from volume balance calculations. Error analysis revealed that in a 5 cm long and 2.5 cm diameter soil
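
    The scan/flush/re-scan design implies a simple voxel-wise calculation: the attenuation difference between the two scans, scaled by porosity and the water/NAPL attenuation contrast, yields NAPL saturation, and summing saturation times porosity over voxels yields volume. A hedged sketch; the attenuation constants and units are illustrative assumptions, not calibration values from this study.

        import numpy as np

        CT_WATER, CT_NAPL = 0.0, -250.0  # assumed pore-fluid attenuations (HU-like)

        def napl_saturation(ct_with_napl, ct_flushed, porosity):
            # Per-voxel saturation from the attenuation difference of paired scans.
            dS = (ct_flushed - ct_with_napl) / (porosity * (CT_WATER - CT_NAPL))
            return np.clip(dS, 0.0, 1.0)

        def napl_volume_mm3(saturation, porosity, voxel_mm3=1.0):
            return float((saturation * porosity).sum() * voxel_mm3)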

  18. Surgical Classification of the Mandibular Deformity in Craniofacial Microsomia Using 3-Dimensional Computed Tomography

    PubMed Central

    Swanson, Jordan W.; Mitchell, Brianne T.; Wink, Jason A.; Taylor, Jesse A.

    2016-01-01

    Background: Grading systems of the mandibular deformity in craniofacial microsomia (CFM) based on conventional radiographs have shown low interrater reproducibility among craniofacial surgeons. We sought to design and validate a classification based on 3-dimensional CT (3dCT) that correlates features of the deformity with surgical treatment. Methods: CFM mandibular deformities were classified as normal (T0), mild (hypoplastic, likely treated with orthodontics or orthognathic surgery; T1), moderate (vertically deficient ramus, likely treated with distraction osteogenesis; T2), or severe (ramus rudimentary or absent, with either adequate or inadequate mandibular body bone stock; T3 and T4, likely treated with costochondral graft or free fibular flap, respectively). The 3dCT face scans of CFM patients were randomized and then classified by craniofacial surgeons. Pairwise agreement and Fleiss' κ were used to assess interrater reliability. Results: The 3dCT images of 43 patients with CFM (aged 0.1–15.8 years) were reviewed by 15 craniofacial surgeons, representing an average 15.2 years of experience. Reviewers demonstrated fair interrater reliability with average pairwise agreement of 50.4 ± 9.9% (Fleiss' κ = 0.34). This represents significant improvement over the Pruzansky–Kaban classification (pairwise agreement, 39.2%; P = 0.0033). Reviewers demonstrated substantial interrater reliability with average pairwise agreement of 83.0 ± 7.6% (κ = 0.64) distinguishing deformities requiring graft or flap reconstruction (T3 and T4) from others. Conclusion: The proposed classification, designed for the era of 3dCT, shows improved consensus with respect to stratifying the severity of mandibular deformity and type of operative management. PMID:27104097

  19. Role of the Animator in the Generation of 3-Dimensional Computer Generated Animation.

    ERIC Educational Resources Information Center

    Wedge, John Christian

    This master's thesis investigates the relationship between the traditional animator and the computer, as computer animation systems now allow animators to apply their traditional skills with a high degree of success. The advantages and disadvantages of traditional animation as a medium for expressing motion and character are noted, and it is argued that the…

  20. Solution of 3-dimensional time-dependent viscous flows. Part 2: Development of the computer code

    NASA Technical Reports Server (NTRS)

    Weinberg, B. C.; Mcdonald, H.

    1980-01-01

    There is considerable interest in developing a numerical scheme for solving the time dependent viscous compressible three dimensional flow equations to aid in the design of helicopter rotors. The development of a computer code to solve a three dimensional unsteady approximate form of the Navier-Stokes equations employing a linearized block implicit technique in conjunction with a QR operator scheme is described. Results of calculations of several Cartesian test cases are presented. The computer code can be applied to more complex flow fields such as those encountered on rotating airfoils.

  1. Interactive 3-dimensional segmentation of MRI data in personal computer environment.

    PubMed

    Yoo, S S; Lee, C U; Choi, B G; Saiviroonporn, P

    2001-11-15

    We describe a method of interactive three-dimensional segmentation and visualization for anatomical magnetic resonance imaging (MRI) data in a personal computer environment. The visual feedback necessary during 3-D segmentation was provided by a ray casting algorithm, which was designed to allow users to interactively decide the visualization quality depending on the task requirement. Structures such as gray matter, white matter, and facial skin from T1-weighted high-resolution MRI data were segmented and later visualized with surface rendering. Personal computers with central processing unit (CPU) speeds of 266, 400, and 700 MHz were used for the implementation. The 3-D visualization upon each execution of the segmentation operation was achieved on the order of 2 s with a 700 MHz CPU. Our results suggest that 3-D volume segmentation with semi real-time visual feedback could be effectively implemented in a PC environment without the need for dedicated graphics processing hardware. PMID:11640960
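
    A crude stand-in for the adjustable-quality feedback described: cast axis-aligned rays through the volume and keep the maximum sample per ray, with a step parameter trading rendering quality for speed. This is illustrative only, not the authors' ray caster.

        import numpy as np

        def max_intensity_projection(volume, step=1):
            # volume: (Z, Y, X) array; larger 'step' means coarser but faster.
            return volume[::step, :, :].max(axis=0)

        vol = np.random.rand(128, 256, 256).astype(np.float32)
        fast_preview = max_intensity_projection(vol, step=4)  # interactive quality
        final_image = max_intensity_projection(vol, step=1)   # full quality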

  2. A 3-dimensional Navier-Stokes-Euler code for blunt-body flow computations

    NASA Technical Reports Server (NTRS)

    Li, C. P.

    1985-01-01

    The shock-layer flowfield is obtained, with or without viscous and heat-conducting dissipation, from the conservation laws of fluid dynamics using a shock-fitting implicit finite-difference technique. The governing equations are cast in curvilinear-orthogonal coordinates and transformed to the domain between the shock and the body. Another set of equations is used for the singular coordinate axis, which, together with a cone generator away from the stagnation point, encloses the computation domain. A time-dependent alternating direction implicit factorization technique is applied to integrate the equations with local-time increments until a steady solution is reached. The shock location is updated after the flowfield computation, but the wall conditions are implemented into the implicit procedure. Innovative procedures are introduced to define the initial flowfield, to treat both perfect and equilibrium gases, to advance the solution on a coarse-to-fine grid sequence, and to start viscous flow computations from their corresponding inviscid solutions. The results are obtained from a grid no greater than 28 by 18 by 7 and converge within 300 integration steps. They are of sufficient accuracy to start parabolized Navier-Stokes or Euler calculations beyond the nose region, to compare with flight and wind-tunnel data, and to evaluate conceptual designs of reentry spacecraft.

  3. 3-dimensional (orthogonal) structural complexity of time-series data using low-order moment analysis

    NASA Astrophysics Data System (ADS)

    Law, Victor J.; O'Neill, Feidhlim T.; Dowling, Denis P.

    2012-09-01

    The recording of atmospheric pressure plasma (APP) electro-acoustic emission data has been developed as a plasma metrology tool in the last couple of years. Industrial applications include surface activation of polymers prior to bonding in the automotive and aerospace industries [1, 2, and 3]. It has been shown that as the APP jet proceeds over a treatment surface at various fixed heights, two contrasting acoustic signatures are produced which correspond to two very different plasma-surface entropy states (blow arc, ~1700 ± 100 K; afterglow, ~300-400 K) [4]. The metrology challenge is now to capture deterministic data points within data clusters. For this to be achieved, new real-time data cluster measurement techniques need to be developed [5]. The cluster information must be extracted within the allotted process time period if real-time process control is to be achieved. This abstract describes a theoretical structural complexity analysis (in terms of crossing points) of 2- and 3-dimensional line graphs that contain time-series data. In addition, a LabVIEW implementation of the 3-dimensional data analysis is performed. It is also shown that the cluster analysis technique can be transferred to other (non-acoustic) datasets.
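
    In the 2-dimensional case, the crossing-point count between two time-series line graphs sampled on the same x grid reduces to counting sign changes of their difference. A minimal sketch (the 3-dimensional orthogonal extension is not reproduced here):

        import numpy as np

        def crossing_points(y1, y2):
            # Count sign changes of (y1 - y2) between consecutive samples.
            d = np.asarray(y1, float) - np.asarray(y2, float)
            return int(np.sum(d[:-1] * d[1:] < 0))

        t = np.linspace(0.0, 1.0, 500)
        print(crossing_points(np.sin(20 * t), np.full_like(t, 0.3)))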

  4. A Modular Computer Code for Simulating Reactive Multi-Species Transport in 3-Dimensional Groundwater Systems

    SciTech Connect

    TP Clement

    1999-06-24

    RT3DV1 (Reactive Transport in 3-Dimensions) is a computer code that solves the coupled partial differential equations that describe reactive flow and transport of multiple mobile and/or immobile species in three-dimensional saturated groundwater systems. RT3D is a generalized multi-species version of the US Environmental Protection Agency (EPA) transport code, MT3D (Zheng, 1990). The current version of RT3D uses the advection and dispersion solvers from the DOD-1.5 (1997) version of MT3D. As with MT3D, RT3D also requires the groundwater flow code MODFLOW for computing spatial and temporal variations in groundwater head distribution. The RT3D code was originally developed to support the contaminant transport modeling efforts at natural attenuation demonstration sites. As a research tool, RT3D has also been used to model several laboratory and pilot-scale active bioremediation experiments. The performance of RT3D has been validated by comparing the code results against various numerical and analytical solutions. The code is currently being used to model field-scale natural attenuation at multiple sites. The RT3D code is unique in that it includes an implicit reaction solver that makes the code sufficiently flexible for simulating various types of chemical and microbial reaction kinetics. RT3D V1.0 supports seven pre-programmed reaction modules that can be used to simulate different types of reactive contaminants including benzene-toluene-xylene mixtures (BTEX) and chlorinated solvents such as tetrachloroethene (PCE) and trichloroethene (TCE). In addition, RT3D has a user-defined reaction option that can be used to simulate any other types of user-specified reactive transport systems. This report describes the mathematical details of the RT3D computer code and its input/output data structure. It is assumed that the user is familiar with the basics of groundwater flow and contaminant transport mechanics. In addition, RT3D users are expected to have some experience in
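
    The user-defined reaction option amounts to supplying a kinetic rate expression for each species. As an illustration of the style of kinetics involved (not RT3D's actual FORTRAN interface), a sequential first-order dechlorination chain PCE -> TCE with made-up rate constants:

        from scipy.integrate import solve_ivp

        K_PCE, K_TCE = 0.05, 0.02  # assumed first-order rates, 1/day

        def rhs(t, c):
            # c = [PCE, TCE]; PCE decays to TCE, TCE to further daughters.
            pce, tce = c
            return [-K_PCE * pce,
                    K_PCE * pce - K_TCE * tce]

        sol = solve_ivp(rhs, (0.0, 365.0), [1.0, 0.0])
        pce_1yr, tce_1yr = sol.y[:, -1]  # relative concentrations after one year
        print(pce_1yr, tce_1yr)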

  5. Effect of decompression on cystic lesions of the mandible: 3-dimensional volumetric analysis.

    PubMed

    Song, I S; Park, H S; Seo, B M; Lee, J H; Kim, M J

    2015-11-01

    Decompression is effective in reducing both the size of cystic lesions on jaws and the associated morbidity of resection. However, quantitative measurement of reduced volume after decompression among different cystic diseases has not been fully investigated. We have retrospectively investigated the difference in reduction in volume among keratocystic odontogenic tumours (n=17), unicystic ameloblastomas (n=10), and dentigerous cysts (n=10) of the posterior mandible using 3-dimensional computed tomography (CT). Various other influential factors such as age, sex, the presence of impacted teeth, and the number of drains were also recorded. There was no significant difference in the speed of shrinkage among the 3 groups, but there was a significant correlation (p<0.01) between the initial detected volume of the lesion and the absolute speed of shrinkage in each type of cyst. Initial volume was also significantly associated (p<0.01) with reduction of total volume in each type of cyst. Age may correlate negatively with the rate of reduction in dentigerous cysts, which means that the older the patient is, the less the reduction. Treatment seemed to last longer as the speed of shrinkage lessened in the keratocystic tumours and dentigerous cysts (p<0.05) as multiple regression has shown. The relative speed of shrinkage of unicystic ameloblastomas seemed to be slower when an impacted tooth was involved in the lesion (p=0.019). However, the sample size was too small to make any definite statistical statement. These results suggest that the rate of reduction of volume was related to the original size of the lesion. Despite the need for a second operation and longer duration of treatment compared with excision alone, decompression is a valuable way of reducing the size of large cystic lesions, with low morbidity and recurrence rate. There was no difference in the rate of reduction according to the underlying histopathological picture. PMID:26212420

  6. Scene-of-crime analysis by a 3-dimensional optical digitizer: a useful perspective for forensic science.

    PubMed

    Sansoni, Giovanna; Cattaneo, Cristina; Trebeschi, Marco; Gibelli, Daniele; Poppa, Pasquale; Porta, Davide; Maldarella, Monica; Picozzi, Massimo

    2011-09-01

    Analysis and detailed registration of the crime scene are of the utmost importance during investigations. However, this phase of activity is often affected by the risk of loss of evidence due to the limits of traditional scene of crime registration methods (ie, photos and videos). This technical note shows the utility of the application of a 3-dimensional optical digitizer on different crime scenes. This study aims in fact at verifying the importance and feasibility of contactless 3-dimensional reconstruction and modeling by optical digitization to achieve an optimal registration of the crime scene. PMID:21811148

  7. Comparison of 1-, 2-, and 3-dimensional modeling of the TFTR for nuclear radiation transport analysis

    SciTech Connect

    Ku, L.P.; Kolibal, J.G.; Liew, S.L.

    1985-09-01

    The computational models of the TFTR constructed for the radiation transport analysis for the Q approx. 1 demonstration are summarized and reviewed. These models can be characterized by the dimensionality required to describe the geometry, and by the numerical methods of solving the transport equation. Results obtained with these models in the test cell are compared and discussed.

  8. Morphological analysis and preoperative simulation of a double-chambered right ventricle using 3-dimensional printing technology.

    PubMed

    Shirakawa, Takashi; Koyama, Yasushi; Mizoguchi, Hiroki; Yoshitatsu, Masao

    2016-05-01

    We present a case of a double-chambered right ventricle in adulthood, in which we tried a detailed morphological assessment and preoperative simulation using 3-dimensional (3D) heart models for improved surgical planning. Polygonal object data for the heart were constructed from computed tomography images of this patient, and transferred to a desktop 3D printer to print out models in actual size. Medical staff completed all of the work processes. Because the 3D heart models were examined by hand, observed from various viewpoints and measured by callipers with ease, we were able to create an image of the complete form of the heart. The anatomical structure of an anomalous bundle was clearly observed, and surgical approaches to the lesion were simulated accurately. During surgery, we used an incision on the pulmonary infundibulum and resected three muscular components of the stenosis. The similarity between the models and the actual heart was excellent. As a result, the operation for this rare defect was performed safely and successfully. We concluded that the custom-made model was useful for morphological analysis and preoperative simulation. PMID:26860990

  9. Input generator for Denton 3-dimensional turbomachine-blade-row analysis code

    NASA Technical Reports Server (NTRS)

    Glassman, A. J.; Wood, J. R.

    1983-01-01

    A user's manual is presented for a computer program that prepares the bulk of the input data set required for the Denton three dimensional turbomachine blade row analysis code. The Denton input is generated from a minimum of geometry and flow variable information by using cubic spline curve fitting procedures. The features of the program are discussed. The input is described, and special instructions are included to assist in its preparation. Sample input and output are included.
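
    The core curve-fitting idea is straightforward: densify a sparse geometry definition with cubic splines so that a complete analysis input can be generated from minimal information. A sketch with made-up blade-section coordinates (not the Denton code's input format):

        import numpy as np
        from scipy.interpolate import CubicSpline

        x = np.array([0.0, 0.2, 0.5, 0.8, 1.0])     # sparse chordwise stations
        y = np.array([0.0, 0.06, 0.08, 0.04, 0.0])  # section thickness at each
        spline = CubicSpline(x, y)
        x_dense = np.linspace(0.0, 1.0, 101)        # dense stations for the solver
        y_dense = spline(x_dense)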

  10. User's manual for master: Modeling of aerodynamic surfaces by 3-dimensional explicit representation. [input to three dimensional computational fluid dynamics]

    NASA Technical Reports Server (NTRS)

    Gibson, S. G.

    1983-01-01

    A system of computer programs was developed to model general three dimensional surfaces. Surfaces are modeled as sets of parametric bicubic patches. There are also capabilities to transform coordinates, to compute mesh/surface intersection normals, and to format input data for a transonic potential flow analysis. A graphical display of surface models and intersection normals is available. There are additional capabilities to regulate point spacing on input curves and to compute surface/surface intersection curves. Input and output data formats are described; detailed suggestions are given for user input. Instructions for execution are given, and examples are shown.

  11. Tracking Error analysis of Concentrator Photovoltaic Module Using Total 3-Dimensional Simulator

    NASA Astrophysics Data System (ADS)

    Ota, Yasuyuki; Nishioka, Kensuke

    2011-12-01

    A 3-dimensional (3D) operating simulator for a concentrator photovoltaic (CPV) module using a triple-junction solar cell was developed. By connecting a 3D equivalent-circuit simulation for the triple-junction solar cell with a ray-trace simulation for the optics model, the operating characteristics of the CPV module were calculated. A typical flat Fresnel lens and homogenizer were applied to the optics model. The influence of tracking error on the performance of the CPV module was calculated. There was a correlation between the optical efficiency and Isc; however, Pm was not correlated with these values and was strongly dependent on FF. This total simulator can be used for evaluation and optimization of CPV modules, from light incidence through to operating characteristics.

  12. Computational Fluid Dynamics of Intracranial and Extracranial Arteries using 3-Dimensional Angiography: Technical Considerations with Physician's Point of View

    PubMed Central

    Yoon, Kyunghwan; Ko, Young Bae; Suh, Dae Chul

    2013-01-01

    We investigate the potential and limitations of computational fluid dynamics (CFD) analysis of patient-specific models from 3D angiographies. There are many technical problems in the acquisition of proper vascular models, in pre-processing for making 2D surface and 3D volume meshes, and in post-processing steps for displaying the CFD analysis. We hope that our study can serve as a technical reference for validating other tools and CFD results. PMID:24024073

  13. Biomechanical 3-Dimensional Finite Element Analysis of Obturator Protheses Retained with Zygomatic and Dental Implants in Maxillary Defects

    PubMed Central

    Akay, Canan; Yaluğ, Suat

    2015-01-01

    Background The objective of this study was to investigate the stress distribution in the bone around zygomatic and dental implants for 3 different implant-retained obturator prosthesis designs in an Aramany class IV maxillary defect using 3-dimensional finite element analysis (FEA). Material/Methods A 3-dimensional finite element model of an Aramany class IV defect was created. Three different implant-retained obturator prostheses were modeled: model 1 with 1 zygomatic implant and 1 dental implant, model 2 with 1 zygomatic implant and 2 dental implants, and model 3 with 2 zygomatic implants. Locator attachments were used as a superstructure. A 150-N load was applied 3 different ways. Qualitative analysis was based on the scale of maximum principal stress; values obtained through quantitative analysis are expressed in MPa. Results In all loading conditions, model 3 (compared with models 1 and 2) showed the lowest maximum principal stress value. Model 3 is the most appropriate reconstruction in Aramany class IV maxillary defects. Two zygomatic implants can reduce the stresses in model 3. The distribution of stresses on the prostheses was more rational with the help of zygoma implants, which can distribute the stresses to each part of the maxilla. Conclusions For Aramany class IV obturator prostheses, placement of 2 zygomatic implants in each side of the maxilla is more advantageous than placement of dental implants. On the non-defective side, increasing the number of dental implants is not as suitable as using zygomatic implants. PMID:25714086

  14. BOPACE 3-D (the Boeing Plastic Analysis Capability for 3-dimensional Solids Using Isoparametric Finite Elements)

    NASA Technical Reports Server (NTRS)

    Vos, R. G.; Straayer, J. W.

    1975-01-01

    BOPACE 3-D is a finite element computer program that provides a general family of three-dimensional isoparametric solid elements and includes a new algorithm for improving the efficiency of the elastic-plastic-creep solution procedure. Theoretical, user, and programmer oriented sections are presented to describe the program.

  15. Hydroelectric structures studies using 3-dimensional methods

    SciTech Connect

    Harrell, T.R.; Jones, G.V.; Toner, C.K.

    1989-01-01

    Deterioration and degradation of aged hydroelectric project structures can significantly affect the operation and safety of a project. In many cases, hydroelectric headworks (in particular) have complicated geometrical configurations, loading patterns, and hence stress conditions. An accurate study of such structures can be performed using 3-dimensional computer models. 3-D computer models can be used both for stability evaluation and for finite element stress analysis. Computer-aided engineering processes facilitate the use of 3-D methods in both pre-processing and post-processing of data. Two actual project examples are used to emphasize the authors' points.

  16. Hybrid-finite-element analysis of some nonlinear and 3-dimensional problems of engineering fracture mechanics

    NASA Technical Reports Server (NTRS)

    Atluri, S. N.; Nakagaki, M.; Kathiresan, K.

    1980-01-01

    In this paper, efficient numerical methods for the analysis of crack-closure effects on fatigue-crack-growth-rates, in plane stress situations, and for the solution of stress-intensity factors for arbitrary shaped surface flaws in pressure vessels, are presented. For the former problem, an elastic-plastic finite element procedure valid for the case of finite deformation gradients is developed and crack growth is simulated by the translation of near-crack-tip elements with embedded plastic singularities. For the latter problem, an embedded-elastic-singularity hybrid finite element method, which leads to a direct evaluation of K-factors, is employed.

  17. 3-dimensional microscope analysis of bone and tooth surface modifications: comparisons of fossil specimens and replicas.

    PubMed

    Bello, Silvia M; Verveniotou, Efstratia; Cornish, Lorraine; Parfitt, Simon A

    2011-01-01

    Cut-marks on fossil bones and teeth are an important source of evidence in the reconstruction of ancient butchery practices. The analysis of butchery marks has allowed archaeologists to interpret aspects of past subsistence strategies and the behavior of early humans. Recent advances in optical scanning microscopy allow detailed measurements of cut-mark morphology to be undertaken. An example of this technology is the Alicona 3D InfiniteFocus imaging microscope, which has been applied recently to the study of surface modifications on bones and teeth. Three-dimensional models generated by the Alicona microscope have been used to identify cross-sectional features of experimental cut-marks that are characteristic for specific cutting actions (e.g., slicing, chopping, scraping) and different tool types (e.g., metal versus stone tools). More recently, this technology has been applied successfully to the analysis of ∼500,000 year-old cut-marked animal bones from Boxgrove (U.K.), as well as cannibalized human bones from Gough's Cave (U.K.) dated to ∼14,700 cal BP. This article describes molding methods used to replicate fragile prehistoric bones and teeth, where image quality was adversely affected by specimen translucency and reflectivity. Alicona images generated from molds and casts are often of better quality than those of the original specimen. PMID:21660994

  18. Development of a liquid jet model for implementation in a 3-dimensional Eulerian analysis tool

    NASA Astrophysics Data System (ADS)

    Buschman, Francis X., III

    The ability to model the thermal behavior of a nuclear reactor is of utmost importance to the reactor designer. Condensation is an important phenomenon when modeling a reactor system's response to a Loss Of Coolant Accident (LOCA). Condensation is even more important with the use of passive safety systems, which rely on condensation heat transfer for long term cooling. The increasing use of condensation heat transfer in safety systems, including condensation on jets of water, puts added pressure on thermal-hydraulic system and sub-channel analysis codes to model this phenomenon correctly. In this work, a stand-alone module with which to simulate condensation on a liquid jet was developed and then implemented within a reactor vessel analysis code to improve that code's handling of jet condensation. It is shown that the developed liquid jet model vastly improves the ability of COBRA-TF to model condensation on turbulent liquid jets. The stand-alone jet model and the coupled liquid jet COBRA-TF have been compared to experimental data. Jet condensation heat transfer experiments by Celata et al. with a variety of jet diameters, velocities, and subcooling were utilized to evaluate the models. A sensitivity study on the effects of noncondensables on jet condensation was also carried out using the stand-alone jet model.

  19. All-on-4 concept: a 3-dimensional finite element analysis.

    PubMed

    Sannino, Gianpaolo

    2015-04-01

    The aim of this work was to study the biomechanical behavior of an All-on-4 implant-supported prosthesis through a finite element analysis comparing 3 different tilt degrees of the distal implants. Three-dimensional finite element models of an edentulous maxilla restored with a prosthesis supported by 4 implants were reconstructed to carry out the analysis. Three distinct configurations, corresponding to 3 tilt degrees of the distal implants (15°, 30°, and 45°) were subjected to 4 loading simulations. The von Mises stresses generated around the implants were localized and quantified for comparison. Negligible differences in von Mises stress values were found in the comparison of the 15° and 30° models. From a stress-level viewpoint, the 45° model was revealed to be the most critical for peri-implant bone. In all the loading simulations, the maximum stress values were always found at the neck of the distal implants. The stress in the distal implants increased in the apical direction as the tilt degree increased. The stress location and distribution patterns were very similar among the evaluated models. The increase in the tilt degree of the distal implants was proportional to the increase in stress concentration. The 45° model induced higher stress values at the bone-implant interface, especially in the distal aspect, than the other 2 models analyzed. PMID:23560570

  20. Error analysis of a direct current electromagnetic tracking system in digitizing 3-dimensional surface geometries.

    PubMed

    Milne, A D; Lee, J M

    1999-01-01

    The direct current electromagnetic tracking device has seen increasing use in biomechanics studies of joint kinematics and anatomical surface geometry. In these applications, a stylus is attached to a sensor to measure the spatial location of three-dimensional landmarks. Stylus calibration is performed by rotating the stylus about a fixed point in space and using regression analysis to determine the tip offset vector. Measurement errors can be induced via several pathways, including intrinsic system errors in sensor position or angle and tip-offset calibration errors. A detailed study was performed to determine the errors introduced in digitizing small surfaces with different stylus lengths (35, 55, and 65 mm) and approach angles (30 and 45 degrees) using a plastic calibration board and hemispherical models. Two-point discrimination errors increased to an average of 1.93 mm for a 254 mm step size. Rotation about a single point produced mean errors of 0.44 to 1.18 mm. Statistically significant differences in error were observed with increasing approach angles (p < 0.001). Errors of less than 6% were observed in determining the curvature of a 19 mm hemisphere. This study demonstrates that the "Flock of Birds" can be used as a digitizing tool with accuracy better than 0.76% over 254 mm step sizes. PMID:11143353
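
    The stylus calibration described, pivoting about a fixed point and regressing for the tip offset, is a linear least-squares problem: each recorded sensor pose (R_i, p_i) satisfies R_i t_tip + p_i = p_pivot. A sketch of that solve (pose data assumed already collected; this is not necessarily the exact regression used in the paper):

        import numpy as np

        def pivot_calibration(rotations, positions):
            # Stack R_i t_tip - p_pivot = -p_i into one least-squares system.
            n = len(rotations)
            A = np.zeros((3 * n, 6))
            b = np.zeros(3 * n)
            for i, (R, p) in enumerate(zip(rotations, positions)):
                A[3 * i:3 * i + 3, :3] = R
                A[3 * i:3 * i + 3, 3:] = -np.eye(3)
                b[3 * i:3 * i + 3] = -p
            x, *_ = np.linalg.lstsq(A, b, rcond=None)
            return x[:3], x[3:]  # tip offset (sensor frame), pivot point (world)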

  1. Solution of Poisson equations for 3-dimensional grid generations. [computations of a flow field over a thin delta wing

    NASA Technical Reports Server (NTRS)

    Fujii, K.

    1983-01-01

    A method for generating three dimensional, finite difference grids about complicated geometries by using Poisson equations is developed. The inhomogeneous terms are automatically chosen such that orthogonality and spacing restrictions at the body surface are satisfied. Spherical variables are used to avoid the axis singularity, and an alternating-direction-implicit (ADI) solution scheme is used to accelerate the computations. Computed results are presented that show the capability of the method. Since most of the results presented have been used as grids for flow-field computations, this indicates that the method is a useful tool for generating three-dimensional grids about complicated geometries.
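
    In the homogeneous special case (inhomogeneous terms set to zero), elliptic grid generation reduces to Laplace smoothing of the interior node coordinates with the boundary nodes held fixed; the method above adds the automatically chosen control terms, spherical variables, and an ADI solver. A minimal 2-D sketch of the Laplace core only:

        import numpy as np

        def laplace_grid(x, y, iters=500):
            # Relax interior grid-node coordinates; boundary nodes stay fixed.
            for _ in range(iters):
                x[1:-1, 1:-1] = 0.25 * (x[2:, 1:-1] + x[:-2, 1:-1] +
                                        x[1:-1, 2:] + x[1:-1, :-2])
                y[1:-1, 1:-1] = 0.25 * (y[2:, 1:-1] + y[:-2, 1:-1] +
                                        y[1:-1, 2:] + y[1:-1, :-2])
            return x, y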

  2. Meta-analysis of incidence of early lung toxicity in 3-dimensional conformal irradiation of breast carcinomas

    PubMed Central

    2013-01-01

    Background This meta-analysis aims to ascertain the significance of early lung toxicity with 3-dimensional (3D) conformal irradiation for breast carcinomas and identify the sub-groups of patients with increased risk. Methods Electronic databases, reference sections of major oncological textbooks, and identified studies were searched for synonyms of breast radiotherapy and radiation pneumonitis (RP). Major studies in thoracic irradiation were reviewed to identify factors frequently associated with RP. Meta-analysis for RP incidence estimation and odds ratio calculation was carried out. Results The overall incidence of clinical and radiological RP is 14% and 42%, respectively. Ten studies were identified. Dose-volume histogram (DVH)-related dosimetric factors (volume of lung receiving a certain dose, Vdose, and mean lung dose, MLD), supraclavicular fossa (SCF) irradiation, and age are significantly associated with RP, but not sequential chemotherapy or concomitant use of tamoxifen. A poorly powered study in the IMN group contributed to the negative finding. Smoking has a trend towards a protective effect against RP. Conclusion Use of other modalities may be considered when ipsilateral lung V20Gy > 30% or MLD > 15 Gy. Extra caution is needed in SCF and IMN irradiation as they are likely to influence these dosimetric parameters. PMID:24229418

  3. Evaluation of Temperature and Stress Distribution on 2 Different Post Systems Using 3-Dimensional Finite Element Analysis

    PubMed Central

    Değer, Yalçın; Adigüzel, Özkan; Özer, Senem Yiğit; Kaya, Sadullah; Polat, Zelal Seyfioğlu; Bozyel, Bejna

    2015-01-01

    Background The mouth is exposed to thermal irritation from hot and cold food and drinks. Thermal changes in the oral cavity produce expansions and contractions in tooth structures and restorative materials. The aim of this study was to investigate the effect of temperature and stress distribution on 2 different post systems using the 3-dimensional (3D) finite element method. Material/Methods The 3D finite element model shows a labio-lingual cross-sectional view of the endodontically treated upper right central incisor and supporting periodontal ligament with bone structures. Stainless steel and glass fiber post systems with different physical and thermal properties were modelled in the tooth restored with composite core and ceramic crown. We applied a 100-N static vertical occlusal load to the center of the incisal surface of the tooth. Thermal loads of 0°C and 65°C were applied on the model for 5 s. Temperature and thermal stresses were determined on the labio-lingual section of the model at 6 different points. Results The distribution of stress, including thermal stress values, was calculated using 3D finite element analysis. The stainless steel post system produced higher temperatures and thermal stresses on the restorative materials, tooth structures, and posts than did the glass fiber-reinforced composite posts. Conclusions Thermal changes generated stresses in the restorative materials, tooth, and supporting structures. PMID:26615495

  4. Cost-Effectiveness Analysis of Intensity Modulated Radiation Therapy Versus 3-Dimensional Conformal Radiation Therapy for Anal Cancer

    SciTech Connect

    Hodges, Joseph C.; Beg, Muhammad S.; Das, Prajnan; Meyer, Jeffrey

    2014-07-15

    Purpose: To compare the cost-effectiveness of intensity modulated radiation therapy (IMRT) and 3-dimensional conformal radiation therapy (3D-CRT) for anal cancer and determine disease, patient, and treatment parameters that influence the result. Methods and Materials: A Markov decision model was designed with the various disease states for the base case of a 65-year-old patient with anal cancer treated with either IMRT or 3D-CRT and concurrent chemotherapy. Health states accounting for rates of local failure, colostomy failure, treatment breaks, patient prognosis, acute and late toxicities, and the utility of toxicities were informed by existing literature and analyzed with deterministic and probabilistic sensitivity analysis. Results: In the base case, mean costs and quality-adjusted life expectancy in years (QALY) for IMRT and 3D-CRT were $32,291 (4.81) and $28,444 (4.78), respectively, resulting in an incremental cost-effectiveness ratio of $128,233/QALY for IMRT compared with 3D-CRT. Probabilistic sensitivity analysis found that IMRT was cost-effective in 22%, 47%, and 65% of iterations at willingness-to-pay thresholds of $50,000, $100,000, and $150,000 per QALY, respectively. Conclusions: In our base model, IMRT was a cost-ineffective strategy despite the reduced acute treatment toxicities and their associated costs of management. The model outcome was sensitive to variations in local and colostomy failure rates, as well as patient-reported utilities relating to acute toxicities.
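
    The base-case result quoted above follows from the standard incremental cost-effectiveness ratio formula; the minimal Python sketch below reproduces the arithmetic using the costs and QALYs reported in the abstract (the function name is ours).

        def icer(cost_new, qaly_new, cost_old, qaly_old):
            """Incremental cost per quality-adjusted life year gained."""
            return (cost_new - cost_old) / (qaly_new - qaly_old)

        # Base-case values from the abstract: $32,291 (4.81 QALY) for IMRT
        # versus $28,444 (4.78 QALY) for 3D-CRT.
        print(round(icer(32291, 4.81, 28444, 4.78)))   # ~128,233 $/QALY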

  5. Analysis of shape and motion of the mitral annulus in subjects with and without cardiomyopathy by echocardiographic 3-dimensional reconstruction

    NASA Technical Reports Server (NTRS)

    Flachskampf, F. A.; Chandra, S.; Gaddipatti, A.; Levine, R. A.; Weyman, A. E.; Ameling, W.; Hanrath, P.; Thomas, J. D.

    2000-01-01

    The shape and dynamics of the mitral annulus of 10 patients without heart disease (controls), 3 patients with dilated cardiomyopathy, and 5 patients with hypertrophic obstructive cardiomyopathy and normal systolic function were analyzed by transesophageal echocardiography and 3-dimensional reconstruction. Mitral annular orifice area, apico-basal motion of the annulus, and nonplanarity were calculated over time. Annular area was largest in end diastole and smallest in end systole. Mean areas were 11.8 +/- 2.5 cm² (controls), 15.2 +/- 4.2 cm² (dilated cardiomyopathy), and 10.2 +/- 2.4 cm² (hypertrophic cardiomyopathy) (P = not significant). After correction for body surface, annuli from patients with normal left ventricular function were smaller than annuli from patients with dilated cardiomyopathy (5.9 +/- 1.2 cm²/m² vs 7.7 +/- 1.0 cm²/m²; P <.02). The change in area during the cardiac cycle showed significant differences: 23.8% +/- 5.1% (controls), 13.2% +/- 2.3% (dilated cardiomyopathy), and 32.4% +/- 7.6% (hypertrophic cardiomyopathy) (P <.001). Apico-basal motion was highest in controls, followed by those with hypertrophic obstructive and dilated cardiomyopathy (1.0 +/- 0.3 cm, 0.8 +/- 0.2 cm, 0.3 +/- 0.2 cm, respectively; P <.01). Visual inspection and Fourier analysis showed a consistent pattern of anteroseptal and posterolateral elevations of the annulus toward the left atrium. In conclusion, although area changes and apico-basal motion of the mitral annulus strongly depend on left ventricular systolic function, nonplanarity is a structural feature preserved throughout the cardiac cycle in all three groups.

  6. Hydrogel Based 3-Dimensional (3D) System for Toxicity and High-Throughput (HTP) Analysis for Cultured Murine Ovarian Follicles

    PubMed Central

    Zhou, Hong; Malik, Malika Amattullah; Arab, Aarthi; Hill, Matthew Thomas; Shikanov, Ariella

    2015-01-01

    Various toxicants, drugs, and their metabolites carry potential ovarian toxicity. Ovarian follicles, the functional unit of the ovary, are susceptible to this type of damage at all stages of their development. However, despite the large scale of potential negative impacts, assays that study ovarian toxicity are limited. Exposure of cultured ovarian follicles to toxicants of interest has served as an important tool for evaluation of toxic effects for decades. Mouse follicles cultured on the bottom of a culture dish continue to serve as an important approach for mechanistic studies. In this paper, we demonstrate the usefulness of a hydrogel-based 3-dimensional (3D) mouse ovarian follicle culture as a tool to study ovarian toxicity in a different setup. The 3D in vitro culture, based on a fibrin-alginate interpenetrating network (FA-IPN), preserves the architecture of the ovarian follicle and the physiological structure-function relationship. We applied the novel 3D high-throughput (HTP) in vitro ovarian follicle culture system to study the ovotoxic effects of an anti-cancer drug, doxorubicin (DXR). The fibrin component in the system is degraded by plasmin and appears as a clear circle around the encapsulated follicle. The degradation area around the follicle is strongly correlated with follicle survival and growth. To analyze fibrin degradation in a high-throughput manner, we created a custom MATLAB® code that converts brightfield micrographs of follicles encapsulated in FA-IPN to binary images, followed by image analysis. We did not observe any significant difference between the manually processed images and the automated MATLAB® method, thereby confirming that the automated program is suitable for measuring fibrin degradation to evaluate follicle health. The cultured follicles were treated with DXR at concentrations ranging from 0.005 nM to 200 nM, corresponding to the therapeutic plasma levels of DXR in patients. Follicles treated with DXR demonstrated decreased survival rate in
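
    As a hedged Python stand-in for the custom MATLAB® routine described above (assuming scikit-image and a synthetic image in place of real micrographs), the sketch below thresholds a brightfield-like image to a binary mask and measures the cleared area.

        import numpy as np
        from skimage.filters import threshold_otsu

        rng = np.random.default_rng(0)
        img = rng.normal(0.3, 0.05, (256, 256))                  # dark background
        yy, xx = np.mgrid[:256, :256]
        img[(yy - 128) ** 2 + (xx - 128) ** 2 < 60 ** 2] += 0.4  # bright cleared zone

        t = threshold_otsu(img)          # global threshold, as a binarization step
        binary = img > t
        print(f"degraded area: {int(binary.sum())} px")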

  8. Evolution of the 3-dimensional video system for facial motion analysis: ten years' experiences and recent developments.

    PubMed

    Tzou, Chieh-Han John; Pona, Igor; Placheta, Eva; Hold, Alina; Michaelidou, Maria; Artner, Nicole; Kropatsch, Walter; Gerber, Hans; Frey, Manfred

    2012-08-01

    Since the implementation of the computer-aided system for assessing facial palsy in 1999 by Frey et al (Plast Reconstr Surg. 1999;104:2032-2039), no similar system that can make an objective, three-dimensional, quantitative analysis of facial movements has been marketed. This system has been in routine use since its launch, and it has proven to be reliable, clinically applicable, and therapeutically accurate. With the cooperation of international partners, more than 200 patients were analyzed. Recent developments in computer vision, mostly in the areas of generative face models, active appearance models (and extensions), optical flow, and video tracking, have been successfully incorporated to automate the prototype system. Further market-ready development and a business partner will be needed to bring the system into production, enhancing diagnostic and prognostic accuracy as part of a personalized therapy concept and leading to better results and higher quality of life for patients with impaired facial function. PMID:21734549

  9. A 3 dimensional assessment of the depth of tumor invasion in microinvasive tongue squamous cell carcinoma - A case series analysis

    PubMed Central

    Amit-Byatnal, Aditi; Natarajan, Jayalakshmi; Shenoy, Satish; Kamath, Asha; Hunter, Keith

    2015-01-01

    Background Accurate assessment of the depth of tumor invasion (DI) in microinvasive squamous cell carcinoma (MISCC) of the tongue is critical to prognosis. An arithmetic model was generated to determine a reliable method of measuring DI and to correlate this with local recurrence. Material and Methods Tumor thickness (TT) and DI were measured in tissue sections of 14 cases of MISCC of the tongue, by manual ocular micrometer and by digital image analysis, at four reference points (A, B, C, and D). The comparison of TT and DI with relevant clinicopathologic parameters was assessed using the Mann-Whitney U test. The reliability of these methods and the values obtained were compared and correlated with tumor recurrence by the Wilcoxon signed ranks test. 3D reconstruction of the lesion was done on a Cartesian coordinate system, with the X face on the YZ plane and the Z face on the XY plane. Results The computer-generated 3D model of the oral mucosa in the four cases that recurred showed increased DI in the Z coordinate compared to the XY coordinate. The median DI measurements between the XY and Z coordinates in these cases showed no significant difference (Wilcoxon signed ranks test, p = 0.068). Conclusions The assessment of DI in 3 dimensions is critical for accurate assessment of MISCC, and precise DI allows complete removal of the tumor. Key words: Depth of invasion, tumor thickness, microinvasive squamous cell carcinoma, tongue squamous cell carcinoma. PMID:26449426

  10. Control point analysis comparison for 3 different treatment planning and delivery complexity levels using a commercial 3-dimensional diode array.

    PubMed

    Abdellatif, Ady; Gaede, Stewart

    2014-01-01

    To investigate the use of "Control Point Analysis" (Sun Nuclear Corporation, Melbourne, FL) to analyze and compare delivered volumetric-modulated arc therapy (VMAT) plans for 3 different treatment planning complexity levels. A total of 30 patients were chosen and fully anonymized for the purpose of this study. Overall, 10 lung stereotactic body radiotherapy (SBRT), 10 head-and-neck (H&N), and 10 prostate VMAT plans were generated on Pinnacle³ and delivered on a Varian linear accelerator (LINAC). The delivered dose was measured using ArcCHECK (Sun Nuclear Corporation, Melbourne, FL). Each plan was analyzed using "Sun Nuclear Corporation (SNC) Patient 6" and "Control Point Analysis." Gamma passing percentage was used to assess the differences between the measured and planned dose distributions and to assess the role of various control point binning combinations. Of the different sites considered, the prostate cases reported the highest gamma passing percentages calculated with "SNC Patient 6" (97.5% to 99.2% for the 3%, 3-mm criteria) and "Control Point Analysis" (95.4% to 98.3% for the 3%, 3-mm criteria). The mean percentage of passing control point sectors for the prostate cases increased from 51.8 ± 7.8% for individual control points to 70.6 ± 10.5% for 5 control points binned together to 87.8 ± 11.0% for 10 control points binned together (2%, 2-mm passing criteria). Overall, there was an increasing trend in the percentage of sectors passing gamma analysis with an increase in the number of control points binned together in a sector for both gamma passing criteria (2%, 2 mm and 3%, 3 mm). Although many plans passed the clinical quality assurance criteria, plans involving the delivery of high Monitor Units (MU)/control point (SBRT) and plans involving a high degree of modulation (H&N) showed less delivery accuracy per control point compared with plans with low MU/control point and a low degree of modulation (prostate). PMID:24480374
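
    As a hedged sketch of the control-point binning idea, the fragment below groups simulated per-control-point gamma pass rates into sectors of k control points and reports the fraction of sectors meeting a criterion; all numbers are invented, and the averaging rule is our simplification of the commercial software's behavior.

        import numpy as np

        rng = np.random.default_rng(1)
        pass_rate = rng.uniform(0.80, 1.00, size=180)   # one value per control point

        def sector_pass_fraction(rates, k, criterion=0.85):
            """Bin k control points per sector; a sector passes if its mean
            pass rate meets the criterion."""
            n = len(rates) // k * k
            sectors = rates[:n].reshape(-1, k).mean(axis=1)
            return (sectors >= criterion).mean()

        # Averaging over larger sectors reduces variance, reproducing the
        # increasing trend with bin size reported in the record.
        for k in (1, 5, 10):
            print(k, f"{100 * sector_pass_fraction(pass_rate, k):.1f}% of sectors pass")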

  12. The Effects of Different Miniscrew Thread Designs and Force Directions on Stress Distribution by 3-dimensional Finite Element Analysis

    PubMed Central

    Fattahi, Hamidreza; Ajami, Shabnam; Nabavizadeh Rafsanjani, Ali

    2015-01-01

    Statement of the Problem The use of miniscrews as absolute anchorage devices in clinical orthodontics is growing steadily. Many attempts have been made to reduce the size, improve the design, and increase the stability of miniscrews. Purpose The purpose of this study was to determine the effects of different thread shapes and force directions of an orthodontic miniscrew on stress distribution in the supporting bone structure. Materials and Method A three-dimensional finite element analysis was used. A 200-cN force at three angles (0°, 45°, and 90°) was applied to the head of the miniscrew. The stress distribution among twelve thread shapes was investigated, categorized in four main groups: buttress, reverse buttress, square, and V-shape. Results Stress distribution was not significantly different among the thread shapes. The maximum bone stresses at force angles of 0°, 45°, and 90° were 38.90, 30.57, and 6.62 MPa, respectively. Analysis of the von Mises stress values showed that in all models the maximum stress was concentrated on the smallest diameter of the shank, especially the part in the soft tissue and cervical cortical bone regions. Conclusion There was no relation between thread shape and von Mises stress distribution in the bone; however, different force angles affected the von Mises stress in both the bone and the miniscrew. PMID:26636123

  13. Computer analysis of arteriograms

    NASA Technical Reports Server (NTRS)

    Selzer, R. H.; Armstrong, J. H.; Beckenbach, E. B.; Blankenhorn, D. H.; Crawford, D. W.; Brooks, S. H.; Sanmarco, M. E.

    1977-01-01

    A computer system has been developed to quantify the degree of atherosclerosis in the human femoral artery. The analysis involves first scanning and digitizing angiographic film, then tracking the outline of the arterial image and finally computing the relative amount of roughness or irregularity in the vessel wall. The image processing system and method are described.
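
    As a hedged sketch of the roughness measure described, the fragment below tracks a synthetic vessel-edge profile and quantifies irregularity as the RMS deviation from a smoothed version of the same edge; the smoothing window and edge profile are invented.

        import numpy as np

        rng = np.random.default_rng(5)
        s = np.linspace(0, 1, 200)                    # position along the vessel
        edge = 5.0 + 0.3 * np.sin(6 * np.pi * s) + rng.normal(0, 0.05, s.size)

        kernel = np.ones(15) / 15.0                   # moving-average smoother
        smooth = np.convolve(edge, kernel, mode="same")
        # drop 7 samples at each end, where the smoother is edge-affected
        roughness = np.sqrt(np.mean((edge - smooth)[7:-7] ** 2))
        print(f"relative roughness: {roughness / smooth.mean():.4f}")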

  14. Computational engine structural analysis

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Johns, R. H.

    1986-01-01

    A significant research activity at the NASA Lewis Research Center is the computational simulation of complex multidisciplinary engine structural problems. This simulation is performed using computational engine structural analysis (CESA), which consists of integrated multidisciplinary computer codes in conjunction with computer post-processing for problem-specific applications. A variety of the computational simulations of specific cases are described in some detail in this paper. These case studies include: (1) aeroelastic behavior of bladed rotors, (2) high velocity impact of fan blades, (3) blade-loss transient response, (4) rotor/stator/squeeze-film/bearing interaction, (5) blade-fragment/rotor-burst containment, and (6) structural behavior of advanced swept turboprops. These representative case studies are selected to demonstrate the breadth of the problems analyzed and the role of the computer, including post-processing and graphical display of voluminous output data.

  15. Frontal soft tissue analysis using a 3 dimensional camera following two-jaw rotational orthognathic surgery in skeletal class III patients.

    PubMed

    Choi, Jong Woo; Lee, Jang Yeol; Oh, Tae-Suk; Kwon, Soon Man; Yang, Sung Joon; Koh, Kyung Suk

    2014-04-01

    Although two-dimensional cephalometry is the standard method for analyzing the results of orthognathic surgery, it has potential limits in frontal soft tissue analysis. We have utilized a 3-dimensional camera to examine changes in soft tissue landmarks in patients with skeletal class III dentofacial deformity who underwent two-jaw rotational setback surgery. We assessed 25 consecutive Asian patients (mean age, 22 years; range, 17-32 years) with skeletal class III dentofacial deformities who underwent two-jaw rotational surgery without maxillary advancement. Using a 3D camera, we analyzed changes in facial proportions, including vertical and horizontal dimensions, facial surface areas, nose profile, lip contour, and soft tissue cheek convexity, as well as landmarks related to facial symmetry. The average mandibular setback was 10.7 mm (range: 5-17 mm). The average SNA changed from 77.4° to 77.8°, the average SNB from 89.2° to 81.1°, and the average occlusal plane from 8.7° to 11.4°. The mid-third vertical dimension changed from 58.8 mm to 57.8 mm (p = 0.059), and the lower-third vertical dimension changed from 70.4 mm to 68.2 mm (p = 0.0006). The average bigonial width decreased from 113.5 mm to 109.2 mm (p = 0.0028), the alar width increased from 34.7 mm to 36.1 mm (p = 0.0002), and lip length was unchanged. Mean mid and lower facial surface areas decreased significantly, from 171.8 cm² to 166.2 cm² (p = 0.026) and from 71.23 cm² to 61.9 cm² (p < 0.0001), respectively. Cheek convexity increased significantly, with the angle changing from 171.8° to 155.9° (p = 0.0007). The 3D camera was effective in frontal soft tissue analysis for orthognathic surgery, and enabled quantitative analysis of changes in frontal soft tissue landmarks and facial proportions that were not possible with conventional 2D cephalometric analysis. PMID:23870714

  16. Comparative Analysis of Visitors' Experiences and Knowledge Acquisition between a 3Dimensional Online and a Real-World Art Museum Tour

    ERIC Educational Resources Information Center

    D' Alba, Adriana; Jones, Greg; Wright, Robert

    2015-01-01

    This paper discusses a study conducted in the fall of 2011 and the spring of 2012, which explored the use of existing 3D virtual environment technologies by bringing a selected permanent museum exhibit, displayed at a museum located in central Mexico, into an online 3-dimensional experience. Using mixed methods, the research study analyzed knowledge…

  17. Computation of synthetic seismograms in a 3 dimensional Earth and inversion of eigenfrequency and Q quality factor datasets of normal modes

    NASA Astrophysics Data System (ADS)

    Roch, Julien; Clevede, Eric; Roult, Genevieve

    2010-05-01

    The 26 December 2004 Sumatra-Andaman event is the third biggest earthquake ever recorded, but the first recorded with high-quality broad-band seismometers. Such an earthquake offered a good opportunity for studying the normal modes of the Earth, and particularly the gravest ones (frequency lower than 1 mHz), which provide important information on the deep Earth. The splitting of some modes has been carefully analyzed. The eigenfrequencies and the Q quality factors of particular singlets have been retrieved with unprecedented precision. In some cases, the eigenfrequencies of some singlets exhibit a clear shift when compared to the theoretical eigenfrequencies. Some core modes, such as the 3S2 mode, present an anomalous splitting, that is to say, a splitting width much larger than the expected one. Such anomalous splitting is presently attributed to the existence of lateral heterogeneities in the inner core. We need an accurate model of the whole Earth and a method to compute synthetic seismograms in order to compare synthetic and observed data and to explain the behavior of such modes. Synthetic seismograms are computed by normal-mode summation using a perturbative method developed up to second order in amplitude and up to third order in frequency (HOPT method). The last step consists in inverting both the eigenfrequency and Q quality factor datasets in order to better constrain the deep Earth structure, and especially the inner core. In order to find models of acceptable data fit in a multidimensional parameter space, we use the neighborhood algorithm method, which is a derivative-free search method. It is particularly well adapted to our case (a nonlinear problem) and is easy to tune, with only 2 parameters. Our purpose is to find an ensemble of models that fit the data rather than a unique model.
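
    As a hedged sketch of the synthetic-seismogram step, the fragment below sums normal modes as damped cosines, with each mode's decay set by its Q quality factor; the mode parameters are illustrative inventions, and the perturbative HOPT corrections are omitted.

        import numpy as np

        modes = [  # invented (frequency in mHz, Q, relative amplitude) triplets
            (0.309, 5300.0, 1.0),
            (0.469, 830.0, 0.6),
            (0.814, 360.0, 0.4),
        ]

        t = np.arange(0.0, 200 * 3600.0, 60.0)   # 200-h record, 1-min sampling
        s = np.zeros_like(t)
        for f_mhz, q, a in modes:
            f = f_mhz * 1e-3                     # Hz
            # each mode decays with amplitude rate pi * f / Q
            s += a * np.cos(2 * np.pi * f * t) * np.exp(-np.pi * f * t / q)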

  18. 3-dimensional magnetotelluric inversion including topography using deformed hexahedral edge finite elements and direct solvers parallelized on symmetric multiprocessor computers - Part II: direct data-space inverse solution

    NASA Astrophysics Data System (ADS)

    Kordy, M.; Wannamaker, P.; Maris, V.; Cherkaev, E.; Hill, G.

    2016-01-01

    Following the creation described in Part I of a deformable edge finite-element simulator for 3-D magnetotelluric (MT) responses using direct solvers, in Part II we develop an algorithm named HexMT for 3-D regularized inversion of MT data including topography. Direct solvers parallelized on large-RAM, symmetric multiprocessor (SMP) workstations are used also for the Gauss-Newton model update. By exploiting the data-space approach, the computational cost of the model update becomes much less in both time and computer memory than the cost of the forward simulation. In order to regularize using the second norm of the gradient, we factor the matrix related to the regularization term and apply its inverse to the Jacobian, which is done using the MKL PARDISO library. For dense matrix multiplication and factorization related to the model update, we use the PLASMA library which shows very good scalability across processor cores. A synthetic test inversion using a simple hill model shows that including topography can be important; in this case depression of the electric field by the hill can cause false conductors at depth or mask the presence of resistive structure. With a simple model of two buried bricks, a uniform spatial weighting for the norm of model smoothing recovered more accurate locations for the tomographic images compared to weightings which were a function of parameter Jacobians. We implement joint inversion for static distortion matrices tested using the Dublin secret model 2, for which we are able to reduce nRMS to ~1.1 while avoiding oscillatory convergence. Finally we test the code on field data by inverting full impedance and tipper MT responses collected around Mount St Helens in the Cascade volcanic chain. Among several prominent structures, the north-south trending, eruption-controlling shear zone is clearly imaged in the inversion.
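
    As a hedged sketch of the data-space economy the record exploits, the numpy fragment below performs one Gauss-Newton model update by solving an N x N data-space system instead of an M x M model-space system; the Jacobian, regularization matrix, and residual are random stand-ins.

        import numpy as np

        rng = np.random.default_rng(2)
        n_data, n_model = 50, 500
        J = rng.normal(size=(n_data, n_model))   # Jacobian (data sensitivities)
        R = np.eye(n_model)                      # regularization matrix (identity here)
        r = rng.normal(size=n_data)              # data residual d_obs - d_pred
        lam = 1.0                                # regularization trade-off

        # Data-space step: solve an n_data x n_data system, then map back,
        # instead of factoring an n_model x n_model normal-equation matrix.
        Rinv_Jt = np.linalg.solve(R, J.T)                        # R^-1 J^T
        beta = np.linalg.solve(J @ Rinv_Jt + lam * np.eye(n_data), r)
        dm = Rinv_Jt @ beta                                      # model update
        print(dm.shape)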

  19. Distinction of Green Sweet Peppers by Using Various Color Space Models and Computation of 3 Dimensional Location Coordinates of Recognized Green Sweet Peppers Based on Parallel Stereovision System

    NASA Astrophysics Data System (ADS)

    Bachche, Shivaji; Oka, Koichi

    2013-06-01

    This paper presents a comparative study of various color space models to determine a suitable color space model for the detection of green sweet peppers. Images were captured using CCD cameras and infrared cameras and processed using the Halcon image processing software. An LED ring around the camera neck was used as artificial lighting to enhance the feature parameters. For color images, the CIELab, YIQ, YUV, HSI, and HSV color space models were selected for image processing, and for infrared images the grayscale color space. For color images, the HSV color space model gave the most significant results, with a high percentage of green sweet pepper detection, followed by the HSI color space model, as both provide information in terms of hue/lightness/chroma or hue/lightness/saturation, which is often more relevant for discriminating the fruit from the image at a specific threshold value. Overlapped fruits or fruits covered by leaves were detected better with the HSV color space model, as the reflection feature from fruits had a higher histogram value than the reflection feature from leaves. The IR 80 optical filter failed to distinguish fruits in the images, as the filter blocks useful feature information. Computation of the 3D coordinates of recognized green sweet peppers was also conducted, in which the Halcon image processing software provided the location and orientation of the fruits accurately. The depth accuracy along the Z axis was examined, and a camera-to-fruit distance of 500 to 600 mm was found suitable for computing the depth distance precisely when the distance between the two cameras was maintained at 100 mm.
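
    As a hedged sketch of the winning HSV approach, the fragment below converts a synthetic RGB image to HSV and thresholds hue, saturation, and value to segment green pixels; the image and threshold band are placeholders rather than the study's calibrated values.

        import numpy as np
        from skimage.color import rgb2hsv

        img = np.zeros((64, 64, 3))
        img[16:48, 16:48] = (0.2, 0.6, 0.2)   # a "green pepper" patch
        img[:16, :] = (0.4, 0.5, 0.2)         # leaf-like background strip

        hsv = rgb2hsv(img)
        h, s, v = hsv[..., 0], hsv[..., 1], hsv[..., 2]
        # Green hues sit near h ~ 1/3; also require some saturation/brightness.
        mask = (np.abs(h - 1 / 3) < 0.10) & (s > 0.3) & (v > 0.2)
        print(f"{int(mask.sum())} green pixels detected")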

  20. General design method for 3-dimensional, potential flow fields. Part 2: Computer program DIN3D1 for simple, unbranched ducts

    NASA Technical Reports Server (NTRS)

    Stanitz, J. D.

    1985-01-01

    The general design method for three-dimensional, potential, incompressible or subsonic-compressible flow developed in part 1 of this report is applied to the design of simple, unbranched ducts. A computer program, DIN3D1, is developed and five numerical examples are presented: a nozzle, two elbows, an S-duct, and the preliminary design of a side inlet for turbomachines. The two major inputs to the program are the upstream boundary shape and the lateral velocity distribution on the duct wall. As a result of these inputs, boundary conditions are overprescribed and the problem is ill posed. However, it appears that there are degrees of compatibility between these two major inputs and that, for reasonably compatible inputs, satisfactory solutions can be obtained. By not prescribing the shape of the upstream boundary, the problem presumably becomes well posed, but it is not clear how to formulate a practical design method under this circumstance. Nor does it appear desirable, because the designer usually needs to retain control over the upstream (or downstream) boundary shape. The problem is further complicated by the fact that, unlike the two-dimensional case, and irrespective of the upstream boundary shape, some prescribed lateral velocity distributions do not have proper solutions.

  1. Evaluation of the middle cerebral artery occlusion techniques in the rat by in-vitro 3-dimensional micro- and nano computed tomography

    PubMed Central

    2010-01-01

    Background Animal models of focal cerebral ischemia are widely used in stroke research. The purpose of our study was to evaluate and compare the cerebral macro- and microvascular architecture of rats in two different models of permanent middle cerebral artery occlusion using an innovative quantitative micro- and nano-CT imaging technique. Methods Four hours of middle cerebral artery occlusion was performed in rats using the macrosphere method or the suture technique. After contrast perfusion, brains were isolated and scanned en bloc using micro-CT at (8 μm)³ or nano-CT at (500 nm)³ voxel size to generate 3D images of the cerebral vasculature. The arterial vascular volume fraction and gray-scale attenuation were determined, and the significance of differences in measurements was tested with analysis of variance [ANOVA]. Results Micro-CT provided quantitative information on vascular morphology. Micro- and nano-CT proved able to visualize and differentiate the vascular occlusion territories produced in both models of cerebral ischemia. The suture technique leads to a remarkable decrease in the intravascular volume fraction of the middle cerebral artery perfusion territory. When the middle cerebral artery was blocked with macrospheres, the vascular volume fraction of the involved hemisphere decreased significantly (p < 0.001), independently of the number of macrospheres, and was comparable to the suture method. We established gray-scale measurements by which focal cerebral ischemia could be radiographically categorized (p < 0.001). Nano-CT imaging demonstrates collateral perfusion related to different occluded vessel territories after macrosphere perfusion. Conclusion Micro- and nano-CT imaging is feasible for analysis and differentiation of different models of focal cerebral ischemia in rats. PMID:20509884
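
    As a hedged sketch of the volume-fraction measurement, the fragment below thresholds a synthetic contrast-enhanced volume to a binary vessel mask and reports vascular voxels as a fraction of the scanned volume; the attenuation values and threshold are invented.

        import numpy as np

        rng = np.random.default_rng(4)
        ct = rng.normal(100.0, 10.0, size=(64, 64, 64))   # background attenuation
        ct[20:30, 20:30, :] += 150.0                      # a contrast-filled "vessel"

        vessel_mask = ct > 180.0                          # assumed vessel threshold
        vascular_volume_fraction = vessel_mask.mean()
        print(f"vascular volume fraction: {vascular_volume_fraction:.4f}")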

  2. A Multi-Center Study Comparing Shunt Type in the Norwood Procedure for Single-Ventricle Lesions: 3-Dimensional Echocardiographic Analysis

    PubMed Central

    Marx, Gerald R.; Shirali, Girish; Levine, Jami C.; Guey, Lin T.; Cnota, James F.; Baffa, Jeanne M.; Border, William L.; Colan, Steve; Ensing, Gregory; Friedberg, Mark K.; Goldberg, David J.; Idriss, Salim F.; John, J. Blaine; Lai, Wyman W.; Lu, Minmin; Menon, Shaji C.; Ohye, Richard G.; Saudek, David; Wong, Pierre C.; Pearson, Gail D.

    2013-01-01

    Background The Pediatric Heart Network's (PHN) Single Ventricle Reconstruction (SVR) Trial randomized infants with single right ventricles (RV) undergoing a Norwood procedure to a modified Blalock-Taussig or RV-to-pulmonary artery shunt. This report compares RV parameters in the two groups using 3-dimensional echocardiography (3DE). Methods and Results 3DE studies were obtained at 10 of 15 SVR centers. Of the 549 subjects, 314 underwent 3DE studies at one to four time points (pre-Norwood, post-Norwood, pre-stage II, and 14 months), for a total of 757 3DEs. Of these, 565 (75%) were acceptable for analysis. RV volume, mass, mass:volume ratio, ejection fraction (EF), and severity of tricuspid regurgitation did not differ by shunt type. RV volumes and mass did not change after the Norwood, but increased from pre-Norwood to pre-stage II (end-diastolic volume [EDV, ml]/body surface area [BSA]^1.3, end-systolic volume [ESV, ml]/BSA^1.3, and mass [g]/BSA^1.3 mean differences [95% confidence intervals] = 25.0 [8.7, 41.3], 19.3 [8.3, 30.4], and 17.9 [7.3, 28.5]), then decreased by 14 months (EDV/BSA^1.3, ESV/BSA^1.3, and mass/BSA^1.3 mean differences [95% confidence intervals] = −24.4 [−35.0, −13.7], −9.8 [−17.9, −1.7], and −15.3 [−22.0, −8.6]). EF decreased from pre-Norwood to pre-stage II (mean difference [95% confidence interval] = −3.7% [−6.9%, −0.5%]), but did not decrease further by 14 months. Conclusions We found no statistically significant differences between study groups in 3DE measures of RV size and function, or in the magnitude of tricuspid regurgitation. Volume unloading was seen after stage II, as expected, but EF did not improve. This study provides insights into the remodeling of the operated univentricular RV in infancy. Clinical Trial Registration URL: http://www.clinicaltrials.gov. Unique identifier: NCT00115934. PMID:24097422

  3. Reconstruction 3-dimensional image from 2-dimensional image of status optical coherence tomography (OCT) for analysis of changes in retinal thickness

    SciTech Connect

    Arinilhaq,; Widita, Rena

    2014-09-30

    Optical coherence tomography (OCT) is often used in medical image acquisition for diagnosing retinal changes because it is easy to use and low in price. Unfortunately, this type of examination produces a two-dimensional retinal image at the point of acquisition. Therefore, this study developed a method that combines and reconstructs 2-dimensional retinal images into three-dimensional images to display the macular volume accurately. The system is built with three main stages: data acquisition, data extraction, and 3-dimensional reconstruction. At the data acquisition step, optical coherence tomography produced six *.jpg images for each patient, which were further extracted with the MATLAB 2010a software into six one-dimensional arrays. The six arrays are combined into a 3-dimensional matrix using a kriging interpolation method with SURFER 9, resulting in 3-dimensional graphics of the macula. Finally, the system provides three-dimensional color graphs based on the data distribution of a normal macula. The reconstruction system produces three-dimensional images with a size of 481 × 481 × h (retinal thickness) pixels.
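
    As a hedged sketch of the reconstruction step, the fragment below interpolates scattered retinal-thickness samples onto a 481 × 481 grid; scipy's linear griddata stands in for the kriging performed with SURFER 9, and the sample data are simulated.

        import numpy as np
        from scipy.interpolate import griddata

        rng = np.random.default_rng(3)
        pts = rng.uniform(0, 480, size=(600, 2))      # scattered (x, y) sample sites
        thick = 250 + 30 * np.sin(pts[:, 0] / 80) * np.cos(pts[:, 1] / 80)

        gx, gy = np.mgrid[0:481, 0:481]
        surface = griddata(pts, thick, (gx, gy), method="linear")
        # surface[x, y] holds retinal thickness h on a 481 x 481 grid, matching
        # the 481 x 481 x h volume described; cells outside the convex hull of
        # the samples remain NaN.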

  4. Incorporating 3-dimensional models in online articles

    PubMed Central

    Cevidanes, Lucia H. S.; Ruellas, Antonio C. O.; Jomier, Julien; Nguyen, Tung; Pieper, Steve; Budin, Francois; Styner, Martin; Paniagua, Beatriz

    2015-01-01

    Introduction The aims of this article were to introduce the capability to view and interact with 3-dimensional (3D) surface models in online publications, and to describe how to prepare surface models for such online 3D visualizations. Methods Three-dimensional image analysis methods include image acquisition, construction of surface models, registration in a common coordinate system, visualization of overlays, and quantification of changes. Cone-beam computed tomography scans were acquired as volumetric images that can be visualized as 3D projected images or used to construct polygonal meshes or surfaces of specific anatomic structures of interest. The anatomic structures of interest in the scans can be labeled with color (3D volumetric label maps), and then the scans are registered in a common coordinate system using a target region as the reference. The registered 3D volumetric label maps can be saved in .obj, .ply, .stl, or .vtk file formats and used for overlays, quantification of differences in each of the 3 planes of space, or color-coded graphic displays of 3D surface distances. Results All registered 3D surface models in this study were saved in .vtk file format and loaded in the Elsevier 3D viewer. In this study, we describe possible ways to visualize the surface models constructed from cone-beam computed tomography images using 2D and 3D figures. The 3D surface models are available in the article’s online version for viewing and downloading using the reader’s software of choice. These 3D graphic displays are represented in the print version as 2D snapshots. Overlays and color-coded distance maps can be displayed using the reader’s software of choice, allowing graphic assessment of the location and direction of changes or morphologic differences relative to the structure of reference. The interpretation of 3D overlays and quantitative color-coded maps requires basic knowledge of 3D image analysis. Conclusions When submitting manuscripts, authors can

  5. Computational Analysis of Behavior.

    PubMed

    Egnor, S E Roian; Branson, Kristin

    2016-07-01

    In this review, we discuss the emerging field of computational behavioral analysis: the use of modern methods from computer science and engineering to quantitatively measure animal behavior. We discuss aspects of experiment design important to both obtaining biologically relevant behavioral data and enabling the use of machine vision and learning techniques for automation. These two goals are often in conflict. Restraining or restricting the environment of the animal can simplify automatic behavior quantification, but it can also degrade the quality or alter important aspects of behavior. To enable biologists to design experiments to obtain better behavioral measurements, and computer scientists to pinpoint fruitful directions for algorithm improvement, we review known effects of artificial manipulation of the animal on behavior. We also review machine vision and learning techniques for tracking, feature extraction, automated behavior classification, and automated behavior discovery, the assumptions they make, and the types of data they work best with. PMID:27090952

  6. A 3-DIMENSIONAL MATRIX ASSAY THAT MAY HELP PREDICT TREATMENT RESPONSE TO TEMOZOLOMIDE IN PATIENTS WITH GLIOBLASTOMA: SUBGROUP ANALYSIS OF PATIENTS UNDERGOING MGMT TESTING

    PubMed Central

    Megyesi, Joseph F.; Costello, Penny; McDonald, Warren; Macdonald, David; Easaw, Jay

    2014-01-01

    BACKGROUND: (blind field). METHODS: Records of patients treated for newly diagnosed or recurrent glioblastoma were analyzed. All patients had undergone surgical resection, and tumor specimens taken at the time of surgery were available for culture in a 3-dimensional matrix assay and observed for growth and invasion. Drug effects on mean invasion and growth were expressed as a ratio relative to control conditions. Length of survival was compared between temozolomide-treated patients whose screening results had predicted a positive or negative response to temozolomide. The MGMT status of a subgroup of these patients was analyzed and correlated with the response of tumor tissue in the assay to temozolomide. RESULTS: Fifty-eight patients with glioblastoma were assessed. Each patient's tumor displayed a unique invasion and response profile. We looked in particular at the correlation between the outcome of a patient with glioblastoma treated with temozolomide and the response of that patient's tumor tissue to temozolomide in the 3-dimensional assay. Mean survival time for patients whose tumors were not significantly sensitive to temozolomide in the assay was 181.7 +/- 43 days. Mean survival time for patients whose tumors were significantly sensitive to temozolomide in the assay was 290.0 +/- 33 days. Twelve patients underwent MGMT testing. In 10 of the 12 patients there was a correlation between tumor response in the assay and MGMT status. CONCLUSIONS: The 3-dimensional assay may help predict which glioblastoma patients will show a treatment response to temozolomide. There appears to be a positive correlation between the response profiles in the assay and the MGMT status of the patient's tumor. SECONDARY CATEGORY: n/a.

  7. Design of 3-dimensional complex airplane configurations with specified pressure distribution via optimization

    NASA Technical Reports Server (NTRS)

    Kubrynski, Krzysztof

    1991-01-01

    A subcritical panel method applied to flow analysis and aerodynamic design of complex aircraft configurations is presented. The analysis method is based on linearized, compressible, subsonic flow equations and indirect Dirichlet boundary conditions. Quadratic dipole and linear source distributions on flat panels are applied. In the case of aerodynamic design, the geometry that minimizes the differences between the design and actual pressure distributions is found iteratively, using a numerical optimization technique. Geometry modifications are modeled by the surface transpiration concept. Constraints on the resulting geometry can be specified. A number of complex 3-dimensional design examples are presented. The software has been adapted to personal computers, resulting in unexpectedly low computation costs.
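
    As a hedged sketch of designing geometry to match a specified pressure distribution, the fragment below fits two shape-mode amplitudes by least squares against a target Cp curve; the panel solver is replaced by a toy linear surrogate, and all names are invented.

        import numpy as np
        from scipy.optimize import least_squares

        x = np.linspace(0.0, 1.0, 50)            # chordwise stations
        target_cp = -0.5 * np.sin(np.pi * x)     # designer-specified pressures

        def panel_solver(geom):
            # toy surrogate: pressure responds linearly to two shape modes
            return geom[0] * np.sin(np.pi * x) + geom[1] * np.sin(2 * np.pi * x)

        def residual(geom):
            return panel_solver(geom) - target_cp

        fit = least_squares(residual, x0=np.zeros(2))
        print(fit.x)                             # recovered shape-mode amplitudes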

  8. Quantitative analysis of aortic regurgitation: real-time 3-dimensional and 2-dimensional color Doppler echocardiographic method--a clinical and a chronic animal study

    NASA Technical Reports Server (NTRS)

    Shiota, Takahiro; Jones, Michael; Tsujino, Hiroyuki; Qin, Jian Xin; Zetts, Arthur D.; Greenberg, Neil L.; Cardon, Lisa A.; Panza, Julio A.; Thomas, James D.

    2002-01-01

    BACKGROUND: For evaluating patients with aortic regurgitation (AR), regurgitant volumes, left ventricular (LV) stroke volumes (SV), and absolute LV volumes are valuable indices. AIM: The aim of this study was to validate the combination of real-time 3-dimensional echocardiography (3DE) and semiautomated digital color Doppler cardiac flow measurement (ACM) for quantifying absolute LV volumes, LVSV, and AR volumes using an animal model of chronic AR and to investigate its clinical applicability. METHODS: In 8 sheep, a total of 26 hemodynamic states were obtained pharmacologically 20 weeks after the aortic valve noncoronary (n = 4) or right coronary (n = 4) leaflet was incised to produce AR. Reference standard LVSV and AR volume were determined using the electromagnetic flow method (EM). Simultaneous epicardial real-time 3DE studies were performed to obtain LV end-diastolic volumes (LVEDV), end-systolic volumes (LVESV), and LVSV by subtracting LVESV from LVEDV. Simultaneous ACM was performed to obtain LVSV and transmitral flows; AR volume was calculated by subtracting transmitral flow volume from LVSV. In a total of 19 patients with AR, real-time 3DE and ACM were used to obtain LVSVs and these were compared with each other. RESULTS: A strong relationship was found between LVSV derived from EM and those from the real-time 3DE (r = 0.93, P <.001, mean difference (3D - EM) = -1.0 +/- 9.8 mL). A good relationship between LVSV and AR volumes derived from EM and those by ACM was found (r = 0.88, P <.001). A good relationship between LVSV derived from real-time 3DE and that from ACM was observed (r = 0.73, P <.01, mean difference = 2.5 +/- 7.9 mL). In patients, a good relationship between LVSV obtained by real-time 3DE and ACM was found (r = 0.90, P <.001, mean difference = 0.6 +/- 9.8 mL). CONCLUSION: The combination of ACM and real-time 3DE for quantifying LV volumes, LVSV, and AR volumes was validated by the chronic animal study and was shown to be clinically applicable.
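
    The volume arithmetic validated here is simple; as a hedged sketch (with invented values), stroke volume is end-diastolic minus end-systolic volume from real-time 3DE, and regurgitant volume is stroke volume minus the ACM transmitral flow volume.

        def stroke_volume(lvedv_ml: float, lvesv_ml: float) -> float:
            """LV stroke volume from real-time 3DE: end-diastolic minus
            end-systolic volume."""
            return lvedv_ml - lvesv_ml

        def ar_volume(lvsv_ml: float, transmitral_ml: float) -> float:
            """Aortic regurgitant volume: stroke volume minus forward
            (transmitral) flow volume from ACM."""
            return lvsv_ml - transmitral_ml

        sv = stroke_volume(120.0, 55.0)   # 65 mL (invented values)
        print(ar_volume(sv, 40.0))        # 25 mL regurgitant volume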

  9. Teleportation of a 3-dimensional GHZ State

    NASA Astrophysics Data System (ADS)

    Cao, Hai-Jing; Wang, Huai-Sheng; Li, Peng-Fei; Song, He-Shan

    2012-05-01

    The process of teleportation of a completely unknown 3-dimensional GHZ state is considered. Three maximally entangled 3-dimensional Bell states serve as the quantum channel in the scheme. This teleportation scheme can be directly generalized to teleport an unknown d-dimensional GHZ state.

  10. 3-dimensional Oil Drift Simulations

    NASA Astrophysics Data System (ADS)

    Wettre, C.; Reistad, M.; Hjøllo, B.Å.

    Simulation of oil drift has been an ongoing activity at the Norwegian Meteorological Institute since the 1970s. The Marine Forecasting Centre provides a 24-hour service for the Norwegian Pollution Control Authority and the oil companies operating in the Norwegian sector, with a response time of 30 minutes. From 2002, the service was extended to simulate oil drift from spills in deep water, using the DeepBlow model developed by SINTEF Applied Chemistry. The oil drift model can be applied to both instantaneous and continuous releases. The changes in the mass of oil and emulsion as a result of evaporation and emulsification are computed. For oil spills in deep water, hydrate formation and gas dissolution are taken into account. The properties of the oil depend on the oil type, and in the present version 64 different types of oil can be simulated. For accurate oil drift simulations it is important to have the best possible data on the atmospheric and oceanic conditions. The oil drift simulations at the Norwegian Meteorological Institute are therefore always based on the most updated data from numerical models of the atmosphere and the ocean. The drift of the surface oil is computed from the vectorial sum of the surface current from the ocean model and the wave-induced Stokes drift computed from wave energy spectra from the wave prediction model. In the new model, the current distribution with depth is taken into account when calculating the drift of the dispersed oil droplets. Salinity and temperature profiles from the ocean model are needed in the DeepBlow model. The results of the oil drift simulations can be plotted on sea charts used for navigation, either as trajectory plots or particle plots showing the situation at a given time. The results can also be sent as data files to be included in the user's own GIS system.
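
    As a hedged sketch of the surface-drift update described, the fragment below advects an oil particle with the vector sum of the ocean-model surface current and the wave-induced Stokes drift; the constant fields and forward-Euler stepping are simplifications.

        import numpy as np

        def advect(pos, current, stokes, dt):
            """One forward-Euler step of surface oil drift (positions in m)."""
            return pos + (current + stokes) * dt

        pos = np.array([0.0, 0.0])
        current = np.array([0.20, 0.05])   # m/s, surface current from the ocean model
        stokes = np.array([0.03, 0.04])    # m/s, Stokes drift from the wave spectra
        for _ in range(24):                # 24 hourly steps
            pos = advect(pos, current, stokes, 3600.0)
        print(pos)                         # net displacement after 24 h (m)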

  11. Sensitivity analysis in computational aerodynamics

    NASA Technical Reports Server (NTRS)

    Bristow, D. R.

    1984-01-01

    Information on sensitivity analysis in computational aerodynamics is given in outline, graphical, and chart form. The prediction accuracy of the MCAERO program, a perturbation analysis method, is discussed. A procedure for calculating the perturbation matrix, baseline wing paneling for perturbation analysis test cases, and applications of an inviscid sensitivity matrix are among the topics covered.

  12. Computed tomography-based finite element analysis to assess fracture risk and osteoporosis treatment

    PubMed Central

    Imai, Kazuhiro

    2015-01-01

    Finite element analysis (FEA) is a computer technique for structural stress analysis developed in engineering mechanics. Over the past 40 years, FEA has been developed to investigate the structural behavior of human bones. As faster computers became available, better FEA using 3-dimensional computed tomography (CT) data was developed. This CT-based finite element analysis (CT/FEA) has provided clinicians with useful data. In this review, the mechanism of CT/FEA, validation studies of CT/FEA assessing its accuracy and reliability in human bones, and clinical application studies assessing fracture risk and the effects of osteoporosis medication are overviewed. PMID:26309819
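
    As a hedged sketch of one step common to CT/FEA pipelines, the fragment below maps CT Hounsfield units to apparent density and then to an element Young's modulus through a power law; the calibration constants are typical literature-style placeholders, not values endorsed by this review.

        import numpy as np

        def hu_to_density(hu):
            """Linear HU-to-apparent-density calibration (g/cm^3); slope and
            offset are assumed, not from the review."""
            return 0.0008 * hu + 0.8

        def density_to_modulus(rho, a=6850.0, b=1.49):
            """Power-law density-to-modulus relation E = a * rho^b (MPa);
            the constants are placeholders."""
            return a * np.power(rho, b)

        hu = np.array([200.0, 600.0, 1200.0])      # per-element CT values
        print(density_to_modulus(hu_to_density(hu)))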

  13. Effect of Foot Hyperpronation on Lumbar Lordosis and Thoracic Kyphosis in Standing Position Using 3-Dimensional Ultrasound-Based Motion Analysis System

    PubMed Central

    Farokhmanesh, Khatere; Shirzadian, Toraj; Mahboubi, Mohammad; Shahri, Mina Neyakan

    2014-01-01

    Based on clinical observations, foot hyperpronation is very common. Excessive pronation (hyperpronation) can cause malalignment of the lower extremities, which most often leads to functional and structural deficits. The aim of this study was to assess the effect of foot hyperpronation on lumbar lordosis and thoracic kyphosis. Thirty-five healthy subjects (age range, 18-30 years) were asked to stand in 4 positions: on a flat surface (normal position) and on wedges angled at 10, 15, and 20 degrees. Sampling was done using simple random sampling. Measurements were made with a motion analysis system. For data analysis, the SPSS software (ver. 18) was used, applying the paired t-test and repeated measures analysis of variance (ANOVA). The eversion created by the wedges caused a significant increase in lumbar lordosis and thoracic kyphosis. The most significant change occurred between the two consecutive positions of the flat surface and the first wedge. The t-test for repeated measures showed a high correlation between each pair of consecutive positions. The results showed that with increased bilateral foot pronation, lumbar lordosis and thoracic kyphosis increased as well. In fact, each of these results is a compensation phenomenon. Further studies are required to determine the long-term results of excessive foot pronation and its probable effect on damage progression. PMID:25169004

  14. Impact of the bifurcation angle on major cardiac events after cross-over single stent strategy in unprotected left main bifurcation lesions: 3-dimensional quantitative coronary angiographic analysis

    PubMed Central

    Amemiya, Kisaki; Domei, Takenori; Iwabuchi, Masashi; Shirai, Shinichi; Ando, Kenji; Goya, Masahiko; Yokoi, Hiroyoshi; Nobuyoshi, Masakiyo

    2014-01-01

    The impact of the bifurcation angle (BA) between the left main (LM) coronary artery and the main branch on clinical outcomes after single stenting has never been documented. Therefore, the aim of this study was to investigate the impact of the BA on clinical outcomes after single cross-over LM to left anterior descending artery (LAD) stenting. A total of 170 patients who underwent percutaneous coronary intervention (PCI) in unprotected LM bifurcations with successful single cross-over stenting from the LM into the LAD were enrolled. The main vessel angle between the LM and the LAD was computed in end-diastole before PCI with three-dimensional (3D) quantitative coronary angiography (QCA) software. The patients were classified into three groups according to tertiles of the main vessel angle. The cumulative incidences of major adverse cardiac events (MACE: cardiac death, myocardial infarction, and any revascularization including target lesion revascularization) throughout a 12-month period were compared between the three groups. Baseline patient characteristics did not differ significantly between the three groups. Compared to the high-angle group, the low-angle group had a significantly higher incidence of MACE (p = 0.041). In conclusion, this study revealed that a low BA between the LM and the LAD had an adverse clinical impact after single cross-over LM to LAD stenting. PMID:25628958

  15. Computer analysis of railcar vibrations

    NASA Technical Reports Server (NTRS)

    Vlaminck, R. R.

    1975-01-01

    Computer models and techniques for calculating railcar vibrations are discussed along with criteria for vehicle ride optimization. The effects on vibration of car body structural dynamics, suspension system parameters, vehicle geometry, and wheel and rail excitation are presented. Ride quality vibration data collected on the state-of-the-art car and a standard light rail vehicle are compared to computer predictions. The results show that computer analysis of the vehicle can be performed for relatively low cost in short periods of time. The analysis permits optimization of the design as it progresses and minimizes the possibility of excessive vibration on production vehicles.
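
    As a hedged sketch of the kind of model such an analysis uses, the fragment below integrates a one degree-of-freedom car body on its suspension, base-excited by a sinusoidal rail irregularity; all parameters are invented.

        import numpy as np
        from scipy.integrate import solve_ivp

        m, c, k = 20000.0, 3.0e4, 5.0e5        # body mass (kg), damping, stiffness
        A, w = 0.005, 2 * np.pi * 2.0          # 5-mm rail irregularity at 2 Hz

        def rhs(t, y):
            x, v = y
            x_rail = A * np.sin(w * t)         # base (wheel/rail) displacement
            v_rail = A * w * np.cos(w * t)
            a = (c * (v_rail - v) + k * (x_rail - x)) / m
            return [v, a]

        sol = solve_ivp(rhs, (0.0, 10.0), [0.0, 0.0], max_step=0.01)
        print(float(np.abs(sol.y[0]).max()))   # peak car-body displacement (m)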

  16. A 3-Dimensional Analysis of the Galactic Gamma-Ray Emission Resulting from Cosmic-Ray Interactions with the Interstellar Gas and Radiation Fields

    NASA Technical Reports Server (NTRS)

    Sodroski, Thomas J.; Dwek, Eli (Technical Monitor)

    2001-01-01

    The contractor will provide support for the analysis of data under ADP (NRA 96-ADP-09; Proposal No. 167-96adp). The primary task objective is to construct a 3-D model for the distribution of high-energy (20 MeV - 30 GeV) gamma-ray emission in the Galactic disk. Under this task the contractor will utilize data from the EGRET instrument on the Compton Gamma-Ray Observatory, H I and CO surveys, radio-continuum surveys at 408 MHz, 1420 MHz, 5 GHz, and 19 GHz, the COBE Diffuse Infrared Background Experiment (DIRBE) all-sky maps from 1 to 240 μm, and ground-based B, V, J, H, and K photometry. The respective contributions to the gamma-ray emission from cosmic ray/matter interactions, inverse Compton scattering, and extragalactic emission will be determined.

  17. Cardiothoracic Applications of 3-dimensional Printing.

    PubMed

    Giannopoulos, Andreas A; Steigner, Michael L; George, Elizabeth; Barile, Maria; Hunsaker, Andetta R; Rybicki, Frank J; Mitsouras, Dimitris

    2016-09-01

    Medical 3-dimensional (3D) printing is emerging as a clinically relevant imaging tool in directing preoperative and intraoperative planning in many surgical specialties and will therefore likely lead to interdisciplinary collaboration between engineers, radiologists, and surgeons. Data from standard imaging modalities such as computed tomography, magnetic resonance imaging, echocardiography, and rotational angiography can be used to fabricate life-sized models of human anatomy and pathology, as well as patient-specific implants and surgical guides. Cardiovascular 3D-printed models can improve diagnosis and allow for advanced preoperative planning. The majority of applications reported involve congenital heart diseases and valvular and great vessels pathologies. Printed models are suitable for planning both surgical and minimally invasive procedures. Added value has been reported toward improving outcomes, minimizing perioperative risk, and developing new procedures such as transcatheter mitral valve replacements. Similarly, thoracic surgeons are using 3D printing to assess invasion of vital structures by tumors and to assist in diagnosis and treatment of upper and lower airway diseases. Anatomic models enable surgeons to assimilate information more quickly than image review, choose the optimal surgical approach, and achieve surgery in a shorter time. Patient-specific 3D-printed implants are beginning to appear and may have significant impact on cosmetic and life-saving procedures in the future. In summary, cardiothoracic 3D printing is rapidly evolving and may be a potential game-changer for surgeons. The imager who is equipped with the tools to apply this new imaging science to cardiothoracic care is thus ideally positioned to innovate in this new emerging imaging modality. PMID:27149367

  18. Movement within foot and ankle joint in children with spastic cerebral palsy: a 3-dimensional ultrasound analysis of medial gastrocnemius length with correction for effects of foot deformation

    PubMed Central

    2013-01-01

    Background In spastic cerebral palsy (SCP), a limited range of motion (ROM) of the foot limits gait and other activities. Assessment of this limitation of ROM and knowledge of the active mechanisms are of crucial importance for clinical treatment. Methods For a comparison between children with spastic cerebral palsy (SCP) and typically developing children (TD), medial gastrocnemius (GM) muscle-tendon complex length was assessed using 3-D ultrasound imaging techniques while externally standardized moments were exerted via a hand-held dynamometer. Exemplary X-ray imaging of the ankle and foot was used to confirm possible TD-SCP differences in foot deformation. Results SCP and TD did not differ in the normalized level of excitation (EMG) of the muscles studied. For given moments exerted in SCP, foot plate angles were all more toward plantar flexion than in TD. However, foot plate angle proved to be an invalid estimator of talocrural joint angle, since at equal foot plate angles the GM muscle-tendon complex was shorter in SCP (corresponding to an equivalent of 1 cm). A substantial difference remained even after normalizing for individual differences in tibia length. X-ray imaging of the ankle and foot of one SCP child and two typically developed adults confirmed that in SCP, of the total foot plate angle change (0-4 Nm: 15°), the contribution of foot deformation to changes in foot plate angle (8°) was as big as the contribution of dorsal flexion at the talocrural joint (7°). In typically developed individuals, foot deformation made relatively smaller contributions (10-11%) to changes in foot plate angle, indicating that the contribution of talocrural angle changes was most important. Using a new estimate for position at the talocrural joint (the difference between GM muscle-tendon complex length and tibia length, GM relative length) removed this effect, thus allowing a fairer comparison of SCP and TD data. On the basis of analysis of foot plate angle and GM relative length as a function

  19. Distributed Design and Analysis of Computer Experiments

    SciTech Connect

    Doak, Justin

    2002-11-11

    DDACE is a C++ object-oriented software library for the design and analysis of computer experiments. DDACE can be used to generate samples from a variety of sampling techniques. These samples may be used as input to an application code. DDACE also contains statistical tools such as response surface models and correlation coefficients to analyze input/output relationships between variables in an application code. DDACE can generate input values for uncertain variables within a user's application. For example, a user might like to vary a temperature variable as well as some material variables in a series of simulations. Through the series of simulations the user might be looking for optimal settings of parameters based on some user criteria. Or the user may be interested in the sensitivity to input variability shown by an output variable. In either case, the user may provide information about the suspected ranges and distributions of a set of input variables, along with a sampling scheme, and DDACE will generate input points based on these specifications. The input values generated by DDACE and the one or more outputs computed through the user's application code can be analyzed with a variety of statistical methods. This can lead to a wealth of information about the relationships between the variables in the problem. While statistical and mathematical packages may be employed to carry out the analysis on the input/output relationships, DDACE also contains some tools for analyzing the simulation data. DDACE incorporates a software package called MARS (Multivariate Adaptive Regression Splines), developed by Jerome Friedman. MARS is used for generating a spline surface fit of the data. With MARS, a model simplification may be calculated using the input and corresponding output values for the user's application problem. The MARS grid data may be used for generating 3-dimensional response surface plots of the simulation data. DDACE also contains an implementation of an
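
    The record above describes generating input points from user-specified ranges and a sampling scheme. As a rough illustration of that idea in Python (not DDACE's actual C++ API; the variable names and ranges below are hypothetical), the following sketch draws a Latin hypercube sample over named, uniformly distributed variables:

        import numpy as np

        def latin_hypercube(ranges, n_samples, seed=None):
            """Latin hypercube sample over {name: (low, high)} ranges (uniform assumed)."""
            rng = np.random.default_rng(seed)
            names = list(ranges)
            sample = np.empty((n_samples, len(names)))
            for j, name in enumerate(names):
                # One value per equal-probability stratum, then shuffle the column.
                strata = (np.arange(n_samples) + rng.random(n_samples)) / n_samples
                rng.shuffle(strata)
                low, high = ranges[name]
                sample[:, j] = low + strata * (high - low)
            return names, sample

        # Example: vary a temperature and a material variable for a simulation series.
        names, points = latin_hypercube(
            {"temperature": (300.0, 600.0), "density": (2.0, 8.0)}, n_samples=5, seed=42)
        for row in points:
            print(dict(zip(names, row)))   # each row is one input point for the code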

  20. Distributed Design and Analysis of Computer Experiments

    Energy Science and Technology Software Center (ESTSC)

    2002-11-11

    DDACE is a C++ object-oriented software library for the design and analysis of computer experiments. DDACE can be used to generate samples from a variety of sampling techniques. These samples may be used as input to an application code. DDACE also contains statistical tools such as response surface models and correlation coefficients to analyze input/output relationships between variables in an application code. DDACE can generate input values for uncertain variables within a user's application. For example, a user might like to vary a temperature variable as well as some material variables in a series of simulations. Through the series of simulations the user might be looking for optimal settings of parameters based on some user criteria. Or the user may be interested in the sensitivity to input variability shown by an output variable. In either case, the user may provide information about the suspected ranges and distributions of a set of input variables, along with a sampling scheme, and DDACE will generate input points based on these specifications. The input values generated by DDACE and the one or more outputs computed through the user's application code can be analyzed with a variety of statistical methods. This can lead to a wealth of information about the relationships between the variables in the problem. While statistical and mathematical packages may be employed to carry out the analysis on the input/output relationships, DDACE also contains some tools for analyzing the simulation data. DDACE incorporates a software package called MARS (Multivariate Adaptive Regression Splines), developed by Jerome Friedman. MARS is used for generating a spline surface fit of the data. With MARS, a model simplification may be calculated using the input and corresponding output values for the user's application problem. The MARS grid data may be used for generating 3-dimensional response surface plots of the simulation data. DDACE also contains an implementation

  1. 3-Dimensional Topographic Models for the Classroom

    NASA Technical Reports Server (NTRS)

    Keller, J. W.; Roark, J. H.; Sakimoto, S. E. H.; Stockman, S.; Frey, H. V.

    2003-01-01

    We have recently undertaken a program to develop educational tools using 3-dimensional solid models of digital elevation data acquired by the Mars Orbiter Laser Altimeter (MOLA) for Mars as well as a variety of sources for elevation data of the Earth. This work is made possible by the use of rapid prototyping technology to construct solid 3-dimensional models of science data. We recently acquired a rapid prototyping machine that builds 3-dimensional models in extruded plastic. While the machine was acquired to assist in the design and development of scientific instruments and hardware, it is also fully capable of producing models of spacecraft remote sensing data. We have demonstrated this by using Mars Orbiter Laser Altimeter (MOLA) topographic data and Earth-based topographic data to produce extruded plastic topographic models which are visually appealing and instantly engage those who handle them.

  2. 3-dimensional imaging at nanometer resolutions

    DOEpatents

    Werner, James H.; Goodwin, Peter M.; Shreve, Andrew P.

    2010-03-09

    An apparatus and method for enabling precise, 3-dimensional, photoactivation localization microscopy (PALM) using selective, two-photon activation of fluorophores in a single z-slice of a sample in cooperation with time-gated imaging for reducing the background radiation from other image planes to levels suitable for single-molecule detection and spatial location, are described.

  3. Computer vision in microstructural analysis

    NASA Technical Reports Server (NTRS)

    Srinivasan, Malur N.; Massarweh, W.; Hough, C. L.

    1992-01-01

    The following is a laboratory experiment designed to be performed by advanced high school and beginning college students. It is hoped that this experiment will create an interest in and further understanding of materials science. The objective of this experiment is to demonstrate that the microstructure of engineered materials is affected by the processing conditions in manufacture, and that it is possible to characterize the microstructure using image analysis with a computer. The principle of computer vision will first be introduced, followed by a description of the system developed at Texas A&M University. This in turn will be followed by a description of the experiment to obtain differences in microstructure and the characterization of the microstructure using computer vision.

  4. Biochemical Applications Of 3-Dimensional Fluorescence Spectrometry

    NASA Astrophysics Data System (ADS)

    Leiner, Marc J.; Wolfbeis, Otto S.

    1988-06-01

    We investigated the 3-dimensional fluorescence of complex mixtures of biofluids such as human serum, serum ultrafiltrate, human urine, and human plasma low density lipoproteins. The total fluorescence of human serum can be divided into a few peaks. When comparing fluorescence topograms of sera from normal and cancerous subjects, we found significant differences in tryptophan fluorescence. Although the total fluorescence of human urine can be resolved into 3-5 distinct peaks, some of them do not result from single fluorescent urinary metabolites, but rather from several species having similar spectral properties. Human plasma low density lipoproteins possess a native fluorescence that changes when submitted to in-vitro autoxidation. The 3-dimensional fluorescence demonstrated the presence of 7 fluorophores in the lipid domain, and 6 fluorophores in the protein domain. The above results demonstrated that 3-dimensional fluorescence can resolve the spectral properties of complex mixtures much better than other methods. Moreover, parameters other than excitation and emission wavelength and intensity (for instance fluorescence lifetime, polarization, or quenchability) may be exploited to give a multidimensional matrix that is unique for each sample. Consequently, 3-dimensional fluorescence as such, or in combination with separation techniques, is considered to have the potential of becoming a useful new method in clinical chemistry and analytical biochemistry.

  5. Forensic Analysis of Compromised Computers

    NASA Technical Reports Server (NTRS)

    Wolfe, Thomas

    2004-01-01

    Directory Tree Analysis File Generator is a Practical Extraction and Reporting Language (PERL) script that simplifies and automates the collection of information for forensic analysis of compromised computer systems. During such an analysis, it is sometimes necessary to collect and analyze information about files on a specific directory tree. Directory Tree Analysis File Generator collects information of this type (except information about directories) and writes it to a text file. In particular, the script asks the user for the root of the directory tree to be processed, the name of the output file, and the number of subtree levels to process. The script then processes the directory tree and puts out the aforementioned text file. The format of the text file is designed to enable the submission of the file as input to a spreadsheet program, wherein the forensic analysis is performed. The analysis usually consists of sorting files and examination of such characteristics of files as ownership, time of creation, and time of most recent access, all of which characteristics are among the data included in the text file.
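
    The original is a Perl script, but the kind of collection it describes can be sketched briefly in Python; the CSV layout, column choice, and command-line arguments below are assumptions for illustration, not the script's actual format:

        import csv, os, sys, time

        def collect(root, out_path, max_depth):
            """Walk a directory tree, writing per-file metadata for forensic review."""
            root = os.path.abspath(root)
            with open(out_path, "w", newline="") as out:
                writer = csv.writer(out)
                writer.writerow(["path", "owner_uid", "size", "created", "last_access"])
                for dirpath, dirnames, filenames in os.walk(root):
                    if dirpath[len(root):].count(os.sep) >= max_depth:
                        dirnames[:] = []          # stop descending past the level limit
                    for name in filenames:        # directories themselves are skipped
                        p = os.path.join(dirpath, name)
                        st = os.stat(p, follow_symlinks=False)
                        writer.writerow([p, st.st_uid, st.st_size,
                                         time.ctime(st.st_ctime), time.ctime(st.st_atime)])

        if __name__ == "__main__":   # arguments: root directory, output file, levels
            collect(sys.argv[1], sys.argv[2], int(sys.argv[3]))

    The resulting text file can then be imported into a spreadsheet and sorted by ownership or timestamps, as the record describes.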

  6. Preliminary Toxicity Analysis of 3-Dimensional Conformal Radiation Therapy Versus Intensity Modulated Radiation Therapy on the High-Dose Arm of the Radiation Therapy Oncology Group 0126 Prostate Cancer Trial

    SciTech Connect

    Michalski, Jeff M.; Yan, Yan; Watkins-Bruner, Deborah; Bosch, Walter R.; Winter, Kathryn; Galvin, James M.; Bahary, Jean-Paul; Morton, Gerard C.; Parliament, Matthew B.; Sandler, Howard M.

    2013-12-01

    Purpose: To give a preliminary report of clinical and treatment factors associated with toxicity in men receiving high-dose radiation therapy (RT) on a phase 3 dose-escalation trial. Methods and Materials: The trial was initiated with 3-dimensional conformal RT (3D-CRT) and amended after 1 year to allow intensity modulated RT (IMRT). Patients treated with 3D-CRT received 55.8 Gy to a planning target volume that included the prostate and seminal vesicles, then 23.4 Gy to prostate only. The IMRT patients were treated to the prostate and proximal seminal vesicles to 79.2 Gy. Common Toxicity Criteria, version 2.0, and Radiation Therapy Oncology Group/European Organization for Research and Treatment of Cancer late morbidity scores were used for acute and late effects. Results: Of 763 patients randomized to the 79.2-Gy arm of Radiation Therapy Oncology Group 0126 protocol, 748 were eligible and evaluable: 491 and 257 were treated with 3D-CRT and IMRT, respectively. For both bladder and rectum, the volumes receiving 65, 70, and 75 Gy were significantly lower with IMRT (all P<.0001). For grade (G) 2+ acute gastrointestinal/genitourinary (GI/GU) toxicity, both univariate and multivariate analyses showed a statistically significant decrease in G2+ acute collective GI/GU toxicity for IMRT. There were no significant differences with 3D-CRT or IMRT for acute or late G2+ or 3+ GU toxicities. Univariate analysis showed a statistically significant decrease in late G2+ GI toxicity for IMRT (P=.039). On multivariate analysis, IMRT showed a 26% reduction in G2+ late GI toxicity (P=.099). Acute G2+ toxicity was associated with late G3+ toxicity (P=.005). With dose–volume histogram data in the multivariate analysis, RT modality was not significant, whereas white race (P=.001) and rectal V70 ≥15% were associated with G2+ rectal toxicity (P=.034). Conclusions: Intensity modulated RT is associated with a significant reduction in acute G2+ GI/GU toxicity. There is a trend for a

  7. Video Based Sensor for Tracking 3-Dimensional Targets

    NASA Technical Reports Server (NTRS)

    Howard, R. T.; Book, Michael L.; Bryan, Thomas C.

    2000-01-01

    The National Aeronautics and Space Administration's (NASA's) Marshall Space Flight Center (MSFC) has been developing and testing video-based sensors for automated spacecraft guidance for several years, and the next generation of video sensor will have tracking rates up to 100 Hz and will be able to track multiple reflectors and targets. The Video Guidance Sensor (VGS) developed over the past several years has performed well in testing and met the objective of being used as the terminal guidance sensor for an automated rendezvous and capture system. The first VGS was successfully tested in closed-loop 3-degree-of-freedom (3-DOF) tests in 1989 and then in 6-DOF open-loop tests in 1992 and closed-loop tests in 1993-94. Development and testing continued, and in 1995 approval was given to test the VGS in an experiment on the Space Shuttle. The VGS flew in 1997 and in 1998, performing well for both flights. During the development and testing before, during, and after the flight experiments, numerous areas for improvement were found. The VGS was developed with a sensor head and an electronics box, connected by cables. The VGS was used in conjunction with a target that had wavelength-filtered retro-reflectors in a specific pattern. The sensor head contained the laser diodes, video camera, and heaters and coolers. The electronics box contained a frame grabber, image processor, the electronics to control the components in the sensor head, the communications electronics, and the power supply. The system works by sequentially firing two different wavelengths of laser diodes at the target and processing the two images. Since the target only reflects one wavelength, it shows up well in one image and not at all in the other. Because the target's dimensions are known, the relative positions and attitudes of the target and the sensor can be computed from the spots reflected from the target. The system was designed to work from I

  8. 3-dimensional bioprinting for tissue engineering applications.

    PubMed

    Gu, Bon Kang; Choi, Dong Jin; Park, Sang Jun; Kim, Min Sup; Kang, Chang Mo; Kim, Chun-Ho

    2016-01-01

    The 3-dimensional (3D) printing technologies, referred to as additive manufacturing (AM) or rapid prototyping (RP), have gained recognition over the past few years for art, architectural modeling, lightweight machines, and tissue engineering applications. Among these applications, the tissue engineering field using 3D printing has attracted the attention of many researchers. 3D bioprinting has an advantage in the manufacture of scaffolds for tissue engineering applications because of its rapid fabrication, high precision, and customized production. In this review, we introduce the principles and the current state of 3D bioprinting methods, focusing on studies that currently apply printed 3D scaffolds in the biomedical and tissue engineering fields. PMID:27114828

  9. A Petaflops Era Computing Analysis

    NASA Technical Reports Server (NTRS)

    Preston, Frank S.

    1998-01-01

    This report covers a study of the potential for petaflops (10(exp 15) floating point operations per second) computing. This study was performed within the year 1996 and should be considered as the first step in an on-going effort. The analysis concludes that a petaflops system is technically feasible but not achievable with today's state-of-the-art. Since the computer arena is now a commodity business, most experts expect that a petaflops system will evolve from current technology in an evolutionary fashion. To meet the price expectations of users waiting for petaflops performance, great improvements in lowering component costs will be required. Lower power consumption is also a must. The present rate of progress in improved performance places the date of introduction of petaflops systems at about 2010. Several years before that date, it is projected that chip fabrication will reach the now-known resolution limit. Aside from the economic problems and constraints, software is identified as the major problem. The tone of this initial study is more pessimistic than most of the published material available on petaflops systems. Workers in the field are expected to generate more data which could serve to provide a basis for a more informed projection. This report includes an annotated bibliography.

  10. Computer analysis of cardiovascular parameters.

    PubMed

    Mass, H J; Gean, J T; Gwirtz, P A

    1987-01-01

    A computer program is described for the analysis of several cardiovascular parameters frequently measured or derived in the chronically instrumented dog model. Data are stored on magnetic tape and are subsequently analyzed with the Apple IIe microcomputer equipped with the ADALAB (Interactive Microware, Inc.) analog-to-digital converter. Not limited to the chronically instrumented animal model, the program is capable of analyzing left ventricular pressure, three channels of regional myocardial segment length, coronary flow velocity as measured by the Doppler ultrasonic flow technique, and two channels of systemic arterial pressure. Derived data include: left ventricular dP/dtmax, left ventricular pressure-heart rate product, left ventricular ejection time, tension-time index; percent segment length shortening and velocity of shortening, dL/dt(s)max, regional stroke work and power, duration of systole and diastole; mean coronary flow velocity, peak diastolic and systolic flow velocity, and true mean systemic arterial pressure. PMID:3581809
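
    A minimal sketch of two of the derived quantities named above (dP/dtmax and the pressure-heart rate product), assuming a uniformly sampled left ventricular pressure trace; the synthetic waveform and sampling rate are illustrative, not the Apple IIe/ADALAB pipeline:

        import numpy as np

        def lv_indices(lvp, heart_rate, fs):
            """Simple derived indices from sampled LV pressure (mmHg) at fs Hz."""
            dpdt = np.gradient(lvp, 1.0 / fs)                 # mmHg/s
            return {"dP/dt_max": dpdt.max(),
                    "pressure_rate_product": lvp.max() * heart_rate}

        fs = 250                                              # Hz, one synthetic beat
        t = np.arange(fs) / fs
        lvp = 10 + 110 * np.clip(np.sin(2 * np.pi * t), 0, None) ** 3
        print(lv_indices(lvp, heart_rate=75, fs=fs))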

  11. Personal Computer Transport Analysis Program

    NASA Technical Reports Server (NTRS)

    DiStefano, Frank, III; Wobick, Craig; Chapman, Kirt; McCloud, Peter

    2012-01-01

    The Personal Computer Transport Analysis Program (PCTAP) is C++ software used for analysis of thermal fluid systems. The program predicts thermal fluid system and component transients. The output consists of temperatures, flow rates, pressures, delta pressures, tank quantities, and gas quantities in the air, along with air scrubbing component performance. PCTAP's solution process assumes that the tubes in the system are well insulated so that only the heat transfer between fluid and tube wall and between adjacent tubes is modeled. The system described in the model file is broken down into its individual components; i.e., tubes, cold plates, heat exchangers, etc. A solution vector is built from the components, and a flow is then simulated with fluid being transferred from one component to the next. The solution vector of components in the model file is built at the initiation of the run. This solution vector is simply a list of components in the order of their inlet dependency on other components. The component parameters are updated in the order in which they appear in the list at every time step. Once the solution vectors have been determined, PCTAP cycles through the components in the solution vector, executing their outlet function for each time-step increment.
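
    The solution-vector scheme described above (components listed in order of inlet dependency, then updated in that order every time step) can be sketched as follows; the Component class, its relaxation update, and the graphlib-based ordering are illustrative assumptions, not PCTAP's implementation:

        from graphlib import TopologicalSorter

        class Component:
            def __init__(self, name, inlets=(), state=0.0):
                self.name, self.inlets, self.state = name, tuple(inlets), state
            def step(self, upstream, dt):
                # Placeholder transfer law: relax toward the mean upstream state.
                if upstream:
                    self.state += dt * (sum(upstream) / len(upstream) - self.state)

        def solution_vector(components):
            """Order components so each follows everything feeding its inlet."""
            order = TopologicalSorter({c.name: set(c.inlets) for c in components})
            by_name = {c.name: c for c in components}
            return [by_name[n] for n in order.static_order()]

        def run(components, dt, n_steps):
            vector = solution_vector(components)
            by_name = {c.name: c for c in vector}
            for _ in range(n_steps):
                for c in vector:      # update in inlet-dependency order each step
                    c.step([by_name[i].state for i in c.inlets], dt)
            return {c.name: round(c.state, 2) for c in vector}

        tank = Component("tank", state=300.0)
        tube = Component("tube", inlets=["tank"])
        plate = Component("coldplate", inlets=["tube"])
        print(run([tank, tube, plate], dt=0.1, n_steps=50))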

  12. Computer graphics in aerodynamic analysis

    NASA Technical Reports Server (NTRS)

    Cozzolongo, J. V.

    1984-01-01

    The use of computer graphics and its application to aerodynamic analyses on a routine basis is outlined. The mathematical modelling of the aircraft geometries and the shading technique implemented are discussed. Examples of computer graphics used to display aerodynamic flow field data and aircraft geometries are shown. A future need in computer graphics for aerodynamic analyses is addressed.

  13. P17.56 A 3-DIMENSIONAL MATRIX ASSAY TO HELP PREDICT TREATMENT RESPONSE TO TEMOZOLOMIDE IN PATIENTS WITH GLIOBLASTOMA: UPDATE OF RESULTS AND SUBGROUP ANALYSIS OF PATIENTS UNDERGOING MGMT TESTING

    PubMed Central

    Megyesi, J.F.; Costello, P.; McDonald, W.; Macdonald, D.; Easaw, J.

    2014-01-01

    INTRODUCTION: Usual treatment for glioblastoma is surgical resection, if possible, followed by radiotherapy with adjuvant chemotherapy using temozolomide. However a significant number of patients have a short response to temozolomide and subsequently a poorer prognosis. We investigated the possibility that surgical specimens obtained at the time of surgery might provide valuable information regarding sensitivity to chemotherapies, including temozolomide. In order to do this we used a 3-dimensional matrix assay that mimics brain. We analyzed a subgroup of these patients for O-6-methylguanine-DNA methyltransferase (MGMT) status and correlated this with the response of tumor tissue in the assay to temozolomide. METHODS: Records for patients treated for newly diagnosed or recurrent glioblastoma were analyzed. All patients had undergone surgical resection and tumor specimens at time of surgery were available for culture in a 3-dimensional matrix assay and observed for growth and invasion. Drug effects on mean invasion and growth were expressed as a ratio relative to control conditions. Length of survival was compared between temozolomide treated patients whose screening results had predicted a positive or negative response to temozolomide. The MGMT status of a subgroup of these patients was analyzed and correlated with the response of tumor tissue in the assay to temozolomide. RESULTS: Fifty-eight patients with glioblastoma were assessed. Each patient's tumor displayed a unique invasion and response profile. We looked in particular at the correlation between the outcome of a patient with glioblastoma treated with temozolomide and the response of that patient's tumor tissue to temozolomide in the 3-dimensional assay. Mean survival time for patients whose tumors were not significantly sensitive to temozolomide in the assay was 181.7 +/- 43 days. Mean survival time for patients whose tumors were significantly sensitive to temozolomide in the assay was 290.0 +/- 33 days

  14. Grid computing in image analysis

    PubMed Central

    2011-01-01

    Diagnostic surgical pathology or tissue-based diagnosis still remains the most reliable and specific diagnostic medical procedure. The development of whole slide scanners permits the creation of virtual slides and work on so-called virtual microscopes. In addition to interactive work on virtual slides, approaches have been reported that introduce automated virtual microscopy, which is composed of several tools focusing on quite different tasks. These include evaluation of image quality and image standardization, analysis of potentially useful thresholds for object detection and identification (segmentation), dynamic segmentation procedures, adjustable magnification to optimize feature extraction, and texture analysis including image transformation and evaluation of elementary primitives. Grid technology seems to possess all features to efficiently target and control the specific tasks of image information and detection in order to obtain a detailed and accurate diagnosis. Grid technology is based upon so-called nodes that are linked together and share certain communication rules in using open standards. Their number and functionality can vary according to the needs of a specific user at a given point in time. When implementing automated virtual microscopy with Grid technology, all five different Grid functions have to be taken into account, namely 1) computation services, 2) data services, 3) application services, 4) information services, and 5) knowledge services. Although all mandatory tools of automated virtual microscopy can be implemented in a closed or standardized open system, Grid technology offers a new dimension to acquire, detect, classify, and distribute medical image information, and to assure quality in tissue-based diagnosis. PMID:21516880

  15. Computer-Based Linguistic Analysis.

    ERIC Educational Resources Information Center

    Wright, James R.

    Noam Chomsky's transformational-generative grammar model may effectively be translated into an equivalent computer model. Phrase-structure rules and transformations are tested as to their validity and ordering by the computer via the process of random lexical substitution. Errors appearing in the grammar are detected and rectified, and formal…

  16. Biomolecular dynamics by computer analysis

    SciTech Connect

    Eilbeck, J.C.; Lomdahl, P.S.; Scott, A.C.

    1984-01-01

    As numerical tools (computers and display equipment) become more powerful and the atomic structures of important biological molecules become known, the importance of detailed computation of nonequilibrium biomolecular dynamics increases. In this manuscript we report results from a well developed study of the hydrogen bonded polypeptide crystal acetanilide, a model protein. Directions for future research are suggested. 9 references, 6 figures.

  17. Application of 3-dimensional printing in hand surgery for production of a novel bone reduction clamp.

    PubMed

    Fuller, Sam M; Butz, Daniel R; Vevang, Curt B; Makhlouf, Mansour V

    2014-09-01

    Three-dimensional printing is being rapidly incorporated in the medical field to produce external prosthetics for improved cosmesis and fabricated molds to aid in presurgical planning. Biomedically engineered products from 3-dimensional printers are also utilized as implantable devices for knee arthroplasty, airway orthoses, and other surgical procedures. Although at first expensive and conceptually difficult to construct, 3-dimensional printing is now becoming more affordable and widely accessible. In hand surgery, like many other specialties, new or customized instruments would be desirable; however, the overall production cost restricts their development. We are presenting our step-by-step experience in creating a bone reduction clamp for finger fractures using 3-dimensional printing technology. Using free, downloadable software, a 3-dimensional model of a bone reduction clamp for hand fractures was created based on the senior author's (M.V.M.) specific design, previous experience, and preferences for fracture fixation. Once deemed satisfactory, the computer files were sent to a 3-dimensional printing company for the production of the prototypes. Multiple plastic prototypes were made and adjusted, affording a fast, low-cost working model of the proposed clamp. Once a workable design was obtained, a printing company produced the surgical clamp prototype directly from the 3-dimensional model represented in the computer files. This prototype was used in the operating room, meeting the expectations of the surgeon. Three-dimensional printing is affordable and offers the benefits of reducing production time and nurturing innovations in hand surgery. This article presents a step-by-step description of our design process using online software programs and 3-dimensional printing services. As medical technology advances, it is important that hand surgeons remain aware of available resources, are knowledgeable about how the process works, and are able to take advantage of

  18. Improving Perceptual Skills with 3-Dimensional Animations.

    ERIC Educational Resources Information Center

    Johns, Janet Faye; Brander, Julianne Marie

    1998-01-01

    Describes three-dimensional computer aided design (CAD) models for every component in a representative mechanical system; the CAD models made it easy to generate 3-D animations that are ideal for teaching perceptual skills in multimedia computer-based technical training. Fifteen illustrations are provided. (AEF)

  19. CAVASS: a computer-assisted visualization and analysis software system.

    PubMed

    Grevera, George; Udupa, Jayaram; Odhner, Dewey; Zhuge, Ying; Souza, Andre; Iwanaga, Tad; Mishra, Shipra

    2007-11-01

    The Medical Image Processing Group at the University of Pennsylvania has been developing (and distributing with source code) medical image analysis and visualization software systems for a long period of time. Our most recent system, 3DVIEWNIX, was first released in 1993. Since that time, a number of significant advancements have taken place with regard to computer platforms and operating systems, networking capability, the rise of parallel processing standards, and the development of open-source toolkits. The development of CAVASS by our group is the next generation of 3DVIEWNIX. CAVASS will be freely available and open source, and it is integrated with toolkits such as Insight Toolkit and Visualization Toolkit. CAVASS runs on Windows, Unix, Linux, and Mac but shares a single code base. Rather than requiring expensive multiprocessor systems, it seamlessly provides for parallel processing via inexpensive clusters of work stations for more time-consuming algorithms. Most importantly, CAVASS is directed at the visualization, processing, and analysis of 3-dimensional and higher-dimensional medical imagery, so support for digital imaging and communication in medicine data and the efficient implementation of algorithms is given paramount importance. PMID:17786517

  20. Computational methods for global/local analysis

    NASA Technical Reports Server (NTRS)

    Ransom, Jonathan B.; Mccleary, Susan L.; Aminpour, Mohammad A.; Knight, Norman F., Jr.

    1992-01-01

    Computational methods for global/local analysis of structures which include both uncoupled and coupled methods are described. In addition, global/local analysis methodology for automatic refinement of incompatible global and local finite element models is developed. Representative structural analysis problems are presented to demonstrate the global/local analysis methods.

  1. Computational analysis on plug-in hybrid electric motorcycle chassis

    NASA Astrophysics Data System (ADS)

    Teoh, S. J.; Bakar, R. A.; Gan, L. M.

    2013-12-01

    Plug-in hybrid electric motorcycle (PHEM) is an alternative to promote sustainability and lower emissions. However, the PHEM overall system packaging is constrained by the limited space in a motorcycle chassis. In this paper, a chassis applying the concept of a Chopper is analysed for application in a PHEM. The chassis 3-dimensional (3D) model is built with CAD software. The PHEM power-train components and drive-train mechanisms are integrated into the 3D model to ensure the chassis provides sufficient space. Besides that, a human dummy model is built into the 3D model to ensure the rider's ergonomics and comfort. The chassis 3D model then undergoes stress-strain simulation. The simulation predicts the stress distribution, displacement and factor of safety (FOS). The data are used to identify the critical points, thus indicating whether the chassis design is applicable or needs to be redesigned/modified to meet the required strength. Critical points are locations of highest stress which might cause the chassis to fail; for a motorcycle chassis they occur at the joints at the triple tree and the rear absorber bracket. As a conclusion, computational analysis predicts the stress distribution and provides a guideline to develop a safe prototype chassis.
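
    A minimal sketch of the factor-of-safety step mentioned above, assuming FOS is taken as material yield strength divided by the peak von Mises stress from the simulation; the element stresses and yield value are made-up examples:

        import numpy as np

        def factor_of_safety(stress_tensors, yield_mpa):
            """FOS = yield strength / max von Mises stress over all elements (MPa)."""
            s = np.asarray(stress_tensors)
            dev = s - np.trace(s, axis1=1, axis2=2)[:, None, None] / 3.0 * np.eye(3)
            vm = np.sqrt(1.5 * np.einsum("nij,nij->n", dev, dev))
            return yield_mpa / vm.max(), vm

        # Two elements: uniaxial 120 MPa and pure shear 60 MPa; mild steel yield 250 MPa.
        elements = [np.diag([120.0, 0.0, 0.0]),
                    np.array([[0.0, 60.0, 0.0], [60.0, 0.0, 0.0], [0.0, 0.0, 0.0]])]
        fos, vm = factor_of_safety(elements, yield_mpa=250.0)
        print(f"max von Mises = {vm.max():.1f} MPa, FOS = {fos:.2f}")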

  2. Computer applications for engineering/structural analysis

    NASA Astrophysics Data System (ADS)

    Zaslawsky, M.; Samaddar, S. K.

    1991-10-01

    Analysts and organizations have a tendency to lock themselves into specific codes with the obvious consequence of not addressing the real problem and thus reaching the wrong conclusion. This paper discusses the role of the analyst in selecting computer codes. The participation and support of a computation division in modifying the source program, configuration management, and pre- and post-processing of codes are among the subjects discussed. Specific examples illustrating the computer code selection process are described in the following problem areas: soil structure interaction, structural analysis of nuclear reactors, analysis of waste tanks where fluid structure interaction is important, analysis of equipment, structure-structure interaction, analysis of the operation of the superconductor supercollider which includes friction and transient temperature, and 3D analysis of the 10-meter telescope being built in Hawaii. Validation and verification of computer codes and their impact on the selection process are also discussed.

  3. Computer applications for engineering/structural analysis

    SciTech Connect

    Zaslawsky, M.; Samaddar, S.K.

    1991-01-01

    Analysts and organizations have a tendency to lock themselves into specific codes with the obvious consequences of not addressing the real problem and thus reaching the wrong conclusion. This paper discusses the role of the analyst in selecting computer codes. The participation and support of a computation division in modifying the source program, configuration management, and pre- and post-processing of codes are among the subjects discussed. Specific examples illustrating the computer code selection process are described in the following problem areas: soil structure interaction, structural analysis of nuclear reactors, analysis of waste tanks where fluid structure interaction is important, analysis of equipment, structure-structure interaction, analysis of the operation of the superconductor supercollider which includes friction and transient temperature, and 3D analysis of the 10-meter telescope being built in Hawaii. Validation and verification of computer codes and their impact on the selection process are also discussed.

  4. 3-dimensional strain fields from tomographic measurements

    NASA Astrophysics Data System (ADS)

    Haldrup, K.; Nielsen, S. F.; Mishnaevsky, L., Jr.; Beckmann, F.; Wert, J. A.

    2006-08-01

    Understanding the distributions of strain within solid bodies undergoing plastic deformation has been of interest for many years in a wide range of disciplines, ranging from basic materials science to biology. However, the desire to investigate these strain fields has been frustrated by the inaccessibility of the interior of most samples to detailed investigation without destroying the sample in the process. To some extent, this has been remedied by the development of advanced surface measurement techniques as well as computer models based on finite element methods. Over the last decade, this situation has changed with the introduction of a range of tomographic methods based on advances both in computer technology and in instrumentation, advances which have opened up the interior of optically opaque samples for detailed investigation. We present a general method for assessing the strain in the interior of marker-containing specimens undergoing various types of deformation. The results are compared with finite element modelling.

  5. IUE Data Analysis Software for Personal Computers

    NASA Technical Reports Server (NTRS)

    Thompson, R.; Caplinger, J.; Taylor, L.; Lawton , P.

    1996-01-01

    This report summarizes the work performed for the program titled, "IUE Data Analysis Software for Personal Computers" awarded under Astrophysics Data Program NRA 92-OSSA-15. The work performed was completed over a 2-year period starting in April 1994. As a result of the project, 450 IDL routines and eight database tables are now available for distribution for Power Macintosh computers and Personal Computers running Windows 3.1.

  6. Computer Programming in a Spatial Analysis Course.

    ERIC Educational Resources Information Center

    Gesler, Wilbert; Kaplan, Abram

    1993-01-01

    Contends that students in spatial analysis courses generally are familiar with computer use and programs but lack basic computer programing skills. Describes four exercises in which students learn programing using BASIC and dBASE. Asserts that programming exercises help students clarify concepts, understand the rationale behind calculations, use…

  7. Discourse Analysis of Teaching Computing Online

    ERIC Educational Resources Information Center

    Bower, Matt

    2009-01-01

    This paper analyses the teaching and learning of computing in a Web-conferencing environment. A discourse analysis of three introductory programming learning episodes is presented to demonstrate issues and effects that arise when teaching computing using such an approach. The subject of discussion, the interactive nature of discussion and any…

  8. A Computer Language for ECG Contour Analysis

    PubMed Central

    McConnochie, John W.

    1982-01-01

    The purpose of this paper is to demonstrate constructively that criteria for ECG contour analysis can be interpreted directly by a computer, whereby the programming task is greatly reduced. Direct interpretation is achieved by the creation of a computer language that is well-suited to the expression of such criteria. Further development of the language is planned.

  9. Computer aided analysis of phonocardiogram.

    PubMed

    Singh, J; Anand, R S

    2007-01-01

    In the present paper an analysis of phonocardiogram (PCG) records is presented. The analysis has been carried out in both the time and frequency domains with the aim of detecting certain correlations between the time and frequency domain representations of PCG. The analysis is limited to the first and second heart sounds (S1 and S2) only. In the time domain analysis the moving window averaging technique is used to determine the occurrence of S1 and S2, which helps in determination of the cardiac interval, the absolute and relative time duration of individual S1 and S2, and the absolute and relative duration between them. In the frequency domain, fast Fourier transform (FFT) of the complete PCG record, and short time Fourier transform (STFT) and wavelet transform of individual heart sounds have been carried out. The frequency domain analysis gives an idea about the dominant frequency components in individual records and the frequency spectrum of individual heart sounds. A comparative observation on both analyses gives some correlation between time domain and frequency domain representations of PCG. PMID:17701776
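
    A rough sketch of the time-domain step described above, using moving-window averaging of signal energy to locate S1 and S2; the window length, threshold rule, and synthetic record are assumptions for illustration:

        import numpy as np

        def energy_envelope(pcg, fs, window_ms=50):
            """Moving-window average of squared PCG samples."""
            win = max(1, int(fs * window_ms / 1000))
            return np.convolve(pcg ** 2, np.ones(win) / win, mode="same")

        def sound_onsets(env, fs, ratio=0.2):
            """Times (s) where the envelope rises above a fraction of its peak."""
            above = env > ratio * env.max()
            return (np.flatnonzero(above[1:] & ~above[:-1]) + 1) / fs

        fs = 2000                              # Hz; two bursts mimic S1 and S2
        t = np.arange(fs) / fs
        rng = np.random.default_rng(0)
        pcg = 0.02 * rng.standard_normal(fs)
        for t0, f in [(0.10, 80), (0.40, 60)]:
            mask = np.abs(t - t0) < 0.04
            pcg[mask] += np.sin(2 * np.pi * f * t[mask])
        onsets = sound_onsets(energy_envelope(pcg, fs), fs)
        print("onsets (s):", np.round(onsets, 3))   # their spacing gives the interval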

  10. Radiological Safety Analysis Computer Program

    Energy Science and Technology Software Center (ESTSC)

    2001-08-28

    RSAC-6 is the latest version of the RSAC program. It calculates the consequences of a release of radionuclides to the atmosphere. Using a personal computer, a user can generate a fission product inventory; decay and in-grow the inventory during transport through processes, facilities, and the environment; model the downwind dispersion of the activity; and calculate doses to downwind individuals. Internal dose from the inhalation and ingestion pathways is calculated. External dose from the ground surface and plume gamma pathways is calculated. New and exciting updates to the program include the ability to evaluate a release to an enclosed room, resuspension of deposited activity, and evaluation of a release up to 1 meter from the release point. Enhanced tools are included for dry deposition, building wake, occupancy factors, respirable fraction, AMAD adjustment, an updated and enhanced radionuclide inventory, and inclusion of the dose-conversion factors from FGR 11 and 12.
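
    A minimal sketch of the decay step described above, assuming simple per-nuclide exponential decay (the daughter in-growth that RSAC also performs is omitted); the three-nuclide inventory is illustrative:

        import numpy as np

        HALF_LIFE_S = {"I-131": 8.02 * 86400,          # seconds
                       "Xe-133": 5.25 * 86400,
                       "Cs-137": 30.1 * 365.25 * 86400}

        def decay(activities_bq, elapsed_s):
            """A(t) = A0 * exp(-ln(2) * t / T_half) for each nuclide."""
            return {n: a * np.exp(-np.log(2) * elapsed_s / HALF_LIFE_S[n])
                    for n, a in activities_bq.items()}

        # Decay during 12 hours of transport through a facility before release.
        released = decay({"I-131": 1e12, "Xe-133": 8e11, "Cs-137": 5e11},
                         elapsed_s=12 * 3600)
        print({n: f"{a:.3e} Bq" for n, a in released.items()})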

  11. Distributed computing and nuclear reactor analysis

    SciTech Connect

    Brown, F.B.; Derstine, K.L.; Blomquist, R.N.

    1994-03-01

    Large-scale scientific and engineering calculations for nuclear reactor analysis can now be carried out effectively in a distributed computing environment, at costs far lower than for traditional mainframes. The distributed computing environment must include support for traditional system services, such as a queuing system for batch work, reliable filesystem backups, and parallel processing capabilities for large jobs. All ANL computer codes for reactor analysis have been adapted successfully to a distributed system based on workstations and X-terminals. Distributed parallel processing has been demonstrated to be effective for long-running Monte Carlo calculations.

  12. Computer aided cogeneration feasibility analysis

    SciTech Connect

    Anaya, D.A.; Caltenco, E.J.L.; Robles, L.F.

    1996-12-31

    A successful cogeneration system design depends on several factors, and the optimal configuration can be found using steam and power simulation software. The key characteristics of one such software package are described below, and its application to a process plant cogeneration feasibility analysis is shown in this paper. Finally, a case study is illustrated. 4 refs., 2 figs.

  13. Economic Analysis. Computer Simulation Models.

    ERIC Educational Resources Information Center

    Sterling Inst., Washington, DC. Educational Technology Center.

    A multimedia course in economic analysis was developed and used in conjunction with the United States Naval Academy. (See ED 043 790 and ED 043 791 for final reports of the project evaluation and development model.) This volume of the text discusses the simulation of behavioral relationships among variable elements in an economy and presents…

  14. COMPUTATIONAL FLUID DYNAMICS MODELING ANALYSIS OF COMBUSTORS

    SciTech Connect

    Mathur, M.P.; Freeman, Mark; Gera, Dinesh

    2001-11-06

    In the current fiscal year FY01, several CFD simulations were conducted to investigate the effects of moisture in biomass/coal, particle injection locations, and flow parameters on carbon burnout and NOx inside a 150 MW GEEZER industrial boiler. Various simulations were designed to predict the suitability of biomass cofiring in coal combustors, and to explore the possibility of using biomass as a reburning fuel to reduce NOx. Some additional CFD simulations were also conducted on the CERF combustor to examine the combustion characteristics of pulverized coal in enriched O2/CO2 environments. Most of the CFD models available in the literature treat particles as point masses with uniform temperature inside the particles. This isothermal condition may not be suitable for larger biomass particles. To this end, a stand-alone program was developed from first principles to account for heat conduction from the surface of the particle to its center. It is envisaged that the recently developed non-isothermal stand-alone module will be integrated with the Fluent solver during the next fiscal year to accurately predict the carbon burnout from larger biomass particles. Anisotropy in heat transfer will be explored using different conductivities in the radial and axial directions. The above models will be validated/tested on various full-scale industrial boilers. The current NOx modules will be modified to account for local CH, CH2, and CH3 radical chemistry; currently they are based on global chemistry. It may also be worth exploring the effect of an enriched O2/CO2 environment on carbon burnout and NOx concentration. The research objective of this study is to develop a 3-dimensional combustor model for biomass co-firing and reburning applications using the Fluent computational fluid dynamics code.
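
    The non-isothermal particle module described above solves heat conduction from the particle surface to its center. A from-first-principles sketch under simplifying assumptions (constant properties, surface held at the gas temperature, explicit finite differences) might look like this; the particle size and thermal diffusivity are illustrative:

        import numpy as np

        def sphere_conduction(radius, n, alpha, t_surf, t_init, t_end):
            """Explicit FD solution of dT/dt = alpha*(T'' + 2*T'/r) in a sphere."""
            dr = radius / (n - 1)
            dt = 0.1 * dr**2 / alpha              # well inside explicit stability
            r = np.linspace(0.0, radius, n)
            T = np.full(n, float(t_init))
            T[-1] = t_surf                        # surface at gas temperature
            for _ in range(int(t_end / dt)):
                lap = np.zeros(n)
                lap[1:-1] = ((T[2:] - 2 * T[1:-1] + T[:-2]) / dr**2
                             + (T[2:] - T[:-2]) / dr / r[1:-1])
                lap[0] = 6 * (T[1] - T[0]) / dr**2   # symmetry at the center
                T += alpha * dt * lap
                T[-1] = t_surf
            return r, T

        # 1 mm-radius biomass particle, alpha ~ 1e-7 m^2/s, in 1100 K gas for 2 s.
        r, T = sphere_conduction(1e-3, 41, 1e-7, t_surf=1100.0, t_init=300.0, t_end=2.0)
        print(f"center {T[0]:.0f} K, surface {T[-1]:.0f} K")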

  15. Statistical energy analysis computer program, user's guide

    NASA Technical Reports Server (NTRS)

    Trudell, R. W.; Yano, L. I.

    1981-01-01

    A high-frequency random vibration analysis (the statistical energy analysis (SEA) method) is examined. The SEA method accomplishes high-frequency prediction for arbitrary structural configurations. A general SEA computer program is described. A summary of SEA theory, example problems of SEA program application, and a complete program listing are presented.

  16. Use of Some "Discriminant Analysis" Computer Programs

    ERIC Educational Resources Information Center

    Huberty, Carl J.

    1977-01-01

    The objective of this paper is to review the outputs of selected computer programs often used to carry out a "discriminant analysis" with respect to two purposes of such an analysis, discrimination and classification. The programs selected are three BMD programs. (Author/JKS)

  17. A computational image analysis glossary for biologists.

    PubMed

    Roeder, Adrienne H K; Cunha, Alexandre; Burl, Michael C; Meyerowitz, Elliot M

    2012-09-01

    Recent advances in biological imaging have resulted in an explosion in the quality and quantity of images obtained in a digital format. Developmental biologists are increasingly acquiring beautiful and complex images, thus creating vast image datasets. In the past, patterns in image data have been detected by the human eye. Larger datasets, however, necessitate high-throughput objective analysis tools to computationally extract quantitative information from the images. These tools have been developed in collaborations between biologists, computer scientists, mathematicians and physicists. In this Primer we present a glossary of image analysis terms to aid biologists and briefly discuss the importance of robust image analysis in developmental studies. PMID:22872081

  18. Automating sensitivity analysis of computer models using computer calculus

    SciTech Connect

    Oblow, E.M.; Pin, F.G.

    1985-01-01

    An automated procedure for performing sensitivity analyses has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure with "direct" and "adjoint" sensitivity theory for the analysis of non-linear, iterative systems of equations is discussed. Calculational efficiency considerations and techniques for adjoint sensitivity analysis are emphasized. The new approach is found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure in numerical analysis and large-scale modelling sensitivity studies. 24 refs., 2 figs.
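
    GRESS augments FORTRAN code at compile time so that derivatives propagate alongside values. The same forward-mode idea can be shown with dual numbers in Python; this is an illustration of the computer-calculus concept, not the GRESS system itself:

        import math
        from dataclasses import dataclass

        @dataclass
        class Dual:
            """A value carrying its derivative, propagated by the chain rule."""
            val: float
            der: float = 0.0

            def __add__(self, o):
                o = _lift(o)
                return Dual(self.val + o.val, self.der + o.der)
            __radd__ = __add__

            def __mul__(self, o):
                o = _lift(o)
                return Dual(self.val * o.val, self.der * o.val + self.val * o.der)
            __rmul__ = __mul__

        def _lift(x):
            return x if isinstance(x, Dual) else Dual(float(x))

        def sin(x):
            return Dual(math.sin(x.val), math.cos(x.val) * x.der)

        def model(x):                 # stand-in for a code whose sensitivity we want
            return 3 * x * x + sin(x)

        y = model(Dual(1.2, 1.0))     # seed dx/dx = 1
        print(f"y = {y.val:.5f}, dy/dx = {y.der:.5f}")   # analytic: 6x + cos(x)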

  19. Computer aided nonlinear electrical networks analysis

    NASA Technical Reports Server (NTRS)

    Slapnicar, P.

    1977-01-01

    Techniques used in simulating an electrical circuit with nonlinear elements for use in computer-aided circuit analysis programs are described. Elements of the circuit include capacitors, resistors, inductors, transistors, diodes, and voltage and current sources (constant or time varying). Simulation features are discussed for dc, ac, and/or transient circuit analysis. Calculations are based on the model approach of formulating the circuit equations. A particular solution of transient analysis for nonlinear storage elements is described.
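
    For the dc or transient solution of such nonlinear networks, a standard building block is Newton-Raphson iteration on the circuit equations. A minimal sketch for a single-node series resistor-diode circuit, with illustrative component values (not taken from the program described):

        import math

        def solve_diode_node(vs=5.0, r=1e3, i_s=1e-12, vt=0.02585, tol=1e-12):
            """Solve KCL at the diode node: (v - vs)/r + i_s*(exp(v/vt) - 1) = 0."""
            v = 0.6                                    # guess near a silicon diode drop
            for _ in range(100):
                f = (v - vs) / r + i_s * (math.exp(v / vt) - 1.0)
                dfdv = 1.0 / r + (i_s / vt) * math.exp(v / vt)   # 1x1 Jacobian
                step = f / dfdv
                v -= step
                if abs(step) < tol:
                    break
            return v, (vs - v) / r

        v, i = solve_diode_node()
        print(f"diode voltage = {1e3 * v:.1f} mV, current = {1e3 * i:.3f} mA")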

  20. Characterization of Students' Reasoning and Proof Abilities in 3-Dimensional Geometry

    ERIC Educational Resources Information Center

    Gutierrez, Angel; Pegg, John; Lawrie, Christine

    2004-01-01

    In this paper we report on a research aimed to identify and characterize secondary school students' reasoning and proof abilities when working with 3-dimensional geometric solids. We analyze students' answers to two problems asking them to prove certain properties of prisms. As results of this analysis, we get, on the one side, a characterization…

  1. Computer aided stress analysis of long bones utilizing computer tomography

    SciTech Connect

    Marom, S.A.

    1986-01-01

    A computer aided analysis method utilizing computed tomography (CT) has been developed which, together with a finite element program, determines the stress-displacement pattern in a long bone section. The CT data file provides the geometry, the density, and the material properties for the generated finite element model. A three-dimensional finite element model of a tibial shaft is automatically generated from the CT file by a pre-processing procedure for a finite element program. The developed pre-processor includes an edge detection algorithm which determines the boundaries of the reconstructed cross-sectional images of the scanned bone. A mesh generation procedure then automatically generates a three-dimensional mesh of a user-selected refinement. The elastic properties needed for the stress analysis are individually determined for each model element using the radiographic density (CT number) of each pixel within the element borders. The elastic modulus is determined from the CT radiographic density by using an empirical relationship from the literature. The generated finite element model, together with applied loads determined from existing gait analysis and initial displacements, comprises a formatted input for the SAP IV finite element program. The output of this program, stresses and displacements at the model elements and nodes, is sorted and displayed by a developed post-processor to provide maximum and minimum values at selected locations in the model.
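
    A minimal sketch of the element-property step described above, mapping each element's mean CT number to an elastic modulus through density; the linear calibration and the cubic power law below are illustrative placeholders, not the specific empirical relationship the study used:

        import numpy as np

        def element_modulus(ct_numbers):
            """CT number -> apparent density -> elastic modulus, per element."""
            rho = 1.0 + 0.0007 * np.asarray(ct_numbers, float)   # g/cm^3, hypothetical
            return 3790.0 * rho ** 3                             # MPa, cubic power law

        # Mean CT number of the pixels within each element, cortical -> trabecular.
        print(np.round(element_modulus([1400.0, 1100.0, 600.0, 250.0])))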

  2. Computer analysis of foetal monitoring signals.

    PubMed

    Nunes, Inês; Ayres-de-Campos, Diogo

    2016-01-01

    Five systems for computer analysis of foetal monitoring signals are currently available, incorporating the evaluation of cardiotocographic (CTG) or combined CTG with electrocardiographic ST data. All systems have been integrated with central monitoring stations, allowing the simultaneous monitoring of several tracings on the same computer screen in multiple hospital locations. Computer analysis elicits real-time visual and sound alerts for health care professionals when abnormal patterns are detected, with the aim of prompting a re-evaluation and subsequent clinical action, if considered necessary. Comparison between the CTG analyses provided by the computer and clinical experts has been carried out in all systems, and in three of them, the accuracy of computer alerts in predicting newborn outcomes was evaluated. Comparisons between these studies are hampered by the differences in selection criteria and outcomes. Two of these systems have just completed multicentre randomised clinical trials comparing them with conventional CTG monitoring, and their results are awaited shortly. For the time being, there is limited evidence regarding the impact of computer analysis of foetal monitoring signals on perinatal indicators and on health care professionals' behaviour. PMID:26211832

  3. ASTEC: Controls analysis for personal computers

    NASA Technical Reports Server (NTRS)

    Downing, John P.; Bauer, Frank H.; Thorpe, Christopher J.

    1989-01-01

    The ASTEC (Analysis and Simulation Tools for Engineering Controls) software is under development at Goddard Space Flight Center (GSFC). The design goal is to provide a wide selection of controls analysis tools at the personal computer level, as well as the capability to upload compute-intensive jobs to a mainframe or supercomputer. The project is a follow-on to the INCA (INteractive Controls Analysis) program that has been developed at GSFC over the past five years. While ASTEC makes use of the algorithms and expertise developed for the INCA program, the user interface was redesigned to take advantage of the capabilities of the personal computer. The design philosophy and the current capabilities of the ASTEC software are described.

  4. Discrete computer analysis in petroleum geology

    SciTech Connect

    Zakharian, A.Z.

    1995-08-01

    Computer analysis must not merely imitate the geologist's work; it must take its own approach, because geological information remains uncertain and scarce even at a mature stage of exploration. Our original system of formal discrete computer analysis, implemented in "FoxPro for Windows" with a probabilistic rather than deterministic representation of the geological situation (without ever drawing the usual maps), was used for picking out the sets of best points for exploration drilling in the southern part of the Dneprovsko-Donetzky oil-gas basin.

  5. Temporal fringe pattern analysis with parallel computing

    SciTech Connect

    Tuck Wah Ng; Kar Tien Ang; Argentini, Gianluca

    2005-11-20

    Temporal fringe pattern analysis is invaluable in transient phenomena studies but necessitates long processing times. Here we describe a parallel computing strategy based on the single-program multiple-data model and hyperthreading processor technology to reduce the execution time. In a two-node cluster workstation configuration we found that execution periods were reduced by 1.6 times when four virtual processors were used. To allow even lower execution times with an increasing number of processors, the time allocated for data transfer, data read, and waiting should be minimized. Parallel computing is found here to present a feasible approach to reduce execution times in temporal fringe pattern analysis.
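
    A rough single-program multiple-data sketch in Python, splitting the per-pixel temporal fringe histories into slabs handled by separate worker processes; the FFT-based per-pixel computation and the multiprocessing setup are illustrative assumptions, not the authors' cluster implementation:

        import numpy as np
        from multiprocessing import Pool

        def dominant_frequency(args):
            """One worker's slab: dominant temporal frequency of each pixel history."""
            slab, fs = args
            spectra = np.abs(np.fft.rfft(slab, axis=-1))
            freqs = np.fft.rfftfreq(slab.shape[-1], d=1.0 / fs)
            return freqs[spectra.argmax(axis=-1)]

        def parallel_analysis(frames, fs, n_workers=4):
            histories = frames.transpose(1, 2, 0)        # (row, col, time)
            slabs = np.array_split(histories, n_workers, axis=0)
            with Pool(n_workers) as pool:                # same program, different data
                parts = pool.map(dominant_frequency, [(s, fs) for s in slabs])
            return np.concatenate(parts, axis=0)

        if __name__ == "__main__":
            t = np.arange(64) / 64.0                     # 64 frames of 32x32 pixels
            frames = np.sin(2 * np.pi * 8 * t)[:, None, None] * np.ones((64, 32, 32))
            print(parallel_analysis(frames, fs=64.0).mean())   # ~8.0 Hz everywhere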

  6. Interfacing Computer Aided Parallelization and Performance Analysis

    NASA Technical Reports Server (NTRS)

    Jost, Gabriele; Jin, Haoqiang; Labarta, Jesus; Gimenez, Judit; Biegel, Bryan A. (Technical Monitor)

    2003-01-01

    When porting sequential applications to parallel computer architectures, the program developer will typically go through several cycles of source code optimization and performance analysis. We have started a project to develop an environment where the user can jointly navigate through program structure and performance data information in order to make efficient optimization decisions. In a prototype implementation we have interfaced the CAPO computer aided parallelization tool with the Paraver performance analysis tool. We describe both tools and their interface and give an example for how the interface helps within the program development cycle of a benchmark code.

  7. X ray computed tomography for failure analysis

    NASA Astrophysics Data System (ADS)

    Bossi, Richard H.; Crews, Alan R.; Georgeson, Gary E.

    1992-08-01

    Under a preliminary testing task assignment of the Advanced Development of X-Ray Computed Tomography Application program, computed tomography (CT) has been studied for its potential as a tool to assist in failure analysis investigations. CT provides three-dimensional spatial distribution of material that can be used to assess internal configurations and material conditions nondestructively. This capability has been used in failure analysis studies to determine the position of internal components and their operation. CT is particularly advantageous on complex systems, composite failure studies, and testing under operational or environmental conditions. CT plays an important role in reducing the time and effort of a failure analysis investigation. Aircraft manufacturing or logistical facilities perform failure analysis operations routinely and could be expected to reduce schedules, reduce costs and/or improve evaluation on about 10 to 30 percent of the problems they investigate by using CT.

  8. Computer based terrain analysis for operational planning

    SciTech Connect

    Powell, D.R.

    1987-01-01

    Analysis of operational capability is an ongoing task for military commanders. In peacetime, most analysis is conducted via computer-based combat simulations, where selected force structures engage in simulated combat to gain insight into specific scenarios. The command and control (C2) mechanisms that direct combat forces are often neglected relative to the fidelity of representation of mechanical and physical entities. C2 capabilities should include the ability to plan a mission, monitor execution activities, and redirect combat power when appropriate. This paper discusses the development of a computer-based approach to mission planning for land warfare. The aspect emphasized is the computation and representation of relevant terrain features in the context of operational planning.

  9. Final Report Computational Analysis of Dynamical Systems

    SciTech Connect

    Guckenheimer, John

    2012-05-08

    This is the final report for DOE Grant DE-FG02-93ER25164, initiated in 1993. This grant supported research of John Guckenheimer on computational analysis of dynamical systems. During that period, seventeen individuals received PhD degrees under the supervision of Guckenheimer and over fifty publications related to the grant were produced. This document contains copies of these publications.

  10. COMPUTER ANALYSIS OF PLANAR GAMMA CAMERA IMAGES

    EPA Science Inventory



    T Martonen1 and J Schroeter2

    1Experimental Toxicology Division, National Health and Environmental Effects Research Laboratory, U.S. EPA, Research Triangle Park, NC 27711 USA and 2Curriculum in Toxicology, Unive...

  11. Computational analysis of a multistage axial compressor

    NASA Astrophysics Data System (ADS)

    Mamidoju, Chaithanya

    Turbomachines are used extensively in the aerospace, power generation, and oil and gas industries. The efficiency of these machines is often an important factor and has led to a continuous effort to improve designs to achieve better efficiency. The axial-flow compressor is a major component in a gas turbine, with the turbine's overall performance depending strongly on compressor performance. Traditional analysis of axial compressors involves throughflow calculations, isolated blade-passage analysis, quasi-3D blade-to-blade analysis, single-stage (rotor-stator) analysis, and multi-stage analysis involving larger design cycles. In the current study, the detailed flow through a 15-stage axial compressor is analyzed using a 3-D Navier-Stokes CFD solver in a parallel computing environment. A methodology is described for steady-state (frozen rotor-stator) analysis of one blade passage per component. Various effects such as mesh type and density, boundary conditions, and tip clearance, and numerical issues such as turbulence model choice, advection model choice, and parallel processing performance, are analyzed. A high sensitivity of the predictions to the above was found. Physical explanations are given for the flow features observed in the computational study. The total pressure rise versus mass flow rate was computed.

  12. Development and Validation of a 3-Dimensional CFB Furnace Model

    NASA Astrophysics Data System (ADS)

    Vepsäläinen, Ari; Myöhänen, Kari; Hyppänen, Timo; Leino, Timo; Tourunen, Antti

    At Foster Wheeler, a three-dimensional CFB furnace model is an essential part of knowledge development for the CFB furnace process regarding solid mixing, combustion, emission formation, and heat transfer. Results of laboratory- and pilot-scale phenomenon research are utilized in the development of sub-models. Analyses of field-test results in industrial-scale CFB boilers, including furnace profile measurements, are carried out simultaneously with the development of 3-dimensional process modeling, providing a chain of knowledge that is fed back into phenomenon research. Knowledge gathered in model validation studies and up-to-date parameter databases is utilized in performance prediction and design development of CFB boiler furnaces. This paper reports recent development steps related to modeling of combustion and of the formation of char and volatiles for various fuel types under CFB conditions. A new model for predicting the formation of nitrogen oxides is also presented. Validation of mixing and combustion parameters for solids and gases is based on test balances at several large-scale CFB boilers combusting coal, peat, and bio-fuels. Field tests, including lateral and vertical furnace profile measurements and characterization of solid materials, provide a window into fuel-specific mixing and combustion behavior in the CFB furnace at different loads and operating conditions. Measured horizontal gas profiles are a projection of the balance between fuel mixing and reactions in the lower part of the furnace; together with lateral temperature profiles at the bed and in the upper parts of the furnace, they are used to determine solid mixing and combustion model parameters. Modeling of char- and volatile-based formation of NO profiles is followed by analysis of the oxidizing and reducing regions formed by the lower furnace design and the mixing characteristics of fuel and combustion air, which affect the NO furnace profile through reduction and volatile-nitrogen reactions. This paper presents

  13. Risk analysis of computer system designs

    NASA Technical Reports Server (NTRS)

    Vallone, A.

    1981-01-01

    Adverse events during implementation can affect the final capabilities, schedule, and cost of a computer system even though the system was accurately designed and evaluated. Risk analysis enables the manager to forecast the impact of such events and to request design revisions or contingency plans in a timely manner before making any decision. This paper presents a structured procedure for an effective risk analysis. The procedure identifies the required activities, separates subjective assessments from objective evaluations, and defines a risk measure to determine the analysis results. The procedure is consistent with the system design evaluation and enables a meaningful comparison among alternative designs.

  14. Computer design and analysis of vacuum systems

    SciTech Connect

    Santeler, D.J.

    1987-07-01

    A computer program has been developed for an IBM-compatible personal computer to assist in the design and analysis of vacuum systems. The program has a selection of 12 major schematics with several thousand minor variants incorporating diffusion, turbomolecular, cryogenic, ion, mechanical, and sorption pumps as well as circular tubes, bends, valves, traps, and purge gas connections. The gas throughput versus the inlet pressure of the pump is presented on a log-log graphical display. The conductance of each series component is sequentially added to the graph to obtain the net system behavior Q(P). The component conductances may be calculated either from the inlet area and the transmission probability or from the tube length and the diameter. The gas-flow calculations are valid for orifices, short tubes, and long tubes throughout the entire pressure range from molecular through viscous to choked and nonchoked exit flows. The roughing-pump and high-vacuum-pump characteristic curves are numerically integrated to provide a graphical presentation of the system pumpdown. Outgassing data for different materials are then combined to produce a graph of the net system "outgassing pressure." Computer routines are provided for differentiating a real pumpdown curve for system analysis. The computer program is included with the American Vacuum Society course, "Advanced Vacuum System Design and Analysis," or it may be purchased from Process Applications, Inc.
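
    The series-combination rule described above is straightforward to sketch. The following is a minimal Python illustration, not the Santeler program itself: it combines molecular-flow conductances in series and estimates the effective pumping speed at the chamber, using standard textbook formulas for air at 20 °C; all component dimensions and the pump speed are hypothetical.

    ```python
    def tube_conductance(d_cm, l_cm):
        """Long-tube molecular-flow conductance for air at 20 C, in L/s."""
        return 12.1 * d_cm**3 / l_cm

    def orifice_conductance(area_cm2):
        """Thin-orifice molecular-flow conductance for air at 20 C, in L/s."""
        return 11.6 * area_cm2

    def series_conductance(conductances):
        """Net conductance of series components: 1/C_net = sum(1/C_i)."""
        return 1.0 / sum(1.0 / c for c in conductances)

    # Hypothetical example: a 500 L/s pump behind a 10 cm dia x 50 cm long tube
    # and a 10 cm dia orifice (area ~78.5 cm^2).
    c_net = series_conductance([tube_conductance(10.0, 50.0),
                                orifice_conductance(78.5)])
    s_eff = 1.0 / (1.0 / 500.0 + 1.0 / c_net)   # effective speed at the chamber
    print(f"net conductance {c_net:.0f} L/s, effective speed {s_eff:.0f} L/s")
    ```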

  15. Computational strategies for tire monitoring and analysis

    NASA Technical Reports Server (NTRS)

    Danielson, Kent T.; Noor, Ahmed K.; Green, James S.

    1995-01-01

    Computational strategies are presented for the modeling and analysis of tires in contact with pavement. A procedure is introduced for simple and accurate determination of tire cross-sectional geometric characteristics from a digitally scanned image. Three new strategies for reducing the computational effort in the finite element solution of tire-pavement contact are also presented. These strategies take advantage of the observation that footprint loads do not usually stimulate a significant tire response away from the pavement contact region. The finite element strategies differ in their level of approximation and required amount of computer resources. The effectiveness of the strategies is demonstrated by numerical examples of frictionless and frictional contact of the space shuttle Orbiter nose-gear tire. Both an in-house research code and a commercial finite element code are used in the numerical studies.

  16. The 3-dimensional construction of the Rae craton, central Canada

    NASA Astrophysics Data System (ADS)

    Snyder, David B.; Craven, James A.; Pilkington, Mark; Hillier, Michael J.

    2015-10-01

    Reconstruction of the 3-dimensional tectonic assembly of early continents, first as Archean cratons and then Proterozoic shields, remains poorly understood. In this paper, all readily available geophysical and geochemical data are assembled in a 3-D model with the most accurate bedrock geology in order to understand better the geometry of major structures within the Rae craton of central Canada. Analysis of geophysical observations of gravity and seismic wave speed variations revealed several lithospheric-scale discontinuities in physical properties. Where these discontinuities project upward to correlate with mapped upper crustal geological structures, the discontinuities can be interpreted as shear zones. Radiometric dating of xenoliths provides estimates of rock types and ages at depth beneath sparse kimberlite occurrences. These ages can also be correlated to surface rocks. The 3.6-2.6 Ga Rae craton comprises at least three smaller continental terranes, which "cratonized" during a granitic bloom. Cratonization probably represents final differentiation of early crust into a relatively homogeneous, uniformly thin (35-42 km), tonalite-trondhjemite-granodiorite crust with pyroxenite layers near the Moho. The peak thermotectonic event at 1.86-1.7 Ga was associated with the Hudsonian orogeny that assembled several cratons and lesser continental blocks into the Canadian Shield using a number of southeast-dipping megathrusts. This orogeny metasomatized, mineralized, and recrystallized mantle and lower crustal rocks, apparently making them more conductive by introducing or concentrating sulfides or graphite. Little evidence exists of thin slabs similar to modern oceanic lithosphere in this Precambrian construction history whereas underthrusting and wedging of continental lithosphere is inferred from multiple dipping discontinuities.

  17. Computational analysis of forebody tangential slot blowing

    NASA Technical Reports Server (NTRS)

    Gee, Ken; Agosta-Greenman, Roxana M.; Rizk, Yehia M.; Schiff, Lewis B.; Cummings, Russell M.

    1994-01-01

    An overview of the computational effort to analyze forebody tangential slot blowing is presented. Tangential slot blowing generates side force and yawing moment, which may be used to control an aircraft flying at high angles of attack. Two different geometries are used in the analysis: (1) the High Alpha Research Vehicle; and (2) a generic chined forebody. Computations using the isolated F/A-18 forebody are obtained at full-scale wind tunnel test conditions for direct comparison with available experimental data. The effects of over- and under-blowing on force and moment production are analyzed. Time-accurate solutions using the isolated forebody are obtained to study the force onset timelag of tangential slot blowing. Computations using the generic chined forebody are obtained at experimental wind tunnel conditions, and the results compared with available experimental data. This computational analysis complements the experimental results and provides a detailed understanding of the effects of tangential slot blowing on the flow field about simple and complex geometries.

  18. Probabilistic structural analysis computer code (NESSUS)

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.

    1988-01-01

    Probabilistic structural analysis has been developed to analyze the effects of fluctuating loads, variable material properties, and uncertain analytical models, especially for high-performance structures such as SSME turbopump blades. The computer code NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) was developed to serve as a primary computation tool for characterizing, by statistical description, the probabilistic structural response to stochastic environments. The code consists of three major modules: NESSUS/PRE, NESSUS/FEM, and NESSUS/FPI. NESSUS/PRE is a preprocessor which decomposes the spatially correlated random variables into a set of uncorrelated random variables using a modal analysis method. NESSUS/FEM is a finite element module which provides structural sensitivities to all the random variables considered. NESSUS/FPI is a Fast Probability Integration module with which a cumulative distribution function or a probability density function is calculated.
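
    A minimal sketch of the decorrelation idea behind NESSUS/PRE (an illustration of modal decomposition, not the NESSUS code itself): an eigendecomposition of a hypothetical covariance matrix yields uncorrelated variables, which can be sampled independently and mapped back to correlated space.

    ```python
    # Decompose correlated random variables into uncorrelated ones via an
    # eigen (modal) decomposition of the covariance matrix.
    import numpy as np

    cov = np.array([[4.0, 1.2, 0.6],   # hypothetical covariance of three
                    [1.2, 2.5, 0.8],   # correlated random variables
                    [0.6, 0.8, 1.5]])

    eigvals, eigvecs = np.linalg.eigh(cov)    # modal analysis of the covariance

    # Sample uncorrelated standard normals, then map back to correlated space.
    rng = np.random.default_rng(0)
    z = rng.standard_normal((10000, 3))       # uncorrelated variables
    x = z * np.sqrt(eigvals) @ eigvecs.T      # correlated samples

    print(np.round(np.cov(x.T), 2))           # should approximate cov
    ```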

  19. Statistical Data Analysis in the Computer Age

    NASA Astrophysics Data System (ADS)

    Efron, Bradley; Tibshirani, Robert

    1991-07-01

    Most of our familiar statistical methods, such as hypothesis testing, linear regression, analysis of variance, and maximum likelihood estimation, were designed to be implemented on mechanical calculators. Modern electronic computation has encouraged a host of new statistical methods that require fewer distributional assumptions than their predecessors and can be applied to more complicated statistical estimators. These methods allow the scientist to explore and describe data and draw valid statistical inferences without the usual concerns for mathematical tractability. This is possible because traditional methods of mathematical analysis are replaced by specially constructed computer algorithms. Mathematics has not disappeared from statistical theory. It is the main method for deciding which algorithms are correct and efficient tools for automating statistical inference.
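
    As an illustration of the kind of computer-intensive method the authors describe, here is a minimal nonparametric bootstrap sketch estimating the standard error of a sample median, a statistic with no convenient closed-form formula; the data and parameters are invented for the example.

    ```python
    # Bootstrap standard error of the median: resample with replacement,
    # recompute the statistic, and take the spread of the replicates.
    import numpy as np

    rng = np.random.default_rng(42)
    data = rng.lognormal(mean=0.0, sigma=1.0, size=50)   # skewed sample

    B = 2000
    medians = np.empty(B)
    for b in range(B):
        resample = rng.choice(data, size=data.size, replace=True)
        medians[b] = np.median(resample)

    print(f"median {np.median(data):.3f}, "
          f"bootstrap SE {medians.std(ddof=1):.3f}")
    ```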

  20. MT3D: a 3 dimensional magnetotelluric modeling program (user's guide and documentation for Rev. 1)

    SciTech Connect

    Nutter, C.; Wannamaker, P.E.

    1980-11-01

    MT3D.REV1 is a non-interactive computer program written in FORTRAN to do 3-dimensional magnetotelluric modeling. A 3-D volume integral equation has been adapted to simulate the MT response of a 3D body in the earth. An integro-difference scheme has been incorporated to increase the accuracy. This is a user's guide for MT3D.REV1 on the University of Utah Research Institute's (UURI) PRIME 400 computer operating under PRIMOS IV, Rev. 17.

  1. Differential Cross Section Kinematics for 3-dimensional Transport Codes

    NASA Technical Reports Server (NTRS)

    Norbury, John W.; Dick, Frank

    2008-01-01

    In support of the development of 3-dimensional transport codes, this paper derives the relevant relativistic particle kinematic theory. Formulas are given for invariant, spectral, and angular distributions in both the lab (spacecraft) and center-of-momentum frames, for collisions involving 2-, 3-, and n-body final states.
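
    For reference, the standard two-body relativistic kinematics underlying such lab/center-of-momentum transformations can be summarized as follows (textbook background with c = 1; the paper's own notation and n-body formulas are not reproduced here):

    ```latex
    % Invariant mass and lab-to-CM boost for a projectile (m_1, E_lab)
    % striking a target m_2 at rest (units with c = 1):
    \begin{align}
      s &= (p_1 + p_2)^2 = m_1^2 + m_2^2 + 2\,m_2 E_{\mathrm{lab}}, \qquad
      E_{\mathrm{cm}} = \sqrt{s}, \\
      \gamma_{\mathrm{cm}} &= \frac{E_{\mathrm{lab}} + m_2}{\sqrt{s}}, \qquad
      \beta_{\mathrm{cm}} = \frac{|\vec{p}_{\mathrm{lab}}|}{E_{\mathrm{lab}} + m_2}.
    \end{align}
    ```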

  2. Controlled teleportation of a 3-dimensional bipartite quantum state

    NASA Astrophysics Data System (ADS)

    Cao, Hai-Jing; Chen, Zhong-Hua; Song, He-Shan

    2008-07-01

    A controlled teleportation scheme of an unknown 3-dimensional (3D) two-particle quantum state is proposed, where a 3D Bell state and 3D GHZ state function as the quantum channel. This teleportation scheme can be directly generalized to teleport an unknown d-dimensional bipartite quantum state.

  3. Aerodynamic analysis of Pegasus - Computations vs reality

    NASA Technical Reports Server (NTRS)

    Mendenhall, Michael R.; Lesieutre, Daniel J.; Whittaker, C. H.; Curry, Robert E.; Moulton, Bryan

    1993-01-01

    Pegasus, a three-stage, air-launched, winged space booster, was developed to provide fast and efficient commercial launch services for small satellites. The aerodynamic design and analysis of Pegasus were conducted without the benefit of wind tunnel tests, using only computational aerodynamic and fluid dynamic methods. Flight test data from the first two operational flights of Pegasus are now available, and they provide an opportunity to validate the accuracy of the predicted pre-flight aerodynamic characteristics. Comparisons of measured and predicted flight characteristics are presented and discussed. Results show that the computational methods provide reasonable aerodynamic design information with acceptable margins. Post-flight analyses illustrate certain areas in which improvements are desired.

  4. Semiconductor Device Analysis on Personal Computers

    Energy Science and Technology Software Center (ESTSC)

    1993-02-08

    PC-1D models the internal operation of bipolar semiconductor devices by solving for the concentrations and quasi-one-dimensional flow of electrons and holes resulting from either electrical or optical excitation. PC-1D uses the same detailed physical models incorporated in mainframe computer programs, yet runs efficiently on personal computers. PC-1D was originally developed with DOE funding to analyze solar cells. That continues to be its primary mode of usage, with registered copies in regular use at more than 100 locations worldwide. The program has been successfully applied to the analysis of silicon, gallium-arsenide, and indium-phosphide solar cells. The program is also suitable for modeling bipolar transistors and diodes, including heterojunction devices. Its easy-to-use graphical interface makes it useful as a teaching tool as well.

  5. Meaningful statistical analysis of large computational clusters.

    SciTech Connect

    Gentile, Ann C.; Marzouk, Youssef M.; Brandt, James M.; Pebay, Philippe Pierre

    2005-07-01

    Effective monitoring of large computational clusters demands the analysis of a vast amount of raw data from a large number of machines. The fundamental interactions of the system are not, however, well-defined, making it difficult to draw meaningful conclusions from this data, even if one were able to efficiently handle and process it. In this paper we show that computational clusters, because they are composed of a large number of identical machines, behave in a statistically meaningful fashion. We therefore can employ normal statistical methods to derive information about individual systems and their environment and to detect problems sooner than with traditional mechanisms. We discuss design details necessary to use these methods on a large system in a timely and low-impact fashion.
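
    A minimal sketch of the underlying idea (not the paper's monitoring system): because the machines are nominally identical, a per-metric population statistic can flag outlier nodes; the metric and threshold below are hypothetical.

    ```python
    # Treat identical nodes as a statistical population and flag nodes whose
    # metric is more than a 3-sigma outlier relative to the population.
    import numpy as np

    rng = np.random.default_rng(1)
    cpu_temp = rng.normal(55.0, 2.0, size=1024)   # hypothetical per-node metric
    cpu_temp[137] = 78.0                          # one failing node

    z = (cpu_temp - cpu_temp.mean()) / cpu_temp.std(ddof=1)
    suspects = np.flatnonzero(np.abs(z) > 3.0)
    print("nodes needing attention:", suspects)   # -> [137]
    ```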

  6. Computational frameworks for discrete Gabor analysis

    NASA Astrophysics Data System (ADS)

    Strohmer, Thomas

    1997-10-01

    The Gabor transform yields a discrete representation of a signal in the phase space. Since the Gabor transform is non-orthogonal, efficient reconstruction of a signal from its phase space samples is not straightforward and involves the computation of the so-called dual Gabor function. We present a unifying approach to the derivation of numerical algorithms for discrete Gabor analysis, based on unitary matrix factorization. The factorization point of view is notably useful for the design of efficient numerical algorithms. This presentation is the first systematic account of its kind. In particular, it is shown that different algorithms for the computation of the dual window correspond to different factorizations of the frame operator. Simple number-theoretic conditions on the time-frequency lattice parameters imply additional structural properties of the frame operator.
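
    For intuition, the dual window can be computed directly from its definition by inverting the frame operator, as in the brute-force numpy sketch below; the paper's contribution is precisely the structured factorizations that avoid this dense computation. The lattice parameters and Gaussian window are arbitrary choices.

    ```python
    import numpy as np

    L, a, b = 48, 4, 6                   # signal length, time step, frequency step
    N, M = L // a, L // b                # number of time / frequency shifts
    k = np.arange(L)
    g = np.exp(-0.5 * ((k - L / 2) / 5.0) ** 2)   # Gaussian analysis window

    def tf_atoms(window):
        """All time-frequency shifts of a window on the (a, b) lattice."""
        return np.stack([np.roll(window, n * a) * np.exp(2j * np.pi * m * b * k / L)
                         for n in range(N) for m in range(M)], axis=1)

    A = tf_atoms(g.astype(complex))                # analysis atoms, L x (M*N)
    S = A @ A.conj().T                             # frame operator, L x L
    gamma = np.linalg.solve(S, g.astype(complex))  # dual window: S^{-1} g

    # Perfect reconstruction from phase-space samples: x = sum <x, g_mn> gamma_mn
    x = np.random.default_rng(2).standard_normal(L)
    x_rec = tf_atoms(gamma) @ (A.conj().T @ x)
    print(np.allclose(x_rec.real, x))              # True (redundancy L/(a*b) = 2)
    ```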

  7. FORTRAN computer program for seismic risk analysis

    USGS Publications Warehouse

    McGuire, Robin K.

    1976-01-01

    A program for seismic risk analysis is described which combines generality of application, efficiency and accuracy of operation, and the advantage of small storage requirements. The theoretical basis for the program is first reviewed, and the computational algorithms used to apply this theory are described. The information required for running the program is listed. Published attenuation functions describing the variation with earthquake magnitude and distance of expected values for various ground motion parameters are summarized for reference by the program user. Finally, suggestions for use of the program are made, an example problem is described (along with example problem input and output) and the program is listed.

  8. Multimodality 3-Dimensional Image Integration for Congenital Cardiac Catheterization

    PubMed Central

    2014-01-01

    Cardiac catheterization procedures for patients with congenital and structural heart disease are becoming more complex. New imaging strategies involving integration of 3-dimensional images from rotational angiography, magnetic resonance imaging (MRI), computerized tomography (CT), and transesophageal echocardiography (TEE) are employed to facilitate these procedures. We discuss the current use of these new 3D imaging technologies and their advantages and challenges when used to guide complex diagnostic and interventional catheterization procedures in patients with congenital heart disease. PMID:25114757

  9. Computer network environment planning and analysis

    NASA Technical Reports Server (NTRS)

    Dalphin, John F.

    1989-01-01

    The GSFC Computer Network Environment provides a broadband RF cable between campus buildings and Ethernet spines in buildings for the interlinking of Local Area Networks (LANs). This system provides terminal and computer linkage among host and user systems, thereby providing E-mail services, file exchange capability, and certain distributed computing opportunities. The Environment is designed to be transparent and supports multiple protocols. Networking at Goddard has a short history and has been under the coordinated control of a Network Steering Committee for slightly more than two years; network growth has been rapid, with more than 1500 nodes currently addressed and greater expansion expected. A new RF cable system with a different topology is being installed during summer 1989; consideration of a fiber optics system for the future will begin soon. Summer study was directed toward Network Steering Committee operation and planning, plus consideration of Center Network Environment analysis and modeling. Biweekly Steering Committee meetings were attended to learn the background of the network and the concerns of those managing it. Suggestions for historical data gathering have been made to support future planning and modeling. Data Systems Dynamic Simulator, a simulation package developed at NASA and maintained at GSFC, was studied as a possible modeling tool for the network environment. A modeling concept based on a hierarchical model was hypothesized for further development. Such a model would allow input of newly updated parameters and would provide an estimation of the behavior of the network.

  10. Experimental analysis of computer system dependability

    NASA Technical Reports Server (NTRS)

    Iyer, Ravishankar K.; Tang, Dong

    1993-01-01

    This paper reviews an area which has evolved over the past 15 years: experimental analysis of computer system dependability. Methodologies and advances are discussed for three basic approaches used in the area: simulated fault injection, physical fault injection, and measurement-based analysis. The three approaches are suited, respectively, to dependability evaluation in the three phases of a system's life: design phase, prototype phase, and operational phase. Before the discussion of these phases, several statistical techniques used in the area are introduced. For each phase, a classification of research methods or study topics is outlined, followed by discussion of these methods or topics as well as representative studies. The statistical techniques introduced include the estimation of parameters and confidence intervals, probability distribution characterization, and several multivariate analysis methods. Importance sampling, a statistical technique used to accelerate Monte Carlo simulation, is also introduced. The discussion of simulated fault injection covers electrical-level, logic-level, and function-level fault injection methods as well as representative simulation environments such as FOCUS and DEPEND. The discussion of physical fault injection covers hardware, software, and radiation fault injection methods as well as several software and hybrid tools including FIAT, FERRARI, HYBRID, and FINE. The discussion of measurement-based analysis covers measurement and data processing techniques, basic error characterization, dependency analysis, Markov reward modeling, software dependability, and fault diagnosis. The discussion involves several important issues studied in the area, including fault models, fast simulation techniques, workload/failure dependency, correlated failures, and software fault tolerance.

  11. Successful Parenchyma-Sparing Anatomical Surgery by 3-Dimensional Reconstruction of Hilar Cholangiocarcinoma Combined with Anatomic Variation.

    PubMed

    Ni, Qihong; Wang, Haolu; Liang, Xiaowen; Zhang, Yunhe; Chen, Wei; Wang, Jian

    2016-06-01

    The combination of hilar cholangiocarcinoma and anatomic variation constitutes a rare and complicated condition. A precise understanding of the 3-dimensional position of the tumor within the intrahepatic structures is important in such cases for operation planning and navigation. We report the case of a 61-year-old woman presenting with hilar cholangiocarcinoma. The anatomic variation and tumor location were well depicted on preoperative multidetector computed tomography (MDCT) combined with 3-dimensional reconstruction: the right posterior segmental duct drained to the left hepatic duct. The common hepatic duct, biliary confluence, right anterior segmental duct, and right anterior branch of the portal vein were involved by the tumor (Bismuth IIIa). After careful operation planning, we successfully performed radical parenchyma-sparing anatomical surgery for hilar cholangiocarcinoma: liver segmentectomy (segments 5 and 8) and caudate lobectomy. MDCT combined with 3-dimensional reconstruction is a reliable non-invasive modality for preoperative evaluation of hilar cholangiocarcinoma. PMID:27376205

  12. Good relationships between computational image analysis and radiological physics

    SciTech Connect

    Arimura, Hidetaka; Kamezawa, Hidemi; Jin, Ze; Nakamoto, Takahiro; Soufi, Mazen

    2015-09-30

    Good relationships between computational image analysis and radiological physics have been constructed to increase the accuracy of medical diagnostic imaging and radiation therapy. Computational image analysis has been established on the basis of applied mathematics, physics, and engineering. This review paper introduces how computational image analysis is useful in radiation therapy with respect to radiological physics.

  13. Good relationships between computational image analysis and radiological physics

    NASA Astrophysics Data System (ADS)

    Arimura, Hidetaka; Kamezawa, Hidemi; Jin, Ze; Nakamoto, Takahiro; Soufi, Mazen

    2015-09-01

    Good relationships between computational image analysis and radiological physics have been constructed to increase the accuracy of medical diagnostic imaging and radiation therapy. Computational image analysis has been established on the basis of applied mathematics, physics, and engineering. This review paper introduces how computational image analysis is useful in radiation therapy with respect to radiological physics.

  14. Probabilistic Computational Methods in Structural Failure Analysis

    NASA Astrophysics Data System (ADS)

    Krejsa, Martin; Kralik, Juraj

    2015-12-01

    Probabilistic methods are used in engineering where a computational model contains random variables. Each random variable in the probabilistic calculations carries uncertainty. Typical sources of uncertainty are the properties of the material, production and/or assembly inaccuracies in the geometry, and the environment where the structure is located. The paper focuses on methods for calculating failure probabilities in structural failure and reliability analysis, with special attention to a newly developed probabilistic method, Direct Optimized Probabilistic Calculation (DOProC), which is highly efficient in terms of calculation time and solution accuracy. The novelty of the proposed method lies in an optimized numerical integration that does not require any simulation technique. The algorithm has been implemented in dedicated software applications and has been used repeatedly in probabilistic tasks and probabilistic reliability assessments.
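
    A minimal sketch of simulation-free probabilistic integration in the spirit of DOProC (not the actual DOProC algorithm): the failure probability P(R - S < 0) is obtained by directly summing products of discretized probability masses for a hypothetical resistance R and load effect S.

    ```python
    # Direct numerical integration of a failure probability from discretized
    # distributions, with no Monte Carlo sampling involved.
    import numpy as np

    r_vals = np.linspace(200.0, 400.0, 201)   # resistance grid (hypothetical units)
    s_vals = np.linspace(100.0, 350.0, 201)   # load-effect grid

    def normal_hist(grid, mean, std):
        """Discrete probability masses of a normal law on the given grid."""
        w = np.exp(-0.5 * ((grid - mean) / std) ** 2)
        return w / w.sum()

    pR = normal_hist(r_vals, 300.0, 25.0)
    pS = normal_hist(s_vals, 220.0, 30.0)

    # Direct double sum over all (R, S) combinations where failure occurs.
    fail = r_vals[:, None] < s_vals[None, :]
    p_f = (pR[:, None] * pS[None, :])[fail].sum()
    print(f"failure probability ~ {p_f:.2e}")   # ~2e-2 for these parameters
    ```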

  15. Computed tomographic analysis of meteorite inclusions

    NASA Technical Reports Server (NTRS)

    Arnold, J. R.; Testa, J. P., Jr.; Friedman, P. J.; Kambic, G. X.

    1983-01-01

    The feasibility of obtaining nondestructively a cross-sectional display of very dense heterogeneous rocky specimens, whether lunar, terrestrial or meteoritic, by using a fourth generation computed tomographic (CT) scanner, with modifications to the software only, is discussed. A description of the scanner, and of the experimental and analytical procedures is given. Using this technique, the interior of heterogeneous materials such as Allende can be probed nondestructively. The regions of material with high and low atomic numbers are displayed quickly; the object can then be cut to obtain for analysis just the areas of interest. A comparison of this technique with conventional industrial and medical techniques is made in terms of image resolution and density distribution display precision.

  16. Computational based functional analysis of Bacillus phytases.

    PubMed

    Verma, Anukriti; Singh, Vinay Kumar; Gaur, Smriti

    2016-02-01

    Phytase is an enzyme which catalyzes the total hydrolysis of phytate to less-phosphorylated myo-inositol derivatives and inorganic phosphate, and digests the indigestible phytate fraction present in seeds and grains, thereby providing digestible phosphorus, calcium, and other mineral nutrients. Phytases are frequently added to the feed of monogastric animals to increase the bioavailability of phytic acid-bound phosphate, ultimately enhancing the nutritional value of diets. Bacillus phytase is well suited for use in animal feed because of its optimum pH and excellent thermal stability. The present study aims to perform an in silico comparative characterization and functional analysis of phytases from Bacillus amyloliquefaciens, exploring their physico-chemical properties using various bio-computational tools. All the proteins are acidic and thermostable and can be used as suitable candidates in the feed industry. PMID:26672917

  17. Review of Computational Stirling Analysis Methods

    NASA Technical Reports Server (NTRS)

    Dyson, Rodger W.; Wilson, Scott D.; Tew, Roy C.

    2004-01-01

    Nuclear thermal-to-electric power conversion carries the promise of longer-duration missions and higher scientific data transmission rates back to Earth for both Mars rovers and deep space missions. A free-piston Stirling convertor is a candidate technology that is considered an efficient and reliable power conversion device for such purposes. While already very efficient, it is believed that better Stirling engines can be developed if the losses inherent in current designs could be better understood. However, these engines are difficult to instrument, and so efforts are underway to simulate a complete Stirling engine numerically. This has only recently been attempted, and a review of the methods leading up to and including such computational analysis is presented. Finally, it is proposed that the quality and depth of understanding of Stirling losses may be improved by utilizing the higher fidelity and efficiency of recently developed numerical methods. One such method, the Ultra Hi-Fi technique, is presented in detail.

  18. The 3-dimensional cellular automata for HIV infection

    NASA Astrophysics Data System (ADS)

    Mo, Youbin; Ren, Bin; Yang, Wencao; Shuai, Jianwei

    2014-04-01

    The HIV infection dynamics is discussed in detail with a 3-dimensional cellular automata model in this paper. The model can reproduce the three-phase development, i.e., the acute period, the asymptomatic period, and the AIDS period, observed in HIV-infected patients in the clinic. We show that the 3D HIV model is more robust to the model parameters than the 2D cellular automata. Furthermore, we reveal that the occurrence of a perpetual source that successively generates infectious waves spreading to the whole system drives the model from the asymptomatic state to the AIDS state.

  19. A 3-dimensional finite-difference method for calculating the dynamic coefficients of seals

    NASA Technical Reports Server (NTRS)

    Dietzen, F. J.; Nordmann, R.

    1989-01-01

    A method to calculate the dynamic coefficients of seals with arbitrary geometry is presented. The Navier-Stokes equations are used in conjunction with the k-ε turbulence model to describe the turbulent flow. These equations are solved by a full 3-dimensional finite-difference procedure instead of the normally used perturbation analysis. The time dependence of the equations is introduced by working with a coordinate system rotating with the precession frequency of the shaft. The results of this theory are compared with coefficients calculated by a perturbation analysis and with experimental results.

  20. Analysis on the security of cloud computing

    NASA Astrophysics Data System (ADS)

    He, Zhonglin; He, Yuhua

    2011-02-01

    Cloud computing is a new technology arising from the fusion of computer technology and Internet development, and it is expected to lead a revolution in the IT and information fields. However, in cloud computing, data and application software are stored at large data centers, and the management of data and services is not completely trustworthy, resulting in safety problems that make it difficult to improve the quality of cloud services. This paper briefly introduces the concept of cloud computing. Considering the characteristics of cloud computing, it constructs a security architecture for cloud computing. At the same time, with an eye toward the security threats cloud computing faces, several corresponding strategies are provided from the perspectives of cloud computing users and service providers.

  1. Computer Analysis of a Physical Pendulum.

    ERIC Educational Resources Information Center

    Priest, Joseph; Potts, Larry

    1990-01-01

    The interfacing of a physical pendulum to an Apple IIe computer and the physics instruction associated with it are discussed. Laboratory procedures, software commands, and computations used in this lesson are described. (CW)

  2. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    NASA Technical Reports Server (NTRS)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is disadvantaged when compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two-part model framework characterizes both the demand, using a probability distribution for each type of service request, and the enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
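
    A minimal discrete-event sketch of the two-part idea, stochastic demand contending for a constrained resource pool, is shown below; it assumes the simpy library, and all request types, rates, and capacities are hypothetical rather than taken from the paper.

    ```python
    # Poisson arrivals of service requests competing for a fixed server pool.
    import random
    import simpy

    def request(env, name, servers, service_mean):
        arrival = env.now
        with servers.request() as slot:
            yield slot                                   # wait for a free server
            yield env.timeout(random.expovariate(1.0 / service_mean))
        print(f"{name}: done at t={env.now:.2f} "
              f"(time in system {env.now - arrival:.2f})")

    def generator(env, servers):
        for i in range(10):
            yield env.timeout(random.expovariate(1.0))   # Poisson arrivals
            env.process(request(env, f"req-{i}", servers, service_mean=2.0))

    random.seed(7)
    env = simpy.Environment()
    servers = simpy.Resource(env, capacity=2)            # constrained pool
    env.process(generator(env, servers))
    env.run()
    ```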

  3. Computing in Qualitative Analysis: A Healthy Development?

    ERIC Educational Resources Information Center

    Richards, Lyn; Richards, Tom

    1991-01-01

    Discusses the potential impact of computers in qualitative health research. Describes the original goals, design, and implementation of NUDIST, a qualitative computing software. Argues for evaluation of the impact of computer techniques and for an opening of debate among program developers and users to address the purposes and power of computing…

  4. Ferrofluids: Modeling, numerical analysis, and scientific computation

    NASA Astrophysics Data System (ADS)

    Tomas, Ignacio

    This dissertation presents some developments in the numerical analysis of partial differential equations (PDEs) describing the behavior of ferrofluids. The most widely accepted PDE model for ferrofluids is the micropolar model proposed by R.E. Rosensweig. The micropolar Navier-Stokes equations (MNSE) are a subsystem of PDEs within the Rosensweig model. Being a simplified version of the much bigger system of PDEs proposed by Rosensweig, the MNSE are a natural starting point of this thesis. The MNSE couple linear velocity u, angular velocity w, and pressure p. We propose and analyze a first-order semi-implicit fully-discrete scheme for the MNSE, which decouples the computation of the linear and angular velocities, is unconditionally stable, and delivers optimal convergence rates under assumptions analogous to those used for the Navier-Stokes equations. Moving on to the much more complex Rosensweig model, we provide a definition (approximation) for the effective magnetizing field h and explain the assumptions behind this definition. Unlike previous definitions available in the literature, this new definition is able to accommodate the effect of external magnetic fields. Using this definition we set up the system of PDEs coupling linear velocity u, pressure p, angular velocity w, magnetization m, and magnetic potential ϕ. We show that this system is energy-stable and devise a numerical scheme that mimics the same stability property. We prove that solutions of the numerical scheme always exist and, under certain simplifying assumptions, that the discrete solutions converge. A notable outcome of the analysis of the numerical scheme for the Rosensweig model is the choice of finite element spaces that allow the construction of an energy-stable scheme. Finally, with the lessons learned from the Rosensweig model, we develop a diffuse-interface model describing the behavior of two-phase ferrofluid flows and present an energy-stable numerical scheme for this model. For a
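
    For reference, the MNSE coupling u, w, and p are commonly written in the literature in the following form (a standard presentation with simplified coefficients, not necessarily the exact system analyzed in the dissertation):

    ```latex
    % Standard micropolar Navier-Stokes system from the literature; \nu and
    % \nu_r are kinematic and vortex viscosities, c_a and c_d angular-viscosity
    % coefficients, f and g body forces/torques.
    \begin{align}
      \partial_t \mathbf{u} + (\mathbf{u}\cdot\nabla)\mathbf{u}
        - (\nu + \nu_r)\Delta\mathbf{u} + \nabla p
        &= 2\nu_r\,\nabla\times\mathbf{w} + \mathbf{f},
      \qquad \nabla\cdot\mathbf{u} = 0, \\
      \partial_t \mathbf{w} + (\mathbf{u}\cdot\nabla)\mathbf{w}
        - c_a\,\Delta\mathbf{w} - c_d\,\nabla(\nabla\cdot\mathbf{w})
        + 4\nu_r\,\mathbf{w}
        &= 2\nu_r\,\nabla\times\mathbf{u} + \mathbf{g}.
    \end{align}
    ```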

  5. TAIR- TRANSONIC AIRFOIL ANALYSIS COMPUTER CODE

    NASA Technical Reports Server (NTRS)

    Dougherty, F. C.

    1994-01-01

    The Transonic Airfoil analysis computer code, TAIR, was developed to employ a fast, fully implicit algorithm to solve the conservative full-potential equation for the steady transonic flow field about an arbitrary airfoil immersed in a subsonic free stream. The full-potential formulation is considered exact under the assumptions of irrotational, isentropic, and inviscid flow. These assumptions are valid for a wide range of practical transonic flows typical of modern aircraft cruise conditions. The primary features of TAIR include: a new fully implicit iteration scheme which is typically many times faster than classical successive line overrelaxation algorithms; a new, reliable artificial density spatial differencing scheme treating the conservative form of the full-potential equation; and a numerical mapping procedure capable of generating curvilinear, body-fitted finite-difference grids about arbitrary airfoil geometries. Three aspects emphasized during the development of the TAIR code were reliability, simplicity, and speed. The reliability of TAIR comes from two sources: the new algorithm employed and the implementation of effective convergence monitoring logic. TAIR achieves ease of use by employing a "default mode" that greatly simplifies code operation, especially by inexperienced users, and many useful options including: several airfoil-geometry input options, flexible user controls over program output, and a multiple solution capability. The speed of the TAIR code is attributed to the new algorithm and the manner in which it has been implemented. Input to the TAIR program consists of airfoil coordinates, aerodynamic and flow-field convergence parameters, and geometric and grid convergence parameters. The airfoil coordinates for many airfoil shapes can be generated in TAIR from just a few input parameters. Most of the other input parameters have default values which allow the user to run an analysis in the default mode by specifying only a few input parameters.

  6. 3-Dimensional Marine CSEM Modeling by Employing TDFEM with Parallel Solvers

    NASA Astrophysics Data System (ADS)

    Wu, X.; Yang, T.

    2013-12-01

    In this paper, a parallel implementation is developed for forward modeling of 3-dimensional controlled-source electromagnetic (CSEM) data using the time-domain finite element method (TDFEM). Recently, increasing attention has been given to the mechanisms for detecting hydrocarbon (HC) reservoirs beneath the seabed. Since China has vast ocean resources, locating hydrocarbon reservoirs is significant to the national economy. However, traditional seismic exploration methods face crucial obstacles in detecting hydrocarbon reservoirs beneath a seabed with complex structure, owing to relatively high acquisition costs and high-risk exploration. In addition, the development of EM simulations typically requires both a deep knowledge of computational electromagnetics (CEM) and the proper use of sophisticated techniques and tools from computer science; the complexity of large-scale EM simulations often demands large memory, because of the large amount of data, or long solution times to address problems concerning matrix solvers, function transforms, optimization, etc. The objective of this paper is to present a parallelized implementation of the time-domain finite element method for analysis of three-dimensional (3D) marine controlled-source electromagnetic problems. First, we established a three-dimensional background model according to the seismic data; then, electromagnetic simulation of the marine CSEM survey was carried out using the time-domain finite element method on an MPI (Message Passing Interface) platform, allowing fast detection of hydrocarbon targets in the ocean environment. To speed up the calculation process, an MPI version of SuperLU called SuperLU_DIST is employed in this approach. To represent the three-dimensional seabed terrain with a sense of reality, the region is discretized into an unstructured mesh rather than a uniform one in order to reduce the number of unknowns. Moreover, high-order Whitney

  7. Research in applied mathematics, numerical analysis, and computer science

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering (ICASE) in applied mathematics, numerical analysis, and computer science is summarized and abstracts of published reports are presented. The major categories of the ICASE research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software, especially vector and parallel computers.

  8. PROMALS3D: multiple protein sequence alignment enhanced with evolutionary and 3-dimensional structural information

    PubMed Central

    Pei, Jimin; Grishin, Nick V.

    2015-01-01

    Multiple sequence alignment (MSA) is an essential tool with many applications in bioinformatics and computational biology. Accurate MSA construction for divergent proteins remains a difficult computational task. The constantly increasing protein sequences and structures in public databases could be used to improve alignment quality. PROMALS3D is a tool for protein MSA construction enhanced with additional evolutionary and structural information from database searches. PROMALS3D automatically identifies homologs from sequence and structure databases for input proteins, derives structure-based constraints from alignments of 3-dimensional structures, and combines them with sequence-based constraints of profile-profile alignments in a consistency-based framework to construct high-quality multiple sequence alignments. PROMALS3D output is a consensus alignment enriched with sequence and structural information about input proteins and their homologs. PROMALS3D web server and package are available at http://prodata.swmed.edu/PROMALS3D. PMID:24170408

  9. Computational method for analysis of polyethylene biodegradation

    NASA Astrophysics Data System (ADS)

    Watanabe, Masaji; Kawai, Fusako; Shibata, Masaru; Yokoyama, Shigeo; Sudate, Yasuhiro

    2003-12-01

    In a previous study concerning the biodegradation of polyethylene, we proposed a mathematical model based on two primary factors: the direct consumption or absorption of small molecules and the successive weight loss of large molecules due to β-oxidation. Our model is an initial value problem consisting of a differential equation whose independent variable is time. Its unknown variable represents the total weight of all the polyethylene molecules that belong to a molecular-weight class specified by a parameter. In this paper, we describe a numerical technique to introduce experimental results into the analysis of our model. We first establish its mathematical foundation in order to guarantee its validity, by showing that the initial value problem associated with the differential equation has a unique solution. Our computational technique is based on a linear system of differential equations derived from the original problem. We introduce some numerical results to illustrate our technique as a practical application of the linear approximation. In particular, we show how to solve the inverse problem to determine the consumption rate and the β-oxidation rate numerically, and illustrate our numerical technique by analyzing the GPC patterns of polyethylene wax obtained before and after 5 weeks of cultivation of a fungus, Aspergillus sp. AK-3. A numerical simulation based on these degradation rates confirms that the primary factors of the polyethylene biodegradation posed in modeling are indeed appropriate.
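
    A minimal sketch of numerically integrating a weight-class system of the kind described (the cascade structure and rate values below are hypothetical illustrations, not the authors' fitted model):

    ```python
    # Each molecular-weight class loses weight by direct consumption and
    # receives weight from the next-larger class via beta-oxidation.
    import numpy as np
    from scipy.integrate import solve_ivp

    n = 20                              # molecular-weight classes, small -> large
    alpha = np.linspace(0.30, 0.01, n)  # direct-consumption rates (hypothetical)
    beta = np.full(n, 0.05)             # beta-oxidation transfer rates (hypothetical)

    def rhs(t, w):
        dw = -(alpha + beta) * w        # each class loses weight at its own rates
        dw[:-1] += beta[1:] * w[1:]     # beta-oxidation feeds class i from i+1
        return dw

    w0 = np.ones(n)                     # initial weight in every class
    sol = solve_ivp(rhs, (0.0, 35.0), w0, t_eval=[0.0, 7.0, 35.0])  # days
    print(sol.y.sum(axis=0))            # total polyethylene weight decays in time
    ```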

  10. Environmental studies: Mathematical, computational, and statistical analysis

    SciTech Connect

    Wheeler, M.F.

    1996-12-31

    The Summer Program on Mathematical, Computational, and Statistical Analyses in Environmental Studies, held 6-31 July 1992, was designed to provide a much-needed interdisciplinary forum for joint exploration of recent advances in the formulation and application of (A) environmental models, (B) environmental data and data assimilation, (C) stochastic modeling and optimization, and (D) global climate modeling. These four conceptual frameworks provided common themes among a broad spectrum of specific technical topics at this workshop. The program brought forth a mix of physical concepts and processes such as chemical kinetics, atmospheric dynamics, cloud physics and dynamics, flow in porous media, remote sensing, climate statistics, stochastic processes, parameter identification, model performance evaluation, aerosol physics and chemistry, and data sampling, together with mathematical concepts in stiff differential systems, advective-diffusive-reactive PDEs, inverse scattering theory, time series analysis, particle dynamics, stochastic equations, optimal control, and others. Nineteen papers are presented in this volume. Selected papers are indexed separately for inclusion in the Energy Science and Technology Database.

  11. [Computational genome analysis of three marine algoviruses].

    PubMed

    Stepanova, O A; Boĭko, A L; Shcherbatenko, I S

    2013-01-01

    A computational analysis of the genomic sequences of three new marine algoviruses, Tetraselmis viridis virus (strains TvV-S20 and TvV-SI1) and Dunaliella viridis virus (strain DvV-SI2), was conducted. Both considerable similarity and essential distinctions between the studied strains and the most studied marine algoviruses of the Phycodnaviridae family were revealed. Our data show that the tested strains are new viruses with the following features: they alone were isolated from the marine eukaryotic microalgae T. viridis and D. viridis; the coding sequences (CDSs) of their genomes are localized mainly on one of the DNA strands and form several clusters with short intergenic spaces; there are considerable variations in genome structure within viruses and their strains; the viral genomic DNA has a high GC content (55.5-67.4%); their genes contain no well-known optimal contexts for translation start codons, nor contexts for terminal codon read-through; and the vast majority of viral genes and proteins have no matches in gene banks. PMID:24479317

  12. Computer-aided petrographic analysis of sandstones

    SciTech Connect

    Thayer, P.A.; Helmold, K.P.

    1987-05-01

    Thin-section point counting, mathematical and statistical analysis of petrographic-petrophysical data, report generation, and graphical presentation of results can be done efficiently by computer. Compositional and textural data are collected with a modified Schares point-counting system. The system uses an MS-DOS microcomputer programmed in BASIC to drive a motorized stage attached to a polarizing microscope. Numeric codes for up to 500 different categories of minerals, cements, pores, etc, are input using a separate keypad. Calculation and printing of constituent percentages, QFR, Folk name, and grain-size distribution are completed in seconds after data entry. Raw data files, compatible with software such as Lotus 1-2-3, SPSS, and SAS, are stored on floppy disk. Petrographic data files are transferred directly to a mainframe, merged with log and petrophysical data, analyzed statistically with SAS, and reports generated. SAS/GRAPH and TELL-A-GRAF routines linked with SAS generate a variety of cross plots, histograms, pie and bar charts, ternary diagrams, and vertical variation diagrams (e.g., depth vs. porosity, permeability, mean size, sorting, and percent grains-matrix-cement).

  13. High-speed 3-dimensional imaging in robot-assisted thoracic surgical procedures.

    PubMed

    Kajiwara, Naohiro; Akata, Soichi; Hagiwara, Masaru; Yoshida, Koichi; Kato, Yasufumi; Kakihana, Masatoshi; Ohira, Tatsuo; Kawate, Norihiko; Ikeda, Norihiko

    2014-06-01

    We used a high-speed 3-dimensional (3D) image analysis system (SYNAPSE VINCENT, Fujifilm Corp, Tokyo, Japan) to determine the best positioning of robotic arms and instruments preoperatively. The da Vinci S (Intuitive Surgical Inc, Sunnyvale, CA) was easily set up accurately and rapidly for this operation. Preoperative simulation and intraoperative navigation using the SYNAPSE VINCENT for robot-assisted thoracic operations enabled efficient planning of the operation settings. The SYNAPSE VINCENT can detect the tumor location and depict surrounding tissues quickly, accurately, and safely. This system is also excellent for navigational and educational use. PMID:24882302

  14. Parallel Analysis and Visualization on Cray Compute Node Linux

    SciTech Connect

    Pugmire, Dave; Ahern, Sean

    2008-01-01

    Capability computer systems are deployed to give researchers the computational power required to investigate and solve key challenges facing the scientific community. As the power of these computer systems increases, the computational problem domain typically increases in size, complexity, and scope. These increases strain the ability of commodity analysis and visualization clusters to effectively perform post-processing tasks and provide critical insight and understanding to the computed results. An alternative to purchasing increasingly larger, separate analysis and visualization commodity clusters is to use the computational system itself to perform post-processing tasks. In this paper, the recent successful port of VisIt, a parallel, open source analysis and visualization tool, to Compute Node Linux running on the Cray is detailed. Additionally, the unprecedented ability of this resource for analysis and visualization is discussed and a report on obtained results is presented.

  15. New computing systems, future computing environment, and their implications on structural analysis and design

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Housner, Jerrold M.

    1993-01-01

    Recent advances in computer technology that are likely to impact structural analysis and design of flight vehicles are reviewed. A brief summary is given of the advances in microelectronics, networking technologies, and in the user-interface hardware and software. The major features of new and projected computing systems, including high performance computers, parallel processing machines, and small systems, are described. Advances in programming environments, numerical algorithms, and computational strategies for new computing systems are reviewed. The impact of the advances in computer technology on structural analysis and the design of flight vehicles is described. A scenario for future computing paradigms is presented, and the near-term needs in the computational structures area are outlined.

  16. Volume accumulator design analysis computer codes

    NASA Technical Reports Server (NTRS)

    Whitaker, W. D.; Shimazaki, T. T.

    1973-01-01

    The computer codes, VANEP and VANES, were written and used to aid in the design and performance calculation of the volume accumulator units (VAU) for the 5-kWe reactor thermoelectric system. VANEP computes the VAU design which meets the primary coolant loop VAU volume and pressure performance requirements. VANES computes the performance of the VAU design, determined from the VANEP code, at the conditions of the secondary coolant loop. The codes can also compute the performance characteristics of the VAUs under conditions of possible modes of failure which still permit continued system operation.

  17. 3-Dimensional Imaging Modalities for Phenotyping Genetically Engineered Mice

    PubMed Central

    Powell, K. A.; Wilson, D.

    2013-01-01

    A variety of 3-dimensional (3D) digital imaging modalities are available for whole-body assessment of genetically engineered mice: magnetic resonance microscopy (MRM), X-ray microcomputed tomography (microCT), optical projection tomography (OPT), episcopic and cryoimaging, and ultrasound biomicroscopy (UBM). Embryo and adult mouse phenotyping can be accomplished at microscopy or near microscopy spatial resolutions using these modalities. MRM and microCT are particularly well-suited for evaluating structural information at the organ level, whereas episcopic and OPT imaging provide structural and functional information from molecular fluorescence imaging at the cellular level. UBM can be used to monitor embryonic development longitudinally in utero. Specimens are not significantly altered during preparation, and structures can be viewed in their native orientations. Technologies for rapid automated data acquisition and high-throughput phenotyping have been developed and continually improve as this exciting field evolves. PMID:22146851

  18. Protalign: a 3-dimensional protein alignment assessment tool.

    PubMed

    Meads, D; Hansen, M D; Pang, A

    1999-01-01

    Protein fold recognition (sometimes called threading) is the prediction of a protein's 3-dimensional shape based on its similarity to a protein of known structure. Fold predictions are low resolution; that is, no effort is made to rotate the protein's component amino acid side chains into their correct spatial orientations. The goal is simply to recognize the protein family member that most closely resembles the target sequence of unknown structure and to create a sensible alignment of the target to the known structure (i.e., a structure-sequence alignment). To facilitate this type of structure prediction, we have designed a low resolution molecular graphics tool. ProtAlign introduces the ability to interact with and edit alignments directly in the 3-dimensional structure as well as in the usual 2-dimensional layout. It also contains several functions and features to help the user assess areas within the alignment. ProtAlign implements an open pipe architecture to allow other programs to access its molecular graphics capabilities. In addition, it is capable of "driving" other programs. Because amino acid side chain orientation is not relevant in fold recognition, we represent amino acid residues as abstract shapes or glyphs much like Lego (tm) blocks and we borrow techniques from comparative flow visualization using streamlines to provide clean depictions of the entire protein model. By creating a low resolution representation of protein structure, we are able to at least double the amount of information on the screen. At the same time, we create a view that is not as busy as the corresponding representations using traditional high resolution visualization methods which show detailed atomic structure. This eliminates distracting and possibly misleading visual clutter resulting from the mapping of protein alignment information onto a high resolution display of the known structure. This molecular graphics program is implemented in Open GL to facilitate porting to

  19. NASA Applications for Computational Electromagnetic Analysis

    NASA Technical Reports Server (NTRS)

    Lewis, Catherine C.; Trout, Dawn H.; Krome, Mark E.; Perry, Thomas A.

    2011-01-01

    Computational electromagnetic software is used by NASA to analyze the compatibility of systems too large or too complex for testing. Recent advances in software packages and computer capabilities have made it possible to determine the effects of a transmitter inside a launch vehicle fairing, better analyze environmental threats, and perform on-orbit replacements with assured electromagnetic compatibility.

  20. An Analysis of 27 Years of Research into Computer Education Published in Australian Educational Computing

    ERIC Educational Resources Information Center

    Zagami, Jason

    2015-01-01

    Analysis of three decades of publications in Australian Educational Computing (AEC) provides insight into the historical trends in Australian educational computing, highlighting an emphasis on pedagogy, comparatively few articles on educational technologies, and strong research topic alignment with similar international journals. Analysis confirms…

  1. Transonic wing analysis using advanced computational methods

    NASA Technical Reports Server (NTRS)

    Henne, P. A.; Hicks, R. M.

    1978-01-01

    This paper discusses the application of three-dimensional computational transonic flow methods to several different types of transport wing designs. The purpose of these applications is to evaluate the basic accuracy and limitations associated with such numerical methods. The use of such computational methods for practical engineering problems can only be justified after favorable evaluations are completed. The paper summarizes a study of both the small-disturbance and the full potential technique for computing three-dimensional transonic flows. Computed three-dimensional results are compared to both experimental measurements and theoretical results. Comparisons are made not only of pressure distributions but also of lift and drag forces. Transonic drag rise characteristics are compared. Three-dimensional pressure distributions and aerodynamic forces, computed from the full potential solution, compare reasonably well with experimental results for a wide range of configurations and flow conditions.

  2. Frequency Analysis Program for a Computer Assisted Laboratory.

    ERIC Educational Resources Information Center

    Aburdene, Maurice F.

    1983-01-01

    Describes a Fortran program used in a computer-assisted-laboratory course. Program utilizes computer-controlled frequency sweeping to measure response (amplitude/phase) of a series RLC circuit, modeling the circuit and comparing experimental/theoretical results for system gain with computer gain using least squares analysis. Plots of both gain…
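
    As a rough illustration of the kind of computation such a program performs, the sketch below (in Python rather than the article's Fortran) evaluates the gain and phase of a series RLC circuit over a swept frequency range and forms a least-squares residual against simulated measurements. The component values, sweep range, and noise level are assumptions, not values from the article.

        import numpy as np

        # Series RLC frequency sweep; all parameter values are assumed.
        R, L, C = 100.0, 10e-3, 1e-6            # ohms, henries, farads
        f = np.logspace(2, 5, 200)              # swept frequencies, Hz
        w = 2 * np.pi * f
        Z = R + 1j * (w * L - 1.0 / (w * C))    # series impedance

        H = R / Z                               # gain measured across R
        gain_db = 20 * np.log10(np.abs(H))
        phase_deg = np.degrees(np.angle(H))

        # Least-squares comparison of "experimental" and theoretical gain
        rng = np.random.default_rng(0)
        measured = gain_db + rng.normal(0.0, 0.1, gain_db.shape)
        print("sum of squared residuals:", np.sum((measured - gain_db) ** 2))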

  3. Numerical Package in Computer Supported Numeric Analysis Teaching

    ERIC Educational Resources Information Center

    Tezer, Murat

    2007-01-01

    In university faculties of Engineering, Sciences, Business, and Economics, as well as in higher education in Computing, calculators and computers are used in Numerical Analysis (NA) because of the difficulty of the subject. In this study, computer-supported learning of NA is discussed together with important usage of the…

  4. 3-Dimensional quantitative detection of nanoparticle content in biological tissue samples after local cancer treatment

    NASA Astrophysics Data System (ADS)

    Rahn, Helene; Alexiou, Christoph; Trahms, Lutz; Odenbach, Stefan

    2014-06-01

    X-ray computed tomography is nowadays used for a wide range of applications in medicine, science and technology. X-ray microcomputed tomography (XµCT) follows the same principles used for conventional medical CT scanners, but improves the spatial resolution to a few micrometers. We present an application of X-ray microtomography: a study of 3-dimensional biodistribution, along with quantification of nanoparticle content, in tumoral tissue after minimally invasive cancer therapy. One such minimally invasive treatment is magnetic drug targeting, in which magnetic nanoparticles are used as controllable drug carriers. The quantification is based on a calibration of the XµCT equipment. The calibration procedure is based on a phantom system which allows discrimination between the various gray values of the data set. The phantoms consist of a biological tissue substitute and magnetic nanoparticles; they have been studied with XµCT and examined magnetically. The measured gray values and known nanoparticle concentrations yield a calibration curve, which can be applied to tomographic data sets. Accordingly, this calibration enables a voxel-wise assignment of gray values in the digital tomographic data set to nanoparticle content, and thus a 3-dimensional study of nanoparticle distribution as well as concentration.
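
    A minimal sketch of the voxel-wise calibration step described above: a linear calibration curve is fitted to phantom gray values and then applied to a tomographic volume. All numbers below are invented placeholders, not the paper's phantom data.

        import numpy as np

        # Phantom calibration: mean gray values vs. known nanoparticle
        # concentrations (illustrative values only).
        gray = np.array([120.0, 145.0, 170.0, 196.0, 221.0])
        conc_mg_ml = np.array([0.0, 2.5, 5.0, 7.5, 10.0])

        slope, intercept = np.polyfit(gray, conc_mg_ml, 1)   # calibration curve

        # Voxel-wise assignment of nanoparticle content to a volume
        volume = np.random.default_rng(2).uniform(110, 230, size=(128, 128, 128))
        conc_map = slope * volume + intercept
        print("total estimated content [a.u.]:", conc_map.clip(min=0).sum())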

  5. Alternative Computational Approaches for Probalistic Fatigue Analysis

    NASA Technical Reports Server (NTRS)

    Ebbeler, D. H.; Newlin, L. E.; Sutharshana, S.; Moore, N. R.; Grigoriu, M.

    1995-01-01

    The feasibility of alternatives to direct Monte Carlo simulation for failure probability computation is discussed. First- and second-order reliability methods are applied to fatigue crack growth and low cycle fatigue structural failure modes to illustrate typical problems.
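
    For orientation, the sketch below shows the direct Monte Carlo failure-probability estimator that such methods are compared against. The limit-state function and distributions are invented placeholders, far simpler than the fatigue models the report treats.

        import numpy as np

        rng = np.random.default_rng(0)

        # Illustrative limit-state function g(x): failure when g < 0.
        def g(strength, load):
            return strength - load

        n = 1_000_000
        strength = rng.normal(10.0, 1.0, n)   # assumed distributions
        load = rng.normal(6.0, 1.5, n)

        pf = np.mean(g(strength, load) < 0.0)
        se = np.sqrt(pf * (1 - pf) / n)       # standard error of the estimate
        print(f"failure probability ~ {pf:.2e} +/- {se:.1e}")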

  6. System balance analysis for vector computers

    NASA Technical Reports Server (NTRS)

    Knight, J. C.; Poole, W. G., Jr.; Voight, R. G.

    1975-01-01

    The availability of vector processors capable of sustaining computing rates of 10^8 arithmetic results per second raised the question of whether peripheral storage devices representing current technology can keep such processors supplied with data. By examining the solution of a large banded linear system on these computers, it was found that even under ideal conditions, the processors will frequently be waiting for problem data.
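
    The balance argument can be made concrete with a back-of-envelope calculation; the flops-per-word ratio and I/O bandwidth below are assumptions for illustration, not figures from the report.

        # A solver that performs F arithmetic operations per word fetched
        # needs I/O bandwidth >= rate / F words/s to keep the CPU busy.
        proc_rate = 1e8          # arithmetic results per second
        flops_per_word = 4       # assumed ratio for a banded solver
        words_needed = proc_rate / flops_per_word
        io_bandwidth = 5e6       # assumed words per second from storage
        utilization = min(1.0, io_bandwidth / words_needed)
        print(f"processor utilization limited to {utilization:.1%}")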

  7. Three parallel computation methods for structural vibration analysis

    NASA Technical Reports Server (NTRS)

    Storaasli, Olaf; Bostic, Susan; Patrick, Merrell; Mahajan, Umesh; Ma, Shing

    1988-01-01

    The Lanczos (1950), multisectioning, and subspace iteration sequential methods for vibration analysis, used here as the bases for three parallel algorithms, are shown on three example problems to maintain reasonable accuracy in the computation of vibration frequencies. Significant computation time reductions are obtained as the number of processors increases. The performance of each method is analyzed in order to characterize relative strengths and weaknesses as well as to identify those parameters that most strongly affect computational efficiency.
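
    As a small illustration of the Lanczos approach on a vibration problem, the sketch below solves the generalized eigenproblem K v = w^2 M v for a toy spring-mass chain using SciPy's ARPACK-based Lanczos routine. The model and its parameters are assumptions, not the paper's example problems.

        import numpy as np
        from scipy.sparse import diags
        from scipy.sparse.linalg import eigsh

        # Toy fixed-fixed spring-mass chain standing in for a structural model.
        n = 200
        ks = 1.0e4                                         # spring stiffness
        K = diags([-ks, 2 * ks, -ks], [-1, 0, 1], shape=(n, n), format="csc")
        M = diags([1.0], [0], shape=(n, n), format="csc")  # unit masses

        # Lowest six modes of K v = w^2 M v via shift-invert about zero
        vals, vecs = eigsh(K, k=6, M=M, sigma=0.0, which="LM")
        freqs_hz = np.sqrt(vals) / (2 * np.pi)
        print("lowest natural frequencies [Hz]:", np.round(freqs_hz, 3))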

  8. New Technique for Developing a Proton Range Compensator With Use of a 3-Dimensional Printer

    SciTech Connect

    Ju, Sang Gyu; Kim, Min Kyu; Hong, Chae-Seon; Kim, Jin Sung; Han, Youngyih; Choi, Doo Ho; Shin, Dongho; Lee, Se Byeong

    2014-02-01

    Purpose: A new system for manufacturing a proton range compensator (RC) was developed by using a 3-dimensional printer (3DP). The physical accuracy and dosimetric characteristics of the new RC manufactured by 3DP (RC_3DP) were compared with those of a conventional RC (RC_CMM) manufactured by a computerized milling machine (CMM). Methods and Materials: An RC for brain tumor treatment with a scattered proton beam was calculated with a treatment planning system (TPS), and the resulting data were converted into a new format for 3DP using in-house software. The RC_3DP was printed with ultraviolet-curable acrylic plastic, and an RC_CMM was milled into polymethylmethacrylate using a CMM. The inner shape of both RCs was scanned by using a 3D scanner and compared with TPS data by applying composite analysis (CA; with 1-mm depth difference and 1-mm distance-to-agreement criteria) to verify their geometric accuracy. The position and distal penumbra of distal dose falloff at the central axis and the field width of the dose profile at the midline depth of the spread-out Bragg peak were measured for the 2 RCs to evaluate their dosimetric characteristics. Both RCs were imaged on a computed tomography scanner to evaluate uniformity of internal density. The manufacturing times for both RCs were compared to evaluate production efficiency. Results: The pass rates for the CA test were 99.5% and 92.5% for RC_3DP and RC_CMM, respectively. There was no significant difference in dosimetric characteristics and uniformity of internal density between the 2 RCs. The net fabrication times of RC_3DP and RC_CMM were about 18 and 3 hours, respectively. Conclusions: The physical accuracy and dosimetric characteristics of RC_3DP were comparable with those of the conventional RC_CMM, and significant system minimization was provided.

  9. Computational analysis of LDDMM for brain mapping

    PubMed Central

    Ceritoglu, Can; Tang, Xiaoying; Chow, Margaret; Hadjiabadi, Darian; Shah, Damish; Brown, Timothy; Burhanullah, Muhammad H.; Trinh, Huong; Hsu, John T.; Ament, Katarina A.; Crocetti, Deana; Mori, Susumu; Mostofsky, Stewart H.; Yantis, Steven; Miller, Michael I.; Ratnanather, J. Tilak

    2013-01-01

    One goal of computational anatomy (CA) is to develop tools to accurately segment brain structures in healthy and diseased subjects. In this paper, we examine the performance and complexity of such segmentation in the framework of the large deformation diffeomorphic metric mapping (LDDMM) registration method with reference to atlases and parameters. First we report the application of a multi-atlas segmentation approach to define basal ganglia structures in the brains of healthy and diseased children. The segmentation accuracy of the multi-atlas approach is compared with the single-atlas LDDMM implementation and two state-of-the-art segmentation algorithms, FreeSurfer and FSL, by computing the overlap errors between automatic and manual segmentations of the six basal ganglia nuclei in healthy subjects as well as subjects with diseases including ADHD and autism. The high accuracy of multi-atlas segmentation is obtained at the cost of increased computational complexity because of the calculations necessary between the atlases and a subject. Second, we examine the effect of parameters on total LDDMM computation time and segmentation accuracy for basal ganglia structures. The single-atlas LDDMM method is used to automatically segment the structures in a population of 16 subjects using different sets of parameters. The results show that a cascade approach and using fewer time steps can reduce computational complexity as much as five times while maintaining reliable segmentations. PMID:23986653
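
    A minimal sketch of the overlap computation underlying such comparisons: the Dice coefficient between automatic and manual label volumes (the overlap error is often reported as 1 - Dice). The label volumes below are synthetic stand-ins, not the study's segmentations.

        import numpy as np

        def dice_overlap(auto_seg, manual_seg, label):
            # Dice coefficient for one labeled structure
            a = (auto_seg == label)
            m = (manual_seg == label)
            denom = a.sum() + m.sum()
            return 2.0 * np.logical_and(a, m).sum() / denom if denom else np.nan

        # Hypothetical 3-D label volumes with labels 1..6 = nuclei
        rng = np.random.default_rng(1)
        auto = rng.integers(0, 7, size=(64, 64, 64))
        manual = auto.copy()
        manual[::9] = rng.integers(0, 7, size=manual[::9].shape)  # disagreement

        for label in range(1, 7):
            print(label, round(1.0 - dice_overlap(auto, manual, label), 3))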

  10. Modern Computational Techniques for the HMMER Sequence Analysis

    PubMed Central

    2013-01-01

    This paper focuses on the latest research and critical reviews of modern computing architectures and software- and hardware-accelerated algorithms for bioinformatics data analysis, with an emphasis on one of the most important sequence analysis applications: hidden Markov models (HMM). We show a detailed performance comparison of sequence analysis tools on various computing platforms recently developed in the bioinformatics community. The characteristics of sequence analysis, such as its data- and compute-intensive nature, make it very attractive to optimize and parallelize using both traditional software approaches and innovative hardware acceleration technologies. PMID:25937944
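
    For reference, the core recurrence that HMM sequence analysis tools accelerate is the forward algorithm; a minimal NumPy version is sketched below. The toy transition, emission, and initial probabilities are invented for illustration.

        import numpy as np

        A = np.array([[0.9, 0.1],      # state transition probabilities
                      [0.2, 0.8]])
        B = np.array([[0.5, 0.5],      # emission probabilities per state
                      [0.1, 0.9]])
        pi = np.array([0.6, 0.4])      # initial state distribution

        obs = [0, 1, 1, 0, 1]          # observed symbol indices

        # Forward recurrence: alpha_t(j) = sum_i alpha_{t-1}(i) A[i,j] B[j,o_t]
        alpha = pi * B[:, obs[0]]
        for o in obs[1:]:
            alpha = (alpha @ A) * B[:, o]

        print("sequence likelihood:", alpha.sum())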

  11. Global detailed geoid computation and model analysis

    NASA Technical Reports Server (NTRS)

    Marsh, J. G.; Vincent, S.

    1974-01-01

    Comparisons and analyses were carried out through the use of detailed gravimetric geoids which we have computed by combining models with a set of 26,000 1 deg x 1 deg mean free air gravity anomalies. The accuracy of the detailed gravimetric geoid computed using the most recent Goddard earth model (GEM-6) in conjunction with the set of 1 deg x 1 deg mean free air gravity anomalies is assessed at ±2 meters on the continents of North America, Europe, and Australia, 2 to 5 meters in the Northeast Pacific and North Atlantic areas, and 5 to 10 meters in other areas where surface gravity data are sparse. The RMS differences between this detailed geoid and the detailed geoids computed using the other satellite gravity fields in conjunction with the same set of surface data range from 3 to 7 meters.

  12. Hypercube-Computer Analysis Of Electromagnetic Scattering

    NASA Technical Reports Server (NTRS)

    Patterson, J. E.; Liewer, P. C.; Calalo, R. H.; Manshadi, F.

    1990-01-01

    Capabilities of hypercube and parallel processing demonstrated. Report describes use of Mark III Hypercube computer to analyze scattering of electromagnetic waves. Purpose of study to assess utility of parallel computing in such computation-intensive problems as large-scale electromagnetic scattering. Two electromagnetic codes based on different algorithms converted to run on Mark III Hypercube. First code implements finite-difference, time-domain solution of Maxwell's curl equations. Second code is Numerical Electromagnetics Code (NEC-2) which embodies frequency-domain method and developed to analyze electromagnetic responses of antennas and other metallic structures. On Mark III Hypercube with 32 active nodes, largest lattice contains about 2,048,000 unit cells.

  13. Computational Analysis of SAXS Data Acquisition.

    PubMed

    Dong, Hui; Kim, Jin Seob; Chirikjian, Gregory S

    2015-09-01

    Small-angle x-ray scattering (SAXS) is an experimental biophysical method used for gaining insight into the structure of large biomolecular complexes. Under appropriate chemical conditions, the information obtained from a SAXS experiment can be equated to the pair distribution function, which is the distribution of distances between every pair of points in the complex. Here we develop a mathematical model to calculate the pair distribution function for a structure of known density, and analyze the computational complexity of these calculations. Efficient recursive computation of this forward model is an important step in solving the inverse problem of recovering the three-dimensional density of biomolecular structures from their pair distribution functions. In particular, we show that integrals of products of three spherical-Bessel functions arise naturally in this context. We then develop an algorithm for the efficient recursive computation of these integrals. PMID:26244255
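
    For orientation, the sketch below numerically checks an integral of a product of three spherical Bessel functions, the quantity the paper evaluates recursively. The orders, scale factors, and truncated upper limit are arbitrary illustrative choices, and brute-force quadrature is a crude stand-in for the authors' recursive algorithm.

        import numpy as np
        from scipy.special import spherical_jn
        from scipy.integrate import quad

        def triple_bessel(l1, l2, l3, a, b, c, rmax=200.0):
            # Integral of j_l1(ar) j_l2(br) j_l3(cr) r^2 dr, truncated at rmax
            f = lambda r: (spherical_jn(l1, a * r) * spherical_jn(l2, b * r)
                           * spherical_jn(l3, c * r) * r ** 2)
            val, _ = quad(f, 0.0, rmax, limit=500)
            return val

        print(triple_bessel(0, 1, 1, 1.0, 1.2, 0.8))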

  14. Using 3-dimensional printing to create presurgical models for endodontic surgery.

    PubMed

    Bahcall, James K

    2014-09-01

    Advances in endodontic surgery, from both a technological and a procedural perspective, have been significant over the last 18 years. Although these technologies and procedural enhancements have significantly improved endodontic surgical treatment outcomes, there is still an ongoing challenge of overcoming the limitations of interpreting preoperative 2-dimensional (2-D) radiographic representations of a 3-dimensional (3-D) in vivo surgical field. Cone-beam computed tomography (CBCT) has helped to address this issue by providing a 3-D enhancement of the 2-D radiograph. The next logical step to further improve presurgical 3-D case assessment is to create a surgical model from the CBCT scan. The purpose of this article is to introduce 3-D printing of CBCT scans for creating presurgical models for endodontic surgery. PMID:25197746

  15. Chromosome Conformation of Human Fibroblasts Grown in 3-Dimensional Spheroids

    PubMed Central

    Chen, Haiming; Comment, Nicholas; Chen, Jie; Ronquist, Scott; Hero, Alfred; Ried, Thomas; Rajapakse, Indika

    2015-01-01

    In the study of interphase chromosome organization, genome-wide chromosome conformation capture (Hi-C) maps are often generated using 2-dimensional (2D) monolayer cultures. These 2D cells have morphological deviations from cells that exist in 3-dimensional (3D) tissues in vivo, and may not maintain the same chromosome conformation. We used Hi-C maps to test the extent of differences in chromosome conformation between human fibroblasts grown in 2D cultures and those grown in 3D spheroids. Significant differences in chromosome conformation were found between 2D cells and those grown in spheroids. Intra-chromosomal interactions were generally increased in spheroid cells, with a few exceptions, while inter-chromosomal interactions were generally decreased. Overall, chromosomes located closer to the nuclear periphery had increased intra-chromosomal contacts in spheroid cells, while those located more centrally had decreased interactions. This study highlights the necessity to conduct studies on the topography of the interphase nucleus under conditions that mimic an in vivo environment. PMID:25738643

  16. Thermal crosstalk in 3-dimensional RRAM crossbar array

    PubMed Central

    Sun, Pengxiao; Lu, Nianduan; Li, Ling; Li, Yingtao; Wang, Hong; Lv, Hangbing; Liu, Qi; Long, Shibing; Liu, Su; Liu, Ming

    2015-01-01

    High density 3-dimensional (3D) crossbar resistive random access memory (RRAM) is one of the major focuses of new-age memory technologies. To compete with ultra-high density NAND and NOR memories, an understanding of the reliability mechanisms and scaling potential of 3D RRAM crossbar arrays is needed. Thermal crosstalk is one of the most critical effects that should be considered in 3D crossbar array applications. The Joule heat generated inside an RRAM device determines its own switching behavior, and in dense memory arrays the surrounding temperature rise may lead to resistance degradation of neighboring devices. In this work, the thermal crosstalk effect and the scaling potential under thermal effects in 3D RRAM crossbar arrays are systematically investigated. It is revealed that the reset process is dominated by transient thermal effects in a 3D RRAM array. More importantly, thermal crosstalk phenomena can deteriorate device retention performance and even lead to failure of the data storage state, from LRS (low resistance state) to HRS (high resistance state), of the disturbed RRAM cell. In addition, the resistance state degradation becomes more serious as the feature size is scaled down. Possible methods for alleviating the thermal crosstalk effect while further advancing the scaling potential are also provided and verified by numerical simulation. PMID:26310537

  18. 3 Dimensional Diagnosis Unravelling Prognosis of Multiple Impacted Teeth – A Case Report

    PubMed Central

    Gopinath, Adusumilli; Reddy, Naveen Admala; Rohra, Mayur G

    2013-01-01

    Impaction of teeth results from the interplay between nature and nurture. Radiographs play an important role in assessing both the location and the type of impacted teeth. In general, periapical, occlusal, and/or panoramic radiographs are sufficient for providing the information required by the clinician. Recent advances in diagnostic imaging enable the clinician to visualize and diagnose impacted teeth and to predict treatment outcomes. This case report discusses the value of cone beam computed tomography (CBCT), a 3-dimensional radiographic technique, for evaluating critical parameters such as bone thickness, tooth position, and tooth morphology of multiple impacted teeth. We present a case of a 27-year-old male patient with multiple missing teeth. Radiographs revealed multiple impacted permanent teeth, though the medical and family history along with physical examination was not suggestive of any syndrome. Intraoral periapical, orthopantomographic, and occlusal radiographs and cone beam computed tomography were taken for the same patient to determine the exact position of the multiple impacted teeth and the factors bearing on their prognosis. Cone beam computed tomography is an accurate modality for localizing multiple impacted teeth and determining their prognostic factors. Three-dimensional volumetric imaging might provide information for improved diagnosis and treatment plans, and ultimately result in more successful treatment outcomes and better care for patients. How to cite this article: Gopinath A, Reddy NA, Rohra MG. 3 Dimensional Diagnosis Unravelling Prognosis of Multiple Impacted Teeth – A Case Report. J Int Oral Health 2013; 5(4):78-83. PMID:24155625

  19. An emulator for minimizing computer resources for finite element analysis

    NASA Technical Reports Server (NTRS)

    Melosh, R.; Utku, S.; Islam, M.; Salama, M.

    1984-01-01

    A computer code, SCOPE, has been developed for predicting the computer resources required for a given analysis code, computer hardware, and structural problem. The cost of running the code is a small fraction (about 3 percent) of the cost of performing the actual analysis. However, its accuracy in predicting the CPU and I/O resources depends intrinsically on the accuracy of calibration data that must be developed once for the computer hardware and the finite element analysis code of interest. Testing of the SCOPE code on the AMDAHL 470 V/8 computer and the ELAS finite element analysis program indicated small I/O errors (3.2 percent), larger CPU errors (17.8 percent), and negligible total errors (1.5 percent).

  20. Thermoelectric pump performance analysis computer code

    NASA Technical Reports Server (NTRS)

    Johnson, J. L.

    1973-01-01

    A computer program is presented that was used to analyze and design dual-throat electromagnetic dc conduction pumps for the 5-kWe ZrH reactor thermoelectric system. In addition to a listing of the code and corresponding identification of symbols, the bases for this analytical model are provided.

  1. Computer program performs stiffness matrix structural analysis

    NASA Technical Reports Server (NTRS)

    Bamford, R.; Batchelder, R.; Schmele, L.; Wada, B. K.

    1968-01-01

    Computer program generates the stiffness matrix for a particular type of structure from geometrical data, and performs static and normal mode analyses. It requires the structure to be modeled as a stable framework of uniform, weightless members, and joints at which loads are applied and weights are lumped.

  2. Computational and Physical Analysis of Catalytic Compounds

    NASA Astrophysics Data System (ADS)

    Wu, Richard; Sohn, Jung Jae; Kyung, Richard

    2015-03-01

    Nanoparticles exhibit unique physical and chemical properties depending on their geometrical properties. For this reason, synthesis of nanoparticles with controlled shape and size is important for exploiting these unique properties. Catalyst supports are usually made of high-surface-area porous oxides or carbon nanomaterials. These support materials stabilize metal catalysts against sintering at high reaction temperatures. Many studies have demonstrated large enhancements of catalytic behavior due to the role of the oxide-metal interface. In this paper, the catalyzing ability of supported nano metal oxides, such as silicon oxide and titanium oxide compounds, has been analyzed using computational chemistry methods. Computational programs such as GAMESS and Chemcraft have been used to compute the efficiencies of the catalytic compounds and the bonding energy changes during optimization convergence. The results illustrate how the metal oxides stabilize and the steps that this takes. The plot of computation step (N) versus energy (kcal/mol) shows that the energy of the titania converges by the 7th iteration, whereas that of the silica converges by the 9th iteration.

  3. Computed Tomography Analysis of NASA BSTRA Balls

    SciTech Connect

    Perry, R L; Schneberk, D J; Thompson, R R

    2004-10-12

    Fifteen 1.25 inch BSTRA balls were scanned with the high energy computed tomography system at LLNL. This system has a resolution limit of approximately 210 microns. A threshold of 238 microns (two voxels) was used, and no anomalies at or greater than this were observed.

  4. Reliability computation using fault tree analysis

    NASA Technical Reports Server (NTRS)

    Chelson, P. O.

    1971-01-01

    A method is presented for calculating event probabilities from an arbitrary fault tree. The method includes an analytical derivation of the system equation and is not a simulation program. The method can handle systems that incorporate standby redundancy and it uses conditional probabilities for computing fault trees where the same basic failure appears in more than one fault path.
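
    A minimal sketch of gate-level probability computation for a fault tree, assuming statistically independent basic events; the cited method additionally handles the same basic failure appearing in more than one fault path via conditional probabilities, which this sketch omits. The tree structure and event probabilities are invented for illustration.

        # AND gate: all inputs must fail; OR gate: any input fails.
        def p_and(*probs):
            out = 1.0
            for p in probs:
                out *= p
            return out

        def p_or(*probs):
            out = 1.0
            for p in probs:
                out *= (1.0 - p)
            return 1.0 - out

        # TOP = (A AND B) OR C, with assumed basic-event probabilities
        pA, pB, pC = 1e-3, 2e-3, 5e-5
        print("P(top event) =", p_or(p_and(pA, pB), pC))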

  5. Wing analysis using a transonic potential flow computational method

    NASA Technical Reports Server (NTRS)

    Henne, P. A.; Hicks, R. M.

    1978-01-01

    The ability of the method to compute wing transonic performance was determined by comparing computed results with both experimental data and results computed by other theoretical procedures. Both pressure distributions and aerodynamic forces were evaluated. Comparisons indicated that the method is a significant improvement in transonic wing analysis capability. In particular, the computational method generally calculated the correct development of three-dimensional pressure distributions from subcritical to transonic conditions. Complicated, multiple shocked flows observed experimentally were reproduced computationally. The ability to identify the effects of design modifications was demonstrated both in terms of pressure distributions and shock drag characteristics.

  6. Computational thermo-fluid analysis of a disk brake

    NASA Astrophysics Data System (ADS)

    Takizawa, Kenji; Tezduyar, Tayfun E.; Kuraishi, Takashi; Tabata, Shinichiro; Takagi, Hirokazu

    2016-06-01

    We present computational thermo-fluid analysis of a disk brake, including thermo-fluid analysis of the flow around the brake and heat conduction analysis of the disk. The computational challenges include proper representation of the small-scale thermo-fluid behavior, high-resolution representation of the thermo-fluid boundary layers near the spinning solid surfaces, and bringing the heat transfer coefficient (HTC) calculated in the thermo-fluid analysis of the flow to the heat conduction analysis of the spinning disk. The disk brake model used in the analysis closely represents the actual configuration, and this adds to the computational challenges. The components of the method we have developed for computational analysis of the class of problems with these types of challenges include the Space-Time Variational Multiscale method for coupled incompressible flow and thermal transport, the ST Slip Interface method for high-resolution representation of the thermo-fluid boundary layers near spinning solid surfaces, and a set of projection methods for bringing the HTC calculated in the thermo-fluid analysis to the different parts of the disk. With the HTC coming from the thermo-fluid analysis of the flow around the brake, we do the heat conduction analysis of the disk, from the start of braking until the disk stops spinning, demonstrating how the method developed works in computational analysis of this complex and challenging problem.

  7. Computer aided radiation analysis for manned spacecraft

    NASA Technical Reports Server (NTRS)

    Appleby, Matthew H.; Griffin, Brand N.; Tanner, Ernest R., II; Pogue, William R.; Golightly, Michael J.

    1991-01-01

    In order to assist in the design of radiation shielding, an analytical tool is presented that can be employed in combination with CAD facilities and NASA transport codes. The nature of radiation in space is described, and the operational requirements for protection are listed as background information for the use of the technique. The method is based on the Boeing radiation exposure model (BREM) for combining NASA radiation transport codes and CAD facilities, and the output is given as contour maps of the radiation-shield distribution so that dangerous areas can be identified. Computational models are used to solve the 1D Boltzmann transport equation and determine the shielding needs for the worst-case scenario. BREM can be employed directly with the radiation computations to assess radiation protection during all phases of design, which saves time and ultimately spacecraft weight.

  8. Hybrid soft computing systems for electromyographic signals analysis: a review

    PubMed Central

    2014-01-01

    An electromyographic (EMG) signal is a bio-signal collected from human skeletal muscle. Analysis of EMG signals has been widely used to detect human movement intent, control various human-machine interfaces, diagnose neuromuscular diseases, and model the neuromusculoskeletal system. With the advances of artificial intelligence and soft computing, many sophisticated techniques have been proposed for these purposes. Hybrid soft computing systems (HSCS), the integration of these different techniques, aim to further improve the effectiveness, efficiency, and accuracy of EMG analysis. This paper reviews and compares key combinations of neural networks, support vector machines, fuzzy logic, evolutionary computing, and swarm intelligence for EMG analysis. Our suggestions on the possible future development of HSCS in EMG analysis are also given in terms of basic soft computing techniques, further combination of these techniques, and their other applications in EMG analysis. PMID:24490979

  9. Computer applications for engineering/structural analysis. Revision 1

    SciTech Connect

    Zaslawsky, M.; Samaddar, S.K.

    1991-12-31

    Analysts and organizations have a tendency to lock themselves into specific codes, with the obvious consequences of not addressing the real problem and thus reaching the wrong conclusion. This paper discusses the role of the analyst in selecting computer codes. The participation and support of a computation division in modifying the source program, configuration management, and pre- and post-processing of codes are among the subjects discussed. Specific examples illustrating the computer code selection process are described in the following problem areas: soil-structure interaction, structural analysis of nuclear reactors, analysis of waste tanks where fluid-structure interaction is important, analysis of equipment, structure-structure interaction, analysis of the operation of the Superconducting Super Collider, which includes friction and transient temperature, and 3D analysis of the 10-meter telescope being built in Hawaii. Validation and verification of computer codes and their impact on the selection process are also discussed.

  10. Advances in Computer-Based Autoantibodies Analysis

    NASA Astrophysics Data System (ADS)

    Soda, Paolo; Iannello, Giulio

    Indirect Immunofluorescence (IIF) imaging is the recommended method to detect autoantibodies in patient serum, whose common markers are antinuclear autoantibodies (ANA) and autoantibodies directed against double strand DNA (anti-dsDNA). Since the availability of accurately performed and correctly reported laboratory determinations is crucial for the clinicians, an evident medical demand is the development of Computer Aided Diagnosis (CAD) tools supporting physicians' decisions.

  11. TAIR: A transonic airfoil analysis computer code

    NASA Technical Reports Server (NTRS)

    Dougherty, F. C.; Holst, T. L.; Grundy, K. L.; Thomas, S. D.

    1981-01-01

    The operation of the TAIR (Transonic AIRfoil) computer code, which uses a fast, fully implicit algorithm to solve the conservative full-potential equation for transonic flow fields about arbitrary airfoils, is described on two levels of sophistication: simplified operation and detailed operation. The program organization and theory are elaborated to simplify modification of TAIR for new applications. Examples with input and output are given for a wide range of cases, including incompressible, subcritical compressible, and transonic calculations.

  12. Scalable Computer Performance and Analysis (Hierarchical INTegration)

    Energy Science and Technology Software Center (ESTSC)

    1999-09-02

    HINT is a program to measure the performance of a wide variety of scalable computer systems. It is capable of demonstrating the benefits of using more memory or processing power, and of improving communications within the system. HINT can be used for measurement of an existing system, while the associated program ANALYTIC HINT can be used to explain the measurements or as a design tool for proposed systems.

  13. A 3-Dimensional Anatomic Study of the Distal Biceps Tendon

    PubMed Central

    Walton, Christine; Li, Zhi; Pennings, Amanda; Agur, Anne; Elmaraghy, Amr

    2015-01-01

    Background Complete rupture of the distal biceps tendon from its osseous attachment is most often treated with operative intervention. Knowledge of the overall tendon morphology as well as the orientation of the collagenous fibers throughout the musculotendinous junction are key to intraoperative decision making and surgical technique in both the acute and chronic setting. Unfortunately, there is little information available in the literature. Purpose To comprehensively describe the morphology of the distal biceps tendon. Study Design Descriptive laboratory study. Methods The distal biceps terminal musculature, musculotendinous junction, and tendon were digitized in 10 cadaveric specimens and data reconstructed using 3-dimensional modeling. Results The average length, width, and thickness of the external distal biceps tendon were found to be 63.0, 6.0, and 3.0 mm, respectively. A unique expansion of the tendon fibers within the distal muscle was characterized, creating a thick collagenous network along the central component between the long and short heads. Conclusion This study documents the morphologic parameters of the native distal biceps tendon. Reconstruction may be necessary, especially in chronic distal biceps tendon ruptures, if the remaining tendon morphology is significantly compromised compared with the native distal biceps tendon. Knowledge of normal anatomical distal biceps tendon parameters may also guide the selection of a substitute graft with similar morphological characteristics. Clinical Relevance A thorough description of distal biceps tendon morphology is important to guide intraoperative decision making between primary repair and reconstruction and to better select the most appropriate graft. The detailed description of the tendinous expansion into the muscle may provide insight into better graft-weaving and suture-grasping techniques to maximize proximal graft incorporation. PMID:26665092

  14. Multivariate analysis: A statistical approach for computations

    NASA Astrophysics Data System (ADS)

    Michu, Sachin; Kaushik, Vandana

    2014-10-01

    Multivariate analysis is a statistical approach commonly used in automotive diagnosis, education, cluster evaluation in finance, and, more recently, the health-related professions. The objective of this paper is to provide a detailed exploratory discussion of factor analysis (FA) in image retrieval and correlation analysis (CA) of network traffic. Image retrieval methods aim to retrieve relevant images from a collected database, based on their content. The problem is made more difficult by the high dimension of the variable space in which the images are represented. Multivariate correlation analysis proposes an anomaly detection and analysis method based on the correlation coefficient matrix. Anomalous behaviors in the network include various attacks, such as DDoS attacks and network scanning.
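
    A minimal sketch in the spirit of the correlation-matrix method described above: the correlation structure of a traffic window is compared against a baseline, and a large distance flags an anomaly. The feature definitions, window size, and scoring rule are assumptions, not the paper's method in detail.

        import numpy as np

        rng = np.random.default_rng(3)

        def traffic_window(n=500):
            # Synthetic feature matrix (rows = samples; columns = assumed
            # features such as packet rate, byte rate, SYN count, flow count)
            x = rng.normal(size=(n, 4))
            x[:, 1] += 0.8 * x[:, 0]      # normal traffic: correlated features
            return x

        R_base = np.corrcoef(traffic_window(), rowvar=False)

        def anomaly_score(window):
            # Frobenius distance between correlation matrices: large values
            # indicate the traffic's correlation structure has changed
            return np.linalg.norm(np.corrcoef(window, rowvar=False) - R_base)

        normal = traffic_window()
        attack = rng.normal(size=(500, 4))    # correlation structure destroyed
        print("normal:", anomaly_score(normal), " attack:", anomaly_score(attack))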

  15. Analysis of a Multiprocessor Guidance Computer. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Maltach, E. G.

    1969-01-01

    The design of the next generation of spaceborne digital computers is described, and a possible multiprocessor computer configuration is analyzed. For the analysis, a set of representative space computing tasks was abstracted from the Lunar Module Guidance Computer programs as executed during the Apollo lunar landing. This computer performs about 24 concurrent functions, with iteration rates from 10 times per second to once every two seconds. These jobs were tabulated in a machine-independent form, and statistics of the overall job set were obtained. It was concluded, based on a comparison of simulation and Markov results, that the Markov process analysis is accurate in predicting overall trends and in configuration comparisons, but does not provide useful detailed information in specific situations. Using both types of analysis, it was determined that the job scheduling function is critical for the efficiency of the multiprocessor. It is recommended that research into the area of automatic job scheduling be performed.
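
    For orientation, the steady-state computation at the heart of such a Markov process analysis can be written in a few lines; the transition matrix below is an invented toy over "processors busy" states, not data from the thesis.

        import numpy as np

        # Row-stochastic transition matrix of a toy discrete-time Markov chain
        P = np.array([[0.6, 0.3, 0.1],
                      [0.2, 0.5, 0.3],
                      [0.1, 0.3, 0.6]])

        # Steady-state vector pi solves pi P = pi subject to sum(pi) = 1;
        # append the normalization constraint and solve by least squares.
        A = np.vstack([P.T - np.eye(3), np.ones(3)])
        b = np.array([0.0, 0.0, 0.0, 1.0])
        pi, *_ = np.linalg.lstsq(A, b, rcond=None)
        print("steady-state occupancy:", np.round(pi, 4))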

  16. Generation and 3-Dimensional Quantitation of Arterial Lesions in Mice Using Optical Projection Tomography

    PubMed Central

    Kirkby, Nicholas S.; Low, Lucinda; Wu, Junxi; Miller, Eileen; Seckl, Jonathan R.; Walker, Brian R.; Webb, David J.; Hadoke, Patrick W. F.

    2015-01-01

    The generation and analysis of vascular lesions in appropriate animal models is a cornerstone of research into cardiovascular disease, generating important information on the pathogenesis of lesion formation and the action of novel therapies. Use of atherosclerosis-prone mice, surgical methods of lesion induction, and dietary modification has dramatically improved understanding of the mechanisms that contribute to disease development and the potential of new treatments. Classically, analysis of lesions is performed ex vivo using 2-dimensional histological techniques. This article describes application of optical projection tomography (OPT) to 3-dimensional quantitation of arterial lesions. As this technique is non-destructive, it can be used as an adjunct to standard histological and immunohistochemical analyses. Neointimal lesions were induced by wire-insertion or ligation of the mouse femoral artery whilst atherosclerotic lesions were generated by administration of an atherogenic diet to apoE-deficient mice. Lesions were examined using OPT imaging of autofluorescent emission followed by complementary histological and immunohistochemical analysis. OPT clearly distinguished lesions from the underlying vascular wall. Lesion size was calculated in 2-dimensional sections using planimetry, enabling calculation of lesion volume and maximal cross-sectional area. Data generated using OPT were consistent with measurements obtained using histology, confirming the accuracy of the technique and its potential as a complement (rather than alternative) to traditional methods of analysis. This work demonstrates the potential of OPT for imaging atherosclerotic and neointimal lesions. It provides a rapid, much needed ex vivo technique for the routine 3-dimensional quantification of vascular remodelling. PMID:26067588

  17. Accuracy Evaluation of a 3-Dimensional Surface Imaging System for Guidance in Deep-Inspiration Breath-Hold Radiation Therapy

    SciTech Connect

    Alderliesten, Tanja; Sonke, Jan-Jakob; Betgen, Anja; Honnef, Joeri; Vliet-Vroegindeweij, Corine van; Remeijer, Peter

    2013-02-01

    Purpose: To investigate the applicability of 3-dimensional (3D) surface imaging for image guidance in deep-inspiration breath-hold radiation therapy (DIBH-RT) for patients with left-sided breast cancer. For this purpose, setup data based on captured 3D surfaces was compared with setup data based on cone beam computed tomography (CBCT). Methods and Materials: Twenty patients treated with DIBH-RT after breast-conserving surgery (BCS) were included. Before the start of treatment, each patient underwent a breath-hold CT scan for planning purposes. During treatment, dose delivery was preceded by setup verification using CBCT of the left breast. 3D surfaces were captured by a surface imaging system concurrently with the CBCT scan. Retrospectively, surface registrations were performed for CBCT to CT and for a captured 3D surface to CT. The resulting setup errors were compared with linear regression analysis. For the differences between setup errors, group mean, systematic error, random error, and 95% limits of agreement were calculated. Furthermore, receiver operating characteristic (ROC) analysis was performed. Results: Good correlation between setup errors was found: R^2 = 0.70, 0.90, and 0.82 in the left-right, craniocaudal, and anterior-posterior directions, respectively. Systematic errors were ≤0.17 cm in all directions. Random errors were ≤0.15 cm. The limits of agreement were -0.34 to 0.48, -0.42 to 0.39, and -0.52 to 0.23 cm in the left-right, craniocaudal, and anterior-posterior directions, respectively. ROC analysis showed that a threshold between 0.4 and 0.8 cm corresponds to promising true positive rates (0.78-0.95) and false positive rates (0.12-0.28). Conclusions: The results support the application of 3D surface imaging for image guidance in DIBH-RT after BCS.
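
    A minimal sketch of the agreement statistics reported above (group mean, random error, 95% limits of agreement), computed on synthetic stand-in differences. This simplified version pools all differences into one sample; the study's per-patient decomposition of systematic and random error is more involved.

        import numpy as np

        # Differences between surface-imaging and CBCT setup errors for one
        # direction, in cm (synthetic values, not the study's data)
        rng = np.random.default_rng(4)
        diff = rng.normal(0.03, 0.20, size=80)

        group_mean = diff.mean()                  # group mean (M)
        random_error = diff.std(ddof=1)           # spread of differences (SD)
        loa = (group_mean - 1.96 * random_error,  # 95% limits of agreement
               group_mean + 1.96 * random_error)

        print(f"M = {group_mean:.2f} cm, SD = {random_error:.2f} cm, "
              f"LoA = [{loa[0]:.2f}, {loa[1]:.2f}] cm")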

  18. The Utility of Computer-Assisted Power Analysis Lab Instruction

    ERIC Educational Resources Information Center

    Petrocelli, John V.

    2007-01-01

    Undergraduate students (N = 47), enrolled in 2 separate psychology research methods classes, evaluated a power analysis lab demonstration and homework assignment. Students attended 1 of 2 lectures that included a basic introduction to power analysis and sample size analysis. One lecture included a demonstration of how to use a computer-based power…

  19. The Reliability of Content Analysis of Computer Conference Communication

    ERIC Educational Resources Information Center

    Rattleff, Pernille

    2007-01-01

    The focus of this article is the reliability of content analysis of students' computer conference communication. Content analysis is often used when researching the relationship between learning and the use of information and communications technology in educational settings. A number of studies where content analysis is used and classification…

  20. Computational fluid dynamics combustion analysis evaluation

    NASA Technical Reports Server (NTRS)

    Kim, Y. M.; Shang, H. M.; Chen, C. P.; Ziebarth, J. P.

    1992-01-01

    This study involves the development of numerical modelling in spray combustion. These modelling efforts are mainly motivated by the need to improve the computational efficiency of the stochastic particle tracking method as well as to incorporate the physical submodels of turbulence, combustion, vaporization, and dense spray effects. The present mathematical formulation and numerical methodologies can be cast in any time-marching pressure correction methodology (PCM), such as the FDNS code and the MAST code. A sequence of validation cases involving steady burning sprays and transient evaporating sprays is included.

  1. Computational analysis of scramjet dual mode operation

    NASA Technical Reports Server (NTRS)

    1985-01-01

    One critical element in the design of a Scramjet is a detailed understanding of the complex flow field in the engine during various phases of operation. One area of interest is the computation of chemically reacting flows in the vicinity of flame holders. A method for solving the Navier-Stokes equations with chemical reactions is proposed. Also of interest are the flame holding characteristics of simple ramps and rearward-facing steps. Both of these configurations are considered candidates for Scramjet flame holders.

  2. Computer-Based Interaction Analysis with DEGREE Revisited

    ERIC Educational Resources Information Center

    Barros, B.; Verdejo, M. F.

    2016-01-01

    We review our research with "DEGREE" and analyse how our work has impacted the collaborative learning community since 2000. Our research is framed within the context of computer-based interaction analysis and the development of computer-supported collaborative learning (CSCL) tools. We identify some aspects of our work which have been…

  3. Computer aided analysis and optimization of mechanical system dynamics

    NASA Technical Reports Server (NTRS)

    Haug, E. J.

    1984-01-01

    The purpose is to outline a computational approach to spatial dynamics of mechanical systems that substantially enlarges the scope of consideration to include flexible bodies, feedback control, hydraulics, and related interdisciplinary effects. Design sensitivity analysis and optimization is the ultimate goal. The approach to computer generation and solution of the system dynamic equations and graphical methods for creating animations as output is outlined.

  4. The symbolic computation and automatic analysis of trajectories

    NASA Technical Reports Server (NTRS)

    Grossman, Robert

    1991-01-01

    Research was generally done on computation of trajectories of dynamical systems, especially control systems. Algorithms were further developed for rewriting expressions involving differential operators. The differential operators involved arise in the local analysis of nonlinear control systems. An initial design was completed of the system architecture for software to analyze nonlinear control systems using data base computing.

  5. Simple hobby computer-based off-gas analysis system

    SciTech Connect

    Forrest, E.H.; Jansen, N.B.; Flickinger, M.C.; Tsao, G.T.

    1981-02-01

    An Apple II computer has been adapted to monitor fermentation off-gas in laboratory and pilot scale fermentors. It can calculate oxygen uptake rates, carbon dioxide evolution rates, and the respiratory quotient, as well as initiate recalibration procedures. In this report the computer-based off-gas analysis system is described.

  6. Potential applications of computational fluid dynamics to biofluid analysis

    NASA Technical Reports Server (NTRS)

    Kwak, D.; Chang, J. L. C.; Rogers, S. E.; Rosenfeld, M.

    1988-01-01

    Computational fluid dynamics has developed to the stage where it has become an indispensable part of aerospace research and design. In view of advances made in aerospace applications, the computational approach can be used for biofluid mechanics research. Several flow simulation methods developed for aerospace problems are briefly discussed for potential applications to biofluids, especially to blood flow analysis.

  7. Surveillance Analysis Computer System (SACS) software requirements specification (SRS)

    SciTech Connect

    Glasscock, J.A.; Flanagan, M.J.

    1995-09-01

    This document is the primary document establishing requirements for the Surveillance Analysis Computer System (SACS) Database, an Impact Level 3Q system. The purpose is to provide the customer and the performing organization with the requirements for the SACS Project.

  8. Computational Modeling, Formal Analysis, and Tools for Systems Biology

    PubMed Central

    Bartocci, Ezio; Lió, Pietro

    2016-01-01

    As the amount of biological data in the public domain grows, so does the range of modeling and analysis techniques employed in systems biology. In recent years, a number of theoretical computer science developments have enabled modeling methodology to keep pace. The growing interest within systems biology in executable models and their analysis has necessitated the borrowing of terms and methods from computer science, such as formal analysis, model checking, static analysis, and runtime verification. Here, we discuss the most important and exciting computational methods and tools currently available to systems biologists. We believe that a deeper understanding of the concepts and theory highlighted in this review will produce better software practice, improved investigation of complex biological processes, and even new ideas and better feedback into computer science. PMID:26795950

  9. Computational methods for analysis of dynamic events in cell migration.

    PubMed

    Castañeda, V; Cerda, M; Santibáñez, F; Jara, J; Pulgar, E; Palma, K; Lemus, C G; Osorio-Reich, M; Concha, M L; Härtel, S

    2014-02-01

    Cell migration is a complex biological process that involves changes in shape and organization at the sub-cellular, cellular, and supra-cellular levels. Individual and collective cell migration can be assessed in vitro and in vivo starting from the flagellar driven movement of single sperm cells or bacteria, bacterial gliding and swarming, and amoeboid movement to the orchestrated movement of collective cell migration. One key technology to access migration phenomena is the combination of optical microscopy with image processing algorithms. This approach resolves simple motion estimation (e.g. preferred direction of migrating cells or path characteristics), but can also reveal more complex descriptors (e.g. protrusions or cellular deformations). In order to ensure an accurate quantification, the phenomena under study, their complexity, and the required level of description need to be addressed by an adequate experimental setup and processing pipeline. Here, we review typical workflows for processing starting with image acquisition, restoration (noise and artifact removal, signal enhancement), registration, analysis (object detection, segmentation and characterization) and interpretation (high level understanding). Image processing approaches for quantitative description of cell migration in 2- and 3-dimensional image series, including registration, segmentation, shape and topology description, tracking and motion fields are presented. We discuss advantages, limitations and suitability for different approaches and levels of description. PMID:24467201
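
    A condensed sketch of the detection-and-linking stage of such a pipeline (threshold, label, centroid extraction, nearest-neighbour linking between two frames), using synthetic images; real pipelines add restoration, registration, and more robust tracking, and the thresholding rule here is an arbitrary choice.

        import numpy as np
        from scipy import ndimage

        rng = np.random.default_rng(5)

        def centroids(frame):
            # Threshold, label connected regions, and return their centroids
            mask = frame > frame.mean() + frame.std()
            labels, n = ndimage.label(mask)
            return np.array(ndimage.center_of_mass(frame, labels, range(1, n + 1)))

        frame0 = ndimage.gaussian_filter(rng.random((128, 128)), 6)  # toy "cells"
        frame1 = np.roll(frame0, (3, -2), axis=(0, 1))               # simulated motion

        c0, c1 = centroids(frame0), centroids(frame1)
        # Nearest-neighbour linking between frames gives per-object displacements
        d = np.linalg.norm(c0[:, None, :] - c1[None, :, :], axis=2)
        print("per-object displacement estimates:", np.round(d.min(axis=1), 2))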

  10. The morphometric analysis and recognition of an amyloid plaque in microscope images by computer image processing.

    PubMed

    Grams, A; Liberski, P P; Sobów, T; Napieralska, M; Zubert, M; Napieralski, A

    2000-01-01

    This paper presents an application of two-dimensional image processing to the recognition of amyloid plaques in microscope images of brain tissue. The authors propose to create a universal amyloid plaque computer pattern and special multivariate image segmentation techniques based on collected images and statistical information. The recognition procedure is divided into 3-dimensional statistical colour identification and morphological shape identification. The developed computer system will collect and store image data and exchange them over a network with other collaborating systems. PMID:11693723

  11. Analysis and computer tools for separation processes involving nonideal mixtures

    SciTech Connect

    Lucia, A.

    1992-05-01

    The objectives of this research were to further both the theoretical understanding of, and the development of computer tools (algorithms) for, separation processes involving nonideal mixtures. These objectives were divided into three interrelated major areas: the mathematical analysis of the number of steady-state solutions to multistage separation processes, the numerical analysis of general, related fixed-point methods, and the development and implementation of computer tools for process simulation.

  12. Process for computing geometric perturbations for probabilistic analysis

    SciTech Connect

    Fitch, Simeon H. K.; Riha, David S.; Thacker, Ben H.

    2012-04-10

    A method for computing geometric perturbations for probabilistic analysis. The probabilistic analysis is based on finite element modeling, in which uncertainties in the modeled system are represented by changes in the nominal geometry of the model, referred to as "perturbations". These changes are accomplished using displacement vectors, which are computed for each node of a region of interest and are based on mean-value coordinate calculations.
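
    A minimal sketch of the perturbation idea: nominal node coordinates are shifted along per-node displacement vectors scaled by a sampled uncertainty amplitude. The radial displacement directions below stand in for the mean-value-coordinate construction of the patented method and are an assumption, as are the mesh and amplitude distribution.

        import numpy as np

        rng = np.random.default_rng(6)

        # Nominal node coordinates of a region of interest (hypothetical mesh)
        nodes = rng.random((100, 3))

        # Unit displacement vectors per node: radial from the centroid here
        center = nodes.mean(axis=0)
        dirs = nodes - center
        dirs /= np.linalg.norm(dirs, axis=1, keepdims=True)

        # One perturbation realization: scale the vectors by a sampled
        # geometric uncertainty amplitude
        amplitude = rng.normal(0.0, 0.01)
        perturbed = nodes + amplitude * dirs
        print("max node shift:", np.abs(perturbed - nodes).max())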

  13. Parallel computing for probabilistic fatigue analysis

    NASA Technical Reports Server (NTRS)

    Sues, Robert H.; Lua, Yuan J.; Smith, Mark D.

    1993-01-01

    This paper presents the results of Phase I research to investigate the most effective parallel processing software strategies and hardware configurations for probabilistic structural analysis. We investigate the efficiency of both shared- and distributed-memory architectures via a probabilistic fatigue life analysis problem. We also present a parallel programming approach, the virtual shared-memory paradigm, that is applicable across both types of hardware. Using this approach, problems can be solved on a variety of parallel configurations, including networks of single- or multiprocessor workstations. We conclude that it is possible to effectively parallelize probabilistic fatigue analysis codes; however, special strategies will be needed to achieve large-scale parallelism, to keep a large number of processors busy, and to treat problems with the large memory requirements encountered in practice. We also conclude that distributed-memory architecture is preferable to shared-memory for achieving large-scale parallelism; however, in the future, the currently emerging hybrid-memory architectures will likely be optimal.

  14. System Matrix Analysis for Computed Tomography Imaging

    PubMed Central

    Flores, Liubov; Vidal, Vicent; Verdú, Gumersindo

    2015-01-01

    In practical applications of computed tomography imaging (CT), it is often the case that the set of projection data is incomplete owing to the physical conditions of the data acquisition process. On the other hand, the high radiation dose imposed on patients is also undesired. These issues demand that high quality CT images can be reconstructed from limited projection data. For this reason, iterative methods of image reconstruction have become a topic of increased research interest. Several algorithms have been proposed for few-view CT. We consider that the accurate solution of the reconstruction problem also depends on the system matrix that simulates the scanning process. In this work, we analyze the application of the Siddon method to generate elements of the matrix and we present results based on real projection data. PMID:26575482
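
    For orientation, a simplified 2-D version of the Siddon idea: the parametric crossings of a ray with the grid lines yield the intersection length of the ray with each pixel, i.e. the nonzero entries of one system-matrix row. Unit pixels and a toy ray are assumed; the production algorithm is more careful about indexing and efficiency.

        import numpy as np

        def siddon_2d(p0, p1, nx, ny):
            # Intersection lengths of the ray p0 -> p1 with a unit-pixel grid
            p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
            d = p1 - p0
            alphas = [0.0, 1.0]
            for axis, n in ((0, nx), (1, ny)):
                if d[axis] != 0.0:
                    a = (np.arange(n + 1) - p0[axis]) / d[axis]
                    alphas.extend(a[(a > 0.0) & (a < 1.0)])
            alphas = np.unique(alphas)            # sorted crossing parameters
            length = np.linalg.norm(d)
            weights = {}
            for a0, a1 in zip(alphas[:-1], alphas[1:]):
                mid = p0 + 0.5 * (a0 + a1) * d    # segment midpoint -> pixel
                i, j = int(mid[0]), int(mid[1])
                if 0 <= i < nx and 0 <= j < ny:
                    weights[(i, j)] = (a1 - a0) * length
            return weights   # nonzero system-matrix entries for this ray

        print(siddon_2d((0.0, 0.5), (4.0, 3.5), 4, 4))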

  16. RSAC -6 Radiological Safety Analysis Computer Program

    SciTech Connect

    Schrader, Bradley J; Wenzel, Douglas Rudolph

    2001-06-01

    RSAC-6 is the latest version of the RSAC program. It calculates the consequences of a release of radionuclides to the atmosphere. Using a personal computer, a user can generate a fission product inventory; decay and in-grow the inventory during transport through processes, facilities, and the environment; model the downwind dispersion of the activity; and calculate doses to downwind individuals. Internal dose from the inhalation and ingestion pathways is calculated. External dose from ground surface and plume gamma pathways is calculated. New and exciting updates to the program include the ability to evaluate a release to an enclosed room, resuspension of deposited activity and evaluation of a release up to 1 meter from the release point. Enhanced tools are included for dry deposition, building wake, occupancy factors, respirable fraction, AMAD adjustment, updated and enhanced radionuclide inventory and inclusion of the dose-conversion factors from FGR 11 and 12.

  17. Combinatorial reliability analysis of multiprocessor computers

    SciTech Connect

    Hwang, K.; Tian-Pong Chang

    1982-12-01

    The authors propose a combinatorial method to evaluate the reliability of multiprocessor computers. Multiprocessor structures are classified as crossbar switch, time-shared buses, and multiport memories. Closed-form reliability expressions are derived via combinatorial path enumeration on the probabilistic-graph representation of a multiprocessor system. The method can analyze the reliability performance of real systems like C.mmp, Tandem 16, and Univac 1100/80. User-oriented performance levels are defined for measuring the performability of degradable multiprocessor systems. For a regularly structured multiprocessor system, it is fast and easy to use this technique for evaluating system reliability with statistically independent component reliabilities. System availability can also be evaluated by this reliability study. 6 references.
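
    The closed-form character of such expressions can be illustrated with the elementary series, parallel, and k-out-of-n reliability formulas below; the 4-processor example and its component reliabilities are invented, not taken from the paper.

    ```python
    from math import comb

    def series(rs):
        """All components must work."""
        p = 1.0
        for r in rs:
            p *= r
        return p

    def parallel(rs):
        """At least one component must work."""
        q = 1.0
        for r in rs:
            q *= (1.0 - r)
        return 1.0 - q

    def k_out_of_n(k, n, r):
        """At least k of n identical components (reliability r) must work --
        a typical performance level for a degradable multiprocessor system."""
        return sum(comb(n, m) * r**m * (1 - r)**(n - m) for m in range(k, n + 1))

    # Hypothetical 4-processor system: 'up' if at least 2 processors work
    # and the shared bus (r = 0.99) works.
    print(series([0.99, k_out_of_n(2, 4, 0.95)]))
    ```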

  18. Computer program for the transient analysis of radioisotope thermoelectric generators.

    NASA Technical Reports Server (NTRS)

    Eggers, P. E.; Ridihalgh, J. L.

    1972-01-01

    A computer program is described which represents a comprehensive analytical tool providing the capability for predicting the output power and temperature profile of an arbitrary radioisotope thermoelectric generator (RTG) design in the presence of time-dependent operating conditions. The approach taken involves the merging of three existing computer programs - namely, an RTG weight optimization design program, a thermoelectric analysis program, and a nodal heat-transfer computer program. A total of seven transient conditions are included in the computer program as the principal transients affecting long- and short-term performance characteristics of RTGs. This computer program is unique in that it designs an optimum RTG, generates a thermal model or analog and performs heat-transfer analysis of the RTG under user-specified transient conditions.

  19. Computational analysis of thresholds for magnetophosphenes

    NASA Astrophysics Data System (ADS)

    Laakso, Ilkka; Hirata, Akimasa

    2012-10-01

    In international guidelines, basic restriction limits on the exposure of humans to low-frequency magnetic and electric fields are set with the objective of preventing the generation of phosphenes, visual sensations of flashing light not caused by light. Measured data on magnetophosphenes, i.e. phosphenes caused by a magnetically induced electric field on the retina, are available from volunteer studies. However, there is no simple way for determining the retinal threshold electric field or current density from the measured threshold magnetic flux density. In this study, the experimental field configuration of a previous study, in which phosphenes were generated in volunteers by exposing their heads to a magnetic field between the poles of an electromagnet, is computationally reproduced. The finite-element method is used for determining the induced electric field and current in five different MRI-based anatomical models of the head. The direction of the induced current density on the retina is dominantly radial to the eyeball, and the maximum induced current density is observed at the superior and inferior sides of the retina, which agrees with literature data on the location of magnetophosphenes at the periphery of the visual field. On the basis of computed data, the macroscopic retinal threshold current density for phosphenes at 20 Hz can be estimated as 10 mA m^-2 (-20% to +30%, depending on the anatomical model); this current density corresponds to an induced eddy current of 14 μA (-20% to +10%), and about 20% of this eddy current flows through each eye. The ICNIRP basic restriction limit for the induced electric field in the case of occupational exposure is not exceeded until the magnetic flux density is about two to three times the measured threshold for magnetophosphenes, so the basic restriction limit does not seem to be conservative. However, the reasons for the non-conservativeness are purely technical: removal of the highest 1% of electric

  20. Interactive Spectral Analysis and Computation (ISAAC)

    NASA Technical Reports Server (NTRS)

    Lytle, D. M.

    1992-01-01

    Isaac is a task in the NSO external package for IRAF. A descendant of a FORTRAN program written to analyze data from a Fourier transform spectrometer, the current implementation has been generalized sufficiently to make it useful for general spectral analysis and other one dimensional data analysis tasks. The user interface for Isaac is implemented as an interpreted mini-language containing a powerful, programmable vector calculator. Built-in commands provide much of the functionality needed to produce accurate line lists from input spectra. These built-in functions include automated spectral line finding, least squares fitting of Voigt profiles to spectral lines including equality constraints, various filters including an optimal filter construction tool, continuum fitting, and various I/O functions.
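
    As a sketch of one such built-in capability, least-squares fitting of a Voigt profile to a spectral line, here is a short SciPy version on synthetic data; it does not reproduce ISAAC's mini-language, and all line parameters are invented.

    ```python
    import numpy as np
    from scipy.special import voigt_profile
    from scipy.optimize import curve_fit

    def line(x, amp, center, sigma, gamma, baseline):
        """Single Voigt spectral line on a flat continuum."""
        return amp * voigt_profile(x - center, sigma, gamma) + baseline

    # Synthetic spectrum standing in for Fourier transform spectrometer data.
    x = np.linspace(-5.0, 5.0, 400)
    rng = np.random.default_rng(1)
    y = line(x, 2.0, 0.3, 0.5, 0.4, 1.0) + rng.normal(0.0, 0.02, x.size)

    popt, pcov = curve_fit(line, x, y, p0=[1.0, 0.0, 0.3, 0.3, 1.0])
    print("center = %.3f +/- %.3f" % (popt[1], np.sqrt(pcov[1, 1])))
    ```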

  1. Adaptive computational methods for aerothermal heating analysis

    NASA Technical Reports Server (NTRS)

    Price, John M.; Oden, J. Tinsley

    1988-01-01

    The development of adaptive gridding techniques for finite-element analysis of fluid dynamics equations is described. The developmental work was done with the Euler equations with concentration on shock and inviscid flow field capturing. Ultimately this methodology is to be applied to a viscous analysis for the purpose of predicting accurate aerothermal loads on complex shapes subjected to high speed flow environments. The development of local error estimate strategies as a basis for refinement strategies is discussed, as well as the refinement strategies themselves. The application of the strategies to triangular elements and a finite-element flux-corrected-transport numerical scheme are presented. The implementation of these strategies in the GIM/PAGE code for 2-D and 3-D applications is documented and demonstrated.

  2. Acoustic analysis of a computer cooling fan

    NASA Astrophysics Data System (ADS)

    Huang, Lixi; Wang, Jian

    2005-10-01

    Noise radiated by a typical computer cooling fan is investigated experimentally and analyzed within the framework of rotor-stator interaction noise using point source formulation. The fan is 9 cm in rotor casing diameter and its design speed is 3000 rpm. The main noise sources are found and quantified; they are (a) the inlet flow distortion caused by the sharp edges of the incomplete bellmouth due to the square outer framework, (b) the interaction of rotor blades with the downstream struts which hold the motor, and (c) the extra size of one strut carrying electrical wiring. Methods are devised to extract the rotor-strut interaction noise, (b) and (c), radiated by the component forces of drag and thrust at the leading and higher order spinning pressure modes, as well as the leading edge noise generated by (a). By re-installing the original fan rotor in various casings, the noises radiated by the three features of the original fan are separated, and details of the directivity are interpreted. It is found that the inlet flow distortion and the unequal set of four struts make about the same amount of noise. Their corrections show a potential of around 10-dB sound power reduction.

  3. Local spatial frequency analysis for computer vision

    NASA Technical Reports Server (NTRS)

    Krumm, John; Shafer, Steven A.

    1990-01-01

    A sense of vision is a prerequisite for a robot to function in an unstructured environment. However, real-world scenes contain many interacting phenomena that lead to complex images which are difficult to interpret automatically. Typical computer vision research proceeds by analyzing various effects in isolation (e.g., shading, texture, stereo, defocus), usually on images devoid of realistic complicating factors. This leads to specialized algorithms which fail on real-world images. Part of this failure is due to the dichotomy of useful representations for these phenomena. Some effects are best described in the spatial domain, while others are more naturally expressed in frequency. In order to resolve this dichotomy, we present the combined space/frequency representation which, for each point in an image, shows the spatial frequencies at that point. Within this common representation, we develop a set of simple, natural theories describing phenomena such as texture, shape, aliasing and lens parameters. We show these theories lead to algorithms for shape from texture and for dealiasing image data. The space/frequency representation should be a key aid in untangling the complex interaction of phenomena in images, allowing automatic understanding of real-world scenes.
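
    A minimal sketch of a combined space/frequency representation for a 1-dimensional texture signal: a windowed FFT is taken at each spatial position, so the dominant spatial frequency can be read off as a function of location. The chirp and window parameters are arbitrary.

    ```python
    import numpy as np

    def local_spectrum(signal, window=64, hop=16):
        """Combined space/frequency representation of a 1-D signal: one
        windowed FFT magnitude per spatial position (Gabor-style transform)."""
        win = np.hanning(window)
        starts = range(0, len(signal) - window + 1, hop)
        return np.array([np.abs(np.fft.rfft(signal[s:s + window] * win))
                         for s in starts])

    # A chirp: spatial frequency increases along the 'image' row, so the
    # dominant FFT bin should drift upward with position.
    x = np.arange(1024)
    texture = np.sin(2 * np.pi * (0.01 + 0.00005 * x) * x)
    S = local_spectrum(texture)
    print(S.argmax(axis=1))  # dominant frequency bin per window position
    ```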

  4. Early life history: A computer analysis

    NASA Astrophysics Data System (ADS)

    Bell, Peter M.

    Theoretical computer calculations, based in part on measurements of ‘young’ stars obtained with an orbiting telescope, may require a reexamination of some of the basic ideas about the composition of the earth's early atmosphere and the origin of life. According to Joel S. Levine, atmospheric geophysicist at the Langley Research Center, ‘the overwhelming majority of chemical evolution experiments since the first in 1952 may have been conducted with the wrong atmospheric mixture.’Astronomical measurements indicate that considerably more ultraviolet (UV) radiation may have been emitted by the young sun in comparison to that emitted by the present sun. Therefore, high levels of such radiation from the young sun, potentially harmful to life, would have been striking the earth at the very time life was being formed.Recent photochemical calculations by Levine and others at Langley state that at the time complex organic molecules (the precursors of living systems) were first formed from atmospheric gases the earth's atmosphere was not composed primarily of methane, ammonia, and hydrogen, as was previously supposed; instead, it was composed of carbon dioxide, nitrogen, and water vapor, all resulting from volcanic activity. The calculations indicate that both methane and ammonia were extremely short-lived and that such an atmosphere was photochemically unstable if it existed at all.

  5. AIR INGRESS ANALYSIS: COMPUTATIONAL FLUID DYNAMIC MODELS

    SciTech Connect

    Chang H. Oh; Eung S. Kim; Richard Schultz; Hans Gougar; David Petti; Hyung S. Kang

    2010-08-01

    The Idaho National Laboratory (INL), under the auspices of the U.S. Department of Energy, is performing research and development that focuses on key phenomena important during potential scenarios that may occur in very high temperature reactors (VHTRs). Phenomena Identification and Ranking Studies to date have ranked an air ingress event, following on the heels of a VHTR depressurization, as important with regard to core safety. Consequently, the development of advanced air ingress-related models and verification and validation data are a very high priority. Following a loss of coolant and system depressurization incident, air will enter the core of the High Temperature Gas Cooled Reactor through the break, possibly causing oxidation of the in-core and reflector graphite structures. Simple core and plant models indicate that, under certain circumstances, the oxidation may proceed at an elevated rate with additional heat generated from the oxidation reaction itself. Under postulated conditions of fluid flow and temperature, excessive degradation of the lower plenum graphite can lead to a loss of structural support. Excessive oxidation of core graphite can also lead to the release of fission products into the confinement, which could be detrimental to reactor safety. The computational fluid dynamics models developed in this study will improve our understanding of this phenomenon. This paper presents two-dimensional and three-dimensional CFD results for the quantitative assessment of the air ingress phenomena. A portion of the results for the density-driven stratified flow in the inlet pipe is compared with experimental results.

  6. Automatic analysis of computation in biochemical reactions.

    PubMed

    Egri-Nagy, Attila; Nehaniv, Chrystopher L; Rhodes, John L; Schilstra, Maria J

    2008-01-01

    We propose a modeling and analysis method for biochemical reactions based on finite state automata. This is a completely different approach compared to traditional modeling of reactions by differential equations. Our method aims to explore the algebraic structure behind chemical reactions using automatically generated coordinate systems. In this paper we briefly summarize the underlying mathematical theory (the algebraic hierarchical decomposition theory of finite state automata) and describe how such automata can be derived from the description of chemical reaction networks. We also outline techniques for the flexible manipulation of existing models. As a real-world example we use the Krebs citric acid cycle. PMID:18606208

  7. Real-time Interpolation for True 3-Dimensional Ultrasound Image Volumes

    PubMed Central

    Ji, Songbai; Roberts, David W.; Hartov, Alex; Paulsen, Keith D.

    2013-01-01

    We compared trilinear interpolation to voxel nearest neighbor and distance-weighted algorithms for fast and accurate processing of true 3-dimensional ultrasound (3DUS) image volumes. In this study, the computational efficiency and interpolation accuracy of the 3 methods were compared on the basis of a simulated 3DUS image volume, 34 clinical 3DUS image volumes from 5 patients, and 2 experimental phantom image volumes. We show that trilinear interpolation improves interpolation accuracy over both the voxel nearest neighbor and distance-weighted algorithms yet achieves real-time computational performance that is comparable to the voxel nearest neighbor algorithm (1–2 orders of magnitude faster than the distance-weighted algorithm) as well as the fastest pixel-based algorithms for processing tracked 2-dimensional ultrasound images (0.035 seconds per 2-dimensional cross-sectional image [76,800 pixels interpolated, or 0.46 ms/1000 pixels] and 1.05 seconds per full volume with a 1-mm3 voxel size [4.6 million voxels interpolated, or 0.23 ms/1000 voxels]). On the basis of these results, trilinear interpolation is recommended as a fast and accurate interpolation method for rectilinear sampling of 3DUS image acquisitions, which is required to facilitate subsequent processing and display during operating room procedures such as image-guided neurosurgery. PMID:21266563
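
    Trilinear interpolation itself is compact; the sketch below is a generic vectorized version for fractional voxel coordinates, not the authors' implementation, and the volume and query point are synthetic.

    ```python
    import numpy as np

    def trilinear(volume, pts):
        """Trilinear interpolation of `volume` at fractional voxel coordinates
        `pts` (shape (N, 3), ordered z, y, x), vectorized over query points."""
        p0 = np.floor(pts).astype(int)
        # Clamp so the +1 neighbor stays inside the volume.
        p0 = np.clip(p0, 0, np.array(volume.shape) - 2)
        f = pts - p0                          # fractional offsets in [0, 1)
        z, y, x = p0.T
        fz, fy, fx = f.T
        c000 = volume[z, y, x];         c100 = volume[z + 1, y, x]
        c010 = volume[z, y + 1, x];     c001 = volume[z, y, x + 1]
        c110 = volume[z + 1, y + 1, x]; c101 = volume[z + 1, y, x + 1]
        c011 = volume[z, y + 1, x + 1]; c111 = volume[z + 1, y + 1, x + 1]
        return ((1-fz)*(1-fy)*(1-fx)*c000 + fz*(1-fy)*(1-fx)*c100 +
                (1-fz)*fy*(1-fx)*c010 + (1-fz)*(1-fy)*fx*c001 +
                fz*fy*(1-fx)*c110 + fz*(1-fy)*fx*c101 +
                (1-fz)*fy*fx*c011 + fz*fy*fx*c111)

    vol = np.random.default_rng(0).random((32, 32, 32))
    print(trilinear(vol, np.array([[3.2, 10.7, 21.5]])))
    ```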

  8. Computational fluid dynamic analysis of liquid rocket combustion instability

    NASA Technical Reports Server (NTRS)

    Venkateswaran, Sankaran; Grenda, Jeffrey; Merkle, Charles L.

    1991-01-01

    The paper presents a computational analysis of liquid rocket combustion instability. Consideration is given to both a fully nonlinear unsteady calculation as well as a new CFD-based linearized stability analysis. An analytical solution for the linear stability problem in a constant area combustion chamber with uniform mean flow is developed to verify the numerical analyses.

  9. SITES-WATER RESOURCE SITE ANALYSIS COMPUTER PROGRAM, VERSION 2005

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The SITES Water Resource Site Analysis Computer program is used by USDA-NRCS and others for design and analysis of dams. The current program evolved from the DAMS2 program of the 1980’s with new features added for both functionality and ease of use. An Integrated Development Environment (IDE) was ...

  10. A Computer Program to Determine Reliability Using Analysis of Variance

    ERIC Educational Resources Information Center

    Burns, Edward

    1976-01-01

    A computer program, written in Fortran IV, is described which assesses reliability by using analysis of variance. It produces a complete analysis of variance table in addition to reliability coefficients for unadjusted and adjusted data as well as the intraclass correlation for m subjects and n items. (Author)
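
    The underlying computation can be sketched directly: a two-way ANOVA decomposition of an m-subjects by n-items score matrix yields Hoyt's reliability coefficient (equivalent to Cronbach's alpha). The simulated data and function name below are illustrative; the program's adjusted-data options and intraclass correlation output are not reproduced.

    ```python
    import numpy as np

    def hoyt_reliability(scores):
        """ANOVA-based reliability (Hoyt; equivalent to Cronbach's alpha)
        for an m-subjects x n-items score matrix."""
        m, n = scores.shape
        grand = scores.mean()
        ss_subj = n * ((scores.mean(axis=1) - grand) ** 2).sum()
        ss_item = m * ((scores.mean(axis=0) - grand) ** 2).sum()
        ss_err = ((scores - grand) ** 2).sum() - ss_subj - ss_item
        ms_subj = ss_subj / (m - 1)
        ms_err = ss_err / ((m - 1) * (n - 1))
        return 1.0 - ms_err / ms_subj

    rng = np.random.default_rng(2)
    true_score = rng.normal(size=(50, 1))                  # 50 subjects
    data = true_score + rng.normal(0, 0.5, size=(50, 10))  # 10 items
    print(f"reliability = {hoyt_reliability(data):.3f}")
    ```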

  11. Computer-Aided Communication Satellite System Analysis and Optimization.

    ERIC Educational Resources Information Center

    Stagl, Thomas W.; And Others

    Various published computer programs for fixed/broadcast communication satellite system synthesis and optimization are discussed. The rationale for selecting General Dynamics/Convair's Satellite Telecommunication Analysis and Modeling Program (STAMP) in modified form to aid in the system costing and sensitivity analysis work in the Program on…

  12. Computational Aeroelastic Analysis of the Ares Launch Vehicle During Ascent

    NASA Technical Reports Server (NTRS)

    Bartels, Robert E.; Chwalowski, Pawel; Massey, Steven J.; Vatsa, Veer N.; Heeg, Jennifer; Wieseman, Carol D.; Mineck, Raymond E.

    2010-01-01

    This paper presents the static and dynamic computational aeroelastic (CAE) analyses of the Ares crew launch vehicle (CLV) during atmospheric ascent. The influence of launch vehicle flexibility on the static aerodynamic loading and integrated aerodynamic force and moment coefficients is discussed. The ultimate purpose of this analysis is to assess the aeroelastic stability of the launch vehicle along the ascent trajectory. A comparison of analysis results for several versions of the Ares CLV will be made. Flexible static and dynamic analyses based on rigid computational fluid dynamic (CFD) data are compared with a fully coupled aeroelastic time marching CFD analysis of the launch vehicle.

  13. Method and apparatus for imaging through 3-dimensional tracking of protons

    NASA Technical Reports Server (NTRS)

    Ryan, James M. (Inventor); Macri, John R. (Inventor); McConnell, Mark L. (Inventor)

    2001-01-01

    A method and apparatus for creating density images of an object through the 3-dimensional tracking of protons that have passed through the object are provided. More specifically, the 3-dimensional tracking of the protons is accomplished by gathering and analyzing images of the ionization tracks of the protons in a closely packed stack of scintillating fibers.

  14. AVES: A Computer Cluster System approach for INTEGRAL Scientific Analysis

    NASA Astrophysics Data System (ADS)

    Federici, M.; Martino, B. L.; Natalucci, L.; Umbertini, P.

    The AVES computing system, based on a "Cluster" architecture, is a fully integrated, low-cost computing facility dedicated to the archiving and analysis of INTEGRAL data. AVES is a modular system that uses the SLURM software resource manager and allows almost unlimited expandability (65,536 nodes and hundreds of thousands of processors); it is currently composed of 30 personal computers with quad-core CPUs, reaching a computing power of 300 gigaflops (300×10^9 floating-point operations per second), with 120 GB of RAM and 7.5 terabytes (TB) of storage memory in UFS configuration plus 6 TB for the users area. AVES was designed and built to solve the growing problems raised by the analysis of the large amount of data accumulated by the INTEGRAL mission (currently about 9 TB and due to increase every year). The analysis software used is the OSA package, distributed by the ISDC in Geneva; this very complex package consists of dozens of programs that cannot be converted to parallel computing. To overcome this limitation we developed a series of programs that distribute the analysis workload over the various nodes, making AVES automatically divide the analysis into N jobs sent to N cores. This solution thus produces a result similar to that obtained with a parallel computing configuration. In support of this we have developed tools that allow flexible use of the scientific software and quality control of on-line data storage. The AVES software package consists of about 50 specific programs. The overall computing speed, compared to that of a personal computer with a single processor, has thus been enhanced by up to a factor of 70.

  15. A Computational Discriminability Analysis on Twin Fingerprints

    NASA Astrophysics Data System (ADS)

    Liu, Yu; Srihari, Sargur N.

    Sharing similar genetic traits makes the investigation of twins an important study in forensics and biometrics. Fingerprints are one of the most commonly found types of forensic evidence. The similarity between twins' prints is critical to establishing the reliability of fingerprint identification. We present a quantitative analysis of the discriminability of twin fingerprints on a new data set (227 pairs of identical twins and fraternal twins) recently collected from a twin population using both level 1 and level 2 features. Although the patterns of minutiae among twins are more similar than in the general population, the similarity of fingerprints of twins is significantly different from that between genuine prints of the same finger. Twin fingerprints are discriminable, with a 1.5%-1.7% higher equal error rate (EER) than non-twins, and identical twins can be distinguished by fingerprint examination with a slightly higher error rate than fraternal twins.

  16. [Possibilities and limits of computer-assisted cardiotocogram analysis].

    PubMed

    Lösche, P

    1997-01-01

    The interpretation of cardiotocograms still relies primarily on visual analysis. This form of monitoring remains labour intensive and, being dependent on the training and experience of the specialist responsible, also subject to erroneous interpretation. Computer-aided cardiotocogram analysis has, in spite of encouraging successes, still not found wide application in everyday clinical routine. To achieve this, the programming system must be easy to operate, userfriendly and reliable. A program system for fully automatic cardiotocogram analysis is envisioned which runs on standard commercially-available personal computers. A clear graphic representation of the traces also permits visual assessment on the computer screen. The system described integrates the main assessment criteria of cardiotocogram analysis which can then be extended owing to the open system architecture used in the programming. Completely new analysis algorithms have given the evaluating system the capability of fully-automatic pattern recognition of fetal heart rate signals and uterine motility. An essential requirement of computer-aided cardiotocogram analysis is thereby fulfilled. Work is now focusing on the exact classification of the various types of deceleration and an extension of the capabilities of tocogram analysis. There should be nothing to hinder integration of the system into everyday clinical routine and connect it to obstetrical databases. PMID:9381837

  17. Computer programs for analysis of geophysical data

    SciTech Connect

    Rozhkov, M.; Nakanishi, K.

    1994-06-01

    This project is oriented toward the application of the mobile seismic array data analysis technique in seismic investigations of the Earth (the noise-array method). The technique falls into the class of emission tomography methods but, in contrast to classic tomography, 3-D images of the microseismic activity of the media are obtained by passive seismic antenna scanning of the half-space, rather than by solution of the inverse Radon's problem. It is reasonable to expect that areas of geothermal activity, active faults, areas of volcanic tremors and hydrocarbon deposits act as sources of intense internal microseismic activity or as effective sources for scattered (secondary) waves. The conventional approaches of seismic investigations of a geological medium include measurements of time-limited determinate signals from artificial or natural sources. However, the continuous seismic oscillations, like endogenous microseisms, coda and scattering waves, can give very important information about the structure of the Earth. The presence of microseismic sources or inhomogeneities within the Earth results in the appearance of coherent seismic components in a stochastic wave field recorded on the surface by a seismic array. By careful processing of seismic array data, these coherent components can be used to develop a 3-D model of the microseismic activity of the media or images of the noisy objects. Thus, in contrast to classic seismology where narrow windows are used to get the best time resolution of seismic signals, our model requires long record length for the best spatial resolution.
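
    A heavily simplified caricature of the passive-scanning idea: a delay-and-sum back-projection in which candidate source points whose travel-time delays align the array traces yield high coherent power. The geometry, wave speed, sampling rate, and integer-sample alignment below are all assumptions for illustration.

    ```python
    import numpy as np

    def scan_power(traces, sensors, grid, c, fs):
        """Delay-and-sum scan: for each candidate source point, undo the
        travel-time delays and measure the power of the coherent stack."""
        power = np.zeros(len(grid))
        for i, src in enumerate(grid):
            delays = np.linalg.norm(sensors - src, axis=1) / c    # seconds
            shifts = np.round(delays * fs).astype(int)
            stack = sum(np.roll(tr, -s) for tr, s in zip(traces, shifts))
            power[i] = np.mean((stack / len(traces)) ** 2)
        return power

    rng = np.random.default_rng(3)
    fs, c = 1000.0, 2000.0                     # Hz sampling, m/s wave speed
    sensors = rng.uniform(-500.0, 500.0, (8, 2))
    src_true = np.array([100.0, -50.0])
    noise = rng.normal(size=2000)              # continuous noise-like source
    traces = np.array([np.roll(noise, int(round(np.linalg.norm(s - src_true) / c * fs)))
                       for s in sensors])

    grid = np.array([src_true, [-300.0, 300.0]])
    print(scan_power(traces, sensors, grid, c, fs))  # the true point stacks coherently
    ```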

  18. Structural Analysis Using Computer Based Methods

    NASA Technical Reports Server (NTRS)

    Dietz, Matthew R.

    2013-01-01

    The stiffness of a flex hose that will be used in the umbilical arms of the Space Launch System's mobile launcher needed to be determined in order to properly qualify ground umbilical plate behavior during vehicle separation post T-0. This data is also necessary to properly size and design the motors used to retract the umbilical arms. Therefore an experiment was created to determine the stiffness of the hose. Before the test apparatus for the experiment could be built, the structure had to be analyzed to ensure it would not fail under the given loading conditions. The design model was imported into the analysis software and optimized to decrease runtime while still providing accurate results and allowing for seamless meshing. Areas exceeding the allowable stresses in the structure were located and modified before submitting the design for fabrication. In addition, a mock-up of a deep space habitat and its support frame was designed and needed to be analyzed for structural integrity under different loading conditions. The load cases were provided by the customer and were applied to the structure after optimizing the geometry. Once again, weak points in the structure were located, recommended design changes were made to the customer, and the process was repeated until the load conditions were met without exceeding the allowable stresses. After the stresses met the required factors of safety, the designs were released for fabrication.

  19. 3-Dimensional Geologic Modeling Applied to the Structural Characterization of Geothermal Systems: Astor Pass, Nevada, USA

    SciTech Connect

    Siler, Drew L; Faulds, James E; Mayhew, Brett

    2013-04-16

    Geothermal systems in the Great Basin, USA, are controlled by a variety of fault intersection and fault interaction areas. Understanding the specific geometry of the structures most conducive to broad-scale geothermal circulation is crucial to both the mitigation of the costs of geothermal exploration (especially drilling) and to the identification of geothermal systems that have no surface expression (blind systems). 3-dimensional geologic modeling is a tool that can elucidate the specific stratigraphic intervals and structural geometries that host geothermal reservoirs. Astor Pass, NV USA lies just beyond the northern extent of the dextral Pyramid Lake fault zone near the boundary between two distinct structural domains, the Walker Lane and the Basin and Range, and exhibits characteristics of each setting. Both northwest-striking, left-stepping dextral faults of the Walker Lane and kinematically linked northerly striking normal faults associated with the Basin and Range are present. Previous studies at Astor Pass identified a blind geothermal system controlled by the intersection of west-northwest and north-northwest striking dextral-normal faults. Wells drilled into the southwestern quadrant of the fault intersection yielded 94°C fluids, with geothermometers suggesting a maximum reservoir temperature of 130°C. A 3-dimensional model was constructed based on detailed geologic maps and cross-sections, 2-dimensional seismic data, and petrologic analysis of the cuttings from three wells in order to further constrain the structural setting. The model reveals the specific geometry of the fault interaction area at a level of detail beyond what geologic maps and cross-sections can provide.

  20. CFD Based Computations of Flexible Helicopter Blades for Stability Analysis

    NASA Technical Reports Server (NTRS)

    Guruswamy, Guru P.

    2011-01-01

    As a collaborative effort among government aerospace research laboratories, an advanced version of a widely used computational fluid dynamics code, OVERFLOW, was recently released. This latest version includes additions to model flexible rotating multiple blades. In this paper, the OVERFLOW code is applied to improve the accuracy of airload computations from the linear lifting line theory that uses displacements from a beam model. Data transfers required at every revolution are managed through a Unix-based script that runs jobs on large super-cluster computers. Results are demonstrated for the 4-bladed UH-60A helicopter. Deviations of computed data from flight data are evaluated. Fourier analysis post-processing suitable for aeroelastic stability computations is performed.

  1. Computer-aided visualization and analysis system for sequence evaluation

    DOEpatents

    Chee, Mark S.

    2001-06-05

    A computer system (1) for analyzing nucleic acid sequences is provided. The computer system is used to perform multiple methods for determining unknown bases by analyzing the fluorescence intensities of hybridized nucleic acid probes. The results of individual experiments may be improved by processing nucleic acid sequences together. Comparative analysis of multiple experiments is also provided by displaying reference sequences in one area (814) and sample sequences in another area (816) on a display device (3).

  2. Computer-aided visualization and analysis system for sequence evaluation

    DOEpatents

    Chee, Mark S.

    1999-10-26

    A computer system (1) for analyzing nucleic acid sequences is provided. The computer system is used to perform multiple methods for determining unknown bases by analyzing the fluorescence intensities of hybridized nucleic acid probes. The results of individual experiments may be improved by processing nucleic acid sequences together. Comparative analysis of multiple experiments is also provided by displaying reference sequences in one area (814) and sample sequences in another area (816) on a display device (3).

  3. Computer-aided visualization and analysis system for sequence evaluation

    DOEpatents

    Chee, M.S.

    1998-08-18

    A computer system for analyzing nucleic acid sequences is provided. The computer system is used to perform multiple methods for determining unknown bases by analyzing the fluorescence intensities of hybridized nucleic acid probes. The results of individual experiments are improved by processing nucleic acid sequences together. Comparative analysis of multiple experiments is also provided by displaying reference sequences in one area and sample sequences in another area on a display device. 27 figs.

  4. Computer-aided visualization and analysis system for sequence evaluation

    DOEpatents

    Chee, Mark S.

    2003-08-19

    A computer system for analyzing nucleic acid sequences is provided. The computer system is used to perform multiple methods for determining unknown bases by analyzing the fluorescence intensities of hybridized nucleic acid probes. The results of individual experiments may be improved by processing nucleic acid sequences together. Comparative analysis of multiple experiments is also provided by displaying reference sequences in one area and sample sequences in another area on a display device.

  5. Computer-aided visualization and analysis system for sequence evaluation

    DOEpatents

    Chee, Mark S.

    1998-08-18

    A computer system for analyzing nucleic acid sequences is provided. The computer system is used to perform multiple methods for determining unknown bases by analyzing the fluorescence intensities of hybridized nucleic acid probes. The results of individual experiments are improved by processing nucleic acid sequences together. Comparative analysis of multiple experiments is also provided by displaying reference sequences in one area and sample sequences in another area on a display device.

  6. Computer-aided visualization and analysis system for sequence evaluation

    DOEpatents

    Chee, Mark S.; Wang, Chunwei; Jevons, Luis C.; Bernhart, Derek H.; Lipshutz, Robert J.

    2004-05-11

    A computer system for analyzing nucleic acid sequences is provided. The computer system is used to perform multiple methods for determining unknown bases by analyzing the fluorescence intensities of hybridized nucleic acid probes. The results of individual experiments are improved by processing nucleic acid sequences together. Comparative analysis of multiple experiments is also provided by displaying reference sequences in one area and sample sequences in another area on a display device.

  7. Computer analysis of transient voltages in large grounding systems

    SciTech Connect

    Grcev, L.D.

    1996-04-01

    A computer model for transient analysis of a network of buried and above-ground conductors is presented. The model is based on the electromagnetic field theory approach and the modified image theory. Validation of the model is achieved by comparison with field measurements. The model is applied for computation of transient voltages to remote ground of large grounding grid conductors. Computation of longitudinal and leakage currents, transient impedance, electromagnetic fields, and transient induced voltages is also possible. This model is aimed at helping in EMC and lightning protection studies that involve electrical and electronic systems connected to grounding systems.

  8. Studies of Cosmic Ray Modulation and Energetic Particle Propagation in Time-Dependent 3-Dimensional Heliospheric Magnetic Fields

    NASA Technical Reports Server (NTRS)

    Zhang, Ming

    2005-01-01

    The primary goal of this project was to perform theoretical calculations of the propagation of cosmic rays and energetic particles in 3-dimensional heliospheric magnetic fields. We used Markov stochastic process simulation to achieve this goal. We developed computation software that can be used to study particle propagation in two examples of heliospheric magnetic fields that must be treated in 3 dimensions: a heliospheric magnetic field suggested by Fisk (1996) and a global heliosphere including the region beyond the termination shock. The results from our model calculations were compared with particle measurements from Ulysses, Earth-based spacecraft such as IMP-8, WIND and ACE, and the Voyagers and Pioneers in the outer heliosphere as tests of the magnetic field models. We particularly looked for features of particle variations that can allow us to significantly distinguish the Fisk magnetic field from the conventional Parker spiral field. The computer code will eventually lead to a new generation of integrated software for solving complicated problems of particle acceleration, propagation and modulation in a realistic 3-dimensional heliosphere, with realistic magnetic fields and solar wind, within a single computational approach.
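
    A schematic 1-dimensional Euler-Maruyama version of the Markov stochastic approach, with placeholder convection and diffusion coefficients rather than actual Parker transport-equation terms: pseudo-particles are advanced until they leave the modulation region, and their exit times are collected.

    ```python
    import numpy as np

    rng = np.random.default_rng(4)

    # Schematic 1-D stochastic transport: V (convection) and kappa (diffusion)
    # are placeholders, not a faithful Parker-equation model.
    V, kappa, dt = 5.0, 5.0, 0.01
    r0, r_out, n = 10.0, 100.0, 5000

    r = np.full(n, r0)                 # pseudo-particle radial positions
    exit_time = np.full(n, np.nan)
    t = 0.0
    for _ in range(10_000):
        active = np.isnan(exit_time)
        if not active.any():
            break
        dw = rng.normal(0.0, np.sqrt(dt), active.sum())
        r[active] += V * dt + np.sqrt(2.0 * kappa) * dw   # Euler-Maruyama step
        np.clip(r, 0.0, None, out=r)                      # keep radius physical
        t += dt
        exit_time[active & (r >= r_out)] = t

    print(np.nanmean(exit_time))  # mean residence time of the pseudo-particles
    ```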

  9. On computational schemes for global-local stress analysis

    NASA Technical Reports Server (NTRS)

    Reddy, J. N.

    1989-01-01

    An overview is given of global-local stress analysis methods and associated difficulties and recommendations for future research. The phrase global-local analysis is understood to be an analysis in which some parts of the domain or structure are identified, for reasons of accurate determination of stresses and displacements or for more refined analysis than in the remaining parts. The parts of refined analysis are termed local and the remaining parts are called global. Typically local regions are small in size compared to global regions, while the computational effort can be larger in local regions than in global regions.

  10. Interactive analysis of thermal imagery. [computer graphics terminal for photointerpretation

    NASA Technical Reports Server (NTRS)

    Madding, R. P.; Fisher, L. T.

    1976-01-01

    Necessary knowledge is presented on data acquisition and preparation for analysis of thermal imagery of power plant heated discharges remotely sensed from an aircraft, with special emphasis on analog to digital conversion of analog tapes acquired during scanning and to geometrical scaling. The central element in the interactive analysis of thermal imagery is an interactive graphics computer terminal which allows an interpreter to effectively interact with a large-scale computer, providing decisions or data as computations are carried out. A temperature calibration is performed, which the interpreter may test anywhere on the image. When satisfied that calibration is correct, the portion of the image to be analyzed is outlined. Printed and microfiche analyses of the plume are produced. The flow chart of programs for analysis of thermal imagery is presented and discussed in some detail.

  11. Large-scale temporal analysis of computer and information science

    NASA Astrophysics Data System (ADS)

    Soos, Sandor; Kampis, George; Gulyás, László

    2013-09-01

    The main aim of the project reported in this paper was twofold. One of the primary goals was to produce an extensive source of network data for bibliometric analyses of field dynamics in the case of Computer and Information Science. To this end, we rendered the raw material of the DBLP computer and infoscience bibliography into a comprehensive collection of dynamic network data, promptly available for further statistical analysis. The other goal was to demonstrate the value of our data source via its use in mapping Computer and Information Science (CIS). An analysis of the evolution of CIS was performed in terms of collaboration (co-authorship) network dynamics. Dynamic network analysis covered three quarters of the 20th century (76 years, from 1936 to date). Network evolution was described both at the macro and the meso level (in terms of community characteristics). Results show that the development of CIS followed what appears to be a universal pattern of growing into a "mature" discipline.

  12. Automated uncertainty analysis methods in the FRAP computer codes. [PWR

    SciTech Connect

    Peck, S O

    1980-01-01

    A user-oriented, automated uncertainty analysis capability has been incorporated in the Fuel Rod Analysis Program (FRAP) computer codes. The FRAP codes have been developed for the analysis of Light Water Reactor fuel rod behavior during steady state (FRAPCON) and transient (FRAP-T) conditions as part of the United States Nuclear Regulatory Commission's Water Reactor Safety Research Program. The objective of uncertainty analysis of these codes is to obtain estimates of the uncertainty in computed outputs of the codes as a function of known uncertainties in input variables. This paper presents the methods used to generate an uncertainty analysis of a large computer code, discusses the assumptions that are made, and shows techniques for testing them. An uncertainty analysis of FRAP-T calculated fuel rod behavior during a hypothetical loss-of-coolant transient is presented as an example and carried through the discussion to illustrate the various concepts.

  13. Large Scale EM Wave Propagation Analysis using FDTD Parallel Computation on Computer System for Education

    NASA Astrophysics Data System (ADS)

    Sonoda, Jun

    This paper describes the study of a fast electromagnetic (EM) wave propagation analysis that can solve electrically large domains using the finite difference time domain (FDTD) method on a cluster of personal computers (PC cluster). It reports an implementation of parallel FDTD using an MPI library on PC clusters of the computer system for education. Use of this method demonstrates that the speed-up ratio achieved for problem size 1200 × 1200 is about 55.0 using FDTD on 80 PCs. Also, indoor propagation of a UWB pulse on the floor (1095.4λ × 98.6λ) is analyzed by the parallel FDTD using 40 PCs; computational time and memory were reduced to 1/36.4 and 1/39.9 of the single-PC values, respectively. The results demonstrate that the parallel FDTD on a PC cluster can analyze electrically large problems at much lower computational cost than conventional single-processor FDTD.
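
    For reference, a minimal serial 1-dimensional FDTD kernel in normalized units (Courant number 0.5) is sketched below; in the parallel version described above, the spatial axis would be split across PCs and each time step would exchange boundary ("halo") field values between neighboring processes via MPI before these two updates. The grid size and source are arbitrary.

    ```python
    import numpy as np

    nx, nt = 2000, 1000
    ez = np.zeros(nx)   # electric field
    hy = np.zeros(nx)   # magnetic field

    for t in range(nt):
        hy[:-1] += 0.5 * (ez[1:] - ez[:-1])             # update H from curl of E
        ez[1:] += 0.5 * (hy[1:] - hy[:-1])              # update E from curl of H
        ez[nx // 4] += np.exp(-((t - 30) / 10.0) ** 2)  # soft Gaussian source

    print(float(ez.max()))
    ```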

  14. Computational strategy for the crash design analysis using an uncertain computational mechanical model

    NASA Astrophysics Data System (ADS)

    Desceliers, C.; Soize, C.; Zarroug, M.

    2013-08-01

    The framework of this paper is the robust crash analysis of a motor vehicle. The crash analysis is carried out with an uncertain computational model for which uncertainties are taken into account with the parametric probabilistic approach and for which the stochastic solver is the Monte Carlo method. During the design process, different configurations of the motor vehicle are analyzed. Usual interpolation methods cannot be used to predict whether the current configuration is similar to one of the previous configurations already analyzed, for which a complete stochastic computation has been carried out. In this paper, we propose a new indicator that allows one to decide whether the current configuration is similar to one of the previously analyzed configurations while the Monte Carlo simulation is still running, and therefore to stop the Monte Carlo simulation before the end of the computation.
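
    The paper's indicator is not reproduced here; as a generic stand-in, the sketch below stops the Monte Carlo run once the running-mean confidence interval of the current configuration's partial samples falls within a tolerance of a stored configuration's response mean. All names and numbers are invented.

    ```python
    import numpy as np

    def early_similarity(samples, stored_mean, tol=0.05, z=1.96):
        """Declare the current design 'similar' to a previously analyzed one if
        the running-mean confidence interval lies within tol of the stored mean."""
        m = np.mean(samples)
        half = z * np.std(samples, ddof=1) / np.sqrt(len(samples))
        return abs(m - stored_mean) + half < tol * abs(stored_mean)

    rng = np.random.default_rng(5)
    stored_mean = 10.0              # response mean of an earlier configuration
    samples = []
    for _ in range(100):            # draw Monte Carlo samples in batches of 50
        samples.extend(rng.normal(10.02, 0.5, 50))
        if early_similarity(np.array(samples), stored_mean):
            print(f"stopped after {len(samples)} samples")
            break
    ```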

  15. First Experiences with LHC Grid Computing and Distributed Analysis

    SciTech Connect

    Fisk, Ian

    2010-12-01

    In this presentation the experiences of the LHC experiments using grid computing were presented with a focus on experience with distributed analysis. After many years of development, preparation, exercises, and validation the LHC (Large Hadron Collider) experiments are in operations. The computing infrastructure has been heavily utilized in the first 6 months of data collection. The general experience of exploiting the grid infrastructure for organized processing and preparation is described, as well as the successes employing the infrastructure for distributed analysis. At the end the expected evolution and future plans are outlined.

  16. MSFC crack growth analysis computer program, version 2 (users manual)

    NASA Technical Reports Server (NTRS)

    Creager, M.

    1976-01-01

    An updated version of the George C. Marshall Space Flight Center Crack Growth Analysis Program is described. The updated computer program has significantly expanded capabilities over the original one. This increased capability includes an extensive expansion of the library of stress intensity factors, plotting capability, increased design iteration capability, and the capability of performing proof test logic analysis. The technical approaches used within the computer program are presented, and the input and output formats and options are described. Details of the stress intensity equations, example data, and example problems are presented.

  17. Development of a computational aero/fluids analysis system

    NASA Technical Reports Server (NTRS)

    Kelley, P. B.

    1987-01-01

    The Computational Aero/Fluids Analysis System (AFAS) provides the analytical capability to perform state-of-the-art computational analyses in two difficult fluid dynamics disciplines associated with the Space Shuttle program. This system provides the analysis tools and techniques for rapidly and efficiently accessing, analyzing, and reformulating the large and expanding external aerodynamic data base while also providing tools for complex fluid flow analyses of the SSME engine components. Both of these fluid flow disciplines, external aerodynamics and internal gasdynamics, required this capability to ensure that MSFC can respond in a timely manner as problems are encountered and operational changes are made in the Space Shuttle.

  18. Computational Fluid Dynamics Analysis of Thoracic Aortic Dissection

    NASA Astrophysics Data System (ADS)

    Tang, Yik; Fan, Yi; Cheng, Stephen; Chow, Kwok

    2011-11-01

    Thoracic Aortic Dissection (TAD) is a cardiovascular disease with high mortality. An aortic dissection is formed when blood infiltrates the layers of the vascular wall, and a new artificial channel, the false lumen, is created. The expansion of the blood vessel due to the weakened wall enhances the risk of rupture. Computational fluid dynamics analysis is performed to study the hemodynamics of this pathological condition. Both idealized geometry and realistic patient configurations from computed tomography (CT) images are investigated. Physiological boundary conditions from in vivo measurements are employed. Flow configuration and biomechanical forces are studied. Quantitative analysis allows clinicians to assess the risk of rupture in making decision regarding surgical intervention.

  19. Advances in computational design and analysis of airbreathing propulsion systems

    NASA Technical Reports Server (NTRS)

    Klineberg, John M.

    1989-01-01

    The development of commercial and military aircraft depends, to a large extent, on engine manufacturers being able to achieve significant increases in propulsion capability through improved component aerodynamics, materials, and structures. The recent history of propulsion has been marked by efforts to develop computational techniques that can speed up the propulsion design process and produce superior designs. The availability of powerful supercomputers, such as the NASA Numerical Aerodynamic Simulator, and the potential for even higher performance offered by parallel computer architectures, have opened the door to the use of multi-dimensional simulations to study complex physical phenomena in propulsion systems that have previously defied analysis or experimental observation. An overview of several NASA Lewis research efforts is provided that are contributing toward the long-range goal of a numerical test-cell for the integrated, multidisciplinary design, analysis, and optimization of propulsion systems. Specific examples in Internal Computational Fluid Mechanics, Computational Structural Mechanics, Computational Materials Science, and High Performance Computing are cited and described in terms of current capabilities, technical challenges, and future research directions.

  20. Statistical mechanical modeling: Computer simulations, analysis and applications

    NASA Astrophysics Data System (ADS)

    Subramanian, Balakrishna

    This thesis describes the applications of statistical mechanical models and tools, especially computational techniques to the study of several problems in science. We study in chapter 2, various properties of a non-equilibrium cellular automaton model, the Toom model. We obtain numerically the exponents describing the fluctuations of the interface between the two stable phases of the model. In chapter 3, we introduce a binary alloy model with three-body potentials. Unlike the usual Ising-type models with two-body interactions, this model is not symmetric in its components. We calculate the exact low temperature phase diagram using Pirogov-Sinai theory and also find the mean-field equilibrium properties of this model. We then study the kinetics of phase segregation following a quenching in this model. We find that the results are very similar to those obtained for Ising-type models with pair interactions, indicating universality. In chapter 4, we discuss the statistical properties of "Contact Maps". These maps, are used to represent three-dimensional structures of proteins in modeling problems. We find that this representation space has particular properties that make it a convenient choice. The maps representing native folds of proteins correspond to compact structures which in turn correspond to maps with low degeneracy, making it easier to translate the map into the detailed 3-dimensional structure. The early stage of formation of a river network is described in Chapter 5 using quasi-random spanning trees on a square lattice. We observe that the statistical properties generated by these models are quite similar (better than some of the earlier models) to the empirical laws and results presented by geologists for real river networks. Finally, in chapter 6 we present a brief note on our study of the problem of progression of heterogeneous breast tumors. We investigate some of the possible pathways of progression based on the traditional notions of DCIS (Ductal

  1. 3-dimensional imaging system using crystal diffraction lenses

    DOEpatents

    Smither, Robert K.

    1999-01-01

    A device for imaging a plurality of sources of x-ray and gamma-ray radiation is provided. Diffracting crystals are used for focussing the radiation and directing it to a detector, which analyzes the radiation to collect data as to the location of the source of radiation. A computer is used for converting the data to an image. The invention also provides for a method for imaging x-ray and gamma radiation by supplying a plurality of sources of radiation; focussing the radiation onto a detector; analyzing the focused radiation to collect data as to the type and location of the radiation; and producing an image using the data.

  2. 3-dimensional imaging system using crystal diffraction lenses

    DOEpatents

    Smither, R.K.

    1999-02-09

    A device for imaging a plurality of sources of x-ray and gamma-ray radiation is provided. Diffracting crystals are used for focusing the radiation and directing it to a detector, which analyzes the radiation to collect data as to the location of the source of radiation. A computer is used for converting the data to an image. The invention also provides for a method for imaging x-ray and gamma radiation by supplying a plurality of sources of radiation; focusing the radiation onto a detector; analyzing the focused radiation to collect data as to the type and location of the radiation; and producing an image using the data. 18 figs.

  3. Application of three-dimensional computer modeling for reservoir and ore-body analysis

    SciTech Connect

    Hamilton, D.E.; Marie, J.L.; Moon, G.M.; Moretti, F.J.; Ryman, W.P.; Didur, R.S.

    1985-02-01

    Three-dimensional computer modeling of reservoirs and ore bodies aids in understanding and exploiting these resources. This modeling tool enables the geologist and engineer to correlate in 3 dimensions, experiment with various geologic interpretations, combine variables to enhance certain geologic attributes, test for reservoir heterogeneities and continuity, select drill sites or perforation zones, determine volumes, plan production, generate geologic parameters for input to flow simulators, calculate tonnages and ore-waste ratios, and test sensitivity of reserves to various ore-grade cutoffs and economic parameters. All applications benefit from the ability to update rapidly the 3-dimensional computer models when new data are collected. Two 3-dimensional computer modeling projects demonstrate these capabilities. The first project involves modeling porosity, permeability, and water saturation in a Malaysian reservoir. The models were used to analyze the relationship between water saturation and porosity and to generate geologic parameters for input to a flow simulator. The second project involves modeling copper, zinc, silver, gold, and specific gravity in a massive sulfide ore body in British Columbia. The 4 metal models were combined into one copper-equivalence model and evaluated for tonnage, stripping ratio, and sensitivity to variations of ore-grade cutoff.

  4. Superfast robust digital image correlation analysis with parallel computing

    NASA Astrophysics Data System (ADS)

    Pan, Bing; Tian, Long

    2015-03-01

    Existing digital image correlation (DIC) using the robust reliability-guided displacement tracking (RGDT) strategy for full-field displacement measurement is a path-dependent process that can only be executed sequentially. This path-dependent tracking strategy not only limits the potential of DIC for further improvement of its computational efficiency but also wastes the parallel computing power of modern computers with multicore processors. To maintain the robustness of the existing RGDT strategy and to overcome its deficiency, an improved RGDT strategy using a two-section tracking scheme is proposed. In the improved RGDT strategy, the calculated points with correlation coefficients higher than a preset threshold are all taken as reliably computed points and given the same priority to extend the correlation analysis to their neighbors. Thus, DIC calculation is first executed in parallel at multiple points by separate independent threads. Then for the few calculated points with correlation coefficients smaller than the threshold, DIC analysis using existing RGDT strategy is adopted. Benefiting from the improved RGDT strategy and the multithread computing, superfast DIC analysis can be accomplished without sacrificing its robustness and accuracy. Experimental results show that the presented parallel DIC method performed on a common eight-core laptop can achieve about a 7 times speedup.
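
    The first section of the improved strategy, scoring many candidate points concurrently and keeping those above a correlation-coefficient threshold as seeds, can be sketched as below with synthetic subsets; the sequential reliability-guided fallback for the remaining points is omitted.

    ```python
    import numpy as np
    from concurrent.futures import ThreadPoolExecutor

    def zncc(a, b):
        """Zero-normalized cross-correlation coefficient of two equal-size subsets."""
        a = (a - a.mean()) / (a.std() + 1e-12)
        b = (b - b.mean()) / (b.std() + 1e-12)
        return float((a * b).mean())

    # Synthetic reference/deformed subset pairs; every fifth pair is poorly
    # matched, so it falls below the threshold and would go to the sequential pass.
    rng = np.random.default_rng(6)
    pairs = []
    for i in range(100):
        s = rng.random((21, 21))
        noise = 1.0 if i % 5 == 0 else 0.05
        pairs.append((s, s + rng.normal(0.0, noise, s.shape)))

    # Section 1: score all candidate points concurrently.
    with ThreadPoolExecutor() as pool:
        coeffs = list(pool.map(lambda p: zncc(*p), pairs))

    threshold = 0.9
    seeds = [i for i, c in enumerate(coeffs) if c >= threshold]
    print(len(seeds), "reliable seeds;", len(pairs) - len(seeds), "left for the sequential pass")
    ```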

  5. Integration of rocket turbine design and analysis through computer graphics

    NASA Technical Reports Server (NTRS)

    Hsu, Wayne; Boynton, Jim

    1988-01-01

    An interactive approach with engineering computer graphics is used to integrate the design and analysis processes of a rocket engine turbine into a progressive and iterative design procedure. The processes are interconnected through pre- and postprocessors. The graphics are used to generate the blade profiles, their stacking, finite element generation, and analysis presentation through color graphics. Steps of the design process discussed include pitch-line design, axisymmetric hub-to-tip meridional design, and quasi-three-dimensional analysis. The viscous two- and three-dimensional analysis codes are executed after acceptable designs are achieved and estimates of initial losses are confirmed.

  6. Large-scale computations in analysis of structures

    SciTech Connect

    McCallen, D.B.; Goudreau, G.L.

    1993-09-01

    Computer hardware and numerical analysis algorithms have progressed to a point where many engineering organizations and universities can perform nonlinear analyses on a routine basis. Though much remains to be done in terms of advancing nonlinear analysis techniques and characterizing nonlinear material constitutive behavior, the technology exists today to perform useful nonlinear analysis for many structural systems. In the current paper, a survey of nonlinear analysis technologies developed and employed for many years on programmatic defense work at the Lawrence Livermore National Laboratory is provided, and ongoing nonlinear numerical simulation projects relevant to the civil engineering field are described.

  7. AKSATINT - SATELLITE INTERFERENCE ANALYSIS AND SIMULATION USING PERSONAL COMPUTERS

    NASA Technical Reports Server (NTRS)

    Kantak, A.

    1994-01-01

    In the late seventies, the number of communication satellites in service increased, and interference became an increasingly important consideration in designing satellite/ground station communications systems. Satellite Interference Analysis and Simulation Using Personal Computers, AKSATINT, models the interference experienced by a generic satellite communications receiving station due to an interfering satellite. Both the desired and the interfering satellites are considered to be in elliptical orbits. The simulation contains computation of orbital positions of both satellites using classical orbital elements, calculation of the satellite antennae look angles for both satellites and elevation angles at the desired-satellite ground-station antenna, and computation of Doppler effect due to the motions of the satellites and the Earth's rotation. AKSATINT also computes the interference-to-signal-power ratio, taking into account losses suffered by the links. After computing the interference-to-signal-power ratio, the program computes the statistical quantities. The statistical formulation of the interference effect is presented in the form of a histogram of the interference-to-desired-signal power ratio. The program includes a flowchart, a sample run, and results of that run. AKSATINT is expected to be of general use to system designers and frequency managers in selecting the proper frequency under an interference scenario. The AKSATINT program is written in BASIC. It was designed to operate on the IBM Personal Computer AT or compatibles, and has been implemented under MS DOS 3.2. AKSATINT was developed in 1987.
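
    The final statistical step can be illustrated with a toy ensemble: given ranges to the desired and interfering satellites and an assumed off-axis antenna discrimination, the interference-to-signal power ratio is computed in decibels and summarized as a histogram. The distributions below are placeholders, not AKSATINT's orbital model.

    ```python
    import numpy as np

    rng = np.random.default_rng(7)

    # Hypothetical snapshot ensemble: slant ranges (km) and off-axis antenna
    # discrimination (dB) toward the interferer vary over the simulated orbits.
    r_desired = rng.uniform(36000.0, 41000.0, 10000)
    r_interferer = rng.uniform(36000.0, 42000.0, 10000)
    discrimination = rng.uniform(20.0, 35.0, 10000)

    # With equal EIRP and frequency assumed, free-space loss differs only via
    # range, so I/S (dB) = 20*log10(r_desired / r_interferer) - discrimination.
    i_over_s = 20.0 * np.log10(r_desired / r_interferer) - discrimination

    counts, edges = np.histogram(i_over_s, bins=10)
    print(counts)
    print(f"P(I/S > -25 dB) = {np.mean(i_over_s > -25.0):.3f}")
    ```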

  8. EEG Control of a Virtual Helicopter in 3-Dimensional Space Using Intelligent Control Strategies

    PubMed Central

    Royer, Audrey S.; Doud, Alexander J.; Rose, Minn L.

    2011-01-01

    Films like Firefox, Surrogates, and Avatar have explored the possibilities of using brain-computer interfaces (BCIs) to control machines and replacement bodies with only thought. Real world BCIs have made great progress toward that end. Invasive BCIs have enabled monkeys to fully explore 3-dimensional (3D) space using neuroprosthetics. However, non-invasive BCIs have not been able to demonstrate such mastery of 3D space. Here, we report our work, which demonstrates that human subjects can use a non-invasive BCI to fly a virtual helicopter to any point in a 3D world. Through use of intelligent control strategies, we have facilitated the realization of controlled flight in 3D space. We accomplished this through a reductionist approach that assigns subject-specific control signals to the crucial components of 3D flight. Subject control of the helicopter was comparable when using either the BCI or a keyboard. By using intelligent control strategies, the strengths of both the user and the BCI system were leveraged and accentuated. Intelligent control strategies in BCI systems such as those presented here may prove to be the foundation for complex BCIs capable of doing more than we ever imagined. PMID:20876032

  9. Assessment and Planning for a Pediatric Bilateral Hand Transplant Using 3-Dimensional Modeling: Case Report.

    PubMed

    Gálvez, Jorge A; Gralewski, Kevin; McAndrew, Christine; Rehman, Mohamed A; Chang, Benjamin; Levin, L Scott

    2016-03-01

    Children are not typically considered for hand transplantation for various reasons, including the difficulty of finding an appropriate donor. Matching donor and recipient hands and forearms based on size is critically important. If the donor's hands are too large, the recipient may not be able to move the fingers effectively. Conversely, if the donor's hands are too small, the appearance may not be appropriate. We present an 8-year-old child evaluated for a bilateral hand transplant following bilateral amputation. The recipient's forearms and hands were modeled from computed tomography imaging studies and replicated as anatomic models with a 3-dimensional printer. We printed the model hand at 3 scales: 80%, 100%, and 120%. The transplant team used the anatomic models when evaluating a donor for an appropriate size match. The donor's hand size matched the 100%-scale anatomic model hand, and the transplant team was activated. In addition to assisting the transplant team in donor selection, the 100%-scale anatomic model hand was used to create molds for prosthetic hands for the donor. PMID:26810827

  10. A Novel Method of Orbital Floor Reconstruction Using Virtual Planning, 3-Dimensional Printing, and Autologous Bone.

    PubMed

    Vehmeijer, Maarten; van Eijnatten, Maureen; Liberton, Niels; Wolff, Jan

    2016-08-01

    Fractures of the orbital floor are often a result of traffic accidents or interpersonal violence. To date, numerous materials and methods have been used to reconstruct the orbital floor. However, simple and cost-effective 3-dimensional (3D) printing technologies for the treatment of orbital floor fractures are still sought. This study describes a simple, precise, cost-effective method of treating orbital fractures using 3D printing technologies in combination with autologous bone. Enophthalmos and diplopia developed in a 64-year-old female patient with an orbital floor fracture. A virtual 3D model of the fracture site was generated from computed tomography images of the patient. The fracture was virtually closed using spline interpolation. Furthermore, a virtual individualized mold of the defect site was created, which was manufactured using an inkjet printer. The tangible mold was subsequently used during surgery to sculpture an individualized autologous orbital floor implant. Virtual reconstruction of the orbital floor and the resulting mold enhanced the overall accuracy and efficiency of the surgical procedure. The sculptured autologous orbital floor implant showed an excellent fit in vivo. The combination of virtual planning and 3D printing offers an accurate and cost-effective treatment method for orbital floor fractures. PMID:27137437

  11. 3-DIMENSIONAL Numerical Modeling on the Combustion and Emission Characteristics of Biodiesel in Diesel Engines

    NASA Astrophysics Data System (ADS)

    Yang, Wenming; An, Hui; Amin, Maghbouli; Li, Jing

    2014-11-01

    Three-dimensional computational fluid dynamics modeling is conducted on a direct injection diesel engine fueled by biodiesel using the multi-dimensional software KIVA4 coupled with CHEMKIN. To accurately predict the oxidation of the saturated and unsaturated components of the biodiesel fuel, a multicomponent advanced combustion model consisting of 69 species and 204 reactions, combined with detailed oxidation pathways of methyl decanoate (C11H22O2), methyl-9-decenoate (C11H20O2), and n-heptane (C7H16), is employed in this work. In order to better represent the real fuel properties, the detailed chemical and thermo-physical properties of biodiesel such as vapor pressure, latent heat of vaporization, liquid viscosity, and surface tension were calculated and compiled into the KIVA4 fuel library. The nitrogen monoxide (NO) and carbon monoxide (CO) formation mechanisms were also embedded. After validating the numerical simulation model by comparing the in-cylinder pressure and heat release rate curves with experimental results, further studies were carried out to investigate the effect of combustion chamber design on the flow field and, subsequently, on the combustion process and performance of the biodiesel-fueled diesel engine. The impact of fuel injector location on engine performance and emissions formation was also investigated.

  12. Integrating computer programs for engineering analysis and design

    NASA Technical Reports Server (NTRS)

    Wilhite, A. W.; Crisp, V. K.; Johnson, S. C.

    1983-01-01

    A third-generation system for integrating computer programs for engineering analysis and design has been developed: the Aerospace Vehicle Interactive Design (AVID) system. This system consists of an engineering data management system, program interface software, a user interface, and a geometry system. A relational information system (ARIS) was developed specifically for this computer-aided engineering system. It serves as a repository for design data communicated between analysis programs, as a dictionary that describes those design data, as a directory that describes the analysis programs, and for other system functions. A method is described for interfacing independent analysis programs into a loosely coupled design system. This method emphasizes an interactive extension of analysis techniques and manipulation of design data. Integrity mechanisms also exist to maintain database correctness for multidisciplinary design tasks performed by an individual or a team of specialists. Finally, a prototype user interface program has been developed to aid in system utilization.

  13. Recent developments of the NESSUS probabilistic structural analysis computer program

    NASA Technical Reports Server (NTRS)

    Millwater, H.; Wu, Y.-T.; Torng, T.; Thacker, B.; Riha, D.; Leung, C. P.

    1992-01-01

    The NESSUS probabilistic structural analysis computer program combines state-of-the-art probabilistic algorithms with general purpose structural analysis methods to compute the probabilistic response and the reliability of engineering structures. Uncertainty in loading, material properties, geometry, boundary conditions and initial conditions can be simulated. The structural analysis methods include nonlinear finite element and boundary element methods. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. The scope of the code has recently been expanded to include probabilistic life and fatigue prediction of structures in terms of component and system reliability and risk analysis of structures considering cost of failure. The code is currently being extended to structural reliability considering progressive crack propagation. Several examples are presented to demonstrate the new capabilities.
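
    As an illustration of the kind of question NESSUS answers, here is a minimal sketch using plain Monte Carlo sampling; the advanced mean value and adaptive importance sampling algorithms named in the abstract are far more efficient, and the limit state and probability distributions below are invented.

    ```python
    # Minimal Monte Carlo reliability sketch: probability that stress
    # exceeds strength under uncertain load, geometry, and material.
    import numpy as np

    rng = np.random.default_rng(1)
    n = 200_000
    load = rng.normal(10e3, 1.5e3, n)                # applied load, N
    area = rng.normal(4e-4, 2e-5, n)                 # section area, m^2
    strength = rng.lognormal(np.log(30e6), 0.08, n)  # yield strength, Pa

    g = strength - load / area      # limit state: failure when g < 0
    pf = np.mean(g < 0.0)
    print(f"estimated probability of failure: {pf:.2e}")
    ```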

  14. Accuracy considerations in the computational analysis of jet noise

    NASA Technical Reports Server (NTRS)

    Scott, James N.

    1993-01-01

    The application of computational fluid dynamics methods to the analysis of problems in aerodynamic noise has resulted in the extension and adaptation of conventional CFD to the discipline now referred to as computational aeroacoustics (CAA). In the analysis of jet noise, accurate resolution of a wide range of spatial and temporal scales in the flow field is essential if the acoustic far field is to be predicted. The numerical simulation of unsteady jet flow has been successfully demonstrated, and many flow features have been computed with reasonable accuracy. Grid refinement and increased solution time are discussed as means of improving the accuracy of Navier-Stokes solutions of unsteady jet flow. In addition, various properties of different numerical procedures that influence accuracy are examined, with particular emphasis on dispersion and dissipation characteristics. These properties are investigated by using selected schemes to solve model problems for the propagation of a shock wave and a sinusoidal disturbance. The results are compared for the different schemes.
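
    A minimal version of the second model problem mentioned above: advect a sinusoidal disturbance once around a periodic domain and measure the amplitude lost to numerical dissipation. The first-order upwind scheme and grid parameters are illustrative choices, not those examined in the report.

    ```python
    # Dissipation of a sinusoid under first-order upwind advection.
    import numpy as np

    nx, c, cfl = 128, 1.0, 0.5
    x = np.linspace(0.0, 1.0, nx, endpoint=False)
    dx = x[1] - x[0]
    dt = cfl * dx / c

    u = np.sin(2.0 * np.pi * x)
    steps = round(1.0 / dt)            # one full traversal of the domain
    for _ in range(steps):
        u = u - c * dt / dx * (u - np.roll(u, 1))   # upwind update, c > 0

    # The exact solution returns to the initial sinusoid, amplitude 1.
    print(f"amplitude retained: {u.max():.3f} (exact: 1.000)")
    ```

    A higher-order scheme would retain far more of the amplitude at the same cost per step, which is why dispersion and dissipation properties dominate scheme selection in CAA.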

  15. Finite element dynamic analysis on CDC STAR-100 computer

    NASA Technical Reports Server (NTRS)

    Noor, A. K.; Lambiotte, J. J., Jr.

    1978-01-01

    Computational algorithms are presented for the finite element dynamic analysis of structures on the CDC STAR-100 computer. The spatial behavior is described using higher-order finite elements. The temporal behavior is approximated by using either the central difference explicit scheme or Newmark's implicit scheme. In each case the analysis is broken up into a number of basic macro-operations. Discussion is focused on the organization of the computation and the mode of storage of different arrays to take advantage of the STAR pipeline capability. The potential of the proposed algorithms is discussed and CPU times are given for performing the different macro-operations for a shell modeled by higher order composite shallow shell elements having 80 degrees of freedom.
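
    A minimal sketch of the central difference explicit scheme named above, applied to an undamped system M u'' + K u = f(t). The two-degree-of-freedom system is invented for illustration; the report's shell elements carry 80 degrees of freedom each, and the STAR-100 vectorization issues it studies are not shown.

    ```python
    # Central difference explicit time integration of M u'' + K u = f(t).
    import numpy as np

    M = np.diag([1.0, 1.0])
    K = np.array([[ 400.0, -200.0],
                  [-200.0,  200.0]])
    f = lambda t: np.array([0.0, 10.0 * np.sin(5.0 * t)])

    dt = 0.005            # stable: dt is well below 2/omega_max for this K, M
    Minv = np.linalg.inv(M)
    u_prev = np.zeros(2)  # displacement at t - dt
    u = np.zeros(2)       # displacement at t

    for step in range(2000):
        accel = Minv @ (f(step * dt) - K @ u)
        u_next = 2.0 * u - u_prev + dt**2 * accel   # central difference update
        u_prev, u = u, u_next

    print("displacement at t = 10 s:", u)
    ```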

  16. Boundary element analysis on vector and parallel computers

    NASA Technical Reports Server (NTRS)

    Kane, J. H.

    1994-01-01

    Boundary element analysis (BEA) can be characterized as a numerical technique that generally shifts the computational burden in the analysis toward numerical integration and the solution of nonsymmetric and either dense or blocked sparse systems of algebraic equations. Researchers have explored the concept that the fundamental characteristics of BEA can be exploited to generate effective implementations on vector and parallel computers. In this paper, the results of some of these investigations are discussed. The performance of overall algorithms for BEA on vector supercomputers, massively data parallel single instruction multiple data (SIMD), and relatively fine grained distributed memory multiple instruction multiple data (MIMD) computer systems is described. Some general trends and conclusions are discussed, along with indications of future developments that may prove fruitful in this regard.

  17. A statistical package for computing time and frequency domain analysis

    NASA Technical Reports Server (NTRS)

    Brownlow, J.

    1978-01-01

    The spectrum analysis (SPA) program is a general purpose digital computer program designed to aid in data analysis. The program does time and frequency domain statistical analyses as well as some preanalysis data preparation. The capabilities of the SPA program include linear trend removal and/or digital filtering of data, plotting and/or listing of both filtered and unfiltered data, time domain statistical characterization of data, and frequency domain statistical characterization of data.
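
    A minimal sketch of two steps the abstract lists, linear trend removal and frequency-domain characterization, applied to a synthetic record; SPA itself was a much fuller package, and NumPy stands in for its routines here.

    ```python
    # Detrend a record, then characterize it in the frequency domain.
    import numpy as np

    fs = 100.0                                  # sample rate, Hz
    t = np.arange(0.0, 10.0, 1.0 / fs)
    noise = 0.2 * np.random.default_rng(2).normal(size=t.size)
    x = 0.3 * t + np.sin(2.0 * np.pi * 5.0 * t) + noise  # trend + tone + noise

    # Linear trend removal by least-squares fit.
    x_detr = x - np.polyval(np.polyfit(t, x, deg=1), t)
    print(f"mean {x_detr.mean():.4f}, std {x_detr.std():.4f}")

    # Frequency-domain characterization via the periodogram.
    power = np.abs(np.fft.rfft(x_detr))**2 / x_detr.size
    freqs = np.fft.rfftfreq(x_detr.size, d=1.0 / fs)
    print(f"dominant frequency: {freqs[power.argmax()]:.2f} Hz (expected 5 Hz)")
    ```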

  18. Computational Methods for the Analysis of Array Comparative Genomic Hybridization

    PubMed Central

    Chari, Raj; Lockwood, William W.; Lam, Wan L.

    2006-01-01

    Array comparative genomic hybridization (array CGH) is a technique for assaying the copy number status of cancer genomes. The widespread use of this technology has led to a rapid accumulation of high-throughput data, which in turn has prompted the development of computational strategies for the analysis of array CGH data. Here we explain the principles behind array image processing, data visualization, and genomic profile analysis, review currently available software packages, and raise considerations for future software development. PMID:17992253

  19. Formal analysis of device authentication applications in ubiquitous computing.

    SciTech Connect

    Shin, Dongwan; Claycomb, William R.

    2010-11-01

    Authentication between mobile devices in ad-hoc computing environments is a challenging problem. Without pre-shared knowledge, existing applications rely on additional communication methods, such as out-of-band or location-limited channels for device authentication. However, no formal analysis has been conducted to determine whether out-of-band channels are actually necessary. We answer this question through formal analysis, and use BAN logic to show that device authentication using a single channel is not possible.

  20. Computational models for the nonlinear analysis of reinforced concrete plates

    NASA Technical Reports Server (NTRS)

    Hinton, E.; Rahman, H. H. A.; Huq, M. M.

    1980-01-01

    A finite element computational model for the nonlinear analysis of reinforced concrete solid, stiffened and cellular plates is briefly outlined. Typically, Mindlin elements are used to model the plates whereas eccentric Timoshenko elements are adopted to represent the beams. The layering technique, common in the analysis of reinforced concrete flexural systems, is incorporated in the model. The proposed model provides an inexpensive and reasonably accurate approach which can be extended for use with voided plates.

  1. Analysis of Computer-Mediated Communication: Using Formal Concept Analysis as a Visualizing Methodology.

    ERIC Educational Resources Information Center

    Hara, Noriko

    2002-01-01

    Introduces the use of Formal Concept Analysis (FCA) as a methodology to visualize the data in computer-mediated communication. Bases FCA on a mathematical lattice theory and offers visual maps (graphs) with conceptual hierarchies, and proposes use of FCA combined with content analysis to analyze computer-mediated communication. (Author/LRW)

  2. Audience Analysis: A Computer Assisted Instrument for Speech Education.

    ERIC Educational Resources Information Center

    Merritt, Floyd E.

    This paper reports on a combination questionnaire-attitude test designed to be used by speech instructors for the purpose of audience analysis. The test is divided into two parts and is scored by a computer. Part one requires the student to check items pertaining to class level, occupational goal, marital status, military service, high school…

  3. Computer-aided design and analysis of mechanisms

    NASA Technical Reports Server (NTRS)

    Knight, F. L.

    1982-01-01

    An introduction to the computer programs developed to assist in the design and analysis of mechanisms is presented. A survey of the various types of programs which are available is given, and the most widely used programs are compared. The way in which the programs are used is discussed, and demonstrated with an example.

  4. The Analysis of Essays by Computer. Final Report.

    ERIC Educational Resources Information Center

    Page, Ellis B.; Paulus, Dieter H.

    This study aimed at expanding a new field of educational measurement, by investigating the feasibility of using computer programs for the automatic analysis and evaluation of student writing. Essays written by secondary students in their English classes were rated by multiple independent judges on a number of traits usually considered important:…

  5. The NASA NASTRAN structural analysis computer program - New content

    NASA Technical Reports Server (NTRS)

    Weidman, D. J.

    1978-01-01

    Capabilities of a NASA-developed structural analysis computer program, NASTRAN, are evaluated with reference to finite-element modelling. Applications include the automotive industry as well as aerospace. It is noted that the range of sub-programs within NASTRAN has expanded, while keeping user cost low.

  6. A topological approach to computer-aided sensitivity analysis

    NASA Technical Reports Server (NTRS)

    Chan, S. P.; Munoz, R. M.

    1971-01-01

    Sensitivities of an arbitrary system are calculated using a general-purpose digital computer with available software packages for transfer function analysis. Sensitivity shows how element variation within a system affects system performance. A signal flow graph illustrates topological system behavior and the relationships among parameters in the system.

  7. Computational Analysis and Mapping of ijCSCL Content

    ERIC Educational Resources Information Center

    Lonchamp, Jacques

    2012-01-01

    The purpose of this empirical study is to analyze and map the content of the "International Journal of Computer-Supported Collaborative Learning" since its inception in 2006. Co-word analysis is the general approach that is used. In this approach, patterns of co-occurrence of pairs of items (words or phrases) identify relationships among ideas.…

  8. From Archives to Analysis: Computers Redefine the AAVSO Mission

    NASA Astrophysics Data System (ADS)

    Foster, Grant

    1992-10-01

    Recent advances in computer technology and program development promise great progress for the AAVSO in the near future. Not only have we improved our methods of archiving and evaluating variable star observations, but we will also soon be ready to enter the field of substantial in-house data analysis.

  9. Connecting Performance Analysis and Visualization to Advance Extreme Scale Computing

    SciTech Connect

    Bremer, Peer-Timo; Mohr, Bernd; Schulz, Martin; Pascucci, Valerio; Gamblin, Todd; Brunst, Holger

    2015-07-29

    The characterization, modeling, analysis, and tuning of software performance has been a central topic in High Performance Computing (HPC) since its early beginnings. The overall goal is to make HPC software run faster on particular hardware, either through better scheduling, on-node resource utilization, or more efficient distributed communication.

  10. Partial hyperbolicity and attracting regions in 3-dimensional manifolds

    NASA Astrophysics Data System (ADS)

    Potrie, Rafael

    The need for reliable, fiber-based sources of entangled and paired photons has intensified in recent years because of potential uses in optical quantum communication and computing. In particular, indistinguishable photon sources are an inherent part of several quantum communication protocols and are needed to establish the viability of quantum communication networks. This thesis is centered around the development of such sources at telecommunication-band wavelengths. In this thesis, we describe experiments on entangled photon generation and the creation of quantum logic gates in the C-band, and on photon indistinguishability in the O-band. These experiments utilize the four-wave mixing process in fiber which occurs as a result of the Kerr nonlinearity, to create paired photons. To begin, we report the development of a source of 1550-nm polarization entangled photons in fiber. We then interface this source with a quantum Controlled-NOT gate, which is a universal quantum logic gate. We set experimental bounds on the process fidelity of the Controlled-NOT gate. Next, we report a demonstration of quantum interference between 1310-nm photons produced in independent sources. We demonstrate high quantum interference visibility, a signature of quantum indistinguishability, while using distinguishable pump photons. Together, these efforts constitute preliminary steps toward establishing the viability of fiber-based quantum communication, which will allow us to utilize existing infrastructure for implementing quantum communication protocols.

  11. Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science

    NASA Technical Reports Server (NTRS)

    1988-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period April 1, 1988 through September 30, 1988.

  12. Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period October 1, 1986 through March 31, 1987 is summarized.

  13. Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period April 1, 1986 through September 30, 1986 is summarized.

  14. Phase transitions in a 3 dimensional lattice loop gas

    NASA Astrophysics Data System (ADS)

    MacKenzie, Richard; Nebia-Rahal, F.; Paranjape, M. B.

    2010-06-01

    We investigate, via Monte Carlo simulations, the phase structure of a system of closed, nonintersecting, but otherwise noninteracting, loops in 3 Euclidean dimensions. The loops correspond to closed trajectories of massive particles, and we find a phase transition as a function of their mass. We identify the order parameter as the average length of the loops at equilibrium. This order parameter exhibits a sharp increase as the mass is decreased through a critical value, although the behavior seems to be that of a crossover transition. We believe that the model represents an effective description of the broken-symmetry sector of the 2+1 dimensional Abelian Higgs model in the extreme strong coupling limit. The massive gauge bosons and the neutral scalars are decoupled, and the relevant low-lying excitations correspond to vortices and antivortices. The functional integral can be approximated by a sum over simple, closed vortex loop configurations. We present a novel method for generating nonintersecting closed loops, starting from a tetrahedral tessellation of three-space. The two phases that we find admit the following interpretation: the usual Higgs phase and a novel phase heralded by the appearance of effectively infinitely long loops. We compute the expectation value of the Wilson loop operator and that of the Polyakov loop operator. The Wilson loop exhibits perimeter-law behavior in both phases, implying that the transition corresponds neither to the restoration of symmetry nor to confinement. The effective interaction between external charges is screened in both phases; however, there is a dramatic increase in the polarization cloud in the novel phase, as shown by the energy shift introduced by the Wilson loop.

  15. Quantum computation in the analysis of hyperspectral data

    NASA Astrophysics Data System (ADS)

    Gomez, Richard B.; Ghoshal, Debabrata; Jayanna, Anil

    2004-08-01

    Recent research on the topic of quantum computation provides us with some quantum algorithms with higher efficiency and speedup compared to their classical counterparts. In this paper, it is our intent to provide the results of our investigation of several applications of such quantum algorithms, especially Grover's search algorithm, in the analysis of hyperspectral data. We found many parallels with Grover's method in existing data processing work that makes use of classical spectral matching algorithms. Our efforts also included the study of several methods dealing with hyperspectral image analysis in which classical computation methods involving large data sets could be replaced with quantum computation methods. The crux of the problem in computation involving a hyperspectral image data cube is to convert the large amount of data in high-dimensional space into real information. Currently, using the classical model, several time-consuming methods and steps are necessary to analyze these data, including animation, the minimum noise fraction transform, the pixel purity index algorithm, N-dimensional scatter plots, and identification of endmember spectra. If a quantum model of computation involving hyperspectral image data can be developed and formalized, it is highly likely that information retrieval from hyperspectral image data cubes would be a much easier process and the final information content would be much more meaningful and timely. In this case, dimensionality would not be a curse, but a blessing.
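
    For readers unfamiliar with the algorithm the authors lean on, here is a minimal classical statevector sketch of Grover's search: amplify the amplitude of one "matching" index out of N, which is the structure the paper maps onto spectral matching. This is a NumPy simulation, not a quantum implementation.

    ```python
    # Statevector simulation of Grover's search for one marked item.
    import numpy as np

    n_qubits = 6
    N = 2**n_qubits
    marked = 42                                 # index of the matching spectrum

    state = np.full(N, 1.0 / np.sqrt(N))        # uniform superposition
    iterations = int(np.pi / 4 * np.sqrt(N))    # near-optimal iteration count

    for _ in range(iterations):
        state[marked] *= -1.0                   # oracle: flip marked amplitude
        state = 2.0 * state.mean() - state      # diffusion: invert about mean

    print(f"P(marked) after {iterations} iterations: {state[marked]**2:.3f}")
    ```

    The quadratic speedup comes from needing only about sqrt(N) oracle queries, versus the N/2 comparisons a classical linear spectral match would average.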

  16. Computational analysis of Ebolavirus data: prospects, promises and challenges.

    PubMed

    Michaelis, Martin; Rossman, Jeremy S; Wass, Mark N

    2016-08-15

    The ongoing Ebola virus (also known as Zaire ebolavirus, a member of the Ebolavirus family) outbreak in West Africa has so far resulted in >28000 confirmed cases, compared with previous Ebolavirus outbreaks that affected at most a few hundred individuals. Hence, Ebolaviruses pose a much greater threat than we may have expected (or hoped). An improved understanding of the virus biology is essential to develop therapeutic and preventive measures and to be better prepared for future outbreaks by members of the Ebolavirus family. Computational investigations can complement wet laboratory research for biosafety level 4 pathogens such as Ebolaviruses, for which wet experimental capacities are limited by the small number of appropriate containment laboratories. During the current West Africa outbreak, sequence data from many Ebola virus genomes became available, providing a rich resource for computational analysis. Here, we consider the studies that have already reported on the computational analysis of these data. A range of properties have been investigated, including Ebolavirus evolution and pathogenicity, prediction of microRNAs, and identification of Ebolavirus-specific signatures. However, the accuracy of the results remains to be confirmed by wet laboratory experiments. Therefore, communication and exchange between computational and wet laboratory researchers is necessary to make maximum use of computational analyses and to iteratively improve these approaches. PMID:27528741

  17. Parallel multithread computing for spectroscopic analysis in optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Trojanowski, Michal; Kraszewski, Maciej; Strakowski, Marcin; Pluciński, Jerzy

    2014-05-01

    Spectroscopic Optical Coherence Tomography (SOCT) is an extension of Optical Coherence Tomography (OCT). It allows gathering spectroscopic information from individual scattering points inside the sample and is based on time-frequency analysis of interferometric signals. Such analysis requires calculating hundreds of Fourier transforms while performing a single A-scan, and further processing of the acquired spectroscopic information is needed, which significantly increases the computation time. In recent years, the application of graphics processing units (GPUs) has been proposed to reduce computation time in OCT by using parallel computing algorithms. GPU technology can also be used to speed up signal processing in SOCT. However, the parallel algorithms used in classical OCT need to be revised because of the different character of the analyzed data: classical OCT requires processing of long, independent interferometric signals to obtain subsequent A-scans, whereas SOCT requires processing of multiple, shorter signals, which differ only in a small subset of samples. We have developed new algorithms for parallel signal processing in SOCT, implemented with NVIDIA CUDA (Compute Unified Device Architecture). We present details of the algorithms and performance tests for analyzing data from an in-house SD-OCT system, and give a brief discussion of the usefulness of the developed algorithms. The presented algorithms might be useful for researchers working on OCT, as they reduce computation time and are a step toward real-time signal processing of SOCT data.
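
    A minimal sketch of the workload structure described above: a sliding-window Fourier analysis of one interferometric signal, with all windows assembled into a single batched FFT. Each window is independent, which is what makes the computation amenable to GPU parallelization; NumPy stands in here for the authors' CUDA implementation, and the chirped test signal is synthetic.

    ```python
    # Batched short-time Fourier analysis of one synthetic interferogram.
    import numpy as np

    fs = 1.0e5
    t = np.arange(0.0, 0.02, 1.0 / fs)
    # Synthetic interferogram whose dominant frequency drifts along the scan.
    signal = np.sin(2.0 * np.pi * (5e3 + 2e5 * t) * t)

    win_len, hop = 256, 32
    window = np.hanning(win_len)
    starts = np.arange(0, signal.size - win_len, hop)

    # Stack all windows into one matrix so a single batched FFT covers them,
    # the same structure a GPU kernel would exploit.
    frames = np.stack([signal[s:s + win_len] * window for s in starts])
    spectra = np.abs(np.fft.rfft(frames, axis=1))**2

    freqs = np.fft.rfftfreq(win_len, d=1.0 / fs)
    print(f"{frames.shape[0]} windows processed in one batched FFT")
    print("peak frequency, first vs last window:",
          freqs[spectra[0].argmax()], "Hz,", freqs[spectra[-1].argmax()], "Hz")
    ```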

  18. Benchmarking undedicated cloud computing providers for analysis of genomic datasets.

    PubMed

    Yazar, Seyhan; Gooden, George E C; Mackey, David A; Hewitt, Alex W

    2014-01-01

    A major bottleneck in biological discovery is now emerging at the computational level. Cloud computing offers a dynamic means whereby small and medium-sized laboratories can rapidly adjust their computational capacity. We benchmarked two established cloud computing services, Amazon Web Services Elastic MapReduce (EMR) on Amazon EC2 instances and Google Compute Engine (GCE), using publicly available genomic datasets (an E. coli CC102 strain and a Han Chinese male genome) and a standard bioinformatic pipeline on a Hadoop-based platform. Wall-clock time for complete assembly differed by 52.9% (95% CI: 27.5-78.2) for E. coli and 53.5% (95% CI: 34.4-72.6) for the human genome, with GCE being more efficient than EMR. The cost of running this experiment on EMR and GCE differed significantly, with the costs on EMR being 257.3% (95% CI: 211.5-303.1) and 173.9% (95% CI: 134.6-213.1) more expensive for the E. coli and human assemblies, respectively. Thus, GCE was found to outperform EMR both in terms of cost and wall-clock time. Our findings confirm that cloud computing is an efficient and potentially cost-effective alternative for analysis of large genomic datasets. In addition to releasing our cost-effectiveness comparison, we present ready-to-use scripts for establishing Hadoop instances with Ganglia monitoring on EC2 or GCE. PMID:25247298

  19. Practical Use of Computationally Frugal Model Analysis Methods

    DOE PAGESBeta

    Hill, Mary C.; Kavetski, Dmitri; Clark, Martyn; Ye, Ming; Arabi, Mazdak; Lu, Dan; Foglia, Laura; Mehl, Steffen

    2015-03-21

    Computationally frugal methods of model analysis can provide substantial benefits when developing models of groundwater and other environmental systems. Model analysis includes ways to evaluate model adequacy and to perform sensitivity and uncertainty analysis. Frugal methods typically require 10s of parallelizable model runs; their convenience allows for other uses of the computational effort. We suggest that model analysis be posed as a set of questions used to organize methods that range from frugal to expensive (requiring 10,000 model runs or more). This encourages focus on method utility, even when methods have starkly different theoretical backgrounds. We note that many frugal methods are more useful when unrealistic process-model nonlinearities are reduced. Inexpensive diagnostics are identified for determining when frugal methods are advantageous. Examples from the literature are used to demonstrate local methods and the diagnostics. We suggest that the greater use of computationally frugal model analysis methods would allow questions such as those posed in this work to be addressed more routinely, allowing the environmental sciences community to obtain greater scientific insight from the many ongoing and future modeling efforts.

  20. Practical Use of Computationally Frugal Model Analysis Methods

    SciTech Connect

    Hill, Mary C.; Kavetski, Dmitri; Clark, Martyn; Ye, Ming; Arabi, Mazdak; Lu, Dan; Foglia, Laura; Mehl, Steffen

    2015-03-21

    Computationally frugal methods of model analysis can provide substantial benefits when developing models of groundwater and other environmental systems. Model analysis includes ways to evaluate model adequacy and to perform sensitivity and uncertainty analysis. Frugal methods typically require 10s of parallelizable model runs; their convenience allows for other uses of the computational effort. We suggest that model analysis be posed as a set of questions used to organize methods that range from frugal to expensive (requiring 10,000 model runs or more). This encourages focus on method utility, even when methods have starkly different theoretical backgrounds. We note that many frugal methods are more useful when unrealistic process-model nonlinearities are reduced. Inexpensive diagnostics are identified for determining when frugal methods are advantageous. Examples from the literature are used to demonstrate local methods and the diagnostics. We suggest that the greater use of computationally frugal model analysis methods would allow questions such as those posed in this work to be addressed more routinely, allowing the environmental sciences community to obtain greater scientific insight from the many ongoing and future modeling efforts.

  1. Cartographic Modeling: Computer-assisted Analysis of Spatially Defined Neighborhoods

    NASA Technical Reports Server (NTRS)

    Berry, J. K.; Tomlin, C. D.

    1982-01-01

    Cartographic models addressing a wide variety of applications are composed of fundamental map processing operations. These primitive operations are neither data base nor application-specific. By organizing the set of operations into a mathematical-like structure, the basis for a generalized cartographic modeling framework can be developed. Among the major classes of primitive operations are those associated with reclassifying map categories, overlaying maps, determining distance and connectivity, and characterizing cartographic neighborhoods. The conceptual framework of cartographic modeling is established and techniques for characterizing neighborhoods are used as a means of demonstrating some of the more sophisticated procedures of computer-assisted map analysis. A cartographic model for assessing effective roundwood supply is briefly described as an example of a computer analysis. Most of the techniques described have been implemented as part of the map analysis package developed at the Yale School of Forestry and Environmental Studies.

  2. Computed tomographic beam-hardening artefacts: mathematical characterization and analysis

    PubMed Central

    Park, Hyoung Suk; Chung, Yong Eun; Seo, Jin Keun

    2015-01-01

    This paper presents a mathematical characterization and analysis of beam-hardening artefacts in X-ray computed tomography (CT). In the field of dental and medical radiography, metal artefact reduction in CT is becoming increasingly important as artificial prostheses and metallic implants become more widespread in ageing populations. Metal artefacts are mainly caused by the beam-hardening of polychromatic X-ray photon beams, which causes mismatch between the actual sinogram data and the data model being the Radon transform of the unknown attenuation distribution in the CT reconstruction algorithm. We investigate the beam-hardening factor through a mathematical analysis of the discrepancy between the data and the Radon transform of the attenuation distribution at a fixed energy level. Separation of cupping artefacts from beam-hardening artefacts allows causes and effects of streaking artefacts to be analysed. Various computer simulations and experiments are performed to support our mathematical analysis. PMID:25939628
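
    The mismatch the authors analyze can be reproduced in a few lines. Below is a single-ray sketch: for a polychromatic beam, the measured projection -ln(I/I0) is not the line integral of the attenuation at any single energy, because low-energy photons are absorbed preferentially. The spectrum weights and attenuation coefficients are illustrative values, not data from the paper.

    ```python
    # Single-ray illustration of beam hardening in polychromatic CT.
    import numpy as np

    energies = np.array([40.0, 60.0, 80.0, 100.0])   # keV bins
    weights = np.array([0.35, 0.35, 0.20, 0.10])     # source spectrum weights
    mu_water = np.array([0.27, 0.21, 0.18, 0.17])    # attenuation of water, 1/cm

    lengths = np.linspace(0.0, 30.0, 100)            # cm of water traversed

    # Polychromatic measurement: energy-weighted transmitted intensity.
    trans = np.exp(-np.outer(lengths, mu_water)) @ weights
    poly = -np.log(trans)
    # Monochromatic model implicit in Radon-based reconstruction: linear in length.
    mono = lengths * (weights @ mu_water)

    # The growing gap is the nonlinearity behind cupping and streaking artefacts.
    print(f"discrepancy at 30 cm: {mono[-1] - poly[-1]:.2f}")
    ```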

  3. Computed tomographic beam-hardening artefacts: mathematical characterization and analysis.

    PubMed

    Park, Hyoung Suk; Chung, Yong Eun; Seo, Jin Keun

    2015-06-13

    This paper presents a mathematical characterization and analysis of beam-hardening artefacts in X-ray computed tomography (CT). In the field of dental and medical radiography, metal artefact reduction in CT is becoming increasingly important as artificial prostheses and metallic implants become more widespread in ageing populations. Metal artefacts are mainly caused by the beam-hardening of polychromatic X-ray photon beams, which causes mismatch between the actual sinogram data and the data model being the Radon transform of the unknown attenuation distribution in the CT reconstruction algorithm. We investigate the beam-hardening factor through a mathematical analysis of the discrepancy between the data and the Radon transform of the attenuation distribution at a fixed energy level. Separation of cupping artefacts from beam-hardening artefacts allows causes and effects of streaking artefacts to be analysed. Various computer simulations and experiments are performed to support our mathematical analysis. PMID:25939628

  4. Probabilistic Computer Analysis for Rapid Evaluation of Structures.

    Energy Science and Technology Software Center (ESTSC)

    2007-03-29

    P-CARES 2.0.0, Probabilistic Computer Analysis for Rapid Evaluation of Structures, was developed for NRC staff use to determine the validity and accuracy of the analysis methods used by various utilities for structural safety evaluations of nuclear power plants. P-CARES provides the capability to effectively evaluate the probabilistic seismic response using simplified soil and structural models and to quickly check the validity and/or accuracy of the SSI data received from applicants and licensees. The code is organized in a modular format with the basic modules of the system performing static, seismic, and nonlinear analysis.

  5. Probabilistic Computer Analysis for Rapid Evaluation of Structures.

    SciTech Connect

    XU, JIM

    2007-03-29

    P-CARES 2.0.0, Probabilistic Computer Analysis for Rapid Evaluation of Structures, was developed for NRC staff use to determine the validity and accuracy of the analysis methods used by various utilities for structural safety evaluations of nuclear power plants. P-CARES provides the capability to effectively evaluate the probabilistic seismic response using simplified soil and structural models and to quickly check the validity and/or accuracy of the SSI data received from applicants and licensees. The code is organized in a modular format with the basic modules of the system performing static, seismic, and nonlinear analysis.

  6. Improved Flow Modeling in Transient Reactor Safety Analysis Computer Codes

    SciTech Connect

    Holowach, M.J.; Hochreiter, L.E.; Cheung, F.B.

    2002-07-01

    A method of accounting for fluid-to-fluid shear in between calculational cells over a wide range of flow conditions envisioned in reactor safety studies has been developed such that it may be easily implemented into a computer code such as COBRA-TF for more detailed subchannel analysis. At a given nodal height in the calculational model, equivalent hydraulic diameters are determined for each specific calculational cell using either laminar or turbulent velocity profiles. The velocity profile may be determined from a separate CFD (Computational Fluid Dynamics) analysis, experimental data, or existing semi-empirical relationships. The equivalent hydraulic diameter is then applied to the wall drag force calculation so as to determine the appropriate equivalent fluid-to-fluid shear caused by the wall for each cell based on the input velocity profile. This means of assigning the shear to a specific cell is independent of the actual wetted perimeter and flow area for the calculational cell. The use of this equivalent hydraulic diameter for each cell within a calculational subchannel results in a representative velocity profile which can further increase the accuracy and detail of heat transfer and fluid flow modeling within the subchannel when utilizing a thermal hydraulics systems analysis computer code such as COBRA-TF. Utilizing COBRA-TF with the flow modeling enhancement results in increased accuracy for a coarse-mesh model without the significantly greater computational and time requirements of a full-scale 3D (three-dimensional) transient CFD calculation. (authors)
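
    For context, here is a sketch of the textbook wall-drag closure that the method above generalizes: wall shear computed from a hydraulic diameter D_h = 4A/P_w and a friction factor correlation. The Blasius correlation, fluid properties, and subchannel geometry below are standard illustrative choices; the paper's contribution is selecting an equivalent hydraulic diameter per cell from an input velocity profile rather than from the cell's own wetted geometry, which this sketch does not reproduce.

    ```python
    # Baseline hydraulic-diameter wall-drag closure (textbook form).
    import numpy as np

    rho, mu = 1000.0, 1.0e-3   # density (kg/m^3), viscosity (Pa s), water-like
    u = 2.0                    # cell-average axial velocity, m/s

    def wall_shear(area, wetted_perimeter):
        d_h = 4.0 * area / wetted_perimeter        # hydraulic diameter
        re = rho * u * d_h / mu                    # Reynolds number
        f_darcy = 0.316 * re**-0.25                # Blasius, turbulent pipe flow
        return f_darcy / 4.0 * 0.5 * rho * u**2    # wall shear stress, Pa

    # A square 10 mm subchannel wetted on all four walls.
    print(f"tau_w = {wall_shear(1e-4, 4e-2):.1f} Pa")
    ```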

  7. The 3-dimensional cored and logarithm potentials: Periodic orbits

    SciTech Connect

    Kulesza, Maité; Llibre, Jaume

    2014-11-15

    We study analytically families of periodic orbits for the cored and logarithmic Hamiltonians with 3 degrees of freedom, which are relevant in the analysis of galactic dynamics. First, after introducing a scale transformation in the coordinates and momenta with a parameter ε, we show that both systems give essentially the same set of equations of motion up to first order in ε. The conditions for finding families of periodic orbits, using the averaging theory up to first order in ε, then apply equally to both systems at every energy level H = h > 0, showing the existence of at least 3 periodic orbits for ε small enough, and also provide an analytic approximation for the initial conditions of these periodic orbits. We prove that at every positive energy level the cored and logarithmic Hamiltonians with 3 degrees of freedom have at least three periodic solutions. The technique used for proving this result can be applied to other Hamiltonian systems.

  8. Data collection, computation and statistical analysis in psychophysiological experiments.

    PubMed

    Buzzi, R; Wespi, J; Zwimpfer, J

    1982-01-01

    The system was designed to allow simultaneous monitoring of eight bioelectrical signals together with the necessary event markers. The data inputs are pulse code modulated, recorded on magnetic tape, and then read into a minicomputer. The computer permits the determination of parameters for the following signals: electrocardiogram (ECG), respiration (RESP), skin conductance changes (SCC), electromyogram (EMG), plethysmogram (PLET), pulse transmission time (PTT), and electroencephalogram (EEG). These parameters are determined for time blocks of selectable duration and read into a mainframe computer for further statistical analysis. PMID:7183101

  9. Computer Analysis Of ILO Standard Chest Radiographs Of Pneumoconiosis

    NASA Astrophysics Data System (ADS)

    Li, C. C.; Shu, David B. C.; Tai, H. T.; Hou, W.; Kunkle, G. A.; Wang, Y.; Hoy, R. J.

    1982-11-01

    This paper presents a study of computer analysis of the 1980 ILO standard chest radiographs of pneumoconiosis. Algorithms developed for the detection of individual small rounded and irregular opacities were tested and evaluated on these standard radiographs. The density, shape, and size distribution of the detected objects in the lung field, in spite of false positives, can be used as indicators of the onset of pneumoconiosis. This approach is potentially useful in computer-assisted screening and early detection, where the annual chest radiograph of each worker is compared with his or her own normal radiograph obtained previously.

  10. Computational Analysis in Support of the SSTO Flowpath Test

    NASA Technical Reports Server (NTRS)

    Duncan, Beverly S.; Trefny, Charles J.

    1994-01-01

    A synergistic approach combining computational methods and experimental measurements is used in the analysis of a hypersonic inlet. There are four major focal points within this study, which examines the boundary layer growth on a compression ramp upstream of the cowl lip of a scramjet inlet. Initially, the boundary layer growth on the NASP Concept Demonstrator Engine (CDE) is examined. The follow-up study determines the optimum diverter height required by the SSTO Flowpath test to best duplicate the CDE results. These flow field computations are then compared to the experimental measurements, and the mass-averaged Mach number is determined for this inlet.
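
    A minimal sketch of the quantity reported at the end of the abstract: a mass-averaged Mach number, in which each local Mach number is weighted by the local mass flux rho*u. The boundary-layer-like profiles below are synthetic stand-ins for the computed inlet flow field.

    ```python
    # Mass-averaged vs area-averaged Mach number for a synthetic profile.
    import numpy as np

    y = np.linspace(1e-4, 0.05, 200)                 # height above ramp, m
    mach = 6.0 * (y / y[-1]) ** (1.0 / 7.0)          # Mach profile (illustrative)
    rho_u = 5.0 + 40.0 * (y / y[-1]) ** (1.0 / 7.0)  # mass flux rho*u, kg/(m^2 s)

    # Uniform point spacing, so the integration weights cancel in the ratio.
    mach_mass_avg = np.sum(rho_u * mach) / np.sum(rho_u)
    mach_area_avg = np.mean(mach)
    print(f"mass-averaged Mach {mach_mass_avg:.2f} "
          f"vs area-averaged {mach_area_avg:.2f}")
    ```

    The mass-weighted value exceeds the area average because the high-Mach outer flow also carries most of the mass flux.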

  11. Cone-beam computed tomography analysis of curved root canals after mechanical preparation with three nickel-titanium rotary instruments

    PubMed Central

    Elsherief, Samia M.; Zayet, Mohamed K.; Hamouda, Ibrahim M.

    2013-01-01

    Cone-beam computed tomography is a high-resolution, 3-dimensional imaging method. The purpose of this study was to use cone-beam computed tomography to compare the effects of 3 different NiTi rotary instruments used to prepare curved root canals on the final shape of the canals and the total amount of root canal transportation. A total of 81 mesial root canals from 42 extracted human mandibular molars, with curvatures ranging from 15 to 45 degrees, were selected. Canals were randomly divided into 3 groups of 27 each. After preparation with ProTaper, Revo-S, and Hero Shaper, the amount of transportation and the centering ability were assessed by using cone-beam computed tomography. Utilizing pre- and post-instrumentation radiographs, straightening of the canal curvatures was determined with a computer image analysis program. Canals were metrically assessed for changes (surface area, changes in curvature, and transportation) during canal preparation using the SimPlant software; instrument failures were also recorded. Mean total widths and outer and inner width measurements were determined on each central canal path, and differences were statistically analyzed. The results showed that all instruments maintained the original canal curvature well, with no significant differences between the different files (P = 0.226). During preparation there was failure of only one file (in the ProTaper group). In conclusion, under the conditions of this study, all instruments maintained the original canal curvature well and were safe to use. Areas of uninstrumented root canal wall were left in all regions by the various systems. PMID:23885273

  12. Monolithically integrated Helmholtz coils by 3-dimensional printing

    SciTech Connect

    Li, Longguang; Abedini-Nassab, Roozbeh; Yellen, Benjamin B.

    2014-06-23

    3D printing technology is of great interest for the monolithic fabrication of integrated systems; however, it is a challenge to introduce metallic components into 3D printed molds to enable broader device functionality. Here, we develop a technique for constructing a multi-axial Helmholtz coil by injecting a eutectic liquid metal gallium-indium alloy (EGaIn) into helically shaped orthogonal cavities constructed in a 3D printed block. The tri-axial solenoids each carry up to 3.6 A of electrical current and produce magnetic fields of up to 70 G. Within the central section of the coil, the field variation is less than 1% and is in agreement with theory. The flow rates and critical pressures required to fill the 3D cavities with liquid metal also agree with theoretical predictions and provide scaling trends for filling the 3D printed parts. These monolithically integrated solenoids may find future applications in electronic cell culture platforms, atomic traps, and miniaturized chemical analysis systems based on nuclear magnetic resonance.
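
    As a plausibility check on the reported field level, here is a short calculation under textbook assumptions: the on-axis center field of a Helmholtz pair is B = (4/5)^(3/2) mu0 N I / R. The 3.6 A current is from the abstract; the turn count and coil radius are assumed values chosen to match the scale of a printed part.

    ```python
    # Textbook Helmholtz-pair center field; N and R are assumed values.
    import numpy as np

    mu0 = 4e-7 * np.pi    # vacuum permeability, T m/A
    N = 20                # turns per coil (assumed)
    I = 3.6               # coil current, A (from the abstract)
    R = 0.01              # coil radius, m (assumed, centimeter-scale part)

    B = (4.0 / 5.0) ** 1.5 * mu0 * N * I / R
    print(f"center field: {B * 1e4:.0f} G")   # 1 T = 1e4 G
    ```

    With these assumed values the estimate lands near 65 G, the same order as the 70 G the authors report.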

  13. Practical Use of Computationally Frugal Model Analysis Methods.

    PubMed

    Hill, Mary C; Kavetski, Dmitri; Clark, Martyn; Ye, Ming; Arabi, Mazdak; Lu, Dan; Foglia, Laura; Mehl, Steffen

    2016-03-01

    Three challenges compromise the utility of mathematical models of groundwater and other environmental systems: (1) a dizzying array of model analysis methods and metrics make it difficult to compare evaluations of model adequacy, sensitivity, and uncertainty; (2) the high computational demands of many popular model analysis methods (requiring 1000s, 10,000s, or more model runs) make them difficult to apply to complex models; and (3) many models are plagued by unrealistic nonlinearities arising from the numerical model formulation and implementation. This study proposes a strategy to address these challenges through a careful combination of model analysis and implementation methods. In this strategy, computationally frugal model analysis methods (often requiring a few dozen parallelizable model runs) play a major role, and computationally demanding methods are used for problems where (relatively) inexpensive diagnostics suggest the frugal methods are unreliable. We also argue in favor of detecting and, where possible, eliminating unrealistic model nonlinearities; this increases the realism of the model itself and facilitates the application of frugal methods. Literature examples are used to demonstrate the use of frugal methods and associated diagnostics. We suggest that the strategy proposed in this paper would allow the environmental sciences community to achieve greater transparency and falsifiability of environmental models, and to obtain greater scientific insight from ongoing and future modeling efforts. PMID:25810333
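
    A minimal sketch of one frugal method the authors advocate: local sensitivity analysis by finite differences, costing one run per parameter and trivially parallelizable. The three-parameter "model" is an analytic stand-in for an expensive groundwater simulation.

    ```python
    # Local, finite-difference sensitivity analysis: one run per parameter.
    import numpy as np

    def model(params):
        """Analytic stand-in for an expensive simulator: one scalar prediction."""
        k, recharge, porosity = params
        return recharge / k + 0.1 * porosity

    p0 = np.array([1e-4, 3e-3, 0.25])    # base parameter values (invented)
    base = model(p0)

    rel_step = 0.01                      # 1% perturbation of each parameter
    sens = np.empty_like(p0)
    for i in range(p0.size):
        p = p0.copy()
        p[i] *= 1.0 + rel_step
        # Scaled sensitivity: change in output per relative parameter change;
        # each perturbed run is independent, hence parallelizable.
        sens[i] = (model(p) - base) / rel_step

    for name, s in zip(["k", "recharge", "porosity"], sens):
        print(f"scaled sensitivity to {name}: {s: .3f}")
    ```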

  14. Computational simulation for analysis and synthesis of impact resilient structure

    NASA Astrophysics Data System (ADS)

    Djojodihardjo, Harijono

    2013-10-01

    Impact-resilient structures are of great interest in many engineering applications, ranging from civil and land vehicle to aircraft and space structures, to mention a few examples. To design such structures, one has to resort to fundamental principles and take into account progress in analytical and computational approaches as well as in material science and technology. With such perspectives, this work looks at a generic beam and plate structure subject to impact loading and carries out analysis and numerical simulation. The first objective of the work is to develop a computational algorithm to analyze a flat plate, as a generic structure, subjected to impact loading for numerical simulation and parametric study. The analysis is based on dynamic response analysis, with consideration given to the elastic-plastic region. The second objective is to utilize the computational algorithm for direct numerical simulation and, as a parallel scheme, to utilize commercial off-the-shelf numerical code for parametric study, optimization, and synthesis. Through such analysis and numerical simulation, effort is devoted to arriving at an optimum configuration in terms of loading, structural dimensions, material properties, and composite lay-up, among others. Results are discussed in view of practical applications.

  15. CFD Analysis and Design Optimization Using Parallel Computers

    NASA Technical Reports Server (NTRS)

    Martinelli, Luigi; Alonso, Juan Jose; Jameson, Antony; Reuther, James

    1997-01-01

    A versatile and efficient multi-block method is presented for the simulation of both steady and unsteady flow, as well as aerodynamic design optimization of complete aircraft configurations. The compressible Euler and Reynolds Averaged Navier-Stokes (RANS) equations are discretized using a high resolution scheme on body-fitted structured meshes. An efficient multigrid implicit scheme is implemented for time-accurate flow calculations. Optimum aerodynamic shape design is achieved at very low cost using an adjoint formulation. The method is implemented on parallel computing systems using the MPI message passing interface standard to ensure portability. The results demonstrate that, by combining highly efficient algorithms with parallel computing, it is possible to perform detailed steady and unsteady analysis as well as automatic design for complex configurations using the present generation of parallel computers.

  16. A Computational Approach to Qualitative Analysis in Large Textual Datasets

    PubMed Central

    Evans, Michael S.

    2014-01-01

    In this paper I introduce computational techniques to extend qualitative analysis into the study of large textual datasets. I demonstrate these techniques by using probabilistic topic modeling to analyze a broad sample of 14,952 documents published in major American newspapers from 1980 through 2012. I show how computational data mining techniques can identify and evaluate the significance of qualitatively distinct subjects of discussion across a wide range of public discourse. I also show how examining large textual datasets with computational methods can overcome methodological limitations of conventional qualitative methods, such as how to measure the impact of particular cases on broader discourse, how to validate substantive inferences from small samples of textual data, and how to determine if identified cases are part of a consistent temporal pattern. PMID:24498398
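
    A minimal sketch of the technique named above, probabilistic topic modeling, applied to a toy corpus. scikit-learn's LDA implementation stands in for whatever software the study used, and the six documents are invented.

    ```python
    # Probabilistic topic modeling (LDA) on a toy corpus.
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.decomposition import LatentDirichletAllocation

    docs = [
        "gene therapy clinical trial results",
        "stem cell research ethics debate",
        "election campaign funding reform",
        "senate vote on funding bill",
        "gene editing ethics and policy",
        "campaign rally draws large crowd",
    ]

    vectorizer = CountVectorizer(stop_words="english")
    X = vectorizer.fit_transform(docs)            # document-term counts
    lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

    vocab = vectorizer.get_feature_names_out()
    for k, topic in enumerate(lda.components_):
        top_words = [vocab[i] for i in topic.argsort()[-4:][::-1]]
        print(f"topic {k}: {', '.join(top_words)}")
    ```

    At the scale of the study (14,952 documents), the same pipeline recovers qualitatively distinct subjects of discussion rather than these toy word clusters.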

  17. 3-dimensional wells and tunnels for finite element grids

    SciTech Connect

    Cherry, T.A.; Gable, C.W.; Trease, H.

    1996-12-31

    Modeling fluid, vapor, and air injection and extraction from wells poses a number of problems. The length scale of well bores is centimeters, the region of high pressure gradient may be tens of meters and the reservoir may be tens of kilometers. Furthermore, accurate representation of the path of a deviated well can be difficult. Incorporating the physics of injection and extraction can be made easier and more accurate with automated grid generation tools that incorporate wells as part of a background mesh that represents the reservoir. GEOMESH is a modeling tool developed for automating finite element grid generation. This tool maintains the geometric integrity of the geologic framework and produces optimal (Delaunay) tetrahedral grids. GEOMESH creates a 3D well as hexagonal segments formed along the path of the well. This well structure is tetrahedralized into a Delaunay mesh and then embedded into a background mesh. The well structure can be radially or vertically refined and each well layer is assigned a material property or can take on the material properties of the surrounding stratigraphy. The resulting embedded well can then be used by unstructured finite element models for gas and fluid flow in the vicinity of wells or tunnels. This 3D well representation allows the study of the free-surface of the well and surrounding stratigraphy. It reduces possible grid orientation effects, and allows better correlation between well sample data and the geologic model. The well grids also allow improved visualization for well and tunnel model analysis. 3D observation of the grids helps qualitative interpretation and can reveal features not apparent in fewer dimensions.
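
    A minimal sketch of the core meshing idea: sample points along a deviated well path, embed them among background reservoir points, and let a Delaunay tetrahedralization connect them. scipy stands in for GEOMESH here; the radial refinement, material tagging, and geologic-framework integrity the abstract describes are not shown.

    ```python
    # Embed a deviated well path in a background point cloud and tetrahedralize.
    import numpy as np
    from scipy.spatial import Delaunay

    rng = np.random.default_rng(3)
    # Background reservoir points: 500 m x 500 m footprint, 1 km deep.
    background = rng.uniform([0, 0, -1000], [500, 500, 0], size=(400, 3))

    # Deviated well path: vertical near the surface, kicking off to the east.
    s = np.linspace(0.0, 1.0, 60)
    well = np.column_stack([250 + 150 * s**2,
                            250 * np.ones_like(s),
                            -900 * s])

    points = np.vstack([background, well])
    mesh = Delaunay(points)          # Delaunay tetrahedralization in 3D
    print(f"{len(points)} points -> {mesh.simplices.shape[0]} tetrahedra")
    ```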

  18. 3-dimensional wells and tunnels for finite element grids

    SciTech Connect

    Cherry, T.A.; Gable, C.W.; Trease, H.

    1996-04-01

    Modeling fluid, vapor, and air injection and extraction from wells poses a number of problems. The length scale of well bores is centimeters, the region of high pressure gradient may be tens of meters and the reservoir may be tens of kilometers. Furthermore, accurate representation of the path of a deviated well can be difficult. Incorporating the physics of injection and extraction can be made easier and more accurate with automated grid generation tools that incorporate wells as part of a background mesh that represents the reservoir. GEOMESH is a modeling tool developed for automating finite element grid generation. This tool maintains the geometric integrity of the geologic framework and produces optimal (Delaunay) tetrahedral grids. GEOMESH creates a 3D well as hexagonal segments formed along the path of the well. This well structure is tetrahedralized into a Delaunay mesh and then embedded into a background mesh. The well structure can be radially or vertically refined and each well layer is assigned a material property or can take on the material properties of the surrounding stratigraphy. The resulting embedded well can then be used by unstructured finite element models for gas and fluid flow in the vicinity of wells or tunnels. This 3D well representation allows the study of the free-surface of the well and surrounding stratigraphy. It reduces possible grid orientation effects, and allows better correlation between well sample data and the geologic model. The well grids also allow improved visualization for well and tunnel model analysis. 3D observation of the grids helps qualitative interpretation and can reveal features not apparent in fewer dimensions.

  19. CAPRI: Using a Geometric Foundation for Computational Analysis and Design

    NASA Technical Reports Server (NTRS)

    Haimes, Robert

    2002-01-01

    CAPRI (Computational Analysis Programming Interface) is a software development tool intended to make computerized design, simulation and analysis faster and more efficient. The computational steps traditionally taken for most engineering analysis (Computational Fluid Dynamics (CFD), structural analysis, etc.) are: Surface Generation, usually by employing a Computer Aided Design (CAD) system; Grid Generation, preparing the volume for the simulation; Flow Solver, producing the results at the specified operational point; Post-processing Visualization, interactively attempting to understand the results. It should be noted that the structures problem is more tractable than CFD; there are fewer mesh topologies used and the grids are not as fine (this problem space does not have the length scaling issues of fluids). For CFD, these steps have worked well in the past for simple steady-state simulations at the expense of much user interaction. The data was transmitted between phases via files. In most cases, the output from a CAD system could go out as IGES files. The output from grid generators and solvers does not really have standards, though there are a couple of file formats (i.e., PLOT3D and the upcoming CGNS) that can be used for a subset of the gridding data. The user would have to patch up the data or translate from one format to another to move to the next step. Sometimes this could take days. Instead of the serial approach to analysis, CAPRI takes a geometry-centric approach. CAPRI is a software building tool-kit that refers to two ideas: (1) A simplified, object-oriented, hierarchical view of a solid part integrating both geometry and topology definitions, and (2) programming access to this part or assembly and any attached data. The connection to the geometry is made through an Application Programming Interface (API) and not a file system.
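
    The abstract's two ideas, a hierarchical geometry-plus-topology view of a part and programmatic access to attached data, can be sketched as follows. CAPRI's real interface is exposed through C and Fortran bindings; the Python class and method names below are hypothetical stand-ins used only to illustrate the geometry-centric pattern:

```python
# Hypothetical sketch of a geometry-centric part model in the spirit of
# CAPRI's hierarchical view. None of these names are CAPRI's actual API.
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class Face:
    face_id: int
    attributes: Dict[str, object] = field(default_factory=dict)

@dataclass
class Volume:
    volume_id: int
    faces: List[Face]

@dataclass
class Part:
    """A solid part: topology plus attached data, queried through calls
    rather than exchanged through intermediate files."""
    name: str
    volumes: List[Volume]

    def attach(self, face_id: int, key: str, value: object) -> None:
        # A grid generator or solver can hang data (e.g., a boundary
        # condition or mesh spacing) directly on topological entities.
        for vol in self.volumes:
            for face in vol.faces:
                if face.face_id == face_id:
                    face.attributes[key] = value

wing = Part("wing", [Volume(1, [Face(10), Face(11)])])
wing.attach(10, "boundary_condition", "no-slip wall")
print(wing.volumes[0].faces[0].attributes)
```

    The point of the pattern is that grid generators and solvers query one live part model instead of translating files between every phase.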

  20. Computational analysis of high resolution unsteady airloads for rotor aeroacoustics

    NASA Technical Reports Server (NTRS)

    Quackenbush, Todd R.; Lam, C.-M. Gordon; Wachspress, Daniel A.; Bliss, Donald B.

    1994-01-01

    The study of helicopter aerodynamic loading for acoustics applications requires the application of efficient yet accurate simulations of the velocity field induced by the rotor's vortex wake. This report summarizes work to date on the development of such an analysis, which builds on the Constant Vorticity Contour (CVC) free wake model, previously implemented for the study of vibratory loading in the RotorCRAFT computer code. The present effort has focused on implementation of an airload reconstruction approach that computes high resolution airload solutions of rotor/rotor-wake interactions required for acoustics computations. Supplementary efforts on the development of improved vortex core modeling, unsteady aerodynamic effects, higher spatial resolution of rotor loading, and fast vortex wake implementations have substantially enhanced the capabilities of the resulting software, denoted RotorCRAFT/AA (AeroAcoustics). Results of validation calculations using recently acquired model rotor data show that by employing airload reconstruction it is possible to apply the CVC wake analysis with temporal and spatial resolution suitable for acoustics applications while reducing the computation time required by one to two orders of magnitude relative to that required by direct calculations. Promising correlation with this body of airload and noise data has been obtained for a variety of rotor configurations and operating conditions.

  1. PArallel Reacting Multiphase FLOw Computational Fluid Dynamic Analysis

    Energy Science and Technology Software Center (ESTSC)

    2002-06-01

    PARMFLO is a parallel multiphase reacting flow computational fluid dynamics (CFD) code. It can perform steady or unsteady simulations in three space dimensions. It is intended for use in engineering CFD analysis of industrial flow system components. Its parallel processing capabilities allow it to be applied to problems that use at least an order of magnitude more computational cells than the number that can be used on a typical single processor workstation (about 10⁶ cells in parallel processing mode versus about 10⁵ cells in serial processing mode). Alternately, by spreading the work of a CFD problem that could be run on a single workstation over a group of computers on a network, it can bring the runtime down by an order of magnitude or more (typically from many days to less than one day). The software was implemented using the industry standard Message-Passing Interface (MPI) and domain decomposition in one spatial direction. The phases of a flow problem may include an ideal gas mixture with an arbitrary number of chemical species, and dispersed droplet and particle phases. Regions of porous media may also be included within the domain. The porous media may be packed beds, foams, or monolith catalyst supports. With these features, the code is especially suited to analysis of mixing of reactants in the inlet chamber of catalytic reactors coupled to computation of product yields that result from the flow of the mixture through the catalyst coated support structure.

  2. PArallel Reacting Multiphase FLOw Computational Fluid Dynamic Analysis

    SciTech Connect

    Lottes, Steven A.

    2002-06-01

    PARMFLO is a parallel multiphase reacting flow computational fluid dynamics (CFD) code. It can perform steady or unsteady simulations in three space dimensions. It is intended for use in engineering CFD analysis of industrial flow system components. Its parallel processing capabilities allow it to be applied to problems that use at least an order of magnitude more computational cells than the number that can be used on a typical single processor workstation (about 10⁶ cells in parallel processing mode versus about 10⁵ cells in serial processing mode). Alternately, by spreading the work of a CFD problem that could be run on a single workstation over a group of computers on a network, it can bring the runtime down by an order of magnitude or more (typically from many days to less than one day). The software was implemented using the industry standard Message-Passing Interface (MPI) and domain decomposition in one spatial direction. The phases of a flow problem may include an ideal gas mixture with an arbitrary number of chemical species, and dispersed droplet and particle phases. Regions of porous media may also be included within the domain. The porous media may be packed beds, foams, or monolith catalyst supports. With these features, the code is especially suited to analysis of mixing of reactants in the inlet chamber of catalytic reactors coupled to computation of product yields that result from the flow of the mixture through the catalyst coated support structure.
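
    The parallel strategy named in the abstract, MPI with domain decomposition in one spatial direction, can be sketched in a few lines. This is not PARMFLO source; it is a minimal mpi4py illustration that splits a 1-D array of cells into slabs (one per rank, assuming the cell count divides evenly) and exchanges ghost cells with neighbors before a local update:

```python
# Minimal mpi4py sketch of 1-D domain decomposition with halo exchange;
# not PARMFLO code. Run with e.g.: mpiexec -n 4 python decomp.py
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

nx_global = 64
nx_local = nx_global // size             # slab of cells owned by this rank
u = np.full(nx_local + 2, float(rank))   # one ghost cell on each side

left = rank - 1 if rank > 0 else MPI.PROC_NULL
right = rank + 1 if rank < size - 1 else MPI.PROC_NULL

# Send the right edge cell to the right neighbor, receive into the left
# ghost cell, then the mirror image for the other direction.
comm.Sendrecv(sendbuf=u[-2:-1], dest=right, recvbuf=u[0:1], source=left)
comm.Sendrecv(sendbuf=u[1:2], dest=left, recvbuf=u[-1:], source=right)

# A simple diffusion-like update using the freshly exchanged ghosts.
u[1:-1] = 0.5 * u[1:-1] + 0.25 * (u[:-2] + u[2:])
print(f"rank {rank}: interior mean {u[1:-1].mean():.3f}")
```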

  3. Analysis of outcomes in radiation oncology: An integrated computational platform

    PubMed Central

    Liu, Dezhi; Ajlouni, Munther; Jin, Jian-Yue; Ryu, Samuel; Siddiqui, Farzan; Patel, Anushka; Movsas, Benjamin; Chetty, Indrin J.

    2009-01-01

    Radiotherapy research and outcome analyses are essential for evaluating new methods of radiation delivery and for assessing the benefits of a given technology on locoregional control and overall survival. In this article, a computational platform is presented to facilitate radiotherapy research and outcome studies in radiation oncology. This computational platform consists of (1) an infrastructural database that stores patient diagnosis, IMRT treatment details, and follow-up information, (2) an interface tool that is used to import and export IMRT plans in DICOM RT and AAPM/RTOG formats from a wide range of planning systems to facilitate reproducible research, (3) a graphical data analysis and programming tool that visualizes all aspects of an IMRT plan including dose, contour, and image data to aid the analysis of treatment plans, and (4) a software package that calculates radiobiological models to evaluate IMRT treatment plans. Given the limited number of general-purpose computational environments for radiotherapy research and outcome studies, this computational platform represents a powerful and convenient tool that is well suited for analyzing dose distributions biologically and correlating them with the delivered radiation dose distributions and other patient-related clinical factors. In addition the database is web-based and accessible by multiple users, facilitating its convenient application and use. PMID:19544785

  4. Ubiquitous computing in sports: A review and analysis.

    PubMed

    Baca, Arnold; Dabnichki, Peter; Heller, Mario; Kornfeind, Philipp

    2009-10-01

    Ubiquitous (pervasive) computing is a term for a synergetic use of sensing, communication and computing. Pervasive use of computing has seen a rapid increase in the current decade. This development has propagated in applied sport science and everyday life. The work presents a survey of recent developments in sport and leisure with emphasis on technology and computational techniques. A detailed analysis of new technological developments is performed. Sensors for position and motion detection, as well as sensors for equipment and physiological monitoring, are discussed. Aspects of novel trends in communication technologies and data processing are outlined. Computational advancements have started a new trend - development of smart and intelligent systems for a wide range of applications - from model-based posture recognition to context awareness algorithms for nutrition monitoring. Examples particular to coaching and training are discussed. Selected tools for monitoring rules' compliance and automatic decision-making are outlined. Finally, applications in leisure and entertainment are presented, from systems supporting physical activity to systems providing motivation. It is concluded that the emphasis in future will shift from technologies to intelligent systems that allow for enhanced social interaction, as efforts need to be made to improve user-friendliness and standardisation of measurement and transmission protocols. PMID:19764000

  5. A grid computing infrastructure for MEG data analysis.

    PubMed

    Nakagawa, S; Kosaka, T; Date, S; Shimojo, S; Tonoike, M

    2004-01-01

    Magnetoencephalography (MEG) is widely used for studying brain functions, but clinical applications of MEG have been less prevalent. One reason is that only clinicians who have highly specialized knowledge can use MEG diagnostically, and such clinicians are found at only a few major hospitals. Another reason is that MEG data analysis is getting more and more complicated, deals with a large amount of data, and thus requires high-performance computing. These problems can be solved by the collaboration of human and computing resources distributed in multiple facilities. A new computing infrastructure for brain scientists and clinicians in distant locations was therefore developed using Grid technology, which provides virtual computing environments composed of geographically distributed computers and experimental devices. A prototype system connecting an MEG system at the AIST in Japan, a Grid environment composed of PC clusters at Osaka University in Japan and Nanyang Technological University in Singapore, and user terminals in Baltimore was developed. MEG data measured at the AIST were transferred in real time through a 1-GB/s network to the PC clusters for processing by a wavelet cross-correlation method, and then monitored in Baltimore. The current system is the basic model for remote access to MEG equipment and high-speed processing of MEG data. PMID:16012700

  6. A Research Roadmap for Computation-Based Human Reliability Analysis

    SciTech Connect

    Boring, Ronald; Mandelli, Diego; Joe, Jeffrey; Smith, Curtis; Groth, Katrina

    2015-08-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermal-hydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  7. The 3-dimensional, 4-channel model of human visual sensitivity to grayscale scrambles.

    PubMed

    Silva, Andrew E; Chubb, Charles

    2014-08-01

    Previous research supports the claim that human vision has three dimensions of sensitivity to grayscale scrambles (textures composed of randomly scrambled mixtures of different grayscales). However, the preattentive mechanisms (called here "field-capture channels") that confer this sensitivity remain obscure. The current experiments sought to characterize the specific field-capture channels that confer this sensitivity using a task in which the participant is required to detect the location of a small patch of one type of grayscale scramble in an extended background of another type. Analysis of the results supports the existence of four field-capture channels: (1) the (previously characterized) "blackshot" channel, sharply tuned to the blackest grayscales; (2) a (previously unknown) "gray-tuned" field-capture channel whose sensitivity is zero for black rising sharply to maximum sensitivity for grayscales slightly darker than mid-gray then decreasing to half-height for brighter grayscales; (3) an "up-ramped" channel whose sensitivity is zero for black, increases linearly with increasing grayscale reaching a maximum near white; (4) a (complementary) "down-ramped" channel whose sensitivity is maximal for black, decreases linearly reaching a minimum near white. The sensitivity functions of field-capture channels (3) and (4) are linearly dependent; thus, these four field-capture channels collectively confer sensitivity to a 3-dimensional space of histogram variations. PMID:24932891
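
    The closing claim, that two linearly dependent sensitivity functions leave a 3-dimensional space, can be checked numerically. The channel shapes below are schematic stand-ins, not the fitted functions from the study, and the mean subtraction reflects the usual convention that sensitivity functions matter only up to an additive constant because scramble histograms sum to one:

```python
# Schematic check that four channel sensitivity functions, two of which
# are linearly dependent, span only a 3-dimensional space of histogram
# variations. The shapes are stand-ins, not the study's fitted curves.
import numpy as np

g = np.linspace(0.0, 1.0, 101)                  # grayscale: black..white
blackshot = np.exp(-g / 0.05)                   # sharply tuned to black
gray_tuned = np.exp(-((g - 0.45) / 0.15) ** 2)  # peaks just below mid-gray
up_ramp = g                                     # zero at black, max at white
down_ramp = 1.0 - g                             # max at black, min at white

M = np.vstack([blackshot, gray_tuned, up_ramp, down_ramp])
# Histogram perturbations sum to zero, so only the mean-zero part of
# each sensitivity function matters; remove the mean before ranking.
Mc = M - M.mean(axis=1, keepdims=True)
print(np.linalg.matrix_rank(Mc))  # 3: the centered ramps are negatives
```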

  8. Computational Aerodynamic Analysis of Offshore Upwind and Downwind Turbines

    DOE PAGESBeta

    Zhao, Qiuying; Sheng, Chunhua; Afjeh, Abdollah

    2014-01-01

    Aerodynamic interactions of the model NREL 5 MW offshore horizontal axis wind turbines (HAWT) are investigated using a high-fidelity computational fluid dynamics (CFD) analysis. Four wind turbine configurations are considered: three-bladed upwind and downwind and two-bladed upwind and downwind configurations, which operate at two different rotor speeds of 12.1 and 16 RPM. In the present study, both steady and unsteady aerodynamic loads, such as the rotor torque, blade hub bending moment, and tower base bending moment, are evaluated in detail to provide an overall assessment of the different wind turbine configurations. Aerodynamic interactions between the rotor and tower are analyzed, including the rotor wake development downstream. The computational analysis provides insight into the aerodynamic performance of the upwind and downwind, two- and three-bladed horizontal axis wind turbines.

  9. Advanced computational tools for 3-D seismic analysis

    SciTech Connect

    Barhen, J.; Glover, C.W.; Protopopescu, V.A.

    1996-06-01

    The global objective of this effort is to develop advanced computational tools for 3-D seismic analysis, and test the products using a model dataset developed under the joint aegis of the United States' Society of Exploration Geophysicists (SEG) and the European Association of Exploration Geophysicists (EAEG). The goal is to enhance the value to the oil industry of the SEG/EAEG modeling project, carried out with US Department of Energy (DOE) funding in FY 93-95. The primary objective of the ORNL Center for Engineering Systems Advanced Research (CESAR) is to spearhead the computational innovations that would enable a revolutionary advance in 3-D seismic analysis. The CESAR effort is carried out in collaboration with world-class domain experts from leading universities, and in close coordination with other national laboratories and oil industry partners.

  10. Computational methods for efficient structural reliability and reliability sensitivity analysis

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.

    1993-01-01

    This paper presents recent developments in efficient structural reliability analysis methods. The paper proposes an efficient, adaptive importance sampling (AIS) method that can be used to compute reliability and reliability sensitivities. The AIS approach uses a sampling density that is proportional to the joint PDF of the random variables. Starting from an initial approximate failure domain, sampling proceeds adaptively and incrementally with the goal of reaching a sampling domain that is slightly greater than the failure domain to minimize over-sampling in the safe region. Several reliability sensitivity coefficients are proposed that can be computed directly and easily from the above AIS-based failure points. These probability sensitivities can be used for identifying key random variables and for adjusting design to achieve reliability-based objectives. The proposed AIS methodology is demonstrated using a turbine blade reliability analysis problem.
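
    The identity at the core of importance sampling, P(failure) = E_h[I(fail) p(x)/h(x)] for a sampling density h concentrated near the failure domain, can be demonstrated with a one-variable toy problem. This is a single-stage sketch under invented assumptions (a standard normal variable failing beyond beta = 3); the adaptive domain-growing step of the AIS method itself is not reproduced:

```python
# Minimal importance-sampling sketch of a failure-probability estimate;
# the adaptive refinement of the actual AIS method is not shown.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
beta = 3.0                      # failure when a standard normal x > beta

def p(x):                       # true density: standard normal
    return np.exp(-0.5 * x**2) / np.sqrt(2 * np.pi)

def h(x):                       # sampling density shifted onto the
    return np.exp(-0.5 * (x - beta)**2) / np.sqrt(2 * np.pi)  # failure region

n = 20_000
x = rng.normal(beta, 1.0, size=n)     # draw from h
w = p(x) / h(x)                       # likelihood ratios
pf = np.mean((x > beta) * w)          # E_h[I(fail) p/h] = P(failure)

print(f"IS estimate {pf:.2e}  exact {norm.sf(beta):.2e}")
```

    Sampling directly from p would waste almost every draw in the safe region; shifting the density onto the failure domain is what makes the rare-event estimate cheap, and the adaptive variant grows the sampling domain incrementally until it just covers the failure domain.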

  11. Vector Field Visual Data Analysis Technologies for Petascale Computational Science

    SciTech Connect

    Garth, Christoph; Deines, Eduard; Joy, Kenneth I.; Bethel, E. Wes; Childs, Hank; Weber, Gunther; Ahern, Sean; Pugmire, Dave; Sanderson, Allen; Johnson, Chris

    2009-11-13

    State-of-the-art computational science simulations generate large-scale vector field data sets. Visualization and analysis are a key aspect of obtaining insight into these data sets and represent an important challenge. This article discusses possibilities and challenges of modern vector field visualization and focuses on methods and techniques developed in the SciDAC Visualization and Analytics Center for Enabling Technologies (VACET) and deployed in the open-source visualization tool, VisIt.

  12. Analysis of guidance law performance using personal computers

    NASA Technical Reports Server (NTRS)

    Barrios, J. Rene

    1990-01-01

    A point mass, three-degree of freedom model is presented as a basic development tool for PC based simulation models. The model has been used in the development of guidance algorithms as well as in other applications such as performance management systems to compute optimal speeds. Its limitations and advantages are discussed with regard to the windshear environment. A method for simulating a simple autopilot is explained in detail and applied in the analysis of different guidance laws.

  13. Modeling and analysis of the spread of computer virus

    NASA Astrophysics Data System (ADS)

    Zhu, Qingyi; Yang, Xiaofan; Ren, Jianguo

    2012-12-01

    Based on a set of reasonable assumptions, we propose a novel dynamical model describing the spread of computer virus. Through qualitative analysis, we give a threshold and prove that (1) the infection-free equilibrium is globally asymptotically stable if the threshold is less than one, implying that the virus would eventually die out, and (2) the infection equilibrium is globally asymptotically stable if the threshold is greater than one. Two numerical examples are presented to demonstrate the analytical results.
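
    The threshold result has the familiar form of an epidemic threshold R0. The sketch below uses a textbook SIS-style compartment model, not the authors' specific equations, to show the two regimes the abstract proves: the infection dies out when the threshold is below one and settles at an endemic level when it is above one:

```python
# Generic SIS-style model with threshold R0 = beta/gamma, illustrating
# the two regimes; these are textbook equations, not the paper's model.
import numpy as np
from scipy.integrate import odeint

def sis(y, t, beta, gamma):
    s, i = y
    return [-beta * s * i + gamma * i,   # susceptible computers
            beta * s * i - gamma * i]    # infected computers

t = np.linspace(0.0, 200.0, 2001)
for beta, gamma in [(0.2, 0.3), (0.5, 0.3)]:   # R0 < 1, then R0 > 1
    s, i = odeint(sis, [0.99, 0.01], t, args=(beta, gamma)).T
    print(f"R0 = {beta / gamma:.2f}: final infected fraction {i[-1]:.4f}")
# R0 < 1 drives infections to zero (the virus dies out); R0 > 1 settles
# at the endemic equilibrium i* = 1 - gamma/beta.
```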

  14. Variance analysis. Part II, The use of computers.

    PubMed

    Finkler, S A

    1991-09-01

    This is the second in a two-part series on variance analysis. In the first article (JONA, July/August 1991), the author discussed flexible budgeting, including the calculation of price, quantity, volume, and acuity variances. In this second article, the author focuses on the use of computers by nurse managers to aid in the process of calculating, understanding, and justifying variances. PMID:1919788

  15. Computational Methods for Failure Analysis and Life Prediction

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K. (Compiler); Harris, Charles E. (Compiler); Housner, Jerrold M. (Compiler); Hopkins, Dale A. (Compiler)

    1993-01-01

    This conference publication contains the presentations and discussions from the joint UVA/NASA Workshop on Computational Methods for Failure Analysis and Life Prediction held at NASA Langley Research Center, 14-15 Oct. 1992. The presentations focused on damage, failure, and life predictions of polymer-matrix composite structures. They covered some of the research activities at NASA Langley, NASA Lewis, Southwest Research Institute, industry, and universities. Both airframes and propulsion systems were considered.

  16. Computed Tomography Inspection and Analysis for Additive Manufacturing Components

    NASA Technical Reports Server (NTRS)

    Beshears, Ronald D.

    2016-01-01

    Computed tomography (CT) inspection was performed on test articles additively manufactured from metallic materials. Metallic AM and machined wrought alloy test articles with programmed flaws were inspected using a 2 MeV linear accelerator based CT system. Performance of CT inspection on identically configured wrought and AM components and programmed flaws was assessed using standard image analysis techniques to determine the impact of additive manufacturing on inspectability of objects with complex geometries.

  17. Integration of 3-dimensional surgical and orthodontic technologies with orthognathic "surgery-first" approach in the management of unilateral condylar hyperplasia.

    PubMed

    Janakiraman, Nandakumar; Feinberg, Mark; Vishwanath, Meenakshi; Nalaka Jayaratne, Yasas Shri; Steinbacher, Derek M; Nanda, Ravindra; Uribe, Flavio

    2015-12-01

    Recent innovations in technology and techniques in both surgical and orthodontic fields can be integrated, especially when treating subjects with facial asymmetry. In this article, we present a treatment method consisting of 3-dimensional computer-aided surgical and orthodontic planning, which was implemented with the orthognathic surgery-first approach. Virtual surgical planning, fabrication of surgical splints using the computer-aided design/computer-aided manufacturing technique, and prediction of final orthodontic occlusion using virtual planning with robotically assisted customized archwires were integrated for this patient. Excellent esthetic and occlusal outcomes were obtained in a short period of 5.5 months. PMID:26672712

  18. Spacelab data analysis using the space plasma computer analysis network (SCAN) system

    NASA Technical Reports Server (NTRS)

    Green, J. L.

    1984-01-01

    The Space-plasma Computer Analysis Network (SCAN) currently connects a large number of U.S. Spacelab investigators into a common computer network. Used primarily by plasma physics researchers at present, SCAN provides access to Spacelab investigators in other areas of space science, to Spacelab and non-Spacelab correlative data bases, and to large Class VI computational facilities for modeling. SCAN links computers together at remote institutions used by space researchers, utilizing commercially available software for computer-to-computer communications. Started by NASA's Office of Space Science in mid-1980, SCAN presently contains ten system nodes located at major universities and space research laboratories, with fourteen new nodes projected for the near future. The Stanford University computer gateways allow SCAN users to connect to the ARPANET and TELENET overseas networks.

  19. Shielding analysis methods available in the SCALE computational system

    SciTech Connect

    Parks, C.V.; Tang, J.S.; Hermann, O.W.; Bucholz, J.A.; Emmett, M.B.

    1986-01-01

    Computational tools have been included in the SCALE system to allow shielding analysis to be performed using both discrete-ordinates and Monte Carlo techniques. One-dimensional discrete ordinates analyses are performed with the XSDRNPM-S module, and point dose rates outside the shield are calculated with the XSDOSE module. Multidimensional analyses are performed with the MORSE-SGC/S Monte Carlo module. This paper will review the above modules and the four Shielding Analysis Sequences (SAS) developed for the SCALE system. 7 refs., 8 figs.

  20. Advances in Computational Stability Analysis of Composite Aerospace Structures

    SciTech Connect

    Degenhardt, R.; Araujo, F. C. de

    2010-09-30

    The European aircraft industry demands reduced development and operating costs. Structural weight reduction by exploitation of structural reserves in composite aerospace structures contributes to this aim; however, it requires accurate and experimentally validated stability analysis of real structures under realistic loading conditions. This paper presents different advances from the area of computational stability analysis of composite aerospace structures which contribute to that field. For stringer-stiffened panels, main results of the finished EU project COCOMAT are given. It investigated the exploitation of reserves in primary fibre composite fuselage structures through an accurate and reliable simulation of postbuckling and collapse. For unstiffened cylindrical composite shells, a proposal for a new design method is presented.

  1. Structural mode significance using INCA. [Interactive Controls Analysis computer program

    NASA Technical Reports Server (NTRS)

    Bauer, Frank H.; Downing, John P.; Thorpe, Christopher J.

    1990-01-01

    Structural finite element models are often too large to be used in the design and analysis of control systems. Model reduction techniques must be applied to reduce the structural model to manageable size. In the past, engineers either performed the model order reduction by hand or used distinct computer programs to retrieve the data, to perform the significance analysis and to reduce the order of the model. To expedite this process, the latest version of INCA has been expanded to include an interactive graphical structural mode significance and model order reduction capability.

  2. Computational Tools for the Secondary Analysis of Metabolomics Experiments

    PubMed Central

    Booth, Sean C.; Weljie, Aalim M.; Turner, Raymond J.

    2013-01-01

    Metabolomics experiments have become commonplace in a wide variety of disciplines. By identifying and quantifying metabolites, researchers can achieve a systems-level understanding of metabolism. These studies produce vast swaths of data which are often only lightly interpreted due to the overwhelmingly large number of variables that are measured. Recently, a number of computational tools have been developed which enable much deeper analysis of metabolomics data. These data have been difficult to interpret as understanding the connections between dozens of altered metabolites has often relied on the biochemical knowledge of researchers and their speculations. Modern biochemical databases provide information about the interconnectivity of metabolism which can be automatically polled using metabolomics secondary analysis tools. Starting with lists of altered metabolites, there are two main types of analysis: enrichment analysis computes which metabolic pathways have been significantly altered, whereas metabolite mapping contextualizes the abundances and significances of measured metabolites into network visualizations. Many different tools have been developed for one or both of these applications. In this review, the functionality and use of this software are discussed. Together these novel secondary analysis tools will enable metabolomics researchers to plumb the depths of their data and produce farther-reaching biological conclusions than ever before. PMID:24688685

  3. Computer assessment of interview data using latent semantic analysis.

    PubMed

    Dam, Gregory; Kaufmann, Stefan

    2008-02-01

    Clinical interviews are a powerful method for assessing students' knowledge and conceptual development. However, the analysis of the resulting data is time-consuming and can create a "bottleneck" in large-scale studies. This article demonstrates the utility of computational methods in supporting such an analysis. Thirty-four 7th-grade student explanations of the causes of Earth's seasons were assessed using latent semantic analysis (LSA). Analyses were performed on transcriptions of student responses during interviews administered prior to (n = 21) and after (n = 13) receiving earth science instruction. An instrument that uses LSA technology was developed to identify misconceptions and assess conceptual change in students' thinking. Its accuracy, as determined by comparing its classifications to the independent coding performed by four human raters, reached 90%. Techniques for adapting LSA technology to support the analysis of interview data, as well as some limitations, are discussed. PMID:18411522
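
    The generic LSA pipeline behind such an instrument, a term-document matrix reduced by truncated SVD with student responses scored by cosine similarity to reference texts, can be sketched with scikit-learn. The corpus, rank, and scoring scheme below are invented for illustration and are not those of the study:

```python
# Sketch of a generic LSA pipeline: tf-idf term-document matrix,
# truncated SVD, cosine similarity to reference explanations. The texts
# and rank are toy stand-ins, not the study's corpus or settings.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD
from sklearn.metrics.pairwise import cosine_similarity

references = [
    "the tilt of the earth axis causes the seasons",   # target concept
    "the earth is closer to the sun in the summer",    # misconception
]
student = ["seasons happen because the axis of the earth is tilted"]

corpus = references + student
tfidf = TfidfVectorizer().fit_transform(corpus)
lsa = TruncatedSVD(n_components=2, random_state=0).fit_transform(tfidf)

# Compare the student response against both references in LSA space.
sims = cosine_similarity(lsa[-1:], lsa[:-1])[0]
label = "tilt concept" if sims[0] > sims[1] else "distance misconception"
print(sims, "->", label)
```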

  4. Three-dimensional transonic potential flow about complex 3-dimensional configurations

    NASA Technical Reports Server (NTRS)

    Reyhner, T. A.

    1984-01-01

    An analysis has been developed and a computer code written to predict three-dimensional subsonic or transonic potential flow fields about lifting or nonlifting configurations. Possible configurations include inlets, nacelles, nacelles with ground planes, S-ducts, turboprop nacelles, wings, and wing-pylon-nacelle combinations. The solution of the full partial differential equation for compressible potential flow written in terms of a velocity potential is obtained using finite differences, line relaxation, and multigrid. The analysis uses either a cylindrical or Cartesian coordinate system. The computational mesh is not body fitted. The analysis has been programmed in FORTRAN for both the CDC CYBER 203 and the CRAY-1 computers. Comparisons of computed results with experimental measurements are presented. Descriptions of the program input and output formats are included.

  5. New Mexico district work-effort analysis computer program

    USGS Publications Warehouse

    Hiss, W.L.; Trantolo, A.P.; Sparks, J.L.

    1972-01-01

    The computer program (CAN 2) described in this report is one of several related programs used in the New Mexico District cost-analysis system. The work-effort information used in these programs is accumulated and entered to the nearest hour on forms completed by each employee. Tabulating cards are punched directly from these forms after visual examinations for errors are made. Reports containing detailed work-effort data itemized by employee within each project and account and by account and project for each employee are prepared for both current-month and year-to-date periods by the CAN 2 computer program. An option allowing preparation of reports for a specified 3-month period is provided. The total number of hours worked on each account and project and a grand total of hours worked in the New Mexico District is computed and presented in a summary report for each period. Work effort not chargeable directly to individual projects or accounts is considered as overhead and can be apportioned to the individual accounts and projects on the basis of the ratio of the total hours of work effort for the individual accounts or projects to the total New Mexico District work effort at the option of the user. The hours of work performed by a particular section, such as General Investigations or Surface Water, are prorated and charged to the projects or accounts within the particular section. A number of surveillance or buffer accounts are employed to account for the hours worked on special events or on those parts of large projects or accounts that require a more detailed analysis. Any part of the New Mexico District operation can be separated and analyzed in detail by establishing an appropriate buffer account. With the exception of statements associated with word size, the computer program is written in FORTRAN IV in a relatively low and standard language level to facilitate its use on different digital computers. The program has been run only on a Control Data Corporation

  6. PREWATE: An interactive preprocessing computer code to the Weight Analysis of Turbine Engines (WATE) computer code

    NASA Technical Reports Server (NTRS)

    Fishbach, L. H.

    1983-01-01

    The Weight Analysis of Turbine Engines (WATE) computer code was developed by Boeing under contract to NASA Lewis. It was designed to function as an adjunct to the Navy/NASA Engine Program (NNEP). NNEP calculates the design and off-design thrust and sfc performance of user-defined engine cycles. The thermodynamic parameters throughout the engine as generated by NNEP are then combined with input parameters defining the component characteristics in WATE to calculate the bare engine weight of this user-defined engine. Preprocessor programs for NNEP were previously developed to simplify the task of creating input datasets. This report describes a similar preprocessor for the WATE code.

  7. Computational analysis of signaling patterns in single cells

    PubMed Central

    Davis, Denise M.; Purvis, Jeremy E.

    2014-01-01

    Signaling proteins are flexible in both form and function. They can bind to multiple molecular partners and integrate diverse types of cellular information. When imaged by time-lapse microscopy, many signaling proteins show complex patterns of activity or localization that vary from cell to cell. This heterogeneity is so prevalent that it has spurred the development of new computational strategies to analyze single-cell signaling patterns. A collective observation from these analyses is that cells appear less heterogeneous when their responses are normalized to, or synchronized with, other single-cell measurements. In many cases, these transformed signaling patterns show distinct dynamical trends that correspond with predictable phenotypic outcomes. When signaling mechanisms are unclear, computational models can suggest putative molecular interactions that are experimentally testable. Thus, computational analysis of single-cell signaling has not only provided new ways to quantify the responses of individual cells, but has helped resolve longstanding questions surrounding many well-studied human signaling proteins including NF-κB, p53, ERK1/2, and CDK2. A number of specific challenges lie ahead for single-cell analysis such as quantifying the contribution of non-cell autonomous signaling as well as the characterization of protein signaling dynamics in vivo. PMID:25263011

  8. A Computational Clonal Analysis of the Developing Mouse Limb Bud

    PubMed Central

    Marcon, Luciano; Arqués, Carlos G.; Torres, Miguel S.; Sharpe, James

    2011-01-01

    A comprehensive spatio-temporal description of the tissue movements underlying organogenesis would be an extremely useful resource to developmental biology. Clonal analysis and fate mappings are popular experiments to study tissue movement during morphogenesis. Such experiments allow cell populations to be labeled at an early stage of development and to follow their spatial evolution over time. However, disentangling the cumulative effects of the multiple events responsible for the expansion of the labeled cell population is not always straightforward. To overcome this problem, we develop a novel computational method that combines accurate quantification of 2D limb bud morphologies and growth modeling to analyze mouse clonal data of early limb development. Firstly, we explore various tissue movements that match experimental limb bud shape changes. Secondly, by comparing computational clones with newly generated mouse clonal data we are able to choose and characterize the tissue movement map that better matches experimental data. Our computational analysis produces for the first time a two dimensional model of limb growth based on experimental data that can be used to better characterize limb tissue movement in space and time. The model shows that the distribution and shapes of clones can be described as a combination of anisotropic growth with isotropic cell mixing, without the need for lineage compartmentalization along the AP and PD axes. Lastly, we show that this comprehensive description can be used to reassess spatio-temporal gene regulations taking tissue movement into account and to investigate PD patterning hypotheses. PMID:21347315

  9. Use of 3-Dimensional Volumetric Modeling of Adrenal Gland Size in Patients with Primary Pigmented Nodular Adrenocortical Disease.

    PubMed

    Chrysostomou, P P; Lodish, M B; Turkbey, E B; Papadakis, G Z; Stratakis, C A

    2016-04-01

    Primary pigmented nodular adrenocortical disease (PPNAD) is a rare type of bilateral adrenal hyperplasia leading to hypercortisolemia. Adrenal nodularity is often appreciable with computed tomography (CT); however, accurate radiologic characterization of adrenal size in PPNAD has not been studied well. We used 3-dimensional (3D) volumetric analysis to characterize and compare adrenal size in PPNAD patients, with and without Cushing's syndrome (CS). Patients diagnosed with PPNAD and their family members with known mutations in PRKAR1A were screened. CT scans were used to create 3D models of each adrenal. Criteria for biochemical diagnosis of CS included loss of diurnal variation and/or elevated midnight cortisol levels, and paradoxical increase in urinary free cortisol and/or urinary 17-hydroxysteroids after dexamethasone administration. Forty-five patients with PPNAD (24 females, 27.8±17.6 years) and 8 controls (19±3 years) were evaluated. 3D volumetric modeling of adrenal glands was performed in all. Thirty-eight patients out of 45 (84.4%) had CS. Their mean adrenal volume was 8.1 cc±4.1, 7.2 cc±4.5 (p=0.643) for non-CS, and 8.0 cc±1.6 for controls. Mean values were corrected for body surface area: 4.7 cc/kg/m²±2.2 for CS, and 3.9 cc/kg/m²±1.3 for non-CS (p=0.189). Adrenal volume and midnight cortisol in both groups were positively correlated, r=0.35, p=0.03. We conclude that adrenal volume measured by 3D CT in patients with PPNAD and CS was similar to those without CS, confirming empirical CT imaging-based observations. However, the association between adrenal volume and midnight cortisol levels may be used as a marker of who among patients with PPNAD may develop CS, something that routine CT cannot do. PMID:27065461

  10. Computer image analysis of etched tracks from ionizing radiation

    NASA Technical Reports Server (NTRS)

    Blanford, George E.

    1994-01-01

    I proposed to continue a cooperative research project with Dr. David S. McKay concerning image analysis of tracks. Last summer we showed that we could measure track densities using the Oxford Instruments eXL computer and software that is attached to an ISI scanning electron microscope (SEM) located in building 31 at JSC. To reduce the dependence on JSC equipment, we proposed to transfer the SEM images to UHCL for analysis. Last summer we developed techniques to use digitized scanning electron micrographs and computer image analysis programs to measure track densities in lunar soil grains. Tracks were formed by highly ionizing solar energetic particles and cosmic rays during near surface exposure on the Moon. The track densities are related to the exposure conditions (depth and time). Distributions of the number of grains as a function of their track densities can reveal the modality of soil maturation. As part of a consortium effort to better understand the maturation of lunar soil and its relation to its infrared reflectance properties, we worked on lunar samples 67701,205 and 61221,134. These samples were etched for a shorter time (6 hours) than last summer's sample and this difference has presented problems for establishing the correct analysis conditions. We used computer counting and measurement of area to obtain preliminary track densities and a track density distribution that we could interpret for sample 67701,205. This sample is a submature soil consisting of approximately 85 percent mature soil mixed with approximately 15 percent immature, but not pristine, soil.

  11. Analysis of sponge zones for computational fluid mechanics

    SciTech Connect

    Bodony, Daniel J. E-mail: bodony@stanford.edu

    2006-03-01

    The use of sponge regions, or sponge zones, which add the forcing term -σ(q - q_ref) to the right-hand-side of the governing equations in computational fluid mechanics as an ad hoc boundary treatment is widespread. They are used to absorb and minimize reflections from computational boundaries and as forcing sponges to introduce prescribed disturbances into a calculation. A less common usage is as a means of extending a calculation from a smaller domain into a larger one, such as in computing the far-field sound generated in a localized region. By analogy to the penalty method of finite elements, the method is placed on a solid foundation, complete with estimates of convergence. The analysis generalizes the work of Israeli and Orszag [M. Israeli, S.A. Orszag, Approximation of radiation boundary conditions, J. Comp. Phys. 41 (1981) 115-135] and confirms their findings when applied as a special case to one-dimensional wave propagation in an absorbing sponge. It is found that the rate of convergence of the actual solution to the target solution, with an appropriate norm, is inversely proportional to the sponge strength. A detailed analysis for acoustic wave propagation in one-dimension verifies the convergence rate given by the general theory. The exponential point-wise convergence derived by Israeli and Orszag in the high-frequency limit is recovered and found to hold over all frequencies. A weakly nonlinear analysis of the method when applied to Burgers' equation shows similar convergence properties. Three numerical examples are given to confirm the analysis: the acoustic extension of a two-dimensional time-harmonic point source, the acoustic extension of a three-dimensional initial-value problem of a sound pulse, and the introduction of unstable eigenmodes from linear stability theory into a two-dimensional shear layer.
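
    The damping behavior of the forcing term is easy to reproduce in one dimension. The toy discretization below (first-order upwind advection with an invented quadratic sponge ramp) is not taken from the paper; it simply shows a pulse being absorbed as it crosses the σ region before reaching the boundary:

```python
# Toy 1-D advection with a sponge zone near the outflow boundary: the
# forcing -sigma(x) * (q - q_ref) damps the pulse before it reaches the
# edge. The discretization and sponge profile are illustrative only.
import numpy as np

nx, L, c = 400, 10.0, 1.0
x = np.linspace(0.0, L, nx)
dx = x[1] - x[0]
dt = 0.5 * dx / c                    # CFL-stable step for upwinding

q_ref = 0.0
sigma = np.where(x > 8.0, 50.0 * ((x - 8.0) / 2.0) ** 2, 0.0)
q = np.exp(-((x - 2.0) / 0.3) ** 2)  # initial pulse, far from the sponge

for _ in range(1200):
    dqdx = np.zeros_like(q)
    dqdx[1:] = (q[1:] - q[:-1]) / dx          # first-order upwind (c > 0)
    q = q - dt * (c * dqdx + sigma * (q - q_ref))

print(f"max |q| after the pulse crosses the sponge: {np.abs(q).max():.2e}")
```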

  12. Novel Radiobiological Gamma Index for Evaluation of 3-Dimensional Predicted Dose Distribution

    SciTech Connect

    Sumida, Iori; Yamaguchi, Hajime; Kizaki, Hisao; Aboshi, Keiko; Tsujii, Mari; Yoshikawa, Nobuhiko; Yamada, Yuji; Suzuki, Osamu; Seo, Yuji; Isohashi, Fumiaki; Yoshioka, Yasuo; Ogawa, Kazuhiko

    2015-07-15

    Purpose: To propose a gamma index-based dose evaluation index that integrates the radiobiological parameters of tumor control (TCP) and normal tissue complication probabilities (NTCP). Methods and Materials: Fifteen prostate and head and neck (H&N) cancer patients received intensity modulated radiation therapy. Before treatment, patient-specific quality assurance was conducted via beam-by-beam analysis, and beam-specific dose error distributions were generated. The predicted 3-dimensional (3D) dose distribution was calculated by back-projection of relative dose error distribution per beam. A 3D gamma analysis of different organs (prostate: clinical [CTV] and planned target volumes [PTV], rectum, bladder, femoral heads; H&N: gross tumor volume [GTV], CTV, spinal cord, brain stem, both parotids) was performed using predicted and planned dose distributions under 2%/2 mm tolerance and physical gamma passing rate was calculated. TCP and NTCP values were calculated for voxels with physical gamma indices (PGI) >1. We propose a new radiobiological gamma index (RGI) to quantify the radiobiological effects of TCP and NTCP and calculate radiobiological gamma passing rates. Results: The mean RGI gamma passing rates for prostate cases were significantly different compared with those of PGI (P<.03–.001). The mean RGI gamma passing rates for H&N cases (except for GTV) were significantly different compared with those of PGI (P<.001). Differences in gamma passing rates between PGI and RGI were due to dose differences between the planned and predicted dose distributions. Radiobiological gamma distribution was visualized to identify areas where the dose was radiobiologically important. Conclusions: RGI was proposed to integrate radiobiological effects into PGI. This index would assist physicians and medical physicists not only in physical evaluations of treatment delivery accuracy, but also in clinical evaluations of predicted dose distribution.
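
    The physical gamma index that the RGI builds on is itself a standard computation: each evaluated point searches nearby reference points for the best combined dose-difference and distance-to-agreement score. A minimal 1-D version under the abstract's 2%/2 mm tolerance is sketched below; the radiobiological TCP/NTCP weighting the article adds on top is not reproduced:

```python
# Minimal 1-D physical gamma index under a dose-difference (dd) and
# distance-to-agreement (dta) criterion; the article's radiobiological
# weighting via TCP/NTCP is not reproduced here.
import numpy as np

def gamma_1d(x, d_eval, d_ref, dd=0.02, dta=2.0):
    """Gamma at each evaluation point: the minimum over reference points
    of sqrt((distance/dta)^2 + (dose difference/dd)^2), with doses
    normalized to the reference maximum."""
    d_ref_n = d_ref / d_ref.max()
    d_eval_n = d_eval / d_ref.max()
    g = np.empty_like(d_eval_n)
    for i, (xi, di) in enumerate(zip(x, d_eval_n)):
        term = ((x - xi) / dta) ** 2 + ((d_ref_n - di) / dd) ** 2
        g[i] = np.sqrt(term.min())
    return g

x = np.linspace(0.0, 100.0, 201)            # positions in mm
d_ref = np.exp(-((x - 50.0) / 15.0) ** 2)   # planned profile
d_eval = np.exp(-((x - 51.0) / 15.0) ** 2)  # delivery shifted by 1 mm

g = gamma_1d(x, d_eval, d_ref)
print(f"gamma passing rate (2%/2 mm): {100 * np.mean(g <= 1.0):.1f}%")
```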

  13. CASKS (Computer Analysis of Storage Casks): A microcomputer-based analysis system for storage cask review

    SciTech Connect

    Chen, T.F.; Mok, G.C.; Carlson, R.W.

    1995-08-01

    CASKS is a microcomputer-based computer system developed by LLNL to assist the Nuclear Regulatory Commission in performing confirmatory analyses for licensing review of radioactive-material storage cask designs. The analysis programs of the CASKS computer system consist of four modules: the impact analysis module, the thermal analysis module, the thermally-induced stress analysis module, and the pressure-induced stress analysis module. CASKS uses a series of menus to coordinate input programs, cask analysis programs, output programs, data archive programs and databases, so the user is able to run the system in an interactive environment. This paper outlines the theoretical background of the impact analysis module and the yielding surface formulation. The close agreement between the CASKS analytical predictions and the results obtained from the two storage cask drop tests performed by SNL and by BNFL at Winfrith serves as validation of the CASKS impact analysis module.

  14. Automated procedure for sensitivity analysis using computer calculus

    SciTech Connect

    Oblow, E.M.

    1983-05-01

    An automated procedure for performing sensitivity analyses has been developed. The procedure uses a new FORTRAN compiler with computer calculus capabilities to generate the derivatives needed to set up sensitivity equations. The new compiler is called GRESS - Gradient Enhanced Software System. Application of the automated procedure with direct and adjoint sensitivity theory for the analysis of non-linear, iterative systems of equations is discussed. Calculational efficiency considerations and techniques for adjoint sensitivity analysis are emphasized. The new approach was found to preserve the traditional advantages of adjoint theory while removing the tedious human effort previously needed to apply this theoretical methodology. Conclusions are drawn about the applicability of the automated procedure in numerical analysis and large-scale modelling sensitivity studies.
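
    GRESS generates its derivatives by compiling them into the FORTRAN program itself; the underlying idea of propagating derivatives through arithmetic alongside values (forward-mode computer calculus) can be shown with a small dual-number class. This sketch illustrates the principle only, not GRESS's compiler machinery:

```python
# Dual-number sketch of forward-mode derivative propagation, the idea
# underlying computer-calculus compilers such as GRESS (which instead
# emits derivative code during FORTRAN compilation).
import math

class Dual:
    def __init__(self, val, der=0.0):
        self.val, self.der = val, der
    def __add__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val + other.val, self.der + other.der)
    __radd__ = __add__
    def __mul__(self, other):
        other = other if isinstance(other, Dual) else Dual(other)
        return Dual(self.val * other.val,
                    self.der * other.val + self.val * other.der)  # product rule
    __rmul__ = __mul__

def exp(d):
    return Dual(math.exp(d.val), math.exp(d.val) * d.der)  # chain rule

# Sensitivity of f(k) = k * exp(k) at k = 0.5: df/dk = (1 + k) * e^k.
k = Dual(0.5, 1.0)             # seed derivative dk/dk = 1
f = k * exp(k)
print(f.val, f.der)            # 0.8244..., 2.4731... = 1.5 * e^0.5
```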

  15. Recent applications of the transonic wing analysis computer code, TWING

    NASA Technical Reports Server (NTRS)

    Subramanian, N. R.; Holst, T. L.; Thomas, S. D.

    1982-01-01

    An evaluation of the transonic-wing-analysis computer code TWING is given. TWING utilizes a fully implicit approximate factorization iteration scheme to solve the full potential equation in conservative form. A numerical elliptic-solver grid-generation scheme is used to generate the required finite-difference mesh. Several wing configurations were analyzed, and the limits of applicability of this code were evaluated. Comparisons of computed results were made with available experimental data. Results indicate that the code is robust, accurate (when significant viscous effects are not present), and efficient. TWING generally produces solutions an order of magnitude faster than other conservative full potential codes using successive-line overrelaxation. The present method is applicable to a wide range of isolated wing configurations including high-aspect-ratio transport wings and low-aspect-ratio, high-sweep, fighter configurations.

  16. Computational methodology for ChIP-seq analysis

    PubMed Central

    Shin, Hyunjin; Liu, Tao; Duan, Xikun; Zhang, Yong; Liu, X. Shirley

    2015-01-01

    Chromatin immunoprecipitation coupled with massive parallel sequencing (ChIP-seq) is a powerful technology to identify the genome-wide locations of DNA binding proteins such as transcription factors or modified histones. As more and more experimental laboratories are adopting ChIP-seq to unravel the transcriptional and epigenetic regulatory mechanisms, computational analyses of ChIP-seq also become increasingly comprehensive and sophisticated. In this article, we review current computational methodology for ChIP-seq analysis, recommend useful algorithms and workflows, and introduce quality control measures at different analytical steps. We also discuss how ChIP-seq could be integrated with other types of genomic assays, such as gene expression profiling and genome-wide association studies, to provide a more comprehensive view of gene regulatory mechanisms in important physiological and pathological processes. PMID:25741452

  17. Numerical analysis of boosting scheme for scalable NMR quantum computation

    SciTech Connect

    SaiToh, Akira; Kitagawa, Masahiro

    2005-02-01

    Among initialization schemes for ensemble quantum computation beginning at thermal equilibrium, the scheme proposed by Schulman and Vazirani [in Proceedings of the 31st ACM Symposium on Theory of Computing (STOC'99) (ACM Press, New York, 1999), pp. 322-329] is known for the simple quantum circuit to redistribute the biases (polarizations) of qubits and small time complexity. However, our numerical simulation shows that the number of qubits initialized by the scheme is rather smaller than expected from the von Neumann entropy because of an increase in the sum of the binary entropies of individual qubits, which indicates a growth in the total classical correlation. This result--namely, that there is such a significant growth in the total binary entropy--disagrees with that of their analysis.

  18. Improvements to the FASTEX flutter analysis computer code

    NASA Technical Reports Server (NTRS)

    Taylor, Ronald F.

    1987-01-01

    Modifications to the FASTEX flutter analysis computer code (UDFASTEX) are described. The objectives were to increase the problem size capacity of FASTEX, reduce run times by modification of the modal interpolation procedure, and to add new user features. All modifications to the program are operable on the VAX 11/700 series computers under the VAX operating system. Interfaces were provided to aid in the inclusion of alternate aerodynamic and flutter eigenvalue calculations. Plots can be made of the flutter velocity, damping, and frequency data. A preliminary capability was also developed to plot contours of unsteady pressure amplitude and phase. The relevant equations of motion, modal interpolation procedures, and control system considerations are described and software developments are summarized. Additional information documenting input instructions, procedures, and details of the plate spline algorithm is found in the appendices.

  19. A Computational Approach for Probabilistic Analysis of Water Impact Simulations

    NASA Technical Reports Server (NTRS)

    Horta, Lucas G.; Mason, Brian H.; Lyle, Karen H.

    2009-01-01

    NASA's development of new concepts for the Crew Exploration Vehicle Orion presents many challenges similar to those worked in the sixties during the Apollo program. However, with improved modeling capabilities, new challenges arise. For example, the use of the commercial code LS-DYNA, although widely used and accepted in the technical community, often involves high-dimensional, time-consuming, and computationally intensive simulations. The challenge is to capture what is learned from a limited number of LS-DYNA simulations to develop models that allow users to conduct interpolation of solutions at a fraction of the computational time. This paper presents a description of the LS-DYNA model, a brief summary of the response surface techniques, the analysis of variance approach used in the sensitivity studies, equations used to estimate impact parameters, results showing conditions that might cause injuries, and concluding remarks.
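
    The response surface idea, fitting an inexpensive surrogate to a handful of costly simulations and interpolating everywhere else, can be sketched generically. The quadratic model, variables, and toy "simulation" below are invented stand-ins for the LS-DYNA water-impact responses, not the paper's actual model:

```python
# Sketch of a quadratic response surface fitted to a few "expensive"
# runs; the toy function stands in for LS-DYNA water-impact responses.
import numpy as np

rng = np.random.default_rng(2)

def expensive_sim(v, pitch):     # stand-in for one LS-DYNA run
    return 4.0 * v**2 + 2.0 * v * pitch + 0.5 * pitch**2 + rng.normal(0, 0.1)

# A small design of experiments over impact velocity and pitch angle.
V = rng.uniform(5.0, 15.0, 30)
P = rng.uniform(-10.0, 10.0, 30)
y = np.array([expensive_sim(v, p) for v, p in zip(V, P)])

# Quadratic basis and least-squares fit of the surface coefficients.
A = np.column_stack([np.ones_like(V), V, P, V**2, V * P, P**2])
coef, *_ = np.linalg.lstsq(A, y, rcond=None)

# Interpolate the response at an untried condition at negligible cost.
v0, p0 = 12.0, 3.0
pred = np.array([1.0, v0, p0, v0**2, v0 * p0, p0**2]) @ coef
print(f"surrogate prediction at (v={v0}, pitch={p0}): {pred:.1f}")
```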

  20. Computational chemistry in Argonne's Reactor Analysis Division

    SciTech Connect

    Gelbard, E.; Agrawal, R.; Fanning, T.

    1997-08-01

    Roughly 3 years ago work on Argonne's Integral Fast Reactor ("IFR") was terminated and at that time, ANL funding was redirected to a number of alternative programs. One such alternative was waste management and, since disposal of spent fuel from ANL's EBR-II reactor presents some special problems, this seemed an appropriate area for ANL work. Methods for the treatment and disposal of spent fuel (particularly from EBR-II but also from other sources) are now under very active investigation at ANL. The very large waste form development program is mainly experimental at this point, but within the Reactor Analysis ("RA") Division a small computational chemistry program is underway, designed to supplement the experimental program. One of the most popular proposals for the treatment of much of our high-level wastes is vitrification. As noted below, this approach has serious drawbacks for EBR-II spent fuel. ANL has proposed, instead, that spent fuel first be pretreated by a special metallurgical process which produces, as waste, chloride salts of the various fission products; these salts would then be adsorbed in zeolite A, which is subsequently bonded with glass to produce a waste form suitable for disposal. So far it has been the main mission of RA's computational chemistry program to study the process by which leaching occurs when the glass-bonded zeolite waste form is exposed to water. It is the purpose of this paper to describe RA's computational chemistry program, to discuss the computational techniques involved in such a program, and in general to familiarize the M. and C. Division with a computational area which is probably unfamiliar to most of its members. 11 refs., 2 figs.

  1. Construction of 3-Dimensional Printed Ultrasound Phantoms With Wall-less Vessels.

    PubMed

    Nikitichev, Daniil I; Barburas, Anamaria; McPherson, Kirstie; Mari, Jean-Martial; West, Simeon J; Desjardins, Adrien E

    2016-06-01

    Ultrasound phantoms are invaluable as training tools for vascular access procedures. We developed ultrasound phantoms with wall-less vessels using 3-dimensional printed chambers. Agar was used as a soft tissue-mimicking material, and the wall-less vessels were created with rods that were retracted after the agar was set. The chambers had integrated luer connectors to allow for fluid injections with clinical syringes. Several variations on this design are presented, which include branched and stenotic vessels. The results show that 3-dimensional printing can be well suited to the construction of wall-less ultrasound phantoms, with designs that can be readily customized and shared electronically. PMID:27162278

  2. 3-Dimensional Terraced NAND (3D TNAND) Flash Memory-Stacked Version of Folded NAND Array

    NASA Astrophysics Data System (ADS)

    Kim, Yoon; Cho, Seongjae; Lee, Gil Sung; Park, Il Han; Lee, Jong Duk; Shin, Hyungcheol; Park, Byung-Gook

    We propose a 3-dimensional terraced NAND flash memory. It has a vertical channel, so it is possible to make a long enough channel in a 1F² footprint. It also has a 3-dimensional structure whose channel is connected vertically along two stair-like steps, so we can obtain density as high as that of a stacked array structure without a silicon stacking process. We can make NAND flash memory with a 3F² cell size. Using SILVACO ATLAS simulation, we study the terraced NAND flash memory characteristics such as program, erase, and read. Also, its fabrication method is proposed.

  3. Magnetic topologies of coronal mass ejection events: Effects of 3-dimensional reconnection

    SciTech Connect

    Gosling, J.T.

    1995-09-01

New magnetic loops formed in the corona following coronal mass ejection (CME) liftoffs provide strong evidence that magnetic reconnection commonly occurs within the magnetic "legs" of the departing CMEs. Such reconnection is inherently 3-dimensional and naturally produces CMEs having magnetic flux rope topologies. Sustained reconnection behind CMEs can produce a mixture of open and disconnected field lines threading the CMEs. In contrast to the results of 2-dimensional reconnection, the disconnected field lines are attached to the outer heliosphere at both ends. A variety of solar and solar wind observations are consistent with the concept of sustained 3-dimensional reconnection within the magnetic legs of CMEs close to the Sun.

  4. Dosimetric Comparison Between 3-Dimensional Conformal and Robotic SBRT Treatment Plans for Accelerated Partial Breast Radiotherapy.

    PubMed

    Goggin, L M; Descovich, M; McGuinness, C; Shiao, S; Pouliot, J; Park, C

    2016-06-01

    Accelerated partial breast irradiation is an attractive alternative to conventional whole breast radiotherapy for selected patients. Recently, CyberKnife has emerged as a possible alternative to conventional techniques for accelerated partial breast irradiation. In this retrospective study, we present a dosimetric comparison between 3-dimensional conformal radiotherapy plans and CyberKnife plans using circular (Iris) and multi-leaf collimators. Nine patients who had undergone breast-conserving surgery followed by whole breast radiation were included in this retrospective study. The CyberKnife planning target volume (PTV) was defined as the lumpectomy cavity + 10 mm + 2 mm with prescription dose of 30 Gy in 5 fractions. Two sets of 3-dimensional conformal radiotherapy plans were created, one used the same definitions as described for CyberKnife and the second used the RTOG-0413 definition of the PTV: lumpectomy cavity + 15 mm + 10 mm with prescription dose of 38.5 Gy in 10 fractions. Using both PTV definitions allowed us to compare the dose delivery capabilities of each technology and to evaluate the advantage of CyberKnife tracking. For the dosimetric comparison using the same PTV margins, CyberKnife and 3-dimensional plans resulted in similar tumor coverage and dose to critical structures, with the exception of the lung V5%, which was significantly smaller for 3-dimensional conformal radiotherapy, 6.2% when compared to 39.4% for CyberKnife-Iris and 17.9% for CyberKnife-multi-leaf collimator. When the inability of 3-dimensional conformal radiotherapy to track motion is considered, the result increased to 25.6%. Both CyberKnife-Iris and CyberKnife-multi-leaf collimator plans demonstrated significantly lower average ipsilateral breast V50% (25.5% and 24.2%, respectively) than 3-dimensional conformal radiotherapy (56.2%). The CyberKnife plans were more conformal but less homogeneous than the 3-dimensional conformal radiotherapy plans. Approximately 50% shorter

  5. Computing the surveillance error grid analysis: procedure and examples.

    PubMed

    Kovatchev, Boris P; Wakeman, Christian A; Breton, Marc D; Kost, Gerald J; Louie, Richard F; Tran, Nam K; Klonoff, David C

    2014-07-01

    The surveillance error grid (SEG) analysis is a tool for analysis and visualization of blood glucose monitoring (BGM) errors, based on the opinions of 206 diabetes clinicians who rated 4 distinct treatment scenarios. Resulting from this large-scale inquiry is a matrix of 337 561 risk ratings, 1 for each pair of (reference, BGM) readings ranging from 20 to 580 mg/dl. The computation of the SEG is therefore complex and in need of automation. The SEG software introduced in this article automates the task of assigning a degree of risk to each data point for a set of measured and reference blood glucose values so that the data can be distributed into 8 risk zones. The software's 2 main purposes are to (1) distribute a set of BG Monitor data into 8 risk zones ranging from none to extreme and (2) present the data in a color coded display to promote visualization. Besides aggregating the data into 8 zones corresponding to levels of risk, the SEG computes the number and percentage of data pairs in each zone and the number/percentage of data pairs above/below the diagonal line in each zone, which are associated with BGM errors creating risks for hypo- or hyperglycemia, respectively. To illustrate the action of the SEG software we first present computer-simulated data stratified along error levels defined by ISO 15197:2013. This allows the SEG to be linked to this established standard. Further illustration of the SEG procedure is done with a series of previously published data, which reflect the performance of BGM devices and test strips under various environmental conditions. We conclude that the SEG software is a useful addition to the SEG analysis presented in this journal, developed to assess the magnitude of clinical risk from analytically inaccurate data in a variety of high-impact situations such as intensive care and disaster settings. PMID:25562887
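
    As a sketch of the zone-assignment step the software automates (in Python with NumPy): the published SEG uses the clinician-derived matrix of 337 561 risk ratings, so the risk_lookup placeholder below (a crude relative-difference surrogate) and the 0.5-wide bins on an assumed 0-4 risk scale are illustrative stand-ins, not the published values.

        import numpy as np

        def risk_lookup(ref, bgm):
            """Placeholder risk score in [0, 4]; a stand-in for the SEG risk matrix."""
            return np.clip(3.0 * np.abs(bgm - ref) / np.maximum(ref, 30.0), 0.0, 4.0)

        def seg_zone_counts(ref, bgm):
            """Bin (reference, monitor) pairs into 8 zones, zone 0 ('none') to zone 7 ('extreme')."""
            risk = risk_lookup(np.asarray(ref, float), np.asarray(bgm, float))
            idx = np.minimum((risk / 0.5).astype(int), 7)   # eight 0.5-wide bins
            counts = np.bincount(idx, minlength=8)
            return counts, 100.0 * counts / counts.sum()

        ref = np.array([100.0, 150.0, 60.0, 250.0])   # reference readings, mg/dl
        bgm = np.array([105.0, 120.0, 110.0, 240.0])  # monitor readings, mg/dl
        counts, percent = seg_zone_counts(ref, bgm)
        print(counts, percent)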

  6. Cost analysis for computer supported multiple-choice paper examinations

    PubMed Central

    Mandel, Alexander; Hörnlein, Alexander; Ifland, Marianus; Lüneburg, Edeltraud; Deckert, Jürgen; Puppe, Frank

    2011-01-01

Introduction: Multiple-choice examinations are still fundamental for assessment in medical degree programs. In addition to content-related research, the optimization of the technical procedure is an important question. Medical examiners face three options: paper-based examinations with or without computer support, or completely electronic examinations. Critical aspects are the effort for formatting, the logistic effort during the actual examination, the quality, promptness and effort of the correction, the time for making the documents available for inspection by the students, and the statistical analysis of the examination results. Methods: For the past three semesters, a computer program for the input and formatting of MC questions in medical and other paper-based examinations has been in use and continuously improved at Wuerzburg University. Eleven medical examinations were conducted with the program and automatically evaluated in the winter semester (WS) 2009/10, twelve in the summer semester (SS) 2010, and thirteen in WS 2010/11. For the last two semesters the remaining manual workload was recorded. Results: The effort for formatting and subsequent analysis, including adjustments of the analysis, of an average examination with about 140 participants and about 35 questions was 5-7 hours for exams without complications in WS 2009/10, about 2 hours in SS 2010, and about 1.5 hours in WS 2010/11. Including exams with complications, the average time was about 3 hours per exam in SS 2010 and 2.67 hours in WS 2010/11. Discussion: For conventional multiple-choice exams, computer-based formatting and evaluation of paper-based exams offers a significant time reduction for lecturers in comparison with manual correction of paper-based exams, and compared with purely electronically conducted exams it needs a much simpler technological infrastructure and fewer staff during the exam. PMID:22205913

  7. Computational analysis of endometrial photocoagulation with diffusing optical device

    PubMed Central

    Kwon, Jinhee; Lee, Chang-Yong; Oh, Junghwan; Kang, Hyun Wook

    2013-01-01

A balloon-catheter optical diffuser for endometrial treatment was evaluated with computational thermal analysis. Various catheter materials and dimensions were implemented to identify the optimal design for the device. The spatial and temporal development of temperature during 30-s irradiation with 532-nm light demonstrated the thermal insulation effect of polyurethane, with temperatures increasing up to 384 K and facilitating irreversible denaturation. The current model predicted a degree of thermal coagulation 13% thicker than experimental results, possibly because tissue dynamics and the light intensity distribution were not modeled. In combination with photon distribution, the analytical simulation can be a feasible tool to optimize the new optical diffuser for efficient and safe endometrial treatment. PMID:24298406

  8. Programming Probabilistic Structural Analysis for Parallel Processing Computer

    NASA Technical Reports Server (NTRS)

    Sues, Robert H.; Chen, Heh-Chyun; Twisdale, Lawrence A.; Chamis, Christos C.; Murthy, Pappu L. N.

    1991-01-01

    The ultimate goal of this research program is to make Probabilistic Structural Analysis (PSA) computationally efficient and hence practical for the design environment by achieving large scale parallelism. The paper identifies the multiple levels of parallelism in PSA, identifies methodologies for exploiting this parallelism, describes the development of a parallel stochastic finite element code, and presents results of two example applications. It is demonstrated that speeds within five percent of those theoretically possible can be achieved. A special-purpose numerical technique, the stochastic preconditioned conjugate gradient method, is also presented and demonstrated to be extremely efficient for certain classes of PSA problems.
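
    The stochastic preconditioned conjugate gradient method itself is not detailed in the abstract; as a baseline point of reference, a plain Jacobi-preconditioned conjugate gradient solver looks as follows (a minimal Python sketch; the stochastic variant is not reproduced here).

        import numpy as np

        def pcg(A, b, tol=1e-8, max_iter=1000):
            """Solve A x = b for symmetric positive definite A with Jacobi preconditioning."""
            M_inv = 1.0 / np.diag(A)          # diagonal (Jacobi) preconditioner
            x = np.zeros_like(b)
            r = b - A @ x
            z = M_inv * r
            p = z.copy()
            rz = r @ z
            for _ in range(max_iter):
                Ap = A @ p
                alpha = rz / (p @ Ap)
                x += alpha * p
                r -= alpha * Ap
                if np.linalg.norm(r) < tol:
                    break
                z = M_inv * r
                rz_new = r @ z
                p = z + (rz_new / rz) * p
                rz = rz_new
            return x

        # Small SPD test system (synthetic)
        rng = np.random.default_rng(0)
        B = rng.normal(size=(50, 50))
        A = B @ B.T + 50.0 * np.eye(50)
        b = rng.normal(size=50)
        x = pcg(A, b)
        print(np.linalg.norm(A @ x - b))   # residual should be ~1e-8 or smaller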

  9. Computer Tomography Analysis of Fastrac Composite Thrust Chamber Assemblies

    NASA Technical Reports Server (NTRS)

    Beshears, Ronald D.

    2000-01-01

Computed tomography (CT) inspection has been integrated into the production process for NASA's Fastrac composite thrust chamber assemblies (TCAs). CT has proven uniquely qualified to detect the known critical flaw for these nozzles: liner cracks adjacent to debonds between the liner and overwrap. CT is also being used as a process monitoring tool through analysis of low-density indications in the nozzle overwraps. 3D reconstruction of CT images to produce models of flawed areas is being used to give program engineers better insight into the location and nature of nozzle flaws.

  10. Computational analysis of liquid hypergolic propellant rocket engines

    NASA Technical Reports Server (NTRS)

    Krishnan, A.; Przekwas, A. J.; Gross, K. W.

    1992-01-01

The combustion process in liquid rocket engines depends on a number of complex phenomena such as atomization, vaporization, spray dynamics, mixing, and reaction mechanisms. A computational tool to study their mutual interactions is developed to help analyze these processes with a view to improving existing designs and optimizing future designs of the thrust chamber. The focus of the article is on the analysis of the Variable Thrust Engine for the Orbit Maneuvering Vehicle. This engine uses a hypergolic liquid bipropellant combination of monomethyl hydrazine as fuel and nitrogen tetroxide as oxidizer.

  11. Micro Computer Tomography for medical device and pharmaceutical packaging analysis.

    PubMed

    Hindelang, Florine; Zurbach, Raphael; Roggo, Yves

    2015-04-10

Biomedical device and medicinal product manufacturing are long processes facing global competition. As technology evolves with time, the level of quality, safety and reliability increases simultaneously. Micro Computer Tomography (Micro CT) is a tool allowing a deep investigation of products: it can contribute to quality improvement. This article presents the numerous applications of Micro CT for medical device and pharmaceutical packaging analysis. The samples investigated confirmed the suitability of micro CT for verifying integrity, taking measurements and detecting defects in a non-destructive manner. PMID:25710902

  12. RADTRAN 5: A computer code for transportation risk analysis

    SciTech Connect

    Neuhauser, K. S.; Kanipe, F. L.

    1991-01-01

RADTRAN 5 is a computer code developed at Sandia National Laboratories (SNL) in Albuquerque, NM, to estimate radiological and nonradiological risks of radioactive materials transportation. RADTRAN 5 is written in ANSI Standard FORTRAN 77 and contains significant advances in the methodology for route-specific analysis first developed by SNL for RADTRAN 4 (Neuhauser and Kanipe, 1992). Like the previous RADTRAN codes, RADTRAN 5 contains two major modules for incident-free and accident risk analysis, respectively. All commercially important transportation modes may be analyzed with RADTRAN 5: highway by combination truck; highway by light-duty vehicle; rail; barge; ocean-going ship; cargo air; and passenger air.

  13. Parameter estimation and error analysis in environmental modeling and computation

    NASA Technical Reports Server (NTRS)

    Kalmaz, E. E.

    1986-01-01

    A method for the estimation of parameters and error analysis in the development of nonlinear modeling for environmental impact assessment studies is presented. The modular computer program can interactively fit different nonlinear models to the same set of data, dynamically changing the error structure associated with observed values. Parameter estimation techniques and sequential estimation algorithms employed in parameter identification and model selection are first discussed. Then, least-square parameter estimation procedures are formulated, utilizing differential or integrated equations, and are used to define a model for association of error with experimentally observed data.
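
    As a minimal sketch of the kind of least-squares parameter estimation described (assuming SciPy; the integrated logistic model and synthetic data below are illustrative, not the report's models or data):

        import numpy as np
        from scipy.optimize import curve_fit

        def logistic(t, K, N0, r):
            """Integrated logistic growth, a typical environmental model form."""
            return K / (1.0 + (K - N0) / N0 * np.exp(-r * t))

        rng = np.random.default_rng(4)
        t = np.linspace(0, 20, 60)
        obs = logistic(t, 100.0, 5.0, 0.4) + rng.normal(scale=2.0, size=t.size)

        popt, pcov = curve_fit(logistic, t, obs, p0=[80.0, 1.0, 0.1])
        perr = np.sqrt(np.diag(pcov))   # 1-sigma parameter uncertainties
        print("K, N0, r =", popt, "+/-", perr)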

  14. Computer analysis of general linear networks using digraphs.

    NASA Technical Reports Server (NTRS)

    Mcclenahan, J. O.; Chan, S.-P.

    1972-01-01

Investigation of the application of digraphs in analyzing general electronic networks, and development of a computer program based on a particular digraph method developed by Chen. The Chen digraph method is a topological method for the solution of networks and serves as a shortcut when hand calculations are required. The advantage offered by this method of analysis is that the results are in symbolic form. It is limited, however, by the size of the network that can be handled. Hand calculations usually become too tedious for networks larger than about five nodes, depending on how many elements the network contains; direct determinant expansion for a five-node network is also very tedious.

  15. Computer modeling for advanced life support system analysis.

    PubMed

    Drysdale, A

    1997-01-01

This article discusses the equivalent mass approach to advanced life support system analysis, describes a computer model developed to use this approach, and presents early results from modeling the NASA JSC BioPlex. The model is built using an object-oriented approach and G2, a commercially available modeling package. Cost factor equivalencies are given for the Volosin scenarios. Plant data from NASA KSC and Utah State University (USU) are used, together with configuration data from the BioPlex design effort. Initial results focus on the importance of obtaining high plant productivity with a flight-like configuration. PMID:11540448

  16. Spheroid-based 3-dimensional culture models: Gene expression and functionality in head and neck cancer.

    PubMed

    Schmidt, Marianne; Scholz, Claus-Juergen; Polednik, Christine; Roller, Jeanette

    2016-04-01

In the present study a panel of 12 head and neck cancer (HNSCC) cell lines was tested for spheroid formation. Since the size and morphology of spheroids depend on both cell adhesion and proliferation in the 3-dimensional (3D) context, the morphology of HNSCC spheroids was related to expression of E-cadherin and the proliferation marker Ki67. In HNSCC cell lines the formation of tight regular spheroids was dependent on distinct E-cadherin expression levels in monolayer cultures, usually resulting in upregulation following aggregation into 3D structures. Cell lines expressing only low levels of E-cadherin in monolayers produced only loose cell clusters, frequently decreasing E-cadherin expression further upon aggregation. In these cell lines no epidermal growth factor receptor (EGFR) upregulation occurred, and proliferation generally decreased in spheroids/aggregates independent of E-cadherin expression. In a second approach a global gene expression analysis of the larynx carcinoma cell line HLaC78 monolayer and the corresponding spheroids was performed. A global upregulation of gene expression in HLaC78 spheroids was related to genes involved in cell adhesion, cell junctions and cytochrome P450-mediated metabolism of xenobiotics. Downregulation was associated with genes controlling cell cycle, DNA replication and DNA mismatch repair. Analyzing the expression of selected genes of each functional group in monolayer and spheroid cultures of all 12 cell lines revealed evidence for common gene expression shifts in genes controlling cell junctions, cell adhesion, cell cycle and DNA replication, as well as genes involved in the cytochrome P450-mediated metabolism of xenobiotics. PMID:26797047

  17. A Customized Bolus Produced Using a 3-Dimensional Printer for Radiotherapy

    PubMed Central

    Kim, Shin-Wook; Shin, Hun-Joo; Kay, Chul Seung; Son, Seok Hyun

    2014-01-01

Objective: Boluses are used in high-energy radiotherapy in order to overcome the skin-sparing effect. In practice, though, commonly used flat boluses fail to make perfect contact with the irregular surface of the patient's skin, resulting in air gaps. Hence, we fabricated a customized bolus using a 3-dimensional (3D) printer and evaluated its feasibility for radiotherapy. Methods: We designed two kinds of bolus for production on a 3D printer: a 3D printed flat bolus for the Blue water phantom and a 3D printed customized bolus for the RANDO phantom. The 3D printed flat bolus was fabricated to verify its physical quality. The resulting 3D printed flat bolus was evaluated by assessing dosimetric parameters such as D1.5 cm, D5 cm, and D10 cm. The 3D printed customized bolus was then fabricated, and its quality and clinical feasibility were evaluated by visual inspection and by assessing dosimetric parameters such as Dmax, Dmin, Dmean, D90%, and V90%. Results: The dosimetric parameters of the resulting 3D printed flat bolus showed that it was a useful dose-escalating material, equivalent to a commercially available flat bolus. Analysis of the dosimetric parameters of the 3D printed customized bolus demonstrated that it provided good dose escalation and good contact with the irregular surface of the RANDO phantom. Conclusions: A customized bolus produced using a 3D printer could potentially replace commercially available flat boluses. PMID:25337700

  18. Oxidation behavior of ammonium in a 3-dimensional biofilm-electrode reactor.

    PubMed

    Tang, Jinjing; Guo, Jinsong; Fang, Fang; Chen, Youpeng; Lei, Lijing; Yang, Lin

    2013-12-01

Excess nitrogenous compounds are detrimental to natural water systems and to human health. To completely realize autohydrogenotrophic nitrogen removal, a novel 3-dimensional biofilm-electrode reactor was designed. Titanium electroplated with ruthenium was used as the anode, and activated carbon fiber felt was used as the cathode. The reactor was separated into two chambers by a permeable membrane, and the cathode chamber was filled with granular graphite and glass beads. The cathode and the cathode chamber were colonized by an acclimated biofilm. In the absence of organic substances, a nitrogen removal efficiency of up to 91% was achieved at DO levels of 3.42 +/- 0.37 mg/L when the applied current density was only 0.02 mA/cm². The oxidation of ammonium in biofilm-electrode reactors was also investigated. It was found that ammonium could be oxidized not only on the anode but also on particle electrodes in the cathode chamber of the biofilm-electrode reactor. The oxidation rate of ammonium and the nitrogen removal efficiency were found to be affected by the electric current loading on the biofilm-electrode reactor. The kinetics of ammonium at different electric currents were analyzed with a first-order reaction kinetics equation. The regression analysis implied that when the current density was less than 0.02 mA/cm², ammonium removal was positively correlated with the current density. However, when the current density was more than 0.02 mA/cm², the electric current became a limiting factor for the oxidation rate of ammonium and the nitrogen removal efficiency. PMID:24649670
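
    For reference, the first-order kinetics referred to here has the standard form (generic symbols, not the paper's notation):

        \frac{dC}{dt} = -kC
        \quad\Longrightarrow\quad
        C(t) = C_0\, e^{-kt},
        \qquad
        \ln\frac{C_0}{C(t)} = kt

    so regressing ln(C0/C) against t at each current density yields the current-dependent rate constant k.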

  19. Basement membrane proteins promote progression of intraepithelial neoplasia in 3-dimensional models of human stratified epithelium.

    PubMed

    Andriani, Frank; Garfield, Jackie; Fusenig, Norbert E; Garlick, Jonathan A

    2004-01-20

    We have developed novel 3-dimensional in vitro and in vivo tissue models that mimic premalignant disease of human stratified epithelium in order to analyze the stromal contribution of extracellular matrix and basement membrane proteins to the progression of intraepithelial neoplasia. Three-dimensional, organotypic cultures were grown either on a de-epidermalized human dermis with pre-existing basement membrane components on its surface (AlloDerm), on a Type I collagen gel that lacked basement membrane proteins or on polycarbonate membranes coated with purified extracellular matrix proteins. When tumor cells (HaCaT-II4) were mixed with normal keratinocytes (4:1/normals:HaCaT-II4), tumor cells selectively attached, persisted and proliferated at the dermal-epidermal interface in vitro and generated dysplastic tissues when transplanted to nude mice only when grown in the presence of the AlloDerm substrate. This stromal interface was permissive for tumor cell attachment due to the rapid assembly of structured basement membrane. When tumor cells were mixed with normal keratinocytes and grown on polycarbonate membranes coated with individual extracellular matrix or basement membrane components, selective attachment and significant intraepithelial expansion occurred only on laminin 1 and Type IV collagen-coated membranes. This preferential adhesion of tumor cells restricted the synthesis of laminin 5 to basal cells where it was deposited in a polarized distribution. Western blot analysis revealed that tumor cell attachment was not due to differences in the synthesis or processing of laminin 5. Thus, intraepithelial progression towards premalignant disease is dependent on the selective adhesion of cells with malignant potential to basement membrane proteins that provide a permissive template for their persistence and expansion. PMID:14648700

  20. Computer-aided analysis of a Superfund site

    SciTech Connect

Qualheim, B.J.

    1990-05-01

The groundwater investigation at the Lawrence Livermore National Laboratory was initiated in 1983 after perchloroethylene (PCE) and trichloroethylene (TCE) were detected in the groundwater. Since that time, more than 300 monitor wells have been completed, logged, sampled, and hydraulically tested. In 1987, the Livermore site was placed on the Environmental Protection Agency's National Priority List (Superfund). The Livermore valley is relatively flat, underlain by a complex alluvial sedimentary basin drained by two intermittent streams. The subsurface consists of unconsolidated sand, gravel, silt, and clay with multiple water-bearing zones of relatively high permeability. The hydrogeologic system is characterized as leaky, with horizontal hydraulic communication of up to 800 ft and vertical communication between aquifers of up to 50 ft. Computer-based analysis of the site stratigraphy was used to analyze and characterize the subsurface. The authors used a computer-aided design and drafting (CADD) system to create two-dimensional slices of the subsurface. The slice program takes a subsurface slice at any specified depositional gradient and at any slice thickness. A slice displays the lithology type, unit thickness, depth of slice, and chemical analyses for volatile organic compounds (VOCs). The lateral continuity of subsurface channels was mapped for each depth slice. By stacking these maps, the authors interpreted a pseudo-three-dimensional representation of probable pathways for VOC movement in the subsurface. An enhanced computer graphics system was also used to map the movement of VOCs in the subsurface.

  1. Analysis of an Atom-Optical Architecture for Quantum Computation

    NASA Astrophysics Data System (ADS)

    Devitt, Simon J.; Stephens, Ashley M.; Munro, William J.; Nemoto, Kae

    Quantum technology based on photons has emerged as one of the most promising platforms for quantum information processing, having already been used in proof-of-principle demonstrations of quantum communication and quantum computation. However, the scalability of this technology depends on the successful integration of experimentally feasible devices in an architecture that tolerates realistic errors and imperfections. Here, we analyse an atom-optical architecture for quantum computation designed to meet the requirements of scalability. The architecture is based on a modular atom-cavity device that provides an effective photon-photon interaction, allowing for the rapid, deterministic preparation of a large class of entangled states. We begin our analysis at the physical level, where we outline the experimental cavity quantum electrodynamics requirements of the basic device. Then, we describe how a scalable network of these devices can be used to prepare a three-dimensional topological cluster state, sufficient for universal fault-tolerant quantum computation. We conclude at the application level, where we estimate the system-level requirements of the architecture executing an algorithm compiled for compatibility with the topological cluster state.

  2. G-computation demonstration in causal mediation analysis.

    PubMed

    Wang, Aolin; Arah, Onyebuchi A

    2015-10-01

    Recent work has considerably advanced the definition, identification and estimation of controlled direct, and natural direct and indirect effects in causal mediation analysis. Despite the various estimation methods and statistical routines being developed, a unified approach for effect estimation under different effect decomposition scenarios is still needed for epidemiologic research. G-computation offers such unification and has been used for total effect and joint controlled direct effect estimation settings, involving different types of exposure and outcome variables. In this study, we demonstrate the utility of parametric g-computation in estimating various components of the total effect, including (1) natural direct and indirect effects, (2) standard and stochastic controlled direct effects, and (3) reference and mediated interaction effects, using Monte Carlo simulations in standard statistical software. For each study subject, we estimated their nested potential outcomes corresponding to the (mediated) effects of an intervention on the exposure wherein the mediator was allowed to attain the value it would have under a possible counterfactual exposure intervention, under a pre-specified distribution of the mediator independent of any causes, or under a fixed controlled value. A final regression of the potential outcome on the exposure intervention variable was used to compute point estimates and bootstrap was used to obtain confidence intervals. Through contrasting different potential outcomes, this analytical framework provides an intuitive way of estimating effects under the recently introduced 3- and 4-way effect decomposition. This framework can be extended to complex multivariable and longitudinal mediation settings. PMID:26537707
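
    A minimal parametric g-computation sketch for natural direct and indirect effects, assuming linear models, a continuous mediator and outcome, and synthetic data (variable names and effect sizes are illustrative, not the study's); the bootstrap step the authors use for confidence intervals is omitted for brevity.

        import numpy as np

        rng = np.random.default_rng(0)
        n = 5000

        # Synthetic data: confounder C, binary exposure A, mediator M, outcome Y.
        C = rng.normal(size=n)
        A = rng.binomial(1, 1.0 / (1.0 + np.exp(-0.5 * C)))
        M = 1.0 + 0.8 * A + 0.3 * C + rng.normal(size=n)
        Y = 2.0 + 0.5 * A + 0.6 * M + 0.2 * C + rng.normal(size=n)

        def ols(X, y):
            """Least-squares coefficients, intercept first."""
            X1 = np.column_stack([np.ones(len(y)), X])
            return np.linalg.lstsq(X1, y, rcond=None)[0]

        b_m = ols(np.column_stack([A, C]), M)     # mediator model M ~ A + C
        b_y = ols(np.column_stack([A, M, C]), Y)  # outcome model Y ~ A + M + C
        sigma_m = (M - (b_m[0] + b_m[1] * A + b_m[2] * C)).std()

        def mean_po(a, a_star, draws=100):
            """Monte Carlo estimate of E[Y(a, M(a*))], standardized over C."""
            means = []
            for _ in range(draws):
                m_star = (b_m[0] + b_m[1] * a_star + b_m[2] * C
                          + rng.normal(scale=sigma_m, size=n))
                means.append((b_y[0] + b_y[1] * a + b_y[2] * m_star + b_y[3] * C).mean())
            return float(np.mean(means))

        nde = mean_po(1, 0) - mean_po(0, 0)   # natural direct effect (truth 0.5 here)
        nie = mean_po(1, 1) - mean_po(1, 0)   # natural indirect effect (truth 0.48 here)
        print(f"NDE ~ {nde:.2f}, NIE ~ {nie:.2f}")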

  3. Applying DNA computation to intractable problems in social network analysis.

    PubMed

    Chen, Rick C S; Yang, Stephen J H

    2010-09-01

From ancient times to the present day, social networks have played an important role in the formation of various organizations for a range of social behaviors. As such, social networks inherently describe the complicated relationships between elements around the world. Based on mathematical graph theory, social network analysis (SNA) has been developed and applied in various fields, such as Web 2.0 for Web applications and product development in industry. However, some problems in SNA, such as finding a clique, N-clique, N-clan, N-club and K-plex, are NP-complete and are not easily solved via traditional computer architecture. These challenges have restricted the uses of SNA. This paper provides DNA-computing-based approaches with inherently high information density and massive parallelism. Using these approaches, we aim to solve the three primary problems of social networks: N-clique, N-clan, and N-club. Their accuracy and feasible time complexities discussed in the paper will demonstrate that DNA computing can be used to facilitate the development of SNA. PMID:20566337

  4. Analysis of CERN computing infrastructure and monitoring data

    NASA Astrophysics Data System (ADS)

    Nieke, C.; Lassnig, M.; Menichetti, L.; Motesnitsalis, E.; Duellmann, D.

    2015-12-01

Optimizing a computing infrastructure on the scale of LHC requires a quantitative understanding of a complex network of many different resources and services. For this purpose the CERN IT department and the LHC experiments are collecting a large multitude of logs and performance probes, which are already successfully used for short-term analysis (e.g. operational dashboards) within each group. The IT analytics working group has been created with the goal of bringing data sources from different services and on different abstraction levels together and of implementing a suitable infrastructure for mid- to long-term statistical analysis. It further provides a forum for joint optimization across single-service boundaries and for the exchange of analysis methods and tools. To simplify access to the collected data, we implemented an automated repository for cleaned and aggregated data sources based on the Hadoop ecosystem. This contribution describes some of the challenges encountered, such as dealing with heterogeneous data formats and selecting an efficient storage format for map-reduce and external access, and describes the repository user interface. Using this infrastructure we were able to quantitatively analyze the relationship between CPU/wall fraction, latency/throughput constraints of network and disk, and the effective job throughput. In this contribution we first describe the design of the shared analysis infrastructure and then present a summary of first analysis results from the combined data sources.

  5. Computer program for design analysis of radial-inflow turbines

    NASA Technical Reports Server (NTRS)

    Glassman, A. J.

    1976-01-01

    A computer program written in FORTRAN that may be used for the design analysis of radial-inflow turbines was documented. The following information is included: loss model (estimation of losses), the analysis equations, a description of the input and output data, the FORTRAN program listing and list of variables, and sample cases. The input design requirements include the power, mass flow rate, inlet temperature and pressure, and rotational speed. The program output data includes various diameters, efficiencies, temperatures, pressures, velocities, and flow angles for the appropriate calculation stations. The design variables include the stator-exit angle, rotor radius ratios, and rotor-exit tangential velocity distribution. The losses are determined by an internal loss model.

  6. Computational analysis of protein interaction networks for infectious diseases.

    PubMed

    Pan, Archana; Lahiri, Chandrajit; Rajendiran, Anjana; Shanmugham, Buvaneswari

    2016-05-01

Infectious diseases caused by pathogens, including viruses, bacteria and parasites, pose a serious threat to human health worldwide. Frequent changes in the pattern of infection mechanisms and the emergence of multidrug-resistant strains among pathogens have weakened current treatment regimens. This necessitates the development of new therapeutic interventions to prevent and control such diseases. To cater to this need, analysis of protein interaction networks (PINs) has gained importance as one of the most promising strategies. The present review discusses various computational approaches to analysing PINs in the context of infectious diseases. Topology and modularity analyses of such networks and their biological relevance are delineated, along with the state of the art in host-pathogen and intra-pathogen protein interaction studies. This should provide useful insights to the research community, enabling the design of novel biomedicines against such infectious diseases. PMID:26261187

  7. Electronic Forms-Based Computing for Evidentiary Analysis

    SciTech Connect

    Luse, Andy; Mennecke, Brian; Townsend, Anthony

    2009-07-01

    The paperwork associated with evidentiary collection and analysis is a highly repetitive and time-consuming process which often involves duplication of work and can frequently result in documentary errors. Electronic entry of evidence-related information can facilitate greater accuracy and less time spent on data entry. This manuscript describes a general framework for the implementation of an electronic tablet-based system for evidentiary processing. This framework is then utilized in the design and implementation of an electronic tablet-based evidentiary input prototype system developed for use by forensic laboratories which serves as a verification of the proposed framework. The manuscript concludes with a discussion of implications and recommendations for the implementation and use of tablet-based computing for evidence analysis.

  8. Computer-aided communication satellite system analysis and optimization

    NASA Technical Reports Server (NTRS)

    Stagl, T. W.; Morgan, N. H.; Morley, R. E.; Singh, J. P.

    1973-01-01

The capabilities and limitations of the various published computer programs for fixed/broadcast communication satellite system synthesis and optimization are discussed. A Satellite Telecommunication Analysis and Modeling Program (STAMP) for costing and sensitivity analysis work in the application of communication satellites to educational development is given. The modifications made to STAMP include: extension of the six-beam capability to eight; addition of the generation of multiple beams from a single reflector system with an array of feeds; improved system costing to reflect the time value of money and growth in earth terminal population with time, and to account for various measures of system reliability; inclusion of a model for scintillation at microwave frequencies in the communication link loss model; and an updated technological environment.

  9. Dynamic analysis of spur gears using computer program DANST

    NASA Technical Reports Server (NTRS)

    Oswald, Fred B.; Lin, Hsiang Hsi; Liou, Chuen-Huei; Valco, Mark J.

    1993-01-01

    DANST is a computer program for static and dynamic analysis of spur gear systems. The program can be used for parametric studies to predict the effect on dynamic load and tooth bending stress of spur gears due to operating speed, torque, stiffness, damping, inertia, and tooth profile. DANST performs geometric modeling and dynamic analysis for low- or high-contact-ratio spur gears. DANST can simulate gear systems with contact ratio ranging from one to three. It was designed to be easy to use, and it is extensively documented by comments in the source code. This report describes the installation and use of DANST. It covers input data requirements and presents examples. The report also compares DANST predictions for gear tooth loads and bending stress to experimental and finite element results.

  10. A computer program (MACPUMP) for interactive aquifer-test analysis

    USGS Publications Warehouse

    Day-Lewis, F. D.; Person, M.A.; Konikow, L.F.

    1995-01-01

This report introduces MACPUMP (Version 1.0), an aquifer-test-analysis package for use with Macintosh computers. The report outlines the input-data format, describes the solutions encoded in the program, explains the menu items, and offers a tutorial illustrating the use of the program. The package reads list-directed aquifer-test data from a file, plots the data to the screen, generates and plots type curves for several different test conditions, and allows mouse-controlled curve matching. MACPUMP features pull-down menus, a simple text viewer for displaying data files, and optional on-line help windows. This version includes the analytical solutions for nonleaky and leaky confined aquifers, using both type curves and straight-line methods, and for the analysis of single-well slug tests using type curves. An executable version of the code and sample input data sets are included on an accompanying floppy disk.
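
    For concreteness, the nonleaky confined-aquifer (Theis) type curve that such a package generates can be computed directly from the well function; a minimal sketch assuming SciPy, with illustrative parameter values:

        import numpy as np
        from scipy.special import exp1   # E1(u) equals the Theis well function W(u)

        Q = 1.0e-2    # pumping rate, m^3/s (illustrative)
        T = 1.0e-3    # transmissivity, m^2/s (illustrative)
        S = 1.0e-4    # storativity, dimensionless (illustrative)
        r = 30.0      # distance from pumped well to observation well, m

        t = np.logspace(1, 5, 50)              # time since pumping began, s
        u = r**2 * S / (4.0 * T * t)
        s = Q / (4.0 * np.pi * T) * exp1(u)    # Theis drawdown, m
        print(f"drawdown at t = {t[-1]:.0f} s: {s[-1]:.2f} m")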

  11. Computer-aided strength analysis of the modernized freight wagon

    NASA Astrophysics Data System (ADS)

    Płaczek, M.; Wróbel, A.; Baier, A.

    2015-11-01

In the paper, results of a computer-aided strength analysis of a modernized freight wagon, based on the finite element method, are presented. A CAD model of the considered freight wagon was created and its strength was analysed in accordance with the standards describing how such freight wagons are to be tested. The model of the analysed freight wagon was then modernized by adding composite panels covering the inner surface of the vehicle body. The strength analysis was carried out once again and the obtained results were compared. This work was carried out in order to verify the influence of the composite panels on the strength of the freight car body and to estimate the possibility of reducing the steel shell thickness of the box in order to reduce the weight of the freight wagon.

  12. Modern wing flutter analysis by computational fluid dynamics methods

    NASA Technical Reports Server (NTRS)

    Cunningham, Herbert J.; Batina, John T.; Bennett, Robert M.

    1988-01-01

    The application and assessment of the recently developed CAP-TSD transonic small-disturbance code for flutter prediction is described. The CAP-TSD code has been developed for aeroelastic analysis of complete aircraft configurations and was previously applied to the calculation of steady and unsteady pressures with favorable results. Generalized aerodynamic forces and flutter characteristics are calculated and compared with linear theory results and with experimental data for a 45 deg sweptback wing. These results are in good agreement with the experimental flutter data which is the first step toward validating CAP-TSD for general transonic aeroelastic applications. The paper presents these results and comparisons along with general remarks regarding modern wing flutter analysis by computational fluid dynamics methods.

  13. Computational analysis of an axial flow pediatric ventricular assist device.

    PubMed

    Throckmorton, Amy L; Untaroiu, Alexandrina; Allaire, Paul E; Wood, Houston G; Matherne, Gaynell Paul; Lim, David Scott; Peeler, Ben B; Olsen, Don B

    2004-10-01

    Longer-term (>2 weeks) mechanical circulatory support will provide an improved quality of life for thousands of pediatric cardiac failure patients per year in the United States. These pediatric patients suffer from severe congenital or acquired heart disease complicated by congestive heart failure. There are currently very few mechanical circulatory support systems available in the United States as viable options for this population. For that reason, we have designed an axial flow pediatric ventricular assist device (PVAD) with an impeller that is fully suspended by magnetic bearings. As a geometrically similar, smaller scaled version of our axial flow pump for the adult population, the PVAD has a design point of 1.5 L/min at 65 mm Hg to meet the full physiologic needs of pediatric patients. Conventional axial pump design equations and a nondimensional scaling technique were used to estimate the PVAD's initial dimensions, which allowed for the creation of computational models for performance analysis. A computational fluid dynamic analysis of the axial flow PVAD, which measures approximately 65 mm in length by 35 mm in diameter, shows that the pump will produce 1.5 L/min at 65 mm Hg for 8000 rpm. Fluid forces (approximately 1 N) were also determined for the suspension and motor design, and scalar stress values remained below 350 Pa with maximum particle residence times of approximately 0.08 milliseconds in the pump. This initial design demonstrated acceptable performance, thereby encouraging prototype manufacturing for experimental validation. PMID:15384993

  14. Analysis and optimization of cyclic methods in orbit computation

    NASA Technical Reports Server (NTRS)

    Pierce, S.

    1973-01-01

    The mathematical analysis and computation of the K=3, order 4; K=4, order 6; and K=5, order 7 cyclic methods and the K=5, order 6 Cowell method and some results of optimizing the 3 backpoint cyclic multistep methods for solving ordinary differential equations are presented. Cyclic methods have the advantage over traditional methods of having higher order for a given number of backpoints while at the same time having more free parameters. After considering several error sources the primary source for the cyclic methods has been isolated. The free parameters for three backpoint methods were used to minimize the effects of some of these error sources. They now yield more accuracy with the same computing time as Cowell's method on selected problems. This work is being extended to the five backpoint methods. The analysis and optimization are more difficult here since the matrices are larger and the dimension of the optimizing space is larger. Indications are that the primary error source can be reduced. This will still leave several parameters free to minimize other sources.

  15. Computer vision inspection of rice seed quality with discriminant analysis

    NASA Astrophysics Data System (ADS)

    Cheng, Fang; Ying, Yibin

    2004-10-01

This study was undertaken to develop computer vision-based rice seed inspection technology for quality control. Color image classification using a discriminant analysis algorithm to identify germinated rice seeds was successfully implemented. The hybrid rice seed cultivars involved were Jinyou402, Shanyou10, Zhongyou207 and Jiayou99. Sixteen morphological features and six color features were extracted from sample images belonging to the training sets. The color feature 'Huebmean' shows the strongest classification ability among all the features. Computed as the area of the seed region divided by the area of the smallest convex polygon that can contain the seed region, the feature 'Solidity' outperforms the other morphological features in recognizing germinated seeds. Combining the two features 'Huebmean' and 'Solidity', discriminant analysis was used to classify normal rice seeds and seeds germinated on the panicle. Results show that the algorithm achieved an overall average accuracy of 98.4% for both normal and germinated seeds in all cultivars. The combination of 'Huebmean' and 'Solidity' proved to be a good indicator for germinated seeds. The simple discriminant algorithm using just two features shows high accuracy and good adaptability.
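
    A sketch of the two-feature approach with off-the-shelf tools (assuming scikit-image and scikit-learn): solidity below is skimage's regionprops definition, region area divided by convex-hull area, matching the feature described, while the masks, images and synthetic training clusters are illustrative, not the study's data.

        import numpy as np
        from skimage.color import rgb2hsv
        from skimage.measure import label, regionprops
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        def seed_features(rgb_image, mask):
            """(mean hue, solidity) of the largest connected region in a binary mask."""
            region = max(regionprops(label(mask)), key=lambda r: r.area)
            hue_mean = rgb2hsv(rgb_image)[..., 0][mask.astype(bool)].mean()
            return hue_mean, region.solidity

        # Synthetic stand-in feature clusters (normal vs. germinated seeds).
        rng = np.random.default_rng(1)
        X = np.vstack([rng.normal([0.12, 0.97], 0.02, (50, 2)),
                       rng.normal([0.18, 0.90], 0.02, (50, 2))])
        y = np.repeat([0, 1], 50)   # 0 = normal, 1 = germinated

        clf = LinearDiscriminantAnalysis().fit(X, y)
        print(f"training accuracy: {clf.score(X, y):.3f}")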

  16. Upgrades of Two Computer Codes for Analysis of Turbomachinery

    NASA Technical Reports Server (NTRS)

    Chima, Rodrick V.; Liou, Meng-Sing

    2005-01-01

Major upgrades have been made in two of the programs reported in "Five Computer Codes for Analysis of Turbomachinery". The affected programs are: Swift -- a code for three-dimensional (3D) multiblock analysis; and TCGRID, which generates a 3D grid used with Swift. Originally utilizing only a central-differencing scheme for numerical solution, Swift was augmented by addition of two upwind schemes that give greater accuracy but take more computing time. Other improvements in Swift include addition of a shear-stress-transport turbulence model for better prediction of adverse pressure gradients, addition of an H-grid capability for flexibility in modeling flows in pumps and ducts, and modification to enable simultaneous modeling of hub and tip clearances. Improvements in TCGRID include modifications to enable generation of grids for more complicated flow paths and addition of an option to generate grids compatible with the ADPAC code used at NASA and in industry. For both codes, new test cases were developed and documentation was updated. Both codes were converted to Fortran 90, with dynamic memory allocation. Both codes were also modified for ease of use in both UNIX and Windows operating systems.

  17. Computational modeling and analysis of thermoelectric properties of nanoporous silicon

    SciTech Connect

    Li, H.; Yu, Y.; Li, G.

    2014-03-28

In this paper, thermoelectric properties of nanoporous silicon are modeled and studied by using a computational approach. The computational approach combines a quantum non-equilibrium Green's function (NEGF) coupled with the Poisson equation for electrical transport analysis, a phonon Boltzmann transport equation (BTE) for phonon thermal transport analysis and the Wiedemann-Franz law for calculating the electronic thermal conductivity. By solving the NEGF/Poisson equations self-consistently using a finite difference method, the electrical conductivity σ and Seebeck coefficient S of the material are numerically computed. The BTE is solved by using a finite volume method to obtain the phonon thermal conductivity k_p, and the Wiedemann-Franz law is used to obtain the electronic thermal conductivity k_e. The figure of merit of nanoporous silicon is calculated by ZT = S²σT/(k_p + k_e). The effects of doping density, porosity, temperature, and nanopore size on thermoelectric properties of nanoporous silicon are investigated. It is confirmed that nanoporous silicon has significantly higher thermoelectric energy conversion efficiency than its nonporous counterpart. Specifically, this study shows that, with an n-type doping density of 10²⁰ cm⁻³, a porosity of 36% and a nanopore size of 3 nm × 3 nm, the figure of merit ZT can reach 0.32 at 600 K. The results also show that the degradation of the electrical conductivity of nanoporous Si due to the inclusion of nanopores is compensated by the large reduction in the phonon thermal conductivity and the increase in the absolute value of the Seebeck coefficient, resulting in a significantly improved ZT.
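
    A quick numerical check of the figure-of-merit formula with illustrative values (not quantities computed in the paper):

        # ZT = S^2 * sigma * T / (k_p + k_e), with made-up but plausible values.
        S = 250e-6            # Seebeck coefficient, V/K
        sigma = 5e4           # electrical conductivity, S/m
        T = 600.0             # temperature, K
        k_p, k_e = 4.0, 1.5   # phonon and electronic thermal conductivity, W/(m*K)
        ZT = S**2 * sigma * T / (k_p + k_e)
        print(f"ZT = {ZT:.2f}")   # ~0.34 with these illustrative numbers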

  18. 3-dimensional root phenotyping with a novel imaging and software platform

    Technology Transfer Automated Retrieval System (TEKTRAN)

    A novel imaging and software platform was developed for the high-throughput phenotyping of 3-dimensional root traits during seedling development. To demonstrate the platform’s capacity, plants of two rice (Oryza sativa) genotypes, Azucena and IR64, were grown in a transparent gellan gum system and ...

  19. 3-DIMENSIONAL MEASURED AND SIMULATED FLOW FOR SCOUR NEAR SPUR DIKES

    Technology Transfer Automated Retrieval System (TEKTRAN)

    To improve understanding of the flow and scour processes associated with spur dikes more fully, 3-dimensional flow velocities were measured using an acoustic Doppler velocimeter at a closely spaced grid over a fixed flat bed with a submerged spur dike. Some 2592 three-dimensional velocities around a...

  20. 3-dimensional orthodontics visualization system with dental study models and orthopantomograms

    NASA Astrophysics Data System (ADS)

    Zhang, Hua; Ong, S. H.; Foong, K. W. C.; Dhar, T.

    2005-04-01

The aim of this study is to develop a system that provides 3-dimensional visualization of orthodontic treatments. Dental plaster models and the corresponding orthopantomogram (dental panoramic tomogram) are first digitized and fed into the system. A semi-automatic segmentation technique is applied to the plaster models to detect the dental arches, tooth interstices and gum margins, which are used to extract individual crown models. A 3-dimensional representation of the roots, generated by deforming generic tooth models to the orthopantomogram using radial basis functions, is attached to the corresponding crowns to enable visualization of complete teeth. An optional algorithm to close the gaps between deformed roots and actual crowns by using multiquadric radial basis functions is also presented, which is capable of generating a smooth mesh representation of complete 3-dimensional teeth. The user interface is carefully designed to achieve a flexible system that is as user-friendly as possible. Manual calibration and correction are possible throughout the data processing steps to compensate for occasional misbehavior of the automatic procedures. By allowing the user to move and re-arrange individual teeth (with their roots) on a full dentition, this orthodontic visualization system provides an easy and accurate way of simulating and planning orthodontic treatment. Its capability of presenting 3-dimensional root information with only study models and an orthopantomogram is especially useful for patients who do not undergo CT scanning, which is not a routine procedure in most orthodontic cases.
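
    A sketch of the landmark-driven radial-basis-function deformation this describes in its general form, assuming SciPy's RBFInterpolator; the landmark coordinates and mesh vertices below are placeholders, not the system's data.

        import numpy as np
        from scipy.interpolate import RBFInterpolator

        # Corresponding landmarks: on the generic tooth model and as traced
        # on the orthopantomogram-derived target (placeholder coordinates).
        src = np.array([[0, 0, 0], [10, 0, 0], [0, 12, 0], [0, 0, 25]], float)
        dst = np.array([[0, 0, 0], [9, 1, 0], [0, 11, 1], [1, 0, 22]], float)

        # Vector-valued RBF interpolant of the displacement field,
        # thin-plate-spline kernel.
        warp = RBFInterpolator(src, dst - src, kernel='thin_plate_spline')

        # Apply the warp to all vertices of the generic tooth mesh.
        vertices = np.random.default_rng(2).uniform(-2, 24, (500, 3))
        deformed = vertices + warp(vertices)
        print(deformed.shape)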

  1. 3-Dimensional and Interactive Istanbul University Virtual Laboratory Based on Active Learning Methods

    ERIC Educational Resources Information Center

    Ince, Elif; Kirbaslar, Fatma Gulay; Yolcu, Ergun; Aslan, Ayse Esra; Kayacan, Zeynep Cigdem; Alkan Olsson, Johanna; Akbasli, Ayse Ceylan; Aytekin, Mesut; Bauer, Thomas; Charalambis, Dimitris; Gunes, Zeliha Ozsoy; Kandemir, Ceyhan; Sari, Umit; Turkoglu, Suleyman; Yaman, Yavuz; Yolcu, Ozgu

    2014-01-01

The purpose of this study is to develop a 3-dimensional, interactive, multi-user and multi-admin IUVIRLAB featuring active learning methods and techniques for university students, to introduce the Virtual Laboratory of Istanbul University, and to show the effects of IUVIRLAB on students' attitudes toward communication skills and the IUVIRLAB. Although…

  2. Multiscale analysis of nonlinear systems using computational homology

    SciTech Connect

    Konstantin Mischaikow; Michael Schatz; William Kalies; Thomas Wanner

    2010-05-24

This is a collaborative project between the principal investigators. However, as is to be expected, different PIs have greater focus on different aspects of the project. This report lists these major directions of research which were pursued during the funding period: (1) Computational Homology in Fluids - For the computational homology effort in thermal convection, the focus of the work during the first two years of the funding period included: (1) A clear demonstration that homology can sensitively detect the presence or absence of an important flow symmetry, (2) An investigation of homology as a probe for flow dynamics, and (3) The construction of a new convection apparatus for probing the effects of large-aspect-ratio. (2) Computational Homology in Cardiac Dynamics - We have initiated an effort to test the use of homology in characterizing data from both laboratory experiments and numerical simulations of arrhythmia in the heart. Recently, the use of high speed, high sensitivity digital imaging in conjunction with voltage sensitive fluorescent dyes has enabled researchers to visualize electrical activity on the surface of cardiac tissue, both in vitro and in vivo. (3) Magnetohydrodynamics - A new research direction is to use computational homology to analyze results of large scale simulations of 2D turbulence in the presence of magnetic fields. Such simulations are relevant to the dynamics of black hole accretion disks. The complex flow patterns from simulations exhibit strong qualitative changes as a function of magnetic field strength. Efforts to characterize the pattern changes using Fourier methods and wavelet analysis have been unsuccessful. (4) Granular Flow - two experts in the area of granular media are studying 2D model experiments of earthquake dynamics where the stress fields can be measured; these stress fields form complex patterns of 'force chains' that may be amenable to analysis using computational homology. (5) Microstructure Characterization

3. The future of computer-aided sperm analysis.

    PubMed

    Mortimer, Sharon T; van der Horst, Gerhard; Mortimer, David

    2015-01-01

    Computer-aided sperm analysis (CASA) technology was developed in the late 1980s for analyzing sperm movement characteristics or kinematics and has been highly successful in enabling this field of research. CASA has also been used with great success for measuring semen characteristics such as sperm concentration and proportions of progressive motility in many animal species, including wide application in domesticated animal production laboratories and reproductive toxicology. However, attempts to use CASA for human clinical semen analysis have largely met with poor success due to the inherent difficulties presented by many human semen samples caused by sperm clumping and heavy background debris that, until now, have precluded accurate digital image analysis. The authors review the improved capabilities of two modern CASA platforms (Hamilton Thorne CASA-II and Microptic SCA6) and consider their current and future applications with particular reference to directing our focus towards using this technology to assess functional rather than simple descriptive characteristics of spermatozoa. Specific requirements for validating CASA technology as a semi-automated system for human semen analysis are also provided, with particular reference to the accuracy and uncertainty of measurement expected of a robust medical laboratory test for implementation in clinical laboratories operating according to modern accreditation standards. PMID:25926614

  6. Open Rotor Computational Aeroacoustic Analysis with an Immersed Boundary Method

    NASA Technical Reports Server (NTRS)

    Brehm, Christoph; Barad, Michael F.; Kiris, Cetin C.

    2016-01-01

    Reliable noise prediction capabilities are essential to enable novel fuel efficient open rotor designs that can meet the community and cabin noise standards. Toward this end, immersed boundary methods have reached a level of maturity where more and more complex flow problems can be tackled with this approach. This paper demonstrates that our higher-order immersed boundary method provides the ability for aeroacoustic analysis of wake-dominated flow fields generated by a contra-rotating open rotor. This is a first-of-its-kind aeroacoustic simulation of an open rotor propulsion system employing an immersed boundary method. In addition to discussing how to apply the immersed boundary method to this moving-boundary problem, we provide a detailed validation of the aeroacoustic analysis approach employing the Launch Ascent and Vehicle Aerodynamics (LAVA) solver. Two free-stream Mach numbers, M=0.2 and M=0.78, are considered, based on the nominal take-off and cruise flow conditions. The simulation data are compared to available experimental data and other computational results employing more conventional CFD methods. Spectral analysis is used to determine the dominant wave propagation pattern in the acoustic near-field.

  7. Computational analysis of bacterial RNA-Seq data

    PubMed Central

    McClure, Ryan; Balasubramanian, Divya; Sun, Yan; Bobrovskyy, Maksym; Sumby, Paul; Genco, Caroline A.; Vanderpool, Carin K.; Tjaden, Brian

    2013-01-01

    Recent advances in high-throughput RNA sequencing (RNA-seq) have enabled tremendous leaps forward in our understanding of bacterial transcriptomes. However, computational methods for analysis of bacterial transcriptome data have not kept pace with the large and growing data sets generated by RNA-seq technology. Here, we present new algorithms, specific to bacterial gene structures and transcriptomes, for analysis of RNA-seq data. The algorithms are implemented in an open source software system called Rockhopper that supports various stages of bacterial RNA-seq data analysis, including aligning sequencing reads to a genome, constructing transcriptome maps, quantifying transcript abundance, testing for differential gene expression, determining operon structures and visualizing results. We demonstrate the performance of Rockhopper using 2.1 billion sequenced reads from 75 RNA-seq experiments conducted with Escherichia coli, Neisseria gonorrhoeae, Salmonella enterica, Streptococcus pyogenes and Xenorhabdus nematophila. We find that the transcriptome maps generated by our algorithms are highly accurate when compared with focused experimental data from E. coli and N. gonorrhoeae, and we validate our system’s ability to identify novel small RNAs, operons and transcription start sites. Our results suggest that Rockhopper can be used for efficient and accurate analysis of bacterial RNA-seq data, and that it can aid with elucidation of bacterial transcriptomes. PMID:23716638
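
    For orientation, the transcript-abundance step mentioned above is commonly normalized for transcript length and sequencing depth. The sketch below shows one standard normalization, RPKM (reads per kilobase of transcript per million mapped reads); it is a generic illustration, not Rockhopper's actual algorithm.

      def rpkm(read_count, transcript_length_bp, total_mapped_reads):
          """Reads per kilobase of transcript per million mapped reads."""
          kilobases = transcript_length_bp / 1e3
          millions = total_mapped_reads / 1e6
          return read_count / kilobases / millions

      # A 1.2 kb gene with 300 reads in a library of 2.1 million mapped reads
      print(rpkm(300, 1200, 2.1e6))  # ~119 RPKM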

  8. Novel Multicompartment 3-Dimensional Radiochromic Radiation Dosimeters for Nanoparticle-Enhanced Radiation Therapy Dosimetry

    SciTech Connect

    Alqathami, Mamdooh; Blencowe, Anton; Yeo, Un Jin; Doran, Simon J.; Qiao, Greg; Geso, Moshi

    2012-11-15

    Purpose: Gold nanoparticles (AuNps), because of their high atomic number (Z), have been demonstrated to absorb low-energy X-rays preferentially, compared with tissue, and may be used to achieve localized radiation dose enhancement in tumors. The purpose of this study is to introduce the first example of a novel multicompartment radiochromic radiation dosimeter and to demonstrate its applicability for 3-dimensional (3D) dosimetry of nanoparticle-enhanced radiation therapy. Methods and Materials: A novel multicompartment phantom radiochromic dosimeter was developed. It was designed and formulated to mimic a tumor loaded with AuNps (50 nm in diameter) at a concentration of 0.5 mM, surrounded by normal tissues. The novel dosimeter is referred to as the Sensitivity Modulated Advanced Radiation Therapy (SMART) dosimeter. The dosimeters were irradiated with 100-kV and 6-MV X-ray energies. Dose enhancement produced from the interaction of X-rays with AuNps was calculated using spectrophotometric and cone-beam optical computed tomography scanning by quantitatively comparing the change in optical density and 3D datasets of the dosimetric measurements between the tissue-equivalent (TE) and TE/AuNps compartments. The interbatch and intrabatch variability and the postresponse stability of the dosimeters with AuNps were also assessed. Results: Radiation dose enhancement factors of 1.77 and 1.11 were obtained using 100-kV and 6-MV X-ray energies, respectively. The results of this study are in good agreement with previous observations; however, for the first time we provide direct experimental confirmation and 3D visualization of the radiosensitization effect of AuNps. The dosimeters with AuNps showed small (<3.5%) interbatch variability and negligible (<0.5%) intrabatch variability. Conclusions: The SMART dosimeter yields experimental insights concerning the spatial distributions and elevated dose in nanoparticle-enhanced radiation therapy, which cannot be performed using any of
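
    The dose enhancement factors quoted above are ratios of the dose absorbed with nanoparticles present to the dose absorbed without them. A minimal sketch, assuming the radiochromic response is linear in dose over the measured range (real analyses calibrate the optical-density response curve, and the input values below are invented for illustration):

      def dose_enhancement_factor(delta_od_with_np, delta_od_without_np):
          """DEF from optical-density changes in the two compartments,
          assuming a dose response that is linear over this range."""
          return delta_od_with_np / delta_od_without_np

      # Hypothetical values chosen to reproduce the reported 100-kV result
      print(dose_enhancement_factor(0.354, 0.200))  # -> 1.77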

  9. Carotid-Sparing TomoHelical 3-Dimensional Conformal Radiotherapy for Early Glottic Cancer

    PubMed Central

    Hong, Chae-Seon; Oh, Dongryul; Ju, Sang Gyu; Ahn, Yong Chan; Noh, Jae Myoung; Chung, Kwangzoo; Kim, Jin Sung; Suh, Tae-Suk

    2016-01-01

    Purpose The purpose of this study was to investigate the dosimetric benefits and treatment efficiency of carotid-sparing TomoHelical 3-dimensional conformal radiotherapy (TH-3DCRT) for early glottic cancer. Materials and Methods Ten early-stage (T1N0M0) glottic squamous cell carcinoma patients were simulated, based on computed tomography scans. Two-field 3DCRT (2F-3DCRT), 3-field intensity-modulated radiation therapy (3F-IMRT), TomoHelical-IMRT (TH-IMRT), and TH-3DCRT plans were generated with a 67.5-Gy total prescription dose to the planning target volume (PTV) for each patient. In order to evaluate the plan quality, dosimetric characteristics were compared in terms of conformity index (CI) and homogeneity index (HI) for PTV, dose to the carotid arteries, and maximum dose to the spinal cord. Treatment planning and delivery times were compared to evaluate treatment efficiency. Results The median CI was substantially better for the 3F-IMRT (0.65), TH-IMRT (0.64), and TH-3DCRT (0.63) plans, compared to the 2F-3DCRT plan (0.32). PTV HI was slightly better for TH-3DCRT and TH-IMRT (1.05) compared to 2F-3DCRT (1.06) and 3F-IMRT (1.09). TH-3DCRT, 3F-IMRT, and TH-IMRT showed an excellent carotid sparing capability compared to 2F-3DCRT (p < 0.05). For all plans, the maximum dose to the spinal cord was < 45 Gy. The median treatment planning times for 2F-3DCRT (5.85 minutes) and TH-3DCRT (7.10 minutes) were much lower than those for 3F-IMRT (45.48 minutes) and TH-IMRT (35.30 minutes). The delivery times for 2F-3DCRT (2.06 minutes) and 3F-IMRT (2.48 minutes) were slightly lower than those for TH-IMRT (2.90 minutes) and TH-3DCRT (2.86 minutes). Conclusion TH-3DCRT showed excellent carotid-sparing capability, while offering high efficiency and maintaining good PTV coverage. PMID:25761477
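
    Definitions of the conformity index and homogeneity index vary between studies; the sketch below implements one common pair (a Paddick-style CI and HI = D5/D95) and is not necessarily the exact formulation used in this paper.

      import numpy as np

      def conformity_index(ptv_mask, dose, prescription):
          """Paddick-style CI: (TV_PIV)^2 / (TV * PIV), where TV is the PTV
          volume, PIV the prescription isodose volume, and TV_PIV their
          overlap, all counted in voxels."""
          piv = dose >= prescription
          tv_piv = np.logical_and(ptv_mask, piv).sum()
          return tv_piv ** 2 / (ptv_mask.sum() * piv.sum())

      def homogeneity_index(dose_in_ptv):
          """HI = D5 / D95: near-maximum over near-minimum PTV dose."""
          d5 = np.percentile(dose_in_ptv, 95)   # dose exceeded by the hottest 5%
          d95 = np.percentile(dose_in_ptv, 5)   # dose received by 95% of the PTV
          return d5 / d95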

  10. Automated Patient Identification and Localization Error Detection Using 2-Dimensional to 3-Dimensional Registration of Kilovoltage X-Ray Setup Images

    SciTech Connect

    Lamb, James M. Agazaryan, Nzhde; Low, Daniel A.

    2013-10-01

    Purpose: To determine whether kilovoltage x-ray projection radiation therapy setup images could be used to perform patient identification and detect gross errors in patient setup using a computer algorithm. Methods and Materials: Three patient cohorts treated using a commercially available image guided radiation therapy (IGRT) system that uses 2-dimensional to 3-dimensional (2D-3D) image registration were retrospectively analyzed: a group of 100 cranial radiation therapy patients, a group of 100 prostate cancer patients, and a group of 83 patients treated for spinal lesions. The setup images were acquired using fixed in-room kilovoltage imaging systems. In the prostate and cranial patient groups, localizations using image registration were performed between computed tomography (CT) simulation images from radiation therapy planning and setup x-ray images corresponding both to the same patient and to different patients. For the spinal patients, localizations were performed to the correct vertebral body, and to an adjacent vertebral body, using planning CTs and setup x-ray images from the same patient. An image similarity measure used by the IGRT system image registration algorithm was extracted from the IGRT system log files and evaluated as a discriminant for error detection. Results: A threshold value of the similarity measure could be chosen to separate correct and incorrect patient matches and correct and incorrect vertebral body localizations with excellent accuracy for these patient cohorts. A 10-fold cross-validation using linear discriminant analysis yielded misclassification probabilities of 0.000, 0.0045, and 0.014 for the cranial, prostate, and spinal cases, respectively. Conclusions: An automated measure of the image similarity between x-ray setup images and corresponding planning CT images could be used to perform automated patient identification and detection of localization errors in radiation therapy treatments.
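
    The validation described above, a 10-fold cross-validation of a linear discriminant on a scalar image-similarity score, can be sketched on synthetic data; the score distributions below are invented for illustration and do not come from the paper.

      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(0)
      # Hypothetical similarity scores: correct matches score higher on average
      correct = rng.normal(0.9, 0.05, 100)
      incorrect = rng.normal(0.6, 0.10, 100)
      scores = np.concatenate([correct, incorrect]).reshape(-1, 1)
      labels = np.array([1] * 100 + [0] * 100)

      # 10-fold cross-validated accuracy of a 1-D linear discriminant
      acc = cross_val_score(LinearDiscriminantAnalysis(), scores, labels, cv=10)
      print("estimated misclassification probability:", 1 - acc.mean())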

  11. A computational framework for exploratory data analysis in biomedical imaging

    NASA Astrophysics Data System (ADS)

    Wismueller, Axel

    2009-02-01

    Purpose: To develop, test, and evaluate a novel unsupervised machine learning method for the analysis of multidimensional biomedical imaging data. Methods: The Exploration Machine (XOM) is introduced as a method for computing low-dimensional representations of high-dimensional observations. XOM systematically inverts functional and structural components of topology-preserving mappings. Thus, it can contribute to both structure-preserving visualization and data clustering. We applied XOM to the analysis of microarray imaging data of gene expression profiles in Saccharomyces cerevisiae, and to model-free analysis of functional brain MRI data by unsupervised clustering. For both applications, we performed quantitative comparisons to results obtained by established algorithms. Results: Genome data: Absolute (relative) Sammon error values were 2.21 × 10^3 (1.00) for XOM, 2.45 × 10^3 (1.11) for Sammon's mapping, 2.77 × 10^3 (1.25) for Locally Linear Embedding (LLE), 2.82 × 10^3 (1.28) for PCA, 3.36 × 10^3 (1.52) for Isomap, and 10.19 × 10^3 (4.61) for Self-Organizing Map (SOM). - Functional MRI data: Areas under ROC curves for detection of task-related brain activation were 0.984 ± 0.03 for XOM, 0.983 ± 0.02 for Minimal-Free-Energy VQ, and 0.979 ± 0.02 for SOM. Conclusion: We introduce the Exploration Machine as a novel machine learning method for the analysis of multidimensional biomedical imaging data. XOM can be successfully applied to microarray gene expression analysis and to clustering of functional brain MR image time-series. By simultaneously contributing to dimensionality reduction and data clustering, XOM is a useful novel method for data analysis in biomedical imaging.
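
    The Sammon error reported above is the standard Sammon stress between pairwise distances in the original and embedded spaces. A minimal sketch of that formula follows; the absolute values in the abstract also depend on dataset and scaling conventions not specified here.

      import numpy as np
      from scipy.spatial.distance import pdist

      def sammon_stress(X_high, X_low):
          """E = (1 / sum d*_ij) * sum (d*_ij - d_ij)^2 / d*_ij over pairs
          i < j, with d* the distances in the original space and d the
          distances in the low-dimensional embedding."""
          d_star = pdist(X_high)          # original-space pairwise distances
          d = pdist(X_low)                # embedding pairwise distances
          ok = d_star > 0                 # ignore coincident input points
          return np.sum((d_star[ok] - d[ok]) ** 2 / d_star[ok]) / d_star[ok].sum()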

  12. BEST3D user's manual: Boundary Element Solution Technology, 3-Dimensional Version 3.0

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The theoretical basis and programming strategy utilized in the construction of the computer program BEST3D (boundary element solution technology - three dimensional) and detailed input instructions are provided for the use of the program. An extensive set of test cases and sample problems is included in the manual and is also available for distribution with the program. The BEST3D program was developed under the 3-D Inelastic Analysis Methods for Hot Section Components contract (NAS3-23697). The overall objective of this program was the development of new computer programs allowing more accurate and efficient three-dimensional thermal and stress analysis of hot section components, i.e., combustor liners, turbine blades, and turbine vanes. The BEST3D program allows both linear and nonlinear analysis of static and quasi-static elastic problems and transient dynamic analysis for elastic problems. Calculation of elastic natural frequencies and mode shapes is also provided.

  13. An Efficient Objective Analysis System for Parallel Computers

    NASA Technical Reports Server (NTRS)

    Stobie, J.

    1999-01-01

    A new atmospheric objective analysis system designed for parallel computers will be described. The system can produce a global analysis (on a 1 X 1 lat-lon grid with 18 levels of heights and winds and 10 levels of moisture) using 120,000 observations in 17 minutes on 32 CPUs (SGI Origin 2000). No special parallel code is needed (e.g. MPI or multitasking) and the 32 CPUs do not have to be on the same platform. The system is totally portable and can run on several different architectures at once. In addition, the system can easily scale up to 100 or more CPUs. This will allow for much higher resolution and significant increases in input data. The system scales linearly with the number of observations and the number of grid points. The cost overhead in going from 1 to 32 CPUs is 18%. In addition, the analysis results are identical regardless of the number of processors used. This system has all the characteristics of optimal interpolation, combining detailed instrument and first guess error statistics to produce the best estimate of the atmospheric state. Static tests with a 2 X 2.5 resolution version of this system showed that its analysis increments are comparable to the latest NASA operational system, including maintenance of mass-wind balance. Results from several months of cycling tests in the Goddard EOS Data Assimilation System (GEOS DAS) show this new analysis retains the same level of agreement between the first guess and observations (O-F statistics) as the current operational system.
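
    This record and the next describe optimal-interpolation analyses, whose core update is the best-linear-unbiased-estimate equation xa = xb + K(y - H xb) with gain K = B H^T (H B H^T + R)^-1. A toy dense-matrix sketch follows; the matrices are invented for illustration, and an operational system solves the same equations with parallel, localized methods rather than explicit inverses.

      import numpy as np

      def oi_analysis(xb, B, H, R, y):
          """One optimal-interpolation step for background xb, background
          error covariance B, observation operator H, observation error
          covariance R, and observations y."""
          K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)   # gain matrix
          return xb + K @ (y - H @ xb)

      # Tiny example: 3 grid points, observations at points 0 and 2
      xb = np.array([10.0, 12.0, 14.0])
      B = np.exp(-np.abs(np.subtract.outer(np.arange(3), np.arange(3))))
      H = np.array([[1.0, 0.0, 0.0], [0.0, 0.0, 1.0]])
      R = 0.25 * np.eye(2)
      y = np.array([10.5, 13.0])
      print(oi_analysis(xb, B, H, R, y))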

  14. An Efficient Objective Analysis System for Parallel Computers

    NASA Technical Reports Server (NTRS)

    Stobie, James G.

    1999-01-01

    A new objective analysis system designed for parallel computers will be described. The system can produce a global analysis (on a 2 x 2.5 lat-lon grid with 20 levels of heights and winds and 10 levels of moisture) using 120,000 observations in less than 3 minutes on 32 CPUs (SGI Origin 2000). No special parallel code is needed (e.g. MPI or multitasking) and the 32 CPUs do not have to be on the same platform. The system is totally portable and can run on several different architectures at once. In addition, the system can easily scale up to 100 or more CPUs. This will allow for much higher resolution and significant increases in input data. The system scales linearly with the number of observations and the number of grid points. The cost overhead in going from 1 to 32 CPUs is 18%. In addition, the analysis results are identical regardless of the number of processors used. This system has all the characteristics of optimal interpolation, combining detailed instrument and first guess error statistics to produce the best estimate of the atmospheric state. It also includes a new quality control (buddy check) system. Static tests with the system showed that its analysis increments are comparable to the latest NASA operational system, including maintenance of mass-wind balance. Results from a 2-month cycling test in the Goddard EOS Data Assimilation System (GEOS DAS) show this new analysis retains the same level of agreement between the first guess and observations (O-F statistics) throughout the entire two months.

  15. Computer-aided photometric analysis of dynamic digital bioluminescent images

    NASA Astrophysics Data System (ADS)

    Gorski, Zbigniew; Bembnista, T.; Floryszak-Wieczorek, J.; Domanski, Marek; Slawinski, Janusz

    2003-04-01

    The paper deals with photometric and morphologic analysis of bioluminescent images obtained by registration of light radiated directly from some plant objects. Registration of images obtained from ultra-weak light sources by the single photon counting (SPC) technique is the subject of this work. The radiation is registered by use of a 16-bit charge coupled device (CCD) camera "Night Owl" together with WinLight EG&G Berthold software. Additional application-specific software has been developed in order to deal with objects that are changing during the exposition time. The advantages of the elaborated set of easily configurable tools, named FCT, for computer-aided photometric and morphologic analysis of numerous series of quantitatively imperfect chemiluminescent images are described. Instructions on how to use these tools are given and exemplified with several algorithms for the transformation of an image library. Using the proposed FCT set, automatic photometric and morphologic analysis reveals the information hidden within series of chemiluminescent images reflecting defensive processes in poinsettia (Euphorbia pulcherrima Willd) leaves affected by the pathogenic fungus Botrytis cinerea.

  16. Privacy-preserving microbiome analysis using secure computation

    PubMed Central

    Wagner, Justin; Paulson, Joseph N.; Wang, Xiao; Bhattacharjee, Bobby; Corrada Bravo, Héctor

    2016-01-01

    Motivation: Developing targeted therapeutics and identifying biomarkers relies on large amounts of research participant data. Beyond human DNA, scientists now investigate the DNA of micro-organisms inhabiting the human body. Recent work shows that an individual’s collection of microbial DNA consistently identifies that person and could be used to link a real-world identity to a sensitive attribute in a research dataset. Unfortunately, the current suite of DNA-specific privacy-preserving analysis tools does not meet the requirements for microbiome sequencing studies. Results: To address privacy concerns around microbiome sequencing, we implement metagenomic analyses using secure computation. Our implementation allows comparative analysis over combined data without revealing the feature counts for any individual sample. We focus on three analyses and perform an evaluation on datasets currently used by the microbiome research community. We use our implementation to simulate sharing data between four policy-domains. Additionally, we describe an application of our implementation for patients to combine data that allows drug developers to query against and compensate patients for the analysis. Availability and implementation: The software is freely available for download at: http://cbcb.umd.edu/∼hcorrada/projects/secureseq.html Supplementary information: Supplementary data are available at Bioinformatics online. Contact: hcorrada@umiacs.umd.edu PMID:26873931

  17. Open Rotor Computational Aeroacoustic Analysis with an Immersed Boundary Method

    NASA Technical Reports Server (NTRS)

    Brehm, Christoph; Barad, Michael F.; Kiris, Cetin C.

    2016-01-01

    Reliable noise prediction capabilities are essential to enable novel fuel efficient open rotor designs that can meet the community and cabin noise standards. Toward this end, immersed boundary methods have reached a level of maturity so that they are being frequently employed for specific real world applications within NASA. This paper demonstrates that our higher-order immersed boundary method provides the ability for aeroacoustic analysis of wake-dominated flow fields generated by highly complex geometries. This is a first-of-its-kind aeroacoustic simulation of an open rotor propulsion system employing an immersed boundary method. In addition to discussing the peculiarities of applying the immersed boundary method to this moving-boundary problem, we provide a detailed aeroacoustic analysis of the noise generation mechanisms encountered in the open rotor flow. The simulation data are compared to available experimental data and other computational results employing more conventional CFD methods. The noise generation mechanisms are analyzed employing spectral analysis, proper orthogonal decomposition and the causality method.

  18. Detection of Organophosphorus Pesticides with Colorimetry and Computer Image Analysis.

    PubMed

    Li, Yanjie; Hou, Changjun; Lei, Jincan; Deng, Bo; Huang, Jing; Yang, Mei

    2016-01-01

    Organophosphorus pesticides (OPs) represent a very important class of pesticides that are widely used in agriculture because of their relatively high performance and moderate environmental persistence; hence the sensitive and specific detection of OPs is highly significant. Based on the inhibitory effect of inhibitors, including OPs and carbamates, on acetylcholinesterase (AChE), a colorimetric analysis was used for detection of OPs with computer image analysis of color density in CMYK (cyan, magenta, yellow and black) color space and non-linear modeling. The results showed a gradually weakening trend of yellow intensity with increasing concentration of dichlorvos. The quantitative analysis of dichlorvos was achieved by Artificial Neural Network (ANN) modeling, and the results showed that the established model had good predictive ability between training sets and predictive sets. Real cabbage samples containing dichlorvos were detected by colorimetry and gas chromatography (GC), respectively. The results showed that there was no significant difference between colorimetry and GC (P > 0.05). The experiments of accuracy, precision and repeatability revealed good performance for detection of OPs. AChE can also be inhibited by carbamates, and therefore this method has potential applications to real samples for OPs and carbamates because of its high selectivity and sensitivity. PMID:27396650
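
    The yellow-channel measurement described above can be sketched with the standard RGB-to-CMYK conversion; region selection, lighting correction, and the ANN calibration used in the paper are not modeled here.

      import numpy as np

      def mean_yellow_density(rgb):
          """Mean CMYK yellow value of an (H, W, 3) image scaled to [0, 1]."""
          b = rgb[..., 2]
          k = 1.0 - rgb.max(axis=-1)                # black channel
          denom = np.where(k < 1.0, 1.0 - k, 1.0)   # avoid divide-by-zero
          y = (1.0 - b - k) / denom                 # yellow channel
          return float(y.mean())

      # A slightly yellowish test patch (low blue relative to red/green)
      patch = np.full((4, 4, 3), [0.8, 0.8, 0.3])
      print(mean_yellow_density(patch))  # -> 0.625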

  19. Computer-aided target tracking in motion analysis studies

    NASA Astrophysics Data System (ADS)

    Burdick, Dominic C.; Marcuse, M. L.; Mislan, J. D.

    1990-08-01

    Motion analysis studies require the precise tracking of reference objects in sequential scenes. In a typical situation, events of interest are captured at high frame rates using special cameras, and selected objects or targets are tracked on a frame by frame basis to provide necessary data for motion reconstruction. Tracking is usually done using manual methods which are slow and prone to error. A computer based image analysis system has been developed that performs tracking automatically. The objective of this work was to eliminate the bottleneck due to manual methods in high volume tracking applications such as the analysis of crash test films for the automotive industry. The system has proven to be successful in tracking standard fiducial targets and other objects in crash test scenes. Over 95 percent of target positions which could be located using manual methods can be tracked by the system, with a significant improvement in throughput over manual methods. Future work will focus on the tracking of clusters of targets and on tracking deformable objects such as airbags.

  20. The analysis of control trajectories using symbolic and database computing

    NASA Technical Reports Server (NTRS)

    Grossman, Robert

    1991-01-01

    The research broadly concerned the symbolic computation, mixed numeric-symbolic computation, and database computation of trajectories of dynamical systems, especially control systems. It was determined that trees can be used to symbolically compute series that approximate solutions to differential equations.

  1. NALDA (Naval Aviation Logistics Data Analysis) CAI (computer aided instruction)

    SciTech Connect

    Handler, B.H.; France, P.A.; Frey, S.C.; Gaubas, N.F.; Hyland, K.J.; Lindsey, A.M.; Manley, D.O.; Hunnum, W.H.; Smith, D.L.

    1990-07-01

    Data Systems Engineering Organization (DSEO) personnel developed a prototype computer-aided instruction (CAI) system for the Naval Aviation Logistics Data Analysis (NALDA) system. The objective of this project was to provide a CAI prototype that could be used as an enhancement to existing NALDA training. The CAI prototype project was performed in phases. The task undertaken in Phase I was to analyze the problem and the alternative solutions and to develop a set of recommendations on how best to proceed. The findings from Phase I are documented in Recommended CAI Approach for the NALDA System (Duncan et al., 1987). In Phase II, a structured design and specifications were developed, and a prototype CAI system was created. A report, NALDA CAI Prototype: Phase II Final Report, was written to record the findings and results of Phase II. NALDA CAI: Recommendations for an Advanced Instructional Model comprises related papers encompassing research on computer-aided instruction (CAI), newly developing training technologies, instructional systems development, and an Advanced Instructional Model. These topics were selected because of their relevancy to the CAI needs of NALDA. These papers provide general background information on various aspects of CAI and give a broad overview of new technologies and their impact on the future design and development of training programs. The papers within have been indexed separately elsewhere.

  2. Equation of state and fragmentation issues in computational lethality analysis

    SciTech Connect

    Trucano, T.G.

    1993-07-01

    The purpose of this report is to summarize the status of computational analysis of hypervelocity impact lethality in relatively nontechnical terms from the perspective of the author. It is not intended to be a review of the technical literature on the problems of concern. The discussion is focused by concentrating on two phenomenology areas which are of particular concern in computational impact studies. First, the material's equation of state, specifically the treatment of expanded states of metals undergoing shock vaporization, is discussed. Second, the process of dynamic fragmentation is addressed. In both cases, the context of the discussion deals with inaccuracies and difficulties associated with numerical hypervelocity impact simulations. Laboratory experimental capabilities in hypervelocity impact for impact velocities greater than 10.0 km/s are becoming increasingly viable. This paper also gives recommendations for experimental thrusts which utilize these capabilities that will help to resolve the uncertainties in the numerical lethality studies that are pointed out in the present report.

  3. Survey of computer programs for heat transfer analysis

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.

    1986-01-01

    An overview is given of the current capabilities of thirty-three computer programs that are used to solve heat transfer problems. The programs considered range from large general-purpose codes with a broad spectrum of capabilities, a large user community, and comprehensive user support (e.g., ABAQUS, ANSYS, EAL, MARC, MITAS II, MSC/NASTRAN, and SAMCEF) to small, special-purpose codes with limited user communities such as ANDES, NTEMP, TAC2D, TAC3D, TEPSA and TRUMP. The majority of the programs use either finite elements or finite differences for the spatial discretization. The capabilities of the programs are listed in tabular form followed by a summary of the major features of each program. The information presented herein is based on a questionnaire sent to the developers of each program. This information is preceded by brief background material needed for effective evaluation and use of computer programs for heat transfer analysis. The present survey is useful in the initial selection of the programs which are most suitable for a particular application. The final selection of the program to be used should, however, be based on a detailed examination of the documentation and the literature about the program.

  4. Computational Analysis of the G-III Laminar Flow Glove

    NASA Technical Reports Server (NTRS)

    Malik, Mujeeb R.; Liao, Wei; Lee-Rausch, Elizabeth M.; Li, Fei; Choudhari, Meelan M.; Chang, Chau-Lyan

    2011-01-01

    Under NASA's Environmentally Responsible Aviation Project, flight experiments are planned with the primary objective of demonstrating the Discrete Roughness Elements (DRE) technology for passive laminar flow control at chord Reynolds numbers relevant to transport aircraft. In this paper, we present a preliminary computational assessment of the Gulfstream-III (G-III) aircraft wing-glove designed to attain natural laminar flow for a leading-edge sweep angle of 34.6 deg. Analysis for a flight Mach number of 0.75 shows that it should be possible to achieve natural laminar flow for twice the transition Reynolds number ever achieved at this sweep angle. However, the wing-glove needs to be redesigned to effectively demonstrate passive laminar flow control using DREs. As a by-product of the computational assessment, the effect of surface curvature on stationary crossflow disturbances is found to be strongly stabilizing for the current design, and it is suggested that convex surface curvature could be used as a control parameter for natural laminar flow design, provided transition occurs via stationary crossflow disturbances.

  5. Shell stability analysis in a computer aided engineering (CAE) environment

    NASA Technical Reports Server (NTRS)

    Arbocz, J.; Hol, J. M. A. M.

    1993-01-01

    The development of 'DISDECO', the Delft Interactive Shell DEsign COde is described. The purpose of this project is to make the accumulated theoretical, numerical and practical knowledge of the last 25 years or so readily accessible to users interested in the analysis of buckling sensitive structures. With this open ended, hierarchical, interactive computer code the user can access from his workstation successively programs of increasing complexity. The computational modules currently operational in DISDECO provide the prospective user with facilities to calculate the critical buckling loads of stiffened anisotropic shells under combined loading, to investigate the effects the various types of boundary conditions will have on the critical load, and to get a complete picture of the degrading effects the different shapes of possible initial imperfections might cause, all in one interactive session. Once a design is finalized, its collapse load can be verified by running a large refined model remotely from behind the workstation with one of the current generation 2-dimensional codes, with advanced capabilities to handle both geometric and material nonlinearities.

  6. Role of dielectric medium on benzylidene aniline: A computational analysis

    SciTech Connect

    Umamaheswari, U.; Ajeetha, N.; Ojha, D. P.

    2009-12-15

    A computational analysis of ordering in N-(p-n-ethoxy benzylidene)-p-n-butyl aniline (2O.4) was performed based on quantum mechanics and intermolecular forces. The atomic charge and dipole moment at each atomic centre were evaluated using the all-valence-electron CNDO/2 method. The modified Rayleigh-Schrodinger perturbation theory and multicentre-multipole expansion method were employed to evaluate long-range intermolecular interactions, while a 6-exp potential function was assumed for short-range interactions. The total interaction energy values obtained in these computations were used as input for calculating the probability of each configuration in a noninteracting and nonmesogenic solvent (i.e., benzene) at room temperature (300 K) using the Maxwell-Boltzmann formula. The molecular parameters of 2O.4, including the total energy, binding energy, and total dipole moment, were compared with those of N-(p-n-butoxy benzylidene)-p-n-ethyl aniline (4O.2). The present article offers theoretical support to the experimental observations, as well as a new and interesting way of looking at liquid crystalline molecules in a dielectric medium.

  7. Computational analysis of Variable Thrust Engine (VTE) performance

    NASA Technical Reports Server (NTRS)

    Giridharan, M. G.; Krishnan, A.; Przekwas, A. J.

    1993-01-01

    The Variable Thrust Engine (VTE) of the Orbital Maneuvering Vehicle (OMV) uses a hypergolic propellant combination of Monomethyl Hydrazine (MMH) and Nitrogen Tetroxide (NTO) as fuel and oxidizer, respectively. The performance of the VTE depends on a number of complex interacting phenomena such as atomization, spray dynamics, vaporization, turbulent mixing, convective/radiative heat transfer, and hypergolic combustion. This study involved the development of a comprehensive numerical methodology to facilitate detailed analysis of the VTE. An existing Computational Fluid Dynamics (CFD) code was extensively modified to include the following models: a two-liquid, two-phase Eulerian-Lagrangian spray model; a chemical equilibrium model; and a discrete ordinate radiation heat transfer model. The modified code was used to conduct a series of simulations to assess the effects of various physical phenomena and boundary conditions on the VTE performance. The details of the models and the results of the simulations are presented.

  8. Computational analysis of methods for reduction of induced drag

    NASA Technical Reports Server (NTRS)

    Janus, J. M.; Chatterjee, Animesh; Cave, Chris

    1993-01-01

    The purpose of this effort was to perform a computational flow analysis of a design concept centered around induced drag reduction and tip-vortex energy recovery. The flow model solves the unsteady three-dimensional Euler equations, discretized as a finite-volume method, utilizing a high-resolution approximate Riemann solver for cell interface flux definitions. The numerical scheme is an approximately-factored block LU implicit Newton iterative-refinement method. Multiblock domain decomposition is used to partition the field into an ordered arrangement of blocks. Three configurations are analyzed: a baseline fuselage-wing, a fuselage-wing-nacelle, and a fuselage-wing-nacelle-propfan. Aerodynamic force coefficients, propfan performance coefficients, and flowfield maps are used to qualitatively assess design efficacy. Where appropriate, comparisons are made with available experimental data.

  9. Optimal low thrust geocentric transfer. [mission analysis computer program

    NASA Technical Reports Server (NTRS)

    Edelbaum, T. N.; Sackett, L. L.; Malchow, H. L.

    1973-01-01

    A computer code which will rapidly calculate time-optimal low thrust transfers is being developed as a mission analysis tool. The final program will apply to NEP or SEP missions and will include a variety of environmental effects. The current program assumes constant acceleration. The oblateness effect and shadowing may be included. Detailed state and costate equations are given for the thrust effect, oblateness effect, and shadowing. A simple but adequate model yields analytical formulas for power degradation due to the Van Allen radiation belts for SEP missions. The program avoids the classical singularities by the use of equinoctial orbital elements. Kryloff-Bogoliuboff averaging is used to facilitate rapid calculation. Results for selected cases using the current program are given.
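
    The equinoctial elements mentioned above replace eccentricity- and inclination-dependent angles with combinations that remain well defined at zero eccentricity and zero inclination, which is how the classical singularities are avoided. A sketch of one common convention follows (conventions differ between authors):

      import numpy as np

      def equinoctial_from_keplerian(a, e, i, raan, argp, nu):
          """Classical elements (angles in radians) to equinoctial
          (a, h, k, p, q, L), nonsingular for e = 0 and i = 0."""
          h = e * np.sin(argp + raan)
          k = e * np.cos(argp + raan)
          p = np.tan(i / 2.0) * np.sin(raan)
          q = np.tan(i / 2.0) * np.cos(raan)
          L = raan + argp + nu            # true longitude
          return a, h, k, p, q, L

      # A near-circular, near-equatorial orbit stays well behaved
      print(equinoctial_from_keplerian(7000.0, 1e-4, 1e-3, 0.1, 0.2, 0.3))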

  10. Computational analysis of promoter elements and chromatin features in yeast.

    PubMed

    Wyrick, John J

    2012-01-01

    Regulatory elements in promoter sequences typically function as binding sites for transcription factor proteins and thus are critical determinants of gene transcription. There is growing evidence that chromatin features, such as histone modifications or nucleosome positions, also have important roles in transcriptional regulation. Recent functional genomics and computational studies have yielded extensive datasets cataloging transcription factor binding sites (TFBS) and chromatin features, such as nucleosome positions, throughout the yeast genome. However, much of this data can be difficult to navigate or analyze efficiently. This chapter describes practical methods for the visualization, data mining, and statistical analysis of yeast promoter elements and chromatin features using two Web-accessible bioinformatics databases: ChromatinDB and Ceres. PMID:22113279

  11. Fast computation of Lagrangian coherent structures: algorithms and error analysis

    NASA Astrophysics Data System (ADS)

    Brunton, Steven; Rowley, Clarence

    2009-11-01

    This work investigates a number of efficient methods for computing finite time Lyapunov exponent (FTLE) fields in unsteady flows by approximating the particle flow map and eliminating redundant particle integrations in neighboring flow maps. Ridges of the FTLE fields are Lagrangian coherent structures (LCS) and provide an unsteady analogue of invariant manifolds from dynamical systems theory. The fast methods fall into two categories, unidirectional and bidirectional, depending on whether flow maps in one or both time directions are composed to form an approximate flow map. An error analysis is presented which shows that the unidirectional methods are accurate while the bidirectional methods have significant error which is aligned with the opposite time coherent structures. This relies on the fact that material from the positive time LCS attracts onto the negative time LCS near time-dependent saddle points.
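
    For reference, the FTLE field discussed above is obtained from the spatial gradient of the flow map: sigma = (1/|T|) ln sqrt(lambda_max(F^T F)). The direct implementation below is the slow baseline whose redundant particle integrations the fast methods are designed to eliminate.

      import numpy as np

      def ftle(phi_x, phi_y, dx, dy, T):
          """FTLE on a 2D grid from flow-map components phi_x, phi_y
          (final particle positions after integration time T)."""
          dphix_dx, dphix_dy = np.gradient(phi_x, dx, dy)
          dphiy_dx, dphiy_dy = np.gradient(phi_y, dx, dy)
          sigma = np.zeros_like(phi_x)
          for idx in np.ndindex(sigma.shape):
              F = np.array([[dphix_dx[idx], dphix_dy[idx]],
                            [dphiy_dx[idx], dphiy_dy[idx]]])
              C = F.T @ F                            # Cauchy-Green tensor
              sigma[idx] = np.log(np.sqrt(np.linalg.eigvalsh(C)[-1])) / abs(T)
          return sigma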

  12. Satellite Interference Analysis and Simulation Using Personal Computers

    NASA Technical Reports Server (NTRS)

    Kantak, Anil

    1988-01-01

    This report presents the complete analysis and formulas necessary to quantify the interference experienced by a generic satellite communications receiving station due to an interfering satellite. Both satellites, the desired as well as the interfering satellite, are considered to be in elliptical orbits. Formulas are developed for the satellite look angles and the satellite transmit angles generally related to the land mask of the receiving station site for both satellites. Formulas for considering the Doppler effect due to the satellite motion as well as the Earth's rotation are developed. The effect of the interfering-satellite signal modulation and the Doppler effect on the power received are considered. The statistical formulation of the interference effect is presented in the form of a histogram of the interference to the desired signal power ratio. Finally, a computer program suitable for microcomputers such as the IBM AT is provided with the flowchart, a sample run, results of the run, and the program code.

  13. Automation of Large-scale Computer Cluster Monitoring Information Analysis

    NASA Astrophysics Data System (ADS)

    Magradze, Erekle; Nadal, Jordi; Quadt, Arnulf; Kawamura, Gen; Musheghyan, Haykuhi

    2015-12-01

    High-throughput computing platforms consist of a complex infrastructure and provide a number of services prone to failure. To mitigate the impact of failures on the quality of the provided services, constant monitoring and timely reaction are required, which is impossible without automation of the system administration processes. This paper introduces a way of automating the analysis of monitoring information to provide long- and short-term predictions of the service response time (SRT) for mass storage and batch systems and to identify the status of a service at a given time. The approach for the SRT predictions is based on the Adaptive Neuro Fuzzy Inference System (ANFIS). An evaluation of the approaches is performed on real monitoring data from the WLCG Tier 2 center GoeGrid. Ten-fold cross-validation results demonstrate the high efficiency of both approaches in comparison to known methods.

  14. Meta-Analysis and Computer-Mediated Communication.

    PubMed

    Taylor, Alan M

    2016-04-01

    Because of the use of human participants and differing contextual variables, research in second language acquisition often produces conflicting results, leaving practitioners confused and unsure of the effectiveness of specific treatments. This article provides insight into a recent seminal meta-analysis on the effectiveness of computer-mediated communication, providing further statistical evidence of the importance of its results. The significance of the study is examined by looking at the p values included in the references, to demonstrate how results can easily be misconstrued by practitioners and researchers. Lin's conclusion regarding the research setting of the study reports is also evaluated. In doing so, other possible explanations of what may be influencing the results can be proposed. PMID:27154373

  15. Importance sampling. I. Computing multimodel p values in linkage analysis

    SciTech Connect

    Kong, A.; Frigge, M.; Irwin, M.; Cox, N. )

    1992-12-01

    In linkage analysis, when the lod score is maximized over multiple genetic models, the standard asymptotic approximation of the significance level does not apply. Monte Carlo methods can be used to estimate the p value, but procedures currently used are extremely inefficient. The authors propose a Monte Carlo procedure based on the concept of importance sampling, which can be thousands of times more efficient than current procedures. With a reasonable amount of computing time, extremely accurate estimates of the p values can be obtained. Both theoretical results and an example of maturity-onset diabetes of the young (MODY) are presented to illustrate the efficiency of their method. Relations between single-model and multimodel p values are explored. The new procedure is also used to investigate the performance of asymptotic approximations in a single-model situation. 22 refs., 6 figs., 1 tab.
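
    The idea of importance sampling used above is to estimate a small tail probability by drawing from a proposal density g concentrated in the tail and reweighting by f/g. A generic sketch follows, verified against a known Gaussian tail; it is not the authors' linkage-specific procedure.

      import numpy as np
      from scipy.stats import norm

      rng = np.random.default_rng(1)

      def tail_prob_importance(threshold, n=100_000):
          """Estimate p = P(Z >= threshold) for Z ~ N(0, 1) by sampling
          from the shifted proposal g = N(threshold, 1)."""
          x = rng.normal(threshold, 1.0, n)              # draws from g
          w = norm.pdf(x) / norm.pdf(x, loc=threshold)   # f(x) / g(x)
          return np.mean((x >= threshold) * w)

      print(tail_prob_importance(4.0), "exact:", norm.sf(4.0))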

  16. Whole-genome CNV analysis: advances in computational approaches

    PubMed Central

    Pirooznia, Mehdi; Goes, Fernando S.; Zandi, Peter P.

    2015-01-01

    Accumulating evidence indicates that DNA copy number variation (CNV) is likely to make a significant contribution to human diversity and also play an important role in disease susceptibility. Recent advances in genome sequencing technologies have enabled the characterization of a variety of genomic features, including CNVs. This has led to the development of several bioinformatics approaches to detect CNVs from next-generation sequencing data. Here, we review recent advances in CNV detection from whole genome sequencing. We discuss the informatics approaches and current computational tools that have been developed as well as their strengths and limitations. This review will assist researchers and analysts in choosing the most suitable tools for CNV analysis as well as provide suggestions for new directions in future development. PMID:25918519

  17. Data analysis using the Gnu R system for statistical computation

    SciTech Connect

    Simone, James; /Fermilab

    2011-07-01

    R is a language system for statistical computation. It is widely used in statistics, bioinformatics, machine learning, data mining, quantitative finance, and the analysis of clinical drug trials. Among the advantages of R are: it has become the standard language for developing statistical techniques, it is being actively developed by a large and growing global user community, it is open source software, it is highly portable (Linux, OS-X and Windows), it has a built-in documentation system, it produces high quality graphics and it is easily extensible with over four thousand extension library packages available covering statistics and applications. This report gives a very brief introduction to R with some examples using lattice QCD simulation results. It then discusses the development of R packages designed for chi-square minimization fits for lattice n-pt correlation functions.

  18. Computational analysis of the SSME fuel preburner flow

    NASA Technical Reports Server (NTRS)

    Wang, T. S.; Farmer, R. C.

    1986-01-01

    A computational fluid dynamics model which simulates the steady state operation of the SSME fuel preburner is developed. Specifically, the model will be used to quantify the flow factors which cause local hot spots in the fuel preburner in order to recommend experiments whereby the control of undesirable flow features can be demonstrated. The results of a two year effort to model the preburner are presented. In this effort, investigating the fuel preburner flowfield, the appropriate transport equations were numerically solved for both an axisymmetric and a three-dimensional configuration. Continuum's VAST (Variational Solution of the Transport equations) code, in conjunction with the CM-1000 Engineering Analysis Workstation and the NASA/Ames CYBER 205, was used to perform the required calculations. It is concluded that the preburner operational anomalies are not due to steady state phenomena and must, therefore, be related to transient operational procedures.

  19. Modeling And Simulation Of Bar Code Scanners Using Computer Aided Design Software

    NASA Astrophysics Data System (ADS)

    Hellekson, Ron; Campbell, Scott

    1988-06-01

    Many optical systems have demanding requirements to package the system in a small 3-dimensional space. The use of computer graphic tools can be a tremendous aid to the designer in analyzing the optical problems created by smaller and less costly systems. The Spectra Physics grocery store bar code scanner employs an especially complex 3-dimensional scan pattern to read bar code labels. By using a specially written program which interfaces with a computer-aided design system, we have simulated many of the functions of this complex optical system. In this paper we illustrate how a recent version of the scanner was designed. We discuss the use of computer graphics in the design process, including interactive tweaking of the scan pattern, analysis of collected light, analysis of the scan pattern density, and analysis of the manufacturing tolerances used to build the scanner.

  20. Triclosan Computational Conformational Chemistry Analysis for Antimicrobial Properties in Polymers

    PubMed Central

    Petersen, Richard C.

    2015-01-01

    Triclosan is a diphenyl ether antimicrobial that has been analyzed by computational conformational chemistry for an understanding of Mechanomolecular Theory. Subsequent energy profile analysis combined with easily seen three-dimensional chemistry structure models for the nonpolar molecule Triclosan shows how single bond rotations can alternate rapidly at a polar and nonpolar interface. Bond rotations for the center ether oxygen atom of the two aromatic rings then expose or hide nonbonding lone-pair electrons for the oxygen atom depending on the polar nature of the immediate local molecular environment. Rapid bond movements can subsequently produce fluctuations as vibration energy. Consequently, related mechanical molecular movements calculated as energy relationships by forces acting through different bond positions can help improve on current Mechanomolecular Theory. A previous controversy, reported as a discrepancy in the literature, contends that bacterial resistance to Triclosan is possible. However, findings in clinical settings have not reported a single case of Triclosan bacterial resistance in over 40 years, as documented carefully in government reports. As a result, Triclosan is recommended whenever there is a health benefit, consistent with a number of approvals for use of Triclosan in healthcare devices. Since Triclosan is the most researched antimicrobial ever, literature meta-analysis with computational chemistry can best describe new molecular conditions that were previously impossible to characterize by conventional chemistry methods. Triclosan vibrational energy can now explain the molecular disruption of bacterial membranes. Further, Triclosan mechanomolecular movements help illustrate use in polymer matrix composites as an antimicrobial with two new additive properties: as a toughening agent to improve matrix fracture toughness from microcracking and as a hydrophobic wetting agent to help incorporate strengthening fibers. Interrelated

  1. Computer-assisted sperm analysis (CASA): capabilities and potential developments.

    PubMed

    Amann, Rupert P; Waberski, Dagmar

    2014-01-01

    Computer-assisted sperm analysis (CASA) systems have evolved over approximately 40 years, through advances in devices to capture the image from a microscope, huge increases in computational power concurrent with amazing reduction in size of computers, new computer languages, and updated/expanded software algorithms. Remarkably, basic concepts for identifying sperm and their motion patterns are little changed. Older and slower systems remain in use. Most major spermatology laboratories and semen processing facilities have a CASA system, but the extent of reliance thereon ranges widely. This review describes capabilities and limitations of present CASA technology used with boar, bull, and stallion sperm, followed by possible future developments. Each marketed system is different. Modern CASA systems can automatically view multiple fields in a shallow specimen chamber to capture strobe-like images of 500 to >2000 sperm, at 50 or 60 frames per second, in clear or complex extenders, and in <2 minutes, store information for ≥ 30 frames and provide summary data for each spermatozoon and the population. A few systems evaluate sperm morphology concurrent with motion. CASA cannot accurately predict 'fertility' that will be obtained with a semen sample or subject. However, when carefully validated, current CASA systems provide information important for quality assurance of semen planned for marketing, and for the understanding of the diversity of sperm responses to changes in the microenvironment in research. The four take-home messages from this review are: (1) animal species, extender or medium, specimen chamber, intensity of illumination, imaging hardware and software, instrument settings, technician, etc., all affect accuracy and precision of output values; (2) semen production facilities probably do not need a substantially different CASA system whereas biology laboratories would benefit from systems capable of imaging and tracking sperm in deep chambers for a flexible
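
    As background for the kinematic measures discussed above, the basic CASA motion parameters are simple functions of a tracked centroid path. A minimal sketch of three standard ones (VCL, VSL, LIN) follows; smoothing, thresholds, and chamber-depth corrections used by commercial systems are omitted.

      import numpy as np

      def casa_kinematics(track_xy, fps):
          """VCL, VSL and LIN from an (N, 2) track in micrometres,
          one row per frame captured at fps frames per second."""
          track_xy = np.asarray(track_xy, dtype=float)
          duration = (len(track_xy) - 1) / fps               # seconds
          steps = np.diff(track_xy, axis=0)                  # per-frame moves
          vcl = np.linalg.norm(steps, axis=1).sum() / duration         # path speed
          vsl = np.linalg.norm(track_xy[-1] - track_xy[0]) / duration  # net speed
          lin = vsl / vcl if vcl > 0 else 0.0                # linearity
          return vcl, vsl, lin

      # A gently curving 10-frame track sampled at 50 frames per second
      track = [(i, 0.2 * i ** 1.5) for i in range(10)]
      print(casa_kinematics(track, fps=50))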

  2. Green's Function Analysis of Periodic Structures in Computational Electromagnetics

    NASA Astrophysics Data System (ADS)

    Van Orden, Derek

    2011-12-01

    Periodic structures are used widely in electromagnetic devices, including filters, waveguiding structures, and antennas. Their electromagnetic properties may be analyzed computationally by solving an integral equation, in which an unknown equivalent current distribution in a single unit cell is convolved with a periodic Green's function that accounts for the system's boundary conditions. Fast computation of the periodic Green's function is therefore essential to achieve high accuracy solutions of complicated periodic structures, including analysis of modal wave propagation and scattering from external sources. This dissertation first presents alternative spectral representations of the periodic Green's function of the Helmholtz equation for cases of linear periodic systems in 2D and 3D free space and near planarly layered media. Although there exist multiple representations of the periodic Green's function, most are not efficient in the important case where the fields are observed near the array axis. We present spectral-spatial representations for rapid calculation of the periodic Green's functions for linear periodic arrays of current sources residing in free space as well as near a planarly layered medium. They are based on the integral expansion of the periodic Green's functions in terms of the spectral parameters transverse to the array axis. These schemes are important for the rapid computation of the interaction among unit cells of a periodic array, and, by extension, the complex dispersion relations of guided waves. Extensions of this approach to planar periodic structures are discussed. With these computation tools established, we study the traveling wave properties of linear resonant arrays placed near surfaces, and examine the coupling mechanisms that lead to radiation into guided waves supported by the surface. This behavior is especially important to understand the properties of periodic structures printed on dielectric substrates, such as periodic

  3. A Large-Scale Computational Analysis of Corneal Structural Response and Ectasia Risk in Myopic Laser Refractive Surgery

    PubMed Central

    Dupps, William Joseph; Seven, Ibrahim

    2016-01-01

    Purpose: To investigate biomechanical strain as a structural susceptibility metric for corneal ectasia in a large-scale computational trial. Methods: A finite element modeling study was performed using retrospective Scheimpflug tomography data from 40 eyes of 40 patients. LASIK and PRK were simulated with varied myopic ablation profiles and flap thickness parameters across eyes from LASIK candidates, patients disqualified for LASIK, subjects with atypical topography, and keratoconus subjects in 280 simulations. Finite element analysis output was then interrogated to extract several risk and outcome variables. We tested the hypothesis that strain is greater in known at-risk eyes than in normal eyes, evaluated the ability of a candidate strain variable to differentiate eyes that were empirically disqualified as LASIK candidates, and compared the performance of common risk variables as predictors of this novel susceptibility marker across multiple virtual subjects and surgeries. Results: A candidate susceptibility metric that expressed mean strains across the anterior residual stromal bed was significantly higher in eyes with confirmed ectatic predisposition in preoperative and all postoperative cases (P≤.003). The strain metric was effective at differentiating normal and at-risk eyes (area under receiver operating characteristic curve ≥ 0.83, P≤.002), was highly correlated to thickness-based risk metrics (as high as R2 = 95%, P<.001 for the percent of stromal tissue altered (PSTA)), and predicted large portions of the variance in predicted refractive response to surgery (R2 = 57%, P<.001). Conclusions: This study represents the first large-scale 3-dimensional structural analysis of ectasia risk and provides a novel biomechanical construct for expressing structural risk in refractive surgery. Mechanical strain is an effective marker of known ectasia risk and correlates to predicted refractive error after myopic photoablative surgery.

  4. Customizable Computer-Based Interaction Analysis for Coaching and Self-Regulation in Synchronous CSCL Systems

    ERIC Educational Resources Information Center

    Lonchamp, Jacques

    2010-01-01

    Computer-based interaction analysis (IA) is an automatic process that aims at understanding a computer-mediated activity. In a CSCL system, computer-based IA can provide information directly to learners for self-assessment and regulation and to tutors for coaching support. This article proposes a customizable computer-based IA approach for a…

  5. The role of computed tomography in terminal ballistic analysis.

    PubMed

    Rutty, G N; Boyce, P; Robinson, C E; Jeffery, A J; Morgan, B

    2008-01-01

    Terminal ballistics concerns the science of projectile behaviour within a target and includes wound ballistics that considers what happens when a projectile strikes a living being. A number of soft tissue ballistic simulants have been used to assess the damage to tissue caused by projectiles. Standard assessment of these materials, such as ballistic soap or ordnance gelatine, requires that the block be opened or that a mould be made to visualize the wound track. This is time consuming and may affect the accuracy of the findings, especially if the block dries and alters shape during the process. Therefore, accurate numerical analysis of the permanent or temporary cavity is limited. Computed tomography (CT) potentially offers a quicker non-invasive analysis tool for this task. Four commercially purchased ballistic glycerine soap blocks were used. Each had a single firearm discharged into it from a distance of approximately 15 cm using both gunshot and shotgun projectiles. After discharge, each block was imaged by a modern 16 slice multi-detector CT scanner and analysed using 3-D reconstruction software. Using the anterior-posterior and lateral scout views and the multi-plane reconstructed images, it was possible to visualize the temporary cavity, as well as the fragmentation and dispersal pattern of the projectiles, the distance travelled and angle of dispersal within the block of each projectile or fragment. A virtual cast of the temporary cavity can also be made. Multi-detector CT with 3-D analysis software is shown to create a reliable permanent record of the projectile path allowing rapid analysis of different firearms and projectiles. PMID:17205351

  6. Summary of research in applied mathematics, numerical analysis, and computer sciences

    NASA Technical Reports Server (NTRS)

    1986-01-01

    The major categories of current ICASE research programs addressed include: numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; control and parameter identification problems, with emphasis on effective numerical methods; computational problems in engineering and physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and computer systems and software, especially vector and parallel computers.

  7. Creating 3-dimensional Models of the Photosphere using the SIR Code

    NASA Astrophysics Data System (ADS)

    Thonhofer, S.; Utz, D.; Jurčák, J.; Pauritsch, J.; Hanslmeier, A.; Lemmerer, B.

    A high-resolution 3-dimensional model of the photospheric magnetic field is essential for the investigation of magnetic features such as sunspots, pores or smaller elements like single flux tubes seen as magnetic bright points. The SIR code is an advanced inversion code that retrieves physical quantities, e.g. magnetic field, from Stokes profiles. Based on this code, we developed a program for automated inversion of Hinode SOT/SP data and for storing the results in 3-dimensional data cubes in the form of FITS files. We obtained models of the temperature, magnetic field strength, magnetic field angles and LOS velocity in a region of the quiet Sun. We give a first discussion of these parameters with regard to small-scale magnetic fields and to what can be obtained and learned in the future.
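
    A minimal sketch of the storage step, writing one inverted quantity into a 3-dimensional FITS cube with astropy; the array dimensions and contents here are placeholders, not SIR output.

        import numpy as np
        from astropy.io import fits

        ndepth, ny, nx = 70, 512, 512    # optical-depth grid and map size (assumed)
        field_strength = np.zeros((ndepth, ny, nx), dtype=np.float32)  # e.g. |B| in gauss

        hdu = fits.PrimaryHDU(field_strength)
        hdu.header["BUNIT"] = "gauss"
        hdu.header["COMMENT"] = "Field strength vs. optical depth from a SIR inversion"
        hdu.writeto("field_strength_cube.fits", overwrite=True)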

  8. Conditioned Media From Adipose-Derived Stromal Cells Accelerates Healing in 3-Dimensional Skin Cultures.

    PubMed

    Collawn, Sherry S; Mobley, James A; Banerjee, N Sanjib; Chow, Louise T

    2016-04-01

    Wound healing involves a number of factors that result in the production of a "closed" wound. Studies have shown, in animal models, acceleration of wound healing with the addition of adipose-derived stromal cells (ADSC). The cause of the positive effect these cells have on wound healing has not been elucidated. We have previously shown that addition of ADSC to the dermal equivalent in 3-dimensional skin cultures accelerates reepithelialization. We now demonstrate that conditioned media (CM) from cultured ADSC produced a similar rate of healing. This result suggests that feedback from the 3-dimensional epithelial cultures to ADSC was not necessary to effect the accelerated reepithelialization. Mass spectrometry of CM from ADSC and primary human fibroblasts revealed differences in secretomes, some of which might have roles in accelerating wound healing. Thus, the use of CM has provided some preliminary information on a possible mode of action. PMID:26954733

  9. 3-Dimensional modelling of chick embryo eye development and growth using high resolution magnetic resonance imaging.

    PubMed

    Goodall, Nicola; Kisiswa, Lilian; Prashar, Ankush; Faulkner, Stuart; Tokarczuk, Paweł; Singh, Krish; Erichsen, Jonathan T; Guggenheim, Jez; Halfter, Willi; Wride, Michael A

    2009-10-01

    Magnetic resonance imaging (MRI) is a powerful tool for generating 3-dimensional structural and functional image data. MRI has already proven valuable in creating atlases of mouse and quail development. Here, we have exploited high resolution MRI to determine the parameters necessary to acquire images of the chick embryo eye. Using a 9.4 Tesla (400 MHz) high field ultra-shielded and refrigerated magnet (Bruker), MRI was carried out on paraformaldehyde-fixed chick embryos or heads at E4, E6, E8, and E10. Image data were processed using established and custom packages (MRICro, ImageJ, ParaVision, Bruker and mri3dX). Voxel dimensions ranged from 62.5 μm to 117.2 μm. We subsequently used the images obtained from the MRI data to make precise measurements of chick embryo eye surface area, volume and axial length from E4 to E10. MRI was validated for accurate sizing of ocular tissue features by direct comparison with previously published literature. Furthermore, we demonstrate the utility of high resolution MRI for making accurate measurements of morphological changes due to experimental manipulation of chick eye development, thereby facilitating a better understanding of the effects of such manipulations on chick embryo eye development and growth. Chondroitin sulphate or heparin was microinjected into the vitreous cavity of the right eyes of each of 3 embryos at E5. At E10, embryos were fixed and various eye parameters (volume, surface area, axial length and equatorial diameter) were determined using MRI and normalised with respect to the un-injected left eyes. Statistically significant alterations in eye volume (p < 0.05; increases with chondroitin sulphate and decreases with heparin) and changes in vitreous homogeneity were observed in embryos following microinjection of glycosaminoglycans. Furthermore, in the heparin-injected eyes, significant disturbances at the vitreo-retinal boundary were observed as well as retinal folding and detachment

  10. Acromiohumeral Distance and 3-Dimensional Scapular Position Change After Overhead Muscle Fatigue

    PubMed Central

    Maenhout, Annelies; Dhooge, Famke; Van Herzeele, Maarten; Palmans, Tanneke; Cools, Ann

    2015-01-01

    Context: Muscle fatigue due to repetitive and prolonged overhead sports activity is considered an important factor contributing to impingement-related rotator cuff pathologic conditions in overhead athletes. The evidence on scapular and glenohumeral kinematic changes after fatigue is contradictory and precludes conclusions about how shoulder muscle fatigue affects acromiohumeral distance. Objective: To investigate the effect of a fatigue protocol resembling overhead sports activity on acromiohumeral distance and 3-dimensional scapular position in overhead athletes. Design: Cross-sectional study. Setting: Institutional laboratory. Patients or Other Participants: A total of 29 healthy recreational overhead athletes (14 men, 15 women; age = 22.23 ± 2.82 years, height = 178.3 ± 7.8 cm, mass = 71.6 ± 9.5 kg). Intervention(s): The athletes were tested before and after a shoulder muscle-fatiguing protocol. Main Outcome Measure(s): Acromiohumeral distance was measured using ultrasound, and scapular position was determined with an electromagnetic motion-tracking system. Both measurements were performed at 3 elevation positions (0°, 45°, and 60° of abduction). We used a 3-factor mixed model for data analysis. Results: After fatigue, the acromiohumeral distance increased when the upper extremity was actively positioned at 45° (Δ = 0.78 ± 0.24 mm, P = .002) or 60° (Δ = 0.58 ± 0.23 mm, P = .02) of abduction. Scapular position changed after fatigue to a more externally rotated position at 45° (Δ = 4.97° ± 1.13°, P < .001) and 60° (Δ = 4.61° ± 1.90°, P = .001) of abduction, a more upwardly rotated position at 45° (Δ = 6.10° ± 1.30°, P < .001) and 60° (Δ = 7.20° ± 1.65°, P < .001) of abduction, and a more posteriorly tilted position at 0°, 45°, and 60° of abduction (Δ = 1.98° ± 0.41°, P < .001). Conclusions: After a fatiguing protocol, we found changes in acromiohumeral distance and scapular position that corresponded with an impingement

  11. Quantitative Computed Tomography and Image Analysis for Advanced Muscle Assessment

    PubMed Central

    Edmunds, Kyle Joseph; Gíslason, Magnus K.; Arnadottir, Iris D.; Marcante, Andrea; Piccione, Francesco; Gargiulo, Paolo

    2016-01-01

    Medical imaging is of particular interest in the field of translational myology, as the extant literature describes the use of a wide variety of techniques to non-invasively recapitulate and quantify various internal and external tissue morphologies. In the clinical context, medical imaging remains a vital tool for diagnostics and investigative assessment. This review outlines the results from several investigations on the use of computed tomography (CT) and image analysis techniques to assess muscle conditions and degenerative processes due to aging or pathological conditions. Herein, we detail the acquisition of spiral CT images and the use of advanced image analysis tools to characterize muscles in 2D and 3D. Results from these studies recapitulate changes in tissue composition within muscles, as visualized by the association of tissue types to specified Hounsfield Unit (HU) values for fat, loose connective tissue or atrophic muscle, and normal muscle, including fascia and tendon. We show how results from these analyses can be presented as both average HU values and compositions with respect to total muscle volumes, demonstrating the reliability of these tools to monitor, assess and characterize muscle degeneration. PMID:27478562
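
    The HU-based composition analysis can be sketched as follows. The HU windows are illustrative assumptions rather than the authors' calibrated values, and the CT volume and segmentation are random stand-ins.

        import numpy as np

        hu = np.random.randint(-200, 201, size=(64, 64, 64))   # stand-in CT volume (HU)
        muscle_mask = np.ones(hu.shape, dtype=bool)            # stand-in muscle segmentation

        ranges = {
            "fat":              (-200, -10),   # illustrative HU windows, not the
            "loose_connective": (-9, 40),      # authors' calibrated values
            "normal_muscle":    (41, 200),
        }
        voxels = hu[muscle_mask]
        for tissue, (lo, hi) in ranges.items():
            frac = np.mean((voxels >= lo) & (voxels <= hi))
            print(f"{tissue}: {100 * frac:.1f}% of muscle volume")
        print(f"mean HU: {voxels.mean():.1f}")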

  12. Computer automated movement detection for the analysis of behavior

    PubMed Central

    Ramazani, Roseanna B.; Krishnan, Harish R.; Bergeson, Susan E.; Atkinson, Nigel S.

    2007-01-01

    Currently, measuring ethanol behaviors in flies depends on expensive image analysis software or time-intensive experimenter observation. We have designed an automated system for the collection and analysis of locomotor behavior data, using the IEEE 1394 acquisition program dvgrab, the image toolkit ImageMagick and the programming language Perl. In the proposed method, flies are placed in a clear container and a computer-controlled camera takes pictures at regular intervals. Digital subtraction removes the background and non-moving flies, leaving white pixels where movement has occurred. These pixels are tallied, giving a value that corresponds to the number of animals that have moved between images. Perl scripts automate these processes, allowing compatibility with high-throughput genetic screens. Four experiments demonstrate the utility of this method: the first shows heat-induced locomotor changes; the second, tolerance to ethanol in a climbing assay; the third, tolerance to ethanol scored by the recovery of individual flies; and the fourth, a mouse's preference for a novel object. Our lab will use this method to conduct a genetic screen for ethanol-induced hyperactivity and sedation; however, it could also be used to analyze the locomotor behavior of any organism. PMID:17335906
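
    The digital-subtraction step translates directly into a few lines of Python (the authors used dvgrab, ImageMagick, and Perl); the threshold and filenames below are hypothetical.

        import numpy as np
        from PIL import Image

        def movement_score(frame_a_path, frame_b_path, threshold=30):
            """Count pixels that changed between two consecutive grayscale frames."""
            a = np.asarray(Image.open(frame_a_path).convert("L"), dtype=np.int16)
            b = np.asarray(Image.open(frame_b_path).convert("L"), dtype=np.int16)
            changed = np.abs(b - a) > threshold   # suppress sensor noise below threshold
            return int(changed.sum())             # proxy for how many animals moved

        # score = movement_score("frame_0001.png", "frame_0002.png")  # hypothetical files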

  13. Consequence analysis in LPG installation using an integrated computer package.

    PubMed

    Ditali, S; Colombi, M; Moreschini, G; Senni, S

    2000-01-01

    This paper presents the prototype of the computer code Atlantide, developed to assess the consequences associated with accidental events that can occur in an LPG storage plant. Atlantide is designed to be simple yet adequate for consequence analysis as required by the Italian legislation implementing the Seveso Directive, and it is appropriate for LPG storage/transfer installations. The models and correlations implemented in the code cover flashing liquid releases, heavy gas dispersion and other typical phenomena such as BLEVE/fireball. On the basis of the operating/design characteristics, the code allows the study of the relevant accidental events, from the evaluation of the release rate (liquid, gaseous and two-phase) in the unit involved, through the analysis of the subsequent evaporation and dispersion, up to the assessment of the final phenomena of fire and explosion. This is done with reference to simplified event trees that describe the evolution of accidental scenarios, taking into account the most likely meteorological conditions, the different release situations and other features typical of an LPG installation. The limited input data required and the automatic linking between the individual models, which are activated in a defined sequence depending on the accidental event selected, minimize both the time required for the risk analysis and the possibility of errors. The models and equations implemented in Atlantide were selected from the public literature or in-house developed software and tailored to be easy to use and fast to run while still providing realistic simulation of the accidental event and reliable results in terms of physical effects and hazardous areas. The results have been compared with those of other internationally recognized codes and with the criteria adopted by Italian authorities to verify the Safety Reports for LPG

  14. Towards non-AdS holography in 3-dimensional higher spin gravity

    NASA Astrophysics Data System (ADS)

    Gary, Michael; Grumiller, Daniel; Rashkov, Radoslav

    2012-03-01

    We take the first steps towards non-AdS holography in higher spin gravity. Namely, we propose a variational principle for generic 3-dimensional higher spin gravity that accommodates asymptotic backgrounds beyond AdS, like asymptotically Schrödinger, Lifshitz or warped AdS spacetimes. As examples we study in some detail the four sl(2) embeddings of spin-4 gravity and provide associated geometries, including an asymptotic Lifshitz black hole.

  15. Energy Sources of the Dominant Frequency Dependent 3-dimensional Atmospheric Modes

    NASA Technical Reports Server (NTRS)

    Schubert, S.

    1985-01-01

    The energy sources and sinks associated with the zonally asymmetric winter mean flow are investigated as part of an on-going study of atmospheric variability. Distinctly different horizontal structures for the long, intermediate and short time scale atmospheric variations were noted. In previous observations, the 3-dimensional structure of the fluctuations is investigated and the relative roles of barotropic and baroclinic terms are assessed.

  16. The computer analysis of interstitial implants in radiotherapy.

    PubMed

    Hudson, F R; Denham, J W

    1985-01-01

    The use of computer techniques in the analysis of needle implants in radiotherapy treatment is assessed from the point of view of the end user. A routine approach to the analysis is specified so that consistent assessments of implants will be possible, permitting the accumulation of dosimetry data for correlation with clinical outcome. The protocol, in summary, consists of the identification of the plane in which the implant lies and of the presentation of dose rate information in the form of standardised isodose contours through three transverse slices at right angles to the "plane of lie". These transverse slices have been selected at the averaged centre of the sources (the "central plane" of Pierquin et al. [14]) and at 5 mm distance from the averaged proximal and distal active ends of the sources. The isodose contours chosen for presentation include two "reference dose" contours, representing 85 and 127% of the minimum dose rate found in the central plane, for prescription calculation purposes, and a limited set of standard contours for assessment of homogeneity of dose rate and for comparative purposes. A minimum number of subsidiary display planes may be used to characterise the implant when necessary, a lateral plane commonly proving useful. The position of the chosen sections may usefully be shown on the X-ray films. A method of orientating the display with respect to anatomical structures, by superimposing on the isodose display a frame-like projection whose sides relate to anatomical landmarks, is demonstrated. The future extension of the analysis to a fully automated approach is discussed. PMID:4070683
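
    The two reference-dose contour levels follow from simple arithmetic on the minimum central-plane dose rate; the example value below is invented.

        min_central_dose_rate = 0.45   # Gy/h, hypothetical minimum in the central plane

        reference_85 = 0.85 * min_central_dose_rate    # prescription reference contour
        reference_127 = 1.27 * min_central_dose_rate   # upper reference contour
        print(f"85% contour:  {reference_85:.3f} Gy/h")
        print(f"127% contour: {reference_127:.3f} Gy/h")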

  17. Computer based imaging and analysis of root gravitropism

    NASA Technical Reports Server (NTRS)

    Evans, M. L.; Ishikawa, H.

    1997-01-01

    Two key issues in studies of the nature of the gravitropic response in roots have been the determination of the precise pattern of differential elongation responsible for downward bending and the identification of the cells that show the initial motor response. The main approach for examining patterns of differential growth during root gravitropic curvature has been to apply markers to the root surface and photograph the root at regular intervals during gravitropic curvature. Although these studies have provided valuable information on the characteristics of the gravitropic motor response in roots, their labor intensive nature limits sample size and discourages both high frequency of sampling and depth of analysis of surface expansion data. In this brief review we describe the development of computer-based video analysis systems for automated measurement of root growth and shape change and discuss some key features of the root gravitropic response that have been revealed using this methodology. We summarize the capabilities of several new pieces of software designed to measure growth and shape changes in graviresponding roots and describe recent progress in developing analysis systems for studying the small, but experimentally popular, primary roots of Arabidopsis. A key finding revealed by such studies is that the initial gravitropic response of roots of maize and Arabidopsis occurs in the distal elongation zone (DEZ) near the root apical meristem, not in the main elongation zone. Another finding is that the initiation of rapid elongation in the DEZ following gravistimulation appears to be related to rapid membrane potential changes in this region of the root. These observations have provided the incentive for ongoing studies examining possible links between potential growth modifying factors (auxin, calcium, protons) and gravistimulated changes in membrane potential and growth patterns in the DEZ.

  18. Computational and Statistical Analysis of Protein Mass Spectrometry Data

    PubMed Central

    Noble, William Stafford; MacCoss, Michael J.

    2012-01-01

    High-throughput proteomics experiments involving tandem mass spectrometry produce large volumes of complex data that require sophisticated computational analyses. As such, the field offers many challenges for computational biologists. In this article, we briefly introduce some of the core computational and statistical problems in the field and then describe a variety of outstanding problems that readers of PLoS Computational Biology might be able to help solve. PMID:22291580

  19. Computer assisted analysis of medical x-ray images

    NASA Astrophysics Data System (ADS)

    Bengtsson, Ewert

    1996-01-01

    X-rays were originally used to expose film. The early computers did not have enough capacity to handle images with useful resolution. The rapid development of computer technology over the last few decades has, however, led to the introduction of computers into radiology. In this overview paper, the various possible roles of computers in radiology are examined. The state of the art is briefly presented, and some predictions about the future are made.

  20. Computational electromagnetic analysis of plasmonic effects in interdigital photodetectors

    NASA Astrophysics Data System (ADS)

    Hill, Avery M.; Nusir, Ahmad I.; Nguyen, Paul V.; Manasreh, Omar M.; Herzog, Joseph B.

    2014-09-01

    Plasmonic nanostructures have been shown to act as optical antennas that enhance optical devices. This study focuses on computational electromagnetic (CEM) analysis of GaAs photodetectors with gold interdigital electrodes. Experiments have shown that the photoresponse of the devices depends greatly on the electrode spacing and the polarization of the incident light: smaller electrode spacing and transverse polarization give rise to a larger photoresponse. This computational study simulates the optical properties of these devices to determine what plasmonic properties and optical enhancement they may have. The models solve Maxwell's equations with a finite element method (FEM) algorithm provided by the software COMSOL Multiphysics 4.4. The preliminary results gathered from the simulations follow the same trends seen in the experimental data: the spectral response increases as the electrode spacing decreases. The simulations also show that incident light with the electric field polarized transversely across the electrodes produces a larger photocurrent than longitudinally polarized light, a dependency similar to other plasmonic devices. The simulation results compare well with the experimental data. This work will also model enhancement effects in nanostructure devices with dimensions smaller than the current samples, to lead the way for future nanoscale devices. Understanding the potential effects of decreased spacing opens the door to a new set of devices on a smaller scale, potentially with a higher level of enhancement. In addition, precise modeling and understanding of the effects of the parameters provides avenues to optimize the enhancement of these structures, making more efficient photodetectors. Similar structures could also potentially be used for enhanced photovoltaics.

  1. Illumination system development using design and analysis of computer experiments

    NASA Astrophysics Data System (ADS)

    Keresztes, Janos C.; De Ketelaere, Bart; Audenaert, Jan; Koshel, R. J.; Saeys, Wouter

    2015-09-01

    Computer-assisted optimal illumination design is crucial when developing cost-effective machine vision systems. Standard local optimization methods, such as downhill simplex optimization (DHSO), often converge to a local minimum, so the solution is influenced by the starting point, especially when dealing with high-dimensional illumination designs or nonlinear merit spaces. This work presents a novel nonlinear optimization approach based on design and analysis of computer experiments (DACE). The methodology is first illustrated with a 2D case study of four light sources symmetrically positioned along a fixed arc in order to obtain optimal irradiance uniformity on a flat Lambertian reflecting target at the arc center. The first step consists of choosing angular positions with no overlap between sources using a fast, flexible space-filling design. Ray-tracing simulations are then performed at the design points, and a merit function is used for each configuration to quantify the homogeneity of the irradiance at the target. The homogeneities obtained at the design points are then used as input to a Gaussian process (GP), which provides a preliminary distribution for the expected merit space. Global optimization is then performed on the GP, which is more likely to provide optimal parameters. Next, the light positioning case study is further investigated by varying the radius of the arc and by adding two spots symmetrically positioned along an arc diametrically opposed to the first one. In terms of convergence, DACE reached the same 97% uniformity 6 times faster than the standard simplex method. The obtained results were successfully validated experimentally, with 10% relative error, using a short-wavelength infrared (SWIR) hyperspectral imager monitoring a Spectralon panel illuminated by tungsten halogen sources.
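
    A minimal sketch of the DACE loop: evaluate a merit function at a set of design points, fit a Gaussian process surrogate, and optimize the surrogate. The merit function here is a cheap stand-in for a ray-tracing run, and the design is random rather than a formal space-filling one.

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF

        def merit(angles):
            # Stand-in for irradiance non-uniformity from a ray-tracing run
            return np.sum(np.sin(angles) ** 2, axis=1)

        rng = np.random.default_rng(0)
        X = rng.uniform(0.0, np.pi, size=(30, 4))   # 30 design points, 4 source angles
        y = merit(X)

        gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5),
                                      normalize_y=True).fit(X, y)

        candidates = rng.uniform(0.0, np.pi, size=(10000, 4))  # dense surrogate search
        best = candidates[np.argmin(gp.predict(candidates))]
        print("surrogate optimum at angles:", np.round(best, 3))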

  2. Computational Analysis of the Hypothalamic Control of Food Intake

    PubMed Central

    Tabe-Bordbar, Shayan; Anastasio, Thomas J.

    2016-01-01

    Food-intake control is mediated by a heterogeneous network of different neural subtypes, distributed over various hypothalamic nuclei and other brain structures, in which each subtype can release more than one neurotransmitter or neurohormone. The complexity of the interactions of these subtypes poses a challenge to understanding their specific contributions to food-intake control, and apparent consistencies in the dataset can be contradicted by new findings. For example, the growing consensus that arcuate nucleus neurons expressing Agouti-related peptide (AgRP neurons) promote feeding, while those expressing pro-opiomelanocortin (POMC neurons) suppress feeding, is contradicted by findings that low AgRP neuron activity and high POMC neuron activity can be associated with high levels of food intake. Similarly, the growing consensus that GABAergic neurons in the lateral hypothalamus suppress feeding is contradicted by findings suggesting the opposite. Yet the complexity of the food-intake control network admits many different network behaviors. It is possible that anomalous associations between the responses of certain neural subtypes and feeding are actually consistent with known interactions, but their effect on feeding depends on the responses of the other neural subtypes in the network. We explored this possibility through computational analysis. We made a computer model of the interactions between the hypothalamic and other neural subtypes known to be involved in food-intake control, and optimized its parameters so that model behavior matched observed behavior over an extensive test battery. We then used specialized computational techniques to search the entire model state space, where each state represents a different configuration of the responses of the units (model neural subtypes) in the network. We found that the anomalous associations between the responses of certain hypothalamic neural subtypes and feeding are actually consistent with the known structure
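
    The state-space search idea can be sketched with a toy network: discretize each unit's response and enumerate every configuration against a consistency predicate. The units, levels, and constraint below are invented and far simpler than the published model.

        from itertools import product

        units = ["AgRP", "POMC", "LH_GABA"]     # invented mini-network
        levels = [0, 1, 2]                      # low / medium / high response

        def consistent(state):
            # Placeholder constraint: AgRP and POMC are rarely both maximal
            return not (state["AgRP"] == 2 and state["POMC"] == 2)

        states = [dict(zip(units, combo)) for combo in product(levels, repeat=len(units))]
        feasible = [s for s in states if consistent(s)]
        print(f"{len(feasible)} of {len(states)} configurations satisfy the constraints")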

  3. Educational Computer Trends in the CEMREL Region Analysis and Recommendations.

    ERIC Educational Resources Information Center

    Ohlman, Herbert

    Central Midwestern Regional Educational Laboratory, Inc. (CEMREL) analyzed educational computer trends in the CEMREL region. Between 1967-68 and 1968-69, there was roughly a 50% increase in electronic data processing (EDP) installations. Computer and remote terminal installations also increased 50%, but the proportion of computers under school…

  4. Linguistic Analysis of Natural Language Communication with Computers.

    ERIC Educational Resources Information Center

    Thompson, Bozena Henisz

    Interaction with computers in natural language requires a language that is flexible and suited to the task. This study of natural dialogue was undertaken to reveal those characteristics which can make computer English more natural. Experiments were made in three modes of communication: face-to-face, terminal-to-terminal, and human-to-computer,…

  5. Reliability of cephalometric analysis using manual and interactive computer methods.

    PubMed

    Davis, D N; Mackay, F

    1991-05-01

    This study compares the results of cephalometric analyses using manual and interactive computer graphics methods. Results are statistically in favour of the interactive computer system. This study provides a basis for ongoing research into alternative methods of cephalometric analyses, such as digitization and automatic landmark identification using sophisticated computer vision systems. PMID:1911687

  6. Pulmonary Toxicity in Stage III Non-Small Cell Lung Cancer Patients Treated With High-Dose (74 Gy) 3-Dimensional Conformal Thoracic Radiotherapy and Concurrent Chemotherapy Following Induction Chemotherapy: A Secondary Analysis of Cancer and Leukemia Group B (CALGB) Trial 30105

    SciTech Connect

    Salama, Joseph K.; Stinchcombe, Thomas E.; Gu Lin; Wang Xiaofei; Morano, Karen; Bogart, Jeffrey A.; Crawford, Jeffrey C.; Socinski, Mark A.; Blackstock, A. William; Vokes, Everett E.

    2011-11-15

    Purpose: Cancer and Leukemia Group B (CALGB) 30105 tested two different concurrent chemoradiotherapy platforms with high-dose (74 Gy) three-dimensional conformal radiotherapy (3D-CRT) after two cycles of induction chemotherapy for Stage IIIA/IIIB non-small cell lung cancer (NSCLC) patients to determine if either could achieve a primary endpoint of >18-month median survival. Final results of 30105 demonstrated that induction carboplatin and gemcitabine and concurrent gemcitabine 3D-CRT was not feasible because of treatment-related toxicity. However, induction and concurrent carboplatin/paclitaxel with 74 Gy 3D-CRT had a median survival of 24 months, and is the basis for the experimental arm in CALGB 30610/RTOG 0617/N0628. We conducted a secondary analysis of all patients to determine predictors of treatment-related pulmonary toxicity. Methods and Materials: Patient, tumor, and treatment-related variables were analyzed to determine their relation with treatment-related pulmonary toxicity. Results: Older age, higher N stage, larger planning target volume (PTV)1, smaller total lung volume/PTV1 ratio, larger V20, and larger mean lung dose were associated with increasing pulmonary toxicity on univariate analysis. Multivariate analysis confirmed that V20 and nodal stage as well as treatment with concurrent gemcitabine were associated with treatment-related toxicity. A high-risk group comprising patients with N3 disease and V20 >38% was associated with 80% of Grades 3-5 pulmonary toxicity cases. Conclusions: Elevated V20 and N3 disease status are important predictors of treatment related pulmonary toxicity in patients treated with high-dose 3D-CRT and concurrent chemotherapy. Further studies may use these metrics in considering patients for these treatments.
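
    The two dosimetric predictors highlighted above, V20 and mean lung dose, can be computed from a dose grid and a lung mask as in this sketch (random stand-in data):

        import numpy as np

        rng = np.random.default_rng(1)
        dose = rng.uniform(0.0, 74.0, size=(50, 50, 50))   # Gy, hypothetical dose grid
        lung = rng.random((50, 50, 50)) > 0.5              # hypothetical lung mask

        lung_dose = dose[lung]
        v20 = 100.0 * np.mean(lung_dose >= 20.0)   # % of lung volume receiving >= 20 Gy
        mld = lung_dose.mean()                     # mean lung dose (Gy)
        print(f"V20 = {v20:.1f}% (the study flags V20 > 38% with N3 disease as high risk)")
        print(f"mean lung dose = {mld:.1f} Gy")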

  7. Graphics Flutter Analysis Methods, an interactive computing system at Lockheed-California Company

    NASA Technical Reports Server (NTRS)

    Radovcich, N. A.

    1975-01-01

    An interactive computer graphics system, Graphics Flutter Analysis Methods (GFAM), was developed to complement FAMAS, a matrix-oriented batch computing system, and other computer programs in performing complex numerical calculations using a fully integrated data management system. GFAM has many of the matrix operation capabilities found in FAMAS, but on a smaller scale, and is utilized when the analysis requires a high degree of interaction between the engineer and computer, and schedule constraints exclude the use of batch entry programs. Applications of GFAM to a variety of preliminary design, development design, and project modification programs suggest that interactive flutter analysis using matrix representations is a feasible and cost effective computing tool.

  8. A computer program for the design and analysis of low-speed airfoils, supplement

    NASA Technical Reports Server (NTRS)

    Eppler, R.; Somers, D. M.

    1980-01-01

    Three new options were incorporated into an existing computer program for the design and analysis of low speed airfoils. These options permit the analysis of airfoils having variable chord (variable geometry), a boundary layer displacement iteration, and the analysis of the effect of single roughness elements. All three options are described in detail and are included in the FORTRAN IV computer program.

  9. 46 CFR 54.30-15 - Requirement for analysis and computation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    Title 46 (Shipping), Chapter I: Coast Guard, Department of Homeland Security (Continued); Marine Engineering; Pressure Vessels; Mechanical Stress Relief. § 54.30-15 Requirement for analysis and computation. (a) A stress analysis shall be performed to determine...

  10. 46 CFR 54.30-15 - Requirement for analysis and computation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    Title 46 (Shipping), Chapter I: Coast Guard, Department of Homeland Security (Continued); Marine Engineering; Pressure Vessels; Mechanical Stress Relief. § 54.30-15 Requirement for analysis and computation. (a) A stress analysis shall be performed to determine...

  11. Northwest Trajectory Analysis Capability: A Platform for Enhancing Computational Biophysics Analysis

    SciTech Connect

    Peterson, Elena S.; Stephan, Eric G.; Corrigan, Abigail L.; Lins, Roberto D.; Soares, Thereza A.; Scarberry, Randall E.; Rose, Stuart J.; Williams, Leigh K.; Lai, Canhai; Critchlow, Terence J.; Straatsma, TP

    2008-07-30

    As computational resources continue to increase, the ability of computational simulations to effectively complement, and in some cases replace, experimentation in scientific exploration also increases. Today, large-scale simulations are recognized as an effective tool for scientific exploration in many disciplines, including chemistry and biology. A natural side effect of this trend has been the need for an increasingly complex analytical environment. In this paper, we describe the Northwest Trajectory Analysis Capability (NTRAC), an analytical software suite developed to enhance the efficiency of computational biophysics analyses. Our strategy is to layer higher-level services and introduce improved tools within the user's familiar environment without preventing researchers from using traditional tools and methods. We share these experiences to serve as an example for effectively analyzing data-intensive, large-scale simulation data.

  12. Reliability analysis framework for computer-assisted medical decision systems

    SciTech Connect

    Habas, Piotr A.; Zurada, Jacek M.; Elmaghraby, Adel S.; Tourassi, Georgia D.

    2007-02-15

    We present a technique that enhances computer-assisted decision (CAD) systems with the ability to assess the reliability of each individual decision they make. Reliability assessment is achieved by measuring the accuracy of a CAD system with known cases similar to the one in question. The proposed technique analyzes the feature space neighborhood of the query case to dynamically select an input-dependent set of known cases relevant to the query. This set is used to assess the local (query-specific) accuracy of the CAD system. The estimated local accuracy is utilized as a reliability measure of the CAD response to the query case. The underlying hypothesis of the study is that CAD decisions with higher reliability are more accurate. The above hypothesis was tested using a mammographic database of 1337 regions of interest (ROIs) with biopsy-proven ground truth (681 with masses, 656 with normal parenchyma). Three types of decision models, (i) a back-propagation neural network (BPNN), (ii) a generalized regression neural network (GRNN), and (iii) a support vector machine (SVM), were developed to detect masses based on eight morphological features automatically extracted from each ROI. The performance of all decision models was evaluated using the Receiver Operating Characteristic (ROC) analysis. The study showed that the proposed reliability measure is a strong predictor of the CAD system's case-specific accuracy. Specifically, the ROC area index for CAD predictions with high reliability was significantly better than for those with low reliability values. This result was consistent across all decision models investigated in the study. The proposed case-specific reliability analysis technique could be used to alert the CAD user when an opinion that is unlikely to be reliable is offered. The technique can be easily deployed in the clinical environment because it is applicable with a wide range of classifiers regardless of their structure and it requires neither additional
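
    The core reliability idea, estimating local accuracy from the k nearest known cases, can be sketched as follows; the data, features, and classifier outputs are stand-ins, not the mammographic CAD system.

        import numpy as np
        from sklearn.neighbors import NearestNeighbors

        def local_reliability(query, X_known, y_known, y_pred_known, k=15):
            """Fraction of the query's k nearest known cases that the CAD got right."""
            nn = NearestNeighbors(n_neighbors=k).fit(X_known)
            _, idx = nn.kneighbors(query.reshape(1, -1))
            neighbors = idx[0]
            return float(np.mean(y_pred_known[neighbors] == y_known[neighbors]))

        # reliability = local_reliability(query_features, X, y_true, y_cad)  # hypothetical names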

  13. Computer-aided pulmonary image analysis in small animal models

    SciTech Connect

    Xu, Ziyue; Mansoor, Awais; Mollura, Daniel J.; Bagci, Ulas; Kramer-Marek, Gabriela; Luna, Brian; Kubler, Andre; Dey, Bappaditya; Jain, Sanjay; Foster, Brent; Papadakis, Georgios Z.; Camp, Jeremy V.; Jonsson, Colleen B.; Bishai, William R.; Udupa, Jayaram K.

    2015-07-15

    Purpose: To develop an automated pulmonary image analysis framework for infectious lung diseases in small animal models. Methods: The authors describe a novel pathological lung and airway segmentation method for small animals. The proposed framework includes identification of abnormal imaging patterns pertaining to infectious lung diseases. First, the authors' system estimates an expected lung volume by utilizing a regression function between total lung capacity and approximated rib cage volume. A significant difference between the expected lung volume and the initial lung segmentation indicates the presence of severe pathology and then invokes a machine-learning-based abnormal imaging pattern detection system. The final stage of the proposed framework is the automatic extraction of the airway tree, for which new affinity relationships within the fuzzy connectedness image segmentation framework are proposed by combining Hessian and gray-scale morphological reconstruction filters. Results: 133 CT scans were collected from four different studies encompassing a wide spectrum of pulmonary abnormalities pertaining to two commonly used small animal models (ferret and rabbit). Sensitivity and specificity were greater than 90% for pathological lung segmentation (average Dice similarity coefficient > 0.9). While qualitative visual assessments of airway tree extraction were performed by the participating expert radiologists, for quantitative evaluation the authors validated the proposed airway extraction method using the publicly available EXACT'09 data set. Conclusions: The authors developed a comprehensive computer-aided pulmonary image analysis framework for preclinical research applications. The proposed framework consists of automatic pathological lung segmentation and accurate airway tree extraction. The framework has high sensitivity and specificity; therefore, it can contribute advances in preclinical research in pulmonary diseases.
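
    The first step of the framework, flagging severe pathology when the segmented lung volume falls well below the volume expected from rib cage size, can be sketched with a simple linear regression (all volumes invented):

        import numpy as np

        # Toy training data: rib cage volume vs. total lung capacity (normal scans)
        rib_cage = np.array([120.0, 135.0, 150.0, 160.0, 175.0, 190.0])  # arbitrary units
        lung_cap = np.array([38.0, 43.0, 48.0, 51.0, 56.0, 61.0])

        slope, intercept = np.polyfit(rib_cage, lung_cap, 1)   # linear regression

        def severe_pathology(rib_volume, segmented_lung_volume, tolerance=0.25):
            """Flag a scan whose segmented lung volume is far below the expected value."""
            expected = slope * rib_volume + intercept
            return (expected - segmented_lung_volume) / expected > tolerance

        print(severe_pathology(155.0, 30.0))   # True -> invoke pattern detection stage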

  14. MMA, A Computer Code for Multi-Model Analysis

    SciTech Connect

    Eileen P. Poeter and Mary C. Hill

    2007-08-20

    This report documents the Multi-Model Analysis (MMA) computer code. MMA can be used to evaluate results from alternative models of a single system using the same set of observations for all models. As long as the observations, the observation weighting, and the system being represented are the same, the models can differ in nearly any way imaginable. For example, they may include different processes, different simulation software, different temporal definitions (for example, steady-state and transient models could be considered), and so on. The multiple models need to be calibrated by nonlinear regression, and calibration of the individual models needs to be completed before application of MMA. MMA can be used to rank models and calculate posterior model probabilities. These can be used to (1) determine the relative importance of the characteristics embodied in the alternative models, (2) calculate model-averaged parameter estimates and predictions, and (3) quantify the uncertainty of parameter estimates and predictions in a way that integrates the variations represented by the alternative models. There is a lack of consensus on which model analysis methods are best, so MMA provides four default methods. Two are based on Kullback-Leibler information and use the AIC (Akaike Information Criterion) or AICc (second-order-bias-corrected AIC) model discrimination criteria. The other two default methods are the BIC (Bayesian Information Criterion) and the KIC (Kashyap Information Criterion) model discrimination criteria. Use of the KIC criterion is equivalent to using the maximum-likelihood Bayesian model averaging (MLBMA) method. AIC, AICc, and BIC can be derived from Frequentist or Bayesian arguments. The default methods based on Kullback-Leibler information have a number of theoretical advantages, including that, as more data become available, they tend to favor more complicated models than do the other methods, which makes sense in many situations.
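
    Three of the four default criteria (AIC, AICc, BIC) and the resulting model weights are easy to sketch; KIC requires a Fisher information term and is omitted here. The fit summaries below are hypothetical.

        import numpy as np

        ln_L = np.array([-120.3, -118.9, -117.5])   # maximized log-likelihoods (hypothetical)
        k = np.array([4, 6, 9])                     # numbers of estimated parameters
        n = 50                                      # number of observations

        aic = 2 * k - 2 * ln_L
        aicc = aic + 2 * k * (k + 1) / (n - k - 1)  # second-order bias correction
        bic = k * np.log(n) - 2 * ln_L

        delta = aic - aic.min()
        weights = np.exp(-0.5 * delta) / np.exp(-0.5 * delta).sum()  # model probabilities
        for i in range(len(k)):
            print(f"model {i}: AIC={aic[i]:.1f} AICc={aicc[i]:.1f} "
                  f"BIC={bic[i]:.1f} weight={weights[i]:.2f}")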

  15. Computer-aided pulmonary image analysis in small animal models

    PubMed Central

    Xu, Ziyue; Bagci, Ulas; Mansoor, Awais; Kramer-Marek, Gabriela; Luna, Brian; Kubler, Andre; Dey, Bappaditya; Foster, Brent; Papadakis, Georgios Z.; Camp, Jeremy V.; Jonsson, Colleen B.; Bishai, William R.; Jain, Sanjay; Udupa, Jayaram K.; Mollura, Daniel J.

    2015-01-01

    Purpose: To develop an automated pulmonary image analysis framework for infectious lung diseases in small animal models. Methods: The authors describe a novel pathological lung and airway segmentation method for small animals. The proposed framework includes identification of abnormal imaging patterns pertaining to infectious lung diseases. First, the authors' system estimates an expected lung volume by utilizing a regression function between total lung capacity and approximated rib cage volume. A significant difference between the expected lung volume and the initial lung segmentation indicates the presence of severe pathology and then invokes a machine-learning-based abnormal imaging pattern detection system. The final stage of the proposed framework is the automatic extraction of the airway tree, for which new affinity relationships within the fuzzy connectedness image segmentation framework are proposed by combining Hessian and gray-scale morphological reconstruction filters. Results: 133 CT scans were collected from four different studies encompassing a wide spectrum of pulmonary abnormalities pertaining to two commonly used small animal models (ferret and rabbit). Sensitivity and specificity were greater than 90% for pathological lung segmentation (average Dice similarity coefficient > 0.9). While qualitative visual assessments of airway tree extraction were performed by the participating expert radiologists, for quantitative evaluation the authors validated the proposed airway extraction method using the publicly available EXACT'09 data set. Conclusions: The authors developed a comprehensive computer-aided pulmonary image analysis framework for preclinical research applications. The proposed framework consists of automatic pathological lung segmentation and accurate airway tree extraction. The framework has high sensitivity and specificity; therefore, it can contribute advances in preclinical research in pulmonary diseases. PMID:26133591

  16. Water uptake by a maize root system - An explicit numerical 3-dimensional simulation.

    NASA Astrophysics Data System (ADS)

    Leitner, Daniel; Schnepf, Andrea; Klepsch, Sabine; Roose, Tiina

    2010-05-01

    Water is one of the most important resources for plant growth and function. Accurate modelling of unsaturated flow is essential not only for predicting water uptake but also for describing nutrient movement, which depends on water saturation and transport. In this work we present a model for water uptake. The model includes the simultaneous flow of water inside the soil and inside the root network. Water saturation in the soil volume is described by the Richards equation, and water flow inside the roots' xylem is calculated using the Poiseuille law for water flow in a cylindrical tube. Water saturation in the soil as well as water uptake of the root system is calculated numerically in three dimensions. We study water uptake of a maize plant in a confined pot under different supply scenarios. The main improvement of our approach is that the root surfaces act as spatial boundaries of the soil volume, so water influx into the root is described by a surface flux instead of a volume flux, which is commonly given by an effective sink term. For the numerical computation we use the following software: the 3-dimensional maize root architecture is created by a root growth model based on L-Systems (Leitner et al 2009); a mesh of the surrounding soil volume is created using the meshing software DistMesh (Persson & Strang 2004); and on this mesh the partial differential equations are solved with the finite element method using Comsol Multiphysics 3.5a. Modelling results are related to accepted water uptake models from the literature (Clausnitzer & Hopmans 1994, Roose & Fowler 2004, Javaux et al 2007). This new approach has several advantages. By considering the individual roots it is possible to analyse the influence of overlapping depletion zones due to inter-root competition. Furthermore, such simulations can be used to estimate the influence of simplifying assumptions made in the development of effective models. The model can be easily combined with a nutrient
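
    For reference, the two governing equations named above can be written in standard form (notation assumed here, not copied from the paper):

        \frac{\partial \theta(h)}{\partial t} = \nabla \cdot \left[ K(h)\, \nabla (h + z) \right]   \quad \text{(Richards equation, mixed form)}
        Q = -\frac{\pi r^{4}}{8\mu}\,\frac{\partial p}{\partial l}   \quad \text{(Poiseuille flow in a xylem vessel)}

    where \theta is the volumetric water content, h the pressure head, K(h) the unsaturated hydraulic conductivity, z the elevation, Q the volumetric flow rate, r the vessel radius, \mu the dynamic viscosity, and p the pressure.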

  17. A 3-Dimensional Absorbed Dose Calculation Method Based on Quantitative SPECT for Radionuclide Therapy: Evaluation for 131I Using Monte Carlo Simulation

    PubMed Central

    Ljungberg, Michael; Sjögreen, Katarina; Liu, Xiaowei; Frey, Eric; Dewaraja, Yuni; Strand, Sven-Erik

    2009-01-01

    A general method is presented for patient-specific 3-dimensional absorbed dose calculations based on quantitative SPECT activity measurements. Methods: The computational scheme includes a method for registration of the CT image to the SPECT image and position-dependent compensation for attenuation, scatter, and collimator detector response performed as part of an iterative reconstruction method. A method for conversion of the measured activity distribution to a 3-dimensional absorbed dose distribution, based on the EGS4 (electron-gamma shower, version 4) Monte Carlo code, is also included. The accuracy of the activity quantification and the absorbed dose calculation is evaluated on the basis of realistic Monte Carlo-simulated SPECT data, using the SIMIND (simulation of imaging nuclear detectors) program and a voxel-based computer phantom. CT images are obtained from the computer phantom, and realistic patient movements are added relative to the SPECT image. The SPECT-based activity concentration and absorbed dose distributions are compared with the true ones. Results: Correction could be made for object scatter, photon attenuation, and scatter penetration in the collimator. However, inaccuracies were imposed by the limited spatial resolution of the SPECT system, for which the collimator response correction did not fully compensate. Conclusion: The presented method includes compensation for most parameters degrading the quantitative image information. The compensation methods are based on physical models and therefore are generally applicable to other radionuclides. The proposed evaluation methodology may be used as a basis for future intercomparison of different methods. PMID:12163637
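
    The dose-conversion step can be caricatured as a convolution of the reconstructed activity map with a voxel dose kernel. The paper's kernel comes from EGS4 Monte Carlo simulation; the Gaussian kernel and scale factor below are toy stand-ins.

        import numpy as np
        from scipy.ndimage import gaussian_filter

        activity = np.zeros((64, 64, 64))      # reconstructed activity map (stand-in)
        activity[30:34, 30:34, 30:34] = 5.0    # hypothetical uptake region (arbitrary units)

        S = 2.4e-4   # assumed dose scale factor (Gy per unit activity), illustrative only
        # The Gaussian spread stands in for the Monte Carlo energy-deposition kernel:
        dose = S * gaussian_filter(activity, sigma=1.5)
        print(f"peak absorbed dose: {dose.max():.2e} Gy (toy numbers)")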

  18. Recent advances in computational structural reliability analysis methods

    NASA Technical Reports Server (NTRS)

    Thacker, Ben H.; Wu, Y.-T.; Millwater, Harry R.; Torng, Tony Y.; Riha, David S.

    1993-01-01

    The goal of structural reliability analysis is to determine the probability that the structure will adequately perform its intended function when operating under the given environmental conditions. Thus, the notion of reliability admits the possibility of failure. Given the fact that many different modes of failure are usually possible, achievement of this goal is a formidable task, especially for large, complex structural systems. The traditional (deterministic) design methodology attempts to assure reliability by the application of safety factors and conservative assumptions. However, the safety factor approach lacks a quantitative basis, in that the level of reliability is never known, and usually results in overly conservative designs because of compounding conservatisms. Furthermore, the problem parameters that control the reliability are not identified, nor is their importance evaluated. A summary of recent advances in computational structural reliability assessment is presented. A significant level of activity was seen recently in the research and development community, much of it directed towards the prediction of failure probabilities for single-mode failures. The focus is to present some early results and demonstrations of advanced reliability methods applied to structural system problems. This includes structures that can fail as a result of multiple component failures (e.g., a redundant truss) and structural components that may fail due to multiple interacting failure modes (e.g., excessive deflection, resonant vibration, or creep rupture). From these results, some observations and recommendations are made with regard to future research needs.

  19. Foam computer model helps in analysis of underbalanced drilling

    SciTech Connect

    Liu, G.; Medley, G.H. Jr.

    1996-07-01

    A new mechanistic model attempts to overcome many of the problems associated with existing foam flow analyses. The model calculates varying Fanning friction factors along the flow path, rather than assuming constant factors. Foam generated by mixing gas and liquid for underbalanced drilling has unique rheological characteristics, making it very difficult to accurately predict the pressure profile. A user-friendly personal-computer program was developed to solve the mechanical energy balance equation for compressible foam flow. The program takes into account influxes of gas, liquid, and oil from formations. The pressure profile, foam quality, density, and cuttings transport are predicted by the model. A sensitivity analysis window allows the user to quickly optimize the hydraulics program by selecting the best combination of injection pressure, back pressure, and gas/liquid injection rates. The model handles inclined and horizontal well bores and provides handy engineering and design tools for underbalanced drilling, well bore cleanout, and other foam operations. The paper describes rheological models, foam flow equations, equations of state, mechanical energy equations, pressure drop across nozzles, influx modeling, and program operation, and compares the results with other models, lab data, and field data.
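
    The kind of marching calculation such a model performs can be sketched as a step-by-step integration of hydrostatic and Fanning friction terms, recomputing the friction factor at each step. The property correlations below are placeholders, not the paper's foam rheology.

        g, D = 9.81, 0.1          # gravity (m/s^2), pipe inner diameter (m)
        p = 1.0e6                 # injection pressure at surface (Pa)
        dz = 10.0                 # vertical step length (m)

        for step in range(100):   # march 1000 m down the drill string
            rho = 150.0 + 3.0e-4 * p      # placeholder foam density vs. pressure (kg/m^3)
            v = 4.0                       # placeholder mixture velocity (m/s)
            Re = rho * v * D / 0.05       # placeholder effective viscosity of 0.05 Pa*s
            # Fanning friction factor: laminar (16/Re) or Blasius turbulent correlation
            f = 16.0 / Re if Re < 2100.0 else 0.0791 / Re ** 0.25
            dp_friction = 2.0 * f * rho * v ** 2 / D * dz
            dp_hydrostatic = rho * g * dz
            p += dp_hydrostatic - dp_friction  # gravity adds, friction removes, going down
        print(f"bottomhole pressure = {p / 1e5:.1f} bar (toy correlations only)")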

  20. Design of airborne wind turbine and computational fluid dynamics analysis

    NASA Astrophysics Data System (ADS)

    Anbreen, Faiqa

    Wind energy is a promising alternative to depleting non-renewable sources. The height of conventional wind turbines constrains their efficiency. An airborne wind turbine can reach much higher altitudes and produce higher power due to the high wind velocity and energy density there. The focus of this thesis is to design a shrouded airborne wind turbine capable of generating 70 kW to propel a leisure boat with a capacity of 8-10 passengers. The idea of designing an airborne turbine is to take advantage of the higher velocities in the atmosphere. The SolidWorks model has been analyzed numerically using the Computational Fluid Dynamics (CFD) software StarCCM+. The Unsteady Reynolds-Averaged Navier-Stokes (URANS) approach with the k-epsilon turbulence model was selected to study the physical properties of the flow, with emphasis on the performance of the turbine and the increase in air velocity at the throat. The analysis has been done using two ambient velocities, 12 m/s and 6 m/s. At a 12 m/s inlet velocity, the velocity of air at the turbine was 16 m/s and the power generated by the turbine was 61 kW. At an inlet velocity of 6 m/s, the velocity of air at the turbine increased to 10 m/s and the power generated was 25 kW.
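
    As a back-of-envelope check on the reported figures, the standard actuator-disk relation P = 0.5 * rho * A * Cp * v^3 can be applied at the throat velocities given above; the swept area and power coefficient are assumed values, not taken from the thesis.

        rho = 1.225   # air density near sea level (kg/m^3)
        A = 60.0      # assumed swept area (m^2), not from the thesis
        Cp = 0.40     # assumed power coefficient, not from the thesis

        for v in (16.0, 10.0):   # throat velocities reported in the abstract (m/s)
            P = 0.5 * rho * A * Cp * v ** 3
            print(f"v = {v:4.1f} m/s -> P = {P / 1e3:.0f} kW")
        # With these assumptions the 16 m/s case lands near the reported 61 kW; the
        # 10 m/s case does not scale to the reported 25 kW by the cube law alone,
        # since the operating point and shroud augmentation differ between cases.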