Science.gov

Sample records for 3-dimensional computational analysis

  1. Evaluation of the anterior mandibular donor site one year after secondary reconstruction of an alveolar cleft: 3-dimensional analysis using cone-beam computed tomography.

    PubMed

    van Bilsen, M W T; Schreurs, R; Meulstee, J W; Kuijpers, M A R; Meijer, G J; Borstlap, W A; Bergé, S J; Maal, T J J

    2015-10-01

    The aim of this study was to analyse changes in the volume of the chin after harvest of a bone graft for secondary reconstruction of an alveolar cleft. Cone-beam computed tomographic (CT) scans of 27 patients taken preoperatively, and immediately and one year postoperatively, were analysed, and 3-dimensional hard-tissue reconstructions made. The hard-tissue segmentation of the scan taken one year postoperatively was subtracted from the segmentation of the preoperative scan to calculate the alteration in the volume of bone at the donor site (chin). A centrally-orientated persistent concavity at the buccal side of the chin was found (mean (range) 160 (0-500) mm³). At the lingual side of the chin, a central concavity remained (mean (range) volume 20 (0-80) mm³). Remarkably, at the periphery of this concavity there was overgrowth of new bone (mean (range) volume 350 (0-1600) mm³). Re-attachment of the muscles of the tongue resulted in a significantly larger central lingual defect one year postoperatively (p=0.01). Overall, we measured only minor alterations in the volume of the chin at one year. Whether these alterations influence facial appearance and long-term bone quality will be the subject of further research.
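
    The donor-site analysis above reduces to voxel bookkeeping on two co-registered binary segmentations: voxels that were bone preoperatively but not at one year are resorption, and the reverse are new bone. Below is a minimal Python sketch of that subtraction step, assuming the scans are already registered and segmented; the function name, voxel size, and toy data are illustrative, not taken from the study.

```python
import numpy as np

def volume_change_mm3(seg_pre, seg_post, voxel_mm=(0.3, 0.3, 0.3)):
    """Bone volume resorbed and apposed between two co-registered binary
    segmentations (True = bone), in cubic millimetres."""
    voxel_vol = float(np.prod(voxel_mm))            # volume of one voxel, mm^3
    resorbed = np.logical_and(seg_pre, ~seg_post)   # bone before, absent after
    apposed = np.logical_and(~seg_pre, seg_post)    # new bone at one year
    return resorbed.sum() * voxel_vol, apposed.sum() * voxel_vol

# toy example: a block of "bone" that loses a 3x3x3-voxel corner
pre = np.ones((10, 10, 10), dtype=bool)
post = pre.copy()
post[:3, :3, :3] = False
lost, gained = volume_change_mm3(pre, post)
print(f"resorbed: {lost:.2f} mm^3, apposed: {gained:.2f} mm^3")
```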

  2. Comparative Validity and Reproducibility Study of Various Landmark-Oriented Reference Planes in 3-Dimensional Computed Tomographic Analysis for Patients Receiving Orthognathic Surgery

    PubMed Central

    Lin, Hsiu-Hsia; Chuang, Ya-Fang; Weng, Jing-Ling; Lo, Lun-Jou

    2015-01-01

    Background Three-dimensional computed tomographic imaging has become popular in clinical evaluation, treatment planning, surgical simulation, and outcome assessment for maxillofacial intervention. The purposes of this study were to investigate whether there is any correlation among landmark-based horizontal reference planes and to validate the reproducibility and reliability of landmark identification. Materials and Methods Preoperative and postoperative cone-beam computed tomographic images of patients who had undergone orthognathic surgery were collected. Landmark-oriented reference planes including the Frankfort horizontal plane (FHP) and the lateral semicircular canal plane (LSP) were established. Four FHPs were defined by selecting 3 points from the orbitale, porion, or midpoint of paired points. The LSP passed through both the lateral semicircular canal points and nasion. The distances between the maxillary or mandibular teeth and the reference planes were measured, and the differences between the 2 sides were calculated and compared. The precision in locating the landmarks was evaluated by performing repeated tests, and the intraobserver reproducibility and interobserver reliability were assessed. Results A total of 30 patients with facial deformity and malocclusion—10 patients with facial symmetry, 10 patients with facial asymmetry, and 10 patients with cleft lip and palate—were recruited. Comparing the differences among the 5 reference planes showed no statistically significant difference among all patient groups. Regarding intraobserver reproducibility, the mean differences in the 3 coordinates varied from 0 to 0.35 mm, with correlation coefficients between 0.96 and 1.0, showing high correlation between repeated tests. Regarding interobserver reliability, the mean differences among the 3 coordinates varied from 0 to 0.47 mm, with correlation coefficients between 0.88 and 1.0, exhibiting high correlation between the different examiners. Conclusions The

  3. Particle trajectory computation on a 3-dimensional engine inlet. Final Report Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Kim, J. J.

    1986-01-01

    A 3-dimensional particle trajectory computer code was developed to compute the distribution of water droplet impingement efficiency on a 3-dimensional engine inlet. The computed results provide the essential droplet impingement data required for the engine inlet anti-icing system design and analysis. The droplet trajectories are obtained by solving the trajectory equation using the fourth order Runge-Kutta and Adams predictor-corrector schemes. A compressible 3-D full potential flow code is employed to obtain a cylindrical grid definition of the flowfield on and about the engine inlet. The inlet surface is defined mathematically through a system of bi-cubic parametric patches in order to compute the droplet impingement points accurately. Analysis results of the 3-D trajectory code obtained for an axisymmetric droplet impingement problem are in good agreement with NACA experimental data. Experimental data are not yet available for the engine inlet impingement problem analyzed. Applicability of the method to solid particle impingement problems, such as engine sand ingestion, is also demonstrated.
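
    The fourth-order Runge-Kutta integration named above is the core of such a trajectory code. The following is a hedged Python sketch of a droplet advancing through a prescribed flowfield under Stokes-like drag; the uniform toy flowfield and the relaxation time tau are stand-ins for the full-potential flow solution and droplet physics of the actual code.

```python
import numpy as np

def rk4_step(f, t, y, h):
    """One classical fourth-order Runge-Kutta step for y' = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

def droplet_rhs(t, state, air_velocity, tau=1e-3):
    """State = (position, velocity). Drag relaxes the droplet toward the
    local air velocity with time constant tau (Stokes-like drag)."""
    pos, vel = state[:3], state[3:]
    return np.concatenate([vel, (air_velocity(pos) - vel) / tau])

# toy flowfield: uniform 80 m/s axial flow standing in for the inlet solution
air = lambda p: np.array([80.0, 0.0, 0.0])
state = np.array([0.0, 0.1, 0.0, 80.0, 0.0, 0.0])   # seed position and velocity
for _ in range(100):
    state = rk4_step(lambda t, y: droplet_rhs(t, y, air), 0.0, state, 1e-4)
print("droplet position after 10 ms:", state[:3])
```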

  4. Unification of color postprocessing techniques for 3-dimensional computational mechanics

    NASA Technical Reports Server (NTRS)

    Bailey, Bruce Charles

    1985-01-01

    To facilitate the understanding of complex three-dimensional numerical models, advanced interactive color postprocessing techniques are introduced. These techniques are sufficiently flexible so that postprocessing difficulties arising from model size, geometric complexity, response variation, and analysis type can be adequately overcome. Finite element, finite difference, and boundary element models may be evaluated with the prototype postprocessor. Elements may be removed from parent models to be studied as independent subobjects. Discontinuous responses may be contoured including responses which become singular, and nonlinear color scales may be input by the user for the enhancement of the contouring operation. Hit testing can be performed to extract precise geometric, response, mesh, or material information from the database. In addition, stress intensity factors may be contoured along the crack front of a fracture model. Stepwise analyses can be studied, and the user can recontour responses repeatedly, as if he were paging through the response sets. As a system, these tools allow effective interpretation of complex analysis results.

  5. A 3-dimensional Analysis of the Cassiopeia A Supernova Remnant

    NASA Astrophysics Data System (ADS)

    Isensee, Karl

    We present a multi-wavelength study of the nearby supernova remnant Cassiopeia A (Cas A). Easily resolvable supernova remnants such as Cas A provide a unique opportunity to test supernova explosion models. Additionally, we can observe key processes in the interstellar medium as the ejecta from the initial explosion encounter Cas A's powerful shocks. In order to accomplish these science goals, we used the Spitzer Space Telescope's Infrared Spectrograph to create a high resolution spectral map of select regions of Cas A, allowing us to make a Doppler reconstruction of its 3-dimensional structure. In the center of the remnant, we find relatively pristine ejecta that have not yet reached Cas A's reverse shock or interacted with the circumstellar environment. We observe O, Si, and S emission. These ejecta can form both sheet-like structures as well as filaments. Si and O, which come from different nucleosynthetic layers of the star, are observed to be coincident in some regions, and separated by >500 km s⁻¹ in others. Observed ejecta traveling toward us are, on average, ~800 km s⁻¹ slower than the material traveling away from us. We compare our observations to recent supernova explosion models and find that no single model can simultaneously reproduce all the observed features. However, models of different supernova explosions can collectively produce the observed geometries and structures of the emission interior to Cas A's reverse shock. We use the results from the models to address the conditions during the supernova explosion, concentrating on asymmetries in the shock structure. We also predict that the back surface of Cassiopeia A will begin brightening in ~30 years, and the front surface in ~100 years. We then used similar observations from 3 regions on Cas A's reverse shock in order to create more 3-dimensional maps. In these regions, we observe supernova ejecta both immediately before and during the shock-ejecta interaction. We determine that the
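
    A Doppler reconstruction of this kind converts each measured line shift into a line-of-sight velocity and, assuming roughly free (homologous) expansion, into a depth. A minimal sketch follows; the [Ar II] 6.985 μm rest wavelength and the ~340-year age are illustrative values, not parameters quoted by this work.

```python
C_KMS = 299792.458  # speed of light, km/s

def los_velocity(lambda_obs, lambda_rest):
    """Non-relativistic Doppler velocity along the line of sight, km/s
    (positive = receding)."""
    return C_KMS * (lambda_obs - lambda_rest) / lambda_rest

def doppler_depth_km(v_los_kms, age_yr=340.0):
    """Under free (homologous) expansion, line-of-sight distance from the
    expansion center is simply velocity times age."""
    return v_los_kms * age_yr * 3.156e7   # 3.156e7 seconds per year

# an [Ar II] 6.985-um line observed slightly blueshifted
v = los_velocity(6.966, 6.985)
print(f"v_los = {v:.0f} km/s, depth = {doppler_depth_km(v):.2e} km")
```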

  6. The Effectiveness of an Interactive 3-Dimensional Computer Graphics Model for Medical Education

    PubMed Central

    Konishi, Takeshi; Tamura, Yoko; Moriguchi, Hiroki

    2012-01-01

    Background Medical students often have difficulty achieving a conceptual understanding of 3-dimensional (3D) anatomy, such as bone alignment, muscles, and complex movements, from 2-dimensional (2D) images. To this end, animated and interactive 3-dimensional computer graphics (3DCG) can provide better visual information to users. In medical fields, research on the advantages of 3DCG in medical education is relatively new. Objective To determine the educational effectiveness of interactive 3DCG. Methods We divided 100 participants (27 men, mean (SD) age 17.9 (0.6) years, and 73 women, mean (SD) age 18.1 (1.1) years) from the Health Sciences University of Mongolia (HSUM) into 3DCG (n = 50) and textbook-only (control) (n = 50) groups. The control group used a textbook and 2D images, while the 3DCG group was trained to use the interactive 3DCG shoulder model in addition to a textbook. We conducted a questionnaire survey via an encrypted satellite network between HSUM and Tokushima University. The questionnaire was scored on a 5-point Likert scale from strongly disagree (score 1) to strongly agree (score 5). Results Interactive 3DCG was effective in undergraduate medical education. Specifically, there was a significant difference in mean (SD) scores between the 3DCG and control groups in their response to questionnaire items regarding content (4.26 (0.69) vs 3.85 (0.68), P = .001) and teaching methods (4.33 (0.65) vs 3.74 (0.79), P < .001), but no significant difference in the Web category. Participants also provided meaningful comments on the advantages of interactive 3DCG. Conclusions Interactive 3DCG materials have positive effects on medical education when properly integrated into conventional education. In particular, our results suggest that interactive 3DCG is more efficient than textbooks alone in medical education and can motivate students to understand complex anatomical structures. PMID:23611759

  7. Accuracy and reliability of linear measurements using 3-dimensional computed tomographic imaging software for Le Fort I Osteotomy.

    PubMed

    Gaia, Bruno Felipe; Pinheiro, Lucas Rodrigues; Umetsubo, Otávio Shoite; Santos, Oseas; Costa, Felipe Ferreira; Cavalcanti, Marcelo Gusmão Paraíso

    2014-03-01

    Our purpose was to compare the accuracy and reliability of linear measurements for Le Fort I osteotomy using volume rendering software. We studied 11 dried skulls and used cone-beam computed tomography (CT) to generate 3-dimensional images. Linear measurements were based on craniometric anatomical landmarks that were predefined as specifically used for Le Fort I osteotomy, and identified twice each by 2 radiologists, independently, using Dolphin imaging version 11.5.04.35. A third examiner then made physical measurements using digital calipers. There was a significant difference between Dolphin imaging and the gold standard, particularly in the pterygoid process. The largest difference was 1.85 mm (LLpPtg L). The mean differences between the physical and the 3-dimensional linear measurements ranged from -0.01 to 1.12 mm for examiner 1, and 0 to 1.85 mm for examiner 2. Interexaminer correlation coefficients ranged from 0.51 to 0.93. Intraexaminer correlation coefficients ranged from 0.81 to 0.96 and 0.57 to 0.92, for examiners 1 and 2, respectively. We conclude that Dolphin imaging should be used sparingly during Le Fort I osteotomy.

  8. Computation of transonic potential flow about 3 dimensional inlets, ducts, and bodies

    NASA Technical Reports Server (NTRS)

    Reyhner, T. A.

    1982-01-01

    An analysis was developed and a computer code, P465 Version A, written for the prediction of transonic potential flow about three dimensional objects including inlet, duct, and body geometries. Finite differences and line relaxation are used to solve the complete potential flow equation. The coordinate system used for the calculations is independent of body geometry. Cylindrical coordinates are used for the computer code. The analysis is programmed in extended FORTRAN 4 for the CYBER 203 vector computer. The programming of the analysis is oriented toward taking advantage of the vector processing capabilities of this computer. Comparisons of computed results with experimental measurements are presented to verify the analysis. Descriptions of program input and output formats are also presented.

  9. 3-Dimensional analysis for class III malocclusion patients with facial asymmetry

    PubMed Central

    Ki, Eun-Jung; Cheon, Hae-Myung; Choi, Eun-Joo; Kwon, Kyung-Hwan

    2013-01-01

    Objectives The aim of this study is to investigate the correlation between 2-dimensional (2D) cephalometric measurements and 3-dimensional (3D) cone beam computed tomography (CBCT) measurements, and to evaluate the usefulness of 3D analysis for patients with asymmetry. Materials and Methods A total of 27 patients were evaluated for facial asymmetry by photographs, cephalometric radiographs, and CBCT. Fourteen measurement values were evaluated, and the 2D and 3D values were compared. The patients were classified into two groups. Patients in group 1 had a symmetric middle third and an asymmetric lower third of the face, and those in group 2 had asymmetry of both the middle and lower thirds of the face. Results In group 1, significant differences were observed in 9 of the 14 values: three from anteroposterior cephalometric radiographs (cant and both body heights) and six from lateral cephalometric radiographs (both ramus lengths, both lateral ramal inclinations, and both gonial angles). In group 2, comparison between 2D and 3D showed significant differences in 10 of the 14 values: four from anteroposterior cephalometric radiographs (both maxillary heights, both body heights) and six from lateral cephalometric radiographs (both ramus lengths, both lateral ramal inclinations, and both gonial angles). Conclusion Information from 2D analysis was inaccurate in several measurements. Therefore, in patients with asymmetry, 3D analysis is useful for diagnosis. PMID:24471038

  10. Automated image analysis reveals the dynamic 3-dimensional organization of multi-ciliary arrays

    PubMed Central

    Galati, Domenico F.; Abuin, David S.; Tauber, Gabriel A.; Pham, Andrew T.; Pearson, Chad G.

    2016-01-01

    Multi-ciliated cells (MCCs) use polarized fields of undulating cilia (ciliary array) to produce fluid flow that is essential for many biological processes. Cilia are positioned by microtubule scaffolds called basal bodies (BBs) that are arranged within a spatially complex 3-dimensional (3D) geometry. Here, we develop a robust and automated computational image analysis routine to quantify 3D BB organization in the ciliate Tetrahymena thermophila. Using this routine, we generate the first morphologically constrained 3D reconstructions of Tetrahymena cells and elucidate rules that govern the kinetics of MCC organization. We demonstrate the interplay between BB duplication and cell size expansion through the cell cycle. In mutant cells, we identify a potential BB surveillance mechanism that balances large gaps in BB spacing by increasing the frequency of closely spaced BBs in other regions of the cell. Finally, by taking advantage of a mutant predisposed to BB disorganization, we locate the spatial domains that are most prone to disorganization by environmental stimuli. Collectively, our analyses reveal the importance of quantitative image analysis to understand the principles that guide the 3D organization of MCCs. PMID:26700722
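
    A basic building block of such a routine is quantifying BB spacing, for example as nearest-neighbour distances in the 3D point set of BB centroids. A short sketch using scipy follows; the random point cloud and the "2x median" gap threshold are invented for illustration and are not the paper's actual criteria.

```python
import numpy as np
from scipy.spatial import cKDTree

def nn_spacing(points):
    """Nearest-neighbour distance for each basal body in a 3D point set."""
    d, _ = cKDTree(points).query(points, k=2)   # k=2 skips the point itself
    return d[:, 1]

rng = np.random.default_rng(0)
bbs = rng.uniform(0, 50, size=(200, 3))         # hypothetical BB centroids, um
spacing = nn_spacing(bbs)
gaps = (spacing > 2 * np.median(spacing)).sum() # crude "large gap" count
print(f"median spacing {np.median(spacing):.2f} um, large gaps: {gaps}")
```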

  11. Effect of mandibular advancement on the natural position of the head: a preliminary study of 3-dimensional cephalometric analysis.

    PubMed

    Lin, Xiaozhen; Liu, Yanpu; Edwards, Sean P

    2013-10-01

    Our aim was to investigate the potential effect of advancement by bilateral sagittal split osteotomy (BSSO) on the natural position of the head by using 3-dimensional cephalometric analysis. Seven consecutive patients who had had only BSSO advancement, and had had preoperative and 6-week postoperative cone beam computed tomography (CT) scans, were recruited to this retrospective study. Two variables, SNB and SNC2, were used to indicate the craniomandibular alignment and craniocervical inclination, respectively, in the midsagittal plane. Using 3-dimensional cephalometric analysis software, the SNB and the SNC2 were recorded in volume and measured in the midsagittal plane at 3 independent time-points. The reliability was measured and a paired t test used to assess the significance of differences between the means of SNB and SNC2 before and after operation. The 3-dimensional cephalometric measurement showed good reliability. The SNB was increased as planned in all the mandibles that were advanced, the cervical vertebrae were brought forward after BSSO, and the SNC2 was significantly increased in 6 of the 7 patients. Three-dimensional cephalometric analysis may provide an alternative way of assessing cephalometrics. After BSSO advancement, the natural position of the head changed by increasing the craniocervical inclination in an anteroposterior direction.

  12. Contributions of the Musculus Uvulae to Velopharyngeal Closure Quantified With a 3-Dimensional Multimuscle Computational Model.

    PubMed

    Inouye, Joshua M; Lin, Kant Y; Perry, Jamie L; Blemker, Silvia S

    2016-02-01

    The convexity of the dorsal surface of the velum is critical for normal velopharyngeal (VP) function and is largely attributed to the levator veli palatini (LVP) and musculus uvulae (MU). Studies have correlated a concave or flat nasal velar surface to symptoms of VP dysfunction including hypernasality and nasal air emission. In the context of surgical repair of cleft palates, the MU has been given relatively little attention in the literature compared with the larger LVP. A greater understanding of the mechanics of the MU will provide insight into understanding the influence of a dysmorphic MU, as seen in cleft palate, as it relates to VP function. The purpose of this study was to quantify the contributions of the MU to VP closure in a computational model. We created a novel 3-dimensional (3D) finite element model of the VP mechanism from magnetic resonance imaging data collected from an individual with healthy noncleft VP anatomy. The model components included the velum, posterior pharyngeal wall (PPW), LVP, and MU. Simulations were based on the muscle and soft tissue mechanical properties from the literature. We found that, similar to previous hypotheses, the MU acts as (i) a space-occupying structure and (ii) a velar extensor. As a space-occupying structure, the MU helps to nearly triple the midline VP contact length. As a velar extensor, the MU acting alone without the LVP decreases the VP distance by 62%. Furthermore, activation of the MU decreases the LVP activation required for closure almost 3-fold, from 20% (without MU) to 8% (with MU). Our study suggests that any possible salvaging and anatomical reconstruction of viable MU tissue in a cleft patient may improve VP closure due to its mechanical function. In the absence or dysfunction of MU tissue, implantation of autologous or engineered tissues at the velar midline, as a possible substitute for the MU, may produce a geometric convexity more favorable to VP closure. In the future, more complex models will

  13. Comparison of nonnavigated and 3-dimensional image-based computer navigated balloon kyphoplasty.

    PubMed

    Sembrano, Jonathan N; Yson, Sharon C; Polly, David W; Ledonio, Charles Gerald T; Nuckley, David J; Santos, Edward R G

    2015-01-01

    Balloon kyphoplasty is a common treatment for osteoporotic and pathologic compression fractures. Advantages include minimal tissue disruption, quick recovery, pain relief, and in some cases prevention of progressive sagittal deformity. The benefit of image-based navigation in kyphoplasty has not been established. The goal of this study was to determine whether there is a difference between fluoroscopy-guided balloon kyphoplasty and 3-dimensional image-based navigation in terms of needle malposition rate, cement leakage rate, and radiation exposure time. The authors compared navigated and nonnavigated needle placement in 30 balloon kyphoplasty procedures (47 levels). Intraoperative 3-dimensional image-based navigation was used for needle placement in 21 cases (36 levels); conventional 2-dimensional fluoroscopy was used in the other 9 cases (11 levels). The 2 groups were compared for rates of needle malposition and cement leakage as well as radiation exposure time. Three of the 11 nonnavigated levels (27%) were complicated by a malpositioned needle, and 2 of these had to be repositioned. The navigated group had a significantly lower malposition rate (1 of 36; 3%; P=.04). The overall rate of cement leakage was similar in both groups (P=.29). Radiation exposure time was also similar in both groups (navigated, 98 s/level; nonnavigated, 125 s/level; P=.10). Navigated kyphoplasty procedures did not differ significantly from nonnavigated procedures except in terms of needle malposition rate, where navigation may have decreased the need for needle repositioning.

  14. Use of 3-dimensional computed tomography to detect a barium-masked fish bone causing esophageal perforation.

    PubMed

    Tsukiyama, Atsushi; Tagami, Takashi; Kim, Shiei; Yokota, Hiroyuki

    2014-01-01

    Computed tomography (CT) is useful for evaluating esophageal foreign bodies and detecting perforation. However, when evaluation is difficult owing to the previous use of barium as a contrast medium, 3-dimensional CT may facilitate accurate diagnosis. A 49-year-old man was transferred to our hospital with the diagnosis of esophageal perforation. Because barium had been used as a contrast medium for an esophagram performed at a previous hospital, horizontal CT and esophageal endoscopy could not identify the foreign body or characterize the lesion. However, 3-dimensional CT clearly revealed an L-shaped foreign body and its anatomical relationships in the mediastinum. Accordingly, we removed the foreign body using an upper gastrointestinal endoscope. The foreign body was the premaxillary bone of a sea bream. The patient was discharged without complications.

  15. Computer-Aided Designed, 3-Dimensionally Printed Porous Tissue Bioscaffolds For Craniofacial Soft Tissue Reconstruction

    PubMed Central

    Zopf, David A.; Mitsak, Anna G.; Flanagan, Colleen L.; Wheeler, Matthew; Green, Glenn E.; Hollister, Scott J.

    2016-01-01

    Objectives To determine the potential of an integrated image-based computer-aided design (CAD) and 3D printing approach to engineer scaffolds for head and neck cartilaginous reconstruction, specifically auricular and nasal reconstruction. Study Design Proof of concept revealing novel methods for bioscaffold production with in vitro and in vivo animal data. Setting Multidisciplinary effort encompassing two academic institutions. Subjects and Methods DICOM CT images are segmented and utilized in image-based computer-aided design to create porous, anatomic structures. Bioresorbable, polycaprolactone scaffolds with spherical and random porous architecture are produced using a laser-based 3D printing process. Subcutaneous in vivo implantation of auricular and nasal scaffolds was performed in a porcine model. Auricular scaffolds were seeded with chondrogenic growth factors in a hyaluronic acid/collagen hydrogel and cultured in vitro over a 2-month period. Results Auricular and nasal constructs with several microporous architectures were rapidly manufactured with high fidelity to human patient anatomy. Subcutaneous in vivo implantation of auricular and nasal scaffolds resulted in excellent appearance and complete soft tissue ingrowth. Histologic analysis of in vitro scaffolds demonstrated native-appearing cartilaginous growth respecting the boundaries of the scaffold. Conclusions Integrated image-based computer-aided design (CAD) and 3D printing processes generated patient-specific nasal and auricular scaffolds that supported cartilage regeneration. PMID:25281749

  16. Porous Media Contamination: 3-Dimensional Visualization and Quantification Using X-Ray Computed Tomography

    NASA Astrophysics Data System (ADS)

    Goldstein, L.; Prasher, S. O.; Ghoshal, S.

    2004-05-01

    Non-aqueous phase liquids (NAPLs), if spilled into the subsurface, will migrate downward, and a significant fraction will become trapped in the soil matrix. These trapped NAPL globules partition into the water and/or vapor phase, and serve as continuous sources of contamination (i.e., source zones). At present, the presence of NAPL in the subsurface is typically inferred from chemical analysis data. There are no accepted methodologies or protocols available for the direct characterization of NAPLs in the subsurface. Proven and cost-effective methodologies are needed to allow effective implementation of remediation technologies at NAPL-contaminated sites. X-ray computed tomography (CT) has the potential to non-destructively quantify NAPL mass and distribution in soil cores because this technology can detect small atomic density differences among the solid, liquid, gas, and NAPL phases present in a representative volume element. We have demonstrated that environmentally significant NAPLs, such as gasoline and other oil products, chlorinated solvents, and PCBs possess a characteristic and predictable X-ray attenuation coefficient that permits their quantification in porous media at incident beam energies typical of medical and industrial X-ray CT scanners. As part of this study, methodologies were developed for generating and analyzing X-ray CT data for the study of NAPLs in natural porous media. Columns of NAPL-contaminated soils were scanned, flushed with solvents and water to remove entrapped NAPL, and re-scanned. X-ray CT data were analyzed to obtain numerical arrays of soil porosity, NAPL saturation, and NAPL volume at a spatial resolution of 1 mm. This methodology was validated using homogeneous and heterogeneous soil columns with known quantities of gasoline and tetrachloroethylene. NAPL volumes computed using X-ray CT data were compared with known volumes from volume balance calculations. Error analysis revealed that in a 5 cm long and 2.5 cm diameter soil
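
    The quantification rests on linear mixing of attenuation over the phases in each voxel: flushing the NAPL and re-scanning changes a voxel's CT number by porosity x saturation x (μ_NAPL - μ_water). A hedged sketch of that arithmetic; the attenuation values and porosity are invented for illustration.

```python
import numpy as np

def napl_saturation(ct_with_napl, ct_flushed, porosity, mu_napl, mu_water):
    """Voxel-wise NAPL saturation from two co-registered scans, assuming a
    voxel's CT number mixes linearly over its phase volume fractions."""
    delta = ct_with_napl - ct_flushed        # change on replacing NAPL by water
    s = delta / (porosity * (mu_napl - mu_water))
    return np.clip(s, 0.0, 1.0)

# toy single voxel: gasoline attenuates less than water, so the CT number
# rises after flushing
print(napl_saturation(np.array([980.0]), np.array([1000.0]),
                      porosity=0.35, mu_napl=880.0, mu_water=1000.0))
```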

  17. Surgical Classification of the Mandibular Deformity in Craniofacial Microsomia Using 3-Dimensional Computed Tomography

    PubMed Central

    Swanson, Jordan W.; Mitchell, Brianne T.; Wink, Jason A.; Taylor, Jesse A.

    2016-01-01

    Background: Grading systems of the mandibular deformity in craniofacial microsomia (CFM) based on conventional radiographs have shown low interrater reproducibility among craniofacial surgeons. We sought to design and validate a classification based on 3-dimensional CT (3dCT) that correlates features of the deformity with surgical treatment. Methods: CFM mandibular deformities were classified as normal (T0), mild (hypoplastic, likely treated with orthodontics or orthognathic surgery; T1), moderate (vertically deficient ramus, likely treated with distraction osteogenesis; T2), or severe (ramus rudimentary or absent, with either adequate or inadequate mandibular body bone stock; T3 and T4, likely treated with costochondral graft or free fibular flap, respectively). The 3dCT face scans of CFM patients were randomized and then classified by craniofacial surgeons. Pairwise agreement and Fleiss' κ were used to assess interrater reliability. Results: The 3dCT images of 43 patients with CFM (aged 0.1–15.8 years) were reviewed by 15 craniofacial surgeons, representing an average of 15.2 years of experience. Reviewers demonstrated fair interrater reliability with average pairwise agreement of 50.4 ± 9.9% (Fleiss' κ = 0.34). This represents significant improvement over the Pruzansky–Kaban classification (pairwise agreement, 39.2%; P = 0.0033). Reviewers demonstrated substantial interrater reliability with average pairwise agreement of 83.0 ± 7.6% (κ = 0.64) when distinguishing deformities requiring graft or flap reconstruction (T3 and T4) from the others. Conclusion: The proposed classification, designed for the era of 3dCT, shows improved consensus with respect to stratifying the severity of mandibular deformity and the type of operative management. PMID:27104097
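
    Interrater reliability here is summarized with Fleiss' κ, which compares mean observed pairwise agreement with the agreement expected by chance from category prevalences. A self-contained sketch of the standard computation follows; the toy rating table is invented, not the study's data.

```python
import numpy as np

def fleiss_kappa(counts):
    """Fleiss' kappa for an (N subjects x k categories) count table;
    every row must sum to the same number of raters n."""
    counts = np.asarray(counts, dtype=float)
    N, _ = counts.shape
    n = counts[0].sum()
    p_i = (np.sum(counts**2, axis=1) - n) / (n * (n - 1))  # per-subject agreement
    P_bar = p_i.mean()                                     # observed agreement
    p_j = counts.sum(axis=0) / (N * n)                     # category prevalence
    P_e = np.sum(p_j**2)                                   # chance agreement
    return (P_bar - P_e) / (1 - P_e)

# toy table: 5 raters grade 4 scans into T0..T4
table = [[0, 1, 4, 0, 0],
         [0, 0, 1, 3, 1],
         [5, 0, 0, 0, 0],
         [0, 2, 2, 1, 0]]
print(f"kappa = {fleiss_kappa(table):.2f}")
```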

  18. Role of the Animator in the Generation of 3-Dimensional Computer Generated Animation.

    ERIC Educational Resources Information Center

    Wedge, John Christian

    This master's thesis investigates the relationship between the traditional animator and the computer, as computer animation systems now allow animators to apply traditional skills with a high degree of success. The advantages and disadvantages of traditional animation as a medium for expressing motion and character are noted, and it is argued that the…

  19. Solution of 3-dimensional time-dependent viscous flows. Part 2: Development of the computer code

    NASA Technical Reports Server (NTRS)

    Weinberg, B. C.; Mcdonald, H.

    1980-01-01

    There is considerable interest in developing a numerical scheme for solving the time dependent viscous compressible three dimensional flow equations to aid in the design of helicopter rotors. The development of a computer code to solve a three dimensional unsteady approximate form of the Navier-Stokes equations employing a linearized block implicit technique in conjunction with a QR operator scheme is described. Results of calculations of several Cartesian test cases are presented. The computer code can be applied to more complex flow fields such as those encountered on rotating airfoils.

  20. A 3-dimensional Navier-Stokes-Euler code for blunt-body flow computations

    NASA Technical Reports Server (NTRS)

    Li, C. P.

    1985-01-01

    The shock-layer flowfield is obtained, with or without viscous and heat-conducting dissipation, from the conservation laws of fluid dynamics using a shock-fitting implicit finite-difference technique. The governing equations are cast in curvilinear-orthogonal coordinates and transformed to the domain between the shock and the body. Another set of equations is used for the singular coordinate axis, which, together with a cone generator away from the stagnation point, encloses the computation domain. A time-dependent alternating direction implicit factorization technique is applied to integrate the equations with local time increments until a steady solution is reached. The shock location is updated after the flowfield computation, but the wall conditions are implemented into the implicit procedure. Innovative procedures are introduced to define the initial flowfield, to treat both perfect and equilibrium gases, to advance the solution on a coarse-to-fine grid sequence, and to start viscous flow computations from their corresponding inviscid solutions. The results are obtained from a grid no greater than 28 by 18 by 7 and converged within 300 integration steps. They are of sufficient accuracy to start parabolized Navier-Stokes or Euler calculations beyond the nose region, to compare with flight and wind-tunnel data, and to evaluate conceptual designs of reentry spacecraft.
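
    An approximately factored alternating-direction implicit step of the kind invoked above replaces one large implicit solve per time step with a sequence of one-dimensional tridiagonal solves. A minimal sketch on a 2-D heat equation stand-in (the actual code integrates the fluid-dynamics equations in shock-fitted coordinates); the grid size and alpha are arbitrary.

```python
import numpy as np
from scipy.linalg import solve_banded

def adi_step(u, alpha):
    """One approximately factored implicit step for u_t = u_xx + u_yy:
    an implicit tridiagonal sweep along x, then another along y."""
    def implicit_sweep(v, a):
        n = v.shape[1]
        ab = np.zeros((3, n))                 # banded form of (I - a*D2)
        ab[0, 1:] = -a                        # superdiagonal
        ab[1, :] = 1.0 + 2.0 * a              # diagonal
        ab[2, :-1] = -a                       # subdiagonal
        return np.stack([solve_banded((1, 1), ab, row) for row in v])
    half = implicit_sweep(u, alpha)           # x-direction lines
    return implicit_sweep(half.T, alpha).T    # y-direction lines

u = np.zeros((32, 32))
u[16, 16] = 1.0                               # initial point of heat
for _ in range(10):
    u = adi_step(u, alpha=0.5)
print(f"peak after 10 steps: {u.max():.4f}")
```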

  1. A Modular Computer Code for Simulating Reactive Multi-Species Transport in 3-Dimensional Groundwater Systems

    SciTech Connect

    TP Clement

    1999-06-24

    RT3DV1 (Reactive Transport in 3-Dimensions) is a computer code that solves the coupled partial differential equations that describe reactive-flow and transport of multiple mobile and/or immobile species in three-dimensional saturated groundwater systems. RT3D is a generalized multi-species version of the US Environmental Protection Agency (EPA) transport code, MT3D (Zheng, 1990). The current version of RT3D uses the advection and dispersion solvers from the DOD-1.5 (1997) version of MT3D. As with MT3D, RT3D also requires the groundwater flow code MODFLOW for computing spatial and temporal variations in groundwater head distribution. The RT3D code was originally developed to support the contaminant transport modeling efforts at natural attenuation demonstration sites. As a research tool, RT3D has also been used to model several laboratory and pilot-scale active bioremediation experiments. The performance of RT3D has been validated by comparing the code results against various numerical and analytical solutions. The code is currently being used to model field-scale natural attenuation at multiple sites. The RT3D code is unique in that it includes an implicit reaction solver that makes the code sufficiently flexible for simulating various types of chemical and microbial reaction kinetics. RT3D V1.0 supports seven pre-programmed reaction modules that can be used to simulate different types of reactive contaminants including benzene-toluene-xylene mixtures (BTEX), and chlorinated solvents such as tetrachloroethene (PCE) and trichloroethene (TCE). In addition, RT3D has a user-defined reaction option that can be used to simulate any other types of user-specified reactive transport systems. This report describes the mathematical details of the RT3D computer code and its input/output data structure. It is assumed that the user is familiar with the basics of groundwater flow and contaminant transport mechanics. In addition, RT3D users are expected to have some experience in
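
    RT3D's reaction option slots user-specified kinetics into an operator-split solve: each transport step is followed by integrating a reaction ODE system in every grid cell. Below is a hedged sketch of such a kinetic module using sequential first-order PCE to TCE decay; the rate constants and time step are invented for illustration and this is not RT3D's actual interface.

```python
import numpy as np
from scipy.integrate import solve_ivp

def pce_tce_decay(t, c, k_pce=0.005, k_tce=0.002):
    """Sequential first-order dechlorination, PCE -> TCE -> (daughter),
    the kind of kinetics an RT3D reaction module represents (rates in 1/day)."""
    pce, tce = c
    return [-k_pce * pce, k_pce * pce - k_tce * tce]

# the reaction half of an operator-split step: react each cell over dt
dt, c0 = 30.0, [1.0, 0.0]                      # days; mg/L
sol = solve_ivp(pce_tce_decay, (0.0, dt), c0, rtol=1e-8)
print(f"after {dt:.0f} d: PCE = {sol.y[0, -1]:.3f}, TCE = {sol.y[1, -1]:.3f} mg/L")
```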

  2. 3-dimensional (orthogonal) structural complexity of time-series data using low-order moment analysis

    NASA Astrophysics Data System (ADS)

    Law, Victor J.; O'Neill, Feidhlim T.; Dowling, Denis P.

    2012-09-01

    The recording of atmospheric pressure plasma (APP) electro-acoustic emission data has been developed as a plasma metrology tool over the last couple of years. Industrial applications include the automotive and aerospace industries, for surface activation of polymers prior to bonding [1, 2, and 3]. It has been shown that as the APP jet proceeds over a treatment surface at various fixed heights, two contrasting acoustic signatures are produced which correspond to two very different plasma-surface entropy states (blow arc ~1700 ± 100 K; afterglow ~300-400 K) [4]. The metrology challenge is now to capture deterministic data points within data clusters. For this to be achieved, new real-time data cluster measurement techniques need to be developed [5]. The cluster information must be extracted within the allotted process time period if real-time process control is to be achieved. This abstract describes a theoretical structural complexity analysis (in terms of crossing points) of 2- and 3-dimensional line graphs that contain time-series data. In addition, a LabVIEW implementation of the 3-dimensional data analysis is performed. It is also shown that the cluster analysis technique can be transferred to other (non-acoustic) datasets.
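
    Once two traces share a time base, a crossing point is simply a sign change of their difference. A minimal 2-dimensional sketch of the counting step (the paper's 3-dimensional, orthogonal variant and its LabVIEW implementation are beyond this illustration):

```python
import numpy as np

def crossing_points(a, b):
    """Indices where two equally sampled traces cross: the sign of their
    difference changes between consecutive samples."""
    d = np.asarray(a, dtype=float) - np.asarray(b, dtype=float)
    return np.nonzero(d[:-1] * d[1:] < 0)[0]

t = np.linspace(0.0, 1.0, 500)
sig1 = np.sin(2 * np.pi * 5 * t)
sig2 = 0.3 * np.sin(2 * np.pi * 7 * t + 0.4)
print(f"{crossing_points(sig1, sig2).size} crossings")   # complexity measure
```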

  3. Role of preoperative 3-dimensional computed tomography reconstruction in depressed skull fractures treated with craniectomy: a case report of forensic interest.

    PubMed

    Viel, Guido; Cecchetto, Giovanni; Manara, Renzo; Cecchetto, Attilio; Montisci, Massimo

    2011-06-01

    Patients affected by cranial trauma with depressed skull fractures and increased intracranial pressure generally undergo neurosurgical intervention. Because craniotomy and craniectomy remove skull fragments and generate new fracture lines, they complicate forensic examination and sometimes prevent a clear identification of skull fracture etiology. A 3-dimensional reconstruction based on preoperative computed tomography (CT) scans, giving a picture of the injuries before surgical intervention, can help the forensic examiner identify the origin of a skull fracture and the means of production. We report the case of a 41-year-old man presenting at the emergency department with a depressed skull fracture at the vertex and bilateral subdural hemorrhage. The patient underwent 2 neurosurgical interventions (craniotomy and craniectomy) but died after 40 days of hospitalization in an intensive care unit. At autopsy, the absence of various bone fragments did not allow us to establish if the skull had been struck by a blunt object or had hit the ground with high kinetic energy. To analyze bone injuries before craniectomy, a 3-dimensional CT reconstruction based on preoperative scans was performed. A comparative analysis between autopsy and radiological data allowed us to differentiate surgical from traumatic injuries. Moreover, based on the shape and size of the depressed skull fracture (measured from the CT reformations), we inferred that the man had been struck by a cylindric blunt object with a diameter of about 3 cm. PMID:21512384

  4. Scene-of-crime analysis by a 3-dimensional optical digitizer: a useful perspective for forensic science.

    PubMed

    Sansoni, Giovanna; Cattaneo, Cristina; Trebeschi, Marco; Gibelli, Daniele; Poppa, Pasquale; Porta, Davide; Maldarella, Monica; Picozzi, Massimo

    2011-09-01

    Analysis and detailed registration of the crime scene are of the utmost importance during investigations. However, this phase of activity is often affected by the risk of loss of evidence due to the limits of traditional scene-of-crime registration methods (ie, photos and videos). This technical note shows the utility of the application of a 3-dimensional optical digitizer on different crime scenes. This study aims to verify the importance and feasibility of contactless 3-dimensional reconstruction and modeling by optical digitization to achieve an optimal registration of the crime scene. PMID:21811148

  5. A 3-Dimensional Analysis of Face-Mask Removal Tools in Inducing Helmet Movement

    PubMed Central

    Swartz, Erik E.; Armstrong, Charles W.; Rankin, James M.; Rogers, Burton

    2002-01-01

    Objective: To evaluate the performance of specific face-mask removal tools during football helmet face-mask retraction using 3-dimensional (3-D) video. Design and Setting: Four different tools were used: the anvil pruner (AP), polyvinyl chloride pipe cutters (PVC), Face Mask (FM) Extractor (FME), and Trainer's Angel (TA). Subjects retracted a face mask once with each tool. Subjects: Eleven certified athletic trainers served as subjects and were recruited from among local sports medicine professionals. Measurements: We analyzed a sample of movement by 3-D techniques during the retraction process. Movement of the head in 3 planes and time to retract the face mask were also assessed. All results were analyzed with a simple repeated-measures one-way multivariate analysis of variance. An overall efficiency score was calculated for each tool. Results: The AP allowed subjects to perform the face-mask removal task the fastest. Face mask removal with the AP was significantly faster than with the PVC and TA and significantly faster with the TA than the PVC. The PVC and AP created significantly more movement than the FME and TA when planes were combined. No significant differences were noted among tools for flexion-extension, rotation, or lateral flexion. The AP had an efficiency score of 14; FME, 15; TA, 18; and PVC, 35. Conclusions: The subjects performed the face-mask removal task in the least amount of time with the AP. They completed the task with the least amount of combined movement using the FME. The AP and FME had nearly identical overall efficiency scores for movement and time. PMID:12937432

  6. Morphological analysis and preoperative simulation of a double-chambered right ventricle using 3-dimensional printing technology.

    PubMed

    Shirakawa, Takashi; Koyama, Yasushi; Mizoguchi, Hiroki; Yoshitatsu, Masao

    2016-05-01

    We present a case of a double-chambered right ventricle in adulthood, in which we tried a detailed morphological assessment and preoperative simulation using 3-dimensional (3D) heart models for improved surgical planning. Polygonal object data for the heart were constructed from computed tomography images of this patient, and transferred to a desktop 3D printer to print out models in actual size. Medical staff completed all of the work processes. Because the 3D heart models were examined by hand, observed from various viewpoints and measured by callipers with ease, we were able to create an image of the complete form of the heart. The anatomical structure of an anomalous bundle was clearly observed, and surgical approaches to the lesion were simulated accurately. During surgery, we used an incision on the pulmonary infundibulum and resected three muscular components of the stenosis. The similarity between the models and the actual heart was excellent. As a result, the operation for this rare defect was performed safely and successfully. We concluded that the custom-made model was useful for morphological analysis and preoperative simulation. PMID:26860990

  7. User's manual for MASTER: Modeling of aerodynamic surfaces by 3-dimensional explicit representation. [input to three-dimensional computational fluid dynamics]

    NASA Technical Reports Server (NTRS)

    Gibson, S. G.

    1983-01-01

    A system of computer programs was developed to model general three dimensional surfaces. Surfaces are modeled as sets of parametric bicubic patches. There are also capabilities to transform coordinates, to compute mesh/surface intersection normals, and to format input data for a transonic potential flow analysis. A graphical display of surface models and intersection normals is available. There are additional capabilities to regulate point spacing on input curves and to compute surface/surface intersection curves. Input and output data formats are described; detailed suggestions are given for user input. Instructions for execution are given, and examples are shown.
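
    Surfaces assembled from parametric bicubic patches, as in MASTER, map a (u, v) parameter pair and a 4 x 4 grid of control points to a 3-D surface point. A small sketch using the Bézier/Bernstein form; MASTER's exact patch basis may differ, and the control grid here is invented.

```python
import numpy as np

def bicubic_point(G, u, v):
    """Point on a parametric bicubic (Bezier) patch at (u, v);
    G is a 4x4 grid of 3D control points."""
    def bern(t):                                  # cubic Bernstein basis
        s = 1.0 - t
        return np.array([s**3, 3*s*s*t, 3*s*t*t, t**3])
    return np.einsum("i,ijk,j->k", bern(u), G, bern(v))

grid = np.array([[[i, j, np.sin(i) * np.cos(j)] for j in range(4)]
                 for i in range(4)], dtype=float)
print(bicubic_point(grid, 0.5, 0.5))              # mid-patch surface point
```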

  8. Lateral characteristic analysis of PMLSM considering overhang effect by 3 dimensional equivalent magnetic circuit network method

    SciTech Connect

    Hur, J.; Jung, I.S.; Hyun, D.S.

    1998-09-01

    The permanent magnet linear synchronous motor (PMLSM) is used as a propulsion device for high-speed ground transportation and as a contactless carrier in factory and office automation. This paper presents the lateral characteristics of the PMLSM as the overhang length changes. To analyze the overhang effect in a PMLSM with a large air gap and finite width while accounting for lateral displacement, a new 3-dimensional equivalent magnetic circuit network method (3-D EMCN) that takes the lateral movement of the secondary into account is introduced; it supplements the magnetic equivalent circuit with a numerical technique. 3-D EMCN can consider secondary movement without remeshing the elements because it reuses the initial mesh continuously. The authors analyzed characteristics for three types of overhang cases, which are inherently 3-D problems. The results are compared with experimental data and show reasonable agreement.

  9. Stress analysis in platform-switching implants: a 3-dimensional finite element study.

    PubMed

    Pellizzer, Eduardo Piza; Verri, Fellippo Ramos; Falcón-Antenucci, Rosse Mary; Júnior, Joel Ferreira Santiago; de Carvalho, Paulo Sérgio Perri; de Moraes, Sandra Lúcia Dantas; Noritomi, Pedro Yoshito

    2012-10-01

    The aim of this study was to evaluate the influence of the platform-switching technique on stress distribution in implant, abutment, and peri-implant tissues, through a 3-dimensional finite element study. Three 3-dimensional mandibular models were fabricated using the SolidWorks 2006 and InVesalius software. Each model was composed of a bone block with one implant 10 mm long and of different diameters (3.75 and 5.00 mm). The UCLA abutments also ranged in diameter from 5.00 mm to 4.1 mm. After obtaining the geometries, the models were transferred to the software FEMAP 10.0 for pre- and postprocessing of finite elements to generate the mesh, loading, and boundary conditions. A total load of 200 N was applied in axial (0°), oblique (45°), and lateral (90°) directions. The models were solved by the software NeiNastran 9.0 and transferred to the software FEMAP 10.0 to obtain the results that were visualized through von Mises and maximum principal stress maps. Model A (implants with 3.75 mm/abutment with 4.1 mm) exhibited the highest area of stress concentration with all loadings (axial, oblique, and lateral) for the implant and the abutment. All models presented the stress areas at the abutment level and at the implant/abutment interface. Models B (implant with 5.0 mm/abutment with 5.0 mm) and C (implant with 5.0 mm/abutment with 4.1 mm) presented minor areas of stress concentration and similar distribution pattern. For the cortical bone, low stress concentration was observed in the peri-implant region for models B and C in comparison to model A. The trabecular bone exhibited low stress that was well distributed in models B and C. Model A presented the highest stress concentration. Model B exhibited better stress distribution. There was no significant difference between the large-diameter implants (models B and C).
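
    The von Mises maps reported above come from a standard scalar reduction of the stress tensor. A minimal sketch follows, reusing the 200-N load and 3.75-mm implant diameter from the abstract only as a crude uniaxial illustration; a real finite element model resolves the full 3-D stress field.

```python
import numpy as np

def von_mises(sigma):
    """Von Mises equivalent stress from a 3x3 Cauchy stress tensor (MPa)."""
    dev = sigma - np.trace(sigma) / 3.0 * np.eye(3)   # deviatoric part
    return float(np.sqrt(1.5 * np.sum(dev * dev)))

# crude uniaxial stand-in: 200 N axial load over a 3.75-mm-diameter section
area = np.pi * (3.75 / 2.0) ** 2                      # mm^2
stress = np.zeros((3, 3))
stress[2, 2] = 200.0 / area                           # MPa (N/mm^2)
print(f"von Mises stress: {von_mises(stress):.1f} MPa")
```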

  10. Surgical orthodontic treatment for a patient with advanced periodontal disease: evaluation with electromyography and 3-dimensional cone-beam computed tomography.

    PubMed

    Nakajima, Kan; Yamaguchi, Tetsutaro; Maki, Koutaro

    2009-09-01

    We report here the case of a woman with Class III malocclusion and advanced periodontal disease who was treated with surgical orthodontic correction. Functional recovery after orthodontic treatment is often monitored by serial electromyography of the masticatory muscles, whereas 3-dimensional cone-beam computed tomography can provide detailed structural information about, for example, periodontal bone defects. However, it is unclear whether the information obtained via these methods is sufficient to determine the treatment goal. It might be useful to address this issue for patients with advanced periodontal disease because of much variability between patients in the determination of treatment goals. We used detailed information obtained by 3-dimensional cone-beam computed tomography to identify periodontal bone defects and set appropriate treatment goals for inclination of the incisors and mandibular surgery. Results for this patient included stable occlusion and improved facial esthetics. This case report illustrates the benefits of establishing treatment goals acceptable to the patient, based on precise 3-dimensional assessment of dentoalveolar bone, and by using masticatory muscle activity to monitor the stability of occlusion.

  11. Biomechanical 3-Dimensional Finite Element Analysis of Obturator Prostheses Retained with Zygomatic and Dental Implants in Maxillary Defects

    PubMed Central

    Akay, Canan; Yaluğ, Suat

    2015-01-01

    Background The objective of this study was to investigate the stress distribution in the bone around zygomatic and dental implants for 3 different implant-retained obturator prosthesis designs in an Aramany class IV maxillary defect using 3-dimensional finite element analysis (FEA). Material/Methods A 3-dimensional finite element model of an Aramany class IV defect was created. Three different implant-retained obturator prostheses were modeled: model 1 with 1 zygomatic implant and 1 dental implant, model 2 with 1 zygomatic implant and 2 dental implants, and model 3 with 2 zygomatic implants. Locator attachments were used as a superstructure. A 150-N load was applied 3 different ways. Qualitative analysis was based on the scale of maximum principal stress; values obtained through quantitative analysis are expressed in MPa. Results In all loading conditions, model 3 (compared with models 1 and 2) showed the lowest maximum principal stress value, making it the most appropriate reconstruction for Aramany class IV maxillary defects. Two zygomatic implants can reduce the stresses in model 3. The distribution of stresses on the prostheses was more rational with the help of zygoma implants, which can distribute the stresses to each part of the maxilla. Conclusions In Aramany class IV obturator prostheses, placement of 2 zygomatic implants, 1 in each side of the maxilla, is more advantageous than placement of dental implants. On the non-defective side, increasing the number of dental implants is not as suitable as using zygomatic implants. PMID:25714086

  12. BOPACE 3-D (the Boeing Plastic Analysis Capability for 3-dimensional Solids Using Isoparametric Finite Elements)

    NASA Technical Reports Server (NTRS)

    Vos, R. G.; Straayer, J. W.

    1975-01-01

    BOPACE 3-D is a finite element computer program that provides a general family of three-dimensional isoparametric solid elements and includes a new algorithm for improving the efficiency of the elastic-plastic-creep solution procedure. Theoretical, user, and programmer oriented sections are presented to describe the program.

  13. Hybrid-finite-element analysis of some nonlinear and 3-dimensional problems of engineering fracture mechanics

    NASA Technical Reports Server (NTRS)

    Atluri, S. N.; Nakagaki, M.; Kathiresan, K.

    1980-01-01

    In this paper, efficient numerical methods for the analysis of crack-closure effects on fatigue-crack-growth-rates, in plane stress situations, and for the solution of stress-intensity factors for arbitrary shaped surface flaws in pressure vessels, are presented. For the former problem, an elastic-plastic finite element procedure valid for the case of finite deformation gradients is developed and crack growth is simulated by the translation of near-crack-tip elements with embedded plastic singularities. For the latter problem, an embedded-elastic-singularity hybrid finite element method, which leads to a direct evaluation of K-factors, is employed.

  14. 3-dimensional microscope analysis of bone and tooth surface modifications: comparisons of fossil specimens and replicas.

    PubMed

    Bello, Silvia M; Verveniotou, Efstratia; Cornish, Lorraine; Parfitt, Simon A

    2011-01-01

    Cut-marks on fossil bones and teeth are an important source of evidence in the reconstruction of ancient butchery practices. The analysis of butchery marks has allowed archaeologists to interpret aspects of past subsistence strategies and the behavior of early humans. Recent advances in optical scanning microscopy allow detailed measurements of cut-mark morphology to be undertaken. An example of this technology is the Alicona 3D InfiniteFocus imaging microscope, which has been applied recently to the study of surface modifications on bones and teeth. Three-dimensional models generated by the Alicona microscope have been used to identify cross-sectional features of experimental cut-marks that are characteristic for specific cutting actions (e.g., slicing, chopping, scraping) and different tool types (e.g., metal versus stone tools). More recently, this technology has been applied successfully to the analysis of ~500,000-year-old cut-marked animal bones from Boxgrove (U.K.), as well as cannibalized human bones from Gough's Cave (U.K.), dated to 14,700 cal BP. This article describes molding methods used to replicate fragile prehistoric bones and teeth, where image quality was adversely affected by specimen translucency and reflectivity. Alicona images generated from molds and casts are often of better quality than those of the original specimen. PMID:21660994

  15. In situ tooth replica custom implant: a 3-dimensional finite element stress and strain analysis.

    PubMed

    Ghuneim, Wael Aly

    2013-10-01

    This study is one phase of a biomechanical investigation, part of a research program concerned with the new concept of in situ tooth replication. The purpose of the study was to evaluate the tooth replica under each of two possible circumstances: (1) attachment via a periodontal ligament and (2) osseointegration. Replicas were made of Cortoss, a bioactive glass bone substitute. Three-dimensional finite element analysis was used to assess the stresses and strains resulting from each of 2 types of loads, off-vertical pressure and vertical point force, acting on a natural mandibular second premolar and the corresponding replicas. The natural tooth tolerated 19 MPa of pressure or an 85-N vertical force, the periodontally attached replica tolerated 15 MPa or 80 N, and the osseointegrated replica tolerated 23 MPa or 217 N.

  16. Error analysis of a direct current electromagnetic tracking system in digitizing 3-dimensional surface geometries.

    PubMed

    Milne, A D; Lee, J M

    1999-01-01

    The direct current electromagnetic tracking device has seen increasing use in biomechanics studies of joint kinematics and anatomical surface geometry. In these applications, a stylus is attached to a sensor to measure the spatial location of three-dimensional landmarks. Stylus calibration is performed by rotating the stylus about a fixed point in space and using regression analysis to determine the tip offset vector. Measurement errors can be induced via several pathways, including intrinsic system errors in sensor position or angle, and tip offset calibration errors. A detailed study was performed to determine the errors introduced in digitizing small surfaces with different stylus lengths (35, 55, and 65 mm) and approach angles (30 and 45 degrees) using a plastic calibration board and hemispherical models. Two-point discrimination errors increased to an average of 1.93 mm for a 254 mm step size. Rotation about a single point produced mean errors of 0.44 to 1.18 mm. Statistically significant differences in error were observed with increasing approach angles (p < 0.001). Errors of less than 6% were observed in determining the curvature of a 19 mm hemisphere. This study demonstrates that the "Flock of Birds" can be used as a digitizing tool with accuracy better than 0.76% over 254 mm step sizes. PMID:11143353
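
    The stylus calibration described, rotating the tip about a fixed point and regressing for the tip offset, is the classic pivot calibration, solvable as a single linear least-squares problem. A self-contained sketch on synthetic, noise-free poses follows; the 55-mm offset and pivot location are invented for illustration.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def pivot_calibration(rotations, positions):
    """Solve R_i @ t + p_i = c for the tip offset t (sensor frame) and
    the fixed pivot point c (tracker frame), in the least-squares sense."""
    n = len(rotations)
    A = np.zeros((3 * n, 6))
    b = np.zeros(3 * n)
    for i, (R, p) in enumerate(zip(rotations, positions)):
        A[3*i:3*i+3, :3] = R
        A[3*i:3*i+3, 3:] = -np.eye(3)
        b[3*i:3*i+3] = -p
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x[:3], x[3:]

true_tip = np.array([0.0, 0.0, 55.0])     # a 55 mm stylus, sensor frame
pivot = np.array([100.0, 50.0, 20.0])     # fixed point on the board
Rs = Rotation.random(50, random_state=0).as_matrix()
ps = np.array([pivot - R @ true_tip for R in Rs])
tip, c = pivot_calibration(Rs, ps)
print(np.round(tip, 2), np.round(c, 2))   # recovers the offset and pivot
```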

  19. Evaluation of Temperature and Stress Distribution on 2 Different Post Systems Using 3-Dimensional Finite Element Analysis

    PubMed Central

    Değer, Yalçın; Adigüzel, Özkan; Özer, Senem Yiğit; Kaya, Sadullah; Polat, Zelal Seyfioğlu; Bozyel, Bejna

    2015-01-01

    Background The mouth is exposed to thermal irritation from hot and cold food and drinks. Thermal changes in the oral cavity produce expansions and contractions in tooth structures and restorative materials. The aim of this study was to investigate the effect of temperature and stress distribution on 2 different post systems using the 3-dimensional (3D) finite element method. Material/Methods The 3D finite element model shows a labio-lingual cross-sectional view of the endodontically treated upper right central incisor and supporting periodontal ligament with bone structures. Stainless steel and glass fiber post systems with different physical and thermal properties were modelled in the tooth restored with a composite core and ceramic crown. A 100 N static vertical occlusal load was applied to the center of the incisal surface of the tooth. Thermal loads of 0°C and 65°C were applied on the model for 5 s. Temperature and thermal stresses were determined on the labio-lingual section of the model at 6 different points. Results The distribution of stress, including thermal stress values, was calculated using 3D finite element analysis. The stainless steel post system produced higher temperatures and thermal stresses on the restorative materials, tooth structures, and posts than did the glass fiber-reinforced composite posts. Conclusions Thermal changes generated stresses in the restorative materials, tooth, and supporting structures. PMID:26615495
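
    The direction of this result can be anticipated from the constrained thermal-stress relation sigma = E * alpha * dT. A back-of-the-envelope sketch in Python; the moduli and expansion coefficients below are typical handbook values assumed for illustration, not values from the study:

    ```python
    # Fully constrained thermal stress: sigma = E * alpha * dT
    # Material constants are assumed textbook values, not study data.
    materials = {
        "stainless steel":       {"E": 200e9, "alpha": 17e-6},  # Pa, 1/K
        "glass fiber composite": {"E": 40e9,  "alpha": 10e-6},
    }
    dT = 65.0 - 0.0  # span of the thermal loads applied in the model, K

    for name, m in materials.items():
        sigma = m["E"] * m["alpha"] * dT  # Pa
        print(f"{name}: ~{sigma / 1e6:.0f} MPa")
    # stainless steel: ~221 MPa; glass fiber composite: ~26 MPa, an
    # order-of-magnitude illustration of why the steel post concentrates
    # more thermal stress.
    ```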

  20. Cost-Effectiveness Analysis of Intensity Modulated Radiation Therapy Versus 3-Dimensional Conformal Radiation Therapy for Anal Cancer

    SciTech Connect

    Hodges, Joseph C.; Beg, Muhammad S.; Das, Prajnan; Meyer, Jeffrey

    2014-07-15

    Purpose: To compare the cost-effectiveness of intensity modulated radiation therapy (IMRT) and 3-dimensional conformal radiation therapy (3D-CRT) for anal cancer and determine disease, patient, and treatment parameters that influence the result. Methods and Materials: A Markov decision model was designed with the various disease states for the base case of a 65-year-old patient with anal cancer treated with either IMRT or 3D-CRT and concurrent chemotherapy. Health states accounting for rates of local failure, colostomy failure, treatment breaks, patient prognosis, acute and late toxicities, and the utility of toxicities were informed by existing literature and analyzed with deterministic and probabilistic sensitivity analysis. Results: In the base case, mean costs and quality-adjusted life expectancy in years (QALY) for IMRT and 3D-CRT were $32,291 (4.81) and $28,444 (4.78), respectively, resulting in an incremental cost-effectiveness ratio of $128,233/QALY for IMRT compared with 3D-CRT. Probabilistic sensitivity analysis found that IMRT was cost-effective in 22%, 47%, and 65% of iterations at willingness-to-pay thresholds of $50,000, $100,000, and $150,000 per QALY, respectively. Conclusions: In our base model, IMRT was a cost-ineffective strategy despite the reduced acute treatment toxicities and their associated costs of management. The model outcome was sensitive to variations in local and colostomy failure rates, as well as patient-reported utilities relating to acute toxicities.
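
    The headline figure can be verified directly from the base-case means, since the incremental cost-effectiveness ratio is simply incremental cost over incremental effectiveness:

    ```python
    cost_imrt, qaly_imrt = 32291.0, 4.81
    cost_3dcrt, qaly_3dcrt = 28444.0, 4.78

    # ICER = (C_IMRT - C_3DCRT) / (QALY_IMRT - QALY_3DCRT)
    icer = (cost_imrt - cost_3dcrt) / (qaly_imrt - qaly_3dcrt)
    print(f"${icer:,.0f}/QALY")  # -> $128,233/QALY, matching the reported value
    ```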

  1. Analysis of shape and motion of the mitral annulus in subjects with and without cardiomyopathy by echocardiographic 3-dimensional reconstruction

    NASA Technical Reports Server (NTRS)

    Flachskampf, F. A.; Chandra, S.; Gaddipatti, A.; Levine, R. A.; Weyman, A. E.; Ameling, W.; Hanrath, P.; Thomas, J. D.

    2000-01-01

    The shape and dynamics of the mitral annulus of 10 patients without heart disease (controls), 3 patients with dilated cardiomyopathy, and 5 patients with hypertrophic obstructive cardiomyopathy and normal systolic function were analyzed by transesophageal echocardiography and 3-dimensional reconstruction. Mitral annular orifice area, apico-basal motion of the annulus, and nonplanarity were calculated over time. Annular area was largest in end diastole and smallest in end systole. Mean areas were 11.8 +/- 2.5 cm(2) (controls), 15.2 +/- 4.2 cm(2) (dilated cardiomyopathy), and 10.2 +/- 2.4 cm(2) (hypertrophic cardiomyopathy) (P = not significant). After correction for body surface, annuli from patients with normal left ventricular function were smaller than annuli from patients with dilated cardiomyopathy (5.9 +/- 1.2 cm(2)/m(2) vs 7.7 +/- 1.0 cm(2)/m(2); P <.02). The change in area during the cardiac cycle showed significant differences: 23.8% +/- 5.1% (controls), 13.2% +/- 2.3% (dilated cardiomyopathy), and 32.4% +/- 7.6% (hypertrophic cardiomyopathy) (P <.001). Apico-basal motion was highest in controls, followed by those with hypertrophic obstructive and dilated cardiomyopathy (1.0 +/- 0.3 cm, 0.8 +/- 0.2 cm, 0.3 +/- 0.2 cm, respectively; P <.01). Visual inspection and Fourier analysis showed a consistent pattern of anteroseptal and posterolateral elevations of the annulus toward the left atrium. In conclusion, although area changes and apico-basal motion of the mitral annulus strongly depend on left ventricular systolic function, nonplanarity is a structural feature preserved throughout the cardiac cycle in all three groups.

  2. Hydrogel Based 3-Dimensional (3D) System for Toxicity and High-Throughput (HTP) Analysis for Cultured Murine Ovarian Follicles.

    PubMed

    Zhou, Hong; Malik, Malika Amattullah; Arab, Aarthi; Hill, Matthew Thomas; Shikanov, Ariella

    2015-01-01

    Various toxicants, drugs and their metabolites carry potential ovarian toxicity. Ovarian follicles, the functional units of the ovary, are susceptible to this type of damage at all stages of their development. However, despite the large scale of potential negative impacts, assays that study ovarian toxicity are limited. Exposure of cultured ovarian follicles to toxicants of interest has served as an important tool for evaluation of toxic effects for decades. Mouse follicles cultured on the bottom of a culture dish continue to serve as an important approach for mechanistic studies. In this paper, we demonstrate the usefulness of a hydrogel-based 3-dimensional (3D) mouse ovarian follicle culture as a tool to study ovarian toxicity in a different setup. The 3D in vitro culture, based on a fibrin alginate interpenetrating network (FA-IPN), preserves the architecture of the ovarian follicle and the physiological structure-function relationship. We applied the novel 3D high-throughput (HTP) in vitro ovarian follicle culture system to study the ovotoxic effects of an anti-cancer drug, Doxorubicin (DXR). The fibrin component in the system is degraded by plasmin and appears as a clear circle around the encapsulated follicle. The degradation area of the follicle is strongly correlated with follicle survival and growth. To analyze fibrin degradation in a high-throughput manner, we created a custom MATLAB® code that converts brightfield micrographs of follicles encapsulated in FA-IPN to binary images, followed by image analysis. We did not observe any significant difference between manually processed images and the automated MATLAB® method, thereby confirming that the automated program is suitable for measuring fibrin degradation to evaluate follicle health. The cultured follicles were treated with DXR at concentrations ranging from 0.005 nM to 200 nM, corresponding to the therapeutic plasma levels of DXR in patients. Follicles treated with DXR demonstrated decreased survival rate in
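
    The authors' MATLAB® pipeline is not reproduced here, but the binarization-and-measurement step it describes can be sketched with open-source tools. The function below is a hypothetical re-implementation using scikit-image, and the assumption that the cleared (degraded) fibrin zone is brighter than the surrounding gel is ours:

    ```python
    import numpy as np
    from skimage.io import imread
    from skimage.filters import threshold_otsu

    def fibrin_degradation_area(image_path, pixel_size_um=1.0):
        """Estimate the cleared fibrin area around an encapsulated follicle.

        Converts a brightfield micrograph to a binary image by Otsu
        thresholding (cf. the custom MATLAB code described above) and
        returns the area of the bright, cleared region in um^2.
        """
        img = imread(image_path, as_gray=True)
        binary = img > threshold_otsu(img)  # True = cleared (assumed brighter)
        return binary.sum() * pixel_size_um ** 2
    ```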

  3. Hydrogel Based 3-Dimensional (3D) System for Toxicity and High-Throughput (HTP) Analysis for Cultured Murine Ovarian Follicles

    PubMed Central

    Zhou, Hong; Malik, Malika Amattullah; Arab, Aarthi; Hill, Matthew Thomas; Shikanov, Ariella

    2015-01-01

    Various toxicants, drugs and their metabolites carry potential ovarian toxicity. Ovarian follicles, the functional units of the ovary, are susceptible to this type of damage at all stages of their development. However, despite the large scale of potential negative impacts, assays that study ovarian toxicity are limited. Exposure of cultured ovarian follicles to toxicants of interest has served as an important tool for evaluation of toxic effects for decades. Mouse follicles cultured on the bottom of a culture dish continue to serve as an important approach for mechanistic studies. In this paper, we demonstrate the usefulness of a hydrogel-based 3-dimensional (3D) mouse ovarian follicle culture as a tool to study ovarian toxicity in a different setup. The 3D in vitro culture, based on a fibrin alginate interpenetrating network (FA-IPN), preserves the architecture of the ovarian follicle and the physiological structure-function relationship. We applied the novel 3D high-throughput (HTP) in vitro ovarian follicle culture system to study the ovotoxic effects of an anti-cancer drug, Doxorubicin (DXR). The fibrin component in the system is degraded by plasmin and appears as a clear circle around the encapsulated follicle. The degradation area of the follicle is strongly correlated with follicle survival and growth. To analyze fibrin degradation in a high-throughput manner, we created a custom MATLAB® code that converts brightfield micrographs of follicles encapsulated in FA-IPN to binary images, followed by image analysis. We did not observe any significant difference between manually processed images and the automated MATLAB® method, thereby confirming that the automated program is suitable for measuring fibrin degradation to evaluate follicle health. The cultured follicles were treated with DXR at concentrations ranging from 0.005 nM to 200 nM, corresponding to the therapeutic plasma levels of DXR in patients. Follicles treated with DXR demonstrated decreased survival rate in

  4. Influence of the implant diameter with different sizes of hexagon: analysis by 3-dimensional finite element method.

    PubMed

    Pellizzer, Eduardo Piza; Verri, Fellippo Ramos; de Moraes, Sandra Lúcia Dantas; Falcón-Antenucci, Rosse Mary; de Carvalho, Paulo Sérgio Perri; Noritomi, Pedro Yoshito

    2013-08-01

    The aim of this study was to evaluate the stress distribution in implants of regular platforms and of wide diameter with different sizes of hexagon by the 3-dimensional finite element method. We used simulated 3-dimensional models with the aid of Solidworks 2006 and Rhinoceros 4.0 software for the design of the implant and abutment and the InVesalius software for the design of the bone. Each model represented a block of bone from the mandibular molar region with an implant 10 mm in length and different diameters. Model A was an implant 3.75 mm/regular hexagon, model B was an implant 5.00 mm/regular hexagon, and model C was an implant 5.00 mm/expanded hexagon. A load of 200 N was applied in the axial, lateral, and oblique directions. At the implant, for each load direction (axial, lateral, and oblique), the 3 models presented stress concentration at the threads in the cervical and middle regions, and the stress was higher for model A. At the abutment, models A and B showed a similar stress distribution, concentrated at the cervical and middle thirds; model C showed the highest stresses. On the cortical bone, the stress was concentrated at the cervical region for the 3 models and was higher for model A. In the trabecular bone, the stresses were less intense, concentrated around the implant body, and more intense for model A. Among the models of wide diameter (models B and C), model B (implant 5.00 mm/regular hexagon) was more favorable with regard to distribution of stresses. Overall, model A (implant 3.75 mm/regular hexagon) showed the largest areas and the most intense stress, and model B (implant 5.00 mm/regular hexagon) showed a more favorable stress distribution. The highest stresses were observed under lateral loading.

  5. 3-dimensional sonographic analysis based on color flow Doppler and gray scale image data: a preliminary report.

    PubMed

    Pretorius, D H; Nelson, T R; Jaffe, J S

    1992-05-01

    This paper presents preliminary results of a technique that permits acquisition and display of three-dimensional (3D) anatomy using data collected from color flow Doppler and gray scale image sonography. 3D sonographic image data were acquired as two-dimensional planar images with commercially available equipment. A translational stage permitted the transducer position and orientation to be determined. Color flow sonographic video image data were digitized into a PC-AT computer along with transducer position and orientation information. Color flow velocity and gray scale data were separated, 3D filtered, and thresholded. A surface rendering program was used to define the vessel blood-lumen interface. Planar slices of arbitrary orientation and volume rendered images were displayed interactively on a graphics workstation. The technique was demonstrated in a lamb kidney in vitro and for the carotid artery at the bifurcation in vivo. Our results demonstrate the potential of 3D sonography as a technique for visualization of anatomy. Color flow data offer direct access to the vascular system, facilitating 3D analysis and display. 3D sonography offers potential advantages over existing diagnostic studies in that it is noninvasive, requires no intravenous contrast material, offers arbitrary plane extraction and review after the patient has completed the examination, and permits vascular anatomy to be visualized clearly via rendered images.
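
    The volume-assembly step at the heart of such tracked acquisition can be sketched as nearest-voxel binning of 2-D frames into a regular grid. The 4 x 4 pixel-to-world pose per frame below is an assumed stand-in for the translational-stage position and orientation described:

    ```python
    import numpy as np

    def bin_frames_into_volume(frames, poses, pixel_mm, grid_shape, voxel_mm):
        """Nearest-voxel binning of tracked 2-D ultrasound frames into a
        regular 3-D grid; each voxel stores the mean of contributing pixels."""
        acc = np.zeros(grid_shape)
        cnt = np.zeros(grid_shape, dtype=int)
        for img, pose in zip(frames, poses):      # pose: 4x4 pixel -> world (mm)
            v, u = np.mgrid[0:img.shape[0], 0:img.shape[1]]
            pix = np.stack([u.ravel() * pixel_mm,  # in-plane x
                            v.ravel() * pixel_mm,  # in-plane y
                            np.zeros(u.size),      # frame plane at z = 0
                            np.ones(u.size)])
            world = (pose @ pix)[:3]               # 3 x N world coordinates
            idx = np.round(world / voxel_mm).astype(int)
            ok = np.all((idx >= 0) & (idx < np.array(grid_shape)[:, None]), axis=0)
            np.add.at(acc, tuple(idx[:, ok]), img.ravel()[ok])
            np.add.at(cnt, tuple(idx[:, ok]), 1)
        return np.divide(acc, cnt, out=np.zeros_like(acc), where=cnt > 0)
    ```

    Real freehand reconstruction pipelines add interpolation or hole-filling between frames; the binning above is the simplest version of the idea.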

  6. Control Point Analysis comparison for 3 different treatment planning and delivery complexity levels using a commercial 3-dimensional diode array

    SciTech Connect

    Abdellatif, Ady; Gaede, Stewart

    2014-07-01

    To investigate the use of “Control Point Analysis” (Sun Nuclear Corporation, Melbourne, FL) to analyze and compare delivered volumetric-modulated arc therapy (VMAT) plans for 3 different treatment planning complexity levels. A total of 30 patients were chosen and fully anonymized for the purpose of this study. Overall, 10 lung stereotactic body radiotherapy (SBRT), 10 head-and-neck (H and N), and 10 prostate VMAT plans were generated on Pinnacle³ and delivered on a Varian linear accelerator (LINAC). The delivered dose was measured using ArcCHECK (Sun Nuclear Corporation, Melbourne, FL). Each plan was analyzed using “Sun Nuclear Corporation (SNC) Patient 6” and “Control Point Analysis.” Gamma passing percentage was used to assess the differences between the measured and planned dose distributions and to assess the role of various control point binning combinations. Of the different sites considered, the prostate cases reported the highest gamma passing percentages calculated with “SNC Patient 6” (97.5% to 99.2% for the 3%, 3 mm) and “Control Point Analysis” (95.4% to 98.3% for the 3%, 3 mm). The mean percentage of passing control point sectors for the prostate cases increased from 51.8 ± 7.8% for individual control points to 70.6 ± 10.5% for 5 control points binned together to 87.8 ± 11.0% for 10 control points binned together (2%, 2-mm passing criteria). Overall, there was an increasing trend in the percentage of sectors passing gamma analysis with an increase in the number of control points binned together in a sector for both gamma passing criteria (2%, 2 mm and 3%, 3 mm). Although many plans passed the clinical quality assurance criteria, plans involving the delivery of high Monitor Unit (MU)/control point (SBRT) and plans involving a high degree of modulation (H and N) showed less delivery accuracy per control point compared with plans with low MU/control point and a low degree of modulation (prostate).
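
    The gamma passing percentages quoted here are the standard gamma index of Low et al.; a brute-force 2-D sketch (global normalization) shows what a criterion such as 3%, 3 mm computes:

    ```python
    import numpy as np

    def gamma_pass_rate(ref, meas, spacing_mm, dd=0.03, dta_mm=3.0):
        """Brute-force global 2-D gamma analysis (after Low et al. 1998).

        For each measured point, gamma is the minimum over all reference
        points of sqrt(|dr|^2 / DTA^2 + dD^2 / (dd * Dmax)^2); a point
        passes when gamma <= 1. O(N^2), so suitable for small grids only.
        """
        ys, xs = np.mgrid[0:ref.shape[0], 0:ref.shape[1]]
        pts = np.stack([ys.ravel(), xs.ravel()], axis=1) * spacing_mm
        dmax = ref.max()
        gammas = np.empty(meas.size)
        for k, (p, d) in enumerate(zip(pts, meas.ravel())):
            dist2 = ((pts - p) ** 2).sum(axis=1) / dta_mm ** 2
            dose2 = ((ref.ravel() - d) / (dd * dmax)) ** 2
            gammas[k] = np.sqrt((dist2 + dose2).min())
        return 100.0 * np.mean(gammas <= 1.0)
    ```

    Commercial tools such as the ones named above use heavily optimized variants of this search, but the pass/fail definition is the same.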

  7. The Effects of Different Miniscrew Thread Designs and Force Directions on Stress Distribution by 3-dimensional Finite Element Analysis

    PubMed Central

    Fattahi, Hamidreza; Ajami, Shabnam; Nabavizadeh Rafsanjani, Ali

    2015-01-01

    Statement of the Problem The use of miniscrews as absolute anchorage devices in clinical orthodontics is increasing. Many attempts have been made to reduce the size, to improve the design, and to increase the stability of miniscrews. Purpose The purpose of this study was to determine the effects of different thread shapes and force directions of an orthodontic miniscrew on stress distribution in the supporting bone structure. Materials and Method A three-dimensional finite element analysis was used. A 200-cN force at three angles (0°, 45°, and 90°) was applied on the head of the miniscrew. The stress distribution among twelve thread shapes was investigated, categorized into four main groups: buttress, reverse buttress, square, and V-shape. Results Stress distribution was not significantly different among different thread shapes. The maximum amounts of bone stress at force angles 0°, 45°, and 90° were 38.90, 30.57 and 6.62 MPa, respectively. Analyzing the von Mises stress values showed that in all models, the maximum stress was concentrated on the lowest diameter of the shank, especially the part that was in the soft tissue and cervical cortical bone regions. Conclusion There was no relation between thread shapes and von Mises stress distribution in the bone; however, different force angles could affect the von Mises stress in the bone and miniscrew. PMID:26636123

  8. Effects of different abutment connection designs on the stress distribution around five different implants: a 3-dimensional finite element analysis.

    PubMed

    Balik, Ali; Karatas, Meltem Ozdemir; Keskin, Haluk

    2012-09-01

    The stability of the bone-implant interface is required for the long-term favorable clinical outcome of implant-supported prosthetic rehabilitation. The implant failures that occur after functional loading are mainly related to biomechanical factors. Micromovements and vibrations due to occlusal forces can lead to mechanical complications such as loosening of the screw and fractures of the abutment or implants. The aim of this study was to investigate the strain distributions in the connection areas of different implant-abutment connection systems under similar loading conditions. Five different implant-abutment connection designs from 5 different manufacturers were evaluated in this study. The investigation was performed with software using the finite element method. The geometrical modeling of the implant systems was done with CATIA virtual design software. The MSC NASTRAN solver and PATRAN postprocessing program were used to perform the linear static solution. According to the analysis, the implant-abutment connection system with an external hexagonal connection showed the highest strain values, and the internal hexagonal implant-abutment connection system showed the lowest strain values. The conical plus internal hexagonal, screw-in implant-abutment connection interface was more successful than the other systems in cases with increased vertical dimension, particularly in the posterior region.

  9. 3-dimensional magnetotelluric inversion including topography using deformed hexahedral edge finite elements and direct solvers parallelized on symmetric multiprocessor computers - Part II: direct data-space inverse solution

    NASA Astrophysics Data System (ADS)

    Kordy, M.; Wannamaker, P.; Maris, V.; Cherkaev, E.; Hill, G.

    2016-01-01

    Following the creation described in Part I of a deformable edge finite-element simulator for 3-D magnetotelluric (MT) responses using direct solvers, in Part II we develop an algorithm named HexMT for 3-D regularized inversion of MT data including topography. Direct solvers parallelized on large-RAM, symmetric multiprocessor (SMP) workstations are used also for the Gauss-Newton model update. By exploiting the data-space approach, the computational cost of the model update becomes much less in both time and computer memory than the cost of the forward simulation. In order to regularize using the second norm of the gradient, we factor the matrix related to the regularization term and apply its inverse to the Jacobian, which is done using the MKL PARDISO library. For dense matrix multiplication and factorization related to the model update, we use the PLASMA library which shows very good scalability across processor cores. A synthetic test inversion using a simple hill model shows that including topography can be important; in this case depression of the electric field by the hill can cause false conductors at depth or mask the presence of resistive structure. With a simple model of two buried bricks, a uniform spatial weighting for the norm of model smoothing recovered more accurate locations for the tomographic images compared to weightings which were a function of parameter Jacobians. We implement joint inversion for static distortion matrices tested using the Dublin secret model 2, for which we are able to reduce nRMS to ~1.1 while avoiding oscillatory convergence. Finally we test the code on field data by inverting full impedance and tipper MT responses collected around Mount St Helens in the Cascade volcanic chain. Among several prominent structures, the north-south trending, eruption-controlling shear zone is clearly imaged in the inversion.
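
    The data-space economy mentioned above rests on a standard linear-algebra identity: the regularized Gauss-Newton step can be rearranged so that the matrix to factor is only data-by-data in size. A toy NumPy illustration of the generic identity (not the HexMT implementation):

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    n_m, n_d = 2000, 50                    # many model parameters, few data
    J = rng.standard_normal((n_d, n_m))    # Jacobian
    r = rng.standard_normal(n_d)           # data residual
    lam = 0.1
    R = np.eye(n_m)                        # regularization matrix L^T L (toy: identity)

    # Model-space step: factor an n_m x n_m matrix
    dm_model = np.linalg.solve(J.T @ J + lam * R, J.T @ r)

    # Data-space step, using the identity
    #   (J^T J + lam R)^-1 J^T = R^-1 J^T (J R^-1 J^T + lam I)^-1,
    # so only an n_d x n_d matrix needs factoring
    Rinv_Jt = np.linalg.solve(R, J.T)
    dm_data = Rinv_Jt @ np.linalg.solve(J @ Rinv_Jt + lam * np.eye(n_d), r)

    print(np.allclose(dm_model, dm_data))  # True: identical step, smaller solve
    ```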

  10. Computation of synthetic seismograms in a 3 dimensional Earth and inversion of eigenfrequency and Q quality factor datasets of normal modes

    NASA Astrophysics Data System (ADS)

    Roch, Julien; Clevede, Eric; Roult, Genevieve

    2010-05-01

    The 26 December 2004 Sumatra-Andaman event is the third biggest earthquake ever recorded, but the first recorded with high-quality broad-band seismometers. Such an earthquake offered a good opportunity for studying the normal modes of the Earth, and particularly the gravest ones (frequency lower than 1 mHz), which provide important information on the deep Earth. The splitting of some modes has been carefully analyzed. The eigenfrequencies and the Q quality factors of particular singlets have been retrieved with an unprecedented precision. In some cases, the eigenfrequencies of some singlets exhibit a clear shift when compared to the theoretical eigenfrequencies. Some core modes, such as the 3S2 mode, present an anomalous splitting, that is to say, a splitting width much larger than the expected one. Such anomalous splitting is generally attributed to the existence of lateral heterogeneities in the inner core. We need an accurate model of the whole Earth and a method to compute synthetic seismograms in order to compare synthetic and observed data and to explain the behavior of such modes. Synthetic seismograms are computed by normal-mode summation using a perturbative method developed up to second order in amplitude and up to third order in frequency (HOPT method). The last step consists of inverting both eigenfrequency and Q quality factor datasets in order to better constrain the deep Earth structure, and especially the inner core. In order to find models of acceptable data fit in a multidimensional parameter space, we use the neighborhood algorithm, a derivative-free search method. It is particularly well suited to our case (a nonlinear problem) and is easy to tune, with only 2 parameters. Our purpose is to find an ensemble of models that fit the data rather than a unique model.

  11. Computer analysis of arteriograms

    NASA Technical Reports Server (NTRS)

    Selzer, R. H.; Armstrong, J. H.; Beckenbach, E. B.; Blankenhorn, D. H.; Crawford, D. W.; Brooks, S. H.; Sanmarco, M. E.

    1977-01-01

    A computer system has been developed to quantify the degree of atherosclerosis in the human femoral artery. The analysis involves first scanning and digitizing angiographic film, then tracking the outline of the arterial image and finally computing the relative amount of roughness or irregularity in the vessel wall. The image processing system and method are described.
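
    One plausible reading of the roughness computation is the residual deviation of the tracked wall outline from a smooth low-order trend; the metric below is illustrative only, not the authors' exact definition:

    ```python
    import numpy as np

    def edge_roughness(edge_positions, order=3):
        """Roughness of a tracked vessel wall: RMS residual about a low-order
        polynomial trend, normalized by the mean wall position (dimensionless).
        `edge_positions` holds the wall x-coordinate at each scan line."""
        s = np.arange(len(edge_positions), dtype=float)
        trend = np.polyval(np.polyfit(s, edge_positions, order), s)
        rms = np.sqrt(np.mean((edge_positions - trend) ** 2))
        return rms / np.mean(edge_positions)
    ```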

  12. Frontal soft tissue analysis using a 3 dimensional camera following two-jaw rotational orthognathic surgery in skeletal class III patients.

    PubMed

    Choi, Jong Woo; Lee, Jang Yeol; Oh, Tae-Suk; Kwon, Soon Man; Yang, Sung Joon; Koh, Kyung Suk

    2014-04-01

    Although two dimensional cephalometry is the standard method for analyzing the results of orthognathic surgery, it has potential limits in frontal soft tissue analysis. We have utilized a 3 dimensional camera to examine changes in soft tissue landmarks in patients with skeletal class III dentofacial deformity who underwent two-jaw rotational setback surgery. We assessed 25 consecutive Asian patients (mean age, 22 years; range, 17-32 years) with skeletal class III dentofacial deformities who underwent two-jaw rotational surgery without maxillary advancement. Using a 3D camera, we analyzed changes in facial proportions, including vertical and horizontal dimensions, facial surface areas, nose profile, lip contour, and soft tissue cheek convexity, as well as landmarks related to facial symmetry. The average mandibular setback was 10.7 mm (range: 5-17 mm). The average SNA changed from 77.4° to 77.8°, the average SNB from 89.2° to 81.1°, and the average occlusal plane from 8.7° to 11.4°. The mid third vertical dimension changed from 58.8 mm to 57.8 mm (p = 0.059), and the lower third vertical dimension changed from 70.4 mm to 68.2 mm (p = 0.0006). The average bigonial width decreased from 113.5 mm to 109.2 mm (p = 0.0028), the alar width increased from 34.7 mm to 36.1 mm (p = 0.0002), and lip length was unchanged. Mean mid and lower facial surface areas decreased significantly, from 171.8 cm(2) to 166.2 cm(2) (p = 0.026) and from 71.23 cm(2) to 61.9 cm(2) (p < 0.0001), respectively. Cheek convexity increased significantly, from 171.8° to 155.9° (p = 0.0007). The 3D camera was effective in frontal soft tissue analysis for orthognathic surgery, and enabled quantitative analysis of changes in frontal soft tissue landmarks and facial proportions that were not possible with conventional 2D cephalometric analysis.

  13. Comparative Analysis of Visitors' Experiences and Knowledge Acquisition between a 3Dimensional Online and a Real-World Art Museum Tour

    ERIC Educational Resources Information Center

    D' Alba, Adriana; Jones, Greg; Wright, Robert

    2015-01-01

    This paper discusses a study conducted in the fall of 2011 and the spring of 2012 which explored the use of existing 3D virtual environment technologies by bringing a selected permanent museum exhibit displayed at a museum located in central Mexico into an online 3Dimensional experience. Using mixed methods, the research study analyzed knowledge…

  14. General design method for 3-dimensional, potential flow fields. Part 2: Computer program DIN3D1 for simple, unbranched ducts

    NASA Technical Reports Server (NTRS)

    Stanitz, J. D.

    1985-01-01

    The general design method for three-dimensional, potential, incompressible or subsonic-compressible flow developed in part 1 of this report is applied to the design of simple, unbranched ducts. A computer program, DIN3D1, is developed and five numerical examples are presented: a nozzle, two elbows, an S-duct, and the preliminary design of a side inlet for turbomachines. The two major inputs to the program are the upstream boundary shape and the lateral velocity distribution on the duct wall. As a result of these inputs, boundary conditions are overprescribed and the problem is ill posed. However, it appears that there are degrees of compatibility between these two major inputs and that, for reasonably compatible inputs, satisfactory solutions can be obtained. By not prescribing the shape of the upstream boundary, the problem presumably becomes well posed, but it is not clear how to formulate a practical design method under this circumstance. Nor does it appear desirable, because the designer usually needs to retain control over the upstream (or downstream) boundary shape. The problem is further complicated by the fact that, unlike the two-dimensional case, and irrespective of the upstream boundary shape, some prescribed lateral velocity distributions do not have proper solutions.

  15. Computational engine structural analysis

    NASA Technical Reports Server (NTRS)

    Chamis, C. C.; Johns, R. H.

    1986-01-01

    A significant research activity at the NASA Lewis Research Center is the computational simulation of complex multidisciplinary engine structural problems. This simulation is performed using computational engine structural analysis (CESA), which consists of integrated multidisciplinary computer codes in conjunction with computer post-processing for problem-specific applications. A variety of the computational simulations of specific cases are described in some detail in this paper. These case studies include: (1) aeroelastic behavior of bladed rotors, (2) high velocity impact of fan blades, (3) blade-loss transient response, (4) rotor/stator/squeeze-film/bearing interaction, (5) blade-fragment/rotor-burst containment, and (6) structural behavior of advanced swept turboprops. These representative case studies are selected to demonstrate the breadth of the problems analyzed and the role of the computer, including post-processing and graphical display of voluminous output data.

  16. Evaluation of the middle cerebral artery occlusion techniques in the rat by in-vitro 3-dimensional micro- and nano computed tomography

    PubMed Central

    2010-01-01

    Background Animal models of focal cerebral ischemia are widely used in stroke research. The purpose of our study was to evaluate and compare the cerebral macro- and microvascular architecture of rats in two different models of permanent middle cerebral artery occlusion using an innovative quantitative micro- and nano-CT imaging technique. Methods Four hours of middle cerebral artery occlusion were performed in rats using the macrosphere method or the suture technique. After contrast perfusion, brains were isolated and scanned en bloc using micro-CT at (8 μm)³ or nano-CT at (500 nm)³ voxel size to generate 3D images of the cerebral vasculature. The arterial vascular volume fraction and gray scale attenuation were determined, and the significance of differences in measurements was tested with analysis of variance [ANOVA]. Results Micro-CT provided quantitative information on vascular morphology. Micro- and nano-CT proved able to visualize and differentiate the vascular occlusion territories produced in both models of cerebral ischemia. The suture technique leads to a remarkable decrease in the intravascular volume fraction of the middle cerebral artery perfusion territory. Blocking the middle cerebral artery with macrospheres, the vascular volume fraction of the involved hemisphere decreased significantly (p < 0.001), independently of the number of macrospheres, and was comparable to the suture method. We established gray scale measurements by which focal cerebral ischemia could be radiographically categorized (p < 0.001). Nano-CT imaging demonstrates collateral perfusion related to different occluded vessel territories after macrosphere perfusion. Conclusion Micro- and nano-CT imaging is feasible for analysis and differentiation of different models of focal cerebral ischemia in rats. PMID:20509884

  17. Reconstruction 3-dimensional image from 2-dimensional image of status optical coherence tomography (OCT) for analysis of changes in retinal thickness

    SciTech Connect

    Arinilhaq,; Widita, Rena

    2014-09-30

    Optical coherence tomography is often used in medical image acquisition for diagnosing retinal changes because it is easy to use and low in cost. Unfortunately, this type of examination produces only a two-dimensional retinal image at the point of acquisition. Therefore, this study developed a method that combines and reconstructs 2-dimensional retinal images into three-dimensional images to display the volumetric macula accurately. The system is built in three main stages: data acquisition, data extraction, and 3-dimensional reconstruction. At the data acquisition step, optical coherence tomography produced six *.jpg images for each patient, which were then extracted with MATLAB 2010a software into six one-dimensional arrays. The six arrays were combined into a 3-dimensional matrix using a kriging interpolation method with SURFER9, resulting in 3-dimensional graphics of the macula. Finally, the system provides three-dimensional color graphs based on the data distribution of the normal macula. The reconstruction system produces three-dimensional images with a size of 481 × 481 × h (retinal thickness) pixels.
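
    The interpolation step can be sketched with SciPy's scattered-data interpolator standing in for SURFER9's kriging (griddata is a plain substitute, not a kriging implementation, and the function below is a hypothetical reconstruction of the described pipeline):

    ```python
    import numpy as np
    from scipy.interpolate import griddata

    def macular_thickness_surface(xs, ys, thickness, grid_n=481):
        """Interpolate scattered thickness samples (e.g. from the six scan
        lines) onto a regular grid_n x grid_n grid, cf. the kriging step."""
        gx, gy = np.meshgrid(np.linspace(xs.min(), xs.max(), grid_n),
                             np.linspace(ys.min(), ys.max(), grid_n))
        return griddata((xs, ys), thickness, (gx, gy), method='cubic')
    ```

    An ordinary-kriging library (e.g. pykrige) would be closer to the original method; the cubic interpolant here simply illustrates the scattered-to-grid step.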

  18. Incorporating 3-dimensional models in online articles

    PubMed Central

    Cevidanes, Lucia H. S.; Ruellasa, Antonio C. O.; Jomier, Julien; Nguyen, Tung; Pieper, Steve; Budin, Francois; Styner, Martin; Paniagua, Beatriz

    2015-01-01

    Introduction The aims of this article were to introduce the capability to view and interact with 3-dimensional (3D) surface models in online publications, and to describe how to prepare surface models for such online 3D visualizations. Methods Three-dimensional image analysis methods include image acquisition, construction of surface models, registration in a common coordinate system, visualization of overlays, and quantification of changes. Cone-beam computed tomography scans were acquired as volumetric images that can be visualized as 3D projected images or used to construct polygonal meshes or surfaces of specific anatomic structures of interest. The anatomic structures of interest in the scans can be labeled with color (3D volumetric label maps), and then the scans are registered in a common coordinate system using a target region as the reference. The registered 3D volumetric label maps can be saved in .obj, .ply, .stl, or .vtk file formats and used for overlays, quantification of differences in each of the 3 planes of space, or color-coded graphic displays of 3D surface distances. Results All registered 3D surface models in this study were saved in .vtk file format and loaded in the Elsevier 3D viewer. In this study, we describe possible ways to visualize the surface models constructed from cone-beam computed tomography images using 2D and 3D figures. The 3D surface models are available in the article’s online version for viewing and downloading using the reader’s software of choice. These 3D graphic displays are represented in the print version as 2D snapshots. Overlays and color-coded distance maps can be displayed using the reader’s software of choice, allowing graphic assessment of the location and direction of changes or morphologic differences relative to the structure of reference. The interpretation of 3D overlays and quantitative color-coded maps requires basic knowledge of 3D image analysis. Conclusions When submitting manuscripts, authors can
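
    Of the file formats listed, ASCII STL is simple enough to write by hand, which makes the export step concrete. A minimal sketch for a triangle mesh given as NumPy arrays (production work would normally use VTK or a similar library):

    ```python
    import numpy as np

    def write_ascii_stl(path, vertices, faces, name="surface"):
        """Write a triangle mesh (vertices: n x 3 array, faces: m x 3 vertex
        indices) as an ASCII STL file."""
        with open(path, "w") as f:
            f.write(f"solid {name}\n")
            for tri in faces:
                a, b, c = (vertices[i] for i in tri)
                n = np.cross(b - a, c - a)
                n = n / (np.linalg.norm(n) or 1.0)  # unit facet normal
                f.write(f"  facet normal {n[0]:e} {n[1]:e} {n[2]:e}\n")
                f.write("    outer loop\n")
                for v in (a, b, c):
                    f.write(f"      vertex {v[0]:e} {v[1]:e} {v[2]:e}\n")
                f.write("    endloop\n  endfacet\n")
            f.write(f"endsolid {name}\n")
    ```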

  19. [3-dimensional documentation of wound-healing].

    PubMed

    Körber, A; Grabbe, S; Dissemond, J

    2006-04-01

    The objective evaluation of the course of wound healing is an essential parameter for quality assurance in the modern management of chronic wounds. Established procedures are based exclusively on two-dimensional measurement of the wound surface by planimetry or digital photo documentation combined with a metric statement of size. An objective method for evaluating the volume of chronic wounds has thus been lacking. By combining digital photography, an optical grid projected by a digital scanner, and image-processing software developed in cooperation with the company RSI, we were able to produce an accurate 3-dimensional documentation of chronic wounds (DigiSkin). The generated scatter plots allow a visual, computer-assisted 3-dimensional measurement and documentation of chronic wounds. In contrast to available systems, it is now possible for the first time to objectively quantify the volume changes of a chronic wound. On the basis of a case report of a female patient with a venous leg ulcer, treated with vacuum closure therapy before and after mesh-graft transplantation, we describe the advantages and the resulting scientific value of this new, objective wound documentation system in clinical use. PMID:16575675

  1. Computational Analysis of Behavior.

    PubMed

    Egnor, S E Roian; Branson, Kristin

    2016-07-01

    In this review, we discuss the emerging field of computational behavioral analysis-the use of modern methods from computer science and engineering to quantitatively measure animal behavior. We discuss aspects of experiment design important to both obtaining biologically relevant behavioral data and enabling the use of machine vision and learning techniques for automation. These two goals are often in conflict. Restraining or restricting the environment of the animal can simplify automatic behavior quantification, but it can also degrade the quality or alter important aspects of behavior. To enable biologists to design experiments to obtain better behavioral measurements, and computer scientists to pinpoint fruitful directions for algorithm improvement, we review known effects of artificial manipulation of the animal on behavior. We also review machine vision and learning techniques for tracking, feature extraction, automated behavior classification, and automated behavior discovery, the assumptions they make, and the types of data they work best with. PMID:27090952

  2. Design of 3-dimensional complex airplane configurations with specified pressure distribution via optimization

    NASA Technical Reports Server (NTRS)

    Kubrynski, Krzysztof

    1991-01-01

    A subcritical panel method applied to flow analysis and aerodynamic design of complex aircraft configurations is presented. The analysis method is based on linearized, compressible, subsonic flow equations and indirect Dirichlet boundary conditions. Quadratic dipole and linear source distributions on flat panels are applied. In the case of aerodynamic design, the geometry which minimizes differences between the design and actual pressure distributions is found iteratively, using a numerical optimization technique. Geometry modifications are modeled by the surface transpiration concept. Constraints with respect to the resulting geometry can be specified. A number of complex 3-dimensional design examples are presented. The software has been adapted to personal computers and, as a result, an unexpectedly low cost of computation is obtained.

  3. Teleportation of a 3-dimensional GHZ State

    NASA Astrophysics Data System (ADS)

    Cao, Hai-Jing; Wang, Huai-Sheng; Li, Peng-Fei; Song, He-Shan

    2012-05-01

    The process of teleportation of a completely unknown 3-dimensional GHZ state is considered. Three maximally entangled 3-dimensional Bell states function as the quantum channel in the scheme. This teleportation scheme can be directly generalized to teleport an unknown d-dimensional GHZ state.

  4. Quantitative analysis of aortic regurgitation: real-time 3-dimensional and 2-dimensional color Doppler echocardiographic method--a clinical and a chronic animal study

    NASA Technical Reports Server (NTRS)

    Shiota, Takahiro; Jones, Michael; Tsujino, Hiroyuki; Qin, Jian Xin; Zetts, Arthur D.; Greenberg, Neil L.; Cardon, Lisa A.; Panza, Julio A.; Thomas, James D.

    2002-01-01

    BACKGROUND: For evaluating patients with aortic regurgitation (AR), regurgitant volumes, left ventricular (LV) stroke volumes (SV), and absolute LV volumes are valuable indices. AIM: The aim of this study was to validate the combination of real-time 3-dimensional echocardiography (3DE) and semiautomated digital color Doppler cardiac flow measurement (ACM) for quantifying absolute LV volumes, LVSV, and AR volumes using an animal model of chronic AR and to investigate its clinical applicability. METHODS: In 8 sheep, a total of 26 hemodynamic states were obtained pharmacologically 20 weeks after the aortic valve noncoronary (n = 4) or right coronary (n = 4) leaflet was incised to produce AR. Reference standard LVSV and AR volume were determined using the electromagnetic flow method (EM). Simultaneous epicardial real-time 3DE studies were performed to obtain LV end-diastolic volumes (LVEDV), end-systolic volumes (LVESV), and LVSV by subtracting LVESV from LVEDV. Simultaneous ACM was performed to obtain LVSV and transmitral flows; AR volume was calculated by subtracting transmitral flow volume from LVSV. In a total of 19 patients with AR, real-time 3DE and ACM were used to obtain LVSVs and these were compared with each other. RESULTS: A strong relationship was found between LVSV derived from EM and those from the real-time 3DE (r = 0.93, P <.001, mean difference (3D - EM) = -1.0 +/- 9.8 mL). A good relationship between LVSV and AR volumes derived from EM and those by ACM was found (r = 0.88, P <.001). A good relationship between LVSV derived from real-time 3DE and that from ACM was observed (r = 0.73, P <.01, mean difference = 2.5 +/- 7.9 mL). In patients, a good relationship between LVSV obtained by real-time 3DE and ACM was found (r = 0.90, P <.001, mean difference = 0.6 +/- 9.8 mL). CONCLUSION: The combination of ACM and real-time 3DE for quantifying LV volumes, LVSV, and AR volumes was validated by the chronic animal study and was shown to be clinically applicable.

  5. Cardiothoracic Applications of 3-dimensional Printing.

    PubMed

    Giannopoulos, Andreas A; Steigner, Michael L; George, Elizabeth; Barile, Maria; Hunsaker, Andetta R; Rybicki, Frank J; Mitsouras, Dimitris

    2016-09-01

    Medical 3-dimensional (3D) printing is emerging as a clinically relevant imaging tool in directing preoperative and intraoperative planning in many surgical specialties and will therefore likely lead to interdisciplinary collaboration between engineers, radiologists, and surgeons. Data from standard imaging modalities such as computed tomography, magnetic resonance imaging, echocardiography, and rotational angiography can be used to fabricate life-sized models of human anatomy and pathology, as well as patient-specific implants and surgical guides. Cardiovascular 3D-printed models can improve diagnosis and allow for advanced preoperative planning. The majority of applications reported involve congenital heart diseases and valvular and great vessels pathologies. Printed models are suitable for planning both surgical and minimally invasive procedures. Added value has been reported toward improving outcomes, minimizing perioperative risk, and developing new procedures such as transcatheter mitral valve replacements. Similarly, thoracic surgeons are using 3D printing to assess invasion of vital structures by tumors and to assist in diagnosis and treatment of upper and lower airway diseases. Anatomic models enable surgeons to assimilate information more quickly than image review, choose the optimal surgical approach, and achieve surgery in a shorter time. Patient-specific 3D-printed implants are beginning to appear and may have significant impact on cosmetic and life-saving procedures in the future. In summary, cardiothoracic 3D printing is rapidly evolving and may be a potential game-changer for surgeons. The imager who is equipped with the tools to apply this new imaging science to cardiothoracic care is thus ideally positioned to innovate in this new emerging imaging modality.

  6. The 3-Dimensional Structure of Galaxy Clusters

    NASA Astrophysics Data System (ADS)

    King, Lindsay

    NASA's Hubble Space Telescope Multi-Cycle Treasury Program CLASH (PI Postman) has provided the community with the most detailed views ever of the central regions of massive galaxy clusters. These galaxy clusters have also been observed with NASA's Chandra X-Ray Observatory, with the ground-based Subaru telescope, and with other ground- and space-based facilities, resulting in unprecedented multi-wavelength data sets of the most massive bound structures in the universe. Fitting 3-Dimensional mass models is crucial to understanding how mass is distributed in individual clusters, investigating the properties of dark matter, and testing our cosmological model. With the exquisite data available, the time is now ideal to undertake this analysis. We propose to use algorithms that we have developed and obtain mass models for the clusters from the CLASH sample. The project would use archival gravitational lensing data, X-ray data of the cluster's hot gas and additional constraints from Sunyaev-Zel'dovich (SZ) data. Specifically, we would model the 23 clusters for which both HST and Subaru data (or in one case WFI data) are publicly available, since the exquisite imaging of HST in the clusters' central regions is beautifully augmented by the wide field coverage of Subaru imaging. If the true 3-D shapes of clusters are not properly accounted for when analysing data, this can lead to inaccuracies in the mass density profiles of individual clusters - up to 50% bias in mass for the most highly triaxial systems. Our proposed project represents an independent analysis of the CLASH sample, complementary to that of the CLASH team, probing the triaxial shapes and orientations of the cluster dark matter halos and hot gas. Our findings will be relevant to the analysis of data from future missions such as JWST and Euclid, and also to ground-based surveys to be made with telescopes such as LSST.

  7. Computed tomography-based finite element analysis to assess fracture risk and osteoporosis treatment

    PubMed Central

    Imai, Kazuhiro

    2015-01-01

    Finite element analysis (FEA) is a computer technique for structural stress analysis developed in engineering mechanics. Over the past 40 years, FEA has been developed to investigate the structural behavior of human bones. As faster computers became available, improved FEA using 3-dimensional computed tomography (CT) was developed. This CT-based finite element analysis (CT/FEA) has provided clinicians with useful data. In this review, the mechanism of CT/FEA, validation studies of CT/FEA evaluating its accuracy and reliability in human bones, and clinical application studies assessing fracture risk and the effects of osteoporosis medication are overviewed. PMID:26309819
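
    A key preprocessing step in CT/FEA is the voxel-wise mapping from Hounsfield units to apparent density and then to Young's modulus through an empirical power law. A sketch with placeholder coefficients (calibration constants vary by scanner, anatomical site, and study, so all values below are assumptions):

    ```python
    import numpy as np

    def hu_to_youngs_modulus(hu, a=0.0008, b=0.1, c=8.0, p=1.7):
        """Map CT Hounsfield units to Young's modulus (GPa).

        rho = a * HU + b   # apparent density, g/cm^3 (linear calibration)
        E   = c * rho**p   # empirical power law
        All four coefficients are illustrative placeholders.
        """
        rho = np.clip(a * np.asarray(hu, dtype=float) + b, 1e-6, None)
        return c * rho ** p

    print(hu_to_youngs_modulus(1500))  # ~12.5 GPa for a cortical-like voxel
    ```

    The resulting element-wise moduli are what the finite element solver then uses to compute bone strength and fracture risk.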

  8. A 3-Dimensional Analysis of the Galactic Gamma-Ray Emission Resulting from Cosmic-Ray Interactions with the Interstellar Gas and Radiation Fields

    NASA Technical Reports Server (NTRS)

    Sodroski, Thomas J.; Dwek, Eli (Technical Monitor)

    2001-01-01

    The contractor will provide support for the analysis of data under ADP (NRA 96-ADP-09; Proposal No. 167-96adp). The primary task objective is to construct a 3-D model for the distribution of high-energy (20 MeV - 30 GeV) gamma-ray emission in the Galactic disk. Under this task the contractor will utilize data from the EGRET instrument on the Compton Gamma-Ray Observatory, H I and CO surveys, radio-continuum surveys at 408 MHz, 1420 MHz, 5 GHz, and 19 GHz, the COBE Diffuse Infrared Background Experiment (DIRBE) all-sky maps from 1 to 240 μm, and ground-based B, V, J, H, and K photometry. The respective contributions to the gamma-ray emission from cosmic ray/matter interactions, inverse Compton scattering, and extragalactic emission will be determined.

  9. Sensitivity analysis in computational aerodynamics

    NASA Technical Reports Server (NTRS)

    Bristow, D. R.

    1984-01-01

    Information on sensitivity analysis in computational aerodynamics is given in outline, graphical, and chart form. The prediction accuracy of the MCAERO program, a perturbation analysis method, is discussed. A procedure for calculating the perturbation matrix, baseline wing paneling for perturbation-analysis test cases, and applications of an inviscid sensitivity matrix are among the topics covered.

  10. Movement within foot and ankle joint in children with spastic cerebral palsy: a 3-dimensional ultrasound analysis of medial gastrocnemius length with correction for effects of foot deformation

    PubMed Central

    2013-01-01

    Background In spastic cerebral palsy (SCP), a limited range of motion (ROM) of the foot limits gait and other activities. Assessment of this limitation of ROM and knowledge of the active mechanisms are of crucial importance for clinical treatment. Methods For a comparison between children with spastic cerebral palsy (SCP) and typically developing children (TD), medial gastrocnemius muscle-tendon complex length was assessed using 3-D ultrasound imaging techniques, while exerting externally standardized moments via a hand-held dynamometer. Exemplary X-ray imaging of the ankle and foot was used to confirm possible TD-SCP differences in foot deformation. Results SCP and TD children did not differ in the normalized level of excitation (EMG) of the muscles studied. For given exerted moments, foot plate angles in SCP were all more towards plantar flexion than in TD. However, foot plate angle proved to be an invalid estimator of talocrural joint angle, since at equal foot plate angles the GM muscle-tendon complex was shorter in SCP (corresponding to an equivalent of 1 cm). A substantial difference remained even after normalizing for individual differences in tibia length. X-ray imaging of the ankle and foot of one SCP child and two typically developed adults confirmed that in SCP, of the total foot plate angle change (0-4 Nm: 15°), the contribution of foot deformation (8°) was as big as the contribution of dorsal flexion at the talocrural joint (7°). In typically developed individuals, foot deformation made relatively smaller contributions (10-11%) to changes in foot plate angle, indicating that the contribution of talocrural angle changes was most important. Using a new estimate for position at the talocrural joint (the difference between GM muscle-tendon complex length and tibia length, GM relative length) removed this effect, thus allowing a fairer comparison of SCP and TD data. On the basis of analysis of foot plate angle and GM relative length as a function

  11. Computer analysis of railcar vibrations

    NASA Technical Reports Server (NTRS)

    Vlaminck, R. R.

    1975-01-01

    Computer models and techniques for calculating railcar vibrations are discussed along with criteria for vehicle ride optimization. The effect on vibration of car body structural dynamics, suspension system parameters, vehicle geometry, and wheel and rail excitation are presented. Ride quality vibration data collected on the state-of-the-art car and standard light rail vehicle is compared to computer predictions. The results show that computer analysis of the vehicle can be performed for relatively low cost in short periods of time. The analysis permits optimization of the design as it progresses and minimizes the possibility of excessive vibration on production vehicles.

  12. 3-dimensional imaging at nanometer resolutions

    DOEpatents

    Werner, James H.; Goodwin, Peter M.; Shreve, Andrew P.

    2010-03-09

    An apparatus and method for enabling precise, 3-dimensional, photoactivation localization microscopy (PALM) using selective, two-photon activation of fluorophores in a single z-slice of a sample in cooperation with time-gated imaging for reducing the background radiation from other image planes to levels suitable for single-molecule detection and spatial location, are described.

  13. Distributed Design and Analysis of Computer Experiments

    2002-11-11

    DDACE is a C++ object-oriented software library for the design and analysis of computer experiments. DDACE can be used to generate samples from a variety of sampling techniques. These samples may be used as input to an application code. DDACE also contains statistical tools such as response surface models and correlation coefficients to analyze input/output relationships between variables in an application code. DDACE can generate input values for uncertain variables within a user's application. For example, a user might like to vary a temperature variable as well as some material variables in a series of simulations. Through the series of simulations the user might be looking for optimal settings of parameters based on some user criteria, or the user may be interested in the sensitivity to input variability shown by an output variable. In either case, the user may provide information about the suspected ranges and distributions of a set of input variables, along with a sampling scheme, and DDACE will generate input points based on these specifications. The input values generated by DDACE and the one or more outputs computed through the user's application code can be analyzed with a variety of statistical methods. This can lead to a wealth of information about the relationships between the variables in the problem. While statistical and mathematical packages may be employed to carry out the analysis of the input/output relationships, DDACE also contains some tools for analyzing the simulation data. DDACE incorporates a software package called MARS (Multivariate Adaptive Regression Splines), developed by Jerome Friedman. MARS is used for generating a spline surface fit of the data. With MARS, a model simplification may be calculated using the input and corresponding output values for the user's application problem. The MARS grid data may be used for generating 3-dimensional response surface plots of the simulation data. DDACE also contains an implementation
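
    As a concrete example of the sampling schemes such a library provides, here is a self-contained Latin hypercube sampler (a sketch in Python, not DDACE's own C++ API):

    ```python
    import numpy as np

    def latin_hypercube(n_samples, bounds, rng=None):
        """Latin hypercube sample: each variable's range is split into
        n_samples equal strata and each stratum is used exactly once.
        `bounds` is a list of (low, high) pairs, one per input variable."""
        rng = rng or np.random.default_rng()
        d = len(bounds)
        strata = rng.permuted(np.tile(np.arange(n_samples), (d, 1)), axis=1).T
        u = (strata + rng.random((n_samples, d))) / n_samples
        lo = np.array([b[0] for b in bounds])
        hi = np.array([b[1] for b in bounds])
        return lo + u * (hi - lo)

    # e.g. vary a temperature and two material parameters across 10 runs:
    X = latin_hypercube(10, [(250.0, 400.0), (0.1, 0.5), (1e9, 5e9)])
    ```

    Each row of X is one simulation's input vector; the stratification guarantees the full range of every variable is covered even with few runs.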

  14. 3-dimensional fabrication of soft energy harvesters

    NASA Astrophysics Data System (ADS)

    McKay, Thomas; Walters, Peter; Rossiter, Jonathan; O'Brien, Benjamin; Anderson, Iain

    2013-04-01

    Dielectric elastomer generators (DEG) provide an opportunity to harvest energy from low frequency and aperiodic sources. Because DEG are soft, deformable, high energy density generators, they can be coupled to complex structures such as the human body to harvest excess mechanical energy. However, DEG are typically constrained by a rigid frame and manufactured in a simple planar structure. This planar arrangement is unlikely to be optimal for harvesting from compliant and/or complex structures. In this paper we present a soft generator which is fabricated into a 3-dimensional geometry. This capability will enable the 3-dimensional structure of a dielectric elastomer to be customised to the energy source, allowing efficient and/or non-invasive coupling. This paper demonstrates our first 3-dimensional generator, which includes a diaphragm with a soft elastomer frame. When the generator was connected to a self-priming circuit and cyclically inflated, energy was accumulated in the system, demonstrated by an increased voltage. Our 3D generator promises a bright future for dielectric elastomers that will be customised for integration with complex and soft structures. In addition to customisable geometries, the 3D printing process may lend itself to fabricating large arrays of small generator units and to fabricating truly soft generators with excellent impedance matching to biological tissue. Thus comfortable, wearable energy harvesters are one step closer to reality.

  15. Biochemical Applications Of 3-Dimensional Fluorescence Spectrometry

    NASA Astrophysics Data System (ADS)

    Leiner, Marc J.; Wolfbeis, Otto S.

    1988-06-01

    We investigated the 3-dimensional fluorescence of complex mixtures of bioliquids such as human serum, serum ultrafiltrate, human urine, and human plasma low density lipoproteins. The total fluorescence of human serum can be divided into a few peaks. When comparing fluorescence topograms of sera from normal and cancerous subjects, we found significant differences in tryptophan fluorescence. Although the total fluorescence of human urine can be resolved into 3-5 distinct peaks, some of them do not result from single fluorescent urinary metabolites, but rather from several species having similar spectral properties. Human plasma low density lipoproteins possess a native fluorescence that changes when submitted to in-vitro autoxidation. The 3-dimensional fluorescence demonstrated the presence of 7 fluorophores in the lipid domain, and 6 fluorophores in the protein domain. The above results demonstrate that 3-dimensional fluorescence can resolve the spectral properties of complex mixtures much better than other methods. Moreover, parameters other than excitation and emission wavelength and intensity (for instance fluorescence lifetime, polarization, or quenchability) may be exploited to give a multidimensional matrix that is unique for each sample. Consequently, 3-dimensional fluorescence as such, or in combination with separation techniques, is considered to have the potential of becoming a useful new method in clinical chemistry and analytical biochemistry.

  16. Computer vision in microstructural analysis

    NASA Technical Reports Server (NTRS)

    Srinivasan, Malur N.; Massarweh, W.; Hough, C. L.

    1992-01-01

    The following is a laboratory experiment designed to be performed by advanced high school and beginning college students. It is hoped that this experiment will create an interest in and further understanding of materials science. The objective of this experiment is to demonstrate that the microstructure of engineered materials is affected by the processing conditions in manufacture, and that it is possible to characterize the microstructure using image analysis with a computer. The principle of computer vision will first be introduced, followed by the description of the system developed at Texas A&M University. This in turn will be followed by the description of the experiment to obtain differences in microstructure and the characterization of the microstructure using computer vision.
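
    As a hedged sketch of the kind of characterization such an image-analysis system performs, the following fragment estimates the area fraction of a dark second phase by thresholding a grayscale micrograph; the threshold value and the synthetic image are assumptions for illustration.

```python
# Sketch of a basic image-analysis step: estimate the area fraction of a
# dark second phase in a grayscale micrograph by thresholding.
import numpy as np

rng = np.random.default_rng(1)
# Stand-in for a digitized micrograph (0 = dark phase, 255 = bright matrix).
micrograph = rng.choice([40, 210], size=(512, 512), p=[0.3, 0.7]).astype(np.uint8)

threshold = 128                      # gray level separating the two phases
dark_phase = micrograph < threshold  # boolean mask of the second phase
area_fraction = dark_phase.mean()    # fraction of pixels in the dark phase

print(f"estimated dark-phase area fraction: {area_fraction:.3f}")
```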

  17. Computational Aeroacoustic Analysis System Development

    NASA Technical Reports Server (NTRS)

    Hadid, A.; Lin, W.; Ascoli, E.; Barson, S.; Sindir, M.

    2001-01-01

    Many industrial and commercial products operate in a dynamic flow environment and the aerodynamically generated noise has become a very important factor in the design of these products. In light of the importance in characterizing this dynamic environment, Rocketdyne has initiated a multiyear effort to develop an advanced general-purpose Computational Aeroacoustic Analysis System (CAAS) to address these issues. This system will provide a high fidelity predictive capability for aeroacoustic design and analysis. The numerical platform is able to provide high temporal and spatial accuracy that is required for aeroacoustic calculations through the development of a high order spectral element numerical algorithm. The analysis system is integrated with well-established CAE tools, such as a graphical user interface (GUI) through PATRAN, to provide cost-effective access to all of the necessary tools. These include preprocessing (geometry import, grid generation and boundary condition specification), code set up (problem specification, user parameter definition, etc.), and postprocessing. The purpose of the present paper is to assess the feasibility of such a system and to demonstrate the efficiency and accuracy of the numerical algorithm through numerical examples. Computations of vortex shedding noise were carried out in the context of a two-dimensional low Mach number turbulent flow past a square cylinder. The computational aeroacoustic approach that is used in CAAS relies on coupling a base flow solver to the acoustic solver throughout a computational cycle. The unsteady fluid motion, which is responsible for both the generation and propagation of acoustic waves, is calculated using a high order flow solver. The results of the flow field are then passed to the acoustic solver through an interpolator to map the field values into the acoustic grid. The acoustic field, which is governed by the linearized Euler equations, is then calculated using the flow results computed
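
    A minimal sketch of the coupling step described above, interpolating a flow-solver field onto a finer acoustic grid, is given below; the grids and the sample pressure field are synthetic stand-ins, not CAAS data structures.

```python
# Sketch of mapping a flow-solver field onto a (generally finer) acoustic
# grid, the interpolation step of a coupled aeroacoustic cycle.
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Coarse flow grid with a sample unsteady-pressure snapshot.
x_flow = np.linspace(0.0, 1.0, 33)
y_flow = np.linspace(0.0, 1.0, 33)
Xf, Yf = np.meshgrid(x_flow, y_flow, indexing="ij")
p_flow = np.sin(2 * np.pi * Xf) * np.cos(2 * np.pi * Yf)

# Finer acoustic grid onto which the source field is mapped each cycle.
interp = RegularGridInterpolator((x_flow, y_flow), p_flow)
x_ac = np.linspace(0.0, 1.0, 129)
y_ac = np.linspace(0.0, 1.0, 129)
Xa, Ya = np.meshgrid(x_ac, y_ac, indexing="ij")
p_acoustic = interp(np.stack([Xa.ravel(), Ya.ravel()], axis=-1)).reshape(Xa.shape)

print("acoustic-grid field shape:", p_acoustic.shape)
```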

  18. CGAT: computational genomics analysis toolkit.

    PubMed

    Sims, David; Ilott, Nicholas E; Sansom, Stephen N; Sudbery, Ian M; Johnson, Jethro S; Fawcett, Katherine A; Berlanga-Taylor, Antonio J; Luna-Valero, Sebastian; Ponting, Chris P; Heger, Andreas

    2014-05-01

    Computational genomics seeks to draw biological inferences from genomic datasets, often by integrating and contextualizing next-generation sequencing data. CGAT provides an extensive suite of tools designed to assist in the analysis of genome scale data from a range of standard file formats. The toolkit enables filtering, comparison, conversion, summarization and annotation of genomic intervals, gene sets and sequences. The tools can both be run from the Unix command line and installed into visual workflow builders, such as Galaxy.

  19. Fabrication of 3-dimensional multicellular microvascular structures

    PubMed Central

    Barreto-Ortiz, Sebastian F.; Fradkin, Jamie; Eoh, Joon; Trivero, Jacqueline; Davenport, Matthew; Ginn, Brian; Mao, Hai-Quan; Gerecht, Sharon

    2015-01-01

    Despite current advances in engineering blood vessels over 1 mm in diameter and the existing wealth of knowledge regarding capillary bed formation, studies for the development of microvasculature, the connecting bridge between them, have been extremely limited so far. Here, we evaluate the use of 3-dimensional (3D) microfibers fabricated by hydrogel electrospinning as templates for microvascular structure formation. We hypothesize that 3D microfibers improve extracellular matrix (ECM) deposition from vascular cells, enabling the formation of freestanding luminal multicellular microvasculature. Compared to 2-dimensional cultures, we demonstrate with confocal microscopy and RT-PCR that fibrin microfibers induce an increased ECM protein deposition by vascular cells, specifically endothelial colony-forming cells, pericytes, and vascular smooth muscle cells. These ECM proteins comprise different layers of the vascular wall including collagen types I, III, and IV, as well as elastin, fibronectin, and laminin. We further demonstrate the achievement of multicellular microvascular structures with an organized endothelium and a robust multicellular perivascular tunica media. This, along with the increased ECM deposition, allowed for the creation of self-supporting multilayered microvasculature with a distinct circular lumen following fibrin microfiber core removal. This approach presents an advancement toward the development of human microvasculature for basic and translational studies.—Barreto-Ortiz, S. F., Fradkin, J., Eoh, J., Trivero, J., Davenport, M., Ginn, B., Mao, H.-Q., Gerecht, S. Fabrication of 3-dimensional multicellular microvascular structures. PMID:25900808

  20. Preliminary Toxicity Analysis of 3-Dimensional Conformal Radiation Therapy Versus Intensity Modulated Radiation Therapy on the High-Dose Arm of the Radiation Therapy Oncology Group 0126 Prostate Cancer Trial

    SciTech Connect

    Michalski, Jeff M.; Yan, Yan; Watkins-Bruner, Deborah; Bosch, Walter R.; Winter, Kathryn; Galvin, James M.; Bahary, Jean-Paul; Morton, Gerard C.; Parliament, Matthew B.; Sandler, Howard M.

    2013-12-01

    Purpose: To give a preliminary report of clinical and treatment factors associated with toxicity in men receiving high-dose radiation therapy (RT) on a phase 3 dose-escalation trial. Methods and Materials: The trial was initiated with 3-dimensional conformal RT (3D-CRT) and amended after 1 year to allow intensity modulated RT (IMRT). Patients treated with 3D-CRT received 55.8 Gy to a planning target volume that included the prostate and seminal vesicles, then 23.4 Gy to prostate only. The IMRT patients were treated to the prostate and proximal seminal vesicles to 79.2 Gy. Common Toxicity Criteria, version 2.0, and Radiation Therapy Oncology Group/European Organization for Research and Treatment of Cancer late morbidity scores were used for acute and late effects. Results: Of 763 patients randomized to the 79.2-Gy arm of Radiation Therapy Oncology Group 0126 protocol, 748 were eligible and evaluable: 491 and 257 were treated with 3D-CRT and IMRT, respectively. For both bladder and rectum, the volumes receiving 65, 70, and 75 Gy were significantly lower with IMRT (all P<.0001). For grade (G) 2+ acute gastrointestinal/genitourinary (GI/GU) toxicity, both univariate and multivariate analyses showed a statistically significant decrease in G2+ acute collective GI/GU toxicity for IMRT. There were no significant differences with 3D-CRT or IMRT for acute or late G2+ or 3+ GU toxicities. Univariate analysis showed a statistically significant decrease in late G2+ GI toxicity for IMRT (P=.039). On multivariate analysis, IMRT showed a 26% reduction in G2+ late GI toxicity (P=.099). Acute G2+ toxicity was associated with late G3+ toxicity (P=.005). With dose–volume histogram data in the multivariate analysis, RT modality was not significant, whereas white race (P=.001) and rectal V70 ≥15% were associated with G2+ rectal toxicity (P=.034). Conclusions: Intensity modulated RT is associated with a significant reduction in acute G2+ GI/GU toxicity. There is a trend for a
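
    As a generic, hedged illustration of the kind of multivariate analysis reported above, the sketch below fits a logistic regression of late G2+ rectal toxicity on rectal V70 and treatment modality; the data and the coefficients are synthetic assumptions, not trial results.

```python
# Generic sketch of a multivariate toxicity analysis: logistic regression
# of late grade-2+ rectal toxicity on rectal V70 and treatment modality.
# The data below are synthetic, not trial data.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n = 748
v70 = rng.uniform(0, 40, n)            # rectal V70 (% volume >= 70 Gy)
imrt = rng.integers(0, 2, n)           # 1 = IMRT, 0 = 3D-CRT
logit = -3.0 + 0.08 * v70 - 0.3 * imrt # assumed "true" effects
p_tox = 1 / (1 + np.exp(-logit))
toxicity = rng.random(n) < p_tox

model = LogisticRegression().fit(np.column_stack([v70, imrt]), toxicity)
odds_ratios = np.exp(model.coef_[0])
print(f"odds ratio per 1% V70: {odds_ratios[0]:.2f}")
print(f"odds ratio for IMRT vs 3D-CRT: {odds_ratios[1]:.2f}")
```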

  1. 3-dimensional bioprinting for tissue engineering applications.

    PubMed

    Gu, Bon Kang; Choi, Dong Jin; Park, Sang Jun; Kim, Min Sup; Kang, Chang Mo; Kim, Chun-Ho

    2016-01-01

    The 3-dimensional (3D) printing technologies, referred to as additive manufacturing (AM) or rapid prototyping (RP), have gained attention over the past few years for art, architectural modeling, lightweight machines, and tissue engineering applications. Among these applications, tissue engineering using 3D printing has attracted the attention of many researchers. 3D bioprinting has advantages in the manufacture of scaffolds for tissue engineering applications because of its rapid fabrication, high precision, and customized production. In this review, we introduce the principles and the current state of 3D bioprinting methods, focusing on studies that currently apply printed 3D scaffolds in the biomedical and tissue engineering fields. PMID:27114828

  3. On AGV's navigation in 3-dimensional space

    NASA Astrophysics Data System (ADS)

    Kusche, Jürgen

    1996-01-01

    This paper deals with position estimation and path control for Autonomous Guided Vehicles (AGV). These techniques play an important role in enabling a vehicle or a mobile robot to follow a continuous “virtual” path without human control. The relationship between the vehicle's motion in 3-dimensional space and the shape of a curved surface is described. In particular, the introduction of a digital terrain model into dead reckoning is considered. Moreover, a possible nonlinear control is developed based on curvilinear path coordinates, and a proof of global stability is given. To achieve general validity, these topics are treated here independently of the cart's particular mechanization (the configuration of steered and driven wheels). Simulation studies are presented to illustrate the investigations.
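
    A minimal dead-reckoning sketch in the spirit of the paper, with a stand-in digital terrain model supplying elevation, is shown below; the terrain function and motion inputs are illustrative assumptions.

```python
# Planar odometry augmented with a digital terrain model for elevation,
# a simplified version of the dead-reckoning idea described above.
import math

def terrain_height(x, y):
    """Stand-in digital terrain model z = h(x, y)."""
    return 0.1 * math.sin(x) + 0.05 * y

x, y, heading = 0.0, 0.0, 0.0      # pose estimate
for step in range(100):
    v, omega, dt = 0.5, 0.02, 0.1  # wheel speed, yaw rate, time step
    heading += omega * dt
    x += v * math.cos(heading) * dt
    y += v * math.sin(heading) * dt

z = terrain_height(x, y)           # elevation from the terrain model
print(f"estimated pose: x={x:.2f} m, y={y:.2f} m, z={z:.3f} m, "
      f"heading={math.degrees(heading):.1f} deg")
```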

  4. Forensic Analysis of Compromised Computers

    NASA Technical Reports Server (NTRS)

    Wolfe, Thomas

    2004-01-01

    Directory Tree Analysis File Generator is a Practical Extraction and Reporting Language (PERL) script that simplifies and automates the collection of information for forensic analysis of compromised computer systems. During such an analysis, it is sometimes necessary to collect and analyze information about files on a specific directory tree. Directory Tree Analysis File Generator collects information of this type (except information about directories) and writes it to a text file. In particular, the script asks the user for the root of the directory tree to be processed, the name of the output file, and the number of subtree levels to process. The script then processes the directory tree and puts out the aforementioned text file. The format of the text file is designed to enable the submission of the file as input to a spreadsheet program, wherein the forensic analysis is performed. The analysis usually consists of sorting files and examination of such characteristics of files as ownership, time of creation, and time of most recent access, all of which characteristics are among the data included in the text file.
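
    The original tool is a PERL script; the following is a hedged Python re-creation of the same idea (walking a directory tree to a depth limit and writing per-file ownership and timestamps to a delimited text file for a spreadsheet), not the script itself.

```python
# Walk a directory tree to a given depth and write per-file ownership and
# timestamps to a tab-delimited file suitable for a spreadsheet program.
import os
import sys
import time

def dump_tree(root, out_path, max_depth):
    root = os.path.abspath(root)
    with open(out_path, "w") as out:
        out.write("path\towner_uid\tcreated\tlast_access\n")
        for dirpath, dirnames, filenames in os.walk(root):
            depth = dirpath[len(root):].count(os.sep)
            if depth >= max_depth:
                dirnames[:] = []      # do not descend further
            for name in filenames:    # files only, as in the original
                full = os.path.join(dirpath, name)
                st = os.stat(full)
                out.write("\t".join([
                    full,
                    str(st.st_uid),
                    time.ctime(st.st_ctime),
                    time.ctime(st.st_atime),
                ]) + "\n")

if __name__ == "__main__":
    dump_tree(sys.argv[1], sys.argv[2], int(sys.argv[3]))
```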

  5. A Petaflops Era Computing Analysis

    NASA Technical Reports Server (NTRS)

    Preston, Frank S.

    1998-01-01

    This report covers a study of the potential for petaflops (10(exp 15) floating point operations per second) computing. This study was performed within the year 1996 and should be considered as the first step in an on-going effort. The analysis concludes that a petaflops system is technically feasible but not achievable with today's state of the art. Since the computer arena is now a commodity business, most experts expect that a petaflops system will evolve from current technology in an evolutionary fashion. To meet the price expectations of users waiting for petaflops performance, great improvements in lowering component costs will be required. Lower power consumption is also a must. The present rate of progress in improved performance places the date of introduction of petaflops systems at about 2010. Several years before that date, it is projected that chip feature sizes will reach the now-known resolution limit. Aside from the economic problems and constraints, software is identified as the major problem. The tone of this initial study is more pessimistic than most of the published material available on petaflops systems. Workers in the field are expected to generate more data which could serve to provide a basis for a more informed projection. This report includes an annotated bibliography.

  6. Textual Analysis with Computers: Tests of Bell Laboratories' Computer Software.

    ERIC Educational Resources Information Center

    Kiefer, Kathleen E.; Smith, Charles R.

    1983-01-01

    Concludes that textual analysis with computers intrigues college writers and speeds learning of editing skills by offering immediate, reliable, and consistent attention to surface features of their prose. (HOD)

  7. Personal Computer Transport Analysis Program

    NASA Technical Reports Server (NTRS)

    DiStefano, Frank, III; Wobick, Craig; Chapman, Kirt; McCloud, Peter

    2012-01-01

    The Personal Computer Transport Analysis Program (PCTAP) is C++ software used for the analysis of thermal fluid systems. The program predicts thermal fluid system and component transients. The output consists of temperatures, flow rates, pressures, delta pressures, tank quantities, and gas quantities in the air, along with air scrubbing component performance. PCTAP's solution process assumes that the tubes in the system are well insulated so that only the heat transfer between fluid and tube wall and between adjacent tubes is modeled. The system described in the model file is broken down into its individual components, i.e., tubes, cold plates, heat exchangers, etc. A solution vector is built from the components, and a flow is then simulated with fluid being transferred from one component to the next. The solution vector of components in the model file is built at the initiation of the run. This solution vector is simply a list of components in the order of their inlet dependency on other components. The component parameters are updated in the order in which they appear in the list at every time step. Once the solution vectors have been determined, PCTAP cycles through the components in the solution vector, executing their outlet function for each time-step increment.
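
    A minimal sketch of the solution-vector idea described above, with components ordered by inlet dependency and updated once per time step, might look as follows; the component model is a trivial stand-in, not PCTAP's.

```python
# Components ordered by inlet dependency, updated once per time step.
# The outlet function here is a trivial stand-in.
class Component:
    def __init__(self, name, upstream=None):
        self.name = name
        self.upstream = upstream   # component feeding this one, if any
        self.outlet_temp = 300.0   # K, initial guess

    def update(self, dt):
        """Trivial outlet function: relax toward the inlet temperature."""
        inlet = self.upstream.outlet_temp if self.upstream else 350.0
        self.outlet_temp += (inlet - self.outlet_temp) * 0.5 * dt

# Build the solution vector in inlet-dependency order: each component
# appears after the component that feeds it.
tube = Component("tube")
cold_plate = Component("cold_plate", upstream=tube)
heat_exchanger = Component("heat_exchanger", upstream=cold_plate)
solution_vector = [tube, cold_plate, heat_exchanger]

dt = 0.1
for step in range(200):           # march the transient in time
    for comp in solution_vector:  # update in dependency order
        comp.update(dt)

for comp in solution_vector:
    print(f"{comp.name}: outlet T = {comp.outlet_temp:.1f} K")
```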

  8. Improving Perceptual Skills with 3-Dimensional Animations.

    ERIC Educational Resources Information Center

    Johns, Janet Faye; Brander, Julianne Marie

    1998-01-01

    Describes three-dimensional computer aided design (CAD) models for every component in a representative mechanical system; the CAD models made it easy to generate 3-D animations that are ideal for teaching perceptual skills in multimedia computer-based technical training. Fifteen illustrations are provided. (AEF)

  9. Application of 3-dimensional printing in hand surgery for production of a novel bone reduction clamp.

    PubMed

    Fuller, Sam M; Butz, Daniel R; Vevang, Curt B; Makhlouf, Mansour V

    2014-09-01

    Three-dimensional printing is being rapidly incorporated in the medical field to produce external prosthetics for improved cosmesis and fabricated molds to aid in presurgical planning. Biomedically engineered products from 3-dimensional printers are also utilized as implantable devices for knee arthroplasty, airway orthoses, and other surgical procedures. Although at first expensive and conceptually difficult to construct, 3-dimensional printing is now becoming more affordable and widely accessible. In hand surgery, like many other specialties, new or customized instruments would be desirable; however, the overall production cost restricts their development. We are presenting our step-by-step experience in creating a bone reduction clamp for finger fractures using 3-dimensional printing technology. Using free, downloadable software, a 3-dimensional model of a bone reduction clamp for hand fractures was created based on the senior author's (M.V.M.) specific design, previous experience, and preferences for fracture fixation. Once deemed satisfactory, the computer files were sent to a 3-dimensional printing company for the production of the prototypes. Multiple plastic prototypes were made and adjusted, affording a fast, low-cost working model of the proposed clamp. Once a workable design was obtained, a printing company produced the surgical clamp prototype directly from the 3-dimensional model represented in the computer files. This prototype was used in the operating room, meeting the expectations of the surgeon. Three-dimensional printing is affordable and offers the benefits of reducing production time and nurturing innovations in hand surgery. This article presents a step-by-step description of our design process using online software programs and 3-dimensional printing services. As medical technology advances, it is important that hand surgeons remain aware of available resources, are knowledgeable about how the process works, and are able to take advantage of

  10. Computer-Based Linguistic Analysis.

    ERIC Educational Resources Information Center

    Wright, James R.

    Noam Chomsky's transformational-generative grammar model may effectively be translated into an equivalent computer model. Phrase-structure rules and transformations are tested as to their validity and ordering by the computer via the process of random lexical substitution. Errors appearing in the grammar are detected and rectified, and formal…
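
    A toy illustration of testing phrase-structure rules via random lexical substitution is sketched below; the grammar and lexicon are assumptions for illustration only.

```python
# Expand phrase-structure rules and fill terminal categories by random
# lexical substitution. The grammar and lexicon are toy assumptions.
import random

rules = {
    "S": [["NP", "VP"]],
    "NP": [["Det", "N"]],
    "VP": [["V", "NP"]],
}
lexicon = {
    "Det": ["the", "a"],
    "N": ["linguist", "grammar", "computer"],
    "V": ["tests", "models"],
}

def generate(symbol):
    if symbol in lexicon:                       # terminal category
        return [random.choice(lexicon[symbol])] # random lexical substitution
    expansion = random.choice(rules[symbol])    # phrase-structure rule
    return [w for part in expansion for w in generate(part)]

random.seed(3)
print(" ".join(generate("S")))
```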

  11. Biomolecular dynamics by computer analysis

    SciTech Connect

    Eilbeck, J.C.; Lomdahl, P.S.; Scott, A.C.

    1984-01-01

    As numerical tools (computers and display equipment) become more powerful and the atomic structures of important biological molecules become known, the importance of detailed computation of nonequilibrium biomolecular dynamics increases. In this manuscript we report results from a well developed study of the hydrogen bonded polypeptide crystal acetanilide, a model protein. Directions for future research are suggested. 9 references, 6 figures.

  12. 3-Dimensional simulation of the grain formation in investment castings

    SciTech Connect

    Gandin, C.A.; Rappaz, M.; Tintillier, R. (Dept. Materiaux et Procedes, Direction Technique)

    1994-03-01

    A 3-dimensional (3-D) probabilistic model which has been developed previously for the prediction of grain structure formation during solidification is applied to thin superalloy plates produced using the investment-casting process. This model considers the random nucleation and orientation of nuclei formed at the mold surface and in the bulk of the liquid, the growth kinetics of the dendrite tips, and the preferential growth directions of the dendrite trunks and arms. In the present study, the grains are assumed to nucleate at the surface of the mold only. The computed grain structures, as observed in 2-dimensional (2-D) sections made parallel to the mold surface, are compared with experimental micrographs. The grain densities are then deduced as a function of the distance from the mold surface for both the experiment and the simulation. It is shown that these values are in good agreement, thus, providing validation of the grain formation mechanisms built into the 3-D probabilistic model. Finally, this model is further extended to more complex geometries and the 3-D computed grain structure of an equiaxed turbine-blade airfoil is compared with the experimental transverse section micrograph.

  13. Computational methods for global/local analysis

    NASA Technical Reports Server (NTRS)

    Ransom, Jonathan B.; Mccleary, Susan L.; Aminpour, Mohammad A.; Knight, Norman F., Jr.

    1992-01-01

    Computational methods for global/local analysis of structures which include both uncoupled and coupled methods are described. In addition, global/local analysis methodology for automatic refinement of incompatible global and local finite element models is developed. Representative structural analysis problems are presented to demonstrate the global/local analysis methods.

  14. Computational analysis on plug-in hybrid electric motorcycle chassis

    NASA Astrophysics Data System (ADS)

    Teoh, S. J.; Bakar, R. A.; Gan, L. M.

    2013-12-01

    Plug-in hybrid electric motorcycle (PHEM) is an alternative to promote sustainability and lower emissions. However, the overall PHEM system packaging is constrained by the limited space in a motorcycle chassis. In this paper, a chassis based on the concept of a chopper is analysed for application in a PHEM. The chassis 3-dimensional (3D) model is built with CAD software. The PHEM power-train components and drive-train mechanisms are integrated into the 3D model to ensure the chassis provides sufficient space. Besides that, a human dummy model is built into the 3D model to ensure the rider's ergonomics and comfort. The chassis 3D model then undergoes stress-strain simulation. The simulation predicts the stress distribution, displacement, and factor of safety (FOS). The data are used to identify the critical points, thus indicating whether the chassis design is applicable or needs to be redesigned or modified to meet the required strength. Critical points are locations of highest stress, which might cause the chassis to fail; for a motorcycle chassis, these occur at the joints at the triple tree and the rear absorber bracket. In conclusion, the computational analysis predicts the stress distribution and provides a guideline for developing a safe prototype chassis.

  15. Error Analysis In Computational Elastodynamics

    NASA Astrophysics Data System (ADS)

    Mukherjee, Somenath; Jafarali, P.; Prathap, Gangan

    The Finite Element Method (FEM) is the mathematical tool of the engineers and scientists to determine approximate solutions, in a discretised sense, of the concerned differential equations, which are not always amenable to closed form solutions. In this presentation, the mathematical aspects of this powerful computational tool as applied to the field of elastodynamics have been highlighted, using the first principles of virtual work and energy conservation.

  16. IUE Data Analysis Software for Personal Computers

    NASA Technical Reports Server (NTRS)

    Thompson, R.; Caplinger, J.; Taylor, L.; Lawton , P.

    1996-01-01

    This report summarizes the work performed for the program titled, "IUE Data Analysis Software for Personal Computers" awarded under Astrophysics Data Program NRA 92-OSSA-15. The work performed was completed over a 2-year period starting in April 1994. As a result of the project, 450 IDL routines and eight database tables are now available for distribution for Power Macintosh computers and Personal Computers running Windows 3.1.

  17. Discourse Analysis of Teaching Computing Online

    ERIC Educational Resources Information Center

    Bower, Matt

    2009-01-01

    This paper analyses the teaching and learning of computing in a Web-conferencing environment. A discourse analysis of three introductory programming learning episodes is presented to demonstrate issues and effects that arise when teaching computing using such an approach. The subject of discussion, the interactive nature of discussion and any…

  18. Radiological Safety Analysis Computer Program

    2001-08-28

    RSAC-6 is the latest version of the RSAC program. It calculates the consequences of a release of radionuclides to the atmosphere. Using a personal computer, a user can generate a fission product inventory; decay and in-grow the inventory during transport through processes, facilities, and the environment; model the downwind dispersion of the activity; and calculate doses to downwind individuals. Internal dose from the inhalation and ingestion pathways is calculated. External dose from the ground surface and plume gamma pathways is calculated. New updates to the program include the ability to evaluate a release to an enclosed room, resuspension of deposited activity, and evaluation of a release up to 1 meter from the release point. Enhanced tools are included for dry deposition, building wake, occupancy factors, respirable fraction, AMAD adjustment, an updated and enhanced radionuclide inventory, and inclusion of the dose-conversion factors from FGR 11 and 12.

  19. Computer aided analysis of phonocardiogram.

    PubMed

    Singh, J; Anand, R S

    2007-01-01

    In the present paper, an analysis of phonocardiogram (PCG) records is presented. The analysis has been carried out in both the time and frequency domains with the aim of detecting certain correlations between the time and frequency domain representations of the PCG. The analysis is limited to the first and second heart sounds (S1 and S2) only. In the time domain analysis, a moving window averaging technique is used to determine the occurrence of S1 and S2, which helps in the determination of the cardiac interval, the absolute and relative durations of individual S1 and S2 sounds, and the absolute and relative duration between them. In the frequency domain, a fast Fourier transform (FFT) of the complete PCG record, and short-time Fourier transform (STFT) and wavelet transforms of individual heart sounds, have been carried out. The frequency domain analysis gives an idea of the dominant frequency components in individual records and the frequency spectrum of individual heart sounds. A comparative observation of both analyses reveals some correlation between the time domain and frequency domain representations of the PCG. PMID:17701776
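
    As a hedged sketch of the time-domain step described above, the following fragment applies a moving-window average to the squared amplitude of a synthetic PCG and reads off the S1 and S2 locations from the envelope peaks; the signal is a mock-up, not a clinical record.

```python
# Moving-window average of the squared PCG amplitude gives an energy
# envelope whose peaks locate S1 and S2. The signal here is synthetic.
import numpy as np

fs = 2000                                # sampling rate (Hz)
t = np.arange(0, 1.0, 1 / fs)            # one cardiac cycle
pcg = np.zeros_like(t)
for center, freq in [(0.10, 60.0), (0.45, 90.0)]:   # mock S1 and S2
    burst = np.exp(-((t - center) / 0.02) ** 2)
    pcg += burst * np.sin(2 * np.pi * freq * t)

window = int(0.05 * fs)                  # 50 ms moving window
envelope = np.convolve(pcg**2, np.ones(window) / window, mode="same")

s1 = t[np.argmax(envelope[t < 0.3])]               # peak in the early part
s2 = t[t >= 0.3][np.argmax(envelope[t >= 0.3])]    # peak in the later part
print(f"S1 at {s1*1000:.0f} ms, S2 at {s2*1000:.0f} ms, "
      f"S1-S2 interval {1000*(s2-s1):.0f} ms")
```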

  20. COMPUTATIONAL FLUID DYNAMICS MODELING ANALYSIS OF COMBUSTORS

    SciTech Connect

    Mathur, M.P.; Freeman, Mark; Gera, Dinesh

    2001-11-06

    In the current fiscal year FY01, several CFD simulations were conducted to investigate the effects of moisture in biomass/coal, particle injection locations, and flow parameters on carbon burnout and NO{sub x} inside a 150 MW GEEZER industrial boiler. Various simulations were designed to predict the suitability of biomass cofiring in coal combustors, and to explore the possibility of using biomass as a reburning fuel to reduce NO{sub x}. Some additional CFD simulations were also conducted on the CERF combustor to examine the combustion characteristics of pulverized coal in enriched O{sub 2}/CO{sub 2} environments. Most of the CFD models available in the literature treat particles as point masses with a uniform temperature inside the particle. This isothermal condition may not be suitable for larger biomass particles. To this end, a stand-alone program was developed from first principles to account for heat conduction from the surface of the particle to its center. It is envisaged that the recently developed non-isothermal stand-alone module will be integrated with the Fluent solver during the next fiscal year to accurately predict the carbon burnout of larger biomass particles. Anisotropy in heat transfer will be explored using different conductivities in the radial and axial directions. The above models will be validated/tested on various full-scale industrial boilers. The current NO{sub x} modules will be modified to account for local CH, CH{sub 2}, and CH{sub 3} radical chemistry; currently they are based on global chemistry. It may also be worth exploring the effect of an enriched O{sub 2}/CO{sub 2} environment on carbon burnout and NO{sub x} concentration. The research objective of this study is to develop a 3-dimensional combustor model for biomass co-firing and reburning applications using the Fluent computational fluid dynamics code.
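
    As a hedged sketch of the non-isothermal particle idea described above, the fragment below solves 1-D radial heat conduction in a spherical particle with an explicit finite-difference scheme; all property values are illustrative assumptions, not the module's.

```python
# Explicit finite-difference solution of 1-D radial heat conduction in a
# spherical biomass particle heated at its surface. Property values are
# illustrative assumptions.
import numpy as np

R = 1.0e-3            # particle radius (m)
alpha = 1.0e-7        # thermal diffusivity (m^2/s)
nr = 50
r = np.linspace(0.0, R, nr)
dr = r[1] - r[0]
dt = 0.2 * dr**2 / alpha          # stable explicit time step
T = np.full(nr, 300.0)            # initial particle temperature (K)
T_surface = 1200.0                # imposed gas-side surface temperature (K)

for step in range(2000):
    T_new = T.copy()
    # Interior nodes: dT/dt = alpha * (d2T/dr2 + (2/r) dT/dr)
    d2T = (T[2:] - 2 * T[1:-1] + T[:-2]) / dr**2
    dT = (T[2:] - T[:-2]) / (2 * dr)
    T_new[1:-1] = T[1:-1] + dt * alpha * (d2T + 2.0 / r[1:-1] * dT)
    T_new[0] = T_new[1]           # symmetry at the particle center
    T_new[-1] = T_surface         # fixed surface temperature
    T = T_new

print(f"center-to-surface difference after {2000*dt:.2f} s: "
      f"{T[-1] - T[0]:.1f} K")
```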

  1. Economic Analysis. Computer Simulation Models.

    ERIC Educational Resources Information Center

    Sterling Inst., Washington, DC. Educational Technology Center.

    A multimedia course in economic analysis was developed and used in conjunction with the United States Naval Academy. (See ED 043 790 and ED 043 791 for final reports of the project evaluation and development model.) This volume of the text discusses the simulation of behavioral relationships among variable elements in an economy and presents…

  2. Computational analysis of ramjet engine inlet interaction

    NASA Technical Reports Server (NTRS)

    Duncan, Beverly; Thomas, Scott

    1992-01-01

    A computational analysis of a ramjet engine at Mach 3.5 has been conducted and compared to results obtained experimentally. This study focuses on the behavior of the inlet both with and without combustor backpressure. Increased backpressure results in separation of the body side boundary layer and a resultant static pressure rise in the inlet throat region. The computational results compare well with the experimental data for static pressure distribution through the engine, inlet throat flow profiles, and mass capture. The computational analysis slightly underpredicts the thickness of the engine body surface boundary layer and the extent of the interaction caused by backpressure; however, the interaction is observed at approximately the same level of backpressure both experimentally and computationally. This study demonstrates the ability of two different Navier-Stokes codes, namely RPLUS and PARC, to calculate the flow features of this ramjet engine and to provide more detailed information on the process of inlet interaction and unstart.

  3. MTX data acquisition and analysis computer network

    SciTech Connect

    Butner, D.N.; Casper, T.A.; Brown, M.D.; Drlik, M.; Meyer, W.H.; Moller, J.M. )

    1990-10-01

    For the MTX experiment, we use a network of computers for plasma diagnostic data acquisition and analysis. This multivendor network employs VMS, UNIX, and BASIC based computers connected in a local area Ethernet network. Some of the data is acquired directly into a VAX/VMS computer cluster over a fiber-optic serial CAMAC highway. Several HP-Unix workstations and HP-BASIC instrument control computers acquire and analyze data for the more data intensive or specialized diagnostics. The VAX/VMS system is used for global analysis of the data and serves as the central data archiving and retrieval manager. Shot synchronization and control of data flow are implemented by task-to-task message passing using our interprocess communication system. The system has been in operation during our initial MTX tokamak and FEL experiments; it has operated reliably with data rates typically in the range of 5 Mbytes/shot without limiting the experimental shot rate.

  4. Computer aided nonlinear electrical networks analysis

    NASA Technical Reports Server (NTRS)

    Slapnicar, P.

    1977-01-01

    Techniques used in simulating an electrical circuit with nonlinear elements for use in computer-aided circuit analysis programs are described. Elements of the circuit include capacitors, resistors, inductors, transistors, diodes, and voltage and current sources (constant or time varying). Simulation features are discussed for dc, ac, and/or transient circuit analysis. Calculations are based on the model approach of formulating the circuit equations. A particular solution of transient analysis for nonlinear storage elements is described.

  5. The computer in shell stability analysis

    NASA Technical Reports Server (NTRS)

    Almroth, B. O.; Starnes, J. H., Jr.

    1975-01-01

    Some examples in which the high-speed computer has been used to improve the static stability analysis capability for general shells are examined. The fundamental concepts of static stability are reviewed with emphasis on the differences between linear bifurcation buckling and nonlinear collapse. The analysis is limited to the stability of conservative systems. Three examples are considered. The problem of cylinders subjected to bending loads is used as an example to illustrate that a simple structure can have a sufficiently complicated nonlinear behavior to require a computer analysis for accurate results. An analysis of the problems involved in the modeling of stiffening elements in plate and shell structures illustrates the necessity that the analyst recognizes all important deformation modes. The stability analysis of the Skylab structure indicates the size of problems that can be solved with current state-of-the-art capability.

  6. Interfacing Computer Aided Parallelization and Performance Analysis

    NASA Technical Reports Server (NTRS)

    Jost, Gabriele; Jin, Haoqiang; Labarta, Jesus; Gimenez, Judit; Biegel, Bryan A. (Technical Monitor)

    2003-01-01

    When porting sequential applications to parallel computer architectures, the program developer will typically go through several cycles of source code optimization and performance analysis. We have started a project to develop an environment where the user can jointly navigate through program structure and performance data information in order to make efficient optimization decisions. In a prototype implementation we have interfaced the CAPO computer aided parallelization tool with the Paraver performance analysis tool. We describe both tools and their interface and give an example for how the interface helps within the program development cycle of a benchmark code.

  7. Temporal fringe pattern analysis with parallel computing

    SciTech Connect

    Tuck Wah Ng; Kar Tien Ang; Argentini, Gianluca

    2005-11-20

    Temporal fringe pattern analysis is invaluable in transient phenomena studies but necessitates long processing times. Here we describe a parallel computing strategy based on the single-program multiple-data model and hyperthreading processor technology to reduce the execution time. In a two-node cluster workstation configuration we found that execution times were reduced by a factor of 1.6 when four virtual processors were used. To allow even lower execution times with an increasing number of processors, the time allocated for data transfer, data read, and waiting should be minimized. Parallel computing is found here to present a feasible approach to reduce execution times in temporal fringe pattern analysis.
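
    A minimal single-program multiple-data sketch in the spirit of the strategy described above distributes independent frames across worker processes; the per-frame analysis is a stand-in FFT step, not the authors' algorithm.

```python
# Independent fringe-pattern frames distributed across worker processes
# (single-program multiple-data). The per-frame "analysis" is a stand-in.
import numpy as np
from multiprocessing import Pool

def analyze_frame(seed):
    """Stand-in per-frame analysis on a synthetic fringe pattern."""
    rng = np.random.default_rng(seed)
    frame = np.sin(np.linspace(0, 40 * np.pi, 4096)) + 0.1 * rng.standard_normal(4096)
    spectrum = np.fft.rfft(frame)
    return np.argmax(np.abs(spectrum[1:])) + 1   # dominant fringe frequency bin

if __name__ == "__main__":
    frames = list(range(64))                     # one seed per temporal frame
    with Pool(processes=4) as pool:              # cf. the four virtual processors
        dominant_bins = pool.map(analyze_frame, frames)
    print("dominant bin (first 5 frames):", dominant_bins[:5])
```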

  8. The 3-dimensional construction of the Rae craton, central Canada

    NASA Astrophysics Data System (ADS)

    Snyder, David B.; Craven, James A.; Pilkington, Mark; Hillier, Michael J.

    2015-10-01

    Reconstruction of the 3-dimensional tectonic assembly of early continents, first as Archean cratons and then Proterozoic shields, remains poorly understood. In this paper, all readily available geophysical and geochemical data are assembled in a 3-D model with the most accurate bedrock geology in order to understand better the geometry of major structures within the Rae craton of central Canada. Analysis of geophysical observations of gravity and seismic wave speed variations revealed several lithospheric-scale discontinuities in physical properties. Where these discontinuities project upward to correlate with mapped upper crustal geological structures, the discontinuities can be interpreted as shear zones. Radiometric dating of xenoliths provides estimates of rock types and ages at depth beneath sparse kimberlite occurrences. These ages can also be correlated to surface rocks. The 3.6-2.6 Ga Rae craton comprises at least three smaller continental terranes, which "cratonized" during a granitic bloom. Cratonization probably represents final differentiation of early crust into a relatively homogeneous, uniformly thin (35-42 km), tonalite-trondhjemite-granodiorite crust with pyroxenite layers near the Moho. The peak thermotectonic event at 1.86-1.7 Ga was associated with the Hudsonian orogeny that assembled several cratons and lesser continental blocks into the Canadian Shield using a number of southeast-dipping megathrusts. This orogeny metasomatized, mineralized, and recrystallized mantle and lower crustal rocks, apparently making them more conductive by introducing or concentrating sulfides or graphite. Little evidence exists of thin slabs similar to modern oceanic lithosphere in this Precambrian construction history whereas underthrusting and wedging of continental lithosphere is inferred from multiple dipping discontinuities.

  9. Final Report Computational Analysis of Dynamical Systems

    SciTech Connect

    Guckenheimer, John

    2012-05-08

    This is the final report for DOE Grant DE-FG02-93ER25164, initiated in 1993. This grant supported research of John Guckenheimer on computational analysis of dynamical systems. During that period, seventeen individuals received PhD degrees under the supervision of Guckenheimer and over fifty publications related to the grant were produced. This document contains copies of these publications.

  10. COMPUTER ANALYSIS OF PLANAR GAMMA CAMERA IMAGES

    EPA Science Inventory



    Martonen, T.; Schroeter, J.

    Experimental Toxicology Division, National Health and Environmental Effects Research Laboratory, U.S. EPA, Research Triangle Park, NC 27711 USA, and Curriculum in Toxicology, Unive...

  11. Differential Cross Section Kinematics for 3-dimensional Transport Codes

    NASA Technical Reports Server (NTRS)

    Norbury, John W.; Dick, Frank

    2008-01-01

    In support of the development of 3-dimensional transport codes, this paper derives the relevant relativistic particle kinematic theory. Formulas are given for invariant, spectral and angular distributions in both the lab (spacecraft) and center of momentum frames, for collisions involving 2, 3 and n - body final states.
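
    For orientation, a standard pair of relations of the kind derived in such work is reproduced below (textbook relativistic kinematics, not quoted from the paper):

```latex
% The invariant distribution is the same in the lab and CM frames:
\[
E \,\frac{d^3\sigma}{dp^3}
  = \frac{1}{p}\,\frac{d^2\sigma}{dE\,d\Omega}\bigg|_{\text{lab}}
  = \frac{1}{p^*}\,\frac{d^2\sigma}{dE^*\,d\Omega^*}\bigg|_{\text{cm}},
\]
% with lab and CM energies related by the Lorentz transformation
\[
E = \gamma\left(E^* + \beta\, p^* \cos\theta^*\right),
\]
% where starred quantities refer to the center-of-momentum frame.
```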

  12. Controlled teleportation of a 3-dimensional bipartite quantum state

    NASA Astrophysics Data System (ADS)

    Cao, Hai-Jing; Chen, Zhong-Hua; Song, He-Shan

    2008-07-01

    A controlled teleportation scheme of an unknown 3-dimensional (3D) two-particle quantum state is proposed, where a 3D Bell state and 3D GHZ state function as the quantum channel. This teleportation scheme can be directly generalized to teleport an unknown d-dimensional bipartite quantum state.
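
    For orientation, the standard maximally entangled two-qutrit ("3D Bell") state commonly used as such a channel is shown below; the paper's phase conventions may differ:

```latex
% Maximally entangled two-qutrit state (standard form):
\[
|\Phi\rangle = \frac{1}{\sqrt{3}} \sum_{j=0}^{2} |j\rangle_A |j\rangle_B ,
\]
% and the generalized-Bell family for d = 3:
\[
|\Phi_{mn}\rangle = \frac{1}{\sqrt{3}} \sum_{j=0}^{2}
  e^{2\pi i\, j n / 3}\, |j\rangle_A\, |j \oplus m\rangle_B ,
\qquad m, n \in \{0, 1, 2\},
\]
% where \oplus denotes addition modulo 3.
```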

  13. Computational strategies for tire monitoring and analysis

    NASA Technical Reports Server (NTRS)

    Danielson, Kent T.; Noor, Ahmed K.; Green, James S.

    1995-01-01

    Computational strategies are presented for the modeling and analysis of tires in contact with pavement. A procedure is introduced for simple and accurate determination of tire cross-sectional geometric characteristics from a digitally scanned image. Three new strategies for reducing the computational effort in the finite element solution of tire-pavement contact are also presented. These strategies take advantage of the observation that footprint loads do not usually stimulate a significant tire response away from the pavement contact region. The finite element strategies differ in their level of approximation and required amount of computer resources. The effectiveness of the strategies is demonstrated by numerical examples of frictionless and frictional contact of the space shuttle Orbiter nose-gear tire. Both an in-house research code and a commercial finite element code are used in the numerical studies.

  14. Computational analysis of forebody tangential slot blowing

    NASA Technical Reports Server (NTRS)

    Gee, Ken; Agosta-Greenman, Roxana M.; Rizk, Yehia M.; Schiff, Lewis B.; Cummings, Russell M.

    1994-01-01

    An overview of the computational effort to analyze forebody tangential slot blowing is presented. Tangential slot blowing generates side force and yawing moment, which may be used to control an aircraft flying at high angle of attack. Two different geometries are used in the analysis: (1) the High Alpha Research Vehicle; and (2) a generic chined forebody. Computations using the isolated F/A-18 forebody are obtained at full-scale wind tunnel test conditions for direct comparison with available experimental data. The effects of over- and under-blowing on force and moment production are analyzed. Time-accurate solutions using the isolated forebody are obtained to study the force onset time lag of tangential slot blowing. Computations using the generic chined forebody are obtained at experimental wind tunnel conditions, and the results compared with available experimental data. This computational analysis complements the experimental results and provides a detailed understanding of the effects of tangential slot blowing on the flow field about simple and complex geometries.

  15. Probabilistic structural analysis computer code (NESSUS)

    NASA Technical Reports Server (NTRS)

    Shiao, Michael C.

    1988-01-01

    Probabilistic structural analysis has been developed to analyze the effects of fluctuating loads, variable material properties, and uncertain analytical models, especially for high performance structures such as SSME turbopump blades. The computer code NESSUS (Numerical Evaluation of Stochastic Structures Under Stress) was developed to serve as a primary computation tool for characterizing the probabilistic structural response to these stochastic environments by statistical description. The code consists of three major modules: NESSUS/PRE, NESSUS/FEM, and NESSUS/FPI. NESSUS/PRE is a preprocessor which decomposes the spatially correlated random variables into a set of uncorrelated random variables using a modal analysis method. NESSUS/FEM is a finite element module which provides structural sensitivities to all the random variables considered. NESSUS/FPI is a fast probability integration module by which a cumulative distribution function or a probability density function is calculated.

  16. Computer analysis of HIV epitope sequences

    SciTech Connect

    Gupta, G.; Myers, G.

    1990-01-01

    Phylogenetic tree analysis provides us with important general information regarding the extent and rate of HIV variation. Currently we are attempting to extend computer analysis and modeling to the V3 loop of the type 2 virus and its simian homologues, especially in light of the prominent role the latter will play in animal model studies. Moreover, it might be possible to attack the slightly similar V4 loop by this approach. However, the strategy relies very heavily upon "natural" information and constraints; thus there exist severe limitations upon the general applicability, in addition to uncertainties with regard to long-range residue interactions. 5 refs., 3 figs.

  17. Noncommutative 3 Dimensional Soliton from Multi-instantons

    NASA Astrophysics Data System (ADS)

    Correa, D. H.; Forgacs, P.; Moreno, E. F.; Schaposnik, F. A.; Silva, G. A.

    2004-07-01

    We extend the relation between instanton and monopole solutions of the selfduality equations in SU(2) gauge theory to noncommutative space-times. Using this approach and starting from a noncommutative multi-instanton solution we construct a U(2) monopole configuration which lives in 3 dimensional ordinary space. This configuration resembles the Wu-Yang monopole and satisfies the selfduality (Bogomol'nyi) equations for a U(2) Yang-Mills-Higgs system.

  18. Multimodality 3-Dimensional Image Integration for Congenital Cardiac Catheterization

    PubMed Central

    2014-01-01

    Cardiac catheterization procedures for patients with congenital and structural heart disease are becoming more complex. New imaging strategies involving integration of 3-dimensional images from rotational angiography, magnetic resonance imaging (MRI), computerized tomography (CT), and transesophageal echocardiography (TEE) are employed to facilitate these procedures. We discuss the current use of these new 3D imaging technologies and their advantages and challenges when used to guide complex diagnostic and interventional catheterization procedures in patients with congenital heart disease. PMID:25114757

  19. 76 FR 60939 - Metal Fatigue Analysis Performed by Computer Software

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-30

    ... COMMISSION Metal Fatigue Analysis Performed by Computer Software AGENCY: Nuclear Regulatory Commission... applicants' analyses and methodologies using the computer software package, WESTEMS TM , to demonstrate... by Computer Software Addressees All holders of, and applicants for, a power reactor operating...

  20. Successful Parenchyma-Sparing Anatomical Surgery by 3-Dimensional Reconstruction of Hilar Cholangiocarcinoma Combined with Anatomic Variation.

    PubMed

    Ni, Qihong; Wang, Haolu; Liang, Xiaowen; Zhang, Yunhe; Chen, Wei; Wang, Jian

    2016-06-01

    The combination of hilar cholangiocarcinoma and anatomic variation constitutes a rare and complicated condition. A precise understanding of the 3-dimensional position of the tumor within the intrahepatic structures in such cases is important for operation planning and navigation. We report the case of a 61-year-old woman presenting with hilar cholangiocarcinoma. The anatomic variation and tumor location were well depicted on preoperative multidetector computed tomography (MDCT) combined with 3-dimensional reconstruction: the right posterior segmental duct drained into the left hepatic duct. The common hepatic duct, biliary confluence, right anterior segmental duct, and right anterior branch of the portal vein were involved by the tumor (Bismuth IIIa). After careful operation planning, we successfully performed a radical parenchyma-sparing anatomical surgery for hilar cholangiocarcinoma: liver segmentectomy (segments 5 and 8) and caudate lobectomy. MDCT combined with 3-dimensional reconstruction is a reliable noninvasive modality for preoperative evaluation of hilar cholangiocarcinoma. PMID:27376205

  1. Semiconductor Device Analysis on Personal Computers

    1993-02-08

    PC-1D models the internal operation of bipolar semiconductor devices by solving for the concentrations and quasi-one-dimensional flow of electrons and holes resulting from either electrical or optical excitation. PC-1D uses the same detailed physical models incorporated in mainframe computer programs, yet runs efficiently on personal computers. PC-1D was originally developed with DOE funding to analyze solar cells. That continues to be its primary mode of usage, with registered copies in regular use at more than 100 locations worldwide. The program has been successfully applied to the analysis of silicon, gallium-arsenide, and indium-phosphide solar cells. The program is also suitable for modeling bipolar transistors and diodes, including heterojunction devices. Its easy-to-use graphical interface makes it useful as a teaching tool as well.

  2. Simple optical computing device for chemical analysis

    NASA Astrophysics Data System (ADS)

    Soyemi, Olusola O.; Zhang, Lixia; Eastwood, DeLyle; Li, Hongli; Gemperline, Paul J.; Myrick, Michael L.

    2001-05-01

    Multivariate Optical Computing (MOC) devices have the potential of greatly simplifying as well as reducing the cost of applying the mathematics of multivariate regression to problems of chemical analysis in the real world. These devices utilize special optical interference coatings known as multivariate optical elements (MOEs) that are encoded with pre-determined spectroscopic patterns to selectively quantify a chemical species of interest in the presence of other interfering species. A T-format prototype of the first optical computing device is presented utilizing a multilayer MOE consisting of alternating layers of two metal oxide films (Nb2O5 and SiO2) on a BK-7 glass substrate. The device was tested by using it to quantify copper uroporphyrin in a quaternary mixture consisting of uroporphyrin (freebase), tin uroporphyrin, nickel uroporphyrin, and copper uroporphyrin. A standard error of prediction (SEP) of 0.86 µM was obtained for copper uroporphyrin.
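
    As a hedged sketch of what the MOE encodes, the fragment below computes a least-squares regression vector from synthetic calibration spectra and applies it to an unknown sample as a single inner product, the operation the optical element performs physically; all spectra and concentrations are synthetic.

```python
# The prediction of a linear regression applied to a spectrum is an inner
# product, which the MOE's transmission profile computes optically.
import numpy as np

rng = np.random.default_rng(4)
wavelengths = np.linspace(400, 700, 150)              # nm
analyte = np.exp(-((wavelengths - 550) / 20) ** 2)    # analyte band
interferent = np.exp(-((wavelengths - 600) / 40) ** 2)

# Calibration set: mixtures with known analyte concentrations.
conc = rng.uniform(0, 10, 30)
other = rng.uniform(0, 10, 30)
spectra = np.outer(conc, analyte) + np.outer(other, interferent)
spectra += 0.01 * rng.standard_normal(spectra.shape)

# Least-squares regression vector b such that conc ~= spectra @ b.
b, *_ = np.linalg.lstsq(spectra, conc, rcond=None)

# "Optical computation": one dot product per unknown sample.
unknown = 4.2 * analyte + 7.0 * interferent
print(f"predicted analyte concentration: {unknown @ b:.2f} (true 4.20)")
```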

  3. Dental computed tomographic imaging as age estimation: morphological analysis of the third molar of a group of Turkish population.

    PubMed

    Cantekin, Kenan; Sekerci, Ahmet Ercan; Buyuk, Suleyman Kutalmis

    2013-12-01

    Computed tomography (CT) is capable of providing accurate and measurable 3-dimensional images of the third molar. The aims of this study were to analyze the development of the mandibular third molar and its relation to chronological age and to create new reference data for a group of Turkish participants aged 9 to 25 years on the basis of cone-beam CT images. All data were obtained from the patients' records, including medical, social, and dental anamnesis, and cone-beam CT images of 752 patients. Linear regression analysis was performed to obtain regression formulas for dental age calculation from chronological age and to determine the coefficient of determination (r2) for each sex. Statistical analysis showed a strong correlation between age and third-molar development for the males (r2 = 0.80) and the females (r2 = 0.78). Computed tomographic images are clinically useful for accurate and reliable estimation of the dental ages of children and youth.
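
    A hedged sketch of the statistical step described above, regressing age on a third-molar development score and reporting r2 on synthetic data (not the study's records), is given below.

```python
# Regress chronological age on a third-molar development score and report
# the coefficient of determination. The data below are synthetic.
import numpy as np

rng = np.random.default_rng(5)
stage = rng.uniform(0, 10, 300)                    # development score
age = 9.0 + 1.6 * stage + rng.normal(0, 1.5, 300)  # assumed relation + noise

slope, intercept = np.polyfit(stage, age, 1)
predicted = intercept + slope * stage
r_squared = 1 - np.sum((age - predicted) ** 2) / np.sum((age - age.mean()) ** 2)

print(f"age = {intercept:.2f} + {slope:.2f} * stage, r^2 = {r_squared:.2f}")
```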

  4. Computed tomographic analysis of meteorite inclusions.

    PubMed

    Arnold, J R; Testa, J P; Friedman, P J; Kambic, G X

    1983-01-28

    The discovery of isotopic anomalies in the calcium- and aluminum-rich inclusions of the Allende meteorite has improved our knowledge of the origin of the solar system. Inability to find more inclusions without destroying the meteorite has hampered further study. By using a fourth-generation computed tomographic scanner with modifications to the software only, the interior of heterogeneous materials such as Allende can be nondestructively probed. The regions of material with high and low atomic numbers are displayed quickly. The object can then be cut to obtain for analysis just the areas of interest.

  5. FORTRAN computer program for seismic risk analysis

    USGS Publications Warehouse

    McGuire, Robin K.

    1976-01-01

    A program for seismic risk analysis is described which combines generality of application, efficiency and accuracy of operation, and the advantage of small storage requirements. The theoretical basis for the program is first reviewed, and the computational algorithms used to apply this theory are described. The information required for running the program is listed. Published attenuation functions describing the variation with earthquake magnitude and distance of expected values for various ground motion parameters are summarized for reference by the program user. Finally, suggestions for use of the program are made, an example problem is described (along with example problem input and output) and the program is listed.

  6. Computer analysis of mammography phantom images (CAMPI)

    NASA Astrophysics Data System (ADS)

    Chakraborty, Dev P.

    1997-05-01

    Computer analysis of mammography phantom images (CAMPI) is a method for objective and precise measurements of phantom image quality in mammography. This investigation applied CAMPI methodology to the Fischer Mammotest Stereotactic Digital Biopsy machine. Images of an American College of Radiology phantom centered on the largest two microcalcification groups were obtained on this machine under a variety of x-ray conditions. Analyses of the images revealed that the precise behavior of the CAMPI measures could be understood from basic imaging physics principles. We conclude that CAMPI is sensitive to subtle image quality changes and can perform accurate evaluations of images, especially of directly acquired digital images.

  7. Computer network environment planning and analysis

    NASA Technical Reports Server (NTRS)

    Dalphin, John F.

    1989-01-01

    The GSFC Computer Network Environment provides a broadband RF cable between campus buildings and Ethernet spines in buildings for the interlinking of Local Area Networks (LANs). This system provides terminal and computer linkage among host and user systems, thereby providing E-mail services, file exchange capability, and certain distributed computing opportunities. The Environment is designed to be transparent and supports multiple protocols. Networking at Goddard has a short history and has been under the coordinated control of a Network Steering Committee for slightly more than two years; network growth has been rapid, with more than 1500 nodes currently addressed and greater expansion expected. A new RF cable system with a different topology is being installed during summer 1989; consideration of a fiber optics system for the future will begin soon. Summer study was directed toward Network Steering Committee operation and planning plus consideration of Center Network Environment analysis and modeling. Biweekly Steering Committee meetings were attended to learn the background of the network and the concerns of those managing it. Suggestions for historical data gathering have been made to support future planning and modeling. Data Systems Dynamic Simulator, a simulation package developed at NASA and maintained at GSFC, was studied as a possible modeling tool for the network environment. A modeling concept based on a hierarchical model was hypothesized for further development. Such a model would allow input of newly updated parameters and would provide an estimation of the behavior of the network.

  8. Experimental analysis of computer system dependability

    NASA Technical Reports Server (NTRS)

    Iyer, Ravishankar, K.; Tang, Dong

    1993-01-01

    This paper reviews an area which has evolved over the past 15 years: experimental analysis of computer system dependability. Methodologies and advances are discussed for three basic approaches used in the area: simulated fault injection, physical fault injection, and measurement-based analysis. The three approaches are suited, respectively, to dependability evaluation in the three phases of a system's life: design phase, prototype phase, and operational phase. Before the discussion of these phases, several statistical techniques used in the area are introduced. For each phase, a classification of research methods or study topics is outlined, followed by discussion of these methods or topics as well as representative studies. The statistical techniques introduced include the estimation of parameters and confidence intervals, probability distribution characterization, and several multivariate analysis methods. Importance sampling, a statistical technique used to accelerate Monte Carlo simulation, is also introduced. The discussion of simulated fault injection covers electrical-level, logic-level, and function-level fault injection methods as well as representative simulation environments such as FOCUS and DEPEND. The discussion of physical fault injection covers hardware, software, and radiation fault injection methods as well as several software and hybrid tools including FIAT, FERARI, HYBRID, and FINE. The discussion of measurement-based analysis covers measurement and data processing techniques, basic error characterization, dependency analysis, Markov reward modeling, software-dependability, and fault diagnosis. The discussion involves several important issues studies in the area, including fault models, fast simulation techniques, workload/failure dependency, correlated failures, and software fault tolerance.

  9. Prenatal diagnosis of holoprosencephaly with ethmocephaly via 3-dimensional sonography.

    PubMed

    Lee, Gui-Se-Ra; Hur, Soo Young; Shin, Jong-Chul; Kim, Soo-Pyung; Kim, Sa Jin

    2006-01-01

    We present the prenatal 3-dimensional (3D) sonographic findings in a case of holoprosencephaly with ethmocephaly at 32 weeks' gestation. The sonographic diagnosis was based on the intracranial findings of a single ventricle and a bulb-shaped appearance of the thalami, together with facial abnormalities including hypotelorism with a proboscis. Chromosome study of the fetus revealed a normal female karyotype (46,XX). Postmortem examination confirmed the 3D sonographic findings. This case demonstrates that 3D sonography improves imaging and understanding of both the intracranial abnormalities and the facial anomalies. PMID:16788963

  10. The 3-dimensional cellular automata for HIV infection

    NASA Astrophysics Data System (ADS)

    Mo, Youbin; Ren, Bin; Yang, Wencao; Shuai, Jianwei

    2014-04-01

    The HIV infection dynamics is discussed in detail with a 3-dimensional cellular automata model in this paper. The model reproduces the three-phase development observed clinically in HIV-infected patients: the acute period, the asymptomatic period, and the AIDS period. We show that the 3D HIV model is more robust to the model parameters than the 2D cellular automata. Furthermore, we reveal that the occurrence of a perpetual source, which successively generates infectious waves that spread to the whole system, drives the model from the asymptomatic state to the AIDS state.
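
    The family of update rules such models build on can be made concrete with a deliberately simplified sketch. The states and probabilities below are illustrative; the paper's two-stage infected states and exact parameter values are not reproduced.

```python
# A minimal 3D cellular automaton in the spirit of HIV-infection models:
# healthy cells are infected by contact, infected cells die, and dead cells
# are replenished with a small chance of reinfection (a perpetual source).
import numpy as np

HEALTHY, INFECTED, DEAD = 0, 1, 2
rng = np.random.default_rng(1)
L, steps = 40, 60
grid = np.where(rng.random((L, L, L)) < 0.05, INFECTED, HEALTHY)

def infected_neighbors(g):
    # Count infected cells among the 6 face neighbors (periodic boundaries).
    n = np.zeros_like(g, dtype=int)
    for axis in range(3):
        for shift in (-1, 1):
            n += np.roll(g == INFECTED, shift, axis=axis)
    return n

for t in range(steps):
    neigh = infected_neighbors(grid)
    new = grid.copy()
    new[(grid == HEALTHY) & (neigh >= 1)] = INFECTED   # contact infection
    new[grid == INFECTED] = DEAD                        # infected cells die
    revive = (grid == DEAD) & (rng.random(grid.shape) < 0.99)
    new[revive] = HEALTHY                               # replenishment
    reinfect = revive & (rng.random(grid.shape) < 1e-3)
    new[reinfect] = INFECTED                            # rare reinfection source
    grid = new
    print(t, (grid == INFECTED).mean())                 # infected fraction
```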

  11. Good relationships between computational image analysis and radiological physics

    SciTech Connect

    Arimura, Hidetaka; Kamezawa, Hidemi; Jin, Ze; Nakamoto, Takahiro; Soufi, Mazen

    2015-09-30

    Good relationships between computational image analysis and radiological physics have been built, increasing the accuracy of medical diagnostic imaging and radiation therapy. Computational image analysis is grounded in applied mathematics, physics, and engineering. This review paper introduces how computational image analysis is useful in radiation therapy with respect to radiological physics.

  12. Analysis of Ventricular Function by Computed Tomography

    PubMed Central

    Rizvi, Asim; Deaño, Roderick C.; Bachman, Daniel P.; Xiong, Guanglei; Min, James K.; Truong, Quynh A.

    2014-01-01

    The assessment of ventricular function, cardiac chamber dimensions and ventricular mass is fundamental for clinical diagnosis, risk assessment, therapeutic decisions, and prognosis in patients with cardiac disease. Although cardiac computed tomography (CT) is a noninvasive imaging technique often used for the assessment of coronary artery disease, it can also be utilized to obtain important data about left and right ventricular function and morphology. In this review, we will discuss the clinical indications for the use of cardiac CT for ventricular analysis, review the evidence on the assessment of ventricular function compared to existing imaging modalities such as cardiac MRI and echocardiography, provide a typical cardiac CT protocol for image acquisition and post-processing for ventricular analysis, and provide step-by-step instructions to acquire multiplanar cardiac views for ventricular assessment from the standard axial, coronal, and sagittal planes. Furthermore, both qualitative and quantitative assessments of ventricular function as well as sample reporting are detailed. PMID:25576407

  13. Computational Analysis of Human Blood Flow

    NASA Astrophysics Data System (ADS)

    Panta, Yogendra; Marie, Hazel; Harvey, Mark

    2009-11-01

    Fluid flow modeling with commercially available computational fluid dynamics (CFD) software is widely used to visualize and predict physical phenomena related to various biological systems. In this presentation, a typical human aorta model was analyzed assuming laminar blood flow with compliant cardiac muscle wall boundaries. FLUENT, a commercially available finite volume software package, coupled with SolidWorks, a modeling software package, was employed for the preprocessing, simulation, and postprocessing of all the models. The analysis mainly consists of a fluid-dynamics analysis, including calculation of the velocity field and pressure distribution in the blood, and a mechanical analysis of the deformation of the tissue and artery in terms of wall shear stress. A number of other models (e.g., T-branches and angled vessels) were analyzed previously and their results compared for consistency under similar boundary conditions. The velocities, pressures, and wall shear stress distributions achieved in all models were as expected given the similar boundary conditions. A three-dimensional, time-dependent analysis of blood flow accounting for the effect of body forces with a compliant boundary was also performed.
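
    A steady Poiseuille estimate is a common hand-check for CFD results like these. The sketch below is not from the presentation; all property values are illustrative assumptions for blood in the aorta.

```python
# Poiseuille sanity checks for aortic flow: mean velocity, wall shear stress,
# and Reynolds number. All inputs below are assumed, not measured.
import math

mu = 3.5e-3      # dynamic viscosity of blood [Pa s] (assumed)
Q = 5.0 / 60e3   # cardiac output, 5 L/min -> [m^3/s]
R = 0.0125       # aortic radius [m] (assumed)
rho = 1060.0     # blood density [kg/m^3] (assumed)

v_mean = Q / (math.pi * R**2)                # mean velocity
tau_wall = 4.0 * mu * Q / (math.pi * R**3)   # Poiseuille wall shear stress
Re = rho * v_mean * 2 * R / mu               # Reynolds number

print(f"mean velocity {v_mean:.3f} m/s, wall shear {tau_wall:.3f} Pa, Re {Re:.0f}")
```

    With these values Re is on the order of 1000, consistent with the laminar-flow assumption stated in the abstract.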

  14. Automated feature extraction for 3-dimensional point clouds

    NASA Astrophysics Data System (ADS)

    Magruder, Lori A.; Leigh, Holly W.; Soderlund, Alexander; Clymer, Bradley; Baer, Jessica; Neuenschwander, Amy L.

    2016-05-01

    Light detection and ranging (LIDAR) technology offers the capability to rapidly capture high-resolution, 3-dimensional surface data with centimeter-level accuracy for a large variety of applications. Due to the foliage-penetrating properties of LIDAR systems, these geospatial data sets can detect ground surfaces beneath trees, enabling the production of high-fidelity bare earth elevation models. Precise characterization of the ground surface allows for identification of terrain and non-terrain points within the point cloud, and facilitates further discernment between natural and man-made objects based solely on structural aspects and relative neighboring parameterizations. A framework is presented here for automated extraction of natural and man-made features that does not rely on coincident ortho-imagery or point RGB attributes. The TEXAS (Terrain EXtraction And Segmentation) algorithm is used first to generate a bare earth surface from a LIDAR survey, which is then used to classify points as terrain or non-terrain. Further classifications are assigned at the point level by leveraging local spatial information. Similarly classed points are then clustered together into regions to identify individual features. Descriptions of the spatial attributes of each region are generated, resulting in the identification of individual tree locations, forest extents, building footprints, and 3-dimensional building shapes, among others. Results of the fully-automated feature extraction algorithm are then compared to ground truth to assess completeness and accuracy of the methodology.
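
    The post-terrain-extraction steps described above (height-above-ground classification, then clustering of similarly classed points into regions) can be sketched as follows. The bare-earth surface is assumed to come from a TEXAS-like step, and all thresholds are illustrative, not the paper's.

```python
# Classify LIDAR points by height above a bare-earth surface, then group the
# non-terrain points into candidate feature regions by single-linkage
# clustering (union-find over KD-tree neighbor pairs).
import numpy as np
from scipy.spatial import cKDTree

def classify_and_cluster(points, ground_z, h_thresh=0.5, link=1.5):
    """points: (N,3) array; ground_z(x, y) -> bare-earth elevation."""
    hag = points[:, 2] - ground_z(points[:, 0], points[:, 1])
    non_terrain = points[hag > h_thresh]          # above-ground points

    tree = cKDTree(non_terrain[:, :2])
    parent = np.arange(len(non_terrain))
    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]         # path compression
            i = parent[i]
        return i
    for i, j in tree.query_pairs(link):           # pairs closer than `link`
        parent[find(i)] = find(j)
    labels = np.array([find(i) for i in range(len(non_terrain))])
    return non_terrain, labels                    # region label per point
```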

  15. Methods of the computer-aided statistical analysis of microcircuits

    NASA Astrophysics Data System (ADS)

    Beliakov, Iu. N.; Kurmaev, F. A.; Batalov, B. V.

    Methods that are currently used for the computer-aided statistical analysis of microcircuits at the design stage are summarized. In particular, attention is given to methods for solving problems in statistical analysis, statistical planning, and factorial model synthesis by means of irregular experimental design. Efficient ways of reducing the computer time required for statistical analysis and numerical methods of microcircuit analysis are proposed. The discussion also covers various aspects of the organization of computer-aided microcircuit modeling and analysis systems.

  16. Computational System For Rapid CFD Analysis In Engineering

    NASA Technical Reports Server (NTRS)

    Barson, Steven L.; Ascoli, Edward P.; Decroix, Michelle E.; Sindir, Munir M.

    1995-01-01

    Computational system comprising modular hardware and software sub-systems developed to accelerate and facilitate use of techniques of computational fluid dynamics (CFD) in engineering environment. Addresses integration of all aspects of CFD analysis process, including definition of hardware surfaces, generation of computational grids, CFD flow solution, and postprocessing. Incorporates interfaces for integration of all hardware and software tools needed to perform complete CFD analysis. Includes tools for efficient definition of flow geometry, generation of computational grids, computation of flows on grids, and postprocessing of flow data. System accepts geometric input from any of three basic sources: computer-aided design (CAD), computer-aided engineering (CAE), or definition by user.

  17. 3-Dimensional Marine CSEM Modeling by Employing TDFEM with Parallel Solvers

    NASA Astrophysics Data System (ADS)

    Wu, X.; Yang, T.

    2013-12-01

    In this paper, a parallel implementation is developed for forward modeling of 3-dimensional controlled source electromagnetic (CSEM) surveys using the time-domain finite element method (TDFEM). Recently, attention has increasingly turned to the mechanisms for detecting hydrocarbon (HC) reservoirs beneath the seabed. Since China has vast ocean resources, the search for hydrocarbon reservoirs has become significant to the national economy. However, traditional seismic exploration methods face a crucial obstacle in detecting hydrocarbon reservoirs beneath seabeds with complex structure, owing to relatively high acquisition costs and exploration risk. In addition, the development of EM simulations typically requires both a deep knowledge of computational electromagnetics (CEM) and the proper use of sophisticated techniques and tools from computer science. The complexity of large-scale EM simulations often demands large memory, because of the large amount of data, or long solution times for matrix solvers, function transforms, optimization, etc. The objective of this paper is to present a parallelized implementation of the time-domain finite element method for the analysis of three-dimensional (3D) marine controlled source electromagnetic problems. First, we established a three-dimensional background model from the seismic data; electromagnetic simulation of marine CSEM was then carried out using the time-domain finite element method, running on an MPI (Message Passing Interface) platform, to allow fast detection of hydrocarbon targets in the ocean environment. To speed up the calculation, the MPI version of SuperLU, SuperLU_DIST, is employed in this approach. To represent the three-dimensional seabed terrain realistically, the region is discretized into an unstructured mesh rather than a uniform one, in order to reduce the number of unknowns. Moreover, high-order Whitney

  18. Review of Computational Stirling Analysis Methods

    NASA Technical Reports Server (NTRS)

    Dyson, Rodger W.; Wilson, Scott D.; Tew, Roy C.

    2004-01-01

    Nuclear thermal-to-electric power conversion carries the promise of longer duration missions and higher scientific data transmission rates back to Earth for both Mars rovers and deep space missions. A free-piston Stirling convertor is a candidate technology that is considered an efficient and reliable power conversion device for such purposes. While already very efficient, it is believed that better Stirling engines can be developed if the losses inherent in current designs could be better understood. However, these engines are difficult to instrument, and so efforts are underway to simulate a complete Stirling engine numerically. This has only recently been attempted, and a review of the methods leading up to and including such computational analysis is presented. Finally, it is proposed that the quality and depth of Stirling loss understanding may be improved by utilizing the higher fidelity and efficiency of recently developed numerical methods. One such method, the Ultra HiFi technique, is presented in detail.

  19. Probabilistic Computational Methods in Structural Failure Analysis

    NASA Astrophysics Data System (ADS)

    Krejsa, Martin; Kralik, Juraj

    2015-12-01

    Probabilistic methods are used in engineering where a computational model contains random variables. Each random variable in the probabilistic calculations carries uncertainty. Typical sources of uncertainty are the properties of the material, production and/or assembly inaccuracies in the geometry, and the environment where the structure is located. The paper is focused on methods for calculating failure probabilities in structural failure and reliability analysis, with special attention to a newly developed probabilistic method, Direct Optimized Probabilistic Calculation (DOProC), which is highly efficient in terms of calculation time and solution accuracy. The novelty of the proposed method lies in an optimized numerical integration that does not require any simulation technique. The algorithm has been implemented in software applications and has been used several times in probabilistic tasks and probabilistic reliability assessments.
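
    The simulation-free idea can be illustrated with a toy reliability problem: failure probability obtained by direct numerical integration over discretized distributions rather than Monte Carlo. The distributions below are assumptions of this sketch, not the DOProC implementation.

```python
# Failure probability P_f = P(R - S < 0) for resistance R and load S,
# computed as the integral of f_R(r) * P(S > r) with no simulation.
import numpy as np
from scipy import stats

r = np.linspace(0, 600, 2001)                  # resistance grid (e.g. MPa)
fR = stats.norm(300, 30).pdf(r)                # assumed R ~ N(300, 30)
FS = stats.norm(200, 40).cdf                   # assumed S ~ N(200, 40)

p_f = np.trapz(fR * (1.0 - FS(r)), r)          # direct numerical integration
beta = -stats.norm.ppf(p_f)                    # reliability index
print(f"P_f = {p_f:.3e}, beta = {beta:.2f}")   # analytic check: Phi(-2) ~ 0.0228
```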

  20. Computed tomographic analysis of meteorite inclusions

    NASA Technical Reports Server (NTRS)

    Arnold, J. R.; Testa, J. P., Jr.; Friedman, P. J.; Kambic, G. X.

    1983-01-01

    The feasibility of obtaining nondestructively a cross-sectional display of very dense heterogeneous rocky specimens, whether lunar, terrestrial or meteoritic, by using a fourth generation computed tomographic (CT) scanner, with modifications to the software only, is discussed. A description of the scanner, and of the experimental and analytical procedures is given. Using this technique, the interior of heterogeneous materials such as Allende can be probed nondestructively. The regions of material with high and low atomic numbers are displayed quickly; the object can then be cut to obtain for analysis just the areas of interest. A comparison of this technique with conventional industrial and medical techniques is made in terms of image resolution and density distribution display precision.

  1. PROMALS3D: multiple protein sequence alignment enhanced with evolutionary and 3-dimensional structural information

    PubMed Central

    Pei, Jimin; Grishin, Nick V.

    2015-01-01

    Multiple sequence alignment (MSA) is an essential tool with many applications in bioinformatics and computational biology. Accurate MSA construction for divergent proteins remains a difficult computational task. The constantly increasing number of protein sequences and structures in public databases can be used to improve alignment quality. PROMALS3D is a tool for protein MSA construction enhanced with additional evolutionary and structural information from database searches. PROMALS3D automatically identifies homologs from sequence and structure databases for input proteins, derives structure-based constraints from alignments of 3-dimensional structures, and combines them with sequence-based constraints of profile-profile alignments in a consistency-based framework to construct high-quality multiple sequence alignments. PROMALS3D output is a consensus alignment enriched with sequence and structural information about input proteins and their homologs. The PROMALS3D web server and package are available at http://prodata.swmed.edu/PROMALS3D. PMID:24170408

  2. High-speed 3-dimensional imaging in robot-assisted thoracic surgical procedures.

    PubMed

    Kajiwara, Naohiro; Akata, Soichi; Hagiwara, Masaru; Yoshida, Koichi; Kato, Yasufumi; Kakihana, Masatoshi; Ohira, Tatsuo; Kawate, Norihiko; Ikeda, Norihiko

    2014-06-01

    We used a high-speed 3-dimensional (3D) image analysis system (SYNAPSE VINCENT, Fujifilm Corp, Tokyo, Japan) to determine the best positioning of robotic arms and instruments preoperatively. The da Vinci S (Intuitive Surgical Inc, Sunnyvale, CA) was easily set up accurately and rapidly for this operation. Preoperative simulation and intraoperative navigation using the SYNAPSE VINCENT for robot-assisted thoracic operations enabled efficient planning of the operation settings. The SYNAPSE VINCENT can detect the tumor location and depict surrounding tissues quickly, accurately, and safely. This system is also excellent for navigational and educational use. PMID:24882302

  4. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    NASA Technical Reports Server (NTRS)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is disadvantaged when compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized, network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two-part model framework characterizes both the demand, using a probability distribution for each type of service request, and the enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
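
    As a minimal stand-in for the kind of model described (not the authors' framework), the sketch below simulates a single computing resource as an M/M/1 queue and measures request latency under given arrival and service rates.

```python
# Discrete event simulation of an M/M/1 service queue: exponential
# inter-arrival and service times, one server, mean latency reported.
import random

def mm1(arrival_rate, service_rate, n_requests, seed=0):
    rng = random.Random(seed)
    t, server_free_at, latencies = 0.0, 0.0, []
    for _ in range(n_requests):
        t += rng.expovariate(arrival_rate)       # next request arrives
        start = max(t, server_free_at)           # wait if the server is busy
        finish = start + rng.expovariate(service_rate)
        server_free_at = finish
        latencies.append(finish - t)             # time in system
    return sum(latencies) / len(latencies)

# At utilization 0.8 the mean latency should approach 1/(mu - lambda) = 5.
print(mm1(arrival_rate=0.8, service_rate=1.0, n_requests=200_000))
```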

  5. Can cloud computing benefit health services? - a SWOT analysis.

    PubMed

    Kuo, Mu-Hsing; Kushniruk, Andre; Borycki, Elizabeth

    2011-01-01

    In this paper, we discuss cloud computing, the current state of cloud computing in healthcare, and the challenges and opportunities of adopting cloud computing in healthcare. A Strengths, Weaknesses, Opportunities and Threats (SWOT) analysis was used to evaluate the feasibility of adopting this computing model in healthcare. The paper concludes that cloud computing could have huge benefits for healthcare but there are a number of issues that will need to be addressed before its widespread use in healthcare. PMID:21893777

  7. A computational design system for rapid CFD analysis

    NASA Technical Reports Server (NTRS)

    Ascoli, E. P.; Barson, S. L.; Decroix, M. E.; Sindir, Munir M.

    1992-01-01

    A computational design system (CDS) is described in which computational analysis tools are integrated in a modular fashion. This CDS ties together four key areas of computational analysis: description of geometry; grid generation; computational codes; and postprocessing. Integration of improved computational fluid dynamics (CFD) analysis tools through the CDS has made a significant positive impact in the use of CFD for engineering design problems. Complex geometries are now analyzed on a frequent basis and with greater ease.

  8. Schlieren sequence analysis using computer vision

    NASA Astrophysics Data System (ADS)

    Smith, Nathanial Timothy

    Computer vision-based methods are proposed for extraction and measurement of flow structures of interest in schlieren video. As schlieren data has increased with faster frame rates, we are faced with thousands of images to analyze. This presents an opportunity to study global flow structures over time that may not be evident from surface measurements. A degree of automation is desirable to extract flow structures and features to give information on their behavior through the sequence. Using an interdisciplinary approach, the analysis of large schlieren data is recast as a computer vision problem. The double-cone schlieren sequence is used as a testbed for the methodology; it is unique in that it contains 5,000 images, complex phenomena, and is feature rich. Oblique structures such as shock waves and shear layers are common in schlieren images. A vision-based methodology is used to provide an estimate of oblique structure angles through the unsteady sequence. The methodology has been applied to a complex flowfield with multiple shocks. A converged detection success rate between 94% and 97% for these structures is obtained. The modified curvature scale space is used to define features at salient points on shock contours. A challenge in developing methods for feature extraction in schlieren images is the reconciliation of existing techniques with features of interest to an aerodynamicist. Domain-specific knowledge of physics must therefore be incorporated into the definition and detection phases. Known location and physically possible structure representations form a knowledge base that provides a unique feature definition and extraction. Model tip location and the motion of a shock intersection across several thousand frames are identified, localized, and tracked. Images are parsed into physically meaningful labels using segmentation. Using this representation, it is shown that in the double-cone flowfield, the dominant unsteady motion is associated with large scale
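
    For the oblique-structure angle measurement described above, one standard building block is an edge detector followed by a Hough line transform. The sketch below uses OpenCV under that assumption; the dissertation's actual detector, parameters, and the input filename are not reproduced here.

```python
# Estimate oblique-structure angles in a schlieren frame via Canny edges
# plus a Hough line transform. Thresholds are illustrative.
import cv2
import numpy as np

def oblique_angles(frame_gray, canny_lo=50, canny_hi=150, votes=120):
    edges = cv2.Canny(frame_gray, canny_lo, canny_hi)
    lines = cv2.HoughLines(edges, 1, np.pi / 180.0, votes)
    if lines is None:
        return []
    # HoughLines returns (rho, theta) with theta the normal angle of the line;
    # convert to the line's inclination in degrees.
    return [90.0 - np.degrees(theta) for rho, theta in lines[:, 0]]

frame = cv2.imread("schlieren_frame.png", cv2.IMREAD_GRAYSCALE)  # hypothetical file
print(sorted(oblique_angles(frame)))
```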

  9. Ferrofluids: Modeling, numerical analysis, and scientific computation

    NASA Astrophysics Data System (ADS)

    Tomas, Ignacio

    This dissertation presents some developments in the numerical analysis of partial differential equations (PDEs) describing the behavior of ferrofluids. The most widely accepted PDE model for ferrofluids is the micropolar model proposed by R.E. Rosensweig. The Micropolar Navier-Stokes Equations (MNSE) are a subsystem of PDEs within the Rosensweig model. Being a simplified version of the much bigger system of PDEs proposed by Rosensweig, the MNSE are a natural starting point of this thesis. The MNSE couple linear velocity u, angular velocity w, and pressure p. We propose and analyze a first-order semi-implicit fully-discrete scheme for the MNSE, which decouples the computation of the linear and angular velocities, is unconditionally stable, and delivers optimal convergence rates under assumptions analogous to those used for the Navier-Stokes equations. Moving on to Rosensweig's much more complex model, we provide a definition (approximation) for the effective magnetizing field h, and explain the assumptions behind this definition. Unlike previous definitions available in the literature, this new definition is able to accommodate the effect of external magnetic fields. Using this definition we set up the system of PDEs coupling linear velocity u, pressure p, angular velocity w, magnetization m, and magnetic potential φ. We show that this system is energy-stable and devise a numerical scheme that mimics the same stability property. We prove that solutions of the numerical scheme always exist and, under certain simplifying assumptions, that the discrete solutions converge. A notable outcome of the analysis of the numerical scheme for Rosensweig's model is the choice of finite element spaces that allow the construction of an energy-stable scheme. Finally, with the lessons learned from Rosensweig's model, we develop a diffuse-interface model describing the behavior of two-phase ferrofluid flows and present an energy-stable numerical scheme for this model. For a
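
    For reference, one common statement of the MNSE coupling u, w, and p reads as follows. Conventions and coefficient names vary across the literature; this is a standard form, not necessarily the dissertation's exact scaling.

```latex
\begin{aligned}
&\partial_t \mathbf{u} + (\mathbf{u}\cdot\nabla)\mathbf{u}
  - (\nu+\nu_r)\,\Delta\mathbf{u} + \nabla p
  = 2\nu_r\,\nabla\times\mathbf{w} + \mathbf{f},
\qquad \nabla\cdot\mathbf{u}=0,\\
&\partial_t \mathbf{w} + (\mathbf{u}\cdot\nabla)\mathbf{w}
  - c_a\,\Delta\mathbf{w} - c_d\,\nabla(\nabla\cdot\mathbf{w})
  + 4\nu_r\,\mathbf{w}
  = 2\nu_r\,\nabla\times\mathbf{u} + \mathbf{g},
\end{aligned}
```

    where ν is the kinematic viscosity, ν_r the vortex (micro-rotation) viscosity, and c_a, c_d angular viscosity coefficients.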

  10. TAIR- TRANSONIC AIRFOIL ANALYSIS COMPUTER CODE

    NASA Technical Reports Server (NTRS)

    Dougherty, F. C.

    1994-01-01

    The Transonic Airfoil analysis computer code, TAIR, was developed to employ a fast, fully implicit algorithm to solve the conservative full-potential equation for the steady transonic flow field about an arbitrary airfoil immersed in a subsonic free stream. The full-potential formulation is considered exact under the assumptions of irrotational, isentropic, and inviscid flow. These assumptions are valid for a wide range of practical transonic flows typical of modern aircraft cruise conditions. The primary features of TAIR include: a new fully implicit iteration scheme which is typically many times faster than classical successive line overrelaxation algorithms; a new, reliable artificial density spatial differencing scheme treating the conservative form of the full-potential equation; and a numerical mapping procedure capable of generating curvilinear, body-fitted finite-difference grids about arbitrary airfoil geometries. Three aspects emphasized during the development of the TAIR code were reliability, simplicity, and speed. The reliability of TAIR comes from two sources: the new algorithm employed and the implementation of effective convergence monitoring logic. TAIR achieves ease of use by employing a "default mode" that greatly simplifies code operation, especially by inexperienced users, and many useful options including: several airfoil-geometry input options, flexible user controls over program output, and a multiple solution capability. The speed of the TAIR code is attributed to the new algorithm and the manner in which it has been implemented. Input to the TAIR program consists of airfoil coordinates, aerodynamic and flow-field convergence parameters, and geometric and grid convergence parameters. The airfoil coordinates for many airfoil shapes can be generated in TAIR from just a few input parameters. Most of the other input parameters have default values which allow the user to run an analysis in the default mode by specifying only a few input parameters.
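
    The conservative full-potential equation TAIR solves can be written, in a standard nondimensional form (the code's exact scaling may differ), as:

```latex
% Conservative full-potential equation with the isentropic density relation;
% velocity is scaled by the freestream speed, so |\nabla\phi| = 1 upstream.
\nabla\cdot\bigl(\rho\,\nabla\phi\bigr) = 0, \qquad
\rho = \Bigl[\,1 + \tfrac{\gamma-1}{2}\,M_\infty^{2}\,
        \bigl(1 - |\nabla\phi|^{2}\bigr)\Bigr]^{1/(\gamma-1)} .
```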

  11. Research in applied mathematics, numerical analysis, and computer science

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering (ICASE) in applied mathematics, numerical analysis, and computer science is summarized and abstracts of published reports are presented. The major categories of the ICASE research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software, especially vector and parallel computers.

  12. Computer-aided petrographic analysis of sandstones

    SciTech Connect

    Thayer, P.A.; Helmold, K.P.

    1987-05-01

    Thin-section point counting, mathematical and statistical analysis of petrographic-petrophysical data, report generation, and graphical presentation of results can be done efficiently by computer. Compositional and textural data are collected with a modified Schares point-counting system. The system uses an MS-DOS microcomputer programmed in BASIC to drive a motorized stage attached to a polarizing microscope. Numeric codes for up to 500 different categories of minerals, cements, pores, etc., are input using a separate keypad. Calculation and printing of constituent percentages, QFR, Folk name, and grain-size distribution are completed in seconds after data entry. Raw data files, compatible with software such as Lotus 1-2-3, SPSS, and SAS, are stored on floppy disk. Petrographic data files are transferred directly to a mainframe, merged with log and petrophysical data, analyzed statistically with SAS, and reports generated. SAS/GRAPH and TELL-A-GRAF routines linked with SAS generate a variety of cross plots, histograms, pie and bar charts, ternary diagrams, and vertical variation diagrams (e.g., depth vs. porosity, permeability, mean size, sorting, and percent grains-matrix-cement).

  13. Computational analysis of DOD drop formation

    NASA Astrophysics Data System (ADS)

    Xu, Qi; Basaran, Osman

    2007-11-01

    A fundamental theoretical understanding of drop-on-demand (DOD) ink jet printing remains weak despite the widespread use of the method in practical applications for two decades. To overcome this deficiency, a computational analysis is carried out to simulate the formation of liquid drops of incompressible Newtonian fluids from a nozzle by imposing a transient flow rate upstream of the nozzle exit. The dynamics are studied as functions of the Ohnesorge number Oh (viscous/surface tension force) and the Weber number We (inertial/surface tension force). For a common ink forming drops from a nozzle of 10-micrometer radius, Oh = 0.1. For this typical case, a phase or operability diagram is developed that shows that three regimes of operation are possible. In the first regime, where We is low, breakup does not occur, and drops remain pendant from the nozzle and undergo time-periodic oscillations. Thus, the simulations show that sufficient fluid inertia, or a sufficiently large We, is required if a DOD drop is to form, in accord with intuition. At high We, two regimes exist. In the first of these two regimes, DOD drops do form but have negative velocities, i.e., they would move toward the nozzle upon breakup, which is undesirable. In the second breakup regime, not only are DOD drops formed but they do so with positive velocities.
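
    The two dimensionless groups are quick to evaluate; the sketch below uses illustrative fluid properties (assumed, not taken from the talk) chosen to roughly reproduce the abstract's Oh ≈ 0.1 at a 10-micrometer nozzle radius.

```python
# Ohnesorge and Weber numbers for a DOD ink drop, using the nozzle radius
# as the length scale. All property values are assumptions of this sketch.
import math

rho, sigma, mu = 1000.0, 0.03, 1.7e-3  # density [kg/m^3], surface tension [N/m], viscosity [Pa s]
R = 10e-6                              # nozzle radius [m], as in the abstract
U = 2.0                                # ejection velocity scale [m/s] (assumed)

Oh = mu / math.sqrt(rho * sigma * R)   # viscous vs. surface tension force
We = rho * U**2 * R / sigma            # inertial vs. surface tension force
print(f"Oh = {Oh:.2f}, We = {We:.2f}")
```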

  14. Parallel Analysis and Visualization on Cray Compute Node Linux

    SciTech Connect

    Pugmire, Dave; Ahern, Sean

    2008-01-01

    Capability computer systems are deployed to give researchers the computational power required to investigate and solve key challenges facing the scientific community. As the power of these computer systems increases, the computational problem domain typically increases in size, complexity and scope. These increases strain the ability of commodity analysis and visualization clusters to effectively perform post-processing tasks and provide critical insight and understanding to the computed results. An alternative to purchasing increasingly larger, separate analysis and visualization commodity clusters is to use the computational system itself to perform post-processing tasks. In this paper, the recent successful port of VisIt, a parallel, open source analysis and visualization tool, to Compute Node Linux running on the Cray is detailed. Additionally, the unprecedented ability of this resource for analysis and visualization is discussed and a report on obtained results is presented.

  15. Volume accumulator design analysis computer codes

    NASA Technical Reports Server (NTRS)

    Whitaker, W. D.; Shimazaki, T. T.

    1973-01-01

    The computer codes, VANEP and VANES, were written and used to aid in the design and performance calculation of the volume accumulator units (VAU) for the 5-kwe reactor thermoelectric system. VANEP computes the VAU design which meets the primary coolant loop VAU volume and pressure performance requirements. VANES computes the performance of the VAU design, determined from the VANEP code, at the conditions of the secondary coolant loop. The codes can also compute the performance characteristics of the VAU's under conditions of possible modes of failure which still permit continued system operation.

  16. New Technique for Developing a Proton Range Compensator With Use of a 3-Dimensional Printer

    SciTech Connect

    Ju, Sang Gyu; Kim, Min Kyu; Hong, Chae-Seon; Kim, Jin Sung; Han, Youngyih; Choi, Doo Ho; Shin, Dongho; Lee, Se Byeong

    2014-02-01

    Purpose: A new system for manufacturing a proton range compensator (RC) was developed by using a 3-dimensional printer (3DP). The physical accuracy and dosimetric characteristics of the new RC manufactured by 3DP (RC_3DP) were compared with those of a conventional RC (RC_CMM) manufactured by a computerized milling machine (CMM). Methods and Materials: An RC for brain tumor treatment with a scattered proton beam was calculated with a treatment planning system (TPS), and the resulting data were converted into a new format for 3DP using in-house software. The RC_3DP was printed with ultraviolet-curable acrylic plastic, and an RC_CMM was milled into polymethylmethacrylate using a CMM. The inner shape of both RCs was scanned by using a 3D scanner and compared with TPS data by applying composite analysis (CA; with 1-mm depth difference and 1-mm distance-to-agreement criteria) to verify their geometric accuracy. The position and distal penumbra of distal dose falloff at the central axis and the field width of the dose profile at the midline depth of the spread-out Bragg peak were measured for the 2 RCs to evaluate their dosimetric characteristics. Both RCs were imaged on a computed tomography scanner to evaluate the uniformity of internal density. The manufacturing times for both RCs were compared to evaluate production efficiency. Results: The pass rates for the CA test were 99.5% and 92.5% for RC_3DP and RC_CMM, respectively. There was no significant difference in dosimetric characteristics and uniformity of internal density between the 2 RCs. The net fabrication times of RC_3DP and RC_CMM were about 18 and 3 hours, respectively. Conclusions: The physical accuracy and dosimetric characteristics of RC_3DP were comparable with those of the conventional RC_CMM, and significant system minimization was provided.

  17. An Analysis of 27 Years of Research into Computer Education Published in Australian Educational Computing

    ERIC Educational Resources Information Center

    Zagami, Jason

    2015-01-01

    Analysis of three decades of publications in Australian Educational Computing (AEC) provides insight into the historical trends in Australian educational computing, highlighting an emphasis on pedagogy, comparatively few articles on educational technologies, and strong research topic alignment with similar international journals. Analysis confirms…

  18. Numerical Package in Computer Supported Numeric Analysis Teaching

    ERIC Educational Resources Information Center

    Tezer, Murat

    2007-01-01

    At universities in the faculties of Engineering, Sciences, Business and Economics together with higher education in Computing, it is stated that because of the difficulty, calculators and computers can be used in Numerical Analysis (NA). In this study, the learning computer supported NA will be discussed together with important usage of the…

  19. Thermal crosstalk in 3-dimensional RRAM crossbar array.

    PubMed

    Sun, Pengxiao; Lu, Nianduan; Li, Ling; Li, Yingtao; Wang, Hong; Lv, Hangbing; Liu, Qi; Long, Shibing; Liu, Su; Liu, Ming

    2015-01-01

    High-density 3-dimensional (3D) crossbar resistive random access memory (RRAM) is a major focus of new-age memory technologies. To compete with ultra-high-density NAND and NOR memories, an understanding of the reliability mechanisms and scaling potential of 3D RRAM crossbar arrays is needed. Thermal crosstalk is one of the most critical effects that should be considered in 3D crossbar array applications. The Joule heat generated inside an RRAM device determines its own switching behavior and, in dense memory arrays, the surrounding temperature rise may lead to resistance degradation of neighboring devices. In this work, the thermal crosstalk effect and the scaling potential under thermal effects in 3D RRAM crossbar arrays are systematically investigated. It is revealed that the reset process is dominated by the transient thermal effect in 3D RRAM arrays. More importantly, thermal crosstalk phenomena can deteriorate device retention performance and even lead to failure of the data storage state, from LRS (low resistance state) to HRS (high resistance state), of the disturbed RRAM cell. In addition, the resistance-state degradation becomes more serious as the feature size is scaled down. Possible methods for alleviating the thermal crosstalk effect while further advancing the scaling potential are also provided and verified by numerical simulation. PMID:26310537

  20. A Novel 3-Dimensional Approach for Cardiac Regeneration

    PubMed Central

    Munarin, F.; Coulombe, K.L.K.

    2016-01-01

    Ischemic heart diseases, such as coronary artery disease and microvascular disease, are cardiovascular pathologies that cause reduced blood supply to the heart muscle. Acute and chronic ischemia cause cardiomyocytes to die, and these cells are not naturally replaced as part of the wound healing process in the heart. To promote neovascularization in the wound bed and in implanted engineered tissues, we have developed a collagen–alginate microspheres scaffold intended for local release of drugs and growth factors in order to recruit host endothelial cells to the area and provide them with geometrical cues to form new vessels. Optimization of alginate microspheres included modulation of nitrogen pressure, alginate and CaCl2 concentrations, nozzle size, and velocity of extrusion to achieve monodisperse populations of 100 μm diameter microspheres with protein release over 3 days. In vitro incorporation of fibroblasts in the bulk collagen demonstrated cellular compatibility with embedded alginate microspheres. An in vitro vessel formation assay, performed with human umbilical vein endothelial cells (HUVECs) immobilized in the collagen phase of the collagen–alginate microspheres scaffolds, showed that HUVECs formed networks following the 3-dimensional pattern of the microspheres even in the absence of growth factor. Implantation of acellular collagen–alginate microspheres scaffolds onto healthy rat hearts confirmed the invasion of host cells at one week. Together, these results suggest that the collagen–alginate microspheres scaffold is a viable, tunable therapeutic approach for directing neovascularization in engineered tissues and in the heart after ischemic events. PMID:26736614

  3. Chromosome Conformation of Human Fibroblasts Grown in 3-Dimensional Spheroids

    PubMed Central

    Chen, Haiming; Comment, Nicholas; Chen, Jie; Ronquist, Scott; Hero, Alfred; Ried, Thomas; Rajapakse, Indika

    2015-01-01

    In the study of interphase chromosome organization, genome-wide chromosome conformation capture (Hi-C) maps are often generated using 2-dimensional (2D) monolayer cultures. These 2D cells have morphological deviations from cells that exist in 3-dimensional (3D) tissues in vivo, and may not maintain the same chromosome conformation. We used Hi-C maps to test the extent of differences in chromosome conformation between human fibroblasts grown in 2D cultures and those grown in 3D spheroids. Significant differences in chromosome conformation were found between 2D cells and those grown in spheroids. Intra-chromosomal interactions were generally increased in spheroid cells, with a few exceptions, while inter-chromosomal interactions were generally decreased. Overall, chromosomes located closer to the nuclear periphery had increased intra-chromosomal contacts in spheroid cells, while those located more centrally had decreased interactions. This study highlights the necessity to conduct studies on the topography of the interphase nucleus under conditions that mimic an in vivo environment. PMID:25738643

  4. NASA Applications for Computational Electromagnetic Analysis

    NASA Technical Reports Server (NTRS)

    Lewis, Catherine C.; Trout, Dawn H.; Krome, Mark E.; Perry, Thomas A.

    2011-01-01

    Computational Electromagnetic Software is used by NASA to analyze the compatibility of systems too large or too complex for testing. Recent advances in software packages and computer capabilities have made it possible to determine the effects of a transmitter inside a launch vehicle fairing, better analyze the environment threats, and perform on-orbit replacements with assured electromagnetic compatibility.

  5. The method of geometrical comparison of 3-dimensional objects created from DICOM images.

    PubMed

    Gaweł, Dominik; Danielewicz, Kamil; Nowak, Michał

    2012-01-01

    This work presents a method for the geometrical comparison of 3-dimensional objects created from DICOM images. The reconstruction of biological objects is realized with the commercial software Simpleware. The 3D geometries are then registered, and the recognized shape differences are visualized using a color map indicating the change in the 3D geometry. Finally, the last but most important step of the presented technology is performed: the model, including the information about changes in the compared geometries, is translated into the PDF format. This approach allows the final result to be presented on every desktop computer equipped with Adobe Reader. This PDF browser is free to use and makes it possible to freely rotate, move, and zoom the model. PMID:22744507
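
    A minimal sketch of the comparison step (assumptions mine, not the paper's pipeline): per-vertex nearest-neighbor distance from a follow-up mesh to a baseline mesh, which can then drive the color map. Registration is assumed already done.

```python
# Per-vertex shape difference between two registered meshes via KD-tree
# nearest-neighbor queries; one scalar per follow-up vertex.
import numpy as np
from scipy.spatial import cKDTree

def shape_difference(baseline_vertices, followup_vertices):
    """Both inputs: (N,3) float arrays of registered mesh vertices."""
    dist, _ = cKDTree(baseline_vertices).query(followup_vertices)
    return dist  # map these values to a color scale for visualization

a = np.random.rand(1000, 3)
b = a + np.array([0.0, 0.0, 0.01])   # toy "changed" geometry
print(shape_difference(a, b).max())
```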

  6. Using 3-dimensional printing to create presurgical models for endodontic surgery.

    PubMed

    Bahcall, James K

    2014-09-01

    Advances in endodontic surgery, from both a technological and a procedural perspective, have been significant over the last 18 years. Although these technologies and procedural enhancements have significantly improved endodontic surgical treatment outcomes, there is still the ongoing challenge of overcoming the limitations of interpreting a preoperative 2-dimensional (2-D) radiographic representation of a 3-dimensional (3-D) in vivo surgical field. Cone-beam computed tomography (CBCT) has helped to address this issue by providing a 3-D enhancement of the 2-D radiograph. The next logical step to further improve presurgical 3-D assessment of a case is to create a surgical model from the CBCT scan. The purpose of this article is to introduce 3-D printing of CBCT scans for creating presurgical models for endodontic surgery. PMID:25197746

  7. Roentgen stereophotogrammetric analysis using computer-based image-analysis.

    PubMed

    Ostgaard, S E; Gottlieb, L; Toksvig-Larsen, S; Lebech, A; Talbot, A; Lund, B

    1997-09-01

    The two-dimensional position of markers in radiographs for Roentgen stereophotogrammetric analysis (RSA) is usually determined using a measuring table. The purpose of this study was to evaluate the reproducibility and the accuracy of a new RSA system using digitized radiographs and image-processing algorithms to determine the marker positions in the radiographs. Four double RSA examinations of a phantom and 18 RSA examinations from six patients included in different RSA studies of knee prostheses were used to test the reproducibility and the accuracy of the system. The radiographs were scanned at 600 dpi resolution and 256 gray levels. The center of each of the tantalum markers in the radiographs was calculated by the computer program from the contour of the marker, using an edge-detection software algorithm, after the marker was identified on a PC monitor. The study showed that computer-based image analysis can be used in RSA examinations. The advantages of using image-processing software in RSA are that the marker positions are determined in an objective manner and that there is no need for a systematic manual identification of all the markers on the radiograph before the actual measurement.
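
    In the spirit of the marker localization described above, the sketch below computes a sub-pixel marker center from a small image patch; it uses simple thresholding plus an intensity-weighted centroid rather than the paper's specific edge-detection algorithm, and all thresholds are assumptions.

```python
# Sub-pixel center of a dark tantalum marker in a grayscale region of
# interest: invert so the marker is bright, segment crudely, then take an
# intensity-weighted centroid.
import numpy as np

def marker_center(roi):
    """roi: 2D grayscale array containing one marker (markers are dark)."""
    inv = roi.max() - roi.astype(float)   # make the marker bright
    mask = inv > 0.5 * inv.max()          # crude segmentation (assumed threshold)
    ys, xs = np.nonzero(mask)
    w = inv[ys, xs]
    return (np.average(xs, weights=w), np.average(ys, weights=w))
```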

  8. Influence of White-Coat Hypertension on Left Ventricular Deformation: 2- and 3-Dimensional Speckle Tracking Study.

    PubMed

    Tadic, Marijana; Cuspidi, Cesare; Ivanovic, Branislava; Ilic, Irena; Celic, Vera; Kocijancic, Vesna

    2016-03-01

    We sought to compare left ventricular deformation in subjects with white-coat hypertension to normotensive and sustained hypertensive patients. This cross-sectional study included 139 untreated subjects who underwent 24-hour ambulatory blood pressure monitoring and completed 2- and 3-dimensional examination. Two-dimensional left ventricular multilayer strain analysis was also performed. White-coat hypertension was diagnosed if clinical blood pressure was elevated and 24-hour blood pressure was normal. Our results showed that left ventricular longitudinal and circumferential strains gradually decreased from normotensive controls across subjects with white-coat hypertension to sustained hypertensive group. Two- and 3-dimensional left ventricular radial strain, as well as 3-dimensional area strain, was not different between groups. Two-dimensional left ventricular longitudinal and circumferential strains of subendocardial and mid-myocardial layers gradually decreased from normotensive control to sustained hypertensive group. Longitudinal and circumferential strains of subepicardial layer did not differ between the observed groups. We concluded that white-coat hypertension significantly affects left ventricular deformation assessed by 2-dimensional traditional strain, multilayer strain, and 3-dimensional strain.

  9. A 3-Dimensional Anatomic Study of the Distal Biceps Tendon

    PubMed Central

    Walton, Christine; Li, Zhi; Pennings, Amanda; Agur, Anne; Elmaraghy, Amr

    2015-01-01

    Background Complete rupture of the distal biceps tendon from its osseous attachment is most often treated with operative intervention. Knowledge of the overall tendon morphology as well as the orientation of the collagenous fibers throughout the musculotendinous junction are key to intraoperative decision making and surgical technique in both the acute and chronic setting. Unfortunately, there is little information available in the literature. Purpose To comprehensively describe the morphology of the distal biceps tendon. Study Design Descriptive laboratory study. Methods The distal biceps terminal musculature, musculotendinous junction, and tendon were digitized in 10 cadaveric specimens and data reconstructed using 3-dimensional modeling. Results The average length, width, and thickness of the external distal biceps tendon were found to be 63.0, 6.0, and 3.0 mm, respectively. A unique expansion of the tendon fibers within the distal muscle was characterized, creating a thick collagenous network along the central component between the long and short heads. Conclusion This study documents the morphologic parameters of the native distal biceps tendon. Reconstruction may be necessary, especially in chronic distal biceps tendon ruptures, if the remaining tendon morphology is significantly compromised compared with the native distal biceps tendon. Knowledge of normal anatomical distal biceps tendon parameters may also guide the selection of a substitute graft with similar morphological characteristics. Clinical Relevance A thorough description of distal biceps tendon morphology is important to guide intraoperative decision making between primary repair and reconstruction and to better select the most appropriate graft. The detailed description of the tendinous expansion into the muscle may provide insight into better graft-weaving and suture-grasping techniques to maximize proximal graft incorporation. PMID:26665092

  10. Computational analysis of LDDMM for brain mapping

    PubMed Central

    Ceritoglu, Can; Tang, Xiaoying; Chow, Margaret; Hadjiabadi, Darian; Shah, Damish; Brown, Timothy; Burhanullah, Muhammad H.; Trinh, Huong; Hsu, John T.; Ament, Katarina A.; Crocetti, Deana; Mori, Susumu; Mostofsky, Stewart H.; Yantis, Steven; Miller, Michael I.; Ratnanather, J. Tilak

    2013-01-01

    One goal of computational anatomy (CA) is to develop tools to accurately segment brain structures in healthy and diseased subjects. In this paper, we examine the performance and complexity of such segmentation in the framework of the large deformation diffeomorphic metric mapping (LDDMM) registration method with reference to atlases and parameters. First we report the application of a multi-atlas segmentation approach to define basal ganglia structures in the brains of healthy and diseased children. The segmentation accuracy of the multi-atlas approach is compared with the single atlas LDDMM implementation and two state-of-the-art segmentation algorithms—Freesurfer and FSL—by computing the overlap errors between automatic and manual segmentations of the six basal ganglia nuclei in healthy subjects as well as subjects with diseases including ADHD and Autism. The high accuracy of multi-atlas segmentation is obtained at the cost of increasing the computational complexity because of the calculations necessary between the atlases and a subject. Second, we examine the effect of parameters on total LDDMM computation time and segmentation accuracy for basal ganglia structures. The single atlas LDDMM method is used to automatically segment the structures in a population of 16 subjects using different sets of parameters. The results show that a cascade approach and using fewer time steps can reduce computational complexity as much as five times while maintaining reliable segmentations. PMID:23986653
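
    Overlap between automatic and manual segmentations is typically quantified with the Dice coefficient (overlap error is then 1 − Dice); a minimal sketch, assuming integer label volumes:

```python
# Dice overlap between an automatic and a manual segmentation for one label.
import numpy as np

def dice(auto_seg, manual_seg, label):
    a = (auto_seg == label)
    m = (manual_seg == label)
    denom = a.sum() + m.sum()
    return 2.0 * np.logical_and(a, m).sum() / denom if denom else 1.0
```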

  11. Frequency modulation television analysis: Threshold impulse analysis. [with computer program

    NASA Technical Reports Server (NTRS)

    Hodge, W. H.

    1973-01-01

    A computer program is developed to calculate the FM threshold impulse rates as a function of the carrier-to-noise ratio for a specified FM system. The system parameters and a vector of 1024 integers, representing the probability density of the modulating voltage, are required as input parameters. The computer program is utilized to calculate threshold impulse rates for twenty-four sets of measured probability data supplied by NASA and for sinusoidal and Gaussian modulating waveforms. As a result of the analysis several conclusions are drawn: (1) The use of preemphasis in an FM television system improves the threshold by reducing the impulse rate. (2) Sinusoidal modulation produces a total impulse rate which is a practical upper bound for the impulse rates of TV signals providing the same peak deviations. (3) As the moment of the FM spectrum about the center frequency of the predetection filter increases, the impulse rate tends to increase. (4) A spectrum having an expected frequency above (below) the center frequency of the predetection filter produces a higher negative (positive) than positive (negative) impulse rate.
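
    For orientation, the classical Rice approximation gives the click (impulse) rate of an unmodulated FM carrier as N = (B / (2*sqrt(3))) * erfc(sqrt(CNR)); the report's program generalizes this idea to measured modulation probability densities. A minimal sketch of the unmodulated-carrier formula:

```python
# Rice's approximate FM click rate for an unmodulated carrier in an IF
# bandwidth B at a given carrier-to-noise ratio (CNR).
import math

def click_rate(cnr_db, if_bandwidth_hz):
    rho = 10.0 ** (cnr_db / 10.0)        # CNR as a linear ratio
    return if_bandwidth_hz / (2.0 * math.sqrt(3.0)) * math.erfc(math.sqrt(rho))

for cnr in (8, 10, 12):
    print(cnr, "dB ->", click_rate(cnr, 30e6), "clicks/s")
```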

  12. Modern Computational Techniques for the HMMER Sequence Analysis

    PubMed Central

    2013-01-01

    This paper focuses on the latest research and critical reviews on modern computing architectures, software and hardware accelerated algorithms for bioinformatics data analysis with an emphasis on one of the most important sequence analysis applications—hidden Markov models (HMM). We show the detailed performance comparison of sequence analysis tools on various computing platforms recently developed in the bioinformatics society. The characteristics of the sequence analysis, such as data and compute-intensive natures, make it very attractive to optimize and parallelize by using both traditional software approach and innovated hardware acceleration technologies. PMID:25937944

  13. Computational Analysis of SAXS Data Acquisition

    PubMed Central

    Dong, Hui; Kim, Jin Seob

    2015-01-01

    Small-angle x-ray scattering (SAXS) is an experimental biophysical method used for gaining insight into the structure of large biomolecular complexes. Under appropriate chemical conditions, the information obtained from a SAXS experiment can be equated to the pair distribution function, which is the distribution of distances between every pair of points in the complex. Here we develop a mathematical model to calculate the pair distribution function for a structure of known density, and analyze the computational complexity of these calculations. Efficient recursive computation of this forward model is an important step in solving the inverse problem of recovering the three-dimensional density of biomolecular structures from their pair distribution functions. In particular, we show that integrals of products of three spherical-Bessel functions arise naturally in this context. We then develop an algorithm for the efficient recursive computation of these integrals. PMID:26244255
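
    A toy forward computation helps fix the definition: for a density sampled as weighted points, the pair distribution function is just the (weighted) histogram of all pairwise distances. This naive O(N^2) sketch is an assumption of this note, not the paper's recursive spherical-Bessel machinery.

```python
# Pair distribution function for a point-sampled density: histogram of all
# pairwise distances, weighted by the product of per-point weights.
import numpy as np

def pair_distribution(points, weights=None, bins=100):
    n = len(points)
    w = np.ones(n) if weights is None else np.asarray(weights)
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=-1)
    iu = np.triu_indices(n, k=1)           # count each pair once
    hist, edges = np.histogram(d[iu], bins=bins,
                               weights=(w[:, None] * w[None, :])[iu])
    return 0.5 * (edges[:-1] + edges[1:]), hist  # bin centers, P(r)

r, pr = pair_distribution(np.random.rand(500, 3))
```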

  14. Accuracy Evaluation of a 3-Dimensional Surface Imaging System for Guidance in Deep-Inspiration Breath-Hold Radiation Therapy

    SciTech Connect

    Alderliesten, Tanja; Sonke, Jan-Jakob; Betgen, Anja; Honnef, Joeri; Vliet-Vroegindeweij, Corine van; Remeijer, Peter

    2013-02-01

    Purpose: To investigate the applicability of 3-dimensional (3D) surface imaging for image guidance in deep-inspiration breath-hold radiation therapy (DIBH-RT) for patients with left-sided breast cancer. For this purpose, setup data based on captured 3D surfaces was compared with setup data based on cone beam computed tomography (CBCT). Methods and Materials: Twenty patients treated with DIBH-RT after breast-conserving surgery (BCS) were included. Before the start of treatment, each patient underwent a breath-hold CT scan for planning purposes. During treatment, dose delivery was preceded by setup verification using CBCT of the left breast. 3D surfaces were captured by a surface imaging system concurrently with the CBCT scan. Retrospectively, surface registrations were performed for CBCT to CT and for a captured 3D surface to CT. The resulting setup errors were compared with linear regression analysis. For the differences between setup errors, the group mean, systematic error, random error, and 95% limits of agreement were calculated. Furthermore, receiver operating characteristic (ROC) analysis was performed. Results: Good correlation between setup errors was found: R^2 = 0.70, 0.90, and 0.82 in the left-right, craniocaudal, and anterior-posterior directions, respectively. Systematic errors were ≤0.17 cm in all directions. Random errors were ≤0.15 cm. The limits of agreement were -0.34 to 0.48, -0.42 to 0.39, and -0.52 to 0.23 cm in the left-right, craniocaudal, and anterior-posterior directions, respectively. ROC analysis showed that a threshold between 0.4 and 0.8 cm corresponds to promising true positive rates (0.78-0.95) and false positive rates (0.12-0.28). Conclusions: The results support the application of 3D surface imaging for image guidance in DIBH-RT after BCS.
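
    The group mean, random error, and 95% limits of agreement reported above are standard Bland-Altman quantities; a minimal sketch of how they are computed from paired setup errors (input arrays assumed):

```python
# Bland-Altman agreement statistics between two setup-error measurement
# methods (e.g., CBCT-based vs. surface-imaging-based), per direction.
import numpy as np

def agreement_stats(cbct_errors, surface_errors):
    d = np.asarray(surface_errors) - np.asarray(cbct_errors)  # paired differences [cm]
    mean, sd = d.mean(), d.std(ddof=1)
    return {"group mean": mean,
            "random error (SD)": sd,
            "95% limits of agreement": (mean - 1.96 * sd, mean + 1.96 * sd)}
```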

  15. An emulator for minimizing computer resources for finite element analysis

    NASA Technical Reports Server (NTRS)

    Melosh, R.; Utku, S.; Islam, M.; Salama, M.

    1984-01-01

    A computer code, SCOPE, has been developed for predicting the computer resources required for a given analysis code, computer hardware, and structural problem. The cost of running the code is a small fraction (about 3 percent) of the cost of performing the actual analysis. However, its accuracy in predicting the CPU and I/O resources depends intrinsically on the accuracy of calibration data that must be developed once for the computer hardware and the finite element analysis code of interest. Testing of the SCOPE code on the AMDAHL 470 V/8 computer and the ELAS finite element analysis program indicated small I/O errors (3.2 percent), larger CPU errors (17.8 percent), and negligible total errors (1.5 percent).
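
    The abstract does not give SCOPE's internal model, but the general idea of a calibration-based resource emulator can be sketched as a least-squares fit of measured resources against problem-size features, fitted once per machine and analysis code (all features and numbers below are hypothetical):

    ```python
    import numpy as np

    def calibrate(features, measured):
        """Least-squares fit of resource = features @ coeffs from small calibration runs."""
        coeffs, *_ = np.linalg.lstsq(features, measured, rcond=None)
        return coeffs

    def predict(coeffs, features):
        return features @ coeffs

    # Features could be e.g. [1, n_dof, n_dof * bandwidth**2] for a banded solver.
    X = np.array([[1, 1e3, 1e3 * 30**2], [1, 5e3, 5e3 * 40**2], [1, 2e4, 2e4 * 60**2]])
    cpu_seconds = np.array([0.8, 7.5, 140.0])        # made-up calibration measurements
    print(predict(calibrate(X, cpu_seconds), np.array([1, 1e4, 1e4 * 50**2])))
    ```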

  16. Computed Tomography Analysis of NASA BSTRA Balls

    SciTech Connect

    Perry, R L; Schneberk, D J; Thompson, R R

    2004-10-12

    Fifteen 1.25 inch BSTRA balls were scanned with the high energy computed tomography system at LLNL. This system has a resolution limit of approximately 210 microns. A threshold of 238 microns (two voxels) was used, and no anomalies at or greater than this were observed.

  17. Thermoelectric pump performance analysis computer code

    NASA Technical Reports Server (NTRS)

    Johnson, J. L.

    1973-01-01

    A computer program is presented that was used to analyze and design dual-throat electromagnetic dc conduction pumps for the 5-kwe ZrH reactor thermoelectric system. In addition to a listing of the code and corresponding identification of symbols, the bases for this analytical model are provided.

  18. Computational and Physical Analysis of Catalytic Compounds

    NASA Astrophysics Data System (ADS)

    Wu, Richard; Sohn, Jung Jae; Kyung, Richard

    2015-03-01

    Nanoparticles exhibit unique physical and chemical properties depending on their geometry. For this reason, synthesis of nanoparticles with controlled shape and size is important for exploiting their unique properties. Catalyst supports are usually made of high-surface-area porous oxides or carbon nanomaterials. These support materials stabilize metal catalysts against sintering at high reaction temperatures. Many studies have demonstrated large enhancements of catalytic behavior due to the role of the oxide-metal interface. In this paper, the catalyzing ability of supported nano metal oxides, such as silicon oxide and titanium oxide compounds, is analyzed using computational chemistry methods. Computational programs such as Gamess and Chemcraft have been used to compute the efficiencies of the catalytic compounds and the bonding-energy changes during optimization convergence. The results illustrate how the metal oxides stabilize and the steps involved. The plot of computation step (N) versus energy (kcal/mol) shows that the energy of the titania converges faster, at the 7th iteration, whereas the silica converges at the 9th iteration.

  19. A computational analysis for amino acid adsorption

    NASA Astrophysics Data System (ADS)

    Owens, Brandon E.; Riemann, Andreas

    2014-06-01

    In this study we systematically investigated the effects of surface configuration and molecular state on the adsorption of methionine (C5H11NO2S) on a graphite surface using two model force fields, AMBER 3 and MM+. Computational results were compared with experimental results.

  20. Conversation Analysis of Computer-Mediated Communication

    ERIC Educational Resources Information Center

    Gonzalez-Lloret, Marta

    2011-01-01

    The potential of computer-mediated communication (CMC) for language learning resides mainly in the possibility that learners have to engage with other speakers of the language, including L1 speakers. The inclusion of CMC in the L2 classroom provides an opportunity for students to utilize authentic language in real interaction, rather than the more…

  1. Computational thermo-fluid analysis of a disk brake

    NASA Astrophysics Data System (ADS)

    Takizawa, Kenji; Tezduyar, Tayfun E.; Kuraishi, Takashi; Tabata, Shinichiro; Takagi, Hirokazu

    2016-06-01

    We present computational thermo-fluid analysis of a disk brake, including thermo-fluid analysis of the flow around the brake and heat conduction analysis of the disk. The computational challenges include proper representation of the small-scale thermo-fluid behavior, high-resolution representation of the thermo-fluid boundary layers near the spinning solid surfaces, and bringing the heat transfer coefficient (HTC) calculated in the thermo-fluid analysis of the flow to the heat conduction analysis of the spinning disk. The disk brake model used in the analysis closely represents the actual configuration, and this adds to the computational challenges. The components of the method we have developed for computational analysis of this class of problems include the Space-Time Variational Multiscale method for coupled incompressible flow and thermal transport, the ST Slip Interface method for high-resolution representation of the thermo-fluid boundary layers near spinning solid surfaces, and a set of projection methods for different parts of the disk to bring in the HTC calculated in the thermo-fluid analysis. With the HTC coming from the thermo-fluid analysis of the flow around the brake, we carry out the heat conduction analysis of the disk, from the start of braking until the disk stops spinning, demonstrating how the method developed works in computational analysis of this complex and challenging problem.

  2. Hybrid soft computing systems for electromyographic signals analysis: a review

    PubMed Central

    2014-01-01

    Electromyographic (EMG) signals are bio-signals recorded from human skeletal muscle. Analysis of EMG signals has been widely used to detect human movement intent, control various human-machine interfaces, diagnose neuromuscular diseases, and model the neuromusculoskeletal system. With the advances of artificial intelligence and soft computing, many sophisticated techniques have been proposed for these purposes. Hybrid soft computing systems (HSCS), the integration of these different techniques, aim to further improve the effectiveness, efficiency, and accuracy of EMG analysis. This paper reviews and compares key combinations of neural networks, support vector machines, fuzzy logic, evolutionary computing, and swarm intelligence for EMG analysis. Our suggestions on the possible future development of HSCS in EMG analysis are also given in terms of basic soft computing techniques, further combinations of these techniques, and their other applications in EMG analysis. PMID:24490979

  3. Computer applications for engineering/structural analysis. Revision 1

    SciTech Connect

    Zaslawsky, M.; Samaddar, S.K.

    1991-12-31

    Analysts and organizations have a tendency to lock themselves into specific codes, with the obvious consequence of not addressing the real problem and thus reaching the wrong conclusion. This paper discusses the role of the analyst in selecting computer codes. The participation and support of a computation division in modifying the source program, configuration management, and pre- and post-processing of codes are among the subjects discussed. Specific examples illustrating the computer code selection process are described in the following problem areas: soil-structure interaction, structural analysis of nuclear reactors, analysis of waste tanks where fluid-structure interaction is important, analysis of equipment, structure-structure interaction, analysis of the operation of the Superconducting Super Collider, which includes friction and transient temperature, and 3D analysis of the 10-meter telescope being built in Hawaii. Validation and verification of computer codes and their impact on the selection process are also discussed.

  4. Computer aided radiation analysis for manned spacecraft

    NASA Technical Reports Server (NTRS)

    Appleby, Matthew H.; Griffin, Brand N.; Tanner, Ernest R., II; Pogue, William R.; Golightly, Michael J.

    1991-01-01

    In order to assist in the design of radiation shielding, an analytical tool is presented that can be employed in combination with CAD facilities and NASA transport codes. The nature of radiation in space is described, and the operational requirements for protection are listed as background information for the use of the technique. The method is based on the Boeing radiation exposure model (BREM) for combining NASA radiation transport codes and CAD facilities, and the output is given as contour maps of the radiation-shield distribution so that dangerous areas can be identified. Computational models are used to solve the 1D Boltzmann transport equation and determine the shielding needs for the worst-case scenario. BREM can be employed directly with the radiation computations to assess radiation protection during all phases of design, which saves time and, ultimately, spacecraft weight.

  5. TAIR: A transonic airfoil analysis computer code

    NASA Technical Reports Server (NTRS)

    Dougherty, F. C.; Holst, T. L.; Grundy, K. L.; Thomas, S. D.

    1981-01-01

    The operation of the TAIR (Transonic AIRfoil) computer code, which uses a fast, fully implicit algorithm to solve the conservative full-potential equation for transonic flow fields about arbitrary airfoils, is described on two levels of sophistication: simplified operation and detailed operation. The program organization and theory are elaborated to simplify modification of TAIR for new applications. Examples with input and output are given for a wide range of cases, including incompressible, subcritical compressible, and transonic calculations.

  6. Scalable Computer Performance and Analysis (Hierarchical INTegration)

    1999-09-02

    HINT is a program to measure the performance of a wide variety of scalable computer systems. It is capable of demonstrating the benefits of using more memory or processing power, and of improving communications within the system. HINT can be used for measurement of an existing system, while the associated program ANALYTIC HINT can be used to explain the measurements or as a design tool for proposed systems.

  7. Analysis of a Multiprocessor Guidance Computer. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Maltach, E. G.

    1969-01-01

    The design of the next generation of spaceborne digital computers is described, and a possible multiprocessor computer configuration is analyzed. For the analysis, a set of representative space computing tasks was abstracted from the Apollo Lunar Module guidance computer programs as executed during the lunar landing. That computer performed about 24 concurrent functions, with iteration rates from 10 times per second to once every two seconds. These jobs were tabulated in a machine-independent form, and statistics for the overall job set were obtained. It was concluded, based on a comparison of simulation and Markov results, that the Markov process analysis is accurate in predicting overall trends and in configuration comparisons, but does not provide useful detailed information in specific situations. Using both types of analysis, it was determined that the job scheduling function is a critical one for the efficiency of the multiprocessor. It is recommended that research into the area of automatic job scheduling be performed.
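
    As a rough illustration of the Markov side of such an analysis (not the thesis's actual model), the long-run behavior of a multiprocessor can be summarized by the stationary distribution of a state transition matrix, from which throughput and utilization trends follow:

    ```python
    import numpy as np

    # States are hypothetical processor-utilization levels; the stationary distribution
    # gives the long-run fraction of time spent in each state.
    P = np.array([[0.90, 0.10, 0.00],       # row-stochastic transition matrix
                  [0.30, 0.60, 0.10],
                  [0.00, 0.50, 0.50]])
    eigvals, eigvecs = np.linalg.eig(P.T)
    pi = np.real(eigvecs[:, np.argmin(np.abs(eigvals - 1.0))])
    pi /= pi.sum()                           # stationary distribution: pi = pi @ P
    print(pi)
    ```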

  8. The symbolic computation and automatic analysis of trajectories

    NASA Technical Reports Server (NTRS)

    Grossman, Robert

    1991-01-01

    Research was done generally on the computation of trajectories of dynamical systems, especially control systems. Algorithms were further developed for rewriting expressions involving differential operators. The differential operators involved arise in the local analysis of nonlinear control systems. An initial design of the system architecture was completed for software to analyze nonlinear control systems using database computing.

  9. Computer-Based Interaction Analysis with DEGREE Revisited

    ERIC Educational Resources Information Center

    Barros, B.; Verdejo, M. F.

    2016-01-01

    We review our research with "DEGREE" and analyse how our work has impacted the collaborative learning community since 2000. Our research is framed within the context of computer-based interaction analysis and the development of computer-supported collaborative learning (CSCL) tools. We identify some aspects of our work which have been…

  10. Simple hobby computer-based off-gas analysis system

    SciTech Connect

    Forrest, E.H.; Jansen, N.B.; Flickinger, M.C.; Tsao, G.T.

    1981-02-01

    An Apple II computer has been adapted to monitor fermentation off-gas in laboratory- and pilot-scale fermentors. It can calculate oxygen uptake rates, carbon dioxide evolution rates, and the respiratory quotient, as well as initiate recalibration procedures. In this report the computer-based off-gas analysis system is described.

  11. Potential applications of computational fluid dynamics to biofluid analysis

    NASA Technical Reports Server (NTRS)

    Kwak, D.; Chang, J. L. C.; Rogers, S. E.; Rosenfeld, M.; Kwak, D.

    1988-01-01

    Computational fluid dynamics has developed to the stage where it has become an indispensable part of aerospace research and design. In view of advances made in aerospace applications, the computational approach can be used for biofluid mechanics research. Several flow simulation methods developed for aerospace problems are briefly discussed for potential applications to biofluids, especially to blood flow analysis.

  12. Multivariate analysis: A statistical approach for computations

    NASA Astrophysics Data System (ADS)

    Michu, Sachin; Kaushik, Vandana

    2014-10-01

    Multivariate analysis is a statistical approach commonly used in automotive diagnosis, education, evaluating clusters in finance, etc., and more recently in the health-related professions. The objective of this paper is to provide a detailed exploratory discussion of factor analysis (FA) in image retrieval and correlation analysis (CA) of network traffic. Image retrieval methods aim to retrieve relevant images from a collected database based on their content. The problem is made more difficult by the high dimension of the variable space in which the images are represented. Multivariate correlation analysis proposes an anomaly detection and analysis method based on the correlation coefficient matrix. Anomalous behaviors in the network include various attacks on the network, such as DDoS attacks and network scanning.
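
    A minimal sketch of the correlation-matrix idea described above (features and thresholding are illustrative only, not the paper's method): score a traffic window by how far its correlation matrix deviates from a baseline learned on normal traffic:

    ```python
    import numpy as np

    def correlation_anomaly_score(window, baseline_corr):
        """Deviation of a traffic window's correlation matrix from a normal baseline;
        large scores flag anomalies such as DDoS attacks or scanning (simplified)."""
        c = np.corrcoef(window, rowvar=False)      # correlations between traffic features
        return np.linalg.norm(c - baseline_corr)   # Frobenius-norm distance

    # Hypothetical usage: rows = time samples, columns = features (pkts/s, flows/s, ...).
    rng = np.random.default_rng(1)
    normal = rng.normal(size=(500, 4))
    baseline = np.corrcoef(normal, rowvar=False)
    print(correlation_anomaly_score(rng.normal(size=(50, 4)), baseline))
    ```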

  13. The Utility of Computer-Assisted Power Analysis Lab Instruction

    ERIC Educational Resources Information Center

    Petrocelli, John V.

    2007-01-01

    Undergraduate students (N = 47), enrolled in 2 separate psychology research methods classes, evaluated a power analysis lab demonstration and homework assignment. Students attended 1 of 2 lectures that included a basic introduction to power analysis and sample size analysis. One lecture included a demonstration of how to use a computer-based power…

  14. The Reliability of Content Analysis of Computer Conference Communication

    ERIC Educational Resources Information Center

    Rattleff, Pernille

    2007-01-01

    The focus of this article is the reliability of content analysis of students' computer conference communication. Content analysis is often used when researching the relationship between learning and the use of information and communications technology in educational settings. A number of studies where content analysis is used and classification…

  15. Computational fluid dynamics combustion analysis evaluation

    NASA Technical Reports Server (NTRS)

    Kim, Y. M.; Shang, H. M.; Chen, C. P.; Ziebarth, J. P.

    1992-01-01

    This study involves the development of numerical modelling of spray combustion. These modelling efforts are mainly motivated by the need to improve the computational efficiency of the stochastic particle tracking method and to incorporate physical submodels of turbulence, combustion, vaporization, and dense spray effects. The present mathematical formulation and numerical methodologies can be cast in any time-marching pressure-correction methodology (PCM), such as the FDNS and MAST codes. A sequence of validation cases involving steady burning sprays and transient evaporating sprays is included.

  16. Computational Modeling, Formal Analysis, and Tools for Systems Biology

    PubMed Central

    Bartocci, Ezio; Lió, Pietro

    2016-01-01

    As the amount of biological data in the public domain grows, so does the range of modeling and analysis techniques employed in systems biology. In recent years, a number of theoretical computer science developments have enabled modeling methodology to keep pace. The growing interest in systems biology in executable models and their analysis has necessitated the borrowing of terms and methods from computer science, such as formal analysis, model checking, static analysis, and runtime verification. Here, we discuss the most important and exciting computational methods and tools currently available to systems biologists. We believe that a deeper understanding of the concepts and theory highlighted in this review will produce better software practice, improved investigation of complex biological processes, and even new ideas and better feedback into computer science. PMID:26795950

  17. Computational Modeling, Formal Analysis, and Tools for Systems Biology.

    PubMed

    Bartocci, Ezio; Lió, Pietro

    2016-01-01

    As the amount of biological data in the public domain grows, so does the range of modeling and analysis techniques employed in systems biology. In recent years, a number of theoretical computer science developments have enabled modeling methodology to keep pace. The growing interest in systems biology in executable models and their analysis has necessitated the borrowing of terms and methods from computer science, such as formal analysis, model checking, static analysis, and runtime verification. Here, we discuss the most important and exciting computational methods and tools currently available to systems biologists. We believe that a deeper understanding of the concepts and theory highlighted in this review will produce better software practice, improved investigation of complex biological processes, and even new ideas and better feedback into computer science.

  18. Surveillance Analysis Computer System (SACS) software requirements specification (SRS)

    SciTech Connect

    Glasscock, J.A.; Flanagan, M.J.

    1995-09-01

    This document is the primary document establishing requirements for the Surveillance Analysis Computer System (SACS) Database, an Impact Level 3Q system. The purpose is to provide the customer and the performing organization with the requirements for the SACS Project.

  19. Computer analysis of digital well logs

    USGS Publications Warehouse

    Scott, James H.

    1984-01-01

    A comprehensive system of computer programs has been developed by the U.S. Geological Survey for analyzing digital well logs. The programs are operational on a minicomputer in a research well-logging truck, making it possible to analyze and replot the logs while at the field site. The minicomputer also serves as a controller of digitizers, counters, and recorders during acquisition of well logs. The analytical programs are coordinated with the data acquisition programs in a flexible system that allows the operator to make changes quickly and easily in program variables such as calibration coefficients, measurement units, and plotting scales. The programs are designed to analyze the following well-logging measurements: natural gamma-ray, neutron-neutron, dual-detector density with caliper, magnetic susceptibility, single-point resistance, self potential, resistivity (normal and Wenner configurations), induced polarization, temperature, sonic delta-t, and sonic amplitude. The computer programs are designed to make basic corrections for depth displacements, tool response characteristics, hole diameter, and borehole fluid effects (when applicable). Corrected well-log measurements are output to magnetic tape or plotter with measurement units transformed to petrophysical and chemical units of interest, such as grade of uranium mineralization in percent eU3O8, neutron porosity index in percent, and sonic velocity in kilometers per second.

  20. Method and apparatus for imaging through 3-dimensional tracking of protons

    NASA Technical Reports Server (NTRS)

    Ryan, James M. (Inventor); Macri, John R. (Inventor); McConnell, Mark L. (Inventor)

    2001-01-01

    A method and apparatus for creating density images of an object through the 3-dimensional tracking of protons that have passed through the object are provided. More specifically, the 3-dimensional tracking of the protons is accomplished by gathering and analyzing images of the ionization tracks of the protons in a closely packed stack of scintillating fibers.

  1. Analysis and computer tools for separation processes involving nonideal mixtures

    SciTech Connect

    Lucia, A.

    1992-05-01

    The objectives of this research were to continue furthering both the theoretical understanding of and the development of computer tools (algorithms) for separation processes involving nonideal mixtures. These objectives were divided into three interrelated major areas: the mathematical analysis of the number of steady-state solutions to multistage separation processes, the numerical analysis of general, related fixed-point methods, and the development and implementation of computer tools for process simulation.

  2. Process for computing geometric perturbations for probabilistic analysis

    DOEpatents

    Fitch, Simeon H. K.; Riha, David S.; Thacker, Ben H.

    2012-04-10

    A method for computing geometric perturbations for probabilistic analysis. The probabilistic analysis is based on finite element modeling, in which uncertainties in the modeled system are represented by changes in the nominal geometry of the model, referred to as "perturbations". These changes are accomplished using displacement vectors, which are computed for each node of a region of interest and are based on mean-value coordinate calculations.

  3. System Matrix Analysis for Computed Tomography Imaging

    PubMed Central

    Flores, Liubov; Vidal, Vicent; Verdú, Gumersindo

    2015-01-01

    In practical applications of computed tomography imaging (CT), it is often the case that the set of projection data is incomplete owing to the physical conditions of the data acquisition process. On the other hand, the high radiation dose imposed on patients is also undesired. These issues demand that high quality CT images can be reconstructed from limited projection data. For this reason, iterative methods of image reconstruction have become a topic of increased research interest. Several algorithms have been proposed for few-view CT. We consider that the accurate solution of the reconstruction problem also depends on the system matrix that simulates the scanning process. In this work, we analyze the application of the Siddon method to generate elements of the matrix and we present results based on real projection data. PMID:26575482
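
    The Siddon method referenced above computes each system-matrix row as the intersection lengths of a ray with the image grid. A simplified 2-D sketch based on the parametric grid-crossing idea (an illustration of the technique, not the paper's implementation):

    ```python
    import numpy as np

    def siddon_2d(p0, p1, nx, ny, dx=1.0, dy=1.0):
        """Radiological path of the segment p0->p1 through an nx-by-ny pixel grid with
        origin at (0, 0): returns (ix, iy, length) per traversed pixel (simplified Siddon)."""
        p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
        d = p1 - p0
        alphas = [0.0, 1.0]
        for axis, (n, h) in enumerate([(nx, dx), (ny, dy)]):
            if d[axis] != 0.0:                       # parametric crossings of grid lines
                a = (np.arange(n + 1) * h - p0[axis]) / d[axis]
                alphas.extend(a[(a > 0.0) & (a < 1.0)])
        alphas = np.unique(alphas)
        seg = []
        for a0, a1 in zip(alphas[:-1], alphas[1:]):
            mid = p0 + 0.5 * (a0 + a1) * d           # segment midpoint identifies the pixel
            ix, iy = int(mid[0] // dx), int(mid[1] // dy)
            if 0 <= ix < nx and 0 <= iy < ny:
                seg.append((ix, iy, (a1 - a0) * np.linalg.norm(d)))
        return seg

    # One matrix row: intersection lengths of a single ray with a 4x4 image grid.
    print(siddon_2d((-1.0, 0.5), (5.0, 3.5), nx=4, ny=4))
    ```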

  4. RSAC-6 Radiological Safety Analysis Computer Program

    SciTech Connect

    Schrader, Bradley J; Wenzel, Douglas Rudolph

    2001-06-01

    RSAC-6 is the latest version of the RSAC program. It calculates the consequences of a release of radionuclides to the atmosphere. Using a personal computer, a user can generate a fission product inventory; decay and in-grow the inventory during transport through processes, facilities, and the environment; model the downwind dispersion of the activity; and calculate doses to downwind individuals. Internal dose from the inhalation and ingestion pathways is calculated. External dose from ground surface and plume gamma pathways is calculated. New and exciting updates to the program include the ability to evaluate a release to an enclosed room, resuspension of deposited activity and evaluation of a release up to 1 meter from the release point. Enhanced tools are included for dry deposition, building wake, occupancy factors, respirable fraction, AMAD adjustment, updated and enhanced radionuclide inventory and inclusion of the dose-conversion factors from FGR 11 and 12.

  5. Computational analysis of small RNA cloning data.

    PubMed

    Berninger, Philipp; Gaidatzis, Dimos; van Nimwegen, Erik; Zavolan, Mihaela

    2008-01-01

    Cloning and sequencing is the method of choice for small regulatory RNA identification. Using deep sequencing technologies one can now obtain up to a billion nucleotides--and tens of millions of small RNAs--from a single library. Careful computational analyses of such libraries enabled the discovery of miRNAs, rasiRNAs, piRNAs, and 21U RNAs. Given the large number of sequences that can be obtained from each individual sample, deep sequencing may soon become an alternative to oligonucleotide microarray technology for mRNA expression profiling. In this report we present the methods that we developed for the annotation and expression profiling of small RNAs obtained through large-scale sequencing. These include a fast algorithm for finding nearly perfect matches of small RNAs in sequence databases, a web-accessible software system for the annotation of small RNA libraries, and a Bayesian method for comparing small RNA expression across samples.
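
    The authors' fast matching algorithm is not reproduced here, but the pigeonhole idea behind many near-exact small-RNA matchers can be sketched: with at most one mismatch allowed, one half of the read must match the genome exactly, so an exact index over half-length seeds finds all candidates (a minimal sketch with hypothetical inputs):

    ```python
    from collections import defaultdict

    def find_near_matches(reads, genome, k, max_mismatch=1):
        """Locate k-mers of `genome` matching each length-k read with <= max_mismatch
        substitutions, via an exact hash index over both read halves (pigeonhole seed)."""
        half = k // 2
        index = defaultdict(list)
        for i in range(len(genome) - k + 1):
            index[(0, genome[i:i + half])].append(i)      # first-half seed
            index[(1, genome[i + half:i + k])].append(i)  # second-half seed
        hits = {}
        for read in reads:
            found = set()
            for part, seed in ((0, read[:half]), (1, read[half:])):
                for i in index[(part, seed)]:
                    mm = sum(a != b for a, b in zip(read, genome[i:i + k]))
                    if mm <= max_mismatch:
                        found.add(i)
            hits[read] = sorted(found)
        return hits

    print(find_near_matches(["ACGT"], "TTACGATTACGTA", k=4))   # exact and 1-mismatch hits
    ```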

  6. System Matrix Analysis for Computed Tomography Imaging.

    PubMed

    Flores, Liubov; Vidal, Vicent; Verdú, Gumersindo

    2015-01-01

    In practical applications of computed tomography imaging (CT), it is often the case that the set of projection data is incomplete owing to the physical conditions of the data acquisition process. On the other hand, the high radiation dose imposed on patients is also undesired. These issues demand that high quality CT images can be reconstructed from limited projection data. For this reason, iterative methods of image reconstruction have become a topic of increased research interest. Several algorithms have been proposed for few-view CT. We consider that the accurate solution of the reconstruction problem also depends on the system matrix that simulates the scanning process. In this work, we analyze the application of the Siddon method to generate elements of the matrix and we present results based on real projection data. PMID:26575482

  7. 3-Dimensional Geologic Modeling Applied to the Structural Characterization of Geothermal Systems: Astor Pass, Nevada, USA

    SciTech Connect

    Siler, Drew L; Faulds, James E; Mayhew, Brett

    2013-04-16

    Geothermal systems in the Great Basin, USA, are controlled by a variety of fault intersection and fault interaction areas. Understanding the specific geometry of the structures most conducive to broad-scale geothermal circulation is crucial to both the mitigation of the costs of geothermal exploration (especially drilling) and to the identification of geothermal systems that have no surface expression (blind systems). 3-dimensional geologic modeling is a tool that can elucidate the specific stratigraphic intervals and structural geometries that host geothermal reservoirs. Astor Pass, NV USA lies just beyond the northern extent of the dextral Pyramid Lake fault zone near the boundary between two distinct structural domains, the Walker Lane and the Basin and Range, and exhibits characteristics of each setting. Both northwest-striking, left-stepping dextral faults of the Walker Lane and kinematically linked northerly striking normal faults associated with the Basin and Range are present. Previous studies at Astor Pass identified a blind geothermal system controlled by the intersection of west-northwest and north-northwest striking dextral-normal faults. Wells drilled into the southwestern quadrant of the fault intersection yielded 94°C fluids, with geothermometers suggesting a maximum reservoir temperature of 130°C. A 3-dimensional model was constructed based on detailed geologic maps and cross-sections, 2-dimensional seismic data, and petrologic analysis of the cuttings from three wells in order to further constrain the structural setting. The model reveals the specific geometry of the fault interaction area at a level of detail beyond what geologic maps and cross-sections can provide.

  8. Studies of Cosmic Ray Modulation and Energetic Particle Propagation in Time-Dependent 3-Dimensional Heliospheric Magnetic Fields

    NASA Technical Reports Server (NTRS)

    Zhang, Ming

    2005-01-01

    The primary goal of this project was to perform theoretical calculations of the propagation of cosmic rays and energetic particles in 3-dimensional heliospheric magnetic fields. We used Markov stochastic process simulation to achieve this goal. We developed computational software to study particle propagation in two examples of heliospheric magnetic fields that must be treated in 3 dimensions: the field suggested by Fisk (1996) and a global heliosphere including the region beyond the termination shock. The results from our model calculations were compared with particle measurements from Ulysses, Earth-based spacecraft such as IMP-8, WIND and ACE, and the Voyagers and Pioneers in the outer heliosphere to test the magnetic field models. We particularly looked for features of particle variations that would allow us to clearly distinguish the Fisk magnetic field from the conventional Parker spiral field. The computer code will eventually lead to a new generation of integrated software for solving complicated problems of particle acceleration, propagation, and modulation in a realistic 3-dimensional heliosphere with realistic magnetic fields and solar wind within a single computational approach.

  9. Computer-automated neutron activation analysis system

    SciTech Connect

    Minor, M.M.; Garcia, S.R.

    1983-01-01

    An automated delayed neutron counting and instrumental neutron activation analysis system has been developed at Los Alamos National Laboratory's Omega West Reactor (OWR) to analyze samples for uranium and 31 additional elements with a maximum throughput of 400 samples per day. 5 references.

  10. Computational mechanics analysis tools for parallel-vector supercomputers

    NASA Technical Reports Server (NTRS)

    Storaasli, Olaf O.; Nguyen, Duc T.; Baddourah, Majdi; Qin, Jiangning

    1993-01-01

    Computational algorithms for structural analysis on parallel-vector supercomputers are reviewed. These parallel algorithms, developed by the authors, are for the assembly of structural equations, 'out-of-core' strategies for linear equation solution, massively distributed-memory equation solution, unsymmetric equation solution, general eigensolution, geometrically nonlinear finite element analysis, design sensitivity analysis for structural dynamics, optimization search analysis and domain decomposition. The source code for many of these algorithms is available.

  11. Local spatial frequency analysis for computer vision

    NASA Technical Reports Server (NTRS)

    Krumm, John; Shafer, Steven A.

    1990-01-01

    A sense of vision is a prerequisite for a robot to function in an unstructured environment. However, real-world scenes contain many interacting phenomena that lead to complex images which are difficult to interpret automatically. Typical computer vision research proceeds by analyzing various effects in isolation (e.g., shading, texture, stereo, defocus), usually on images devoid of realistic complicating factors. This leads to specialized algorithms which fail on real-world images. Part of this failure is due to the dichotomy of useful representations for these phenomena. Some effects are best described in the spatial domain, while others are more naturally expressed in frequency. In order to resolve this dichotomy, we present the combined space/frequency representation which, for each point in an image, shows the spatial frequencies at that point. Within this common representation, we develop a set of simple, natural theories describing phenomena such as texture, shape, aliasing and lens parameters. We show these theories lead to algorithms for shape from texture and for dealiasing image data. The space/frequency representation should be a key aid in untangling the complex interaction of phenomena in images, allowing automatic understanding of real-world scenes.
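
    The combined space/frequency representation described above can be approximated at any image point by the spectrum of a windowed patch; a minimal sketch (the window choice and sampling are illustrative only, not the authors' formulation):

    ```python
    import numpy as np

    def local_spectrum(image, y, x, size=32):
        """Local 2-D spatial-frequency content at (y, x): magnitude of the FFT of a
        Hann-windowed patch, a discrete stand-in for a space/frequency representation."""
        half = size // 2
        patch = image[y - half:y + half, x - half:x + half]
        g = np.hanning(size)                     # separable window to reduce leakage
        return np.abs(np.fft.fftshift(np.fft.fft2(patch * np.outer(g, g))))

    # Hypothetical example: a sinusoidal texture shows a peak at its spatial frequency.
    yy, xx = np.mgrid[0:128, 0:128]
    img = np.sin(2 * np.pi * xx / 8.0)
    spec = local_spectrum(img, 64, 64)
    print(np.unravel_index(spec.argmax(), spec.shape))
    ```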

  12. Acoustic analysis of a computer cooling fan

    NASA Astrophysics Data System (ADS)

    Huang, Lixi; Wang, Jian

    2005-10-01

    Noise radiated by a typical computer cooling fan is investigated experimentally and analyzed within the framework of rotor-stator interaction noise using point source formulation. The fan is 9 cm in rotor casing diameter and its design speed is 3000 rpm. The main noise sources are found and quantified; they are (a) the inlet flow distortion caused by the sharp edges of the incomplete bellmouth due to the square outer framework, (b) the interaction of rotor blades with the downstream struts which hold the motor, and (c) the extra size of one strut carrying electrical wiring. Methods are devised to extract the rotor-strut interaction noise, (b) and (c), radiated by the component forces of drag and thrust at the leading and higher order spinning pressure modes, as well as the leading edge noise generated by (a). By re-installing the original fan rotor in various casings, the noises radiated by the three features of the original fan are separated, and details of the directivity are interpreted. It is found that the inlet flow distortion and the unequal set of four struts make about the same amount of noise. Their corrections show a potential of around 10-dB sound power reduction.

  13. Adaptive computational methods for aerothermal heating analysis

    NASA Technical Reports Server (NTRS)

    Price, John M.; Oden, J. Tinsley

    1988-01-01

    The development of adaptive gridding techniques for finite-element analysis of fluid dynamics equations is described. The developmental work was done with the Euler equations with concentration on shock and inviscid flow field capturing. Ultimately this methodology is to be applied to a viscous analysis for the purpose of predicting accurate aerothermal loads on complex shapes subjected to high speed flow environments. The development of local error estimate strategies as a basis for refinement strategies is discussed, as well as the refinement strategies themselves. The application of the strategies to triangular elements and a finite-element flux-corrected-transport numerical scheme are presented. The implementation of these strategies in the GIM/PAGE code for 2-D and 3-D applications is documented and demonstrated.

  14. Interactive Spectral Analysis and Computation (ISAAC)

    NASA Technical Reports Server (NTRS)

    Lytle, D. M.

    1992-01-01

    Isaac is a task in the NSO external package for IRAF. A descendant of a FORTRAN program written to analyze data from a Fourier transform spectrometer, the current implementation has been generalized sufficiently to make it useful for general spectral analysis and other one-dimensional data analysis tasks. The user interface for Isaac is implemented as an interpreted mini-language containing a powerful, programmable vector calculator. Built-in commands provide much of the functionality needed to produce accurate line lists from input spectra. These built-in functions include automated spectral line finding, least squares fitting of Voigt profiles to spectral lines including equality constraints, various filters including an optimal filter construction tool, continuum fitting, and various I/O functions.
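
    The Voigt-profile fitting mentioned among the built-ins can be illustrated with standard scientific-Python tools; a minimal sketch on synthetic data (this is not ISAAC's implementation, and the equality-constraint machinery is omitted):

    ```python
    import numpy as np
    from scipy.optimize import curve_fit
    from scipy.special import voigt_profile

    def voigt_line(x, center, amplitude, sigma, gamma, offset):
        """Single Voigt line (Gaussian sigma, Lorentzian gamma) on a flat continuum."""
        return amplitude * voigt_profile(x - center, sigma, gamma) + offset

    # Hypothetical usage: fit a noisy synthetic spectrum by least squares.
    x = np.linspace(-5, 5, 400)
    truth = voigt_line(x, 0.3, 2.0, 0.5, 0.4, 1.0)
    noisy = truth + np.random.default_rng(2).normal(scale=0.02, size=x.size)
    popt, _ = curve_fit(voigt_line, x, noisy, p0=[0.0, 1.0, 0.3, 0.3, 1.0])
    print(popt)        # recovered center, amplitude, sigma, gamma, offset
    ```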

  15. AIR INGRESS ANALYSIS: COMPUTATIONAL FLUID DYNAMIC MODELS

    SciTech Connect

    Chang H. Oh; Eung S. Kim; Richard Schultz; Hans Gougar; David Petti; Hyung S. Kang

    2010-08-01

    The Idaho National Laboratory (INL), under the auspices of the U.S. Department of Energy, is performing research and development that focuses on key phenomena important during potential scenarios that may occur in very high temperature reactors (VHTRs). Phenomena Identification and Ranking Studies to date have ranked an air ingress event, following on the heels of a VHTR depressurization, as important with regard to core safety. Consequently, the development of advanced air-ingress-related models and verification and validation data is a very high priority. Following a loss-of-coolant and system depressurization incident, air will enter the core of the High Temperature Gas-Cooled Reactor through the break, possibly causing oxidation of the in-core and reflector graphite structure. Simple core and plant models indicate that, under certain circumstances, the oxidation may proceed at an elevated rate, with additional heat generated by the oxidation reaction itself. Under postulated conditions of fluid flow and temperature, excessive degradation of the lower plenum graphite can lead to a loss of structural support. Excessive oxidation of core graphite can also lead to the release of fission products into the confinement, which could be detrimental to reactor safety. The computational fluid dynamics models developed in this study will improve our understanding of this phenomenon. This paper presents two-dimensional and three-dimensional CFD results for the quantitative assessment of the air ingress phenomena. A portion of the results for the density-driven stratified flow in the inlet pipe is compared with experimental results.

  16. Computer-Aided Communication Satellite System Analysis and Optimization.

    ERIC Educational Resources Information Center

    Stagl, Thomas W.; And Others

    Various published computer programs for fixed/broadcast communication satellite system synthesis and optimization are discussed. The rationale for selecting General Dynamics/Convair's Satellite Telecommunication Analysis and Modeling Program (STAMP) in modified form to aid in the system costing and sensitivity analysis work in the Program on…

  17. Computational fluid dynamic analysis of liquid rocket combustion instability

    NASA Technical Reports Server (NTRS)

    Venkateswaran, Sankaran; Grenda, Jeffrey; Merkle, Charles L.

    1991-01-01

    The paper presents a computational analysis of liquid rocket combustion instability. Consideration is given to both a fully nonlinear unsteady calculation as well as a new CFD-based linearized stability analysis. An analytical solution for the linear stability problem in a constant area combustion chamber with uniform mean flow is developed to verify the numerical analyses.

  18. A Computer Program to Determine Reliability Using Analysis of Variance

    ERIC Educational Resources Information Center

    Burns, Edward

    1976-01-01

    A computer program, written in Fortran IV, is described which assesses reliability by using analysis of variance. It produces a complete analysis of variance table in addition to reliability coefficients for unadjusted and adjusted data as well as the intraclass correlation for m subjects and n items. (Author)
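
    The reliability coefficients and intraclass correlation mentioned above follow directly from the two-way ANOVA mean squares; a minimal Python sketch of one standard formulation (the original FORTRAN program may define its coefficients differently):

    ```python
    import numpy as np

    def anova_reliability(scores):
        """Two-way ANOVA reliability for an m-subjects x n-items score matrix:
        returns (single-rating intraclass correlation, reliability of the n-item mean)."""
        m, n = scores.shape
        grand = scores.mean()
        ss_subj = n * np.sum((scores.mean(axis=1) - grand) ** 2)
        ss_item = m * np.sum((scores.mean(axis=0) - grand) ** 2)
        ss_err = np.sum((scores - grand) ** 2) - ss_subj - ss_item
        ms_subj = ss_subj / (m - 1)
        ms_err = ss_err / ((m - 1) * (n - 1))
        icc_single = (ms_subj - ms_err) / (ms_subj + (n - 1) * ms_err)   # ICC(3,1)
        icc_mean = (ms_subj - ms_err) / ms_subj                          # ICC(3,n)
        return icc_single, icc_mean

    rng = np.random.default_rng(3)              # hypothetical data: 20 subjects, 5 items
    data = rng.normal(size=(20, 1)) + rng.normal(scale=0.5, size=(20, 5))
    print(anova_reliability(data))
    ```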

  19. Two Computer Programs for Factor Analysis. Technical Note Number 41.

    ERIC Educational Resources Information Center

    Wisler, Carl E.

    Two factor analysis algorithms, previously described by P. Horst, have been programmed for use on the General Electric Time-Sharing Computer System. The first of these, Principal Components Analysis (PCA), uses the Basic Structure Successive Factor Method With Residual Matrices algorithm to obtain the principal component vectors of a correlation…

  20. Computing environments for data analysis III: Programming environments

    SciTech Connect

    McDonald, J.A.; Pedersen, J.

    1988-03-01

    This is the third in a series of papers on aspects of modern computing environments that are relevant to statistical data analysis. In this paper the authors discuss programming environments. In particular, they argue that integrated programming environments (for example, Lisp and Smalltalk environments) are more appropriate as a base for data analysis than conventional operating systems (for example, Unix).

  1. Computational Aeroelastic Analysis of the Ares Launch Vehicle During Ascent

    NASA Technical Reports Server (NTRS)

    Bartels, Robert E.; Chwalowski, Pawel; Massey, Steven J.; Vatsa, Veer N.; Heeg, Jennifer; Wieseman, Carol D.; Mineck, Raymond E.

    2010-01-01

    This paper presents the static and dynamic computational aeroelastic (CAE) analyses of the Ares crew launch vehicle (CLV) during atmospheric ascent. The influence of launch vehicle flexibility on the static aerodynamic loading and integrated aerodynamic force and moment coefficients is discussed. The ultimate purpose of this analysis is to assess the aeroelastic stability of the launch vehicle along the ascent trajectory. A comparison of analysis results for several versions of the Ares CLV will be made. Flexible static and dynamic analyses based on rigid computational fluid dynamic (CFD) data are compared with a fully coupled aeroelastic time marching CFD analysis of the launch vehicle.

  2. AVES: A Computer Cluster System approach for INTEGRAL Scientific Analysis

    NASA Astrophysics Data System (ADS)

    Federici, M.; Martino, B. L.; Natalucci, L.; Umbertini, P.

    The AVES computing system, based on a "Cluster" architecture, is a fully integrated, low-cost computing facility dedicated to the archiving and analysis of INTEGRAL data. AVES is a modular system that uses the software resource manager SLURM and allows almost unlimited expandability (65,536 nodes and hundreds of thousands of processors); it is currently composed of 30 personal computers with quad-core CPUs able to reach a computing power of 300 gigaflops (300×10^9 floating-point operations per second), with 120 GB of RAM and 7.5 terabytes (TB) of storage memory in UFS configuration, plus 6 TB for the users' area. AVES was designed and built to solve growing problems arising from the analysis of the large amount of data accumulated by the INTEGRAL mission (currently about 9 TB and increasing every year). The analysis software used is the OSA package, distributed by the ISDC in Geneva. This is a very complex package consisting of dozens of programs that cannot be converted to parallel computing. To overcome this limitation we developed a series of programs to distribute the analysis workload on the various nodes, making AVES automatically divide the analysis into N jobs sent to N cores. This solution thus produces a result similar to that obtained with a parallel computing configuration. In support of this we have developed tools that allow flexible use of the scientific software and quality control of on-line data storage. The AVES software package consists of about 50 specific programs. The overall computing time has thus been improved by up to a factor of 70 compared with a personal computer with a single processor.
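
    The job-splitting strategy described above can be sketched with standard Python tooling. The actual AVES scripts and OSA invocations are not reproduced here; the analyse_obs command below is a hypothetical stand-in for the per-observation pipeline:

    ```python
    import subprocess
    from concurrent.futures import ProcessPoolExecutor

    # A strictly serial per-observation analysis can still use all cores by running
    # N independent jobs at once, one observation each (the AVES idea in miniature).
    def run_job(obs_id):
        try:
            # 'analyse_obs' is hypothetical, not a real OSA executable.
            return subprocess.run(["analyse_obs", obs_id], capture_output=True).returncode
        except FileNotFoundError:
            return 1    # pipeline not installed in this illustrative sketch

    if __name__ == "__main__":
        observations = [f"obs{i:04d}" for i in range(120)]
        with ProcessPoolExecutor(max_workers=4) as pool:   # N cores -> N parallel jobs
            codes = list(pool.map(run_job, observations))
        print(sum(c == 0 for c in codes), "jobs succeeded")
    ```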

  3. Visualisation of the Dynamics for Longitudinal Analysis of Computer-mediated Social Networks-concept and Exemplary Cases

    NASA Astrophysics Data System (ADS)

    Harrer, Andreas; Zeini, Sam; Ziebarth, Sabrina

    In this paper we demonstrate the potential of processing and visualising the dynamics of computer-mediated communities by means of Social Network Analysis. Because computer-mediated community systems are also manifested as structured data, we use data sources such as e-mail, discussion boards, and bibliographies for automatic transformation into social network data formats. The paper demonstrates a 3-dimensional visualisation of two cases: the first presents an author community based on bibliography data converted into GraphML. Based on this dataset we visualise publication networks with a tool called Weaver, which is developed in our research group. Following Lothar Krempel's algorithm, Weaver uses the first two dimensions to embed the network structure within a common solution space. The third dimension is used to represent the time axis and thus the dynamics of co-authorship relations. The second case describes recent research on open source communities and highlights how our visualisation approach can be used as a complement to more traditional approaches, such as content analysis and statistics based on specific SNA indices.
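
    A minimal sketch of the layout idea (not the Weaver tool itself): compute a 2-dimensional layout once as the common solution space, then use edge years as the third coordinate:

    ```python
    import networkx as nx

    # Hypothetical co-authorship edges, each stamped with a publication year.
    G = nx.Graph()
    G.add_edges_from([("A", "B", {"year": 2001}), ("B", "C", {"year": 2003}),
                      ("A", "C", {"year": 2005})])
    pos2d = nx.spring_layout(G, seed=42)                # common x/y solution space
    edges3d = [(pos2d[u].tolist() + [d["year"]], pos2d[v].tolist() + [d["year"]])
               for u, v, d in G.edges(data=True)]       # third axis = time
    print(edges3d)
    ```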

  4. A computer analysis of the Schreber Memoirs.

    PubMed

    Klein, R H

    1976-06-01

    With the aid of a computerized system for content analysis, WORDS, the complete Schreber Memoirs was subjected to various multivariate reduction techniques in order to investigate the major content themes of this document. The findings included the prevalence of somatic concerns throughout the Memoirs, clear references to persecutory ideas and to Schreber's assumption of a redemptive role, complex encapsulated concerns about Schreber's relationship with God, a lack of any close relationship between sexuality and sexual transformation either to themes of castration or procreation, and the fact that neither sun, God, nor Flechsig was significantly associated with clusters concerning gender, sexuality, or castration. These findings are discussed in relation to psychodynamic interpretations furnished by prior investigators who employed different research methods.

  5. A Computational Discriminability Analysis on Twin Fingerprints

    NASA Astrophysics Data System (ADS)

    Liu, Yu; Srihari, Sargur N.

    Sharing similar genetic traits makes the investigation of twins an important study in forensics and biometrics. Fingerprints are one of the most commonly found types of forensic evidence. The similarity between twins' prints is critical to establishing the reliability of fingerprint identification. We present a quantitative analysis of the discriminability of twin fingerprints on a new data set (227 pairs of identical and fraternal twins) recently collected from a twin population, using both level 1 and level 2 features. Although the patterns of minutiae among twins are more similar than in the general population, the similarity of twins' fingerprints is significantly different from that between genuine prints of the same finger. Twins' fingerprints are discriminable, with a 1.5%-1.7% higher equal error rate (EER) than non-twins, and identical twins can be distinguished by fingerprint examination with a slightly higher error rate than fraternal twins.
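
    The equal error rate (EER) quoted above is the operating point where false rejects equal false accepts; a minimal sketch with hypothetical score distributions standing in for twin and non-twin comparisons:

    ```python
    import numpy as np

    def equal_error_rate(genuine, impostor):
        """EER from match scores (higher = more similar): threshold where the
        false-reject rate on genuine pairs equals the false-accept rate on impostors."""
        thresholds = np.sort(np.concatenate([genuine, impostor]))
        frr = np.array([(genuine < t).mean() for t in thresholds])
        far = np.array([(impostor >= t).mean() for t in thresholds])
        i = np.argmin(np.abs(frr - far))    # closest crossing of the two curves
        return 0.5 * (frr[i] + far[i])

    rng = np.random.default_rng(4)          # made-up genuine vs. impostor scores
    print(equal_error_rate(rng.normal(2.0, 1.0, 1000), rng.normal(0.0, 1.0, 1000)))
    ```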

  6. CFD Based Computations of Flexible Helicopter Blades for Stability Analysis

    NASA Technical Reports Server (NTRS)

    Guruswamy, Guru P.

    2011-01-01

    As a collaborative effort among government aerospace research laboratories an advanced version of a widely used computational fluid dynamics code, OVERFLOW, was recently released. This latest version includes additions to model flexible rotating multiple blades. In this paper, the OVERFLOW code is applied to improve the accuracy of airload computations from the linear lifting line theory that uses displacements from beam model. Data transfers required at every revolution are managed through a Unix based script that runs jobs on large super-cluster computers. Results are demonstrated for the 4-bladed UH-60A helicopter. Deviations of computed data from flight data are evaluated. Fourier analysis post-processing that is suitable for aeroelastic stability computations are performed.

  7. A Novel Method of Orbital Floor Reconstruction Using Virtual Planning, 3-Dimensional Printing, and Autologous Bone.

    PubMed

    Vehmeijer, Maarten; van Eijnatten, Maureen; Liberton, Niels; Wolff, Jan

    2016-08-01

    Fractures of the orbital floor are often a result of traffic accidents or interpersonal violence. To date, numerous materials and methods have been used to reconstruct the orbital floor. However, simple and cost-effective 3-dimensional (3D) printing technologies for the treatment of orbital floor fractures are still sought. This study describes a simple, precise, cost-effective method of treating orbital fractures using 3D printing technologies in combination with autologous bone. Enophthalmos and diplopia developed in a 64-year-old female patient with an orbital floor fracture. A virtual 3D model of the fracture site was generated from computed tomography images of the patient. The fracture was virtually closed using spline interpolation. Furthermore, a virtual individualized mold of the defect site was created, which was manufactured using an inkjet printer. The tangible mold was subsequently used during surgery to sculpture an individualized autologous orbital floor implant. Virtual reconstruction of the orbital floor and the resulting mold enhanced the overall accuracy and efficiency of the surgical procedure. The sculptured autologous orbital floor implant showed an excellent fit in vivo. The combination of virtual planning and 3D printing offers an accurate and cost-effective treatment method for orbital floor fractures. PMID:27137437

  8. A Novel Method of Orbital Floor Reconstruction Using Virtual Planning, 3-Dimensional Printing, and Autologous Bone.

    PubMed

    Vehmeijer, Maarten; van Eijnatten, Maureen; Liberton, Niels; Wolff, Jan

    2016-08-01

    Fractures of the orbital floor are often a result of traffic accidents or interpersonal violence. To date, numerous materials and methods have been used to reconstruct the orbital floor. However, simple and cost-effective 3-dimensional (3D) printing technologies for the treatment of orbital floor fractures are still sought. This study describes a simple, precise, cost-effective method of treating orbital fractures using 3D printing technologies in combination with autologous bone. Enophthalmos and diplopia developed in a 64-year-old female patient with an orbital floor fracture. A virtual 3D model of the fracture site was generated from computed tomography images of the patient. The fracture was virtually closed using spline interpolation. Furthermore, a virtual individualized mold of the defect site was created, which was manufactured using an inkjet printer. The tangible mold was subsequently used during surgery to sculpture an individualized autologous orbital floor implant. Virtual reconstruction of the orbital floor and the resulting mold enhanced the overall accuracy and efficiency of the surgical procedure. The sculptured autologous orbital floor implant showed an excellent fit in vivo. The combination of virtual planning and 3D printing offers an accurate and cost-effective treatment method for orbital floor fractures.

  9. Computer-aided visualization and analysis system for sequence evaluation

    DOEpatents

    Chee, Mark S.

    2001-06-05

    A computer system (1) for analyzing nucleic acid sequences is provided. The computer system is used to perform multiple methods for determining unknown bases by analyzing the fluorescence intensities of hybridized nucleic acid probes. The results of individual experiments may be improved by processing nucleic acid sequences together. Comparative analysis of multiple experiments is also provided by displaying reference sequences in one area (814) and sample sequences in another area (816) on a display device (3).

  10. Computer-aided visualization and analysis system for sequence evaluation

    DOEpatents

    Chee, Mark S.

    1998-08-18

    A computer system for analyzing nucleic acid sequences is provided. The computer system is used to perform multiple methods for determining unknown bases by analyzing the fluorescence intensities of hybridized nucleic acid probes. The results of individual experiments are improved by processing nucleic acid sequences together. Comparative analysis of multiple experiments is also provided by displaying reference sequences in one area and sample sequences in another area on a display device.

  11. Computer-aided visualization and analysis system for sequence evaluation

    DOEpatents

    Chee, Mark S.

    1999-10-26

    A computer system (1) for analyzing nucleic acid sequences is provided. The computer system is used to perform multiple methods for determining unknown bases by analyzing the fluorescence intensities of hybridized nucleic acid probes. The results of individual experiments may be improved by processing nucleic acid sequences together. Comparative analysis of multiple experiments is also provided by displaying reference sequences in one area (814) and sample sequences in another area (816) on a display device (3).

  12. Computer-aided visualization and analysis system for sequence evaluation

    DOEpatents

    Chee, Mark S.; Wang, Chunwei; Jevons, Luis C.; Bernhart, Derek H.; Lipshutz, Robert J.

    2004-05-11

    A computer system for analyzing nucleic acid sequences is provided. The computer system is used to perform multiple methods for determining unknown bases by analyzing the fluorescence intensities of hybridized nucleic acid probes. The results of individual experiments are improved by processing nucleic acid sequences together. Comparative analysis of multiple experiments is also provided by displaying reference sequences in one area and sample sequences in another area on a display device.

  13. Computer-aided visualization and analysis system for sequence evaluation

    DOEpatents

    Chee, Mark S.

    2003-08-19

    A computer system for analyzing nucleic acid sequences is provided. The computer system is used to perform multiple methods for determining unknown bases by analyzing the fluorescence intensities of hybridized nucleic acid probes. The results of individual experiments may be improved by processing nucleic acid sequences together. Comparative analysis of multiple experiments is also provided by displaying reference sequences in one area and sample sequences in another area on a display device.

  14. Computer-aided visualization and analysis system for sequence evaluation

    DOEpatents

    Chee, M.S.

    1998-08-18

    A computer system for analyzing nucleic acid sequences is provided. The computer system is used to perform multiple methods for determining unknown bases by analyzing the fluorescence intensities of hybridized nucleic acid probes. The results of individual experiments are improved by processing nucleic acid sequences together. Comparative analysis of multiple experiments is also provided by displaying reference sequences in one area and sample sequences in another area on a display device. 27 figs.

  15. Computational mechanics analysis tools for parallel-vector supercomputers

    NASA Technical Reports Server (NTRS)

    Storaasli, O. O.; Nguyen, D. T.; Baddourah, M. A.; Qin, J.

    1993-01-01

    Computational algorithms for structural analysis on parallel-vector supercomputers are reviewed. These parallel algorithms, developed by the authors, are for the assembly of structural equations, 'out-of-core' strategies for linear equation solution, massively distributed-memory equation solution, unsymmetric equation solution, general eigen-solution, geometrically nonlinear finite element analysis, design sensitivity analysis for structural dynamics, optimization algorithm and domain decomposition. The source code for many of these algorithms is available from NASA Langley.

  16. Computer programs for analysis of geophysical data

    SciTech Connect

    Rozhkov, M.; Nakanishi, K.

    1994-06-01

    This project is oriented toward the application of the mobile seismic array data analysis technique in seismic investigations of the Earth (the noise-array method). The technique falls into the class of emission tomography methods but, in contrast to classic tomography, 3-D images of the microseismic activity of the media are obtained by passive seismic antenna scanning of the half-space, rather than by solution of the inverse Radon problem. It is reasonable to expect that areas of geothermal activity, active faults, areas of volcanic tremors and hydrocarbon deposits act as sources of intense internal microseismic activity or as effective sources for scattered (secondary) waves. The conventional approaches of seismic investigations of a geological medium include measurements of time-limited determinate signals from artificial or natural sources. However, continuous seismic oscillations, like endogenous microseisms, coda and scattered waves, can give very important information about the structure of the Earth. The presence of microseismic sources or inhomogeneities within the Earth results in the appearance of coherent seismic components in a stochastic wave field recorded on the surface by a seismic array. By careful processing of seismic array data, these coherent components can be used to develop a 3-D model of the microseismic activity of the media or images of the noisy objects. Thus, in contrast to classic seismology, where narrow windows are used to get the best time resolution of seismic signals, our model requires long record lengths for the best spatial resolution.

  17. Structural Analysis Using Computer Based Methods

    NASA Technical Reports Server (NTRS)

    Dietz, Matthew R.

    2013-01-01

    The stiffness of a flex hose that will be used in the umbilical arms of the Space Launch System's mobile launcher needed to be determined in order to properly qualify ground umbilical plate behavior during vehicle separation post T-0. These data are also necessary to properly size and design the motors used to retract the umbilical arms. Therefore an experiment was created to determine the stiffness of the hose. Before the test apparatus for the experiment could be built, the structure had to be analyzed to ensure it would not fail under the given loading conditions. The design model was imported into the analysis software and optimized to decrease runtime while still providing accurate results and allowing for seamless meshing. Areas exceeding the allowable stresses in the structure were located and modified before submitting the design for fabrication. In addition, a mock-up of a deep space habitat and its support frame was designed and needed to be analyzed for structural integrity under different loading conditions. The load cases were provided by the customer and were applied to the structure after optimizing the geometry. Once again, weak points in the structure were located, recommended design changes were conveyed to the customer, and the process was repeated until the load conditions were met without exceeding the allowable stresses. After the stresses met the required factors of safety, the designs were released for fabrication.

  18. Large-scale temporal analysis of computer and information science

    NASA Astrophysics Data System (ADS)

    Soos, Sandor; Kampis, George; Gulyás, László

    2013-09-01

    The main aim of the project reported in this paper was twofold. One of the primary goals was to produce an extensive source of network data for bibliometric analyses of field dynamics in the case of Computer and Information Science. To this end, we rendered the raw material of the DBLP computer and infoscience bibliography into a comprehensive collection of dynamic network data, readily available for further statistical analysis. The other goal was to demonstrate the value of our data source via its use in mapping Computer and Information Science (CIS). An analysis of the evolution of CIS was performed in terms of collaboration (co-authorship) network dynamics. The dynamic network analysis covered 76 years, from 1936 to date. Network evolution was described both at the macro level and at the meso level (in terms of community characteristics). Results show that the development of CIS followed what appears to be a universal pattern of growing into a "mature" discipline.

  19. On computational schemes for global-local stress analysis

    NASA Technical Reports Server (NTRS)

    Reddy, J. N.

    1989-01-01

    An overview is given of global-local stress analysis methods, associated difficulties, and recommendations for future research. The phrase global-local analysis is understood to mean an analysis in which some parts of the domain or structure are singled out for more accurate determination of stresses and displacements, or for more refined analysis, than the remaining parts. The parts given refined analysis are termed local and the remaining parts are called global. Typically, local regions are small in size compared to global regions, while the computational effort can be larger in local regions than in global regions.

  20. First Experiences with LHC Grid Computing and Distributed Analysis

    SciTech Connect

    Fisk, Ian

    2010-12-01

    In this presentation the experiences of the LHC experiments using grid computing were presented, with a focus on distributed analysis. After many years of development, preparation, exercises, and validation, the LHC (Large Hadron Collider) experiments are in operation. The computing infrastructure has been heavily utilized in the first 6 months of data collection. The general experience of exploiting the grid infrastructure for organized processing and preparation is described, as well as the successes in employing the infrastructure for distributed analysis. Finally, the expected evolution and future plans are outlined.

  1. MSFC crack growth analysis computer program, version 2 (users manual)

    NASA Technical Reports Server (NTRS)

    Creager, M.

    1976-01-01

    An updated version of the George C. Marshall Space Flight Center Crack Growth Analysis Program is described. The updated computer program has significantly expanded capabilities over the original one. This increased capability includes an extensive expansion of the library of stress intensity factors, plotting capability, increased design iteration capability, and the capability of performing proof test logic analysis. The technical approaches used within the computer program are presented, and the input and output formats and options are described. Details of the stress intensity equations, example data, and example problems are presented.

  2. Computational Analysis of the SRS Phase III Salt Disposition Alternatives

    SciTech Connect

    Dimenna, R.A.

    1999-10-07

    Completion of the Phase III evaluation and comparison of salt disposition alternatives was supported with enhanced computer models and analysis for each case on the "short list" of four options. SPEEDUP(TM) models and special-purpose models describing mass and energy balances and flow rates were developed and used to predict performance and production characteristics for each of the options. Results from the computational analysis were a key part of the input used to select a primary and an alternate salt disposition alternative.

  3. Computational Fluid Dynamics Analysis of Thoracic Aortic Dissection

    NASA Astrophysics Data System (ADS)

    Tang, Yik; Fan, Yi; Cheng, Stephen; Chow, Kwok

    2011-11-01

    Thoracic Aortic Dissection (TAD) is a cardiovascular disease with high mortality. An aortic dissection forms when blood infiltrates the layers of the vascular wall and a new artificial channel, the false lumen, is created. The expansion of the blood vessel due to the weakened wall increases the risk of rupture. Computational fluid dynamics analysis is performed to study the hemodynamics of this pathological condition. Both idealized geometries and realistic patient configurations from computed tomography (CT) images are investigated. Physiological boundary conditions from in vivo measurements are employed. Flow configuration and biomechanical forces are studied. Quantitative analysis allows clinicians to assess the risk of rupture in making decisions regarding surgical intervention.

  4. The 3-dimensional cored and logarithm potentials: Periodic orbits

    SciTech Connect

    Kulesza, Maité; Llibre, Jaume

    2014-11-15

    We study analytically families of periodic orbits for the cored and logarithmic Hamiltonians with 3 degrees of freedom, which are relevant in the analysis of galactic dynamics. First, after introducing a scale transformation in the coordinates and momenta with a parameter ε, we show that both systems give essentially the same set of equations of motion up to first order in ε. Then the conditions for finding families of periodic orbits, using the averaging theory up to first order in ε, apply equally to both systems on every energy level H = h > 0, showing the existence of at least 3 periodic orbits for ε small enough, and also provide an analytic approximation for the initial conditions of these periodic orbits. In summary, we prove that at every positive energy level the cored and logarithmic Hamiltonians with 3 degrees of freedom have at least three periodic solutions. The technique used for proving this result can be applied to other Hamiltonian systems.
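
    For reference, the first-order averaging result invoked here can be written in its standard textbook form (this is the generic statement, not the authors' exact notation):

        % Periodic perturbation problem in standard form:
        \[
          \dot{x} = \varepsilon F_1(t,x) + \varepsilon^2 R(t,x,\varepsilon),
          \qquad x \in \Omega \subset \mathbb{R}^n,
        \]
        % with F_1 and R being T-periodic in t. Define the averaged function
        \[
          f(z) = \frac{1}{T}\int_0^T F_1(t,z)\,dt .
        \]
        % If z_0 satisfies f(z_0) = 0 with \det Df(z_0) \neq 0, then for
        % |\varepsilon| > 0 sufficiently small the system has a T-periodic
        % solution x(t,\varepsilon) with x(0,\varepsilon) \to z_0 as
        % \varepsilon \to 0; each simple zero of f yields one periodic orbit.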

  5. Advances in computational design and analysis of airbreathing propulsion systems

    NASA Technical Reports Server (NTRS)

    Klineberg, John M.

    1989-01-01

    The development of commercial and military aircraft depends, to a large extent, on engine manufacturers being able to achieve significant increases in propulsion capability through improved component aerodynamics, materials, and structures. The recent history of propulsion has been marked by efforts to develop computational techniques that can speed up the propulsion design process and produce superior designs. The availability of powerful supercomputers, such as the NASA Numerical Aerodynamic Simulator, and the potential for even higher performance offered by parallel computer architectures, have opened the door to the use of multi-dimensional simulations to study complex physical phenomena in propulsion systems that have previously defied analysis or experimental observation. An overview is provided of several NASA Lewis research efforts that are contributing toward the long-range goal of a numerical test-cell for the integrated, multidisciplinary design, analysis, and optimization of propulsion systems. Specific examples in Internal Computational Fluid Mechanics, Computational Structural Mechanics, Computational Materials Science, and High Performance Computing are cited and described in terms of current capabilities, technical challenges, and future research directions.

  6. AKSATINT - SATELLITE INTERFERENCE ANALYSIS AND SIMULATION USING PERSONAL COMPUTERS

    NASA Technical Reports Server (NTRS)

    Kantak, A.

    1994-01-01

    In the late seventies the number of communication satellites in service increased, and interference became an increasingly important consideration in designing satellite/ground station communications systems. Satellite Interference Analysis and Simulation Using Personal Computers, AKSATINT, models the interference experienced by a generic satellite communications receiving station due to an interfering satellite. Both the desired and the interfering satellites are considered to be in elliptical orbits. The simulation contains computation of the orbital positions of both satellites using classical orbital elements, calculation of the satellite antenna look angles for both satellites and the elevation angles at the desired-satellite ground-station antenna, and computation of the Doppler effect due to the motions of the satellites and the Earth's rotation. AKSATINT also computes the interference-to-signal-power ratio, taking into account losses suffered by the links. After computing the interference-to-signal-power ratio, the program computes the statistical quantities. The statistical formulation of the interference effect is presented in the form of a histogram of the interference-to-desired-signal power ratio. The program includes a flowchart, a sample run, and results of that run. AKSATINT is expected to be of general use to system designers and frequency managers in selecting the proper frequency under an interference scenario. The AKSATINT program is written in BASIC. It was designed to operate on the IBM Personal Computer AT or compatibles, and has been implemented under MS DOS 3.2. AKSATINT was developed in 1987.
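
    Two of the quantities listed above lend themselves to a compact illustration: the Doppler shift from the relative range rate, and an interference-to-signal power ratio from free-space link budgets. The sketch below is a Python rendering of those textbook formulas with hypothetical numbers; AKSATINT itself is a BASIC program that additionally models antenna patterns and full orbital geometry.

        import math

        C = 299_792_458.0                        # speed of light, m/s

        def doppler_shift(f_carrier_hz, range_rate_m_s):
            """Receding satellite (positive range rate) lowers frequency."""
            return -f_carrier_hz * range_rate_m_s / C

        def received_power_dbw(eirp_dbw, freq_hz, dist_m, rx_gain_dbi=0.0):
            """Free-space link budget: P_rx = EIRP + G_rx - FSPL."""
            fspl_db = 20 * math.log10(4 * math.pi * dist_m * freq_hz / C)
            return eirp_dbw + rx_gain_dbi - fspl_db

        f = 6.0e9                                # hypothetical 6 GHz carrier
        sig = received_power_dbw(20.0, f, 38_000e3)
        intf = received_power_dbw(17.0, f, 39_000e3,
                                  rx_gain_dbi=-20.0)   # off-axis rejection
        print(f"Doppler at 3 km/s range rate: {doppler_shift(f, 3000.0):.0f} Hz")
        print(f"I/S ratio: {intf - sig:.1f} dB")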

  7. Monolithically integrated Helmholtz coils by 3-dimensional printing

    SciTech Connect

    Li, Longguang; Abedini-Nassab, Roozbeh; Yellen, Benjamin B.

    2014-06-23

    3D printing technology is of great interest for the monolithic fabrication of integrated systems; however, it is a challenge to introduce metallic components into 3D printed molds to enable broader device functionality. Here, we develop a technique for constructing a multi-axial Helmholtz coil by injecting a eutectic liquid metal Gallium Indium alloy (EGaIn) into helically shaped orthogonal cavities constructed in a 3D printed block. The tri-axial solenoids each carry up to 3.6 A of electrical current and produce magnetic fields of up to 70 G. Within the central section of the coil, the field variation is less than 1% and is in agreement with theory. The flow rates and critical pressures required to fill the 3D cavities with liquid metal also agree with theoretical predictions and provide scaling trends for filling the 3D printed parts. These monolithically integrated solenoids may find future applications in electronic cell culture platforms, atomic traps, and miniaturized chemical analysis systems based on nuclear magnetic resonance.

  8. Monolithically integrated Helmholtz coils by 3-dimensional printing

    NASA Astrophysics Data System (ADS)

    Li, Longguang; Abedini-Nassab, Roozbeh; Yellen, Benjamin B.

    2014-06-01

    3D printing technology is of great interest for the monolithic fabrication of integrated systems; however, it is a challenge to introduce metallic components into 3D printed molds to enable broader device functionality. Here, we develop a technique for constructing a multi-axial Helmholtz coil by injecting a eutectic liquid metal Gallium Indium alloy (EGaIn) into helically shaped orthogonal cavities constructed in a 3D printed block. The tri-axial solenoids each carry up to 3.6 A of electrical current and produce magnetic fields of up to 70 G. Within the central section of the coil, the field variation is less than 1% and is in agreement with theory. The flow rates and critical pressures required to fill the 3D cavities with liquid metal also agree with theoretical predictions and provide scaling trends for filling the 3D printed parts. These monolithically integrated solenoids may find future applications in electronic cell culture platforms, atomic traps, and miniaturized chemical analysis systems based on nuclear magnetic resonance.

  9. Analysis of computational footprinting methods for DNase sequencing experiments.

    PubMed

    Gusmao, Eduardo G; Allhoff, Manuel; Zenke, Martin; Costa, Ivan G

    2016-04-01

    DNase-seq allows nucleotide-level identification of transcription factor binding sites on the basis of a computational search of footprint-like DNase I cleavage patterns on the DNA. Frequently in high-throughput methods, experimental artifacts such as DNase I cleavage bias affect the computational analysis of DNase-seq experiments. Here we performed a comprehensive and systematic study on the performance of computational footprinting methods. We evaluated ten footprinting methods in a panel of DNase-seq experiments for their ability to recover cell-specific transcription factor binding sites. We show that three methods--HINT, DNase2TF and PIQ--consistently outperformed the other evaluated methods and that correcting the DNase-seq signal for experimental artifacts significantly improved the accuracy of computational footprints. We also propose a score that can be used to detect footprints arising from transcription factors with potentially short residence times.
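
    The signal pattern these methods search for is a dip in DNase I cleavage flanked by elevated cleavage. The sketch below scores that shape with a generic depletion statistic; it is not the score proposed in the paper, and the window sizes and simulated counts are assumptions.

        import numpy as np

        def footprint_scores(cuts, center=20, flank=15):
            """cuts: per-base cleavage counts. Score each position by
            mean(flanking cleavage) - mean(central cleavage)."""
            scores = np.full(len(cuts), -np.inf)
            half = center // 2
            for i in range(flank + half, len(cuts) - flank - half):
                mid = cuts[i - half:i + half].mean()
                left = cuts[i - half - flank:i - half].mean()
                right = cuts[i + half:i + half + flank].mean()
                scores[i] = 0.5 * (left + right) - mid
            return scores

        rng = np.random.default_rng(1)
        cuts = rng.poisson(8.0, 300).astype(float)
        cuts[140:160] *= 0.1            # a protected (bound) 20-bp patch
        print(int(footprint_scores(cuts).argmax()))   # ~150, patch center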

  10. Integration of rocket turbine design and analysis through computer graphics

    NASA Technical Reports Server (NTRS)

    Hsu, Wayne; Boynton, Jim

    1988-01-01

    An interactive approach with engineering computer graphics is used to integrate the design and analysis processes of a rocket engine turbine into a progressive and iterative design procedure. The processes are interconnected through pre- and postprocessors. The graphics are used to generate the blade profiles, their stacking, finite element generation, and analysis presentation through color graphics. Steps of the design process discussed include pitch-line design, axisymmetric hub-to-tip meridional design, and quasi-three-dimensional analysis. The viscous two- and three-dimensional analysis codes are executed after acceptable designs are achieved and estimates of initial losses are confirmed.

  11. 3-dimensional wells and tunnels for finite element grids

    SciTech Connect

    Cherry, T.A.; Gable, C.W.; Trease, H.

    1996-12-31

    Modeling fluid, vapor, and air injection and extraction from wells poses a number of problems. The length scale of well bores is centimeters, the region of high pressure gradient may be tens of meters and the reservoir may be tens of kilometers. Furthermore, accurate representation of the path of a deviated well can be difficult. Incorporating the physics of injection and extraction can be made easier and more accurate with automated grid generation tools that incorporate wells as part of a background mesh that represents the reservoir. GEOMESH is a modeling tool developed for automating finite element grid generation. This tool maintains the geometric integrity of the geologic framework and produces optimal (Delaunay) tetrahedral grids. GEOMESH creates a 3D well as hexagonal segments formed along the path of the well. This well structure is tetrahedralized into a Delaunay mesh and then embedded into a background mesh. The well structure can be radially or vertically refined and each well layer is assigned a material property or can take on the material properties of the surrounding stratigraphy. The resulting embedded well can then be used by unstructured finite element models for gas and fluid flow in the vicinity of wells or tunnels. This 3D well representation allows the study of the free-surface of the well and surrounding stratigraphy. It reduces possible grid orientation effects, and allows better correlation between well sample data and the geologic model. The well grids also allow improved visualization for well and tunnel model analysis. 3D observation of the grids helps qualitative interpretation and can reveal features not apparent in fewer dimensions.
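
    The embed-a-well-in-a-background-mesh idea can be sketched with an off-the-shelf Delaunay tetrahedralizer: refine points along a deviated well path, merge them with coarse reservoir points, and triangulate the union. The geometry and spacings below are illustrative, and scipy/Qhull stands in for GEOMESH, which is a separate and far more capable tool.

        import numpy as np
        from scipy.spatial import Delaunay

        # Deviated well path: vertical near the surface, kicking off in x.
        s = np.linspace(0.0, 1.0, 60)
        well = np.column_stack([400.0 * s**2, np.zeros_like(s), -1000.0 * s])

        # Coarse background nodes representing the reservoir volume.
        rng = np.random.default_rng(2)
        background = rng.uniform([-500, -500, -1000], [900, 500, 0], (300, 3))

        nodes = np.vstack([well, background])
        tets = Delaunay(nodes)                 # 3-D Delaunay tetrahedra
        print(len(nodes), "nodes ->", len(tets.simplices), "tetrahedra")

        # Tetrahedra touching well nodes (indices < 60) could inherit
        # "well" material properties; the rest take the stratigraphy's.
        well_tets = (tets.simplices < 60).any(axis=1)
        print(int(well_tets.sum()), "tetrahedra adjacent to the well")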

  12. 3-dimensional wells and tunnels for finite element grids

    SciTech Connect

    Cherry, T.A.; Gable, C.W.; Trease, H.

    1996-04-01

    Modeling fluid, vapor, and air injection and extraction from wells poses a number of problems. The length scale of well bores is centimeters, the region of high pressure gradient may be tens of meters and the reservoir may be tens of kilometers. Furthermore, accurate representation of the path of a deviated well can be difficult. Incorporating the physics of injection and extraction can be made easier and more accurate with automated grid generation tools that incorporate wells as part of a background mesh that represents the reservoir. GEOMESH is a modeling tool developed for automating finite element grid generation. This tool maintains the geometric integrity of the geologic framework and produces optimal (Delaunay) tetrahedral grids. GEOMESH creates a 3D well as hexagonal segments formed along the path of the well. This well structure is tetrahedralized into a Delaunay mesh and then embedded into a background mesh. The well structure can be radially or vertically refined and each well layer is assigned a material property or can take on the material properties of the surrounding stratigraphy. The resulting embedded well can then be used by unstructured finite element models for gas and fluid flow in the vicinity of wells or tunnels. This 3D well representation allows the study of the free-surface of the well and surrounding stratigraphy. It reduces possible grid orientation effects, and allows better correlation between well sample data and the geologic model. The well grids also allow improved visualization for well and tunnel model analysis. 3D observation of the grids helps qualitative interpretation and can reveal features not apparent in fewer dimensions.

  13. Integrating computer programs for engineering analysis and design

    NASA Technical Reports Server (NTRS)

    Wilhite, A. W.; Crisp, V. K.; Johnson, S. C.

    1983-01-01

    The design of a third-generation system for integrating computer programs for engineering and design has been developed for the Aerospace Vehicle Interactive Design (AVID) system. This system consists of an engineering data management system, program interface software, a user interface, and a geometry system. A relational information system (ARIS) was developed specifically for the computer-aided engineering system. It is used for a repository of design data that are communicated between analysis programs, for a dictionary that describes these design data, for a directory that describes the analysis programs, and for other system functions. A method is described for interfacing independent analysis programs into a loosely-coupled design system. This method emphasizes an interactive extension of analysis techniques and manipulation of design data. Also, integrity mechanisms exist to maintain database correctness for multidisciplinary design tasks by an individual or a team of specialists. Finally, a prototype user interface program has been developed to aid in system utilization.

  14. Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period October 1, 1986 through March 31, 1987 is summarized.

  15. Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science

    NASA Technical Reports Server (NTRS)

    1988-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period April l, 1988 through September 30, 1988.

  16. Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period April, 1986 through September 30, 1986 is summarized.

  17. Finite element dynamic analysis on CDC STAR-100 computer

    NASA Technical Reports Server (NTRS)

    Noor, A. K.; Lambiotte, J. J., Jr.

    1978-01-01

    Computational algorithms are presented for the finite element dynamic analysis of structures on the CDC STAR-100 computer. The spatial behavior is described using higher-order finite elements. The temporal behavior is approximated by using either the central difference explicit scheme or Newmark's implicit scheme. In each case the analysis is broken up into a number of basic macro-operations. Discussion is focused on the organization of the computation and the mode of storage of different arrays to take advantage of the STAR pipeline capability. The potential of the proposed algorithms is discussed and CPU times are given for performing the different macro-operations for a shell modeled by higher order composite shallow shell elements having 80 degrees of freedom.
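
    Of the two time-integration schemes named above, Newmark's implicit method is easily sketched for a small dense system M u'' + K u = f(t); the 2-DOF matrices and loading below are hypothetical, and the STAR-100 work of course targeted much larger finite element systems with pipelined kernels.

        import numpy as np

        def newmark(M, K, f, u0, v0, dt, nsteps, beta=0.25, gamma=0.5):
            """Average-acceleration Newmark stepping for M u'' + K u = f."""
            u, v = u0.copy(), v0.copy()
            a = np.linalg.solve(M, f(0.0) - K @ u)     # initial acceleration
            Keff = M / (beta * dt**2) + K              # effective stiffness
            for n in range(1, nsteps + 1):
                rhs = f(n * dt) + M @ (u / (beta * dt**2) + v / (beta * dt)
                                       + (0.5 / beta - 1.0) * a)
                u_new = np.linalg.solve(Keff, rhs)
                a_new = ((u_new - u) / (beta * dt**2) - v / (beta * dt)
                         - (0.5 / beta - 1.0) * a)
                v += dt * ((1 - gamma) * a + gamma * a_new)
                u, a = u_new, a_new
            return u

        M = np.diag([2.0, 1.0])
        K = np.array([[600.0, -200.0], [-200.0, 200.0]])
        load = lambda t: np.array([0.0, 50.0 * np.sin(5.0 * t)])
        print(newmark(M, K, load, np.zeros(2), np.zeros(2), 0.01, 500))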

  18. Computer analysis of shells of revolution using asymptotic results

    NASA Technical Reports Server (NTRS)

    Steele, C. R.; Ranjan, G. V.; Goto, C.; Pulliam, T. H.

    1979-01-01

    It is suggested that asymptotic results for the behavior of thin shells can be incorporated in a general computer code for the analysis of a complex shell structure. The advantage when compared to existing finite difference or finite element codes is a substantial reduction in computational labor, with the capability of working to a specified level of accuracy. A reduction in user preparation time and dependence on user judgment is also gained, since mesh spacing can be internally generated. The general theory is described in this paper, as well as the implementation in the computer code FAST 1 (Functional Algorithm for Shell Theory) for the analysis of the general axisymmetric shell structure with axisymmetric loading.

  19. Mathematical analysis and computer program development for electromagnetic science studies

    NASA Astrophysics Data System (ADS)

    Tichovolsky, E.; Cohen, E.; Solin, R.; Dibeneditto, T.; Obrien, J. V.; Gersht, Y.; Bamforth, J.; Weiss, A.; Kerwin, N.; Krol, Y.

    1981-11-01

    This report contains representative examples of analysis and programming assignments executed for RADC technical personnel by the staff of the ACRON Corporation. An assortment of analytical and computational problems, as well as data processing tasks, was performed for Air Force projects requiring signal processing and electromagnetic theory. The core of the work pertained to LORAN-related studies, antenna theory, target detection and tracking problems, and the analysis of acoustic and magnetostatic wave devices.

  20. Computational Methods for the Analysis of Array Comparative Genomic Hybridization

    PubMed Central

    Chari, Raj; Lockwood, William W.; Lam, Wan L.

    2006-01-01

    Array comparative genomic hybridization (array CGH) is a technique for assaying the copy number status of cancer genomes. The widespread use of this technology has led to a rapid accumulation of high-throughput data, which in turn has prompted the development of computational strategies for the analysis of array CGH data. Here we explain the principles behind array image processing, data visualization and genomic profile analysis, review currently available software packages, and raise considerations for future software development. PMID:17992253

  1. Connecting Performance Analysis and Visualization to Advance Extreme Scale Computing

    SciTech Connect

    Bremer, Peer-Timo; Mohr, Bernd; Schulz, Martin; Pascucci, Valerio; Gamblin, Todd; Brunst, Holger

    2015-07-29

    The characterization, modeling, analysis, and tuning of software performance has been a central topic in High Performance Computing (HPC) since its early beginnings. The overall goal is to make HPC software run faster on particular hardware, either through better scheduling, on-node resource utilization, or more efficient distributed communication.

  2. The NASA NASTRAN structural analysis computer program - New content

    NASA Technical Reports Server (NTRS)

    Weidman, D. J.

    1978-01-01

    Capabilities of a NASA-developed structural analysis computer program, NASTRAN, are evaluated with reference to finite-element modelling. Applications include the automotive industry as well as aerospace. It is noted that the range of sub-programs within NASTRAN has expanded, while keeping user cost low.

  3. Computer-Assisted Analysis of Qualitative Gerontological Research.

    ERIC Educational Resources Information Center

    Hiemstra, Roger; And Others

    1987-01-01

    Asserts that qualitative research has great potential for use in gerontological research. Describes QUALOG, a computer-assisted, qualitative data analysis scheme using logic programming developed at Syracuse University. Reviews development of QUALOG and discusses how QUALOG was used to analyze data from a qualitative study of older adult learners.…

  4. Detection of microcalcification in computer-assisted mammogram analysis

    NASA Astrophysics Data System (ADS)

    Naghdy, Golshah A.; Naghdy, Fazel; Yue, L.; Drijarkara, A. P.

    1999-07-01

    The latest trend in computer-assisted mammogram analysis is reviewed, and two new methods developed by the authors for the automatic detection of microcalcifications (MCs) are presented. The first method is based on wavelet neurone feature detectors and ART classifiers, while the second utilizes fuzzy rules for the detection and grading of MCs.

  5. 3-dimensional modeling of transcranial magnetic stimulation: Design and application

    NASA Astrophysics Data System (ADS)

    Salinas, Felipe Santiago

    Over the past three decades, transcranial magnetic stimulation (TMS) has emerged as an effective tool for many research, diagnostic and therapeutic applications in humans. TMS delivers highly localized brain stimulations via non-invasive externally applied magnetic fields. This non-invasive, painless technique provides researchers and clinicians a unique tool capable of stimulating both the central and peripheral nervous systems. However, a complete analysis of the macroscopic electric fields produced by TMS has not yet been performed. In this dissertation, we present a thorough examination of the total electric field induced by TMS in air and a realistic head model with clinically relevant coil poses. In the first chapter, a detailed account of TMS coil wiring geometry was shown to provide significant improvements in the accuracy of primary E-field calculations. Three-dimensional models which accounted for the TMS coil's wire width, height, shape and number of turns clearly improved the fit of calculated-to-measured E-fields near the coil body. Detailed primary E-field models were accurate up to the surface of the coil body (within 0.5% of measured values) whereas simple models were often inadequate (up to 32% different from measured). In the second chapter, we addressed the importance of the secondary E-field created by surface charge accumulation during TMS using the boundary element method (BEM). 3-D models were developed using simple head geometries in order to test the model and compare it with measured values. The effects of tissue geometry, size and conductivity were also investigated. Finally, a realistic head model was used to assess the effect of multiple surfaces on the total E-field. We found that secondary E-fields have the greatest impact at areas in close proximity to each tissue layer. Throughout the head, the secondary E-field magnitudes were predominantly between 25% and 45% of the primary E-fields magnitude. The direction of the secondary E

  6. Analysis of Computer-Mediated Communication: Using Formal Concept Analysis as a Visualizing Methodology.

    ERIC Educational Resources Information Center

    Hara, Noriko

    2002-01-01

    Introduces the use of Formal Concept Analysis (FCA) as a methodology to visualize the data in computer-mediated communication. Bases FCA on a mathematical lattice theory and offers visual maps (graphs) with conceptual hierarchies, and proposes use of FCA combined with content analysis to analyze computer-mediated communication. (Author/LRW)

  7. Quantum computation in the analysis of hyperspectral data

    NASA Astrophysics Data System (ADS)

    Gomez, Richard B.; Ghoshal, Debabrata; Jayanna, Anil

    2004-08-01

    Recent research on the topic of quantum computation provides us with some quantum algorithms with higher efficiency and speedup compared to their classical counterparts. In this paper, it is our intent to provide the results of our investigation of several applications of such quantum algorithms - especially Grover's search algorithm - in the analysis of hyperspectral data. We found many parallels with Grover's method in existing data processing work that makes use of classical spectral matching algorithms. Our efforts also included the study of several methods dealing with hyperspectral image analysis in which classical computation methods involving large data sets could be replaced with quantum computation methods. The crux of the problem in computation involving a hyperspectral image data cube is to convert the large amount of data in high-dimensional space into real information. Currently, using the classical model, several time-consuming methods and steps are necessary to analyze these data, including animation, the minimum noise fraction transform, the pixel purity index algorithm, N-dimensional scatter plots, and identification of endmember spectra. If a quantum model of computation involving hyperspectral image data can be developed and formalized, it is highly likely that information retrieval from hyperspectral image data cubes would be a much easier process and the final information content would be much more meaningful and timely. In this case, dimensionality would not be a curse, but a blessing.
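
    Grover's algorithm is small enough to simulate directly with a statevector, which makes the claimed speedup concrete: an unstructured search over N items needs about (pi/4)*sqrt(N) oracle calls rather than roughly N/2 classical probes. In the sketch below the single "marked" index stands in for a matching reference spectrum; this is a generic simulation, not the authors' hyperspectral pipeline.

        import numpy as np

        def grover(n_qubits, marked):
            N = 2 ** n_qubits
            state = np.full(N, 1 / np.sqrt(N))       # uniform superposition
            iters = int(np.floor(np.pi / 4 * np.sqrt(N)))
            for _ in range(iters):
                state[marked] *= -1.0                 # oracle phase flip
                state = 2 * state.mean() - state      # diffusion (2|s><s|-I)
            return state, iters

        state, iters = grover(n_qubits=10, marked=731)
        print(f"{iters} iterations, P(marked) = {abs(state[731])**2:.4f}")
        # ~25 iterations give near-unit success probability for N = 1024.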

  8. Computational analysis of Ebolavirus data: prospects, promises and challenges.

    PubMed

    Michaelis, Martin; Rossman, Jeremy S; Wass, Mark N

    2016-08-15

    The ongoing Ebola virus (also known as Zaire ebolavirus, a member of the Ebolavirus family) outbreak in West Africa has so far resulted in >28000 confirmed cases compared with previous Ebolavirus outbreaks that affected a maximum of a few hundred individuals. Hence, Ebolaviruses impose a much greater threat than we may have expected (or hoped). An improved understanding of the virus biology is essential to develop therapeutic and preventive measures and to be better prepared for future outbreaks by members of the Ebolavirus family. Computational investigations can complement wet laboratory research for biosafety level 4 pathogens such as Ebolaviruses for which the wet experimental capacities are limited due to a small number of appropriate containment laboratories. During the current West Africa outbreak, sequence data from many Ebola virus genomes became available providing a rich resource for computational analysis. Here, we consider the studies that have already reported on the computational analysis of these data. A range of properties have been investigated including Ebolavirus evolution and pathogenicity, prediction of micro RNAs and identification of Ebolavirus specific signatures. However, the accuracy of the results remains to be confirmed by wet laboratory experiments. Therefore, communication and exchange between computational and wet laboratory researchers is necessary to make maximum use of computational analyses and to iteratively improve these approaches.

  9. Parallel multithread computing for spectroscopic analysis in optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Trojanowski, Michal; Kraszewski, Maciej; Strakowski, Marcin; Pluciński, Jerzy

    2014-05-01

    Spectroscopic Optical Coherence Tomography (SOCT) is an extension of Optical Coherence Tomography (OCT). It allows gathering spectroscopic information from individual scattering points inside the sample and is based on time-frequency analysis of interferometric signals. Such analysis requires calculating hundreds of Fourier transforms while performing a single A-scan, and further processing of the acquired spectroscopic information is needed, which significantly increases the required computation time. In recent years, the application of graphics processing units (GPUs) has been proposed to reduce computation time in OCT by using parallel computing algorithms. GPU technology can also be used to speed up signal processing in SOCT. However, the parallel algorithms used in classical OCT need to be revised because of the different character of the analyzed data: classical OCT requires processing long, independent interferometric signals to obtain subsequent A-scans, whereas SOCT requires processing multiple, shorter signals, which differ only in a small fraction of their samples. We have developed new algorithms for parallel signal processing in SOCT, implemented with NVIDIA CUDA (Compute Unified Device Architecture). We present details of the algorithms and performance tests for analyzing data from an in-house SD-OCT system, and give a brief discussion of the usefulness of the developed algorithms. The presented algorithms might be useful for researchers working on OCT, as they reduce computation time and are a step toward real-time signal processing of SOCT data.
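
    The time-frequency analysis at the heart of SOCT amounts to a short-time Fourier transform of each interferometric signal: one windowed FFT per window position, and hence hundreds of FFTs per A-scan. The serial sketch below shows the kind of inner loop a CUDA implementation parallelizes; the synthetic fringe signal and window parameters are illustrative assumptions.

        import numpy as np

        fs, n = 1000.0, 4096
        t = np.arange(n) / fs
        # Fringe whose frequency drifts along the record.
        fringe = np.cos(2 * np.pi * (80.0 + 40.0 * t / t[-1]) * t)

        def stft_magnitude(signal, win_len=256, hop=32):
            win = np.hanning(win_len)
            frames = [signal[i:i + win_len] * win
                      for i in range(0, len(signal) - win_len + 1, hop)]
            return np.abs(np.fft.rfft(np.stack(frames), axis=1))

        tf_map = stft_magnitude(fringe)
        print(tf_map.shape)                    # (windows, win_len//2 + 1)
        peaks = tf_map.argmax(axis=1)
        print(peaks[:3], peaks[-3:])           # peak bin drifts upward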

  10. Computational analysis of Ebolavirus data: prospects, promises and challenges.

    PubMed

    Michaelis, Martin; Rossman, Jeremy S; Wass, Mark N

    2016-08-15

    The ongoing Ebola virus (also known as Zaire ebolavirus, a member of the Ebolavirus family) outbreak in West Africa has so far resulted in >28000 confirmed cases compared with previous Ebolavirus outbreaks that affected a maximum of a few hundred individuals. Hence, Ebolaviruses impose a much greater threat than we may have expected (or hoped). An improved understanding of the virus biology is essential to develop therapeutic and preventive measures and to be better prepared for future outbreaks by members of the Ebolavirus family. Computational investigations can complement wet laboratory research for biosafety level 4 pathogens such as Ebolaviruses for which the wet experimental capacities are limited due to a small number of appropriate containment laboratories. During the current West Africa outbreak, sequence data from many Ebola virus genomes became available providing a rich resource for computational analysis. Here, we consider the studies that have already reported on the computational analysis of these data. A range of properties have been investigated including Ebolavirus evolution and pathogenicity, prediction of micro RNAs and identification of Ebolavirus specific signatures. However, the accuracy of the results remains to be confirmed by wet laboratory experiments. Therefore, communication and exchange between computational and wet laboratory researchers is necessary to make maximum use of computational analyses and to iteratively improve these approaches. PMID:27528741

  11. Benchmarking undedicated cloud computing providers for analysis of genomic datasets.

    PubMed

    Yazar, Seyhan; Gooden, George E C; Mackey, David A; Hewitt, Alex W

    2014-01-01

    A major bottleneck in biological discovery is now emerging at the computational level. Cloud computing offers a dynamic means whereby small and medium-sized laboratories can rapidly adjust their computational capacity. We benchmarked two established cloud computing services, Amazon Web Services Elastic MapReduce (EMR) on Amazon EC2 instances and Google Compute Engine (GCE), using publicly available genomic datasets (E.coli CC102 strain and a Han Chinese male genome) and a standard bioinformatic pipeline on a Hadoop-based platform. Wall-clock time for complete assembly differed by 52.9% (95% CI: 27.5-78.2) for E.coli and 53.5% (95% CI: 34.4-72.6) for human genome, with GCE being more efficient than EMR. The cost of running this experiment on EMR and GCE differed significantly, with the costs on EMR being 257.3% (95% CI: 211.5-303.1) and 173.9% (95% CI: 134.6-213.1) more expensive for E.coli and human assemblies respectively. Thus, GCE was found to outperform EMR both in terms of cost and wall-clock time. Our findings confirm that cloud computing is an efficient and potentially cost-effective alternative for analysis of large genomic datasets. In addition to releasing our cost-effectiveness comparison, we present available ready-to-use scripts for establishing Hadoop instances with Ganglia monitoring on EC2 or GCE. PMID:25247298

  12. Benchmarking undedicated cloud computing providers for analysis of genomic datasets.

    PubMed

    Yazar, Seyhan; Gooden, George E C; Mackey, David A; Hewitt, Alex W

    2014-01-01

    A major bottleneck in biological discovery is now emerging at the computational level. Cloud computing offers a dynamic means whereby small and medium-sized laboratories can rapidly adjust their computational capacity. We benchmarked two established cloud computing services, Amazon Web Services Elastic MapReduce (EMR) on Amazon EC2 instances and Google Compute Engine (GCE), using publicly available genomic datasets (E.coli CC102 strain and a Han Chinese male genome) and a standard bioinformatic pipeline on a Hadoop-based platform. Wall-clock time for complete assembly differed by 52.9% (95% CI: 27.5-78.2) for E.coli and 53.5% (95% CI: 34.4-72.6) for human genome, with GCE being more efficient than EMR. The cost of running this experiment on EMR and GCE differed significantly, with the costs on EMR being 257.3% (95% CI: 211.5-303.1) and 173.9% (95% CI: 134.6-213.1) more expensive for E.coli and human assemblies respectively. Thus, GCE was found to outperform EMR both in terms of cost and wall-clock time. Our findings confirm that cloud computing is an efficient and potentially cost-effective alternative for analysis of large genomic datasets. In addition to releasing our cost-effectiveness comparison, we present available ready-to-use scripts for establishing Hadoop instances with Ganglia monitoring on EC2 or GCE.

  13. Benchmarking Undedicated Cloud Computing Providers for Analysis of Genomic Datasets

    PubMed Central

    Yazar, Seyhan; Gooden, George E. C.; Mackey, David A.; Hewitt, Alex W.

    2014-01-01

    A major bottleneck in biological discovery is now emerging at the computational level. Cloud computing offers a dynamic means whereby small and medium-sized laboratories can rapidly adjust their computational capacity. We benchmarked two established cloud computing services, Amazon Web Services Elastic MapReduce (EMR) on Amazon EC2 instances and Google Compute Engine (GCE), using publicly available genomic datasets (E.coli CC102 strain and a Han Chinese male genome) and a standard bioinformatic pipeline on a Hadoop-based platform. Wall-clock time for complete assembly differed by 52.9% (95% CI: 27.5–78.2) for E.coli and 53.5% (95% CI: 34.4–72.6) for human genome, with GCE being more efficient than EMR. The cost of running this experiment on EMR and GCE differed significantly, with the costs on EMR being 257.3% (95% CI: 211.5–303.1) and 173.9% (95% CI: 134.6–213.1) more expensive for E.coli and human assemblies respectively. Thus, GCE was found to outperform EMR both in terms of cost and wall-clock time. Our findings confirm that cloud computing is an efficient and potentially cost-effective alternative for analysis of large genomic datasets. In addition to releasing our cost-effectiveness comparison, we present available ready-to-use scripts for establishing Hadoop instances with Ganglia monitoring on EC2 or GCE. PMID:25247298

  14. Integration of 3-dimensional surgical and orthodontic technologies with orthognathic "surgery-first" approach in the management of unilateral condylar hyperplasia.

    PubMed

    Janakiraman, Nandakumar; Feinberg, Mark; Vishwanath, Meenakshi; Nalaka Jayaratne, Yasas Shri; Steinbacher, Derek M; Nanda, Ravindra; Uribe, Flavio

    2015-12-01

    Recent innovations in technology and techniques in both surgical and orthodontic fields can be integrated, especially when treating subjects with facial asymmetry. In this article, we present a treatment method consisting of 3-dimensional computer-aided surgical and orthodontic planning, which was implemented with the orthognathic surgery-first approach. Virtual surgical planning, fabrication of surgical splints using the computer-aided design/computer-aided manufacturing technique, and prediction of final orthodontic occlusion using virtual planning with robotically assisted customized archwires were integrated for this patient. Excellent esthetic and occlusal outcomes were obtained in a short period of 5.5 months. PMID:26672712

  15. Practical Use of Computationally Frugal Model Analysis Methods

    SciTech Connect

    Hill, Mary C.; Kavetski, Dmitri; Clark, Martyn; Ye, Ming; Arabi, Mazdak; Lu, Dan; Foglia, Laura; Mehl, Steffen

    2015-03-21

    Computationally frugal methods of model analysis can provide substantial benefits when developing models of groundwater and other environmental systems. Model analysis includes ways to evaluate model adequacy and to perform sensitivity and uncertainty analysis. Frugal methods typically require 10s of parallelizable model runs; their convenience allows for other uses of the computational effort. We suggest that model analysis be posed as a set of questions used to organize methods that range from frugal to expensive (requiring 10,000 model runs or more). This encourages focus on method utility, even when methods have starkly different theoretical backgrounds. We note that many frugal methods are more useful when unrealistic process-model nonlinearities are reduced. Inexpensive diagnostics are identified for determining when frugal methods are advantageous. Examples from the literature are used to demonstrate local methods and the diagnostics. We suggest that the greater use of computationally frugal model analysis methods would allow questions such as those posed in this work to be addressed more routinely, allowing the environmental sciences community to obtain greater scientific insight from the many ongoing and future modeling efforts.
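
    A concrete example of a frugal method is local sensitivity analysis by one-at-a-time finite differences, which costs n_parameters + 1 model runs, squarely in the 10s-of-runs budget contrasted above with 10,000-run global methods. The toy decay model and the composite scaled sensitivity screening statistic below are illustrative choices, not the paper's specific examples.

        import numpy as np

        def model(params):
            """Toy model: k0 * exp(-k1 * t) observed at six times."""
            k0, k1 = params
            t = np.linspace(0.0, 10.0, 6)
            return k0 * np.exp(-k1 * t)

        def local_jacobian(model, params, rel_step=1e-4):
            base = model(params)
            jac = np.empty((base.size, len(params)))
            for j, p in enumerate(params):
                bumped = params.copy()
                bumped[j] = p + rel_step * max(abs(p), 1.0)
                jac[:, j] = (model(bumped) - base) / (bumped[j] - p)
            return jac                   # d(observation_i)/d(parameter_j)

        params = np.array([5.0, 0.3])
        J = local_jacobian(model, params)   # 3 model runs in total
        css = np.sqrt(np.mean((J * params) ** 2, axis=0))
        print(css)   # larger value -> data more informative about parameter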

  16. Practical Use of Computationally Frugal Model Analysis Methods

    DOE PAGES

    Hill, Mary C.; Kavetski, Dmitri; Clark, Martyn; Ye, Ming; Arabi, Mazdak; Lu, Dan; Foglia, Laura; Mehl, Steffen

    2015-03-21

    Computationally frugal methods of model analysis can provide substantial benefits when developing models of groundwater and other environmental systems. Model analysis includes ways to evaluate model adequacy and to perform sensitivity and uncertainty analysis. Frugal methods typically require 10s of parallelizable model runs; their convenience allows for other uses of the computational effort. We suggest that model analysis be posed as a set of questions used to organize methods that range from frugal to expensive (requiring 10,000 model runs or more). This encourages focus on method utility, even when methods have starkly different theoretical backgrounds. We note that many frugal methods are more useful when unrealistic process-model nonlinearities are reduced. Inexpensive diagnostics are identified for determining when frugal methods are advantageous. Examples from the literature are used to demonstrate local methods and the diagnostics. We suggest that the greater use of computationally frugal model analysis methods would allow questions such as those posed in this work to be addressed more routinely, allowing the environmental sciences community to obtain greater scientific insight from the many ongoing and future modeling efforts.

  17. Cartographic Modeling: Computer-assisted Analysis of Spatially Defined Neighborhoods

    NASA Technical Reports Server (NTRS)

    Berry, J. K.; Tomlin, C. D.

    1982-01-01

    Cartographic models addressing a wide variety of applications are composed of fundamental map processing operations. These primitive operations are neither database- nor application-specific. By organizing the set of operations into a mathematical-like structure, the basis for a generalized cartographic modeling framework can be developed. Among the major classes of primitive operations are those associated with reclassifying map categories, overlaying maps, determining distance and connectivity, and characterizing cartographic neighborhoods. The conceptual framework of cartographic modeling is established, and techniques for characterizing neighborhoods are used as a means of demonstrating some of the more sophisticated procedures of computer-assisted map analysis. A cartographic model for assessing effective roundwood supply is briefly described as an example of a computer analysis. Most of the techniques described have been implemented as part of the map analysis package developed at the Yale School of Forestry and Environmental Studies.

  18. Computed tomographic beam-hardening artefacts: mathematical characterization and analysis.

    PubMed

    Park, Hyoung Suk; Chung, Yong Eun; Seo, Jin Keun

    2015-06-13

    This paper presents a mathematical characterization and analysis of beam-hardening artefacts in X-ray computed tomography (CT). In the field of dental and medical radiography, metal artefact reduction in CT is becoming increasingly important as artificial prostheses and metallic implants become more widespread in ageing populations. Metal artefacts are mainly caused by the beam-hardening of polychromatic X-ray photon beams, which causes a mismatch between the actual sinogram data and the data model, namely the Radon transform of the unknown attenuation distribution in the CT reconstruction algorithm. We investigate the beam-hardening factor through a mathematical analysis of the discrepancy between the data and the Radon transform of the attenuation distribution at a fixed energy level. Separation of cupping artefacts from beam-hardening artefacts allows the causes and effects of streaking artefacts to be analysed. Various computer simulations and experiments are performed to support our mathematical analysis.
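
    The mismatch being characterized can be seen in a few lines: with a polychromatic spectrum, minus the log of the measured transmission is a concave, nonlinear function of material thickness, whereas the Radon-transform data model assumes strict linearity. The two-energy spectrum and attenuation coefficients below are rough illustrative values for water, not the paper's setup.

        import numpy as np

        energies_keV = np.array([60.0, 100.0])
        weights = np.array([0.6, 0.4])         # normalized spectrum S(E)
        mu = np.array([0.206, 0.171])          # approx. water mu(E), 1/cm

        def poly_projection(thickness_cm):
            """-ln of spectrally weighted transmission through water."""
            return -np.log(np.sum(weights * np.exp(-mu * thickness_cm)))

        for L in (1.0, 5.0, 10.0, 20.0):
            print(f"L={L:5.1f} cm   poly={poly_projection(L):6.3f}"
                  f"   linear(60 keV)={mu[0] * L:6.3f}")
        # Effective attenuation per cm drops with thickness because
        # low-energy photons are removed first; the beam "hardens",
        # producing the cupping the paper separates from streaking.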

  19. Computational Analysis in Support of the SSTO Flowpath Test

    NASA Technical Reports Server (NTRS)

    Duncan, Beverly S.; Trefny, Charles J.

    1994-01-01

    A synergistic approach of combining computational methods and experimental measurements is used in the analysis of a hypersonic inlet. There are four major focal points within this study which examine the boundary layer growth on a compression ramp upstream of the cowl lip of a scramjet inlet. Initially, the boundary layer growth on the NASP Concept Demonstrator Engine (CDE) is examined. The follow-up study determines the optimum diverter height required by the SSTO Flowpath test to best duplicate the CDE results. These flow field computations are then compared to the experimental measurements and the mass average Mach number is determined for this inlet.

  20. Applied analysis/computational mathematics. Final report 1993

    SciTech Connect

    Lax, P.; Berger, M.

    1993-12-01

    This is the final report for the Courant Mathematics and Computing Laboratory (CMCL) research program for the years 1991--1993. Our research efforts encompass the formulation of physical problems in terms of mathematical models (both old and new), the mathematical analysis of such models, and their numerical resolution. This last step involves the development and implementation of efficient methods for large scale computation. Our analytic and numerical work often go hand in hand; new theoretical approaches often have numerical counterparts, while numerical experimentation often suggests avenues for analytical investigation.

  1. Computer Analysis Of ILO Standard Chest Radiographs Of Pneumoconiosis

    NASA Astrophysics Data System (ADS)

    Li, C. C.; Shu, David B. C.; Tai, H. T.; Hou, W.; Kunkle, G. A.; Wang, Y.; Hoy, R. J.

    1982-11-01

    This paper presents a study of the computer analysis of the 1980 ILO standard chest radiographs of pneumoconiosis. Algorithms developed for the detection of individual small rounded and irregular opacities have been tested and evaluated on these standard radiographs. The density, shape, and size distribution of the detected objects in the lung field, in spite of false positives, can be used as indicators of the onset of pneumoconiosis. This approach is potentially useful in a computer-assisted screening and early detection process in which the annual chest radiograph of each worker is compared with his (her) own normal radiograph obtained previously.

  2. Realization of masticatory movement by 3-dimensional simulation of the temporomandibular joint and the masticatory muscles.

    PubMed

    Park, Jong-Tae; Lee, Jae-Gi; Won, Sung-Yoon; Lee, Sang-Hee; Cha, Jung-Yul; Kim, Hee-Jin

    2013-07-01

    Masticatory muscles are closely involved in mastication, pronunciation, and swallowing, and it is therefore important to study the specific functions and dynamics of the mandibular and masticatory muscles. However, the shortness of muscle fibers and the diversity of movement directions make it difficult to study and simplify the dynamics of mastication. The purpose of this study was to use 3-dimensional (3D) simulation to observe the functions and movements of each of the masticatory muscles and the mandible while chewing. To simulate the masticatory movement, computed tomographic images were taken from a single Korean volunteer (30-year-old man), and skull image data were reconstructed in 3D (Mimics; Materialise, Leuven, Belgium). The 3D-reconstructed masticatory muscles were then attached to the 3D skull model. The masticatory movements were animated using Maya (Autodesk, San Rafael, CA) based on the mandibular motion path. During unilateral chewing, the mandible was found to move laterally toward the functional side by contracting the contralateral lateral pterygoid and ipsilateral temporalis muscles. During the initial mouth opening, only hinge movement was observed at the temporomandibular joint. During this period, the entire mandible rotated approximately 13 degrees toward the bicondylar horizontal plane. Continued movement of the mandible to full mouth opening occurred simultaneously with sliding and hinge movements, and the mandible rotated approximately 17 degrees toward the center of the mandibular ramus. The described approach can yield data for use in face animation and other simulation systems and for elucidating the functional components related to contraction and relaxation of muscles during mastication.

  3. Probabilistic Computer Analysis for Rapid Evaluation of Structures.

    2007-03-29

    P-CARES 2.0.0, Probabilistic Computer Analysis for Rapid Evaluation of Structures, was developed for NRC staff use to determine the validity and accuracy of the analysis methods used by various utilities for structural safety evaluations of nuclear power plants. P-CARES provides the capability to effectively evaluate the probabilistic seismic response using simplified soil and structural models and to quickly check the validity and/or accuracy of the SSI data received from applicants and licensees. The code is organized in a modular format with the basic modules of the system performing static, seismic, and nonlinear analysis.

  4. Probabilistic Computer Analysis for Rapid Evaluation of Structures.

    SciTech Connect

    XU, JIM

    2007-03-29

    P-CARES 2.0.0, Probabilistic Computer Analysis for Rapid Evaluation of Structures, was developed for NRC staff use to determine the validity and accuracy of the analysis methods used by various utilities for structural safety evaluations of nuclear power plants. P-CARES provides the capability to effectively evaluate the probabilistic seismic response using simplified soil and structural models and to quickly check the validity and/or accuracy of the SSI data received from applicants and licensees. The code is organized in a modular format with the basic modules of the system performing static, seismic, and nonlinear analysis.

  5. Three-dimensional transonic potential flow about complex 3-dimensional configurations

    NASA Technical Reports Server (NTRS)

    Reyhner, T. A.

    1984-01-01

    An analysis has been developed and a computer code written to predict three-dimensional subsonic or transonic potential flow fields about lifting or nonlifting configurations. Possible configurations include inlets, nacelles, nacelles with ground planes, S-ducts, turboprop nacelles, wings, and wing-pylon-nacelle combinations. The solution of the full partial differential equation for compressible potential flow, written in terms of a velocity potential, is obtained using finite differences, line relaxation, and multigrid. The analysis uses either a cylindrical or Cartesian coordinate system. The computational mesh is not body fitted. The analysis has been programmed in FORTRAN for both the CDC CYBER 203 and the CRAY-1 computers. Comparisons of computed results with experimental measurements are presented. Descriptions of the program input and output formats are included.
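
    Of the three solution ingredients named (finite differences, line relaxation, multigrid), line relaxation is the easiest to sketch: each sweep solves every grid line implicitly with a tridiagonal (Thomas) solve. The model problem below is the Laplace equation on a square with illustrative boundary data, far simpler than the full-potential equation the code solves.

        import numpy as np

        def thomas(a, b, c, d):
            """Tridiagonal solve (sub, diag, super, rhs); overwrites b, d."""
            n = len(d)
            for i in range(1, n):
                w = a[i] / b[i - 1]
                b[i] -= w * c[i - 1]
                d[i] -= w * d[i - 1]
            x = np.empty(n)
            x[-1] = d[-1] / b[-1]
            for i in range(n - 2, -1, -1):
                x[i] = (d[i] - c[i] * x[i + 1]) / b[i]
            return x

        n = 33
        phi = np.zeros((n, n))
        phi[:, -1] = 1.0                        # phi = 1 on one boundary
        for sweep in range(400):
            for j in range(1, n - 1):           # one vertical line at a time
                a = np.full(n - 2, -1.0)
                b = np.full(n - 2, 4.0)
                c = np.full(n - 2, -1.0)
                d = phi[1:-1, j - 1] + phi[1:-1, j + 1]
                d[0] += phi[0, j]
                d[-1] += phi[-1, j]
                phi[1:-1, j] = thomas(a, b, c, d)
        print(phi[n // 2, n // 2])              # tends to 0.25 at the center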

  6. A Computational Approach to Qualitative Analysis in Large Textual Datasets

    PubMed Central

    Evans, Michael S.

    2014-01-01

    In this paper I introduce computational techniques to extend qualitative analysis into the study of large textual datasets. I demonstrate these techniques by using probabilistic topic modeling to analyze a broad sample of 14,952 documents published in major American newspapers from 1980 through 2012. I show how computational data mining techniques can identify and evaluate the significance of qualitatively distinct subjects of discussion across a wide range of public discourse. I also show how examining large textual datasets with computational methods can overcome methodological limitations of conventional qualitative methods, such as how to measure the impact of particular cases on broader discourse, how to validate substantive inferences from small samples of textual data, and how to determine if identified cases are part of a consistent temporal pattern. PMID:24498398
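
    A minimal version of this workflow fits in a few lines with scikit-learn's LatentDirichletAllocation; the four toy documents below stand in for the 14,952-article newspaper sample, so the topics recovered here are only illustrative.

        from sklearn.feature_extraction.text import CountVectorizer
        from sklearn.decomposition import LatentDirichletAllocation

        docs = [
            "stem cell research funding debate in congress",
            "congress votes on science funding and education policy",
            "new galaxy survey maps dark matter distribution",
            "telescope survey finds distant galaxy clusters",
        ]
        vec = CountVectorizer(stop_words="english")
        X = vec.fit_transform(docs)
        lda = LatentDirichletAllocation(n_components=2, random_state=0).fit(X)

        terms = vec.get_feature_names_out()
        for k, topic in enumerate(lda.components_):
            top = topic.argsort()[::-1][:4]
            print(f"topic {k}:", ", ".join(terms[i] for i in top))
        # Per-document topic proportions support the case-impact and
        # temporal-pattern checks described above.
        print(lda.transform(X).round(2))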

  7. CFD Analysis and Design Optimization Using Parallel Computers

    NASA Technical Reports Server (NTRS)

    Martinelli, Luigi; Alonso, Juan Jose; Jameson, Antony; Reuther, James

    1997-01-01

    A versatile and efficient multi-block method is presented for the simulation of both steady and unsteady flow, as well as aerodynamic design optimization of complete aircraft configurations. The compressible Euler and Reynolds Averaged Navier-Stokes (RANS) equations are discretized using a high resolution scheme on body-fitted structured meshes. An efficient multigrid implicit scheme is implemented for time-accurate flow calculations. Optimum aerodynamic shape design is achieved at very low cost using an adjoint formulation. The method is implemented on parallel computing systems using the MPI message passing interface standard to ensure portability. The results demonstrate that, by combining highly efficient algorithms with parallel computing, it is possible to perform detailed steady and unsteady analysis as well as automatic design for complex configurations using the present generation of parallel computers.

  8. A computational approach to qualitative analysis in large textual datasets.

    PubMed

    Evans, Michael S

    2014-01-01

    In this paper I introduce computational techniques to extend qualitative analysis into the study of large textual datasets. I demonstrate these techniques by using probabilistic topic modeling to analyze a broad sample of 14,952 documents published in major American newspapers from 1980 through 2012. I show how computational data mining techniques can identify and evaluate the significance of qualitatively distinct subjects of discussion across a wide range of public discourse. I also show how examining large textual datasets with computational methods can overcome methodological limitations of conventional qualitative methods, such as how to measure the impact of particular cases on broader discourse, how to validate substantive inferences from small samples of textual data, and how to determine if identified cases are part of a consistent temporal pattern.

  9. Computational simulation for analysis and synthesis of impact resilient structure

    NASA Astrophysics Data System (ADS)

    Djojodihardjo, Harijono

    2013-10-01

    Impact resilient structures are of great interest in many engineering applications, ranging from civil, land vehicle, aircraft and space structures, to mention a few examples. To design such structures, one has to resort to fundamental principles and take into account progress in analytical and computational approaches as well as in material science and technology. With such perspectives, this work looks at a generic beam and plate structure subject to impact loading and carries out analysis and numerical simulation. The first objective of the work is to develop a computational algorithm to analyze a flat plate as a generic structure subjected to impact loading for numerical simulation and parametric study. The analysis is based on dynamic response analysis, and consideration is given to the elastic-plastic region. The second objective is to utilize the computational algorithm for direct numerical simulation and, as a parallel scheme, a commercial off-the-shelf numerical code is utilized for parametric study, optimization and synthesis. Through such analysis and numerical simulation, effort is devoted to arriving at an optimum configuration in terms of loading, structural dimensions, material properties and composite lay-up, among others. Results are discussed in view of practical applications.
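
    A standard building block for such dynamic response analysis is implicit Newmark time integration. The sketch below is a generic linear-elastic illustration with made-up parameters, not the author's algorithm; plasticity and composite lay-up effects are omitted. It integrates a two-degree-of-freedom system under a short half-sine impact pulse:

    ```python
    import numpy as np

    def newmark(M, C, K, F, dt, n_steps, beta=0.25, gamma=0.5):
        """Average-acceleration Newmark integration of M u'' + C u' + K u = F(t)."""
        n = M.shape[0]
        u = np.zeros(n); v = np.zeros(n)
        a = np.linalg.solve(M, F(0.0) - C @ v - K @ u)
        Keff = K + gamma / (beta * dt) * C + M / (beta * dt**2)
        hist = []
        for step in range(1, n_steps + 1):
            t = step * dt
            rhs = (F(t)
                   + M @ (u / (beta * dt**2) + v / (beta * dt) + (0.5 / beta - 1) * a)
                   + C @ (gamma / (beta * dt) * u + (gamma / beta - 1) * v
                          + dt * (gamma / (2 * beta) - 1) * a))
            u_new = np.linalg.solve(Keff, rhs)
            a_new = (u_new - u) / (beta * dt**2) - v / (beta * dt) - (0.5 / beta - 1) * a
            v = v + dt * ((1 - gamma) * a + gamma * a_new)
            u, a = u_new, a_new
            hist.append(u.copy())
        return np.array(hist)

    # 2-DOF model hit by a 1 ms half-sine impact pulse on DOF 0
    M = np.diag([1.0, 1.0]); C = 0.02 * np.eye(2)
    K = np.array([[2000.0, -1000.0], [-1000.0, 1000.0]])
    F = lambda t: np.array([100.0 * np.sin(np.pi * t / 1e-3), 0.0]) if t < 1e-3 else np.zeros(2)
    resp = newmark(M, C, K, F, dt=1e-4, n_steps=500)
    ```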

  10. CAPRI: Using a Geometric Foundation for Computational Analysis and Design

    NASA Technical Reports Server (NTRS)

    Haimes, Robert

    2002-01-01

    CAPRI (Computational Analysis Programming Interface) is a software development tool intended to make computerized design, simulation and analysis faster and more efficient. The computational steps traditionally taken for most engineering analysis (Computational Fluid Dynamics (CFD), structural analysis, etc.) are: Surface Generation, usually by employing a Computer Aided Design (CAD) system; Grid Generation, preparing the volume for the simulation; Flow Solver, producing the results at the specified operational point; Post-processing Visualization, interactively attempting to understand the results. It should be noted that the structures problem is more tractable than CFD; there are fewer mesh topologies used and the grids are not as fine (this problem space does not have the length scaling issues of fluids). For CFD, these steps have worked well in the past for simple steady-state simulations at the expense of much user interaction. The data was transmitted between phases via files. In most cases, the output from a CAD system could go to IGES files. The output from grid generators and solvers does not really have standards, though there are a couple of file formats that can be used for a subset of the data (e.g., the PLOT3D gridding format and the upcoming CGNS). The user would have to patch up the data or translate from one format to another to move to the next step. Sometimes this could take days. Instead of the serial approach to analysis, CAPRI takes a geometry-centric approach. CAPRI is a software building tool-kit that refers to two ideas: (1) A simplified, object-oriented, hierarchical view of a solid part integrating both geometry and topology definitions, and (2) programming access to this part or assembly and any attached data. The connection to the geometry is made through an Application Programming Interface (API) and not a file system.

  11. Assessing computer waste generation in Chile using material flow analysis.

    PubMed

    Steubing, Bernhard; Böni, Heinz; Schluep, Mathias; Silva, Uca; Ludwig, Christian

    2010-03-01

    The quantities of e-waste are expected to increase sharply in Chile. The purpose of this paper is to provide a quantitative data basis on generated e-waste quantities. A material flow analysis was carried out assessing the generation of e-waste from computer equipment (desktop and laptop PCs as well as CRT and LCD-monitors). Import and sales data were collected from the Chilean Customs database as well as from publications by the International Data Corporation. A survey was conducted to determine consumers' choices with respect to storage, re-use and disposal of computer equipment. The generation of e-waste was assessed in a baseline as well as upper and lower scenarios until 2020. The results for the baseline scenario show that about 10,000 and 20,000 tons of computer waste may be generated in the years 2010 and 2020, respectively. The cumulative e-waste generation will be four to five times higher in the upcoming decade (2010-2019) than during the current decade (2000-2009). By 2020, the shares of LCD-monitors and laptops will increase more rapidly replacing other e-waste including the CRT-monitors. The model also shows the principal flows of computer equipment from production and sale to recycling and disposal. The re-use of computer equipment plays an important role in Chile. An appropriate recycling scheme will have to be introduced to provide adequate solutions for the growing rate of e-waste generation.
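
    The core of such a material flow model is a convolution of annual sales with a lifetime distribution covering use, storage, and re-use delays. A minimal sketch with assumed figures and an assumed Weibull lifespan (illustrative only, not the Chilean data or the paper's parameters):

    ```python
    import numpy as np
    from scipy.stats import weibull_min

    years = np.arange(2000, 2021)
    sales_tons = np.linspace(5000.0, 30000.0, len(years))  # illustrative sales series
    life = weibull_min(c=2.0, scale=7.0)                   # assumed total lifespan, years

    waste = np.zeros_like(sales_tons)
    for i, s in enumerate(sales_tons):
        ages = np.arange(1, len(years) - i)
        p = life.cdf(ages) - life.cdf(ages - 1)            # P(discarded at each age)
        waste[i + 1:i + 1 + len(ages)] += s * p            # spread sales over later years

    for y, w in zip(years, waste):
        print(y, round(w), "tons")
    ```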

  12. A Research Roadmap for Computation-Based Human Reliability Analysis

    SciTech Connect

    Boring, Ronald; Mandelli, Diego; Joe, Jeffrey; Smith, Curtis; Groth, Katrina

    2015-08-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermalhydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  13. Ubiquitous computing in sports: A review and analysis.

    PubMed

    Baca, Arnold; Dabnichki, Peter; Heller, Mario; Kornfeind, Philipp

    2009-10-01

    Ubiquitous (pervasive) computing is a term for the synergetic use of sensing, communication and computing. Pervasive use of computing has seen a rapid increase in the current decade, and this development has propagated into applied sport science and everyday life. This work presents a survey of recent developments in sport and leisure with emphasis on technology and computational techniques. A detailed analysis of new technological developments is performed. Sensors for position and motion detection, as well as those for equipment and physiological monitoring, are discussed. Aspects of novel trends in communication technologies and data processing are outlined. Computational advancements have started a new trend - the development of smart and intelligent systems for a wide range of applications - from model-based posture recognition to context awareness algorithms for nutrition monitoring. Examples particular to coaching and training are discussed. Selected tools for monitoring rules' compliance and automatic decision-making are outlined. Finally, applications in leisure and entertainment are presented, from systems supporting physical activity to systems providing motivation. It is concluded that the emphasis in future will shift from technologies to intelligent systems that allow for enhanced social interaction, as efforts need to be made to improve user-friendliness and standardisation of measurement and transmission protocols.

  14. PArallel Reacting Multiphase FLOw Computational Fluid Dynamic Analysis

    2002-06-01

    PARMFLO is a parallel multiphase reacting flow computational fluid dynamics (CFD) code. It can perform steady or unsteady simulations in three space dimensions. It is intended for use in engineering CFD analysis of industrial flow system components. Its parallel processing capabilities allow it to be applied to problems that use at least an order of magnitude more computational cells than the number that can be used on a typical single processor workstation (about 10^6 cells in parallel processing mode versus about 10^5 cells in serial processing mode). Alternately, by spreading the work of a CFD problem that could be run on a single workstation over a group of computers on a network, it can bring the runtime down by an order of magnitude or more (typically from many days to less than one day). The software was implemented using the industry standard Message-Passing Interface (MPI) and domain decomposition in one spatial direction. The phases of a flow problem may include an ideal gas mixture with an arbitrary number of chemical species, and dispersed droplet and particle phases. Regions of porous media may also be included within the domain. The porous media may be packed beds, foams, or monolith catalyst supports. With these features, the code is especially suited to analysis of mixing of reactants in the inlet chamber of catalytic reactors coupled to computation of product yields that result from the flow of the mixture through the catalyst coated support structure.
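
    Domain decomposition in one spatial direction means each rank owns a slab of the grid and exchanges one plane of ghost cells with each neighbor every iteration. A minimal sketch of that communication pattern with mpi4py (PARMFLO itself is not Python; this only illustrates the halo exchange):

    ```python
    # Run with: mpiexec -n 4 python halo.py
    import numpy as np
    from mpi4py import MPI

    comm = MPI.COMM_WORLD
    rank, size = comm.Get_rank(), comm.Get_size()

    # Each rank owns a slab of the 1D-decomposed field, plus two ghost cells.
    n_local = 100
    u = np.full(n_local + 2, float(rank))

    left = rank - 1 if rank > 0 else MPI.PROC_NULL
    right = rank + 1 if rank < size - 1 else MPI.PROC_NULL

    # Exchange ghost cells with both neighbors (Sendrecv avoids deadlock).
    comm.Sendrecv(sendbuf=u[1:2], dest=left, recvbuf=u[-1:], source=right)
    comm.Sendrecv(sendbuf=u[-2:-1], dest=right, recvbuf=u[0:1], source=left)
    ```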

  15. Computational analysis of high resolution unsteady airloads for rotor aeroacoustics

    NASA Technical Reports Server (NTRS)

    Quackenbush, Todd R.; Lam, C.-M. Gordon; Wachspress, Daniel A.; Bliss, Donald B.

    1994-01-01

    The study of helicopter aerodynamic loading for acoustics applications requires the application of efficient yet accurate simulations of the velocity field induced by the rotor's vortex wake. This report summarizes work to date on the development of such an analysis, which builds on the Constant Vorticity Contour (CVC) free wake model, previously implemented for the study of vibratory loading in the RotorCRAFT computer code. The present effort has focused on implementation of an airload reconstruction approach that computes high resolution airload solutions of rotor/rotor-wake interactions required for acoustics computations. Supplementary efforts on the development of improved vortex core modeling, unsteady aerodynamic effects, higher spatial resolution of rotor loading, and fast vortex wake implementations have substantially enhanced the capabilities of the resulting software, denoted RotorCRAFT/AA (AeroAcoustics). Results of validation calculations using recently acquired model rotor data show that by employing airload reconstruction it is possible to apply the CVC wake analysis with temporal and spatial resolution suitable for acoustics applications while reducing the computation time required by one to two orders of magnitude relative to that required by direct calculations. Promising correlation with this body of airload and noise data has been obtained for a variety of rotor configurations and operating conditions.

  16. Analysis of outcomes in radiation oncology: An integrated computational platform

    PubMed Central

    Liu, Dezhi; Ajlouni, Munther; Jin, Jian-Yue; Ryu, Samuel; Siddiqui, Farzan; Patel, Anushka; Movsas, Benjamin; Chetty, Indrin J.

    2009-01-01

    Radiotherapy research and outcome analyses are essential for evaluating new methods of radiation delivery and for assessing the benefits of a given technology on locoregional control and overall survival. In this article, a computational platform is presented to facilitate radiotherapy research and outcome studies in radiation oncology. This computational platform consists of (1) an infrastructural database that stores patient diagnosis, IMRT treatment details, and follow-up information, (2) an interface tool that is used to import and export IMRT plans in DICOM RT and AAPM/RTOG formats from a wide range of planning systems to facilitate reproducible research, (3) a graphical data analysis and programming tool that visualizes all aspects of an IMRT plan including dose, contour, and image data to aid the analysis of treatment plans, and (4) a software package that calculates radiobiological models to evaluate IMRT treatment plans. Given the limited number of general-purpose computational environments for radiotherapy research and outcome studies, this computational platform represents a powerful and convenient tool that is well suited for analyzing dose distributions biologically and correlating them with the delivered radiation dose distributions and other patient-related clinical factors. In addition the database is web-based and accessible by multiple users, facilitating its convenient application and use. PMID:19544785
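
    The radiobiological component of such a platform typically reduces a dose-volume histogram to a scalar model output. A minimal sketch of one common choice, generalized equivalent uniform dose (gEUD) feeding a Lyman-Kutcher-Burman NTCP (the parameter values and DVH below are illustrative, not the platform's code):

    ```python
    import numpy as np
    from scipy.stats import norm

    def geud(doses, volumes, a):
        """Generalized equivalent uniform dose from a differential DVH."""
        w = np.asarray(volumes) / np.sum(volumes)
        return (w @ np.asarray(doses) ** a) ** (1.0 / a)

    def ntcp_lkb(doses, volumes, td50, m, n):
        """Lyman-Kutcher-Burman NTCP; the volume parameter n sets a = 1/n in gEUD."""
        eud = geud(doses, volumes, a=1.0 / n)
        t = (eud - td50) / (m * td50)
        return norm.cdf(t)

    # toy differential DVH: dose bins in Gy, relative volumes
    doses = np.array([10, 20, 30, 40, 50], float)
    vols = np.array([0.3, 0.25, 0.2, 0.15, 0.1])
    print(ntcp_lkb(doses, vols, td50=45.0, m=0.15, n=0.1))  # illustrative parameters
    ```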

  17. Use of 3-Dimensional Volumetric Modeling of Adrenal Gland Size in Patients with Primary Pigmented Nodular Adrenocortical Disease.

    PubMed

    Chrysostomou, P P; Lodish, M B; Turkbey, E B; Papadakis, G Z; Stratakis, C A

    2016-04-01

    Primary pigmented nodular adrenocortical disease (PPNAD) is a rare type of bilateral adrenal hyperplasia leading to hypercortisolemia. Adrenal nodularity is often appreciable with computed tomography (CT); however, accurate radiologic characterization of adrenal size in PPNAD has not been studied well. We used 3-dimensional (3D) volumetric analysis to characterize and compare adrenal size in PPNAD patients, with and without Cushing's syndrome (CS). Patients diagnosed with PPNAD and their family members with known mutations in PRKAR1A were screened. CT scans were used to create 3D models of each adrenal. Criteria for biochemical diagnosis of CS included loss of diurnal variation and/or elevated midnight cortisol levels, and paradoxical increase in urinary free cortisol and/or urinary 17-hydroxysteroids after dexamethasone administration. Forty-five patients with PPNAD (24 females, 27.8±17.6 years) and 8 controls (19±3 years) were evaluated. 3D volumetric modeling of adrenal glands was performed in all. Thirty-eight of the 45 patients (84.4%) had CS. Mean adrenal volume was 8.1 cc±4.1 for those with CS, 7.2 cc±4.5 for those without CS (p=0.643), and 8.0 cc±1.6 for controls. Mean values were corrected for body surface area: 4.7 cc/kg/m²±2.2 for CS and 3.9 cc/kg/m²±1.3 for non-CS (p=0.189). Adrenal volume and midnight cortisol in both groups were positively correlated, r=0.35, p=0.03. We conclude that adrenal volume measured by 3D CT in patients with PPNAD and CS was similar to that in patients without CS, confirming empirical CT imaging-based observations. However, the association between adrenal volume and midnight cortisol levels may be used as a marker of who among patients with PPNAD may develop CS, something that routine CT cannot do. PMID:27065461

  18. A Proposal of 3-dimensional Self-organizing Memory and Its Application to Knowledge Extraction from Natural Language

    NASA Astrophysics Data System (ADS)

    Sakakibara, Kai; Hagiwara, Masafumi

    In this paper, we propose a 3-dimensional self-organizing memory and describe its application to knowledge extraction from natural language. First, the proposed system extracts relations between words using JUMAN (a morpheme analysis system) and KNP (a syntax analysis system), and stores them in short-term memory. In the short-term memory, the relations are attenuated as processing proceeds; however, relations with a high frequency of appearance are stored in the long-term memory without attenuation. The relations in the long-term memory are placed in the proposed 3-dimensional self-organizing memory. We used a new learning algorithm called "Potential Firing" in the learning phase. In the recall phase, the proposed system recalls relational knowledge from the learned knowledge based on the input sentence, using a new recall algorithm called "Waterfall Recall". We added a function to respond to questions in natural language with "yes/no" in order to confirm the validity of the proposed system by evaluating the number of correct answers.

  19. Novel Radiobiological Gamma Index for Evaluation of 3-Dimensional Predicted Dose Distribution

    SciTech Connect

    Sumida, Iori; Yamaguchi, Hajime; Kizaki, Hisao; Aboshi, Keiko; Tsujii, Mari; Yoshikawa, Nobuhiko; Yamada, Yuji; Suzuki, Osamu; Seo, Yuji; Isohashi, Fumiaki; Yoshioka, Yasuo; Ogawa, Kazuhiko

    2015-07-15

    Purpose: To propose a gamma index-based dose evaluation index that integrates the radiobiological parameters of tumor control (TCP) and normal tissue complication probabilities (NTCP). Methods and Materials: Fifteen prostate and head and neck (H&N) cancer patients received intensity modulated radiation therapy. Before treatment, patient-specific quality assurance was conducted via beam-by-beam analysis, and beam-specific dose error distributions were generated. The predicted 3-dimensional (3D) dose distribution was calculated by back-projection of relative dose error distribution per beam. A 3D gamma analysis of different organs (prostate: clinical [CTV] and planned target volumes [PTV], rectum, bladder, femoral heads; H&N: gross tumor volume [GTV], CTV, spinal cord, brain stem, both parotids) was performed using predicted and planned dose distributions under 2%/2 mm tolerance and physical gamma passing rate was calculated. TCP and NTCP values were calculated for voxels with physical gamma indices (PGI) >1. We propose a new radiobiological gamma index (RGI) to quantify the radiobiological effects of TCP and NTCP and calculate radiobiological gamma passing rates. Results: The mean RGI gamma passing rates for prostate cases were significantly different compared with those of PGI (P<.03–.001). The mean RGI gamma passing rates for H&N cases (except for GTV) were significantly different compared with those of PGI (P<.001). Differences in gamma passing rates between PGI and RGI were due to dose differences between the planned and predicted dose distributions. Radiobiological gamma distribution was visualized to identify areas where the dose was radiobiologically important. Conclusions: RGI was proposed to integrate radiobiological effects into PGI. This index would assist physicians and medical physicists not only in physical evaluations of treatment delivery accuracy, but also in clinical evaluations of predicted dose distribution.
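
    The physical gamma index underlying this work combines a dose-difference and a distance-to-agreement tolerance. A minimal 1D sketch of a global 2%/2 mm gamma computation follows (brute force; the paper's radiobiological weighting by TCP/NTCP is not reproduced here):

    ```python
    import numpy as np

    def gamma_index(x, d_eval, d_ref, dd=0.02, dta=2.0):
        """Global gamma for 1D dose profiles: dd is fractional dose tol, dta in mm."""
        norm = d_ref.max()
        gam = np.empty_like(d_eval)
        for i, (xi, di) in enumerate(zip(x, d_eval)):
            capital_gamma = np.sqrt(((x - xi) / dta) ** 2
                                    + ((d_ref - di) / (dd * norm)) ** 2)
            gam[i] = capital_gamma.min()   # minimize over all reference points
        return gam

    x = np.linspace(0, 100, 201)                    # positions in mm
    d_ref = np.exp(-((x - 50) / 20) ** 2)           # planned profile (toy)
    d_eval = 1.01 * np.exp(-((x - 51) / 20) ** 2)   # predicted profile (toy)
    g = gamma_index(x, d_eval, d_ref)
    print("passing rate:", np.mean(g <= 1.0))
    ```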

  20. Wind Energy Conversion System Analysis Model (WECSAM) computer program documentation

    NASA Astrophysics Data System (ADS)

    Downey, W. T.; Hendrick, P. L.

    1982-07-01

    Described is a computer-based wind energy conversion system analysis model (WECSAM) developed to predict the technical and economic performance of wind energy conversion systems (WECS). The model is written in CDC FORTRAN V. The version described accesses a data base containing wind resource data, application loads, WECS performance characteristics, utility rates, state taxes, and state subsidies for a six state region (Minnesota, Michigan, Wisconsin, Illinois, Ohio, and Indiana). The model is designed for analysis at the county level. The computer model includes a technical performance module and an economic evaluation module. The modules can be run separately or together. The model can be run for any single user-selected county within the region or looped automatically through all counties within the region. In addition, the model has a restart capability that allows the user to modify any data-base value written to a scratch file prior to the technical or economic evaluation.

  1. Computational Aerodynamic Analysis of Offshore Upwind and Downwind Turbines

    DOE PAGES

    Zhao, Qiuying; Sheng, Chunhua; Afjeh, Abdollah

    2014-01-01

    Aerodynamic interactions of the model NREL 5 MW offshore horizontal axis wind turbines (HAWT) are investigated using a high-fidelity computational fluid dynamics (CFD) analysis. Four wind turbine configurations are considered: three-bladed upwind and downwind and two-bladed upwind and downwind configurations, which operate at two different rotor speeds of 12.1 and 16 RPM. In the present study, both steady and unsteady aerodynamic loads, such as the rotor torque, blade hub bending moment, and tower base bending moment, are evaluated in detail to provide an overall assessment of the different wind turbine configurations. Aerodynamic interactions between the rotor and tower are analyzed, including the rotor wake development downstream. The computational analysis provides insight into the aerodynamic performance of the upwind and downwind, two- and three-bladed horizontal axis wind turbines.

  2. Advanced computational tools for 3-D seismic analysis

    SciTech Connect

    Barhen, J.; Glover, C.W.; Protopopescu, V.A.

    1996-06-01

    The global objective of this effort is to develop advanced computational tools for 3-D seismic analysis, and test the products using a model dataset developed under the joint aegis of the United States' Society of Exploration Geophysicists (SEG) and the European Association of Exploration Geophysicists (EAEG). The goal is to enhance the value to the oil industry of the SEG/EAEG modeling project, carried out with US Department of Energy (DOE) funding in FY 1993-95. The primary objective of the ORNL Center for Engineering Systems Advanced Research (CESAR) is to spearhead the computational innovations that would enable a revolutionary advance in 3-D seismic analysis. The CESAR effort is carried out in collaboration with world-class domain experts from leading universities, and in close coordination with other national laboratories and oil industry partners.

  3. Computed Tomography Inspection and Analysis for Additive Manufacturing Components

    NASA Technical Reports Server (NTRS)

    Beshears, Ronald D.

    2016-01-01

    Computed tomography (CT) inspection was performed on test articles additively manufactured from metallic materials. Metallic AM and machined wrought alloy test articles with programmed flaws were inspected using a 2MeV linear accelerator based CT system. Performance of CT inspection on identically configured wrought and AM components and programmed flaws was assessed using standard image analysis techniques to determine the impact of additive manufacturing on inspectability of objects with complex geometries.

  4. Vector Field Visual Data Analysis Technologies for Petascale Computational Science

    SciTech Connect

    Garth, Christoph; Deines, Eduard; Joy, Kenneth I.; Bethel, E. Wes; Childs, Hank; Weber, Gunther; Ahern, Sean; Pugmire, Dave; Sanderson, Allen; Johnson, Chris

    2009-11-13

    State-of-the-art computational science simulations generate large-scale vector field data sets. Visualization and analysis is a key aspect of obtaining insight into these data sets and represents an important challenge. This article discusses possibilities and challenges of modern vector field visualization and focuses on methods and techniques developed in the SciDAC Visualization and Analytics Center for Enabling Technologies (VACET) and deployed in the open-source visualization tool, VisIt.

  5. Micro Computer Tomography for medical device and pharmaceutical packaging analysis.

    PubMed

    Hindelang, Florine; Zurbach, Raphael; Roggo, Yves

    2015-04-10

    Biomedical device and medicine product manufacturing are long processes facing global competition. As technology evolves with time, the level of quality, safety and reliability increases simultaneously. Micro Computer Tomography (Micro CT) is a tool allowing a deep investigation of products: it can contribute to quality improvement. This article presents the numerous applications of Micro CT for medical device and pharmaceutical packaging analysis. The samples investigated confirmed CT suitability for verification of integrity, measurements and defect detections in a non-destructive manner.

  6. Computational Methods for Failure Analysis and Life Prediction

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K. (Compiler); Harris, Charles E. (Compiler); Housner, Jerrold M. (Compiler); Hopkins, Dale A. (Compiler)

    1993-01-01

    This conference publication contains the presentations and discussions from the joint UVA/NASA Workshop on Computational Methods for Failure Analysis and Life Prediction held at NASA Langley Research Center 14-15 Oct. 1992. The presentations focused on damage, failure, and life prediction of polymer-matrix composite structures. They covered some of the research activities at NASA Langley, NASA Lewis, Southwest Research Institute, industry, and universities. Both airframes and propulsion systems were considered.

  7. Variance analysis. Part II, The use of computers.

    PubMed

    Finkler, S A

    1991-09-01

    This is the second in a two-part series on variance analysis. In the first article (JONA, July/August 1991), the author discussed flexible budgeting, including the calculation of price, quantity, volume, and acuity variances. In this second article, the author focuses on the use of computers by nurse managers to aid in the process of calculating, understanding, and justifying variances. PMID:1919788
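
    For readers without access to Part I, the flexible-budget decomposition that such computer aids automate is a short calculation: the total spending variance splits exactly into a price (rate) component and a quantity (use) component. A toy example with made-up figures (not Finkler's spreadsheet):

    ```python
    # Textbook flexible-budget decomposition:
    # total spending variance = price variance + quantity variance.
    budget_price, budget_qty = 20.00, 1000   # e.g., $/nursing hour, hours
    actual_price, actual_qty = 21.50, 1100

    price_var = (actual_price - budget_price) * actual_qty   # rate effect
    qty_var = (actual_qty - budget_qty) * budget_price       # volume/use effect
    total_var = actual_price * actual_qty - budget_price * budget_qty
    assert abs(total_var - (price_var + qty_var)) < 1e-9
    print(price_var, qty_var, total_var)                     # 1650.0 2000.0 3650.0
    ```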

  8. Computational aspects of growth-induced instabilities through eigenvalue analysis

    NASA Astrophysics Data System (ADS)

    Javili, A.; Dortdivanlioglu, B.; Kuhl, E.; Linder, C.

    2015-09-01

    The objective of this contribution is to establish a computational framework to study growth-induced instabilities. The common approach towards growth-induced instabilities is to decompose the deformation multiplicatively into its growth and elastic part. Recently, this concept has been employed in computations of growing continua and has proven to be extremely useful to better understand the material behavior under growth. While finite element simulations seem to be capable of predicting the behavior of growing continua, they often cannot naturally capture the instabilities caused by growth. The accepted strategy to provoke growth-induced instabilities is therefore to perturb the solution of the problem, which indeed results in geometric instabilities in the form of wrinkles and folds. However, this strategy is intrinsically subjective as the user is prescribing the perturbations and the simulations are often highly perturbation-dependent. We propose a different strategy that is inherently suitable for this problem, namely eigenvalue analysis. The main advantages of eigenvalue analysis are that first, no arbitrary, artificial perturbations are needed and second, it is, in general, independent of the time step size. Therefore, the solution obtained by this methodology is not subjective and thus, is generic and reproducible. Equipped with eigenvalue analysis, we are able to compute precisely the critical growth to initiate instabilities. Furthermore, this strategy allows us to compare different finite elements for this family of problems. Our results demonstrate that linear elements perform strikingly poorly, as compared to quadratic elements.
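
    The appeal of the approach is that the critical parameter falls out of a generalized eigenvalue problem with no artificial perturbation. The same machinery can be shown on a classical instability; the sketch below uses Euler column buckling as a simple stand-in for the growth problem (not the authors' finite element formulation):

    ```python
    import numpy as np
    from scipy.linalg import eigh

    # Pinned-pinned Euler column, EI u'''' = -P u'': discretize interior points.
    n, L, EI = 200, 1.0, 1.0
    h = L / (n + 1)
    main = -2.0 * np.ones(n); off = np.ones(n - 1)
    D2 = (np.diag(main) + np.diag(off, 1) + np.diag(off, -1)) / h**2
    K = EI * (D2 @ D2)       # bending stiffness operator (u'' = 0 at pinned ends)
    G = -D2                  # geometric (initial-stress) operator, positive definite

    # Smallest eigenvalue of K v = P G v is the critical load multiplier.
    vals = eigh(K, G, eigvals_only=True)
    print(vals[0], np.pi**2 * EI / L**2)   # both ~9.8696; the eigenmode is the wrinkle shape
    ```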

  9. Spacelab data analysis using the space plasma computer analysis network (SCAN) system

    NASA Technical Reports Server (NTRS)

    Green, J. L.

    1984-01-01

    The Space-plasma Computer Analysis Network (SCAN) currently connects a large number of U.S. Spacelab investigators into a common computer network. Used primarily by plasma physics researchers at present, SCAN provides Spacelab investigators in other areas of space science with access to Spacelab and non-Spacelab correlative data bases and to large Class VI computational facilities for modeling. SCAN links computers together at remote institutions used by space researchers, utilizing commercially available software for computer-to-computer communications. Started by NASA's Office of Space Science in mid-1980, SCAN presently contains ten system nodes located at major universities and space research laboratories, with fourteen new nodes projected for the near future. The Stanford University computer gateways allow SCAN users to connect onto the ARPANET and TELENET overseas networks.

  10. Dosimetric Comparison Between 3-Dimensional Conformal and Robotic SBRT Treatment Plans for Accelerated Partial Breast Radiotherapy.

    PubMed

    Goggin, L M; Descovich, M; McGuinness, C; Shiao, S; Pouliot, J; Park, C

    2016-06-01

    Accelerated partial breast irradiation is an attractive alternative to conventional whole breast radiotherapy for selected patients. Recently, CyberKnife has emerged as a possible alternative to conventional techniques for accelerated partial breast irradiation. In this retrospective study, we present a dosimetric comparison between 3-dimensional conformal radiotherapy plans and CyberKnife plans using circular (Iris) and multi-leaf collimators. Nine patients who had undergone breast-conserving surgery followed by whole breast radiation were included in this retrospective study. The CyberKnife planning target volume (PTV) was defined as the lumpectomy cavity + 10 mm + 2 mm with prescription dose of 30 Gy in 5 fractions. Two sets of 3-dimensional conformal radiotherapy plans were created, one used the same definitions as described for CyberKnife and the second used the RTOG-0413 definition of the PTV: lumpectomy cavity + 15 mm + 10 mm with prescription dose of 38.5 Gy in 10 fractions. Using both PTV definitions allowed us to compare the dose delivery capabilities of each technology and to evaluate the advantage of CyberKnife tracking. For the dosimetric comparison using the same PTV margins, CyberKnife and 3-dimensional plans resulted in similar tumor coverage and dose to critical structures, with the exception of the lung V5%, which was significantly smaller for 3-dimensional conformal radiotherapy, 6.2% when compared to 39.4% for CyberKnife-Iris and 17.9% for CyberKnife-multi-leaf collimator. When the inability of 3-dimensional conformal radiotherapy to track motion is considered, the result increased to 25.6%. Both CyberKnife-Iris and CyberKnife-multi-leaf collimator plans demonstrated significantly lower average ipsilateral breast V50% (25.5% and 24.2%, respectively) than 3-dimensional conformal radiotherapy (56.2%). The CyberKnife plans were more conformal but less homogeneous than the 3-dimensional conformal radiotherapy plans. Approximately 50% shorter

  11. Construction of 3-Dimensional Printed Ultrasound Phantoms With Wall-less Vessels.

    PubMed

    Nikitichev, Daniil I; Barburas, Anamaria; McPherson, Kirstie; Mari, Jean-Martial; West, Simeon J; Desjardins, Adrien E

    2016-06-01

    Ultrasound phantoms are invaluable as training tools for vascular access procedures. We developed ultrasound phantoms with wall-less vessels using 3-dimensional printed chambers. Agar was used as a soft tissue-mimicking material, and the wall-less vessels were created with rods that were retracted after the agar was set. The chambers had integrated luer connectors to allow for fluid injections with clinical syringes. Several variations on this design are presented, which include branched and stenotic vessels. The results show that 3-dimensional printing can be well suited to the construction of wall-less ultrasound phantoms, with designs that can be readily customized and shared electronically. PMID:27162278

  12. 3-Dimensional Multiwaveguide Probe Array for Light Delivery to Distributed Brain Circuits

    PubMed Central

    Zorzos, Anthony N.; Scholvin, Jorg; Boyden, Edward S.; Fonstad, Clifton G.

    2013-01-01

    To deliver light to the brain for neuroscientific and neuroengineering applications like optogenetics, in which light is used to activate or silence neurons expressing specific photosensitive proteins, optical fibers are commonly used. However, an optical fiber is limited to delivering light to a single target within the three-dimensional structure of the brain. We here describe the design and fabrication of an array of thin microwaveguides which terminate at a 3-dimensionally distributed set of points, appropriate for delivering light to targets distributed in a 3-dimensional pattern throughout the brain. PMID:23202064

  13. Magnetic topologies of coronal mass ejection events: Effects of 3-dimensional reconnection

    SciTech Connect

    Gosling, J.T.

    1995-09-01

    New magnetic loops formed in the corona following coronal mass ejection (CME) liftoffs provide strong evidence that magnetic reconnection commonly occurs within the magnetic "legs" of the departing CMEs. Such reconnection is inherently 3-dimensional and naturally produces CMEs having magnetic flux rope topologies. Sustained reconnection behind CMEs can produce a mixture of open and disconnected field lines threading the CMEs. In contrast to the results of 2-dimensional reconnection, the disconnected field lines are attached to the outer heliosphere at both ends. A variety of solar and solar wind observations are consistent with the concept of sustained 3-dimensional reconnection within the magnetic legs of CMEs close to the Sun.

  14. Shielding analysis methods available in the scale computational system

    SciTech Connect

    Parks, C.V.; Tang, J.S.; Hermann, O.W.; Bucholz, J.A.; Emmett, M.B.

    1986-01-01

    Computational tools have been included in the SCALE system to allow shielding analysis to be performed using both discrete-ordinates and Monte Carlo techniques. One-dimensional discrete ordinates analyses are performed with the XSDRNPM-S module, and point dose rates outside the shield are calculated with the XSDOSE module. Multidimensional analyses are performed with the MORSE-SGC/S Monte Carlo module. This paper will review the above modules and the four Shielding Analysis Sequences (SAS) developed for the SCALE system. 7 refs., 8 figs.

  15. Advances in Computational Stability Analysis of Composite Aerospace Structures

    SciTech Connect

    Degenhardt, R.; Araujo, F. C. de

    2010-09-30

    The European aircraft industry demands reduced development and operating costs. Structural weight reduction by exploitation of structural reserves in composite aerospace structures contributes to this aim; however, it requires accurate and experimentally validated stability analysis of real structures under realistic loading conditions. This paper presents different advances in computational stability analysis of composite aerospace structures which contribute to that field. For stringer-stiffened panels, the main results of the finished EU project COCOMAT are given. It investigated the exploitation of reserves in primary fibre composite fuselage structures through an accurate and reliable simulation of postbuckling and collapse. For unstiffened cylindrical composite shells, a proposal for a new design method is presented.

  16. Computational Tools for the Secondary Analysis of Metabolomics Experiments

    PubMed Central

    Booth, Sean C.; Weljie, Aalim M.; Turner, Raymond J.

    2013-01-01

    Metabolomics experiments have become commonplace in a wide variety of disciplines. By identifying and quantifying metabolites researchers can achieve a systems level understanding of metabolism. These studies produce vast swaths of data which are often only lightly interpreted due to the overwhelmingly large amount of variables that are measured. Recently, a number of computational tools have been developed which enable much deeper analysis of metabolomics data. These data have been difficult to interpret as understanding the connections between dozens of altered metabolites has often relied on the biochemical knowledge of researchers and their speculations. Modern biochemical databases provide information about the interconnectivity of metabolism which can be automatically polled using metabolomics secondary analysis tools. Starting with lists of altered metabolites, there are two main types of analysis: enrichment analysis computes which metabolic pathways have been significantly altered whereas metabolite mapping contextualizes the abundances and significances of measured metabolites into network visualizations. Many different tools have been developed for one or both of these applications. In this review the functionality and use of these software is discussed. Together these novel secondary analysis tools will enable metabolomics researchers to plumb the depths of their data and produce farther reaching biological conclusions than ever before. PMID:24688685
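
    Enrichment analysis in these tools usually reduces to an over-representation test. A minimal sketch with a hypergeometric test (illustrative counts, not tied to any particular pathway database):

    ```python
    from scipy.stats import hypergeom

    # Pathway over-representation: of N measured metabolites, K belong to the
    # pathway of interest; the study flags n as altered, k of which are in the pathway.
    N, K, n, k = 500, 20, 40, 6        # illustrative numbers
    p = hypergeom.sf(k - 1, N, K, n)   # P(X >= k) under random selection
    print(f"enrichment p-value: {p:.4g}")
    ```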

  17. New Mexico district work-effort analysis computer program

    USGS Publications Warehouse

    Hiss, W.L.; Trantolo, A.P.; Sparks, J.L.

    1972-01-01

    The computer program (CAN 2) described in this report is one of several related programs used in the New Mexico District cost-analysis system. The work-effort information used in these programs is accumulated and entered to the nearest hour on forms completed by each employee. Tabulating cards are punched directly from these forms after visual examinations for errors are made. Reports containing detailed work-effort data itemized by employee within each project and account and by account and project for each employee are prepared for both current-month and year-to-date periods by the CAN 2 computer program. An option allowing preparation of reports for a specified 3-month period is provided. The total number of hours worked on each account and project and a grand total of hours worked in the New Mexico District is computed and presented in a summary report for each period. Work effort not chargeable directly to individual projects or accounts is considered as overhead and can be apportioned to the individual accounts and projects on the basis of the ratio of the total hours of work effort for the individual accounts or projects to the total New Mexico District work effort at the option of the user. The hours of work performed by a particular section, such as General Investigations or Surface Water, are prorated and charged to the projects or accounts within the particular section. A number of surveillance or buffer accounts are employed to account for the hours worked on special events or on those parts of large projects or accounts that require a more detailed analysis. Any part of the New Mexico District operation can be separated and analyzed in detail by establishing an appropriate buffer account. With the exception of statements associated with word size, the computer program is written in FORTRAN IV in a relatively low and standard language level to facilitate its use on different digital computers. The program has been run only on a Control Data Corporation

  18. PREWATE: An interactive preprocessing computer code to the Weight Analysis of Turbine Engines (WATE) computer code

    NASA Technical Reports Server (NTRS)

    Fishbach, L. H.

    1983-01-01

    The Weight Analysis of Turbine Engines (WATE) computer code was developed by Boeing under contract to NASA Lewis. It was designed to function as an adjunct to the Navy/NASA Engine Program (NNEP). NNEP calculates the design and off-design thrust and sfc performance of user-defined engine cycles. The thermodynamic parameters throughout the engine as generated by NNEP are then combined with input parameters defining the component characteristics in WATE to calculate the bare engine weight of this user-defined engine. Preprocessor programs for NNEP were previously developed to simplify the task of creating input datasets. This report describes a similar preprocessor for the WATE code.

  19. A Lumped Computational Model for Sodium Sulfur Battery Analysis

    NASA Astrophysics Data System (ADS)

    Wu, Fan

    Due to the cost of materials and time-consuming testing procedures, development of new batteries is a slow and expensive practice. The purpose of this study is to develop a computational model and assess the capabilities of such a model designed to aid in the design process and control of sodium sulfur batteries. To this end, a transient lumped computational model derived from an integral analysis of the transport of species, energy and charge throughout the battery has been developed. The computation processes are coupled with the use of Faraday's law, and solutions for the species concentrations, electrical potential and current are produced in a time marching fashion. Properties required for solving the governing equations are calculated and updated as a function of time based on the composition of each control volume. The proposed model is validated against multi-dimensional simulations and experimental results from the literature, and simulation results using the proposed model are presented and analyzed. The computational model and electrochemical model used to solve the equations for the lumped model are compared with similar ones found in the literature. The results obtained from the current model compare favorably with those from experiments and other models.
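
    The heart of such a lumped model is a time-marching loop in which Faraday's law converts the cell current into species consumption and the control-volume composition updates the cell properties. A highly simplified sketch with assumed parameter values (illustrative only, not the thesis model):

    ```python
    FARADAY = 96485.0       # Faraday constant, C/mol
    I = 10.0                # discharge current, A (assumed constant)
    dt = 1.0                # time step, s
    n_e = 2                 # electrons transferred per mol reacted (illustrative)

    n_S = 5.0               # mol sulfur in the cathode control volume
    n_poly = 0.0            # mol polysulfide product
    t, history = 0.0, []
    while n_S > 0.05 * 5.0:             # march until ~95% depth of discharge
        r = I / (n_e * FARADAY)         # Faraday's law: mol/s consumed
        n_S -= r * dt
        n_poly += r * dt
        # a real model would update composition-dependent properties here
        V = 2.076 - 0.02 * (n_poly / 5.0) - I * 0.005   # illustrative OCV minus IR drop
        t += dt
        history.append((t, n_S, V))
    ```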

  20. Analysis of multigrid methods on massively parallel computers: Architectural implications

    NASA Technical Reports Server (NTRS)

    Matheson, Lesley R.; Tarjan, Robert E.

    1993-01-01

    We study the potential performance of multigrid algorithms running on massively parallel computers with the intent of discovering whether presently envisioned machines will provide an efficient platform for such algorithms. We consider the domain parallel version of the standard V cycle algorithm on model problems, discretized using finite difference techniques in two and three dimensions on block structured grids of size 10^6 and 10^9, respectively. Our models of parallel computation were developed to reflect the computing characteristics of the current generation of massively parallel multicomputers. These models are based on an interconnection network of 256 to 16,384 message passing, 'workstation size' processors executing in an SPMD mode. The first model accomplishes interprocessor communications through a multistage permutation network. The communication cost is a logarithmic function which is similar to the costs in a variety of different topologies. The second model allows single stage communication costs only. Both models were designed with information provided by machine developers and utilize implementation derived parameters. With the medium grain parallelism of the current generation and the high fixed cost of an interprocessor communication, our analysis suggests an efficient implementation requires the machine to support the efficient transmission of long messages (up to 1000 words), or the high initiation cost of a communication must be significantly reduced through an alternative optimization technique. Furthermore, with variable length message capability, our analysis suggests the low diameter multistage networks provide little or no advantage over a simple single stage communications network.
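
    The flavor of such a performance model can be captured in a few lines: per-level time is compute plus a fixed message start-up cost plus a per-word transfer cost, and on the coarse grids of a V cycle the start-up term dominates. A toy version with assumed machine parameters (not the paper's calibrated numbers):

    ```python
    # Toy multigrid V-cycle timing model with assumed machine parameters.
    def comm_time(words, alpha=50e-6, beta=10e-9):
        """alpha: message start-up (s); beta: per-word transfer (s). Assumed values."""
        return alpha + beta * words

    def level_time(n_points, p, flops_per_point=30.0, rate=10e6, msgs=4):
        compute = n_points / p * flops_per_point / rate
        halo = int((n_points / p) ** 0.5)     # 2D block: boundary ~ sqrt(interior)
        return compute + msgs * comm_time(halo)

    for p in (256, 1024, 4096, 16384):
        total = sum(level_time(10**6 / 4**l, p) for l in range(6))
        print(p, f"{total * 1e3:.2f} ms")     # start-up cost dominates at large p
    ```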

  1. A computational clonal analysis of the developing mouse limb bud.

    PubMed

    Marcon, Luciano; Arqués, Carlos G; Torres, Miguel S; Sharpe, James

    2011-01-01

    A comprehensive spatio-temporal description of the tissue movements underlying organogenesis would be an extremely useful resource to developmental biology. Clonal analysis and fate mappings are popular experiments to study tissue movement during morphogenesis. Such experiments allow cell populations to be labeled at an early stage of development and to follow their spatial evolution over time. However, disentangling the cumulative effects of the multiple events responsible for the expansion of the labeled cell population is not always straightforward. To overcome this problem, we develop a novel computational method that combines accurate quantification of 2D limb bud morphologies and growth modeling to analyze mouse clonal data of early limb development. Firstly, we explore various tissue movements that match experimental limb bud shape changes. Secondly, by comparing computational clones with newly generated mouse clonal data we are able to choose and characterize the tissue movement map that better matches experimental data. Our computational analysis produces for the first time a two dimensional model of limb growth based on experimental data that can be used to better characterize limb tissue movement in space and time. The model shows that the distribution and shapes of clones can be described as a combination of anisotropic growth with isotropic cell mixing, without the need for lineage compartmentalization along the AP and PD axis. Lastly, we show that this comprehensive description can be used to reassess spatio-temporal gene regulations taking tissue movement into account and to investigate PD patterning hypothesis.

  2. Analysis of sponge zones for computational fluid mechanics

    SciTech Connect

    Bodony, Daniel J. . E-mail: bodony@stanford.edu

    2006-03-01

    The use of sponge regions, or sponge zones, which add the forcing term -σ(q - q_ref) to the right-hand-side of the governing equations in computational fluid mechanics as an ad hoc boundary treatment is widespread. They are used to absorb and minimize reflections from computational boundaries and as forcing sponges to introduce prescribed disturbances into a calculation. A less common usage is as a means of extending a calculation from a smaller domain into a larger one, such as in computing the far-field sound generated in a localized region. By analogy to the penalty method of finite elements, the method is placed on a solid foundation, complete with estimates of convergence. The analysis generalizes the work of Israeli and Orszag [M. Israeli, S.A. Orszag, Approximation of radiation boundary conditions, J. Comp. Phys. 41 (1981) 115-135] and confirms their findings when applied as a special case to one-dimensional wave propagation in an absorbing sponge. It is found that the rate of convergence of the actual solution to the target solution, with an appropriate norm, is inversely proportional to the sponge strength. A detailed analysis for acoustic wave propagation in one-dimension verifies the convergence rate given by the general theory. The exponential point-wise convergence derived by Israeli and Orszag in the high-frequency limit is recovered and found to hold over all frequencies. A weakly nonlinear analysis of the method when applied to Burgers' equation shows similar convergence properties. Three numerical examples are given to confirm the analysis: the acoustic extension of a two-dimensional time-harmonic point source, the acoustic extension of a three-dimensional initial-value problem of a sound pulse, and the introduction of unstable eigenmodes from linear stability theory into a two-dimensional shear layer.
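
    The forcing term is easy to exercise in one dimension. The sketch below (a minimal illustration, not the paper's analysis) adds a smoothly ramped sponge to a 1D advection equation so a pulse leaving the domain is damped toward the target state; consistent with the analysis, a larger σ drives the solution more strongly toward q_ref inside the sponge:

    ```python
    import numpy as np

    # 1D advection u_t + c u_x = -sigma(x) (u - u_ref): upwind in space,
    # forward Euler in time, sponge ramped up near the outflow boundary.
    nx, c = 400, 1.0
    x = np.linspace(0.0, 1.0, nx)
    dx = x[1] - x[0]
    dt = 0.5 * dx / c                      # CFL-stable step
    u_ref = 0.0
    sigma = np.where(x > 0.8, 200.0 * ((x - 0.8) / 0.2) ** 2, 0.0)  # smooth ramp

    u = np.exp(-((x - 0.3) / 0.05) ** 2)   # Gaussian pulse
    for _ in range(600):
        u[1:] = u[1:] - c * dt / dx * (u[1:] - u[:-1])   # upwind convection
        u -= dt * sigma * (u - u_ref)                    # sponge forcing term
        u[0] = 0.0                                       # inflow condition
    ```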

  3. 3-Dimensional and Interactive Istanbul University Virtual Laboratory Based on Active Learning Methods

    ERIC Educational Resources Information Center

    Ince, Elif; Kirbaslar, Fatma Gulay; Yolcu, Ergun; Aslan, Ayse Esra; Kayacan, Zeynep Cigdem; Alkan Olsson, Johanna; Akbasli, Ayse Ceylan; Aytekin, Mesut; Bauer, Thomas; Charalambis, Dimitris; Gunes, Zeliha Ozsoy; Kandemir, Ceyhan; Sari, Umit; Turkoglu, Suleyman; Yaman, Yavuz; Yolcu, Ozgu

    2014-01-01

    The purpose of this study is to develop a 3-dimensional, interactive, multi-user and multi-admin IUVIRLAB featuring active learning methods and techniques for university students, to introduce the Virtual Laboratory of Istanbul University, and to show the effects of IUVIRLAB on students' attitudes toward communication skills and IUVIRLAB. Although…

  4. 3-dimensional orthodontics visualization system with dental study models and orthopantomograms

    NASA Astrophysics Data System (ADS)

    Zhang, Hua; Ong, S. H.; Foong, K. W. C.; Dhar, T.

    2005-04-01

    The aim of this study is to develop a system that provides 3-dimensional visualization of orthodontic treatments. Dental plaster models and the corresponding orthopantomogram (dental panoramic tomogram) are first digitized and fed into the system. A semi-automatic segmentation technique is applied to the plaster models to detect the dental arches, tooth interstices and gum margins, which are used to extract individual crown models. A 3-dimensional representation of the roots, generated by deforming generic tooth models to the orthopantomogram using radial basis functions, is attached to the corresponding crowns to enable visualization of complete teeth. An optional algorithm to close the gaps between deformed roots and actual crowns by using multiquadric radial basis functions is also presented, which is capable of generating a smooth mesh representation of complete 3-dimensional teeth. The user interface is carefully designed to achieve a flexible system with as much user friendliness as possible. Manual calibration and correction are possible throughout the data processing steps to compensate for occasional misbehaviors of the automatic procedures. By allowing the users to move and re-arrange individual teeth (with their roots) on a full dentition, this orthodontic visualization system provides an easy and accurate way of simulating and planning orthodontic treatment. Its capability of presenting 3-dimensional root information with only study models and an orthopantomogram is especially useful for patients who do not undergo CT scanning, which is not a routine procedure in most orthodontic cases.
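
    The root-deformation step can be illustrated with a small radial basis function warp: displacements known at landmark points are interpolated smoothly to every vertex of a generic tooth mesh. A minimal sketch with a multiquadric kernel and made-up landmarks (the function name and all data below are hypothetical):

    ```python
    import numpy as np

    def rbf_warp(ctrl_src, ctrl_dst, pts, c=1.0):
        """Multiquadric RBF warp: fit displacements at control points, apply to pts."""
        phi = lambda r: np.sqrt(r**2 + c**2)
        d = np.linalg.norm(ctrl_src[:, None, :] - ctrl_src[None, :, :], axis=2)
        W = np.linalg.solve(phi(d), ctrl_dst - ctrl_src)       # RBF weights
        d_new = np.linalg.norm(pts[:, None, :] - ctrl_src[None, :, :], axis=2)
        return pts + phi(d_new) @ W

    # landmarks on a generic tooth model mapped to patient-specific positions
    src = np.array([[0.0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1]])
    dst = src + np.array([[0.0, 0, 0], [0.1, 0, 0], [0, 0.05, 0], [0, 0, 0.2]])
    mesh_pts = np.random.default_rng(0).random((100, 3))       # stand-in mesh vertices
    warped = rbf_warp(src, dst, mesh_pts)
    ```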

  5. Regional climate: Design and analysis of computer experiments?

    NASA Astrophysics Data System (ADS)

    Nychka, D. W.

    2011-12-01

    As attention shifts from broad global summaries of climate change to more specific regional impacts, there is a need for the data sciences to quantify the uncertainty in regional predictions. This talk will provide an overview of regional climate experiments with an emphasis on the statistical problems of interpreting these large and complex simulations. A regional climate model is a computer code based on physics that simulates the detailed flow of the atmosphere in a particular region from the large-scale information of a global climate model. One intent is to compare simulations under current climate to future scenarios to infer the nature of climate change expected at a location. There exists a mature sub-discipline in engineering and statistics on the design and analysis of computer experiments. This talk will sketch how general methods from this area may apply to the interpretation of climate model experiments and to what extent the problems of interpreting climate projections are unique and require new ideas.

  6. Recent applications of the transonic wing analysis computer code, TWING

    NASA Technical Reports Server (NTRS)

    Subramanian, N. R.; Holst, T. L.; Thomas, S. D.

    1982-01-01

    An evaluation of the transonic-wing-analysis computer code TWING is given. TWING utilizes a fully implicit approximate factorization iteration scheme to solve the full potential equation in conservative form. A numerical elliptic-solver grid-generation scheme is used to generate the required finite-difference mesh. Several wing configurations were analyzed, and the limits of applicability of this code were evaluated. Comparisons of computed results were made with available experimental data. Results indicate that the code is robust, accurate (when significant viscous effects are not present), and efficient. TWING generally produces solutions an order of magnitude faster than other conservative full-potential codes using successive-line overrelaxation. The present method is applicable to a wide range of isolated wing configurations, including high-aspect-ratio transport wings and low-aspect-ratio, high-sweep fighter configurations.

  7. Computational methodology for ChIP-seq analysis

    PubMed Central

    Shin, Hyunjin; Liu, Tao; Duan, Xikun; Zhang, Yong; Liu, X. Shirley

    2015-01-01

    Chromatin immunoprecipitation coupled with massive parallel sequencing (ChIP-seq) is a powerful technology to identify the genome-wide locations of DNA binding proteins such as transcription factors or modified histones. As more and more experimental laboratories are adopting ChIP-seq to unravel the transcriptional and epigenetic regulatory mechanisms, computational analyses of ChIP-seq also become increasingly comprehensive and sophisticated. In this article, we review current computational methodology for ChIP-seq analysis, recommend useful algorithms and workflows, and introduce quality control measures at different analytical steps. We also discuss how ChIP-seq could be integrated with other types of genomic assays, such as gene expression profiling and genome-wide association studies, to provide a more comprehensive view of gene regulatory mechanisms in important physiological and pathological processes. PMID:25741452

  8. Data acquisition and analysis using the IBM Computer System 9000

    SciTech Connect

    Mueller, G.E.

    1985-10-01

    A data-acquisition, analysis, and graphing program has been developed on the IBM CS-9000 multitask computer to support the UNM/SNL/GA Thermal-Hydraulic Test Facility. The software has been written in Computer System BASIC, which allows accessing and configuring I/O devices. The CS-9000 has been interfaced with an HP 3497A Data Acquisition/Control Unit and an HP 7470A Graphics Plotter through the IEEE-488 Bus. With this configuration the system is capable of scanning 60 channels of analog thermocouple-compensated input, 20 channels of analog pressure transducer input, and 16 channels of digital mass flow rate input. The CS-9000 graphics coupled with the HP 7470A provide useful visualization of changes in measured parameters. 8 refs., 7 figs.

  9. A Computational Approach for Probabilistic Analysis of Water Impact Simulations

    NASA Technical Reports Server (NTRS)

    Horta, Lucas G.; Mason, Brian H.; Lyle, Karen H.

    2009-01-01

    NASA's development of new concepts for the Crew Exploration Vehicle Orion presents many challenges similar to those faced in the sixties during the Apollo program. However, with improved modeling capabilities, new challenges arise. For example, the use of the commercial code LS-DYNA, although widely used and accepted in the technical community, often involves high-dimensional, time-consuming, and computationally intensive simulations. The challenge is to capture what is learned from a limited number of LS-DYNA simulations to develop models that allow users to conduct interpolation of solutions at a fraction of the computational time. This paper presents a description of the LS-DYNA model, a brief summary of the response surface techniques, the analysis of variance approach used in the sensitivity studies, equations used to estimate impact parameters, results showing conditions that might cause injuries, and concluding remarks.
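
    The response surface step can be sketched compactly: fit a low-order polynomial surrogate to a handful of expensive simulator runs, then evaluate it at new conditions at negligible cost. A toy quadratic surrogate (the inputs and outputs below are made up, standing in for the LS-DYNA runs):

    ```python
    import numpy as np

    rng = np.random.default_rng(2)
    X = rng.uniform(-1, 1, size=(30, 2))   # e.g., normalized impact velocity, pitch angle
    y = 5 + 2*X[:, 0] - 3*X[:, 1] + 1.5*X[:, 0]*X[:, 1] + rng.normal(0, 0.1, 30)

    def design(X):
        """Quadratic design matrix: [1, x1, x2, x1^2, x2^2, x1*x2]."""
        x1, x2 = X[:, 0], X[:, 1]
        return np.column_stack([np.ones(len(X)), x1, x2, x1**2, x2**2, x1 * x2])

    coef, *_ = np.linalg.lstsq(design(X), y, rcond=None)   # least-squares fit
    X_new = np.array([[0.25, -0.5]])
    print(design(X_new) @ coef)            # cheap surrogate prediction
    ```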

  10. Computational analysis of RNA structures with chemical probing data.

    PubMed

    Ge, Ping; Zhang, Shaojie

    2015-06-01

    RNAs play various roles, not only as the genetic codes to synthesize proteins, but also as the direct participants of biological functions determined by their underlying high-order structures. Although many computational methods have been proposed for analyzing RNA structures, their accuracy and efficiency are limited, especially when applied to the large RNAs and the genome-wide data sets. Recently, advances in parallel sequencing and high-throughput chemical probing technologies have prompted the development of numerous new algorithms, which can incorporate the auxiliary structural information obtained from those experiments. Their potential has been revealed by the secondary structure prediction of ribosomal RNAs and the genome-wide ncRNA function annotation. In this review, the existing probing-directed computational methods for RNA secondary and tertiary structure analysis are discussed.

  11. Computational chemistry in Argonne's Reactor Analysis Division

    SciTech Connect

    Gelbard, E.; Agrawal, R.; Fanning, T.

    1997-08-01

    Roughly 3 years ago, work on Argonne's Integral Fast Reactor ("IFR") was terminated, and at that time ANL funding was redirected to a number of alternative programs. One such alternative was waste management and, since disposal of spent fuel from ANL's EBR-II reactor presents some special problems, this seemed an appropriate area for ANL work. Methods for the treatment and disposal of spent fuel (particularly from EBR-II but also from other sources) are now under very active investigation at ANL. The very large waste form development program is mainly experimental at this point, but within the Reactor Analysis ("RA") Division a small computational chemistry program is underway, designed to supplement the experimental program. One of the most popular proposals for the treatment of much of our high-level waste is vitrification. As noted below, this approach has serious drawbacks for EBR-II spent fuel. ANL has proposed, instead, that spent fuel first be pretreated by a special metallurgical process which produces, as waste, chloride salts of the various fission products; these salts would then be adsorbed in zeolite A, which is subsequently bonded with glass to produce a waste form suitable for disposal. So far it has been the main mission of RA's computational chemistry program to study the process by which leaching occurs when the glass-bonded zeolite waste form is exposed to water. It is the purpose of this paper to describe RA's computational chemistry program, to discuss the computational techniques involved in such a program, and in general to familiarize the M. and C. Division with a computational area which is probably unfamiliar to most of its members. 11 refs., 2 figs.

  12. Sensitivity analysis for computer model projections of hurricane losses.

    PubMed

    Iman, Ronald L; Johnson, Mark E; Watson, Charles C

    2005-10-01

    Projecting losses associated with hurricanes is a complex and difficult undertaking that is fraught with uncertainties. Hurricane Charley, which struck southwest Florida on August 13, 2004, illustrates the uncertainty of forecasting damages from these storms. Due to shifts in the track and the rapid intensification of the storm, real-time estimates grew from 2 billion dollars to 3 billion dollars in losses late on the 12th to a peak of 50 billion dollars for a brief time as the storm appeared to be headed for the Tampa Bay area. The storm struck the resort areas of Charlotte Harbor and moved across the densely populated central part of the state, with early poststorm estimates in the 28 billion to 31 billion dollar range, and final estimates converging at 15 billion dollars as the actual intensity at landfall became apparent. The Florida Commission on Hurricane Loss Projection Methodology (FCHLPM) has a great appreciation for the role of computer models in projecting losses from hurricanes. The FCHLPM contracts with a professional team to perform onsite (confidential) audits of computer models developed by several different companies in the United States that seek to have their models approved for use in insurance rate filings in Florida. The team's members represent the fields of actuarial science, computer science, meteorology, statistics, and wind and structural engineering. An important part of the auditing process requires uncertainty and sensitivity analyses to be performed with the applicant's proprietary model. To influence future such analyses, an uncertainty and sensitivity analysis has been completed for loss projections arising from use of a sophisticated computer model based on the Holland wind field. Sensitivity analyses presented in this article utilize standardized regression coefficients to quantify the contribution of the computer input variables to the magnitude of the wind speed.
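
    A minimal sketch of the standardized regression coefficient (SRC) computation named in the abstract: standardize inputs and output to zero mean and unit variance, regress, and read input importance off the coefficients. The synthetic data stand in for the proprietary hurricane-model samples.

        # Standardized regression coefficients: standardize X and y, regress,
        # and rank inputs by |coefficient|. Synthetic data, not model output.
        import numpy as np

        rng = np.random.default_rng(0)
        X = rng.normal(size=(200, 3))        # three hypothetical model inputs
        y = 2.0 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.1, size=200)

        Xs = (X - X.mean(axis=0)) / X.std(axis=0)
        ys = (y - y.mean()) / y.std()
        src, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
        print(src.round(2))   # ~[0.97, 0.24, 0.00]: first input dominates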

  13. Computer aided analysis, simulation and optimisation of thermal sterilisation processes.

    PubMed

    Narayanan, C M; Banerjee, Arindam

    2013-04-01

    Although thermal sterilisation is a widely employed industrial process, little work is reported in the available literature, including patents, on the mathematical analysis and simulation of these processes. In the present work, software packages have been developed for the computer-aided optimum design of thermal sterilisation processes. Systems involving steam sparging, jacketed heating/cooling, helical coils submerged in agitated vessels, and systems that employ external heat exchangers (double pipe, shell and tube, and plate exchangers) have been considered. Both batch and continuous operations have been analysed and simulated. The dependence of the del factor on system and operating parameters such as the mass or volume of substrate to be sterilised per batch, speed of agitation, helix diameter, substrate-to-steam ratio, and the rate of substrate circulation through the heat exchanger and through the holding tube has been analysed separately for each mode of sterilisation. Axial dispersion in the holding tube has also been adequately accounted for through an appropriately defined axial dispersion coefficient. The effect of exchanger characteristics and specifications on system performance has also been analysed. The multiparameter computer-aided design (CAD) software packages prepared are thus highly versatile and permit the optimum choice of operating variables for the processes selected. The computed results have been compared with extensive data collected from a number of industries (distilleries, food processing, and pharmaceutical industries) and pilot plants, and satisfactory agreement has been observed between the two, confirming the accuracy of the CAD software developed. No simplifying assumptions have been made during the analysis, and the design of the associated heating/cooling equipment has been performed using the most up-to-date design correlations and software. PMID:23294402
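
    For readers unfamiliar with the del factor, a minimal sketch under textbook assumptions (first-order thermal death with Arrhenius kinetics, so del = ln(N0/N) = ∫ k(T(t)) dt); the kinetic constants and temperature profile below are illustrative, not taken from the paper.

        # Del factor of a heating cycle, assuming first-order spore death
        # with Arrhenius kinetics; constants and profile are illustrative.
        import numpy as np

        A = 1.0e36          # 1/s, frequency factor (typical spore magnitude)
        E = 2.83e5          # J/mol, activation energy (illustrative)
        R = 8.314           # J/(mol K), gas constant

        t = np.linspace(0.0, 1800.0, 2000)              # 30 min process, s
        T = 373.0 + 10.0 * np.sin(np.pi * t / 1800.0)   # hypothetical T(t), K

        k = A * np.exp(-E / (R * T))                    # death rate, 1/s
        # trapezoidal integration of k over the cycle gives ln(N0/N)
        del_factor = np.sum(0.5 * (k[1:] + k[:-1]) * np.diff(t))
        print(round(float(del_factor), 2))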

  15. Overview of adaptive finite element analysis in computational geodynamics

    NASA Astrophysics Data System (ADS)

    May, D. A.; Schellart, W. P.; Moresi, L.

    2013-10-01

    The use of numerical models to develop insight and intuition into the dynamics of the Earth over geological time scales is a firmly established practice in the geodynamics community. As our depth of understanding grows, and hand-in-hand with improvements in analytical techniques and higher resolution remote sensing of the physical structure and state of the Earth, there is a continual need to develop more efficient, accurate and reliable numerical techniques. This is necessary to ensure that we can meet the challenge of generating robust conclusions, interpretations and predictions from improved observations. In adaptive numerical methods, the desire is generally to maximise the quality of the numerical solution for a given amount of computational effort. Neither of these terms has a unique, universal definition, but typically there is a trade-off between the number of unknowns we can calculate to obtain a more accurate representation of the Earth, and the resources (time and computational memory) required to compute them. In the engineering community, this topic has been extensively examined using the adaptive finite element (AFE) method. Recently, the applicability of this technique to geodynamic processes has started to be explored. In this review we report on the current status and usage of spatially adaptive finite element analysis in the field of geodynamics. The objective of this review is to provide a brief introduction to the area of spatially adaptive finite element analysis, including a summary of different techniques to define spatial adaptation and of different approaches to guide the adaptive process in order to control the discretisation error inherent within the numerical solution. An overview of the current state of the art in adaptive modelling in geodynamics is provided, together with a discussion pertaining to the issues related to using adaptive analysis techniques and perspectives for future research in this area. Additionally, we also provide a

  16. Casks (computer analysis of storage casks): A microcomputer based analysis system for storage cask review

    SciTech Connect

    Chen, T.F.; Mok, G.C.; Carlson, R.W.

    1995-08-01

    CASKS is a microcomputer-based computer system developed by LLNL to assist the Nuclear Regulatory Commission in performing confirmatory analyses for licensing review of radioactive-material storage cask designs. The analysis programs of the CASKS computer system consist of four modules: the impact analysis module, the thermal analysis module, the thermally-induced stress analysis module, and the pressure-induced stress analysis module. CASKS uses a series of menus to coordinate input programs, cask analysis programs, output programs, data archive programs and databases, so the user is able to run the system in an interactive environment. This paper outlines the theoretical background of the impact analysis module and the yield surface formulation. The close agreement between the CASKS analytical predictions and the results obtained from the two storage cask drop tests performed by SNL and by BNFL at Winfrith serves as validation of the CASKS impact analysis module.

  17. Multilevel extreme lateral interbody fusion (XLIF) and osteotomies for 3-dimensional severe deformity: 25 consecutive cases

    PubMed Central

    McAfee, Paul C.; Shucosky, Erin; Chotikul, Liana; Salari, Ben; Chen, Lun; Jerrems, Dan

    2013-01-01

    Background This is a retrospective review of 25 patients with severe lumbar nerve root compression undergoing multilevel anterior retroperitoneal lumbar interbody fusion and posterior instrumentation for deformity. The objective is to analyze the outcomes and clinical results from anterior interbody fusions performed through a lateral approach and compare these with traditional surgical procedures. Methods A consecutive series of 25 patients (78 extreme lateral interbody fusion [XLIF] levels) was identified to illustrate the primary advantages of XLIF in correcting the most extreme of the 3-dimensional deformities that fulfilled the following criteria: (1) a minimum of 40° of scoliosis; (2) 2 or more levels of translation, anterior spondylolisthesis, and lateral subluxation (subluxation in 2 planes), causing symptomatic neurogenic claudication and severe spinal stenosis; and (3) lumbar hypokyphosis or flat-back syndrome. In addition, the majority had trunks that were out of balance (central sacral vertical line ≥2 cm from vertical plumb line) or had sagittal imbalance, defined by a distance between the sagittal vertical line and S1 of greater than 3 cm. There were 25 patients who had severe enough deformities fulfilling these criteria that required supplementation of the lateral XLIF with posterior osteotomies and pedicle screw instrumentation. Results In our database, with a mean follow-up of 24 months, 85% of patients showed evidence of solid arthrodesis and no subsidence on computed tomography and flexion/extension radiographs. The complication rate remained low, with a perioperative rate of 2.4% and postoperative rate of 12.2%. The lateral listhesis and anterior spondylolisthetic subluxation were anatomically reduced with minimally invasive XLIF. The main finding in these 25 cases was our isolation of the major indication for supplemental posterior surgery: truncal decompensation in patients who are out of balance by 2 cm or more, in whom posterior spinal

  18. Carotid-Sparing TomoHelical 3-Dimensional Conformal Radiotherapy for Early Glottic Cancer

    PubMed Central

    Hong, Chae-Seon; Oh, Dongryul; Ju, Sang Gyu; Ahn, Yong Chan; Noh, Jae Myoung; Chung, Kwangzoo; Kim, Jin Sung; Suh, Tae-Suk

    2016-01-01

    Purpose The purpose of this study was to investigate the dosimetric benefits and treatment efficiency of carotid-sparing TomoHelical 3-dimensional conformal radiotherapy (TH-3DCRT) for early glottic cancer. Materials and Methods Ten early-stage (T1N0M0) glottic squamous cell carcinoma patients were simulated, based on computed tomography scans. Two-field 3DCRT (2F-3DCRT), 3-field intensity-modulated radiation therapy (3F-IMRT), TomoHelical-IMRT (TH-IMRT), and TH-3DCRT plans were generated with a 67.5-Gy total prescription dose to the planning target volume (PTV) for each patient. In order to evaluate the plan quality, dosimetric characteristics were compared in terms of conformity index (CI) and homogeneity index (HI) for PTV, dose to the carotid arteries, and maximum dose to the spinal cord. Treatment planning and delivery times were compared to evaluate treatment efficiency. Results The median CI was substantially better for the 3F-IMRT (0.65), TH-IMRT (0.64), and TH-3DCRT (0.63) plans, compared to the 2F-3DCRT plan (0.32). PTV HI was slightly better for TH-3DCRT and TH-IMRT (1.05) compared to 2F-3DCRT (1.06) and 3F-IMRT (1.09). TH-3DCRT, 3F-IMRT, and TH-IMRT showed an excellent carotid sparing capability compared to 2F-3DCRT (p < 0.05). For all plans, the maximum dose to the spinal cord was < 45 Gy. The median treatment planning times for 2F-3DCRT (5.85 minutes) and TH-3DCRT (7.10 minutes) were much lower than those for 3F-IMRT (45.48 minutes) and TH-IMRT (35.30 minutes). The delivery times for 2F-3DCRT (2.06 minutes) and 3F-IMRT (2.48 minutes) were slightly lower than those for TH-IMRT (2.90 minutes) and TH-3DCRT (2.86 minutes). Conclusion TH-3DCRT showed excellent carotid-sparing capability, while offering high efficiency and maintaining good PTV coverage. PMID:25761477
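
    Definitions of the conformity and homogeneity indices vary between studies; the sketch below uses one common pair of choices (a Paddick-style CI and HI = D5/D95) on synthetic dose arrays, purely to make the metrics concrete.

        # One common pair of definitions: Paddick-style conformity index and
        # homogeneity index D5/D95; synthetic dose samples and PTV mask.
        import numpy as np

        def conformity_index(dose, ptv_mask, ref_dose):
            ref = dose >= ref_dose                      # reference isodose volume
            tv_piv = np.count_nonzero(ref & ptv_mask)   # PTV covered by it
            return tv_piv**2 / (np.count_nonzero(ptv_mask) * np.count_nonzero(ref))

        def homogeneity_index(dose, ptv_mask):
            d = np.sort(dose[ptv_mask])
            d5 = d[int(0.95 * (len(d) - 1))]    # near-maximum dose (D5)
            d95 = d[int(0.05 * (len(d) - 1))]   # near-minimum dose (D95)
            return d5 / d95

        rng = np.random.default_rng(1)
        dose = rng.normal(67.5, 1.5, size=10000)        # Gy, toy dose samples
        ptv = np.zeros(10000, dtype=bool)
        ptv[:4000] = True
        print(conformity_index(dose, ptv, 64.1), homogeneity_index(dose, ptv))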

  19. Novel Multicompartment 3-Dimensional Radiochromic Radiation Dosimeters for Nanoparticle-Enhanced Radiation Therapy Dosimetry

    SciTech Connect

    Alqathami, Mamdooh; Blencowe, Anton; Yeo, Un Jin; Doran, Simon J.; Qiao, Greg; Geso, Moshi

    2012-11-15

    Purpose: Gold nanoparticles (AuNps), because of their high atomic number (Z), have been demonstrated to absorb low-energy X-rays preferentially, compared with tissue, and may be used to achieve localized radiation dose enhancement in tumors. The purpose of this study is to introduce the first example of a novel multicompartment radiochromic radiation dosimeter and to demonstrate its applicability for 3-dimensional (3D) dosimetry of nanoparticle-enhanced radiation therapy. Methods and Materials: A novel multicompartment phantom radiochromic dosimeter was developed. It was designed and formulated to mimic a tumor loaded with AuNps (50 nm in diameter) at a concentration of 0.5 mM, surrounded by normal tissues. The novel dosimeter is referred to as the Sensitivity Modulated Advanced Radiation Therapy (SMART) dosimeter. The dosimeters were irradiated with 100-kV and 6-MV X-ray energies. Dose enhancement produced from the interaction of X-rays with AuNps was calculated using spectrophotometric and cone-beam optical computed tomography scanning by quantitatively comparing the change in optical density and 3D datasets of the dosimetric measurements between the tissue-equivalent (TE) and TE/AuNps compartments. The interbatch and intrabatch variability and the postresponse stability of the dosimeters with AuNps were also assessed. Results: Radiation dose enhancement factors of 1.77 and 1.11 were obtained using 100-kV and 6-MV X-ray energies, respectively. The results of this study are in good agreement with previous observations; however, for the first time we provide direct experimental confirmation and 3D visualization of the radiosensitization effect of AuNps. The dosimeters with AuNps showed small (<3.5%) interbatch variability and negligible (<0.5%) intrabatch variability. Conclusions: The SMART dosimeter yields experimental insights concerning the spatial distributions and elevated dose in nanoparticle-enhanced radiation therapy, which cannot be performed using any of

  20. Cohort study analysis with a FORTRAN computer program.

    PubMed

    Coleman, M; Douglas, A; Hermon, C; Peto, J

    1986-03-01

    We describe the analysis of cohort study data with a standard FORTRAN program which should run on most computers. It provides a summary measure of the mortality (or incidence) rate ratio between the study cohort and some standard population, based either on person-years at risk or on proportional mortality, and adjusted for age, sex and calendar period; a test of the statistical significance of the ratio; and a set of observed death rates in the study cohort. Results may also be produced in a form suitable for use with GLIM. The analysis may be subdivided into a range of time intervals since each subject was first exposed to risk. The program provides for movement of subjects between different 'level-of-exposure' subgroups within the cohort, and for various methods of censoring. It allows considerable flexibility in data management, and is available with complete documentation and a worked example. The program should enable epidemiologists with little computing experience to carry out formal analysis of cohort studies.
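
    The summary measure described, in miniature: an age-stratified standardized mortality ratio (SMR) from person-years at risk and standard-population rates. The strata, rates, and counts below are invented for illustration; sex and calendar-period strata would be handled identically.

        # Age-stratified SMR: observed deaths over deaths expected from
        # standard-population rates applied to cohort person-years.
        observed = {'40-49': 12, '50-59': 30, '60-69': 55}              # cohort deaths
        person_years = {'40-49': 8000.0, '50-59': 6000.0, '60-69': 3500.0}
        std_rate = {'40-49': 0.0012, '50-59': 0.0040, '60-69': 0.0130}  # deaths per PY

        expected = sum(std_rate[a] * person_years[a] for a in observed)
        smr = sum(observed.values()) / expected
        print(round(expected, 1), round(smr, 2))   # 79.1 expected, SMR ~ 1.23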

  1. Computational analysis of contractility in engineered heart tissue.

    PubMed

    Mathews, Grant; Sondergaard, Claus; Jeffreys, Angela; Childs, William; Le, Bao Linh; Sahota, Amrit; Najibi, Skender; Nolta, Jan; Si, Ming-Sing

    2012-05-01

    Engineered heart tissue (EHT) is a potential therapy for heart failure and the basis of functional in vitro assays of novel cardiovascular treatments. Self-organizing EHT can be generated in fiber form, which makes the assessment of contractile function convenient with a force transducer. Contractile function is a key parameter of EHT performance. Analysis of EHT force data is often performed manually; however, this approach is time consuming, incomplete and subjective. Therefore, the purpose of this study was to develop a computer algorithm to efficiently and objectively analyze EHT force data. This algorithm incorporates data filtering, individual contraction detection and validation, inter/intracontractile analysis and intersample analysis. We found the algorithm to be accurate in contraction detection, validation and magnitude measurement as compared to human operators. The algorithm was efficient in processing hundreds of data acquisitions and was able to determine force-length curves, force-frequency relationships and compare various contractile parameters such as peak systolic force generation. We conclude that this computer algorithm is a key adjunct to the objective and efficient assessment of EHT contractile function. PMID:22361653
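
    In the spirit of the algorithm described (filtering, contraction detection, magnitude measurement), a minimal sketch on a synthetic force trace; the filter settings and thresholds are illustrative and do not reproduce the published algorithm's validation criteria.

        # Filter a synthetic force trace, detect contractions as peaks, and
        # report count, rate and magnitudes; settings are illustrative only.
        import numpy as np
        from scipy.signal import find_peaks, savgol_filter

        fs = 100.0                                  # Hz, sampling rate
        t = np.arange(0, 10, 1 / fs)
        force = 0.2 * np.sin(2 * np.pi * t).clip(min=0) ** 3   # ~1 Hz twitches
        force += np.random.default_rng(2).normal(0, 0.005, t.size)  # noise

        smooth = savgol_filter(force, window_length=21, polyorder=3)
        peaks, props = find_peaks(smooth, height=0.05, distance=int(0.5 * fs))
        print(len(peaks), len(peaks) / t[-1],       # count and rate (per s)
              props['peak_heights'].round(3))       # contraction magnitudes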

  2. Automated Patient Identification and Localization Error Detection Using 2-Dimensional to 3-Dimensional Registration of Kilovoltage X-Ray Setup Images

    SciTech Connect

    Lamb, James M.; Agazaryan, Nzhde; Low, Daniel A.

    2013-10-01

    Purpose: To determine whether kilovoltage x-ray projection radiation therapy setup images could be used to perform patient identification and detect gross errors in patient setup using a computer algorithm. Methods and Materials: Three patient cohorts treated using a commercially available image guided radiation therapy (IGRT) system that uses 2-dimensional to 3-dimensional (2D-3D) image registration were retrospectively analyzed: a group of 100 cranial radiation therapy patients, a group of 100 prostate cancer patients, and a group of 83 patients treated for spinal lesions. The setup images were acquired using fixed in-room kilovoltage imaging systems. In the prostate and cranial patient groups, localizations using image registration were performed between computed tomography (CT) simulation images from radiation therapy planning and setup x-ray images corresponding both to the same patient and to different patients. For the spinal patients, localizations were performed to the correct vertebral body, and to an adjacent vertebral body, using planning CTs and setup x-ray images from the same patient. An image similarity measure used by the IGRT system image registration algorithm was extracted from the IGRT system log files and evaluated as a discriminant for error detection. Results: A threshold value of the similarity measure could be chosen to separate correct and incorrect patient matches and correct and incorrect vertebral body localizations with excellent accuracy for these patient cohorts. A 10-fold cross-validation using linear discriminant analysis yielded misclassification probabilities of 0.000, 0.0045, and 0.014 for the cranial, prostate, and spinal cases, respectively. Conclusions: An automated measure of the image similarity between x-ray setup images and corresponding planning CT images could be used to perform automated patient identification and detection of localization errors in radiation therapy treatments.
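
    The error-detection step reduces to thresholding a one-dimensional similarity score. Below is a sketch with simulated score distributions (the actual similarity measure is internal to the IGRT system); with equal class variances and priors, linear discriminant analysis reduces to the midpoint between class means used here.

        # Threshold a registration similarity score to flag wrong-patient or
        # wrong-site setups; score distributions below are simulated.
        import numpy as np

        rng = np.random.default_rng(3)
        correct = rng.normal(0.85, 0.04, 100)     # scores for true matches
        incorrect = rng.normal(0.55, 0.07, 100)   # scores for mismatches

        # midpoint between class means (LDA with equal variances/priors)
        threshold = (correct.mean() + incorrect.mean()) / 2
        errors = np.sum(correct < threshold) + np.sum(incorrect >= threshold)
        print(round(threshold, 3), errors / 200)  # misclassification rate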

  3. Cost analysis for computer supported multiple-choice paper examinations

    PubMed Central

    Mandel, Alexander; Hörnlein, Alexander; Ifland, Marianus; Lüneburg, Edeltraud; Deckert, Jürgen; Puppe, Frank

    2011-01-01

    Introduction: Multiple-choice examinations are still fundamental for assessment in medical degree programs. In addition to content-related research, the optimization of the technical procedure is an important question. Medical examiners face three options: paper-based examinations with or without computer support, or completely electronic examinations. Critical aspects are the effort for formatting, the logistic effort during the actual examination, the quality, promptness and effort of the correction, the time for making the documents available for inspection by the students, and the statistical analysis of the examination results. Methods: For three semesters, a computer program for the input and formatting of MC questions in medical and other paper-based examinations has been used and continuously improved at Wuerzburg University. In the winter semester (WS) 2009/10 eleven, in the summer semester (SS) 2010 twelve, and in WS 2010/11 thirteen medical examinations were conducted with the program and automatically evaluated. For the last two semesters the remaining manual workload was recorded. Results: The cost of the formatting and the subsequent analysis, including adjustments of the analysis, of an average examination with about 140 participants and about 35 questions was 5-7 hours for exams without complications in WS 2009/10, about 2 hours in SS 2010, and about 1.5 hours in WS 2010/11. Including exams with complications, the average time was about 3 hours per exam in SS 2010 and 2.67 hours in WS 2010/11. Discussion: For conventional multiple-choice exams, the computer-based formatting and evaluation of paper-based exams offers a significant time reduction for lecturers in comparison with the manual correction of paper-based exams; compared with purely electronically conducted exams, it needs a much simpler technological infrastructure and fewer staff during the exam. PMID:22205913

  4. Computing the surveillance error grid analysis: procedure and examples.

    PubMed

    Kovatchev, Boris P; Wakeman, Christian A; Breton, Marc D; Kost, Gerald J; Louie, Richard F; Tran, Nam K; Klonoff, David C

    2014-07-01

    The surveillance error grid (SEG) analysis is a tool for analysis and visualization of blood glucose monitoring (BGM) errors, based on the opinions of 206 diabetes clinicians who rated 4 distinct treatment scenarios. Resulting from this large-scale inquiry is a matrix of 337 561 risk ratings, 1 for each pair of (reference, BGM) readings ranging from 20 to 580 mg/dl. The computation of the SEG is therefore complex and in need of automation. The SEG software introduced in this article automates the task of assigning a degree of risk to each data point for a set of measured and reference blood glucose values so that the data can be distributed into 8 risk zones. The software's 2 main purposes are to (1) distribute a set of BG Monitor data into 8 risk zones ranging from none to extreme and (2) present the data in a color coded display to promote visualization. Besides aggregating the data into 8 zones corresponding to levels of risk, the SEG computes the number and percentage of data pairs in each zone and the number/percentage of data pairs above/below the diagonal line in each zone, which are associated with BGM errors creating risks for hypo- or hyperglycemia, respectively. To illustrate the action of the SEG software we first present computer-simulated data stratified along error levels defined by ISO 15197:2013. This allows the SEG to be linked to this established standard. Further illustration of the SEG procedure is done with a series of previously published data, which reflect the performance of BGM devices and test strips under various environmental conditions. We conclude that the SEG software is a useful addition to the SEG analysis presented in this journal, developed to assess the magnitude of clinical risk from analytically inaccurate data in a variety of high-impact situations such as intensive care and disaster settings. PMID:25562887
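
    The core SEG operation is a table lookup: each (reference, measured) pair indexes a precomputed matrix of clinician-derived risk ratings. The sketch below uses a 561 × 561 placeholder surface over 20-580 mg/dl at 1 mg/dl steps in place of the published ratings, purely to show the mechanics.

        # Look up a risk rating for each (reference, measured) BG pair in a
        # precomputed matrix; the surface here is a stand-in, not the SEG.
        import numpy as np

        grid = np.arange(20, 581)                        # mg/dl axis, 561 values
        ref_g, bgm_g = np.meshgrid(grid, grid, indexing='ij')
        risk_matrix = np.abs(ref_g - bgm_g) / 100.0      # placeholder surface

        def seg_risk(reference, measured):
            i = np.clip(np.round(reference).astype(int) - 20, 0, 560)
            j = np.clip(np.round(measured).astype(int) - 20, 0, 560)
            return risk_matrix[i, j]

        ref = np.array([90.0, 180.0, 300.0])
        bgm = np.array([95.0, 140.0, 320.0])
        print(seg_risk(ref, bgm))   # ratings, binned into risk zones downstream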

  5. BEST3D user's manual: Boundary Element Solution Technology, 3-Dimensional Version 3.0

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The theoretical basis and programming strategy utilized in the construction of the computer program BEST3D (boundary element solution technology - three dimensional) and detailed input instructions are provided for the use of the program. An extensive set of test cases and sample problems is included in the manual and is also available for distribution with the program. The BEST3D program was developed under the 3-D Inelastic Analysis Methods for Hot Section Components contract (NAS3-23697). The overall objective of this program was the development of new computer programs allowing more accurate and efficient three-dimensional thermal and stress analysis of hot section components, i.e., combustor liners, turbine blades, and turbine vanes. The BEST3D program allows both linear and nonlinear analysis of static and quasi-static elastic problems and transient dynamic analysis for elastic problems. Calculation of elastic natural frequencies and mode shapes is also provided.

  6. Computer analysis of ring stiffened shells of revolution

    NASA Technical Reports Server (NTRS)

    Cohen, G. A.

    1973-01-01

    The equations and method of solution for a series of five compatible computer programs for structural analysis of axisymmetric shell structures are presented. These programs, designated as the SRA programs, apply to a common structural model but analyze different modes of structural response. They are: (1) linear asymmetric static response (SRA 100), (2) buckling of linearized asymmetric equilibrium states (SRA 101), (3) nonlinear axisymmetric static response (SRA 200), (4) buckling of nonlinear axisymmetric equilibrium states (SRA 201), and (5) vibrations about nonlinear axisymmetric equilibrium states (SRA 300).

  7. Computer compensation for NMR quantitative analysis of trace components

    SciTech Connect

    Nakayama, T.; Fujiwara, Y.

    1981-07-22

    A computer program has been written that determines trace components and separates overlapping components in multicomponent NMR spectra. This program uses the Lorentzian curve as a theoretical curve of NMR spectra. The coefficients of the Lorentzian are determined by the method of least squares. Systematic errors such as baseline/phase distortion are compensated and random errors are smoothed by taking moving averages, so that these processes contribute substantially to decreasing the accumulation time of spectral data. The accuracy of quantitative analysis of trace components has been improved by two significant figures. This program was applied to determining the abundance of 13C and the saponification degree of PVA.
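
    The fitting step in miniature: least-squares estimation of Lorentzian coefficients so that a small peak overlapping a major one can be quantified. The data are synthetic, and a faithful implementation would also model the baseline and phase distortions compensated for above.

        # Least-squares fit of two overlapping Lorentzian lines to a
        # synthetic spectrum; amplitudes/centers/widths are illustrative.
        import numpy as np
        from scipy.optimize import curve_fit

        def lorentzian(x, a1, x1, w1, a2, x2, w2):
            return (a1 * w1**2 / ((x - x1)**2 + w1**2)
                    + a2 * w2**2 / ((x - x2)**2 + w2**2))

        x = np.linspace(-5, 5, 500)
        true = lorentzian(x, 1.0, -0.3, 0.5, 0.08, 0.9, 0.4)  # major + trace
        y = true + np.random.default_rng(4).normal(0, 0.002, x.size)

        p0 = [1.0, 0.0, 0.5, 0.1, 1.0, 0.5]                   # initial guesses
        popt, _ = curve_fit(lorentzian, x, y, p0=p0)
        print(popt.round(3))  # recovers both components, incl. the trace peak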

  8. [Computation techniques in the conformational analysis of carbohydrates].

    PubMed

    Gebst, A G; Grachev, A A; Shashkov, A S; Nifant'ev, N E

    2007-01-01

    A growing number of modern studies of carbohydrates are devoted to the spatial mechanisms of their participation in cell recognition processes and to the directed design of inhibitors of these processes. Any progress in this field is impossible without the development of the theoretical conformational analysis of carbohydrates. In this review, we summarize literature data on the potential of different molecular-mechanics force fields, quantum mechanical methods, and molecular dynamics for studying the conformation of the glycosidic bond. The possibility of analyzing the reactivity of carbohydrates with computational techniques is also discussed in brief.

  9. Computer analysis of general linear networks using digraphs.

    NASA Technical Reports Server (NTRS)

    Mcclenahan, J. O.; Chan, S.-P.

    1972-01-01

    Investigation of the application of digraphs in analyzing general electronic networks, and development of a computer program based on a particular digraph method developed by Chen. The Chen digraph method is a topological method for solution of networks and serves as a shortcut when hand calculations are required. The advantage offered by this method of analysis is that the results are in symbolic form. It is limited, however, by the size of network that may be handled. Usually hand calculations become too tedious for networks larger than about five nodes, depending on how many elements the network contains. Direct determinant expansion for a five-node network is a very tedious process also.

  10. On the computer analysis of structures and mechanical systems

    NASA Technical Reports Server (NTRS)

    Bennett, B. E.

    1984-01-01

    The governing equations for the analysis of open branch-chain mechanical systems are developed in a form suitable for implementation in a general purpose finite element computer program. Lagrange's form of d'Alembert's principle is used to derive the system mass matrix and force vector. The generalized coordinates are selected as the unconstrained relative degrees of freedom giving the position and orientation of each slave link with respect to their master link. Each slave link may have from zero to six degrees of freedom relative to the reference frames of its master link. A strategy for automatic generation of the system mass matrix and force vector is described.

  11. Computer Tomography Analysis of Fastrac Composite Thrust Chamber Assemblies

    NASA Technical Reports Server (NTRS)

    Beshears, Ronald D.

    2000-01-01

    Computed tomography (CT) inspection has been integrated into the production process for NASA's Fastrac composite thrust chamber assemblies (TCAs). CT has been proven to be uniquely qualified to detect the known critical flaw for these nozzles: liner cracks that are adjacent to debonds between the liner and overwrap. CT is also being used as a process monitoring tool through analysis of low-density indications in the nozzle overwraps. 3D reconstruction of CT images to produce models of flawed areas is being used to give program engineers better insight into the location and nature of nozzle flaws.

  12. Computer-aided analysis of optical data link (Poster Paper)

    NASA Astrophysics Data System (ADS)

    Lin, Jizoo; Wang, Yuh-Diahn; Shih, Ming-Tang

    1992-10-01

    This paper presents a fiber optic simulation methodology for the design of digital lightwave links. Computer-aided analysis of a high-speed optical data link is important for a system designer to predict the system performance in advance. Accurate modeling and simulation contribute to the fundamental evaluation of the feasibility of optical communication integrated circuits. In this paper, the modeling of the optical source waveform, transmission fiber, photodetector, and timing recovery technique using a surface acoustic wave (SAW) filter is discussed. A SONET OC-3 transceiver is simulated as an example, and measured eye diagrams are compared with the simulation results.

  13. Verification and validation plan for reactor analysis computer codes

    SciTech Connect

    Toffer, H.; Crowe, R.D.; Schwinkendorf, K.N.; Pevey, R.E.

    1989-11-01

    This report presents a verification and validation (V&V) plan for reactor analysis computer codes used in Technical Specifications development and for other safety and production support calculations. This plan fulfills the commitments by Westinghouse Savannah River Company (WSRC) to the Department of Energy Savannah River (DOE-SR) as identified in a letter to R.E. Tiller (Reference 1). The plan stresses verification and validation by demonstrating successful application of the codes to predict reactor data, special measurements, and benchmarks. This is in compliance with the intent of the WSRC quality assurance requirements. Restructuring of software especially to achieve verification compliance is not recommended.

  15. Micro Computer Tomography for medical device and pharmaceutical packaging analysis.

    PubMed

    Hindelang, Florine; Zurbach, Raphael; Roggo, Yves

    2015-04-10

    Biomedical device and pharmaceutical product manufacturing are long processes facing global competition. As technology evolves with time, the level of quality, safety and reliability increases simultaneously. Micro Computer Tomography (Micro CT) is a tool allowing a deep investigation of products: it can contribute to quality improvement. This article presents the numerous applications of Micro CT for medical device and pharmaceutical packaging analysis. The samples investigated confirmed the suitability of Micro CT for verification of integrity, measurements, and defect detection in a non-destructive manner. PMID:25710902

  16. Programming Probabilistic Structural Analysis for Parallel Processing Computer

    NASA Technical Reports Server (NTRS)

    Sues, Robert H.; Chen, Heh-Chyun; Twisdale, Lawrence A.; Chamis, Christos C.; Murthy, Pappu L. N.

    1991-01-01

    The ultimate goal of this research program is to make Probabilistic Structural Analysis (PSA) computationally efficient and hence practical for the design environment by achieving large scale parallelism. The paper identifies the multiple levels of parallelism in PSA, identifies methodologies for exploiting this parallelism, describes the development of a parallel stochastic finite element code, and presents results of two example applications. It is demonstrated that speeds within five percent of those theoretically possible can be achieved. A special-purpose numerical technique, the stochastic preconditioned conjugate gradient method, is also presented and demonstrated to be extremely efficient for certain classes of PSA problems.

  17. Computational issue in the analysis of adaptive control systems

    NASA Technical Reports Server (NTRS)

    Kosut, Robert L.

    1989-01-01

    Adaptive systems under slow parameter adaptation can be analyzed by the method of averaging. This provides a means to assess the stability (and instability) properties of most adaptive systems, either continuous-time or (more importantly for practice) discrete-time, as well as providing an estimate of the region of attraction. Although the method of averaging is conceptually straightforward, even simple examples are well beyond hand calculations. Specific software tools are proposed which can provide the basis for a user-friendly environment to perform the necessary computations involved in the averaging analysis.

  18. Computer modeling for advanced life support system analysis.

    PubMed

    Drysdale, A

    1997-01-01

    This article discusses the equivalent mass approach to advanced life support system analysis, describes a computer model developed to use this approach, and presents early results from modeling the NASA JSC BioPlex. The model is built using an object-oriented approach and G2, a commercially available modeling package. Cost factor equivalencies are given for the Volosin scenarios. Plant data from NASA KSC and Utah State University (USU) are used, together with configuration data from the BioPlex design effort. Initial results focus on the importance of obtaining high plant productivity with a flight-like configuration. PMID:11540448

  19. Computer-aided analysis of a Superfund site

    SciTech Connect

    Qualheim, B.J. )

    1990-05-01

    The groundwater investigation at the Lawrence Livermore National Laboratory was initiated in 1983 after perchloroethylene (PCE) and trichloroethylene (TCE) were detected in the groundwater. Since that time, more than 300 monitor wells have been completed, logged, sampled, and hydraulically tested. In 1987, the Livermore site was placed on the Environmental Protection Agency's National Priority List (Superfund). The Livermore valley is relatively flat, underlain by a complex alluvial sedimentary basin drained by two intermittent streams. The subsurface consists of unconsolidated sand, gravel, silt, and clay with multiple water-bearing zones of relatively high permeability. The hydrogeologic system is characterized as leaky, with horizontal hydraulic communication of up to 800 ft and vertical communication between aquifers of up to 50 ft. Computer-based analysis of the site stratigraphy was used to analyze and characterize the subsurface. The authors used a computer-aided design and drafting (CADD) system to create two-dimensional slices of the subsurface. The slice program takes a subsurface slice at any specified depositional gradient and at any slice thickness. A slice displays the lithology type, unit thickness, depth of slice, and chemical analyses for volatile organic compounds (VOCs). The lateral continuity of subsurface channels was mapped for each depth slice. By stacking these maps, the authors interpreted a pseudo-three-dimensional representation of probable pathways for VOC movement in the subsurface. An enhanced computer graphics system was also used to map the movement of VOCs in the subsurface.

  20. Plans for a sensitivity analysis of bridge-scour computations

    USGS Publications Warehouse

    Dunn, David D.; Smith, Peter N.

    1993-01-01

    Plans for an analysis of the sensitivity of Level 2 bridge-scour computations are described. Cross-section data from 15 bridge sites in Texas are modified to reflect four levels of field effort ranging from no field surveys to complete surveys. Data from United States Geological Survey (USGS) topographic maps will be used to supplement incomplete field surveys. The cross sections are used to compute the water-surface profile through each bridge for several T-year recurrence-interval design discharges. The effect of determining the downstream energy grade-line slope from topographic maps is investigated by systematically varying the starting slope of each profile. The water-surface profile analyses are then used to compute potential scour resulting from each of the design discharges. The planned results will be presented in the form of exceedance-probability versus scour-depth plots with the maximum and minimum scour depths at each T-year discharge presented as error bars.

  1. Bilateral flight muscle activity predicts wing kinematics and 3-dimensional body orientation of locusts responding to looming objects.

    PubMed

    McMillan, Glyn A; Loessin, Vicky; Gray, John R

    2013-09-01

    We placed locusts in a wind tunnel using a loose tether design that allowed for motion in all three rotational degrees of freedom during presentation of a computer-generated looming disc. High-speed video allowed us to extract wing kinematics, abdomen position and 3-dimensional body orientation. Concurrent electromyographic (EMG) recordings monitored bilateral activity from the first basalar depressor muscles (m97) of the forewings, which are implicated in flight steering. Behavioural responses to a looming disc included cessation of flight (wings folded over the body), glides and active steering during sustained flight in addition to a decrease and increase in wingbeat frequency prior to and during, respectively, an evasive turn. Active steering involved shifts in bilateral m97 timing, wing asymmetries and whole-body rotations in the yaw (ψ), pitch (χ) and roll (η) planes. Changes in abdomen position and hindwing asymmetries occurred after turns were initiated. Forewing asymmetry and changes in η were most highly correlated with m97 spike latency. Correlations also increased as the disc approached, peaking prior to collision. On the inside of a turn, m97 spikes occurred earlier relative to forewing stroke reversal and bilateral timing corresponded to forewing asymmetry as well as changes in whole-body rotation. Double spikes in each m97 occurred most frequently at or immediately prior to the time the locusts turned, suggesting a behavioural significance. These data provide information on mechanisms underlying 3-dimensional flight manoeuvres and will be used to drive a closed loop flight simulator to study responses of motion-sensitive visual neurons during production of realistic behaviours. PMID:23737560

  3. Analysis of CERN computing infrastructure and monitoring data

    NASA Astrophysics Data System (ADS)

    Nieke, C.; Lassnig, M.; Menichetti, L.; Motesnitsalis, E.; Duellmann, D.

    2015-12-01

    Optimizing a computing infrastructure on the scale of the LHC requires a quantitative understanding of a complex network of many different resources and services. For this purpose the CERN IT department and the LHC experiments are collecting a multitude of logs and performance probes, which are already successfully used for short-term analysis (e.g. operational dashboards) within each group. The IT analytics working group has been created with the goal of bringing data sources from different services and on different abstraction levels together and implementing a suitable infrastructure for mid- to long-term statistical analysis. It further provides a forum for joint optimization across single-service boundaries and for the exchange of analysis methods and tools. To simplify access to the collected data, we implemented an automated repository for cleaned and aggregated data sources based on the Hadoop ecosystem. This contribution describes some of the challenges encountered, such as dealing with heterogeneous data formats and selecting an efficient storage format for MapReduce and external access, and describes the repository user interface. Using this infrastructure we were able to quantitatively analyze the relationship between the CPU/wall fraction, the latency/throughput constraints of network and disk, and the effective job throughput. In this contribution we will first describe the design of the shared analysis infrastructure and then present a summary of first analysis results from the combined data sources.

  4. Computational analysis of core promoters in the Drosophila genome

    PubMed Central

    Ohler, Uwe; Liao, Guo-chun; Niemann, Heinrich; Rubin, Gerald M

    2002-01-01

    Background The core promoter, a region of about 100 base-pairs flanking the transcription start site (TSS), serves as the recognition site for the basal transcription apparatus. Drosophila TSSs have generally been mapped by individual experiments; the low number of accurately mapped TSSs has limited analysis of promoter sequence motifs and the training of computational prediction tools. Results We identified TSS candidates for about 2,000 Drosophila genes by aligning 5' expressed sequence tags (ESTs) from cap-trapped cDNA libraries to the genome, while applying stringent criteria concerning coverage and 5'-end distribution. Examination of the sequences flanking these TSSs revealed the presence of well-known core promoter motifs such as the TATA box, the initiator and the downstream promoter element (DPE). We also define, and assess the distribution of, several new motifs prevalent in core promoters, including what appears to be a variant DPE motif. Among the prevalent motifs is the DNA-replication-related element DRE, recently shown to be part of the recognition site for the TBP-related factor TRF2. Our TSS set was then used to retrain the computational promoter predictor McPromoter, allowing us to improve the recognition performance to over 50% sensitivity and 40% specificity. We compare these computational results to promoter prediction in vertebrates. Conclusions There are relatively few recognizable binding sites for previously known general transcription factors in Drosophila core promoters. However, we identified several new motifs enriched in promoter regions. We were also able to significantly improve the performance of computational TSS prediction in Drosophila. PMID:12537576
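
    To make the motif analysis concrete: core-promoter elements such as the TATA box are typically located by scoring every window of a sequence against a position-weight matrix (log-odds against background). The per-position counts below are toy values, not the Drosophila-derived matrices used in the study.

        # Scan a sequence with a toy TATA-like position-weight matrix and
        # report the highest-scoring window; counts are illustrative only.
        import math

        counts = {  # toy per-position base counts for a TATAAA-like motif
            'A': [ 1, 95,  5, 90, 85, 80],
            'C': [ 2,  1,  2,  3,  5,  5],
            'G': [ 2,  1,  3,  3,  5, 10],
            'T': [95,  3, 90,  4,  5,  5],
        }

        def log_odds(base, i, bg=0.25, pseudo=1.0):
            total = sum(counts[b][i] for b in 'ACGT') + 4 * pseudo
            return math.log2((counts[base][i] + pseudo) / total / bg)

        def scan(seq):
            w = len(counts['A'])
            return [(i, sum(log_odds(seq[i + k], k) for k in range(w)))
                    for i in range(len(seq) - w + 1)]

        seq = 'GCGCTATAAAGGCC'
        best = max(scan(seq), key=lambda s: s[1])
        print(best)   # highest score at index 4, the TATAAA occurrence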

  5. Computer-aided strength analysis of the modernized freight wagon

    NASA Astrophysics Data System (ADS)

    Płaczek, M.; Wróbel, A.; Baier, A.

    2015-11-01

    In the paper, results of a computer-aided strength analysis of a modernized freight wagon based on the Finite Element Method are presented. A CAD model of the considered freight wagon was created and its strength was analysed in accordance with the standards describing the testing of such freight wagons. The model of the analysed freight wagon was then modernized by adding composite panels covering the inner surface of the vehicle body. The strength analysis was carried out once again and the obtained results were compared. This work was carried out in order to verify the influence of the composite panels on the strength of the freight car body and to estimate the possibility of reducing the thickness of the steel shell of the box in order to reduce the weight of the freight wagon.

  6. Dynamic analysis of spur gears using computer program DANST

    NASA Technical Reports Server (NTRS)

    Oswald, Fred B.; Lin, Hsiang H.; Liou, Chuen-Huei; Valco, Mark J.

    1993-01-01

    DANST is a computer program for static and dynamic analysis of spur gear systems. The program can be used for parametric studies to predict the effect on dynamic load and tooth bending stress of spur gears due to operating speed, torque, stiffness, damping, inertia, and tooth profile. DANST performs geometric modeling and dynamic analysis for low- or high-contact-ratio spur gears. DANST can simulate gear systems with contact ratio ranging from one to three. It was designed to be easy to use, and it is extensively documented by comments in the source code. This report describes the installation and use of DANST. It covers input data requirements and presents examples. The report also compares DANST predictions for gear tooth loads and bending stress to experimental and finite element results.

  7. Dynamic analysis of spur gears using computer program DANST

    SciTech Connect

    Oswald, F.B.; Lin, H.H.; Liou, Chuenheui; Valco, M.J.

    1993-06-01

    DANST is a computer program for static and dynamic analysis of spur gear systems. The program can be used for parametric studies to predict the effect on dynamic load and tooth bending stress of spur gears due to operating speed, torque, stiffness, damping, inertia, and tooth profile. DANST performs geometric modeling and dynamic analysis for low- or high-contact-ratio spur gears. DANST can simulate gear systems with contact ratio ranging from one to three. It was designed to be easy to use, and it is extensively documented by comments in the source code. This report describes the installation and use of DANST. It covers input data requirements and presents examples. The report also compares DANST predictions for gear tooth loads and bending stress to experimental and finite element results. 14 refs.

  8. Electronic Forms-Based Computing for Evidentiary Analysis

    SciTech Connect

    Luse, Andy; Mennecke, Brian; Townsend, Anthony

    2009-07-01

    The paperwork associated with evidentiary collection and analysis is a highly repetitive and time-consuming process which often involves duplication of work and can frequently result in documentary errors. Electronic entry of evidence-related information can facilitate greater accuracy and less time spent on data entry. This manuscript describes a general framework for the implementation of an electronic tablet-based system for evidentiary processing. This framework is then utilized in the design and implementation of an electronic tablet-based evidentiary input prototype system developed for use by forensic laboratories which serves as a verification of the proposed framework. The manuscript concludes with a discussion of implications and recommendations for the implementation and use of tablet-based computing for evidence analysis.

  9. Computer-aided communication satellite system analysis and optimization

    NASA Technical Reports Server (NTRS)

    Stagl, T. W.; Morgan, N. H.; Morley, R. E.; Singh, J. P.

    1973-01-01

    The capabilities and limitations of the various published computer programs for fixed/broadcast communication satellite system synthesis and optimization are discussed. A Satellite Telecommunication Analysis and Modeling Program (STAMP) for costing and sensitivity analysis work in the application of communication satellites to educational development is given. The modifications made to STAMP include: extension of the six-beam capability to eight; addition of the generation of multiple beams from a single reflector system with an array of feeds; improved system costing that reflects the time value of money, growth in the earth terminal population with time, and various measures of system reliability; inclusion of a model for scintillation at microwave frequencies in the communication link loss model; and an updated technological environment.

  10. A computer program (MACPUMP) for interactive aquifer-test analysis

    USGS Publications Warehouse

    Day-Lewis, F. D.; Person, M.A.; Konikow, L.F.

    1995-01-01

    This report introduces MACPUMP (Version 1.0), an aquifer-test-analysis package for use with Macintosh computers. The report outlines the input- data format, describes the solutions encoded in the program, explains the menu-items, and offers a tutorial illustrating the use of the program. The package reads list-directed aquifer-test data from a file, plots the data to the screen, generates and plots type curves for several different test conditions, and allows mouse-controlled curve matching. MACPUMP features pull-down menus, a simple text viewer for displaying data-files, and optional on-line help windows. This version includes the analytical solutions for nonleaky and leaky confined aquifers, using both type curves and straight-line methods, and for the analysis of single-well slug tests using type curves. An executable version of the code and sample input data sets are included on an accompanying floppy disk.
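
    The nonleaky confined-aquifer solution encoded in such packages is the Theis equation, s = (Q/4πT)·W(u) with u = r²S/(4Tt), where W is the well function (the exponential integral E1). A sketch generating type-curve points with illustrative parameter values:

        # Theis drawdown for a nonleaky confined aquifer; parameter values
        # are illustrative, not from any particular aquifer test.
        import numpy as np
        from scipy.special import exp1   # well function W(u) = E1(u)

        Q = 0.01      # pumping rate, m^3/s
        T = 1.0e-3    # transmissivity, m^2/s
        S = 1.0e-4    # storativity, dimensionless
        r = 30.0      # distance to observation well, m

        t = np.logspace(1, 5, 5)                 # 10 s .. ~1 day
        u = r**2 * S / (4 * T * t)
        s = Q / (4 * np.pi * T) * exp1(u)
        print(np.column_stack([t, s]))           # drawdown vs time for matching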

  11. Inequities in the Computer Classroom: An Analysis of Two Computer Courses.

    ERIC Educational Resources Information Center

    Alspach, Phyllis A.

    This study analyzed the enrollment of two computer classes at a public high school in northern Indiana to see if there was any computer inequity. The two classes examined--an introduction to computers course and a computer programming course--were studied over a period of four years. The sample consisted of 388 students in four years of the…

  12. Computational analyses of arteriovenous malformations in neuroimaging.

    PubMed

    Di Ieva, Antonio; Boukadoum, Mounir; Lahmiri, Salim; Cusimano, Michael D

    2015-01-01

    Computational models have been investigated for the analysis of the physiopathology and morphology of arteriovenous malformation (AVM) in recent years. Special emphasis has been given to image fusion in multimodal imaging and 3-dimensional rendering of the AVM, with the aim to improve the visualization of the lesion (for diagnostic purposes) and the selection of the nidus (for therapeutic aims, like the selection of the region of interest for the gamma knife radiosurgery plan). Searching for new diagnostic and prognostic neuroimaging biomarkers, fractal-based computational models have been proposed for describing and quantifying the angioarchitecture of the nidus. Computational modeling in the AVM field offers promising tools of analysis and requires a strict collaboration among neurosurgeons, neuroradiologists, clinicians, computer scientists, and engineers. We present here some updated state-of-the-art exemplary cases in the field, focusing on recent neuroimaging computational modeling with clinical relevance, which might offer useful clinical tools for the management of AVMs in the future.
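
    One of the fractal-based quantifications mentioned can be sketched as box counting on a segmented binary image: count occupied boxes at several scales and take the slope of log N against log(1/size). The synthetic line pattern below stands in for a segmented nidus.

        # Box-counting dimension estimate on a binary image; the synthetic
        # pattern stands in for a segmented vascular structure.
        import numpy as np

        def box_count(img, size):
            h, w = img.shape
            count = 0
            for i in range(0, h, size):
                for j in range(0, w, size):
                    if img[i:i + size, j:j + size].any():
                        count += 1
            return count

        img = np.zeros((256, 256), dtype=bool)
        img[128, :] = True                        # crude synthetic "vessels"
        img[:, 128] = True
        img[np.arange(256), np.arange(256)] = True

        sizes = np.array([2, 4, 8, 16, 32])
        counts = np.array([box_count(img, s) for s in sizes])
        slope, _ = np.polyfit(np.log(1.0 / sizes), np.log(counts), 1)
        print(round(slope, 2))   # ~1.0 for these line-like structures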

  13. Computational Flow Analysis of a Left Ventricular Assist Device

    NASA Technical Reports Server (NTRS)

    Kiris, Cetin; Kwak, Dochan; Benkowski, Robert

    1995-01-01

    Computational fluid dynamics has been developed to a level where it has become an indispensable part of aerospace research and design. Technology developed for aerospace applications can also be utilized for the benefit of human health. For example, a flange-to-flange rocket engine fuel-pump simulation includes the rotating and non-rotating components: the flow straighteners, the impeller, and diffusers. A Ventricular Assist Device developed by NASA Johnson Space Center and Baylor College of Medicine has a design similar to a rocket engine fuel pump in that it also consists of a flow straightener, an impeller, and a diffuser. Accurate and detailed knowledge of the flowfield obtained by incompressible flow calculations can be greatly beneficial to designers in their effort to reduce the cost and improve the reliability of these devices. In addition to the geometric complexities, a variety of flow phenomena are encountered in biofluids. These include turbulent boundary layer separation, wakes, transition, tip vortex resolution, three-dimensional effects, and Reynolds number effects. In order to increase the role of Computational Fluid Dynamics (CFD) in the design process, the CFD analysis tools must be evaluated and validated so that designers gain confidence in their use. The incompressible flow solver, INS3D, has been applied to the flow inside liquid rocket engine turbopump components and extensively validated. This paper details how the computational flow simulation capability developed for liquid rocket engine pump component analysis has been applied to the Left Ventricular Assist Device being developed jointly by NASA JSC and Baylor College of Medicine.

  14. Computational modeling and analysis of thermoelectric properties of nanoporous silicon

    SciTech Connect

    Li, H.; Yu, Y.; Li, G.

    2014-03-28

    In this paper, thermoelectric properties of nanoporous silicon are modeled and studied by using a computational approach. The computational approach combines a quantum non-equilibrium Green's function (NEGF) coupled with the Poisson equation for electrical transport analysis, a phonon Boltzmann transport equation (BTE) for phonon thermal transport analysis and the Wiedemann-Franz law for calculating the electronic thermal conductivity. By solving the NEGF/Poisson equations self-consistently using a finite difference method, the electrical conductivity σ and Seebeck coefficient S of the material are numerically computed. The BTE is solved by using a finite volume method to obtain the phonon thermal conductivity k_p, and the Wiedemann-Franz law is used to obtain the electronic thermal conductivity k_e. The figure of merit of nanoporous silicon is calculated by ZT = S²σT/(k_p + k_e). The effects of doping density, porosity, temperature, and nanopore size on thermoelectric properties of nanoporous silicon are investigated. It is confirmed that nanoporous silicon has significantly higher thermoelectric energy conversion efficiency than its nonporous counterpart. Specifically, this study shows that, with an n-type doping density of 10²⁰ cm⁻³, a porosity of 36% and nanopore size of 3 nm × 3 nm, the figure of merit ZT can reach 0.32 at 600 K. The results also show that the degradation of electrical conductivity of nanoporous Si due to the inclusion of nanopores is compensated by the large reduction in the phonon thermal conductivity and increase of absolute value of the Seebeck coefficient, resulting in a significantly improved ZT.
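
    The figure-of-merit bookkeeping in the paragraph above is compact enough to sketch directly; the snippet combines the Wiedemann-Franz estimate of k_e with assumed transport coefficients (illustrative inputs, not values from the paper).

    ```python
    LORENZ = 2.44e-8  # Wiedemann-Franz Lorenz number [W*ohm/K^2]

    def figure_of_merit(sigma, seebeck, k_phonon, temperature):
        """ZT = S^2 * sigma * T / (k_p + k_e), with k_e = L * sigma * T."""
        k_electron = LORENZ * sigma * temperature
        return seebeck**2 * sigma * temperature / (k_phonon + k_electron)

    # Illustrative values only: sigma [S/m], S [V/K], k_p [W/m/K], T [K]
    print(figure_of_merit(sigma=1.0e5, seebeck=2.0e-4, k_phonon=2.0, temperature=600.0))
    ```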

  15. Uncertainty analysis for computer model projections of hurricane losses.

    PubMed

    Iman, Ronald L; Johnson, Mark E; Watson, Charles C

    2005-10-01

    Projecting losses associated with hurricanes is a complex and difficult undertaking that is fraught with uncertainties. Hurricane Charley, which struck southwest Florida on August 13, 2004, illustrates the uncertainty of forecasting damages from these storms. Due to shifts in the track and the rapid intensification of the storm, real-time estimates grew from 2 to 3 billion dollars in losses late on August 12 to a peak of 50 billion dollars for a brief time as the storm appeared to be headed for the Tampa Bay area. The storm hit the resort areas of Charlotte Harbor near Punta Gorda and then went on to Orlando in the central part of the state, with early poststorm estimates converging on a damage estimate in the 28 to 31 billion dollars range. Comparable damage to central Florida had not been seen since Hurricane Donna in 1960. The Florida Commission on Hurricane Loss Projection Methodology (FCHLPM) has recognized the role of computer models in projecting losses from hurricanes. The FCHLPM established a professional team to perform onsite (confidential) audits of computer models developed by several different companies in the United States that seek to have their models approved for use in insurance rate filings in Florida. The team's members represent the fields of actuarial science, computer science, meteorology, statistics, and wind and structural engineering. An important part of the auditing process requires uncertainty and sensitivity analyses to be performed with the applicant's proprietary model. To influence future such analyses, an uncertainty and sensitivity analysis has been completed for loss projections arising from use of a Holland B parameter hurricane wind field model. Uncertainty analysis quantifies the expected percentage reduction in the uncertainty of wind speed and loss that is attributable to each of the input variables.
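
    For context, the Holland B parameter model referenced above reduces, in its simplified cyclostrophic form (Coriolis neglected), to a one-line wind profile. The sketch below uses assumed storm parameters, not the proprietary models audited by the FCHLPM.

    ```python
    import numpy as np

    RHO_AIR = 1.15  # air density [kg/m^3], a typical value

    def holland_gradient_wind(r, delta_p, r_max, B):
        """Cyclostrophic Holland (1980) wind speed at radius r.

        r: radius from storm center [m]; delta_p: central pressure deficit [Pa]
        r_max: radius of maximum winds [m]; B: Holland shape parameter [-]
        """
        x = (r_max / np.asarray(r, dtype=float)) ** B
        return np.sqrt(B * delta_p * x * np.exp(-x) / RHO_AIR)

    # e.g. a 60 hPa pressure deficit, 30 km radius of maximum winds, B = 1.4:
    v = holland_gradient_wind([20e3, 30e3, 60e3], delta_p=6000.0, r_max=30e3, B=1.4)
    ```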

  16. Computational design analysis for deployment of cardiovascular stents

    NASA Astrophysics Data System (ADS)

    Tammareddi, Sriram; Sun, Guangyong; Li, Qing

    2010-06-01

    Cardiovascular disease has become a major global healthcare problem. As one of the relatively new medical devices, stents offer a minimally-invasive surgical strategy to improve the quality of life for numerous cardiovascular disease patients. One of the key associated issues has been to understand the effect of stent structures on deployment behaviour. This paper aims to develop a computational model for exploring the biomechanical responses to changes in stent geometrical parameters, namely the strut thickness and cross-link width of the Palmaz-Schatz stent. Explicit 3D dynamic finite element analysis was carried out to explore the sensitivity of these geometrical parameters on deployment performance, such as dog-boning, fore-shortening, and stent deformation over the load cycle. It has been found that an increase in stent thickness causes a sizeable rise in the load required to deform the stent to its target diameter, whilst reducing maximum dog-boning in the stent. Increasing the cross-link width required no change in the load needed to deform the stent to its target diameter and showed no apparent correlation with dog-boning, but fore-shortening increased with cross-link width. The computational modelling and analysis presented herein proves to be an effective way to refine or optimise the design of stent structures.
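
    Dog-boning and fore-shortening are simple geometric ratios over the deployed stent shape. The sketch below uses one common set of definitions (the paper's exact definitions may differ) with hypothetical dimensions.

    ```python
    def dogboning(d_distal, d_central):
        """Relative flare of the stent ends versus its center (one common definition)."""
        return (d_distal - d_central) / d_distal

    def foreshortening(length_initial, length_deployed):
        """Fractional axial shortening of the stent on expansion."""
        return (length_initial - length_deployed) / length_initial

    # Hypothetical deployed dimensions in mm:
    print(dogboning(d_distal=3.4, d_central=3.0))                    # ~0.12
    print(foreshortening(length_initial=16.0, length_deployed=15.1)) # ~0.06
    ```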

  17. Upgrades of Two Computer Codes for Analysis of Turbomachinery

    NASA Technical Reports Server (NTRS)

    Chima, Rodrick V.; Liou, Meng-Sing

    2005-01-01

    Major upgrades have been made in two of the programs reported in "Five Computer Codes for Analysis of Turbomachinery". The affected programs are: Swift -- a code for three-dimensional (3D) multiblock analysis; and TCGRID, which generates a 3D grid used with Swift. Originally utilizing only a central-differencing scheme for numerical solution, Swift was augmented by addition of two upwind schemes that give greater accuracy but take more computing time. Other improvements in Swift include addition of a shear-stress-transport turbulence model for better prediction of adverse pressure gradients, addition of an H-grid capability for flexibility in modeling flows in pumps and ducts, and modification to enable simultaneous modeling of hub and tip clearances. Improvements in TCGRID include modifications to enable generation of grids for more complicated flow paths and addition of an option to generate grids compatible with the ADPAC code used at NASA and in industry. For both codes, new test cases were developed and documentation was updated. Both codes were converted to Fortran 90, with dynamic memory allocation. Both codes were also modified for ease of use in both UNIX and Windows operating systems.

  18. Multiscale analysis of nonlinear systems using computational homology

    SciTech Connect

    Konstantin Mischaikow, Rutgers University/Georgia Institute of Technology; Michael Schatz, Georgia Institute of Technology; William Kalies, Florida Atlantic University; Thomas Wanner, George Mason University

    2010-05-19

    This is a collaborative project between the principal investigators. However, as is to be expected, different PIs have greater focus on different aspects of the project. This report lists these major directions of research which were pursued during the funding period: (1) Computational Homology in Fluids - For the computational homology effort in thermal convection, the focus of the work during the first two years of the funding period included: (1) A clear demonstration that homology can sensitively detect the presence or absence of an important flow symmetry, (2) An investigation of homology as a probe for flow dynamics, and (3) The construction of a new convection apparatus for probing the effects of large aspect ratio. (2) Computational Homology in Cardiac Dynamics - We have initiated an effort to test the use of homology in characterizing data from both laboratory experiments and numerical simulations of arrhythmia in the heart. Recently, the use of high speed, high sensitivity digital imaging in conjunction with voltage sensitive fluorescent dyes has enabled researchers to visualize electrical activity on the surface of cardiac tissue, both in vitro and in vivo. (3) Magnetohydrodynamics - A new research direction is to use computational homology to analyze results of large scale simulations of 2D turbulence in the presence of magnetic fields. Such simulations are relevant to the dynamics of black hole accretion disks. The complex flow patterns from simulations exhibit strong qualitative changes as a function of magnetic field strength. Efforts to characterize the pattern changes using Fourier methods and wavelet analysis have been unsuccessful. (4) Granular Flow - Two experts in the area of granular media are studying 2D model experiments of earthquake dynamics where the stress fields can be measured; these stress fields form complex patterns of 'force chains' that may be amenable to analysis using computational homology. (5) Microstructure Characterization

  20. Computational analysis of bacterial RNA-Seq data

    PubMed Central

    McClure, Ryan; Balasubramanian, Divya; Sun, Yan; Bobrovskyy, Maksym; Sumby, Paul; Genco, Caroline A.; Vanderpool, Carin K.; Tjaden, Brian

    2013-01-01

    Recent advances in high-throughput RNA sequencing (RNA-seq) have enabled tremendous leaps forward in our understanding of bacterial transcriptomes. However, computational methods for analysis of bacterial transcriptome data have not kept pace with the large and growing data sets generated by RNA-seq technology. Here, we present new algorithms, specific to bacterial gene structures and transcriptomes, for analysis of RNA-seq data. The algorithms are implemented in an open source software system called Rockhopper that supports various stages of bacterial RNA-seq data analysis, including aligning sequencing reads to a genome, constructing transcriptome maps, quantifying transcript abundance, testing for differential gene expression, determining operon structures and visualizing results. We demonstrate the performance of Rockhopper using 2.1 billion sequenced reads from 75 RNA-seq experiments conducted with Escherichia coli, Neisseria gonorrhoeae, Salmonella enterica, Streptococcus pyogenes and Xenorhabdus nematophila. We find that the transcriptome maps generated by our algorithms are highly accurate when compared with focused experimental data from E. coli and N. gonorrhoeae, and we validate our system’s ability to identify novel small RNAs, operons and transcription start sites. Our results suggest that Rockhopper can be used for efficient and accurate analysis of bacterial RNA-seq data, and that it can aid with elucidation of bacterial transcriptomes. PMID:23716638
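
    As a small illustration of one stage in such a pipeline, the sketch below applies upper-quartile normalization to a raw count matrix so that samples of different sequencing depth become comparable; this is a generic example, not Rockhopper's actual implementation.

    ```python
    import numpy as np

    def upper_quartile_normalize(counts):
        """Scale each sample (column) by its upper quartile of nonzero gene counts.

        counts: genes x samples array of raw read counts.
        """
        counts = np.asarray(counts, dtype=float)
        uq = np.array([np.percentile(col[col > 0], 75) for col in counts.T])
        factors = uq / uq.mean()   # keep the overall scale comparable
        return counts / factors

    raw = np.array([[10, 25], [200, 480], [0, 3], [55, 120]])
    normalized = upper_quartile_normalize(raw)
    ```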

  1. Computational analysis of bacterial RNA-Seq data.

    PubMed

    McClure, Ryan; Balasubramanian, Divya; Sun, Yan; Bobrovskyy, Maksym; Sumby, Paul; Genco, Caroline A; Vanderpool, Carin K; Tjaden, Brian

    2013-08-01

    Recent advances in high-throughput RNA sequencing (RNA-seq) have enabled tremendous leaps forward in our understanding of bacterial transcriptomes. However, computational methods for analysis of bacterial transcriptome data have not kept pace with the large and growing data sets generated by RNA-seq technology. Here, we present new algorithms, specific to bacterial gene structures and transcriptomes, for analysis of RNA-seq data. The algorithms are implemented in an open source software system called Rockhopper that supports various stages of bacterial RNA-seq data analysis, including aligning sequencing reads to a genome, constructing transcriptome maps, quantifying transcript abundance, testing for differential gene expression, determining operon structures and visualizing results. We demonstrate the performance of Rockhopper using 2.1 billion sequenced reads from 75 RNA-seq experiments conducted with Escherichia coli, Neisseria gonorrhoeae, Salmonella enterica, Streptococcus pyogenes and Xenorhabdus nematophila. We find that the transcriptome maps generated by our algorithms are highly accurate when compared with focused experimental data from E. coli and N. gonorrhoeae, and we validate our system's ability to identify novel small RNAs, operons and transcription start sites. Our results suggest that Rockhopper can be used for efficient and accurate analysis of bacterial RNA-seq data, and that it can aid with elucidation of bacterial transcriptomes.

  2. Open Rotor Computational Aeroacoustic Analysis with an Immersed Boundary Method

    NASA Technical Reports Server (NTRS)

    Brehm, Christoph; Barad, Michael F.; Kiris, Cetin C.

    2016-01-01

    Reliable noise prediction capabilities are essential to enable novel fuel efficient open rotor designs that can meet the community and cabin noise standards. Toward this end, immersed boundary methods have reached a level of maturity where more and more complex flow problems can be tackled with this approach. This paper demonstrates that our higher-order immersed boundary method provides the ability for aeroacoustic analysis of wake-dominated flow fields generated by a contra-rotating open rotor. This is a first-of-a-kind aeroacoustic simulation of an open rotor propulsion system employing an immersed boundary method. In addition to discussing the methodologies of how to apply the immersed boundary method to this moving boundary problem, we will provide a detailed validation of the aeroacoustic analysis approach employing the Launch Ascent and Vehicle Aerodynamics (LAVA) solver. Two free-stream Mach numbers, M=0.2 and M=0.78, are considered in this analysis, based on the nominal take-off and cruise flow conditions. The simulation data is compared to available experimental data and other computational results employing more conventional CFD methods. Spectral analysis is used to determine the dominant wave propagation pattern in the acoustic near-field.
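
    The spectral analysis step can be pictured with a minimal single-sided power spectrum; the sketch below recovers the dominant tone in a synthetic pressure trace and stands in for, but is not, the LAVA post-processing chain.

    ```python
    import numpy as np

    def dominant_tone(pressure, sample_rate):
        """Frequency of the strongest spectral peak in a pressure time series."""
        p = pressure - pressure.mean()
        window = np.hanning(len(p))                    # reduce spectral leakage
        spectrum = np.abs(np.fft.rfft(p * window)) ** 2
        freqs = np.fft.rfftfreq(len(p), d=1.0 / sample_rate)
        return freqs[np.argmax(spectrum)]

    # Synthetic check: a 2.4 kHz tone (e.g. a blade-passing frequency) in noise.
    t = np.arange(0, 0.5, 1.0 / 48000.0)
    signal = np.sin(2 * np.pi * 2400 * t) + 0.3 * np.random.randn(t.size)
    print(dominant_tone(signal, 48000))                # ~2400 Hz
    ```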

  3. The future of computer-aided sperm analysis

    PubMed Central

    Mortimer, Sharon T; van der Horst, Gerhard; Mortimer, David

    2015-01-01

    Computer-aided sperm analysis (CASA) technology was developed in the late 1980s for analyzing sperm movement characteristics or kinematics and has been highly successful in enabling this field of research. CASA has also been used with great success for measuring semen characteristics such as sperm concentration and proportions of progressive motility in many animal species, including wide application in domesticated animal production laboratories and reproductive toxicology. However, attempts to use CASA for human clinical semen analysis have largely met with poor success due to the inherent difficulties presented by many human semen samples caused by sperm clumping and heavy background debris that, until now, have precluded accurate digital image analysis. The authors review the improved capabilities of two modern CASA platforms (Hamilton Thorne CASA-II and Microptic SCA6) and consider their current and future applications with particular reference to directing our focus towards using this technology to assess functional rather than simple descriptive characteristics of spermatozoa. Specific requirements for validating CASA technology as a semi-automated system for human semen analysis are also provided, with particular reference to the accuracy and uncertainty of measurement expected of a robust medical laboratory test for implementation in clinical laboratories operating according to modern accreditation standards. PMID:25926614

  4. Acromiohumeral Distance and 3-Dimensional Scapular Position Change After Overhead Muscle Fatigue

    PubMed Central

    Maenhout, Annelies; Dhooge, Famke; Van Herzeele, Maarten; Palmans, Tanneke; Cools, Ann

    2015-01-01

    Context: Muscle fatigue due to repetitive and prolonged overhead sports activity is considered an important factor contributing to impingement-related rotator cuff pathologic conditions in overhead athletes. The evidence on scapular and glenohumeral kinematic changes after fatigue is contradictory and prohibits conclusions about how shoulder muscle fatigue affects acromiohumeral distance. Objective: To investigate the effect of a fatigue protocol resembling overhead sports activity on acromiohumeral distance and 3-dimensional scapular position in overhead athletes. Design: Cross-sectional study. Setting: Institutional laboratory. Patients or Other Participants: A total of 29 healthy recreational overhead athletes (14 men, 15 women; age = 22.23 ± 2.82 years, height = 178.3 ± 7.8 cm, mass = 71.6 ± 9.5 kg). Intervention(s): The athletes were tested before and after a shoulder muscle-fatiguing protocol. Main Outcome Measure(s): Acromiohumeral distance was measured using ultrasound, and scapular position was determined with an electromagnetic motion-tracking system. Both measurements were performed at 3 elevation positions (0°, 45°, and 60° of abduction). We used a 3-factor mixed model for data analysis. Results: After fatigue, the acromiohumeral distance increased when the upper extremity was actively positioned at 45° (Δ = 0.78 ± 0.24 mm, P = .002) or 60° (Δ = 0.58 ± 0.23 mm, P = .02) of abduction. Scapular position changed after fatigue to a more externally rotated position at 45° (Δ = 4.97° ± 1.13°, P < .001) and 60° (Δ = 4.61° ± 1.90°, P = .001) of abduction, a more upwardly rotated position at 45° (Δ = 6.10° ± 1.30°, P < .001) and 60° (Δ = 7.20° ± 1.65°, P < .001) of abduction, and a more posteriorly tilted position at 0°, 45°, and 60° of abduction (Δ = 1.98° ± 0.41°, P < .001). Conclusions: After a fatiguing protocol, we found changes in acromiohumeral distance and scapular position that corresponded with an impingement

  5. DETECTORS AND EXPERIMENTAL METHODS: Decay vertex reconstruction and 3-dimensional lifetime determination at BESIII

    NASA Astrophysics Data System (ADS)

    Xu, Min; He, Kang-Lin; Zhang, Zi-Ping; Wang, Yi-Fang; Bian, Jian-Ming; Cao, Guo-Fu; Cao, Xue-Xiang; Chen, Shen-Jian; Deng, Zi-Yan; Fu, Cheng-Dong; Gao, Yuan-Ning; Han, Lei; Han, Shao-Qing; He, Miao; Hu, Ji-Feng; Hu, Xiao-Wei; Huang, Bin; Huang, Xing-Tao; Jia, Lu-Kui; Ji, Xiao-Bin; Li, Hai-Bo; Li, Wei-Dong; Liang, Yu-Tie; Liu, Chun-Xiu; Liu, Huai-Min; Liu, Ying; Liu, Yong; Luo, Tao; Lü, Qi-Wen; Ma, Qiu-Mei; Ma, Xiang; Mao, Ya-Jun; Mao, Ze-Pu; Mo, Xiao-Hu; Ning, Fei-Peng; Ping, Rong-Gang; Qiu, Jin-Fa; Song, Wen-Bo; Sun, Sheng-Sen; Sun, Xiao-Dong; Sun, Yong-Zhao; Tian, Hao-Lai; Wang, Ji-Ke; Wang, Liang-Liang; Wen, Shuo-Pin; Wu, Ling-Hui; Wu, Zhi; Xie, Yu-Guang; Yan, Jie; Yan, Liang; Yao, Jian; Yuan, Chang-Zheng; Yuan, Ye; Zhang, Chang-Chun; Zhang, Jian-Yong; Zhang, Lei; Zhang, Xue-Yao; Zhang, Yao; Zheng, Yang-Heng; Zhu, Yong-Sheng; Zou, Jia-Heng

    2009-06-01

    This paper focuses mainly on the vertex reconstruction of resonance particles with a relatively long lifetime such as K0S, Λ, as well as on lifetime measurements using a 3-dimensional fit. The kinematic constraints between the production and decay vertices and the decay vertex fitting algorithm based on the least squares method are both presented. Reconstruction efficiencies including experimental resolutions are discussed. The results and systematic errors are calculated based on a Monte Carlo simulation.
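
    The lifetime determination rests on the relation L = βγcτ between flight distance and proper time, so τ = Lm/(pc). Below is a minimal sketch with illustrative inputs, not BESIII code.

    ```python
    C = 299792458.0  # speed of light [m/s]

    def proper_lifetime(decay_length_m, p_gev, mass_gev):
        """tau = L * m / (p * c), from L = beta*gamma*c*tau.

        decay_length_m: reconstructed flight distance [m]
        p_gev: momentum [GeV/c]; mass_gev: mass [GeV/c^2]
        """
        return decay_length_m * mass_gev / (p_gev * C)

    # A K0_S (m = 0.4976 GeV/c^2) with p = 1.0 GeV/c decaying after 5 cm
    # gives ~0.8e-10 s, close to the known K0_S lifetime:
    print(proper_lifetime(0.05, 1.0, 0.4976))
    ```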

  6. Energy Sources of the Dominant Frequency Dependent 3-dimensional Atmospheric Modes

    NASA Technical Reports Server (NTRS)

    Schubert, S.

    1985-01-01

    The energy sources and sinks associated with the zonally asymmetric winter mean flow are investigated as part of an on-going study of atmospheric variability. Distinctly different horizontal structures for the long, intermediate and short time scale atmospheric variations were noted. In previous observations, the 3-dimensional structure of the fluctuations is investigated and the relative roles of barotropic and baroclinic terms are assessed.

  7. A computational framework for exploratory data analysis in biomedical imaging

    NASA Astrophysics Data System (ADS)

    Wismueller, Axel

    2009-02-01

    Purpose: To develop, test, and evaluate a novel unsupervised machine learning method for the analysis of multidimensional biomedical imaging data. Methods: The Exploration Machine (XOM) is introduced as a method for computing low-dimensional representations of high-dimensional observations. XOM systematically inverts functional and structural components of topology-preserving mappings. Thus, it can contribute to both structure-preserving visualization and data clustering. We applied XOM to the analysis of microarray imaging data of gene expression profiles in Saccharomyces cerevisiae, and to model-free analysis of functional brain MRI data by unsupervised clustering. For both applications, we performed quantitative comparisons to results obtained by established algorithms. Results: Genome data: Absolute (relative) Sammon error values were 2.21 × 10³ (1.00) for XOM, 2.45 × 10³ (1.11) for Sammon's mapping, 2.77 × 10³ (1.25) for Locally Linear Embedding (LLE), 2.82 × 10³ (1.28) for PCA, 3.36 × 10³ (1.52) for Isomap, and 10.19 × 10³ (4.61) for Self-Organizing Map (SOM). Functional MRI data: Areas under ROC curves for detection of task-related brain activation were 0.984 ± 0.03 for XOM, 0.983 ± 0.02 for Minimal-Free-Energy VQ, and 0.979 ± 0.02 for SOM. Conclusion: We introduce the Exploration Machine as a novel machine learning method for the analysis of multidimensional biomedical imaging data. XOM can be successfully applied to microarray gene expression analysis and to clustering of functional brain MR image time-series. By simultaneously contributing to dimensionality reduction and data clustering, XOM is a useful novel method for data analysis in biomedical imaging.
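
    The Sammon error quoted in the results is a weighted stress between pairwise distances before and after embedding. A minimal sketch of the formula (not the XOM implementation):

    ```python
    import numpy as np

    def sammon_stress(d_high, d_low):
        """Sammon's error: sum((d - d_hat)^2 / d) / sum(d) over all pairs.

        d_high, d_low: square matrices of pairwise distances in the original
        and embedded spaces, e.g. from scipy.spatial.distance.cdist.
        """
        iu = np.triu_indices_from(d_high, k=1)   # count each pair once
        d, d_hat = d_high[iu], d_low[iu]
        return np.sum((d - d_hat) ** 2 / d) / d.sum()
    ```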

  8. The Neural Representation of 3-Dimensional Objects in Rodent Memory Circuits

    PubMed Central

    Burke, Sara N.; Barnes, Carol A.

    2014-01-01

    Three-dimensional objects are common stimuli that rodents and other animals encounter in the natural world that contribute to the associations that are the hallmark of explicit memory. Thus, the use of 3-dimensional objects for investigating the circuits that support associative and episodic memories has a long history. In rodents, the neural representation of these types of stimuli is a polymodal process and lesion data suggest that the perirhinal cortex, an area of the medial temporal lobe that receives afferent input from all sensory modalities, is particularly important for integrating sensory information across modalities to support object recognition. Not surprisingly, recent data from in vivo electrophysiological recordings have shown that principal cells within the perirhinal cortex are activated at locations of an environment that contain 3-dimensional objects. Interestingly, it appears that neural activity patterns related to object stimuli are ubiquitous across memory circuits and have now been observed in many medial temporal lobe structures as well as in the anterior cingulate cortex. This review summarizes behavioral and neurophysiological data that examine the representation of 3-dimensional objects across brain regions that are involved in memory. PMID:25205370

  9. The neural representation of 3-dimensional objects in rodent memory circuits.

    PubMed

    Burke, Sara N; Barnes, Carol A

    2015-05-15

    Three-dimensional objects are common stimuli that rodents and other animals encounter in the natural world that contribute to the associations that are the hallmark of explicit memory. Thus, the use of 3-dimensional objects for investigating the circuits that support associative and episodic memories has a long history. In rodents, the neural representation of these types of stimuli is a polymodal process and lesion data suggest that the perirhinal cortex, an area of the medial temporal lobe that receives afferent input from all sensory modalities, is particularly important for integrating sensory information across modalities to support object recognition. Not surprisingly, recent data from in vivo electrophysiological recordings have shown that principal cells within the perirhinal cortex are activated at locations of an environment that contain 3-dimensional objects. Interestingly, it appears that neural activity patterns related to object stimuli are ubiquitous across memory circuits and have now been observed in many medial temporal lobe structures as well as in the anterior cingulate cortex. This review summarizes behavioral and neurophysiological data that examine the representation of 3-dimensional objects across brain regions that are involved in memory. PMID:25205370

  10. The Preoperative Evaluation of Infective Endocarditis via 3-Dimensional Transesophageal Echocardiography.

    PubMed

    Yong, Matthew S; Saxena, Pankaj; Killu, Ammar M; Coffey, Sean; Burkhart, Harold M; Wan, Siu-Hin; Malouf, Joseph F

    2015-08-01

    Transesophageal echocardiography continues to have a central role in the diagnosis of infective endocarditis and its sequelae. Recent technological advances offer the option of 3-dimensional imaging in the evaluation of patients with infective endocarditis. We present an illustrative case and review the literature regarding the potential advantages and limitations of 3-dimensional transesophageal echocardiography in the diagnosis of complicated infective endocarditis. A 51-year-old man, an intravenous drug user who had undergone bioprosthetic aortic valve replacement 5 months earlier, presented with prosthetic valve endocarditis. Preoperative transesophageal echocardiography with 3D rendition revealed a large abscess involving the mitral aortic intervalvular fibrosa, together with a mycotic aneurysm that had ruptured into the left atrium, resulting in a left ventricle-to-left atrium fistula. Three-dimensional transesophageal echocardiography enabled superior preoperative anatomic delineation and surgical planning. We conclude that 3-dimensional transesophageal echocardiography can be a useful adjunct to traditional 2-dimensional transesophageal echocardiography as a tool in the diagnosis of infective endocarditis.

  11. The analysis of control trajectories using symbolic and database computing

    NASA Technical Reports Server (NTRS)

    Grossman, Robert

    1991-01-01

    The research broadly concerned the symbolic computation, mixed numeric-symbolic computation, and data base computation of trajectories of dynamical systems, especially control systems. It was determined that trees can be used to compute symbolically series which approximate solutions to differential equations.
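
    One concrete way such symbolic series arise is Picard iteration, which builds the Taylor polynomial of an ODE solution term by term. The SymPy sketch below illustrates the general idea and is not the project's tree-based machinery.

    ```python
    import sympy as sp

    t, s = sp.symbols('t s')

    def picard_series(f, y0, n_iter):
        """Picard iteration: y_{k+1}(t) = y0 + integral_0^t f(s, y_k(s)) ds."""
        y = sp.sympify(y0)
        for _ in range(n_iter):
            y = y0 + sp.integrate(f(s, y.subs(t, s)), (s, 0, t))
        return sp.expand(y)

    # y' = y with y(0) = 1 yields the degree-4 Taylor polynomial of exp(t):
    print(picard_series(lambda x, y: y, 1, 4))
    ```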

  12. Role of dielectric medium on benzylidene aniline: A computational analysis

    SciTech Connect

    Umamaheswari, U.; Ajeetha, N.; Ojha, D. P.

    2009-12-15

    A computational analysis of ordering in N-(p-n-ethoxy benzylidene)-p-n-butyl aniline (2O.4) was performed based on quantum mechanics and intermolecular forces. The atomic charge and dipole moment at each atomic centre were evaluated using the all-valence-electron CNDO/2 method. The modified Rayleigh-Schrodinger perturbation theory and multicentre-multipole expansion method were employed to evaluate long-range intermolecular interactions, while a 6-exp potential function was assumed for short-range interactions. The total interaction energy values obtained in these computations were used as input for calculating the probability of each configuration in a noninteracting and nonmesogenic solvent (i.e., benzene) at room temperature (300 K) using the Maxwell-Boltzmann formula. The molecular parameters of 2O.4, including the total energy, binding energy, and total dipole moment, were compared with N-(p-n-butoxy benzylidene)-p-n-ethyl aniline (4O.2). The present article offers theoretical support to the experimental observations, as well as a new and interesting way of looking at a liquid crystalline molecule in a dielectric medium.
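
    The Maxwell-Boltzmann weighting of configurations at 300 K is straightforward to sketch; the energies below are hypothetical placeholders, not the computed 2O.4 interaction energies.

    ```python
    import numpy as np

    K_B = 8.617333e-5  # Boltzmann constant [eV/K]

    def boltzmann_probabilities(energies_ev, temperature=300.0):
        """Probability of each configuration under the Maxwell-Boltzmann formula."""
        e = np.asarray(energies_ev, dtype=float)
        weights = np.exp(-(e - e.min()) / (K_B * temperature))  # shifted for stability
        return weights / weights.sum()

    # Hypothetical interaction energies (eV) for three configurations:
    print(boltzmann_probabilities([-0.52, -0.50, -0.45]))
    ```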

  13. NALDA (Naval Aviation Logistics Data Analysis) CAI (computer aided instruction)

    SciTech Connect

    Handler, B.H. ); France, P.A.; Frey, S.C.; Gaubas, N.F.; Hyland, K.J.; Lindsey, A.M.; Manley, D.O. ); Hunnum, W.H. ); Smith, D.L. )

    1990-07-01

    Data Systems Engineering Organization (DSEO) personnel developed a prototype computer aided instruction (CAI) system for the Naval Aviation Logistics Data Analysis (NALDA) system. The objective of this project was to provide a CAI prototype that could be used as an enhancement to existing NALDA training. The CAI prototype project was performed in phases. The task undertaken in Phase I was to analyze the problem and the alternative solutions and to develop a set of recommendations on how best to proceed. The findings from Phase I are documented in Recommended CAI Approach for the NALDA System (Duncan et al., 1987). In Phase II, a structured design and specifications were developed, and a prototype CAI system was created. A report, NALDA CAI Prototype: Phase II Final Report, was written to record the findings and results of Phase II. NALDA CAI: Recommendations for an Advanced Instructional Model, is comprised of related papers encompassing research on computer aided instruction (CAI), newly developing training technologies, instructional systems development, and an Advanced Instructional Model. These topics were selected because of their relevancy to the CAI needs of NALDA. These papers provide general background information on various aspects of CAI and give a broad overview of new technologies and their impact on the future design and development of training programs. The papers within have been indexed separately elsewhere.

  14. Equation of state and fragmentation issues in computational lethality analysis

    SciTech Connect

    Trucano, T.G.

    1993-07-01

    The purpose of this report is to summarize the status of computational analysis of hypervelocity impact lethality in relatively nontechnical terms from the perspective of the author. It is not intended to be a review of the technical literature on the problems of concern. The discussion is focused by concentrating on two phenomenology areas which are of particular concern in computational impact studies. First, the material's equation of state, specifically the treatment of expanded states of metals undergoing shock vaporization, is discussed. Second, the process of dynamic fragmentation is addressed. In both cases, the context of the discussion deals with inaccuracies and difficulties associated with numerical hypervelocity impact simulations. Laboratory experimental capabilities in hypervelocity impact for impact velocities greater than 10.0 km/s are becoming increasingly viable. This paper also gives recommendations for experimental thrusts which utilize these capabilities that will help to resolve the uncertainties in the numerical lethality studies that are pointed out in the present report.

  15. Computational Analysis of the G-III Laminar Flow Glove

    NASA Technical Reports Server (NTRS)

    Malik, Mujeeb R.; Liao, Wei; Lee-Rausch, Elizabeth M.; Li, Fei; Choudhari, Meelan M.; Chang, Chau-Lyan

    2011-01-01

    Under NASA's Environmentally Responsible Aviation Project, flight experiments are planned with the primary objective of demonstrating the Discrete Roughness Elements (DRE) technology for passive laminar flow control at chord Reynolds numbers relevant to transport aircraft. In this paper, we present a preliminary computational assessment of the Gulfstream-III (G-III) aircraft wing-glove designed to attain natural laminar flow for the leading-edge sweep angle of 34.6°. Analysis for a flight Mach number of 0.75 shows that it should be possible to achieve natural laminar flow for twice the transition Reynolds number ever achieved at this sweep angle. However, the wing-glove needs to be redesigned to effectively demonstrate passive laminar flow control using DREs. As a by-product of the computational assessment, the effect of surface curvature on stationary crossflow disturbances is found to be strongly stabilizing for the current design, and it is suggested that convex surface curvature could be used as a control parameter for natural laminar flow design, provided transition occurs via stationary crossflow disturbances.

  16. Shell stability analysis in a computer aided engineering (CAE) environment

    NASA Technical Reports Server (NTRS)

    Arbocz, J.; Hol, J. M. A. M.

    1993-01-01

    The development of 'DISDECO', the Delft Interactive Shell DEsign COde, is described. The purpose of this project is to make the accumulated theoretical, numerical and practical knowledge of the last 25 years or so readily accessible to users interested in the analysis of buckling-sensitive structures. With this open-ended, hierarchical, interactive computer code the user can access from his workstation successively more complex programs. The computational modules currently operational in DISDECO provide the prospective user with facilities to calculate the critical buckling loads of stiffened anisotropic shells under combined loading, to investigate the effects the various types of boundary conditions will have on the critical load, and to get a complete picture of the degrading effects the different shapes of possible initial imperfections might cause, all in one interactive session. Once a design is finalized, its collapse load can be verified by running a large refined model remotely from behind the workstation with one of the current generation 2-dimensional codes, with advanced capabilities to handle both geometric and material nonlinearities.
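
    For orientation, the classical critical axial stress of a thin unstiffened cylinder, degraded by an imperfection knockdown factor, is the kind of baseline such a code refines; the numbers below are assumed, and DISDECO's stiffened anisotropic shell analysis is far more general.

    ```python
    import math

    def classical_axial_buckling_stress(E, t, R, nu=0.3):
        """sigma_cr = E*t / (R*sqrt(3*(1 - nu^2))), about 0.605*E*t/R for nu = 0.3."""
        return E * t / (R * math.sqrt(3.0 * (1.0 - nu**2)))

    # Assumed aluminum shell with an empirical knockdown factor gamma < 1:
    E, t, R, gamma = 70e9, 1.0e-3, 0.5, 0.35
    print(gamma * classical_axial_buckling_stress(E, t, R))  # design-level stress [Pa]
    ```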

  17. Volume measurements of normal orbital structures by computed tomographic analysis

    SciTech Connect

    Forbes, G.; Gehring, D.G.; Gorman, C.A.; Brennan, M.D.; Jackson, I.T.

    1985-07-01

    Computed tomographic digital data and special off-line computer graphic analysis were used to measure volumes of normal orbital soft tissue, extraocular muscle, orbital fat, and total bony orbit in vivo in 29 patients (58 orbits). The upper limits of normal for adult bony orbit, soft tissue exclusive of the globe, orbital fat, and muscle are 30.1 cm³, 20.0 cm³, 14.4 cm³, and 6.5 cm³, respectively. There are small differences in men as a group compared with women but minimal difference between right and left orbits in the same person. The accuracy of the techniques was established at 7%-8% for these orbit structural volumes in physical phantoms and in simulated silicone orbit phantoms in dry skulls. Mean values and upper limits of normal for volumes were determined in adult orbital structures for future comparison with changes due to endocrine ophthalmopathy, trauma, and congenital deformity.
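
    Volume measurement from segmented CT data ultimately reduces to counting voxels; a minimal sketch with hypothetical study dimensions (the authors' off-line graphics system is not reproduced here):

    ```python
    import numpy as np

    def segmented_volume_cm3(mask, voxel_size_mm):
        """Volume of a binary segmentation: voxel count times voxel volume.

        mask: boolean 3-D array; voxel_size_mm: (dx, dy, dz) spacing in mm.
        """
        voxel_mm3 = float(np.prod(voxel_size_mm))
        return mask.sum() * voxel_mm3 / 1000.0   # mm^3 -> cm^3

    # e.g. a hypothetical 512 x 512 x 80 study with 0.4 x 0.4 x 1.5 mm voxels:
    mask = np.zeros((512, 512, 80), dtype=bool)
    mask[200:300, 200:300, 20:40] = True
    print(segmented_volume_cm3(mask, (0.4, 0.4, 1.5)))       # 48.0 cm^3
    ```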

  18. Survey of computer programs for heat transfer analysis

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.

    1986-01-01

    An overview is given of the current capabilities of thirty-three computer programs that are used to solve heat transfer problems. The programs considered range from large general-purpose codes with a broad spectrum of capabilities, a large user community, and comprehensive user support (e.g., ABAQUS, ANSYS, EAL, MARC, MITAS II, MSC/NASTRAN, and SAMCEF) to small, special-purpose codes with limited user communities, such as ANDES, NTEMP, TAC2D, TAC3D, TEPSA and TRUMP. The majority of the programs use either finite elements or finite differences for the spatial discretization. The capabilities of the programs are listed in tabular form followed by a summary of the major features of each program. The information presented herein is based on a questionnaire sent to the developers of each program. This information is preceded by brief background material needed for effective evaluation and use of computer programs for heat transfer analysis. The present survey is useful in the initial selection of the programs which are most suitable for a particular application. The final selection of the program to be used should, however, be based on a detailed examination of the documentation and the literature about the program.

  19. Methodological and computational considerations for multiple correlation analysis.

    PubMed

    Shieh, Gwowen; Kung, Cmen-Feng

    2007-11-01

    The squared multiple correlation coefficient has been widely employed to assess the goodness-of-fit of linear regression models in many applications. Although there are numerous published sources that present inferential issues and computing algorithms for multinormal correlation models, the statistical procedure for testing substantive significance by specifying the nonzero-effect null hypothesis has received little attention. This article emphasizes the importance of determining whether the squared multiple correlation coefficient is small or large in comparison with some prescribed standard and develops corresponding Excel worksheets that facilitate the implementation of various aspects of the suggested significance tests. In view of the extensive accessibility of Microsoft Excel software and the ultimate convenience of general-purpose statistical packages, the associated computer routines for interval estimation, power calculation, and sample-size determination are also provided for completeness. The statistical methods and available programs of multiple correlation analysis described in this article purport to enhance pedagogical presentation in academic curricula and practical application in psychological research.
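
    Below is a sketch of a nonzero-effect test of the kind described above, under the common fixed-regressor approximation in which the usual F statistic is referred to a noncentral F distribution; it mirrors the general logic of such tests rather than the article's exact procedures or its Excel worksheets.

    ```python
    from scipy.stats import ncf

    def r2_nonzero_effect_test(r2, n, k, rho0_sq):
        """Test H0: rho^2 <= rho0_sq against a larger alternative (approximate).

        r2: observed squared multiple correlation; n: sample size;
        k: number of predictors; rho0_sq: prescribed standard under H0.
        """
        dfn, dfd = k, n - k - 1
        f_stat = (r2 / dfn) / ((1.0 - r2) / dfd)
        lam = n * rho0_sq / (1.0 - rho0_sq)   # noncentrality at the H0 boundary
        return f_stat, ncf.sf(f_stat, dfn, dfd, lam)

    # Is R^2 = 0.40 from n = 60, k = 3 convincingly above a benchmark of 0.10?
    print(r2_nonzero_effect_test(0.40, 60, 3, 0.10))
    ```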

  20. Open Rotor Computational Aeroacoustic Analysis with an Immersed Boundary Method

    NASA Technical Reports Server (NTRS)

    Brehm, Christoph; Barad, Michael F.; Kiris, Cetin C.

    2016-01-01

    Reliable noise prediction capabilities are essential to enable novel fuel efficient open rotor designs that can meet the community and cabin noise standards. Toward this end, immersed boundary methods have reached a level of maturity so that they are being frequently employed for specific real world applications within NASA. This paper demonstrates that our higher-order immersed boundary method provides the ability for aeroacoustic analysis of wake-dominated flow fields generated by highly complex geometries. This is a first-of-a-kind aeroacoustic simulation of an open rotor propulsion system employing an immersed boundary method. In addition to discussing the peculiarities of applying the immersed boundary method to this moving boundary problem, we will provide a detailed aeroacoustic analysis of the noise generation mechanisms encountered in the open rotor flow. The simulation data is compared to available experimental data and other computational results employing more conventional CFD methods. The noise generation mechanisms are analyzed employing spectral analysis, proper orthogonal decomposition and the causality method.

  1. Privacy-preserving microbiome analysis using secure computation

    PubMed Central

    Wagner, Justin; Paulson, Joseph N.; Wang, Xiao; Bhattacharjee, Bobby; Corrada Bravo, Héctor

    2016-01-01

    Motivation: Developing targeted therapeutics and identifying biomarkers relies on large amounts of research participant data. Beyond human DNA, scientists now investigate the DNA of micro-organisms inhabiting the human body. Recent work shows that an individual’s collection of microbial DNA consistently identifies that person and could be used to link a real-world identity to a sensitive attribute in a research dataset. Unfortunately, the current suite of DNA-specific privacy-preserving analysis tools does not meet the requirements for microbiome sequencing studies. Results: To address privacy concerns around microbiome sequencing, we implement metagenomic analyses using secure computation. Our implementation allows comparative analysis over combined data without revealing the feature counts for any individual sample. We focus on three analyses and perform an evaluation on datasets currently used by the microbiome research community. We use our implementation to simulate sharing data between four policy-domains. Additionally, we describe an application of our implementation for patients to combine data that allows drug developers to query against and compensate patients for the analysis. Availability and implementation: The software is freely available for download at: http://cbcb.umd.edu/~hcorrada/projects/secureseq.html Supplementary information: Supplementary data are available at Bioinformatics online. Contact: hcorrada@umiacs.umd.edu PMID:26873931
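
    The privacy guarantee rests on computing over data that no single party can read. The additive secret-sharing sketch below illustrates only that core idea; the paper's implementation uses full secure-computation protocols, not this simplified scheme.

    ```python
    import random

    PRIME = 2**61 - 1  # working modulus

    def share(value, n_parties):
        """Split an integer count into n additive shares that sum to it mod PRIME."""
        shares = [random.randrange(PRIME) for _ in range(n_parties - 1)]
        shares.append((value - sum(shares)) % PRIME)
        return shares

    def reconstruct(shares):
        return sum(shares) % PRIME

    # Each site shares its feature count; only the combined total is revealed.
    sites = [share(count, 4) for count in (120, 75, 310)]
    per_party_sums = [sum(col) % PRIME for col in zip(*sites)]
    print(reconstruct(per_party_sums))   # 505, with no single count exposed
    ```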

  2. Multiresolution analysis over simple graphs for brain computer interfaces

    NASA Astrophysics Data System (ADS)

    Asensio-Cubero, J.; Gan, J. Q.; Palaniappan, R.

    2013-08-01

    Objective. Multiresolution analysis (MRA) offers a useful framework for signal analysis in the temporal and spectral domains, although commonly employed MRA methods may not be the best approach for brain computer interface (BCI) applications. This study aims to develop a new MRA system for extracting tempo-spatial-spectral features for BCI applications based on wavelet lifting over graphs. Approach. This paper proposes a new graph-based transform for wavelet lifting and a tailored simple graph representation for electroencephalography (EEG) data, which results in an MRA system where temporal, spectral and spatial characteristics are used to extract motor imagery features from EEG data. The transformed data is processed within a simple experimental framework to test the classification performance of the new method. Main Results. The proposed method can significantly improve the classification results obtained by various wavelet families using the same methodology. Preliminary results using common spatial patterns as feature extraction method show that we can achieve comparable classification accuracy to more sophisticated methodologies. From the analysis of the results we can obtain insights into the pattern development in the EEG data, which provide useful information for feature basis selection and thus for improving classification performance. Significance. Applying wavelet lifting over graphs is a new approach for handling BCI data. The inherent flexibility of the lifting scheme could lead to new approaches based on the hereby proposed method for further classification performance improvement.
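
    A single predict/update lifting step on a chain graph (the classical Haar case) shows the mechanics that the paper generalizes to graphs built from EEG channels; a minimal sketch, not the authors' transform.

    ```python
    import numpy as np

    def haar_lifting_step(signal):
        """One lifting step: predict odd samples from even neighbours, then update."""
        even, odd = signal[0::2], signal[1::2]
        detail = odd - even               # predict: residual of the prediction
        approx = even + detail / 2.0      # update: preserve the running mean
        return approx, detail

    x = np.array([4.0, 6.0, 5.0, 9.0, 8.0, 8.0])
    approx, detail = haar_lifting_step(x)   # approx: [5., 7., 8.]; detail: [2., 4., 0.]
    ```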

  3. Detection of Organophosphorus Pesticides with Colorimetry and Computer Image Analysis.

    PubMed

    Li, Yanjie; Hou, Changjun; Lei, Jincan; Deng, Bo; Huang, Jing; Yang, Mei

    2016-01-01

    Organophosphorus pesticides (OPs) represent a very important class of pesticides that are widely used in agriculture because of their relatively high performance and moderate environmental persistence, hence the sensitive and specific detection of OPs is highly significant. Based on the inhibitory effect on acetylcholinesterase (AChE) induced by inhibitors, including OPs and carbamates, a colorimetric analysis was used for detection of OPs with computer image analysis of color density in CMYK (cyan, magenta, yellow and black) color space and non-linear modeling. The results showed that the yellow intensity weakened gradually as the concentration of dichlorvos increased. The quantitative analysis of dichlorvos was achieved by Artificial Neural Network (ANN) modeling, and the results showed that the established model had a good predictive ability between training sets and predictive sets. Real cabbage samples containing dichlorvos were detected by colorimetry and gas chromatography (GC), respectively. The results showed that there was no significant difference between colorimetry and GC (P > 0.05). The experiments on accuracy, precision and repeatability revealed good performance for detection of OPs. AChE can also be inhibited by carbamates, and therefore this method has potential applications in real samples for OPs and carbamates because of its high selectivity and sensitivity. PMID:27396650
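
    The CMYK color-density feature can be extracted with the standard RGB-to-CMYK conversion; a minimal sketch on an arbitrary example pixel (the ANN modeling stage is omitted).

    ```python
    def rgb_to_cmyk(r, g, b):
        """Convert 8-bit RGB to CMYK fractions, all in [0, 1]."""
        rp, gp, bp = r / 255.0, g / 255.0, b / 255.0
        k = 1.0 - max(rp, gp, bp)
        if k == 1.0:                       # pure black: chromatic channels undefined
            return 0.0, 0.0, 0.0, 1.0
        c = (1.0 - rp - k) / (1.0 - k)
        m = (1.0 - gp - k) / (1.0 - k)
        y = (1.0 - bp - k) / (1.0 - k)
        return c, m, y, k

    # The Y (yellow) channel is the density tracked against dichlorvos concentration:
    _, _, y_density, _ = rgb_to_cmyk(230, 214, 60)
    ```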

  4. An Efficient Objective Analysis System for Parallel Computers

    NASA Technical Reports Server (NTRS)

    Stobie, J.

    1999-01-01

    A new atmospheric objective analysis system designed for parallel computers will be described. The system can produce a global analysis (on a 1 x 1 lat-lon grid with 18 levels of heights and winds and 10 levels of moisture) using 120,000 observations in 17 minutes on 32 CPUs (SGI Origin 2000). No special parallel code is needed (e.g. MPI or multitasking) and the 32 CPUs do not have to be on the same platform. The system is totally portable and can run on several different architectures at once. In addition, the system can easily scale up to 100 or more CPUs. This will allow for much higher resolution and significant increases in input data. The system scales linearly as the number of observations and the number of grid points. The cost overhead in going from 1 to 32 CPUs is 18%. In addition, the analysis results are identical regardless of the number of processors used. This system has all the characteristics of optimal interpolation, combining detailed instrument and first guess error statistics to produce the best estimate of the atmospheric state. Static tests with a 2 x 2.5 resolution version of this system showed its analysis increments are comparable to the latest NASA operational system including maintenance of mass-wind balance. Results from several months of cycling tests in the Goddard EOS Data Assimilation System (GEOS DAS) show this new analysis retains the same level of agreement between the first guess and observations (O-F statistics) as the current operational system.
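
    The optimal-interpolation update underlying such a system has a compact closed form, x_a = x_b + BH^T(HBH^T + R)^(-1)(y - Hx_b); below is a toy sketch with a two-point state and one observation, all covariances assumed.

    ```python
    import numpy as np

    def oi_update(xb, B, H, R, y):
        """Optimal-interpolation analysis step.

        xb: background (first guess); B: background error covariance
        H: observation operator; R: observation error covariance; y: observations
        """
        innovation = y - H @ xb                  # the O-F departures
        S = H @ B @ H.T + R
        return xb + B @ H.T @ np.linalg.solve(S, innovation)

    xb = np.array([280.0, 282.0])
    B = np.array([[1.0, 0.5], [0.5, 1.0]])
    H = np.array([[1.0, 0.0]])                   # observe the first point only
    R = np.array([[0.25]])
    print(oi_update(xb, B, H, R, np.array([281.0])))   # [280.8, 282.4]
    ```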

  5. An Efficient Objective Analysis System for Parallel Computers

    NASA Technical Reports Server (NTRS)

    Stobie, James G.

    1999-01-01

    A new objective analysis system designed for parallel computers will be described. The system can produce a global analysis (on a 2 x 2.5 lat-lon grid with 20 levels of heights and winds and 10 levels of moisture) using 120,000 observations in less than 3 minutes on 32 CPUs (SGI Origin 2000). No special parallel code is needed (e.g. MPI or multitasking) and the 32 CPUs do not have to be on the same platform. The system is totally portable and can run on several different architectures at once. In addition, the system can easily scale up to 100 or more CPUs. This will allow for much higher resolution and significant increases in input data. The system scales linearly as the number of observations and the number of grid points. The cost overhead in going from 1 to 32 CPUs is 18%. In addition, the analysis results are identical regardless of the number of processors used. This system has all the characteristics of optimal interpolation, combining detailed instrument and first guess error statistics to produce the best estimate of the atmospheric state. It also includes a new quality control (buddy check) system. Static tests with the system showed its analysis increments are comparable to the latest NASA operational system including maintenance of mass-wind balance. Results from a 2-month cycling test in the Goddard EOS Data Assimilation System (GEOS DAS) show this new analysis retains the same level of agreement between the first guess and observations (O-F statistics) throughout the entire two months.

  6. Customizable Computer-Based Interaction Analysis for Coaching and Self-Regulation in Synchronous CSCL Systems

    ERIC Educational Resources Information Center

    Lonchamp, Jacques

    2010-01-01

    Computer-based interaction analysis (IA) is an automatic process that aims at understanding a computer-mediated activity. In a CSCL system, computer-based IA can provide information directly to learners for self-assessment and regulation and to tutors for coaching support. This article proposes a customizable computer-based IA approach for a…

  7. A Large-Scale Computational Analysis of Corneal Structural Response and Ectasia Risk in Myopic Laser Refractive Surgery

    PubMed Central

    Dupps, William Joseph; Seven, Ibrahim

    2016-01-01

    Purpose: To investigate biomechanical strain as a structural susceptibility metric for corneal ectasia in a large-scale computational trial. Methods: A finite element modeling study was performed using retrospective Scheimpflug tomography data from 40 eyes of 40 patients. LASIK and PRK were simulated with varied myopic ablation profiles and flap thickness parameters across eyes from LASIK candidates, patients disqualified for LASIK, subjects with atypical topography, and keratoconus subjects in 280 simulations. Finite element analysis output was then interrogated to extract several risk and outcome variables. We tested the hypothesis that strain is greater in known at-risk eyes than in normal eyes, evaluated the ability of a candidate strain variable to differentiate eyes that were empirically disqualified as LASIK candidates, and compared the performance of common risk variables as predictors of this novel susceptibility marker across multiple virtual subjects and surgeries. Results: A candidate susceptibility metric that expressed mean strains across the anterior residual stromal bed was significantly higher in eyes with confirmed ectatic predisposition in preoperative and all postoperative cases (P≤.003). The strain metric was effective at differentiating normal and at-risk eyes (area under receiver operating characteristic curve ≥ 0.83, P≤.002), was highly correlated to thickness-based risk metrics (as high as R2 = 95%, P<.001 for the percent of stromal tissue altered (PSTA)), and predicted large portions of the variance in predicted refractive response to surgery (R2 = 57%, P<.001). Conclusions: This study represents the first large-scale 3-dimensional structural analysis of ectasia risk and provides a novel biomechanical construct for expressing structural risk in refractive surgery. Mechanical strain is an effective marker of known ectasia risk and correlates to predicted refractive error after myopic photoablative surgery.
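
    Among the thickness-based risk metrics mentioned, the percent of stromal tissue altered has a simple closed form; the sketch below uses the commonly cited definition and hypothetical values (the strain metric itself requires finite element analysis).

    ```python
    def percent_stromal_tissue_altered(flap_um, ablation_um, cct_um):
        """PSTA as commonly defined: (flap thickness + ablation depth) / CCT * 100."""
        return 100.0 * (flap_um + ablation_um) / cct_um

    # Hypothetical LASIK case: 110 um flap, 70 um ablation, 540 um pachymetry:
    print(percent_stromal_tissue_altered(110.0, 70.0, 540.0))   # ~33.3
    ```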

  8. Meta-Analysis and Computer-Mediated Communication.

    PubMed

    Taylor, Alan M

    2016-04-01

    Because of the use of human participants and differing contextual variables, research in second language acquisition often produces conflicting results, leaving practitioners confused and unsure of the effectiveness of specific treatments. This article provides insight into a recent seminal meta-analysis on the effectiveness of computer-mediated communication, providing further statistical evidence of the importance of its results. The significance of the study is examined by looking at the p values included in the references, to demonstrate how results can easily be misconstrued by practitioners and researchers. Lin's conclusion regarding the research setting of the study reports is also evaluated. In doing so, other possible explanations of what may be influencing the results can be proposed.

  9. An integrated-intensity method for emission spectrographic computer analysis

    USGS Publications Warehouse

    Thomas, Catharine P.

    1975-01-01

    An integrated-intensity method has been devised to improve the computer analysis of data by emission spectrography. The area of the intensity profile of a spectral line is approximated by a rectangle whose height is related to the intensity difference between the peak and background of the line and whose width is measured at a fixed transmittance below the apex of the line. The method is illustrated by the determination of strontium in the presence of greater than 10 percent calcium. The Sr 3380.711-Å line, which is unaffected by calcium and which has a linear analytical curve extending from 100-3,000 ppm, has been used to determine strontium in 18 standard reference rocks covering a wide range of geologic materials. Both the accuracy and the precision of the determinations were well within the accepted range for a semiquantitative procedure.
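
    The rectangle approximation described above is a one-line computation; a minimal sketch with hypothetical densitometer readings:

    ```python
    def integrated_line_intensity(peak, background, width):
        """Rectangle approximation of a spectral line's area.

        peak, background: intensity at the line apex and adjacent background
        width: line width measured at a fixed transmittance below the apex
        """
        return (peak - background) * width

    area = integrated_line_intensity(peak=72.0, background=12.0, width=0.8)
    ```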

  10. Drusen measurement from fundus photographs using computer image analysis.

    PubMed

    Peli, E; Lahav, M

    1986-12-01

    Drusen are yellowish deposits at the level of the retinal pigment epithelium and are frequently associated with age-related maculopathy (ARM). Drusen often change in size and number over time and may be followed by atrophic or exudative macular degeneration. A quantitative method to measure the development of drusen is needed for controlled studies of the natural history, prognosis, and treatment of ARM. An objective method is described using computer image analysis of fundus photographs for the detection and measurement of drusen. This technique enables us to measure both the area of drusen in the macula and the changes in the drusen pattern over time. Evaluation of repeated photographs showed reproducibility of 6.1%, whereas the reproducibility of processing photographic duplicates was 2.3%. Digitization with a high-quality linear array solid state camera did not change reproducibility significantly. PMID:3808617

  11. Computational analysis of the SSME fuel preburner flow

    NASA Technical Reports Server (NTRS)

    Wang, T. S.; Farmer, R. C.

    1986-01-01

    A computational fluid dynamics model which simulates the steady state operation of the SSME fuel preburner is developed. Specifically, the model will be used to quantify the flow factors which cause local hot spots in the fuel preburner in order to recommend experiments whereby the control of undesirable flow features can be demonstrated. The results of a two-year effort to model the preburner are presented. In this effort, the appropriate transport equations for the fuel preburner flowfield were numerically solved for both an axisymmetric and a three-dimensional configuration. Continuum's VAST (Variational Solution of the Transport equations) code, in conjunction with the CM-1000 Engineering Analysis Workstation and the NASA/Ames CYBER 205, was used to perform the required calculations. It is concluded that the preburner operational anomalies are not due to steady state phenomena and must, therefore, be related to transient operational procedures.

  12. Image editing and computer assisted bitemark analysis: a case report.

    PubMed

    Wood, R E; Miller, P A; Blenkinsop, B R

    1994-12-01

    Bitemark evidence in a homicide usually involves a perpetrator biting the victim prior to or around the time of death. This paper presents a case in which a homicide victim bit his assailant. A suspect taken into custody was found to have what appeared to be a human bitemark on the proximal phalanx of his right thumb. Scale photographs of this injury were obtained and compared to the dentition of the deceased using digitized computer images for superimposition. Three different approaches for comparison with the bitemark photograph were utilized: comparison with radiographs of amalgam-filled impressions of dental casts, a transparent overlay technique, and comparison with photographs of a simulated bitemark inked onto the hand of a volunteer. A review of these techniques as they apply to computerized bitemark analysis is presented.

  13. Satellite Interference Analysis and Simulation Using Personal Computers

    NASA Technical Reports Server (NTRS)

    Kantak, Anil

    1988-01-01

    This report presents the complete analysis and formulas necessary to quantify the interference experienced by a generic satellite communications receiving station due to an interfering satellite. Both satellites, the desired as well as the interfering satellite, are considered to be in elliptical orbits. Formulas are developed for the satellite look angles and the satellite transmit angles generally related to the land mask of the receiving station site for both satellites. Formulas for considering Doppler effect due to the satellite motion as well as the Earth's rotation are developed. The effect of the interfering-satellite signal modulation and the Doppler effect on the power received are considered. The statistical formulation of the interference effect is presented in the form of a histogram of the interference to the desired signal power ratio. Finally, a computer program suitable for microcomputers such as IBM AT is provided with the flowchart, a sample run, results of the run, and the program code.
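
    Two of the quantities the report develops lend themselves to a compact illustration: the narrowband Doppler shift from relative radial velocity, and the histogram of interference-to-desired-signal power ratio. The sketch below uses the standard Doppler relation and randomly generated power samples; it is not the report's program or its exact formulas.

        import numpy as np

        # Hypothetical illustration of two quantities the report develops:
        # (1) Doppler shift of the received carrier from relative radial velocity,
        # (2) a histogram of interference-to-desired-signal power ratio (I/C).
        C = 299_792_458.0          # speed of light, m/s

        def doppler_shift(f_carrier_hz, radial_velocity_ms):
            """Narrowband Doppler shift; positive velocity = closing geometry."""
            return f_carrier_hz * radial_velocity_ms / C

        rng = np.random.default_rng(0)
        desired_dbw = -120 + rng.normal(0, 1.5, 10_000)     # hypothetical C samples
        interferer_dbw = -138 + rng.normal(0, 3.0, 10_000)  # hypothetical I samples
        i_over_c_db = interferer_dbw - desired_dbw

        hist, edges = np.histogram(i_over_c_db, bins=20)
        print(f"Doppler at 12 GHz, 3 km/s: {doppler_shift(12e9, 3e3) / 1e3:.1f} kHz")
        print("I/C histogram counts:", hist)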

  14. Computational analysis of Variable Thrust Engine (VTE) performance

    NASA Technical Reports Server (NTRS)

    Giridharan, M. G.; Krishnan, A.; Przekwas, A. J.

    1993-01-01

    The Variable Thrust Engine (VTE) of the Orbital Maneuvering Vehicle (OMV) uses a hypergolic propellant combination of Monomethyl Hydrazine (MMH) and Nitrogen Tetroxide (NTO) as fuel and oxidizer, respectively. The performance of the VTE depends on a number of complex interacting phenomena such as atomization, spray dynamics, vaporization, turbulent mixing, convective/radiative heat transfer, and hypergolic combustion. This study involved the development of a comprehensive numerical methodology to facilitate detailed analysis of the VTE. An existing Computational Fluid Dynamics (CFD) code was extensively modified to include the following models: a two-liquid, two-phase Eulerian-Lagrangian spray model; a chemical equilibrium model; and a discrete ordinate radiation heat transfer model. The modified code was used to conduct a series of simulations to assess the effects of various physical phenomena and boundary conditions on the VTE performance. The details of the models and the results of the simulations are presented.

  15. Computational analysis of Variable Thrust Engine (VTE) performance

    NASA Astrophysics Data System (ADS)

    Giridharan, M. G.; Krishnan, A.; Przekwas, A. J.

    1993-02-01

    The Variable Thrust Engine (VTE) of the Orbital Maneuvering Vehicle (OMV) uses a hypergolic propellant combination of Monomethyl Hydrazine (MMH) and Nitrogen Tetroxide (NTO) as fuel and oxidizer, respectively. The performance of the VTE depends on a number of complex interacting phenomena such as atomization, spray dynamics, vaporization, turbulent mixing, convective/radiative heat transfer, and hypergolic combustion. This study involved the development of a comprehensive numerical methodology to facilitate detailed analysis of the VTE. An existing Computational Fluid Dynamics (CFD) code was extensively modified to include the following models: a two-liquid, two-phase Eulerian-Lagrangian spray model; a chemical equilibrium model; and a discrete ordinate radiation heat transfer model. The modified code was used to conduct a series of simulations to assess the effects of various physical phenomena and boundary conditions on the VTE performance. The details of the models and the results of the simulations are presented.

  16. Automation of Large-scale Computer Cluster Monitoring Information Analysis

    NASA Astrophysics Data System (ADS)

    Magradze, Erekle; Nadal, Jordi; Quadt, Arnulf; Kawamura, Gen; Musheghyan, Haykuhi

    2015-12-01

    High-throughput computing platforms consist of a complex infrastructure and provide a number of services that are prone to failure. To mitigate the impact of failures on the quality of the provided services, constant monitoring and timely reaction are required, which is impossible without automation of the system administration processes. This paper introduces a way of automating the analysis of monitoring information to provide long- and short-term predictions of the service response time (SRT) for mass storage and batch systems and to identify the status of a service at a given time. The approach for the SRT predictions is based on the Adaptive Neuro Fuzzy Inference System (ANFIS). An evaluation of the approaches is performed on real monitoring data from the WLCG Tier 2 center GoeGrid. Tenfold cross-validation results demonstrate the high efficiency of both approaches in comparison with known methods.

  17. Image editing and computer assisted bitemark analysis: a case report.

    PubMed

    Wood, R E; Miller, P A; Blenkinsop, B R

    1994-12-01

    Bitemark evidence in a homicide usually involves a perpetrator biting the victim prior to or around the time of death. This paper presents a case in which a homicide victim bit his assailant. A suspect taken into custody was found to have what appeared to be a human bitemark on the proximal phalanx of his right thumb. Scale photographs of this injury were obtained and compared to the dentition of the deceased using digitized computer images for superimposition. Three different approaches for comparison with the bitemark photograph were utilized: comparison with radiographs of amalgam-filled impressions of dental casts, a transparent overlay technique, and comparison with photographs of a simulated bitemark inked onto the hand of a volunteer. A review of these techniques as they apply to computerized bitemark analysis is presented. PMID:9227063

  18. Data analysis using the Gnu R system for statistical computation

    SciTech Connect

    Simone, James; /Fermilab

    2011-07-01

    R is a language system for statistical computation. It is widely used in statistics, bioinformatics, machine learning, data mining, quantitative finance, and the analysis of clinical drug trials. Among the advantages of R are: it has become the standard language for developing statistical techniques, it is actively developed by a large and growing global user community, it is open-source software, it is highly portable (Linux, OS X, and Windows), it has a built-in documentation system, it produces high-quality graphics, and it is easily extensible, with over four thousand extension packages available covering statistics and applications. This report gives a very brief introduction to R with some examples using lattice QCD simulation results. It then discusses the development of R packages designed for chi-square minimization fits for lattice n-pt correlation functions.
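
    The report itself works in R; as a language-neutral illustration of the same chi-square minimization, the sketch below fits a single-exponential 2-pt correlator model C(t) = A exp(-m t) to hypothetical data in Python. Data, errors, and starting values are invented for illustration.

        import numpy as np
        from scipy.optimize import minimize

        # Sketch of a chi-square minimization fit to a 2-pt correlation function,
        # C(t) = A * exp(-m * t). Data and errors are hypothetical, not lattice results.
        t = np.arange(1, 12)
        a_true, m_true = 1.3, 0.45
        rng = np.random.default_rng(1)
        sigma = 0.02 * a_true * np.exp(-m_true * t)          # hypothetical errors
        data = a_true * np.exp(-m_true * t) + rng.normal(0, sigma)

        def chi2(params):
            a, m = params
            model = a * np.exp(-m * t)
            return np.sum(((data - model) / sigma) ** 2)

        fit = minimize(chi2, x0=[1.0, 0.5], method="Nelder-Mead")
        a_fit, m_fit = fit.x
        print(f"A = {a_fit:.3f}, m = {m_fit:.3f}, chi2/dof = {fit.fun / (len(t) - 2):.2f}")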

  19. Parallel computation of meshless methods for explicit dynamic analysis.

    SciTech Connect

    Danielson, K. T.; Hao, S.; Liu, W. K.; Uras, R. A.; Li, S.; Reactor Engineering; Northwestern Univ.; Waterways Experiment Station

    2000-03-10

    A parallel computational implementation of modern meshless methods is presented for explicit dynamic analysis. The procedures are demonstrated by application of the Reproducing Kernel Particle Method (RKPM). Aspects of a coarse grain parallel paradigm are detailed for a Lagrangian formulation using model partitioning. Integration points are uniquely defined on separate processors and particle definitions are duplicated, as necessary, so that all support particles for each point are defined locally on the corresponding processor. Several partitioning schemes are considered and a reduced graph-based procedure is presented. Partitioning issues are discussed and procedures to accommodate essential boundary conditions in parallel are presented. Explicit MPI message passing statements are used for all communications among partitions on different processors. The effectiveness of the procedure is demonstrated by highly deformable inelastic example problems.
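
    The duplication rule described (every particle supporting a locally owned integration point must be defined locally) can be sketched without MPI. The slab partitioning, particle counts, and support radius below are hypothetical stand-ins for the paper's graph-based partitioning:

        import numpy as np

        # Sketch of the particle-duplication rule: each processor owns a slab of
        # integration points and must hold copies of all particles whose support
        # covers any owned point. Geometry and support radius are hypothetical.
        rng = np.random.default_rng(2)
        particles = rng.uniform(0.0, 1.0, size=(200, 1))   # 1-D particle positions
        points = rng.uniform(0.0, 1.0, size=(400, 1))      # integration points
        support_radius = 0.05
        n_procs = 4

        for rank in range(n_procs):
            lo, hi = rank / n_procs, (rank + 1) / n_procs
            owned = points[(points[:, 0] >= lo) & (points[:, 0] < hi)]
            # A particle is needed locally if it supports any owned point.
            dist = np.abs(particles[:, 0][None, :] - owned[:, 0][:, None])
            needed = np.unique(np.nonzero(dist <= support_radius)[1])
            print(f"rank {rank}: {len(owned)} points, {len(needed)} particle copies")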

  20. Satellite interference analysis and simulation using personal computers

    NASA Astrophysics Data System (ADS)

    Kantak, Anil

    1988-03-01

    This report presents the complete analysis and formulas necessary to quantify the interference experienced by a generic satellite communications receiving station due to an interfering satellite. Both satellites, the desired as well as the interfering satellite, are considered to be in elliptical orbits. Formulas are developed for the satellite look angles and the satellite transmit angles generally related to the land mask of the receiving station site for both satellites. Formulas for considering Doppler effect due to the satellite motion as well as the Earth's rotation are developed. The effect of the interfering-satellite signal modulation and the Doppler effect on the power received are considered. The statistical formulation of the interference effect is presented in the form of a histogram of the interference to the desired signal power ratio. Finally, a computer program suitable for microcomputers such as IBM AT is provided with the flowchart, a sample run, results of the run, and the program code.

  1. Software for computer-aided receiver operating characteristic (ROC) analysis

    NASA Astrophysics Data System (ADS)

    Engel, John R.; Craine, Eric R.

    1994-04-01

    We are currently developing an easy-to-use, microcomputer-based software application to help researchers perform ROC studies. The software will have facilities for aiding the researcher in all phases of an ROC study, including experiment design, setting up and conducting test sessions, analyzing results and generating reports. The initial version of the software, named 'ROC Assistant', operates on Macintosh computers and enables the user to enter a case list, run test sessions and produce an ROC curve. We are in the process of developing enhanced versions which will incorporate functions for statistical analysis, experimental design and online help. In this paper we discuss the ROC methodology upon which the software is based as well as our software development effort to date.
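
    The core computation behind any such tool is small: sweep a decision threshold over the observers' confidence ratings and record (false positive rate, true positive rate) pairs. A minimal sketch with hypothetical ratings and truth labels:

        # Minimal ROC-curve computation: sweep a threshold over rating data and
        # collect (FPR, TPR) points. Ratings and truth labels are hypothetical.
        ratings = [1, 2, 2, 3, 3, 4, 4, 5, 5, 5]        # observer confidence, 1..5
        truth = [0, 0, 1, 0, 1, 0, 1, 1, 1, 1]          # 1 = abnormal case

        def roc_points(ratings, truth):
            pos = sum(truth)
            neg = len(truth) - pos
            points = []
            for thresh in sorted(set(ratings), reverse=True):
                tp = sum(1 for r, t in zip(ratings, truth) if r >= thresh and t == 1)
                fp = sum(1 for r, t in zip(ratings, truth) if r >= thresh and t == 0)
                points.append((fp / neg, tp / pos))
            return [(0.0, 0.0)] + points

        for fpr, tpr in roc_points(ratings, truth):
            print(f"FPR={fpr:.2f}  TPR={tpr:.2f}")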

  2. Computational analysis of methods for reduction of induced drag

    NASA Technical Reports Server (NTRS)

    Janus, J. M.; Chatterjee, Animesh; Cave, Chris

    1993-01-01

    The purpose of this effort was to perform a computational flow analysis of a design concept centered around induced drag reduction and tip-vortex energy recovery. The flow model solves the unsteady three-dimensional Euler equations, discretized as a finite-volume method, utilizing a high-resolution approximate Riemann solver for cell interface flux definitions. The numerical scheme is an approximately-factored block LU implicit Newton iterative-refinement method. Multiblock domain decomposition is used to partition the field into an ordered arrangement of blocks. Three configurations are analyzed: a baseline fuselage-wing, a fuselage-wing-nacelle, and a fuselage-wing-nacelle-propfan. Aerodynamic force coefficients, propfan performance coefficients, and flowfield maps are used to qualitatively assess design efficacy. Where appropriate, comparisons are made with available experimental data.

  3. Whole-genome CNV analysis: advances in computational approaches

    PubMed Central

    Pirooznia, Mehdi; Goes, Fernando S.; Zandi, Peter P.

    2015-01-01

    Accumulating evidence indicates that DNA copy number variation (CNV) is likely to make a significant contribution to human diversity and also play an important role in disease susceptibility. Recent advances in genome sequencing technologies have enabled the characterization of a variety of genomic features, including CNVs. This has led to the development of several bioinformatics approaches to detect CNVs from next-generation sequencing data. Here, we review recent advances in CNV detection from whole genome sequencing. We discuss the informatics approaches and current computational tools that have been developed as well as their strengths and limitations. This review will assist researchers and analysts in choosing the most suitable tools for CNV analysis as well as provide suggestions for new directions in future development. PMID:25918519

  4. Meta-Analysis and Computer-Mediated Communication.

    PubMed

    Taylor, Alan M

    2016-04-01

    Because of the use of human participants and differing contextual variables, research in second language acquisition often produces conflicting results, leaving practitioners confused and unsure of the effectiveness of specific treatments. This article provides insight into a recent seminal meta-analysis on the effectiveness of computer-mediated communication, providing further statistical evidence of the importance of its results. The significance of the study is examined by looking at the p values included in the references, to demonstrate how results can easily be misconstrued by practitioners and researchers. Lin's conclusion regarding the research setting of the study reports is also evaluated. In doing so, other possible explanations of what may be influencing the results can be proposed. PMID:27154373

  5. Computational analysis of azine-N-oxides as energetic materials

    SciTech Connect

    Ritchie, J.P.

    1994-05-01

    A BKW equation of state in a 1-dimensional hydrodynamic simulation of the cylinder test can be used to estimate the performance of explosives. Using this approach, the novel explosive 1,4-diamino-2,3,5,6-tetrazine-2,5-dioxide (TZX) was analyzed. Despite a high detonation velocity and a predicted CJ pressure comparable to that of RDX, TZX performs relatively poorly in the cylinder test. Theoretical and computational analysis shows this to be the result of a low heat of detonation. A conceptual strategy is proposed to remedy this problem. In order to predict the required heats of formation, new ab initio group equivalents were developed. Crystal structure calculations are also described that show hydrogen-bonding is important in determining the density of TZX and related compounds.

  6. Thermodiffusion in multicomponent hydrocarbon mixtures: Experimental investigations and computational analysis.

    PubMed

    VanVaerenbergh, Stefan; Srinivasan, Seshasai; Saghir, M Ziad

    2009-09-21

    In an unprecedented experimental investigation, a ternary and a four-component hydrocarbon mixture at high pressure have been studied in a nearly convection-free environment to understand the thermodiffusion process. A binary mixture has also been investigated in this environment. Experimental investigations of the three mixtures have been conducted in space onboard the spacecraft FOTON-M3, thereby isolating the gravity-induced convection that otherwise interferes with thermodiffusion experiments on Earth. The experimental results have also been used to test a thermodiffusion model that has been calibrated based on the results of previous experimental investigations. It was found that with an increase in the number of components in the mixtures, the performance of the thermodiffusion model deteriorated. Computational analysis was also performed to estimate possible sources of error. Simulations showed that the vibrations of the spacecraft could influence the estimates of thermodiffusion factors. It was also found that they are sensitive to slight variations in the temperature of the mixture.
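
    For orientation, the thermodiffusion (Soret) factor estimated in such experiments is conventionally defined, for a binary mixture, by the steady-state balance between Fickian and thermal diffusion. In standard textbook form (not the paper's specific multicomponent model):

        % Vanishing steady-state mass flux in a binary mixture,
        %   J = -\rho D \nabla c - \rho c (1 - c) D_T \nabla T = 0,
        % gives the Soret separation and the Soret coefficient S_T:
        \nabla c \;=\; - S_T \, c \,(1 - c)\, \nabla T,
        \qquad S_T \equiv \frac{D_T}{D}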

  7. Computer-assisted sperm analysis (CASA): capabilities and potential developments.

    PubMed

    Amann, Rupert P; Waberski, Dagmar

    2014-01-01

    Computer-assisted sperm analysis (CASA) systems have evolved over approximately 40 years, through advances in devices to capture the image from a microscope, huge increases in computational power concurrent with amazing reduction in size of computers, new computer languages, and updated/expanded software algorithms. Remarkably, basic concepts for identifying sperm and their motion patterns are little changed. Older and slower systems remain in use. Most major spermatology laboratories and semen processing facilities have a CASA system, but the extent of reliance thereon ranges widely. This review describes capabilities and limitations of present CASA technology used with boar, bull, and stallion sperm, followed by possible future developments. Each marketed system is different. Modern CASA systems can automatically view multiple fields in a shallow specimen chamber to capture strobe-like images of 500 to >2000 sperm, at 50 or 60 frames per second, in clear or complex extenders, and in <2 minutes, store information for ≥ 30 frames and provide summary data for each spermatozoon and the population. A few systems evaluate sperm morphology concurrent with motion. CASA cannot accurately predict 'fertility' that will be obtained with a semen sample or subject. However, when carefully validated, current CASA systems provide information important for quality assurance of semen planned for marketing, and for the understanding of the diversity of sperm responses to changes in the microenvironment in research. The four take-home messages from this review are: (1) animal species, extender or medium, specimen chamber, intensity of illumination, imaging hardware and software, instrument settings, technician, etc., all affect accuracy and precision of output values; (2) semen production facilities probably do not need a substantially different CASA system whereas biology laboratories would benefit from systems capable of imaging and tracking sperm in deep chambers for a flexible

  8. Computer-assisted sperm analysis (CASA): capabilities and potential developments.

    PubMed

    Amann, Rupert P; Waberski, Dagmar

    2014-01-01

    Computer-assisted sperm analysis (CASA) systems have evolved over approximately 40 years, through advances in devices to capture the image from a microscope, huge increases in computational power concurrent with amazing reduction in size of computers, new computer languages, and updated/expanded software algorithms. Remarkably, basic concepts for identifying sperm and their motion patterns are little changed. Older and slower systems remain in use. Most major spermatology laboratories and semen processing facilities have a CASA system, but the extent of reliance thereon ranges widely. This review describes capabilities and limitations of present CASA technology used with boar, bull, and stallion sperm, followed by possible future developments. Each marketed system is different. Modern CASA systems can automatically view multiple fields in a shallow specimen chamber to capture strobe-like images of 500 to >2000 sperm, at 50 or 60 frames per second, in clear or complex extenders, and in <2 minutes, store information for ≥ 30 frames and provide summary data for each spermatozoon and the population. A few systems evaluate sperm morphology concurrent with motion. CASA cannot accurately predict 'fertility' that will be obtained with a semen sample or subject. However, when carefully validated, current CASA systems provide information important for quality assurance of semen planned for marketing, and for the understanding of the diversity of sperm responses to changes in the microenvironment in research. The four take-home messages from this review are: (1) animal species, extender or medium, specimen chamber, intensity of illumination, imaging hardware and software, instrument settings, technician, etc., all affect accuracy and precision of output values; (2) semen production facilities probably do not need a substantially different CASA system whereas biology laboratories would benefit from systems capable of imaging and tracking sperm in deep chambers for a flexible

  9. Green's Function Analysis of Periodic Structures in Computational Electromagnetics

    NASA Astrophysics Data System (ADS)

    Van Orden, Derek

    2011-12-01

    Periodic structures are used widely in electromagnetic devices, including filters, waveguiding structures, and antennas. Their electromagnetic properties may be analyzed computationally by solving an integral equation, in which an unknown equivalent current distribution in a single unit cell is convolved with a periodic Green's function that accounts for the system's boundary conditions. Fast computation of the periodic Green's function is therefore essential to achieve high accuracy solutions of complicated periodic structures, including analysis of modal wave propagation and scattering from external sources. This dissertation first presents alternative spectral representations of the periodic Green's function of the Helmholtz equation for cases of linear periodic systems in 2D and 3D free space and near planarly layered media. Although there exist multiple representations of the periodic Green's function, most are not efficient in the important case where the fields are observed near the array axis. We present spectral-spatial representations for rapid calculation of the periodic Green's functions for linear periodic arrays of current sources residing in free space as well as near a planarly layered medium. They are based on the integral expansion of the periodic Green's functions in terms of the spectral parameters transverse to the array axis. These schemes are important for the rapid computation of the interaction among unit cells of a periodic array, and, by extension, the complex dispersion relations of guided waves. Extensions of this approach to planar periodic structures are discussed. With these computation tools established, we study the traveling wave properties of linear resonant arrays placed near surfaces, and examine the coupling mechanisms that lead to radiation into guided waves supported by the surface. This behavior is especially important to understand the properties of periodic structures printed on dielectric substrates, such as periodic
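
    For a linear array with period d along the z axis and per-cell phase shift k_z0, the free-space periodic Green's function being accelerated has the familiar spatial-sum form (a standard result, stated here for orientation in the e^{-i\omega t} convention):

        % Free-space periodic Green's function for a linear array of period d
        % along z with per-cell phase shift k_{z0}:
        G_p(\mathbf{r},\mathbf{r}') \;=\; \sum_{n=-\infty}^{\infty}
          e^{\, i k_{z0} n d}\,
          \frac{e^{\, i k \left| \mathbf{r} - \mathbf{r}' - n d \,\hat{\mathbf{z}} \right|}}
               {4 \pi \left| \mathbf{r} - \mathbf{r}' - n d \,\hat{\mathbf{z}} \right|}

    The spatial sum converges slowly when the observation point lies near the array axis, which is exactly the regime the spectral-spatial representations described above are meant to accelerate.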

  10. Summary of research in applied mathematics, numerical analysis, and computer sciences

    NASA Technical Reports Server (NTRS)

    1986-01-01

    The major categories of current ICASE research programs addressed include: numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; control and parameter identification problems, with emphasis on effective numerical methods; computational problems in engineering and physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and computer systems and software, especially vector and parallel computers.

  11. Water uptake by a maize root system - An explicit numerical 3-dimensional simulation.

    NASA Astrophysics Data System (ADS)

    Leitner, Daniel; Schnepf, Andrea; Klepsch, Sabine; Roose, Tiina

    2010-05-01

    Water is one of the most important resources for plant growth and function. Accurate modelling of unsaturated flow is essential not only for predicting water uptake but also for describing nutrient movement, which depends on water saturation and transport. In this work we present a model for water uptake. The model includes the simultaneous flow of water inside the soil and inside the root network. Water saturation in the soil volume is described by the Richards equation. Water flow inside the roots' xylem is calculated using the Poiseuille law for water flow in a cylindrical tube. The water saturation in the soil as well as water uptake of the root system is calculated numerically in three dimensions. We study water uptake of a maize plant in a confined pot under different supply scenarios. The main improvement of our approach is that the root surfaces act as spatial boundaries of the soil volume. Therefore water influx into the root is described by a surface flux instead of a volume flux, which is commonly given by an effective sink term. For the numerical computation we use the following software: The 3-dimensional maize root architecture is created by a root growth model based on L-Systems (Leitner et al 2009). A mesh of the surrounding soil volume is created using the meshing software DistMesh (Persson & Strang 2004). Using this mesh the partial differential equations are solved with the finite element method using Comsol Multiphysics 3.5a. Modelling results are related to accepted water uptake models from literature (Clausnitzer & Hopmans 1994, Roose & Fowler 2004, Javaux et al 2007). This new approach has several advantages. By considering the individual roots it is possible to analyse the influence of overlapping depletion zones due to inter-root competition. Furthermore, such simulations can be used to estimate the influence of simplifying assumptions that are made in the development of effective models. The model can be easily combined with a nutrient
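
    The two flow descriptions coupled in the model have standard forms, reproduced here in common notation (the paper's exact parameterization may differ):

        % Richards equation for volumetric water content \theta, with pressure
        % head h, elevation z, and hydraulic conductivity K:
        \frac{\partial \theta}{\partial t}
          \;=\; \nabla \cdot \left[ K(h)\, \nabla (h + z) \right]

        % Poiseuille law for axial flow Q in a xylem segment of radius r,
        % fluid viscosity \mu, under pressure gradient \partial p / \partial x:
        Q \;=\; \frac{\pi r^{4}}{8 \mu} \left( - \frac{\partial p}{\partial x} \right)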

  12. A 3-Dimensional Absorbed Dose Calculation Method Based on Quantitative SPECT for Radionuclide Therapy: Evaluation for 131I Using Monte Carlo Simulation

    PubMed Central

    Ljungberg, Michael; Sjögreen, Katarina; Liu, Xiaowei; Frey, Eric; Dewaraja, Yuni; Strand, Sven-Erik

    2009-01-01

    A general method is presented for patient-specific 3-dimensional absorbed dose calculations based on quantitative SPECT activity measurements. Methods: The computational scheme includes a method for registration of the CT image to the SPECT image and position-dependent compensation for attenuation, scatter, and collimator detector response performed as part of an iterative reconstruction method. A method for conversion of the measured activity distribution to a 3-dimensional absorbed dose distribution, based on the EGS4 (electron-gamma shower, version 4) Monte Carlo code, is also included. The accuracy of the activity quantification and the absorbed dose calculation is evaluated on the basis of realistic Monte Carlo–simulated SPECT data, using the SIMIND (simulation of imaging nuclear detectors) program and a voxel-based computer phantom. CT images are obtained from the computer phantom, and realistic patient movements are added relative to the SPECT image. The SPECT-based activity concentration and absorbed dose distributions are compared with the true ones. Results: Correction could be made for object scatter, photon attenuation, and scatter penetration in the collimator. However, inaccuracies were imposed by the limited spatial resolution of the SPECT system, for which the collimator response correction did not fully compensate. Conclusion: The presented method includes compensation for most parameters degrading the quantitative image information. The compensation methods are based on physical models and therefore are generally applicable to other radionuclides. The proposed evaluation methodology may be used as a basis for future intercomparison of different methods. PMID:12163637
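
    The paper's dose step uses full EGS4 Monte Carlo transport. As a simplified stand-in that shows the shape of the calculation, absorbed dose can be approximated by convolving the reconstructed activity map with a radially symmetric dose point kernel; the kernel below is hypothetical, not 131I data.

        import numpy as np

        # Simplified stand-in for the paper's Monte Carlo dose step: convolve a
        # 3-D activity map with a dose point kernel via FFT (circular convolution).
        shape = (32, 32, 32)
        activity = np.zeros(shape)
        activity[16, 16, 16] = 1.0                   # hypothetical point source

        z, y, x = np.indices(shape)
        r = np.sqrt((x - 16) ** 2 + (y - 16) ** 2 + (z - 16) ** 2)
        # Hypothetical kernel: exponential attenuation over inverse-square spread.
        kernel = np.exp(-r / 2.0) / (4 * np.pi * np.maximum(r, 0.5) ** 2)

        dose = np.real(np.fft.ifftn(np.fft.fftn(activity) *
                                    np.fft.fftn(np.fft.ifftshift(kernel))))
        print(f"dose at source voxel: {dose[16, 16, 16]:.4f} (arbitrary units)")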

  13. The role of computed tomography in terminal ballistic analysis.

    PubMed

    Rutty, G N; Boyce, P; Robinson, C E; Jeffery, A J; Morgan, B

    2008-01-01

    Terminal ballistics concerns the science of projectile behaviour within a target and includes wound ballistics that considers what happens when a projectile strikes a living being. A number of soft tissue ballistic simulants have been used to assess the damage to tissue caused by projectiles. Standard assessment of these materials, such as ballistic soap or ordnance gelatine, requires that the block be opened or that a mould be made to visualize the wound track. This is time consuming and may affect the accuracy of the findings especially if the block dries and alters shape during the process. Therefore, accurate numerical analysis of the permanent or temporary cavity is limited. Computed tomography (CT) potentially offers a quicker non-invasive analysis tool for this task. Four commercially purchased ballistic glycerine soap blocks were used. Each had a single firearm discharged into it from a distance of approximately 15 cm using both gunshot and shotgun projectiles. After discharge, each block was imaged by a modern 16 slice multi-detector CT scanner and analysed using 3-D reconstruction software. Using the anterior-posterior and lateral scout views and the multi-plane reconstructed images, it was possible to visualize the temporary cavity, as well as the fragmentation and dispersal pattern of the projectiles, the distance travelled and angle of dispersal within the block of each projectile or fragment. A virtual cast of the temporary cavity can also be made. Multi-detector CT with 3-D analysis software is shown to create a reliable permanent record of the projectile path allowing rapid analysis of different firearms and projectiles. PMID:17205351

  14. The role of computed tomography in terminal ballistic analysis.

    PubMed

    Rutty, G N; Boyce, P; Robinson, C E; Jeffery, A J; Morgan, B

    2008-01-01

    Terminal ballistics concerns the science of projectile behaviour within a target and includes wound ballistics that considers what happens when a projectile strikes a living being. A number of soft tissue ballistic simulants have been used to assess the damage to tissue caused by projectiles. Standard assessment of these materials, such as ballistic soap or ordnance gelatine, requires that the block be opened or that a mould be made to visualize the wound track. This is time consuming and may affect the accuracy of the findings especially if the block dries and alters shape during the process. Therefore, accurate numerical analysis of the permanent or temporary cavity is limited. Computed tomography (CT) potentially offers a quicker non-invasive analysis tool for this task. Four commercially purchased ballistic glycerine soap blocks were used. Each had a single firearm discharged into it from a distance of approximately 15 cm using both gunshot and shotgun projectiles. After discharge, each block was imaged by a modern 16 slice multi-detector CT scanner and analysed using 3-D reconstruction software. Using the anterior-posterior and lateral scout views and the multi-plane reconstructed images, it was possible to visualize the temporary cavity, as well as the fragmentation and dispersal pattern of the projectiles, the distance travelled and angle of dispersal within the block of each projectile or fragment. A virtual cast of the temporary cavity can also be made. Multi-detector CT with 3-D analysis software is shown to create a reliable permanent record of the projectile path allowing rapid analysis of different firearms and projectiles.

  15. Pulmonary Toxicity in Stage III Non-Small Cell Lung Cancer Patients Treated With High-Dose (74 Gy) 3-Dimensional Conformal Thoracic Radiotherapy and Concurrent Chemotherapy Following Induction Chemotherapy: A Secondary Analysis of Cancer and Leukemia Group B (CALGB) Trial 30105

    SciTech Connect

    Salama, Joseph K.; Stinchcombe, Thomas E.; Gu Lin; Wang Xiaofei; Morano, Karen; Bogart, Jeffrey A.; Crawford, Jeffrey C.; Socinski, Mark A.; Blackstock, A. William; Vokes, Everett E.

    2011-11-15

    Purpose: Cancer and Leukemia Group B (CALGB) 30105 tested two different concurrent chemoradiotherapy platforms with high-dose (74 Gy) three-dimensional conformal radiotherapy (3D-CRT) after two cycles of induction chemotherapy for Stage IIIA/IIIB non-small cell lung cancer (NSCLC) patients to determine if either could achieve a primary endpoint of >18-month median survival. Final results of 30105 demonstrated that induction carboplatin and gemcitabine and concurrent gemcitabine 3D-CRT was not feasible because of treatment-related toxicity. However, induction and concurrent carboplatin/paclitaxel with 74 Gy 3D-CRT had a median survival of 24 months, and is the basis for the experimental arm in CALGB 30610/RTOG 0617/N0628. We conducted a secondary analysis of all patients to determine predictors of treatment-related pulmonary toxicity. Methods and Materials: Patient, tumor, and treatment-related variables were analyzed to determine their relation with treatment-related pulmonary toxicity. Results: Older age, higher N stage, larger planning target volume (PTV)1, smaller total lung volume/PTV1 ratio, larger V20, and larger mean lung dose were associated with increasing pulmonary toxicity on univariate analysis. Multivariate analysis confirmed that V20 and nodal stage as well as treatment with concurrent gemcitabine were associated with treatment-related toxicity. A high-risk group comprising patients with N3 disease and V20 >38% was associated with 80% of Grades 3-5 pulmonary toxicity cases. Conclusions: Elevated V20 and N3 disease status are important predictors of treatment related pulmonary toxicity in patients treated with high-dose 3D-CRT and concurrent chemotherapy. Further studies may use these metrics in considering patients for these treatments.

  16. Computer automated movement detection for the analysis of behavior.

    PubMed

    Ramazani, Roseanna B; Krishnan, Harish R; Bergeson, Susan E; Atkinson, Nigel S

    2007-05-15

    Currently, measuring ethanol behaviors in flies depends on expensive image analysis software or time-intensive experimenter observation. We have designed an automated system for the collection and analysis of locomotor behavior data, using the IEEE 1394 acquisition program dvgrab, the image toolkit ImageMagick and the programming language Perl. In the proposed method, flies are placed in a clear container and a computer-controlled camera takes pictures at regular intervals. Digital subtraction removes the background and non-moving flies, leaving white pixels where movement has occurred. These pixels are tallied, giving a value that corresponds to the number of animals that have moved between images. Perl scripts automate these processes, allowing compatibility with high-throughput genetic screens. Four experiments demonstrate the utility of this method, the first showing heat-induced locomotor changes, the second showing tolerance to ethanol in a climbing assay, the third showing tolerance to ethanol by scoring the recovery of individual flies, and the fourth showing a mouse's preference for a novel object. Our lab will use this method to conduct a genetic screen for ethanol-induced hyperactivity and sedation; however, it could also be used to analyze locomotor behavior of any organism. PMID:17335906
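
    The digital-subtraction step is simple enough to sketch directly. The arrays below are synthetic stand-ins for consecutive grayscale frames (the original pipeline drove dvgrab and ImageMagick from Perl), and the threshold is a hypothetical noise cutoff.

        import numpy as np

        # Sketch of the digital-subtraction step: difference consecutive grayscale
        # frames, threshold, and tally changed pixels as a movement score.
        rng = np.random.default_rng(3)
        frame_a = rng.integers(0, 30, size=(240, 320)).astype(np.int16)  # noise floor
        frame_b = frame_a.copy()
        frame_b[100:110, 150:165] += 120        # a "fly" moved into this region

        THRESHOLD = 50                          # hypothetical noise cutoff
        moved = np.abs(frame_b - frame_a) > THRESHOLD
        print(f"movement score: {int(moved.sum())} changed pixels")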

  17. Quantitative Computed Tomography and Image Analysis for Advanced Muscle Assessment

    PubMed Central

    Edmunds, Kyle Joseph; Gíslason, Magnus K.; Arnadottir, Iris D.; Marcante, Andrea; Piccione, Francesco; Gargiulo, Paolo

    2016-01-01

    Medical imaging is of particular interest in the field of translational myology, as extant literature describes the utilization of a wide variety of techniques to non-invasively recapitulate and quantify various internal and external tissue morphologies. In the clinical context, medical imaging remains a vital tool for diagnostics and investigative assessment. This review outlines the results from several investigations on the use of computed tomography (CT) and image analysis techniques to assess muscle conditions and degenerative processes due to aging or pathological conditions. Herein, we detail the acquisition of spiral CT images and the use of advanced image analysis tools to characterize muscles in 2D and 3D. Results from these studies recapitulate changes in tissue composition within muscles, as visualized by the association of tissue types to specified Hounsfield Unit (HU) values for fat, loose connective tissue or atrophic muscle, and normal muscle, including fascia and tendon. We show how results from these analyses can be presented as both average HU values and compositions with respect to total muscle volumes, demonstrating the reliability of these tools to monitor, assess and characterize muscle degeneration. PMID:27478562
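
    The HU-based composition measures can be computed by binning voxel values into tissue classes and normalizing by total volume. The HU ranges below are placeholders for illustration, not the thresholds used in the cited studies.

        import numpy as np

        # Sketch of HU-based muscle composition: bin voxels into tissue classes
        # and report each class as a fraction of total volume. HU ranges here are
        # hypothetical placeholders, not the cited studies' thresholds.
        rng = np.random.default_rng(4)
        hu = rng.normal(20, 45, size=50_000)    # hypothetical voxel HU values

        classes = {
            "fat": (-200, -10),
            "loose/atrophic": (-10, 30),
            "normal muscle": (30, 100),
        }
        for name, (lo, hi) in classes.items():
            frac = np.mean((hu >= lo) & (hu < hi))
            print(f"{name:>16s}: {100 * frac:.1f}% of voxels")
        print(f"mean HU: {hu.mean():.1f}")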

  18. Variance Analysis and Comparison in Computer-Aided Design

    NASA Astrophysics Data System (ADS)

    Ullrich, T.; Schiffer, T.; Schinko, C.; Fellner, D. W.

    2011-09-01

    The need to analyze and visualize differences of very similar objects arises in many research areas: mesh compression, scan alignment, nominal/actual value comparison, quality management, and surface reconstruction to name a few. In computer graphics, for example, differences of surfaces are used for analyzing mesh processing algorithms such as mesh compression. They are also used to validate reconstruction and fitting results of laser-scanned surfaces. As laser scanning has become very important for the acquisition and preservation of artifacts, scanned representations are used for documentation as well as analysis of ancient objects. Detailed mesh comparisons can reveal the smallest changes and damages. These analysis and documentation tasks are needed not only in the context of cultural heritage but also in engineering and manufacturing. Differences of surfaces are analyzed to check the quality of production. Our contribution to this problem is a workflow which compares a reference (nominal) surface with an actual, laser-scanned data set. The reference surface is a procedural model whose accuracy and systematics describe the semantic properties of an object, whereas the laser-scanned object is a real-world data set without any additional semantic information.

  19. Computer Analysis of the Leaf Movements of Pinto Beans 1

    PubMed Central

    Hoshizaki, Takashi; Hamner, K. C.

    1969-01-01

    Computer analysis was used for the detection of rhythmic components and the estimation of period length in leaf movement records. The results of this study indicated that spectral analysis can be profitably used to determine rhythmic components in leaf movements. In Pinto bean plants (Phaseolus vulgaris L.) grown for 28 days under continuous light of 750 ft-c and at a constant temperature of 28°, there was only 1 highly significant rhythmic component in the leaf movements. The period of this rhythm was 27.3 hr. In plants grown at 20°, there were 2 highly significant rhythmic components: 1 of 13.8 hr and a much stronger 1 of 27.3 hr. At 15°, the highly significant rhythmic components were also 27.3 and 13.8 hr in length but were of equal intensity. Random movements less than 9 hr in length became very pronounced at this temperature. At 10°, no significant rhythm was found in the leaf movements. At 5°, the leaf movements ceased within 1 day. PMID:16657155
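
    The spectral-analysis step can be illustrated with a simple periodogram: sample the leaf position at regular intervals, transform, and read off the dominant period. The record below is synthetic, constructed to contain a 27.3-hr component like the one reported.

        import numpy as np

        # Periodogram sketch: find the dominant period in an evenly sampled record.
        # The synthetic record contains a 27.3-hr rhythm like the one reported.
        dt_hours = 0.5
        t = np.arange(0, 28 * 24, dt_hours)                    # 28 days of samples
        rng = np.random.default_rng(5)
        angle = np.sin(2 * np.pi * t / 27.3) + 0.3 * rng.normal(size=t.size)

        spectrum = np.abs(np.fft.rfft(angle - angle.mean())) ** 2
        freqs = np.fft.rfftfreq(t.size, d=dt_hours)            # cycles per hour
        dominant = freqs[np.argmax(spectrum[1:]) + 1]          # skip the DC bin
        print(f"dominant period: {1 / dominant:.1f} hr")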

  20. Analysis and design methodology for VLSI computing networks. Final report

    SciTech Connect

    Lev-Ari, H.

    1984-08-01

    Several methods for modeling and analysis of parallel algorithms and architectures have been proposed in recent years. These include recursion-type methods, like recursion equations, z-transform descriptions and do-loops in high-level programming languages, and precedence-graph-type methods like data-flow graphs (marked graphs) and related Petri-net derived models. Most efforts have recently been directed towards developing methodologies for structured parallel algorithms and architectures and, in particular, for systolic-array-like systems. Some important properties of parallel algorithms have been identified in the process of this research effort. These include executability (the absence of deadlocks), pipelinability, regularity of structure, locality of interconnections, and dimensionality. The research has also demonstrated the feasibility of multirate systolic arrays with different rates of data propagation along different directions in the array. This final report presents a new methodology for modeling and analysis of parallel algorithms and architectures. This methodology provides a unified conceptual framework, called a modular computing network, that clearly displays the key properties of parallel systems.

  1. Computer automated movement detection for the analysis of behavior

    PubMed Central

    Ramazani, Roseanna B.; Krishnan, Harish R.; Bergeson, Susan E.; Atkinson, Nigel S.

    2007-01-01

    Currently, measuring ethanol behaviors in flies depends on expensive image analysis software or time-intensive experimenter observation. We have designed an automated system for the collection and analysis of locomotor behavior data, using the IEEE 1394 acquisition program dvgrab, the image toolkit ImageMagick and the programming language Perl. In the proposed method, flies are placed in a clear container and a computer-controlled camera takes pictures at regular intervals. Digital subtraction removes the background and non-moving flies, leaving white pixels where movement has occurred. These pixels are tallied, giving a value that corresponds to the number of animals that have moved between images. Perl scripts automate these processes, allowing compatibility with high-throughput genetic screens. Four experiments demonstrate the utility of this method, the first showing heat-induced locomotor changes, the second showing tolerance to ethanol in a climbing assay, the third showing tolerance to ethanol by scoring the recovery of individual flies, and the fourth showing a mouse's preference for a novel object. Our lab will use this method to conduct a genetic screen for ethanol-induced hyperactivity and sedation; however, it could also be used to analyze locomotor behavior of any organism. PMID:17335906

  2. Consequence analysis in LPG installation using an integrated computer package.

    PubMed

    Ditali, S; Colombi, M; Moreschini, G; Senni, S

    2000-01-01

    This paper presents a prototype computer code, Atlantide, developed to assess the consequences of accidental events that can occur in an LPG storage plant. Atlantide is designed to be simple yet adequate for the consequence analysis required by Italian legislation implementing the Seveso Directive. The application of Atlantide is appropriate for LPG storage/transfer installations. The models and correlations implemented in the code cover flashing liquid releases, heavy gas dispersion and other typical phenomena such as BLEVE/Fireball. On the basis of the operating/design characteristics, the code allows the study of the relevant accidental events from the evaluation of the release rate (liquid, gaseous and two-phase) in the unit involved, through the analysis of the subsequent evaporation and dispersion, up to the assessment of the final phenomena of fire and explosion. This is done taking as reference simplified Event Trees which describe the evolution of accidental scenarios, taking into account the most likely meteorological conditions, the different release situations and other features typical of an LPG installation. The limited input data required and the automatic linking between the single models, which are activated in a defined sequence depending on the accidental event selected, minimize both the time required for the risk analysis and the possibility of errors. Models and equations implemented in Atlantide have been selected from the public literature or in-house developed software and tailored to be easy to use and fast to run but, nevertheless, able to provide realistic simulation of the accidental event as well as reliable results, in terms of physical effects and hazardous areas. The results have been compared with those of other internationally recognized codes and with the criteria adopted by Italian authorities to verify the Safety Reports for LPG

  3. Computational and Statistical Analysis of Protein Mass Spectrometry Data

    PubMed Central

    Noble, William Stafford; MacCoss, Michael J.

    2012-01-01

    High-throughput proteomics experiments involving tandem mass spectrometry produce large volumes of complex data that require sophisticated computational analyses. As such, the field offers many challenges for computational biologists. In this article, we briefly introduce some of the core computational and statistical problems in the field and then describe a variety of outstanding problems that readers of PLoS Computational Biology might be able to help solve. PMID:22291580

  4. Linguistic Analysis of Natural Language Communication with Computers.

    ERIC Educational Resources Information Center

    Thompson, Bozena Henisz

    Interaction with computers in natural language requires a language that is flexible and suited to the task. This study of natural dialogue was undertaken to reveal those characteristics which can make computer English more natural. Experiments were made in three modes of communication: face-to-face, terminal-to-terminal, and human-to-computer,…

  5. Computational electromagnetic analysis of plasmonic effects in interdigital photodetectors

    NASA Astrophysics Data System (ADS)

    Hill, Avery M.; Nusir, Ahmad I.; Nguyen, Paul V.; Manasreh, Omar M.; Herzog, Joseph B.

    2014-09-01

    Plasmonic nanostructures have been shown to act as optical antennas that enhance optical devices. This study focuses on computational electromagnetic (CEM) analysis of GaAs photodetectors with gold interdigital electrodes. Experiments have shown that the photoresponse of these devices depends greatly on the electrode spacing and the polarization of the incident light: smaller electrode spacing and transverse polarization give rise to a larger photoresponse. This computational study simulates the optical properties of these devices to determine what plasmonic properties and optical enhancement they may have. The models solve Maxwell's equations with a finite element method (FEM) algorithm provided by the software COMSOL Multiphysics 4.4. The preliminary simulation results follow the same trends seen in the experimental data: the spectral response increases as the electrode spacing decreases. The simulations also show that incident light polarized transversely across the electrodes produces a larger photocurrent than longitudinally polarized light, a dependency similar to other plasmonic devices. The simulation results compare well with the experimental data. This work will also model enhancement effects in nanostructure devices with dimensions smaller than the current samples, leading the way for future nanoscale devices. Understanding the potential effects of decreased spacing opens the door to a new set of smaller-scale devices, potentially with a higher level of enhancement. In addition, precise modeling and understanding of the effects of the parameters provides avenues to optimize the enhancement of these structures, making more efficient photodetectors. Similar structures could also potentially be used for enhanced photovoltaics.

  6. Computer based imaging and analysis of root gravitropism.

    PubMed

    Evans, M L; Ishikawa, H

    1997-06-01

    Two key issues in studies of the nature of the gravitropic response in roots have been the determination of the precise pattern of differential elongation responsible for downward bending and the identification of the cells that show the initial motor response. The main approach for examining patterns of differential growth during root gravitropic curvature has been to apply markers to the root surface and photograph the root at regular intervals during gravitropic curvature. Although these studies have provided valuable information on the characteristics of the gravitropic motor response in roots, their labor intensive nature limits sample size and discourages both high frequency of sampling and depth of analysis of surface expansion data. In this brief review we describe the development of computer-based video analysis systems for automated measurement of root growth and shape change and discuss some key features of the root gravitropic response that have been revealed using this methodology. We summarize the capabilities of several new pieces of software designed to measure growth and shape changes in graviresponding roots and describe recent progress in developing analysis systems for studying the small, but experimentally popular, primary roots of Arabidopsis. A key finding revealed by such studies is that the initial gravitropic response of roots of maize and Arabidopsis occurs in the distal elongation zone (DEZ) near the root apical meristem, not in the main elongation zone. Another finding is that the initiation of rapid elongation in the DEZ following gravistimulation appears to be related to rapid membrane potential changes in this region of the root. These observations have provided the incentive for ongoing studies examining possible links between potential growth modifying factors (auxin, calcium, protons) and gravistimulated changes in membrane potential and growth patterns in the DEZ. PMID:11540122

  7. Computer based imaging and analysis of root gravitropism

    NASA Technical Reports Server (NTRS)

    Evans, M. L.; Ishikawa, H.

    1997-01-01

    Two key issues in studies of the nature of the gravitropic response in roots have been the determination of the precise pattern of differential elongation responsible for downward bending and the identification of the cells that show the initial motor response. The main approach for examining patterns of differential growth during root gravitropic curvature has been to apply markers to the root surface and photograph the root at regular intervals during gravitropic curvature. Although these studies have provided valuable information on the characteristics of the gravitropic motor response in roots, their labor intensive nature limits sample size and discourages both high frequency of sampling and depth of analysis of surface expansion data. In this brief review we describe the development of computer-based video analysis systems for automated measurement of root growth and shape change and discuss some key features of the root gravitropic response that have been revealed using this methodology. We summarize the capabilities of several new pieces of software designed to measure growth and shape changes in graviresponding roots and describe recent progress in developing analysis systems for studying the small, but experimentally popular, primary roots of Arabidopsis. A key finding revealed by such studies is that the initial gravitropic response of roots of maize and Arabidopsis occurs in the distal elongation zone (DEZ) near the root apical meristem, not in the main elongation zone. Another finding is that the initiation of rapid elongation in the DEZ following gravistimulation appears to be related to rapid membrane potential changes in this region of the root. These observations have provided the incentive for ongoing studies examining possible links between potential growth modifying factors (auxin, calcium, protons) and gravistimulated changes in membrane potential and growth patterns in the DEZ.

  8. Computer image analysis of toxic fatty degeneration in rat liver.

    PubMed

    Stetkiewicz, J; Zieliński, K; Stetkiewicz, I; Koktysz, R

    1989-01-01

    Fatty degeneration of the liver is one of the most frequently observed pathological changes in the experimental estimation of the toxicity of chemical compounds. The intensity of this kind of damage is most often graded by means of a generally accepted scale of points, whereas the classification is performed according to the subjective "feeling" of the pathologist. In modern pathological diagnostics, computer analysis of images is used to perform an objective estimation of the degree of damage to various organs. In order to check the usefulness of this kind of method, comparative biochemical and morphometrical studies were undertaken in trichloroethylene (TRI)-induced fatty degeneration of the liver. TRI was administered to rats intragastrically, in single doses: 1/2, 1/3, 1/4, 1/6 and 1/18 DL50. Twenty-four hours after the administration, the animals were sacrificed. The content of triglycerides in the liver was determined according to Folch et al. (1956). Simple lipids in the histochemical samples were detected by staining with a lipotropic dye, Fat Red 7B. The area of fatty degeneration was estimated in the microscopic samples by the use of an automatic image analyser, IBAS 2000 (Kontron). The morphometrical data concerning the area of fatty degeneration in the liver showed a high degree of correlation with the content of triglycerides (r = 0.89) and the dose of TRI (r = 0.96). The degree of correlation between the biochemical data and the dose of TRI was 0.88. The morphometrical studies performed have proved to be of great use in estimating the degree of fatty degeneration in the liver. This method enables precise, quantitative measurement of this sort of liver damage in material prepared for routine histopathological analysis. It requires, however, the application of a specialized device for quantitative image analysis.

  9. Computational Analysis of the Hypothalamic Control of Food Intake

    PubMed Central

    Tabe-Bordbar, Shayan; Anastasio, Thomas J.

    2016-01-01

    Food-intake control is mediated by a heterogeneous network of different neural subtypes, distributed over various hypothalamic nuclei and other brain structures, in which each subtype can release more than one neurotransmitter or neurohormone. The complexity of the interactions of these subtypes poses a challenge to understanding their specific contributions to food-intake control, and apparent consistencies in the dataset can be contradicted by new findings. For example, the growing consensus that arcuate nucleus neurons expressing Agouti-related peptide (AgRP neurons) promote feeding, while those expressing pro-opiomelanocortin (POMC neurons) suppress feeding, is contradicted by findings that low AgRP neuron activity and high POMC neuron activity can be associated with high levels of food intake. Similarly, the growing consensus that GABAergic neurons in the lateral hypothalamus suppress feeding is contradicted by findings suggesting the opposite. Yet the complexity of the food-intake control network admits many different network behaviors. It is possible that anomalous associations between the responses of certain neural subtypes and feeding are actually consistent with known interactions, but their effect on feeding depends on the responses of the other neural subtypes in the network. We explored this possibility through computational analysis. We made a computer model of the interactions between the hypothalamic and other neural subtypes known to be involved in food-intake control, and optimized its parameters so that model behavior matched observed behavior over an extensive test battery. We then used specialized computational techniques to search the entire model state space, where each state represents a different configuration of the responses of the units (model neural subtypes) in the network. We found that the anomalous associations between the responses of certain hypothalamic neural subtypes and feeding are actually consistent with the known structure
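
    The exhaustive state-space search described can be pictured with a toy version: enumerate every on/off configuration of a few model units and keep those consistent with a set of signed interaction constraints. The units and constraints below are invented for illustration; they are not the paper's model or its optimization procedure.

        from itertools import product

        # Toy state-space search: enumerate all on/off configurations of a few
        # units and keep those consistent with signed interaction constraints.
        units = ["AgRP", "POMC", "LH_GABA", "feeding"]
        # (source, target, sign): +1 means "active source pushes target on",
        # -1 means "active source pushes target off". Invented edges.
        edges = [("AgRP", "feeding", +1), ("POMC", "feeding", -1),
                 ("AgRP", "POMC", -1)]

        def consistent(state):
            """Keep a state if no active source's push on a target is violated."""
            for src, tgt, sign in edges:
                if state[src] and state[tgt] != (sign > 0):
                    return False
            return True

        states = [dict(zip(units, bits))
                  for bits in product([0, 1], repeat=len(units))]
        kept = [s for s in states if consistent(s)]
        print(f"{len(kept)} of {len(states)} states are consistent")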

  11. 3-Dimensional Scene Perception during Active Electrolocation in a Weakly Electric Pulse Fish

    PubMed Central

    von der Emde, Gerhard; Behr, Katharina; Bouton, Béatrice; Engelmann, Jacob; Fetz, Steffen; Folde, Caroline

    2010-01-01

    Weakly electric fish use active electrolocation for object detection and orientation in their environment even in complete darkness. The African mormyrid Gnathonemus petersii can detect object parameters, such as material, size, shape, and distance. Here, we tested whether individuals of this species can learn to identify 3-dimensional objects independently of the training conditions and independently of the object's position in space (rotation-invariance; size-constancy). Individual G. petersii were trained in a two-alternative forced-choice procedure to electrically discriminate between a 3-dimensional object (S+) and several alternative objects (S−). Fish were then tested whether they could identify the S+ among novel objects and whether single components of S+ were sufficient for recognition. Size-constancy was investigated by presenting the S+ together with a larger version at different distances. Rotation-invariance was tested by rotating S+ and/or S− in 3D. Our results show that electrolocating G. petersii could (1) recognize an object independently of the S− used during training. When only single components of a complex S+ were offered, recognition of S+ was more or less affected depending on which part was used. (2) Object-size was detected independently of object distance, i.e. fish showed size-constancy. (3) The majority of the fishes tested recognized their S+ even if it was rotated in space, i.e. these fishes showed rotation-invariance. (4) Object recognition was restricted to the near field around the fish and failed when objects were moved more than about 4 cm away from the animals. Our results indicate that even in complete darkness our G. petersii were capable of complex 3-dimensional scene perception using active electrolocation. PMID:20577635

  12. Incorporating a 3-dimensional printer into the management of early-stage cervical cancer.

    PubMed

    Baek, Min-Hyun; Kim, Dae-Yeon; Kim, Namkug; Rhim, Chae Chun; Kim, Jong-Hyeok; Nam, Joo-Hyun

    2016-08-01

    We used a 3-dimensional (3D) printer to create anatomical replicas of real lesions and tested its application in cervical cancer. Our study patient decided to undergo radical hysterectomy after seeing her 3D model, which was then used to plan and simulate this surgery. Using 3D printers to create patient-specific 3D tumor models may help cervical cancer patients make treatment decisions. This technology may also lead to better surgical and oncological outcomes for cervical cancer patients. J. Surg. Oncol. 2016;114:150-152. © 2016 Wiley Periodicals, Inc. PMID:27222318

  13. Patterned 3-dimensional metal grid electrodes as alternative electron collectors in dye-sensitized solar cells.

    PubMed

    Chua, Julianto; Mathews, Nripan; Jennings, James R; Yang, Guangwu; Wang, Qing; Mhaisalkar, Subodh G

    2011-11-21

    We describe the application of 3-dimensional metal grid electrodes (3D-MGEs) as electron collectors in dye-sensitized solar cells (DSCs) as a replacement for fluorine-doped tin oxide (FTO) electrodes. Requirements, structure, advantages, and limitations of the metal grid electrodes are discussed. Solar conversion efficiencies of 6.2% have been achieved in 3D-MGE-based solar cells, comparable to that of cells fabricated on FTO (7.1%). The charge transport properties and collection efficiencies in these novel solar cells have been studied using electrochemical impedance spectroscopy.

  15. Finite element modelling of a 3 dimensional dielectrophoretic flow separator device for optimal bioprocessing conditions.

    PubMed

    Fatoyinbo, H O; Hughes, M P

    2004-01-01

    Planar 2-dimensional dielectrophoresis electrode geometries are limited to handling fluid volumes ranging from picolitres to hundreds of microlitres per hour. A 3-dimensional electrode system has been developed that is capable of handling significantly larger volumes of fluid. Using finite element modeling, the electric field distribution within various bore sizes was computed. From these simulations it is possible to optimize the bioprocessing factors influencing the performance of a dielectrophoretic separator. Process calculations have shown that flow rates of 25 ml hr⁻¹ or more can be attained for the separation of heterogeneous populations of bio-particles based on their dielectric properties.

  16. Computational Analysis of a Prototype Martian Rotorcraft Experiment

    NASA Technical Reports Server (NTRS)

    Corfeld, Kelly J.; Strawn, Roger C.; Long, Lyle N.

    2001-01-01

    This paper presents Reynolds-averaged Navier-Stokes calculations for a prototype Martian rotorcraft. The computations are intended for comparison with an ongoing Mars rotor hover test at NASA Ames Research Center. These computational simulations present a new and challenging problem, since rotors that operate on Mars will experience a unique low Reynolds number and high Mach number environment. Computed results for the 3-D rotor differ substantially from 2-D sectional computations in that the 3-D results exhibit a stall delay phenomenon caused by rotational forces along the blade span. Computational results have yet to be compared to experimental data, but computed performance predictions match the experimental design goals fairly well. In addition, the computed results provide a high level of detail in the rotor wake and blade surface aerodynamics. These details provide an important supplement to the expected experimental performance data.

  18. Graphics Flutter Analysis Methods, an interactive computing system at Lockheed-California Company

    NASA Technical Reports Server (NTRS)

    Radovcich, N. A.

    1975-01-01

    An interactive computer graphics system, Graphics Flutter Analysis Methods (GFAM), was developed to complement FAMAS, a matrix-oriented batch computing system, and other computer programs in performing complex numerical calculations using a fully integrated data management system. GFAM has many of the matrix operation capabilities found in FAMAS, but on a smaller scale, and is utilized when the analysis requires a high degree of interaction between the engineer and computer, and schedule constraints exclude the use of batch entry programs. Applications of GFAM to a variety of preliminary design, development design, and project modification programs suggest that interactive flutter analysis using matrix representations is a feasible and cost effective computing tool.

  19. EVA worksite analysis--use of computer analysis for EVA operations development and execution.

    PubMed

    Anderson, D

    1999-01-01

    To sustain the rate of extravehicular activity (EVA) required to assemble and maintain the International Space Station, we must enhance our ability to plan, train for, and execute EVAs. An underlying analysis capability has been developed to ensure EVA access to all external worksites as a starting point for ground training, to generate information needed for on-orbit training, and to react quickly to develop contingency EVA plans, techniques, and procedures. This paper describes the use of computer-based EVA worksite analysis techniques for EVA worksite design. EVA worksite analysis has been used to design 80% of EVA worksites on the U.S. portion of the International Space Station. With the launch of the first U.S. element of the station, EVA worksite analysis is being developed further to support real-time analysis of unplanned EVA operations. This paper describes this development and deployment of EVA worksite analysis for International Space Station (ISS) mission support.

  20. A simple and efficient quasi 3-dimensional viscoelastic model and software for simulation of tapping-mode atomic force microscopy

    DOE PAGES

    Solares, Santiago D.

    2015-11-26

    This study introduces a quasi-3-dimensional (Q3D) viscoelastic model and software tool for use in atomic force microscopy (AFM) simulations. The model is based on a 2-dimensional array of standard linear solid (SLS) model elements. The well-known 1-dimensional SLS model is a textbook example in viscoelastic theory but is relatively new in AFM simulation. It is the simplest model that offers a qualitatively correct description of the most fundamental viscoelastic behaviors, namely stress relaxation and creep. However, this simple model does not reflect the correct curvature in the repulsive portion of the force curve, so its application in the quantitative interpretation of AFM experiments is relatively limited. In the proposed Q3D model the use of an array of SLS elements leads to force curves that have the typical upward curvature in the repulsive region, while still offering a very low computational cost. Furthermore, the use of a multidimensional model allows for the study of AFM tips having non-ideal geometries, which can be extremely useful in practice. Examples of typical force curves are provided for single- and multifrequency tapping-mode imaging, for both of which the force curves exhibit the expected features. Lastly, a software tool to simulate amplitude and phase spectroscopy curves is provided, which can be easily modified to implement other control schemes in order to aid in the interpretation of AFM experiments.
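    The construction lends itself to a compact sketch. The following toy version assumes a parabolic tip, a grid of independent SLS (Zener) elements, and hypothetical spring and dashpot values; the total force is the sum over contacting elements, which is what produces the upward curvature in the repulsive region as the contact area grows:

    ```python
    import numpy as np

    # Minimal sketch of the Q3D idea: an N x N array of standard-linear-solid
    # (SLS) elements indented by a parabolic tip. All parameters are hypothetical.
    N, dx = 41, 1e-9                  # lateral grid points and spacing (m)
    k1, k2, c = 0.05, 0.5, 1e-8      # SLS springs (N/m) and dashpot (N s/m)
    R = 10e-9                         # tip radius of curvature (m)
    dt, steps = 1e-9, 4000            # Euler time step (s) and step count

    x = (np.arange(N) - N // 2) * dx
    r2 = x[:, None] ** 2 + x[None, :] ** 2   # squared lateral distance to tip axis
    s = np.zeros((N, N))                     # internal (dashpot) displacements

    for n in range(steps):
        z = 5e-9 - 7e-9 * n / steps                    # tip apex approaches surface
        u = np.maximum(0.0, -(z + r2 / (2 * R)))       # local indentation map
        f = np.where(u > 0, k1 * u + k2 * (u - s), 0)  # per-element SLS force
        s += dt * (k2 / c) * (u - s)                   # relax internal variable
        if n % 800 == 0:
            print(f"z = {z * 1e9:6.2f} nm   F = {f.sum() * 1e9:8.4f} nN")
    ```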

  4. Experimental Validation of Plastic Mandible Models Produced by a “Low-Cost” 3-Dimensional Fused Deposition Modeling Printer

    PubMed Central

    Maschio, Federico; Pandya, Mirali; Olszewski, Raphael

    2016-01-01

    Background The objective of this study was to investigate the accuracy of 3-dimensional (3D) plastic (ABS) models generated using a low-cost 3D fused deposition modeling printer. Material/Methods Two human dry mandibles were scanned with a cone beam computed tomography (CBCT) Accuitomo device. Preprocessing consisted of 3D reconstruction with Maxilim software and STL file repair with Netfabb software. Then, the data were used to print 2 plastic replicas with a low-cost 3D fused deposition modeling printer (UP Plus 2®). Two independent observers performed the identification of 26 anatomic landmarks on the 4 mandibles (2 dry and 2 replicas) with a 3D measuring arm. Each observer repeated the identifications 20 times. The comparison between the dry and plastic mandibles was based on 13 distances: 8 distances less than 12 mm and 5 distances greater than 12 mm. Results The mean absolute difference (MAD) was 0.37 mm, and the mean dimensional error (MDE) was 3.76%. The MDE decreased to 0.93% for distances greater than 12 mm. Conclusions Plastic models generated using the low-cost 3D printer UP Plus 2® provide dimensional accuracy comparable to that of other well-established rapid prototyping technologies. Validated low-cost 3D printers could represent a step toward better accessibility of rapid prototyping technologies in the medical field. PMID:27003456
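    The two accuracy metrics are simple to compute once paired distances are in hand. A sketch with made-up distance values (not the study's measurements):

    ```python
    import numpy as np

    # Hypothetical paired distances (mm): dry mandible vs its printed replica.
    ref   = np.array([8.2, 10.5, 11.8, 25.4, 40.1])   # dry-bone distances
    model = np.array([8.6, 10.1, 12.2, 25.2, 40.6])   # printed-replica distances

    abs_diff = np.abs(model - ref)
    mad = abs_diff.mean()                      # mean absolute difference (mm)
    mde = (abs_diff / ref).mean() * 100        # mean dimensional error (%)
    print(f"MAD = {mad:.2f} mm, MDE = {mde:.2f} %")

    # The paper reports MDE dropping for distances > 12 mm; same computation
    # restricted to the long distances:
    long_ = ref > 12
    print(f"MDE (>12 mm) = {(abs_diff[long_] / ref[long_]).mean() * 100:.2f} %")
    ```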

  5. Virtual model surgery and wafer fabrication using 2-dimensional cephalograms, 3-dimensional virtual dental models, and stereolithographic technology.

    PubMed

    Choi, Jin-Young; Hwang, Jong-Min; Baek, Seung-Hak

    2012-02-01

    Although several 3-dimensional virtual model surgery (3D-VMS) programs have been introduced to reduce time-consuming manual laboratory steps and potential errors, these programs still require 3D computed tomography (3D-CT) data and involve complex computerized maneuvers. Because it is difficult to take 3D-CTs for all cases, a new VMS program using 2D lateral and posteroanterior cephalograms and 3D virtual dental models (the 2.5D-VMS program; 3Txer version 2.5, Orapix, Seoul, Korea) has recently been introduced. The purposes of this article were to present the methodology of the 2.5D-VMS program and to verify the accuracy of intermediate surgical wafers fabricated with the stereolithographic technique. Two cases successfully treated using the 2.5D-VMS program are presented. There was no significant difference in the position of the upper dentition after surgical movement between 2.5D-VMS and 3D-VMS in 18 samples (less than 0.10 mm, P > .05, Wilcoxon signed rank test). The 2.5D-VMS program can be regarded as an effective alternative to 3D-VMS for cases in which 3D-CT data are not available.

  6. Computer-aided pulmonary image analysis in small animal models

    PubMed Central

    Xu, Ziyue; Bagci, Ulas; Mansoor, Awais; Kramer-Marek, Gabriela; Luna, Brian; Kubler, Andre; Dey, Bappaditya; Foster, Brent; Papadakis, Georgios Z.; Camp, Jeremy V.; Jonsson, Colleen B.; Bishai, William R.; Jain, Sanjay; Udupa, Jayaram K.; Mollura, Daniel J.

    2015-01-01

    Purpose: To develop an automated pulmonary image analysis framework for infectious lung diseases in small animal models. Methods: The authors describe a novel pathological lung and airway segmentation method for small animals. The proposed framework includes identification of abnormal imaging patterns pertaining to infectious lung diseases. First, the authors’ system estimates an expected lung volume by utilizing a regression function between total lung capacity and approximated rib cage volume. A significant difference between the expected lung volume and the initial lung segmentation indicates the presence of severe pathology, and invokes a machine learning based abnormal imaging pattern detection system next. The final stage of the proposed framework is the automatic extraction of airway tree for which new affinity relationships within the fuzzy connectedness image segmentation framework are proposed by combining Hessian and gray-scale morphological reconstruction filters. Results: 133 CT scans were collected from four different studies encompassing a wide spectrum of pulmonary abnormalities pertaining to two commonly used small animal models (ferret and rabbit). Sensitivity and specificity were greater than 90% for pathological lung segmentation (average dice similarity coefficient > 0.9). While qualitative visual assessments of airway tree extraction were performed by the participating expert radiologists, for quantitative evaluation the authors validated the proposed airway extraction method by using publicly available EXACT’09 data set. Conclusions: The authors developed a comprehensive computer-aided pulmonary image analysis framework for preclinical research applications. The proposed framework consists of automatic pathological lung segmentation and accurate airway tree extraction. The framework has high sensitivity and specificity; therefore, it can contribute advances in preclinical research in pulmonary diseases. PMID:26133591
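    The first stage of the pipeline, estimating an expected lung volume from rib cage volume and flagging severe pathology, reduces to a one-variable regression. A sketch under assumed values (the regression data, tolerance and volumes below are hypothetical):

    ```python
    import numpy as np

    # Regression function between total lung capacity and approximated rib cage
    # volume, fitted on healthy animals. All numbers are hypothetical.
    rib_cage_vol = np.array([38.0, 42.0, 51.0, 55.0, 60.0])  # mL
    lung_vol     = np.array([7.1, 8.0, 9.7, 10.4, 11.5])     # mL

    slope, intercept = np.polyfit(rib_cage_vol, lung_vol, 1)

    def is_severely_pathological(rib_vol, segmented_vol, tolerance=0.25):
        """Flag a scan when the initial segmentation falls short of the
        expected volume by more than `tolerance` (fraction, assumed)."""
        expected = slope * rib_vol + intercept
        return (expected - segmented_vol) / expected > tolerance

    # A lung that segments much smaller than expected triggers the machine
    # learning based abnormal-pattern detector, as in the paper's pipeline.
    print(is_severely_pathological(rib_vol=50.0, segmented_vol=5.5))  # True
    ```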

  7. MMA, A Computer Code for Multi-Model Analysis

    SciTech Connect

    Eileen P. Poeter and Mary C. Hill

    2007-08-20

    This report documents the Multi-Model Analysis (MMA) computer code. MMA can be used to evaluate results from alternative models of a single system using the same set of observations for all models. As long as the observations, the observation weighting, and the system being represented are the same, the models can differ in nearly any way imaginable. For example, they may include different processes, different simulation software, different temporal definitions (for example, steady-state and transient models could be considered), and so on. The multiple models need to be calibrated by nonlinear regression, and calibration of the individual models needs to be completed before application of MMA. MMA can be used to rank models and calculate posterior model probabilities. These can be used to (1) determine the relative importance of the characteristics embodied in the alternative models, (2) calculate model-averaged parameter estimates and predictions, and (3) quantify the uncertainty of parameter estimates and predictions in a way that integrates the variations represented by the alternative models. There is a lack of consensus on which model analysis methods are best, so MMA provides four default methods. Two are based on Kullback-Leibler information and use the AIC (Akaike Information Criterion) or AICc (second-order-bias-corrected AIC) model discrimination criteria. The other two default methods are the BIC (Bayesian Information Criterion) and the KIC (Kashyap Information Criterion) model discrimination criteria. Use of the KIC criterion is equivalent to using the maximum-likelihood Bayesian model averaging (MLBMA) method. AIC, AICc, and BIC can be derived from Frequentist or Bayesian arguments. The default methods based on Kullback-Leibler information have a number of theoretical advantages, including that, as more data become available, they tend to favor more complicated models than the other methods do, which makes sense in many situations.
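    Three of the four default criteria and the resulting posterior model probabilities are straightforward to compute once each model's maximized log-likelihood is known. A sketch with hypothetical fits (KIC is omitted here, since it additionally requires the Fisher information matrix):

    ```python
    import numpy as np

    # Hypothetical calibrated models: maximized log-likelihoods, parameter
    # counts, and a common number of observations n.
    loglik = np.array([-120.3, -118.9, -119.5])
    k      = np.array([4, 6, 5])
    n      = 40

    aic  = 2 * k - 2 * loglik
    aicc = aic + 2 * k * (k + 1) / (n - k - 1)   # second-order bias correction
    bic  = k * np.log(n) - 2 * loglik

    def posterior_probs(criterion):
        """Turn a discrimination criterion into posterior model probabilities."""
        delta = criterion - criterion.min()
        w = np.exp(-0.5 * delta)
        return w / w.sum()

    for name, crit in [("AIC", aic), ("AICc", aicc), ("BIC", bic)]:
        print(name, np.round(posterior_probs(crit), 3))
    ```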

  9. Reliability analysis framework for computer-assisted medical decision systems

    SciTech Connect

    Habas, Piotr A.; Zurada, Jacek M.; Elmaghraby, Adel S.; Tourassi, Georgia D.

    2007-02-15

    We present a technique that enhances computer-assisted decision (CAD) systems with the ability to assess the reliability of each individual decision they make. Reliability assessment is achieved by measuring the accuracy of a CAD system with known cases similar to the one in question. The proposed technique analyzes the feature space neighborhood of the query case to dynamically select an input-dependent set of known cases relevant to the query. This set is used to assess the local (query-specific) accuracy of the CAD system. The estimated local accuracy is utilized as a reliability measure of the CAD response to the query case. The underlying hypothesis of the study is that CAD decisions with higher reliability are more accurate. The above hypothesis was tested using a mammographic database of 1337 regions of interest (ROIs) with biopsy-proven ground truth (681 with masses, 656 with normal parenchyma). Three types of decision models, (i) a back-propagation neural network (BPNN), (ii) a generalized regression neural network (GRNN), and (iii) a support vector machine (SVM), were developed to detect masses based on eight morphological features automatically extracted from each ROI. The performance of all decision models was evaluated using the Receiver Operating Characteristic (ROC) analysis. The study showed that the proposed reliability measure is a strong predictor of the CAD system's case-specific accuracy. Specifically, the ROC area index for CAD predictions with high reliability was significantly better than for those with low reliability values. This result was consistent across all decision models investigated in the study. The proposed case-specific reliability analysis technique could be used to alert the CAD user when an opinion that is unlikely to be reliable is offered. The technique can be easily deployed in the clinical environment because it is applicable with a wide range of classifiers regardless of their structure and it requires neither additional
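    The core of the technique, scoring a CAD response by the system's accuracy on the feature-space neighborhood of the query, can be sketched in a few lines. All data below are synthetic, and plain Euclidean k-nearest neighbors stand in for whatever neighborhood definition the authors used:

    ```python
    import numpy as np

    def local_reliability(query, known_features, known_labels, cad_predictions, k=15):
        """Reliability of a CAD response to `query`: the CAD system's accuracy
        on the k known cases nearest to the query in feature space (sketch)."""
        d = np.linalg.norm(known_features - query, axis=1)   # Euclidean distances
        neighbors = np.argsort(d)[:k]                        # query-specific set
        return float(np.mean(cad_predictions[neighbors] == known_labels[neighbors]))

    # Hypothetical 8-feature database of known (biopsy-proven) cases.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(1337, 8))                     # morphological features
    y = rng.integers(0, 2, size=1337)                  # truth: mass vs normal
    cad = np.where(rng.random(1337) < 0.8, y, 1 - y)   # CAD correct ~80% of cases

    print(local_reliability(X[0], X, y, cad))   # e.g. 0.8 -> fairly reliable here
    ```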

  10. Computational heat transfer analysis for oscillatory channel flows

    NASA Technical Reports Server (NTRS)

    Ibrahim, Mounir; Kannapareddy, Mohan

    1993-01-01

    An accurate finite-difference scheme has been utilized to investigate oscillatory, laminar and incompressible flow between two parallel plates and in circular tubes. The parallel plates simulate the regenerator of a free-piston Stirling engine (a foil-type regenerator), and the channel wall was included in the analysis (conjugate heat transfer problem). The circular tubes simulate the cooler and heater of the engine with an isothermal wall. The study covered a wide range of maximum Reynolds number (from 75 to 60,000), Valensi number (from 2.5 to 700), and relative amplitude of fluid displacement (0.714 and 1.34). The computational results indicate a complex nature of the heat flux distribution with time and axial location in the channel. At the channel mid-plane we observed two thermal cycles (out of phase with the flow) per flow cycle. At this axial location the wall heat flux mean value, amplitude, and phase shift with the flow depend upon the maximum Reynolds number, Valensi number and relative amplitude of fluid displacement. At other axial locations, the wall heat flux distribution is more complex.

  11. Applied and computational harmonic analysis on graphs and networks

    NASA Astrophysics Data System (ADS)

    Irion, Jeff; Saito, Naoki

    2015-09-01

    In recent years, the advent of new sensor technologies and social network infrastructure has provided huge opportunities and challenges for analyzing data recorded on such networks. In the case of data on regular lattices, computational harmonic analysis tools such as the Fourier and wavelet transforms have well-developed theories and proven track records of success. It is therefore quite important to extend such tools from the classical setting of regular lattices to the more general setting of graphs and networks. In this article, we first review basics of graph Laplacian matrices, whose eigenpairs are often interpreted as the frequencies and the Fourier basis vectors on a given graph. We point out, however, that such an interpretation is misleading unless the underlying graph is either an unweighted path or cycle. We then discuss our recent effort of constructing multiscale basis dictionaries on a graph, including the Hierarchical Graph Laplacian Eigenbasis Dictionary and the Generalized Haar-Walsh Wavelet Packet Dictionary, which are viewed as generalizations of the classical hierarchical block DCTs and the Haar-Walsh wavelet packets, respectively, to the graph setting. Finally, we demonstrate the usefulness of our dictionaries by using them to simultaneously segment and denoise 1-D noisy signals sampled on regular lattices, a problem where classical tools have difficulty.
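    The graph-Fourier viewpoint in the first half of the article is easy to demonstrate. The sketch below builds the combinatorial Laplacian of an unweighted path graph, the one case in which the article says the frequency interpretation is faithful, and expands a smooth signal in its eigenbasis:

    ```python
    import numpy as np

    # Graph "Fourier analysis" on an unweighted path graph with n vertices.
    n = 8
    W = np.zeros((n, n))
    for i in range(n - 1):                     # path edges i -- i+1
        W[i, i + 1] = W[i + 1, i] = 1.0
    L = np.diag(W.sum(axis=1)) - W             # combinatorial Laplacian L = D - W

    evals, evecs = np.linalg.eigh(L)           # eigenpairs ~ frequencies / basis
    signal = np.sin(np.linspace(0, np.pi, n))  # smooth signal on the path
    coeffs = evecs.T @ signal                  # graph Fourier transform

    print(np.round(evals, 3))                  # low "frequencies" come first
    print(np.round(coeffs, 3))                 # energy concentrates in low modes
    ```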

  12. Temporal codes and computations for sensory representation and scene analysis.

    PubMed

    Cariani, Peter A

    2004-09-01

    This paper considers a space of possible temporal codes, surveys neurophysiological and psychological evidence for their use in nervous systems, and presents examples of neural timing networks that operate in the time-domain. Sensory qualities can be encoded temporally by means of two broad strategies: stimulus-driven temporal correlations (phase-locking) and stimulus-triggering of endogenous temporal response patterns. Evidence for stimulus-related spike timing patterns exists in nearly every sensory modality, and such information can be potentially utilized for representation of stimulus qualities, localization of sources, and perceptual grouping. Multiple strategies for temporal (time, frequency, and code-division) multiplexing of information for transmission and grouping are outlined. Using delays and multiplications (coincidences), neural timing networks perform time-domain signal processing operations to compare, extract and separate temporal patterns. Separation of synthetic double vowels by a recurrent neural timing network is used to illustrate how coherences in temporal fine structure can be exploited to build up and separate periodic signals with different fundamentals. Timing nets constitute a time-domain scene analysis strategy based on temporal pattern invariance rather than feature-based labeling, segregation and binding of channels. Further potential implications of temporal codes and computations for new kinds of neural networks are explored.
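    The delay-and-multiply (coincidence) operation that underlies these timing nets can be sketched as an autocorrelation over a bank of delays; the peak delay recovers the period of a noisy spike train. Spike data below are synthetic:

    ```python
    import numpy as np

    # Delay-and-multiply coincidence detection on a synthetic spike train.
    fs = 10_000                                  # sample rate (Hz)
    rng = np.random.default_rng(1)
    spikes = np.zeros(fs)                        # 1 s of activity
    period = fs // 125                           # 125 Hz fundamental (assumed)
    spikes[::period] = 1.0
    spikes *= rng.random(spikes.size) < 0.7      # unreliable firing (p = 0.7)

    # Coincidence count for each candidate delay d: sum over t of s(t) * s(t - d).
    delays = np.arange(1, 2 * period)
    coincidences = [np.dot(spikes[d:], spikes[:-d]) for d in delays]
    best = delays[int(np.argmax(coincidences))]
    print(f"estimated period: {1000 * best / fs:.1f} ms "
          f"(true: {1000 * period / fs:.1f} ms)")
    ```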

  13. A comparative reliability analysis of computer-generated bitemark overlays.

    PubMed

    McNamee, Anne H; Sweet, David; Pretty, Iain

    2005-03-01

    This study compared the reliability of two methods used to produce computer-generated bitemark overlays with Adobe Photoshop (Adobe Systems Inc., San Jose, CA). Scanned images of twelve dental casts were sent to 30 examiners with different experience levels. Examiners were instructed to produce an overlay for each cast image based on the instructions provided for the two techniques. Measurements of the area and the x-y coordinate position of the biting edges of the anterior teeth were obtained using Scion Image software program (Scion Corporation, Frederick, MD) for each overlay. The inter- and intra-reliability assessment of the measurements was performed using an analysis of variance and calculation of reliability coefficients. The assessment of the area measurements showed significant variances seen in the examiner variable for both techniques resulting in low reliability coefficients. Conversely, the results for the positional measurements showed no significant differences in the variances between examiners with exceptionally high reliability coefficients. It was concluded that both techniques were reliable methods to produce bitemark overlays in assessing tooth position. PMID:15818864
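    The paper reports ANOVA-based reliability coefficients. One standard such coefficient is the two-way intraclass correlation ICC(2,1) of Shrout and Fleiss, sketched here with made-up overlay coordinates (the paper does not specify which coefficient was used, so this choice is an assumption):

    ```python
    import numpy as np

    def icc_2_1(X):
        """Two-way random, single-measure ICC(2,1) (Shrout & Fleiss), a common
        'reliability coefficient' for inter-examiner agreement."""
        n, k = X.shape                       # n overlays/targets, k examiners
        m = X.mean()
        msr = k * np.sum((X.mean(axis=1) - m) ** 2) / (n - 1)
        msc = n * np.sum((X.mean(axis=0) - m) ** 2) / (k - 1)
        resid = X - X.mean(axis=1, keepdims=True) - X.mean(axis=0, keepdims=True) + m
        mse = np.sum(resid ** 2) / ((n - 1) * (k - 1))
        return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

    # Hypothetical x-coordinate measurements: 6 overlays x 3 examiners.
    X = np.array([[12.1, 12.2, 12.0],
                  [15.4, 15.5, 15.4],
                  [ 9.8,  9.7,  9.9],
                  [20.3, 20.4, 20.2],
                  [11.0, 11.1, 11.0],
                  [17.6, 17.5, 17.7]])
    print(f"ICC(2,1) = {icc_2_1(X):.3f}")   # near 1 -> highly reliable positions
    ```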

  14. Can computational biology improve the phylogenetic analysis of insulin?

    PubMed

    Chakraborty, Chiranjib; Roy, Sanjiban S; Hsu, Minna J; Agoramoorthy, Govindasamy

    2012-11-01

    Using computational biology, we have depicted the phylogenetics of insulin. We have also analyzed the sequence alignment and sequence logo formation for both insulin chain A and chain B for three groups, namely the mammalian group, the vertebrate group and the fish group. We have also analyzed cladograms of insulin for the mammalian group. Accordingly, path lengths, distance matrices, matching representations of the cladogram nodes, and dissimilarities between pairs of nodes were computed for both the A and B chains of the mammalian group. Our results show that 12 amino acid residues (GlyA1, IleA2, ValA3, TyrA19, CysA20, AsnA21, LeuB6, GlyB8, LeuB11, ValB12, GlyB23 and PheB24) are highly conserved for all groups, and some of them (GlyA1, IleA2, ValA3; TyrA19, CysA20, AsnA21) are contiguous. This study presents a rapid method to analyze amino acid sequences in terms of evolutionary conservation rates as well as molecular phylogenetics. PMID:22265574
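    Identifying the conserved residues comes down to scanning the columns of an alignment. A minimal sketch over illustrative A-chain sequences (taken as already aligned; not the study's data set):

    ```python
    # Find residue positions conserved across an aligned set of insulin A chains.
    chains_a = {
        "human":  "GIVEQCCTSICSLYQLENYCN",
        "bovine": "GIVEQCCASVCSLYQLENYCN",
        "rat":    "GIVDQCCTSICSLYQLENYCN",
    }

    length = len(next(iter(chains_a.values())))
    conserved = [
        i + 1                                   # 1-based position, e.g. 1 -> GlyA1
        for i in range(length)
        if len({seq[i] for seq in chains_a.values()}) == 1
    ]
    print(conserved)   # positions identical in every sequence
    ```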

  15. Computer analysis of X-band radar data

    NASA Technical Reports Server (NTRS)

    Knowlton, D. J.; Hoffer, R. M.

    1983-01-01

    The effectiveness of using currently available computer techniques for interpretation of MSS data to interpret SAR imagery for forest monitoring was assessed. Data were gathered with NASA's airborne APQ-102 dual-polarized, X-band SAR in a flight at 60,000 ft. Microdensitometry was employed to digitize the HH- and HV-polarized imagery. A ground spatial resolution of 15 m was obtained, control points were identified, a second order biquadratic transformation was applied to compensate for orientation, and rms errors were calculated. A second data set was taken with 30 m resolution in order to simulate thematic mapper operation. Classification was performed with pixel-by-pixel and textural classification algorithms. A statistical analysis was also carried out to find any significant differences between classifiers in a data set for a given classifier. Each polarization featured an independent distortion which required appropriate preprocessing to correct. Further studies are recommended with multiple frequencies viewing and multiple polarizations and look angles to define the actual forest classifications that can be made with the SAR imagery.

  16. Design of airborne wind turbine and computational fluid dynamics analysis

    NASA Astrophysics Data System (ADS)

    Anbreen, Faiqa

    Wind energy is a promising alternative to depleting non-renewable sources. The height of conventional wind turbines is a constraint on their efficiency. An airborne wind turbine can reach much higher altitudes and produce higher power due to the high wind velocity and energy density there. The focus of this thesis is to design a shrouded airborne wind turbine capable of generating 70 kW to propel a leisure boat with a capacity of 8-10 passengers. The idea of designing an airborne turbine is to take advantage of the higher velocities in the atmosphere. The Solidworks model has been analyzed numerically using the Computational Fluid Dynamics (CFD) software StarCCM+. The Unsteady Reynolds-Averaged Navier-Stokes (URANS) approach with the k-epsilon turbulence model has been selected to study the physical properties of the flow, with emphasis on the performance of the turbine and the increase in air velocity at the throat. The analysis has been done at two ambient velocities, 12 m/s and 6 m/s. At 12 m/s inlet velocity, the velocity of air at the turbine was recorded as 16 m/s, and the power generated by the turbine is 61 kW. At an inlet velocity of 6 m/s, the velocity of air at the turbine increased to 10 m/s, and the power generated by the turbine is 25 kW.
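    The quoted power figures can be sanity-checked against the standard wind-power relation P = ½ρAv³Cp, using the throat velocities. The disc area and power coefficient below are assumptions chosen to land near the 61 kW case; the point of the sketch is only the cubic velocity scaling, since a shrouded design's effective coefficients will differ:

    ```python
    # Back-of-the-envelope check via P = 0.5 * rho * A * v^3 * Cp.
    # Area and Cp are assumptions, not values from the thesis.
    rho = 1.225      # air density, kg/m^3
    A = 60.0         # assumed effective capture area, m^2
    cp = 0.40        # assumed power coefficient

    def turbine_power(v_mps):
        """Wind power extracted at velocity v (W)."""
        return 0.5 * rho * A * v_mps**3 * cp

    for v in (16.0, 10.0):    # throat velocities reported for the two cases
        print(f"v = {v:4.1f} m/s  ->  P = {turbine_power(v) / 1000:5.1f} kW")
    ```

    With these assumptions the 16 m/s case gives about 60 kW, close to the reported 61 kW; the low-speed case comes out below the reported 25 kW, which suggests the shroud's effective coefficients vary between operating points.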

  17. CProb: a computational tool for conducting conditional probability analysis.

    PubMed

    Hollister, Jeffrey W; Walker, Henry A; Paul, John F

    2008-01-01

    Conditional probability is the probability of observing one event given that another event has occurred. In an environmental context, conditional probability helps to assess the association between an environmental contaminant (i.e., the stressor) and the ecological condition of a resource (i.e., the response). These analyses, when combined with controlled experiments and other methodologies, show great promise in evaluating ecological conditions from observational data and in defining water quality and other environmental criteria. Current applications of conditional probability analysis (CPA) are largely done via scripts or cumbersome spreadsheet routines, which may prove daunting to end-users and do not provide access to the underlying scripts. Combining spreadsheets with scripts eases computation through a familiar interface (i.e., Microsoft Excel) and creates a transparent process through full accessibility to the scripts. With this in mind, we developed a software application, CProb, as an Add-in for Microsoft Excel with R, R(D)com Server, and Visual Basic for Applications. CProb calculates and plots scatterplots, empirical cumulative distribution functions, and conditional probability. In this short communication, we describe CPA, our motivation for developing a CPA tool, and our implementation of CPA as a Microsoft Excel Add-in. Further, we illustrate the use of our software with two examples: a water quality example and a landscape example. CProb is freely available for download at http://www.epa.gov/emap/nca/html/regions/cprob.
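    The computation CProb wraps is compact: an empirical conditional probability swept over stressor thresholds. A sketch on synthetic data (CProb itself is an Excel/R add-in; this is not its code):

    ```python
    import numpy as np

    # P(poor condition | stressor exceeds a threshold), swept over thresholds.
    rng = np.random.default_rng(7)
    stressor = rng.uniform(0, 10, 500)                          # contaminant level
    poor = rng.random(500) < 1 / (1 + np.exp(-(stressor - 6)))  # degraded response

    for threshold in (2, 4, 6, 8):
        exceed = stressor > threshold
        p = poor[exceed].mean()                   # empirical conditional probability
        print(f"P(poor | stressor > {threshold}) = {p:.2f}  (n = {exceed.sum()})")
    ```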

  18. Recent advances in computational structural reliability analysis methods

    NASA Technical Reports Server (NTRS)

    Thacker, Ben H.; Wu, Y.-T.; Millwater, Harry R.; Torng, Tony Y.; Riha, David S.

    1993-01-01

    The goal of structural reliability analysis is to determine the probability that the structure will adequately perform its intended function when operating under the given environmental conditions. Thus, the notion of reliability admits the possibility of failure. Given the fact that many different modes of failure are usually possible, achievement of this goal is a formidable task, especially for large, complex structural systems. The traditional (deterministic) design methodology attempts to assure reliability by the application of safety factors and conservative assumptions. However, the safety factor approach lacks a quantitative basis in that the level of reliability is never known, and it usually results in overly conservative designs because of compounding conservatisms. Furthermore, the problem parameters that control the reliability are not identified, nor is their importance evaluated. A summary of recent advances in computational structural reliability assessment is presented. A significant level of activity in the research and development community was seen recently, much of which was directed towards the prediction of failure probabilities for single-mode failures. The focus is to present some early results and demonstrations of advanced reliability methods applied to structural system problems. This includes structures that can fail as a result of multiple component failures (e.g., a redundant truss), or structural components that may fail due to multiple interacting failure modes (e.g., excessive deflection, resonant vibration, or creep rupture). From these results, some observations and recommendations are made with regard to future research needs.
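    For a single failure mode, the probability such methods must deliver can be illustrated with the textbook limit state g = R − S (capacity minus demand). A Monte Carlo sketch with hypothetical normal statistics, checked against the closed-form result available in this special case:

    ```python
    import math
    import numpy as np

    # Failure when g = R - S < 0. All statistics are hypothetical.
    rng = np.random.default_rng(42)
    n = 1_000_000
    R = rng.normal(500.0, 50.0, n)       # resistance
    S = rng.normal(300.0, 60.0, n)       # load effect

    pf_mc = np.mean(R - S < 0.0)

    # Analytical check for normal R and S:
    # beta = (muR - muS) / sqrt(sR^2 + sS^2), Pf = Phi(-beta).
    beta = (500.0 - 300.0) / math.sqrt(50.0**2 + 60.0**2)
    pf_exact = 0.5 * math.erfc(beta / math.sqrt(2.0))
    print(f"MC: {pf_mc:.2e}   exact: {pf_exact:.2e}   beta = {beta:.2f}")
    ```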

  19. Northwest Trajectory Analysis Capability: A Platform for Enhancing Computational Biophysics Analysis

    SciTech Connect

    Peterson, Elena S.; Stephan, Eric G.; Corrigan, Abigail L.; Lins, Roberto D.; Soares, Thereza A.; Scarberry, Randall E.; Rose, Stuart J.; Williams, Leigh K.; Lai, Canhai; Critchlow, Terence J.; Straatsma, TP

    2008-07-30

    As computational resources continue to increase, the ability of computational simulations to effectively complement, and in some cases replace, experimentation in scientific exploration also increases. Today, large-scale simulations are recognized as an effective tool for scientific exploration in many disciplines, including chemistry and biology. A natural side effect of this trend has been the need for an increasingly complex analytical environment. In this paper, we describe the Northwest Trajectory Analysis Capability (NTRAC), an analytical software suite developed to enhance the efficiency of computational biophysics analyses. Our strategy is to layer higher-level services and introduce improved tools within the user’s familiar environment without preventing researchers from using traditional tools and methods. We share these experiences to serve as an example of how to effectively analyze data-intensive, large-scale simulation data.

  20. Computational analysis of irradiation facilities at the JSI TRIGA reactor.

    PubMed

    Snoj, Luka; Zerovnik, Gašper; Trkov, Andrej

    2012-03-01

    Characterization and optimization of irradiation facilities in a research reactor is important for optimal performance. Nowadays this is commonly done with advanced Monte Carlo neutron transport computer codes such as MCNP. However, the computational model in such calculations should be verified and validated with experiments. In the paper we describe the irradiation facilities at the JSI TRIGA reactor and demonstrate their computational characterization to support experimental campaigns by providing information on the characteristics of the irradiation facilities. PMID:22154389

  1. Crossover from 2-dimensional to 3-dimensional aggregations of clusters on square lattice substrates

    NASA Astrophysics Data System (ADS)

    Cheng, Yi; Zhu, Yu-Hong; Pan, Qi-Fa; Yang, Bo; Tao, Xiang-Ming; Ye, Gao-Xiang

    2015-11-01

    A Monte Carlo study on the crossover from 2-dimensional to 3-dimensional aggregations of clusters is presented. Based on the traditional cluster-cluster aggregation (CCA) simulation, a modified growth model is proposed. The clusters (including single particles and their aggregates) diffuse with diffusion step length l (1 ≤ l ≤ 7) and aggregate on a square lattice substrate. If the number of particles contained in a cluster is larger than a critical size sc, the particles at the edge of the cluster have a possibility to jump onto the upper layer, which results in the crossover from 2-dimensional to 3-dimensional aggregations. Our simulation results are in good agreement with the experimental findings. Project supported by the National Natural Science Foundation of China (Grant Nos. 11374082 and 11074215), the Science Foundation of Zhejiang Province Department of Education, China (Grant No. Y201018280), the Fundamental Research Funds for Central Universities, China (Grant No. 2012QNA3010), and the Specialized Research Fund for the Doctoral Program of Higher Education of China (Grant No. 20100101110005).
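    A heavily simplified sketch of the growth rule conveys the crossover mechanism. Here single particles random-walk and stick DLA-style (the paper's model also diffuses whole clusters with step length l, which is omitted), and once the aggregate exceeds sc, new arrivals are promoted to the upper layer; all parameters are illustrative:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    L, sc, n_particles = 40, 30, 400
    layer1 = np.zeros((L, L), dtype=bool)    # first (substrate) layer
    layer2 = np.zeros((L, L), dtype=bool)    # upper layer
    layer1[L // 2, L // 2] = True            # seed the cluster
    size = 1
    moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]

    for _ in range(n_particles):
        x, y = rng.integers(0, L, 2)
        for _step in range(5000):            # walk until contact (or give up)
            dx, dy = moves[rng.integers(4)]
            x, y = (x + dx) % L, (y + dy) % L
            if layer1[x, y]:
                continue                     # sketch: keep walking off occupied sites
            if any(layer1[(x + u) % L, (y + v) % L] for u, v in moves):
                if size > sc:
                    layer2[x, y] = True      # promoted: 3D growth begins
                else:
                    layer1[x, y] = True
                    size += 1
                break

    print(f"layer 1: {layer1.sum()} particles, layer 2: {layer2.sum()} sites")
    ```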

  2. Sodium fast reactor gaps analysis of computer codes and models for accident analysis and reactor safety.

    SciTech Connect

    Carbajo, Juan; Jeong, Hae-Yong; Wigeland, Roald; Corradini, Michael; Schmidt, Rodney Cannon; Thomas, Justin; Wei, Tom; Sofu, Tanju; Ludewig, Hans; Tobita, Yoshiharu; Ohshima, Hiroyuki; Serre, Frederic

    2011-06-01

    This report summarizes the results of an expert-opinion elicitation activity designed to qualitatively assess the status and capabilities of currently available computer codes and models for accident analysis and reactor safety calculations of advanced sodium fast reactors, and identify important gaps. The twelve-member panel consisted of representatives from five U.S. National Laboratories (SNL, ANL, INL, ORNL, and BNL), the University of Wisconsin, the KAERI, the JAEA, and the CEA. The major portion of this elicitation activity occurred during a two-day meeting held on Aug. 10-11, 2010 at Argonne National Laboratory. There were two primary objectives of this work: (1) Identify computer codes currently available for SFR accident analysis and reactor safety calculations; and (2) Assess the status and capability of current US computer codes to adequately model the required accident scenarios and associated phenomena, and identify important gaps. During the review, panel members identified over 60 computer codes that are currently available in the international community to perform different aspects of SFR safety analysis for various event scenarios and accident categories. A brief description of each of these codes together with references (when available) is provided. An adaptation of the Predictive Capability Maturity Model (PCMM) for computational modeling and simulation is described for use in this work. The panel's assessment of the available US codes is presented in the form of nine tables, organized into groups of three for each of three risk categories considered: anticipated operational occurrences (AOOs), design basis accidents (DBA), and beyond design basis accidents (BDBA). A set of summary conclusions are drawn from the results obtained. At the highest level, the panel judged that current US code capabilities are adequate for licensing given reasonable margins, but expressed concern that US code development activities had stagnated and that the

  3. Computational modeling and analysis of the hydrodynamics of human swimming

    NASA Astrophysics Data System (ADS)

    von Loebbecke, Alfred

    Computational modeling and simulations are used to investigate the hydrodynamics of competitive human swimming. The simulations employ an immersed boundary (IB) solver that allows us to simulate viscous, incompressible, unsteady flow past complex, moving/deforming three-dimensional bodies on stationary Cartesian grids. This study focuses on the hydrodynamics of the "dolphin kick". Three female and two male Olympic level swimmers are used to develop kinematically accurate models of this stroke for the simulations. A simulation of a dolphin undergoing its natural swimming motion is also presented for comparison. CFD enables the calculation of flow variables throughout the domain and over the swimmer's body surface during the entire kick cycle. The feet are responsible for all thrust generation in the dolphin kick. Moreover, it is found that the down-kick (ventral position) produces more thrust than the up-kick. A quantity of interest to the swimming community is the drag of a swimmer in motion (active drag). Accurate estimates of this quantity have been difficult to obtain in experiments but are easily calculated with CFD simulations. Propulsive efficiencies of the human swimmers are found to be in the range of 11% to 30%. The dolphin simulation case has a much higher efficiency of 55%. Investigation of vortex structures in the wake indicate that the down-kick can produce a vortex ring with a jet of accelerated fluid flowing through its center. This vortex ring and the accompanying jet are the primary thrust generating mechanisms in the human dolphin kick. In an attempt to understand the propulsive mechanisms of surface strokes, we have also conducted a computational analysis of two different styles of arm-pulls in the backstroke and the front crawl. These simulations involve only the arm and no air-water interface is included. Two of the four strokes are specifically designed to take advantage of lift-based propulsion by undergoing lateral motions of the hand

  4. Computational modeling and impact analysis of textile composite structures

    NASA Astrophysics Data System (ADS)

    Hur, Hae-Kyu

    This study is devoted to the development of an integrated numerical modeling approach enabling one to investigate the static and dynamic behaviors and failures of 2-D textile composite as well as 3-D orthogonal woven composite structures weakened by cracks and subjected to static-, impact- and ballistic-type loads. As more complicated models of textile composite structures are introduced, homogenization schemes, geometrical modeling and crack propagation become more difficult problems to solve. To overcome these problems, this study presents effective mesh-generation schemes, homogenization modeling based on a repeating unit cell and sinusoidal functions, and a cohesive element to study micro-crack shapes. The proposed research has two objectives: (1) studying the behavior of textile composites under static loads, and (2) studying the dynamic responses of these textile composite structures subjected to transient/ballistic loading. In the first part, efficient homogenization schemes are suggested to show the influence of textile architectures on mechanical characteristics, considering the micro modeling of the repeating unit cell. Furthermore, the structures of multi-layered or multi-phase composites combined with different laminae, such as a sub-laminate, are considered to find the mechanical characteristics. A simple progressive failure mechanism for the textile composites is also presented. In the second part, this study focuses on three main phenomena to solve the dynamic problems: micro-crack shapes, textile architectures and textile effective moduli. To obtain good solutions of the dynamic problems, this research attempts to use four approaches: (I) determination of governing equations via a three-level hierarchy: micro-mechanical unit cell analysis, layer-wise analysis accounting for transverse strains and stresses, and structural analysis based on anisotropic plate layers, (II) development of an efficient computational approach enabling one to perform transient

  5. Computer Models for IRIS Control System Transient Analysis

    SciTech Connect

    Gary D. Storrick; Bojan Petrovic; Luca Oriani

    2007-01-31

    This report presents results of the Westinghouse work performed under Task 3 of this Financial Assistance Award and it satisfies a Level 2 Milestone for the project. Task 3 of the collaborative effort between ORNL, Brazil and Westinghouse for the International Nuclear Energy Research Initiative entitled “Development of Advanced Instrumentation and Control for an Integrated Primary System Reactor” focuses on developing computer models for transient analysis. This report summarizes the work performed under Task 3 on developing control system models. The present state of the IRIS plant design – such as the lack of a detailed secondary system or I&C system designs – makes finalizing models impossible at this time. However, this did not prevent making considerable progress. Westinghouse has several working models in use to further the IRIS design. We expect to continue modifying the models to incorporate the latest design information until the final IRIS unit becomes operational. Section 1.2 outlines the scope of this report. Section 2 describes the approaches we are using for non-safety transient models. It describes the need for non-safety transient analysis and the model characteristics needed to support those analyses. Section 3 presents the RELAP5 model. This is the highest-fidelity model used for benchmark evaluations. However, it is prohibitively slow for routine evaluations and additional lower-fidelity models have been developed. Section 4 discusses the current Matlab/Simulink model. This is a low-fidelity, high-speed model used to quickly evaluate and compare competing control and protection concepts. Section 5 describes the Modelica models developed by POLIMI and Westinghouse. The object-oriented Modelica language provides convenient mechanisms for developing models at several levels of detail. We have used this to develop a high-fidelity model for detailed analyses and a faster-running simplified model to help speed the I&C development process

  6. Integrating aerodynamic surface modeling for computational fluid dynamics with computer aided structural analysis, design, and manufacturing

    NASA Technical Reports Server (NTRS)

    Thorp, Scott A.

    1992-01-01

    This presentation will discuss the development of a NASA Geometry Exchange Specification for transferring aerodynamic surface geometry between LeRC systems and grid generation software used for computational fluid dynamics research. The proposed specification is based on a subset of the Initial Graphics Exchange Specification (IGES). The presentation will include discussion of how the NASA-IGES standard will accommodate improved computer aided design inspection methods and reverse engineering techniques currently being developed. The presentation is in viewgraph format.

  7. A Geometric Modelling Approach to Determining the Best Sensing Coverage for 3-Dimensional Acoustic Target Tracking in Wireless Sensor Networks

    PubMed Central

    Pashazadeh, Saeid; Sharifi, Mohsen

    2009-01-01

    Existing 3-dimensional acoustic target tracking methods that use wired/wireless networked sensor nodes to track targets based on four sensing coverage do not always compute the feasible spatio-temporal information of target objects. To investigate this discrepancy in a formal setting, we propose a geometric model of the target tracking problem alongside its equivalent geometric dual model, which is easier to solve. We then study and prove some properties of the dual model by exploiting its relationship with algebra. Based on these properties, we propose a four coverage axis line method built on four sensing coverage and prove that four sensing coverage always yields two dual correct answers, of which usually one is infeasible. By showing that the feasible answer can only sometimes be identified using a simple time test method, such as the one we propose, we prove that four sensing coverage fails to always yield the feasible spatio-temporal information of a target object. We further prove that five sensing coverage always gives the feasible position of a target object under certain conditions that are discussed in this paper. We propose three extensions to the four coverage axis line method, namely the five coverage extent point method, the five coverage extended axis lines method, and the five coverage redundant axis lines method. The computation and time complexities of all four proposed methods are Θ(1), both in the worst case and on average. The proposed methods, and the facts proved here about the capabilities of each sensing coverage degree, can be used in other acoustic target tracking methods such as Bayesian filtering. PMID:22423198
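
    The paper's geometric dual construction is not reproduced in the abstract; the following is a hedged numeric sketch of why four sensing coverage yields two candidate answers. With four sensors, the arrival-time equations |x - s_i| = c(t_i - t0) reduce to a quadratic in the unknown emission time t0, giving two candidate positions; a simple time test (emission must precede every arrival) rejects one only when that candidate happens to be infeasible. Sensor positions and times are invented.

    ```python
    # Hypothetical four-sensor example: solve |x - s_i| = c*(t_i - t0)
    # by linearizing against sensor 0, leaving a quadratic in t0 with
    # two roots, i.e. two candidate target positions.
    import numpy as np

    c = 343.0                                  # speed of sound, m/s
    S = np.array([[0, 0, 0], [10, 0, 0], [0, 10, 0], [0, 0, 10]], float)
    target = np.array([3.0, 4.0, 5.0])         # ground truth (invented)
    t0_true = 0.1
    t = t0_true + np.linalg.norm(S - target, axis=1) / c   # measured arrivals

    # Subtracting sensor 0's equation from sensors 1..3 makes the
    # position affine in t0:  x(t0) = p + q*t0.
    A = 2 * (S[1:] - S[0])
    b_const = (np.sum(S[1:]**2, 1) - np.sum(S[0]**2)) - c**2 * (t[1:]**2 - t[0]**2)
    b_t0 = 2 * c**2 * (t[1:] - t[0])
    p = np.linalg.solve(A, b_const)
    q = np.linalg.solve(A, b_t0)

    # Substitute x(t0) back into sensor 0's equation -> quadratic in t0.
    d = p - S[0]
    a2 = q @ q - c**2
    a1 = 2 * (d @ q) + 2 * c**2 * t[0]
    a0 = d @ d - c**2 * t[0]**2
    for t0 in np.roots([a2, a1, a0]):
        x = p + q * t0.real
        feasible = t0.real <= t.min()          # emission precedes all arrivals
        print(f"t0={t0.real:.4f} s  x={np.round(x, 3)}  feasible={feasible}")
    ```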

  8. Analysis of Children's Computational Errors: A Qualitative Approach

    ERIC Educational Resources Information Center

    Engelhardt, J. M.

    1977-01-01

    This study was designed to replicate and extend Roberts' (1968) efforts at classifying computational errors. 198 elementary school students were administered an 84-item arithmetic computation test. Eight types of errors were described which led to several tentative generalizations. (Editor/RK)

  9. Comparative analysis of diagnostic 12-lead electrocardiography and 3-dimensional noninvasive mapping.

    PubMed

    Leong, Kevin Ming Wei; Lim, Phang Boon; Kanagaratnam, Prapa

    2015-03-01

    The clinical utility of noninvasive electrocardiographic imaging has been demonstrated in a variety of conditions. It has recently been shown to have superior predictive accuracy and higher clinical value than validated 12-lead electrocardiogram algorithms in localizing arrhythmias arising from the ventricular outflow tract, and it displays similar potential in other conditions.

  10. Intracellular trafficking of superparamagnetic iron oxide nanoparticles conjugated with TAT peptide: 3-dimensional electron tomography analysis

    SciTech Connect

    Nair, Baiju G.; Fukuda, Takahiro; Mizuki, Toru; Hanajiri, Tatsuro; Maekawa, Toru

    2012-05-18

    Highlights: • We study the intracellular localisation of TAT-SPIONs using 3-D electron tomography. • 3-D images of TAT-SPIONs in a cell are clearly shown. • Release of TAT-SPIONs from endocytic vesicles into the cytoplasm is clearly shown. -- Abstract: Internalisation of nanoparticles conjugated with cell penetrating peptides is a promising approach for various drug delivery applications. Cell penetrating peptides such as the transactivating transcriptional activator (TAT) peptide derived from HIV-1 proteins are effective intracellular delivery vectors for a wide range of nanoparticles and pharmaceutical agents, owing to their ability to enter cells with minimal cytotoxicity. Although different mechanisms of intracellular uptake and localisation have been proposed for TAT-conjugated nanoparticles, it is necessary to visualise the particles in three dimensions in order to investigate their actual intracellular uptake and localisation. Here, we study the intracellular localisation and trafficking of TAT peptide conjugated superparamagnetic iron oxide nanoparticles (TAT-SPIONs) using 3-D electron tomography. The 3-D tomograms clearly show the location of TAT-SPIONs in a cell and their slow release from the endocytic vesicles into the cytoplasm. The present methodology may well be utilised for further investigations of the behaviour of nanoparticles in cells and, eventually, for the development of nano drug delivery systems.

  11. MMA, A Computer Code for Multi-Model Analysis

    USGS Publications Warehouse

    Poeter, Eileen P.; Hill, Mary C.

    2007-01-01

    This report documents the Multi-Model Analysis (MMA) computer code. MMA can be used to evaluate results from alternative models of a single system using the same set of observations for all models. As long as the observations, the observation weighting, and the system being represented are the same, the models can differ in nearly any way imaginable. For example, they may include different processes, different simulation software, different temporal definitions (for example, steady-state and transient models could be considered), and so on. The multiple models need to be calibrated by nonlinear regression, and calibration of the individual models needs to be completed before application of MMA. MMA can be used to rank models and calculate posterior model probabilities. These can be used to (1) determine the relative importance of the characteristics embodied in the alternative models, (2) calculate model-averaged parameter estimates and predictions, and (3) quantify the uncertainty of parameter estimates and predictions in a way that integrates the variations represented by the alternative models. There is a lack of consensus on which model analysis methods are best, so MMA provides four default methods. Two are based on Kullback-Leibler information and use the AIC (Akaike Information Criterion) or AICc (second-order-bias-corrected AIC) model discrimination criteria. The other two default methods use the BIC (Bayesian Information Criterion) and the KIC (Kashyap Information Criterion) model discrimination criteria. Use of the KIC criterion is equivalent to using the maximum-likelihood Bayesian model averaging (MLBMA) method. AIC, AICc, and BIC can be derived from Frequentist or Bayesian arguments. The default methods based on Kullback-Leibler information have a number of theoretical advantages, including that, unlike the other methods, they tend to favor more complicated models as more data become available, which makes sense in many situations. Many applications of MMA will
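
    The report names MMA's default criteria without reproducing their formulas; as a hedged illustration, the standard forms of three of the four (AIC, AICc, BIC; KIC is omitted for brevity) and the usual conversion of criterion differences into model weights can be sketched as follows. The models and numbers are invented.

    ```python
    # Standard information criteria and model weights (illustrative, not
    # MMA's source code). log_likelihood, k parameters, n observations.
    import numpy as np

    def information_criteria(log_likelihood, k, n):
        aic = -2 * log_likelihood + 2 * k
        aicc = aic + 2 * k * (k + 1) / (n - k - 1)   # second-order bias correction
        bic = -2 * log_likelihood + k * np.log(n)
        return aic, aicc, bic

    def model_weights(criteria):
        """Normalized exp(-delta/2) weights from criterion differences."""
        c = np.asarray(criteria, float)
        w = np.exp(-0.5 * (c - c.min()))
        return w / w.sum()

    # Three hypothetical calibrated models of one system (same 50 observations):
    models = [(-120.3, 4), (-118.9, 6), (-118.5, 9)]   # (log-likelihood, k)
    aic = [information_criteria(ll, k, n=50)[0] for ll, k in models]
    print("AIC:", np.round(aic, 1))
    print("weights:", np.round(model_weights(aic), 3))
    ```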

  12. Generalized Advanced Propeller Analysis System (GAPAS). Volume 2: Computer program user manual

    NASA Technical Reports Server (NTRS)

    Glatt, L.; Crawford, D. R.; Kosmatka, J. B.; Swigart, R. J.; Wong, E. W.

    1986-01-01

    The Generalized Advanced Propeller Analysis System (GAPAS) computer code is described. GAPAS was developed to analyze advanced-technology multi-bladed propellers operating on aircraft at speeds up to Mach 0.8 and altitudes up to 40,000 feet. GAPAS includes technology for analyzing the aerodynamic, structural, and acoustic performance of propellers. The computer code was developed for the CDC 7600 computer and is currently available for industrial use on the NASA Langley computer. A description of all the analytical models incorporated in GAPAS is included. Sample calculations are described, as are user requirements for modifying the analysis system. Computer system core requirements and running times are also discussed.

  13. Interface design of VSOP'94 computer code for safety analysis

    SciTech Connect

    Natsir, Khairina; Andiwijayakusuma, D.; Wahanani, Nursinta Adi; Yazid, Putranto Ilham

    2014-09-30

    Today, most software applications, including those in the nuclear field, come with a graphical user interface. VSOP'94 (Very Superior Old Program) was designed to simplify the process of performing reactor simulation. VSOP is an integrated code system for simulating the life history of a nuclear reactor and is devoted to education and research. One advantage of the VSOP program is its ability to calculate neutron spectrum estimation, fuel cycle, 2-D diffusion, resonance integrals, estimates of reactor fuel costs, and integrated thermal hydraulics. VSOP can also be used for comparative studies and reactor safety simulation. However, the existing VSOP is a conventional program, developed in Fortran 65, with several usability problems: it runs only on DEC Alpha mainframe platforms and provides text-based output that is difficult to use, especially in data preparation and in the interpretation of results. We developed GUI-VSOP, an interface program that facilitates data preparation, runs the VSOP code, and reads the results in a more user-friendly way, and that is usable on a personal computer (PC). Modifications include the development of interfaces for preprocessing, processing, and postprocessing. The GUI-based preprocessing interface provides a convenient way to prepare data. The processing interface provides convenience in configuring input files and libraries and in compiling the VSOP code. The postprocessing interface is designed to visualize the VSOP output in tabular and graphic form. GUI-VSOP is expected to simplify and speed up the process and analysis of safety aspects.
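
    The abstract describes the wrapper architecture only at a high level; a minimal sketch of that three-stage pattern (prepare an input deck, run the legacy console code, capture its text output for postprocessing) might look like the following. The executable name, file names, and deck format are invented stand-ins, not GUI-VSOP's actual interfaces.

    ```python
    # Hypothetical sketch of a pre-/processing/post- wrapper around a
    # legacy console code; "vsop94" and the file names are invented.
    import subprocess
    from pathlib import Path

    def run_legacy_code(input_deck: str, workdir: Path) -> str:
        """Write the deck, run the console code, and save its output."""
        workdir.mkdir(parents=True, exist_ok=True)
        (workdir / "vsop.inp").write_text(input_deck)        # preprocessing
        result = subprocess.run(
            ["./vsop94", "vsop.inp"],                        # processing
            cwd=workdir, capture_output=True, text=True, check=True,
        )
        (workdir / "vsop.out").write_text(result.stdout)     # postprocessing input
        return result.stdout
    ```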

  14. Computer assisted sound analysis of arteriovenous fistula in hemodialysis patients.

    PubMed

    Malindretos, Pavlos; Liaskos, Christos; Bamidis, Panagiotis; Chryssogonidis, Ioannis; Lasaridis, Anastasios; Nikolaidis, Pavlos

    2014-02-01

    The purpose of this study was to reveal the unique sound characteristics of the bruit produced by arteriovenous fistulae (AVF), using a computerized method. An electronic stethoscope (20 Hz to 20 000 Hz sensitivity) connected to a portable laptop computer was used. Forty prevalent hemodialysis patients participated in the study. All measurements were made with the patients resting in the supine position, prior to the initiation of a mid-week dialysis session. A standard color Doppler technique was used to estimate blood flow. Clinical examination revealed the surface where the perceived bruit was most intense, and the recording took place there at a sample rate of 22 000 Hz in lossless WAV format. The Fast Fourier Transform (FFT) algorithm was used for the sound analysis; it is particularly useful in revealing the periodicity of sound data as well as in mapping its frequency behavior and strength. The produced frequencies were divided into 40 frequency intervals, each 250 Hz wide, so that the results would be easier to plot and comprehend. The mean age of the patients was 63.5 ± 14 years; the median time on dialysis was 39.6 months (min. 1 month, max. 200 months). The mean blood flow was 857.7 ± 448.3 ml/min. The mean sound frequency was approximately 5 500 Hz ± 4 000 Hz, and the median, which also expresses the major peak of the sound data, was 750 Hz, varying from 250 Hz to 10 000 Hz. A possible limitation of the study is the relatively small number of participants.
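
    As a hedged sketch of the described pipeline (a 22 kHz recording, an FFT magnitude spectrum, and 40 intervals of 250 Hz spanning 0-10 kHz), the analysis might be reproduced as follows; the file name is a placeholder and the binning is a plausible reading of the abstract, not the authors' code.

    ```python
    # Illustrative FFT analysis of a bruit recording; assumes a mono WAV.
    import numpy as np
    from scipy.io import wavfile

    rate, samples = wavfile.read("avf_bruit.wav")      # hypothetical recording
    samples = samples.astype(float)
    spectrum = np.abs(np.fft.rfft(samples))            # FFT magnitude
    freqs = np.fft.rfftfreq(len(samples), d=1.0 / rate)

    # Aggregate spectral energy into 250 Hz bins up to 10 kHz (40 intervals).
    edges = np.arange(0, 10_000 + 250, 250)
    band_energy = [spectrum[(freqs >= lo) & (freqs < hi)].sum()
                   for lo, hi in zip(edges[:-1], edges[1:])]
    peak = edges[int(np.argmax(band_energy))]
    print(f"dominant band: {peak}-{peak + 250} Hz")
    ```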

  15. Synthesis, spectral, computational and thermal analysis studies of metallocefotaxime antibiotics.

    PubMed

    Masoud, Mamdouh S; Ali, Alaa E; Elasala, Gehan S

    2015-01-01

    Cefotaxime metal complexes of Cr(III), Mn(II), Fe(III), Co(II), Ni(II), Cu(II), Zn(II), Cd(II), and Hg(II), and two mixed-metal complexes, (Fe,Cu) and (Fe,Ni), were synthesized and characterized by elemental analysis, IR, electronic spectra, magnetic susceptibility, and ESR spectra. The studies proved that cefotaxime may act as a mono-, bi-, tri-, or tetradentate ligand through the oxygen atoms of the lactam carbonyl, carboxylic, or amide carbonyl groups and the nitrogen atom of the thiazole ring. From the magnetic measurements and electronic spectral data, octahedral structures were proposed for all complexes. Quantum chemical methods were applied to cefotaxime to calculate charges, bond lengths, bond angles, dihedral angles, electronegativity (χ), chemical potential (μ), global hardness (η), softness (σ), and the electrophilicity index (ω). The thermal decomposition of the prepared metal complexes was studied by TGA, DTA, and DSC techniques. Thermogravimetric studies revealed the presence of lattice or coordinated water molecules in all the prepared complexes, and decomposition mechanisms were suggested. The thermal decomposition of the complexes ended with the formation of metal oxides and carbon residue as the final product, except in the case of the Hg complex, where sublimation occurred in the temperature range 376.5-575.0 °C, so only carbon residue was produced during thermal decomposition. The orders of the chemical reactions (n) were calculated via the peak symmetry method, and the activation parameters were computed from the thermal decomposition data. The geometries of the complexes may convert from Oh to Td during the thermal decomposition steps.
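
    The abstract names the global reactivity indices without their working equations; in conceptual DFT they are conventionally obtained from the frontier-orbital energies, as in this sketch (the HOMO/LUMO values are invented, not the paper's computed ones).

    ```python
    # Standard conceptual-DFT formulas for the indices the abstract lists,
    # computed from frontier-orbital energies in eV (values invented).
    def reactivity_indices(e_homo: float, e_lumo: float) -> dict:
        mu = (e_homo + e_lumo) / 2          # chemical potential
        chi = -mu                           # electronegativity
        eta = (e_lumo - e_homo) / 2         # global hardness
        sigma = 1.0 / eta                   # softness
        omega = mu**2 / (2 * eta)           # electrophilicity index
        return {"chi": chi, "mu": mu, "eta": eta, "sigma": sigma, "omega": omega}

    print(reactivity_indices(e_homo=-9.1, e_lumo=-1.3))
    ```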

  16. Interface design of VSOP'94 computer code for safety analysis

    NASA Astrophysics Data System (ADS)

    Natsir, Khairina; Yazid, Putranto Ilham; Andiwijayakusuma, D.; Wahanani, Nursinta Adi

    2014-09-01

    Today, most software applications, including those in the nuclear field, come with a graphical user interface. VSOP'94 (Very Superior Old Program) was designed to simplify the process of performing reactor simulation. VSOP is an integrated code system for simulating the life history of a nuclear reactor and is devoted to education and research. One advantage of the VSOP program is its ability to calculate neutron spectrum estimation, fuel cycle, 2-D diffusion, resonance integrals, estimates of reactor fuel costs, and integrated thermal hydraulics. VSOP can also be used for comparative studies and reactor safety simulation. However, the existing VSOP is a conventional program, developed in Fortran 65, with several usability problems: it runs only on DEC Alpha mainframe platforms and provides text-based output that is difficult to use, especially in data preparation and in the interpretation of results. We developed GUI-VSOP, an interface program that facilitates data preparation, runs the VSOP code, and reads the results in a more user-friendly way, and that is usable on a personal computer (PC). Modifications include the development of interfaces for preprocessing, processing, and postprocessing. The GUI-based preprocessing interface provides a convenient way to prepare data. The processing interface provides convenience in configuring input files and libraries and in compiling the VSOP code. The postprocessing interface is designed to visualize the VSOP output in tabular and graphic form. GUI-VSOP is expected to simplify and speed up the process and analysis of safety aspects.

  17. Microwave circuit analysis and design by a massively distributed computing network

    NASA Astrophysics Data System (ADS)

    Vai, Mankuan; Prasad, Sheila

    1995-05-01

    The advances in microelectronic engineering have rendered massively distributed computing networks practical and affordable. This paper describes one application of this distributed computing paradigm to the analysis and design of microwave circuits. A distributed computing network, constructed in the form of a neural network, is developed to automate the operations typically performed on a normalized Smith chart. Examples showing the use of this computing network for impedance matching and stabilizing are provided.
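
    The abstract does not detail how the network encodes Smith chart operations; as a hedged illustration, the arithmetic such a network automates is the bilinear map between normalized impedance and reflection coefficient, sketched below with an invented load.

    ```python
    # Smith-chart arithmetic (illustrative, not the paper's network):
    # the bilinear map between normalized impedance z = Z/Z0 and the
    # reflection coefficient Gamma, and its inverse.
    def to_gamma(z: complex) -> complex:
        return (z - 1) / (z + 1)

    def to_impedance(gamma: complex) -> complex:
        return (1 + gamma) / (1 - gamma)

    z_load = complex(0.5, 0.8)              # invented load, Z0 = 50 ohm assumed
    gamma = to_gamma(z_load)
    print(f"|Gamma| = {abs(gamma):.3f}")
    print("round trip:", to_impedance(gamma))   # recovers 0.5+0.8j
    ```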

  18. Using Puppet to contextualize computing resources for ATLAS analysis on Google Compute Engine

    NASA Astrophysics Data System (ADS)

    Öhman, Henrik; Panitkin, Sergey; Hendrix, Valerie; Atlas Collaboration

    2014-06-01

    With the advent of commercial as well as institutional and national clouds, new opportunities for on-demand computing resources become available to the HEP community. The new cloud technologies also come with new challenges, one of which is the contextualization of computing resources with regard to the requirements of users and their experiments. In particular, on Google's new cloud platform, Google Compute Engine (GCE), uploading the user's own virtual machine images is not possible. This precludes the application of ready-to-use technologies like CernVM and forces users to build and contextualize their own VM images from scratch. We investigate the use of Puppet to facilitate the contextualization of cloud resources on GCE, with particular regard to ease of configuration and dynamic resource scaling.

  19. Video Capture and Analysis: Seizing on Computer Technology To Teach the Physical Sciences.

    ERIC Educational Resources Information Center

    Lessie, Douglas

    2001-01-01

    Describes a course for nonscience majors in which material is presented using video capture and analysis technology. Students study real-life physical phenomena and learn significant computer, quantitative analysis, and modeling skills. (SAH)

  20. The analysis of control trajectories using symbolic and database computing

    NASA Technical Reports Server (NTRS)

    Grossman, Robert

    1995-01-01

    This final report comprises the formal semi-annual status reports for this grant for the periods June 30-December 31, 1993, January 1-June 30, 1994, and June 1-December 31, 1994. The research supported by this grant is broadly concerned with the symbolic computation, mixed numeric-symbolic computation, and database computation of trajectories of dynamical systems, especially control systems. A review of work during the report period covers: trajectories and approximating series, the Cayley algebra of trees, actions of differential operators, geometrically stable integration algorithms, hybrid systems, trajectory stores, PTool, and other activities. A list of publications written during the report period is attached.