Science.gov

Sample records for 3-dimensional computational analysis

  1. Image analysis and superimposition of 3-dimensional cone-beam computed tomography models

    PubMed Central

    Cevidanes, Lucia H. S.; Styner, Martin A.; Proffit, William R.

    2013-01-01

    Three-dimensional (3D) imaging techniques can provide valuable information to clinicians and researchers. But as we move from traditional 2-dimensional (2D) cephalometric analysis to new 3D techniques, it is often necessary to compare 2D with 3D data. Cone-beam computed tomography (CBCT) provides simulation tools that can help bridge the gap between image types. CBCT acquisitions can be made to simulate panoramic, lateral, and posteroanterior cephalometric radiographs so that they can be compared with preexisting cephalometric databases. Applications of 3D imaging in orthodontics include initial diagnosis and superimpositions for assessing growth, treatment changes, and stability. Three-dimensional CBCT images show dental root inclination and torque, impacted and supernumerary tooth positions, thickness and morphology of bone at sites of mini-implants for anchorage, and osteotomy sites in surgical planning. Findings such as resorption, hyperplastic growth, displacement, shape anomalies of mandibular condyles, and morphological differences between the right and left sides emphasize the diagnostic value of computed tomography acquisitions. Furthermore, relationships of soft tissues and the airway can be assessed in 3 dimensions. PMID:16679201

  2. Airway Wall Area Derived from 3-Dimensional Computed Tomography Analysis Differs among Lung Lobes in Male Smokers

    PubMed Central

    Tho, Nguyen Van; Trang, Le Thi Huyen; Murakami, Yoshitaka; Ogawa, Emiko; Ryujin, Yasushi; Kanda, Rie; Nakagawa, Hiroaki; Goto, Kenichi; Fukunaga, Kentaro; Higami, Yuichi; Seto, Ruriko; Nagao, Taishi; Oguma, Tetsuya; Yamaguchi, Masafumi; Lan, Le Thi Tuyet; Nakano, Yasutaka

    2014-01-01

    Background It is time-consuming to obtain the square root of airway wall area of the hypothetical airway with an internal perimeter of 10 mm (√Aaw at Pi10), a comparable index of airway dimensions in chronic obstructive pulmonary disease (COPD), from all airways of the whole lungs using 3-dimensional computed tomography (CT) analysis. We hypothesized that √Aaw at Pi10 differs among the five lung lobes and that √Aaw at Pi10 derived from a single lung lobe has a high level of agreement with that derived from the whole lungs in smokers. Methods Pulmonary function tests and chest volumetric CTs were performed in 157 male smokers (102 COPD, 55 non-COPD). All visible bronchial segments from the 3rd to 5th generations were segmented and measured using commercially available 3-dimensional CT analysis software. √Aaw at Pi10 of each lung lobe was estimated from all measurable bronchial segments of that lobe. Results Using a mixed-effects model, √Aaw at Pi10 differed significantly among the five lung lobes (R2 = 0.78, P<0.0001). The Bland-Altman plots show that √Aaw at Pi10 derived from the right or left upper lobe had a high level of agreement with that derived from the whole lungs, while √Aaw at Pi10 derived from the right or left lower lobe did not. Conclusion In male smokers, CT-derived airway wall area differs among the five lung lobes, and airway wall area derived from the right or left upper lobe is representative of the whole lungs. PMID:24865661
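
    As an illustration of the index described above, the conventional Pi10 derivation can be sketched as a least-squares fit: regress the square root of wall area on internal perimeter across all measured airways and evaluate the fit at Pi = 10 mm. The function name and the sample measurements below are invented for illustration; the commercial software's internal algorithm is not shown here.

    ```python
    # Minimal sketch of the standard Pi10 derivation, not the commercial
    # software used in the study: regress sqrt(wall area) (mm) on internal
    # perimeter Pi (mm) over measured bronchial segments, evaluate at Pi = 10.
    import numpy as np

    def sqrt_aaw_at_pi10(wall_area_mm2, internal_perimeter_mm):
        """Return sqrt(Aaw) at Pi10 (mm) from per-airway CT measurements."""
        sqrt_wa = np.sqrt(np.asarray(wall_area_mm2, dtype=float))
        pi = np.asarray(internal_perimeter_mm, dtype=float)
        slope, intercept = np.polyfit(pi, sqrt_wa, 1)  # least-squares line
        return slope * 10.0 + intercept                # evaluate at Pi = 10 mm

    # Hypothetical measurements from one lobe's 3rd-5th generation airways:
    wa = [20.1, 14.8, 9.9, 7.2, 5.1]     # wall area, mm^2
    pi = [15.3, 12.6, 10.1, 8.4, 6.9]    # internal perimeter, mm
    print(round(sqrt_aaw_at_pi10(wa, pi), 2))
    ```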

  3. Comparative Validity and Reproducibility Study of Various Landmark-Oriented Reference Planes in 3-Dimensional Computed Tomographic Analysis for Patients Receiving Orthognathic Surgery

    PubMed Central

    Lin, Hsiu-Hsia; Chuang, Ya-Fang; Weng, Jing-Ling; Lo, Lun-Jou

    2015-01-01

    Background Three-dimensional computed tomographic imaging has become popular in clinical evaluation, treatment planning, surgical simulation, and outcome assessment for maxillofacial intervention. The purposes of this study were to investigate whether there is any correlation among landmark-based horizontal reference planes and to validate the reproducibility and reliability of landmark identification. Materials and Methods Preoperative and postoperative cone-beam computed tomographic images of patients who had undergone orthognathic surgery were collected. Landmark-oriented reference planes including the Frankfort horizontal plane (FHP) and the lateral semicircular canal plane (LSP) were established. Four FHPs were defined by selecting 3 points from the orbitale, porion, or midpoint of paired points. The LSP passed through both the lateral semicircular canal points and nasion. The distances between the maxillary or mandibular teeth and the reference planes were measured, and the differences between the 2 sides were calculated and compared. The precision in locating the landmarks was evaluated by performing repeated tests, and the intraobserver reproducibility and interobserver reliability were assessed. Results A total of 30 patients with facial deformity and malocclusion—10 patients with facial symmetry, 10 patients with facial asymmetry, and 10 patients with cleft lip and palate—were recruited. Comparing the differences among the 5 reference planes showed no statistically significant difference in any patient group. Regarding intraobserver reproducibility, the mean differences in the 3 coordinates varied from 0 to 0.35 mm, with correlation coefficients between 0.96 and 1.0, showing high correlation between repeated tests. Regarding interobserver reliability, the mean differences in the 3 coordinates varied from 0 to 0.47 mm, with correlation coefficients between 0.88 and 1.0, exhibiting high correlation between the different examiners. Conclusions The landmark-oriented horizontal reference planes were comparable across all patient groups, and landmark identification was highly reproducible within observers and reliable between observers.

  4. Particle trajectory computation on a 3-dimensional engine inlet. Final Report Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Kim, J. J.

    1986-01-01

    A 3-dimensional particle trajectory computer code was developed to compute the distribution of water droplet impingement efficiency on a 3-dimensional engine inlet. The computed results provide the essential droplet impingement data required for the engine inlet anti-icing system design and analysis. The droplet trajectories are obtained by solving the trajectory equation using the fourth order Runge-Kutta and Adams predictor-corrector schemes. A compressible 3-D full potential flow code is employed to obtain a cylindrical grid definition of the flowfield on and about the engine inlet. The inlet surface is defined mathematically through a system of bi-cubic parametric patches in order to compute the droplet impingement points accurately. Analysis results of the 3-D trajectory code obtained for an axisymmetric droplet impingement problem are in good agreement with NACA experimental data. Experimental data are not yet available for the engine inlet impingement problem analyzed. Applicability of the method to solid particle impingement problems, such as engine sand ingestion, is also demonstrated.
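
    The trajectory integration the report describes can be sketched in miniature. The snippet below is not the NASA code: the flow field is a placeholder uniform stream, and the lumped drag constant k is an assumption standing in for the droplet drag law; only the fourth-order Runge-Kutta stepping mirrors the abstract.

    ```python
    # Illustrative sketch: integrate a water droplet's trajectory through a
    # known airflow field with a fourth-order Runge-Kutta step.
    import numpy as np

    def air_velocity(x):
        """Placeholder flow field; the real code interpolates a 3-D potential-flow solution."""
        return np.array([80.0, 0.0, 0.0])  # uniform 80 m/s axial flow

    def accel(x, v, k=50.0):
        """Droplet acceleration from Stokes-like drag toward the local air velocity.
        k (1/s) lumps drag coefficient, droplet size, and density -- an assumption."""
        return k * (air_velocity(x) - v)

    def rk4_step(x, v, dt):
        """One RK4 step for the coupled position/velocity system."""
        k1x, k1v = v, accel(x, v)
        k2x, k2v = v + 0.5*dt*k1v, accel(x + 0.5*dt*k1x, v + 0.5*dt*k1v)
        k3x, k3v = v + 0.5*dt*k2v, accel(x + 0.5*dt*k2x, v + 0.5*dt*k2v)
        k4x, k4v = v + dt*k3v,     accel(x + dt*k3x,     v + dt*k3v)
        x_new = x + dt/6.0 * (k1x + 2*k2x + 2*k3x + k4x)
        v_new = v + dt/6.0 * (k1v + 2*k2v + 2*k3v + k4v)
        return x_new, v_new

    x, v = np.zeros(3), np.zeros(3)
    for _ in range(1000):               # march the droplet for 0.1 s
        x, v = rk4_step(x, v, 1e-4)
    print(x, v)                         # droplet nearly follows the air by now
    ```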

  5. The Effectiveness of an Interactive 3-Dimensional Computer Graphics Model for Medical Education

    PubMed Central

    Konishi, Takeshi; Tamura, Yoko; Moriguchi, Hiroki

    2012-01-01

    Background Medical students often have difficulty achieving a conceptual understanding of 3-dimensional (3D) anatomy, such as bone alignment, muscles, and complex movements, from 2-dimensional (2D) images. To this end, animated and interactive 3-dimensional computer graphics (3DCG) can provide better visual information to users. In medical fields, research on the advantages of 3DCG in medical education is relatively new. Objective To determine the educational effectiveness of interactive 3DCG. Methods We divided 100 participants (27 men, mean (SD) age 17.9 (0.6) years, and 73 women, mean (SD) age 18.1 (1.1) years) from the Health Sciences University of Mongolia (HSUM) into 3DCG (n = 50) and textbook-only (control) (n = 50) groups. The control group used a textbook and 2D images, while the 3DCG group was trained to use the interactive 3DCG shoulder model in addition to a textbook. We conducted a questionnaire survey via an encrypted satellite network between HSUM and Tokushima University. The questionnaire was scored on a 5-point Likert scale from strongly disagree (score 1) to strongly agree (score 5). Results Interactive 3DCG was effective in undergraduate medical education. Specifically, there was a significant difference in mean (SD) scores between the 3DCG and control groups in their response to questionnaire items regarding content (4.26 (0.69) vs 3.85 (0.68), P = .001) and teaching methods (4.33 (0.65) vs 3.74 (0.79), P < .001), but no significant difference in the Web category. Participants also provided meaningful comments on the advantages of interactive 3DCG. Conclusions Interactive 3DCG materials have positive effects on medical education when properly integrated into conventional education. In particular, our results suggest that interactive 3DCG is more efficient than textbooks alone in medical education and can motivate students to understand complex anatomical structures. PMID:23611759

  6. Superimposition of 3-dimensional cone-beam computed tomography models of growing patients

    PubMed Central

    Cevidanes, Lucia H. C.; Heymann, Gavin; Cornelis, Marie A.; DeClerck, Hugo J.; Tulloch, J. F. Camilla

    2009-01-01

    Introduction The objective of this study was to evaluate a new method for superimposition of 3-dimensional (3D) models of growing subjects. Methods Cone-beam computed tomography scans were taken before and after Class III malocclusion orthopedic treatment with miniplates. Three observers independently constructed 18 3D virtual surface models from cone-beam computed tomography scans of 3 patients. Separate 3D models were constructed for soft-tissue, cranial base, maxillary, and mandibular surfaces. The anterior cranial fossa was used to register the 3D models from before and after treatment (about 1 year of follow-up). Results Three-dimensional overlays of superimposed models and 3D color-coded displacement maps allowed visual and quantitative assessment of growth and treatment changes. The range of interobserver errors for each anatomic region was 0.4 mm for the zygomatic process of the maxilla, chin, condyles, posterior border of the rami, and lower border of the mandible, and 0.5 mm for the anterior maxilla and the soft-tissue upper lip. Conclusions Our results suggest that this method is a valid and reproducible assessment of treatment outcomes for growing subjects. This technique can be used to identify maxillary and mandibular positional changes and bone remodeling relative to the anterior cranial fossa. PMID:19577154

  7. MAPAG: a computer program to construct 2- and 3-dimensional antigenic maps.

    PubMed

    Aguilar, R C; Retegui, L A; Roguin, L P

    1994-01-01

    The contact area between an antibody (Ab) and the antigen (Ag) is called the antigenic determinant or epitope. The first step in the characterization of an Ag by using monoclonal antibodies (MAb) is to map the relative distribution of the corresponding epitopes on the Ag surface. The computer program MAPAG has been devised to construct antigenic maps automatically. MAPAG is fed with a binary matrix of experimental data indicating the ability of paired MAb to bind simultaneously to the Ag or not. The program is interactive and menu-driven and allows easy data handling. MAPAG uses iterative processes to construct and adjust the final map, which is graphically shown as a 2- or 3-dimensional model. Additionally, the antigenic map obtained can be modified by the user or readjusted by the program. The suitability of MAPAG was illustrated by running experimental data from the literature and comparing the antigenic maps constructed by the program with those elaborated by the investigators without the assistance of a computer. Furthermore, since some MAb could present negative allosteric effects leading to misinterpretation of data, MAPAG has been provided with an approximate-reasoning module to handle such anomalous situations. Results indicated that the program can be successfully employed as a simple, fast, and reliable antigenic model-builder.
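
    MAPAG's iterative map-building procedure is not reproduced here, but the underlying idea, turning pairwise MAb competition data into 2- or 3-dimensional epitope coordinates, can be sketched with off-the-shelf multidimensional scaling. The binary matrix below and the competition-to-distance conversion are invented assumptions, not the paper's algorithm.

    ```python
    # Conceptual sketch only: treat "pair of MAbs cannot bind simultaneously"
    # (1 in the binary matrix) as evidence of epitope proximity, convert to a
    # dissimilarity, and embed the epitopes in 2-D by multidimensional scaling.
    import numpy as np
    from sklearn.manifold import MDS

    compete = np.array([[1, 1, 0, 0],   # 1 = the two MAbs compete (epitopes close)
                        [1, 1, 1, 0],
                        [0, 1, 1, 0],
                        [0, 0, 0, 1]])
    diss = 1.0 - compete                # competing pairs get a small distance
    np.fill_diagonal(diss, 0.0)
    xy = MDS(n_components=2, dissimilarity="precomputed",
             random_state=0).fit_transform(diss)
    print(np.round(xy, 2))              # 2-D antigenic map coordinates
    ```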

  8. Cerebral Degeneration in Amyotrophic Lateral Sclerosis Revealed by 3-Dimensional Texture Analysis

    PubMed Central

    Maani, Rouzbeh; Yang, Yee-Hong; Emery, Derek; Kalra, Sanjay

    2016-01-01

    Introduction: Routine MR images do not consistently reveal pathological changes in the brain in ALS. Texture analysis, a method to quantitate voxel intensities and their patterns and interrelationships, can detect changes in images not apparent to the naked eye. Our objective was to evaluate cerebral degeneration in ALS using 3-dimensional texture analysis of MR images of the brain. Methods: In a case-control design, voxel-based texture analysis was performed on T1-weighted MR images of 20 healthy subjects and 19 patients with ALS. Four texture features, namely, autocorrelation, sum of squares variance, sum average, and sum variance were computed. Texture features were compared between the groups by statistical parametric mapping and correlated with clinical measures of disability and upper motor neuron dysfunction. Results: Texture features were different in ALS in motor regions including the precentral gyrus and corticospinal tracts. To a lesser extent, changes were also found in the thalamus, cingulate gyrus, and temporal lobe. Texture features in the precentral gyrus correlated with disease duration, and in the corticospinal tract they correlated with finger tapping speed. Conclusions: Changes in MR image textures are present in motor and non-motor regions in ALS and correlate with clinical features. Whole brain texture analysis has potential in providing biomarkers of cerebral degeneration in ALS. PMID:27064416
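
    The named features are classical co-occurrence (Haralick-type) statistics. As a hedged sketch, the snippet below computes two of them, sum average and sum variance, from a gray-level co-occurrence matrix for a single voxel offset; the paper's voxel-wise 3-dimensional implementation and parameter choices are not shown, and the quantization scheme here is an assumption.

    ```python
    # Sketch of co-occurrence texture features like those named in the abstract,
    # for one offset along the last (x) axis of a 3-D volume scaled to [0, 1).
    import numpy as np

    def glcm_features(volume, levels=16):
        """Sum average and sum variance from a GLCM at offset (0, 0, 1)."""
        q = np.clip((np.asarray(volume) * levels).astype(int), 0, levels - 1)
        a, b = q[..., :-1].ravel(), q[..., 1:].ravel()    # voxel pairs along x
        p = np.zeros((levels, levels))
        np.add.at(p, (a, b), 1)                           # co-occurrence counts
        p /= p.sum()                                      # normalize to probabilities
        # Haralick "sum" statistics over the diagonal sums s = i + j
        s_vals = np.arange(2 * levels - 1)
        k = np.add.outer(np.arange(levels), np.arange(levels))
        p_sum = np.array([p[k == s].sum() for s in s_vals])
        sum_average = (s_vals * p_sum).sum()
        sum_variance = ((s_vals - sum_average) ** 2 * p_sum).sum()
        return sum_average, sum_variance

    rng = np.random.default_rng(0)
    vol = rng.random((8, 8, 8))          # stand-in for a T1 MR patch
    print(glcm_features(vol))
    ```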

  9. Computation of transonic potential flow about 3 dimensional inlets, ducts, and bodies

    NASA Technical Reports Server (NTRS)

    Reyhner, T. A.

    1982-01-01

    An analysis was developed and a computer code, P465 Version A, written for the prediction of transonic potential flow about three dimensional objects including inlet, duct, and body geometries. Finite differences and line relaxation are used to solve the complete potential flow equation. The coordinate system used for the calculations is independent of body geometry. Cylindrical coordinates are used for the computer code. The analysis is programmed in extended FORTRAN 4 for the CYBER 203 vector computer. The programming of the analysis is oriented toward taking advantage of the vector processing capabilities of this computer. Comparisons of computed results with experimental measurements are presented to verify the analysis. Descriptions of program input and output formats are also presented.

  10. 3-Dimensional analysis for class III malocclusion patients with facial asymmetry

    PubMed Central

    Ki, Eun-Jung; Cheon, Hae-Myung; Choi, Eun-Joo; Kwon, Kyung-Hwan

    2013-01-01

    Objectives The aim of this study is to investigate the correlation between 2-dimensional (2D) cephalometric measurements and 3-dimensional (3D) cone beam computed tomography (CBCT) measurements, and to evaluate the usefulness of 3D analysis for asymmetry patients. Materials and Methods A total of 27 patients were evaluated for facial asymmetry by photography, cephalometric radiography, and CBCT. Fourteen measurement values were evaluated, and the 2D and 3D values were compared. The patients were classified into two groups: patients in group 1 showed symmetry of the middle third of the face and asymmetry of the lower third, and those in group 2 showed asymmetry of both the middle and lower thirds of the face. Results In group 1, significant differences were observed in nine of the 14 values: three from anteroposterior cephalometric measurements (cant and both body heights) and six from lateral cephalometric radiographs (both ramus lengths, both lateral ramal inclinations, and both gonial angles). In group 2, comparison between 2D and 3D showed significant differences in 10 values: four from anteroposterior cephalometric measurements (both maxillary heights, both body heights) and six from lateral cephalometric radiographs (both ramus lengths, both lateral ramal inclinations, and both gonial angles). Conclusion Information from 2D analysis was inaccurate in several measurements. Therefore, in asymmetry patients, 3D analysis is useful in the diagnosis of asymmetry. PMID:24471038

  11. Automated image analysis reveals the dynamic 3-dimensional organization of multi-ciliary arrays.

    PubMed

    Galati, Domenico F; Abuin, David S; Tauber, Gabriel A; Pham, Andrew T; Pearson, Chad G

    2015-12-23

    Multi-ciliated cells (MCCs) use polarized fields of undulating cilia (ciliary array) to produce fluid flow that is essential for many biological processes. Cilia are positioned by microtubule scaffolds called basal bodies (BBs) that are arranged within a spatially complex 3-dimensional geometry (3D). Here, we develop a robust and automated computational image analysis routine to quantify 3D BB organization in the ciliate, Tetrahymena thermophila. Using this routine, we generate the first morphologically constrained 3D reconstructions of Tetrahymena cells and elucidate rules that govern the kinetics of MCC organization. We demonstrate the interplay between BB duplication and cell size expansion through the cell cycle. In mutant cells, we identify a potential BB surveillance mechanism that balances large gaps in BB spacing by increasing the frequency of closely spaced BBs in other regions of the cell. Finally, by taking advantage of a mutant predisposed to BB disorganization, we locate the spatial domains that are most prone to disorganization by environmental stimuli. Collectively, our analyses reveal the importance of quantitative image analysis to understand the principles that guide the 3D organization of MCCs.
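
    One ingredient of such a routine, quantifying basal-body spacing and flagging abnormally large gaps, can be sketched with a nearest-neighbor query. The gap threshold and the synthetic coordinates below are assumptions for illustration; the published routine is considerably more elaborate.

    ```python
    # Sketch: flag abnormally large gaps in basal-body (BB) spacing from 3-D
    # coordinates using nearest-neighbor distances.
    import numpy as np
    from scipy.spatial import cKDTree

    def bb_gap_stats(points, gap_factor=2.0):
        """points: (n, 3) BB coordinates. Returns NN distances and a 'large gap' mask."""
        tree = cKDTree(points)
        d, _ = tree.query(points, k=2)   # k=2: the first hit is the point itself
        nn = d[:, 1]
        return nn, nn > gap_factor * np.median(nn)

    rng = np.random.default_rng(2)
    pts = rng.random((200, 3)) * [50, 10, 10]   # synthetic ciliary rows, microns
    nn, big = bb_gap_stats(pts)
    print(f"median spacing {np.median(nn):.2f} um, large gaps flagged: {big.sum()}")
    ```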

  12. Automated image analysis reveals the dynamic 3-dimensional organization of multi-ciliary arrays

    PubMed Central

    Galati, Domenico F.; Abuin, David S.; Tauber, Gabriel A.; Pham, Andrew T.; Pearson, Chad G.

    2016-01-01

    Multi-ciliated cells (MCCs) use polarized fields of undulating cilia (ciliary array) to produce fluid flow that is essential for many biological processes. Cilia are positioned by microtubule scaffolds called basal bodies (BBs) that are arranged within a spatially complex 3-dimensional geometry (3D). Here, we develop a robust and automated computational image analysis routine to quantify 3D BB organization in the ciliate, Tetrahymena thermophila. Using this routine, we generate the first morphologically constrained 3D reconstructions of Tetrahymena cells and elucidate rules that govern the kinetics of MCC organization. We demonstrate the interplay between BB duplication and cell size expansion through the cell cycle. In mutant cells, we identify a potential BB surveillance mechanism that balances large gaps in BB spacing by increasing the frequency of closely spaced BBs in other regions of the cell. Finally, by taking advantage of a mutant predisposed to BB disorganization, we locate the spatial domains that are most prone to disorganization by environmental stimuli. Collectively, our analyses reveal the importance of quantitative image analysis to understand the principles that guide the 3D organization of MCCs. PMID:26700722

  13. Effect of mandibular advancement on the natural position of the head: a preliminary study of 3-dimensional cephalometric analysis.

    PubMed

    Lin, Xiaozhen; Liu, Yanpu; Edwards, Sean P

    2013-10-01

    Our aim was to investigate the potential effect of advancement by bilateral sagittal split osteotomy (BSSO) on the natural position of the head by using 3-dimensional cephalometric analysis. Seven consecutive patients who had undergone only BSSO advancement, and who had preoperative and 6-week postoperative cone beam computed tomography (CT) scans, were recruited to this retrospective study. Two variables, SNB and SNC2, were used to indicate craniomandibular alignment and craniocervical inclination, respectively, in the midsagittal plane. Using 3-dimensional cephalometric analysis software, the SNB and the SNC2 were recorded in the volume and measured in the midsagittal plane at 3 independent time points. The reliability was measured, and a paired t test was used to assess the significance of differences between the means of SNB and SNC2 before and after operation. The 3-dimensional cephalometric measurement showed good reliability. The SNB was increased as planned in all the mandibles that were advanced, the cervical vertebrae were brought forward after BSSO, and the SNC2 was significantly increased in 6 of the 7 patients. Three-dimensional cephalometric analysis may provide an alternative way of assessing cephalometrics. After BSSO advancement, the natural position of the head changed by increasing the craniocervical inclination in an anteroposterior direction.

  14. Fast Apriori-based Graph Mining Algorithm and application to 3-dimensional Structure Analysis

    NASA Astrophysics Data System (ADS)

    Nishimura, Yoshio; Washio, Takashi; Yoshida, Tetsuya; Motoda, Hiroshi; Inokuchi, Akihiro; Okada, Takashi

    The Apriori-based Graph Mining (AGM) algorithm efficiently extracts all the subgraph patterns that frequently appear in graph-structured data. The algorithm can deal with general graph-structured data with multiple labels of vertices and edges, and is capable of analyzing the topological structure of graphs. In this paper, we propose a new method to analyze graph-structured data with 3-dimensional coordinates by AGM. In this method the distance between each pair of vertices of a graph is calculated and added to the edge label so that AGM can handle 3-dimensional graph-structured data. One problem in our approach is that the number of edge labels increases, which results in an increase of the computational time needed to extract subgraph patterns. To alleviate this problem, we also propose a faster AGM algorithm that adds an extra constraint to reduce the number of candidates generated when seeking frequent subgraphs. Chemical compounds with dopamine antagonist activity in the MDDR database were analyzed by AGM to characterize their 3-dimensional chemical structure and its correlation with physiological activity.
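
    The preprocessing step described, folding geometry into edge labels, can be sketched as follows; the bin width and the label format are assumptions, and the frequent-subgraph mining itself (AGM proper) is not reproduced.

    ```python
    # Sketch of the paper's preprocessing idea (assumed details): compute
    # pairwise distances between vertices with 3-D coordinates and discretize
    # each distance into a bin appended to the edge label, so a topological
    # miner such as AGM can see approximate geometry.
    import numpy as np
    from itertools import combinations

    def distance_labeled_edges(coords, labels, bin_width=1.5):
        """coords: (n, 3) array; labels: per-vertex labels (e.g., atom types).
        Returns edges as (i, j, 'A-B:bin') with a quantized distance bin."""
        coords = np.asarray(coords, dtype=float)
        edges = []
        for i, j in combinations(range(len(coords)), 2):
            d = np.linalg.norm(coords[i] - coords[j])
            b = int(d // bin_width)                 # coarse distance bin
            edges.append((i, j, f"{labels[i]}-{labels[j]}:{b}"))
        return edges

    # Hypothetical 4-atom fragment:
    xyz = [(0, 0, 0), (1.4, 0, 0), (2.1, 1.2, 0), (0.7, 2.4, 0.5)]
    print(distance_labeled_edges(xyz, ["C", "C", "N", "O"]))
    ```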

  15. Use of 3-dimensional computed tomography to detect a barium-masked fish bone causing esophageal perforation.

    PubMed

    Tsukiyama, Atsushi; Tagami, Takashi; Kim, Shiei; Yokota, Hiroyuki

    2014-01-01

    Computed tomography (CT) is useful for evaluating esophageal foreign bodies and detecting perforation. However, when evaluation is difficult owing to the previous use of barium as a contrast medium, 3-dimensional CT may facilitate accurate diagnosis. A 49-year-old man was transferred to our hospital with the diagnosis of esophageal perforation. Because barium had been used as a contrast medium for an esophagram performed at a previous hospital, axial CT and esophageal endoscopy could not identify the foreign body or characterize the lesion. However, 3-dimensional CT clearly revealed an L-shaped foreign body and its anatomical relationships in the mediastinum. Accordingly, we removed the foreign body using an upper gastrointestinal endoscope. The foreign body was the premaxillary bone of a sea bream. The patient was discharged without complications.

  16. Comparison of nonnavigated and 3-dimensional image-based computer navigated balloon kyphoplasty.

    PubMed

    Sembrano, Jonathan N; Yson, Sharon C; Polly, David W; Ledonio, Charles Gerald T; Nuckley, David J; Santos, Edward R G

    2015-01-01

    Balloon kyphoplasty is a common treatment for osteoporotic and pathologic compression fractures. Advantages include minimal tissue disruption, quick recovery, pain relief, and in some cases prevention of progressive sagittal deformity. The benefit of image-based navigation in kyphoplasty has not been established. The goal of this study was to determine whether there is a difference between fluoroscopy-guided balloon kyphoplasty and 3-dimensional image-based navigation in terms of needle malposition rate, cement leakage rate, and radiation exposure time. The authors compared navigated and nonnavigated needle placement in 30 balloon kyphoplasty procedures (47 levels). Intraoperative 3-dimensional image-based navigation was used for needle placement in 21 cases (36 levels); conventional 2-dimensional fluoroscopy was used in the other 9 cases (11 levels). The 2 groups were compared for rates of needle malposition and cement leakage as well as radiation exposure time. Three of 11 (27%) nonnavigated levels were complicated by a malpositioned needle, and 2 of these had to be repositioned. The navigated group had a significantly lower malposition rate (1 of 36; 3%; P=.04). The overall rate of cement leakage was similar in both groups (P=.29). Radiation exposure time was also similar in both groups (navigated, 98 s/level; nonnavigated, 125 s/level; P=.10). Navigated kyphoplasty procedures did not differ significantly from nonnavigated procedures except in needle malposition rate, where navigation may have decreased the need for needle repositioning.
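
    The abstract does not state which test produced P=.04; a Fisher exact test on the per-level counts is one plausible reconstruction (a sketch, not necessarily the authors' analysis).

    ```python
    # Compare malposition rates: 3 of 11 nonnavigated levels vs 1 of 36
    # navigated levels, via Fisher's exact test.
    from scipy.stats import fisher_exact

    # rows: malpositioned / well positioned; columns: nonnavigated, navigated
    table = [[3, 1],
             [8, 35]]
    odds_ratio, p = fisher_exact(table)
    print(f"odds ratio = {odds_ratio:.1f}, two-sided P = {p:.3f}")
    ```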

  17. Surgical Classification of the Mandibular Deformity in Craniofacial Microsomia Using 3-Dimensional Computed Tomography

    PubMed Central

    Swanson, Jordan W.; Mitchell, Brianne T.; Wink, Jason A.; Taylor, Jesse A.

    2016-01-01

    Background: Grading systems of the mandibular deformity in craniofacial microsomia (CFM) based on conventional radiographs have shown low interrater reproducibility among craniofacial surgeons. We sought to design and validate a classification based on 3-dimensional CT (3dCT) that correlates features of the deformity with surgical treatment. Methods: CFM mandibular deformities were classified as normal (T0), mild (hypoplastic, likely treated with orthodontics or orthognathic surgery; T1), moderate (vertically deficient ramus, likely treated with distraction osteogenesis; T2), or severe (ramus rudimentary or absent, with either adequate or inadequate mandibular body bone stock; T3 and T4, likely treated with costochondral graft or free fibular flap, respectively). The 3dCT face scans of CFM patients were randomized and then classified by craniofacial surgeons. Pairwise agreement and Fleiss' κ were used to assess interrater reliability. Results: The 3dCT images of 43 patients with CFM (aged 0.1–15.8 years) were reviewed by 15 craniofacial surgeons, representing an average of 15.2 years of experience. Reviewers demonstrated fair interrater reliability, with an average pairwise agreement of 50.4 ± 9.9% (Fleiss' κ = 0.34). This represents a significant improvement over the Pruzansky–Kaban classification (pairwise agreement, 39.2%; P = .0033). Reviewers demonstrated substantial interrater reliability, with an average pairwise agreement of 83.0 ± 7.6% (κ = 0.64), in distinguishing deformities requiring graft or flap reconstruction (T3 and T4) from others. Conclusion: The proposed classification, designed for the era of 3dCT, shows improved consensus with respect to stratifying the severity of mandibular deformity and the type of operative management. PMID:27104097
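
    Both reliability statistics quoted above are standard and can be computed from a cases-by-raters matrix. The sketch below uses synthetic ratings (random data, so κ should land near 0); only the formulas, mean pairwise agreement and Fleiss' κ, correspond to the abstract.

    ```python
    # Mean pairwise agreement and Fleiss' kappa for R raters assigning each of
    # N cases to one of k categories.
    import numpy as np
    from itertools import combinations

    def pairwise_agreement(ratings):
        """ratings: (N, R) integer category matrix -> mean pairwise agreement."""
        N, R = ratings.shape
        agree = [np.mean(ratings[:, a] == ratings[:, b])
                 for a, b in combinations(range(R), 2)]
        return float(np.mean(agree))

    def fleiss_kappa(ratings, k):
        """Fleiss' kappa from the same (N, R) matrix with k categories."""
        N, R = ratings.shape
        counts = np.stack([(ratings == c).sum(axis=1) for c in range(k)], axis=1)
        p_i = ((counts ** 2).sum(axis=1) - R) / (R * (R - 1))  # per-case agreement
        p_e = ((counts.sum(axis=0) / (N * R)) ** 2).sum()      # chance agreement
        return (p_i.mean() - p_e) / (1 - p_e)

    rng = np.random.default_rng(1)
    r = rng.integers(0, 5, size=(43, 15))  # 43 cases, 15 raters, 5 grades (T0-T4)
    print(pairwise_agreement(r), fleiss_kappa(r, 5))  # random data: kappa ~ 0
    ```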

  18. Solution of 3-dimensional time-dependent viscous flows. Part 2: Development of the computer code

    NASA Technical Reports Server (NTRS)

    Weinberg, B. C.; Mcdonald, H.

    1980-01-01

    There is considerable interest in developing a numerical scheme for solving the time-dependent viscous compressible three-dimensional flow equations to aid in the design of helicopter rotors. The development of a computer code to solve a three-dimensional unsteady approximate form of the Navier-Stokes equations employing a linearized block implicit technique in conjunction with a QR operator scheme is described. Results of calculations for several Cartesian test cases are presented. The computer code can be applied to more complex flow fields, such as those encountered on rotating airfoils.

  19. A Modular Computer Code for Simulating Reactive Multi-Species Transport in 3-Dimensional Groundwater Systems

    SciTech Connect

    TP Clement

    1999-06-24

    RT3DV1 (Reactive Transport in 3-Dimensions) is a computer code that solves the coupled partial differential equations that describe reactive flow and transport of multiple mobile and/or immobile species in three-dimensional saturated groundwater systems. RT3D is a generalized multi-species version of the US Environmental Protection Agency (EPA) transport code MT3D (Zheng, 1990). The current version of RT3D uses the advection and dispersion solvers from the DOD-1.5 (1997) version of MT3D. As with MT3D, RT3D also requires the groundwater flow code MODFLOW for computing spatial and temporal variations in the groundwater head distribution. The RT3D code was originally developed to support contaminant transport modeling efforts at natural attenuation demonstration sites. As a research tool, RT3D has also been used to model several laboratory and pilot-scale active bioremediation experiments. The performance of RT3D has been validated by comparing the code results against various numerical and analytical solutions. The code is currently being used to model field-scale natural attenuation at multiple sites. The RT3D code is unique in that it includes an implicit reaction solver that makes the code sufficiently flexible for simulating various types of chemical and microbial reaction kinetics. RT3D V1.0 supports seven pre-programmed reaction modules that can be used to simulate different types of reactive contaminants, including benzene-toluene-xylene mixtures (BTEX) and chlorinated solvents such as tetrachloroethene (PCE) and trichloroethene (TCE). In addition, RT3D has a user-defined reaction option that can be used to simulate any other type of user-specified reactive transport system. This report describes the mathematical details of the RT3D computer code and its input/output data structure. It is assumed that the user is familiar with the basics of groundwater flow and contaminant transport mechanics. In addition, RT3D users are expected to have some experience in modeling groundwater flow and transport with the MODFLOW and MT3D codes.
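
    The flavor of a user-defined reaction module can be sketched as the reaction half of an operator split: integrate the kinetics in each grid cell over one transport time step. The sequential first-order PCE-to-TCE chain below uses invented rate constants and is not RT3D's actual module interface.

    ```python
    # Sketch of operator-split reaction kinetics: sequential first-order
    # dechlorination (PCE -> TCE -> ...) integrated over one transport step.
    import numpy as np
    from scipy.integrate import solve_ivp

    K_PCE, K_TCE = 0.05, 0.02            # 1/day, assumed first-order decay rates

    def rxn(t, c):
        pce, tce = c
        return [-K_PCE * pce,                  # PCE decays
                K_PCE * pce - K_TCE * tce]     # TCE produced from PCE, then decays

    def react_cell(conc, dt):
        """Advance one cell's concentrations over a transport step of dt days."""
        sol = solve_ivp(rxn, (0.0, dt), conc, rtol=1e-8)
        return sol.y[:, -1]

    print(react_cell(np.array([1.0, 0.0]), 10.0))  # mg/L after a 10-day step
    ```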

  20. Estimation of Nasal Tip Support Using Computer-Aided Design and 3-Dimensional Printed Models

    PubMed Central

    Gray, Eric; Maducdoc, Marlon; Manuel, Cyrus; Wong, Brian J. F.

    2016-01-01

    IMPORTANCE Palpation of the nasal tip is an essential component of the preoperative rhinoplasty examination. Measuring tip support is challenging, and the forces that correspond to ideal tip support are unknown. OBJECTIVE To identify the integrated reaction force and the minimum and ideal mechanical properties associated with nasal tip support. DESIGN, SETTING, AND PARTICIPANTS Three-dimensional (3-D) printed anatomic silicone nasal models were created using a computed tomographic scan and computer-aided design software. From this model, 3-D printing and casting methods were used to create 5 anatomically correct nasal models of varying constitutive Young moduli (0.042, 0.086, 0.098, 0.252, and 0.302 MPa) from silicone. Thirty rhinoplasty surgeons who attended a regional rhinoplasty course evaluated the reaction force (nasal tip recoil) of each model by palpation and selected the model that satisfied their requirements for minimum and ideal tip support. Data were collected from May 3 to 4, 2014. RESULTS Of the 30 respondents, 4 surgeons had been in practice for 1 to 5 years; 9 surgeons, 6 to 15 years; 7 surgeons, 16 to 25 years; and 10 surgeons, 26 or more years. Seventeen surgeons considered themselves in the advanced to expert skill competency levels. Logistic regression estimated the minimum threshold for the Young moduli for adequate and ideal tip support to be 0.096 and 0.154 MPa, respectively. Logistic regression estimated the thresholds for the reaction force associated with the absolute minimum and ideal requirements for good tip recoil to be 0.26 to 4.74 N and 0.37 to 7.19 N during 1- to 8-mm displacement, respectively. CONCLUSIONS AND RELEVANCE This study presents a method to estimate clinically relevant nasal tip reaction forces, which serve as a proxy for nasal tip support. This information will become increasingly important in computational modeling of nasal tip mechanics and ultimately will enhance surgical planning for rhinoplasty.
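
    The threshold estimates above can be reconstructed in outline: fit a logistic model of accept/reject judgments against the Young modulus and solve for the modulus at 50% predicted acceptance. The vote counts below are invented; only the method mirrors the abstract.

    ```python
    # Estimate the modulus at which P(surgeon calls support "adequate") = 0.5.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    E = np.array([0.042, 0.086, 0.098, 0.252, 0.302])   # MPa, the 5 models
    # Hypothetical counts of surgeons (of 30) accepting each model:
    yes = np.array([1, 9, 18, 28, 30])

    X = np.repeat(E, 30).reshape(-1, 1)                 # one row per judgment
    y = np.concatenate([[1] * k + [0] * (30 - k) for k in yes])

    fit = LogisticRegression(C=1e6).fit(X, y)           # near-unpenalized fit
    threshold = -fit.intercept_[0] / fit.coef_[0, 0]    # where P(accept) = 0.5
    print(f"estimated minimum modulus ~ {threshold:.3f} MPa")
    ```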

  1. 3-dimensional (orthogonal) structural complexity of time-series data using low-order moment analysis

    NASA Astrophysics Data System (ADS)

    Law, Victor J.; O'Neill, Feidhlim T.; Dowling, Denis P.

    2012-09-01

    The recording of atmospheric pressure plasma (APP) electro-acoustic emission data has been developed as a plasma metrology tool in the last couple of years. Industrial applications include surface activation of polymers prior to bonding in the automotive and aerospace industries [1, 2, 3]. It has been shown that as the APP jet proceeds over a treatment surface, at various fixed heights, two contrasting acoustic signatures are produced which correspond to two very different plasma-surface entropy states (blow arc ~1700 ± 100 K; afterglow ~300-400 K) [4]. The metrology challenge is now to capture deterministic data points within data clusters. For this to be achieved, new real-time data cluster measurement techniques need to be developed [5]. The cluster information must be extracted within the allotted process time period if real-time process control is to be achieved. This abstract describes a theoretical structural complexity analysis (in terms of crossing points) of 2- and 3-dimensional line-graphs that contain time-series data. In addition, a LabVIEW implementation of the 3-dimensional data analysis is presented. It is also shown that the cluster analysis technique can be transferred to other (non-acoustic) datasets.

  2. Role of preoperative 3-dimensional computed tomography reconstruction in depressed skull fractures treated with craniectomy: a case report of forensic interest.

    PubMed

    Viel, Guido; Cecchetto, Giovanni; Manara, Renzo; Cecchetto, Attilio; Montisci, Massimo

    2011-06-01

    Patients affected by cranial trauma with depressed skull fractures and increased intracranial pressure generally undergo neurosurgical intervention. Because craniotomy and craniectomy remove skull fragments and generate new fracture lines, they complicate forensic examination and sometimes prevent a clear identification of skull fracture etiology. A 3-dimensional reconstruction based on preoperative computed tomography (CT) scans, giving a picture of the injuries before surgical intervention, can help the forensic examiner identify the origin of a skull fracture and the means of its production. We report the case of a 41-year-old man presenting at the emergency department with a depressed skull fracture at the vertex and bilateral subdural hemorrhage. The patient underwent 2 neurosurgical interventions (craniotomy and craniectomy) but died after 40 days of hospitalization in an intensive care unit. At autopsy, the absence of various bone fragments did not allow us to establish whether the skull had been struck by a blunt object or had hit the ground with high kinetic energy. To analyze the bone injuries before craniectomy, a 3-dimensional CT reconstruction based on preoperative scans was performed. A comparative analysis of the autopsy and radiological data allowed us to differentiate surgical from traumatic injuries. Moreover, based on the shape and size of the depressed skull fracture (measured from the CT reformations), we inferred that the man had been struck by a cylindric blunt object with a diameter of about 3 cm.

  3. Scene-of-crime analysis by a 3-dimensional optical digitizer: a useful perspective for forensic science.

    PubMed

    Sansoni, Giovanna; Cattaneo, Cristina; Trebeschi, Marco; Gibelli, Daniele; Poppa, Pasquale; Porta, Davide; Maldarella, Monica; Picozzi, Massimo

    2011-09-01

    Analysis and detailed registration of the crime scene are of the utmost importance during investigations. However, this phase of activity is often affected by the risk of loss of evidence due to the limits of traditional scene-of-crime registration methods (ie, photos and videos). This technical note shows the utility of applying a 3-dimensional optical digitizer to different crime scenes. The study verifies the importance and feasibility of contactless 3-dimensional reconstruction and modeling by optical digitization for achieving an optimal registration of the crime scene.

  4. User's manual for master: Modeling of aerodynamic surfaces by 3-dimensional explicit representation. [input to three dimensional computational fluid dynamics]

    NASA Technical Reports Server (NTRS)

    Gibson, S. G.

    1983-01-01

    A system of computer programs was developed to model general three dimensional surfaces. Surfaces are modeled as sets of parametric bicubic patches. There are also capabilities to transform coordinates, to compute mesh/surface intersection normals, and to format input data for a transonic potential flow analysis. A graphical display of surface models and intersection normals is available. There are additional capabilities to regulate point spacing on input curves and to compute surface/surface intersection curves. Input and output data formats are described; detailed suggestions are given for user input. Instructions for execution are given, and examples are shown.

  5. 3-Dimensional Analysis of Dynamic Behavior of Bearing of Nielsen Bridge

    NASA Astrophysics Data System (ADS)

    Tanimura, Shinji; Heya, Hiroyuki; Umeda, Tsutomu; Mimura, Koji; Yoshikawa, Osamu

    In 1995, the great Hanshin-Awaji earthquake caused a large amount of destruction and structural failures. One example whose mechanism is not fully clear is the fracture of a bridge bearing of a Nielsen type bridge, a failure that does not occur under ordinary static or dynamic loading conditions. The fracture probably resulted from very high stress due to an unexpected dynamic mechanism. In this paper, the 3-dimensional dynamic behavior of a Nielsen type bridge was analyzed by assuming a collision between the upper and lower parts of the bearing, which might have occurred in the great Hanshin-Awaji earthquake. The numerical results show that an impact due to a relative velocity of 5-6 m/s between the upper and lower parts of the bearing generates a stress sufficient to cause a fracture in the upper bearing. The observed features of the actual fracture surface were also reproduced fairly closely.

  6. Stress analysis in platform-switching implants: a 3-dimensional finite element study.

    PubMed

    Pellizzer, Eduardo Piza; Verri, Fellippo Ramos; Falcón-Antenucci, Rosse Mary; Júnior, Joel Ferreira Santiago; de Carvalho, Paulo Sérgio Perri; de Moraes, Sandra Lúcia Dantas; Noritomi, Pedro Yoshito

    2012-10-01

    The aim of this study was to evaluate the influence of the platform-switching technique on stress distribution in implant, abutment, and peri-implant tissues, through a 3-dimensional finite element study. Three 3-dimensional mandibular models were fabricated using the SolidWorks 2006 and InVesalius software. Each model was composed of a bone block with one implant 10 mm long and of different diameters (3.75 and 5.00 mm). The UCLA abutments also varied in diameter, from 4.1 to 5.00 mm. After obtaining the geometries, the models were transferred to the software FEMAP 10.0 for pre- and postprocessing of finite elements to generate the mesh, loading, and boundary conditions. A total load of 200 N was applied in axial (0°), oblique (45°), and lateral (90°) directions. The models were solved by the software NeiNastran 9.0 and transferred to the software FEMAP 10.0 to obtain the results, which were visualized through von Mises and maximum principal stress maps. Model A (3.75-mm implant/4.1-mm abutment) exhibited the largest area of stress concentration under all loadings (axial, oblique, and lateral) for the implant and the abutment. All models presented stress areas at the abutment level and at the implant/abutment interface. Models B (5.0-mm implant/5.0-mm abutment) and C (5.0-mm implant/4.1-mm abutment) presented smaller areas of stress concentration and a similar distribution pattern. For the cortical bone, low stress concentration was observed in the peri-implant region for models B and C in comparison with model A. The trabecular bone exhibited low stress that was well distributed in models B and C. Model A presented the highest stress concentration. Model B exhibited better stress distribution. There was no significant difference between the large-diameter implants (models B and C).

  7. Surgical orthodontic treatment for a patient with advanced periodontal disease: evaluation with electromyography and 3-dimensional cone-beam computed tomography.

    PubMed

    Nakajima, Kan; Yamaguchi, Tetsutaro; Maki, Koutaro

    2009-09-01

    We report here the case of a woman with Class III malocclusion and advanced periodontal disease who was treated with surgical orthodontic correction. Functional recovery after orthodontic treatment is often monitored by serial electromyography of the masticatory muscles, whereas 3-dimensional cone-beam computed tomography can provide detailed structural information about, for example, periodontal bone defects. However, it is unclear whether the information obtained via these methods is sufficient to determine the treatment goal. Addressing this issue is particularly useful for patients with advanced periodontal disease, because treatment goals vary considerably among such patients. We used detailed information obtained by 3-dimensional cone-beam computed tomography to identify periodontal bone defects and set appropriate treatment goals for inclination of the incisors and mandibular surgery. Results for this patient included stable occlusion and improved facial esthetics. This case report illustrates the benefits of establishing treatment goals acceptable to the patient, based on precise 3-dimensional assessment of dentoalveolar bone, and of using masticatory muscle activity to monitor the stability of occlusion.

  8. Analysis of 3-dimensional finite element after reconstruction of impaired ankle deltoid ligament

    PubMed Central

    Ji, Yunhan; Tang, Xianzhong; Li, Yifan; Xu, Wei; Qiu, Wenjun

    2016-01-01

    We compared four repair techniques for the impaired ankle deltoid ligament, namely Wiltberger, Deland, Kitaoka, and Hintermann, using a 3-dimensional finite element model. We built an ankle deltoid ligament model, including six bone structures, cartilage, and the main ligaments around the ankle. After testing the model, we built an impaired deltoid ligament model plus four reconstruction models. Subsequently, different levels of force were imposed on ankles at different flexion angles, and ankle biomechanics were compared. In the course of bending, from plantar flexion 20° to dorsiflexion 20°, the extortion of the talus decreased while the eversion increased. The four reconstruction models failed to restore the impaired ankle to normal, showing an obvious increase of extortion and eversion. The Kitaoka technique reduced the extortion angle considerably. Compared with the other three techniques, the Kitaoka technique produced better results for the extortion angle, and the difference was statistically significant. However, in the case of eversion, there was no significant difference among the four techniques (P>0.05). The lateral ligaments' stress in all four models differed from that of the normal ankle. When the ankle was subjected to an extortion moment, the stress of the anterior talofibular ligament with the Kitaoka reconstruction was close to that with the complete deltoid ligament. When the ankle was subjected to an eversion moment, the stress of the anterior talofibular ligament with the Kitaoka and Deland reconstructions was close to that with the complete deltoid ligament. We concluded that the Kitaoka and Deland tendon reconstruction techniques could repair an impaired ankle deltoid ligament and re-establish its normal biomechanical characteristics. PMID:28105122

  9. Analysis of 3-dimensional finite element after reconstruction of impaired ankle deltoid ligament.

    PubMed

    Ji, Yunhan; Tang, Xianzhong; Li, Yifan; Xu, Wei; Qiu, Wenjun

    2016-12-01

    We compared four repair techniques for the impaired ankle deltoid ligament, namely Wiltberger, Deland, Kitaoka, and Hintermann, using a 3-dimensional finite element model. We built an ankle deltoid ligament model, including six bone structures, cartilage, and the main ligaments around the ankle. After testing the model, we built an impaired deltoid ligament model plus four reconstruction models. Subsequently, different levels of force were imposed on ankles at different flexion angles, and ankle biomechanics were compared. In the course of bending, from plantar flexion 20° to dorsiflexion 20°, the extortion of the talus decreased while the eversion increased. The four reconstruction models failed to restore the impaired ankle to normal, showing an obvious increase of extortion and eversion. The Kitaoka technique reduced the extortion angle considerably. Compared with the other three techniques, the Kitaoka technique produced better results for the extortion angle, and the difference was statistically significant. However, in the case of eversion, there was no significant difference among the four techniques (P>0.05). The lateral ligaments' stress in all four models differed from that of the normal ankle. When the ankle was subjected to an extortion moment, the stress of the anterior talofibular ligament with the Kitaoka reconstruction was close to that with the complete deltoid ligament. When the ankle was subjected to an eversion moment, the stress of the anterior talofibular ligament with the Kitaoka and Deland reconstructions was close to that with the complete deltoid ligament. We concluded that the Kitaoka and Deland tendon reconstruction techniques could repair an impaired ankle deltoid ligament and re-establish its normal biomechanical characteristics.

  10. BOPACE 3-D (the Boeing Plastic Analysis Capability for 3-dimensional Solids Using Isoparametric Finite Elements)

    NASA Technical Reports Server (NTRS)

    Vos, R. G.; Straayer, J. W.

    1975-01-01

    BOPACE 3-D is a finite element computer program that provides a general family of three-dimensional isoparametric solid elements and includes a new algorithm for improving the efficiency of the elastic-plastic-creep solution procedure. Theoretical, user-oriented, and programmer-oriented sections are presented to describe the program.

  11. Hydroelectric structures studies using 3-dimensional methods

    SciTech Connect

    Harrell, T.R.; Jones, G.V.; Toner, C.K. )

    1989-01-01

    Deterioration and degradation of aged hydroelectric project structures can significantly affect the operation and safety of a project. In many cases, hydroelectric headworks in particular have complicated geometrical configurations, loading patterns, and hence stress conditions. An accurate study of such structures can be performed using 3-dimensional computer models, which can be used both for stability evaluation and for finite element stress analysis. Computer-aided engineering processes facilitate the use of 3-D methods in both pre-processing and post-processing of data. Two actual project examples are used to emphasize the authors' points.

  12. Prosthesis-guided implant restoration of an auricular defect using computed tomography and 3-dimensional photographic imaging technologies: a clinical report.

    PubMed

    Wang, Shuming; Leng, Xu; Zheng, Yaqi; Zhang, Dapeng; Wu, Guofeng

    2015-02-01

    The concept of prosthesis-guided implantation has been widely accepted for intraoral implant placement, although clinicians do not fully appreciate its use for facial defect restoration. In this clinical report, multiple digital technologies were used to restore a facial defect with prosthesis-guided implantation. A simulation surgery was performed to remove the residual auricular tissue and to ensure the correct position of the mirrored contralateral ear model. The combined application of computed tomography and 3-dimensional photography preserved the position of the mirrored model and facilitated the definitive implant-retained auricular prosthesis.

  13. Hybrid-finite-element analysis of some nonlinear and 3-dimensional problems of engineering fracture mechanics

    NASA Technical Reports Server (NTRS)

    Atluri, S. N.; Nakagaki, M.; Kathiresan, K.

    1980-01-01

    In this paper, efficient numerical methods for the analysis of crack-closure effects on fatigue-crack-growth-rates, in plane stress situations, and for the solution of stress-intensity factors for arbitrary shaped surface flaws in pressure vessels, are presented. For the former problem, an elastic-plastic finite element procedure valid for the case of finite deformation gradients is developed and crack growth is simulated by the translation of near-crack-tip elements with embedded plastic singularities. For the latter problem, an embedded-elastic-singularity hybrid finite element method, which leads to a direct evaluation of K-factors, is employed.

  14. Solution of Poisson equations for 3-dimensional grid generations. [computations of a flow field over a thin delta wing]

    NASA Technical Reports Server (NTRS)

    Fujii, K.

    1983-01-01

    A method for generating three-dimensional finite difference grids about complicated geometries by using Poisson equations is developed. The inhomogeneous terms are automatically chosen such that orthogonality and spacing restrictions at the body surface are satisfied. Spherical variables are used to avoid the axis singularity, and an alternating-direction-implicit (ADI) solution scheme is used to accelerate the computations. Computed results are presented that show the capability of the method. Since most of the results presented have been used as grids for flow-field computations, this indicates that the method is a useful tool for generating three-dimensional grids about complicated geometries.
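
    The ADI iteration mentioned above can be sketched on a model problem: Peaceman-Rachford pseudo-time marching for a 2-dimensional Poisson equation with Dirichlet boundaries. The reduction to 2-D with a known analytic solution is an assumption for illustration; the paper's grid-generation system is 3-dimensional with geometry-dependent inhomogeneous terms.

    ```python
    # ADI (Peaceman-Rachford) pseudo-time iteration for u_xx + u_yy = f on the
    # unit square with zero boundaries; f is chosen so u = sin(pi x) sin(pi y).
    import numpy as np
    from scipy.linalg import solve_banded

    n = 33; h = 1.0 / (n - 1); dt = 4e-4
    g = np.linspace(0.0, 1.0, n)
    X, Y = np.meshgrid(g, g, indexing="ij")
    f = -2 * np.pi**2 * np.sin(np.pi * X) * np.sin(np.pi * Y)
    r = dt / (2 * h * h)

    def sweep(rhs):
        """Solve (1+2r)u_i - r(u_{i-1}+u_{i+1}) = rhs_i along axis 0, zero BCs."""
        m = rhs.shape[0]
        ab = np.zeros((3, m))
        ab[0, 1:], ab[1, :], ab[2, :-1] = -r, 1 + 2 * r, -r
        return solve_banded((1, 1), ab, rhs)   # all columns solved at once

    def dxx(v): return (v[:-2, 1:-1] - 2 * v[1:-1, 1:-1] + v[2:, 1:-1]) / h**2
    def dyy(v): return (v[1:-1, :-2] - 2 * v[1:-1, 1:-1] + v[1:-1, 2:]) / h**2

    u = np.zeros((n, n))
    for _ in range(3000):
        u_half = np.zeros_like(u)        # half step: implicit in x, explicit in y
        u_half[1:-1, 1:-1] = sweep(u[1:-1, 1:-1] + 0.5*dt*(dyy(u) - f[1:-1, 1:-1]))
        u_new = np.zeros_like(u)         # half step: implicit in y
        u_new[1:-1, 1:-1] = sweep((u_half[1:-1, 1:-1] +
                                   0.5*dt*(dxx(u_half) - f[1:-1, 1:-1])).T).T
        u = u_new

    err = np.abs(u - np.sin(np.pi * X) * np.sin(np.pi * Y)).max()
    print(f"max error vs analytic solution: {err:.2e}")   # O(h^2)
    ```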

  15. Evaluation of Temperature and Stress Distribution on 2 Different Post Systems Using 3-Dimensional Finite Element Analysis

    PubMed Central

    Değer, Yalçın; Adigüzel, Özkan; Özer, Senem Yiğit; Kaya, Sadullah; Polat, Zelal Seyfioğlu; Bozyel, Bejna

    2015-01-01

    Background The mouth is exposed to thermal irritation from hot and cold food and drinks. Thermal changes in the oral cavity produce expansions and contractions in tooth structures and restorative materials. The aim of this study was to investigate the effect of temperature and stress distribution on 2 different post systems using the 3-dimensional (3D) finite element method. Material/Methods The 3D finite element model shows a labio-lingual cross-sectional view of the endodontically treated upper right central incisor and the supporting periodontal ligament with bone structures. Stainless steel and glass fiber post systems with different physical and thermal properties were modelled in the tooth restored with a composite core and ceramic crown. A 100-N static vertical occlusal load was applied to the center of the incisal surface of the tooth. Thermal loads of 0°C and 65°C were applied on the model for 5 s. Temperature and thermal stresses were determined on the labio-lingual section of the model at 6 different points. Results The distribution of stress, including thermal stress values, was calculated using 3D finite element analysis. The stainless steel post system produced higher temperatures and thermal stresses on the restorative materials, tooth structures, and posts than did the glass fiber-reinforced composite posts. Conclusions Thermal changes generated stresses in the restorative materials, tooth, and supporting structures. PMID:26615495

  16. Contrast enhanced micro-computed tomography resolves the 3-dimensional morphology of the cardiac conduction system in mammalian hearts.

    PubMed

    Stephenson, Robert S; Boyett, Mark R; Hart, George; Nikolaidou, Theodora; Cai, Xue; Corno, Antonio F; Alphonso, Nelson; Jeffery, Nathan; Jarvis, Jonathan C

    2012-01-01

    The general anatomy of the cardiac conduction system (CCS) has been known for 100 years, but its complex and irregular three-dimensional (3D) geometry is not so well understood, largely because the conducting tissue cannot be distinguished from the surrounding tissue by dissection. The best descriptions of its anatomy come from studies based on serial sectioning of samples taken from the appropriate areas of the heart. Low X-ray attenuation formerly ruled out micro-computed tomography (micro-CT) as a modality to resolve internal structures of soft tissue, but incorporation of iodine, which has a high atomic number, into those tissues enhances the differential attenuation of X-rays and allows visualisation of fine detail in embryos and skeletal muscle. Here, with the use of an iodine-based contrast agent (I2KI), we present contrast-enhanced micro-CT images of cardiac tissue from rat and rabbit in which the three major subdivisions of the CCS can be differentiated from the surrounding contractile myocardium and visualised in 3D. Structures identified include the sinoatrial node (SAN) and the atrioventricular conduction axis: the penetrating bundle, His bundle, the bundle branches, and the Purkinje network. Although the current findings are consistent with existing anatomical representations, the representations shown here offer superior resolution and are the first 3D representations of the CCS within a single intact mammalian heart.

  17. Cost-Effectiveness Analysis of Intensity Modulated Radiation Therapy Versus 3-Dimensional Conformal Radiation Therapy for Anal Cancer

    SciTech Connect

    Hodges, Joseph C.; Beg, Muhammad S.; Das, Prajnan; Meyer, Jeffrey

    2014-07-15

    Purpose: To compare the cost-effectiveness of intensity modulated radiation therapy (IMRT) and 3-dimensional conformal radiation therapy (3D-CRT) for anal cancer and determine disease, patient, and treatment parameters that influence the result. Methods and Materials: A Markov decision model was designed with the various disease states for the base case of a 65-year-old patient with anal cancer treated with either IMRT or 3D-CRT and concurrent chemotherapy. Health states accounting for rates of local failure, colostomy failure, treatment breaks, patient prognosis, acute and late toxicities, and the utility of toxicities were informed by existing literature and analyzed with deterministic and probabilistic sensitivity analysis. Results: In the base case, mean costs and quality-adjusted life expectancy in years (QALY) for IMRT and 3D-CRT were $32,291 (4.81) and $28,444 (4.78), respectively, resulting in an incremental cost-effectiveness ratio of $128,233/QALY for IMRT compared with 3D-CRT. Probabilistic sensitivity analysis found that IMRT was cost-effective in 22%, 47%, and 65% of iterations at willingness-to-pay thresholds of $50,000, $100,000, and $150,000 per QALY, respectively. Conclusions: In our base model, IMRT was a cost-ineffective strategy despite the reduced acute treatment toxicities and their associated costs of management. The model outcome was sensitive to variations in local and colostomy failure rates, as well as patient-reported utilities relating to acute toxicities.
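
    The base-case arithmetic is reproducible directly from the abstract; the toy cohort function below shows the Markov mechanics only, with an invented transition probability, utility, and discounting scheme that should not be read as the authors' model.

    ```python
    # ICER arithmetic from the reported base case, plus a toy Markov cohort
    # trace to illustrate how such cost/QALY totals are accumulated.
    def cohort(cost_per_cycle, p_stay=0.97, cycles=240, disc=0.03 / 12):
        """Discounted cost and QALY totals for a one-state-until-death toy model."""
        alive, cost, qaly = 1.0, 0.0, 0.0
        for t in range(cycles):                  # monthly cycles, 20 years
            d = 1.0 / (1.0 + disc) ** t          # discount factor
            cost += alive * cost_per_cycle * d
            qaly += alive * (0.85 / 12.0) * d    # assumed utility 0.85 per year
            alive *= p_stay
        return cost, qaly

    c_toy, q_toy = cohort(cost_per_cycle=120.0)  # invented monthly cost
    print(f"toy arm: ${c_toy:,.0f}, {q_toy:.2f} QALYs")

    # Base-case totals reported in the abstract:
    c_imrt, q_imrt = 32291.0, 4.81
    c_3dcrt, q_3dcrt = 28444.0, 4.78
    icer = (c_imrt - c_3dcrt) / (q_imrt - q_3dcrt)
    print(f"ICER = ${icer:,.0f}/QALY")           # matches the reported ~$128,233/QALY
    ```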

  18. Analysis of shape and motion of the mitral annulus in subjects with and without cardiomyopathy by echocardiographic 3-dimensional reconstruction

    NASA Technical Reports Server (NTRS)

    Flachskampf, F. A.; Chandra, S.; Gaddipatti, A.; Levine, R. A.; Weyman, A. E.; Ameling, W.; Hanrath, P.; Thomas, J. D.

    2000-01-01

    The shape and dynamics of the mitral annulus of 10 patients without heart disease (controls), 3 patients with dilated cardiomyopathy, and 5 patients with hypertrophic obstructive cardiomyopathy and normal systolic function were analyzed by transesophageal echocardiography and 3-dimensional reconstruction. Mitral annular orifice area, apico-basal motion of the annulus, and nonplanarity were calculated over time. Annular area was largest in end diastole and smallest in end systole. Mean areas were 11.8 +/- 2.5 cm(2) (controls), 15.2 +/- 4.2 cm(2) (dilated cardiomyopathy), and 10.2 +/- 2.4 cm(2) (hypertrophic cardiomyopathy) (P = not significant). After correction for body surface area, annuli from patients with normal left ventricular function were smaller than annuli from patients with dilated cardiomyopathy (5.9 +/- 1.2 cm(2)/m(2) vs 7.7 +/- 1.0 cm(2)/m(2); P <.02). The change in area during the cardiac cycle showed significant differences: 23.8% +/- 5.1% (controls), 13.2% +/- 2.3% (dilated cardiomyopathy), and 32.4% +/- 7.6% (hypertrophic cardiomyopathy) (P <.001). Apico-basal motion was highest in controls, followed by those with hypertrophic obstructive and dilated cardiomyopathy (1.0 +/- 0.3 cm, 0.8 +/- 0.2 cm, 0.3 +/- 0.2 cm, respectively; P <.01). Visual inspection and Fourier analysis showed a consistent pattern of anteroseptal and posterolateral elevations of the annulus toward the left atrium. In conclusion, although area changes and apico-basal motion of the mitral annulus strongly depend on left ventricular systolic function, nonplanarity is a structural feature preserved throughout the cardiac cycle in all three groups.

  19. Optic Strut and Para-clinoid Region – Assessment by Multi-detector Computed Tomography with Multiplanar and 3 Dimensional Reconstructions

    PubMed Central

    Ravikiran, S.R.; Kumar, Ashvini; Chavadi, Channabasappa; Pulastya, Sanyal

    2015-01-01

    Purpose To evaluate the thickness, location, and orientation of the optic strut and anterior clinoid process and variations in the paraclinoid region, based solely on multidetector computed tomography (MDCT) images with multiplanar (MPR) and 3-dimensional (3D) reconstructions, in an Indian population. Materials and Methods Ninety-five CT scans of the head and paranasal sinuses were retrospectively evaluated with MPR and 3D reconstructions to assess optic strut thickness, angle, and location, and variations such as pneumatisation, carotico-clinoid foramen, and inter-clinoid osseous ridge. Results Mean optic strut thickness was 3.64 mm (±0.64), and the mean optic strut angle was 42.67 (±6.16) degrees. Mean width and length of the anterior clinoid process were 10.65 mm (±0.79) and 11.20 mm (±0.95), respectively. Optic strut attachment to the sphenoid body was predominantly sulcal, as in 52 cases (54.74%), and the strut was most frequently attached to the anterior 2/5th of the anterior clinoid process, seen in 93 sides (48.95%). Pneumatisation of the optic strut occurred in 23 sides. Carotico-clinoid foramen was observed in 42 cases (22.11%): complete foramen in 10 cases (5.26%), incomplete foramen in 24 cases (12.63%), and contact type in 8 cases (4.21%). An inter-clinoid osseous bridge was seen unilaterally in 4 cases. Conclusion The study assesses morphometric features and anatomical variations of the paraclinoid region using MDCT 3D and multiplanar reconstructions in an Indian population. PMID:26557589

  20. Repeatability of choku-tsuki and oi-tsuki in shotokan karate: a 3-dimensional analysis with thirteen black-belt karateka.

    PubMed

    Sforza, C; Turci, M; Grassi, G P; Fragnito, N; Serrao, G; Ferrario, V F

    2001-06-01

    Thirteen black-belt karateka performed two different standardized counter-offensive techniques. The trajectories of selected body landmarks were studied using a computerized image analyzer that allows 3-dimensional reconstruction of standardized movements. The repeatability of both karate techniques was quantified for each participant. The analysis confirmed that the more experienced karateka obtained the best repeatability, as already demonstrated in a preliminary study conducted with a smaller sample of less experienced participants.

  1. Influence of the implant diameter with different sizes of hexagon: analysis by 3-dimensional finite element method.

    PubMed

    Pellizzer, Eduardo Piza; Verri, Fellippo Ramos; de Moraes, Sandra Lúcia Dantas; Falcón-Antenucci, Rosse Mary; de Carvalho, Paulo Sérgio Perri; Noritomi, Pedro Yoshito

    2013-08-01

    The aim of this study was to evaluate the stress distribution in implants of regular platforms and of wide diameter with different sizes of hexagon by the 3-dimensional finite element method. Simulated 3-dimensional models were created with the aid of Solidworks 2006 and Rhinoceros 4.0 software for the design of the implant and abutment, and with the InVesalius software for the design of the bone. Each model represented a block of bone from the mandibular molar region with an implant 10 mm in length and of a different diameter. Model A was a 3.75-mm implant with a regular hexagon, model B a 5.00-mm implant with a regular hexagon, and model C a 5.00-mm implant with an expanded hexagon. A load of 200 N was applied in the axial, lateral, and oblique directions. At the implant, for each applied load (axial, lateral, and oblique), the 3 models presented stress concentration at the threads in the cervical and middle regions, and the stress was higher for model A. At the abutment, models A and B showed a similar stress distribution, concentrated at the cervical and middle thirds; model C showed the highest stresses. On the cortical bone, the stress was concentrated at the cervical region for the 3 models and was higher for model A. In the trabecular bone, the stresses were less intense, concentrated around the implant body, and more intense for model A. Between the wide-diameter models (models B and C), model B (implant 5.00 mm/regular hexagon) was more favorable with regard to the distribution of stresses. Model A (implant 3.75 mm/regular hexagon) showed the largest areas and the most intense stress, and model B (implant 5.00 mm/regular hexagon) showed a more favorable stress distribution. The highest stresses were observed under lateral loading.

  2. Bimaxillary 'rotation advancement' procedures in patients with obstructive sleep apnea: a 3-dimensional airway analysis of morphological changes.

    PubMed

    Zinser, M J; Zachow, S; Sailer, H F

    2013-05-01

    The aim of this retrospective three-dimensional (3D) computed tomographic analysis was to investigate the morphological airway changes in 17 obstructive sleep apnea (OSA) patients following bimaxillary rotation advancement procedures. Morphological changes of the nasal cavity and the naso-, oro-, and hypopharynx were analysed separately, as were the total airway changes, using nine parameters of airway size and four of shape. The Wilcoxon test was used to compare airway changes, and the intraclass correlation coefficient to qualify inter-observer reliability. Following bimaxillary advancement with anti-clockwise maxillary rotation, the total airway volume and the lateral dimension of the cross-sectional airway increased significantly. The total length of the airway became shorter (p < 0.05). Remarkable changes were seen in the oropharynx: the length, volume, cross-sectional area (CSA), and antero-posterior and medio-lateral distances changed (p < 0.05). This was accompanied by a significant 3D change in the shape of the airway from round to elliptical. The average cross-sectional oropharyngeal area nearly doubled, the minimal CSA increased by 40%, and the hyoid bone was located more anteriorly and superiorly. Inter-examiner reliability was high (0.89). 3D airway analysis aids the understanding of postoperative pathophysiological changes in OSA patients. The airway became shorter, more voluminous, medio-laterally wider, and more compact and elliptical.

  3. A 3 dimensional assessment of the depth of tumor invasion in microinvasive tongue squamous cell carcinoma - A case series analysis

    PubMed Central

    Amit-Byatnal, Aditi; Natarajan, Jayalakshmi; Shenoy, Satish; Kamath, Asha; Hunter, Keith

    2015-01-01

    Background Accurate assessment of the depth of tumor invasion (DI) in microinvasive squamous cell carcinoma (MISCC) of the tongue is critical to prognosis. An arithmetic model was generated to determine a reliable method of measuring DI and to correlate it with local recurrence. Material and Methods Tumor thickness (TT) and DI were measured in tissue sections of 14 cases of MISCC of the tongue, by manual ocular micrometer and by digital image analysis, at four reference points (A, B, C, and D). The comparison of TT and DI with relevant clinicopathologic parameters was assessed using the Mann-Whitney U test. The reliability of these methods and the values obtained were compared and correlated with tumor recurrence by the Wilcoxon signed ranks test. 3D reconstruction of the lesion was done on a Cartesian coordinate system: the X face lay on the YZ plane and the Z face on the XY plane. Results The computer-generated 3D model of the oral mucosa in the four cases that recurred showed greater DI in the Z coordinate than in the XY coordinate. The median DI measurements between the XY and Z coordinates in these cases showed no significant difference (Wilcoxon signed ranks test, p = 0.068). Conclusions The assessment of DI in 3 dimensions is critical for accurate assessment of MISCC, and precise DI allows complete removal of the tumor. Key words: Depth of invasion, tumor thickness, microinvasive squamous cell carcinoma, tongue squamous cell carcinoma. PMID:26449426

  4. Control Point Analysis comparison for 3 different treatment planning and delivery complexity levels using a commercial 3-dimensional diode array

    SciTech Connect

    Abdellatif, Ady; Gaede, Stewart

    2014-07-01

    To investigate the use of “Control Point Analysis” (Sun Nuclear Corporation, Melbourne, FL) to analyze and compare delivered volumetric-modulated arc therapy (VMAT) plans for 3 different treatment planning complexity levels. A total of 30 patients were chosen and fully anonymized for the purpose of this study. Overall, 10 lung stereotactic body radiotherapy (SBRT), 10 head-and-neck (H and N), and 10 prostate VMAT plans were generated on Pinnacle³ and delivered on a Varian linear accelerator (LINAC). The delivered dose was measured using ArcCHECK (Sun Nuclear Corporation, Melbourne, FL). Each plan was analyzed using “Sun Nuclear Corporation (SNC) Patient 6” and “Control Point Analysis.” Gamma passing percentage was used to assess the differences between the measured and planned dose distributions and to assess the role of various control point binning combinations. Of the different sites considered, the prostate cases reported the highest gamma passing percentages calculated with “SNC Patient 6” (97.5% to 99.2% for the 3%, 3 mm criteria) and “Control Point Analysis” (95.4% to 98.3% for the 3%, 3 mm criteria). The mean percentage of passing control point sectors for the prostate cases increased from 51.8 ± 7.8% for individual control points to 70.6 ± 10.5% for 5 control points binned together to 87.8 ± 11.0% for 10 control points binned together (2%, 2 mm passing criteria). Overall, there was an increasing trend in the percentage of sectors passing gamma analysis with an increase in the number of control points binned together in a sector for both gamma passing criteria (2%, 2 mm and 3%, 3 mm). Although many plans passed the clinical quality assurance criteria, plans involving the delivery of high Monitor Unit (MU)/control point (SBRT) and plans involving a high degree of modulation (H and N) showed less delivery accuracy per control point compared with plans with low MU/control point and a low degree of modulation (prostate).
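
    For reference, the gamma analysis used throughout this record combines a dose-difference criterion with a distance-to-agreement criterion (Low et al.). A simplified 1-D, global-gamma sketch of the idea (our own illustration, not the SNC implementation):

```python
import numpy as np

def gamma_1d(ref_dose, eval_dose, positions, dose_tol=0.03, dist_tol=3.0):
    """Global 1-D gamma index: for each reference point, the minimum
    combined dose-difference / distance-to-agreement metric."""
    dmax = ref_dose.max()
    gammas = np.empty_like(ref_dose, dtype=float)
    for i, (x_r, d_r) in enumerate(zip(positions, ref_dose)):
        dose_term = (eval_dose - d_r) / (dose_tol * dmax)  # normalized dose difference
        dist_term = (positions - x_r) / dist_tol           # normalized distance (mm)
        gammas[i] = np.sqrt(dose_term**2 + dist_term**2).min()
    return gammas

# A 3%/3 mm passing rate is the fraction of points with gamma <= 1:
# passing = 100.0 * (gamma_1d(planned, measured, x_mm) <= 1.0).mean()
```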

  5. Effects of different abutment connection designs on the stress distribution around five different implants: a 3-dimensional finite element analysis.

    PubMed

    Balik, Ali; Karatas, Meltem Ozdemir; Keskin, Haluk

    2012-09-01

    The stability of the bone-implant interface is required for a favorable long-term clinical outcome of implant-supported prosthetic rehabilitation. Implant failures that occur after functional loading are mainly related to biomechanical factors. Micromovements and vibrations due to occlusal forces can lead to mechanical complications such as loosening of the screw and fractures of the abutment or implants. The aim of this study was to investigate the strain distributions in the connection areas of different implant-abutment connection systems under similar loading conditions. Five implant-abutment connection designs from 5 different manufacturers were evaluated. The investigation was performed with software using the finite element method. The geometrical modeling of the implant systems was done with the CATIA virtual design software, and the MSC NASTRAN solver and PATRAN postprocessing program were used to perform the linear static solution. According to the analysis, the implant-abutment system with an external hexagonal connection showed the highest strain values, and the system with an internal hexagonal connection showed the lowest. The conical plus internal hexagonal, screw-in implant-abutment connection interface was more successful than the other systems in cases with increased vertical dimension, particularly in the posterior region.

  6. Pre- and postoperative evaluation of partial anomalous pulmonary venous return: by 3-dimensional cardiovascular magnetic resonance imaging and cardiovascular computed tomography.

    PubMed

    Crestanello, Juan A; Daniels, Curt; Franco, Veronica; Raman, Subha V

    2010-01-01

    The pre- and postoperative evaluation of anomalous pulmonary venous return usually requires multiple invasive and noninvasive tests to obtain complete anatomic and functional data. In contrast, either cardiovascular magnetic resonance imaging or cardiovascular computed tomography can sufficiently reveal this information in adult patients in a single setting. Herein, we present the cases of 2 patients with partial anomalous pulmonary venous return who underwent preoperative and postoperative evaluation by either method alone, and we discuss the benefits and limitations of each technique.

  7. Computer analysis of arteriograms

    NASA Technical Reports Server (NTRS)

    Selzer, R. H.; Armstrong, J. H.; Beckenbach, E. B.; Blankenhorn, D. H.; Crawford, D. W.; Brooks, S. H.; Sanmarco, M. E.

    1977-01-01

    A computer system has been developed to quantify the degree of atherosclerosis in the human femoral artery. The analysis involves first scanning and digitizing angiographic film, then tracking the outline of the arterial image, and finally computing the relative amount of roughness or irregularity in the vessel wall. The image processing system and method are described.
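
    The record does not spell out the roughness metric; one plausible reading, for illustration only, is the deviation of the tracked edge from a smoothed version of itself:

```python
import numpy as np

def edge_roughness(edge_mm: np.ndarray, window: int = 15) -> float:
    """RMS deviation of a tracked vessel-edge profile from its moving
    average; larger values suggest a more irregular wall."""
    kernel = np.ones(window) / window
    smooth = np.convolve(edge_mm, kernel, mode="same")
    return float(np.sqrt(np.mean((edge_mm - smooth) ** 2)))
```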

  8. 3-dimensional magnetotelluric inversion including topography using deformed hexahedral edge finite elements and direct solvers parallelized on symmetric multiprocessor computers - Part II: direct data-space inverse solution

    NASA Astrophysics Data System (ADS)

    Kordy, M.; Wannamaker, P.; Maris, V.; Cherkaev, E.; Hill, G.

    2016-01-01

    Following the creation described in Part I of a deformable edge finite-element simulator for 3-D magnetotelluric (MT) responses using direct solvers, in Part II we develop an algorithm named HexMT for 3-D regularized inversion of MT data including topography. Direct solvers parallelized on large-RAM, symmetric multiprocessor (SMP) workstations are used also for the Gauss-Newton model update. By exploiting the data-space approach, the computational cost of the model update becomes much less in both time and computer memory than the cost of the forward simulation. In order to regularize using the second norm of the gradient, we factor the matrix related to the regularization term and apply its inverse to the Jacobian, which is done using the MKL PARDISO library. For dense matrix multiplication and factorization related to the model update, we use the PLASMA library which shows very good scalability across processor cores. A synthetic test inversion using a simple hill model shows that including topography can be important; in this case depression of the electric field by the hill can cause false conductors at depth or mask the presence of resistive structure. With a simple model of two buried bricks, a uniform spatial weighting for the norm of model smoothing recovered more accurate locations for the tomographic images compared to weightings which were a function of parameter Jacobians. We implement joint inversion for static distortion matrices tested using the Dublin secret model 2, for which we are able to reduce nRMS to ~1.1 while avoiding oscillatory convergence. Finally we test the code on field data by inverting full impedance and tipper MT responses collected around Mount St Helens in the Cascade volcanic chain. Among several prominent structures, the north-south trending, eruption-controlling shear zone is clearly imaged in the inversion.
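
    The data-space economy mentioned in this abstract is standard in MT inversion: the Gauss-Newton normal equations are solved in a space whose dimension is the number of data N_d rather than the number of model cells N_m. A sketch of the usual identity (notation ours; not necessarily HexMT's exact update):

```latex
% Model-space update: an N_m x N_m system.
%   m_{k+1} - m_0 = (\lambda C_m^{-1} + J^T C_d^{-1} J)^{-1} J^T C_d^{-1} \hat{d}
% Equivalent data-space form: an N_d x N_d system, with N_d << N_m.
\begin{equation}
  m_{k+1} - m_0 = C_m J^T \left( \lambda C_d + J C_m J^T \right)^{-1} \hat{d},
  \qquad \hat{d} = d - F(m_k) + J\,(m_k - m_0)
\end{equation}
```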

  9. General design method for 3-dimensional, potential flow fields. Part 2: Computer program DIN3D1 for simple, unbranched ducts

    NASA Technical Reports Server (NTRS)

    Stanitz, J. D.

    1985-01-01

    The general design method for three-dimensional, potential, incompressible or subsonic-compressible flow developed in part 1 of this report is applied to the design of simple, unbranched ducts. A computer program, DIN3D1, is developed and five numerical examples are presented: a nozzle, two elbows, an S-duct, and the preliminary design of a side inlet for turbomachines. The two major inputs to the program are the upstream boundary shape and the lateral velocity distribution on the duct wall. As a result of these inputs, boundary conditions are overprescribed and the problem is ill posed. However, it appears that there are degrees of compatibility between these two major inputs and that, for reasonably compatible inputs, satisfactory solutions can be obtained. By not prescribing the shape of the upstream boundary, the problem presumably becomes well posed, but it is not clear how to formulate a practical design method under this circumstance. Nor does it appear desirable, because the designer usually needs to retain control over the upstream (or downstream) boundary shape. The problem is further complicated by the fact that, unlike the two-dimensional case, and irrespective of the upstream boundary shape, some prescribed lateral velocity distributions do not have proper solutions.

  10. Frontal soft tissue analysis using a 3 dimensional camera following two-jaw rotational orthognathic surgery in skeletal class III patients.

    PubMed

    Choi, Jong Woo; Lee, Jang Yeol; Oh, Tae-Suk; Kwon, Soon Man; Yang, Sung Joon; Koh, Kyung Suk

    2014-04-01

    Although two-dimensional cephalometry is the standard method for analyzing the results of orthognathic surgery, it has potential limits in frontal soft tissue analysis. We have utilized a 3-dimensional camera to examine changes in soft tissue landmarks in patients with skeletal class III dentofacial deformity who underwent two-jaw rotational setback surgery. We assessed 25 consecutive Asian patients (mean age, 22 years; range, 17-32 years) with skeletal class III dentofacial deformities who underwent two-jaw rotational surgery without maxillary advancement. Using a 3D camera, we analyzed changes in facial proportions, including vertical and horizontal dimensions, facial surface areas, nose profile, lip contour, and soft tissue cheek convexity, as well as landmarks related to facial symmetry. The average mandibular setback was 10.7 mm (range, 5-17 mm). The average SNA changed from 77.4° to 77.8°, the average SNB from 89.2° to 81.1°, and the average occlusal plane from 8.7° to 11.4°. The middle third vertical dimension changed from 58.8 mm to 57.8 mm (p = 0.059), and the lower third vertical dimension from 70.4 mm to 68.2 mm (p = 0.0006). The average bigonial width decreased from 113.5 mm to 109.2 mm (p = 0.0028), the alar width increased from 34.7 mm to 36.1 mm (p = 0.0002), and lip length was unchanged. Mean mid and lower facial surface areas decreased significantly, from 171.8 cm² to 166.2 cm² (p = 0.026) and from 71.23 cm² to 61.9 cm² (p < 0.0001), respectively. Cheek convexity increased significantly, with the cheek angle decreasing from 171.8° to 155.9° (p = 0.0007). The 3D camera was effective for frontal soft tissue analysis in orthognathic surgery, and it enabled quantitative analysis of changes in frontal soft tissue landmarks and facial proportions that were not possible with conventional 2D cephalometric analysis.

  11. Comparative Analysis of Visitors' Experiences and Knowledge Acquisition between a 3-Dimensional Online and a Real-World Art Museum Tour

    ERIC Educational Resources Information Center

    D'Alba, Adriana; Jones, Greg; Wright, Robert

    2015-01-01

    This paper discusses a study conducted in the fall of 2011 and the spring of 2012 which explored the use of existing 3D virtual environment technologies by bringing a selected permanent museum exhibit, displayed at a museum located in central Mexico, into an online 3-dimensional experience. Using mixed methods, the research study analyzed knowledge…

  12. Evaluation of the middle cerebral artery occlusion techniques in the rat by in-vitro 3-dimensional micro- and nano computed tomography

    PubMed Central

    2010-01-01

    Background Animal models of focal cerebral ischemia are widely used in stroke research. The purpose of our study was to evaluate and compare the cerebral macro- and microvascular architecture of rats in two different models of permanent middle cerebral artery occlusion, using an innovative quantitative micro- and nano-CT imaging technique. Methods Four hours of middle cerebral artery occlusion was performed in rats using the macrosphere method or the suture technique. After contrast perfusion, brains were isolated and scanned en bloc using micro-CT at (8 μm)³ or nano-CT at (500 nm)³ voxel size to generate 3D images of the cerebral vasculature. The arterial vascular volume fraction and gray-scale attenuation were determined, and the significance of differences in measurements was tested with analysis of variance (ANOVA). Results Micro-CT provided quantitative information on vascular morphology. Micro- and nano-CT proved able to visualize and differentiate the vascular occlusion territories produced in both models of cerebral ischemia. The suture technique leads to a remarkable decrease in the intravascular volume fraction of the middle cerebral artery perfusion territory. When the middle cerebral artery was blocked with macrospheres, the vascular volume fraction of the involved hemisphere decreased significantly (p < 0.001), independently of the number of macrospheres, and was comparable to the suture method. We established gray-scale measurements by which focal cerebral ischemia could be radiographically categorized (p < 0.001). Nano-CT imaging demonstrates collateral perfusion related to the different occluded vessel territories after macrosphere perfusion. Conclusion Micro- and nano-CT imaging is feasible for the analysis and differentiation of different models of focal cerebral ischemia in rats. PMID:20509884
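
    The arterial vascular volume fraction compared across the two occlusion models is, in essence, the share of contrast-filled voxels inside a hemisphere mask. A minimal sketch (a simple threshold segmentation is assumed here, not the authors' exact pipeline):

```python
import numpy as np

def vascular_volume_fraction(ct_volume: np.ndarray,
                             hemisphere_mask: np.ndarray,
                             vessel_threshold: float) -> float:
    """Fraction of masked voxels whose attenuation marks contrast-filled
    vessels; compared between hemispheres or occlusion models."""
    region = ct_volume[hemisphere_mask]
    return float((region > vessel_threshold).mean())
```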

  13. Incorporating 3-dimensional models in online articles

    PubMed Central

    Cevidanes, Lucia H. S.; Ruellas, Antonio C. O.; Jomier, Julien; Nguyen, Tung; Pieper, Steve; Budin, Francois; Styner, Martin; Paniagua, Beatriz

    2015-01-01

    Introduction The aims of this article were to introduce the capability to view and interact with 3-dimensional (3D) surface models in online publications, and to describe how to prepare surface models for such online 3D visualizations. Methods Three-dimensional image analysis methods include image acquisition, construction of surface models, registration in a common coordinate system, visualization of overlays, and quantification of changes. Cone-beam computed tomography scans were acquired as volumetric images that can be visualized as 3D projected images or used to construct polygonal meshes or surfaces of specific anatomic structures of interest. The anatomic structures of interest in the scans can be labeled with color (3D volumetric label maps), and then the scans are registered in a common coordinate system using a target region as the reference. The registered 3D volumetric label maps can be saved in .obj, .ply, .stl, or .vtk file formats and used for overlays, quantification of differences in each of the 3 planes of space, or color-coded graphic displays of 3D surface distances. Results All registered 3D surface models in this study were saved in .vtk file format and loaded in the Elsevier 3D viewer. In this study, we describe possible ways to visualize the surface models constructed from cone-beam computed tomography images using 2D and 3D figures. The 3D surface models are available in the article’s online version for viewing and downloading using the reader’s software of choice. These 3D graphic displays are represented in the print version as 2D snapshots. Overlays and color-coded distance maps can be displayed using the reader’s software of choice, allowing graphic assessment of the location and direction of changes or morphologic differences relative to the structure of reference. The interpretation of 3D overlays and quantitative color-coded maps requires basic knowledge of 3D image analysis. Conclusions When submitting manuscripts, authors can

  14. Reconstruction 3-dimensional image from 2-dimensional image of status optical coherence tomography (OCT) for analysis of changes in retinal thickness

    SciTech Connect

    Arinilhaq,; Widita, Rena

    2014-09-30

    Optical coherence tomography (OCT) is often used in medical image acquisition for diagnosis because it is easy to use and low in price. Unfortunately, this type of examination produces only a two-dimensional retinal image at the point of acquisition. Therefore, this study developed a method that combines and reconstructs 2-dimensional retinal images into a three-dimensional image to display the macular volume accurately. The system is built with three main stages: data acquisition, data extraction, and 3-dimensional reconstruction. At the data acquisition step, optical coherence tomography produced six *.jpg images for each patient, which were further extracted with the MATLAB 2010a software into six one-dimensional arrays. The six arrays were combined into a 3-dimensional matrix using a kriging interpolation method with SURFER 9, resulting in 3-dimensional graphics of the macula. Finally, the system provides three-dimensional color graphs based on the data distribution of the normal macula. The reconstruction system which has been designed produces three-dimensional images with a size of 481 × 481 × h (retinal thickness) pixels.
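
    The reconstruction step interpolates sparse thickness samples from the six scan lines onto a dense 481 × 481 grid. The study used kriging in SURFER 9; the same idea can be sketched with SciPy's griddata as a stand-in (cubic interpolation, not kriging proper; array names hypothetical):

```python
import numpy as np
from scipy.interpolate import griddata

def reconstruct_macula(x, y, thickness, grid_size=481):
    """Interpolate scattered (x, y, thickness) samples from the OCT scan
    lines onto a grid_size x grid_size height map of the macula."""
    xi = np.linspace(x.min(), x.max(), grid_size)
    yi = np.linspace(y.min(), y.max(), grid_size)
    gx, gy = np.meshgrid(xi, yi)
    # Kriging would instead weight neighbors by a fitted variogram model.
    return griddata((x, y), thickness, (gx, gy), method="cubic")
```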

  15. In-Field, In Situ, and In Vivo 3-Dimensional Elemental Mapping for Plant Tissue and Soil Analysis Using Laser-Induced Breakdown Spectroscopy

    PubMed Central

    Zhao, Chunjiang; Dong, Daming; Du, Xiaofan; Zheng, Wengang

    2016-01-01

    Sensing and mapping element distributions in plant tissues and its growth environment has great significance for understanding the uptake, transport, and accumulation of nutrients and harmful elements in plants, as well as for understanding interactions between plants and the environment. In this study, we developed a 3-dimensional elemental mapping system based on laser-induced breakdown spectroscopy that can be deployed in-field to directly measure the distribution of multiple elements in living plants as well as in the soil. Mapping is performed by a fast scanning laser, which ablates a micro volume of a sample to form a plasma. The presence and concentration of specific elements are calculated using the atomic, ionic, and molecular spectral characteristics of the plasma emission spectra. Furthermore, we mapped the pesticide residues in maize leaves after spraying to demonstrate the capacity of this method for trace elemental mapping. We also used the system to quantitatively detect the element concentrations in soil, which can be used to further understand the element transport between plants and soil. We demonstrate that this method has great potential for elemental mapping in plant tissues and soil with the advantages of 3-dimensional and multi-elemental mapping, in situ and in vivo measurement, flexible use, and low cost. PMID:27782074

  16. Scientific visualization of 3-dimensional optimized stellarator configurations

    SciTech Connect

    Spong, D.A.

    1998-01-01

    The design techniques and physics analysis of modern stellarator configurations for magnetic fusion research rely heavily on high performance computing and simulation. Stellarators, which are fundamentally 3-dimensional in nature, offer significantly more design flexibility than more symmetric devices such as the tokamak. By varying the outer boundary shape of the plasma, a variety of physics features, such as transport, stability, and heating efficiency can be optimized. Scientific visualization techniques are an important adjunct to this effort as they provide a necessary ergonomic link between the numerical results and the intuition of the human researcher. The authors have developed a variety of visualization techniques for stellarators which both facilitate the design optimization process and allow the physics simulations to be more readily understood.

  17. Quantitative analysis of aortic regurgitation: real-time 3-dimensional and 2-dimensional color Doppler echocardiographic method--a clinical and a chronic animal study

    NASA Technical Reports Server (NTRS)

    Shiota, Takahiro; Jones, Michael; Tsujino, Hiroyuki; Qin, Jian Xin; Zetts, Arthur D.; Greenberg, Neil L.; Cardon, Lisa A.; Panza, Julio A.; Thomas, James D.

    2002-01-01

    BACKGROUND: For evaluating patients with aortic regurgitation (AR), regurgitant volumes, left ventricular (LV) stroke volumes (SV), and absolute LV volumes are valuable indices. AIM: The aim of this study was to validate the combination of real-time 3-dimensional echocardiography (3DE) and semiautomated digital color Doppler cardiac flow measurement (ACM) for quantifying absolute LV volumes, LVSV, and AR volumes using an animal model of chronic AR and to investigate its clinical applicability. METHODS: In 8 sheep, a total of 26 hemodynamic states were obtained pharmacologically 20 weeks after the aortic valve noncoronary (n = 4) or right coronary (n = 4) leaflet was incised to produce AR. Reference standard LVSV and AR volume were determined using the electromagnetic flow method (EM). Simultaneous epicardial real-time 3DE studies were performed to obtain LV end-diastolic volumes (LVEDV), end-systolic volumes (LVESV), and LVSV by subtracting LVESV from LVEDV. Simultaneous ACM was performed to obtain LVSV and transmitral flows; AR volume was calculated by subtracting transmitral flow volume from LVSV. In a total of 19 patients with AR, real-time 3DE and ACM were used to obtain LVSVs and these were compared with each other. RESULTS: A strong relationship was found between LVSV derived from EM and those from the real-time 3DE (r = 0.93, P < .001, mean difference (3D - EM) = -1.0 ± 9.8 mL). A good relationship between LVSV and AR volumes derived from EM and those by ACM was found (r = 0.88, P < .001). A good relationship between LVSV derived from real-time 3DE and that from ACM was observed (r = 0.73, P < .01, mean difference = 2.5 ± 7.9 mL). In patients, a good relationship between LVSV obtained by real-time 3DE and ACM was found (r = 0.90, P < .001, mean difference = 0.6 ± 9.8 mL). CONCLUSION: The combination of ACM and real-time 3DE for quantifying LV volumes, LVSV, and AR volumes was validated by the chronic animal study and was shown to be clinically applicable.
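
    The volumetric bookkeeping behind the method is simple; in the abstract's notation:

```latex
\begin{align}
  \mathrm{LVSV} &= \mathrm{LVEDV} - \mathrm{LVESV} &&\text{(real-time 3DE)}\\
  V_{\mathrm{AR}} &= \mathrm{LVSV} - V_{\mathrm{transmitral}} &&\text{(ACM)}
\end{align}
```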

  18. 3-dimensional Oil Drift Simulations

    NASA Astrophysics Data System (ADS)

    Wettre, C.; Reistad, M.; Hjøllo, B.Å.

    Simulation of oil drift has been an ongoing activity at the Norwegian Meteorological Institute since the 1970s. The Marine Forecasting Centre provides a 24-hour service for the Norwegian Pollution Control Authority and the oil companies operating in the Norwegian sector, with a response time of 30 minutes. From 2002 the service was extended to the simulation of oil drift from oil spills in deep water, using the DeepBlow model developed by SINTEF Applied Chemistry. The oil drift model can be applied to both instantaneous and continuous releases. The changes in the mass of oil and emulsion as a result of evaporation and emulsification are computed. For oil spills in deep water, hydrate formation and gas dissolution are taken into account. The properties of the oil depend on the oil type, and in the present version 64 different types of oil can be simulated. For accurate oil drift simulations it is important to have the best possible data on the atmospheric and oceanic conditions. The oil drift simulations at the Norwegian Meteorological Institute are always based on the most updated data from numerical models of the atmosphere and the ocean. The drift of the surface oil is computed from the vectorial sum of the surface current from the ocean model and the wave-induced Stokes drift computed from wave energy spectra from the wave prediction model. In the new model, the current distribution with depth is taken into account when calculating the drift of the dispersed oil droplets. Salinity and temperature profiles from the ocean model are needed in the DeepBlow model. The results of the oil drift simulations can be plotted on sea charts used for navigation, either as trajectory plots or as particle plots showing the situation at a given time. The results can also be sent as data files to be included in the user's own GIS system.
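
    The surface-drift rule described above is a plain vector sum; a minimal sketch (2-D velocity vectors in m/s; in the operational system the Stokes drift comes from the wave model's energy spectrum, here it is passed in directly):

```python
import numpy as np

def surface_drift(current: np.ndarray, stokes: np.ndarray) -> np.ndarray:
    """Drift velocity of surface oil: ocean-model surface current plus
    wave-induced Stokes drift, as described in the abstract."""
    return current + stokes

# Example: 0.3 m/s eastward current plus a small northeastward Stokes drift.
v = surface_drift(np.array([0.3, 0.0]), np.array([0.035, 0.035]))
```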

  19. Computed tomography-based finite element analysis to assess fracture risk and osteoporosis treatment

    PubMed Central

    Imai, Kazuhiro

    2015-01-01

    Finite element analysis (FEA) is a computational technique for structural stress analysis that was developed in engineering mechanics. Over the past 40 years, FEA has been applied to investigate the structural behavior of human bones. As faster computers became available, improved FEA using 3-dimensional computed tomography (CT) was developed. This CT-based finite element analysis (CT/FEA) has provided clinicians with useful data. In this review, the mechanism of CT/FEA, validation studies of CT/FEA that evaluate its accuracy and reliability in human bones, and clinical application studies assessing fracture risk and the effects of osteoporosis medication are overviewed. PMID:26309819
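
    At the core of CT/FEA is a voxel-wise mapping from CT attenuation to bone density, and from density to elastic modulus. The coefficients are phantom- and site-specific, so the relations below are illustrative only:

```latex
% HU-to-density calibration (a, b from a scanned density phantom),
% then an empirical density-modulus power law (c, d vary by study and site):
\begin{align}
  \rho_{\mathrm{app}} &= a \cdot \mathrm{HU} + b\\
  E &= c\,\rho_{\mathrm{app}}^{\,d}
\end{align}
```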

  1. Effect of Foot Hyperpronation on Lumbar Lordosis and Thoracic Kyphosis in Standing Position Using 3-Dimensional Ultrasound-Based Motion Analysis System

    PubMed Central

    Farokhmanesh, Khatere; Shirzadian, Toraj; Mahboubi, Mohammad; Shahri, Mina Neyakan

    2014-01-01

    Based on clinical observations, foot hyperpronation is very common. Excessive pronation (hyperpronation) can cause malalignment of the lower extremities, which most often leads to functional and structural deficits. The aim of this study was to assess the effect of foot hyperpronation on lumbar lordosis and thoracic kyphosis. Thirty-five healthy subjects (age range, 18-30 years) were asked to stand in 4 positions: on a flat surface (normal position) and on wedges angled at 10, 15, and 20 degrees. Sampling was done using simple random sampling. Measurements were made by a motion analysis system. For data analysis, the SPSS software (ver. 18) was used, applying the paired t-test and repeated-measures analysis of variance (ANOVA). The eversion created by the wedges caused a significant increase in lumbar lordosis and thoracic kyphosis. The most significant change occurred between the two consecutive positions of the flat surface and the first wedge. The paired t-test showed a high correlation between each pair of consecutive positions. The results showed that with increased bilateral foot pronation, lumbar lordosis and thoracic kyphosis increased as well. In fact, each of these changes is a compensation phenomenon. Further studies are required to determine the long-term results of excessive foot pronation and its probable effect on damage progression. PMID:25169004

  2. Cardiothoracic Applications of 3-dimensional Printing.

    PubMed

    Giannopoulos, Andreas A; Steigner, Michael L; George, Elizabeth; Barile, Maria; Hunsaker, Andetta R; Rybicki, Frank J; Mitsouras, Dimitris

    2016-09-01

    Medical 3-dimensional (3D) printing is emerging as a clinically relevant imaging tool in directing preoperative and intraoperative planning in many surgical specialties and will therefore likely lead to interdisciplinary collaboration between engineers, radiologists, and surgeons. Data from standard imaging modalities such as computed tomography, magnetic resonance imaging, echocardiography, and rotational angiography can be used to fabricate life-sized models of human anatomy and pathology, as well as patient-specific implants and surgical guides. Cardiovascular 3D-printed models can improve diagnosis and allow for advanced preoperative planning. The majority of applications reported involve congenital heart diseases and valvular and great vessels pathologies. Printed models are suitable for planning both surgical and minimally invasive procedures. Added value has been reported toward improving outcomes, minimizing perioperative risk, and developing new procedures such as transcatheter mitral valve replacements. Similarly, thoracic surgeons are using 3D printing to assess invasion of vital structures by tumors and to assist in diagnosis and treatment of upper and lower airway diseases. Anatomic models enable surgeons to assimilate information more quickly than image review, choose the optimal surgical approach, and achieve surgery in a shorter time. Patient-specific 3D-printed implants are beginning to appear and may have significant impact on cosmetic and life-saving procedures in the future. In summary, cardiothoracic 3D printing is rapidly evolving and may be a potential game-changer for surgeons. The imager who is equipped with the tools to apply this new imaging science to cardiothoracic care is thus ideally positioned to innovate in this new emerging imaging modality.

  3. A 3-Dimensional Analysis of the Galactic Gamma-Ray Emission Resulting from Cosmic-Ray Interactions with the Interstellar Gas and Radiation Fields

    NASA Technical Reports Server (NTRS)

    Sodroski, Thomas J.; Dwek, Eli (Technical Monitor)

    2001-01-01

    The contractor will provide support for the analysis of data under ADP (NRA 96-ADP-09; Proposal No. 167-96adp). The primary task objective is to construct a 3-D model for the distribution of high-energy (20 MeV - 30 GeV) gamma-ray emission in the Galactic disk. Under this task the contractor will utilize data from the EGRET instrument on the Compton Gamma-Ray Observatory, H I and CO surveys, radio-continuum surveys at 408 MHz, 1420 MHz, 5 GHz, and 19 GHz, the COBE Diffuse Infrared Background Experiment (DIRBE) all-sky maps from 1 to 240 μm, and ground-based B, V, J, H, and K photometry. The respective contributions to the gamma-ray emission from cosmic-ray/matter interactions, inverse Compton scattering, and extragalactic emission will be determined.

  4. 3-Dimensional Topographic Models for the Classroom

    NASA Technical Reports Server (NTRS)

    Keller, J. W.; Roark, J. H.; Sakimoto, S. E. H.; Stockman, S.; Frey, H. V.

    2003-01-01

    We have recently undertaken a program to develop educational tools using 3-dimensional solid models of digital elevation data acquired by the Mars Orbiter Laser Altimeter (MOLA) for Mars, as well as a variety of sources of elevation data for the Earth. This work is made possible by the use of rapid prototyping technology to construct solid 3-dimensional models of science data. We recently acquired a rapid prototyping machine that builds 3-dimensional models in extruded plastic. While the machine was acquired to assist in the design and development of scientific instruments and hardware, it is also fully capable of producing models of spacecraft remote sensing data. We have demonstrated this by using MOLA topographic data and Earth-based topographic data to produce extruded plastic topographic models which are visually appealing and instantly engage those who handle them.

  5. Movement within foot and ankle joint in children with spastic cerebral palsy: a 3-dimensional ultrasound analysis of medial gastrocnemius length with correction for effects of foot deformation

    PubMed Central

    2013-01-01

    Background In spastic cerebral palsy (SCP), a limited range of motion (ROM) of the foot limits gait and other activities. Assessment of this limitation of ROM and knowledge of the active mechanisms is of crucial importance for clinical treatment. Methods For a comparison between spastic cerebral palsy (SCP) children and typically developing (TD) children, medial gastrocnemius (GM) muscle-tendon complex length was assessed using 3-D ultrasound imaging techniques, while exerting externally standardized moments via a hand-held dynamometer. Exemplary X-ray imaging of the ankle and foot was used to confirm possible TD-SCP differences in foot deformation. Results SCP and TD did not differ in the normalized level of excitation (EMG) of the muscles studied. For given moments exerted in SCP, foot plate angles were all more towards plantar flexion than in TD. However, foot plate angle proved to be an invalid estimator of talocrural joint angle, since at equal foot plate angles, the GM muscle-tendon complex was shorter in SCP (corresponding to an equivalent of 1 cm). A substantial difference remained even after normalizing for individual differences in tibia length. X-ray imaging of the ankle and foot of one SCP child and two typically developed adults confirmed that in SCP, of the total foot plate angle change (0-4 Nm: 15°), the contribution of foot deformation to changes in foot plate angle (8°) was as big as the contribution of dorsal flexion at the talocrural joint (7°). In typically developed individuals, foot deformation made relatively smaller contributions (10-11%) to changes in foot plate angle, indicating that the contribution of talocrural angle changes was most important. Using a new estimate for position at the talocrural joint (the difference between GM muscle-tendon complex length and tibia length, GM relative length) removed this effect, thus allowing a fairer comparison of SCP and TD data. On the basis of analysis of foot plate angle and GM relative length as a function

  6. Distributed Design and Analysis of Computer Experiments

    SciTech Connect

    Doak, Justin

    2002-11-11

    DDACE is a C++ object-oriented software library for the design and analysis of computer experiments. DDACE can be used to generate samples from a variety of sampling techniques. These samples may be used as input to an application code. DDACE also contains statistical tools such as response surface models and correlation coefficients to analyze input/output relationships between variables in an application code. DDACE can generate input values for uncertain variables within a user's application. For example, a user might like to vary a temperature variable as well as some material variables in a series of simulations. Through the series of simulations the user might be looking for optimal settings of parameters based on some user criteria, or the user may be interested in the sensitivity to input variability shown by an output variable. In either case, the user may provide information about the suspected ranges and distributions of a set of input variables, along with a sampling scheme, and DDACE will generate input points based on these specifications. The input values generated by DDACE and the one or more outputs computed through the user's application code can be analyzed with a variety of statistical methods. This can lead to a wealth of information about the relationships between the variables in the problem. While statistical and mathematical packages may be employed to carry out the analysis on the input/output relationships, DDACE also contains some tools for analyzing the simulation data. DDACE incorporates a software package called MARS (Multivariate Adaptive Regression Splines), developed by Jerome Friedman. MARS is used for generating a spline surface fit of the data. With MARS, a model simplification may be calculated using the input and corresponding output values for the user's application problem. The MARS grid data may be used for generating 3-dimensional response surface plots of the simulation data. DDACE also contains an implementation of an
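
    DDACE itself is a C++ library and this record does not give its API, but the workflow it describes (sample the inputs, run the application, fit a surface) can be sketched generically; the Latin hypercube sampler below is our own minimal version:

```python
import numpy as np

def latin_hypercube(n_samples, ranges, seed=0):
    """One stratified sample per interval slice for each input variable."""
    rng = np.random.default_rng(seed)
    n_vars = len(ranges)
    strata = np.tile(np.arange(n_samples), (n_vars, 1))
    u = (rng.permuted(strata, axis=1).T + rng.random((n_samples, n_vars))) / n_samples
    lo = np.array([r[0] for r in ranges])
    hi = np.array([r[1] for r in ranges])
    return lo + u * (hi - lo)

# X = latin_hypercube(50, [(250.0, 400.0), (0.1, 0.9)])  # temperature, material
# y = np.array([run_simulation(*row) for row in X])      # the user's application
# Any response surface (MARS in DDACE) can then be fit to (X, y).
```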

  7. 3-dimensional imaging at nanometer resolutions

    DOEpatents

    Werner, James H.; Goodwin, Peter M.; Shreve, Andrew P.

    2010-03-09

    An apparatus and method for enabling precise, 3-dimensional, photoactivation localization microscopy (PALM) using selective, two-photon activation of fluorophores in a single z-slice of a sample in cooperation with time-gated imaging for reducing the background radiation from other image planes to levels suitable for single-molecule detection and spatial location, are described.

  8. Optimization of 3-dimensional imaging of the breast region with 3-dimensional laser scanners.

    PubMed

    Kovacs, Laszlo; Yassouridis, Alexander; Zimmermann, Alexander; Brockmann, Gernot; Wöhnl, Antonia; Blaschke, Matthias; Eder, Maximilian; Schwenzer-Zimmerer, Katja; Rosenberg, Robert; Papadopulos, Nikolaos A; Biemer, Edgar

    2006-03-01

    The anatomic conditions of the female breast require imaging the breast region 3-dimensionally in a normal standing position for quality assurance and for surgery planning or surgery simulation. The goal of this work was to optimize the imaging technology for the mammary region with a 3-dimensional (3D) laser scanner, to evaluate the precision and accuracy of the method, and to allow optimum data reproducibility. Avoiding the influence of biotic factors, such as mobility, we tested the most favorable imaging technology on dummy models for scanner-related factors such as the scanner position in comparison with the torso and the number of scanners and single shots. The influence of different factors of the breast region, such as different breast shapes or premarking of anatomic landmarks, was also first investigated on dummies. The findings from the dummy models were then compared with investigations on test persons, and the accuracy of measurements on the virtual models was compared with a coincidence analysis of the manually measured values. The best precision and accuracy of breast region measurements were achieved when landmarks were marked before taking the shots and when shots at 30 degrees left and 30 degrees right, relative to the sagittal line, were taken with 2 connected scanners mounted with a +10-degree upward angle. However, the precision of the measurements on test persons was significantly lower than those measured on dummies. Our findings show that the correct settings for 3D imaging of the breast region with a laser scanner can achieve an acceptable degree of accuracy and reproducibility.

  9. 3-dimensional fabrication of soft energy harvesters

    NASA Astrophysics Data System (ADS)

    McKay, Thomas; Walters, Peter; Rossiter, Jonathan; O'Brien, Benjamin; Anderson, Iain

    2013-04-01

    Dielectric elastomer generators (DEG) provide an opportunity to harvest energy from low frequency and aperiodic sources. Because DEG are soft, deformable, high energy density generators, they can be coupled to complex structures such as the human body to harvest excess mechanical energy. However, DEG are typically constrained by a rigid frame and manufactured in a simple planar structure. This planar arrangement is unlikely to be optimal for harvesting from compliant and/or complex structures. In this paper we present a soft generator which is fabricated into a 3-dimensional geometry. This capability will enable the 3-dimensional structure of a dielectric elastomer to be customised to the energy source, allowing efficient and/or non-invasive coupling. This paper demonstrates our first 3-dimensional generator which includes a diaphragm with a soft elastomer frame. When the generator was connected to a self-priming circuit and cyclically inflated, energy was accumulated in the system, demonstrated by an increased voltage. Our 3D generator promises a bright future for dielectric elastomers that will be customised for integration with complex and soft structures. In addition to customisable geometries, the 3D printing process may lend itself to fabricating large arrays of small generator units and for fabricating truly soft generators with excellent impedance matching to biological tissue. Thus comfortable, wearable energy harvesters are one step closer to reality.

  10. Computer vision in microstructural analysis

    NASA Technical Reports Server (NTRS)

    Srinivasan, Malur N.; Massarweh, W.; Hough, C. L.

    1992-01-01

    The following is a laboratory experiment designed to be performed by advanced high-school and beginning college students. It is hoped that this experiment will create an interest in and further understanding of materials science. The objective of this experiment is to demonstrate that the microstructure of engineered materials is affected by the processing conditions in manufacture, and that it is possible to characterize the microstructure using image analysis with a computer. The principle of computer vision will first be introduced, followed by the description of the system developed at Texas A&M University. This in turn will be followed by the description of the experiment to obtain differences in microstructure and the characterization of the microstructure using computer vision.
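
    A minimal version of the grain-segmentation idea behind such a system can be written in a few lines (illustrative only; the Texas A&M system is not described at code level in this record):

```python
import numpy as np
from scipy import ndimage

def count_grains(gray: np.ndarray, threshold: float):
    """Binarize a grayscale micrograph and count connected regions.
    Returns the number of grains and their mean area in pixels."""
    binary = gray > threshold
    labels, n_grains = ndimage.label(binary)
    if n_grains == 0:
        return 0, 0.0
    areas = ndimage.sum(binary, labels, index=range(1, n_grains + 1))
    return n_grains, float(np.mean(areas))
```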

  11. Computational Aeroacoustic Analysis System Development

    NASA Technical Reports Server (NTRS)

    Hadid, A.; Lin, W.; Ascoli, E.; Barson, S.; Sindir, M.

    2001-01-01

    Many industrial and commercial products operate in a dynamic flow environment and the aerodynamically generated noise has become a very important factor in the design of these products. In light of the importance in characterizing this dynamic environment, Rocketdyne has initiated a multiyear effort to develop an advanced general-purpose Computational Aeroacoustic Analysis System (CAAS) to address these issues. This system will provide a high fidelity predictive capability for aeroacoustic design and analysis. The numerical platform is able to provide high temporal and spatial accuracy that is required for aeroacoustic calculations through the development of a high order spectral element numerical algorithm. The analysis system is integrated with well-established CAE tools, such as a graphical user interface (GUI) through PATRAN, to provide cost-effective access to all of the necessary tools. These include preprocessing (geometry import, grid generation and boundary condition specification), code set up (problem specification, user parameter definition, etc.), and postprocessing. The purpose of the present paper is to assess the feasibility of such a system and to demonstrate the efficiency and accuracy of the numerical algorithm through numerical examples. Computations of vortex shedding noise were carried out in the context of a two-dimensional low Mach number turbulent flow past a square cylinder. The computational aeroacoustic approach that is used in CAAS relies on coupling a base flow solver to the acoustic solver throughout a computational cycle. The unsteady fluid motion, which is responsible for both the generation and propagation of acoustic waves, is calculated using a high order flow solver. The results of the flow field are then passed to the acoustic solver through an interpolator to map the field values into the acoustic grid. The acoustic field, which is governed by the linearized Euler equations, is then calculated using the flow results computed
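
    The acoustic stage solves the linearized Euler equations about the computed base flow. In their simplest form, for a uniform mean flow (a textbook statement, not the exact discretized system in CAAS):

```latex
\begin{align}
  \partial_t \rho' + \bar{\mathbf{u}}\cdot\nabla\rho' + \bar{\rho}\,\nabla\cdot\mathbf{u}' &= 0\\
  \partial_t \mathbf{u}' + \bar{\mathbf{u}}\cdot\nabla\mathbf{u}' + \tfrac{1}{\bar{\rho}}\nabla p' &= 0\\
  \partial_t p' + \bar{\mathbf{u}}\cdot\nabla p' + \gamma\bar{p}\,\nabla\cdot\mathbf{u}' &= 0
\end{align}
```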

  12. Preliminary Toxicity Analysis of 3-Dimensional Conformal Radiation Therapy Versus Intensity Modulated Radiation Therapy on the High-Dose Arm of the Radiation Therapy Oncology Group 0126 Prostate Cancer Trial

    SciTech Connect

    Michalski, Jeff M.; Yan, Yan; Watkins-Bruner, Deborah; Bosch, Walter R.; Winter, Kathryn; Galvin, James M.; Bahary, Jean-Paul; Morton, Gerard C.; Parliament, Matthew B.; Sandler, Howard M.

    2013-12-01

    Purpose: To give a preliminary report of clinical and treatment factors associated with toxicity in men receiving high-dose radiation therapy (RT) on a phase 3 dose-escalation trial. Methods and Materials: The trial was initiated with 3-dimensional conformal RT (3D-CRT) and amended after 1 year to allow intensity modulated RT (IMRT). Patients treated with 3D-CRT received 55.8 Gy to a planning target volume that included the prostate and seminal vesicles, then 23.4 Gy to prostate only. The IMRT patients were treated to the prostate and proximal seminal vesicles to 79.2 Gy. Common Toxicity Criteria, version 2.0, and Radiation Therapy Oncology Group/European Organization for Research and Treatment of Cancer late morbidity scores were used for acute and late effects. Results: Of 763 patients randomized to the 79.2-Gy arm of Radiation Therapy Oncology Group 0126 protocol, 748 were eligible and evaluable: 491 and 257 were treated with 3D-CRT and IMRT, respectively. For both bladder and rectum, the volumes receiving 65, 70, and 75 Gy were significantly lower with IMRT (all P<.0001). For grade (G) 2+ acute gastrointestinal/genitourinary (GI/GU) toxicity, both univariate and multivariate analyses showed a statistically significant decrease in G2+ acute collective GI/GU toxicity for IMRT. There were no significant differences with 3D-CRT or IMRT for acute or late G2+ or 3+ GU toxicities. Univariate analysis showed a statistically significant decrease in late G2+ GI toxicity for IMRT (P=.039). On multivariate analysis, IMRT showed a 26% reduction in G2+ late GI toxicity (P=.099). Acute G2+ toxicity was associated with late G3+ toxicity (P=.005). With dose–volume histogram data in the multivariate analysis, RT modality was not significant, whereas white race (P=.001) and rectal V70 ≥15% were associated with G2+ rectal toxicity (P=.034). Conclusions: Intensity modulated RT is associated with a significant reduction in acute G2+ GI/GU toxicity. There is a trend for a

  13. Forensic Analysis of Compromised Computers

    NASA Technical Reports Server (NTRS)

    Wolfe, Thomas

    2004-01-01

    Directory Tree Analysis File Generator is a Practical Extraction and Reporting Language (PERL) script that simplifies and automates the collection of information for forensic analysis of compromised computer systems. During such an analysis, it is sometimes necessary to collect and analyze information about files on a specific directory tree. Directory Tree Analysis File Generator collects information of this type (except information about directories) and writes it to a text file. In particular, the script asks the user for the root of the directory tree to be processed, the name of the output file, and the number of subtree levels to process. The script then processes the directory tree and puts out the aforementioned text file. The format of the text file is designed to enable the submission of the file as input to a spreadsheet program, wherein the forensic analysis is performed. The analysis usually consists of sorting files and examination of such characteristics of files as ownership, time of creation, and time of most recent access, all of which characteristics are among the data included in the text file.
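
    The original is a Perl script; the same collection step can be sketched in Python (the field set and tab-separated output format are our guesses from the description, chosen so the file imports cleanly into a spreadsheet):

```python
import os

def collect_tree(root: str, out_path: str, max_depth: int) -> None:
    """Walk `root` down to `max_depth` subtree levels, writing one
    tab-separated line per file: path, owner uid, ctime, atime."""
    base = root.rstrip(os.sep).count(os.sep)
    with open(out_path, "w") as out:
        out.write("path\tuid\tctime\tatime\n")
        for dirpath, _, filenames in os.walk(root):
            if dirpath.count(os.sep) - base >= max_depth:
                continue
            for name in filenames:
                path = os.path.join(dirpath, name)
                st = os.stat(path, follow_symlinks=False)
                out.write(f"{path}\t{st.st_uid}\t{st.st_ctime}\t{st.st_atime}\n")

# collect_tree("/evidence/home", "tree_report.txt", max_depth=3)
```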

  14. A Petaflops Era Computing Analysis

    NASA Technical Reports Server (NTRS)

    Preston, Frank S.

    1998-01-01

    This report covers a study of the potential for petaflops (10(exp 15) floating point operations per second) computing. This study was performed within the year 1996 and should be considered as the first step in an on-going effort. The analysis concludes that a petaflop system is technically feasible but not feasible with today's state-of-the-art. Since the computer arena is now a commodity business, most experts expect that a petaflops system will evolve from current technology in an evolutionary fashion. To meet the price expectations of users waiting for petaflop performance, great improvements in lowering component costs will be required. Lower power consumption is also a must. The present rate of progress in improved performance places the date of introduction of petaflop systems at about 2010. Several years before that date, it is projected that chip feature sizes will reach the now-known resolution limit. Aside from the economic problems and constraints, software is identified as the major problem. The tone of this initial study is more pessimistic than most of the published material available on petaflop systems. Workers in the field are expected to generate more data which could serve to provide a basis for a more informed projection. This report includes an annotated bibliography.

  15. Personal Computer Transport Analysis Program

    NASA Technical Reports Server (NTRS)

    DiStefano, Frank, III; Wobick, Craig; Chapman, Kirt; McCloud, Peter

    2012-01-01

    The Personal Computer Transport Analysis Program (PCTAP) is C++ software used for analysis of thermal fluid systems. The program predicts thermal fluid system and component transients. The output consists of temperatures, flow rates, pressures, delta pressures, tank quantities, and gas quantities in the air, along with air scrubbing component performance. PCTAP's solution process assumes that the tubes in the system are well insulated so that only the heat transfer between fluid and tube wall and between adjacent tubes is modeled. The system described in the model file is broken down into its individual components; i.e., tubes, cold plates, heat exchangers, etc. A solution vector is built from the components and a flow is then simulated with fluid being transferred from one component to the next. The solution vector of components in the model file is built at the initiation of the run. This solution vector is simply a list of components in the order of their inlet dependency on other components. The component parameters are updated in the order in which they appear in the list at every time step. Once the solution vectors have been determined, PCTAP cycles through the components in the solution vector, executing their outlet function for each time-step increment.
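
    The solution-vector scheme amounts to a topological ordering of components by inlet dependency, followed by a fixed-order sweep at every time step. A minimal Python sketch of that scheduling idea (component names and the outlet stub are hypothetical; PCTAP itself is C++):

        from graphlib import TopologicalSorter   # Python 3.9+

        # Hypothetical component list; each entry names the components that
        # feed its inlet (the dependency the solution vector must respect).
        inlet_deps = {
            "pump": [],
            "cold_plate": ["pump"],
            "heat_exchanger": ["cold_plate"],
            "tank": ["heat_exchanger"],
        }

        # Build the solution vector once, at run initiation: components are
        # ordered so each appears after everything feeding its inlet.
        solution_vector = list(TopologicalSorter(inlet_deps).static_order())

        def outlet(component, t):
            pass   # stand-in: update temperatures, flows, pressures here

        t, dt, t_end = 0.0, 0.1, 100.0
        while t < t_end:
            for component in solution_vector:   # fixed inlet-dependency order
                outlet(component, t)
            t += dt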

  16. Computer graphics in aerodynamic analysis

    NASA Technical Reports Server (NTRS)

    Cozzolongo, J. V.

    1984-01-01

    The use of computer graphics and its application to aerodynamic analyses on a routine basis is outlined. The mathematical modelling of the aircraft geometries and the shading technique implemented are discussed. Examples of computer graphics used to display aerodynamic flow field data and aircraft geometries are shown. A future need in computer graphics for aerodynamic analyses is addressed.

  17. Candidate gene analyses of 3-dimensional dentoalveolar phenotypes in subjects with malocclusion

    PubMed Central

    Weaver, Cole A.; Miller, Steven F.; da Fontoura, Clarissa S. G.; Wehby, George L.; Amendt, Brad A.; Holton, Nathan E.; Allareddy, Veeratrishul; Southard, Thomas E.; Moreno Uribe, Lina M.

    2017-01-01

    Introduction Genetic studies of malocclusion etiology have identified 4 deleterious mutations in genes, DUSP6, ARHGAP21, FGF23, and ADAMTS1 in familial Class III cases. Although these variants may have large impacts on Class III phenotypic expression, their low frequency (<1%) makes them unlikely to explain most malocclusions. Thus, much of the genetic variation underlying the dentofacial phenotypic variation associated with malocclusion remains unknown. In this study, we evaluated associations between common genetic variations in craniofacial candidate genes and 3-dimensional dentoalveolar phenotypes in patients with malocclusion. Methods Pretreatment dental casts or cone-beam computed tomographic images from 300 healthy subjects were digitized with 48 landmarks. The 3-dimensional coordinate data were submitted to a geometric morphometric approach along with principal component analysis to generate continuous phenotypes including symmetric and asymmetric components of dentoalveolar shape variation, fluctuating asymmetry, and size. The subjects were genotyped for 222 single-nucleotide polymorphisms in 82 genes/loci, and phenotype-genotype associations were tested via multivariate linear regression. Results Principal component analysis of symmetric variation identified 4 components that explained 68% of the total variance and depicted anteroposterior, vertical, and transverse dentoalveolar discrepancies. Suggestive associations (P < 0.05) were identified with PITX2, SNAI3, 11q22.2-q22.3, 4p16.1, ISL1, and FGF8. Principal component analysis for asymmetric variations identified 4 components that explained 51% of the total variations and captured left-to-right discrepancies resulting in midline deviations, unilateral crossbites, and ectopic eruptions. Suggestive associations were found with TBX1, AJUBA, SNAI3, SATB2, TP63, and 1p22.1. Fluctuating asymmetry was associated with BMP3 and LATS1. Associations for SATB2 and BMP3 with asymmetric variations remained significant
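
    As a rough illustration of the phenotype-genotype pipeline described above, the Python sketch below reduces landmark coordinates to principal component scores and screens each SNP against each score. The data are randomly generated stand-ins, and the per-component OLS screen is a simplification of the study's multivariate regression:

        import numpy as np
        from sklearn.decomposition import PCA
        import statsmodels.api as sm

        rng = np.random.default_rng(0)
        X = rng.normal(size=(300, 144))        # stand-in for 48 landmarks x 3 coords
        snps = rng.integers(0, 3, size=(300, 222)).astype(float)

        # Continuous shape phenotypes: principal component scores
        pca = PCA(n_components=4)
        scores = pca.fit_transform(X)
        print("variance explained:", pca.explained_variance_ratio_.sum())

        # Screen each SNP against each component score (per-PC OLS here;
        # the study itself used multivariate regression)
        hits = []
        for j in range(snps.shape[1]):
            g = sm.add_constant(snps[:, j])
            for k in range(scores.shape[1]):
                p = sm.OLS(scores[:, k], g).fit().pvalues[1]
                if p < 0.05:
                    hits.append((j, k, p))
        print(len(hits), "suggestive associations (multiple testing not handled)")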

  18. Biomolecular dynamics by computer analysis

    SciTech Connect

    Eilbeck, J.C.; Lomdahl, P.S.; Scott, A.C.

    1984-01-01

    As numerical tools (computers and display equipment) become more powerful and the atomic structures of important biological molecules become known, the importance of detailed computation of nonequilibrium biomolecular dynamics increases. In this manuscript we report results from a well developed study of the hydrogen bonded polypeptide crystal acetanilide, a model protein. Directions for future research are suggested. 9 references, 6 figures.

  19. Computer Aided Data Analysis in Sociometry

    ERIC Educational Resources Information Center

    Langeheine, Rolf

    1978-01-01

    A computer program which analyzes sociometric data is presented. The SDAS program provides classical sociometric analysis. Multi-dimensional scaling and cluster analysis techniques may be combined with the MSP program. (JKS)

  20. Computer assistance in food analysis.

    PubMed

    Dusold, L R; Roach, J A

    1986-01-01

    Laboratory computer links are a key part of acquisition, movement, and interpretation of certain types of data. Remote information retrieval from databases such as the Chemical Information System provides the analyst with structural and toxicological information via a laboratory terminal. Remote processing of laboratory data by large computers permits the application of pattern recognition techniques to the solution of complex multivariate problems such as the detection of food adulteration.

  1. Chaotic Advection in a Bounded 3-Dimensional Potential Flow

    NASA Astrophysics Data System (ADS)

    Metcalfe, Guy; Smith, Lachlan; Lester, Daniel

    2012-11-01

    3-dimensional potential, or Darcy flows, are central to understanding and designing laminar transport in porous media; however, chaotic advection in 3-dimensional, volume-preserving flows is still not well understood. We show results of advecting passive scalars in a transient 3-dimensional potential flow that consists of a steady dipole flow and periodic reorientation. Even for the most symmetric reorientation protocol, neither of the two invariants of the motion are conserved; however, one invariant is closely shadowed by a surface of revolution constructed from particle paths of the steady flow, creating in practice an adiabatic surface. A consequence is that chaotic regions cover 3-dimensional space, though tubular regular regions are still transport barriers. This appears to be a new mechanism generating 3-dimensional chaotic orbits. These results contrast with the experimental and theoretical results for chaotic scalar transport in 2-dimensional Darcy flows. Wiggins, J. Fluid Mech. 654 (2010).

  2. Computational methods for global/local analysis

    NASA Technical Reports Server (NTRS)

    Ransom, Jonathan B.; Mccleary, Susan L.; Aminpour, Mohammad A.; Knight, Norman F., Jr.

    1992-01-01

    Computational methods for global/local analysis of structures which include both uncoupled and coupled methods are described. In addition, global/local analysis methodology for automatic refinement of incompatible global and local finite element models is developed. Representative structural analysis problems are presented to demonstrate the global/local analysis methods.

  3. Computational analysis on plug-in hybrid electric motorcycle chassis

    NASA Astrophysics Data System (ADS)

    Teoh, S. J.; Bakar, R. A.; Gan, L. M.

    2013-12-01

    Plug-in hybrid electric motorcycle (PHEM) is an alternative that promotes sustainability and lower emissions. However, the PHEM overall system packaging is constrained by the limited space in a motorcycle chassis. In this paper, a chassis applying the concept of a Chopper is analysed for application in a PHEM. The chassis 3-dimensional (3D) model is built with CAD software. The PHEM power-train components and drive-train mechanisms are integrated into the 3D model to ensure the chassis provides sufficient space. Besides that, a human dummy model is built into the 3D model to ensure the rider's ergonomics and comfort. The chassis 3D model then undergoes stress-strain simulation. The simulation predicts the stress distribution, displacement, and factor of safety (FOS). The data are used to identify the critical points, thus indicating whether the chassis design is applicable or needs to be redesigned/modified to meet the required strength. Critical points are the locations of highest stress, which might cause the chassis to fail; for a motorcycle chassis they occur at the triple-tree joints and the rear-absorber bracket. In conclusion, the computational analysis predicts the stress distribution and provides a guideline for developing a safe prototype chassis.

  4. Assessment of Arterial Wall Enhancement for Differentiation of Parent Artery Disease from Small Artery Disease: Comparison between Histogram Analysis and Visual Analysis on 3-Dimensional Contrast-Enhanced T1-Weighted Turbo Spin Echo MR Images at 3T

    PubMed Central

    Jang, Jinhee; Kim, Tae-Won; Hwang, Eo-Jin; Koo, Jaseong; Shin, Yong Sam; Jung, So-Lyung; Ahn, Kook-Jin; Kim, Bum-soo

    2017-01-01

    Objective The purpose of this study was to compare the histogram analysis and visual scores in 3T MRI assessment of middle cerebral arterial wall enhancement in patients with acute stroke, for the differentiation of parent artery disease (PAD) from small artery disease (SAD). Materials and Methods Among the 82 consecutive patients at a tertiary hospital over one year, 25 patients with acute infarcts in middle cerebral artery (MCA) territory were included in this study, including 15 patients with PAD and 10 with SAD. Three-dimensional contrast-enhanced T1-weighted turbo spin echo MR images with black-blood preparation at 3T were analyzed both qualitatively and quantitatively. The degree of MCA stenosis, and visual and histogram assessments on MCA wall enhancement were evaluated. A statistical analysis was performed to compare diagnostic accuracy between qualitative and quantitative metrics. Results The degree of stenosis, visual enhancement score, geometric mean (GM), and the 90th percentile (90P) value from the histogram analysis were significantly higher in PAD than in SAD (p = 0.006 for stenosis, < 0.001 for others). The receiver operating characteristic curve area of GM and 90P were 1 (95% confidence interval [CI], 0.86–1.00). Conclusion A histogram analysis of a relevant arterial wall enhancement allows differentiation between PAD and SAD in patients with acute stroke within the MCA territory. PMID:28246519
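
    The quantitative arm of such an analysis reduces to a few lines of code. A hedged Python sketch with synthetic intensities (the lognormal parameters are arbitrary, not fitted to patient data):

        import numpy as np
        from scipy import stats
        from sklearn.metrics import roc_auc_score

        def wall_metrics(roi):
            # Histogram metrics of wall-enhancement voxel intensities
            roi = np.asarray(roi, dtype=float)
            gm = stats.gmean(roi)              # requires positive intensities
            p90 = np.percentile(roi, 90)       # 90th percentile (90P)
            return gm, p90

        # Synthetic cohort: 1 = parent artery disease, 0 = small artery disease
        rng = np.random.default_rng(1)
        labels, gms = [], []
        for _ in range(25):
            pad = rng.random() < 0.6
            roi = rng.lognormal(mean=1.5 if pad else 1.0, sigma=0.3, size=200)
            labels.append(int(pad))
            gms.append(wall_metrics(roi)[0])

        print("ROC AUC for geometric mean:", roc_auc_score(labels, gms))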

  5. Computer applications for engineering/structural analysis

    SciTech Connect

    Zaslawsky, M.; Samaddar, S.K.

    1991-01-01

    Analysts and organizations have a tendency to lock themselves into specific codes with the obvious consequences of not addressing the real problem and thus reaching the wrong conclusion. This paper discusses the role of the analyst in selecting computer codes. The participation and support of a computation division in modifying the source program, configuration management, and pre- and post-processing of codes are among the subjects discussed. Specific examples illustrating the computer code selection process are described in the following problem areas: soil structure interaction, structural analysis of nuclear reactors, analysis of waste tanks where fluid structure interaction is important, analysis of equipment, structure-structure interaction, analysis of the operation of the superconductor supercollider which includes friction and transient temperature, and 3D analysis of the 10-meter telescope being built in Hawaii. Validation and verification of computer codes and their impact on the selection process are also discussed.

  6. IUE Data Analysis Software for Personal Computers

    NASA Technical Reports Server (NTRS)

    Thompson, R.; Caplinger, J.; Taylor, L.; Lawton , P.

    1996-01-01

    This report summarizes the work performed for the program titled, "IUE Data Analysis Software for Personal Computers" awarded under Astrophysics Data Program NRA 92-OSSA-15. The work performed was completed over a 2-year period starting in April 1994. As a result of the project, 450 IDL routines and eight database tables are now available for distribution for Power Macintosh computers and Personal Computers running Windows 3.1.

  7. Massive Contingency Analysis with High Performance Computing

    SciTech Connect

    Huang, Zhenyu; Chen, Yousu; Nieplocha, Jaroslaw

    2009-07-26

    Contingency analysis is a key function in the Energy Management System (EMS) to assess the impact of various combinations of power system component failures based on state estimates. Contingency analysis is also extensively used in power market operation for feasibility testing of market solutions. Faster analysis of more cases is required to safely and reliably operate today's power grids with less margin and more intermittent renewable energy sources. Enabled by the latest developments in the computer industry, high performance computing holds the promise of meeting this need in the power industry. This paper investigates the potential of high performance computing for massive contingency analysis. The framework of "N-x" contingency analysis is established and computational load balancing schemes are studied and implemented with high performance computers. Case studies of massive 300,000-contingency-case analysis using the Western Electricity Coordinating Council power grid model are presented to illustrate the application of high performance computing and demonstrate the performance of the framework and computational load balancing schemes.
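
    The core of an "N-x" framework is embarrassingly parallel: enumerate outage combinations and distribute them across workers under some load-balancing policy. A schematic Python sketch (the scoring function is a placeholder for a real power-flow solver):

        from concurrent.futures import ProcessPoolExecutor
        from itertools import combinations

        def solve_contingency(outage):
            # Placeholder scoring; a production framework would run a full
            # power-flow solution with these components removed and return
            # violation metrics based on the state estimate.
            return sum(int(name.split("_")[1]) for name in outage) % 97

        if __name__ == "__main__":
            components = [f"branch_{i}" for i in range(200)]
            cases = list(combinations(components, 2))       # an "N-2" case set
            # Static load balancing: hand workers fixed-size chunks of cases.
            with ProcessPoolExecutor(max_workers=8) as pool:
                scores = list(pool.map(solve_contingency, cases, chunksize=512))
            worst_score, worst_case = max(zip(scores, cases))
            print(len(cases), "cases analyzed; worst:", worst_case, worst_score)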

  8. Mandibular reconstruction using stereolithographic 3-dimensional printing modeling technology.

    PubMed

    Cohen, Adir; Laviv, Amir; Berman, Phillip; Nashef, Rizan; Abu-Tair, Jawad

    2009-11-01

    Mandibular reconstruction can be challenging for the surgeon wishing to restore its unique geometry. Reconstruction can be achieved with titanium bone plates followed by autogenous bone grafting. Incorporation of the bone graft into the mandible provides the continuity and strength required for proper esthetics and function and permits dental implant rehabilitation at a later stage. Precious time in the operating room is invested in plate contouring to reconstruct the mandible. Rapid prototyping technologies can construct physical models from computer-aided design via 3-dimensional (3D) printers. A prefabricated 3D model is achieved, which assists in accurate contouring of plates and/or planning of bone graft harvest geometry before surgery. The 2 most commonly used rapid prototyping technologies are stereolithography and 3D printing (3DP). Three-dimensional printing is advantageous compared with stereolithography, offering better accuracy, quicker printing time, and lower cost. We present 3 clinical cases based on 3DP modeling technology. Models were fabricated before the resection of mandibular ameloblastoma and were used to prepare bridging plates before the first stage of reconstruction. In 1 case, another model was fabricated and used as a template for an iliac crest bone graft in the second stage of reconstruction. The 3DP technology provided a precise, fast, and inexpensive mandibular reconstruction, which shortens operation time (and therefore decreases exposure to general anesthesia, blood loss, and wound exposure time) and simplifies the surgical procedure.

  9. Computer aided engineering analysis of automotive bumpers

    SciTech Connect

    Glance, P.M.

    1984-01-01

    This paper presents a description of a general purpose, computer-aided engineering design methodology which has been employed in the design of automotive bumper systems. A comparison of computer-aided analysis predictions with actual test data is presented. Two case histories of bumper system designs are discussed.

  10. Distributed computing and nuclear reactor analysis

    SciTech Connect

    Brown, F.B.; Derstine, K.L.; Blomquist, R.N.

    1994-03-01

    Large-scale scientific and engineering calculations for nuclear reactor analysis can now be carried out effectively in a distributed computing environment, at costs far lower than for traditional mainframes. The distributed computing environment must include support for traditional system services, such as a queuing system for batch work, reliable filesystem backups, and parallel processing capabilities for large jobs. All ANL computer codes for reactor analysis have been adapted successfully to a distributed system based on workstations and X-terminals. Distributed parallel processing has been demonstrated to be effective for long-running Monte Carlo calculations.

  11. [3-dimensional photogrammetry assessment of facial contours].

    PubMed

    Kakoschke, D; Gäbel, H; Schettler, D

    1997-02-01

    In Germany, three-dimensional non-invasive measurement techniques are not in routine use for medical purposes. Completely integrated applications of photogrammetric technology are lacking. The results of clinical examination, X-rays and pre- and postoperative photographs from different angles have been used for medical analysis. In an interdisciplinary research project we tested the general applicability of photogrammetric measurement systems. We examined patients with malformations of the mandible-maxilla complex by taking pictures of the face. In order to assess the surface structure we projected regular patterns onto the surface. We calculated about 500 points on the surface with an accuracy better than 0.2 mm. Graphical analyses of measurement results are presented in clinically relevant form. We produce representations of the faces in AutoCAD by means of regular meshes, which allow views from any perspective as well as longitudinal and lateral sections. In addition to calculating angles, distances, surfaces and volumes, visualisation of shape is a useful aid in documentation and quantification of changes in the soft tissue of the human face during surgical treatment.

  12. DFT computational analysis of piracetam

    NASA Astrophysics Data System (ADS)

    Rajesh, P.; Gunasekaran, S.; Seshadri, S.; Gnanasambandan, T.

    2014-11-01

    Density functional theory calculations with B3LYP using the 6-31G(d,p) and 6-31++G(d,p) basis sets have been used to determine ground state molecular geometries. The first order hyperpolarizability (β0) and related properties (β, α0 and Δα) of piracetam are calculated using the B3LYP/6-31G(d,p) method with the finite-field approach. The stability of the molecule has been analyzed by NBO/NLMO analysis. The calculation of the first hyperpolarizability shows that the molecule is an attractive candidate for future applications in non-linear optics. The molecular electrostatic potential (MEP) at a point in the space around a molecule gives an indication of the net electrostatic effect produced at that point by the total charge distribution of the molecule. The calculated HOMO and LUMO energies show that charge transfer occurs within these molecules. Mulliken population analysis of atomic charges is also reported. On the basis of the vibrational analysis, the thermodynamic properties of the title compound at different temperatures have been calculated. Finally, the UV-Vis spectra and electronic absorption properties are explained and illustrated from the frontier molecular orbitals.
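
    For reference, the scalar quantities named above are conventionally assembled from the Cartesian polarizability and hyperpolarizability tensor components; a standard set of definitions (quoted here for context, not taken from this paper) is

        \alpha_0 = \tfrac{1}{3}\,(\alpha_{xx} + \alpha_{yy} + \alpha_{zz})

        \Delta\alpha = \tfrac{1}{\sqrt{2}}\,\big[(\alpha_{xx}-\alpha_{yy})^2 + (\alpha_{yy}-\alpha_{zz})^2 + (\alpha_{zz}-\alpha_{xx})^2 + 6\,(\alpha_{xy}^2 + \alpha_{yz}^2 + \alpha_{zx}^2)\big]^{1/2}

        \beta_0 = (\beta_x^2 + \beta_y^2 + \beta_z^2)^{1/2}, \qquad \beta_x = \beta_{xxx} + \beta_{xyy} + \beta_{xzz} \ \text{(and cyclically for } \beta_y, \beta_z\text{)}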

  13. DFT computational analysis of piracetam.

    PubMed

    Rajesh, P; Gunasekaran, S; Seshadri, S; Gnanasambandan, T

    2014-11-11

    Density functional theory calculations with B3LYP using the 6-31G(d,p) and 6-31++G(d,p) basis sets have been used to determine ground state molecular geometries. The first order hyperpolarizability (β0) and related properties (β, α0 and Δα) of piracetam are calculated using the B3LYP/6-31G(d,p) method with the finite-field approach. The stability of the molecule has been analyzed by NBO/NLMO analysis. The calculation of the first hyperpolarizability shows that the molecule is an attractive candidate for future applications in non-linear optics. The molecular electrostatic potential (MEP) at a point in the space around a molecule gives an indication of the net electrostatic effect produced at that point by the total charge distribution of the molecule. The calculated HOMO and LUMO energies show that charge transfer occurs within these molecules. Mulliken population analysis of atomic charges is also reported. On the basis of the vibrational analysis, the thermodynamic properties of the title compound at different temperatures have been calculated. Finally, the UV-Vis spectra and electronic absorption properties are explained and illustrated from the frontier molecular orbitals.

  14. Computational analysis of ramjet engine inlet interaction

    NASA Technical Reports Server (NTRS)

    Duncan, Beverly; Thomas, Scott

    1992-01-01

    A computational analysis of a ramjet engine at Mach 3.5 has been conducted and compared to results obtained experimentally. This study focuses on the behavior of the inlet both with and without combustor backpressure. Increased backpressure results in separation of the body side boundary layer and a resultant static pressure rise in the inlet throat region. The computational results compare well with the experimental data for static pressure distribution through the engine, inlet throat flow profiles, and mass capture. The computational analysis slightly underpredicts the thickness of the engine body surface boundary layer and the extent of the interaction caused by backpressure; however, the interaction is observed at approximately the same level of backpressure both experimentally and computationally. This study demonstrates the ability of two different Navier-Stokes codes, namely RPLUS and PARC, to calculate the flow features of this ramjet engine and to provide more detailed information on the process of inlet interaction and unstart.

  15. HL-20 computational fluid dynamics analysis

    NASA Astrophysics Data System (ADS)

    Weilmuenster, K. James; Greene, Francis A.

    1993-09-01

    The essential elements of a computational fluid dynamics analysis of the HL-20/personnel launch system aerothermal environment at hypersonic speeds including surface definition, grid generation, solution techniques, and visual representation of results are presented. Examples of solution technique validation through comparison with data from ground-based facilities are presented, along with results from computations at flight conditions. Computations at flight points indicate that real-gas effects have little or no effect on vehicle aerodynamics and, at these conditions, results from approximate techniques for determining surface heating are comparable with those obtained from Navier-Stokes solutions.

  16. HL-20 computational fluid dynamics analysis

    NASA Technical Reports Server (NTRS)

    Weilmuenster, K. J.; Greene, Francis A.

    1993-01-01

    The essential elements of a computational fluid dynamics analysis of the HL-20/personnel launch system aerothermal environment at hypersonic speeds including surface definition, grid generation, solution techniques, and visual representation of results are presented. Examples of solution technique validation through comparison with data from ground-based facilities are presented, along with results from computations at flight conditions. Computations at flight points indicate that real-gas effects have little or no effect on vehicle aerodynamics and, at these conditions, results from approximate techniques for determining surface heating are comparable with those obtained from Navier-Stokes solutions.

  17. Computer aided nonlinear electrical networks analysis

    NASA Technical Reports Server (NTRS)

    Slapnicar, P.

    1977-01-01

    Techniques used in simulating an electrical circuit with nonlinear elements for use in computer-aided circuit analysis programs are described. Elements of the circuit include capacitors, resistors, inductors, transistors, diodes, and voltage and current sources (constant or time varying). Simulation features are discussed for dc, ac, and/or transient circuit analysis. Calculations are based on the model approach of formulating the circuit equations. A particular solution of transient analysis for nonlinear storage elements is described.
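
    The handling of nonlinear elements sketched above typically reduces, at each analysis point, to Newton-Raphson iteration on the nodal equations. A self-contained Python example for the dc operating point of a source-resistor-diode loop (component values are arbitrary; this mirrors the general technique, not the program's actual code):

        import math

        # Series loop: Vs -- R -- diode -- ground. Nodal unknown: the diode
        # voltage v, satisfying (Vs - v)/R = Is*(exp(v/Vt) - 1)  (Shockley model)
        Vs, R, Is, Vt = 5.0, 1e3, 1e-12, 0.02585   # arbitrary example values

        def f(v):                                   # nodal current residual
            return (Vs - v) / R - Is * (math.exp(v / Vt) - 1.0)

        def dfdv(v):                                # analytic Jacobian (1x1)
            return -1.0 / R - (Is / Vt) * math.exp(v / Vt)

        v = 0.6                                     # guess near the diode knee
        for _ in range(50):                         # Newton-Raphson iteration
            step = f(v) / dfdv(v)
            v -= step
            if abs(step) < 1e-12:
                break

        print(f"diode voltage = {v:.4f} V, current = {(Vs - v) / R * 1e3:.3f} mA")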

  18. Computer analysis of foetal monitoring signals.

    PubMed

    Nunes, Inês; Ayres-de-Campos, Diogo

    2016-01-01

    Five systems for computer analysis of foetal monitoring signals are currently available, incorporating the evaluation of cardiotocographic (CTG) or combined CTG with electrocardiographic ST data. All systems have been integrated with central monitoring stations, allowing the simultaneous monitoring of several tracings on the same computer screen in multiple hospital locations. Computer analysis elicits real-time visual and sound alerts for health care professionals when abnormal patterns are detected, with the aim of prompting a re-evaluation and subsequent clinical action, if considered necessary. Comparison between the CTG analyses provided by the computer and clinical experts has been carried out in all systems, and in three of them, the accuracy of computer alerts in predicting newborn outcomes was evaluated. Comparisons between these studies are hampered by the differences in selection criteria and outcomes. Two of these systems have just completed multicentre randomised clinical trials comparing them with conventional CTG monitoring, and their results are awaited shortly. For the time being, there is limited evidence regarding the impact of computer analysis of foetal monitoring signals on perinatal indicators and on health care professionals' behaviour.

  19. ASTEC: Controls analysis for personal computers

    NASA Technical Reports Server (NTRS)

    Downing, John P.; Bauer, Frank H.; Thorpe, Christopher J.

    1989-01-01

    The ASTEC (Analysis and Simulation Tools for Engineering Controls) software is under development at Goddard Space Flight Center (GSFC). The design goal is to provide a wide selection of controls analysis tools at the personal computer level, as well as the capability to upload compute-intensive jobs to a mainframe or supercomputer. The project is a follow-on to the INCA (INteractive Controls Analysis) program that has been developed at GSFC over the past five years. While ASTEC makes use of the algorithms and expertise developed for the INCA program, the user interface was redesigned to take advantage of the capabilities of the personal computer. The design philosophy and the current capabilities of the ASTEC software are described.

  20. Interfacing Computer Aided Parallelization and Performance Analysis

    NASA Technical Reports Server (NTRS)

    Jost, Gabriele; Jin, Haoqiang; Labarta, Jesus; Gimenez, Judit; Biegel, Bryan A. (Technical Monitor)

    2003-01-01

    When porting sequential applications to parallel computer architectures, the program developer will typically go through several cycles of source code optimization and performance analysis. We have started a project to develop an environment where the user can jointly navigate through program structure and performance data information in order to make efficient optimization decisions. In a prototype implementation we have interfaced the CAPO computer aided parallelization tool with the Paraver performance analysis tool. We describe both tools and their interface and give an example for how the interface helps within the program development cycle of a benchmark code.

  1. Development and Validation of a 3-Dimensional CFB Furnace Model

    NASA Astrophysics Data System (ADS)

    Vepsäläinen, Ari; Myöhänen, Kari; Hyppänen, Timo; Leino, Timo; Tourunen, Antti

    At Foster Wheeler, a three-dimensional CFB furnace model is an essential part of knowledge development for the CFB furnace process regarding solid mixing, combustion, emission formation and heat transfer. Results of laboratory and pilot scale phenomenon research are utilized in the development of sub-models. Analyses of field-test results in industrial-scale CFB boilers, including furnace profile measurements, are carried out simultaneously with the development of 3-dimensional process modeling, which provides a chain of knowledge that is utilized as feedback for phenomenon research. Knowledge gathered through model validation studies and up-to-date parameter databases is utilized in performance prediction and design development of CFB boiler furnaces. This paper reports recent development steps related to modeling of combustion and formation of char and volatiles for various fuel types in CFB conditions. A new model for predicting the formation of nitrogen oxides is also presented. Validation of mixing and combustion parameters for solids and gases is based on test balances at several large-scale CFB boilers combusting coal, peat and bio-fuels. Field tests, including lateral and vertical furnace profile measurements and characterization of solid materials, provide a window for characterizing fuel-specific mixing and combustion behavior in the CFB furnace at different loads and operating conditions. Measured horizontal gas profiles are a projection of the balance between fuel mixing and reactions in the lower part of the furnace and are used, together with lateral temperature profiles at the bed and in the upper parts of the furnace, to determine solid mixing and combustion model parameters. Modeling of char- and volatile-based formation of NO profiles is followed by analysis of the oxidizing and reducing regions formed by the lower furnace design and the mixing characteristics of fuel and combustion air, which affect the NO furnace profile through reduction and volatile-nitrogen reactions. This paper presents

  2. 3-Dimensional wireless sensor network localization: A review

    NASA Astrophysics Data System (ADS)

    Najib, Yasmeen Nadhirah Ahmad; Daud, Hanita; Aziz, Azrina Abd; Razali, Radzuan

    2016-11-01

    The proliferation of wireless sensor networks (WSN) has shifted the focus from 2-dimensional to 3-dimensional geometry. Since the exact location of sensors is a fundamental issue in wireless sensor networks, node localization is essential for any wireless sensor network application. Most algorithms focus on 2-dimensional geometry, and applying them to 3-dimensional geometry decreases accuracy. The low-rank attribute of WSN node estimation makes nuclear norm minimization a viable solution for the dimensionality reduction problem. This research proposes a novel localization algorithm for 3-dimensional WSNs based on nuclear norm minimization. The node localization is formulated via a Euclidean Distance Matrix (EDM) and is then optimized using Nuclear-Norm Minimization (NNM).
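
    In the noiseless, fully observed special case, recovering coordinates from an EDM is classical multidimensional scaling; the nuclear-norm step generalizes this to incomplete or noisy distance data. A Python sketch of the classical-MDS core (illustrative only, not the proposed algorithm itself):

        import numpy as np

        def classical_mds(D, dim=3):
            # Coordinates (up to rigid motion) from a full, noiseless EDM:
            # double-center the squared distances, then eigendecompose.
            n = D.shape[0]
            J = np.eye(n) - np.ones((n, n)) / n
            B = -0.5 * J @ (D ** 2) @ J              # Gram matrix
            w, V = np.linalg.eigh(B)
            idx = np.argsort(w)[::-1][:dim]          # top eigenpairs
            return V[:, idx] * np.sqrt(np.maximum(w[idx], 0.0))

        rng = np.random.default_rng(2)
        X = rng.uniform(0.0, 100.0, size=(20, 3))    # true 3-D sensor positions
        D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=-1)

        X_hat = classical_mds(D)
        D_hat = np.linalg.norm(X_hat[:, None, :] - X_hat[None, :, :], axis=-1)
        print("max pairwise-distance error:", np.abs(D - D_hat).max())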

  3. Behavior Computation for Smart Grid Software Analysis

    SciTech Connect

    Linger, Richard C; Pleszkoch, Mark G; Prowell, Stacy J; Sayre, Kirk D

    2011-01-01

    Smart grid embedded software is subject to intrusion and compromise with potentially serious consequences. Current methods of cybersecurity analysis are increasingly challenged by the scope of the problem. Oak Ridge National Laboratory (ORNL) is pioneering the new technology of software behavior computation to help address these risks. Software behavior computation and its instantiation in Function eXtraction (FX) systems apply mathematical foundations of denotational semantics to compute the behavior of software in all circumstances of use. Research has shown how to make the effects of recursion-theoretic limitations on this process arbitrarily small. Behavior computation operates on the functional semantics of programs, and is not subject to the limitations of syntactic recognition or testing. ORNL is applying FX technology to help evaluate cyber security properties in smart grid systems, with initial focus on vulnerabilities in embedded software that controls smart meters.

  4. Computer-assisted qualitative data analysis software.

    PubMed

    Cope, Diane G

    2014-05-01

    Advances in technology have provided new approaches for data collection methods and analysis for researchers. Data collection is no longer limited to paper-and-pencil format, and numerous methods are now available through Internet and electronic resources. With these techniques, researchers are not burdened with entering data manually and data analysis is facilitated by software programs. Quantitative research is supported by the use of computer software and provides ease in the management of large data sets and rapid analysis of numeric statistical methods. New technologies are emerging to support qualitative research with the availability of computer-assisted qualitative data analysis software (CAQDAS). CAQDAS will be presented with a discussion of advantages, limitations, controversial issues, and recommendations for this type of software use.

  5. COMPUTER ANALYSIS OF PLANAR GAMMA CAMERA IMAGES

    EPA Science Inventory




    T Martonen1 and J Schroeter2

    1Experimental Toxicology Division, National Health and Environmental Effects Research Laboratory, U.S. EPA, Research Triangle Park, NC 27711 USA and 2Curriculum in Toxicology, Unive...

  6. Final Report Computational Analysis of Dynamical Systems

    SciTech Connect

    Guckenheimer, John

    2012-05-08

    This is the final report for DOE Grant DE-FG02-93ER25164, initiated in 1993. This grant supported research of John Guckenheimer on computational analysis of dynamical systems. During that period, seventeen individuals received PhD degrees under the supervision of Guckenheimer and over fifty publications related to the grant were produced. This document contains copies of these publications.

  7. The 3-dimensional construction of the Rae craton, central Canada

    NASA Astrophysics Data System (ADS)

    Snyder, David B.; Craven, James A.; Pilkington, Mark; Hillier, Michael J.

    2015-10-01

    Reconstruction of the 3-dimensional tectonic assembly of early continents, first as Archean cratons and then Proterozoic shields, remains poorly understood. In this paper, all readily available geophysical and geochemical data are assembled in a 3-D model with the most accurate bedrock geology in order to understand better the geometry of major structures within the Rae craton of central Canada. Analysis of geophysical observations of gravity and seismic wave speed variations revealed several lithospheric-scale discontinuities in physical properties. Where these discontinuities project upward to correlate with mapped upper crustal geological structures, the discontinuities can be interpreted as shear zones. Radiometric dating of xenoliths provides estimates of rock types and ages at depth beneath sparse kimberlite occurrences. These ages can also be correlated to surface rocks. The 3.6-2.6 Ga Rae craton comprises at least three smaller continental terranes, which "cratonized" during a granitic bloom. Cratonization probably represents final differentiation of early crust into a relatively homogeneous, uniformly thin (35-42 km), tonalite-trondhjemite-granodiorite crust with pyroxenite layers near the Moho. The peak thermotectonic event at 1.86-1.7 Ga was associated with the Hudsonian orogeny that assembled several cratons and lesser continental blocks into the Canadian Shield using a number of southeast-dipping megathrusts. This orogeny metasomatized, mineralized, and recrystallized mantle and lower crustal rocks, apparently making them more conductive by introducing or concentrating sulfides or graphite. Little evidence exists of thin slabs similar to modern oceanic lithosphere in this Precambrian construction history whereas underthrusting and wedging of continental lithosphere is inferred from multiple dipping discontinuities.

  8. Computational approaches to fMRI analysis.

    PubMed

    Cohen, Jonathan D; Daw, Nathaniel; Engelhardt, Barbara; Hasson, Uri; Li, Kai; Niv, Yael; Norman, Kenneth A; Pillow, Jonathan; Ramadge, Peter J; Turk-Browne, Nicholas B; Willke, Theodore L

    2017-02-23

    Analysis methods in cognitive neuroscience have not always matched the richness of fMRI data. Early methods focused on estimating neural activity within individual voxels or regions, averaged over trials or blocks and modeled separately in each participant. This approach mostly neglected the distributed nature of neural representations over voxels, the continuous dynamics of neural activity during tasks, the statistical benefits of performing joint inference over multiple participants and the value of using predictive models to constrain analysis. Several recent exploratory and theory-driven methods have begun to pursue these opportunities. These methods highlight the importance of computational techniques in fMRI analysis, especially machine learning, algorithmic optimization and parallel computing. Adoption of these techniques is enabling a new generation of experiments and analyses that could transform our understanding of some of the most complex, and distinctly human, signals in the brain: acts of cognition such as thoughts, intentions and memories.

  9. Analysis of dissection algorithms for vector computers

    NASA Technical Reports Server (NTRS)

    George, A.; Poole, W. G., Jr.; Voigt, R. G.

    1978-01-01

    Recently two dissection algorithms (one-way and incomplete nested dissection) have been developed for solving the sparse positive definite linear systems arising from n by n grid problems. Concurrently, vector computers (such as the CDC STAR-100 and TI ASC) have been developed for large scientific applications. An analysis of the use of dissection algorithms on vector computers dictates that vectors of maximum length be utilized, thereby implying little or no dissection; on the other hand, minimizing operation counts suggests that considerable dissection be performed. In this paper we discuss the resolution of this conflict by minimizing the total time required by vectorized versions of the two algorithms.

  10. Differential Cross Section Kinematics for 3-dimensional Transport Codes

    NASA Technical Reports Server (NTRS)

    Norbury, John W.; Dick, Frank

    2008-01-01

    In support of the development of 3-dimensional transport codes, this paper derives the relevant relativistic particle kinematic theory. Formulas are given for invariant, spectral and angular distributions in both the lab (spacecraft) and center of momentum frames, for collisions involving 2-, 3- and n-body final states.
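
    As a pointer to the content, a representative subset of such relations (standard relativistic kinematics in natural units c = 1, for a boost with speed \beta along the beam axis, \gamma = (1-\beta^2)^{-1/2}, starred quantities in the center of momentum frame) is

        E^{*} = \gamma\,(E - \beta\,p\cos\theta), \qquad p^{*}\cos\theta^{*} = \gamma\,(p\cos\theta - \beta E), \qquad p^{*}\sin\theta^{*} = p\sin\theta

        E\,\frac{d^{3}\sigma}{dp^{3}} = E^{*}\,\frac{d^{3}\sigma}{dp^{*3}} \qquad \text{(Lorentz-invariant distribution)}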

  11. Reflection of solar wind protons on the Martian bow shock: Investigations by means of 3-dimensional simulations

    NASA Astrophysics Data System (ADS)

    Richer, E.; Chanteur, G. M.; Modolo, R.; Dubinin, E.

    2012-09-01

    The reflection of solar wind protons on the Martian bow shock (BS) is investigated by means of three-dimensional simulation models. A two-step approach is adopted to allow a detailed analysis of the reflected population. Firstly, the 3-dimensional hybrid model of Modolo et al. (2005) is used to compute a stationary state of the interaction of the solar wind (SW) with Mars. Secondly, the motion of test particles is followed in the electromagnetic field computed by the hybrid simulation, while detection criteria defined to identify reflected protons are applied. This study demonstrates some effects of the large curvature of a planetary BS on the structure of the foreshock. Reflected protons encounter the BS in a region encompassing parts of the quasi-perpendicular and quasi-parallel shocks, and exit the shock mainly from the quasi-parallel region. The energy spectrum of all reflected protons extends from 0 to almost 15 keV. A virtual omnidirectional detector (VOD) is used to compute the local omnidirectional flux of reflected protons at various locations upstream of the BS. Spatial variations of this omnidirectional flux indicate the location and spatial extent of the proton foreshock and demonstrate its shift, increasing with the distance downstream, in the direction opposite to the motional electric field of the SW. Local energy spectra computed from the VOD observations demonstrate the existence of an energy gradient along the direction of the convection electric field.
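
    The second step of such a study, pushing test protons through stored fields, is commonly done with the Boris algorithm, which exactly preserves particle speed during the magnetic rotation. The Python sketch below uses uniform stand-in fields for brevity and is not the authors' code; a real analysis would interpolate the hybrid simulation's E and B to each particle position:

        import numpy as np

        def boris_push(x, v, E, B, dt, qm):
            # One Boris step: half electric kick, magnetic rotation, half kick.
            v_minus = v + 0.5 * qm * dt * E(x)
            t = 0.5 * qm * dt * B(x)
            s = 2.0 * t / (1.0 + np.dot(t, t))
            v_plus = v_minus + np.cross(v_minus + np.cross(v_minus, t), s)
            v_new = v_plus + 0.5 * qm * dt * E(x)
            return x + dt * v_new, v_new

        # Uniform stand-in fields, roughly IMF-scale magnitude in tesla.
        E = lambda x: np.zeros(3)
        B = lambda x: np.array([0.0, 0.0, 5e-9])
        qm = 1.602e-19 / 1.673e-27                   # proton q/m, C/kg

        x = np.zeros(3)
        v = np.array([4.0e5, 0.0, 0.0])              # 400 km/s solar wind proton
        dt = 0.01 / (qm * 5e-9)                      # small fraction of a gyroperiod
        for _ in range(1000):
            x, v = boris_push(x, v, E, B, dt, qm)

        print("speed after 1000 steps (should remain 4.0e5):", np.linalg.norm(v))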

  12. Computational strategies for tire monitoring and analysis

    NASA Technical Reports Server (NTRS)

    Danielson, Kent T.; Noor, Ahmed K.; Green, James S.

    1995-01-01

    Computational strategies are presented for the modeling and analysis of tires in contact with pavement. A procedure is introduced for simple and accurate determination of tire cross-sectional geometric characteristics from a digitally scanned image. Three new strategies for reducing the computational effort in the finite element solution of tire-pavement contact are also presented. These strategies take advantage of the observation that footprint loads do not usually stimulate a significant tire response away from the pavement contact region. The finite element strategies differ in their level of approximation and required amount of computer resources. The effectiveness of the strategies is demonstrated by numerical examples of frictionless and frictional contact of the space shuttle Orbiter nose-gear tire. Both an in-house research code and a commercial finite element code are used in the numerical studies.

  13. Computational analysis of forebody tangential slot blowing

    NASA Technical Reports Server (NTRS)

    Gee, Ken; Agosta-Greenman, Roxana M.; Rizk, Yehia M.; Schiff, Lewis B.; Cummings, Russell M.

    1994-01-01

    An overview of the computational effort to analyze forebody tangential slot blowing is presented. Tangential slot blowing generates side force and yawing moment which may be used to control an aircraft flying at high angles of attack. Two different geometries are used in the analysis: (1) The High Alpha Research Vehicle; and (2) a generic chined forebody. Computations using the isolated F/A-18 forebody are obtained at full-scale wind tunnel test conditions for direct comparison with available experimental data. The effects of over- and under-blowing on force and moment production are analyzed. Time-accurate solutions using the isolated forebody are obtained to study the force onset timelag of tangential slot blowing. Computations using the generic chined forebody are obtained at experimental wind tunnel conditions, and the results compared with available experimental data. This computational analysis complements the experimental results and provides a detailed understanding of the effects of tangential slot blowing on the flow field about simple and complex geometries.

  14. Computational stability analysis of dynamical systems

    NASA Astrophysics Data System (ADS)

    Nikishkov, Yuri Gennadievich

    2000-10-01

    Due to increased available computer power, the analysis of nonlinear flexible multi-body systems, fixed-wing aircraft and rotary-wing vehicles is relying on increasingly complex, large scale models. An important aspect of the dynamic response of flexible multi-body systems is the potential presence of instabilities. Stability analysis is typically performed on simplified models with the smallest number of degrees of freedom required to capture the physical phenomena that cause the instability. The system stability boundaries are then evaluated using the characteristic exponent method or Floquet theory for systems with constant or periodic coefficients, respectively. As the number of degrees of freedom used to represent the system increases, these methods become increasingly cumbersome, and quickly unmanageable. In this work, a novel approach is proposed, the Implicit Floquet Analysis, which evaluates the largest eigenvalues of the transition matrix using the Arnoldi algorithm, without the explicit computation of this matrix. This method is far more computationally efficient than the classical approach and is ideally suited for systems involving a large number of degrees of freedom. The proposed approach is conveniently implemented as a postprocessing step to any existing simulation tool. The application of the method to a geometrically nonlinear multi-body dynamics code is presented. This work also focuses on the implementation of trimming algorithms and the development of tools for the graphical representation of numerical simulations and stability information for multi-body systems.
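
    The key trick, extracting the largest eigenvalues of the transition (monodromy) matrix from matrix-vector products alone, maps directly onto matrix-free Arnoldi solvers. A Python sketch under stated assumptions (a toy chain of parametrically forced oscillators stands in for a multi-body model; the thesis implementation lives in its own simulation code):

        import numpy as np
        from scipy.integrate import solve_ivp
        from scipy.sparse.linalg import LinearOperator, eigs

        # Toy stand-in model: a chain of 10 parametrically forced, damped
        # oscillators with period-2*pi stiffness modulation (state dim 20).
        n, T = 10, 2.0 * np.pi
        K = -2.0 * np.eye(n) + np.eye(n, k=1) + np.eye(n, k=-1)

        def rhs(t, y):
            q, qdot = y[:n], y[n:]
            stiffness = 1.0 + 0.3 * np.cos(t)        # periodic coefficient
            return np.concatenate([qdot, stiffness * (K @ q) - 0.05 * qdot])

        def monodromy_matvec(y0):
            # Phi @ y0 without ever forming Phi: integrate one period
            sol = solve_ivp(rhs, (0.0, T), np.real(y0), rtol=1e-9, atol=1e-11)
            return sol.y[:, -1]

        Phi = LinearOperator((2 * n, 2 * n), matvec=monodromy_matvec, dtype=float)
        mu = eigs(Phi, k=4, which="LM", return_eigenvectors=False)
        print("largest Floquet multipliers:", mu)
        print("system stable:", bool(np.all(np.abs(mu) < 1.0)))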

  15. Computer analysis of HIV epitope sequences

    SciTech Connect

    Gupta, G.; Myers, G.

    1990-01-01

    Phylogenetic tree analyses provide us with important general information regarding the extent and rate of HIV variation. Currently we are attempting to extend computer analysis and modeling to the V3 loop of the type 2 virus and its simian homologues, especially in light of the prominent role the latter will play in animal model studies. Moreover, it might be possible to attack the slightly similar V4 loop by this approach. However, the strategy relies very heavily upon "natural" information and constraints, thus severe limitations exist upon its general applicability, in addition to uncertainties with regard to long-range residue interactions. 5 refs., 3 figs.

  16. 76 FR 60939 - Metal Fatigue Analysis Performed by Computer Software

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-30

    ... COMMISSION Metal Fatigue Analysis Performed by Computer Software AGENCY: Nuclear Regulatory Commission... applicants' analyses and methodologies using the computer software package, WESTEMS TM , to demonstrate... by Computer Software Addressees All holders of, and applicants for, a power reactor operating...

  17. Wetting characteristics of 3-dimensional nanostructured fractal surfaces

    NASA Astrophysics Data System (ADS)

    Davis, Ethan; Liu, Ying; Jiang, Lijia; Lu, Yongfeng; Ndao, Sidy

    2017-01-01

    This article reports the fabrication and wetting characteristics of 3-dimensional nanostructured fractal surfaces (3DNFS). Three distinct 3DNFS surfaces, namely cubic, Romanesco broccoli, and sphereflake were fabricated using two-photon direct laser writing. Contact angle measurements were performed on the multiscale fractal surfaces to characterize their wetting properties. Average contact angles ranged from 66.8° for the smooth control surface to 0° for one of the fractal surfaces. The change in wetting behavior was attributed to modification of the interfacial surface properties due to the inclusion of 3-dimensional hierarchical fractal nanostructures. However, this behavior does not exactly obey existing surface wetting models in the literature. Potential applications for these types of surfaces in physical and biological sciences are also discussed.
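
    For context, the "existing surface wetting models" against which such data are usually compared are the Wenzel and Cassie-Baxter relations (standard forms, not taken from this article):

        \cos\theta^{*} = r\,\cos\theta \qquad \text{(Wenzel, } r \text{ = roughness ratio)}

        \cos\theta^{*} = f_{s}\,(\cos\theta + 1) - 1 \qquad \text{(Cassie-Baxter, } f_{s} \text{ = wetted solid fraction)}

    where \theta is the intrinsic contact angle on the smooth material and \theta^{*} the apparent angle on the structured surface.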

  18. 3-dimensional (3D) fabricated polymer based drug delivery systems.

    PubMed

    Moulton, Simon E; Wallace, Gordon G

    2014-11-10

    Drug delivery from 3-dimensional (3D) structures is a rapidly growing area of research. It is essential to achieve structures wherein drug stability is ensured, the drug loading capacity is appropriate and the desired controlled release profile can be attained. Attention must also be paid to the development of appropriate fabrication machinery that allows 3D drug delivery systems (DDS) to be produced in a simple, reliable and reproducible manner. The range of fabrication methods currently being used to form 3D DDSs include electrospinning (solution and melt), wet-spinning and printing (3-dimensional). The use of these techniques enables production of DDSs from the macro-scale down to the nano-scale. This article reviews progress in these fabrication techniques to form DDSs that possess desirable drug delivery kinetics for a wide range of applications.

  19. Computational analysis of aircraft pressure relief doors

    NASA Astrophysics Data System (ADS)

    Schott, Tyler

    Modern trends in commercial aircraft design have sought to improve fuel efficiency while reducing emissions by operating at higher pressures and temperatures than ever before. Consequently, greater demands are placed on the auxiliary bleed air systems used for a multitude of aircraft operations. The increased role of bleed air systems poses significant challenges for the pressure relief system to ensure the safe and reliable operation of the aircraft. The core compartment pressure relief door (PRD) is an essential component of the pressure relief system which functions to relieve internal pressure in the core casing of a high-bypass turbofan engine during a burst duct over-pressurization event. The successful modeling and analysis of a burst duct event are imperative to the design and development of PRD's to ensure that they will meet the increased demands placed on the pressure relief system. Leveraging high-performance computing coupled with advances in computational analysis, this thesis focuses on a comprehensive computational fluid dynamics (CFD) study to characterize turbulent flow dynamics and quantify the performance of a core compartment PRD across a range of operating conditions and geometric configurations. The CFD analysis was based on a compressible, steady-state, three-dimensional, Reynolds-averaged Navier-Stokes approach. Simulations were analyzed, and results show that variations in freestream conditions, plenum environment, and geometric configurations have a non-linear impact on the discharge, moment, thrust, and surface temperature characteristics. The CFD study revealed that the underlying physics for this behavior is explained by the interaction of vortices, jets, and shockwaves. This thesis research is innovative and provides a comprehensive and detailed analysis of existing and novel PRD geometries over a range of realistic operating conditions representative of a burst duct over-pressurization event. Further, the study provides aircraft

  20. Reconstructing a 3-dimensional image of the results of antinuclear antibody testing by indirect immunofluorescence.

    PubMed

    Murai, Ryosei; Yamada, Koji; Tanaka, Maki; Kuribayashi, Kageaki; Kobayashi, Daisuke; Tsuji, Naoki; Watanabe, Naoki

    2013-01-31

    Indirect immunofluorescence anti-nuclear antibody testing (IIF-ANAT) is an essential screening tool in the diagnosis of various autoimmune disorders. ANA titer quantification and interpretation of immunofluorescence patterns are determined subjectively, which is problematic. First, we determined the examination conditions under which IIF-ANAT fluorescence intensities are quantified. Next, IIF-ANAT was performed using homogeneous, discrete speckled, and mixed serum samples. Images were obtained using Bio Zero BZ-8000, and 3-dimensional images were reconstructed using the BZ analyzer software. In the 2-dimensional analysis, homogeneous ANAs hid the discrete speckled pattern, resulting in a diagnosis of homogeneous immunofluorescence. However, 3-dimensional analysis of the same sample showed discrete speckled-type ANA in the homogeneous background. This study strengthened the current IIF-ANAT method by providing a new approach to quantify the fluorescence intensity and enhance the resolution of IIF-ANAT fluorescence patterns. Reconstructed 3-dimensional imaging of IIF-ANAT can be a powerful tool for routine laboratory examination.

  1. Multimodality imaging of intrauterine devices with an emphasis on the emerging role of 3-dimensional ultrasound.

    PubMed

    Reiner, Jeffrey S; Brindle, Kathleen A; Khati, Nadia Juliet

    2012-12-01

    The intrauterine contraceptive device (IUD) is one of the most widely used reversible contraception methods throughout the world. With advancing technology, it has rapidly gained acceptance through its increased effectiveness and practicality compared with more invasive means such as laparoscopic tubal ligation. This pictorial essay will present the IUDs most commonly used today. It will illustrate both normal and abnormal positions of IUDs across all cross-sectional imaging modalities including 2-dimensional ultrasound, computed tomography, and magnetic resonance imaging, with a focus on the emerging role of 3-dimensional ultrasound as the modality of choice.

  2. FORTRAN computer program for seismic risk analysis

    USGS Publications Warehouse

    McGuire, Robin K.

    1976-01-01

    A program for seismic risk analysis is described which combines generality of application, efficiency and accuracy of operation, and the advantage of small storage requirements. The theoretical basis for the program is first reviewed, and the computational algorithms used to apply this theory are described. The information required for running the program is listed. Published attenuation functions describing the variation with earthquake magnitude and distance of expected values for various ground motion parameters are summarized for reference by the program user. Finally, suggestions for use of the program are made, an example problem is described (along with example problem input and output) and the program is listed.
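
    The theoretical basis reviewed in the report is what is now called the Cornell-McGuire formulation of probabilistic seismic hazard; in a commonly quoted form (a summary, not the program's exact notation), the annual rate at which a ground motion parameter A exceeds a level a at the site is

        \lambda(A > a) = \sum_{i} \nu_{i} \int\!\!\int P\,[\,A > a \mid m, r\,]\; f_{M_i}(m)\, f_{R_i}(r)\; \mathrm{d}r\, \mathrm{d}m

    where \nu_i is the activity rate of seismic source i, f_{M_i} and f_{R_i} are its magnitude and distance densities, and P[A > a | m, r] is obtained from the attenuation function.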

  3. Computer network environment planning and analysis

    NASA Technical Reports Server (NTRS)

    Dalphin, John F.

    1989-01-01

    The GSFC Computer Network Environment provides a broadband RF cable between campus buildings and ethernet spines in buildings for the interlinking of Local Area Networks (LANs). This system provides terminal and computer linkage among host and user systems thereby providing E-mail services, file exchange capability, and certain distributed computing opportunities. The Environment is designed to be transparent and supports multiple protocols. Networking at Goddard has a short history and has been under coordinated control of a Network Steering Committee for slightly more than two years; network growth has been rapid with more than 1500 nodes currently addressed and greater expansion expected. A new RF cable system with a different topology is being installed during summer 1989; consideration of a fiber optics system for the future will begin soon. Summer study was directed toward Network Steering Committee operation and planning plus consideration of Center Network Environment analysis and modeling. Biweekly Steering Committee meetings were attended to learn the background of the network and the concerns of those managing it. Suggestions for historical data gathering have been made to support future planning and modeling. Data Systems Dynamic Simulator, a simulation package developed at NASA and maintained at GSFC was studied as a possible modeling tool for the network environment. A modeling concept based on a hierarchical model was hypothesized for further development. Such a model would allow input of newly updated parameters and would provide an estimation of the behavior of the network.

  4. Good relationships between computational image analysis and radiological physics

    NASA Astrophysics Data System (ADS)

    Arimura, Hidetaka; Kamezawa, Hidemi; Jin, Ze; Nakamoto, Takahiro; Soufi, Mazen

    2015-09-01

    Good relationships between computational image analysis and radiological physics have been constructed for increasing the accuracy of medical diagnostic imaging and radiation therapy in radiological physics. Computational image analysis has been established based on applied mathematics, physics, and engineering. This review paper will introduce how computational image analysis is useful in radiation therapy with respect to radiological physics.

  5. Good relationships between computational image analysis and radiological physics

    SciTech Connect

    Arimura, Hidetaka; Kamezawa, Hidemi; Jin, Ze; Nakamoto, Takahiro; Soufi, Mazen

    2015-09-30

    Good relationships between computational image analysis and radiological physics have been constructed for increasing the accuracy of medical diagnostic imaging and radiation therapy in radiological physics. Computational image analysis has been established based on applied mathematics, physics, and engineering. This review paper will introduce how computational image analysis is useful in radiation therapy with respect to radiological physics.

  6. Experimental analysis of computer system dependability

    NASA Technical Reports Server (NTRS)

    Iyer, Ravishankar, K.; Tang, Dong

    1993-01-01

    This paper reviews an area which has evolved over the past 15 years: experimental analysis of computer system dependability. Methodologies and advances are discussed for three basic approaches used in the area: simulated fault injection, physical fault injection, and measurement-based analysis. The three approaches are suited, respectively, to dependability evaluation in the three phases of a system's life: design phase, prototype phase, and operational phase. Before the discussion of these phases, several statistical techniques used in the area are introduced. For each phase, a classification of research methods or study topics is outlined, followed by discussion of these methods or topics as well as representative studies. The statistical techniques introduced include the estimation of parameters and confidence intervals, probability distribution characterization, and several multivariate analysis methods. Importance sampling, a statistical technique used to accelerate Monte Carlo simulation, is also introduced. The discussion of simulated fault injection covers electrical-level, logic-level, and function-level fault injection methods as well as representative simulation environments such as FOCUS and DEPEND. The discussion of physical fault injection covers hardware, software, and radiation fault injection methods as well as several software and hybrid tools including FIAT, FERARI, HYBRID, and FINE. The discussion of measurement-based analysis covers measurement and data processing techniques, basic error characterization, dependency analysis, Markov reward modeling, software dependability, and fault diagnosis. The discussion involves several important issues studied in the area, including fault models, fast simulation techniques, workload/failure dependency, correlated failures, and software fault tolerance.
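
    Of the statistical techniques mentioned, importance sampling is easy to illustrate in a few lines. The toy problem below (estimating the probability that a standard normal "load" exceeds a high threshold) is ours, not the paper's; it shows how sampling from a shifted proposal distribution and reweighting by the likelihood ratio makes a rare failure event tractable for Monte Carlo.

      import numpy as np

      rng = np.random.default_rng(0)
      THRESHOLD = 5.0   # toy failure criterion: standard normal load exceeds 5

      # Plain Monte Carlo: almost no samples fall in the failure region.
      x = rng.standard_normal(100_000)
      print("plain MC estimate:", np.mean(x > THRESHOLD))

      # Importance sampling: draw from a normal shifted into the failure region
      # and reweight each sample by the likelihood ratio p(x)/q(x).
      x_is = rng.normal(loc=THRESHOLD, size=100_000)
      log_w = (-0.5 * x_is**2) - (-0.5 * (x_is - THRESHOLD) ** 2)
      weights = np.exp(log_w)
      print("IS estimate:", np.mean((x_is > THRESHOLD) * weights))
      # True value for reference: scipy.stats.norm.sf(5.0) is about 2.87e-7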

  7. Analysis of Ventricular Function by Computed Tomography

    PubMed Central

    Rizvi, Asim; Deaño, Roderick C.; Bachman, Daniel P.; Xiong, Guanglei; Min, James K.; Truong, Quynh A.

    2014-01-01

    The assessment of ventricular function, cardiac chamber dimensions and ventricular mass is fundamental for clinical diagnosis, risk assessment, therapeutic decisions, and prognosis in patients with cardiac disease. Although cardiac computed tomography (CT) is a noninvasive imaging technique often used for the assessment of coronary artery disease, it can also be utilized to obtain important data about left and right ventricular function and morphology. In this review, we will discuss the clinical indications for the use of cardiac CT for ventricular analysis, review the evidence on the assessment of ventricular function compared to existing imaging modalities such as cardiac MRI and echocardiography, provide a typical cardiac CT protocol for image acquisition and post-processing for ventricular analysis, and provide step-by-step instructions to acquire multiplanar cardiac views for ventricular assessment from the standard axial, coronal, and sagittal planes. Furthermore, both qualitative and quantitative assessments of ventricular function as well as sample reporting are detailed. PMID:25576407

  8. Computational Analysis of Lung Deformation after Murine Pneumonectomy

    PubMed Central

    Filipovic, Nenad; Gibney, Barry C.; Nikolic, Dalibor; Konerding, Moritz A.; Mentzer, Steven J.; Tsuda, Akira

    2012-01-01

    In many mammalian species, the removal of one lung (pneumonectomy) is associated with the compensatory growth of the remaining lung. To investigate the hypothesis that parenchymal deformation may trigger lung regeneration, we used microCT scanning to create 3-dimensional finite element geometric models of the murine lung pre- and post-pneumonectomy (24 hours). The structural correspondence between models was established using anatomic landmarks and an iterative computational algorithm. When compared with the pre-pneumonectomy lung, the post-pneumonectomy models demonstrated significant translation and rotation of the cardiac lobe into the post-pneumonectomy pleural space. 2-dimensional maps of lung deformation demonstrated significant heterogeneity; the areas of greatest deformation were present in the subpleural regions of the lobe. Consistent with previously identified growth patterns, subpleural regions of enhanced deformation are compatible with a mechanical signal, likely involving parenchymal stretch, triggering lung growth. PMID:22978574
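
    The abstract does not spell out the correspondence algorithm itself. As a hedged illustration, one standard building block for landmark-based registration of this kind is a rigid least-squares (Kabsch/Procrustes) alignment of paired landmarks, sketched here with synthetic points; this is not the authors' code.

      import numpy as np

      def rigid_align(src, dst):
          # Kabsch algorithm: least-squares rotation R and translation t
          # mapping paired landmark sets src -> dst (both (n, 3) arrays).
          mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
          H = (src - mu_s).T @ (dst - mu_d)        # 3x3 cross-covariance
          U, _, Vt = np.linalg.svd(H)
          D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
          R = Vt.T @ D @ U.T                       # proper rotation (det = +1)
          t = mu_d - R @ mu_s
          return R, t

      # Toy check: recover a known rotation + shift of six landmarks.
      rng = np.random.default_rng(1)
      pre = rng.random((6, 3))
      theta = np.deg2rad(20.0)
      R_true = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                         [np.sin(theta),  np.cos(theta), 0.0],
                         [0.0, 0.0, 1.0]])
      post = pre @ R_true.T + np.array([0.5, -0.2, 0.1])
      R, t = rigid_align(pre, post)
      print(np.allclose(pre @ R.T + t, post))      # True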

  9. 3-dimensional electronic structures of CaC6

    NASA Astrophysics Data System (ADS)

    Kyung, Wonshik; Kim, Yeongkwan; Han, Garam; Leem, Choonshik; Kim, Junsung; Kim, Yeongwook; Kim, Keunsu; Rotenberg, Eli; Kim, Changyoung; Postech Collaboration; Advanced Light Source Collaboration; Yonsei University Team

    2014-03-01

    Issues remain concerning the origin of superconductivity in graphite intercalation compounds (GICs), especially CaC6 because of its transition temperature, which is relatively high compared with other GICs. There are two competing theories on where the superconductivity occurs in this material: in the intercalant metal layer or in the charge-doped graphene layer. To elucidate this issue, it is necessary to confirm the existence of an intercalant-derived band. We therefore performed 3-dimensional electronic structure studies with ARPES to search for a 3-dimensionally dispersive intercalant band. We could not observe such a band; instead, we observed a 3-dimensionally dispersive carbon band. This supports the charge-doped graphene picture of the superconductivity over the intercalant-driven one.

  10. The 3-dimensional cellular automata for HIV infection

    NASA Astrophysics Data System (ADS)

    Mo, Youbin; Ren, Bin; Yang, Wencao; Shuai, Jianwei

    2014-04-01

    The HIV infection dynamics is discussed in detail with a 3-dimensional cellular automata model in this paper. The model can reproduce the three-phase development, i.e., the acute period, the asymptotic period, and the AIDS period, observed in HIV-infected patients in the clinic. We show that the 3D HIV model is more robust with respect to the model parameters than the 2D cellular automata. Furthermore, we reveal that the occurrence of a perpetual source, which successively generates infectious waves that spread through the whole system, drives the model from the asymptotic state to the AIDS state.
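
    For readers unfamiliar with the approach, the following sketch shows the skeleton of a 3-dimensional cellular automaton for infection spread. The states and update rule are deliberately simplified placeholders (the paper's model distinguishes more cell states and uses clinically motivated rules), and the boundaries here are periodic via np.roll.

      import numpy as np

      # States: 0 = healthy, 1 = infected, 2 = dead. Toy rule: a healthy cell
      # becomes infected if any of its 6 face neighbors is infected; infected
      # cells die after one step.
      N = 20
      grid = np.zeros((N, N, N), dtype=np.int8)
      grid[N // 2, N // 2, N // 2] = 1           # seed one infected cell

      def step(g):
          infected = (g == 1)
          neighbor_infected = np.zeros_like(g, dtype=bool)
          for axis in range(3):                  # 6-connected neighborhood
              for shift in (1, -1):
                  neighbor_infected |= np.roll(infected, shift, axis=axis)
          new = g.copy()
          new[(g == 0) & neighbor_infected] = 1  # healthy -> infected
          new[infected] = 2                      # infected -> dead
          return new

      for t in range(10):
          grid = step(grid)
      print("infected:", int((grid == 1).sum()), "dead:", int((grid == 2).sum()))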

  11. A 3-dimensional finite-difference method for calculating the dynamic coefficients of seals

    NASA Technical Reports Server (NTRS)

    Dietzen, F. J.; Nordmann, R.

    1989-01-01

    A method to calculate the dynamic coefficients of seals with arbitrary geometry is presented. The Navier-Stokes equations are used in conjunction with the k-ε turbulence model to describe the turbulent flow. These equations are solved by a full 3-dimensional finite-difference procedure instead of the normally used perturbation analysis. The time dependence of the equations is introduced by working with a coordinate system rotating with the precession frequency of the shaft. The results of this theory are compared with coefficients calculated by a perturbation analysis and with experimental results.

  12. Children and Computer Technology: Analysis and Recommendations.

    ERIC Educational Resources Information Center

    Shields, Margie K.; Behrman, Richard E.

    2000-01-01

    Examines how computer use affects children's development, disparities between rich and poor, and how computers enhance learning, noting risks and benefits. Recommendations to improve computer access and use at home and school include: researchers must study the effects of extended computer use on child development, and parents should limit the…

  13. Automated feature extraction for 3-dimensional point clouds

    NASA Astrophysics Data System (ADS)

    Magruder, Lori A.; Leigh, Holly W.; Soderlund, Alexander; Clymer, Bradley; Baer, Jessica; Neuenschwander, Amy L.

    2016-05-01

    Light detection and ranging (LIDAR) technology offers the capability to rapidly capture high-resolution, 3-dimensional surface data with centimeter-level accuracy for a large variety of applications. Due to the foliage-penetrating properties of LIDAR systems, these geospatial data sets can detect ground surfaces beneath trees, enabling the production of high-fidelity bare earth elevation models. Precise characterization of the ground surface allows for identification of terrain and non-terrain points within the point cloud, and facilitates further discernment between natural and man-made objects based solely on structural aspects and relative neighboring parameterizations. A framework is presented here for automated extraction of natural and man-made features that does not rely on coincident ortho-imagery or point RGB attributes. The TEXAS (Terrain EXtraction And Segmentation) algorithm is used first to generate a bare earth surface from a LIDAR survey, which is then used to classify points as terrain or non-terrain. Further classifications are assigned at the point level by leveraging local spatial information. Similarly classed points are then clustered together into regions to identify individual features. Descriptions of the spatial attributes of each region are generated, resulting in the identification of individual tree locations, forest extents, building footprints, and 3-dimensional building shapes, among others. Results of the fully automated feature extraction algorithm are then compared to ground truth to assess completeness and accuracy of the methodology.
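
    The TEXAS algorithm itself is not spelled out in the abstract. As a stand-in, the sketch below shows the simplest form of the bare-earth idea: take the lowest return per XY grid cell as local ground and label points near that surface as terrain. The cell size, height tolerance, and synthetic point cloud are illustrative assumptions.

      import numpy as np

      def classify_terrain(points, cell=1.0, height_tol=0.3):
          # Toy terrain/non-terrain split for an (n, 3) point cloud: the lowest
          # return in each XY grid cell is taken as local ground, and points
          # within height_tol metres of it are flagged as terrain.
          ij = np.floor(points[:, :2] / cell).astype(int)
          lowest = {}
          for idx, key in enumerate(map(tuple, ij)):
              if key not in lowest or points[idx, 2] < points[lowest[key], 2]:
                  lowest[key] = idx
          ground_z = np.array([points[lowest[tuple(k)], 2] for k in ij])
          return points[:, 2] - ground_z <= height_tol   # True = terrain

      rng = np.random.default_rng(2)
      ground = np.column_stack([rng.uniform(0, 50, 5000),
                                rng.uniform(0, 50, 5000),
                                rng.normal(0.0, 0.05, 5000)])
      tree = np.column_stack([rng.uniform(20, 25, 500),
                              rng.uniform(20, 25, 500),
                              rng.uniform(2, 10, 500)])     # canopy returns
      cloud = np.vstack([ground, tree])
      is_terrain = classify_terrain(cloud)
      print(f"{is_terrain.sum()} terrain / {(~is_terrain).sum()} non-terrain points")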

  14. Computational analysis of unmanned aerial vehicle (UAV)

    NASA Astrophysics Data System (ADS)

    Abudarag, Sakhr; Yagoub, Rashid; Elfatih, Hassan; Filipovic, Zoran

    2017-01-01

    A computational analysis has been performed to verify the aerodynamic properties of an Unmanned Aerial Vehicle (UAV). The UAV-SUST was designed and fabricated at the Department of Aeronautical Engineering at Sudan University of Science and Technology in order to meet the specifications required for surveillance and reconnaissance missions. It is classified as a medium-range, medium-endurance UAV. A commercial CFD solver is used to simulate the steady and unsteady aerodynamic characteristics of the entire UAV. In addition to the lift coefficient (CL), drag coefficient (CD), pitching moment coefficient (CM), and yawing moment coefficient (CN), the pressure and velocity contours are illustrated. The aerodynamic parameters show very good agreement with the design considerations at angles of attack ranging from zero to 26 degrees. Moreover, the visualization of the velocity field and static pressure contours indicates satisfactory agreement with the proposed design. Turbulence is predicted with the k-ω SST turbulence model within the computational fluid dynamics code.

  15. Computational analysis of EGFR inhibition by Argos.

    PubMed

    Reeves, Gregory T; Kalifa, Rachel; Klein, Daryl E; Lemmon, Mark A; Shvartsman, Stanislav Y

    2005-08-15

    Argos, a secreted inhibitor of the Drosophila epidermal growth factor receptor, and the only known secreted receptor tyrosine kinase inhibitor, acts by sequestering the EGFR ligand Spitz. We use computational modeling to show that this biochemically-determined mechanism of Argos action can explain available genetic data for EGFR/Spitz/Argos interactions in vivo. We find that efficient Spitz sequestration by Argos is key for explaining the existing data and for providing a robust feedback loop that modulates the Spitz gradient in embryonic ventral ectoderm patterning. Computational analysis of the EGFR/Spitz/Argos module in the ventral ectoderm shows that Argos need not be long-ranged to account for genetic data, and can actually have very short range. In our models, Argos with long or short length scale functions to limit the range and action of secreted Spitz. Thus, the spatial range of Argos does not have to be tightly regulated or may act at different ranges in distinct developmental contexts.
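
    A minimal mass-action version of the sequestration mechanism can be written down directly. The sketch below integrates toy rate equations for free Spitz, free Argos, and the inactive complex; all rate constants and the production/decay terms are illustrative placeholders, not the parameters of the paper's model.

      import numpy as np
      from scipy.integrate import solve_ivp

      k_on, k_off = 1.0, 0.01        # binding / unbinding rate constants (toy)
      prod_S, deg = 0.1, 0.05        # ligand production, first-order decay (toy)

      def rhs(t, y):
          S, A, C = y                # free Spitz, free Argos, complex
          bind = k_on * S * A - k_off * C
          return [prod_S - bind - deg * S,   # free ligand
                  -bind - deg * A,           # free inhibitor
                  bind - deg * C]            # sequestered complex

      sol = solve_ivp(rhs, (0.0, 200.0), [0.0, 2.0, 0.0], max_step=0.5)
      S, A, C = sol.y
      print(f"final free Spitz: {S[-1]:.3f}, sequestered: {C[-1]:.3f}")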

  16. Computed tomographic analysis of meteorite inclusions

    NASA Technical Reports Server (NTRS)

    Arnold, J. R.; Testa, J. P., Jr.; Friedman, P. J.; Kambic, G. X.

    1983-01-01

    The feasibility of nondestructively obtaining a cross-sectional display of very dense heterogeneous rocky specimens, whether lunar, terrestrial, or meteoritic, by using a fourth-generation computed tomographic (CT) scanner, with modifications to the software only, is discussed. A description of the scanner and of the experimental and analytical procedures is given. Using this technique, the interior of heterogeneous materials such as Allende can be probed nondestructively. The regions of material with high and low atomic numbers are displayed quickly; the object can then be cut to obtain for analysis just the areas of interest. A comparison of this technique with conventional industrial and medical techniques is made in terms of image resolution and density distribution display precision.

  17. Computational based functional analysis of Bacillus phytases.

    PubMed

    Verma, Anukriti; Singh, Vinay Kumar; Gaur, Smriti

    2016-02-01

    Phytase is an enzyme which catalyzes the total hydrolysis of phytate to less phosphorylated myo-inositol derivatives and inorganic phosphate; it digests the otherwise indigestible phytate present in seeds and grains and therefore provides digestible phosphorus, calcium, and other mineral nutrients. Phytases are frequently added to the feed of monogastric animals so that the bioavailability of phytic acid-bound phosphate increases, ultimately enhancing the nutritional value of diets. Bacillus phytase is well suited to use in animal feed because of its favorable pH optimum and excellent thermal stability. The present study aims to perform an in silico comparative characterization and functional analysis of phytases from Bacillus amyloliquefaciens and to explore their physico-chemical properties using various bio-computational tools. All the proteins are acidic and thermostable and can be used as suitable candidates in the feed industry.

  18. Review of Computational Stirling Analysis Methods

    NASA Technical Reports Server (NTRS)

    Dyson, Rodger W.; Wilson, Scott D.; Tew, Roy C.

    2004-01-01

    Nuclear thermal-to-electric power conversion carries the promise of longer duration missions and higher scientific data transmission rates back to Earth for both Mars rovers and deep space missions. A free-piston Stirling convertor is a candidate technology that is considered an efficient and reliable power conversion device for such purposes. While already very efficient, it is believed that better Stirling engines can be developed if the losses inherent in current designs could be better understood. However, these engines are difficult to instrument, so efforts are underway to simulate a complete Stirling engine numerically. This has only recently been attempted, and a review of the methods leading up to and including such computational analysis is presented. Finally, it is proposed that the quality and depth of Stirling loss understanding may be improved by utilizing the higher fidelity and efficiency of recently developed numerical methods. One such method, the Ultra Hi-Fi technique, is presented in detail.

  19. Computer analysis of radionuclide esophageal transit studies

    SciTech Connect

    Klein, H.A.; Wald, A.

    1984-09-01

    For detailed examination of the esophageal transit of a swallowed radioactive liquid bolus, three computer-based techniques have been developed: analysis of time-activity curves with decomposition into rapid and residual components, yielding the mean transit time for the former and the residual fraction for the latter; reduction of dynamic image sequences to single condensed images, facilitating subjective assessment; and tracking of the centroid of radioactivity, permitting quantification of retrograde motion. Studies were performed on 12 normal subjects and on six patients with motility disorders. Elevated residual fractions were observed in all the patients, and an abnormal degree of retrograde motion in two. Two normal and two abnormal studies exemplify the variety of patterns observed in condensed images.
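
    As a hedged illustration of the first technique, the sketch below decomposes a synthetic time-activity curve into a rapid component and a residual plateau, then reports a mean transit time for the former and a residual fraction for the latter. The curve, cut-off time, and decomposition rule are simplifications for illustration, not the published algorithm.

      import numpy as np

      t = np.arange(0.0, 30.0, 0.25)                    # frame times, s
      activity = 1000 * np.exp(-t / 4.0) + 80.0         # rapid washout + residue
      activity += np.random.default_rng(3).normal(0, 5, t.size)

      t_cut = 15.0                                      # assumed settling time
      residual_level = activity[t >= t_cut].mean()
      rapid = np.clip(activity - residual_level, 0, None)

      # Mean transit time of the rapid component: activity-weighted mean time.
      mtt = np.sum(t * rapid) / np.sum(rapid)
      residual_fraction = residual_level / activity.max()
      print(f"mean transit time ~ {mtt:.1f} s, residual fraction ~ {residual_fraction:.2f}")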

  20. 3-Dimensional Marine CSEM Modeling by Employing TDFEM with Parallel Solvers

    NASA Astrophysics Data System (ADS)

    Wu, X.; Yang, T.

    2013-12-01

    In this paper, a parallel implementation is developed for forward modeling of 3-dimensional controlled source electromagnetic (CSEM) data by using the time-domain finite element method (TDFEM). Research attention has recently turned to the mechanisms for detecting hydrocarbon (HC) reservoirs beneath the seabed. Since China has vast ocean resources, locating hydrocarbon reservoirs is significant to the national economy. However, traditional seismic exploration methods face a crucial obstacle in detecting hydrocarbon reservoirs in seabeds with complex structure, owing to relatively high acquisition costs and high-risk exploration. In addition, the development of EM simulations typically requires both a deep knowledge of computational electromagnetics (CEM) and proper use of sophisticated techniques and tools from computer science. The complexity of large-scale EM simulations often demands large memory, because of the large amount of data, or long solution times to address problems concerning matrix solvers, function transforms, optimization, etc. The objective of this paper is to present a parallelized implementation of the time-domain finite element method for the analysis of three-dimensional (3D) marine controlled source electromagnetic problems. First, we established a three-dimensional background model according to the seismic data; electromagnetic simulation of marine CSEM was then carried out using the time-domain finite element method on an MPI (Message Passing Interface) platform, allowing fast detection of hydrocarbon targets in the ocean environment. To speed up the calculation, the MPI version of SuperLU, SuperLU_DIST, is employed in this approach. To represent the three-dimensional seabed terrain with a sense of reality, the region is discretized into an unstructured mesh rather than a uniform one in order to reduce the number of unknowns. Moreover, high-order Whitney

  1. Performance Analysis of Cloud Computing Architectures Using Discrete Event Simulation

    NASA Technical Reports Server (NTRS)

    Stocker, John C.; Golomb, Andrew M.

    2011-01-01

    Cloud computing offers the economic benefit of on-demand resource allocation to meet changing enterprise computing needs. However, the flexibility of cloud computing is disadvantaged when compared to traditional hosting in providing predictable application and service performance. Cloud computing relies on resource scheduling in a virtualized network-centric server environment, which makes static performance analysis infeasible. We developed a discrete event simulation model to evaluate the overall effectiveness of organizations in executing their workflow in traditional and cloud computing architectures. The two-part model framework characterizes both the demand, using a probability distribution for each type of service request, as well as enterprise computing resource constraints. Our simulations provide quantitative analysis to design and provision computing architectures that maximize overall mission effectiveness. We share our analysis of key resource constraints in cloud computing architectures and findings on the appropriateness of cloud computing in various applications.
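
    The flavor of such a model is easy to convey with a minimal discrete event simulation: requests arrive at random, compete for a fixed pool of servers, and queue when the pool is exhausted. The arrival and service rates and pool size below are illustrative placeholders; a real cloud model would add request types and scheduling policies.

      import heapq
      import random

      random.seed(0)
      N_SERVERS, ARRIVAL_RATE, SERVICE_RATE = 4, 3.0, 1.0   # toy parameters
      events = []                      # (time, kind) min-heap of pending events
      t, busy, queue, waits, arrival_times = 0.0, 0, 0, [], []
      heapq.heappush(events, (random.expovariate(ARRIVAL_RATE), "arrival"))

      while events and t < 1000.0:
          t, kind = heapq.heappop(events)
          if kind == "arrival":
              # Schedule the next arrival, then serve or enqueue this one.
              heapq.heappush(events, (t + random.expovariate(ARRIVAL_RATE), "arrival"))
              if busy < N_SERVERS:
                  busy += 1
                  waits.append(0.0)
                  heapq.heappush(events, (t + random.expovariate(SERVICE_RATE), "done"))
              else:
                  queue += 1
                  arrival_times.append(t)
          else:                        # a server finished its request
              if queue:
                  queue -= 1
                  waits.append(t - arrival_times.pop(0))
                  heapq.heappush(events, (t + random.expovariate(SERVICE_RATE), "done"))
              else:
                  busy -= 1

      print(f"served {len(waits)} requests, mean wait {sum(waits)/len(waits):.2f}")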

  2. Computational analysis of heat flow in computer casing

    NASA Astrophysics Data System (ADS)

    Nor Azwadi, C. S.; Goh, C. K.; Afiq Witri, M. Y.

    2012-06-01

    Reliability of a computer system is directly related to its thermal management. Poor thermal management leads to high temperature distributions throughout hardware components, resulting in poor performance and reduced fatigue life of the package. Therefore, good cooling solutions (heat sink, fan) and proper form factor design (expandability, interchangeability of parts) are necessary to provide good thermal management in a computer system. The performance of the Advanced Technology Extended (ATX) form factor and its proposed successor, Balanced Technology Extended (BTX), was compared to investigate the aforementioned factors. Simulations were conducted by using ANSYS software. Results obtained from the simulations were compared with values in the datasheets obtained from manufacturers for validation purposes, and it was discovered that there are more chaotic regions in the flow profile for the ATX form factor. In contrast, the BTX form factor yields a straighter flow profile. Based on these results, we can conclude that the BTX form factor has better cooling capability than its predecessor, ATX, due to the improved layout of the BTX form factor. This change enables the BTX form factor to be used with more advanced components which dissipate larger amounts of heat, and also improves acoustic performance by reducing the number of fans needed to just one unit.

  3. Analysis on the security of cloud computing

    NASA Astrophysics Data System (ADS)

    He, Zhonglin; He, Yuhua

    2011-02-01

    Cloud computing is a new technology arising from the fusion of computer technology and Internet development. It will lead a revolution in the IT and information fields. However, in cloud computing, data and application software are stored at large data centers, and the management of data and services is not completely trustworthy, resulting in safety problems that make it difficult to improve the quality of cloud services. This paper briefly introduces the concept of cloud computing. Considering the characteristics of cloud computing, it constructs a security architecture for cloud computing. At the same time, with an eye toward the security threats cloud computing faces, several corresponding strategies are provided from the perspectives of cloud computing users and service providers.

  4. Can cloud computing benefit health services? - a SWOT analysis.

    PubMed

    Kuo, Mu-Hsing; Kushniruk, Andre; Borycki, Elizabeth

    2011-01-01

    In this paper, we discuss cloud computing, the current state of cloud computing in healthcare, and the challenges and opportunities of adopting cloud computing in healthcare. A Strengths, Weaknesses, Opportunities and Threats (SWOT) analysis was used to evaluate the feasibility of adopting this computing model in healthcare. The paper concludes that cloud computing could have huge benefits for healthcare but there are a number of issues that will need to be addressed before its widespread use in healthcare.

  5. A computational design system for rapid CFD analysis

    NASA Technical Reports Server (NTRS)

    Ascoli, E. P.; Barson, S. L.; Decroix, M. E.; Sindir, Munir M.

    1992-01-01

    A computational design system (CDS) is described in which computational analysis tools are integrated in a modular fashion. This CDS ties together four key areas of computational analysis: description of geometry; grid generation; computational codes; and postprocessing. Integration of improved computational fluid dynamics (CFD) analysis tools through the CDS has made a significant positive impact on the use of CFD for engineering design problems. Complex geometries are now analyzed on a frequent basis and with greater ease.

  6. Computing in Qualitative Analysis: A Healthy Development?

    ERIC Educational Resources Information Center

    Richards, Lyn; Richards, Tom

    1991-01-01

    Discusses the potential impact of computers in qualitative health research. Describes the original goals, design, and implementation of NUDIST, a qualitative computing software. Argues for evaluation of the impact of computer techniques and for an opening of debate among program developers and users to address the purposes and power of computing…

  7. Research in applied mathematics, numerical analysis, and computer science

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering (ICASE) in applied mathematics, numerical analysis, and computer science is summarized and abstracts of published reports are presented. The major categories of the ICASE research program are: (1) numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; (2) control and parameter identification; (3) computational problems in engineering and the physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and (4) computer systems and software, especially vector and parallel computers.

  8. TAIR- TRANSONIC AIRFOIL ANALYSIS COMPUTER CODE

    NASA Technical Reports Server (NTRS)

    Dougherty, F. C.

    1994-01-01

    The Transonic Airfoil analysis computer code, TAIR, was developed to employ a fast, fully implicit algorithm to solve the conservative full-potential equation for the steady transonic flow field about an arbitrary airfoil immersed in a subsonic free stream. The full-potential formulation is considered exact under the assumptions of irrotational, isentropic, and inviscid flow. These assumptions are valid for a wide range of practical transonic flows typical of modern aircraft cruise conditions. The primary features of TAIR include: a new fully implicit iteration scheme which is typically many times faster than classical successive line overrelaxation algorithms; a new, reliable artificial density spatial differencing scheme treating the conservative form of the full-potential equation; and a numerical mapping procedure capable of generating curvilinear, body-fitted finite-difference grids about arbitrary airfoil geometries. Three aspects emphasized during the development of the TAIR code were reliability, simplicity, and speed. The reliability of TAIR comes from two sources: the new algorithm employed and the implementation of effective convergence monitoring logic. TAIR achieves ease of use by employing a "default mode" that greatly simplifies code operation, especially by inexperienced users, and many useful options including: several airfoil-geometry input options, flexible user controls over program output, and a multiple solution capability. The speed of the TAIR code is attributed to the new algorithm and the manner in which it has been implemented. Input to the TAIR program consists of airfoil coordinates, aerodynamic and flow-field convergence parameters, and geometric and grid convergence parameters. The airfoil coordinates for many airfoil shapes can be generated in TAIR from just a few input parameters. Most of the other input parameters have default values which allow the user to run an analysis in the default mode by specifying only a few input parameters.

  9. New computing systems, future computing environment, and their implications on structural analysis and design

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.; Housner, Jerrold M.

    1993-01-01

    Recent advances in computer technology that are likely to impact structural analysis and design of flight vehicles are reviewed. A brief summary is given of the advances in microelectronics, networking technologies, and in the user-interface hardware and software. The major features of new and projected computing systems, including high performance computers, parallel processing machines, and small systems, are described. Advances in programming environments, numerical algorithms, and computational strategies for new computing systems are reviewed. The impact of the advances in computer technology on structural analysis and the design of flight vehicles is described. A scenario for future computing paradigms is presented, and the near-term needs in the computational structures area are outlined.

  10. Simple parameter estimation for complex models — Testing evolutionary techniques on 3-dimensional biogeochemical ocean models

    NASA Astrophysics Data System (ADS)

    Mattern, Jann Paul; Edwards, Christopher A.

    2017-01-01

    Parameter estimation is an important part of numerical modeling and often required when a coupled physical-biogeochemical ocean model is first deployed. However, 3-dimensional ocean model simulations are computationally expensive and models typically contain upwards of 10 parameters suitable for estimation. Hence, manual parameter tuning can be lengthy and cumbersome. Here, we present four easy to implement and flexible parameter estimation techniques and apply them to two 3-dimensional biogeochemical models of different complexities. Based on a Monte Carlo experiment, we first develop a cost function measuring the model-observation misfit based on multiple data types. The parameter estimation techniques are then applied and yield a substantial cost reduction over ∼ 100 simulations. Based on the outcome of multiple replicate experiments, they perform on average better than random, uninformed parameter search but performance declines when more than 40 parameters are estimated together. Our results emphasize the complex cost function structure for biogeochemical parameters and highlight dependencies between different parameters as well as different cost function formulations.
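
    A minimal evolutionary technique of the kind tested is sketched below: a (mu + lambda) evolution strategy that mutates a parent population and keeps the lowest-cost candidates. The quadratic cost function stands in for the expensive model-observation misfit (evaluating the real one would mean running a 3-D simulation per parameter set), and all settings are illustrative.

      import numpy as np

      rng = np.random.default_rng(4)

      def cost(p):
          # Stand-in for the model-observation misfit; the "true" parameters
          # below are arbitrary for this toy example.
          return np.sum((p - np.array([0.3, 1.2, 0.05, 2.0])) ** 2)

      def evolve(n_params=4, pop=10, n_offspring=20, generations=10, sigma=0.5):
          parents = rng.uniform(0.0, 3.0, size=(pop, n_params))
          for g in range(generations):
              children = parents[rng.integers(pop, size=n_offspring)]
              children = children + rng.normal(0.0, sigma, children.shape)
              combined = np.vstack([parents, children])
              order = np.argsort([cost(p) for p in combined])
              parents = combined[order[:pop]]         # (mu + lambda) selection
              sigma *= 0.9                            # shrink mutation step
          return parents[0]

      best = evolve()   # about 200 cost evaluations in total
      print("best parameters:", np.round(best, 3), "cost:", round(cost(best), 4))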

  11. Automated 3-Dimensional Brain Atlas Fitting to Microelectrode Recordings from Deep Brain Stimulation Surgeries

    PubMed Central

    Luján, J. Luis; Noecker, Angela M.; Butson, Christopher R.; Cooper, Scott E.; Walter, Benjamin L.; Vitek, Jerrold L.; McIntyre, Cameron C.

    2009-01-01

    Objective Deep brain stimulation (DBS) surgeries commonly rely on brain atlases and microelectrode recordings (MER) to help identify the target location for electrode implantation. We present an automated method for optimally fitting a 3-dimensional brain atlas to intraoperative MER and predicting a target DBS electrode location in stereotactic coordinates for the patient. Methods We retrospectively fit a 3-dimensional brain atlas to MER points from 10 DBS surgeries targeting the subthalamic nucleus (STN). We used a constrained optimization algorithm to maximize the MER points correctly fitted (i.e., contained) within the appropriate atlas nuclei. We compared our optimization approach to conventional anterior commissure-posterior commissure (AC/PC) scaling, and to manual fits performed by four experts. A theoretical DBS electrode target location in the dorsal STN was customized to each patient as part of the fitting process and compared to the location of the clinically defined therapeutic stimulation contact. Results The human expert and computer optimization fits achieved significantly better fits than the AC/PC scaling (80, 81, and 41% of correctly fitted MER, respectively). However, the optimization fits were performed in less time than the expert fits and converged to a single solution for each patient, eliminating interexpert variance. Conclusions and Significance DBS therapeutic outcomes are directly related to electrode implantation accuracy. Our automated fitting techniques may aid in the surgical decision-making process by optimally integrating brain atlas and intraoperative neurophysiological data to provide a visual guide for target identification. PMID:19556832
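
    As a toy version of the fitting step (not the authors' method), the sketch below treats one atlas nucleus as an ellipsoid and searches for a translation and uniform scale that maximize a smoothed count of MER points contained in it, using a derivative-free optimizer. Geometry, parameterization, and data are illustrative assumptions.

      import numpy as np
      from scipy.optimize import minimize
      from scipy.special import expit

      rng = np.random.default_rng(5)
      mer = rng.normal(loc=[1.0, -0.5, 0.3], scale=[0.8, 0.5, 0.6], size=(60, 3))
      radii = np.array([1.5, 1.0, 1.2])     # atlas nucleus semi-axes (toy)

      def soft_inside(params):
          # Smooth surrogate for "number of points inside the ellipsoid";
          # d < 1 means inside, and expit softens the step so Nelder-Mead
          # sees a well-behaved objective.
          t, s = params[:3], params[3]
          d = np.sum(((mer - t) / (s * radii)) ** 2, axis=1)
          return -np.sum(expit(-20.0 * (d - 1.0)))

      res = minimize(soft_inside, x0=[0.0, 0.0, 0.0, 1.0], method="Nelder-Mead")
      t_fit, s_fit = res.x[:3], res.x[3]
      inside = np.sum(np.sum(((mer - t_fit) / (s_fit * radii)) ** 2, axis=1) < 1.0)
      print(f"fitted translation {np.round(t_fit, 2)}, scale {s_fit:.2f}, "
            f"{inside}/{len(mer)} MER points contained")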

  12. 3-Dimensional quantitative detection of nanoparticle content in biological tissue samples after local cancer treatment

    NASA Astrophysics Data System (ADS)

    Rahn, Helene; Alexiou, Christoph; Trahms, Lutz; Odenbach, Stefan

    2014-06-01

    X-ray computed tomography is nowadays used for a wide range of applications in medicine, science, and technology. X-ray microcomputed tomography (XμCT) follows the same principles used for conventional medical CT scanners, but improves the spatial resolution to a few micrometers. We present an example of an application of X-ray microtomography: a study of 3-dimensional biodistribution, along with the quantification of nanoparticle content in tumoral tissue after minimally invasive cancer therapy. One of these minimally invasive cancer treatments is magnetic drug targeting, where magnetic nanoparticles are used as controllable drug carriers. The quantification is based on a calibration of the XμCT equipment. The developed calibration procedure is based on a phantom system which allows discrimination between the various gray values of the data set. These phantoms consist of a biological tissue substitute and magnetic nanoparticles. The phantoms have been studied with XμCT and examined magnetically. The obtained gray values and nanoparticle concentrations yield a calibration curve, which can be applied to tomographic data sets. Accordingly, this calibration enables a voxel-wise assignment of gray values in the digital tomographic data set to nanoparticle content. Thus, the calibration procedure enables a 3-dimensional study of nanoparticle distribution as well as concentration.
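
    Once a calibration curve relating gray value to nanoparticle concentration exists, applying it voxel-wise is a one-dimensional interpolation. The sketch below uses made-up calibration points and a random volume; real values would come from the phantom measurements described above.

      import numpy as np

      # Made-up calibration points (phantom-derived in practice).
      cal_gray = np.array([100.0, 140.0, 190.0, 250.0, 320.0])   # mean gray values
      cal_conc = np.array([0.0, 2.0, 5.0, 10.0, 20.0])           # mg Fe / mL

      # Synthetic reconstructed volume standing in for the tomographic data set.
      volume = np.random.default_rng(6).uniform(90, 330, size=(64, 64, 64))
      conc = np.interp(volume, cal_gray, cal_conc)   # assumes a monotone curve

      voxel_ml = (20e-4) ** 3   # 20 um isotropic voxel volume in cm^3 (= mL)
      total_mg = conc.sum() * voxel_ml
      print(f"estimated total nanoparticle load: {total_mg:.3f} mg")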

  13. X-Ray Computed Tomography for Failure Analysis Investigations

    DTIC Science & Technology

    1993-05-01

    WL-TR-93-4047: X-Ray Computed Tomography for Failure Analysis Investigations. Richard H. Bossi and William Shepherd, Boeing Defense & Space. The feature detection and three-dimensional positioning capability of X-ray computed tomography are valuable and cost-saving assets to a failure analysis.

  14. An Analysis of 27 Years of Research into Computer Education Published in Australian Educational Computing

    ERIC Educational Resources Information Center

    Zagami, Jason

    2015-01-01

    Analysis of three decades of publications in Australian Educational Computing (AEC) provides insight into the historical trends in Australian educational computing, highlighting an emphasis on pedagogy, comparatively few articles on educational technologies, and strong research topic alignment with similar international journals. Analysis confirms…

  15. Volume accumulator design analysis computer codes

    NASA Technical Reports Server (NTRS)

    Whitaker, W. D.; Shimazaki, T. T.

    1973-01-01

    The computer codes, VANEP and VANES, were written and used to aid in the design and performance calculation of the volume accumulator units (VAU) for the 5-kWe reactor thermoelectric system. VANEP computes the VAU design which meets the primary coolant loop VAU volume and pressure performance requirements. VANES computes the performance of the VAU design, determined from the VANEP code, at the conditions of the secondary coolant loop. The codes can also compute the performance characteristics of the VAU's under conditions of possible modes of failure which still permit continued system operation.

  16. Numerical Package in Computer Supported Numeric Analysis Teaching

    ERIC Educational Resources Information Center

    Tezer, Murat

    2007-01-01

    At universities, in the faculties of Engineering, Sciences, Business and Economics, together with higher education in Computing, it is stated that because of the difficulty, calculators and computers can be used in Numerical Analysis (NA). In this study, learning of computer-supported NA will be discussed together with important usage of the…

  17. New Technique for Developing a Proton Range Compensator With Use of a 3-Dimensional Printer

    SciTech Connect

    Ju, Sang Gyu; Kim, Min Kyu; Hong, Chae-Seon; Kim, Jin Sung; Han, Youngyih; Choi, Doo Ho; Shin, Dongho; Lee, Se Byeong

    2014-02-01

    Purpose: A new system for manufacturing a proton range compensator (RC) was developed by using a 3-dimensional printer (3DP). The physical accuracy and dosimetric characteristics of the new RC manufactured by 3DP (RC_3DP) were compared with those of a conventional RC (RC_CMM) manufactured by a computerized milling machine (CMM). Methods and Materials: An RC for brain tumor treatment with a scattered proton beam was calculated with a treatment planning system (TPS), and the resulting data were converted into a new format for 3DP using in-house software. The RC_3DP was printed with ultraviolet-curable acrylic plastic, and an RC_CMM was milled from polymethylmethacrylate using a CMM. The inner shape of both RCs was scanned by using a 3D scanner and compared with TPS data by applying composite analysis (CA; with 1-mm depth difference and 1-mm distance-to-agreement criteria) to verify their geometric accuracy. The position and distal penumbra of distal dose falloff at the central axis and the field width of the dose profile at the midline depth of the spread-out Bragg peak were measured for the 2 RCs to evaluate their dosimetric characteristics. Both RCs were imaged on a computed tomography scanner to evaluate uniformity of internal density. The manufacturing times for both RCs were compared to evaluate production efficiency. Results: The pass rates for the CA test were 99.5% and 92.5% for RC_3DP and RC_CMM, respectively. There was no significant difference in dosimetric characteristics and uniformity of internal density between the 2 RCs. The net fabrication times of RC_3DP and RC_CMM were about 18 and 3 hours, respectively. Conclusions: The physical accuracy and dosimetric characteristics of RC_3DP were comparable with those of the conventional RC_CMM, and significant system minimization was provided.

  18. NASA Applications for Computational Electromagnetic Analysis

    NASA Technical Reports Server (NTRS)

    Lewis, Catherine C.; Trout, Dawn H.; Krome, Mark E.; Perry, Thomas A.

    2011-01-01

    Computational Electromagnetic Software is used by NASA to analyze the compatibility of systems too large or too complex for testing. Recent advances in software packages and computer capabilities have made it possible to determine the effects of a transmitter inside a launch vehicle fairing, better analyze the environment threats, and perform on-orbit replacements with assured electromagnetic compatibility.

  19. Computer vision syndrome (CVS) – Thermographic Analysis

    NASA Astrophysics Data System (ADS)

    Llamosa-Rincón, L. E.; Jaime-Díaz, J. M.; Ruiz-Cardona, D. F.

    2017-01-01

    The use of computers has grown exponentially in the last decades; the possibility of carrying out several tasks for both professional and leisure purposes has contributed to their great acceptance by users. The consequences and impact of uninterrupted work with computer screens or displays on visual health have grabbed researchers' attention. When spending long periods of time in front of a computer screen, human eyes are subjected to great efforts, which in turn triggers a set of symptoms known as Computer Vision Syndrome (CVS). The most common of them are: blurred vision, visual fatigue, and Dry Eye Syndrome (DES) due to inappropriate lubrication of the ocular surface when blinking decreases. An experimental protocol was designed and implemented to perform thermographic studies on healthy human eyes during exposure to displays of computers, with the main purpose of comparing the existing differences in temperature variations of healthy ocular surfaces.

  20. Thermal crosstalk in 3-dimensional RRAM crossbar array

    NASA Astrophysics Data System (ADS)

    Sun, Pengxiao; Lu, Nianduan; Li, Ling; Li, Yingtao; Wang, Hong; Lv, Hangbing; Liu, Qi; Long, Shibing; Liu, Su; Liu, Ming

    2015-08-01

    High-density 3-dimensional (3D) crossbar resistive random access memory (RRAM) is a major focus of new-generation memory technologies. To compete with ultra-high-density NAND and NOR memories, an understanding of the reliability mechanisms and scaling potential of 3D RRAM crossbar arrays is needed. Thermal crosstalk is one of the most critical effects that should be considered in 3D crossbar array applications. The Joule heat generated inside an RRAM device determines its own switching behavior, and in dense memory arrays the resulting temperature rise in the surroundings may lead to resistance degradation of neighboring devices. In this work, the thermal crosstalk effect and the scaling potential under thermal effects in 3D RRAM crossbar arrays are systematically investigated. It is revealed that the reset process is dominated by the transient thermal effect in the 3D RRAM array. More importantly, thermal crosstalk phenomena can deteriorate device retention performance and even lead to failure of the data storage state, from LRS (low resistance state) to HRS (high resistance state), of the disturbed RRAM cell. In addition, the resistance-state degradation becomes more serious as the feature size is scaled down. Possible methods for alleviating the thermal crosstalk effect while further advancing the scaling potential are also provided and verified by numerical simulation.
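
    The neighbor-heating mechanism can be illustrated with a small explicit finite-difference diffusion model: one cell is held hot while switching and the temperature rise at an adjacent cell is tracked. The thermal diffusivity, grid spacing, and temperatures below are placeholders, not device values from the paper.

      import numpy as np

      # 2-D explicit finite-difference heat diffusion across a crossbar-like grid.
      n, alpha, dx, dt = 41, 1e-6, 20e-9, 5e-11   # grid, diffusivity, spacing, step
      assert alpha * dt / dx**2 <= 0.25           # explicit stability limit
      T = np.full((n, n), 300.0)                  # ambient temperature, K
      src = (n // 2, n // 2)                      # switching cell (heat source)
      neighbor = (n // 2, n // 2 + 5)             # nearby cell, 5 nodes away

      for step in range(2000):
          T[src] = 600.0                          # cell held hot while switching
          lap = (np.roll(T, 1, 0) + np.roll(T, -1, 0) +
                 np.roll(T, 1, 1) + np.roll(T, -1, 1) - 4 * T)
          T = T + alpha * dt / dx**2 * lap
          T[0, :] = T[-1, :] = T[:, 0] = T[:, -1] = 300.0   # fixed boundaries

      print(f"neighbor temperature after {2000*dt*1e9:.1f} ns: {T[neighbor]:.1f} K")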

  1. The first 3-dimensional assemblies of organotin-functionalized polyanions.

    PubMed

    Piedra-Garza, Luis Fernando; Reinoso, Santiago; Dickman, Michael H; Sanguineti, Michael M; Kortz, Ulrich

    2009-08-21

    Reaction of the (CH₃)₂Sn²⁺ electrophile with trilacunary [A-α-XW₉O₃₄]ⁿ⁻ Keggin polytungstates (X = P(V), As(V), Si(IV)), with guanidinium as templating cation, resulted in the isostructural compounds Na[C(NH₂)₃]₂[{(CH₃)₂Sn(H₂O)}₃(A-α-PW₉O₃₄)]·9H₂O (1), Na[C(NH₂)₃]₂[{(CH₃)₂Sn(H₂O)}₃(A-α-AsW₉O₃₄)]·8H₂O (2) and Na₂[C(NH₂)₃]₂[{(CH₃)₂Sn(H₂O)}₃(A-α-SiW₉O₃₄)]·10H₂O (3). Compounds 1-3 constitute the first 3-dimensional assemblies of organotin-functionalized polyanions, as well as the first example of a dimethyltin-containing tungstosilicate in the case of 3, and they show a similar chiral architecture based on tetrahedrally-arranged {(CH₃)₂Sn}₃(A-α-XW₉O₃₄) monomeric building blocks connected via intermolecular Sn-O=W bridges, regardless of the size and/or charge of the heteroatom.

  2. Thermal crosstalk in 3-dimensional RRAM crossbar array

    PubMed Central

    Sun, Pengxiao; Lu, Nianduan; Li, Ling; Li, Yingtao; Wang, Hong; Lv, Hangbing; Liu, Qi; Long, Shibing; Liu, Su; Liu, Ming

    2015-01-01

    High-density 3-dimensional (3D) crossbar resistive random access memory (RRAM) is a major focus of new-generation memory technologies. To compete with ultra-high-density NAND and NOR memories, an understanding of the reliability mechanisms and scaling potential of 3D RRAM crossbar arrays is needed. Thermal crosstalk is one of the most critical effects that should be considered in 3D crossbar array applications. The Joule heat generated inside an RRAM device determines its own switching behavior, and in dense memory arrays the resulting temperature rise in the surroundings may lead to resistance degradation of neighboring devices. In this work, the thermal crosstalk effect and the scaling potential under thermal effects in 3D RRAM crossbar arrays are systematically investigated. It is revealed that the reset process is dominated by the transient thermal effect in the 3D RRAM array. More importantly, thermal crosstalk phenomena can deteriorate device retention performance and even lead to failure of the data storage state, from LRS (low resistance state) to HRS (high resistance state), of the disturbed RRAM cell. In addition, the resistance-state degradation becomes more serious as the feature size is scaled down. Possible methods for alleviating the thermal crosstalk effect while further advancing the scaling potential are also provided and verified by numerical simulation. PMID:26310537

  3. Thermal crosstalk in 3-dimensional RRAM crossbar array.

    PubMed

    Sun, Pengxiao; Lu, Nianduan; Li, Ling; Li, Yingtao; Wang, Hong; Lv, Hangbing; Liu, Qi; Long, Shibing; Liu, Su; Liu, Ming

    2015-08-27

    High-density 3-dimensional (3D) crossbar resistive random access memory (RRAM) is a major focus of new-generation memory technologies. To compete with ultra-high-density NAND and NOR memories, an understanding of the reliability mechanisms and scaling potential of 3D RRAM crossbar arrays is needed. Thermal crosstalk is one of the most critical effects that should be considered in 3D crossbar array applications. The Joule heat generated inside an RRAM device determines its own switching behavior, and in dense memory arrays the resulting temperature rise in the surroundings may lead to resistance degradation of neighboring devices. In this work, the thermal crosstalk effect and the scaling potential under thermal effects in 3D RRAM crossbar arrays are systematically investigated. It is revealed that the reset process is dominated by the transient thermal effect in the 3D RRAM array. More importantly, thermal crosstalk phenomena can deteriorate device retention performance and even lead to failure of the data storage state, from LRS (low resistance state) to HRS (high resistance state), of the disturbed RRAM cell. In addition, the resistance-state degradation becomes more serious as the feature size is scaled down. Possible methods for alleviating the thermal crosstalk effect while further advancing the scaling potential are also provided and verified by numerical simulation.

  4. In vitro measurement of muscle volume with 3-dimensional ultrasound.

    PubMed

    Delcker, A; Walker, F; Caress, J; Hunt, C; Tegeler, C

    1999-05-01

    The aim was to test the accuracy of muscle volume measurement with a new 3-dimensional (3-D) ultrasound system that allows freehand scanning of the transducer with improved quality of the ultrasound images and hence of the muscle outlines. Five resected cadaveric hand muscles were insonated, and the muscle volumes were calculated from 3-D reconstructions of the acquired 2-D ultrasound sections. Intra-reader, inter-reader, and follow-up variability were calculated, as was the volume of the muscle tissue measured by water displacement. 3-D ultrasound and water-displacement measurements showed an average deviation of 10.1%. For the 3-D ultrasound measurements, intra-reader variability was 2.8%, inter-reader variability 2.4%, and follow-up variability 2.3%. 3-D measurements of muscle volume are valid and reliable. Serial sonographic measurements of muscle may be able to quantify changes in muscle volume that occur in disease and recovery.

  5. Invasive 3-Dimensional Organotypic Neoplasia from Multiple Normal Human Epithelia

    PubMed Central

    Ridky, Todd W.; Chow, Jennifer M.; Wong, David J.; Khavari, Paul A.

    2013-01-01

    Refined cancer models are required to assess the burgeoning number of potential targets for cancer therapeutics within a rapid and clinically relevant context. Here we utilize tumor-associated genetic pathways to transform primary human epithelial cells from epidermis, oropharynx, esophagus, and cervix into genetically defined tumors within a human 3-dimensional (3-D) tissue environment incorporating cell-populated stroma and intact basement membrane. These engineered organotypic tissues recapitulated natural features of tumor progression, including epithelial invasion through basement membrane, a complex process critically required for biologic malignancy in 90% of human cancers. Invasion was rapid, and potentiated by stromal cells. Oncogenic signals in 3-D tissue, but not 2-D culture, resembled gene expression profiles from spontaneous human cancers. Screening well-characterized signaling pathway inhibitors in 3-D organotypic neoplasia helped distil a clinically faithful cancer gene signature. Multi-tissue 3-D human tissue cancer models may provide an efficient and relevant complement to current approaches to characterize cancer progression. PMID:21102459

  6. Computer analysis of slow vital capacity spirograms.

    PubMed

    Primiano, F P; Bacevice, A E; Lough, M D; Doershuk, C F

    1982-01-01

    We have developed a digital computer program which evaluates the vital capacity and its subdivisions, expiratory reserve volume and inspiratory capacity. The algorithm examines the multibreath spirogram, a continuous record of quiet breathing interspersed among repeated slow, large volume maneuvers. Quiet breaths are recognized by comparing features of each breath to the respective average and variation of these features for all breaths. A self-scaling, iterative procedure is used to identify those end-tidal points that most likely represent the subject's functional residual capacity. A least-squared error baseline is then fit through these points to partition the vital capacity. Twenty-three spirograms from patients with documented pulmonary disease were independently analyzed by the computer, a pulmonary function technician, and the laboratory supervisor. No practical differences were found among the results. However, the computer's values, in contrast to those of the technician, were reproducible on repeated trials and free of computational and transcriptional errors.
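
    The partitioning step lends itself to a short sketch: fit a least-squares baseline through (already identified) end-tidal points of a synthetic quiet-breathing trace, then split a vital capacity maneuver into inspiratory capacity and expiratory reserve volume about that baseline. The signal, end-tidal detection, and maneuver extremes are simplified assumptions, not the published algorithm.

      import numpy as np

      t = np.linspace(0.0, 30.0, 3000)                       # time, s
      frc_drift = 0.002 * t                                  # slow baseline drift, L
      volume = frc_drift + 0.25 * np.sin(2 * np.pi * 0.25 * t)  # quiet breathing

      # Assume end-tidal (end-expiration) points were already identified:
      et_idx = np.where(np.isclose(np.sin(2 * np.pi * 0.25 * t), -1.0, atol=1e-3))[0]
      A = np.column_stack([t[et_idx], np.ones(et_idx.size)])
      slope, intercept = np.linalg.lstsq(A, volume[et_idx], rcond=None)[0]

      t_max = 15.0                      # time of the large-volume maneuver (toy)
      baseline_at_vc = slope * t_max + intercept
      vc_top, vc_bottom = 3.2, -1.1     # maneuver extremes on the trace, L (toy)
      ic  = vc_top - baseline_at_vc     # inspiratory capacity
      erv = baseline_at_vc - vc_bottom  # expiratory reserve volume
      print(f"IC ~ {ic:.2f} L, ERV ~ {erv:.2f} L, VC ~ {ic + erv:.2f} L")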

  7. System balance analysis for vector computers

    NASA Technical Reports Server (NTRS)

    Knight, J. C.; Poole, W. G., Jr.; Voight, R. G.

    1975-01-01

    The availability of vector processors capable of sustaining computing rates of 10^8 arithmetic results per second raised the question of whether peripheral storage devices representing current technology can keep such processors supplied with data. By examining the solution of a large banded linear system on these computers, it was found that even under ideal conditions, the processors will frequently be waiting for problem data.

  8. Quantum Computer Circuit Analysis and Design

    DTIC Science & Technology

    2009-02-01

    The geodesic equation is a first-order nonlinear differential matrix equation of the Lax type. This report gives derivations of the Levi-Civita connection, the Riemann curvature, and computational paths in the SU(2^n) manifold. Cited references include: Nielsen, M. A.; Chuang, I. L. Quantum Computation and Quantum Information; Cambridge University Press, 2000; and Dowling, M. R.; Nielsen, M. A. The Geometry of Quantum Computation.

  9. Influence of White-Coat Hypertension on Left Ventricular Deformation 2- and 3-Dimensional Speckle Tracking Study.

    PubMed

    Tadic, Marijana; Cuspidi, Cesare; Ivanovic, Branislava; Ilic, Irena; Celic, Vera; Kocijancic, Vesna

    2016-03-01

    We sought to compare left ventricular deformation in subjects with white-coat hypertension to normotensive and sustained hypertensive patients. This cross-sectional study included 139 untreated subjects who underwent 24-hour ambulatory blood pressure monitoring and completed 2- and 3-dimensional examination. Two-dimensional left ventricular multilayer strain analysis was also performed. White-coat hypertension was diagnosed if clinical blood pressure was elevated and 24-hour blood pressure was normal. Our results showed that left ventricular longitudinal and circumferential strains gradually decreased from normotensive controls across subjects with white-coat hypertension to sustained hypertensive group. Two- and 3-dimensional left ventricular radial strain, as well as 3-dimensional area strain, was not different between groups. Two-dimensional left ventricular longitudinal and circumferential strains of subendocardial and mid-myocardial layers gradually decreased from normotensive control to sustained hypertensive group. Longitudinal and circumferential strains of subepicardial layer did not differ between the observed groups. We concluded that white-coat hypertension significantly affects left ventricular deformation assessed by 2-dimensional traditional strain, multilayer strain, and 3-dimensional strain.

  10. 3-Dimensional Immersive Visualization For Regional Water Planning

    NASA Astrophysics Data System (ADS)

    Block, J.; Razdan, A.; Shangraw, R.; Arrowsmith, R.

    2005-12-01

    accurately represent the model inputs and outputs to MODFLOW in ways the data have not been previously presented. We have explored new data conversion techniques to import GIS data to a Linux-based computing cluster. The innovative visualization of these data allows the water planners to more completely grasp the intricate complexities of the data analysis, while being able to see more inputs to the model simultaneously. Temporal changes in aquifer storage are now represented in time-stepped 3D surfaces, thus reducing the cognitive load for comprehension. Planners will ultimately use the resulting visualization tools to educate policy decision makers on outcomes from alternate scenarios and the effect of variations in the key model input parameters.

  11. Modern Computational Techniques for the HMMER Sequence Analysis

    PubMed Central

    2013-01-01

    This paper focuses on the latest research and critical reviews on modern computing architectures, software and hardware accelerated algorithms for bioinformatics data analysis, with an emphasis on one of the most important sequence analysis applications—hidden Markov models (HMM). We show a detailed performance comparison of sequence analysis tools on various computing platforms recently developed in the bioinformatics community. The characteristics of sequence analysis, such as its data- and compute-intensive nature, make it very attractive to optimize and parallelize by using both traditional software approaches and innovative hardware acceleration technologies. PMID:25937944

  12. Computational analysis of LDDMM for brain mapping.

    PubMed

    Ceritoglu, Can; Tang, Xiaoying; Chow, Margaret; Hadjiabadi, Darian; Shah, Damish; Brown, Timothy; Burhanullah, Muhammad H; Trinh, Huong; Hsu, John T; Ament, Katarina A; Crocetti, Deana; Mori, Susumu; Mostofsky, Stewart H; Yantis, Steven; Miller, Michael I; Ratnanather, J Tilak

    2013-01-01

    One goal of computational anatomy (CA) is to develop tools to accurately segment brain structures in healthy and diseased subjects. In this paper, we examine the performance and complexity of such segmentation in the framework of the large deformation diffeomorphic metric mapping (LDDMM) registration method with reference to atlases and parameters. First we report the application of a multi-atlas segmentation approach to define basal ganglia structures in healthy and diseased children's brains. The segmentation accuracy of the multi-atlas approach is compared with the single-atlas LDDMM implementation and two state-of-the-art segmentation algorithms, FreeSurfer and FSL, by computing the overlap errors between automatic and manual segmentations of the six basal ganglia nuclei in healthy subjects as well as subjects with diseases including ADHD and Autism. The high accuracy of multi-atlas segmentation is obtained at the cost of increased computational complexity because of the calculations necessary between the atlases and a subject. Second, we examine the effect of parameters on total LDDMM computation time and segmentation accuracy for basal ganglia structures. The single-atlas LDDMM method is used to automatically segment the structures in a population of 16 subjects using different sets of parameters. The results show that a cascade approach and using fewer time steps can reduce computational complexity as much as five times while maintaining reliable segmentations.
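
    The overlap error mentioned above is commonly quantified with the Dice similarity coefficient; a short sketch with synthetic label volumes follows (the spheres are illustrative stand-ins, not basal ganglia labels).

      import numpy as np

      def dice(a, b):
          # Dice similarity coefficient between two binary label volumes;
          # overlap error is often reported as 1 - Dice.
          a, b = a.astype(bool), b.astype(bool)
          inter = np.logical_and(a, b).sum()
          return 2.0 * inter / (a.sum() + b.sum())

      # Toy example: compare an "automatic" segmentation against a slightly
      # shifted "manual" one.
      grid = np.indices((64, 64, 64))
      auto   = ((grid - np.array([32, 32, 32])[:, None, None, None]) ** 2).sum(0) < 100
      manual = ((grid - np.array([33, 32, 31])[:, None, None, None]) ** 2).sum(0) < 100
      d = dice(auto, manual)
      print(f"Dice = {d:.3f}, overlap error = {1 - d:.3f}")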

  13. Frequency modulation television analysis: Threshold impulse analysis. [with computer program

    NASA Technical Reports Server (NTRS)

    Hodge, W. H.

    1973-01-01

    A computer program is developed to calculate the FM threshold impulse rates as a function of the carrier-to-noise ratio for a specified FM system. The system parameters and a vector of 1024 integers, representing the probability density of the modulating voltage, are required as input parameters. The computer program is utilized to calculate threshold impulse rates for twenty-four sets of measured probability data supplied by NASA and for sinusoidal and Gaussian modulating waveforms. As a result of the analysis several conclusions are drawn: (1) The use of preemphasis in an FM television system improves the threshold by reducing the impulse rate. (2) Sinusoidal modulation produces a total impulse rate which is a practical upper bound for the impulse rates of TV signals providing the same peak deviations. (3) As the moment of the FM spectrum about the center frequency of the predetection filter increases, the impulse rate tends to increase. (4) A spectrum having an expected frequency above (below) the center frequency of the predetection filter produces a higher negative (positive) than positive (negative) impulse rate.

  14. 3-Dimensional shear wave elastography of breast lesions

    PubMed Central

    Chen, Ya-ling; Chang, Cai; Zeng, Wei; Wang, Fen; Chen, Jia-jian; Qu, Ning

    2016-01-01

    Color patterns of 3-dimensional (3D) shear wave elastography (SWE) have recently emerged as a promising method for differentiating tumoral nodules. This study evaluated the diagnostic accuracy of 3D SWE color patterns in breast lesions, with special emphasis on coronal planes. A total of 198 consecutive women with 198 breast lesions (125 malignant and 73 benign) were included, who underwent conventional ultrasound (US), 3D B-mode, and 3D SWE before surgical excision. SWE color patterns of Views A (transverse), T (sagittal), and C (coronal) were determined. Sensitivity, specificity, and the area under the receiver operating characteristic curve (AUC) were calculated. The distribution of SWE color patterns differed significantly between malignant and benign lesions (P = 0.001). In malignant lesions, "Stiff Rim" was significantly more frequent in View C (crater sign, 60.8%) than in View A (51.2%, P = 0.013) and View T (54.1%, P = 0.035). The AUC for the combination of the "Crater Sign" and conventional US was significantly higher than for View A (0.929 vs 0.902, P = 0.004) and View T (0.929 vs 0.907, P = 0.009), and specificity significantly increased (90.4% vs 78.1%, P = 0.013) without a significant change in sensitivity (85.6% vs 88.0%, P = 0.664) compared with conventional US. In conclusion, combining conventional US with 3D SWE color patterns significantly increased diagnostic accuracy, with the "Crater Sign" in the coronal plane being of the highest value. PMID:27684820
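
    For readers wishing to reproduce this style of evaluation, the sketch below computes sensitivity, specificity, and a rank-based AUC from binary diagnoses and continuous scores; the data are hypothetical and the code is not from the study.

    ```python
    import numpy as np

    def sens_spec(pred, truth):
        """Sensitivity and specificity of binary predictions (1 = malignant)."""
        pred, truth = np.asarray(pred), np.asarray(truth)
        tp = np.sum((pred == 1) & (truth == 1))
        tn = np.sum((pred == 0) & (truth == 0))
        fn = np.sum((pred == 0) & (truth == 1))
        fp = np.sum((pred == 1) & (truth == 0))
        return tp / (tp + fn), tn / (tn + fp)

    def auc(scores, truth):
        """Rank-based AUC: probability a malignant case outscores a benign one."""
        scores, truth = np.asarray(scores, dtype=float), np.asarray(truth)
        pos, neg = scores[truth == 1], scores[truth == 0]
        wins = sum((pos > n).sum() + 0.5 * (pos == n).sum() for n in neg)
        return wins / (len(pos) * len(neg))

    # Hypothetical scores for 6 malignant (1) and 4 benign (0) lesions.
    truth = np.array([1, 1, 1, 1, 1, 1, 0, 0, 0, 0])
    scores = np.array([0.9, 0.8, 0.75, 0.6, 0.55, 0.4, 0.5, 0.3, 0.2, 0.1])
    print(sens_spec(scores > 0.45, truth), auc(scores, truth))
    ```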

  15. A new preclinical 3-dimensional agarose colony formation assay.

    PubMed

    Kajiwara, Yoshinori; Panchabhai, Sonali; Levin, Victor A

    2008-08-01

    The evaluation of new drug treatments and combination treatments for gliomas and other cancers requires a robust means to interrogate wide dose ranges and varying times of drug exposure without stain-inactivation of the cells (colonies). To this end, we developed a 3-dimensional (3D) colony formation assay that makes use of GelCount technology, a new cell colony counter for gels and soft agars. We used U251MG, SNB19, and LNZ308 glioma cell lines and MiaPaCa pancreas adenocarcinoma and SW480 colon adenocarcinoma cell lines. Colonies were grown in a two-tiered agarose that had 0.7% agarose on the bottom and 0.3% agarose on top. We then studied the effects of DFMO, carboplatin, and SAHA over a 3-log dose range and over multiple days of drug exposure. Using GelCount we approximated the area under the curve (AUC) of colony volumes as the sum of colony volumes (μm² × OD) in each plate to calculate IC50 values. Adenocarcinoma colonies were recognized by GelCount scanning at 3-4 days, while it took 6-7 days to detect glioma colonies. The growth rate of MiaPaCa and SW480 cells was rapid, with 100 colonies counted in 5-6 days; glioma cells grew more slowly, with 100 colonies counted in 9-10 days. Reliable log dose versus AUC curves were observed for all drugs studied. In conclusion, the GelCount method that we describe is more quantitative than traditional colony assays and allows precise study of drug effects with respect to both dose and time of exposure using fewer culture plates.
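
    Deriving IC50 values from summed colony volumes over a log dose range is typically done by fitting a sigmoidal dose-response curve. A minimal sketch using a four-parameter logistic (Hill) model is shown below, assuming scipy is available; the dose and volume values are hypothetical.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def hill(dose, top, bottom, ic50, slope):
        """Four-parameter logistic dose-response curve."""
        return bottom + (top - bottom) / (1.0 + (dose / ic50) ** slope)

    # Hypothetical data: summed colony volumes (the AUC surrogate) per plate
    # over a 3-log dose range.
    doses = np.array([0.01, 0.03, 0.1, 0.3, 1.0, 3.0, 10.0])
    volumes = np.array([980, 950, 870, 600, 310, 120, 60], dtype=float)

    params, _ = curve_fit(hill, doses, volumes,
                          p0=[volumes.max(), volumes.min(), 0.3, 1.0])
    print("estimated IC50:", params[2])
    ```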

  16. Computational analysis of an aortic valve jet

    NASA Astrophysics Data System (ADS)

    Shadden, Shawn C.; Astorino, Matteo; Gerbeau, Jean-Frédéric

    2009-11-01

    In this work we employ a coupled FSI scheme using an immersed boundary method to simulate flow through a realistic deformable, 3D aortic valve model. The resulting data were used to compute Lagrangian coherent structures (LCS), which revealed flow separation from the valve leaflets during systole, and correspondingly, the boundary between the jet of ejected fluid and the regions of separated, recirculating flow. The advantages of computing LCS in multi-dimensional FSI models of the aortic valve are twofold. First, the quality and effectiveness of existing clinical indices used to measure aortic jet size can be tested by taking advantage of the accurate measure of the jet area derived from LCS. Second, as an ultimate goal, a reliable computational framework for the assessment of aortic valve stenosis could be developed.

  17. Global detailed geoid computation and model analysis

    NASA Technical Reports Server (NTRS)

    Marsh, J. G.; Vincent, S.

    1974-01-01

    Comparisons and analyses were carried out through the use of detailed gravimetric geoids which we have computed by combining models with a set of 26,000 1 deg x 1 deg mean free air gravity anomalies. The accuracy of the detailed gravimetric geoid computed using the most recent Goddard earth model (GEM-6) in conjunction with the set of 1 deg x 1 deg mean free air gravity anomalies is assessed at ±2 meters on the continents of North America, Europe, and Australia, 2 to 5 meters in the Northeast Pacific and North Atlantic areas, and 5 to 10 meters in other areas where surface gravity data are sparse. The RMS differences between this detailed geoid and the detailed geoids computed using the other satellite gravity fields in conjunction with the same set of surface data range from 3 to 7 meters.

  18. An emulator for minimizing computer resources for finite element analysis

    NASA Technical Reports Server (NTRS)

    Melosh, R.; Utku, S.; Islam, M.; Salama, M.

    1984-01-01

    A computer code, SCOPE, has been developed for predicting the computer resources required for a given analysis code, computer hardware, and structural problem. The cost of running the code is a small fraction (about 3 percent) of the cost of performing the actual analysis. However, its accuracy in predicting the CPU and I/O resources depends intrinsically on the accuracy of calibration data that must be developed once for the computer hardware and the finite element analysis code of interest. Testing of the SCOPE code on the AMDAHL 470 V/8 computer and the ELAS finite element analysis program indicated small I/O errors (3.2 percent), larger CPU errors (17.8 percent), and negligible total errors (1.5 percent).

  19. Reproducibility of computational workflows is automated using continuous analysis.

    PubMed

    Beaulieu-Jones, Brett K; Greene, Casey S

    2017-03-13

    Replication, validation and extension of experiments are crucial for scientific progress. Computational experiments are scriptable and should be easy to reproduce. However, computational analyses are designed and run in a specific computing environment, which may be difficult or impossible to match using written instructions. We report the development of continuous analysis, a workflow that enables reproducible computational analyses. Continuous analysis combines Docker, a container technology akin to virtual machines, with continuous integration, a software development technique, to automatically rerun a computational analysis whenever updates or improvements are made to source code or data. This enables researchers to reproduce results without contacting the study authors. Continuous analysis allows reviewers, editors or readers to verify reproducibility without manually downloading and rerunning code and can provide an audit trail for analyses of data that cannot be shared.
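
    A minimal sketch of the rerun step is shown below, assuming Docker is installed locally; the image tag and entry-point script are hypothetical placeholders, and the paper's actual continuous-integration configuration is not reproduced here.

    ```python
    import os
    import subprocess

    IMAGE = "my-analysis:latest"   # hypothetical image tag
    SCRIPT = "run_analysis.sh"     # hypothetical entry point inside the repository

    def rerun_analysis():
        """Rebuild the container image and rerun the analysis, as a CI job would."""
        # Build the image from the repository's Dockerfile, pinning the environment.
        subprocess.run(["docker", "build", "-t", IMAGE, "."], check=True)
        # Run the analysis inside the container; results are written to ./results.
        results = os.path.join(os.getcwd(), "results")
        subprocess.run(["docker", "run", "--rm", "-v", f"{results}:/results",
                        IMAGE, SCRIPT], check=True)

    if __name__ == "__main__":
        rerun_analysis()
    ```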

  20. Computer Aided Modeling and Post Processing with NASTRAN Analysis

    NASA Technical Reports Server (NTRS)

    Boroughs, R. R.

    1984-01-01

    Computer aided engineering systems are invaluable tools in performing NASTRAN finite element analysis. These techniques are implemented in both the pre-processing and post-processing phases of the NASTRAN analysis. The finite element model development, or pre-processing phase, was automated with a computer aided modeling program called Supertabl, and the review and interpretation of the results of the NASTRAN analysis, or post-processing phase, was automated with a computer aided plotting program called Output Display. An intermediate program, Nasplot, which was developed in-house, has also helped to cut down on the model checkout time and reduce errors in the model. An interface has been established between the finite element computer aided engineering system and the Learjet computer aided design system whereby data can be transferred back and forth between the two. These systems have significantly improved productivity and the ability to perform NASTRAN analysis in response to product development requests.

  1. Accuracy Evaluation of a 3-Dimensional Surface Imaging System for Guidance in Deep-Inspiration Breath-Hold Radiation Therapy

    SciTech Connect

    Alderliesten, Tanja; Sonke, Jan-Jakob; Betgen, Anja; Honnef, Joeri; Vliet-Vroegindeweij, Corine van; Remeijer, Peter

    2013-02-01

    Purpose: To investigate the applicability of 3-dimensional (3D) surface imaging for image guidance in deep-inspiration breath-hold radiation therapy (DIBH-RT) for patients with left-sided breast cancer. For this purpose, setup data based on captured 3D surfaces was compared with setup data based on cone beam computed tomography (CBCT). Methods and Materials: Twenty patients treated with DIBH-RT after breast-conserving surgery (BCS) were included. Before the start of treatment, each patient underwent a breath-hold CT scan for planning purposes. During treatment, dose delivery was preceded by setup verification using CBCT of the left breast. 3D surfaces were captured by a surface imaging system concurrently with the CBCT scan. Retrospectively, surface registrations were performed for CBCT to CT and for a captured 3D surface to CT. The resulting setup errors were compared with linear regression analysis. For the differences between setup errors, the group mean, systematic error, random error, and 95% limits of agreement were calculated. Furthermore, receiver operating characteristic (ROC) analysis was performed. Results: Good correlation between setup errors was found: R² = 0.70, 0.90, and 0.82 in the left-right, craniocaudal, and anterior-posterior directions, respectively. Systematic errors were ≤0.17 cm in all directions. Random errors were ≤0.15 cm. The limits of agreement were -0.34 to 0.48, -0.42 to 0.39, and -0.52 to 0.23 cm in the left-right, craniocaudal, and anterior-posterior directions, respectively. ROC analysis showed that a threshold between 0.4 and 0.8 cm corresponds to promising true positive rates (0.78-0.95) and false positive rates (0.12-0.28). Conclusions: The results support the application of 3D surface imaging for image guidance in DIBH-RT after BCS.
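
    The agreement statistics quoted here follow the usual Bland-Altman recipe: for the per-fraction differences between the two setup measurements, the mean difference estimates the systematic error, the standard deviation the random error, and the 95% limits of agreement are mean ± 1.96 SD. A small sketch with hypothetical data:

    ```python
    import numpy as np

    def bland_altman(surface_setup, cbct_setup):
        """Agreement statistics for one direction (e.g., craniocaudal), in cm."""
        d = np.asarray(surface_setup) - np.asarray(cbct_setup)
        mean_diff = d.mean()          # systematic error (group mean difference)
        sd_diff = d.std(ddof=1)       # random error
        loa = (mean_diff - 1.96 * sd_diff, mean_diff + 1.96 * sd_diff)
        return mean_diff, sd_diff, loa

    # Hypothetical setup errors (cm) from the two modalities.
    surface = np.array([0.12, -0.05, 0.30, 0.08, -0.11])
    cbct = np.array([0.10, -0.02, 0.25, 0.15, -0.05])
    print(bland_altman(surface, cbct))
    ```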

  2. Thermoelectric pump performance analysis computer code

    NASA Technical Reports Server (NTRS)

    Johnson, J. L.

    1973-01-01

    A computer program is presented that was used to analyze and design dual-throat electromagnetic dc conduction pumps for the 5-kwe ZrH reactor thermoelectric system. In addition to a listing of the code and corresponding identification of symbols, the bases for this analytical model are provided.

  3. Computer-assisted photometric microplate analysis.

    PubMed

    Hörer, O L; Pop, D A

    1987-01-01

    The main algorithm of computer-assisted absorption and emission photometry of samples on a microplate is presented. The software can be used for enzyme immunoassays (EIA) and other virological tests. The performance of an SPF-500 (Aminco) spectrofluorometer/Felix M18 microcomputer system is discussed on the basis of results obtained using the implemented programs.

  4. Computed Tomography Analysis of NASA BSTRA Balls

    SciTech Connect

    Perry, R L; Schneberk, D J; Thompson, R R

    2004-10-12

    Fifteen 1.25 inch BSTRA balls were scanned with the high energy computed tomography system at LLNL. This system has a resolution limit of approximately 210 microns. A threshold of 238 microns (two voxels) was used, and no anomalies at or greater than this were observed.

  5. Conversation Analysis of Computer-Mediated Communication

    ERIC Educational Resources Information Center

    Gonzalez-Lloret, Marta

    2011-01-01

    The potential of computer-mediated communication (CMC) for language learning resides mainly in the possibility that learners have to engage with other speakers of the language, including L1 speakers. The inclusion of CMC in the L2 classroom provides an opportunity for students to utilize authentic language in real interaction, rather than the more…

  6. Computational thermo-fluid analysis of a disk brake

    NASA Astrophysics Data System (ADS)

    Takizawa, Kenji; Tezduyar, Tayfun E.; Kuraishi, Takashi; Tabata, Shinichiro; Takagi, Hirokazu

    2016-06-01

    We present computational thermo-fluid analysis of a disk brake, including thermo-fluid analysis of the flow around the brake and heat conduction analysis of the disk. The computational challenges include proper representation of the small-scale thermo-fluid behavior, high-resolution representation of the thermo-fluid boundary layers near the spinning solid surfaces, and bringing the heat transfer coefficient (HTC) calculated in the thermo-fluid analysis of the flow to the heat conduction analysis of the spinning disk. The disk brake model used in the analysis closely represents the actual configuration, and this adds to the computational challenges. The components of the method we have developed for computational analysis of the class of problems with these types of challenges include the Space-Time Variational Multiscale method for coupled incompressible flow and thermal transport, the ST Slip Interface method for high-resolution representation of the thermo-fluid boundary layers near spinning solid surfaces, and a set of projection methods for different parts of the disk to bring in the HTC calculated in the thermo-fluid analysis. With the HTC coming from the thermo-fluid analysis of the flow around the brake, we perform the heat conduction analysis of the disk, from the start of braking until the disk stops spinning, demonstrating how the method developed works in computational analysis of this complex and challenging problem.

  7. Computer-based image analysis in breast pathology

    PubMed Central

    Gandomkar, Ziba; Brennan, Patrick C.; Mello-Thoms, Claudia

    2016-01-01

    Whole slide imaging (WSI) has the potential to be utilized in telepathology, teleconsultation, quality assurance, clinical education, and digital image analysis to aid pathologists. In this paper, the potential added benefits of computer-assisted image analysis in breast pathology are reviewed and discussed. One of the major advantages of WSI systems is the possibility of doing computer-based image analysis on the digital slides. The purpose of computer-assisted analysis of breast virtual slides can be (i) segmentation of desired regions or objects such as diagnostically relevant areas, epithelial nuclei, lymphocyte cells, tubules, and mitotic figures, (ii) classification of breast slides based on breast cancer (BCa) grades, the invasive potential of tumors, or cancer subtypes, (iii) prognosis of BCa, or (iv) immunohistochemical quantification. While encouraging results have been achieved in this area, further progress is still required to make computer-based image analysis of breast virtual slides acceptable for clinical practice. PMID:28066683

  8. Computer applications for engineering/structural analysis. Revision 1

    SciTech Connect

    Zaslawsky, M.; Samaddar, S.K.

    1991-12-31

    Analysts and organizations have a tendency to lock themselves into specific codes with the obvious consequences of not addressing the real problem and thus reaching the wrong conclusion. This paper discusses the role of the analyst in selecting computer codes. The participation and support of a computation division in modifying the source program, configuration management, and pre- and post-processing of codes are among the subjects discussed. Specific examples illustrating the computer code selection process are described in the following problem areas: soil structure interaction, structural analysis of nuclear reactors, analysis of waste tanks where fluid structure interaction is important, analysis of equipment, structure-structure interaction, analysis of the operation of the superconductor supercollider which includes friction and transient temperature, and 3D analysis of the 10-meter telescope being built in Hawaii. Validation and verification of computer codes and their impact on the selection process are also discussed.

  9. Hybrid soft computing systems for electromyographic signals analysis: a review.

    PubMed

    Xie, Hong-Bo; Guo, Tianruo; Bai, Siwei; Dokos, Socrates

    2014-02-03

    The electromyographic (EMG) signal is a bio-signal collected from human skeletal muscle. Analysis of EMG signals has been widely used to detect human movement intent, control various human-machine interfaces, diagnose neuromuscular diseases, and model the neuromusculoskeletal system. With the advances of artificial intelligence and soft computing, many sophisticated techniques have been proposed for these purposes. Hybrid soft computing systems (HSCS), which integrate these different techniques, aim to further improve the effectiveness, efficiency, and accuracy of EMG analysis. This paper reviews and compares key combinations of neural networks, support vector machines, fuzzy logic, evolutionary computing, and swarm intelligence for EMG analysis. Our suggestions on the possible future development of HSCS in EMG analysis are also given in terms of basic soft computing techniques, further combinations of these techniques, and their other applications in EMG analysis.

  10. Hybrid soft computing systems for electromyographic signals analysis: a review

    PubMed Central

    2014-01-01

    The electromyographic (EMG) signal is a bio-signal collected from human skeletal muscle. Analysis of EMG signals has been widely used to detect human movement intent, control various human-machine interfaces, diagnose neuromuscular diseases, and model the neuromusculoskeletal system. With the advances of artificial intelligence and soft computing, many sophisticated techniques have been proposed for these purposes. Hybrid soft computing systems (HSCS), which integrate these different techniques, aim to further improve the effectiveness, efficiency, and accuracy of EMG analysis. This paper reviews and compares key combinations of neural networks, support vector machines, fuzzy logic, evolutionary computing, and swarm intelligence for EMG analysis. Our suggestions on the possible future development of HSCS in EMG analysis are also given in terms of basic soft computing techniques, further combinations of these techniques, and their other applications in EMG analysis. PMID:24490979

  11. Computer aided radiation analysis for manned spacecraft

    NASA Technical Reports Server (NTRS)

    Appleby, Matthew H.; Griffin, Brand N.; Tanner, Ernest R., II; Pogue, William R.; Golightly, Michael J.

    1991-01-01

    In order to assist in the design of radiation shielding, an analytical tool is presented that can be employed in combination with CAD facilities and NASA transport codes. The nature of radiation in space is described, and the operational requirements for protection are listed as background information for the use of the technique. The method is based on the Boeing radiation exposure model (BREM) for combining NASA radiation transport codes and CAD facilities, and the output is given as contour maps of the radiation-shield distribution so that dangerous areas can be identified. Computational models are used to solve the 1D Boltzmann transport equation and determine the shielding needs for the worst-case scenario. BREM can be employed directly with the radiation computations to assess radiation protection during all phases of design, saving time and ultimately spacecraft weight.

  12. Research Activity in Computational Physics utilizing High Performance Computing: Co-authorship Network Analysis

    NASA Astrophysics Data System (ADS)

    Ahn, Sul-Ah; Jung, Youngim

    2016-10-01

    The research activities of computational physicists utilizing high-performance computing are analyzed by bibliometric approaches. This study aims at providing computational physicists utilizing high-performance computing, as well as policy planners, with useful bibliometric results for an assessment of research activities. To achieve this purpose, we carried out a co-authorship network analysis of journal articles to assess the research activities of researchers in high-performance computational physics as a case study. For this study, we used journal articles from Elsevier's Scopus database covering the period 2004-2013. We ranked authors in the physics field utilizing high-performance computing by the number of papers published during the ten years from 2004. Finally, we drew the co-authorship network for the 45 top authors and their coauthors, and described some features of the co-authorship network in relation to author rank. Suggestions for further studies are discussed.
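
    As a generic illustration of the approach (not the authors' pipeline), a co-authorship network can be assembled from per-paper author lists with networkx, counting shared papers as edge weights; the records below are hypothetical:

    ```python
    from itertools import combinations
    import networkx as nx

    # Hypothetical records: each entry is the author list of one paper.
    papers = [
        ["Kim, J", "Lee, S", "Park, H"],
        ["Kim, J", "Park, H"],
        ["Lee, S", "Cho, Y"],
    ]

    G = nx.Graph()
    for authors in papers:
        for a, b in combinations(sorted(set(authors)), 2):
            # Increment the edge weight for each co-authored paper.
            w = G.get_edge_data(a, b, default={"weight": 0})["weight"]
            G.add_edge(a, b, weight=w + 1)

    # Rank authors by number of distinct coauthors (degree).
    print(sorted(G.degree, key=lambda kv: kv[1], reverse=True))
    ```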

  13. Computer content analysis of the Schreber case.

    PubMed

    O'Dell, J W; Weideman, D

    1993-01-01

    The text of Schreber's Memoirs of My Nervous Illness was analyzed by computer at the level of the individual word. These words then were grouped into 17 rational categories, and the categories were checked for reliability. The contents of Schreber's work then were compared with three other documents. In general, the Memoirs showed much greater delusional content than the other documents. Interestingly, sexual matters did not appear to be Schreber's principal problem at this atomistic level.

  14. Analysis of a Multiprocessor Guidance Computer. Ph.D. Thesis

    NASA Technical Reports Server (NTRS)

    Maltach, E. G.

    1969-01-01

    The design of the next generation of spaceborne digital computers is described. It analyzes a possible multiprocessor computer configuration. For the analysis, a set of representative space computing tasks was abstracted from the Lunar Module Guidance Computer programs as executed during the lunar landing, from the Apollo program. This computer performs at this time about 24 concurrent functions, with iteration rates from 10 times per second to once every two seconds. These jobs were tabulated in a machine-independent form, and statistics of the overall job set were obtained. It was concluded, based on a comparison of simulation and Markov results, that the Markov process analysis is accurate in predicting overall trends and in configuration comparisons, but does not provide useful detailed information in specific situations. Using both types of analysis, it was determined that the job scheduling function is a critical one for efficiency of the multiprocessor. It is recommended that research into the area of automatic job scheduling be performed.

  15. Network Analysis and Knowledge Discovery Through DNA Computing

    DTIC Science & Technology

    2006-07-01

    The application of computational mathematics and information science to biology has aided in the understanding of biological systems. Today, biology can aid information science. This research activity addresses this new and potentially symbiotic relationship between biology and information. In this report, a biocomputational analysis of a biologically represented network is demonstrated. A report on new DNA aqueous laboratory computing techniques is also given.

  16. Computer-Based Interaction Analysis with DEGREE Revisited

    ERIC Educational Resources Information Center

    Barros, B.; Verdejo, M. F.

    2016-01-01

    We review our research with "DEGREE" and analyse how our work has impacted the collaborative learning community since 2000. Our research is framed within the context of computer-based interaction analysis and the development of computer-supported collaborative learning (CSCL) tools. We identify some aspects of our work which have been…

  17. The symbolic computation and automatic analysis of trajectories

    NASA Technical Reports Server (NTRS)

    Grossman, Robert

    1991-01-01

    Research focused on the computation of trajectories of dynamical systems, especially control systems. Algorithms were further developed for rewriting expressions involving differential operators. The differential operators involved arise in the local analysis of nonlinear control systems. An initial design was completed of the system architecture for software to analyze nonlinear control systems using database computing.

  18. Potential applications of computational fluid dynamics to biofluid analysis

    NASA Technical Reports Server (NTRS)

    Kwak, D.; Chang, J. L. C.; Rogers, S. E.; Rosenfeld, M.

    1988-01-01

    Computational fluid dynamics has developed to the stage where it has become an indispensable part of aerospace research and design. In view of the advances made in aerospace applications, the computational approach can also be used for biofluid mechanics research. Several flow simulation methods developed for aerospace problems are briefly discussed for potential application to biofluids, especially to blood flow analysis.

  19. The Reliability of Content Analysis of Computer Conference Communication

    ERIC Educational Resources Information Center

    Rattleff, Pernille

    2007-01-01

    The focus of this article is the reliability of content analysis of students' computer conference communication. Content analysis is often used when researching the relationship between learning and the use of information and communications technology in educational settings. A number of studies where content analysis is used and classification…

  20. Photoprotection by pistachio bioactives in a 3-dimensional human skin equivalent tissue model.

    PubMed

    Chen, C-Y Oliver; Smith, Avi; Liu, Yuntao; Du, Peng; Blumberg, Jeffrey B; Garlick, Jonathan

    2017-01-25

    Reactive oxygen species (ROS) generated during ultraviolet (UV) light exposure can induce skin damage and aging. Antioxidants can provide protection against oxidative injury to skin via "quenching" ROS. Using a validated 3-dimensional (3D) human skin equivalent (HSE) tissue model that closely mimics human skin, we examined whether pistachio antioxidants could protect HSE against UVA-induced damage. Lutein and γ-tocopherol are the predominant lipophilic antioxidants in pistachios; treatment with these compounds prior to UVA exposure protected against morphological changes to the epithelial and connective tissue compartments of HSE. Pistachio antioxidants preserved overall skin thickness and organization, as well as fibroblast morphology, in HSE exposed to UVA irradiation. However, this protection was not substantiated by the analysis of the proliferation of keratinocytes and apoptosis of fibroblasts. Additional studies are warranted to elucidate the basis of these discordant results and extend research into the potential role of pistachio bioactives promoting skin health.

  1. Multivariate analysis: A statistical approach for computations

    NASA Astrophysics Data System (ADS)

    Michu, Sachin; Kaushik, Vandana

    2014-10-01

    Multivariate analysis is a statistical approach commonly used in automotive diagnosis, education, cluster evaluation in finance, and, more recently, the health-related professions. The objective of this paper is to provide a detailed exploratory discussion of factor analysis (FA) in image retrieval and correlation analysis (CA) of network traffic. Image retrieval methods aim to retrieve relevant images from a collected database, based on their content. The problem is made more difficult by the high dimension of the variable space in which the images are represented. Multivariate correlation analysis proposes an anomaly detection and analysis method based on the correlation coefficient matrix. Anomalous behaviors in the network include various attacks on the network, such as DDoS attacks and network scanning.
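
    As a rough sketch of the correlation-matrix idea (a generic illustration, not the paper's exact method), the correlation coefficient matrix of a traffic window can be compared against a baseline matrix, flagging windows whose deviation exceeds a threshold:

    ```python
    import numpy as np

    def correlation_deviation(window, baseline_corr):
        """Frobenius-norm distance between a window's correlation matrix
        and a baseline correlation matrix of the same traffic features."""
        corr = np.corrcoef(window, rowvar=False)  # features in columns
        return np.linalg.norm(corr - baseline_corr)

    # Hypothetical traffic: rows = time samples, columns = features
    # (e.g., packet rate, byte rate, connection count).
    rng = np.random.default_rng(0)
    normal = rng.normal(size=(500, 3))
    baseline = np.corrcoef(normal, rowvar=False)

    window = rng.normal(size=(50, 3))
    window[:, 1] = window[:, 0] * 5.0  # injected anomaly: features lock together
    if correlation_deviation(window, baseline) > 1.0:  # threshold is illustrative
        print("anomalous window")
    ```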

  2. Computational Modeling, Formal Analysis, and Tools for Systems Biology

    PubMed Central

    Bartocci, Ezio; Lió, Pietro

    2016-01-01

    As the amount of biological data in the public domain grows, so does the range of modeling and analysis techniques employed in systems biology. In recent years, a number of theoretical computer science developments have enabled modeling methodology to keep pace. The growing interest in systems biology in executable models and their analysis has necessitated the borrowing of terms and methods from computer science, such as formal analysis, model checking, static analysis, and runtime verification. Here, we discuss the most important and exciting computational methods and tools currently available to systems biologists. We believe that a deeper understanding of the concepts and theory highlighted in this review will produce better software practice, improved investigation of complex biological processes, and even new ideas and better feedback into computer science. PMID:26795950

  3. Surveillance Analysis Computer System (SACS) software requirements specification (SRS)

    SciTech Connect

    Glasscock, J.A.; Flanagan, M.J.

    1995-09-01

    This document is the primary document establishing requirements for the Surveillance Analysis Computer System (SACS) Database, an Impact Level 3Q system. The purpose is to provide the customer and the performing organization with the requirements for the SACS Project.

  4. Computational fluid dynamics combustion analysis evaluation

    NASA Technical Reports Server (NTRS)

    Kim, Y. M.; Shang, H. M.; Chen, C. P.; Ziebarth, J. P.

    1992-01-01

    This study involves the development of numerical modelling of spray combustion. These modelling efforts are mainly motivated by the need to improve computational efficiency in the stochastic particle tracking method and to incorporate physical submodels of turbulence, combustion, vaporization, and dense spray effects. The present mathematical formulation and numerical methodologies can be cast in any time-marching pressure correction methodology (PCM), such as the FDNS and MAST codes. A sequence of validation cases involving steady burning sprays and transient evaporating sprays is included.

  5. Analysis and computer tools for separation processes involving nonideal mixtures

    SciTech Connect

    Lucia, A.

    1992-05-01

    The objectives of this research were to further both the theoretical understanding of and the development of computer tools (algorithms) for separation processes involving nonideal mixtures. These objectives were divided into three interrelated major areas: the mathematical analysis of the number of steady-state solutions to multistage separation processes, the numerical analysis of general, related fixed-point methods, and the development and implementation of computer tools for process simulation.

  6. Process for computing geometric perturbations for probabilistic analysis

    DOEpatents

    Fitch, Simeon H. K. [Charlottesville, VA; Riha, David S [San Antonio, TX; Thacker, Ben H [San Antonio, TX

    2012-04-10

    A method for computing geometric perturbations for probabilistic analysis. The probabilistic analysis is based on finite element modeling, in which uncertainties in the modeled system are represented by changes in the nominal geometry of the model, referred to as "perturbations". These changes are accomplished using displacement vectors, which are computed for each node of a region of interest and are based on mean-value coordinate calculations.

  7. Computational analysis of deep brain stimulation.

    PubMed

    McIntyre, Cameron C; Miocinovic, Svjetlana; Butson, Christopher R

    2007-09-01

    Chronic, high-frequency electrical stimulation of subcortical brain structures (deep brain stimulation [DBS]) is an effective clinical treatment for several medically refractory neurological disorders. However, the clinical successes of DBS are tempered by the limited understanding of the response of neurons to applied electric fields and scientific definition of the therapeutic mechanisms of DBS remains elusive. In addition, it is presently unclear which electrode designs and stimulation parameters are optimal for maximum therapeutic benefit and minimal side effects. Detailed computer modeling of DBS has recently emerged as a powerful technique to enhance our understanding of the effects of DBS and to create a virtual testing ground for new stimulation paradigms. This review summarizes the fundamentals of neurostimulation modeling and provides an overview of some of the scientific contributions of computer models to the field of DBS. We then provide a prospective view on the application of DBS-modeling tools to augment the clinical utility of DBS and to design the next generation of DBS technology.

  8. Computer analysis of digital well logs

    USGS Publications Warehouse

    Scott, James H.

    1984-01-01

    A comprehensive system of computer programs has been developed by the U.S. Geological Survey for analyzing digital well logs. The programs are operational on a minicomputer in a research well-logging truck, making it possible to analyze and replot the logs while at the field site. The minicomputer also serves as a controller of digitizers, counters, and recorders during acquisition of well logs. The analytical programs are coordinated with the data acquisition programs in a flexible system that allows the operator to make changes quickly and easily in program variables such as calibration coefficients, measurement units, and plotting scales. The programs are designed to analyze the following well-logging measurements: natural gamma-ray, neutron-neutron, dual-detector density with caliper, magnetic susceptibility, single-point resistance, self potential, resistivity (normal and Wenner configurations), induced polarization, temperature, sonic delta-t, and sonic amplitude. The computer programs are designed to make basic corrections for depth displacements, tool response characteristics, hole diameter, and borehole fluid effects (when applicable). Corrected well-log measurements are output to magnetic tape or plotter with measurement units transformed to petrophysical and chemical units of interest, such as grade of uranium mineralization in percent eU3O8, neutron porosity index in percent, and sonic velocity in kilometers per second.

  9. RSAC-6 Radiological Safety Analysis Computer Program

    SciTech Connect

    Schrader, Bradley J; Wenzel, Douglas Rudolph

    2001-06-01

    RSAC-6 is the latest version of the RSAC program. It calculates the consequences of a release of radionuclides to the atmosphere. Using a personal computer, a user can generate a fission product inventory; decay and in-grow the inventory during transport through processes, facilities, and the environment; model the downwind dispersion of the activity; and calculate doses to downwind individuals. Internal dose from the inhalation and ingestion pathways is calculated. External dose from ground surface and plume gamma pathways is calculated. New and exciting updates to the program include the ability to evaluate a release to an enclosed room, resuspension of deposited activity and evaluation of a release up to 1 meter from the release point. Enhanced tools are included for dry deposition, building wake, occupancy factors, respirable fraction, AMAD adjustment, updated and enhanced radionuclide inventory and inclusion of the dose-conversion factors from FGR 11 and 12.
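
    The decay and in-growth step that RSAC performs can be illustrated with the textbook two-member Bateman solution for a parent-daughter pair; this is a generic sketch, not RSAC's implementation, and the half-lives below are hypothetical:

    ```python
    import numpy as np

    def parent_daughter(n1_0, lam1, lam2, t):
        """Bateman solution: atoms of parent and daughter at time t,
        starting from n1_0 parent atoms and no daughter."""
        n1 = n1_0 * np.exp(-lam1 * t)
        n2 = n1_0 * lam1 / (lam2 - lam1) * (np.exp(-lam1 * t) - np.exp(-lam2 * t))
        return n1, n2

    # Hypothetical pair: half-lives of 8 days (parent) and 2 hours (daughter).
    lam1 = np.log(2) / (8 * 86400.0)
    lam2 = np.log(2) / (2 * 3600.0)
    t = np.linspace(0.0, 30 * 86400.0, 5)
    print(parent_daughter(1e20, lam1, lam2, t))
    ```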

  10. System Matrix Analysis for Computed Tomography Imaging.

    PubMed

    Flores, Liubov; Vidal, Vicent; Verdú, Gumersindo

    2015-01-01

    In practical applications of computed tomography imaging (CT), it is often the case that the set of projection data is incomplete owing to the physical conditions of the data acquisition process. On the other hand, the high radiation dose imposed on patients is also undesired. These issues demand that high quality CT images can be reconstructed from limited projection data. For this reason, iterative methods of image reconstruction have become a topic of increased research interest. Several algorithms have been proposed for few-view CT. We consider that the accurate solution of the reconstruction problem also depends on the system matrix that simulates the scanning process. In this work, we analyze the application of the Siddon method to generate elements of the matrix and we present results based on real projection data.
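
    The Siddon method parameterizes each ray and accumulates its intersection lengths with the pixel grid; those lengths form the nonzero entries of one system-matrix row. A simplified 2D sketch is given below (hypothetical geometry; edge cases such as rays grazing pixel corners are not treated exhaustively):

    ```python
    import numpy as np

    def siddon_2d(p1, p2, nx, ny, dx, dy, x0, y0):
        """Intersection lengths of ray p1->p2 with an nx-by-ny pixel grid.
        Returns a list of ((ix, iy), length) pairs: one system-matrix row."""
        p1, p2 = np.asarray(p1, float), np.asarray(p2, float)
        d = p2 - p1
        alphas = [0.0, 1.0]
        # Parametric crossings with vertical and horizontal grid lines.
        if d[0] != 0.0:
            alphas += [(x0 + i * dx - p1[0]) / d[0] for i in range(nx + 1)]
        if d[1] != 0.0:
            alphas += [(y0 + j * dy - p1[1]) / d[1] for j in range(ny + 1)]
        alphas = np.unique([a for a in alphas if 0.0 <= a <= 1.0])
        length = np.linalg.norm(d)
        row = []
        for a0, a1 in zip(alphas[:-1], alphas[1:]):
            mid = p1 + 0.5 * (a0 + a1) * d          # midpoint of this segment
            ix = int(np.floor((mid[0] - x0) / dx))  # pixel containing the midpoint
            iy = int(np.floor((mid[1] - y0) / dy))
            if 0 <= ix < nx and 0 <= iy < ny:
                row.append(((ix, iy), (a1 - a0) * length))
        return row

    # Hypothetical 4x4 grid of unit pixels crossed by a diagonal ray.
    print(siddon_2d((-1.0, -1.0), (5.0, 5.0), 4, 4, 1.0, 1.0, 0.0, 0.0))
    ```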

  11. Computational analysis of small RNA cloning data.

    PubMed

    Berninger, Philipp; Gaidatzis, Dimos; van Nimwegen, Erik; Zavolan, Mihaela

    2008-01-01

    Cloning and sequencing is the method of choice for small regulatory RNA identification. Using deep sequencing technologies one can now obtain up to a billion nucleotides--and tens of millions of small RNAs--from a single library. Careful computational analyses of such libraries enabled the discovery of miRNAs, rasiRNAs, piRNAs, and 21U RNAs. Given the large number of sequences that can be obtained from each individual sample, deep sequencing may soon become an alternative to oligonucleotide microarray technology for mRNA expression profiling. In this report we present the methods that we developed for the annotation and expression profiling of small RNAs obtained through large-scale sequencing. These include a fast algorithm for finding nearly perfect matches of small RNAs in sequence databases, a web-accessible software system for the annotation of small RNA libraries, and a Bayesian method for comparing small RNA expression across samples.
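
    Fast near-perfect matching of short reads is commonly built on a pigeonhole seed index: a read with at most one mismatch must match the database exactly in at least one of its two halves. The following is a generic sketch of that idea, not the authors' algorithm:

    ```python
    from collections import defaultdict

    def build_index(db_seq, k):
        """Map every k-mer of the database sequence to its start positions."""
        index = defaultdict(list)
        for i in range(len(db_seq) - k + 1):
            index[db_seq[i:i + k]].append(i)
        return index

    def find_near_matches(read, db_seq, index, k, max_mm=1):
        """Occurrences of `read` in `db_seq` with at most `max_mm` mismatches.
        Pigeonhole: with one mismatch, the first or second half is exact,
        so a k-mer seed taken from each half (k <= len(read)//2) must hit."""
        hits = set()
        half = len(read) // 2
        for offset, seed in ((0, read[:k]), (half, read[half:half + k])):
            for pos in index.get(seed, []):
                start = pos - offset
                if start < 0 or start + len(read) > len(db_seq):
                    continue
                mm = sum(a != b for a, b in zip(read, db_seq[start:start + len(read)]))
                if mm <= max_mm:
                    hits.add(start)
        return sorted(hits)

    db = "ACGTACGTTAGCACGTTTGA"
    idx = build_index(db, k=4)
    print(find_near_matches("ACGTTAGC", db, idx, k=4))
    ```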

  12. Computational analysis of maltose binding protein translocation

    NASA Astrophysics Data System (ADS)

    Chinappi, Mauro; Cecconi, Fabio; Massimo Casciola, Carlo

    2011-05-01

    We propose a computational model for the study of maltose binding protein translocation across α-hemolysin nanopores. The phenomenological approach simplifies both the pore and the polypeptide chain; however, it retains the basic structural protein-like properties of the maltose binding protein by promoting the correct formation of its native key interactions. By considering different observables characterising the channel blockade and molecule transport, we verified that MD simulations qualitatively reproduce the behaviour observed in a recent experiment. Simulations reveal that blockade events consist of a capture stage, to some extent related to the unfolding kinetics, and a single-file translocation process in the channel. A threshold mechanism underlies activation of the process, with a critical force that depends on the protein denaturation state. Finally, our results support the simple interpretation of translocation via first-passage statistics of a driven diffusion process of a single reaction coordinate.
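
    The first-passage picture mentioned in the conclusion can be made concrete with a small Euler-Maruyama simulation of driven diffusion along one reaction coordinate; the parameter values below are purely illustrative:

    ```python
    import numpy as np

    def first_passage_times(force, diffusion, barrier, n_paths=200,
                            dt=1e-4, rng=None):
        """Euler-Maruyama integration of dx = F dt + sqrt(2D) dW until
        x >= barrier; returns first-passage times for an ensemble of paths."""
        rng = rng or np.random.default_rng(0)
        times = np.empty(n_paths)
        for k in range(n_paths):
            x, t = 0.0, 0.0
            while x < barrier:
                x += force * dt + np.sqrt(2.0 * diffusion * dt) * rng.normal()
                t += dt
            times[k] = t
        return times

    # Illustrative parameters: drift above the activation threshold.
    fpt = first_passage_times(force=5.0, diffusion=1.0, barrier=1.0)
    print("mean first-passage time:", fpt.mean())
    ```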

  13. System Matrix Analysis for Computed Tomography Imaging

    PubMed Central

    Flores, Liubov; Vidal, Vicent; Verdú, Gumersindo

    2015-01-01

    In practical applications of computed tomography imaging (CT), it is often the case that the set of projection data is incomplete owing to the physical conditions of the data acquisition process. On the other hand, the high radiation dose imposed on patients is also undesired. These issues demand that high quality CT images can be reconstructed from limited projection data. For this reason, iterative methods of image reconstruction have become a topic of increased research interest. Several algorithms have been proposed for few-view CT. We consider that the accurate solution of the reconstruction problem also depends on the system matrix that simulates the scanning process. In this work, we analyze the application of the Siddon method to generate elements of the matrix and we present results based on real projection data. PMID:26575482

  14. From Image Analysis to Computer Vision: Motives, Methods, and Milestones.

    DTIC Science & Technology

    1998-07-01

    Initially, work on digital image analysis dealt with specific classes of images such as text, photomicrographs, nuclear particle tracks, and aerial photographs; but by the 1960s, general algorithms and paradigms for image analysis began to be formulated. When the artificial intelligence ... scene, but eventually from image sequences obtained by a moving camera; at this stage, image analysis had become scene analysis or computer vision.

  15. Manipulating Heat Flow through 3 Dimensional Nanoscale Phononic Crystal Structure

    DTIC Science & Technology

    2014-06-02

    We demonstrate, through computer simulation, how three-dimensional (3D) phononic crystal structures can confine phonons and thus reduce thermal conductivity. A 3D phononic crystal (PnC) with spherical pores can reduce the thermal conductivity of bulk Si by a factor of up to 10,000 at room temperature. ...

  16. Can Abdominal Hypopressive Technique Change Levator Hiatus Area?: A 3-Dimensional Ultrasound Study.

    PubMed

    Resende, Ana Paula Magalhães; Torelli, Luiza; Zanetti, Miriam Raquel Diniz; Petricelli, Carla Dellabarba; Jármy-Di Bella, Zsuzsanna IIona Katalin; Nakamura, Mary Uchiyama; Araujo Júnior, E; Moron, Antonio Fernandes; Girão, Manoel João Batista Castello; Sartori, Marair Gracio Ferreira

    2016-06-01

    This study aimed to evaluate the levator hiatus area (LHA) at rest and during the performance of maximal pelvic floor muscle (PFM) contractions, during the abdominal hypopressive technique (AHT), and during the combination of PFM contractions (PFMCs) and the AHT. The study included 17 healthy nulliparous women who had no history of pelvic floor disorders. The LHA was evaluated with the patients in the lithotomy position. After a physiotherapist instructed the patients on the proper performance of the PFM and AHT exercises, one gynecologist performed the 3-dimensional translabial ultrasound examinations. The LHA was first measured with the patients at rest; PFMC alone, AHT alone, and the AHT combined with a PFMC were then performed, with 30 seconds of rest between evaluations. Each measurement was performed twice, and the mean value was used for statistical analysis. The Wilcoxon test was used to test the differences between maneuvers. Similar values were observed when comparing the LHA at rest (12.2 ± 2.4 cm²) and during the AHT (11.7 ± 2.6 cm²) (P = 0.227). The AHT + PFMC (10.2 ± 1.9 cm²) yielded lower values than the AHT alone (11.7 ± 2.6 cm²) (P = 0.002). When comparing the PFMC (10.4 ± 2.1 cm²) with the AHT + PFMC (10.2 ± 1.9 cm²), no significant difference (P = 0.551) was observed. During the PFMC, the constriction was 1.8 cm²; during the AHT, 0.5 cm²; and during the AHT + PFMC, 2.0 cm². The LHA assessed by 3-dimensional ultrasound did not change significantly with the AHT. These results support the theory that the AHT does not strengthen the PFM.
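
    The paired comparisons reported here correspond to a Wilcoxon signed-rank test, which is available in scipy; the measurements below are hypothetical stand-ins with roughly the reported means and spreads:

    ```python
    import numpy as np
    from scipy import stats

    # Hypothetical paired LHA measurements (cm^2) in 17 women.
    rng = np.random.default_rng(1)
    lha_aht = rng.normal(11.7, 2.6, size=17)
    lha_aht_pfmc = lha_aht - rng.normal(1.5, 0.8, size=17)

    # Wilcoxon signed-rank test for paired, possibly non-normal data.
    stat, p = stats.wilcoxon(lha_aht, lha_aht_pfmc)
    print(f"W = {stat:.1f}, p = {p:.4f}")
    ```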

  17. 3-Dimensional Geologic Modeling Applied to the Structural Characterization of Geothermal Systems: Astor Pass, Nevada, USA

    SciTech Connect

    Siler, Drew L; Faulds, James E; Mayhew, Brett

    2013-04-16

    Geothermal systems in the Great Basin, USA, are controlled by a variety of fault intersection and fault interaction areas. Understanding the specific geometry of the structures most conducive to broad-scale geothermal circulation is crucial to both the mitigation of the costs of geothermal exploration (especially drilling) and to the identification of geothermal systems that have no surface expression (blind systems). 3-dimensional geologic modeling is a tool that can elucidate the specific stratigraphic intervals and structural geometries that host geothermal reservoirs. Astor Pass, NV USA lies just beyond the northern extent of the dextral Pyramid Lake fault zone near the boundary between two distinct structural domains, the Walker Lane and the Basin and Range, and exhibits characteristics of each setting. Both northwest-striking, left-stepping dextral faults of the Walker Lane and kinematically linked northerly striking normal faults associated with the Basin and Range are present. Previous studies at Astor Pass identified a blind geothermal system controlled by the intersection of west-northwest and north-northwest striking dextral-normal faults. Wells drilled into the southwestern quadrant of the fault intersection yielded 94°C fluids, with geothermometers suggesting a maximum reservoir temperature of 130°C. A 3-dimensional model was constructed based on detailed geologic maps and cross-sections, 2-dimensional seismic data, and petrologic analysis of the cuttings from three wells in order to further constrain the structural setting. The model reveals the specific geometry of the fault interaction area at a level of detail beyond what geologic maps and cross-sections can provide.

  18. Studies of Cosmic Ray Modulation and Energetic Particle Propagation in Time-Dependent 3-Dimensional Heliospheric Magnetic Fields

    NASA Technical Reports Server (NTRS)

    Zhang, Ming

    2005-01-01

    The primary goal of this project was to perform theoretical calculations of the propagation of cosmic rays and energetic particles in 3-dimensional heliospheric magnetic fields. We used Markov stochastic process simulation to achieve this goal. We developed computation software that can be used to study particle propagation in two examples of heliospheric magnetic fields that must be treated in 3 dimensions: the field suggested by Fisk (1996) and a global heliosphere including the region beyond the termination shock. The results from our model calculations were compared with particle measurements from Ulysses, Earth-based spacecraft such as IMP-8, WIND, and ACE, and the Voyagers and Pioneers in the outer heliosphere to test the magnetic field models. We particularly looked for features of particle variations that would allow us to significantly distinguish the Fisk magnetic field from the conventional Parker spiral field. The computer code will eventually lead to a new generation of integrated software for solving complicated problems of particle acceleration, propagation, and modulation in a realistic 3-dimensional heliosphere with realistic magnetic fields and solar wind within a single computational approach.
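
    For reference, the conventional Parker spiral against which the Fisk field is compared follows from field lines frozen into the radially expanding solar wind: a line footed at longitude phi0 satisfies phi(r) = phi0 - Omega (r - r0) / V_sw. A short sketch tracing one equatorial field line with illustrative constants:

    ```python
    import numpy as np

    # Illustrative constants (equatorial plane).
    OMEGA = 2.7e-6   # solar rotation rate, rad/s
    V_SW = 400.0     # solar wind speed, km/s
    R0 = 7.0e5       # footpoint radius near the solar surface, km
    AU = 1.496e8     # astronomical unit, km

    def parker_spiral(phi0, r):
        """Longitude of the Parker-spiral field line footed at phi0, at radius r."""
        return phi0 - OMEGA * (r - R0) / V_SW

    # Trace the line out to 5 AU; the winding is about 1 rad per AU,
    # giving the familiar ~45 degree garden-hose angle at Earth.
    r = np.linspace(R0, 5 * AU, 6)
    print(np.degrees(parker_spiral(0.0, r)) % 360.0)
    ```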

  19. Local spatial frequency analysis for computer vision

    NASA Technical Reports Server (NTRS)

    Krumm, John; Shafer, Steven A.

    1990-01-01

    A sense of vision is a prerequisite for a robot to function in an unstructured environment. However, real-world scenes contain many interacting phenomena that lead to complex images which are difficult to interpret automatically. Typical computer vision research proceeds by analyzing various effects in isolation (e.g., shading, texture, stereo, defocus), usually on images devoid of realistic complicating factors. This leads to specialized algorithms which fail on real-world images. Part of this failure is due to the dichotomy of useful representations for these phenomena. Some effects are best described in the spatial domain, while others are more naturally expressed in frequency. In order to resolve this dichotomy, we present the combined space/frequency representation which, for each point in an image, shows the spatial frequencies at that point. Within this common representation, we develop a set of simple, natural theories describing phenomena such as texture, shape, aliasing and lens parameters. We show these theories lead to algorithms for shape from texture and for dealiasing image data. The space/frequency representation should be a key aid in untangling the complex interaction of phenomena in images, allowing automatic understanding of real-world scenes.

  20. Adaptive computational methods for aerothermal heating analysis

    NASA Technical Reports Server (NTRS)

    Price, John M.; Oden, J. Tinsley

    1988-01-01

    The development of adaptive gridding techniques for finite-element analysis of fluid dynamics equations is described. The developmental work was done with the Euler equations with concentration on shock and inviscid flow field capturing. Ultimately this methodology is to be applied to a viscous analysis for the purpose of predicting accurate aerothermal loads on complex shapes subjected to high speed flow environments. The development of local error estimate strategies as a basis for refinement strategies is discussed, as well as the refinement strategies themselves. The application of the strategies to triangular elements and a finite-element flux-corrected-transport numerical scheme are presented. The implementation of these strategies in the GIM/PAGE code for 2-D and 3-D applications is documented and demonstrated.

  1. Interactive Spectral Analysis and Computation (ISAAC)

    NASA Technical Reports Server (NTRS)

    Lytle, D. M.

    1992-01-01

    Isaac is a task in the NSO external package for IRAF. A descendant of a FORTRAN program written to analyze data from a Fourier transform spectrometer, the current implementation has been generalized sufficiently to make it useful for general spectral analysis and other one dimensional data analysis tasks. The user interface for Isaac is implemented as an interpreted mini-language containing a powerful, programmable vector calculator. Built-in commands provide much of the functionality needed to produce accurate line lists from input spectra. These built-in functions include automated spectral line finding, least squares fitting of Voigt profiles to spectral lines including equality constraints, various filters including an optimal filter construction tool, continuum fitting, and various I/O functions.

  2. Computational Understanding: Analysis of Sentences and Context

    DTIC Science & Technology

    1974-05-01

    to take English texts, disambiguate the words and semantic relationships involved, and settle questions like anaphoric reference, to the point ... rather than what the word in isolation might mean. The theory of text analysis also stresses binding by predictions. To assume that a word is ... cluster is basically the bundle of predictions and structures, knowledge that can bind a text into a unit. The cluster has much the same theoretical

  3. Computer Assistance in Teaching Dynamic-Stochastic Systems Analysis.

    ERIC Educational Resources Information Center

    Talpaz, Hovav

    A university level course in systems analysis with close contact and massive use of computer time was designed. The objectives of the course were primarily to teach social science graduate students, mostly from economics and agricultural economics, the basic methodological and quantitative tools of systems analysis and design. It was designed to…

  4. VIC: A Computer Analysis of Verbal Interaction Category Systems.

    ERIC Educational Resources Information Center

    Kline, John A.; And Others

    VIC is a computer program for the analysis of verbal interaction category systems, especially the Flanders interaction analysis system. The observer codes verbal behavior on coding sheets for later machine scoring. A matrix is produced by the program showing the number and percentages of times that a particular cell describes classroom behavior.…

  5. Two Computer Programs for Factor Analysis. Technical Note Number 41.

    ERIC Educational Resources Information Center

    Wisler, Carl E.

    Two factor analysis algorithms, previously described by P. Horst, have been programed for use on the General Electric Time-Sharing Computer System. The first of these, Principal Components Analysis (PCA), uses the Basic Structure Successive Factor Method With Residual Matrices algorithm to obtain the principal component vectors of a correlation…

  6. Computer-Aided Communication Satellite System Analysis and Optimization.

    ERIC Educational Resources Information Center

    Stagl, Thomas W.; And Others

    Various published computer programs for fixed/broadcast communication satellite system synthesis and optimization are discussed. The rationale for selecting General Dynamics/Convair's Satellite Telecommunication Analysis and Modeling Program (STAMP) in modified form to aid in the system costing and sensitivity analysis work in the Program on…

  7. AVES: A Computer Cluster System approach for INTEGRAL Scientific Analysis

    NASA Astrophysics Data System (ADS)

    Federici, M.; Martino, B. L.; Natalucci, L.; Umbertini, P.

    The AVES computing system, based on a cluster architecture, is a fully integrated, low-cost computing facility dedicated to the archiving and analysis of INTEGRAL data. AVES is a modular system that uses the SLURM software resource manager and allows almost unlimited expandability (65,536 nodes and hundreds of thousands of processors). It is currently composed of 30 personal computers with quad-core CPUs, able to reach a computing power of 300 gigaflops (300×10⁹ floating-point operations per second), with 120 GB of RAM and 7.5 terabytes (TB) of storage memory in UFS configuration, plus 6 TB for the users' area. AVES was designed and built to address the growing problems raised by the analysis of the large amount of data accumulated by the INTEGRAL mission (currently about 9 TB, and increasing every year). The analysis software used is the OSA package, distributed by the ISDC in Geneva. This is a very complex package consisting of dozens of programs that cannot be converted to parallel computing. To overcome this limitation we developed a series of programs to distribute the analysis workload across the various nodes, making AVES automatically divide the analysis into N jobs sent to N cores. This solution thus produces a result similar to that obtained with a parallel computing configuration. In support of this we have developed tools that allow flexible use of the scientific software and quality control of on-line data storage. The AVES software package consists of about 50 specific programs. The overall computing speed, compared to that of a personal computer with a single processor, has thus been enhanced by up to a factor of 70.
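
    A minimal sketch of the job-splitting idea, assuming a SLURM cluster and a hypothetical per-chunk driver script (this is not the AVES code):

    ```python
    import subprocess

    ANALYSIS_CMD = "run_osa_chunk.sh"   # hypothetical per-chunk driver script

    def submit_split_analysis(science_windows, n_jobs):
        """Divide a list of INTEGRAL science windows into n_jobs chunks and
        submit one single-core SLURM job per chunk."""
        chunks = [science_windows[i::n_jobs] for i in range(n_jobs)]
        for k, chunk in enumerate(chunks):
            if not chunk:
                continue
            subprocess.run(
                ["sbatch", "--ntasks=1", f"--job-name=osa_{k}",
                 f"--wrap={ANALYSIS_CMD} {' '.join(chunk)}"],
                check=True)

    # Hypothetical science-window identifiers split across 30 cores.
    submit_split_analysis([f"scw_{i:04d}" for i in range(120)], n_jobs=30)
    ```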

  8. Computational Aeroelastic Analysis of the Ares Launch Vehicle During Ascent

    NASA Technical Reports Server (NTRS)

    Bartels, Robert E.; Chwalowski, Pawel; Massey, Steven J.; Vatsa, Veer N.; Heeg, Jennifer; Wieseman, Carol D.; Mineck, Raymond E.

    2010-01-01

    This paper presents the static and dynamic computational aeroelastic (CAE) analyses of the Ares crew launch vehicle (CLV) during atmospheric ascent. The influence of launch vehicle flexibility on the static aerodynamic loading and integrated aerodynamic force and moment coefficients is discussed. The ultimate purpose of this analysis is to assess the aeroelastic stability of the launch vehicle along the ascent trajectory. A comparison of analysis results for several versions of the Ares CLV will be made. Flexible static and dynamic analyses based on rigid computational fluid dynamic (CFD) data are compared with a fully coupled aeroelastic time marching CFD analysis of the launch vehicle.

  9. Minimum-fuel, 3-dimensional flightpath guidance of transfer jets

    NASA Technical Reports Server (NTRS)

    Neuman, F.; Kreindler, E.

    1984-01-01

    Minimum fuel, three dimensional flightpaths for commercial jet aircraft are discussed. The theoretical development is divided into two sections. In both sections, the necessary conditions of optimal control, including singular arcs and state constraints, are used. One section treats the initial and final portions (below 10,000 ft) of long optimal flightpaths. Here all possible paths can be derived by generating fields of extremals. Another section treats the complete intermediate length, three dimensional terminal area flightpaths. Here only representative sample flightpaths can be computed. Sufficient detail is provided to give the student of optimal control a complex example of a useful application of optimal control theory.

  10. Computational analysis of local membrane properties

    NASA Astrophysics Data System (ADS)

    Gapsys, Vytautas; de Groot, Bert L.; Briones, Rodolfo

    2013-10-01

    In the field of biomolecular simulations, the dynamics of phospholipid membranes is of special interest. A number of proteins, including channels, transporters, receptors, and short peptides, are embedded in lipid bilayers and tightly interact with phospholipids. While experimental measurements report on the spatially and/or temporally averaged membrane properties, simulation results are not restricted to such averages. In the current study, we present a collection of methods for efficient local membrane property calculation, comprising bilayer thickness, area per lipid, deuterium order parameters, and Gaussian and mean curvature. The local membrane property calculation allows for a direct mapping of the membrane features, which subsequently can be used for further analysis and visualization of the processes of interest. The main features of the described methods are highlighted in a number of membrane systems, namely: a pure dimyristoyl-phosphatidyl-choline (DMPC) bilayer, a fusion peptide interacting with a membrane, a voltage-dependent anion channel protein embedded in a DMPC bilayer, a cholesterol-enriched bilayer, and a coarse-grained simulation of a curved palmitoyl-oleoyl-phosphatidyl-choline lipid membrane. The local membrane property analysis provides an intuitive and detailed view of the observables that are otherwise interpreted as averaged bilayer properties.
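
    Of the listed observables, the deuterium order parameter has the simplest closed form, S_CD = <(3 cos^2 theta - 1)/2>, where theta is the angle between a C-H (C-D) bond vector and the bilayer normal. A minimal numpy sketch over hypothetical bond vectors:

    ```python
    import numpy as np

    def deuterium_order_parameter(ch_vectors, normal=(0.0, 0.0, 1.0)):
        """S_CD = <(3 cos^2(theta) - 1) / 2> over C-H bond vectors, where theta
        is the angle between each bond and the bilayer normal."""
        v = np.asarray(ch_vectors, dtype=float)
        n = np.asarray(normal, dtype=float)
        n /= np.linalg.norm(n)
        cos_theta = (v @ n) / np.linalg.norm(v, axis=1)
        return np.mean((3.0 * cos_theta ** 2 - 1.0) / 2.0)

    # Hypothetical bond vectors for one acyl-chain carbon over a trajectory.
    rng = np.random.default_rng(2)
    bonds = rng.normal(size=(1000, 3)) + np.array([0.0, 0.0, 0.5])
    print(deuterium_order_parameter(bonds))
    ```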

  11. CFD Based Computations of Flexible Helicopter Blades for Stability Analysis

    NASA Technical Reports Server (NTRS)

    Guruswamy, Guru P.

    2011-01-01

    As a collaborative effort among government aerospace research laboratories, an advanced version of a widely used computational fluid dynamics code, OVERFLOW, was recently released. This latest version includes additions to model flexible rotating multiple blades. In this paper, the OVERFLOW code is applied to improve the accuracy of airload computations over the linear lifting-line theory that uses displacements from a beam model. Data transfers required at every revolution are managed through a Unix-based script that runs jobs on large super-cluster computers. Results are demonstrated for the 4-bladed UH-60A helicopter, and deviations of computed data from flight data are evaluated. Fourier analysis post-processing suitable for aeroelastic stability computations is performed.

  12. A computer analysis of the Schreber Memoirs.

    PubMed

    Klein, R H

    1976-06-01

    With the aid of a computerized system for content analysis, WORDS, the complete Schreber Memoirs was subjected to various multivariate reduction techniques in order to investigate the major content themes of this document. The findings included the prevalence of somatic concerns throughout the Memoirs, clear references to persecutory ideas and to Schreber's assumption of a redemptive role, complex encapsulated concerns about Schreber's relationship with God, a lack of any close relationship of sexuality and sexual transformation to themes of either castration or procreation, and the fact that neither sun, God, nor Flechsig was significantly associated with clusters concerning gender, sexuality, or castration. These findings are discussed in relation to psychodynamic interpretations furnished by prior investigators who employed different research methods.

  13. Computer-aided visualization and analysis system for sequence evaluation

    DOEpatents

    Chee, M.S.

    1998-08-18

    A computer system for analyzing nucleic acid sequences is provided. The computer system is used to perform multiple methods for determining unknown bases by analyzing the fluorescence intensities of hybridized nucleic acid probes. The results of individual experiments are improved by processing nucleic acid sequences together. Comparative analysis of multiple experiments is also provided by displaying reference sequences in one area and sample sequences in another area on a display device. 27 figs.
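
    The base-calling idea at the heart of these patents - compare the fluorescence intensities of the four probes interrogating each position and call the brightest - can be sketched as follows. The synthetic intensities and the simple runner-up confidence rule are illustrative assumptions, not the patents' actual decision logic.

        """Sketch: call bases from per-position probe fluorescence intensities."""
        import numpy as np

        BASES = np.array(list("ACGT"))

        def call_bases(intensity, min_ratio=1.2):
            """intensity: (L, 4) probe intensities per position. Returns a string,
            with 'N' where the best probe is not clearly brighter than the runner-up."""
            order = np.argsort(intensity, axis=1)
            best, second = order[:, -1], order[:, -2]
            rows = np.arange(intensity.shape[0])
            ratio = intensity[rows, best] / np.maximum(intensity[rows, second], 1e-9)
            calls = BASES[best].copy()
            calls[ratio < min_ratio] = "N"
            return "".join(calls)

        rng = np.random.default_rng(1)
        true_seq = rng.integers(0, 4, 30)
        I = rng.normal(100, 20, (30, 4)).clip(min=1)   # background intensity
        I[np.arange(30), true_seq] += 400              # hybridization signal
        print(call_bases(I))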

  14. Computer-aided visualization and analysis system for sequence evaluation

    DOEpatents

    Chee, Mark S.

    1999-10-26

    A computer system (1) for analyzing nucleic acid sequences is provided. The computer system is used to perform multiple methods for determining unknown bases by analyzing the fluorescence intensities of hybridized nucleic acid probes. The results of individual experiments may be improved by processing nucleic acid sequences together. Comparative analysis of multiple experiments is also provided by displaying reference sequences in one area (814) and sample sequences in another area (816) on a display device (3).

  15. Computer-aided visualization and analysis system for sequence evaluation

    DOEpatents

    Chee, Mark S.

    1998-08-18

    A computer system for analyzing nucleic acid sequences is provided. The computer system is used to perform multiple methods for determining unknown bases by analyzing the fluorescence intensities of hybridized nucleic acid probes. The results of individual experiments are improved by processing nucleic acid sequences together. Comparative analysis of multiple experiments is also provided by displaying reference sequences in one area and sample sequences in another area on a display device.

  16. Computer-aided visualization and analysis system for sequence evaluation

    DOEpatents

    Chee, Mark S.

    2001-06-05

    A computer system (1) for analyzing nucleic acid sequences is provided. The computer system is used to perform multiple methods for determining unknown bases by analyzing the fluorescence intensities of hybridized nucleic acid probes. The results of individual experiments may be improved by processing nucleic acid sequences together. Comparative analysis of multiple experiments is also provided by displaying reference sequences in one area (814) and sample sequences in another area (816) on a display device (3).

  17. Computer-aided visualization and analysis system for sequence evaluation

    DOEpatents

    Chee, Mark S.; Wang, Chunwei; Jevons, Luis C.; Bernhart, Derek H.; Lipshutz, Robert J.

    2004-05-11

    A computer system for analyzing nucleic acid sequences is provided. The computer system is used to perform multiple methods for determining unknown bases by analyzing the fluorescence intensities of hybridized nucleic acid probes. The results of individual experiments are improved by processing nucleic acid sequences together. Comparative analysis of multiple experiments is also provided by displaying reference sequences in one area and sample sequences in another area on a display device.

  18. Computer-aided visualization and analysis system for sequence evaluation

    DOEpatents

    Chee, Mark S.

    2003-08-19

    A computer system for analyzing nucleic acid sequences is provided. The computer system is used to perform multiple methods for determining unknown bases by analyzing the fluorescence intensities of hybridized nucleic acid probes. The results of individual experiments may be improved by processing nucleic acid sequences together. Comparative analysis of multiple experiments is also provided by displaying reference sequences in one area and sample sequences in another area on a display device.

  19. Computer analysis of transient voltages in large grounding systems

    SciTech Connect

    Grcev, L.D.

    1996-04-01

    A computer model for transient analysis of a network of buried and above-ground conductors is presented. The model is based on the electromagnetic field theory approach and the modified image theory. Validation of the model is achieved by comparison with field measurements. The model is applied to computation of transient voltages to remote ground on large grounding grid conductors. Computation of longitudinal and leakage currents, transient impedance, electromagnetic fields, and transient induced voltages is also possible. The model is intended to support EMC and lightning protection studies that involve electrical and electronic systems connected to grounding systems.

  20. 3-dimensional imaging system using crystal diffraction lenses

    DOEpatents

    Smither, R.K.

    1999-02-09

    A device for imaging a plurality of sources of x-ray and gamma-ray radiation is provided. Diffracting crystals are used for focusing the radiation and directing it to a detector, which is used for analyzing the radiation to collect data on the location of its source. A computer is used for converting the data to an image. The invention also provides for a method for imaging x-ray and gamma radiation by supplying a plurality of sources of radiation; focusing the radiation onto a detector; analyzing the focused radiation to collect data as to the type and location of the radiation; and producing an image using the data. 18 figs.

  1. 3-dimensional imaging system using crystal diffraction lenses

    DOEpatents

    Smither, Robert K.

    1999-01-01

    A device for imaging a plurality of sources of x-ray and gamma-ray radiation is provided. Diffracting crystals are used for focussing the radiation and directing it to a detector, which is used for analyzing the radiation to collect data on the location of its source. A computer is used for converting the data to an image. The invention also provides for a method for imaging x-ray and gamma radiation by supplying a plurality of sources of radiation; focussing the radiation onto a detector; analyzing the focused radiation to collect data as to the type and location of the radiation; and producing an image using the data.

  2. Structural Analysis Using Computer Based Methods

    NASA Technical Reports Server (NTRS)

    Dietz, Matthew R.

    2013-01-01

    The stiffness of a flex hose that will be used in the umbilical arms of the Space Launch System's mobile launcher needed to be determined in order to properly qualify ground umbilical plate behavior during vehicle separation post T-0. This data is also necessary to properly size and design the motors used to retract the umbilical arms. Therefore an experiment was created to determine the stiffness of the hose. Before the test apparatus for the experiment could be built, the structure had to be analyzed to ensure it would not fail under the given loading conditions. The design model was imported into the analysis software and optimized to decrease runtime while still providing accurate results and allowing for seamless meshing. Areas exceeding the allowable stresses in the structure were located and modified before submitting the design for fabrication. In addition, a mock-up of a deep space habitat and its support frame was designed and needed to be analyzed for structural integrity under different loading conditions. The load cases were provided by the customer and were applied to the structure after optimizing the geometry. Once again, weak points in the structure were located, recommended design changes were presented to the customer, and the process was repeated until the load conditions were met without exceeding the allowable stresses. After the stresses met the required factors of safety, the designs were released for fabrication.

  3. Computer programs for analysis of geophysical data

    SciTech Connect

    Rozhkov, M.; Nakanishi, K.

    1994-06-01

    This project is oriented toward the application of the mobile seismic array data analysis technique in seismic investigations of the Earth (the noise-array method). The technique falls into the class of emission tomography methods but, in contrast to classic tomography, 3-D images of the microseismic activity of the medium are obtained by passive seismic antenna scanning of the half-space, rather than by solution of the inverse Radon's problem. It is reasonable to expect that areas of geothermal activity, active faults, areas of volcanic tremors and hydrocarbon deposits act as sources of intense internal microseismic activity or as effective sources for scattered (secondary) waves. The conventional approaches of seismic investigations of a geological medium include measurements of time-limited determinate signals from artificial or natural sources. However, continuous seismic oscillations, like endogenous microseisms, coda and scattered waves, can give very important information about the structure of the Earth. The presence of microseismic sources or inhomogeneities within the Earth results in the appearance of coherent seismic components in a stochastic wave field recorded on the surface by a seismic array. By careful processing of seismic array data, these coherent components can be used to develop a 3-D model of the microseismic activity of the medium or images of the noisy objects. Thus, in contrast to classic seismology, where narrow windows are used to get the best time resolution of seismic signals, our model requires long record lengths for the best spatial resolution.

  4. Computational analysis of a pulsed inductive plasma accelerator

    NASA Astrophysics Data System (ADS)

    Corpening, Jeremy H.

    The pulsed inductive plasma accelerator allows for ionization of a cold gas propellant to plasma and acceleration of the plasma with the same current pulse, without plasma contact with any part of the device. This is beneficial since erosion is never a problem and lifetimes are limited only by the amount of carried propellant. To date, work involving the pulsed inductive plasma accelerator concept has been largely experimental, with minimal computational analysis. The goal of the present research was to develop a computational tool using Maxwell's equations coupled with the Navier-Stokes fluid equations to fully analyze a pulsed inductive plasma accelerator. A plasma model was developed using the Saha equation and partition functions to calculate all required thermodynamic properties. The solution to Maxwell's equations was verified to be accurate, and coupled computations with propellant plasma were then conducted. These coupled computations showed good order-of-magnitude agreement with a simple one-dimensional model; however, they failed when the plasma began to accelerate due to the Lorentz force. The electric field, magnetic field, current density, and Lorentz force were all aligned in the proper vector directions. The computational failure occurred due to rapid, fictitious increases in the induced electric field in the vacuum region created between the accelerating plasma and the drive coil. Possible solutions to this problem are to decrease the time step and refine the grid density. Although complete acceleration of the propellant plasma has yet to be computed, this study has demonstrated successful coupled computations of the Maxwell and Navier-Stokes equations for a pulsed inductive plasma accelerator.
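
    The Saha-equation ingredient of such a plasma model is compact enough to sketch: for single ionization, the ionization fraction follows from a quadratic once the temperature-dependent Saha factor is evaluated. The propellant (argon), the constant statistical-weight ratio, and the number density below are illustrative assumptions, not values from the dissertation.

        """Sketch: single-ionization Saha equation for an argon-like propellant."""
        import numpy as np

        k_B = 1.380649e-23                 # Boltzmann constant, J/K
        m_e = 9.1093837e-31                # electron mass, kg
        h = 6.62607015e-34                 # Planck constant, J s
        chi = 15.76 * 1.602176634e-19      # argon first ionization energy, J
        g_ratio = 12.0                     # 2*g_ion/g_neutral, rough assumed value

        def ionization_fraction(T, n_total):
            """Solve alpha^2 * n / (1 - alpha) = S(T) for the ionization fraction."""
            S = g_ratio * (2 * np.pi * m_e * k_B * T / h**2) ** 1.5 \
                * np.exp(-chi / (k_B * T))
            return (-S + np.sqrt(S**2 + 4 * n_total * S)) / (2 * n_total)

        for T in (5000.0, 10000.0, 15000.0, 20000.0):
            print(f"T = {T:7.0f} K   alpha = {ionization_fraction(T, 1e22):.3e}")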

  5. Assessment and Planning for a Pediatric Bilateral Hand Transplant Using 3-Dimensional Modeling: Case Report.

    PubMed

    Gálvez, Jorge A; Gralewski, Kevin; McAndrew, Christine; Rehman, Mohamed A; Chang, Benjamin; Levin, L Scott

    2016-03-01

    Children are not typically considered for hand transplantation for various reasons, including the difficulty of finding an appropriate donor. Matching donor-recipient hands and forearms based on size is critically important. If the donor's hands are too large, the recipient may not be able to move the fingers effectively. Conversely, if the donor's hands are too small, the appearance may not be appropriate. We present an 8-year-old child evaluated for a bilateral hand transplant following bilateral amputation. The recipient's forearms and hands were modeled from computed tomography imaging studies and replicated as anatomic models with a 3-dimensional printer. We modified the scale of the printed hand to produce 3 proportions: 80%, 100%, and 120%. The transplant team used the anatomical models during evaluation of a donor for an appropriate match based on size. The donor's hand size matched the 100%-scale anatomical model hand, and the transplant team was activated. In addition to assisting the transplant team in appropriate donor selection, the 100%-scale anatomical model hand was used to create molds for prosthetic hands for the donor.

  6. A Novel Method of Orbital Floor Reconstruction Using Virtual Planning, 3-Dimensional Printing, and Autologous Bone.

    PubMed

    Vehmeijer, Maarten; van Eijnatten, Maureen; Liberton, Niels; Wolff, Jan

    2016-08-01

    Fractures of the orbital floor are often a result of traffic accidents or interpersonal violence. To date, numerous materials and methods have been used to reconstruct the orbital floor. However, simple and cost-effective 3-dimensional (3D) printing technologies for the treatment of orbital floor fractures are still sought. This study describes a simple, precise, cost-effective method of treating orbital fractures using 3D printing technologies in combination with autologous bone. Enophthalmos and diplopia developed in a 64-year-old female patient with an orbital floor fracture. A virtual 3D model of the fracture site was generated from computed tomography images of the patient. The fracture was virtually closed using spline interpolation. Furthermore, a virtual individualized mold of the defect site was created, which was manufactured using an inkjet printer. The tangible mold was subsequently used during surgery to sculpture an individualized autologous orbital floor implant. Virtual reconstruction of the orbital floor and the resulting mold enhanced the overall accuracy and efficiency of the surgical procedure. The sculptured autologous orbital floor implant showed an excellent fit in vivo. The combination of virtual planning and 3D printing offers an accurate and cost-effective treatment method for orbital floor fractures.

  7. Large-scale temporal analysis of computer and information science

    NASA Astrophysics Data System (ADS)

    Soos, Sandor; Kampis, George; Gulyás, László

    2013-09-01

    The main aim of the project reported in this paper was twofold. One of the primary goals was to produce an extensive source of network data for bibliometric analyses of field dynamics in the case of Computer and Information Science. To this end, we rendered the raw material of the DBLP computer and information science bibliography into a comprehensive collection of dynamic network data, promptly available for further statistical analysis. The other goal was to demonstrate the value of our data source via its use in mapping Computer and Information Science (CIS). An analysis of the evolution of CIS was performed in terms of collaboration (co-authorship) network dynamics. The dynamic network analysis covered roughly three quarters of the 20th century (76 years, from 1936 onward). Network evolution was described both at the macro and the meso level (in terms of community characteristics). Results show that the development of CIS followed what appears to be a universal pattern of growing into a "mature" discipline.
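
    The macro-level side of such an analysis can be sketched with a cumulative co-authorship graph built year by year; the toy records below stand in for parsed DBLP entries, and the reported quantities (authors, links, giant-component share) are examples of the network-evolution measures described.

        """Sketch: year-by-year growth of a co-authorship network (toy data)."""
        import itertools
        import networkx as nx

        # (year, author list) records standing in for parsed DBLP entries
        papers = [
            (1996, ["alice", "bob"]),
            (1997, ["bob", "carol", "dave"]),
            (1998, ["alice", "dave"]),
            (1999, ["eve", "frank"]),
            (2000, ["eve", "alice"]),
        ]

        G = nx.Graph()
        for year in sorted({y for y, _ in papers}):
            for y, authors in papers:
                if y == year:  # add this year's co-authorship links
                    G.add_edges_from(itertools.combinations(authors, 2))
            giant = max(nx.connected_components(G), key=len)
            print(f"{year}: {G.number_of_nodes()} authors, "
                  f"{G.number_of_edges()} links, "
                  f"giant component {len(giant) / G.number_of_nodes():.0%}")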

  8. Control of Grasp and Manipulation by Soft Fingers with 3-Dimensional Deformation

    NASA Astrophysics Data System (ADS)

    Nakashima, Akira; Shibata, Takeshi; Hayakawa, Yoshikazu

    In this paper, we consider control of grasp and manipulation of an object in a 3-dimensional space by a 3-fingered hand robot with soft finger tips. We first propose a 3-dimensional deformation model of a hemispherical soft finger tip and verify its relevance against experimental data. Second, we consider the contact kinematics and derive the dynamical equations of the fingers and the object, taking the 3-dimensional deformation into account. Third, we propose a method to regulate the object and the internal force using information about the hand, the object, and the deformation. A simulation result is presented to show the effectiveness of the control method.

  9. On computational schemes for global-local stress analysis

    NASA Technical Reports Server (NTRS)

    Reddy, J. N.

    1989-01-01

    An overview is given of global-local stress analysis methods and associated difficulties, with recommendations for future research. The phrase global-local analysis is understood to mean an analysis in which some parts of the domain or structure are singled out for more accurate determination of stresses and displacements, or for more refined analysis, than the remaining parts. The parts receiving refined analysis are termed local, and the remaining parts are called global. Typically, local regions are small in size compared with global regions, while the computational effort can be larger in local regions than in global regions.

  10. MSFC crack growth analysis computer program, version 2 (users manual)

    NASA Technical Reports Server (NTRS)

    Creager, M.

    1976-01-01

    An updated version of the George C. Marshall Space Flight Center Crack Growth Analysis Program is described. The updated computer program has significantly expanded capabilities over the original one. This increased capability includes an extensive expansion of the library of stress intensity factors, plotting capability, increased design iteration capability, and the capability of performing proof test logic analysis. The technical approaches used within the computer program are presented, and the input and output formats and options are described. Details of the stress intensity equations, example data, and example problems are presented.
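
    For orientation, the sketch below shows the kind of cycle-block crack-growth integration such a program automates, using the generic Paris relation da/dN = C (dK)^m with the infinite-plate stress-intensity factor dK = dS*sqrt(pi*a). The constants, loading, and crack sizes are made-up example values, and the Paris relation is a stand-in rather than the program's documented growth law.

        """Sketch: Paris-law crack growth integrated in blocks of cycles."""
        import math

        C, m = 1e-11, 3.0          # Paris constants (m/cycle, MPa*sqrt(m) units)
        d_stress = 100.0           # applied stress range, MPa
        a, a_crit = 1e-3, 2e-2     # initial and critical crack length, m
        n_cycles, dN = 0, 1000     # integrate in blocks of 1000 cycles

        while a < a_crit:
            dK = d_stress * math.sqrt(math.pi * a)   # stress-intensity range
            a += C * dK**m * dN                      # growth over one block
            n_cycles += dN

        print(f"cycles to grow from 1 mm to {a_crit * 1e3:.0f} mm: ~{n_cycles:,}")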

  11. Visualization and Data Analysis for High-Performance Computing

    SciTech Connect

    Sewell, Christopher Meyer

    2016-09-27

    This is a set of slides from a guest lecture for a class at the University of Texas, El Paso on visualization and data analysis for high-performance computing. The topics covered are the following: trends in high-performance computing; scientific visualization, such as OpenGL, ray tracing and volume rendering, VTK, and ParaView; and data science at scale, such as in-situ visualization, image databases, distributed memory parallelism, shared memory parallelism, VTK-m, and "big data", followed by an analysis example.

  12. Computational Analysis of the SRS Phase III Salt Disposition Alternatives

    SciTech Connect

    Dimenna, R.A.

    1999-10-07

    Completion of the Phase III evaluation and comparison of salt disposition alternatives was supported with enhanced computer models and analysis for each case on the "short list" of four options. SPEEDUP(TM) models and special-purpose models describing mass and energy balances and flow rates were developed and used to predict performance and production characteristics for each of the options. Results from the computational analysis were a key part of the input used to select a primary and an alternate salt disposition alternative.

  13. First Experiences with LHC Grid Computing and Distributed Analysis

    SciTech Connect

    Fisk, Ian

    2010-12-01

    This presentation described the experiences of the LHC experiments using grid computing, with a focus on distributed analysis. After many years of development, preparation, exercises, and validation, the LHC (Large Hadron Collider) experiments are in operation. The computing infrastructure has been heavily utilized in the first 6 months of data collection. The general experience of exploiting the grid infrastructure for organized processing and preparation is described, as well as the successes in employing the infrastructure for distributed analysis. Finally, the expected evolution and future plans are outlined.

  14. Advances in computational design and analysis of airbreathing propulsion systems

    NASA Technical Reports Server (NTRS)

    Klineberg, John M.

    1989-01-01

    The development of commercial and military aircraft depends, to a large extent, on engine manufacturers being able to achieve significant increases in propulsion capability through improved component aerodynamics, materials, and structures. The recent history of propulsion has been marked by efforts to develop computational techniques that can speed up the propulsion design process and produce superior designs. The availability of powerful supercomputers, such as the NASA Numerical Aerodynamic Simulator, and the potential for even higher performance offered by parallel computer architectures, have opened the door to the use of multi-dimensional simulations to study complex physical phenomena in propulsion systems that have previously defied analysis or experimental observation. An overview of several NASA Lewis research efforts is provided that are contributing toward the long-range goal of a numerical test-cell for the integrated, multidisciplinary design, analysis, and optimization of propulsion systems. Specific examples in Internal Computational Fluid Mechanics, Computational Structural Mechanics, Computational Materials Science, and High Performance Computing are cited and described in terms of current capabilities, technical challenges, and future research directions.

  15. AKSATINT - SATELLITE INTERFERENCE ANALYSIS AND SIMULATION USING PERSONAL COMPUTERS

    NASA Technical Reports Server (NTRS)

    Kantak, A.

    1994-01-01

    In the late seventies, the number of communication satellites in service increased, and interference became an increasingly important consideration in designing satellite/ground station communications systems. Satellite Interference Analysis and Simulation Using Personal Computers, AKSATINT, models the interference experienced by a generic satellite communications receiving station due to an interfering satellite. Both the desired and the interfering satellites are considered to be in elliptical orbits. The simulation contains computation of orbital positions of both satellites using classical orbital elements, calculation of the satellite antenna look angles for both satellites and elevation angles at the desired-satellite ground-station antenna, and computation of the Doppler effect due to the motions of the satellites and the Earth's rotation. AKSATINT also computes the interference-to-signal-power ratio, taking into account losses suffered by the links. After computing the interference-to-signal-power ratio, the program computes the statistical quantities. The statistical formulation of the interference effect is presented in the form of a histogram of the interference-to-desired-signal-power ratio. The program includes a flowchart, a sample run, and results of that run. AKSATINT is expected to be of general use to system designers and frequency managers in selecting the proper frequency under an interference scenario. The AKSATINT program is written in BASIC. It was designed to operate on the IBM Personal Computer AT or compatibles, and has been implemented under MS DOS 3.2. AKSATINT was developed in 1987.

  16. Tutorial: Parallel Computing of Simulation Models for Risk Analysis.

    PubMed

    Reilly, Allison C; Staid, Andrea; Gao, Michael; Guikema, Seth D

    2016-10-01

    Simulation models are widely used in risk analysis to study the effects of uncertainties on outcomes of interest in complex problems. Often, these models are computationally complex and time consuming to run. This latter point may be at odds with time-sensitive evaluations or may limit the number of parameters that are considered. In this article, we give an introductory tutorial focused on parallelizing simulation code to better leverage modern computing hardware, enabling risk analysts to better utilize simulation-based methods for quantifying uncertainty in practice. This article is aimed primarily at risk analysts who use simulation methods but do not yet utilize parallelization to decrease the computational burden of these models. The discussion is focused on conceptual aspects of embarrassingly parallel computer code and software considerations. Two complementary examples are shown using the languages MATLAB and R. A brief discussion of hardware considerations is located in the Appendix.
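
    In the same spirit as the article's MATLAB and R examples, the Python sketch below runs an embarrassingly parallel Monte Carlo study in which each worker evaluates an independent batch of replications with its own random stream. The lognormal demand model and capacity threshold are illustrative stand-ins, not an example from the article.

        """Sketch: embarrassingly parallel Monte Carlo with multiprocessing."""
        import numpy as np
        from multiprocessing import Pool

        def run_batch(args):
            """One worker's batch: P(lognormal demand exceeds a fixed capacity)."""
            seed, n = args
            rng = np.random.default_rng(seed)    # independent stream per worker
            demand = rng.lognormal(mean=2.0, sigma=0.5, size=n)
            return float(np.mean(demand > 12.0))

        if __name__ == "__main__":
            batches = [(seed, 250_000) for seed in range(8)]
            with Pool(processes=4) as pool:       # batches run concurrently
                estimates = pool.map(run_batch, batches)
            print(f"P(demand > capacity) ~ {np.mean(estimates):.4f}")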

  17. Superfast robust digital image correlation analysis with parallel computing

    NASA Astrophysics Data System (ADS)

    Pan, Bing; Tian, Long

    2015-03-01

    Existing digital image correlation (DIC) using the robust reliability-guided displacement tracking (RGDT) strategy for full-field displacement measurement is a path-dependent process that can only be executed sequentially. This path-dependent tracking strategy not only limits the potential of DIC for further improvement of its computational efficiency but also wastes the parallel computing power of modern computers with multicore processors. To maintain the robustness of the existing RGDT strategy and to overcome its deficiency, an improved RGDT strategy using a two-section tracking scheme is proposed. In the improved RGDT strategy, the calculated points with correlation coefficients higher than a preset threshold are all taken as reliably computed points and given the same priority to extend the correlation analysis to their neighbors. Thus, DIC calculation is first executed in parallel at multiple points by separate independent threads. Then for the few calculated points with correlation coefficients smaller than the threshold, DIC analysis using existing RGDT strategy is adopted. Benefiting from the improved RGDT strategy and the multithread computing, superfast DIC analysis can be accomplished without sacrificing its robustness and accuracy. Experimental results show that the presented parallel DIC method performed on a common eight-core laptop can achieve about a 7 times speedup.

  18. Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period April 1, 1986 through September 30, 1986 is summarized.

  19. Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science

    NASA Technical Reports Server (NTRS)

    1988-01-01

    This report summarizes research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period April 1, 1988 through September 30, 1988.

  20. Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis and computer science

    NASA Technical Reports Server (NTRS)

    1987-01-01

    Research conducted at the Institute for Computer Applications in Science and Engineering in applied mathematics, numerical analysis, and computer science during the period October 1, 1986 through March 31, 1987 is summarized.

  1. 3-Dimensional Computational Fluid Dynamics Modeling of Solid Oxide Fuel Cell Using Different Fuels

    DTIC Science & Technology

    2011-01-01

    The major types of fuel cells in practice are the Polymer Electrolyte Membrane Fuel Cell (PEMFC), the Alkaline Fuel Cell (AFC), the Phosphoric Acid Fuel Cell (PAFC), and the Molten Carbonate Fuel Cell (MCFC):

        Type   Fuel                       Electrolyte Material          Operating Temp. (C)  Efficiency (%)
        PEMFC  H2, methanol, formic acid  Hydrated organic polymer      < 90                 40-50
        AFC    Pure H2                    Aqueous potassium hydroxide   60-250               50
        PAFC   Pure H2                    Phosphoric acid               180-210              40
        MCFC   H2, CH4, CH3OH             Molten alkali carbonate       600-700              45-55

  2. Integration of rocket turbine design and analysis through computer graphics

    NASA Technical Reports Server (NTRS)

    Hsu, Wayne; Boynton, Jim

    1988-01-01

    An interactive approach with engineering computer graphics is used to integrate the design and analysis processes of a rocket engine turbine into a progressive and iterative design procedure. The processes are interconnected through pre- and postprocessors. The graphics are used to generate the blade profiles, their stacking, finite element generation, and analysis presentation through color graphics. Steps of the design process discussed include pitch-line design, axisymmetric hub-to-tip meridional design, and quasi-three-dimensional analysis. The viscous two- and three-dimensional analysis codes are executed after acceptable designs are achieved and estimates of initial losses are confirmed.

  3. Petascale visual data analysis in a production computing environment

    NASA Astrophysics Data System (ADS)

    Ahern, Sean

    2007-07-01

    Supporting the visualization and analysis needs of the users of the Department of Energy's premiere high-performance computing centers requires a careful engineering of software and hardware system architectures to provide maximum capability and algorithmic breadth. Data set growth follows an inverse power law that has implications for the platforms that are deployed for analysis and visualization; central storage and coupled analysis platforms are critical for petascale post-production. Software architectures like VisIt - which exploit parallel platforms, as well as provide remote capability, extensibility, and optimization - are fruitful ground for delivering new analysis capabilities for petascale applications. Finally, direct interaction with customers is key to deploying successful results.

  4. Recent developments of the NESSUS probabilistic structural analysis computer program

    NASA Technical Reports Server (NTRS)

    Millwater, H.; Wu, Y.-T.; Torng, T.; Thacker, B.; Riha, D.; Leung, C. P.

    1992-01-01

    The NESSUS probabilistic structural analysis computer program combines state-of-the-art probabilistic algorithms with general purpose structural analysis methods to compute the probabilistic response and the reliability of engineering structures. Uncertainty in loading, material properties, geometry, boundary conditions and initial conditions can be simulated. The structural analysis methods include nonlinear finite element and boundary element methods. Several probabilistic algorithms are available such as the advanced mean value method and the adaptive importance sampling method. The scope of the code has recently been expanded to include probabilistic life and fatigue prediction of structures in terms of component and system reliability and risk analysis of structures considering cost of failure. The code is currently being extended to structural reliability considering progressive crack propagation. Several examples are presented to demonstrate the new capabilities.

  5. Finite element dynamic analysis on CDC STAR-100 computer

    NASA Technical Reports Server (NTRS)

    Noor, A. K.; Lambiotte, J. J., Jr.

    1978-01-01

    Computational algorithms are presented for the finite element dynamic analysis of structures on the CDC STAR-100 computer. The spatial behavior is described using higher-order finite elements. The temporal behavior is approximated by using either the central difference explicit scheme or Newmark's implicit scheme. In each case the analysis is broken up into a number of basic macro-operations. Discussion is focused on the organization of the computation and the mode of storage of different arrays to take advantage of the STAR pipeline capability. The potential of the proposed algorithms is discussed and CPU times are given for performing the different macro-operations for a shell modeled by higher order composite shallow shell elements having 80 degrees of freedom.

  6. Boundary element analysis on vector and parallel computers

    NASA Technical Reports Server (NTRS)

    Kane, J. H.

    1994-01-01

    Boundary element analysis (BEA) can be characterized as a numerical technique that generally shifts the computational burden in the analysis toward numerical integration and the solution of nonsymmetric and either dense or blocked sparse systems of algebraic equations. Researchers have explored the concept that the fundamental characteristics of BEA can be exploited to generate effective implementations on vector and parallel computers. In this paper, the results of some of these investigations are discussed. The performance of overall algorithms for BEA on vector supercomputers, massively data parallel single instruction multiple data (SIMD), and relatively fine grained distributed memory multiple instruction multiple data (MIMD) computer systems is described. Some general trends and conclusions are discussed, along with indications of future developments that may prove fruitful in this regard.

  7. Interactive computer code for dynamic and soil structure interaction analysis

    SciTech Connect

    Mulliken, J.S.

    1995-12-01

    A new interactive computer code is presented in this paper for dynamic and soil-structure interaction (SSI) analyses. The computer program FETA (Finite Element Transient Analysis) is a self-contained interactive graphics environment for IBM-PCs that is used for the development of structural and soil models as well as post-processing of dynamic analysis output. Full 3-D isometric views of the soil-structure system, animation of displacements, frequency and time domain responses at nodes, and response spectra are all graphically available simply by pointing and clicking with a mouse. FETA's finite element solver performs 2-D and 3-D frequency and time domain soil-structure interaction analyses. The solver can be directly accessed from the graphical interface on a PC, or run on a number of other computer platforms.

  8. Computer analysis of shells of revolution using asymptotic results

    NASA Technical Reports Server (NTRS)

    Steele, C. R.; Ranjan, G. V.; Goto, C.; Pulliam, T. H.

    1979-01-01

    It is suggested that asymptotic results for the behavior of thin shells can be incorporated in a general computer code for the analysis of a complex shell structure. The advantage, when compared to existing finite difference or finite element codes, is a substantial reduction in computational labor with the capability of working to a specified level of accuracy. A reduction in user preparation time and dependence on user judgment is also gained, since mesh spacing can be generated internally. The general theory is described in this paper, as well as its implementation in the computer code FAST 1 (Functional Algorithm for Shell Theory) for the analysis of general axisymmetric shell structures with axisymmetric loading.

  9. Computational Methods for the Analysis of Array Comparative Genomic Hybridization

    PubMed Central

    Chari, Raj; Lockwood, William W.; Lam, Wan L.

    2006-01-01

    Array comparative genomic hybridization (array CGH) is a technique for assaying the copy number status of cancer genomes. The widespread use of this technology has led to a rapid accumulation of high-throughput data, which in turn has prompted the development of computational strategies for the analysis of array CGH data. Here we explain the principles behind array image processing, data visualization and genomic profile analysis, review currently available software packages, and raise considerations for future software development. PMID:17992253
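
    One elementary step in such a pipeline - smoothing the per-probe log2 ratios along the genome and thresholding for gains and losses - can be sketched as follows; production packages use formal change-point segmentation instead, and the data here are synthetic.

        """Sketch: smooth log2 ratios and flag copy-number gains/losses."""
        import numpy as np

        rng = np.random.default_rng(2)
        log2r = rng.normal(0.0, 0.2, 300)   # per-probe log2 ratios along a chromosome
        log2r[100:150] += 0.58              # simulated single-copy gain, ~log2(3/2)

        def moving_average(x, w=15):
            return np.convolve(x, np.ones(w) / w, mode="same")

        smooth = moving_average(log2r)
        calls = np.where(smooth > 0.3, "gain",
                         np.where(smooth < -0.3, "loss", "."))
        print("probes called gain:", int(np.sum(calls == "gain")))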

  10. A statistical package for computing time and frequency domain analysis

    NASA Technical Reports Server (NTRS)

    Brownlow, J.

    1978-01-01

    The spectrum analysis (SPA) program is a general purpose digital computer program designed to aid in data analysis. The program does time and frequency domain statistical analyses as well as some preanalysis data preparation. The capabilities of the SPA program include linear trend removal and/or digital filtering of data, plotting and/or listing of both filtered and unfiltered data, time domain statistical characterization of data, and frequency domain statistical characterization of data.
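
    Two of the stages named above, linear trend removal and frequency-domain characterization, reduce to a few lines of numpy; the synthetic signal and periodogram below are an illustrative sketch, not SPA's actual algorithms.

        """Sketch: linear detrend followed by an FFT periodogram."""
        import numpy as np

        fs, n = 100.0, 2048                    # sample rate (Hz) and record length
        t = np.arange(n) / fs
        rng = np.random.default_rng(3)
        x = 0.05 * t + np.sin(2 * np.pi * 5.0 * t) + 0.3 * rng.normal(size=n)

        coef = np.polyfit(t, x, 1)             # least-squares linear trend
        x_detrended = x - np.polyval(coef, t)

        X = np.fft.rfft(x_detrended)           # one-sided spectrum
        psd = np.abs(X) ** 2 / (fs * n)        # periodogram PSD estimate
        freqs = np.fft.rfftfreq(n, d=1 / fs)
        print(f"spectral peak at {freqs[np.argmax(psd[1:]) + 1]:.2f} Hz")  # ~5 Hz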

  11. Computer-aided-analysis of linear control system robustness

    NASA Technical Reports Server (NTRS)

    Stengel, Robert F.; Ray, Laura R.

    1990-01-01

    Stochastic robustness is a simple technique used to estimate the stability and performance robustness of linear, time-invariant systems. The use of high-speed graphics workstations and control system design software in stochastic robustness analysis is discussed and demonstrated. It is shown that stochastic robustness makes good use of modern computational and graphic tools, and it is easily implemented using commercial control system design and analysis software.
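
    A minimal Monte Carlo version of stochastic robustness can be sketched directly: sample the uncertain plant parameters, close the loop with a fixed gain, and estimate the probability that any closed-loop eigenvalue reaches the right half-plane. The second-order plant, parameter distributions, and gain below are illustrative assumptions, not the paper's examples.

        """Sketch: Monte Carlo estimate of the probability of instability."""
        import numpy as np

        rng = np.random.default_rng(4)
        K = np.array([[4.0, 0.2]])             # fixed state-feedback gain (assumed)
        B = np.array([[0.0], [1.0]])

        N, unstable = 20_000, 0
        for _ in range(N):
            wn = rng.normal(2.0, 0.4)          # uncertain natural frequency
            zeta = rng.normal(0.05, 0.1)       # uncertain damping ratio
            A = np.array([[0.0, 1.0], [-wn**2, -2.0 * zeta * wn]])
            eigs = np.linalg.eigvals(A - B @ K)
            unstable += bool(np.any(eigs.real >= 0.0))

        print(f"estimated probability of instability: {unstable / N:.4f}")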

  12. Computational models for the nonlinear analysis of reinforced concrete plates

    NASA Technical Reports Server (NTRS)

    Hinton, E.; Rahman, H. H. A.; Huq, M. M.

    1980-01-01

    A finite element computational model for the nonlinear analysis of reinforced concrete solid, stiffened and cellular plates is briefly outlined. Typically, Mindlin elements are used to model the plates whereas eccentric Timoshenko elements are adopted to represent the beams. The layering technique, common in the analysis of reinforced concrete flexural systems, is incorporated in the model. The proposed model provides an inexpensive and reasonably accurate approach which can be extended for use with voided plates.

  13. A Computer Aided Statistical Covariance Program for Missile System Analysis

    DTIC Science & Technology

    1974-04-01

    A Computer Aided Statistical Covariance Program for Missile System Analysis, by James R. Rowland and V. M. Gupta, School of Electrical Engineering, Office of Engineering Research, Oklahoma State University; prepared for the U.S. Army Missile Command. Approved for public release; distribution unlimited.

  14. Validation of the NESSUS probabilistic finite element analysis computer program

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.; Burnside, O. H.

    1988-01-01

    A computer program, NESSUS, is being developed as part of a NASA-sponsored project to develop probabilistic structural analysis methods for propulsion system components. This paper describes the process of validating the NESSUS code, as it has been developed to date, and presents numerical results comparing NESSUS and exact solutions for a set of selected problems.

  15. Connecting Performance Analysis and Visualization to Advance Extreme Scale Computing

    SciTech Connect

    Bremer, Peer-Timo; Mohr, Bernd; Schulz, Martin; Pascucci, Valerio; Gamblin, Todd; Brunst, Holger

    2015-07-29

    The characterization, modeling, analysis, and tuning of software performance has been a central topic in High Performance Computing (HPC) since its early beginnings. The overall goal is to make HPC software run faster on particular hardware, either through better scheduling, on-node resource utilization, or more efficient distributed communication.

  16. The NASA NASTRAN structural analysis computer program - New content

    NASA Technical Reports Server (NTRS)

    Weidman, D. J.

    1978-01-01

    Capabilities of a NASA-developed structural analysis computer program, NASTRAN, are evaluated with reference to finite-element modelling. Applications include the automotive industry as well as aerospace. It is noted that the range of sub-programs within NASTRAN has expanded, while keeping user cost low.

  17. Detection of microcalcification in computer-assisted mammogram analysis

    NASA Astrophysics Data System (ADS)

    Naghdy, Golshah A.; Naghdy, Fazel; Yue, L.; Drijarkara, A. P.

    1999-07-01

    The latest trends in computer-assisted mammogram analysis are reviewed, and two new methods developed by the authors for automatic detection of microcalcifications (MCs) are presented. The first method is based on wavelet neurone feature detectors and ART classifiers, while the second utilizes fuzzy rules for detection and grading of MCs.

  18. Audience Analysis: A Computer Assisted Instrument for Speech Education.

    ERIC Educational Resources Information Center

    Merritt, Floyd E.

    This paper reports on a combination questionnaire-attitude test designed to be used by speech instructors for the purpose of audience analysis. The test is divided into two parts and is scored by a computer. Part one requires the student to check items pertaining to class level, occupational goal, marital status, military service, high school…

  19. Monolithically integrated Helmholtz coils by 3-dimensional printing

    NASA Astrophysics Data System (ADS)

    Li, Longguang; Abedini-Nassab, Roozbeh; Yellen, Benjamin B.

    2014-06-01

    3D printing technology is of great interest for the monolithic fabrication of integrated systems; however, it is a challenge to introduce metallic components into 3D printed molds to enable broader device functionality. Here, we develop a technique for constructing a multi-axial Helmholtz coil by injecting a eutectic liquid metal Gallium Indium alloy (EGaIn) into helically shaped orthogonal cavities constructed in a 3D printed block. The tri-axial solenoids each carry up to 3.6 A of electrical current and produce magnetic field up to 70 G. Within the central section of the coil, the field variation is less than 1% and is in agreement with theory. The flow rates and critical pressures required to fill the 3D cavities with liquid metal also agree with theoretical predictions and provide scaling trends for filling the 3D printed parts. These monolithically integrated solenoids may find future applications in electronic cell culture platforms, atomic traps, and miniaturized chemical analysis systems based on nuclear magnetic resonance.
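
    The Helmholtz geometry itself is easy to check numerically: summing the on-axis fields of two coaxial loops spaced one radius apart reproduces the flat central region the authors measure. The radius, number of turns, and current below are assumed round numbers, not the device's actual coil parameters.

        """Sketch: on-axis field and central uniformity of a Helmholtz pair."""
        import numpy as np

        mu0 = 4e-7 * np.pi
        R, N_turns, I = 0.01, 10, 3.6          # 1 cm radius, 10 turns, 3.6 A (assumed)

        def loop_Bz(z, z0):
            """On-axis field (tesla) of one loop centered at axial position z0."""
            return mu0 * N_turns * I * R**2 / (2 * (R**2 + (z - z0) ** 2) ** 1.5)

        z = np.linspace(-0.2 * R, 0.2 * R, 101)
        Bz = loop_Bz(z, -R / 2) + loop_Bz(z, +R / 2)   # pair at +/- R/2

        B0 = loop_Bz(0.0, -R / 2) + loop_Bz(0.0, +R / 2)
        print(f"center field: {B0 * 1e4:.1f} G")
        print(f"variation over the central +/-20% of R: {(Bz.max() - Bz.min()) / B0:.3%}")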

  20. Monolithically integrated Helmholtz coils by 3-dimensional printing

    SciTech Connect

    Li, Longguang; Abedini-Nassab, Roozbeh; Yellen, Benjamin B.

    2014-06-23

    3D printing technology is of great interest for the monolithic fabrication of integrated systems; however, it is a challenge to introduce metallic components into 3D printed molds to enable broader device functionality. Here, we develop a technique for constructing a multi-axial Helmholtz coil by injecting a eutectic liquid metal Gallium Indium alloy (EGaIn) into helically shaped orthogonal cavities constructed in a 3D printed block. The tri-axial solenoids each carry up to 3.6 A of electrical current and produce magnetic field up to 70 G. Within the central section of the coil, the field variation is less than 1% and is in agreement with theory. The flow rates and critical pressures required to fill the 3D cavities with liquid metal also agree with theoretical predictions and provide scaling trends for filling the 3D printed parts. These monolithically integrated solenoids may find future applications in electronic cell culture platforms, atomic traps, and miniaturized chemical analysis systems based on nuclear magnetic resonance.

  1. Benchmarking undedicated cloud computing providers for analysis of genomic datasets.

    PubMed

    Yazar, Seyhan; Gooden, George E C; Mackey, David A; Hewitt, Alex W

    2014-01-01

    A major bottleneck in biological discovery is now emerging at the computational level. Cloud computing offers a dynamic means whereby small and medium-sized laboratories can rapidly adjust their computational capacity. We benchmarked two established cloud computing services, Amazon Web Services Elastic MapReduce (EMR) on Amazon EC2 instances and Google Compute Engine (GCE), using publicly available genomic datasets (E.coli CC102 strain and a Han Chinese male genome) and a standard bioinformatic pipeline on a Hadoop-based platform. Wall-clock time for complete assembly differed by 52.9% (95% CI: 27.5-78.2) for E.coli and 53.5% (95% CI: 34.4-72.6) for human genome, with GCE being more efficient than EMR. The cost of running this experiment on EMR and GCE differed significantly, with the costs on EMR being 257.3% (95% CI: 211.5-303.1) and 173.9% (95% CI: 134.6-213.1) more expensive for E.coli and human assemblies respectively. Thus, GCE was found to outperform EMR both in terms of cost and wall-clock time. Our findings confirm that cloud computing is an efficient and potentially cost-effective alternative for analysis of large genomic datasets. In addition to releasing our cost-effectiveness comparison, we present available ready-to-use scripts for establishing Hadoop instances with Ganglia monitoring on EC2 or GCE.

  2. Chromosome breakage and sister chromatid exchange analysis in computer operators

    SciTech Connect

    Butler, M.G.; Yost, J.; Jenkins, B.B.

    1987-01-01

    Chromosome breakage analysis with Mitomycin C (MMC) and sister chromatid exchanges (SCE) were obtained on 10 computer operators with computer exposure for a minimum of 3 hours per day for 4 years and 10 control subjects matched for age and personal lifestyle. No difference was found between the two groups in the total number of chromatid and chromosome aberrations in cells grown at 48 and/or 96 hours in Mitomycin C (20 or 50 ng/ml final concentration). The average number of SCE per cell in approximately 30 cells from each person was 6.4 +/- 1.1 (mean +/- standard deviation) for the computer operators and 9.2 +/- 1.6 for the controls. This difference was significant (p < .001). The replicative index was significantly higher (p < .01) in computer operators than in control subjects. The number of SCE appeared not to be influenced by the years of computer exposure. Additional studies with larger sample sizes will be needed to determine whether significant differences exist in cell kinetics and sister chromatid exchanges in individuals employed as computer operators.

  3. 3-dimensional modeling of transcranial magnetic stimulation: Design and application

    NASA Astrophysics Data System (ADS)

    Salinas, Felipe Santiago

    Over the past three decades, transcranial magnetic stimulation (TMS) has emerged as an effective tool for many research, diagnostic and therapeutic applications in humans. TMS delivers highly localized brain stimulations via non-invasive externally applied magnetic fields. This non-invasive, painless technique provides researchers and clinicians a unique tool capable of stimulating both the central and peripheral nervous systems. However, a complete analysis of the macroscopic electric fields produced by TMS has not yet been performed. In this dissertation, we present a thorough examination of the total electric field induced by TMS in air and a realistic head model with clinically relevant coil poses. In the first chapter, a detailed account of TMS coil wiring geometry was shown to provide significant improvements in the accuracy of primary E-field calculations. Three-dimensional models which accounted for the TMS coil's wire width, height, shape and number of turns clearly improved the fit of calculated-to-measured E-fields near the coil body. Detailed primary E-field models were accurate up to the surface of the coil body (within 0.5% of measured values) whereas simple models were often inadequate (up to 32% different from measured). In the second chapter, we addressed the importance of the secondary E-field created by surface charge accumulation during TMS using the boundary element method (BEM). 3-D models were developed using simple head geometries in order to test the model and compare it with measured values. The effects of tissue geometry, size and conductivity were also investigated. Finally, a realistic head model was used to assess the effect of multiple surfaces on the total E-field. We found that secondary E-fields have the greatest impact at areas in close proximity to each tissue layer. Throughout the head, the secondary E-field magnitudes were predominantly between 25% and 45% of the primary E-fields magnitude. The direction of the secondary E

  4. Aerodynamic design optimization using sensitivity analysis and computational fluid dynamics

    NASA Technical Reports Server (NTRS)

    Baysal, Oktay; Eleshaky, Mohamed E.

    1991-01-01

    A new and efficient method is presented for aerodynamic design optimization, based on a computational fluid dynamics (CFD) sensitivity analysis algorithm. The method is applied to design a scramjet-afterbody configuration for optimized axial thrust. The Euler equations are solved for the inviscid analysis of the flow, which in turn provides the objective function and the constraints. The CFD analysis is then coupled with an optimization procedure that uses a constrained minimization method. The sensitivity coefficients, i.e., gradients of the objective function and the constraints, needed for the optimization are obtained using a quasi-analytical method rather than the traditional brute-force method of finite difference approximations. During the one-dimensional search of the optimization procedure, an approximate flow analysis (predicted flow) based on a first-order Taylor series expansion is used to reduce the computational cost. Finally, the sensitivity of the optimum objective function to various design parameters, which are kept constant during the optimization, is computed to predict new optimum solutions. The flow analysis results for the demonstrative example are compared with experimental data. It is shown that the method is more efficient than the traditional methods.

  5. Practical Use of Computationally Frugal Model Analysis Methods

    SciTech Connect

    Hill, Mary C.; Kavetski, Dmitri; Clark, Martyn; Ye, Ming; Arabi, Mazdak; Lu, Dan; Foglia, Laura; Mehl, Steffen

    2015-03-21

    Computationally frugal methods of model analysis can provide substantial benefits when developing models of groundwater and other environmental systems. Model analysis includes ways to evaluate model adequacy and to perform sensitivity and uncertainty analysis. Frugal methods typically require 10s of parallelizable model runs; their convenience allows for other uses of the computational effort. We suggest that model analysis be posed as a set of questions used to organize methods that range from frugal to expensive (requiring 10,000 model runs or more). This encourages focus on method utility, even when methods have starkly different theoretical backgrounds. We note that many frugal methods are more useful when unrealistic process-model nonlinearities are reduced. Inexpensive diagnostics are identified for determining when frugal methods are advantageous. Examples from the literature are used to demonstrate local methods and the diagnostics. We suggest that the greater use of computationally frugal model analysis methods would allow questions such as those posed in this work to be addressed more routinely, allowing the environmental sciences community to obtain greater scientific insight from the many ongoing and future modeling efforts.

  6. Practical Use of Computationally Frugal Model Analysis Methods

    DOE PAGES

    Hill, Mary C.; Kavetski, Dmitri; Clark, Martyn; ...

    2015-03-21

    Computationally frugal methods of model analysis can provide substantial benefits when developing models of groundwater and other environmental systems. Model analysis includes ways to evaluate model adequacy and to perform sensitivity and uncertainty analysis. Frugal methods typically require 10s of parallelizable model runs; their convenience allows for other uses of the computational effort. We suggest that model analysis be posed as a set of questions used to organize methods that range from frugal to expensive (requiring 10,000 model runs or more). This encourages focus on method utility, even when methods have starkly different theoretical backgrounds. We note that many frugal methods are more useful when unrealistic process-model nonlinearities are reduced. Inexpensive diagnostics are identified for determining when frugal methods are advantageous. Examples from the literature are used to demonstrate local methods and the diagnostics. We suggest that the greater use of computationally frugal model analysis methods would allow questions such as those posed in this work to be addressed more routinely, allowing the environmental sciences community to obtain greater scientific insight from the many ongoing and future modeling efforts.

  7. Computed tomographic beam-hardening artefacts: mathematical characterization and analysis.

    PubMed

    Park, Hyoung Suk; Chung, Yong Eun; Seo, Jin Keun

    2015-06-13

    This paper presents a mathematical characterization and analysis of beam-hardening artefacts in X-ray computed tomography (CT). In the field of dental and medical radiography, metal artefact reduction in CT is becoming increasingly important as artificial prostheses and metallic implants become more widespread in ageing populations. Metal artefacts are mainly caused by the beam-hardening of polychromatic X-ray photon beams, which causes mismatch between the actual sinogram data and the data model being the Radon transform of the unknown attenuation distribution in the CT reconstruction algorithm. We investigate the beam-hardening factor through a mathematical analysis of the discrepancy between the data and the Radon transform of the attenuation distribution at a fixed energy level. Separation of cupping artefacts from beam-hardening artefacts allows causes and effects of streaking artefacts to be analysed. Various computer simulations and experiments are performed to support our mathematical analysis.
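
    The data-model mismatch analyzed here can be reproduced with a two-energy toy spectrum: for a polychromatic beam, -ln(I/I0) is no longer linear in the path length, so measured projections fall below the monochromatic Radon model. The spectral weights and attenuation coefficients below are illustrative, not measured values.

        """Sketch: beam hardening as nonlinearity of -ln(I/I0) in path length."""
        import numpy as np

        weights = np.array([0.5, 0.5])   # two-bin "spectrum": photon fractions
        mu = np.array([0.40, 0.20])      # attenuation (1/cm); low energy absorbs more

        L = np.linspace(0.0, 20.0, 9)                    # path lengths (cm)
        I_ratio = weights @ np.exp(-np.outer(mu, L))     # polychromatic transmission
        p_poly = -np.log(I_ratio)                        # measured projection value
        p_mono = (weights @ mu) * L                      # ideal linear (Radon) model

        for li, pp, pm in zip(L, p_poly, p_mono):
            print(f"L = {li:5.1f} cm   measured = {pp:6.3f}   linear model = {pm:6.3f}")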

  8. Cartographic Modeling: Computer-assisted Analysis of Spatially Defined Neighborhoods

    NASA Technical Reports Server (NTRS)

    Berry, J. K.; Tomlin, C. D.

    1982-01-01

    Cartographic models addressing a wide variety of applications are composed of fundamental map processing operations. These primitive operations are neither data base nor application-specific. By organizing the set of operations into a mathematical-like structure, the basis for a generalized cartographic modeling framework can be developed. Among the major classes of primitive operations are those associated with reclassifying map categories, overlaying maps, determining distance and connectivity, and characterizing cartographic neighborhoods. The conceptual framework of cartographic modeling is established and techniques for characterizing neighborhoods are used as a means of demonstrating some of the more sophisticated procedures of computer-assisted map analysis. A cartographic model for assessing effective roundwood supply is briefly described as an example of a computer analysis. Most of the techniques described have been implemented as part of the map analysis package developed at the Yale School of Forestry and Environmental Studies.
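
    On a raster, the neighborhood-characterizing primitives described above reduce to focal-window statistics; the sketch below computes a 3x3 focal mean and a derived local-relief layer on a toy elevation grid, in the spirit of map-algebra operations.

        """Sketch: a focal (neighborhood) operation on a small raster."""
        import numpy as np
        from scipy.ndimage import uniform_filter

        elevation = np.array([[3, 3, 4, 5],
                              [3, 4, 5, 6],
                              [4, 5, 6, 8],
                              [5, 6, 8, 9]], dtype=float)

        # Focal mean: each cell becomes the average of its 3x3 neighborhood
        focal_mean = uniform_filter(elevation, size=3, mode="nearest")

        # A derived overlay layer: local relief relative to the neighborhood
        relief = elevation - focal_mean
        print(np.round(relief, 2))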

  9. Computer Vision-Based Image Analysis of Bacteria.

    PubMed

    Danielsen, Jonas; Nordenfelt, Pontus

    2017-01-01

    Microscopy is an essential tool for studying bacteria, but it is today mostly used in a qualitative or, possibly, semi-quantitative manner, often involving time-consuming manual analysis. This makes it difficult to assess the importance of individual bacterial phenotypes, especially when there are only subtle differences in features such as shape, size, or signal intensity, which are typically very difficult for the human eye to discern. With computer vision-based image analysis - where computer algorithms interpret image data - it is possible to achieve objective and reproducible quantification of images in an automated fashion. Besides being a much more efficient and consistent way to analyze images, this can also reveal important information that was previously hard to extract with traditional methods. Here, we present basic concepts of automated image processing, segmentation and analysis that can be relatively easily implemented for use in bacterial research.
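
    A minimal version of the threshold-label-measure workflow the chapter describes can be assembled from standard scientific Python. The sketch below runs on a synthetic image and uses a crude global threshold; the blob positions and parameters are illustrative only:

      import numpy as np
      from scipy import ndimage

      # Synthetic "micrograph": dark background with a few bright blobs
      # standing in for bacteria (replace with a real image array in practice).
      rng = np.random.default_rng(1)
      img = rng.normal(10, 2, (128, 128))
      for cy, cx in [(30, 40), (70, 90), (100, 30)]:
          yy, xx = np.ogrid[:128, :128]
          img[(yy - cy) ** 2 + (xx - cx) ** 2 < 25] += 40

      # Global threshold (a simple stand-in for Otsu or adaptive methods).
      mask = img > img.mean() + 3 * img.std()

      # Label connected components and quantify each object.
      labels, n = ndimage.label(mask)
      sizes = ndimage.sum(mask, labels, index=range(1, n + 1))
      centroids = ndimage.center_of_mass(mask, labels, index=range(1, n + 1))

      print(f"{n} objects detected")
      for i, (s, c) in enumerate(zip(sizes, centroids), start=1):
          print(f"object {i}: area={int(s)} px, centroid=({c[0]:.1f}, {c[1]:.1f})")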

  10. Improved Flow Modeling in Transient Reactor Safety Analysis Computer Codes

    SciTech Connect

    Holowach, M.J.; Hochreiter, L.E.; Cheung, F.B.

    2002-07-01

    A method of accounting for fluid-to-fluid shear between calculational cells over a wide range of flow conditions envisioned in reactor safety studies has been developed such that it may be easily implemented into a computer code such as COBRA-TF for more detailed subchannel analysis. At a given nodal height in the calculational model, equivalent hydraulic diameters are determined for each specific calculational cell using either laminar or turbulent velocity profiles. The velocity profile may be determined from a separate CFD (Computational Fluid Dynamics) analysis, experimental data, or existing semi-empirical relationships. The equivalent hydraulic diameter is then applied to the wall drag force calculation so as to determine the appropriate equivalent fluid-to-fluid shear caused by the wall for each cell based on the input velocity profile. This means of assigning the shear to a specific cell is independent of the actual wetted perimeter and flow area of the calculational cell. The use of this equivalent hydraulic diameter for each cell within a calculational subchannel results in a representative velocity profile, which can further increase the accuracy and detail of heat transfer and fluid flow modeling within the subchannel when utilizing a thermal hydraulics systems analysis computer code such as COBRA-TF. Utilizing COBRA-TF with the flow modeling enhancement results in increased accuracy for a coarse-mesh model without the significantly greater computational and time requirements of a full-scale 3D (three-dimensional) transient CFD calculation. (authors)
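
    To make the equivalent-hydraulic-diameter idea concrete, the sketch below works the laminar case only: equating the Darcy wall shear tau_w = (f/8)*rho*u^2 with f = 64/Re to the shear mu*(du/dy) implied by an input velocity profile gives D_eq = 8*u/(du/dy at the wall). This is a toy derivation consistent with the description above, not the COBRA-TF implementation, and all numbers are illustrative:

      import numpy as np

      def equivalent_hydraulic_diameter(y, u):
          # Laminar matching sketch: Darcy friction f = 64/Re implies a wall
          # shear tau_w = 8*mu*u/D; equating it with tau_w = mu*(du/dy)|wall
          # cancels the viscosity, leaving D_eq = 8*u_cell/(du/dy at wall).
          dudy_wall = (u[1] - u[0]) / (y[1] - y[0])   # one-sided wall gradient
          u_cell = u.mean()                           # cell-average velocity
          return 8.0 * u_cell / dudy_wall

      # Example: parabolic (laminar) profile over a 10 mm half-gap, as might
      # come from a separate CFD run, experimental data, or a correlation.
      y = np.linspace(1e-4, 0.01, 50)                 # distance from wall (m)
      u = 1.0 * (2 * y / 0.01 - (y / 0.01) ** 2)      # velocity (m/s)
      print(f"D_eq = {equivalent_hydraulic_diameter(y, u):.4f} m")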

  11. Applied analysis/computational mathematics. Final report 1993

    SciTech Connect

    Lax, P.; Berger, M.

    1993-12-01

    This is the final report for the Courant Mathematics and Computing Laboratory (CMCL) research program for the years 1991-1993. Our research efforts encompass the formulation of physical problems in terms of mathematical models (both old and new), the mathematical analysis of such models, and their numerical resolution. This last step involves the development and implementation of efficient methods for large scale computation. Our analytic and numerical work often goes hand in hand; new theoretical approaches often have numerical counterparts, while numerical experimentation often suggests avenues for analytical investigation.

  12. EST analysis pipeline: use of distributed computing resources.

    PubMed

    González, Francisco Javier; Vizcaíno, Juan Antonio

    2011-01-01

    This chapter describes how a pipeline for the analysis of expressed sequence tag (EST) data can be implemented, based on our previous experience generating ESTs from Trichoderma spp. We focus on key steps in the workflow, such as the processing of raw data from the sequencers, the clustering of ESTs, and the functional annotation of the sequences using BLAST, InterProScan, and BLAST2GO. Some of the steps require the use of intensive computing power. Since these resources are not available for small research groups or institutes without bioinformatics support, an alternative will be described: the use of distributed computing resources (local grids and Amazon EC2).

  13. Integration of 3-dimensional surgical and orthodontic technologies with orthognathic "surgery-first" approach in the management of unilateral condylar hyperplasia.

    PubMed

    Janakiraman, Nandakumar; Feinberg, Mark; Vishwanath, Meenakshi; Nalaka Jayaratne, Yasas Shri; Steinbacher, Derek M; Nanda, Ravindra; Uribe, Flavio

    2015-12-01

    Recent innovations in technology and techniques in both surgical and orthodontic fields can be integrated, especially when treating subjects with facial asymmetry. In this article, we present a treatment method consisting of 3-dimensional computer-aided surgical and orthodontic planning, which was implemented with the orthognathic surgery-first approach. Virtual surgical planning, fabrication of surgical splints using the computer-aided design/computer-aided manufacturing technique, and prediction of final orthodontic occlusion using virtual planning with robotically assisted customized archwires were integrated for this patient. Excellent esthetic and occlusal outcomes were obtained in a short period of 5.5 months.

  14. Dissection of the host-pathogen interaction in human tuberculosis using a bioengineered 3-dimensional model

    PubMed Central

    Tezera, Liku B; Bielecka, Magdalena K; Chancellor, Andrew; Reichmann, Michaela T; Shammari, Basim Al; Brace, Patience; Batty, Alex; Tocheva, Annie; Jogai, Sanjay; Marshall, Ben G; Tebruegge, Marc; Jayasinghe, Suwan N; Mansour, Salah; Elkington, Paul T

    2017-01-01

    Cell biology differs between traditional cell culture and 3-dimensional (3-D) systems, and is modulated by the extracellular matrix. Experimentation in 3-D presents challenges, especially with virulent pathogens. Mycobacterium tuberculosis (Mtb) kills more humans than any other infection and is characterised by a spatially organised immune response and extracellular matrix remodelling. We developed a 3-D system incorporating virulent mycobacteria, primary human blood mononuclear cells and collagen–alginate matrix to dissect the host-pathogen interaction. Infection in 3-D led to greater cellular survival and permitted longitudinal analysis over 21 days. Key features of human tuberculosis develop, and extracellular matrix integrity favours the host over the pathogen. We optimised multiparameter readouts to study emerging therapeutic interventions: cytokine supplementation, host-directed therapy and immunoaugmentation. Each intervention modulates the host-pathogen interaction, but has both beneficial and harmful effects. This methodology has wide applicability to investigate infectious, inflammatory and neoplastic diseases and develop novel drug regimes and vaccination approaches. DOI: http://dx.doi.org/10.7554/eLife.21283.001 PMID:28063256

  15. Polarization-independent efficiency enhancement of organic solar cells by using 3-dimensional plasmonic electrode

    NASA Astrophysics Data System (ADS)

    Li, Xuanhua; Choy, Wallace C. H.; Ren, Xingang; Xin, Jianzhuo; Lin, Peng; Leung, Dennis C. W.

    2013-04-01

    Plasmonic back reflectors have recently become a promising strategy for realizing efficient organic solar cells (OSCs). Since plasmonic effects are strongly sensitive to light polarization, it is highly desirable to simultaneously achieve a polarization-independent response and enhanced power conversion efficiency (PCE) by designing the nanostructured geometry of the plasmonic reflector electrode. Here, through a strategic analysis of 2-dimensional (2D) grating and 3-dimensional (3D) patterns with similar periodicity as plasmonic back reflectors, we find that the OSCs with the 3D pattern achieve the best PCE enhancement of 24.6%, while the OSCs with the 2D pattern offer a 17.5% PCE enhancement compared with the optimized control OSCs. Importantly, compared with the 2D pattern, the 3D pattern shows a polarization-independent plasmonic response, which will greatly extend its uses in photovoltaic applications. This work shows the significance of carefully selecting and designing the geometry of plasmonic nanostructures in achieving highly efficient, polarization-independent plasmonic OSCs.

  16. Probabilistic Computer Analysis for Rapid Evaluation of Structures.

    SciTech Connect

    XU, JIM

    2007-03-29

    P-CARES 2.0.0, Probabilistic Computer Analysis for Rapid Evaluation of Structures, was developed for NRC staff use to determine the validity and accuracy of the analysis methods used by various utilities for structural safety evaluations of nuclear power plants. P-CARES provides the capability to effectively evaluate the probabilistic seismic response using simplified soil and structural models and to quickly check the validity and/or accuracy of the SSI data received from applicants and licensees. The code is organized in a modular format with the basic modules of the system performing static, seismic, and nonlinear analysis.

  17. Preoperative 3-dimensional Magnetic Resonance Imaging of Uterine Myoma and Endometrium Before Myomectomy.

    PubMed

    Kim, Young Jae; Kim, Kwang Gi; Lee, Sa Ra; Lee, Seung Hyun; Kang, Byung Chul

    2017-02-01

    Uterine myomas are the most common gynecologic benign tumor affecting women of childbearing age, and myomectomy is the main surgical option to preserve the uterus and fertility. During myomectomy for women with multiple myomas, it is advisable to identify and remove as many as possible to decrease the risk of future myomectomies. With deficient preoperative imaging, gynecologists are challenged to identify the location and size of myomas and the endometrium, which, in turn, can lead to uterine rupture during future pregnancies. Current conventional 2-dimensional imaging has limitations in identifying precise locations of multiple myomas and the endometrium. In our experience, we preferred to use 3-dimensional imaging to delineate the myomas, endometrium, or blood vessels, which we were able to successfully reconstruct by using the following imaging method. To achieve 3-dimensional imaging, we matched T2 turbo spin echo images to detect uterine myomas and endometria with T1 high-resolution isotropic volume excitation-post images used to detect blood vessels by using an algorithm based on the 3-dimensional region growing method. Then, we produced images of the uterine myomas, endometria, and blood vessels using a 3-dimensional surface rendering method and successfully reconstructed selective 3-dimensional imaging for uterine myomas, endometria, and adjacent blood vessels. A Web-based survey was sent to 66 gynecologists concerning imaging techniques used before myomectomy. Twenty-eight of 36 responding gynecologists answered that the 3-dimensional image produced in the current study is preferred to conventional 2-dimensional magnetic resonance imaging in identifying precise locations of uterine myomas and endometria. The proposed 3-dimensional magnetic resonance imaging method successfully reconstructed uterine myomas, endometria, and adjacent vessels. We propose that this will be a helpful adjunct to uterine myomectomy as a preoperative imaging technique in the future.
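
    The 3-dimensional region growing method mentioned above is, at its core, a flood fill in three dimensions. The sketch below is a bare-bones version with a fixed intensity tolerance and 6-connectivity, run on a toy volume; a clinical pipeline would add preprocessing and morphological cleanup not shown here:

      import numpy as np
      from collections import deque

      def region_grow_3d(vol, seed, tol):
          # Breadth-first flood fill: accept 6-connected voxels whose
          # intensity lies within `tol` of the seed intensity.
          mask = np.zeros(vol.shape, dtype=bool)
          seed_val = vol[seed]
          queue = deque([seed])
          mask[seed] = True
          while queue:
              z, y, x = queue.popleft()
              for dz, dy, dx in ((1,0,0), (-1,0,0), (0,1,0), (0,-1,0), (0,0,1), (0,0,-1)):
                  nz, ny, nx = z + dz, y + dy, x + dx
                  if (0 <= nz < vol.shape[0] and 0 <= ny < vol.shape[1]
                          and 0 <= nx < vol.shape[2] and not mask[nz, ny, nx]
                          and abs(vol[nz, ny, nx] - seed_val) <= tol):
                      mask[nz, ny, nx] = True
                      queue.append((nz, ny, nx))
          return mask

      # Toy volume: a bright 8x8x8 cube (a stand-in "myoma") on a dark background.
      vol = np.zeros((32, 32, 32))
      vol[12:20, 12:20, 12:20] = 100.0
      segmented = region_grow_3d(vol, seed=(15, 15, 15), tol=10.0)
      print(f"segmented voxels: {segmented.sum()} (expected {8**3})")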

  18. CFD Analysis and Design Optimization Using Parallel Computers

    NASA Technical Reports Server (NTRS)

    Martinelli, Luigi; Alonso, Juan Jose; Jameson, Antony; Reuther, James

    1997-01-01

    A versatile and efficient multi-block method is presented for the simulation of both steady and unsteady flow, as well as aerodynamic design optimization of complete aircraft configurations. The compressible Euler and Reynolds Averaged Navier-Stokes (RANS) equations are discretized using a high resolution scheme on body-fitted structured meshes. An efficient multigrid implicit scheme is implemented for time-accurate flow calculations. Optimum aerodynamic shape design is achieved at very low cost using an adjoint formulation. The method is implemented on parallel computing systems using the MPI message passing interface standard to ensure portability. The results demonstrate that, by combining highly efficient algorithms with parallel computing, it is possible to perform detailed steady and unsteady analysis as well as automatic design for complex configurations using the present generation of parallel computers.

  19. A Computational Approach to Qualitative Analysis in Large Textual Datasets

    PubMed Central

    Evans, Michael S.

    2014-01-01

    In this paper I introduce computational techniques to extend qualitative analysis into the study of large textual datasets. I demonstrate these techniques by using probabilistic topic modeling to analyze a broad sample of 14,952 documents published in major American newspapers from 1980 through 2012. I show how computational data mining techniques can identify and evaluate the significance of qualitatively distinct subjects of discussion across a wide range of public discourse. I also show how examining large textual datasets with computational methods can overcome methodological limitations of conventional qualitative methods, such as how to measure the impact of particular cases on broader discourse, how to validate substantive inferences from small samples of textual data, and how to determine if identified cases are part of a consistent temporal pattern. PMID:24498398
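
    Probabilistic topic modeling of the kind used here is readily available in standard libraries. The sketch below fits a two-topic LDA model to a six-document toy corpus; the documents and settings are invented for illustration (the study's corpus had 14,952 articles):

      from sklearn.feature_extraction.text import CountVectorizer
      from sklearn.decomposition import LatentDirichletAllocation

      # Tiny illustrative corpus standing in for thousands of news articles.
      docs = [
          "stem cell research funding debate in congress",
          "genome sequencing costs fall as technology improves",
          "senate votes on science funding bill",
          "new telescope observes distant galaxy formation",
          "astronomers measure galaxy rotation with new instrument",
          "house committee debates research budget",
      ]

      # Document-term matrix, then a 2-topic LDA model (toy-sized settings).
      vec = CountVectorizer(stop_words="english")
      dtm = vec.fit_transform(docs)
      lda = LatentDirichletAllocation(n_components=2, random_state=0)
      lda.fit(dtm)

      # Print the top words per topic, the usual qualitative readout.
      terms = vec.get_feature_names_out()
      for k, weights in enumerate(lda.components_):
          top = terms[weights.argsort()[-5:][::-1]]
          print(f"topic {k}: {', '.join(top)}")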

  20. A comparative analysis of soft computing techniques for gene prediction.

    PubMed

    Goel, Neelam; Singh, Shailendra; Aseri, Trilok Chand

    2013-07-01

    The rapid growth of genomic sequence data for both human and nonhuman species has made analyzing these sequences, especially predicting genes in them, very important, and it is currently the focus of many research efforts. Besides its scientific interest to the molecular biology and genomics community, gene prediction is of considerable importance in human health and medicine. A variety of gene prediction techniques have been developed for eukaryotes over the past few years. This article reviews and analyzes the application of certain soft computing techniques in gene prediction. First, the problem of gene prediction and its challenges are described. These are followed by different soft computing techniques along with their application to gene prediction. In addition, a comparative analysis of different soft computing techniques for gene prediction is given. Finally, some limitations of the current research activities and future research directions are provided.

  1. An integrated 3-Dimensional Genome Modeling Engine for data-driven simulation of spatial genome organization.

    PubMed

    Szałaj, Przemysław; Tang, Zhonghui; Michalski, Paul; Pietal, Michal J; Luo, Oscar J; Sadowski, Michał; Li, Xingwang; Radew, Kamen; Ruan, Yijun; Plewczynski, Dariusz

    2016-12-01

    ChIA-PET is a high-throughput mapping technology that reveals long-range chromatin interactions and provides insights into the basic principles of spatial genome organization and gene regulation mediated by specific protein factors. Recently, we showed that a single ChIA-PET experiment provides information at all genomic scales of interest, from the high-resolution locations of binding sites and enriched chromatin interactions mediated by specific protein factors, to the low resolution of nonenriched interactions that reflect topological neighborhoods of higher-order chromosome folding. This multilevel nature of ChIA-PET data offers an opportunity to use multiscale 3D models to study structural-functional relationships at multiple length scales, but doing so requires a structural modeling platform. Here, we report the development of 3D-GNOME (3-Dimensional Genome Modeling Engine), a complete computational pipeline for 3D simulation using ChIA-PET data. 3D-GNOME consists of three integrated components: a graph-distance-based heat map normalization tool, a 3D modeling platform, and an interactive 3D visualization tool. Using ChIA-PET and Hi-C data derived from human B-lymphocytes, we demonstrate the effectiveness of 3D-GNOME in building 3D genome models at multiple levels, including the entire genome, individual chromosomes, and specific segments at megabase (Mb) and kilobase (kb) resolutions of single average and ensemble structures. Further incorporation of CTCF-motif orientation and high-resolution looping patterns in 3D simulation provided additional reliability of potential biologically plausible topological structures.

  2. Realization of masticatory movement by 3-dimensional simulation of the temporomandibular joint and the masticatory muscles.

    PubMed

    Park, Jong-Tae; Lee, Jae-Gi; Won, Sung-Yoon; Lee, Sang-Hee; Cha, Jung-Yul; Kim, Hee-Jin

    2013-07-01

    Masticatory muscles are closely involved in mastication, pronunciation, and swallowing, and it is therefore important to study the specific functions and dynamics of the mandibular and masticatory muscles. However, the shortness of muscle fibers and the diversity of movement directions make it difficult to study and simplify the dynamics of mastication. The purpose of this study was to use 3-dimensional (3D) simulation to observe the functions and movements of each of the masticatory muscles and the mandible while chewing. To simulate the masticatory movement, computed tomographic images were taken from a single Korean volunteer (30-year-old man), and skull image data were reconstructed in 3D (Mimics; Materialise, Leuven, Belgium). The 3D-reconstructed masticatory muscles were then attached to the 3D skull model. The masticatory movements were animated using Maya (Autodesk, San Rafael, CA) based on the mandibular motion path. During unilateral chewing, the mandible was found to move laterally toward the functional side by contracting the contralateral lateral pterygoid and ipsilateral temporalis muscles. During the initial mouth opening, only hinge movement was observed at the temporomandibular joint. During this period, the entire mandible rotated approximately 13 degrees toward the bicondylar horizontal plane. Continued movement of the mandible to full mouth opening occurred simultaneously with sliding and hinge movements, and the mandible rotated approximately 17 degrees toward the center of the mandibular ramus. The described approach can yield data for use in face animation and other simulation systems and for elucidating the functional components related to contraction and relaxation of muscles during mastication.

  3. wolfPAC: building a high-performance distributed computing network for phylogenetic analysis using 'obsolete' computational resources.

    PubMed

    Reeves, Patrick A; Friedman, Philip H; Richards, Christopher M

    2005-01-01

    wolfPAC is an AppleScript-based software package that facilitates the use of numerous, remotely located Macintosh computers to perform computationally-intensive phylogenetic analyses using the popular application PAUP* (Phylogenetic Analysis Using Parsimony). It has been designed to utilise readily available, inexpensive processors and to encourage sharing of computational resources within the worldwide phylogenetics community.

  4. Practical Use of Computationally Frugal Model Analysis Methods.

    PubMed

    Hill, Mary C; Kavetski, Dmitri; Clark, Martyn; Ye, Ming; Arabi, Mazdak; Lu, Dan; Foglia, Laura; Mehl, Steffen

    2016-03-01

    Three challenges compromise the utility of mathematical models of groundwater and other environmental systems: (1) a dizzying array of model analysis methods and metrics make it difficult to compare evaluations of model adequacy, sensitivity, and uncertainty; (2) the high computational demands of many popular model analysis methods (requiring 1,000s, 10,000s, or more model runs) make them difficult to apply to complex models; and (3) many models are plagued by unrealistic nonlinearities arising from the numerical model formulation and implementation. This study proposes a strategy to address these challenges through a careful combination of model analysis and implementation methods. In this strategy, computationally frugal model analysis methods (often requiring a few dozen parallelizable model runs) play a major role, and computationally demanding methods are used for problems where (relatively) inexpensive diagnostics suggest the frugal methods are unreliable. We also argue in favor of detecting and, where possible, eliminating unrealistic model nonlinearities; this increases the realism of the model itself and facilitates the application of frugal methods. Literature examples are used to demonstrate the use of frugal methods and associated diagnostics. We suggest that the strategy proposed in this paper would allow the environmental sciences community to achieve greater transparency and falsifiability of environmental models, and obtain greater scientific insight from ongoing and future modeling efforts.
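
    As one example of a frugal method, local sensitivity analysis by central differences needs only two model runs per parameter. The sketch below applies it to a toy two-parameter model and computes composite scaled sensitivities, one common screening metric; the model, parameter names, and step size are invented for illustration:

      import numpy as np

      def model(p):
          # Toy stand-in for an expensive groundwater model: maps two
          # parameters to three simulated observations.
          k, r = p
          return np.array([k * 2.0 + r, k - 0.5 * r, np.log1p(k) * r])

      # Frugal local sensitivity: central finite differences need only
      # 2 runs per parameter (4 runs here), versus thousands for global methods.
      p0 = np.array([1.0, 2.0])
      rel_step = 0.01
      J = np.empty((3, len(p0)))        # Jacobian of outputs w.r.t. parameters
      for j in range(len(p0)):
          dp = np.zeros_like(p0)
          dp[j] = rel_step * p0[j]
          J[:, j] = (model(p0 + dp) - model(p0 - dp)) / (2 * dp[j])

      # Dimensionless scaled sensitivities, then a composite measure per
      # parameter (larger = the observations are more informative about it).
      scaled = J * p0[None, :]
      css = np.sqrt((scaled ** 2).mean(axis=0))
      print(dict(zip(["k", "r"], css.round(3))))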

  5. Computational simulation for analysis and synthesis of impact resilient structure

    NASA Astrophysics Data System (ADS)

    Djojodihardjo, Harijono

    2013-10-01

    Impact resilient structures are of great interest in many engineering applications, varying from civil, land vehicle, aircraft and space structures, to mention a few examples. To design such structures, one has to resort to fundamental principles and take into account progress in analytical and computational approaches as well as in material science and technology. With such perspectives, this work looks at a generic beam and plate structure subject to impact loading and carries out analysis and numerical simulation. The first objective of the work is to develop a computational algorithm to analyze a flat plate, as a generic structure, subjected to impact loading for numerical simulation and parametric study. The analysis will be based on dynamic response analysis. Consideration is given to the elastic-plastic region. The second objective is to utilize the computational algorithm for direct numerical simulation and, as a parallel scheme, commercial off-the-shelf numerical code is utilized for parametric study, optimization and synthesis. Through such analysis and numerical simulation, effort is devoted to arriving at an optimum configuration in terms of loading, structural dimensions, material properties and composite lay-up, among others. Results will be discussed in view of practical applications.

  6. CAPRI: Using a Geometric Foundation for Computational Analysis and Design

    NASA Technical Reports Server (NTRS)

    Haimes, Robert

    2002-01-01

    CAPRI (Computational Analysis Programming Interface) is a software development tool intended to make computerized design, simulation and analysis faster and more efficient. The computational steps traditionally taken for most engineering analysis (Computational Fluid Dynamics (CFD), structural analysis, etc.) are: Surface Generation, usually by employing a Computer Aided Design (CAD) system; Grid Generation, preparing the volume for the simulation; Flow Solver, producing the results at the specified operational point; Post-processing Visualization, interactively attempting to understand the results. It should be noted that the structures problem is more tractable than CFD; there are fewer mesh topologies used and the grids are not as fine (this problem space does not have the length scaling issues of fluids). For CFD, these steps have worked well in the past for simple steady-state simulations at the expense of much user interaction. The data was transmitted between phases via files. In most cases, the output from a CAD system could be written to IGES files. The outputs from grid generators and solvers do not really have standards, though there are a couple of file formats that can be used for a subset of the data (i.e., the PLOT3D format and the upcoming CGNS). The user would have to patch up the data or translate from one format to another to move to the next step. Sometimes this could take days. Instead of the serial approach to analysis, CAPRI takes a geometry centric approach. CAPRI is a software building tool-kit that refers to two ideas: (1) A simplified, object-oriented, hierarchical view of a solid part integrating both geometry and topology definitions, and (2) programming access to this part or assembly and any attached data. The connection to the geometry is made through an Application Programming Interface (API) and not a file system.

  7. Three-dimensional transonic potential flow about complex 3-dimensional configurations

    NASA Technical Reports Server (NTRS)

    Reyhner, T. A.

    1984-01-01

    An analysis has been developed and a computer code written to predict three-dimensional subsonic or transonic potential flow fields about lifting or nonlifting configurations. Possible configurations include inlets, nacelles, nacelles with ground planes, S-ducts, turboprop nacelles, wings, and wing-pylon-nacelle combinations. The solution of the full partial differential equation for compressible potential flow, written in terms of a velocity potential, is obtained using finite differences, line relaxation, and multigrid. The analysis uses either a cylindrical or Cartesian coordinate system. The computational mesh is not body fitted. The analysis has been programmed in FORTRAN for both the CDC CYBER 203 and the CRAY-1 computers. Comparisons of computed results with experimental measurements are presented. Descriptions of the program input and output formats are included.

  8. XII Advanced Computing and Analysis Techniques in Physics Research

    NASA Astrophysics Data System (ADS)

    Speer, Thomas; Carminati, Federico; Werlen, Monique

    November 2008 will fall a few months after the official start of the LHC, when the highest collision energies ever produced by mankind will be observed by the most complex piece of scientific equipment ever built. The LHC will open a new era in physics research and push the frontier of knowledge further. This achievement has been made possible by new technological developments in many fields, but computing is certainly the technology that has made the whole enterprise possible. Accelerator and detector design, construction management, data acquisition, detector monitoring, data analysis, event simulation and theoretical interpretation are all computing-based HEP activities, and they occur in many other research fields as well. Computing is everywhere and forms the common link between all of the scientists and engineers involved. The ACAT workshop series, created back in 1990 as AIHENP (Artificial Intelligence in High Energy and Nuclear Physics), has covered the tremendous evolution of computing in its most advanced topics, building bridges between computer science and experimental and theoretical physics. Conference web-site: http://acat2008.cern.ch/ Programme and presentations: http://indico.cern.ch/conferenceDisplay.py?confId=34666

  9. Assessing computer waste generation in Chile using material flow analysis.

    PubMed

    Steubing, Bernhard; Böni, Heinz; Schluep, Mathias; Silva, Uca; Ludwig, Christian

    2010-03-01

    The quantities of e-waste are expected to increase sharply in Chile. The purpose of this paper is to provide a quantitative data basis on generated e-waste quantities. A material flow analysis was carried out assessing the generation of e-waste from computer equipment (desktop and laptop PCs as well as CRT and LCD-monitors). Import and sales data were collected from the Chilean Customs database as well as from publications by the International Data Corporation. A survey was conducted to determine consumers' choices with respect to storage, re-use and disposal of computer equipment. The generation of e-waste was assessed in a baseline as well as upper and lower scenarios until 2020. The results for the baseline scenario show that about 10,000 and 20,000 tons of computer waste may be generated in the years 2010 and 2020, respectively. The cumulative e-waste generation will be four to five times higher in the upcoming decade (2010-2019) than during the current decade (2000-2009). By 2020, the shares of LCD-monitors and laptops will increase more rapidly replacing other e-waste including the CRT-monitors. The model also shows the principal flows of computer equipment from production and sale to recycling and disposal. The re-use of computer equipment plays an important role in Chile. An appropriate recycling scheme will have to be introduced to provide adequate solutions for the growing rate of e-waste generation.
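
    The core of a delay-based material flow model of this kind is a convolution of historical sales with a lifetime distribution. The sketch below uses invented sales figures and an assumed discrete lifetime distribution, not the paper's Chilean data:

      import numpy as np

      # Hypothetical annual PC sales (thousand units) and a discrete lifetime
      # distribution: the share of units discarded 0..5 years after sale.
      sales = np.array([200, 250, 300, 360, 420, 480, 540, 600])   # 2003-2010
      lifetime = np.array([0.00, 0.05, 0.15, 0.30, 0.30, 0.20])    # sums to 1

      # Waste generated each year is the convolution of past sales with the
      # lifetime distribution -- the delay model at the heart of an MFA.
      waste = np.convolve(sales, lifetime)[:len(sales)]

      for year, (s, w) in enumerate(zip(sales, waste), start=2003):
          print(f"{year}: sold {s:4d}k units, discarded {w:6.1f}k units")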

  10. Ubiquitous computing in sports: A review and analysis.

    PubMed

    Baca, Arnold; Dabnichki, Peter; Heller, Mario; Kornfeind, Philipp

    2009-10-01

    Ubiquitous (pervasive) computing is a term for the synergetic use of sensing, communication and computing. Pervasive use of computing has seen a rapid increase in the current decade. This development has propagated into applied sport science and everyday life. This work presents a survey of recent developments in sport and leisure with emphasis on technology and computational techniques. A detailed analysis of new technological developments is performed. Sensors for position and motion detection, as well as sensors for equipment and physiological monitoring, are discussed. Aspects of novel trends in communication technologies and data processing are outlined. Computational advancements have started a new trend - the development of smart and intelligent systems for a wide range of applications - from model-based posture recognition to context awareness algorithms for nutrition monitoring. Examples particular to coaching and training are discussed. Selected tools for monitoring rule compliance and automatic decision-making are outlined. Finally, applications in leisure and entertainment are presented, from systems supporting physical activity to systems providing motivation. It is concluded that the emphasis will in future shift from technologies to intelligent systems that allow for enhanced social interaction, as efforts need to be made to improve user-friendliness and standardisation of measurement and transmission protocols.

  11. A grid computing infrastructure for MEG data analysis.

    PubMed

    Nakagawa, S; Kosaka, T; Date, S; Shimojo, S; Tonoike, M

    2004-11-30

    Magnetoencephalography (MEG) is widely used for studying brain functions, but clinical applications of MEG have been less prevalent. One reason is that only clinicians who have highly specialized knowledge can use MEG diagnostically, and such clinicians are found at only a few major hospitals. Another reason is that MEG data analysis is getting more and more complicated, deals with a large amount of data, and thus requires high-performance computing. These problems can be solved by the collaboration of human and computing resources distributed across multiple facilities. A new computing infrastructure for brain scientists and clinicians in distant locations was therefore developed using Grid technology, which provides virtual computing environments composed of geographically distributed computers and experimental devices. A prototype system connecting an MEG system at the AIST in Japan, a Grid environment composed of PC clusters at Osaka University in Japan and Nanyang Technological University in Singapore, and user terminals in Baltimore was developed. MEG data measured at the AIST were transferred in real time through a 1-GB/s network to the PC clusters for processing by a wavelet cross-correlation method, and then monitored in Baltimore. The current system is the basic model for remote access to MEG equipment and high-speed processing of MEG data.

  12. A Research Roadmap for Computation-Based Human Reliability Analysis

    SciTech Connect

    Boring, Ronald; Mandelli, Diego; Joe, Jeffrey; Smith, Curtis; Groth, Katrina

    2015-08-01

    The United States (U.S.) Department of Energy (DOE) is sponsoring research through the Light Water Reactor Sustainability (LWRS) program to extend the life of the currently operating fleet of commercial nuclear power plants. The Risk Informed Safety Margin Characterization (RISMC) research pathway within LWRS looks at ways to maintain and improve the safety margins of these plants. The RISMC pathway includes significant developments in the area of thermalhydraulics code modeling and the development of tools to facilitate dynamic probabilistic risk assessment (PRA). PRA is primarily concerned with the risk of hardware systems at the plant; yet, hardware reliability is often secondary in overall risk significance to human errors that can trigger or compound undesirable events at the plant. This report highlights ongoing efforts to develop a computation-based approach to human reliability analysis (HRA). This computation-based approach differs from existing static and dynamic HRA approaches in that it: (i) interfaces with a dynamic computation engine that includes a full scope plant model, and (ii) interfaces with a PRA software toolset. The computation-based HRA approach presented in this report is called the Human Unimodels for Nuclear Technology to Enhance Reliability (HUNTER) and incorporates in a hybrid fashion elements of existing HRA methods to interface with new computational tools developed under the RISMC pathway. The goal of this research effort is to model human performance more accurately than existing approaches, thereby minimizing modeling uncertainty found in current plant risk models.

  13. Visual Analysis of Cloud Computing Performance Using Behavioral Lines.

    PubMed

    Muelder, Chris; Zhu, Biao; Chen, Wei; Zhang, Hongxin; Ma, Kwan-Liu

    2016-02-29

    Cloud computing is an essential technology for Big Data analytics and services. A cloud computing system is often comprised of a large number of parallel computing and storage devices. Monitoring the usage and performance of such a system is important for efficient operations, maintenance, and security. Tracing every application on a large cloud system is untenable due to scale and privacy issues, but profile data can be collected relatively efficiently by regularly sampling the state of the system, including properties such as CPU load, memory usage, network usage, and others, creating a set of multivariate time series for each system. Adequate tools for studying such large-scale, multidimensional data are lacking. In this paper, we present a visual analysis approach to understanding and analyzing the performance and behavior of cloud computing systems. Our design is based on similarity measures and a layout method to portray the behavior of each compute node over time. When visualizing a large number of behavioral lines together, distinct patterns often appear, suggesting particular types of performance bottleneck. The resulting system provides multiple linked views, which allow the user to interactively explore the data by examining the data or a selected subset at different levels of detail. Our case studies, which use datasets collected from two different cloud systems, show that this visual analysis approach is effective in identifying trends and anomalies of the systems.

  14. Computational analysis of high resolution unsteady airloads for rotor aeroacoustics

    NASA Technical Reports Server (NTRS)

    Quackenbush, Todd R.; Lam, C.-M. Gordon; Wachspress, Daniel A.; Bliss, Donald B.

    1994-01-01

    The study of helicopter aerodynamic loading for acoustics applications requires the application of efficient yet accurate simulations of the velocity field induced by the rotor's vortex wake. This report summarizes work to date on the development of such an analysis, which builds on the Constant Vorticity Contour (CVC) free wake model, previously implemented for the study of vibratory loading in the RotorCRAFT computer code. The present effort has focused on implementation of an airload reconstruction approach that computes high resolution airload solutions of rotor/rotor-wake interactions required for acoustics computations. Supplementary efforts on the development of improved vortex core modeling, unsteady aerodynamic effects, higher spatial resolution of rotor loading, and fast vortex wake implementations have substantially enhanced the capabilities of the resulting software, denoted RotorCRAFT/AA (AeroAcoustics). Results of validation calculations using recently acquired model rotor data show that by employing airload reconstruction it is possible to apply the CVC wake analysis with temporal and spatial resolution suitable for acoustics applications while reducing the computation time required by one to two orders of magnitude relative to that required by direct calculations. Promising correlation with this body of airload and noise data has been obtained for a variety of rotor configurations and operating conditions.

  15. Computational methods for efficient structural reliability and reliability sensitivity analysis

    NASA Technical Reports Server (NTRS)

    Wu, Y.-T.

    1993-01-01

    This paper presents recent developments in efficient structural reliability analysis methods. The paper proposes an efficient, adaptive importance sampling (AIS) method that can be used to compute reliability and reliability sensitivities. The AIS approach uses a sampling density that is proportional to the joint PDF of the random variables. Starting from an initial approximate failure domain, sampling proceeds adaptively and incrementally with the goal of reaching a sampling domain that is slightly greater than the failure domain to minimize over-sampling in the safe region. Several reliability sensitivity coefficients are proposed that can be computed directly and easily from the above AIS-based failure points. These probability sensitivities can be used for identifying key random variables and for adjusting design to achieve reliability-based objectives. The proposed AIS methodology is demonstrated using a turbine blade reliability analysis problem.
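
    The reweighting step at the heart of importance sampling can be shown in a few lines. The sketch below is a simplified, non-adaptive version for a toy limit state with a known failure probability of about 1.1e-5; the fixed shift vector plays the role of the approximate failure domain that the AIS method would refine adaptively:

      import numpy as np

      # Limit state g(x) <= 0 defines failure; x ~ standard normal (2 vars).
      # This toy case has a known answer: P[x1 + x2 > 6] = Phi(-6/sqrt(2))
      # ~= 1.1e-5, far too rare for plain Monte Carlo with modest samples.
      def g(x):
          return 6.0 - x[:, 0] - x[:, 1]

      rng = np.random.default_rng(0)
      n = 20_000
      shift = np.array([3.0, 3.0])   # sampling density centered near failure

      # Draw from the shifted density, then reweight each sample by the
      # likelihood ratio of the true PDF to the sampling PDF.
      x = rng.standard_normal((n, 2)) + shift
      log_w = -(x @ shift) + 0.5 * shift @ shift   # phi(x)/phi(x - shift), in logs
      indicator = g(x) <= 0.0
      pf = np.mean(indicator * np.exp(log_w))
      print(f"estimated failure probability: {pf:.2e}")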

  16. Computational Aerodynamic Analysis of Offshore Upwind and Downwind Turbines

    DOE PAGES

    Zhao, Qiuying; Sheng, Chunhua; Afjeh, Abdollah

    2014-01-01

    Aerodynamic interactions of the model NREL 5 MW offshore horizontal axis wind turbines (HAWT) are investigated using a high-fidelity computational fluid dynamics (CFD) analysis. Four wind turbine configurations are considered: three-bladed upwind and downwind and two-bladed upwind and downwind configurations, which operate at two different rotor speeds of 12.1 and 16 RPM. In the present study, both steady and unsteady aerodynamic loads, such as the rotor torque, blade hub bending moment, and tower base bending moment, are evaluated in detail to provide an overall assessment of the different wind turbine configurations. Aerodynamic interactions between the rotor and tower are analyzed, including the rotor wake development downstream. The computational analysis provides insight into the aerodynamic performance of the upwind and downwind, two- and three-bladed horizontal axis wind turbines.

  17. Computational singular perturbation analysis of stochastic chemical systems with stiffness

    NASA Astrophysics Data System (ADS)

    Wang, Lijin; Han, Xiaoying; Cao, Yanzhao; Najm, Habib N.

    2017-04-01

    Computational singular perturbation (CSP) is a useful method for the analysis, reduction, and time integration of stiff ordinary differential equation systems. It has found dominant utility, in particular, in chemical reaction systems with a large range of time scales at the continuum and deterministic level. On the other hand, CSP is not directly applicable to chemical reaction systems at the micro or meso-scale, where stochasticity plays a non-negligible role and thus has to be taken into account. In this work we develop a novel stochastic computational singular perturbation (SCSP) analysis and time integration framework, and an associated algorithm, that can be used not only to construct accurately and efficiently the numerical solutions to stiff stochastic chemical reaction systems, but also to analyze the dynamics of the reduced stochastic reaction systems. The algorithm is illustrated by an application to a benchmark stochastic differential equation model, and numerical experiments are carried out to demonstrate the effectiveness of the construction.

  18. Advanced computational tools for 3-D seismic analysis

    SciTech Connect

    Barhen, J.; Glover, C.W.; Protopopescu, V.A.

    1996-06-01

    The global objective of this effort is to develop advanced computational tools for 3-D seismic analysis, and test the products using a model dataset developed under the joint aegis of the United States' Society of Exploration Geophysicists (SEG) and the European Association of Exploration Geophysicists (EAEG). The goal is to enhance the value to the oil industry of the SEG/EAEG modeling project, carried out with US Department of Energy (DOE) funding in FY 93-95. The primary objective of the ORNL Center for Engineering Systems Advanced Research (CESAR) is to spearhead the computational innovations that would enable a revolutionary advance in 3-D seismic analysis. The CESAR effort is carried out in collaboration with world-class domain experts from leading universities, and in close coordination with other national laboratories and oil industry partners.

  19. Triploidy in rainbow trout determined by computer-assisted analysis.

    PubMed

    Espinosa, Emilio; Josa, Agustín; Gil, Lidia; Martí, José Ignacio

    2005-11-01

    This study was designed to assess the use of a computer-assisted system based on erythrocyte measurements as a possible alternative to flow cytometry for identifying triploid rainbow trout (Oncorhynchus mykiss). Blood smears were prepared from 26 triploid and 26 diploid specimens, as determined by flow cytometry after staining blood cells with propidium iodide. The cell and nucleus lengths of 10 erythrocytes were determined in each fish. This was followed by discriminatory analysis to distinguish between diploids and triploids based on their score profiles. Triploid trout showed significantly larger erythrocyte cell and nucleus measurements than their diploid counterparts (N=52; P<0.0001). Erythrocyte length correctly identified 100% of the fish specimens as diploid or triploid, while nucleus length was a less accurate predictor of the level of ploidy. Our findings validate the potential use of computer-assisted analysis for this purpose.
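
    Discriminant analysis on two erythrocyte measurements is straightforward to reproduce in outline. The sketch below trains on synthetic cell and nucleus lengths whose means loosely mimic the reported pattern (triploid cells larger); the numbers are illustrative, not the study's data:

      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

      # Synthetic erythrocyte measurements (micrometres); values are invented.
      rng = np.random.default_rng(0)
      diploid  = rng.normal([14.0, 6.0], 0.6, size=(26, 2))   # cell, nucleus length
      triploid = rng.normal([16.5, 7.2], 0.6, size=(26, 2))

      X = np.vstack([diploid, triploid])
      y = np.array([0] * 26 + [1] * 26)        # 0 = diploid, 1 = triploid

      # Discriminant analysis separating ploidy groups from the score profiles.
      lda = LinearDiscriminantAnalysis().fit(X, y)
      print(f"training accuracy: {lda.score(X, y):.2%}")
      print("predict [15.2, 6.4] um ->",
            "triploid" if lda.predict([[15.2, 6.4]])[0] else "diploid")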

  20. Micro Computer Tomography for medical device and pharmaceutical packaging analysis.

    PubMed

    Hindelang, Florine; Zurbach, Raphael; Roggo, Yves

    2015-04-10

    Biomedical device and medicinal product manufacturing are long processes facing global competition. As technology evolves with time, the level of quality, safety and reliability increases simultaneously. Micro Computer Tomography (Micro CT) is a tool allowing deep investigation of products: it can contribute to quality improvement. This article presents the numerous applications of Micro CT for medical device and pharmaceutical packaging analysis. The samples investigated confirmed CT suitability for verification of integrity, measurements, and defect detection in a non-destructive manner.

  1. Mathematical Analysis and Computer Program Development for Electromagnetic Science Studies.

    DTIC Science & Technology

    1981-11-01

    interest that needed to be computed was the far-field power pattern of a planar array of electrodes. The derivation of it is carried out below. Fig. 3 shows the relevant geometry for a double-electrode array. As several investigators have shown, radiation into an isotropic medium is often a reasonable physical approximation to make, and it certainly makes the analysis simpler.
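
    The far-field power pattern of a uniform planar array is conventionally computed from its array factor, AF(theta, phi) = sum over elements of exp(j*k*(m*dx*u + n*dy*v)) with u = sin(theta)cos(phi) and v = sin(theta)sin(phi). The sketch below evaluates a phi = 0 cut for an assumed 8 x 8 array at half-wavelength spacing; the geometry values are illustrative assumptions, not values from the report:

      import numpy as np

      wavelength = 1.0
      k = 2 * np.pi / wavelength
      dx = 0.5 * wavelength                 # half-wavelength element spacing
      M = N = 8                             # 8 x 8 elements (assumed)

      theta = np.linspace(-np.pi / 2, np.pi / 2, 181)
      u = np.sin(theta)                     # cut at phi = 0, so v = 0

      # The double sum is separable; at phi = 0 the n-sum contributes a factor N.
      m = np.arange(M)
      af = np.exp(1j * k * dx * np.outer(u, m)).sum(axis=1) * N
      power_db = 20 * np.log10(np.maximum(np.abs(af), 1e-6) / np.abs(af).max())

      for t, p in zip(theta[::30], power_db[::30]):
          print(f"theta={np.degrees(t):6.1f} deg  power={p:8.2f} dB")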

  2. Vector Field Visual Data Analysis Technologies for Petascale Computational Science

    SciTech Connect

    Garth, Christoph; Deines, Eduard; Joy, Kenneth I.; Bethel, E. Wes; Childs, Hank; Weber, Gunther; Ahern, Sean; Pugmire, Dave; Sanderson, Allen; Johnson, Chris

    2009-11-13

    State-of-the-art computational science simulations generate large-scale vector field data sets. Visualization and analysis is a key aspect of obtaining insight into these data sets and represents an important challenge. This article discusses possibilities and challenges of modern vector field visualization and focuses on methods and techniques developed in the SciDAC Visualization and Analytics Center for Enabling Technologies (VACET) and deployed in the open-source visualization tool, VisIt.
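
    A staple of vector field visualization is streamline tracing by numerical integration. The sketch below traces a streamline through an analytic vortex field with classic fourth-order Runge-Kutta; production tools such as VisIt interpolate velocities from mesh data and add step-size control, which this toy omits:

      import numpy as np

      def velocity(p):
          # Analytic test field (a simple vortex); in practice this would be
          # interpolated from simulation output.
          x, y = p
          return np.array([-y, x])

      def streamline(p0, dt=0.05, steps=100):
          # Classic 4th-order Runge-Kutta integration of dp/dt = v(p).
          pts = [np.asarray(p0, dtype=float)]
          for _ in range(steps):
              p = pts[-1]
              k1 = velocity(p)
              k2 = velocity(p + 0.5 * dt * k1)
              k3 = velocity(p + 0.5 * dt * k2)
              k4 = velocity(p + dt * k3)
              pts.append(p + dt / 6.0 * (k1 + 2 * k2 + 2 * k3 + k4))
          return np.array(pts)

      path = streamline([1.0, 0.0])
      print(path[::20].round(3))   # points should stay near the unit circle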

  3. Computed Tomography Inspection and Analysis for Additive Manufacturing Components

    NASA Technical Reports Server (NTRS)

    Beshears, Ronald D.

    2016-01-01

    Computed tomography (CT) inspection was performed on test articles additively manufactured from metallic materials. Metallic AM and machined wrought alloy test articles with programmed flaws were inspected using a 2MeV linear accelerator based CT system. Performance of CT inspection on identically configured wrought and AM components and programmed flaws was assessed using standard image analysis techniques to determine the impact of additive manufacturing on inspectability of objects with complex geometries.

  4. Computational Methods for Failure Analysis and Life Prediction

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K. (Compiler); Harris, Charles E. (Compiler); Housner, Jerrold M. (Compiler); Hopkins, Dale A. (Compiler)

    1993-01-01

    This conference publication contains the presentations and discussions from the joint UVA/NASA Workshop on Computational Methods for Failure Analysis and Life Prediction held at NASA Langley Research Center 14-15 Oct. 1992. The presentations focused on damage failure and life predictions of polymer-matrix composite structures. They covered some of the research activities at NASA Langley, NASA Lewis, Southwest Research Institute, industry, and universities. Both airframes and propulsion systems were considered.

  5. Theoretical Innovations in Combustion Stability Research: Integrated Analysis and Computation

    DTIC Science & Technology

    2011-04-14

    A presentation [2] has been made at a national conference on this subject. b.2 - Thermomechanics of reactive gases: transient, spatially... KISS and JPL personnel... Report documentation fields: contract number FA9550-10-C-0088; author David Kassoy; subject terms: combustion, thermomechanics, turbulent reacting flow, supercritical gases, rocket engine stability.

  6. Analysis of guidance law performance using personal computers

    NASA Technical Reports Server (NTRS)

    Barrios, J. Rene

    1990-01-01

    A point mass, three-degree of freedom model is presented as a basic development tool for PC based simulation models. The model has been used in the development of guidance algorithms as well as in other applications such as performance management systems to compute optimal speeds. Its limitations and advantages are discussed with regard to the windshear environment. A method for simulating a simple autopilot is explained in detail and applied in the analysis of different guidance laws.
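
    A point-mass, three-degree-of-freedom model of the kind described reduces to six state equations. The sketch below integrates the standard flat-earth point-mass equations with Euler steps and flies a toy bank-angle command; the mass, time step, and maneuver are invented, and winds are omitted (a windshear study would add wind components to the kinematics):

      import numpy as np

      g = 9.81

      def step(state, thrust_minus_drag, bank, load_factor, m=5000.0, dt=0.1):
          # Standard point-mass equations: speed, flight-path angle, heading.
          x, y, h, V, gamma, psi = state
          Vdot     = thrust_minus_drag / m - g * np.sin(gamma)
          gammadot = (g / V) * (load_factor * np.cos(bank) - np.cos(gamma))
          psidot   = g * load_factor * np.sin(bank) / (V * np.cos(gamma))
          x += V * np.cos(gamma) * np.cos(psi) * dt
          y += V * np.cos(gamma) * np.sin(psi) * dt
          h += V * np.sin(gamma) * dt
          return np.array([x, y, h, V + Vdot * dt,
                           gamma + gammadot * dt, psi + psidot * dt])

      # Level flight, then a gentle 15-degree-bank turn (a toy guidance command).
      state = np.array([0.0, 0.0, 1000.0, 80.0, 0.0, 0.0])
      for t in np.arange(0, 30, 0.1):
          bank = np.radians(15.0) if t > 5 else 0.0
          n = 1.0 / np.cos(bank)                  # hold altitude during the turn
          state = step(state, thrust_minus_drag=0.0, bank=bank, load_factor=n)
      print(f"after 30 s: x={state[0]:.0f} m, y={state[1]:.0f} m, "
            f"heading={np.degrees(state[5]):.1f} deg")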

  7. Use of 3-Dimensional Volumetric Modeling of Adrenal Gland Size in Patients with Primary Pigmented Nodular Adrenocortical Disease.

    PubMed

    Chrysostomou, P P; Lodish, M B; Turkbey, E B; Papadakis, G Z; Stratakis, C A

    2016-04-01

    Primary pigmented nodular adrenocortical disease (PPNAD) is a rare type of bilateral adrenal hyperplasia leading to hypercortisolemia. Adrenal nodularity is often appreciable with computed tomography (CT); however, accurate radiologic characterization of adrenal size in PPNAD has not been studied well. We used 3-dimensional (3D) volumetric analysis to characterize and compare adrenal size in PPNAD patients with and without Cushing's syndrome (CS). Patients diagnosed with PPNAD and their family members with known mutations in PRKAR1A were screened. CT scans were used to create 3D models of each adrenal. Criteria for biochemical diagnosis of CS included loss of diurnal variation and/or elevated midnight cortisol levels, and paradoxical increase in urinary free cortisol and/or urinary 17-hydroxysteroids after dexamethasone administration. Forty-five patients with PPNAD (24 females, 27.8±17.6 years) and 8 controls (19±3 years) were evaluated. 3D volumetric modeling of adrenal glands was performed in all. Thirty-eight of the 45 patients (84.4%) had CS. Their mean adrenal volume was 8.1 cc±4.1, versus 7.2 cc±4.5 for non-CS (p=0.643) and 8.0 cc±1.6 for controls. Mean values were corrected for body surface area: 4.7 cc/kg/m2±2.2 for CS and 3.9 cc/kg/m2±1.3 for non-CS (p=0.189). Adrenal volume and midnight cortisol in both groups were positively correlated, r=0.35, p=0.03. We conclude that adrenal volume measured by 3D CT in patients with PPNAD and CS was similar to that in those without CS, confirming empirical CT imaging-based observations. However, the association between adrenal volume and midnight cortisol levels may be used as a marker of who among patients with PPNAD may develop CS, something that routine CT cannot do.

  8. Spacelab data analysis using the space plasma computer analysis network (SCAN) system

    NASA Technical Reports Server (NTRS)

    Green, J. L.

    1984-01-01

    The Space-plasma Computer Analysis Network (SCAN) currently connects a large number of U.S. Spacelab investigators into a common computer network. Used primarily by plasma physics researchers at present, SCAN provides access to Spacelab investigators in other areas of space science, to Spacelab and non-Spacelab correlative data bases, and to large Class VI computational facilities for modeling. SCAN links computers together at remote institutions used by space researchers, utilizing commercially available software for computer-to-computer communications. Started by the NASA's Office of Space Science in mid 1980, SCAN presently contains ten system nodes located at major universities and space research laboratories, with fourteen new nodes projected for the near future. The Stanford University computer gateways allow SCAN users to connect onto the ARPANET and TELENET overseas networks.

  9. Advances in Computational Stability Analysis of Composite Aerospace Structures

    SciTech Connect

    Degenhardt, R.; Araujo, F. C. de

    2010-09-30

    The European aircraft industry demands reduced development and operating costs. Structural weight reduction by exploitation of structural reserves in composite aerospace structures contributes to this aim; however, it requires accurate and experimentally validated stability analysis of real structures under realistic loading conditions. This paper presents different advances in the area of computational stability analysis of composite aerospace structures which contribute to that field. For stringer-stiffened panels, the main results of the finished EU project COCOMAT are given; it investigated the exploitation of reserves in primary fibre composite fuselage structures through an accurate and reliable simulation of postbuckling and collapse. For unstiffened cylindrical composite shells, a proposal for a new design method is presented.

  10. Shielding analysis methods available in the SCALE computational system

    SciTech Connect

    Parks, C.V.; Tang, J.S.; Hermann, O.W.; Bucholz, J.A.; Emmett, M.B.

    1986-01-01

    Computational tools have been included in the SCALE system to allow shielding analysis to be performed using both discrete-ordinates and Monte Carlo techniques. One-dimensional discrete ordinates analyses are performed with the XSDRNPM-S module, and point dose rates outside the shield are calculated with the XSDOSE module. Multidimensional analyses are performed with the MORSE-SGC/S Monte Carlo module. This paper will review the above modules and the four Shielding Analysis Sequences (SAS) developed for the SCALE system. 7 refs., 8 figs.

  11. Structural mode significance using INCA. [Interactive Controls Analysis computer program

    NASA Technical Reports Server (NTRS)

    Bauer, Frank H.; Downing, John P.; Thorpe, Christopher J.

    1990-01-01

    Structural finite element models are often too large to be used in the design and analysis of control systems. Model reduction techniques must be applied to reduce the structural model to manageable size. In the past, engineers either performed the model order reduction by hand or used distinct computer programs to retrieve the data, to perform the significance analysis and to reduce the order of the model. To expedite this process, the latest version of INCA has been expanded to include an interactive graphical structural mode significance and model order reduction capability.

  12. Computer Based Economic Analysis Techniques to Support Functional Economic Analysis

    DTIC Science & Technology

    1993-09-01

    is one of the most frequently used tools to uncover and explore profit potential. B. CALCULATION OF BREAK-EVEN ANALYSIS. Haga and Lang (1992) state... BENEFITS. For benefits that are quantifiable, Haga and Lang (1992) express the benefit-cost ratio (BCR) in the following notation: BCR = QOM / UAC (Equation 9-1), where QOM is a... emulation. In addition to the software requirements, FEAM has the following hardware criteria: a mouse, 2 MB of RAM, 20 MB of hard disk space, and an EGA adapter.

  13. Computer Use, Confidence, Attitudes, and Knowledge: A Causal Analysis.

    ERIC Educational Resources Information Center

    Levine, Tamar; Donitsa-Schmidt, Smadar

    1998-01-01

    Introduces a causal model which links measures of computer experience, computer-related attitudes, computer-related confidence, and perceived computer-based knowledge. The causal model suggests that computer use has a positive effect on perceived computer self-confidence, as well as on computer-related attitudes. Questionnaires were administered…

  14. Novel Radiobiological Gamma Index for Evaluation of 3-Dimensional Predicted Dose Distribution

    SciTech Connect

    Sumida, Iori; Yamaguchi, Hajime; Kizaki, Hisao; Aboshi, Keiko; Tsujii, Mari; Yoshikawa, Nobuhiko; Yamada, Yuji; Suzuki, Osamu; Seo, Yuji; Isohashi, Fumiaki; Yoshioka, Yasuo; Ogawa, Kazuhiko

    2015-07-15

    Purpose: To propose a gamma index-based dose evaluation index that integrates the radiobiological parameters of tumor control (TCP) and normal tissue complication probabilities (NTCP). Methods and Materials: Fifteen prostate and head and neck (H&N) cancer patients received intensity modulated radiation therapy. Before treatment, patient-specific quality assurance was conducted via beam-by-beam analysis, and beam-specific dose error distributions were generated. The predicted 3-dimensional (3D) dose distribution was calculated by back-projection of relative dose error distribution per beam. A 3D gamma analysis of different organs (prostate: clinical [CTV] and planned target volumes [PTV], rectum, bladder, femoral heads; H&N: gross tumor volume [GTV], CTV, spinal cord, brain stem, both parotids) was performed using predicted and planned dose distributions under 2%/2 mm tolerance and physical gamma passing rate was calculated. TCP and NTCP values were calculated for voxels with physical gamma indices (PGI) >1. We propose a new radiobiological gamma index (RGI) to quantify the radiobiological effects of TCP and NTCP and calculate radiobiological gamma passing rates. Results: The mean RGI gamma passing rates for prostate cases were significantly different compared with those of PGI (P<.03–.001). The mean RGI gamma passing rates for H&N cases (except for GTV) were significantly different compared with those of PGI (P<.001). Differences in gamma passing rates between PGI and RGI were due to dose differences between the planned and predicted dose distributions. Radiobiological gamma distribution was visualized to identify areas where the dose was radiobiologically important. Conclusions: RGI was proposed to integrate radiobiological effects into PGI. This index would assist physicians and medical physicists not only in physical evaluations of treatment delivery accuracy, but also in clinical evaluations of predicted dose distribution.
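
    The physical gamma index underlying this work combines a dose-difference term and a distance-to-agreement term and takes the minimum over comparison points. The sketch below is a bare 1-D, globally normalized version under the 2%/2 mm criterion, run on invented dose profiles; clinical gamma analysis works in 3-D with interpolation, and the radiobiological weighting proposed in the paper is not reproduced here:

      import numpy as np

      def gamma_1d(ref_dose, eval_dose, x, dose_tol=0.02, dist_tol=2.0):
          # For each reference point, search all evaluated points for the
          # minimum combined dose-difference / distance-to-agreement metric.
          d_ref = ref_dose[:, None]
          d_eval = eval_dose[None, :]
          dd = (d_eval - d_ref) / (dose_tol * ref_dose.max())   # dose term
          dx = (x[None, :] - x[:, None]) / dist_tol             # distance term
          return np.sqrt(dd ** 2 + dx ** 2).min(axis=1)

      # Toy profiles on a 1 mm grid: the predicted dose is the plan, slightly
      # shifted and rescaled (all values illustrative).
      x = np.arange(0, 50.0, 1.0)                       # positions in mm
      planned = np.exp(-((x - 25) / 10.0) ** 2) * 2.0   # Gy
      predicted = np.exp(-((x - 25.7) / 10.0) ** 2) * 2.04

      gamma = gamma_1d(planned, predicted, x)
      print(f"gamma passing rate (gamma <= 1): {(gamma <= 1).mean():.1%}")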

  15. Computational Tools for the Secondary Analysis of Metabolomics Experiments

    PubMed Central

    Booth, Sean C.; Weljie, Aalim M.; Turner, Raymond J.

    2013-01-01

    Metabolomics experiments have become commonplace in a wide variety of disciplines. By identifying and quantifying metabolites researchers can achieve a systems level understanding of metabolism. These studies produce vast swaths of data which are often only lightly interpreted due to the overwhelmingly large number of variables that are measured. Recently, a number of computational tools have been developed which enable much deeper analysis of metabolomics data. These data have been difficult to interpret as understanding the connections between dozens of altered metabolites has often relied on the biochemical knowledge of researchers and their speculations. Modern biochemical databases provide information about the interconnectivity of metabolism which can be automatically polled using metabolomics secondary analysis tools. Starting with lists of altered metabolites, there are two main types of analysis: enrichment analysis computes which metabolic pathways have been significantly altered, whereas metabolite mapping contextualizes the abundances and significances of measured metabolites into network visualizations. Many different tools have been developed for one or both of these applications. In this review the functionality and use of this software are discussed. Together these novel secondary analysis tools will enable metabolomics researchers to plumb the depths of their data and produce further-reaching biological conclusions than ever before. PMID:24688685
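
    As a concrete illustration of the enrichment-analysis step described above, the sketch below applies the hypergeometric test that most such tools use to score a pathway. The metabolite identifiers and pathway contents are hypothetical; real tools poll curated databases instead.

      from scipy.stats import hypergeom

      def enrich_p(altered, pathway, background):
          # P(X >= k): chance of seeing at least k altered metabolites in
          # the pathway under random sampling from the background set.
          k = len(altered & pathway)
          return hypergeom.sf(k - 1, len(background), len(pathway), len(altered))

      background = {f"m{i}" for i in range(200)}       # all measured metabolites
      pathway = {"m1", "m2", "m3", "m4", "m5"}         # hypothetical pathway
      altered = {"m1", "m2", "m3", "m9", "m17"}        # significantly altered
      print(enrich_p(altered, pathway, background))    # small p => enriched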

  16. Analysis of computational modeling techniques for complete rotorcraft configurations

    NASA Astrophysics Data System (ADS)

    O'Brien, David M., Jr.

    Computational fluid dynamics (CFD) provides the helicopter designer with a powerful tool for identifying problematic aerodynamics. Through the use of CFD, design concepts can be analyzed in a virtual wind tunnel long before a physical model is ever created. Traditional CFD analysis tends to be a time-consuming process, where much of the effort is spent generating a high quality computational grid. Recent increases in computing power and memory have created renewed interest in alternative grid schemes such as unstructured grids, which facilitate rapid grid generation by relaxing restrictions on grid structure. Three rotor models have been incorporated into a popular fixed-wing unstructured CFD solver to increase its capability and facilitate availability to the rotorcraft community. The benefit of unstructured grid methods is demonstrated through rapid generation of high fidelity configuration models. The simplest rotor model is the steady state actuator disk approximation. By transforming the unsteady rotor problem into a steady state one, the actuator disk can provide rapid predictions of performance parameters such as lift and drag. The actuator blade and overset blade models provide a depiction of the unsteady rotor wake, but incur a larger computational cost than the actuator disk. The actuator blade model is convenient when the unsteady aerodynamic behavior needs to be investigated but the computational cost of the overset approach is too large. The overset or chimera method allows the blade loads to be computed from first principles and therefore provides the most accurate prediction of the rotor wake for the models investigated. The physics of the flow fields generated by these models for rotor/fuselage interactions are explored, along with the efficiencies and limitations of each method.

  17. Computer-assisted qualitative data analysis software: a review.

    PubMed

    Banner, Davina J; Albarrran, John W

    2009-01-01

    Over recent decades, qualitative research has become accepted as a uniquely valuable methodological approach for generating knowledge, particularly in relation to promoting understanding of patients' experiences and responses to illness. Within cardiovascular nursing such qualitative approaches have been widely adopted to systematically investigate a number of phenomena. Contemporary qualitative research practice comprises a diverse range of disciplines and approaches. Computer-aided qualitative data analysis software represents an important facet of this increasingly sophisticated movement. Such software offers an efficient means through which to manage and organize data while supporting rigorous data analysis. The increasing use of qualitative data analysis software has stimulated wide discussion. This research column includes a review of some of the advantages and debates related to the use and integration of qualitative data analysis software.

  18. Dosimetric Comparison Between 3-Dimensional Conformal and Robotic SBRT Treatment Plans for Accelerated Partial Breast Radiotherapy.

    PubMed

    Goggin, L M; Descovich, M; McGuinness, C; Shiao, S; Pouliot, J; Park, C

    2016-06-01

    Accelerated partial breast irradiation is an attractive alternative to conventional whole breast radiotherapy for selected patients. Recently, CyberKnife has emerged as a possible alternative to conventional techniques for accelerated partial breast irradiation. In this retrospective study, we present a dosimetric comparison between 3-dimensional conformal radiotherapy plans and CyberKnife plans using circular (Iris) and multi-leaf collimators. Nine patients who had undergone breast-conserving surgery followed by whole breast radiation were included in this retrospective study. The CyberKnife planning target volume (PTV) was defined as the lumpectomy cavity + 10 mm + 2 mm with prescription dose of 30 Gy in 5 fractions. Two sets of 3-dimensional conformal radiotherapy plans were created, one used the same definitions as described for CyberKnife and the second used the RTOG-0413 definition of the PTV: lumpectomy cavity + 15 mm + 10 mm with prescription dose of 38.5 Gy in 10 fractions. Using both PTV definitions allowed us to compare the dose delivery capabilities of each technology and to evaluate the advantage of CyberKnife tracking. For the dosimetric comparison using the same PTV margins, CyberKnife and 3-dimensional plans resulted in similar tumor coverage and dose to critical structures, with the exception of the lung V5%, which was significantly smaller for 3-dimensional conformal radiotherapy, 6.2% when compared to 39.4% for CyberKnife-Iris and 17.9% for CyberKnife-multi-leaf collimator. When the inability of 3-dimensional conformal radiotherapy to track motion is considered, the result increased to 25.6%. Both CyberKnife-Iris and CyberKnife-multi-leaf collimator plans demonstrated significantly lower average ipsilateral breast V50% (25.5% and 24.2%, respectively) than 3-dimensional conformal radiotherapy (56.2%). The CyberKnife plans were more conformal but less homogeneous than the 3-dimensional conformal radiotherapy plans. Approximately 50% shorter

  19. New Mexico district work-effort analysis computer program

    USGS Publications Warehouse

    Hiss, W.L.; Trantolo, A.P.; Sparks, J.L.

    1972-01-01

    The computer program (CAN 2) described in this report is one of several related programs used in the New Mexico District cost-analysis system. The work-effort information used in these programs is accumulated and entered to the nearest hour on forms completed by each employee. Tabulating cards are punched directly from these forms after visual examinations for errors are made. Reports containing detailed work-effort data itemized by employee within each project and account and by account and project for each employee are prepared for both current-month and year-to-date periods by the CAN 2 computer program. An option allowing preparation of reports for a specified 3-month period is provided. The total number of hours worked on each account and project and a grand total of hours worked in the New Mexico District is computed and presented in a summary report for each period. Work effort not chargeable directly to individual projects or accounts is considered as overhead and can be apportioned to the individual accounts and projects on the basis of the ratio of the total hours of work effort for the individual accounts or projects to the total New Mexico District work effort at the option of the user. The hours of work performed by a particular section, such as General Investigations or Surface Water, are prorated and charged to the projects or accounts within the particular section. A number of surveillance or buffer accounts are employed to account for the hours worked on special events or on those parts of large projects or accounts that require a more detailed analysis. Any part of the New Mexico District operation can be separated and analyzed in detail by establishing an appropriate buffer account. With the exception of statements associated with word size, the computer program is written in FORTRAN IV in a relatively low and standard language level to facilitate its use on different digital computers. The program has been run only on a Control Data Corporation

  20. PREWATE: An interactive preprocessing computer code to the Weight Analysis of Turbine Engines (WATE) computer code

    NASA Technical Reports Server (NTRS)

    Fishbach, L. H.

    1983-01-01

    The Weight Analysis of Turbine Engines (WATE) computer code was developed by Boeing under contract to NASA Lewis. It was designed to function as an adjunct to the Navy/NASA Engine Program (NNEP). NNEP calculates the design and off-design thrust and specific fuel consumption (sfc) performance of user-defined engine cycles. The thermodynamic parameters throughout the engine as generated by NNEP are then combined with input parameters defining the component characteristics in WATE to calculate the bare engine weight of the user-defined engine. Preprocessor programs for NNEP were previously developed to simplify the task of creating input datasets. This report describes a similar preprocessor for the WATE code.

  1. Computer analysis of Farnsworth-Munsell 100-hue test.

    PubMed

    Winston, J V; Martin, D A; Heckenlively, J R

    1986-01-31

    Color vision abnormalities indicated by the Farnsworth-Munsell 100-hue Color Vision Tests (FM-100) were analyzed by computer to better characterize and group congenital and acquired color vision disorders and to help establish statistically significant diagnostic criteria. Standard evaluation of the FM-100 is by axis and error score calculations. A method has been established for computer-averaging many tests from patients with the same color abnormalities determined by history, standard FM-100, and Nagel anomaloscope. The computer calculated an average error score and standard deviation for each of the 85 color caps. Every time a new patient was evaluated for color vision abnormality, his score was compared with the averaged tests for common diagnoses by calculating distance scores. The averaged test with the lowest distance score consistently tended to coincide with the diagnosis. An analysis of 130 FM-100 color tests found technician-calculated error scores to be incorrect, although the errors were usually minor, in 40% of the tests. The computer-calculated axes agreed well with the technician's estimates. The distance scores predicted the diagnosis accurately 89% of the time. Many errors were due to the small number of protanopes averaged and the inability to distinguish trichromats from dichromats.
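
    The distance-score comparison described above amounts to nearest-centroid classification over the 85 cap error scores. A minimal sketch follows; the averaged profiles below are random placeholders, not the study's diagnostic averages.

      import numpy as np

      def classify(patient, averaged_profiles):
          # Euclidean distance from the patient's 85-cap error profile to
          # each diagnosis's averaged profile; the smallest distance wins.
          dists = {dx: np.linalg.norm(patient - prof)
                   for dx, prof in averaged_profiles.items()}
          return min(dists, key=dists.get), dists

      profiles = {"protan": np.random.rand(85),        # hypothetical averages
                  "deutan": np.random.rand(85),
                  "normal": np.zeros(85)}
      diagnosis, _ = classify(np.random.rand(85), profiles)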

  2. A Lumped Computational Model for Sodium Sulfur Battery Analysis

    NASA Astrophysics Data System (ADS)

    Wu, Fan

    Due to the cost of materials and time-consuming testing procedures, development of new batteries is a slow and expensive practice. The purpose of this study is to develop a computational model and assess the capabilities of such a model designed to aid in the design process and control of sodium sulfur batteries. To this end, a transient lumped computational model derived from an integral analysis of the transport of species, energy, and charge throughout the battery has been developed. The computation processes are coupled with the use of Faraday's law, and solutions for the species concentrations, electrical potential, and current are produced in a time-marching fashion. Properties required for solving the governing equations are calculated and updated as a function of time based on the composition of each control volume. The proposed model is validated against multi-dimensional simulations and experimental results from the literature, and simulation results using the proposed model are presented and analyzed. The computational model and electrochemical model used to solve the equations for the lumped model are compared with similar ones found in the literature. The results obtained from the current model compare favorably with those from experiments and other models.

  3. Computer image analysis of etched tracks from ionizing radiation

    NASA Technical Reports Server (NTRS)

    Blanford, George E.

    1994-01-01

    I proposed to continue a cooperative research project with Dr. David S. McKay concerning image analysis of tracks. Last summer we showed that we could measure track densities using the Oxford Instruments eXL computer and software that is attached to an ISI scanning electron microscope (SEM) located in building 31 at JSC. To reduce the dependence on JSC equipment, we proposed to transfer the SEM images to UHCL for analysis. Last summer we developed techniques to use digitized scanning electron micrographs and computer image analysis programs to measure track densities in lunar soil grains. Tracks were formed by highly ionizing solar energetic particles and cosmic rays during near surface exposure on the Moon. The track densities are related to the exposure conditions (depth and time). Distributions of the number of grains as a function of their track densities can reveal the modality of soil maturation. As part of a consortium effort to better understand the maturation of lunar soil and its relation to its infrared reflectance properties, we worked on lunar samples 67701,205 and 61221,134. These samples were etched for a shorter time (6 hours) than last summer's sample and this difference has presented problems for establishing the correct analysis conditions. We used computer counting and measurement of area to obtain preliminary track densities and a track density distribution that we could interpret for sample 67701,205. This sample is a submature soil consisting of approximately 85 percent mature soil mixed with approximately 15 percent immature, but not pristine, soil.

  4. Analysis of sponge zones for computational fluid mechanics

    SciTech Connect

    Bodony, Daniel J. . E-mail: bodony@stanford.edu

    2006-03-01

    The use of sponge regions, or sponge zones, which add the forcing term -σ(q - q_ref) to the right-hand side of the governing equations in computational fluid mechanics as an ad hoc boundary treatment is widespread. They are used to absorb and minimize reflections from computational boundaries and as forcing sponges to introduce prescribed disturbances into a calculation. A less common usage is as a means of extending a calculation from a smaller domain into a larger one, such as in computing the far-field sound generated in a localized region. By analogy to the penalty method of finite elements, the method is placed on a solid foundation, complete with estimates of convergence. The analysis generalizes the work of Israeli and Orszag [M. Israeli, S.A. Orszag, Approximation of radiation boundary conditions, J. Comp. Phys. 41 (1981) 115-135] and confirms their findings when applied as a special case to one-dimensional wave propagation in an absorbing sponge. It is found that the rate of convergence of the actual solution to the target solution, with an appropriate norm, is inversely proportional to the sponge strength. A detailed analysis for acoustic wave propagation in one dimension verifies the convergence rate given by the general theory. The exponential point-wise convergence derived by Israeli and Orszag in the high-frequency limit is recovered and found to hold over all frequencies. A weakly nonlinear analysis of the method when applied to Burgers' equation shows similar convergence properties. Three numerical examples are given to confirm the analysis: the acoustic extension of a two-dimensional time-harmonic point source, the acoustic extension of a three-dimensional initial-value problem of a sound pulse, and the introduction of unstable eigenmodes from linear stability theory into a two-dimensional shear layer.
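
    A minimal one-dimensional illustration of the forcing term analyzed above, assuming linear advection at speed c with a quadratic sponge ramp near the outflow boundary; the grid, sponge strength, and ramp shape are arbitrary choices.

      import numpy as np

      nx, L = 400, 10.0
      x = np.linspace(0.0, L, nx)
      dx, c, dt = x[1] - x[0], 1.0, 0.01
      q_ref = 0.0                                   # quiescent target state
      # Sponge strength sigma: zero in the interior, ramping up for x > 8.
      sigma = np.where(x > 8.0, 50.0 * ((x - 8.0) / 2.0)**2, 0.0)

      q = np.exp(-((x - 3.0) / 0.5)**2)             # initial pulse
      for _ in range(500):
          dqdx = (q - np.roll(q, 1)) / dx           # first-order upwind (c > 0)
          q += dt * (-c * dqdx - sigma * (q - q_ref))   # advection + sponge forcing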

  5. A Computational Approach for Probabilistic Analysis of Water Impact Simulations

    NASA Technical Reports Server (NTRS)

    Horta, Lucas G.; Mason, Brian H.; Lyle, Karen H.

    2009-01-01

    NASA's development of new concepts for the Crew Exploration Vehicle Orion presents many similar challenges to those worked in the sixties during the Apollo program. However, with improved modeling capabilities, new challenges arise. For example, the use of the commercial code LS-DYNA, although widely used and accepted in the technical community, often involves high-dimensional, time consuming, and computationally intensive simulations. The challenge is to capture what is learned from a limited number of LS-DYNA simulations to develop models that allow users to conduct interpolation of solutions at a fraction of the computational time. This paper presents a description of the LS-DYNA model, a brief summary of the response surface techniques, the analysis of variance approach used in the sensitivity studies, equations used to estimate impact parameters, results showing conditions that might cause injuries, and concluding remarks.

  6. Computational analysis of RNA structures with chemical probing data

    PubMed Central

    Ge, Ping; Zhang, Shaojie

    2015-01-01

    RNAs play various roles, not only as the genetic codes to synthesize proteins, but also as the direct participants of biological functions determined by their underlying high-order structures. Although many computational methods have been proposed for analyzing RNA structures, their accuracy and efficiency are limited, especially when applied to the large RNAs and the genome-wide data sets. Recently, advances in parallel sequencing and high-throughput chemical probing technologies have prompted the development of numerous new algorithms, which can incorporate the auxiliary structural information obtained from those experiments. Their potential has been revealed by the secondary structure prediction of ribosomal RNAs and the genome-wide ncRNA function annotation. In this review, the existing probing-directed computational methods for RNA secondary and tertiary structure analysis are discussed. PMID:25687190

  7. Computational analysis of the SRS Phase III salt disposition alternatives

    SciTech Connect

    Dimenna, R.A.

    2000-01-04

    In late 1997, the In-Tank Precipitation (ITP) facility was shut down and an evaluation of alternative methods to process the liquid high-level waste stored in the Savannah River Site High-Level Waste storage tanks was begun. The objective was to determine whether another process might avoid the operational difficulties encountered with ITP for a lower cost than modifying the existing facility. A structured approach was used to evaluate proposed alternatives on a common basis and identify the best one. Results from the computational analysis were a key part of the input used to select a primary and a secondary salt disposition alternative. This paper describes the process by which the computational needs were identified, addressed, and accomplished with a limited staff under stringent schedule constraints.

  8. Oxidation behavior of ammonium in a 3-dimensional biofilm-electrode reactor.

    PubMed

    Tang, Jinjing; Guo, Jinsong; Fang, Fang; Chen, Youpeng; Lei, Lijing; Yang, Lin

    2013-12-01

    Excess nitrogenous compounds are detrimental to natural water systems and to human health. To completely realize autohydrogenotrophic nitrogen removal, a novel 3-dimensional biofilm-electrode reactor was designed. Titanium was electroplated with ruthenium and used as the anode. Activated carbon fiber felt was used as the cathode. The reactor was separated into two chambers by a permeable membrane. The cathode chamber was filled with granular graphite and glass beads. The cathode and cathode chamber were inhabited with domesticated biofilm. In the absence of organic substances, a nitrogen removal efficiency of up to 91% was achieved at DO levels of 3.42 +/- 0.37 mg/L when the applied current density was only 0.02 mA/cm2. The oxidation of ammonium in biofilm-electrode reactors was also investigated. It was found that ammonium could be oxidized not only on the anode but also on particle electrodes in the cathode chamber of the biofilm-electrode reactor. Oxidation rates of ammonium and nitrogen removal efficiency were found to be affected by the electric current loading on the biofilm-electrode reactor. The kinetic model of ammonium at different electric currents was analyzed by a first-order reaction kinetics equation. The regression analysis implied that when the current density was less than 0.02 mA/cm2, ammonium removal was positively correlated to the current density. However, when the current density was more than 0.02 mA/cm2, the electric current became a limiting factor for the oxidation rate of ammonium and nitrogen removal efficiency.
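
    The first-order kinetic analysis mentioned above reduces to fitting ln(C/C0) = -kt; a minimal regression sketch with hypothetical concentration data follows.

      import numpy as np

      t = np.array([0.0, 2.0, 4.0, 6.0, 8.0])        # h
      c = np.array([25.0, 17.8, 12.6, 9.0, 6.4])     # mg/L NH4+-N (hypothetical)
      slope, intercept = np.polyfit(t, np.log(c / c[0]), 1)
      print(f"first-order rate constant k = {-slope:.3f} 1/h")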

  9. Usefulness of 3-dimensional stereotactic surface projection FDG PET images for the diagnosis of dementia

    PubMed Central

    Kim, Jahae; Cho, Sang-Geon; Song, Minchul; Kang, Sae-Ryung; Kwon, Seong Young; Choi, Kang-Ho; Choi, Seong-Min; Kim, Byeong-Chae; Song, Ho-Chun

    2016-01-01

    To compare diagnostic performance and confidence of a standard visual reading and combined 3-dimensional stereotactic surface projection (3D-SSP) results to discriminate between Alzheimer disease (AD)/mild cognitive impairment (MCI), dementia with Lewy bodies (DLB), and frontotemporal dementia (FTD). [18F]fluorodeoxyglucose (FDG) PET brain images were obtained from 120 patients (64 AD/MCI, 38 DLB, and 18 FTD) whose diagnoses were clinically confirmed over 2 years of follow-up. Three nuclear medicine physicians performed the diagnosis and rated diagnostic confidence twice: once by the standard visual method, and once with the addition of 3D-SSP. Diagnostic performance and confidence were compared between the 2 methods. 3D-SSP showed higher sensitivity, specificity, accuracy, and positive and negative predictive values to discriminate different types of dementia compared with the visual method alone, except for AD/MCI specificity and FTD sensitivity. Correction of misdiagnosis after adding 3D-SSP images was greatest for AD/MCI (56%), followed by DLB (13%) and FTD (11%). Diagnostic confidence also increased in DLB (visual: 3.2; 3D-SSP: 4.1; P < 0.001), followed by AD/MCI (visual: 3.1; 3D-SSP: 3.8; P = 0.002) and FTD (visual: 3.5; 3D-SSP: 4.2; P = 0.022). Overall, 154/360 (43%) cases had a corrected misdiagnosis or improved diagnostic confidence for the correct diagnosis. The addition of 3D-SSP images to visual analysis helped to discriminate different types of dementia in FDG PET scans by correcting misdiagnoses and enhancing diagnostic confidence in the correct diagnosis. Improvement of diagnostic accuracy and confidence by 3D-SSP images might help to determine the cause of dementia and appropriate treatment. PMID:27930593

  10. Computational chemistry in Argonne`s Reactor Analysis Division

    SciTech Connect

    Gelbard, E.; Agrawal, R.; Fanning, T.

    1997-08-01

    Roughly 3 years ago, work on Argonne's Integral Fast Reactor (IFR) was terminated, and at that time ANL funding was redirected to a number of alternative programs. One such alternative was waste management and, since disposal of spent fuel from ANL's EBR-II reactor presents some special problems, this seemed an appropriate area for ANL work. Methods for the treatment and disposal of spent fuel (particularly from EBR-II but also from other sources) are now under very active investigation at ANL. The very large waste form development program is mainly experimental at this point, but within the Reactor Analysis (RA) Division a small computational chemistry program is underway, designed to supplement the experimental program. One of the most popular proposals for the treatment of much of our high-level wastes is vitrification. As noted below, this approach has serious drawbacks for EBR-II spent fuel. ANL has proposed, instead, that spent fuel first be pretreated by a special metallurgical process which produces, as waste, chloride salts of the various fission products; these salts would then be adsorbed in zeolite A, which is subsequently bonded with glass to produce a waste form suitable for disposal. So far it has been the main mission of RA's computational chemistry program to study the process by which leaching occurs when the glass-bonded zeolite waste form is exposed to water. It is the purpose of this paper to describe RA's computational chemistry program, to discuss the computational techniques involved in such a program, and in general to familiarize the M. and C. Division with a computational area which is probably unfamiliar to most of its members. 11 refs., 2 figs.

  11. Computer aided analysis, simulation and optimisation of thermal sterilisation processes.

    PubMed

    Narayanan, C M; Banerjee, Arindam

    2013-04-01

    Although thermal sterilisation is a widely employed industrial process, little work is reported in the available literature, including patents, on the mathematical analysis and simulation of these processes. In the present work, software packages have been developed for computer aided optimum design of thermal sterilisation processes. Systems involving steam sparging, jacketed heating/cooling, helical coils submerged in agitated vessels, and systems that employ external heat exchangers (double pipe, shell and tube, and plate exchangers) have been considered. Both batch and continuous operations have been analysed and simulated. The dependence of the del factor on system/operating parameters such as mass or volume of substrate to be sterilised per batch, speed of agitation, helix diameter, substrate to steam ratio, rate of substrate circulation through the heat exchanger and that through the holding tube has been analysed separately for each mode of sterilisation. Axial dispersion in the holding tube has also been adequately accounted for through an appropriately defined axial dispersion coefficient. The effect of exchanger characteristics/specifications on the system performance has also been analysed. The multiparameter computer aided design (CAD) software packages prepared are thus highly versatile in nature, and they permit the optimum choice of operating variables for the processes selected. The computed results have been compared with extensive data collected from a number of industries (distilleries, food processing, and pharmaceutical industries) and pilot plants, and satisfactory agreement has been observed between the two, thereby ascertaining the accuracy of the CAD software developed. No simplifying assumptions have been made during the analysis, and the design of the associated heating/cooling equipment has been performed utilising the most updated design correlations and computer software.
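
    The del factor central to this analysis is the log reduction in viable organisms, del = ln(N0/N), obtained by integrating an Arrhenius death-rate constant over the temperature history. A minimal sketch follows; the Arrhenius parameters and the temperature profile are assumed illustrative values, not the paper's.

      import numpy as np

      A, E, R = 1e36, 2.83e5, 8.314              # 1/s, J/mol, J/(mol K) (assumed)
      t = np.linspace(0.0, 1800.0, 2000)         # s, one 30-minute cycle
      T = 300.0 + 94.0 * np.exp(-((t - 900.0) / 400.0)**2)   # hypothetical T(t), K
      k = A * np.exp(-E / (R * T))               # Arrhenius death-rate constant
      del_factor = np.sum(0.5 * (k[1:] + k[:-1]) * np.diff(t))   # trapezoidal integral
      print(f"del factor = {del_factor:.2f}")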

  12. 3-Dimensional and Interactive Istanbul University Virtual Laboratory Based on Active Learning Methods

    ERIC Educational Resources Information Center

    Ince, Elif; Kirbaslar, Fatma Gulay; Yolcu, Ergun; Aslan, Ayse Esra; Kayacan, Zeynep Cigdem; Alkan Olsson, Johanna; Akbasli, Ayse Ceylan; Aytekin, Mesut; Bauer, Thomas; Charalambis, Dimitris; Gunes, Zeliha Ozsoy; Kandemir, Ceyhan; Sari, Umit; Turkoglu, Suleyman; Yaman, Yavuz; Yolcu, Ozgu

    2014-01-01

    The purpose of this study is to develop a 3-dimensional interactive multi-user and multi-admin IUVIRLAB featuring active learning methods and techniques for university students and to introduce the Virtual Laboratory of Istanbul University and to show effects of IUVIRLAB on students' attitudes on communication skills and IUVIRLAB. Although there…

  13. 3-dimensional orthodontics visualization system with dental study models and orthopantomograms

    NASA Astrophysics Data System (ADS)

    Zhang, Hua; Ong, S. H.; Foong, K. W. C.; Dhar, T.

    2005-04-01

    The aim of this study is to develop a system that provides 3-dimensional visualization of orthodontic treatments. Dental plaster models and the corresponding orthopantomogram (dental panoramic tomogram) are first digitized and fed into the system. A semi-automatic segmentation technique is applied to the plaster models to detect the dental arches, tooth interstices, and gum margins, which are used to extract individual crown models. A 3-dimensional representation of the roots, generated by deforming generic tooth models to the orthopantomogram using radial basis functions, is attached to the corresponding crowns to enable visualization of complete teeth. An optional algorithm to close the gaps between deformed roots and actual crowns by using multiquadric radial basis functions is also presented, which is capable of generating a smooth mesh representation of complete 3-dimensional teeth. The user interface is carefully designed to achieve a flexible system with as much user friendliness as possible. Manual calibration and correction are possible throughout the data processing steps to compensate for occasional misbehavior of the automatic procedures. By allowing the users to move and re-arrange individual teeth (with their roots) on a full dentition, this orthodontic visualization system provides an easy and accurate way of simulating and planning orthodontic treatment. Its capability of presenting 3-dimensional root information with only study models and an orthopantomogram is especially useful for patients who do not undergo CT scanning, which is not a routine procedure in most orthodontic cases.
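
    A minimal sketch of the multiquadric RBF deformation idea, using SciPy's general-purpose interpolator rather than the authors' code; the landmark correspondences and mesh vertices are random placeholders.

      import numpy as np
      from scipy.interpolate import RBFInterpolator

      src = np.random.rand(10, 3)                  # landmarks on the generic tooth
      dst = src + 0.05 * np.random.randn(10, 3)    # matched landmarks from the scan
      # Interpolate the displacement field with multiquadric basis functions.
      warp = RBFInterpolator(src, dst - src, kernel="multiquadric", epsilon=1.0)

      verts = np.random.rand(500, 3)               # generic root-mesh vertices
      deformed = verts + warp(verts)               # smoothly deformed mesh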

  14. Computational analysis of contractility in engineered heart tissue.

    PubMed

    Mathews, Grant; Sondergaard, Claus; Jeffreys, Angela; Childs, William; Le, Bao Linh; Sahota, Amrit; Najibi, Skender; Nolta, Jan; Si, Ming-Sing

    2012-05-01

    Engineered heart tissue (EHT) is a potential therapy for heart failure and the basis of functional in vitro assays of novel cardiovascular treatments. Self-organizing EHT can be generated in fiber form, which makes the assessment of contractile function convenient with a force transducer. Contractile function is a key parameter of EHT performance. Analysis of EHT force data is often performed manually; however, this approach is time consuming, incomplete and subjective. Therefore, the purpose of this study was to develop a computer algorithm to efficiently and objectively analyze EHT force data. This algorithm incorporates data filtering, individual contraction detection and validation, inter/intracontractile analysis and intersample analysis. We found the algorithm to be accurate in contraction detection, validation and magnitude measurement as compared to human operators. The algorithm was efficient in processing hundreds of data acquisitions and was able to determine force-length curves, force-frequency relationships and compare various contractile parameters such as peak systolic force generation. We conclude that this computer algorithm is a key adjunct to the objective and efficient assessment of EHT contractile function.
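
    Automated contraction detection of the kind described can be prototyped with standard peak finding; the synthetic force trace, sampling rate, and thresholds below are illustrative assumptions, not the authors' algorithm.

      import numpy as np
      from scipy.signal import find_peaks, medfilt

      fs = 100.0                                   # Hz, assumed sampling rate
      t = np.arange(0.0, 10.0, 1.0 / fs)
      force = 0.2 * np.abs(np.sin(np.pi * t)) + 0.01 * np.random.randn(t.size)

      clean = medfilt(force, kernel_size=5)        # data filtering step
      # Contractions: peaks above a force threshold, separated by >= 0.5 s.
      peaks, props = find_peaks(clean, height=0.1, distance=int(0.5 * fs))
      rate = len(peaks) / t[-1]                    # contraction frequency, Hz
      amplitudes = props["peak_heights"] - clean.min()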

  15. Computing the surveillance error grid analysis: procedure and examples.

    PubMed

    Kovatchev, Boris P; Wakeman, Christian A; Breton, Marc D; Kost, Gerald J; Louie, Richard F; Tran, Nam K; Klonoff, David C

    2014-07-01

    The surveillance error grid (SEG) analysis is a tool for analysis and visualization of blood glucose monitoring (BGM) errors, based on the opinions of 206 diabetes clinicians who rated 4 distinct treatment scenarios. Resulting from this large-scale inquiry is a matrix of 337 561 risk ratings, 1 for each pair of (reference, BGM) readings ranging from 20 to 580 mg/dl. The computation of the SEG is therefore complex and in need of automation. The SEG software introduced in this article automates the task of assigning a degree of risk to each data point for a set of measured and reference blood glucose values so that the data can be distributed into 8 risk zones. The software's 2 main purposes are to (1) distribute a set of BG Monitor data into 8 risk zones ranging from none to extreme and (2) present the data in a color coded display to promote visualization. Besides aggregating the data into 8 zones corresponding to levels of risk, the SEG computes the number and percentage of data pairs in each zone and the number/percentage of data pairs above/below the diagonal line in each zone, which are associated with BGM errors creating risks for hypo- or hyperglycemia, respectively. To illustrate the action of the SEG software we first present computer-simulated data stratified along error levels defined by ISO 15197:2013. This allows the SEG to be linked to this established standard. Further illustration of the SEG procedure is done with a series of previously published data, which reflect the performance of BGM devices and test strips under various environmental conditions. We conclude that the SEG software is a useful addition to the SEG analysis presented in this journal, developed to assess the magnitude of clinical risk from analytically inaccurate data in a variety of high-impact situations such as intensive care and disaster settings.

  16. Computational particle physics for event generators and data analysis

    NASA Astrophysics Data System (ADS)

    Perret-Gallix, Denis

    2013-08-01

    High-energy physics data analysis relies heavily on the comparison between experimental and simulated data, as stressed lately by the Higgs search at the LHC and the recent identification of a Higgs-like new boson. The first link in the full simulation chain is the event generation, both for background and for expected signals. Nowadays event generators are based on the automatic computation of the matrix element or amplitude for each process of interest. Moreover, recent analysis techniques based on the matrix element likelihood method assign probabilities for every event to belong to any of a given set of possible processes. This method, originally used for the top mass measurement, although computationally intensive, has shown its efficiency at the LHC in extracting the new boson signal from the background. Serving both needs, the automatic calculation of matrix elements is therefore more than ever of prime importance for particle physics. Initiated in the 1980s, the techniques have matured for the lowest order calculations (tree level), but become complex and CPU-time consuming when higher order calculations involving loop diagrams are necessary, as for QCD processes at the LHC. New calculation techniques for next-to-leading order (NLO) have surfaced, making possible the generation of processes with many final state particles (up to 6). While NLO calculations are in many cases under control, although not yet fully automatic, even higher precision calculations involving processes at 2 loops or more remain a big challenge. After a short introduction to particle physics and to the related theoretical framework, we will review some of the computing techniques that have been developed to make these calculations automatic. The main available packages and some of the most important applications for simulation and data analysis, in particular at the LHC, will also be summarized (see CCP2012 slides [1]).

  17. Computer analysis of ring stiffened shells of revolution

    NASA Technical Reports Server (NTRS)

    Cohen, G. A.

    1973-01-01

    The equations and method of solution for a series of five compatible computer programs for the structural analysis of axisymmetric shell structures are presented. These programs, designated as the SRA programs, apply to a common structural model but analyze different modes of structural response. They are: (1) linear asymmetric static response (SRA 100), (2) buckling of linearized asymmetric equilibrium states (SRA 101), (3) nonlinear axisymmetric static response (SRA 200), (4) buckling of nonlinear axisymmetric equilibrium states (SRA 201), and (5) vibrations about a nonlinear axisymmetric equilibrium state (SRA 300).

  18. Computer analysis of general linear networks using digraphs.

    NASA Technical Reports Server (NTRS)

    Mcclenahan, J. O.; Chan, S.-P.

    1972-01-01

    Investigation of the application of digraphs in analyzing general electronic networks, and development of a computer program based on a particular digraph method developed by Chen. The Chen digraph method is a topological method for solution of networks and serves as a shortcut when hand calculations are required. The advantage offered by this method of analysis is that the results are in symbolic form. It is limited, however, by the size of network that may be handled. Usually hand calculations become too tedious for networks larger than about five nodes, depending on how many elements the network contains. Direct determinant expansion for a five-node network is a very tedious process also.

  19. Computational analysis of liquid hypergolic propellant rocket engines

    NASA Technical Reports Server (NTRS)

    Krishnan, A.; Przekwas, A. J.; Gross, K. W.

    1992-01-01

    The combustion process in liquid rocket engines depends on a number of complex phenomena such as atomization, vaporization, spray dynamics, mixing, and reaction mechanisms. A computational tool to study their mutual interactions is developed to help analyze these processes with a view of improving existing designs and optimizing future designs of the thrust chamber. The focus of the article is on the analysis of the Variable Thrust Engine for the Orbit Maneuvering Vehicle. This engine uses a hypergolic liquid bipropellant combination of monomethyl hydrazine as fuel and nitrogen tetroxide as oxidizer.

  20. Computational geometry assessment for morphometric analysis of the mandible.

    PubMed

    Raith, Stefan; Varga, Viktoria; Steiner, Timm; Hölzle, Frank; Fischer, Horst

    2017-01-01

    This paper presents a fully automated algorithm for geometry assessment of the mandible. Anatomical landmarks could be reliably detected, and distances were statistically evaluated with principal component analysis. For the first time, the method allows the generation of a mean mandible shape with statistically valid geometrical variations, based on a large set of 497 CT scans of human mandibles. The data may be used in bioengineering for designing novel oral implants, for planning of computer-guided surgery, and for the improvement of biomechanical models, as it is shown that commercially available mandible replicas differ significantly from the mean of the investigated population.
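
    A sketch of landmark-based statistical shape analysis in this spirit: center the flattened landmark configurations, take an SVD, and read off the mean shape and principal modes. The random array below stands in for the 497 segmented mandibles.

      import numpy as np

      n_shapes, n_landmarks = 497, 20
      X = np.random.randn(n_shapes, n_landmarks * 3)   # flattened (x, y, z) landmarks
      mean_shape = X.mean(axis=0)
      U, S, Vt = np.linalg.svd(X - mean_shape, full_matrices=False)
      variance = S**2 / (n_shapes - 1)                 # variance of each mode
      explained = variance / variance.sum()
      # A statistically plausible mandible: mean shape + 2 SD along mode 1.
      synth = mean_shape + 2.0 * np.sqrt(variance[0]) * Vt[0]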

  1. Computer Image Analysis of Histochemically-Labeled Acetylcholinesterase.

    DTIC Science & Technology

    1984-11-30

    image analysis in conjunction with histochemical techniques to describe the distribution of acetylcholinesterase (AChE) activity in nervous and muscular tissue in rats treated with organophosphates (OPs). The objective of the first year of work on this...remaining 2 years. We began by adopting a version of the AChE staining method as modified by Hanker, which is consistent with the optical properties of our video system. We wrote computer programs to provide a numeric quantity which represents the degree of staining in a tissue section. The staining was calibrated by...

  2. Computer compensation for NMR quantitative analysis of trace components

    SciTech Connect

    Nakayama, T.; Fujiwara, Y.

    1981-07-22

    A computer program has been written that determines trace components and separates overlapping components in multicomponent NMR spectra. This program uses the Lorentzian curve as a theoretical curve of NMR spectra. The coefficients of the Lorentzian are determined by the method of least squares. Systematic errors such as baseline/phase distortion are compensated and random errors are smoothed by taking moving averages, so that these processes contribute substantially to decreasing the accumulation time of spectral data. The accuracy of quantitative analysis of trace components has been improved by two significant figures. This program was applied to determining the abundance of 13C and the saponification degree of PVA.
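
    A minimal least-squares Lorentzian fit in the spirit described, using scipy.optimize.curve_fit on a synthetic peak; the peak parameters are illustrative, and the original program's baseline/phase compensation is omitted.

      import numpy as np
      from scipy.optimize import curve_fit

      def lorentzian(x, amp, x0, gamma, baseline):
          return amp * gamma**2 / ((x - x0)**2 + gamma**2) + baseline

      x = np.linspace(-5.0, 5.0, 400)                   # chemical-shift axis (a.u.)
      y = lorentzian(x, 1.0, 0.3, 0.5, 0.05) + 0.01 * np.random.randn(x.size)
      popt, pcov = curve_fit(lorentzian, x, y, p0=[1.0, 0.0, 1.0, 0.0])
      area = np.pi * popt[0] * popt[2]                  # integrated intensity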

  3. Computer-assisted analysis of adenosine triphosphate data.

    PubMed

    Erkenbrecher, C W; Crabtree, S J; Stevenson, L H

    1976-09-01

    A computer program has been written to assist in the analysis of adenosine 5'-triphosphate data. The program is designed to calculate a dilution curve and to correct sample and adenosine 5'-triphosphate standard data for background and dilution effects. In addition, basic statistical parameters and estimates of biomass carbon are also calculated for each group of samples and printed in a convenient format. The versatility of the program to analyze data from both aquatic and terrestrial samples is noted, as well as its potential use with various types of instrumentation and extraction techniques.

  4. Computer Tomography Analysis of Fastrac Composite Thrust Chamber Assemblies

    NASA Technical Reports Server (NTRS)

    Beshears, Ronald D.

    2000-01-01

    Computed tomography (CT) inspection has been integrated into the production process for NASA's Fastrac composite thrust chamber assemblies (TCAs). CT has been proven to be uniquely qualified to detect the known critical flaw for these nozzles: liner cracks that are adjacent to debonds between the liner and overwrap. CT is also being used as a process monitoring tool through analysis of low density indications in the nozzle overwraps. 3D reconstruction of CT images to produce models of flawed areas is being used to give program engineers better insight into the location and nature of nozzle flaws.

  5. Novel Multicompartment 3-Dimensional Radiochromic Radiation Dosimeters for Nanoparticle-Enhanced Radiation Therapy Dosimetry

    SciTech Connect

    Alqathami, Mamdooh; Blencowe, Anton; Yeo, Un Jin; Doran, Simon J.; Qiao, Greg; Geso, Moshi

    2012-11-15

    Purpose: Gold nanoparticles (AuNps), because of their high atomic number (Z), have been demonstrated to absorb low-energy X-rays preferentially, compared with tissue, and may be used to achieve localized radiation dose enhancement in tumors. The purpose of this study is to introduce the first example of a novel multicompartment radiochromic radiation dosimeter and to demonstrate its applicability for 3-dimensional (3D) dosimetry of nanoparticle-enhanced radiation therapy. Methods and Materials: A novel multicompartment phantom radiochromic dosimeter was developed. It was designed and formulated to mimic a tumor loaded with AuNps (50 nm in diameter) at a concentration of 0.5 mM, surrounded by normal tissues. The novel dosimeter is referred to as the Sensitivity Modulated Advanced Radiation Therapy (SMART) dosimeter. The dosimeters were irradiated with 100-kV and 6-MV X-ray energies. Dose enhancement produced from the interaction of X-rays with AuNps was calculated using spectrophotometric and cone-beam optical computed tomography scanning by quantitatively comparing the change in optical density and 3D datasets of the dosimetric measurements between the tissue-equivalent (TE) and TE/AuNps compartments. The interbatch and intrabatch variability and the postresponse stability of the dosimeters with AuNps were also assessed. Results: Radiation dose enhancement factors of 1.77 and 1.11 were obtained using 100-kV and 6-MV X-ray energies, respectively. The results of this study are in good agreement with previous observations; however, for the first time we provide direct experimental confirmation and 3D visualization of the radiosensitization effect of AuNps. The dosimeters with AuNps showed small (<3.5%) interbatch variability and negligible (<0.5%) intrabatch variability. Conclusions: The SMART dosimeter yields experimental insights concerning the spatial distributions and elevated dose in nanoparticle-enhanced radiation therapy, which cannot be performed using any of

  6. Automated Patient Identification and Localization Error Detection Using 2-Dimensional to 3-Dimensional Registration of Kilovoltage X-Ray Setup Images

    SciTech Connect

    Lamb, James M. Agazaryan, Nzhde; Low, Daniel A.

    2013-10-01

    Purpose: To determine whether kilovoltage x-ray projection radiation therapy setup images could be used to perform patient identification and detect gross errors in patient setup using a computer algorithm. Methods and Materials: Three patient cohorts treated using a commercially available image guided radiation therapy (IGRT) system that uses 2-dimensional to 3-dimensional (2D-3D) image registration were retrospectively analyzed: a group of 100 cranial radiation therapy patients, a group of 100 prostate cancer patients, and a group of 83 patients treated for spinal lesions. The setup images were acquired using fixed in-room kilovoltage imaging systems. In the prostate and cranial patient groups, localizations using image registration were performed between computed tomography (CT) simulation images from radiation therapy planning and setup x-ray images corresponding both to the same patient and to different patients. For the spinal patients, localizations were performed to the correct vertebral body, and to an adjacent vertebral body, using planning CTs and setup x-ray images from the same patient. An image similarity measure used by the IGRT system image registration algorithm was extracted from the IGRT system log files and evaluated as a discriminant for error detection. Results: A threshold value of the similarity measure could be chosen to separate correct and incorrect patient matches and correct and incorrect vertebral body localizations with excellent accuracy for these patient cohorts. A 10-fold cross-validation using linear discriminant analysis yielded misclassification probabilities of 0.000, 0.0045, and 0.014 for the cranial, prostate, and spinal cases, respectively. Conclusions: An automated measure of the image similarity between x-ray setup images and corresponding planning CT images could be used to perform automated patient identification and detection of localization errors in radiation therapy treatments.
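
    The discriminant analysis described can be prototyped as follows, with simulated similarity scores standing in for the IGRT log-file data; scikit-learn's LDA and 10-fold cross-validation mirror the evaluation procedure.

      import numpy as np
      from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
      from sklearn.model_selection import cross_val_score

      rng = np.random.default_rng(0)
      match = rng.normal(0.9, 0.05, 100)       # similarity, correct setups (simulated)
      mismatch = rng.normal(0.6, 0.10, 100)    # similarity, wrong patient or level
      X = np.concatenate([match, mismatch]).reshape(-1, 1)
      y = np.array([1] * 100 + [0] * 100)
      acc = cross_val_score(LinearDiscriminantAnalysis(), X, y, cv=10)
      print(f"mean cross-validated accuracy: {acc.mean():.3f}")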

  7. Applying DNA computation to intractable problems in social network analysis.

    PubMed

    Chen, Rick C S; Yang, Stephen J H

    2010-09-01

    From ancient times to the present day, social networks have played an important role in the formation of various organizations for a range of social behaviors. As such, social networks inherently describe the complicated relationships between elements around the world. Based on mathematical graph theory, social network analysis (SNA) has been developed in and applied to various fields such as Web 2.0 for Web applications and product developments in industries, etc. However, some definitions of SNA, such as finding a clique, N-clique, N-clan, N-club and K-plex, are NP-complete problems, which are not easily solved via traditional computer architecture. These challenges have restricted the uses of SNA. This paper provides DNA-computing-based approaches with inherently high information density and massive parallelism. Using these approaches, we aim to solve the three primary problems of social networks: N-clique, N-clan, and N-club. Their accuracy and feasible time complexities discussed in the paper will demonstrate that DNA computing can be used to facilitate the development of SNA.

  8. Computational Models of Auditory Scene Analysis: A Review.

    PubMed

    Szabó, Beáta T; Denham, Susan L; Winkler, István

    2016-01-01

    Auditory scene analysis (ASA) refers to the process(es) of parsing the complex acoustic input into auditory perceptual objects representing either physical sources or temporal sound patterns, such as melodies, which contributed to the sound waves reaching the ears. A number of new computational models accounting for some of the perceptual phenomena of ASA have been published recently. Here we provide a theoretically motivated review of these computational models, aiming to relate their guiding principles to the central issues of the theoretical framework of ASA. Specifically, we ask how they achieve the grouping and separation of sound elements and whether they implement some form of competition between alternative interpretations of the sound input. We consider the extent to which they include predictive processes, as important current theories suggest that perception is inherently predictive, and also how they have been evaluated. We conclude that current computational models of ASA are fragmentary in the sense that rather than providing general competing interpretations of ASA, they focus on assessing the utility of specific processes (or algorithms) for finding the causes of the complex acoustic signal. This leaves open the possibility for integrating complementary aspects of the models into a more comprehensive theory of ASA.

  10. Plans for a sensitivity analysis of bridge-scour computations

    USGS Publications Warehouse

    Dunn, David D.; Smith, Peter N.

    1993-01-01

    Plans for an analysis of the sensitivity of Level 2 bridge-scour computations are described. Cross-section data from 15 bridge sites in Texas are modified to reflect four levels of field effort ranging from no field surveys to complete surveys. Data from United States Geological Survey (USGS) topographic maps will be used to supplement incomplete field surveys. The cross sections are used to compute the water-surface profile through each bridge for several T-year recurrence-interval design discharges. The effect of determining the downstream energy grade-line slope from topographic maps is investigated by systematically varying the starting slope of each profile. The water-surface profile analyses are then used to compute potential scour resulting from each of the design discharges. The planned results will be presented in the form of exceedance-probability versus scour-depth plots with the maximum and minimum scour depths at each T-year discharge presented as error bars.

  11. Human embryonic growth and development of the cerebellum using 3-dimensional ultrasound and virtual reality.

    PubMed

    Rousian, M; Groenenberg, I A L; Hop, W C; Koning, A H J; van der Spek, P J; Exalto, N; Steegers, E A P

    2013-08-01

    The aim of our study was to evaluate first trimester cerebellar growth and development using 2 different measuring techniques: 3-dimensional (3D) and virtual reality (VR) ultrasound visualization. The cerebellum measurements were related to gestational age (GA) and crown-rump length (CRL). Finally, the reproducibility of both methods was tested. In a prospective cohort study, we collected 630 first trimester, serially obtained, 3D ultrasound scans of 112 uncomplicated pregnancies between 7 + 0 and 12 + 6 weeks of GA. Only scans with high-quality images of the fossa posterior were selected for the analysis. Measurements were performed offline in the coronal plane using 3D (4D view) and VR (V-Scope) software. VR enables the observer to use all available dimensions in a data set by visualizing the volume as a "hologram." The total cerebellar diameter and the left and right hemispheric diameters and thicknesses were measured using both techniques. All measurements were performed 3 times, and means were used in repeated measurements analysis. After the exclusion criteria were applied, 177 (28%) 3D data sets were available for further analysis. The median GA was 10 + 0 weeks and the median CRL was 31.4 mm (range: 5.2-79.0 mm). The cerebellar parameters could be measured from 7 gestational weeks onward. The total cerebellar diameter increased from 2.2 mm at 7 weeks of GA to 13.9 mm at 12 weeks of GA using VR, and from 2.2 to 13.8 mm using 3D ultrasound. The reproducibility, established in a subset of 35 data sets, resulted in intraclass correlation coefficient values ≥0.98. It can be concluded that cerebellar measurements performed by the 2 methods proved to be reproducible and comparable with each other. However, VR, using all three dimensions, provides a superior method for the visualization of the cerebellum. The constructed reference values can be used to study normal and abnormal cerebellar growth and development.

  12. BEST3D user's manual: Boundary Element Solution Technology, 3-Dimensional Version 3.0

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The theoretical basis and programming strategy utilized in the construction of the computer program BEST3D (boundary element solution technology - three dimensional) and detailed input instructions are provided for the use of the program. An extensive set of test cases and sample problems is included in the manual and is also available for distribution with the program. The BEST3D program was developed under the 3-D Inelastic Analysis Methods for Hot Section Components contract (NAS3-23697). The overall objective of this program was the development of new computer programs allowing more accurate and efficient three-dimensional thermal and stress analysis of hot section components, i.e., combustor liners, turbine blades, and turbine vanes. The BEST3D program allows both linear and nonlinear analysis of static and quasi-static elastic problems and transient dynamic analysis for elastic problems. Calculation of elastic natural frequencies and mode shapes is also provided.

  13. Computational analysis of core promoters in the Drosophila genome

    PubMed Central

    Ohler, Uwe; Liao, Guo-chun; Niemann, Heinrich; Rubin, Gerald M

    2002-01-01

    Background The core promoter, a region of about 100 base-pairs flanking the transcription start site (TSS), serves as the recognition site for the basal transcription apparatus. Drosophila TSSs have generally been mapped by individual experiments; the low number of accurately mapped TSSs has limited analysis of promoter sequence motifs and the training of computational prediction tools. Results We identified TSS candidates for about 2,000 Drosophila genes by aligning 5' expressed sequence tags (ESTs) from cap-trapped cDNA libraries to the genome, while applying stringent criteria concerning coverage and 5'-end distribution. Examination of the sequences flanking these TSSs revealed the presence of well-known core promoter motifs such as the TATA box, the initiator and the downstream promoter element (DPE). We also define, and assess the distribution of, several new motifs prevalent in core promoters, including what appears to be a variant DPE motif. Among the prevalent motifs is the DNA-replication-related element DRE, recently shown to be part of the recognition site for the TBP-related factor TRF2. Our TSS set was then used to retrain the computational promoter predictor McPromoter, allowing us to improve the recognition performance to over 50% sensitivity and 40% specificity. We compare these computational results to promoter prediction in vertebrates. Conclusions There are relatively few recognizable binding sites for previously known general transcription factors in Drosophila core promoters. However, we identified several new motifs enriched in promoter regions. We were also able to significantly improve the performance of computational TSS prediction in Drosophila. PMID:12537576

  14. Analysis of CERN computing infrastructure and monitoring data

    NASA Astrophysics Data System (ADS)

    Nieke, C.; Lassnig, M.; Menichetti, L.; Motesnitsalis, E.; Duellmann, D.

    2015-12-01

    Optimizing a computing infrastructure on the scale of LHC requires a quantitative understanding of a complex network of many different resources and services. For this purpose the CERN IT department and the LHC experiments are collecting a large multitude of logs and performance probes, which are already successfully used for short-term analysis (e.g. operational dashboards) within each group. The IT analytics working group has been created with the goal to bring data sources from different services and on different abstraction levels together and to implement a suitable infrastructure for mid- to long-term statistical analysis. It further provides a forum for joint optimization across single service boundaries and the exchange of analysis methods and tools. To simplify access to the collected data, we implemented an automated repository for cleaned and aggregated data sources based on the Hadoop ecosystem. This contribution describes some of the challenges encountered, such as dealing with heterogeneous data formats and selecting an efficient storage format for MapReduce and external access, and describes the repository user interface. Using this infrastructure we were able to quantitatively analyze the relationship between CPU/wall fraction, latency/throughput constraints of network and disk and the effective job throughput. In this contribution we will first describe the design of the shared analysis infrastructure and then present a summary of first analysis results from the combined data sources.
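
    A minimal sketch of the cleaning-and-aggregation idea behind such a repository, assuming hypothetical JSON log records with 'ts', 'service', 'cpu_s', and 'wall_s' fields; the actual CERN pipeline runs on the Hadoop ecosystem at far larger scale.

    ```python
    import json
    from collections import defaultdict
    from datetime import datetime

    # Two synthetic raw log lines standing in for heterogeneous monitoring data.
    raw_records = [
        '{"ts": "2015-06-01T10:00:00", "service": "batch", "cpu_s": 3600, "wall_s": 4000}',
        '{"ts": "2015-06-01T11:30:00", "service": "batch", "cpu_s": 1800, "wall_s": 3600}',
    ]

    # Aggregate per (day, service) into a cleaned daily summary.
    daily = defaultdict(lambda: {"cpu_s": 0.0, "wall_s": 0.0, "jobs": 0})
    for line in raw_records:
        rec = json.loads(line)
        day = datetime.fromisoformat(rec["ts"]).date()
        key = (day, rec["service"])
        daily[key]["cpu_s"] += rec["cpu_s"]
        daily[key]["wall_s"] += rec["wall_s"]
        daily[key]["jobs"] += 1

    for (day, service), agg in daily.items():
        # CPU/wall fraction is one of the quantities analyzed in the paper.
        print(day, service, "cpu/wall =", round(agg["cpu_s"] / agg["wall_s"], 2))
    ```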

  15. Electronic Forms-Based Computing for Evidentiary Analysis

    SciTech Connect

    Luse, Andy; Mennecke, Brian; Townsend, Anthony

    2009-07-01

    The paperwork associated with evidentiary collection and analysis is a highly repetitive and time-consuming process which often involves duplication of work and can frequently result in documentary errors. Electronic entry of evidence-related information can facilitate greater accuracy and less time spent on data entry. This manuscript describes a general framework for the implementation of an electronic tablet-based system for evidentiary processing. This framework is then utilized in the design and implementation of an electronic tablet-based evidentiary input prototype system developed for use by forensic laboratories which serves as a verification of the proposed framework. The manuscript concludes with a discussion of implications and recommendations for the implementation and use of tablet-based computing for evidence analysis.

  16. Computer program for design analysis of radial-inflow turbines

    NASA Technical Reports Server (NTRS)

    Glassman, A. J.

    1976-01-01

    A computer program written in FORTRAN that may be used for the design analysis of radial-inflow turbines was documented. The following information is included: loss model (estimation of losses), the analysis equations, a description of the input and output data, the FORTRAN program listing and list of variables, and sample cases. The input design requirements include the power, mass flow rate, inlet temperature and pressure, and rotational speed. The program output data includes various diameters, efficiencies, temperatures, pressures, velocities, and flow angles for the appropriate calculation stations. The design variables include the stator-exit angle, rotor radius ratios, and rotor-exit tangential velocity distribution. The losses are determined by an internal loss model.

  17. A computer program (MACPUMP) for interactive aquifer-test analysis

    USGS Publications Warehouse

    Day-Lewis, F. D.; Person, M.A.; Konikow, L.F.

    1995-01-01

    This report introduces MACPUMP (Version 1.0), an aquifer-test-analysis package for use with Macintosh computers. The report outlines the input-data format, describes the solutions encoded in the program, explains the menu-items, and offers a tutorial illustrating the use of the program. The package reads list-directed aquifer-test data from a file, plots the data to the screen, generates and plots type curves for several different test conditions, and allows mouse-controlled curve matching. MACPUMP features pull-down menus, a simple text viewer for displaying data-files, and optional on-line help windows. This version includes the analytical solutions for nonleaky and leaky confined aquifers, using both type curves and straight-line methods, and for the analysis of single-well slug tests using type curves. An executable version of the code and sample input data sets are included on an accompanying floppy disk.
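
    The nonleaky confined type curves mentioned above are classically generated from the Theis well function W(u). A minimal sketch, with illustrative parameter values rather than anything from the report:

    ```python
    import numpy as np
    from scipy.special import exp1  # E1(u), the exponential integral

    def theis_drawdown(r, t, Q, T, S):
        """Theis drawdown s(r, t): pumping rate Q, transmissivity T, storativity S.

        W(u) = E1(u), with u = r^2 * S / (4 * T * t).
        """
        u = r**2 * S / (4.0 * T * t)
        return Q / (4.0 * np.pi * T) * exp1(u)

    t = np.logspace(-2, 1, 5)  # times in days
    s = theis_drawdown(r=30.0, t=t, Q=500.0, T=150.0, S=1e-4)  # m, m3/d, m2/d
    print(np.round(s, 3))      # drawdown in meters at each time
    ```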

  18. Computer-aided communication satellite system analysis and optimization

    NASA Technical Reports Server (NTRS)

    Stagl, T. W.; Morgan, N. H.; Morley, R. E.; Singh, J. P.

    1973-01-01

    The capabilities and limitations of the various published computer programs for fixed/broadcast communication satellite system synthesis and optimization are discussed. A Satellite Telecommunication Analysis and Modeling Program (STAMP) for costing and sensitivity analysis work in the application of communication satellites to educational development is given. The modifications made to STAMP include: extension of the six-beam capability to eight; addition of generation of multiple beams from a single reflector system with an array of feeds; improved system costing to reflect the time value of money, growth in earth terminal population with time, and various measures of system reliability; inclusion of a model for scintillation at microwave frequencies in the communication link loss model; and an updated technological environment.
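
    Costing that reflects the time value of money reduces to discounting future cash flows to the present, PV = sum of C_t / (1 + r)^t. A minimal sketch with illustrative figures, not STAMP's actual cost model:

    ```python
    def present_value(cash_flows, rate):
        """Discount a list of yearly costs (index = year) back to year zero."""
        return sum(c / (1.0 + rate) ** t for t, c in enumerate(cash_flows))

    # Hypothetical system costs in $M: launch year plus four years of operations.
    yearly_costs = [10.0, 4.0, 4.0, 4.5, 5.0]
    print(round(present_value(yearly_costs, rate=0.07), 2))
    ```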

  19. Dynamic analysis of spur gears using computer program DANST

    NASA Astrophysics Data System (ADS)

    Oswald, Fred B.; Lin, Hsiang Hsi; Liou, Chuen-Huei; Valco, Mark J.

    1993-06-01

    DANST is a computer program for static and dynamic analysis of spur gear systems. The program can be used for parametric studies to predict the effect on dynamic load and tooth bending stress of spur gears due to operating speed, torque, stiffness, damping, inertia, and tooth profile. DANST performs geometric modeling and dynamic analysis for low- or high-contact-ratio spur gears. DANST can simulate gear systems with contact ratio ranging from one to three. It was designed to be easy to use, and it is extensively documented by comments in the source code. This report describes the installation and use of DANST. It covers input data requirements and presents examples. The report also compares DANST predictions for gear tooth loads and bending stress to experimental and finite element results.
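
    As background to the dynamic-load predictions described above, the sketch below integrates a single-degree-of-freedom mesh model with time-varying mesh stiffness, the kind of formulation gear dynamics codes build on; it is not DANST's actual model, and all parameter values are illustrative.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    m = 0.5            # equivalent mass along the line of action, kg
    c = 2000.0         # mesh damping, N*s/m
    k1, k2 = 2e8, 3.4e8  # stiffness with one vs. two tooth pairs in contact, N/m
    F = 1000.0         # static transmitted load, N
    f_mesh = 500.0     # mesh frequency, Hz

    def mesh_stiffness(t, contact_ratio=1.6):
        """Square-wave stiffness: two pairs in contact for part of each cycle."""
        phase = (t * f_mesh) % 1.0
        return k2 if phase < (contact_ratio - 1.0) else k1

    def rhs(t, y):
        x, v = y
        return [v, (F - c * v - mesh_stiffness(t) * x) / m]

    # Start from the static deflection and integrate over 25 mesh cycles.
    sol = solve_ivp(rhs, (0.0, 0.05), [F / k1, 0.0], max_step=1e-5)
    dyn_load = mesh_stiffness(sol.t[-1]) * sol.y[0, -1]  # instantaneous tooth load
    print(round(dyn_load, 1), "N")
    ```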

  20. Dynamic analysis of spur gears using computer program DANST

    NASA Technical Reports Server (NTRS)

    Oswald, Fred B.; Lin, Hsiang Hsi; Liou, Chuen-Huei; Valco, Mark J.

    1993-01-01

    DANST is a computer program for static and dynamic analysis of spur gear systems. The program can be used for parametric studies to predict the effect on dynamic load and tooth bending stress of spur gears due to operating speed, torque, stiffness, damping, inertia, and tooth profile. DANST performs geometric modeling and dynamic analysis for low- or high-contact-ratio spur gears. DANST can simulate gear systems with contact ratio ranging from one to three. It was designed to be easy to use, and it is extensively documented by comments in the source code. This report describes the installation and use of DANST. It covers input data requirements and presents examples. The report also compares DANST predictions for gear tooth loads and bending stress to experimental and finite element results.

  1. Dynamic analysis of spur gears using computer program DANST

    SciTech Connect

    Oswald, F.B.; Lin, H.H.; Liou, Chuenheui; Valco, M.J.

    1993-06-01

    DANST is a computer program for static and dynamic analysis of spur gear systems. The program can be used for parametric studies to predict the effect on dynamic load and tooth bending stress of spur gears due to operating speed, torque, stiffness, damping, inertia, and tooth profile. DANST performs geometric modeling and dynamic analysis for low- or high-contact-ratio spur gears. DANST can simulate gear systems with contact ratio ranging from one to three. It was designed to be easy to use, and it is extensively documented by comments in the source code. This report describes the installation and use of DANST. It covers input data requirements and presents examples. The report also compares DANST predictions for gear tooth loads and bending stress to experimental and finite element results. 14 refs.

  2. Modern wing flutter analysis by computational fluid dynamics methods

    NASA Technical Reports Server (NTRS)

    Cunningham, Herbert J.; Batina, John T.; Bennett, Robert M.

    1988-01-01

    The application and assessment of the recently developed CAP-TSD transonic small-disturbance code for flutter prediction is described. The CAP-TSD code has been developed for aeroelastic analysis of complete aircraft configurations and was previously applied to the calculation of steady and unsteady pressures with favorable results. Generalized aerodynamic forces and flutter characteristics are calculated and compared with linear theory results and with experimental data for a 45 deg sweptback wing. These results are in good agreement with the experimental flutter data, which is a first step toward validating CAP-TSD for general transonic aeroelastic applications. The paper presents these results and comparisons along with general remarks regarding modern wing flutter analysis by computational fluid dynamics methods.

  3. Computational Flow Analysis of a Left Ventricular Assist Device

    NASA Technical Reports Server (NTRS)

    Kiris, Cetin; Kwak, Dochan; Benkowski, Robert

    1995-01-01

    Computational fluid dynamics has been developed to a level where it has become an indispensable part of aerospace research and design. Technology developed for aerospace applications can also be utilized for the benefit of human health. For example, a flange-to-flange rocket engine fuel-pump simulation includes the rotating and non-rotating components: the flow straighteners, the impeller, and diffusers. A Ventricular Assist Device developed by NASA Johnson Space Center and Baylor College of Medicine has a design similar to a rocket engine fuel pump in that it also consists of a flow straightener, an impeller, and a diffuser. Accurate and detailed knowledge of the flowfield obtained by incompressible flow calculations can be greatly beneficial to designers in their effort to reduce the cost and improve the reliability of these devices. In addition to the geometric complexities, a variety of flow phenomena are encountered in biofluids. These include turbulent boundary layer separation, wakes, transition, tip vortex resolution, three-dimensional effects, and Reynolds number effects. In order to increase the role of Computational Fluid Dynamics (CFD) in the design process, the CFD analysis tools must be evaluated and validated so that designers gain confidence in their use. The incompressible flow solver, INS3D, has been applied to flow inside liquid rocket engine turbopump components and extensively validated. This paper details how the computational flow simulation capability developed for liquid rocket engine pump component analysis has been applied to the Left Ventricular Assist Device being developed jointly by NASA JSC and Baylor College of Medicine.

  4. Upgrades of Two Computer Codes for Analysis of Turbomachinery

    NASA Technical Reports Server (NTRS)

    Chima, Rodrick V.; Liou, Meng-Sing

    2005-01-01

    Major upgrades have been made in two of the programs reported in "Five Computer Codes for Analysis of Turbomachinery". The affected programs are: Swift -- a code for three-dimensional (3D) multiblock analysis; and TCGRID, which generates a 3D grid used with Swift. Originally utilizing only a central-differencing scheme for numerical solution, Swift was augmented by addition of two upwind schemes that give greater accuracy but take more computing time. Other improvements in Swift include addition of a shear-stress-transport turbulence model for better prediction of adverse pressure gradients, addition of an H-grid capability for flexibility in modeling flows in pumps and ducts, and modification to enable simultaneous modeling of hub and tip clearances. Improvements in TCGRID include modifications to enable generation of grids for more complicated flow paths and addition of an option to generate grids compatible with the ADPAC code used at NASA and in industry. For both codes, new test cases were developed and documentation was updated. Both codes were converted to Fortran 90, with dynamic memory allocation. Both codes were also modified for ease of use in both UNIX and Windows operating systems.

  5. Computer vision inspection of rice seed quality with discriminant analysis

    NASA Astrophysics Data System (ADS)

    Cheng, Fang; Ying, Yibin

    2004-10-01

    This study was undertaken to develop computer vision-based rice seed inspection technology for quality control. Color image classification using a discriminant analysis algorithm to identify germinated rice seeds was successfully implemented. The hybrid rice seed cultivars involved were Jinyou402, Shanyou10, Zhongyou207 and Jiayou99. Sixteen morphological features and six color features were extracted from sample images belonging to the training sets. The color feature 'Huebmean' shows the strongest classification ability among all the features. Computed as the area of the seed region divided by the area of the smallest convex polygon that can contain the seed region, the feature 'Solidity' outperforms the other morphological features in recognizing germinated seeds. Combining the two features 'Huebmean' and 'Solidity', discriminant analysis was used to classify normal rice seeds and seeds germinated on the panicle. Results show that the algorithm achieved an overall average accuracy of 98.4% for both normal and germinated seeds in all cultivars. The combination of 'Huebmean' and 'Solidity' proved to be a good indicator for germinated seeds. The simple discriminant algorithm using just two features shows high accuracy and good adaptability.
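
    The two-feature discriminant classification can be sketched with a standard linear discriminant. The feature values below are synthetic stand-ins for 'Huebmean' and 'Solidity', not the study's measurements.

    ```python
    import numpy as np
    from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

    # Columns: [Huebmean-like feature, Solidity-like feature] (synthetic values).
    X = np.array([[0.12, 0.97], [0.10, 0.95], [0.11, 0.96],   # normal seeds
                  [0.30, 0.80], [0.28, 0.78], [0.32, 0.82]])  # germinated seeds
    y = np.array([0, 0, 0, 1, 1, 1])  # 0 = normal, 1 = germinated

    clf = LinearDiscriminantAnalysis().fit(X, y)
    print(clf.predict([[0.29, 0.79]]))  # expected: class 1 (germinated)
    ```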

  6. Computational singular perturbation analysis of stochastic chemical systems with stiffness

    DOE PAGES

    Wang, Lijin; Han, Xiaoying; Cao, Yanzhao; ...

    2017-01-25

    Computational singular perturbation (CSP) is a useful method for analysis, reduction, and time integration of stiff ordinary differential equation systems. It has found dominant utility, in particular, in chemical reaction systems with a large range of time scales at the continuum and deterministic level. On the other hand, CSP is not directly applicable to chemical reaction systems at micro or meso-scale, where stochasticity plays a non-negligible role and thus has to be taken into account. In this work we develop a novel stochastic computational singular perturbation (SCSP) analysis and time integration framework, and associated algorithm, that can be used to not only construct accurately and efficiently the numerical solutions to stiff stochastic chemical reaction systems, but also analyze the dynamics of the reduced stochastic reaction systems. Furthermore, the algorithm is illustrated by an application to a benchmark stochastic differential equation model, and numerical experiments are carried out to demonstrate the effectiveness of the construction.
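
    For context, a minimal Gillespie-type simulation of a stiff stochastic reaction set (a fast reversible pair feeding a slow decay) shows why such systems are expensive to integrate directly; this sketch illustrates the problem class SCSP targets, not the SCSP algorithm itself, and the rate constants are illustrative.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    kf, kr, ks = 1e3, 1e3, 1.0          # fast forward/reverse, slow decay
    x = np.array([100, 0, 0])           # molecule counts of species A, B, C
    stoich = np.array([[-1, 1, 0],      # A -> B  (fast)
                       [1, -1, 0],      # B -> A  (fast)
                       [0, -1, 1]])     # B -> C  (slow)

    t, t_end, steps = 0.0, 0.01, 0
    while t < t_end:
        a = np.array([kf * x[0], kr * x[1], ks * x[1]])  # propensities
        a0 = a.sum()
        if a0 == 0.0:
            break
        t += rng.exponential(1.0 / a0)   # waiting time to the next reaction
        j = rng.choice(3, p=a / a0)      # which reaction fires
        x += stoich[j]
        steps += 1

    # Stiffness: the fast pair forces on the order of a thousand tiny steps
    # just to cover 0.01 time units of the slow dynamics.
    print(steps, "reactions fired; state A,B,C =", x)
    ```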

  7. Computational design analysis for deployment of cardiovascular stents

    NASA Astrophysics Data System (ADS)

    Tammareddi, Sriram; Sun, Guangyong; Li, Qing

    2010-06-01

    Cardiovascular disease has become a major global healthcare problem. As one of the relatively new medical devices, stents offer a minimally-invasive surgical strategy to improve the quality of life for numerous cardiovascular disease patients. One of the key issues has been to understand the effect of stent structures on deployment behaviour. This paper aims to develop a computational model for exploring the biomechanical responses to changes in stent geometrical parameters, namely the strut thickness and cross-link width of the Palmaz-Schatz stent. Explicit 3D dynamic finite element analysis was carried out to explore the sensitivity of these geometrical parameters on deployment performance, such as dog-boning, fore-shortening, and stent deformation over the load cycle. It has been found that an increase in stent thickness causes a sizeable rise in the load required to deform the stent to its target diameter, whilst reducing maximum dog-boning in the stent. An increase in the cross-link width produced no change in the load required to deform the stent to its target diameter and showed no apparent correlation with dog-boning, but fore-shortening increased with increasing cross-link width. The computational modelling and analysis presented herein provides an effective way to refine or optimise the design of stent structures.
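
    The deployment metrics mentioned above have simple geometric definitions, though exact conventions vary between studies. A minimal sketch with illustrative dimensions:

    ```python
    def dog_boning(d_end, d_central):
        """Relative flaring of the stent ends versus its central diameter."""
        return (d_end - d_central) / d_central

    def fore_shortening(length_initial, length_deployed):
        """Relative axial shortening of the stent on expansion."""
        return (length_initial - length_deployed) / length_initial

    # Hypothetical deployed geometry in mm (not the paper's results).
    print(round(dog_boning(d_end=3.4, d_central=3.0), 3))                   # 0.133
    print(round(fore_shortening(length_initial=16.0, length_deployed=15.2), 3))  # 0.05
    ```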

  8. The CDF computing and analysis system: First experience

    SciTech Connect

    R. Colombo et al.

    2001-11-02

    The Collider Detector at Fermilab (CDF) collaboration records and analyses proton anti-proton interactions with a center-of-mass energy of 2 TeV at the Tevatron. A new collider run, Run II, of the Tevatron started in April. During its more than two-year duration the CDF experiment expects to record about 1 PetaByte of data. With its multi-purpose detector and center-of-mass energy at the frontier, the experimental program is large and versatile. The over 500 scientists of CDF will engage in searches for new particles, like the Higgs boson or supersymmetric particles, precision measurement of electroweak parameters, like the mass of the W boson, measurement of top quark parameters, and a large spectrum of B physics. The experiment has taken data and analyzed them in previous runs. For Run II, however, the computing model was changed to incorporate new methodologies, the file format switched, and both data handling and analysis system redesigned to cope with the increased demands. This paper (4-036 at CHEP 2001) gives an overview of the CDF Run II compute system with emphasis on areas where the current system does not match initial estimates and projections. For the data handling and analysis system a more detailed description is given.

  9. Computer graphics techniques for aircraft EMC analysis and design

    NASA Astrophysics Data System (ADS)

    Kubina, S. J.; Bhartia, P.

    1983-10-01

    A comprehensive computer-aided system for the prediction of the potential interaction between avionics systems, with special emphasis on antenna-to-antenna coupling, is described. The methodology is applicable throughout the life cycle of an avionic/weapon system, including system upgrades and retrofits. As soon as aircraft geometry and preliminary systems information becomes available, the computer codes can be used to selectively display proposed antenna locations, emitter/receptor response characteristics, electromagnetic interference (EMI) margins and the actual ray-optical paths of maximum antenna-antenna coupling for each potential interacting antenna set. Antennas can be interactively relocated by track-ball (or joystick) and the analysis repeated at will for optimization or installation design study purposes. The codes can significantly simplify the task of the designer/analyst in effectively identifying critical interactions among an overwhelmingly large set of potential ones. In addition, it is an excellent design, development and analysis tool which simultaneously identifies both numerically and pictorially the EMI interdependencies among subsystems.

  10. A Computational Tool for Quantitative Analysis of Vascular Networks

    PubMed Central

    Zudaire, Enrique; Gambardella, Laure; Kurcz, Christopher; Vermeren, Sonja

    2011-01-01

    Angiogenesis is the generation of mature vascular networks from pre-existing vessels. Angiogenesis is crucial during the organism's development, for wound healing and for the female reproductive cycle. Several murine experimental systems are well suited for studying developmental and pathological angiogenesis. They include the embryonic hindbrain, the post-natal retina and allantois explants. In these systems vascular networks are visualised by appropriate staining procedures followed by microscopical analysis. Nevertheless, quantitative assessment of angiogenesis is hampered by the lack of readily available, standardized metrics and software analysis tools. Non-automated protocols are being used widely and they are, in general, time- and labour-intensive, prone to human error and do not permit computation of complex spatial metrics. We have developed lightweight, user-friendly software, AngioTool, which allows for quick, hands-off and reproducible quantification of vascular networks in microscopic images. AngioTool computes several morphological and spatial parameters including the area covered by a vascular network, the number of vessels, vessel length, vascular density and lacunarity. In addition, AngioTool calculates the so-called "branching index" (branch points / unit area), providing a measurement of the sprouting activity of a specimen of interest. We have validated AngioTool using images of embryonic murine hindbrains, post-natal retinas and allantois explants. AngioTool is open source and can be downloaded free of charge. PMID:22110636
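
    A branching-index-style measurement can be sketched from a binary vessel mask by skeletonizing it and counting skeleton pixels with three or more neighbors; this is one standard approach, not necessarily AngioTool's exact pipeline, and the tiny mask below is synthetic.

    ```python
    import numpy as np
    from scipy.ndimage import convolve
    from skimage.morphology import skeletonize

    # Synthetic binary vessel mask: two 1-pixel-wide vessels crossing.
    mask = np.zeros((9, 9), dtype=bool)
    mask[4, :] = True
    mask[:, 4] = True

    skel = skeletonize(mask)
    # Count 8-connected neighbors of each skeleton pixel (exclude the center).
    neighbors = convolve(skel.astype(int), np.ones((3, 3), int),
                         mode="constant") - skel.astype(int)
    branch_points = int(np.sum(skel & (neighbors >= 3)))

    area = int(mask.sum())
    print("vessel area (px):", area)
    print("skeleton length (px):", int(skel.sum()))
    print("branching index:", branch_points / area)  # branch points per unit area
    ```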

  11. Multiscale analysis of nonlinear systems using computational homology

    SciTech Connect

    Konstantin Mischaikow, Rutgers University/Georgia Institute of Technology; Michael Schatz, Georgia Institute of Technology; William Kalies, Florida Atlantic University; Thomas Wanner, George Mason University

    2010-05-19

    This is a collaborative project between the principal investigators. However, as is to be expected, different PIs have greater focus on different aspects of the project. This report lists these major directions of research which were pursued during the funding period: (1) Computational Homology in Fluids - For the computational homology effort in thermal convection, the focus of the work during the first two years of the funding period included: (1) A clear demonstration that homology can sensitively detect the presence or absence of an important flow symmetry, (2) An investigation of homology as a probe for flow dynamics, and (3) The construction of a new convection apparatus for probing the effects of large-aspect-ratio. (2) Computational Homology in Cardiac Dynamics - We have initiated an effort to test the use of homology in characterizing data from both laboratory experiments and numerical simulations of arrhythmia in the heart. Recently, the use of high speed, high sensitivity digital imaging in conjunction with voltage sensitive fluorescent dyes has enabled researchers to visualize electrical activity on the surface of cardiac tissue, both in vitro and in vivo. (3) Magnetohydrodynamics - A new research direction is to use computational homology to analyze results of large scale simulations of 2D turbulence in the presence of magnetic fields. Such simulations are relevant to the dynamics of black hole accretion disks. The complex flow patterns from simulations exhibit strong qualitative changes as a function of magnetic field strength. Efforts to characterize the pattern changes using Fourier methods and wavelet analysis have been unsuccessful. (4) Granular Flow - two experts in the area of granular media are studying 2D model experiments of earthquake dynamics where the stress fields can be measured; these stress fields form complex patterns of 'force chains' that may be amenable to analysis using computational homology. (5) Microstructure Characterization

  12. Multiscale analysis of nonlinear systems using computational homology

    SciTech Connect

    Konstantin Mischaikow; Michael Schatz; William Kalies; Thomas Wanner

    2010-05-24

    This is a collaborative project between the principal investigators. However, as is to be expected, different PIs have greater focus on different aspects of the project. This report lists these major directions of research which were pursued during the funding period: (1) Computational Homology in Fluids - For the computational homology effort in thermal convection, the focus of the work during the first two years of the funding period included: (1) A clear demonstration that homology can sensitively detect the presence or absence of an important flow symmetry, (2) An investigation of homology as a probe for flow dynamics, and (3) The construction of a new convection apparatus for probing the effects of large-aspect-ratio. (2) Computational Homology in Cardiac Dynamics - We have initiated an effort to test the use of homology in characterizing data from both laboratory experiments and numerical simulations of arrhythmia in the heart. Recently, the use of high speed, high sensitivity digital imaging in conjunction with voltage sensitive fluorescent dyes has enabled researchers to visualize electrical activity on the surface of cardiac tissue, both in vitro and in vivo. (3) Magnetohydrodynamics - A new research direction is to use computational homology to analyze results of large scale simulations of 2D turbulence in the presence of magnetic fields. Such simulations are relevant to the dynamics of black hole accretion disks. The complex flow patterns from simulations exhibit strong qualitative changes as a function of magnetic field strength. Efforts to characterize the pattern changes using Fourier methods and wavelet analysis have been unsuccessful. (4) Granular Flow - two experts in the area of granular media are studying 2D model experiments of earthquake dynamics where the stress fields can be measured; these stress fields form complex patterns of 'force chains' that may be amenable to analysis using computational homology. (5) Microstructure Characterization

  13. The future of computer-aided sperm analysis

    PubMed Central

    Mortimer, Sharon T; van der Horst, Gerhard; Mortimer, David

    2015-01-01

    Computer-aided sperm analysis (CASA) technology was developed in the late 1980s for analyzing sperm movement characteristics or kinematics and has been highly successful in enabling this field of research. CASA has also been used with great success for measuring semen characteristics such as sperm concentration and proportions of progressive motility in many animal species, including wide application in domesticated animal production laboratories and reproductive toxicology. However, attempts to use CASA for human clinical semen analysis have largely met with poor success due to the inherent difficulties presented by many human semen samples caused by sperm clumping and heavy background debris that, until now, have precluded accurate digital image analysis. The authors review the improved capabilities of two modern CASA platforms (Hamilton Thorne CASA-II and Microptic SCA6) and consider their current and future applications with particular reference to directing our focus towards using this technology to assess functional rather than simple descriptive characteristics of spermatozoa. Specific requirements for validating CASA technology as a semi-automated system for human semen analysis are also provided, with particular reference to the accuracy and uncertainty of measurement expected of a robust medical laboratory test for implementation in clinical laboratories operating according to modern accreditation standards. PMID:25926614

  14. Open Rotor Computational Aeroacoustic Analysis with an Immersed Boundary Method

    NASA Technical Reports Server (NTRS)

    Brehm, Christoph; Barad, Michael F.; Kiris, Cetin C.

    2016-01-01

    Reliable noise prediction capabilities are essential to enable novel fuel efficient open rotor designs that can meet the community and cabin noise standards. Toward this end, immersed boundary methods have reached a level of maturity where more and more complex flow problems can be tackled with this approach. This paper demonstrates that our higher-order immersed boundary method provides the ability for aeroacoustic analysis of wake-dominated flow fields generated by a contra-rotating open rotor. This is a first-of-a-kind aeroacoustic simulation of an open rotor propulsion system employing an immersed boundary method. In addition to discussing the methodologies of how to apply the immersed boundary method to this moving boundary problem, we provide a detailed validation of the aeroacoustic analysis approach employing the Launch Ascent and Vehicle Aerodynamics (LAVA) solver. Two free-stream Mach numbers, M=0.2 and M=0.78, are considered in this analysis, based on the nominal take-off and cruise flow conditions. The simulation data is compared to available experimental data and other computational results employing more conventional CFD methods. Spectral analysis is used to determine the dominant wave propagation pattern in the acoustic near-field.
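
    The spectral-analysis step can be illustrated with a standard power spectral density estimate on a synthetic near-field pressure trace; the 680 Hz "blade-passing" tone is a hypothetical stand-in, and this is not LAVA's actual post-processing.

    ```python
    import numpy as np
    from scipy.signal import welch

    fs = 20_000.0                  # sampling rate, Hz
    t = np.arange(0, 1.0, 1.0 / fs)
    bpf = 680.0                    # hypothetical blade-passing frequency, Hz
    # Tone buried in broadband noise, standing in for a near-field pressure probe.
    p = (np.sin(2 * np.pi * bpf * t)
         + 0.3 * np.random.default_rng(1).standard_normal(t.size))

    f, Pxx = welch(p, fs=fs, nperseg=4096)   # Welch PSD estimate
    print("dominant tone at ~%.0f Hz" % f[np.argmax(Pxx)])
    ```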

  15. The analysis of control trajectories using symbolic and database computing

    NASA Technical Reports Server (NTRS)

    Grossman, Robert

    1991-01-01

    The research broadly concerned the symbolic computation, mixed numeric-symbolic computation, and database computation of trajectories of dynamical systems, especially control systems. It was determined that trees can be used to symbolically compute series which approximate solutions to differential equations.
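
    One concrete way symbolic computation yields series approximations to ODE solutions is Picard iteration; the tree-based formalism in the report is a different (related) technique. A minimal sketch for y' = y, y(0) = 1:

    ```python
    import sympy as sp

    t, s = sp.symbols("t s")

    def picard(f, y0, iterations):
        """Iterate y_{n+1}(t) = y0 + integral_0^t f(s, y_n(s)) ds symbolically."""
        y = sp.Integer(y0)
        for _ in range(iterations):
            y = y0 + sp.integrate(f(s, y.subs(t, s)), (s, 0, t))
        return sp.expand(y)

    # For y' = y the iterates build the Taylor series of exp(t).
    print(picard(lambda s_, y_: y_, 1, 4))  # 1 + t + t**2/2 + t**3/6 + t**4/24
    ```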

  16. Gender Differences in Computer-Related Attitudes and Behavior: A Meta-Analysis.

    ERIC Educational Resources Information Center

    Whitley, Bernard E., Jr.

    1997-01-01

    A meta-analysis of studies of gender differences in computer attitudes and behavior found that males exhibited greater sex-role stereotyping of computers, higher computer self-efficacy, and more positive attitudes toward computers than females. Most differences in attitudes and behavior were small, with the largest found in high school students.…

  17. Customizable Computer-Based Interaction Analysis for Coaching and Self-Regulation in Synchronous CSCL Systems

    ERIC Educational Resources Information Center

    Lonchamp, Jacques

    2010-01-01

    Computer-based interaction analysis (IA) is an automatic process that aims at understanding a computer-mediated activity. In a CSCL system, computer-based IA can provide information directly to learners for self-assessment and regulation and to tutors for coaching support. This article proposes a customizable computer-based IA approach for a…

  18. Acromiohumeral Distance and 3-Dimensional Scapular Position Change After Overhead Muscle Fatigue

    PubMed Central

    Maenhout, Annelies; Dhooge, Famke; Van Herzeele, Maarten; Palmans, Tanneke; Cools, Ann

    2015-01-01

    Context: Muscle fatigue due to repetitive and prolonged overhead sports activity is considered an important factor contributing to impingement-related rotator cuff pathologic conditions in overhead athletes. The evidence on scapular and glenohumeral kinematic changes after fatigue is contradictory and prohibits conclusions about how shoulder muscle fatigue affects acromiohumeral distance. Objective: To investigate the effect of a fatigue protocol resembling overhead sports activity on acromiohumeral distance and 3-dimensional scapular position in overhead athletes. Design: Cross-sectional study. Setting: Institutional laboratory. Patients or Other Participants: A total of 29 healthy recreational overhead athletes (14 men, 15 women; age = 22.23 ± 2.82 years, height = 178.3 ± 7.8 cm, mass = 71.6 ± 9.5 kg). Intervention(s): The athletes were tested before and after a shoulder muscle-fatiguing protocol. Main Outcome Measure(s): Acromiohumeral distance was measured using ultrasound, and scapular position was determined with an electromagnetic motion-tracking system. Both measurements were performed at 3 elevation positions (0°, 45°, and 60° of abduction). We used a 3-factor mixed model for data analysis. Results: After fatigue, the acromiohumeral distance increased when the upper extremity was actively positioned at 45° (Δ = 0.78 ± 0.24 mm, P = .002) or 60° (Δ = 0.58 ± 0.23 mm, P = .02) of abduction. Scapular position changed after fatigue to a more externally rotated position at 45° (Δ = 4.97° ± 1.13°, P < .001) and 60° (Δ = 4.61° ± 1.90°, P = .001) of abduction, a more upwardly rotated position at 45° (Δ = 6.10° ± 1.30°, P < .001) and 60° (Δ = 7.20° ± 1.65°, P < .001) of abduction, and a more posteriorly tilted position at 0°, 45°, and 60° of abduction (Δ = 1.98° ± 0.41°, P < .001). Conclusions: After a fatiguing protocol, we found changes in acromiohumeral distance and scapular position that corresponded with an impingement

  19. Energy Sources of the Dominant Frequency Dependent 3-dimensional Atmospheric Modes

    NASA Technical Reports Server (NTRS)

    Schubert, S.

    1985-01-01

    The energy sources and sinks associated with the zonally asymmetric winter mean flow are investigated as part of an on-going study of atmospheric variability. Distinctly different horizontal structures were noted for the long, intermediate and short time scale atmospheric variations. Building on previous observations, the 3-dimensional structure of the fluctuations is investigated and the relative roles of barotropic and baroclinic terms are assessed.

  20. DETECTORS AND EXPERIMENTAL METHODS: Decay vertex reconstruction and 3-dimensional lifetime determination at BESIII

    NASA Astrophysics Data System (ADS)

    Xu, Min; He, Kang-Lin; Zhang, Zi-Ping; Wang, Yi-Fang; Bian, Jian-Ming; Cao, Guo-Fu; Cao, Xue-Xiang; Chen, Shen-Jian; Deng, Zi-Yan; Fu, Cheng-Dong; Gao, Yuan-Ning; Han, Lei; Han, Shao-Qing; He, Miao; Hu, Ji-Feng; Hu, Xiao-Wei; Huang, Bin; Huang, Xing-Tao; Jia, Lu-Kui; Ji, Xiao-Bin; Li, Hai-Bo; Li, Wei-Dong; Liang, Yu-Tie; Liu, Chun-Xiu; Liu, Huai-Min; Liu, Ying; Liu, Yong; Luo, Tao; Lü, Qi-Wen; Ma, Qiu-Mei; Ma, Xiang; Mao, Ya-Jun; Mao, Ze-Pu; Mo, Xiao-Hu; Ning, Fei-Peng; Ping, Rong-Gang; Qiu, Jin-Fa; Song, Wen-Bo; Sun, Sheng-Sen; Sun, Xiao-Dong; Sun, Yong-Zhao; Tian, Hao-Lai; Wang, Ji-Ke; Wang, Liang-Liang; Wen, Shuo-Pin; Wu, Ling-Hui; Wu, Zhi; Xie, Yu-Guang; Yan, Jie; Yan, Liang; Yao, Jian; Yuan, Chang-Zheng; Yuan, Ye; Zhang, Chang-Chun; Zhang, Jian-Yong; Zhang, Lei; Zhang, Xue-Yao; Zhang, Yao; Zheng, Yang-Heng; Zhu, Yong-Sheng; Zou, Jia-Heng

    2009-06-01

    This paper focuses mainly on the vertex reconstruction of resonance particles with a relatively long lifetime such as K0S, Λ, as well as on lifetime measurements using a 3-dimensional fit. The kinematic constraints between the production and decay vertices and the decay vertex fitting algorithm based on the least squares method are both presented. Reconstruction efficiencies including experimental resolutions are discussed. The results and systematic errors are calculated based on a Monte Carlo simulation.
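
    The least-squares vertexing idea can be sketched by finding the point that minimizes the summed squared perpendicular distances to a set of straight tracks; the real BESIII fit also propagates track errors and kinematic constraints, and the tracks below are synthetic.

    ```python
    import numpy as np

    def vertex_fit(points, dirs):
        """Solve sum_i P_i v = sum_i P_i p_i, where P_i = I - d_i d_i^T projects
        out the direction of track i (point p_i, direction d_i)."""
        A = np.zeros((3, 3))
        b = np.zeros(3)
        for p, d in zip(points, dirs):
            d = d / np.linalg.norm(d)
            P = np.eye(3) - np.outer(d, d)   # projector orthogonal to the track
            A += P
            b += P @ p
        return np.linalg.solve(A, b)

    # Three synthetic tracks through a common vertex along the coordinate axes.
    true_v = np.array([0.1, 0.2, 0.3])
    dirs = np.array([[1.0, 0.0, 0.0], [0.0, 1.0, 0.0], [0.0, 0.0, 1.0]])
    points = true_v + 5.0 * dirs             # arbitrary points along each track
    print(vertex_fit(points, dirs))          # recovers [0.1, 0.2, 0.3]
    ```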

  1. Computational Analysis of the G-III Laminar Flow Glove

    NASA Technical Reports Server (NTRS)

    Malik, Mujeeb R.; Liao, Wei; Lee-Rausch, Elizabeth M.; Li, Fei; Choudhari, Meelan M.; Chang, Chau-Lyan

    2011-01-01

    Under NASA's Environmentally Responsible Aviation Project, flight experiments are planned with the primary objective of demonstrating the Discrete Roughness Elements (DRE) technology for passive laminar flow control at chord Reynolds numbers relevant to transport aircraft. In this paper, we present a preliminary computational assessment of the Gulfstream-III (G-III) aircraft wing-glove designed to attain natural laminar flow for the leading-edge sweep angle of 34.6deg. Analysis for a flight Mach number of 0.75 shows that it should be possible to achieve natural laminar flow for twice the transition Reynolds number ever achieved at this sweep angle. However, the wing-glove needs to be redesigned to effectively demonstrate passive laminar flow control using DREs. As a by-product of the computational assessment, effect of surface curvature on stationary crossflow disturbances is found to be strongly stabilizing for the current design, and it is suggested that convex surface curvature could be used as a control parameter for natural laminar flow design, provided transition occurs via stationary crossflow disturbances.

  2. Survey of computer programs for heat transfer analysis

    NASA Technical Reports Server (NTRS)

    Noor, Ahmed K.

    1986-01-01

    An overview is given of the current capabilities of thirty-three computer programs that are used to solve heat transfer problems. The programs considered range from large general-purpose codes with broad spectrum of capabilities, large user community, and comprehensive user support (e.g., ABAQUS, ANSYS, EAL, MARC, MITAS II, MSC/NASTRAN, and SAMCEF) to the small, special-purpose codes with limited user community such as ANDES, NTEMP, TAC2D, TAC3D, TEPSA and TRUMP. The majority of the programs use either finite elements or finite differences for the spatial discretization. The capabilities of the programs are listed in tabular form followed by a summary of the major features of each program. The information presented herein is based on a questionnaire sent to the developers of each program. This information is preceded by brief background material needed for effective evaluation and use of computer programs for heat transfer analysis. The present survey is useful in the initial selection of the programs which are most suitable for a particular application. The final selection of the program to be used should, however, be based on a detailed examination of the documentation and the literature about the program.

  3. Computational Analysis on Stent Geometries in Carotid Artery: A Review

    NASA Astrophysics Data System (ADS)

    Paisal, Muhammad Sufyan Amir; Taib, Ishkrizat; Ismail, Al Emran

    2017-01-01

    This paper reviews the work done by previous researchers in order to gather information for the current study, which concerns the computational analysis of stent geometries in the carotid artery. The implantation of stents in the carotid artery has become a popular treatment for arterial diseases associated with hypertension, such as stenosis, thrombosis, atherosclerosis and embolization, reducing the rate of mortality and morbidity. For the stenting of an artery, previous researchers developed many types of mathematical models in which the physiological variables of the artery are analogized to electrical variables. Thus, computational fluid dynamics (CFD) of the artery could be performed, a method also used by previous researchers. This leads to the current study of the hemodynamic characteristics of artery stenting, such as wall shear stress (WSS) and wall shear stress gradient (WSSG). Another objective of this study is to evaluate present-day stent configurations for full optimization in reducing arterial side effects such as the restenosis rate in the weeks after stenting. The evaluation of a stent is based on decreased strut-strut intersection, decreased strut width and increased strut-strut spacing. The existing configurations of stents are good enough at widening the narrowed arterial wall, but diseases such as thrombosis still occur in the early and late stages after stent implantation. Thus, the outcome of this study is a prediction of the reduction of the restenosis rate, and the WSS distribution is expected to help classify which stent configuration is best.
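
    As a point of reference for the WSS discussion, steady Poiseuille flow in a straight vessel gives tau_w = 4*mu*Q/(pi*R^3); WSS in stented arteries in the reviewed studies comes from 3D CFD, and the values below are illustrative.

    ```python
    import math

    mu = 3.5e-3        # blood dynamic viscosity, Pa*s (typical assumed value)
    Q = 350e-6 / 60.0  # volumetric flow, m^3/s (350 mL/min, illustrative)
    R = 3.0e-3         # vessel radius, m

    tau_w = 4.0 * mu * Q / (math.pi * R**3)  # Poiseuille wall shear stress
    print(round(tau_w, 3), "Pa")             # on the order of 1 Pa
    ```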

  4. Computational Approach to Dendritic Spine Taxonomy and Shape Transition Analysis

    PubMed Central

    Bokota, Grzegorz; Magnowska, Marta; Kuśmierczyk, Tomasz; Łukasik, Michał; Roszkowska, Matylda; Plewczynski, Dariusz

    2016-01-01

    The common approach in morphological analysis of dendritic spines of mammalian neuronal cells is to categorize spines into subpopulations based on whether they are stubby, mushroom, thin, or filopodia shaped. The corresponding cellular models of synaptic plasticity, long-term potentiation, and long-term depression associate the synaptic strength with either spine enlargement or spine shrinkage. Although a variety of automatic spine segmentation and feature extraction methods were developed recently, no approaches allowing for an automatic and unbiased distinction between dendritic spine subpopulations and detailed computational models of spine behavior exist. We propose an automatic and statistically based method for the unsupervised construction of spine shape taxonomy based on arbitrary features. The taxonomy is then utilized in the newly introduced computational model of behavior, which relies on transitions between shapes. Models of different populations are compared using supplied bootstrap-based statistical tests. We compared two populations of spines at two time points. The first population was stimulated with long-term potentiation, and the other in the resting state was used as a control. The comparison of shape transition characteristics allowed us to identify the differences between population behaviors. Although some extreme changes were observed in the stimulated population, statistically significant differences were found only when whole models were compared. The source code of our software is freely available for non-commercial use. Contact: d.plewczynski@cent.uw.edu.pl. PMID:28066226
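
    The unsupervised-taxonomy idea can be sketched by fitting a mixture model in a spine-shape feature space; the features and values below are synthetic assumptions, and the paper's method additionally models transitions between shapes over time.

    ```python
    import numpy as np
    from sklearn.mixture import GaussianMixture

    # Synthetic spines in a 2-D feature space (e.g. head-to-neck ratio, length).
    rng = np.random.default_rng(0)
    mushroom = rng.normal([2.5, 1.2], 0.15, size=(40, 2))  # large head, short
    thin = rng.normal([1.2, 2.0], 0.15, size=(40, 2))      # small head, long
    X = np.vstack([mushroom, thin])

    # Let mixture components play the role of data-driven shape classes.
    gmm = GaussianMixture(n_components=2, random_state=0).fit(X)
    labels = gmm.predict(X)
    print(np.bincount(labels))  # two recovered subpopulations of 40 spines each
    ```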

  5. Equation of state and fragmentation issues in computational lethality analysis

    SciTech Connect

    Trucano, T.G.

    1993-07-01

    The purpose of this report is to summarize the status of computational analysis of hypervelocity impact lethality in relatively nontechnical terms from the perspective of the author. It is not intended to be a review of the technical literature on the problems of concern. The discussion is focused by concentrating on two phenomenology areas which are of particular concern in computational impact studies. First, the material's equation of state, specifically the treatment of expanded states of metals undergoing shock vaporization, is discussed. Second, the process of dynamic fragmentation is addressed. In both cases, the context of the discussion deals with inaccuracies and difficulties associated with numerical hypervelocity impact simulations. Laboratory experimental capabilities in hypervelocity impact for impact velocities greater than 10.0 km/s are becoming increasingly viable. This paper also gives recommendations for experimental thrusts which utilize these capabilities that will help to resolve the uncertainties in the numerical lethality studies that are pointed out in the present report.

  6. NALDA (Naval Aviation Logistics Data Analysis) CAI (computer aided instruction)

    SciTech Connect

    Handler, B.H. ); France, P.A.; Frey, S.C.; Gaubas, N.F.; Hyland, K.J.; Lindsey, A.M.; Manley, D.O. ); Hunnum, W.H. ); Smith, D.L. )

    1990-07-01

    Data Systems Engineering Organization (DSEO) personnel developed a prototype computer aided instruction (CAI) system for the Naval Aviation Logistics Data Analysis (NALDA) system. The objective of this project was to provide a CAI prototype that could be used as an enhancement to existing NALDA training. The CAI prototype project was performed in phases. The task undertaken in Phase I was to analyze the problem and the alternative solutions and to develop a set of recommendations on how best to proceed. The findings from Phase I are documented in Recommended CAI Approach for the NALDA System (Duncan et al., 1987). In Phase II, a structured design and specifications were developed, and a prototype CAI system was created. A report, NALDA CAI Prototype: Phase II Final Report, was written to record the findings and results of Phase II. NALDA CAI: Recommendations for an Advanced Instructional Model is comprised of related papers encompassing research on computer aided instruction (CAI), newly developing training technologies, instructional systems development, and an Advanced Instructional Model. These topics were selected because of their relevance to the CAI needs of NALDA. These papers provide general background information on various aspects of CAI and give a broad overview of new technologies and their impact on the future design and development of training programs. The papers within have been indexed separately elsewhere.

  7. Open Rotor Computational Aeroacoustic Analysis with an Immersed Boundary Method

    NASA Technical Reports Server (NTRS)

    Brehm, Christoph; Barad, Michael F.; Kiris, Cetin C.

    2016-01-01

    Reliable noise prediction capabilities are essential to enable novel fuel efficient open rotor designs that can meet the community and cabin noise standards. Toward this end, immersed boundary methods have reached a level of maturity such that they are frequently employed for specific real world applications within NASA. This paper demonstrates that our higher-order immersed boundary method provides the ability for aeroacoustic analysis of wake-dominated flow fields generated by highly complex geometries. This is a first-of-a-kind aeroacoustic simulation of an open rotor propulsion system employing an immersed boundary method. In addition to discussing the peculiarities of applying the immersed boundary method to this moving boundary problem, we provide a detailed aeroacoustic analysis of the noise generation mechanisms encountered in the open rotor flow. The simulation data is compared to available experimental data and other computational results employing more conventional CFD methods. The noise generation mechanisms are analyzed employing spectral analysis, proper orthogonal decomposition and the causality method.

  8. An Efficient Objective Analysis System for Parallel Computers

    NASA Technical Reports Server (NTRS)

    Stobie, J.

    1999-01-01

    A new atmospheric objective analysis system designed for parallel computers will be described. The system can produce a global analysis (on a 1 x 1 lat-lon grid with 18 levels of heights and winds and 10 levels of moisture) using 120,000 observations in 17 minutes on 32 CPUs (SGI Origin 2000). No special parallel code is needed (e.g. MPI or multitasking) and the 32 CPUs do not have to be on the same platform. The system is totally portable and can run on several different architectures at once. In addition, the system can easily scale up to 100 or more CPUs. This will allow for much higher resolution and significant increases in input data. The system scales linearly as the number of observations and the number of grid points. The cost overhead in going from 1 to 32 CPUs is 18%. In addition, the analysis results are identical regardless of the number of processors used. This system has all the characteristics of optimal interpolation, combining detailed instrument and first guess error statistics to produce the best estimate of the atmospheric state. Static tests with a 2 x 2.5 resolution version of this system showed its analysis increments are comparable to the latest NASA operational system, including maintenance of mass-wind balance. Results from several months of cycling tests in the Goddard EOS Data Assimilation System (GEOS DAS) show this new analysis retains the same level of agreement between the first guess and observations (O-F statistics) as the current operational system.
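
    The optimal-interpolation update referred to above has the standard form x_a = x_b + B H^T (H B H^T + R)^(-1) (y - H x_b). A minimal one-dimensional sketch with assumed error statistics, not the operational system's configuration:

    ```python
    import numpy as np

    n, p = 5, 2
    xb = np.zeros(n)                          # background (first guess)
    H = np.zeros((p, n))
    H[0, 1] = H[1, 3] = 1.0                   # observe grid points 1 and 3
    y = np.array([1.0, -0.5])                 # observations

    # Gaussian-shaped background error covariance, diagonal observation errors.
    dist = np.abs(np.subtract.outer(np.arange(n), np.arange(n)))
    B = np.exp(-0.5 * (dist / 1.5) ** 2)
    R = 0.25 * np.eye(p)

    K = B @ H.T @ np.linalg.inv(H @ B @ H.T + R)   # gain matrix
    xa = xb + K @ (y - H @ xb)                      # analysis
    print(np.round(xa, 3))  # increments spread to neighboring grid points
    ```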

  9. An Efficient Objective Analysis System for Parallel Computers

    NASA Technical Reports Server (NTRS)

    Stobie, James G.

    1999-01-01

    A new objective analysis system designed for parallel computers will be described. The system can produce a global analysis (on a 2 x 2.5 lat-lon grid with 20 levels of heights and winds and 10 levels of moisture) using 120,000 observations in less than 3 minutes on 32 CPUs (SGI Origin 2000). No special parallel code is needed (e.g. MPI or multitasking) and the 32 CPUs do not have to be on the same platform. The system Ls totally portable and can run on -several different architectures at once. In addition, the system can easily scale up to 100 or more CPUS. This will allow for much higher resolution and significant increases in input data. The system scales linearly as the number of observations and the number of grid points. The cost overhead in going from I to 32 CPus is 18%. in addition, the analysis results are identical regardless of the number of processors used. T'his system has all the characteristics of optimal interpolation, combining detailed instrument and first guess error statistics to produce the best estimate of the atmospheric state. It also includes a new quality control (buddy check) system. Static tests with the system showed it's analysis increments are comparable to the latest NASA operational system including maintenance of mass-wind balance. Results from a 2-month cycling test in the Goddard EOS Data Assimilation System (GEOS DAS) show this new analysis retains the same level of agreement between the first guess and observations (0-F statistics) throughout the entire two months.

  10. Computational analysis of Variable Thrust Engine (VTE) performance

    NASA Technical Reports Server (NTRS)

    Giridharan, M. G.; Krishnan, A.; Przekwas, A. J.

    1993-01-01

    The Variable Thrust Engine (VTE) of the Orbital Maneuvering Vehicle (OMV) uses a hypergolic propellant combination of Monomethyl Hydrazine (MMH) and Nitrogen Tetroxide (NTO) as fuel and oxidizer, respectively. The performance of the VTE depends on a number of complex interacting phenomena such as atomization, spray dynamics, vaporization, turbulent mixing, convective/radiative heat transfer, and hypergolic combustion. This study involved the development of a comprehensive numerical methodology to facilitate detailed analysis of the VTE. An existing Computational Fluid Dynamics (CFD) code was extensively modified to include the following models: a two-liquid, two-phase Eulerian-Lagrangian spray model; a chemical equilibrium model; and a discrete ordinate radiation heat transfer model. The modified code was used to conduct a series of simulations to assess the effects of various physical phenomena and boundary conditions on the VTE performance. The details of the models and the results of the simulations are presented.

  11. Computational analysis of methods for reduction of induced drag

    NASA Technical Reports Server (NTRS)

    Janus, J. M.; Chatterjee, Animesh; Cave, Chris

    1993-01-01

    The purpose of this effort was to perform a computational flow analysis of a design concept centered around induced drag reduction and tip-vortex energy recovery. The flow model solves the unsteady three-dimensional Euler equations, discretized as a finite-volume method, utilizing a high-resolution approximate Riemann solver for cell interface flux definitions. The numerical scheme is an approximately-factored block LU implicit Newton iterative-refinement method. Multiblock domain decomposition is used to partition the field into an ordered arrangement of blocks. Three configurations are analyzed: a baseline fuselage-wing, a fuselage-wing-nacelle, and a fuselage-wing-nacelle-propfan. Aerodynamic force coefficients, propfan performance coefficients, and flowfield maps are used to qualitatively assess design efficacy. Where appropriate, comparisons are made with available experimental data.

  12. Computational analysis of azine-N-oxides as energetic materials

    SciTech Connect

    Ritchie, J.P.

    1994-05-01

    A BKW equation of state in a 1-dimensional hydrodynamic simulation of the cylinder test can be used to estimate the performance of explosives. Using this approach, the novel explosive 1,4-diamino-2,3,5,6-tetrazine-2,5-dioxide (TZX) was analyzed. Despite a high detonation velocity and a predicted CJ pressure comparable to that of RDX, TZX performs relatively poorly in the cylinder test. Theoretical and computational analysis shows this to be the result of a low heat of detonation. A conceptual strategy is proposed to remedy this problem. In order to predict the required heats of formation, new ab initio group equivalents were developed. Crystal structure calculations are also described that show hydrogen-bonding is important in determining the density of TZX and related compounds.

  13. A Computer Program for Statistically-Based Decision Analysis

    PubMed Central

    Polaschek, Jeanette X.; Lenert, Leslie A.; Garber, Alan M.

    1990-01-01

    The majority of patients with coronary artery disease do not fall into the well defined populations from randomized clinical trials. Observational databases contain a rich source of information that could be used by practicing physicians to evaluate treatment alternatives for their patients. We describe a computer system, the CABG Kibitzer, which uses an integrated approach to evaluate the treatment alternatives for CAD patients. We combine a statistical multivariate model for calculating survival advantages with decision analysis (DA) techniques for assessing patient preferences and sensitivity analysis, to create one tool that physicians find easy to use in daily clinical practice. The development of tools of this kind is a necessary step in making the data of outcome studies accessible to practicing physicians.

  14. Optimal low thrust geocentric transfer. [mission analysis computer program

    NASA Technical Reports Server (NTRS)

    Edelbaum, T. N.; Sackett, L. L.; Malchow, H. L.

    1973-01-01

    A computer code which will rapidly calculate time-optimal low thrust transfers is being developed as a mission analysis tool. The final program will apply to NEP or SEP missions and will include a variety of environmental effects. The current program assumes constant acceleration. The oblateness effect and shadowing may be included. Detailed state and costate equations are given for the thrust effect, oblateness effect, and shadowing. A simple but adequate model yields analytical formulas for power degradation due to the Van Allen radiation belts for SEP missions. The program avoids the classical singularities by the use of equinoctial orbital elements. Kryloff-Bogoliuboff averaging is used to facilitate rapid calculation. Results for selected cases using the current program are given.
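
    The equinoctial elements mentioned above remove the singularities of the classical elements at zero eccentricity and zero inclination. A minimal sketch of the standard direct conversion (still singular at i = 180 deg), with an illustrative orbit:

    ```python
    import math

    def keplerian_to_equinoctial(a, e, i, raan, argp, nu):
        """Convert classical elements (angles in radians) to equinoctial elements."""
        p = a * (1.0 - e**2)                   # semi-latus rectum
        f = e * math.cos(argp + raan)
        g = e * math.sin(argp + raan)
        h = math.tan(i / 2.0) * math.cos(raan)
        k = math.tan(i / 2.0) * math.sin(raan)
        L = raan + argp + nu                   # true longitude
        return p, f, g, h, k, L

    # Illustrative low-Earth orbit (a in km).
    print(keplerian_to_equinoctial(a=7000.0, e=0.01, i=math.radians(28.5),
                                   raan=math.radians(40.0),
                                   argp=math.radians(60.0),
                                   nu=math.radians(10.0)))
    ```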

  15. Computational analysis of the SSME fuel preburner flow

    NASA Technical Reports Server (NTRS)

    Wang, T. S.; Farmer, R. C.

    1986-01-01

    A computational fluid dynamics model which simulates the steady state operation of the SSME fuel preburner is developed. Specifically, the model will be used to quantify the flow factors which cause local hot spots in the fuel preburner in order to recommend experiments whereby the control of undesirable flow features can be demonstrated. The results of a two year effort to model the preburner are presented. In this effort to investigate the fuel preburner flowfield, the appropriate transport equations were numerically solved for both an axisymmetric and a three-dimensional configuration. Continuum's VAST (Variational Solution of the Transport equations) code, in conjunction with the CM-1000 Engineering Analysis Workstation and the NASA/Ames CYBER 205, was used to perform the required calculations. It is concluded that the preburner operational anomalies are not due to steady state phenomena and must, therefore, be related to transient operational procedures.

  16. Meta-Analysis and Computer-Mediated Communication.

    PubMed

    Taylor, Alan M

    2016-04-01

    Because of the use of human participants and differing contextual variables, research in second language acquisition often produces conflicting results, leaving practitioners confused and unsure of the effectiveness of specific treatments. This article provides insight into a recent seminal meta-analysis on the effectiveness of computer-mediated communication, providing further statistical evidence of the importance of its results. The significance of the study is examined by looking at the p values included in the references, to demonstrate how results can easily be misconstrued by practitioners and researchers. Lin's conclusion regarding the research setting of the study reports is also evaluated. In doing so, other possible explanations of what may be influencing the results can be proposed.

  17. Data analysis using the Gnu R system for statistical computation

    SciTech Connect

    Simone, James; /Fermilab

    2011-07-01

    R is a language system for statistical computation. It is widely used in statistics, bioinformatics, machine learning, data mining, quantitative finance, and the analysis of clinical drug trials. Among the advantages of R: it has become the standard language for developing statistical techniques; it is actively developed by a large and growing global user community; it is open-source software; it is highly portable (Linux, OS X, and Windows); it has a built-in documentation system; it produces high-quality graphics; and it is easily extensible, with over four thousand extension packages available covering statistics and applications. This report gives a very brief introduction to R with some examples using lattice QCD simulation results. It then discusses the development of R packages designed for chi-square minimization fits of lattice n-point correlation functions.
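
    The report's fitting task is chi-square minimization of exponential models for correlation functions. For consistency with the other sketches in this listing the illustration below is written in Python rather than R, with synthetic data standing in for lattice results.

```python
import numpy as np
from scipy.optimize import curve_fit

def corr(t, amplitude, mass):
    """Single-exponential model for a two-point correlator."""
    return amplitude * np.exp(-mass * t)

t = np.arange(1, 16)
rng = np.random.default_rng(1)
sigma = 0.02 * corr(t, 1.2, 0.45)          # synthetic uncertainties
data = corr(t, 1.2, 0.45) + rng.normal(0.0, sigma)

# curve_fit with absolute_sigma minimizes chi^2 = sum(((data - model)/sigma)^2)
popt, pcov = curve_fit(corr, t, data, p0=[1.0, 0.5],
                       sigma=sigma, absolute_sigma=True)
print("fit:", popt, "1-sigma errors:", np.sqrt(np.diag(pcov)))
```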

  18. Satellite Interference Analysis and Simulation Using Personal Computers

    NASA Technical Reports Server (NTRS)

    Kantak, Anil

    1988-01-01

    This report presents the complete analysis and formulas necessary to quantify the interference experienced by a generic satellite communications receiving station due to an interfering satellite. Both satellites, the desired as well as the interfering one, are considered to be in elliptical orbits. Formulas are developed for the satellite look angles and the satellite transmit angles, generally related to the land mask of the receiving station site, for both satellites. Formulas for the Doppler effect due to satellite motion as well as the Earth's rotation are developed. The effects of the interfering-satellite signal modulation and of the Doppler shift on the power received are considered. The statistical formulation of the interference effect is presented in the form of a histogram of the interference-to-desired-signal power ratio. Finally, a computer program suitable for microcomputers such as the IBM AT is provided with a flowchart, a sample run, results of the run, and the program code.
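
    To first order, the Doppler portion of such an analysis reduces to the range rate between the two antennas. A minimal sketch under that first-order assumption (positions and velocities in a common Earth-centered inertial frame, so Earth rotation enters through the station velocity):

```python
import numpy as np

C = 299_792_458.0  # speed of light, m/s

def received_frequency(f_tx, r_sat, v_sat, r_station, v_station):
    """First-order Doppler: f_rx = f_tx * (1 - v_r / c), where v_r is the
    range rate between satellite and station (positive when receding)."""
    r_rel = np.asarray(r_sat, float) - np.asarray(r_station, float)
    v_rel = np.asarray(v_sat, float) - np.asarray(v_station, float)
    range_rate = r_rel.dot(v_rel) / np.linalg.norm(r_rel)
    return f_tx * (1.0 - range_rate / C)
```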

  19. Computational Methods for Sensitivity and Uncertainty Analysis in Criticality Safety

    SciTech Connect

    Broadhead, B.L.; Childs, R.L.; Rearden, B.T.

    1999-09-20

    Interest in the sensitivity methods that were developed and widely used in the 1970s (the FORSS methodology at ORNL among others) has increased recently as a result of potential use in the area of criticality safety data validation procedures to define computational bias, uncertainties and area(s) of applicability. Functional forms of the resulting sensitivity coefficients can be used as formal parameters in the determination of applicability of benchmark experiments to their corresponding industrial application areas. In order for these techniques to be generally useful to the criticality safety practitioner, the procedures governing their use had to be updated and simplified. This paper will describe the resulting sensitivity analysis tools that have been generated for potential use by the criticality safety community.
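
    The sensitivity coefficients in question are relative derivatives of k-eff with respect to cross-section data, S = (dk/k)/(dσ/σ). Production tools such as the FORSS methodology obtain them via adjoint-based perturbation theory; the brute-force finite-difference sketch below merely illustrates the definition, with `k_of_sigma` standing in for a full criticality calculation.

```python
def sensitivity_coefficient(k_of_sigma, sigma, rel_step=0.01):
    """Central-difference estimate of S = (dk/k) / (dsigma/sigma) for one
    cross-section parameter; k_of_sigma wraps a transport calculation."""
    d_sigma = rel_step * sigma
    k_plus = k_of_sigma(sigma + d_sigma)
    k_minus = k_of_sigma(sigma - d_sigma)
    k_0 = k_of_sigma(sigma)
    return ((k_plus - k_minus) / (2.0 * d_sigma)) * (sigma / k_0)
```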

  20. Triclosan Computational Conformational Chemistry Analysis for Antimicrobial Properties in Polymers.

    PubMed

    Petersen, Richard C

    2015-03-01

    Triclosan is a diphenyl ether antimicrobial that has been analyzed by computational conformational chemistry toward an understanding of Mechanomolecular Theory. Energy profile analysis, combined with easily visualized three-dimensional chemical structure models of the nonpolar molecule Triclosan, shows how single-bond rotations can alternate rapidly at a polar/nonpolar interface. Rotations of the bonds at the central ether oxygen atom joining the two aromatic rings expose or hide the oxygen atom's nonbonding lone-pair electrons, depending on the polar nature of the immediate local molecular environment. These rapid bond movements can produce fluctuations expressed as vibrational energy. Consequently, the related mechanical molecular movements, calculated as energy relationships for forces acting through different bond positions, can help improve current Mechanomolecular Theory. A controversy reported as a discrepancy in the literature contends that bacterial resistance to the Triclosan antimicrobial is possible. However, clinical findings, carefully documented in government reports, have not identified a single case of bacterial resistance to Triclosan in over 40 years. As a result, Triclosan is recommended whenever there is a health benefit, consistent with a number of approvals for the use of Triclosan in healthcare devices. Since Triclosan is the most researched antimicrobial ever, literature meta-analysis with computational chemistry can best describe new molecular conditions that were previously impossible to address by conventional chemistry methods. Triclosan vibrational energy can now explain the molecular disruption of bacterial membranes. Further, Triclosan mechanomolecular movements help illustrate its use in polymer matrix composites as an antimicrobial with two new additive properties: a toughening agent to improve matrix fracture toughness against microcracking, and a hydrophobic wetting agent to help incorporate strengthening fibers. Interrelated

  2. Removal of Supernumerary Teeth Utilizing a Computer-Aided Design/Computer-Aided Manufacturing Surgical Guide.

    PubMed

    Jo, Chanwoo; Bae, Doohwan; Choi, Byungho; Kim, Jihun

    2016-11-12

    Supernumerary teeth need to be removed because they can cause various complications. Caution is needed because their removal can damage permanent teeth or tooth germs in the local vicinity. Surgical guides have recently been used in maxillofacial surgery. Because surgical guides are designed through preoperative analysis with computer-aided design software and fabricated on a 3-dimensional printer using computer-aided manufacturing technology, they increase the accuracy and predictability of surgery. This report describes 2 cases of mesiodens removal (1 in a child and 1 in an adolescent) using a surgical guide; these teeth would have been difficult to remove with conventional surgical methods.

  3. Computer-assisted sperm analysis (CASA): capabilities and potential developments.

    PubMed

    Amann, Rupert P; Waberski, Dagmar

    2014-01-01

    Computer-assisted sperm analysis (CASA) systems have evolved over approximately 40 years, through advances in devices to capture the image from a microscope, huge increases in computational power concurrent with amazing reduction in size of computers, new computer languages, and updated/expanded software algorithms. Remarkably, basic concepts for identifying sperm and their motion patterns are little changed. Older and slower systems remain in use. Most major spermatology laboratories and semen processing facilities have a CASA system, but the extent of reliance thereon ranges widely. This review describes capabilities and limitations of present CASA technology used with boar, bull, and stallion sperm, followed by possible future developments. Each marketed system is different. Modern CASA systems can automatically view multiple fields in a shallow specimen chamber to capture strobe-like images of 500 to >2000 sperm, at 50 or 60 frames per second, in clear or complex extenders, and in <2 minutes, store information for ≥ 30 frames and provide summary data for each spermatozoon and the population. A few systems evaluate sperm morphology concurrent with motion. CASA cannot accurately predict 'fertility' that will be obtained with a semen sample or subject. However, when carefully validated, current CASA systems provide information important for quality assurance of semen planned for marketing, and for the understanding of the diversity of sperm responses to changes in the microenvironment in research. The four take-home messages from this review are: (1) animal species, extender or medium, specimen chamber, intensity of illumination, imaging hardware and software, instrument settings, technician, etc., all affect accuracy and precision of output values; (2) semen production facilities probably do not need a substantially different CASA system whereas biology laboratories would benefit from systems capable of imaging and tracking sperm in deep chambers for a flexible
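
    Although the abstract does not spell them out, the standard kinematic parameters a CASA system reports per spermatozoon are straightforward to compute from a centroid track sampled at the frame rates quoted above. A minimal sketch for three of them:

```python
import numpy as np

def casa_kinematics(track_xy, frames_per_second):
    """Standard CASA motion parameters from one sperm's centroid track,
    given as an (N, 2) array of positions in micrometers, one row per frame."""
    track = np.asarray(track_xy, dtype=float)
    duration = (len(track) - 1) / frames_per_second
    steps = np.linalg.norm(np.diff(track, axis=0), axis=1)
    vcl = steps.sum() / duration                           # curvilinear velocity
    vsl = np.linalg.norm(track[-1] - track[0]) / duration  # straight-line velocity
    lin = vsl / vcl if vcl > 0 else 0.0                    # linearity, VSL/VCL
    return vcl, vsl, lin
```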

  4. A Computer-Assisted Language Analysis System (CALAS) and Its Applications.

    ERIC Educational Resources Information Center

    Pepinsky, Harold B.

    A Computer-Assisted Language Analysis System (CALAS) was developed as a syntactic and semantic analyzer of machine readable text in English. CALAS includes a set of computer programs, an algorithm for implementation, and human editors who assist the computer and its programmer in the processing of data. Data analysis is accomplished in three…

  5. Summary of research in applied mathematics, numerical analysis, and computer sciences

    NASA Technical Reports Server (NTRS)

    1986-01-01

    The major categories of current ICASE research programs addressed include: numerical methods, with particular emphasis on the development and analysis of basic numerical algorithms; control and parameter identification problems, with emphasis on effective numerical methods; computational problems in engineering and physical sciences, particularly fluid dynamics, acoustics, and structural analysis; and computer systems and software, especially vector and parallel computers.

  6. The role of computed tomography in terminal ballistic analysis.

    PubMed

    Rutty, G N; Boyce, P; Robinson, C E; Jeffery, A J; Morgan, B

    2008-01-01

    Terminal ballistics concerns the science of projectile behaviour within a target and includes wound ballistics, which considers what happens when a projectile strikes a living being. A number of soft tissue ballistic simulants have been used to assess the damage to tissue caused by projectiles. Standard assessment of these materials, such as ballistic soap or ordnance gelatine, requires that the block be opened or that a mould be made to visualize the wound track. This is time consuming and may affect the accuracy of the findings, especially if the block dries and alters shape during the process. Therefore, accurate numerical analysis of the permanent or temporary cavity is limited. Computed tomography (CT) potentially offers a quicker, non-invasive analysis tool for this task. Four commercially purchased ballistic glycerine soap blocks were used. Each had a single firearm discharged into it from a distance of approximately 15 cm, using both gunshot and shotgun projectiles. After discharge, each block was imaged by a modern 16-slice multi-detector CT scanner and analysed using 3-D reconstruction software. Using the anterior-posterior and lateral scout views and the multi-plane reconstructed images, it was possible to visualize the temporary cavity, as well as the fragmentation and dispersal pattern of the projectiles, and the distance travelled and angle of dispersal within the block of each projectile or fragment. A virtual cast of the temporary cavity can also be made. Multi-detector CT with 3-D analysis software is shown to create a reliable permanent record of the projectile path, allowing rapid analysis of different firearms and projectiles.

  7. Methodology for Benefit Analysis of CAD/CAM (Computer-Aided Design/Computer-Aided Manufacturing) in USN Shipyards.

    DTIC Science & Technology

    1984-03-01

    Thesis by Richard B. Grahlman, Naval Postgraduate School, Monterey, California, March 1984.

  8. Quantitative Computed Tomography and Image Analysis for Advanced Muscle Assessment

    PubMed Central

    Edmunds, Kyle Joseph; Gíslason, Magnus K.; Arnadottir, Iris D.; Marcante, Andrea; Piccione, Francesco; Gargiulo, Paolo

    2016-01-01

    Medical imaging is of particular interest in the field of translational myology, as the extant literature describes the use of a wide variety of techniques to non-invasively recapitulate and quantify various internal and external tissue morphologies. In the clinical context, medical imaging remains a vital tool for diagnostics and investigative assessment. This review outlines the results from several investigations on the use of computed tomography (CT) and image analysis techniques to assess muscle condition and degenerative processes due to aging or pathology. Herein, we detail the acquisition of spiral CT images and the use of advanced image analysis tools to characterize muscles in 2D and 3D. Results from these studies recapitulate changes in tissue composition within muscles, as visualized by the association of tissue types with specified Hounsfield Unit (HU) values for fat, loose connective tissue or atrophic muscle, and normal muscle, including fascia and tendon. We show how results from these analyses can be presented both as average HU values and as compositions with respect to total muscle volume, demonstrating the reliability of these tools to monitor, assess and characterize muscle degeneration. PMID:27478562
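
    A minimal sketch of the HU-window composition idea follows. The windows below are illustrative placeholders only; the thresholds actually used should be taken from the reviewed studies.

```python
import numpy as np

# Illustrative HU windows, not the values from the reviewed studies
HU_WINDOWS = {
    "fat": (-200, -10),
    "loose connective / atrophic muscle": (-9, 40),
    "normal muscle": (41, 200),
}

def tissue_composition(hu_voxels):
    """Fraction of muscle-compartment voxels falling in each HU window."""
    hu = np.asarray(hu_voxels)
    return {name: np.count_nonzero((hu >= lo) & (hu <= hi)) / hu.size
            for name, (lo, hi) in HU_WINDOWS.items()}
```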

  9. Variance Analysis and Comparison in Computer-Aided Design

    NASA Astrophysics Data System (ADS)

    Ullrich, T.; Schiffer, T.; Schinko, C.; Fellner, D. W.

    2011-09-01

    The need to analyze and visualize differences between very similar objects arises in many research areas: mesh compression, scan alignment, nominal/actual value comparison, quality management, and surface reconstruction, to name a few. In computer graphics, for example, differences of surfaces are used for analyzing mesh processing algorithms such as mesh compression. They are also used to validate reconstruction and fitting results for laser-scanned surfaces. As laser scanning has become very important for the acquisition and preservation of artifacts, scanned representations are used for documentation as well as analysis of ancient objects. Detailed mesh comparisons can reveal the smallest changes and damages. These analysis and documentation tasks are needed not only in the context of cultural heritage but also in engineering and manufacturing, where differences of surfaces are analyzed to check production quality. Our contribution to this problem is a workflow that compares a reference (nominal) surface with an actual, laser-scanned data set. The reference surface is a procedural model whose accuracy and systematics describe the semantic properties of an object, whereas the laser-scanned object is a real-world data set without any additional semantic information.
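
    A common core of such nominal/actual comparison workflows is a nearest-neighbor distance query between the scan and the reference. The sketch below uses point-to-point distances as a simple stand-in for true point-to-surface distances, an approximation that coarsens as the reference sampling gets sparser.

```python
import numpy as np
from scipy.spatial import cKDTree

def surface_deviation(reference_pts, scanned_pts):
    """Distance from each scanned point to its nearest reference point:
    a simple stand-in for nominal/actual surface comparison."""
    tree = cKDTree(np.asarray(reference_pts, dtype=float))
    dists, _ = tree.query(np.asarray(scanned_pts, dtype=float))
    return dists

# Summaries such as dists.mean() or np.percentile(dists, 95) serve as quality
# metrics; per-point values are usually shown as a color map on the scan.
```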

  10. The interactome of CCT complex - A computational analysis.

    PubMed

    Narayanan, Aswathy; Pullepu, Dileep; Kabir, M Anaul

    2016-10-01

    The eukaryotic chaperonin CCT (Chaperonin Containing TCP-1, also called TRiC, the TCP-1 Ring Complex) has been subjected to physical and genetic analyses in S. cerevisiae that can be extrapolated to human CCT (hCCT), owing to its structural and functional similarities with yeast CCT (yCCT). Studies of hCCT and its interactome acquire an additional dimension, as the complex has been implicated in several disease conditions such as neurodegeneration and cancer. We attempt to study its stress response role in general, reflected in aspects of human disease and yeast physiology, through computational analysis of the interactome. Towards consolidating and analysing the interactome data, we prepared and compared lists of unique CCT-interacting proteins for S. cerevisiae and H. sapiens and performed GO term classification and enrichment studies, which characterize the diversity of the CCT interactome in terms of the protein classes in the data set. Enrichment with disease-associated proteins and pathways highlights the medical importance of CCT. Different analyses converge in suggesting the significance of WD-repeat proteins, protein kinases and cytoskeletal proteins in the interactome. The prevalence of proteasomal subunits and ribosomal proteins suggests a possible cross-talk between the protein-synthesis, folding and degradation machinery. A network of chaperones and chaperonins that function in combination can also be envisaged from a comparison of the CCT and Hsp70 interactomes.
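
    GO term enrichment of the kind used here typically reduces to a hypergeometric over-representation test per term. A minimal sketch, with invented counts:

```python
from scipy.stats import hypergeom

def go_enrichment_p(total_genes, term_genes, sample_size, term_in_sample):
    """P(observing >= term_in_sample annotated genes among the interactors
    by chance): the standard hypergeometric over-representation test."""
    return hypergeom.sf(term_in_sample - 1, total_genes, term_genes, sample_size)

# e.g. 6000 yeast genes, 180 annotated with some GO term, 250 CCT interactors,
# 35 of them carrying the annotation (illustrative numbers only)
print(go_enrichment_p(6000, 180, 250, 35))
```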

  11. Consequence analysis in LPG installation using an integrated computer package.

    PubMed

    Ditali, S; Colombi, M; Moreschini, G; Senni, S

    2000-01-07

    This paper presents the prototype of the computer code Atlantide, developed to assess the consequences associated with accidental events that can occur in an LPG storage plant. Atlantide is designed to be simple yet adequate for the consequence analyses required by Italian legislation implementing the Seveso Directive. Atlantide is appropriate for LPG storage and transfer installations. The models and correlations implemented in the code cover flashing liquid releases, heavy gas dispersion and other typical phenomena such as BLEVE/fireball. On the basis of the operating and design characteristics, the code allows the study of the relevant accidental events, from the evaluation of the release rate (liquid, gaseous and two-phase) in the unit involved, through the analysis of the subsequent evaporation and dispersion, up to the assessment of the final fire and explosion phenomena. This is done with reference to simplified event trees that describe the evolution of accidental scenarios, taking into account the most likely meteorological conditions, the different release situations and other features typical of an LPG installation. The limited input data required, and the automatic linking between the individual models, which are activated in a defined sequence depending on the accidental event selected, minimize both the time required for the risk analysis and the possibility of errors. Models and equations implemented in Atlantide have been selected from the public literature or in-house developed software and tailored to be easy to use and fast to run while nevertheless providing realistic simulation of the accidental event as well as reliable results, in terms of physical effects and hazardous areas. The results have been compared with those of other internationally recognized codes and with the criteria adopted by Italian authorities to verify the Safety Reports for LPG

  12. Computational and Statistical Analysis of Protein Mass Spectrometry Data

    PubMed Central

    Noble, William Stafford; MacCoss, Michael J.

    2012-01-01

    High-throughput proteomics experiments involving tandem mass spectrometry produce large volumes of complex data that require sophisticated computational analyses. As such, the field offers many challenges for computational biologists. In this article, we briefly introduce some of the core computational and statistical problems in the field and then describe a variety of outstanding problems that readers of PLoS Computational Biology might be able to help solve. PMID:22291580

  13. Computational Analysis of the Hypothalamic Control of Food Intake

    PubMed Central

    Tabe-Bordbar, Shayan; Anastasio, Thomas J.

    2016-01-01

    Food-intake control is mediated by a heterogeneous network of different neural subtypes, distributed over various hypothalamic nuclei and other brain structures, in which each subtype can release more than one neurotransmitter or neurohormone. The complexity of the interactions of these subtypes poses a challenge to understanding their specific contributions to food-intake control, and apparent consistencies in the dataset can be contradicted by new findings. For example, the growing consensus that arcuate nucleus neurons expressing Agouti-related peptide (AgRP neurons) promote feeding, while those expressing pro-opiomelanocortin (POMC neurons) suppress feeding, is contradicted by findings that low AgRP neuron activity and high POMC neuron activity can be associated with high levels of food intake. Similarly, the growing consensus that GABAergic neurons in the lateral hypothalamus suppress feeding is contradicted by findings suggesting the opposite. Yet the complexity of the food-intake control network admits many different network behaviors. It is possible that anomalous associations between the responses of certain neural subtypes and feeding are actually consistent with known interactions, but their effect on feeding depends on the responses of the other neural subtypes in the network. We explored this possibility through computational analysis. We made a computer model of the interactions between the hypothalamic and other neural subtypes known to be involved in food-intake control, and optimized its parameters so that model behavior matched observed behavior over an extensive test battery. We then used specialized computational techniques to search the entire model state space, where each state represents a different configuration of the responses of the units (model neural subtypes) in the network. We found that the anomalous associations between the responses of certain hypothalamic neural subtypes and feeding are actually consistent with the known structure

  14. Pulmonary Toxicity in Stage III Non-Small Cell Lung Cancer Patients Treated With High-Dose (74 Gy) 3-Dimensional Conformal Thoracic Radiotherapy and Concurrent Chemotherapy Following Induction Chemotherapy: A Secondary Analysis of Cancer and Leukemia Group B (CALGB) Trial 30105

    SciTech Connect

    Salama, Joseph K.; Stinchcombe, Thomas E.; Gu Lin; Wang Xiaofei; Morano, Karen; Bogart, Jeffrey A.; Crawford, Jeffrey C.; Socinski, Mark A.; Blackstock, A. William; Vokes, Everett E.

    2011-11-15

    Purpose: Cancer and Leukemia Group B (CALGB) 30105 tested two different concurrent chemoradiotherapy platforms with high-dose (74 Gy) three-dimensional conformal radiotherapy (3D-CRT) after two cycles of induction chemotherapy for Stage IIIA/IIIB non-small cell lung cancer (NSCLC) patients to determine if either could achieve a primary endpoint of >18-month median survival. Final results of 30105 demonstrated that induction carboplatin and gemcitabine with concurrent gemcitabine 3D-CRT was not feasible because of treatment-related toxicity. However, induction and concurrent carboplatin/paclitaxel with 74 Gy 3D-CRT had a median survival of 24 months, and is the basis for the experimental arm in CALGB 30610/RTOG 0617/N0628. We conducted a secondary analysis of all patients to determine predictors of treatment-related pulmonary toxicity. Methods and Materials: Patient, tumor, and treatment-related variables were analyzed to determine their relation with treatment-related pulmonary toxicity. Results: Older age, higher N stage, larger planning target volume (PTV1), smaller total lung volume/PTV1 ratio, larger V20, and larger mean lung dose were associated with increasing pulmonary toxicity on univariate analysis. Multivariate analysis confirmed that V20, nodal stage, and treatment with concurrent gemcitabine were associated with treatment-related toxicity. A high-risk group comprising patients with N3 disease and V20 >38% accounted for 80% of Grade 3-5 pulmonary toxicity cases. Conclusions: Elevated V20 and N3 disease status are important predictors of treatment-related pulmonary toxicity in patients treated with high-dose 3D-CRT and concurrent chemotherapy. Further studies may use these metrics in considering patients for these treatments.
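
    V20 and mean lung dose are straightforward to compute once per-voxel lung doses are exported from the treatment planning system. A minimal sketch, including the high-risk rule reported above (N3 disease with V20 > 38%):

```python
import numpy as np

def lung_dose_metrics(lung_dose_gy):
    """V20 (% of lung volume receiving >= 20 Gy) and mean lung dose, from a
    flat array of per-voxel doses inside the lung contour (uniform voxels)."""
    dose = np.asarray(lung_dose_gy, dtype=float)
    v20 = 100.0 * np.count_nonzero(dose >= 20.0) / dose.size
    return v20, dose.mean()

def high_risk(n_stage, v20):
    """High-risk group identified in the secondary analysis."""
    return n_stage == "N3" and v20 > 38.0
```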

  15. Computer based imaging and analysis of root gravitropism

    NASA Technical Reports Server (NTRS)

    Evans, M. L.; Ishikawa, H.

    1997-01-01

    Two key issues in studies of the nature of the gravitropic response in roots have been the determination of the precise pattern of differential elongation responsible for downward bending and the identification of the cells that show the initial motor response. The main approach for examining patterns of differential growth during root gravitropic curvature has been to apply markers to the root surface and photograph the root at regular intervals during gravitropic curvature. Although these studies have provided valuable information on the characteristics of the gravitropic motor response in roots, their labor intensive nature limits sample size and discourages both high frequency of sampling and depth of analysis of surface expansion data. In this brief review we describe the development of computer-based video analysis systems for automated measurement of root growth and shape change and discuss some key features of the root gravitropic response that have been revealed using this methodology. We summarize the capabilities of several new pieces of software designed to measure growth and shape changes in graviresponding roots and describe recent progress in developing analysis systems for studying the small, but experimentally popular, primary roots of Arabidopsis. A key finding revealed by such studies is that the initial gravitropic response of roots of maize and Arabidopsis occurs in the distal elongation zone (DEZ) near the root apical meristem, not in the main elongation zone. Another finding is that the initiation of rapid elongation in the DEZ following gravistimulation appears to be related to rapid membrane potential changes in this region of the root. These observations have provided the incentive for ongoing studies examining possible links between potential growth modifying factors (auxin, calcium, protons) and gravistimulated changes in membrane potential and growth patterns in the DEZ.

  16. Trident: scalable compute archives: workflows, visualization, and analysis

    NASA Astrophysics Data System (ADS)

    Gopu, Arvind; Hayashi, Soichi; Young, Michael D.; Kotulla, Ralf; Henschel, Robert; Harbeck, Daniel

    2016-08-01

    The Astronomy scientific community has embraced Big Data processing challenges, e.g. those associated with time-domain astronomy, and come up with a variety of novel and efficient data processing solutions. However, data processing is only a small part of the Big Data challenge. Efficient knowledge discovery and scientific advancement in the Big Data era require new and equally efficient tools: modern user interfaces for searching, identifying and viewing data online without direct access to the data; tracking of data provenance; searching, plotting and analyzing metadata; interactive visual analysis, especially of (time-dependent) image data; and the ability to execute pipelines on supercomputing and cloud resources with minimal user overhead or expertise, even for novice computing users. The Trident project at Indiana University offers a comprehensive web and cloud-based microservice software suite that enables the straightforward deployment of highly customized Scalable Compute Archive (SCA) systems, including extensive visualization and analysis capabilities, with a minimal amount of additional coding. Trident seamlessly scales up or down in terms of data volumes and computational needs, and allows feature sets within a web user interface to be quickly adapted to meet individual project requirements. Domain experts only have to provide code or business logic about handling/visualizing their domain's data products and about executing their pipelines and application workflows. Trident's microservices architecture is made up of light-weight services connected by a REST API and/or a message bus; web interface elements are built using the NodeJS, AngularJS, and HighCharts JavaScript libraries among others, while backend services are written in NodeJS, PHP/Zend, and Python. The software suite currently consists of (1) a simple workflow execution framework to integrate, deploy, and execute pipelines and applications, (2) a progress service to monitor workflows and sub

  17. Graphics Flutter Analysis Methods, an interactive computing system at Lockheed-California Company

    NASA Technical Reports Server (NTRS)

    Radovcich, N. A.

    1975-01-01

    An interactive computer graphics system, Graphics Flutter Analysis Methods (GFAM), was developed to complement FAMAS, a matrix-oriented batch computing system, and other computer programs in performing complex numerical calculations using a fully integrated data management system. GFAM has many of the matrix operation capabilities found in FAMAS, but on a smaller scale, and is utilized when the analysis requires a high degree of interaction between the engineer and computer, and schedule constraints exclude the use of batch entry programs. Applications of GFAM to a variety of preliminary design, development design, and project modification programs suggest that interactive flutter analysis using matrix representations is a feasible and cost effective computing tool.

  18. Computational Analysis of a Prototype Martian Rotorcraft Experiment

    NASA Technical Reports Server (NTRS)

    Corfeld, Kelly J.; Strawn, Roger C.; Long, Lyle N.

    2002-01-01

    This paper presents Reynolds-averaged Navier-Stokes calculations for a prototype Martian rotorcraft. The computations are intended for comparison with an ongoing Mars rotor hover test at NASA Ames Research Center. These computational simulations present a new and challenging problem, since rotors that operate on Mars will experience a unique low Reynolds number and high Mach number environment. Computed results for the 3-D rotor differ substantially from 2-D sectional computations in that the 3-D results exhibit a stall delay phenomenon caused by rotational forces along the blade span. Computational results have yet to be compared to experimental data, but computed performance predictions match the experimental design goals fairly well. In addition, the computed results provide a high level of detail in the rotor wake and blade surface aerodynamics. These details provide an important supplement to the expected experimental performance data.

  20. Is a 3-Dimensional Stress Balance Ice-Stream Model Really Better Than a 2-Dimensional "Reduced Order" Ice-Stream Model?

    NASA Astrophysics Data System (ADS)

    Sergienko, O.; Macayeal, D. R.

    2007-12-01

    With growing observational awareness of numerous ice-stream processes occurring on short time and spatial scales, e.g., sub-ice-stream lake volume changes and grounding-line sediment wedge build-up, the question of how well models based on "reduced-order" dynamics can simulate ice-stream behavior becomes paramount. Reduced-order models of ice streams are typically 2-dimensional and capture only the largest-magnitude terms in the stress tensor (with other terms being constrained by various assumptions). In predicting the overall magnitude and large-scale pattern of ice-stream flow, the reduced-order models appear to be adequate. Efforts underway in the glaciological community to create 3-dimensional models of the "full" ice-stream stress balance, which relax the assumptions associated with reduced-order models, suggest that a cost/benefit analysis should be done to determine how likely it is that these efforts will be fruitful. To assess the overall benefits of full 3-dimensional models relative to their simpler 2-dimensional counterparts, we present model solutions of the full Stokes equations for ice-stream flow over a variety of basal perturbations (e.g., a sticky spot, a subglacial lake, a grounding line). We also present the solutions derived from reduced 2-dimensional models, and compare the two to estimate the effects of the simplifications and neglected terms, as well as to advise on the circumstances under which 3-dimensional models are preferable to 2-dimensional models.

  1. EVA worksite analysis--use of computer analysis for EVA operations development and execution.

    PubMed

    Anderson, D

    1999-01-01

    To sustain the rate of extravehicular activity (EVA) required to assemble and maintain the International Space Station, we must enhance our ability to plan, train for, and execute EVAs. An underlying analysis capability has been developed to ensure EVA access to all external worksites as a starting point for ground training, to generate information needed for on-orbit training, and to react quickly to develop contingency EVA plans, techniques, and procedures. This paper describes the use of computer-based EVA worksite analysis techniques for EVA worksite design. EVA worksite analysis has been used to design 80% of EVA worksites on the U.S. portion of the International Space Station. With the launch of the first U.S. element of the station, EVA worksite analysis is being developed further to support real-time analysis of unplanned EVA operations. This paper describes this development and deployment of EVA worksite analysis for International Space Station (ISS) mission support.

  2. [3-dimensional approach to spinal deformities. Application to the study of the prognosis of pediatric scoliosis].

    PubMed

    Graf, H; Hecquet, J; Dubousset, J

    1983-01-01

    The authors have utilized a computer for a spatial analysis of deformities of the spine based on antero-posterior and lateral radiographs. The posterior limits of the sacral plateau and the centre of each vertebral plateau were demarcated on the antero-posterior radiograph. The posterior margin of each vertebra was demarcated on the lateral radiograph. With this information, the computer can construct a picture of the spine as if seen from above, and vertebral rotation can be assessed. The vertebrae are shown in different colours according to their level. Thirty cases were studied and six types of infantile scoliosis defined: scoliosis with a localised hyper-rotation like a hairpin, scoliosis with rotary dislocation at the junction of two rotational levels, infantile scoliosis with a moderate curve, progressive scoliosis, benign scoliosis, and spontaneously regressive scoliosis of the newborn. This study has shown that the prognostic features are in accordance with classical assessments.
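
    The reconstruction idea is simple to sketch: each film contributes one horizontal coordinate, and the shared cranio-caudal axis registers the two. The sketch below assumes equal scale and vertical registration between the films, which the original system would have had to calibrate.

```python
import numpy as np

def top_down_view(ap_points, lateral_points):
    """Combine digitized vertebral landmarks from two orthogonal radiographs
    into a 'seen from above' plot. The AP film gives (x, y) = (left-right,
    cranio-caudal); the lateral film gives (z, y) = (antero-posterior,
    cranio-caudal). One (x, z) point per vertebral level is returned."""
    x = np.asarray(ap_points, dtype=float)[:, 0]
    z = np.asarray(lateral_points, dtype=float)[:, 0]
    return np.column_stack([x, z])
```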

  3. Incorporating a 3-dimensional printer into the management of early-stage cervical cancer.

    PubMed

    Baek, Min-Hyun; Kim, Dae-Yeon; Kim, Namkug; Rhim, Chae Chun; Kim, Jong-Hyeok; Nam, Joo-Hyun

    2016-08-01

    We used a 3-dimensional (3D) printer to create anatomical replicas of real lesions and tested its application in cervical cancer. Our study patient decided to undergo radical hysterectomy after seeing her 3D model, which was then used to plan and simulate the surgery. Using 3D printers to create patient-specific 3D tumor models may help cervical cancer patients make treatment decisions. This technology will lead to better surgical and oncological outcomes for cervical cancer patients. J. Surg. Oncol. 2016;114:150-152. © 2016 Wiley Periodicals, Inc.

  4. Introducing a well-ordered volume porosity in 3-dimensional gold microcantilevers

    NASA Astrophysics Data System (ADS)

    Ayela, Cédric; Lalo, Hélène; Kuhn, Alexander

    2013-02-01

    The purpose of the present work is the introduction of a combined bottom-up and top-down approach to generate 3-dimensional gold microcantilevers, where the porosity in the volume of the free-standing microstructure is well-controlled. By combining the elaboration of a colloidal crystal, followed by electrodeposition, with a sacrificial layer process, free-standing macroporous gold cantilevers are fabricated collectively. In order to validate the proposed concept, a simple application to humidity sensing is evaluated using the devices as mass sensors. A large sensitivity of -529 ppm/%RH and low discrepancy are obtained experimentally, confirming the promising application potential of this original architecture.

  5. Brief communications: visualization of coronary arteries in rats by 3-dimensional real-time contrast echocardiography.

    PubMed

    Ishikura, Fuminobu; Hirayama, Hideo; Iwata, Akiko; Toshida, Tsutomu; Masuda, Kasumi; Otani, Kentaro; Asanuma, Toshihiko; Beppu, Shintaro

    2008-05-01

    Angiogenesis is under intense investigation to advance the treatment of various ischemic diseases. Small animals, such as mice and rats, are often used for this purpose. However, evaluating the structure of coronary arteries in small animals in situ is not easy. We succeeded in visualizing the coronary artery in rats with 3-dimensional real-time contrast echocardiography using a high-frequency transducer. These methods can be applied for more convenient in situ assessment in future studies, for example of angiogenesis in rats.

  6. MMA, A Computer Code for Multi-Model Analysis

    SciTech Connect

    Eileen P. Poeter and Mary C. Hill

    2007-08-20

    This report documents the Multi-Model Analysis (MMA) computer code. MMA can be used to evaluate results from alternative models of a single system using the same set of observations for all models. As long as the observations, the observation weighting, and system being represented are the same, the models can differ in nearly any way imaginable. For example, they may include different processes, different simulation software, different temporal definitions (for example, steady-state and transient models could be considered), and so on. The multiple models need to be calibrated by nonlinear regression. Calibration of the individual models needs to be completed before application of MMA. MMA can be used to rank models and calculate posterior model probabilities. These can be used to (1) determine the relative importance of the characteristics embodied in the alternative models, (2) calculate model-averaged parameter estimates and predictions, and (3) quantify the uncertainty of parameter estimates and predictions in a way that integrates the variations represented by the alternative models. There is a lack of consensus on what model analysis methods are best, so MMA provides four default methods. Two are based on Kullback-Leibler information, and use the AIC (Akaike Information Criterion) or AICc (second-order-bias-corrected AIC) model discrimination criteria. The other two default methods are the BIC (Bayesian Information Criterion) and the KIC (Kashyap Information Criterion) model discrimination criteria. Use of the KIC criterion is equivalent to using the maximum-likelihood Bayesian model averaging (MLBMA) method. AIC, AICc, and BIC can be derived from Frequentist or Bayesian arguments. The default methods based on Kullback-Leibler information have a number of theoretical advantages, including that they tend to favor more complicated models as more data become available than do the other methods, which makes sense in many situations.
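
    Apart from KIC (which requires a Fisher information term), the default criteria and the posterior model probabilities built from them follow standard formulas. A minimal sketch of AIC, AICc, BIC, and the exp(-delta/2) weighting:

```python
import numpy as np

def information_criteria(log_likelihood, n_params, n_obs):
    """AIC, AICc, and BIC for one calibrated model."""
    aic = 2 * n_params - 2 * log_likelihood
    aicc = aic + (2 * n_params * (n_params + 1)) / (n_obs - n_params - 1)
    bic = n_params * np.log(n_obs) - 2 * log_likelihood
    return aic, aicc, bic

def posterior_model_probabilities(criterion_values):
    """Akaike-weight style probabilities from any one criterion:
    w_i = exp(-delta_i / 2) / sum_j exp(-delta_j / 2)."""
    c = np.asarray(criterion_values, dtype=float)
    delta = c - c.min()
    w = np.exp(-delta / 2.0)
    return w / w.sum()
```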

  7. Reliability analysis framework for computer-assisted medical decision systems

    SciTech Connect

    Habas, Piotr A.; Zurada, Jacek M.; Elmaghraby, Adel S.; Tourassi, Georgia D.

    2007-02-15

    We present a technique that enhances computer-assisted decision (CAD) systems with the ability to assess the reliability of each individual decision they make. Reliability assessment is achieved by measuring the accuracy of a CAD system with known cases similar to the one in question. The proposed technique analyzes the feature space neighborhood of the query case to dynamically select an input-dependent set of known cases relevant to the query. This set is used to assess the local (query-specific) accuracy of the CAD system. The estimated local accuracy is utilized as a reliability measure of the CAD response to the query case. The underlying hypothesis of the study is that CAD decisions with higher reliability are more accurate. The above hypothesis was tested using a mammographic database of 1337 regions of interest (ROIs) with biopsy-proven ground truth (681 with masses, 656 with normal parenchyma). Three types of decision models, (i) a back-propagation neural network (BPNN), (ii) a generalized regression neural network (GRNN), and (iii) a support vector machine (SVM), were developed to detect masses based on eight morphological features automatically extracted from each ROI. The performance of all decision models was evaluated using the Receiver Operating Characteristic (ROC) analysis. The study showed that the proposed reliability measure is a strong predictor of the CAD system's case-specific accuracy. Specifically, the ROC area index for CAD predictions with high reliability was significantly better than for those with low reliability values. This result was consistent across all decision models investigated in the study. The proposed case-specific reliability analysis technique could be used to alert the CAD user when an opinion that is unlikely to be reliable is offered. The technique can be easily deployed in the clinical environment because it is applicable with a wide range of classifiers regardless of their structure and it requires neither additional
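
    The neighborhood-based reliability estimate can be sketched in a few lines. The Euclidean metric and fixed k below are illustrative simplifications; the paper's input-dependent neighborhood selection may differ.

```python
import numpy as np

def local_reliability(query_features, known_features, known_labels,
                      cad_predictions, k=25):
    """Fraction of the k known cases nearest the query (in feature space)
    that the CAD system classified correctly: a local accuracy estimate
    used as the reliability score for the query case."""
    X = np.asarray(known_features, dtype=float)
    q = np.asarray(query_features, dtype=float)
    nearest = np.argsort(np.linalg.norm(X - q, axis=1))[:k]
    return np.mean(np.asarray(cad_predictions)[nearest]
                   == np.asarray(known_labels)[nearest])
```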

  8. Computer-aided pulmonary image analysis in small animal models

    PubMed Central

    Xu, Ziyue; Bagci, Ulas; Mansoor, Awais; Kramer-Marek, Gabriela; Luna, Brian; Kubler, Andre; Dey, Bappaditya; Foster, Brent; Papadakis, Georgios Z.; Camp, Jeremy V.; Jonsson, Colleen B.; Bishai, William R.; Jain, Sanjay; Udupa, Jayaram K.; Mollura, Daniel J.

    2015-01-01

    Purpose: To develop an automated pulmonary image analysis framework for infectious lung diseases in small animal models. Methods: The authors describe a novel pathological lung and airway segmentation method for small animals. The proposed framework includes identification of abnormal imaging patterns pertaining to infectious lung diseases. First, the authors’ system estimates an expected lung volume by utilizing a regression function between total lung capacity and approximated rib cage volume. A significant difference between the expected lung volume and the initial lung segmentation indicates the presence of severe pathology, and invokes a machine learning based abnormal imaging pattern detection system next. The final stage of the proposed framework is the automatic extraction of airway tree for which new affinity relationships within the fuzzy connectedness image segmentation framework are proposed by combining Hessian and gray-scale morphological reconstruction filters. Results: 133 CT scans were collected from four different studies encompassing a wide spectrum of pulmonary abnormalities pertaining to two commonly used small animal models (ferret and rabbit). Sensitivity and specificity were greater than 90% for pathological lung segmentation (average dice similarity coefficient > 0.9). While qualitative visual assessments of airway tree extraction were performed by the participating expert radiologists, for quantitative evaluation the authors validated the proposed airway extraction method by using publicly available EXACT’09 data set. Conclusions: The authors developed a comprehensive computer-aided pulmonary image analysis framework for preclinical research applications. The proposed framework consists of automatic pathological lung segmentation and accurate airway tree extraction. The framework has high sensitivity and specificity; therefore, it can contribute advances in preclinical research in pulmonary diseases. PMID:26133591
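
    The first gate in this pipeline, comparing an expected lung volume from a rib-cage regression against the initial segmentation, can be sketched as follows. The 25% shortfall tolerance is an invented placeholder, not the paper's criterion.

```python
import numpy as np

def expected_lung_volume(rib_cage_volumes, lung_volumes, query_rib_volume):
    """Linear regression of total lung capacity on approximated rib cage
    volume, then prediction for a new animal."""
    slope, intercept = np.polyfit(rib_cage_volumes, lung_volumes, 1)
    return slope * query_rib_volume + intercept

def severe_pathology_suspected(expected_vol, segmented_vol, rel_tol=0.25):
    """Flag a large shortfall of the initial segmentation vs. expectation,
    which in the paper triggers the learned abnormal-pattern detector."""
    return segmented_vol < (1.0 - rel_tol) * expected_vol
```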

  10. Computational analysis of flow in 3D propulsive transition ducts

    NASA Technical Reports Server (NTRS)

    Sepri, Paavo

    1990-01-01

    Fully three-dimensional, statistically steady flows are analyzed numerically in propulsive transition ducts being considered for use in future aircraft of higher maneuverability. The purpose of the transition duct is to convert axisymmetric flow from conventional propulsion systems to flow in a rectangular geometry of high aspect ratio. In an optimal design, the transition duct would be of minimal length in order to reduce the weight penalty, while the geometrical change would be gradual enough to avoid detrimental flow perturbations. Recent experiments conducted at the Propulsion Aerodynamics Branch have indicated that thrust losses in ducts of superelliptic cross-section can be surprisingly low, even if flow separation occurs near the divergent walls. In order to address the objective of developing a rational design procedure for optimal transition ducts, it is necessary to have a reliable computational tool for the analysis of the flows achieved in a sequence of configurations. Current CFD efforts involving complicated geometries usually must contend with two separate but interactive aspects: grid generation and flow solution. The first two avenues of the present investigation comprised suitable grid generation for a class of transition ducts of superelliptic cross-section and the subsequent application of the flow solver PAB3D to this geometry. The code, PAB3D, was developed as a comprehensive tool for the solution of both internal and external high speed flows. The third avenue of investigation has involved analytical formulations to aid in understanding the nature of duct flows and to provide a basis of comparison for subsequent numerical solutions. Numerical results to date include the generation of two preliminary grid systems for duct flows and the initial application of PAB3D to the corresponding geometries, which are of the class tested experimentally.
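
    The superelliptic cross-sections at the heart of this duct family are easy to parameterize, which is what makes grid generation for the class tractable. A minimal sketch:

```python
import numpy as np

def superellipse(a, b, n, num=200):
    """Boundary points of |x/a|^n + |y/b|^n = 1. n = 2 gives an ellipse and
    larger n approaches a rectangle, so sweeping n along the duct axis morphs
    a circular inlet into a high-aspect-ratio, nearly rectangular exit."""
    t = np.linspace(0.0, 2.0 * np.pi, num)
    x = a * np.sign(np.cos(t)) * np.abs(np.cos(t)) ** (2.0 / n)
    y = b * np.sign(np.sin(t)) * np.abs(np.sin(t)) ** (2.0 / n)
    return x, y
```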

  11. Computational heat transfer analysis for oscillatory channel flows

    NASA Technical Reports Server (NTRS)

    Ibrahim, Mounir; Kannapareddy, Mohan

    1993-01-01

    An accurate finite-difference scheme has been utilized to investigate oscillatory, laminar, incompressible flow between two parallel plates and in circular tubes. The parallel-plate geometry simulates the regenerator of a free-piston Stirling engine (foil-type regenerator), and the channel wall was included in the analysis (conjugate heat transfer problem). The circular tubes simulate the cooler and heater of the engine with an isothermal wall. The study covered a wide range of maximum Reynolds number (from 75 to 60,000), Valensi number (from 2.5 to 700), and relative amplitude of fluid displacement (0.714 and 1.34). The computational results indicate a complex heat flux distribution in time and axial location along the channel. At the channel mid-plane we observed two thermal cycles (out of phase with the flow) per flow cycle. At this axial location the mean value, amplitude, and phase shift of the wall heat flux relative to the flow depend on the maximum Reynolds number, Valensi number, and relative amplitude of fluid displacement. At other axial locations, the wall heat flux distribution is more complex.

  12. Computational analysis of molt-inhibiting hormone from selected crustaceans.

    PubMed

    C, Kumaraswamy Naidu; Y, Suneetha; P, Sreenivasula Reddy

    2013-12-01

    Molt-inhibiting hormone (MIH) is a principal endocrine hormone regulating growth in crustaceans. In total, nine MIH peptide sequences representing members of the families Penaeidae (Penaeus monodon, Litopenaeus vannamei, Marsupenaeus japonicus), Portunidae (Portunus trituberculatus, Charybdis japonica, Charybdis feriata), Cambaridae (Procambarus bouvieri), Parastacidae (Cherax quadricarinatus) and Varunidae (Eriocheir sinensis) were selected for our study. In order to develop a structure-based phylogeny, predict functionally important regions, and define stability changes upon single-site mutations, the 3D structures of MIH for these crustaceans were built using homology modeling based on the known structure of MIH from M. japonicus (1J0T). The structure-based phylogeny showed a close relationship between P. bouvieri and C. japonica. ConSurf server analysis showed that the residues Cys(8), Arg(15), Cys(25), Asp(27), Cys(28), Asn(30), Arg(33), Cys(41), Cys(45), Phe(51), and Cys(54) may be functionally significant among the MIH of crustaceans. Single amino acid substitutions 'Y' and 'G' at positions 71 and 72 of the MIH C-terminal region altered the predicted stability, indicating that a change in this region may alter the function of MIH. In conclusion, we propose a computational approach to analyze the structure, phylogeny and stability of MIH from crustaceans.

  13. Design of airborne wind turbine and computational fluid dynamics analysis

    NASA Astrophysics Data System (ADS)

    Anbreen, Faiqa

    Wind energy is a promising alternative to depleting non-renewable sources. The height of conventional wind turbines constrains their efficiency. An airborne wind turbine can reach much higher altitudes and produce higher power owing to the higher wind velocity and energy density there. The focus of this thesis is the design of a shrouded airborne wind turbine capable of generating 70 kW to propel a leisure boat with a capacity of 8-10 passengers. The idea of designing an airborne turbine is to take advantage of the higher velocities in the atmosphere. The Solidworks model has been analyzed numerically using the computational fluid dynamics (CFD) software StarCCM+. An unsteady Reynolds-averaged Navier-Stokes (URANS) simulation with the k-epsilon turbulence model was selected to study the physical properties of the flow, with emphasis on the performance of the turbine and the increase in air velocity at the throat. The analysis was done for two ambient velocities, 12 m/s and 6 m/s. At a 12 m/s inlet velocity, the velocity of air at the turbine was 16 m/s and the power generated by the turbine is 61 kW. At an inlet velocity of 6 m/s, the velocity of air at the turbine increased to 10 m/s and the power generated by the turbine is 25 kW.
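
    As a rough sanity check on such figures, the standard wind-power relation P = (1/2) rho A v^3 Cp can be used to back out the swept area implied by the reported output. The density and power coefficient below are generic placeholders, not values from the thesis.

```python
def wind_power(velocity, rotor_area, rho=1.225, cp=0.4):
    """P = 1/2 * rho * A * v^3 * Cp, with placeholder density (sea-level air)
    and power coefficient; both are assumptions, not values from the thesis."""
    return 0.5 * rho * rotor_area * velocity ** 3 * cp

# Power is linear in area, so the area implied by ~61 kW at the reported
# 16 m/s throat velocity under these placeholder assumptions is:
implied_area = 61_000 / wind_power(16.0, 1.0)
print(f"implied swept area ~ {implied_area:.0f} m^2")
```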

  14. CProb: a computational tool for conducting conditional probability analysis.

    PubMed

    Hollister, Jeffrey W; Walker, Henry A; Paul, John F

    2008-01-01

    Conditional probability is the probability of observing one event given that another event has occurred. In an environmental context, conditional probability helps to assess the association between an environmental contaminant (i.e., the stressor) and the ecological condition of a resource (i.e., the response). These analyses, when combined with controlled experiments and other methodologies, show great promise in evaluating ecological conditions from observational data and in defining water quality and other environmental criteria. Current applications of conditional probability analysis (CPA) are largely done via scripts or cumbersome spreadsheet routines, which may prove daunting to end-users and do not provide access to the underlying scripts. Combining spreadsheets with scripts eases computation through a familiar interface (i.e., Microsoft Excel) and creates a transparent process through full accessibility to the scripts. With this in mind, we developed a software application, CProb, as an Add-in for Microsoft Excel with R, R(D)com Server, and Visual Basic for Applications. CProb calculates and plots scatterplots, empirical cumulative distribution functions, and conditional probability. In this short communication, we describe CPA, our motivation for developing a CPA tool, and our implementation of CPA as a Microsoft Excel Add-in. Further, we illustrate the use of our software with two examples: a water quality example and a landscape example. CProb is freely available for download at http://www.epa.gov/emap/nca/html/regions/cprob.
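
    The underlying CPA computation is compact enough to sketch directly: for each candidate stressor threshold, estimate the probability of an ecological response conditional on the stressor exceeding it. The criterion coding below (larger response value = condition of interest) is an assumption for illustration.

```python
import numpy as np

def conditional_probability_curve(stressor, response, response_criterion):
    """For each observed stressor value t, estimate
    P(response >= criterion | stressor >= t) from paired observations."""
    s = np.asarray(stressor, dtype=float)
    r = np.asarray(response, dtype=float)
    thresholds = np.sort(np.unique(s))
    probs = [np.mean(r[s >= t] >= response_criterion) for t in thresholds]
    return thresholds, np.array(probs)
```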

  15. Scale analysis using X-ray microfluorescence and computed radiography

    NASA Astrophysics Data System (ADS)

    Candeias, J. P.; de Oliveira, D. F.; dos Anjos, M. J.; Lopes, R. T.

    2014-02-01

    Scale deposits are the most common and most troublesome damage problem in the oil field and can occur in both production and injection wells. They form because the minerals in produced water exceed their saturation limits as temperatures and pressures change. Scale can vary in appearance from hard crystalline material to soft, friable material, and the deposits can contain other minerals and impurities such as paraffin, salt and iron. In severe conditions, scale creates a significant restriction, or even a plug, in the production tubing. This study was conducted to identify the elements present in scale samples and to quantify the thickness of the scale layer using synchrotron radiation micro-X-ray fluorescence (SRμXRF) and computed radiography (CR) techniques. The SRμXRF results showed that the elements found in the scale samples were strontium, barium, calcium, chromium, sulfur and iron. The CR analysis identified the scale layer and quantified its thickness accurately. These results can support decision making about removal of the deposited scale.

  16. Computer-based analysis of Haemophilus parasuis protein fingerprints

    PubMed Central

    2004-01-01

    Abstract The present study aimed to compare the whole-cell protein profiles of Haemophilus parasuis field isolates using a computer-based analysis and to evaluate the relationship between polyacrylamide gel electrophoresis (PAGE) type and virulence potential based on isolation site. A dendrogram clustering isolates with similar protein profiles was generated. Haemophilus parasuis isolates were grouped into 2 major PAGE type groups. The PAGE type II isolates were characterized by the presence of major proteins with molecular weights between 36 and 38 kDa and included 90.7% of the isolates recovered from systemic sites, such as pleura, pericardium, peritoneum, lymph nodes, joints, and brain. Isolates classified as PAGE type I were characterized by the absence of this group of proteins and included 83.4% of the isolates recovered from the upper respiratory tract of healthy animals. The present study further corroborates the existence of a unique group of major proteins in potentially virulent H. parasuis isolates. PMID:14979439
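
    The dendrogram step in this kind of analysis is typically hierarchical clustering of band-intensity profiles. A generic sketch using SciPy follows; the profile data here are synthetic stand-ins, not the study's PAGE fingerprints.

    ```python
    import numpy as np
    from scipy.cluster.hierarchy import linkage, fcluster
    from scipy.spatial.distance import pdist

    # Synthetic whole-cell protein profiles: rows = isolates, cols = band intensities.
    rng = np.random.default_rng(1)
    type_I = rng.normal(0.2, 0.05, (6, 40))        # lacks the 36-38 kDa bands
    type_II = rng.normal(0.2, 0.05, (6, 40))
    type_II[:, 18:22] += 0.8                        # strong bands in one region
    profiles = np.vstack([type_I, type_II])

    # Cluster isolates by profile similarity (UPGMA on Euclidean distances).
    Z = linkage(pdist(profiles), method="average")
    labels = fcluster(Z, t=2, criterion="maxclust")
    print(labels)   # isolates fall into two groups, analogous to PAGE types I/II
    ```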

  17. Analysis of coherent dynamical processes through computer vision

    NASA Astrophysics Data System (ADS)

    Hack, M. J. Philipp

    2016-11-01

    Visualizations of turbulent boundary layers show an abundance of characteristic arc-shaped structures whose apparent similarity suggests a common origin in a coherent dynamical process. While the structures have been likened to the hairpin vortices observed in the late stages of transitional flow, a consistent description of the underlying mechanism has remained elusive. Detailed studies are complicated by the chaotic nature of turbulence which modulates each manifestation of the process and which renders the isolation of individual structures a challenging task. The present study applies methods from the field of computer vision to capture the time evolution of turbulent flow features and explore the associated physical mechanisms. The algorithm uses morphological operations to condense the structure of the turbulent flow field into a graph described by nodes and links. The low-dimensional geometric information is stored in a database and allows the identification and analysis of equivalent dynamical processes across multiple scales. The framework is not limited to turbulent boundary layers and can also be applied to different types of flows as well as problems from other fields of science.
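
    The graph-condensation idea can be sketched with standard image-morphology tools: thin a binary structure field to a skeleton, then classify skeleton pixels into nodes (endpoints and junctions) by neighbor count. This is a generic illustration on a synthetic mask, not the paper's actual algorithm.

    ```python
    import numpy as np
    from scipy.ndimage import convolve
    from skimage.morphology import skeletonize

    # Binary mask of a flow structure (synthetic placeholder for a vortex footprint).
    mask = np.zeros((64, 64), dtype=bool)
    mask[30:34, 5:60] = True          # a horizontal band
    mask[5:60, 30:34] = True          # a vertical band crossing it

    skel = skeletonize(mask)

    # Count 8-connected skeleton neighbors of each skeleton pixel.
    kernel = np.ones((3, 3), dtype=int)
    kernel[1, 1] = 0
    nbrs = convolve(skel.astype(int), kernel, mode="constant")

    endpoints = skel & (nbrs == 1)    # graph nodes: line ends
    junctions = skel & (nbrs >= 3)    # graph nodes: branch points
    print("endpoints:", int(endpoints.sum()), "junctions:", int(junctions.sum()))
    ```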

  18. Applied and computational harmonic analysis on graphs and networks

    NASA Astrophysics Data System (ADS)

    Irion, Jeff; Saito, Naoki

    2015-09-01

    In recent years, the advent of new sensor technologies and social network infrastructure has provided huge opportunities and challenges for analyzing data recorded on such networks. In the case of data on regular lattices, computational harmonic analysis tools such as the Fourier and wavelet transforms have well-developed theories and proven track records of success. It is therefore quite important to extend such tools from the classical setting of regular lattices to the more general setting of graphs and networks. In this article, we first review basics of graph Laplacian matrices, whose eigenpairs are often interpreted as the frequencies and the Fourier basis vectors on a given graph. We point out, however, that such an interpretation is misleading unless the underlying graph is either an unweighted path or cycle. We then discuss our recent effort of constructing multiscale basis dictionaries on a graph, including the Hierarchical Graph Laplacian Eigenbasis Dictionary and the Generalized Haar-Walsh Wavelet Packet Dictionary, which are viewed as generalizations of the classical hierarchical block DCTs and the Haar-Walsh wavelet packets, respectively, to the graph setting. Finally, we demonstrate the usefulness of our dictionaries by using them to simultaneously segment and denoise 1-D noisy signals sampled on regular lattices, a problem where classical tools have difficulty.
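
    The Laplacian eigenbasis mentioned above is easy to compute directly. A minimal sketch for an unweighted path graph, the case the authors single out as the one where the "graph Fourier" interpretation is exact (the eigenvalues are 2 − 2cos(kπ/n) and the eigenvectors are DCT-II basis vectors):

    ```python
    import numpy as np

    def path_laplacian(n: int) -> np.ndarray:
        """Unnormalized Laplacian L = D - A of an unweighted path graph."""
        A = np.diag(np.ones(n - 1), 1) + np.diag(np.ones(n - 1), -1)
        return np.diag(A.sum(axis=1)) - A

    n = 8
    L = path_laplacian(n)
    eigvals, eigvecs = np.linalg.eigh(L)   # ascending "graph frequencies"

    # For a path graph the eigenvalues are 2 - 2*cos(pi*k/n), k = 0..n-1.
    expected = 2 - 2 * np.cos(np.pi * np.arange(n) / n)
    print(np.allclose(eigvals, np.sort(expected)))   # True
    ```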

  19. Recent advances in computational structural reliability analysis methods

    NASA Technical Reports Server (NTRS)

    Thacker, Ben H.; Wu, Y.-T.; Millwater, Harry R.; Torng, Tony Y.; Riha, David S.

    1993-01-01

    The goal of structural reliability analysis is to determine the probability that the structure will adequately perform its intended function when operating under the given environmental conditions. Thus, the notion of reliability admits the possibility of failure. Given the fact that many different modes of failure are usually possible, achievement of this goal is a formidable task, especially for large, complex structural systems. The traditional (deterministic) design methodology attempts to assure reliability by the application of safety factors and conservative assumptions. However, the safety factor approach lacks a quantitative basis in that the level of reliability is never known and usually results in overly conservative designs because of compounding conservatisms. Furthermore, problem parameters that control the reliability are not identified, nor is their importance evaluated. A summary of recent advances in computational structural reliability assessment is presented. A significant level of activity in the research and development community was seen recently, much of which was directed towards the prediction of failure probabilities for single mode failures. The focus is to present some early results and demonstrations of advanced reliability methods applied to structural system problems. This includes structures that can fail as a result of multiple component failures (e.g., a redundant truss), or structural components that may fail due to multiple interacting failure modes (e.g., excessive deflection, resonant vibration, or creep rupture). From these results, some observations and recommendations are made with regard to future research needs.
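
    The basic quantity these methods target, the probability of failure, can be illustrated with a direct Monte Carlo estimate for a simple limit state g = R − S (capacity minus demand). The distributions below are assumed for illustration only.

    ```python
    import numpy as np

    # Illustrative limit state: failure when g = R - S < 0.
    # The distributions of R (resistance) and S (load) are assumed, not from the report.
    rng = np.random.default_rng(42)
    N = 1_000_000
    R = rng.normal(loc=10.0, scale=1.0, size=N)
    S = rng.normal(loc=6.0, scale=1.5, size=N)

    g = R - S
    pf = np.mean(g < 0.0)        # Monte Carlo failure probability
    beta = g.mean() / g.std()    # first-order reliability index
    print(f"P_f ~= {pf:.4f}, beta ~= {beta:.2f}")   # Phi(-beta) should be close to P_f
    ```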

  20. Computational Modelling and Movement Analysis of Hip Joint with Muscles

    NASA Astrophysics Data System (ADS)

    Siswanto, W. A.; Yoon, C. C.; Salleh, S. Md.; Ngali, M. Z.; Yusup, Eliza M.

    2017-01-01

    In this study, the hip joint and its main muscles are modelled by finite elements. The parts included in the model are the hip joint, hemipelvis, gluteus maximus, quadratus femoris and gemellus inferior. The materials used in the model are isotropic elastic, Mooney-Rivlin and neo-Hookean. The hip resultant forces of normal gait and stair climbing are applied to the model. The displacement, stress and strain responses of the muscles are then recorded. The FEBio non-linear solver for biomechanics is employed to run the simulation of the hip joint model with muscles. The contact interfaces used in the model are sliding contact and tied contact. The analysis results show that the gluteus maximus has the maximum displacement, stress and strain during stair climbing. The quadratus femoris and gemellus inferior have their maximum displacement and strain during normal gait but their maximum stress during stair climbing. In addition, the computational model of the hip joint with muscles is produced as a research and investigation platform, and can be used as a visualization platform for the hip joint.

  1. Experimental Validation of Plastic Mandible Models Produced by a “Low-Cost” 3-Dimensional Fused Deposition Modeling Printer

    PubMed Central

    Maschio, Federico; Pandya, Mirali; Olszewski, Raphael

    2016-01-01

    Background The objective of this study was to investigate the accuracy of 3-dimensional (3D) plastic (ABS) models generated using a low-cost 3D fused deposition modeling printer. Material/Methods Two human dry mandibles were scanned with a cone beam computed tomography (CBCT) Accuitomo device. Preprocessing consisted of 3D reconstruction with Maxilim software and STL file repair with Netfabb software. The data were then used to print 2 plastic replicas with a low-cost 3D fused deposition modeling printer (UP plus 2®). Two independent observers performed the identification of 26 anatomic landmarks on the 4 mandibles (2 dry and 2 replicas) with a 3D measuring arm. Each observer repeated the identifications 20 times. The comparison between the dry and plastic mandibles was based on 13 distances: 8 distances less than 12 mm and 5 distances greater than 12 mm. Results The mean absolute difference (MAD) was 0.37 mm, and the mean dimensional error (MDE) was 3.76%. The MDE decreased to 0.93% for distances greater than 12 mm. Conclusions Plastic models generated using the low-cost 3D printer UP plus 2® provide dimensional accuracy comparable to that of other well-established rapid prototyping technologies. Validated low-cost 3D printers could represent a step toward better accessibility of rapid prototyping technologies in the medical field. PMID:27003456
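
    The two accuracy metrics used here are simple to compute from paired distance measurements; a sketch with hypothetical values (not the study's data):

    ```python
    import numpy as np

    # Paired distances (mm) on the dry mandible vs. the printed replica.
    # Values are hypothetical, for illustrating the metrics only.
    reference = np.array([10.2, 25.4, 8.7, 40.1, 15.3])
    replica = np.array([10.6, 25.0, 9.1, 39.6, 15.7])

    abs_diff = np.abs(replica - reference)
    mad = abs_diff.mean()                         # mean absolute difference, mm
    mde = (abs_diff / reference).mean() * 100.0   # mean dimensional error, %

    print(f"MAD = {mad:.2f} mm, MDE = {mde:.2f} %")
    ```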

  2. A simple and efficient quasi 3-dimensional viscoelastic model and software for simulation of tapping-mode atomic force microscopy

    SciTech Connect

    Solares, Santiago D.

    2015-11-26

    This study introduces a quasi-3-dimensional (Q3D) viscoelastic model and software tool for use in atomic force microscopy (AFM) simulations. The model is based on a 2-dimensional array of standard linear solid (SLS) model elements. The well-known 1-dimensional SLS model is a textbook example in viscoelastic theory but is relatively new in AFM simulation. It is the simplest model that offers a qualitatively correct description of the most fundamental viscoelastic behaviors, namely stress relaxation and creep. However, this simple model does not reflect the correct curvature in the repulsive portion of the force curve, so its application in the quantitative interpretation of AFM experiments is relatively limited. In the proposed Q3D model the use of an array of SLS elements leads to force curves that have the typical upward curvature in the repulsive region, while still offering a very low computational cost. Furthermore, the use of a multidimensional model allows for the study of AFM tips having non-ideal geometries, which can be extremely useful in practice. Examples of typical force curves are provided for single- and multifrequency tapping-mode imaging, for both of which the force curves exhibit the expected features. Lastly, a software tool to simulate amplitude and phase spectroscopy curves is provided, which can be easily modified to implement other control schemes in order to aid in the interpretation of AFM experiments.
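
    The behavior of one SLS element is easy to reproduce. A minimal sketch integrating the standard-linear-solid (Zener) law for a step strain, showing the stress-relaxation response the abstract refers to; the parameter values are arbitrary.

    ```python
    import numpy as np

    # Standard linear solid (Zener) element: spring k1 in parallel with a
    # Maxwell arm (spring k2 in series with dashpot eta). Arbitrary parameters.
    k1, k2, eta = 1.0, 4.0, 2.0
    tau = eta / k2                        # relaxation time of the Maxwell arm

    dt, T = 1e-3, 5.0
    t = np.arange(0.0, T, dt)
    eps = np.ones_like(t)                 # unit step strain applied at t = 0

    sigma_m = k2 * eps[0]                 # Maxwell arm loads instantly at the step
    sigma = np.empty_like(t)
    for i in range(len(t)):
        sigma[i] = k1 * eps[i] + sigma_m
        sigma_m += dt * (-sigma_m / tau)  # explicit Euler relaxation (strain held)

    # Analytic check: sigma(t) = k1 + k2 * exp(-t / tau) for a unit step strain.
    print(np.allclose(sigma, k1 + k2 * np.exp(-t / tau), atol=1e-2))  # True
    ```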

  3. A simple and efficient quasi 3-dimensional viscoelastic model and software for simulation of tapping-mode atomic force microscopy

    DOE PAGES

    Solares, Santiago D.

    2015-11-26

    This study introduces a quasi-3-dimensional (Q3D) viscoelastic model and software tool for use in atomic force microscopy (AFM) simulations. The model is based on a 2-dimensional array of standard linear solid (SLS) model elements. The well-known 1-dimensional SLS model is a textbook example in viscoelastic theory but is relatively new in AFM simulation. It is the simplest model that offers a qualitatively correct description of the most fundamental viscoelastic behaviors, namely stress relaxation and creep. However, this simple model does not reflect the correct curvature in the repulsive portion of the force curve, so its application in the quantitative interpretation of AFM experiments is relatively limited. In the proposed Q3D model the use of an array of SLS elements leads to force curves that have the typical upward curvature in the repulsive region, while still offering a very low computational cost. Furthermore, the use of a multidimensional model allows for the study of AFM tips having non-ideal geometries, which can be extremely useful in practice. Examples of typical force curves are provided for single- and multifrequency tapping-mode imaging, for both of which the force curves exhibit the expected features. Lastly, a software tool to simulate amplitude and phase spectroscopy curves is provided, which can be easily modified to implement other control schemes in order to aid in the interpretation of AFM experiments.

  4. A simple and efficient quasi 3-dimensional viscoelastic model and software for simulation of tapping-mode atomic force microscopy.

    PubMed

    Solares, Santiago D

    2015-01-01

    This paper introduces a quasi-3-dimensional (Q3D) viscoelastic model and software tool for use in atomic force microscopy (AFM) simulations. The model is based on a 2-dimensional array of standard linear solid (SLS) model elements. The well-known 1-dimensional SLS model is a textbook example in viscoelastic theory but is relatively new in AFM simulation. It is the simplest model that offers a qualitatively correct description of the most fundamental viscoelastic behaviors, namely stress relaxation and creep. However, this simple model does not reflect the correct curvature in the repulsive portion of the force curve, so its application in the quantitative interpretation of AFM experiments is relatively limited. In the proposed Q3D model the use of an array of SLS elements leads to force curves that have the typical upward curvature in the repulsive region, while still offering a very low computational cost. Furthermore, the use of a multidimensional model allows for the study of AFM tips having non-ideal geometries, which can be extremely useful in practice. Examples of typical force curves are provided for single- and multifrequency tapping-mode imaging, for both of which the force curves exhibit the expected features. Finally, a software tool to simulate amplitude and phase spectroscopy curves is provided, which can be easily modified to implement other control schemes in order to aid in the interpretation of AFM experiments.

  5. Computational Analysis and Characterization of RC-135 External Aerodynamics

    DTIC Science & Technology

    2012-03-22

    the RJ has experienced structural damage. Computational fluid dynamics (CFD) was applied with the intention of characterizing the differences between one variant and another. Grid generation was performed with ANSYS ICEM CFD. CFD techniques are applied to both the Rivet Joint and Combat Sent variants with the objective of achieving a better

  6. Sodium fast reactor gaps analysis of computer codes and models for accident analysis and reactor safety.

    SciTech Connect

    Carbajo, Juan; Jeong, Hae-Yong; Wigeland, Roald; Corradini, Michael; Schmidt, Rodney Cannon; Thomas, Justin; Wei, Tom; Sofu, Tanju; Ludewig, Hans; Tobita, Yoshiharu; Ohshima, Hiroyuki; Serre, Frederic

    2011-06-01

    This report summarizes the results of an expert-opinion elicitation activity designed to qualitatively assess the status and capabilities of currently available computer codes and models for accident analysis and reactor safety calculations of advanced sodium fast reactors, and to identify important gaps. The twelve-member panel consisted of representatives from five U.S. National Laboratories (SNL, ANL, INL, ORNL, and BNL), the University of Wisconsin, KAERI, JAEA, and CEA. The major portion of this elicitation activity occurred during a two-day meeting held on Aug. 10-11, 2010 at Argonne National Laboratory. There were two primary objectives of this work: (1) identify computer codes currently available for SFR accident analysis and reactor safety calculations; and (2) assess the status and capability of current US computer codes to adequately model the required accident scenarios and associated phenomena, and identify important gaps. During the review, panel members identified over 60 computer codes that are currently available in the international community to perform different aspects of SFR safety analysis for various event scenarios and accident categories. A brief description of each of these codes together with references (when available) is provided. An adaptation of the Predictive Capability Maturity Model (PCMM) for computational modeling and simulation is described for use in this work. The panel's assessment of the available US codes is presented in the form of nine tables, organized into groups of three for each of three risk categories considered: anticipated operational occurrences (AOOs), design basis accidents (DBA), and beyond design basis accidents (BDBA). A set of summary conclusions is drawn from the results obtained. At the highest level, the panel judged that current US code capabilities are adequate for licensing given reasonable margins, but expressed concern that US code development activities had stagnated and that the

  7. Integrating aerodynamic surface modeling for computational fluid dynamics with computer aided structural analysis, design, and manufacturing

    NASA Technical Reports Server (NTRS)

    Thorp, Scott A.

    1992-01-01

    This presentation will discuss the development of a NASA Geometry Exchange Specification for transferring aerodynamic surface geometry between LeRC systems and grid generation software used for computational fluid dynamics research. The proposed specification is based on a subset of the Initial Graphics Exchange Specification (IGES). The presentation will include discussion of how the NASA-IGES standard will accommodate improved computer aided design inspection methods and reverse engineering techniques currently being developed. The presentation is in viewgraph format.

  8. Principal Component Analysis of Computed Emission Lines from Protostellar Jets

    NASA Astrophysics Data System (ADS)

    Cerqueira, A. H.; Reyes-Iturbide, J.; De Colle, F.; Vasconcelos, M. J.

    2015-08-01

    A very important issue concerning protostellar jets is the mechanism behind their formation. Obtaining information on the region at the base of a jet can shed light on the subject, and some years ago this was done through a search for a rotational signature in the jet line spectrum. The existence of such signatures, however, remains controversial. In order to contribute to the clarification of this issue, in this paper we show that principal component analysis (PCA) can potentially help to distinguish between rotation and precession effects in protostellar jet images. This method reduces the dimensions of the data, facilitating the efficient extraction of information from large data sets such as those arising from integral field spectroscopy. PCA transforms the system of correlated coordinates into a system of uncorrelated coordinates, the eigenvectors, ordered by principal components of decreasing variance. The projection of the data on these coordinates produces images called tomograms, while eigenvectors can be displayed as eigenspectra. The combined analysis of both can allow the identification of patterns correlated to a particular physical property that would otherwise remain hidden, and can help to separate the effects of physically uncorrelated phenomena in the data. These are, for example, rotation and precession in the kinematics of a stellar jet. In order to show the potential of PCA analysis, we apply it to synthetic spectro-imaging datacubes generated as an output of numerical simulations of protostellar jets. In this way we generate a benchmark with which a PCA diagnostics of real observations can be confronted. Using the computed emission line profiles for [O i]λ6300 and [S ii]λ6716, we recover and analyze the effects of rotation and precession in tomograms generated by PCA. We show that different combinations of the eigenvectors can be used to enhance and to identify the rotation features present in the data. Our results indicate that PCA can be
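
    The PCA machinery itself is compact: flatten the datacube to (pixels × channels), mean-subtract, and take an SVD. The right singular vectors play the role of eigenspectra, and the projections, reshaped to images, are the tomograms. A sketch on a synthetic toy cube; the velocity-gradient "jet" here is invented for illustration, not the paper's simulated [O i]/[S ii] data.

    ```python
    import numpy as np

    # Synthetic spectro-imaging datacube: (ny, nx, nchan) = spatial x spectral.
    ny, nx, nchan = 20, 20, 50
    rng = np.random.default_rng(3)
    x = np.linspace(-1, 1, nx)
    vel = np.linspace(-200, 200, nchan)               # km/s channels

    # Toy "rotating jet": line centroid shifts linearly across the jet axis.
    centroid = 50.0 * x                               # km/s gradient across x
    cube = np.exp(-0.5 * ((vel[None, None, :] - centroid[None, :, None]) / 40.0) ** 2)
    cube = np.repeat(cube, ny, axis=0)                # same profile on every row
    cube += rng.normal(0, 0.02, cube.shape)           # noise

    # PCA via SVD of the mean-subtracted, flattened cube.
    X = cube.reshape(-1, nchan)
    X -= X.mean(axis=0)
    U, S, Vt = np.linalg.svd(X, full_matrices=False)
    eigenspectra = Vt                                 # rows: spectral eigenvectors
    tomograms = (U * S).reshape(ny, nx, -1)           # projection images per component
    print("variance fractions:", (S**2 / (S**2).sum())[:3].round(3))
    ```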

  9. Computational modeling and impact analysis of textile composite structures

    NASA Astrophysics Data System (ADS)

    Hur, Hae-Kyu

    This study is devoted to the development of an integrated numerical model enabling one to investigate the static and dynamic behaviors and failures of 2-D textile composite as well as 3-D orthogonal woven composite structures weakened by cracks and subjected to static-, impact- and ballistic-type loads. As more complicated models of textile composite structures are introduced, homogenization schemes, geometrical modeling and crack propagation become more difficult problems to solve. To overcome these problems, this study presents effective mesh-generation schemes, homogenization modeling based on a repeating unit cell and sinusoidal functions, and also a cohesive element to study micro-crack shapes. This proposed research has two parts: (1) studying the behavior of textile composites under static loads, and (2) studying the dynamic responses of these textile composite structures subjected to transient/ballistic loading. In the first part, efficient homogenization schemes are suggested to show the influence of textile architectures on mechanical characteristics, considering the micro modeling of the repeating unit cell. Furthermore, the structures of multi-layered or multi-phase composites combined with different laminae, such as sub-laminates, are considered to find the mechanical characteristics. A simple progressive failure mechanism for the textile composites is also presented. In the second part, this study focuses on three main phenomena to solve the dynamic problems: micro-crack shapes, textile architectures and textile effective moduli. To obtain good solutions of the dynamic problems, this research attempts to use four approaches: (I) determination of governing equations via a three-level hierarchy: micro-mechanical unit cell analysis, layer-wise analysis accounting for transverse strains and stresses, and structural analysis based on anisotropic plate layers, (II) development of an efficient computational approach enabling one to perform transient

  10. Computer Models for IRIS Control System Transient Analysis

    SciTech Connect

    Gary D. Storrick; Bojan Petrovic; Luca Oriani

    2007-01-31

    This report presents results of the Westinghouse work performed under Task 3 of this Financial Assistance Award and it satisfies a Level 2 Milestone for the project. Task 3 of the collaborative effort between ORNL, Brazil and Westinghouse for the International Nuclear Energy Research Initiative entitled “Development of Advanced Instrumentation and Control for an Integrated Primary System Reactor” focuses on developing computer models for transient analysis. This report summarizes the work performed under Task 3 on developing control system models. The present state of the IRIS plant design – such as the lack of a detailed secondary system or I&C system designs – makes finalizing models impossible at this time. However, this did not prevent making considerable progress. Westinghouse has several working models in use to further the IRIS design. We expect to continue modifying the models to incorporate the latest design information until the final IRIS unit becomes operational. Section 1.2 outlines the scope of this report. Section 2 describes the approaches we are using for non-safety transient models. It describes the need for non-safety transient analysis and the model characteristics needed to support those analyses. Section 3 presents the RELAP5 model. This is the highest-fidelity model used for benchmark evaluations. However, it is prohibitively slow for routine evaluations and additional lower-fidelity models have been developed. Section 4 discusses the current Matlab/Simulink model. This is a low-fidelity, high-speed model used to quickly evaluate and compare competing control and protection concepts. Section 5 describes the Modelica models developed by POLIMI and Westinghouse. The object-oriented Modelica language provides convenient mechanisms for developing models at several levels of detail. We have used this to develop a high-fidelity model for detailed analyses and a faster-running simplified model to help speed the I&C development process

  11. Grain boundary segregation in boron added interstitial free steels studied by 3-dimensional atom probe

    SciTech Connect

    Seto, K.; Larson, D.J.; Warren, P.J.; Smith, G.D.W.

    1999-04-09

    The development of deep-drawable sheet steels is of particular significance for the automotive industry. Titanium and/or niobium added extra-low carbon interstitial free (IF) steels are key materials. The virtually complete removal of carbon and nitrogen should lead to superior forming properties. However, the lack of solute carbon at grain boundaries significantly decreases the bonding force at the interfaces, which often causes intergranular brittle fracture when deeply drawn steel sheets are subjected to impact deformation at low temperature. This phenomenon is called secondary working embrittlement (SWE), and is a major problem when solute atoms such as phosphorus, manganese or silicon are added to increase the tensile strength of the steels. Small amounts of boron, which does not affect the formability of the steels significantly, are usually added as a remedial measure in such cases. The 3-dimensional atom probe (3DAP) combined with field ion microscopy (FIM) has the ability to produce 3-dimensional images from regions approximately 20 nm × 20 nm × 100 nm in size, and identify each atomic species and the relative location of each atom with nearly lattice resolution. In this study, a combination of these methods was applied to produce FIM tips of IF steel containing grain boundaries. The authors report here the first observations of the segregation of boron in IF steels using 3DAP.

  12. A 3-dimensional model for teaching local flaps using porcine skin.

    PubMed

    Hassan, Zahid; Hogg, Fiona; Graham, Ken

    2014-10-01

    The European Working Time Directive and streamlined training have led to reduced training time. Surgery, as an experience-dependent craft specialty, is affected more than other medical specialties. Trainees want to maximize all training opportunities in the clinical setting, and having basic skills acquired in advance on a simulated model can facilitate this. Here we describe the use of a novel model to design and raise local flaps in the face and scalp regions. The model consists of mannequin heads draped with porcine skin, which is skewered with pins at strategic points to give a 3-dimensional model that closely resembles a cadaveric head. The advantages of this model are that it is life size and incorporates all the relevant anatomical features, which can be drawn on if required. This model was used on a recent course, Intermediate Skills in Plastic Surgery: Flaps Around the Face, at the Royal College of Surgeons England. The trainees found that practicing on the porcine skin gave them an opportunity to master the basics of flap design and implementation. In summary, this innovative 3-dimensional training model has received high levels of satisfaction and is currently as close as we can get to cadaveric dissection without the constraints and cost of using human tissue.

  13. Crossover from 2-dimensional to 3-dimensional aggregations of clusters on square lattice substrates

    NASA Astrophysics Data System (ADS)

    Cheng, Yi; Zhu, Yu-Hong; Pan, Qi-Fa; Yang, Bo; Tao, Xiang-Ming; Ye, Gao-Xiang

    2015-11-01

    A Monte Carlo study on the crossover from 2-dimensional to 3-dimensional aggregations of clusters is presented. Based on the traditional cluster-cluster aggregation (CCA) simulation, a modified growth model is proposed. The clusters (including single particles and their aggregates) diffuse with diffusion step length l (1 ≤ l ≤ 7) and aggregate on a square lattice substrate. If the number of particles contained in a cluster is larger than a critical size sc, the particles at the edge of the cluster have a possibility to jump onto the upper layer, which results in the crossover from 2-dimensional to 3-dimensional aggregations. Our simulation results are in good agreement with the experimental findings. Project supported by the National Natural Science Foundation of China (Grant Nos. 11374082 and 11074215), the Science Foundation of Zhejiang Province Department of Education, China (Grant No. Y201018280), the Fundamental Research Funds for Central Universities, China (Grant No. 2012QNA3010), and the Specialized Research Fund for the Doctoral Program of Higher Education of China (Grant No. 20100101110005).
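
    A stripped-down version of this kind of lattice Monte Carlo growth can be sketched with single particles that random-walk and stick on first contact with the aggregate. This is a simplified cousin of the paper's model: cluster diffusion with step length l and the layer-jump rule above the critical size s_c are deliberately omitted.

    ```python
    import numpy as np

    # Minimal 2D lattice aggregation: particles random-walk on a periodic
    # square lattice and stick on first contact with the growing aggregate.
    L = 64
    rng = np.random.default_rng(7)
    occupied = np.zeros((L, L), dtype=bool)
    occupied[L // 2, L // 2] = True                    # seed particle

    moves = [(1, 0), (-1, 0), (0, 1), (0, -1)]

    def has_occupied_neighbor(x, y):
        return any(occupied[(x + dx) % L, (y + dy) % L] for dx, dy in moves)

    for _ in range(400):                               # deposit 400 particles
        x, y = rng.integers(0, L, size=2)
        while occupied[x, y]:                          # start on an empty site
            x, y = rng.integers(0, L, size=2)
        while not has_occupied_neighbor(x, y):         # walk until contact
            dx, dy = moves[rng.integers(4)]
            x, y = (x + dx) % L, (y + dy) % L
        occupied[x, y] = True                          # stick on contact

    print("aggregate size:", int(occupied.sum()))
    ```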

  14. Endothelial cells assemble into a 3-dimensional prevascular network in a bone tissue engineering construct.

    PubMed

    Rouwkema, Jeroen; de Boer, Jan; Van Blitterswijk, Clemens A

    2006-09-01

    To engineer tissues with clinically relevant dimensions, one must overcome the challenge of rapidly creating functional blood vessels to supply cells with oxygen and nutrients and to remove waste products. We tested the hypothesis that endothelial cells, cocultured with osteoprogenitor cells, can organize into a prevascular network in vitro. When cultured in a spheroid coculture model with human mesenchymal stem cells, human umbilical vein endothelial cells (HUVECs) form a 3-dimensional prevascular network within 10 days of in vitro culture. The formation of the prevascular network was promoted by seeding 2% or fewer HUVECs. Moreover, the addition of endothelial cells resulted in a 4-fold upregulation of the osteogenic marker alkaline phosphatase. The addition of mouse embryonic fibroblasts did not result in stabilization of the prevascular network. Upon implantation, the prevascular network developed further and structures including lumen could be seen regularly. However, anastomosis with the host vasculature was limited. We conclude that endothelial cells are able to form a 3-dimensional (3D) prevascular network in vitro in a bone tissue engineering setting. This finding is a strong indication that in vitro prevascularization is a promising strategy to improve implant vascularization in bone tissue engineering.

  15. Computer Majors' Education as Moral Enterprise: A Durkheimian Analysis.

    ERIC Educational Resources Information Center

    Rigoni, David P.; Lamagdeleine, Donald R.

    1998-01-01

    Building on Durkheim's (Emile) emphasis on the moral dimensions of social reality and using it to explore contemporary computer education, contends that many of his claims are justified. Argues that the college computer department has created a set of images, maxims, and operating assumptions that frames its curriculum, courses, and student…

  16. Computational Analysis of Dual Radius Circulation Control Airfoils

    NASA Technical Reports Server (NTRS)

    Lee-Rausch, E. M.; Vatsa, V. N.; Rumsey, C. L.

    2006-01-01

    The goal of the work is to use multiple codes and multiple configurations to provide an assessment of the capability of RANS solvers to predict circulation control dual radius airfoil performance, and also to identify key issues associated with the computational predictions of these configurations that can result in discrepancies in the predicted solutions. Solutions were obtained for the Georgia Tech Research Institute (GTRI) dual radius circulation control airfoil and the General Aviation Circulation Control (GACC) dual radius airfoil. For the GTRI-DR airfoil, two-dimensional structured and unstructured grid computations predicted the experimental trend in sectional lift variation with blowing coefficient very well. Good code-to-code comparisons between the chordwise surface pressure coefficients and the solution streamtraces also indicated that the detailed flow characteristics were matched between the computations. For the GACC-DR airfoil, two-dimensional structured and unstructured grid computations predicted the sectional lift and chordwise pressure distributions accurately at the no-blowing condition. However, at a moderate blowing coefficient, although the code-to-code variation was small, the differences between the computations and experiment were significant. Computations were made to investigate the sensitivity of the sectional lift and pressure distributions to some of the experimental and computational parameters, but none of these could entirely account for the differences in the experimental and computational results. Thus, CFD may indeed be adequate as a prediction tool for dual radius CC flows, but the limited and difficult-to-obtain two-dimensional experimental data prevent a confident assessment at this time.

  17. An Analysis of the Computer Equipment Repair Occupation in Illinois.

    ERIC Educational Resources Information Center

    Reneau, Fred W.; And Others

    1985-01-01

    An occupational inventory was completed by 100 Illinois electronic sales and service technicians and electronics field engineers. Data were collected on the following questions: What tasks are performed on the job by computer equipment repairers? and What tools/equipment are used on the job by computer equipment repairers? Results are examined.…

  18. Computational Analysis of Flow Through a Transonic Compressor Rotor

    DTIC Science & Technology

    2005-09-01

    A commercial Computer Aided Design (CAD) software company has developed a new code that allows modeling of two-phase flow. ICEM-CFD and CFX-5, both ANSYS

  19. Generalized Advanced Propeller Analysis System (GAPAS). Volume 2: Computer program user manual

    NASA Technical Reports Server (NTRS)

    Glatt, L.; Crawford, D. R.; Kosmatka, J. B.; Swigart, R. J.; Wong, E. W.

    1986-01-01

    The Generalized Advanced Propeller Analysis System (GAPAS) computer code is described. GAPAS was developed to analyze advanced technology multi-bladed propellers which operate on aircraft with speeds up to Mach 0.8 and altitudes up to 40,000 feet. GAPAS includes technology for analyzing the aerodynamic, structural, and acoustic performance of propellers. The computer code was developed for the CDC 7600 computer and is currently available for industrial use on the NASA Langley computer. A description of all the analytical models incorporated in GAPAS is included. Sample calculations are described, as well as user requirements for modifying the analysis system. Computer system core requirements and running times are also discussed.

  20. MMA, A Computer Code for Multi-Model Analysis

    USGS Publications Warehouse

    Poeter, Eileen P.; Hill, Mary C.

    2007-01-01

    This report documents the Multi-Model Analysis (MMA) computer code. MMA can be used to evaluate results from alternative models of a single system using the same set of observations for all models. As long as the observations, the observation weighting, and system being represented are the same, the models can differ in nearly any way imaginable. For example, they may include different processes, different simulation software, different temporal definitions (for example, steady-state and transient models could be considered), and so on. The multiple models need to be calibrated by nonlinear regression. Calibration of the individual models needs to be completed before application of MMA. MMA can be used to rank models and calculate posterior model probabilities. These can be used to (1) determine the relative importance of the characteristics embodied in the alternative models, (2) calculate model-averaged parameter estimates and predictions, and (3) quantify the uncertainty of parameter estimates and predictions in a way that integrates the variations represented by the alternative models. There is a lack of consensus on what model analysis methods are best, so MMA provides four default methods. Two are based on Kullback-Leibler information, and use the AIC (Akaike Information Criterion) or AICc (second-order-bias-corrected AIC) model discrimination criteria. The other two default methods are the BIC (Bayesian Information Criterion) and the KIC (Kashyap Information Criterion) model discrimination criteria. Use of the KIC criterion is equivalent to using the maximum-likelihood Bayesian model averaging (MLBMA) method. AIC, AICc, and BIC can be derived from Frequentist or Bayesian arguments. The default methods based on Kullback-Leibler information have a number of theoretical advantages, including that they tend to favor more complicated models as more data become available than do the other methods, which makes sense in many situations. Many applications of MMA will
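
    The model-ranking step can be illustrated directly: given each calibrated model's discrimination-criterion value, posterior model probabilities follow from the usual information-criterion weights. A sketch with hypothetical AIC values; the same formula applies to AICc, BIC, or KIC.

    ```python
    import numpy as np

    def ic_weights(ic_values):
        """Posterior model probabilities from information-criterion values:
        w_i = exp(-delta_i / 2) / sum_j exp(-delta_j / 2), delta_i = IC_i - min(IC)."""
        ic = np.asarray(ic_values, dtype=float)
        delta = ic - ic.min()
        w = np.exp(-0.5 * delta)
        return w / w.sum()

    # Hypothetical AIC values for three alternative calibrated models.
    aic = [231.4, 233.1, 240.8]
    print(ic_weights(aic).round(3))   # the first model carries most of the weight
    ```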

  1. A Geometric Modelling Approach to Determining the Best Sensing Coverage for 3-Dimensional Acoustic Target Tracking in Wireless Sensor Networks

    PubMed Central

    Pashazadeh, Saeid; Sharifi, Mohsen

    2009-01-01

    Existing 3-dimensional acoustic target tracking methods that use wired/wireless networked sensor nodes to track targets based on four sensing coverage do not always compute the feasible spatio-temporal information of target objects. To investigate this discrepancy in a formal setting, we propose a geometric model of the target tracking problem alongside its equivalent geometric dual model that is easier to solve. We then study and prove some properties of the dual model by exploiting its relationship with algebra. Based on these properties, we propose a four coverage axis line method based on four sensing coverage and prove that four sensing coverage always yields two dual correct answers; usually one of them is infeasible. By showing that the feasible answer can only sometimes be identified by a simple time test method such as the one proposed by ourselves, we prove that four sensing coverage fails to always yield the feasible spatio-temporal information of a target object. We further prove that five sensing coverage always gives the feasible position of a target object under certain conditions that are discussed in this paper. We propose three extensions to the four coverage axis line method, namely, the five coverage extent point method, the five coverage extended axis lines method, and the five coverage redundant axis lines method. The computational and time complexities of all four proposed methods are Θ(1) each, both in the worst case and on average. The proposed methods and the facts proved here about the capabilities of sensing coverage degree can be used in other methods of acoustic target tracking, such as Bayesian filtering methods. PMID:22423198
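
    Why five-fold coverage removes the ambiguity can be seen algebraically: with arrival times from five sensors, differencing the range equations against one reference sensor yields four linear equations in the four unknowns (x, y, z, t0). Below is a generic numpy sketch of this classical multilateration solve, not the authors' axis-line method.

    ```python
    import numpy as np

    C = 343.0   # speed of sound, m/s

    def locate(sensors, t_arrival):
        """Solve |p - s_i| = C (t_i - t0) for p and t0 by differencing each
        range equation against sensor 0; five sensors give four equations
        for the four unknowns (x, y, z, t0)."""
        s0, ta0 = sensors[0], t_arrival[0]
        A, b = [], []
        for s, t in zip(sensors[1:], t_arrival[1:]):
            A.append(np.concatenate([2 * (s - s0), [-2 * C**2 * (t - ta0)]]))
            b.append(s @ s - s0 @ s0 - C**2 * (t**2 - ta0**2))
        sol, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
        return sol[:3], sol[3]   # position, emission time

    # Check with a synthetic target and five non-coplanar sensors.
    sensors = np.array([[0., 0., 0.], [10., 0., 0.], [0., 10., 0.],
                        [0., 0., 10.], [10., 10., 5.]])
    target, t_emit = np.array([3., 4., 2.]), 0.1
    times = t_emit + np.linalg.norm(sensors - target, axis=1) / C
    pos, t0 = locate(sensors, times)
    print(np.allclose(pos, target), np.isclose(t0, t_emit))   # True True
    ```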

  2. Analysis of Drafting Effects in Swimming Using Computational Fluid Dynamics

    PubMed Central

    Silva, António José; Rouboa, Abel; Moreira, António; Reis, Victor Machado; Alves, Francisco; Vilas-Boas, João Paulo; Marinho, Daniel Almeida

    2008-01-01

    The purpose of this study was to determine the effect of drafting distance on the drag coefficient in swimming. A k-epsilon turbulence model was implemented in the commercial code Fluent® and applied to the fluid flow around two swimmers in a drafting situation. Numerical simulations were conducted for various distances between swimmers (0.5-8.0 m) and swimming velocities (1.6-2.0 m.s-1). The drag coefficient (Cd) was computed for each of the distances and velocities. We found that the drag coefficient of the leading swimmer decreased as the flow velocity increased. The relative drag coefficient of the back swimmer was lower (about 56% of the leading swimmer) for the smallest inter-swimmer distance (0.5 m). This value increased progressively until the distance between swimmers reached 6.0 m, where the relative drag coefficient of the back swimmer was about 84% of the leading swimmer. The results indicated that the Cd of the back swimmer was equal to that of the leading swimmer at distances ranging from 6.45 to 8.90 m. We conclude that these distances allow the swimmers to be in the same hydrodynamic conditions during training and competitions. Key points: (1) The drag coefficient of the leading swimmer decreased as the flow velocity increased. (2) The relative drag coefficient of the back swimmer was least (about 56% of the leading swimmer) for the smallest inter-swimmer distance (0.5 m). (3) The drag coefficient values of both swimmers in drafting were equal at distances ranging between 6.45 m and 8.90 m, considering the different flow velocities. (4) Numerical simulation techniques could be a good approach to enable the analysis of fluid forces around objects in water, as happens in swimming. PMID:24150135
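
    The drag coefficient reported by such simulations is the standard normalization of the computed drag force; a minimal sketch follows, with illustrative numbers that are not the study's values.

    ```python
    RHO_WATER = 1000.0   # kg/m^3

    def drag_coefficient(force_n: float, area_m2: float, velocity: float) -> float:
        """Cd = 2F / (rho * A * v^2), the normalization used to compare swimmers."""
        return 2.0 * force_n / (RHO_WATER * area_m2 * velocity**2)

    # Illustrative values: drag force from CFD, projected area, flow velocity.
    print(drag_coefficient(force_n=60.0, area_m2=0.1, velocity=1.8))   # ~0.37
    ```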

  3. Interface design of VSOP'94 computer code for safety analysis

    NASA Astrophysics Data System (ADS)

    Natsir, Khairina; Yazid, Putranto Ilham; Andiwijayakusuma, D.; Wahanani, Nursinta Adi

    2014-09-01

    Today, most software applications, including those in the nuclear field, come with a graphical user interface. VSOP'94 (Very Superior Old Program) was designed to simplify the process of performing reactor simulation. VSOP is an integrated code system that simulates the life history of a nuclear reactor and is devoted to education and research. One advantage of the VSOP program is its ability to calculate neutron spectrum estimation, fuel cycle, 2-D diffusion, resonance integrals, estimation of reactor fuel costs, and integrated thermal hydraulics. VSOP can also be used for comparative studies and simulation of reactor safety. However, the existing VSOP is a conventional program, developed in Fortran 65, and has several usability problems: for example, it operates only on DEC Alpha mainframe platforms, provides text-based output, and is difficult to use, especially in data preparation and interpretation of results. We developed GUI-VSOP, an interface program to facilitate the preparation of data, run the VSOP code, and read the results in a more user-friendly way, usable on a personal computer (PC). Modifications include the development of interfaces for preprocessing, processing and postprocessing. The GUI-based preprocessing interface aims to provide a convenient way of preparing data. The processing interface is intended to provide convenience in configuring input files and libraries and compiling the VSOP code. The postprocessing interface is designed to visualize the VSOP output in table and graphic forms. GUI-VSOP is expected to be useful in simplifying and speeding up the process and analysis of safety aspects.

  4. Interface design of VSOP'94 computer code for safety analysis

    SciTech Connect

    Natsir, Khairina Andiwijayakusuma, D.; Wahanani, Nursinta Adi; Yazid, Putranto Ilham

    2014-09-30

    Today, most software applications, including those in the nuclear field, come with a graphical user interface. VSOP'94 (Very Superior Old Program) was designed to simplify the process of performing reactor simulation. VSOP is an integrated code system that simulates the life history of a nuclear reactor and is devoted to education and research. One advantage of the VSOP program is its ability to calculate neutron spectrum estimation, fuel cycle, 2-D diffusion, resonance integrals, estimation of reactor fuel costs, and integrated thermal hydraulics. VSOP can also be used for comparative studies and simulation of reactor safety. However, the existing VSOP is a conventional program, developed in Fortran 65, and has several usability problems: for example, it operates only on DEC Alpha mainframe platforms, provides text-based output, and is difficult to use, especially in data preparation and interpretation of results. We developed GUI-VSOP, an interface program to facilitate the preparation of data, run the VSOP code, and read the results in a more user-friendly way, usable on a personal computer (PC). Modifications include the development of interfaces for preprocessing, processing and postprocessing. The GUI-based preprocessing interface aims to provide a convenient way of preparing data. The processing interface is intended to provide convenience in configuring input files and libraries and compiling the VSOP code. The postprocessing interface is designed to visualize the VSOP output in table and graphic forms. GUI-VSOP is expected to be useful in simplifying and speeding up the process and analysis of safety aspects.

  5. Analysis of Sacrococcygeal Morphology in Koreans Using Computed Tomography

    PubMed Central

    Yoon, Min Geun; Moon, Myung-Sang; Park, Bong Keun; Kim, Dong-Hyeon

    2016-01-01

    Background The sacrococcygeal morphology of Arabs and Europeans has been studied using computed tomography (CT) or magnetic resonance imaging to determine the cause of coccydynia. Studies have suggested differences in sacrococcygeal morphology among ethnic groups. However, there are no data on the sacrococcygeal anatomy of Koreans. Methods We conducted a retrospective analysis of 606 pelvic CT scans that were taken at Cheju Halla General Hospital between 2008 and 2014. Fractures of the sacrum or coccyx were excluded. Differences in the sacrococcygeal morphology among age groups stratified by decade of life and between genders were analyzed using sagittal plane pelvic CT scans. The morphological parameters studied were the sacral and coccygeal curved indexes, sacrococcygeal angle, intercoccygeal angle, coccygeal type, coccygeal segmental number, and sacrococcygeal fusion. Results The average sacral and coccygeal curved indexes were 6.15 and 7.41, respectively. The average sacrococcygeal and intercoccygeal angles were 110° and 49°, respectively. Type II coccyx was most common, and the rate of sacrococcygeal fusion was 34%. There was a moderate positive correlation between age and the sacral curved index (r = 0.493, p = 0.000) and a weak negative correlation between age and the coccyx curved index (r = −0.257, p = 0.000). There was a weak negative correlation between age and the intercoccygeal angle (r = −0.187, p = 0.000). The average intercoccygeal angle in males and females was 53.9° and 44.7°, respectively. Conclusions The sacrum tended to be more curved and the coccyx straighter with age. The coccyx was straighter in females than males. Knowledge of the sacrococcygeal anatomy of Koreans will promote better understanding of anatomical differences among ethnicities and future studies on coccydynia. PMID:27904724

  6. Olfactory cleft computed tomography analysis and olfaction in chronic rhinosinusitis

    PubMed Central

    Kohli, Preeti; Schlosser, Rodney J.; Storck, Kristina

    2016-01-01

    Background: Volumetric analysis of the olfactory cleft by using computed tomography has been associated with olfaction in patients with chronic rhinosinusitis (CRS). However, existing studies have not comprehensively measured olfaction, and it thus remains unknown whether correlations differ across specific dimensions of odor perception. Objective: To use comprehensive measures of patient-reported and objective olfaction to evaluate the relationship between volumetric olfactory cleft opacification and olfaction. Methods: Olfaction in patients with CRS was evaluated by using “Sniffin' Sticks” tests and a modified version of the Questionnaire of Olfactory Disorders. Olfactory cleft opacification was quantified by using two- and three-dimensional, computerized volumetric analysis. Correlations between olfactory metrics and olfactory cleft opacification were then calculated. Results: The overall CRS cohort included 26 patients without nasal polyposis (CRSsNP) (68.4%) and 12 patients with nasal polyposis (CRSwNP) (31.6%). Across the entire cohort, total olfactory cleft opacification was 82.8%, with greater opacification in the CRSwNP subgroup compared with CRSsNP (92.3 versus 78.4%, p < 0.001). The percent total volume opacification correlated with the total Sniffin' Sticks score (r = −0.568, p < 0.001) as well as individual threshold, discrimination, and identification scores (p < 0.001 for all). Within the CRSwNP subgroup, threshold (r = −0.616, p = 0.033) and identification (r = −0.647, p = 0.023) remained highly correlated with total volume opacification. In patients with CRSsNP, the threshold correlated with total volume scores (r = −0.457, p = 0.019), with weaker and nonsignificant correlations for discrimination and identification. Correlations between total volume opacification and the Questionnaire of Olfactory Disorders were qualitatively similar to objective olfactory findings in both CRSwNP (r = −0.566, p = 0.070) and CRSsNP (r = −0.310, p

  7. Measurement Performance of a Computer Assisted Vertebral Motion Analysis System

    PubMed Central

    Davis, Reginald J.; Lee, David C.; Cheng, Boyle

    2015-01-01

    Background Segmental instability of the lumbar spine is a significant cost within the US health care system; however, current thresholds for indication of radiographic instability are not well defined. Purpose To determine the measurement performance of sagittal lumbar intervertebral measurements using computer-assisted measurements of the lumbar spine with motion sequences from a video-fluoroscopic technique. Study design Sensitivity, specificity, predictive values, prevalence, and test-retest reliability evaluation of digitized manual versus computer-assisted measurements of the lumbar spine. Patient sample A total of 2239 intervertebral levels from 509 symptomatic patients, and 287 intervertebral levels from 73 asymptomatic participants were retrospectively evaluated. Outcome measures Specificity, sensitivity, negative predictive value (NPV), diagnostic accuracy, and prevalence between the two measurement techniques; measurements of coefficient of repeatability (CR), limits of agreement (LOA), intraclass correlation coefficient (ICC; type 3,1), and standard error of measurement for both measurement techniques. Methods Asymptomatic individuals and symptomatic patients were all evaluated using both the Vertebral Motion Analysis (VMA) system and fluoroscopic flexion extension static radiographs (FE). The analysis was compared to known thresholds of 15% intervertebral translation (IVT, equivalent to 5.3 mm assuming a 35 mm vertebral body depth) and 25° intervertebral rotation (IVR). Results The VMA measurements demonstrated greater specificity, sensitivity, NPV, prevalence, and reliability compared with FE for radiographic evidence of instability. Specificity was 99.4% and 99.1% in the VMA compared to 98.3% and 98.2% in the FE for IVR and IVT, respectively. Sensitivity in this study was 41.2% and 44.6% greater in the VMA compared to the FE for IVR and IVT, respectively. NPV was 91% and 88% in the VMA compared to 62% and 66% in the FE for IVR and IVT

  8. [Computer-based image analysis for experimental and clinical morphology--principles, utilization and marginal limits].

    PubMed

    Seufert, R; Pfarrer, C; Leiser, R; Lellé, R

    1999-01-01

    The new computer-based image analysis techniques are powerful tools for morphometric and quantitative image analysis in clinical and experimental morphology. Digital image analysis requires a distinction between two phases: (1) generation of fundamental data (x,y coordinates and grey values of the pixels) and (2) calculation of parameters from these data. Stereological procedures are very powerful for quantifying morphological phenomena, but computer-based image analysis techniques allow multiple analyses of morphological objects and analysis of statistical distributions. There is great scientific benefit in using modern computer-based image analysis techniques.

  9. The analysis of control trajectories using symbolic and database computing

    NASA Technical Reports Server (NTRS)

    Grossman, Robert

    1995-01-01

    This final report comprises the formal semi-annual status reports for this grant for the periods June 30-December 31, 1993, January 1-June 30, 1994, and June 1-December 31, 1994. The research supported by this grant is broadly concerned with the symbolic computation, mixed numeric-symbolic computation, and database computation of trajectories of dynamical systems, especially control systems. A review of work during the report period covers: trajectories and approximating series, the Cayley algebra of trees, actions of differential operators, geometrically stable integration algorithms, hybrid systems, trajectory stores, PTool, and other activities. A list of publications written during the report period is attached.

  10. Performance issues for engineering analysis on MIMD parallel computers

    SciTech Connect

    Fang, H.E.; Vaughan, C.T.; Gardner, D.R.

    1994-08-01

    We discuss how engineering analysts can obtain greater computational resolution in a more timely manner from applications codes running on MIMD parallel computers. Both processor speed and memory capacity are important to achieving better performance than a serial vector supercomputer. To obtain good performance, a parallel applications code must be scalable. In addition, the aspect ratios of the subdomains in the decomposition of the simulation domain onto the parallel computer should be of order 1. We demonstrate these conclusions using simulations conducted with the PCTH shock wave physics code running on a Cray Y-MP, a 1024-node nCUBE 2, and an 1840-node Paragon.

  11. Computer programs: Information retrieval and data analysis, a compilation

    NASA Technical Reports Server (NTRS)

    1972-01-01

    The items presented in this compilation are divided into two sections. Section one treats of computer usage devoted to the retrieval of information that affords the user rapid entry into voluminous collections of data on a selective basis. Section two is a more generalized collection of computer options for the user who needs to take such data and reduce it to an analytical study within a specific discipline. These programs, routines, and subroutines should prove useful to users who do not have access to more sophisticated and expensive computer software.

  12. Computational Environments and Analysis methods available on the NCI High Performance Computing (HPC) and High Performance Data (HPD) Platform

    NASA Astrophysics Data System (ADS)

    Evans, B. J. K.; Foster, C.; Minchin, S. A.; Pugh, T.; Lewis, A.; Wyborn, L. A.; Evans, B. J.; Uhlherr, A.

    2014-12-01

    The National Computational Infrastructure (NCI) has established a powerful in-situ computational environment to enable both high performance computing and data-intensive science across a wide spectrum of national environmental data collections - in particular climate, observational data and geoscientific assets. This paper examines 1) the computational environments that supports the modelling and data processing pipelines, 2) the analysis environments and methods to support data analysis, and 3) the progress in addressing harmonisation of the underlying data collections for future transdisciplinary research that enable accurate climate projections. NCI makes available 10+ PB major data collections from both the government and research sectors based on six themes: 1) weather, climate, and earth system science model simulations, 2) marine and earth observations, 3) geosciences, 4) terrestrial ecosystems, 5) water and hydrology, and 6) astronomy, social and biosciences. Collectively they span the lithosphere, crust, biosphere, hydrosphere, troposphere, and stratosphere. The data is largely sourced from NCI's partners (which include the custodians of many of the national scientific records), major research communities, and collaborating overseas organisations. The data is accessible within an integrated HPC-HPD environment - a 1.2 PFlop supercomputer (Raijin), a HPC class 3000 core OpenStack cloud system and several highly connected large scale and high-bandwidth Lustre filesystems. This computational environment supports a catalogue of integrated reusable software and workflows from earth system and ecosystem modelling, weather research, satellite and other observed data processing and analysis. To enable transdisciplinary research on this scale, data needs to be harmonised so that researchers can readily apply techniques and software across the corpus of data available and not be constrained to work within artificial disciplinary boundaries. Future challenges will

  13. Retrodeformable cross sections for 3-dimensional structural analysis, Ouachita orogen, Arkansas

    NASA Astrophysics Data System (ADS)

    Johnson, H. E.; Wiltschko, D. V.

    2010-12-01

    A fundamental tectonic problem is how deformation proceeds from hinterland to foreland in a fold and thrust belt (FTB). Wedge models explain many of the first-order observations found in most FTBs, such as the internal deformation of material, thickening of the hinterland, the presence of a basal décollement, and an overall wedge shape that tapers toward the foreland. These models have not yet been tested at the scale of individual folds and faults. Moreover, most of the available data on, for instance, the sequence of events are best dated in the syntectonic sediments. The timing of uplift and motion of interior structures remains unclear when inferred from dates on these syntectonic sediments, in part because an absolute connection between the two is lacking. The purpose of this project is to develop a model for the evolution of the Ouachita orogen through the construction of a series of retrodeformable cross sections. A novel aspect of these cross sections is the combination of new and published thermal (i.e., illite ‘crystallinity’) and thermochronologic (i.e., zircon fission track) data collected at a variety of stratigraphic depths along the lines of section. These data will help to determine the cessation of thrust motion as well as the initial depth from which each thrust sheet emerged. An Ordovician Mazarn sample in the eastern exposed orogenic core has zircon grains with 55% reset fission track ages, whereas an overlying Ordovician Blakely sample ~30 km to the southwest along strike has only 15% reset. Illite ‘crystallinity’ (IC) values indicate maximum burial metamorphism temperatures of anchizone grade (~250-350°C), coinciding with the location of the Ordovician Mazarn sample. Regionally, IC decreases from the culmination of the Benton Uplift toward the southwest along regional strike for samples of similar stratigraphic age. These new timing and thermal constraints on an improved kinematic model are the necessary first steps in testing wedge models on an individual thrust sheet basis.

  14. Identifying Musical Performance Behavior in Instrumentalists Using Computer-Based Sound Spectrum Analysis.

    ERIC Educational Resources Information Center

    Rees, Fred J.; Michelis, Rainer M.

    1991-01-01

    Examines relationships between musical information processed by a computer-based sound analysis system and the audiovisual record of a performer's response to musical assignments. Concludes that computer analysis permits identification of performance behavior. Suggests that a database could be designed to provide both diagnostic response to…
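
    As a hedged sketch of what computer-based sound spectrum analysis typically involves (the signal and parameters below are synthetic, not from the study): compute a magnitude spectrum and extract features such as the dominant pitch.

    ```python
    import numpy as np

    def dominant_pitch(signal, sample_rate):
        """Return the frequency of the strongest spectral peak - the kind of
        feature a sound-spectrum system might log while a student performs."""
        windowed = signal * np.hanning(len(signal))   # window to reduce leakage
        spectrum = np.abs(np.fft.rfft(windowed))
        freqs = np.fft.rfftfreq(len(signal), d=1.0 / sample_rate)
        return freqs[np.argmax(spectrum)]

    # Synthetic 440 Hz tone, one second at 8 kHz:
    sr = 8000
    t = np.arange(sr) / sr
    print(dominant_pitch(np.sin(2 * np.pi * 440.0 * t), sr))  # ~440.0
    ```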

  15. Analysis and interpretation of arterial sounds using a small clinical computer system

    NASA Technical Reports Server (NTRS)

    Dewey, C. F., Jr.; Metzinger, R. W.; Holford, S. K.; Klitzner, T. S.

    1973-01-01

    A small, mobile bedside computer system is described that is capable of performing phonoangiographic analyses as well as many other common data analysis tasks in a hospital. The clinical application of phonoangiography is found to be greatly facilitated by the data acquisition and analysis capabilities the computer provides.

  16. Spreadsheet Analysis Of Queuing In A Computer Network

    NASA Technical Reports Server (NTRS)

    Galant, David C.

    1992-01-01

    Method of analyzing responses of computer network based on simple queuing-theory mathematical models via spreadsheet program. Effects of variations in traffic, capacities of channels, and message protocols assessed.
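
    For concreteness, the M/M/1 formulas such a spreadsheet would tabulate are easy to state; the sketch below uses invented rates and illustrates only the genre of model the memo describes.

    ```python
    def mm1_metrics(arrival_rate, service_rate):
        """Classic M/M/1 queue formulas, the kind a spreadsheet cell would hold."""
        rho = arrival_rate / service_rate            # channel utilization
        assert rho < 1.0, "queue is unstable at utilization >= 1"
        queued = rho * rho / (1.0 - rho)             # mean number waiting, Lq
        wait = queued / arrival_rate                 # mean queueing delay, Wq (Little's law)
        response = wait + 1.0 / service_rate         # mean response time, W
        return rho, queued, response

    # Vary traffic on a channel that serves 100 messages/s:
    for lam in (50.0, 80.0, 95.0):
        rho, lq, w = mm1_metrics(lam, 100.0)
        print(f"load={lam:5.1f}/s  utilization={rho:.2f}  queued={lq:5.2f}  response={w * 1000:6.1f} ms")
    ```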

  17. Computational Interpretations of Analysis via Products of Selection Functions

    NASA Astrophysics Data System (ADS)

    Escardó, Martín; Oliva, Paulo

    We show that the computational interpretation of full comprehension via two well-known functional interpretations (dialectica and modified realizability) corresponds to two closely related infinite products of selection functions.
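
    To make the construction concrete, here is a minimal, non-dependent rendering of the binary product of selection functions; the infinite product used to interpret full comprehension iterates this idea. The example outcome function and finite move sets are illustrative only.

    ```python
    def binary_product(eps, delta):
        """Binary product of selection functions: picks a pair (x, y) optimal
        for the outcome function p, by backward induction on the second move."""
        def selection(p):                        # p : (x, y) -> R
            def b(x):                            # best second move, given first move x
                return delta(lambda y: p((x, y)))
            a = eps(lambda x: p((x, b(x))))      # best first move, anticipating b
            return (a, b(a))
        return selection

    def argmax_over(moves):
        """The selection function that maximizes its argument over a finite set."""
        return lambda q: max(moves, key=q)

    # Jointly maximize p over {0,1,2} x {0,1,2} by composing two 1-D searches:
    pick = binary_product(argmax_over([0, 1, 2]), argmax_over([0, 1, 2]))
    print(pick(lambda xy: -(xy[0] - 1) ** 2 - (xy[1] - 2) ** 2))  # -> (1, 2)
    ```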

  18. Finite Element Analysis in Concurrent Processing: Computational Issues

    NASA Technical Reports Server (NTRS)

    Sobieszczanski-Sobieski, Jaroslaw; Watson, Brian; Vanderplaats, Garrett

    2004-01-01

    The purpose of this research is to investigate the potential application of new methods for solving large-scale static structural problems on concurrent computers. It is well known that single-processor computational speed is bounded by inherent physical limits; the only path to higher computational speeds lies through concurrent processing. Traditional factorization methods for sparse matrices are ill suited to concurrent processing because null entries become filled (fill-in), leading to high communication and memory requirements. The research reported herein investigates alternatives to factorization that promise greater potential for high concurrent computing efficiency. Two methods, and their variants, based on direct energy minimization are studied: a) minimization of the strain energy using the displacement method formulation; b) constrained minimization of the complementary strain energy using the force method formulation. Initial results indicated that, in the context of direct energy minimization, the displacement formulation experienced convergence and accuracy difficulties, while the force formulation showed promising potential.
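
    A minimal sketch of the displacement-formulation idea: minimize the strain energy 0.5*u^T K u - f^T u by conjugate gradients, so that only matrix-vector products (which parallelize well and cause no fill-in) are needed. The stiffness matrix and load below are toy stand-ins, and a production code would add preconditioning.

    ```python
    import numpy as np

    def minimize_strain_energy(K, f, tol=1e-10, max_iter=1000):
        """Minimize 0.5*u^T K u - f^T u by conjugate gradients; the gradient
        is K u - f, so the minimizer solves K u = f without factorizing K."""
        u = np.zeros_like(f)
        r = f - K @ u                     # residual = negative energy gradient
        d = r.copy()
        for _ in range(max_iter):
            Kd = K @ d
            alpha = (r @ r) / (d @ Kd)    # exact line search along d
            u += alpha * d
            r_new = r - alpha * Kd
            if np.linalg.norm(r_new) < tol:
                break
            d = r_new + ((r_new @ r_new) / (r @ r)) * d
            r = r_new
        return u

    # Toy 1-D bar stiffness matrix (tridiagonal, SPD) with a unit end load:
    n = 5
    K = 2 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)
    f = np.zeros(n); f[-1] = 1.0
    print(minimize_strain_energy(K, f))   # -> [1, 2, 3, 4, 5] / 6
    ```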

  19. Methodology for Uncertainty Analysis of Dynamic Computational Toxicology Models

    EPA Science Inventory

    The task of quantifying the uncertainty in both parameter estimates and model predictions has become more important with the increased use of dynamic computational toxicology models by the EPA. Dynamic toxicological models include physiologically-based pharmacokinetic (PBPK) models...
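
    Although the abstract is truncated, the core technique of uncertainty analysis for a dynamic model is straightforward to sketch: sample the uncertain parameters, run the model forward, and summarize the spread of predictions. Everything below (a one-compartment stand-in for a PBPK model, and all parameter values) is invented for illustration.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def concentration(t, dose, volume, k_elim):
        """One-compartment model with first-order elimination - a toy stand-in
        for the far richer PBPK models the abstract refers to."""
        return dose / volume * np.exp(-k_elim * t)

    # Propagate lognormal parameter uncertainty by Monte Carlo sampling:
    t = np.linspace(0.0, 24.0, 49)                              # hours
    n = 10_000
    vol = rng.lognormal(mean=np.log(42.0), sigma=0.2, size=n)   # L (illustrative)
    kel = rng.lognormal(mean=np.log(0.1), sigma=0.3, size=n)    # 1/h (illustrative)
    curves = concentration(t[None, :], 100.0, vol[:, None], kel[:, None])

    lo, med, hi = np.percentile(curves, [2.5, 50, 97.5], axis=0)
    print(f"t=12 h: median {med[24]:.3f} mg/L, 95% interval [{lo[24]:.3f}, {hi[24]:.3f}]")
    ```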

  20. An analysis of symbolic linguistic computing models in decision making

    NASA Astrophysics Data System (ADS)

    Rodríguez, Rosa M.; Martínez, Luis

    2013-01-01

    It is common for experts involved in complex real-world decision problems to use natural language to express their knowledge in uncertain frameworks. Language is inherently vague, so probabilistic decision models are not very suitable in such cases. Therefore, other tools, such as fuzzy logic and fuzzy linguistic approaches, have been used successfully to model and manage such vagueness. The use of linguistic information implies operating with that type of information, i.e., processes of computing with words (CWW). Different schemes have been proposed to handle these processes, and diverse symbolic linguistic computing models have been introduced to accomplish the linguistic computations. In this paper, we overview the relationship between decision making and CWW, and focus on the symbolic linguistic computing models that have been widely used in linguistic decision making, analysing whether all of them can be considered to lie within the CWW paradigm.
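
    For context, one of the most widely used symbolic linguistic computing models is the 2-tuple representation, which pairs a label with a symbolic translation in [-0.5, 0.5) so that aggregation loses no information. The sketch below illustrates that general idea and is not code from the paper; the label set is invented.

    ```python
    LABELS = ["none", "very_low", "low", "medium", "high", "very_high", "perfect"]

    def to_two_tuple(beta):
        """Map an aggregation value beta in [0, g] to (label, symbolic translation)."""
        i = int(round(beta))
        return LABELS[i], round(beta - i, 6)

    def linguistic_weighted_mean(indices, weights):
        """Aggregate label indices symbolically, then translate back to a 2-tuple."""
        beta = sum(i * w for i, w in zip(indices, weights)) / sum(weights)
        return to_two_tuple(beta)

    # Three experts rate an alternative low (2), medium (3), and high (4):
    print(linguistic_weighted_mean([2, 3, 4], [0.5, 0.25, 0.25]))  # ('medium', -0.25)
    ```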