Sample records for "create computational models"

  1. Evaluation of a computerized aid for creating human behavioral representations of human-computer interaction.

    PubMed

    Williams, Kent E; Voigt, Jeffrey R

    2004-01-01

    The research reported herein presents the results of an empirical evaluation that focused on the accuracy and reliability of cognitive models created using a computerized tool: the cognitive analysis tool for human-computer interaction (CAT-HCI). A sample of participants, expert in interacting with a newly developed tactical display for the U.S. Army's Bradley Fighting Vehicle, individually modeled their knowledge of 4 specific tasks employing the CAT-HCI tool. Measures of the accuracy and consistency of task models created by these task domain experts using the tool were compared with task models created by a double expert. The findings indicated a high degree of consistency and accuracy between the different "single experts" in the task domain in terms of the resultant models generated using the tool. Actual or potential applications of this research include assessing human-computer interaction complexity, determining the productivity of human-computer interfaces, and analyzing an interface design to determine whether methods can be automated.

  2. The benefits of 3D modelling and animation in medical teaching.

    PubMed

    Vernon, Tim; Peckham, Daniel

    2002-12-01

    Three-dimensional models created using materials such as wax, bronze and ivory, have been used in the teaching of medicine for many centuries. Today, computer technology allows medical illustrators to create virtual three-dimensional medical models. This paper considers the benefits of using still and animated output from computer-generated models in the teaching of medicine, and examines how three-dimensional models are made.

  3. Methods for Creating and Animating a Computer Model Depicting the Structure and Function of the Sarcoplasmic Reticulum Calcium ATPase Enzyme.

    ERIC Educational Resources Information Center

    Chen, Alice Y.; McKee, Nancy

    1999-01-01

    Describes the developmental process used to visualize the calcium ATPase enzyme of the sarcoplasmic reticulum which involves evaluating scientific information, consulting scientists, model making, storyboarding, and creating and editing in a computer medium. (Author/CCM)

  4. Artificial Intelligence and the High School Computer Curriculum.

    ERIC Educational Resources Information Center

    Dillon, Richard W.

    1993-01-01

    Describes a four-part curriculum that can serve as a model for incorporating artificial intelligence (AI) into the high school computer curriculum. The model includes examining questions fundamental to AI, creating and designing an expert system, language processing, and creating programs that integrate machine vision with robotics and…

  5. A Computational Workflow for the Automated Generation of Models of Genetic Designs.

    PubMed

    Misirli, Göksel; Nguyen, Tramy; McLaughlin, James Alastair; Vaidyanathan, Prashant; Jones, Timothy S; Densmore, Douglas; Myers, Chris; Wipat, Anil

    2018-06-05

    Computational models are essential to engineer predictable biological systems and to scale up this process for complex systems. Computational modeling often requires expert knowledge and data to build models. Clearly, manual creation of models is not scalable for large designs. Despite several automated model construction approaches, computational methodologies to bridge knowledge in design repositories and the process of creating computational models have still not been established. This paper describes a workflow for automatic generation of computational models of genetic circuits from data stored in design repositories using existing standards. This workflow leverages the software tool SBOLDesigner to build structural models that are then enriched by the Virtual Parts Repository API using Systems Biology Open Language (SBOL) data fetched from the SynBioHub design repository. The iBioSim software tool is then utilized to convert this SBOL description into a computational model encoded using the Systems Biology Markup Language (SBML). Finally, this SBML model can be simulated using a variety of methods. This workflow provides synthetic biologists with easy to use tools to create predictable biological systems, hiding away the complexity of building computational models. This approach can further be incorporated into other computational workflows for design automation.
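
    As a concrete illustration of the repository-access stage of such a workflow, the sketch below fetches an SBOL description of a part from SynBioHub over HTTP. This is a minimal sketch under stated assumptions: the example part URI (a public iGEM part) and the /sbol content endpoint are illustrative, not a prescribed interface of this paper's pipeline.

        # Minimal sketch: fetch the SBOL (RDF/XML) serialization of a design
        # from a SynBioHub repository. The part URI below is an example from
        # the public iGEM collection; treat it and the "/sbol" endpoint as
        # assumptions for illustration, not this paper's documented API.
        import requests

        PART_URI = "https://synbiohub.org/public/igem/BBa_B0034/1"  # example (assumption)

        def fetch_sbol(part_uri: str) -> str:
            """Fetch the SBOL serialization of a design from SynBioHub."""
            response = requests.get(part_uri + "/sbol", timeout=30)
            response.raise_for_status()
            return response.text

        if __name__ == "__main__":
            sbol_xml = fetch_sbol(PART_URI)
            print(sbol_xml[:200])  # show the start of the RDF/XML document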

  6. Creation of Anatomically Accurate Computer-Aided Design (CAD) Solid Models from Medical Images

    NASA Technical Reports Server (NTRS)

    Stewart, John E.; Graham, R. Scott; Samareh, Jamshid A.; Oberlander, Eric J.; Broaddus, William C.

    1999-01-01

    Most surgical instrumentation and implants used in the world today are designed with sophisticated Computer-Aided Design (CAD)/Computer-Aided Manufacturing (CAM) software. This software automates the mechanical development of a product from its conceptual design through manufacturing. CAD software also provides a means of manipulating solid models prior to Finite Element Modeling (FEM). Few surgical products are designed in conjunction with accurate CAD models of human anatomy because of the difficulty with which these models are created. We have developed a novel technique that creates anatomically accurate, patient specific CAD solids from medical images in a matter of minutes.
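
    The authors' technique itself is not detailed in the abstract; as a hedged illustration of one common first step from an image volume toward surface geometry, the sketch below extracts an isosurface with scikit-image's marching cubes, using a synthetic sphere volume in place of a CT/MRI stack.

        # A minimal sketch (not the authors' method) of one common first step
        # in turning medical image volumes into surface geometry: isosurface
        # extraction with marching cubes. A synthetic sphere stands in for a
        # CT/MRI image stack.
        import numpy as np
        from skimage import measure

        # Synthetic "image stack": a solid sphere of high intensity in a 64^3 volume.
        z, y, x = np.mgrid[-1:1:64j, -1:1:64j, -1:1:64j]
        volume = (x**2 + y**2 + z**2 < 0.5**2).astype(np.float32)

        # Extract the surface at the chosen intensity threshold.
        verts, faces, normals, values = measure.marching_cubes(volume, level=0.5)
        print(f"{len(verts)} vertices, {len(faces)} triangular faces")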

  7. Creating a New Model Curriculum: A Rationale for "Computing Curricula 1990".

    ERIC Educational Resources Information Center

    Bruce, Kim B.

    1991-01-01

    Describes a model for the design of undergraduate curricula in the discipline of computing that was developed by the ACM/IEEE (Association for Computing Machinery/Institute of Electrical and Electronics Engineers) Computer Society Joint Curriculum Task Force. Institutional settings and structures in which computing degrees are awarded are…

  8. Learning with Artificial Worlds: Computer-Based Modelling in the Curriculum.

    ERIC Educational Resources Information Center

    Mellar, Harvey, Ed.; And Others

    With the advent of the British National Curriculum, computer-based modeling has become an integral part of the school curriculum. This book is about modeling in education and providing children with computer tools to create and explore representations of the world. Members of the London Mental Models Group contributed their research: (1)…

  9. Fluid-Structure Interaction Modeling of Modified-Porosity Parachutes and Parachute Clusters

    NASA Astrophysics Data System (ADS)

    Boben, Joseph J.

    To increase aerodynamic performance, the geometric porosity of a ringsail spacecraft parachute canopy is sometimes increased, beyond the "rings" and "sails" with hundreds of "ring gaps" and "sail slits." This creates extra computational challenges for fluid-structure interaction (FSI) modeling of clusters of such parachutes, beyond those created by the lightness of the canopy structure, geometric complexities of hundreds of gaps and slits, and the contact between the parachutes of the cluster. In FSI computation of parachutes with such "modified geometric porosity," the flow through the "windows" created by the removal of the panels and the wider gaps created by the removal of the sails cannot be accurately modeled with the Homogenized Modeling of Geometric Porosity (HMGP), which was introduced to deal with the hundreds of gaps and slits. The flow needs to be actually resolved. All these computational challenges need to be addressed simultaneously in FSI modeling of clusters of spacecraft parachutes with modified geometric porosity. The core numerical technology is the Stabilized Space-Time FSI (SSTFSI) technique, and the contact between the parachutes is handled with the Surface-Edge-Node Contact Tracking (SENCT) technique. In the computations reported here, in addition to the SSTFSI and SENCT techniques and HMGP, we use the special techniques we have developed for removing the numerical spinning component of the parachute motion and for restoring the mesh integrity without a remesh. We present results for 2- and 3-parachute clusters with two different payload models. We also present the FSI computations we carried out for a single, subscale modified-porosity parachute.

  10. Development of Intelligent Computer-Assisted Instruction Systems to Facilitate Reading Skills of Learning-Disabled Children

    DTIC Science & Technology

    1993-12-01

    The purpose of this thesis is to develop a high-level model to create self-adapting software which teaches learning... stimulating and demanding. The power of the system model described herein is that it can vary as needed by the individual student. The system will

  11. Cloud Computing for radiologists.

    PubMed

    Kharat, Amit T; Safvi, Amjad; Thind, Ss; Singh, Amarjit

    2012-07-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software and hardware on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, clients, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on the maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It frees radiology from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues, which need to be addressed to ensure the success of Cloud computing in the future.

  12. Cloud Computing for radiologists

    PubMed Central

    Kharat, Amit T; Safvi, Amjad; Thind, SS; Singh, Amarjit

    2012-01-01

    Cloud computing is a concept wherein a computer grid is created using the Internet with the sole purpose of utilizing shared resources such as computer software and hardware on a pay-per-use model. Using Cloud computing, radiology users can efficiently manage multimodality imaging units by using the latest software and hardware without paying huge upfront costs. Cloud computing systems usually work on public, private, hybrid, or community models. Using the various components of a Cloud, such as applications, clients, infrastructure, storage, services, and processing power, Cloud computing can help imaging units rapidly scale and descale operations and avoid huge spending on the maintenance of costly applications and storage. Cloud computing allows flexibility in imaging. It frees radiology from the confines of a hospital and creates a virtual mobile office. The downsides to Cloud computing involve security and privacy issues, which need to be addressed to ensure the success of Cloud computing in the future. PMID:23599560

  13. The Impact of Three-Dimensional Computational Modeling on Student Understanding of Astronomical Concepts: A Quantitative Analysis

    ERIC Educational Resources Information Center

    Hansen, John; Barnett, Michael; MaKinster, James; Keating, Thomas

    2004-01-01

    The increased availability of computational modeling software has created opportunities for students to engage in scientific inquiry through constructing computer-based models of scientific phenomena. However, despite the growing trend of integrating technology into science curricula, educators need to understand what aspects of these technologies…

  14. Vehicle - Bridge interaction, comparison of two computing models

    NASA Astrophysics Data System (ADS)

    Melcer, Jozef; Kuchárová, Daniela

    2017-07-01

    The paper presents the calculation of the bridge response to a vehicle moving along the bridge at various velocities. A multi-body planar computing model of the vehicle is adopted. The bridge computing models are created in two variants: one represents the bridge as a Bernoulli-Euler beam with continuously distributed mass, and the second represents it as a lumped-mass model with 1 degree of freedom. The mid-span bridge dynamic deflections are calculated for both computing models. The results are mutually compared and quantitatively evaluated.
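
    As a concrete illustration of the simpler of the two variants, the sketch below integrates a single-degree-of-freedom lumped-mass bridge model under a constant force crossing the span, with a sinusoidal mode shape projecting the moving load onto the single mode. All parameter values are illustrative, not taken from the paper.

        # Minimal sketch of a 1-DOF lumped-mass bridge model under a constant
        # force P crossing a span L at speed v. The mode shape
        # phi(x) = sin(pi x / L) of a simply supported beam projects the
        # moving load onto the single mode; all parameters are illustrative.
        import numpy as np
        from scipy.integrate import solve_ivp

        L_span = 30.0      # span (m)
        m = 1.0e4          # modal mass (kg)
        f1 = 4.0           # first natural frequency (Hz)
        k = m * (2 * np.pi * f1) ** 2  # modal stiffness (N/m)
        P = 1.0e5          # moving force (N), e.g. vehicle weight
        v = 20.0           # vehicle speed (m/s)

        def rhs(t, y):
            u, du = y
            x = v * t                        # current load position
            phi = np.sin(np.pi * x / L_span) if 0.0 <= x <= L_span else 0.0
            return [du, (P * phi - k * u) / m]

        t_end = L_span / v                   # time for the load to cross the span
        sol = solve_ivp(rhs, (0.0, t_end), [0.0, 0.0], max_step=1e-3)
        print(f"max mid-span modal deflection: {sol.y[0].max() * 1e3:.2f} mm")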

  15. Fluid-structure interaction modeling of clusters of spacecraft parachutes with modified geometric porosity

    NASA Astrophysics Data System (ADS)

    Takizawa, Kenji; Tezduyar, Tayfun E.; Boben, Joseph; Kostov, Nikolay; Boswell, Cody; Buscher, Austin

    2013-12-01

    To increase aerodynamic performance, the geometric porosity of a ringsail spacecraft parachute canopy is sometimes increased, beyond the "rings" and "sails" with hundreds of "ring gaps" and "sail slits." This creates extra computational challenges for fluid-structure interaction (FSI) modeling of clusters of such parachutes, beyond those created by the lightness of the canopy structure, geometric complexities of hundreds of gaps and slits, and the contact between the parachutes of the cluster. In FSI computation of parachutes with such "modified geometric porosity," the flow through the "windows" created by the removal of the panels and the wider gaps created by the removal of the sails cannot be accurately modeled with the Homogenized Modeling of Geometric Porosity (HMGP), which was introduced to deal with the hundreds of gaps and slits. The flow needs to be actually resolved. All these computational challenges need to be addressed simultaneously in FSI modeling of clusters of spacecraft parachutes with modified geometric porosity. The core numerical technology is the Stabilized Space-Time FSI (SSTFSI) technique, and the contact between the parachutes is handled with the Surface-Edge-Node Contact Tracking (SENCT) technique. In the computations reported here, in addition to the SSTFSI and SENCT techniques and HMGP, we use the special techniques we have developed for removing the numerical spinning component of the parachute motion and for restoring the mesh integrity without a remesh. We present results for 2- and 3-parachute clusters with two different payload models.

  16. Correlation Educational Model in Primary Education Curriculum of Mathematics and Computer Science

    ERIC Educational Resources Information Center

    Macinko Kovac, Maja; Eret, Lidija

    2012-01-01

    This article gives insight into methodical correlation model of teaching mathematics and computer science. The model shows the way in which the related areas of computer science and mathematics can be supplemented, if it transforms the way of teaching and creates a "joint" lessons. Various didactic materials are designed, in which all…

  17. Modeling with Young Students--Quantitative and Qualitative.

    ERIC Educational Resources Information Center

    Bliss, Joan; Ogborn, Jon; Boohan, Richard; Brosnan, Tim; Mellar, Harvey; Sakonidis, Babis

    1999-01-01

    A project created tasks and tools to investigate quality and nature of 11- to 14-year-old pupils' reasoning with quantitative and qualitative computer-based modeling tools. Tasks and tools were used in two innovative modes of learning: expressive, where pupils created their own models, and exploratory, where pupils investigated an expert's model.…

  18. Using a cloud to replenish parched groundwater modeling efforts.

    PubMed

    Hunt, Randall J; Luchette, Joseph; Schreuder, Willem A; Rumbaugh, James O; Doherty, John; Tonkin, Matthew J; Rumbaugh, Douglas B

    2010-01-01

    Groundwater models can be improved by introduction of additional parameter flexibility and simultaneous use of soft-knowledge. However, these sophisticated approaches have high computational requirements. Cloud computing provides unprecedented access to computing power via the Internet to facilitate the use of these techniques. A modeler can create, launch, and terminate "virtual" computers as needed, paying by the hour, and save machine images for future use. Such cost-effective and flexible computing power empowers groundwater modelers to routinely perform model calibration and uncertainty analysis in ways not previously possible.

  19. Using a cloud to replenish parched groundwater modeling efforts

    USGS Publications Warehouse

    Hunt, Randall J.; Luchette, Joseph; Schreuder, Willem A.; Rumbaugh, James O.; Doherty, John; Tonkin, Matthew J.; Rumbaugh, Douglas B.

    2010-01-01

    Groundwater models can be improved by introduction of additional parameter flexibility and simultaneous use of soft-knowledge. However, these sophisticated approaches have high computational requirements. Cloud computing provides unprecedented access to computing power via the Internet to facilitate the use of these techniques. A modeler can create, launch, and terminate “virtual” computers as needed, paying by the hour, and save machine images for future use. Such cost-effective and flexible computing power empowers groundwater modelers to routinely perform model calibration and uncertainty analysis in ways not previously possible.

  20. Learning Mathematics by Designing, Programming, and Investigating with Interactive, Dynamic Computer-Based Objects

    ERIC Educational Resources Information Center

    Marshall, Neil; Buteau, Chantal

    2014-01-01

    As part of their undergraduate mathematics curriculum, students at Brock University learn to create and use computer-based tools with dynamic, visual interfaces, called Exploratory Objects, developed for the purpose of conducting pure or applied mathematical investigations. A student's Development Process Model of creating and using an Exploratory…

  1. Cellular automata-based modelling and simulation of biofilm structure on multi-core computers.

    PubMed

    Skoneczny, Szymon

    2015-01-01

    The article presents a mathematical model of biofilm growth for aerobic biodegradation of a toxic carbonaceous substrate. Modelling of biofilm growth has fundamental significance in numerous processes of biotechnology and mathematical modelling of bioreactors. The process following double-substrate kinetics with substrate inhibition proceeding in a biofilm has not been modelled so far by means of cellular automata. Each process in the model proposed, i.e. diffusion of substrates, uptake of substrates, growth and decay of microorganisms, and biofilm detachment, is simulated in a discrete manner. It was shown that for a flat biofilm of constant thickness, the results of the presented model agree with those of a continuous model. The primary outcome of the study was to propose a mathematical model of biofilm growth; however, considerable focus was also placed on the development of efficient algorithms for its solution. Two parallel algorithms were created, differing in the way computations are distributed. Computer programs were created using the OpenMP Application Programming Interface for the C++ programming language. Simulations of biofilm growth were performed on three high-performance computers. Speed-up coefficients of the computer programs were compared. Both algorithms enabled a significant reduction of computation time, which is important, inter alia, in modelling and simulation of bioreactor dynamics.
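
    The paper's implementation is in C++ with OpenMP; as a simplified serial analogue of the discrete-update idea, the sketch below performs one class of those updates (substrate diffusion) on a cellular-automaton grid. Grid size, diffusion number, and boundary handling are illustrative, not the paper's.

        # Simplified serial analogue (the paper itself used C++ with OpenMP)
        # of one discrete diffusion step on a cellular-automaton grid: each
        # interior cell relaxes toward the mean of its four neighbours.
        import numpy as np

        def diffuse_step(c: np.ndarray, d: float = 0.2) -> np.ndarray:
            """One explicit diffusion update; d is the dimensionless diffusion number."""
            new = c.copy()
            new[1:-1, 1:-1] += d * (
                c[:-2, 1:-1] + c[2:, 1:-1] + c[1:-1, :-2] + c[1:-1, 2:] - 4 * c[1:-1, 1:-1]
            )
            return new

        substrate = np.zeros((64, 64))
        substrate[0, :] = 1.0          # bulk liquid supplies substrate at the top row
        for _ in range(500):           # iterate toward a quasi-steady profile
            substrate = diffuse_step(substrate)
        print(f"substrate at mid-depth: {substrate[32, 32]:.3f}")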

  2. Fluid-Structure Interaction Modeling of the Reefed Stages of the Orion Spacecraft Main Parachutes

    NASA Astrophysics Data System (ADS)

    Boswell, Cody W.

    Spacecraft parachutes are typically used in multiple stages, starting with a "reefed" stage where a cable along the parachute skirt constrains the diameter to be less than the diameter in the subsequent stage. After a certain period of time during the descent, the cable is cut and the parachute "disreefs" (i.e. expands) to the next stage. Computing the parachute shape at the reefed stage and fluid-structure interaction (FSI) modeling during the disreefing involve computational challenges beyond those we have in FSI modeling of fully-open spacecraft parachutes. These additional challenges are created by the increased geometric complexities and by the rapid changes in the parachute geometry. The computational challenges are further increased because of the added geometric porosity of the latest design, where the "windows" created by the removal of panels and the wider gaps created by the removal of sails compound the geometric and flow complexity. Orion spacecraft main parachutes will have three stages, with computation of the Stage 1 shape and FSI modeling of disreefing from Stage 1 to Stage 2 being the most challenging. We present the special modeling techniques we devised to address the computational challenges and the results from the computations carried out. We also present the methods we devised to calculate for a parachute gore the radius of curvature in the circumferential direction. The curvature values are intended for quick and simple engineering analysis in estimating the structural stresses.
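
    The abstract does not spell out the curvature calculation, but one simple estimate of the circumferential radius of curvature at a point on a gore uses three sampled points A, B, C and the circumradius of the circle through them (a hedged illustration, not necessarily the authors' exact method):

        % Circumradius of the circle through three sampled points A, B, C,
        % with side lengths a = |BC|, b = |CA|, c = |AB| and triangle area K:
        R = \frac{abc}{4K},
        \qquad
        K = \tfrac{1}{2}\,\lVert (B - A) \times (C - A) \rVert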

  3. Creating photorealistic virtual model with polarization-based vision system

    NASA Astrophysics Data System (ADS)

    Shibata, Takushi; Takahashi, Toru; Miyazaki, Daisuke; Sato, Yoichi; Ikeuchi, Katsushi

    2005-08-01

    Recently, 3D models have come into use in many fields such as education, medical services, entertainment, art, and digital archiving because of advances in computation, and the demand for photorealistic virtual models is increasing. In the computer vision field, a number of techniques have been developed for creating virtual models by observing real objects. In this paper, we propose a method for creating photorealistic virtual models using a laser range sensor and a polarization-based image capture system. We capture the range and color images of an object rotated on a rotary table. Using the reconstructed object shape and the sequence of color images, the parameters of a reflection model are estimated in a robust manner. As a result, we can create a photorealistic 3D model that accounts for surface reflection. The key point of the proposed method is that the diffuse and specular reflection components are first separated from the color image sequence, and then the reflectance parameters of each component are estimated separately. In separating the reflection components, we use a polarization filter. This approach enables estimation of the reflectance properties of real objects whose surfaces show specularity as well as diffusely reflected light. The recovered object shape and reflectance properties are then used for synthesizing object images with realistic shading effects under arbitrary illumination conditions.
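
    The separation step relies on the fact that, for unpolarized illumination, diffuse reflection is unpolarized while specular reflection is (partially) polarized: rotating a linear polarizer in front of the camera modulates the observed intensity sinusoidally. The standard model behind such polarization-based separation is sketched below (the general approach; the paper's exact estimator may differ):

        % Intensity behind a linear polarizer at angle \phi:
        I(\phi) = \frac{I_{\max} + I_{\min}}{2}
                + \frac{I_{\max} - I_{\min}}{2}\,\cos\!\big(2(\phi - \phi_0)\big)
        % The unpolarized (diffuse) and polarized (specular) parts follow as
        I_d \approx 2\,I_{\min}, \qquad I_s \approx I_{\max} - I_{\min}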

  4. Musculoskeletal Simulation Model Generation from MRI Data Sets and Motion Capture Data

    NASA Astrophysics Data System (ADS)

    Schmid, Jérôme; Sandholm, Anders; Chung, François; Thalmann, Daniel; Delingette, Hervé; Magnenat-Thalmann, Nadia

    Today computer models and computer simulations of the musculoskeletal system are widely used to study the mechanisms behind human gait and its disorders. The common way of creating musculoskeletal models is to use a generic musculoskeletal model based on data derived from anatomical and biomechanical studies of cadaverous specimens. To adapt this generic model to a specific subject, the usual approach is to scale it. This scaling has been reported to introduce several errors because it does not always account for subject-specific anatomical differences. As a result, a novel semi-automatic workflow is proposed that creates subject-specific musculoskeletal models from magnetic resonance imaging (MRI) data sets and motion capture data. Based on subject-specific medical data and a model-based automatic segmentation approach, an accurate modeling of the anatomy can be produced while avoiding the scaling operation. This anatomical model coupled with motion capture data, joint kinematics information, and muscle-tendon actuators is finally used to create a subject-specific musculoskeletal model.

  5. Tips on Creating Complex Geometry Using Solid Modeling Software

    ERIC Educational Resources Information Center

    Gow, George

    2008-01-01

    Three-dimensional computer-aided drafting (CAD) software, sometimes referred to as "solid modeling" software, is easy to learn, fun to use, and becoming the standard in industry. However, many users have difficulty creating complex geometry with the solid modeling software. And the problem is not entirely a student problem. Even some teachers and…

  6. A Computational Model of Selection by Consequences

    ERIC Educational Resources Information Center

    McDowell, J. J.

    2004-01-01

    Darwinian selection by consequences was instantiated in a computational model that consisted of a repertoire of behaviors undergoing selection, reproduction, and mutation over many generations. The model in effect created a digital organism that emitted behavior continuously. The behavior of this digital organism was studied in three series of…

  7. Multicellular Models of Morphogenesis

    EPA Science Inventory

    EPA’s Virtual Embryo project (v-Embryo™), in collaboration with developers of CompuCell3D, aims to create computer models of morphogenesis that can be used to address the effects of chemical perturbation on embryo development at the cellular level. Such computational (in silico) ...

  8. Modelling and Simulation of the Knee Joint with a Depth Sensor Camera for Prosthetics and Movement Rehabilitation

    NASA Astrophysics Data System (ADS)

    Risto, S.; Kallergi, M.

    2015-09-01

    The purpose of this project was to model and simulate the knee joint. A computer model of the knee joint was first created, which was controlled by Microsoft's Kinect for Windows. Kinect created a depth map of the knee and lower leg motion, independent of lighting conditions, through an infrared sensor. A combination of open-source software such as Blender, Python, the Kinect SDK, and NI_Mate was implemented for the creation and control of the simulated knee based on movements of a live physical model. A physical model of the knee and lower leg was also created, the movement of which was controlled remotely by the computer model and Kinect. The real-time communication of the model and the robotic knee was achieved through programming in Python and the Arduino language. The results of this study showed that Kinect is useful in the modelling of human kinematics and can play a significant role in the development of prosthetics and other assistive technologies.
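
    The abstract names Python and the Arduino language for the real-time link; the sketch below shows one plausible shape of that link, streaming a knee-joint angle to the microcontroller over a serial port with pySerial. The port name, baud rate, and message format are assumptions, not taken from the paper.

        # Minimal sketch (assumptions: port name, 9600 baud, newline-terminated
        # angle messages) of a Python-to-Arduino link: each frame, send the
        # current knee angle so the firmware can drive the robotic joint.
        import time
        import serial  # pySerial

        ser = serial.Serial("/dev/ttyACM0", 9600, timeout=1)  # port is an assumption
        time.sleep(2)  # allow the Arduino to reset after the port opens

        def send_knee_angle(angle_deg: float) -> None:
            """Send one newline-terminated angle reading to the microcontroller."""
            ser.write(f"{angle_deg:.1f}\n".encode("ascii"))

        for angle in (10.0, 25.0, 40.0):  # stand-in for Kinect-derived angles
            send_knee_angle(angle)
            time.sleep(0.033)             # ~30 Hz, a typical Kinect frame rate
        ser.close()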

  9. Simulated Sustainable Societies: Students' Reflections on Creating Future Cities in Computer Games

    ERIC Educational Resources Information Center

    Nilsson, Elisabet M.; Jakobsson, Anders

    2011-01-01

    The empirical study, in this article, involved 42 students (ages 14-15), who used the urban simulation computer game SimCity 4 to create models of sustainable future cities. The aim was to explore in what ways the simulated "real" worlds provided by this game could be a potential facilitator for science learning contexts. The topic investigated is…

  10. A comparative approach to computer aided design model of a dog femur.

    PubMed

    Turamanlar, O; Verim, O; Karabulut, A

    2016-01-01

    Computer-assisted technologies offer new opportunities in medical imaging and rapid prototyping in biomechanical engineering. Three-dimensional (3D) modelling of soft tissues and bones is becoming more important. The accuracy of the analysis in modelling processes depends on the outline of the tissues derived from medical images. The aim of this study is to evaluate the accuracy of 3D models of a dog femur derived from computed tomography data by using the point cloud method and the boundary line method in several modelling software packages. Solidworks, Rapidform and 3DSMax were used to create the 3D models, and the outcomes were evaluated statistically. The most accurate 3D prototype of the dog femur was created with the stereolithography method using a rapid prototyping device. Furthermore, the linearity between the volumes of the software models and the constructed physical models was investigated. The difference between the software and real models reflects the sensitivity of the software and the devices used.

  11. Computer Recreations.

    ERIC Educational Resources Information Center

    Dewdney, A. K.

    1988-01-01

    Describes the creation of the computer program "BOUNCE," designed to simulate a weighted piston coming into equilibrium with a cloud of bouncing balls. The model follows the ideal gas law. Utilizes the critical event technique to create the model. Discusses another program, "BOOM," which simulates a chain reaction. (CW)

  12. Adventures in Modeling: Exploring Complex, Dynamic Systems with StarLogo.

    ERIC Educational Resources Information Center

    Colella, Vanessa Stevens; Klopfer, Eric; Resnick, Mitchel

    For thousands of years people from da Vinci to Einstein have created models to help them better understand patterns and processes in the world around them. Computers make it easier for novices to build and explore their own models and learn new scientific ideas in the process. This book introduces teachers and students to designing, creating, and…

  13. Software Simplifies the Sharing of Numerical Models

    NASA Technical Reports Server (NTRS)

    2014-01-01

    To ease the sharing of climate models with university students, Goddard Space Flight Center awarded SBIR funding to Reston, Virginia-based Parabon Computation Inc., a company that specializes in cloud computing. The firm developed a software program capable of running climate models over the Internet, and also created an online environment for people to collaborate on developing such models.

  14. Modeling Students' Problem Solving Performance in the Computer-Based Mathematics Learning Environment

    ERIC Educational Resources Information Center

    Lee, Young-Jin

    2017-01-01

    Purpose: The purpose of this paper is to develop a quantitative model of problem solving performance of students in the computer-based mathematics learning environment. Design/methodology/approach: Regularized logistic regression was used to create a quantitative model of problem solving performance of students that predicts whether students can…

  15. Computer Games versus Maps before Reading Stories: Priming Readers' Spatial Situation Models

    ERIC Educational Resources Information Center

    Smith, Glenn Gordon; Majchrzak, Dan; Hayes, Shelley; Drobisz, Jack

    2011-01-01

    The current study investigated how computer games and maps compare as preparation for readers to comprehend and retain spatial relations in text narratives. Readers create situation models of five dimensions: spatial, temporal, causal, goal, and protagonist (Zwaan, Langston, & Graesser 1995). Of these five, readers mentally model the spatial…

  16. The study of human venous system dynamics using hybrid computer modeling

    NASA Technical Reports Server (NTRS)

    Snyder, M. F.; Rideout, V. C.

    1972-01-01

    A computer-based model of the cardiovascular system was created emphasizing effects on the systemic venous system. Certain physiological aspects were emphasized: effects of heart rate, tilting, changes in respiration, and leg muscular contractions. The results from the model showed close correlation with findings previously reported in the literature.

  17. Fast Learning for Immersive Engagement in Energy Simulations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bush, Brian W; Bugbee, Bruce; Gruchalla, Kenny M

    Fast computation is critical for immersive engagement with and learning from energy simulations, and it would be furthered by developing a general method for creating rapidly computed, simplified versions of NREL's computation-intensive energy simulations. Created using machine learning techniques, these 'reduced form' simulations can provide statistically sound estimates of the results of the full simulations at a fraction of the computational cost, with response times - typically less than one minute of wall-clock time - suitable for real-time human-in-the-loop design and analysis. Additionally, uncertainty quantification techniques can document the accuracy of the approximate models and their domain of validity. Approximation methods are applicable to a wide range of computational models, including supply-chain models, electric power grid simulations, and building models. These reduced-form representations cannot replace or re-implement existing simulations, but instead supplement them by enabling rapid scenario design and quality assurance for large sets of simulations. We present an overview of the framework and methods we have implemented for developing these reduced-form representations.
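
    As a generic illustration of the reduced-form idea (not NREL's actual pipeline), the sketch below fits a fast statistical surrogate to a sample of "full simulation" runs, with a toy analytic function standing in for the expensive energy simulation.

        # Minimal sketch of surrogate (reduced-form) modeling: fit a fast
        # statistical model to sampled runs of an expensive simulation, then
        # answer new queries cheaply. A toy function stands in for the
        # full simulation; everything here is illustrative.
        import numpy as np
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        X = rng.uniform(0.0, 1.0, size=(2000, 4))          # simulation input parameters
        y = np.sin(3 * X[:, 0]) + X[:, 1] * X[:, 2] ** 2   # stand-in "simulation output"

        X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
        surrogate = RandomForestRegressor(n_estimators=200, random_state=0)
        surrogate.fit(X_train, y_train)
        print(f"held-out R^2: {surrogate.score(X_test, y_test):.3f}")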

  18. Electron Impact Ionization: A New Parameterization for 100 eV to 1 MeV Electrons

    NASA Technical Reports Server (NTRS)

    Fang, Xiaohua; Randall, Cora E.; Lummerzheim, Dirk; Solomon, Stanley C.; Mills, Michael J.; Marsh, Daniel; Jackman, Charles H.; Wang, Wenbin; Lu, Gang

    2008-01-01

    Low, medium and high energy electrons can penetrate to the thermosphere (90-400 km; 55-240 miles) and mesosphere (50-90 km; 30-55 miles). These precipitating electrons ionize that region of the atmosphere, creating positively charged atoms and molecules and knocking off other negatively charged electrons. The precipitating electrons also create nitrogen-containing compounds along with other constituents. Since the electron precipitation amounts change within minutes, it is necessary to have a rapid method of computing the ionization and production of nitrogen-containing compounds for inclusion in computationally-demanding global models. A new methodology has been developed, which has parameterized a more detailed model computation of the ionizing impact of precipitating electrons over the very large range of 100 eV up to 1,000,000 eV. This new parameterization method is more accurate than a previous parameterization scheme, when compared with the more detailed model computation. Global models at the National Center for Atmospheric Research will use this new parameterization method in the near future.

  19. Validating an operational physical method to compute surface radiation from geostationary satellites

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sengupta, Manajit; Dhere, Neelkanth G.; Wohlgemuth, John H.

    We developed models to compute global horizontal irradiance (GHI) and direct normal irradiance (DNI) over the last three decades. These models can be classified as empirical or physical based on the approach. Empirical models relate ground-based observations with satellite measurements and use these relations to compute surface radiation. Physical models consider the physics behind the radiation received at the satellite and create retrievals to estimate surface radiation. Furthermore, while empirical methods have traditionally been used for computing surface radiation for the solar energy industry, the advent of faster computing has made operational physical models viable. The Global Solar Insolation Project (GSIP) is a physical model that computes DNI and GHI using the visible and infrared channel measurements from a weather satellite. GSIP uses a two-stage scheme that first retrieves cloud properties and uses those properties in a radiative transfer model to calculate GHI and DNI. Developed for polar-orbiting satellites, GSIP has been adapted to NOAA's Geostationary Operational Environmental Satellite series and can run operationally at high spatial resolutions. Our method holds the possibility of creating high-quality datasets of GHI and DNI for use by the solar energy industry. We present an outline of the methodology and results from running the model, as well as a validation study using ground-based instruments.

  20. Resilient Software Systems

    DTIC Science & Technology

    2015-06-01

    and tools, called model-integrated computing (MIC) [3], relies on the use of domain-specific modeling languages for creating models of the system to be... hence giving reflective capabilities to it. We have followed the MIC method here: we designed a domain-specific modeling language for modeling... are produced one-off and not for the mass market, the scope for price reduction based on market demands is non-existent. Processes to create

  1. A History of the Liberal Arts Computer Science Consortium and Its Model Curricula

    ERIC Educational Resources Information Center

    Bruce, Kim B.; Cupper, Robert D.; Scot Drysdale, Robert L.

    2010-01-01

    With the support of a grant from the Sloan Foundation, nine computer scientists from liberal arts colleges came together in October, 1984 to form the Liberal Arts Computer Science Consortium (LACS) and to create a model curriculum appropriate for liberal arts colleges. Over the years the membership has grown and changed, but the focus has remained…

  2. Computational biology in the cloud: methods and new insights from computing at scale.

    PubMed

    Kasson, Peter M

    2013-01-01

    The past few years have seen both explosions in the size of biological data sets and the proliferation of new, highly flexible on-demand computing capabilities. The sheer amount of information available from genomic and metagenomic sequencing, high-throughput proteomics, experimental and simulation datasets on molecular structure and dynamics affords an opportunity for greatly expanded insight, but it creates new challenges of scale for computation, storage, and interpretation of petascale data. Cloud computing resources have the potential to help solve these problems by offering a utility model of computing and storage: near-unlimited capacity, the ability to burst usage, and cheap and flexible payment models. Effective use of cloud computing on large biological datasets requires dealing with non-trivial problems of scale and robustness, since performance-limiting factors can change substantially when a dataset grows by a factor of 10,000 or more. New computing paradigms are thus often needed. The use of cloud platforms also creates new opportunities to share data, reduce duplication, and to provide easy reproducibility by making the datasets and computational methods easily available.

  3. A novel computer algorithm for modeling and treating mandibular fractures: A pilot study.

    PubMed

    Rizzi, Christopher J; Ortlip, Timothy; Greywoode, Jewel D; Vakharia, Kavita T; Vakharia, Kalpesh T

    2017-02-01

    To describe a novel computer algorithm that can model mandibular fracture repair. To evaluate the algorithm as a tool to model mandibular fracture reduction and hardware selection. Retrospective pilot study combined with cross-sectional survey. A computer algorithm utilizing Aquarius Net (TeraRecon, Inc, Foster City, CA) and Adobe Photoshop CS6 (Adobe Systems, Inc, San Jose, CA) was developed to model mandibular fracture repair. Ten different fracture patterns were selected from nine patients who had already undergone mandibular fracture repair. The preoperative computed tomography (CT) images were processed with the computer algorithm to create virtual images that matched the actual postoperative three-dimensional CT images. A survey comparing the true postoperative image with the virtual postoperative images was created and administered to otolaryngology resident and attending physicians. They were asked to rate on a scale from 0 to 10 (0 = completely different; 10 = identical) the similarity between the two images in terms of the fracture reduction and fixation hardware. Ten mandible fracture cases were analyzed and processed. There were 15 survey respondents. The mean score for overall similarity between the images was 8.41 ± 0.91; the mean score for similarity of fracture reduction was 8.61 ± 0.98; and the mean score for hardware appearance was 8.27 ± 0.97. There were no significant differences between attending and resident responses. There were no significant differences based on fracture location. This computer algorithm can accurately model mandibular fracture repair. Images created by the algorithm are highly similar to true postoperative images. The algorithm can potentially assist a surgeon planning mandibular fracture repair. Level of evidence: 4. Laryngoscope, 127:331-336, 2017. © 2016 The American Laryngological, Rhinological and Otological Society, Inc.

  4. Dental application of novel finite element analysis software for three-dimensional finite element modeling of a dentulous mandible from its computed tomography images.

    PubMed

    Nakamura, Keiko; Tajima, Kiyoshi; Chen, Ker-Kong; Nagamatsu, Yuki; Kakigawa, Hiroshi; Masumi, Shin-ich

    2013-12-01

    This study focused on the application of novel finite-element analysis software for constructing a finite-element model from the computed tomography data of a human dentulous mandible. The finite-element model is necessary for evaluating the mechanical response of the alveolar part of the mandible resulting from the occlusal force applied to the teeth during biting. Commercially available patient-specific general computed tomography-based finite-element analysis software was solely applied to the finite-element analysis for the extraction of computed tomography data. The mandibular bone with teeth was extracted from the original images. Both the enamel and the dentin were extracted after image processing, and the periodontal ligament was created from the segmented dentin. The constructed finite-element model, comprising a total of 234,644 nodes, 1,268,784 tetrahedral elements, and 40,665 shell elements, was reasonably accurate. The elastic moduli of the heterogeneous mandibular bone were determined from the bone density data of the computed tomography images. The results suggested that the software applied in this study is both useful and powerful for creating a more accurate three-dimensional finite-element model of a dentulous mandible from computed tomography data without the need for any other software.
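
    The abstract does not give the density-to-modulus mapping used; a commonly used empirical form in CT-based finite-element modeling is a power law with calibrated constants (placeholder symbols below; an illustration, not this study's exact relation):

        % Common empirical form relating CT values to elastic modulus:
        \rho = \alpha\,\mathrm{HU} + \beta, \qquad E = a\,\rho^{\,b}
        % HU: Hounsfield value; \rho: estimated bone density;
        % \alpha, \beta, a, b: empirically calibrated constants.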

  5. Air Vehicles Division Computational Structural Analysis Facilities Policy and Guidelines for Users

    DTIC Science & Technology

    2005-05-01

    "Thermal" as appropriate and the tolerance set to "default". b) Create the model geometry. c) Create the finite elements. d) Create the... linear, non-linear, dynamic, thermal, acoustic analysis. The modelling of composite materials, creep, fatigue and plasticity is also covered... perform professional, high-quality finite element analysis (FEA). FE analysts from many tasks within AVD are using the facilities to conduct FEA with

  6. A Computational Model of Fraction Arithmetic

    ERIC Educational Resources Information Center

    Braithwaite, David W.; Pyke, Aryn A.; Siegler, Robert S.

    2017-01-01

    Many children fail to master fraction arithmetic even after years of instruction, a failure that hinders their learning of more advanced mathematics as well as their occupational success. To test hypotheses about why children have so many difficulties in this area, we created a computational model of fraction arithmetic learning and presented it…

  7. A Model for Conducting and Assessing Interdisciplinary Undergraduate Dissertations

    ERIC Educational Resources Information Center

    Engström, Henrik

    2015-01-01

    This paper presents an effort to create a unified model for conducting and assessing undergraduate dissertations, shared by all disciplines involved in computer game development at a Swedish university. Computer game development includes technology-oriented disciplines as well as disciplines with aesthetical traditions. The challenge has been to…

  8. Virtual Surgical Planning in Craniofacial Surgery

    PubMed Central

    Chim, Harvey; Wetjen, Nicholas; Mardini, Samir

    2014-01-01

    The complex three-dimensional anatomy of the craniofacial skeleton creates a formidable challenge for surgical reconstruction. Advances in computer-aided design and computer-aided manufacturing technology have created increasing applications for virtual surgical planning in craniofacial surgery, such as preoperative planning, fabrication of cutting guides, and stereolithographic models and fabrication of custom implants. In this review, the authors describe current and evolving uses of virtual surgical planning in craniofacial surgery. PMID:25210509

  9. High Tech Programmers in Low-Income Communities: Creating a Computer Culture in a Community Technology Center

    NASA Astrophysics Data System (ADS)

    Kafai, Yasmin B.; Peppler, Kylie A.; Chiu, Grace M.

    For the last twenty years, issues of the digital divide have driven efforts around the world to address the lack of access to computers and the Internet, pertinent and language-appropriate content, and technical skills in low-income communities (Schuler & Day, 2004a, 2004b). The title of our paper makes reference to a milestone publication (Schon, Sanyal, & Mitchell, 1998) that showcased some of the early work and thinking in this area. Schon, Sanyal and Mitchell's book edition included an article outlining the Computer Clubhouse, a type of community technology center model developed to create opportunities for youth in low-income communities to become creators and designers of technologies (1998). The model has been very successful in scaling up, with over 110 Computer Clubhouses now in existence worldwide.

  10. Computer-Based Tutoring of Visual Concepts: From Novice to Experts.

    ERIC Educational Resources Information Center

    Sharples, Mike

    1991-01-01

    Description of ways in which computers might be used to teach visual concepts discusses hypermedia systems; describes computer-generated tutorials; explains the use of computers to create learning aids such as concept maps, feature spaces, and structural models; and gives examples of visual concept teaching in medical education. (10 references)…

  11. Distributed Hydrologic Modeling Apps for Decision Support in the Cloud

    NASA Astrophysics Data System (ADS)

    Swain, N. R.; Latu, K.; Christiensen, S.; Jones, N.; Nelson, J.

    2013-12-01

    Advances in computation resources and greater availability of water resources data represent an untapped resource for addressing hydrologic uncertainties in water resources decision-making. The current practice of water authorities relies on empirical, lumped hydrologic models to estimate watershed response. These models are not capable of taking advantage of many of the spatial datasets that are now available. Physically-based, distributed hydrologic models are capable of using these data resources and providing better predictions through stochastic analysis. However, there exists a digital divide that discourages many science-minded decision makers from using distributed models. This divide can be spanned using a combination of existing web technologies. The purpose of this presentation is to present a cloud-based environment that will offer hydrologic modeling tools or 'apps' for decision support and the web technologies that have been selected to aid in its implementation. Compared to the more commonly used lumped-parameter models, distributed models, while being more intuitive, are still data intensive, computationally expensive, and difficult to modify for scenario exploration. However, web technologies such as web GIS, web services, and cloud computing have made the data more accessible, provided an inexpensive means of high-performance computing, and created an environment for developing user-friendly apps for distributed modeling. Since many water authorities are primarily interested in the scenario exploration exercises with hydrologic models, we are creating a toolkit that facilitates the development of a series of apps for manipulating existing distributed models. There are a number of hurdles that cloud-based hydrologic modeling developers face. One of these is how to work with the geospatial data inherent with this class of models in a web environment. Supporting geospatial data in a website is beyond the capabilities of standard web frameworks and it requires the use of additional software. In particular, there are at least three elements that are needed: a geospatially enabled database, a map server, and geoprocessing toolbox. We recommend a software stack for geospatial web application development comprising: MapServer, PostGIS, and 52 North with Python as the scripting language to tie them together. Another hurdle that must be cleared is managing the cloud-computing load. We are using HTCondor as a solution to this end. Finally, we are creating a scripting environment wherein developers will be able to create apps that use existing hydrologic models in our system with minimal effort. This capability will be accomplished by creating a plugin for a Python content management system called CKAN. We are currently developing cyberinfrastructure that utilizes this stack and greatly lowers the investment required to deploy cloud-based modeling apps. This material is based upon work supported by the National Science Foundation under Grant No. 1135482
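
    As a small illustration of the recommended stack's database layer, the sketch below issues a geospatial query against a PostGIS-enabled database from Python via psycopg2. The connection settings and the watersheds table are hypothetical placeholders, not part of the presented system.

        # Minimal sketch of a geospatial query against PostGIS via psycopg2.
        # Connection settings and the "watersheds" table are hypothetical.
        import psycopg2

        conn = psycopg2.connect(
            host="localhost", dbname="hydro", user="app", password="secret"  # placeholders
        )
        with conn, conn.cursor() as cur:
            # Area (km^2) of each watershed polygon intersecting a bounding box.
            cur.execute(
                """
                SELECT name, ST_Area(geom::geography) / 1e6 AS area_km2
                FROM watersheds
                WHERE geom && ST_MakeEnvelope(%s, %s, %s, %s, 4326)
                """,
                (-112.0, 40.0, -111.0, 41.0),
            )
            for name, area_km2 in cur.fetchall():
                print(f"{name}: {area_km2:.1f} km^2")
        conn.close()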

  12. Building generic anatomical models using virtual model cutting and iterative registration.

    PubMed

    Xiao, Mei; Soh, Jung; Meruvia-Pastor, Oscar; Schmidt, Eric; Hallgrímsson, Benedikt; Sensen, Christoph W

    2010-02-08

    Using 3D generic models to statistically analyze trends in biological structure changes is an important tool in morphometrics research. Therefore, 3D generic models built for a range of populations are in high demand. However, due to the complexity of biological structures and the limited views of them that medical images can offer, it is still an exceptionally difficult task to quickly and accurately create 3D generic models (a model is a 3D graphical representation of a biological structure) based on medical image stacks (a stack is an ordered collection of 2D images). We show that the creation of a generic model that captures spatial information exploitable in statistical analyses is facilitated by coupling our generalized segmentation method to existing automatic image registration algorithms. The method of creating generic 3D models consists of the following processing steps: (i) scanning subjects to obtain image stacks; (ii) creating individual 3D models from the stacks; (iii) interactively extracting sub-volumes by cutting each model to generate the sub-model of interest; (iv) creating image stacks that contain only the information pertaining to the sub-models; (v) iteratively registering the corresponding new 2D image stacks; (vi) averaging the newly created sub-models based on intensity to produce the generic model from all the individual sub-models. After several registration procedures are applied to the image stacks, we can create averaged image stacks with sharp boundaries. The averaged 3D model created from those image stacks is very close to the average representation of the population. The image registration time varies depending on the image size and the desired accuracy of the registration. Both volumetric data and a surface model are created for the generic 3D model at the final step. Our method is flexible and easy to use, so that anyone can use image stacks to create models and retrieve sub-regions from them with ease. The Java-based implementation allows our method to be used on various visualization systems, including personal computers, workstations, computers equipped with stereo displays, and even virtual reality rooms such as the CAVE Automated Virtual Environment. The technique allows biologists to build generic 3D models of their interest quickly and accurately.
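
    As a minimal illustration of the final averaging step (vi), the sketch below takes per-subject stacks already registered into a common space (step v, assumed done) and produces the generic volume as their voxel-wise intensity mean; random arrays stand in for the registered stacks.

        # Minimal sketch of step (vi): the generic model as the voxel-wise
        # intensity mean of registered image stacks. Registration (step v)
        # is assumed already done; random arrays stand in for real stacks.
        import numpy as np

        rng = np.random.default_rng(1)
        registered_stacks = [rng.random((40, 128, 128)) for _ in range(5)]  # 5 subjects

        generic_volume = np.mean(np.stack(registered_stacks, axis=0), axis=0)
        print(generic_volume.shape)  # (40, 128, 128): slices x rows x cols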

  13. A method for using solid modeling CAD software to create an implant library for the fabrication of a custom abutment.

    PubMed

    Zhang, Jing; Zhang, Rimei; Ren, Guanghui; Zhang, Xiaojie

    2017-02-01

    This article describes a method that incorporates the solid modeling CAD software Solidworks with a dental milling machine to fabricate individual abutments in house. This process involves creating an implant library with 3-dimensional (3D) models and manufacturing a base, scan element, abutment, and crown anatomy. The 3D models can be imported into any dental computer-aided design and computer-aided (CAD-CAM) manufacturing system. This platform increases abutment design flexibility, as the base and scan elements can be designed to fit several shapes as needed to meet clinical requirements. Copyright © 2016 Editorial Council for the Journal of Prosthetic Dentistry. Published by Elsevier Inc. All rights reserved.

  14. Vernier Caliper and Micrometer Computer Models Using Easy Java Simulation and Its Pedagogical Design Features--Ideas for Augmenting Learning with Real Instruments

    ERIC Educational Resources Information Center

    Wee, Loo Kang; Ning, Hwee Tiang

    2014-01-01

    This paper presents the customization of Easy Java Simulation models, used with actual laboratory instruments, to create active experiential learning for measurements. The laboratory instruments are the vernier caliper and the micrometer. Three computer model design ideas that complement real equipment are discussed. These ideas involve (1) a…

  15. Prosthetically directed implant placement using computer software to ensure precise placement and predictable prosthetic outcomes. Part 2: rapid-prototype medical modeling and stereolithographic drilling guides requiring bone exposure.

    PubMed

    Rosenfeld, Alan L; Mandelaris, George A; Tardieu, Philippe B

    2006-08-01

    The purpose of this paper is to expand on part 1 of this series (published in the previous issue) regarding the emerging future of computer-guided implant dentistry. This article will introduce the concept of rapid-prototype medical modeling as well as describe the utilization and fabrication of computer-generated surgical drilling guides used during implant surgery. The placement of dental implants has traditionally been an intuitive process, whereby the surgeon relies on mental navigation to achieve optimal implant positioning. Through rapid-prototype medical modeling and the stereolithographic process, surgical drilling guides (e.g., SurgiGuide) can be created. These guides are generated from a surgical implant plan created with a computer software system that incorporates all relevant prosthetic information from which the surgical plan is developed. The utilization of computer-generated planning and stereolithographically generated surgical drilling guides embraces the concept of collaborative accountability and supersedes traditional mental navigation on all levels of implant therapy.

  16. A Computational Model Quantifies the Effect of Anatomical Variability on Velopharyngeal Function

    ERIC Educational Resources Information Center

    Inouye, Joshua M.; Perry, Jamie L.; Lin, Kant Y.; Blemker, Silvia S.

    2015-01-01

    Purpose: This study predicted the effects of velopharyngeal (VP) anatomical parameters on VP function to provide a greater understanding of speech mechanics and aid in the treatment of speech disorders. Method: We created a computational model of the VP mechanism using dimensions obtained from magnetic resonance imaging measurements of 10 healthy…

  17. Learning General Phonological Rules from Distributional Information: A Computational Model

    ERIC Educational Resources Information Center

    Calamaro, Shira; Jarosz, Gaja

    2015-01-01

    Phonological rules create alternations in the phonetic realizations of related words. These rules must be learned by infants in order to identify the phonological inventory, the morphological structure, and the lexicon of a language. Recent work proposes a computational model for the learning of one kind of phonological alternation, allophony…

  18. Computational Modelling and Children's Expressions of Signal and Noise

    ERIC Educational Resources Information Center

    Ainley, Janet; Pratt, Dave

    2017-01-01

    Previous research has demonstrated how young children can identify the signal in data. In this exploratory study we considered how they might also express meanings for noise when creating computational models using recent developments in software tools. We conducted extended clinical interviews with four groups of 11-year-olds and analysed the…

  19. Development and Verification of Body Armor Target Geometry Created Using Computed Tomography Scans

    DTIC Science & Technology

    2017-07-13

    …modeling consisted of manual measurement of armor systems and translating those measurements to computer-aided design geometry, which can be tedious and… computer-aided design (CAD) human geometry model (referred to throughout as ORCA man) that is used in the Operational Requirement-based Casualty Assessment…

  20. P-HS-SFM: a parallel harmony search algorithm for the reproduction of experimental data in the continuous microscopic crowd dynamic models

    NASA Astrophysics Data System (ADS)

    Jaber, Khalid Mohammad; Alia, Osama Moh'd.; Shuaib, Mohammed Mahmod

    2018-03-01

    Finding the optimal parameters that can reproduce experimental data (such as the velocity-density relation and the specific flow rate) is a very important component of the validation and calibration of microscopic crowd dynamic models. Heavy computational demand during parameter search is a known limitation that exists in a previously developed model known as the Harmony Search-Based Social Force Model (HS-SFM). In this paper, a parallel-based mechanism is proposed to reduce the computational time and memory resource utilisation required to find these parameters. More specifically, two MATLAB-based multicore techniques (parfor and create independent jobs) using shared memory are developed by taking advantage of the multithreading capabilities of parallel computing, resulting in a new framework called the Parallel Harmony Search-Based Social Force Model (P-HS-SFM). The experimental results show that the parfor-based P-HS-SFM achieved a better computational time of about 26 h, an efficiency improvement of ≈54% and a speedup factor of 2.196 times in comparison with the HS-SFM sequential processor. The performance of the P-HS-SFM using the create independent jobs approach is also comparable to parfor with a computational time of 26.8 h, an efficiency improvement of about 30% and a speedup of 2.137 times.
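
    The parfor-based parallelization is MATLAB-specific, but the underlying pattern of farming independent fitness evaluations out to worker processes is easy to illustrate. The sketch below is a hypothetical Python analogue using multiprocessing, not the authors' code; the toy fitness function and the parameter grid are invented for illustration.

    ```python
    # Illustrative analogue (not the authors' MATLAB code): distribute
    # independent fitness evaluations across cores, in the spirit of parfor.
    from multiprocessing import Pool

    def fitness(params):
        """Toy stand-in for one simulation run: score a candidate
        parameter pair against a hypothetical target operating point."""
        desired_speed, relaxation_time = params
        return (desired_speed - 1.34) ** 2 + (relaxation_time - 0.5) ** 2

    if __name__ == "__main__":
        candidates = [(1.0 + 0.1 * i, 0.3 + 0.05 * j)
                      for i in range(8) for j in range(8)]
        with Pool() as pool:                  # one worker per core
            scores = pool.map(fitness, candidates)
        best = min(zip(scores, candidates))
        print("best score %.4f at params %s" % best)
    ```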

  1. The Development of a Virtual McKenna Military Operations in Urban Terrain (MOUT) Site for Command, Control, Communication, Computing, Intelligence, Surveillance, and Reconnaissance (C4ISR) Studies

    DTIC Science & Technology

    2007-06-01

    4.2 Creating the Skybox and Terrain Model… 4.3 Creating New Textures… Skybox and Terrain Model: The next step was to build a skybox. Since it already resided in Raven Shield, the creation of the skybox was limited to…

  2. Benefit from NASA

    NASA Image and Video Library

    2004-04-15

    Marshall Space Flight Center engineers helped North American Marine Jet (NAMJ), Inc. improve the proposed design of a new impeller for a jet-propulsion system. With a three-dimensional computer model of the new marine jet engine blades, engineers were able to quickly create a solid polycarbonate model of it. The rapid prototyping allowed the company to avoid many time-consuming and costly steps in creating the impeller.

  3. Benefit from NASA

    NASA Image and Video Library

    1996-01-01

    Marshall Space Flight Center engineers helped North American Marine Jet (NAMJ), Inc. improve the proposed design of a new impeller for a jet-propulsion system. With a three-dimensional computer model of the new marine jet engine blades, engineers were able to quickly create a solid polycarbonate model of it. The rapid prototyping allowed the company to avoid many time-consuming and costly steps in creating the impeller.

  4. Cyberinfrastructure to Support Collaborative and Reproducible Computational Hydrologic Modeling

    NASA Astrophysics Data System (ADS)

    Goodall, J. L.; Castronova, A. M.; Bandaragoda, C.; Morsy, M. M.; Sadler, J. M.; Essawy, B.; Tarboton, D. G.; Malik, T.; Nijssen, B.; Clark, M. P.; Liu, Y.; Wang, S. W.

    2017-12-01

    Creating cyberinfrastructure to support reproducibility of computational hydrologic models is an important research challenge. Addressing this challenge requires open and reusable code and data with machine and human readable metadata, organized in ways that allow others to replicate results and verify published findings. Specific digital objects that must be tracked for reproducible computational hydrologic modeling include (1) raw initial datasets, (2) data processing scripts used to clean and organize the data, (3) processed model inputs, (4) model results, and (5) the model code with an itemization of all software dependencies and computational requirements. HydroShare is a cyberinfrastructure under active development designed to help users store, share, and publish digital research products in order to improve reproducibility in computational hydrology, with an architecture supporting hydrologic-specific resource metadata. Researchers can upload data required for modeling, add hydrology-specific metadata to these resources, and use the data directly within HydroShare.org for collaborative modeling using tools like CyberGIS, Sciunit-CLI, and JupyterHub that have been integrated with HydroShare to run models using notebooks, Docker containers, and cloud resources. Current research aims to implement the Structure For Unifying Multiple Modeling Alternatives (SUMMA) hydrologic model within HydroShare to support hypothesis-driven hydrologic modeling while also taking advantage of the HydroShare cyberinfrastructure. The goal of this integration is to create the cyberinfrastructure that supports hypothesis-driven model experimentation, education, and training efforts by lowering barriers to entry, reducing the time spent on informatics technology and software development, and supporting collaborative research within and across research groups.
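
    As a minimal sketch of the bookkeeping the abstract enumerates (and not of the HydroShare API itself), the five classes of digital objects tracked for a reproducible run can be bundled into a single record; every file name and version string below is hypothetical.

    ```python
    # Minimal sketch: bundle the five digital objects the abstract says
    # must be tracked for a reproducible model run. Not the HydroShare API.
    from dataclasses import dataclass, field

    @dataclass
    class ReproducibleModelRun:
        raw_datasets: list[str]          # (1) raw initial datasets
        processing_scripts: list[str]    # (2) data cleaning/organizing scripts
        model_inputs: list[str]          # (3) processed model inputs
        model_results: list[str]         # (4) model results
        model_code: str                  # (5) model code
        dependencies: dict[str, str] = field(default_factory=dict)  # software versions

    # Hypothetical example for a SUMMA-style run.
    run = ReproducibleModelRun(
        raw_datasets=["forcing_raw.nc"],
        processing_scripts=["clean_forcing.py"],
        model_inputs=["forcing_clean.nc", "params.json"],
        model_results=["streamflow.csv"],
        model_code="run_model.py",
        dependencies={"python": "3.11", "summa": "3.x"},
    )
    print(run)
    ```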

  5. Iteration and Prototyping in Creating Technical Specifications.

    ERIC Educational Resources Information Center

    Flynt, John P.

    1994-01-01

    Claims that the development process for computer software can be greatly aided by the writers of specifications if they employ basic iteration and prototyping techniques. Asserts that computer software configuration management practices provide ready models for iteration and prototyping. (HB)

  6. Televirtuality: "Being There" in the 21st Century.

    ERIC Educational Resources Information Center

    Jacobson, Robert

    Virtual worlds technology (VWT) uses special computer hardware and software to link humans with computers in natural ways. A data model, or virtual world, is created and presented as a three-dimensional world of sights and sounds. The participant manipulates apparent objects in the world, and in so doing, alters the data model. VWT will become…

  7. DIY Geospatial Web Service Chains: GeoChaining Make it Easy

    NASA Astrophysics Data System (ADS)

    Wu, H.; You, L.; Gui, Z.

    2011-08-01

    It is a great challenge for beginners to create, deploy and utilize a Geospatial Web Service Chain (GWSC). People in computer science are usually not familiar with geospatial domain knowledge. Geospatial practitioners may lack the knowledge about web services and service chains. The end users may lack both. However, integrated visual editing interfaces, validation tools, and one-click deployment wizards may help to lower the learning curve and improve modelling skills so beginners will have a better experience. GeoChaining is a GWSC modelling tool designed and developed based on these ideas. GeoChaining integrates visual editing, validation, deployment, execution, etc., into a unified platform. By employing a Virtual Globe, users can intuitively visualize raw data and results produced by GeoChaining. All of these features allow users to easily start using GWSC, regardless of their professional background and computer skills. Further, GeoChaining supports GWSC model reuse, meaning that an entire GWSC model, or even a specific part of one, can be directly reused in a new model. This greatly improves the efficiency of creating a new GWSC, and also contributes to the sharing and interoperability of GWSC.

  8. Training Maneuver Evaluation for Reduced Order Modeling of Stability & Control Properties Using Computational Fluid Dynamics

    DTIC Science & Technology

    2013-03-01

    …reduced order model is created. Finally, previous research in this area of study will be examined, and its application to this research will be… (AFIT thesis AFIT-ENY-13-M-28).

  9. Creating, documenting and sharing network models.

    PubMed

    Crook, Sharon M; Bednar, James A; Berger, Sandra; Cannon, Robert; Davison, Andrew P; Djurfeldt, Mikael; Eppler, Jochen; Kriener, Birgit; Furber, Steve; Graham, Bruce; Plesser, Hans E; Schwabe, Lars; Smith, Leslie; Steuber, Volker; van Albada, Sacha

    2012-01-01

    As computational neuroscience matures, many simulation environments are available that are useful for neuronal network modeling. However, methods for successfully documenting models for publication and for exchanging models and model components among these projects are still under development. Here we briefly review existing software and applications for network model creation, documentation and exchange. Then we discuss a few of the larger issues facing the field of computational neuroscience regarding network modeling and suggest solutions to some of these problems, concentrating in particular on standardized network model terminology, notation, and descriptions, and on explicit documentation of model scaling. We hope this will enable and encourage computational neuroscientists to share their models more systematically in the future.

  10. PILOT-SPION: A Computer Game for German Students.

    ERIC Educational Resources Information Center

    Sanders, Ruth H.

    1984-01-01

    Describes a computer game designed for students of German, which uses techniques of artificial intelligence to create a model of language understanding by computer in an adventure game set in Berlin. In addition to providing a concrete means for testing students' language understanding, the game is a useful, highly motivating learning mode. (SL)

  11. Feasibility of Clinician-Facilitated Three-Dimensional Printing of Synthetic Cranioplasty Flaps.

    PubMed

    Panesar, Sandip S; Belo, Joao Tiago A; D'Souza, Rhett N

    2018-05-01

    Integration of three-dimensional (3D) printing and stereolithography into clinical practice is in its nascence, and concepts may be esoteric to the practicing neurosurgeon. Currently, creation of 3D printed implants involves recruitment of offsite third parties. We explored a range of 3D scanning and stereolithographic techniques to create patient-specific synthetic implants using an onsite, clinician-facilitated approach. We simulated bilateral craniectomies in a single cadaveric specimen. We devised 3 methods of creating stereolithographically viable virtual models from removed bone. First, we used preoperative and postoperative computed tomography scanner-derived bony window models from which the flap was extracted. Second, we used an entry-level 3D light scanner to scan and render models of the individual bone pieces. Third, we used an arm-mounted, 3D laser scanner to create virtual models using a real-time approach. Flaps were printed only from the computed tomography scanner and laser scanner models, in an ultraviolet-cured polymer. The light scanner did not produce suitable virtual models for printing. The computed tomography scanner-derived models required extensive postfabrication modification to fit the existing defects. The laser scanner models assumed good fit within the defects without any modification. The methods presented varying levels of complexity in acquisition and model rendering. Each technique required hardware at price points ranging from $0 to approximately $100,000. The laser scanner models produced the best quality parts, which had near-perfect fit with the original defects. Potential neurosurgical applications of this technology are discussed. Copyright © 2018 Elsevier Inc. All rights reserved.

  12. Using Computational Cognitive Modeling to Diagnose Possible Sources of Aviation Error

    NASA Technical Reports Server (NTRS)

    Byrne, M. D.; Kirlik, Alex

    2003-01-01

    We present a computational model of a closed-loop, pilot-aircraft-visual scene-taxiway system created to shed light on possible sources of taxi error. Creating the cognitive aspects of the model using ACT-R required us to conduct studies with subject matter experts to identify experiential adaptations pilots bring to taxiing. Five decision strategies were found, ranging from cognitively intensive but precise to fast and frugal but robust. We provide evidence for the model by comparing its behavior to a NASA Ames Research Center simulation of Chicago O'Hare surface operations. Decision horizons were highly variable; the model selected the most accurate strategy given the time available. We found a signature in the simulation data of the use of globally robust heuristics to cope with short decision horizons as revealed by errors occurring most frequently at atypical taxiway geometries or clearance routes. These data provided empirical support for the model.

  13. META-X Design Flow Tools

    DTIC Science & Technology

    2013-04-01

    Forces can be computed at specific angular positions, and geometrical parameters can be evaluated. Much higher resolution models are required, along… composition engines (C#, C++, Python, Java). Desert operates on the CyPhy model, converting from a design space alternative structure to a set of design… consists of scripts to execute Dymola, post-processing of results to create metrics, and general management of the job sequence. An earlier version created…

  14. Computer Science Career Network

    DTIC Science & Technology

    2013-03-01

    development model. TopCoder's development model is competition-based, meaning that TopCoder conducts competitions to develop digital assets. TopCoder… success in running a competition that had as an objective creating digital assets, and we intend to run more of them, to create assets for… cash prizes and merchandise. This includes social media contests, contests with all our games, special referral contests, and a couple of NASA…

  15. Integrating interactive computational modeling in biology curricula.

    PubMed

    Helikar, Tomáš; Cutucache, Christine E; Dahlquist, Lauren M; Herek, Tyler A; Larson, Joshua J; Rogers, Jim A

    2015-03-01

    While the use of computer tools to simulate complex processes such as computer circuits is normal practice in fields like engineering, the majority of life sciences/biological sciences courses continue to rely on the traditional textbook and memorization approach. To address this issue, we explored the use of the Cell Collective platform as a novel, interactive, and evolving pedagogical tool to foster student engagement, creativity, and higher-level thinking. Cell Collective is a Web-based platform used to create and simulate dynamical models of various biological processes. Students can create models of cells, diseases, or pathways themselves or explore existing models. This technology was implemented in both undergraduate and graduate courses as a pilot study to determine the feasibility of such software at the university level. First, a new (In Silico Biology) class was developed to enable students to learn biology by "building and breaking it" via computer models and their simulations. This class and technology also provide a non-intimidating way to incorporate mathematical and computational concepts into a class with students who have a limited mathematical background. Second, we used the technology to mediate the use of simulations and modeling modules as a learning tool for traditional biological concepts, such as T cell differentiation or cell cycle regulation, in existing biology courses. Results of this pilot application suggest that there is promise in the use of computational modeling and software tools such as Cell Collective to provide new teaching methods in biology and contribute to the implementation of the "Vision and Change" call to action in undergraduate biology education by providing a hands-on approach to biology.
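
    The models students build in Cell Collective are logical (Boolean) networks whose nodes switch on and off according to regulatory rules. The toy four-node sketch below, with invented activation and inhibition logic, illustrates only that style of modeling; it is not a model from the platform.

    ```python
    # Toy Boolean-network sketch (hypothetical rules, not a Cell Collective
    # model): each node is ON/OFF and updates from the previous state.
    def step(state):
        """One synchronous update of a tiny signaling toy model."""
        return {
            "antigen":   state["antigen"],     # external input, held fixed
            "receptor":  state["antigen"],     # receptor follows antigen
            "response":  state["receptor"] and not state["inhibitor"],
            "inhibitor": state["response"],    # negative feedback loop
        }

    state = {"antigen": True, "receptor": False,
             "response": False, "inhibitor": False}
    for t in range(6):                         # prints a sustained oscillation
        print(t, state)
        state = step(state)
    ```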

  16. Future aircraft networks and schedules

    NASA Astrophysics Data System (ADS)

    Shu, Yan

    2011-07-01

    Because of the importance of air transportation scheduling, the emergence of small aircraft, and the vision of future fuel-efficient aircraft, this thesis has focused on the study of aircraft scheduling and network design involving multiple types of aircraft and flight services. It develops models and solution algorithms for the schedule design problem and analyzes the computational results. First, based on the current development of small aircraft and on-demand flight services, this thesis expands a business model for integrating on-demand flight services with the traditional scheduled flight services. This thesis proposes a three-step approach to the design of aircraft schedules and networks from scratch under the model. In the first step, both a frequency assignment model for scheduled flights that incorporates a passenger path choice model and a frequency assignment model for on-demand flights that incorporates a passenger mode choice model are created. In the second step, a rough fleet assignment model that determines a set of flight legs, each of which is assigned an aircraft type and a rough departure time, is constructed. In the third step, a timetable model that determines an exact departure time for each flight leg is developed. Based on the models proposed in the three steps, this thesis creates schedule design instances that involve almost all the major airports and markets in the United States. The instances of the frequency assignment model created in this thesis are large-scale non-convex mixed-integer programming problems, and this dissertation develops an overall network structure and proposes iterative algorithms for solving these instances. The instances of both the rough fleet assignment model and the timetable model created in this thesis are large-scale mixed-integer programming problems, and this dissertation develops subproblem schemes for solving these instances. Based on these solution algorithms, this dissertation also presents computational results of these large-scale instances. To validate the models and solution algorithms developed, this thesis also compares the daily flight schedules that it designs with the schedules of the existing airlines. Furthermore, it creates instances that represent different economic and fuel-price conditions and derives schedules under these different conditions. In addition, it discusses the implication of using new aircraft in the future flight schedules. Finally, future research in three areas (model, computational method, and simulation for validation) is proposed.

  17. Modular, Semantics-Based Composition of Biosimulation Models

    ERIC Educational Resources Information Center

    Neal, Maxwell Lewis

    2010-01-01

    Biosimulation models are valuable, versatile tools used for hypothesis generation and testing, codification of biological theory, education, and patient-specific modeling. Driven by recent advances in computational power and the accumulation of systems-level experimental data, modelers today are creating models with an unprecedented level of…

  18. Computational Medicine: Translating Models to Clinical Care

    PubMed Central

    Winslow, Raimond L.; Trayanova, Natalia; Geman, Donald; Miller, Michael I.

    2013-01-01

    Because of the inherent complexity of coupled nonlinear biological systems, the development of computational models is necessary for achieving a quantitative understanding of their structure and function in health and disease. Statistical learning is applied to high-dimensional biomolecular data to create models that describe relationships between molecules and networks. Multiscale modeling links networks to cells, organs, and organ systems. Computational approaches are used to characterize anatomic shape and its variations in health and disease. In each case, the purposes of modeling are to capture all that we know about disease and to develop improved therapies tailored to the needs of individuals. We discuss advances in computational medicine, with specific examples in the fields of cancer, diabetes, cardiology, and neurology. Advances in translating these computational methods to the clinic are described, as well as challenges in applying models for improving patient health. PMID:23115356

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Habte, A.; Sengupta, M.; Wilcox, S.

    Models to compute Global Horizontal Irradiance (GHI) and Direct Normal Irradiance (DNI) have been in development over the last 3 decades. These models can be classified as empirical or physical, based on the approach. Empirical models relate ground-based observations with satellite measurements and use these relations to compute surface radiation. Physical models consider the radiation received from the earth at the satellite and create retrievals to estimate surface radiation. While empirical methods have traditionally been used for computing surface radiation for the solar energy industry, the advent of faster computing has made operational physical models viable. The Global Solar Insolation Project (GSIP) is an operational physical model from NOAA that computes GHI using the visible and infrared channel measurements from the GOES satellites. GSIP uses a two-stage scheme that first retrieves cloud properties and uses those properties in a radiative transfer model to calculate surface radiation. NREL, the University of Wisconsin, and NOAA have recently collaborated to adapt GSIP to create a 4 km GHI and DNI product every 30 minutes. This paper presents an outline of the methodology and a comprehensive validation using high-quality ground-based solar data from the National Oceanic and Atmospheric Administration (NOAA) Surface Radiation (SURFRAD) network (http://www.srrb.noaa.gov/surfrad/sitepage.html) and Integrated Surface Insolation Study (ISIS) network (http://www.srrb.noaa.gov/isis/isissites.html), the Solar Radiation Research Laboratory (SRRL) at the National Renewable Energy Laboratory (NREL), and Sun Spot One (SS1) stations.

  20. Temporal Subtraction of Digital Breast Tomosynthesis Images for Improved Mass Detection

    DTIC Science & Technology

    2008-10-01

    …K. Fishman and B. M. W. Tsui, "Development of a computer-generated model for the coronary arterial tree based on multislice CT and morphometric data…" …mathematical models based on geometric primitives [8-22]. Bakic et al. created synthetic x-ray mammograms using a 3D simulated breast tissue model consisting of… …utilized a combination of voxel matrices and geometric primitives to create a breast phantom that includes the breast surface, the duct system, and…

  1. Structure of the Brazilian Sign Language (Libras) for Computational Tools: Citizenship and Social Inclusion

    NASA Astrophysics Data System (ADS)

    Guimaraes, Cayley; Antunes, Diego R.; de F. Guilhermino Trindade, Daniela; da Silva, Rafaella A. Lopes; Garcia, Laura Sanchez

    This work presents a computational model, encoded in XML, of the Brazilian Sign Language (Libras), based on its phonology. The model was used to create a sample of representative signs to aid the recording of a base of videos whose aim is to support the development of tools for genuine social inclusion of the deaf.

  2. Evaluation of a Computational Model of Situational Awareness

    NASA Technical Reports Server (NTRS)

    Burdick, Mark D.; Shively, R. Jay; Rutkewski, Michael (Technical Monitor)

    2000-01-01

    Although the use of the psychological construct of situational awareness (SA) assists researchers in creating a flight environment that is safer and more predictable, its true potential remains untapped until a valid means of predicting SA a priori becomes available. Previous work proposed a computational model of SA (CSA) that sought to fill that void. The current line of research is aimed at validating that model. The results show that the model accurately predicted SA in a piloted simulation.

  3. Computational Modeling for Language Acquisition: A Tutorial With Syntactic Islands.

    PubMed

    Pearl, Lisa S; Sprouse, Jon

    2015-06-01

    Given the growing prominence of computational modeling in the acquisition research community, we present a tutorial on how to use computational modeling to investigate learning strategies that underlie the acquisition process. This is useful for understanding both typical and atypical linguistic development. We provide a general overview of why modeling can be a particularly informative tool and some general considerations when creating a computational acquisition model. We then review a concrete example of a computational acquisition model for complex structural knowledge referred to as syntactic islands. This includes an overview of syntactic islands knowledge, a precise definition of the acquisition task being modeled, the modeling results, and how to meaningfully interpret those results in a way that is relevant for questions about knowledge representation and the learning process. Computational modeling is a powerful tool that can be used to understand linguistic development. The general approach presented here can be used to investigate any acquisition task and any learning strategy, provided both are precisely defined.

  4. VRML Industry: Microcosms in the Making.

    ERIC Educational Resources Information Center

    Brown, Eric

    1998-01-01

    Discusses VRML (Virtual Reality Modeling Language) technology and some of its possible applications, including creating three-dimensional images on the Web, advertising, and data visualization in computer-assisted design and computer-assisted manufacturing (CAD/CAM). Future improvements are discussed, including streaming, database support, and…

  5. Creating a Pipeline for African American Computing Science Faculty: An Innovative Faculty/Research Mentoring Program Model

    ERIC Educational Resources Information Center

    Charleston, LaVar J.; Gilbert, Juan E.; Escobar, Barbara; Jackson, Jerlando F. L.

    2014-01-01

    African Americans represent 1.3% of all computing sciences faculty in PhD-granting departments, underscoring the severe underrepresentation of Black/African American tenure-track faculty in computing (CRA, 2012). The Future Faculty/Research Scientist Mentoring (FFRM) program, funded by the National Science Foundation, was found to be an effective…

  6. OpenKIM - Building a Knowledgebase of Interatomic Models

    NASA Astrophysics Data System (ADS)

    Bierbaum, Matthew; Tadmor, Ellad; Elliott, Ryan; Wennblom, Trevor; Alemi, Alexander; Chen, Yan-Jiun; Karls, Daniel; Ludvik, Adam; Sethna, James

    2014-03-01

    The Knowledgebase of Interatomic Models (KIM) is an effort by the computational materials community to provide a standard interface for the development, characterization, and use of interatomic potentials. The KIM project has developed an API between simulation codes and interatomic models written in several different languages including C, Fortran, and Python. This interface is already supported in popular simulation environments such as LAMMPS and ASE, giving quick access to over a hundred compatible potentials that have been contributed so far. To compare and characterize models, we have developed a computational processing pipeline which automatically runs a series of tests for each model in the system, such as phonon dispersion relations and elastic constant calculations. To view the data from these tests, we created a rich set of interactive visualization tools located online. Finally, we created a Web repository to store and share these potentials, tests, and visualizations, which can be found at https://openkim.org along with further information.
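
    As a hedged sketch of what such quick access looks like in practice, ASE's KIM calculator can attach an OpenKIM potential to a structure. This assumes the ase package and the KIM API are installed, and the model identifier below is only an example; substitute one browsed from https://openkim.org.

    ```python
    # Hedged sketch: attach a KIM-compatible potential to an ASE structure.
    # Requires ase plus the KIM API; the model name is illustrative.
    from ase.build import bulk
    from ase.calculators.kim import KIM

    atoms = bulk("Ar", "fcc", a=5.26)  # simple fcc argon cell
    atoms.calc = KIM("LJ_ElliottAkerson_2015_Universal__MO_959249795837_003")
    print("potential energy per atom:",
          atoms.get_potential_energy() / len(atoms))
    ```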

  7. Including crystal structure attributes in machine learning models of formation energies via Voronoi tessellations

    NASA Astrophysics Data System (ADS)

    Ward, Logan; Liu, Ruoqian; Krishna, Amar; Hegde, Vinay I.; Agrawal, Ankit; Choudhary, Alok; Wolverton, Chris

    2017-07-01

    While high-throughput density functional theory (DFT) has become a prevalent tool for materials discovery, it is limited by the relatively large computational cost. In this paper, we explore using DFT data from high-throughput calculations to create faster, surrogate models with machine learning (ML) that can be used to guide new searches. Our method works by using decision tree models to map DFT-calculated formation enthalpies to a set of attributes consisting of two distinct types: (i) composition-dependent attributes of elemental properties (as have been used in previous ML models of DFT formation energies), combined with (ii) attributes derived from the Voronoi tessellation of the compound's crystal structure. The ML models created using this method have half the cross-validation error and similar training and evaluation speeds to models created with the Coulomb matrix and partial radial distribution function methods. For a dataset of 435 000 formation energies taken from the Open Quantum Materials Database (OQMD), our model achieves a mean absolute error of 80 meV/atom in cross validation, which is lower than the approximate error between DFT-computed and experimentally measured formation enthalpies and below 15% of the mean absolute deviation of the training set. We also demonstrate that our method can accurately estimate the formation energy of materials outside of the training set and be used to identify materials with especially large formation enthalpies. We propose that our models can be used to accelerate the discovery of new materials by identifying the most promising materials to study with DFT at little additional computational cost.
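
    A minimal sketch of the paper's modeling recipe: decision-tree ensembles regressing formation energy on attribute vectors, scored by cross-validation. The synthetic attributes and toy target below stand in for the OQMD data and the authors' composition and Voronoi attribute sets.

    ```python
    # Sketch of the recipe only: tree-ensemble regression of formation
    # energy on attribute vectors. Data are synthetic stand-ins, not OQMD.
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.model_selection import cross_val_score

    rng = np.random.default_rng(0)
    X = rng.random((500, 10))   # stand-in composition/structure attributes
    y = X[:, 0] - 2.0 * X[:, 3] + 0.1 * rng.standard_normal(500)  # toy target

    model = RandomForestRegressor(n_estimators=200, random_state=0)
    scores = cross_val_score(model, X, y, cv=5,
                             scoring="neg_mean_absolute_error")
    print("cross-validation MAE: %.3f (toy units)" % -scores.mean())
    ```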

  8. Scaffolding Learning by Modelling: The Effects of Partially Worked-out Models

    ERIC Educational Resources Information Center

    Mulder, Yvonne G.; Bollen, Lars; de Jong, Ton; Lazonder, Ard W.

    2016-01-01

    Creating executable computer models is a potentially powerful approach to science learning. Learning by modelling is also challenging because students can easily get overwhelmed by the inherent complexities of the task. This study investigated whether offering partially worked-out models can facilitate students' modelling practices and promote…

  9. Automatic mathematical modeling for real time simulation program (AI application)

    NASA Technical Reports Server (NTRS)

    Wang, Caroline; Purinton, Steve

    1989-01-01

    A methodology is described for automatic mathematical modeling and the generation of simulation models. The major objective was to create a user-friendly environment for engineers to design, maintain, and verify their models; to automatically convert the mathematical models into conventional code for computation; and finally, to document the model automatically.

  10. Holodeck Testbed Project

    NASA Technical Reports Server (NTRS)

    Arias, Adriel (Inventor)

    2016-01-01

    The main objective of the Holodeck Testbed is to create a cost-effective, realistic, and highly immersive environment that can be used to train astronauts, carry out engineering analysis, develop procedures, and support various operations tasks. Currently, the Holodeck Testbed allows users to step into a simulated ISS (International Space Station) and interact with objects, as well as perform Extra Vehicular Activities (EVA) on the surface of the Moon or Mars. The Holodeck Testbed uses the products being developed in the Hybrid Reality Lab (HRL). The HRL is combining technologies related to merging physical models with photo-realistic visuals to create a realistic and highly immersive environment. The lab also investigates technologies and concepts needed to allow it to be integrated with other testbeds, such as the gravity offload capability provided by the Active Response Gravity Offload System (ARGOS). My two main duties were to develop and animate models for use in the HRL environments and to work on a new way to interface with computers using Brain Computer Interface (BCI) technology. On my first task, I was able to create precise computer virtual tool models (accurate down to the hundredths or thousandths of an inch). To make these tools even more realistic, I produced animations for them so they would have the same mechanical features as the tools in real life. The computer models were also used to create 3D printed replicas that will be outfitted with tracking sensors. The sensors will allow the 3D printed models to align precisely with the computer models in the physical world and provide people with haptic/tactile feedback while wearing a VR (Virtual Reality) headset and interacting with the tools. Near the end of my internship, the lab bought a professional-grade 3D scanner, with which I was able to replicate more intricate tools at a much more time-effective rate. The second task was to investigate the use of BCI to control objects inside the hybrid reality ISS environment. This task looked at using an electroencephalogram (EEG) headset to collect brain-state data that could be mapped to commands a computer could execute. On this task, I had a setback with the hardware, which stopped working and was returned to the vendor for repair. However, I was still able to collect some data, process it, and start creating correlation algorithms between the electrical patterns in the brain and the commands we wanted the computer to carry out. I also carried out a test to investigate the comfort of the headset when it is worn for a long time. The knowledge gained will benefit me in my future career. I learned how to use various modeling and programming tools, including Blender, Maya, Substance Painter, Artec Studio, GitHub, and Unreal Engine 4. I learned how to use a professional-grade 3D scanner and 3D printer. On the BCI project I learned about data mining and how to create correlation algorithms. I also supported various demos, including a live demo of the hybrid reality lab capabilities at ComicPalooza. This internship has given me a good look into engineering at NASA. I developed a more thorough understanding of engineering, and my overall confidence has grown. I have also realized that any problem can be fixed if you try hard enough, and that as an engineer it is your job not only to fix problems but to embrace coming up with solutions to them.

  11. Proceedings of Conference on Variable-Resolution Modeling, Washington, DC, 5-6 May 1992

    DTIC Science & Technology

    1992-05-01

    …of powerful new computer architectures for supporting object-oriented computing. Objects, as self-contained data-code packages with orderly… another entity structure. For example, (copy-entstr e:system 'new-system) creates an entity structure named e:new-system that has the same structure… Parry, S.-H. (1984): A Self-contained Hierarchical Model Construct. In: Systems Analysis and Modeling in Defense (R.K. Huber, Ed.), New York.

  12. Presenting Mitosis

    ERIC Educational Resources Information Center

    Roche, Stephanie; Sterling, Donna R.

    2005-01-01

    When the topic of cell division is introduced in the classroom, students can showcase their interpretations of the stages of mitosis by creating a slide show illustrating prophase, metaphase, anaphase, and telophase (see samples in Figure 1). With the help of a computer, they can create a model of mitosis that will help them distinguish the…

  13. Neural Dynamics as Sampling: A Model for Stochastic Computation in Recurrent Networks of Spiking Neurons

    PubMed Central

    Buesing, Lars; Bill, Johannes; Nessler, Bernhard; Maass, Wolfgang

    2011-01-01

    The organization of computations in networks of spiking neurons in the brain is still largely unknown, in particular in view of the inherently stochastic features of their firing activity and the experimentally observed trial-to-trial variability of neural systems in the brain. In principle there exists a powerful computational framework for stochastic computations, probabilistic inference by sampling, which can explain a large number of macroscopic experimental data in neuroscience and cognitive science. But it has turned out to be surprisingly difficult to create a link between these abstract models for stochastic computations and more detailed models of the dynamics of networks of spiking neurons. Here we create such a link and show that under some conditions the stochastic firing activity of networks of spiking neurons can be interpreted as probabilistic inference via Markov chain Monte Carlo (MCMC) sampling. Since common methods for MCMC sampling in distributed systems, such as Gibbs sampling, are inconsistent with the dynamics of spiking neurons, we introduce a different approach based on non-reversible Markov chains that is able to reflect inherent temporal processes of spiking neuronal activity through a suitable choice of random variables. We propose a neural network model and show by a rigorous theoretical analysis that its neural activity implements MCMC sampling of a given distribution, both for the case of discrete and continuous time. This provides a step towards closing the gap between abstract functional models of cortical computation and more detailed models of networks of spiking neurons. PMID:22096452
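
    The baseline idea that a network's stochastic state sequence implements MCMC can be sketched with ordinary Gibbs sampling over binary units of a small Boltzmann distribution. Note that the paper's actual contribution is a non-reversible chain compatible with spiking dynamics, which this reversible toy does not capture; the coupling matrix and biases below are arbitrary.

    ```python
    # Gibbs sampling over binary "neurons" of a toy Boltzmann distribution
    # p(z) proportional to exp(z^T W z / 2 + b^T z). Illustrates only the
    # standard reversible sampler that the paper's approach generalizes.
    import numpy as np

    rng = np.random.default_rng(1)
    W = np.array([[ 0.0, 1.5, -1.0],
                  [ 1.5, 0.0,  0.5],
                  [-1.0, 0.5,  0.0]])   # symmetric coupling, zero diagonal
    b = np.array([-0.5, 0.2, 0.1])
    z = rng.integers(0, 2, size=3).astype(float)

    counts = np.zeros(3)
    for t in range(20000):
        k = t % 3                              # sweep units in order
        u = W[k] @ z + b[k]                    # membrane-like potential
        z[k] = float(rng.random() < 1.0 / (1.0 + np.exp(-u)))
        counts += z
    print("marginal firing probabilities:", counts / 20000)
    ```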

  14. Creating an Electronic Reference and Information Database for Computer-aided ECM Design

    NASA Astrophysics Data System (ADS)

    Nekhoroshev, M. V.; Pronichev, N. D.; Smirnov, G. V.

    2018-01-01

    The paper presents a review of electrochemical shaping. An algorithm has been developed to implement a computer shaping model applicable to pulse electrochemical machining. For that purpose, the characteristics of the pulse current occurring in electrochemical machining of aviation materials have been studied. By integrating the experimental results with comprehensive modeling of electrochemical machining process data, a subsystem for computer-aided design of electrochemical machining of gas turbine engine blades has been developed; the subsystem was implemented in the Teamcenter PLM system.

  15. The study of early human embryos using interactive 3-dimensional computer reconstructions.

    PubMed

    Scarborough, J; Aiton, J F; McLachlan, J C; Smart, S D; Whiten, S C

    1997-07-01

    Tracings of serial histological sections from 4 human embryos at different Carnegie stages were used to create 3-dimensional (3D) computer models of the developing heart. The models were constructed using commercially available software developed for graphic design and the production of computer generated virtual reality environments. They are available as interactive objects which can be downloaded via the World Wide Web. This simple method of 3D reconstruction offers significant advantages for understanding important events in morphological sciences.

  16. Design Through Manufacturing: The Solid Model - Finite Element Analysis Interface

    NASA Technical Reports Server (NTRS)

    Rubin, Carol

    2003-01-01

    State-of-the-art computer aided design (CAD) presently affords engineers the opportunity to create solid models of machine parts which reflect every detail of the finished product. Ideally, these models should fulfill two very important functions: (1) they must provide numerical control information for automated manufacturing of precision parts, and (2) they must enable analysts to easily evaluate the stress levels (using finite element analysis - FEA) for all structurally significant parts used in space missions. Today's state-of-the-art CAD programs perform function (1) very well, providing an excellent model for precision manufacturing. But they do not provide a straightforward and simple means of automating the translation from CAD to FEA models, especially for aircraft-type structures. The research performed during the fellowship period investigated the transition process from the solid CAD model to the FEA stress analysis model with the final goal of creating an automatic interface between the two. During the period of the fellowship a detailed multi-year program for the development of such an interface was created. The ultimate goal of this program will be the development of a fully parameterized automatic ProE/FEA translator for parts and assemblies, with the incorporation of data base management into the solution, and ultimately including computational fluid dynamics and thermal modeling in the interface.

  17. The IRGen infrared data base modeler

    NASA Technical Reports Server (NTRS)

    Bernstein, Uri

    1993-01-01

    IRGen is a modeling system which creates three-dimensional IR data bases for real-time simulation of thermal IR sensors. Starting from a visual data base, IRGen computes the temperature and radiance of every data base surface with a user-specified thermal environment. The predicted gray shade of each surface is then computed from the user specified sensor characteristics. IRGen is based on first-principles models of heat transport and heat flux sources, and it accurately simulates the variations of IR imagery with time of day and with changing environmental conditions. The starting point for creating an IRGen data base is a visual faceted data base, in which every facet has been labeled with a material code. This code is an index into a material data base which contains surface and bulk thermal properties for the material. IRGen uses the material properties to compute the surface temperature at the specified time of day. IRGen also supports image generator features such as texturing and smooth shading, which greatly enhance image realism.
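
    The final radiometric step described above, turning a predicted surface temperature into a sensor gray shade, can be sketched under strong simplifying assumptions (total-band Stefan-Boltzmann emission from a gray, diffuse surface and a linear display mapping). This is illustrative only, not IRGen's sensor model.

    ```python
    # Illustrative radiance-to-gray-shade mapping under simplifying
    # assumptions; not IRGen's first-principles sensor model.
    SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

    def radiance(temp_k, emissivity):
        """Total emitted radiance (W m^-2 sr^-1) of a gray, diffuse surface."""
        return emissivity * SIGMA * temp_k**4 / 3.14159265

    def gray_shade(rad, rad_min, rad_max, levels=256):
        """Linear radiometric mapping into the sensor's display range."""
        frac = min(max((rad - rad_min) / (rad_max - rad_min), 0.0), 1.0)
        return int(frac * (levels - 1))

    for temp in (280.0, 300.0, 320.0):
        r = radiance(temp, emissivity=0.9)
        print(temp, "K ->",
              gray_shade(r, radiance(270, 0.9), radiance(330, 0.9)))
    ```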

  18. The emergence of understanding in a computer model of concepts and analogy-making

    NASA Astrophysics Data System (ADS)

    Mitchell, Melanie; Hofstadter, Douglas R.

    1990-06-01

    This paper describes Copycat, a computer model of the mental mechanisms underlying the fluidity and adaptability of the human conceptual system in the context of analogy-making. Copycat creates analogies between idealized situations in a microworld that has been designed to capture and isolate many of the central issues of analogy-making. In Copycat, an understanding of the essence of a situation and the recognition of deep similarity between two superficially different situations emerge from the interaction of a large number of perceptual agents with an associative, overlapping, and context-sensitive network of concepts. Central features of the model are: a high degree of parallelism; competition and cooperation among a large number of small, locally acting agents that together create a global understanding of the situation at hand; and a computational temperature that measures the amount of perceptual organization as processing proceeds and that in turn controls the degree of randomness with which decisions are made in the system.
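
    Copycat's computational temperature can be illustrated with a softmax choice rule: at high temperature, competing options are chosen almost uniformly, and as the temperature falls the system commits to the strongest option. The option names and scores in the sketch are invented; Copycat's real decision making is distributed across many agents.

    ```python
    # Toy temperature-controlled choice (hypothetical scores, not Copycat).
    import math, random

    def choose(options, temperature):
        """Softmax selection: high T is nearly uniform, low T is greedy."""
        weights = [math.exp(score / temperature) for _, score in options]
        r = random.uniform(0, sum(weights))
        for (name, _), w in zip(options, weights):
            r -= w
            if r <= 0:
                return name
        return options[-1][0]

    options = [("literal mapping", 1.0), ("abstract mapping", 2.0)]
    for T in (10.0, 1.0, 0.1):
        picks = [choose(options, T) for _ in range(1000)]
        print("T=%.1f -> abstract chosen %d/1000"
              % (T, picks.count("abstract mapping")))
    ```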

  19. [Computer aided design for fixed partial denture framework based on reverse engineering technology].

    PubMed

    Sun, Yu-chun; Lü, Pei-jun; Wang, Yong

    2006-03-01

    To explore a computer-aided design (CAD) route for the framework of domestic fixed partial dentures (FPD) and to confirm a suitable method of 3-D CAD. The working area of a dentition model was scanned with a 3-D mechanical scanner. Using reverse engineering (RE) software, margin and border curves were extracted, and several reference curves were created to establish the dimension and location of the pontic framework, which was taken from the standard database. The shoulder parts of the retainers were created after the axial surfaces were constructed. The connecting areas, axial line, and curving surface of the framework connector were finally created. The framework of a three-unit FPD was designed with RE technology, which showed smooth surfaces and continuous contours. The design route is practical. The result of this study is significant in theory and practice, and will provide a reference for establishing the computer aided design/computer aided manufacture (CAD/CAM) system of domestic FPD.

  20. Advanced earth observation spacecraft computer-aided design software: Technical, user and programmer guide

    NASA Technical Reports Server (NTRS)

    Farrell, C. E.; Krauze, L. D.

    1983-01-01

    The IDEAS computer program of NASA is a tool for interactive preliminary design and analysis of LSS (Large Space System). Nine analysis modules were either modified or created. These modules include the capabilities of automatic model generation, model mass properties calculation, model area calculation, nonkinematic deployment modeling, rigid-body controls analysis, RF performance prediction, subsystem properties definition, and EOS science sensor selection. For each module, a section is provided that contains technical information, user instructions, and programmer documentation.

  1. A Comparison of Approximation Modeling Techniques: Polynomial Versus Interpolating Models

    NASA Technical Reports Server (NTRS)

    Giunta, Anthony A.; Watson, Layne T.

    1998-01-01

    Two methods of creating approximation models are compared through the calculation of the modeling accuracy on test problems involving one, five, and ten independent variables. Here, the test problems are representative of the modeling challenges typically encountered in realistic engineering optimization problems. The first approximation model is a quadratic polynomial created using the method of least squares. This type of polynomial model has seen considerable use in recent engineering optimization studies due to its computational simplicity and ease of use. However, quadratic polynomial models may be of limited accuracy when the response data to be modeled have multiple local extrema. The second approximation model employs an interpolation scheme known as kriging developed in the fields of spatial statistics and geostatistics. This class of interpolating model has the flexibility to model response data with multiple local extrema. However, this flexibility is obtained at an increase in computational expense and a decrease in ease of use. The intent of this study is to provide an initial exploration of the accuracy and modeling capabilities of these two approximation methods.
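
    The contrast described above can be reproduced in a few lines: a least-squares quadratic cannot follow a response with several local extrema, while a kriging-style interpolator can (here scikit-learn's Gaussian process regressor serves as a stand-in for the authors' kriging code). The one-variable test function is invented for illustration.

    ```python
    # Quadratic least squares vs. a kriging-style Gaussian process on a
    # toy response with multiple local extrema (not the paper's test suite).
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import RBF

    x = np.linspace(0, 1, 12)
    y = np.sin(6 * np.pi * x) + x          # several local extrema

    quad = np.poly1d(np.polyfit(x, y, 2))  # least-squares quadratic
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.1)).fit(x[:, None], y)

    x_test = np.linspace(0, 1, 5)
    print("quadratic:", np.round(quad(x_test), 2))
    print("kriging:  ", np.round(gp.predict(x_test[:, None]), 2))
    print("truth:    ", np.round(np.sin(6 * np.pi * x_test) + x_test, 2))
    ```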

  2. Composite panel development at JPL

    NASA Technical Reports Server (NTRS)

    Mcelroy, Paul; Helms, Rich

    1988-01-01

    Parametric computer studies can be used in a cost-effective manner to determine optimized composite mirror panel designs. An InterDisciplinary computer Model (IDM) was created to aid in the development of high-precision reflector panels for LDR. The materials properties, thermal responses, structural geometries, and radio/optical precision are synergistically analyzed for specific panel designs. Promising panel designs are fabricated and tested so that comparison with panel test results can be used to verify performance prediction models and accommodate design refinement. The iterative approach of computer design and model refinement with performance testing and materials optimization has shown good results for LDR panels.

  3. A Multi-Fidelity Surrogate Model for the Equation of State for Mixtures of Real Gases

    NASA Astrophysics Data System (ADS)

    Ouellet, Frederick; Park, Chanyoung; Koneru, Rahul; Balachandar, S.; Rollin, Bertrand

    2017-11-01

    The explosive dispersal of particles is a complex multiphase and multi-species fluid flow problem. In these flows, the products of detonated explosives must be treated as real gases while the ideal gas equation of state is used for the ambient air. As the products expand outward, they mix with the air and create a region where both state equations must be satisfied. One of the most accurate, yet expensive, methods to handle this problem is an algorithm that iterates between both state equations until both pressure and thermal equilibrium are achieved inside of each computational cell. This work creates a multi-fidelity surrogate model to replace this process. This is achieved by using a Kriging model to produce a curve fit which interpolates selected data from the iterative algorithm. The surrogate is optimized for computing speed and model accuracy by varying the number of sampling points chosen to construct the model. The performance of the surrogate with respect to the iterative method is tested in simulations using a finite volume code. The model's computational speed and accuracy are analyzed to show the benefits of this novel approach. This work was supported by the U.S. Department of Energy, National Nuclear Security Administration, Advanced Simulation and Computing Program, as a Cooperative Agreement under the Predictive Science Academic Alliance Program, under Contract No. DE-NA00023.
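
    For concreteness, here is a sketch of the kind of equilibrium iteration the surrogate replaces, under simplifying assumptions: thermal equilibrium is taken as given, the air is ideal, and the products follow a stiffened-gas equation of state. All parameter values are illustrative rather than calibrated to any explosive, and the real algorithm iterates on pressure and temperature together.

    ```python
    # Sketch of a mechanical-equilibrium iteration in one mixed cell,
    # under stated simplifying assumptions; all numbers are illustrative.
    def p_air(v_air, m_air=0.5, R=287.0, T=1500.0):
        return m_air * R * T / v_air                    # ideal gas, Pa

    def p_prod(v_prod, m_prod=0.5, e=2.0e6, gamma=3.0, p_inf=1.0e8):
        rho = m_prod / v_prod
        return (gamma - 1.0) * rho * e - gamma * p_inf  # stiffened gas, Pa

    def equilibrate(v_total=1.0, tol=1.0, iters=200):
        """Bisect on the air sub-volume until the two pressures match."""
        lo, hi = 1e-9, v_total - 1e-9
        mid = 0.5 * (lo + hi)
        for _ in range(iters):
            mid = 0.5 * (lo + hi)
            diff = p_air(mid) - p_prod(v_total - mid)
            if abs(diff) < tol:
                break
            if diff > 0:
                lo = mid      # air pressure too high: give air more room
            else:
                hi = mid
        return mid, p_air(mid)

    v_air, p_eq = equilibrate()
    print("air volume fraction %.4f, equilibrium pressure %.3e Pa"
          % (v_air, p_eq))
    ```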

  4. The "Day in the Life of a Teenage Hobo" Project: Integrating Technology with Shneiderman's Collect-Relate-Create- Donate Framework

    ERIC Educational Resources Information Center

    Reich, Justin; Daccord, Thomas

    2009-01-01

    Used wisely, academic technology empowers students to take responsibility for their own learning. "In Leonardo's Laptop," Ben Shneiderman provides teachers with a powerful framework, "Collect-Relate-Create-Donate" (CRCD), for designing student-centered learning opportunities using computers. Shneiderman developed his model by…

  5. Computational modeling to predict mechanical function of joints: application to the lower leg with simulation of two cadaver studies.

    PubMed

    Liacouras, Peter C; Wayne, Jennifer S

    2007-12-01

    Computational models of musculoskeletal joints and limbs can provide useful information about joint mechanics. Validated models can be used as predictive devices for understanding joint function and serve as clinical tools for predicting the outcome of surgical procedures. A new computational modeling approach was developed for simulating joint kinematics that are dictated by bone/joint anatomy, ligamentous constraints, and applied loading. Three-dimensional computational models of the lower leg were created to illustrate the application of this new approach. Model development began with generating three-dimensional surfaces of each bone from CT images and then importing them into the three-dimensional solid modeling software SOLIDWORKS and the motion simulation package COSMOSMOTION. Through SOLIDWORKS and COSMOSMOTION, each bone surface file was filled to create a solid object and positioned, necessary components were added, and simulations were executed. Three-dimensional contacts were added to inhibit intersection of the bones during motion. Ligaments were represented as linear springs. Model predictions were then validated by comparison to two different cadaver studies: syndesmotic injury and repair, and ankle inversion following ligament transection. The syndesmotic injury model was able to predict tibial rotation, fibular rotation, and anterior/posterior displacement. In the inversion simulation, calcaneofibular ligament extension and angles of inversion compared well. Some experimental data proved harder to simulate accurately, due to certain software limitations and lack of complete experimental data. Other parameters that could not be easily obtained experimentally can be predicted and analyzed by the computational simulations. In the syndesmotic injury study, the force generated in the tibionavicular and calcaneofibular ligaments reduced with the insertion of the staple, indicating how this repair technique changes joint function. After transection of the calcaneofibular ligament in the inversion stability study, a major increase in force was seen in several of the ligaments on the lateral aspect of the foot and ankle, indicating the recruitment of other structures to permit function after injury. Overall, the computational models were able to predict joint kinematics of the lower leg with particular focus on the ankle complex. This same approach can be taken to create models of other limb segments such as the elbow and wrist. Additional parameters can be calculated in the models that are not easily obtained experimentally, such as ligament forces, force transmission across joints, and three-dimensional movement of all bones. Muscle activation can be incorporated in the model through the action of applied forces within the software for future studies.
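
    The ligament representation mentioned above (a linear spring) reduces to a short, tension-only force law. The sketch below, with hypothetical stiffness, rest length, and insertion coordinates, illustrates that single ingredient of the model; it is not code from the study.

    ```python
    # Tension-only linear-spring ligament between insertion points.
    # Stiffness and coordinates are hypothetical stand-ins.
    import numpy as np

    def ligament_force(p_origin, p_insertion, rest_length, stiffness):
        """Force on the insertion point; zero when the ligament is slack."""
        d = np.asarray(p_insertion, float) - np.asarray(p_origin, float)
        length = np.linalg.norm(d)
        stretch = max(length - rest_length, 0.0)  # no compressive load
        return -stiffness * stretch * d / length  # pulls insertion inward

    f = ligament_force([0, 0, 0], [0.0, 0.0, 0.035],
                       rest_length=0.030, stiffness=9000.0)
    print("ligament force vector (N):", f)
    ```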

  6. A Multi-Fidelity Surrogate Model for Handling Real Gas Equations of State

    NASA Astrophysics Data System (ADS)

    Ouellet, Frederick; Park, Chanyoung; Rollin, Bertrand; Balachandar, S.

    2016-11-01

    The explosive dispersal of particles is an example of a complex multiphase and multi-species fluid flow problem. This problem has many engineering applications including particle-laden explosives. In these flows, the detonation products of the explosive cannot be treated as a perfect gas, so a real gas equation of state is used to close the governing equations (unlike air, which uses the ideal gas equation for closure). As the products expand outward from the detonation point, they mix with ambient air and create a mixing region where both of the state equations must be satisfied. One of the more accurate, yet computationally expensive, methods to deal with this is a scheme that iterates between the two equations of state until pressure and thermal equilibrium are achieved inside of each computational cell. This work strives to create a multi-fidelity surrogate model of this process. We then study the performance of the model with respect to the iterative method by performing both gas-only and particle-laden flow simulations using an Eulerian-Lagrangian approach with a finite volume code. Specifically, the model's (i) computational speed, (ii) memory requirements and (iii) computational accuracy are analyzed to show the benefits of this novel modeling approach. This work was supported by the U.S. Department of Energy, National Nuclear Security Administration, Advanced Simulation and Computing Program, as a Cooperative Agreement under the Predictive Science Academic Alliance Program, under Contract No. DE-NA00023.

  7. Advanced Technology for Portable Personal Visualization

    DTIC Science & Technology

    1991-03-01

    …walking. The building model is created using AutoCAD. Realism is enhanced by calculating a radiosity solution for the lighting model. This has an added… lighting, color combinations, and decor. Due to the computationally intensive nature of the radiosity solution, modeling changes cannot be made on-line…

  8. Associative Algorithms for Computational Creativity

    ERIC Educational Resources Information Center

    Varshney, Lav R.; Wang, Jun; Varshney, Kush R.

    2016-01-01

    Computational creativity, the generation of new, unimagined ideas or artifacts by a machine that are deemed creative by people, can be applied in the culinary domain to create novel and flavorful dishes. In fact, we have done so successfully using a combinatorial algorithm for recipe generation combined with statistical models for recipe ranking…

  9. Automated inverse computer modeling of borehole flow data in heterogeneous aquifers

    NASA Astrophysics Data System (ADS)

    Sawdey, J. R.; Reeve, A. S.

    2012-09-01

    A computer model has been developed to simulate borehole flow in heterogeneous aquifers where the vertical distribution of permeability may vary significantly. In crystalline fractured aquifers, flow into or out of a borehole occurs at discrete locations of fracture intersection. Under these circumstances, flow simulations are defined by independent variables of transmissivity and far-field heads for each flow contributing fracture intersecting the borehole. The computer program, ADUCK (A Downhole Underwater Computational Kit), was developed to automatically calibrate model simulations to collected flowmeter data providing an inverse solution to fracture transmissivity and far-field head. ADUCK has been tested in variable borehole flow scenarios, and converges to reasonable solutions in each scenario. The computer program has been created using open-source software to make the ADUCK model widely available to anyone who could benefit from its utility.
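
    A hedged sketch of the inverse step ADUCK automates: choosing fracture transmissivities so that a forward borehole-flow model reproduces the flowmeter profile. The linear inflow law and all numbers below are illustrative assumptions standing in for ADUCK's actual governing equations.

    ```python
    # Illustrative inverse calibration of fracture transmissivities to a
    # synthetic flowmeter profile; not ADUCK's formulation.
    import numpy as np
    from scipy.optimize import least_squares

    h_far = np.array([10.2, 9.7, 10.5])  # far-field heads at 3 fractures (m)
    h_bh = 10.0                          # open-borehole head (m)

    def forward(T):
        """Cumulative borehole flow measured above each fracture."""
        inflow = T * (h_far - h_bh)      # linear inflow at each fracture
        return np.cumsum(inflow[::-1])[::-1]

    T_true = np.array([2.0e-4, 5.0e-5, 1.2e-4])
    observed = forward(T_true) \
        + 1e-7 * np.random.default_rng(0).standard_normal(3)

    fit = least_squares(lambda T: forward(T) - observed, x0=np.full(3, 1e-4))
    print("recovered transmissivities:", fit.x)
    ```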

  10. Rapid Prototyping of Hydrologic Model Interfaces with IPython

    NASA Astrophysics Data System (ADS)

    Farthing, M. W.; Winters, K. D.; Ahmadia, A. J.; Hesser, T.; Howington, S. E.; Johnson, B. D.; Tate, J.; Kees, C. E.

    2014-12-01

    A significant gulf still exists between the state of practice and state of the art in hydrologic modeling. Part of this gulf is due to the lack of adequate pre- and post-processing tools for newly developed computational models. The development of user interfaces has traditionally lagged several years behind the development of a particular computational model or suite of models. As a result, models with mature interfaces often lack key advancements in model formulation, solution methods, and/or software design and technology. Part of the problem has been a focus on developing monolithic tools to provide comprehensive interfaces for the entire suite of model capabilities. Such efforts require expertise in software libraries and frameworks for creating user interfaces (e.g., Tcl/Tk, Qt, and MFC). These tools are complex and require significant investment in project resources (time and/or money) to use. Moreover, providing the required features for the entire range of possible applications and analyses creates a cumbersome interface. For a particular site or application, the modeling requirements may be simplified or at least narrowed, which can greatly reduce the number and complexity of options that need to be accessible to the user. However, monolithic tools usually are not adept at dynamically exposing specific workflows. Our approach is to deliver highly tailored interfaces to users. These interfaces may be site and/or process specific. As a result, we end up with many, customized interfaces rather than a single, general-use tool. For this approach to be successful, it must be efficient to create these tailored interfaces. We need technology for creating quality user interfaces that is accessible and has a low barrier for integration into model development efforts. Here, we present efforts to leverage IPython notebooks as tools for rapid prototyping of site and application-specific user interfaces. We provide specific examples from applications in near-shore environments as well as levee analysis. We discuss our design decisions and methodology for developing customized interfaces, strategies for delivery of the interfaces to users in various computing environments, as well as implications for the design/implementation of simulation models.
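
    A minimal example of the kind of rapid notebook prototyping described here is the interact pattern from ipywidgets, which turns a plain model-driver function into a slider-based interface; the seepage "model" and its parameter names below are hypothetical stand-ins, and the sketch assumes it is run inside a Jupyter/IPython notebook.

      # Minimal notebook UI sketch using ipywidgets.interact; the model
      # function is a hypothetical stand-in for a real simulation driver.
      import numpy as np
      import matplotlib.pyplot as plt
      from ipywidgets import interact

      def run_levee_seepage(conductivity=1e-5, head=4.0):
          """Stand-in 'model': plots a toy seepage profile for the inputs."""
          x = np.linspace(0.0, 50.0, 200)
          h = head * np.exp(-x * conductivity * 1e4)   # illustrative decay curve
          plt.plot(x, h)
          plt.xlabel("distance [m]")
          plt.ylabel("head [m]")
          plt.show()

      # Sliders are generated automatically from the (min, max, step) tuples.
      interact(run_levee_seepage, conductivity=(1e-6, 1e-4, 1e-6), head=(1.0, 10.0, 0.5))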

  11. High performance cellular level agent-based simulation with FLAME for the GPU.

    PubMed

    Richmond, Paul; Walker, Dawn; Coakley, Simon; Romano, Daniela

    2010-05-01

    Driven by the availability of experimental data and the ability to simulate a biological scale which is of immediate interest, the cellular scale is fast emerging as an ideal candidate for middle-out modelling. As with 'bottom-up' simulation approaches, cellular level simulations demand a high degree of computational power, which in large-scale simulations can only be achieved through parallel computing. The flexible large-scale agent modelling environment (FLAME) is a template-driven framework for agent-based modelling (ABM) on parallel architectures, ideally suited to the simulation of cellular systems. It is available for both high performance computing clusters (www.flame.ac.uk) and GPU hardware (www.flamegpu.com) and uses a formal specification technique that acts as a universal modelling format. This not only creates an abstraction from the underlying hardware architectures, but avoids the steep learning curve associated with programming them. In benchmarking tests and simulations of advanced cellular systems, FLAME GPU has demonstrated massive improvements in performance over more traditional ABM frameworks. This allows the time spent in the development and testing stages of modelling to be drastically reduced and creates the possibility of real-time visualisation for simple visual face-validation.

  12. CARE 3 user-friendly interface user's guide

    NASA Technical Reports Server (NTRS)

    Martensen, A. L.

    1987-01-01

    CARE 3 predicts the unreliability of highly reliable reconfigurable fault-tolerant systems that include redundant computers or computer systems. CARE3MENU is a user-friendly interface used to create an input for the CARE 3 program. The CARE3MENU interface has been designed to minimize user input errors. Although a CARE3MENU session may be successfully completed and all parameters may be within specified limits or ranges, the CARE 3 program is not guaranteed to produce meaningful results if the user incorrectly interprets the CARE 3 stochastic model. The CARE3MENU User Guide provides complete information on how to create a CARE 3 model with the interface. The CARE3MENU interface runs under the VAX/VMS operating system.

  13. A LANGUAGE FOR MODULAR SPATIO-TEMPORAL SIMULATION (R824766)

    EPA Science Inventory

    Creating an effective environment for collaborative spatio-temporal model development will require computational systems that provide support for the user in three key areas: (1) Support for modular, hierarchical model construction and archiving/linking of simulation modules; (2)...

  14. EXPOSURE ASSESSMENT MODELING FOR HYDROCARBON SPILLS INTO THE SUBSURFACE

    EPA Science Inventory

    Hydrocarbons which enter the subsurface through spills or leaks may create serious, long-lived ground-water contamination problems. Conventional finite difference and finite element models of multiphase, multicomponent flow often have extreme requirements for both computer time an...

  15. Application of computer graphics in the design of custom orthopedic implants.

    PubMed

    Bechtold, J E

    1986-10-01

    Implementation of newly developed computer modelling techniques, computer graphics displays, and software has greatly aided the orthopedic design engineer and physician in creating a custom implant with good anatomic conformity in a short turnaround time. Further advances in computerized design and manufacturing will continue to simplify the development of custom prostheses and enlarge their niche in the joint replacement market.

  16. Novel 3-D Computer Model Can Help Predict Pathogens’ Roles in Cancer | Poster

    Cancer.gov

    To understand how bacterial and viral infections contribute to human cancers, four NCI at Frederick scientists turned not to the lab bench, but to a computer. The team has created the world’s first—and currently, only—3-D computational approach for studying interactions between pathogen proteins and human proteins based on a molecular adaptation known as interface mimicry.

  17. Surrogate modeling of deformable joint contact using artificial neural networks.

    PubMed

    Eskinazi, Ilan; Fregly, Benjamin J

    2015-09-01

    Deformable joint contact models can be used to estimate loading conditions for cartilage-cartilage, implant-implant, human-orthotic, and foot-ground interactions. However, contact evaluations are often so expensive computationally that they can be prohibitive for simulations or optimizations requiring thousands or even millions of contact evaluations. To overcome this limitation, we developed a novel surrogate contact modeling method based on artificial neural networks (ANNs). The method uses special sampling techniques to gather input-output data points from an original (slow) contact model in multiple domains of input space, where each domain represents a different physical situation likely to be encountered. For each contact force and torque output by the original contact model, a multi-layer feed-forward ANN is defined, trained, and incorporated into a surrogate contact model. As an evaluation problem, we created an ANN-based surrogate contact model of an artificial tibiofemoral joint using over 75,000 evaluations of a fine-grid elastic foundation (EF) contact model. The surrogate contact model computed contact forces and torques about 1000 times faster than a less accurate coarse grid EF contact model. Furthermore, the surrogate contact model was seven times more accurate than the coarse grid EF contact model within the input domain of a walking motion. For larger input domains, the surrogate contact model showed the expected trend of increasing error with increasing domain size. In addition, the surrogate contact model was able to identify out-of-contact situations with high accuracy. Computational contact models created using our proposed ANN approach may remove an important computational bottleneck from musculoskeletal simulations or optimizations incorporating deformable joint contact models. Copyright © 2015 IPEM. Published by Elsevier Ltd. All rights reserved.
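
    The surrogate training loop can be sketched in a few lines; here scikit-learn's MLPRegressor stands in for the paper's per-output feed-forward networks, and the "slow" contact model is a made-up function rather than the elastic foundation model used by the authors.

      # Sketch of an ANN surrogate for an expensive contact model
      # (illustrative; 'slow_contact_model' is a stand-in, not the EF model).
      import numpy as np
      from sklearn.neural_network import MLPRegressor

      rng = np.random.default_rng(0)

      def slow_contact_model(pose):
          """Stand-in for the expensive evaluation: pose -> contact force."""
          tx, ty, rz = pose
          return max(0.0, 1000.0 * (0.01 - ty)) * (1.0 + 0.1 * np.sin(rz)) + 5.0 * tx**2

      # Sample the input space (in practice, multiple domains of likely poses).
      X = rng.uniform([-0.01, -0.005, -0.2], [0.01, 0.02, 0.2], size=(5000, 3))
      y = np.array([slow_contact_model(p) for p in X])

      # One feed-forward network per output force/torque component.
      surrogate = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=2000, random_state=0)
      surrogate.fit(X, y)

      # The fast surrogate replaces the slow model inside optimizations.
      print(surrogate.predict(X[:3]), y[:3])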

  18. Surrogate Modeling of Deformable Joint Contact using Artificial Neural Networks

    PubMed Central

    Eskinazi, Ilan; Fregly, Benjamin J.

    2016-01-01

    Deformable joint contact models can be used to estimate loading conditions for cartilage-cartilage, implant-implant, human-orthotic, and foot-ground interactions. However, contact evaluations are often so expensive computationally that they can be prohibitive for simulations or optimizations requiring thousands or even millions of contact evaluations. To overcome this limitation, we developed a novel surrogate contact modeling method based on artificial neural networks (ANNs). The method uses special sampling techniques to gather input-output data points from an original (slow) contact model in multiple domains of input space, where each domain represents a different physical situation likely to be encountered. For each contact force and torque output by the original contact model, a multi-layer feed-forward ANN is defined, trained, and incorporated into a surrogate contact model. As an evaluation problem, we created an ANN-based surrogate contact model of an artificial tibiofemoral joint using over 75,000 evaluations of a fine-grid elastic foundation (EF) contact model. The surrogate contact model computed contact forces and torques about 1000 times faster than a less accurate coarse grid EF contact model. Furthermore, the surrogate contact model was seven times more accurate than the coarse grid EF contact model within the input domain of a walking motion. For larger input domains, the surrogate contact model showed the expected trend of increasing error with increasing domain size. In addition, the surrogate contact model was able to identify out-of-contact situations with high accuracy. Computational contact models created using our proposed ANN approach may remove an important computational bottleneck from musculoskeletal simulations or optimizations incorporating deformable joint contact models. PMID:26220591

  19. Polarization Considerations for the Laser Interferometer Space Antenna

    NASA Technical Reports Server (NTRS)

    Waluschka, Eugene; Pedersen, Tracy R.; McNamara, Paul

    2005-01-01

    A polarization ray trace model of the Laser Interferometer Space Antenna's (LISA) optical path is being created. The model will be able to assess the effects of various polarizing elements and the optical coatings on the required, very long path length, picometer-level dynamic interferometry. The computational steps are described. This should eliminate any ambiguities associated with polarization ray tracing of interferometers, provide a basis for determining the computer model's limitations, and serve as a clearly defined starting point for future work.

  20. Utility usage forecasting

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hosking, Jonathan R. M.; Natarajan, Ramesh

    The computer creates a utility demand forecast model for weather parameters by: receiving a plurality of utility parameter values, each corresponding to a weather parameter value; determining that a range of weather parameter values lacks a sufficient amount of corresponding received utility parameter values; determining one or more utility parameter values corresponding to that range of weather parameter values; and creating a model which correlates the received and determined utility parameter values with the corresponding weather parameter values.
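
    A minimal sketch of this workflow, with invented load and temperature data: bin the weather parameter, flag under-observed ranges, impute demand values there, and fit a simple correlating model. The bin width and quadratic fit are illustrative choices, not the patent's method.

      # Sketch of the patented workflow with invented data: detect sparsely
      # observed weather ranges, impute demand there, then fit a model.
      import numpy as np

      temps = np.array([2, 5, 7, 8, 30, 31, 33, 35], dtype=float)     # weather values
      load = np.array([90, 80, 75, 72, 85, 88, 93, 99], dtype=float)  # utility values

      bins = np.arange(0, 41, 5)                 # 5-degree weather ranges
      counts, _ = np.histogram(temps, bins=bins)
      sparse = counts < 2                        # ranges lacking sufficient data
      centers = 0.5 * (bins[:-1] + bins[1:])

      # Determine utility values for the sparse ranges from the received data.
      filled_load = np.interp(centers[sparse], temps, load)

      # Create the model correlating received + determined values with weather.
      x = np.concatenate([temps, centers[sparse]])
      y = np.concatenate([load, filled_load])
      coeffs = np.polyfit(x, y, deg=2)           # simple quadratic demand curve
      print("sparse ranges at:", centers[sparse], "model coeffs:", coeffs)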

  1. CT-assisted reverse engineering for aging aircraft resupply

    NASA Astrophysics Data System (ADS)

    Yancey, Robert N.

    1998-03-01

    The service life of military aircraft is being extended beyond original design intents. When the C-130 is eventually retired, it will have been in service for 79 years, well beyond its planned life expectancy of 40 years. Similarly, the KC-135s are presently expected to remain operational for 86 years, and the B-52 for 94 years. Not only are inventories of parts in short supply, but it is necessary to acquire parts no one expected to replace. The first step in any resupply activity is the creation of a data package. If no computer-aided design (CAD) model exists, the demands of modern electronic commerce dictate that one be created. Creating a CAD model of an existing part is referred to as 'reverse engineering.' Computed tomography (CT) offers an ideal way to obtain metrology data critical to reverse engineering activities. Industrial CT systems have progressed to the point where they can nondestructively measure part dimensions at an accuracy competitive with coordinate measuring machines and a speed competitive with laser scanners. However, of the existing methods, only CT can nondestructively dimension interior surfaces, and only CT has the ability to densitometrically quantify the internal state of materials. The use of CT to help create CAD models for resupply efforts will be described and examples presented. Additionally, examples will be presented showing how the CT-created CAD models were then used to fabricate replacement parts for aging systems.

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kornreich, Drew E; Vaidya, Rajendra U; Ammerman, Curtt N

    Integrated Computational Materials Engineering (ICME) is a novel overarching approach to bridge length and time scales in computational materials science and engineering. This approach integrates all elements of multi-scale modeling (including various empirical and science-based models) with materials informatics to provide users the opportunity to tailor material selections based on stringent application needs. Typically, materials engineering has focused on structural requirements (stress, strain, modulus, fracture toughness, etc.) while multi-scale modeling has been science focused (mechanical threshold strength models, grain-size models, solid-solution strengthening models, etc.). Materials informatics (mechanical property inventories), on the other hand, is extensively data focused. All of these elements are combined within the framework of ICME to create an architecture for the development, selection, and design of new composite materials for challenging environments. We propose development of the foundations for applying ICME to composite materials development for nuclear and high-radiation environments (including nuclear-fusion energy reactors, nuclear-fission reactors, and accelerators). We expect to combine all elements of current material models (including thermo-mechanical and finite-element models) into the ICME framework. This will be accomplished through the use of various mathematical modeling constructs. These constructs will allow the integration of constituent models, which in turn would allow us to use the adaptive strengths of a combinatorial scheme (fabrication and computational) for creating new composite materials. A sample problem where these concepts are used is provided in this summary.

  3. A novel computer-aided method to fabricate a custom one-piece glass fiber dowel-and-core based on digitized impression and crown preparation data.

    PubMed

    Chen, Zhiyu; Li, Ya; Deng, Xuliang; Wang, Xinzhi

    2014-06-01

    Fiber-reinforced composite dowels have been widely used for their superior biomechanical properties; however, their preformed shape cannot fit irregularly shaped root canals. This study aimed to describe a novel computer-aided method to create a custom-made one-piece dowel-and-core based on the digitization of impressions and clinical standard crown preparations. A standard maxillary die stone model was made containing three prepared teeth (a maxillary lateral incisor, a canine, and a premolar), each requiring a dowel restoration. It was then mounted on an average value articulator with the mandibular stone model to simulate natural occlusion. Impressions for each tooth were obtained using vinylpolysiloxane with a sectional dual-arch tray and digitized with an optical scanner. The dowel-and-core virtual model was created by splicing 3D dowel data from the impression digitization with core data selected from a standard crown preparation database of 107 records collected from clinics and digitized. The position of the chosen digital core was manually adjusted to coordinate with the adjacent teeth and fulfill the crown restorative requirements. Based on the virtual models, one-piece custom dowel-and-cores for the three experimental teeth were milled from a glass fiber block with computer-aided manufacturing techniques. The one-piece glass fiber dowel-and-cores made for the experimental teeth fulfilled the clinical requirements for dowel restorations, and two patients were treated to validate the practicality of the technique. This novel computer-aided method to create a custom one-piece glass fiber dowel-and-core proved to be practical and efficient. © 2013 by the American College of Prosthodontists.

  4. Computational physics of the mind

    NASA Astrophysics Data System (ADS)

    Duch, Włodzisław

    1996-08-01

    In the XIX century and earlier, physicists such as Newton, Mayer, Hooke, Helmholtz and Mach were actively engaged in research on psychophysics, trying to relate psychological sensations to the intensities of physical stimuli. Computational physics makes it possible to simulate complex neural processes, giving a chance to answer not only the original psychophysical questions but also to create models of the mind. In this paper several approaches relevant to modeling of the mind are outlined. Since direct modeling of brain functions is rather limited due to the complexity of such models, a number of approximations are introduced. The path from the brain, or computational neurosciences, to the mind, or cognitive sciences, is sketched, with emphasis on higher cognitive functions such as memory and consciousness. No fundamental problems in understanding the mind seem to arise. From a computational point of view, realistic models require massively parallel architectures.

  5. Tree-Structured Digital Organisms Model

    NASA Astrophysics Data System (ADS)

    Suzuki, Teruhiko; Nobesawa, Shiho; Tahara, Ikuo

    Tierra and Avida are well-known models of digital organisms. They describe a life process as a sequence of computation codes. A linear sequence model is not the only way to describe a digital organism, though it is very simple for a computer-based model. We therefore propose a new digital organism model based on a tree structure, which is rather similar to genetic programming. In our model, a life process is a combination of various functions, much as life in the real world is. This implies that our model can easily describe the hierarchical structure of life, and it can simulate evolutionary computation through the mutual interaction of functions. We verified by simulation that our model can be regarded as a digital organism model according to its definitions. Our model even succeeded in creating species such as viruses and parasites.

  6. Dynamic reduction of dimensions of a document vector in a document search and retrieval system

    DOEpatents

    Jiao, Yu; Potok, Thomas E.

    2011-05-03

    The method and system of the invention involve processing each new document (20) coming into the system into a document vector (16), and creating a document vector with reduced dimensionality (17) for comparison with the data model (15) without recomputing the data model (15). These operations are carried out by a first computer (11) while a second computer (12) updates the data model (18), which can comprise an initial large group of documents (19); the approach is premised on computing an initial data model (13, 14, 15) to provide a reference point for determining document vectors for documents processed from the data stream (20).
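
    The split the patent describes resembles the folding-in pattern: fit a vectorizer and low-rank basis once (the data model), then project each streaming document without refitting. Below is a sketch using scikit-learn with a made-up corpus; it illustrates the idea, not the patented implementation.

      # Sketch of the two-computer split: a data model (TF-IDF + SVD basis)
      # is computed once, then new documents are reduced against it without
      # recomputing the model (illustrative, not the patent's code).
      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.decomposition import TruncatedSVD

      initial_corpus = ["reactor core temperature log",
                        "turbine maintenance report",
                        "core cooling system alarm",
                        "quarterly maintenance summary"]

      # Second computer's job: build the data model from the initial documents.
      vectorizer = TfidfVectorizer()
      X = vectorizer.fit_transform(initial_corpus)
      svd = TruncatedSVD(n_components=2, random_state=0)
      svd.fit(X)

      # First computer's job: reduce each new document against the fixed model.
      new_doc = ["unexpected core temperature alarm"]
      reduced = svd.transform(vectorizer.transform(new_doc))
      print(reduced)   # low-dimensional vector, ready for comparison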

  7. Learning from Listservs: Collaboration, Knowledge Exchange, and the Formation of Distributed Leadership for Farmers' Markets and the Food Movement

    ERIC Educational Resources Information Center

    Quintana, Maclovia; Morales, Alfonso

    2015-01-01

    Computer-mediated communications, in particular listservs, can be powerful tools for creating social change--namely, shifting our food system to a more healthy, just, and localised model. They do this by creating the conditions--collaborations, interaction, self-reflection, and personal empowerment--that cultivate distributed leadership. In this…

  8. Economic Model For a Return on Investment Analysis of United States Government High Performance Computing (HPC) Research and Development (R & D) Investment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joseph, Earl C.; Conway, Steve; Dekate, Chirag

    This study investigated how high-performance computing (HPC) investments can improve economic success and increase scientific innovation. This research focused on the common good and provided uses for DOE, other government agencies, industry, and academia. The study created two unique economic models and an innovation index: (1) a macroeconomic model that depicts the way HPC investments result in economic advancements in the form of ROI in revenue (GDP), profits (and cost savings), and jobs; and (2) a macroeconomic model that depicts the way HPC investments result in basic and applied innovations, looking at variations by sector, industry, country, and organization size; plus a new innovation index that provides a means of measuring and comparing innovation levels. Key findings of the pilot study include: IDC collected the required data across a broad set of organizations, with enough detail to create these models and the innovation index. The research also developed an expansive list of HPC success stories.

  9. Environmental Models as a Service: Enabling Interoperability through RESTful Endpoints and API Documentation

    EPA Science Inventory

    Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantag...

  10. Environmental Models as a Service: Enabling Interoperability through RESTful Endpoints and API Documentation.

    EPA Science Inventory

    Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantag...

  11. ASSESSING A COMPUTER MODEL FOR PREDICTING HUMAN EXPOSURE TO PM2.5

    EPA Science Inventory

    This paper compares outputs of a model for predicting PM2.5 exposure with experimental data obtained from exposure studies of selected subpopulations. The exposure model is built on a WWW platform called pCNEM, "A PC Version of pNEM." Exposure models created by pCNEM are sim...

  12. Agent-Based Computing in Distributed Adversarial Planning

    DTIC Science & Technology

    2010-08-09

    plans. An agent is expected to agree to deviate from its optimal uncoordinated plan only if it improves its position. Process models for opponent modeling: we have analyzed the suitability of business process models for creating…

  13. User's guide to the Reliability Estimation System Testbed (REST)

    NASA Technical Reports Server (NTRS)

    Nicol, David M.; Palumbo, Daniel L.; Rifkin, Adam

    1992-01-01

    The Reliability Estimation System Testbed is an X-window based reliability modeling tool that was created to explore the use of the Reliability Modeling Language (RML). RML was defined to support several reliability analysis techniques including modularization, graphical representation, Failure Mode Effects Simulation (FMES), and parallel processing. These techniques are most useful in modeling large systems. Using modularization, an analyst can create reliability models for individual system components. The modules can be tested separately and then combined to compute the total system reliability. Because a one-to-one relationship can be established between system components and the reliability modules, a graphical user interface may be used to describe the system model. RML was designed to permit message passing between modules. This feature enables reliability modeling based on a run time simulation of the system wide effects of a component's failure modes. The use of failure modes effects simulation enhances the analyst's ability to correctly express system behavior when using the modularization approach to reliability modeling. To alleviate the computation bottleneck often found in large reliability models, REST was designed to take advantage of parallel processing on hypercube processors.
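
    The modularization idea, combining separately tested component reliabilities into a system estimate, can be illustrated with elementary series/parallel combinators. This is a sketch of the principle only; RML modules are far richer and include failure-mode effects simulation.

      # Elementary series/parallel reliability combinators (illustrative of
      # the modular approach, not of RML itself).
      def series(*r):
          """System works only if every module works."""
          out = 1.0
          for x in r:
              out *= x
          return out

      def parallel(*r):
          """System works if at least one redundant module works."""
          out = 1.0
          for x in r:
              out *= (1.0 - x)
          return 1.0 - out

      # Triplex redundant computer feeding a single actuator
      # (per-mission reliabilities are hypothetical).
      r_computer = 0.999
      r_actuator = 0.9995
      print(series(parallel(r_computer, r_computer, r_computer), r_actuator))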

  14. Three-Dimensional Modeling May Improve Surgical Education and Clinical Practice.

    PubMed

    Jones, Daniel B; Sung, Robert; Weinberg, Crispin; Korelitz, Theodore; Andrews, Robert

    2016-04-01

    Three-dimensional (3D) printing has been used in the manufacturing industry for rapid prototyping and product testing. The aim of our study was to assess the feasibility of creating anatomical 3D models from a digital image using 3D printers. Furthermore, we sought face validity of models and explored potential opportunities for using 3D printing to enhance surgical education and clinical practice. Computed tomography and magnetic resonance images were reviewed, converted to computer models, and printed by stereolithography to create near exact replicas of human organs. Medical students and surgeons provided feedback via survey at the 2014 Surgical Education Week conference. There were 51 respondents, and 95.8% wanted these models for their patients. Cost was a concern, but 82.6% found value in these models at a price less than $500. All respondents thought the models would be useful for integration into the medical school curriculum. Three-dimensional printing is a potentially disruptive technology to improve both surgical education and clinical practice. As the technology matures and cost decreases, we envision 3D models being increasingly used in surgery. © The Author(s) 2015.

  15. Software for Building Models of 3D Objects via the Internet

    NASA Technical Reports Server (NTRS)

    Schramer, Tim; Jensen, Jeff

    2003-01-01

    The Virtual EDF Builder (where EDF signifies Electronic Development Fixture) is a computer program that facilitates the use of the Internet for building and displaying digital models of three-dimensional (3D) objects that ordinarily comprise assemblies of solid models created previously by use of computer-aided-design (CAD) programs. The Virtual EDF Builder resides on a Unix-based server computer. It is used in conjunction with a commercially available Web-based plug-in viewer program that runs on a client computer. The Virtual EDF Builder acts as a translator between the viewer program and a database stored on the server. The translation function includes the provision of uniform resource locator (URL) links to other Web-based computer systems and databases. The Virtual EDF builder can be used in two ways: (1) If the client computer is Unix-based, then it can assemble a model locally; the computational load is transferred from the server to the client computer. (2) Alternatively, the server can be made to build the model, in which case the server bears the computational load and the results are downloaded to the client computer or workstation upon completion.

  16. Stereoscopic vascular models of the head and neck: A computed tomography angiography visualization.

    PubMed

    Cui, Dongmei; Lynch, James C; Smith, Andrew D; Wilson, Timothy D; Lehman, Michael N

    2016-01-01

    Computer-assisted 3D models are used in some medical and allied health science schools; however, they are often limited to online use and 2D flat screen-based imaging. Few schools take advantage of 3D stereoscopic learning tools in anatomy education and clinically relevant anatomical variations when teaching anatomy. A new approach to teaching anatomy includes use of computed tomography angiography (CTA) images of the head and neck to create clinically relevant 3D stereoscopic virtual models. These high resolution images of the arteries can be used in unique and innovative ways to create 3D virtual models of the vasculature as a tool for teaching anatomy. Blood vessel 3D models are presented stereoscopically in a virtual reality environment, can be rotated 360° in all axes, and magnified according to need. In addition, flexible views of internal structures are possible. Images are displayed in a stereoscopic mode, and students view images in a small theater-like classroom while wearing polarized 3D glasses. Reconstructed 3D models enable students to visualize vascular structures with clinically relevant anatomical variations in the head and neck and appreciate spatial relationships among the blood vessels, the skull and the skin. © 2015 American Association of Anatomists.

  17. Initial draft of CSE-UCLA evaluation model based on weighted product in order to optimize digital library services in computer college in Bali

    NASA Astrophysics Data System (ADS)

    Divayana, D. G. H.; Adiarta, A.; Abadi, I. B. G. S.

    2018-01-01

    The aim of this research was to create an initial design of the CSE-UCLA evaluation model, modified with the Weighted Product method, for evaluating digital library services at Computer Colleges in Bali. The research used the developmental research method, following the Borg and Gall design. The result obtained from the research conducted earlier this month was a rough sketch of the Weighted Product based CSE-UCLA evaluation model; this design provides a general overview of the stages of the model as used to optimize digital library services at the Computer Colleges in Bali.
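
    The Weighted Product step can be illustrated directly: each alternative's preference score is the product of its criterion values raised to the criterion weights. The ratings and weights below are hypothetical.

      # Weighted Product scoring of alternatives (hypothetical ratings).
      import numpy as np

      scores = np.array([[4.0, 3.0, 5.0],    # digital library A: three criteria
                         [5.0, 4.0, 2.0],    # digital library B
                         [3.0, 5.0, 4.0]])   # digital library C
      weights = np.array([0.5, 0.3, 0.2])    # criterion weights, summing to 1

      wp = np.prod(scores ** weights, axis=1)
      ranking = np.argsort(wp)[::-1]
      print("WP scores:", wp, "best alternative index:", ranking[0])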

  18. Creating Effective Educational Computer Games for Undergraduate Classroom Learning: A Conceptual Model

    ERIC Educational Resources Information Center

    Rapeepisarn, Kowit; Wong, Kok Wai; Fung, Chun Che; Khine, Myint Swe

    2008-01-01

    When designing Educational Computer Games, designers usually consider target age, interactivity, interface and other related issues. They rarely explore the genres which should employ into one type of educational game. Recently, some digital game-based researchers made attempt to combine game genre with learning theory. Different researchers use…

  19. Information Commons to Go

    ERIC Educational Resources Information Center

    Bayer, Marc Dewey

    2008-01-01

    Since 2004, Buffalo State College's E. H. Butler Library has used the Information Commons (IC) model to assist its 8,500 students with library research and computer applications. Campus Technology Services (CTS) plays a very active role in its IC, with a centrally located Computer Help Desk and a newly created Application Support Desk right in the…

  20. Smaller Satellite Operations Near Geostationary Orbit

    DTIC Science & Technology

    2007-09-01

    At the time, this was considered a very difficult task, due to the complexity involved with creating computer code to autonomously perform… computer systems and even permanently damage equipment. Depending on the solar cycle, solar weather will be properly characterized and modeled to…

  1. Computed tomography-based tissue-engineered scaffolds in craniomaxillofacial surgery.

    PubMed

    Smith, M H; Flanagan, C L; Kemppainen, J M; Sack, J A; Chung, H; Das, S; Hollister, S J; Feinberg, S E

    2007-09-01

    Tissue engineering provides an alternative modality allowing for decreased morbidity of donor site grafting and decreased rejection of less compatible alloplastic tissues. Using image-based design and computer software, a precisely sized and shaped scaffold for osseous tissue regeneration can be created via selective laser sintering. Polycaprolactone has been used to create a condylar ramus unit (CRU) scaffold for application in temporomandibular joint reconstruction in a Yucatan minipig animal model. Following sacrifice, micro-computed tomography and histology were used to demonstrate the efficacy of this particular scaffold design. A proof-of-concept surgery has demonstrated cartilaginous tissue regeneration along the articulating surface with exuberant osseous tissue formation. Bone volumes and tissue mineral density at both the 1 and 3 month time points demonstrated significant new bone growth interior and exterior to the scaffold. Computationally designed scaffolds can support masticatory function in a large animal model as well as both osseous and cartilage regeneration. Our group is continuing to evaluate multiple implant designs in both young and mature Yucatan minipig animals. 2007 John Wiley & Sons, Ltd.

  2. Computational strategies for the automated design of RNA nanoscale structures from building blocks using NanoTiler.

    PubMed

    Bindewald, Eckart; Grunewald, Calvin; Boyle, Brett; O'Connor, Mary; Shapiro, Bruce A

    2008-10-01

    One approach to designing RNA nanoscale structures is to use known RNA structural motifs such as junctions, kissing loops or bulges and to construct a molecular model by connecting these building blocks with helical struts. We previously developed an algorithm for detecting internal loops, junctions and kissing loops in RNA structures. Here we present algorithms for automating or assisting many of the steps that are involved in creating RNA structures from building blocks: (1) assembling building blocks into nanostructures using either a combinatorial search or constraint satisfaction; (2) optimizing RNA 3D ring structures to improve ring closure; (3) sequence optimisation; (4) creating a unique non-degenerate RNA topology descriptor. This effectively creates a computational pipeline for generating molecular models of RNA nanostructures and more specifically RNA ring structures with optimized sequences from RNA building blocks. We show several examples of how the algorithms can be utilized to generate RNA tecto-shapes.

  3. Computational strategies for the automated design of RNA nanoscale structures from building blocks using NanoTiler

    PubMed Central

    Bindewald, Eckart; Grunewald, Calvin; Boyle, Brett; O’Connor, Mary; Shapiro, Bruce A.

    2013-01-01

    One approach to designing RNA nanoscale structures is to use known RNA structural motifs such as junctions, kissing loops or bulges and to construct a molecular model by connecting these building blocks with helical struts. We previously developed an algorithm for detecting internal loops, junctions and kissing loops in RNA structures. Here we present algorithms for automating or assisting many of the steps that are involved in creating RNA structures from building blocks: (1) assembling building blocks into nanostructures using either a combinatorial search or constraint satisfaction; (2) optimizing RNA 3D ring structures to improve ring closure; (3) sequence optimisation; (4) creating a unique non-degenerate RNA topology descriptor. This effectively creates a computational pipeline for generating molecular models of RNA nanostructures and more specifically RNA ring structures with optimized sequences from RNA building blocks. We show several examples of how the algorithms can be utilized to generate RNA tecto-shapes. PMID:18838281

  4. [The role of external letter positions in visual word recognition].

    PubMed

    Perea, Manuel; Lupker, Stephen J

    2007-11-01

    A key issue for any computational model of visual word recognition is the choice of an input coding schema, which is responsible for assigning letter positions. Such a schema must reflect the fact that, according to recent research, nonwords created by transposing letters (e.g., caniso for CASINO) typically appear more similar to the base word than nonwords created by replacing letters (e.g., caviro). In the present research, we initially carried out a computational analysis examining the degree to which the position of the transposition influences transposed-letter similarity effects. We next conducted a masked priming experiment with the lexical decision task to determine whether a transposed-letter priming advantage occurs when the first letter position is involved. Primes were created by either transposing the first and third letters (démula-MEDULA) or replacing the first and third letters (bérula-MEDULA). Results showed that there was no transposed-letter priming advantage in this situation. We discuss the implications of these results for models of visual word recognition.
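
    One family of coding schemes that captures this transposed-letter similarity is open-bigram coding; the sketch below scores prime-target similarity by shared ordered letter pairs and reproduces the pattern for the abstract's own examples. It is an illustration of the phenomenon, not the authors' model.

      # Open-bigram similarity sketch: transposed-letter primes share more
      # ordered letter pairs with the target than replacement primes do.
      from itertools import combinations

      def open_bigrams(word):
          """All ordered letter pairs, e.g. 'abc' -> {'ab', 'ac', 'bc'}."""
          return {a + b for a, b in combinations(word, 2)}

      def similarity(prime, target):
          shared = open_bigrams(prime) & open_bigrams(target)
          return len(shared) / len(open_bigrams(target))

      print(similarity("caniso", "casino"))   # transposed letters: 0.8
      print(similarity("caviro", "casino"))   # replaced letters: 0.4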

  5. Economic Modeling as a Component of Academic Strategic Planning.

    ERIC Educational Resources Information Center

    MacKinnon, Joyce; Sothmann, Mark; Johnson, James

    2001-01-01

    Computer-based economic modeling was used to enable a school of allied health to define outcomes, identify associated costs, develop cost and revenue models, and create a financial planning system. As a strategic planning tool, it assisted realistic budgeting and improved efficiency and effectiveness. (Contains 18 references.) (SK)

  6. COMPUTER RECONSTRUCTION OF A HUMAN LUNG MORPHOLOGY MODEL FROM MAGNETIC RESONANCE (MR) IMAGES

    EPA Science Inventory


    A mathematical description of the morphological structure of the lung is necessary for modeling and analysis of the deposition of inhaled aerosols. A morphological model of the lung boundary was generated from magnetic resonance (MR) images, with the goal of creating a frame...

  7. High resolution, MRI-based, segmented, computerized head phantom

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zubal, I.G.; Harrell, C.R.; Smith, E.O.

    1999-01-01

    The authors have created a high-resolution software phantom of the human brain which is applicable to voxel-based radiation transport calculations yielding nuclear medicine simulated images and/or internal dose estimates. A software head phantom was created from 124 transverse MRI images of a healthy normal individual. The transverse T2 slices, recorded in a 256x256 matrix from a GE Signa 2 scanner, have isotropic voxel dimensions of 1.5 mm and were manually segmented by the clinical staff. Each voxel of the phantom contains one of 62 index numbers designating anatomical, neurological, and taxonomical structures. The result is stored as a 256x256x128 byte array. Internal volumes compare favorably to those described in the ICRP Reference Man. The computerized array represents a high resolution model of a typical human brain and serves as a voxel-based anthropomorphic head phantom suitable for computer-based modeling and simulation calculations. It offers an improved realism over previous mathematically described software brain phantoms, and creates a reference standard for comparing results of newly emerging voxel-based computations. Such voxel-based computations lead the way to developing diagnostic and dosimetry calculations which can utilize patient-specific diagnostic images. However, such individualized approaches lack fast, automatic segmentation schemes for routine use; therefore, the high resolution, typical head geometry gives the most realistic patient model currently available.
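
    Working with such a phantom is straightforward array manipulation; the sketch below computes the volume of one segmented structure from a byte array of tissue indices with 1.5 mm isotropic voxels. The array here is filled with random labels as a stand-in for the actual phantom data.

      # Volume of a segmented structure in a voxel phantom (synthetic labels
      # stand in for the real 256x256x128 segmented array).
      import numpy as np

      rng = np.random.default_rng(0)
      phantom = rng.integers(0, 62, size=(256, 256, 128), dtype=np.uint8)

      VOXEL_MM3 = 1.5 ** 3          # isotropic 1.5 mm voxels

      def structure_volume_ml(labels, index):
          """Volume of one segmented structure in millilitres."""
          return (labels == index).sum() * VOXEL_MM3 / 1000.0

      print(structure_volume_ml(phantom, 7))   # e.g., structure index 7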

  8. Status of Computational Aerodynamic Modeling Tools for Aircraft Loss-of-Control

    NASA Technical Reports Server (NTRS)

    Frink, Neal T.; Murphy, Patrick C.; Atkins, Harold L.; Viken, Sally A.; Petrilli, Justin L.; Gopalarathnam, Ashok; Paul, Ryan C.

    2016-01-01

    A concerted effort has been underway over the past several years to evolve computational capabilities for modeling aircraft loss-of-control under the NASA Aviation Safety Program. A principal goal has been to develop reliable computational tools for predicting and analyzing the non-linear stability & control characteristics of aircraft near stall boundaries affecting safe flight, and for utilizing those predictions for creating augmented flight simulation models that improve pilot training. Pursuing such an ambitious task with limited resources required the forging of close collaborative relationships with a diverse body of computational aerodynamicists and flight simulation experts to leverage their respective research efforts into the creation of NASA tools to meet this goal. Considerable progress has been made and work remains to be done. This paper summarizes the status of the NASA effort to establish computational capabilities for modeling aircraft loss-of-control and offers recommendations for future work.

  9. Creating a Simple Single Computational Approach to Modeling Rarefied and Continuum Flow About Aerospace Vehicles

    NASA Technical Reports Server (NTRS)

    Goldstein, David B.; Varghese, Philip L.

    1997-01-01

    We proposed to create a single computational code incorporating methods that can model both rarefied and continuum flow to enable the efficient simulation of flow about spacecraft and high-altitude hypersonic aerospace vehicles. The code was to use a single grid structure that permits a smooth transition between the continuum and rarefied portions of the flow. Developing an appropriate computational boundary between the two regions represented a major challenge. The primary approach chosen involves coupling a four-speed Lattice Boltzmann model for the continuum flow with the DSMC method in the rarefied regime. We also explored the possibility of using a standard finite difference Navier-Stokes solver for the continuum flow. With the resulting code we will ultimately investigate three-dimensional plume impingement effects, a subject of critical importance to NASA and related to the work of Drs. Forrest Lumpkin, Steve Fitzgerald and Jay Le Beau at Johnson Space Center. Below is a brief background on the project and a summary of the results as of the end of the grant.

  10. The rise of machine consciousness: studying consciousness with computational models.

    PubMed

    Reggia, James A

    2013-08-01

    Efforts to create computational models of consciousness have accelerated over the last two decades, creating a field that has become known as artificial consciousness. There have been two main motivations for this controversial work: to develop a better scientific understanding of the nature of human/animal consciousness and to produce machines that genuinely exhibit conscious awareness. This review begins by briefly explaining some of the concepts and terminology used by investigators working on machine consciousness, and summarizes key neurobiological correlates of human consciousness that are particularly relevant to past computational studies. Models of consciousness developed over the last twenty years are then surveyed. These models are largely found to fall into five categories based on the fundamental issue that their developers have selected as being most central to consciousness: a global workspace, information integration, an internal self-model, higher-level representations, or attention mechanisms. For each of these five categories, an overview of past work is given, a representative example is presented in some detail to illustrate the approach, and comments are provided on the contributions and limitations of the methodology. Three conclusions are offered about the state of the field based on this review: (1) computational modeling has become an effective and accepted methodology for the scientific study of consciousness, (2) existing computational models have successfully captured a number of neurobiological, cognitive, and behavioral correlates of conscious information processing as machine simulations, and (3) no existing approach to artificial consciousness has presented a compelling demonstration of phenomenal machine consciousness, or even clear evidence that artificial phenomenal consciousness will eventually be possible. The paper concludes by discussing the importance of continuing work in this area, considering the ethical issues it raises, and making predictions concerning future developments. Copyright © 2013 Elsevier Ltd. All rights reserved.

  11. Increasing productivity of the McAuto CAD/CAE system by user-specific applications programming

    NASA Technical Reports Server (NTRS)

    Plotrowski, S. M.; Vu, T. H.

    1985-01-01

    Significant improvements in the productivity of the McAuto Computer-Aided Design/Computer-Aided Engineering (CAD/CAE) system were achieved by applications programming using the system's own Graphics Interactive Programming language (GRIP) and the interface capabilities with the main computer on which the system resides. The GRIP programs for creating springs, bar charts, finite element model representations and aiding management planning are presented as examples.

  12. GRAPHICS MANAGER (GFXMGR): An interactive graphics software program for the Advanced Electronics Design (AED) graphics controller, Model 767

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Faculjak, D.A.

    1988-03-01

    Graphics Manager (GFXMGR) is menu-driven, user-friendly software designed to interactively create, edit, and delete graphics displays on the Advanced Electronics Design (AED) graphics controller, Model 767. The software runs on the VAX family of computers and has been used successfully in security applications to create and change site layouts (maps) of specific facilities. GFXMGR greatly benefits graphics development by minimizing display-development time, reducing tedium on the part of the user, and improving system performance. It is anticipated that GFXMGR can be used to create graphics displays for many types of applications. 8 figs., 2 tabs.

  13. Restoring Fort Frontenac in 3D: Effective Usage of 3D Technology for Heritage Visualization

    NASA Astrophysics Data System (ADS)

    Yabe, M.; Goins, E.; Jackson, C.; Halbstein, D.; Foster, S.; Bazely, S.

    2015-02-01

    This paper is composed of three elements: 3D modeling, web design, and heritage visualization. The aim is to use computer graphics design to inform and create an interest in historical visualization by rebuilding Fort Frontenac using 3D modeling and interactive design. The final model will be integrated into an interactive website to learn more about the fort's historic importance. It is apparent that using computer graphics can save time and money when it comes to historical visualization. Visitors do not have to travel to the actual archaeological buildings. They can simply use the Web in their own home to learn about this information virtually. Meticulously following historical records to create a sophisticated restoration of archaeological buildings will draw viewers into visualizations, such as the historical world of Fort Frontenac. As a result, it allows the viewers to effectively understand the fort's social system, habits, and historical events.

  14. Environmental Models as a Service: Enabling Interoperability through RESTful Endpoints and API Documentation (presentation)

    EPA Science Inventory

    Achieving interoperability in environmental modeling has evolved as software technology has progressed. The recent rise of cloud computing and proliferation of web services initiated a new stage for creating interoperable systems. Scientific programmers increasingly take advantag...

  15. Using Modeling and Simulation to Complement Testing for Increased Understanding of Weapon Subassembly Response.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wong, Michael K.; Davidson, Megan

    As part of Sandia’s nuclear deterrence mission, the B61-12 Life Extension Program (LEP) aims to modernize the aging weapon system. Modernization requires requalification, and Sandia is using high performance computing to perform advanced computational simulations to better understand, evaluate, and verify weapon system performance in conjunction with limited physical testing. The Nose Bomb Subassembly (NBSA) of the B61-12 is responsible for producing a fuzing signal upon ground impact. The fuzing signal is dependent upon electromechanical impact sensors producing valid electrical fuzing signals at impact. Computer generated models were used to assess the timing between the impact sensor’s response to the deceleration of impact and damage to major components and system subassemblies. The modeling and simulation team worked alongside the physical test team to design a large-scale reverse ballistic test to not only assess system performance, but to also validate their computational models. The reverse ballistic test conducted at Sandia’s sled test facility sent a rocket sled with a representative target into a stationary B61-12 (NBSA) to characterize the nose crush and functional response of NBSA components. Data obtained from data recorders and high-speed photometrics were integrated with previously generated computer models in order to refine and validate the model’s ability to reliably simulate real-world effects. Large-scale tests are impractical to conduct for every single impact scenario. By creating reliable computer models, we can perform simulations that identify trends and produce estimates of outcomes over the entire range of required impact conditions. Sandia’s HPCs enable geometric resolution that was unachievable before, allowing for more fidelity and detail, and creating simulations that can provide insight to support evaluation of requirements and performance margins. As computing resources continue to improve, researchers at Sandia are hoping to improve these simulations so they provide increasingly credible analysis of the system response and performance over the full range of conditions.

  16. Updated Panel-Method Computer Program

    NASA Technical Reports Server (NTRS)

    Ashby, Dale L.

    1995-01-01

    Panel code PMARC_12 (Panel Method Ames Research Center, version 12) computes potential-flow fields around complex three-dimensional bodies such as complete aircraft models. It contains several advanced features, including internal mathematical modeling of flow, a time-stepping wake model for simulating either steady or unsteady motions, capability for Trefftz-plane computation of induced drag, capability for computation of off-body and on-body streamlines, and capability for computation of boundary-layer parameters by use of a two-dimensional integral boundary-layer method along surface streamlines. Investigators interested in visual representations of phenomena may want to consider obtaining program GVS (ARC-13361), General Visualization System. GVS is a Silicon Graphics IRIS program created to support the scientific-visualization needs of PMARC_12. GVS is available separately from COSMIC. PMARC_12 is written in standard FORTRAN 77, with the exception of the NAMELIST extension used for input.

  17. An evaluation method of computer usability based on human-to-computer information transmission model.

    PubMed

    Ogawa, K

    1992-01-01

    This paper proposes a new evaluation and prediction method for computer usability. This method is based on our two previously proposed information transmission measures created from a human-to-computer information transmission model. The model has three information transmission levels: the device, software, and task content levels. Two measures, called the device independent information measure (DI) and the computer independent information measure (CI), defined on the software and task content levels respectively, are given as the amount of information transmitted. Two information transmission rates are defined as DI/T and CI/T, where T is the task completion time: the device independent information transmission rate (RDI), and the computer independent information transmission rate (RCI). The method utilizes the RDI and RCI rates to evaluate relatively the usability of software and device operations on different computer systems. Experiments using three different systems, in this case a graphical information input task, confirm that the method offers an efficient way of determining computer usability.

  18. Use of Parallel Micro-Platform for the Simulation the Space Exploration

    NASA Astrophysics Data System (ADS)

    Velasco Herrera, Victor Manuel; Velasco Herrera, Graciela; Rosano, Felipe Lara; Rodriguez Lozano, Salvador; Lucero Roldan Serrato, Karen

    The purpose of this work is to create a parallel micro-platform that simulates the virtual movements of space exploration in 3D. One of the innovations presented in this design is the application of a lever mechanism for the transmission of movement. The development of such a robot is a challenging task, very different from industrial manipulators due to a totally different target system of requirements. This work presents the computer-aided study and simulation of the movement of this parallel manipulator. The model has been developed using the computer-aided design platform Unigraphics, in which the geometric modeling of each of the components and the final assembly (CAD) was carried out, along with the generation of files for the computer-aided manufacture (CAM) of each of the pieces and the kinematic simulation of the system evaluating different driving schemes. We used the MATLAB aerospace toolbox and created an adaptive control module to simulate the system.

  19. Two-dimensional heat flow apparatus

    NASA Astrophysics Data System (ADS)

    McDougall, Patrick; Ayars, Eric

    2014-06-01

    We have created an apparatus to quantitatively measure two-dimensional heat flow in a metal plate using a grid of temperature sensors read by a microcontroller. Real-time temperature data are collected from the microcontroller by a computer for comparison with a computational model of the heat equation. The microcontroller-based sensor array allows previously unavailable levels of precision at very low cost, and the combination of measurement and modeling makes for an excellent apparatus for the advanced undergraduate laboratory course.
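
    A minimal version of the computational model the sensor data are compared against is an explicit finite-difference solution of the 2D heat equation; the material constants, grid, and boundary condition below are illustrative, not the authors' parameters.

      # Explicit finite-difference model of 2D heat flow in a plate
      # (illustrative values; one edge is held hot as a boundary condition).
      import numpy as np

      alpha = 9.7e-5                       # thermal diffusivity of aluminium [m^2/s]
      dx = 0.01                            # grid spacing [m]
      dt = 0.2 * dx**2 / (4.0 * alpha)     # well under the stability limit dx^2/(4*alpha)

      T = np.full((50, 50), 20.0)          # plate initially at room temperature [C]
      T[0, :] = 100.0                      # heated edge

      for _ in range(2000):
          lap = (T[:-2, 1:-1] + T[2:, 1:-1] + T[1:-1, :-2] + T[1:-1, 2:]
                 - 4.0 * T[1:-1, 1:-1]) / dx**2
          T[1:-1, 1:-1] += alpha * dt * lap    # forward-Euler update
          T[0, :] = 100.0                      # hold the heated edge fixed

      print(T[::10, ::10].round(1))   # coarse snapshot to compare with a sensor grid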

  20. Lowering the barriers to computational modeling of Earth's surface: coupling Jupyter Notebooks with Landlab, HydroShare, and CyberGIS for research and education.

    NASA Astrophysics Data System (ADS)

    Bandaragoda, C.; Castronova, A. M.; Phuong, J.; Istanbulluoglu, E.; Strauch, R. L.; Nudurupati, S. S.; Tarboton, D. G.; Wang, S. W.; Yin, D.; Barnhart, K. R.; Tucker, G. E.; Hutton, E.; Hobley, D. E. J.; Gasparini, N. M.; Adams, J. M.

    2017-12-01

    The ability to test hypotheses about hydrology, geomorphology and atmospheric processes is invaluable to research in the era of big data. Although community resources are available, there remain significant educational, logistical and time investment barriers to their use. Knowledge infrastructure is an emerging intellectual framework to understand how people are creating, sharing and distributing knowledge - which has been dramatically transformed by Internet technologies. In addition to the technical and social components in a cyberinfrastructure system, knowledge infrastructure considers educational, institutional, and open source governance components required to advance knowledge. We are designing an infrastructure environment that lowers common barriers to reproducing modeling experiments for earth surface investigation. Landlab is an open-source modeling toolkit for building, coupling, and exploring two-dimensional numerical models. HydroShare is an online collaborative environment for sharing hydrologic data and models. CyberGIS-Jupyter is an innovative cyberGIS framework for achieving data-intensive, reproducible, and scalable geospatial analytics using the Jupyter Notebook, based on ROGER, the first cyberGIS supercomputer, so that models can be elastically reproduced through cloud computing approaches. Our team of geomorphologists, hydrologists, and computer geoscientists has created a new infrastructure environment that combines these three pieces of software to enable knowledge discovery. Through this novel integration, any user can interactively execute and explore their shared data and model resources. Landlab on HydroShare with CyberGIS-Jupyter supports the modeling continuum from fully developed modelling applications, through prototyping new science tools, to hands-on research demonstrations for training workshops and classroom applications. Computational geospatial models based on big data and high performance computing can now be more efficiently developed, improved, scaled, and seamlessly reproduced among multidisciplinary users, thereby expanding the active learning curriculum and research opportunities for students in earth surface modeling and informatics.
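
    Landlab's build-couple-explore pattern is compact enough to show in a notebook cell; the sketch below follows the toolkit's documented component idiom, with grid size, diffusivity, and time step chosen purely for illustration.

      # Minimal Landlab example of the build-couple-explore pattern
      # (follows Landlab's documented API; values are illustrative).
      from landlab import RasterModelGrid
      from landlab.components import LinearDiffuser

      grid = RasterModelGrid((25, 25), xy_spacing=10.0)        # 25x25 nodes, 10 m apart
      z = grid.add_zeros("topographic__elevation", at="node")  # elevation field
      z[grid.core_nodes] += 1.0                                # raised interior to relax

      diffuser = LinearDiffuser(grid, linear_diffusivity=0.01) # hillslope diffusion

      for _ in range(100):
          diffuser.run_one_step(1000.0)    # advance the landscape per step

      print(z.reshape(grid.shape)[12, :5]) # sample a row of the evolved surface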

  1. Development of a computer model to predict platform station keeping requirements in the Gulf of Mexico using remote sensing data

    NASA Technical Reports Server (NTRS)

    Barber, Bryan; Kahn, Laura; Wong, David

    1990-01-01

    Offshore operations such as oil drilling and radar monitoring require semisubmersible platforms to remain stationary at specific locations in the Gulf of Mexico. Ocean currents, wind, and waves in the Gulf of Mexico tend to move platforms away from their desired locations. A computer model was created to predict the station keeping requirements of a platform. The computer simulation uses remote sensing data from satellites and buoys as input. A background of the project, alternate approaches to the project, and the details of the simulation are presented.

  2. Three-dimensional computer simulation of radiostereometric analysis (RSA) in distal radius fractures.

    PubMed

    Madanat, Rami; Moritz, Niko; Aro, Hannu T

    2007-01-01

    Physical phantom models have conventionally been used to determine the accuracy and precision of radiostereometric analysis (RSA) in various orthopaedic applications. Using a phantom model of a fracture of the distal radius it has previously been shown that RSA is a highly accurate and precise method for measuring both translation and rotation in three dimensions (3-D). The main shortcoming of a physical phantom model is its inability to mimic complex 3-D motion. The goal of this study was to create a realistic computer model for preoperative planning of RSA studies and to test the accuracy of RSA in measuring complex movements in fractures of the distal radius using this new model. The 3-D computer model was created from a set of tomographic scans. The simulation of the radiographic imaging was performed using ray-tracing software (POV-Ray). RSA measurements were performed according to standard protocol. Using a two-part fracture model (AO/ASIF type A2), it was found that for simple movements in one axis, translations in the range of 25 µm to 2 mm could be measured with an accuracy of ±2 µm. Rotations ranging from 16° to 2° could be measured with an accuracy of ±0.015°. Using a three-part fracture model, the corresponding accuracies were found to be ±4 µm and ±0.031° for translation and rotation, respectively. For complex 3-D motion in a three-part fracture model (AO/ASIF type C1) the accuracy was ±6 µm for translation and ±0.120° for rotation. The use of 3-D computer modelling can provide a method for preoperative planning of RSA studies in complex fractures of the distal radius and in other clinical situations in which the RSA method is applicable.

  3. OASIS: A GEOGRAPHICAL DECISION SUPPORT SYSTEM FOR GROUND-WATER CONTAMINANT MODELING

    EPA Science Inventory

    Three new software technologies were applied to develop an efficient and easy to use decision support system for ground-water contaminant modeling. Graphical interfaces create a more intuitive and effective form of communication with the computer compared to text-based interfaces...

  4. Rapid Prototyping of Physiologically-Based Toxicokinetic (PBTK) Models (SOT annual meeting)

    EPA Science Inventory

    Determining the tissue concentrations resulting from chemical exposure (i.e., toxicokinetics (TK)) is essential in emergency or other situations where time and data are lacking. Generic TK models can be created rapidly using in vitro assays and computational approaches to generat...

  5. Finite difference time domain modeling of spiral antennas

    NASA Technical Reports Server (NTRS)

    Penney, Christopher W.; Beggs, John H.; Luebbers, Raymond J.

    1992-01-01

    The objectives outlined in the original proposal for this project were to create a well-documented computer analysis model based on the finite-difference, time-domain (FDTD) method that would be capable of computing antenna impedance, far-zone radiation patterns, and radar cross-section (RCS). The ability to model a variety of penetrable materials in addition to conductors is also desired. The spiral antennas under study by this project meet these requirements since they are constructed of slots cut into conducting surfaces which are backed by dielectric materials.

  6. The visual attention saliency map for movie retrospection

    NASA Astrophysics Data System (ADS)

    Rogalska, Anna; Napieralski, Piotr

    2018-04-01

    The visual saliency map is becoming an important and challenging topic for many scientific disciplines (robotic systems, psychophysics, cognitive neuroscience and computer science). The map created by the model indicates possible salient regions by taking into consideration face presence and motion, which are essential in motion pictures. By combining these cues we can obtain a credible saliency map at a low computational cost.

  7. Bayes' theorem application in the measure information diagnostic value assessment

    NASA Astrophysics Data System (ADS)

    Orzechowski, Piotr D.; Makal, Jaroslaw; Nazarkiewicz, Andrzej

    2006-03-01

    The paper presents an application of Bayes' theorem to assessing the diagnostic value of measurement information in a computer-aided diagnosis system. The computer system described here is built on a Bayesian network and is used in Benign Prostatic Hyperplasia (BPH) diagnosis. The graphical diagnostic model makes it possible to juxtapose experts' knowledge with data.
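
    For readers unfamiliar with the underlying computation, the following sketch applies Bayes' theorem to a single diagnostic measurement; the prior, sensitivity, and specificity are hypothetical numbers, not values from the BPH system described above.

        # Hypothetical numbers purely for illustrating Bayes' theorem;
        # they are not taken from the BPH system described above.
        prior = 0.20         # P(disease) before the measurement
        sensitivity = 0.85   # P(positive | disease)
        specificity = 0.90   # P(negative | no disease)

        # P(positive) by the law of total probability
        p_pos = sensitivity * prior + (1.0 - specificity) * (1.0 - prior)

        # Posterior P(disease | positive) by Bayes' theorem
        posterior = sensitivity * prior / p_pos
        print(f"P(disease | positive test) = {posterior:.3f}")   # ~0.680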

  8. Evaluation of Physiologically-Based Artificial Neural Network Models to Detect Operator Workload in Remotely Piloted Aircraft Operations

    DTIC Science & Technology

    2016-07-13

    to a computer via Bluetooth. Respiration is captured as a breathing waveform signal using a capacitive pressure sensor, sampled at 18 Hz. The...dropouts in the Bluetooth signal and artifacts caused by body movement. Workload models. Four artificial neural network models were created using

  9. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Kandler A; Usseglio Viretta, Francois L; Graf, Peter A

    This presentation describes research work led by NREL with team members from Argonne National Laboratory and Texas A&M University in microstructure analysis, modeling and validation under DOE's Computer-Aided Engineering of Batteries (CAEBAT) program. The goal of the project is to close the gaps between CAEBAT models and materials research by creating predictive models that can be used for electrode design.

  10. Hysteresis in the trade cycle

    NASA Astrophysics Data System (ADS)

    Mc Namara, Hugh A.; Pokrovskii, Alexei V.

    2006-02-01

    The Kaldor model, one of the first nonlinear models of macroeconomics, is modified to incorporate a Preisach nonlinearity. The new dynamical system thus created shows highly complicated behaviour. This paper presents a rigorous (computer-aided) proof of chaos in this new model, and of the existence of unstable periodic orbits of all minimal periods p > 57.

  11. Optimization in modeling the ribs-bounded contour from computer tomography scan

    NASA Astrophysics Data System (ADS)

    Bilinskas, M. J.; Dzemyda, G.

    2016-10-01

    In this paper a method for analyzing transversal-plane images from computer tomography scans is presented. A mathematical model describing the ribs-bounded contour was created, and the approximation problem is solved by finding the optimal parameters of the model in the least-squares sense. Such a model is useful for registering images independently of the patient's position on the bed and of the radio-contrast agent injection. We consider slices where the ribs are visible, because many important internal organs are located there: the liver, heart, stomach, pancreas, lungs, etc.
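
    A least-squares fit of a contour model can be sketched as follows with SciPy; the edge points and the stand-in elliptical model are hypothetical, since the paper's actual ribs-bounded contour model is more elaborate.

        import numpy as np
        from scipy.optimize import least_squares

        # Hypothetical edge points extracted from one CT slice (x, y in pixels)
        pts = np.array([[120, 40], [200, 52], [250, 110], [240, 190],
                        [160, 230], [80, 200], [50, 130], [70, 60]], float)

        def residuals(p, xy):
            # A simple stand-in model: the contour as an ellipse with centre
            # (cx, cy) and semi-axes (a, b); the paper's model is more elaborate.
            cx, cy, a, b = p
            return ((xy[:, 0] - cx) / a) ** 2 + ((xy[:, 1] - cy) / b) ** 2 - 1.0

        fit = least_squares(residuals, x0=[150.0, 130.0, 100.0, 90.0], args=(pts,))
        print("optimal parameters:", fit.x)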

  12. Pressure Analysis

    NASA Technical Reports Server (NTRS)

    1990-01-01

    FluiDyne Engineering Corporation, Minneapolis, MN is one of the world's leading companies in design and construction of wind tunnels. In its designing work, FluiDyne uses a computer program called GTRAN. With GTRAN, engineers create a design and test its performance on the computer before actually building a model; should the design fail to meet criteria, the system or any component part can be redesigned and retested on the computer, saving a great deal of time and money.

  13. Aeroelastic Deflection of NURBS Geometry

    NASA Technical Reports Server (NTRS)

    Samareh, Jamshid A.

    1998-01-01

    The purpose of this paper is to present an algorithm for using NonUniform Rational B-Spline (NURBS) representation in an aeroelastic loop. The algorithm is based on creating a least-squares NURBS surface representing the aeroelastic deflection. The resulting NURBS surfaces are used to update either the original Computer-Aided Design (CAD) model, the Computational Structural Mechanics (CSM) grid, or the Computational Fluid Dynamics (CFD) grid. Results are presented for a generic High-Speed Civil Transport (HSCT).

  14. A distributed computing model for telemetry data processing

    NASA Astrophysics Data System (ADS)

    Barry, Matthew R.; Scott, Kevin L.; Weismuller, Steven P.

    1994-05-01

    We present a new approach to distributing processed telemetry data among spacecraft flight controllers within the control centers at NASA's Johnson Space Center. This approach facilitates the development of application programs which integrate spacecraft-telemetered data and ground-based synthesized data, then distributes this information to flight controllers for analysis and decision-making. The new approach combines various distributed computing models into one hybrid distributed computing model. The model employs both client-server and peer-to-peer distributed computing models cooperating to provide users with information throughout a diverse operations environment. Specifically, it provides an attractive foundation upon which we are building critical real-time monitoring and control applications, while simultaneously lending itself to peripheral applications in playback operations, mission preparations, flight controller training, and program development and verification. We have realized the hybrid distributed computing model through an information sharing protocol. We shall describe the motivations that inspired us to create this protocol, along with a brief conceptual description of the distributed computing models it employs. We describe the protocol design in more detail, discussing many of the program design considerations and techniques we have adopted. Finally, we describe how this model is especially suitable for supporting the implementation of distributed expert system applications.

  15. A distributed computing model for telemetry data processing

    NASA Technical Reports Server (NTRS)

    Barry, Matthew R.; Scott, Kevin L.; Weismuller, Steven P.

    1994-01-01

    We present a new approach to distributing processed telemetry data among spacecraft flight controllers within the control centers at NASA's Johnson Space Center. This approach facilitates the development of application programs which integrate spacecraft-telemetered data and ground-based synthesized data, then distributes this information to flight controllers for analysis and decision-making. The new approach combines various distributed computing models into one hybrid distributed computing model. The model employs both client-server and peer-to-peer distributed computing models cooperating to provide users with information throughout a diverse operations environment. Specifically, it provides an attractive foundation upon which we are building critical real-time monitoring and control applications, while simultaneously lending itself to peripheral applications in playback operations, mission preparations, flight controller training, and program development and verification. We have realized the hybrid distributed computing model through an information sharing protocol. We shall describe the motivations that inspired us to create this protocol, along with a brief conceptual description of the distributed computing models it employs. We describe the protocol design in more detail, discussing many of the program design considerations and techniques we have adopted. Finally, we describe how this model is especially suitable for supporting the implementation of distributed expert system applications.
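
    The information-sharing protocol itself is not specified in the abstract, but its fan-out character can be suggested with a toy in-process publish/subscribe hub; the class and parameter names below are hypothetical and are a conceptual stand-in, not the NASA design.

        from collections import defaultdict

        class TelemetryHub:
            """Toy publish/subscribe hub; a conceptual stand-in for the
            information-sharing protocol, not the actual NASA design."""
            def __init__(self):
                self.subscribers = defaultdict(list)   # parameter name -> callbacks

            def subscribe(self, parameter, callback):
                self.subscribers[parameter].append(callback)

            def publish(self, parameter, value):
                # Server role: fan processed telemetry out to every subscriber
                for callback in self.subscribers[parameter]:
                    callback(parameter, value)

        hub = TelemetryHub()
        hub.subscribe("cabin_pressure", lambda p, v: print(f"{p} = {v} kPa"))
        hub.publish("cabin_pressure", 101.3)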

  16. Use of declarative statements in creating and maintaining computer-interpretable knowledge bases for guideline-based care.

    PubMed

    Tu, Samson W; Hrabak, Karen M; Campbell, James R; Glasgow, Julie; Nyman, Mark A; McClure, Robert; McClay, James; Abarbanel, Robert; Mansfield, James G; Martins, Susana M; Goldstein, Mary K; Musen, Mark A

    2006-01-01

    Developing computer-interpretable clinical practice guidelines (CPGs) to provide decision support for guideline-based care is an extremely labor-intensive task. In the EON/ATHENA and SAGE projects, we formulated substantial portions of CPGs as computable statements that express declarative relationships between patient conditions and possible interventions. We developed query and expression languages that allow a decision-support system (DSS) to evaluate these statements in specific patient situations. A DSS can use these guideline statements in multiple ways, including: (1) as inputs for determining preferred alternatives in decision-making, and (2) as a way to provide targeted commentaries in the clinical information system. The use of these declarative statements significantly reduces the modeling expertise and effort required to create and maintain computer-interpretable knowledge bases for decision-support purposes. We discuss possible implications for sharing of such knowledge bases.
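
    A toy rendering of the declarative idea: each rule relates a patient condition to a candidate intervention, and a tiny evaluator checks the rules against patient data. The rules, thresholds, and field names below are hypothetical, not the EON/ATHENA or SAGE languages.

        # Hypothetical declarative condition-intervention statements
        rules = [
            {"if": lambda p: p["systolic_bp"] > 160, "then": "consider antihypertensive"},
            {"if": lambda p: p["ldl"] > 130 and not p["on_statin"], "then": "consider statin"},
        ]

        def evaluate(patient):
            """Return the interventions whose declarative conditions hold."""
            return [r["then"] for r in rules if r["if"](patient)]

        patient = {"systolic_bp": 172, "ldl": 120, "on_statin": False}
        print(evaluate(patient))   # ['consider antihypertensive']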

  17. The BioIntelligence Framework: a new computational platform for biomedical knowledge computing.

    PubMed

    Farley, Toni; Kiefer, Jeff; Lee, Preston; Von Hoff, Daniel; Trent, Jeffrey M; Colbourn, Charles; Mousses, Spyro

    2013-01-01

    Breakthroughs in molecular profiling technologies are enabling a new data-intensive approach to biomedical research, with the potential to revolutionize how we study, manage, and treat complex diseases. The next great challenge for clinical applications of these innovations will be to create scalable computational solutions for intelligently linking complex biomedical patient data to clinically actionable knowledge. Traditional database management systems (DBMS) are not well suited to representing complex syntactic and semantic relationships in unstructured biomedical information, introducing barriers to realizing such solutions. We propose a scalable computational framework for addressing this need, which leverages a hypergraph-based data model and query language that may be better suited for representing complex multi-lateral, multi-scalar, and multi-dimensional relationships. We also discuss how this framework can be used to create rapid learning knowledge base systems to intelligently capture and relate complex patient data to biomedical knowledge in order to automate the recovery of clinically actionable information.

  18. Applying mathematical modeling to create job rotation schedules for minimizing occupational noise exposure.

    PubMed

    Tharmmaphornphilas, Wipawee; Green, Benjamin; Carnahan, Brian J; Norman, Bryan A

    2003-01-01

    This research developed worker schedules by using administrative controls and a computer programming model to reduce the likelihood of worker hearing loss. By rotating the workers through different jobs during the day it was possible to reduce their exposure to hazardous noise levels. Computer simulations were made based on data collected in a real setting. Worker schedules currently used at the site are compared with proposed worker schedules from the computer simulations. For the worker assignment plans found by the computer model, the authors calculate a significant decrease in time-weighted average (TWA) sound level exposure. The maximum daily dose that any worker is exposed to is reduced by 58.8%, and the maximum TWA value for the workers is reduced by 3.8 dB from the current schedule.
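
    The TWA arithmetic behind such schedules follows the standard OSHA noise-dose formulas, sketched below; the example exposure profile is hypothetical.

        import math

        def twa_from_exposures(exposures):
            """Compute OSHA noise dose and 8-hour TWA from (hours, dBA) pairs."""
            dose = 0.0
            for hours, level in exposures:
                # Permissible exposure time halves for every 5 dBA above 90 dBA
                allowed = 8.0 / (2.0 ** ((level - 90.0) / 5.0))
                dose += hours / allowed
            dose *= 100.0                                  # percent dose
            twa = 16.61 * math.log10(dose / 100.0) + 90.0  # equivalent 8-h level
            return dose, twa

        # Hypothetical rotation: 4 h at 92 dBA, then 4 h at 85 dBA
        dose, twa = twa_from_exposures([(4, 92.0), (4, 85.0)])
        print(f"dose = {dose:.1f}%, TWA = {twa:.1f} dBA")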

  19. Informatics in radiology: an information model of the DICOM standard.

    PubMed

    Kahn, Charles E; Langlotz, Curtis P; Channin, David S; Rubin, Daniel L

    2011-01-01

    The Digital Imaging and Communications in Medicine (DICOM) Standard is a key foundational technology for radiology. However, its complexity creates challenges for information system developers because the current DICOM specification requires human interpretation and is subject to nonstandard implementation. To address this problem, a formally sound and computationally accessible information model of the DICOM Standard was created. The DICOM Standard was modeled as an ontology, a machine-accessible and human-interpretable representation that may be viewed and manipulated by information-modeling tools. The DICOM Ontology includes a real-world model and a DICOM entity model. The real-world model describes patients, studies, images, and other features of medical imaging. The DICOM entity model describes connections between real-world entities and the classes that model the corresponding DICOM information entities. The DICOM Ontology was created to support the Cancer Biomedical Informatics Grid (caBIG) initiative, and it may be extended to encompass the entire DICOM Standard and serve as a foundation of medical imaging systems for research and patient care. RSNA, 2010

  20. Algodoo: A Tool for Encouraging Creativity in Physics Teaching and Learning

    NASA Astrophysics Data System (ADS)

    Gregorcic, Bor; Bodin, Madelen

    2017-01-01

    Algodoo (http://www.algodoo.com) is a digital sandbox for physics 2D simulations. It allows students and teachers to easily create simulated "scenes" and explore physics through a user-friendly and visually attractive interface. In this paper, we present different ways in which students and teachers can use Algodoo to visualize and solve physics problems, investigate phenomena and processes, and engage in out-of-school activities and projects. Algodoo, with its approachable interface, inhabits a middle ground between computer games and "serious" computer modeling. It is suitable as an entry-level modeling tool for students of all ages and can facilitate discussions about the role of computer modeling in physics.

  1. Studying an Eulerian Computer Model on Different High-performance Computer Platforms and Some Applications

    NASA Astrophysics Data System (ADS)

    Georgiev, K.; Zlatev, Z.

    2010-11-01

    The Danish Eulerian Model (DEM) is an Eulerian model for studying the transport of air pollutants on a large scale. Originally, the model was developed at the National Environmental Research Institute of Denmark. The model's computational domain covers Europe and neighbouring parts of the Atlantic Ocean, Asia and Africa. If the DEM is applied on fine grids, its discretization leads to a huge computational problem, which implies that a model such as DEM must be run on high-performance computer architectures. The implementation and tuning of such a complex large-scale model on each different computer is a non-trivial task. Here, comparative results of running this model on different kinds of vector computers (CRAY C92A, Fujitsu, etc.), parallel computers with distributed memory (IBM SP, CRAY T3E, Beowulf clusters, Macintosh G4 clusters, etc.), parallel computers with shared memory (SGI Origin, SUN, etc.) and parallel computers with two levels of parallelism (IBM SMP, IBM BlueGene/P, clusters of multiprocessor nodes, etc.) are presented. The main idea in the parallel version of DEM is a domain-partitioning approach. The effective use of the caches and hierarchical memories of modern computers, as well as the performance, speed-ups and efficiency achieved, are discussed. The parallel code of DEM, created using the MPI standard library, appears to be highly portable and shows good efficiency and scalability on different kinds of vector and parallel computers. Some important applications of the computer model output are briefly presented.
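
    The domain-partitioning idea can be suggested with a short mpi4py sketch in which each process owns a strip of the grid and exchanges halo rows with its neighbours; the grid sizes and field are illustrative, and this is not the DEM code itself.

        from mpi4py import MPI
        import numpy as np

        comm = MPI.COMM_WORLD
        rank, size = comm.Get_rank(), comm.Get_size()

        # Split the global grid into horizontal strips, one strip per process
        nx_global, ny = 1200, 800
        nx_local = nx_global // size
        c = np.zeros((nx_local + 2, ny))   # +2 ghost rows for halo cells

        up = rank - 1 if rank > 0 else MPI.PROC_NULL
        down = rank + 1 if rank < size - 1 else MPI.PROC_NULL

        # Exchange halo rows with neighbouring subdomains each time step
        comm.Sendrecv(sendbuf=c[1, :], dest=up, recvbuf=c[-1, :], source=down)
        comm.Sendrecv(sendbuf=c[-2, :], dest=down, recvbuf=c[0, :], source=up)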

  2. Perceptual category learning and visual processing: An exercise in computational cognitive neuroscience.

    PubMed

    Cantwell, George; Riesenhuber, Maximilian; Roeder, Jessica L; Ashby, F Gregory

    2017-05-01

    The field of computational cognitive neuroscience (CCN) builds and tests neurobiologically detailed computational models that account for both behavioral and neuroscience data. This article leverages a key advantage of CCN, namely that it should be possible to interface different CCN models in a plug-and-play fashion, to produce a new and biologically detailed model of perceptual category learning. The new model was created from two existing CCN models: the HMAX model of visual object processing and the COVIS model of category learning. Using bitmap images as inputs and by adjusting only a couple of learning-rate parameters, the new HMAX/COVIS model provides impressively good fits to human category-learning data from two qualitatively different experiments that used different types of category structures and different types of visual stimuli. Overall, the model provides a comprehensive neural and behavioral account of basal ganglia-mediated learning. Copyright © 2017 Elsevier Ltd. All rights reserved.

  3. A Three-Dimensional Mediastinal Model Created with Rapid Prototyping in a Patient with Ectopic Thymoma

    PubMed Central

    Nakada, Takeo; Inagaki, Takuya

    2014-01-01

    Preoperative three-dimensional (3D) imaging of a mediastinal tumor using two-dimensional (2D) axial computed tomography is sometimes difficult, and an unexpected appearance of the tumor may be encountered during surgery. To evaluate the preoperative feasibility of a 3D mediastinal model produced with the rapid prototyping technique, we created such a model and report the results. The 2D image showed some of the relationship between the tumor and the pericardium, but the 3D mediastinal model created using the rapid prototyping technique showed the 3D lesion on the outer side of the extrapericardium. The patient underwent a thoracoscopic resection of the tumor, and the pathological examination showed a rare middle mediastinal ectopic thymoma. We believe that the construction of mediastinal models is useful for thoracoscopic surgery and other complicated surgeries for chest diseases. PMID:24633133

  4. A three-dimensional mediastinal model created with rapid prototyping in a patient with ectopic thymoma.

    PubMed

    Akiba, Tadashi; Nakada, Takeo; Inagaki, Takuya

    2015-01-01

    Preoperative three-dimensional (3D) imaging of a mediastinal tumor using two-dimensional (2D) axial computed tomography is sometimes difficult, and an unexpected appearance of the tumor may be encountered during surgery. To evaluate the preoperative feasibility of a 3D mediastinal model produced with the rapid prototyping technique, we created such a model and report the results. The 2D image showed some of the relationship between the tumor and the pericardium, but the 3D mediastinal model created using the rapid prototyping technique showed the 3D lesion on the outer side of the extrapericardium. The patient underwent a thoracoscopic resection of the tumor, and the pathological examination showed a rare middle mediastinal ectopic thymoma. We believe that the construction of mediastinal models is useful for thoracoscopic surgery and other complicated surgeries for chest diseases.

  5. A computational model of selection by consequences.

    PubMed

    McDowell, J J

    2004-05-01

    Darwinian selection by consequences was instantiated in a computational model that consisted of a repertoire of behaviors undergoing selection, reproduction, and mutation over many generations. The model in effect created a digital organism that emitted behavior continuously. The behavior of this digital organism was studied in three series of computational experiments that arranged reinforcement according to random-interval (RI) schedules. The quantitative features of the model were varied over wide ranges in these experiments, and many of the qualitative features of the model also were varied. The digital organism consistently showed a hyperbolic relation between response and reinforcement rates, and this hyperbolic description of the data was consistently better than the description provided by other, similar, function forms. In addition, the parameters of the hyperbola varied systematically with the quantitative, and some of the qualitative, properties of the model in ways that were consistent with findings from biological organisms. These results suggest that the material events responsible for an organism's responding on RI schedules are computationally equivalent to Darwinian selection by consequences. They also suggest that the computational model developed here is worth pursuing further as a possible dynamic account of behavior.
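
    The hyperbolic relation referred to is Herrnstein's hyperbola, B = k r / (r + r_e); the sketch below fits it to hypothetical response-rate data with SciPy, the values being made up for illustration only.

        import numpy as np
        from scipy.optimize import curve_fit

        def hyperbola(r, k, re):
            # Response rate as a hyperbolic function of reinforcement rate r
            return k * r / (r + re)

        # Hypothetical (reinforcement rate, response rate) data for illustration
        r = np.array([8.0, 20.0, 50.0, 120.0, 300.0])
        B = np.array([22.0, 41.0, 62.0, 78.0, 88.0])

        (k, re), _ = curve_fit(hyperbola, r, B, p0=[100.0, 50.0])
        print(f"k = {k:.1f} responses/min, re = {re:.1f} reinforcers/hr")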

  6. Tumbling: From Rally Cars to Toast

    NASA Astrophysics Data System (ADS)

    Belloni, Mario; Christian, Wolfgang

    2012-10-01

    The article by Rod Cross describing the translational and rotational motion of the "Launch of a Vehicle from a Ramp" motivated us to create two computer models showing this type of dynamical behavior.

  7. Tumbling: From Rally Cars to Toast

    ERIC Educational Resources Information Center

    Belloni, Mario; Christian, Wolfgang

    2012-01-01

    The article by Rod Cross describing the translational and rotational motion of the "Launch of a Vehicle from a Ramp" motivated us to create two computer models showing this type of dynamical behavior.

  8. Learned Vector-Space Models for Document Retrieval.

    ERIC Educational Resources Information Center

    Caid, William R.; And Others

    1995-01-01

    The Latent Semantic Indexing and MatchPlus systems examine similar contexts in which words appear and create representational models that capture the similarity of meaning of terms and then use the representation for retrieval. Text Retrieval Conference experiments using these systems demonstrate the computational feasibility of using…
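
    The representational idea behind Latent Semantic Indexing can be sketched with a truncated SVD of a tiny term-document matrix; the counts and the query below are made up for illustration.

        import numpy as np

        # Tiny term-document matrix (rows = terms, columns = documents);
        # the counts are illustrative only.
        A = np.array([[2, 0, 1, 0],
                      [1, 1, 0, 0],
                      [0, 2, 0, 1],
                      [0, 0, 1, 2]], dtype=float)

        # Truncated SVD: keep k latent dimensions
        U, s, Vt = np.linalg.svd(A, full_matrices=False)
        k = 2
        docs = (np.diag(s[:k]) @ Vt[:k]).T       # documents in latent space

        # Rank documents against a query by cosine similarity in latent space
        query = U[:, :k].T @ np.array([1.0, 1.0, 0.0, 0.0])   # fold in query terms
        sims = docs @ query / (np.linalg.norm(docs, axis=1) * np.linalg.norm(query))
        print("document similarities:", np.round(sims, 3))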

  9. Some Automated Cartography Developments at the Defense Mapping Agency.

    DTIC Science & Technology

    1981-01-01

    on a pantographic router creating a laminate step model which was moulded in plaster for carving into a terrain model. This section will trace DMA's...offering economical automation. Precision flatbed Concord plotters were brought into DMA with sufficiently programmable control computers to perform these

  10. A Novel Approach for Creating Activity-Aware Applications in a Hospital Environment

    NASA Astrophysics Data System (ADS)

    Bardram, Jakob E.

    Context-aware and activity-aware computing has been proposed as a way to adapt the computer to the user’s ongoing activity. However, deductively moving from physical context - like location - to establishing human activity has proved difficult. This paper proposes a novel approach to activity-aware computing. Instead of inferring activities, this approach enables the user to explicitly model their activity, and then use sensor-based events to create, manage, and use these computational activities adjusted to a specific context. This approach was crafted through a user-centered design process in collaboration with a hospital department. We propose three strategies for activity-awareness: context-based activity matching, context-based activity creation, and context-based activity adaptation. We present the implementation of these strategies and present an experimental evaluation of them. The experiments demonstrate that rather than considering context as information, context can be a relational property that links ’real-world activities’ with their ’computational activities’.

  11. Web-based segmentation and display of three-dimensional radiologic image data.

    PubMed

    Silverstein, J; Rubenstein, J; Millman, A; Panko, W

    1998-01-01

    In many clinical circumstances, viewing sequential radiological image data as three-dimensional models is proving beneficial. However, designing customized computer-generated radiological models is beyond the scope of most physicians, due to specialized hardware and software requirements. We have created a simple method for Internet users to remotely construct and locally display three-dimensional radiological models using only a standard web browser. Rapid model construction is achieved by distributing the hardware intensive steps to a remote server. Once created, the model is automatically displayed on the requesting browser and is accessible to multiple geographically distributed users. Implementation of our server software on large scale systems could be of great service to the worldwide medical community.

  12. Reduced complexity structural modeling for automated airframe synthesis

    NASA Technical Reports Server (NTRS)

    Hajela, Prabhat

    1987-01-01

    A procedure is developed for the optimum sizing of wing structures based on representing the built-up finite element assembly of the structure by equivalent beam models. The reduced-order beam models are computationally less demanding in an optimum design environment which dictates repetitive analysis of several trial designs. The design procedure is implemented in a computer program requiring geometry and loading information to create the wing finite element model and its equivalent beam model, and providing a rapid estimate of the optimum weight obtained from a fully stressed design approach applied to the beam. The synthesis procedure is demonstrated for representative conventional-cantilever and joined wing configurations.
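
    The fully stressed design approach mentioned above amounts to a simple resizing iteration, sketched below with illustrative numbers; in the paper the member forces would come from the equivalent beam analysis rather than being held fixed.

        # Fully stressed design: resize each beam segment so its working stress
        # approaches the allowable stress. All numbers are illustrative only.
        allow = 250.0e6                      # allowable stress, Pa
        areas = [4.0e-4, 4.0e-4, 4.0e-4]     # initial cross-section areas, m^2
        forces = [9.0e4, 6.0e4, 2.5e4]       # axial force per segment, N (held fixed here)

        for _ in range(10):                  # iterate until stresses converge
            stresses = [f / a for f, a in zip(forces, areas)]
            # Classic resizing rule: A_new = A_old * (sigma / sigma_allow)
            areas = [a * s / allow for a, s in zip(areas, stresses)]

        print(["%.2e" % a for a in areas])   # each segment ends up fully stressed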

  13. Source Term Model for Steady Micro Jets in a Navier-Stokes Computer Code

    NASA Technical Reports Server (NTRS)

    Waithe, Kenrick A.

    2005-01-01

    A source term model for steady micro jets was implemented into a non-proprietary Navier-Stokes computer code, OVERFLOW. The source term models the mass flow and momentum created by a steady blowing micro jet. The model is obtained by adding the momentum and mass flow created by the jet to the Navier-Stokes equations. The model was tested by comparing with data from numerical simulations of a single, steady micro jet on a flat plate in two and three dimensions. The source term model predicted the velocity distribution well compared to the two-dimensional plate using a steady mass flow boundary condition, which was used to simulate a steady micro jet. The model was also compared to two three-dimensional flat plate cases using a steady mass flow boundary condition to simulate a steady micro jet. The three-dimensional comparison included a case with a grid generated to capture the circular shape of the jet and a case without a grid generated for the micro jet. The case without the jet grid mimics the application of the source term. The source term model compared well with both of the three-dimensional cases. Comparisons of velocity distribution were made before and after the jet and Mach and vorticity contours were examined. The source term model allows a researcher to quickly investigate different locations of individual or several steady micro jets. The researcher is able to conduct a preliminary investigation with minimal grid generation and computational time.
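
    Conceptually, the jet enters the discretized conservation equations as mass and momentum sources in the cell containing it, as in the sketch below; all values and names are illustrative assumptions, not the OVERFLOW implementation.

        # Conceptual sketch: a steady micro jet contributes mass and momentum
        # sources to the right-hand sides of its host cell's equations.
        mdot = 1.0e-4             # jet mass flow, kg/s (illustrative)
        u_jet, v_jet = 0.0, 50.0  # jet velocity components, m/s (illustrative)
        cell_volume = 1.0e-6      # volume of the host cell, m^3 (illustrative)

        def add_jet_sources(rhs_mass, rhs_mom_x, rhs_mom_y):
            """Augment the right-hand sides of the cell's conservation equations."""
            rhs_mass += mdot / cell_volume            # mass source per unit volume
            rhs_mom_x += mdot * u_jet / cell_volume   # momentum sources
            rhs_mom_y += mdot * v_jet / cell_volume
            return rhs_mass, rhs_mom_x, rhs_mom_y

        print(add_jet_sources(0.0, 0.0, 0.0))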

  14. Evolving Non-Dominated Parameter Sets for Computational Models from Multiple Experiments

    NASA Astrophysics Data System (ADS)

    Lane, Peter C. R.; Gobet, Fernand

    2013-03-01

    Creating robust, reproducible and optimal computational models is a key challenge for theorists in many sciences. Psychology and cognitive science face particular challenges as large amounts of data are collected and many models are not amenable to analytical techniques for calculating parameter sets. Particular problems are to locate the full range of acceptable model parameters for a given dataset, and to confirm the consistency of model parameters across different datasets. Resolving these problems will provide a better understanding of the behaviour of computational models, and so support the development of general and robust models. In this article, we address these problems using evolutionary algorithms to develop parameters for computational models against multiple sets of experimental data; in particular, we propose the 'speciated non-dominated sorting genetic algorithm' for evolving models in several theories. We discuss the problem of developing a model of categorisation using twenty-nine sets of data and models drawn from four different theories. We find that the evolutionary algorithms generate high quality models, adapted to provide a good fit to all available data.
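
    The core of any non-dominated sorting algorithm is the extraction of successive Pareto fronts, sketched below for parameter sets scored against two datasets; the fitness values are hypothetical, and the speciation mechanism of the proposed algorithm is not shown.

        def dominates(a, b):
            """True if fitness vector a dominates b (minimisation on all objectives)."""
            return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

        def non_dominated_sort(population):
            """Split fitness vectors into successive Pareto fronts."""
            fronts, remaining = [], list(population)
            while remaining:
                front = [p for p in remaining
                         if not any(dominates(q, p) for q in remaining if q is not p)]
                fronts.append(front)
                remaining = [p for p in remaining if p not in front]
            return fronts

        # Each tuple is (error on dataset 1, error on dataset 2) for one parameter set
        pop = [(0.10, 0.40), (0.20, 0.20), (0.40, 0.10), (0.30, 0.30), (0.50, 0.50)]
        print(non_dominated_sort(pop))   # first front holds the three trade-off solutions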

  15. Computer modelling of technogenic thermal pollution zones in large water bodies

    NASA Astrophysics Data System (ADS)

    Parshakova, Ya N.; Lyubimova, T. P.

    2018-01-01

    In the present work, the thermal pollution zones created by the discharge of heated water from thermal power plants are investigated using the example of the Permskaya Thermal Power Plant (Permskaya TPP or Permskaya GRES), one of the largest thermal power plants in Europe. The study is performed for different technological and hydrometeorological conditions. Since the vertical temperature distribution in such wastewater reservoirs is highly inhomogeneous, the computations are performed within the framework of a 3D model.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dress, W.B.

    Rosen's modeling relation is embedded in Popper's three worlds to provide a heuristic tool for model building and a guide for thinking about complex systems. The utility of this construct is demonstrated by suggesting a solution to the problem of pseudoscience and a resolution of the famous Bohr-Einstein debates. A theory of bizarre systems is presented by an analogy with entangled particles of quantum mechanics. This theory underscores the poverty of present-day computational systems (e.g., computers) for creating complex and bizarre entities by distinguishing between mechanism and organism.

  17. Rapid prototyping in aortic surgery.

    PubMed

    Bangeas, Petros; Voulalas, Grigorios; Ktenidis, Kiriakos

    2016-04-01

    3D printing provides the sequential addition of material layers and, thus, the opportunity to print parts and components made of different materials with variable mechanical and physical properties. It helps us create 3D anatomical models for better planning of surgical procedures when needed, since it can reveal any complex anatomical feature. Images of abdominal aortic aneurysms obtained by computed tomographic angiography were converted into 3D images using the free Google SketchUp software and saved in stereolithography format. Using a 3D printer (MakerBot), a model made of polylactic acid material (thermoplastic filament) was printed. A 3D model of an abdominal aortic aneurysm was created in 138 min, and the model was a precise copy of the aorta visualized in the computed tomographic images. The total cost (including the initial cost of the printer) reached 1303.00 euros. 3D imaging and modelling using different materials can be very useful in cases where anatomical difficulties are recognized in the computed tomographic images and a tactile approach is demanded preoperatively. In this way, major complications during abdominal aortic aneurysm management can be predicted and prevented. Furthermore, the model can be used as a mould; developing new, more biocompatible, less antigenic and individualized models can become a challenge in the future. © The Author 2016. Published by Oxford University Press on behalf of the European Association for Cardio-Thoracic Surgery. All rights reserved.

  18. Using Hybrid Techniques for Generating Watershed-scale Flood Models in an Integrated Modeling Framework

    NASA Astrophysics Data System (ADS)

    Saksena, S.; Merwade, V.; Singhofen, P.

    2017-12-01

    There is an increasing global trend towards developing large-scale flood models that account for spatial heterogeneity at watershed scales to drive future flood-risk planning. Integrated surface water-groundwater modeling procedures can elucidate all the hydrologic processes at play during a flood event and provide accurate flood outputs. Even though the advantages of integrated modeling are widely acknowledged, the complexity of integrated process representation, the computation time, and the number of input parameters required have deterred its application to flood inundation mapping, especially for large watersheds. This study presents a faster approach for creating watershed-scale flood models using a hybrid design that breaks the watershed down into multiple regions of variable spatial resolution by prioritizing higher-order streams. The methodology involves creating a hybrid model for the Upper Wabash River Basin in Indiana using Interconnected Channel and Pond Routing (ICPR) and comparing its performance with a fully integrated 2D hydrodynamic model. The hybrid approach involves simplification procedures such as 1D channel-2D floodplain coupling; hydrologic basin (HUC-12) integration with 2D groundwater for rainfall-runoff routing; and varying the spatial resolution of 2D overland flow based on stream order. The results for a 50-year return period storm event show that the hybrid model (NSE = 0.87) performs similarly to the 2D integrated model (NSE = 0.88) while the computational time is halved. The results suggest that significant computational efficiency can be obtained while maintaining model accuracy for large-scale flood models by using hybrid approaches for model creation.
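
    The NSE values quoted are Nash-Sutcliffe efficiencies, one minus the ratio of the squared model error to the variance of the observations; a small sketch with hypothetical hydrograph ordinates follows.

        import numpy as np

        def nse(observed, simulated):
            """Nash-Sutcliffe efficiency: 1 is a perfect fit, 0 matches the mean."""
            observed, simulated = np.asarray(observed), np.asarray(simulated)
            return 1.0 - (np.sum((observed - simulated) ** 2)
                          / np.sum((observed - observed.mean()) ** 2))

        # Hypothetical flood hydrograph ordinates (m^3/s), for illustration only
        obs = [120, 340, 560, 480, 300, 180]
        sim = [110, 360, 540, 500, 320, 170]
        print(f"NSE = {nse(obs, sim):.2f}")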

  19. Computational Approaches to Chemical Hazard Assessment

    PubMed Central

    Luechtefeld, Thomas; Hartung, Thomas

    2018-01-01

    Summary Computational prediction of toxicity has reached new heights as a result of decades of growth in the magnitude and diversity of biological data. Public packages for statistics and machine learning make model creation faster. New theory in machine learning and cheminformatics enables integration of chemical structure, toxicogenomics, simulated and physical data in the prediction of chemical health hazards, and other toxicological information. Our earlier publications have characterized a toxicological dataset of unprecedented scale resulting from the European REACH legislation (Registration Evaluation Authorisation and Restriction of Chemicals). These publications dove into potential use cases for regulatory data and some models for exploiting this data. This article analyzes the options for the identification and categorization of chemicals, moves on to the derivation of descriptive features for chemicals, discusses different kinds of targets modeled in computational toxicology, and ends with a high-level perspective of the algorithms used to create computational toxicology models. PMID:29101769

  20. A New Formulation for Hybrid LES-RANS Computations

    NASA Technical Reports Server (NTRS)

    Woodruff, Stephen L.

    2013-01-01

    Ideally, a hybrid LES-RANS computation would employ LES only where necessary to make up for the failure of the RANS model to provide sufficient accuracy or to provide time-dependent information. Current approaches are fairly restrictive in the placement of LES and RANS regions; an LES-RANS transition in a boundary layer, for example, yields an unphysical log-layer shift. A hybrid computation is formulated here to allow greater control over the placement of LES and RANS regions and the transitions between them. The concept of model invariance is introduced, which provides a basis for interpreting hybrid results within an LES-RANS transition zone. Consequences of imposing model invariance include the addition of terms to the governing equations that compensate for unphysical gradients created as the model changes between RANS and LES. Computational results illustrate the increased accuracy of the approach and its insensitivity to the location of the transition and to the blending function employed.
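
    A generic illustration of blending RANS and LES eddy viscosities across a transition zone is sketched below; the tanh blending function and all values are assumptions for illustration, not the paper's formulation, which adds further compensating terms.

        import numpy as np

        x = np.linspace(0.0, 1.0, 11)       # coordinate across the transition zone
        x0, width = 0.5, 0.1                # centre and width of the transition
        f = 0.5 * (1.0 + np.tanh((x - x0) / width))   # 0 = RANS region, 1 = LES region

        nu_rans = 1.0e-3                    # RANS eddy viscosity (illustrative)
        nu_les = 1.0e-4                     # LES subgrid viscosity (illustrative)
        nu_hybrid = (1.0 - f) * nu_rans + f * nu_les   # blended eddy viscosity
        print(np.round(nu_hybrid, 6))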

  1. The Virtual Watershed Observatory: Cyberinfrastructure for Model-Data Integration and Access

    NASA Astrophysics Data System (ADS)

    Duffy, C.; Leonard, L. N.; Giles, L.; Bhatt, G.; Yu, X.

    2011-12-01

    The Virtual Watershed Observatory (VWO) is a concept whereby scientists, water managers, educators and the general public can create a virtual observatory from integrated hydrologic model results, national databases and historical or real-time observations via web services. In this paper, we propose a prototype for automated and virtualized web services software using national data products for climate reanalysis, soils, geology, terrain and land cover. The VWO has the broad purpose of making accessible water resource simulations, real-time data assimilation, calibration and archival at the scale of HUC 12 watersheds (Hydrologic Unit Code) anywhere in the continental US. Our prototype for model-data integration focuses on creating tools for fast data storage from selected national databases, as well as the computational resources necessary for a dynamic, distributed watershed simulation. The paper describes cyberinfrastructure tools and a workflow that attempt to resolve the problem of model-data accessibility and scalability so that individuals, research teams, managers and educators can create a VWO in a desired context. Examples are given for the NSF-funded Shale Hills Critical Zone Observatory and the European Critical Zone Observatories within the SoilTrEC project. In the future, implementation of VWO services will benefit from the development of a cloud cyberinfrastructure as the prototype evolves toward data- and model-intensive computation for continental-scale water resource predictions.

  2. Enterprise application architecture development based on DoDAF and TOGAF

    NASA Astrophysics Data System (ADS)

    Tao, Zhi-Gang; Luo, Yun-Feng; Chen, Chang-Xin; Wang, Ming-Zhe; Ni, Feng

    2017-05-01

    For the purpose of supporting the design and analysis of enterprise application architecture, we report here a tailored enterprise application architecture description framework and its corresponding design method. The presented framework can effectively support service-oriented architecting and cloud computing by creating a metadata model based on the architecture content framework (ACF), the DoDAF metamodel (DM2) and the Cloud Computing Modelling Notation (CCMN). The framework also makes an effort to extend and improve the mapping between The Open Group Architecture Framework (TOGAF) application architectural inputs/outputs, deliverables and Department of Defense Architecture Framework (DoDAF)-described models. The roadmap of 52 DoDAF-described models is constructed by creating the metamodels of these described models and analysing the constraint relationships among the metamodels. By combining the tailored framework and the roadmap, this article proposes a service-oriented enterprise application architecture development process. Finally, a case study is presented to illustrate the results of implementing the tailored framework in the Southern Base Management Support and Information Platform construction project using the development process proposed by the paper.

  3. Computer graphics testbed to simulate and test vision systems for space applications

    NASA Technical Reports Server (NTRS)

    Cheatham, John B.; Wu, Chris K.; Lin, Y. H.

    1991-01-01

    A system was developed for displaying computer graphics images of space objects, and its use was demonstrated as a testbed for evaluating vision systems for space applications. In order to evaluate vision systems, it is desirable to be able to control all factors involved in creating the images used for processing by the vision system. Considerable time and expense are involved in building accurate physical models of space objects; precise location of the model relative to the viewer and accurate location of the light source require additional effort. As part of this project, graphics models of space objects such as the Solarmax satellite were created in which the user can control the light direction and the relative position of the object and the viewer. The work is also aimed at providing control of hue, shading, noise and shadows for use in demonstrating and testing image-processing techniques. The simulated camera data can provide XYZ coordinates, pitch, yaw, and roll for the models. A physical model is also being used to provide comparison of camera images with the graphics images.

  4. The Role of Networks in Cloud Computing

    NASA Astrophysics Data System (ADS)

    Lin, Geng; Devine, Mac

    The confluence of technology advancements and business developments in Broadband Internet, Web services, computing systems, and application software over the past decade has created a perfect storm for cloud computing. The "cloud model" of delivering and consuming IT functions as services is poised to fundamentally transform the IT industry and rebalance the inter-relationships among end users, enterprise IT, software companies, and the service providers in the IT ecosystem (Armbrust et al., 2009; Lin, Fu, Zhu, & Dasmalchi, 2009).

  5. Computer Science (CS) Education in Indian Schools: Situation Analysis Using Darmstadt Model

    ERIC Educational Resources Information Center

    Raman, Raghu; Venkatasubramanian, Smrithi; Achuthan, Krishnashree; Nedungadi, Prema

    2015-01-01

    Computer science (CS) and its enabling technologies are at the heart of this information age, yet its adoption as a core subject by senior secondary students in Indian schools is low and has not reached critical mass. Though there have been efforts to create core curriculum standards for subjects like Physics, Chemistry, Biology, and Math, CS…

  6. Efficient computation of the joint probability of multiple inherited risk alleles from pedigree data.

    PubMed

    Madsen, Thomas; Braun, Danielle; Peng, Gang; Parmigiani, Giovanni; Trippa, Lorenzo

    2018-06-25

    The Elston-Stewart peeling algorithm enables estimation of an individual's probability of harboring germline risk alleles based on pedigree data, and serves as the computational backbone of important genetic counseling tools. However, it remains limited to the analysis of risk alleles at a small number of genetic loci because its computing time grows exponentially with the number of loci considered. We propose a novel, approximate version of this algorithm, dubbed the peeling and paring algorithm, which scales polynomially in the number of loci. This allows extending peeling-based models to include many genetic loci. The algorithm creates a trade-off between accuracy and speed, and allows the user to control this trade-off. We provide exact bounds on the approximation error and evaluate it in realistic simulations. Results show that the loss of accuracy due to the approximation is negligible in important applications. This algorithm will improve genetic counseling tools by increasing the number of pathogenic risk alleles that can be addressed. To illustrate we create an extended five genes version of BRCAPRO, a widely used model for estimating the carrier probabilities of BRCA1 and BRCA2 risk alleles and assess its computational properties. © 2018 WILEY PERIODICALS, INC.

  7. Pre- and post-processing for Cosmic/NASTRAN on personal computers and mainframes

    NASA Technical Reports Server (NTRS)

    Kamel, H. A.; Mobley, A. V.; Nagaraj, B.; Watkins, K. W.

    1986-01-01

    An interface between Cosmic/NASTRAN and GIFTS has recently been released, combining the powerful pre- and post-processing capabilities of GIFTS with Cosmic/NASTRAN's analysis capabilities. The interface operates on a wide range of computers, even linking Cosmic/NASTRAN and GIFTS when the two are on different computers. GIFTS offers a wide range of elements for use in model construction, each translated by the interface into the nearest Cosmic/NASTRAN equivalent; and the options of automatic or interactive modelling and loading in GIFTS make pre-processing easy and effective. The interface itself includes three programs: GFTCOS, which creates the Cosmic/NASTRAN input deck (and, if desired, the control deck) from the GIFTS Unified Data Base; COSGFT, which translates the displacements from the Cosmic/NASTRAN analysis back into GIFTS; and HOSTR, which handles stress computations for a few higher-order elements available in the interface but not supported by the GIFTS processor STRESS. Finally, the versatile display options in GIFTS post-processing allow the user to examine the analysis results through an especially wide range of capabilities, including such possibilities as creating composite loading cases, plotting in color and animating the analysis.

  8. Specialized computer architectures for computational aerodynamics

    NASA Technical Reports Server (NTRS)

    Stevenson, D. K.

    1978-01-01

    In recent years, computational fluid dynamics has made significant progress in modelling aerodynamic phenomena. Currently, one of the major barriers to future development lies in the compute-intensive nature of the numerical formulations and the relative high cost of performing these computations on commercially available general purpose computers, a cost high with respect to dollar expenditure and/or elapsed time. Today's computing technology will support a program designed to create specialized computing facilities to be dedicated to the important problems of computational aerodynamics. One of the still unresolved questions is the organization of the computing components in such a facility. The characteristics of fluid dynamic problems which will have significant impact on the choice of computer architecture for a specialized facility are reviewed.

  9. Training a Joint and Expeditionary Mindset

    DTIC Science & Technology

    2006-12-01

    associated with the JEM constructs and for using them to create effective computer-mediated training scenarios. The pedagogic model enables development of...ensure the instructional rigor of scenarios and provide a sound basis for determining performance indicators. The pedagogical model enables development...

  10. The Role of Item Models in Automatic Item Generation

    ERIC Educational Resources Information Center

    Gierl, Mark J.; Lai, Hollis

    2012-01-01

    Automatic item generation represents a relatively new but rapidly evolving research area where cognitive and psychometric theories are used to produce tests that include items generated using computer technology. Automatic item generation requires two steps. First, test development specialists create item models, which are comparable to templates…

  11. Tree growth visualization

    Treesearch

    L. Linsen; B.J. Karis; E.G. McPherson; B. Hamann

    2005-01-01

    In computer graphics, models describing the fractal branching structure of trees typically exploit the modularity of tree structures. The models are based on local production rules, which are applied iteratively and simultaneously to create a complex branching system. The objective is to generate three-dimensional scenes of often many realistic-looking and non-...
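
    A minimal sketch of such local production rules is an L-system-style string rewriting, where every module is replaced simultaneously in each generation; the rules below are illustrative, not those of the cited models.

        # A minimal L-system sketch: local production rules applied iteratively
        # and simultaneously, as in the branching models described above.
        rules = {"A": "AB", "B": "A[B]"}   # illustrative rules; '[' and ']' mark branches

        def grow(axiom, generations):
            s = axiom
            for _ in range(generations):
                # Apply every rule to every module in parallel
                s = "".join(rules.get(ch, ch) for ch in s)
            return s

        print(grow("A", 4))   # the resulting string encodes a branching structure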

  12. Benchmarking and Modeling of a Conventional Mid-Size Car Using ALPHA (SAE Paper 2015-01-1140)

    EPA Science Inventory

    The Advanced Light-Duty Powertrain and Hybrid Analysis (ALPHA) modeling tool was created by EPA to estimate greenhouse gas (GHG) emissions of light-duty vehicles. ALPHA is a physics-based, forward-looking, full vehicle computer simulation capable of analyzing various vehicle type...

  13. A Head in Virtual Reality: Development of A Dynamic Head and Neck Model

    ERIC Educational Resources Information Center

    Nguyen, Ngan; Wilson, Timothy D.

    2009-01-01

    Advances in computer and interface technologies have made it possible to create three-dimensional (3D) computerized models of anatomical structures for visualization, manipulation, and interaction in a virtual 3D environment. In the past few decades, a multitude of digital models have been developed to facilitate complex spatial learning of the…

  14. A convergent model for distributed processing of Big Sensor Data in urban engineering networks

    NASA Astrophysics Data System (ADS)

    Parygin, D. S.; Finogeev, A. G.; Kamaev, V. A.; Finogeev, A. A.; Gnedkova, E. P.; Tyukov, A. P.

    2017-01-01

    The development and study of a convergent model of grid, cloud, fog and mobile computing for the analytical processing of Big Sensor Data are reviewed. The model is intended for creating systems that monitor spatially distributed objects and processes in urban engineering networks. The proposed approach converges these models for organizing distributed data processing. The fog computing model is used for processing and aggregating sensor data at network nodes and/or industrial controllers. Program agents are loaded to perform the computing tasks of primary processing and data aggregation. The grid and cloud computing models are used for mining and accumulating integral indicators. The computing cluster has a three-tier architecture: the main server at the first level, a cluster of SCADA-system servers at the second level, and a set of GPU cards supporting the Compute Unified Device Architecture at the third level. The mobile computing model is applied to visualize the results of the intelligent analysis with elements of augmented reality and geo-information technologies. The integral indicators are transferred to the data center and accumulated in a multidimensional storage for data mining and knowledge extraction.

  15. On the Modeling and Management of Cloud Data Analytics

    NASA Astrophysics Data System (ADS)

    Castillo, Claris; Tantawi, Asser; Steinder, Malgorzata; Pacifici, Giovanni

    A new era is dawning where vast amount of data is subjected to intensive analysis in a cloud computing environment. Over the years, data about a myriad of things, ranging from user clicks to galaxies, have been accumulated, and continue to be collected, on storage media. The increasing availability of such data, along with the abundant supply of compute power and the urge to create useful knowledge, gave rise to a new data analytics paradigm in which data is subjected to intensive analysis, and additional data is created in the process. Meanwhile, a new cloud computing environment has emerged where seemingly limitless compute and storage resources are being provided to host computation and data for multiple users through virtualization technologies. Such a cloud environment is becoming the home for data analytics. Consequently, providing good performance at run-time to data analytics workload is an important issue for cloud management. In this paper, we provide an overview of the data analytics and cloud environment landscapes, and investigate the performance management issues related to running data analytics in the cloud. In particular, we focus on topics such as workload characterization, profiling analytics applications and their pattern of data usage, cloud resource allocation, placement of computation and data and their dynamic migration in the cloud, and performance prediction. In solving such management problems one relies on various run-time analytic models. We discuss approaches for modeling and optimizing the dynamic data analytics workload in the cloud environment. All along, we use the Map-Reduce paradigm as an illustration of data analytics.
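
    As a concrete illustration of the Map-Reduce paradigm mentioned above, the toy word count below runs the map, shuffle, and reduce phases in-process; a real framework would distribute each phase across the cluster, and the documents are made up for illustration.

        from collections import defaultdict
        from itertools import groupby
        from operator import itemgetter

        # Toy MapReduce word count: map emits (key, 1) pairs, shuffle groups
        # by key, reduce sums each group.
        docs = ["clouds store data", "data analytics in clouds"]

        mapped = [(word, 1) for doc in docs for word in doc.split()]     # map
        mapped.sort(key=itemgetter(0))                                   # shuffle
        counts = {key: sum(v for _, v in group)                          # reduce
                  for key, group in groupby(mapped, key=itemgetter(0))}
        print(counts)   # e.g. {'analytics': 1, 'clouds': 2, 'data': 2, ...}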

  16. The application of cloud computing to scientific workflows: a study of cost and performance.

    PubMed

    Berriman, G Bruce; Deelman, Ewa; Juve, Gideon; Rynge, Mats; Vöckler, Jens-S

    2013-01-28

    The current model of transferring data from data centres to desktops for analysis will soon be rendered impractical by the accelerating growth in the volume of science datasets. Processing will instead often take place on high-performance servers co-located with data. Evaluations of how new technologies such as cloud computing would support such a new distributed computing model are urgently needed. Cloud computing is a new way of purchasing computing and storage resources on demand through virtualization technologies. We report here the results of investigations of the applicability of commercial cloud computing to scientific computing, with an emphasis on astronomy, including investigations of what types of applications can be run cheaply and efficiently on the cloud, and an example of an application well suited to the cloud: processing a large dataset to create a new science product.

  17. Bigdata Driven Cloud Security: A Survey

    NASA Astrophysics Data System (ADS)

    Raja, K.; Hanifa, Sabibullah Mohamed

    2017-08-01

    Cloud Computing (CC) is a fast-growing technology for performing massive-scale and complex computing. It eliminates the need to maintain expensive computing hardware, dedicated space, and software. Recently, massive growth has been observed in the scale of data, or big data, generated through cloud computing. CC consists of a front end, which includes the users’ computers and the software required to access the cloud network, and a back end, which consists of the various computers, servers, and database systems that create the cloud. The leading service models through which the traditional cloud ecosystem delivers its services have become a powerful and popular architecture: SaaS (Software as a Service, in which end users utilize outsourced software), PaaS (Platform as a Service, in which a platform is provided), IaaS (Infrastructure as a Service, in which the physical environment is outsourced), and DaaS (Database as a Service, in which data can be housed within a cloud). Security threats, however, remain the most vital barrier for the cloud computing environment. The main barrier to the adoption of CC in health care relates to data security: when placing and transmitting data over public networks, cyber attacks in any form are anticipated. Hence, cloud service users need to understand the risk of data breaches and the adopted service delivery model during deployment. This survey covers CC security issues in depth (including data security in health care) so that researchers can develop robust security application models using Big Data (BD) on CC, which can be created and deployed easily. BD evolution is driven by fast-growing cloud-based applications developed using virtualized technologies. In this purview, MapReduce [12] is a good example of big data processing in a cloud environment and a model for cloud providers.

  18. Whole earth modeling: developing and disseminating scientific software for computational geophysics.

    NASA Astrophysics Data System (ADS)

    Kellogg, L. H.

    2016-12-01

    Historically, a great deal of specialized scientific software for modeling and data analysis has been developed by individual researchers or small groups of scientists working on their own specific research problems. As the magnitude of available data and computer power has increased, so has the complexity of scientific problems addressed by computational methods, creating both a need to sustain existing scientific software and a need to expand its development to take advantage of new algorithms, new software approaches, and new computational hardware. To that end, communities like the Computational Infrastructure for Geodynamics (CIG) have been established to support the use of best practices in scientific computing for solid earth geophysics research and teaching. Working as a scientific community enables computational geophysicists to take advantage of technological developments, improve the accuracy and performance of software, build on prior software development, and collaborate more readily. The CIG community, and others, have adopted an open-source development model, in which code is developed and disseminated by the community in an open fashion, using version control and software repositories like Git. One emerging issue is how to adequately identify and credit the intellectual contributions involved in creating open-source scientific software. The traditional method of disseminating scientific ideas, peer-reviewed publication, was not designed for reviewing or crediting scientific software, although emerging publication strategies such as software journals are attempting to address the need. We are piloting an integrated approach in which authors are identified and credited as scientific software is developed and run. Successful software citation requires integration with the scholarly publication and indexing mechanisms as well, to assign credit, ensure discoverability, and provide provenance for software.

  19. Systems Biology in Immunology – A Computational Modeling Perspective

    PubMed Central

    Germain, Ronald N.; Meier-Schellersheim, Martin; Nita-Lazar, Aleksandra; Fraser, Iain D. C.

    2011-01-01

    Systems biology is an emerging discipline that combines high-content, multiplexed measurements with informatic and computational modeling methods to better understand biological function at various scales. Here we present a detailed review of the methods used to create computational models and conduct simulations of immune function. We provide descriptions of the key data gathering techniques employed to generate the quantitative and qualitative data required for such modeling and simulation and summarize the progress to date in applying these tools and techniques to questions of immunological interest, including infectious disease. We include comments on what insights modeling can provide that complement information obtained from the more familiar experimental discovery methods used by most investigators and why quantitative methods are needed to eventually produce a better understanding of immune system operation in health and disease. PMID:21219182

  20. Creating vascular models by postprocessing computed tomography angiography images: a guide for anatomical education.

    PubMed

    Govsa, Figen; Ozer, Mehmet Asim; Sirinturk, Suzan; Eraslan, Cenk; Alagoz, Ahmet Kemal

    2017-08-01

    A new application of teaching anatomy involves the use of computed tomography angiography (CTA) images to create clinically relevant three-dimensional (3D) printed models. The purpose of this article is to review recent innovations in the process and the application of 3D printed models as a tool for undergraduate and postgraduate medical education. Images of the aortic arch pattern acquired by CTA were converted into 3D images using the free Google SketchUp software and saved in stereolithography format. Using a 3D printer (Makerbot), a model made of polylactic acid material was printed. A two-vessel left aortic arch was identified, consisting of the brachiocephalic trunk and the left subclavian artery. The life-like 3D models could be rotated 360° in all axes by hand. Early adopters in education and clinical practice have embraced medical imaging-guided 3D printed anatomical models for their ability to provide tactile feedback and a superior appreciation of the visuospatial relationships between anatomical structures. Printed vascular models are used to assist in preoperative planning, to develop intraoperative guidance tools, and to teach patients and surgical trainees in surgical practice.
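
    The record's pipeline is manual (SketchUp); for comparison, a scripted route from a segmented CTA volume to a printable STL can be sketched as follows, assuming scikit-image and numpy-stl are installed and using a hypothetical volume file and isosurface threshold:

        import numpy as np
        from skimage import measure
        from stl import mesh

        volume = np.load("cta_volume.npy")   # hypothetical 3-D intensity array
        # Extract an isosurface at an assumed intensity threshold.
        verts, faces, _, _ = measure.marching_cubes(volume, level=200)

        surface = mesh.Mesh(np.zeros(faces.shape[0], dtype=mesh.Mesh.dtype))
        for i, face in enumerate(faces):
            surface.vectors[i] = verts[face]  # three vertices per triangle
        surface.save("aortic_arch.stl")       # ready for slicing and 3-D printing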

  1. Statistical Techniques Complement UML When Developing Domain Models of Complex Dynamical Biosystems.

    PubMed

    Williams, Richard A; Timmis, Jon; Qwarnstrom, Eva E

    2016-01-01

    Computational modelling and simulation is increasingly being used to complement traditional wet-lab techniques when investigating the mechanistic behaviours of complex biological systems. In order to ensure computational models are fit for purpose, it is essential that the abstracted view of biology captured in the computational model is clearly and unambiguously defined within a conceptual model of the biological domain (a domain model) that acts to accurately represent the biological system and to document the functional requirements for the resultant computational model. We present a domain model of the IL-1 stimulated NF-κB signalling pathway, which unambiguously defines the spatial, temporal and stochastic requirements for our future computational model. Through the development of this model, we observe that, in isolation, UML is not sufficient for the purpose of creating a domain model, and that a number of descriptive and multivariate statistical techniques provide complementary perspectives, in particular when modelling the heterogeneity of dynamics at the single-cell level. We believe this approach of using UML to define the structure and interactions within a complex system, along with statistics to define the stochastic and dynamic nature of complex systems, is crucial for ensuring that conceptual models of complex dynamical biosystems, which are developed using UML, are fit for purpose, and unambiguously define the functional requirements for the resultant computational model.

  2. Statistical Techniques Complement UML When Developing Domain Models of Complex Dynamical Biosystems

    PubMed Central

    Timmis, Jon; Qwarnstrom, Eva E.

    2016-01-01

    Computational modelling and simulation is increasingly being used to complement traditional wet-lab techniques when investigating the mechanistic behaviours of complex biological systems. In order to ensure computational models are fit for purpose, it is essential that the abstracted view of biology captured in the computational model is clearly and unambiguously defined within a conceptual model of the biological domain (a domain model) that acts to accurately represent the biological system and to document the functional requirements for the resultant computational model. We present a domain model of the IL-1 stimulated NF-κB signalling pathway, which unambiguously defines the spatial, temporal and stochastic requirements for our future computational model. Through the development of this model, we observe that, in isolation, UML is not sufficient for the purpose of creating a domain model, and that a number of descriptive and multivariate statistical techniques provide complementary perspectives, in particular when modelling the heterogeneity of dynamics at the single-cell level. We believe this approach of using UML to define the structure and interactions within a complex system, along with statistics to define the stochastic and dynamic nature of complex systems, is crucial for ensuring that conceptual models of complex dynamical biosystems, which are developed using UML, are fit for purpose, and unambiguously define the functional requirements for the resultant computational model. PMID:27571414

  3. Advancing the Implementation of Hydrologic Models as Web-based Applications

    NASA Astrophysics Data System (ADS)

    Dahal, P.; Tarboton, D. G.; Castronova, A. M.

    2017-12-01

    Advanced computer simulations are required to understand hydrologic phenomena such as rainfall-runoff response, groundwater hydrology, snow hydrology, etc. Building a hydrologic model instance to simulate a watershed requires investment in data (diverse geospatial datasets such as terrain and soil) and computer resources, typically demands a wide skill set from the analyst, and the workflow involved is often difficult to reproduce. This work introduces a web-based prototype infrastructure in the form of a web application that provides researchers with easy-to-use access to complete hydrological modeling functionality. This includes creating the necessary geospatial and forcing data, preparing input files for a model by applying complex data preprocessing, running the model for a user-defined watershed, and saving the results to a web repository. The open source Tethys Platform was used to develop the web app front-end Graphical User Interface (GUI). We used HydroDS, a web service that provides data preparation and processing capabilities to support the backend computations used by the app. Results are saved in HydroShare, a hydrologic information system that supports the sharing of hydrologic data, models, and analysis tools. The TOPographic Kinematic APproximation and Integration (TOPKAPI) model served as the example for which we developed a complete hydrologic modeling service to demonstrate the approach. The final product is a complete modeling system, accessible through the web, to create input files and run the TOPKAPI hydrologic model for a watershed of interest. We are investigating similar functionality for the preparation of input to the Regional Hydro-Ecological Simulation System (RHESSys). Key Words: hydrologic modeling, web services, hydrologic information system, HydroShare, HydroDS, Tethys Platform

  4. Provider-Independent Use of the Cloud

    NASA Astrophysics Data System (ADS)

    Harmer, Terence; Wright, Peter; Cunningham, Christina; Perrott, Ron

    Utility computing offers researchers and businesses the potential of significant cost-savings, making it possible for them to match the cost of their computing and storage to their demand for such resources. A utility compute provider enables the purchase of compute infrastructures on-demand; when a user requires computing resources a provider will provision a resource for them and charge them only for their period of use of that resource. There has been a significant growth in the number of cloud computing resource providers, and each has a different resource usage model, application process, and application programming interface (API); developing generic multi-resource provider applications is thus difficult and time-consuming. We have developed an abstraction layer that provides a single resource usage model, user authentication model, and API for compute providers, enabling cloud-provider-neutral applications to be developed. In this paper we outline the issues in using external resource providers, give examples of using a number of the most popular cloud providers, and provide examples of developing provider-neutral applications. In addition, we discuss the development of the API to create a generic provisioning model based on a common architecture for cloud computing providers.
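
    The abstraction-layer idea can be sketched as a single provider-neutral interface behind which each vendor's native API hides; the classes and method names below are hypothetical, not the authors' API:

        from abc import ABC, abstractmethod

        class ComputeProvider(ABC):
            """Single usage model: provision, use, release, pay per use."""
            @abstractmethod
            def provision(self, cpus: int, memory_gb: int) -> str: ...
            @abstractmethod
            def release(self, resource_id: str) -> None: ...

        class ExampleCloudA(ComputeProvider):
            def provision(self, cpus, memory_gb):
                # A real adapter would call provider A's native API here.
                return f"a-{cpus}x{memory_gb}"
            def release(self, resource_id):
                print(f"released {resource_id}")

        def run_job(provider: ComputeProvider):
            # Provider-neutral application code: no vendor-specific calls.
            rid = provider.provision(cpus=4, memory_gb=16)
            try:
                print(f"running on {rid}")
            finally:
                provider.release(rid)

        run_job(ExampleCloudA())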

  5. A survey on hair modeling: styling, simulation, and rendering.

    PubMed

    Ward, Kelly; Bertails, Florence; Kim, Tae-Yong; Marschner, Stephen R; Cani, Marie-Paule; Lin, Ming C

    2007-01-01

    Realistic hair modeling is a fundamental part of creating virtual humans in computer graphics. This paper surveys the state of the art in the major topics of hair modeling: hairstyling, hair simulation, and hair rendering. Because of the difficult, often unsolved problems that arise in all these areas, a broad diversity of approaches are used, each with strengths that make it appropriate for particular applications. We discuss each of these major topics in turn, presenting the unique challenges facing each area and describing solutions that have been presented over the years to handle these complex issues. Finally, we outline some of the remaining computational challenges in hair modeling.

  6. A Computational Model of Human Table Tennis for Robot Application

    NASA Astrophysics Data System (ADS)

    Mülling, Katharina; Peters, Jan

    Table tennis is a difficult motor skill which requires all basic components of a general motor skill learning system. In order to get a step closer to such a generic approach to the automatic acquisition and refinement of table tennis, we study table tennis from a human motor control point of view. We make use of the basic models of discrete human movement phases, virtual hitting points, and the operational timing hypothesis. Using these components, we create a computational model which is aimed at reproducing human-like behavior. We verify the functionality of this model in a physically realistic simulation of a Barrett WAM.

  7. Conifer ovulate cones accumulate pollen principally by simple impaction.

    PubMed

    Cresswell, James E; Henning, Kevin; Pennel, Christophe; Lahoubi, Mohamed; Patrick, Michael A; Young, Phillipe G; Tabor, Gavin R

    2007-11-13

    In many pine species (Family Pinaceae), ovulate cones structurally resemble a turbine, which has been widely interpreted as an adaptation for improving pollination by producing complex aerodynamic effects. We tested the turbine interpretation by quantifying patterns of pollen accumulation on ovulate cones in a wind tunnel and by using simulation models based on computational fluid dynamics. We used computer-aided design and computed tomography to create computational fluid dynamics model cones. We studied three species: Pinus radiata, Pinus sylvestris, and Cedrus libani. Irrespective of the approach or species studied, we found no evidence that turbine-like aerodynamics made a significant contribution to pollen accumulation, which instead occurred primarily by simple impaction. Consequently, we suggest alternative adaptive interpretations for the structure of ovulate cones.

  8. Conifer ovulate cones accumulate pollen principally by simple impaction

    PubMed Central

    Cresswell, James E.; Henning, Kevin; Pennel, Christophe; Lahoubi, Mohamed; Patrick, Michael A.; Young, Phillipe G.; Tabor, Gavin R.

    2007-01-01

    In many pine species (Family Pinaceae), ovulate cones structurally resemble a turbine, which has been widely interpreted as an adaptation for improving pollination by producing complex aerodynamic effects. We tested the turbine interpretation by quantifying patterns of pollen accumulation on ovulate cones in a wind tunnel and by using simulation models based on computational fluid dynamics. We used computer-aided design and computed tomography to create computational fluid dynamics model cones. We studied three species: Pinus radiata, Pinus sylvestris, and Cedrus libani. Irrespective of the approach or species studied, we found no evidence that turbine-like aerodynamics made a significant contribution to pollen accumulation, which instead occurred primarily by simple impaction. Consequently, we suggest alternative adaptive interpretations for the structure of ovulate cones. PMID:17986613

  9. Interactomes to Biological Phase Space: a call to begin thinking at a new level in computational biology.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Davidson, George S.; Brown, William Michael

    2007-09-01

    Techniques for high-throughput determination of interactomes, together with high-resolution protein colocalization maps within organelles and through membranes, will soon create a vast resource. With these data, biological descriptions, akin to the high-dimensional phase spaces familiar to physicists, will become possible. These descriptions will capture sufficient information to make possible realistic, system-level models of cells. The descriptions and the computational models they enable will require powerful computing techniques. This report is offered as a call to the computational biology community to begin thinking at this scale and as a challenge to develop the required algorithms and codes to make use of the new data.

  10. The PyRosetta Toolkit: a graphical user interface for the Rosetta software suite.

    PubMed

    Adolf-Bryfogle, Jared; Dunbrack, Roland L

    2013-01-01

    The Rosetta Molecular Modeling suite is a command-line-only collection of applications that enable high-resolution modeling and design of proteins and other molecules. Although extremely useful, Rosetta can be difficult to learn for scientists with little computational or programming experience. To that end, we have created a Graphical User Interface (GUI) for Rosetta, called the PyRosetta Toolkit, for creating and running protocols in Rosetta for common molecular modeling and protein design tasks and for analyzing the results of Rosetta calculations. The program is highly extensible so that developers can add new protocols and analysis tools to the PyRosetta Toolkit GUI.
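
    For readers who prefer scripting to the GUI, the same Rosetta functionality is reachable directly from PyRosetta. A minimal scoring sketch, assuming a working PyRosetta installation and a local file input.pdb (both assumptions, not part of the record):

        from pyrosetta import init, pose_from_pdb, create_score_function

        init()                                       # start the Rosetta runtime
        pose = pose_from_pdb("input.pdb")            # hypothetical input structure
        scorefxn = create_score_function("ref2015")  # standard all-atom function
        print(f"{pose.total_residue()} residues, score = {scorefxn(pose):.2f}")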

  11. The implications of free 3D scanning in the conservation state assessment of old wood painted icon

    NASA Astrophysics Data System (ADS)

    Munteanu, Marius; Sandu, Ion

    2016-06-01

    This paper presents the conservation state and the making of a 3D model of an 18th-century Orthodox icon on a wood support, using freely available software and cloud computing. In order to create the 3D model of the painting layer of the icon, 70 pictures were taken using a Nikon DSLR D3300 (24.2 MP) mounted on a Hama Star 75 photo tripod, in 360° loops around the painting at three different angles. The pictures were processed with Autodesk 123D Catch, which automatically finds and matches common features among all of the uploaded photographs to create the 3D scene, using the power and speed of cloud computing. The resulting 3D model was then analyzed and processed to obtain a final version, which can now be used to better identify, map, and prioritize future conservation processes, and which can finally be shared online as an animation.

  12. Simulation of the modulation transfer function dependent on the partial Fourier fraction in dynamic contrast enhancement magnetic resonance imaging.

    PubMed

    Takatsu, Yasuo; Ueyama, Tsuyoshi; Miyati, Tosiaki; Yamamura, Kenichirou

    2016-12-01

    The image characteristics in dynamic contrast-enhanced magnetic resonance imaging (DCE-MRI) depend on the partial Fourier fraction and the contrast medium concentration. These characteristics were assessed and the modulation transfer function (MTF) was calculated by computer simulation. A digital phantom was created from signal intensity data acquired at different contrast medium concentrations on a breast model. The frequency images, created by fast Fourier transform (FFT), were divided into 512 parts and rearranged to form a new image. The inverse FFT of this image yielded the MTF. From the reference data, three linear models (low, medium, and high) and three exponential models (slow, medium, and rapid) of the signal intensity were created. Smaller partial Fourier fractions, and higher gradients in the linear models, corresponded to faster MTF decline. The MTF decreased more gradually in the exponential models than in the linear models. The MTF, which reflects the image characteristics in DCE-MRI, was more degraded as the partial Fourier fraction decreased.
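
    The simulation idea can be illustrated with a toy one-dimensional sketch: keep only a partial Fourier fraction of k-space, zero-fill the rest, and observe the degraded point response and MTF (the array size and the 0.625 fraction are illustrative, not the study's parameters):

        import numpy as np

        n, fraction = 256, 0.625            # samples and partial Fourier fraction (assumed)
        impulse = np.zeros(n)
        impulse[n // 2] = 1.0               # ideal point object

        kspace = np.fft.fftshift(np.fft.fft(impulse))
        kept = int(n * fraction)
        kspace[kept:] = 0                   # discard the unacquired high-frequency tail

        psf = np.abs(np.fft.ifft(np.fft.ifftshift(kspace)))  # magnitude image, as in MRI
        mtf = np.abs(np.fft.fftshift(np.fft.fft(psf)))
        mtf /= mtf.max()                    # normalized modulation transfer function
        print(f"MTF at a mid frequency: {mtf[3 * n // 4]:.3f}")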

  13. Learning-based stochastic object models for characterizing anatomical variations

    NASA Astrophysics Data System (ADS)

    Dolly, Steven R.; Lou, Yang; Anastasio, Mark A.; Li, Hua

    2018-03-01

    It is widely known that the optimization of imaging systems based on objective, task-based measures of image quality via computer-simulation requires the use of a stochastic object model (SOM). However, the development of computationally tractable SOMs that can accurately model the statistical variations in human anatomy within a specified ensemble of patients remains a challenging task. Previously reported numerical anatomic models lack the ability to accurately model inter-patient and inter-organ variations in human anatomy among a broad patient population, mainly because they are established on image data corresponding to a few patients and individual anatomic organs. This may introduce phantom-specific bias into computer-simulation studies, where the study result is heavily dependent on which phantom is used. In certain applications, however, databases of high-quality volumetric images and organ contours are available that can facilitate this SOM development. In this work, a novel and tractable methodology for learning a SOM and generating numerical phantoms from a set of volumetric training images is developed. The proposed methodology learns geometric attribute distributions (GAD) of human anatomic organs from a broad patient population, which characterize both centroid relationships between neighboring organs and anatomic shape similarity of individual organs among patients. By randomly sampling the learned centroid and shape GADs with the constraints of the respective principal attribute variations learned from the training data, an ensemble of stochastic objects can be created. The randomness in organ shape and position reflects the learned variability of human anatomy. To demonstrate the methodology, a SOM of an adult male pelvis is computed and examples of corresponding numerical phantoms are created.

  14. Computed tomography landmark-based semi-automated mesh morphing and mapping techniques: generation of patient specific models of the human pelvis without segmentation.

    PubMed

    Salo, Zoryana; Beek, Maarten; Wright, David; Whyne, Cari Marisa

    2015-04-13

    Current methods for the development of pelvic finite element (FE) models generally are based upon specimen-specific computed tomography (CT) data. This approach has traditionally required segmentation of CT data sets, which is time consuming and necessitates high levels of user intervention due to the complex pelvic anatomy. The purpose of this research was to develop and assess CT landmark-based semi-automated mesh morphing and mapping techniques to aid the generation and mechanical analysis of specimen-specific FE models of the pelvis without the need for segmentation. A specimen-specific pelvic FE model (source) was created using traditional segmentation methods and morphed onto a CT scan of a different (target) pelvis using a landmark-based method. The morphed model was then refined through mesh mapping by moving the nodes to the bone boundary. A second target model was created using traditional segmentation techniques. CT intensity based material properties were assigned to the morphed/mapped model and to the traditionally segmented target models. Models were analyzed to evaluate their geometric concurrency and strain patterns. Strains generated in a double-leg stance configuration were compared to experimental strain gauge data generated from the same target cadaver pelvis. CT landmark-based morphing and mapping techniques were efficiently applied to create a geometrically multifaceted specimen-specific pelvic FE model, which was similar to the traditionally segmented target model and better replicated the experimental strain results (R² = 0.873). This study has shown that mesh morphing and mapping represents an efficient validated approach for pelvic FE model generation without the need for segmentation.
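
    A conceptual sketch of the landmark-driven morphing step, under the assumption that a smooth interpolant fitted to paired source/target landmarks is applied to every source-mesh node (the coordinates are synthetic placeholders, not the study's pelvic landmarks or its implementation):

        import numpy as np
        from scipy.interpolate import RBFInterpolator

        src_landmarks = np.array([[0, 0, 0], [1, 0, 0], [0, 1, 0], [0, 0, 1.0]])
        tgt_landmarks = np.array([[0, 0, 0], [1.1, 0, 0], [0, 0.9, 0], [0, 0, 1.2]])

        # Fit a radial-basis deformation field from source to target space.
        warp = RBFInterpolator(src_landmarks, tgt_landmarks - src_landmarks)

        source_nodes = np.random.rand(1000, 3)          # stand-in for FE mesh nodes
        morphed_nodes = source_nodes + warp(source_nodes)
        print(morphed_nodes.shape)                      # (1000, 3): morphed mesh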

  15. Finite Element Model to Reduce Fire and Blast Vulnerability

    DTIC Science & Technology

    2013-01-01

    [Fragmentary DTIC record; figure list and classification markings omitted. Recoverable captions: Figure 4, Scapula, Clavicle and Arm Models Attached to the Larger Model; Figure 5, The Full Body ... Finite Element Model of the Lower Limbs.] Anatomical surfaces of the scapula and clavicle were obtained and ... ulna and hand bones. For the arms, hands, scapula, and clavicle, the materials were made rigid and joints were created using computational constraints.

  16. Researchers Mine Information from Next-Generation Subsurface Flow Simulations

    DOE PAGES

    Gedenk, Eric D.

    2015-12-01

    A research team based at Virginia Tech leveraged computing resources at the US Department of Energy's (DOE's) Oak Ridge National Laboratory to explore subsurface multiphase flow phenomena that can't be experimentally observed. Using the Cray XK7 Titan supercomputer at the Oak Ridge Leadership Computing Facility, the team took Micro-CT images of subsurface geologic systems and created two-phase flow simulations. The team's model development has implications for computational research pertaining to carbon sequestration, oil recovery, and contaminant transport.

  17. An Upgrade of the Aeroheating Software ''MINIVER''

    NASA Technical Reports Server (NTRS)

    Louderback, Pierce

    2013-01-01

    Detailed computational modeling: CFD is often used to create and execute computational domains, with increasing complexity when moving from 2D to 3D geometries, and computational time increases as finer grids are used (for accuracy); it is a strong tool, but takes time to set up and run. MINIVER: uses theoretical and empirical correlations; orders of magnitude faster to set up and run; not as accurate as CFD, but gives reasonable estimations. MINIVER's drawbacks: a rigid command-line interface; lackluster, unorganized documentation; and no central control, so multiple versions exist and have diverged.

  18. Adaptive System Modeling for Spacecraft Simulation

    NASA Technical Reports Server (NTRS)

    Thomas, Justin

    2011-01-01

    This invention introduces a methodology and associated software tools for automatically learning spacecraft system models without any assumptions regarding system behavior. Data stream mining techniques were used to learn models for critical portions of the International Space Station (ISS) Electrical Power System (EPS). Evaluation on historical ISS telemetry data shows that adaptive system modeling reduces simulation error anywhere from 50 to 90 percent over existing approaches. The purpose of the methodology is to outline how someone can create accurate system models from sensor (telemetry) data. The purpose of the software is to support the methodology. The software provides analysis tools to design the adaptive models. The software also provides the algorithms to initially build system models and continuously update them from the latest streaming sensor data. The main strengths are as follows: Creates accurate spacecraft system models without in-depth system knowledge or any assumptions about system behavior. Automatically updates/calibrates system models using the latest streaming sensor data. Creates device specific models that capture the exact behavior of devices of the same type. Adapts to evolving systems. Can reduce computational complexity (faster simulations).

  19. Computational Modelling of Patella Femoral Kinematics During Gait Cycle and Experimental Validation

    NASA Astrophysics Data System (ADS)

    Maiti, Raman

    2016-06-01

    The effect of loading and boundary conditions on patellar mechanics is significant due to the complications arising in patella femoral joints during total knee replacements. To understand the patellar mechanics with respect to loading and motion, a computational model representing the patella femoral joint was developed and validated against experimental results. The computational model was created in IDEAS NX and simulated in MSC ADAMS/VIEW software. The results obtained in the form of internal external rotations and anterior posterior displacements for a new and experimentally simulated specimen for patella femoral joint under standard gait condition were compared with experimental measurements performed on the Leeds ProSim knee simulator. A good overall agreement between the computational prediction and the experimental data was obtained for patella femoral kinematics. Good agreement between the model and the past studies was observed when the ligament load was removed and the medial lateral displacement was constrained. The model is sensitive to ±5 % change in kinematics, frictional, force and stiffness coefficients and insensitive to time step.

  20. Computational Modelling of Patella Femoral Kinematics During Gait Cycle and Experimental Validation

    NASA Astrophysics Data System (ADS)

    Maiti, Raman

    2018-06-01

    The effect of loading and boundary conditions on patellar mechanics is significant due to the complications arising in patella femoral joints during total knee replacements. To understand the patellar mechanics with respect to loading and motion, a computational model representing the patella femoral joint was developed and validated against experimental results. The computational model was created in IDEAS NX and simulated in MSC ADAMS/VIEW software. The results obtained in the form of internal external rotations and anterior posterior displacements for a new and experimentally simulated specimen for patella femoral joint under standard gait condition were compared with experimental measurements performed on the Leeds ProSim knee simulator. A good overall agreement between the computational prediction and the experimental data was obtained for patella femoral kinematics. Good agreement between the model and the past studies was observed when the ligament load was removed and the medial lateral displacement was constrained. The model is sensitive to ±5 % change in kinematics, frictional, force and stiffness coefficients and insensitive to time step.

  1. Fluid-Structure Interaction Modeling of Parachutes with Disreefing and Modified Geometric Porosity and Separation Aerodynamics of a Cover Jettisoned to the Spacecraft Wake

    NASA Astrophysics Data System (ADS)

    Fritze, Matthew D.

    Fluid-structure interaction (FSI) modeling of spacecraft parachutes involves a number of computational challenges. The canopy complexity created by the hundreds of gaps and slits, and the design-related modification of that geometric porosity by removal of some of the sails and panels, are among the formidable challenges. Disreefing from one stage to another, when the parachute is used in multiple stages, is another formidable challenge. This thesis addresses the computational challenges involved in disreefing of spacecraft parachutes and in fully-open and reefed stages of the parachutes with modified geometric porosity. The special techniques developed to address these challenges are described and the FSI computations are reported. The thesis also addresses the modeling and computation challenges involved in the very early stages, where the sudden separation of a cover jettisoned to the spacecraft wake needs to be modeled. Higher-order temporal representations used in modeling the separation motion are described, and the computed separation and wake-induced forces acting on the cover are reported.

  2. Study of a scanning HIFU therapy protocol, Part II: Experiment and results

    NASA Astrophysics Data System (ADS)

    Andrew, Marilee A.; Kaczkowski, Peter; Cunitz, Bryan W.; Brayman, Andrew A.; Kargl, Steven G.

    2003-04-01

    Instrumentation and protocols for creating scanned HIFU lesions in freshly excised bovine liver were developed in order to study the in vitro HIFU dose response and validate models. Computer control of the HIFU transducer and 3-axis positioning system provided precise spatial placement of the thermal lesions. Scan speeds were selected in the range of 1 to 8 mm/s, and the applied electrical power was varied from 20 to 60 W. These parameters were chosen to hold the thermal dose constant. A total of six valid scans of 15 mm length were created in each sample; a 3.5 MHz single-element, spherically focused transducer was used. Treated samples were frozen, then sliced in 1.27 mm increments. Digital photographs of slices were downloaded to a computer for image processing and analysis. Lesion characteristics, including the depth within the tissue, axial length, and radial width, were computed. Results were compared with those generated from modified KZK and BHTE models, and include a comparison of the statistical variation in the across-scan lesion radial width. [Work supported by USAMRMC.]

  3. A network-analysis-based comparative study of the throughput behavior of polymer melts in barrier screw geometries

    NASA Astrophysics Data System (ADS)

    Aigner, M.; Köpplmayr, T.; Kneidinger, C.; Miethlinger, J.

    2014-05-01

    Barrier screws are widely used in the plastics industry. Due to the extreme diversity of their geometries, describing the flow behavior is difficult and rarely done in practice. We present a systematic approach based on networks that uses tensor algebra and numerical methods to model and calculate selected barrier screw geometries in terms of pressure, mass flow, and residence time. In addition, we report the results of three-dimensional simulations using the commercially available ANSYS Polyflow software. The major drawbacks of three-dimensional finite-element-method (FEM) simulations are that they require vast computational power and large quantities of memory, and consume considerable time to create a geometric model by computer-aided design (CAD) and to complete a flow calculation. Consequently, a modified 2.5-dimensional finite volume method, termed network analysis, is preferable. The results obtained by network analysis and FEM simulations correlated well. Network analysis provides an efficient alternative to complex FEM software in terms of computing power and memory consumption. Furthermore, typical barrier screw geometries can be parameterized and used for flow calculations without time-consuming CAD constructions.

  4. Optimization of a Monte Carlo Model of the Transient Reactor Test Facility

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Kristin; DeHart, Mark; Goluoglu, Sedat

    2017-03-01

    The ultimate goal of modeling and simulation is to obtain reasonable answers to problems that don’t have representations which can be easily evaluated, while minimizing the amount of computational resources. With the advances during the last twenty years of large-scale computing centers, researchers have had the ability to create a multitude of tools to minimize the number of approximations necessary when modeling a system. The tremendous power of these centers requires the user to possess an immense amount of knowledge to optimize the models for accuracy and efficiency. This paper seeks to evaluate the KENO model of TREAT to optimize calculational efforts.

  5. Full-Body Musculoskeletal Model for Muscle-Driven Simulation of Human Gait.

    PubMed

    Rajagopal, Apoorva; Dembia, Christopher L; DeMers, Matthew S; Delp, Denny D; Hicks, Jennifer L; Delp, Scott L

    2016-10-01

    Musculoskeletal models provide a non-invasive means to study human movement and predict the effects of interventions on gait. Our goal was to create an open-source 3-D musculoskeletal model with high-fidelity representations of the lower limb musculature of healthy young individuals that can be used to generate accurate simulations of gait. Our model includes bony geometry for the full body, 37 degrees of freedom to define joint kinematics, Hill-type models of 80 muscle-tendon units actuating the lower limbs, and 17 ideal torque actuators driving the upper body. The model's musculotendon parameters are derived from previous anatomical measurements of 21 cadaver specimens and magnetic resonance images of 24 young healthy subjects. We tested the model by evaluating its computational time and accuracy of simulations of healthy walking and running. Generating muscle-driven simulations of normal walking and running took approximately 10 minutes on a typical desktop computer. The differences between our muscle-generated and inverse dynamics joint moments were within 3% (RMSE) of the peak inverse dynamics joint moments in both walking and running, and our simulated muscle activity showed qualitative agreement with salient features from experimental electromyography data. These results suggest that our model is suitable for generating muscle-driven simulations of healthy gait. We encourage other researchers to further validate and apply the model to study other motions of the lower extremity. The model is implemented in the open-source software platform OpenSim. The model and data used to create and test the simulations are freely available at https://simtk.org/home/full_body/, allowing others to reproduce these results and create their own simulations.
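
    The model is distributed for OpenSim; a minimal sketch of loading it through OpenSim's Python bindings follows (the file name is assumed from the simtk.org distribution and may differ):

        import opensim as osim

        model = osim.Model("Rajagopal2015.osim")   # hypothetical local copy of the model
        state = model.initSystem()                 # build the underlying multibody system
        print("coordinates:", model.getNumCoordinates())   # expect 37 degrees of freedom
        print("muscles:", model.getMuscles().getSize())    # expect 80 muscle-tendon units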

  6. Can "CANISO" Activate "CASINO"? Transposed-Letter Similarity Effects with Nonadjacent Letter Positions

    ERIC Educational Resources Information Center

    Perea, Manuel; Lupker, Stephen J.

    2004-01-01

    Nonwords created by transposing two "adjacent" letters (i.e., transposed-letter (TL) nonwords like "jugde") are very effective at activating the lexical representation of their base words. This fact poses problems for most computational models of word recognition (e.g., the interactive-activation model and its extensions), which assume that exact…

  7. Vibro-Acoustic FE Analyses of the Saab 2000 Aircraft

    NASA Technical Reports Server (NTRS)

    Green, Inge S.

    1992-01-01

    A finite element model of the Saab 2000 fuselage structure and interior cavity has been created in order to compute the noise level in the passenger cabin due to propeller noise. Areas covered in viewgraph format include the following: coupled acoustic/structural noise; data base creation; frequency response analysis; model validation; and planned analyses.

  8. A New Tradition To Fit the Model.

    ERIC Educational Resources Information Center

    Darnell, D. Roe; Rosenthal, Donna McCrohan

    2001-01-01

    Discusses Cerro Coso Community College in Ridgecrest (California), where 80-85 of all local jobs are with one employer, the China Lake Naval Air Weapons Station (NAWS). States that massive layoffs at NAWS inspired creative ways of rethinking the community college model at Cerro Coso, such as creating the nation's first computer graphics imagery…

  9. Computer Administering of the Psychological Investigations: Set-Relational Representation

    NASA Astrophysics Data System (ADS)

    Yordzhev, Krasimir

    Computer administering of a psychological investigation is the computer representation of the entire procedure of psychological assessment - test construction, test implementation, results evaluation, storage and maintenance of the developed database, and its statistical processing, analysis, and interpretation. A mathematical description of psychological assessment with the aid of personality tests is discussed in this article. Set theory and relational algebra are used in this description. A relational model of the data needed to design a computer system for the automation of certain psychological assessments is given. Some finite sets, and relations on them, which are necessary for creating a personality psychological test, are described. The described model could be used to develop real software for the computer administering of any psychological test, with full automation of the whole process: test construction, test implementation, result evaluation, storage of the developed database, statistical implementation, analysis, and interpretation. A software project for computer administering of personality psychological tests is suggested.
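
    As a toy rendering of the set-relational idea, a personality test can be represented directly with finite sets and relations; the item names, constructs, and scoring rule below are illustrative only, not the article's model:

        # A test is a finite set of items; a scale is a relation between
        # items and constructs; a completed assessment is a relation
        # between items and responses.
        items = {"q1", "q2", "q3"}
        scales = {("q1", "extraversion"), ("q2", "extraversion"),
                  ("q3", "neuroticism")}
        responses = {("q1", 4), ("q2", 2), ("q3", 5)}   # item -> Likert score

        def scale_score(construct: str) -> int:
            """Sum responses over the items related to one construct."""
            related = {item for item, c in scales if c == construct}
            return sum(score for item, score in responses if item in related)

        print(scale_score("extraversion"))  # 6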

  10. ARACHNE: A neural-neuroglial network builder with remotely controlled parallel computing

    PubMed Central

    Rusakov, Dmitri A.; Savtchenko, Leonid P.

    2017-01-01

    Creating and running realistic models of neural networks has hitherto been a task for computing professionals rather than experimental neuroscientists. This is mainly because such networks usually engage substantial computational resources, the handling of which requires specific programing skills. Here we put forward a newly developed simulation environment ARACHNE: it enables an investigator to build and explore cellular networks of arbitrary biophysical and architectural complexity using the logic of NEURON and a simple interface on a local computer or a mobile device. The interface can control, through the internet, an optimized computational kernel installed on a remote computer cluster. ARACHNE can combine neuronal (wired) and astroglial (extracellular volume-transmission driven) network types and adopt realistic cell models from the NEURON library. The program and documentation (current version) are available at GitHub repository https://github.com/LeonidSavtchenko/Arachne under the MIT License (MIT). PMID:28362877

  11. A computational model of selection by consequences.

    PubMed Central

    McDowell, J J

    2004-01-01

    Darwinian selection by consequences was instantiated in a computational model that consisted of a repertoire of behaviors undergoing selection, reproduction, and mutation over many generations. The model in effect created a digital organism that emitted behavior continuously. The behavior of this digital organism was studied in three series of computational experiments that arranged reinforcement according to random-interval (RI) schedules. The quantitative features of the model were varied over wide ranges in these experiments, and many of the qualitative features of the model also were varied. The digital organism consistently showed a hyperbolic relation between response and reinforcement rates, and this hyperbolic description of the data was consistently better than the description provided by other, similar, function forms. In addition, the parameters of the hyperbola varied systematically with the quantitative, and some of the qualitative, properties of the model in ways that were consistent with findings from biological organisms. These results suggest that the material events responsible for an organism's responding on RI schedules are computationally equivalent to Darwinian selection by consequences. They also suggest that the computational model developed here is worth pursuing further as a possible dynamic account of behavior. PMID:15357512
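
    For reference, the hyperbolic response-rate relation fitted here is conventionally written as follows (standard matching-law notation, not quoted from the paper), where B is response rate, r is obtained reinforcement rate, and k and r_e are fitted parameters:

        \[
          B = \frac{k\, r}{r + r_e}
        \]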

  12. A computer simulation model of Wolbachia invasion for disease vector population modification.

    PubMed

    Guevara-Souza, Mauricio; Vallejo, Edgar E

    2015-10-05

    Wolbachia invasion has been shown to be a promising alternative for controlling vector-borne diseases, particularly Dengue fever. Creating computer models that can provide insight into how vector population modification can be achieved under different conditions would be most valuable for assessing the efficacy of control strategies for this disease. In this paper, we present a computer model that simulates the behavior of native mosquito populations after the introduction of mosquitoes infected with the Wolbachia bacteria. We studied how different factors such as fecundity, fitness cost of infection, migration rates, number of populations, population size, and number of introduced infected mosquitoes affect the spread of the Wolbachia bacteria among native mosquito populations. Two main scenarios of the island model are presented in this paper, with infected mosquitoes introduced into the largest source population and into peripheral populations. Overall, the results are promising; Wolbachia infection spreads among native populations, and the computer model is capable of reproducing the results obtained by mathematical models and field experiments. Computer models can be very useful for gaining insight into how Wolbachia invasion works and are a promising alternative for complementing experimental and mathematical approaches to vector-borne disease control.
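
    To make the invasion-threshold intuition concrete, here is a toy single-population sketch of the classical cytoplasmic-incompatibility dynamics (a textbook frequency model, not the paper's agent-based island simulator; parameter values are illustrative):

        def next_freq(p: float, s_f: float = 0.1, s_h: float = 0.8) -> float:
            """Wolbachia frequency after one generation, assuming perfect
            maternal transmission; s_f is the fitness cost of infection,
            s_h the hatch reduction from incompatible crosses."""
            return p * (1 - s_f) / (1 - s_f * p - s_h * p * (1 - p))

        p = 0.20   # introduced infected fraction, above the ~s_f/s_h threshold
        for gen in range(25):
            p = next_freq(p)
        print(f"frequency after 25 generations: {p:.3f}")  # approaches fixation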

  13. Image/video understanding systems based on network-symbolic models

    NASA Astrophysics Data System (ADS)

    Kuvich, Gary

    2004-03-01

    Vision is part of a larger information system that converts visual information into knowledge structures. These structures drive the vision process, resolve ambiguity and uncertainty via feedback projections, and provide image understanding as an interpretation of visual information in terms of such knowledge models. Computer simulation models are built on the basis of graphs/networks, and the human brain appears able to emulate similar graph/network models. Symbols, predicates, and grammars naturally emerge in such networks, and logic is simply a way of restructuring such models. The brain analyzes an image as a graph-type relational structure created via multilevel hierarchical compression of visual information. Primary areas provide active fusion of image features on a spatial grid-like structure whose nodes are cortical columns; spatial logic and topology are naturally present in such structures. Mid-level vision processes, such as perceptual grouping and separation of figure from ground, are special kinds of network transformations: they convert the primary image structure into a set of more abstract ones, which represent objects and the visual scene and are easier to analyze with higher-level knowledge structures. Higher-level vision phenomena are results of such analysis. The composition of network-symbolic models combines learning, classification, and analogy with higher-level model-based reasoning in a single framework, working similarly to frames and agents. Computational intelligence methods transform images into model-based knowledge representations. Based on such principles, an image/video understanding system can convert images into knowledge models and resolve uncertainty and ambiguity, allowing the creation of intelligent computer vision systems for design and manufacturing.

  14. Crops in silico: A community wide multi-scale computational modeling framework of plant canopies

    NASA Astrophysics Data System (ADS)

    Srinivasan, V.; Christensen, A.; Borkiewic, K.; Yiwen, X.; Ellis, A.; Panneerselvam, B.; Kannan, K.; Shrivastava, S.; Cox, D.; Hart, J.; Marshall-Colon, A.; Long, S.

    2016-12-01

    Current crop models predict a looming gap between supply and demand for primary foodstuffs over the next 100 years. While significant yield increases were achieved in major food crops during the early years of the green revolution, the current rates of yield increases are insufficient to meet future projected food demand. Furthermore, with projected reduction in arable land, decrease in water availability, and increasing impacts of climate change on future food production, innovative technologies are required to sustainably improve crop yield. To meet these challenges, we are developing Crops in silico (Cis), a biologically informed, multi-scale, computational modeling framework that can facilitate whole plant simulations of crop systems. The Cis framework is capable of linking models of gene networks, protein synthesis, metabolic pathways, physiology, growth, and development in order to investigate crop response to different climate scenarios and resource constraints. This modeling framework will provide the mechanistic details to generate testable hypotheses toward accelerating directed breeding and engineering efforts to increase future food security. A primary objective for building such a framework is to create synergy among an inter-connected community of biologists and modelers to create a realistic virtual plant. This framework advantageously casts the detailed mechanistic understanding of individual plant processes across various scales in a common scalable framework that makes use of current advances in high performance and parallel computing. We are currently designing a user friendly interface that will make this tool equally accessible to biologists and computer scientists. Critically, this framework will provide the community with much needed tools for guiding future crop breeding and engineering, understanding the emergent implications of discoveries at the molecular level for whole plant behavior, and improved prediction of plant and ecosystem responses to the environment.

  15. The effect of in situ/in vitro three-dimensional quantitative computed tomography image voxel size on the finite element model of human vertebral cancellous bone.

    PubMed

    Lu, Yongtao; Engelke, Klaus; Glueer, Claus-C; Morlock, Michael M; Huber, Gerd

    2014-11-01

    Quantitative computed tomography-based finite element modeling is a promising clinical tool for the prediction of bone strength. However, quantitative computed tomography-based finite element models have been created from image datasets with different image voxel sizes. The aim of this study was to investigate whether image voxel size influences the finite element models. In all, 12 thoracolumbar vertebrae were scanned prior to autopsy (in situ) using two different quantitative computed tomography scan protocols, which resulted in image datasets with two different voxel sizes (0.29 × 0.29 × 1.3 mm³ vs 0.18 × 0.18 × 0.6 mm³). Eight of them were scanned after autopsy (in vitro) and the datasets were reconstructed with two voxel sizes (0.32 × 0.32 × 0.6 mm³ vs 0.18 × 0.18 × 0.3 mm³). Finite element models with cuboid volumes of interest extracted from the vertebral cancellous part were created, and inhomogeneous bilinear bone properties were defined. Axial compression was simulated. No effect of voxel size on the apparent bone mineral density was detected for either the in situ or the in vitro case. However, the apparent modulus and yield strength showed significant differences between the two voxel sizes in both group pairs (in situ and in vitro). In conclusion, the image voxel size may have to be considered when the finite element voxel modeling technique is used in clinical applications.

  16. Modeling the dynamics of multipartite quantum systems created departing from two-level systems using general local and non-local interactions

    NASA Astrophysics Data System (ADS)

    Delgado, Francisco

    2017-12-01

    Quantum information is an emergent area merging physics, mathematics, computer science, and engineering. To reach its technological goals, it requires adequate approaches to understanding how to combine physical restrictions, computational approaches, and technological requirements to obtain functional universal quantum information processing. This work presents the modeling and analysis of a certain general type of Hamiltonian representing several physical systems used in quantum information, and establishes a dynamics reduction in a natural grammar for bipartite processing based on entangled states.

  17. TethysCluster: A comprehensive approach for harnessing cloud resources for hydrologic modeling

    NASA Astrophysics Data System (ADS)

    Nelson, J.; Jones, N.; Ames, D. P.

    2015-12-01

    Advances in water resources modeling are improving the information that can be supplied to support decisions affecting the safety and sustainability of society. However, as water resources models become more sophisticated and data-intensive they require more computational power to run. Purchasing and maintaining the computing facilities needed to support certain modeling tasks has been cost-prohibitive for many organizations. With the advent of the cloud, the computing resources needed to address this challenge are now available and cost-effective, yet there still remains a significant technical barrier to leveraging these resources. This barrier inhibits many decision makers and even trained engineers from taking advantage of the best science and tools available. Here we present the Python tools TethysCluster and CondorPy, which have been developed to lower the barrier to model computation in the cloud by providing (1) programmatic access to dynamically scalable computing resources, (2) a batch scheduling system to queue and dispatch the jobs to the computing resources, (3) data management for job inputs and outputs, and (4) the ability to dynamically create, submit, and monitor computing jobs. These Python tools leverage HTCondor, the open-source computing-resource and job-management software, to offer a flexible and scalable distributed-computing environment. While TethysCluster and CondorPy can be used independently to provision computing resources and perform large modeling tasks, they have also been integrated into Tethys Platform, a development platform for water resources web apps, to enable computing support for modeling workflows and decision-support systems deployed as web apps.

  18. The BioIntelligence Framework: a new computational platform for biomedical knowledge computing

    PubMed Central

    Farley, Toni; Kiefer, Jeff; Lee, Preston; Von Hoff, Daniel; Trent, Jeffrey M; Colbourn, Charles

    2013-01-01

    Breakthroughs in molecular profiling technologies are enabling a new data-intensive approach to biomedical research, with the potential to revolutionize how we study, manage, and treat complex diseases. The next great challenge for clinical applications of these innovations will be to create scalable computational solutions for intelligently linking complex biomedical patient data to clinically actionable knowledge. Traditional database management systems (DBMS) are not well suited to representing complex syntactic and semantic relationships in unstructured biomedical information, introducing barriers to realizing such solutions. We propose a scalable computational framework for addressing this need, which leverages a hypergraph-based data model and query language that may be better suited for representing complex multi-lateral, multi-scalar, and multi-dimensional relationships. We also discuss how this framework can be used to create rapid learning knowledge base systems to intelligently capture and relate complex patient data to biomedical knowledge in order to automate the recovery of clinically actionable information. PMID:22859646
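
    To illustrate why a hypergraph-based data model suits such multi-lateral relationships, here is a small self-contained sketch: a hyperedge relates an arbitrary set of entities at once (patient, gene, drug, publication), which a conventional binary-edge graph cannot express in a single relation. Entity names are invented placeholders:

        from collections import defaultdict

        hyperedges = {
            "finding-1": {"patient:P7", "gene:KRAS", "drug:erlotinib", "pmid:12345"},
            "finding-2": {"patient:P7", "gene:TP53"},
        }

        # Invert to an incidence index: entity -> hyperedges containing it.
        incidence = defaultdict(set)
        for edge, members in hyperedges.items():
            for node in members:
                incidence[node].add(edge)

        def neighbors(node: str) -> set:
            """Entities co-occurring with `node` in at least one hyperedge."""
            return {m for e in incidence[node] for m in hyperedges[e]} - {node}

        print(neighbors("patient:P7"))  # genes, drug, and publication linked to P7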

  19. The Modular Modeling System (MMS): User's Manual

    USGS Publications Warehouse

    Leavesley, G.H.; Restrepo, Pedro J.; Markstrom, S.L.; Dixon, M.; Stannard, L.G.

    1996-01-01

    The Modular Modeling System (MMS) is an integrated system of computer software that has been developed to provide the research and operational framework needed to support development, testing, and evaluation of physical-process algorithms and to facilitate integration of user-selected sets of algorithms into operational physical-process models. MMS uses a module library that contains modules for simulating a variety of water, energy, and biogeochemical processes. A model is created by selectively coupling the most appropriate modules from the library to create a 'suitable' model for the desired application. Where existing modules do not provide appropriate process algorithms, new modules can be developed. The MMS user's manual provides installation instructions and a detailed discussion of system concepts, module development, and model development and application using the MMS graphical user interface.

  20. A method for transferring NASTRAN data between dissimilar computers. [application to CDC 6000 series, IBM 360-370 series, and Univac 1100 series computers

    NASA Technical Reports Server (NTRS)

    Rogers, J. L., Jr.

    1973-01-01

    The NASTRAN computer program is capable of executing on three different types of computers: (1) the CDC 6000 series, (2) the IBM 360-370 series, and (3) the Univac 1100 series. A typical activity requiring transfer of data between dissimilar computers is the analysis of a large structure such as the space shuttle by substructuring. Models of portions of the vehicle which have been analyzed by subcontractors using their computers must be integrated into a model of the complete structure by the prime contractor on his computer. Presently the transfer of NASTRAN matrices or tables between two different types of computers is accomplished by punched cards or a magnetic tape containing card images. These methods of data transfer do not satisfy the requirements for intercomputer data transfer associated with a substructuring activity. To provide a more satisfactory transfer of data, two new programs, RDUSER and WRTUSER, were created.

  1. Community Cloud Computing

    NASA Astrophysics Data System (ADS)

    Marinos, Alexandros; Briscoe, Gerard

    Cloud Computing is rising fast, with its data centres growing at an unprecedented rate. However, this has come with concerns over privacy, efficiency at the expense of resilience, and environmental sustainability, because of the dependence on Cloud vendors such as Google, Amazon and Microsoft. Our response is an alternative model for the Cloud conceptualisation, providing a paradigm for Clouds in the community, utilising networked personal computers for liberation from the centralised vendor model. Community Cloud Computing (C3) offers an alternative architecture, created by combining the Cloud with paradigms from Grid Computing, principles from Digital Ecosystems, and sustainability from Green Computing, while remaining true to the original vision of the Internet. It is more technically challenging than Cloud Computing, having to deal with distributed computing issues, including heterogeneous nodes, varying quality of service, and additional security constraints. However, these are not insurmountable challenges, and with the need to retain control over our digital lives and the potential environmental consequences, it is a challenge we must pursue.

  2. Full body musculoskeletal model for muscle-driven simulation of human gait

    PubMed Central

    Rajagopal, Apoorva; Dembia, Christopher L.; DeMers, Matthew S.; Delp, Denny D.; Hicks, Jennifer L.; Delp, Scott L.

    2017-01-01

    Objective: Musculoskeletal models provide a non-invasive means to study human movement and predict the effects of interventions on gait. Our goal was to create an open-source, three-dimensional musculoskeletal model with high-fidelity representations of the lower limb musculature of healthy young individuals that can be used to generate accurate simulations of gait. Methods: Our model includes bony geometry for the full body, 37 degrees of freedom to define joint kinematics, Hill-type models of 80 muscle-tendon units actuating the lower limbs, and 17 ideal torque actuators driving the upper body. The model's musculotendon parameters are derived from previous anatomical measurements of 21 cadaver specimens and magnetic resonance images of 24 young healthy subjects. We tested the model by evaluating its computational time and the accuracy of simulations of healthy walking and running. Results: Generating muscle-driven simulations of normal walking and running took approximately 10 minutes on a typical desktop computer. The differences between our muscle-generated and inverse dynamics joint moments were within 3% (RMSE) of the peak inverse dynamics joint moments in both walking and running, and our simulated muscle activity showed qualitative agreement with salient features from experimental electromyography data. Conclusion: These results suggest that our model is suitable for generating muscle-driven simulations of healthy gait. We encourage other researchers to further validate and apply the model to study other motions of the lower extremity. Significance: The model is implemented in the open source software platform OpenSim. The model and data used to create and test the simulations are freely available at https://simtk.org/home/full_body/, allowing others to reproduce these results and create their own simulations. PMID:27392337

  3. Manufacturing Magic and Computational Creativity

    PubMed Central

    Williams, Howard; McOwan, Peter W.

    2016-01-01

    This paper describes techniques in computational creativity, blending mathematical modeling and psychological insight, to generate new magic tricks. The details of an explicit computational framework capable of creating new magic tricks are summarized, and evaluated against a range of contemporary theories about what constitutes a creative system. To allow further development of the proposed system we situate this approach to the generation of magic in the wider context of other areas of application in computational creativity in performance arts. We show how approaches in these domains could be incorporated to enhance future magic generation systems, and critically review possible future applications of such magic generating computers. PMID:27375533

  4. Parameterized reduced-order models using hyper-dual numbers.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fike, Jeffrey A.; Brake, Matthew Robert

    2013-10-01

    The goal of most computational simulations is to accurately predict the behavior of a real, physical system. Accurate predictions often require very computationally expensive analyses and so reduced order models (ROMs) are commonly used. ROMs aim to reduce the computational cost of the simulations while still providing accurate results by including all of the salient physics of the real system in the ROM. However, real, physical systems often deviate from the idealized models used in simulations due to variations in manufacturing or other factors. One approach to this issue is to create a parameterized model in order to characterize the effect of perturbations from the nominal model on the behavior of the system. This report presents a methodology for developing parameterized ROMs, which is based on Craig-Bampton component mode synthesis and the use of hyper-dual numbers to calculate the derivatives necessary for the parameterization.
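
    A minimal sketch of the hyper-dual arithmetic underlying the method, following the standard Fike-Alonso formulation (x = f0 + f1*e1 + f2*e2 + f3*e1e2 with e1^2 = e2^2 = 0): evaluating a function at a seeded hyper-dual argument returns exact first and second derivatives, free of truncation and subtractive-cancellation error. The example function is illustrative, not taken from the report.

        # Hyper-dual number with components (f0, f1, f2, f3); seeding the two
        # dual parts with 1 makes f1 the first derivative and f3 the second.
        class HyperDual:
            def __init__(self, f0, f1=0.0, f2=0.0, f3=0.0):
                self.f0, self.f1, self.f2, self.f3 = f0, f1, f2, f3

            def __add__(self, o):
                o = o if isinstance(o, HyperDual) else HyperDual(o)
                return HyperDual(self.f0 + o.f0, self.f1 + o.f1,
                                 self.f2 + o.f2, self.f3 + o.f3)
            __radd__ = __add__

            def __mul__(self, o):
                o = o if isinstance(o, HyperDual) else HyperDual(o)
                return HyperDual(
                    self.f0 * o.f0,
                    self.f0 * o.f1 + self.f1 * o.f0,
                    self.f0 * o.f2 + self.f2 * o.f0,
                    self.f0 * o.f3 + self.f1 * o.f2
                    + self.f2 * o.f1 + self.f3 * o.f0)
            __rmul__ = __mul__

        def f(x):                 # example function: f(x) = x^3 + 2x
            return x * x * x + 2 * x

        x = HyperDual(2.0, 1.0, 1.0, 0.0)
        y = f(x)
        print(y.f0, y.f1, y.f3)   # f(2) = 12, f'(2) = 14, f''(2) = 12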

  5. Towards agile large-scale predictive modelling in drug discovery with flow-based programming design principles.

    PubMed

    Lampa, Samuel; Alvarsson, Jonathan; Spjuth, Ola

    2016-01-01

    Predictive modelling in drug discovery is challenging to automate as it often contains multiple analysis steps and might involve cross-validation and parameter tuning that create complex dependencies between tasks. With large-scale data or when using computationally demanding modelling methods, e-infrastructures such as high-performance or cloud computing are required, adding to the existing challenges of fault-tolerant automation. Workflow management systems can aid in many of these challenges, but the currently available systems are lacking in the functionality needed to enable agile and flexible predictive modelling. We here present an approach inspired by elements of the flow-based programming paradigm, implemented as an extension of the Luigi system, which we name SciLuigi. We also discuss the experiences from using the approach when modelling a large set of biochemical interactions using a shared computer cluster.
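
    For readers unfamiliar with the underlying system, the sketch below shows the dependency-driven task style of Luigi, which SciLuigi extends with named ports and a separate workflow definition. Task and file names are invented, and the actual SciLuigi API differs; plain Luigi is shown only to make the dependency idea concrete.

        # Two-step Luigi pipeline: CrossValidate depends on TrainModel's output.
        import luigi

        class TrainModel(luigi.Task):
            dataset = luigi.Parameter()

            def output(self):
                return luigi.LocalTarget(f"{self.dataset}.model")

            def run(self):
                with self.output().open("w") as fh:
                    fh.write(f"model trained on {self.dataset}\n")

        class CrossValidate(luigi.Task):
            dataset = luigi.Parameter()

            def requires(self):            # dependency: the trained model
                return TrainModel(dataset=self.dataset)

            def output(self):
                return luigi.LocalTarget(f"{self.dataset}.cv_scores")

            def run(self):
                with self.input().open() as fh, self.output().open("w") as out:
                    out.write(f"scores for: {fh.read().strip()}\n")

        if __name__ == "__main__":
            luigi.build([CrossValidate(dataset="interactions")],
                        local_scheduler=True)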

  6. Creation of a Rapid High-Fidelity Aerodynamics Module for a Multidisciplinary Design Environment

    NASA Technical Reports Server (NTRS)

    Srinivasan, Muktha; Whittecar, William; Edwards, Stephen; Mavris, Dimitri N.

    2012-01-01

    In the traditional aerospace vehicle design process, each successive design phase is accompanied by an increment in the modeling fidelity of the disciplinary analyses being performed. This trend follows a corresponding shrinking of the design space as more and more design decisions are locked in. The correlated increase in knowledge about the design and decrease in design freedom occurs partly because increases in modeling fidelity are usually accompanied by significant increases in the computational expense of performing the analyses. When running high fidelity analyses, it is not usually feasible to explore a large number of variations, and so design space exploration is reserved for conceptual design, and higher fidelity analyses are run only once a specific point design has been selected to carry forward. The designs produced by this traditional process have been recognized as being limited by the uncertainty that is present early on due to the use of lower fidelity analyses. For example, uncertainty in aerodynamics predictions produces uncertainty in trajectory optimization, which can impact overall vehicle sizing. This effect can become more significant when trajectories are being shaped by active constraints. For example, if an optimal trajectory is running up against a normal load factor constraint, inaccuracies in the aerodynamic coefficient predictions can cause a feasible trajectory to be considered infeasible, or vice versa. For this reason, a trade must always be performed between the desired fidelity and the resources available. Apart from this trade between fidelity and computational expense, it is very desirable to use higher fidelity analyses earlier in the design process. A large body of work has been performed to this end, led by efforts in the area of surrogate modeling. In surrogate modeling, an up-front investment is made by running a high fidelity code over a Design of Experiments (DOE); once completed, the DOE data is used to create a surrogate model, which captures the relationships between input variables and responses into regression equations. Depending on the dimensionality of the problem and the fidelity of the code for which a surrogate model is being created, the initial DOE can itself be computationally prohibitive to run. Cokriging, a modeling approach from the field of geostatistics, provides a desirable compromise between computational expense and fidelity. To do this, cokriging leverages a large body of data generated by a low fidelity analysis, combines it with a smaller set of data from a higher fidelity analysis, and creates a kriging surrogate model with prediction fidelity approaching that of the higher fidelity analysis. When integrated into a multidisciplinary environment, a disciplinary analysis module employing cokriging can raise the analysis fidelity without drastically impacting the expense of design iterations. This is demonstrated through the creation of an aerodynamics analysis module in NASA's OpenMDAO framework. Aerodynamic analysis codes including Missile DATCOM, APAS, and USM3D are leveraged to create high fidelity aerodynamics decks for parametric vehicle geometries, which are created in NASA's Vehicle Sketch Pad (VSP). Several trade studies are performed to examine the achieved level of model fidelity, and the overall impact to vehicle design is quantified.
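
    A compact sketch of the multi-fidelity idea using an additive-correction scheme with scikit-learn Gaussian processes (true cokriging co-models the fidelities jointly; this simpler correction variant is shown for clarity). The toy low- and high-fidelity functions are invented stand-ins for codes such as Missile DATCOM and USM3D.

        # Fuse many cheap low-fidelity samples with a few expensive high-fidelity
        # samples: GP on the cheap data plus a GP on the high-fidelity residuals.
        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF

        def lofi(x):  return np.sin(8 * x)                  # cheap, biased analysis
        def hifi(x):  return np.sin(8 * x) + 0.3 * x**2     # expensive analysis

        x_lo = np.linspace(0, 1, 40)[:, None]               # many cheap samples
        x_hi = np.linspace(0, 1, 6)[:, None]                # few expensive samples

        gp_lo = GaussianProcessRegressor(kernel=RBF(0.1)).fit(x_lo, lofi(x_lo).ravel())
        resid = hifi(x_hi).ravel() - gp_lo.predict(x_hi)    # hi-fi correction data
        gp_d = GaussianProcessRegressor(kernel=RBF(0.3)).fit(x_hi, resid)

        x = np.linspace(0, 1, 101)[:, None]
        pred = gp_lo.predict(x) + gp_d.predict(x)           # fused surrogate
        print(float(np.max(np.abs(pred - hifi(x).ravel()))))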

  7. Queueing analysis of a canonical model of real-time multiprocessors

    NASA Technical Reports Server (NTRS)

    Krishna, C. M.; Shin, K. G.

    1983-01-01

    A logical classification of multiprocessor structures from the point of view of control applications is presented. A computation of the response time distribution for a canonical model of a real-time multiprocessor is presented. The multiprocessor is approximated by a blocking model. Two separate models are derived: one created from the system's point of view, and the other from the point of view of an incoming task.

  8. Soft computing methods for geoidal height transformation

    NASA Astrophysics Data System (ADS)

    Akyilmaz, O.; Özlüdemir, M. T.; Ayan, T.; Çelik, R. N.

    2009-07-01

    Soft computing techniques, such as fuzzy logic and artificial neural network (ANN) approaches, have enabled researchers to create precise models for use in many scientific and engineering applications. Applications that can be employed in geodetic studies include the estimation of earth rotation parameters and the determination of mean sea level changes. Another important field of geodesy in which these computing techniques can be applied is geoidal height transformation. We report here our use of a conventional polynomial model, the Adaptive Network-based Fuzzy (or in some publications, Adaptive Neuro-Fuzzy) Inference System (ANFIS), an ANN and a modified ANN approach to approximate geoid heights. These approximation models have been tested on a number of test points. The results obtained through the transformation processes from ellipsoidal heights into local levelling heights have also been compared.
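
    A hedged sketch of the ANN variant of such a transformation: fit the geoid undulation as a function of horizontal position with a small multilayer perceptron. The coordinates and the smooth synthetic "geoid surface" below are placeholders, not the paper's GPS/levelling data.

        # Regress geoid undulation N on (latitude, longitude) with an MLP.
        import numpy as np
        from sklearn.neural_network import MLPRegressor
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(0)
        lat_lon = rng.uniform([40.5, 28.5], [41.5, 29.5], size=(300, 2))
        # Synthetic smooth surface standing in for benchmark-derived N values (m)
        N = 36.0 + 0.8 * np.sin(lat_lon[:, 0] * 3) + 0.5 * np.cos(lat_lon[:, 1] * 2)

        X_tr, X_te, y_tr, y_te = train_test_split(lat_lon, N, random_state=0)
        model = MLPRegressor(hidden_layer_sizes=(20, 20), max_iter=5000,
                             random_state=0).fit(X_tr, y_tr)
        rmse = np.sqrt(np.mean((model.predict(X_te) - y_te) ** 2))
        print("RMSE (m):", float(rmse))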

  9. Determination of apparent coupling factors for adhesive bonded acrylic plates using SEA-like approach

    NASA Astrophysics Data System (ADS)

    Pankaj, Achuthan. C.; Shivaprasad, M. V.; Murigendrappa, S. M.

    2018-04-01

    Apparent coupling loss factors (CLF) and velocity responses have been computed for two lap-joined adhesive bonded plates using a finite element and an experimental statistical energy analysis (SEA)-like approach. A finite element model of the plates has been created using ANSYS software. The statistical energy parameters have been computed using the velocity responses obtained from a harmonic forced-excitation analysis. Experiments have been carried out for two different cases of adhesive bonded joints, and the results have been compared with the apparent coupling factors and velocity responses obtained from finite element analysis. The results signify the importance of modeling adhesive bonded joints when computing the apparent coupling factors, and their further use in computing energies and velocity responses with the SEA-like approach.

  10. Computational intelligence models to predict porosity of tablets using minimum features

    PubMed Central

    Khalid, Mohammad Hassan; Kazemi, Pezhman; Perez-Gandarillas, Lucia; Michrafy, Abderrahim; Szlęk, Jakub; Jachowicz, Renata; Mendyk, Aleksander

    2017-01-01

    The effects of different formulations and manufacturing process conditions on the physical properties of a solid dosage form are of importance to the pharmaceutical industry. It is vital to have in-depth understanding of the material properties and governing parameters of its processes in response to different formulations. Understanding the mentioned aspects will allow tighter control of the process, leading to implementation of quality-by-design (QbD) practices. Computational intelligence (CI) offers an opportunity to create empirical models that can be used to describe the system and predict future outcomes in silico. CI models can help explore the behavior of input parameters, unlocking deeper understanding of the system. This research endeavor presents CI models to predict the porosity of tablets created by roll-compacted binary mixtures, which were milled and compacted under systematically varying conditions. CI models were created using tree-based methods, artificial neural networks (ANNs), and symbolic regression trained on an experimental data set and screened using root-mean-square error (RMSE) scores. The experimental data were composed of proportion of microcrystalline cellulose (MCC) (in percentage), granule size fraction (in micrometers), and die compaction force (in kilonewtons) as inputs and porosity as an output. The resulting models show impressive generalization ability, with ANNs (normalized root-mean-square error [NRMSE] =1%) and symbolic regression (NRMSE =4%) as the best-performing methods, also exhibiting reliable predictive behavior when presented with a challenging external validation data set (best achieved symbolic regression: NRMSE =3%). Symbolic regression demonstrates the transition from the black box modeling paradigm to more transparent predictive models. Predictive performance and feature selection behavior of CI models hint at the most important variables within this factor space. PMID:28138223
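
    A minimal sketch of the modeling setup with the three inputs named above (MCC proportion, granule size fraction, die compaction force); the data are synthetic placeholders, and a random forest stands in for the several CI methods compared in the paper.

        # Fit a tree ensemble to synthetic tablet data and report NRMSE.
        import numpy as np
        from sklearn.ensemble import RandomForestRegressor
        from sklearn.model_selection import train_test_split

        rng = np.random.default_rng(1)
        n = 400
        X = np.column_stack([
            rng.uniform(0, 100, n),      # MCC proportion (%)
            rng.uniform(90, 800, n),     # granule size fraction (um)
            rng.uniform(5, 25, n),       # die compaction force (kN)
        ])
        # Invented response: porosity falls with force, rises with granule size
        y = 0.35 - 0.008 * X[:, 2] + 0.00005 * X[:, 1] + rng.normal(0, 0.01, n)

        X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=1)
        model = RandomForestRegressor(random_state=1).fit(X_tr, y_tr)
        rmse = np.sqrt(np.mean((model.predict(X_te) - y_te) ** 2))
        nrmse = rmse / (y_te.max() - y_te.min())   # normalized RMSE, as in the paper
        print(f"NRMSE = {100 * nrmse:.1f}%")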

  11. Computational intelligence models to predict porosity of tablets using minimum features.

    PubMed

    Khalid, Mohammad Hassan; Kazemi, Pezhman; Perez-Gandarillas, Lucia; Michrafy, Abderrahim; Szlęk, Jakub; Jachowicz, Renata; Mendyk, Aleksander

    2017-01-01

    The effects of different formulations and manufacturing process conditions on the physical properties of a solid dosage form are of importance to the pharmaceutical industry. It is vital to have in-depth understanding of the material properties and governing parameters of its processes in response to different formulations. Understanding the mentioned aspects will allow tighter control of the process, leading to implementation of quality-by-design (QbD) practices. Computational intelligence (CI) offers an opportunity to create empirical models that can be used to describe the system and predict future outcomes in silico. CI models can help explore the behavior of input parameters, unlocking deeper understanding of the system. This research endeavor presents CI models to predict the porosity of tablets created by roll-compacted binary mixtures, which were milled and compacted under systematically varying conditions. CI models were created using tree-based methods, artificial neural networks (ANNs), and symbolic regression trained on an experimental data set and screened using root-mean-square error (RMSE) scores. The experimental data were composed of proportion of microcrystalline cellulose (MCC) (in percentage), granule size fraction (in micrometers), and die compaction force (in kilonewtons) as inputs and porosity as an output. The resulting models show impressive generalization ability, with ANNs (normalized root-mean-square error [NRMSE] =1%) and symbolic regression (NRMSE =4%) as the best-performing methods, also exhibiting reliable predictive behavior when presented with a challenging external validation data set (best achieved symbolic regression: NRMSE =3%). Symbolic regression demonstrates the transition from the black box modeling paradigm to more transparent predictive models. Predictive performance and feature selection behavior of CI models hints at the most important variables within this factor space.

  12. An EMTP system level model of the PMAD DC test bed

    NASA Technical Reports Server (NTRS)

    Dravid, Narayan V.; Kacpura, Thomas J.; Tam, Kwa-Sur

    1991-01-01

    A power management and distribution direct current (PMAD DC) test bed was set up at the NASA Lewis Research Center to investigate Space Station Freedom Electric Power Systems issues. Efficiency of test bed operation significantly improves with a computer simulation model of the test bed as an adjunct tool of investigation. Such a model is developed using the Electromagnetic Transients Program (EMTP) and is available to the test bed developers and experimenters. The computer model is assembled on a modular basis. Device models of different types can be incorporated into the system model with only a few lines of code. A library of the various model types is created for this purpose. Simulation results and corresponding test bed results are presented to demonstrate model validity.

  13. Narrowing the scope of failure prediction using targeted fault load injection

    NASA Astrophysics Data System (ADS)

    Jordan, Paul L.; Peterson, Gilbert L.; Lin, Alan C.; Mendenhall, Michael J.; Sellers, Andrew J.

    2018-05-01

    As society becomes more dependent upon computer systems to perform increasingly critical tasks, ensuring that those systems do not fail becomes increasingly important. Many organizations depend heavily on desktop computers for day-to-day operations. Unfortunately, the software that runs on these computers is written by humans and, as such, is still subject to human error and consequent failure. A natural solution is to use statistical machine learning to predict failure. However, since failure is still a relatively rare event, obtaining labelled training data to train these models is not a trivial task. This work presents new simulated fault-inducing loads that extend the focus of traditional fault injection techniques to predict failure in the Microsoft enterprise authentication service and Apache web server. These new fault loads were successful in creating failure conditions that were identifiable using statistical learning methods, with fewer irrelevant faults being created.

  14. Source Term Model for Vortex Generator Vanes in a Navier-Stokes Computer Code

    NASA Technical Reports Server (NTRS)

    Waithe, Kenrick A.

    2004-01-01

    A source term model for an array of vortex generators was implemented into a non-proprietary Navier-Stokes computer code, OVERFLOW. The source term models the side force created by a vortex generator vane. The model is obtained by introducing a side force to the momentum and energy equations that can adjust its strength automatically based on the local flow. The model was tested and calibrated by comparing data from numerical simulations and experiments of a single low profile vortex generator vane on a flat plate. In addition, the model was compared to experimental data of an S-duct with 22 co-rotating, low profile vortex generators. The source term model allowed a grid reduction of about seventy percent when compared with the numerical simulations performed on a fully gridded vortex generator on a flat plate without adversely affecting the development and capture of the vortex created. The source term model was able to predict the shape and size of the stream-wise vorticity and velocity contours very well when compared with both numerical simulations and experimental data. The peak vorticity and its location were also predicted very well when compared to numerical simulations and experimental data. The circulation predicted by the source term model matches the prediction of the numerical simulation. The source term model predicted the engine fan face distortion and total pressure recovery of the S-duct with 22 co-rotating vortex generators very well. The source term model allows a researcher to quickly investigate different locations of an individual vortex generator or a row of vortex generators. The researcher is able to conduct a preliminary investigation with minimal grid generation and computational time.

  15. Roadmap for cardiovascular circulation model

    PubMed Central

    Safaei, Soroush; Bradley, Christopher P.; Suresh, Vinod; Mithraratne, Kumar; Muller, Alexandre; Ho, Harvey; Ladd, David; Hellevik, Leif R.; Omholt, Stig W.; Chase, J. Geoffrey; Müller, Lucas O.; Watanabe, Sansuke M.; Blanco, Pablo J.; de Bono, Bernard; Hunter, Peter J.

    2016-01-01

    Computational models of many aspects of the mammalian cardiovascular circulation have been developed. Indeed, along with orthopaedics, this area of physiology is one that has attracted much interest from engineers, presumably because the equations governing blood flow in the vascular system are well understood and can be solved with well-established numerical techniques. Unfortunately, there have been only a few attempts to create a comprehensive public domain resource for cardiovascular researchers. In this paper we propose a roadmap for developing an open source cardiovascular circulation model. The model should be registered to the musculo-skeletal system. The computational infrastructure for the cardiovascular model should provide for near real-time computation of blood flow and pressure in all parts of the body. The model should deal with vascular beds in all tissues, and the computational infrastructure for the model should provide links into CellML models of cell function and tissue function. In this work we review the literature associated with 1D blood flow modelling in the cardiovascular system, discuss model encoding standards, software and a model repository. We then describe the coordinate systems used to define the vascular geometry, derive the equations and discuss the implementation of these coupled equations in the open source computational software OpenCMISS. Finally, some preliminary results are presented and plans outlined for the next steps in the development of the model, the computational software and the graphical user interface for accessing the model. PMID:27506597

  16. Roadmap for cardiovascular circulation model.

    PubMed

    Safaei, Soroush; Bradley, Christopher P; Suresh, Vinod; Mithraratne, Kumar; Muller, Alexandre; Ho, Harvey; Ladd, David; Hellevik, Leif R; Omholt, Stig W; Chase, J Geoffrey; Müller, Lucas O; Watanabe, Sansuke M; Blanco, Pablo J; de Bono, Bernard; Hunter, Peter J

    2016-12-01

    Computational models of many aspects of the mammalian cardiovascular circulation have been developed. Indeed, along with orthopaedics, this area of physiology is one that has attracted much interest from engineers, presumably because the equations governing blood flow in the vascular system are well understood and can be solved with well-established numerical techniques. Unfortunately, there have been only a few attempts to create a comprehensive public domain resource for cardiovascular researchers. In this paper we propose a roadmap for developing an open source cardiovascular circulation model. The model should be registered to the musculo-skeletal system. The computational infrastructure for the cardiovascular model should provide for near real-time computation of blood flow and pressure in all parts of the body. The model should deal with vascular beds in all tissues, and the computational infrastructure for the model should provide links into CellML models of cell function and tissue function. In this work we review the literature associated with 1D blood flow modelling in the cardiovascular system, discuss model encoding standards, software and a model repository. We then describe the coordinate systems used to define the vascular geometry, derive the equations and discuss the implementation of these coupled equations in the open source computational software OpenCMISS. Finally, some preliminary results are presented and plans outlined for the next steps in the development of the model, the computational software and the graphical user interface for accessing the model.
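
    For reference, a common form of the 1D blood flow equations reviewed in this literature is given below (notation varies across works; alpha is a momentum-flux correction coefficient, K_R a viscous resistance parameter, and beta encodes wall elasticity in a typical pressure-area closure):

        \frac{\partial A}{\partial t} + \frac{\partial Q}{\partial x} = 0
        \frac{\partial Q}{\partial t}
            + \frac{\partial}{\partial x}\left(\alpha \frac{Q^2}{A}\right)
            + \frac{A}{\rho}\frac{\partial p}{\partial x} = -K_R \frac{Q}{A}
        p = p_\mathrm{ext} + \beta\left(\sqrt{A} - \sqrt{A_0}\right)

    Here A is the vessel cross-sectional area, Q the volumetric flow rate, p the pressure, and A_0 the reference area; the specific closure and friction terms used in OpenCMISS may differ.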

  17. Computational Design of Materials: Planetary Entry to Electric Aircraft and Beyond

    NASA Technical Reports Server (NTRS)

    Thompson, Alexander; Lawson, John W.

    2014-01-01

    NASA's projects and missions push the bounds of what is possible. To support the agency's work, materials development must stay on the cutting edge in order to keep pace. Today, researchers at NASA Ames Research Center perform multiscale modeling to aid the development of new materials and provide insight into existing ones. Multiscale modeling enables researchers to determine micro- and macroscale properties by connecting computational methods ranging from the atomic level (density functional theory, molecular dynamics) to the macroscale (finite element method). The output of one level is passed on as input to the next level, creating a powerful predictive model.

  18. Viewing CAD Drawings on the Internet

    ERIC Educational Resources Information Center

    Schwendau, Mark

    2004-01-01

    Computer aided design (CAD) has been producing 3-D models for years. AutoCAD software is frequently used to create sophisticated 3-D models. These CAD files can be exported as 3DS files for import into Autodesk's 3-D Studio Viz. In this program, the user can render and modify the 3-D model before exporting it out as a WRL (world file hyperlinked)…

  19. An Educational Exercise Examining the Role of Model Attributes on the Creation and Alteration of CAD Models

    ERIC Educational Resources Information Center

    Johnson, Michael D.; Diwakaran, Ram Prasad

    2011-01-01

    Computer-aided design (CAD) is a ubiquitous tool that today's students will be expected to use proficiently for numerous engineering purposes. Taking full advantage of the features available in modern CAD programs requires that models are created in a manner that allows others to easily understand how they are organized and alter them in an…

  20. Study of the counting efficiency of a WBC setup by using a computational 3D human body library in sitting position based on polygonal mesh surfaces.

    PubMed

    Fonseca, T C Ferreira; Bogaerts, R; Lebacq, A L; Mihailescu, C L; Vanhavere, F

    2014-04-01

    A realistic computational 3D human body library, called MaMP and FeMP (Male and Female Mesh Phantoms), based on polygonal mesh surface geometry, has been created to be used for numerical calibration of the whole body counter (WBC) system of the nuclear power plant (NPP) in Doel, Belgium. The main objective was to create flexible computational models varying in gender, body height, and mass for studying the morphology-induced variation of the detector counting efficiency (CE) and reducing the measurement uncertainties. First, the counting room and an HPGe detector were modeled using MCNPX (Monte Carlo radiation transport code). The validation of the model was carried out for different sample-detector geometries with point sources and a physical phantom. Second, CE values were calculated for a total of 36 different mesh phantoms in a seated position using the validated Monte Carlo model. This paper reports on the validation process of the in vivo whole body system and the CE calculated for different body heights and weights. The results reveal that the CE is strongly dependent on the individual body shape, size, and gender and may vary by a factor of 1.5 to 3 depending on the morphology aspects of the individual to be measured.
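
    As a toy illustration of what a counting-efficiency (CE) calculation estimates, the Monte Carlo sketch below computes the purely geometric efficiency of a bare disc detector for an isotropic point source, with no photon transport, attenuation, or detector response (the study itself uses full MCNPX simulations); dimensions are illustrative.

        # Geometric CE = fraction of isotropically emitted rays hitting the disc.
        import numpy as np

        rng = np.random.default_rng(42)
        n = 1_000_000
        det_radius, det_z = 0.04, 0.30   # 8 cm diameter face, 30 cm away (m)

        # Isotropic emission directions from a point source at the origin
        cos_t = rng.uniform(-1, 1, n)
        phi = rng.uniform(0, 2 * np.pi, n)
        sin_t = np.sqrt(1 - cos_t**2)
        dx, dy, dz = sin_t * np.cos(phi), sin_t * np.sin(phi), cos_t

        # Intersect each forward-going ray with the detector plane z = det_z
        forward = dz > 0
        scale = det_z / dz[forward]
        hit = (dx[forward] * scale) ** 2 + (dy[forward] * scale) ** 2 \
              <= det_radius**2
        print("geometric CE:", hit.sum() / n)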

  1. Hierarchical Ada robot programming system (HARPS)- A complete and working telerobot control system based on the NASREM model

    NASA Technical Reports Server (NTRS)

    Leake, Stephen; Green, Tom; Cofer, Sue; Sauerwein, Tim

    1989-01-01

    HARPS is a telerobot control system that can perform some simple but useful tasks. This capability is demonstrated by performing the ORU exchange demonstration. HARPS is based on NASREM (NASA Standard Reference Model). All software is developed in Ada, and the project incorporates a number of different CASE (computer-aided software engineering) tools. NASREM was found to be a valid and useful model for building a telerobot control system. Its hierarchical and distributed structure creates a natural and logical flow for implementing large complex robust control systems. The ability of Ada to create and enforce abstraction enhanced the implementation of such control systems.

  2. CFD modelling of abdominal aortic aneurysm on hemodynamic loads using a realistic geometry with CT.

    PubMed

    Soudah, Eduardo; Ng, E Y K; Loong, T H; Bordone, Maurizio; Pua, Uei; Narayanan, Sriram

    2013-01-01

    The objective of this study is to find a correlation between the abdominal aortic aneurysm (AAA) geometric parameters, wall shear stress (WSS), abdominal flow patterns, intraluminal thrombus (ILT), and AAA arterial wall rupture using computational fluid dynamics (CFD). Real AAA 3D models were created by three-dimensional (3D) reconstruction of in vivo acquired computed tomography (CT) images from 5 patients. Based on the 3D AAA models, high quality volume meshes were created using an optimal tetrahedral aspect ratio for the whole domain. In order to quantify the WSS and the recirculation inside the AAA, a 3D CFD analysis using finite elements was performed. The CFD computation assumed that the arterial wall is rigid and that the blood is a homogeneous Newtonian fluid with a density of 1050 kg/m³ and a dynamic viscosity of 4 × 10⁻³ Pa·s. Parallelization procedures were used in order to increase the performance of the CFD calculations. A relation between AAA geometric parameters (asymmetry index (β), saccular index (γ), deformation diameter ratio (χ), and tortuosity index (ε)) and hemodynamic loads was observed, and it could be used as a potential predictor of AAA arterial wall rupture and potential ILT formation.

  3. Create full-scale predictive economic models on ROI and innovation with performance computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Joseph, Earl C.; Conway, Steve

    The U.S. Department of Energy (DOE), the world's largest buyer and user of supercomputers, awarded IDC Research, Inc. a grant to create two macroeconomic models capable of quantifying, respectively, financial and non-financial (innovation) returns on investments in HPC resources. Following a 2013 pilot study in which we created the models and tested them on about 200 real-world HPC cases, DOE authorized us to conduct a full-out, three-year grant study to collect and measure many more examples, a process that would also subject the methodology to further testing and validation. A secondary, "stretch" goal of the full-out study was to advance the methodology from association toward (but not all the way to) causation, by eliminating the effects of some of the other factors that might be contributing, along with HPC investments, to the returns produced in the investigated projects.

  4. Resources and Approaches for Teaching Quantitative and Computational Skills in the Geosciences and Allied Fields

    NASA Astrophysics Data System (ADS)

    Orr, C. H.; Mcfadden, R. R.; Manduca, C. A.; Kempler, L. A.

    2016-12-01

    Teaching with data, simulations, and models in the geosciences can increase many facets of student success in the classroom and in the workforce. Teaching undergraduates about programming and improving students' quantitative and computational skills expands their perception of Geoscience beyond field-based studies. Processing data and developing quantitative models are critically important for Geoscience students. Students need to be able to perform calculations, analyze data, create numerical models and visualizations, and more deeply understand complex systems, all essential aspects of modern science. These skills require students to have comfort and skill with languages and tools such as MATLAB. To achieve comfort and skill, computational and quantitative thinking must build over a 4-year degree program across courses and disciplines. However, in courses focused on Geoscience content it can be challenging to get students comfortable with using computational methods to answer Geoscience questions. To help bridge this gap, we have partnered with MathWorks to develop two workshops focused on collecting and developing strategies and resources to help faculty teach students to incorporate data, simulations, and models into the curriculum at the course and program levels. We brought together faculty members from the sciences, including Geoscience and allied fields, who teach computation and quantitative thinking skills using MATLAB to build a resource collection for teaching. These materials and the outcomes of the workshops are freely available on our website. The workshop outcomes include a collection of teaching activities, essays, and course descriptions that can help faculty incorporate computational skills at the course or program level. The teaching activities include in-class assignments, problem sets, labs, projects, and toolboxes. These activities range from programming assignments to creating and using models. The outcomes also include workshop syntheses that highlight best practices, a set of webpages to support teaching with software such as MATLAB, and an interest group actively discussing aspects of these issues in Geoscience and allied fields. Learn more and view the resources at http://serc.carleton.edu/matlab_computation2016/index.html

  5. Mind the Noise When Identifying Computational Models of Cognition from Brain Activity.

    PubMed

    Kolossa, Antonio; Kopp, Bruno

    2016-01-01

    The aim of this study was to analyze how measurement error affects the validity of modeling studies in computational neuroscience. A synthetic validity test was created using simulated P300 event-related potentials as an example. The model space comprised four computational models of single-trial P300 amplitude fluctuations which differed in terms of complexity and dependency. The single-trial fluctuation of simulated P300 amplitudes was computed on the basis of one of the models, at various levels of measurement error and at various numbers of data points. Bayesian model selection was performed based on exceedance probabilities. At very low numbers of data points, the least complex model generally outperformed the data-generating model. Invalid model identification also occurred at low levels of data quality and under low numbers of data points if the winning model's predictors were closely correlated with the predictors from the data-generating model. Given sufficient data quality and numbers of data points, the data-generating model could be correctly identified, even against models which were very similar to the data-generating model. Thus, a number of variables affects the validity of computational modeling studies, and data quality and numbers of data points are among the main factors relevant to the issue. Further, the nature of the model space (i.e., model complexity, model dependency) should not be neglected. This study provided quantitative results which show the importance of ensuring the validity of computational modeling via adequately prepared studies. The accomplishment of synthetic validity tests is recommended for future applications. Beyond that, we propose to render the demonstration of sufficient validity via adequate simulations mandatory to computational modeling studies.
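
    The study's central point can be reproduced in miniature: with few data points or high noise, formal model comparison tends to prefer a too-simple model over the data-generating one. The sketch below uses BIC on synthetic regression data rather than the paper's Bayesian exceedance probabilities; all quantities are invented.

        # How often is the true two-predictor model recovered over a
        # one-predictor reduction, as sample size and noise vary?
        import numpy as np

        def bic(y, yhat, k):
            n = len(y)
            rss = np.sum((y - yhat) ** 2)
            return n * np.log(rss / n) + k * np.log(n)

        rng = np.random.default_rng(7)
        for n, sigma in [(20, 2.0), (20, 0.2), (500, 2.0)]:
            wins = 0
            for _ in range(200):
                x1, x2 = rng.normal(size=(2, n))
                y = 1.0 * x1 + 0.4 * x2 + rng.normal(0, sigma, n)  # true model
                X_full, X_red = np.column_stack([x1, x2]), x1[:, None]
                b_full, *_ = np.linalg.lstsq(X_full, y, rcond=None)
                b_red, *_ = np.linalg.lstsq(X_red, y, rcond=None)
                if bic(y, X_full @ b_full, 2) < bic(y, X_red @ b_red, 1):
                    wins += 1
            print(f"n={n}, noise={sigma}: full model recovered {wins/200:.0%}")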

  6. Using the Item Response Theory (IRT) for Educational Evaluation through Games

    ERIC Educational Resources Information Center

    Euzébio Batista, Marcelo Henrique; Victória Barbosa, Jorge Luis; da Rosa Tavares, João Elison; Hackenhaar, Jonathan Luis

    2013-01-01

    This article shows the application of Item Response Theory (IRT) for educational evaluation using games. The article proposes a computational model to create user profiles, called Psychometric Profile Generator (PPG). PPG uses the IRT mathematical model for exploring the levels of skills and behaviors in the form of items and/or stimuli. The model…
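
    The IRT building block such a generator relies on is the item response function; the sketch below shows the common two-parameter logistic (2PL) form and a simple maximum-likelihood skill estimate. The article excerpt does not specify which IRT variant PPG uses, and all parameter values are illustrative.

        # 2PL model: P(success) for latent skill theta, item difficulty b,
        # and discrimination a; then estimate theta from observed outcomes.
        import numpy as np
        from scipy.optimize import minimize_scalar

        def p_correct(theta, a, b):
            """2PL item response function."""
            return 1.0 / (1.0 + np.exp(-a * (theta - b)))

        thetas = np.array([-1.0, 0.0, 1.5])        # low, average, high skill
        print(p_correct(thetas, a=1.2, b=0.5))     # a moderately hard item

        outcomes = np.array([1, 0, 1])             # responses to three items
        a_vec = np.array([1.0, 1.5, 0.8])
        b_vec = np.array([-0.5, 0.8, 0.0])

        def neg_loglik(theta):
            p = p_correct(theta, a_vec, b_vec)
            return -np.sum(outcomes * np.log(p) + (1 - outcomes) * np.log(1 - p))

        fit = minimize_scalar(neg_loglik, bounds=(-4, 4), method="bounded")
        print("estimated skill:", fit.x)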

  7. Interaction of Simple Ions with Water: Theoretical Models for the Study of Ion Hydration

    ERIC Educational Resources Information Center

    Gancheff, Jorge S.; Kremer, Carlos; Ventura, Oscar N.

    2009-01-01

    A computational experiment aimed to create and systematically analyze models of simple cation hydrates is presented. The changes in the structure (bond distances and angles) and the electronic density distribution of the solvent and the thermodynamic parameters of the hydration process are calculated and compared with the experimental data. The…

  8. Modelling the snowmelt and the snow water equivalent by creating a simplified energy balance conceptual snow model

    NASA Astrophysics Data System (ADS)

    Riboust, Philippe; Thirel, Guillaume; Le Moine, Nicolas; Ribstein, Pierre

    2016-04-01

    A better knowledge of the snow accumulated on watersheds will help flood forecasting centres and hydro-power companies to predict the amount of water released during spring snowmelt. Since precipitation gauges are sparse at high elevations and integrative measurements of the snow accumulated on a watershed surface are hard to obtain, using snow models is an adequate way to estimate snow water equivalent (SWE) on watersheds. In addition to short-term prediction, accurately simulating SWE with snow models should have many advantages. Validating the snow module on both SWE and snowmelt should give a more reliable model for climate change studies or regionalization for ungauged watersheds. The aim of this study is to create a new snow module whose structure allows the use of measured snow data for calibration or assimilation. Energy balance modelling seems to be the logical choice for designing a model in which internal variables, such as SWE, can be compared to observations. Physical models are complex, needing high computational resources and many different types of inputs that are not widely measured at meteorological stations. By contrast, simple conceptual degree-day models simulate snowmelt using only temperature and precipitation as inputs and are fast to compute. Their major drawback is that they are empirical, i.e. they do not take into account all of the processes of the energy balance, which makes this kind of model harder to use when comparing SWE to observed measurements. In order to reach our objectives, we created a snow model structured as a simplified energy balance in which each process is empirically parameterized so that it can be calculated using only temperature, precipitation, and cloud cover variables. The model's structure is similar to the one created by M.T. Walter (2005), where parameterizations from the literature were used to compute all of the processes of the energy balance. The conductive fluxes into the snowpack were modelled using analytical solutions to the heat equation that take phase change into account. This approach has the advantage of using few forcing variables while taking into account all the processes of the energy balance. Indeed, the simulations should be quick enough to allow, for example, ensemble prediction or the simulation of numerous basins, more easily than physical snow models. The snow module formulation has been completed and is in its validation phase using data from the experimental station of Col de Porte, French Alps. Data from the US SNOTEL product will be used in order to test the model structure on a larger scale and to test diverse calibration procedures, since the aim is to use it on a basin scale for discharge modelling purposes.
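
    For contrast with the proposed simplified energy-balance structure, the sketch below implements the degree-day baseline described above: SWE accumulates from precipitation below a temperature threshold and melts in proportion to degree-days above it. The forcing series and parameter values are invented for illustration.

        # Daily degree-day snow model: SWE in mm, temperature in degC,
        # precipitation in mm/day, ddf = melt factor in mm/degC/day.
        import numpy as np

        def degree_day_swe(temp, precip, t_thresh=0.0, ddf=3.0):
            swe = np.zeros(len(temp))
            for t in range(1, len(temp)):
                snowfall = precip[t] if temp[t] <= t_thresh else 0.0
                melt = ddf * max(temp[t] - t_thresh, 0.0)
                swe[t] = max(swe[t - 1] + snowfall - melt, 0.0)
            return swe

        days = np.arange(365)
        temp = -8.0 + 18.0 * np.sin(2 * np.pi * (days - 120) / 365)  # synthetic
        precip = np.full(365, 2.0)                                    # mm/day
        swe = degree_day_swe(temp, precip)
        print(f"peak SWE: {swe.max():.0f} mm on day {int(swe.argmax())}")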

  9. OpenWorm: an open-science approach to modeling Caenorhabditis elegans.

    PubMed

    Szigeti, Balázs; Gleeson, Padraig; Vella, Michael; Khayrulin, Sergey; Palyanov, Andrey; Hokanson, Jim; Currie, Michael; Cantarelli, Matteo; Idili, Giovanni; Larson, Stephen

    2014-01-01

    OpenWorm is an international collaboration with the aim of understanding how the behavior of Caenorhabditis elegans (C. elegans) emerges from its underlying physiological processes. The project has developed a modular simulation engine to create computational models of the worm. The modularity of the engine makes it possible to easily modify the model, incorporate new experimental data and test hypotheses. The modeling framework incorporates both biophysical neuronal simulations and a novel fluid-dynamics-based soft-tissue simulation for physical environment-body interactions. The project's open-science approach is aimed at overcoming the difficulties of integrative modeling within a traditional academic environment. In this article the rationale is presented for creating the OpenWorm collaboration, the tools and resources developed thus far are outlined and the unique challenges associated with the project are discussed.

  10. Development of Reduced-Order Models for Aeroelastic and Flutter Prediction Using the CFL3Dv6.0 Code

    NASA Technical Reports Server (NTRS)

    Silva, Walter A.; Bartels, Robert E.

    2002-01-01

    A reduced-order model (ROM) is developed for aeroelastic analysis using the CFL3D version 6.0 computational fluid dynamics (CFD) code, recently developed at the NASA Langley Research Center. This latest version of the flow solver includes a deforming mesh capability, a modal structural definition for nonlinear aeroelastic analyses, and a parallelization capability that provides a significant increase in computational efficiency. Flutter results for the AGARD 445.6 Wing computed using CFL3D v6.0 are presented, including discussion of associated computational costs. Modal impulse responses of the unsteady aerodynamic system are then computed using the CFL3D v6.0 code and transformed into state-space form. Important numerical issues associated with the computation of the impulse responses are presented. The unsteady aerodynamic state-space ROM is then combined with a state-space model of the structure to create an aeroelastic simulation using the MATLAB/SIMULINK environment. The MATLAB/SIMULINK ROM is used to rapidly compute aeroelastic transients including flutter. The ROM shows excellent agreement with the aeroelastic analyses computed using the CFL3D v6.0 code directly.
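
    The abstract does not name the realization algorithm, but a standard way to transform impulse-response (Markov-parameter) data into state-space form is the Eigensystem Realization Algorithm (ERA); a hedged single-input, single-output sketch follows, verified against a known second-order discrete system.

        # ERA: Hankel matrices from Markov parameters, SVD, then (A, B, C).
        import numpy as np

        def era(markov, order, rows=20, cols=20):
            """markov[k] = h(k), the impulse response; returns (A, B, C)."""
            H0 = np.array([[markov[i + j + 1] for j in range(cols)]
                           for i in range(rows)])
            H1 = np.array([[markov[i + j + 2] for j in range(cols)]
                           for i in range(rows)])
            U, s, Vt = np.linalg.svd(H0)
            Ur, Vr = U[:, :order], Vt[:order].T
            Sr = np.diag(np.sqrt(s[:order]))
            Sinv = np.linalg.inv(Sr)
            A = Sinv @ Ur.T @ H1 @ Vr @ Sinv
            B = (Sr @ Vr.T)[:, :1]     # first column of controllability factor
            C = (Ur @ Sr)[:1, :]       # first row of observability factor
            return A, B, C

        # Verify on a known 2nd-order discrete system (damped oscillator)
        A_true = np.array([[1.6, -0.81], [1.0, 0.0]])
        B_true = np.array([[1.0], [0.0]])
        C_true = np.array([[0.5, 0.3]])
        h = [0.0] + [(C_true @ np.linalg.matrix_power(A_true, k) @ B_true).item()
                     for k in range(60)]
        A, B, C = era(h, order=2)
        print(np.sort(np.linalg.eigvals(A)), np.sort(np.linalg.eigvals(A_true)))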

  11. Design and Analysis Tools for Supersonic Inlets

    NASA Technical Reports Server (NTRS)

    Slater, John W.; Folk, Thomas C.

    2009-01-01

    Computational tools are being developed for the design and analysis of supersonic inlets. The objective is to update existing tools and provide design and low-order aerodynamic analysis capability for advanced inlet concepts. The Inlet Tools effort includes aspects of creating an electronic database of inlet design information, a document describing inlet design and analysis methods, a geometry model for describing the shape of inlets, and computer tools that implement the geometry model and methods. The geometry model has a set of basic inlet shapes that include pitot, two-dimensional, axisymmetric, and stream-traced inlet shapes. The inlet model divides the inlet flow field into parts that facilitate the design and analysis methods. The inlet geometry model constructs the inlet surfaces through the generation and transformation of planar entities based on key inlet design factors. Future efforts will focus on developing the inlet geometry model, the inlet design and analysis methods, and a Fortran 95 code to implement the model and methods. Other computational platforms, such as Java, will also be explored.

  12. Aerodynamic Optimization of Rocket Control Surface Geometry Using Cartesian Methods and CAD Geometry

    NASA Technical Reports Server (NTRS)

    Nelson, Andrea; Aftosmis, Michael J.; Nemec, Marian; Pulliam, Thomas H.

    2004-01-01

    Aerodynamic design is an iterative process involving geometry manipulation and complex computational analysis subject to physical constraints and aerodynamic objectives. A design cycle consists of first establishing the performance of a baseline design, which is usually created with low-fidelity engineering tools, and then progressively optimizing the design to maximize its performance. Optimization techniques have evolved from relying exclusively on designer intuition and insight in traditional trial and error methods, to sophisticated local and global search methods. Recent attempts at automating the search through a large design space with formal optimization methods include both database driven and direct evaluation schemes. Databases are being used in conjunction with surrogate and neural network models as a basis on which to run optimization algorithms. Optimization algorithms are also being driven by the direct evaluation of objectives and constraints using high-fidelity simulations. Surrogate methods use data points obtained from simulations, and possibly gradients evaluated at the data points, to create mathematical approximations of a database. Neural network models work in a similar fashion, using a number of high-fidelity database calculations as training iterations to create a database model. Optimal designs are obtained by coupling an optimization algorithm to the database model. Evaluation of the current best design then gives either a new local optima and/or increases the fidelity of the approximation model for the next iteration. Surrogate methods have also been developed that iterate on the selection of data points to decrease the uncertainty of the approximation model prior to searching for an optimal design. The database approximation models for each of these cases, however, become computationally expensive with increase in dimensionality. Thus the method of using optimization algorithms to search a database model becomes problematic as the number of design variables is increased.

  13. The use of computed tomographic three-dimensional reconstructions to develop instructional models for equine pelvic ultrasonography.

    PubMed

    Whitcomb, Mary Beth; Doval, John; Peters, Jason

    2011-01-01

    Ultrasonography has gained increased utility to diagnose pelvic fractures in horses; however, internal pelvic contours can be difficult to appreciate from external palpable landmarks. We developed three-dimensional (3D) simulations of the pelvic ultrasonographic examination to assist with translation of pelvic contours into two-dimensional (2D) images. Contiguous 1 mm transverse computed tomography (CT) images were acquired through an equine femur and hemipelvis using a single-slice helical scanner. 3D surface models were created using a DICOM reader and imported into a 3D modeling and animation program. The bone models were combined with a purchased 3D horse model and the skin made translucent to visualize pelvic surface contours. 3D models of ultrasound transducers were made from reference photos, and a thin sector shape was created to depict the ultrasound beam. Ultrasonographic examinations were simulated by moving transducers on the skin surface and rectally to produce images of pelvic structures. Camera angles were manipulated to best illustrate the transducer-beam-bone interface. Fractures were created in multiple configurations. Animations were exported as QuickTime movie files for use in presentations coupled with corresponding ultrasound videoclips. 3D models provide a link between ultrasonographic technique and image generation by depicting the interaction of the transducer, ultrasound beam, and structure of interest. The horse model was important to facilitate understanding of the location of pelvic structures relative to the skin surface. While CT acquisition time was brief, manipulation within the 3D software program was time intensive. Results were worthwhile from an instructional standpoint based on user feedback.

  14. A generic framework to simulate realistic lung, liver and renal pathologies in CT imaging

    NASA Astrophysics Data System (ADS)

    Solomon, Justin; Samei, Ehsan

    2014-11-01

    Realistic three-dimensional (3D) mathematical models of subtle lesions are essential for many computed tomography (CT) studies focused on performance evaluation and optimization. In this paper, we develop a generic mathematical framework that describes the 3D size, shape, contrast, and contrast-profile characteristics of a lesion, as well as a method to create lesion models based on CT data of real lesions. Further, we implemented a technique to insert the lesion models into CT images in order to create hybrid CT datasets. This framework was used to create a library of realistic lesion models and corresponding hybrid CT images. The goodness of fit of the models was assessed using the coefficient of determination (R2) and the visual appearance of the hybrid images was assessed with an observer study using images of both real and simulated lesions and receiver operator characteristic (ROC) analysis. The average R2 of the lesion models was 0.80, implying that the models provide a good fit to real lesion data. The area under the ROC curve was 0.55, implying that the observers could not readily distinguish between real and simulated lesions. Therefore, we conclude that the lesion-modeling framework presented in this paper can be used to create realistic lesion models and hybrid CT images. These models could be instrumental in performance evaluation and optimization of novel CT systems.

  15. Bayesian molecular design with a chemical language model

    NASA Astrophysics Data System (ADS)

    Ikebata, Hisaki; Hongo, Kenta; Isomura, Tetsu; Maezono, Ryo; Yoshida, Ryo

    2017-04-01

    The aim of computational molecular design is the identification of promising hypothetical molecules with a predefined set of desired properties. We address the issue of accelerating the material discovery with state-of-the-art machine learning techniques. The method involves two different types of prediction; the forward and backward predictions. The objective of the forward prediction is to create a set of machine learning models on various properties of a given molecule. Inverting the trained forward models through Bayes' law, we derive a posterior distribution for the backward prediction, which is conditioned by a desired property requirement. Exploring high-probability regions of the posterior with a sequential Monte Carlo technique, molecules that exhibit the desired properties can computationally be created. One major difficulty in the computational creation of molecules is the exclusion of the occurrence of chemically unfavorable structures. To circumvent this issue, we derive a chemical language model that acquires commonly occurring patterns of chemical fragments through natural language processing of ASCII strings of existing compounds, which follow the SMILES chemical language notation. In the backward prediction, the trained language model is used to refine chemical strings such that the properties of the resulting structures fall within the desired property region while chemically unfavorable structures are successfully removed. The present method is demonstrated through the design of small organic molecules with the property requirements on HOMO-LUMO gap and internal energy. The R package iqspr is available at the CRAN repository.
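
    As a drastically simplified stand-in for the chemical language model described above (which learns longer-range fragment patterns via natural language processing), the toy below fits a character-level Markov chain to a handful of SMILES strings and samples new ones; the training set is illustrative and the outputs are not guaranteed to be chemically valid.

        # Character bigram model over SMILES, with ^ and $ as start/stop markers.
        import random
        from collections import defaultdict

        train = ["CCO", "CC(=O)O", "c1ccccc1", "CC(C)O", "CCN(CC)CC", "CC(=O)NC"]

        counts = defaultdict(lambda: defaultdict(int))
        for smi in train:
            s = "^" + smi + "$"
            for a, b in zip(s, s[1:]):
                counts[a][b] += 1

        rng = random.Random(3)

        def sample(max_len=20):
            out, ch = [], "^"
            while len(out) < max_len:
                succ = counts[ch]
                nxt = rng.choices(list(succ), weights=list(succ.values()))[0]
                if nxt == "$":
                    break
                out.append(nxt)
                ch = nxt
            return "".join(out)

        print([sample() for _ in range(5)])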

  16. Accuracy assessment of 3D bone reconstructions using CT: an in vitro comparison.

    PubMed

    Lalone, Emily A; Willing, Ryan T; Shannon, Hannah L; King, Graham J W; Johnson, James A

    2015-08-01

    Computed tomography provides high contrast imaging of the joint anatomy and is used routinely to reconstruct 3D models of the osseous and cartilage geometry (CT arthrography) for use in the design of orthopedic implants, for computer-assisted surgeries, and for computational dynamic and structural analysis. The objective of this study was to assess the accuracy of bone and cartilage surface model reconstructions by comparing reconstructed geometries with bone digitizations obtained using an optical tracking system. Bone surface digitizations obtained in this study determined the ground truth measure for the underlying geometry. We evaluated the use of a commercially available reconstruction technique using clinical CT scanning protocols, using the elbow joint as an example of a surface with complex geometry. To assess the accuracy of the reconstructed models (8 fresh-frozen cadaveric specimens) against the ground truth bony digitization, as defined by this study, proximity mapping was used to calculate residual error. The overall mean error was less than 0.4 mm in the cortical region and 0.3 mm in the subchondral region of the bone. Similarly, creating 3D cartilage surface models from CT scans using air contrast had a mean error of less than 0.3 mm. Results from this study indicate that clinical CT scanning protocols and commonly used, commercially available reconstruction algorithms can create models which accurately represent the true geometry.

  17. Bayesian molecular design with a chemical language model.

    PubMed

    Ikebata, Hisaki; Hongo, Kenta; Isomura, Tetsu; Maezono, Ryo; Yoshida, Ryo

    2017-04-01

    The aim of computational molecular design is the identification of promising hypothetical molecules with a predefined set of desired properties. We address the issue of accelerating the material discovery with state-of-the-art machine learning techniques. The method involves two different types of prediction; the forward and backward predictions. The objective of the forward prediction is to create a set of machine learning models on various properties of a given molecule. Inverting the trained forward models through Bayes' law, we derive a posterior distribution for the backward prediction, which is conditioned by a desired property requirement. Exploring high-probability regions of the posterior with a sequential Monte Carlo technique, molecules that exhibit the desired properties can computationally be created. One major difficulty in the computational creation of molecules is the exclusion of the occurrence of chemically unfavorable structures. To circumvent this issue, we derive a chemical language model that acquires commonly occurring patterns of chemical fragments through natural language processing of ASCII strings of existing compounds, which follow the SMILES chemical language notation. In the backward prediction, the trained language model is used to refine chemical strings such that the properties of the resulting structures fall within the desired property region while chemically unfavorable structures are successfully removed. The present method is demonstrated through the design of small organic molecules with the property requirements on HOMO-LUMO gap and internal energy. The R package iqspr is available at the CRAN repository.

  18. Computer-generated forces in distributed interactive simulation

    NASA Astrophysics Data System (ADS)

    Petty, Mikel D.

    1995-04-01

    Distributed Interactive Simulation (DIS) is an architecture for building large-scale simulation models from a set of independent simulator nodes communicating via a common network protocol. DIS is most often used to create a simulated battlefield for military training. Computer Generated Forces (CGF) systems control large numbers of autonomous battlefield entities in a DIS simulation using computer equipment and software rather than humans in simulators. CGF entities serve as both enemy forces and supplemental friendly forces in a DIS exercise. Research into various aspects of CGF systems is ongoing. Several CGF systems have been implemented.

  19. Efficient scatter model for simulation of ultrasound images from computed tomography data

    NASA Astrophysics Data System (ADS)

    D'Amato, J. P.; Lo Vercio, L.; Rubi, P.; Fernandez Vera, E.; Barbuzza, R.; Del Fresno, M.; Larrabide, I.

    2015-12-01

    Background and motivation: Real-time ultrasound simulation refers to the process of computationally creating fully synthetic ultrasound images instantly. Given the high value of specialized low-cost training for healthcare professionals, there is growing interest in this technology and in the development of high-fidelity systems that simulate the acquisition of echographic images. The objective is to create an efficient and reproducible simulator that can run on either notebooks or desktops using low-cost devices. Materials and methods: We present an interactive ultrasound simulator based on CT data. The simulator uses ray-casting and provides real-time interaction capabilities. The simulation of scattering that is coherent with the transducer position in real time is also introduced. Such noise is produced using a simplified model of multiplicative noise and convolution with point spread functions (PSF) tailored for this purpose. Results: The computational efficiency of scattering-map generation was revised and its performance improved, allowing a more efficient simulation of coherent scattering in the synthetic echographic images while providing highly realistic results. We describe quality and performance metrics to validate these results; a performance of up to 55 fps was achieved. Conclusion: The proposed technique for real-time scattering modeling provides realistic yet computationally efficient scatter distributions. The error between the original image and the simulated scattering image was compared for the proposed method and the state of the art, showing negligible differences in distribution.
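
    A minimal sketch of the scatter model as described, assuming a 2D CT slice normalized to [0, 1]: multiplicative noise modulated by the tissue map, then convolution with a point spread function (a separable Gaussian here stands in for the transducer-specific PSFs of the paper).

    ```python
    # Minimal sketch: multiplicative noise + PSF convolution for speckle.
    import numpy as np
    from scipy.signal import convolve2d

    rng = np.random.default_rng(42)
    ct_slice = rng.uniform(size=(128, 128))      # placeholder tissue map

    noise = rng.normal(loc=1.0, scale=0.3, size=ct_slice.shape)
    scatter = ct_slice * noise                   # multiplicative noise model

    x = np.arange(-3, 4)
    g = np.exp(-x**2 / 2.0)
    kernel = np.outer(g, g)
    psf = kernel / kernel.sum()                  # normalized Gaussian PSF
    echo = convolve2d(scatter, psf, mode="same") # coherent-looking speckle map
    ```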

  20. Vernier caliper and micrometer computer models using Easy Java Simulation and its pedagogical design features—ideas for augmenting learning with real instruments

    NASA Astrophysics Data System (ADS)

    Wee, Loo Kang; Tiang Ning, Hwee

    2014-09-01

    This paper presents the customization of Easy Java Simulation models, used with actual laboratory instruments, to create active experiential learning for measurements. The laboratory instruments are the vernier caliper and the micrometer. Three computer model design ideas that complement real equipment are discussed. These ideas involve (1) a simple two-dimensional view for learning from pen and paper questions and the real world; (2) hints, answers, different scale options and the inclusion of zero error; (3) assessment for learning feedback. The initial positive feedback from Singaporean students and educators indicates that these tools could be successfully shared and implemented in learning communities. Educators are encouraged to change the source code for these computer models to suit their own purposes; they have creative commons attribution licenses for the benefit of all.

  1. Open-Source Integrated Design-Analysis Environment For Nuclear Energy Advanced Modeling & Simulation Final Scientific/Technical Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    O'Leary, Patrick

    The framework created through the Open-Source Integrated Design-Analysis Environment (IDAE) for Nuclear Energy Advanced Modeling & Simulation grant has simplified and democratized advanced modeling and simulation in the nuclear energy industry, supporting a range of nuclear engineering applications. It leverages millions of investment dollars from the Department of Energy's Office of Nuclear Energy for modeling and simulation of light water reactors and from that office's research and development programs. The IDAE framework enhanced Kitware's Computational Model Builder (CMB) while leveraging existing open-source toolkits, creating a graphical end-to-end umbrella that guides end users and developers through the nuclear energy advanced modeling and simulation lifecycle. In addition, the work delivered strategic advancements in meshing and visualization for ensembles.

  2. Using virtual ridge augmentation and 3D printing to fabricate a titanium mesh positioning device: A novel technique letter.

    PubMed

    Al-Ardah, Aladdin; Alqahtani, Nasser; AlHelal, Abdulaziz; Goodacre, Brian; Swamidass, Rajesh; Garbacea, Antoanela; Lozada, Jaime

    2018-05-02

    This technique describes a novel approach for planning and augmenting a large bony defect using a titanium mesh (TiMe). A 3-dimensional (3D) surgical model was virtually created from cone beam computed tomography (CBCT) and a wax pattern of the final prosthetic outcome. The required bone volume (horizontal and vertical) was digitally augmented and then 3D printed to create a bone model. The 3D model was then used to contour the TiMe in accordance with the digital augmentation. With the contoured, preformed TiMe on the 3D-printed model, a positioning jig was made to aid placement of the TiMe as planned during surgery. Although this technique does not affect the final outcome of the augmentation procedure, it allows the clinician to virtually design the augmentation, preform and contour the TiMe, and create a positioning jig, reducing surgical time and error.

  3. The utilization of cranial models created using rapid prototyping techniques in the development of models for navigation training.

    PubMed

    Waran, V; Pancharatnam, Devaraj; Thambinayagam, Hari Chandran; Raman, Rajagopal; Rathinam, Alwin Kumar; Balakrishnan, Yuwaraj Kumar; Tung, Tan Su; Rahman, Z A

    2014-01-01

    Navigation in neurosurgery has expanded rapidly; however, suitable models to train end users on the myriad software and hardware that come with these systems are lacking. Utilizing three-dimensional (3D) industrial rapid prototyping processes, we have been able to create models using actual computed tomography (CT) data from patients with pathology and to use these models to simulate a variety of commonly performed neurosurgical procedures with navigation systems. The aim was to assess the possibility of utilizing models created from CT scan datasets obtained from patients with cranial pathology to simulate common neurosurgical procedures using navigation systems. Three patients with pathology were selected (hydrocephalus, right frontal cortical lesion, and midline clival meningioma). CT scan data acquired following an image-guidance surgery protocol, in DICOM format, were used with a rapid prototyping machine to create the necessary printed models with the corresponding pathology embedded. The registration, planning, and navigation capabilities of two navigation systems, using a variety of software and hardware provided by these platforms, were assessed. We were able to register all models accurately using both navigation systems and to perform the necessary simulations as planned. Models with pathology created using 3D rapid prototyping techniques accurately reflect data of actual patients and can be used in the simulation of neurosurgical operations with navigation systems. Georg Thieme Verlag KG Stuttgart · New York.

  4. Bringing MapReduce Closer To Data With Active Drives

    NASA Astrophysics Data System (ADS)

    Golpayegani, N.; Prathapan, S.; Warmka, R.; Wyatt, B.; Halem, M.; Trantham, J. D.; Markey, C. A.

    2017-12-01

    Moving computation closer to the data location has been a much theorized improvement to computation for decades. The increase in processor performance, the decrease in processor size and power requirement combined with the increase in data intensive computing has created a push to move computation as close to data as possible. We will show the next logical step in this evolution in computing: moving computation directly to storage. Hypothetical systems, known as Active Drives, have been proposed as early as 1998. These Active Drives would have a general-purpose CPU on each disk allowing for computations to be performed on them without the need to transfer the data to the computer over the system bus or via a network. We will utilize Seagate's Active Drives to perform general purpose parallel computing using the MapReduce programming model directly on each drive. We will detail how the MapReduce programming model can be adapted to the Active Drive compute model to perform general purpose computing with comparable results to traditional MapReduce computations performed via Hadoop. We will show how an Active Drive based approach significantly reduces the amount of data leaving the drive when performing several common algorithms: subsetting and gridding. We will show that an Active Drive based design significantly improves data transfer speeds into and out of drives compared to Hadoop's HDFS while at the same time keeping comparable compute speeds as Hadoop.
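
    To illustrate how an algorithm such as gridding maps onto the MapReduce model, here is a minimal sketch assuming each record is a (lat, lon, value) triple; on an Active Drive the map phase would run on-disk, so only the small grid of per-cell sums leaves the drive.

    ```python
    # Minimal MapReduce sketch for gridding: mean value per grid cell.
    from collections import defaultdict

    def map_phase(records, cell=1.0):
        for lat, lon, value in records:
            key = (int(lat // cell), int(lon // cell))   # grid-cell index
            yield key, (value, 1)

    def reduce_phase(pairs):
        sums = defaultdict(lambda: [0.0, 0])
        for key, (value, count) in pairs:
            sums[key][0] += value
            sums[key][1] += count
        return {key: total / n for key, (total, n) in sums.items()}

    records = [(10.2, 20.7, 1.0), (10.8, 20.1, 3.0), (45.0, -93.0, 2.0)]
    grid = reduce_phase(map_phase(records))
    print(grid)   # mean value per 1-degree grid cell
    ```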

  5. Science Education and Technology: Opportunities to Enhance Student Learning.

    ERIC Educational Resources Information Center

    Woolsey, Kristina; Bellamy, Rachel

    1997-01-01

    Describes how technological capabilities such as calculation, imaging, networking, and portability support a range of pedagogical approaches, such as inquiry-based science and dynamic modeling. Includes as examples software products created at Apple Computer and others available in the marketplace. (KDFB)

  6. Bayesian Treed Calibration: An Application to Carbon Capture With AX Sorbent

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Konomi, Bledar A.; Karagiannis, Georgios; Lai, Kevin

    2017-01-02

    In cases where field or experimental measurements are not available, computer models can represent real physical or engineering systems and reproduce their outcomes. They are usually calibrated in light of experimental data to create a better representation of the real system. Statistical methods for calibration and prediction, based on Gaussian processes, have been especially important when the computer models are expensive and experimental data are limited. In this paper, we develop Bayesian treed calibration (BTC) as an extension of standard Gaussian process calibration methods to deal with non-stationary computer models and/or their discrepancy from the field (or experimental) data. Our proposed method partitions both the calibration and observable input space, based on a binary tree partitioning, into sub-regions where existing model calibration methods can be applied to connect a computer model with the real system. The estimation of the parameters in the proposed model is carried out using Markov chain Monte Carlo (MCMC) computational techniques, with different strategies applied to improve mixing. We illustrate our method in two artificial examples and a real application that concerns the capture of carbon dioxide with AX amine-based sorbents. The source code and the examples analyzed in this paper are available as part of the supplementary materials.
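
    The tree idea can be sketched in a few lines: split the input space at a threshold and fit an independent Gaussian process in each sub-region, so non-stationary behavior is handled piecewise. This toy version fixes the split and omits the calibration inputs and the MCMC inference of the full method.

    ```python
    # Minimal treed-GP sketch: one fixed split, one GP per sub-region.
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor

    rng = np.random.default_rng(1)
    x = rng.uniform(0, 2, size=(200, 1))
    y = np.where(x[:, 0] < 1.0, np.sin(8 * x[:, 0]), 0.2 * x[:, 0])  # non-stationary

    split = 1.0                      # tree split (fixed here; sampled via MCMC in BTC)
    models = {}
    for name, mask in [("left", x[:, 0] < split), ("right", x[:, 0] >= split)]:
        models[name] = GaussianProcessRegressor().fit(x[mask], y[mask])

    x_new = [0.5, 1.5]
    pred = [models["left" if xi < split else "right"].predict([[xi]])[0]
            for xi in x_new]
    print(pred)
    ```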

  7. Designing and Creating Computer-Assisted Instruction.

    ERIC Educational Resources Information Center

    McMeen, George R.

    Designed to encourage the use of a defined methodology and careful planning in creating computer-assisted instructional programs, this paper describes the instructional design process, compares computer-assisted instruction (CAI) and programmed instruction (PI), and discusses pragmatic concerns in computer programming. Topics addressed include:…

  8. The Path of the Blind Watchmaker: A Model of Evolution

    DTIC Science & Technology

    2011-04-06

    We claim that computational biology has now reached the point that astronomy reached when it began to look backward in time to the Big Bang. Our goal is to look backward in ... the evolutionary process itself, in fact, created it. When astronomy reached a critical mass of theory, technology, and observational data, astronomers ...

  9. Flow field prediction in full-scale Carrousel oxidation ditch by using computational fluid dynamics.

    PubMed

    Yang, Yin; Wu, Yingying; Yang, Xiao; Zhang, Kai; Yang, Jiakuan

    2010-01-01

    In order to optimize the flow field in a full-scale Carrousel oxidation ditch with many sets of disc aerators operating simultaneously, an experimentally validated numerical tool based on computational fluid dynamics (CFD) was proposed. A full-scale, closed-loop bioreactor (Carrousel oxidation ditch) in the Ping Dingshan Sewage Treatment Plant in Ping Dingshan City, a medium-sized city in Henan Province of China, was evaluated using CFD. A moving-wall model was created to simulate the many sets of disc aerators that drive fluid motion in the ditch. The simulated results were acceptable compared with the experimental data, and the following conclusions were obtained: (1) the new moving-wall method can simulate the flow field in a Carrousel oxidation ditch with many sets of disc aerators operating simultaneously; the total number of grid cells decreased significantly, reducing the computational cost; and (2) CFD modeling generally characterized the flow pattern in the full-scale tank, and 3D simulation could be a good supplement for improving the hydrodynamic performance in oxidation ditch designs.

  10. Physics-based analysis and control of human snoring

    NASA Astrophysics Data System (ADS)

    Sanchez, Yaselly; Wang, Junshi; Han, Pan; Xi, Jinxiang; Dong, Haibo

    2017-11-01

    In order to advance the understanding of biological fluid dynamics and its effects on the acoustics of human snoring, this study pursued a physics-based computational approach. From human magnetic resonance imaging (MRI) scans, the researchers developed anatomically and dynamically accurate airway-uvula models. With the airways defined as rigid and the uvula defined as flexible, computational models were created with various pharynx thicknesses and geometries. In order to determine vortex shedding with prescribed uvula movement, the uvula fluctuation was characterized by its specific parameters: magnitude, frequency, and phase lag. Uvula vibration modes were based on one oscillation, or one harmonic frequency, and pressure probes were located at seven different positions throughout the airway-uvula model. Fast Fourier transforms (FFT) of the pressure probe data showed that four harmonics were created within one oscillation of uvula movement. Of the four harmonics, two pressure probes maintained high amplitudes, leading the researchers to believe that different vortices form at different snoring frequencies. This work is supported by NSF Grant CBET-1605434.
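
    A minimal sketch of the FFT analysis, assuming a uniformly sampled pressure trace; peaks in the amplitude spectrum mark the harmonics of the uvula oscillation. The sampling rate, fundamental frequency, and synthetic signal are assumptions for illustration.

    ```python
    # Minimal FFT sketch: find the strongest harmonics in a pressure trace.
    import numpy as np

    fs = 1000.0                                 # sampling rate, Hz (assumed)
    t = np.arange(0, 1.0, 1.0 / fs)
    f0 = 40.0                                   # fundamental frequency (assumed)
    pressure = sum(np.sin(2 * np.pi * k * f0 * t) / k for k in range(1, 5))

    spectrum = np.abs(np.fft.rfft(pressure)) / len(t)
    freqs = np.fft.rfftfreq(len(t), d=1.0 / fs)
    peaks = freqs[np.argsort(spectrum)[-4:]]    # four strongest components
    print(sorted(peaks))                        # ~[40, 80, 120, 160] Hz
    ```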

  11. A Prototyping Effort for the Integrated Spacecraft Analysis System

    NASA Technical Reports Server (NTRS)

    Wong, Raymond; Tung, Yu-Wen; Maldague, Pierre

    2011-01-01

    Computer modeling and simulation has recently become an essential technique for predicting and validating spacecraft performance. However, most computer models only examine spacecraft subsystems, and the independent nature of the models creates integration problems, which limits the possibility of simulating a spacecraft as an integrated unit despite the demand for this type of analysis. A new project called Integrated Spacecraft Analysis was proposed to serve as a framework for an integrated simulation environment. The project is still in its infancy, but a software prototype can help future developers assess design issues. The prototype explores a service-oriented design paradigm that, in principle, allows programs written in different languages to communicate with one another. It includes a uniform interface to the SPICE libraries such that different in-house tools, like APGEN or SEQGEN, can exchange information with it without much change. Service orientation may result in a slower system compared to a single application, and more research needs to be done on the different available technologies, but a service-oriented approach could increase long-term maintainability and extensibility.

  12. Closed-loop dialog model of face-to-face communication with a photo-real virtual human

    NASA Astrophysics Data System (ADS)

    Kiss, Bernadette; Benedek, Balázs; Szijárto, Gábor; Takács, Barnabás

    2004-01-01

    We describe an advanced human-computer interaction (HCI) model that employs photo-realistic virtual humans to provide digital media users with information, learning services and entertainment in a highly personalized and adaptive manner. The system can be used as a computer interface or as a tool to deliver content to end users. We model the interaction process between the user and the system as part of a closed-loop dialog taking place between the participants. This dialog exploits the most important characteristics of a face-to-face communication process, including the use of non-verbal gestures and meta-communication signals to control the flow of information. Our solution is based on a Virtual Human Interface (VHI) technology that was specifically designed to create emotional engagement between the virtual agent and the user, thus increasing the efficiency of learning and/or absorbing any information broadcast through this device. The paper reviews the basic building blocks and technologies needed to create such a system and discusses its advantages over other existing methods.

  13. Inspection of aeronautical mechanical parts with a pan-tilt-zoom camera: an approach guided by the computer-aided design model

    NASA Astrophysics Data System (ADS)

    Viana, Ilisio; Orteu, Jean-José; Cornille, Nicolas; Bugarin, Florian

    2015-11-01

    We focus on quality control of mechanical parts in an aeronautical context using a single pan-tilt-zoom (PTZ) camera and a computer-aided design (CAD) model of the mechanical part. We use the CAD model to create a theoretical image of the element to be checked, which is then matched with the sensed image of the element under inspection using a graph-theory-based approach. The matching is carried out in two stages. First, the two images are used to create two attributed graphs representing the primitives (ellipses and line segments) in the images. In the second stage, the graphs are matched using a similarity function built from the primitive parameters. The similarity scores of the matching are injected into the edges of a bipartite graph. A best-match search in the bipartite graph guarantees the uniqueness of the match solution. The method achieves promising performance in tests with synthetic data including missing elements, displaced elements, size changes, and combinations of these cases. The results open good prospects for using the method with realistic data.
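
    The best-match search can be illustrated with the Hungarian algorithm, assuming the similarity scores between theoretical and sensed primitives are already assembled in a matrix; the one-to-one assignment mirrors the uniqueness guarantee described above, though the paper's own search procedure may differ in detail.

    ```python
    # Minimal bipartite best-match sketch over a similarity matrix.
    import numpy as np
    from scipy.optimize import linear_sum_assignment

    similarity = np.array([[0.9, 0.2, 0.1],
                           [0.3, 0.8, 0.4],
                           [0.1, 0.5, 0.7]])    # rows: CAD primitives, cols: sensed

    rows, cols = linear_sum_assignment(-similarity)  # maximize total similarity
    for r, c in zip(rows, cols):
        print(f"CAD primitive {r} <-> sensed primitive {c} "
              f"(score {similarity[r, c]})")
    ```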

  14. Development of Boundary Condition Independent Reduced Order Thermal Models using Proper Orthogonal Decomposition

    NASA Astrophysics Data System (ADS)

    Raghupathy, Arun; Ghia, Karman; Ghia, Urmila

    2008-11-01

    Compact thermal models (CTMs) to represent IC packages have traditionally been developed using the DELPHI (DEvelopment of Libraries of PHysical models for an Integrated design) methodology. The drawbacks of this method are presented, and an alternative method is proposed. A reduced-order model that provides complete thermal information accurately with fewer computational resources can be used effectively in system-level simulations. Proper orthogonal decomposition (POD), a statistical method, can be used to reduce the number of degrees of freedom or variables in such computations. POD together with Galerkin projection allows us to create reduced-order models that reproduce the characteristics of the system with a considerable reduction in computational resources while maintaining a high level of accuracy. The goal of this work is to show that this method can be applied to obtain a boundary-condition-independent reduced-order thermal model for complex components. The methodology is applied to the 1D transient heat equation.
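
    A minimal sketch of the POD step, assuming thermal snapshots stacked as columns of a matrix: the left singular vectors are the POD modes, and a truncated basis retaining most of the energy is what a Galerkin projection would then use. The random snapshot data is a placeholder.

    ```python
    # Minimal POD sketch: SVD of a (mean-centered) snapshot matrix.
    import numpy as np

    rng = np.random.default_rng(0)
    n_nodes, n_snapshots = 500, 40
    snapshots = rng.normal(size=(n_nodes, n_snapshots))  # placeholder thermal data

    mean = snapshots.mean(axis=1, keepdims=True)
    modes, sv, _ = np.linalg.svd(snapshots - mean, full_matrices=False)

    energy = np.cumsum(sv**2) / np.sum(sv**2)
    r = int(np.searchsorted(energy, 0.99)) + 1           # modes capturing 99% energy
    basis = modes[:, :r]                                 # reduced basis for projection
    print(f"kept {r} of {n_snapshots} modes")
    ```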

  15. Aerodynamic Interaction Effects of a Helicopter Rotor and Fuselage

    NASA Technical Reports Server (NTRS)

    Boyd, David D., Jr.

    1999-01-01

    A three-year series of Cooperative Research Agreements, made in each of the three years between the Subsonic Aerodynamics Branch of the NASA Langley Research Center and the Virginia Polytechnic Institute and State University (Va. Tech), has been completed. This document presents results from this three-year endeavor. The goal of creating an efficient method to compute unsteady interactional effects between a helicopter rotor and fuselage has been accomplished. This paper also includes appendices to support these findings. The topics are: 1) Rotor-Fuselage Interactions Aerodynamics: An Unsteady Rotor Model; and 2) Rotor/Fuselage Unsteady Interactional Aerodynamics: A New Computational Model.

  16. Computational Design of Functionalized Metal–Organic Framework Nodes for Catalysis

    PubMed Central

    2017-01-01

    Recent progress in the synthesis and characterization of metal–organic frameworks (MOFs) has opened the door to an increasing number of possible catalytic applications. The great versatility of MOFs creates a large chemical space, whose thorough experimental examination becomes practically impossible. Therefore, computational modeling is a key tool to support, rationalize, and guide experimental efforts. In this outlook we survey the main methodologies employed to model MOFs for catalysis, and we review selected recent studies on the functionalization of their nodes. We pay special attention to catalytic applications involving natural gas conversion. PMID:29392172

  17. Above the cloud computing orbital services distributed data model

    NASA Astrophysics Data System (ADS)

    Straub, Jeremy

    2014-05-01

    Technology miniaturization and system architecture advancements have created an opportunity to significantly lower the cost of many types of space missions by sharing capabilities between multiple spacecraft. Historically, most spacecraft have been atomic entities that (aside from their communications with and tasking by ground controllers) operate in isolation. Several notable examples exist; however, these are purpose-designed systems that collaborate to perform a single goal. The above-the-cloud computing (ATCC) concept aims to create ad hoc collaboration between service provider and consumer craft. Consumer craft can procure processing, data transmission, storage, imaging and other capabilities from provider craft. Because of onboard storage limitations, communications link capability limitations and limited windows of communication, data relevant to or required for various operations may span multiple craft. This paper presents a model for the identification, storage and accessing of this data. This model includes appropriate identification features for this highly distributed environment. It also deals with business model constraints such as data ownership, retention and the rights of the storing craft to access, resell, transmit or discard the data in its possession. The model ensures data integrity and confidentiality (to the extent applicable to a given data item), deals with the unique constraints of the orbital environment and tags data with business model (contractual) obligation data.

  18. Computational Toxicology at the US EPA | Science Inventory ...

    EPA Pesticide Factsheets

    Computational toxicology is the application of mathematical and computer models to help assess chemical hazards and risks to human health and the environment. Supported by advances in informatics, high-throughput screening (HTS) technologies, and systems biology, EPA is developing robust and flexible computational tools that can be applied to the thousands of chemicals in commerce, and to contaminant mixtures found in America's air, water, and hazardous-waste sites. The ORD Computational Toxicology Research Program (CTRP) is composed of three main elements. The largest component is the National Center for Computational Toxicology (NCCT), which was established in 2005 to coordinate research on chemical screening and prioritization, informatics, and systems modeling. The second element consists of related activities in the National Health and Environmental Effects Research Laboratory (NHEERL) and the National Exposure Research Laboratory (NERL). The third and final component consists of academic centers working on various aspects of computational toxicology and funded by the EPA Science to Achieve Results (STAR) program. Key intramural projects of the CTRP include digitizing legacy toxicity-testing information into a toxicity reference database (ToxRefDB), predicting toxicity (ToxCast™) and exposure (ExpoCast™), and creating virtual liver (v-Liver™) and virtual embryo (v-Embryo™) systems models. The models and underlying data are being made publicly available.

  19. Reaching for the cloud: on the lessons learned from grid computing technology transfer process to the biomedical community.

    PubMed

    Mohammed, Yassene; Dickmann, Frank; Sax, Ulrich; von Voigt, Gabriele; Smith, Matthew; Rienhoff, Otto

    2010-01-01

    Natural scientists such as physicists pioneered the sharing of computing resources, which led to the creation of the Grid. The inter-domain transfer of this technology has hitherto been an intuitive process without in-depth analysis. Some difficulties facing the life science community in this transfer can be understood using Bozeman's "Effectiveness Model of Technology Transfer". Bozeman's and classical technology transfer approaches deal with technologies that have achieved a certain stability. Grid and Cloud solutions are technologies which are still in flux. We show how Grid computing creates new difficulties in the transfer process that are not considered in Bozeman's model. We show why the success of healthgrids should be measured by the qualified scientific human capital and the opportunities created, and not primarily by the market impact. We conclude with recommendations that can help improve the adoption of Grid and Cloud solutions in the biomedical community. These results give a more concise explanation of the difficulties many life science IT projects face in their late funding periods, and show leveraging steps that can help overcome the "vale of tears".

  20. A Cognitive Computing Approach for Classification of Complaints in the Insurance Industry

    NASA Astrophysics Data System (ADS)

    Forster, J.; Entrup, B.

    2017-10-01

    In this paper we present and evaluate a cognitive computing approach for the classification of dissatisfaction and of four complaint-specific classes in correspondence documents between insurance clients and an insurance company. A cognitive computing approach combines classical natural language processing methods, machine learning algorithms, and the evaluation of hypotheses. The approach combines a MaxEnt machine learning algorithm with language modelling, tf-idf, and sentiment analytics to create a multi-label text classification model. The model is trained and tested with a set of 2500 original insurance communication documents written in German, which were manually annotated by the partnering insurance company. With an F1 score of 0.9, a reliable text classification component has been implemented and evaluated. A final outlook towards a cognitive computing insurance assistant is given at the end.
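
    A minimal sketch of such a pipeline, assuming short German complaint snippets with per-document label lists; scikit-learn's LogisticRegression is a standard MaxEnt implementation, wrapped here for multi-label output. The language-model and sentiment features of the actual system are omitted, and the two toy documents are assumptions.

    ```python
    # Minimal multi-label tf-idf + MaxEnt (logistic regression) sketch.
    from sklearn.feature_extraction.text import TfidfVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.multiclass import OneVsRestClassifier
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import MultiLabelBinarizer

    docs = ["Ich bin sehr unzufrieden mit der Bearbeitung.",
            "Die Auszahlung dauert zu lange."]           # placeholder documents
    labels = [["dissatisfaction"], ["delay"]]            # placeholder annotations

    mlb = MultiLabelBinarizer()
    y = mlb.fit_transform(labels)

    model = make_pipeline(TfidfVectorizer(),
                          OneVsRestClassifier(LogisticRegression(max_iter=1000)))
    model.fit(docs, y)
    print(mlb.inverse_transform(model.predict(["Warum dauert das so lange?"])))
    ```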

  1. Understanding resonance graphs using Easy Java Simulations (EJS) and why we use EJS

    NASA Astrophysics Data System (ADS)

    Wee, Loo Kang; Lee, Tat Leong; Chew, Charles; Wong, Darren; Tan, Samuel

    2015-03-01

    This paper reports a computer model simulation created using Easy Java Simulation (EJS) for learners to visualize how the steady-state amplitude of a driven oscillating system varies with the frequency of the periodic driving force. The simulation shows (N = 100) identical spring-mass systems being subjected to (1) a periodic driving force of equal amplitude but different driving frequencies, and (2) different amounts of damping. The simulation aims to create a visually intuitive way of understanding how the series of amplitude versus driving-frequency graphs is obtained, by showing how the displacement of the system changes over time as it transits from the transient to the steady state. A suggested 'how to use' section is added to help educators and students in their teaching and learning: we explain the theoretical steady-state equation, the time conditions after which the model begins recording maximum amplitudes that closely match the theoretical equation, and the steps to collect runs with different degrees of damping. We also discuss two design features of our computer model: displaying the instantaneous oscillation together with the achieved steady-state amplitudes, and overlaying the explicit world view with the scientific representation for runs with different degrees of damping. Three advantages of using EJS include: (1) open source code and creative commons attribution licenses for scaling up interactively engaging educational practices; (2) the models made can run on almost any device, including Android and iOS; and (3) it allows the redefinition of physics educational practices through computer modeling.
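
    The theoretical steady-state curve the simulation visualizes can be computed directly. A minimal sketch, assuming a driven damped oscillator x'' + b x' + w0^2 x = f0 cos(wt) with assumed parameter values:

    ```python
    # Minimal sketch: steady-state amplitude vs. driving frequency.
    import numpy as np

    def steady_state_amplitude(w, w0=1.0, b=0.1, f0=1.0):
        """A(w) = f0 / sqrt((w0^2 - w^2)^2 + (b*w)^2)."""
        return f0 / np.sqrt((w0**2 - w**2) ** 2 + (b * w) ** 2)

    w = np.linspace(0.1, 2.0, 200)
    for b in (0.1, 0.3, 0.6):                  # different degrees of damping
        amp = steady_state_amplitude(w, b=b)
        print(f"b={b}: peak amplitude {amp.max():.2f} at w={w[amp.argmax()]:.2f}")
    ```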

  2. A Pythonic Approach for Computational Geosciences and Geo-Data Processing

    NASA Astrophysics Data System (ADS)

    Morra, G.; Yuen, D. A.; Lee, S. M.

    2016-12-01

    Computational methods and data analysis play a constantly increasing role in the Earth sciences; however, students and professionals must climb a steep learning curve before reaching a level that allows them to run effective models. Furthermore, the recent arrival of powerful new machine learning tools such as Torch and TensorFlow has opened new possibilities but also created a new realm of complications related to the completely different technology employed. We present here a series of examples written entirely in Python, a language that combines the simplicity of Matlab with the power and speed of compiled languages such as C, and apply them to a wide range of geological processes such as porous media flow, multiphase fluid dynamics, creeping flow and many-fault interaction. We also explore ways in which machine learning can be employed in combination with numerical modelling, from immediately interpreting a large number of modeling results to optimizing a set of modeling parameters to obtain a desired simulation. We show that by using Python, undergraduate and graduate students can learn advanced numerical technologies with minimal dedicated effort, which in turn encourages them to develop more numerical tools and to progress quickly in their computational abilities. We also show how Python allows combining modeling with machine learning like pieces of LEGO, thereby simplifying the transition towards a new kind of scientific geo-modelling. The conclusion is that Python is an ideal tool for creating an infrastructure for the geosciences that allows users to quickly develop tools, reuse techniques and encourage collaborative efforts to interpret and integrate geo-data in profound new ways.
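
    In the spirit of the examples described, here is a minimal NumPy-only sketch of steady 1D porous-media (Darcy) flow solved with finite differences; the permeability field, viscosity, and boundary pressures are assumed values, not taken from the paper.

    ```python
    # Minimal 1D Darcy flow sketch: d/dx(k dp/dx) = 0 with fixed end pressures.
    import numpy as np

    n = 50
    k = np.ones(n) * 1e-12                     # permeability (m^2), homogeneous
    k[20:30] = 1e-13                           # a low-permeability lens

    kh = 2 * k[:-1] * k[1:] / (k[:-1] + k[1:]) # harmonic mean at cell interfaces
    A = np.zeros((n, n))
    b = np.zeros(n)
    A[0, 0] = A[-1, -1] = 1.0                  # Dirichlet boundary rows
    b[0], b[-1] = 2e5, 1e5                     # inlet/outlet pressure (Pa)
    for i in range(1, n - 1):
        A[i, i - 1], A[i, i + 1] = kh[i - 1], kh[i]
        A[i, i] = -(kh[i - 1] + kh[i])

    pressure = np.linalg.solve(A, b)
    print(pressure[::10])                      # pressure drops fastest in the lens
    ```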

  3. Using CFD Techniques to Predict Slosh Force Frequency and Damping Rate

    NASA Technical Reports Server (NTRS)

    Marsell, Brandon; Gangadharan, Sathya; Chatman, Yadira; Sudermann, James

    2009-01-01

    Resonant effects and energy dissipation due to sloshing fuel inside propellant tanks are problems that arise in the initial design of any spacecraft or launch vehicle. A faster and more reliable method for calculating these effects during the design stages is needed. Using Computational Fluid Dynamics (CFD) techniques, a model of these fuel tanks can be created and used to predict important parameters such as resonant slosh frequency and damping rate. This initial study addresses the case of free surface slosh. Future studies will focus on creating models for tanks fitted with propellant management devices (PMD) such as diaphragms and baffles.

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lovelace, III, Henry H.

    In accelerator physics, models of a given machine are used to predict the behaviors of the beam, magnets, and radiofrequency cavities. The use of computational models has become widespread to shorten the development period of an accelerator lattice. Various programs are used to create lattices and run simulations of both transverse and longitudinal beam dynamics, including Methodical Accelerator Design (MAD) versions MAD8 and MADX, Zgoubi, the Polymorphic Tracking Code (PTC), and many others. In this discussion, BMAD (Baby Methodical Accelerator Design) is presented as an additional tool for creating and simulating accelerator lattices for the study of beam dynamics in the Relativistic Heavy Ion Collider (RHIC).

  5. Creating, generating and comparing random network models with NetworkRandomizer.

    PubMed

    Tosadori, Gabriele; Bestvina, Ivan; Spoto, Fausto; Laudanna, Carlo; Scardoni, Giovanni

    2016-01-01

    Biological networks are becoming a fundamental tool for the investigation of high-throughput data in several fields of biology and biotechnology. With the increasing amount of information, network-based models are gaining more and more interest, and new techniques are required in order to mine the information and to validate the results. To fill the validation gap we present an app for the Cytoscape platform that creates randomised networks and randomises existing, real networks. Since there is a lack of tools that allow such operations, our app enables researchers to exploit different, well-known random network models that can be used as a benchmark for validating real, biological datasets. We also propose a novel methodology for creating random weighted networks, i.e. the multiplication algorithm, starting from real, quantitative data. Finally, the app provides a statistical tool that compares real versus randomly computed attributes, in order to validate the numerical findings. In summary, our app aims at creating a standardised methodology for the validation of results in the context of the Cytoscape platform.
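
    A minimal sketch of the validation idea, using NetworkX in place of the Cytoscape app: compare an observed network attribute against a degree-preserving randomized benchmark. The app's multiplication algorithm for weighted networks is not reproduced here, and the scale-free input graph is an assumption.

    ```python
    # Minimal sketch: observed vs. degree-preserving randomized clustering.
    import networkx as nx

    real = nx.barabasi_albert_graph(200, 3, seed=1)  # stand-in for a real network

    random_clusterings = []
    for seed in range(20):
        g = real.copy()
        nx.double_edge_swap(g, nswap=4 * g.number_of_edges(),
                            max_tries=10**5, seed=seed)
        random_clusterings.append(nx.average_clustering(g))

    observed = nx.average_clustering(real)
    mean_random = sum(random_clusterings) / len(random_clusterings)
    print(f"observed clustering {observed:.3f} vs randomized {mean_random:.3f}")
    ```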

  6. Bayesian Computation Emerges in Generic Cortical Microcircuits through Spike-Timing-Dependent Plasticity

    PubMed Central

    Nessler, Bernhard; Pfeiffer, Michael; Buesing, Lars; Maass, Wolfgang

    2013-01-01

    The principles by which networks of neurons compute, and how spike-timing dependent plasticity (STDP) of synaptic weights generates and maintains their computational function, are unknown. Preceding work has shown that soft winner-take-all (WTA) circuits, where pyramidal neurons inhibit each other via interneurons, are a common motif of cortical microcircuits. We show through theoretical analysis and computer simulations that Bayesian computation is induced in these network motifs through STDP in combination with activity-dependent changes in the excitability of neurons. The fundamental components of this emergent Bayesian computation are priors that result from adaptation of neuronal excitability and implicit generative models for hidden causes that are created in the synaptic weights through STDP. In fact, a surprising result is that STDP is able to approximate a powerful principle for fitting such implicit generative models to high-dimensional spike inputs: Expectation Maximization. Our results suggest that the experimentally observed spontaneous activity and trial-to-trial variability of cortical neurons are essential features of their information processing capability, since their functional role is to represent probability distributions rather than static neural codes. Furthermore it suggests networks of Bayesian computation modules as a new model for distributed information processing in the cortex. PMID:23633941

  7. A Computational Model of the Self-Teaching Hypothesis Based on the Dual-Route Cascaded Model of Reading

    ERIC Educational Resources Information Center

    Pritchard, Stephen C.; Coltheart, Max; Marinus, Eva; Castles, Anne

    2018-01-01

    The self-teaching hypothesis describes how children progress toward skilled sight-word reading. It proposes that children do this via phonological recoding with assistance from contextual cues, to identify the target pronunciation for a novel letter string, and in so doing create an opportunity to self-teach new orthographic knowledge. We present…

  8. NREL Leads Wind Farm Modeling Research - Continuum Magazine | NREL

    Science.gov Websites

    NREL has created complex computer modeling tools to improve wind turbine design and overall wind farm performance, modeling the activity surrounding a multi-megawatt wind turbine in addition to its work with Doppler LIDAR. (Photo caption: ten 2-MW Bonus wind turbines; photo provided by HC Sorensen, Middelgrunden Wind Turbine Cooperative.)

  9. A Comparison of Video Modeling, Text-Based Instruction, and No Instruction for Creating Multiple Baseline Graphs in Microsoft Excel

    ERIC Educational Resources Information Center

    Tyner, Bryan C.; Fienup, Daniel M.

    2015-01-01

    Graphing is socially significant for behavior analysts; however, graphing can be difficult to learn. Video modeling (VM) may be a useful instructional method but lacks evidence for effective teaching of computer skills. A between-groups design compared the effects of VM, text-based instruction, and no instruction on graphing performance.…

  10. The GRASP 3: Graphical Reliability Analysis Simulation Program. Version 3: A users' manual and modelling guide

    NASA Technical Reports Server (NTRS)

    Phillips, D. T.; Manseur, B.; Foster, J. W.

    1982-01-01

    Alternate definitions of system failure create complex analyses for which analytic solutions are available only in simple, special cases. The GRASP methodology is a computer simulation approach for solving all classes of problems in which both failure and repair events are modeled according to the probability laws of the individual components of the system.

  11. Animating a Human Body Mesh with Maya for Doppler Signature Computer Modeling

    DTIC Science & Technology

    2009-06-01

    Three color-coded transformation tools attached to the AK47 mesh were used to create the man-with-weapon model; red, green, and blue correspond to the x-, y-, and z-axes. In one animation scenario, the human moves in a natural walking motion cycle (figure 9). In the other two, the walking human carries an AK47 rifle in the port ...

  12. Computational knee ligament modeling using experimentally determined zero-load lengths.

    PubMed

    Bloemker, Katherine H; Guess, Trent M; Maletsky, Lorin; Dodd, Kevin

    2012-01-01

    This study presents a subject-specific method of determining the zero-load lengths of the cruciate and collateral ligaments in computational knee modeling. Three cadaver knees were tested in a dynamic knee simulator. The cadaver knees also underwent manual envelope-of-motion testing to find their passive range of motion and thereby determine the zero-load lengths for each ligament bundle. Computational multibody knee models were created for each knee, and model kinematics were compared to experimental kinematics for a simulated walk cycle. One-dimensional non-linear spring-damper elements were used to represent cruciate and collateral ligament bundles in the knee models. This study found that knee kinematics were highly sensitive to alterations of the zero-load length. The results also suggest optimal methods for defining each of the ligament-bundle zero-load lengths, regardless of the subject. These results verify the importance of the zero-load length when modeling the knee joint and verify that manual envelope-of-motion measurements can be used to determine the passive range of motion of the knee joint. It is also believed that the method described here for determining zero-load length can be used for in vitro or in vivo subject-specific computational models.
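
    A minimal sketch of a one-dimensional nonlinear spring-damper ligament element: slack below the zero-load length, a quadratic toe region, and a linear region (a common piecewise form in Blankevoort-type models). The stiffness, damping, and toe-region values are assumptions, not the paper's parameters.

    ```python
    # Minimal ligament-bundle element: piecewise spring + viscous damper.
    def ligament_force(length, length_rate, zero_load_length,
                       k=5000.0, c=1.0, eps_l=0.03):
        """Tension (N) from strain relative to the zero-load length."""
        eps = (length - zero_load_length) / zero_load_length
        if eps <= 0.0:
            return 0.0                          # slack: carries no load
        if eps <= 2 * eps_l:
            spring = 0.25 * k * eps**2 / eps_l  # quadratic toe region
        else:
            spring = k * (eps - eps_l)          # linear region (continuous join)
        return spring + c * length_rate         # add viscous damping

    print(ligament_force(0.052, 0.001, zero_load_length=0.050))
    ```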

  13. A new approach to tag design in dolphin telemetry: Computer simulations to minimise deleterious effects

    NASA Astrophysics Data System (ADS)

    Pavlov, V. V.; Wilson, R. P.; Lucke, K.

    2007-02-01

    Remote sensors and transmitters are powerful devices for studying cetaceans at sea. However, despite substantial progress in microelectronics and the miniaturisation of systems, dolphin tags are imperfectly designed; additional drag from tags increases swim costs, compromises swimming capacity and manoeuvrability, and leads to extra loads on the animal's tissue. We propose a new approach to tag design, elaborating basic principles and incorporating design stages to minimise device effects by using computer-aided design. Initially, the operational conditions of the device are defined by quantifying the shape, hydrodynamics and range of natural deformation of the dolphin body at the tag attachment site (such as close to the dorsal fin). Then, parametric models of both the dorsal fin and the tag are created using the derived data. The link between the parameters of the fin and tag models allows the tag model to be redesigned according to expected changes in fin geometry (differences in fin shape related to species, sex, and age; simulation of the bend of the fin during manoeuvres). A final virtual modelling stage uses iterative improvement of the tag model in a computational fluid dynamics (CFD) environment to enhance tag performance. This new method is considered a suitable tool for tag design before a physical model of the tag is created and tested with conventional wind/water tunnel techniques. Ultimately, tag materials are selected to conform to the conditions identified by the modelling process, helping create a physical tag that should minimise its impact on the animal carrier and thus increase the reliability and quality of the data obtained.

  14. ATLAS Cloud R&D

    NASA Astrophysics Data System (ADS)

    Panitkin, Sergey; Barreiro Megino, Fernando; Caballero Bejar, Jose; Benjamin, Doug; Di Girolamo, Alessandro; Gable, Ian; Hendrix, Val; Hover, John; Kucharczyk, Katarzyna; Medrano Llamas, Ramon; Love, Peter; Ohman, Henrik; Paterson, Michael; Sobie, Randall; Taylor, Ryan; Walker, Rodney; Zaytsev, Alexander; Atlas Collaboration

    2014-06-01

    The computing model of the ATLAS experiment was designed around the concept of grid computing and, since the start of data taking, this model has proven very successful. However, new cloud computing technologies bring attractive features to improve the operations and elasticity of scientific distributed computing. ATLAS sees grid and cloud computing as complementary technologies that will coexist at different levels of resource abstraction, and two years ago created an R&D working group to investigate the different integration scenarios. The ATLAS Cloud Computing R&D has been able to demonstrate the feasibility of offloading work from grid to cloud sites and, as of today, is able to integrate transparently various cloud resources into the PanDA workload management system. The ATLAS Cloud Computing R&D is operating various PanDA queues on private and public resources and has provided several hundred thousand CPU days to the experiment. As a result, the ATLAS Cloud Computing R&D group has gained significant insight into the cloud computing landscape and has identified points that still need to be addressed in order to fully utilize this technology. This contribution will explain the cloud integration models that are being evaluated and will discuss what ATLAS has learned during its collaboration with leading commercial and academic cloud providers.

  15. A novel method for designing and fabricating low-cost facepiece prototypes.

    PubMed

    Joe, Paula S; Shum, Phillip C; Brown, David W; Lungu, Claudiu T

    2014-01-01

    In 2010, the National Institute for Occupational Safety and Health (NIOSH) published new digital head form models based on their recently updated fit-test panel. The new panel, based on the 2000 census to better represent the modern work force, created two additional sizes: Short/Wide and Long/Narrow. While collecting the anthropometric data that comprised the panel, additional three-dimensional data were collected on a subset of the subjects. Within each sizing category, five individuals' three-dimensional data were used to create the new head form models. While NIOSH has recommended a switch to a five-size system for designing respirators, little has been done in assessing the potential benefits of this change. With commercially available elastomeric facepieces available in only three or four size systems, it was necessary to develop the facepieces to enable testing. This study aims to develop a method for designing and fabricating elastomeric facepieces tailored to the new head form designs for use in fit-testing studies. This novel method used computed tomography of a solid silicone facepiece and a number of computer-aided design programs (VolView, ParaView, MEGG3D, and RapidForm XOR) to develop a facepiece model to accommodate the Short/Wide head form. The generated model was given a physical form by means of three-dimensional printing using stereolithography (SLA). The printed model was then used to create a silicone mold from which elastomeric prototypes can be cast. The prototype facepieces were cast in two types of silicone for use in future fit-testing.

  16. Parallel computing works

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    An account of the Caltech Concurrent Computation Program (C³P), a five-year project that focused on answering the question: Can parallel computers be used to do large-scale scientific computations? As the title indicates, the question is answered in the affirmative, by implementing numerous scientific applications on real parallel computers and doing computations that produced new scientific results. In the process of doing so, C³P helped design and build several new computers, designed and implemented basic system software, developed algorithms for frequently used mathematical computations on massively parallel machines, devised performance models and measured the performance of many computers, and created a high-performance computing facility based exclusively on parallel computers. While the initial focus of C³P was the hypercube architecture developed by C. Seitz, many of the methods developed and lessons learned have been applied successfully on other massively parallel architectures.

  17. Estimation of Nasal Tip Support Using Computer-Aided Design and 3-Dimensional Printed Models

    PubMed Central

    Gray, Eric; Maducdoc, Marlon; Manuel, Cyrus; Wong, Brian J. F.

    2016-01-01

    IMPORTANCE Palpation of the nasal tip is an essential component of the preoperative rhinoplasty examination. Measuring tip support is challenging, and the forces that correspond to ideal tip support are unknown. OBJECTIVE To identify the integrated reaction force and the minimum and ideal mechanical properties associated with nasal tip support. DESIGN, SETTING, AND PARTICIPANTS Three-dimensional (3-D) printed anatomic silicone nasal models were created using a computed tomographic scan and computer-aided design software. From this model, 3-D printing and casting methods were used to create 5 anatomically correct nasal models of varying constitutive Young moduli (0.042, 0.086, 0.098, 0.252, and 0.302 MPa) from silicone. Thirty rhinoplasty surgeons who attended a regional rhinoplasty course evaluated the reaction force (nasal tip recoil) of each model by palpation and selected the model that satisfied their requirements for minimum and ideal tip support. Data were collected from May 3 to 4, 2014. RESULTS Of the 30 respondents, 4 surgeons had been in practice for 1 to 5 years; 9 surgeons, 6 to 15 years; 7 surgeons, 16 to 25 years; and 10 surgeons, 26 or more years. Seventeen surgeons considered themselves in the advanced to expert skill competency levels. Logistic regression estimated the minimum threshold for the Young moduli for adequate and ideal tip support to be 0.096 and 0.154 MPa, respectively. Logistic regression estimated the thresholds for the reaction force associated with the absolute minimum and ideal requirements for good tip recoil to be 0.26 to 4.74 N and 0.37 to 7.19 N during 1- to 8-mm displacement, respectively. CONCLUSIONS AND RELEVANCE This study presents a method to estimate clinically relevant nasal tip reaction forces, which serve as a proxy for nasal tip support. This information will become increasingly important in computational modeling of nasal tip mechanics and ultimately will enhance surgical planning for rhinoplasty. LEVEL OF EVIDENCE NA. PMID:27124818

  18. Integration and Exposure of Large Scale Computational Resources Across the Earth System Grid Federation (ESGF)

    NASA Astrophysics Data System (ADS)

    Duffy, D.; Maxwell, T. P.; Doutriaux, C.; Williams, D. N.; Chaudhary, A.; Ames, S.

    2015-12-01

    As the size of remote sensing observations and model output data grows, the volume of the data has become overwhelming, even to many scientific experts. As societies are forced to better understand, mitigate, and adapt to climate changes, the combination of Earth observation data and global climate model projections is crucial not only to scientists but to policy makers, downstream applications, and even the public. Scientific progress on understanding climate is critically dependent on the availability of a reliable infrastructure that promotes data access, management, and provenance. The Earth System Grid Federation (ESGF) has created such an environment for the Intergovernmental Panel on Climate Change (IPCC). ESGF provides a federated global cyberinfrastructure for data access and management of model outputs generated for the IPCC Assessment Reports (AR). The current generation of the ESGF federated grid allows consumers of the data to find and download data, with limited capabilities for server-side processing. Since the amount of data for future ARs is expected to grow dramatically, ESGF is working on integrating server-side analytics throughout the federation. The ESGF Compute Working Team (CWT) has created a Web Processing Service (WPS) Application Programming Interface (API) to enable access to scalable computational resources. The API is the exposure point to high-performance computing resources across the federation. Specifically, the API allows users to execute simple operations, such as maximum, minimum, average, and anomalies, on ESGF data without having to download the data. These operations are executed at the ESGF data node site with access to large amounts of parallel computing capability. This presentation will highlight the WPS API, its capabilities, provide implementation details, and discuss future developments.

  19. Use of HyperCard to Simulate a Tissue Culture Laboratory.

    ERIC Educational Resources Information Center

    Nester, Bradley S.; Turney, Tully H.

    1992-01-01

    Describes the use of a Macintosh computer and HyperCard software to create an introduction to cell culture techniques that closely approximates a hands-on laboratory experiment. Highlights include data acquisition, data analysis, the generation of growth curves, and electronic modeling. (LRW)

  20. A summer blender camp: modeling, rendering, and animation for high school students.

    PubMed

    Bailey, Mike; Law, Cathy

    2014-01-01

    At Camp Blender, high-school students of varying backgrounds learned how to use the Blender software package to create computer graphics content. In a postclass survey, most of them indicated that the camp affected how they thought about their career path.

  1. An experimental study on the manufacture and characterization of in-plane fibre-waviness defects in composites.

    PubMed

    Christian, W J R; DiazDelaO, F A; Atherton, K; Patterson, E A

    2018-05-01

    A new method has been developed for creating localized in-plane fibre waviness in composite coupons and has been used to create a large batch of specimens. Manufacturers could use this method to explore experimentally the effect of fibre waviness on composite structures, both directly and indirectly, by developing and validating computational models. The specimens were assessed using ultrasound, digital image correlation and a novel inspection technique capable of measuring residual strain fields. To explore how the defect affects the performance of composite structures, the specimens were then loaded to failure. Predictions of remnant strength were made using a simple ultrasound damage metric and a new residual-strain-based damage metric. The predictions made using residual strain measurements were found to be substantially more effective at characterizing ultimate strength than ultrasound measurements. This suggests that residual strains have a significant effect on the failure of laminates containing fibre waviness and that these strains could be incorporated into computational models to improve their ability to simulate the defect.

  2. Computational Fluid Dynamics Modeling of Nickel Hydrogen Batteries

    NASA Technical Reports Server (NTRS)

    Cullion, R.; Gu, W. B.; Wang, C. Y.; Timmerman, P.

    2000-01-01

    An electrochemical Ni-H2 battery model has been expanded to include thermal effects. A thermal energy conservation equation was derived from first principles. A coupled electrochemical and thermal model was created by adding this equation to an existing multiphase electrochemical model. Charging at various rates was investigated and the results were validated against experimental data. Reaction currents, pressure changes, temperature profiles, and concentration variations within the cell are predicted numerically and compared with available data and theory.
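
    For orientation, a generic single-phase form of such a thermal energy balance is shown below; this is an illustrative sketch only, and the paper's multiphase equation includes additional phase-specific and electrochemical source terms beyond it.

    ```latex
    % Generic thermal energy conservation for a battery cell (illustrative only).
    \[
    \rho c_p \frac{\partial T}{\partial t}
      = \nabla \cdot \left( k \, \nabla T \right)
        + q_{\mathrm{rev}} + q_{\mathrm{irr}},
    \]
    % q_rev: reversible (entropic) heat; q_irr: irreversible (ohmic/activation) heat.
    ```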

  3. Multiagent Modeling and Simulation in Human-Robot Mission Operations Work System Design

    NASA Technical Reports Server (NTRS)

    Sierhuis, Maarten; Clancey, William J.; Sims, Michael H.; Shafto, Michael (Technical Monitor)

    2001-01-01

    This paper describes a collaborative multiagent modeling and simulation approach for designing work systems. The Brahms environment is used to model mission operations for a semi-autonomous robot mission to the Moon at the work practice level. It shows the impact of human-decision making on the activities and energy consumption of a robot. A collaborative work systems design methodology is described that allows informal models, created with users and stakeholders, to be used as input to the development of formal computational models.

  4. Demystifying the cytokine network: Mathematical models point the way.

    PubMed

    Morel, Penelope A; Lee, Robin E C; Faeder, James R

    2017-10-01

    Cytokines provide the means by which immune cells communicate with each other and with parenchymal cells. There are over one hundred cytokines and many exist in families that share receptor components and signal transduction pathways, creating complex networks. Reductionist approaches to understanding the role of specific cytokines, through the use of gene-targeted mice, have revealed further complexity in the form of redundancy and pleiotropy in cytokine function. Creating an understanding of the complex interactions between cytokines and their target cells is challenging experimentally. Mathematical and computational modeling provides a robust set of tools by which complex interactions between cytokines can be studied and analyzed, in the process creating novel insights that can be further tested experimentally. This review will discuss and provide examples of the different modeling approaches that have been used to increase our understanding of cytokine networks. This includes discussion of knowledge-based and data-driven modeling approaches and the recent advance in single-cell analysis. The use of modeling to optimize cytokine-based therapies will also be discussed. Copyright © 2016 Elsevier Ltd. All rights reserved.

  5. Towards Modeling False Memory With Computational Knowledge Bases.

    PubMed

    Li, Justin; Kohanyi, Emma

    2017-01-01

    One challenge to creating realistic cognitive models of memory is the inability to account for the vast common-sense knowledge of human participants. Large computational knowledge bases such as WordNet and DBpedia may offer a solution to this problem but may pose other challenges. This paper explores some of these difficulties through a semantic network spreading activation model of the Deese-Roediger-McDermott false memory task. In three experiments, we show that these knowledge bases only capture a subset of human associations, while irrelevant information introduces noise and makes efficient modeling difficult. We conclude that the contents of these knowledge bases must be augmented and, more important, that the algorithms must be refined and optimized, before large knowledge bases can be widely used for cognitive modeling. Copyright © 2016 Cognitive Science Society, Inc.
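
    The abstract gives no equations for the model; the following is a minimal Python sketch of semantic-network spreading activation over a toy DRM-style association graph. The graph contents, decay and threshold values are illustrative assumptions, not taken from the paper.

        from collections import defaultdict

        def spread_activation(graph, sources, decay=0.5, threshold=0.01, max_steps=3):
            """Propagate activation from studied words through an association graph.

            graph:   dict mapping a word to a list of associated words.
            sources: studied list items, each seeded with activation 1.0.
            Returns the final activation of every reached node; unstudied nodes
            with high activation are candidate false memories (e.g., the DRM lure).
            """
            activation = defaultdict(float)
            frontier = {w: 1.0 for w in sources}
            for _ in range(max_steps):
                next_frontier = defaultdict(float)
                for word, act in frontier.items():
                    activation[word] += act
                    passed = act * decay
                    if passed < threshold or word not in graph:
                        continue
                    share = passed / len(graph[word])
                    for neighbor in graph[word]:
                        next_frontier[neighbor] += share
                frontier = next_frontier
            return dict(activation)

        # Toy DRM-style list: every studied word associates with the lure "sleep".
        associations = {
            "bed": ["sleep", "pillow"], "rest": ["sleep", "relax"],
            "dream": ["sleep", "night"], "pillow": ["bed", "sleep"],
        }
        acts = spread_activation(associations, ["bed", "rest", "dream"])
        # The unstudied lure "sleep" ends up more active than any other unstudied word.
        print(sorted(acts.items(), key=lambda kv: -kv[1]))

    A real knowledge base such as WordNet or DBpedia would supply the graph; the paper's finding that irrelevant edges introduce noise corresponds here to spurious neighbors diluting each node's share of activation.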

  6. Coupling of EIT with computational lung modeling for predicting patient-specific ventilatory responses.

    PubMed

    Roth, Christian J; Becher, Tobias; Frerichs, Inéz; Weiler, Norbert; Wall, Wolfgang A

    2017-04-01

    Providing optimal personalized mechanical ventilation for patients with acute or chronic respiratory failure remains a challenge in the clinical setting for each new case. In this article, we integrate electrical impedance tomography (EIT) monitoring into a powerful patient-specific computational lung model to create an approach for personalizing protective ventilatory treatment. The underlying computational lung model is based on a single computed tomography scan and is able to predict global airflow quantities, as well as local tissue aeration and strains, for any ventilation maneuver. For validation, a novel "virtual EIT" module is added to our computational lung model, allowing EIT images to be simulated from the patient's thorax geometry and the results of our numerically predicted tissue aeration. Clinically measured EIT images are not used to calibrate the computational model; they therefore provide an independent means of validating the computational predictions at high temporal resolution. The performance of this coupling approach has been tested in an example patient with acute respiratory distress syndrome. The method shows good agreement between computationally predicted and clinically measured airflow data and EIT images. These results imply that the proposed framework can be used for numerical prediction of patient-specific responses to certain therapeutic measures before applying them to an actual patient. In the long run, the definition of patient-specific optimal ventilation protocols might be assisted by computational modeling. NEW & NOTEWORTHY In this work, we present a patient-specific computational lung model that is able to predict global and local ventilatory quantities for a given patient and any selected ventilation protocol. For the first time, such a predictive lung model is equipped with a virtual electrical impedance tomography module allowing real-time validation of the computed results against patient measurements. First promising results obtained in an acute respiratory distress syndrome patient show the potential of this approach for personalized, computationally guided optimization of mechanical ventilation in the future. Copyright © 2017 the American Physiological Society.

  7. Fast and Accurate Circuit Design Automation through Hierarchical Model Switching.

    PubMed

    Huynh, Linh; Tagkopoulos, Ilias

    2015-08-21

    In computer-aided biological design, the trifecta of characterized part libraries, accurate models and optimal design parameters is crucial for producing reliable designs. As the number of parts and model complexity increase, however, it becomes exponentially more difficult for any optimization method to search the solution space, hence creating a trade-off that hampers efficient design. To address this issue, we present a hierarchical computer-aided design architecture that uses a two-step approach for biological design. First, a simple model of low computational complexity is used to predict circuit behavior and assess candidate circuit branches through branch-and-bound methods. Then, a complex, nonlinear circuit model is used for a fine-grained search of the reduced solution space, thus achieving more accurate results. Evaluation with a benchmark of 11 circuits and a library of 102 experimental designs with known characterization parameters demonstrates a speed-up of 3 orders of magnitude when compared to other design methods that provide optimality guarantees.
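
    As a rough illustration of the two-step idea (not the authors' implementation), the Python sketch below prunes a candidate space with a cheap proxy model before re-ranking the survivors with a costlier nonlinear model; both scoring functions and the part library are invented placeholders.

        import itertools

        def cheap_score(design):
            # Low-complexity linear proxy: fast but approximate (illustrative only).
            return sum(part["strength"] for part in design)

        def expensive_score(design):
            # Nonlinear model standing in for a full ODE circuit simulation.
            s = sum(part["strength"] for part in design)
            return s / (1.0 + 0.1 * s * s)

        def two_step_design(library, slots, keep=10):
            """Step 1: rank all candidate circuits with the cheap model and keep
            only the best `keep` branches. Step 2: re-rank the survivors with
            the expensive model. Returns the best design found."""
            candidates = list(itertools.product(library, repeat=slots))
            pruned = sorted(candidates, key=cheap_score, reverse=True)[:keep]
            return max(pruned, key=expensive_score)

        library = [{"name": f"p{i}", "strength": s}
                   for i, s in enumerate([0.2, 1.1, 2.3, 3.0])]
        best = two_step_design(library, slots=3)
        print([p["name"] for p in best])

    Note that the fine model can (and here does) overturn the coarse ranking within the retained branches, which is exactly why the second pass is needed; the paper combines this with branch-and-bound rather than a simple top-k cut.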

  8. Patient-Specific Geometry Modeling and Mesh Generation for Simulating Obstructive Sleep Apnea Syndrome Cases by Maxillomandibular Advancement

    PubMed Central

    Ito, Yasushi; Cheng, Gary C.; Shih, Alan M.; Koomullil, Roy P.; Soni, Bharat K.; Sittitavornwong, Somsak; Waite, Peter D.

    2011-01-01

    The objective of this paper is the reconstruction of upper airway geometric models as hybrid meshes from clinically used Computed Tomography (CT) data sets in order to understand the dynamics and behaviors of the pre- and postoperative upper airway systems of Obstructive Sleep Apnea Syndrome (OSAS) patients by viscous Computational Fluid Dynamics (CFD) simulations. The selection criteria for OSAS cases studied are discussed because two reasonable pre- and postoperative upper airway models for CFD simulations may not be created for every case without a special protocol for CT scanning. The geometry extraction and manipulation methods are presented with technical barriers that must be overcome so that they can be used along with computational simulation software as a daily clinical evaluation tool. Eight cases are presented in this paper, and each case consists of pre- and postoperative configurations. The results of computational simulations of two cases are included in this paper as demonstration. PMID:21625395

  9. Towards Large-area Field-scale Operational Evapotranspiration for Water Use Mapping

    NASA Astrophysics Data System (ADS)

    Senay, G. B.; Friedrichs, M.; Morton, C.; Huntington, J. L.; Verdin, J.

    2017-12-01

    Field-scale evapotranspiration (ET) estimates are needed for improving surface and groundwater use and water budget studies. Ideally, field-scale ET estimates would be available at regional to national levels and cover long time periods. Because of the large data storage and computational requirements associated with processing field-scale satellite imagery such as Landsat, numerous challenges remain in developing operational ET estimates over large areas for detailed water use and availability studies. However, the combination of new science, data availability, and cloud computing technology is enabling unprecedented capabilities for ET mapping. To demonstrate this capability, we used Google's Earth Engine cloud computing platform to create nationwide annual ET estimates with 30-meter resolution Landsat (~16,000 images) and gridded weather data using the Operational Simplified Surface Energy Balance (SSEBop) model in support of the National Water Census, a USGS research program designed to build decision support capacity for water management agencies and other natural resource managers. By leveraging Google's Earth Engine Application Programming Interface (API) and developing software in a collaborative, open-platform environment, we can rapidly advance from research towards applications for large-area field-scale ET mapping. Cloud computing of the Landsat image archive, combined with other satellite, climate, and weather data, is creating previously unimagined opportunities for assessing ET model behavior and uncertainty, and ultimately provides the ability for more robust operational monitoring and assessment of water use at field scales.
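
    For reference, the core SSEBop calculation can be sketched on per-pixel arrays. The array inputs, clamp range and scaling coefficient below are illustrative stand-ins, not the USGS production code.

        import numpy as np

        def ssebop_eta(ts, t_hot, dt, et_ref, k=1.0):
            """Per-pixel actual ET via the SSEBop parameterization.

            ts:     land-surface temperature from Landsat thermal data (K)
            t_hot:  hot/dry reference temperature per pixel (K)
            dt:     predefined hot-cold temperature difference per pixel (K)
            et_ref: gridded reference ET from weather data (mm)
            k:      scaling coefficient on the reference ET (value assumed here)
            """
            # ET fraction: cooler pixels evaporate closer to the reference rate.
            et_fraction = np.clip((t_hot - ts) / dt, 0.0, 1.05)
            return et_fraction * k * et_ref

        # Toy 2x2 scene.
        ts = np.array([[300.0, 305.0], [310.0, 315.0]])
        print(ssebop_eta(ts, t_hot=315.0, dt=15.0, et_ref=6.0))

    In the operational workflow this arithmetic runs inside Earth Engine across the Landsat archive rather than on local NumPy arrays.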

  10. Measurement of Satellite Impact Test Fragments for Modeling Orbital Debris

    NASA Technical Reports Server (NTRS)

    Hill, Nicole M.

    2009-01-01

    There are over 13,000 catalogued objects 10 cm and larger in orbit around Earth [ODQN, January 2009, p. 12]. More than 6000 of these objects are fragments from explosions and collisions. As the Earth-orbiting object count increases, debris-generating collisions become a statistical inevitability. To aid in understanding this collision risk, the NASA Orbital Debris Program Office has developed computer models that calculate the quantity and orbits of debris both currently in orbit and in future epochs. In order to create a reasonable computer model of the orbital debris environment, it is important to understand the mechanics by which debris is created in a collision. The measurement of the physical characteristics of debris resulting from ground-based, hypervelocity impact testing aids in understanding the sizes and shapes of debris produced by potential impacts in orbit. To advance the accuracy of fragment shape/size determination, the NASA Orbital Debris Program Office recently implemented a computerized measurement system. The goal of this system is to improve knowledge and understanding of the relation between commonly used dimensions and overall shape. The technique developed involves scanning a single fragment with a hand-held laser device, measuring its size properties using a sophisticated software tool, and creating a three-dimensional computer model to demonstrate how the object might appear in orbit. This information is used to aid optical techniques in shape determination. This more automated and repeatable method provides higher accuracy in the size and shape determination of debris.

  11. Computational Modeling of Sinkage of Objects into Porous Bed under Cyclic Loading

    NASA Astrophysics Data System (ADS)

    Sheikh, B.; Qiu, T.; Liu, X.

    2017-12-01

    This work is a companion to another abstract submitted to this session on computational modeling for the prediction of underwater munitions. In the other abstract, the focus is on hydrodynamics and sediment transport; here, the focus is on the geotechnical aspects and granular material behavior when the munitions interact with the porous bed. The final goal of the project is to create and utilize a comprehensive modeling framework, which integrates the flow and granular material models, to simulate and investigate the motion of the munitions. In this work, we present the computational modeling of one important process: the sinkage of rigid-body objects into a porous bed under cyclic loading. To model the large deformation of granular bed materials around sinking objects under cyclic loading, a rate-independent elasto-plastic constitutive model is implemented into a Smoothed Particle Hydrodynamics (SPH) model. The effects of loading conditions (e.g., amplitude and frequency of shaking), object properties (e.g., geometry and density), and granular bed material properties (e.g., density) on object sinkage are discussed.

  12. Running climate model on a commercial cloud computing environment: A case study using Community Earth System Model (CESM) on Amazon AWS

    NASA Astrophysics Data System (ADS)

    Chen, Xiuhong; Huang, Xianglei; Jiao, Chaoyi; Flanner, Mark G.; Raeker, Todd; Palen, Brock

    2017-01-01

    The suites of numerical models used for simulating the climate of our planet are usually run on dedicated high-performance computing (HPC) resources. This study investigates an alternative to the usual approach, i.e., carrying out climate model simulations in a commercially available cloud computing environment. We test the performance and reliability of running the CESM (Community Earth System Model), a flagship climate model in the United States developed by the National Center for Atmospheric Research (NCAR), on Amazon Web Services (AWS) EC2, the cloud computing environment of Amazon.com, Inc. StarCluster is used to create a virtual computing cluster on AWS EC2 for the CESM simulations. The wall-clock time for one year of CESM simulation on the AWS EC2 virtual cluster is comparable to the time spent for the same simulation on a local dedicated high-performance computing cluster with InfiniBand connections. The CESM simulation can be efficiently scaled with the number of CPU cores on the AWS EC2 virtual cluster environment up to 64 cores. For the standard configuration of the CESM at a spatial resolution of 1.9° latitude by 2.5° longitude, increasing the number of cores from 16 to 64 reduces the wall-clock running time by more than 50%, and the scaling is nearly linear. Beyond 64 cores, the communication latency starts to outweigh the benefit of distributed computing and the parallel speedup plateaus.
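
    The reported scaling can be summarized with the standard speedup and parallel-efficiency definitions; a small sketch with illustrative wall-clock numbers (not the paper's measurements):

        def scaling_report(times):
            """times: dict mapping core count -> wall-clock hours per model year.
            Speedup is relative to the smallest core count; efficiency is
            speedup divided by the corresponding increase in cores."""
            base_cores = min(times)
            base_time = times[base_cores]
            for cores in sorted(times):
                speedup = base_time / times[cores]
                efficiency = speedup / (cores / base_cores)
                print(f"{cores:>3} cores: speedup {speedup:4.2f}, "
                      f"efficiency {efficiency:4.2f}")

        # Illustrative numbers consistent with near-linear scaling up to 64
        # cores, then flat beyond as communication latency dominates.
        scaling_report({16: 10.0, 32: 5.4, 64: 3.0, 128: 2.9})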

  13. TOWARDS A MULTI-SCALE AGENT-BASED PROGRAMMING LANGUAGE METHODOLOGY

    PubMed Central

    Somogyi, Endre; Hagar, Amit; Glazier, James A.

    2017-01-01

    Living tissues are dynamic, heterogeneous compositions of objects, including molecules, cells and extra-cellular materials, which interact via chemical, mechanical and electrical processes and reorganize via transformation, birth, death and migration processes. Current programming languages have difficulty describing the dynamics of tissues because: 1: Dynamic sets of objects participate simultaneously in multiple processes, 2: Processes may be either continuous or discrete, and their activity may be conditional, 3: Objects and processes form complex, heterogeneous relationships and structures, 4: Objects and processes may be hierarchically composed, 5: Processes may create, destroy and transform objects and processes. Some modeling languages support these concepts, but most cannot translate models into executable simulations. We present a new hybrid executable modeling language paradigm, the Continuous Concurrent Object Process Methodology (CCOPM), which naturally expresses tissue models, enabling users to visually create agent-based models of tissues, and also allows computer simulation of these models. PMID:29282379

  14. TOWARDS A MULTI-SCALE AGENT-BASED PROGRAMMING LANGUAGE METHODOLOGY.

    PubMed

    Somogyi, Endre; Hagar, Amit; Glazier, James A

    2016-12-01

    Living tissues are dynamic, heterogeneous compositions of objects, including molecules, cells and extra-cellular materials, which interact via chemical, mechanical and electrical processes and reorganize via transformation, birth, death and migration processes. Current programming languages have difficulty describing the dynamics of tissues because: 1: Dynamic sets of objects participate simultaneously in multiple processes, 2: Processes may be either continuous or discrete, and their activity may be conditional, 3: Objects and processes form complex, heterogeneous relationships and structures, 4: Objects and processes may be hierarchically composed, 5: Processes may create, destroy and transform objects and processes. Some modeling languages support these concepts, but most cannot translate models into executable simulations. We present a new hybrid executable modeling language paradigm, the Continuous Concurrent Object Process Methodology (CCOPM), which naturally expresses tissue models, enabling users to visually create agent-based models of tissues, and also allows computer simulation of these models.

  15. Alloy Design Workbench-Surface Modeling Package Developed

    NASA Technical Reports Server (NTRS)

    Abel, Phillip B.; Noebe, Ronald D.; Bozzolo, Guillermo H.; Good, Brian S.; Daugherty, Elaine S.

    2003-01-01

    NASA Glenn Research Center's Computational Materials Group has integrated a graphical user interface with in-house-developed surface modeling capabilities, with the goal of using computationally efficient atomistic simulations to aid the development of advanced aerospace materials through the modeling of alloy surfaces, surface alloys, and segregation. The software is also ideal for modeling nanomaterials, since surface and interfacial effects can dominate material behavior and properties at this level. Through the combination of an accurate atomistic surface modeling methodology and an efficient computational engine, it is now possible to directly model these types of surface phenomena and metallic nanostructures without a supercomputer. Fulfilling a High Operating Temperature Propulsion Components (HOTPC) project level-I milestone, a graphical user interface was created for a suite of quantum-approximate atomistic materials modeling Fortran programs developed at Glenn. The resulting "Alloy Design Workbench-Surface Modeling Package" (ADW-SMP) is the combination of proven quantum-approximate Bozzolo-Ferrante-Smith (BFS) algorithms (refs. 1 and 2) with a productivity-enhancing graphical front end. Written in the portable, platform-independent Java programming language, the graphical user interface calls on extensively tested Fortran programs running in the background for the detailed computational tasks. Designed to run on desktop computers, the package has been deployed on PC, Mac, and SGI computer systems. The graphical user interface integrates two modes of computational materials exploration. One mode uses Monte Carlo simulations to determine lowest-energy equilibrium configurations. The second approach is an interactive "what if" comparison of atomic configuration energies, designed to provide real-time insight into the underlying drivers of alloying processes.

  16. A computational model of self-efficacy's various effects on performance: Moving the debate forward.

    PubMed

    Vancouver, Jeffrey B; Purl, Justin D

    2017-04-01

    Self-efficacy, which is one's belief in one's capacity, has been found to both positively and negatively influence effort and performance. The reasons for these different effects have been a major topic of debate among social-cognitive and perceptual control theorists. In particular, research into the various self-efficacy effects has been motivated by a perceptual control theory view of self-regulation that social-cognitive theorists question. To bring more clarity to the theoretical arguments, a computational model of the multiple processes presumed to create the positive, negative, and null effects of self-efficacy is presented. Building on an existing computational model of goal choice that produces a positive effect for self-efficacy, the current article adds a symbolic processing structure used during goal striving that explains the negative self-efficacy effect observed in recent studies. Moreover, the multiple processes, operating together, allow the model to recreate the various effects found in a published study of feedback ambiguity's moderating role on the self-efficacy-to-performance relationship (Schmidt & DeShon, 2010). Discussion focuses on the implications of the model for the self-efficacy debate, alternative computational models, the overlap between control theory and social-cognitive theory explanations, the value of using computational models for resolving theoretical disputes, and future research and directions the model inspires. (PsycINFO Database Record (c) 2017 APA, all rights reserved).

  17. Automatic mathematical modeling for space application

    NASA Technical Reports Server (NTRS)

    Wang, Caroline K.

    1987-01-01

    A methodology for automatic mathematical modeling is described. The major objective is to create a very friendly environment for engineers to design, maintain and verify their model and also automatically convert the mathematical model into FORTRAN code for conventional computation. A demonstration program was designed for modeling the Space Shuttle Main Engine simulation mathematical model called Propulsion System Automatic Modeling (PSAM). PSAM provides a very friendly and well organized environment for engineers to build a knowledge base for base equations and general information. PSAM contains an initial set of component process elements for the Space Shuttle Main Engine simulation and a questionnaire that allows the engineer to answer a set of questions to specify a particular model. PSAM is then able to automatically generate the model and the FORTRAN code. A future goal is to download the FORTRAN code to the VAX/VMS system for conventional computation.

  18. AstroBlend: Visualization package for use with Blender

    NASA Astrophysics Data System (ADS)

    Naiman, J. P.

    2015-12-01

    AstroBlend is a visualization package for use in the three-dimensional animation and modeling software Blender. It reads data in via a text file or can use pre-fabricated isosurface files stored as OBJ or Wavefront files. AstroBlend supports a variety of codes such as FLASH (ascl:1010.082), Enzo (ascl:1010.072), and Athena (ascl:1010.014), and combines artistic 3D models with computational astrophysics datasets to create models and animations.

  19. New Actions Upon Old Objects: A New Ontological Perspective on Functions.

    ERIC Educational Resources Information Center

    Schwarz, Baruch; Dreyfus, Tommy

    1995-01-01

    A computer microworld called Triple Representation Model uses graphical, tabular, and algebraic representations to influence conceptions of function. A majority of students were able to cope with partial data, recognize invariants while coordinating actions among representations, and recognize invariants while creating and comparing different…

  20. Off We Go Cybernetting--Staff Development Makes the Difference.

    ERIC Educational Resources Information Center

    Joseph, Linda

    1995-01-01

    Describes how to create a school or district model for an Internet staff development training program for integrating information access skills into the school curriculum. Highlights include instructional design; facility development, including computer workstations; hands-on workshops that include electronic mail, gopher, and downloading;…

  1. Modeling and Controls Development of 48V Mild Hybrid Electric Vehicles

    EPA Science Inventory

    The Advanced Light-Duty Powertrain and Hybrid Analysis tool (ALPHA) was created by EPA to evaluate the Greenhouse Gas (GHG) emissions of Light-Duty (LD) vehicles. It is a physics-based, forward-looking, full vehicle computer simulator capable of analyzing various vehicle types c...

  2. Overview of High-Fidelity Modeling Activities in the Numerical Propulsion System Simulations (NPSS) Project

    NASA Technical Reports Server (NTRS)

    Veres, Joseph P.

    2002-01-01

    A high-fidelity simulation of a commercial turbofan engine has been created as part of the Numerical Propulsion System Simulation Project. The high-fidelity computer simulation utilizes computer models that were developed at NASA Glenn Research Center in cooperation with turbofan engine manufacturers. The average-passage (APNASA) Navier-Stokes based viscous flow computer code is used to simulate the 3D flow in the compressors and turbines of the advanced commercial turbofan engine. The 3D National Combustion Code (NCC) is used to simulate the flow and chemistry in the advanced aircraft combustor. The APNASA turbomachinery code and the NCC combustor code exchange boundary conditions at the interface planes at the combustor inlet and exit. This computer simulation technique can evaluate engine performance at steady operating conditions. The 3D flow models provide detailed knowledge of the airflow within the fan and compressor, the high and low pressure turbines, and the flow and chemistry within the combustor. The models simulate the performance of the engine at operating conditions that include sea level takeoff and the altitude cruise condition.

  3. Patient-specific models of cardiac biomechanics

    NASA Astrophysics Data System (ADS)

    Krishnamurthy, Adarsh; Villongco, Christopher T.; Chuang, Joyce; Frank, Lawrence R.; Nigam, Vishal; Belezzuoli, Ernest; Stark, Paul; Krummen, David E.; Narayan, Sanjiv; Omens, Jeffrey H.; McCulloch, Andrew D.; Kerckhoffs, Roy C. P.

    2013-07-01

    Patient-specific models of cardiac function have the potential to improve diagnosis and management of heart disease by integrating medical images with heterogeneous clinical measurements subject to constraints imposed by physical first principles and prior experimental knowledge. We describe new methods for creating three-dimensional patient-specific models of ventricular biomechanics in the failing heart. Three-dimensional bi-ventricular geometry is segmented from cardiac CT images at end-diastole from patients with heart failure. Human myofiber and sheet architecture is modeled using eigenvectors computed from diffusion tensor MR images from an isolated, fixed human organ-donor heart and transformed to the patient-specific geometric model using large deformation diffeomorphic mapping. Semi-automated methods were developed for optimizing the passive material properties while simultaneously computing the unloaded reference geometry of the ventricles for stress analysis. Material properties of active cardiac muscle contraction were optimized to match ventricular pressures measured by cardiac catheterization, and parameters of a lumped-parameter closed-loop model of the circulation were estimated with a circulatory adaptation algorithm making use of information derived from echocardiography. These components were then integrated to create a multi-scale model of the patient-specific heart. These methods were tested in five heart failure patients from the San Diego Veterans Affairs Medical Center who gave informed consent. The simulation results showed good agreement with measured echocardiographic and global functional parameters such as ejection fraction and peak cavity pressures.

  4. Flow modification in canine intracranial aneurysm model by an asymmetric stent: studies using digital subtraction angiography (DSA) and image-based computational fluid dynamics (CFD) analyses

    NASA Astrophysics Data System (ADS)

    Hoi, Yiemeng; Ionita, Ciprian N.; Tranquebar, Rekha V.; Hoffmann, Kenneth R.; Woodward, Scott H.; Taulbee, Dale B.; Meng, Hui; Rudin, Stephen

    2006-03-01

    An asymmetric stent with low porosity patch across the intracranial aneurysm neck and high porosity elsewhere is designed to modify the flow to result in thrombogenesis and occlusion of the aneurysm and yet to reduce the possibility of also occluding adjacent perforator vessels. The purposes of this study are to evaluate the flow field induced by an asymmetric stent using both numerical and digital subtraction angiography (DSA) methods and to quantify the flow dynamics of an asymmetric stent in an in vivo aneurysm model. We created a vein-pouch aneurysm model on the canine carotid artery. An asymmetric stent was implanted at the aneurysm, with 25% porosity across the aneurysm neck and 80% porosity elsewhere. The aneurysm geometry, before and after stent implantation, was acquired using cone beam CT and reconstructed for computational fluid dynamics (CFD) analysis. Both steady-state and pulsatile flow conditions using the measured waveforms from the aneurysm model were studied. To reduce computational costs, we modeled the asymmetric stent effect by specifying a pressure drop over the layer across the aneurysm orifice where the low porosity patch was located. From the CFD results, we found the asymmetric stent reduced the inflow into the aneurysm by 51%, and appeared to create a stasis-like environment which favors thrombus formation. The DSA sequences also showed substantial flow reduction into the aneurysm. Asymmetric stents may be a viable image guided intervention for treating intracranial aneurysms with desired flow modification features.

  5. Interactive computer simulations of knee-replacement surgery.

    PubMed

    Gunther, Stephen B; Soto, Gabriel E; Colman, William W

    2002-07-01

    Current surgical training programs in the United States are based on an apprenticeship model. This model is outdated because it does not provide conceptual scaffolding, promote collaborative learning, or offer constructive reinforcement. Our objective was to create a more useful approach by preparing students and residents for operative cases using interactive computer simulations of surgery. Total-knee-replacement surgery (TKR) is an ideal procedure to model on the computer because there is a systematic protocol for the procedure. This protocol is also difficult to learn by the apprenticeship model because of the multiple instruments that must be used in a specific order. We designed an interactive computer tutorial to teach medical students and residents how to perform knee-replacement surgery, to reinforce the specific protocol of the operative procedure, and to provide immediate, constructive feedback. We created the tutorial by generating three-dimensional wire-frame models of the surgical instruments, applying surfaces to the wire-frame models using three-dimensional modeling, and animating the three-dimensional models to simulate the motions of an actual TKR. The tutorial teaches and tests, step by step, the correct sequence of steps in a TKR. The student or resident must select the correct instruments in the correct order, and is encouraged to learn the stepwise surgical protocol through repetitive use of the computer simulation. Constructive feedback is provided through a grading system, which rates the student's or resident's ability to perform the task in the correct order and accounts for the time required to perform the simulated procedure. We evaluated the efficacy of this teaching technique by testing medical students who learned by the computer simulation and those who learned by reading the surgical protocol manual. Both groups then performed TKR on manufactured bone models using real instruments, and their technique was graded with the standard protocol. The students who learned on the computer simulation performed the task in a shorter time and with fewer errors than the control group. They were also more engaged in the learning process. Surgical training programs generally lack a consistent approach to preoperative education related to surgical procedures. This interactive computer tutorial has allowed us to make a quantum leap in medical student and resident teaching in our orthopedic department because the students actually participate in the entire process. Our technique provides a linear, sequential method of skill acquisition and direct feedback, which is ideally suited for learning stepwise surgical protocols. Since our initial evaluation has shown the efficacy of this program, we have incorporated this teaching tool into our orthopedic curriculum. Our plans for future work with this simulator include modeling procedures involving other anatomic areas of interest, such as the hip and shoulder.

  6. EarthCube - Earth System Bridge: Spanning Scientific Communities with Interoperable Modeling Frameworks

    NASA Astrophysics Data System (ADS)

    Peckham, S. D.; DeLuca, C.; Gochis, D. J.; Arrigo, J.; Kelbert, A.; Choi, E.; Dunlap, R.

    2014-12-01

    In order to better understand and predict environmental hazards of weather/climate, ecology and deep earth processes, geoscientists develop and use physics-based computational models. These models are used widely both in academic and federal communities. Because of the large effort required to develop and test models, there is widespread interest in component-based modeling, which promotes model reuse and simplified coupling to tackle problems that often cross discipline boundaries. In component-based modeling, the goal is to make relatively small changes to models that make it easy to reuse them as "plug-and-play" components. Sophisticated modeling frameworks exist to rapidly couple these components to create new composite models. They allow component models to exchange variables while accommodating different programming languages, computational grids, time-stepping schemes, variable names and units. Modeling frameworks have arisen in many modeling communities. CSDMS (Community Surface Dynamics Modeling System) serves the academic earth surface process dynamics community, while ESMF (Earth System Modeling Framework) serves many federal Earth system modeling projects. Others exist in both the academic and federal domains and each satisfies design criteria that are determined by the community they serve. While they may use different interface standards or semantic mediation strategies, they share fundamental similarities. The purpose of the Earth System Bridge project is to develop mechanisms for interoperability between modeling frameworks, such as the ability to share a model or service component. This project has three main goals: (1) Develop a Framework Description Language (ES-FDL) that allows modeling frameworks to be described in a standard way so that their differences and similarities can be assessed. (2) Demonstrate that if a model is augmented with a framework-agnostic Basic Model Interface (BMI), then simple, universal adapters can go from BMI to a modeling framework's native component interface. (3) Create semantic mappings between modeling frameworks that support semantic mediation. This third goal involves creating a crosswalk between the CF Standard Names and the CSDMS Standard Names (a set of naming conventions). This talk will summarize progress towards these goals.
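
    For readers unfamiliar with BMI, a minimal sketch of the idea in Python follows. The toy heat model, the variable name and the simplified call signatures are illustrative; the full BMI specification defines many more functions (grid, unit and metadata queries among them).

        import numpy as np

        class HeatModelBMI:
            """A toy 1D heat-diffusion model wrapped in a minimal subset of
            BMI-style calls, so a framework adapter can drive it without
            knowing any model internals."""

            def initialize(self, config_file=None):
                self.dt = 1.0
                self.time = 0.0
                self.temperature = np.zeros(10)
                self.temperature[5] = 100.0  # initial hot spot

            def update(self):
                # Explicit finite-difference diffusion step.
                t = self.temperature
                t[1:-1] += 0.25 * (t[2:] - 2 * t[1:-1] + t[:-2])
                self.time += self.dt

            def get_current_time(self):
                return self.time

            def get_value(self, name):
                if name == "plate_surface__temperature":
                    return self.temperature.copy()
                raise KeyError(name)

            def finalize(self):
                self.temperature = None

        # A coupling framework only ever needs these generic calls.
        model = HeatModelBMI()
        model.initialize()
        while model.get_current_time() < 5.0:
            model.update()
        print(model.get_value("plate_surface__temperature").round(2))
        model.finalize()

    The double-underscore variable name mimics the CSDMS standard-name convention mentioned above; the semantic crosswalk in goal (3) maps such names between frameworks.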

  7. GRAVTool, a Package to Compute Geoid Model by Remove-Compute-Restore Technique

    NASA Astrophysics Data System (ADS)

    Marotta, G. S.; Blitzkow, D.; Vidotti, R. M.

    2015-12-01

    Currently, there are several methods to determine geoid models. They can be based on terrestrial gravity data, geopotential coefficients, astro-geodetic data, or a combination of them. Among the techniques to compute a precise geoid model, Remove-Compute-Restore (RCR) has been widely applied. It considers short, medium and long wavelengths derived from altitude data provided by Digital Terrain Models (DTM), terrestrial gravity data and global geopotential coefficients, respectively. In order to apply this technique, it is necessary to create procedures that compute gravity anomalies and geoid models by the integration of different wavelengths, and that adjust these models to a local vertical datum. This research presents a package called GRAVTool, based on MATLAB software, to compute local geoid models by the RCR technique, and its application in a study area. The study area comprises the Federal District of Brazil (~6000 km²), with wavy relief and heights varying from 600 m to 1340 m, located between the coordinates 48.25ºW, 15.45ºS and 47.33ºW, 16.06ºS. The results of the numerical example on the study area show the local geoid model computed by the GRAVTool package, using 1377 terrestrial gravity observations, SRTM data with 3 arc-second resolution, and geopotential coefficients of the EIGEN-6C4 model to degree 360. The accuracy of the computed model (σ = ± 0.071 m, RMS = 0.069 m, maximum = 0.178 m and minimum = -0.123 m) matches the uncertainty (σ = ± 0.073 m) of 21 randomly spaced points where the geoid was computed by the geometric leveling technique supported by GNSS positioning. The results were also better than those achieved by the official Brazilian regional geoid model (σ = ± 0.099 m, RMS = 0.208 m, maximum = 0.419 m and minimum = -0.040 m).
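
    In LaTeX notation, the Remove-Compute-Restore steps can be summarized as follows; this is the standard textbook formulation, with symbols assumed here rather than taken from the GRAVTool paper:

        \Delta g_{\mathrm{res}} = \Delta g_{\mathrm{obs}} - \Delta g_{\mathrm{GGM}} - \Delta g_{\mathrm{RTM}} \quad \text{(remove)}

        N_{\mathrm{res}} = \frac{R}{4\pi\gamma} \iint_{\sigma} \Delta g_{\mathrm{res}} \, S(\psi) \, d\sigma \quad \text{(compute)}

        N = N_{\mathrm{GGM}} + N_{\mathrm{res}} + N_{\mathrm{RTM}} \quad \text{(restore)}

    where Δg denotes gravity anomalies (observed, global geopotential model, and residual terrain model contributions), S(ψ) is the Stokes function, R the mean Earth radius, γ normal gravity, and N the geoid height.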

  8. Effects of Computer Animation Exercises on Student Cognitive Processes.

    ERIC Educational Resources Information Center

    Fowler, Will

    A study examining the effects of computer animation exercises on cognitive development asked two groups of seventh graders to create computer animations, working from a simple mythic text. The ability of students to create narrative scenarios from this mythic text was analyzed. These scenarios were then recreated in the school computer lab, using…

  9. Framework for emotional mobile computation for creating entertainment experience

    NASA Astrophysics Data System (ADS)

    Lugmayr, Artur R.

    2007-02-01

    Ambient media are media that manifest in the natural environment of the consumer. The perceivable borders between the media and the context in which they are used are becoming increasingly blurred. The consumer moves through a digital space of services throughout his daily life. As we develop towards an experience society, the central point in the development of services is the creation of a consumer experience. This paper reviews the possibilities and potential of creating entertainment experiences on mobile phone platforms. It reviews sensor networks capable of acquiring consumer behavior data, interactivity strategies, and psychological models for emotional computation on mobile phones, and lays the foundations of a nomadic experience society. The paper rounds off with a presentation of several possible service scenarios in the field of entertainment and leisure computation on mobiles. The goal of this paper is to present a framework and an evaluation of the possibilities of applying sensor technology on mobile platforms to create an enhanced consumer entertainment experience.

  10. 76 FR 58466 - Models To Advance Voluntary Corporate Notification to Consumers Regarding the Illicit Use of...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-21

    ...The U.S. Department of Commerce and U.S. Department of Homeland Security are requesting information on the requirements of, and possible approaches to creating, a voluntary industry code of conduct to address the detection, notification and mitigation of botnets. Over the past several years, botnets have increasingly put computer owners at risk. A botnet infection can lead to the monitoring of a consumer's personal information and communication, and exploitation of that consumer's computing power and Internet access. Networks of these compromised computers are often used to disseminate spam, to store and transfer illegal content, and to attack the servers of government and private entities with massive, distributed denial of service attacks. The Departments seek public comment from all Internet stakeholders, including the commercial, academic, and civil society sectors, on potential models for detection, notification, prevention, and mitigation of botnets' illicit use of computer equipment.

  11. Reusable Component Model Development Approach for Parallel and Distributed Simulation

    PubMed Central

    Zhu, Feng; Yao, Yiping; Chen, Huilong; Yao, Feng

    2014-01-01

    Model reuse is a key issue to be resolved in parallel and distributed simulation at present. However, component models built by different domain experts usually have diversiform interfaces, couple tightly, and bind closely with simulation platforms. As a result, they are difficult to reuse across different simulation platforms and applications. To address this problem, this paper first proposes a reusable component model framework. Based on this framework, our reusable model development approach is elaborated, which contains two phases: (1) domain experts create simulation computational modules, observing three principles to achieve their independence; (2) the model developer encapsulates these simulation computational modules with six standard service interfaces to improve their reusability. The case study of a radar model indicates that a model developed using our approach has good reusability and is easy to use in different simulation platforms and applications. PMID:24729751
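
    The abstract does not name the six standard service interfaces, so the Python sketch below invents illustrative ones purely to show the encapsulation pattern; the method names and the radar example are hypothetical, not the paper's design.

        from abc import ABC, abstractmethod

        class ReusableComponent(ABC):
            """Platform-neutral wrapper: a simulation engine drives any
            component through these service calls, so the computational
            module inside never binds to a particular platform.
            The six method names here are invented for illustration."""

            @abstractmethod
            def configure(self, parameters): ...
            @abstractmethod
            def start(self, start_time): ...
            @abstractmethod
            def step(self, dt): ...
            @abstractmethod
            def receive_inputs(self, messages): ...
            @abstractmethod
            def publish_outputs(self): ...
            @abstractmethod
            def shutdown(self): ...

        class RadarModel(ReusableComponent):
            # The domain expert's computational module is called only from
            # inside these methods, keeping it platform-independent.
            def configure(self, parameters): self.range_km = parameters["range_km"]
            def start(self, start_time): self.detections = []
            def step(self, dt): pass  # advance the radar computation here
            def receive_inputs(self, messages): pass
            def publish_outputs(self): return self.detections
            def shutdown(self): pass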

  12. C-Language Integrated Production System, Version 6.0

    NASA Technical Reports Server (NTRS)

    Riley, Gary; Donnell, Brian; Ly, Huyen-Anh Bebe; Ortiz, Chris

    1995-01-01

    C Language Integrated Production System (CLIPS) computer programs are specifically intended to model human expertise or other knowledge. CLIPS is designed to enable research on, and development and delivery of, artificial intelligence on conventional computers. CLIPS 6.0 provides a cohesive software tool for handling a wide variety of knowledge, with support for three different programming paradigms: rule-based, object-oriented, and procedural. Rule-based programming: representation of knowledge as heuristics - essentially, rules of thumb that specify a set of actions to be performed in a given situation. Object-oriented programming: modeling of complex systems comprised of modular components that are easily reused to model other systems or create new components. Procedural programming: representation of knowledge in ways similar to those of such languages as C, Pascal, Ada, and LISP. The version of CLIPS 6.0 for IBM PC-compatible computers requires DOS v3.3 or later and/or Windows 3.1 or later.

  13. Computational modeling for prediction of the shear stress of three-dimensional isotropic and aligned fiber networks.

    PubMed

    Park, Seungman

    2017-09-01

    Interstitial flow (IF) is a creeping flow through the interstitial space of the extracellular matrix (ECM). IF plays a key role in diverse biological functions, such as tissue homeostasis, cell function and behavior. Currently, most studies that have characterized IF have focused on the permeability of the ECM or the shear stress distribution on cells, but less is known about the prediction of shear stress on individual fibers or fiber networks, despite its significance in the alignment of matrix fibers and cells observed in fibrotic or wound tissues. In this study, I developed a computational model to predict shear stress for differently structured fibrous networks. To generate isotropic models, a random growth algorithm and a second-order orientation tensor were employed. Then, a three-dimensional (3D) solid model was created using computer-aided design (CAD) software for the aligned models (i.e., parallel, perpendicular and cubic models). Subsequently, a tetrahedral unstructured mesh was generated and flow solutions were calculated by solving equations for mass and momentum conservation for all models. From the flow solutions, I estimated permeability using Darcy's law. Average shear stress (ASS) on the fibers was calculated by averaging the wall shear stress over the fibers. By using nonlinear surface fitting of permeability, viscosity, velocity, porosity and ASS, I devised new computational models. Overall, the developed models showed that higher porosity induced higher permeability, as previous empirical and theoretical models have shown. In a comparison of permeability, the present computational models matched previous models well, which justifies the computational approach. ASS tended to increase linearly with respect to inlet velocity and dynamic viscosity, whereas permeability remained almost the same. Finally, the developed model nicely predicted the ASS values that had been directly estimated from computational fluid dynamics (CFD). The present computational models will provide new tools for predicting accurate functional properties and designing fibrous porous materials, thereby significantly advancing tissue engineering. Copyright © 2017 Elsevier B.V. All rights reserved.
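
    In the notation assumed here (not necessarily the paper's symbols), the two quantities extracted from the flow solutions follow from Darcy's law and a simple average:

        k = \frac{Q \, \mu \, L}{A \, \Delta p}, \qquad \mathrm{ASS} = \frac{1}{N} \sum_{i=1}^{N} \tau_{w,i}

    where Q is the computed volumetric flow rate, μ the dynamic viscosity, L the domain length, A the inlet cross-sectional area, Δp the pressure drop across the domain, and τ_w,i the wall shear stress on the i-th fiber facet.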

  14. Finite Element Aircraft Simulation of Turbulence

    NASA Technical Reports Server (NTRS)

    McFarland, R. E.

    1997-01-01

    A turbulence model has been developed for realtime aircraft simulation that accommodates stochastic turbulence and distributed discrete gusts as a function of the terrain. This model is applicable to conventional aircraft, V/STOL aircraft, and disc rotor model helicopter simulations. Vehicle angular activity in response to turbulence is computed from geometrical and temporal relationships rather than by using the conventional continuum approximations that assume uniform gust immersion and low frequency responses. By using techniques similar to those recently developed for blade-element rotor models, the angular-rate filters of conventional turbulence models are not required. The model produces rotational rates as well as air mass translational velocities in response to both stochastic and deterministic disturbances, where the discrete gusts and turbulence magnitudes may be correlated with significant terrain features or ship models. Assuming isotropy, a two-dimensional vertical turbulence field is created. A novel Gaussian interpolation technique is used to distribute vertical turbulence on the wing span or lateral rotor disc, and this distribution is used to compute roll responses. Air mass velocities are applied at significant centers of pressure in the computation of the aircraft's pitch and roll responses.

  15. Synthetic river valleys: Creating prescribed topography for form-process inquiry and river rehabilitation design

    NASA Astrophysics Data System (ADS)

    Brown, R. A.; Pasternack, G. B.; Wallender, W. W.

    2014-06-01

    The synthesis of artificial landforms is complementary to geomorphic analysis because it affords a reflection on both the characteristics and the intrinsic formative processes of real-world conditions. Moreover, the applied terminus of geomorphic theory is commonly manifested in the engineering and rehabilitation of riverine landforms, where the goal is to create specific processes associated with specific morphology. To date, the synthesis of river topography has been explored outside of geomorphology through artistic renderings, computer science applications, and river rehabilitation design, while within geomorphology it has been explored using morphodynamic modeling, such as one-dimensional simulation of river reach profiles, two-dimensional simulation of river networks, and three-dimensional simulation of subreach-scale river morphology. As yet, however, no approach allows geomorphologists, engineers, or river rehabilitation practitioners to create landforms of prescribed conditions. In this paper a method for creating the topography of synthetic river valleys is introduced that utilizes a theoretical framework drawing from fluvial geomorphology, computer science, and geometric modeling. Such a method is valuable to geomorphologists in understanding form-process linkages as well as to engineers and river rehabilitation practitioners in developing design surfaces that can be rapidly iterated. The method introduced herein relies on the discretization of river valley topography into geometric elements associated with overlapping and orthogonal two-dimensional planes, such as the planform, profile, and cross section, that are represented by mathematical functions termed geometric element equations. Topographic surfaces can be parameterized independently or dependently using a geomorphic covariance structure between the spatial series of geometric element equations. To illustrate the approach and overall model flexibility, examples are provided for mountain, lowland, and hybrid synthetic river valleys. To conclude, recommended advances such as multithread channels are discussed along with potential applications.
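
    A minimal sketch of the geometric-element idea in Python: three independently parameterized element equations (sinusoidal planform, linear profile, parabolic cross section) combined into an elevation grid. The functional forms and parameter values are assumed for illustration, and the paper's geomorphic covariance structure between element series is omitted.

        import numpy as np

        def synthetic_valley(nx=200, ny=41, amp=20.0, wavelength=250.0,
                             slope=0.002, depth=2.0, half_width=40.0):
            """Combine planform, profile and cross-section element equations
            into an elevation grid z(x, y)."""
            x = np.linspace(0.0, 1000.0, nx)              # downstream distance (m)
            y = np.linspace(-half_width, half_width, ny)  # across-valley offset (m)
            X, Y = np.meshgrid(x, y, indexing="ij")

            center = amp * np.sin(2.0 * np.pi * X / wavelength)  # planform: meandering centerline
            profile = -slope * X                                 # profile: uniform downstream slope
            xsec = depth * ((Y - center) / half_width) ** 2      # cross section: parabolic channel
            return profile + xsec - depth

        z = synthetic_valley()
        print(z.shape, float(z.min()), float(z.max()))

    Swapping any one element equation (e.g., a stepped profile for pool-riffle morphology) changes the valley style without touching the other two, which is the flexibility the method trades on.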

  16. Identification of Computational and Experimental Reduced-Order Models

    NASA Technical Reports Server (NTRS)

    Silva, Walter A.; Hong, Moeljo S.; Bartels, Robert E.; Piatak, David J.; Scott, Robert C.

    2003-01-01

    The identification of computational and experimental reduced-order models (ROMs) for the analysis of unsteady aerodynamic responses and for efficient aeroelastic analyses is presented. For the identification of a computational aeroelastic ROM, the CFL3Dv6.0 computational fluid dynamics (CFD) code is used. Flutter results for the AGARD 445.6 Wing and for a Rigid Semispan Model (RSM) computed using CFL3Dv6.0 are presented, including discussion of associated computational costs. Modal impulse responses of the unsteady aerodynamic system are computed using the CFL3Dv6.0 code and transformed into state-space form. The unsteady aerodynamic state-space ROM is then combined with a state-space model of the structure to create an aeroelastic simulation using the MATLAB/SIMULINK environment. The MATLAB/SIMULINK ROM is then used to rapidly compute aeroelastic transients, including flutter. The ROM shows excellent agreement with the aeroelastic analyses computed using the CFL3Dv6.0 code directly. For the identification of experimental unsteady pressure ROMs, results are presented for two configurations: the RSM and a Benchmark Supercritical Wing (BSCW). Both models were used to acquire unsteady pressure data due to pitching oscillations on the Oscillating Turntable (OTT) system at the Transonic Dynamics Tunnel (TDT). A deconvolution scheme involving a step input in pitch and the resultant step response in pressure, for several pressure transducers, is used to identify the unsteady pressure impulse responses. The identified impulse responses are then used to predict the pressure responses due to pitching oscillations at several frequencies. Comparisons with the experimental data are then presented.
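
    The abstract does not name the realization algorithm used to transform the impulse responses into state-space form; the Eigensystem Realization Algorithm (ERA) is a standard choice for this step, sketched here for a single-input/single-output response.

        import numpy as np

        def era(markov, order):
            """Eigensystem Realization Algorithm for SISO impulse-response data.

            markov: impulse-response samples h[1], h[2], ... (Markov
                    parameters, excluding the direct-feedthrough term h[0]).
            order:  desired reduced-order model size r.
            Returns discrete-time state-space matrices (A, B, C).
            """
            m = (len(markov) - 1) // 2
            # Hankel matrix and its one-step-shifted counterpart.
            H0 = np.array([[markov[i + j] for j in range(m)] for i in range(m)])
            H1 = np.array([[markov[i + j + 1] for j in range(m)] for i in range(m)])
            U, s, Vt = np.linalg.svd(H0)
            Ur, Vr = U[:, :order], Vt[:order, :].T
            Sr = np.diag(np.sqrt(s[:order]))
            Sinv = np.diag(1.0 / np.sqrt(s[:order]))
            A = Sinv @ Ur.T @ H1 @ Vr @ Sinv
            B = (Sr @ Vr.T)[:, :1]   # first column: single input
            C = (Ur @ Sr)[:1, :]     # first row: single output
            return A, B, C

        # Verify on a known first-order system with pole 0.9 and CB = 1.
        y = [0.9 ** (k - 1) for k in range(1, 30)]
        A, B, C = era(y, order=1)
        print(round(float(A[0, 0]), 6), round((C @ B).item(), 6))  # ~0.9, ~1.0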

  17. Towards anatomic scale agent-based modeling with a massively parallel spatially explicit general-purpose model of enteric tissue (SEGMEnT_HPC).

    PubMed

    Cockrell, Robert Chase; Christley, Scott; Chang, Eugene; An, Gary

    2015-01-01

    Perhaps the greatest challenge currently facing the biomedical research community is the ability to integrate highly detailed cellular and molecular mechanisms to represent clinical disease states as a pathway to engineering effective therapeutics. This is particularly evident in the representation of organ-level pathophysiology in terms of abnormal tissue structure, which, through histology, remains a mainstay in disease diagnosis and staging. As such, being able to generate anatomic-scale simulations is a highly desirable goal. While computational limitations have previously constrained the size and scope of multi-scale computational models, advances in the capacity and availability of high-performance computing (HPC) resources have greatly expanded the ability of computational models of biological systems to achieve anatomic, clinically relevant scale. Diseases of the intestinal tract are exemplary of pathophysiological processes that manifest at multiple scales of spatial resolution, with structural abnormalities present at the microscopic, macroscopic and organ levels. In this paper, we describe a novel, massively parallel computational model of the gut, the Spatially Explicit General-purpose Model of Enteric Tissue_HPC (SEGMEnT_HPC), which extends an existing model of the gut epithelium, SEGMEnT, in order to create cell-for-cell anatomic-scale simulations. We present an example implementation of SEGMEnT_HPC that simulates the pathogenesis of ileal pouchitis, an important clinical entity that affects patients following remedial surgery for ulcerative colitis.

  18. Building Cognition: The Construction of Computational Representations for Scientific Discovery.

    PubMed

    Chandrasekharan, Sanjay; Nersessian, Nancy J

    2015-11-01

    Novel computational representations, such as simulation models of complex systems and video games for scientific discovery (Foldit, EteRNA etc.), are dramatically changing the way discoveries emerge in science and engineering. The cognitive roles played by such computational representations in discovery are not well understood. We present a theoretical analysis of the cognitive roles such representations play, based on an ethnographic study of the building of computational models in a systems biology laboratory. Specifically, we focus on a case of model-building by an engineer that led to a remarkable discovery in basic bioscience. Accounting for such discoveries requires a distributed cognition (DC) analysis, as DC focuses on the roles played by external representations in cognitive processes. However, DC analyses by and large have not examined scientific discovery, and they mostly focus on memory offloading, particularly how the use of existing external representations changes the nature of cognitive tasks. In contrast, we study discovery processes and argue that discoveries emerge from the processes of building the computational representation. The building process integrates manipulations in imagination and in the representation, creating a coupled cognitive system of model and modeler, where the model is incorporated into the modeler's imagination. This account extends DC significantly, and we present some of the theoretical and application implications of this extended account. Copyright © 2014 Cognitive Science Society, Inc.

  19. Interactive design and analysis of future large spacecraft concepts

    NASA Technical Reports Server (NTRS)

    Garrett, L. B.

    1981-01-01

    An interactive computer-aided design program used to perform systems-level design and analysis of large spacecraft concepts is presented. Emphasis is on rapid design, analysis of integrated spacecraft, and automatic spacecraft modeling for lattice structures. Capabilities and performance of multidiscipline applications modules, the executive and data management software, and graphics display features are reviewed. A single user at an interactive terminal can create, design, analyze, and conduct parametric studies of Earth-orbiting spacecraft with relative ease. Data generated in the design, analysis, and performance evaluation of an Earth-orbiting large-diameter antenna satellite are used to illustrate current capabilities. Computer run-time statistics for the individual modules quantify the speed at which modeling, analysis, and design evaluation of integrated spacecraft concepts are accomplished in a user-interactive computing environment.

  20. Challenges in reducing the computational time of QSTS simulations for distribution system analysis.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Deboever, Jeremiah; Zhang, Xiaochen; Reno, Matthew J.

    The rapid increase in penetration of distributed energy resources on the electric power distribution system has created a need for more comprehensive interconnection modelling and impact analysis. Unlike conventional scenario-based studies, quasi-static time-series (QSTS) simulations can realistically model time-dependent voltage controllers and the diversity of potential impacts that can occur at different times of year. However, to accurately model a distribution system with all its controllable devices, a yearlong simulation at 1-second resolution is often required, which could take conventional computers a computational time of 10 to 120 hours when an actual unbalanced distribution feeder is modeled. This computational burden is a clear limitation to the adoption of QSTS simulations in interconnection studies and for determining optimal control solutions for utility operations. Our ongoing research to improve the speed of QSTS simulation has revealed many unique aspects of distribution system modelling and sequential power flow analysis that make fast QSTS a very difficult problem to solve. In this report, the most relevant challenges in reducing the computational time of QSTS simulations are presented: the number of power flows to solve, circuit complexity, time dependence between time steps, multiple valid power flow solutions, controllable element interactions, and extensive accurate simulation analysis.

  1. Computer-aided controllability assessment of generic manned Space Station concepts

    NASA Technical Reports Server (NTRS)

    Ferebee, M. J.; Deryder, L. J.; Heck, M. L.

    1984-01-01

    NASA's Concept Development Group assessment methodology for the on-orbit rigid body controllability characteristics of each generic configuration proposed for the manned space station is presented; the preliminary results obtained represent the first step in the analysis of these eight configurations. Analytical computer models of each configuration were developed by means of the Interactive Design Evaluation of Advanced Spacecraft CAD system, which created three-dimensional geometry models of each configuration to establish dimensional requirements for module connectivity, payload accommodation, and Space Shuttle berthing; mass, center-of-gravity, inertia, and aerodynamic drag areas were then derived. Attention was also given to the preferred flight attitude of each station concept.

  2. A Web Tool for Research in Nonlinear Optics

    NASA Astrophysics Data System (ADS)

    Prikhod'ko, Nikolay V.; Abramovsky, Viktor A.; Abramovskaya, Natalia V.; Demichev, Andrey P.; Kryukov, Alexandr P.; Polyakov, Stanislav P.

    2016-02-01

    This paper presents a project to develop a web platform called WebNLO for computer modeling of nonlinear optics phenomena. We discuss a general scheme of the platform and a model for interaction between the platform modules. The platform is built as a set of interacting RESTful web services (SaaS approach). Users can interact with the platform through a web browser or a command line interface. Such a resource has no analogue in the field of nonlinear optics and is being created for the first time; it will allow researchers to access high-performance computing resources and significantly reduce the cost of the research and development process.

  3. Analysis of a Multi-Fidelity Surrogate for Handling Real Gas Equations of State

    NASA Astrophysics Data System (ADS)

    Ouellet, Frederick; Park, Chanyoung; Rollin, Bertrand; Balachandar, S.

    2017-06-01

    The explosive dispersal of particles is a complex multiphase and multi-species fluid flow problem. In these flows, the detonation products of the explosive must be treated as a real gas while the ideal gas equation of state is used for the surrounding air. As the products expand outward from the detonation point, they mix with ambient air and create a mixing region where both state equations must be satisfied. One of the most accurate, yet computationally expensive, methods to handle this problem is an algorithm that iterates between both equations of state until pressure and thermal equilibrium are achieved inside each computational cell. This work aims to use a multi-fidelity surrogate model to replace this process. A Kriging model is used to produce a curve fit which interpolates selected data from the iterative algorithm using Bayesian statistics. We study the model performance with respect to the iterative method in simulations using a finite volume code. The model's (i) computational speed, (ii) memory requirements and (iii) computational accuracy are analyzed to show the benefits of this novel approach. Also, optimizing the combination of model accuracy and computational speed through the choice of sampling points is explained. This work was supported by the U.S. Department of Energy, National Nuclear Security Administration, Advanced Simulation and Computing Program as a Cooperative Agreement under the Predictive Science Academic Alliance Program under Contract No. DE-NA0002378.
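
    As a rough illustration of the surrogate idea, the sketch below samples an expensive equilibrium-pressure solve at training points and replaces it with a Kriging (Gaussian-process) interpolant. The toy equilibrium function, variable ranges, and kernel settings are illustrative assumptions, not the authors' implementation.

        import numpy as np
        from sklearn.gaussian_process import GaussianProcessRegressor
        from sklearn.gaussian_process.kernels import RBF

        def iterate_equilibrium_pressure(rho, e, y):
            # Toy stand-in for the iterative two-EOS pressure/temperature
            # equilibrium algorithm (collapsed to a closed form for brevity).
            return (1.0 - y) * 0.4 * rho * e + y * (0.25 * rho * e + 1.0e5)

        # Sample the expensive solver at selected (density, energy, mass-fraction) points.
        rng = np.random.default_rng(0)
        X_train = rng.uniform([1.0, 2.0e5, 0.0], [10.0, 2.0e6, 1.0], size=(200, 3))
        p_train = np.array([iterate_equilibrium_pressure(*x) for x in X_train])

        # Kriging curve fit interpolating the sampled data.
        gp = GaussianProcessRegressor(kernel=RBF(length_scale=[1.0, 1.0e5, 0.1]),
                                      normalize_y=True)
        gp.fit(X_train, p_train)

        # Inside the flow solver, a cheap query replaces the per-cell iteration.
        p_cell, p_std = gp.predict(np.array([[4.2, 8.0e5, 0.3]]), return_std=True)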

  4. Theoretical and experimental analysis of amplitude control ablation and bipolar ablation in creating linear lesion and discrete lesions for treating atrial fibrillation.

    PubMed

    Yan, Shengjie; Wu, Xiaomei; Wang, Weiqi

    2017-09-01

    Radiofrequency (RF) energy is often used to create a linear lesion or discrete lesions for blocking the accessory conduction pathways in the treatment of atrial fibrillation. Using finite element analysis, we study the ablation effect of the amplitude control ablation mode (AcM) and the bipolar ablation mode (BiM) in creating a linear lesion and discrete lesions in a 5-mm-thick atrial wall; in particular, the characteristics of lesion shape in amplitude control ablation have been investigated. Computer models of a multipolar catheter were developed to study the lesion dimensions in atrial walls created through AcM, BiM and specially activated electrode ablation methods in AcM and BiM. To validate the theoretical results of this study, an in vitro experiment with porcine cardiac tissue was performed. At 40 V/20 V root mean squared (RMS) RF voltage for AcM, continuous and transmural lesions were created by AcM-15s, AcM-5s and AcM-ad-20V ablation in the 5-mm-thick atrial wall. At 20 V RMS for BiM, a continuous but not transmural lesion was created. AcM ablation yielded asymmetrical and discrete lesion shapes, whereas the lesions became more symmetrical and continuous as the period of alternating electrode activation decreased from 15 s to 5 s. Two discrete lesions were created when using AcM, AcM-ad-40V, BiM-ad-20V and BiM-ad-40V. The experimental and computational thermal lesion shapes created in cardiac tissue were in agreement. Amplitude control ablation technology and bipolar ablation technology are feasible methods to create continuous or discrete lesions for pulmonary vein isolation.

  5. The Software Design Document: More than a User's Manual.

    ERIC Educational Resources Information Center

    Bowers, Dennis

    1989-01-01

    Discusses the value of creating design documentation for computer software so that it may serve as a model for similar design efforts. Components of the software design document are described, including program flowcharts, graphic representation of screen displays, storyboards, and evaluation procedures. An example is given using HyperCard. (three…

  6. Modeling and Validation of Lithium-ion Automotive Battery Packs (SAE 2013-01-1539)

    EPA Science Inventory

    The Advanced Light-Duty Powertrain and Hybrid Analysis (ALPHA) tool was created by EPA to evaluate the Greenhouse Gas (GHG) emissions of Light-Duty (LD) vehicles. It is a physics-based, forward-looking, full vehicle computer simulator capable of analyzing various vehicle types c...

  7. Computational Approach using Mouse Embryonic Stem Cells to Define a Mechanistic Applicability Domain for Prenatal Developmental Toxicity

    EPA Science Inventory

    Identification of mechanisms responsible for adverse developmental effects is the first step in creating predictive toxicity models. Identification of putative mechanisms was performed by co-analyzing three datasets for the effects of ToxCast phase Ia and II chemicals: 1.In vitro...

  8. Technical Drafting and Mental Visualization in Interior Architecture Education

    ERIC Educational Resources Information Center

    Arslan, Ali Riza; Dazkir, Sibel Seda

    2017-01-01

    We explored how beginning-level interior architecture students develop skills to create mental visualizations of three-dimensional objects and environments, how they develop their technical drawing skills, and whether or not physical and computer generated models aid this design process. We used interviews and observations to collect data. The…

  9. Air Force Laboratory’s 2005 Technology Milestones

    DTIC Science & Technology

    2006-01-01

    Computational materials science methods can benefit the design and property prediction of complex real-world materials. With these models, scientists and...Scientists created the High-Frequency Acoustic Suppression Technology (HiFAST) airflow control

  10. On the Shoulders of Giants: New Approaches to Numeracy.

    ERIC Educational Resources Information Center

    Steen, Lynn Arthur, Ed.

    Forces created by the proliferation of computer hardware and software, by innovative methods of mathematical modelling and applications, by broader demographic considerations, and by schools themselves are profoundly changing the way mathematics is practiced, the way mathematics is taught, and the way mathematics is learned. In this volume, a…

  11. To the Cloud! A Grassroots Proposal to Accelerate Brain Science Discovery

    PubMed Central

    Vogelstein, Joshua T.; Mensh, Brett; Hausser, Michael; Spruston, Nelson; Evans, Alan; Kording, Konrad; Amunts, Katrin; Ebell, Christoph; Muller, Jeff; Telefont, Martin; Hill, Sean; Koushika, Sandhya P.; Cali, Corrado; Valdés-Sosa, Pedro Antonio; Littlewood, Peter; Koch, Christof; Saalfeld, Stephan; Kepecs, Adam; Peng, Hanchuan; Halchenko, Yaroslav O.; Kiar, Gregory; Poo, Mu-Ming; Poline, Jean-Baptiste; Milham, Michael P.; Schaffer, Alyssa Picchini; Gidron, Rafi; Okano, Hideyuki; Calhoun, Vince D; Chun, Miyoung; Kleissas, Dean M.; Vogelstein, R. Jacob; Perlman, Eric; Burns, Randal; Huganir, Richard; Miller, Michael I.

    2018-01-01

    The revolution in neuroscientific data acquisition is creating an analysis challenge. We propose leveraging cloud-computing technologies to enable large-scale neurodata storing, exploring, analyzing, and modeling. This utility will empower scientists globally to generate and test theories of brain function and dysfunction. PMID:27810005

  12. School System Simulation: An Effective Model for Educational Leaders.

    ERIC Educational Resources Information Center

    Nelson, Jorge O.

    This study reviews the literature regarding the theoretical rationale for creating a computer-based school system simulation for educational leaders' use in problem solving and decision making. Like all social systems, educational systems are so complex that individuals are hard-pressed to consider all interrelated parts as a totality. A…

  13. Modeling and Analysis of Large Amplitude Flight Maneuvers

    NASA Technical Reports Server (NTRS)

    Anderson, Mark R.

    2004-01-01

    Analytical methods for stability analysis of large amplitude aircraft motion have been slow to develop because many nonlinear system stability assessment methods are restricted to a state-space dimension of less than three. The proffered approach is to create regional cell-to-cell maps for strategically located two-dimensional subspaces within the higher-dimensional model state-space. These regional solutions capture nonlinear behavior better than linearized point solutions. They also avoid the computational difficulties that emerge when attempting to create a cell map for the entire state-space. Example stability results are presented for a general aviation aircraft and a micro-aerial vehicle configuration. The analytical results are consistent with characteristics that were discovered during previous flight-testing.
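
    A minimal sketch of the cell-to-cell mapping idea on a two-dimensional subspace appears below; the placeholder dynamics are a damped pendulum, not an aircraft model, and the grid size and iteration count are arbitrary choices.

        import numpy as np

        def step(x, dt=0.05):
            # Placeholder one-step flow map on a 2D subspace (damped pendulum).
            a, q = x
            return np.array([a + dt * q, q + dt * (-np.sin(a) - 0.3 * q)])

        N = 50                                          # cells per axis
        lo, hi = np.array([-3.0, -3.0]), np.array([3.0, 3.0])

        def cell_index(x):
            if np.any(x < lo) or np.any(x >= hi):
                return -1                               # sink cell: left the region
            i, j = ((x - lo) / (hi - lo) * N).astype(int)
            return i * N + j

        # Map each cell center one step forward to build the regional cell map.
        centers = [lo + (np.array([i, j]) + 0.5) * (hi - lo) / N
                   for i in range(N) for j in range(N)]
        image = np.array([cell_index(step(c)) for c in centers])

        # Cells whose iterates never leave the region approximate the stable set.
        stable = set(range(N * N))
        for _ in range(200):
            stable = {c for c in stable if image[c] in stable}
        print(len(stable), "of", N * N, "cells estimated stable")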

  14. Using Apex To Construct CPM-GOMS Models

    NASA Technical Reports Server (NTRS)

    John, Bonnie; Vera, Alonso; Matessa, Michael; Freed, Michael; Remington, Roger

    2006-01-01

    A process for automatically generating computational models of human/computer interactions, as well as graphical and textual representations of the models, has been built on the conceptual foundation of a method known in the art as CPM-GOMS. This method is so named because it combines (1) the task decomposition of analysis according to an underlying method known in the art as the goals, operators, methods, and selection (GOMS) method with (2) a model of human resource usage at the level of cognitive, perceptual, and motor (CPM) operations. CPM-GOMS models have made accurate predictions about behaviors of skilled computer users in routine tasks, but heretofore, such models have been generated in a tedious, error-prone manual process. In the present process, CPM-GOMS models are generated automatically from a hierarchical task decomposition expressed by use of a computer program, known as Apex, designed previously to be used to model human behavior in complex, dynamic tasks. An inherent capability of Apex for scheduling of resources automates the difficult task of interleaving the cognitive, perceptual, and motor resources that underlie common task operators (e.g., move and click mouse). The user interface of Apex automatically generates Program Evaluation Review Technique (PERT) charts, which enable modelers to visualize the complex parallel behavior represented by a model. Because interleaving and the generation of displays to aid visualization are automated, it is now feasible to construct arbitrarily long sequences of behaviors. The process was tested by using Apex to create a CPM-GOMS model of a relatively simple human/computer-interaction task and comparing the time predictions of the model with measurements of the times taken by human users in performing the various steps of the task. The task was to withdraw $80 in cash from an automated teller machine (ATM). For the test, a Visual Basic mockup of an ATM was created, with a provision for input from (and measurement of the performance of) the user via a mouse. The times predicted by the automatically generated model turned out to approximate the measured times fairly well. While these results are promising, there is a need for further development of the process. Moreover, it will also be necessary to test other, more complex models: the actions required of the user in the ATM task are too sequential to involve substantial parallelism and interleaving and, hence, do not serve as an adequate test of the unique strength of CPM-GOMS models to accommodate parallelism and interleaving.

  15. Biomechanical testing simulation of a cadaver spine specimen: development and evaluation study.

    PubMed

    Ahn, Hyung Soo; DiAngelo, Denis J

    2007-05-15

    This article describes a computer model of the cadaver cervical spine specimen and virtual biomechanical testing. To develop a graphics-oriented, multibody model of a cadaver cervical spine and to build a virtual laboratory simulator for the biomechanical testing using physics-based dynamic simulation techniques. Physics-based computer simulations apply the laws of physics to solid bodies with defined material properties. This technique can be used to create a virtual simulator for the biomechanical testing of a human cadaver spine. An accurate virtual model and simulation would complement tissue-based in vitro studies by providing a consistent test bed with minimal variability and by reducing cost. The geometry of cervical vertebrae was created from computed tomography images. Joints linking adjacent vertebrae were modeled as a triple-joint complex, comprised of intervertebral disc joints in the anterior region, 2 facet joints in the posterior region, and the surrounding ligament structure. A virtual laboratory simulation of an in vitro testing protocol was performed to evaluate the model responses during flexion, extension, and lateral bending. For kinematic evaluation, the rotation of motion segment unit, coupling behaviors, and 3-dimensional helical axes of motion were analyzed. The simulation results were in correlation with the findings of in vitro tests and published data. For kinetic evaluation, the forces of the intervertebral discs and facet joints of each segment were determined and visually animated. This methodology produced a realistic visualization of in vitro experiment, and allowed for the analyses of the kinematics and kinetics of the cadaver cervical spine. With graphical illustrations and animation features, this modeling technique has provided vivid and intuitive information.

  16. Microarray-based cancer prediction using soft computing approach.

    PubMed

    Wang, Xiaosheng; Gotoh, Osamu

    2009-05-26

    One of the difficulties in using gene expression profiles to predict cancer is how to effectively select a few informative genes to construct accurate prediction models from thousands or tens of thousands of genes. We screen highly discriminative genes and gene pairs to create simple prediction models involving single genes or gene pairs on the basis of a soft computing approach and rough set theory. Accurate cancer prediction is obtained when we apply the simple prediction models to four cancer gene expression datasets: CNS tumor, colon tumor, lung cancer and DLBCL. Some genes closely correlated with the pathogenesis of specific or general cancers are identified. In contrast with other models, our models are simple, effective and robust. Meanwhile, our models are interpretable, for they are based on decision rules. Our results demonstrate that very simple models may perform well on molecular cancer prediction and that important gene markers of cancer can be detected if the gene selection approach is chosen reasonably.

  17. Optimum structural sizing of conventional cantilever and joined wing configurations using equivalent beam models

    NASA Technical Reports Server (NTRS)

    Hajela, P.; Chen, J. L.

    1986-01-01

    The present paper describes an approach for the optimum sizing of single and joined wing structures that is based on representing the built-up finite element model of the structure by an equivalent beam model. The low order beam model is computationally more efficient in an environment that requires repetitive analysis of several trial designs. The design procedure is implemented in a computer program that requires geometry and loading data typically available from an aerodynamic synthesis program, to create the finite element model of the lifting surface and an equivalent beam model. A fully stressed design procedure is used to obtain rapid estimates of the optimum structural weight for the beam model for a given geometry, and a qualitative description of the material distribution over the wing structure. The synthesis procedure is demonstrated for representative single wing and joined wing structures.

  18. Design Of Computer Based Test Using The Unified Modeling Language

    NASA Astrophysics Data System (ADS)

    Tedyyana, Agus; Danuri; Lidyawati

    2017-12-01

    Admission selection at Politeknik Negeri Bengkalis through interest and talent search (PMDK), the Joint Selection admission test for state Polytechnics (SB-UMPN) and the Independent route (UM-Polbeng) was conducted using Paper-Based Tests (PBT). The Paper-Based Test model has some weaknesses: it wastes too much paper, questions can leak to the public, and test results can be manipulated. This research aimed to create a Computer-Based Test (CBT) model by using the Unified Modeling Language (UML), which consists of use case diagrams, activity diagrams and sequence diagrams. During the design of the application, it is important to pay attention to how the test questions are password-protected before they are shown, through an encryption and decryption process; the RSA cryptography algorithm was used for this purpose. The questions drawn from the question banks were then randomized using the Fisher-Yates shuffle method. The network architecture used in the Computer-Based Test application was a client-server model over a Local Area Network (LAN). The result of the design was the Computer-Based Test application for admission selection at Politeknik Negeri Bengkalis.
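
    The Fisher-Yates shuffle named in the abstract is a standard linear-time algorithm for unbiased permutation; a minimal sketch follows, with illustrative question identifiers rather than the system's actual question bank.

        import secrets

        def fisher_yates_shuffle(items):
            # Swap each position with a uniformly chosen earlier-or-equal index.
            items = list(items)
            for i in range(len(items) - 1, 0, -1):
                j = secrets.randbelow(i + 1)    # cryptographic RNG suits an exam setting
                items[i], items[j] = items[j], items[i]
            return items

        exam = fisher_yates_shuffle(["question_%d" % n for n in range(1, 41)])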

  19. IMPACT: a generic tool for modelling and simulating public health policy.

    PubMed

    Ainsworth, J D; Carruthers, E; Couch, P; Green, N; O'Flaherty, M; Sperrin, M; Williams, R; Asghar, Z; Capewell, S; Buchan, I E

    2011-01-01

    Populations are under-served by local health policies and management of resources. This partly reflects a lack of realistically complex models to enable appraisal of a wide range of potential options. Rising computing power coupled with advances in machine learning and healthcare information now enables such models to be constructed and executed. However, such models are not generally accessible to public health practitioners who often lack the requisite technical knowledge or skills. To design and develop a system for creating, executing and analysing the results of simulated public health and healthcare policy interventions, in ways that are accessible and usable by modellers and policy-makers. The system requirements were captured and analysed in parallel with the statistical method development for the simulation engine. From the resulting software requirement specification the system architecture was designed, implemented and tested. A model for Coronary Heart Disease (CHD) was created and validated against empirical data. The system was successfully used to create and validate the CHD model. The initial validation results show concordance between the simulation results and the empirical data. We have demonstrated the ability to connect health policy-modellers and policy-makers in a unified system, thereby making population health models easier to share, maintain, reuse and deploy.

  20. Customized "In-Office" Three-Dimensional Printing for Virtual Surgical Planning in Craniofacial Surgery.

    PubMed

    Mendez, Bernardino M; Chiodo, Michael V; Patel, Parit A

    2015-07-01

    Virtual surgical planning using three-dimensional (3D) printing technology has improved surgical efficiency and precision. A limitation to this technology is that production of 3D surgical models requires a third-party source, leading to increased costs (up to $4000) and prolonged assembly times (averaging 2-3 weeks). The purpose of this study is to evaluate the feasibility, cost, and production time of customized skull models created by an "in-office" 3D printer for craniofacial reconstruction. Two patients underwent craniofacial reconstruction with the assistance of "in-office" 3D printing technology. Three-dimensional skull models were created from a bioplastic filament with a 3D printer using computed tomography (CT) image data. The cost and production time for each model were measured. For both patients, a customized 3D surgical model was used preoperatively to plan split calvarial bone grafting and intraoperatively to more efficiently and precisely perform the craniofacial reconstruction. The average cost for surgical model production with the "in-office" 3D printer was $25 (cost of bioplastic materials used to create the surgical model) and the average production time was 14 hours. Virtual surgical planning using "in-office" 3D printing is feasible and allows for a more cost-effective and less time-consuming method for creating surgical models and guides. By bringing 3D printing to the office setting, we hope to improve intraoperative efficiency, surgical precision, and overall cost for various types of craniofacial and reconstructive surgery.

  1. Computer Model Locates Environmental Hazards

    NASA Technical Reports Server (NTRS)

    2008-01-01

    Catherine Huybrechts Burton founded San Francisco-based Endpoint Environmental (2E) LLC in 2005 while she was a student intern and project manager at Ames Research Center with NASA's DEVELOP program. The 2E team created the Tire Identification from Reflectance model, which algorithmically processes satellite images using turnkey technology to retain only the darkest parts of an image. This model allows 2E to locate piles of rubber tires, which often are stockpiled illegally and cause hazardous environmental conditions and fires.

  2. Three-dimensional model of the skull and the cranial bones reconstructed from CT scans designed for rapid prototyping process.

    PubMed

    Skrzat, Janusz; Spulber, Alexandru; Walocha, Jerzy

    This paper presents the effects of building mesh models of the human skull and the cranial bones from a series of CT scans. With the aid of computer software, 3D reconstructions of the whole skull and segmented cranial bones were performed and visualized by surface rendering techniques. The article briefly discusses clinical and educational applications of 3D cranial models created using stereolithographic reproduction.

  3. Mining for Data

    NASA Technical Reports Server (NTRS)

    1998-01-01

    AbTech Corporation used an F-18 HARV (High Alpha Research Vehicle) simulation developed by NASA to create an interactive computer-based prototype of the MQ (Model Quest) SV (System Validator) tool. Dryden Flight Research Center provided support to develop, test, and rapidly reprogram the validation function. AbTech's ModelQuest Enterprises is highly automated and outperforms other modeling techniques, quickly discovering meaningful relationships, patterns, and trends in databases. Applications include technical and business professionals in finance, marketing, business, banking, retail, healthcare, and aerospace.

  4. Assessment of Life Cycle Information Exchanges (LCie): Understanding the Value-Added Benefit of a COBie Process

    DTIC Science & Technology

    2013-10-01

    exchange (COBie), Building Information Modeling (BIM), value-added analysis, business processes, project management...equipment. The innovative aspect of Building Information Modeling (BIM) is that it creates a computable building description. The ability to use a...interoperability. In order for the building information to be interoperable, it must also conform to a common data model, or schema, that defines the class

  5. User's Guide for ENSAERO_FE Parallel Finite Element Solver

    NASA Technical Reports Server (NTRS)

    Eldred, Lloyd B.; Guruswamy, Guru P.

    1999-01-01

    A high fidelity parallel static structural analysis capability is created and interfaced to the multidisciplinary analysis package ENSAERO-MPI of Ames Research Center. This new module replaces ENSAERO's lower fidelity simple finite element and modal modules. Full aircraft structures may be more accurately modeled using the new finite element capability. Parallel computation is performed by breaking the full structure into multiple substructures. This approach is conceptually similar to ENSAERO's multizonal fluid analysis capability. The new substructure code is used to solve the structural finite element equations for each substructure in parallel. NASTRAN/COSMIC is utilized as a front end for this code. Its full library of elements can be used to create an accurate and realistic aircraft model. It is used to create the stiffness matrices for each substructure. The new parallel code then uses an iterative preconditioned conjugate gradient method to solve the global structural equations for the substructure boundary nodes.

  6. Creating and parameterizing patient-specific deep brain stimulation pathway-activation models using the hyperdirect pathway as an example.

    PubMed

    Gunalan, Kabilar; Chaturvedi, Ashutosh; Howell, Bryan; Duchin, Yuval; Lempka, Scott F; Patriat, Remi; Sapiro, Guillermo; Harel, Noam; McIntyre, Cameron C

    2017-01-01

    Deep brain stimulation (DBS) is an established clinical therapy and computational models have played an important role in advancing the technology. Patient-specific DBS models are now common tools in both academic and industrial research, as well as clinical software systems. However, the exact methodology for creating patient-specific DBS models can vary substantially and important technical details are often missing from published reports. Provide a detailed description of the assembly workflow and parameterization of a patient-specific DBS pathway-activation model (PAM) and predict the response of the hyperdirect pathway to clinical stimulation. Integration of multiple software tools (e.g. COMSOL, MATLAB, FSL, NEURON, Python) enables the creation and visualization of a DBS PAM. An example DBS PAM was developed using 7T magnetic resonance imaging data from a single unilaterally implanted patient with Parkinson's disease (PD). This detailed description implements our best computational practices and most elaborate parameterization steps, as defined from over a decade of technical evolution. Pathway recruitment curves and strength-duration relationships highlight the non-linear response of axons to changes in the DBS parameter settings. Parameterization of patient-specific DBS models can be highly detailed and constrained, thereby providing confidence in the simulation predictions, but at the expense of time demanding technical implementation steps. DBS PAMs represent new tools for investigating possible correlations between brain pathway activation patterns and clinical symptom modulation.
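
    Strength-duration curves of the kind mentioned are often summarized with the classic Weiss/Lapicque relation; the sketch below uses that standard form with illustrative rheobase and chronaxie values, not parameters fitted in the study.

        def threshold_amplitude(pulse_width_us, rheobase_ma=1.0, chronaxie_us=150.0):
            # Lapicque form: thresholds rise sharply for pulses shorter than the chronaxie.
            return rheobase_ma * (1.0 + chronaxie_us / pulse_width_us)

        for pw in (60.0, 90.0, 120.0, 450.0):   # common DBS pulse widths, microseconds
            print(pw, threshold_amplitude(pw))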

  7. Flight Dynamics of Flexible Aircraft with Aeroelastic and Inertial Force Interactions

    NASA Technical Reports Server (NTRS)

    Nguyen, Nhan T.; Tuzcu, Ilhan

    2009-01-01

    This paper presents an integrated flight dynamic modeling method for flexible aircraft that captures coupled physics effects due to inertial forces, aeroelasticity, and propulsive forces that are normally present in flight. The present approach formulates the coupled flight dynamics using a structural dynamic modeling method that describes the elasticity of a flexible, twisted, swept wing using an equivalent beam-rod model. The structural dynamic model allows for three types of wing elastic motion: flapwise bending, chordwise bending, and torsion. Inertial force coupling with the wing elasticity is formulated to account for aircraft acceleration. The structural deflections create an effective aeroelastic angle of attack that affects the rigid-body motion of flexible aircraft. The aeroelastic effect contributes to aerodynamic damping forces that can influence aerodynamic stability. For wing-mounted engines, wing flexibility can cause the propulsive forces and moments to couple with the wing elastic motion. The integrated flight dynamics for a flexible aircraft are formulated by including generalized coordinate variables associated with the aeroelastic-propulsive forces and moments in the standard state-space form for six degree-of-freedom flight dynamics. A computational structural model for a generic transport aircraft has been created. The eigenvalue analysis is performed to compute aeroelastic frequencies and aerodynamic damping. The results will be used to construct an integrated flight dynamic model of a flexible generic transport aircraft.

  8. Coherence in the Visual Imagination.

    PubMed

    Vertolli, Michael O; Kelly, Matthew A; Davies, Jim

    2018-04-01

    An incoherent visualization is one in which aspects of different senses of a word (e.g., the biological "mouse" vs. the computer "mouse") are present in the same visualization (e.g., a visualization of a biological mouse in the same image with a computer tower). We describe and implement a new model of creating contextual coherence in the visual imagination called Coherencer, based on the SOILIE model of imagination. We show that Coherencer is able to generate scene descriptions that are more coherent than SOILIE's original approach as well as a parallel connectionist algorithm that is considered competitive in the literature on general coherence. We also show that co-occurrence probabilities are a better association representation than holographic vectors and that better models of coherence improve the resulting output independent of the association type that is used. Theoretically, we show that Coherencer is consistent with other models of cognitive generation. In particular, Coherencer is a similar, but more cognitively plausible, model than the C3 model of concept combination created by Costello and Keane (2000). We show that Coherencer is also consistent with both the modal schematic indices of perceptual symbol systems theory (Barsalou, 1999) and the amodal contextual constraints of Thagard's (2002) theory of coherence. Finally, we describe how Coherencer is consistent with contemporary research on the hippocampus, and we show evidence that the process of making a visualization coherent is serial. Copyright © 2017 Cognitive Science Society, Inc.

  9. Parallel Computing for Brain Simulation.

    PubMed

    Pastur-Romay, L A; Porto-Pazos, A B; Cedron, F; Pazos, A

    2017-01-01

    The human brain is the most complex system in the known universe and therefore one of its greatest mysteries. It provides human beings with extraordinary abilities. However, it is not yet understood how and why most of these abilities are produced. For decades, researchers have been trying to make computers reproduce these abilities, focusing both on understanding the nervous system and on processing data more efficiently than before. Their aim is to make computers process information similarly to the brain. Important technological developments and vast multidisciplinary projects have allowed the creation of the first simulation with a number of neurons similar to that of a human brain. This paper presents an up-to-date review of the main research projects that are trying to simulate and/or emulate the human brain. They employ different types of computational models using parallel computing: digital models, analog models and hybrid models. This review includes the current applications of these works, as well as future trends. It is focused on works that seek advanced progress in Neuroscience and others which seek new discoveries in Computer Science (neuromorphic hardware, machine learning techniques). Their most outstanding characteristics are summarized and the latest advances and future plans are presented. In addition, this review points out the importance of considering not only neurons: computational models of the brain should also include glial cells, given the proven importance of astrocytes in information processing. Copyright © Bentham Science Publishers; for any queries, please email epub@benthamscience.org.

  10. Process and representation in graphical displays

    NASA Technical Reports Server (NTRS)

    Gillan, Douglas J.; Lewis, Robert; Rudisill, Marianne

    1993-01-01

    Our initial model of graphic comprehension has focused on statistical graphs. Like other models of human-computer interaction, models of graphical comprehension can be used by human-computer interface designers and developers to create interfaces that present information in an efficient and usable manner. Our investigation of graph comprehension addresses two primary questions: how do people represent the information contained in a data graph, and how do they process information from the graph? The topics of focus for graphic representation concern the features into which people decompose a graph and the representations of the graph in memory. The issue of processing can be further analyzed as two questions: what overall processing strategies do people use, and what are the specific processing skills required?

  11. Modeling Code Is Helping Cleveland Develop New Products

    NASA Technical Reports Server (NTRS)

    1998-01-01

    Master Builders, Inc., is a 350-person company in Cleveland, Ohio, that develops and markets specialty chemicals for the construction industry. Developing new products involves creating many potential samples and running numerous tests to characterize the samples' performance. Company engineers enlisted NASA's help to replace cumbersome physical testing with computer modeling of the samples' behavior. Since the NASA Lewis Research Center's Structures Division develops mathematical models and associated computation tools to analyze the deformation and failure of composite materials, its researchers began a two-phase effort to modify Lewis' Integrated Composite Analyzer (ICAN) software for Master Builders' use. Phase I has been completed, and Master Builders is pleased with the results. The company is now working to begin implementation of Phase II.

  12. Local and global Λ polarization in a vortical fluid

    DOE PAGES

    Li, Hui; Petersen, Hannah; Pang, Long -Gang; ...

    2017-09-25

    We compute the fermion spin distribution in the vortical fluid created in off-central high energy heavy-ion collisions. We employ the event-by-event (3+1)D viscous hydrodynamic model. The spin polarization density is proportional to the local fluid vorticity in quantum kinetic theory. As a result of strong collectivity, the spatial distribution of the local vorticity on the freeze-out hyper-surface strongly correlates with the rapidity and azimuthal angle distribution of fermion spins. We investigate the sensitivity of the local polarization to the initial fluid velocity in the hydrodynamic model and compute the global polarization of Λ hyperons with the AMPT model. The energy dependence of the global polarization agrees with the STAR data.

  13. Local and global Λ polarization in a vortical fluid

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Hui; Petersen, Hannah; Pang, Long -Gang

    We compute the fermion spin distribution in the vortical fluid created in off-central high energy heavy-ion collisions. We employ the event-by-event (3+1)D viscous hydrodynamic model. The spin polarization density is proportional to the local fluid vorticity in quantum kinetic theory. As a result of strong collectivity, the spatial distribution of the local vorticity on the freeze-out hyper-surface strongly correlates with the rapidity and azimuthal angle distribution of fermion spins. We investigate the sensitivity of the local polarization to the initial fluid velocity in the hydrodynamic model and compute the global polarization of Λ hyperons with the AMPT model. The energy dependence of the global polarization agrees with the STAR data.

  14. Large-Eddy Simulation of Internal Flow through Human Vocal Folds

    NASA Astrophysics Data System (ADS)

    Lasota, Martin; Šidlof, Petr

    2018-06-01

    The phonatory process occurs when air is expelled from the lungs through the glottis and the pressure drop causes flow-induced oscillations of the vocal folds. The flow fields created in phonation are highly unsteady and coherent vortex structures are generated. For accuracy, it is essential to compute on a humanlike computational domain with an appropriate mathematical model. The work deals with numerical simulation of air flow within the space between the plicae vocales and plicae vestibulares. In addition to the dynamic width of the rima glottidis, where the sound is generated, the lateral ventriculus laryngis and sacculus laryngis are included in the computational domain as well. The paper presents results from OpenFOAM obtained with large-eddy simulation using second-order finite volume discretization of the incompressible Navier-Stokes equations. Large-eddy simulations with different subgrid scale models are executed on a structured mesh. In these cases, only subgrid scale models that represent turbulence via a turbulent viscosity and the Boussinesq approximation are used in the subglottal and supraglottal areas of the larynx.
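
    As an illustration of such a turbulent-viscosity closure, the sketch below evaluates the classic Smagorinsky subgrid-scale model for a single cell; OpenFOAM provides several subgrid models of this family, and the constant and gradient values here are illustrative only.

        import numpy as np

        def smagorinsky_nut(grad_u, delta, cs=0.17):
            # nu_t = (Cs * Delta)^2 * |S|, with |S| = sqrt(2 S_ij S_ij).
            S = 0.5 * (grad_u + grad_u.T)        # resolved strain-rate tensor
            S_mag = np.sqrt(2.0 * np.sum(S * S))
            return (cs * delta) ** 2 * S_mag

        grad_u = np.array([[100.0, 2000.0, 0.0],   # du_i/dx_j in 1/s (toy values)
                           [10.0, -100.0, 0.0],
                           [0.0, 0.0, 0.0]])
        print(smagorinsky_nut(grad_u, delta=1.0e-4))   # ~0.1 mm filter width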

  15. Computational Aerodynamic Analysis of Three-Dimensional Ice Shapes on a NACA 23012 Airfoil

    NASA Technical Reports Server (NTRS)

    Jun, GaRam; Oliden, Daniel; Potapczuk, Mark G.; Tsao, Jen-Ching

    2014-01-01

    The present study identifies a process for performing computational fluid dynamic calculations of the flow over full three-dimensional (3D) representations of complex ice shapes deposited on aircraft surfaces. Rime and glaze icing geometries formed on a NACA23012 airfoil were obtained during testing in the NASA Glenn Research Center's Icing Research Tunnel (IRT). The ice shape geometries were scanned as a cloud of data points using a 3D laser scanner. The data point clouds were meshed using Geomagic software to create highly accurate models of the ice surface. The surface data was imported into Pointwise grid generation software to create the CFD surface and volume grids. It was determined that generating grids in Pointwise for complex 3D icing geometries was possible using various techniques that depended on the ice shape. Computations of the flow fields over these ice shapes were performed using the NASA National Combustion Code (NCC). Results for a rime ice shape for angle of attack conditions ranging from 0 to 10 degrees and for freestream Mach numbers of 0.10 and 0.18 are presented. For validation of the computational results, comparisons were made to test results from rapid-prototype models of the selected ice accretion shapes, obtained from a separate study in a subsonic wind tunnel at the University of Illinois at Urbana-Champaign. The computational and experimental results were compared for values of pressure coefficient and lift. Initial results show fairly good agreement for rime ice accretion simulations across the range of conditions examined. The glaze ice results are promising but require some further examination.

  16. Computational Aerodynamic Analysis of Three-Dimensional Ice Shapes on a NACA 23012 Airfoil

    NASA Technical Reports Server (NTRS)

    Jun, Garam; Oliden, Daniel; Potapczuk, Mark G.; Tsao, Jen-Ching

    2014-01-01

    The present study identifies a process for performing computational fluid dynamic calculations of the flow over full three-dimensional (3D) representations of complex ice shapes deposited on aircraft surfaces. Rime and glaze icing geometries formed on a NACA23012 airfoil were obtained during testing in the NASA Glenn Research Center's Icing Research Tunnel (IRT). The ice shape geometries were scanned as a cloud of data points using a 3D laser scanner. The data point clouds were meshed using Geomagic software to create highly accurate models of the ice surface. The surface data was imported into Pointwise grid generation software to create the CFD surface and volume grids. It was determined that generating grids in Pointwise for complex 3D icing geometries was possible using various techniques that depended on the ice shape. Computations of the flow fields over these ice shapes were performed using the NASA National Combustion Code (NCC). Results for a rime ice shape for angle of attack conditions ranging from 0 to 10 degrees and for freestream Mach numbers of 0.10 and 0.18 are presented. For validation of the computational results, comparisons were made to test results from rapid-prototype models of the selected ice accretion shapes, obtained from a separate study in a subsonic wind tunnel at the University of Illinois at Urbana-Champaign. The computational and experimental results were compared for values of pressure coefficient and lift. Initial results show fairly good agreement for rime ice accretion simulations across the range of conditions examined. The glaze ice results are promising but require some further examination.

  17. Spring assisted cranioplasty: A patient specific computational model.

    PubMed

    Borghi, Alessandro; Rodriguez-Florez, Naiara; Rodgers, Will; James, Gregory; Hayward, Richard; Dunaway, David; Jeelani, Owase; Schievano, Silvia

    2018-03-01

    Implantation of spring-like distractors in the treatment of sagittal craniosynostosis is a novel technique that has proven functionally and aesthetically effective in correcting skull deformities; however, final shape outcomes remain moderately unpredictable due to an incomplete understanding of the skull-distractor interaction. The aim of this study was to create a patient specific computational model of spring assisted cranioplasty (SAC) that can help predict the individual overall final head shape. Pre-operative computed tomography images of a SAC patient were processed to extract a 3D model of the infant skull anatomy and simulate spring implantation. The distractors were modeled based on mechanical experimental data. Viscoelastic bone properties from the literature were tuned using the specific patient procedural information recorded during surgery and from x-ray measurements at follow-up. The model accurately captured spring expansion on-table (within 9% of the measured values), as well as at first and second follow-ups (within 8% of the measured values). Comparison between immediate post-operative 3D head scanning and numerical results for this patient proved that the model could successfully predict the final overall head shape. This preliminary work showed the potential application of computational modeling to study SAC, to support pre-operative planning and guide novel distractor design. Copyright © 2018 IPEM. Published by Elsevier Ltd. All rights reserved.

  18. Computational Knee Ligament Modeling Using Experimentally Determined Zero-Load Lengths

    PubMed Central

    Bloemker, Katherine H; Guess, Trent M; Maletsky, Lorin; Dodd, Kevin

    2012-01-01

    This study presents a subject-specific method of determining the zero-load lengths of the cruciate and collateral ligaments in computational knee modeling. Three cadaver knees were tested in a dynamic knee simulator. The cadaver knees also underwent manual envelope of motion testing to find their passive range of motion in order to determine the zero-load lengths for each ligament bundle. Computational multibody knee models were created for each knee and model kinematics were compared to experimental kinematics for a simulated walk cycle. One-dimensional non-linear spring damper elements were used to represent cruciate and collateral ligament bundles in the knee models. This study found that knee kinematics were highly sensitive to altering of the zero-load length. The results also suggest optimal methods for defining each of the ligament bundle zero-load lengths, regardless of the subject. These results verify the importance of the zero-load length when modeling the knee joint and verify that manual envelope of motion measurements can be used to determine the passive range of motion of the knee joint. It is also believed that the method described here for determining zero-load length can be used for in vitro or in vivo subject-specific computational models. PMID:22523522
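
    One common form for such one-dimensional non-linear bundle elements is a quadratic-toe/linear spring referenced to the zero-load length, as sketched below; the stiffness, toe strain, and damping values are illustrative, not the study's parameters.

        def ligament_force(length, zero_load_length, vel, k=5000.0, eps_t=0.03, c=1.0):
            # Bundle strain relative to the experimentally determined zero-load length.
            eps = (length - zero_load_length) / zero_load_length
            if eps <= 0.0:
                return 0.0                        # slack: no force below zero-load length
            if eps <= 2.0 * eps_t:
                f = 0.25 * k * eps ** 2 / eps_t   # quadratic "toe" region
            else:
                f = k * (eps - eps_t)             # linear region
            return f + c * vel                    # spring plus viscous damping

        # Example: a 32 mm bundle with a 30 mm zero-load length, lengthening slowly.
        print(ligament_force(0.032, 0.030, vel=0.001))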

  19. Development and validation of a computational model of the knee joint for the evaluation of surgical treatments for osteoarthritis

    PubMed Central

    Mootanah, R.; Imhauser, C.W.; Reisse, F.; Carpanen, D.; Walker, R.W.; Koff, M.F.; Lenhoff, M.W.; Rozbruch, S.R.; Fragomen, A.T.; Dewan, Z.; Kirane, Y.M.; Cheah, Pamela A.; Dowell, J.K.; Hillstrom, H.J.

    2014-01-01

    A three-dimensional (3D) knee joint computational model was developed and validated to predict knee joint contact forces and pressures for different degrees of malalignment. A 3D computational knee model was created from high-resolution radiological images to emulate passive sagittal rotation (full-extension to 65°-flexion) and weight acceptance. A cadaveric knee mounted on a six-degree-of-freedom robot was subjected to matching boundary and loading conditions. A ligament-tuning process minimised kinematic differences between the robotically loaded cadaver specimen and the finite element (FE) model. The model was validated by measured intra-articular force and pressure measurements. Percent full scale error between FE-predicted and in vitro-measured values in the medial and lateral compartments were 6.67% and 5.94%, respectively, for normalised peak pressure values, and 7.56% and 4.48%, respectively, for normalised force values. The knee model can accurately predict normalised intra-articular pressure and forces for different loading conditions and could be further developed for subject-specific surgical planning. PMID:24786914

  20. Development and validation of a computational model of the knee joint for the evaluation of surgical treatments for osteoarthritis.

    PubMed

    Mootanah, R; Imhauser, C W; Reisse, F; Carpanen, D; Walker, R W; Koff, M F; Lenhoff, M W; Rozbruch, S R; Fragomen, A T; Dewan, Z; Kirane, Y M; Cheah, K; Dowell, J K; Hillstrom, H J

    2014-01-01

    A three-dimensional (3D) knee joint computational model was developed and validated to predict knee joint contact forces and pressures for different degrees of malalignment. A 3D computational knee model was created from high-resolution radiological images to emulate passive sagittal rotation (full-extension to 65°-flexion) and weight acceptance. A cadaveric knee mounted on a six-degree-of-freedom robot was subjected to matching boundary and loading conditions. A ligament-tuning process minimised kinematic differences between the robotically loaded cadaver specimen and the finite element (FE) model. The model was validated by measured intra-articular force and pressure measurements. Percent full scale error between FE-predicted and in vitro-measured values in the medial and lateral compartments were 6.67% and 5.94%, respectively, for normalised peak pressure values, and 7.56% and 4.48%, respectively, for normalised force values. The knee model can accurately predict normalised intra-articular pressure and forces for different loading conditions and could be further developed for subject-specific surgical planning.

  1. Dynamic modeling of Tampa Bay urban development using parallel computing

    USGS Publications Warehouse

    Xian, G.; Crane, M.; Steinwand, D.

    2005-01-01

    Urban land use and land cover has changed significantly in the environs of Tampa Bay, Florida, over the past 50 years. Extensive urbanization has created substantial change to the region's landscape and ecosystems. This paper uses a dynamic urban-growth model, SLEUTH, which applies six geospatial data themes (slope, land use, exclusion, urban extent, transportation, hillshade), to study the process of urbanization and associated land use and land cover change in the Tampa Bay area. To reduce processing time and complete the modeling process within an acceptable period, the model is recoded and ported to a Beowulf cluster. The parallel-processing computer system accomplishes the massive amount of computation the modeling simulation requires; the SLEUTH calibration process for the Tampa Bay urban growth simulation takes only 10 h of CPU time. The model predicts future land use/cover change trends for Tampa Bay from 1992 to 2025. Urban extent is predicted to double in the Tampa Bay watershed between 1992 and 2025. Results show an upward trend of urbanization at the expense of declines of 58% and 80% in agriculture and forested lands, respectively.

  2. Biokinetic modelling development and analysis of arsenic dissolution into the gastrointestinal tract using SAAM II

    NASA Astrophysics Data System (ADS)

    Perama, Yasmin Mohd Idris; Siong, Khoo Kok

    2018-04-01

    A mathematical model comprising 8 compartments was designed to describe the kinetic dissolution of arsenic (As) from water leach purification (WLP) waste samples ingested into the gastrointestinal system. A totally reengineered software system named Simulation, Analysis and Modelling II (SAAM II) was employed to aid in the experimental design and data analysis. A powerful tool that creates, simulates and analyzes data accurately and rapidly, SAAM II computationally creates a system of ordinary differential equations according to the specified compartmental model structure and simulates the solutions based upon the parameter and model inputs provided. The experimental design of the in vitro DIN approach was applied to create artificial gastric and gastrointestinal fluids. These synthetic fluid assays were produced to determine the concentrations of As ingested into the gastrointestinal tract. The model outputs were created based upon the experimental inputs and the recommended fractional transfer rate parameters. As a result, the measured and predicted As concentrations in gastric fluids were similar over the time of study. In contrast, the measured and predicted concentrations of As in the gastrointestinal fluids were similar only during the first hour and then decreased until the fifth hour of study. This is due to the loss of As through the fractional transfer rates from compartment q2 to the corresponding compartments q3 and q5, which are associated with excretion and distribution to the whole body, respectively. The model outputs obtained after the best fit to the data were influenced significantly by the fractional transfer rates between each compartment. Therefore, a series of compartmental models created with fractional transfer rate parameters, with the aid of SAAM II, provides better estimation of the kinetic behavior of As ingested into the gastrointestinal system.
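
    The sketch below shows the kind of linear compartmental system such a tool assembles: a gastric pool emptying into a gastrointestinal pool, which in turn loses arsenic to excretion (q3) and whole-body distribution (q5), as described above. The fractional transfer rates are illustrative, not the study's fitted values.

        import numpy as np
        from scipy.integrate import solve_ivp

        k21, k32, k52 = 1.2, 0.4, 0.3            # illustrative fractional rates, 1/h

        def dq_dt(t, q):
            q1, q2, q3, q5 = q
            return [-k21 * q1,                    # gastric pool emptying
                    k21 * q1 - (k32 + k52) * q2,  # gastrointestinal pool
                    k32 * q2,                     # cumulative excretion
                    k52 * q2]                     # distribution to the whole body

        sol = solve_ivp(dq_dt, (0.0, 5.0), [1.0, 0.0, 0.0, 0.0],   # 5 h, unit As dose
                        t_eval=np.linspace(0.0, 5.0, 11))
        print(sol.y[1])                           # predicted gastrointestinal As over time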

  3. BioSPICE: access to the most current computational tools for biologists.

    PubMed

    Garvey, Thomas D; Lincoln, Patrick; Pedersen, Charles John; Martin, David; Johnson, Mark

    2003-01-01

    The goal of the BioSPICE program is to create a framework that provides biologists access to the most current computational tools. At the program midpoint, the BioSPICE member community has produced a software system that comprises contributions from approximately 20 participating laboratories integrated under the BioSPICE Dashboard and a methodology for continued software integration. Central among these contributed software modules is the BioSPICE Dashboard, a graphical environment that combines Open Agent Architecture and NetBeans software technologies in a coherent, biologist-friendly user interface. The current Dashboard permits data sources, models, simulation engines, and output displays provided by different investigators and running on different machines to work together across a distributed, heterogeneous network. Among several other features, the Dashboard enables users to create graphical workflows by configuring and connecting available BioSPICE components. Anticipated future enhancements to BioSPICE include a notebook capability that will permit researchers to browse and compile data to support model building, a biological model repository, and tools to support the development, control, and data reduction of wet-lab experiments. In addition to the BioSPICE software products, a project website supports information exchange and community building.

  4. nmsBuilder: Freeware to create subject-specific musculoskeletal models for OpenSim.

    PubMed

    Valente, Giordano; Crimi, Gianluigi; Vanella, Nicola; Schileo, Enrico; Taddei, Fulvia

    2017-12-01

    Musculoskeletal modeling and simulations of movement have been increasingly used in orthopedic and neurological scenarios, with increased attention to subject-specific applications. In general, musculoskeletal modeling applications have been facilitated by the development of dedicated software tools; however, subject-specific studies have been limited also by time-consuming modeling workflows and high skilled expertise required. In addition, no reference tools exist to standardize the process of musculoskeletal model creation and make it more efficient. Here we present a freely available software application, nmsBuilder 2.0, to create musculoskeletal models in the file format of OpenSim, a widely-used open-source platform for musculoskeletal modeling and simulation. nmsBuilder 2.0 is the result of a major refactoring of a previous implementation that moved a first step toward an efficient workflow for subject-specific model creation. nmsBuilder includes a graphical user interface that provides access to all functionalities, based on a framework for computer-aided medicine written in C++. The operations implemented can be used in a workflow to create OpenSim musculoskeletal models from 3D surfaces. A first step includes data processing to create supporting objects necessary to create models, e.g. surfaces, anatomical landmarks, reference systems; and a second step includes the creation of OpenSim objects, e.g. bodies, joints, muscles, and the corresponding model. We present a case study using nmsBuilder 2.0: the creation of an MRI-based musculoskeletal model of the lower limb. The model included four rigid bodies, five degrees of freedom and 43 musculotendon actuators, and was created from 3D surfaces of the segmented images of a healthy subject through the modeling workflow implemented in the software application. We have presented nmsBuilder 2.0 for the creation of musculoskeletal OpenSim models from image-based data, and made it freely available via nmsbuilder.org. This application provides an efficient workflow for model creation and helps standardize the process. We hope this would help promote personalized applications in musculoskeletal biomechanics, including larger sample size studies, and might also represent a basis for future developments for specific applications. Copyright © 2017 Elsevier B.V. All rights reserved.

  5. Multiscale Models of Melting Arctic Sea Ice

    DTIC Science & Technology

    2014-09-30

    from weakly to highly correlated, or Poissonian toward Wigner-Dyson, as a function of system connectedness. This provides a mechanism for explaining...eluded us. Court Strong found such a method. It creates an optimal fit of a hyperbolic tangent model for the fractal dimension as a function of log A...actual melt pond images, and have made significant advances in the underlying functional and numerical analysis needed for these computations

  6. Numerical and Analytical Modeling of Laser Deposition with Preheating (Preprint)

    DTIC Science & Technology

    2007-03-01

    ...cladding process. This laser additive manufacturing technique allows quick fabrication of fully-dense metallic components directly from Computer...laser deposition uses a focused laser beam as a heat source to create a melt pool on an underlying substrate. Powder material is then injected

  7. Natural and accelerated recovery from brain damage: experimental and theoretical approaches.

    PubMed

    Andersen, Richard A; Schieber, Marc H; Thakor, Nitish; Loeb, Gerald E

    2012-03-01

    The goal of the Caltech group is to gain insight into the processes that occur within the primate nervous system during dexterous reaching and grasping and to see whether natural recovery from local brain damage can be accelerated by artificial means. We will create computational models of the nervous system embodying this insight and explain a variety of clinically observed neurological deficits in human subjects using these models.

  8. Behavioral modeling of VCSELs for high-speed optical interconnects

    NASA Astrophysics Data System (ADS)

    Szczerba, Krzysztof; Kocot, Chris

    2018-02-01

    Transition from on-off keying to 4-level pulse amplitude modulation (PAM) in VCSEL-based optical interconnects allows an increase in data rates, at the cost of a 4.8 dB sensitivity penalty. The resulting strained link budget creates a need for accurate VCSEL models for driver integrated circuit (IC) design and system-level simulations. Rate-equation-based equivalent circuit models are convenient for IC design, but system-level analysis requires computationally efficient closed-form behavioral models based on Volterra series and neural networks. In this paper we present and compare these models.
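
    The paper's exact model structure is not reproduced here; as a generic illustration of a closed-form Volterra-type behavioral model, the memory-polynomial form below is a common truncation, with coefficients that would normally be fitted to measured VCSEL waveforms (the values shown are made up).

      import numpy as np

      def memory_polynomial(x, coeffs):
          # y[n] = sum over memory taps m and orders k of
          #        coeffs[m, k] * x[n-m] * |x[n-m]|**k  (truncated Volterra series).
          memory_depth, order = coeffs.shape
          y = np.zeros_like(x, dtype=float)
          for m in range(memory_depth):
              xm = np.concatenate([np.zeros(m), x[:len(x) - m]])
              for k in range(order):
                  y += coeffs[m, k] * xm * np.abs(xm) ** k
          return y

      # Illustrative use on a random 4-PAM drive sequence.
      rng = np.random.default_rng(0)
      pam4 = rng.choice([-3.0, -1.0, 1.0, 3.0], size=1000)
      coeffs = np.array([[1.0, -0.05], [0.2, 0.0], [0.05, 0.0]])
      out = memory_polynomial(pam4, coeffs)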

  9. Geopressure modeling from petrophysical data: An example from East Kalimantan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Herkommer, M.A.

    1994-07-01

    Localized models of abnormal formation pressure (geopressure) are important economic and safety tools frequently used for well planning and drilling operations. Simplified computer-based procedures have been developed that permit these models to be developed more rapidly and with greater accuracy. These techniques are broadly applicable to basins throughout the world where abnormal formation pressures occur. An example from the Attaka field of East Kalimantan, southeast Asia, shows how geopressure models are developed. Using petrophysical and engineering data, empirical correlations between observed pressure and petrophysical logs can be created by computer-assisted data-fitting techniques. These correlations serve as the basis for models of the geopressure. By performing repeated analyses on wells at various locations, contour maps on the top of abnormal geopressure can be created. Methods that are simple in their development and application make the task of geopressure estimation less formidable to the geologist and petroleum engineer. Further, more accurate estimates can significantly improve drilling speeds while reducing the incidence of stuck pipe, kicks, and blowouts. In general, geopressure estimates are used in all phases of drilling operations: to develop mud plans and specify equipment ratings, to assist in the recognition of geopressured formations and determination of mud weights, and to improve predictions at offset locations and geologically comparable areas.
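
    The abstract does not name the specific correlations used; as one widely used example of an empirical pressure-from-log relation of this kind, Eaton's sonic method is sketched below (the exponent of 3 and the gradient values are the conventional textbook choices, not values from this study).

      def eaton_pore_pressure(overburden, normal_pressure, dt_normal, dt_observed,
                              exponent=3.0):
          # Eaton's sonic correlation: pore pressure rises above the normal
          # (hydrostatic) trend as the observed sonic travel time exceeds the
          # normal-compaction travel time at the same depth.
          ratio = dt_normal / dt_observed
          return overburden - (overburden - normal_pressure) * ratio ** exponent

      # Example in equivalent-gradient units (psi/ft) for an undercompacted shale.
      print(eaton_pore_pressure(overburden=1.0, normal_pressure=0.465,
                                dt_normal=90.0, dt_observed=110.0))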

  10. Development of a numerical model for the electric current in burner-stabilised methane-air flames

    NASA Astrophysics Data System (ADS)

    Speelman, N.; de Goey, L. P. H.; van Oijen, J. A.

    2015-03-01

    This study presents a new model to simulate the electric behaviour of one-dimensional ionised flames and to predict the electric currents in these flames. The model utilises Poisson's equation to compute the electric potential. A multi-component diffusion model, including the influence of an electric field, is used to model the diffusion of neutral and charged species. The model is incorporated into the existing CHEM1D flame simulation software. A comparison between the computed electric currents and experimental values from the literature shows good qualitative agreement for the voltage-current characteristic. Physical phenomena, such as saturation and the diodic effect, are captured by the model. The dependence of the saturation current on the equivalence ratio is also captured well for equivalence ratios between 0.6 and 1.2. Simulations show a clear relation between the saturation current and the total number of charged particles created. The model shows that the potential at which the electric field saturates is strongly dependent on the recombination rate and the diffusivity of the charged particles. The onset of saturation occurs because most of the created charged particles are withdrawn from the flame and because the electric field effects start dominating over mass-based diffusion. It is shown that this knowledge can be used to optimise ionisation chemistry mechanisms. It is shown numerically that the so-called diodic effect is caused primarily by the distance the heavier cations have to travel to the cathode.
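
    As a sketch of the Poisson step described above (not the CHEM1D implementation), a one-dimensional finite-difference solve of d2V/dx2 = -rho/eps0, with the electrode potentials as Dirichlet boundary values, might look like this; the grid size and charge layer are illustrative.

      import numpy as np

      EPS0 = 8.854e-12  # vacuum permittivity, F/m

      def solve_potential(rho, dx, v_left, v_right):
          # Tridiagonal finite-difference system for the interior potential.
          n = len(rho)
          A = (np.diag(-2.0 * np.ones(n)) +
               np.diag(np.ones(n - 1), 1) +
               np.diag(np.ones(n - 1), -1))
          b = -rho * dx**2 / EPS0
          b[0] -= v_left    # boundary potentials move to the right-hand side
          b[-1] -= v_right
          return np.linalg.solve(A, b)

      # Toy positive charge layer between a grounded and a biased electrode.
      rho = np.zeros(200)
      rho[90:110] = 1e-6  # C/m^3, illustrative
      v = solve_potential(rho, dx=1e-5, v_left=0.0, v_right=100.0)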

  11. The Role of Physicality in Rich Programming Environments

    ERIC Educational Resources Information Center

    Liu, Allison S.; Schunn, Christian D.; Flot, Jesse; Shoop, Robin

    2013-01-01

    Computer science proficiency continues to grow in importance, while the number of students entering computer science-related fields declines. Many rich programming environments have been created to motivate student interest and expertise in computer science. In the current study, we investigated whether a recently created environment, Robot…

  12. Learning general phonological rules from distributional information: a computational model.

    PubMed

    Calamaro, Shira; Jarosz, Gaja

    2015-04-01

    Phonological rules create alternations in the phonetic realizations of related words. These rules must be learned by infants in order to identify the phonological inventory, the morphological structure, and the lexicon of a language. Recent work proposes a computational model for the learning of one kind of phonological alternation, allophony (Peperkamp, Le Calvez, Nadal, & Dupoux, 2006). This paper extends the model to account for learning of a broader set of phonological alternations and the formalization of these alternations as general rules. In Experiment 1, we apply the original model to new data in Dutch and demonstrate its limitations in learning nonallophonic rules. In Experiment 2, we extend the model to allow it to learn general rules for alternations that apply to a class of segments. In Experiment 3, the model is further extended to allow for generalization by context; we argue that this generalization must be constrained by linguistic principles. Copyright © 2014 Cognitive Science Society, Inc.

  13. The Effects of Gender on the Attitudes towards the Computer Assisted Instruction: A Meta-Analysis

    ERIC Educational Resources Information Center

    Cam, Sefika Sumeyye; Yarar, Gokhan; Toraman, Cetin; Erdamar, Gurcu Koc

    2016-01-01

    The idea that gender creates a difference in computer usage and computer-assisted instruction dates back many years. At that time, it was thought that areas like engineering, science and mathematics were for males, so this created a difference in computer usage. Nevertheless, developing technology and females becoming more…

  14. A computable expression of closure to efficient causation.

    PubMed

    Mossio, Matteo; Longo, Giuseppe; Stewart, John

    2009-04-07

    In this paper, we propose a mathematical expression of closure to efficient causation in terms of lambda-calculus; we argue that this opens up the perspective of developing principled computer simulations of systems closed to efficient causation in an appropriate programming language. An important implication of our formulation is that, by exhibiting an expression in lambda-calculus, which is a paradigmatic formalism for computability and programming, we show that there are no conceptual or principled problems in realizing a computer simulation or model of closure to efficient causation. We conclude with a brief discussion of the question whether closure to efficient causation captures all relevant properties of living systems. We suggest that it might not be the case, and that more complex definitions could indeed create some crucial obstacles to computability.

  15. Simulating the nasal cycle with computational fluid dynamics

    PubMed Central

    Patel, Ruchin G.; Garcia, Guilherme J. M.; Frank-Ito, Dennis O.; Kimbell, Julia S.; Rhee, John S.

    2015-01-01

    Objectives: (1) Develop a method to account for the confounding effect of the nasal cycle when comparing pre- and post-surgery objective measures of nasal patency. (2) Illustrate this method by reporting objective measures derived from computational fluid dynamics (CFD) models spanning the full range of mucosal engorgement associated with the nasal cycle in two subjects. Study Design: Retrospective. Setting: Academic tertiary medical center. Subjects and Methods: A cohort of 24 nasal airway obstruction patients was reviewed to select the two patients with the greatest reciprocal change in mucosal engorgement between pre- and post-surgery computed tomography (CT) scans. Three-dimensional anatomic models were created based on the pre- and post-operative CT scans. Nasal cycling models were also created by gradually changing the thickness of the inferior turbinate, middle turbinate, and septal swell body. CFD was used to simulate airflow and to calculate nasal resistance and average heat flux. Results: Before accounting for the nasal cycle, Patient A appeared to have a paradoxical worsening of nasal obstruction in the right cavity postoperatively. After accounting for the nasal cycle, Patient A had small improvements in objective measures postoperatively. The magnitude of the surgical effect also differed in Patient B after accounting for the nasal cycle. Conclusion: By simulating the nasal cycle and comparing models in similar congestive states, surgical changes in nasal patency can be distinguished from physiological changes associated with the nasal cycle. This ability can lead to more precise comparisons of pre- and post-surgery objective measures and potentially more accurate virtual surgery planning. PMID:25450411

  16. Study on the application of mobile internet cloud computing platform

    NASA Astrophysics Data System (ADS)

    Gong, Songchun; Fu, Songyin; Chen, Zheng

    2012-04-01

    The innovative development of computer technology promotes the application of the cloud computing platform, which is essentially the substitution and exchange of resource service models, meeting users' needs for different resources after adjustments in multiple aspects. Cloud computing offers advantages in many respects: it not only reduces the difficulty of operating the system but also makes it easy for users to search for, acquire, and process resources. Accordingly, the author takes the management of digital libraries as the research focus of this paper and analyzes the key technologies of the mobile internet cloud computing platform in the operation process. The popularization and promotion of computer technology have driven people to create digital library models, whose core idea is to strengthen the optimal management of library resource information through computers and to construct a high-performance inquiry and search platform, allowing users to access the necessary information resources at any time. Cloud computing, moreover, distributes computations across a large number of distributed computers and hence implements the connection service of multiple computers. Digital libraries, as a typical representative of cloud computing applications, can be used to carry out an analysis of the key technologies of cloud computing.

  17. Multi-material 3D Models for Temporal Bone Surgical Simulation.

    PubMed

    Rose, Austin S; Kimbell, Julia S; Webster, Caroline E; Harrysson, Ola L A; Formeister, Eric J; Buchman, Craig A

    2015-07-01

    A simulated, multicolor, multi-material temporal bone model can be created using 3-dimensional (3D) printing that will prove both safe and beneficial in training for actual temporal bone surgical cases. As the process of additive manufacturing, or 3D printing, has become more practical and affordable, a number of applications for the technology in the field of Otolaryngology-Head and Neck Surgery have been considered. One area of promise is temporal bone surgical simulation. Three-dimensional representations of human temporal bones were created from temporal bone computed tomography (CT) scans using biomedical image processing software. Multi-material models were then printed and dissected in a temporal bone laboratory by attending and resident otolaryngologists. A 5-point Likert scale was used to grade the models for their anatomical accuracy and suitability as a simulation of cadaveric and operative temporal bone drilling. The models produced for this study demonstrate significant anatomic detail and a likeness to human cadaver specimens for drilling and dissection. Simulated temporal bones created by this process have potential benefit in surgical training, preoperative simulation for challenging otologic cases, and the standardized testing of temporal bone surgical skills. © The Author(s) 2015.
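
    The imaging-to-model step described above (CT segmentation followed by surface extraction) can be sketched with open-source tools rather than the commercial software used in the study; the Hounsfield threshold and voxel spacing below are illustrative.

      import numpy as np
      from skimage import measure

      def ct_to_surface(ct_volume, threshold_hu=300.0, spacing=(0.5, 0.5, 0.5)):
          # Binarize bone by a Hounsfield-unit threshold, then extract a
          # triangulated surface suitable for export to a 3D printer.
          mask = (ct_volume > threshold_hu).astype(np.float32)
          verts, faces, normals, _ = measure.marching_cubes(mask, level=0.5,
                                                            spacing=spacing)
          return verts, faces

      # Illustrative call on a synthetic volume standing in for a temporal-bone CT.
      volume = np.random.uniform(-1000.0, 1500.0, size=(64, 64, 64))
      verts, faces = ct_to_surface(volume)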

  18. Single Cell Genomics: Approaches and Utility in Immunology

    PubMed Central

    Neu, Karlynn E; Tang, Qingming; Wilson, Patrick C; Khan, Aly A

    2017-01-01

    Single cell genomics offers powerful tools for studying lymphocytes, which make it possible to observe rare and intermediate cell states that cannot be resolved at the population level. Advances in computer science and single cell sequencing technology have created a data-driven revolution in immunology. The challenge for immunologists is to harness computing and turn an avalanche of quantitative data into meaningful discovery of immunological principles, predictive models, and strategies for therapeutics. Here, we review the current literature on computational analysis of single cell RNA-seq data and discuss underlying assumptions, methods, and applications in immunology, and highlight important directions for future research. PMID:28094102

  19. Statistical Model Applied to NetFlow for Network Intrusion Detection

    NASA Astrophysics Data System (ADS)

    Proto, André; Alexandre, Leandro A.; Batista, Maira L.; Oliveira, Isabela L.; Cansian, Adriano M.

    Computers and network services have become a guaranteed presence in many places. This has been accompanied by growth in illicit events, so computer and network security has become an essential point in any computing environment. Many methodologies have been created to identify these events; however, with the increasing number of users and services on the Internet, monitoring a large network environment is difficult. This paper proposes a methodology for event detection in large-scale networks. The proposal approaches anomaly detection using the NetFlow protocol and statistical methods, monitoring the environment in a timeframe suited to the application.
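
    The abstract does not spell out the statistical model; as a generic illustration of flow-level anomaly detection of this kind, the sketch below flags time bins whose NetFlow record count deviates strongly from a recent baseline (window length and threshold are illustrative).

      import numpy as np

      def flag_anomalous_bins(flow_counts, window=288, z_threshold=3.0):
          # Flag a time bin when its flow count deviates from the mean of the
          # preceding window by more than z_threshold standard deviations.
          counts = np.asarray(flow_counts, dtype=float)
          flags = np.zeros(len(counts), dtype=bool)
          for i in range(window, len(counts)):
              baseline = counts[i - window:i]
              mu, sigma = baseline.mean(), baseline.std() + 1e-9
              flags[i] = abs(counts[i] - mu) / sigma > z_threshold
          return flags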

  20. Patient-Specific Computational Modeling of Upper Extremity Arteriovenous Fistula Creation: Its Feasibility to Support Clinical Decision-Making

    PubMed Central

    Bosboom, E. Marielle H.; Kroon, Wilco; van der Linden, Wim P. M.; Planken, R. Nils; van de Vosse, Frans N.; Tordoir, Jan H. M.

    2012-01-01

    Introduction: Inadequate flow enhancement on the one hand, and excessive flow enhancement on the other hand, remain frequent complications of arteriovenous fistula (AVF) creation, and hamper hemodialysis therapy in patients with end-stage renal disease. In an effort to reduce these, a patient-specific computational model, capable of predicting postoperative flow, has been developed. The purpose of this study was to determine the accuracy of the patient-specific model and to investigate its feasibility to support decision-making in AVF surgery. Methods: Patient-specific pulse wave propagation models were created for 25 patients awaiting AVF creation. Model input parameters were obtained from clinical measurements and literature. For every patient, a radiocephalic AVF, a brachiocephalic AVF, and a brachiobasilic AVF configuration were simulated and analyzed for their postoperative flow. The most distal configuration with a predicted flow between 400 and 1500 ml/min was considered the preferred location for AVF surgery. The suggestion of the model was compared to the choice of an experienced vascular surgeon. Furthermore, predicted flows were compared to measured postoperative flows. Results: Taking into account the confidence interval (25th to 75th percentile), overlap between predicted and measured postoperative flows was observed in 70% of the patients. Differentiation between upper and lower arm configurations was similar in 76% of the patients, whereas discrimination between two upper arm AVF configurations was more difficult. In 3 patients the surgeon created an upper arm AVF, while model-based predictions allowed for lower arm AVF creation, thereby preserving proximal vessels. In one patient early thrombosis in a radiocephalic AVF was observed, which might have been indicated by the low predicted postoperative flow. Conclusions: Postoperative flow can be predicted relatively accurately for multiple AVF configurations by using computational modeling. This model may therefore be considered a valuable additional tool in the preoperative work-up of patients awaiting AVF creation. PMID:22496816

  1. Synthetic Biology: Knowledge Accessed by Everyone (Open Sources)

    ERIC Educational Resources Information Center

    Sánchez Reyes, Patricia Margarita

    2016-01-01

    Using the principles of biology, along with engineering and the help of computers, scientists manage to copy DNA sequences from nature and use them to create new organisms. DNA is created through engineering and computer science, making it possible to create life inside a laboratory. We cannot dismiss the role that synthetic biology could lead in…

  2. Development of a Distributed Parallel Computing Framework to Facilitate Regional/Global Gridded Crop Modeling with Various Scenarios

    NASA Astrophysics Data System (ADS)

    Jang, W.; Engda, T. A.; Neff, J. C.; Herrick, J.

    2017-12-01

    Many crop models are increasingly used to evaluate crop yields at regional and global scales. However, implementation of these models across large areas using fine-scale grids is limited by computational time requirements. In order to facilitate global gridded crop modeling with various scenarios (i.e., different crop, management schedule, fertilizer, and irrigation) using the Environmental Policy Integrated Climate (EPIC) model, we developed a distributed parallel computing framework in Python. Our local desktop with 14 cores (28 threads) was used to test the distributed parallel computing framework in Iringa, Tanzania, which has 406,839 grid cells. High-resolution soil data, SoilGrids (250 x 250 m), and climate data, AgMERRA (0.25 x 0.25 deg), were also used as input data for the gridded EPIC model. The framework includes a master file for parallel computing, an input database, input data formatters, EPIC model execution, and output analyzers. Through the master file, the user-defined number of CPU threads divides the EPIC simulation into jobs. Then, using the EPIC input data formatters, the raw database is formatted into EPIC input data, and the formatted data moves into the EPIC simulation jobs. The 28 EPIC jobs then run simultaneously, and only the results files of interest are parsed and moved into the output analyzers. We applied various scenarios with seven different slopes and twenty-four fertilizer ranges. Parallelized input generators create the different scenarios as a list for distributed parallel computing. After all simulations are completed, parallelized output analyzers are used to analyze all outputs according to the different scenarios. This saves significant computing time and resources, making it possible to conduct gridded modeling at regional to global scales with high-resolution data. For example, serial processing for the Iringa test case would require 113 hours, while the framework developed in this study requires only approximately 6 hours, a nearly 95% reduction in computing time.
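
    A minimal sketch of the job-splitting idea, using Python's multiprocessing in place of the framework's master file; the worker below is a stub standing in for the input formatting, EPIC execution, and output parsing steps, and the scenario labels are invented.

      from multiprocessing import Pool

      def run_epic_cell(job):
          cell_id, scenario = job
          # Stub: the real worker would write EPIC input files for this grid
          # cell and scenario, invoke the EPIC executable, and parse outputs.
          return cell_id, scenario, 'ok'

      if __name__ == '__main__':
          scenarios = ['low_fert', 'high_fert']  # illustrative labels
          jobs = [(cell, s) for cell in range(10000) for s in scenarios]
          with Pool(processes=28) as pool:  # one worker per hardware thread
              results = pool.map(run_epic_cell, jobs, chunksize=500)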

  3. Self-reconfigurable ship fluid-network modeling for simulation-based design

    NASA Astrophysics Data System (ADS)

    Moon, Kyungjin

    Our world is filled with large-scale engineering systems, which provide various services and conveniences in our daily life. A distinctive trend in the development of today's large-scale engineering systems is the extensive and aggressive adoption of automation and autonomy that enable the significant improvement of systems' robustness, efficiency, and performance, with considerably reduced manning and maintenance costs; the U.S. Navy's DD(X), the next-generation destroyer program, is considered an extreme example of such a trend. This thesis pursues a modeling solution for performing simulation-based analysis in the conceptual or preliminary design stage of an intelligent, self-reconfigurable ship fluid system, which is one of the concepts of DD(X) engineering plant development. Through investigations of the Navy's approach for designing a more survivable ship system, it is found that the current naval simulation-based analysis environment is limited by capability gaps in damage modeling, dynamic model reconfiguration, and simulation speed of the domain-specific models, especially fluid network models. As enablers to fill these gaps, two essential elements were identified in the formulation of the modeling method. The first is the graph-based topological modeling method, which is employed for rapid model reconstruction and damage modeling, and the second is the recurrent neural network-based, component-level surrogate modeling method, which is used to improve the affordability and efficiency of the modeling and simulation (M&S) computations. The integration of the two methods can deliver computationally efficient, flexible, and automation-friendly M&S, creating an environment for more rigorous damage analysis and exploration of design alternatives. As a demonstration to evaluate the developed method, a simulation model of a notional ship fluid system was created and a damage analysis was performed. Next, models representing different design configurations of the fluid system were created, and damage analyses were performed with them in order to find an optimal design configuration for system survivability. Finally, the benefits and drawbacks of the developed method were discussed based on the result of the demonstration.
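
    The graph-based topological idea can be sketched as follows: components become nodes, pipe segments become edges, and a damage analysis reduces to connectivity queries after removing damaged nodes. The component names and topology below are invented for illustration.

      import networkx as nx

      G = nx.Graph()
      G.add_edges_from([('pump1', 'valveA'), ('valveA', 'load1'),
                        ('pump2', 'valveB'), ('valveB', 'load1'),
                        ('valveA', 'load2')])

      def supplied_loads(graph, damaged, pumps=('pump1', 'pump2')):
          # Remove damaged components, then collect every load still
          # connected to at least one surviving pump.
          g = graph.copy()
          g.remove_nodes_from(damaged)
          supplied = set()
          for p in pumps:
              if p in g:
                  supplied |= {n for n in nx.node_connected_component(g, p)
                               if n.startswith('load')}
          return supplied

      print(supplied_loads(G, damaged={'valveA'}))  # load1 survives via pump2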

  4. 3D Printer Instrumentation to Create Varied Geometries of Robotic Limbs and Heterogeneous Granular Media

    DTIC Science & Technology

    2015-05-20

    Transfer Robo Ant The 3D printer was used to rapidly prototype a robot ant. The robot ant was used to model the behavior of the fire ant and to model...computer models and 3D printed ant robots are shown below. Snake Bot We used the 3D printer to rapidly design a modular, easily-modified snake...living organism (modern mudskippers, a terrestrial fish) and extinct early tetrapods (e.g. Ichthyostega, Acanthostega) while allowing us to explore

  5. Bayesian depth estimation from monocular natural images.

    PubMed

    Su, Che-Chun; Cormack, Lawrence K; Bovik, Alan C

    2017-05-01

    Estimating an accurate and naturalistic dense depth map from a single monocular photographic image is a difficult problem. Nevertheless, human observers have little difficulty understanding the depth structure implied by photographs. Two-dimensional (2D) images of the real-world environment contain significant statistical information regarding the three-dimensional (3D) structure of the world that the vision system likely exploits to compute perceived depth, monocularly as well as binocularly. Toward understanding how this might be accomplished, we propose a Bayesian model of monocular depth computation that recovers detailed 3D scene structures by extracting reliable, robust, depth-sensitive statistical features from single natural images. These features are derived using well-accepted univariate natural scene statistics (NSS) models and recent bivariate/correlation NSS models that describe the relationships between 2D photographic images and their associated depth maps. This is accomplished by building a dictionary of canonical local depth patterns from which NSS features are extracted as prior information. The dictionary is used to create a multivariate Gaussian mixture (MGM) likelihood model that associates local image features with depth patterns. A simple Bayesian predictor is then used to form spatial depth estimates. The depth results produced by the model, despite its simplicity, correlate well with ground-truth depths measured by a current-generation terrestrial light detection and ranging (LIDAR) scanner. Such a strong form of statistical depth information could be used by the visual system when creating overall estimated depth maps incorporating stereopsis, accommodation, and other conditions. Indeed, even in isolation, the Bayesian predictor delivers depth estimates that are competitive with state-of-the-art "computer vision" methods that utilize highly engineered image features and sophisticated machine learning algorithms.
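
    The multivariate Gaussian mixture step can be illustrated generically with scikit-learn: fit a joint density over paired (image feature, depth) values, then score candidate depths against an observed feature. The synthetic features below stand in for the paper's NSS features and are not its actual data.

      import numpy as np
      from sklearn.mixture import GaussianMixture

      rng = np.random.default_rng(0)
      # Synthetic correlated (image feature, depth) training pairs.
      depth = rng.uniform(1.0, 10.0, 5000)
      feature = 1.0 / depth + 0.05 * rng.standard_normal(5000)
      gmm = GaussianMixture(n_components=8, random_state=0)
      gmm.fit(np.column_stack([feature, depth]))

      def map_depth(observed_feature, candidates=np.linspace(1.0, 10.0, 200)):
          # Score each candidate depth jointly with the observed feature and
          # return the maximum-likelihood candidate.
          pairs = np.column_stack([np.full_like(candidates, observed_feature),
                                   candidates])
          return candidates[np.argmax(gmm.score_samples(pairs))]

      print(map_depth(0.25))  # should recover a depth near 4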

  6. Surface and subsurface geologic risk factors to ground water affecting brownfield redevelopment potential.

    PubMed

    Kaufman, Martin M; Murray, Kent S; Rogers, Daniel T

    2003-01-01

    A model is created for assessing the redevelopment potential of brownfields. The model is derived from a space and time conceptual framework that identifies and measures the surface and subsurface risk factors present at brownfield sites. The model then combines these factors with a contamination extent multiplier at each site to create an index of redevelopment potential. Results from the application of the model within an urbanized watershed demonstrate clear differences between the redevelopment potential present within five different near-surface geologic units, with those units containing clay being less vulnerable to subsurface contamination. With and without the extent multiplier, the total risk present at the brownfield sites within all the geologic units is also strongly correlated to the actual costs of remediation. Thus, computing the total surface and subsurface risk within a watershed can help guide the remediation efforts at broad geographic scales, and prioritize the locations for redevelopment.

  7. Statistical density modification using local pattern matching

    DOEpatents

    Terwilliger, Thomas C.

    2007-01-23

    A computer implemented method modifies an experimental electron density map. A set of selected known experimental and model electron density maps is provided, and standard templates of electron density are created from the selected experimental and model electron density maps by clustering and averaging values of electron density in a spherical region about each point in a grid that defines each selected map. Histograms are also created from the selected experimental and model electron density maps that relate the value of electron density at the center of each of the spherical regions to a correlation coefficient of the density surrounding each corresponding grid point in each one of the standard templates. The standard templates and the histograms are applied to grid points on the experimental electron density map to form new estimates of electron density at each grid point in the experimental electron density map.

  8. Getting in touch--3D printing in forensic imaging.

    PubMed

    Ebert, Lars Chr; Thali, Michael J; Ross, Steffen

    2011-09-10

    With the increasing use of medical imaging in forensics, as well as the technological advances in rapid prototyping, we suggest combining these techniques to generate displays of forensic findings. We used computed tomography (CT), CT angiography, magnetic resonance imaging (MRI) and surface scanning with photogrammetry in conjunction with segmentation techniques to generate 3D polygon meshes. Based on these data sets, a 3D printer created colored models of the anatomical structures. Using this technique, we could create models of bone fractures, vessels, cardiac infarctions, ruptured organs as well as bitemark wounds. The final models are anatomically accurate, fully colored representations of bones, vessels and soft tissue, and they demonstrate radiologically visible pathologies. The models are more easily understood by laypersons than volume rendering or 2D reconstructions. Therefore, they are suitable for presentations in courtrooms and for educational purposes. Copyright © 2011 Elsevier Ireland Ltd. All rights reserved.

  9. Experimental and Numerical Analyses of Dynamic Deformation and Failure in Marine Structures Subjected to Underwater Impulsive Loads

    DTIC Science & Technology

    2012-08-01

    Table-of-contents and figure-caption fragments: impulsive loading; computational modeling of the USLS; Underwater Shock Loading Simulator (USLS); Figure 4.1, Schematic of Underwater Shock Loading Simulator (USLS): a high-velocity projectile hits the flyer-plate and creates a stress...

  10. The Sounds of Nanoscience: Acoustic STM Analogues

    ERIC Educational Resources Information Center

    Euler, Manfred

    2013-01-01

    A hands-on model of scanning tunnelling microscopy (STM) is presented. It uses near-field imaging with sound and computer assisted visualization to create acoustic mappings of resonator arrangements. Due to the (partial) analogy of matter and sound waves the images closely resemble STM scans of atoms. Moreover, the method can be extended to build…

  11. Modeling and Validation of Power-split and P2 Parallel Hybrid Electric Vehicles SAE 2013-01-1470)

    EPA Science Inventory

    The Advanced Light-Duty Powertrain and Hybrid Analysis tool was created by EPA to evaluate the Greenhouse Gas (GHG) emissions of Light-Duty (LD) vehicles. It is a physics-based, forward-looking, full vehicle computer simulator capable of analyzing various vehicle types combined ...

  12. Creating a Complex Measurement Model Using Evidence Centered Design.

    ERIC Educational Resources Information Center

    Williamson, David M.; Bauer, Malcom; Steinberg, Linda S.; Mislevy, Robert J.; Behrens, John T.

    In computer-based simulations meant to support learning, students must bring a wide range of relevant knowledge, skills, and abilities to bear jointly as they solve meaningful problems in a learning domain. To function efficiently as an assessment, a simulation system must also be able to evoke and interpret observable evidence about targeted…

  13. Programs as Causal Models: Speculations on Mental Programs and Mental Representation

    ERIC Educational Resources Information Center

    Chater, Nick; Oaksford, Mike

    2013-01-01

    Judea Pearl has argued that counterfactuals and causality are central to intelligence, whether natural or artificial, and has helped create a rich mathematical and computational framework for formally analyzing causality. Here, we draw out connections between these notions and various current issues in cognitive science, including the nature of…

  14. Education on the Electronic Frontier: Teleapprentices in Globally Distributed Educational Contexts. Interactive Technology Laboratory Report #14.

    ERIC Educational Resources Information Center

    Levin, James A.; And Others

    1987-01-01

    The instructional media created by microcomputers interconnected by modems to form long-distance networks present some powerful new opportunities for education. While other uses of computers in education have been built on conventional instructional models of classroom interaction, instructional electronic networks facilitate a wider use of…

  15. Creating a New Definition of Library Cooperation: Past, Present, and Future Models.

    ERIC Educational Resources Information Center

    Lenzini, Rebecca T.; Shaw, Ward

    1991-01-01

    Describes the creation and purpose of the Colorado Alliance of Research Libraries (CARL), the subsequent development of CARL Systems, and its current research projects. Topics discussed include online catalogs; UnCover, a journal article database; full text data; document delivery; visual images in computer systems; networks; and implications for…

  16. High Speed Cylindrical Roller Bearing Analysis, SKF Computer Program CYBEAN. Volume 1: Analysis

    NASA Technical Reports Server (NTRS)

    Kleckner, R. J.; Pirvics, J.

    1978-01-01

    The CYBEAN (CYlindrical BEaring ANalysis) program was created to detail radially loaded, aligned and misaligned cylindrical roller bearing performance under a variety of operating conditions. The models and associated mathematics used within CYBEAN are described. The user is referred to the material for formulation assumptions and algorithm detail.

  17. Development and Testing of an Automatic Transmission Shift Schedule Algorithm for Vehicle Simulation (SAE Paper 2015-01-1142)

    EPA Science Inventory

    The Advanced Light-Duty Powertrain and Hybrid Analysis (ALPHA) modeling tool was created by EPA to estimate greenhouse gas (GHG) emissions of light-duty vehicles. ALPHA is a physics-based, forward-looking, full vehicle computer simulation capable of analyzing various vehicle type...

  18. Protocol standards and implementation within the digital engineering laboratory computer network (DELNET) using the universal network interface device (UNID). Part 2

    NASA Astrophysics Data System (ADS)

    Phister, P. W., Jr.

    1983-12-01

    Development of the Air Force Institute of Technology's Digital Engineering Laboratory Network (DELNET) was continued with the development of an initial draft of a protocol standard for all seven layers as specified by the International Standards Organization's (ISO) Reference Model for Open Systems Interconnections. This effort centered on the restructuring of the Network Layer to perform Datagram routing and to conform to the developed protocol standards and actual software module development of the upper four protocol layers residing within the DELNET Monitor (Zilog MCZ 1/25 Computer System). Within the guidelines of the ISO Reference Model the Transport Layer was developed utilizing the Internet Header Format (IHF) combined with the Transport Control Protocol (TCP) to create a 128-byte Datagram. Also a limited Application Layer was created to pass the Gettysburg Address through the DELNET. This study formulated a first draft for the DELNET Protocol Standard and designed, implemented, and tested the Network, Transport, and Application Layers to conform to these protocol standards.

  19. Study of CPM Device used for Rehabilitation and Effective Pain Management Following Knee Alloplasty

    NASA Astrophysics Data System (ADS)

    Trochimczuk, R.; Kuźmierowski, T.; Anchimiuk, P.

    2017-02-01

    This paper defines the design assumptions for the construction of an original demonstration CPM device, based on which a solid virtual model will be created in a CAD software environment. The overall dimensions and other input parameters for the design were determined for the entire patient population according to an anatomical atlas of human measures. The medical and physiotherapeutic community were also consulted with respect to the proposed engineering solutions. The virtual model of the CPM device will be used for computer simulations of changes in motion parameters as a function of time, accounting for loads and static states. The results obtained from computer simulation will be used to confirm the correctness of the adopted design assumptions and of the accepted structure of the CPM mechanism, and potentially to introduce necessary corrections. They will also provide a basis for the development of a control strategy for the laboratory prototype and for the selection of the strategy for the patient's rehabilitation in the future. This paper will be supplemented with an identification of directions for further research.

  20. On transferring the grid technology to the biomedical community.

    PubMed

    Mohammed, Yassene; Sax, Ulrich; Dickmann, Frank; Lippert, Joerg; Solodenko, Juri; von Voigt, Gabriele; Smith, Matthew; Rienhoff, Otto

    2010-01-01

    Natural scientists such as physicists pioneered the sharing of computing resources, which resulted in the Grid. The inter-domain transfer of this technology has so far been an intuitive process. Some difficulties facing the life science community can be understood using Bozeman's "Effectiveness Model of Technology Transfer". Bozeman's and classical technology transfer approaches deal with technologies that have achieved a certain stability. Grid and Cloud solutions are technologies that are still in flux. We illustrate how Grid computing creates new difficulties for the technology transfer process that are not considered in Bozeman's model. We show why the success of health Grids should be measured by the qualified scientific human capital and opportunities created, and not primarily by the market impact. With two examples we show how the Grid technology transfer theory corresponds to reality. We conclude with recommendations that can help improve the adoption of Grid solutions in the biomedical community. These results give a more concise explanation of the difficulties most life science IT projects face in the late funding periods, and show some leveraging steps that can help to overcome the "vale of tears".

  1. An experimental study on the manufacture and characterization of in-plane fibre-waviness defects in composites

    PubMed Central

    DiazDelaO, F. A.; Atherton, K.

    2018-01-01

    A new method has been developed for creating localized in-plane fibre waviness in composite coupons and used to create a large batch of specimens. This method could be used by manufacturers to experimentally explore the effect of fibre waviness on composite structures both directly and indirectly to develop and validate computational models. The specimens were assessed using ultrasound, digital image correlation and a novel inspection technique capable of measuring residual strain fields. To explore how the defect affects the performance of composite structures, the specimens were then loaded to failure. Predictions of remnant strength were made using a simple ultrasound damage metric and a new residual strain-based damage metric. The predictions made using residual strain measurements were found to be substantially more effective at characterizing ultimate strength than ultrasound measurements. This suggests that residual strains have a significant effect on the failure of laminates containing fibre waviness and that these strains could be incorporated into computational models to improve their ability to simulate the defect. PMID:29892446

  2. The Living Heart Project: A robust and integrative simulator for human heart function.

    PubMed

    Baillargeon, Brian; Rebelo, Nuno; Fox, David D; Taylor, Robert L; Kuhl, Ellen

    2014-11-01

    The heart is not only our most vital, but also our most complex organ: precisely controlled by the interplay of electrical and mechanical fields, it consists of four chambers and four valves, which act in concert to regulate its filling, ejection, and overall pump function. While numerous computational models exist to study either the electrical or the mechanical response of its individual chambers, the integrative electro-mechanical response of the whole heart remains poorly understood. Here we present a proof-of-concept simulator for a four-chamber human heart model created from computed tomography and magnetic resonance images. We illustrate the governing equations of excitation-contraction coupling and discretize them using a single, unified finite element environment. To illustrate the basic features of our model, we visualize the electrical potential and the mechanical deformation across the human heart throughout its cardiac cycle. To compare our simulation against common metrics of cardiac function, we extract the pressure-volume relationship and show that it agrees well with clinical observations. Our prototype model allows us to explore and understand the key features, physics, and technologies needed to create an integrative, predictive model of the living human heart. Ultimately, our simulator will open opportunities to probe landscapes of clinical parameters, and guide device design and treatment planning in cardiac diseases such as stenosis, regurgitation, or prolapse of the aortic, pulmonary, tricuspid, or mitral valve.
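
    The excitation part of the excitation-contraction coupling referred to above is commonly written as a reaction-diffusion system for the transmembrane potential coupled to a recovery variable; a schematic form (not necessarily the paper's exact equations) is:

      \frac{\partial \phi}{\partial t} = \operatorname{div}(\mathbf{D}\,\nabla\phi) + f^{\phi}(\phi, r),
      \qquad
      \frac{\partial r}{\partial t} = f^{r}(\phi, r)

    where phi is the transmembrane potential, r the recovery variable, and D the conductivity tensor; contraction then enters through an active stress driven by phi.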

  3. Analysis of the mechanical behavior of a titanium scaffold with a repeating unit-cell substructure.

    PubMed

    Ryan, Garrett; McGarry, Patrick; Pandit, Abhay; Apatsidis, Dimitrios

    2009-08-01

    Titanium scaffolds with controlled microarchitecture have been developed for load bearing orthopedic applications. The controlled microarchitecture refers to a repeating array of unit-cells, composed of sintered titanium powder, which make up the scaffold structure. The objective of this current research was to characterize the mechanical performance of three scaffolds with increasing porosity, using finite element analysis (FEA) and to compare the results with experimental data. Scaffolds were scanned using microcomputed tomography and FEA models were generated from the resulting computer models. Macroscale and unit-cell models of the scaffolds were created. The material properties of the sintered titanium powders were first evaluated in mechanical tests and the data used in the FEA. The macroscale and unit-cell FEA models proved to be a good predictor of Young's modulus and yield strength. Although macroscale models showed similar failure patterns and an expected trend in UCS, strain at UCS did not compare well with experimental data. Since a rapid prototyping method was used to create the scaffolds, the original CAD geometries of the scaffold were also evaluated using FEA but they did not reflect the mechanical properties of the physical scaffolds. This indicates that at present, determining the actual geometry of the scaffold through computed tomography imaging is important. Finally, a fatigue analysis was performed on the scaffold to simulate the loading conditions it would experience as a spinal interbody fusion device.

  4. The application of virtual reality systems as a support of digital manufacturing and logistics

    NASA Astrophysics Data System (ADS)

    Golda, G.; Kampa, A.; Paprocka, I.

    2016-08-01

    Modern trends in the development of computer-aided techniques are heading toward the integration of the design of competitive products with so-called "digital manufacturing and logistics", supported by computer simulation software. All phases of the product lifecycle, starting from the design of a new product, through planning and control of manufacturing, assembly, internal logistics and repairs, quality control, distribution to customers and after-sale service, up to its recycling or utilization, should be aided and managed by advanced packages of product lifecycle management software. Important problems in providing an efficient flow of materials in supply chain management across the whole product lifecycle, addressed using computer simulation, are described in this paper. The authors pay attention to the processes of acquiring relevant information and correct data, necessary for virtual modeling and computer simulation of integrated manufacturing and logistics systems. The article describes possible applications of virtual reality software for modeling and simulating production and logistics processes in an enterprise in different aspects of product lifecycle management. The authors demonstrate an effective method of creating computer simulations for digital manufacturing and logistics and show modeled and programmed examples and solutions. They pay attention to development trends and show options of the applications that go beyond the enterprise.

  5. CompNanoTox2015: novel perspectives from a European conference on computational nanotoxicology on predictive nanotoxicology.

    PubMed

    Bañares, Miguel A; Haase, Andrea; Tran, Lang; Lobaskin, Vladimir; Oberdörster, Günter; Rallo, Robert; Leszczynski, Jerzy; Hoet, Peter; Korenstein, Rafi; Hardy, Barry; Puzyn, Tomasz

    2017-09-01

    A first European Conference on Computational Nanotoxicology, CompNanoTox, was held in November 2015 in Benahavís, Spain with the objectives to disseminate and integrate results from the European modeling and database projects (NanoPUZZLES, ModENPTox, PreNanoTox, MembraneNanoPart, MODERN, eNanoMapper and EU COST TD1204 MODENA) as well as to create synergies within the European NanoSafety Cluster. This conference was supported by the COST Action TD1204 MODENA on developing computational methods for toxicological risk assessment of engineered nanoparticles and provided a unique opportunity for cross-fertilization among complementary disciplines. The efforts to develop and validate computational models crucially depend on high quality experimental data and relevant assays which will be the basis to identify relevant descriptors. The ambitious overarching goal of this conference was to promote predictive nanotoxicology, which can only be achieved by a close collaboration between the computational scientists (e.g. database experts, modeling experts for structure, (eco) toxicological effects, performance and interaction of nanomaterials) and experimentalists from different areas (in particular toxicologists, biologists, chemists and material scientists, among others). The main outcome and new perspectives of this conference are summarized here.

  6. CompNanoTox2015: novel perspectives from a European conference on computational nanotoxicology on predictive nanotoxicology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bañares, Miguel A.; Haase, Andrea; Tran, Lang

    A first European Conference on Computational Nanotoxicology, CompNanoTox, was held in November 2015 in Benahavís, Spain with the objectives to disseminate and integrate results from the European modeling and database projects (NanoPUZZLES, ModENPTox, PreNanoTox, MembraneNanoPart, MODERN, eNanoMapper and EU COST TD1204 MODENA) as well as to create synergies within the European NanoSafety Cluster. This conference was supported by the COST Action TD1204 MODENA on developing computational methods for toxicological risk assessment of engineered nanoparticles and provided a unique opportunity for cross-fertilization among complementary disciplines. The efforts to develop and validate computational models crucially depend on high quality experimental data and relevant assays which will be the basis to identify relevant descriptors. The ambitious overarching goal of this conference was to promote predictive nanotoxicology, which can only be achieved by a close collaboration between the computational scientists (e.g. database experts, modeling experts for structure, (eco) toxicological effects, performance and interaction of nanomaterials) and experimentalists from different areas (in particular toxicologists, biologists, chemists and material scientists, among others). The main outcome and new perspectives of this conference are summarized here.

  7. Enabling Real-time Water Decision Support Services Using Model as a Service

    NASA Astrophysics Data System (ADS)

    Zhao, T.; Minsker, B. S.; Lee, J. S.; Salas, F. R.; Maidment, D. R.; David, C. H.

    2014-12-01

    Through application of computational methods and an integrated information system, data and river modeling services can help researchers and decision makers more rapidly understand river conditions under alternative scenarios. To enable this capability, workflows (i.e., analysis and model steps) are created and published as Web services delivered through an internet browser, including model inputs, a published workflow service, and visualized outputs. The RAPID model, a river routing model developed at the University of Texas at Austin for parallel computation of river discharge, has been implemented as a workflow and published as a Web application. This allows non-technical users to remotely execute the model and visualize results as a service through a simple Web interface. The model service and Web application have been prototyped in the San Antonio and Guadalupe River Basin in Texas, with input from university and agency partners. In the future, optimization model workflows will be developed to link with the RAPID model workflow to provide real-time water allocation decision support services.
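
    A minimal sketch of the "model as a service" pattern: a model run wrapped behind an HTTP endpoint. Flask is an arbitrary choice here, and the toy routing step merely stands in for invoking a RAPID workflow.

      from flask import Flask, jsonify, request

      app = Flask(__name__)

      @app.route('/run-model', methods=['POST'])
      def run_model():
          # A real deployment would stage inputs and launch the routing
          # workflow; a placeholder attenuation stands in for the model.
          inflows = request.json.get('inflows', [])
          discharge = [0.9 * q for q in inflows]
          return jsonify({'discharge': discharge})

      if __name__ == '__main__':
          app.run(port=8080)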

  8. Utilizing Three-Dimensional Printing Technology to Assess the Feasibility of High-Fidelity Synthetic Ventricular Septal Defect Models for Simulation in Medical Education.

    PubMed

    Costello, John P; Olivieri, Laura J; Krieger, Axel; Thabit, Omar; Marshall, M Blair; Yoo, Shi-Joon; Kim, Peter C; Jonas, Richard A; Nath, Dilip S

    2014-07-01

    The current educational approach for teaching congenital heart disease (CHD) anatomy to students involves instructional tools and techniques that have significant limitations. This study sought to assess the feasibility of utilizing present-day three-dimensional (3D) printing technology to create high-fidelity synthetic heart models with ventricular septal defect (VSD) lesions and applying these models to a novel, simulation-based educational curriculum for premedical and medical students. Archived, de-identified magnetic resonance images of five common VSD subtypes were obtained. These cardiac images were then segmented and built into 3D computer-aided design models using Mimics Innovation Suite software. An Objet500 Connex 3D printer was subsequently utilized to print a high-fidelity heart model for each VSD subtype. Next, a simulation-based educational curriculum using these heart models was developed and implemented in the instruction of 29 premedical and medical students. Assessment of this curriculum was undertaken with Likert-type questionnaires. High-fidelity VSD models were successfully created utilizing magnetic resonance imaging data and 3D printing. Following instruction with these high-fidelity models, all students reported significant improvement in knowledge acquisition (P < .0001), knowledge reporting (P < .0001), and structural conceptualization (P < .0001) of VSDs. It is feasible to use present-day 3D printing technology to create high-fidelity heart models with complex intracardiac defects. Furthermore, this tool forms the foundation for an innovative, simulation-based educational approach to teach students about CHD and creates a novel opportunity to stimulate their interest in this field. © The Author(s) 2014.

  9. Computer Graphic Design Using Auto-CAD and Plug Nozzle Research

    NASA Technical Reports Server (NTRS)

    Rogers, Rayna C.

    2004-01-01

    The purpose of creating computer-generated images varies widely. They can be used for computational fluid dynamics (CFD), or as a blueprint for designing parts. The schematic that I will be working on this summer will be used to create nozzles that are part of a larger system. At this phase in the project, the nozzles needed for the systems have been fabricated. One part of my mission is to create both three-dimensional and two-dimensional models of the nozzles in Auto-CAD 2002. The research on plug nozzles will allow me to better understand how they provide the thrust needed for a missile to take off. NASA and the United States military are working together to develop a new design concept. On most missiles a convergent-divergent nozzle is used to create thrust. However, the two are looking into different concepts for the nozzle. The standard convergent-divergent nozzle forces a mixture of combustible fluids and air through an area smaller than the one in which the combination was mixed. Once it passes through this smaller area, known as A8, it exits through the end of the nozzle, area A9, which is larger than the first. This creates enough thrust for the mechanism, whether it is an F-18 fighter jet or a missile. The A9 section of the convergent-divergent nozzle has a mechanism that controls how large A9 can be. This is needed because the pressure of the air coming out of the nozzle must be equal to the ambient pressure; otherwise there will be a loss of performance in the machine. The plug nozzle, however, does not need a variable A9. When the airflow comes out, it automatically adjusts to the ambient pressure. The objective of this design is to create a plug nozzle that is not as mechanically complicated as its counterpart, the convergent-divergent nozzle.
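
    The pressure-matching argument above corresponds to the standard nozzle thrust relation, sketched below; the pressure term vanishes when exit pressure equals ambient, which is why a fixed A9 operated off-design loses performance while a plug nozzle self-adjusts. The numbers are illustrative only.

      def nozzle_thrust(mdot, v_exit, p_exit, p_ambient, a_exit):
          # F = mdot * Ve + (Pe - Pa) * Ae : momentum flux plus the
          # pressure-mismatch term acting over the exit area (A9).
          return mdot * v_exit + (p_exit - p_ambient) * a_exit

      print(nozzle_thrust(10.0, 2000.0, 101325.0, 101325.0, 0.05))  # matched exit
      print(nozzle_thrust(10.0, 2000.0, 60000.0, 101325.0, 0.05))   # thrust loss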

  10. Application of cellular automata approach for cloud simulation and rendering

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Christopher Immanuel, W.; Paul Mary Deborrah, S.; Samuel Selvaraj, R.

    Current techniques for creating clouds in games and other real-time applications produce static, homogeneous clouds. These clouds, while viable for real-time applications, do not exhibit the organic feel that clouds in nature exhibit. The clouds produced with the cellular automata approach, when viewed over a period of time, are able to deform their initial shape and move in a more organic and dynamic way. With this cloud shape technology we should be able in the future to create even more cloud shapes in real time with more forces. Clouds are an essential part of any computer model of a landscape or an animation of an outdoor scene. A realistic animation of clouds is also important for creating scenes for flight simulators, movies, games, and other applications. Our goal was to create a realistic animation of clouds.
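
    A minimal sketch of a boolean cloud cellular automaton in the spirit described, with growth rules after the Nagel-Raschke/Dobashi style (the paper's exact rule set may differ; grid size and seeding probabilities are illustrative).

      import numpy as np

      def step(cloud, humidity, activation):
          # Boolean CA update: activation spreads from neighbouring cells,
          # humid cells turn to cloud where activation fires, cloud persists.
          neighbor_act = np.zeros_like(activation)
          for shift, axis in [(1, 0), (-1, 0), (1, 1), (-1, 1)]:
              neighbor_act |= np.roll(activation, shift, axis=axis)
          new_cloud = cloud | activation
          new_humidity = humidity & ~activation
          new_activation = ~activation & humidity & neighbor_act
          return new_cloud, new_humidity, new_activation

      rng = np.random.default_rng(0)
      hum = rng.random((64, 64)) < 0.3    # humid cells
      act = rng.random((64, 64)) < 0.01   # initial activation seeds
      cld = np.zeros((64, 64), dtype=bool)
      for _ in range(10):
          cld, hum, act = step(cld, hum, act)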

  11. Evaluating local indirect addressing in SIMD processors

    NASA Technical Reports Server (NTRS)

    Middleton, David; Tomboulian, Sherryl

    1989-01-01

    In the design of parallel computers, there exists a tradeoff between the number and power of individual processors. The single instruction stream, multiple data stream (SIMD) model of parallel computers lies at one extreme of the resulting spectrum. The available hardware resources are devoted to creating the largest possible number of processors, and consequently each individual processor must use the fewest possible resources. Disagreement exists as to whether SIMD processors should be able to generate addresses individually into their local data memory, or all processors should access the same address. The tradeoff is examined between the increased capability and the reduced number of processors that occurs in this single instruction stream, multiple, locally addressed, data (SIMLAD) model. Factors are assembled that affect this design choice, and the SIMLAD model is compared with the bare SIMD and the MIMD models.

  12. A distributed, dynamic, parallel computational model: the role of noise in velocity storage

    PubMed Central

    Merfeld, Daniel M.

    2012-01-01

    Networks of neurons perform complex calculations using distributed, parallel computation, including dynamic “real-time” calculations required for motion control. The brain must combine sensory signals to estimate the motion of body parts using imperfect information from noisy neurons. Models and experiments suggest that the brain sometimes optimally minimizes the influence of noise, although it remains unclear when and precisely how neurons perform such optimal computations. To investigate, we created a model of velocity storage based on a relatively new technique, “particle filtering”, that is both distributed and parallel. It extends existing observer and Kalman filter models of vestibular processing by simulating the observer model many times in parallel with noise added. During simulation, the variance of the particles defining the estimator state is used to compute the particle filter gain. We applied our model to estimate one-dimensional angular velocity during yaw rotation, which yielded estimates for the velocity storage time constant, afferent noise, and perceptual noise that matched experimental data. We also found that the velocity storage time constant was Bayesian optimal by comparing the estimate of our particle filter with the estimate of the Kalman filter, which is optimal. The particle filter demonstrated a reduced velocity storage time constant when afferent noise increased, which mimics what is known about aminoglycoside ablation of semicircular canal hair cells. This model helps bridge the gap between parallel distributed neural computation and systems-level behavioral responses like the vestibuloocular response and perception. PMID:22514288
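
    For illustration, a generic bootstrap particle filter for a scalar angular-velocity state is sketched below; unlike the paper's filter, it does not derive a gain from the particle variance, and all noise levels are illustrative.

      import numpy as np

      def particle_filter(measurements, n_particles=1000,
                          process_noise=0.1, meas_noise=0.5):
          # Bootstrap filter: propagate particles with noise, weight them by
          # the measurement likelihood, estimate, then resample.
          rng = np.random.default_rng(0)
          particles = rng.normal(0.0, 1.0, n_particles)
          estimates = []
          for z in measurements:
              particles += rng.normal(0.0, process_noise, n_particles)
              weights = np.exp(-0.5 * ((z - particles) / meas_noise) ** 2) + 1e-300
              weights /= weights.sum()
              estimates.append(np.sum(weights * particles))
              idx = rng.choice(n_particles, n_particles, p=weights)
              particles = particles[idx]
          return np.array(estimates)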

  13. Towards Anatomic Scale Agent-Based Modeling with a Massively Parallel Spatially Explicit General-Purpose Model of Enteric Tissue (SEGMEnT_HPC)

    PubMed Central

    Cockrell, Robert Chase; Christley, Scott; Chang, Eugene; An, Gary

    2015-01-01

    Perhaps the greatest challenge currently facing the biomedical research community is the ability to integrate highly detailed cellular and molecular mechanisms to represent clinical disease states as a pathway to engineer effective therapeutics. This is particularly evident in the representation of organ-level pathophysiology in terms of abnormal tissue structure, which, through histology, remains a mainstay in disease diagnosis and staging. As such, being able to generate anatomic scale simulations is a highly desirable goal. While computational limitations have previously constrained the size and scope of multi-scale computational models, advances in the capacity and availability of high-performance computing (HPC) resources have greatly expanded the ability of computational models of biological systems to achieve anatomic, clinically relevant scale. Diseases of the intestinal tract are prime examples of pathophysiological processes that manifest at multiple scales of spatial resolution, with structural abnormalities present at the microscopic, macroscopic and organ levels. In this paper, we describe a novel, massively parallel computational model of the gut, the Spatially Explicit General-purpose Model of Enteric Tissue_HPC (SEGMEnT_HPC), which extends an existing model of the gut epithelium, SEGMEnT, in order to create cell-for-cell anatomic scale simulations. We present an example implementation of SEGMEnT_HPC that simulates the pathogenesis of ileal pouchitis, an important clinical entity that affects patients following remedial surgery for ulcerative colitis. PMID:25806784

  14. Mind, Machine, and Creativity: An Artist's Perspective.

    PubMed

    Sundararajan, Louise

    2014-06-01

    Harold Cohen is a renowned painter who has developed a computer program, AARON, to create art. While AARON has been hailed as one of the most creative AI programs, Cohen consistently rejects the claims of machine creativity. Questioning the possibility for AI to model human creativity, Cohen suggests in so many words that the human mind takes a different route to creativity, a route that privileges the relational, rather than the computational, dimension of cognition. This unique perspective on the tangled web of mind, machine, and creativity is explored by an application of three relational models of the mind to an analysis of Cohen's talks and writings, which are available on his website: www.aaronshome.com.

  15. Mind, Machine, and Creativity: An Artist's Perspective

    PubMed Central

    Sundararajan, Louise

    2014-01-01

    Harold Cohen is a renowned painter who has developed a computer program, AARON, to create art. While AARON has been hailed as one of the most creative AI programs, Cohen consistently rejects the claims of machine creativity. Questioning the possibility for AI to model human creativity, Cohen suggests in so many words that the human mind takes a different route to creativity, a route that privileges the relational, rather than the computational, dimension of cognition. This unique perspective on the tangled web of mind, machine, and creativity is explored by an application of three relational models of the mind to an analysis of Cohen's talks and writings, which are available on his website: www.aaronshome.com. PMID:25541564

  16. Computer-aided position planning of miniplates to treat facial bone defects

    PubMed Central

    Wallner, Jürgen; Gall, Markus; Chen, Xiaojun; Schwenzer-Zimmerer, Katja; Reinbacher, Knut; Schmalstieg, Dieter

    2017-01-01

    In this contribution, a software system for computer-aided position planning of miniplates to treat facial bone defects is proposed. The intra-operatively used bone plates have to be passively adapted on the underlying bone contours for adequate bone fragment stabilization. However, this procedure can lead to frequent intra-operative material readjustments, especially in complex surgical cases. Our approach is able to fit a selection of common implant models on the surgeon’s desired position in a 3D computer model. This happens with respect to the surrounding anatomical structures, always including the possibility of adjusting both the direction and the position of the used osteosynthesis material. By using the proposed software, surgeons are able to pre-plan the resulting implant in its form and morphology with the aid of a computer-visualized model within a few minutes. Further, the resulting model can be stored in STL file format, the commonly used format for 3D printing. Using this technology, surgeons are able to print the virtually generated implant, or create an individually designed bending tool. This method leads to osteosynthesis materials adapted to the surrounding anatomy and requires only a minimal amount of money and time. PMID:28817607

  17. Computer-aided position planning of miniplates to treat facial bone defects.

    PubMed

    Egger, Jan; Wallner, Jürgen; Gall, Markus; Chen, Xiaojun; Schwenzer-Zimmerer, Katja; Reinbacher, Knut; Schmalstieg, Dieter

    2017-01-01

    In this contribution, a software system for computer-aided position planning of miniplates to treat facial bone defects is proposed. The intra-operatively used bone plates have to be passively adapted on the underlying bone contours for adequate bone fragment stabilization. However, this procedure can lead to frequent intra-operative material readjustments, especially in complex surgical cases. Our approach is able to fit a selection of common implant models on the surgeon's desired position in a 3D computer model. This happens with respect to the surrounding anatomical structures, always including the possibility of adjusting both the direction and the position of the used osteosynthesis material. By using the proposed software, surgeons are able to pre-plan the resulting implant in its form and morphology with the aid of a computer-visualized model within a few minutes. Further, the resulting model can be stored in STL file format, the commonly used format for 3D printing. Using this technology, surgeons are able to print the virtually generated implant, or create an individually designed bending tool. This method leads to osteosynthesis materials adapted to the surrounding anatomy and requires only a minimal amount of money and time.

  18. A vibration model for centrifugal contactors

    NASA Astrophysics Data System (ADS)

    Leonard, R. A.; Wasserman, M. O.; Wygmans, D. G.

    1992-11-01

    Using the transfer matrix method, we created the Excel worksheet 'Beam' for analyzing vibrations in centrifugal contactors. With this worksheet, a user can calculate the first natural frequency of the motor/rotor system for a centrifugal contactor. We determined a typical value for the bearing stiffness (k_B) of a motor after measuring the k_B value for three different motors. The k_B value is an important parameter in this model, but it is not normally available for motors. The assumptions that we made in creating the Beam worksheet were verified by comparing the calculated results with those from a VAX computer program, BEAM IV. The Beam worksheet was applied to several contactor designs for which we have experimental data and found to work well.
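
    For intuition, the simplest version of the calculation the worksheet supports is a single-degree-of-freedom estimate of the first natural frequency from the bearing stiffness k_B and an effective rotor mass. The sketch below uses hypothetical values; the Beam worksheet itself applies the full transfer matrix method:

      import math

      k_B = 2.0e6   # bearing stiffness, N/m (hypothetical)
      m   = 15.0    # effective motor/rotor mass, kg (hypothetical)

      # Single-degree-of-freedom estimate; a transfer-matrix model of the
      # full motor/rotor shaft refines this considerably.
      omega_n = math.sqrt(k_B / m)          # rad/s
      f_n = omega_n / (2.0 * math.pi)       # Hz
      print(f"first natural frequency ~ {f_n:.1f} Hz")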

  19. Assessment of performances of sun zenith angle and altitude parameterisations of atmospheric radiative transfer for spectral surface downwelling solar irradiance

    NASA Astrophysics Data System (ADS)

    Wald, L.; Blanc, Ph.

    2010-09-01

    Satellite-derived assessments of surface downwelling solar irradiance are more and more used by engineering companies in solar energy. Performances are judged satisfactory for the time being. Nevertheless, requests for more accuracy are increasing, in particular in the spectral definition and in the decomposition of the global radiation into direct and diffuse radiations. One approach to reach this goal is to improve both the modelling of the radiative transfer and the quality of the inputs describing the optical state of the atmosphere. Within their joint project Heliosat-4, DLR and MINES ParisTech have adopted this approach to create advanced databases of solar irradiance succeeding the current HelioClim and SolEMi databases. Regarding the model, we have opted for libRadtran, a well-known model of proven quality. Like many similar models, libRadtran is very time-consuming to run when millions or more pixels or grid cells must be processed, which is incompatible with real-time operational processing. One may adopt the abacus approach, or look-up tables, to overcome the problem: the model is run for a limited number of cases covering the whole range of values taken by the various inputs, and abaci are thus constructed. For each real case, the irradiance value is computed by interpolating within the abaci; in this way, real-time operation can be envisioned. Nevertheless, the computation of the abaci themselves requires large computing capabilities. In addition, searching the abaci to find the values to interpolate can be time-consuming, as the abaci are very large: several million values in total. Moreover, the approach raises the problem of extrapolation when a parameter falls out of range during use of the abaci. Parameterisation, where possible, is a means to reduce the amount of computation and consequently the effort to create the abaci, their size, the extrapolation problem, and the searching time. It describes, in an analytical manner and with a few parameters, the change in irradiance with a specific variable. This communication discusses two parameterisations found in the literature: one deals with the solar zenith angle, the other with the altitude. We assess their performances in retrieving solar irradiance for 32 spectral bands, from 240 nm to 4606 nm. The model libRadtran is run to create data sets for all sun zenith angles (every 5 degrees) and all altitudes (every km); these data sets are considered as a reference. Then, for each parameterisation, we compute the parameters using two irradiance values for specific values of angle (e.g., 0 and 60 degrees) or altitude (e.g., 0 and 3 km). The parameterisations are then applied to other values of angle and altitude. Differences between these assessments and the reference values of irradiance are computed and analysed. We conclude on the level of performance of each parameterisation for each spectral band as well as for the total irradiance, and discuss the possible use of these parameterisations in the future Heliosat-4 method and possible improvements. The research leading to these results has received funding from the European Union's Seventh Framework Programme (FP7/2007-2013) under Grant Agreement no. 218793 (MACC project).
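
    In its simplest one-variable form, the abacus idea reduces to precomputing the model on a grid and interpolating. A minimal sketch, with stand-in irradiance values in place of actual libRadtran output:

      import numpy as np

      # Hypothetical abacus: irradiance precomputed every 5 degrees of
      # solar zenith angle (stand-in values; libRadtran would supply these).
      sza_grid = np.arange(0, 90, 5.0)
      irr_grid = 1000.0 * np.cos(np.radians(sza_grid)) ** 1.2

      def irradiance(sza_deg):
          """Interpolate within the abacus instead of rerunning the model."""
          return np.interp(sza_deg, sza_grid, irr_grid)

      print(irradiance(37.3))   # real-time lookup for an arbitrary angle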

  20. Analysis of geologic terrain models for determination of optimum SAR sensor configuration and optimum information extraction for exploration of global non-renewable resources. Pilot study: Arkansas Remote Sensing Laboratory, part 1, part 2, and part 3

    NASA Technical Reports Server (NTRS)

    Kaupp, V. H.; Macdonald, H. C.; Waite, W. P.; Stiles, J. A.; Frost, F. S.; Shanmugam, K. S.; Smith, S. A.; Narayanan, V.; Holtzman, J. C. (Principal Investigator)

    1982-01-01

    Computer-generated radar simulations and mathematical geologic terrain models were used to establish the optimum radar sensor operating parameters for geologic research. An initial set of mathematical geologic terrain models was created for three basic landforms and families of simulated radar images were prepared from these models for numerous interacting sensor, platform, and terrain variables. The tradeoffs between the various sensor parameters and the quantity and quality of the extractable geologic data were investigated as well as the development of automated techniques of digital SAR image analysis. Initial work on a texture analysis of SEASAT SAR imagery is reported. Computer-generated radar simulations are shown for combinations of two geologic models and three SAR angles of incidence.

  1. Mapping chemicals in air using an environmental CAT scanning system: evaluation of algorithms

    NASA Astrophysics Data System (ADS)

    Samanta, A.; Todd, L. A.

    A new technique is being developed that creates near real-time maps of chemical concentrations in air for environmental and occupational applications. This technique, which we call Environmental CAT Scanning, combines the real-time measurement capability of open-path Fourier transform infrared spectroscopy with the mapping capabilities of computed tomography to produce two-dimensional concentration maps. With this system, a network of open-path measurements is obtained over an area; the measurements are then processed using a tomographic algorithm to reconstruct the concentrations. This research focused on the process of evaluating and selecting appropriate reconstruction algorithms, for use in the field, by using test concentration data from both computer simulation and laboratory chamber studies. Four algorithms were tested using three types of data: (1) experimental open-path data from studies that used a prototype open-path Fourier transform/computed tomography system in an exposure chamber; (2) synthetic open-path data generated from maps created by kriging point samples taken in the chamber studies (in 1); and (3) synthetic open-path data generated using a chemical dispersion model to create time series maps. The iterative algorithms used to reconstruct the concentration data were: Algebraic Reconstruction Technique without Weights (ART1), Algebraic Reconstruction Technique with Weights (ARTW), Maximum Likelihood with Expectation Maximization (MLEM) and Multiplicative Algebraic Reconstruction Technique (MART). Maps were evaluated quantitatively and qualitatively. In general, MART and MLEM performed best, followed by ARTW and ART1. However, algorithm performance varied under different contaminant scenarios. This study showed the importance of using a variety of maps, particularly those generated using dispersion models. The time series maps provided a more rigorous test of the algorithms and allowed distinctions to be made among the algorithms. A comprehensive evaluation of algorithms for the environmental application of tomography requires, before field implementation, a battery of test concentration data that models reality and tests the limits of the algorithms.
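
    As a hedged illustration of how such iterative algorithms work, the sketch below implements a basic unweighted algebraic reconstruction (ART1-style) update on a toy geometry; the path matrix and measurements are invented for the example:

      import numpy as np

      # Toy reconstruction: A maps 4 unknown pixel concentrations to 3
      # open-path (line-integral) measurements; values are illustrative.
      A = np.array([[1., 1., 0., 0.],
                    [0., 0., 1., 1.],
                    [1., 0., 1., 0.]])
      y = np.array([8., 6., 7.])          # measured path-integrated concentrations

      x = np.zeros(4)                     # initial concentration map
      for it in range(50):                # ART: cycle over rays, project onto each
          for i in range(len(y)):
              a = A[i]
              x += (y[i] - a @ x) / (a @ a) * a
              x = np.clip(x, 0.0, None)   # concentrations cannot be negative
      print(x)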

  2. Proceedings of the Augmented VIsual Display (AVID) Research Workshop

    NASA Technical Reports Server (NTRS)

    Kaiser, Mary K. (Editor); Sweet, Barbara T. (Editor)

    1993-01-01

    The papers, abstracts, and presentations were presented at a three day workshop focused on sensor modeling and simulation, and image enhancement, processing, and fusion. The technical sessions emphasized how sensor technology can be used to create visual imagery adequate for aircraft control and operations. Participants from industry, government, and academic laboratories contributed to panels on Sensor Systems, Sensor Modeling, Sensor Fusion, Image Processing (Computer and Human Vision), and Image Evaluation and Metrics.

  3. Sensitivity analysis of navy aviation readiness based sparing model

    DTIC Science & Technology

    2017-09-01

    ... variability. (See Figure 4.) Figure 4 ("Research design flowchart") lays out the four steps of the methodology, starting in the upper left-hand ... as a function of changes in key inputs. We develop NAVARM Experimental Designs (NED), a computational tool created by applying a state-of-the-art ... experimental design to the NAVARM model. Statistical analysis of the resulting data identifies the most influential cost factors. Those are, in order of ...

  4. Rapid prototyping and stereolithography in dentistry

    PubMed Central

    Nayar, Sanjna; Bhuminathan, S.; Bhat, Wasim Manzoor

    2015-01-01

    The word rapid prototyping (RP) was first used in the mechanical engineering field in the early 1980s to describe the act of producing a prototype, a unique product, the first product, or a reference model. In the past, prototypes were handmade by sculpting or casting, and their fabrication demanded a long time. Any and every prototype should undergo evaluation, correction of defects, and approval before the beginning of its mass or large-scale production. Prototypes may also be used for specific or restricted purposes, in which case they are usually called a preseries model. With the development of information technology, three-dimensional models can be devised and built based on virtual prototypes. Computers can now be used to create accurately detailed projects that can be assessed from different perspectives in a process known as computer aided design (CAD). To materialize virtual objects using CAD, a computer aided manufacture (CAM) process has been developed. To transform a virtual file into a real object, CAM operates using a machine connected to a computer, similar to a printer or peripheral device. In 1987, Brix and Lambrecht used, for the first time, a prototype in health care. It was a three-dimensional model manufactured using a computer numerical control device, a type of machine that was the predecessor of RP. In 1991, human anatomy models produced with a technology called stereolithography were first used in a maxillofacial surgery clinic in Vienna. PMID:26015715

  5. Rapid prototyping and stereolithography in dentistry.

    PubMed

    Nayar, Sanjna; Bhuminathan, S; Bhat, Wasim Manzoor

    2015-04-01

    The word rapid prototyping (RP) was first used in the mechanical engineering field in the early 1980s to describe the act of producing a prototype, a unique product, the first product, or a reference model. In the past, prototypes were handmade by sculpting or casting, and their fabrication demanded a long time. Any and every prototype should undergo evaluation, correction of defects, and approval before the beginning of its mass or large-scale production. Prototypes may also be used for specific or restricted purposes, in which case they are usually called a preseries model. With the development of information technology, three-dimensional models can be devised and built based on virtual prototypes. Computers can now be used to create accurately detailed projects that can be assessed from different perspectives in a process known as computer aided design (CAD). To materialize virtual objects using CAD, a computer aided manufacture (CAM) process has been developed. To transform a virtual file into a real object, CAM operates using a machine connected to a computer, similar to a printer or peripheral device. In 1987, Brix and Lambrecht used, for the first time, a prototype in health care. It was a three-dimensional model manufactured using a computer numerical control device, a type of machine that was the predecessor of RP. In 1991, human anatomy models produced with a technology called stereolithography were first used in a maxillofacial surgery clinic in Vienna.

  6. Real longitudinal data analysis for real people: building a good enough mixed model.

    PubMed

    Cheng, Jing; Edwards, Lloyd J; Maldonado-Molina, Mildred M; Komro, Kelli A; Muller, Keith E

    2010-02-20

    Mixed effects models have become very popular, especially for the analysis of longitudinal data. One challenge is how to build a good enough mixed effects model. In this paper, we suggest a systematic strategy for addressing this challenge and introduce easily implemented practical advice to build mixed effects models. A general discussion of the scientific strategies motivates the recommended five-step procedure for model fitting. The need to model both the mean structure (the fixed effects) and the covariance structure (the random effects and residual error) creates the fundamental flexibility and complexity. Some very practical recommendations help to conquer the complexity. Centering, scaling, and full-rank coding of all the predictor variables radically improve the chances of convergence, computing speed, and numerical accuracy. Applying computational and assumption diagnostics from univariate linear models to mixed model data greatly helps to detect and solve the related computational problems. The approach helps to fit more general covariance models, a crucial step in selecting a credible covariance model needed for defensible inference. A detailed demonstration of the recommended strategy is based on data from a published study of a randomized trial of a multicomponent intervention to prevent young adolescents' alcohol use. The discussion highlights a need for additional covariance and inference tools for mixed models, as well as the need to improve how scientists and statisticians teach and review the process of finding a good enough mixed model. (c) 2009 John Wiley & Sons, Ltd.
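
    A minimal sketch of the centering-and-scaling advice, using the mixed-model routines in Python's statsmodels package on hypothetical longitudinal data (the paper itself is not tied to any particular software):

      import numpy as np
      import pandas as pd
      import statsmodels.formula.api as smf

      # Hypothetical longitudinal data: repeated outcome scores per subject.
      rng = np.random.default_rng(1)
      df = pd.DataFrame({
          "subject": np.repeat(np.arange(40), 5),
          "age":     np.tile(np.arange(11, 16), 40),
          "score":   rng.normal(10, 3, 200),
      })

      # Center and scale the predictor, as recommended, to improve
      # convergence, computing speed, and numerical accuracy.
      df["age_z"] = (df["age"] - df["age"].mean()) / df["age"].std()

      model = smf.mixedlm("score ~ age_z", df, groups=df["subject"],
                          re_formula="~age_z")   # random intercept and slope
      result = model.fit()
      print(result.summary())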

  7. Influence of Alveolar Bone Defects on the Stress Distribution in Quad Zygomatic Implant-Supported Maxillary Prosthesis.

    PubMed

    Duan, Yuanyuan; Chandran, Ravi; Cherry, Denise

    The purpose of this study was to create three-dimensional composite models of quad zygomatic implant-supported maxillary prostheses with a variety of alveolar bone defects around implant sites, and to investigate the stress distribution in the surrounding bone using the finite element analysis (FEA) method. Three-dimensional models of titanium zygomatic implants, maxillary prostheses, and human skulls were created and assembled using Mimics based on microcomputed tomography and cone beam computed tomography images. A variety of additional bone defects were created at the locations of four zygomatic implants to simulate multiple clinical scenarios. The volume meshes were created and exported into FEA software. Material properties were assigned respectively for all the structures, and von Mises stress data were collected and plotted in the postprocessing module. The maximum stress in the surrounding bone was located in the crestal bone around zygomatic implants. The maximum stress in the prostheses was located at the angled area of the implant-abutment connection. The model with anterior defects had a higher peak stress value than the model with posterior defects. All the models with additional bone defects had higher maximum stress values than the control model without additional bone loss. Additional alveolar bone loss has a negative influence on the stress concentration in the surrounding bone of quad zygomatic implant-supported prostheses. More care should be taken if these additional bone defects are at the sites of anterior zygomatic implants.

  8. Building a Propulsion Experiment Project Management Environment

    NASA Technical Reports Server (NTRS)

    Keiser, Ken; Tanner, Steve; Hatcher, Danny; Graves, Sara

    2004-01-01

    What do you get when you cross rocket scientists with computer geeks? It is an interactive, distributed computing web of tools and services providing a more productive environment for propulsion research and development. The Rocket Engine Advancement Program 2 (REAP2) project involves researchers at several institutions collaborating on propulsion experiments and modeling. In an effort to facilitate these collaborations among researchers at different locations and with different specializations, researchers at the Information Technology and Systems Center, University of Alabama in Huntsville, are creating a prototype web-based interactive information system in support of propulsion research. This system, to be based on experience gained in creating similar systems for NASA Earth science field experiment campaigns such as the Convection and Moisture Experiments (CAMEX), will assist in the planning and analysis of model and experiment results across REAP2 participants. The initial version of the Propulsion Experiment Project Management Environment (PExPM) consists of a controlled-access web portal facilitating the drafting and sharing of working documents and publications. Interactive tools for building and searching an annotated bibliography of publications related to REAP2 research topics have been created to help organize and maintain the results of literature searches. Also, work is underway, with some initial prototypes in place, for interactive project management tools allowing project managers to schedule experiment activities, track status, and report on results. This paper describes current successes, plans, and expected challenges for this project.

  9. GRIDGEN Version 1.0: a computer program for generating unstructured finite-volume grids

    USGS Publications Warehouse

    Lien, Jyh-Ming; Liu, Gaisheng; Langevin, Christian D.

    2015-01-01

    GRIDGEN is a computer program for creating layered quadtree grids for use with numerical models, such as the MODFLOW–USG program for simulation of groundwater flow. The program begins by reading a three-dimensional base grid, which can have variable row and column widths and spatially variable cell top and bottom elevations. From this base grid, GRIDGEN will continuously divide into four any cell intersecting user-provided refinement features (points, lines, and polygons) until the desired level of refinement is reached. GRIDGEN will then smooth, or balance, the grid so that no two adjacent cells, including overlying and underlying cells, differ by more than a user-specified level tolerance. Once these gridding processes are completed, GRIDGEN saves a tree structure file so that the layered quadtree grid can be quickly reconstructed as needed. Once a tree structure file has been created, GRIDGEN can then be used to (1) export the layered quadtree grid as a shapefile, (2) export grid connectivity and cell information as ASCII text files for use with MODFLOW–USG or other numerical models, and (3) intersect the grid with shapefiles of points, lines, or polygons, and save intersection output as ASCII text files and shapefiles. The GRIDGEN program is demonstrated by creating a layered quadtree grid for the Biscayne aquifer in Miami-Dade County, Florida, using hydrologic features to control where refinement is added.
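
    The core refinement step can be sketched compactly. The following is a generic quadtree subdivision in Python, not GRIDGEN's code, and it omits the smoothing/balancing and export stages:

      # Minimal quadtree-style refinement sketch: divide any cell that
      # intersects a refinement point until the requested level is reached.

      def refine(cell, points, max_level):
          x, y, size, level = cell
          hits = [p for p in points
                  if x <= p[0] < x + size and y <= p[1] < y + size]
          if not hits or level >= max_level:
              return [cell]
          half, out = size / 2.0, []
          for dx in (0.0, half):          # divide the cell into four children
              for dy in (0.0, half):
                  out += refine((x + dx, y + dy, half, level + 1),
                                hits, max_level)
          return out

      # One base cell covering the unit square, refined toward a well location.
      cells = refine((0.0, 0.0, 1.0, 0), [(0.3, 0.7)], max_level=3)
      print(len(cells), "cells")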

  10. Creating Printed Materials for Mathematics with a Macintosh Computer.

    ERIC Educational Resources Information Center

    Mahler, Philip

    This document gives instructions on how to use a Macintosh computer to create printed materials for mathematics. A Macintosh computer, Microsoft Word, and objected-oriented (Draw-type) art program, and a function-graphing program are capable of producing high quality printed instructional materials for mathematics. Word 5.1 has an equation editor…

  11. Creating and Using a Computer Networking and Systems Administration Laboratory Built under Relaxed Financial Constraints

    ERIC Educational Resources Information Center

    Conlon, Michael P.; Mullins, Paul

    2011-01-01

    The Computer Science Department at Slippery Rock University created a laboratory for its Computer Networks and System Administration and Security courses under relaxed financial constraints. This paper describes the department's experience designing and using this laboratory, including lessons learned and descriptions of some student projects…

  12. A conceptual snow model with an analytic resolution of the heat and phase change equations

    NASA Astrophysics Data System (ADS)

    Riboust, Philippe; Le Moine, Nicolas; Thirel, Guillaume; Ribstein, Pierre

    2017-04-01

    Compared to degree-day snow models, physically-based snow models resolve more processes in an attempt to achieve a better representation of reality. Often these physically-based models resolve the heat transport equations in snow using a vertical discretization of the snowpack. The snowpack is decomposed into several layers in which the mechanical and thermal states of the snow are calculated. A higher number of layers in the snowpack allows for better accuracy but also tends to increase the computational cost. In order to develop a snow model that estimates the temperature profile of snow at a lower computational cost, we used an analytical decomposition of the vertical profile using eigenfunctions (i.e., trigonometric functions adapted to the specific boundary conditions). The mass transfer of snow melt has also been estimated using an analytical conceptualization of runoff fingering and matrix flow. As external meteorological forcing, the model uses solar and atmospheric radiation, air temperature, atmospheric humidity and precipitation. It has been tested and calibrated at point scale at two different stations in the Alps: Col de Porte (France, 1325 m) and Weissfluhjoch (Switzerland, 2540 m). A sensitivity analysis of model parameters and model inputs will be presented together with a comparison with measured snow surface temperature, SWE, snow depth, temperature profile and snow melt data. The snow model is designed to be ultimately coupled with hydrological models for rainfall-runoff modeling in mountainous areas. We hope to create a model that is faster than physically-based models but captures more physical processes than degree-day snow models. This should help to build a more reliable snow model capable of being easily calibrated with remote sensing and in situ observations, or of assimilating these data for forecasting purposes.
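
    The eigenfunction idea can be illustrated on the one-dimensional heat equation with fixed boundary temperatures. This generic sketch (not the authors' model; parameter values are illustrative) evaluates a truncated series solution analytically instead of stepping a layered grid:

      import numpy as np

      # Truncated eigenfunction (Fourier sine) solution of the 1-D heat
      # equation T_t = alpha * T_xx on 0 <= x <= L with T(0) = T(L) = 0.
      alpha, L, n_modes = 1e-6, 1.0, 20          # illustrative values
      x = np.linspace(0.0, L, 101)
      dx = x[1] - x[0]
      T0 = np.where(x < 0.5, -5.0, 0.0)          # initial temperature profile

      # Project the initial profile onto the eigenfunctions sin(n*pi*x/L).
      coeffs = [2.0 / L * np.sum(T0 * np.sin(n * np.pi * x / L)) * dx
                for n in range(1, n_modes + 1)]

      def temperature(t):
          """Analytic profile at time t: each mode decays independently."""
          return sum(c * np.exp(-alpha * (n * np.pi / L) ** 2 * t)
                     * np.sin(n * np.pi * x / L)
                     for n, c in enumerate(coeffs, start=1))

      print(temperature(3600.0)[50])             # mid-depth temperature after 1 h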

  13. Standardized Cardiovascular Data for Clinical Research, Registries, and Patient Care

    PubMed Central

    Anderson, H. Vernon; Weintraub, William S.; Radford, Martha J.; Kremers, Mark S.; Roe, Matthew T.; Shaw, Richard E.; Pinchotti, Dana M.; Tcheng, James E.

    2013-01-01

    Relatively little attention has been focused on standardization of data exchange in clinical research studies and patient care activities. Both are usually managed locally using separate and generally incompatible data systems at individual hospitals or clinics. In the past decade there have been nascent efforts to create data standards for clinical research and patient care data, and to some extent these are helpful in providing a degree of uniformity. Nevertheless these data standards generally have not been converted into accepted computer-based language structures that could permit reliable data exchange across computer networks. The National Cardiovascular Research Infrastructure (NCRI) project was initiated with a major objective of creating a model framework for standard data exchange in all clinical research, clinical registry, and patient care environments, including all electronic health records. The goal is complete syntactic and semantic interoperability. A Data Standards Workgroup was established to create or identify and then harmonize clinical definitions for a base set of standardized cardiovascular data elements that could be used in this network infrastructure. Recognizing the need for continuity with prior efforts, the Workgroup examined existing data standards sources. A basic set of 353 elements was selected. The NCRI staff then collaborated with the two major technical standards organizations in healthcare, the Clinical Data Interchange Standards Consortium and Health Level 7 International, as well as with staff from the National Cancer Institute Enterprise Vocabulary Services. Modeling and mapping were performed to represent (instantiate) the data elements in appropriate technical computer language structures for endorsement as an accepted data standard for public access and use. Fully implemented, these elements will facilitate clinical research, registry reporting, administrative reporting and regulatory compliance, and patient care. PMID:23500238

  14. Computed tomography-guided tissue engineering of upper airway cartilage.

    PubMed

    Brown, Bryan N; Siebenlist, Nicholas J; Cheetham, Jonathan; Ducharme, Norm G; Rawlinson, Jeremy J; Bonassar, Lawrence J

    2014-06-01

    Normal laryngeal function has a large impact on quality of life, and dysfunction can be life threatening. In general, airway obstructions arise from a reduction in neuromuscular function or a decrease in mechanical stiffness of the structures of the upper airway. These reductions decrease the ability of the airway to resist inspiratory or expiratory pressures, causing laryngeal collapse. We propose to restore airway patency through methods that replace damaged tissue and improve the stiffness of airway structures. A number of recent studies have utilized image-guided approaches to create cell-seeded constructs that reproduce the shape and size of the tissue of interest with high geometric fidelity. The objective of the present study was to establish a tissue engineering approach to the creation of viable constructs that approximate the shape and size of equine airway structures, in particular the epiglottis. Computed tomography images were used to create three-dimensional computer models of the cartilaginous structures of the larynx. Anatomically shaped injection molds were created from the three-dimensional models and were seeded with bovine auricular chondrocytes that were suspended within alginate before static culture. Constructs were then cultured for approximately 4 weeks post-seeding and evaluated for biochemical content, biomechanical properties, and histologic architecture. Results showed that the three-dimensional molded constructs had the approximate size and shape of the equine epiglottis and that it is possible to seed such constructs while maintaining 75%+ cell viability. Extracellular matrix content was observed to increase with time in culture and was accompanied by an increase in the mechanical stiffness of the construct. If successful, such an approach may represent a significant improvement on the currently available treatments for damaged airway cartilage and may provide clinical options for replacement of damaged tissue during treatment of obstructive airway disease.

  15. Incorporating intelligence into structured radiology reports

    NASA Astrophysics Data System (ADS)

    Kahn, Charles E.

    2014-03-01

    The new standard for radiology reporting templates being developed through the Integrating the Healthcare Enterprise (IHE) and DICOM organizations defines the storage and exchange of reporting templates as Hypertext Markup Language version 5 (HTML5) documents. The use of HTML5 enables the incorporation of "dynamic HTML," in which documents can be altered in response to their content. HTML5 documents can employ JavaScript, the HTML Document Object Model (DOM), and external web services to create intelligent reporting templates. Several reporting templates were created to demonstrate the use of scripts to perform in-template calculations and decision support. For example, a template for adrenal CT was created to compute contrast washout percentage from input values of precontrast, dynamic postcontrast, and delayed adrenal nodule attenuation values; the washout value can be used to classify an adrenal nodule as a benign cortical adenoma. Dynamic templates were developed to compute volumes and apply diagnostic criteria, such as those for determination of internal carotid artery stenosis. Although reporting systems need not use a web browser to render the templates or their contents, the use of JavaScript creates innumerable opportunities to construct highly sophisticated HTML5 reporting templates. This report demonstrates the ability to incorporate dynamic content to enhance the use of radiology reporting templates.
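
    As an illustration of the kind of in-template calculation described, the adrenal washout computation is sketched below in Python for brevity (the templates themselves embed JavaScript); the attenuation values are invented:

      def absolute_washout(pre_hu, post_hu, delayed_hu):
          """Absolute contrast washout percentage from CT attenuation (HU)."""
          return 100.0 * (post_hu - delayed_hu) / (post_hu - pre_hu)

      # Illustrative attenuation values for an adrenal nodule.
      w = absolute_washout(pre_hu=10.0, post_hu=80.0, delayed_hu=30.0)
      print(f"washout = {w:.0f}%")   # >= 60% is commonly read as benign adenoma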

  16. Fabrication of custom-shaped grafts for cartilage regeneration.

    PubMed

    Koo, Seungbum; Hargreaves, Brian A; Gold, Garry E; Dragoo, Jason L

    2010-10-01

    The aims of this study were to create a custom-shaped graft through 3D tissue shape reconstruction and rapid-prototype molding methods using MRI data, and to test the accuracy of the custom-shaped graft against the original anatomical defect. An iatrogenic defect on the distal femur was identified with a 1.5 Tesla MRI and its shape was reconstructed into a three-dimensional (3D) computer model by processing the 3D MRI data. First, the accuracy of the MRI-derived 3D model was tested against a laser-scan based 3D model of the defect. A custom-shaped polyurethane graft was fabricated from the laser-scan based 3D model by creating custom molds through computer aided design and rapid-prototyping methods. The polyurethane tissue was laser-scanned again to calculate the accuracy of this process compared to the original defect. The volumes of the defect models from MRI and laser-scan were 537 mm3 and 405 mm3, respectively, implying that the MRI model was 33% larger than the laser-scan model. The average (±SD) distance deviation of the exterior surface of the MRI model from the laser-scan model was 0.4 ± 0.4 mm. The custom-shaped tissue created from the molds was qualitatively very similar to the original shape of the defect. The volume of the custom-shaped cartilage tissue was 463 mm3, which was 15% larger than the laser-scan model. The average (±SD) distance deviation between the two models was 0.04 ± 0.19 mm. This investigation proves the concept that custom-shaped engineered grafts can be fabricated from standard-sequence 3D MRI data with the use of CAD and rapid-prototyping technology. The accuracy of this technology may help solve the interfacial problem between native cartilage and graft if the grafts are custom made for the specific defect. The major source of error in fabricating a 3D custom-shaped cartilage graft appears to be the accuracy of the MRI data itself; however, the precision of the model is expected to increase with the utilization of advanced MR sequences at higher magnet strengths.

  17. Designers workbench: toward real-time immersive modeling

    NASA Astrophysics Data System (ADS)

    Kuester, Falko; Duchaineau, Mark A.; Hamann, Bernd; Joy, Kenneth I.; Ma, Kwan-Liu

    2000-05-01

    This paper introduces the Designers Workbench, a semi-immersive virtual environment for two-handed modeling, sculpting and analysis tasks. The paper outlines the fundamental tools, design metaphors and hardware components required for an intuitive real-time modeling system. As companies focus on streamlining productivity to cope with global competition, the migration to computer-aided design (CAD), computer-aided manufacturing, and computer-aided engineering systems has established a new backbone of modern industrial product development. However, traditionally a product design frequently originates from a clay model that, after digitization, forms the basis for the numerical description of CAD primitives. The Designers Workbench aims at closing this technology, or 'digital', gap experienced by design and CAD engineers by transforming the classical design paradigm into its fully integrated digital and virtual analog, allowing collaborative development in a semi-immersive virtual environment. This project emphasizes two key components from the classical product design cycle: freeform modeling and analysis. In the freeform modeling stage, content creation in the form of two-handed sculpting of arbitrary objects using polygonal, volumetric or mathematically defined primitives is emphasized, whereas the analysis component provides the tools required for pre- and post-processing steps for finite element analysis tasks applied to the created models.

  18. The Cell Collective: Toward an open and collaborative approach to systems biology

    PubMed Central

    2012-01-01

    Background Despite decades of new discoveries in biomedical research, the overwhelming complexity of cells has been a significant barrier to a fundamental understanding of how cells work as a whole. As such, the holistic study of biochemical pathways requires computer modeling. Due to the complexity of cells, it is not feasible for one person or group to model the cell in its entirety. Results The Cell Collective is a platform that allows the world-wide scientific community to create these models collectively. Its interface enables users to build and use models without specifying any mathematical equations or computer code - addressing one of the major hurdles with computational research. In addition, this platform allows scientists to simulate and analyze the models in real-time on the web, including the ability to simulate loss/gain of function and test what-if scenarios in real time. Conclusions The Cell Collective is a web-based platform that enables laboratory scientists from across the globe to collaboratively build large-scale models of various biological processes, and simulate/analyze them in real time. In this manuscript, we show examples of its application to a large-scale model of signal transduction. PMID:22871178

  19. Expert models and modeling processes associated with a computer-modeling tool

    NASA Astrophysics Data System (ADS)

    Zhang, Baohui; Liu, Xiufeng; Krajcik, Joseph S.

    2006-07-01

    Holding the premise that the development of expertise is a continuous process, this study concerns expert models and modeling processes associated with a modeling tool called Model-It. Five advanced Ph.D. students in environmental engineering and public health used Model-It to create and test models of water quality. Using think aloud technique and video recording, we captured their computer screen modeling activities and thinking processes. We also interviewed them the day following their modeling sessions to further probe the rationale of their modeling practices. We analyzed both the audio-video transcripts and the experts' models. We found the experts' modeling processes followed the linear sequence built in the modeling program with few instances of moving back and forth. They specified their goals up front and spent a long time thinking through an entire model before acting. They specified relationships with accurate and convincing evidence. Factors (i.e., variables) in expert models were clustered, and represented by specialized technical terms. Based on the above findings, we made suggestions for improving model-based science teaching and learning using Model-It.

  20. GRAVTool, Advances on the Package to Compute Geoid Model path by the Remove-Compute-Restore Technique, Following Helmert's Condensation Method

    NASA Astrophysics Data System (ADS)

    Marotta, G. S.

    2017-12-01

    Currently, there are several methods to determine geoid models. They can be based on terrestrial gravity data, geopotential coefficients, astrogeodetic data or a combination of them. Among the techniques to compute a precise geoid model, the Remove-Compute-Restore (RCR) technique has been widely applied. It considers short, medium and long wavelengths derived from altitude data provided by Digital Terrain Models (DTM), terrestrial gravity data and Global Geopotential Models (GGM), respectively. In order to apply this technique, it is necessary to create procedures that compute gravity anomalies and geoid models by integrating the different wavelengths, and that adjust these models to a local vertical datum. This research presents the advances on the package called GRAVTool to compute geoid models by the RCR technique, following Helmert's condensation method, and its application in a study area. The study area comprises the Federal District of Brazil, with 6000 km², wavy relief, and heights varying from 600 m to 1340 m, located between the coordinates 48.25ºW, 15.45ºS and 47.33ºW, 16.06ºS. The results of the numerical example on the study area show a geoid model computed by the GRAVTool package, after analysis of the density, DTM and GGM values, that is more consistent with the reference values used in the study area. The accuracy of the computed model (σ = ±0.058 m, RMS = 0.067 m, maximum = 0.124 m and minimum = -0.155 m), using a density value of 2.702 g/cm³ ± 0.024 g/cm³, the DTM SRTM Void Filled 3 arc-second and the GGM EIGEN-6C4 up to degree and order 250, matches the uncertainty (σ = ±0.073 m) of 26 randomly spaced points where the geoid was determined by geometric leveling supported by GNSS positioning. The results were also better than those achieved by the official Brazilian regional geoid model (σ = ±0.076 m, RMS = 0.098 m, maximum = 0.320 m and minimum = -0.061 m).
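
    Schematically, the remove-compute-restore bookkeeping looks as follows; every number in this sketch is a hypothetical stand-in, and a simple scale factor replaces the actual Stokes integration:

      import numpy as np

      # Schematic RCR step for a handful of grid points.
      dg_obs  = np.array([23.1, 19.4, 25.8])   # observed gravity anomalies (mGal)
      dg_ggm  = np.array([20.0, 18.0, 24.0])   # long wavelengths from the GGM
      dg_topo = np.array([1.5, 0.7, 1.1])      # terrain effect from the DTM

      # Remove: strip long and short wavelengths to get residual anomalies.
      dg_res = dg_obs - dg_ggm - dg_topo

      # Compute: residual geoid heights from residual anomalies (a
      # placeholder scale factor stands in for Stokes's integration).
      n_res = 0.01 * dg_res

      # Restore: add back the GGM geoid and the indirect terrain effect.
      n_ggm = np.array([12.3, 12.1, 12.6])
      n_ind = np.array([0.02, 0.01, 0.02])
      geoid = n_ggm + n_res + n_ind
      print(geoid)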
